Take a Video with ARKit

How can I record an ARKit scene but exclude UI elements?

You can use ReplayKit for this, but it isn't very well documented. The key is to render all of your UI elements in a separate UIWindow that is overlaid on top of a primary UIWindow containing the AR content. ReplayKit only records the primary window, so with this structure the user interface elements will not show up in the recording.

While there may be a better way to do this, here's an example of how I set up this window structure for my SwiftUI-based app. Here I use the windowLevel property to mark the AR content as the main window, while putting the UI into its own secondary window at a higher level:

import SwiftUI
import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {

    var arWindow: UIWindow?
    var uiWindow: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
        if let windowScene = scene as? UIWindowScene {
            // Create a window for the AR content.
            // This is the main window, and the only one ReplayKit records.
            let arWindow = UIWindow(windowScene: windowScene)
            self.arWindow = arWindow
            arWindow.windowLevel = .normal

            // Add your AR view controller here, or set the root view controller
            // lazily when you actually need to show AR content.
            let vc = UIViewController()
            arWindow.rootViewController = vc
            arWindow.isHidden = false

            // Now create a window for the UI.
            let uiWindow = UIWindow(windowScene: windowScene)
            self.uiWindow = uiWindow
            // Setting the level above .normal makes this window's content
            // be excluded from the ReplayKit recording.
            uiWindow.windowLevel = UIWindow.Level(UIWindow.Level.normal.rawValue + 1)
            uiWindow.isOpaque = false
            uiWindow.backgroundColor = .clear

            // Render your SwiftUI-based user interface with a transparent
            // background so the AR window shows through.
            let content = MyUserInterfaceView()
                .background(Color.clear)

            let hosting = UIHostingController(rootView: content)
            hosting.view.backgroundColor = .clear
            hosting.view.isOpaque = false
            uiWindow.rootViewController = hosting

            uiWindow.makeKeyAndVisible()
        }
    }
}

My app initializes the AR content lazily, so I simply update arWindow.rootViewController when I need to show it.
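
For reference, starting and stopping the recording is then just standard ReplayKit API. Nothing here is specific to the window setup above, and the helper function names are my own:

import ReplayKit
import UIKit

func startRecording() {
    RPScreenRecorder.shared().startRecording { error in
        if let error = error {
            print("Could not start recording: \(error)")
        }
    }
}

func stopRecording(presentingFrom presenter: UIViewController) {
    RPScreenRecorder.shared().stopRecording { previewController, error in
        // ReplayKit's built-in preview lets the user trim and save the video.
        if let previewController = previewController {
            presenter.present(previewController, animated: true)
        }
    }
}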

A few notes:

  • This approach requires that you separate your AR view controller from the rest of your user interface. This is a simple change in many cases but could be quite involved for a more complex app.

  • Keep in mind that any user event handlers and gesture recognizers you have on your AR view controller may no longer work as expected after you split the AR content and layout into their own windows.

    I work around this by having a transparent proxy view controller in my main layout that forwards user events to the real AR view controller (see the sketch after this list).

  • This approach isn't ARKit specific. It should also work for normal SceneKit apps, Metal apps, and conventional apps.

  • Unlike some third-party recording libraries, ReplayKit prompts the user for permission when recording begins.

    I actually like this feature, as it makes it easy for the user to choose whether or not to record the microphone. You may find it gets in the way, though, so there is still a use case for third-party recording libraries.

  • ReplayKit also provides a nice built-in user interface for trimming and saving the recorded video.
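
As a rough sketch of the idea behind the event-forwarding workaround mentioned above: instead of literally forwarding events, you can let touches that land on the UI window's transparent background fall through to the AR window beneath it. This is an illustration rather than my exact proxy implementation, and the PassthroughWindow name is made up:

import UIKit

// A window that only swallows touches that hit real UI controls.
// Touches landing on the transparent root view return nil from hitTest,
// so UIKit delivers them to the AR window underneath instead.
class PassthroughWindow: UIWindow {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let view = super.hitTest(point, with: event)
        return view == rootViewController?.view ? nil : view
    }
}

With this, you would create uiWindow as a PassthroughWindow instead of a plain UIWindow.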

How to play hosted video on an image with ARKit in Swift?

Your code looks just fine. The only thing you need to change is the line where you define the videoNode: for a remote (hosted) video, build the URL with URL(string:) rather than a file path.

Example:

let videoNode = SKVideoNode(url: URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")!)
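
For context, here's roughly how that node can end up on a detected image, a sketch assuming ARImageAnchor-based detection and an ARSCNViewDelegate (the SpriteKit scene size is arbitrary and just sets the texture resolution):

import ARKit
import SceneKit
import SpriteKit

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }

    // Host the video in a SpriteKit scene and start playback.
    let videoNode = SKVideoNode(url: URL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")!)
    let skScene = SKScene(size: CGSize(width: 1280, height: 720))
    videoNode.position = CGPoint(x: skScene.size.width / 2, y: skScene.size.height / 2)
    videoNode.size = skScene.size
    skScene.addChild(videoNode)
    videoNode.play()

    // Map the SpriteKit scene onto a plane matching the detected image's physical size.
    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                         height: imageAnchor.referenceImage.physicalSize.height)
    plane.firstMaterial?.diffuse.contents = skScene
    plane.firstMaterial?.isDoubleSided = true

    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2 // lay the plane flat on the image
    node.addChildNode(planeNode)
}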

Hope this helps!

Record video from front facing camera during ARKit ARSession on iPhone X

ARKit runs its own AVCaptureSession, and there can be only one capture session running at a time; if you run your own capture session, you preempt ARKit's, which prevents ARKit from working.

However, ARKit does provide access to the camera pixel buffers it receives from its capture session, so you can record video by feeding those sample buffers to an AVAssetWriter. (It's basically the same workflow you'd use when recording video from AVCaptureVideoDataOutput, a lower-level way of doing video recording compared to AVCaptureMovieFileOutput.)
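
Here's a minimal sketch of that workflow, assuming you are the ARSession's delegate and pass in an output URL and the captured image's pixel dimensions (the ARFrameRecorder class and its method names are illustrative):

import ARKit
import AVFoundation

final class ARFrameRecorder: NSObject, ARSessionDelegate {
    private var writer: AVAssetWriter?
    private var input: AVAssetWriterInput?
    private var adaptor: AVAssetWriterInputPixelBufferAdaptor?
    private var startTime: CMTime?

    func start(outputURL: URL, width: Int, height: Int) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                           sourcePixelBufferAttributes: nil)
        writer.add(input)
        writer.startWriting()
        self.writer = writer
        self.input = input
        self.adaptor = adaptor
    }

    // ARSessionDelegate: called for every camera frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let writer = writer, let input = input, let adaptor = adaptor,
              writer.status == .writing else { return }
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        if startTime == nil {
            writer.startSession(atSourceTime: time)
            startTime = time
        }
        if input.isReadyForMoreMediaData {
            // frame.capturedImage is the CVPixelBuffer from ARKit's capture session.
            adaptor.append(frame.capturedImage, withPresentationTime: time)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input?.markAsFinished()
        writer?.finishWriting(completionHandler: completion)
    }
}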

You can also feed the ARKit camera pixel buffers (see ARFrame.capturedImage) to other technologies that work with live camera imagery, like the Vision framework. Apple has a sample code project demonstrating such usage.
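
For example, a minimal Vision pass over an ARKit frame might look like this (face detection is just a stand-in for whatever request you need, and the fixed .right orientation assumes portrait):

import ARKit
import Vision

func runVision(on frame: ARFrame) {
    // ARKit's captured image is a CVPixelBuffer, which Vision accepts directly.
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        print("Detected \(faces.count) face(s)")
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right,
                                        options: [:])
    try? handler.perform([request])
}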


