Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics
Posts under Graphics & Games topic

Post

Replies

Boosts

Views

Activity

D3DMetal unsupported CreateCommandQueue1 API while running simple wglgears using Mesa 24.3 GLon12 driver..
Hi, I wanted to test whether it's possible to use Mesa3D OGLon12 + D3DMetal 2b3 to get GL > 4.1 support in Windows apps via D3D12Metal, using the simple wglgears.c app (similar to glxgears) and running:
GALLIUM_DRIVER=d3d12 wine64 wglgears64 -info
with an overridden opengl32.dll using the contents of https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z
I get:
[D3DMetal:LOG:5E53] Unsupported API: CreateCommandQueue1
caused by: https://gitlab.freedesktop.org/mesa/mesa/-/commit/c022c9603d500b59ff5e6f93c8a214d1785ab20a
API: https://learn.microsoft.com/en-us/windows/win32/api/d3d12/nf-d3d12-id3d12device9-createcommandqueue1
Note the setup itself is correct, since running:
GALLIUM_DRIVER=llvmpipe wine64 wglgears64 -info
I get:
GL_RENDERER = llvmpipe (LLVM 19.1.3, 128 bits)
GL_VERSION = 4.5 (Compatibility Profile) Mesa 24.3.0-rc1 (git-85ba713d76)
GL_VENDOR = Mesa
GL_EXTENSIONS = GL_ARB_multisample GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_texture ... etc.
1
0
638
May ’25
Bone deformation
I have been tasked with creating content for the Apple Vision Pro. Just the 3D content and animation, not the programming end of things. I can't seem to get any kind of mesh deformation animation to import into Reality Composer Pro. By that I mean bones/skin, or even point cache. I work on PC, and my main software is 3DS Max, but I'm borrowing an iMac for this job, and was instructed to use RCP on it for testing before handing things off to the programmer. My files open and play fine in other USD programs, like Omniverse or USD View, just not Reality Composer Pro. I've seen the dinosaur demo in AVP, so I know mesh deformation is possible. If there are other essential tools that might make this possible, I have not been made aware of them. I am experimenting with bouncing things off of Blender, in case that exports better, but not really having luck there either, though my results are different. Thanks.
0
0
137
Sep ’25
How to implement a custom material for visionOS?
I want to use RealityKit to create a custom material that can use my own shader and supports mesh instancing (for rendering 3D Gaussian splatting), but I found that CustomMaterial is not supported on visionOS. Is there another interface that can achieve this? Where can I find examples?
1
0
111
Jul ’25
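Editor's note: for the question above, the visionOS-supported alternative to CustomMaterial is ShaderGraphMaterial, authored in Reality Composer Pro. Below is a minimal sketch of loading one at runtime; the material path "/Root/MyMaterial", the file name "Scene.usda", and the parameter name "myParam" are hypothetical placeholders, not names from the original post.

    import RealityKit
    import RealityKitContent

    // Sketch: load a shader-graph material authored in Reality Composer Pro
    // and apply it to an entity. All names below are assumed placeholders.
    func applyCustomShaderMaterial(to entity: ModelEntity) async throws {
        var material = try await ShaderGraphMaterial(
            named: "/Root/MyMaterial",      // hypothetical material path
            from: "Scene.usda",             // hypothetical scene file
            in: realityKitContentBundle
        )
        // Inputs promoted in Reality Composer Pro can be set at runtime.
        try material.setParameter(name: "myParam", value: .float(0.5))
        entity.model?.materials = [material]
    }

This covers the custom-shader half of the question; large-scale instancing (e.g. for Gaussian splats) would need a separate strategy, such as generating geometry directly.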
Game Center breaks "Kids games" Rules on iPadOS 26
I have several games on the App Store which are set up as "For Kids", which means these games cannot have any way to access the outside world, including the App Store. No problem. These games have worked fine for years, until iOS 26. Now all my updates are being rejected, because the new version of Game Center running on iPadOS 26 has a way for people to exit the game and go to the App Store. I have no control over this since it's built into Game Center, and the App Review folks want me to put a "parental gate" on it, but there's no way to do that because... well... it's in Game Center, not my code. So I'm unable to update my apps because of this. Presumably the existing versions on the App Store still do this exact same thing, so my update isn't going to make any difference. Does anyone know of a way to make that crap at the top go away so this isn't an issue?
1
0
441
Sep ’25
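Editor's note: the element "at the top" sounds like the Game Center access point, which GameKit does let you disable. A minimal sketch, assuming that's the overlay in question; whether hiding it satisfies App Review for the Kids category is not guaranteed.

    import GameKit

    // Sketch: authenticate, then disable the floating Game Center access
    // point so it never appears over the game.
    func authenticateAndHideAccessPoint() {
        GKLocalPlayer.local.authenticateHandler = { viewController, error in
            if let error {
                print("Game Center auth failed: \(error.localizedDescription)")
                return
            }
            // Hide the system overlay that links out of the app.
            GKAccessPoint.shared.isActive = false
        }
    }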
How to use CharacterControllerComponent.
I am trying to implement a CharacterControllerComponent using the following URL. https://developer.apple.com/documentation/realitykit/charactercontrollercomponent I have written sample code, but PhysicsSimulationEvents.WillSimulate is not executed and nothing happens.

import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    let gravity: SIMD3<Float> = [0, -50, 0]
    let jumpSpeed: Float = 10

    enum PlayerInput {
        case none, jump
    }

    @State private var testCharacter: Entity = Entity()
    @State private var myPlayerInput = PlayerInput.none

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                testCharacter = immersiveContentEntity.findEntity(named: "Capsule")!
                testCharacter.components.set(CharacterControllerComponent())

                let _ = content.subscribe(to: PhysicsSimulationEvents.WillSimulate.self, on: testCharacter) { event in
                    print("subscribe run")
                    let deltaTime: Float = Float(event.deltaTime)
                    var velocity: SIMD3<Float> = .zero
                    var isOnGround: Bool = false

                    // RealityKit automatically adds `CharacterControllerStateComponent` after moving the character for the first time.
                    if let ccState = testCharacter.components[CharacterControllerStateComponent.self] {
                        velocity = ccState.velocity
                        isOnGround = ccState.isOnGround
                    }

                    if !isOnGround {
                        // Gravity is a force, so you need to accumulate it for each frame.
                        velocity += gravity * deltaTime
                    } else if myPlayerInput == .jump {
                        // Set the character's velocity directly to launch it in the air when the player jumps.
                        velocity.y = jumpSpeed
                    }

                    testCharacter.moveCharacter(by: velocity * deltaTime, deltaTime: deltaTime, relativeTo: nil) { event in
                        print("playerEntity collided with \(event.hitEntity.name)")
                    }
                }
            }
        }
    }
}

The scene is loaded from RCP. It is simple, just a capsule on a pedestal. Do I need separate code to run testCharacter from this state?
0
0
184
May ’25
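Editor's note: one workaround worth trying for the post above is driving the character from the per-frame SceneEvents.Update event instead of PhysicsSimulationEvents.WillSimulate. A minimal sketch follows; it is a diagnostic workaround, not a diagnosis of why WillSimulate stays silent.

    import SwiftUI
    import RealityKit

    // Sketch: move a CharacterControllerComponent from SceneEvents.Update,
    // which fires every frame regardless of the physics event pipeline.
    struct CharacterDriverView: View {
        @State private var character = Entity()
        let gravity: SIMD3<Float> = [0, -50, 0]

        var body: some View {
            RealityView { content in
                character.components.set(CharacterControllerComponent())
                content.add(character)

                _ = content.subscribe(to: SceneEvents.Update.self) { event in
                    let dt = Float(event.deltaTime)
                    // State component appears only after the first move.
                    var velocity = character.components[CharacterControllerStateComponent.self]?.velocity ?? .zero
                    velocity += gravity * dt  // accumulate gravity each frame
                    character.moveCharacter(by: velocity * dt, deltaTime: dt, relativeTo: nil) { hit in
                        print("collided with \(hit.hitEntity.name)")
                    }
                }
            }
        }
    }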
Can you delete an MTLLibrary once its shaders are placed into a pipeline?
Hello, I am quite new to using the Metal API and was wondering: if you know that, once a pipeline is created, you will never need to make another one with the same shaders again, is it common (or even possible and safe) to release the library that was used to reference the shaders? I'm only asking because this is possible in other APIs, but Apple never mentions (as far as I have found) whether this is safe to do.
1
0
402
Oct ’25
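Editor's note: for reference, here is a minimal sketch of the exact pattern the question describes — build the pipeline state, then let the library go out of scope. The shader function names are hypothetical placeholders, and this demonstrates the pattern being asked about; it is not an official statement that releasing the library is guaranteed safe.

    import Metal

    // Sketch: the MTLLibrary and its MTLFunctions exist only inside this
    // function's scope; only the compiled pipeline state is returned.
    func makePipeline(device: MTLDevice) throws -> MTLRenderPipelineState {
        let library = device.makeDefaultLibrary()!
        let descriptor = MTLRenderPipelineDescriptor()
        descriptor.vertexFunction = library.makeFunction(name: "vertexShader")     // hypothetical name
        descriptor.fragmentFunction = library.makeFunction(name: "fragmentShader") // hypothetical name
        descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
        let pipeline = try device.makeRenderPipelineState(descriptor: descriptor)
        // After this return, ARC releases the library and functions; in
        // practice the compiled pipeline state is self-contained, which is
        // the behavior the question asks Apple to confirm.
        return pipeline
    }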
WWDC25 Metal & game technologies group lab
Hello, Thank you for attending today’s Metal & game technologies group lab at WWDC25! We were delighted to answer many questions from developers and energized by the community engagement. We hope you enjoyed it and welcome your feedback. We invite you to carry on the conversation here, particularly if your question appeared in Slido and we were unable to answer it during the lab. If your question received feedback, let us know if you need clarification. You may want to ask your question again in a different lab, e.g. visionOS tomorrow. (We realize that this can be confusing when frameworks interoperate.) We have a lot to learn from each other, so let’s get to Q&A and make the best of WWDC25! 😃 Looking forward to your questions posted in new threads.
2
0
480
Jul ’25
Score range of ImageAestheticsScoresObservation in Vision framework
Hi everyone, I'm using the Vision framework’s ImageAestheticsScoresObservation class (https://developer.apple.com/documentation/vision/imageaestheticsscoresobservation). I noticed that the returned overallScore is sometimes negative. Could someone confirm whether the expected range of the score is from -1.0 to 1.0? The documentation doesn’t explicitly state the possible score range, so I’d appreciate any clarification or insights. Thanks in advance!
0
0
181
Apr ’25
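Editor's note: for anyone reproducing the observation above, a minimal sketch using the modern Swift Vision API (iOS 18 / macOS 15) follows. Empirically the overall score does appear to fall in -1.0...1.0, but as the post says, that range is observed behavior, not documented.

    import Vision

    // Sketch: compute the aesthetics observation for an image file and
    // inspect its score fields.
    func aestheticsScore(for imageURL: URL) async throws -> Float {
        let request = CalculateImageAestheticsScoresRequest()
        let observation = try await request.perform(on: imageURL)
        print("overall: \(observation.overallScore), utility: \(observation.isUtility)")
        return observation.overallScore
    }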
RealityKit and Reality Composer Pro aren't forward or backward compatible with each other
TL;DR: RealityKit and Reality Composer Pro aren't forward or backward compatible with each other, and the resulting error message is terse and unhelpful. (FB14828873) So far, I've been sticking with Xcode 16.4 for development and only using Xcode 26.0 beta experimentally. Yesterday, I used xcode-select to switch to Xcode 26.0 beta 3 to test it, but I forgot to switch back. Consequently, this morning I unintentionally used the future Reality Composer Pro (the version included with Xcode 26) to make a small change to a USD file. Now I realize that if I'm unlucky, it's possible Reality Composer Pro may have silently introduced a small change into the USD file that may make RealityKit fail to read the file on iOS 18 and visionOS 2, which in the past has resulted in hours of debugging to track down the source of the failure, often a single line in the USD file that RealityKit can't communicate to me other than with the error "the operation couldn't be completed". As an analogy, this situation is as if, during regular development (not involving Reality Composer Pro), Xcode didn't warn you about specific API version conflicts, but instead failed with a generic error message, without highlighting the line in your Swift file that was the source of the error.
1
0
480
Jul ’25
Animations for streaming
We have a macOS app (not yet released, but in use by ourselves) that provides scoreboards for streaming sports events. Today it is expected that there are nice animations for goals, etc. We stream using NDI, which requires a CVPixelBuffer for each frame. We currently create these animations using CABasicAnimation, CAAnimation, and CAKeyframeAnimation, and we use ScreenCaptureKit to generate the frames. This works fine at 25/30 fps, as long as the window where our animations are performed is visible. But this is not how it should be. We want a smaller main app window and control display performing the animations at reduced size, while the streaming animations need to be in HD format, and later maybe in 4K. When using an offscreen window, the animations are not calculated; we get about 1 frame per second. So we actually have to connect an external display to the MacBook and open the large windows there. An ugly solution. Are we using a completely wrong approach? Or is there a way to tell macOS to perform the animations even though the window is offscreen? If it cannot work that way, what is an alternative?
0
0
155
May ’25
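Editor's note: one alternative to capturing an offscreen window is rendering the Core Animation layer tree directly into a Metal texture with CARenderer, independent of any window, then wrapping that texture in a CVPixelBuffer for NDI. A minimal sketch, assuming the texture was created with .renderTarget usage; the pixel-buffer conversion is left out.

    import QuartzCore
    import Metal

    // Sketch: evaluate and render an animating CALayer tree offscreen at a
    // given time. In production, create the CARenderer once and reuse it
    // rather than per frame.
    func renderFrame(layer: CALayer, texture: MTLTexture, time: CFTimeInterval) {
        let renderer = CARenderer(mtlTexture: texture, options: nil)
        renderer.layer = layer
        renderer.bounds = CGRect(x: 0, y: 0, width: texture.width, height: texture.height)

        renderer.beginFrame(atTime: time, timeStamp: nil)  // animations evaluated at `time`
        renderer.addUpdate(renderer.bounds)                // mark the whole frame dirty
        renderer.render()
        renderer.endFrame()
    }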
iOS 26.1 beta: third-party Broadcast Extension sessions don't work after 3 s, then the picker is frozen
It's a Broadcast Extension issue: on iOS 26.1 beta the extension never launches. After you tap “Start Broadcast” in the system picker, the countdown disappears after 3 s and no broadcast starts, so every live-streaming app (and every other non-system app that uses a Broadcast Extension) fails to go live; only the native Photos screen recording still works. Is this a known regression, or is a new entitlement required?
0
4
255
Oct ’25
ModelEntity(named:in:) fails to load USD file from RealityKitContent bundle with misleading error?
My experience has been that ModelEntity(named:in:) can be used to load a USD file with a simple structure consisting of entities and model entities, and, critically, it will flatten the entity hierarchy down to a single ModelEntity, presumably reducing the number of draw calls. However, can anyone verify that the following is true? If ModelEntity(named:in:) is used to load a USD file from a RealityKit content bundle, it may fail when the USD file contains more complex data, such as shader graph material definitions, or perhaps for some other reason. I am not sure. AND the error that ModelEntity(named:in:) throws in this case is: Cannot load RealityKitContent entity: Failed to find resource with name "<name>" in bundle, which would literally suggest that the file does not exist, instead of what I assume the error actually is: "the file exists, but its entity hierarchy could not be flattened to a single ModelEntity"? Is that an accurate description of the known behavior of ModelEntity(named:in:)? I understand that I could use Entity(named:in:) instead, without the flattening feature. My question is really more about the seemingly misleading error message. Thank you for any clarification you can provide.
2
0
225
May ’25
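Editor's note: a minimal sketch of the fallback the post alludes to — try the flattening initializer first, and on failure fall back to Entity(named:in:), logging the underlying error rather than trusting the "not found" message at face value.

    import RealityKit
    import RealityKitContent

    // Sketch: load flattened if possible, otherwise keep the hierarchy.
    func loadModel(named name: String) async -> Entity? {
        do {
            return try await ModelEntity(named: name, in: realityKitContentBundle)
        } catch {
            // The error text may claim the resource is missing even when the
            // file exists but cannot be flattened.
            print("ModelEntity load failed (possibly not flattenable): \(error)")
            return try? await Entity(named: name, in: realityKitContentBundle)
        }
    }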
Firebase not initializing in iOS Unreal Engine 5.6 app (using MetaHuman + FirebaseFeatures)
Hi everyone, I'm building a native iOS app using Unreal Engine 5.6 with Firebase for authentication and Firestore. The app uses a MetaHuman avatar and is meant to run as a standalone UE app on iPhone. I'm using this Firebase wrapper: 👉 https://pandoa.github.io/FirebaseFeatures/ I've followed all the steps, including: Adding GoogleService-Info.plist to the Xcode project and ensuring it’s in the correct target Calling FIRApp.configure() in AppDelegate Verifying the plist is bundled correctly However, the app crashes on launch, and Firebase does not initialize properly. Crash log shows: [FirebaseCore][I-COR000005] No app has been configured yet. Setup details: Unreal Engine: 5.6 (source build, macOS) iOS Deployment: 17.5 MetaHuman character packaged correctly and app launches fine without Firebase Has anyone here managed to get Firebase working inside a native Unreal Engine iOS app with this setup? I'd love to hear if there’s something I’m missing — maybe something with initialization timing or module loading? Thanks so much in advance 🙏
1
0
762
Jul ’25
You cannot debug in simulator
I can't create any breakpoint in my Xcode after I upgraded to macOS 15.4 macOS: Version 15.4 (24E248) visionOS Simulator: 2.3 Xcode: Version 16.2 (16C5032a) My app works well without any breakpoints. But if I create any breakpoint it shows me this: Couldn't find the Objective-C runtime library in loaded images. Message from debugger: The LLDB RPC server has crashed. You may need to manually terminate your process. The crash log is located in ~/Library/Logs/DiagnosticReports and has a prefix 'lldb-rpc-server'. Please file a bug and attach the most recent crash log.
3
5
537
Apr ’25
Metal useResource vs. MTLFence
Hello, I'm tracking down a bug where useResource doesn't seem to apply proper synchronization when a resource is produced by a render pass and then consumed by a compute pass, but when I use an MTLFence to signal and wait between the render and compute encoders, the artifact goes away. The resource is created with MTLHazardTrackingModeTracked, and useResource is called on the compute encoder after the render pass. Metal API validation doesn't report any warnings or errors. Am I misunderstanding the difference between the two APIs? I dug through the Metal documentation, and it looks like useResource should handle synchronization given the resource has MTLHazardTrackingModeTracked; on the other hand, MTLFence should be used to ensure proper synchronization between command encoders. Can someone clarify the difference between the two APIs and when to use them?
3
0
166
Jul ’25
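Editor's note: for concreteness, here is a minimal sketch of the explicit-fence arrangement the post describes — the render pass signals after the fragment stage, and the compute pass waits before reading.

    import Metal

    // Sketch: order a render pass's writes before a compute pass's reads
    // with an MTLFence.
    func encodePasses(commandBuffer: MTLCommandBuffer,
                      renderPassDescriptor: MTLRenderPassDescriptor,
                      pipeline: MTLComputePipelineState,
                      resource: MTLBuffer) {
        let fence = commandBuffer.device.makeFence()!

        let render = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)!
        // ... draw calls that write `resource` go here ...
        render.updateFence(fence, after: .fragment)   // signal once writes complete
        render.endEncoding()

        let compute = commandBuffer.makeComputeCommandEncoder()!
        compute.waitForFence(fence)                   // wait for the render pass
        compute.setComputePipelineState(pipeline)
        compute.useResource(resource, usage: .read)   // declare residency/usage
        // ... dispatch that reads `resource` goes here ...
        compute.endEncoding()
    }

One possible reading of the difference: useResource primarily declares residency and usage for indirectly referenced resources (e.g. via argument buffers), while fences explicitly order access between encoders; if the resource is referenced indirectly, that could explain why only the fence removes the artifact.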
ScreenCaptureKit
I have code that captures a window and displays a cropped image. The problem is twofold. ScreenCaptureKit doesn't seem to allow modifying, stopping, and recapturing the image in window mode to capture a portion of the screen, so I have to crop and display the cropped image via a published variable. This all works fine, but it seems to stop after some time. I'm using an M1 with 16 GB of RAM; the program takes less than 100 MB of memory with roughly 40-70% CPU. I'm printing capture success in debug mode, and sometimes the frame isn't valid, so I'm guarding against that. Any ideas on how to improve my strategy?
0
0
158
Jun ’25
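Editor's note: SCStream can change its configuration while running, which may remove the need to crop frames manually. A minimal sketch, assuming sourceRect applies to the capture mode in use; the crop rectangle is a hypothetical placeholder.

    import ScreenCaptureKit

    // Sketch: retarget a running SCStream to a sub-rectangle of its source
    // without stopping and restarting the capture.
    func crop(stream: SCStream, to rect: CGRect) async throws {
        let config = SCStreamConfiguration()
        config.sourceRect = rect              // portion of the source to capture
        config.width = Int(rect.width)        // output size; scale as needed
        config.height = Int(rect.height)
        try await stream.updateConfiguration(config)  // applies in place
    }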
How to use Apple Unity Plugins
When running my game in the Unity Editor on the Windows platform, I get an error:
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
Apple.GameKit.DefaultNSErrorHandler.Init () (at ./Library/PackageCache/com.apple.unityplugin.gamekit@0abcad546f73/Source/DefaultHandlers.cs:35)
This is because the GameKitWrapper dynamically linked library is not available on the Windows platform. Besides, the "Apple Build Settings" are declared under UNITY_EDITOR_OSX and are also not available on the Windows platform. Has anyone managed to solve this?
1
1
606
Jul ’25
Disabling autofill for non-interactive PDFs
My iOS app generates PDF files. Every time my users open the generated PDF files, the AutoFill popup appears, but my PDF files are NOT meant for interaction. I'm here to ask if there's a way to mark my PDF files as "not a form", e.g. in metadata or anywhere else?
0
0
477
Oct ’25
RealityKit Debugger unavailable
Hi team, I'm looking for the RealityKit debugger in Xcode 26 beta 3. I'm running a RealityKit app on my iPad running iPadOS 26 b3, but the debugger option is not there in Xcode.
3
1
668
Oct ’25
RealityView show debug options
Hi! Using ARView in UIKit or through a UIViewRepresentable in SwiftUI, we can do: arView.debugOptions = [.showPhysics, .showStatistics] What is the equivalent in RealityView?
0
0
234
Oct ’25