[METAL] API PROVIDES 32-BIT DEPTH TEXTURE WHEN A 16-BIT TEXTURE IS EXPECTED
iOS - Apr 07, 2016 - Priority: 7 (Not yet prioritized for a release) - Severity: 0 (Severity not yet defined)
How to reproduce:
1. Open attached project
2. Build to iOS
3. Open in Xcode
4. In Xcode, go to Product -> Scheme -> Edit Scheme…
5. In the Options tab change the GPU Frame Capture setting to Metal
6. Build to device
– Note that the app crashes before reaching the Splash Screen with this error:
/BuildRoot/Library/Caches/com.apple.xbs/Sources/Metal/Metal-56.7/ToolsLayers/Debug/MTLDebugRenderCommandEncoder.mm:1602: failed assertion `MTLPixelFormat of texture [TempBuffer 2 1920×1080 (MTLPixelFormatDepth32Float)] bound at index 1 is incompatible with texture parameter [MTLDataTypeHalf _CameraDepthTexture]. MTLPixelFormatDepth32Float is compatible with texture data types type(s) (
– Reproduced in Version 5.2.4f1 (98095704e6fe), Version 5.3.4p2 (fdf8d87c549e), Version 5.4.0b14 (0d4790749194) on an iPhone 6+ (iOS 9.3.1)
– Not reproducible on an iPhone 5 (iOS 7.1)
As a workaround: disable the Metal API validation setting in the Xcode scheme options.
Proper solution: search for _CameraDepthTexture in your shaders and verify that it is declared as sampler2D_float, not plain sampler2D. Note that ignoring this issue may crash the GPU driver on certain hardware.
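The declaration fix above can be sketched as follows. This is a minimal illustration of a typical image-effect fragment helper, not code from the attached project; on platforms where the camera depth texture is 32-bit float (such as Metal), sampling it through a half-precision sampler triggers the assertion shown in the log.

```hlsl
// Wrong: a plain sampler2D may be treated as half precision on mobile,
// which is incompatible with the MTLPixelFormatDepth32Float texture
// that Metal binds for the camera depth:
// sampler2D _CameraDepthTexture;

// Correct: force a full-precision (float) sampler for the depth texture.
sampler2D_float _CameraDepthTexture;

float SampleLinearDepth(float2 uv)
{
    // SAMPLE_DEPTH_TEXTURE expands to a platform-appropriate depth read
    // (defined in Unity's HLSLSupport.cginc).
    float rawDepth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, uv);
    // Linear01Depth (UnityCG.cginc) converts the raw value to 0..1 eye depth.
    return Linear01Depth(rawDepth);
}
```

With the sampler declared as sampler2D_float, the texture parameter type matches MTLPixelFormatDepth32Float and the Metal validation assertion no longer fires.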
A future Unity release will emit a warning for this case.