Color format and color space issue in sample projects

Hello, I’ve noticed that most of the samples use the MTKView default pixel format of .bgra8Unorm with a colorspace of nil (which means no color matching by the compositor). In contrast, Apple’s official Metal examples all set MTKView colorPixelFormat to .bgra8Unorm_srgb.

The result is that the samples render darker than I believe is intended: most displays natively expect sRGB, and without color matching, the linear values the samples produce are treated as sRGB values by the display. E.g. a value of 0.5 is relatively bright in linear space but darker when interpreted as sRGB.

I recommend one of two things to improve how the samples look:

  1. Set MTKView colorPixelFormat to .bgra8Unorm_srgb like Apple does in their examples, or
  2. Set MTKView colorspace to CGColorSpace(name: CGColorSpace.linearSRGB) so that the compositor knows what space your output is in and can convert it to whatever the display expects.

(1) works well enough in most cases, but I believe (2) is technically more correct and can also pair well with (1) in more advanced use cases.
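To make option (2) concrete, here’s a minimal sketch of the view setup I mean. This is my own illustration, not code from the samples; `metalView` is a hypothetical MTKView, and note that the `colorspace` property is only available on macOS and Mac Catalyst:

```swift
import MetalKit

// Sketch: keep the default linear pixel format, but tell the
// compositor that the output is linear sRGB so it can color match
// to whatever the display expects. (macOS / Mac Catalyst only.)
func configure(_ metalView: MTKView) {
  metalView.colorPixelFormat = .bgra8Unorm
  metalView.colorspace = CGColorSpace(name: CGColorSpace.linearSRGB)
}
```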

Compare these screenshots from the deferred rendering chapter. The left is from the default sample and the right is with the colorspace set to linear:

Viewing the train model in the Xcode editor makes me think it is not intended to be so dark:

Thoughts?


I’ll certainly review it. I can’t remember why I went with Data rather than Color for the baseColor textures in the asset catalog, which meant I used .bgra8Unorm, but I do remember going back and forth on using .bgra8Unorm_srgb in the early chapters, and then I stuck with it perhaps longer than I should have when doing the shadows etc.

Thank you for the suggestions. I do like the look of the brighter render better.

I’ve worked out where the .bgra8Unorm choice came from.

Chapter 8, Textures, is when I first introduce loading textures. When loading textures from a .usdz file, Model I/O loads them tagged as linear, and you can’t override this with the texture loader. </rant>

So if the MTKView is using .bgra8Unorm_srgb, then the house texture is washed out.

Without becoming more complicated early on in the book, I decided to go with .bgra8Unorm.
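To see numerically why the texture washes out, here’s a small sketch (my own illustration, not from the book). A texel that is already sRGB-encoded but tagged linear is sampled unchanged, then gamma-encoded a second time on write to an sRGB render target. The encode function below is the standard sRGB transfer function:

```swift
import Foundation

// Standard sRGB transfer function (linear -> display value).
func srgbEncode(_ c: Double) -> Double {
  c <= 0.0031308 ? 12.92 * c : 1.055 * pow(c, 1 / 2.4) - 0.055
}

// A texel stored as sRGB 0.5 but tagged linear: the sampler returns
// 0.5 unchanged, and writing to an .bgra8Unorm_srgb target encodes
// it again, so the value is gamma-encoded twice.
let stored = 0.5
let doubleEncoded = srgbEncode(stored)
print(doubleEncoded) // ~0.735: noticeably brighter, i.e. washed out
```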

In the next version of MbT I will explain this a bit further, and see if I can work out some way of converting the base color texture to sRGB with provided code.

Edit: It appears that the texture is loaded in sRGB format but tagged linear. I can provide a Core Image function that converts the texture to be tagged sRGB, and that works, but it would have to apply only to .usd loaded files.

Yeah, I think in another thread we talked about how the MTKTextureLoader fails to respect sRGB as an option when calling newTexture on an MDLTexture.

Here’s the quick workaround I added to your TextureController.loadTexture method that loads from MDLTexture:

// In the branch that loads from an MDLTexture (the USD path):
print("loaded texture from USD file")
// Reinterpret the texture's bits as sRGB via a texture view.
let tex = texture?.makeTextureView(pixelFormat: .rgba8Unorm_srgb)

textures[name] = tex
return tex

The TextureView basically forces it to interpret the bits as sRGB. It works well.

So with this setup I set my view’s pixel format and colorspace to the sRGB values described above, and the textures in .usdz files load as expected.

I haven’t checked the macOS 26 beta to see if MTKTextureLoader.newTexture is fixed yet. Cheers and thanks for all your work!


I’ll add that it took me a while to wrap my head around color spaces.

What helped me was to think about what RGB 0.5 0.5 0.5 should be – in sRGB it is a mid gray. In linear space it is much lighter.

I’ve written shaders that just emit RGB 0.5 0.5 0.5 to make sure I’m not confused, and eventually I got it figured out.

The summary I keep in mind:

  • Linear has very little dark range near 0 and gets light really fast
  • sRGB is a smoother dark-to-light ramp, with mid gray at 50%
  • shaders always work in linear space
  • sRGB-tagged textures will be converted to linear when sampled
  • sRGB render targets will be converted from linear when written to
  • a linear texture interpreted as sRGB when sampling will look too dark (similarly, a linear framebuffer sent to a monitor expecting sRGB will look too dark; setting the colorspace can force color matching to correct this)
  • an sRGB texture interpreted as linear when sampling will look too bright/washed out
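The summary above can be sanity-checked with the standard sRGB transfer functions. This is just a sketch to put numbers on the mid-gray example; the piecewise constants are the ones from the sRGB spec:

```swift
import Foundation

// sRGB transfer functions (constants per the sRGB spec).
func srgbEncode(_ c: Double) -> Double {   // linear -> sRGB
  c <= 0.0031308 ? 12.92 * c : 1.055 * pow(c, 1 / 2.4) - 0.055
}

func srgbDecode(_ c: Double) -> Double {   // sRGB -> linear
  c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
}

// sRGB mid gray 0.5 is only ~0.214 in linear light, which is why a
// linear buffer displayed as sRGB looks too dark, and why linear 0.5
// encodes to the much brighter ~0.735.
print(srgbDecode(0.5)) // ~0.214
print(srgbEncode(0.5)) // ~0.735
```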

One thing that is cool is that in the real world I can see how linear space impacts lighting: in HomeKit I can set my smart bulb brightness and it is very clearly linear. A 25% setting feels almost like 50% brightness to my eyes.


Yes, I often get it the wrong way around, because you have to take into account whether the value is being transformed back and forth automatically by the GPU (as in reading sRGB textures and rendering to an sRGB render target).

Hopefully the new edition will improve things. Thank you!

Your workaround will convert all textures to sRGB? The normal/roughness/metallic maps should stay linear, I think. It’s a nice workaround though. Better than my Core Image one.

macOS 26 isn’t any better. If they did change it now, it would probably break a lot of stuff!

This works in Submesh.swift, MDLMaterial.texture(type:) :crossed_fingers::

if let property = property(with: semantic),
   property.type == .texture,
   let mdlTexture = property.textureSamplerValue?.texture {
  var texture = TextureController.loadTexture(
    texture: mdlTexture,
    name: property.textureName)
  // Only the base color texture holds sRGB data; normal, roughness
  // and metallic maps stay tagged linear.
  if semantic == .baseColor,
     let loadedTexture = texture,
     loadedTexture.pixelFormat == .rgba8Unorm {
    let sRGBTexture = loadedTexture.makeTextureView(
      pixelFormat: .rgba8Unorm_srgb)
    TextureController.textures[property.textureName] = sRGBTexture
    texture = sRGBTexture
  }
  return texture
}