Lighting issue (or not?)

Hi there,

Thank you for the book. It's probably the best single resource out there for learning how to use Metal with Swift.

I’m experimenting with some of the approaches used in the book (3rd edition) in conjunction with game engine design, specifically around GameplayKit’s entity-component system. For instance, all update and render logic, among other things, has been moved to systems acting on components.

This has been fairly successful, with the exception of an issue that at first glance appears to be texture-related, but I think is more likely a lighting problem: I can load models fine (animations and all) and display them, but they render entirely black. Why do I think it's not a texture issue? The materials on the submeshes appear to be initializing correctly. But if the scene has no light, how could I see the materials at all?

I’ve tried adding various lights (a sun, an ambient light, and a few others for good measure), and I create a lightBuffer that looks OK; it’s used in the forward render pass draw to set the fragment buffer and to update uniforms.
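For context, here is roughly what my CPU-side setup looks like. This is a simplified sketch: the `Light` struct, its fields, and the buffer index are assumptions for illustration, not the book's exact types, and the `MTLDevice` call is shown only in comments since it needs a GPU.

```swift
// Simplified light struct; field names and layout are illustrative only.
struct Light {
  var position: (Float, Float, Float) = (0, 0, 0)
  var color: (Float, Float, Float) = (1, 1, 1)
  var type: UInt32 = 0  // e.g. 0 = sun, 1 = ambient
}

var sun = Light()
sun.position = (1, 2, -2)

var ambient = Light()
ambient.type = 1
ambient.color = (0.1, 0.1, 0.1)

let lights = [sun, ambient]
let length = MemoryLayout<Light>.stride * lights.count

// With a real MTLDevice, the buffer would then be created and bound:
// let lightBuffer = device.makeBuffer(bytes: lights, length: length)
// renderEncoder.setFragmentBuffer(lightBuffer, offset: 0, index: lightsIndex)
```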

I have been mindful of the different behaviour of classes and structs with respect to reference and value semantics, and I don’t see any obvious issues, such as updating a copy of a struct while thinking it’s a reference.
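To illustrate the pitfall I mean, here is a tiny standalone example (the types are made up purely for illustration):

```swift
// Made-up types to illustrate struct copy semantics.
struct Params {
  var lightCount: Int = 0
}

final class Renderer {
  var params = Params()
}

let renderer = Renderer()

// Assigning to a local variable COPIES the struct.
var localParams = renderer.params
localParams.lightCount = 3

// The stored struct is untouched by the mutation above.
let countAfterCopy = renderer.params.lightCount  // still 0

// Mutating through the property updates the stored value.
renderer.params.lightCount = 3
```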

My question is, as someone new to both Swift and Metal: where might one look to try to get an idea of what the issue could be here? Keeping in mind that besides moving logic around and operating on components instead of Models in a GameScene, a lot of what I have so far is similar to what you would see in the final chapters of Metal By Tutorials third ed.

Would appreciate any advice. Thank you again!

Hi @spamheat - welcome to the forums, and thank you for such kind words about our book :blush: !

Remember that the black color on the rendered model is solely the output of your fragment shader.

The output depends upon the inputs (materials, textures, lights, etc.) and upon what your shader code does with the combination of those inputs.
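To make that concrete: a typical diffuse calculation multiplies the base color, the light color, and the N·L term together, so if any one factor is zero the result is black, no matter how many lights you add. A hypothetical CPU-side Swift stand-in (this is not your shader, just an illustration of the idea):

```swift
// Hypothetical stand-in for a diffuse fragment calculation.
func shade(baseColor: (Float, Float, Float),
           lightColor: (Float, Float, Float),
           nDotL: Float) -> (Float, Float, Float) {
  let d = max(nDotL, 0)
  return (baseColor.0 * lightColor.0 * d,
          baseColor.1 * lightColor.1 * d,
          baseColor.2 * lightColor.2 * d)
}

// A bright light on a zero base color still gives black:
let lit = shade(baseColor: (0, 0, 0), lightColor: (1, 1, 1), nDotL: 1)
```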

The first place to look is in the Metal frame debugger (Capture GPU Workload). Click the M icon in Xcode above the debug console while your app is running.

When you click on the drawIndexedPrimitives draw call for your model, you will be able to see everything that the GPU has available to it for rendering the model. If you click the bug icon, you can debug the vertex and fragment functions. You can check that the input is what you expect it to be and see what calculations are made on that input.

Personally, whenever I have a query about a piece of Metal code, before I even look closely at the code, I look at the frame capture, to see how all the buffers are set up, and if the data makes sense.

At least you are getting a recognisable render :smiley: - it’s much more frustrating, and very common, to get nothing at all.


Wow! Thank you for this generous response and advice. This gives me a lot to work with in terms of trying to understand what is going on with the rendering.


I solved the issue. It turned out that the fragment color was being calculated based on a texture that wasn’t there. So it ended up with color [0, 0, 0, 0]. I’m not sure if I would have been able to debug this without using the Metal frame capture and debugger (which are fantastic, by the way) to look at those fragment values. So, thank you again!
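For anyone hitting the same thing, one common guard (sketched here with hypothetical types and names, not the book's exact Submesh code) is to record whether a base color texture actually loaded, so the shader can fall back to the material's solid base color instead of sampling a missing texture:

```swift
// Hypothetical material; real code would hold an MTLTexture?.
struct Material {
  var baseColorTexture: AnyObject? = nil   // stand-in for MTLTexture?
  var baseColor: (Float, Float, Float) = (0.8, 0.8, 0.8)
  var hasBaseColorTexture: UInt32 = 0      // flag passed to the shader
}

var material = Material()
material.hasBaseColorTexture = material.baseColorTexture == nil ? 0 : 1

// The fragment shader can then branch instead of sampling nothing:
// float3 color = hasBaseColorTexture != 0
//     ? baseColorTexture.sample(textureSampler, in.uv).rgb
//     : float3(material.baseColor);
```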


I’m so glad you’ve worked out how to use the Metal frame capture.

Use it for everything! It will show you all your render encode commands, and the buffers that are in place at the time of the command.