One more thing… based on my previous questions, I am trying to modify the navigation chapter sample to render a simple mesh (the quad from chapter 4) instead of the full model, but it currently fails.
This is my render call. I have already added the uniforms and params buffers, but with the .triangle option I get only one triangle, with .line just a single line, and .lineStrip gives me two lines.
The vertex buffer has the optimized version with 4 vertices, and the proper indices are being sent.
I know this has something to do with the vertex function in the .metal shader file, but how do I combine the two: applying matrices AND using indexed primitives from buffers?
This is how my vertex buffer looks in the frame capture.
What's weird is that I only have the numbers 1 or 0 in the Quad code. However, when I compare the code, I see no differences between the web version and mine.
Probably there is also an error in how the data for the vertex buffer is displayed. Shall I set the type to float3? float4? packed_float? Currently I already have both triangles displayed, but displaced.
I find it very difficult to debug code with just snippets.
When I debug, I examine the MTLBuffer and vertex descriptor on the CPU side as well as look at the frame capture. The problem could be anything, but I would guess that you are scaling your quad on the CPU side, that you are sending 0.8 to the GPU, and that your vertex descriptor doesn't match the data.
Attached is an image of my debugger. I have already found out that the vertices have to correspond to the vertex descriptor… so 4 floats for position, 3 for normals, 2 for UV. In the bottom-left corner of the screenshot, the contents of `in` are visible, but… it says "out of range" on the right side of the screen, and I cannot understand this.
Your quad mesh consists of Position: Float x 3. However, your vertex descriptor and vertex shader function have position, normal and uv, so they are never going to match up.
You don't need to explicitly send the index buffer to the GPU with its own setVertexBuffer call; the indexed draw call references the index buffer, so it will automatically go to the GPU.
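For reference, a minimal sketch of what the indexed draw could look like, assuming your encoder is called `renderEncoder`, your indices are `UInt16`, and the position buffer sits at index 0 (those names and indices are assumptions, not your exact code):

```swift
// Bind only the vertex data; the index buffer is NOT set with
// setVertexBuffer.
renderEncoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)

// The draw call itself references the index buffer, so Metal
// sends it to the GPU for you.
renderEncoder.drawIndexedPrimitives(
  type: .triangle,
  indexCount: indices.count,
  indexType: .uint16,
  indexBuffer: indexBuffer,
  indexBufferOffset: 0)
```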
Your vertices array consists of packed floats, which the vertex descriptor can't handle. You could send them to the GPU without using a vertex descriptor and take the packed float array directly into the vertex shader without using stage_in. Or you can define your array as [float3].
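To illustrate the first option, here's a rough sketch of a vertex function that reads the packed array directly, without stage_in, and applies the matrices at the same time — which also answers your earlier question about combining matrices with indexed primitives. The buffer indices and the Uniforms layout here are assumptions, not the book's exact code:

```metal
#include <metal_stdlib>
using namespace metal;

struct Uniforms {
  float4x4 modelMatrix;
  float4x4 viewMatrix;
  float4x4 projectionMatrix;
};

vertex float4 vertex_main(
  constant packed_float3 *vertices [[buffer(0)]],
  constant Uniforms &uniforms [[buffer(11)]],
  uint vertexID [[vertex_id]])
{
  // With an indexed draw, vertex_id is the index read from the
  // index buffer, so this works with drawIndexedPrimitives.
  float4 position = float4(vertices[vertexID], 1);
  return uniforms.projectionMatrix * uniforms.viewMatrix
       * uniforms.modelMatrix * position;
}
```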
VertexDescriptor.swift now defines a vertex descriptor for quadLayout with just a float3, and Renderer uses this vertex descriptor when creating the pipeline state object. I've commented out the normal and uv in Shaders.metal, as you're not using those.
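Roughly, that position-only descriptor could look like the sketch below (the extension style is just one way to do it):

```swift
import Metal

extension MTLVertexDescriptor {
  static var quadLayout: MTLVertexDescriptor {
    let descriptor = MTLVertexDescriptor()
    // One attribute: a float3 position in buffer 0.
    descriptor.attributes[0].format = .float3
    descriptor.attributes[0].offset = 0
    descriptor.attributes[0].bufferIndex = 0
    // [float3] in Swift is [SIMD3<Float>], which strides 16 bytes.
    descriptor.layouts[0].stride = MemoryLayout<SIMD3<Float>>.stride
    return descriptor
  }
}
```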
I’ve commented out the setVertexBuffer for the index buffer.
In Quad.swift, I've set up a float3 array for `vertices` and changed the length of the vertex buffer.
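A sketch of that Quad setup, under the same assumptions; the exact vertex positions and index winding are placeholders:

```swift
import Metal

struct Quad {
  // [float3] a.k.a. [SIMD3<Float>]: 16 bytes per element, matching
  // the stride in the vertex descriptor above.
  var vertices: [SIMD3<Float>] = [
    [-1,  1, 0],  // top left
    [ 1,  1, 0],  // top right
    [-1, -1, 0],  // bottom left
    [ 1, -1, 0]   // bottom right
  ]
  var indices: [UInt16] = [0, 1, 2, 2, 1, 3]

  let vertexBuffer: MTLBuffer
  let indexBuffer: MTLBuffer

  init(device: MTLDevice) {
    // The buffer length uses the element stride, not 3 * Float.
    vertexBuffer = device.makeBuffer(
      bytes: vertices,
      length: MemoryLayout<SIMD3<Float>>.stride * vertices.count,
      options: [])!
    indexBuffer = device.makeBuffer(
      bytes: indices,
      length: MemoryLayout<UInt16>.stride * indices.count,
      options: [])!
  }
}
```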
May I ask what changes would be needed in the code in order to also include normals? I plan to use them for lighting purposes.
I have made several attempts, but all of them failed.
As for those "packed" floats… so using the Float data type means the data will be packed? And float3(0, 0, 0) will not be packed, so the shader can work well with it? This has surprised me a bit, although I have read about it in the book.
To add normals, you'd calculate them, add them to an array of float3 just like the position vertices, and add them to the vertex descriptor and also to the structure in the shader. You'd pass the vertex buffer before the draw call, just like the position vertex buffer.
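A sketch of that, extending the quadLayout descriptor from earlier and assuming buffer index 1 is free, with the quad facing the camera (flip the normal's sign if your quad faces the other way):

```swift
// CPU side: one normal per vertex, in its own buffer at index 1.
var normals: [SIMD3<Float>] = [
  [0, 0, -1], [0, 0, -1], [0, 0, -1], [0, 0, -1]
]
let normalBuffer = device.makeBuffer(
  bytes: normals,
  length: MemoryLayout<SIMD3<Float>>.stride * normals.count,
  options: [])!

// Vertex descriptor: add a second attribute in a second layout.
descriptor.attributes[1].format = .float3
descriptor.attributes[1].offset = 0
descriptor.attributes[1].bufferIndex = 1
descriptor.layouts[1].stride = MemoryLayout<SIMD3<Float>>.stride

// Before the draw call, bind it just like the position buffer:
renderEncoder.setVertexBuffer(normalBuffer, offset: 0, index: 1)
```

On the shader side, the stage_in struct would then gain a `float3 normal [[attribute(1)]];` next to the position.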
Have a read of this answer to see if it helps with float3 vs packed floats.
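In short (my own illustration, not from that answer): an array of plain Floats is laid out tightly, while Swift's float3 (SIMD3<Float>) is padded to 16 bytes. You can verify the strides in a playground:

```swift
print(MemoryLayout<Float>.stride * 3)     // 12: three packed floats
print(MemoryLayout<SIMD3<Float>>.stride)  // 16: float3 is padded
```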
So it's another buffer? I thought it could be combined in the quad's vertices variable… but then I'd have to have a tuple of (float3, float3), and that approach does not seem to work.
So… if normals are just another buffer, and they are only combined in the shader… I'm still a bit confused.
It doesn't have to be another buffer. If you lay out the vertex descriptor properly, and you define your vertices array with a structure that contains position and normal, then you can interleave position / normal / position / normal in the same buffer.
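A sketch of the interleaved version, with an assumed Vertex struct; the key points are that the normal's offset is the position's stride, and the layout stride is the whole struct's stride:

```swift
import Metal

struct Vertex {
  var position: SIMD3<Float>
  var normal: SIMD3<Float>
}

let vertices: [Vertex] = [
  Vertex(position: [-1,  1, 0], normal: [0, 0, -1]),
  Vertex(position: [ 1,  1, 0], normal: [0, 0, -1]),
  Vertex(position: [-1, -1, 0], normal: [0, 0, -1]),
  Vertex(position: [ 1, -1, 0], normal: [0, 0, -1])
]

let descriptor = MTLVertexDescriptor()
// Attribute 0: position at the start of each Vertex.
descriptor.attributes[0].format = .float3
descriptor.attributes[0].offset = 0
descriptor.attributes[0].bufferIndex = 0
// Attribute 1: normal, offset past the position, same buffer.
descriptor.attributes[1].format = .float3
descriptor.attributes[1].offset = MemoryLayout<SIMD3<Float>>.stride
descriptor.attributes[1].bufferIndex = 0
// One layout covering the whole interleaved struct.
descriptor.layouts[0].stride = MemoryLayout<Vertex>.stride
```

With this layout, only one setVertexBuffer call is needed, and stage_in pulls both attributes out of the same buffer.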
P.S. Irrelevant if you're using my latest example, but I notice now that you had another error when you were using defaultLayout (which my latest example doesn't use).
It's a good idea to spend a lot of time on buffers, vertex descriptors and checking the GPU frame capture to see what goes onto the GPU. In my opinion, that and matrices are the hardest things about computer graphics with Metal.
Yes… I have spent a lot of time looking into the GPU capture now… I'm starting to understand it more and more. Which timezone are you living in, btw? I also realized that for my purposes (a wireframe of something resembling Minecraft), I can't share vertices, because the normals used for lighting can't be shared. It would mess up the lighting.