Blender vs. parametric mesh coordinates

I have enjoyed the MbT book immensely.

In the simple lighting example (Ch. 5), I created a couple of parametric meshes and added them to the models for display. However, they have the wrong orientation relative to the Blender models – the y direction points sideways. I know that the Blender and Metal coordinate systems are different, but is there a place in the code where you implicitly rotate coordinates from the Blender orientation to the left-handed orientation? Thanks!

Are you exporting from Blender with Y up? There are options for that (for .obj, anyway) on export.

We opted to use the left-handed coordinate system, which flips Blender models on import – but I think that’s along the x axis.

You could try a matrix operation in the vertex shader to flip the vertices back along one axis.
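Something along these lines, for example (just a sketch – the flipX helper and where you fold it into the model matrix are illustrative, not code from the book):

import simd

// A scale matrix that mirrors geometry along the x axis.
// Folding it into the model matrix flips the mesh back
// without touching the shader itself.
func flipX() -> float4x4 {
  float4x4(diagonal: SIMD4<Float>(-1, 1, 1, 1))
}

// e.g. before setting the uniforms:
// uniforms.modelMatrix = flipX() * transform.modelMatrix

Bear in mind that a negative scale also mirrors the normals and reverses the triangle winding, so you may need to adjust the front-facing winding on the render command encoder.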

Hi Caroline,

I am using the models provided with the supporting materials for the book (treefir, train). When I render them, they have the y normals pointing up, as I expect. But when I create and display (using the shaders from Ch. 5) a new MDLMesh using boxWithExtent, the resulting box object has its y normals pointing in the z direction.
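For reference, I’m creating the box roughly like this (a sketch – the device setup and the extent/segment values here are placeholders, not exactly what’s in my project):

import MetalKit

let device = MTLCreateSystemDefaultDevice()!
let allocator = MTKMeshBufferAllocator(device: device)
let mdlMesh = MDLMesh(boxWithExtent: [1, 1, 1],
                      segments: [1, 1, 1],
                      inwardNormals: false,
                      geometryType: .triangles,
                      allocator: allocator)
let mesh = try! MTKMesh(mesh: mdlMesh, device: device)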

Thanks again for your help! This is the coolest book I have read in a long time.

Jason

Do you have a project I can look at?

Here’s my modification of the final version of the Chapter 5 project. I’m using a fragment shader that colors upward-pointing normals blue and downward-pointing normals red. The y normals for the sphere point sideways! Thanks again!

Chapter5.Lighting.final.zip (147.3 KB)

If you place a breakpoint on line 53 in Model.swift (that’s meshes = [mesh]) and then do po mdlMesh.vertexDescriptor in the console, you’ll get this:

[Screenshot: output of po mdlMesh.vertexDescriptor]

However, when you read in the Blender models, you’re using an entirely different vertex descriptor layout – the one from MDLVertexDescriptor.defaultVertexDescriptor.

You’ll need to match your vertex descriptors or use a different pipeline state.

If you change the way you read in the Blender models – i.e. change the defaultVertexDescriptor to match the sphere’s vertex descriptor – then it will work:

import ModelIO

extension MDLVertexDescriptor {
  // Interleaved layout matching the parametric mesh:
  // position (float3) + normal (float3) + uv (float2), all in buffer 0.
  static var defaultVertexDescriptor: MDLVertexDescriptor = {
    let vertexDescriptor = MDLVertexDescriptor()
    var offset = 0

    // Attribute 0: position
    vertexDescriptor.attributes[0] =
      MDLVertexAttribute(name: MDLVertexAttributePosition,
                         format: .float3,
                         offset: offset,
                         bufferIndex: 0)
    offset += 12  // float3 = 12 bytes

    // Attribute 1: normal
    vertexDescriptor.attributes[1] =
      MDLVertexAttribute(name: MDLVertexAttributeNormal,
                         format: .float3,
                         offset: offset,
                         bufferIndex: 0)
    offset += 12

    // Attribute 2: texture coordinates
    vertexDescriptor.attributes[2] =
      MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate,
                         format: .float2,
                         offset: offset,
                         bufferIndex: 0)
    offset += 8   // float2 = 8 bytes

    // Single interleaved buffer with a 32-byte stride
    vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: offset)
    return vertexDescriptor
  }()
}

[Screenshot of the result]

Thank you for uploading your project - it makes it much easier!

Thank you so much! This is subtle. MDLMesh must be using its own default vertex descriptor, but I can’t find anything in Apple’s documentation describing it :-/
I like your idea of using a different pipeline state – more elegant.
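If it helps anyone else later, here’s roughly what that second pipeline state could look like (a sketch – vertex_main and fragment_main are the Chapter 5 shader functions, while makePrimitivePipelineState and the pixel formats are just placeholders for whatever the project uses):

import MetalKit

// Build a pipeline state whose vertex descriptor comes from the
// parametric mesh itself rather than the shared defaultVertexDescriptor.
func makePrimitivePipelineState(device: MTLDevice,
                                library: MTLLibrary,
                                mdlMesh: MDLMesh) throws -> MTLRenderPipelineState {
  let descriptor = MTLRenderPipelineDescriptor()
  descriptor.vertexFunction = library.makeFunction(name: "vertex_main")
  descriptor.fragmentFunction = library.makeFunction(name: "fragment_main")
  descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
  descriptor.depthAttachmentPixelFormat = .depth32Float
  // Convert the mesh's own Model I/O descriptor to a Metal one.
  descriptor.vertexDescriptor =
    MTKMetalVertexDescriptorFromModelIO(mdlMesh.vertexDescriptor)
  return try device.makeRenderPipelineState(descriptor: descriptor)
}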

Thanks again,
JDH