Fire Emitter Texture

In Chapter 16, when we add the fire, I noticed that the fire texture has a great deal of detail. However, in the scene it appears as if it's just a mask, and the detail is lost. I understand why this is the case for the snowflakes, but I'm not sure why it is for the fire. Is it the way the alpha blending is done? I tried adjusting the discard_fragment threshold, but that didn't change anything.

I’m not quite sure I understand what you’re asking. The fire texture isn’t losing any detail. If you replace the fire texture with the snowflake texture in Particles.swift, you’ll see that the texture is unchanged, albeit smaller than the snow.

You can also append the snow emitter in mtkView(_:drawableSizeWillChange:), to compare the two at the same time.

Blending will make it look different. The color of the fire is coming from descriptor.color in Particles.swift. If you comment out the four blending lines of code in buildPipelineStates(), then you can see the fire without blending.

What I meant was that there is a lot of alpha channel detail in the fire texture not present in the snowflake texture. But this detail does not show up in the translucency of the fire particles.

For example, I altered the fire texture as shown. However, the fire particle's translucency does not mirror the texture's translucency. The particle seems to be either uniformly translucent or fully transparent. I initially thought this was because we are simply discarding fragments below an alpha-channel threshold, but that does not seem to be the case, as lowering the threshold, or removing it, simply results in more of the fire being translucent.

I would expect this result with the snowflake texture as it seems to be binary.



The pipeline’s blending mode has a lot to do with defining how transparency and blending works.

In the shader, remove the discard_fragment() call, and ensure that you have:

color = float4(color.rgb, color.a);

to pass along the alpha.
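For context, here's a minimal sketch of a particle fragment function with the alpha passed through. The identifiers (fragment_particle, particleTexture, in.uv) are assumptions for illustration, not necessarily the book's exact code:

```metal
// Sketch only; names are assumed, not the book's exact code.
fragment float4 fragment_particle(
  VertexOut in [[stage_in]],
  texture2d<float> particleTexture [[texture(0)]])
{
  constexpr sampler textureSampler(filter::linear);
  float4 color = particleTexture.sample(textureSampler, in.uv);
  // No discard_fragment() and no hardcoded alpha of 1.0:
  // keep the texture's alpha so translucency varies per texel
  // instead of being all-or-nothing.
  color = float4(color.rgb, color.a);
  return color;
}
```

With a per-texel alpha reaching the blend stage, the pipeline's blend factors can do the rest.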

Here’s a png that goes from 100% opaque blue at the left to 0% opaque red at the right.


Try this blending in buildPipelineStates():

// result.rgb = source.rgb * source.a + destination.rgb * (1 - source.a)
descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
descriptor.colorAttachments[0].isBlendingEnabled = true
descriptor.colorAttachments[0].rgbBlendOperation = .add
descriptor.colorAttachments[0].sourceRGBBlendFactor = .sourceAlpha
descriptor.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
// The alpha factors are the defaults: result.a = source.a
descriptor.colorAttachments[0].sourceAlphaBlendFactor = .one
descriptor.colorAttachments[0].destinationAlphaBlendFactor = .zero
descriptor.colorAttachments[0].alphaBlendOperation = .add

The three lines of alpha blending code are not necessary, as they are the defaults, but I’ve put them in so that you know about them.

It’s all about color arithmetic. Here’s an article about blending (it’s for DirectX, but the same principles apply): DirectX10 Tutorial 6: Blending and Alpha Blending | Taking Initiative
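To make the color arithmetic concrete, here's a small sketch in plain Swift of what the GPU does per pixel with the `.sourceAlpha` / `.oneMinusSourceAlpha` factors above. The `RGBA` struct and `blend` function are illustrations only, not Metal API:

```swift
// Illustration of the fixed-function blend equation, assuming
// rgbBlendOperation = .add, sourceRGBBlendFactor = .sourceAlpha,
// destinationRGBBlendFactor = .oneMinusSourceAlpha.
struct RGBA {
  var r, g, b, a: Float
}

func blend(source: RGBA, destination: RGBA) -> RGBA {
  // result.rgb = source.rgb * source.a + destination.rgb * (1 - source.a)
  let sa = source.a
  return RGBA(
    r: source.r * sa + destination.r * (1 - sa),
    g: source.g * sa + destination.g * (1 - sa),
    b: source.b * sa + destination.b * (1 - sa),
    // The default alpha factors give: source.a * 1 + destination.a * 0
    a: source.a
  )
}

// A half-transparent red fragment over an opaque blue background:
let result = blend(
  source: RGBA(r: 1, g: 0, b: 0, a: 0.5),
  destination: RGBA(r: 0, g: 0, b: 1, a: 1)
)
// result.r == 0.5, result.b == 0.5: an even mix of red and blue.
```

This is why passing the texture's alpha through matters: a texel with alpha 0.3 contributes only 30% of its color, giving the gradual translucency you expected.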

There’s also a section about blending in Chapter 10, Fragment Post-Processing.

Thank you so much. For some reason I was not thinking about passing color.a to the output color. But this makes complete sense. Once I did that the effect was as I expected.

Thank you for the reference material. I will be sure to take a close look.
