Kodeco Forums

OpenGL ES 2.0 for iPhone Tutorial Part 2: Textures

In this tutorial series, our aim is to take the mystery and difficulty out of OpenGL ES 2.0, by giving you hands-on experience using it from the ground up! In the first part of the series, we covered the basics of initializing OpenGL, creating some simple vertex and fragment shaders, and presenting a simple rotating […]


This is a companion discussion topic for the original entry at https://www.raywenderlich.com/3047-opengl-es-2-0-for-iphone-tutorial-part-2-textures

And here is the equivalent in Swift: https://github.com/gsabran/HelloOpenGL_Swift (based on the part 1 port by drouck: https://github.com/drouck/HelloOpenGL_Swift)

Let me know if you have any comments/suggestions!

Thanks for the great tutorial! Here is a project in Objective-C, built with Xcode 8.2.1: https://gitlab.com/kirstone/OpenGL-ES-Tutorial-RayWenderlich

Great tutorials. I do have a follow-up question – if I wanted to take a 2D CGPath and turn it into vertex coordinates, should I just use the x,y values from my path and 0 for the z coordinate? Or is there something else (math?) that I need to be thinking about? My end goal is to be able to apply textures to regions defined by VNFaceObservations. I’m already able to get landmark points from the Vision API, but I’m struggling a bit with how to use those points as the basis for vertex coordinates… maybe I’m overthinking it? I guess my concern with just zeroing out the Z axis is that the final texture will feel flat rather than feeling as if it has depth… Any resources/links/suggestions appreciated.

This tutorial is more than six months old so questions are no longer supported at the moment for it. We will update it as soon as possible. Thank you! :]