Check your context if glCreateShader returns 0 and GL_INVALID_OPERATION

I spent half a day hunting weird bugs in Android’s Native Activity.

The one I’ll talk about here is when glCreateShader suddenly returns 0 and glGetError sends you back 0x502 (GL_INVALID_OPERATION, or 1282 in decimal).

I had a lot of trouble finding a working solution for this problem, so here it is, for the record.
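
For reference, the failing pattern looks something like this (a minimal sketch of the check, using the same log tag as later in this post):

    GLuint shader = glCreateShader(GL_VERTEX_SHADER);
    if (shader == 0) {
      /* glGetError() returns 0x0502, i.e. GL_INVALID_OPERATION (1282) */
      __android_log_print(ANDROID_LOG_ERROR, "native-insanity",
                          "glCreateShader failed, glGetError() = 0x%04x",
                          glGetError());
    }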

The solution that worked for me

Pass an EGLint array { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE } as the last parameter of your eglCreateContext call, to make sure that EGL sets up an OpenGL ES 2.0 context.

Defining { ..., EGL_CONFORMANT, EGL_OPENGL_ES2_BIT, ... } in your configuration attribute list is NOT enough. I guess a conformant OpenGL ES 2.0 configuration can still be used to create an OpenGL ES 1.x context.

Example

    const EGLint attribs[] = {
      EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
      EGL_BLUE_SIZE, 8,
      EGL_GREEN_SIZE, 8,
      EGL_RED_SIZE, 8,
      EGL_CONFORMANT, EGL_OPENGL_ES2_BIT,
      EGL_NONE
    };

    const EGLint GiveMeGLES2[] = {
      EGL_CONTEXT_CLIENT_VERSION, 2,
      EGL_NONE
    };

    /* ... */

    eglChooseConfig(display, attribs, &config, 1, &numConfigs);

    eglGetConfigAttrib(display, config, EGL_NATIVE_VISUAL_ID, &format);

    ANativeWindow_setBuffersGeometry(window, 0, 0, format);

    surface = eglCreateWindowSurface(display, config, window, NULL);
    context = eglCreateContext(display, config, NULL, GiveMeGLES2);

Passing EGL_CONTEXT_CLIENT_VERSION as the third parameter of eglQueryContext can tell you which OpenGL ES version your context was prepared for.

    EGLint v = 0;
    eglQueryContext(display, context, EGL_CONTEXT_CLIENT_VERSION, &v);

    __android_log_print(ANDROID_LOG_ERROR, LOGTAG, "Context prepared for OpenGL ES : %d", v);

Red herrings

Shader calls should be within a GL thread that is onSurfaceChanged(), onSurfaceCreated() or onDrawFrame(). (from StackOverflow)

This was not the issue in my case. Checking it was easy, though, thanks to the Bionic team, who were kind enough to implement the gettid system call.

You will need the following headers to get gettid working:

    #include <unistd.h>
    #include <sys/syscall.h>
    #include <sys/types.h>

Then, for example, this logs the current thread ID:

    __android_log_print(ANDROID_LOG_ERROR, "native-insanity", "Thread ID : %d", gettid());

You can use this to log the IDs of the threads calling your different procedures and compare them.
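
For instance, a small helper like this lets you tag each log line with the procedure it came from (a sketch; the function names in the comments are just examples):

    static void log_thread(const char *where) {
      /* Logs which thread is executing the given procedure. */
      __android_log_print(ANDROID_LOG_ERROR, "native-insanity",
                          "%s runs on thread %d", where, gettid());
    }

    /* Call it from the procedures you want to compare, e.g.
       log_thread("engine_init_display");  -- where eglCreateContext is called
       log_thread("engine_draw_frame");    -- where glCreateShader is called  */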

glCreateShader will return GL_INVALID_OPERATION if called between glBegin and glEnd. (from old OpenGL manuals)

This is a non-issue with OpenGL ES 2.x and later, since those calls do not exist there.

Note that the current Khronos manuals do not even mention this error.

It really took me a LOT of time to understand what went wrong here, given the lack of documentation on this issue. I’d really appreciate it if some website compiled all the potential issues you can run into with OpenGL, along with potential solutions and hints.

Depth testing: Context is everything

I just lost a few hours playing with Z values between draw calls, trying out Z-layering as advised by peterharris on ARM Community in my question "For binary transparency: Clip, discard or blend?".

However, for reasons I did not understand at the time, the Z values seemed to be completely ignored. Only the order of the glDraw* calls was taken into account.

The solution

Request a nonzero EGL_DEPTH_SIZE when selecting the display configuration for your EGL context.
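
Concretely, with the attribute list from the first example, that means adding something like this (16 bits here is just an assumption; request whatever precision you need):

    const EGLint attribs[] = {
      EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
      EGL_BLUE_SIZE, 8,
      EGL_GREEN_SIZE, 8,
      EGL_RED_SIZE, 8,
      EGL_DEPTH_SIZE, 16,    /* without this, you may get a config with no depth buffer */
      EGL_CONFORMANT, EGL_OPENGL_ES2_BIT,
      EGL_NONE
    };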

Thought pattern

I really tried everything:

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glDepthMask(GL_TRUE);
    glClearDepthf(1.0f);
    glDepthRangef(0.1f, 1.0f);
    glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);

Still… each glDrawArrays call drew its pixels over previously drawn pixels that had a smaller Z value. I switched the value passed to glDepthFunc, swapped the Z values… same result.

I really started to think that Z-layering only worked within a single draw batch…

Until I searched the OpenGL wiki for "Depth Buffer" information and stumbled upon Common Mistakes: Depth Testing Doesn't Work:

Assuming all of that has been set up correctly, your framebuffer may not have a depth buffer at all. This is easy to see for a Framebuffer Object you created. For the Default Framebuffer, this depends entirely on how you created your OpenGL Context.
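
One quick way to confirm this on the OpenGL ES 2.0 side, with the default framebuffer bound (a diagnostic sketch):

    GLint depthBits = 0;
    glGetIntegerv(GL_DEPTH_BITS, &depthBits);
    __android_log_print(ANDROID_LOG_ERROR, "native-insanity",
                        "Depth buffer bits: %d", depthBits);
    /* 0 means the context has no depth buffer at all,
       so no glDepth* call will ever have an effect. */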

Not again

After a quick web search for EGL depth buffer, I found the EGL manual page, eglChooseConfig - EGL Reference Pages, which states the following:

EGL_DEPTH_SIZE

Must be followed by a nonnegative integer that indicates the desired depth buffer size, in bits. The smallest depth buffers of at least the specified size is preferred. If the desired size is zero, frame buffer configurations with no depth buffer are preferred. The default value is zero.

The depth buffer is used only by OpenGL and OpenGL ES client APIs.

I should have known…