
AMD OpenGL binary shader programs


Question

I've been toying around with some OpenGL stuff and have been looking into saving and loading binary shaders using OpenGL 3.3... I have an ATI Catalyst 15.x driver with an HD 5700 GPU.
One thing doesn't really seem to match the spec: the OpenGL docs say you should use glGetIntegerv(GL_NUM_SHADER_BINARY_FORMATS, &Var); to check whether binary shaders are supported or not. If I do this, I get 0, meaning they are not supported (they definitely should be), so ignoring that and just getting the binary of the shader using:
glGetProgramiv(Program, GL_PROGRAM_BINARY_LENGTH, &ShaderSize);
glGetProgramBinary(Program, ShaderSize, NULL, &binaryFormat, binary);

(I've set binaryFormat to 0 and 1, but it gives the same results) returns the compiled shader program - great! The only problem is that it seems to be a complete waste of time. I'm saving the contents to a file and checking it, and inside the file I have an ELF header, followed by some binary data, followed by my whole shader source code in ASCII - why? The whole point of binary shaders is to compile them and remove the source code... After this is some more binary data, another ELF header and what I assume to be the binary shader. Ironically, there are also ASCII warnings in this binary shader (e.g. WARNING: warning(#276) Symbol "FragPos" usage doesn't match between two stages).
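For reference, the retrieve-and-save sequence described above, written out as a small helper - the function name, buffer handling and output file name here are illustrative, assuming a current GL context and the standard C headers:

/* Fetch the linked program's binary and dump it to disk for inspection. */
static void DumpProgramBinary(GLuint Program)
{
	GLint ShaderSize = 0;
	glGetProgramiv(Program, GL_PROGRAM_BINARY_LENGTH, &ShaderSize);

	void   *binary       = malloc(ShaderSize);
	GLenum  binaryFormat = 0;   /* glGetProgramBinary writes the format here */
	GLsizei written      = 0;
	glGetProgramBinary(Program, ShaderSize, &written, &binaryFormat, binary);

	FILE *f = fopen("program.bin", "wb");
	if (f) {
		fwrite(binary, 1, (size_t)written, f);
		fclose(f);
	}
	free(binary);
}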

So what I want to know is: why is GL_NUM_SHADER_BINARY_FORMATS returning 0 (GL_SHADER_BINARY_FORMATS also returns nothing, even if I provide it an array with 10 elements) when the card and driver clearly support binary shaders, and why is the ATI hardware/software giving me a binary shader with the full source code of the shader in it?


5 answers to this question



I can't actually load the binary shader: I get an error, but glGetProgramInfoLog() returns absolutely nothing, even when I trim everything off up to the second ELF header - great work AMD, can't possibly imagine why your market share is sliding...



You created a context, right?

Yes, that's the first thing I do (using GLFW). It's able to compile and use the shaders fine, but it just won't present me with any supported formats or accept binary shaders.
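For reference, a minimal GLFW setup along those lines, requesting a 3.3 core context before the shaders are compiled - the window size, title and error handling here are illustrative:

#include <GLFW/glfw3.h>

int main(void)
{
	if (!glfwInit())
		return 1;

	/* Ask for an OpenGL 3.3 core context. */
	glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
	glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
	glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

	GLFWwindow *window = glfwCreateWindow(640, 480, "binary shaders", NULL, NULL);
	if (!window) {
		glfwTerminate();
		return 1;
	}
	glfwMakeContextCurrent(window);

	/* ...load GL function pointers, compile/link shaders,
	   query GL_NUM_PROGRAM_BINARY_FORMATS, etc... */

	glfwDestroyWindow(window);
	glfwTerminate();
	return 0;
}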



I did a quick test and I think the problem is that you're mixing shader binary formats with program binary formats. I believe GL_NUM_SHADER_BINARY_FORMATS is meant to be used with OpenGL ES and the glShaderBinary function. On a non-embedded system such as the PC, you should be querying GL_NUM_PROGRAM_BINARY_FORMATS:

#include <stdio.h>
#include <stdlib.h>
#include <GL/glut.h>

void
init() {
	glClearColor(0,0,0,0);
	
	GLint numFormats = 0;
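	/* Number of program binary formats exposed by the driver
	   (ARB_get_program_binary, core since OpenGL 4.1). */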
	glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &numFormats);
	printf("%i\n", numFormats);
}	

void
display() {
	glClear(GL_COLOR_BUFFER_BIT);
	glutSwapBuffers();
}	

int
main (int argc, char **argv) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
	glutInitWindowSize(640, 480);
	glutCreateWindow(argv[0]);
	init();
	glutDisplayFunc(display);
	glutMainLoop();

	return EXIT_SUCCESS;
}	
$ gcc -o formats formats.c -lglut -lGL && ./formats
1

As can be seen above, my R9 270 / Catalyst returns 1.
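For completeness, a rough sketch (not from the original posts) of the save/load round trip built on that query, using the glGetProgramBinary / glProgramBinary pair - the buffer handling and the fallback are illustrative, and a context with ARB_get_program_binary or OpenGL 4.1+ is assumed:

/* 'program' is assumed to be a successfully linked program object. */
GLint length = 0;
glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);

void  *blob   = malloc(length);
GLenum format = 0;
glGetProgramBinary(program, length, NULL, &format, blob);
/* ...write both 'format' and 'blob' to disk... */

/* Later, ideally on the same driver and GPU: */
GLuint restored = glCreateProgram();
glProgramBinary(restored, format, blob, length);

GLint linked = GL_FALSE;
glGetProgramiv(restored, GL_LINK_STATUS, &linked);
if (!linked) {
	/* The driver rejected the cached binary (e.g. after a driver update);
	   fall back to compiling from source. */
}
free(blob);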

