Question
n_K
I've been toying around with some OpenGL stuff and have been looking into saving and loading binary shaders using OpenGL 3.3... I have the ATI Catalyst 15.x driver with an HD5700 GPU.
One thing doesn't really seem to go with the spec: the OpenGL docs say you should use glGetIntegerv(GL_NUM_SHADER_BINARY_FORMATS, &Var); to see if binary shaders are supported or not... Well, if I do this I get 0, meaning they are not supported (they definitely should be), so I'm ignoring that and just getting the binary of the shader using:
glGetProgramiv(Program, GL_PROGRAM_BINARY_LENGTH, &ShaderSize);
glGetProgramBinary(Program, ShaderSize, NULL, &binaryFormat, binary);
(I've set binaryFormat to 0 and 1, but it gives the same results.) That call returns the compiled shader program - great! The only problem is that it seems to be a complete waste of time. I'm saving the contents to a file and checking it, and inside the file I have an ELF header followed by some binary data, followed by my whole shader source code in ASCII - why? The whole point of binary shaders is to compile them and strip out the source code... After this is some more binary data, another ELF header and what I assume to be the binary shader. Ironically there are also ASCII warnings in this binary shader (e.g. WARNING: warning(#276) Symbol "FragPos" usage doesn't match between two stages).
So what I want to know is: why is GL_NUM_SHADER_BINARY_FORMATS returning 0 (GL_SHADER_BINARY_FORMATS also returns nothing, even if I give it an array with 10 elements) when the card and driver clearly support binary shaders, and why is the ATI hardware/software giving me a binary shader with the full source code of the shader in it?
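For reference, the whole retrieve-and-save path is roughly this (a minimal sketch, assuming a context where glGetProgramBinary is actually available, i.e. ARB_get_program_binary / GL 4.1; the SaveProgramBinary helper, the GLEW header and the output filename are just placeholders, not my exact code):

#include <stdio.h>
#include <stdlib.h>
#include <GL/glew.h> /* or whichever loader provides glGetProgramBinary */

static void SaveProgramBinary(GLuint Program, const char *Path)
{
    /* Query how many shader-binary formats the driver claims to support. */
    GLint NumFormats = 0;
    glGetIntegerv(GL_NUM_SHADER_BINARY_FORMATS, &NumFormats); /* comes back 0 here */
    printf("GL_NUM_SHADER_BINARY_FORMATS = %d\n", NumFormats);

    /* Ask for the size of the linked program's binary, then fetch it. */
    GLint ShaderSize = 0;
    glGetProgramiv(Program, GL_PROGRAM_BINARY_LENGTH, &ShaderSize);

    void *binary = malloc(ShaderSize);
    GLenum binaryFormat = 0; /* output parameter, the driver fills it in */
    glGetProgramBinary(Program, ShaderSize, NULL, &binaryFormat, binary);

    /* Dump the blob to a file so it can be inspected in a hex editor. */
    FILE *File = fopen(Path, "wb");
    fwrite(binary, 1, (size_t)ShaderSize, File);
    fclose(File);
    free(binary);
}

Loading it back later would then just be glProgramBinary(Program, binaryFormat, binary, ShaderSize) with the same blob and format.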