[xml] glGetString commands use "GLubyte*" instead of "GLchar*" for strings #363
Comments
The group will consider this, though it's unlikely we're going to change an actual API signature.
Can you explain in more detail what would be bad about changing the signature in this way?
And, as I said, there are other commands in the .xml's that are using … And even if the problem is only visual, it is still wrong. This is like using …
Changing the signature would be a breaking change; a beneficial one perhaps, but a breaking change nonetheless. As you said, …

Chipping in my own opinion here, I do agree that …
Well, it wouldn't be breaking, but… Just checked: they aren't actually the same, …
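For reference, a minimal sketch of the point being made here: the typedefs below mirror what the standard GL headers declare, and the small program around them (illustrative only) shows that the two types match in size and representation but remain distinct C types.

```c
/* Typedefs as declared by the standard GL headers (gl.h / glcorearb.h): */
typedef unsigned char GLubyte;
typedef char          GLchar;   /* added with OpenGL 2.0 */

int main(void)
{
    /* Same size and representation... */
    _Static_assert(sizeof(GLubyte) == sizeof(GLchar), "both are one byte");

    /* ...but distinct pointer types: assigning a GLubyte* to a GLchar*
     * without a cast draws a warning in C and an error in C++, which is
     * the source-compatibility concern with changing the prototype. */
    const GLubyte *u = (const GLubyte *)"OpenGL";
    const GLchar  *c = (const GLchar *)u;
    return (u[0] == 'O' && c[0] == 'O') ? 0 : 1;
}
```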
Well, in that case I think it's understandable not to want to fix the inconsistency. But then there should be a comment in the .xml and something about this in the spec. As I said in my first comment here: …
Gotcha. So all that's needed is some clarification, I think, and then this issue is resolved.
It's not clear to me where the proposed clarification should go. In the XML? @SunSerega do you have a PR in mind?
The clarification should explain that while the prototype shows … And no, I haven't yet thought about the exact wording for a PR.
This is a spec issue, not XML. The utility of noting that there's something peculiar about the return type of a nearly 30 (?!) year-old API seems minimal to me, but we could put in a footnote.
We discussed this in the OpenGL/ES working group meeting today. I learnt an interesting bit of history that gives some explanation as to why these functions have been specified this way.

Way back in the OpenGL 1.x days there were no functions which take a string from the application as input to the OpenGL implementation. For returning a string from OpenGL to the application there is "GLubyte * glGetString()", and in that case the OpenGL spec defines how the bytes are to be interpreted, so it could use GLubyte.

Then along came GL_ARB_shader_objects, which added a way for the application to pass a string into OpenGL. If you read issue 6 of that extension you'll note that it was a conscious decision to add the new GLcharARB type. Now that strings were being passed from the application to OpenGL, it made sense for OpenGL to accept standard C/C++ strings as is, without trying to reinterpret them as GLubyte. Thus GLchar was born and added to OpenGL 2.0. For the reasons suggested above, the old glGetString() functions were not changed, so that they remained strictly backwards compatible.

So the fact that glGetString returns GLubyte is not a typo, just an artifact of history. I suggest we don't change the XML or spec. Maybe just add a comment to the XML pointing to this issue.
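To make the two directions concrete, here is a minimal sketch. It assumes a current OpenGL 2.0+ context with the entry points available (e.g. through a loader such as glad or GLEW); the function name and shader source are illustrative only.

```c
#include <stdio.h>
#include <GL/gl.h>   /* or your loader's header, e.g. glad/glad.h */

static void strings_in_and_out(GLuint shader)
{
    /* OpenGL -> application: glGetString() predates GLchar and returns
     * GLubyte*; the spec says how the bytes are to be interpreted, so
     * callers simply cast to char* for use with the C string functions. */
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    /* Application -> OpenGL: the GL 2.0 shader API (following
     * GL_ARB_shader_objects and its GLcharARB) takes GLchar, i.e. plain
     * char, so ordinary C string literals pass through unchanged. */
    const GLchar *src = "void main() { gl_Position = vec4(0.0); }";
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);
}
```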
See #396 which closes this issue.
…for reference) (these are only 2 functions using `GLubyte*` as string).

Type `GLchar` is already defined and used by some extensions. And it makes much more sense in this context than `GLubyte` with `group="String"`.
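For context, the prototypes in question as they appear in glcorearb.h (GLAPI/APIENTRY macros omitted here), with a GLchar-based command from the GL 2.0 shader API shown for contrast:

```c
/* The two string-returning queries typed as GLubyte*: */
const GLubyte *glGetString (GLenum name);
const GLubyte *glGetStringi(GLenum name, GLuint index);

/* A GLchar-based command from the GL 2.0 shader API, for comparison: */
void glGetShaderInfoLog(GLuint shader, GLsizei bufSize,
                        GLsizei *length, GLchar *infoLog);
```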