opengl - Linux GLEW with GLX segfault (core dumped)


I am trying to set up a very basic program using the OpenGL 3.2 core profile on Linux, with the help of GLEW.

This is my code:

  #define GLEW_STATIC
  #include <iostream>
  #include <cstdio>
  #include <string>
  #include <GL/glew.h>
  #include <X11/Xlib.h>
  #include <X11/Xutil.h>
  #include <GL/gl.h>
  #include <GL/glx.h>
  //#include <GL/glut.h>
  #include "stb_image_write.h"
  #include <cstdlib>
  #include <GL/glfw.h>

  static float _viewPortHeight = 30;
  static float _viewPortWidth = 10;

  typedef GLXContext (*glXCreateContextAttribsARBProc)(Display*, GLXFBConfig, GLXContext, Bool, const int*);
  typedef Bool (*glXMakeContextCurrentARBProc)(Display*, GLXDrawable, GLXDrawable, GLXContext);
  static glXCreateContextAttribsARBProc glXCreateContextAttribsARB = NULL;
  static glXMakeContextCurrentARBProc glXMakeContextCurrentARB = NULL;

  int main(int argc, char* argv[])
  {
      glXCreateContextAttribsARB = (glXCreateContextAttribsARBProc) glXGetProcAddressARB((const GLubyte*)"glXCreateContextAttribsARB");
      glXMakeContextCurrentARB = (glXMakeContextCurrentARBProc) glXGetProcAddressARB((const GLubyte*)"glXMakeContextCurrent");

      Display* display = XOpenDisplay(NULL);
      if (display == NULL)
      {
          std::cout << "error getting the X display";
          return -1;
      }

      static int visualAttribs[] = { None };
      int numberOfFrameBufferConfigurations;
      GLXFBConfig* fbConfigs = glXChooseFBConfig(display, DefaultScreen(display), visualAttribs, &numberOfFrameBufferConfigurations);

      int context_attribs[] = {
          GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
          GLX_CONTEXT_MINOR_VERSION_ARB, 2,
          GLX_CONTEXT_FLAGS_ARB, GLX_CONTEXT_DEBUG_BIT_ARB,
          GLX_CONTEXT_PROFILE_MASK_ARB, GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
          None
      };

      std::cout << "initialising context...";
      GLXContext openGLContext = glXCreateContextAttribsARB(display, fbConfigs[0], 0, True, context_attribs);

      int pBufferAttribs[] = {
          GLX_PBUFFER_WIDTH, 32,
          GLX_PBUFFER_HEIGHT, 32,
          None
      };
      GLXPbuffer pbuffer = glXCreatePbuffer(display, fbConfigs[0], pBufferAttribs);

      XFree(fbConfigs);
      XSync(display, False);

      if (!glXMakeContextCurrent(display, pbuffer, pbuffer, openGLContext))
      {
          std::cout << "error making the context current";
          return -1;
      }

      glXMakeCurrent(display, pbuffer, openGLContext);

      GLenum error = glewInit();
      if (error != GLEW_OK)
      {
          std::cout << "error with glewInit()\n";
      }
      else
      {
          std::cout << "glew is ok\n\n";
      }

      GLuint test;
      GLuint framebuffer;
      glGenFramebuffers(1, &framebuffer);   // <-- this is the call that segfaults

      glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
      glClear(GL_COLOR_BUFFER_BIT);

      std::string path("output.png");
      exportToPath(path);                   // image export helper (defined elsewhere)

      return 0;
  }

The output is this:

initialising context... glew is ok

Segmentation fault (core dumped)

The line that causes the problem is the call to glGenFramebuffers, which is also the first call to a function loaded by GLEW.
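For what it's worth, GLEW exposes each loaded entry point as a function pointer behind a macro, so the pointer can be tested before the first call. Below is a minimal diagnostic sketch of that idea; the glewExperimental line is GLEW's documented switch for resolving entry points on core profile contexts and is not in my code above:

  glewExperimental = GL_TRUE;        // let GLEW resolve entry points on a core profile context
  GLenum glewError = glewInit();
  if (glewError != GLEW_OK)
  {
      std::cout << "glewInit() failed: " << glewGetErrorString(glewError) << "\n";
      return -1;
  }
  if (glGenFramebuffers == NULL)     // under GLEW this is a function pointer, so it can be tested
  {
      std::cout << "glGenFramebuffers was not resolved; calling it would crash\n";
      return -1;
  }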

Can anyone point me in the right direction?

glew.h already includes gl.h, and the GLX side is covered by GLEW as well (via glxew.h), which uses some macro magic to replace certain symbols. I recommend that you remove the lines

  #include <GL/gl.h>
  #include <GL/glx.h>

and use just

  #include <GL/glew.h>

Once GLEW's header has been included, some of the macro replacements of the original symbols may get lost, and you end up with a wrongly linked symbol, which results in the crash.
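Spelled out, the include block would then look something like this (a sketch assuming GLEW's companion header glxew.h for the GLX declarations; the X11 headers are the usual ones, not taken from the question):

  #define GLEW_STATIC
  #include <GL/glew.h>     // brings in the GL declarations itself
  #include <GL/glxew.h>    // GLEW's GLX header, used instead of <GL/glx.h>
  #include <X11/Xlib.h>
  #include <X11/Xutil.h>
  // no <GL/gl.h> and no <GL/glx.h> after this point

The important part is that glew.h is seen before anything else that declares the GL symbols; glew.h even raises a compile error if gl.h was included before it.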


One note: why do you include the GLFW and GLUT headers at all? If you intend to use raw GLX, you do not need them, and they should not be included.

