GDI Renderer and LWJGL

Started by Jbgohlke, April 14, 2006, 23:27:08


Jbgohlke

Now, this is really messing with my head, guys.

I have a Dell PC over here that doesn't have a decent graphics chipset (Some obscure ATi integrated thing) and ends up using the GDI Generic as the renderer for anything OpenGL.

So here's my question: Why can I not run anything LWJGL using this renderer? Here are some specs from GLView:

Quote
System Info
Windows XP Professional

Vendor
Microsoft Corporation
1.1.0

Renderer
GDI Generic

Extensions
GL_EXT_bgra
GL_EXT_paletted_texture
GL_WIN_swap_hint


Core features

v1.1 (100 % - 7/7)
v1.2 (12 % - 1/8)
v1.3 (0 % - 0/9)
v1.4 (0 % - 0/15)
v1.5 (0 % - 0/3)
v2.0 (0 % - 0/9)
OpenGL driver version check (Current: 6.13.3036, Latest known: 6.13.3036):
Latest version of display drivers found

According to the database, you are running the latest display drivers for your video card.
No hardware support

Your current video configuration DOES NOT support hardware accelerated OpenGL.
No compiled vertex array support

This may cause performance loss in some applications.
No paletted texture support

This may cause performance loss in some older applications.
No multitexturing support

This may cause performance loss in some applications.
No secondary color support

Some applications may not render polygon highlights correctly.
No S3TC compression support

This may cause performance loss in some applications.
No texture edge clamp support

This feature adds clamping control to edge texel filtering. Some programs may not render textures correctly (black line on borders.)
No vertex program support

This feature enables vertex programming (equivalent to DX8 Vertex Shader.) Some current or future OpenGL programs may require this feature.
No fragment program support

This feature enables per pixel programming (equivalent to DX9 Pixel Shader.) Some current or future OpenGL programs may require this feature.
No OpenGL Shading Language support

This may break compatibility for applications using per pixel shading.
No Frame buffer object support

This may break compatibility for applications using render to texture functions.
Few texture units found

This may slow down some applications using fragment programs or extensive texture mapping.
Extension verification:
GL_EXT_color_subtable was not found, but has the entry point glColorSubTableEXT

GLView can also run its render demos just fine, but LWJGL utterly refuses to cooperate. Any help or insight into this problem would be excellent.

I also set the 'org.lwjgl.opengl.Window.allowSoftwareOpenGL' property to true beforehand.
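For what it's worth, the property has to be set before LWJGL creates its window/context, or it has no effect. A minimal sketch (the property name comes from the post above; the create call that would follow is elided, since the exact API depends on the LWJGL version):

```java
public class SoftwareGLFlag {
    public static void main(String[] args) {
        // Must be set BEFORE LWJGL creates its OpenGL context.
        // It only asks the driver to permit a software pixel format;
        // the driver still decides what you actually get.
        System.setProperty("org.lwjgl.opengl.Window.allowSoftwareOpenGL", "true");

        // ... Window.create(...) / Display.create() would follow here ...
        System.out.println(System.getProperty("org.lwjgl.opengl.Window.allowSoftwareOpenGL"));
    }
}
```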
Personally, you could strap me into a mech cockpit with 47 virtual HUD displays, two complex hand controls and foot pedals and I'd be grinning like the gamer freak I am.

darkprophet

Quote
Your current video configuration DOES NOT support hardware accelerated OpenGL.

Enough said?

But all 'org.lwjgl.opengl.Window.allowSoftwareOpenGL' does is set a flag asking the driver to allow software rendering; the driver does the actual rendering, not LWJGL. So I'm guessing you have an uberly crappy GFX card and an uberly crappier driver for that crappy GFX :)
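One way to see what you actually got: with a context current, GL11.glGetString(GL11.GL_RENDERER) returns the same renderer string GLView shows. A hypothetical helper (the name-matching heuristic is my own, not anything in LWJGL) that flags Microsoft's unaccelerated fallback:

```java
public class RendererCheck {
    // Hypothetical helper: true if the renderer string indicates
    // Microsoft's unaccelerated "GDI Generic" software renderer.
    static boolean isSoftwareRenderer(String renderer) {
        return renderer != null && renderer.contains("GDI Generic");
    }

    public static void main(String[] args) {
        // At runtime you'd pass GL11.glGetString(GL11.GL_RENDERER) here;
        // this is the string reported in the post above.
        System.out.println(isSoftwareRenderer("GDI Generic"));     // true
        System.out.println(isSoftwareRenderer("GeForce FX 5200")); // false
    }
}
```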

Go to www.dabs.com and buy yourself a cheap nvidia card (those 5200s are cheap and not bad at all)

DP

Jbgohlke

I would, but this isn't 'my' PC, it's my school's.

Thanks for the answer anyway. I'll create my own software emulation layer, probably with something like J2DA.
Personally, you could strap me into a mech cockpit with 47 virtual HUD displays, two complex hand controls and foot pedals and I'd be grinning like the gamer freak I am.