CG

Started by Chman, December 29, 2004, 18:15:46


Chman

Can someone explain to me how to use the EXTCgShader class?

From the javadoc:

Quote
You can pass GL_CG_VERTEX_SHADER_EXT to glCreateShaderARB instead of GL_VERTEX_SHADER_ARB to create a vertex shader object that will parse and compile its shader source with the Cg compiler front-end rather than the GLSL front-end.
I've tried that but I can't make it work at all (roughly what I tried is sketched below)... I've also found a piece of code in an old LWJGL package that uses Cg, but it relies on nVidia's extensions and I'm running on an ATI card... Anyone?
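Here's a minimal sketch of the attempt, assuming LWJGL's ARBShaderObjects and EXTCgShader classes; exact method signatures differ a bit between LWJGL versions, and the class/variable names here are just placeholders:

import java.nio.ByteBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.ARBShaderObjects;
import org.lwjgl.opengl.EXTCgShader;

public class CgShaderTest {
    // Create a shader object that should be compiled by the driver's Cg
    // front-end: pass GL_CG_VERTEX_SHADER_EXT instead of GL_VERTEX_SHADER_ARB.
    static int createCgVertexShader(String cgSource) {
        int shader = ARBShaderObjects.glCreateShaderObjectARB(
                EXTCgShader.GL_CG_VERTEX_SHADER_EXT);

        // Upload the Cg source as a direct ByteBuffer.
        byte[] src = cgSource.getBytes();
        ByteBuffer buf = BufferUtils.createByteBuffer(src.length);
        buf.put(src).flip();
        ARBShaderObjects.glShaderSourceARB(shader, buf);

        ARBShaderObjects.glCompileShaderARB(shader);

        // This is where it goes wrong for me: the compile status comes back 0.
        int compiled = ARBShaderObjects.glGetObjectParameteriARB(
                shader, ARBShaderObjects.GL_OBJECT_COMPILE_STATUS_ARB);
        System.out.println("Cg compile status: " + compiled);
        return shader;
    }
}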

Thanks,

Chman

spasi

I haven't tested this, but I'm pretty sure it would only work on NV cards. Nvidia drivers have a Cg compiler built in, which is used for both Cg and GLSL shaders, so they can do this easily.

By the way, why do you need this?

Chman

Quote from: "spasi"I haven't tested this, but I'm pretty sure it would only work on NV cards. Nvidia drivers have a Cg compiler built in, which is used for both Cg and GLSL shaders, so they can do this easily.
How does this work on ATI cards? Is it different from nVidia's? Cg doesn't only run on nVidia hardware; I can run a lot of the Cg samples on my 9800...

QuoteBy the way, why do you need this?
To play with Cg... I'm having a lot of fun with shaders these days (mainly GLSL), and I'd like to try Cg too :)
Oh, and I'd like to add Cg support to Jme...

Chman

spasi

This is different from plain Cg; in that case you use the Cg runtime, and the Cg shader is compiled by the runtime (to NV_vertex/fragment_program on NV cards, to ARB_vertex/fragment_program on ATI cards). In EXT_cg_shader's case, the Cg shader is compiled by the *driver* (NV's driver contains a Cg compiler, I guess the same one included in the Cg runtime).
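If you just want the shader to run on any card, one workaround (not really Cg support, just using the compiler's output) is to compile offline with cgc, e.g. cgc -profile arbvp1 shader.cg, and load the resulting !!ARBvp1.0 text as a plain ARB vertex program. A rough sketch, assuming LWJGL's ARBProgram/ARBVertexProgram classes (older LWJGL versions may expose these functions on ARBVertexProgram directly; the class name below is just a placeholder):

import java.nio.ByteBuffer;
import java.nio.IntBuffer;

import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.ARBProgram;
import org.lwjgl.opengl.ARBVertexProgram;
import org.lwjgl.opengl.GL11;

public class ArbProgramLoader {
    // Loads the "!!ARBvp1.0 ..." assembly that cgc emits and enables it,
    // so any card exposing ARB_vertex_program can run it.
    static int loadVertexProgram(String asm) {
        IntBuffer ids = BufferUtils.createIntBuffer(1);
        ARBProgram.glGenProgramsARB(ids);
        int program = ids.get(0);

        ARBProgram.glBindProgramARB(ARBVertexProgram.GL_VERTEX_PROGRAM_ARB, program);

        // The program text is passed as a direct ByteBuffer.
        byte[] bytes = asm.getBytes();
        ByteBuffer buf = BufferUtils.createByteBuffer(bytes.length);
        buf.put(bytes).flip();
        ARBProgram.glProgramStringARB(ARBVertexProgram.GL_VERTEX_PROGRAM_ARB,
                ARBProgram.GL_PROGRAM_FORMAT_ASCII_ARB, buf);

        // The bound program is used for all draws while this is enabled.
        GL11.glEnable(ARBVertexProgram.GL_VERTEX_PROGRAM_ARB);
        return program;
    }
}

The fragment side works the same way with the arbfp1 profile and ARBFragmentProgram.GL_FRAGMENT_PROGRAM_ARB.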

Chman

Ok, thanks.
So what do I really need to make Cg shaders work on an ATI card (or on any card)?

Chman

spasi

I guess... nothing. Not with LWJGL, at least. You'd need a Cg runtime binding (iirc Jogl has one, doesn't it?). Or wait for the unthinkable, that is, for ATI to implement the EXT_cg_shader extension :wink:.

Chman

... so I will wait :D

Thanks for your support!

Chman

spasi

Glad to help!

About shader development in general, I find Cg's features much better than GLSL's (interfaces, imports, decent syntax), except for a couple of things (e.g. vertex attribute binding). But GLSL is the standard, and I don't see Cg as a viable solution for commercial applications (for several other reasons too).

I guess GLSL will catch up in the future, but in the meantime we'll have to live with the ARB's unfortunate decision to adopt 3DLabs' proposal.

Anyway, writing shaders in either language is lots of fun as you say! :D

Chman

Yeah, so much fun :)

The problem with GLSL is that all drivers are crappy... It tends to work better on ATI but it's a bit buggy.

nVidia is designing a complete driver for GLSL, which is good. As for ATI, they're reworking their OpenGL stack from scratch, and I must admit it's a good thing given that their previous OpenGL drivers were so bad, but we'll surely have to wait longer to get good GLSL drivers...

Chman