
LWJGL + Cg = ?

Offline Optus

  • http://www.3r337.org
« on: April 12, 2004, 19:01:35 »
Hi everyone, I'm sure this has been covered before. I searched multiple times and came up with nothing :/. Anyway, I just got a new video card and was interested in checking out Cg. Can I use Cg with LWJGL?

Offline Matzon
« Reply #1 on: April 12, 2004, 19:30:55 »
Though I answered it on #lwjgl @ irc.freenode.net, I'll just answer it here too:

Yes, LWJGL supports Cg; check the org.lwjgl.test.opengl.Grass demo by elias (nVidia only).

Offline elias

  • http://oddlabs.com
« Reply #2 on: April 12, 2004, 20:59:03 »
Actually, that's not quite true. The Grass example is in fact just using a plain NV_vertex_program created by Cg and hand-imported. The actual Cg runtime is not supported by LWJGL.

 - elias
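
To illustrate what "hand-imported" means in practice: you run Cg's offline compiler once (e.g. cgc -profile vp20 grass.cg) and load the resulting assembly text at runtime through the plain extension bindings. A minimal sketch, assuming LWJGL's NVProgram/NVVertexProgram bindings (class and method names as in later LWJGL releases; the 2004 API may have differed) and a trivial placeholder program rather than the Grass demo's actual shader:

    import java.nio.ByteBuffer;
    import java.nio.IntBuffer;

    import org.lwjgl.BufferUtils;
    import org.lwjgl.opengl.GL11;
    import org.lwjgl.opengl.NVProgram;
    import org.lwjgl.opengl.NVVertexProgram;

    public class NVProgramLoader {

        // Assembly produced offline by cgc; this trivial pass-through
        // program is a placeholder, not the Grass demo's shader.
        private static final String SOURCE =
            "!!VP1.0\n" +
            "MOV o[HPOS], v[OPOS];\n" +
            "END\n";

        /** Uploads the precompiled program and returns its GL id. */
        public static int load() {
            IntBuffer ids = BufferUtils.createIntBuffer(1);
            NVProgram.glGenProgramsNV(ids);
            int id = ids.get(0);

            // NV_vertex_program takes the raw assembly bytes
            ByteBuffer buf = BufferUtils.createByteBuffer(SOURCE.length());
            for (int i = 0; i < SOURCE.length(); i++)
                buf.put((byte) SOURCE.charAt(i));
            buf.flip();

            NVProgram.glLoadProgramNV(NVVertexProgram.GL_VERTEX_PROGRAM_NV, id, buf);
            NVProgram.glBindProgramNV(NVVertexProgram.GL_VERTEX_PROGRAM_NV, id);
            GL11.glEnable(NVVertexProgram.GL_VERTEX_PROGRAM_NV);
            return id;
        }
    }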

Offline spasi

  • WebHotelier
« Reply #3 on: April 13, 2004, 06:42:04 »
Why limit yourself to NV only when there's GLSL? And on NV hardware you can use Cg functionality straight through GLSL (which may sound great, but IMHO it sucks big time, for several reasons). You'll have to grab the latest (leaked/beta) drivers to use it, though. Of course, LWJGL has support for GLSL :).

Offline elias

  • http://oddlabs.com
« Reply #4 on: April 13, 2004, 10:34:36 »
Sure. But the grass demo is way old (older even than ARB_vertex_program, or else I would have used that). And with the advent of GLSL and the Cg extension to it, I see no real reason to support the Cg runtime at the moment.

 - elias

Offline Optus

  • http://www.3r337.org
« Reply #5 on: April 13, 2004, 19:26:40 »
So, just so I'm clear on the subject: should I use Cg to compile to an ARB_vertex_program, or should I investigate GLSL? I know nothing about GLSL (it's short for GLslang, right?). Does it, like Cg, compile high-level code to vertex programs, or is it different?

Offline spasi

  • WebHotelier
« Reply #6 on: April 13, 2004, 20:40:47 »
Cg, like DirectX's HLSL, compiles to low-level shaders/programs. You can do it offline or at runtime. The problem is, there are lots of targets (nv20, nv30, the ATI ones, etc.) to worry about. As an example, I read in a Valve presentation that Half-Life 2 uses THOUSANDS of HLSL-generated low-level shaders. They write HLSL, compile all of the possible "versions" of the shaders (different # of lights, # of bones, enabled effects, etc.) and use the low-level ones at runtime, which of course is a big mess to manage.
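
To make the many-targets point concrete, here is roughly the selection logic this approach forces on you. The sketch assumes the shader was compiled offline once per cgc profile (e.g. cgc -profile vp30 grass.cg -o grass.vp30, and likewise for vp20 and arbvp1) and picks a file by sniffing the extension string; the file names and profile choices are hypothetical, not from the actual Grass demo:

    import org.lwjgl.opengl.GL11;

    public class ShaderPicker {

        /**
         * Picks a precompiled low-level vertex shader for the current card,
         * assuming one file per cgc profile was generated offline.
         */
        public static String pickVertexProgram() {
            // Requires a current GL context
            String ext = GL11.glGetString(GL11.GL_EXTENSIONS);

            if (ext.indexOf("GL_NV_vertex_program2") != -1)
                return "grass.vp30";   // NV30 (GeForce FX) path
            if (ext.indexOf("GL_NV_vertex_program") != -1)
                return "grass.vp20";   // NV20 (GeForce3/4) path
            if (ext.indexOf("GL_ARB_vertex_program") != -1)
                return "grass.arbvp1"; // generic ARB path (ATI and others)

            throw new IllegalStateException("no vertex program support");
        }
    }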

With GLSL, the drivers compile at runtime only, and you don't have to worry about what's being generated: the driver will choose the best target for the hardware it's running on. For example, on my FX it compiles to NV VP2.0 and NV FP1.0, thus being able to support everything available on my card (and optimize better). On a GeForce4 it'll probably compile to NV VP1.0, and an ATI card would use the ARB programs. But you really don't (and shouldn't) care. It's all being done behind the scenes; you only care about shader objects and shader programs. No low-level stuff. It's much easier for the developer, and as I said, if you want more flexibility you can always use Cg stuff (hint: check for EXT_Cg_shader extension availability).
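
For completeness, the shader-objects/shader-programs model described above looks roughly like this through LWJGL's ARB_shader_objects bindings. A minimal sketch: class and method names follow later LWJGL releases (the 2004 API may have differed), and the shader source is a trivial placeholder:

    import org.lwjgl.opengl.ARBShaderObjects;
    import org.lwjgl.opengl.ARBVertexShader;

    public class GLSLExample {

        // Trivial placeholder vertex shader; the driver compiles it to
        // whatever low-level target suits the card (NV VP2.0, ARB programs, ...)
        private static final String VERTEX_SOURCE =
            "void main() {\n" +
            "    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n" +
            "}\n";

        public static int createProgram() {
            // A shader object holds and compiles one shader's source
            int shader = ARBShaderObjects.glCreateShaderObjectARB(
                    ARBVertexShader.GL_VERTEX_SHADER_ARB);
            ARBShaderObjects.glShaderSourceARB(shader, VERTEX_SOURCE);
            ARBShaderObjects.glCompileShaderARB(shader);

            // A program object links one or more shader objects together
            int program = ARBShaderObjects.glCreateProgramObjectARB();
            ARBShaderObjects.glAttachObjectARB(program, shader);
            ARBShaderObjects.glLinkProgramARB(program);

            // Activate it; glUseProgramObjectARB(0) returns to the fixed pipeline
            ARBShaderObjects.glUseProgramObjectARB(program);
            return program;
        }
    }

The EXT_Cg_shader hint above reportedly lets these same entry points accept Cg source instead of GLSL on NV drivers; this sketch sticks to plain GLSL.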