Hello again. I apologise for posting another question right on the heels of the last one. I'm a firm believer in "sit on it a week and figure it out yourself". However, a week in other people's time is a day in mine...
So, the story goes, I stayed up for the past 30 hours coding again. Right now I'm trying to implement an HLSL shader that uses two textures. I've posted my issue on gamedev: http://www.gamedev.net/community/forums/topic.asp?topic_id=287220
Although this isn't an HLSL forum, I believe my issue might be tied to LWJGL itself. In other examples of HLSL coding, I have come across the glActiveTextureARB(GL_TEXTURE1_ARB) call. Now, LWJGL doesn't have this function - the closest thing to it is the glActiveTexture(GL_TEXTURE1) call. Perhaps my multitexturing problems lie in not using LWJGL properly? For reference, here is an example of what I'm going for: http://www.clockworkcoders.com/oglsl/tutorial8.htm
If someone would like to package up a nifty LWJGL example of the above tutorial I'd love them forever!
Hum, I would be very surprised if you were able to run an HLSL shader with LWJGL... HLSL is DirectX-only from what I know... Or are you speaking of GLSL?
Chman
edit: yes, you're speaking of GLSL according to the link you gave...
You are correct... GLSL. That's what 30 hours awake will do to you!
glActiveTextureARB in the ARBMultitexture class?
Quote from: "Matzon"glActiveTextureARB in the ARBMultitexture class?
Why, yes, that's it. Honestly, I'm very confused by the whole structure of LWJGL - at least when I start doing more cryptic things, functions start hiding under different classes. Anyway, my post on gamedev has been updated. I've set things up using glActiveTextureARB and things still do not work properly.
Blech. All this torment over sending two simple texture IDs to a shader. Am I the first to do this in LWJGL?
I posted a GLSL starter guide in the LWJGL wiki a few months back. I've had multiple sampler2Ds working without too much of a hitch. Pay attention to compiler & linker errors if you're getting any. Failing that, just try the simplest possible setup (like using the same set of texture coords for both textures, or just sampling the other texture).
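For the record, the sampler side of it is just a couple of glUniform1iARB calls once the program is linked. A rough sketch, assuming programID is your linked program object and loc0/loc1 are the locations you fetched for the two sampler2D uniforms:

// The int you pass is the texture *unit* index, not a texture id.
ARBShaderObjects.glUseProgramObjectARB(programID);
ARBShaderObjects.glUniform1iARB(loc0, 0); // this sampler reads texture unit 0
ARBShaderObjects.glUniform1iARB(loc1, 1); // and this one reads unit 1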
Quote from: "Funkapotamus"Honestly, I'm very confused by the whole structure of LWJGL, at least when I start doing more cryptic things, then functions start hiding under different classes.
Multitexturing has been in core OpenGL since version 1.3, so you can use the GL13 class directly (you won't find a card that supports GLSL but doesn't also support at least 1.4). Generally, at runtime you check which OpenGL version the user's system supports, so that you can use the appropriate GLXX classes. For functionality beyond that (say, an extension that made it into the core at 2.0, like GLSL, while you are on 1.5), you use the corresponding extension class (ARB, EXT, etc.).
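In code, the two paths look like this - a minimal sketch, assuming tex0 and tex1 are texture ids you've already created (how you query the supported version differs between LWJGL releases, so I'll leave that part out):

// Core path (OpenGL 1.3+), via the GL13 class:
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, tex0);
GL13.glActiveTexture(GL13.GL_TEXTURE1);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, tex1);
// Extension path for a pre-1.3 context - same entry point, different class:
ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE1_ARB);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, tex1);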
For your shader problem... hmm, your code looks OK. Two points:
1. Are you sure you're null-terminating the uniform strings? (See the sketch below.)
2. It doesn't affect your situation, but it isn't necessary to enable the texture targets (GL_TEXTURE_2D) when using fragment shaders.
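To illustrate point 1 - a minimal sketch of building a null-terminated name buffer for glGetUniformLocationARB. The helper name is my own invention, and I'm assuming the ByteBuffer-based signature here:

import java.nio.ByteBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.ARBShaderObjects;

private static int getUniformLocation(int programID, String name) {
    // Allocate one extra byte for the terminating 0 - forgetting it is the
    // classic reason glGetUniformLocationARB returns -1.
    ByteBuffer buf = BufferUtils.createByteBuffer(name.length() + 1);
    for (int i = 0; i < name.length(); i++)
        buf.put((byte) name.charAt(i));
    buf.put((byte) 0); // the null terminator
    buf.flip();
    return ARBShaderObjects.glGetUniformLocationARB(programID, buf);
}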
Yep, I'm null-terminating the uniform strings. In fact, I'm using your code! Straight from the examples you put in CVS.
I think my issue is texture coordinates. I have no idea how they work in GLSL - I'll have to go read up on them somewhere (anyone recommend any books? Does the Orange Book sound good to anyone?). Anyway, one texture is 1024x1024px and the other is 256x256... so I definitely expect an issue there. However, it still doesn't make much sense why it'd be such a big deal - I mean, if the texture coordinates are normalized to one unit (i.e. (0,0), (1,0), (1,1), (0,1)) then they should be valid for either texture, yes?
Thanks for all your help guys... I'll get this working yet :/
Quote from: "Funkapotamus"(anyone recommend any books? Orange book sound good to any one?)
This one is definitely a MUST-have for GLSL coders...
Quote
Anyway, one texture is 1024x1024px and the other is 256x256... so I definitely expect an issue there. However, it still doesn't make much sense why it'd be such a big deal - I mean, if the texture coordinates are normalized to one unit (i.e. (0,0), (1,0), (1,1), (0,1)) then they should be valid for either texture, yes?
There's no problem here. OpenGL handles different texture sizes in multitexturing just fine... So the issue is not there...
I will look at your GLSL code in more depth when I have more time (if you don't find the solution before then)...
Chman
Quote from: "Funkapotamus"I mean, if the texture coordinates are reduced to one unit (i.e. (0,0), (1,0), (1,1), (0, 1)) then they should be valid for either texture, yes?
Do you mean you're using one set of texture coordinates? If so, this (from the code on gamedev) won't work:
gl_TexCoord[0] = gl_MultiTexCoord0;
gl_TexCoord[1] = gl_MultiTexCoord1;
You'll need to put the same texcoord in both varyings:
gl_TexCoord[0] = gl_MultiTexCoord0;
gl_TexCoord[1] = gl_MultiTexCoord0;
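Alternatively, if you really do want a second, independent set of coordinates, you have to submit them on unit 1 from the Java side. A rough immediate-mode sketch (positions and coordinate values are made up for illustration):

GL11.glBegin(GL11.GL_QUADS);
GL13.glMultiTexCoord2f(GL13.GL_TEXTURE0, 0f, 0f); // feeds gl_MultiTexCoord0
GL13.glMultiTexCoord2f(GL13.GL_TEXTURE1, 0f, 0f); // feeds gl_MultiTexCoord1
GL11.glVertex2f(-1f, -1f);
// ...and so on for the other three vertices. Plain glTexCoord2f only ever
// sets unit 0, which is why gl_MultiTexCoord1 comes up empty otherwise.
GL11.glEnd();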