[Matzon Reply #5]
> the fact that we're doing ANY conversion, COULD be problematic.
Exactly. But an encoding assumption is already baked in the moment native byte (or ALchar?) arrays are converted to Java Strings at the JNI level. JNI provides GetStringUTFChars and GetStringChars, which assume modified UTF-8 and UTF-16 encodings, respectively.
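The same assumptions hold in the other direction, when building a jstring from native memory. A minimal sketch (NewStringUTF and NewString are real JNI functions; the wrapper names are just for illustration):

// Two JNI routes for building a jstring, each with a baked-in
// encoding assumption:
jstring fromModifiedUtf8(JNIEnv *env, const char *str) {
    return (*env)->NewStringUTF(env, str);   // str must be modified UTF-8
}
jstring fromUtf16(JNIEnv *env, const jchar *str, jsize len) {
    return (*env)->NewString(env, str, len); // str must be UTF-16 code units
}
// Neither fits a native string of *unknown* encoding.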
Taking a look at LWJGL's implementation (tracking it down from alcGetString, which is what reports the default capture device):
common_tools.c, line 234:

jstring NewStringNativeWithLength(JNIEnv *env, const char *str, int length) {
    // ...
    jcls_str = (*env)->FindClass(env, "java/lang/String");
    // ...
    jmethod_str = (*env)->GetMethodID(env, jcls_str, "<init>", "([B)V");
    // ...
    result = (jstring)(*env)->NewObject(env, jcls_str, jmethod_str, bytes);
    // ...
}
The Java string is created with the String(byte[]) constructor, that is, it "constructs a new String by decoding the specified array of bytes using the platform's default charset" (from Java's String doc)... So there it is: a native char/byte array of unknown encoding, decoded with a platform-specific charset. On the way 'back' (i.e. when opening the device from the app code) the same string gets converted to ASCII by org.lwjgl.MemoryUtil.encodeASCII... well, good luck with that.
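To make that concrete, a contrived standalone sketch (the device name is made up): once a byte above 0x7F hits an ASCII-only encoder, whether it masks or substitutes, the original byte is gone and the name no longer matches what OpenAL reported:

#include <stdio.h>

int main(void) {
    // Hypothetical device name containing a non-ASCII byte (0xE9).
    const unsigned char native_name[] = "Micro\xE9phone";
    unsigned char ascii[sizeof native_name];
    // Mimic an ASCII-only encode: bytes above 0x7F are unrepresentable.
    for (size_t i = 0; i < sizeof native_name; i++)
        ascii[i] = native_name[i] <= 0x7F ? native_name[i] : '?';
    printf("%s\n", ascii); // prints "Micro?phone" - not the reported name
    return 0;
}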
So, again, how about using byte array arguments? Along the lines of:
byte[] ALC10.alcGetStringAsByteArray(...)
alcCaptureOpenDevice(byte[], ...)
// as for JNI... something like:
static jbyteArray JNICALL Java_org_lwjgl_openal_ALC10_alcGetStringAsByteArray
(JNIEnv *env, jclass clazz, jlong deviceaddress, jint token) {
    const char *alcString = (const char *)alcGetString((ALCdevice *)(intptr_t)deviceaddress, (ALCenum)token);
    jbyteArray jb = (*env)->NewByteArray(env, strlen(alcString));
    (*env)->SetByteArrayRegion(env, jb, 0, strlen(alcString), (const jbyte *)alcString);
    return jb;
}
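And for the way back, a complementary sketch (hypothetical binding name; needs <stdlib.h> and <string.h>): the bytes are handed to alcCaptureOpenDevice untouched, so whatever encoding the OS used survives the round trip:

static jlong JNICALL Java_org_lwjgl_openal_ALC11_nalcCaptureOpenDevice
(JNIEnv *env, jclass clazz, jbyteArray devicename, jint frequency, jint format, jint buffersize) {
    jsize len = (*env)->GetArrayLength(env, devicename);
    char *name = (char *)malloc(len + 1);
    (*env)->GetByteArrayRegion(env, devicename, 0, len, (jbyte *)name);
    name[len] = '\0'; // OpenAL wants a NUL-terminated specifier
    ALCdevice *device = alcCaptureOpenDevice(name, (ALCuint)frequency,
                                             (ALCenum)format, (ALCsizei)buffersize);
    free(name);
    return (jlong)(intptr_t)device; // 0 on failure, as with the existing bindings
}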