There are several things that aren't quite right with your code and approach:
- As you've discovered, `(*env)->JNIFunc(env, ...)` should be `env->JNIFunc(...)` in C++. Your vendor's (Google Android's) `jni.h` simplifies the C++ syntax over the C syntax.
- You're not calling the "Release" function (`ReleaseStringUTFChars`) corresponding to the "pinning" function (`GetStringUTFChars`). This is important: until you release the buffer, the JVM either keeps an extra copy alive or keeps the string pinned, which wastes memory and reduces the efficiency of the garbage collector. (The sketch just after this list shows the corrected pattern.)
- You've misinterpreted the final argument to `GetStringUTFChars`. It's a pointer for an output parameter (whether the JVM returned a copy). The result isn't very interesting, so pass `nullptr`.
- You're using JNI functions that deal with the modified UTF-8 encoding (`GetStringUTFChars` et al.). There should be no need to ever use that encoding. Java classes are very capable at converting encodings, and they also give you control over what happens when a character cannot be encoded in the target encoding. (The default is to convert it to a `?`.)
- The idea of converting a JVM object reference (`jstring`) to a pointer to one-byte storage (`char*`) needs a lot of refinement. You probably want to copy the characters of a JVM `java.lang.String` to a "native" string using a specific or OS-default encoding. The Java string holds Unicode characters in a UTF-16 encoding; Android generally uses the Unicode character set with the UTF-8 encoding. If you do need something else, you can specify it with a `Charset` object or charset name (see the variant after the example below).
- Also, in C++, it is more convenient to use an STL `std::string` to hold a counted byte sequence for a string. You can get a pointer to a null-terminated buffer from a `std::string` if you need it.
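
If you do stick with `GetStringUTFChars`, the first three points combine into this pattern (a sketch only, with `env` and `str` as in your function, and still carrying the modified-UTF-8 caveat described above):

```cpp
// C++ syntax: env->Func(...) rather than (*env)->Func(env, ...).
// Pass nullptr for the isCopy output parameter; we don't care whether
// the JVM handed back a copy or pinned the original.
const char *utf = env->GetStringUTFChars(str, nullptr);
if (utf != nullptr) {
    std::string s(utf);                    // copy while the buffer is valid
    env->ReleaseStringUTFChars(str, utf);  // always pair "Get" with "Release"
    // ... use s ...
}
```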
Be sure to read Android's JNI Tips.
Here is an implementation of your function that lets the vendor's JVM implementation pick the target encoding (which is UTF-8 for Android):
extern "C" JNIEXPORT void Java_com_sek_test_JNITest_printSomething
(JNIEnv * env, jclass cl, jstring str) {
// TODO check for JVM exceptions where appropriate
// javap -s -public java.lang.String | egrep -A 2 "getBytes"
const auto stringClass = env->FindClass("java/lang/String");
const auto getBytes = env->GetMethodID(stringClass, "getBytes", "()[B");
const auto stringJbytes = (jbyteArray) env->CallObjectMethod(str, getBytes);
const auto length = env->GetArrayLength(stringJbytes);
const auto pBytes = env->GetByteArrayElements(stringJbytes, nullptr);
std::string s((char *)pBytes, length);
env->ReleaseByteArrayElements(stringJbytes, pBytes, JNI_ABORT);
const auto pChars = s.c_str(); // if you really do need a pointer
}
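
If you need a specific encoding rather than the platform default, the same approach works with the `getBytes(String charsetName)` overload; here is a sketch of just the lines that change (the `(Ljava/lang/String;)[B` descriptor again comes from `javap`, and `"UTF-8"` is only an example charset name):

```cpp
// Ask for an explicit encoding instead of the platform default.
const auto getBytes = env->GetMethodID(stringClass, "getBytes",
                                       "(Ljava/lang/String;)[B");
const auto charsetName = env->NewStringUTF("UTF-8"); // ASCII-only literal, so safe here
const auto stringJbytes =
        (jbyteArray) env->CallObjectMethod(str, getBytes, charsetName);
env->DeleteLocalRef(charsetName);
// TODO this overload can throw UnsupportedEncodingException; check env->ExceptionCheck()
```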
However, I'd probably do the call to `String.getBytes` on the Java side, defining the native method to take a byte array instead of a string.
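That might look roughly like this: on the Java side, declare something like `private static native void printSomething(byte[] bytes);` and call it with `s.getBytes(StandardCharsets.UTF_8)` (these names are assumptions mirroring your question). The native side then reduces to:

```cpp
#include <jni.h>
#include <string>

// Takes the bytes directly; the encoding was already chosen on the Java side.
extern "C" JNIEXPORT void JNICALL Java_com_sek_test_JNITest_printSomething
  (JNIEnv *env, jclass, jbyteArray bytes) {
    const auto length = env->GetArrayLength(bytes);
    const auto pBytes = env->GetByteArrayElements(bytes, nullptr);
    std::string s((char *) pBytes, length);
    env->ReleaseByteArrayElements(bytes, pBytes, JNI_ABORT);
    // ... use s ...
}
```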
(Of course, implementations that use `GetStringUTFChars` do work for some subset of Unicode strings, but why impose an esoteric and needless limit?)