
I'm writing a Kafka Streams application on my development Windows machine. If I try to use the leftJoin and branch features of Kafka Streams I get the error below when executing the jar application:

Exception in thread "StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\user\AppData\Local\Temp\librocksdbjni325337723194862275.dll: Can't find dependent libraries
    at java.lang.ClassLoader$NativeLibrary.load(Native Method)
    at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
    at java.lang.Runtime.load0(Runtime.java:809)
    at java.lang.System.load(System.java:1086)
    at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
    at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
    at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
    at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
    at org.rocksdb.Options.<clinit>(Options.java:22)
    at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:115)
    at org.apache.kafka.streams.state.internals.Segment.openDB(Segment.java:38)
    at org.apache.kafka.streams.state.internals.Segments.getOrCreateSegment(Segments.java:75)
    at org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStore.put(RocksDBSegmentedBytesStore.java:72)
    at org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStore.put(ChangeLoggingSegmentedBytesStore.java:54)
    at org.apache.kafka.streams.state.internals.MeteredSegmentedBytesStore.put(MeteredSegmentedBytesStore.java:101)
    at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:109)
    at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:101)
    at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:65)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:43)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:641)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:368)

It seems like Kafka does not find a DLL, but wait...I'm developing a Java application!

What could be the problem? And why doesn't this error show up if I run simpler streaming operations, such as only a filter?

UPDATE:

This problem arises only when a message is present in the broker. I'm using Kafka Streams version 0.10.2.1.

This is the piece of code that raises the problem:

public class KafkaStreamsMainClass {

    private KafkaStreamsMainClass() {
    }

    public static void main(final String[] args) throws Exception {
        Properties streamsConfiguration = new Properties();
        streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams");
        streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-server:9092");
        streamsConfiguration.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "schema-registry:8082");
        streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
        streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
        streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        KStreamBuilder builder = new KStreamBuilder();
        KStream<GenericRecord, GenericRecord> sourceStream = builder.stream(SOURCE_TOPIC);

        KStream<GenericRecord, GenericRecord> finishedFiltered = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") != null);

        KStream<GenericRecord, GenericRecord>[] branchedStreams = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") == null)
                .branch((GenericRecord key, GenericRecord value) -> value.get("firstField") != null,
                        (GenericRecord key, GenericRecord value) -> value.get("secondField") != null);

        branchedStreams[0] = finishedFiltered.join(branchedStreams[0],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        branchedStreams[1] = finishedFiltered.join(branchedStreams[1],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
        streams.setUncaughtExceptionHandler((Thread thread, Throwable throwable) -> {
            throwable.printStackTrace();
        });
        streams.start();

        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

}

I opened the rocksdbjni-5.0.1.jar archive downloaded by Maven and it does include the librocksdbjni-win64.dll library. It seems that the loader is trying to find the library outside the jar instead of extracting it from inside.
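A quick way to confirm that (a hypothetical diagnostic helper, not part of the app) is to check whether the native library is visible as a classpath resource before Kafka Streams tries to extract it:

```java
// Hypothetical diagnostic: verify the native RocksDB library is reachable
// as a classpath resource. rocksdbjni 5.0.1 bundles the Windows build
// under the entry name "librocksdbjni-win64.dll".
public class NativeLibCheck {

    // Returns true if the named resource can be found on the classpath.
    static boolean resourceOnClasspath(String name) {
        return NativeLibCheck.class.getClassLoader().getResource(name) != null;
    }

    public static void main(String[] args) {
        System.out.println("librocksdbjni-win64.dll on classpath: "
                + resourceOnClasspath("librocksdbjni-win64.dll"));
    }
}
```

If this prints false, the rocksdbjni jar is not actually on the runtime classpath; if it prints true but the UnsatisfiedLinkError persists, the DLL is found and extracted, but one of its own native dependencies (often the Visual C++ runtime on Windows) is missing.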

I'm developing on a Windows 7 machine.

Have you ever experienced this problem?

OneCricketeer
gvdm
  • This is odd. For reference, I ran the `mvn test` suite of Confluent's Kafka Streams demo apps (https://github.com/confluentinc/examples) on Windows 10 w/ Oracle JDK 1.8 yesterday, and (with one exception, due to a Kafka broker bug on Windows that is unrelated to Kafka Streams) everything worked out of the box. Perhaps you can provide more detail about your environment (Windows version, Java version, etc.), the exact version of Kafka Streams you are using, and the code, so that it's easier to reproduce? – miguno May 03 '17 at 08:20
  • 1
    I think I found the problem. In my local Maven repository I had two versions of RocksDB, version 4.4.1 and 5.0.1 (which is the one used by Kafka Streams 0.10.2 I'm using). I deleted the 4.4.1 version and the problem went away. The strange thing is that Maven was using the old version of the library.. – gvdm May 03 '17 at 09:09
  • Nope, the problem is there again. It was not showing up because I had no messages in the broker (they were deleted by the Kafka deletion job). I will update my question with the required information. – gvdm May 03 '17 at 15:51
  • See also http://stackoverflow.com/questions/41291996/executing-kafka-stream-example-with-eclipse-fails-with-unsatisfiedlinkerror-for/41308528?noredirect=1#comment74549064_41308528 – Matthias J. Sax May 04 '17 at 03:20
  • @gvdm - is this issue resolved? I am getting exactly the same issue. The strange thing is that the same setup works on another, similar machine. – Mudit bhaintwal Jun 22 '17 at 15:31
  • Hi @Mudit. Nope; when I need to develop on Kafka Streams I use a Linux machine and everything works. – gvdm Jun 22 '17 at 15:40
  • It's very strange that Kafka still doesn't support Windows! Have you also tried with Cygwin? – Mudit bhaintwal Jun 22 '17 at 15:55
  • No, but the problem is the compilation with Maven in a Windows environment. I don't think Cygwin would do the trick in this case. – gvdm Jun 23 '17 at 07:16

6 Answers


I recently ran into this problem too. I managed to solve it in two steps:

  1. Delete all librocksdbjni[...].dll files from the C:\Users\[your_user]\AppData\Local\Temp folder.
  2. Add the Maven dependency for RocksDB to your project; this one works for me: https://mvnrepository.com/artifact/org.rocksdb/rocksdbjni/5.0.1

Compile your Kafka Streams app and run it. It should work!
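For reference, the dependency from step 2 would look like this in the pom.xml (5.0.1 matches the link above; adjust the version to whatever your Kafka Streams version pulls in):

```xml
<!-- Explicit RocksDB JNI dependency, pinning the version Kafka Streams 0.10.2 expects -->
<dependency>
    <groupId>org.rocksdb</groupId>
    <artifactId>rocksdbjni</artifactId>
    <version>5.0.1</version>
</dependency>
```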

David Corral
  • Hi @David, thanks for the response. Does this solution also work in a Linux development environment, or does it mess up the classpath there? – gvdm Jan 09 '18 at 11:49
  • 1
    Hello @gvdm, I'm sorry but I don't know because actually I only have access to develop in Windows. – David Corral Jan 09 '18 at 15:27
  • Ok @David, I will try as soon as I can and let you know – gvdm Jan 10 '18 at 08:19
  • 1
    Hi @David. I solved the problem upgrading to 1.0.0. More in the answer https://stackoverflow.com/questions/43742423/unsatisfiedlinkerror-on-lib-rocks-db-dll-when-developing-with-kafka-streams#48340251 . Thank you for your support. – gvdm Jan 19 '18 at 11:38
  • Thank you very much; I was using Windows and upgrading the version did not help. – meobeo173 Apr 03 '18 at 07:21
  • Why 5.0.1 works fine, but 5.7.3 included in dependencies by spring boot 2.1 management plugin doesn't work? – CHEM_Eugene Apr 12 '19 at 08:33
  • How do I fix it on Mac? – emeraldhieu Oct 03 '22 at 16:14

I updated my kafka-streams project to the latest released version, 1.0.0.

This version suffers from this bug, but after patching it and uploading the patched build to our internal Artifactory server, we were able to run our kafka-streams agent on both Windows and Linux. The next versions, 1.0.1 and 1.1.0, will include this bug fix, so as soon as one of them is released we will switch to it instead of the patched version.

To sum up, the Kafka team solved this bug with the 1.0.0 release.
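For anyone following the same route, once a release with the fix is available the upgrade is just a version bump of the kafka-streams artifact in the pom.xml (coordinates below are the standard ones; verify against your own build):

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>1.0.0</version>
</dependency>
```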

gvdm

My problem was permissions on the /tmp/ directory (CentOS).

RocksDB internally uses the java.io.tmpdir system property to decide where to extract the librocksdbjni file, usually to something like /tmp/librocksdbjni2925599838907625983.so

I solved it by setting a different tmpdir, with appropriate permissions, in the kafka-streams app:

System.setProperty("java.io.tmpdir", "/opt/kafka-streams/tmp");
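A slightly fuller sketch of this workaround (the directory path here is just an example; pick any directory your process can write to). The property must be set before the first RocksDB store is opened:

```java
import java.io.File;

public class TmpDirWorkaround {

    public static void main(String[] args) {
        // Example path only: any directory this user can write to will do.
        File tmpDir = new File(System.getProperty("user.home"), "kafka-streams-tmp");
        tmpDir.mkdirs(); // create it with the current user's permissions

        // Must run before Kafka Streams opens its first RocksDB store,
        // since the native library is extracted into java.io.tmpdir on load.
        System.setProperty("java.io.tmpdir", tmpDir.getAbsolutePath());

        // ... build and start the KafkaStreams topology here ...
    }
}
```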
SeaBiscuit

You are missing some native libraries that the RocksDB DLL depends on. See https://github.com/facebook/rocksdb/issues/1302

Nicholas
  • Yes, that much is obvious, but why is a Maven Java console application trying to use an external DLL library? It should use only what is on the classpath! – gvdm May 03 '17 at 06:54
  • Hi @Nicholas. I solved the problem upgrading to 1.0.0. More in the answer https://stackoverflow.com/questions/43742423/unsatisfiedlinkerror-on-lib-rocks-db-dll-when-developing-with-kafka-streams#48340251 . Thank you for your support. – gvdm Jan 19 '18 at 11:39

I had the same issue while using JDK 1.8. It got resolved when I changed it to a JRE.


I faced a similar issue on Mac. As per this link, https://github.com/facebook/rocksdb/issues/5064, the issue is related to the older libc installed in my version of Mac OS (10.11.6).