
@DaGeRe
Last active June 4, 2025 13:26
OpenTelemetry to Kieker Demo

To run the OpenTelemetry demo with Kieker, follow these steps:

  1. Start the Kieker listener:
git clone -b GH-13-GetOERFromProto git@github.com:kieker-monitoring/kieker.git
cd kieker/
cd tools/otel-transformer
../../gradlew assemble
cd build/distributions
unzip otel-transformer-2.0.3-SNAPSHOT.zip
cd otel-transformer-2.0.3-SNAPSHOT/
bin/otel-transformer -lp 9000 -standard OPENTELEMETRY

(Leave this process running; stop it with Ctrl + C once the demo run is complete.)

  2. Prepare the OpenTelemetry demo: git clone https://github.com/open-telemetry/opentelemetry-demo.git
  3. Edit the repository (change into opentelemetry-demo first):
  • Edit src/otel-collector/otelcol-config-extras.yml: Add the following (replacing $MYIP with your IP address):
exporters:
  otlp/example:
    endpoint: http://$MYIP:9000
    tls:
      insecure: true
      
service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlp/example]
  • Edit docker-compose.yml: In the ad service's environment section, add - _JAVA_OPTIONS=-Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]
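For context, the variable belongs in the ad service's environment list of docker-compose.yml. A sketch of the result (the service key and neighboring entries are assumptions based on the diff shown further below; check them against your copy of the file):

```yaml
services:
  ad:                          # service name assumed; verify in your docker-compose.yml
    environment:
      - OTEL_SERVICE_NAME=ad
      # Added line (must stay on a single line):
      - _JAVA_OPTIONS=-Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]
    depends_on:
      otel-collector:
        condition: service_started
```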
  4. Run the demo (in opentelemetry-demo):
  • Run make start
  • Interact with the web interface as you like
  • Run docker compose down -v
  5. Visualize the results with Kieker (in the kieker repository; replace $KIEKERPATH with the path the listener printed):
cd tools/trace-analysis
../../gradlew assemble
cd build/distributions
unzip trace-analysis-2.0.3-SNAPSHOT.zip
cd trace-analysis-2.0.3-SNAPSHOT/
mkdir graphs
bin/trace-analysis \
  -i $KIEKERPATH \
  -o graphs \
  --plot-Deployment-Component-Dependency-Graph responseTimes-ms \
  --plot-Deployment-Operation-Dependency-Graph responseTimes-ms \
  --plot-Aggregated-Deployment-Call-Tree \
  --plot-Aggregated-Assembly-Call-Tree
cd graphs
for file in *.dot; do dot -Tpng "$file" -o "$file.png"; done

shinhyungyang commented Jun 2, 2025

@DaGeRe I am now continuing to look into my issue. Below is the log of the ad container:

Picked up JAVA_TOOL_OPTIONS: -javaagent:/usr/src/app/opentelemetry-javaagent.jar
Picked up _JAVA_OPTIONS: -Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]
OpenJDK 64-Bit Server VM warning: Sharing is only supported for boot loader classes because bootstrap classpath has been appended
[otel.javaagent 2025-06-02 07:34:43:973 +0000] [main] INFO io.opentelemetry.javaagent.tooling.VersionLogger - opentelemetry-javaagent - version: 2.15.0
2025-06-02 07:34:47 - oteldemo.AdService - Ad service starting. trace_id=0589a16b0002210f47f4cb1044104881 span_id=5d0a88b29a0e2fae trace_flags=01 
SLF4J(W): No SLF4J providers were found.
SLF4J(W): Defaulting to no-operation (NOP) logger implementation
SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details.
2025-06-02 07:34:48 - oteldemo.AdService - Ad service started, listening on 9555 trace_id=0589a16b0002210f47f4cb1044104881 span_id=6118f8b81c8b740f trace_flags=01 
[otel.javaagent 2025-06-02 07:35:00:676 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export logs. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more
[otel.javaagent 2025-06-02 07:35:00:737 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export spans. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more
[otel.javaagent 2025-06-02 07:35:17:110 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export logs. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more

This log keeps repeating:

[otel.javaagent 2025-06-02 07:35:00:676 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export logs. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more

One thing I overlooked: otlphttp -> otlp. The otel-transformer exposes an OTLP gRPC receiver, so the otlp exporter (gRPC) must be used instead of otlphttp (HTTP):

  • old
exporters:
  otlphttp/example:
    endpoint: http://172.19.233.3:9000
    tls:
      insecure: true

service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlphttp/example]
  • new
exporters:
  otlp/example:
    endpoint: http://172.19.233.3:9000
    tls:
      insecure: true

service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlp/example]
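Independent of the exporter type, it can help to first confirm that the transformer's port is reachable from where the collector runs. A minimal sketch (hypothetical helper; the IP and port in the usage comment are the example values from the config above, and this only checks TCP reachability, not that the endpoint speaks OTLP/gRPC):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (example values from the config above):
#   port_open("172.19.233.3", 9000)
```

If this returns False, the problem is networking (firewall, wrong $MYIP) rather than the exporter configuration.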

After this change, the otel-transformer logs look different (no records arrive yet):

bin/otel-transformer -lp 9000 -standard OPENTELEMETRY
15:00:24.776 [main] DEBUG RecordReceiverMain -- OpenTelemetry Transformer
15:00:24.845 [main] DEBUG kieker.analysis.generic.sink.DataSink:DataSink-0 -- numOpenedInputPorts (inc): 1
15:00:24.846 [main] DEBUG kieker.analysis.generic.sink.DataSink -- Configuration complete.
15:00:24.862 [main] INFO kieker.monitoring.core.controller.TCPController -- Could not parse port for the TCPController, deactivating this option. Received string was:
15:00:24.941 [main] DEBUG kieker.monitoring.core.controller.WriterController -- Initializing Writer Controller
15:00:24.941 [main] INFO kieker.monitoring.core.controller.StateController -- Enabling monitoring
15:00:24.941 [Thread-0] DEBUG kieker.monitoring.writer.MonitoringWriterThread -- kieker.monitoring.writer.MonitoringWriterThread is running.
15:00:24.955 [main] INFO kieker.monitoring.core.controller.MonitoringController -- Current State of kieker.monitoring (2.0.3-SNAPSHOT) Status: 'enabled'
        Name: 'KIEKER'; Hostname: 'fedora'; experimentID: '1'
JMXController: JMX disabled
TimeSource: 'kieker.monitoring.timer.SystemNanoTimer'
        Time in nanoseconds (with nanoseconds precision) since Thu Jan 01 01:00:00 CET 1970'
ProbeController: disabled
WriterController:
        Queue type: class kieker.monitoring.queue.BlockingQueueDecorator
        Queue capacity: 10000
        Insert behavior (a.k.a. QueueFullBehavior): class kieker.monitoring.queue.behavior.BlockOnFailedInsertBehavior
                numBlocked: 0
Writer: 'kieker.monitoring.writer.filesystem.FileWriter'
        Configuration:
                kieker.monitoring.writer.filesystem.FileWriter.logFilePoolHandler='kieker.monitoring.writer.filesystem.RotatingLogFilePoolHandler'
                kieker.monitoring.writer.filesystem.FileWriter.charsetName='UTF-8'
                kieker.monitoring.writer.filesystem.FileWriter.logStreamHandler='kieker.monitoring.writer.filesystem.TextLogStreamHandler'
                kieker.monitoring.writer.filesystem.FileWriter.maxEntriesInFile='25000'
                kieker.monitoring.writer.filesystem.FileWriter.maxLogFiles='-1'
                kieker.monitoring.writer.filesystem.FileWriter.maxLogSize='-1'
                kieker.monitoring.writer.filesystem.FileWriter.mapFileHandler='kieker.monitoring.writer.filesystem.TextMapFileHandler'
                kieker.monitoring.writer.filesystem.FileWriter.flush='false'
                kieker.monitoring.writer.filesystem.FileWriter.customStoragePath=''
                kieker.monitoring.writer.filesystem.FileWriter.actualStoragePath='/tmp/kieker-20250602-130024-27376755983835-UTC--KIEKER'

        Automatic assignment of logging timestamps: 'true'
Sampling Controller: Periodic Sensor available: Poolsize: '0'; Scheduled Tasks: '0'
15:00:24.962 [main] DEBUG teetime.framework.scheduling.pushpullmodel.A3PipeInstantiation -- Connected (unsynch) teetime.framework.OutputPort@7ea37dbf and teetime.framework.InputPort@4b44655e
15:00:24.966 [Thread for OtlpGrpcReceiverStage-0] DEBUG kieker.tools.oteltransformer.receiver.OtlpGrpcReceiverStage -- Executing runnable stage...
15:00:24.966 [main] DEBUG teetime.framework.Execution -- Using scheduler: teetime.framework.scheduling.pushpullmodel.PushPullScheduling
15:00:24.967 [main] DEBUG RecordReceiverMain -- Running transformer
15:00:25.006 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.logging.InternalLoggerFactory -- Using SLF4J as the default logging framework
15:00:25.007 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- -Dio.netty.noUnsafe: false
15:00:25.008 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- Java version: 21
15:00:25.008 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.theUnsafe: available
15:00:25.009 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.copyMemory: available
15:00:25.009 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.storeFence: available
15:00:25.010 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.Buffer.address: available
15:00:25.010 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- direct buffer constructor: unavailable: Reflective setAccessible(true) disabled
15:00:25.011 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.Bits.unaligned: available, true
15:00:25.012 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable: class io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0$7 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @67784306
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.DirectByteBuffer.<init>(long, {int,long}): unavailable
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- sun.misc.Unsafe: available
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.bitMode: 64 (sun.arch.data.model)
15:00:25.014 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.maxDirectMemory: -1 bytes
15:00:25.015 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.uninitializedArrayAllocationThreshold: -1
15:00:25.022 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.CleanerJava9 -- java.nio.ByteBuffer.cleaner(): available
15:00:25.022 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.noPreferDirect: false
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.deleteLibAfterLoading: true
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.tryPatchShadedId: true
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.detectNativeLibraryDuplicates: true
15:00:25.067 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- Successfully loaded the library /tmp/libio_grpc_netty_shaded_netty_transport_native_epoll_x86_649226255254318685962.so
15:00:25.071 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- -Djava.net.preferIPv4Stack: false
15:00:25.071 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- -Djava.net.preferIPv6Addresses: false
15:00:25.073 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtilInitializations -- Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
15:00:25.074 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- /proc/sys/net/core/somaxconn: 4096
15:00:25.092 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.channel.MultithreadEventLoopGroup -- -Dio.netty.eventLoopThreads: 24
15:00:25.132 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.concurrent.GlobalEventExecutor -- -Dio.netty.globalEventExecutor.quietPeriodSeconds: 1
15:00:25.136 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.InternalThreadLocalMap -- -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
15:00:25.136 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.InternalThreadLocalMap -- -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
15:00:25.147 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- org.jctools-core.MpscChunkedArrayQueue: available
15:00:25.171 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector -- -Dio.grpc.netty.shaded.io.netty.leakDetection.level: simple
15:00:25.171 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector -- -Dio.grpc.netty.shaded.io.netty.leakDetection.targetRecords: 4
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.numHeapArenas: 24
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.numDirectArenas: 24
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.pageSize: 8192
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxOrder: 9
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.chunkSize: 4194304
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.smallCacheSize: 256
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.normalCacheSize: 64
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxCachedBufferCapacity: 32768
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.cacheTrimInterval: 8192
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.cacheTrimIntervalMillis: 0
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.useCacheForAllThreads: false
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
15:00:25.195 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId -- -Dio.netty.processId: 576675 (auto-detected)
15:00:25.197 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId -- -Dio.netty.machineId: 2c:58:b9:ff:fe:b3:8e:20 (auto-detected)
15:00:25.218 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.allocator.type: pooled
15:00:25.218 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.threadLocalDirectBufferSize: 0
15:00:25.218 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.maxThreadLocalCharBufferSize: 16384
15:00:25.222 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.bootstrap.ChannelInitializerExtensions -- -Dio.netty.bootstrap.extensions: null
OTLP gRPC Receiver running on port 9000
15:00:39.565 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf -- -Dio.grpc.netty.shaded.io.netty.buffer.checkAccessible: true
15:00:39.565 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf -- -Dio.grpc.netty.shaded.io.netty.buffer.checkBounds: true
15:00:39.565 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory -- Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector@3d86d6ae
15:00:39.619 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.maxCapacityPerThread: 4096
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.ratio: 8
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.chunkSize: 32
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.blocking: false
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.batchFastThreadLocalOnly: true
15:00:39.629 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:39.639 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] INBOUND SETTINGS: ack=false settings={}
15:00:39.640 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] OUTBOUND SETTINGS: ack=true
15:00:40.275 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:40.276 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:40.277 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] INBOUND SETTINGS: ack=false settings={}
15:00:40.277 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] OUTBOUND SETTINGS: ack=true
15:00:40.277 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] INBOUND SETTINGS: ack=true
15:00:40.279 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] INBOUND GO_AWAY: lastStreamId=2147483647 errorCode=0 length=25 bytes=636c69656e74207472616e73706f72742073687574646f776e
15:00:41.040 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:41.047 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:41.048 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] INBOUND SETTINGS: ack=false settings={}
15:00:41.048 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] OUTBOUND SETTINGS: ack=true
15:00:41.968 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:41.970 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:41.971 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] INBOUND SETTINGS: ack=false settings={}
15:00:41.971 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] OUTBOUND SETTINGS: ack=true
15:00:43.206 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:43.207 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:43.208 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] INBOUND SETTINGS: ack=false settings={}
15:00:43.208 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] OUTBOUND SETTINGS: ack=true
15:00:43.209 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] INBOUND SETTINGS: ack=true
15:00:43.209 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] INBOUND GO_AWAY: lastStreamId=2147483647 errorCode=0 length=25 bytes=636c69656e74207472616e73706f72742073687574646f776e
15:00:45.248 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:45.250 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:45.250 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] INBOUND SETTINGS: ack=false settings={}
15:00:45.250 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] OUTBOUND SETTINGS: ack=true
15:00:48.910 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:48.911 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:48.911 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] INBOUND SETTINGS: ack=false settings={}
15:00:48.911 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] OUTBOUND SETTINGS: ack=true

These may be related:


shinhyungyang commented Jun 4, 2025

@DaGeRe

Thanks for having a look at the problem on my laptop during yesterday's Kieker meeting. Today I solved this issue on my office laptop as well, including the ulimits issue.

I re-installed Docker several times, so I cannot pinpoint where the problem was, but a clean Docker installation should work.

diff --git a/docker-compose.yml b/docker-compose.yml
index 85f9ee5..73eb1d6 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -72,7 +72,7 @@ services:
       - OTEL_LOGS_EXPORTER=otlp
       - OTEL_SERVICE_NAME=ad
       # Workaround on OSX for https://bugs.openjdk.org/browse/JDK-8345296
-      - _JAVA_OPTIONS
+      - _JAVA_OPTIONS=-Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]
     depends_on:
       otel-collector:
         condition: service_started
diff --git a/src/otel-collector/otelcol-config-extras.yml b/src/otel-collector/otelcol-config-extras.yml
index 73dd011..15b2721 100644
--- a/src/otel-collector/otelcol-config-extras.yml
+++ b/src/otel-collector/otelcol-config-extras.yml
@@ -8,11 +8,13 @@
 ## Note: the spanmetrics exporter must be included in the exporters array
 ## if overriding the traces pipeline.
 ##
-#  exporters:
-#    otlphttp/example:
-#      endpoint: <your-endpoint-url>
-#
-#  service:
-#    pipelines:
-#      traces:
-#        exporters: [spanmetrics, otlphttp/example]
+exporters:
+  otlp/example:
+    endpoint: http://172.19.233.3:9000
+    tls:
+      insecure: true
+
+service:
+  pipelines:
+    traces:
+      exporters: [spanmetrics, otlp/example]
