OpenTelemetry to Kieker Demo
@DaGeRe, last active June 4, 2025

To run the OpenTelemetry demo with Kieker, follow these steps:

  1. Start the Kieker listener:
git clone -b GH-13-GetOERFromProto git@github.com:kieker-monitoring/kieker.git
cd kieker/
cd tools/otel-transformer
../../gradlew assemble
cd build/distributions
unzip otel-transformer-2.0.3-SNAPSHOT.zip
cd otel-transformer-2.0.3-SNAPSHOT/
bin/otel-transformer -lp 9000 -standard OPENTELEMETRY

(Leave this process running; stop it with Ctrl+C after the demo has finished.)
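
Before continuing, you can sanity-check that the listener actually accepts connections (a quick sketch, assuming ss and a reasonably recent nc are installed):

ss -tlnp | grep 9000                # the transformer should show up as LISTEN on port 9000
nc -vz <listener-ip> 9000           # from another shell; "succeeded"/"open" means the port is reachable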

  2. Prepare the OpenTelemetry demo: git clone https://github.com/open-telemetry/opentelemetry-demo.git
  3. Edit the repository (go to opentelemetry-demo first):
  • Edit src/otel-collector/otelcol-config-extras.yml: Add (replacing $MYIP with your host's IP):
exporters:
  otlp/example:
    endpoint: http://$MYIP:9000
    tls:
      insecure: true
      
service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlp/example]
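
Note that $MYIP must be an address that is reachable from inside the Docker containers, so localhost/127.0.0.1 will not work. A sketch for finding the host's outbound IPv4 address on Linux (assuming iproute2 and GNU grep):

ip route get 1.1.1.1 | grep -oP 'src \K\S+'    # address the host uses for outbound traffic
hostname -I                                     # alternatively: list all assigned addresses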
  • Edit docker-compose.yml: Add - _JAVA_OPTIONS=-Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers] to the ad service's environment section (see the sketch below)
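
For orientation, the resulting entry might look roughly like the sketch below; the exact service name and surrounding keys in your checkout of docker-compose.yml may differ:

ad:
  environment:
    - _JAVA_OPTIONS=-Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]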
  4. Run the demo (in opentelemetry-demo):
  • Run make start
  • Do anything you'd like to do with the interface
  • Run docker compose down -v
  5. Visualize the results with Kieker, starting from the kieker/ repository root (replacing $KIEKERPATH with the path the listener printed):
cd tools/trace-analysis
../../gradlew assemble
cd build/distributions
unzip trace-analysis-2.0.3-SNAPSHOT.zip
cd trace-analysis-2.0.3-SNAPSHOT/
mkdir graphs
bin/trace-analysis \
  -i $KIEKERPATH \
  -o graphs \
  --plot-Deployment-Component-Dependency-Graph responseTimes-ms \
  --plot-Deployment-Operation-Dependency-Graph responseTimes-ms \
  --plot-Aggregated-Deployment-Call-Tree \
  --plot-Aggregated-Assembly-Call-Tree
cd graphs
for file in *.dot; do dot -Tpng $file -o $file.png; done
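
The conversion loop requires Graphviz; if dot is not found, it can usually be installed from the distribution's package manager, for example:

sudo dnf install graphviz      # Fedora; on Debian/Ubuntu: sudo apt install graphviz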
shinhyungyang commented May 27, 2025

It's Fedora 42 on x86-64. I forgot to mention it, but I followed the solution here: wazuh/wazuh-docker#903 (comment)
The solution there was to reconfigure the Elasticsearch container; opentelemetry-demo uses OpenSearch.

shinhyungyang commented May 27, 2025
I can see the demo via localhost:8080, but I am not receiving any Kieker records.

src/otel-collector/otelcol-config-extras.yml file:

exporters:
  otlphttp/example:
    endpoint: http://172.19.233.3:9000
    tls:
      insecure: true

service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlphttp/example]

It may only work at home. Running network-related services under the university's network policy seems hard, and they keep hardening it further.

DaGeRe (Author) commented May 27, 2025

It's Fedora 42 on x86-64. I forgot to mention it, but I followed the solution here: wazuh/wazuh-docker#903 (comment) The solution there was to reconfigure the Elasticsearch container; opentelemetry-demo uses OpenSearch.

It works for me with Fedora 41 on x86-64, so that is a bit surprising. I'll update and see what happens.

DaGeRe (Author) commented May 27, 2025

I can see the demo via localhost:8080, but I am not receiving any Kieker records.
src/otel-collector/otelcol-config-extras.yml file:

exporters:
  otlphttp/example:
    endpoint: http://172.19.233.3:9000
    tls:
      insecure: true

service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlphttp/example]

It may only work at home. Running network-related services under the university's network policy seems hard, and they keep hardening it further.

Could you try docker logs ad? There should be no errors there.

And the output of the Kieker process should contain entries like

Hostname: flagd.evaluation.v1.Service-172.19.0.21
16:55:54.246 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] OUTBOUND HEADERS: streamId=7569 headers=GrpcHttp2OutboundHeaders[:status: 200, content-type: application/grpc, grpc-encoding: identity, grpc-accept-encoding: gzip] padding=0 endStream=false
16:55:54.246 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] OUTBOUND DATA: streamId=7569 padding=0 endStream=false length=5 bytes=0000000000
16:55:54.246 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] OUTBOUND HEADERS: streamId=7569 headers=GrpcHttp2OutboundHeaders[grpc-status: 0] padding=0 endStream=true
16:55:54.246 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=5
16:55:54.246 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] INBOUND PING: ack=false bytes=145258749040133895
16:55:54.246 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] OUTBOUND PING: ack=true bytes=145258749040133895
16:55:59.262 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] INBOUND HEADERS: streamId=7571 headers=GrpcHttp2RequestHeaders[:path: /opentelemetry.proto.collector.trace.v1.TraceService/Export, :authority: 10.45.29.12:9000, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, user-agent: OpenTelemetry Collector Contrib/0.125.0 (linux/amd64) grpc-go/1.72.0, grpc-encoding: gzip, grpc-accept-encoding: snappy,zstd,gzip,zstdarrow1,zstdarrow2,zstdarrow3,zstdarrow4,zstdarrow5,zstdarrow6,zstdarrow7,zstdarrow8,zstdarrow9,zstdarrow10, grpc-timeout: 4999945u] padding=0 endStream=false
16:55:59.262 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x3f97b643, L:/10.45.29.12:9000 - R:/172.19.0.9:40948] INBOUND DATA: streamId=7571 padding=0 endStream=true length=2014 bytes=01000007d91f8b08000000000000ffc4db7b5053571ec0719e791c89c26dc090085c79c843b9c90d0f01b7561a41cb6c8415b19515986b7208d12437e6de0444...
key: "net.peer.name"
value {
  string_value: "172.19.0.12"
}
 -- string_value: "172.19.0.12"

key: "net.peer.port"
value {
  int_value: 46082
}
 -- int_value: 46082

key: "rpc.system"
value {
  string_value: "grpc"
}
 -- string_value: "grpc"

key: "rpc.service"
value {
  string_value: "flagd.evaluation.v1.Service"
}
 -- string_value: "flagd.evaluation.v1.Service"

key: "rpc.method"
value {
  string_value: "EventStream"
}
 -- string_value: "EventStream"

key: "rpc.grpc.status_code"
value {
  int_value: 0
}
 -- int_value: 0

The university network should not interfere: if your services can communicate with each other, the containers should also be able to reach port 9000.
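
One way to test exactly this from a container's point of view is a throwaway container (a sketch assuming the busybox image is available; the request itself will fail because the port speaks gRPC rather than plain HTTP, but "connected" vs. "connection refused" in the output, together with an "Unexpected HTTP/1.x request" line in the listener log, shows whether the port is reachable):

docker run --rm busybox wget -O- http://$MYIP:9000/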

shinhyungyang commented May 27, 2025

On my personal desktop (Fedora 42) the Kieker listener is responding.

Probably this part is important:

INFO: Transport failed
io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception: Unexpected HTTP/1.x request: POST /v1/traces
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception.connectionError(Http2Exception.java:107)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.readClientPrefaceString(Http2ConnectionHandler.java:315)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.decode(Http2ConnectionHandler.java:245)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:451)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:530)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:469)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1407)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:918)
	at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:799)
	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:501)
	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:399)
	at io.grpc.netty.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:994)
	at io.grpc.netty.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.grpc.netty.shaded.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:1583)

And the entire log follows:

bin/otel-transformer -lp 9000 -standard OPENTELEMETRY
22:48:39.333 [main] DEBUG RecordReceiverMain -- OpenTelemetry Transformer
22:48:39.386 [main] DEBUG kieker.analysis.generic.sink.DataSink:DataSink-0 -- numOpenedInputPorts (inc): 1
22:48:39.387 [main] DEBUG kieker.analysis.generic.sink.DataSink -- Configuration complete.
22:48:39.399 [main] INFO kieker.monitoring.core.controller.TCPController -- Could not parse port for the TCPController, deactivating this option. Received string was:
22:48:39.478 [main] DEBUG kieker.monitoring.core.controller.WriterController -- Initializing Writer Controller
22:48:39.478 [main] INFO kieker.monitoring.core.controller.StateController -- Enabling monitoring
22:48:39.478 [Thread-0] DEBUG kieker.monitoring.writer.MonitoringWriterThread -- kieker.monitoring.writer.MonitoringWriterThread is running.
22:48:39.490 [main] INFO kieker.monitoring.core.controller.MonitoringController -- Current State of kieker.monitoring (2.0.3-SNAPSHOT) Status: 'enabled'
	Name: 'KIEKER'; Hostname: 'fedora.papenkamp'; experimentID: '1'
JMXController: JMX disabled
TimeSource: 'kieker.monitoring.timer.SystemNanoTimer'
	Time in nanoseconds (with nanoseconds precision) since Thu Jan 01 01:00:00 CET 1970'
ProbeController: disabled
WriterController:
	Queue type: class kieker.monitoring.queue.BlockingQueueDecorator
	Queue capacity: 10000
	Insert behavior (a.k.a. QueueFullBehavior): class kieker.monitoring.queue.behavior.BlockOnFailedInsertBehavior
		numBlocked: 0
Writer: 'kieker.monitoring.writer.filesystem.FileWriter'
	Configuration:
		kieker.monitoring.writer.filesystem.FileWriter.logFilePoolHandler='kieker.monitoring.writer.filesystem.RotatingLogFilePoolHandler'
		kieker.monitoring.writer.filesystem.FileWriter.charsetName='UTF-8'
		kieker.monitoring.writer.filesystem.FileWriter.logStreamHandler='kieker.monitoring.writer.filesystem.TextLogStreamHandler'
		kieker.monitoring.writer.filesystem.FileWriter.maxEntriesInFile='25000'
		kieker.monitoring.writer.filesystem.FileWriter.maxLogFiles='-1'
		kieker.monitoring.writer.filesystem.FileWriter.maxLogSize='-1'
		kieker.monitoring.writer.filesystem.FileWriter.mapFileHandler='kieker.monitoring.writer.filesystem.TextMapFileHandler'
		kieker.monitoring.writer.filesystem.FileWriter.flush='false'
		kieker.monitoring.writer.filesystem.FileWriter.customStoragePath=''
		kieker.monitoring.writer.filesystem.FileWriter.actualStoragePath='/tmp/kieker-20250527-204839-2302315611002-UTC--KIEKER'

	Automatic assignment of logging timestamps: 'true'
Sampling Controller: Periodic Sensor available: Poolsize: '0'; Scheduled Tasks: '0'
22:48:39.496 [main] DEBUG teetime.framework.scheduling.pushpullmodel.A3PipeInstantiation -- Connected (unsynch) teetime.framework.OutputPort@7ea37dbf and teetime.framework.InputPort@4b44655e
22:48:39.500 [Thread for OtlpGrpcReceiverStage-0] DEBUG kieker.tools.oteltransformer.receiver.OtlpGrpcReceiverStage -- Executing runnable stage...
22:48:39.500 [main] DEBUG teetime.framework.Execution -- Using scheduler: teetime.framework.scheduling.pushpullmodel.PushPullScheduling
22:48:39.502 [main] DEBUG RecordReceiverMain -- Running transformer
22:48:39.537 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.logging.InternalLoggerFactory -- Using SLF4J as the default logging framework
22:48:39.539 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- -Dio.netty.noUnsafe: false
22:48:39.539 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- Java version: 21
22:48:39.540 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.theUnsafe: available
22:48:39.540 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.copyMemory: available
22:48:39.540 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.storeFence: available
22:48:39.541 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.Buffer.address: available
22:48:39.541 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- direct buffer constructor: unavailable: Reflective setAccessible(true) disabled
22:48:39.542 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.Bits.unaligned: available, true
22:48:39.543 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable: class io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0$7 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @67784306
22:48:39.543 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.DirectByteBuffer.<init>(long, {int,long}): unavailable
22:48:39.543 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- sun.misc.Unsafe: available
22:48:39.544 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
22:48:39.544 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.bitMode: 64 (sun.arch.data.model)
22:48:39.545 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.maxDirectMemory: -1 bytes
22:48:39.545 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.uninitializedArrayAllocationThreshold: -1
22:48:39.553 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.CleanerJava9 -- java.nio.ByteBuffer.cleaner(): available
22:48:39.554 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.noPreferDirect: false
22:48:39.602 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)
22:48:39.602 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.deleteLibAfterLoading: true
22:48:39.602 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.tryPatchShadedId: true
22:48:39.602 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.detectNativeLibraryDuplicates: true
22:48:39.611 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- Successfully loaded the library /tmp/libio_grpc_netty_shaded_netty_transport_native_epoll_x86_6416751553277586714025.so
22:48:39.614 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- -Djava.net.preferIPv4Stack: false
22:48:39.614 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- -Djava.net.preferIPv6Addresses: false
22:48:39.616 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtilInitializations -- Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
22:48:39.617 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- /proc/sys/net/core/somaxconn: 4096
22:48:39.634 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.channel.MultithreadEventLoopGroup -- -Dio.netty.eventLoopThreads: 24
22:48:39.676 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.concurrent.GlobalEventExecutor -- -Dio.netty.globalEventExecutor.quietPeriodSeconds: 1
22:48:39.679 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.InternalThreadLocalMap -- -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
22:48:39.680 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.InternalThreadLocalMap -- -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
22:48:39.688 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- org.jctools-core.MpscChunkedArrayQueue: available
22:48:39.707 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector -- -Dio.grpc.netty.shaded.io.netty.leakDetection.level: simple
22:48:39.707 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector -- -Dio.grpc.netty.shaded.io.netty.leakDetection.targetRecords: 4
22:48:39.708 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.numHeapArenas: 24
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.numDirectArenas: 24
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.pageSize: 8192
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxOrder: 9
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.chunkSize: 4194304
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.smallCacheSize: 256
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.normalCacheSize: 64
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxCachedBufferCapacity: 32768
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.cacheTrimInterval: 8192
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.cacheTrimIntervalMillis: 0
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.useCacheForAllThreads: false
22:48:39.709 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
22:48:39.730 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId -- -Dio.netty.processId: 42815 (auto-detected)
22:48:39.732 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId -- -Dio.netty.machineId: 6e:56:4d:ff:fe:25:7b:9a (auto-detected)
22:48:39.746 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.allocator.type: pooled
22:48:39.746 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.threadLocalDirectBufferSize: 0
22:48:39.746 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.maxThreadLocalCharBufferSize: 16384
22:48:39.748 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.bootstrap.ChannelInitializerExtensions -- -Dio.netty.bootstrap.extensions: null
OTLP gRPC Receiver läuft auf Port 9000
22:49:28.827 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf -- -Dio.grpc.netty.shaded.io.netty.buffer.checkAccessible: true
22:49:28.827 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf -- -Dio.grpc.netty.shaded.io.netty.buffer.checkBounds: true
22:49:28.828 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory -- Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector@239e3de9
22:49:28.912 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xc0843247, L:/192.168.178.31:9000 - R:/172.18.0.9:47898] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
22:49:28.916 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.maxCapacityPerThread: 4096
22:49:28.917 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.ratio: 8
22:49:28.917 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.chunkSize: 32
22:49:28.917 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.blocking: false
22:49:28.917 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.batchFastThreadLocalOnly: true
22:49:28.932 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xc0843247, L:/192.168.178.31:9000 - R:/172.18.0.9:47898] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
22:49:28.950 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xc0843247, L:/192.168.178.31:9000 - R:/172.18.0.9:47898] OUTBOUND GO_AWAY: lastStreamId=2147483647 errorCode=1 length=45 bytes=556e657870656374656420485454502f312e7820726571756573743a20504f5354202f76312f74726163657320
22:49:28.955 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler -- [id: 0xc0843247, L:/192.168.178.31:9000 - R:/172.18.0.9:47898] Sent GOAWAY: lastStreamId '2147483647', errorCode '1', debugData 'Unexpected HTTP/1.x request: POST /v1/traces '. Forcing shutdown of the connection.
May 27, 2025 10:49:28 PM io.grpc.netty.shaded.io.grpc.netty.NettyServerTransport notifyTerminated
INFO: Transport failed
io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception: Unexpected HTTP/1.x request: POST /v1/traces
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception.connectionError(Http2Exception.java:107)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.readClientPrefaceString(Http2ConnectionHandler.java:315)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.decode(Http2ConnectionHandler.java:245)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:451)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:530)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:469)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1407)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:918)
	at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:799)
	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:501)
	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:399)
	at io.grpc.netty.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:994)
	at io.grpc.netty.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.grpc.netty.shaded.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:1583)

22:49:30.600 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xcc40dd9f, L:/192.168.178.31:9000 - R:/172.18.0.9:35654] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
22:49:30.603 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xcc40dd9f, L:/192.168.178.31:9000 - R:/172.18.0.9:35654] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
22:49:30.605 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xcc40dd9f, L:/192.168.178.31:9000 - R:/172.18.0.9:35654] OUTBOUND GO_AWAY: lastStreamId=2147483647 errorCode=1 length=45 bytes=556e657870656374656420485454502f312e7820726571756573743a20504f5354202f76312f74726163657320
22:49:30.606 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler -- [id: 0xcc40dd9f, L:/192.168.178.31:9000 - R:/172.18.0.9:35654] Sent GOAWAY: lastStreamId '2147483647', errorCode '1', debugData 'Unexpected HTTP/1.x request: POST /v1/traces '. Forcing shutdown of the connection.
May 27, 2025 10:49:30 PM io.grpc.netty.shaded.io.grpc.netty.NettyServerTransport notifyTerminated
INFO: Transport failed
io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception: Unexpected HTTP/1.x request: POST /v1/traces
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2Exception.connectionError(Http2Exception.java:107)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.readClientPrefaceString(Http2ConnectionHandler.java:315)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.decode(Http2ConnectionHandler.java:245)
	at io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:451)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:530)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:469)
	at io.grpc.netty.shaded.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
	at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1407)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
	at io.grpc.netty.shaded.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
	at io.grpc.netty.shaded.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:918)
	at io.grpc.netty.shaded.io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:799)
	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:501)
	at io.grpc.netty.shaded.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:399)
	at io.grpc.netty.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:994)
	at io.grpc.netty.shaded.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.grpc.netty.shaded.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:1583)

22:49:30.803 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xac750bb3, L:/192.168.178.31:9000 - R:/172.18.0.9:35670] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
22:49:30.814 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xac750bb3, L:/192.168.178.31:9000 - R:/172.18.0.9:35670] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
22:49:30.816 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xac750bb3, L:/192.168.178.31:9000 - R:/172.18.0.9:35670] OUTBOUND GO_AWAY: lastStreamId=2147483647 errorCode=1 length=45 bytes=556e657870656374656420485454502f312e7820726571756573743a20504f5354202f76312f74726163657320
22:49:30.817 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler -- [id: 0xac750bb3, L:/192.168.178.31:9000 - R:/172.18.0.9:35670] Sent GOAWAY: lastStreamId '2147483647', errorCode '1', debugData 'Unexpected HTTP/1.x request: POST /v1/traces '. Forcing shutdown of the connection.
May 27, 2025 10:49:30 PM io.grpc.netty.shaded.io.grpc.netty.NettyServerTransport notifyTerminated

DaGeRe (Author) commented Jun 1, 2025

Thanks for the log, that is pretty surprising.

Could you check whether

exporters:
  otlp/example:
    endpoint: grpc://192.168.178.28:9000
    tls:
      insecure: true
      
service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlp/example]

works? Usually, the records should be sent via gRPC, not via HTTP.

EDIT: Sorry, that's wrong. I tried again, and for me it works with http, not with grpc. Could you double-check whether your configuration file contains exactly the code displayed above?
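
For context on the "Unexpected HTTP/1.x request: POST /v1/traces" error above: the exporter name selects the protocol (otlp speaks gRPC, otlphttp speaks HTTP/1.1), while the http:// scheme in the endpoint together with tls.insecure only means "plaintext". That error is therefore what the gRPC-only listener reports when the otlphttp exporter posts to it. The same log line can be provoked by hand (a sketch, assuming curl is installed on the host):

curl -X POST http://$MYIP:9000/v1/traces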

DaGeRe (Author) commented Jun 1, 2025

And just for the record:

git clone -b GH-13-GetOERFromProto git@github.com:kieker-monitoring/kieker.git
cd kieker/
cd tools/otel-transformer
../../gradlew assemble
cd build/distributions
unzip otel-transformer-2.0.3-SNAPSHOT.zip
cd otel-transformer-2.0.3-SNAPSHOT/
bin/otel-transformer -lp 9000 -standard OPENTELEMETRY

is the first part of the process,

cd ../../../trace-analysis
../../gradlew assemble
cd build/distributions
unzip trace-analysis-2.0.3-SNAPSHOT.zip
cd trace-analysis-2.0.3-SNAPSHOT/
mkdir graphs
ls /tmp

the second part, and then

export KIEKERPATH=/tmp/kieker-20250601-104535-7242664420886-UTC--KIEKER
bin/trace-analysis   -i $KIEKERPATH   -o graphs   --plot-Deployment-Component-Dependency-Graph responseTimes-ms   --plot-Deployment-Operation-Dependency-Graph responseTimes-ms   --plot-Aggregated-Deployment-Call-Tree   --plot-Aggregated-Assembly-Call-Tree
cd graphs/
for file in *.dot; do dot -Tpng $file -o $file.png; done

is the third part.
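
If several Kieker logs have accumulated under /tmp, the latest one can also be picked automatically instead of copying the path by hand (a sketch assuming the default storage location shown in the listener output):

export KIEKERPATH=$(ls -dt /tmp/kieker-*--KIEKER | head -n 1)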

DaGeRe (Author) commented Jun 1, 2025

After looking deeper into this, there is a conceptual problem here.

Kieker assumes operation call traces, which means that calls are synchronous: once an operation call has finished, we cannot go back to it. OpenTelemetry, on the other hand, has potentially concurrent calls that in some cases cannot be put into a serial order.

One example occurs within a single trace: ListProducts does something, a lot of GetProduct calls run in parallel, and later ListProducts comes back with another call that was made afterwards (in trace 8339157054149764712). Since we do not know in advance when this kind of additional data arrives, it is hard to create Kieker traces from it.

The situation looks like this:

(figure: trace visualization of the concurrent calls described above)

I see two options to handle this:

  • Add a processing step that makes everything non-concurrent by reordering the eoi/ess values afterwards. This would work, but would not be really clean.
  • Store spans inside Kieker and create MessageTraces from them, potentially allowing most of the visualizations we had before. We could potentially build on AbstractTraceEvent. This would be more sustainable, but I do not know whether we need this kind of comprehensive rewrite (as far as I can see, it would require rewriting significant parts of the analysis pipeline).

@shinhyungyang Let's discuss during the Kieker meeting.

shinhyungyang commented Jun 2, 2025

@DaGeRe I am now continuing to look into my issue. Below is the log of the ad container:

Picked up JAVA_TOOL_OPTIONS: -javaagent:/usr/src/app/opentelemetry-javaagent.jar
Picked up _JAVA_OPTIONS: -Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]
OpenJDK 64-Bit Server VM warning: Sharing is only supported for boot loader classes because bootstrap classpath has been appended
[otel.javaagent 2025-06-02 07:34:43:973 +0000] [main] INFO io.opentelemetry.javaagent.tooling.VersionLogger - opentelemetry-javaagent - version: 2.15.0
2025-06-02 07:34:47 - oteldemo.AdService - Ad service starting. trace_id=0589a16b0002210f47f4cb1044104881 span_id=5d0a88b29a0e2fae trace_flags=01 
SLF4J(W): No SLF4J providers were found.
SLF4J(W): Defaulting to no-operation (NOP) logger implementation
SLF4J(W): See https://www.slf4j.org/codes.html#noProviders for further details.
2025-06-02 07:34:48 - oteldemo.AdService - Ad service started, listening on 9555 trace_id=0589a16b0002210f47f4cb1044104881 span_id=6118f8b81c8b740f trace_flags=01 
[otel.javaagent 2025-06-02 07:35:00:676 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export logs. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more
[otel.javaagent 2025-06-02 07:35:00:737 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export spans. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more
[otel.javaagent 2025-06-02 07:35:17:110 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export logs. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more

This log keeps repeating:

[otel.javaagent 2025-06-02 07:35:00:676 +0000] [OkHttp http://otel-collector:4318/...] ERROR io.opentelemetry.exporter.internal.http.HttpExporter - Failed to export logs. The request could not be executed. Full error message: timeout
java.io.InterruptedIOException: timeout
	at okhttp3.internal.connection.RealCall.timeoutExit(RealCall.kt:398)
	at okhttp3.internal.connection.RealCall.callDone(RealCall.kt:360)
	at okhttp3.internal.connection.RealCall.noMoreExchanges$okhttp(RealCall.kt:325)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:209)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Canceled
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:72)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at io.opentelemetry.exporter.sender.okhttp.internal.RetryInterceptor.intercept(RetryInterceptor.java:96)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
	at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
	... 4 more

One thing I overlooked: otlphttp -> otlp

  • old
exporters:
  otlphttp/example:
    endpoint: http://172.19.233.3:9000
    tls:
      insecure: true

service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlphttp/example]
  • new
exporters:
  otlp/example:
    endpoint: http://172.19.233.3:9000
    tls:
      insecure: true

service:
  pipelines:
    traces:
      exporters: [spanmetrics, otlp/example]

After this, the otel-transformer logs look different (no records yet):

bin/otel-transformer -lp 9000 -standard OPENTELEMETRY
15:00:24.776 [main] DEBUG RecordReceiverMain -- OpenTelemetry Transformer
15:00:24.845 [main] DEBUG kieker.analysis.generic.sink.DataSink:DataSink-0 -- numOpenedInputPorts (inc): 1
15:00:24.846 [main] DEBUG kieker.analysis.generic.sink.DataSink -- Configuration complete.
15:00:24.862 [main] INFO kieker.monitoring.core.controller.TCPController -- Could not parse port for the TCPController, deactivating this option. Received string was:
15:00:24.941 [main] DEBUG kieker.monitoring.core.controller.WriterController -- Initializing Writer Controller
15:00:24.941 [main] INFO kieker.monitoring.core.controller.StateController -- Enabling monitoring
15:00:24.941 [Thread-0] DEBUG kieker.monitoring.writer.MonitoringWriterThread -- kieker.monitoring.writer.MonitoringWriterThread is running.
15:00:24.955 [main] INFO kieker.monitoring.core.controller.MonitoringController -- Current State of kieker.monitoring (2.0.3-SNAPSHOT) Status: 'enabled'
        Name: 'KIEKER'; Hostname: 'fedora'; experimentID: '1'
JMXController: JMX disabled
TimeSource: 'kieker.monitoring.timer.SystemNanoTimer'
        Time in nanoseconds (with nanoseconds precision) since Thu Jan 01 01:00:00 CET 1970'
ProbeController: disabled
WriterController:
        Queue type: class kieker.monitoring.queue.BlockingQueueDecorator
        Queue capacity: 10000
        Insert behavior (a.k.a. QueueFullBehavior): class kieker.monitoring.queue.behavior.BlockOnFailedInsertBehavior
                numBlocked: 0
Writer: 'kieker.monitoring.writer.filesystem.FileWriter'
        Configuration:
                kieker.monitoring.writer.filesystem.FileWriter.logFilePoolHandler='kieker.monitoring.writer.filesystem.RotatingLogFilePoolHandler'
                kieker.monitoring.writer.filesystem.FileWriter.charsetName='UTF-8'
                kieker.monitoring.writer.filesystem.FileWriter.logStreamHandler='kieker.monitoring.writer.filesystem.TextLogStreamHandler'
                kieker.monitoring.writer.filesystem.FileWriter.maxEntriesInFile='25000'
                kieker.monitoring.writer.filesystem.FileWriter.maxLogFiles='-1'
                kieker.monitoring.writer.filesystem.FileWriter.maxLogSize='-1'
                kieker.monitoring.writer.filesystem.FileWriter.mapFileHandler='kieker.monitoring.writer.filesystem.TextMapFileHandler'
                kieker.monitoring.writer.filesystem.FileWriter.flush='false'
                kieker.monitoring.writer.filesystem.FileWriter.customStoragePath=''
                kieker.monitoring.writer.filesystem.FileWriter.actualStoragePath='/tmp/kieker-20250602-130024-27376755983835-UTC--KIEKER'

        Automatic assignment of logging timestamps: 'true'
Sampling Controller: Periodic Sensor available: Poolsize: '0'; Scheduled Tasks: '0'
15:00:24.962 [main] DEBUG teetime.framework.scheduling.pushpullmodel.A3PipeInstantiation -- Connected (unsynch) teetime.framework.OutputPort@7ea37dbf and teetime.framework.InputPort@4b44655e
15:00:24.966 [Thread for OtlpGrpcReceiverStage-0] DEBUG kieker.tools.oteltransformer.receiver.OtlpGrpcReceiverStage -- Executing runnable stage...
15:00:24.966 [main] DEBUG teetime.framework.Execution -- Using scheduler: teetime.framework.scheduling.pushpullmodel.PushPullScheduling
15:00:24.967 [main] DEBUG RecordReceiverMain -- Running transformer
15:00:25.006 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.logging.InternalLoggerFactory -- Using SLF4J as the default logging framework
15:00:25.007 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- -Dio.netty.noUnsafe: false
15:00:25.008 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- Java version: 21
15:00:25.008 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.theUnsafe: available
15:00:25.009 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.copyMemory: available
15:00:25.009 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- sun.misc.Unsafe.storeFence: available
15:00:25.010 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.Buffer.address: available
15:00:25.010 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- direct buffer constructor: unavailable: Reflective setAccessible(true) disabled
15:00:25.011 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.Bits.unaligned: available, true
15:00:25.012 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable: class io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0$7 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @67784306
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent0 -- java.nio.DirectByteBuffer.<init>(long, {int,long}): unavailable
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- sun.misc.Unsafe: available
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
15:00:25.013 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.bitMode: 64 (sun.arch.data.model)
15:00:25.014 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.maxDirectMemory: -1 bytes
15:00:25.015 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.uninitializedArrayAllocationThreshold: -1
15:00:25.022 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.CleanerJava9 -- java.nio.ByteBuffer.cleaner(): available
15:00:25.022 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- -Dio.netty.noPreferDirect: false
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.deleteLibAfterLoading: true
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.tryPatchShadedId: true
15:00:25.057 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- -Dio.netty.native.detectNativeLibraryDuplicates: true
15:00:25.067 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.NativeLibraryLoader -- Successfully loaded the library /tmp/libio_grpc_netty_shaded_netty_transport_native_epoll_x86_649226255254318685962.so
15:00:25.071 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- -Djava.net.preferIPv4Stack: false
15:00:25.071 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- -Djava.net.preferIPv6Addresses: false
15:00:25.073 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtilInitializations -- Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
15:00:25.074 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil -- /proc/sys/net/core/somaxconn: 4096
15:00:25.092 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.channel.MultithreadEventLoopGroup -- -Dio.netty.eventLoopThreads: 24
15:00:25.132 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.concurrent.GlobalEventExecutor -- -Dio.netty.globalEventExecutor.quietPeriodSeconds: 1
15:00:25.136 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.InternalThreadLocalMap -- -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
15:00:25.136 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.InternalThreadLocalMap -- -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
15:00:25.147 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent -- org.jctools-core.MpscChunkedArrayQueue: available
15:00:25.171 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector -- -Dio.grpc.netty.shaded.io.netty.leakDetection.level: simple
15:00:25.171 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector -- -Dio.grpc.netty.shaded.io.netty.leakDetection.targetRecords: 4
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.numHeapArenas: 24
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.numDirectArenas: 24
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.pageSize: 8192
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxOrder: 9
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.chunkSize: 4194304
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.smallCacheSize: 256
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.normalCacheSize: 64
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxCachedBufferCapacity: 32768
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.cacheTrimInterval: 8192
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.cacheTrimIntervalMillis: 0
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.useCacheForAllThreads: false
15:00:25.173 [Thread for OtlpGrpcReceiverStage-0] DEBUG io.grpc.netty.shaded.io.netty.buffer.PooledByteBufAllocator -- -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
15:00:25.195 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId -- -Dio.netty.processId: 576675 (auto-detected)
15:00:25.197 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId -- -Dio.netty.machineId: 2c:58:b9:ff:fe:b3:8e:20 (auto-detected)
15:00:25.218 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.allocator.type: pooled
15:00:25.218 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.threadLocalDirectBufferSize: 0
15:00:25.218 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.ByteBufUtil -- -Dio.netty.maxThreadLocalCharBufferSize: 16384
15:00:25.222 [grpc-default-boss-ELG-1-1] DEBUG io.grpc.netty.shaded.io.netty.bootstrap.ChannelInitializerExtensions -- -Dio.netty.bootstrap.extensions: null
OTLP gRPC Receiver läuft auf Port 9000
15:00:39.565 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf -- -Dio.grpc.netty.shaded.io.netty.buffer.checkAccessible: true
15:00:39.565 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf -- -Dio.grpc.netty.shaded.io.netty.buffer.checkBounds: true
15:00:39.565 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory -- Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector@3d86d6ae
15:00:39.619 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.maxCapacityPerThread: 4096
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.ratio: 8
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.chunkSize: 32
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.blocking: false
15:00:39.621 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler -- -Dio.netty.recycler.batchFastThreadLocalOnly: true
15:00:39.629 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:39.639 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] INBOUND SETTINGS: ack=false settings={}
15:00:39.640 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x0fa706b6, L:/172.19.233.3:9000 - R:/172.19.233.3:48632] OUTBOUND SETTINGS: ack=true
15:00:40.275 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:40.276 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:40.277 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] INBOUND SETTINGS: ack=false settings={}
15:00:40.277 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] OUTBOUND SETTINGS: ack=true
15:00:40.277 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] INBOUND SETTINGS: ack=true
15:00:40.279 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x27fe0560, L:/172.19.233.3:9000 - R:/172.19.233.3:48646] INBOUND GO_AWAY: lastStreamId=2147483647 errorCode=0 length=25 bytes=636c69656e74207472616e73706f72742073687574646f776e
15:00:41.040 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:41.047 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:41.048 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] INBOUND SETTINGS: ack=false settings={}
15:00:41.048 [grpc-default-worker-ELG-3-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x7df79209, L:/172.19.233.3:9000 - R:/172.19.233.3:45224] OUTBOUND SETTINGS: ack=true
15:00:41.968 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:41.970 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:41.971 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] INBOUND SETTINGS: ack=false settings={}
15:00:41.971 [grpc-default-worker-ELG-3-4] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xe02df6fd, L:/172.19.233.3:9000 - R:/172.19.233.3:45232] OUTBOUND SETTINGS: ack=true
15:00:43.206 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:43.207 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:43.208 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] INBOUND SETTINGS: ack=false settings={}
15:00:43.208 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] OUTBOUND SETTINGS: ack=true
15:00:43.209 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] INBOUND SETTINGS: ack=true
15:00:43.209 [grpc-default-worker-ELG-3-5] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0xd8487eae, L:/172.19.233.3:9000 - R:/172.19.233.3:45238] INBOUND GO_AWAY: lastStreamId=2147483647 errorCode=0 length=25 bytes=636c69656e74207472616e73706f72742073687574646f776e
15:00:45.248 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:45.250 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:45.250 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] INBOUND SETTINGS: ack=false settings={}
15:00:45.250 [grpc-default-worker-ELG-3-6] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2ec89ed1, L:/172.19.233.3:9000 - R:/172.19.233.3:45254] OUTBOUND SETTINGS: ack=true
15:00:48.910 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
15:00:48.911 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
15:00:48.911 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] INBOUND SETTINGS: ack=false settings={}
15:00:48.911 [grpc-default-worker-ELG-3-7] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler -- [id: 0x2af0aba7, L:/172.19.233.3:9000 - R:/172.19.233.3:45270] OUTBOUND SETTINGS: ack=true
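
Side note: the frame log above only shows SETTINGS and WINDOW_UPDATE exchanges, with no HEADERS or DATA frames, so connections are being established but no Export RPCs seem to arrive. The GO_AWAY debug payload is plain ASCII; decoding it (a quick check, assuming xxd is available, e.g. from Fedora's vim-common package) confirms the collector is closing idle connections normally rather than erroring out:

echo 636c69656e74207472616e73706f72742073687574646f776e | xxd -r -p
# -> client transport shutdown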

These may be related:

@shinhyungyang

shinhyungyang commented Jun 4, 2025

@DaGeRe

Thanks for having a look at the problem on my laptop during yesterday's Kieker meeting. Today I also solved this issue on my office laptop, including the ulimits issue.

I have re-installed Docker several times, so I cannot pinpoint where the problem was, but a clean Docker installation should work. The full diff I applied is below.
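
For reference, a clean reinstall on Fedora might look like the following sketch. The package names are an assumption (they differ between Docker's upstream docker-ce repo and Fedora's own moby-engine packages), and the rm step deletes all images, containers, and volumes:

sudo systemctl stop docker
sudo dnf remove docker-ce docker-ce-cli containerd.io   # or: moby-engine
sudo rm -rf /var/lib/docker /var/lib/containerd         # wipes all images and volumes
sudo dnf install docker-ce docker-ce-cli containerd.io
sudo systemctl enable --now docker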

diff --git a/docker-compose.yml b/docker-compose.yml
index 85f9ee5..73eb1d6 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -72,7 +72,7 @@ services:
       - OTEL_LOGS_EXPORTER=otlp
       - OTEL_SERVICE_NAME=ad
       # Workaround on OSX for https://bugs.openjdk.org/browse/JDK-8345296
-      - _JAVA_OPTIONS
+      - _JAVA_OPTIONS=-Dotel.instrumentation.methods.include=oteldemo.AdService[main,getInstance,getAdsByCategory,getRandomAds,createAdsMap,start,stop,blockUntilShutdown];oteldemo.AdService.AdServiceImpl[getAds];oteldemo.problempattern.CPULoad[getInstance,execute,spawnLoadWorkers,stopWorkers]
     depends_on:
       otel-collector:
         condition: service_started
diff --git a/src/otel-collector/otelcol-config-extras.yml b/src/otel-collector/otelcol-config-extras.yml
index 73dd011..15b2721 100644
--- a/src/otel-collector/otelcol-config-extras.yml
+++ b/src/otel-collector/otelcol-config-extras.yml
@@ -8,11 +8,13 @@
 ## Note: the spanmetrics exporter must be included in the exporters array
 ## if overriding the traces pipeline.
 ##
-#  exporters:
-#    otlphttp/example:
-#      endpoint: <your-endpoint-url>
-#
-#  service:
-#    pipelines:
-#      traces:
-#        exporters: [spanmetrics, otlphttp/example]
+exporters:
+  otlp/example:
+    endpoint: http://172.19.233.3:9000
+    tls:
+      insecure: true
+
+service:
+  pipelines:
+    traces:
+      exporters: [spanmetrics, otlp/example]
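
Since the hardcoded endpoint IP is the most fragile part of this setup, it may help to verify that the otel-transformer port is reachable from inside Docker before running make start. A minimal sketch (172.19.233.3 stands in for whatever host IP you put into otelcol-config-extras.yml; curlimages/curl is just a convenient throwaway image):

hostname -I | awk '{print $1}'   # host IP to use in otelcol-config-extras.yml
ss -tlnp | grep 9000             # the otel-transformer should be listening here
docker run --rm curlimages/curl --max-time 3 -v telnet://172.19.233.3:9000

If the last command prints "Connected to 172.19.233.3", the collector container should be able to deliver spans to the listener as well.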
