Started by an SCM change Running as SYSTEM [EnvInject] - Loading node environment variables. Building remotely on docker-004bh006rxiou on gcloud1 (bazel-chrome-latest bazel-debian mvn bazel-debian-chrome-latest docker) in workspace /home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6 Selected Git installation does not exist. Using Default Wiping out workspace first. Cloning the remote Git repository Avoid fetching tags Cloning repository https://gerrit.googlesource.com/a/gerrit > /usr/bin/git init /home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6 # timeout=10 Fetching upstream changes from https://gerrit.googlesource.com/a/gerrit > /usr/bin/git --version # timeout=10 using GIT_ASKPASS to set credentials .netrc credentials for gerrit.googlesource.com > /usr/bin/git fetch --no-tags --force --progress -- https://gerrit.googlesource.com/a/gerrit +refs/heads/*:refs/remotes/gerrit/* # timeout=10 > /usr/bin/git config remote.gerrit.url https://gerrit.googlesource.com/a/gerrit # timeout=10 > /usr/bin/git config --add remote.gerrit.fetch +refs/heads/*:refs/remotes/gerrit/* # timeout=10 > /usr/bin/git config remote.gerrit.url https://gerrit.googlesource.com/a/gerrit # timeout=10 Fetching upstream changes from https://gerrit.googlesource.com/a/gerrit using GIT_ASKPASS to set credentials .netrc credentials for gerrit.googlesource.com > /usr/bin/git fetch --no-tags --force --progress -- https://gerrit.googlesource.com/a/gerrit +refs/heads/*:refs/remotes/gerrit/* # timeout=10 > /usr/bin/git config remote.origin.url https://gerrit.googlesource.com/a/plugins/events-kafka # timeout=10 Fetching upstream changes from https://gerrit.googlesource.com/a/plugins/events-kafka using GIT_ASKPASS to set credentials .netrc credentials for gerrit.googlesource.com > /usr/bin/git fetch --no-tags --force --progress -- https://gerrit.googlesource.com/a/plugins/events-kafka +refs/heads/*:refs/remotes/origin/* # timeout=10 > /usr/bin/git rev-parse origin/stable-3.6^{commit} # timeout=10 > /usr/bin/git rev-parse refs/remotes/gerrit/origin/stable-3.6^{commit} # timeout=10 > /usr/bin/git rev-parse refs/remotes/origin/stable-3.6^{commit} # timeout=10 > /usr/bin/git rev-parse refs/remotes/origin/origin/stable-3.6^{commit} # timeout=10 Checking out Revision 9d7e4bc05b2886b2f590263a755a229eb3cd1562 (origin/stable-3.6) > /usr/bin/git config core.sparsecheckout # timeout=10 > /usr/bin/git checkout -f 9d7e4bc05b2886b2f590263a755a229eb3cd1562 # timeout=10 Commit message: "Log publishing of stream events in message_log file" > /usr/bin/git rev-list --no-walk d626973cafb58bc579fe5d9fcfac36f0767446e7 # timeout=10 [plugin-events-kafka-bazel-stable-3.6] $ /bin/bash -e /tmp/jenkins5795173198332208754.sh [plugin-events-kafka-bazel-stable-3.6] $ /bin/bash -e /tmp/jenkins2768039559958485835.sh Java set to: /usr/lib/jvm/java-11-openjdk-amd64/bin/java Previous HEAD position was 9d7e4bc05b Log publishing of stream events in message_log file Switched to a new branch 'stable-3.6' Branch 'stable-3.6' set up to track remote branch 'stable-3.6' from 'gerrit'. 
Submodule 'modules/jgit' (https://gerrit.googlesource.com/a/jgit) registered for path 'modules/jgit' Submodule 'plugins/codemirror-editor' (https://gerrit.googlesource.com/a/plugins/codemirror-editor) registered for path 'plugins/codemirror-editor' Submodule 'plugins/commit-message-length-validator' (https://gerrit.googlesource.com/a/plugins/commit-message-length-validator) registered for path 'plugins/commit-message-length-validator' Submodule 'plugins/delete-project' (https://gerrit.googlesource.com/a/plugins/delete-project) registered for path 'plugins/delete-project' Submodule 'plugins/download-commands' (https://gerrit.googlesource.com/a/plugins/download-commands) registered for path 'plugins/download-commands' Submodule 'plugins/gitiles' (https://gerrit.googlesource.com/a/plugins/gitiles) registered for path 'plugins/gitiles' Submodule 'plugins/hooks' (https://gerrit.googlesource.com/a/plugins/hooks) registered for path 'plugins/hooks' Submodule 'plugins/plugin-manager' (https://gerrit.googlesource.com/a/plugins/plugin-manager) registered for path 'plugins/plugin-manager' Submodule 'plugins/replication' (https://gerrit.googlesource.com/a/plugins/replication) registered for path 'plugins/replication' Submodule 'plugins/reviewnotes' (https://gerrit.googlesource.com/a/plugins/reviewnotes) registered for path 'plugins/reviewnotes' Submodule 'plugins/singleusergroup' (https://gerrit.googlesource.com/a/plugins/singleusergroup) registered for path 'plugins/singleusergroup' Submodule 'plugins/webhooks' (https://gerrit.googlesource.com/a/plugins/webhooks) registered for path 'plugins/webhooks' Submodule 'polymer-bridges' (https://gerrit.googlesource.com/a/polymer-bridges) registered for path 'polymer-bridges' Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/modules/jgit'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/codemirror-editor'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/commit-message-length-validator'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/delete-project'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/download-commands'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/gitiles'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/hooks'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/plugin-manager'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/replication'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/reviewnotes'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/singleusergroup'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/plugins/webhooks'... Cloning into '/home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/polymer-bridges'... 
Submodule path 'modules/jgit': checked out '82e277c813398c9f519f16e83d080a94fa29a27c' Submodule path 'plugins/codemirror-editor': checked out 'c5bda5b6b5fe91a2f7cd40c5a917dd2280b04814' Submodule path 'plugins/commit-message-length-validator': checked out 'c38e0a9d36767092b20558b28eff7f546c6d754c' Submodule path 'plugins/delete-project': checked out '5717badf4250dfe900c05fc00d0758a09ba77297' Submodule path 'plugins/download-commands': checked out 'b90e523f589a0e2902823233010163f453243926' Submodule path 'plugins/gitiles': checked out '24529d232268ac51fd6850770f70dc0fcd732dd8' Submodule path 'plugins/hooks': checked out '30073628612bce23826f4be71bfdd159da521cbc' Submodule path 'plugins/plugin-manager': checked out 'ba74d4969462c2592bcf97868dd76c33041d47b2' Submodule path 'plugins/replication': checked out '47ee3dab0dd96900e85662adf0d5f48a33d17733' Submodule path 'plugins/reviewnotes': checked out '10db2cf772989d031c6f3558010c51fe07cf9722' Submodule path 'plugins/singleusergroup': checked out '3239ce3a471f5aa9edd8f6f702bee655ea81f77d' Submodule path 'plugins/webhooks': checked out '1dc0a718839f8872a59c189da7243ee77a4fe782' Submodule path 'polymer-bridges': checked out '855f4781b702de120953a64da5c277ea4908deaa' From https://gerrit.googlesource.com/a/plugins/events-kafka * branch HEAD -> FETCH_HEAD ~/workspace ~/workspace/plugin-events-kafka-bazel-stable-3.6 Cloning into 'events-broker'... ~/workspace/plugin-events-kafka-bazel-stable-3.6 ~/workspace/plugin-events-kafka-bazel-stable-3.6/plugins ~/workspace/plugin-events-kafka-bazel-stable-3.6 ~/workspace/plugin-events-kafka-bazel-stable-3.6 openjdk full version "11.0.20+8-post-Debian-1deb11u1" 2023/12/21 13:40:16 Downloading https://releases.bazel.build/5.3.1/release/bazel-5.3.1-linux-x86_64... 2023/12/21 13:40:16 Skipping basic authentication for releases.bazel.build because no credentials found in /home/jenkins/.netrc Bazelisk version: v1.12.0 Extracting Bazel installation... Starting local Bazel server and connecting to it... 
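The checkout above (Gerrit core on stable-3.6 with its submodules, the events-kafka plugin, and the events-broker dependency, built with Java 11 and Bazel 5.3.1 via Bazelisk) can be reproduced roughly as sketched below. This is only an approximation: the authoritative steps live in the job's /tmp/jenkins*.sh scripts, which are not included in this log, and anonymous clones of gerrit.googlesource.com omit the /a/ prefix used by the authenticated CI fetches.

# Sketch of an equivalent local setup (assumptions: anonymous clone URLs, and
# plugin sources placed in-tree under plugins/, which is what the build target
# //plugins/events-kafka:events-kafka in the following lines expects).
git clone --branch stable-3.6 https://gerrit.googlesource.com/gerrit
cd gerrit
git submodule update --init        # jgit, core plugins, polymer-bridges, ...
git clone --branch stable-3.6 \
  https://gerrit.googlesource.com/plugins/events-kafka plugins/events-kafka
# The job also clones 'events-broker' (an events-kafka dependency); its clone
# URL and placement are handled by the CI scripts and are not shown in this log.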
Build label: 5.3.1 Build target: bazel-out/k8-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar Build time: Mon Sep 19 17:28:49 2022 (1663608529) Build timestamp: 1663608529 Build timestamp as int: 1663608529 INFO: Invocation ID: 21397459-8eb2-4aef-8081-cb657330ff64 INFO: Options provided by the client: Inherited 'common' options: --isatty=0 --terminal_columns=80 INFO: Reading rc options for 'build' from /home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/.bazelrc: 'build' options: --workspace_status_command=python3 ./tools/workspace_status.py --repository_cache=~/.gerritcodereview/bazel-cache/repository --action_env=PATH --disk_cache=~/.gerritcodereview/bazel-cache/cas --java_language_version=11 --java_runtime_version=remotejdk_11 --tool_java_language_version=11 --tool_java_runtime_version=remotejdk_11 --host_conlyopt=-std=c90 --incompatible_strict_action_env --announce_rc Loading: Loading: 0 packages loaded Loading: 0 packages loaded Loading: 0 packages loaded Loading: 0 packages loaded Loading: 0 packages loaded Analyzing: target //plugins/events-kafka:events-kafka (1 packages loaded, 0 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (34 packages loaded, 7 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (34 packages loaded, 7 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (34 packages loaded, 7 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (34 packages loaded, 7 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (34 packages loaded, 7 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (50 packages loaded, 466 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (56 packages loaded, 844 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (56 packages loaded, 844 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (57 packages loaded, 992 targets configured) Analyzing: target //plugins/events-kafka:events-kafka (57 packages loaded, 992 targets configured) INFO: Analyzed target //plugins/events-kafka:events-kafka (235 packages loaded, 5830 targets configured). INFO: Found 1 target... [0 / 37] [Prepa] Symlinking virtual headers for @remote_java_tools//:logging [139 / 544] Compiling src/google/protobuf/compiler/cpp/cpp_file.cc; 5s remote-cache, linux-sandbox ... (63 actions, 62 running) [211 / 544] Compiling src/google/protobuf/compiler/cpp/cpp_message.cc; 13s remote-cache, linux-sandbox ... (63 actions, 62 running) [288 / 544] Compiling src/google/protobuf/compiler/js/js_generator.cc; 14s remote-cache, linux-sandbox ... 
(63 actions, 62 running) INFO: From Compiling java_tools/src/tools/singlejar/combiners.cc: In file included from /usr/include/string.h:495, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h:24, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/transient_bytes.h:26, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/combiners.h:24, from external/remote_java_tools/java_tools/src/tools/singlejar/combiners.cc:15: In function 'void* memcpy(void*, const void*, size_t)', inlined from 'void LH::extra_fields(const uint8_t*, uint16_t)' at bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h:290:13, inlined from 'virtual void* Concatenator::OutputEntry(bool)' at external/remote_java_tools/java_tools/src/tools/singlejar/combiners.cc:84:21: /usr/include/x86_64-linux-gnu/bits/string_fortified.h:34:33: warning: writing 2 bytes into a region of size 0 [-Wstringop-overflow=] 34 | return __builtin___memcpy_chk (__dest, __src, __len, __bos0 (__dest)); | ~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In file included from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/transient_bytes.h:26, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/combiners.h:24, from external/remote_java_tools/java_tools/src/tools/singlejar/combiners.cc:15: bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h: In member function 'virtual void* Concatenator::OutputEntry(bool)': bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h:327:8: note: at offset 0 to object 'LH::file_name_' with size 0 declared here 327 | char file_name_[0]; | ^~~~~~~~~~ In file included from /usr/include/string.h:495, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h:24, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/transient_bytes.h:26, from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/combiners.h:24, from external/remote_java_tools/java_tools/src/tools/singlejar/combiners.cc:15: In function 'void* memcpy(void*, const void*, size_t)', inlined from 'void LH::extra_fields(const uint8_t*, uint16_t)' at bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h:290:13, inlined from 'virtual void* Concatenator::OutputEntry(bool)' at external/remote_java_tools/java_tools/src/tools/singlejar/combiners.cc:84:21: /usr/include/x86_64-linux-gnu/bits/string_fortified.h:34:33: warning: writing 2 bytes into a region of size 0 [-Wstringop-overflow=] 34 | return __builtin___memcpy_chk (__dest, __src, __len, __bos0 (__dest)); | ~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In file included from bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/transient_bytes.h:26, from 
bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/combiners.h:24, from external/remote_java_tools/java_tools/src/tools/singlejar/combiners.cc:15: bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h: In member function 'virtual void* Concatenator::OutputEntry(bool)': bazel-out/k8-opt-exec-2B5CBBC6/bin/external/remote_java_tools/_virtual_includes/combiners/src/tools/singlejar/zip_headers.h:327:8: note: at offset 0 to object 'LH::file_name_' with size 0 declared here 327 | char file_name_[0]; | ^~~~~~~~~~ [469 / 544] Compiling src/google/protobuf/descriptor.cc; 22s remote-cache, linux-sandbox ... (10 actions running) [494 / 544] Building external/jgit/org.eclipse.jgit/libjgit_non_stamped-class.jar (904 source files); 15s remote-cache, multiplex-worker ... (4 actions running) [496 / 544] Building external/jgit/org.eclipse.jgit/libjgit_non_stamped-class.jar (904 source files); 29s remote-cache, multiplex-worker [496 / 544] Building external/jgit/org.eclipse.jgit/libjgit_non_stamped-class.jar (904 source files); 45s remote-cache, multiplex-worker [523 / 544] Compiling Java headers java/com/google/gerrit/lucene/liblucene-hjar.jar (13 source files); 0s remote-cache, linux-sandbox ... (14 actions running) Target //plugins/events-kafka:events-kafka up-to-date: bazel-bin/plugins/events-kafka/events-kafka.jar INFO: Elapsed time: 157.364s, Critical Path: 89.27s INFO: 544 processes: 95 internal, 440 linux-sandbox, 9 worker. INFO: Build completed successfully, 544 total actions INFO: Build completed successfully, 544 total actions INFO: Invocation ID: 850a6cd6-287d-4698-8a35-40ebd007c0a1 INFO: Options provided by the client: Inherited 'common' options: --isatty=0 --terminal_columns=80 INFO: Reading rc options for 'test' from /home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/.bazelrc: Inherited 'build' options: --workspace_status_command=python3 ./tools/workspace_status.py --repository_cache=~/.gerritcodereview/bazel-cache/repository --action_env=PATH --disk_cache=~/.gerritcodereview/bazel-cache/cas --java_language_version=11 --java_runtime_version=remotejdk_11 --tool_java_language_version=11 --tool_java_runtime_version=remotejdk_11 --host_conlyopt=-std=c90 --incompatible_strict_action_env --announce_rc INFO: Reading rc options for 'test' from /home/jenkins/workspace/plugin-events-kafka-bazel-stable-3.6/.bazelrc: 'test' options: --build_tests_only --test_output=all Loading: Loading: 0 packages loaded INFO: Build option --test_env has changed, discarding analysis cache. Analyzing: 2 targets (1 packages loaded, 0 targets configured) Analyzing: 2 targets (7 packages loaded, 4588 targets configured) Analyzing: 2 targets (65 packages loaded, 6383 targets configured) INFO: Analyzed 2 targets (77 packages loaded, 6503 targets configured). INFO: Found 2 test targets... [0 / 1] [Prepa] BazelWorkspaceStatusAction stable-status.txt INFO: From Testing //tools/bzl:always_pass_test: ==================== Test output for //tools/bzl:always_pass_test: ================================================================================ [459 / 696] 1 / 2 tests; [Prepa] BazelWorkspaceStatusAction stable-status.txt ... (63 actions, 53 running) [463 / 696] 1 / 2 tests; Building proto/libentities_proto-speed.jar (1 source jar); 2s remote-cache, multiplex-worker ... 
(64 actions, 55 running) [555 / 704] 1 / 2 tests; Building java/com/google/gerrit/index/libquery_exception.jar (2 source files); 3s remote-cache ... (64 actions, 40 running) [557 / 704] 1 / 2 tests; Building java/com/google/gerrit/index/libquery_exception.jar (2 source files); 4s remote-cache ... (64 actions, 40 running) [563 / 704] 1 / 2 tests; Building java/com/google/gerrit/index/libquery_exception.jar (2 source files); 5s remote-cache ... (64 actions, 39 running) [566 / 704] 1 / 2 tests; Building java/com/google/gerrit/index/libquery_exception.jar (2 source files); 7s remote-cache ... (64 actions, 39 running) [572 / 704] 1 / 2 tests; Building java/com/google/gerrit/index/libquery_exception.jar (2 source files); 9s remote-cache ... (64 actions, 36 running) [572 / 704] 1 / 2 tests; Building java/com/google/gerrit/index/libquery_exception.jar (2 source files); 11s remote-cache ... (64 actions, 36 running) [580 / 704] 1 / 2 tests; Building proto/libcache_proto-speed.jar (1 source jar); 12s remote-cache, multiplex-worker ... (64 actions, 36 running) [588 / 704] 1 / 2 tests; Building proto/libcache_proto-speed.jar (1 source jar); 15s remote-cache, multiplex-worker ... (64 actions, 37 running) [602 / 704] 1 / 2 tests; Building java/com/google/gerrit/exceptions/libexceptions.jar (13 source files); 16s remote-cache ... (64 actions, 38 running) [611 / 704] 1 / 2 tests; Building java/com/google/gerrit/metrics/libmetrics.jar (33 source files) and running annotation processors (AutoAnnotationProcessor, AutoValueProcessor, AutoOneOfProcessor); 17s remote-cache ... (62 actions, 37 running) [632 / 704] 1 / 2 tests; Building external/com_google_protobuf/java/core/libcore.jar (36 source files, 1 source jar); 14s remote-cache, multiplex-worker ... (63 actions, 37 running) [648 / 704] 1 / 2 tests; Building external/com_google_protobuf/java/core/libcore.jar (36 source files, 1 source jar); 19s remote-cache, multiplex-worker ... (62 actions, 36 running) [661 / 704] 1 / 2 tests; Building java/com/google/gerrit/pgm/init/api/libapi.jar (14 source files); 15s remote-cache ... (62 actions, 37 running) [682 / 704] 1 / 2 tests; Building java/com/google/gerrit/jgit/libjgit.jar (1 source file); 14s remote-cache ... (28 actions, 19 running) [694 / 704] 1 / 2 tests; Building java/com/google/gerrit/server/restapi/librestapi.jar (303 source files) and running annotation processors (AutoAnnotationProcessor, AutoValueProcessor, AutoOneOfProcessor); 14s remote-cache, multiplex-worker ... (12 actions, 11 running) [701 / 704] 1 / 2 tests; Building java/com/google/gerrit/server/libserver-class.jar (1132 source files) and running annotation processors (AutoAnnotationProcessor, AutoValueProcessor, AutoOneOfProcessor); 19s remote-cache, multiplex-worker ... 
(4 actions running) [702 / 704] 1 / 2 tests; Building java/com/google/gerrit/server/libserver-class.jar (1132 source files) and running annotation processors (AutoAnnotationProcessor, AutoValueProcessor, AutoOneOfProcessor); 31s remote-cache, multiplex-worker [702 / 704] 1 / 2 tests; Building java/com/google/gerrit/server/libserver-class.jar (1132 source files) and running annotation processors (AutoAnnotationProcessor, AutoValueProcessor, AutoOneOfProcessor); 51s remote-cache, multiplex-worker [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 12s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 29s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 49s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 71s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 97s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 157s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 196s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 241s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 292s remote-cache, linux-sandbox [704 / 705] 1 / 2 tests; Testing //plugins/events-kafka:events_kafka_tests; 351s remote-cache, linux-sandbox INFO: From Testing //plugins/events-kafka:events_kafka_tests: ==================== Test output for //plugins/events-kafka:events_kafka_tests: JUnit4 Test Runner .log4j:WARN No appenders could be found for logger (org.eclipse.jgit.internal.storage.file.FileSnapshot). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. Auto-configured "receive.autogc = false" to disable auto-gc after git-receive-pack. 
Initialized /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit8839782130294921931 Reindexed 0 documents in accounts index in 0.0s (0.0/s) Index accounts in version 11 is ready Reindexing groups: 50% (1/2) Reindexing groups: 100% (2/2) Reindexing groups: 100% (2/2) Reindexed 2 documents in groups index in 0.4s (4.9/s) Index groups in version 8 is ready Reindexing changes: Slicing projects: 100% (2/2), done Reindexed 0 documents in changes index in 0.0s (0.0/s) Index changes in version 77 is ready Reindexing projects: 50% (1/2) Reindexing projects: 100% (2/2) Reindexing projects: 100% (2/2) Reindexed 2 documents in projects index in 0.1s (17.5/s) Index projects in version 4 is ready [2023-12-21T13:44:57.519Z] [pool-11-thread-1] INFO com.google.gerrit.server.git.SystemReaderInstaller : Set JGit's SystemReader to read system config from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit8839782130294921931/etc/jgit.config [2023-12-21T13:44:57.521Z] [pool-11-thread-1] INFO com.google.gerrit.server.git.LocalDiskRepositoryManager : Defaulting core.streamFileThreshold to 2047m [2023-12-21T13:44:57.713Z] [pool-11-thread-1] WARN com.google.gerrit.server.project.PeriodicProjectListCacheWarmer : project_list cache warmer is disabled [2023-12-21T13:44:57.716Z] [pool-11-thread-1] INFO com.google.gerrit.server.plugins.PluginLoader : Loading plugins from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit8839782130294921931/plugins [2023-12-21T13:44:57.737Z] [pool-11-thread-1] INFO com.google.gerrit.server.config.ScheduleConfig : No schedule configuration for "accountDeactivation". [2023-12-21T13:44:57.875Z] [pool-11-thread-1] INFO com.google.gerrit.pgm.Daemon : Gerrit Code Review [headless] (dev) ready Gerrit Server Started [2023-12-21T13:44:59.151Z] [main] WARN org.testcontainers.utility.TestcontainersConfiguration : Attempted to read Testcontainers configuration file at file:/home/jenkins/.testcontainers.properties but the file was not found. Exception message: FileNotFoundException: /home/jenkins/.testcontainers.properties (No such file or directory) [2023-12-21T13:44:59.165Z] [main] INFO org.testcontainers.dockerclient.DockerMachineClientProviderStrategy : docker-machine executable was not found on PATH ([., /home/jenkins/.cache/bazelisk/downloads/bazelbuild/bazel-5.3.1-linux-x86_64/bin, /usr/lib/jvm/java-11-openjdk-amd64/bin, /usr/lib/jvm/java-11-openjdk-amd64/jre/bin, /usr/local/sbin, /usr/local/bin, /usr/sbin, /usr/bin, /sbin, /bin]) [2023-12-21T13:44:59.843Z] [main] INFO org.testcontainers.dockerclient.DockerClientProviderStrategy : Found Docker environment with Environment variables, system properties and defaults. 
Resolved dockerHost=tcp://10.0.1.1:2375 [2023-12-21T13:44:59.846Z] [main] INFO org.testcontainers.DockerClientFactory : Docker host IP address is 10.0.1.1 [2023-12-21T13:44:59.895Z] [main] INFO org.testcontainers.DockerClientFactory : Connected to docker: Server Version: 20.10.18 API Version: 1.41 Operating System: Rocky Linux 8.6 (Green Obsidian) Total Memory: 128558 MB [2023-12-21T13:44:59.901Z] [main] INFO org.testcontainers.utility.ImageNameSubstitutor : Image name substitution will be performed by: DefaultImageNameSubstitutor (composite of 'ConfigurationFileImageNameSubstitutor' and 'PrefixingImageNameSubstitutor') [2023-12-21T13:44:59.951Z] [main] INFO org.testcontainers.utility.RegistryAuthLocator : Failure when attempting to lookup auth config. Please ignore if you don't have images in an authenticated registry. Details: (dockerImageName: testcontainers/ryuk:0.3.1, configFile: /home/jenkins/.docker/config.json. Falling back to docker-java default behaviour. Exception message: /home/jenkins/.docker/config.json (No such file or directory) [2023-12-21T13:45:00.971Z] [main] INFO org.testcontainers.DockerClientFactory : Ryuk started - will monitor and terminate Testcontainers containers on JVM exit [2023-12-21T13:45:00.972Z] [main] INFO org.testcontainers.DockerClientFactory : Checking the system... [2023-12-21T13:45:00.973Z] [main] INFO org.testcontainers.DockerClientFactory : ?? Docker server version should be at least 1.6.0 [2023-12-21T13:45:01.096Z] [main] INFO org.testcontainers.DockerClientFactory : ?? Docker environment should have more than 2GB free disk space [2023-12-21T13:45:01.236Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Creating container for image: confluentinc/cp-kafka:5.4.3 [2023-12-21T13:45:01.793Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Starting container with ID: f3f9c06cbaf313d53344c5679ea5d06bd862da2b7330061d45d01f3737f978c3 [2023-12-21T13:45:02.616Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 is starting: f3f9c06cbaf313d53344c5679ea5d06bd862da2b7330061d45d01f3737f978c3 [2023-12-21T13:45:12.057Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 started in PT10.960876S [2023-12-21T13:45:12.194Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connect to INTERNAL://tc-zGh5qjey:9094,PLAINTEXT://10.0.1.1:51923... 
[2023-12-21T13:45:12.226Z] [main] INFO org.apache.kafka.clients.producer.ProducerConfig : ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [INTERNAL://tc-zGh5qjey:9094, PLAINTEXT://10.0.1.1:51923] buffer.memory = 33554432 client.dns.lookup = default client.id = 3919b4be-f789-451b-afb9-69d58ca73726 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer [2023-12-21T13:45:12.355Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-zGh5qjey:9094 from bootstrap.servers as DNS resolution failed for tc-zGh5qjey [2023-12-21T13:45:12.424Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'key.deserializer' was supplied but isn't a known config. [2023-12-21T13:45:12.424Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'value.deserializer' was supplied but isn't a known config. [2023-12-21T13:45:12.424Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'group.id' was supplied but isn't a known config. [2023-12-21T13:45:12.424Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'polling.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:12.425Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'heartbeat.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:12.425Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'auto.offset.reset' was supplied but isn't a known config. 
[2023-12-21T13:45:12.430Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:45:12.430Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:45:12.433Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connection established. [2023-12-21T13:45:13.207Z] [kafka-producer-network-thread | 3919b4be-f789-451b-afb9-69d58ca73726] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=3919b4be-f789-451b-afb9-69d58ca73726] Error while fetching metadata with correlation id 1 : {a_topic=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:13.207Z] [kafka-producer-network-thread | 3919b4be-f789-451b-afb9-69d58ca73726] INFO org.apache.kafka.clients.Metadata : Cluster ID: aTwWs5K_T4-w93icBBYNhg [2023-12-21T13:45:13.348Z] [kafka-producer-network-thread | 3919b4be-f789-451b-afb9-69d58ca73726] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=3919b4be-f789-451b-afb9-69d58ca73726] Error while fetching metadata with correlation id 3 : {a_topic=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:13.472Z] [kafka-producer-network-thread | 3919b4be-f789-451b-afb9-69d58ca73726] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=3919b4be-f789-451b-afb9-69d58ca73726] Error while fetching metadata with correlation id 4 : {a_topic=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:13.639Z] [main] INFO com.googlesource.gerrit.plugins.kafka.subscribe.KafkaEventNativeSubscriber : Kafka consumer subscribing to topic alias [a_topic] for event topic [a_topic] with groupId [tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] [2023-12-21T13:45:13.660Z] [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig : ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [INTERNAL://tc-zGh5qjey:9094, PLAINTEXT://10.0.1.1:51923] check.crcs = true client.dns.lookup = default client.id = c829b34d-11bf-47e8-92d6-58713d419c4c connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750 heartbeat.interval.ms = 1000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, 
TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [2023-12-21T13:45:13.663Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-zGh5qjey:9094 from bootstrap.servers as DNS resolution failed for tc-zGh5qjey [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'acks' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'batch.size' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'polling.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'buffer.memory' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'key.serializer' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'retries' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'value.serializer' was supplied but isn't a known config. [2023-12-21T13:45:13.729Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'linger.ms' was supplied but isn't a known config. 
[2023-12-21T13:45:13.729Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:45:13.729Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:45:13.807Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.Metadata : Cluster ID: aTwWs5K_T4-w93icBBYNhg [2023-12-21T13:45:13.851Z] [kafka-producer-network-thread | 3919b4be-f789-451b-afb9-69d58ca73726] INFO message_log : PUBLISH a_topic {"type":"project-created","eventCreatedOn":1703166312,"instanceId":"test-instance-id"} [2023-12-21T13:45:14.063Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Discovered group coordinator 10.0.1.1:51923 (id: 2147483646 rack: null) [2023-12-21T13:45:14.069Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Revoking previously assigned partitions [] [2023-12-21T13:45:14.070Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] (Re-)joining group [2023-12-21T13:45:14.261Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Successfully joined group with generation 1 [2023-12-21T13:45:14.263Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Setting newly assigned partitions [a_topic-0] [2023-12-21T13:45:14.296Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = 
java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.Fetcher : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Resetting offset for partition a_topic-0 to offset 0. [2023-12-21T13:45:15.020Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.Fetcher : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Resetting offset for partition a_topic-0 to offset 0. [2023-12-21T13:45:15.069Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Disconnecting... [2023-12-21T13:45:15.069Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Closing Producer org.apache.kafka.clients.producer.KafkaProducer@57f83c82... [2023-12-21T13:45:15.070Z] [main] INFO org.apache.kafka.clients.producer.KafkaProducer : [Producer clientId=3919b4be-f789-451b-afb9-69d58ca73726] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. [2023-12-21T13:45:15.127Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Sending LeaveGroup request to coordinator 10.0.1.1:51923 (id: 2147483646 rack: null) [2023-12-21T13:45:15.253Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.FetchSessionHandler : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Error sending fetch request (sessionId=413361721, epoch=3) to node 1: org.apache.kafka.common.errors.DisconnectException. [2023-12-21T13:45:15.254Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@5a28390a[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@50d0527a[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@5e17a610]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=c829b34d-11bf-47e8-92d6-58713d419c4c, groupId=tc-e22f8f91-5e5f-40f1-bb23-db2bbfcc0750] Group coordinator 10.0.1.1:51923 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery Gerrit Server Shutdown .Auto-configured "receive.autogc = false" to disable auto-gc after git-receive-pack. 
Initialized /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit13762714665942173359 Reindexed 0 documents in accounts index in 0.0s (NaN/s) Index accounts in version 11 is ready Reindexing groups: 50% (1/2) Reindexing groups: 100% (2/2) Reindexing groups: 100% (2/2) Reindexed 2 documents in groups index in 0.1s (21.3/s) Index groups in version 8 is ready Reindexing changes: Slicing projects: 50% (1/2) Reindexing changes: Slicing projects: 100% (2/2) Reindexing changes: Slicing projects: 100% (2/2), done Reindexed 0 documents in changes index in 0.0s (0.0/s) Index changes in version 77 is ready Reindexing projects: 50% (1/2) Reindexing projects: 100% (2/2) Reindexing projects: 100% (2/2) Reindexed 2 documents in projects index in 0.1s (23.5/s) Index projects in version 4 is ready [2023-12-21T13:45:18.108Z] [pool-29-thread-1] INFO com.google.gerrit.server.git.SystemReaderInstaller : Set JGit's SystemReader to read system config from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit13762714665942173359/etc/jgit.config [2023-12-21T13:45:18.109Z] [pool-29-thread-1] INFO com.google.gerrit.server.git.LocalDiskRepositoryManager : Defaulting core.streamFileThreshold to 2047m [2023-12-21T13:45:18.283Z] [pool-29-thread-1] WARN com.google.gerrit.server.project.PeriodicProjectListCacheWarmer : project_list cache warmer is disabled [2023-12-21T13:45:18.284Z] [pool-29-thread-1] INFO com.google.gerrit.server.plugins.PluginLoader : Loading plugins from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit13762714665942173359/plugins [2023-12-21T13:45:18.286Z] [pool-29-thread-1] INFO com.google.gerrit.server.config.ScheduleConfig : No schedule configuration for "accountDeactivation". [2023-12-21T13:45:18.410Z] [pool-29-thread-1] INFO com.google.gerrit.pgm.Daemon : Gerrit Code Review [headless] (dev) ready Gerrit Server Started [2023-12-21T13:45:18.890Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Creating container for image: confluentinc/cp-kafka:5.4.3 [2023-12-21T13:45:19.732Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Starting container with ID: 65b3b61cd955ff7f70f14fa4e64c3431265b4eab7eefe5506e5a9ec6b93ed9d7 [2023-12-21T13:45:20.795Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 is starting: 65b3b61cd955ff7f70f14fa4e64c3431265b4eab7eefe5506e5a9ec6b93ed9d7 [2023-12-21T13:45:33.138Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 started in PT14.249173S [2023-12-21T13:45:33.202Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connect to INTERNAL://tc-BqV8pmtG:9094,PLAINTEXT://10.0.1.1:51926... 
[2023-12-21T13:45:33.203Z] [main] INFO org.apache.kafka.clients.producer.ProducerConfig : ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [INTERNAL://tc-BqV8pmtG:9094, PLAINTEXT://10.0.1.1:51926] buffer.memory = 33554432 client.dns.lookup = default client.id = 0957b99b-e305-4c7a-8552-f2b4a90b904b compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer [2023-12-21T13:45:33.230Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-BqV8pmtG:9094 from bootstrap.servers as DNS resolution failed for tc-BqV8pmtG [2023-12-21T13:45:33.235Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'key.deserializer' was supplied but isn't a known config. [2023-12-21T13:45:33.235Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'value.deserializer' was supplied but isn't a known config. [2023-12-21T13:45:33.235Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'group.id' was supplied but isn't a known config. [2023-12-21T13:45:33.235Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'send.stream.events' was supplied but isn't a known config. [2023-12-21T13:45:33.235Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'heartbeat.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:33.235Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'auto.offset.reset' was supplied but isn't a known config. 
[2023-12-21T13:45:33.236Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:45:33.236Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:45:33.238Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connection established. [2023-12-21T13:45:34.068Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b] Error while fetching metadata with correlation id 1 : {gerrit=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:34.068Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO org.apache.kafka.clients.Metadata : Cluster ID: 4lVAppNTRlqnKpeXMqQo9Q [2023-12-21T13:45:34.193Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b] Error while fetching metadata with correlation id 3 : {gerrit=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:34.318Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b] Error while fetching metadata with correlation id 4 : {gerrit=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:34.433Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b] Error while fetching metadata with correlation id 5 : {gerrit=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:34.561Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b] Error while fetching metadata with correlation id 6 : {gerrit=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:34.729Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO message_log : PUBLISH gerrit {"refUpdate":{"oldRev":"56a6051ca2b02b04ef92d5150c9ef600403cb1de","newRev":"b5045cc4046dbc1d7cafa4c603fd3cdf35dc5dde","refName":"refs/sequences/changes","project":"All-Projects"},"type":"ref-updated","eventCreatedOn":1703166333} [2023-12-21T13:45:35.336Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO message_log : PUBLISH gerrit {"submitter":{"name":"Administrator","email":"admin@example.com","username":"admin"},"refUpdate":{"oldRev":"0000000000000000000000000000000000000000","newRev":"64c1a11b8c8f4d0cc17312aa1fe83c514c2ad39f","refName":"refs/changes/01/1/1","project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project"},"type":"ref-updated","eventCreatedOn":1703166335} [2023-12-21T13:45:35.337Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO message_log : PUBLISH gerrit {"submitter":{"name":"Administrator","email":"admin@example.com","username":"admin"},"refUpdate":{"oldRev":"0000000000000000000000000000000000000000","newRev":"e769b053ad82f5ee8698c5f3bf0e5cfe592778b2","refName":"refs/changes/01/1/meta","project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project"},"type":"ref-updated","eventCreatedOn":1703166335} [2023-12-21T13:45:37.437Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO message_log : PUBLISH gerrit 
{"uploader":{"name":"Administrator","email":"admin@example.com","username":"admin"},"patchSet":{"number":1,"revision":"64c1a11b8c8f4d0cc17312aa1fe83c514c2ad39f","parents":["a0e6dc1c9fc8ccb3fe17340853f2ec9724df1aad"],"ref":"refs/changes/01/1/1","uploader":{"name":"Administrator","email":"admin@example.com","username":"admin"},"createdOn":1703166334,"author":{"name":"Administrator","email":"admin@example.com","username":"admin"},"kind":"REWORK","sizeInsertions":10,"sizeDeletions":0},"change":{"project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project","branch":"master","id":"I0000000000000000000000000000000000000001","number":1,"subject":"test commit","owner":{"name":"Administrator","email":"admin@example.com","username":"admin"},"url":"http://localhost:0/c/com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project/+/1","commitMessage":"test commit\n\nChange-Id: I0000000000000000000000000000000000000001\n","createdOn":1703166334,"status":"NEW"},"project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project","refName":"refs/heads/master","changeKey":{"id":"I0000000000000000000000000000000000000001"},"type":"patchset-created","eventCreatedOn":1703166337} [2023-12-21T13:45:37.643Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO message_log : PUBLISH gerrit {"submitter":{"name":"Administrator","email":"admin@example.com","username":"admin"},"refUpdate":{"oldRev":"e769b053ad82f5ee8698c5f3bf0e5cfe592778b2","newRev":"816f38a782592ddb52eed0a4f0dce1e53b341559","refName":"refs/changes/01/1/meta","project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project"},"type":"ref-updated","eventCreatedOn":1703166337} [2023-12-21T13:45:37.800Z] [kafka-producer-network-thread | 0957b99b-e305-4c7a-8552-f2b4a90b904b] INFO message_log : PUBLISH gerrit {"author":{"name":"Administrator","email":"admin@example.com","username":"admin"},"approvals":[{"type":"Code-Review","description":"Code-Review","value":"1","oldValue":"0"}],"comment":"Patch Set 1: Code-Review+1\n\nLGTM","patchSet":{"number":1,"revision":"64c1a11b8c8f4d0cc17312aa1fe83c514c2ad39f","parents":["a0e6dc1c9fc8ccb3fe17340853f2ec9724df1aad"],"ref":"refs/changes/01/1/1","uploader":{"name":"Administrator","email":"admin@example.com","username":"admin"},"createdOn":1703166334,"author":{"name":"Administrator","email":"admin@example.com","username":"admin"},"kind":"REWORK","sizeInsertions":10,"sizeDeletions":0},"change":{"project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project","branch":"master","id":"I0000000000000000000000000000000000000001","number":1,"subject":"test commit","owner":{"name":"Administrator","email":"admin@example.com","username":"admin"},"url":"http://localhost:0/c/com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project/+/1","commitMessage":"test commit\n\nChange-Id: I0000000000000000000000000000000000000001\n","createdOn":1703166334,"status":"NEW"},"project":"com.googlesource.gerrit.plugins.kafka.EventConsumerIT_consumeEvents_project","refName":"refs/heads/master","changeKey":{"id":"I0000000000000000000000000000000000000001"},"type":"comment-added","eventCreatedOn":1703166337} [2023-12-21T13:45:37.882Z] [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig : ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [INTERNAL://tc-BqV8pmtG:9094, PLAINTEXT://10.0.1.1:51926] check.crcs = true client.dns.lookup = default 
client.id = 0957b99b-e305-4c7a-8552-f2b4a90b904b connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tc-d7c4707c-b965-443e-89f4-a515bc7d7d64 heartbeat.interval.ms = 1000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [2023-12-21T13:45:37.884Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-BqV8pmtG:9094 from bootstrap.servers as DNS resolution failed for tc-BqV8pmtG [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'acks' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'batch.size' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'send.stream.events' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'buffer.memory' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'key.serializer' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'retries' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'value.serializer' was supplied but isn't a known config. 
[2023-12-21T13:45:37.890Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'linger.ms' was supplied but isn't a known config. [2023-12-21T13:45:37.890Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:45:37.890Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:45:37.901Z] [main] INFO org.apache.kafka.clients.Metadata : Cluster ID: 4lVAppNTRlqnKpeXMqQo9Q [2023-12-21T13:45:38.015Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] Discovered group coordinator 10.0.1.1:51926 (id: 2147483646 rack: null) [2023-12-21T13:45:38.016Z] [main] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] Revoking previously assigned partitions [] [2023-12-21T13:45:38.016Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] (Re-)joining group [2023-12-21T13:45:38.101Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] Successfully joined group with generation 1 [2023-12-21T13:45:38.101Z] [main] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] Setting newly assigned partitions [gerrit-0] [2023-12-21T13:45:38.120Z] [main] INFO org.apache.kafka.clients.consumer.internals.Fetcher : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] Resetting offset for partition gerrit-0 to offset 0. [2023-12-21T13:45:38.199Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b, groupId=tc-d7c4707c-b965-443e-89f4-a515bc7d7d64] Sending LeaveGroup request to coordinator 10.0.1.1:51926 (id: 2147483646 rack: null) [2023-12-21T13:45:38.228Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Disconnecting... [2023-12-21T13:45:38.228Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Closing Producer org.apache.kafka.clients.producer.KafkaProducer@5b4ce00... [2023-12-21T13:45:38.228Z] [main] INFO org.apache.kafka.clients.producer.KafkaProducer : [Producer clientId=0957b99b-e305-4c7a-8552-f2b4a90b904b] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. Gerrit Server Shutdown .Auto-configured "receive.autogc = false" to disable auto-gc after git-receive-pack. 
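
The run above is the first integration test completing a full round trip: the producer publishes the ref-updated, patchset-created and comment-added stream events to the gerrit topic, then a consumer in group tc-d7c4707c-… joins, is assigned partition gerrit-0, rewinds to offset 0, drains the topic and leaves the group before the session is closed. Below is a minimal standalone sketch of that consumer side; it mirrors the StringDeserializer and auto.offset.reset=earliest settings from the ConsumerConfig dump, while the bootstrap address, group id and the Gson-based type extraction are illustrative assumptions, not the plugin's actual code.

// Rough consumer-side sketch (assumed names and addresses, not the plugin's code).
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class GerritEventsConsumerSketch {
  public static void main(String[] args) {
    Properties props = new Properties();
    // Values mirror the ConsumerConfig dump above; the bootstrap address is a placeholder.
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "gerrit-events-reader");      // assumed group id
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");         // read from offset 0, as in the log
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("gerrit")); // topic name taken from the PUBLISH lines above
      ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
      for (ConsumerRecord<String, String> record : records) {
        // Each value is a Gerrit stream event serialized as JSON, e.g. type=ref-updated.
        JsonObject event = JsonParser.parseString(record.value()).getAsJsonObject();
        System.out.println(event.get("type").getAsString() + " -> " + record.value());
      }
    }
  }
}

Closing the consumer is also what produces the "Sending LeaveGroup request to coordinator" line seen in the log.
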
Initialized /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit10220881792244643459 Reindexed 0 documents in accounts index in 0.0s (NaN/s) Index accounts in version 11 is ready Reindexing groups: 50% (1/2) Reindexing groups: 100% (2/2) Reindexing groups: 100% (2/2) Reindexed 2 documents in groups index in 0.1s (30.8/s) Index groups in version 8 is ready Reindexing changes: Slicing projects: 50% (1/2) Reindexing changes: Slicing projects: 100% (2/2) Reindexing changes: Slicing projects: 100% (2/2), done Reindexed 0 documents in changes index in 0.0s (0.0/s) Index changes in version 77 is ready Reindexing projects: 50% (1/2) Reindexing projects: 100% (2/2) Reindexing projects: 100% (2/2) Reindexed 2 documents in projects index in 0.1s (31.7/s) Index projects in version 4 is ready [2023-12-21T13:45:40.368Z] [pool-46-thread-1] INFO com.google.gerrit.server.git.SystemReaderInstaller : Set JGit's SystemReader to read system config from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit10220881792244643459/etc/jgit.config [2023-12-21T13:45:40.368Z] [pool-46-thread-1] INFO com.google.gerrit.server.git.LocalDiskRepositoryManager : Defaulting core.streamFileThreshold to 2047m [2023-12-21T13:45:40.465Z] [pool-46-thread-1] WARN com.google.gerrit.server.project.PeriodicProjectListCacheWarmer : project_list cache warmer is disabled [2023-12-21T13:45:40.467Z] [pool-46-thread-1] INFO com.google.gerrit.server.plugins.PluginLoader : Loading plugins from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit10220881792244643459/plugins [2023-12-21T13:45:40.468Z] [pool-46-thread-1] INFO com.google.gerrit.server.config.ScheduleConfig : No schedule configuration for "accountDeactivation". [2023-12-21T13:45:40.598Z] [pool-46-thread-1] INFO com.google.gerrit.pgm.Daemon : Gerrit Code Review [headless] (dev) ready Gerrit Server Started [2023-12-21T13:45:40.822Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Creating container for image: confluentinc/cp-kafka:5.4.3 [2023-12-21T13:45:41.328Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Starting container with ID: 96b34515b2b0fc6f8a3bca22b924e23320b1f79af946b7841e2330d4564f5ab7 [2023-12-21T13:45:42.182Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 is starting: 96b34515b2b0fc6f8a3bca22b924e23320b1f79af946b7841e2330d4564f5ab7 [2023-12-21T13:45:49.152Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 started in PT8.330196S [2023-12-21T13:45:49.185Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connect to INTERNAL://tc-1TczxBkP:9094,PLAINTEXT://10.0.1.1:51929... 
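
The docker[confluentinc/cp-kafka:5.4.3] lines above show Testcontainers creating and starting a throwaway single-broker Kafka for this test, after which the plugin's KafkaSession connects to the advertised listeners. A minimal sketch of that broker setup, assuming the Testcontainers KafkaContainer API and nothing plugin-specific:

import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

public class KafkaContainerSketch {
  public static void main(String[] args) {
    // Start a disposable single-broker Kafka, as in the docker[...] lines above.
    try (KafkaContainer kafka =
        new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:5.4.3"))) {
      kafka.start();
      // Testcontainers advertises a PLAINTEXT listener on a mapped host port,
      // which is the address the plugin's KafkaSession connects to in the log.
      System.out.println("bootstrap.servers = " + kafka.getBootstrapServers());
    }
  }
}

The recurring "Couldn't resolve server INTERNAL://tc-…" warnings appear to be a harmless side effect of this setup: the advertised bootstrap list also contains the container-internal listener, whose hostname only resolves inside the Docker network, so the client falls back to the PLAINTEXT://10.0.1.1:… address.
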
[2023-12-21T13:45:49.186Z] [main] INFO org.apache.kafka.clients.producer.ProducerConfig : ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [INTERNAL://tc-1TczxBkP:9094, PLAINTEXT://10.0.1.1:51929] buffer.memory = 33554432 client.dns.lookup = default client.id = b182cdf2-35e7-4a07-ab16-5c1b2983e97c compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer [2023-12-21T13:45:49.201Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-1TczxBkP:9094 from bootstrap.servers as DNS resolution failed for tc-1TczxBkP [2023-12-21T13:45:49.204Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'key.deserializer' was supplied but isn't a known config. [2023-12-21T13:45:49.204Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'value.deserializer' was supplied but isn't a known config. [2023-12-21T13:45:49.204Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'group.id' was supplied but isn't a known config. [2023-12-21T13:45:49.204Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'polling.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:49.204Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'heartbeat.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:49.204Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'auto.offset.reset' was supplied but isn't a known config. 
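
This ProducerConfig dump is the publishing side of the session: acks=all, retries=0, linger.ms=1, String key/value serializers. The block of "was supplied but isn't a known config" warnings that follows suggests the plugin hands one shared property set to both the producer and the consumer, so each client warns about the other's keys (key.deserializer, group.id, …) and about plugin-level properties such as polling.interval.ms; the warnings are benign. A stripped-down sketch of a producer with the same essential settings, using a placeholder bootstrap address and a sample payload:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class GerritEventsProducerSketch {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder address
    props.put(ProducerConfig.ACKS_CONFIG, "all");                         // acks = all, as in the dump
    props.put(ProducerConfig.RETRIES_CONFIG, 0);                          // retries = 0
    props.put(ProducerConfig.LINGER_MS_CONFIG, 1);                        // linger.ms = 1
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

    String json = "{\"type\":\"project-created\",\"eventCreatedOn\":1703166349}"; // sample payload shape
    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      // Fire-and-forget send, mirroring a "PUBLISH <topic> <json>" line in message_log.
      producer.send(new ProducerRecord<>("a_topic", json));
      producer.flush();
    }
  }
}

producer.send() is asynchronous; the kafka-producer-network-thread entries in the log are that background sender at work.
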
[2023-12-21T13:45:49.204Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:45:49.204Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:45:49.204Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connection established. [2023-12-21T13:45:49.254Z] [kafka-producer-network-thread | b182cdf2-35e7-4a07-ab16-5c1b2983e97c] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=b182cdf2-35e7-4a07-ab16-5c1b2983e97c] Error while fetching metadata with correlation id 1 : {a_topic=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:49.254Z] [kafka-producer-network-thread | b182cdf2-35e7-4a07-ab16-5c1b2983e97c] INFO org.apache.kafka.clients.Metadata : Cluster ID: lPqUeOnyRNKvNVqdWt1LpQ [2023-12-21T13:45:49.373Z] [kafka-producer-network-thread | b182cdf2-35e7-4a07-ab16-5c1b2983e97c] WARN org.apache.kafka.clients.NetworkClient : [Producer clientId=b182cdf2-35e7-4a07-ab16-5c1b2983e97c] Error while fetching metadata with correlation id 3 : {a_topic=LEADER_NOT_AVAILABLE} [2023-12-21T13:45:49.484Z] [main] INFO com.googlesource.gerrit.plugins.kafka.subscribe.KafkaEventNativeSubscriber : Kafka consumer subscribing to topic alias [a_topic] for event topic [a_topic] with groupId [tc-1e9c9214-bdc5-4699-b585-509f2800bb1b] [2023-12-21T13:45:49.485Z] [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig : ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [INTERNAL://tc-1TczxBkP:9094, PLAINTEXT://10.0.1.1:51929] check.crcs = true client.dns.lookup = default client.id = af2de2af-bb54-487d-ad6f-ead634d34c82 connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = consumer-group-1 heartbeat.interval.ms = 1000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm 
= PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer [2023-12-21T13:45:49.486Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-1TczxBkP:9094 from bootstrap.servers as DNS resolution failed for tc-1TczxBkP [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'acks' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'batch.size' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'polling.interval.ms' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'buffer.memory' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'key.serializer' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'retries' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'value.serializer' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'linger.ms' was supplied but isn't a known config. [2023-12-21T13:45:49.488Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:45:49.489Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:45:49.496Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.Metadata : Cluster ID: lPqUeOnyRNKvNVqdWt1LpQ [2023-12-21T13:45:49.583Z] [kafka-producer-network-thread | b182cdf2-35e7-4a07-ab16-5c1b2983e97c] INFO message_log : PUBLISH a_topic {"type":"project-created","eventCreatedOn":1703166349,"instanceId":"test-instance-id-1"} [2023-12-21T13:45:49.608Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Discovered group coordinator 10.0.1.1:51929 (id: 2147483646 rack: null) [2023-12-21T13:45:49.609Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Revoking previously assigned partitions [] [2023-12-21T13:45:49.610Z] 
[kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] (Re-)joining group [2023-12-21T13:45:49.697Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Successfully joined group with generation 1 [2023-12-21T13:45:49.697Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Setting newly assigned partitions [a_topic-0] [2023-12-21T13:45:49.719Z] [kafka-subscriber-1[java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask@7f86c369[Not completed, task = java.util.concurrent.Executors$RunnableAdapter@757ea819[Wrapped task = com.google.gerrit.server.logging.LoggingContextAwareRunnable@7bb958c6]]]] INFO org.apache.kafka.clients.consumer.internals.Fetcher : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Resetting offset for partition a_topic-0 to offset 0. [2023-12-21T13:45:49.792Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Disconnecting... [2023-12-21T13:45:49.792Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Closing Producer org.apache.kafka.clients.producer.KafkaProducer@40477d52... [2023-12-21T13:45:49.792Z] [main] INFO org.apache.kafka.clients.producer.KafkaProducer : [Producer clientId=b182cdf2-35e7-4a07-ab16-5c1b2983e97c] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. [2023-12-21T13:45:49.867Z] [kafka-coordinator-heartbeat-thread | consumer-group-1] INFO org.apache.kafka.clients.FetchSessionHandler : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Error sending fetch request (sessionId=1314884491, epoch=1) to node 1: org.apache.kafka.common.errors.DisconnectException. [2023-12-21T13:45:49.867Z] [kafka-coordinator-heartbeat-thread | consumer-group-1] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=af2de2af-bb54-487d-ad6f-ead634d34c82, groupId=consumer-group-1] Group coordinator 10.0.1.1:51929 (id: 2147483646 rack: null) is unavailable or invalid, will attempt rediscovery Gerrit Server Shutdown .Auto-configured "receive.autogc = false" to disable auto-gc after git-receive-pack. 
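
In this second test the event published to a_topic is a small JSON document ({"type":"project-created", …, "instanceId":"test-instance-id-1"}) and the KafkaEventNativeSubscriber consumes it with ByteArrayDeserializer, so the subscriber has to decode and parse the bytes itself. A hedged sketch of just that decoding step with Gson; the POJO and its field names are hypothetical, chosen only to match the JSON shape in the message_log line:

import java.nio.charset.StandardCharsets;

import com.google.gson.Gson;

public class EventPayloadSketch {
  // Hypothetical holder matching the JSON fields seen in the message_log line above.
  static class ProjectCreatedEvent {
    String type;
    long eventCreatedOn;
    String instanceId;
  }

  public static void main(String[] args) {
    // A consumer configured with ByteArrayDeserializer hands records back as byte[].
    byte[] value =
        "{\"type\":\"project-created\",\"eventCreatedOn\":1703166349,\"instanceId\":\"test-instance-id-1\"}"
            .getBytes(StandardCharsets.UTF_8);

    ProjectCreatedEvent event =
        new Gson().fromJson(new String(value, StandardCharsets.UTF_8), ProjectCreatedEvent.class);
    System.out.println(event.type + " from " + event.instanceId); // project-created from test-instance-id-1
  }
}
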
Initialized /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit10516230945112370325 Reindexed 0 documents in accounts index in 0.0s (NaN/s) Index accounts in version 11 is ready Reindexing groups: 50% (1/2) Reindexing groups: 100% (2/2) Reindexing groups: 100% (2/2) Reindexed 2 documents in groups index in 0.1s (35.1/s) Index groups in version 8 is ready Reindexing changes: Slicing projects: 50% (1/2) Reindexing changes: Slicing projects: 100% (2/2) Reindexing changes: Slicing projects: 100% (2/2), done Reindexed 0 documents in changes index in 0.0s (0.0/s) Index changes in version 77 is ready Reindexing projects: 50% (1/2) Reindexing projects: 100% (2/2) Reindexing projects: 100% (2/2) Reindexed 2 documents in projects index in 0.1s (37.0/s) Index projects in version 4 is ready [2023-12-21T13:46:20.626Z] [pool-63-thread-1] INFO com.google.gerrit.server.git.SystemReaderInstaller : Set JGit's SystemReader to read system config from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit10516230945112370325/etc/jgit.config [2023-12-21T13:46:20.627Z] [pool-63-thread-1] INFO com.google.gerrit.server.git.LocalDiskRepositoryManager : Defaulting core.streamFileThreshold to 2047m [2023-12-21T13:46:20.707Z] [pool-63-thread-1] WARN com.google.gerrit.server.project.PeriodicProjectListCacheWarmer : project_list cache warmer is disabled [2023-12-21T13:46:20.708Z] [pool-63-thread-1] INFO com.google.gerrit.server.plugins.PluginLoader : Loading plugins from /home/jenkins/.cache/bazel/_bazel_jenkins/f23fb057377bb8b723aa4eea20581744/sandbox/linux-sandbox/610/execroot/gerrit/_tmp/5e0cb084303a4879d3a6f18963ce6255/junit7080557961914518267/junit10516230945112370325/plugins [2023-12-21T13:46:20.709Z] [pool-63-thread-1] INFO com.google.gerrit.server.config.ScheduleConfig : No schedule configuration for "accountDeactivation". [2023-12-21T13:46:20.820Z] [pool-63-thread-1] INFO com.google.gerrit.pgm.Daemon : Gerrit Code Review [headless] (dev) ready Gerrit Server Started [2023-12-21T13:46:21.014Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Creating container for image: confluentinc/cp-kafka:5.4.3 [2023-12-21T13:46:21.549Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Starting container with ID: fbf9729a8a4c91446f321b13cdc0a78f2e21ddfd856bc6280fd237ce20672710 [2023-12-21T13:46:22.339Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 is starting: fbf9729a8a4c91446f321b13cdc0a78f2e21ddfd856bc6280fd237ce20672710 [2023-12-21T13:46:29.332Z] [main] INFO docker[confluentinc/cp-kafka:5.4.3] : Container confluentinc/cp-kafka:5.4.3 started in PT8.318146S [2023-12-21T13:46:29.361Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connect to INTERNAL://tc-SJDTZtxo:9094,PLAINTEXT://10.0.1.1:51932... 
[2023-12-21T13:46:29.361Z] [main] INFO org.apache.kafka.clients.producer.ProducerConfig : ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [INTERNAL://tc-SJDTZtxo:9094, PLAINTEXT://10.0.1.1:51932] buffer.memory = 33554432 client.dns.lookup = default client.id = d94a8213-864a-4c7e-8df9-e6348ce42398 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer [2023-12-21T13:46:29.382Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-SJDTZtxo:9094 from bootstrap.servers as DNS resolution failed for tc-SJDTZtxo [2023-12-21T13:46:29.385Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'key.deserializer' was supplied but isn't a known config. [2023-12-21T13:46:29.385Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'value.deserializer' was supplied but isn't a known config. [2023-12-21T13:46:29.385Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'group.id' was supplied but isn't a known config. [2023-12-21T13:46:29.385Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'send.stream.events' was supplied but isn't a known config. [2023-12-21T13:46:29.385Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'heartbeat.interval.ms' was supplied but isn't a known config. [2023-12-21T13:46:29.385Z] [main] WARN org.apache.kafka.clients.producer.ProducerConfig : The configuration 'auto.offset.reset' was supplied but isn't a known config. 
[2023-12-21T13:46:29.385Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:46:29.385Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:46:29.386Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Connection established. [2023-12-21T13:46:30.011Z] [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig : ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [INTERNAL://tc-SJDTZtxo:9094, PLAINTEXT://10.0.1.1:51932] check.crcs = true client.dns.lookup = default client.id = d94a8213-864a-4c7e-8df9-e6348ce42398 connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = tc-bbe4526b-fa82-4a19-a276-6f00c747ed98 heartbeat.interval.ms = 1000 interceptor.classes = [] internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 5000 reconnect.backoff.ms = 5000 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer [2023-12-21T13:46:30.012Z] [main] WARN org.apache.kafka.clients.ClientUtils : Couldn't resolve server INTERNAL://tc-SJDTZtxo:9094 from bootstrap.servers as DNS resolution failed for tc-SJDTZtxo [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'acks' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'batch.size' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'send.stream.events' was supplied but isn't a known config. 
[2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'buffer.memory' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'key.serializer' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'retries' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'value.serializer' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] WARN org.apache.kafka.clients.consumer.ConsumerConfig : The configuration 'linger.ms' was supplied but isn't a known config. [2023-12-21T13:46:30.014Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka version : 2.1.1 [2023-12-21T13:46:30.014Z] [main] INFO org.apache.kafka.common.utils.AppInfoParser : Kafka commitId : 21234bee31165527 [2023-12-21T13:46:30.045Z] [main] WARN org.apache.kafka.clients.NetworkClient : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Error while fetching metadata with correlation id 2 : {gerrit=LEADER_NOT_AVAILABLE} [2023-12-21T13:46:30.046Z] [main] INFO org.apache.kafka.clients.Metadata : Cluster ID: 664hPiu8TDikkj8c6ICE4Q [2023-12-21T13:46:30.162Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Discovered group coordinator 10.0.1.1:51932 (id: 2147483646 rack: null) [2023-12-21T13:46:30.163Z] [main] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Revoking previously assigned partitions [] [2023-12-21T13:46:30.163Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] (Re-)joining group [2023-12-21T13:46:30.256Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Successfully joined group with generation 1 [2023-12-21T13:46:30.257Z] [main] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Setting newly assigned partitions [gerrit-0] [2023-12-21T13:46:30.280Z] [main] INFO org.apache.kafka.clients.consumer.internals.Fetcher : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Resetting offset for partition gerrit-0 to offset 0. [2023-12-21T13:46:40.021Z] [main] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator : [Consumer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398, groupId=tc-bbe4526b-fa82-4a19-a276-6f00c747ed98] Sending LeaveGroup request to coordinator 10.0.1.1:51932 (id: 2147483646 rack: null) [2023-12-21T13:46:40.036Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Disconnecting... [2023-12-21T13:46:40.036Z] [main] INFO com.googlesource.gerrit.plugins.kafka.session.KafkaSession : Closing Producer org.apache.kafka.clients.producer.KafkaProducer@136480b... 
[2023-12-21T13:46:40.036Z] [main] INFO org.apache.kafka.clients.producer.KafkaProducer : [Producer clientId=d94a8213-864a-4c7e-8df9-e6348ce42398] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
Gerrit Server Shutdown
II..................I...........................
Time: 355.201
OK (49 tests)
BazelTestRunner exiting with a return value of 0
JVM shutdown hooks (if any) will run now. The JVM will exit once they complete.
-- JVM shutdown starting at 2023-12-21 13:50:42 --
================================================================================
INFO: Elapsed time: 458.193s, Critical Path: 428.75s
INFO: 168 processes: 7 internal, 76 linux-sandbox, 85 worker.
INFO: Build completed successfully, 168 total actions
//plugins/events-kafka:events_kafka_tests PASSED in 356.6s
//tools/bzl:always_pass_test PASSED in 0.1s
Executed 2 out of 2 tests: 2 tests pass.
There were tests whose specified size is too big. Use the --test_verbose_timeout_warnings command line option to see which ones these are.
INFO: Build completed successfully, 168 total actions
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
Dec 21, 2023 1:50:44 PM java.util.jar.Attributes read
WARNING: Duplicate name in Manifest: Implementation-Version. Ensure that the manifest does not have duplicate entries, and that blank lines separate individual sections in both your manifest and in the META-INF/MANIFEST.MF entry in the jar file.
[plugin-events-kafka-bazel-stable-3.6] $ /bin/bash -e /tmp/jenkins7651997351352438064.sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  2291    0  2291    0     0  37557      0 --:--:-- --:--:-- --:--:-- 37557
Archiving artifacts
Finished: SUCCESS
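
Taken together, each passing run inside //plugins/events-kafka:events_kafka_tests follows the same shape: start a disposable Kafka broker, publish Gerrit stream events as JSON, consume them back, assert on what arrived, shut everything down. A rough, self-contained JUnit sketch of that round-trip pattern under the assumptions already noted (Testcontainers KafkaContainer, kafka-clients, JUnit 4); it illustrates the pattern only and is not the plugin's actual EventConsumerIT:

import static org.junit.Assert.assertEquals;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.Test;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

public class KafkaRoundTripSketchTest {

  @Test
  public void publishedEventComesBack() throws Exception {
    try (KafkaContainer kafka =
        new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:5.4.3"))) {
      kafka.start();

      String topic = "gerrit"; // topic name as it appears in the log
      String json = "{\"type\":\"ref-updated\",\"eventCreatedOn\":1703166333}"; // sample payload

      Properties producerProps = new Properties();
      producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka.getBootstrapServers());
      producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
      producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
      try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
        producer.send(new ProducerRecord<>(topic, json)).get(); // block until acked
      }

      Properties consumerProps = new Properties();
      consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka.getBootstrapServers());
      consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "round-trip-sketch"); // assumed group id
      consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
      consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
      consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
      try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
        consumer.subscribe(List.of(topic));
        // Single long poll keeps the sketch short; a real test would poll in a loop with a deadline.
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(30));
        assertEquals(1, records.count());
        assertEquals(json, records.iterator().next().value());
      }
    }
  }
}
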