
ArgumentError: wrong number of arguments (given 2, expected 0..1) Version 8.10.3 #15451

Closed
cethink opened this issue Oct 15, 2023 · 0 comments

cethink commented Oct 15, 2023

Logstash information:

Please include the following information:

  1. Logstash version (e.g. bin/logstash --version)
  2. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker)
  3. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes)
1. Logstash version: 8.10.3
2. Expanded from archive: logstash-8.10.3-darwin-x86_64.tar.gz
3. Run via command line: ./logstash -f ../logstash-8.10.3/config/logstash.conf

Plugins installed: (bin/logstash-plugin list --verbose)

logstash-codec-avro (3.4.0)
logstash-codec-cef (6.2.7)
logstash-codec-collectd (3.1.0)
logstash-codec-dots (3.0.6)
logstash-codec-edn (3.1.0)
logstash-codec-edn_lines (3.1.0)
logstash-codec-es_bulk (3.1.0)
logstash-codec-fluent (3.4.2)
logstash-codec-graphite (3.0.6)
logstash-codec-json (3.1.1)
logstash-codec-json_lines (3.1.0)
logstash-codec-line (3.1.1)
logstash-codec-msgpack (3.1.0)
logstash-codec-multiline (3.1.1)
logstash-codec-netflow (4.3.0)
logstash-codec-plain (3.1.0)
logstash-codec-rubydebug (3.1.0)
logstash-filter-aggregate (2.10.0)
logstash-filter-anonymize (3.0.7)
logstash-filter-cidr (3.1.3)
logstash-filter-clone (4.2.0)
logstash-filter-csv (3.1.1)
logstash-filter-date (3.1.15)
logstash-filter-de_dot (1.0.4)
logstash-filter-dissect (1.2.5)
logstash-filter-dns (3.2.0)
logstash-filter-drop (3.0.5)
logstash-filter-elasticsearch (3.15.3)
logstash-filter-fingerprint (3.4.3)
logstash-filter-geoip (7.2.13)
logstash-filter-grok (4.4.3)
logstash-filter-http (1.4.3)
logstash-filter-json (3.2.0)
logstash-filter-kv (4.7.0)
logstash-filter-memcached (1.2.0)
logstash-filter-metrics (4.0.7)
logstash-filter-mutate (3.5.7)
logstash-filter-prune (3.0.4)
logstash-filter-ruby (3.1.8)
logstash-filter-sleep (3.0.7)
logstash-filter-split (3.1.8)
logstash-filter-syslog_pri (3.2.0)
logstash-filter-throttle (4.0.4)
logstash-filter-translate (3.4.2)
logstash-filter-truncate (1.0.6)
logstash-filter-urldecode (3.0.6)
logstash-filter-useragent (3.3.5)
logstash-filter-uuid (3.0.5)
logstash-filter-xml (4.2.0)
logstash-input-azure_event_hubs (1.4.5)
logstash-input-beats (6.6.4)
└── logstash-input-elastic_agent (alias)
logstash-input-couchdb_changes (3.1.6)
logstash-input-dead_letter_queue (2.0.0)
logstash-input-elastic_serverless_forwarder (0.1.3)
logstash-input-elasticsearch (4.17.2)
logstash-input-exec (3.6.0)
logstash-input-file (4.4.4)
logstash-input-ganglia (3.1.4)
logstash-input-gelf (3.3.2)
logstash-input-generator (3.1.0)
logstash-input-graphite (3.0.6)
logstash-input-heartbeat (3.1.1)
logstash-input-http (3.7.2)
logstash-input-http_poller (5.4.0)
logstash-input-imap (3.2.0)
logstash-input-jms (3.2.2)
logstash-input-pipe (3.1.0)
logstash-input-redis (3.7.0)
logstash-input-snmp (1.3.3)
logstash-input-snmptrap (3.1.0)
logstash-input-stdin (3.4.0)
logstash-input-syslog (3.6.0)
logstash-input-tcp (6.4.0)
logstash-input-twitter (4.1.0)
logstash-input-udp (3.5.0)
logstash-input-unix (3.1.2)
logstash-integration-aws (7.1.6)
 ├── logstash-codec-cloudfront
 ├── logstash-codec-cloudtrail
 ├── logstash-input-cloudwatch
 ├── logstash-input-s3
 ├── logstash-input-sqs
 ├── logstash-output-cloudwatch
 ├── logstash-output-s3
 ├── logstash-output-sns
 └── logstash-output-sqs
logstash-integration-elastic_enterprise_search (2.2.1)
 ├── logstash-output-elastic_app_search
 └── logstash-output-elastic_workplace_search
logstash-integration-jdbc (5.4.5)
 ├── logstash-input-jdbc
 ├── logstash-filter-jdbc_streaming
 └── logstash-filter-jdbc_static
logstash-integration-kafka (11.3.1)
 ├── logstash-input-kafka
 └── logstash-output-kafka
logstash-integration-rabbitmq (7.3.3)
 ├── logstash-input-rabbitmq
 └── logstash-output-rabbitmq
logstash-output-csv (3.0.9)
logstash-output-elasticsearch (11.16.0)
logstash-output-email (4.1.2)
logstash-output-file (4.3.0)
logstash-output-graphite (3.1.6)
logstash-output-http (5.5.0)
logstash-output-lumberjack (3.1.9)
logstash-output-nagios (3.0.6)
logstash-output-null (3.0.5)
logstash-output-pipe (3.0.6)
logstash-output-redis (5.0.0)
logstash-output-stdout (3.1.4)
logstash-output-tcp (6.1.2)
logstash-output-udp (3.2.0)
logstash-output-webhdfs (3.0.6)
logstash-patterns-core (4.3.4)

JVM (e.g. java -version):

If the affected version of Logstash is 7.9 (or earlier), or if it is NOT using the bundled JDK or using the 'no-jdk' version in 7.10 (or higher), please provide the following information:

  1. JVM version (java -version)
  2. JVM installation source (e.g. from the Operating System's package manager, from source, etc).
  3. Value of the LS_JAVA_HOME environment variable if set.
Using bundled JDK: /Users/octo/Microservice/ELKLogs/logstash-8.10.3/jdk.app/Contents/Home

OS version (uname -a if on a Unix-like system):

Darwin macos 23.0.0 Darwin Kernel Version 23.0.0: Fri Sep 15 14:42:42 PDT 2023; root:xnu-10002.1.13~1/RELEASE_X86_64 x86_64

macOS Sonoma 14.0

Description of the problem including expected versus actual behavior:

Steps to reproduce:

Please include a minimal but complete recreation of the problem,
including (e.g.) pipeline definition(s), settings, locale, etc. The easier
you make it for us to reproduce the problem, the more likely it is that
somebody will take the time to look at it.

1. Run the ELK stack successfully.
2. Close all terminal windows directly, without stopping Logstash via Ctrl+C.
3. Rerun Logstash; the error occurs on startup.
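Since the terminals were closed without Ctrl+C, a previous Logstash java process may never have shut down and could still hold the tcp input's port. A hypothetical diagnostic (not part of the report) is to try binding the same port before restarting:

```ruby
require "socket"

# Hypothetical diagnostic: if the previous Logstash was never stopped
# cleanly, its java process may still hold the tcp input's listen port
# (9902 in this pipeline). Attempting to bind the same port reveals
# whether a stale listener is still around.
begin
  probe = TCPServer.new("0.0.0.0", 9902)
  probe.close
  puts "port 9902 is free"
rescue Errno::EADDRINUSE
  puts "port 9902 is still in use -- a previous Logstash may still be running"
end
```

If the port is held, finding and stopping the stale process (e.g. with `pgrep -f logstash`) before restarting would rule out a port conflict as the trigger.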

Provide logs (if relevant):

➜  bin ./logstash -f ../logstash-8.10.3/config/logstash.conf
Using bundled JDK: /Users/octo/Microservice/ELKLogs/logstash-8.10.3/jdk.app/Contents/Home
Sending Logstash logs to /Users/octo/Microservice/ELKLogs/logstash-8.10.3/logs which is now configured via log4j2.properties
[2023-10-15T13:01:12,621][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/octo/Microservice/ELKLogs/logstash-8.10.3/config/log4j2.properties
[2023-10-15T13:01:12,632][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.10.3", "jruby.version"=>"jruby 9.4.2.0 (3.1.0) 2023-03-08 90d2913fda OpenJDK 64-Bit Server VM 17.0.8+7 on 17.0.8+7 +indy +jit [x86_64-darwin]"}
[2023-10-15T13:01:12,637][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-10-15T13:01:12,706][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-10-15T13:01:13,786][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9601, :ssl_enabled=>false}
[2023-10-15T13:01:14,295][INFO ][org.reflections.Reflections] Reflections took 218 ms to scan 1 urls, producing 132 keys and 464 values
[2023-10-15T13:01:14,747][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-10-15T13:01:14,767][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://localhost:9200"]}
[2023-10-15T13:01:14,964][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@localhost:9200/]}}
[2023-10-15T13:01:15,348][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@localhost:9200/"}
[2023-10-15T13:01:15,360][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.10.3) {:es_version=>8}
[2023-10-15T13:01:15,361][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-10-15T13:01:15,472][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"elklogtest-%{+yyyy.MM.dd}"}
[2023-10-15T13:01:15,472][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2023-10-15T13:01:15,491][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2023-10-15T13:01:15,515][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/octo/Microservice/ELKLogs/logstash-8.10.3/config/logstash.conf"], :thread=>"#<Thread:0x4f70bd31 /Users/octo/Microservice/ELKLogs/logstash-8.10.3/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-10-15T13:01:16,499][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.98}
[2023-10-15T13:01:16,722][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-10-15T13:01:16,725][INFO ][logstash.inputs.tcp      ][main][1a9d65e53f979f52a527309dfbf80c70c486beee114c3d901f2c30b6b25f1f68] Starting tcp input listener {:address=>"0.0.0.0:9902", :ssl_enabled=>false}
[2023-10-15T13:01:16,745][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
warning: thread "[main]<tcp" terminated with exception (report_on_exception is true):
ArgumentError: wrong number of arguments (given 2, expected 0..1)
    translate at /Users/octo/Microservice/ELKLogs/logstash-8.10.3/vendor/bundle/jruby/3.1.0/gems/i18n-1.14.1/lib/i18n.rb:210
  inputworker at /Users/octo/Microservice/ELKLogs/logstash-8.10.3/logstash-core/lib/logstash/java_pipeline.rb:427
  start_input at /Users/octo/Microservice/ELKLogs/logstash-8.10.3/logstash-core/lib/logstash/java_pipeline.rb:405
[2023-10-15T13:01:21,134][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<ArgumentError: wrong number of arguments (given 2, expected 0..1)>, :backtrace=>["/Users/octo/Microservice/ELKLogs/logstash-8.10.3/vendor/bundle/jruby/3.1.0/gems/i18n-1.14.1/lib/i18n.rb:210:in `translate'", "/Users/octo/Microservice/ELKLogs/logstash-8.10.3/logstash-core/lib/logstash/java_pipeline.rb:427:in `inputworker'", "/Users/octo/Microservice/ELKLogs/logstash-8.10.3/logstash-core/lib/logstash/java_pipeline.rb:405:in `block in start_input'"]}
[2023-10-15T13:01:21,312][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2023-10-15T13:01:21,315][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:795) ~[jruby.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:758) ~[jruby.jar:?]
	at Users.octo.Microservice.ELKLogs.logstash_minus_8_dot_10_dot_3.lib.bootstrap.environment.<main>(/Users/octo/Microservice/ELKLogs/logstash-8.10.3/lib/bootstrap/environment.rb:90) ~[?:?]
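The backtrace shows i18n-1.14.1's `translate` (i18n.rb:210) being invoked from `java_pipeline.rb`'s `inputworker` error path with two arguments. This pattern is consistent with Ruby 3's keyword-argument separation (JRuby 9.4 targets Ruby 3.1): a legacy call that passes interpolation values as a positional Hash no longer matches a `**options` signature. A minimal sketch with a hypothetical method (not the actual Logstash or i18n code) reproduces the exact message:

```ruby
# Hypothetical sketch of Ruby 3 keyword-argument separation producing
# "wrong number of arguments (given 2, expected 0..1)". i18n 1.x defines
# translate roughly as `translate(key = nil, **options)`; under Ruby 3,
# a Hash passed positionally is no longer converted to keywords.
def translate(key = nil, **options)
  "translated #{key} with #{options.inspect}"
end

puts translate(:greeting, name: "world")   # keyword arguments: works

opts = { name: "world" }
begin
  translate(:greeting, opts)               # positional Hash: fails on Ruby 3
rescue ArgumentError => e
  puts e.message   # wrong number of arguments (given 2, expected 0..1)
end
```

Under this reading, the i18n error masks the original exception raised inside the tcp input worker (such as a bind failure on port 9902), because the rescue path that tries to format the error message is itself crashing.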
# logstash.conf
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  tcp {
    port => "9902"
    type => "nlog"
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "elklogtest-%{+yyyy.MM.dd}"
    document_id => "%{@timestamp}"
    ssl_certificate_authorities => "/elasticsearch-8.10.3/config/certs/http_ca.crt"
    user => "elastic"
    password => "xxxxxxxxx"
  }
}