tcp_Client #189
The current TCP client count is not exposed via the HTTP statistics channels. The TCP high-water is available via http://localhost:8053/json/v1/server:

```json
{
  "nsstats": {
    "TCPConnHighWater": 32,
```

There are also socket stats available via http://localhost:8053/json/v1/net:

```json
{
  "json-stats-version": "1.8",
  "boot-time": "2024-02-04T17:26:28.754Z",
  "config-time": "2024-02-04T17:26:28.846Z",
  "current-time": "2024-02-14T22:12:37.821Z",
  "version": "9.19.19-1-Debian",
  "sockstats": {
    "UDP4Open": 227854,
    "TCP4Open": 1703,
    "UDP4Close": 227846,
    "TCP4Close": 121809,
    "TCP4ConnFail": 1071,
    "UDP4Conn": 227846,
    "TCP4Conn": 622,
    "TCP4Accept": 120118,
    "UDP4Active": 10,
    "TCP4Active": 16
  }
}
```

In both cases, bind_exporter does not currently implement parsing of these data.
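As a side note (not something bind_exporter does), these counters can be read directly from the JSON statistics channel. The sketch below assumes the statistics channel is enabled and listening on localhost:8053, as in the URLs above; the key names match the sample output in this comment.

```python
# Minimal sketch: read TCP-related counters from BIND's JSON statistics
# channel. Assumes the channel is enabled on localhost:8053 (see the URLs
# above); only Python's standard library is used.
import json
from urllib.request import urlopen

BASE = "http://localhost:8053/json/v1"

with urlopen(f"{BASE}/server") as resp:
    server = json.load(resp)
with urlopen(f"{BASE}/net") as resp:
    net = json.load(resp)

# High-water mark of concurrent TCP connections since the server started.
print("TCPConnHighWater:", server["nsstats"].get("TCPConnHighWater"))
# Currently open TCP/UDP sockets; not identical to the "tcp clients" figure
# reported by `rndc status`, but the closest values the JSON channel exposes.
print("TCP4Active:", net["sockstats"].get("TCP4Active"))
print("UDP4Active:", net["sockstats"].get("UDP4Active"))
```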
So we cannot monitor and plot graphs of the current TCP client count unless it is exposed via the HTTP statistics channels?
I mean that there is currently no feasible way to obtain that data with bind_exporter, since the HTTP statistics channel is the sole method bind_exporter uses to fetch counters from BIND. There are other means at your disposal, such as writing your own textfile collector script that calls `rndc status` and parses its output.
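A hedged sketch of that workaround follows: it runs `rndc status`, parses the "tcp clients: current/limit" line, and writes a file for node_exporter's textfile collector. The metric names and the output path are illustrative choices, not anything bind_exporter provides.

```python
# Illustrative textfile collector: parse `rndc status` and expose the current
# TCP client count as a Prometheus gauge for node_exporter's textfile collector.
# The output path and metric names are assumptions, not bind_exporter features.
import re
import subprocess

OUT = "/var/lib/node_exporter/textfile/bind_tcp_clients.prom"  # assumed path

status = subprocess.run(
    ["rndc", "status"], capture_output=True, text=True, check=True
).stdout

match = re.search(r"tcp clients:\s*(\d+)/(\d+)", status)
if match:
    current, limit = int(match.group(1)), int(match.group(2))
    with open(OUT, "w") as f:
        f.write("# HELP bind_tcp_clients Current TCP clients (parsed from rndc status).\n")
        f.write("# TYPE bind_tcp_clients gauge\n")
        f.write(f"bind_tcp_clients {current}\n")
        f.write("# HELP bind_tcp_clients_limit Configured TCP client limit.\n")
        f.write("# TYPE bind_tcp_clients_limit gauge\n")
        f.write(f"bind_tcp_clients_limit {limit}\n")
```

Run it periodically (for example from cron) so the file stays fresh for scraping.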
nsstats":{ |
You can find detailed descriptions of name server statistics counters at https://bind9.readthedocs.io/en/latest/reference.html
|
I want to plot a graph of tcp clients from bind_exporter.
Is there any option in bind_exporter for this? If not, please mark this as a feature request.
This is the output of `rndc status`:

```
recursive clients: 4/900/1000
tcp clients: 3/150
TCP high-water: 6
server is up and running
```
Kindly suggest.