Hi,
I'm improving your library by adding support for gzipped (compressed / batch) writes to KairosDB. I've modified `write_metrics_list` to:

```python
def write_metrics_list(conn, metric_list, _gzip=False):
```
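In case it helps to see the direction I'm going in, here is a rough sketch of the gzip path (assumptions only: the `conn.write_url` attribute and the `application/gzip` content type are placeholders, not necessarily pyKairosDB's actual API):

```python
# Rough sketch of the gzip-aware batch write (not the final implementation).
# Assumes the metric list is posted as JSON with `requests`; `conn.write_url`
# is a placeholder for wherever the connection stores the datapoints endpoint.
import gzip
import io
import json

import requests


def write_metrics_list(conn, metric_list, _gzip=False):
    body = json.dumps(metric_list).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if _gzip:
        # Compress the JSON payload and label it as gzip for the server.
        buf = io.BytesIO()
        with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
            gz.write(body)
        body = buf.getvalue()
        headers["Content-Type"] = "application/gzip"
    return requests.post(conn.write_url, data=body, headers=headers)
```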
Then I wrote the following code inside `test_writer.py`:

```python
# Test batch write (compressed)
data = [
    {"name": "test1", "timestamp": 1349109376, "value": 20, "tags": {"host": "test"}},
    {"name": "test2", "timestamp": 1349109374, "value": 23, "tags": {"host": "test"}}
]
r = c.write_metrics_list(c, data, True)
```
I ran `python setup.py build` and `python setup.py install`, and then `python pyKairosDB/tests/test_writer.py`. But when I try to run the test, I get this error:
```
[paladini@starstuff tests]$ python test_writer.py
Traceback (most recent call last):
  File "test_writer.py", line 26, in <module>
    r = c.write_metrics_list(c, data, True)
AttributeError: 'KairosDBConnection' object has no attribute 'write_metrics_list'
```
Can you help me with that? After that, I can make a pull request :)