OTLP for HTTP #725

Merged 7 commits on Dec 12, 2024
172 changes: 170 additions & 2 deletions docs/shipping/Code/http.md
@@ -1,8 +1,8 @@
---
id: HTTP
title: HTTP
overview: Ship logs from your code directly to the Logz.io listener as a minified JavaScript Object Notation (JSON) file, a standard text-based format for representing structured data based on JavaScript object syntax.
product: ['logs']
overview: Ship data from your code directly to the Logz.io listener as a minified JavaScript Object Notation (JSON) file, a standard text-based format for representing structured data based on JavaScript object syntax.
product: ['logs', 'tracing']
os: ['windows', 'linux']
filters: ['Code', 'Most Popular']
logo: https://logzbucket.s3.eu-west-1.amazonaws.com/logz-docs/shipper-logos/json.svg
@@ -138,5 +138,173 @@ Allow some time for data ingestion, then open [Open Search Dashboards](https://a

Encounter an issue? See our [log shipping troubleshooting](https://docs.logz.io/docs/user-guide/log-management/troubleshooting/log-shipping-troubleshooting/) guide.

</TabItem>
<TabItem value="http-otlp" label="Protobuf via OpenTelemetry" default>

This guide provides step-by-step instructions for Logz.io users on how to send logs in Protobuf format to the OTLP listener. Follow these steps to set up your environment and send logs over the OTLP protocol using the `protocurl` tool.

## Download `protocurl`

`protocurl` is a tool based on curl and Protobuf, designed for working with Protobuf-encoded requests over HTTP. Follow the instructions on the [`protocurl` GitHub page](https://github.com/qaware/protocurl) to download and install the tool on your machine.

Once installed, verify the installation with:

```bash
protocurl --version
```

## Download OpenTelemetry Protobuf Definitions

Download the OpenTelemetry Protobuf definitions from the [OpenTelemetry-proto GitHub repository](https://github.com/open-telemetry/opentelemetry-proto/tree/main).

You need the `.proto` files to compile Protobuf messages and send logs. Download the repository to a local folder, for example:

```bash
git clone https://github.com/open-telemetry/opentelemetry-proto.git ~/Downloads/proto/opentelemetry-proto
```
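
To make sure the definitions are where the commands below expect them, you can check that the logs service definition is present in the cloned repository (the path reflects the repository layout at the time of writing):

```bash
# The ExportLogsServiceRequest/Response messages are defined in this directory.
ls ~/Downloads/proto/opentelemetry-proto/opentelemetry/proto/collector/logs/v1/
# Expect to see logs_service.proto in the listing.
```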

## Prepare the Command

Once `protocurl` is installed and the OpenTelemetry Protobuf files are downloaded, you can start sending logs.

Here is the basic structure of the command:

```bash
protocurl -v \
  -I ~/Downloads/proto/opentelemetry-proto \
  -i opentelemetry.proto.collector.logs.v1.ExportLogsServiceRequest \
  -o opentelemetry.proto.collector.logs.v1.ExportLogsServiceResponse \
  -u 'https://otlp-listener.logz.io/v1/logs' \
  -H 'Authorization: Bearer <Logzio-Token-Logs>' \
  -H 'user-agent: logzio-protobuf-logs' \
  -d @export_logs_request.json
```

Breakdown:

* `-I`: Points to the location of the OpenTelemetry Protobuf definitions.
* `-i`: Specifies the Protobuf request type (`ExportLogsServiceRequest`).
* `-o`: Specifies the Protobuf response type (`ExportLogsServiceResponse`).
* `-u`: URL of the Logz.io OTLP listener endpoint. Adjust the URL for your region using the Logz.io [region settings](https://docs.logz.io/docs/user-guide/admin/hosting-regions/account-region/).
* `-H`: Adds headers such as the authorization token and the user agent.
* `-d`: Specifies the JSON file containing the log data.

## Prepare the JSON Data

Create an `export_logs_request.json` file containing the log data to send to the OTLP listener. The required structure of the log request is as follows:


```json
{
  "resourceLogs": [
    {
      "resource": {
        "attributes": [
          {
            "key": "service.name",
            "value": { "stringValue": "example-service" }
          }
        ]
      },
      "scopeLogs": [
        {
          "scope": {
            "name": "example-scope"
          },
          "logRecords": [
            {
              "timeUnixNano": "<timestamp>",
              "severityNumber": "SEVERITY_NUMBER_INFO",
              "severityText": "INFO",
              "body": { "stringValue": "Log message here" }
            }
          ]
        }
      ]
    }
  ]
}
```

Key fields:

* `timeUnixNano`: A required field representing the timestamp in nanoseconds since the Unix epoch (e.g., `1727270794000000000`). This value must be set manually for each request; a small script can fill it in for you, as shown after this list.
* `severityNumber`: Log severity level (e.g., `SEVERITY_NUMBER_INFO`).
* `body`: The log message content.
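
If you prefer not to edit the timestamp by hand, a small shell snippet can fill it in. The sketch below is one way to do it, assuming GNU `date` (for nanosecond precision) and `jq` are installed and that the file is named `export_logs_request.json` as in the example above.

```bash
# Generate the current time in nanoseconds since the Unix epoch (requires GNU date).
TIMESTAMP=$(date +%s%N)

# Write the value into the first log record of export_logs_request.json using jq.
jq --arg ts "$TIMESTAMP" \
  '.resourceLogs[0].scopeLogs[0].logRecords[0].timeUnixNano = $ts' \
  export_logs_request.json > export_logs_request.tmp.json \
  && mv export_logs_request.tmp.json export_logs_request.json
```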

## Adjust the Region URL

You must adjust the OTLP listener URL based on your Logz.io account region. Find the correct endpoint for your region [here](https://docs.logz.io/docs/user-guide/admin/hosting-regions/account-region/).

For example, if you're in the US region, your endpoint might be:

```bash
-u 'https://otlp-listener.logz.io/v1/logs'
```

For the EU region:

```bash
-u 'https://otlp-listener-eu.logz.io/v1/logs'
```
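
If you switch between regions, one option is to keep the endpoint in an environment variable and pass it to `-u`; the variable name below is illustrative, not something `protocurl` requires:

```bash
# Illustrative variable; set it to the OTLP listener endpoint for your region.
export LOGZIO_OTLP_LISTENER='https://otlp-listener.logz.io/v1/logs'

# Reference it in the command instead of hard-coding the URL:
# protocurl ... -u "$LOGZIO_OTLP_LISTENER" ...
```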

## Full Command Example

```bash
protocurl -v \
  -I ~/Downloads/proto/opentelemetry-proto \
  -i opentelemetry.proto.collector.logs.v1.ExportLogsServiceRequest \
  -o opentelemetry.proto.collector.logs.v1.ExportLogsServiceResponse \
  -u 'https://otlp-listener.logz.io/v1/logs' \
  -H 'Authorization: Bearer <Your-Logzio-Token>' \
  -H 'user-agent: logzio-protobuf-logs' \
  -d @export_logs_request.json
```

## Sample Output in Console

When you run the command, you should see output like this in your console:

```bash
=========================== POST Request JSON =========================== >>>
{
  "resource_logs": [
    {
      "resource": {
        "attributes": [
          {
            "key": "service.name",
            "value": {"string_value": "example-service"}
          }
        ]
      },
      "scope_logs": [
        {
          "scope": {
            "name": "example-scope"
          },
          "log_records": [
            {
              "time_unix_nano": "1727065212000000000",
              "severity_number": "SEVERITY_NUMBER_INFO",
              "severity_text": "INFO",
              "body": {"string_value": "Log message here"}
            }
          ]
        }
      ]
    }
  ]
}
=========================== POST Request Binary =========================== >>>
00000000 0a 5c 0a 23 0a 21 0a 0c 73 65 72 76 69 63 65 2e |.\.#.!..service.|
00000010 6e 61 6d 65 12 11 0a 0f 65 78 61 6d 70 6c 65 2d |name....example-|
...
=========================== POST Response Headers =========================== <<<
HTTP/1.1 200 OK
```

If everything is set correctly, you should see an HTTP status code `200 OK`, indicating the logs were successfully sent.

</TabItem>
</Tabs>
2 changes: 1 addition & 1 deletion static/manifest.json

Large diffs are not rendered by default.
