DOC-558
nico-shishkin committed Nov 12, 2023
1 parent 713b1fd commit de8f8bf
Showing 27 changed files with 141 additions and 12 deletions.
4 changes: 4 additions & 0 deletions docs/shipping/Access-Management/okta.md
Original file line number Diff line number Diff line change
@@ -26,6 +26,10 @@ You can send logs from multiple Okta tenants and any Okta domain.
If you want to ship from multiple Okta tenants over the same Docker container, you'll need to use the latest configuration, which uses a tenants-credentials.yml file. Otherwise, you can continue using the previous configuration without a tenants-credentials.yml file.
:::

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-okta/)
:::

**Before you begin, you'll need**:

* Okta administrator privileges
3 changes: 3 additions & 0 deletions docs/shipping/Azure/azure-activity-logs.md
@@ -20,6 +20,9 @@ At the end of this process, you'll have configured an event hub namespace, an ev

The resources set up by the automated deployment can collect data for a single Azure region.

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-azure-serverless/)
:::

### Overview of the services you'll be setting up in your Azure account

4 changes: 4 additions & 0 deletions docs/shipping/Azure/azure-blob-trigger.md
@@ -28,6 +28,10 @@ The following resources are needed for this integration:

![Integration-architecture](https://dytvr9ot2sszz.cloudfront.net/logz-docs/azure_blob/blob-trigger-resources.png)

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-azure-blob-trigger/)
:::

### Supported data types

This Logz.io function supports the following data types:
4 changes: 4 additions & 0 deletions docs/shipping/Azure/azure-diagnostic-logs.md
@@ -21,6 +21,10 @@ At the end of this process, your Azure function will forward logs from an Azure

![Overview of Azure Diagnostic Logz.io integration](https://dytvr9ot2sszz.cloudfront.net/logz-docs/log-shipping/azure-diagnostic-logs-overview.png)

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-azure-serverless/)
:::

### Overview of the services you'll be setting up in your Azure account

The automated deployment sets up a new Event Hub namespace and all the components you'll need to collect logs in one Azure region.
4 changes: 4 additions & 0 deletions docs/shipping/Azure/azure-graph.md
@@ -24,6 +24,10 @@ Logzio-api-fetcher supports many API endpoints, including but not limited to:
* Azure Active Directory sign-in logs

There are many other APIs available through Microsoft Graph.

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-api-fetcher/)
:::


## Register a new app in Azure Active Directory
23 changes: 20 additions & 3 deletions docs/shipping/Code/dotnet.md
@@ -29,6 +29,10 @@ import TabItem from '@theme/TabItem';
* .NET Core SDK version 2.0 or higher
* .NET Framework version 4.6.1 or higher

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-dotnet/)
:::


#### Add the dependency to your project

@@ -385,6 +389,9 @@ namespace LogzioLog4NetSampleApplication
* .NET Core SDK version 2.0 or higher
* .NET Framework version 4.6.1 or higher

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-dotnet/)
:::

#### Add the dependency to your project

@@ -685,7 +692,9 @@ namespace LogzioNLogSampleApplication
* .NET Core SDK version 2.0 or higher
* .NET Framework version 4.6.1 or higher


:::note
[Project's GitHub repo](https://github.com/logzio/logzio-dotnet/)
:::

#### Add the dependency to your project

@@ -1027,6 +1036,9 @@ This integration is based on [Serilog.Sinks.Logz.Io repository](https://github.c
* .NET Core SDK version 2.0 or higher
* .NET Framework version 4.6.1 or higher

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-dotnet/)
:::

#### Install the Logz.io Serilog sink

@@ -1214,6 +1226,10 @@ Replace `<<TYPE>>` with the type that you want to assign to your logs. You will u

Helm is a tool for managing packages of pre-configured Kubernetes resources using Charts. This integration allows you to collect and ship diagnostic metrics of your .NET application in Kubernetes to Logz.io, using dotnet-monitor and OpenTelemetry. logzio-dotnet-monitor runs as a sidecar in the same pod as the .NET application.

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-helm/)
:::

###### Sending metrics from nodes with taints

If you want to ship metrics from any of the nodes that have a taint, make sure that the taint key values are listed in your daemonset/deployment configuration as follows:
@@ -1356,8 +1372,9 @@ These instructions show you how to:
* Add advanced settings to the basic custom metrics export configuration

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-app-metrics/)
:::

#### Send custom metrics to Logz.io with a hardcoded Logz.io exporter
9 changes: 9 additions & 0 deletions docs/shipping/Code/go.md
@@ -20,6 +20,10 @@ If your code is running inside Kubernetes the best practice will be to use our [

## Logs

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-go/)
:::

This shipper uses **goleveldb** and **goqueue** as the persistent storage implementation of its queue, so it backs up your logs to the local file system before sending them.
Logging is 100% non-blocking: logs are queued in the buffer, and a background Go routine ships them every 5 seconds.
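The persistent-queue idea described above can be sketched language-agnostically. The following Python sketch (illustrative only; logzio-go actually uses goleveldb and goqueue, and all names here are hypothetical) shows why a file-backed buffer makes logging crash-safe: every line hits disk before any network send.

```python
import os

class PersistentQueue:
    """Sketch of a file-backed log queue: each log line is written to disk
    before shipping, so buffered logs survive a process crash."""

    def __init__(self, path):
        self.path = path

    def enqueue(self, line):
        # Append to the backing file; the log call never blocks on the network.
        with open(self.path, "a") as f:
            f.write(line + "\n")

    def drain(self, ship):
        # Called periodically (logzio-go uses a 5-second interval) to ship
        # everything buffered so far, then truncate the backing file.
        if not os.path.exists(self.path):
            return 0
        with open(self.path) as f:
            lines = f.read().splitlines()
        if lines:
            ship(lines)
            open(self.path, "w").close()
        return len(lines)
```

In the real shipper, `Stop()` performs a final drain so no buffered logs are lost on shutdown.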
@@ -102,6 +106,11 @@ l.Stop() // Drains the log buffer
```

## Metrics

:::note
[Project's GitHub repo](https://github.com/logzio/go-metrics-sdk/)
:::

### Install the SDK

Run the following command:
15 changes: 14 additions & 1 deletion docs/shipping/Code/java.md
@@ -18,14 +18,19 @@ drop_filter: []
If your code runs within Kubernetes, it's best practice to use our Kubernetes integration to collect various telemetry types.
:::

## Logs


import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

<Tabs>
<TabItem value="Logzio-Log4j2-Appender" label="Logzio-Log4j2-Appender" default>

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-log4j2-appender/)
:::

The Logz.io Log4j 2 appender sends logs using non-blocking threading, bulks, and HTTPS encryption to port 8071.

This appender uses LogzioSender.
@@ -259,6 +264,10 @@ public class LogzioLog4j2Example {
</TabItem>
<TabItem value="Logzio-Logback-Appender" label="Logzio-Logback-Appender">

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-logback-appender/)
:::

Logback sends logs to your Logz.io account using non-blocking threading, bulks, and HTTPS encryption to port 8071.

This appender uses BigQueue implementation of persistent queue, so all logs are backed up to a local file system before being sent.
@@ -509,6 +518,10 @@ If the log appender does not ship logs, add `<inMemoryQueue>true</inMemoryQueue>
</Tabs>

## Metrics

:::note
[Project's GitHub repo](https://github.com/logzio/micrometer-registry-logzio/)
:::

### Usage

<Tabs>
13 changes: 13 additions & 0 deletions docs/shipping/Code/node-js.md
@@ -22,6 +22,10 @@ import TabItem from '@theme/TabItem';
<Tabs>
<TabItem value="logzio-nodejs" label="logzio-nodejs" default>

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-nodejs/)
:::

logzio-nodejs collects log messages in an array, which is sent asynchronously when it reaches its size limit or time limit (100 messages or 10 seconds), whichever comes first.
It also contains a simple retry mechanism that, upon connection reset or client timeout, retries sending the waiting bulk (after 2 seconds by default).
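The size-or-time flush policy can be sketched as follows (a minimal Python illustration with hypothetical names, not the actual logzio-nodejs code):

```python
import time

class LogBatcher:
    """Sketch of the flush policy described above: send when the buffer
    reaches size_limit messages or time_limit seconds have passed,
    whichever comes first."""

    def __init__(self, send, size_limit=100, time_limit=10.0):
        self.send = send
        self.size_limit = size_limit
        self.time_limit = time_limit
        self.buffer = []
        self.first_at = None

    def log(self, message):
        if not self.buffer:
            self.first_at = time.monotonic()
        self.buffer.append(message)
        if (len(self.buffer) >= self.size_limit
                or time.monotonic() - self.first_at >= self.time_limit):
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(list(self.buffer))
            self.buffer.clear()
```

A real shipper also flushes on a timer even when no new logs arrive; this sketch only checks the clock on each `log` call.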

@@ -122,6 +126,10 @@ logger.log(obj);
</TabItem>
<TabItem value="winston-logzio" label="winston-logzio">

:::note
[Project's GitHub repo](https://github.com/logzio/winston-logzio/)
:::

This winston plugin is a wrapper for the logzio-nodejs appender, so it ships logs through our Node.js shipper under the hood.
With winston-logzio, you can take advantage of the winston logger framework with your Node.js app.

@@ -380,6 +388,11 @@ Deploy this integration to send custom metrics from your Node.js application to

The provided example uses the [OpenTelemetry JS SDK](https://github.com/open-telemetry/opentelemetry-js) and is based on [OpenTelemetry exporter collector proto](https://github.com/open-telemetry/opentelemetry-js/tree/main/packages/opentelemetry-exporter-collector-proto).

:::note
[Project's GitHub repo](https://github.com/logzio/js-metrics/)
:::


**Before you begin, you'll need**:

* Node 8 or higher
4 changes: 4 additions & 0 deletions docs/shipping/Code/python.md
@@ -16,6 +16,10 @@ drop_filter: []

## Logs

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-python-handler/)
:::

Logz.io Python Handler sends logs in bulk over HTTPS to Logz.io.
Logs are grouped into bulks based on their size.
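The size-based grouping can be illustrated with a small sketch (a hypothetical helper, not the handler's actual code): lines accumulate into a bulk until adding another would exceed the size cap.

```python
def group_into_bulks(lines, max_bytes):
    """Sketch of size-based bulking: each bulk's total UTF-8 size stays
    within max_bytes (a single oversized line forms its own bulk)."""
    bulks, current, current_size = [], [], 0
    for line in lines:
        size = len(line.encode("utf-8"))
        if current and current_size + size > max_bytes:
            bulks.append(current)
            current, current_size = [], 0
        current.append(line)
        current_size += size
    if current:
        bulks.append(current)
    return bulks
```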

4 changes: 4 additions & 0 deletions docs/shipping/Containers/docker.md
@@ -24,6 +24,10 @@ from other Docker containers and forward them to your Logz.io account.
To use docker-collector-logs, you'll set environment variables when you run the container.
The Docker logs directory and docker.sock are mounted to the container, allowing Filebeat to collect the logs and metadata.

:::note
[Project's GitHub repo](https://github.com/logzio/docker-collector-logs/)
:::


##### Upgrading to a newer version

4 changes: 4 additions & 0 deletions docs/shipping/Containers/openshift.md
@@ -19,6 +19,10 @@ drop_filter: []
OpenShift is a family of containerization software products developed by Red Hat. Deploy this integration to ship logs from your OpenShift cluster to Logz.io.
This integration deploys the default daemonset, which sends only container logs and ignores all containers in the "openshift" namespace.

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-openshift/)
:::

**Before you begin, you'll need**:

* A working OpenShift cluster
4 changes: 4 additions & 0 deletions docs/shipping/Database/mysql.md
@@ -17,6 +17,10 @@ drop_filter: []

## Logs

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-mysql-logs/)
:::

### Default configuration

**Before you begin, you'll need**:
3 changes: 3 additions & 0 deletions docs/shipping/GCP/gcp-stackdriver.md
@@ -18,6 +18,9 @@ drop_filter: []

Google Cloud Platform (GCP) Stackdriver collects logs from your cloud services. You can use Google Cloud Pub/Sub to forward your logs from GCP sinks to Logz.io.

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-pubsub/)
:::

**Before you begin, you'll need**:

6 changes: 4 additions & 2 deletions docs/shipping/Other/fluent-bit.md
@@ -14,13 +14,15 @@ metrics_alerts: []
drop_filter: []
---



## Run Fluent Bit as a standalone app

Fluent Bit is an open source log processor and forwarder that allows you to collect data such as metrics and logs from different sources. This integration allows you to send logs from Fluent Bit running as a standalone app and forward them to your Logz.io account.


:::note
[Project's GitHub repo](https://github.com/logzio/fluent-bit-logzio-output/)
:::


### Install Fluent Bit
5 changes: 5 additions & 0 deletions docs/shipping/Other/fluentd.md
@@ -21,6 +21,7 @@ Fluentd is a data collector that unifies data collection and consumption.
Fluentd will fetch all existing logs, as it is not able to ignore older logs.
:::


## Configure Fluentd with Ruby Gems

**Before you begin, you'll need**:
@@ -290,6 +291,10 @@ This integration includes:

Upon deployment, each container on your host system, including the Fluentd container, writes logs to a dedicated log file. Fluentd fetches the log data from this file and ships it over HTTP or HTTPS to your Logz.io account, either via an optional proxy server or directly.

:::note
[Project's GitHub repo](https://github.com/logzio/fluentd-docker-logs/)
:::


**Before you begin, you'll need**:
* Docker installed on your host system
4 changes: 3 additions & 1 deletion docs/shipping/Other/heroku.md
@@ -71,7 +71,9 @@ If you still don't see your logs, see [log shipping troubleshooting]({{site.base
* [Heroku CLI](https://devcenter.heroku.com/articles/heroku-cli#download-and-install)

:::note
[Project's GitHub repo](https://github.com/logzio/heroku-buildpack-telegraf/)
:::

:::note
All commands in these instructions should be run from your Heroku app directory.
4 changes: 3 additions & 1 deletion docs/shipping/Other/microsoft-graph.md
@@ -18,7 +18,9 @@ drop_filter: []

Microsoft Graph is a RESTful web API that enables you to access Microsoft Cloud service resources. This integration allows you to collect data from Microsoft Graph API and send it to your Logz.io account.


:::note
[Project's GitHub repo](https://github.com/logzio/logzio-api-fetcher/)
:::



4 changes: 3 additions & 1 deletion docs/shipping/Other/salesforce-commerce-cloud.md
@@ -21,7 +21,9 @@ Salesforce Commerce Cloud is a scalable, cloud-based software-as-a-service (SaaS

The default configuration uses a Docker container with environment variables.


:::note
[Project's GitHub repo](https://github.com/logzio/sfcc-webdav-fetcher/)
:::


##### Pull the Docker image of the Logz.io Salesforce Commerce Cloud data fetcher
4 changes: 3 additions & 1 deletion docs/shipping/Other/salesforce.md
@@ -20,7 +20,9 @@ drop_filter: []
Salesforce is a customer relationship management solution. The Account sObject is an abstraction of the account record and holds the account field information in memory as an object. This integration allows you to collect sObject data from Salesforce and send it to your Logz.io account.



:::note
[Project's GitHub repo](https://github.com/logzio/salesforce-logs-receiver/)
:::


##### Pull the Docker image of the Logz.io API fetcher
4 changes: 3 additions & 1 deletion docs/shipping/Security/carbon-black.md
@@ -19,7 +19,9 @@ drop_filter: []

With this integration, you can collect logs from Carbon Black and forward them to Logz.io.


:::note
[Project's GitHub repo](https://github.com/logzio/s3-hook/)
:::

### Set Carbon Black Event Forwarder

4 changes: 3 additions & 1 deletion docs/shipping/Security/cisco-securex.md
@@ -19,7 +19,9 @@ drop_filter: []
Cisco SecureX connects the breadth of Cisco's integrated security portfolio and your infrastructure. This integration allows you to collect data from Cisco SecureX API and send it to your Logz.io account.



:::note
[Project's GitHub repo](https://github.com/logzio/logzio-api-fetcher/)
:::



4 changes: 4 additions & 0 deletions docs/shipping/Security/x509.md
@@ -21,6 +21,10 @@ Deploy this integration to collect X509 certificate metrics from URLs and send t
* x509_start_date (in seconds since 1.1.1970)
* x509_end_date (in seconds since 1.1.1970)

:::note
[Project's GitHub repo](https://github.com/logzio/x509-certificate-metrics-lambda/)
:::
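For reference, the epoch-seconds representation used by these metrics can be reproduced with Python's standard library. The `notAfter` string below is an illustrative example, not output from a real certificate:

```python
import ssl, time

# A certificate's notAfter timestamp, in the usual GMT rendering.
not_after = "Jun  1 12:00:00 2030 GMT"  # example value only

# Convert to seconds since 1970-01-01 (the Unix epoch) -- the same unit
# the x509_start_date / x509_end_date metrics use.
end_epoch = ssl.cert_time_to_seconds(not_after)
```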


## Collect certificate metrics using AWS Lambda

4 changes: 4 additions & 0 deletions docs/shipping/Synthetic-Monitoring/api-status-metrics.md
@@ -17,6 +17,10 @@ drop_filter: []

Deploy this integration to collect API status metrics of a user's API and send them to Logz.io.

:::note
[Project's GitHub repo](https://github.com/logzio/logzio-api-status/)
:::

The integration is based on a Lambda function that will be auto-deployed together with the layer [LogzioLambdaExtensionLogs](https://github.com/logzio/logzio-lambda-extensions/tree/main/logzio-lambda-extensions-logs).

