These were made by the dashboard team and migrated to this repo.
Showing 38 changed files with 27,840 additions and 6 deletions.
```
@@ -1,6 +1 @@
elastic
opensearch
splunk
common
config
docker/certs
```
```
@@ -0,0 +1,41 @@
WAZUH_VERSION=4.3.10
ELASTIC_PASSWORD=changeme

## ELASTIC STACK
# Password for the 'kibana_system' user (at least 6 characters)
KIBANA_PASSWORD=kibana_system

# Version of Elastic products
STACK_VERSION=8.6.2

# Set the cluster name
CLUSTER_NAME=docker-cluster

# Set to 'basic' or 'trial' to automatically start the 30-day trial
LICENSE=basic
#LICENSE=trial

# Port to expose Elasticsearch HTTP API to the host
ES_PORT=9201
#ES_PORT=127.0.0.1:9200

# Port to expose Kibana to the host
KIBANA_PORT=5602
#KIBANA_PORT=80

# Increase or decrease based on the available host memory (in bytes)
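# (1073741824 bytes = 1 GiB)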
MEM_LIMIT=1073741824

## OPENSEARCH STACK
# Stack version
OS_VERSION=2.6.0

# Opensearch port
OS_PORT=9202

# Opensearch dashboard port
OSD_PORT=5603

SPLUNK_FORWARDER_URL=https://download.splunk.com/products/universalforwarder/releases/9.0.4/linux/splunkforwarder-9.0.4-de405f4a7979-linux-2.6-amd64.deb
LOGSTASH_URL=https://artifacts.elastic.co/downloads/logstash/logstash-8.6.2-linux-x86_64.tar.gz
```
@@ -0,0 +1,75 @@

# Wazuh integrations

This folder contains a Docker Compose project to test integrations with Splunk and Elasticsearch, from the Wazuh Indexer as well as from the Wazuh manager.

## Services

The Docker Compose project contains these services:

- 1x Splunk Indexer 9.0.4.
- 1x Wazuh stack (indexer, dashboard and manager). The manager container also has a Splunk Forwarder and a Logstash installation in the `/opt` folder.
- 1x Elastic stack (Elasticsearch, Kibana and the setup container).
- 1x OpenSearch stack (OpenSearch and OpenSearch Dashboards).
- 1x Logstash 8.6.2.
- 1x Generator, to generate the certificates and download the required packages.

### Additional content

- Dashboards for Splunk, Kibana and OpenSearch.
- Sample alerts covering the last 7 days once the environment is started. They are inside the `wazuh-manager` container in `/var/ossec/logs/alerts/sample_alerts.json`, and also merged with the non-sample data in the `alerts.json` file (see the snippet below).
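
A quick way to check that the sample data is in place, assuming the manager container is named `wazuh-manager` (confirm with `docker compose ps`):

```sh
# Print the first few sample alerts stored inside the manager container.
docker exec wazuh-manager head -n 5 /var/ossec/logs/alerts/sample_alerts.json
```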

## Requirements

Working installations of:

- Docker.
- Docker Compose.

## Usage

The **.env** file contains variables used to configure the environment, such as component versions, forwarded ports and initial credentials. Modify it as required.

Start all the containers by running `docker compose up -d`. The Splunk integration in the manager container must be started manually by running `/opt/splunkforwarder/bin/splunk start --accept-license`. To stop the environment, use `docker compose down`.
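
Put together, the lifecycle looks like this (the manager container is assumed to be named `wazuh-manager`; confirm with `docker compose ps`):

```sh
# Start every service in the background.
docker compose up -d

# One-time manual step: start the Splunk forwarder inside the manager container.
docker exec -it wazuh-manager /opt/splunkforwarder/bin/splunk start --accept-license

# Stop and remove the environment when finished.
docker compose down
```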

The Splunk Indexer instance is accessible at https://localhost:8000 with credentials `admin:password`. In this instance, the logs imported from the Wazuh Indexer are in the `main` index, and the logs imported from the manager are in the `wazuh-alerts` index.

The Wazuh Dashboard instance is accessible at https://localhost:5601 with credentials `admin:SecretPassword`.

The Kibana instance is accessible at http://localhost:5602 with credentials `elastic:changeme`. In this instance, the logs imported from the Wazuh Indexer are in the `indexer-wazuh-alerts-4.x-<date>` index, and the logs imported from the manager are in the `wazuh-alerts-4.x-<date>` index.
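
To verify that the Elastic side is receiving data, a quick index listing helps. A sketch using `ES_PORT=9201` from the **.env** file; if Elasticsearch serves TLS in this setup, switch to `https://` and add `-k` for the self-signed certificate:

```sh
# List all Wazuh alert indices (matches both wazuh-alerts-4.x-* and
# indexer-wazuh-alerts-4.x-*) with document counts.
curl -u elastic:changeme "http://localhost:9201/_cat/indices/*wazuh-alerts*?v"
```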

The OpenSearch Dashboards instance is accessible at http://localhost:5603 with credentials `admin:admin`. In this instance, the logs imported from the Wazuh Indexer are in the `indexer-wazuh-alerts-4.x-<date>` index, and the logs imported from the manager are in the `wazuh-alerts-4.x-<date>` index.
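
The same check against OpenSearch, using `OS_PORT=9202` from the **.env** file; the OpenSearch security plugin normally serves HTTPS with a self-signed certificate, hence `-k`:

```sh
# List the Wazuh alert indices in OpenSearch with document counts.
curl -k -u admin:admin "https://localhost:9202/_cat/indices/*wazuh-alerts*?v"
```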

The integration from the manager contains sample data as well as the alerts that are generated live; the integration from the indexer does not contain any sample data. Additionally, the dashboards for all the platforms only work with the `wazuh-alerts...` indices, meaning they will not reflect the data generated from the Indexer integration.

## Importing the dashboards

### Splunk

The dashboards for Splunk are located in `extra/dashboards/Splunk`. The steps to import them into Splunk are the following:

- Open a dashboard file and copy all its content.
- In the Splunk UI, navigate to `Search & Reporting` > `Dashboards`, click `Create New Dashboard`, enter the title, select `Dashboard Studio`, select `Grid` and click on `Create`.
- On the top menu there is a `Source` icon. Click on it and replace all the content with the copied content from the dashboard file. Then click on `Back` and `Save`.
- Repeat the steps for all the desired dashboards.

### Elastic

The dashboards for Elastic are located in `docker/integrations/extra/dashboards/elastic`. The steps to import them into Kibana are the following:

- On Kibana, expand the left menu and go to `Stack Management`.
- Click on `Saved Objects`, select `Import`, click on the `Import` icon and browse to the dashboard file. It is possible to import only the desired dashboard, or the file `all-dashboards.ndjson`, which contains all the dashboards.
- Click on `Import`.
- Repeat the steps for all the desired dashboards.

Imported dashboards will appear in the `Dashboards` app on the left menu.
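
Alternatively, Kibana's saved-objects import API can script the same steps. A sketch, assuming the repository-relative file path and the `KIBANA_PORT=5602` mapping from the **.env** file:

```sh
# Import all dashboards in one call; overwrite=true replaces existing copies.
curl -u elastic:changeme -X POST \
  "http://localhost:5602/api/saved_objects/_import?overwrite=true" \
  -H "kbn-xsrf: true" \
  --form file=@docker/integrations/extra/dashboards/elastic/all-dashboards.ndjson
```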

### OpenSearch

The dashboards for OpenSearch are located in `docker/integrations/extra/dashboards/opensearch`. The steps to import them into OpenSearch Dashboards are the following:

- On OpenSearch Dashboards, expand the left menu and go to `Stack Management`.
- Click on `Saved Objects`, select `Import`, click on the `Import` icon and browse to the dashboard file. It is possible to import only the desired dashboard, or the file `all-dashboards.ndjson`, which contains all the dashboards.
- Click on `Import`.
- Repeat the steps for all the desired dashboards.

Imported dashboards will appear in the `Dashboards` app on the left menu.
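
OpenSearch Dashboards exposes the equivalent API, with an `osd-xsrf` header in place of `kbn-xsrf`. A sketch using `OSD_PORT=5603` from the **.env** file:

```sh
# Import all dashboards into OpenSearch Dashboards in one call.
curl -u admin:admin -X POST \
  "http://localhost:5603/api/saved_objects/_import?overwrite=true" \
  -H "osd-xsrf: true" \
  --form file=@docker/integrations/extra/dashboards/opensearch/all-dashboards.ndjson
```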
```
@@ -0,0 +1,15 @@
{
  "CN": "Wazuh",
  "key": {
    "algo": "rsa",
    "size": 2048
  },
  "names": [
    {
      "C": "US",
      "L": "San Francisco",
      "O": "Wazuh",
      "OU": "Wazuh Root CA"
    }
  ]
}
```
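
This is a standard cfssl CSR for a root CA. As a minimal sketch of how such a file is typically consumed (assuming it is saved as `ca.json`; the actual file name is not shown in this commit):

```sh
# Generate the root CA: writes ca.pem (certificate), ca-key.pem (private key)
# and ca.csr alongside the config.
cfssl gencert -initca ca.json | cfssljson -bare ca
```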
```
@@ -0,0 +1,58 @@
{
  "signing": {
    "default": {
      "expiry": "8760h"
    },
    "profiles": {
      "intermediate_ca": {
        "usages": [
          "signing",
          "digital signature",
          "key encipherment",
          "cert sign",
          "crl sign",
          "server auth",
          "client auth"
        ],
        "expiry": "8760h",
        "ca_constraint": {
          "is_ca": true,
          "max_path_len": 0,
          "max_path_len_zero": true
        }
      },
      "peer": {
        "usages": [
          "signing",
          "digital signature",
          "key encipherment",
          "data encipherment",
          "client auth",
          "server auth"
        ],
        "expiry": "8760h"
      },
      "server": {
        "usages": [
          "signing",
          "digital signature",
          "key encipherment",
          "data encipherment",
          "server auth"
        ],
        "expiry": "8760h"
      },
      "client": {
        "usages": [
          "signing",
          "digital signature",
          "key encipherment",
          "data encipherment",
          "client auth"
        ],
        "expiry": "8760h"
      }
    }
  }
}
```
```
@@ -0,0 +1,19 @@
{
  "CN": "HOST",
  "key": {
    "algo": "rsa",
    "size": 2048
  },
  "names": [
    {
      "C": "US",
      "L": "California",
      "O": "Wazuh",
      "OU": "Wazuh"
    }
  ],
  "hosts": [
    "HOST",
    "localhost"
  ]
}
```
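
The `HOST` value is a placeholder, which suggests the generator substitutes a real node name before signing. A hedged sketch of that flow, assuming this template is saved as `host.json`, the signing policy above as `cfssl.json`, and the root CA files from the earlier sketch; the node name `wazuh.indexer` is illustrative:

```sh
# Render the template for a concrete node name.
sed 's/HOST/wazuh.indexer/g' host.json > wazuh.indexer.json

# Sign it with the root CA using the "server" profile defined in cfssl.json.
cfssl gencert -ca ca.pem -ca-key ca-key.pem \
  -config cfssl.json -profile server wazuh.indexer.json \
  | cfssljson -bare wazuh.indexer
```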