This repository has been archived by the owner on Jan 29, 2024. It is now read-only.

Commit

Merge branch 'main' into harshini-kafka-prometheus-privatelink
harshini-rangaswamy authored Jan 11, 2024
2 parents 2fe0fe9 + 8a68558 commit 4f4ae4a
Showing 39 changed files with 93 additions and 62 deletions.
13 changes: 13 additions & 0 deletions _redirects
@@ -94,6 +94,19 @@
/docs/tools/cli/card /docs/tools/cli/account


/docs/tools/api/examples /docs/tools/api
/docs/products/postgresql/getting-started /docs/products/postgresql/get-started
/docs/products/m3db/getting-started /docs/products/m3db/get-started
/docs/products/flink/getting-started /docs/products/flink/get-started
/docs/products/kafka/getting-started /docs/products/kafka/get-started
/docs/products/clickhouse/getting-started /docs/products/clickhouse/get-started
/docs/products/opensearch/getting-started /docs/products/opensearch/get-started
/docs/products/kafka/karapace/getting-started /docs/products/kafka/karapace/get-started
/docs/products/kafka/kafka-connect/getting-started /docs/products/kafka/kafka-connect/get-started
/docs/products/opensearch/dashboards/getting-started /docs/products/opensearch/dashboards/get-started
/docs/products/kafka/kafka-mirrormaker/getting-started /docs/products/kafka/kafka-mirrormaker/get-started
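For context, the rules above map each retired ``getting-started`` URL to its ``get-started`` replacement. Assuming this ``_redirects`` file follows the Netlify convention (not confirmed by the diff itself), each rule is ``<source> <destination> [status]``, with the status defaulting to ``301``. A minimal sketch with hypothetical paths:

```
# Netlify-style rule: <source> <destination> [status] (status defaults to 301)
/docs/old-path   /docs/new-path
/docs/gone/*     /docs/:splat   301
```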


# Redirect from .index.html to specific page names for landing

# with one section and no subsections, i.e. docs/platform
20 changes: 10 additions & 10 deletions _toc.yml
@@ -313,7 +313,7 @@ entries:
- file: docs/products/kafka
title: Apache Kafka
entries:
- file: docs/products/kafka/getting-started
- file: docs/products/kafka/get-started
title: Get started
- file: docs/products/kafka/howto/fake-sample-data
title: Sample data generator
@@ -449,7 +449,7 @@ entries:
- file: docs/products/kafka/kafka-connect
title: Apache Kafka Connect
entries:
- file: docs/products/kafka/kafka-connect/getting-started
- file: docs/products/kafka/kafka-connect/get-started
- file: docs/products/kafka/kafka-connect/concepts
entries:
- file: docs/products/kafka/kafka-connect/concepts/list-of-connector-plugins
@@ -560,7 +560,7 @@ entries:
- file: docs/products/kafka/kafka-mirrormaker
title: Apache Kafka MirrorMaker2
entries:
- file: docs/products/kafka/kafka-mirrormaker/getting-started
- file: docs/products/kafka/kafka-mirrormaker/get-started
- file: docs/products/kafka/kafka-mirrormaker/concepts
entries:
- file: docs/products/kafka/kafka-mirrormaker/concepts/disaster-recovery-migration
@@ -587,7 +587,7 @@ entries:
- file: docs/products/kafka/karapace
title: Karapace
entries:
- file: docs/products/kafka/karapace/getting-started
- file: docs/products/kafka/karapace/get-started
- file: docs/products/kafka/karapace/concepts
title: Concepts
entries:
@@ -621,7 +621,7 @@ entries:
title: Plans and pricing
- file: docs/products/flink/reference/flink-limitations
title: Limitations
- file: docs/products/flink/getting-started
- file: docs/products/flink/get-started
title: Quickstart
- file: docs/products/flink/concepts
title: Concepts
@@ -767,7 +767,7 @@ entries:
title: Plans and pricing
- file: docs/products/clickhouse/reference/limitations
title: Limits and limitations
- file: docs/products/clickhouse/getting-started
- file: docs/products/clickhouse/get-started
title: Quickstart
- file: docs/products/clickhouse/concepts
title: Concepts
@@ -985,7 +985,7 @@ entries:
- file: docs/products/m3db
title: M3DB
entries:
- file: docs/products/m3db/getting-started
- file: docs/products/m3db/get-started
title: Get started
- file: docs/products/m3db/concepts
title: Concepts
@@ -1083,7 +1083,7 @@ entries:
- file: docs/products/opensearch
title: OpenSearch
entries:
- file: docs/products/opensearch/getting-started
- file: docs/products/opensearch/get-started
title: Quickstart
entries:
- file: docs/products/opensearch/howto/sample-dataset
@@ -1182,7 +1182,7 @@ entries:
- file: docs/products/opensearch/dashboards
title: OpenSearch Dashboards
entries:
- file: docs/products/opensearch/dashboards/getting-started
- file: docs/products/opensearch/dashboards/get-started
- file: docs/products/opensearch/dashboards/howto
title: HowTo
entries:
@@ -1209,7 +1209,7 @@
entries:
- file: docs/products/postgresql/overview
title: Overview
- file: docs/products/postgresql/getting-started
- file: docs/products/postgresql/get-started
title: Quickstart
- file: docs/products/postgresql/concepts
title: Concepts
2 changes: 1 addition & 1 deletion docs/products/clickhouse.rst
@@ -7,7 +7,7 @@ Aiven for ClickHouse® is a fully managed distributed columnar database based on

.. grid:: 1 2 2 2

.. grid-item-card:: :doc:`Quickstart </docs/products/clickhouse/getting-started>`
.. grid-item-card:: :doc:`Quickstart </docs/products/clickhouse/get-started>`
:shadow: md
:margin: 2 2 0 0

2 changes: 1 addition & 1 deletion docs/products/clickhouse/howto/load-dataset.rst
@@ -29,7 +29,7 @@ Once done, you should have two files available: ``hits_v1.tsv`` and ``visits_v1.
Set up the service and database
-------------------------------

If you don't yet have an Aiven for ClickHouse service, follow the steps in our :doc:`getting started guide </docs/products/clickhouse/getting-started>` to create one.
If you don't yet have an Aiven for ClickHouse service, follow the steps in our :doc:`getting started guide </docs/products/clickhouse/get-started>` to create one.

When you create a service, a default database is already added. However, you can create separate databases specific to your use case. We will create a database with the name ``datasets``, keeping it the same as in the ClickHouse documentation.

2 changes: 1 addition & 1 deletion docs/products/flink.rst
@@ -5,7 +5,7 @@ Aiven for Apache Flink® is a fully managed service that leverages the power of

.. grid:: 1 2 2 2

.. grid-item-card:: :doc:`Quickstart </docs/products/flink/getting-started>`
.. grid-item-card:: :doc:`Quickstart </docs/products/flink/get-started>`
:shadow: md
:margin: 2 2 0 0

2 changes: 1 addition & 1 deletion docs/products/flink/howto/flink-confluent-avro.rst
@@ -12,7 +12,7 @@ Prerequisites
--------------

* :doc:`Aiven for Apache Flink service </docs/platform/howto/create_new_service>` with Aiven for Apache Kafka® integration. See :doc:`/docs/products/flink/howto/create-integration` for more information.
* Aiven for Apache Kafka® service with Karapace Schema registry enabled. See :doc:`/docs/products/kafka/karapace/getting-started` for more information.
* Aiven for Apache Kafka® service with Karapace Schema registry enabled. See :doc:`/docs/products/kafka/karapace/get-started` for more information.
* By default, Flink cannot automatically create Apache Kafka topics while pushing the first record. To change this behavior, enable the ``kafka.auto_create_topics_enable`` option in the **Advanced configuration** section of the target Aiven for Apache Kafka service.

Create an Apache Flink® table with Confluent Avro
2 changes: 1 addition & 1 deletion docs/products/kafka.rst
@@ -27,7 +27,7 @@ Apache Kafka moves data between systems, and Apache Kafka Connect is how to inte
Get started with Aiven for Apache Kafka
---------------------------------------

Take your first steps with Aiven for Apache Kafka by following our :doc:`/docs/products/kafka/getting-started` article, or browse through our full list of articles:
Take your first steps with Aiven for Apache Kafka by following our :doc:`/docs/products/kafka/get-started` article, or browse through our full list of articles:

.. grid:: 1 2 2 2

@@ -1,6 +1,6 @@
Schema registry authorization
=============================

The schema registry authorization feature when enabled in :doc:`Karapace schema registry </docs/products/kafka/karapace/getting-started>` allows you to authenticate the user, and control read or write access to the individual resources available in the Schema Registry.
The schema registry authorization feature when enabled in :doc:`Karapace schema registry </docs/products/kafka/karapace/get-started>` allows you to authenticate the user, and control read or write access to the individual resources available in the Schema Registry.

For information on schema registry authorization for Aiven for Apache Kafka® services, see :doc:`Karapace schema registry authorization </docs/products/kafka/karapace/concepts/schema-registry-authorization>`.
68 changes: 43 additions & 25 deletions docs/products/kafka/howto/datadog-customised-metrics.rst
@@ -1,17 +1,17 @@
Configure Apache Kafka® metrics sent to Datadog
===============================================

When creating a :doc:`Datadog service integration </docs/integrations/datadog/datadog-metrics>`, you can customise which metrics are sent to the Datadog endpoint using the :doc:`Aiven CLI </docs/tools/cli>`.
When creating a `Datadog service integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_, customize which metrics are sent to the Datadog endpoint using the :doc:`Aiven CLI </docs/tools/cli>`.

For each Apache Kafka® topic and partition, the following metrics are currently supported:
The following metrics are currently supported for each topic and partition in Apache Kafka®:

* ``kafka.log.log_size``
* ``kafka.log.log_start_offset``
* ``kafka.log.log_end_offset``

.. Tip::
.. note::

All the above metrics are tagged with ``topic`` and ``partition`` allowing you to monitor each topic and partition independently.
All metrics are tagged with ``topic`` and ``partition``, enabling independent monitoring of each ``topic`` and ``partition``.

Variables
---------
@@ -23,45 +23,63 @@ Variable Description
================== ============================================================================
``SERVICE_NAME`` Aiven for Apache Kafka® service name
------------------ ----------------------------------------------------------------------------
``INTEGRATION_ID`` ID of the integration between the Aiven for Apache Kafka service and Datadog
``INTEGRATION_ID`` ID of the integration between Aiven for Apache Kafka service and Datadog
================== ============================================================================

.. Tip::

    You can find the ``INTEGRATION_ID`` parameter by executing this command:

    .. code::

        avn service integration-list SERVICE_NAME
Customise Apache Kafka® metrics sent to Datadog
-----------------------------------------------
Customize Apache Kafka® metrics for Datadog
----------------------------------------------------

Before customising the metrics, make sure that you have a Datadog endpoint configured and enabled in your Aiven for Apache Kafka service. For details on how to set up the Datadog integration, check the :doc:`dedicated article </docs/integrations/datadog/datadog-metrics>`. Please note that in all the below parameters a 'comma separated list' has the following format: ``['value0','value1','value2','...']``.
Before customizing metrics, ensure a Datadog endpoint is configured and enabled in your Aiven for Apache Kafka service. For setup instructions, see :doc:`Send metrics to Datadog </docs/integrations/datadog/datadog-metrics>`. Format any listed parameters as a comma-separated list: ``['value0', 'value1', 'value2', ...]``.

To customize the metrics sent to Datadog, you can use the ``service integration-update`` command, passing the following customized parameter:

* ``kafka_custom_metrics``: defining the comma-separated list of custom metrics to include (within ``kafka.log.log_size``, ``kafka.log.log_start_offset`` and ``kafka.log.log_end_offset``)

For example, to send the ``kafka.log.log_size`` and ``kafka.log.log_end_offset`` metrics, execute the following code:

.. code::

    avn service integration-update \
        -c kafka_custom_metrics="['kafka.log.log_size','kafka.log.log_end_offset']" \
        INTEGRATION_ID

After the update succeeds and the metrics are collected and sent to Datadog, you can view them in your Datadog explorer.

.. seealso:: Learn more about :doc:`Datadog and Aiven </docs/integrations/datadog>`.


Customize Apache Kafka® consumer metrics for Datadog
-----------------------------------------------------

`Kafka Consumer Integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_ collects metrics for message offsets. To customize the metrics sent from this Datadog integration to Datadog, you can use the ``service integration-update`` command, passing the following customized parameters:

* ``include_topics``: Specify a comma-separated list of topics to include.

  .. Note::

     By default, all topics are included.

* ``exclude_topics``: Specify a comma-separated list of topics to exclude.
* ``include_consumer_groups``: Specify a comma-separated list of consumer groups to include.
* ``exclude_consumer_groups``: Specify a comma-separated list of consumer groups to exclude.

For example, to include topics ``topic1`` and ``topic2``, and exclude ``topic3``, execute the following code:

.. code::

    avn service integration-update \
        -c kafka_custom_metrics="['kafka.log.log_size','kafka.log.log_end_offset']" \
        -c include_topics="['topic1','topic2']" \
        -c exclude_topics="['topic3']" \
        INTEGRATION_ID

After the update succeeds and the metrics are collected and sent to Datadog, you can view them in your Datadog explorer.
2 changes: 1 addition & 1 deletion docs/products/kafka/howto/enable-oidc.rst
@@ -10,7 +10,7 @@ Aiven for Apache Kafka integrates with a wide range of OpenID Connect identity p

Before proceeding with the setup, ensure you have:

* :doc:`Aiven for Apache Kafka® </docs/products/kafka/getting-started>` service running.
* :doc:`Aiven for Apache Kafka® </docs/products/kafka/get-started>` service running.
* **Access to an OIDC provider**: Options include Auth0, Okta, Google Identity Platform, Azure, or any other OIDC compliant provider.
* Required configuration details from your OIDC provider:

@@ -13,7 +13,7 @@ Prerequisites
Before you start, ensure you have the following:

- Aiven account.
- :doc:`Aiven for Apache Kafka® </docs/products/kafka/getting-started>` service running.
- :doc:`Aiven for Apache Kafka® </docs/products/kafka/get-started>` service running.
- :doc:`Prometheus integration </docs/platform/howto/integrations/prometheus-metrics>` set up for your Aiven for Apache Kafka for extracting metrics.
- Necessary permissions to modify service configurations.

2 changes: 1 addition & 1 deletion docs/products/kafka/howto/fake-sample-data.rst
@@ -7,7 +7,7 @@ Learning to work with streaming data is much more fun with data, so to get you s

The following example is based on `Docker <https://www.docker.com/>`_ images, which require `Docker <https://www.docker.com/>`_ or `Podman <https://podman.io/>`_ to be executed.

The following example assumes you have an Aiven for Apache Kafka® service running. You can create one following the :doc:`dedicated instructions </docs/products/kafka/getting-started>`.
The following example assumes you have an Aiven for Apache Kafka® service running. You can create one following the :doc:`dedicated instructions </docs/products/kafka/get-started>`.


Fake data generator on Docker
2 changes: 1 addition & 1 deletion docs/products/kafka/howto/kafka-conduktor.rst
@@ -3,7 +3,7 @@ Connect to Apache Kafka® with Conduktor

`Conduktor <https://www.conduktor.io/>`_ is a friendly user interface for Apache Kafka, and it works well with Aiven. In fact, there is built-in support for setting up the connection. You will need to add the CA certificate for each of your Aiven projects to Conduktor before you can connect; this is outlined in the steps below.

1. Visit the **Service overview** page for your Aiven for Apache Kafka® service (the :doc:`/docs/products/kafka/getting-started` page is a good place for more information about creating a new service if you don't have one already).
1. Visit the **Service overview** page for your Aiven for Apache Kafka® service (the :doc:`/docs/products/kafka/get-started` page is a good place for more information about creating a new service if you don't have one already).

2. Download the **Access Key**, **Access Certificate** and **CA Certificate** (if you didn't have that already) into a directory on your computer.

2 changes: 1 addition & 1 deletion docs/products/kafka/howto/kafka-klaw.rst
@@ -9,7 +9,7 @@ Prerequisites
-------------
To connect Aiven for Apache Kafka® and Klaw, you need to have the following setup:

* A running Aiven for Apache Kafka® service. See :doc:`Getting started with Aiven for Apache Kafka </docs/products/kafka/getting-started>` for more information.
* A running Aiven for Apache Kafka® service. See :doc:`Getting started with Aiven for Apache Kafka </docs/products/kafka/get-started>` for more information.
* A running Klaw cluster. See `Run Klaw from the source <https://www.klaw-project.io/docs/quickstart>`_ for more information.
* Configured :doc:`Java keystore and truststore containing the service SSL certificates </docs/products/kafka/howto/keystore-truststore>`.

2 changes: 1 addition & 1 deletion docs/products/kafka/kafka-connect.rst
@@ -127,7 +127,7 @@ Sink connectors
Get started with Aiven for Apache Kafka® Connect
------------------------------------------------

Take your first steps with Aiven for Apache Kafka Connect by following our :doc:`/docs/products/kafka/kafka-connect/getting-started` article, or browse through our full list of articles:
Take your first steps with Aiven for Apache Kafka Connect by following our :doc:`/docs/products/kafka/kafka-connect/get-started` article, or browse through our full list of articles:


.. grid:: 1 2 2 2
2 changes: 1 addition & 1 deletion docs/products/kafka/kafka-mirrormaker.rst
@@ -18,7 +18,7 @@ Apache Kafka® represents the best in class data streaming solution. Apache Kafk
Get started with Aiven for Apache Kafka® MirrorMaker 2
------------------------------------------------------

Take your first steps with Aiven for Apache Kafka® MirrorMaker 2 by following our :doc:`/docs/products/kafka/kafka-mirrormaker/getting-started` article, or browse through our full list of articles:
Take your first steps with Aiven for Apache Kafka® MirrorMaker 2 by following our :doc:`/docs/products/kafka/kafka-mirrormaker/get-started` article, or browse through our full list of articles:


.. grid:: 1 2 2 2
@@ -13,7 +13,7 @@ To define a replication flow between a source Apache Kafka cluster and a target

.. Note::

If no Aiven for Apache Kafka MirrorMaker 2 service is defined yet, :doc:`you can create one in the Aiven console <../get-started>`.
If no Aiven for Apache Kafka MirrorMaker 2 are already defined, :doc:`you can create one in the Aiven console <../get-started>`.

2. In the service **Overview** screen, scroll to the **Service integrations** section and select **Manage integrations**.

