From e032f4789b4dd43c9249c77872d49580f847c120 Mon Sep 17 00:00:00 2001
From: Juha Mynttinen <juha.mynttinen@aiven.io>
Date: Wed, 20 Dec 2023 08:17:52 +0200
Subject: [PATCH 1/4] Kafka datadog: clarify Kafka metrics vs Kafka consumer metrics

Split the chapter on customising metrics into two: one for the main
Kafka integration and the other for the Consumer Integration. Add an
example of how to use exclude_topics.

The issue was that the documentation failed to mention that the four
keys include_topics, exclude_topics, include_consumer_groups and
exclude_consumer_groups affect only the consumer integration, and that
kafka_custom_metrics applies to the main Kafka integration.
---
 .../howto/datadog-customised-metrics.rst      | 30 +++++++++++++++----
 1 file changed, 24 insertions(+), 6 deletions(-)

diff --git a/docs/products/kafka/howto/datadog-customised-metrics.rst b/docs/products/kafka/howto/datadog-customised-metrics.rst
index fdde87aad3..ff397c015e 100644
--- a/docs/products/kafka/howto/datadog-customised-metrics.rst
+++ b/docs/products/kafka/howto/datadog-customised-metrics.rst
@@ -42,9 +42,30 @@ Before customising the metrics, make sure that you have a Datadog endpoint confi
 To customise the metrics sent to Datadog, you can use the ``service integration-update`` passing the following customised parameters:
 
 * ``kafka_custom_metrics``: defining the comma separated list of custom metrics to include (within ``kafka.log.log_size``, ``kafka.log.log_start_offset`` and ``kafka.log.log_end_offset``)
+
+As example to sent the ``kafka.log.log_size`` and ``kafka.log.log_end_offset`` metrics execute the following code:
+
+.. code::
+
+    avn service integration-update \
+    -c kafka_custom_metrics=['kafka.log.log_size','kafka.log.log_end_offset'] \
+    INTEGRATION_ID
+
+Once the update is successful and metrics have been collected and pushed, you should see them in your Datadog explorer.
+
+.. seealso:: Learn more about :doc:`/docs/integrations/datadog`.
+
+
+Customise Apache Kafka® Consumer Integration metrics sent to Datadog
+====================================================================
+
+`Kafka Consumer Integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_ collects metrics for message offsets.
+
+To customise the metrics sent from this Datadog integration to Datadog, you can use the ``service integration-update`` passing the following customised parameters:
+
 * ``include_topics``: defining the comma separated list of topics to include
 
-.. Tip:: 
+.. Tip::
 
    By default, all topics are included.
 
@@ -52,16 +73,13 @@ To customise the metrics sent to Datadog, you can use the ``service integration-
 * ``exclude_topics``: defining the comma separated list of topics to exclude
 * ``include_consumer_groups``: defining the comma separated list of consumer groups to include
 * ``exclude_consumer_groups``: defining the comma separated list of consumer groups to include
-
-As example to sent the ``kafka.log.log_size`` and ``kafka.log.log_end_offset`` metrics for ``topic1`` and ``topic2`` execute the following code:
+As example to include topics ``topic1`` and ``topic2`` and exclude topic ``topic3`` execute the following code:
 
 .. code::
 
     avn service integration-update \
-    -c kafka_custom_metrics=['kafka.log.log_size','kafka.log.log_end_offset'] \
     -c include_topics=['topic1','topic2'] \
+    -c exclude_topics=['topic3'] \
     INTEGRATION_ID
 
 Once the update is successful and metrics have been collected and pushed, you should see them in your Datadog explorer.
-
-.. seealso:: Learn more about :doc:`/docs/integrations/datadog`.
\ No newline at end of file

From 3133a24fc257a4d5ba77560216f05a6f6d892b73 Mon Sep 17 00:00:00 2001
From: Harshini Rangaswamy <harshini.rangaswamy@aiven.io>
Date: Wed, 20 Dec 2023 14:52:49 +0100
Subject: [PATCH 2/4] Updated content clarity and consistency

---
 .../howto/datadog-customised-metrics.rst      | 57 +++++++++----------
 1 file changed, 28 insertions(+), 29 deletions(-)

diff --git a/docs/products/kafka/howto/datadog-customised-metrics.rst b/docs/products/kafka/howto/datadog-customised-metrics.rst
index ff397c015e..17c1d3bae9 100644
--- a/docs/products/kafka/howto/datadog-customised-metrics.rst
+++ b/docs/products/kafka/howto/datadog-customised-metrics.rst
@@ -1,17 +1,17 @@
 Configure Apache Kafka® metrics sent to Datadog
 ===============================================
 
-When creating a :doc:`Datadog service integration </docs/integrations/datadog/datadog-metrics>`, you can customise which metrics are sent to the Datadog endpoint using the :doc:`Aiven CLI </docs/tools/cli>`.
+When creating a `Datadog service integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_, customize which metrics are sent to the Datadog endpoint using the `Aiven CLI <https://aiven.io/docs/products/kafka>`_.
 
-For each Apache Kafka® topic and partition, the following metrics are currently supported:
+The following metrics are currently supported for each topic and partition in Apache Kafka®:
 
 * ``kafka.log.log_size``
 * ``kafka.log.log_start_offset``
 * ``kafka.log.log_end_offset``
 
-.. Tip:: 
+.. note::
 
-    All the above metrics are tagged with ``topic`` and ``partition`` allowing you to monitor each topic and partition independently.
+    All metrics are tagged with ``topic`` and ``partition``, enabling independent monitoring of each ``topic`` and ``partition``.
 
 Variables
 ---------
 
@@ -23,27 +23,27 @@
 Variable           Description
 ================== ============================================================================
 ``SERVICE_NAME``   Aiven for Apache Kafka® service name
------------------- ----------------------------------------------------------------------------
-``INTEGRATION_ID`` ID of the integration between the Aiven for Apache Kafka service and Datadog
+``INTEGRATION_ID`` ID of the integration between Aiven for Apache Kafka service and Datadog
 ================== ============================================================================
 
-.. Tip:: 
-    The ``INTEGRATION_ID`` parameter can be found by issuing:
-
-    .. code::
+You can find the ``INTEGRATION_ID`` parameter by executing this command:
+
+.. code::
 
-        avn service integration-list SERVICE_NAME
+    avn service integration-list SERVICE_NAME
+
+Customize Apache Kafka® metrics for Datadog
+----------------------------------------------------
 
-Customise Apache Kafka® metrics sent to Datadog
------------------------------------------------
+Before customizing metrics, ensure a Datadog endpoint is configured and enabled in your Aiven for Apache Kafka service. For setup instructions, see `Send metrics to Datadog <https://aiven.io/docs/integrations/datadog/datadog-metrics>`_. Format any listed parameters as a comma-separated list: ``['value0', 'value1', 'value2', ...]``.
 
-Before customising the metrics, make sure that you have a Datadog endpoint configured and enabled in your Aiven for Apache Kafka service. For details on how to set up the Datadog integration, check the :doc:`dedicated article </docs/integrations/datadog/datadog-metrics>`. Please note that in all the below parameters a 'comma separated list' has the following format: ``['value0','value1','value2','...']``.
 
-To customise the metrics sent to Datadog, you can use the ``service integration-update`` passing the following customised parameters:
+To customize the metrics sent to Datadog, you can use the ``service integration-update`` passing the following customized parameters:
 
-* ``kafka_custom_metrics``: defining the comma separated list of custom metrics to include (within ``kafka.log.log_size``, ``kafka.log.log_start_offset`` and ``kafka.log.log_end_offset``)
+* ``kafka_custom_metrics``: defining the comma-separated list of custom metrics to include (within ``kafka.log.log_size``, ``kafka.log.log_start_offset`` and ``kafka.log.log_end_offset``)
 
-As example to sent the ``kafka.log.log_size`` and ``kafka.log.log_end_offset`` metrics execute the following code:
+For example, to send the ``kafka.log.log_size`` and ``kafka.log.log_end_offset`` metrics, execute the following code:
 
 .. code::
 
@@ -51,29 +51,28 @@ As example to sent the ``kafka.log.log_size`` and ``kafka.log.log_end_offset`` m
     -c kafka_custom_metrics=['kafka.log.log_size','kafka.log.log_end_offset'] \
     INTEGRATION_ID
 
-Once the update is successful and metrics have been collected and pushed, you should see them in your Datadog explorer.
-.. seealso:: Learn more about :doc:`/docs/integrations/datadog`.
+After you successfully update and the metrics are collected and sent to Datadog, you can view them in your Datadog explorer.
+.. seealso:: Learn more about :doc:`Datadog and Aiven </docs/integrations/datadog>`.
 
-Customise Apache Kafka® Consumer Integration metrics sent to Datadog
-====================================================================
-`Kafka Consumer Integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_ collects metrics for message offsets.
+Customize Apache Kafka® consumer metrics for Datadog
+-----------------------------------------------------
 
-To customise the metrics sent from this Datadog integration to Datadog, you can use the ``service integration-update`` passing the following customised parameters:
+`Kafka Consumer Integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_ collects metrics for message offsets. To customize the metrics sent from this Datadog integration to Datadog, you can use the ``service integration-update`` passing the following customized parameters:
 
-* ``include_topics``: defining the comma separated list of topics to include
+* ``include_topics``: Specify a comma-separated list of topics to include.
 
-.. Tip::
+  .. Note::
 
    By default, all topics are included.
 
-* ``exclude_topics``: defining the comma separated list of topics to exclude
-* ``include_consumer_groups``: defining the comma separated list of consumer groups to include
-* ``exclude_consumer_groups``: defining the comma separated list of consumer groups to include
+* ``exclude_topics``: Specify a comma-separated list of topics to exclude.
+* ``include_consumer_groups``: Specify a comma-separated list of consumer groups to include.
+* ``exclude_consumer_groups``: Specify a comma-separated list of consumer groups to exclude.
 
-As example to include topics ``topic1`` and ``topic2`` and exclude topic ``topic3`` execute the following code:
+For example, to include topics ``topic1`` and ``topic2``, and exclude ``topic3``, execute the following code:
 
 .. code::
 
@@ -82,4 +81,4 @@ As example to include topics ``topic1`` and ``topic2`` and exclude topic ``topic
     -c exclude_topics=['topic3'] \
     INTEGRATION_ID
 
-Once the update is successful and metrics have been collected and pushed, you should see them in your Datadog explorer.
+After you successfully update and the metrics are collected and sent to Datadog, you can view them in your Datadog explorer.
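Note on the ``-c key=[...]`` syntax that every example in the two patches above relies on: each flag takes a comma-separated list written as ``['value0','value1',...]``. A minimal sketch of producing that form programmatically (Python; the ``avn_list`` helper is hypothetical and not part of the Aiven CLI):

```python
def avn_list(values):
    """Render values in the comma-separated list form expected by
    `avn service integration-update -c key=[...]`, e.g. ['topic1','topic2']."""
    return "[" + ",".join("'%s'" % v for v in values) + "]"

# Build the flags used in the exclude_topics example (illustrative only).
include_flag = "include_topics=" + avn_list(["topic1", "topic2"])
exclude_flag = "exclude_topics=" + avn_list(["topic3"])
print(include_flag)  # include_topics=['topic1','topic2']
print(exclude_flag)  # exclude_topics=['topic3']
```

The same list form applies to ``kafka_custom_metrics``, ``include_consumer_groups``, and ``exclude_consumer_groups``.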
From eaba0cd7f470a12b3b295de35f2a4a95b06736d7 Mon Sep 17 00:00:00 2001 From: Harshini Rangaswamy <harshini.rangaswamy@aiven.io> Date: Wed, 20 Dec 2023 14:58:17 +0100 Subject: [PATCH 3/4] Fixed link issues --- docs/products/kafka/howto/datadog-customised-metrics.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/products/kafka/howto/datadog-customised-metrics.rst b/docs/products/kafka/howto/datadog-customised-metrics.rst index 17c1d3bae9..d89c0b936e 100644 --- a/docs/products/kafka/howto/datadog-customised-metrics.rst +++ b/docs/products/kafka/howto/datadog-customised-metrics.rst @@ -1,7 +1,7 @@ Configure Apache Kafka® metrics sent to Datadog =============================================== -When creating a `Datadog service integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_, customize which metrics are sent to the Datadog endpoint using the `Aiven CLI <https://aiven.io/docs/products/kafka>`_. +When creating a `Datadog service integration <https://docs.datadoghq.com/integrations/kafka/?tab=host#kafka-consumer-integration>`_, customize which metrics are sent to the Datadog endpoint using the :doc:`Aiven CLI </docs/tools/cli>`. The following metrics are currently supported for each topic and partition in Apache Kafka®: @@ -36,7 +36,7 @@ You can find the ``INTEGRATION_ID`` parameter by executing this command: Customize Apache Kafka® metrics for Datadog ---------------------------------------------------- -Before customizing metrics, ensure a Datadog endpoint is configured and enabled in your Aiven for Apache Kafka service. For setup instructions, see `Send metrics to Datadog <https://aiven.io/docs/integrations/datadog/datadog-metrics>`_. Format any listed parameters as a comma-separated list: ``['value0', 'value1', 'value2', ...]``. +Before customizing metrics, ensure a Datadog endpoint is configured and enabled in your Aiven for Apache Kafka service. 
For setup instructions, see :doc:`Send metrics to Datadog </docs/integrations/datadog/datadog-metrics>`. Format any listed parameters as a comma-separated list: ``['value0', 'value1', 'value2', ...]``. To customize the metrics sent to Datadog, you can use the ``service integration-update`` passing the following customized parameters: From 4cf0dde7f298dbd17c50f9751fcd9ef3d4ed69de Mon Sep 17 00:00:00 2001 From: Arthur <arthur.flageul-marquez@aiven.io> Date: Wed, 10 Jan 2024 18:19:33 +0100 Subject: [PATCH 4/4] harmonize file name for get started pages (#2417) --- _redirects | 10 ++++++++++ _toc.yml | 20 +++++++++---------- docs/products/clickhouse.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 .../clickhouse/howto/load-dataset.rst | 2 +- docs/products/flink.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 .../flink/howto/flink-confluent-avro.rst | 2 +- docs/products/kafka.rst | 2 +- .../schema-registry-authorization.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 docs/products/kafka/howto/enable-oidc.rst | 2 +- .../howto/enabled-consumer-lag-predictor.rst | 2 +- .../products/kafka/howto/fake-sample-data.rst | 2 +- docs/products/kafka/howto/kafka-conduktor.rst | 2 +- docs/products/kafka/howto/kafka-klaw.rst | 2 +- docs/products/kafka/kafka-connect.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 docs/products/kafka/kafka-mirrormaker.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 .../howto/setup-replication-flow.rst | 2 +- docs/products/kafka/karapace.rst | 2 +- .../karapace/concepts/acl-definition.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 .../enable-oauth-oidc-kafka-rest-proxy.rst | 2 +- docs/products/m3db.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 docs/products/opensearch.rst | 2 +- .../concepts/opensearch-vs-elasticsearch.rst | 2 +- docs/products/opensearch/dashboards.rst | 2 +- .../{getting-started.rst => get-started.rst} | 2 +- .../{getting-started.rst => get-started.rst} 
| 0 .../opensearch-aggregations-and-nodejs.rst | 2 +- .../howto/opensearch-and-nodejs.rst | 2 +- docs/products/postgresql.rst | 2 +- .../{getting-started.rst => get-started.rst} | 0 .../tools/terraform/howto/vpc-peering-aws.rst | 2 +- index.rst | 2 +- 38 files changed, 47 insertions(+), 37 deletions(-) rename docs/products/clickhouse/{getting-started.rst => get-started.rst} (100%) rename docs/products/flink/{getting-started.rst => get-started.rst} (100%) rename docs/products/kafka/{getting-started.rst => get-started.rst} (100%) rename docs/products/kafka/kafka-connect/{getting-started.rst => get-started.rst} (100%) rename docs/products/kafka/kafka-mirrormaker/{getting-started.rst => get-started.rst} (100%) rename docs/products/kafka/karapace/{getting-started.rst => get-started.rst} (100%) rename docs/products/m3db/{getting-started.rst => get-started.rst} (100%) rename docs/products/opensearch/dashboards/{getting-started.rst => get-started.rst} (90%) rename docs/products/opensearch/{getting-started.rst => get-started.rst} (100%) rename docs/products/postgresql/{getting-started.rst => get-started.rst} (100%) diff --git a/_redirects b/_redirects index f4a334f006..0854f9a152 100644 --- a/_redirects +++ b/_redirects @@ -92,6 +92,16 @@ /docs/tools/cli/account/account-authentication-method /docs/tools/cli/account /docs/tools/cli/card /docs/tools/cli/account /docs/tools/api/examples /docs/tools/api +/docs/products/postgresql/getting-started /docs/products/postgresql/get-started +/docs/products/m3db/getting-started /docs/products/m3db/get-started +/docs/products/flink/getting-started /docs/products/flink/get-started +/docs/products/kafka/getting-started /docs/products/kafka/get-started +/docs/products/clickhouse/getting-started /docs/products/clickhouse/get-started +/docs/products/opensearch/getting-started /docs/products/opensearch/get-started +/docs/products/kafka/karapace/getting-started /docs/products/kafka/karapace/get-started 
+/docs/products/kafka/kafka-connect/getting-started /docs/products/kafka/kafka-connect/get-started +/docs/products/opensearch/dashboards/getting-started /docs/products/opensearch/dashboards/get-started +/docs/products/kafka/kafka-mirrormaker/getting-started /docs/products/kafka/kafka-mirrormaker/get-started # Redirect from .index.html to specific page names for landing diff --git a/_toc.yml b/_toc.yml index b2fba7fea1..2c59e12d8f 100644 --- a/_toc.yml +++ b/_toc.yml @@ -313,7 +313,7 @@ entries: - file: docs/products/kafka title: Apache Kafka entries: - - file: docs/products/kafka/getting-started + - file: docs/products/kafka/get-started title: Get started - file: docs/products/kafka/howto/fake-sample-data title: Sample data generator @@ -448,7 +448,7 @@ entries: - file: docs/products/kafka/kafka-connect title: Apache Kafka Connect entries: - - file: docs/products/kafka/kafka-connect/getting-started + - file: docs/products/kafka/kafka-connect/get-started - file: docs/products/kafka/kafka-connect/concepts entries: - file: docs/products/kafka/kafka-connect/concepts/list-of-connector-plugins @@ -559,7 +559,7 @@ entries: - file: docs/products/kafka/kafka-mirrormaker title: Apache Kafka MirrorMaker2 entries: - - file: docs/products/kafka/kafka-mirrormaker/getting-started + - file: docs/products/kafka/kafka-mirrormaker/get-started - file: docs/products/kafka/kafka-mirrormaker/concepts entries: - file: docs/products/kafka/kafka-mirrormaker/concepts/disaster-recovery-migration @@ -586,7 +586,7 @@ entries: - file: docs/products/kafka/karapace title: Karapace entries: - - file: docs/products/kafka/karapace/getting-started + - file: docs/products/kafka/karapace/get-started - file: docs/products/kafka/karapace/concepts title: Concepts entries: @@ -620,7 +620,7 @@ entries: title: Plans and pricing - file: docs/products/flink/reference/flink-limitations title: Limitations - - file: docs/products/flink/getting-started + - file: docs/products/flink/get-started title: Quickstart - 
file: docs/products/flink/concepts title: Concepts @@ -766,7 +766,7 @@ entries: title: Plans and pricing - file: docs/products/clickhouse/reference/limitations title: Limits and limitations - - file: docs/products/clickhouse/getting-started + - file: docs/products/clickhouse/get-started title: Quickstart - file: docs/products/clickhouse/concepts title: Concepts @@ -984,7 +984,7 @@ entries: - file: docs/products/m3db title: M3DB entries: - - file: docs/products/m3db/getting-started + - file: docs/products/m3db/get-started title: Get started - file: docs/products/m3db/concepts title: Concepts @@ -1082,7 +1082,7 @@ entries: - file: docs/products/opensearch title: OpenSearch entries: - - file: docs/products/opensearch/getting-started + - file: docs/products/opensearch/get-started title: Quickstart entries: - file: docs/products/opensearch/howto/sample-dataset @@ -1181,7 +1181,7 @@ entries: - file: docs/products/opensearch/dashboards title: OpenSearch Dashboards entries: - - file: docs/products/opensearch/dashboards/getting-started + - file: docs/products/opensearch/dashboards/get-started - file: docs/products/opensearch/dashboards/howto title: HowTo entries: @@ -1208,7 +1208,7 @@ entries: entries: - file: docs/products/postgresql/overview title: Overview - - file: docs/products/postgresql/getting-started + - file: docs/products/postgresql/get-started title: Quickstart - file: docs/products/postgresql/concepts title: Concepts diff --git a/docs/products/clickhouse.rst b/docs/products/clickhouse.rst index 5667e0b532..cf520b0017 100644 --- a/docs/products/clickhouse.rst +++ b/docs/products/clickhouse.rst @@ -7,7 +7,7 @@ Aiven for ClickHouse® is a fully managed distributed columnar database based on .. grid:: 1 2 2 2 - .. grid-item-card:: :doc:`Quickstart </docs/products/clickhouse/getting-started>` + .. 
grid-item-card:: :doc:`Quickstart </docs/products/clickhouse/get-started>` :shadow: md :margin: 2 2 0 0 diff --git a/docs/products/clickhouse/getting-started.rst b/docs/products/clickhouse/get-started.rst similarity index 100% rename from docs/products/clickhouse/getting-started.rst rename to docs/products/clickhouse/get-started.rst diff --git a/docs/products/clickhouse/howto/load-dataset.rst b/docs/products/clickhouse/howto/load-dataset.rst index 3b34effe14..39b90004a5 100644 --- a/docs/products/clickhouse/howto/load-dataset.rst +++ b/docs/products/clickhouse/howto/load-dataset.rst @@ -29,7 +29,7 @@ Once done, you should have two files available: ``hits_v1.tsv`` and ``visits_v1. Set up the service and database ------------------------------- -If you don't yet have an Aiven for ClickHouse service, follow the steps in our :doc:`getting started guide </docs/products/clickhouse/getting-started>` to create one. +If you don't yet have an Aiven for ClickHouse service, follow the steps in our :doc:`getting started guide </docs/products/clickhouse/get-started>` to create one. When you create a service, a default database was already added. However, you can create separate databases specific to your use case. We will create a database with the name ``datasets``, keeping it the same as in the ClickHouse documentation. diff --git a/docs/products/flink.rst b/docs/products/flink.rst index 31d57263e5..abf3fe8366 100644 --- a/docs/products/flink.rst +++ b/docs/products/flink.rst @@ -5,7 +5,7 @@ Aiven for Apache Flink® is a fully managed service that leverages the power of .. grid:: 1 2 2 2 - .. grid-item-card:: :doc:`Quickstart </docs/products/flink/getting-started>` + .. 
grid-item-card:: :doc:`Quickstart </docs/products/flink/get-started>` :shadow: md :margin: 2 2 0 0 diff --git a/docs/products/flink/getting-started.rst b/docs/products/flink/get-started.rst similarity index 100% rename from docs/products/flink/getting-started.rst rename to docs/products/flink/get-started.rst diff --git a/docs/products/flink/howto/flink-confluent-avro.rst b/docs/products/flink/howto/flink-confluent-avro.rst index 1e53fd0927..b190132267 100644 --- a/docs/products/flink/howto/flink-confluent-avro.rst +++ b/docs/products/flink/howto/flink-confluent-avro.rst @@ -12,7 +12,7 @@ Prerequisites -------------- * :doc:`Aiven for Apache Flink service </docs/platform/howto/create_new_service>` with Aiven for Apache Kafka® integration. See :doc:`/docs/products/flink/howto/create-integration` for more information. -* Aiven for Apache Kafka® service with Karapace Schema registry enabled. See :doc:`/docs/products/kafka/karapace/getting-started` for more information. +* Aiven for Apache Kafka® service with Karapace Schema registry enabled. See :doc:`/docs/products/kafka/karapace/get-started` for more information. * By default, Flink cannot create Apache Kafka topics while pushing the first record automatically. To change this behavior, enable in the Aiven for Apache Kafka target service the ``kafka.auto_create_topics_enable`` option in **Advanced configuration** section. 
Create an Apache Flink® table with Confluent Avro diff --git a/docs/products/kafka.rst b/docs/products/kafka.rst index b5c581a86c..d3202c99ff 100644 --- a/docs/products/kafka.rst +++ b/docs/products/kafka.rst @@ -27,7 +27,7 @@ Apache Kafka moves data between systems, and Apache Kafka Connect is how to inte Get started with Aiven for Apache Kafka --------------------------------------- -Take your first steps with Aiven for Apache Kafka by following our :doc:`/docs/products/kafka/getting-started` article, or browse through our full list of articles: +Take your first steps with Aiven for Apache Kafka by following our :doc:`/docs/products/kafka/get-started` article, or browse through our full list of articles: .. grid:: 1 2 2 2 diff --git a/docs/products/kafka/concepts/schema-registry-authorization.rst b/docs/products/kafka/concepts/schema-registry-authorization.rst index 0181e73918..3e4f140860 100644 --- a/docs/products/kafka/concepts/schema-registry-authorization.rst +++ b/docs/products/kafka/concepts/schema-registry-authorization.rst @@ -1,6 +1,6 @@ Schema registry authorization ============================= -The schema registry authorization feature when enabled in :doc:`Karapace schema registry </docs/products/kafka/karapace/getting-started>` allows you to authenticate the user, and control read or write access to the individual resources available in the Schema Registry. +The schema registry authorization feature when enabled in :doc:`Karapace schema registry </docs/products/kafka/karapace/get-started>` allows you to authenticate the user, and control read or write access to the individual resources available in the Schema Registry. For information on schema registry authorization for Aiven for Apache Kafka® services, see :doc:`Karapace schema registry authorization </docs/products/kafka/karapace/concepts/schema-registry-authorization>`. 
diff --git a/docs/products/kafka/getting-started.rst b/docs/products/kafka/get-started.rst similarity index 100% rename from docs/products/kafka/getting-started.rst rename to docs/products/kafka/get-started.rst diff --git a/docs/products/kafka/howto/enable-oidc.rst b/docs/products/kafka/howto/enable-oidc.rst index 96350440d6..5294f1a5e2 100644 --- a/docs/products/kafka/howto/enable-oidc.rst +++ b/docs/products/kafka/howto/enable-oidc.rst @@ -10,7 +10,7 @@ Aiven for Apache Kafka integrates with a wide range of OpenID Connect identity p Before proceeding with the setup, ensure you have: -* :doc:`Aiven for Apache Kafka® </docs/products/kafka/getting-started>` service running. +* :doc:`Aiven for Apache Kafka® </docs/products/kafka/get-started>` service running. * **Access to an OIDC provider**: Options include Auth0, Okta, Google Identity Platform, Azure, or any other OIDC compliant provider. * Required configuration details from your OIDC provider: diff --git a/docs/products/kafka/howto/enabled-consumer-lag-predictor.rst b/docs/products/kafka/howto/enabled-consumer-lag-predictor.rst index 0d7bfacfc1..066d450b77 100644 --- a/docs/products/kafka/howto/enabled-consumer-lag-predictor.rst +++ b/docs/products/kafka/howto/enabled-consumer-lag-predictor.rst @@ -13,7 +13,7 @@ Prerequisites Before you start, ensure you have the following: - Aiven account. -- :doc:`Aiven for Apache Kafka® </docs/products/kafka/getting-started>` service running. +- :doc:`Aiven for Apache Kafka® </docs/products/kafka/get-started>` service running. - :doc:`Prometheus integration </docs/platform/howto/integrations/prometheus-metrics>` set up for your Aiven for Apache Kafka for extracting metrics. - Necessary permissions to modify service configurations. 
diff --git a/docs/products/kafka/howto/fake-sample-data.rst b/docs/products/kafka/howto/fake-sample-data.rst index ad49b0e1dd..d9ee79684f 100644 --- a/docs/products/kafka/howto/fake-sample-data.rst +++ b/docs/products/kafka/howto/fake-sample-data.rst @@ -7,7 +7,7 @@ Learning to work with streaming data is much more fun with data, so to get you s The following example is based on `Docker <https://www.docker.com/>`_ images, which require `Docker <https://www.docker.com/>`_ or `Podman <https://podman.io/>`_ to be executed. -The following example assumes you have an Aiven for Apache Kafka® service running. You can create one following the :doc:`dedicated instructions </docs/products/kafka/getting-started>`. +The following example assumes you have an Aiven for Apache Kafka® service running. You can create one following the :doc:`dedicated instructions </docs/products/kafka/get-started>`. Fake data generator on Docker diff --git a/docs/products/kafka/howto/kafka-conduktor.rst b/docs/products/kafka/howto/kafka-conduktor.rst index 902c3af577..646985051b 100644 --- a/docs/products/kafka/howto/kafka-conduktor.rst +++ b/docs/products/kafka/howto/kafka-conduktor.rst @@ -3,7 +3,7 @@ Connect to Apache Kafka® with Conduktor `Conduktor <https://www.conduktor.io/>`_ is a friendly user interface for Apache Kafka, and it works well with Aiven. In fact, there is built-in support for setting up the connection. You will need to add the CA certificate for each of your Aiven projects to Conduktor before you can connect, this is outlined in the steps below. -1. Visit the **Service overview** page for your Aiven for Apache Kafka® service (the :doc:`/docs/products/kafka/getting-started` page is a good place for more information about creating a new service if you don't have one already). +1. 
Visit the **Service overview** page for your Aiven for Apache Kafka® service (the :doc:`/docs/products/kafka/get-started` page is a good place for more information about creating a new service if you don't have one already). 2. Download the **Access Key**, **Access Certificate** and **CA Certificate** (if you didn't have that already) into a directory on your computer. diff --git a/docs/products/kafka/howto/kafka-klaw.rst b/docs/products/kafka/howto/kafka-klaw.rst index f9a3a90c8c..28fb29879f 100644 --- a/docs/products/kafka/howto/kafka-klaw.rst +++ b/docs/products/kafka/howto/kafka-klaw.rst @@ -9,7 +9,7 @@ Prerequisites ------------- To connect Aiven for Apache Kafka® and Klaw, you need to have the following setup: -* A running Aiven for Apache Kafka® service. See :doc:`Getting started with Aiven for Apache Kafka </docs/products/kafka/getting-started>` for more information. +* A running Aiven for Apache Kafka® service. See :doc:`Getting started with Aiven for Apache Kafka </docs/products/kafka/get-started>` for more information. * A running Klaw cluster. See `Run Klaw from the source <https://www.klaw-project.io/docs/quickstart>`_ for more information. * Configured :doc:`Java keystore and truststore containing the service SSL certificates </docs/products/kafka/howto/keystore-truststore>`. 
diff --git a/docs/products/kafka/kafka-connect.rst b/docs/products/kafka/kafka-connect.rst
index 2b46b20b70..8ff05357d5 100644
--- a/docs/products/kafka/kafka-connect.rst
+++ b/docs/products/kafka/kafka-connect.rst
@@ -127,7 +127,7 @@ Sink connectors
 Get started with Aiven for Apache Kafka® Connect
 ------------------------------------------------
-Take your first steps with Aiven for Apache Kafka Connect by following our :doc:`/docs/products/kafka/kafka-connect/getting-started` article, or browse through our full list of articles:
+Take your first steps with Aiven for Apache Kafka Connect by following our :doc:`/docs/products/kafka/kafka-connect/get-started` article, or browse through our full list of articles:
 .. grid:: 1 2 2 2
diff --git a/docs/products/kafka/kafka-connect/getting-started.rst b/docs/products/kafka/kafka-connect/get-started.rst
similarity index 100%
rename from docs/products/kafka/kafka-connect/getting-started.rst
rename to docs/products/kafka/kafka-connect/get-started.rst
diff --git a/docs/products/kafka/kafka-mirrormaker.rst b/docs/products/kafka/kafka-mirrormaker.rst
index ece98f0ff4..35bbe2954d 100644
--- a/docs/products/kafka/kafka-mirrormaker.rst
+++ b/docs/products/kafka/kafka-mirrormaker.rst
@@ -18,7 +18,7 @@ Apache Kafka® represents the best in class data streaming solution. Apache Kafk
 Get started with Aiven for Apache Kafka® MirrorMaker 2
 ------------------------------------------------------
-Take your first steps with Aiven for Apache Kafka® MirrorMaker 2 by following our :doc:`/docs/products/kafka/kafka-mirrormaker/getting-started` article, or browse through our full list of articles:
+Take your first steps with Aiven for Apache Kafka® MirrorMaker 2 by following our :doc:`/docs/products/kafka/kafka-mirrormaker/get-started` article, or browse through our full list of articles:
 .. grid:: 1 2 2 2
diff --git a/docs/products/kafka/kafka-mirrormaker/getting-started.rst b/docs/products/kafka/kafka-mirrormaker/get-started.rst
similarity index 100%
rename from docs/products/kafka/kafka-mirrormaker/getting-started.rst
rename to docs/products/kafka/kafka-mirrormaker/get-started.rst
diff --git a/docs/products/kafka/kafka-mirrormaker/howto/setup-replication-flow.rst b/docs/products/kafka/kafka-mirrormaker/howto/setup-replication-flow.rst
index bbd940a653..2c0bf96b0e 100644
--- a/docs/products/kafka/kafka-mirrormaker/howto/setup-replication-flow.rst
+++ b/docs/products/kafka/kafka-mirrormaker/howto/setup-replication-flow.rst
@@ -13,7 +13,7 @@ To define a replication flow between a source Apache Kafka cluster and a target
 .. Note::
-    If no Aiven for Apache Kafka MirrorMaker 2 are already defined, :doc:`you can create one in the Aiven console <../getting-started>`.
+    If no Aiven for Apache Kafka MirrorMaker 2 service is defined yet, :doc:`you can create one in the Aiven console <../get-started>`.
 2. In the service **Overview** screen, scroll to the **Service integrations** section and select **Manage integrations**.
diff --git a/docs/products/kafka/karapace.rst b/docs/products/kafka/karapace.rst
index 50c7509695..bc33deea19 100644
--- a/docs/products/kafka/karapace.rst
+++ b/docs/products/kafka/karapace.rst
@@ -14,7 +14,7 @@ Karapace REST provides a RESTful interface to your Apache Kafka cluster, allowin
 Get started with Karapace
 -------------------------
-Take your first steps Karapace by following our :doc:`/docs/products/kafka/karapace/getting-started` article, or browse through other articles:
+Take your first steps with Karapace by following our :doc:`/docs/products/kafka/karapace/get-started` article, or browse through other articles:
 .. grid:: 1 2 2 2
diff --git a/docs/products/kafka/karapace/concepts/acl-definition.rst b/docs/products/kafka/karapace/concepts/acl-definition.rst
index b8424716d5..74c597a5cc 100644
--- a/docs/products/kafka/karapace/concepts/acl-definition.rst
+++ b/docs/products/kafka/karapace/concepts/acl-definition.rst
@@ -73,5 +73,5 @@ The following table provides you with examples:
 The user that manages the ACLs is a superuser with write access to everything in the schema registry. In the Aiven Console, the superuser can view and modify all schemas in the Schema tab of a Kafka service. The superuser and its ACL entries are not visible in the Console but are added automatically by the Aiven platform.
-The schema registry authorization feature enabled in :doc:`Karapace schema registry </docs/products/kafka/karapace/getting-started>` allows you to both authenticate the user, and additionally grant or deny access to individual `Karapace schema registry REST API endpoints <https://github.com/aiven/karapace>`_ and filter the content the endpoints return.
+The schema registry authorization feature enabled in :doc:`Karapace schema registry </docs/products/kafka/karapace/get-started>` allows you to both authenticate the user, and additionally grant or deny access to individual `Karapace schema registry REST API endpoints <https://github.com/aiven/karapace>`_ and filter the content the endpoints return.
diff --git a/docs/products/kafka/karapace/getting-started.rst b/docs/products/kafka/karapace/get-started.rst
similarity index 100%
rename from docs/products/kafka/karapace/getting-started.rst
rename to docs/products/kafka/karapace/get-started.rst
diff --git a/docs/products/kafka/karapace/howto/enable-oauth-oidc-kafka-rest-proxy.rst b/docs/products/kafka/karapace/howto/enable-oauth-oidc-kafka-rest-proxy.rst
index 1e2fcbef47..376d2b3685 100644
--- a/docs/products/kafka/karapace/howto/enable-oauth-oidc-kafka-rest-proxy.rst
+++ b/docs/products/kafka/karapace/howto/enable-oauth-oidc-kafka-rest-proxy.rst
@@ -36,7 +36,7 @@ To establish OAuth2/OIDC authentication for the Karapace REST proxy, complete th
 Prerequisites
 ```````````````
-* :doc:`Aiven for Apache Kafka® </docs/products/kafka/getting-started>` service running with :doc:`OAuth2/OIDC enabled </docs/products/kafka/howto/enable-oidc>`.
+* :doc:`Aiven for Apache Kafka® </docs/products/kafka/get-started>` service running with :doc:`OAuth2/OIDC enabled </docs/products/kafka/howto/enable-oidc>`.
 * :doc:`Karapace schema registry and REST APIs enabled </docs/products/kafka/karapace/howto/enable-karapace>`.
 * Ensure access to an OIDC-compliant provider, such as Auth0, Okta, Google Identity Platform, or Azure.
diff --git a/docs/products/m3db.rst b/docs/products/m3db.rst
index 99f33d4314..31c80ec010 100644
--- a/docs/products/m3db.rst
+++ b/docs/products/m3db.rst
@@ -23,7 +23,7 @@ Read more about `the M3 components <https://m3db.io/docs/overview/components/>`_
 Get started with Aiven for M3
 -----------------------------
-Take your first steps with Aiven for M3 by following our :doc:`/docs/products/m3db/getting-started` article, or browse through our full list of articles:
+Take your first steps with Aiven for M3 by following our :doc:`/docs/products/m3db/get-started` article, or browse through our full list of articles:
 .. grid:: 1 2 2 2
diff --git a/docs/products/m3db/getting-started.rst b/docs/products/m3db/get-started.rst
similarity index 100%
rename from docs/products/m3db/getting-started.rst
rename to docs/products/m3db/get-started.rst
diff --git a/docs/products/opensearch.rst b/docs/products/opensearch.rst
index d26522ba59..e9d0d9b2e7 100644
--- a/docs/products/opensearch.rst
+++ b/docs/products/opensearch.rst
@@ -8,7 +8,7 @@ Aiven for OpenSearch® is a fully managed distributed search and analytics suite
 .. grid:: 1 2 2 2
-    .. grid-item-card:: :doc:`Quickstart </docs/products/opensearch/getting-started>`
+    .. grid-item-card:: :doc:`Quickstart </docs/products/opensearch/get-started>`
        :shadow: md
        :margin: 2 2 0 0
diff --git a/docs/products/opensearch/concepts/opensearch-vs-elasticsearch.rst b/docs/products/opensearch/concepts/opensearch-vs-elasticsearch.rst
index 22a6f00bd6..cbcbf5874f 100644
--- a/docs/products/opensearch/concepts/opensearch-vs-elasticsearch.rst
+++ b/docs/products/opensearch/concepts/opensearch-vs-elasticsearch.rst
@@ -5,7 +5,7 @@ OpenSearch® is the open source continuation of the original Elasticsearch proje
 Version 1.0 release of OpenSearch should be very similar to the Elasticsearch release that it is based on, and Aiven encourages all customers to upgrade at their earliest convenience. This is to ensure that your platforms can continue to receive upgrades in the future.
-To start exploring Aiven for OpenSearch®, check out the :doc:`Get Started with Aiven for OpenSearch® </docs/products/opensearch/getting-started>`.
+To start exploring Aiven for OpenSearch®, check out the :doc:`Get Started with Aiven for OpenSearch® </docs/products/opensearch/get-started>`.
 -----
diff --git a/docs/products/opensearch/dashboards.rst b/docs/products/opensearch/dashboards.rst
index 6d2d855725..7c2ad9eff6 100644
--- a/docs/products/opensearch/dashboards.rst
+++ b/docs/products/opensearch/dashboards.rst
@@ -9,7 +9,7 @@ OpenSearch® Dashboards is both a visualisation tool for data in the cluster and
 Get started with Aiven for OpenSearch Dashboards
 ------------------------------------------------
-Take your first steps with Aiven for OpenSearch Dashboards by following our :doc:`/docs/products/opensearch/dashboards/getting-started` article.
+Take your first steps with Aiven for OpenSearch Dashboards by following our :doc:`/docs/products/opensearch/dashboards/get-started` article.
 .. note:: Starting with Aiven for OpenSearch® versions 1.3.13 and 2.10, OpenSearch Dashboards will remain available during a maintenance update that also consists of version updates to your Aiven for OpenSearch service.
diff --git a/docs/products/opensearch/dashboards/getting-started.rst b/docs/products/opensearch/dashboards/get-started.rst
similarity index 90%
rename from docs/products/opensearch/dashboards/getting-started.rst
rename to docs/products/opensearch/dashboards/get-started.rst
index a9b542656d..77fc0c6612 100644
--- a/docs/products/opensearch/dashboards/getting-started.rst
+++ b/docs/products/opensearch/dashboards/get-started.rst
@@ -1,7 +1,7 @@
 Getting started
 ===============
-To start using **Aiven for OpenSearch® Dashboards**, :doc:`create Aiven for OpenSearch® service first</docs/products/opensearch/getting-started>` and OpenSearch Dashboards service will be added alongside it. Once the Aiven for OpenSearch service is running you can find connection information to your OpenSearch Dashboards in the service overview page and use your favourite browser to access OpenSearch Dashboards service.
+To start using **Aiven for OpenSearch® Dashboards**, :doc:`create Aiven for OpenSearch® service first</docs/products/opensearch/get-started>` and OpenSearch Dashboards service will be added alongside it. Once the Aiven for OpenSearch service is running you can find connection information to your OpenSearch Dashboards in the service overview page and use your favourite browser to access OpenSearch Dashboards service.
 .. note::
diff --git a/docs/products/opensearch/getting-started.rst b/docs/products/opensearch/get-started.rst
similarity index 100%
rename from docs/products/opensearch/getting-started.rst
rename to docs/products/opensearch/get-started.rst
diff --git a/docs/products/opensearch/howto/opensearch-aggregations-and-nodejs.rst b/docs/products/opensearch/howto/opensearch-aggregations-and-nodejs.rst
index c16127a2ec..b534ca7567 100644
--- a/docs/products/opensearch/howto/opensearch-aggregations-and-nodejs.rst
+++ b/docs/products/opensearch/howto/opensearch-aggregations-and-nodejs.rst
@@ -9,7 +9,7 @@ Learn how to aggregate data using OpenSearch and its NodeJS client. In this tuto
 Prepare the playground
 **********************
-You can create an OpenSearch cluster either with the visual interface or with the command line. Depending on your preference follow the instructions for :doc:`getting started with the console for Aiven for Opensearch </docs/products/opensearch/getting-started>` or see :doc:`how to create a service with the help of Aiven command line interface </docs/tools/cli/service>`.
+You can create an OpenSearch cluster either with the visual interface or with the command line. Depending on your preference follow the instructions for :doc:`getting started with the console for Aiven for Opensearch </docs/products/opensearch/get-started>` or see :doc:`how to create a service with the help of Aiven command line interface </docs/tools/cli/service>`.
 .. note::
diff --git a/docs/products/opensearch/howto/opensearch-and-nodejs.rst b/docs/products/opensearch/howto/opensearch-and-nodejs.rst
index 83a046a2f6..537c26cad7 100644
--- a/docs/products/opensearch/howto/opensearch-and-nodejs.rst
+++ b/docs/products/opensearch/howto/opensearch-and-nodejs.rst
@@ -6,7 +6,7 @@ Learn how the OpenSearch® JavaScript client gives a clear and useful interface
 Prepare the playground
 **********************
-You can create an OpenSearch cluster either with the visual interface or with the command line. Depending on your preference follow the instructions for :doc:`getting started with the console for Aiven for Opensearch </docs/products/opensearch/getting-started>` or see :doc:`how to create a service with the help of Aiven command line interface </docs/tools/cli/service>`.
+You can create an OpenSearch cluster either with the visual interface or with the command line. Depending on your preference follow the instructions for :doc:`getting started with the console for Aiven for Opensearch </docs/products/opensearch/get-started>` or see :doc:`how to create a service with the help of Aiven command line interface </docs/tools/cli/service>`.
 .. note::
diff --git a/docs/products/postgresql.rst b/docs/products/postgresql.rst
index 31e38f5c23..d606b8a273 100644
--- a/docs/products/postgresql.rst
+++ b/docs/products/postgresql.rst
@@ -7,7 +7,7 @@ Aiven for PostgreSQL® is is a fully-managed and hosted relational database serv
 .. grid:: 1 2 2 2
-    .. grid-item-card:: :doc:`Quickstart </docs/products/postgresql/getting-started>`
+    .. grid-item-card:: :doc:`Quickstart </docs/products/postgresql/get-started>`
        :shadow: md
        :margin: 2 2 0 0
diff --git a/docs/products/postgresql/getting-started.rst b/docs/products/postgresql/get-started.rst
similarity index 100%
rename from docs/products/postgresql/getting-started.rst
rename to docs/products/postgresql/get-started.rst
diff --git a/docs/tools/terraform/howto/vpc-peering-aws.rst b/docs/tools/terraform/howto/vpc-peering-aws.rst
index 675bf43b47..217128a44b 100644
--- a/docs/tools/terraform/howto/vpc-peering-aws.rst
+++ b/docs/tools/terraform/howto/vpc-peering-aws.rst
@@ -12,7 +12,7 @@ Prerequisites:
 * Create an :doc:`Aiven authentication token </docs/platform/howto/create_authentication_token>`.
-* `Install the AWS CLI <https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html>`_.
+* `Install the AWS CLI <https://docs.aws.amazon.com/cli/latest/userguide/get-started-install.html>`_.
 * `Configure the AWS CLI <https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html>`_.
diff --git a/index.rst b/index.rst
index 0445c034c6..1dad969650 100644
--- a/index.rst
+++ b/index.rst
@@ -253,7 +253,7 @@ Automation
 A public API you can use for programmatic integrations.
-    .. button-link:: docs/tools/api
+    .. button-link:: https://docs.aiven.io/docs/tools/api
        :color: primary
        :outline: