This repository has been archived by the owner on Jan 29, 2024. It is now read-only.

Commit
Merge pull request #2085 from aiven/harshini-flink-BQ-console
Added instructions for BigQuery integration to Flink via Aiven Console
juha-aiven authored Aug 11, 2023
2 parents 30adbee + f06242c commit e24dd0b
Showing 2 changed files with 32 additions and 7 deletions.
25 changes: 24 additions & 1 deletion docs/products/flink/howto/connect-bigquery.rst
@@ -9,7 +9,7 @@ Aiven for Apache Flink® is a fully managed service that provides distributed, s
Aiven for Apache Flink® uses `BigQuery Connector for Apache Flink <https://github.com/aiven/bigquery-connector-for-apache-flink>`_ as the connector to connect to Google BigQuery.


- Learn how to connect Aiven for Apache Flink® with Google BigQuery as a sink using the Aiven CLI.
+ Learn how to connect Aiven for Apache Flink® with Google BigQuery as a sink using :doc:`Aiven client </docs/tools/cli>` and `Aiven Console <https://console.aiven.io/>`_.


Prerequisites
@@ -205,3 +205,26 @@ Following is an example of a Google BigQuery SINK table:
If the integration is successfully created, the service credentials and project ID are automatically populated in the sink table (if you left them empty, as shown in the example above).


Configure integration using Aiven Console
--------------------------------------------

If you're using Google BigQuery for your data storage and analysis, you can seamlessly integrate it as a sink for Aiven for Apache Flink streams. To achieve this via the `Aiven Console <https://console.aiven.io/>`_, follow these steps:

1. Log in to `Aiven Console <https://console.aiven.io/>`_ and choose your project.
2. From the **Services** page, you can either :doc:`create a new Aiven for Apache Flink </docs/platform/howto/create_new_service>` service or select an existing service.
3. Next, configure the Google BigQuery service integration endpoint:

* Navigate to the **Projects** screen where all the services are listed.
* From the left sidebar, select **Integration endpoints**.
* Select **Google Cloud BigQuery** from the list, and then select **Add new endpoint** or **Create new**.
* Enter an *Endpoint name*, *GCP Project ID*, *Google Service Account Credentials*, and select **Create**.

4. Select **Services** and access the Aiven for Apache Flink service where you plan to integrate the Google BigQuery endpoint.
5. If you're integrating with Aiven for Apache Flink for the first time, go to the **Overview** page and select **Get Started**. Alternatively, add a new integration in the **Data Flow** section by using the plus (+) button.
6. On the **Data Service integrations** screen, select the **Create external integration endpoint** tab.
7. Select the checkbox next to BigQuery, and choose the BigQuery endpoint from the list to integrate.
8. Select **Integrate**.

Once you have completed these steps, the integration will be ready, and you can start creating :doc:`Aiven for Apache Flink applications <../howto/create-flink-applications>` that use Google BigQuery as a sink.
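As an illustration of what such an application might contain, below is a hedged Flink SQL sketch of a BigQuery sink table. The table name, columns, and ``WITH`` option keys are assumptions made for illustration only, not confirmed options — check the `BigQuery Connector for Apache Flink <https://github.com/aiven/bigquery-connector-for-apache-flink>`_ documentation for the exact option names:

```sql
-- Hypothetical Flink SQL sink table for the BigQuery integration.
-- Option keys ('connector', 'project-id', 'dataset', 'table') and all
-- names are illustrative assumptions; consult the connector docs for
-- the real ones. Credentials and project ID can be left empty, since
-- the integration populates them automatically.
CREATE TABLE bigquery_sink (
    user_id INT,
    message STRING,
    ts      TIMESTAMP(3)
) WITH (
    'connector'  = 'bigquery',
    'project-id' = '',              -- auto-populated by the integration
    'dataset'    = 'my_dataset',
    'table'      = 'my_table'
)
```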


14 changes: 8 additions & 6 deletions docs/products/flink/howto/ext-kafka-flink-integration.rst
Original file line number Diff line number Diff line change
@@ -214,18 +214,20 @@ Configure integration using Aiven Console

If you have an external Apache Kafka service already running, you can integrate it with Aiven for Apache Flink using the `Aiven Console <https://console.aiven.io/>`_ by following these steps:

- 1. In the `Aiven Console <https://console.aiven.io/>`_, :doc:`create a new Aiven for Apache Flink </docs/platform/howto/create_new_service>` service or select an existing service.
- 2. Next, configure an external Apache Kafka service integration endpoint:
+ 1. Log in to `Aiven Console <https://console.aiven.io/>`_ and choose your project.
+ 2. From the **Services** page, you can either :doc:`create a new Aiven for Apache Flink </docs/platform/howto/create_new_service>` service or select an existing service.
+ 3. Next, configure an external Apache Kafka service integration endpoint:

* Navigate to the **Projects** screen where all the services are listed.
* From the left sidebar, select **Integration endpoints**.
* Select **External Apache Kafka** from the list, and then select **Add new endpoint**.
* Enter an *Endpoint name* and the *Bootstrap servers*. Then, choose a *Security protocol* from the dropdown list and select **Create**.

- 3. Access the Aiven for Apache Flink service where you plan to integrate the external Apache Kafka endpoint.
- 4. If it is the first integration for the selected service, select **Get Started** in the service **Overview** screen, or use the plus (+) button to add a new integration in the **Data Flow** section.
- 5. On the **Data Service integrations** screen, select the checkbox next to Aiven for Apache Kafka, and choose the external Apache Kafka endpoint from the list to integrate.
- 6. Select **Integrate**.
+ 4. Select **Services** from the left sidebar, and access the Aiven for Apache Flink service where you plan to integrate the external Apache Kafka endpoint.
+ 5. If you're integrating with Aiven for Apache Flink for the first time, go to the **Overview** page and select **Get Started**. Alternatively, add a new integration in the **Data Flow** section by using the plus (+) button.
+ 6. On the **Data Service integrations** screen, select the **Create external integration endpoint** tab.
+ 7. Select the checkbox next to **Apache Kafka**, and choose the external Apache Kafka endpoint from the list to integrate.
+ 8. Select **Integrate**.

Once you have completed these steps, the integration will be ready, and you can start creating :doc:`Aiven for Apache Flink applications <../howto/create-flink-applications>` that use the external Apache Kafka service as either a source or sink.
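For instance, a source table over a topic in the external Kafka service might look like the following Flink SQL sketch. The topic, columns, and format are illustrative assumptions; the ``WITH`` keys shown are the open-source Flink Kafka SQL connector options, and in Aiven for Apache Flink the bootstrap servers come from the integrated endpoint, so what you enter in the console may differ:

```sql
-- Sketch of a Flink SQL source table reading from the external Kafka
-- service. Topic name, columns, and server address are placeholders;
-- with the endpoint integration, connection details are supplied by
-- the integration rather than typed in by hand.
CREATE TABLE kafka_source (
    user_id INT,
    message STRING,
    ts      TIMESTAMP(3)
) WITH (
    'connector' = 'kafka',
    'topic' = 'my_topic',
    'properties.bootstrap.servers' = 'kafka.example.com:9092',
    'format' = 'json',
    'scan.startup.mode' = 'earliest-offset'
)
```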

