+
+
+{% endblock %}
diff --git a/cosmos/plugin/templates/dbt_docs_not_set_up.html b/cosmos/plugin/templates/dbt_docs_not_set_up.html
new file mode 100644
index 000000000..1fcc6ef7f
--- /dev/null
+++ b/cosmos/plugin/templates/dbt_docs_not_set_up.html
@@ -0,0 +1,9 @@
+{% extends base_template %}
+{% block content %}
+ ⚠️ Your dbt docs are not set up yet! ⚠️
+
+
+ Read the Astronomer Cosmos docs for information on how to set up dbt docs.
+
+
+{% endblock %}
diff --git a/dev/dags/dbt/jaffle_shop/.gitignore b/dev/dags/dbt/jaffle_shop/.gitignore
index 49f147cb9..45d294b9a 100644
--- a/dev/dags/dbt/jaffle_shop/.gitignore
+++ b/dev/dags/dbt/jaffle_shop/.gitignore
@@ -2,3 +2,4 @@
target/
dbt_packages/
logs/
+!target/manifest.json
diff --git a/dev/docker-compose.yaml b/dev/docker-compose.yaml
index 23b012d15..5345f4b13 100644
--- a/dev/docker-compose.yaml
+++ b/dev/docker-compose.yaml
@@ -10,6 +10,7 @@ x-airflow-common:
environment:
&airflow-common-env
DB_BACKEND: postgres
+ AIRFLOW__COSMOS__DBT_DOCS_DIR: http://cosmos-docs.s3-website-us-east-1.amazonaws.com/
AIRFLOW__CORE__EXECUTOR: LocalExecutor
AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:pg_password@postgres:5432/airflow
AIRFLOW__CORE__FERNET_KEY: ''
diff --git a/docs/_static/location_of_dbt_docs_in_airflow.png b/docs/_static/location_of_dbt_docs_in_airflow.png
new file mode 100644
index 000000000..348a53c8e
Binary files /dev/null and b/docs/_static/location_of_dbt_docs_in_airflow.png differ
diff --git a/docs/configuration/generating-docs.rst b/docs/configuration/generating-docs.rst
index 6112ebcee..54ec80fc9 100644
--- a/docs/configuration/generating-docs.rst
+++ b/docs/configuration/generating-docs.rst
@@ -5,7 +5,9 @@ Generating Docs
dbt allows you to generate static documentation on your models, tables, and more. You can read more about it in the `official dbt documentation `_. For an example of what the docs look like with the ``jaffle_shop`` project, check out `this site `_.
-Many users choose to generate and serve these docs on a static website. This is a great way to share your data models with your team and other stakeholders.
+After generating the dbt docs, you can host them natively within Airflow via the Cosmos Airflow plugin; see `Hosting Docs `__ for more information.
+
+Alternatively, many users choose to serve these docs on a separate static website. This is a great way to share your data models with a broad array of stakeholders.
Cosmos offers two pre-built ways of generating and uploading dbt docs and a fallback option to run custom code after the docs are generated:
diff --git a/docs/configuration/hosting-docs.rst b/docs/configuration/hosting-docs.rst
new file mode 100644
index 000000000..5143a9f67
--- /dev/null
+++ b/docs/configuration/hosting-docs.rst
@@ -0,0 +1,127 @@
+.. _hosting-docs:
+
+Hosting Docs
+============
+
+dbt docs can be served directly from the Apache Airflow webserver with the Cosmos Airflow plugin, without requiring the user to set up anything outside of Airflow. This page describes hosting docs from the Airflow webserver; some users may instead opt to host docs externally.
+
+Overview
+~~~~~~~~
+
+The dbt docs are available in the Airflow menu under ``Browse > dbt docs``:
+
+.. image:: /_static/location_of_dbt_docs_in_airflow.png
+ :alt: Airflow UI - Location of dbt docs in menu
+ :align: center
+
+In order to access the dbt docs, you must specify the following config variables:
+
+- ``cosmos.dbt_docs_dir``: A path to where the docs are being hosted.
+- (Optional) ``cosmos.dbt_docs_conn_id``: A conn ID to use for a cloud storage deployment. If not specified *and* the URI points to a cloud storage platform, then the default conn ID for the AWS/Azure/GCP hook will be used.
+
+.. code-block:: cfg
+
+ [cosmos]
+ dbt_docs_dir = path/to/docs/here
+ dbt_docs_conn_id = my_conn_id
+
+or as an environment variable:
+
+.. code-block:: shell
+
+ AIRFLOW__COSMOS__DBT_DOCS_DIR="path/to/docs/here"
+ AIRFLOW__COSMOS__DBT_DOCS_CONN_ID="my_conn_id"
+
+The path can be either a folder in the local file system the webserver is running on, or a URI to a cloud storage platform (S3, GCS, Azure).
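The scheme of this path determines which storage backend the plugin reads from; the plugin ships per-backend helpers (``open_s3_file``, ``open_gcs_file``, ``open_azure_file``, ``open_http_file``). A minimal sketch of that scheme dispatch (illustrative function and return values, not Cosmos's exact implementation):

```python
from urllib.parse import urlsplit


def choose_backend(docs_dir: str) -> str:
    """Pick a storage backend from the scheme of ``dbt_docs_dir``."""
    scheme = urlsplit(docs_dir).scheme
    if scheme == "s3":
        return "s3"  # read via an S3 hook
    if scheme == "gs":
        return "gcs"  # read via a GCS hook
    if scheme == "wasb":
        return "azure"  # read via an Azure WASB hook
    if scheme in ("http", "https"):
        return "http"
    return "local"  # no scheme: plain filesystem path
```
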
+
+Host from Cloud Storage
+~~~~~~~~~~~~~~~~~~~~~~~
+
+For typical users, the recommended setup for hosting dbt docs would look like this:
+
+1. Generate the docs via one of Cosmos' pre-built dbt docs operators (see `Generating Docs `__ for more information).
+2. Set ``cosmos.dbt_docs_dir`` to the location where the docs were uploaded.
+3. If you want to use a conn ID other than the default connection, set your ``cosmos.dbt_docs_conn_id``. Otherwise, leave this blank.
+
+AWS S3 Example
+^^^^^^^^^^^^^^
+
+.. code-block:: cfg
+
+ [cosmos]
+ dbt_docs_dir = s3://my-bucket/path/to/docs
+ dbt_docs_conn_id = aws_default
+
+.. code-block:: shell
+
+ AIRFLOW__COSMOS__DBT_DOCS_DIR="s3://my-bucket/path/to/docs"
+ AIRFLOW__COSMOS__DBT_DOCS_CONN_ID="aws_default"
+
+Google Cloud Storage Example
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. code-block:: cfg
+
+ [cosmos]
+ dbt_docs_dir = gs://my-bucket/path/to/docs
+ dbt_docs_conn_id = google_cloud_default
+
+.. code-block:: shell
+
+ AIRFLOW__COSMOS__DBT_DOCS_DIR="gs://my-bucket/path/to/docs"
+ AIRFLOW__COSMOS__DBT_DOCS_CONN_ID="google_cloud_default"
+
+Azure Blob Storage Example
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+.. code-block:: cfg
+
+ [cosmos]
+ dbt_docs_dir = wasb://my-container/path/to/docs
+ dbt_docs_conn_id = wasb_default
+
+.. code-block:: shell
+
+ AIRFLOW__COSMOS__DBT_DOCS_DIR="wasb://my-container/path/to/docs"
+ AIRFLOW__COSMOS__DBT_DOCS_CONN_ID="wasb_default"
+
+Host from Local Storage
+~~~~~~~~~~~~~~~~~~~~~~~
+
+By default, Cosmos will not generate docs on the fly. Local storage only works if you are pre-compiling your dbt project before deployment.
+
+If your Airflow deployment process involves running ``dbt compile``, add ``dbt docs generate`` to your deployment process as well, so that all the artifacts necessary to serve the dbt docs from local storage are generated.
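For example, the deployment step that pre-compiles the project might run (paths and invocation are illustrative; adapt to your CI setup):

```shell
# Run from the dbt project directory during deployment.
dbt compile
dbt docs generate   # writes index.html and catalog.json into target/
```
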
+
+dbt docs are generated in the ``target`` folder by default, so ``target`` will also be your default docs folder.
+
+For example, if your dbt project directory is ``/usr/local/airflow/dags/my_dbt_project``, then by default your dbt docs directory will be ``/usr/local/airflow/dags/my_dbt_project/target``:
+
+.. code-block:: cfg
+
+ [cosmos]
+ dbt_docs_dir = /usr/local/airflow/dags/my_dbt_project/target
+
+.. code-block:: shell
+
+ AIRFLOW__COSMOS__DBT_DOCS_DIR="/usr/local/airflow/dags/my_dbt_project/target"
+
+Serving docs from local storage has the downside that some values in the dbt docs can become stale unless the docs are periodically regenerated and redeployed:
+
+- Row counts reported in the docs.
+- The compiled SQL for incremental models before and after the first run.
+
+Host from HTTP/HTTPS
+~~~~~~~~~~~~~~~~~~~~
+
+.. code-block:: cfg
+
+ [cosmos]
+ dbt_docs_dir = https://my-site.com/path/to/docs
+
+.. code-block:: shell
+
+ AIRFLOW__COSMOS__DBT_DOCS_DIR="https://my-site.com/path/to/docs"
+
+
+You do not need to set a ``dbt_docs_conn_id`` when using HTTP/HTTPS.
+If you do set the ``dbt_docs_conn_id``, then the ``HttpHook`` will be used.
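Under HTTP/HTTPS hosting, the docs artifacts are fetched relative to the configured directory. A sketch of the URL construction (hypothetical helper, not Cosmos's API):

```python
def docs_url(docs_dir: str, artifact: str = "index.html") -> str:
    # Join the configured docs dir with a dbt docs artifact name,
    # tolerating a trailing slash on the configured value.
    return docs_dir.rstrip("/") + "/" + artifact
```
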
diff --git a/docs/configuration/index.rst b/docs/configuration/index.rst
index 8c282be03..919ed9b1e 100644
--- a/docs/configuration/index.rst
+++ b/docs/configuration/index.rst
@@ -16,6 +16,7 @@ Cosmos offers a number of configuration options to customize its behavior. For m
Parsing Methods
Configuring Lineage
Generating Docs
+ Hosting Docs
Scheduling
Testing Behavior
Selecting & Excluding
diff --git a/pyproject.toml b/pyproject.toml
index 522431da7..7758f9669 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -95,6 +95,9 @@ azure-container-instance = [
[project.entry-points.cosmos]
provider_info = "cosmos:get_provider_info"
+[project.entry-points."airflow.plugins"]
+cosmos = "cosmos.plugin:CosmosPlugin"
+
[project.urls]
Homepage = "https://github.com/astronomer/astronomer-cosmos"
Documentation = "https://astronomer.github.io/astronomer-cosmos"
diff --git a/tests/plugin/__init__.py b/tests/plugin/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/tests/plugin/test_plugin.py b/tests/plugin/test_plugin.py
new file mode 100644
index 000000000..df33ae13a
--- /dev/null
+++ b/tests/plugin/test_plugin.py
@@ -0,0 +1,223 @@
+# dbt-core relies on Jinja2>3, whereas Flask<2 relies on an incompatible version of Jinja2.
+#
+# This discrepancy causes the automated integration tests to fail, as dbt-core is installed in the same
+# environment as apache-airflow.
+#
+# We can get around this by patching the jinja2 namespace to include the deprecated objects:
+try:
+ import flask # noqa: F401
+except ImportError:
+ import markupsafe
+ import jinja2
+
+ jinja2.Markup = markupsafe.Markup
+ jinja2.escape = markupsafe.escape
+
+from unittest.mock import mock_open, patch, MagicMock, PropertyMock
+
+import sys
+import pytest
+from airflow.configuration import conf
+from airflow.exceptions import AirflowConfigException
+from airflow.utils.db import initdb, resetdb
+from airflow.www.app import cached_app
+from airflow.www.extensions.init_appbuilder import AirflowAppBuilder
+from flask.testing import FlaskClient
+
+import cosmos.plugin
+
+from cosmos.plugin import (
+ dbt_docs_view,
+ iframe_script,
+ open_gcs_file,
+ open_azure_file,
+ open_http_file,
+ open_s3_file,
+ open_file,
+)
+
+
+original_conf_get = conf.get
+
+
+def _get_text_from_response(response) -> str:
+ # Airflow < 2.4 uses an old version of Werkzeug that does not have Response.text.
+ if not hasattr(response, "text"):
+ return response.get_data(as_text=True)
+ else:
+ return response.text
+
+
+@pytest.fixture(scope="module")
+def app() -> FlaskClient:
+ initdb()
+
+ app = cached_app(testing=True)
+ appbuilder: AirflowAppBuilder = app.extensions["appbuilder"]
+
+ appbuilder.sm.check_authorization = lambda *args, **kwargs: True
+
+ if dbt_docs_view not in appbuilder.baseviews:
+ appbuilder._check_and_init(dbt_docs_view)
+ appbuilder.register_blueprint(dbt_docs_view)
+
+ yield app.test_client()
+
+ resetdb(skip_init=True)
+
+
+def test_dbt_docs(monkeypatch, app):
+ def conf_get(section, key, *args, **kwargs):
+ if section == "cosmos" and key == "dbt_docs_dir":
+ return "path/to/docs/dir"
+ else:
+ return original_conf_get(section, key, *args, **kwargs)
+
+ monkeypatch.setattr(conf, "get", conf_get)
+
+ response = app.get("/cosmos/dbt_docs")
+
+ assert response.status_code == 200
+ assert "