Add instruction to use local machine as development
raffifu committed Oct 13, 2023
1 parent dac117a commit dcfa80a
Showing 1 changed file, docs/contributing.rst, with 69 additions and 25 deletions.
To contribute to the cosmos project:
#. Link your issue to the pull request
#. Once developments are complete on your feature branch, request a review and it will be merged once approved.
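
The branch-and-PR steps above can be sketched with git. The branch name, file, and commit message below are illustrative only, and the sketch runs in a throwaway repository so the commands stand alone:

```shell
# Create a feature branch, stage work, and commit it for review.
# (Illustrative names; in a real clone you would branch off main.)
tmp=$(mktemp -d) && cd "$tmp" && git init -q .
git checkout -q -b feature/my-change
echo "demo change" > notes.txt
git add notes.txt
git -c user.email="dev@example.com" -c user.name="dev" commit -qm "Add notes"
git branch --show-current
# prints: feature/my-change
```

In a real clone you would follow up with ``git push -u origin feature/my-change`` and open the pull request from that branch.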

Setup local development on host machine
---------------------------------------

This guide sets up Cosmos development on your host machine. First, clone the ``astronomer-cosmos`` repo and enter the repo directory:

.. code-block:: bash

    git clone https://github.com/astronomer/astronomer-cosmos.git
    cd astronomer-cosmos/


Then install ``airflow`` and ``astronomer-cosmos`` in a Python virtual environment:

.. code-block:: bash

    python3 -m venv env && source env/bin/activate
    pip3 install apache-airflow
    pip3 install -e ".[dbt-postgres,dbt-databricks]"


Set the Airflow home to the ``dev/`` directory and disable loading the example DAGs:

.. code-block:: bash

    export AIRFLOW_HOME=$(pwd)/dev/
    export AIRFLOW__CORE__LOAD_EXAMPLES=false


By default Airflow uses SQLite as its database; you can override this by setting the variable ``AIRFLOW__DATABASE__SQL_ALCHEMY_CONN`` to a SQL connection string.

Then, initialize the database and create a new user. The command below creates a user ``admin`` with the password ``admin``:

.. code-block:: bash

    airflow db init
    airflow users create \
        --email admin@admin.com --firstname admin \
        --lastname admin --password admin \
        --role Admin --username admin

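
If you prefer Postgres over the default SQLite metadata database, the override can look like the sketch below. The driver, credentials, host, and database name are placeholders to adapt, not values taken from this repo:

```shell
# Point Airflow's metadata DB at a local Postgres instance (placeholder values).
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN="postgresql+psycopg2://postgres:postgres@localhost:5432/postgres"
echo "$AIRFLOW__DATABASE__SQL_ALCHEMY_CONN"
```

Set this variable before running ``airflow db init`` so the metadata tables are created in Postgres rather than SQLite.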

Run each of the following Airflow components in a separate terminal:

.. code-block:: bash

    airflow webserver
    airflow scheduler
    airflow triggerer

Once Airflow is up, you can access the Airflow UI at ``http://localhost:8080``.

Note: whenever you want to start the development server, you need to activate the ``virtualenv`` and set the environment variables again.
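
Because these variables are per-shell, one convenient sketch (not part of the repo) is to keep them in a small env file and source it whenever you start developing:

```shell
# Write the development variables once...
cat > dev.env <<'EOF'
export AIRFLOW_HOME=$(pwd)/dev/
export AIRFLOW__CORE__LOAD_EXAMPLES=false
EOF

# ...and re-source them in any new shell session.
. ./dev.env
echo "$AIRFLOW__CORE__LOAD_EXAMPLES"
# prints: false
```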

Using Docker Compose for local development
------------------------------------------

It is also possible to build the development environment using Docker Compose.


Local Sandbox
+++++++++++++

To launch a local sandbox with docker compose, first clone the ``astronomer-cosmos`` repo and enter the repo directory:

.. code-block:: bash

    git clone https://github.com/astronomer/astronomer-cosmos.git
    cd astronomer-cosmos/

Then, run the docker compose command:

Once the sandbox is up, you can access the Airflow UI at ``http://localhost:8080``.

Testing application with hatch
------------------------------

We currently use `hatch <https://github.com/pypa/hatch>`_ for building and distributing ``astronomer-cosmos``.

The tool can also be used for local development. The `pyproject.toml <https://github.com/astronomer/astronomer-cosmos/blob/main/pyproject.toml>`_ file currently defines a matrix of supported Python and Airflow versions against which the tests can be run.

For instance, to run the tests using Python 3.10 and Apache Airflow 2.5, use the following:

.. code-block:: bash

    hatch run tests.py3.10-2.5:test-cov

It is also possible to run the tests against all the matrix combinations:

.. code-block:: bash

    hatch run tests:test-cov

The integration tests rely on Postgres. It is possible to host Postgres using Docker, for example:

.. code-block:: bash

    docker run --name postgres -p 5432:5432 -p 5433:5433 -e POSTGRES_PASSWORD=postgres postgres

To run the integration tests for the first time, use:

.. code-block:: bash

    export AIRFLOW_HOME=`pwd`
    export AIRFLOW_CONN_AIRFLOW_DB=postgres://postgres:postgres@localhost:5432/postgres
    hatch run tests.py3.8-2.5:test-integration-setup
    hatch run tests.py3.8-2.5:test-integration

If testing against the same Airflow and Python versions, subsequent runs of the integration tests can use:

.. code-block:: bash

    hatch run tests.py3.8-2.5:test-integration

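
The environment names in the commands above follow a ``tests.py<python-version>-<airflow-version>`` pattern, matching the matrix defined in ``pyproject.toml``. A tiny sketch of composing one (the version values are just examples):

```shell
# Build a hatch test-environment name from the desired versions.
PYTHON_VERSION="3.8"
AIRFLOW_VERSION="2.5"
HATCH_ENV="tests.py${PYTHON_VERSION}-${AIRFLOW_VERSION}"
echo "hatch run ${HATCH_ENV}:test-integration"
# prints: hatch run tests.py3.8-2.5:test-integration
```
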
Pre-Commit
----------

We use pre-commit to run a number of checks on the code before committing. To install pre-commit, run:

