Update contributing guide docs (#591)
- Remove repetitive instructions
- Add instructions for Linux to prevent permission errors when running the docker-compose command

Closes #588 (cherry picked from commit d829a04)
Showing 1 changed file with 57 additions and 38 deletions.
To contribute to the cosmos project:

#. Link your issue to the pull request
#. Once developments are complete on your feature branch, request a review and it will be merged once approved.
Setup local development on host machine
---------------------------------------

This guide will set up ``astronomer-cosmos`` development on the host machine. First, clone the ``astronomer-cosmos`` repo and enter the repo directory:

.. code-block:: bash

    git clone https://github.com/astronomer/astronomer-cosmos.git
    cd astronomer-cosmos/

Then install ``airflow`` and ``astronomer-cosmos`` using python-venv:

.. code-block:: bash

    python3 -m venv env && source env/bin/activate
    pip3 install "apache-airflow[cncf.kubernetes,openlineage]"
    pip3 install -e ".[dbt-postgres,dbt-databricks]"
Set ``AIRFLOW_HOME`` to the ``dev/`` directory and disable loading the example DAGs:

.. code-block:: bash

    export AIRFLOW_HOME=$(pwd)/dev/
    export AIRFLOW__CORE__LOAD_EXAMPLES=false
Then, run Airflow in standalone mode. The command below will create a new user (if one does not exist) and start the necessary Airflow components (webserver, scheduler and triggerer):

By default, Airflow uses SQLite as its database. You can override this by setting the variable ``AIRFLOW__DATABASE__SQL_ALCHEMY_CONN`` to a SQL connection string.

.. code-block:: bash

    airflow standalone

Once Airflow is up, you can access the Airflow UI at ``http://localhost:8080``.
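As an illustrative sketch of the database override mentioned above, the connection string below points Airflow at a local Postgres server; the host, port, and credentials are assumptions matching the Postgres container used elsewhere in this guide, so adjust them to your own setup:

```shell
# Illustrative only: use a local Postgres instead of the default SQLite.
# Assumes a Postgres server on localhost:5432 with user/password "postgres".
export AIRFLOW__DATABASE__SQL_ALCHEMY_CONN="postgresql+psycopg2://postgres:postgres@localhost:5432/postgres"
echo "$AIRFLOW__DATABASE__SQL_ALCHEMY_CONN"
```

Set this variable before running ``airflow standalone`` so the server picks it up at startup.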
Note: whenever you want to start the development server, you need to activate the ``virtualenv`` and set the environment variables again.
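Since the virtualenv activation and the environment variables must be repeated in every new shell, they can be collected in a small helper script. This is only a sketch, and the ``dev-env.sh`` file name is made up; you would load it with ``source dev-env.sh`` from the repo root:

```shell
# dev-env.sh -- hypothetical convenience script for this guide's setup.
# The activate file only exists once the venv from the setup steps has
# been created, hence the guard.
if [ -f env/bin/activate ]; then
    . env/bin/activate
fi
export AIRFLOW_HOME="$(pwd)/dev/"
export AIRFLOW__CORE__LOAD_EXAMPLES=false
```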
Using Docker Compose for local development
------------------------------------------

It is also possible to build the development environment using Docker Compose.

To launch a local sandbox with Docker Compose, first clone the ``astronomer-cosmos`` repo and enter the repo directory:

.. code-block:: bash

    git clone https://github.com/astronomer/astronomer-cosmos.git
    cd astronomer-cosmos/

To prevent permission errors on **Linux**, you must create the dags, logs, and plugins folders and change their owner to the user ``astro`` with user ID 50000. To do this, run the following commands:

.. code-block:: bash

    mkdir -p dev/dags dev/logs dev/plugins
    sudo chown -R 50000:50000 dev/dags dev/logs dev/plugins
Then, run the Docker Compose command:

.. code-block:: bash

    docker compose -f dev/docker-compose.yaml up -d --build

Once the sandbox is up, you can access the Airflow UI at ``http://localhost:8080``.
Testing application with hatch
------------------------------

We currently use `hatch <https://github.com/pypa/hatch>`_ for building and distributing ``astronomer-cosmos``.

The tool can also be used for local development. The `pyproject.toml <https://github.com/astronomer/astronomer-cosmos/blob/main/pyproject.toml>`_ file currently defines a matrix of supported Python and Airflow versions against which the tests can be run.

For instance, to run the tests using Python 3.10 and Apache Airflow 2.5, use the following:

.. code-block:: bash

    hatch run tests.py3.10-2.5:test-cov

It is also possible to run the tests for all the matrix combinations:

.. code-block:: bash

    hatch run tests:test-cov
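The environment names above follow a fixed ``tests.py<python>-<airflow>`` pattern, so the command for any supported combination can be composed mechanically. A small sketch (the ``PY_VER`` and ``AF_VER`` variable names are illustrative, not anything hatch itself uses):

```shell
# Compose the hatch command for a given Python/Airflow pair.
# PY_VER and AF_VER are illustrative shell variables for this sketch.
PY_VER=3.10
AF_VER=2.5
HATCH_CMD="hatch run tests.py${PY_VER}-${AF_VER}:test-cov"
echo "$HATCH_CMD"
```

Pick a pair that is actually defined in the matrix in ``pyproject.toml``; hatch will report an error for combinations that are not declared there.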
The integration tests rely on Postgres. It is possible to host Postgres using Docker, for example:

.. code-block:: bash

    docker run --name postgres -p 5432:5432 -p 5433:5433 -e POSTGRES_PASSWORD=postgres postgres

To run the integration tests for the first time, use:

.. code-block:: bash

    export AIRFLOW_HOME=`pwd`
    export AIRFLOW_CONN_AIRFLOW_DB=postgres://postgres:postgres@0.0.0.0:5432/postgres
    hatch run tests.py3.8-2.5:test-integration-setup
    hatch run tests.py3.8-2.5:test-integration

When testing against the same Airflow and Python versions, subsequent runs of the integration tests can use:

.. code-block:: bash

    hatch run tests.py3.8-2.5:test-integration
Pre-Commit
----------

We use pre-commit to run a number of checks on the code before committing. To install pre-commit, run:
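As a sketch of the usual setup, pre-commit is typically installed with pip and then registered as a git hook in the clone; these are the standard commands from pre-commit's own documentation, not necessarily the exact ones this project prescribes:

```shell
# Standard pre-commit setup (per pre-commit's documentation):
pip install pre-commit      # install the tool
pre-commit install          # register the git hook in this clone
pre-commit run --all-files  # optionally run every check once up front
```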