Emit telemetry to Scarf during DAG run #1397
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #1397      +/-   ##
==========================================
- Coverage   96.52%   96.52%   -0.01%
==========================================
  Files          71       73       +2
  Lines        4228     4312      +84
==========================================
+ Hits         4081     4162      +81
- Misses        147      150       +3
```

☔ View full report in Codecov by Sentry.
Mostly looks good to me. I have some minor questions/suggestions inline.
Export telemetry related to Cosmos usage to [Scarf](https://about.scarf.sh/). This data assists the project maintainers in better understanding how Cosmos is used. Insights from this telemetry are critical for prioritizing patches, minor releases, and security fixes. Additionally, this information supports critical decisions related to the development roadmap.

Deployments and individual users can opt out of analytics by setting the configuration:

```
[cosmos]
enable_telemetry: False
```

As described in the [official documentation](https://docs.scarf.sh/gateway/#do-not-track), it is also possible to opt out by setting one of the following environment variables:

```commandline
AIRFLOW__COSMOS__ENABLE_TELEMETRY=False
DO_NOT_TRACK=True
SCARF_NO_ANALYTICS=True
```

In addition to Scarf's default data collection, Cosmos collects the following information when running Cosmos-powered DAGs:

- Cosmos version
- Airflow version
- Python version
- Operating system & machine architecture
- Event type
- DAG hash
- Total tasks
- Total Cosmos tasks

(A minimal, illustrative sketch of such a ping follows this description.)

No user-identifiable information (including the IP address) is stored in Scarf, although Scarf infers information from the IP, such as location, and stores that. The data collection is GDPR compliant.

The Apache Software Foundation follows the same strategy in many of its open-source projects, including Airflow (apache/airflow#39510).

Example of visualisation of the data via the Scarf UI:

<img width="1235" alt="Screenshot 2024-12-19 at 10 22 59" src="https://github.com/user-attachments/assets/12b9fbd4-2fdd-4e62-9876-defee3c4d8da" />
<img width="1231" alt="Screenshot 2024-12-19 at 10 23 13" src="https://github.com/user-attachments/assets/f98b849c-99be-4764-9e6d-cb7730da3688" />
<img width="1227" alt="Screenshot 2024-12-19 at 10 23 21" src="https://github.com/user-attachments/assets/421b7581-c641-422a-8469-252ba5a2fd33" />
<img width="1237" alt="Screenshot 2024-12-19 at 10 23 28" src="https://github.com/user-attachments/assets/2e5995a2-fe09-4017-a625-4dd4a60028d0" />
<img width="1248" alt="Screenshot 2024-12-19 at 10 23 51" src="https://github.com/user-attachments/assets/64a8a07f-df56-493c-a3f5-0f5165fd58e8" />
<img width="1229" alt="Screenshot 2024-12-19 at 10 24 01" src="https://github.com/user-attachments/assets/1e3e8b8d-b11d-4b31-8b46-853d541b01b8" />
<img width="1240" alt="Screenshot 2024-12-19 at 10 24 11" src="https://github.com/user-attachments/assets/b5e79cc7-4e2e-44b2-a94b-891b9226b152" />
<img width="1241" alt="Screenshot 2024-12-19 at 10 24 20" src="https://github.com/user-attachments/assets/2fb5d666-d749-416d-acf8-4a3bc94ba014" />
<img width="1234" alt="Screenshot 2024-12-19 at 10 24 31" src="https://github.com/user-attachments/assets/353eb82c-44d2-44ec-87e2-ace7138132f5" />
<img width="1245" alt="Screenshot 2024-12-19 at 10 24 39" src="https://github.com/user-attachments/assets/4a637a2a-14ad-41a8-b7fd-db186ec74357" />
<img width="1233" alt="Screenshot 2024-12-19 at 10 24 48" src="https://github.com/user-attachments/assets/bec4e2b0-49c3-4289-8f9b-3285db9ec40c" />

Closes: #1143
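For illustration only, below is a minimal sketch of what a DAG-run telemetry ping honouring the opt-out switches above could look like. The gateway URL, metric names, and helper functions are assumptions made for this example, not the actual Cosmos implementation:

```python
import os
import platform
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed gateway endpoint for the example; the real Cosmos/Scarf URL differs.
SCARF_GATEWAY_URL = "https://example.gateway.scarf.sh/astronomer-cosmos"

_TRUTHY = {"1", "t", "true", "y", "yes"}
_FALSY = {"0", "f", "false", "n", "no"}


def telemetry_disabled() -> bool:
    """Honour the opt-out switches described above."""
    if os.getenv("AIRFLOW__COSMOS__ENABLE_TELEMETRY", "").strip().lower() in _FALSY:
        return True
    return any(
        os.getenv(var, "").strip().lower() in _TRUTHY
        for var in ("DO_NOT_TRACK", "SCARF_NO_ANALYTICS")
    )


def emit_usage_metrics(event_type: str, dag_hash: str, total_tasks: int, cosmos_tasks: int) -> None:
    """Send one anonymous usage ping; telemetry must never break a DAG run."""
    if telemetry_disabled():
        return
    params = urlencode(
        {
            "cosmos_version": "1.8.0",               # would come from cosmos.__version__
            "airflow_version": "2.10.0",             # would come from airflow.__version__
            "python_version": platform.python_version(),
            "platform_system": platform.system(),    # operating system
            "platform_machine": platform.machine(),  # machine architecture
            "event_type": event_type,
            "dag_hash": dag_hash,
            "total_tasks": total_tasks,
            "total_cosmos_tasks": cosmos_tasks,
        }
    )
    try:
        # Scarf only needs the request to arrive; the response body is irrelevant.
        urlopen(f"{SCARF_GATEWAY_URL}?{params}", timeout=5).close()
    except Exception:
        pass


if __name__ == "__main__":
    emit_usage_metrics("dag_run", dag_hash="a1b2c3", total_tasks=12, cosmos_tasks=10)
```

The sketch only shows the shape of the data listed earlier and how the opt-out variables can short-circuit the ping; see the PR diff for the actual implementation.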
**New Features**

* Support customizing Airflow operator arguments per dbt node by @wornjs in #1339. [More information](https://astronomer.github.io/astronomer-cosmos/getting_started/custom-airflow-properties.html).
* Support uploading dbt artifacts to remote cloud storages via callback by @pankajkoti in #1389. [Read more](https://astronomer.github.io/astronomer-cosmos/configuration/callbacks.html).
* Add support to ``TestBehavior.BUILD`` by @tatiana in #1377. [Documentation](https://astronomer.github.io/astronomer-cosmos/configuration/testing-behavior.html). (An illustrative sketch follows at the end of these notes.)
* Add support for the "at" operator when using ``LoadMode.DBT_MANIFEST`` or ``CUSTOM`` by @benjy44 in #1372
* Add dbt clone operator by @pankajastro in #1326, as documented [here](https://astronomer.github.io/astronomer-cosmos/getting_started/operators.html).
* Support rendering tasks with non-ASCII characters by @t0momi219 in #1278. [Read more](https://astronomer.github.io/astronomer-cosmos/configuration/task-display-name.html)
* Add warning callback on source freshness by @pankajastro in #1400. [Read more](https://astronomer.github.io/astronomer-cosmos/configuration/source-nodes-rendering.html#on-warning-callback-callback)
* Add Oracle Profile mapping by @slords and @pankajkoti in #1190 and #1404
* Emit telemetry to Scarf during DAG run by @tatiana in #1397
* Save tasks map as ``DbtToAirflowConverter`` property by @internetcoffeephone and @hheemskerk in #1362

**Bug Fixes**

* Fix the mock value of port in ``TrinoBaseProfileMapping`` to be an integer by @dwolfeu in #1322
* Fix access to the ``dbt docs`` menu item outside of Astro cloud by @tatiana in #1312
* Add missing ``DbtSourceGcpCloudRunJobOperator`` in module ``cosmos.operators.gcp_cloud_run_job`` by @anai-s in #1290
* Support building ``DbtDag`` without setting paths in ``ProjectConfig`` by @tatiana in #1307
* Fix parsing dbt ls outputs that contain JSONs that are not dbt nodes by @tatiana in #1296
* Fix Snowflake Profile mapping when using AWS default region by @tatiana in #1406
* Fix DAG rendering for the taskflow + ``DbtTaskGroup`` combo by @pankajastro in #1360

**Enhancements**

* Improve dbt command execution logs to troubleshoot ``None`` values by @tatiana in #1392
* Add logging of stdout to dbt graph run_command by @KarolGongola in #1390
* Save tasks map as ``DbtToAirflowConverter`` property by @internetcoffeephone and @hheemskerk in #1362
* Support rendering build operator task-id with non-ASCII characters by @pankajastro in #1415

**Docs**

* Remove extra ` char from docs by @pankajastro in #1345
* Add limitation about copying target dir files to remote by @pankajkoti in #1305
* Generalise example from README by @ReadytoRocc in #1311
* Add security policy by @tatiana, @chaosmaw and @lzdanski in #1385
* Mention in documentation that the callback functionality is supported in ``ExecutionMode.VIRTUALENV`` by @pankajkoti in #1401

**Others**

* Restore Jaffle Shop so that ``basic_cosmos_dag`` works as documented by @tatiana in #1374
* Remove Pytest durations from tests scripts by @tatiana in #1383
* Remove typing-extensions as dependency by @pankajastro in #1381
* Pin dbt-databricks version to < 1.9 by @pankajastro in #1376
* Refactor ``dbt-sqlite`` tests to use ``dbt-postgres`` by @pankajastro in #1366
* Remove 'dbt-core<1.8.9' pin by @tatiana in #1371
* Remove dependency ``eval_type_backport`` by @tatiana in #1370
* Enable kubernetes tests for dbt>=1.8 by @pankajastro in #1364
* CI workaround: pin dbt-core, disable SQLite tests, and correctly ignore the clone test to pass CI by @pankajastro in #1337
* Enable Azure task in the remote store manifest example DAG by @pankajkoti in #1333
* Enable GCP remote manifest task by @pankajastro in #1332
* Add exempt label option in GH action stale job by @pankajastro in #1328
* Add integration test for source node rendering by @pankajastro in #1327
* Fix vulnerability issue on docs dependency by @tatiana in #1313
* Add postgres pod status check for k8s tests in CI by @pankajkoti in #1320
* [CI] Reduce the time taken to run tests in the CI from 5h to 11min by @tatiana in #1297
* Enable secret detection precommit check by @pankajastro in #1302
* Fix security vulnerability by not pinning Airflow 2.10.0 by @tatiana in #1298
* Fix Netlify build timeouts by @tatiana in #1294
* Add stalebot to label/close stale PRs and issues by @tatiana in #1288
* Unpin dbt-databricks version by @pankajastro in #1409
* Fix source resource type tests by @pankajastro in #1405
* Increase performance tests models by @tatiana in #1403
* Drop running 1000 models in the CI by @pankajkoti in #1411
* Fix releasing package to PyPI by @tatiana in #1396
* Pre-commit hook updates in #1394, #1373, #1358, #1340, #1331, #1314, #1301

Co-authored-by: Pankaj Koti <[email protected]>
Co-authored-by: Pankaj Singh <[email protected]>

Closes: #1193
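As a quick illustration of the new ``TestBehavior.BUILD`` option listed above, a minimal, hypothetical ``DbtDag`` configuration could look as follows. The project path, profile details, and schedule are placeholders, not values from this PR:

```python
from datetime import datetime

from cosmos import DbtDag, ProfileConfig, ProjectConfig, RenderConfig
from cosmos.constants import TestBehavior

# Hypothetical project path and profiles.yml location -- replace with your own.
dag = DbtDag(
    dag_id="jaffle_shop_build",
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt/jaffle_shop"),
    profile_config=ProfileConfig(
        profile_name="default",
        target_name="dev",
        profiles_yml_filepath="/usr/local/airflow/dags/dbt/profiles.yml",
    ),
    # BUILD runs `dbt build` per node, combining the model and its tests
    # in one task instead of separate run/test tasks.
    render_config=RenderConfig(test_behavior=TestBehavior.BUILD),
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
)
```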