"
+ },
+ {
+ "name": "Visa",
+ "slug": "visa",
+ "imageUrl": "/img/logos/companies/visa.png",
+ "imageSize": "large",
+ "link": "https://blog.datahubproject.io/how-visa-uses-datahub-to-scale-data-governance-cace052d61c5",
+ "linkType": "blog",
+ "tagline": "How Visa uses DataHub to scale data governance",
+ "category": "Financial & Fintech",
+ "description": "\"We found DataHub to provide excellent coverage for our needs. What we appreciate most about DataHub is its powerful API platform.\"
— Jean-Pierre Dijcks, Sr. Dir. Product Management at VISA
"
+ },
+ {
+ "name": "Optum",
+ "slug": "optum",
+ "imageUrl": "/img/logos/companies/optum.jpg",
+ "imageSize": "medium",
+ "link": "https://opensource.optum.com/blog/2022/03/23/data-mesh-via-datahub",
+ "linkType": "blog",
+ "tagline": "Data Mesh via DataHub",
+ "category": "And More",
+ "description": "“DataHub’s event driven architecture provides us a mechanism to act on any metadata changes in real time. This allows us to perform various actions like provisioning access to a data product, notifying consumers on any schema changes that may affect their application or triggering data movement jobs to move data from source to sink platforms.”"
+ },
+ {
+ "name": "Pinterest",
+ "slug": "pinterest",
+ "imageUrl": "/img/logos/companies/pinterest.png",
+ "imageSize": "small",
+ "link": "https://www.youtube.com/watch?v=YoxTg8tQSwg&feature=youtu.be",
+ "linkType": "blog",
+ "tagline": "DataHub Project at Pinterest",
+ "category": "B2B & B2C",
+ "description": "Pinterest adopted a DataHub project to enhance metadata management for its big data query platform, facilitating better data navigation and understanding."
+ },
+ {
+ "name": "Airtel",
+ "slug": "airtel",
+ "imageUrl": "/img/logos/companies/airtel.png",
+ "imageSize": "large",
+ "link": "https://www.youtube.com/watch?v=yr24mM91BN4",
+ "linkType": "video",
+ "tagline": "A transformative journey to Airtel's data mesh architecture with DataHub",
+ "category": "B2B & B2C",
+ "description": "Airtel is a leading global telecommunication provider. DataHub is the bedrock of Data Mesh at Airtel by providing the requisite governance and metadata management functionality to ensure their Data Products are discoverable, addressable, trustworthy, self-describing, and secure.
Get a closer look at how the Airtel team has successfully integrated DataHub to take their data mesh implementation to the next level."
+ },
+ {
+ "name": "Coursera",
+ "slug": "coursera",
+ "imageUrl": "/img/logos/companies/coursera.svg",
+ "imageSize": "small",
+ "link": "https://www.youtube.com/watch?v=bd5v4fn4d4s",
+ "linkType": "video",
+ "tagline": "Coursera's DataHub Journey",
+ "category": "B2B & B2C",
+ "description": "“DataHub aligns with our needs [for] data documentation, a unified search experience, lineage information, and additional metadata. We are also very impressed with the vibrant and supportive community.”"
+ },
+ {
+ "name": "Zynga",
+ "slug": "zynga",
+ "imageUrl": "/img/logos/companies/zynga.png",
+ "imageSize": "default",
+ "link": "https://www.youtube.com/watch?v=VCU3-Hd_glI",
+ "linkType": "video",
+ "tagline": "Zynga's DataHub Implementation",
+ "category": "B2B & B2C",
+ "description": "“We looked around for data catalog tool, and DataHub was a clear winner.”
Zynga levels up data management using DataHub, highlighting its role in enhancing data management, tracing data lineage, and ensuring data quality."
+ },
+ {
+ "name": "Saxo Bank",
+ "slug": "saxo-bank",
+ "imageUrl": "/img/logos/companies/saxobank.svg",
+ "imageSize": "default",
+ "link": "https://blog.datahubproject.io/enabling-data-discovery-in-a-data-mesh-the-saxo-journey-451b06969c8f",
+ "linkType": "blog",
+ "tagline": "Enabling Data Discovery in a Data Mesh",
+ "category": "Financial & Fintech",
+ "description": "Saxo Bank adopted DataHub to enhance data quality and streamline governance, facilitating efficient data management through self-service capabilities.
By integrating Apache Kafka and Snowflake with DataHub, the bank embraced Data Mesh principles to democratize data, support rapid growth, and improve business processes."
+ },
+ {
+ "name": "MediaMarkt Saturn",
+ "slug": "mediamarkt-saturn",
+ "imageUrl": "/img/logos/companies/mediamarkt-saturn.png",
+ "imageSize": "large",
+ "link": "https://www.youtube.com/watch?v=wsCFnElN_Wo",
+ "linkType": "video",
+ "tagline": "DataHub + MediaMarktSaturn Access Management Journey",
+ "category": "B2B & B2C",
+ "description": "Europe’s #1 consumer electronics retailer implemented DataHub for three reasons:
1. DataHub provides an extremely flexible and customizable metadata platform at scale. 2. Open-source means lower cost to implement and removes the headache of license management. 3. Community-driven project which continually evolves with industry trends and best practices."
+ },
+ {
+ "name": "Adevinta",
+ "slug": "adevinta",
+ "imageUrl": "/img/logos/companies/adevinta.png",
+ "imageSize": "medium",
+ "link": "https://medium.com/@adevinta/building-the-data-catalogue-the-beginning-of-a-journey-d64e828f955c",
+ "linkType": "blog",
+ "tagline": "Building the data catalogue",
+ "category": "E-Commerce",
+ "description": "“DataHub allows us to solve the data discovery problem, which was a big challenge in our organization, and now we are solving it.”"
+ },
+ {
+ "name": "Wolt",
+ "slug": "wolt",
+ "imageUrl": "/img/logos/companies/wolt.png",
+ "imageSize": "default",
+ "link": "https://blog.datahubproject.io/humans-of-datahub-fredrik-sannholm-d673b1877f2b",
+ "linkType": "blog",
+ "tagline": "Wolt's DataHub Integration",
+ "category": "E-Commerce",
+ "description": "“[DataHub] has made our legal team very happy with being able to keep track of our sensitive data [to answer questions like] Where’s it going? How’s it being processed? Where’s it ending up? Which third party tool or API’s are we sending it to and why? Who is responsible for this integration?”"
+ },
+ {
+ "name": "Geotab",
+ "slug": "geotab",
+ "imageUrl": "/img/logos/companies/geotab.jpg",
+ "imageSize": "small",
+ "link": "https://www.youtube.com/watch?v=boyjT2OrlU4",
+ "linkType": "video",
+ "tagline": "Geotab's Experience with DataHub",
+ "category": "B2B & B2C",
+ "description": "“The key evaluation metric for selecting DataHub was the approachability and technical capabilities of its leading development team.”
Geotab’s data adoption journey explores challenges in data management, governance, and the decision to utilize DataHub for improved productivity and collaboration."
+ },
+ {
+ "name": "Hurb",
+ "slug": "hurb",
+ "imageUrl": "/img/logos/companies/hurb.png",
+ "imageSize": "medium",
+ "link": "https://blog.datahubproject.io/humans-of-datahub-patrick-franco-braz-b02b55a4c5384",
+ "linkType": "blog",
+ "tagline": "Hurb's DataHub Journey",
+ "category": "B2B & B2C",
+ "description": "“The main points that drove our decision to implement DataHub were its user-friendly interface, active and receptive community, contribution opportunities, and built-in ingestion sources for our primary services.”
diff --git a/docs-website/src/learn/_components/LearnListPage/index.jsx b/docs-website/src/learn/_components/LearnListPage/index.jsx
index 4df87a340f21ee..1ceec9afa1e8a3 100644
--- a/docs-website/src/learn/_components/LearnListPage/index.jsx
+++ b/docs-website/src/learn/_components/LearnListPage/index.jsx
@@ -58,8 +58,9 @@ function BlogListPageContent(props) {
For:
{audiences.map((audience) => (
diff --git a/docs-website/src/learn/_components/LearnListPage/styles.module.scss b/docs-website/src/learn/_components/LearnListPage/styles.module.scss
index d08b48a011de07..ce86e124afdb81 100644
--- a/docs-website/src/learn/_components/LearnListPage/styles.module.scss
+++ b/docs-website/src/learn/_components/LearnListPage/styles.module.scss
@@ -4,4 +4,10 @@
align-items: center;
gap: 10px;
flex-wrap: wrap;
-}
\ No newline at end of file
+
+ .buttonActive {
+ background-color: var(--ifm-color-primary);
+ border: 1px solid var(--ifm-color-primary);
+ color: #ffffff;
+ }
+}
diff --git a/docs-website/src/learn/business-glossary.md b/docs-website/src/learn/business-glossary.md
index d6b249617fc5ac..4568406a1667fc 100644
--- a/docs-website/src/learn/business-glossary.md
+++ b/docs-website/src/learn/business-glossary.md
@@ -91,7 +91,7 @@ Some companies use manual methods to track data terminology and manage access re
### Our Solution
-Acryl DataHub offers comprehensive features designed to support the authoring of a unified business glossary for your organization:
+DataHub Cloud offers comprehensive features designed to support the authoring of a unified business glossary for your organization:
diff --git a/docs-website/src/learn/business-metric.md b/docs-website/src/learn/business-metric.md
index 39221a67d40abc..1378168f42195e 100644
--- a/docs-website/src/learn/business-metric.md
+++ b/docs-website/src/learn/business-metric.md
@@ -54,7 +54,7 @@ Some companies try to align metric definitions through emails and meetings. Whil
### Our Solution
-Acryl DataHub offers comprehensive features designed to tackle the challenges of defining and standardizing business metrics:
+DataHub Cloud offers comprehensive features designed to tackle the challenges of defining and standardizing business metrics:
@@ -72,13 +72,14 @@ Acryl DataHub offers comprehensive features designed to tackle the challenges of
- **[Approval Flows](https://datahubproject.io/docs/managed-datahub/approval-workflows):** Structured workflows for approving changes to metric definitions, maintaining accuracy and reliability.
- -
-![Untitled](https://prod-files-secure.s3.us-west-2.amazonaws.com/f818df0d-1067-44ab-99e1-8cf45d930c01/33ebd070-32a1-4875-b220-c31373f5eedf/Untitled.png)
+
+
+
+ Lineage Tracking
+
- **[Lineage Tracking](https://datahubproject.io/docs/generated/lineage/lineage-feature-guide):** Tools to track the origin and transformations of metrics, ensuring they align with standardized definitions.
- -
-![Screenshot 2024-07-10 at 12.07.28 PM.png](https://prod-files-secure.s3.us-west-2.amazonaws.com/f818df0d-1067-44ab-99e1-8cf45d930c01/39503957-ad64-4d2d-a5b2-b140abfc1f6c/Screenshot_2024-07-10_at_12.07.28_PM.png)
By implementing these solutions, you can ensure that your business metrics are consistently defined and accurately used across all teams, supporting reliable analysis and decision-making.
diff --git a/docs-website/src/learn/data-freshness.md b/docs-website/src/learn/data-freshness.md
index e97e9b054b256d..528b2975f7528e 100644
--- a/docs-website/src/learn/data-freshness.md
+++ b/docs-website/src/learn/data-freshness.md
@@ -91,7 +91,7 @@ DataHub offers comprehensive features designed to tackle data freshness challeng
-**Freshness Monitoring & Alerting:** Automatically detect and alert when data freshness issues occur, to ensure timely updates by proactively monitoring key datasets for updates. Check out [Assertions](https://datahubproject.io/docs/managed-datahub/observe/assertions) and [Freshness Assertions](https://datahubproject.io/docs/managed-datahub/observe/freshness-assertions), Available in **Acryl Managed DataHub Only.**
+**Freshness Monitoring & Alerting:** Automatically detect and alert when data freshness issues occur, proactively monitoring key datasets to ensure timely updates. Check out [Assertions](https://datahubproject.io/docs/managed-datahub/observe/assertions) and [Freshness Assertions](https://datahubproject.io/docs/managed-datahub/observe/freshness-assertions), available in **DataHub Cloud only**.
diff --git a/docs-website/src/learn/data-mesh.md b/docs-website/src/learn/data-mesh.md
index f9a625d103ae71..038fcb971f5da2 100644
--- a/docs-website/src/learn/data-mesh.md
+++ b/docs-website/src/learn/data-mesh.md
@@ -90,7 +90,7 @@ While a centralized data lake or warehouse can simplify data governance by virtu
### Our Solution
-Acryl DataHub offers a comprehensive set of features designed to support the implementation of a Data Mesh at your organization:
+DataHub Cloud offers a comprehensive set of features designed to support the implementation of a Data Mesh at your organization:
- **[Data Domains](https://datahubproject.io/docs/domains)**: Clearly define and manage data products within each business unit.
- **[Data Products](https://datahubproject.io/docs/dataproducts):** Ensure each domain owns and manages its data products, promoting autonomy and agility.
@@ -100,7 +100,7 @@ Acryl DataHub offers a comprehensive set of features designed to support the imp
- Data Contracts in Acryl DataHub UI
+ Data Contracts in DataHub Cloud UI
@@ -128,4 +128,4 @@ By implementing these solutions, you can effectively manage decentralized data,
## Conclusion
-Implementing a Data Mesh can significantly improve your organization's ability to manage and leverage decentralized data. By understanding the benefits of data mesh and following best practices for implementation, you can overcome the limitations of centralized data systems and enhance your agility, scalability, and ability to generate insights. Acryl DataHub was built from the ground up to help you achieve this, providing the tools and features necessary to implement a large-scale Data Mesh successfully.
\ No newline at end of file
+Implementing a Data Mesh can significantly improve your organization's ability to manage and leverage decentralized data. By understanding the benefits of data mesh and following best practices for implementation, you can overcome the limitations of centralized data systems and enhance your agility, scalability, and ability to generate insights. DataHub Cloud was built from the ground up to help you achieve this, providing the tools and features necessary to implement a large-scale Data Mesh successfully.
\ No newline at end of file
diff --git a/docs-website/src/learn/data-pipeline.md b/docs-website/src/learn/data-pipeline.md
index f5e5bb6615f48b..d57341c30a8c73 100644
--- a/docs-website/src/learn/data-pipeline.md
+++ b/docs-website/src/learn/data-pipeline.md
@@ -61,7 +61,7 @@ Some companies resort to manual debugging or use communication tools like Slack
### Our Solution
-Acryl DataHub offers comprehensive features designed to optimize data pipelines:
+DataHub Cloud offers comprehensive features designed to optimize data pipelines:
- );
-};
-
-export default CustomerCard;
diff --git a/docs-website/src/pages/docs/_components/CustomerCardSection/index.jsx b/docs-website/src/pages/docs/_components/CustomerCardSection/index.jsx
index ca34d89df8701d..505a2810c9433c 100644
--- a/docs-website/src/pages/docs/_components/CustomerCardSection/index.jsx
+++ b/docs-website/src/pages/docs/_components/CustomerCardSection/index.jsx
@@ -57,7 +57,7 @@ const customerCardContent = [
3. Community-driven project which continually evolves with industry trends and best practices
>
),
- to: "https://www.acryldata.io/blog/data-contracts-in-datahub-combining-verifiability-with-holistic-data-management?utm_source=datahub&utm_medium=referral&utm_content=blog",
+ to: "https://youtu.be/wsCFnElN_Wo?si=i-bNAQAsbHJq5O9-",
},
{
customer: "Airtel",
@@ -75,7 +75,7 @@ const customerCardContent = [
DataHub to take their data mesh implementation to the next level.
>
),
- to: "https://youtu.be/wsCFnElN_Wo?si=i-bNAQAsbHJq5O9-",
+ to: "https://www.youtube.com/watch?v=yr24mM91BN4",
},
];
@@ -93,4 +93,4 @@ const CustomerCardSection = () => {
);
};
-export default CustomerCardSection;
+export default CustomerCardSection;
\ No newline at end of file
diff --git a/docs-website/src/pages/docs/_components/QuickstartCards/index.jsx b/docs-website/src/pages/docs/_components/QuickstartCards/index.jsx
index bcb77c043f1d0b..9f582967175edd 100644
--- a/docs-website/src/pages/docs/_components/QuickstartCards/index.jsx
+++ b/docs-website/src/pages/docs/_components/QuickstartCards/index.jsx
@@ -10,7 +10,7 @@ const quickstartContent = [
fontColor: '#091013',
},
{
- title: "Learn about Managed DataHub",
+ title: "Learn about DataHub Cloud",
icon: "acryl-logo-transparent-mark",
to: "managed-datahub/managed-datahub-overview",
color: '#091013',
diff --git a/docs-website/src/pages/index.js b/docs-website/src/pages/index.js
index 68b177d10f7aff..2eed41b4ad1bd3 100644
--- a/docs-website/src/pages/index.js
+++ b/docs-website/src/pages/index.js
@@ -3,13 +3,14 @@ import Layout from "@theme/Layout";
import Link from "@docusaurus/Link";
import useDocusaurusContext from "@docusaurus/useDocusaurusContext";
import CodeBlock from "@theme/CodeBlock";
-
+import useBaseUrl from "@docusaurus/useBaseUrl";
import Hero from "./_components/Hero";
import Features from "./_components/Features";
-import Quotes from "./_components/Quotes";
import { Section, PromoSection } from "./_components/Section";
-import { PlatformLogos, CompanyLogos } from "./_components/Logos";
+import { PlatformLogos } from "./_components/Logos";
import RoundedImage from "./_components/RoundedImage";
+import { CompanyLogos } from "./_components/Logos";
+import QuickstartContent from "./_components/QuickstartContent";
const example_recipe = `
source:
@@ -38,6 +39,18 @@ function Home() {
description="DataHub is a data discovery application built on an extensible data catalog that helps you tame the complexity of diverse data ecosystems."
>
+
+
+
+
+ Check Out Adoption Stories →
+
+
+
+
@@ -157,10 +170,6 @@ function Home() {
-
-
-
-
) : null;
}
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-adevinta.png b/docs-website/static/img/adoption-stories/adoption-stories-adevinta.png
new file mode 100644
index 00000000000000..6c790995843c54
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-adevinta.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-airtel.png b/docs-website/static/img/adoption-stories/adoption-stories-airtel.png
new file mode 100644
index 00000000000000..ae5ebdedd47aa1
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-airtel.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-coursera.png b/docs-website/static/img/adoption-stories/adoption-stories-coursera.png
new file mode 100644
index 00000000000000..4f473874d0dc26
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-coursera.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-geotab.png b/docs-website/static/img/adoption-stories/adoption-stories-geotab.png
new file mode 100644
index 00000000000000..2b3c8a158273a9
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-geotab.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-grofers.png b/docs-website/static/img/adoption-stories/adoption-stories-grofers.png
new file mode 100644
index 00000000000000..51af8a3ad69d7b
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-grofers.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-hurb.png b/docs-website/static/img/adoption-stories/adoption-stories-hurb.png
new file mode 100644
index 00000000000000..b7b8bae5d8c321
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-hurb.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-mediamarkt-saturn.png b/docs-website/static/img/adoption-stories/adoption-stories-mediamarkt-saturn.png
new file mode 100644
index 00000000000000..ac2f524a7a0e77
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-mediamarkt-saturn.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-netflix.png b/docs-website/static/img/adoption-stories/adoption-stories-netflix.png
new file mode 100644
index 00000000000000..de65a4c59419b5
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-netflix.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-optum.png b/docs-website/static/img/adoption-stories/adoption-stories-optum.png
new file mode 100644
index 00000000000000..051abaa96a0e01
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-optum.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-pinterest.png b/docs-website/static/img/adoption-stories/adoption-stories-pinterest.png
new file mode 100644
index 00000000000000..e005ea6d5750aa
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-pinterest.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-saxo-bank.png b/docs-website/static/img/adoption-stories/adoption-stories-saxo-bank.png
new file mode 100644
index 00000000000000..333003d146cf5e
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-saxo-bank.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-viasat.png b/docs-website/static/img/adoption-stories/adoption-stories-viasat.png
new file mode 100644
index 00000000000000..b6f633450296c6
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-viasat.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-visa.png b/docs-website/static/img/adoption-stories/adoption-stories-visa.png
new file mode 100644
index 00000000000000..11d732faf85fec
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-visa.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-wolt.png b/docs-website/static/img/adoption-stories/adoption-stories-wolt.png
new file mode 100644
index 00000000000000..43501a1f2f6d57
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-wolt.png differ
diff --git a/docs-website/static/img/adoption-stories/adoption-stories-zynga.png b/docs-website/static/img/adoption-stories/adoption-stories-zynga.png
new file mode 100644
index 00000000000000..94ee9e9b2fb8ee
Binary files /dev/null and b/docs-website/static/img/adoption-stories/adoption-stories-zynga.png differ
diff --git a/docs-website/static/img/adoption-stories/img.png b/docs-website/static/img/adoption-stories/img.png
new file mode 100644
index 00000000000000..4d4971018c3982
Binary files /dev/null and b/docs-website/static/img/adoption-stories/img.png differ
diff --git a/docs-website/static/img/assets/business.jpg b/docs-website/static/img/assets/business.jpg
deleted file mode 100644
index f5a91928ee2ad8..00000000000000
Binary files a/docs-website/static/img/assets/business.jpg and /dev/null differ
diff --git a/docs-website/static/img/assets/data-discovery.svg b/docs-website/static/img/assets/data-discovery.svg
new file mode 100644
index 00000000000000..1a6c6f36a231ae
--- /dev/null
+++ b/docs-website/static/img/assets/data-discovery.svg
@@ -0,0 +1,4 @@
+
diff --git a/docs-website/static/img/assets/data-governance.svg b/docs-website/static/img/assets/data-governance.svg
new file mode 100644
index 00000000000000..b1db48e3e76bfe
--- /dev/null
+++ b/docs-website/static/img/assets/data-governance.svg
@@ -0,0 +1,6 @@
+
diff --git a/docs-website/static/img/assets/data-ob.svg b/docs-website/static/img/assets/data-ob.svg
new file mode 100644
index 00000000000000..d630b0a2333a2d
--- /dev/null
+++ b/docs-website/static/img/assets/data-ob.svg
@@ -0,0 +1,8 @@
+
diff --git a/docs-website/static/img/assets/netflix.jpg b/docs-website/static/img/assets/netflix.jpg
deleted file mode 100644
index 8b555f5b63187f..00000000000000
Binary files a/docs-website/static/img/assets/netflix.jpg and /dev/null differ
diff --git a/docs-website/static/img/assets/phonecall.jpg b/docs-website/static/img/assets/phonecall.jpg
deleted file mode 100644
index 87e48f28213827..00000000000000
Binary files a/docs-website/static/img/assets/phonecall.jpg and /dev/null differ
diff --git a/docs-website/static/img/assets/travel.jpg b/docs-website/static/img/assets/travel.jpg
deleted file mode 100644
index de2697f5631217..00000000000000
Binary files a/docs-website/static/img/assets/travel.jpg and /dev/null differ
diff --git a/docs-website/static/img/cloud-bg.png b/docs-website/static/img/cloud-bg.png
new file mode 100644
index 00000000000000..392aec2ff936c5
Binary files /dev/null and b/docs-website/static/img/cloud-bg.png differ
diff --git a/docs-website/static/img/logos/companies/mediamarkt-saturn.png b/docs-website/static/img/logos/companies/mediamarkt-saturn.png
new file mode 100644
index 00000000000000..6e3a39e0ae34b1
Binary files /dev/null and b/docs-website/static/img/logos/companies/mediamarkt-saturn.png differ
diff --git a/docs-website/static/img/logos/companies/netflix.png b/docs-website/static/img/logos/companies/netflix.png
new file mode 100755
index 00000000000000..151775b3b17bc4
Binary files /dev/null and b/docs-website/static/img/logos/companies/netflix.png differ
diff --git a/docs-website/static/img/logos/companies/pinterest.png b/docs-website/static/img/logos/companies/pinterest.png
new file mode 100644
index 00000000000000..715c8c33fd85b4
Binary files /dev/null and b/docs-website/static/img/logos/companies/pinterest.png differ
diff --git a/docs-website/static/img/logos/companies/visa.png b/docs-website/static/img/logos/companies/visa.png
new file mode 100644
index 00000000000000..0a0198bfb76a28
Binary files /dev/null and b/docs-website/static/img/logos/companies/visa.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/acertus.webp b/docs-website/static/img/logos/scrollingCompanies/acertus.webp
new file mode 100644
index 00000000000000..20ff1c6d7d554f
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/acertus.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/autoscout24.webp b/docs-website/static/img/logos/scrollingCompanies/autoscout24.webp
new file mode 100644
index 00000000000000..27b34c6f724dee
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/autoscout24.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/betterup.webp b/docs-website/static/img/logos/scrollingCompanies/betterup.webp
new file mode 100644
index 00000000000000..268e019de8fd4a
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/betterup.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/depop.webp b/docs-website/static/img/logos/scrollingCompanies/depop.webp
new file mode 100644
index 00000000000000..7c006bb8620607
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/depop.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/dpg-media.png b/docs-website/static/img/logos/scrollingCompanies/dpg-media.png
new file mode 100644
index 00000000000000..40022ef0294d73
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/dpg-media.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/dpg_media.webp b/docs-website/static/img/logos/scrollingCompanies/dpg_media.webp
new file mode 100644
index 00000000000000..0d5c42847068a1
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/dpg_media.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/event-callout-img.png b/docs-website/static/img/logos/scrollingCompanies/event-callout-img.png
new file mode 100644
index 00000000000000..032c34fb6a10ec
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/event-callout-img.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/graham-stirling.png b/docs-website/static/img/logos/scrollingCompanies/graham-stirling.png
new file mode 100644
index 00000000000000..14bb23dcef0d94
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/graham-stirling.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/icon-check.svg b/docs-website/static/img/logos/scrollingCompanies/icon-check.svg
new file mode 100644
index 00000000000000..83a4d0983ebe0e
--- /dev/null
+++ b/docs-website/static/img/logos/scrollingCompanies/icon-check.svg
@@ -0,0 +1,14 @@
+
diff --git a/docs-website/static/img/logos/scrollingCompanies/log-saxo.svg b/docs-website/static/img/logos/scrollingCompanies/log-saxo.svg
new file mode 100644
index 00000000000000..f6ab4ae071b10c
--- /dev/null
+++ b/docs-website/static/img/logos/scrollingCompanies/log-saxo.svg
@@ -0,0 +1,16 @@
+
diff --git a/docs-website/static/img/logos/scrollingCompanies/myob.png b/docs-website/static/img/logos/scrollingCompanies/myob.png
new file mode 100644
index 00000000000000..c161532e650ba4
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/myob.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/myob.webp b/docs-website/static/img/logos/scrollingCompanies/myob.webp
new file mode 100644
index 00000000000000..509e3fba0804c5
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/myob.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/notion-logo.png b/docs-website/static/img/logos/scrollingCompanies/notion-logo.png
new file mode 100644
index 00000000000000..f65884f53ead89
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/notion-logo.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/notion.webp b/docs-website/static/img/logos/scrollingCompanies/notion.webp
new file mode 100644
index 00000000000000..393756855d938c
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/notion.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/ovo_energy.webp b/docs-website/static/img/logos/scrollingCompanies/ovo_energy.webp
new file mode 100644
index 00000000000000..3e08a80f5c5f42
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/ovo_energy.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/regeneron.png b/docs-website/static/img/logos/scrollingCompanies/regeneron.png
new file mode 100644
index 00000000000000..0a660b95c6d503
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/regeneron.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/regeneron.webp b/docs-website/static/img/logos/scrollingCompanies/regeneron.webp
new file mode 100644
index 00000000000000..45bed965f3a01c
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/regeneron.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/riskified.png b/docs-website/static/img/logos/scrollingCompanies/riskified.png
new file mode 100644
index 00000000000000..69b94b43fd56e8
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/riskified.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/riskified.webp b/docs-website/static/img/logos/scrollingCompanies/riskified.webp
new file mode 100644
index 00000000000000..a2a2f96d5ea7be
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/riskified.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/robinhood.png b/docs-website/static/img/logos/scrollingCompanies/robinhood.png
new file mode 100644
index 00000000000000..e75535a383f324
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/robinhood.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/robinhood.webp b/docs-website/static/img/logos/scrollingCompanies/robinhood.webp
new file mode 100644
index 00000000000000..661c25c0dd8302
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/robinhood.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/saxo_bank.webp b/docs-website/static/img/logos/scrollingCompanies/saxo_bank.webp
new file mode 100644
index 00000000000000..a4c1aae73fe48b
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/saxo_bank.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/screenshot.png b/docs-website/static/img/logos/scrollingCompanies/screenshot.png
new file mode 100644
index 00000000000000..59d982c5aec6d5
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/screenshot.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/twilio-bg.png b/docs-website/static/img/logos/scrollingCompanies/twilio-bg.png
new file mode 100644
index 00000000000000..74dcbf88a35951
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/twilio-bg.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/twilio.png b/docs-website/static/img/logos/scrollingCompanies/twilio.png
new file mode 100644
index 00000000000000..f226674d0ffbce
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/twilio.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/twilio.webp b/docs-website/static/img/logos/scrollingCompanies/twilio.webp
new file mode 100644
index 00000000000000..5ad47d5d5c87e2
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/twilio.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/xero.png b/docs-website/static/img/logos/scrollingCompanies/xero.png
new file mode 100644
index 00000000000000..653ddfb2c76869
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/xero.png differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/xero.webp b/docs-website/static/img/logos/scrollingCompanies/xero.webp
new file mode 100644
index 00000000000000..9f2b4cc0cf0f9f
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/xero.webp differ
diff --git a/docs-website/static/img/logos/scrollingCompanies/zendesk.webp b/docs-website/static/img/logos/scrollingCompanies/zendesk.webp
new file mode 100644
index 00000000000000..e790fdc2af6eda
Binary files /dev/null and b/docs-website/static/img/logos/scrollingCompanies/zendesk.webp differ
diff --git a/docs-website/yarn.lock b/docs-website/yarn.lock
index a93b0e74c327db..0970a59cbc00a3 100644
--- a/docs-website/yarn.lock
+++ b/docs-website/yarn.lock
@@ -1827,7 +1827,7 @@
"@docusaurus/theme-search-algolia" "2.4.3"
"@docusaurus/types" "2.4.3"
-"@docusaurus/react-loadable@5.5.2":
+"@docusaurus/react-loadable@5.5.2", "react-loadable@npm:@docusaurus/react-loadable@5.5.2":
version "5.5.2"
resolved "https://registry.yarnpkg.com/@docusaurus/react-loadable/-/react-loadable-5.5.2.tgz#81aae0db81ecafbdaee3651f12804580868fa6ce"
integrity sha512-A3dYjdBGuy0IGT+wyLIGIKLRE+sAk1iNk0f1HjNDysO7u8lhL4N3VEm+FAubmJbAztn94F7MxBTPmnixbiyFdQ==
@@ -9705,14 +9705,6 @@ react-loadable-ssr-addon-v5-slorber@^1.0.1:
dependencies:
"@babel/runtime" "^7.10.3"
-"react-loadable@npm:@docusaurus/react-loadable@5.5.2":
- version "5.5.2"
- resolved "https://registry.yarnpkg.com/@docusaurus/react-loadable/-/react-loadable-5.5.2.tgz#81aae0db81ecafbdaee3651f12804580868fa6ce"
- integrity sha512-A3dYjdBGuy0IGT+wyLIGIKLRE+sAk1iNk0f1HjNDysO7u8lhL4N3VEm+FAubmJbAztn94F7MxBTPmnixbiyFdQ==
- dependencies:
- "@types/react" "*"
- prop-types "^15.6.2"
-
react-markdown@^8.0.6:
version "8.0.7"
resolved "https://registry.yarnpkg.com/react-markdown/-/react-markdown-8.0.7.tgz#c8dbd1b9ba5f1c5e7e5f2a44de465a3caafdf89b"
@@ -10866,6 +10858,11 @@ swc-loader@^0.2.6:
dependencies:
"@swc/counter" "^0.1.3"
+swiper@^11.1.4:
+ version "11.1.4"
+ resolved "https://registry.yarnpkg.com/swiper/-/swiper-11.1.4.tgz#2f8e303e8bf9e5bc40a3885fc637ae60ff27996c"
+ integrity sha512-1n7kbYJB2dFEpUHRFszq7gys/ofIBrMNibwTiMvPHwneKND/t9kImnHt6CfGPScMHgI+dWMbGTycCKGMoOO1KA==
+
symbol-observable@^1.0.4:
version "1.2.0"
resolved "https://registry.yarnpkg.com/symbol-observable/-/symbol-observable-1.2.0.tgz#c22688aed4eab3cdc2dfeacbb561660560a00804"
diff --git a/docs/_feature-guide-template.md b/docs/_feature-guide-template.md
index 9c1aead5e13ab3..f03dffb957a796 100644
--- a/docs/_feature-guide-template.md
+++ b/docs/_feature-guide-template.md
@@ -5,9 +5,9 @@ import FeatureAvailability from '@site/src/components/FeatureAvailability';
diff --git a/docs/actions/actions/slack.md b/docs/actions/actions/slack.md
index a89439825d2da1..73d990948812d5 100644
--- a/docs/actions/actions/slack.md
+++ b/docs/actions/actions/slack.md
@@ -136,9 +136,9 @@ In the next steps, we'll show you how to configure the Slack Action based on the
### Installation Instructions (Deployment specific)
-#### Managed DataHub
+#### DataHub Cloud
-Head over to the [Configuring Notifications](../../managed-datahub/slack/saas-slack-setup.md#configuring-notifications) section in the Managed DataHub guide to configure Slack notifications for your Managed DataHub instance.
+Head over to the [Configuring Notifications](../../managed-datahub/slack/saas-slack-setup.md#configuring-notifications) section in the DataHub Cloud guide to configure Slack notifications for your DataHub Cloud instance.
#### Quickstart
diff --git a/docs/actions/guides/developing-a-transformer.md b/docs/actions/guides/developing-a-transformer.md
index a843dbc846cd51..6406cdfae6104b 100644
--- a/docs/actions/guides/developing-a-transformer.md
+++ b/docs/actions/guides/developing-a-transformer.md
@@ -23,7 +23,7 @@ print the configuration that is provided when it is created, and print any Event
```python
# custom_transformer.py
from datahub_actions.transform.transformer import Transformer
-from datahub_actions.event.event import EventEnvelope
+from datahub_actions.event.event_envelope import EventEnvelope
from datahub_actions.pipeline.pipeline_context import PipelineContext
from typing import Optional
@@ -75,7 +75,7 @@ Next, install the package
pip install -e .
```
-inside the module. (alt.`python setup.py`).
+inside the module (alternatively, via `python setup.py`).
Once we have done this, our class will be referencable via `custom_transformer_example.custom_transformer:CustomTransformer`.
@@ -96,7 +96,7 @@ source:
connection:
bootstrap: ${KAFKA_BOOTSTRAP_SERVER:-localhost:9092}
schema_registry_url: ${SCHEMA_REGISTRY_URL:-http://localhost:8081}
-transform:
+transform:
- type: "custom_transformer_example.custom_transformer:CustomTransformer"
config:
# Some sample configuration which should be printed on create.
@@ -130,4 +130,4 @@ it without defining the full module path.
Prerequisites to consideration for inclusion in the core Transformer library include
- **Testing** Define unit tests for your Transformer
-- **Deduplication** Confirm that no existing Transformer serves the same purpose, or can be easily extended to serve the same purpose
\ No newline at end of file
+- **Deduplication** Confirm that no existing Transformer serves the same purpose, or can be easily extended to serve the same purpose
diff --git a/docs/api/graphql/getting-started.md b/docs/api/graphql/getting-started.md
index 98aeca196600d7..dfa556051bd4d1 100644
--- a/docs/api/graphql/getting-started.md
+++ b/docs/api/graphql/getting-started.md
@@ -27,6 +27,7 @@ For more information on, please refer to the following links."
- [Querying for Domain of a Dataset](/docs/api/tutorials/domains.md#read-domains)
- [Querying for Glossary Terms of a Dataset](/docs/api/tutorials/terms.md#read-terms)
- [Querying for Deprecation of a dataset](/docs/api/tutorials/deprecation.md#read-deprecation)
+- [Querying for all DataJobs that belong to a DataFlow](/docs/lineage/airflow.md#get-all-datajobs-associated-with-a-dataflow)
### Search
diff --git a/docs/api/tutorials/assertions.md b/docs/api/tutorials/assertions.md
index f89fe728f7e977..6837220a581c11 100644
--- a/docs/api/tutorials/assertions.md
+++ b/docs/api/tutorials/assertions.md
@@ -5,7 +5,7 @@ import TabItem from '@theme/TabItem';
-This guide specifically covers how to use the Assertion APIs for **Acryl Cloud** native assertions, including:
+This guide specifically covers how to use the Assertion APIs for **DataHub Cloud** native assertions, including:
- [Freshness Assertions](/docs/managed-datahub/observe/freshness-assertions.md)
- [Volume Assertions](/docs/managed-datahub/observe/volume-assertions.md)
@@ -15,7 +15,7 @@ This guide specifically covers how to use the Assertion APIs for **Acryl Cloud**
## Why Would You Use Assertions APIs?
-The Assertions APIs allow you to create, schedule, run, and delete Assertions with Acryl Cloud.
+The Assertions APIs allow you to create, schedule, run, and delete Assertions with DataHub Cloud.
### Goal Of This Guide
diff --git a/docs/api/tutorials/data-contracts.md b/docs/api/tutorials/data-contracts.md
index ac19920a5c4b7b..e977345273e223 100644
--- a/docs/api/tutorials/data-contracts.md
+++ b/docs/api/tutorials/data-contracts.md
@@ -5,7 +5,7 @@ import TabItem from '@theme/TabItem';
-This guide specifically covers how to use the Data Contract APIs with **Acryl Cloud**.
+This guide specifically covers how to use the Data Contract APIs with **DataHub Cloud**.
## Why Would You Use Data Contract APIs?
diff --git a/docs/api/tutorials/forms.md b/docs/api/tutorials/forms.md
index f60699ffebab58..3f28353595be72 100644
--- a/docs/api/tutorials/forms.md
+++ b/docs/api/tutorials/forms.md
@@ -22,7 +22,7 @@ For detailed information, please refer to [Datahub Quickstart Guide](/docs/quick
-Install the relevant CLI version. Forms are available as of CLI version `0.13.1`. The corresponding SaaS release version is `v0.2.16.5`
+Install the relevant CLI version. Forms are available as of CLI version `0.13.1`. The corresponding DataHub Cloud release version is `v0.2.16.5`
Connect to your instance via [init](https://datahubproject.io/docs/cli/#init):
1. Run `datahub init` to update the instance you want to load into
diff --git a/docs/api/tutorials/operations.md b/docs/api/tutorials/operations.md
index 70ede993ec95f6..e1d41f80e68366 100644
--- a/docs/api/tutorials/operations.md
+++ b/docs/api/tutorials/operations.md
@@ -7,7 +7,7 @@ import TabItem from '@theme/TabItem';
The Operations APIs allow you to report operational changes that were made to a given Dataset or Table using the 'Operation' concept.
These operations may be viewed on the Dataset Profile (e.g. as last modified time), accessed via the DataHub GraphQL API, or
-used to as inputs to Acryl Cloud [Freshness Assertions](/docs/managed-datahub/observe/freshness-assertions.md).
+used as inputs to DataHub Cloud [Freshness Assertions](/docs/managed-datahub/observe/freshness-assertions.md).
### Goal Of This Guide
diff --git a/docs/api/tutorials/structured-properties.md b/docs/api/tutorials/structured-properties.md
index 940f4632f1d45f..c56a2848638fc2 100644
--- a/docs/api/tutorials/structured-properties.md
+++ b/docs/api/tutorials/structured-properties.md
@@ -32,7 +32,7 @@ Additionally, you need to have the following tools installed according to the me
-Install the relevant CLI version. Forms are available as of CLI version `0.13.1`. The corresponding SaaS release version is `v0.2.16.5`
+Install the relevant CLI version. Forms are available as of CLI version `0.13.1`. The corresponding DataHub Cloud release version is `v0.2.16.5`
Connect to your instance via [init](https://datahubproject.io/docs/cli/#init):
- Run `datahub init` to update the instance you want to load into.
diff --git a/docs/assertions/open-assertions-spec.md b/docs/assertions/open-assertions-spec.md
index 519e917c30587f..09dad4710a2e53 100644
--- a/docs/assertions/open-assertions-spec.md
+++ b/docs/assertions/open-assertions-spec.md
@@ -2,7 +2,7 @@
DataHub is developing an open-source Data Quality Assertions Specification & Compiler that will allow you to declare data quality checks / expectations / assertions using a simple, universal
YAML-based format, and then compile this into artifacts that can be registered or directly executed by 3rd party Data Quality tools like [Snowflake DMFs](https://docs.snowflake.com/en/user-guide/data-quality-intro),
-dbt tests, Great Expectations or Acryl Cloud natively.
+dbt tests, Great Expectations or DataHub Cloud natively.
Ultimately, our goal is to provide an framework-agnostic, highly-portable format for defining Data Quality checks, making it seamless to swap out the underlying
assertion engine without service disruption for end consumers of the results of these data quality checks in catalogging tools like DataHub.
diff --git a/docs/authentication/guides/add-users.md b/docs/authentication/guides/add-users.md
index 86dac3ea328e53..30da5c9f229f94 100644
--- a/docs/authentication/guides/add-users.md
+++ b/docs/authentication/guides/add-users.md
@@ -60,7 +60,7 @@ and many more.
This option is strongly recommended for production deployments of DataHub.
-### Managed DataHub
+### DataHub Cloud
Single Sign-On can be configured and enabled by navigating to **Settings** > **SSO** > **OIDC**. Note
that a user must have the **Manage Platform Settings** [Platform Privilege](../../authorization/access-policies-guide.md)
diff --git a/docs/authorization/policies.md b/docs/authorization/policies.md
index 9867ff6ab264dd..91b0241c7d5149 100644
--- a/docs/authorization/policies.md
+++ b/docs/authorization/policies.md
@@ -105,7 +105,7 @@ These privileges are for DataHub operators to access & manage the administrative
| Manage Monitors[^2] | Allow actor to create, update, and delete any data asset monitors, including Custom SQL monitors. Grant with care. |
[^1]: Only active if REST_API_AUTHORIZATION_ENABLED is true
-[^2]: Managed DataHub only
+[^2]: DataHub Cloud only
##### Common metadata privileges
These privileges are to view & modify any entity within DataHub.
@@ -143,10 +143,10 @@ These privileges are to view & modify any entity within DataHub.
| Manage Tag Proposals[^2] | Allow actor to manage a proposal to add a tag to an asset. |
| Manage Glossary Term Proposals[^2] | Allow actor to manage a proposal to add a glossary term to an asset. |
| Manage Documentation Proposals[^2] | Allow actor to manage a proposal update an asset's documentation |
-| Share Entity[^2] | Allow actor to share an entity with another Acryl instance. |
+| Share Entity[^2] | Allow actor to share an entity with another DataHub Cloud instance. |
[^1]: Only active if REST_API_AUTHORIZATION_ENABLED is true
-[^2]: Managed DataHub only
+[^2]: DataHub Cloud only
##### Specific entity-level privileges
These privileges are not generalizable.
diff --git a/docs/authorization/roles.md b/docs/authorization/roles.md
index fe41cae2bc3cc3..7c7b4581faffca 100644
--- a/docs/authorization/roles.md
+++ b/docs/authorization/roles.md
@@ -73,9 +73,9 @@ in DataHub.
### Role Privileges
-#### Self-Hosted DataHub and Managed DataHub
+#### Self-Hosted DataHub and DataHub Cloud
-These privileges are common to both Self-Hosted DataHub and Managed DataHub.
+These privileges are common to both Self-Hosted DataHub and DataHub Cloud.
##### Platform Privileges
@@ -146,9 +146,9 @@ These privileges are common to both Self-Hosted DataHub and Managed DataHub.
| Explain ElasticSearch Query API | :heavy_check_mark: | :heavy_check_mark: | :heavy_check_mark: | The ability to use the Operations API explain endpoint. |
| Produce Platform Event API | :heavy_check_mark: | :heavy_check_mark: | :x: | The ability to produce Platform Events using the API. |
-#### Managed DataHub
+#### DataHub Cloud
-These privileges are only relevant to Managed DataHub.
+These privileges are only relevant to DataHub Cloud.
##### Platform Privileges
@@ -178,7 +178,7 @@ These privileges are only relevant to Managed DataHub.
| Manage Group Notification Settings | :heavy_check_mark: | :heavy_check_mark: | :x: | The ability to manage notification settings for a group. |
| Manage Group Subscriptions | :heavy_check_mark: | :heavy_check_mark: | :x: | The ability to manage subscriptions for a group. |
| Manage Data Contract Proposals | :heavy_check_mark: | :heavy_check_mark: | :x: | The ability to manage a proposal for a Data Contract |
-| Share Entity | :heavy_check_mark: | :heavy_check_mark: | :x: | The ability to share an entity with another Acryl instance. |
+| Share Entity | :heavy_check_mark: | :heavy_check_mark: | :x: | The ability to share an entity with another DataHub Cloud instance. |
## Additional Resources
diff --git a/docs/cli.md b/docs/cli.md
index fdcfa6616c9bfc..1f1e6dfa26be71 100644
--- a/docs/cli.md
+++ b/docs/cli.md
@@ -102,6 +102,7 @@ Command Options:
--test-source-connection When set, ingestion will only test the source connection details from the recipe
--no-progress If enabled, mute intermediate progress ingestion reports
```
+
#### ingest --dry-run
The `--dry-run` option of the `ingest` command performs all of the ingestion steps, except writing to the sink. This is useful to validate that the
@@ -133,23 +134,8 @@ By default `--preview` creates 10 workunits. But if you wish to try producing mo
datahub ingest -c ./examples/recipes/example_to_datahub_rest.dhub.yaml -n --preview --preview-workunits=20
```
-#### ingest deploy
-
-The `ingest deploy` command instructs the cli to upload an ingestion recipe to DataHub to be run by DataHub's [UI Ingestion](./ui-ingestion.md).
-This command can also be used to schedule the ingestion while uploading or even to update existing sources. It will upload to the remote instance the
-CLI is connected to, not the sink of the recipe. Use `datahub init` to set the remote if not already set.
-
-To schedule a recipe called "test", to run at 5am everyday, London time with the recipe configured in a local `recipe.yaml` file:
-````shell
-datahub ingest deploy --name "test" --schedule "5 * * * *" --time-zone "Europe/London" -c recipe.yaml
-````
-
-To update an existing recipe please use the `--urn` parameter to specify the id of the recipe to update.
-
-**Note:** Updating a recipe will result in a replacement of the existing options with what was specified in the cli command.
-I.e: Not specifying a schedule in the cli update command will remove the schedule from the recipe to be updated.
-
#### ingest --no-default-report
+
By default, the cli sends an ingestion report to DataHub, which allows you to see the result of all cli-based ingestion in the UI. This can be turned off with the `--no-default-report` flag.
```shell
@@ -180,6 +166,52 @@ failure_log:
filename: ./path/to/failure.json
```
+### ingest deploy
+
+The `ingest deploy` command instructs the cli to upload an ingestion recipe to DataHub to be run by DataHub's [UI Ingestion](./ui-ingestion.md).
+This command can also be used to schedule the ingestion while uploading or even to update existing sources. It will upload to the remote instance the
+CLI is connected to, not the sink of the recipe. Use `datahub init` to set the remote if not already set.
+
+This command will automatically create a new recipe if it doesn't exist, or update it if it does.
+Note that this is a complete update: any options that were previously set but are omitted from the update will be removed.
+For example, not specifying a schedule in the update command removes the existing schedule from the recipe.
+
+**Basic example**
+
+To schedule a recipe called "Snowflake Integration" to run at 5am every day (London time), with the recipe configured in a local `recipe.yaml` file:
+
+```shell
+datahub ingest deploy --name "Snowflake Integration" --schedule "0 5 * * *" --time-zone "Europe/London" -c recipe.yaml
+```
+
+By default, the ingestion recipe's identifier is generated by hashing its name.
+You can override the generated URN by passing the `--urn` flag to the CLI.
+
+**Using `deployment` to avoid CLI args**
+
+As an alternative to configuring settings from the CLI, all of these settings can also be set in the `deployment` field of the recipe.
+
+```yml
+# deployment_recipe.yml
+deployment:
+ name: "Snowflake Integration"
+ schedule: "5 * * * *"
+ time_zone: "Europe/London"
+
+source: ...
+```
+
+```shell
+datahub ingest deploy -c deployment_recipe.yml
+```
+
+This is particularly useful when you want all recipes to be stored in version control.
+
+```shell
+# Deploy every yml recipe in a directory
+ls recipe_directory/*.yml | xargs -n 1 -I {} datahub ingest deploy -c {}
+```
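As a variation on the single-directory loop above, recipes nested in sub-directories can be deployed recursively. The `recipes/` tree below is purely illustrative; the sketch uses `echo` to preview each command, and dropping the `echo` would actually invoke `datahub ingest deploy`:

```shell
# Build a sample recipe tree (illustrative paths only)
mkdir -p recipes/snowflake recipes/looker
touch recipes/snowflake/prod.yml recipes/looker/prod.yml

# Preview the deploy command that would run for each recipe;
# remove `echo` to execute the deploys for real.
find recipes -name '*.yml' -print0 | sort -z | xargs -0 -I {} echo datahub ingest deploy -c {}
```

Using `find -print0` with `xargs -0` keeps the loop safe for paths containing spaces, which the simpler `ls | xargs` pattern is not.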
+
### init
The init command is used to tell `datahub` about where your DataHub instance is located. The CLI will point to localhost DataHub by default.
@@ -242,8 +274,6 @@ The [metadata deletion guide](./how/delete-metadata.md) covers the various optio
### exists
-**🤝 Version compatibility** : `acryl-datahub>=0.10.2.4`
-
The exists command can be used to check if an entity exists in DataHub.
```shell
@@ -253,7 +283,6 @@ true
false
```
-
### get
The `get` command allows you to easily retrieve metadata from DataHub, by using the REST API. This works for both versioned aspects and timeseries aspects. For timeseries aspects, it fetches the latest value.
@@ -314,6 +343,7 @@ Update succeeded with status 200
```
#### put platform
+
**🤝 Version Compatibility:** `acryl-datahub>0.8.44.4`
The **put platform** command instructs `datahub` to create or update metadata about a data platform. This is very useful if you are using a custom data platform, to set up its logo and display name for a native UI experience.
@@ -346,6 +376,7 @@ datahub timeline --urn "urn:li:dataset:(urn:li:dataPlatform:mysql,User.UserAccou
The `dataset` command allows you to interact with the dataset entity.
The `get` operation can be used to read in a dataset into a yaml file.
+
```shell
datahub dataset get --urn "$URN" --to-file "$FILE_NAME"
```
@@ -358,7 +389,6 @@ datahub dataset upsert -f dataset.yaml
An example of `dataset.yaml` would look like as in [dataset.yaml](https://github.com/datahub-project/datahub/blob/master/metadata-ingestion/examples/cli_usage/dataset/dataset.yaml).
-
### user (User Entity)
The `user` command allows you to interact with the User entity.
@@ -411,7 +441,6 @@ members:
display_name: "Joe's Hub"
```
-
### dataproduct (Data Product Entity)
**🤝 Version Compatibility:** `acryl-datahub>=0.10.2.4`
@@ -566,14 +595,12 @@ Use this to delete a Data Product from DataHub. Default to `--soft` which preser
# > datahub dataproduct delete --urn "urn:li:dataProduct:pet_of_the_week" --hard
```
-
## Miscellaneous Admin Commands
### lite (experimental)
The lite group of commands allow you to run an embedded, lightweight DataHub instance for command line exploration of your metadata. This is intended more for developer tool oriented usage rather than as a production server instance for DataHub. See [DataHub Lite](./datahub_lite.md) for more information about how you can ingest metadata into DataHub Lite and explore your metadata easily.
-
### telemetry
To help us understand how people are using DataHub, we collect anonymous usage statistics on actions such as command invocations via Mixpanel.
@@ -640,7 +667,6 @@ External Entities Affected: None
Old Entities Migrated = {'urn:li:dataset:(urn:li:dataPlatform:hive,logging_events,PROD)', 'urn:li:dataset:(urn:li:dataPlatform:hive,SampleHiveDataset,PROD)', 'urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_deleted,PROD)', 'urn:li:dataset:(urn:li:dataPlatform:hive,fct_users_created,PROD)'}
```
-
## Alternate Installation Options
### Using docker
@@ -673,7 +699,7 @@ We use a plugin architecture so that you can install only the dependencies you a
Please see our [Integrations page](https://datahubproject.io/integrations) if you want to filter on the features offered by each source.
| Plugin Name | Install Command | Provides |
-|------------------------------------------------------------------------------------------------| ---------------------------------------------------------- | --------------------------------------- |
+| ---------------------------------------------------------------------------------------------- | ---------------------------------------------------------- | --------------------------------------- |
| [metadata-file](./generated/ingestion/sources/metadata-file.md) | _included by default_ | File source and sink |
| [athena](./generated/ingestion/sources/athena.md) | `pip install 'acryl-datahub[athena]'` | AWS Athena source |
| [bigquery](./generated/ingestion/sources/bigquery.md) | `pip install 'acryl-datahub[bigquery]'` | BigQuery source |
@@ -715,7 +741,7 @@ Please see our [Integrations page](https://datahubproject.io/integrations) if yo
### Sinks
| Plugin Name | Install Command | Provides |
-|-------------------------------------------------------------------| -------------------------------------------- | -------------------------- |
+| ----------------------------------------------------------------- | -------------------------------------------- | -------------------------- |
| [metadata-file](../metadata-ingestion/sink_docs/metadata-file.md) | _included by default_ | File source and sink |
| [console](../metadata-ingestion/sink_docs/console.md) | _included by default_ | Console sink |
| [datahub-rest](../metadata-ingestion/sink_docs/datahub.md) | `pip install 'acryl-datahub[datahub-rest]'` | DataHub sink over REST API |
diff --git a/docs/deploy/aws.md b/docs/deploy/aws.md
index d1003077e24861..67dd9a734e67f5 100644
--- a/docs/deploy/aws.md
+++ b/docs/deploy/aws.md
@@ -76,7 +76,7 @@ First, if you did not use eksctl to setup the kubernetes cluster, make sure to g
Download the IAM policy document for allowing the controller to make calls to AWS APIs on your behalf.
```
-curl -o iam_policy.json https://raw.githubusercontent.com/kubernetes-sigs/aws-load-balancer-controller/v2.2.0/docs/install/iam_policy.json
+curl -o iam_policy.json https://raw.githubusercontent.com/kubernetes-sigs/aws-load-balancer-controller/main/docs/install/iam_policy.json
```
Create an IAM policy based on the policy document by running the following.
@@ -148,13 +148,9 @@ datahub-frontend:
alb.ingress.kubernetes.io/certificate-arn: <>
alb.ingress.kubernetes.io/inbound-cidrs: 0.0.0.0/0
alb.ingress.kubernetes.io/listen-ports: '[{"HTTP": 80}, {"HTTPS":443}]'
- alb.ingress.kubernetes.io/actions.ssl-redirect: '{"Type": "redirect", "RedirectConfig": { "Protocol": "HTTPS", "Port": "443", "StatusCode": "HTTP_301"}}'
+ alb.ingress.kubernetes.io/ssl-redirect: '443'
hosts:
- host: <>
- redirectPaths:
- - path: /*
- name: ssl-redirect
- port: use-annotation
paths:
- /*
```
diff --git a/docs/features.md b/docs/features.md
index 7951eea1f909c8..bcf3ec3618f88b 100644
--- a/docs/features.md
+++ b/docs/features.md
@@ -28,7 +28,7 @@ To get started with DataHub, you can use a simple CLI command. Alternatively, yo
- [Quickstart](quickstart.md)
- [Self-hosted DataHub](deploy/kubernetes.md)
-- [Managed DataHub](managed-datahub/managed-datahub-overview.md)
+- [DataHub Cloud](managed-datahub/managed-datahub-overview.md)
### Ingestion
diff --git a/docs/features/feature-guides/documentation-forms.md b/docs/features/feature-guides/documentation-forms.md
index 8b2966810de7c0..b007892e660946 100644
--- a/docs/features/feature-guides/documentation-forms.md
+++ b/docs/features/feature-guides/documentation-forms.md
@@ -77,7 +77,7 @@ Provide a step-by-step guide to use feature, including relevant screenshots and/
### Videos
-**Asset Verification in Acryl Cloud**
+**Asset Verification in DataHub Cloud**
diff --git a/docs/how/search.md b/docs/how/search.md
index 7012f5321f2ff7..c809ab1efba12d 100644
--- a/docs/how/search.md
+++ b/docs/how/search.md
@@ -71,7 +71,7 @@ After creating a filter, you can choose whether results should or should not mat
### Results
-Search results appear ranked by their relevance. In self-hosted DataHub ranking is based on how closely the query matched textual fields of an asset and its metadata. In Managed DataHub, ranking is based on a combination of textual relevance, usage (queries / views), and change frequency.
+Search results appear ranked by their relevance. In self-hosted DataHub ranking is based on how closely the query matched textual fields of an asset and its metadata. In DataHub Cloud, ranking is based on a combination of textual relevance, usage (queries / views), and change frequency.
With better metadata comes better results. Learn more about ingestion technical metadata in the [metadata ingestion](../../metadata-ingestion/README.md) guide.
@@ -148,24 +148,42 @@ The same GraphQL API that powers the Search UI can be used
for integrations and programmatic use-cases.
```
-# Example query
-{
- searchAcrossEntities(
- input: {types: [], query: "*", start: 0, count: 10, filters: [{field: "fieldTags", value: "urn:li:tag:Dimension"}]}
+# Example query - search for datasets matching the example_query_text who have the Dimension tag applied to a schema field and are from the data platform looker
+query searchEntities {
+ search(
+ input: {
+ type: DATASET,
+ query: "example_query_text",
+ orFilters: [
+ {
+ and: [
+ {
+ field: "fieldTags",
+ values: ["urn:li:tag:Dimension"]
+ },
+ {
+ field: "platform",
+ values: ["urn:li:dataPlatform:looker"]
+ }
+ ]
+ }
+ ],
+ start: 0,
+ count: 10
+ }
) {
start
count
total
searchResults {
entity {
+ urn
type
... on Dataset {
- urn
- type
+ name
platform {
name
}
- name
}
}
}
diff --git a/docs/how/updating-datahub.md b/docs/how/updating-datahub.md
index ffceb7a5d1b020..08ababcb5cfce9 100644
--- a/docs/how/updating-datahub.md
+++ b/docs/how/updating-datahub.md
@@ -55,6 +55,19 @@ New (optional fields `systemMetadata` and `headers`):
"headers": {}
}
```
+- #10858 Profiling configuration for Glue source has been updated.
+
+Previously, the configuration was:
+```yaml
+profiling: {}
+```
+
+Now, it needs to be:
+
+```yaml
+profiling:
+ enabled: true
+```
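For context, a complete minimal Glue recipe using the new nested flag might look like the following sketch; the `aws_region` and sink settings are illustrative, not prescriptive:

```yaml
# Hypothetical Glue recipe after #10858 — profiling is now a nested block
source:
  type: glue
  config:
    aws_region: us-east-1 # illustrative
    profiling:
      enabled: true # replaces the old `profiling: {}` shorthand

sink:
  type: datahub-rest
  config:
    server: http://localhost:8080
```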
### Potential Downtime
@@ -66,7 +79,10 @@ New (optional fields `systemMetadata` and `headers`):
### Other Notable Changes
- #10498 - Tableau ingestion can now be configured to ingest multiple sites at once and add the sites as containers. The feature is currently only available for Tableau Server.
-
+- #10466 - Extends configuration in `~/.datahubenv` to match the `DatahubClientConfig` object definition. See the full configuration at https://datahubproject.io/docs/python-sdk/clients/. The CLI now respects the updated configuration specified in `~/.datahubenv` across its functions and utilities. This means that for systems where SSL certificate verification is disabled, setting `disable_ssl_verification: true` in `~/.datahubenv` will apply to all CLI calls.
+- #11002 - The CLI will no longer auto-generate a `~/.datahubenv` file. You must either run `datahub init` to create that file, or set environment variables so that the config is loaded.
+- #11023 - Added a new parameter to datahub's `put` cli command: `--run-id`. This parameter is useful for associating a given write with an ingestion process. One use case is to mimic transformers when no transformer exists for the aspect being written.
+- #11051 - Ingestion reports will now trim the summary text to a maximum of 800k characters to avoid generating `dataHubExecutionRequestResult` aspects that are too large for GMS to handle.
## 0.13.3
### Breaking Changes
diff --git a/docs/incidents/incidents.md b/docs/incidents/incidents.md
index 41b4df10b78281..c1412d0bebd043 100644
--- a/docs/incidents/incidents.md
+++ b/docs/incidents/incidents.md
@@ -417,9 +417,9 @@ Authorization: Bearer
Also, remember that you can play with an interactive version of the GraphQL API at `https://your-account-id.acryl.io/api/graphiql`
:::
-## Enabling Slack Notifications (Acryl Cloud Only)
+## Enabling Slack Notifications (DataHub Cloud Only)
-In Acryl Cloud, you can configure your to send Slack notifications to a specific channel when incidents are raised or their status is changed.
+In DataHub Cloud, you can configure your instance to send Slack notifications to a specific channel when incidents are raised or their status is changed.
These notifications are also able to tag the immediate asset's owners, along with the owners of downstream assets consuming it.
diff --git a/docs/lineage/airflow.md b/docs/lineage/airflow.md
index 62715ed506ffe9..2d7707637e2d1c 100644
--- a/docs/lineage/airflow.md
+++ b/docs/lineage/airflow.md
@@ -17,7 +17,7 @@ There's two actively supported implementations of the plugin, with different Air
| Approach | Airflow Version | Notes |
| --------- | --------------- | --------------------------------------------------------------------------- |
-| Plugin v2 | 2.3+ | Recommended. Requires Python 3.8+ |
+| Plugin v2 | 2.3.4+ | Recommended. Requires Python 3.8+ |
| Plugin v1 | 2.1+ | No automatic lineage extraction; may not extract lineage if the task fails. |
If you're using Airflow older than 2.1, it's possible to use the v1 plugin with older versions of `acryl-datahub-airflow-plugin`. See the [compatibility section](#compatibility) for more details.
@@ -45,7 +45,7 @@ Set up a DataHub connection in Airflow, either via command line or the Airflow U
airflow connections add --conn-type 'datahub-rest' 'datahub_rest_default' --conn-host 'http://datahub-gms:8080' --conn-password ''
```
-If you are using hosted Acryl Datahub then please use `https://YOUR_PREFIX.acryl.io/gms` as the `--conn-host` parameter.
+If you are using DataHub Cloud then please use `https://YOUR_PREFIX.acryl.io/gms` as the `--conn-host` parameter.
#### Airflow UI
@@ -66,7 +66,7 @@ enabled = True # default
```
| Name | Default value | Description |
-|----------------------------|----------------------|------------------------------------------------------------------------------------------|
+| -------------------------- | -------------------- | ---------------------------------------------------------------------------------------- |
| enabled | true | If the plugin should be enabled. |
| conn_id | datahub_rest_default | The name of the datahub rest connection. |
| cluster | prod | name of the airflow cluster, this is equivalent to the `env` of the instance |
@@ -132,7 +132,7 @@ conn_id = datahub_rest_default # or datahub_kafka_default
```
| Name | Default value | Description |
-|----------------------------|----------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| -------------------------- | -------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| enabled | true | If the plugin should be enabled. |
| conn_id | datahub_rest_default | The name of the datahub connection you set in step 1. |
| cluster | prod | name of the airflow cluster |
@@ -240,6 +240,7 @@ See this [example PR](https://github.com/datahub-project/datahub/pull/10452) whi
There might be a case where DAGs are removed from Airflow but the corresponding pipelines and tasks still exist in DataHub; let's call these `obsolete pipelines and tasks`.
Following are the steps to clean them up from DataHub:
+
- create a DAG named `Datahub_Cleanup`, i.e.
```python
@@ -263,8 +264,31 @@ with DAG(
)
```
+
- ingest this DAG, and it will remove all the obsolete pipelines and tasks from DataHub based on the `cluster` value set in `airflow.cfg`
+## Get all dataJobs associated with a dataFlow
+
+If you are looking to find all tasks (aka DataJobs) that belong to a specific pipeline (aka DataFlow), you can use the following GraphQL query:
+
+```graphql
+query {
+ dataFlow(urn: "urn:li:dataFlow:(airflow,db_etl,prod)") {
+ childJobs: relationships(
+ input: { types: ["IsPartOf"], direction: INCOMING, start: 0, count: 100 }
+ ) {
+ total
+ relationships {
+ entity {
+ ... on DataJob {
+ urn
+ }
+ }
+ }
+ }
+ }
+}
+```
## Emit Lineage Directly
diff --git a/docs/managed-datahub/approval-workflows.md b/docs/managed-datahub/approval-workflows.md
index 75cab458d285d9..e28ba1b87d7eaa 100644
--- a/docs/managed-datahub/approval-workflows.md
+++ b/docs/managed-datahub/approval-workflows.md
@@ -6,7 +6,7 @@ import FeatureAvailability from '@site/src/components/FeatureAvailability';
## Overview
-Keeping all your metadata properly classified can be hard work when you only have a limited number of trusted data stewards. With Managed DataHub, you can source proposals of Tags and Glossary Terms associated to datasets or dataset columns. These proposals may come from users with limited context or programatic processes using hueristics. Then, data stewards and data owners can go through them and only approve proposals they consider correct. This reduces the burden of your stewards and owners while increasing coverage.
+Keeping all your metadata properly classified can be hard work when you only have a limited number of trusted data stewards. With DataHub Cloud, you can source proposals of Tags and Glossary Terms associated with datasets or dataset columns. These proposals may come from users with limited context or programmatic processes using heuristics. Then, data stewards and data owners can go through them and only approve proposals they consider correct. This reduces the burden on your stewards and owners while increasing coverage.
Approval workflows also cover the Business Glossary itself. This allows you to source Glossary Terms and Glossary Term description changes from across your organization while limiting who has final control over what gets in.
diff --git a/docs/managed-datahub/chrome-extension.md b/docs/managed-datahub/chrome-extension.md
index a4560bc8cc09ba..a98aa6cb7e3040 100644
--- a/docs/managed-datahub/chrome-extension.md
+++ b/docs/managed-datahub/chrome-extension.md
@@ -1,12 +1,12 @@
---
-description: Learn how to upload and use the Acryl DataHub Chrome extension (beta) locally before it's available on the Chrome store.
+description: Learn how to upload and use the DataHub Cloud Chrome extension (beta) locally before it's available on the Chrome store.
---
-# Acryl DataHub Chrome Extension
+# DataHub Cloud Chrome Extension
## Installing the Extension
-In order to use the Acryl DataHub Chrome extension, you need to download it onto your browser from the Chrome web store [here](https://chrome.google.com/webstore/detail/datahub-chrome-extension/aoenebhmfokhglijmoacfjcnebdpchfj).
+In order to use the DataHub Cloud Chrome extension, you need to download it onto your browser from the Chrome web store [here](https://chrome.google.com/webstore/detail/datahub-chrome-extension/aoenebhmfokhglijmoacfjcnebdpchfj).
@@ -18,7 +18,7 @@ Simply click "Add to Chrome" then "Add extension" on the ensuing popup.
## Configuring the Extension
-Once you have your extension installed, you'll need to configure it to work with your Acryl DataHub deployment.
+Once you have your extension installed, you'll need to configure it to work with your DataHub Cloud deployment.
1. Click the extension button on the right of your browser's address bar to view all of your installed extensions. Click on the newly installed DataHub extension.
@@ -40,7 +40,7 @@ If your organization uses standard SaaS domains for Looker, you should be ready
### Additional Configurations
-Some organizations have custom SaaS domains for Looker and some Acryl DataHub deployments utilize **Platform Instances** and set custom **Environments** when creating DataHub assets. If any of these situations applies to you, please follow the next few steps to finish configuring your extension.
+Some organizations have custom SaaS domains for Looker and some DataHub Cloud deployments utilize **Platform Instances** and set custom **Environments** when creating DataHub assets. If any of these situations applies to you, please follow the next few steps to finish configuring your extension.
1. Click on the extension button and select your DataHub extension to open the popup again. Now click the settings icon in order to open the configurations page.
@@ -62,13 +62,13 @@ Some organizations have custom SaaS domains for Looker and some Acryl DataHub de
Once you have everything configured on your extension, it's time to use it!
-1. First ensure that you are logged in to your Acryl DataHub instance.
+1. First ensure that you are logged in to your DataHub Cloud instance.
2. Navigate to Looker or Tableau and log in to view your data assets.
3. Navigate to a page where DataHub can provide insights on your data assets (Dashboards and Explores).
-4. Click the Acryl DataHub extension button on the bottom right of your page to open a drawer where you can now see additional information about this asset right from your DataHub instance.
+4. Click the DataHub Cloud extension button on the bottom right of your page to open a drawer where you can now see additional information about this asset right from your DataHub instance.
@@ -78,7 +78,7 @@ Once you have everything configured on your extension, it's time to use it!
## Advanced: Self-Hosted DataHub
-If you are using the Acryl DataHub Chrome extension for your self-hosted DataHub instance, everything above is applicable. However, there is one additional step you must take in order to set up your instance to be compatible with the extension.
+If you are using the DataHub Cloud Chrome extension for your self-hosted DataHub instance, everything above is applicable. However, there is one additional step you must take in order to set up your instance to be compatible with the extension.
### Configure Auth Cookies
diff --git a/docs/managed-datahub/datahub-api/graphql-api/getting-started.md b/docs/managed-datahub/datahub-api/graphql-api/getting-started.md
index 736bf6fea6811d..5993e2dfd773dd 100644
--- a/docs/managed-datahub/datahub-api/graphql-api/getting-started.md
+++ b/docs/managed-datahub/datahub-api/graphql-api/getting-started.md
@@ -4,7 +4,7 @@ description: Getting started with the DataHub GraphQL API.
# Getting Started
-The Acryl DataHub GraphQL API is an extension of the open source [DataHub GraphQL API.](docs/api/graphql/overview.md)
+The DataHub Cloud GraphQL API is an extension of the open source [DataHub GraphQL API.](docs/api/graphql/overview.md)
For a full reference to the Queries & Mutations available for consumption, check out [Queries](graphql/queries.md) & [Mutations](graphql/mutations.md).
@@ -39,7 +39,7 @@ e.g. to connect to ingestion endpoint for doing ingestion programmatically you c
The entire GraphQL API can be explored & [introspected](https://graphql.org/learn/introspection/) using GraphiQL, an interactive query tool which allows you to navigate the entire Acryl GraphQL schema as well as craft & issue queries using an intuitive UI.
-[GraphiQL](https://www.gatsbyjs.com/docs/how-to/querying-data/running-queries-with-graphiql/) is available for each Acryl DataHub deployment, locating at `https://your-account.acryl.io/api/graphiql`.
+[GraphiQL](https://www.gatsbyjs.com/docs/how-to/querying-data/running-queries-with-graphiql/) is available for each DataHub Cloud deployment, located at `https://your-account.acryl.io/api/graphiql`.
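Outside of GraphiQL, the same queries can be issued programmatically against the `/api/graphql` endpoint. The sketch below only builds the request (the host and token are placeholders, so it is not actually sent); the `me` query is a simple smoke test from the open source GraphQL schema:

```python
import json
from urllib import request

# Hedged sketch: constructing a GraphQL request for a DataHub Cloud deployment.
# Host and token are placeholders; /api/graphql mirrors the /api/graphiql path above.
endpoint = "https://your-account.acryl.io/api/graphql"
token = "<personal-access-token>"  # placeholder

payload = json.dumps({"query": "{ me { corpUser { username } } }"})
req = request.Request(
    endpoint,
    data=payload.encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {token}",
    },
)
# request.urlopen(req) would send it; omitted here since the host is a placeholder.
print(req.get_full_url())
```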
### Querying the API
diff --git a/docs/managed-datahub/managed-datahub-overview.md b/docs/managed-datahub/managed-datahub-overview.md
index 4efc96eaf17a7c..f4d155ff0dbe10 100644
--- a/docs/managed-datahub/managed-datahub-overview.md
+++ b/docs/managed-datahub/managed-datahub-overview.md
@@ -1,8 +1,8 @@
-# How Acryl DataHub compares to DataHub
+# How DataHub Cloud compares to DataHub
DataHub is the #1 open source data catalog for developers.
-Acryl DataHub takes DataHub to the next level by offering features that allow
+DataHub Cloud takes DataHub to the next level by offering features that allow
you to roll out the product to the entire organization beyond your central data
platform team.
@@ -12,7 +12,7 @@ checklists and we hope you love them too!
## Search and Discovery
Features aimed at making it easy to discover data assets at your organization and understand relationships between them.
-| Feature | DataHub | Acryl DataHub |
+| Feature | DataHub | DataHub Cloud |
| ---------------------------------------------- | ------- | ------------- |
| Integrations for 50+ data sources | ✅ | ✅ |
| Table level, Column-level, Job-level lineage | ✅ | ✅ |
@@ -32,7 +32,7 @@ Features aimed at making it easy to discover data assets at your organization an
Features that help you govern the crown jewels of your organization, and trim
out the datasets that seem to grow like weeds when no one’s looking.
-| Feature | DataHub | Acryl DataHub |
+| Feature | DataHub | DataHub Cloud |
| ---------------------------------------------- | ------- | ------------- |
| Shift-Left governance | ✅ | ✅ |
| Dataset ownership management | ✅ | ✅ |
@@ -48,7 +48,7 @@ Features that help you ensure your data pipelines are producing high quality
assets, and if they’re not, making sure you and impacted users are the first to
know.
-| Feature | DataHub | Acryl DataHub |
+| Feature | DataHub | DataHub Cloud |
| ---------------------------------------------- | ------- | ------------- |
| Surface data quality results | ✅ | ✅ |
| Create data contracts | ✅ | ✅ |
@@ -68,7 +68,7 @@ know.
## Enterprise Grade
Features needed to roll out at scale to large enterprises.
-| Feature | DataHub | Acryl DataHub |
+| Feature | DataHub | DataHub Cloud |
| ---------------------------------------------- | ------- | ------------- |
| Battle-tested open source metadata platform | ✅ | ✅ |
| Metadata change events as a real-time stream | ✅ | ✅ |
@@ -82,7 +82,7 @@ Features needed to roll out at scale to large enterprises.
## Implementation and Support
Features related to ease of deployment and maintenance.
-| Feature | DataHub | Acryl DataHub |
+| Feature | DataHub | DataHub Cloud |
| ---------------------------------------------- | ------- | ------------- |
| Community support | ✅ | ✅ |
| Your own engineering team | ✅ | ❌ (They can instead focus on high-value work like contributing features to the open source product, or build amazing data applications with the APIs!)|
@@ -102,7 +102,7 @@ the form using the link below, and someone from the Acryl team will reach
out to set up a chat.
- Learn about Acryl DataHub
+ Learn about DataHub Cloud