updating text on faq and index (#89)
Signed-off-by: Frayne, Craig <[email protected]>
craigmateo authored Sep 6, 2024
1 parent 7dd231b commit 035e04d
Showing 2 changed files with 28 additions and 50 deletions.
59 changes: 20 additions & 39 deletions faq.md
@@ -1,53 +1,38 @@
# OPEA Frequently Asked Questions

## What is OPEAs mission?
OPEA’s mission is to offer a validated enterprise-grade GenAI (Generative Artificial Intelligence) RAG reference implementation. This will simplify GenAI development and deployment, thereby accelerating time-to-market.
## What is OPEA's mission?
OPEA’s mission is to offer a validated enterprise-grade GenAI (Generative Artificial Intelligence) RAG reference implementation. This will simplify GenAI development and deployment, thereby accelerating time-to-market.

## What is OPEA?
The project currently consists of a technical conceptual framework that enables GenAI implementations to meet enterprise-grade requirements. The project offers a set of reference implementations for a wide range of enterprise use cases that can be used out-of-the-box. The project additionally offers a set of validation and compliance tools to ensure the reference implementations meet the needs outlined in the conceptual framework. This enables new reference implementations to be contributed and validated in an open manner. Partnering with the LF AI & Data places in the perfect spot for multi-partner development, evolution, and expansion.
The project currently consists of a technical conceptual framework that enables GenAI implementations to meet enterprise-grade requirements. The project offers a set of reference implementations for a wide range of enterprise use cases that can be used out-of-the-box. Additionally, the project provides a set of validation and compliance tools to ensure the reference implementations meet the needs outlined in the conceptual framework. This enables new reference implementations to be contributed and validated in an open manner. Partnering with the LF AI & Data places it in the perfect spot for multi-partner development, evolution, and expansion.

## What problems are faced by GenAI deployments within the enterprise?
Enterprises face a myriad of challenges in development and deployment of Gen AI. The development of new models, algorithms, fine tuning techniques, detecting and resolving bias and how to deploy large solutions at scale continues to evolve at a rapid pace. One of the biggest challenges enterprises come up against is a lack of standardized software tools and technologies from which to choose. Additionally, enterprises want the flexibility to innovate rapidly, extend the functionality to meet their business needs while ensuring the solution is secure and trustworthy. The lack of a framework that encompasses both proprietary and open solutions impedes enterprises from charting their destiny. This results in enormous investment of time and money impacting time-to-market advantage. OPEA answers the need for a multi-provider, ecosystem-supported framework that enables the evaluation, selection, customization, and trusted deployment of solutions that businesses can rely on.
Enterprises face a myriad of challenges in the development and deployment of GenAI. The development of new models, algorithms, fine-tuning techniques, detecting and resolving bias, and how to deploy large solutions at scale continues to evolve at a rapid pace. One of the biggest challenges enterprises come up against is a lack of standardized software tools and technologies from which to choose. Additionally, enterprises want the flexibility to innovate rapidly, extend functionality to meet their business needs while ensuring the solution is secure and trustworthy. The lack of a framework that encompasses both proprietary and open solutions impedes enterprises from charting their destiny. This results in an enormous investment of time and money, impacting the time-to-market advantage. OPEA answers the need for a multi-provider, ecosystem-supported framework that enables the evaluation, selection, customization, and trusted deployment of solutions that businesses can rely on.

## Why now?
The major adoption and deployment cycle of robust, secure, enterprise-grade Gen AI solutions across all industries is at its early stages. Enterprise-grade solutions will require collaboration in the open ecosystem. The time is now for the ecosystem to come together and accelerate GenAI deployments across enterprises by offering a standardized set of tools and technologies while supporting three key tenets – open, security, and scalability. This will require the ecosystem to work together to build reference implementations that are performant, trustworthy and enterprise-grade ready.
The major adoption and deployment cycle of robust, secure, enterprise-grade GenAI solutions across all industries is in its early stages. Enterprise-grade solutions will require collaboration in the open ecosystem. The time is now for the ecosystem to come together and accelerate GenAI deployments across enterprises by offering a standardized set of tools and technologies while supporting three key tenets – openness, security, and scalability. This will require the ecosystem to work together to build reference implementations that are performant, trustworthy, and enterprise-grade ready.

## How does it compare to other options for deploying Gen AI solutions within the enterprise?
There is not an alternative that brings the entire ecosystem together in a vendor neutral manner and delivers on the promise of open, security and scalability. This is our primary motivation for creating OPEA project.
There is no alternative that brings the entire ecosystem together in a vendor-neutral manner and delivers on the promise of openness, security, and scalability. This is our primary motivation for creating the OPEA project.

## Will OPEA reference implementations work with proprietary components?
Like any other open-source project, the community will determine which components are needed by the broader ecosystem. Enterprises can always extend OPEA project with other multi-vendor proprietary solutions to achieve their business goals.
Like any other open-source project, the community will determine which components are needed by the broader ecosystem. Enterprises can always extend the OPEA project with other multi-vendor proprietary solutions to achieve their business goals.

## What does OPEA acronym stand for?
Open Platform for Enterprise AI
Open Platform for Enterprise AI.

## How do I pronounce OPEA?
It is said ‘OH-PEA-AY'

## What companies and open-source projects are part of OPEA?
AnyScale
Cloudera
DataStax
Domino Data Lab
HuggingFace
Intel
KX
MariaDB Foundation
MinIO
Qdrant
Red Hat
SAS
VMware by Broadcom
Yellowbrick Data
Zilliz
It is pronounced ‘OH-PEA-AY.’

## What initial companies and open-source projects joined OPEA?
AnyScale, Cloudera, DataStax, Domino Data Lab, HuggingFace, Intel, KX, MariaDB Foundation, MinIO, Qdrant, Red Hat, SAS, VMware by Broadcom, Yellowbrick Data, Zilliz.

## What is Intel contributing?
OPEA is to be defined jointly by several community partners, with a call for broad ecosystem contribution, under the well-established LF AI & Data Foundation. As a starting point, Intel has contributed a Technical Conceptual Framework that shows how to construct and optimize curated GenAI pipelines built for secure, turnkey enterprise deployment. At launch, Intel contributed several reference implementations on Intel hardware across Intel® Xeon® 5, Intel® Xeon® 6 and Intel® Gaudi® 2, which you can see in a Github repo here. Over time we intend to add to that contribution including a software infrastructure stack to enable fully containerized AI workload deployments as well as potentially implementations of those containerized workloads.
OPEA is to be defined jointly by several community partners, with a call for broad ecosystem contribution, under the well-established LF AI & Data Foundation. As a starting point, Intel has contributed a Technical Conceptual Framework that shows how to construct and optimize curated GenAI pipelines built for secure, turnkey enterprise deployment. At launch, Intel contributed several reference implementations on Intel hardware across Intel® Xeon® 5, Intel® Xeon® 6, and Intel® Gaudi® 2, which you can see in a GitHub repo here. Over time we intend to add to that contribution, including a software infrastructure stack to enable fully containerized AI workload deployments, as well as potentially implementations of those containerized workloads.

## When you say Technical Conceptual Framework, what components are included?
The models and modules can be part of an OPEA repository, or be published in a stable unobstructed repository (e.g., Hugging Face) and cleared for use by an OPEA assessment. These include:
## When you say Technical Conceptual Framework, what components are included?
The models and modules can be part of an OPEA repository or be published in a stable, unobstructed repository (e.g., Hugging Face) and cleared for use by an OPEA assessment. These include:

GenAI models – Large Language Models (LLMs), Large Vision Models (LVMs), multimodal models, etc.
* Ingest/Data Processing
* Embedding Models/Services
* Indexing/Vector/Graph data stores
@@ -68,17 +53,13 @@ There are different ways partners can contribute to this project:
* Build the infrastructure to support OPEA projects

## Where can partners see the latest draft of the Conceptual Framework spec?
A version of the spec is available in the docs repo in this project
A version of the spec is available in the documentation (["docs"](https://github.com/opea-project/docs)) repository within this project.

## Is there a cost for joining?
There is no cost for anyone to join and contribute.
There is no cost for anyone to join and contribute to the OPEA project.

## Do I need to be Linux Foundation member to join?
## Do I need to be a Linux Foundation member to join?
Anyone can join and contribute. You don’t need to be a Linux Foundation member.

## Where can I report a bug?
Vulnerability reports can be sent to [email protected].




## Where can I report a bug or vulnerability?
Vulnerability reports and bug submissions can be sent to [[email protected]](mailto:[email protected]).
19 changes: 8 additions & 11 deletions index.rst
@@ -4,11 +4,11 @@ OPEA Project Documentation
##########################

Welcome to the OPEA Project (|version|) documentation published |today|.
OPEA streamlines implementation of enterprise-grade Generative AI by efficiently
integrating secure, performant, and cost-effective Generative AI workflows to business value.
OPEA streamlines the implementation of enterprise-grade Generative AI by efficiently
integrating secure, performant, and cost-effective Generative AI workflows into business processes.

Source code for the OPEA Project is maintained in the
`OPEA Project GitHub repo`_.
`OPEA Project GitHub repository`_.

.. comment The links in this grid display can't use :ref: because we're
using raw html. There's a risk of broken links if referenced content is
@@ -20,7 +20,7 @@ Source code for the OPEA Project is maintained in the
<li class="grid-item">
<a href="introduction/index.html">
<img alt="" src="_static/images/opea-icon-white.svg" width="80px"/><br/>
<h2>What is OPEA</h2>
<h2>What is OPEA?</h2>
</a>
<p>Learn about the OPEA architecture, features, and benefits.</p>
</li>
@@ -29,9 +29,7 @@ Source code for the OPEA Project is maintained in the
<span class="grid-icon fa fa-map-signs"></span>
<h2>Getting Started</h2>
</a>
<p>Get started, whether building GenAI solutions or contributing to
the community.
</p>
<p>Start building GenAI solutions or contribute to the community.</p>
</li>
<li class="grid-item">
<a href="community/index.html">
@@ -52,15 +50,14 @@ Source code for the OPEA Project is maintained in the
<span class="grid-icon fa fa-sign-in"></span>
<h2>Deploy GenAI Solutions</h2>
</a>
<p>Select from several deployment strategies that best match your
enterprise needs.</p>
<p>Select from several deployment strategies that best match your enterprise needs.</p>
</li>
<li class="grid-item">
<a href="microservices/index.html">
<span class="grid-icon fa fa-object-group"></span>
<h2>Browse GenAI Microservices</h2>
</a>
<p>Use modular building blocks to build robust GenAI solutions</p>
<p>Use modular building blocks to build robust GenAI solutions.</p>
</li>
</ul>

@@ -80,4 +77,4 @@ Source code for the OPEA Project is maintained in the
release_notes/index
faq

.. _OPEA Project GitHub repo: https://github.com/opea-project
.. _OPEA Project GitHub repository: https://github.com/opea-project
