Merge pull request #1284 from run-ai/integration-community-support-219
Merge pull request #1283 from run-ai/integration-community-support
yarongol authored Dec 11, 2024
2 parents 6859c31 + 722b496 commit 199f32a
Showing 1 changed file with 11 additions and 11 deletions.
22 changes: 11 additions & 11 deletions docs/platform-admin/integrations/integration-overview.md
@@ -14,21 +14,21 @@ The Run:ai community portal is password protected and access is provided to cust
| Tool | Category | Run:ai support details | Additional Information|
| ------------------ | ----------------| --------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Triton | Orchestration | Supported | Usage via docker base image. Quickstart inference [example](../../Researcher/Walkthroughs/quickstart-inference.md) |
| Spark | Orchestration | Community Support | <div style="width: 300px;"> It is possible to schedule Spark workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. </div> Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-Run-Spark-jobs-with-Run-AI](https://runai.my.site.com/community/s/article/How-to-Run-Spark-jobs-with-Run-AI){target=_blank} |
| Kubeflow Pipelines | Orchestration | Community Support | It is possible to schedule Kubeflow Pipelines with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal:<br>[https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Apache Airflow | Orchestration | Community Support | It is possible to schedule Airflow workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Apache-Airflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Apache-Airflow){target=_blank} |
| Argo Workflows | Orchestration | Community Support | It is possible to schedule Argo workflows with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Argo-Workflows](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Argo-Workflows){target=_blank} |
| Seldon Core | Orchestration | Community Support | It is possible to schedule Seldon Core workloads with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Seldon-Core](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Seldon-Core){target=_blank} |
| Jupyter Notebook | Development | Supported | Run:ai provides integrated support with Jupyter Notebooks. Quickstart example: [https://docs.run.ai/latest/Researcher/Walkthroughs/quickstart-jupyter/](https://docs.run.ai/latest/Researcher/Walkthroughs/quickstart-jupyter/) |
| JupyterHub | Development | Community Support | It is possible to submit Run:ai workloads via JupyterHub. For more information, please contact Run:ai customer support. |
| PyCharm | Development | Supported | Containers created by Run:ai can be accessed via PyCharm. PyCharm [example](../../Researcher/tools/dev-pycharm.md) |
| VS Code | Development | Supported | - Containers created by Run:ai can be accessed via Visual Studio Code. [example](../../Researcher/tools/dev-vscode.md) <br>- You can automatically launch Visual Studio Code Web from the Run:ai console. [example](../../Researcher/Walkthroughs/quickstart-vscode.md) |
| Kubeflow Notebooks | Development | Community Support | It is possible to launch a Kubeflow notebook with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal:<br>[https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Ray | Training, inference, data processing | Community Support | It is possible to schedule Ray jobs with the Run:ai scheduler. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-Integrate-Run-ai-with-Ray](https://runai.my.site.com/community/s/article/How-to-Integrate-Run-ai-with-Ray){target=_blank} |
| TensorBoard | Experiment tracking | Supported | Run:ai comes with a preset TensorBoard [Environment](../workloads/assets/environments.md) asset. TensorBoard [example](../../Researcher/tools/dev-tensorboard.md). <br> Additional [sample](https://github.com/run-ai/use-cases/tree/master/runai_tensorboard_demo_with_resnet){target=_blank} |
| Weights & Biases | Experiment tracking | Community Support | It is possible to schedule W&B workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-with-Weights-and-Biases](https://runai.my.site.com/community/s/article/How-to-integrate-with-Weights-and-Biases){target=_blank} <br> Additional samples [here](https://github.com/run-ai/use-cases/tree/master/runai_wandb){target=_blank} |
| ClearML | Experiment tracking | Community Support | It is possible to schedule ClearML workloads with the Run:ai scheduler. For details, please contact Run:ai customer success. Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-ClearML](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-ClearML){target=_blank} |
| MLflow | Model Serving | Community Support | It is possible to use MLflow together with the Run:ai scheduler. For details, please contact Run:ai customer support. Sample code can be found in the Run:ai customer success community portal: [https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-MLflow](https://runai.my.site.com/community/s/article/How-to-integrate-Run-ai-with-MLflow){target=_blank} <br> Additional MLflow [sample](https://github.com/run-ai/use-cases/tree/master/runai_mlflow_demo){target=_blank} |
| Hugging Face | Repositories | Supported | Run:ai provides an out-of-the-box integration with Hugging Face. |
| Docker Registry | Repositories | Supported | Run:ai allows using a Docker registry as a [Credentials](../workloads/assets/credentials.md) asset. |
| S3 | Storage | Supported | Run:ai communicates with S3 by defining a [data source](../workloads/assets/datasources.md) asset. |
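Most of the orchestration rows above come down to the same underlying mechanism: pointing a Kubernetes workload at the Run:ai scheduler instead of the default kube-scheduler. As a rough sketch only (the scheduler name `runai-scheduler` and the `project` label are assumptions for illustration; the community-portal articles linked in each row document the exact fields for each tool), a pod manifest might be built like this:

```python
# Hypothetical sketch: building a pod manifest that targets the Run:ai
# scheduler. The scheduler name and project label are illustrative
# assumptions, not a confirmed Run:ai API.
import json


def runai_pod_manifest(name: str, image: str, project: str) -> dict:
    """Return a minimal pod manifest routed to an assumed Run:ai scheduler."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {
            "name": name,
            # Assumed label associating the workload with a Run:ai project.
            "labels": {"project": project},
        },
        "spec": {
            # setting schedulerName hands scheduling of this pod to the named
            # scheduler rather than the default kube-scheduler.
            "schedulerName": "runai-scheduler",
            "containers": [{"name": name, "image": image}],
            "restartPolicy": "Never",
        },
    }


if __name__ == "__main__":
    manifest = runai_pod_manifest("spark-driver", "apache/spark:3.5.0", "team-a")
    print(json.dumps(manifest, indent=2))
```

The manifest could then be applied with `kubectl apply -f` or submitted through a Kubernetes client library; tools such as Airflow or Argo Workflows typically expose a pod-template field where the same `schedulerName` and labels can be injected.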
