
UPSTREAM: <carry>: remove direct k8s pkg import. #13

Merged 1 commit into opendatahub-io:master on Jan 30, 2024

Conversation

HumairAK

k8s.io/kubernetes is not meant to be used as a library; we can directly use the condition types from the api library instead:

Use of the k8s.io/kubernetes module or k8s.io/kubernetes/... packages as libraries is not supported.
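The pattern this carry moves to is checking CR status conditions using only the api-module condition types (e.g. `k8s.io/api/core/v1.ConditionStatus`), with no import of k8s.io/kubernetes. A minimal sketch of that pattern, using local stand-in types so the snippet has no external dependencies (the real code imports these from the api module):

```go
package main

import "fmt"

// ConditionStatus and Condition are stand-ins for the k8s.io/api types;
// they are illustrative only, not the project's actual definitions.
type ConditionStatus string

const (
	ConditionTrue  ConditionStatus = "True"
	ConditionFalse ConditionStatus = "False"
)

type Condition struct {
	Type   string
	Status ConditionStatus
}

// hasCondition reports whether the given condition type is present and True.
func hasCondition(conds []Condition, condType string) bool {
	for _, c := range conds {
		if c.Type == condType && c.Status == ConditionTrue {
			return true
		}
	}
	return false
}

func main() {
	status := []Condition{{Type: "Ready", Status: ConditionTrue}}
	fmt.Println(hasCondition(status, "Ready")) // → true
}
```

Because the check only needs the condition type and status enum, the api module is sufficient and the unsupported k8s.io/kubernetes dependency can be dropped.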

@HumairAK
Author

/hold


openshift-ci bot commented Jan 29, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: HumairAK

The full list of commands accepted by this bot can be found here.

The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@HumairAK
Author

To test this, please create a scheduled run.

@dsp-developers

A set of new images has been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-13
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-13
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-13
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-13
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-13
MLMD Server: quay.io/opendatahub/ds-pipelines-metadata-grpc:pr-13
MLMD Envoy Proxy: quay.io/opendatahub/ds-pipelines-metadata-envoy:pr-13
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-13

@dsp-developers

An OCP cluster where you are logged in as cluster admin is required.

The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named dspa.pr-13.yaml:

apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-13
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-13"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-13"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-13"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-13"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-13"
  mlmd:  
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/ds-pipelines-metadata-grpc:pr-13"
    envoy:
      image: "quay.io/opendatahub/ds-pipelines-metadata-envoy:pr-13"
  mlpipelineUI:
    deploy: true  # Optional component 
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-13"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'

Then run the following:

cd $(mktemp -d)
git clone [email protected]:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/13/head
git checkout -b pullrequest bdf37e37d0823ad6ed175db9972cd1879ac8cda0
oc apply -f dspa.pr-13.yaml

More instructions here on how to deploy and test a Data Science Pipelines Application.

@gregsheremeta

/lgtm

I verified that regular and scheduled runs are both working.

> to test this, please create a scheduled run

How does testing scheduled runs test this patch? Is there something specific about scheduled runs with respect to k8s.io/kubernetes?

@HumairAK
Author

@gregsheremeta yes, in this case it is used in the scheduledworkflow CRD status condition: https://github.com/kubeflow/pipelines/blob/bdf37e37d0823ad6ed175db9972cd1879ac8cda0/backend/src/crd/pkg/apis/scheduledworkflow/v1beta1/types.go#L184

When you create a scheduled run, this CRD is created; if the status creation had failed, you would probably have encountered a failure to create this scheduled CR.
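In other words, the controller maintains the CR's status.conditions list using those condition types, so a scheduled run exercises the changed code path. A rough sketch of the usual upsert-by-type pattern such controllers use, with local stand-in types (condition type name "Enabled" is an example, not necessarily the CRD's exact value):

```go
package main

import "fmt"

// Stand-ins for the api-module condition types (illustrative only; after
// this change the real CRD takes these from k8s.io/api, not k8s.io/kubernetes).
type ConditionStatus string

const ConditionTrue ConditionStatus = "True"

type Condition struct {
	Type   string
	Status ConditionStatus
	Reason string
}

// setCondition upserts a condition by type, the common controller pattern
// for keeping a CR's status.conditions list to one entry per type.
func setCondition(conds []Condition, c Condition) []Condition {
	for i := range conds {
		if conds[i].Type == c.Type {
			conds[i] = c
			return conds
		}
	}
	return append(conds, c)
}

func main() {
	var status []Condition
	status = setCondition(status, Condition{Type: "Enabled", Status: ConditionTrue, Reason: "Enabled"})
	fmt.Println(len(status), status[0].Type) // → 1 Enabled
}
```

If writing this status fails (for example because the condition helpers are broken), the failure shows up when the scheduled run's CR is created, which is why a scheduled run is the right smoke test here.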

@HumairAK HumairAK merged commit f3e082e into opendatahub-io:master Jan 30, 2024
1 of 2 checks passed