UPSTREAM: <carry>: Fixed workflow's ServiceAccountName #129
base: master
Conversation
[APPROVALNOTIFIER] This PR is NOT APPROVED.

This pull-request has been approved by:
The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing
Commit Checker results:
A set of new images has been built to help with testing out this PR.

An OCP cluster where you are logged in as cluster admin is required. The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named `dspa.pr-129.yaml`:

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-129
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-129"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-129"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-129"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-129"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-129"
  mlmd:
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-129"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'
```

Then run the following:

```shell
cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/129/head
git checkout -b pullrequest 223937ee5835d9e49fba38c54edb9dfa10b8f313
oc apply -f dspa.pr-129.yaml
```

More instructions here on how to deploy and test a Data Science Pipelines Application.
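Once the DSPA is applied, one quick sanity check is to confirm the pods picked up the PR build by looking for the `pr-129` image tag. The namespace placeholder and the exact `oc` invocation below are assumptions, not part of the bot's instructions; the `grep` filter is demonstrated on a sample line of listing output:

```shell
# Hypothetical spot check (namespace is an assumption):
#   oc -n <dspa-namespace> get pods \
#     -o jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.spec.containers[*].image}{"\n"}{end}'
# Every DSP pod rebuilt for this PR should match the filter below.
# Demonstrated here on one sample line of that listing:
printf 'ds-pipeline-dspa\tquay.io/opendatahub/ds-pipelines-api-server:pr-129\n' \
  | grep -c ':pr-129'
```

A count of zero would indicate the pods are still running images from a previous tag.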
Force-pushed from 223937e to 6bf5d29
Change to PR detected. A new PR build was completed.
Force-pushed from 6bf5d29 to 313f5d9
Force-pushed from 313f5d9 to ae78d04
Change to PR detected. A new PR build was completed. |
Signed-off-by: Helber Belmiro <helber.belmiro@gmail.com>
Force-pushed from ae78d04 to a573091
I've tested the fix following the instructions in the PR, and pipeline schedules are working again. Also, I see no errors in the ds-pipeline-workflow-controller-dspa pod log related to a missing service account.
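The log check described above can be scripted. The deployment name comes from the comment, but the namespace placeholder and the `oc logs` invocation are assumptions; the `grep` pattern itself is demonstrated on a sample error line:

```shell
# Hypothetical log scan (namespace is an assumption):
#   oc logs deployment/ds-pipeline-workflow-controller-dspa -n <dspa-namespace> \
#     | grep -ci 'serviceaccount'
# A zero count means no service-account errors appear in the log.
# The filter is shown here matching one sample error line
# (the "pipeline-runner" account name is made up for illustration):
printf 'level=error msg="serviceaccounts \\"pipeline-runner\\" not found"\n' \
  | grep -ci 'serviceaccount'
```

Note that `grep` exits non-zero when the count is zero, which is the desired outcome here, so guard accordingly in scripts (e.g. append `|| true`).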
/hold
We decided to revert the previous changes for now.
Resolves: https://issues.redhat.com/browse/RHOAIENG-18533
Checklist: