
Build openstack-watcher container #215

Merged

Conversation

raukadah
Collaborator

@raukadah raukadah commented Sep 26, 2024

This pull request:
* Builds OpenStack Watcher containers
* Adds the python-watcherclient package to the openstackclient container
* Adds the openstack-watcher-ui package to the horizon container

Note: python-watcherclient and openstack-watcher-ui are not currently available downstream; they are available in CentOS Stream with the RDO Antelope release. That is why these packages are added under a tcib_distro conditional, to avoid breakage downstream.
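
For illustration only, the conditional could look roughly like the sketch below in the tcib container YAML; the file path, key names, and exact form of the tcib_distro guard are assumptions here, not a copy of this PR's diff.

```yaml
# Hypothetical sketch of the openstackclient container definition; the key
# names and Jinja guard are illustrative and may not match the actual tcib
# layout in this repository.
tcib_packages:
  common:
  - python-openstackclient
{% if tcib_distro == "centos" %}
  # python-watcherclient is only packaged in CentOS Stream / RDO Antelope so
  # far, so it is kept out of downstream (RHEL-based) builds.
  - python-watcherclient
{% endif %}
```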

Test results: #215 (comment)

Jira: https://issues.redhat.com/browse/OSPRH-11085
Co-authored-by: Alfredo Moralejo amoralej@redhat.com
Signed-off-by: Chandan Kumar raukadah@gmail.com


Build failed (check pipeline). Post `recheck` (without leading slash)
to rerun all jobs. Make sure the failure cause has been resolved before
you rerun jobs.

https://softwarefactory-project.io/zuul/t/rdoproject.org/buildset/009b2d2b9cb144478d5d71af89970d0b

openstack-meta-content-provider FAILURE in 23m 49s
⚠️ tcib-crc-podified-edpm-baremetal SKIPPED Skipped due to failed job openstack-meta-content-provider
⚠️ tcib-podified-multinode-edpm-deployment-crc SKIPPED Skipped due to failed job openstack-meta-content-provider

@raukadah
Collaborator Author

/assign @rlandy

- httpd
- mod_ssl
- openstack-watcher-common
- python3-mod_wsgi
Contributor

Only the watcher-api service runs as a WSGI application under httpd; I'd move httpd, mod_ssl, and python3-mod_wsgi to the watcher-api container.
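
A rough sketch of the suggested split across the two container definitions; the file boundaries and package sets here are assumptions, not the final diff:

```yaml
# Hypothetical openstack-watcher (base) definition: only the shared bits stay here.
tcib_packages:
  common:
  - openstack-watcher-common
---
# Hypothetical openstack-watcher-api definition (its own file): the API is the
# only service running as WSGI under httpd, so the web-server packages move here.
tcib_packages:
  common:
  - httpd
  - mod_ssl
  - openstack-watcher-api
  - python3-mod_wsgi
```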

Collaborator Author

Thank you! Updated in the PR.


Build failed (check pipeline). Post `recheck` (without leading slash)
to rerun all jobs. Make sure the failure cause has been resolved before
you rerun jobs.

https://softwarefactory-project.io/zuul/t/rdoproject.org/buildset/430f4b85b356447da30bead4066d555e

openstack-meta-content-provider FAILURE in 15m 46s
⚠️ tcib-crc-podified-edpm-baremetal SKIPPED Skipped due to failed job openstack-meta-content-provider
⚠️ tcib-podified-multinode-edpm-deployment-crc SKIPPED Skipped due to failed job openstack-meta-content-provider

@raukadah
Collaborator Author

raukadah commented Nov 4, 2024

recheck

raukadah added a commit to openstack-k8s-operators/ci-framework that referenced this pull request Nov 4, 2024
Watcher-related packages were added in RDO Antelope but are not available in
OSP-18 downstream.

openstack-k8s-operators/tcib#215 adds the watcher container in tcib. Since
downstream tcib jobs install the tcib RPM from the main branch, it may break
the downstream container build and component jobs.

In order to avoid that breakage, we skip building the watcher-related
containers.

Signed-off-by: Chandan Kumar <raukadah@gmail.com>
raukadah added a commit to openstack-k8s-operators/ci-framework that referenced this pull request Nov 4, 2024
Watcher-related packages were added in RDO Antelope but are not available in
OSP-18 downstream.

openstack-k8s-operators/tcib#215 adds the watcher container in tcib. Since
downstream tcib jobs install the tcib RPM from the main branch, it may break
the downstream container build and component jobs.

In order to avoid that breakage, we skip building the watcher-related
containers.

Depends-On: openstack-k8s-operators/tcib#215

Signed-off-by: Chandan Kumar <raukadah@gmail.com>
@raukadah raukadah marked this pull request as ready for review November 5, 2024 10:35
@openshift-ci openshift-ci bot requested review from abays and stuggi November 5, 2024 10:35
raukadah added a commit to openstack-k8s-operators/ci-framework that referenced this pull request Nov 5, 2024
Watcher-related packages were added in RDO Antelope but are not available in
OSP-18 downstream.

openstack-k8s-operators/tcib#215 adds the watcher container in tcib. Since
downstream tcib jobs install the tcib RPM from the main branch, it may break
the downstream container build and component jobs.

In order to avoid that breakage, we skip building the watcher-related
containers.

Note: It also fixes the exclude conditional to check for the existence of the
release key in the exclude var, to avoid the following error:
```
''dict object'' has no attribute ''master'''
```

Depends-On: openstack-k8s-operators/tcib#215

Signed-off-by: Chandan Kumar <raukadah@gmail.com>
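
As a minimal sketch of the exclude-conditional fix described in the commit message above, the idea is to index the exclude map only when the current release key exists; the variable names below (cifmw_exclude_containers, cifmw_release) are hypothetical stand-ins, not the actual ci-framework names.

```yaml
# Hedged Ansible sketch: only index the per-release exclude map when the key
# exists, so a branch such as "master" without an entry no longer fails with
# "'dict object' has no attribute 'master'".
- name: Build the list of containers to skip for this release
  ansible.builtin.set_fact:
    containers_to_skip: >-
      {{ cifmw_exclude_containers[cifmw_release]
         if cifmw_release in cifmw_exclude_containers
         else [] }}
```
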
@openshift-ci openshift-ci bot added the lgtm label Nov 6, 2024
"${SITE_PACKAGES}/watcher_dashboard/conf/watcher_policy.json" \
"/etc/openstack-dashboard/watcher_policy.json"
}

Contributor

I'm afraid this will break the horizon deployment downstream. This script requires the ENABLE_WATCHER variable, which we would need to add in horizon-operator, and it also requires files installed by openstack-watcher-ui, which will not be installed downstream. What if we leave the dashboard configuration part for a future PR, or at least don't run it for now?

Collaborator Author

@amoralej Yes, indeed, it will break. I am moving this whole piece out; https://issues.redhat.com/browse/OSPRH-11277 will track it, and I will add it in a future PR.

@@ -105,6 +117,7 @@ config_heat_dashboard
config_ironic_dashboard
config_manila_ui
config_octavia_dashboard
config_watcher_dashboard
Contributor

ditto

Collaborator Author

Done

This pull request:
* Builds OpenStack Watcher containers
* Adds the python-watcherclient package to the openstackclient container
* Adds the openstack-watcher-ui package to the horizon container

Note: python-watcherclient and openstack-watcher-ui are not currently
available downstream; they are available in CentOS Stream with the RDO
Antelope release. That is why these packages are added under a tcib_distro
conditional, to avoid breakage downstream.

Jira: https://issues.redhat.com/browse/OSPRH-11085
Co-authored-by: Alfredo Moralejo <amoralej@redhat.com>
Signed-off-by: Chandan Kumar <raukadah@gmail.com>
Contributor

@amoralej amoralej left a comment

lgtm

Collaborator

@rabi rabi left a comment

Is the watcher service going to be tech-preview in an OSP 18 Feature Release or in OSP 19? Is there any PM agreement for that which you could link in the linked Jira?

@openshift-ci openshift-ci bot added the lgtm label Nov 7, 2024
Contributor

openshift-ci bot commented Nov 7, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: rabi, raukadah, rlandy

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-ci openshift-ci bot added the approved label Nov 7, 2024
@openshift-merge-bot openshift-merge-bot bot merged commit bcb58ab into openstack-k8s-operators:main Nov 7, 2024
5 checks passed
@raukadah
Collaborator Author

raukadah commented Nov 7, 2024

Is the watcher service going to be tech-preview in an OSP 18 Feature Release or in OSP 19? Is there any PM agreement for that which you could link in the linked Jira?

@rabi Hello, thank you for the comments. https://issues.redhat.com/browse/OSPRH-7674 tracks it, and it is targeted as a tech preview in the RHOS-18.0 FR 2 release.

@raukadah raukadah deleted the add_watcher branch November 7, 2024 09:11
@raukadah
Collaborator Author

/cherry-pick 18.0-fr1

@openshift-cherrypick-robot

@raukadah: new pull request created: #247

In response to this:

/cherry-pick 18.0-fr1

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
