
Add PR binary graduation to the things spackbot can do #45

Open · wants to merge 1 commit into main from graduate-pr-binaries
Conversation

scottwittenburg (Collaborator) commented on Aug 11, 2021

Add a new handler to spackbot's routing to process spack PRs that have been merged to develop. When one of these events arrives, we want to copy the binaries from the per-PR mirror (used by the GitLab pipelines run against that PR) to a shared PR mirror. Other PR pipelines can then use the shared PR mirror until the develop pipeline associated with the merged PR finishes populating the main mirror with its own copies of those binaries.
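A minimal sketch of how the handler might decide that a pull_request event qualifies for graduation. The field names follow GitHub's pull_request webhook payload; the function name and the develop-branch check are illustrative, not the actual spackbot code.

```python
# Hypothetical sketch (not the actual spackbot handler): a pull_request
# webhook event qualifies for binary graduation only when the PR was
# closed *by merging* into the develop branch.
def should_graduate(payload: dict) -> bool:
    pr = payload.get("pull_request", {})
    return (
        payload.get("action") == "closed"           # PR was closed...
        and pr.get("merged") is True                # ...because it was merged...
        and pr.get("base", {}).get("ref") == "develop"  # ...into develop
    )


event = {
    "action": "closed",
    "pull_request": {"merged": True, "base": {"ref": "develop"}},
}
print(should_graduate(event))  # True
```

Note that `action == "closed"` alone is not enough: GitHub sends the same action for PRs closed without merging, so the `merged` flag must be checked explicitly.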

Because copying the binaries could take a while (depending on how many specs the PR had to rebuild), and updating the shared PR mirror index could take a long time (depending on how many binaries the mirror contains), this PR uses the rq module to schedule those tasks as jobs run by another process in a separate container, freeing the handler to return quickly.
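The enqueueing pattern might look like the sketch below. The task functions, mirror URLs, and `StubQueue` are hypothetical; rq's real `Queue.enqueue(func, *args, depends_on=job)` API takes the same shape, but a tiny stub stands in here so the example runs without a Redis server.

```python
# Hypothetical sketch of offloading the slow steps as chained jobs.
# In production this would use rq.Queue backed by Redis; the stub below
# only records what would be enqueued, so the example is self-contained.

def copy_pr_binaries(pr_mirror_url, shared_mirror_url):
    """Copy built binaries from the per-PR mirror to the shared PR mirror."""
    ...


def update_mirror_index(mirror_url):
    """Regenerate the shared PR mirror's index after new binaries land."""
    ...


class StubQueue:
    """Stand-in for rq.Queue that records enqueued job names."""

    def __init__(self):
        self.jobs = []

    def enqueue(self, func, *args, depends_on=None, **kwargs):
        self.jobs.append(func.__name__)
        return func.__name__  # rq returns a Job object; a name suffices here


def enqueue_graduation_jobs(queue, pr_mirror_url, shared_mirror_url):
    # Reindexing must not start until the copy finishes, so chain the
    # jobs with depends_on rather than enqueueing them independently.
    copy_job = queue.enqueue(copy_pr_binaries, pr_mirror_url, shared_mirror_url)
    queue.enqueue(update_mirror_index, shared_mirror_url, depends_on=copy_job)


q = StubQueue()
enqueue_graduation_jobs(q, "s3://pr-mirror/pr123", "s3://shared-pr-mirror")
print(q.jobs)  # ['copy_pr_binaries', 'update_mirror_index']
```

Chaining via `depends_on` is what lets the webhook handler return immediately: both slow steps run later, in order, in the separate worker container.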

I've tested this using the updated docker-compose.yml and a webhook I set up on the spack repo to send pull request events. Because that's not quite the same as a GitHub app sending the requests, I temporarily commented out the installation-authentication part for my testing. I'd love a better workflow for testing this part of spackbot without breaking it for everyone else, though.

I'm currently working on a change to spack-infrastructure to update the spackbot-spack-io deployment with the extra containers spackbot will need to support PR binary graduation this way. See that PR for how the deployment looks and how I'm testing it "in production".

Remaining TODO:

  • Clean up per-PR mirrors here rather than in the cron job where it is currently done (by this code); we don't want binaries deleted while we're still copying them.
  • Uncomment the things that were commented out for testing: installation authentication and all route registrations not required for PR binary graduation
  • Sort out the mismatch between where spackbot images get pushed and where the k8s deployment pulls them

@scottwittenburg force-pushed the graduate-pr-binaries branch 2 times, most recently from 4287e0f to c5ecbd3, on August 11, 2021 21:15
@scottwittenburg force-pushed the graduate-pr-binaries branch 9 times, most recently from 124c808 to 8d37c3f, on August 20, 2021 19:57
@scottwittenburg marked this pull request as ready for review on August 20, 2021 20:17
kwryankrattiger added a commit to kwryankrattiger/spackbot that referenced this pull request on Sep 30, 2022:

    Changes migrated from PR spack#45 using the new workers and a slightly
    more general name for the long-running process queue.

    TODO: Add a pruning task that is occasionally inserted after a copy
    task but before reindex
@kwryankrattiger kwryankrattiger mentioned this pull request Sep 30, 2022
kwryankrattiger added a commit to kwryankrattiger/spackbot that referenced this pull request on Oct 14, 2022 (same commit message as above).
kwryankrattiger added a commit to kwryankrattiger/spackbot that referenced this pull request on Oct 21, 2022 (same commit message as above).
kwryankrattiger (Collaborator) commented:

Deprecated by work in #58, #67, and #68.
