
Fixed CI configuration: adapted folder structure #539

Merged
3 commits merged on Jan 27, 2025

Conversation

Dscano
Contributor

@Dscano Dscano commented Jan 26, 2025

Moved the folder from docs/v1/build to docs/v1 and separately copied the generated files.
@chrispsommers

Signed-off-by: Dscano <d.scano89@gmail.com>
Contributor

@jafingerhut jafingerhut left a comment


LGTM

@chrispsommers chrispsommers merged commit 6af490f into p4lang:main Jan 27, 2025
5 checks passed
@chrispsommers
Collaborator

@jafingerhut @Dscano The build finally went through, until it encountered upload auth issues. Does anybody know the solution?

@Dscano
Contributor Author

Dscano commented Jan 27, 2025

@jafingerhut @Dscano The build finally went through, until it encountered upload auth issues. Does anybody know the solution?

I compared the logs from main-branch-uploads-success and main-branch-uploads-fails. It seems that the main-branch-uploads-fails run attempts to delete old files, but cannot do so due to permission issues. These permissions are tied to the AWS_S3 user that runs the command.

Based on the arguments and this guide, it should work, but further investigation is needed.
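For reference, the failing step is presumably an S3 sync with deletion enabled. A hedged sketch of such a command (the bucket name and paths are hypothetical); the --delete flag is exactly what requires object-deletion permission for the CI user's credentials:

```shell
# Hypothetical sketch of the upload step: sync built docs to S3.
# --delete removes stale objects from the bucket, which requires the
# s3:DeleteObject permission for the AWS user running this in CI.
aws s3 sync docs/v1/build "s3://example-spec-bucket/p4runtime/main" \
    --delete \
    --acl public-read
```

Without the delete permission, the sync itself can succeed while the cleanup of old files fails, which matches the behavior described above.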

@jafingerhut
Contributor

I don't know the solution, but perhaps this CI task does not have access to the secret credentials required to update that S3 storage account?

@chrispsommers
Collaborator

@antoninbas Any ideas here? Thanks.

@chrispsommers
Collaborator

@fruffy we're having S3 access issues, do you have any advice?

@antoninbas
Member

I have added the missing permission so it should work now.

Also, please modify the job definition so that only the required files are uploaded (build outputs), not all the source files. cc @Dscano

On a side note, this is my personal AWS account. I am happy to keep it going the way it is, but feel free to switch to a p4.org account if such a thing exists.

Finally, it seems that the build outputs are no longer "copied" to the gh-pages branch, which means the official p4.org links are no longer up-to-date.

@fruffy
Contributor

fruffy commented Jan 31, 2025

I can check this over the weekend if it isn't resolved by then.

@chrispsommers
Collaborator

Thanks @antoninbas. @fruffy, we should probably have a p4.org account for this and not depend on personal accounts, although Antonin has graciously extended the use of his.

@chrispsommers
Collaborator

@Dscano Could you please take a look at @antoninbas's comments? Thanks!

@Dscano Dscano mentioned this pull request Jan 31, 2025
@Dscano
Contributor Author

Dscano commented Jan 31, 2025

Hi @chrispsommers, I’ve addressed Antonin’s comment in this PR #541 (comment)

@fruffy
Contributor

fruffy commented Feb 1, 2025

@AdarshRawat1 You did a great job with automated documentation and CI for P4C. We have a similar problem here with P4Runtime. Do you think you could set up a similar structure for p4runtime? We do not need Doxygen etc, html and pdf previews for a branch are enough. If you are interested we can discuss with @chrispsommers, @Dscano, and @jafingerhut. I do not think it will be a big time investment.

@antoninbas Trying to understand the current setup: why do we need AWS to upload the HTML and PDF files? Is GitHub Pages not enough?

@AdarshRawat1
Member

@AdarshRawat1 You did a great job with automated documentation and CI for P4C. We have similar problem here with P4Runtime. Do you think you could set up a similar structure for p4runtime?

I'll be happy to help! I'm looking into the current setup and will update you soon.

@antoninbas Trying to understand the current setup: why do we need AWS to upload the HTML and PDF files? Is GitHub Pages not enough?

+1 on this.

@AdarshRawat1
Member

Finally, it seems that the build outputs are no longer "copied" to the gh-pages branch, which means the official p4.org links are no longer up-to-date.

@antoninbas Could you help me understand why the latest GitHub Pages deployment is pointing to this?

[screenshot omitted]

Even the June 7, 2024 deployment leads to an empty page. Has it been like this from the start, or did something change recently?
[screenshot omitted]

@AdarshRawat1
Member

AdarshRawat1 commented Feb 1, 2025

Do you think you could set up a similar structure for p4runtime? We do not need Doxygen etc, html and pdf previews for a branch are enough.

Based on the docs, I'm assuming that we'll be working with AsciiDoc for rendering the output on GitHub Pages. Is this what you have in mind?

@fruffy
Contributor

fruffy commented Feb 1, 2025

Do you think you could set up a similar structure for p4runtime? We do not need Doxygen etc, html and pdf previews for a branch are enough.

Based on the docs, I'm assuming that we'll be working with AsciiDoc for rendering the output on GitHub Pages. Is this what you have in mind?

Davide set this up in #510. AFAIK this is already working, but I have not kept up with the developments in this repository.

@Dscano
Contributor Author

Dscano commented Feb 2, 2025

Do you think you could set up a similar structure for p4runtime? We do not need Doxygen etc, html and pdf previews for a branch are enough.

Based on the docs, I'm assuming that we'll be working with AsciiDoc for rendering the output on GitHub Pages. Is this what you have in mind?

Yes, so basically, AsciiDoc is the text processor used to generate the .pdf and .html files. You can generate these files via the make command within the p4runtime/docs/v1/ directory.
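A minimal sketch of that build step, assuming asciidoctor and asciidoctor-pdf are installed and that the Makefile drives them (the exact make targets and output path may differ):

```shell
# Build the P4Runtime spec locally with the AsciiDoc toolchain.
cd p4runtime/docs/v1
make        # invokes asciidoctor / asciidoctor-pdf on the .adoc sources
ls          # the generated .html and .pdf files (output path may vary)
```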

@AdarshRawat1
Member

AdarshRawat1 commented Feb 3, 2025

I prototyped the GitHub workflow based on my understanding.

@Dscano @fruffy @jafingerhut @chrispsommers @antoninbas Are you looking for something like this?

You can review the workflow file here, and the demo deployment repository here.

Suggestion: We can skip Amazon S3 to avoid billing for @antoninbas and use GitHub Pages for redirection. Let me know your thoughts!

@jafingerhut
Contributor

@AdarshRawat1 I do not know the details, but I believe there are GitHub Actions for this repository: https://github.com/p4lang/p4-spec

such that on every PR merged into the main branch, the PDF and HTML of the P4-16-spec.adoc AsciiDoc source file are generated and placed somewhere, I think in the gh-pages branch of the repo, where they soon become visible via links on this page: https://p4.org/specs/

In quickly looking at the .github/workflows files in the p4-spec repo, I notice it mentions S3 there as well, but I have not tried to follow its full behavior to understand it. From what I explain below, it is possible that the S3 version copied by the p4-spec scripts is not used by the p4.org web site.

On this page https://p4.org/specs/ there are 2 links for "working draft" versions of the P4_16 language specification. The HTML URL link on that page is to https://p4.org/p4-spec/docs/p4-16-working-draft.html

If you run wget https://p4.org/p4-spec/docs/p4-16-working-draft.html on a system with wget installed, you get this HTML text:

<!DOCTYPE html>
<html>
<head>
<script>
// Fetch the rendered spec HTML from the gh-pages branch of p4lang/p4-spec.
const url = "https://raw.githubusercontent.com/p4lang/p4-spec/gh-pages/docs/P4-16-working-spec.html";
const req = new XMLHttpRequest();
req.open("GET", url, true);
req.onload = (_) => {
    // Replace the current (empty) page with the fetched HTML.
    const doc = document.open("text/html", "replace");
    doc.write(req.response);
    doc.close();
};
req.send(null);
</script>
</head>
<body></body>
</html>

I do not know JavaScript, but Nate Foster wrote the above. The basic idea is to fetch the HTML from https://raw.githubusercontent.com/p4lang/p4-spec/gh-pages/docs/P4-16-working-spec.html and load it into the user's browser for display. This was a quick hack, because linking directly to https://raw.githubusercontent.com/p4lang/p4-spec/gh-pages/docs/P4-16-working-spec.html shows raw HTML source, not rendered HTML.

There is a similar JavaScript script for the language spec PDF link.

There are also similar ones for the PSA specification, PNA specification, and P4Runtime specification. The other specifications on that page have not been updated in a while, so they do not use this technique for their working-draft links (if they have them), which is fine: they do not need to be updated.

What would be nice for all of the active specifications is a similar workflow, such that merging a PR into their main/master branch causes the PDF and HTML to be generated and the latest version to be stored somewhere visible from the p4.org specifications page. This has been working for over a year now, though perhaps with different GitHub workflow scripts for different P4 specification repositories. It would be nice if they could be made more similar to each other.

One thing that was an issue for over a year in the p4-spec repo is the way it committed new versions of the generated PDF and HTML: it kept a full history of all PDF and HTML versions, growing the repository history and size by a lot on every merge. That might be improved now; I am not sure. I believe the p4runtime repo used different options on the git commands committing to its gh-pages branch (maybe the -f option?) that kept it from growing by a lot on every new PDF and HTML version.
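For context, a sketch of the force-push approach hinted at above, which replaces gh-pages with a single fresh commit so the branch never accumulates a history of old PDF/HTML versions (branch names and file paths are assumptions, not the repo's actual setup):

```shell
# Recreate gh-pages as an orphan branch containing only the latest docs.
git checkout --orphan gh-pages-tmp     # new branch with no prior history
git rm -rf --cached .                  # unstage everything from the old tree
git add docs/P4Runtime-Spec.html docs/P4Runtime-Spec.pdf   # assumed paths
git commit -m "Deploy latest generated spec"
git push -f origin gh-pages-tmp:gh-pages   # force-push replaces old history
```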

@antoninbas
Member

Davide was able to fix the CI by restoring the previous folder structure. So now the gh-pages branch is updated as expected and the latest ("draft") version of the spec is accessible through p4.org.

I don't know if any additional change is required wrt gh-pages now that this has been addressed.

On the workflow that pushes build artifacts to an AWS S3 bucket: this was the legacy way of uploading the spec, until we switched to gh-pages. We kept it around because it uploads the build outputs (.html and .pdf files) not just for the main branch and for tags, but also for any branch in this repo. This was a convenient way to review spec changes from frequent contributors who have write access to this repo and can therefore open PRs from branches in this repo instead of from a personal fork. When reviewing a PR, reviewers can access a "well-known" link (which includes the branch name) to make sure there are no obvious formatting issues. This is still documented here: https://github.com/p4lang/p4runtime/tree/main/docs#ci-upload-of-built-documents. IIRC, there is a rule in the AWS bucket that deletes the spec documents for a development branch after a couple of months. I will let the current maintainers / WG chairs decide what they want to do:

  1. keep things as they are
  2. upload the build outputs for dev branches to gh-pages instead of AWS S3 - I don't think there is any technical issue with this, it's just a bit more work to set up some job to delete stale documents (assuming one doesn't want things to accumulate in gh-pages)
  3. upload the build outputs as workflow artifacts - this is the best solution because it should also work for PRs from forks, and artifacts can be deleted automatically after 30 or 90 days
  4. remove that capability altogether

The 3rd solution is the best one IMO. Someone could take a look and check that the uploaded artifacts - and in particular the .html file - are easy to view.
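As an illustration of the 3rd option, a step along these lines could be added to the build job; the artifact name and file paths below are assumptions, and actions/upload-artifact supports automatic expiry via retention-days:

```yaml
# Hypothetical workflow step: publish the built spec as a workflow artifact.
- name: Upload rendered spec
  uses: actions/upload-artifact@v4
  with:
    name: p4runtime-spec
    path: |
      docs/v1/P4Runtime-Spec.html
      docs/v1/P4Runtime-Spec.pdf
    retention-days: 30   # artifacts are cleaned up automatically
```

This would work for PRs from forks as well, since no repository secrets are needed to upload artifacts.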

@chrispsommers
Collaborator

@antoninbas Thanks for the refresher. I too like your proposed 3rd approach. We should harmonize with the other P4 WGs if possible. I'm not sure what effort is needed to achieve this - would the P4 LDWG need to make corresponding changes, etc.?

@fruffy
Contributor

fruffy commented Feb 3, 2025

@AdarshRawat1 has set up something like this for P4C's documentation. We could have a similar approach here. Example:
p4lang/p4c#5113 (comment)

We can move the discussion to a separate issue to track this.

@smolkaj
Member

smolkaj commented Feb 14, 2025

Question for Linux foundation from the P4 API WG discussion:

  • What is their recommendation on who to attribute the license to? Should we use individual authors or a group ("The P4Runtime authors")? Should we have a CONTRIBUTORS file?

7 participants