
Add support for WORKFLOW and user/group IDs to assemble job #416

Merged: sethmlarson merged 2 commits into main from dra on Oct 10, 2022

Conversation

@sethmlarson (Contributor) commented Oct 5, 2022

Adapted elastic/elasticsearch-py#2021 and elastic/elasticsearch-py#2018 to elasticsearch-java as a starting point for DRA support.

Things I changed and outstanding questions:

  • Changed USER_ID and GROUP_ID to BUILDER_UID and BUILD_GID to match the other clients' Dockerfiles. I'm not sure whether these values had any hidden effect elsewhere, maybe something in the Gradle build?
  • Added a patch version to version.txt; when this is backported to the 8.5 branch this value will need to be updated. I'm also not certain whether this has any impact on the client?
  • I didn't change anything about what is built into .ci/output. In particular, we aren't doing anything with .ci/output/repository/ and I'm not sure whether we need to be?

Tested with the following to emulate how assemble would behave when run under DRA:

Snapshot

$ STACK_VERSION=$(cat config/version.txt) WORKFLOW=snapshot ./.ci/make.sh assemble $STACK_VERSION

$ ls .ci/output/release/
dependencies.csv
elasticsearch-java-8.6.0-SNAPSHOT-javadoc.jar
elasticsearch-java-8.6.0-SNAPSHOT-sources.jar
elasticsearch-java-8.6.0-SNAPSHOT.jar
elasticsearch-java-8.6.0-SNAPSHOT.pom

Staging

$ STACK_VERSION=$(cat config/version.txt) WORKFLOW=staging ./.ci/make.sh assemble $STACK_VERSION

$ ls .ci/output/release/
dependencies.csv
elasticsearch-java-8.6.0.jar
elasticsearch-java-8.6.0-javadoc.jar
elasticsearch-java-8.6.0.pom
elasticsearch-java-8.6.0-sources.jar
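
The only difference between the two runs is the WORKFLOW value: snapshot keeps the -SNAPSHOT qualifier on every artifact, staging drops it. As a rough illustration only (the real logic lives in the Gradle build and .ci/make.sh, and the variable names here are assumptions), the mapping amounts to something like:

# Illustrative sketch, not the committed build logic: derive the artifact
# version from STACK_VERSION and WORKFLOW.
STACK_VERSION="${STACK_VERSION:-$(cat config/version.txt)}"
if [ "${WORKFLOW:-snapshot}" = "snapshot" ]; then
  ARTIFACT_VERSION="${STACK_VERSION%-SNAPSHOT}-SNAPSHOT"   # e.g. 8.6.0-SNAPSHOT
else
  ARTIFACT_VERSION="${STACK_VERSION%-SNAPSHOT}"            # e.g. 8.6.0
fi
echo "elasticsearch-java-${ARTIFACT_VERSION}.jar"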

@picandocodigo (Member) left a comment


The USER_ID and GROUP_ID in the Dockerfile are, as far as I understand, just there so the release manager has permissions on the artifacts produced by make.sh assemble. Hopefully it doesn't affect anything else that uses the Dockerfile, but Sylvain will know better.
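
For context, the usual pattern (sketched below with assumed flags, paths, and image name; the exact invocation in .ci/make.sh may differ) is to pass the host uid/gid into the image and run the build as that user, so the files written to .ci/output end up owned by the caller rather than root:

# Assumed invocation, for illustration only; the real commands live in .ci/make.sh.
docker build \
  --build-arg BUILDER_UID="$(id -u)" \
  --build-arg BUILD_GID="$(id -g)" \
  --tag elasticsearch-java-assemble .

# Run the assemble step as the host uid/gid so the artifacts in .ci/output
# are writable by the release manager afterwards.
docker run --rm \
  --user "$(id -u):$(id -g)" \
  --volume "$PWD/.ci/output:/build/.ci/output" \
  elasticsearch-java-assemble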

I can't comment on the version.txt change, but I think Sylvain was going to integrate this with automating the bump action.

The generated artifacts look like they're OK; the DRA artifact set will look for a maven dir in .ci/output/release. So I guess the pom file and the rest of the files are enough and no generation of a compressed file is needed, but yeah, Sylvain will know better.

@swallez (Member) left a comment


Changes to uid/gid look good.

Wondering about the version though: in the context of a DRA build, what is the version when make.sh assemble <version> is called, and where does it come from?

The Gradle build uses this value when set, and otherwise falls back to the version in config/version.txt.
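
In shell terms the fallback described above amounts to something like this (a sketch only; the actual check happens in the Gradle build, not in a script):

# Sketch of the version fallback; the real logic is in the Gradle build.
VERSION="${1:-}"                        # version passed to `make.sh assemble <version>`
if [ -z "$VERSION" ]; then
  VERSION="$(cat config/version.txt)"   # fall back to the checked-in version
fi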

@swallez (Member) commented Oct 10, 2022

I've added a commit that:

  • lazily creates the group and user in the Dockerfile (the group coming from my machine already exists in the base image); a rough sketch of this follows the list
  • makes the Dockerfile more minimal: we don't care about the user/group being system accounts, and the user doesn't need a home directory
  • implements the bump command, which just writes the new version to config/version.txt
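
A rough sketch of what the lazy creation step could look like inside the Dockerfile's RUN instruction (the user/group name and exact flags are illustrative, not the committed code):

# Illustrative only; BUILDER_UID and BUILD_GID are the build args passed in.
if ! getent group "$BUILD_GID" > /dev/null; then
  groupadd --gid "$BUILD_GID" builder            # create the group only if the gid is new
fi
if ! getent passwd "$BUILDER_UID" > /dev/null; then
  useradd --uid "$BUILDER_UID" --gid "$BUILD_GID" --no-create-home builder
fi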

the pom file and the rest of the files are enough and no generation of a compressed file is needed

This is what we currently provide to the release manager. If the DRA releaser behaves the same, that should be all that is needed. And indeed .ci/output/repository is not needed (I need to clean this up).

@swallez (Member) left a comment


LGTM with my additional commit ;-)

Do not pay attention to the failing "Code style and license headers" check; it'll be fixed in another PR.

@sethmlarson (Contributor, Author) left a comment


The extra changes LGTM after testing locally. I especially enjoy the simple implementation of the bump task :) Once this is merged I'll update the version.txt to 8.5.0 in the backport as well.

@sethmlarson sethmlarson merged commit 8c212ed into main Oct 10, 2022
@sethmlarson sethmlarson deleted the dra branch October 10, 2022 15:14
@github-actions commented

The backport to 8.5 failed:

The process '/usr/bin/git' failed with exit code 1

To backport manually, run these commands in your terminal:

# Fetch latest updates from GitHub
git fetch
# Create a new working tree
git worktree add .worktrees/backport-8.5 8.5
# Navigate to the new working tree
cd .worktrees/backport-8.5
# Create a new branch
git switch --create backport-416-to-8.5
# Cherry-pick the merged commit of this pull request and resolve the conflicts
git cherry-pick --mainline 1 8c212ed80157c91550862359285be882c0b9b410
# Push it to GitHub
git push --set-upstream origin backport-416-to-8.5
# Go back to the original working tree
cd ../..
# Delete the working tree
git worktree remove .worktrees/backport-8.5

Then, create a pull request where the base branch is 8.5 and the compare/head branch is backport-416-to-8.5.

@swallez (Member) commented Oct 10, 2022

Should we backport to 7.17, or will it stay on the Unified Release Manager?

@sethmlarson (Contributor, Author) commented

Yeah, that's the plan! We're waiting to merge the rest of the backports until we get a successful status on the main branch.
