
Decrease build time #125

Open
TiemenJoustraUM opened this issue Mar 11, 2021 · 1 comment

@TiemenJoustraUM (Contributor)

Currently we use a lot of CI time building the UM-kernel.
I think most of that time is spent checking out the Linux git repo.
Suggestion: could we check out a copy of the Linux git repo while building the Docker image? Then we would only need to update the repo at build time, which should save a lot of time.
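As a rough sketch of what this proposal could look like (image name, repo URL, and paths are illustrative, not taken from this project):

```dockerfile
# Hypothetical builder image that bakes a shallow clone of the kernel
# tree into a layer, so CI jobs only need to 'git fetch' afterwards.
FROM debian:bullseye
RUN apt-get update && \
    apt-get install -y --no-install-recommends git ca-certificates && \
    rm -rf /var/lib/apt/lists/*
# Pre-seed the source tree at image build time. This layer is large and
# goes stale, which is the drawback discussed in the reply below.
RUN git clone --depth 1 \
    https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git /src/linux
# At container run time, refresh instead of re-cloning:
CMD ["git", "-C", "/src/linux", "fetch", "--depth", "1", "origin"]
```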

@TiemenJoustraUM TiemenJoustraUM self-assigned this Mar 11, 2021
@oliv3r (Contributor) commented Mar 15, 2021

Hey @TiemenJoustraUM ,

you could, but it would heavily 'violate' Docker's best practices. If you do that, you end up with a container that doesn't have updated sources, and without updated sources, why rebuild at all? You generally rebuild because you made changes to the repo.

If you did this, you'd also need to push a new container every time the code changed. That means a pipeline that builds a container from each commit, which in turn has to clone the repo in order to build the container, so you end up in the same situation, just more complex.

You can of course do this locally, but then your automation and CI/CD kind of go out the door.

If you really want to solve this, you'd have to cache the git repo somehow, which is very hard. I'm not sure whether 'cloud builds' would be faster, e.g. a gitlab-runner hosted by GitLab pulling code from GitLab's own git instance; those should be very close together (in theory), so cloning should be fast. Likewise, hosting your own git infrastructure (your own GitLab instance) with the Gitaly instance close to your runner would also solve this, as cloning would be extremely fast.
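One way to cache a git repo is a bare mirror plus `git clone --reference`, so each build only transfers objects the cache is missing. A minimal, self-contained sketch using a throwaway local repo as a stand-in for the real kernel remote (all paths are illustrative):

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d)
# Stand-in "upstream" repo with one commit.
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m init
# 1) One-time: create a bare mirror that acts as the cache.
git clone -q --mirror "$tmp/upstream" "$tmp/cache.git"
# 2) Per build: refresh the cache, then clone cheaply via --reference,
#    borrowing objects from the cache instead of re-downloading them.
git -C "$tmp/cache.git" fetch -q --all
git clone -q --reference "$tmp/cache.git" "$tmp/upstream" "$tmp/work"
git -C "$tmp/work" log --oneline | grep -q init && echo "reference clone ok"
```

With a real remote, step 1 would be the expensive one-time clone on the runner host, and step 2 the cheap per-pipeline refresh.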
The middle-ish ground is to rely on local runner caches and to use the 'fetch' git strategy rather than 'clone' in your CI config. With 'clone' you always pull a fresh repo; with 'fetch', gitlab-runner tries to reuse the local working copy and runs 'git fetch' first, falling back to a full clone only on failure.
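In GitLab CI this is controlled by the `GIT_STRATEGY` variable; a hypothetical fragment (the job name and script are illustrative):

```yaml
# .gitlab-ci.yml sketch: reuse the runner's local checkout and fetch
# only new objects instead of doing a fresh clone every pipeline.
variables:
  GIT_STRATEGY: fetch   # falls back to a clone if no local copy exists
  GIT_DEPTH: "50"       # shallow history also cuts transfer time

build-kernel:
  script:
    - make kernel       # placeholder for the actual UM-kernel build step
```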

But yes, GitLab has some real performance issues with cloning big repositories (especially ones with submodules)! I share your pain, but haven't found a solution myself either.

Also, Hi @Ultimaker ;)
