Unzip artifacts after upload (store_artifacts)
Question:
We use Sphinx to build our Python documentation and then store the artifact. GitHub then shows a link directly in the CI to open the index.html.
But building the Sphinx doc creates thousands of files, which slows down the artifact upload. To reduce upload time, the docs say I can upload a compressed folder. But then, how do I persist the uncompressed folder? This is needed because we have a file artifact_path which needs a path to the index.html. Or is there another way around this?
- store_artifacts:
    path: folder.tar  # untar after?
Disclaimer: this is for SciPy; I am a core dev. This would be extremely helpful, as the artifact upload takes around 30% of this pipeline's build time.
Answers:
CircleCI uses AWS to store artifacts, so artifacts cannot be uncompressed and served directly after upload.
You would need another service/server to fetch the artifact, uncompress it, and serve the files.
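Given that constraint, the practical option is still to compress the Sphinx output into a single tarball before `store_artifacts`, accepting that readers download and extract it locally. A minimal sketch of that step is below; the paths (`docs/build/html`) are assumptions for illustration, not SciPy's actual layout:

```shell
#!/bin/sh
set -e

# Stand-in for the Sphinx build output (assumed layout).
mkdir -p docs/build/html
echo "<html></html>" > docs/build/html/index.html

# Pack the thousands of generated files into one archive so the
# artifact upload transfers a single object instead of many.
tar -czf docs.tar.gz -C docs/build html

# The corresponding CircleCI step would then point at the archive:
#   - store_artifacts:
#       path: docs.tar.gz

# Confirm the archive contains the entry point.
tar -tzf docs.tar.gz
```

The trade-off: upload is much faster, but the direct `index.html` link in the CI UI is lost, since CircleCI will serve the tarball as a download rather than browsable pages.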