This project implements a GitLab CI/CD template to build, test and analyse your [Python](https://www.python.org/) projects.
This template can be used both as a [CI/CD component](https://docs.gitlab.com/ee/ci/components/#use-a-component)
or using the legacy [`include:project`](https://docs.gitlab.com/ee/ci/yaml/index.html#includeproject) syntax.
### Use as a CI/CD component
Add the following to your `.gitlab-ci.yml`:
```yaml
include:
# 1: include the component
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
# 2: set/override component inputs
inputs:
image: registry.hub.docker.com/library/python:3.12-slim
pytest-enabled: true
```
### Use as a CI/CD template (legacy)
Add the following to your `.gitlab-ci.yml`:
```yaml
include:
  # 1: include the template
  - project: "to-be-continuous/python"
    ref: "7.4.0"
    file: "/templates/gitlab-ci-python.yml"

variables:
  # 2: set/override template variables
  PYTHON_IMAGE: registry.hub.docker.com/library/python:3.12-slim
  PYTEST_ENABLED: "true"
```
## Global configuration
The Python template relies on some global configuration used throughout all jobs.
| Input / Variable | Description | Default value |
| -------------------- | ------------------------------------------------------------------------------------- | ------------------ |
| `image` / `PYTHON_IMAGE` | The Docker image used to run Python <br/>:warning: **set the version required by your project** | `registry.hub.docker.com/library/python:3-slim` |
| `project-dir` / `PYTHON_PROJECT_DIR` | Python project root directory | `.` |
| `build-system` / `PYTHON_BUILD_SYSTEM`| Python build-system to use to install dependencies, build and package the project (see below) | `auto` (auto-detect) |
| `PIP_INDEX_URL` | Python repository url | _none_ |
| `PIP_EXTRA_INDEX_URL` | Extra Python repository url | _none_ |
| `pip-opts` / `PIP_OPTS` | pip [extra options](https://pip.pypa.io/en/stable/cli/pip/#general-options) | _none_ |
| `extra-deps` / `PYTHON_EXTRA_DEPS` | Python extra sets of dependencies to install<br/>For [Setuptools](https://setuptools.pypa.io/en/latest/userguide/dependency_management.html?highlight=extras#optional-dependencies) or [Poetry](https://python-poetry.org/docs/pyproject/#extras) only | _none_ |
| `reqs-file` / `PYTHON_REQS_FILE` | Main requirements file _(relative to `$PYTHON_PROJECT_DIR`)_<br/>For [Requirements Files](https://pip.pypa.io/en/stable/user_guide/#requirements-files) build-system only | `requirements.txt` |
| `extra-reqs-files` / `PYTHON_EXTRA_REQS_FILES` | Extra dev requirements file(s) to install _(relative to `$PYTHON_PROJECT_DIR`)_ | `requirements-dev.txt` |
The cache policy also takes care of the pip cache, so that Python dependencies are not downloaded over and over again.
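For instance, here is a minimal sketch overriding some of these global settings through component inputs (the values are purely illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      # values below are illustrative examples
      image: registry.hub.docker.com/library/python:3.12-slim
      build-system: "poetry"
      pip-opts: "--timeout 60"
```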
## Multi build-system support
The Python template supports the most popular dependency management & build systems.
By default it tries to auto-detect the build system used by the project (based on the presence of `pyproject.toml`
and/or `setup.py` and/or `requirements.txt`), but the build system can also be set explicitly with the `build-system` input / `$PYTHON_BUILD_SYSTEM` variable:
| Value | Build System (scope) |
| ---------------- |--------------------------------------------------------------------------------------------------------|
| _none_ (default) or `auto` | The template tries to **auto-detect** the actual build system |
| `setuptools` | [Setuptools](https://setuptools.pypa.io/) (dependencies, build & packaging) |
| `poetry` | [Poetry](https://python-poetry.org/) (dependencies, build, test & packaging) |
| `uv` | [uv](https://docs.astral.sh/uv/) (dependencies, build, test & packaging)|
| `pipenv` | [Pipenv](https://pipenv.pypa.io/) (dependencies only) |
| `reqfile` | [Requirements Files](https://pip.pypa.io/en/stable/user_guide/#requirements-files) (dependencies only) |
:warning: You can explicitly set the build tool version by setting the `$PYTHON_BUILD_SYSTEM` variable with a [version identifier](https://peps.python.org/pep-0440/), for example `PYTHON_BUILD_SYSTEM="poetry==1.1.15"`.
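With the legacy variables syntax, this could look like the following sketch (the pinned version is illustrative):

```yaml
variables:
  # force Poetry as the build system, pinned to an (illustrative) version
  PYTHON_BUILD_SYSTEM: "poetry==1.8.3"
```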
### `py-package` job
This job allows building your Python project [distribution packages](https://packaging.python.org/en/latest/glossary/#term-Distribution-Package).
It is bound to the `build` stage, it is **disabled by default** and can be enabled by setting `$PYTHON_PACKAGE_ENABLED` to `true`.
### `py-lint` job
This job is **disabled by default** and performs code analysis based on the [pylint](http://pylint.pycqa.org/en/latest/) Python lib.
It is activated by setting `$PYLINT_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ------------------------ | ---------------------------------- | ----------------- |
| `pylint-enabled` / `PYLINT_ENABLED` | Set to `true` to enable the `pylint` job | _none_ (disabled) |
| `pylint-args` / `PYLINT_ARGS` | Additional [pylint CLI options](http://pylint.pycqa.org/en/latest/user_guide/run.html#command-line-options) | _none_ |
| `pylint-files` / `PYLINT_FILES` | Files or directories to analyse | _none_ (by default analyses all found python source files) |
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/py-lint.codeclimate.json` | [Code Climate](https://docs.codeclimate.com/docs/pylint) | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscodequality) |
| `$PYTHON_PROJECT_DIR/reports/py-lint.parseable.txt` | [parseable](https://pylint.pycqa.org/en/latest/user_guide/usage/output.html) | [SonarQube integration](https://docs.sonarqube.org/latest/analysis/external-issues/) |
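For example, a possible configuration through component inputs (the options and target directory are illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      pylint-enabled: true
      # illustrative: require a minimum score and only analyse the src/ directory
      pylint-args: "--fail-under=9"
      pylint-files: "src"
```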
### Test jobs
The Python template features four alternative test jobs:
* `py-unittest` that performs tests based on [unittest](https://docs.python.org/3/library/unittest.html) Python lib,
* or `py-pytest` that performs tests based on [pytest](https://docs.pytest.org/en/latest/) Python lib,
* or `py-nosetest` that performs tests based on [nose](https://nose.readthedocs.io/en/latest/) Python lib,
* or `py-compile` that performs byte code generation to check syntax if no tests are available.
#### `py-unittest` job
This job is **disabled by default** and performs tests based on [unittest](https://docs.python.org/3/library/unittest.html) Python lib.
It is activated by setting `$UNITTEST_ENABLED` to `true`.
In order to produce JUnit test reports, the tests are executed with the [xmlrunner](https://github.com/xmlrunner/unittest-xml-reporting) module.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ------------------------ | -------------------------------------------------------------------- | ----------------------- |
| `unittest-enabled` / `UNITTEST_ENABLED` | Set to `true` to enable the `unittest` job | _none_ (disabled) |
| `unittest-args` / `UNITTEST_ARGS` | Additional xmlrunner/unittest CLI options | _none_ |
:information_source: use a `.coveragerc` file at the root of your Python project to control the coverage settings.
Example:
```conf
[run]
# enables branch coverage
branch = True
# list of directories/packages to cover (replace with your own package name)
source = my_package
```
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | [xUnit](https://en.wikipedia.org/wiki/XUnit) test report(s) | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportsjunit) & [SonarQube integration](https://docs.sonarqube.org/latest/analysis/test-coverage/test-execution-parameters/#header-8) |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | [Cobertura XML](https://gcovr.com/en/stable/output/cobertura.html) coverage report | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscoverage_report) & [SonarQube integration](https://docs.sonarqube.org/latest/analysis/test-coverage/python-test-coverage/) |
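As a sketch, the job can be enabled with the legacy variables syntax (the extra arguments are illustrative):

```yaml
variables:
  UNITTEST_ENABLED: "true"
  # illustrative: more verbose test output
  UNITTEST_ARGS: "-v"
```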
#### `py-pytest` job
This job is **disabled by default** and performs tests based on [pytest](https://docs.pytest.org/en/latest/) Python lib.
It is activated by setting `$PYTEST_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ------------------------ | --------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------- |
| `pytest-enabled` / `PYTEST_ENABLED` | Set to `true` to enable the `pytest` job | _none_ (disabled) |
| `pytest-args` / `PYTEST_ARGS` | Additional [pytest](https://docs.pytest.org/en/stable/usage.html) or [pytest-cov](https://github.com/pytest-dev/pytest-cov#usage) CLI options | _none_ |
:information_source: use a `.coveragerc` file at the root of your Python project to control the coverage settings.
Example:
```conf
[run]
# enables branch coverage
branch = True
# list of directories/packages to cover (replace with your own package name)
source = my_package
```
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | [xUnit](https://en.wikipedia.org/wiki/XUnit) test report(s) | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportsjunit) & [SonarQube integration](https://docs.sonarqube.org/latest/analysis/test-coverage/test-execution-parameters/#header-8) |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | [Cobertura XML](https://gcovr.com/en/stable/output/cobertura.html) coverage report | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscoverage_report) & [SonarQube integration](https://docs.sonarqube.org/latest/analysis/test-coverage/python-test-coverage/) |
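For example (the pytest options are illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      pytest-enabled: true
      # illustrative: verbose output, stop after the first failure
      pytest-args: "-v --maxfail=1"
```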
#### `py-nosetests` job
This job is **disabled by default** and performs tests based on the [nose](https://nose.readthedocs.io/en/latest/) Python lib.
It is activated by setting `$NOSETESTS_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ------------------------ | --------------------------------------------------------------------------------------- | ----------------------- |
| `nosetests-enabled` / `NOSETESTS_ENABLED` | Set to `true` to enable the `nose` job | _none_ (disabled) |
| `nosetests-args` / `NOSETESTS_ARGS` | Additional [nose CLI options](https://nose.readthedocs.io/en/latest/usage.html#options) | _none_ |
By default, coverage is measured on all project directories. You can restrict it to your packages by setting the `$NOSE_COVER_PACKAGE` variable
(more info in the [nose cover plugin documentation](https://nose.readthedocs.io/en/latest/plugins/cover.html)).
:information_source: use a `.coveragerc` file at the root of your Python project to control the coverage settings.
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | [xUnit](https://en.wikipedia.org/wiki/XUnit) test report(s) | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportsjunit) & [SonarQube integration](https://docs.sonarqube.org/latest/analysis/test-coverage/test-execution-parameters/#header-8) |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | [Cobertura XML](https://gcovr.com/en/stable/output/cobertura.html) coverage report | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscoverage_report) & [SonarQube integration](https://docs.sonarqube.org/latest/analysis/test-coverage/python-test-coverage/) |
#### `py-compile` job
This job is a fallback when no unit test job is enabled (`$UNITTEST_ENABLED`, `$PYTEST_ENABLED` and `$NOSETESTS_ENABLED`
are not set), and performs a [`compileall`](https://docs.python.org/3/library/compileall.html).
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --------------------- | ----------------------------------------------------------------------------- | ------------- |
| `compile-args` / `PYTHON_COMPILE_ARGS` | [`compileall` CLI options](https://docs.python.org/3/library/compileall.html) | `*` |
### `py-bandit` job (SAST)
This job is **disabled by default** and performs a [Bandit](https://pypi.org/project/bandit/) analysis.
It is bound to the `test` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ---------------- | ---------------------------------------------------------------------- | ----------------- |
| `bandit-enabled` / `BANDIT_ENABLED` | Set to `true` to enable Bandit analysis | _none_ (disabled) |
| `bandit-args` / `BANDIT_ARGS` | Additional [Bandit CLI options](https://github.com/PyCQA/bandit#usage) | `--recursive .` |
In addition to a textual report in the console, this job produces the following reports, kept for one day and only available for download by users with the Developer role or higher:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/py-bandit.bandit.csv` | [CSV](https://bandit.readthedocs.io/en/latest/formatters/csv.html) | [SonarQube integration](https://docs.sonarqube.org/latest/analysis/external-issues/)<br/>_This report is generated only if SonarQube template is detected_ |
| `$PYTHON_PROJECT_DIR/reports/py-bandit.bandit.json` | [JSON](https://bandit.readthedocs.io/en/latest/formatters/json.html) | [DefectDojo integration](https://defectdojo.github.io/django-DefectDojo/integrations/parsers/#bandit)<br/>_This report is generated only if DefectDojo template is detected_ |
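For example, a possible configuration (the Bandit options are illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      bandit-enabled: true
      # illustrative: only scan the src/ directory
      bandit-args: "--recursive src"
```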
### `py-trivy` job (dependency check)
This job performs a dependency check analysis using [Trivy](https://github.com/aquasecurity/trivy/).
:warning: This job is now **enabled by default** since version 7.0.0
It is bound to the `test` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ---------------- | ----------------------------------------------------------------------- | ----------------- |
| `trivy-disabled` / `PYTHON_TRIVY_DISABLED` | Set to `true` to disable Trivy job | _none_ (enabled) |
| `trivy-dist-url` / `PYTHON_TRIVY_DIST_URL` | Url to the `tar.gz` package for `linux_amd64` of Trivy to use (ex: `https://github.com/aquasecurity/trivy/releases/download/v0.51.1/trivy_0.51.1_Linux-64bit.tar.gz`)<br/>_When unset, the latest version will be used_ | _none_ |
| `trivy-args` / `PYTHON_TRIVY_ARGS` | Additional [Trivy CLI options](https://aquasecurity.github.io/trivy/latest/docs/references/configuration/cli/trivy_filesystem/) | `--ignore-unfixed --pkg-types library --detection-priority comprehensive` |
In addition to a textual report in the console, this job produces the following reports, kept for one day and only available for download by users with the Developer role or higher:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/py-trivy.trivy.json` | [JSON](https://aquasecurity.github.io/trivy/latest/docs/configuration/reporting/#json) | [DefectDojo integration](https://defectdojo.github.io/django-DefectDojo/integrations/parsers/#trivy)<br/>_This report is generated only if DefectDojo template is detected_ |
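For example, here is a sketch overriding the default Trivy options (the severity filter is illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      # illustrative: keep most defaults but only report HIGH and CRITICAL findings
      trivy-args: "--ignore-unfixed --pkg-types library --severity HIGH,CRITICAL"
```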
### `py-sbom` job
This job generates a [SBOM](https://cyclonedx.org/) file listing all dependencies using [syft](https://github.com/anchore/syft).
It is bound to the `test` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --------------------- | -------------------------------------- | ----------------- |
| `sbom-disabled` / `PYTHON_SBOM_DISABLED` | Set to `true` to disable this job | _none_ |
| `sbom-syft-url` / `PYTHON_SBOM_SYFT_URL` | Url to the `tar.gz` package for `linux_amd64` of Syft to use (ex: `https://github.com/anchore/syft/releases/download/v0.62.3/syft_0.62.3_linux_amd64.tar.gz`)<br/>_When unset, the latest version will be used_ | _none_ |
| `sbom-name` / `PYTHON_SBOM_NAME` | Component name of the emitted SBOM | `$CI_PROJECT_PATH/$PYTHON_PROJECT_DIR` |
| `sbom-opts` / `PYTHON_SBOM_OPTS` | Options for syft used for SBOM analysis | `--override-default-catalogers python-package-cataloger` |
In addition to logs in the console, this job produces the following reports, kept for one week:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/py-sbom.cyclonedx.json` | [CycloneDX JSON](https://cyclonedx.org/docs/latest/json/) | [Security & Compliance integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscyclonedx) |
### `py-black` job
This job is **disabled by default** and runs [black](https://black.readthedocs.io) on the repo. It is bound to the `build` stage.
| Input / Variable | Description | Default value |
| ---------------- | ----------------------------------------------------------------------- | ----------------- |
| `black-enabled` / `PYTHON_BLACK_ENABLED` | Set to `true` to enable black job | _none_ (disabled) |
### `py-isort` job
This job is **disabled by default** and runs [isort](https://pycqa.github.io/isort/) on the repo. It is bound to the `build` stage.
| Input / Variable | Description | Default value |
| ---------------- | ----------------------------------------------------------------------- | ----------------- |
| `isort-enabled` / `PYTHON_ISORT_ENABLED` | Set to `true` to enable isort job | _none_ (disabled) |
### `py-ruff` job
This job is **disabled by default** and runs [Ruff](https://docs.astral.sh/ruff/) on the repo. It is bound to the `build` stage.
| Input / Variable | Description | Default value |
| ---------------- | ----------------------------------------------------------------------- | ----------------- |
| `ruff-enabled` / `RUFF_ENABLED` | Set to `true` to enable ruff job | _none_ (disabled) |
| `ruff-args` / `RUFF_ARGS` | Additional [Ruff Linter CLI options](https://docs.astral.sh/ruff/configuration/#full-command-line-interface) | _none_ |
| `ruff-ext-exclude` / `RUFF_EXT_EXCLUDE` | Define [extend-exclude](https://docs.astral.sh/ruff/settings/#extend-exclude) files | _.venv,.cache_ |
:warning: Ruff can replace isort, Black, Bandit, Pylint and much more. [More info](https://github.com/astral-sh/ruff/blob/main/docs/faq.md#which-tools-does-ruff-replace).
In addition to logs in the console, this job produces the following reports, kept for one week:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/py-ruff.gitlab.json` | [GitLab](https://docs.astral.sh/ruff/settings/#output-format) | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscodequality) |
| `$PYTHON_PROJECT_DIR/reports/py-ruff.native.json` | [JSON](https://docs.astral.sh/ruff/settings/#output-format) | [SonarQube integration](https://docs.sonarqube.org/latest/analysis/external-issues/)<br/>_This report is generated only if SonarQube template is detected_ |
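For example (the extra exclusion is illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      ruff-enabled: true
      # illustrative: also exclude the docs/ directory
      ruff-ext-exclude: ".venv,.cache,docs"
```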
### `py-mypy` job
This job is **disabled by default** and performs code analysis based on [mypy](https://mypy.readthedocs.io/en/stable/).
It is activated by setting `$MYPY_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| ------------------------ | ---------------------------------- | ----------------- |
| `mypy-enabled` / `MYPY_ENABLED` | Set to `true` to enable the `mypy` job | _none_ (disabled) |
| `mypy-args` / `MYPY_ARGS` | Additional [mypy CLI options](https://mypy.readthedocs.io/en/stable/command_line.html) | _none_ |
| `mypy-files` / `MYPY_FILES` | Files or directories to analyse | _none_ (by default analyses all found python source files) |
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| -------------- | ---------------------------------------------------------------------------- | ----------------- |
| `$PYTHON_PROJECT_DIR/reports/py-mypy.codeclimate.json` | [Code Climate](https://github.com/soul-catcher/mypy-gitlab-code-quality) | [GitLab integration](https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html#artifactsreportscodequality) |
| `$PYTHON_PROJECT_DIR/reports/py-mypy.console.txt` | [mypy console output](https://mypy.readthedocs.io/) | [SonarQube integration](https://docs.sonarqube.org/latest/analysis/external-issues/) |
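A minimal sketch with the legacy variables syntax (the options and target directory are illustrative):

```yaml
variables:
  MYPY_ENABLED: "true"
  # illustrative: tolerate missing 3rd-party type stubs, only analyse src/
  MYPY_ARGS: "--ignore-missing-imports"
  MYPY_FILES: "src"
```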
### SonarQube analysis
If you're using the SonarQube template to analyse your Python code, here is a sample `sonar-project.properties` file:
```properties
# see: https://docs.sonarqube.org/latest/analyzing-source-code/test-coverage/python-test-coverage/
# set your source directory(ies) here (relative to the sonar-project.properties file)
sonar.sources=.
# exclude unwanted directories and files from being analysed
sonar.exclusions=**/test_*.py
# set your tests directory(ies) here (relative to the sonar-project.properties file)
sonar.tests=.
sonar.test.inclusions=**/test_*.py
# tests report: xUnit format
sonar.python.xunit.reportPath=reports/TEST-*.xml
# coverage report: Cobertura format
sonar.python.coverage.reportPaths=reports/py-coverage.cobertura.xml
# pylint: parseable format (if enabled)
sonar.python.pylint.reportPaths=reports/py-lint.parseable.txt
# Bandit: CSV format (if enabled)
sonar.python.bandit.reportPaths=reports/py-bandit.bandit.csv
# Ruff: JSON format (if enabled)
sonar.python.ruff.reportPaths=reports/py-ruff.native.json
# mypy: console output format (if enabled)
sonar.python.mypy.reportPaths=reports/py-mypy.console.txt
```

More info:

* [Python language support](https://docs.sonarqube.org/latest/analyzing-source-code/test-coverage/python-test-coverage/)
* [test coverage & execution parameters](https://docs.sonarqube.org/latest/analysis/coverage/)
* [third-party issues](https://docs.sonarqube.org/latest/analysis/external-issues/)
### `py-release` job
This job is **disabled by default** and allows you to perform a complete release of your Python code:
1. increase the Python project version,
2. Git commit changes and create a Git tag with the new version number,
3. build the [Python packages](https://packaging.python.org/),
4. publish the built packages to a PyPI compatible repository ([GitLab packages](https://docs.gitlab.com/ee/user/packages/pypi_repository/) by default).
The Python template supports three packaging systems:
* [Poetry](https://python-poetry.org/): uses Poetry-specific [version](https://python-poetry.org/docs/cli/#version), [build](https://python-poetry.org/docs/cli/#build) and [publish](https://python-poetry.org/docs/cli/#publish) commands.
* [uv](https://docs.astral.sh/uv/): uses uv for version management, [build](https://docs.astral.sh/uv/guides/publish/#building-your-package) as package builder and [publish](https://docs.astral.sh/uv/guides/publish/) to publish.
* [Setuptools](https://setuptools.pypa.io/): uses [bump-my-version](https://github.com/callowayproject/bump-my-version) as version management, [build](https://pypa-build.readthedocs.io/) as package builder and [Twine](https://twine.readthedocs.io/) to publish.
The release job is bound to the `publish` stage, appears only on production and integration branches and uses the following variables:
| Input / Variable | Description | Default value |
| ----------------------- | ----------------------------------------------------------------------- | ----------------- |
| `release-enabled` / `PYTHON_RELEASE_ENABLED`| Set to `true` to enable the release job | _none_ (disabled) |
| `release-next` / `PYTHON_RELEASE_NEXT` | The part of the version to increase (one of: `major`, `minor`, `patch`) | `minor` |
| `semrel-release-disabled` / `PYTHON_SEMREL_RELEASE_DISABLED`| Set to `true` to disable [semantic-release integration](#semantic-release-integration) | _none_ (disabled) |
| `GIT_USERNAME` | Git username for Git push operations (see below) | _none_ |
| :lock: `GIT_PASSWORD` | Git password for Git push operations (see below) | _none_ |
| :lock: `GIT_PRIVATE_KEY`| SSH key for Git push operations (see below) | _none_ |
| `release-commit-message` / `PYTHON_RELEASE_COMMIT_MESSAGE`| The Git commit message to use on the release commit. This is templated using the [Python Format String Syntax](http://docs.python.org/2/library/string.html#format-string-syntax). Available in the template context are `current_version` and `new_version`. | `chore(python-release): {current_version} → {new_version}` |
| `repository-url` / `PYTHON_REPOSITORY_URL`| Target PyPI repository to publish packages | _[GitLab project's PyPI packages repository](https://docs.gitlab.com/ee/user/packages/pypi_repository/)_ |
| `PYTHON_REPOSITORY_USERNAME`| Target PyPI repository username credential | `gitlab-ci-token` |
| :lock: `PYTHON_REPOSITORY_PASSWORD`| Target PyPI repository password credential | `$CI_JOB_TOKEN` |
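For example, a possible configuration through component inputs (the bump strategy is illustrative):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      release-enabled: true
      # illustrative: only bump the patch part of the version
      release-next: "patch"
```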
#### Setuptools tip
If you're using a Setuptools configuration, then you will have to write a `.bumpversion.toml` or `pyproject.toml` file.
Example of `.bumpversion.toml` file:
```toml
[tool.bumpversion]
current_version = "0.0.0"
```
Example of `pyproject.toml` file:
```toml
[project]
name = "project-name"
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[tool.bumpversion]
current_version = "0.0.0"
[[tool.bumpversion.files]]
filename = "project-name/__init__.py"
```
#### `semantic-release` integration
If you activate the [`semantic-release-info` job from the `semantic-release` template](https://gitlab.com/to-be-continuous/semantic-release/#semantic-release-info-job), the `py-release` job will rely on the generated next version info.
Thus, a release will be performed only if a next semantic release is present.
You should disable the `semantic-release` job by setting `SEMREL_RELEASE_DISABLED` to `true`, as it's the `py-release` job that performs the release (only the `semantic-release-info` job is needed).
Finally, the semantic-release integration can be disabled with the `PYTHON_SEMREL_RELEASE_DISABLED` variable.
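A minimal sketch of the corresponding variables, assuming the `semantic-release` template is also included in your pipeline:

```yaml
variables:
  PYTHON_RELEASE_ENABLED: "true"
  # let py-release perform the release; only the semantic-release-info job is needed
  SEMREL_RELEASE_DISABLED: "true"
```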
#### Git authentication
A Python release involves some Git push operations.
You can either use a SSH key or user/password credentials.
##### Using a SSH key
We recommend you to use a [project deploy key](https://docs.gitlab.com/ee/user/project/deploy_keys/#project-deploy-keys) with write access to your project.
The key should not have a passphrase (see [how to generate a new SSH key pair](https://docs.gitlab.com/ee/user/ssh.html#generate-an-ssh-key-pair)).
Specify :lock: `$GIT_PRIVATE_KEY` as secret project variable with the private part of the deploy key.
```PEM
-----BEGIN OPENSSH PRIVATE KEY-----
blablabla
-----END OPENSSH PRIVATE KEY-----
```
The template handles both classic and file CI/CD variables.
##### Using user/password credentials
Simply specify :lock: `$GIT_USERNAME` and :lock: `$GIT_PASSWORD` as secret project variables.
Note that the password should be an access token (preferably a [project](https://docs.gitlab.com/ee/user/project/settings/project_access_tokens.html) or [group](https://docs.gitlab.com/ee/user/group/settings/group_access_tokens.html) access token) with `write_repository` scope and `Maintainer` role.
#### Pip repositories
When depending on Python packages published in [GitLab's packages registry](https://docs.gitlab.com/ee/user/packages/pypi_repository/), it can be useful to configure a group-level PyPI package registry.
But such a repository requires authenticated access.
To do so, simply set `PIP_INDEX_URL` and use the CI job token:
```YAML
variables:
PIP_INDEX_URL: "${CI_SERVER_PROTOCOL}://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}:${CI_SERVER_PORT}/api/v4/groups/<group-id>/-/packages/pypi/simple"
```
In a corporate environment, you may be faced with two repositories: the corporate proxy-cache and the project repository.
Simply use both `PIP_INDEX_URL` and `PIP_EXTRA_INDEX_URL`.
```YAML
variables:
PIP_INDEX_URL: "https://cache.corp/repository/pypi/simple"
PIP_EXTRA_INDEX_URL: "${CI_SERVER_PROTOCOL}://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}:${CI_SERVER_PORT}/api/v4/groups/<group-id>/-/packages/pypi/simple"
```
## Variants
The Python template can be used in conjunction with template variants to cover specific cases.
### Vault variant
This variant allows delegating your secrets management to a [Vault](https://www.vaultproject.io/) server.
#### Configuration
In order to be able to communicate with the Vault server, the variant requires the following additional configuration parameters:
| Input / Variable | Description | Default value |
| ----------------- | -------------------------------------- | ----------------- |
| `TBC_VAULT_IMAGE` | The [Vault Secrets Provider](https://gitlab.com/to-be-continuous/tools/vault-secrets-provider) image to use (can be overridden) | `registry.gitlab.com/to-be-continuous/tools/vault-secrets-provider:latest` |
| `vault-base-url` / `VAULT_BASE_URL` | The Vault server base API url | _none_ |
| `vault-oidc-aud` / `VAULT_OIDC_AUD` | The `aud` claim for the JWT | `$CI_SERVER_URL` |
| :lock: `VAULT_ROLE_ID` | The [AppRole](https://www.vaultproject.io/docs/auth/approle) RoleID | **must be defined** |
| :lock: `VAULT_SECRET_ID` | The [AppRole](https://www.vaultproject.io/docs/auth/approle) SecretID | **must be defined** |
#### Usage
Then you may retrieve any of your secret(s) from Vault using the following syntax:
```text
@url@http://vault-secrets-provider/api/secrets/{secret_path}?field={field}
```
With:
| Parameter | Description |
| -------------------------------- | -------------------------------------- |
| `secret_path` (_path parameter_) | this is your secret location in the Vault server |
| `field` (_query parameter_) | parameter to access a single basic field from the secret JSON payload |
#### Example
```yaml
include:
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python-vault@7.4.0
inputs:
vault-base-url: "https://vault.acme.host/v1"
# audience claim for JWT
vault-oidc-aud: "https://vault.acme.host"
variables:
# Secrets managed by Vault
GIT_PASSWORD: "@url@http://vault-secrets-provider/api/secrets/b7ecb6ebabc231/git/semantic-release?field=group-access-token"
GIT_PRIVATE_KEY: "@url@http://vault-secrets-provider/api/secrets/b7ecb6ebabc231/git/semantic-release?field=private-key"
PYTHON_REPOSITORY_PASSWORD: "@url@http://vault-secrets-provider/api/secrets/b7ecb6ebabc231/pip-repo/repository?field=password"
# $VAULT_ROLE_ID and $VAULT_SECRET_ID defined as a secret CI/CD variable
```
### GCP variant
This variant allows using the Python Google Cloud client libraries. It follows the [Authenticate for using client libraries](https://cloud.google.com/docs/authentication/client-libraries) recommendation with [ADC](https://cloud.google.com/docs/authentication/application-default-credentials).
See also this [detailed article on OIDC impersonation with Workload Identity Federation](https://blog.salrashid.dev/articles/2021/understanding_workload_identity_federation/#oidc-impersonated).
Requirements before using this variant:
1. You must have a Workload Identity Federation Pool,
2. You must have a Service Account with enough permissions to run your python job.
3. Optionally, you can define `GOOGLE_CLOUD_PROJECT` as a template variable to set the default Google Cloud project.
#### Configuration
The variant requires the following additional configuration parameters:
| Input / Variable | Description | Default value |
| ----------------- | -------------------------------------- | ----------------- |
| `gcp-oidc-aud` / `GCP_OIDC_AUD` | The `aud` claim for the JWT token _(only required for [OIDC authentication](https://docs.gitlab.com/ee/ci/cloud_services/google_cloud/))_ | `$CI_SERVER_URL` |
| `gcp-oidc-provider` / `GCP_OIDC_PROVIDER` | Default Workload Identity Provider associated with GitLab to [authenticate with OpenID Connect](https://docs.gitlab.com/ee/ci/cloud_services/google_cloud/) | _none_ |
| `gcp-oidc-account` / `GCP_OIDC_ACCOUNT` | Default Service Account to which impersonate with OpenID Connect authentication | _none_ |
#### Example
```yaml
include:
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.4.0
    inputs:
      image: registry.hub.docker.com/library/python:3.12-slim
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python-gcp@7.4.0
inputs:
# common OIDC config for non-prod envs
gcp-oidc-provider: "projects/<gcp_nonprod_proj_id>/locations/global/workloadIdentityPools/<pool_id>/providers/<provider_id>"
      gcp-oidc-account: "<name>@<gcp_nonprod_proj_id>.iam.gserviceaccount.com"
```
### AWS CodeArtifact variant
This variant allows using PyPI packages from AWS CodeArtifact. It follows the AWS recommendation for [configuring and using pip with CodeArtifact](https://docs.aws.amazon.com/codeartifact/latest/ug/python-configure.html).
It authenticates with AWS CodeArtifact, then retrieves and sets the following environment variables:
- `CODEARTIFACT_AUTH_TOKEN` - the AWS CodeArtifact authentication token
- `CODEARTIFACT_REPOSITORY_ENDPOINT` - the AWS CodeArtifact repository endpoint
- `CODEARTIFACT_URL` - Formatted URL for the AWS CodeArtifact repository
Most importantly, the variant sets pip's `global.index-url` to the CodeArtifact URL.
The variant supports two authentication methods:
1. [federated authentication using OpenID Connect](https://docs.gitlab.com/ee/ci/cloud_services/aws/) (**recommended method**),
2. or basic authentication with AWS access key ID & secret access key.
:warning: when using this variant, you must have created the CodeArtifact repository.
#### Configuration
The variant *requires* the following additional configuration parameters:
| Input / Variable | Description | Default value |
| --------------------------------------------- | --------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------- |
| `TBC_AWS_PROVIDER_IMAGE` | The [AWS Auth Provider](https://gitlab.com/to-be-continuous/tools/aws-auth-provider) image to use (can be overridden) | `registry.gitlab.com/to-be-continuous/tools/aws-auth-provider:latest` |
| `aws-region` / `AWS_REGION` | Default region (where the Codeartifact repository is located) | _none_ |
| `aws-codeartifact-domain` / `AWS_CODEARTIFACT_DOMAIN` | The CodeArtifact domain name | _none_ |
| `aws-codeartifact-domain-owner` / `AWS_CODEARTIFACT_DOMAIN_OWNER` | The CodeArtifact domain owner account ID | _none_ |
| `aws-codeartifact-repository` / `AWS_CODEARTIFACT_REPOSITORY` | The CodeArtifact repository name | _none_ |
##### OIDC authentication config
This is the recommended authentication method. In order to use it, first carefully follow [GitLab's documentation](https://docs.gitlab.com/ee/ci/cloud_services/aws/),
then set the required configuration.
| Input / Variable | Description | Default value |
| ----------------------------------------------------------- | ---------------------------------------------------------------------------------------------- | ---------------- |
| `aws-oidc-aud` / `AWS_OIDC_AUD` | The `aud` claim for the JWT token | `$CI_SERVER_URL` |
| `aws-oidc-role-arn` / `AWS_OIDC_ROLE_ARN` | Default IAM Role ARN associated with GitLab | _none_ |
##### Basic authentication config
| Variable | Description | Default value |
| --------------------------------------- | ---------------------------------------------------------------------------- | ----------------- |
| :lock: `AWS_ACCESS_KEY_ID` | Default access key ID | _none_ (disabled) |
| :lock: `AWS_SECRET_ACCESS_KEY` | Default secret access key | _none_ (disabled) |
#### Example
```yaml
include:
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.3.2
# 2: set/override component inputs
inputs:
image: registry.hub.docker.com/library/python:3.12-slim
pytest-enabled: true
- component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python-aws-ca@7.3.2
inputs:
aws-region: "us-east-1"
aws-codeartifact-domain: "acme"
aws-codeartifact-domain-owner: "123456789012"
aws-codeartifact-repository: "my-repo"
# common OIDC config for non-prod envs
aws-oidc-role-arn: "arn:aws:iam::123456789012:role/gitlab-ci"
```