For another package I have been working on, the Salsa CI facility has proven to be usable, configured using files in debian/tests, particularly as it allows
test-related dependencies to be specified independently. However, this other package has no dependencies that are currently unpackaged in Debian. Meanwhile, the testing of this new Moin package depends on brand new packages somehow being made available.
So, I would appreciate guidance on how I might enable testing of this new Moin
package, along with its new dependencies, in the Salsa environment or any other appropriate environment that would satisfy Debian packaging needs and policies. It would also be quite helpful if built packages might be published somewhere for people to test, but this is a separate consideration even if such packages would obviously need to be generated as part of the testing regime.
I'm not a Debian developer but I have some experience on Salsa CI, so I thought that I might be able to help... but then I was confused by a
specific part of the message:
On 17 Aug 2023 at 17:10:08, Paul Boddie wrote:
[...]
For another package I have been working on, the Salsa CI facility has proven to be usable, configured using files in debian/tests, particularly
as it allows test-related dependencies to be specified independently. However, this other package has no dependencies that are currently unpackaged in Debian. Meanwhile, the testing of this new Moin package depends on brand new packages somehow being made available.
If these dependencies are available on the "build" step: could they be
made available on the autopkgtest? I didn't quite understand why this is
not possible. I've found the autopkgtest quite flexible (since the tests
are scripts that could prepare some environment)
For me, the packages from the build job are made available automatically as a Salsa artifact (easy to download as a .zip of the *.deb files). I think that this happens for all the "build" jobs on Salsa CI. They can be downloaded as a .zip file (e.g. https://salsa.debian.org/freexian-team/debusine/-/jobs/4564890, right-hand side "Job artifacts", then click "Download"). Would that be enough?
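(The same artifact archive can also be fetched non-interactively; the command below uses GitLab's standard artifact download route for the job linked above:)
---
# Download and inspect a job's artifact archive (job ID from the example above):
curl -L -o artifacts.zip \
  "https://salsa.debian.org/freexian-team/debusine/-/jobs/4564890/artifacts/download"
unzip -l artifacts.zip    # the built .deb files are inside
---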
I also created a repo and hosted it on a Salsa CI page for internal testing, but this is a bit of a workaround. It is done in a new job that just downloads the artifacts (via a Salsa CI dependency), runs dpkg-scanpackages and copies the files to the right place.
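(A minimal sketch of such a job; the job name and paths are illustrative, not the actual configuration:)
---
# Illustrative GitLab CI job: turn the build job's .debs into a small apt repo.
local-apt-repo:
  needs:
    - job: build          # the Salsa CI job that produced the .deb artifacts
      artifacts: true
  script:
    - mkdir -p public/repo
    - cp debian/output/*.deb public/repo/   # wherever the build job leaves them
    - cd public/repo
    - dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
  artifacts:
    paths:
      - public/repo
---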
On Friday, 18 August 2023 09:51:29 CEST Carles Pina i Estany wrote:
I'm not a Debian developer but I have some experience on Salsa CI, so I thought that I might be able to help... but then I was confused by a specific part of the message:
On 17 Aug 2023 at 17:10:08, Paul Boddie wrote:
[...]
For another package I have been working on, the Salsa CI facility has proven to be usable, configured using files in debian/tests, particularly as it allows test-related dependencies to be specified independently. However, this other package has no dependencies that are currently unpackaged in Debian. Meanwhile, the testing of this new Moin package depends on brand new packages somehow being made available.
If these dependencies are available on the "build" step: could they be
made available on the autopkgtest? I didn't quite understand why this is not possible. I've found the autopkgtest quite flexible (since the tests are scripts that could prepare some environment)
The package has dependencies on installation but these dependencies
are not strictly necessary when building. However, if I wanted to run
the test suite when building, I would indeed need to pull in these dependencies as build dependencies so that the software being tested
can run without import errors.
I have to add that the other package I refer to has a test suite that takes a long time to run, so that is another reason why I chose Salsa CI for that package instead of letting autopkgtest do its work:
One can imagine having a common storage area holding these newly
introduced packages that the CI scripts could access in preference to
the usual archives. In fact, this would be something that might also
affect existing packages. Consider the situation where fixes to a
dependency are required to fix functionality in a particular package.
One would have to wait for the fixed dependency to become integrated
into the unstable archive before the principal package's tests would
start to work again.
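(If such a shared area were exposed as an apt repository, ordinary apt pinning could make it take precedence over the usual archives; a hypothetical /etc/apt/preferences.d/ entry:)
---
# Hypothetical pin giving a Salsa-hosted staging repository priority
# over the regular Debian archive:
Package: *
Pin: origin salsa.debian.org
Pin-Priority: 900
---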
I also created a repo and hosted it on a Salsa CI page for internal testing, but this is a bit of a workaround. It is done in a new job that just downloads the artifacts (via a Salsa CI dependency), runs dpkg-scanpackages and copies the files to the right place.
This sounds like something related to what might be required. In effect, you seem to be doing what I am doing when I actually install my built packages in a chroot. I run apt-ftparchive (which plays the same role as dpkg-scanpackages) to generate the Packages, Sources and Release files that tell apt about the new packages when the directory is added as another package source.
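(Roughly what that involves, with illustrative paths:)
---
# Generate simple repository indexes with apt-ftparchive (paths illustrative):
cd /srv/local-repo                      # directory holding the built .deb/.dsc files
apt-ftparchive packages . > Packages
apt-ftparchive sources . > Sources
apt-ftparchive release . > Release
# Reference it from the chroot as an additional package source:
echo "deb [trusted=yes] file:/srv/local-repo ./" > /etc/apt/sources.list.d/local.list
apt-get update
---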
In the Salsa CI environment, I would need to have the built packages (found in
the artefacts for each package's build job) copied somewhere that can then be found by the Moin package's pipeline jobs and the scripts creating a special repository of new packages.
For the jobs it is happening via https://salsa.debian.org/salsa-ci-team/pipeline/#using-automatically-built-apt-repository
In the Salsa CI environment, I would need to have the built packages
(found in the artefacts for each package's build job) copied somewhere
that can then be found by the Moin package's pipeline jobs and the
scripts creating a special repository of new packages.
Archiving artifacts should happen automatically on the "build" step of
Salsa CI (salsa-ci/pipeline). If I understand correctly what you
wanted...
On Thu, Aug 17, 2023 at 05:10:08PM +0200, Paul Boddie wrote:
Here, it would seem that the most prudent approach is to use the Salsa CI service instead of trying to get the test suite to run during the package build process.
You should do both if possible, assuming that by "Salsa CI service" you
mean autopkgtests which you can, and IMO should, also run locally.
One motivation for doing so involves not having to specify
build dependencies for packages only needed for test suite execution,
which itself requires the invocation of gbp buildpackage with --extra-package arguments since some packages are completely new to
Debian.
This is fine. Not to mention that the same problem exists for
autopkgtests, as you say below.
I have also found it difficult to persuade the tests to run successfully during the build process. A few of these attempt to invoke the moin program, but this cannot be located since it is not installed in the
build environment.
This should also be fine, unless it's completely impossible to run it
without installing into /.
However, one conclusion is that testing a system, as some of the
test cases appear to do, and as opposed to testing library functionality, is not necessarily appropriate when directed by something like dh_auto_test.
If there are tests that can't be run at build time you can skip those. You can even ask the upstream to provide tool arguments to simplify that.
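(For a pytest-based suite built with pybuild, skipping such tests at build time might look like this in debian/rules; the deselection expression is purely illustrative:)
---
# debian/rules fragment (sketch): deselect tests that need an installed system
export PYBUILD_TEST_ARGS=-k "not cli and not wsgi"

%:
	dh $@ --with python3 --buildsystem=pybuild
---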
Here, it would seem that the most prudent approach is to use the Salsa CI service instead of trying to get the test suite to run during the package build process.
You should do both if possible, assuming that by "Salsa CI service" you mean autopkgtests which you can, and IMO should, also run locally.
I'm not really clear on what autopkgtest really is, other than a tool that uses some kind of test suite description that may reside in debian/tests. I'm also not completely clear on what is supposed to invoke it, other than either the Salsa CI mechanism or dh_auto_test.
The maintainer is supposed to invoke it before uploading the package, and [...]
In the Debian Wiki documentation...
https://wiki.debian.org/Python/LibraryStyleGuide
...it mentions a field in debian/control:
Testsuite: autopkgtest-pkg-python
Yes.
My impression is that this calls autodep8 to generate some kind of test suite description which is then invoked by dh_auto_test.
It's not invoked by dh_auto_test. autopkgtests are not a part of the [...]
It doesn't help that there is an alternative to this that resembles it but behaves differently:
Testsuite: autopkgtest-pkg-pybuild
It just generates a different (better) test. AFAIK there should be ways to work with this.
I have also found it difficult to persuade the tests to run successfully during the build process. A few of these attempt to invoke the moin program, but this cannot be located since it is not installed in the build environment.
This should also be fine, unless it's completely impossible to run it without installing into /.
The moin program is made available in setup.py using an entry point. Maybe if there were some kind of script instead, it would work.
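(By contrast, an autopkgtest runs against the installed binary packages, where entry-point scripts are on PATH; a minimal hand-written test, with illustrative names, might look like this:)
---
# debian/tests/control (sketch; "@" depends on all binary packages built here)
Tests: smoke
Depends: @
Restrictions: allow-stderr

# debian/tests/smoke (sketch)
#!/bin/sh
set -e
python3 -c "import moin"   # the library must import from the installed package
moin --help                # the console entry point must be on PATH
---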
However, one conclusion is that testing a system, as some of the test cases appear to do, and as opposed to testing library functionality, is not necessarily appropriate when directed by something like dh_auto_test.
If there are tests that can't be run at build time you can skip those. You can even ask the upstream to provide tool arguments to simplify that.
I may well discuss such matters with them. One challenge that is relevant in this situation is that upstream have been working in their own virtualenv (or venv, or whatever it is now) world for a few years, using plenty of dependencies that were not packaged in Debian.
Which is by itself not a problem from the technical side, as you could [...]
On Friday, 18 August 2023 16:12:19 CEST Carles Pina i Estany wrote:
For the jobs it is happening via https://salsa.debian.org/salsa-ci-team/pipeline/#using-automatically-built-apt-repository
Reviewing this documentation is actually more helpful than I thought it would be. I had noticed the "aptly" task in the YAML files, and I had started to wonder if that meant that some kind of repository publishing was occurring somewhere.
In the Salsa CI environment, I would need to have the built packages (found in the artefacts for each package's build job) copied somewhere that can then be found by the Moin package's pipeline jobs and the scripts creating a special repository of new packages.
Archiving artifacts should happen automatically on the "build" step of Salsa CI (salsa-ci/pipeline). If I understand correctly what you
wanted...
I think you have a good understanding of what I am trying to achieve. If I can
get the new package dependencies (emeraldtree, feedgen, and so on) to yield installable packages when built that can then be referenced by Salsa CI as it runs the build jobs for Moin, I have a chance of running the test suite.
I'm currently persuading the CI system to run the "aptly" task and to publish package repositories. I will then augment a customised YAML file for the Moin
package with references to these repositories and see if the test
suite can be invoked somewhat more successfully as a consequence.
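(In the customised debian/salsa-ci.yml, enabling that repository publication amounts to something like the following, per the pipeline documentation linked above:)
---
# Sketch: enable the aptly repository job on top of the usual pipeline include
variables:
  SALSA_CI_DISABLE_APTLY: 0
---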
Ha! I wasn't aware of the aptly option (https://salsa.debian.org/salsa-ci-team/pipeline/#using-automatically-built-apt-repository and SALSA_CI_DISABLE_APTLY=0).
I think that I might have re-invented the wheel in a tiny part of
Debusine CI/CD.
I will point out some things that might save you some time
(great) or not (ignore! :-) ):
debusine's .gitlab-ci.yml: https://salsa.debian.org/freexian-team/debusine/-/blob/devel/.gitlab-ci.yml
The job autopkgtest, via debci command, runs autopkgtest: https://salsa.debian.org/freexian-team/debusine/-/jobs/4574458#L65
It's possible for sure. Other people on this list might come up with a different idea. I have almost no experience with Debian packaging, but I have some experience with the Salsa CI, so I might be giving you solutions that are sub-optimal! :-)
On Friday, 18 August 2023 19:54:55 CEST Carles Pina i Estany wrote:
Ha! I wasn't aware of the aptly option (https://salsa.debian.org/salsa-ci-team/pipeline/#using-automatically-built-apt-repository and SALSA_CI_DISABLE_APTLY=0).
I think that I might have re-invented the wheel in a tiny part of
Debusine CI/CD.
It is certainly a way of propagating packages to those that might need
them. However, the instructions indicating how a package might access
these dependencies appear to be deficient.
It does not appear to be sufficient to merely specify the dependency package repositories and mark them as trusted. Doing that just causes the repositories to be ignored and the GPG keys to be reported as unrecognised.
So, the GPG keys need to be obtained. This is a hassle because something like wget is needed to do that, and then apt has to be persuaded not to fail in an opaque way. So the modified recipe is something like this:
before_script:
  - apt-get update
  - NON_INTERACTIVE=1 apt-get install -y wget
  - echo "deb https://salsa.debian.org/moin-team/emeraldtree/-/jobs/4575438/artifacts/raw/aptly unstable main" | tee /etc/apt/sources.list.d/pkga.list
  ...
  - wget -q -O /etc/apt/trusted.gpg.d/emeraldtree.asc https://salsa.debian.org/moin-team/emeraldtree/-/jobs/4575438/artifacts/raw/aptly/public-key.asc
  ...
  - apt-get update
This seems to make the various jobs happy, but the one that I was most concerned with remains unhappy! I don't actually need the dependencies
for anything other than autopkgtest, but that job employs its own
environment where the above recipe has no effect.
Quick answer for now...
I don't know if .asc files are allowed. Actually, from my
.bash_history:
---
wget -O- https://www.virtualbox.org/download/oracle_vbox_2016.asc | sudo gpg --dearmor --yes --output /usr/share/keyrings/oracle-virtualbox-2016.gpg
---
(I was following instructions, I didn't try leaving the .asc file there)
You could try using:
---
variables:
  SALSA_CI_AUTOPKGTEST_ARGS: '--setup-commands=ci/setup-the-repo.sh'
---
(and write and ship the ci/setup-the-repo.sh in the repo, do whatever
you need there)
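(A sketch of what such a setup script might contain, reusing the emeraldtree aptly repository line shown earlier; the file path matches the example above:)
---
#!/bin/sh
# ci/setup-the-repo.sh (sketch): executed by autopkgtest inside the test bed.
set -e
# Add the dependency repository built by the aptly job (line taken from
# earlier in the thread); [trusted=yes] avoids needing the signing key here.
echo "deb [trusted=yes] https://salsa.debian.org/moin-team/emeraldtree/-/jobs/4575438/artifacts/raw/aptly unstable main" \
  > /etc/apt/sources.list.d/emeraldtree.list
apt-get update
---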
Or, probably even better but less flexible:
---
variables:
  SALSA_CI_AUTOPKGTEST_ARGS: '--add-apt-source="deb http://MIRROR SUITE COMPONENT"'
---
(I found add-apt-source via "man autopkgtest" in TEST BED SETUP OPTIONS.
I haven't used add-apt-source, I don't know what happens with the gpg
keys... but you could use [trusted=yes] :-| )
For the MIRROR you have salsa variables that might help if you need to specify the pipeline.
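(Filled in with the emeraldtree repository line from earlier in the thread, that could look like the following; the inner quotes keep the source line as a single argument:)
---
# Sketch: point autopkgtest at the aptly artifact repository directly;
# [trusted=yes] sidesteps the key handling.
variables:
  SALSA_CI_AUTOPKGTEST_ARGS: '--add-apt-source="deb [trusted=yes] https://salsa.debian.org/moin-team/emeraldtree/-/jobs/4575438/artifacts/raw/aptly unstable main"'
---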
So, what I now need to do is to find out how I can make the new packages available to autopkgtest specifically.
We will get there! (one way or another)
hopefully SALSA_CI_AUTOPKGTEST_ARGS will help you. I added this in salsa-ci/pipeline last year because I was trying to do a similar thing
to what you are doing (I had to pin a package from backports).
(I'm just a user of salsa-ci/pipeline, not a member of the team, nor can I speak for them!)
I don't know if an FAQ, conventions, best practices or something might help...
On Saturday, 19 August 2023 20:32:59 CEST Carles Pina i Estany wrote:
Quick answer for now...
And a very quick reply from me... :-)
[...]
I don't know if .asc files are allowed. Actually, from my
.bash_history:
---
wget -O- https://www.virtualbox.org/download/oracle_vbox_2016.asc | sudo gpg --dearmor --yes --output /usr/share/keyrings/oracle-virtualbox-2016.gpg
---
(I was following instructions, I didn't try leaving the .asc file there)
They are allowed in recent versions of apt, as confirmed by the man page.
[...]
You could try using:
---
variables:
  SALSA_CI_AUTOPKGTEST_ARGS: '--setup-commands=ci/setup-the-repo.sh'
---
(and write and ship the ci/setup-the-repo.sh in the repo, do whatever
you need there)
This could be very useful.
Or, probably even better but less flexible:
---
variables:
  SALSA_CI_AUTOPKGTEST_ARGS: '--add-apt-source="deb http://MIRROR SUITE COMPONENT"'
---
(I found add-apt-source via "man autopkgtest" in TEST BED SETUP OPTIONS.
I haven't used add-apt-source, I don't know what happens with the gpg keys... but you could use [trusted=yes] :-| )
For the MIRROR you have salsa variables that might help if you need to specify the pipeline.
So could this.
So, what I now need to do is to find out how I can make the new packages available to autopkgtest specifically.
We will get there! (one way or another)
Well, I have engineered something very inelegant, revealed in full here:
https://salsa.debian.org/moin-team/moin/-/blob/debian/master/debian/salsa-ci.yml
hopefully SALSA_CI_AUTOPKGTEST_ARGS will help you. I added this in salsa-ci/pipeline last year because I was trying to do a similar thing
to what you are doing (I had to pin a package from backports).
(I'm just a user of salsa-ci/pipeline, not a member of the team, nor can I speak for them!)
If this can simplify what I've done, which is really quite horrible, then I will adopt it instead. The way that the artefacts of the dependencies are bound up in specifically numbered jobs is also particularly unfortunate. [...]
Interesting approach, but if you could use the variables and ship the shell script it might reduce some duplication between jobs (if possible; I haven't looked into your case in much detail) and, more importantly, you might be able to use the standard "autopkgtest" or "piuparts" jobs (instead of redefining them).
Just last week I created a new autopkgtest (I have it running on
bookworm and bullseye). It's done this way:
------
autopkgtest-bullseye:
  extends: .test-autopkgtest
  variables:
    RELEASE: "bullseye-backports"
    SALSA_CI_AUTOPKGTEST_ARGS: '--setup-commands=ci/pin-django-from-backports.sh --skip-test=debusine-doc-linkcheck'
------
Or, for more context: https://salsa.debian.org/freexian-team/debusine/-/blob/add-bullseye-support/.gitlab-ci.yml#L84
It ends up having "autopkgtest" and "autopkgtest-bullseye", but using the variable SALSA_CI_DISABLE_AUTOPKGTEST I could disable the one without my extends.
If it's in the same pipeline: you don't need the numbers to get the
artifacts (the .deb files from build).
For the aptly generated artifact: I will investigate this tomorrow (to
fetch it and try to use) (I want to replace some code that I did with
the aptly repo, if possible).
On Sunday, 20 August 2023 14:06:37 CEST Carles Pina i Estany wrote:
autopkgtest:
  extends: .test-autopkgtest
  variables:
    SALSA_CI_AUTOPKGTEST_ARGS: '--setup-commands=debian/salsa/add-repositories.sh'
piuparts:
  extends: .test-piuparts
  variables:
    SALSA_CI_PIUPARTS_PRE_INSTALL_SCRIPT: 'debian/salsa/add-repositories.sh'
You can see that by defining the variables to customise the tools, I am able to work with the existing job definitions. This simplifies the CI description file considerably:
https://salsa.debian.org/moin-team/moin/-/blob/debian/master/debian/salsa-ci.yml
The script is pretty straightforward, too:
https://salsa.debian.org/moin-team/moin/-/blob/debian/master/debian/salsa/add-repositories.sh
If you want, you can simplify it more (it's not exactly the same, so it might or might not help). There is a way on GitLab to point to the latest build of a job. For example, you have the following URL for one of the git repos:
https://salsa.debian.org/moin-team/emeraldtree/-/jobs/4575438/artifacts/raw/aptly
You could use this instead (to avoid the pipeline number):
https://salsa.debian.org/moin-team/emeraldtree/-/jobs/artifacts/debian/master/raw/aptly?job=aptly
Which is a redirect to the latest pipeline. Currently:
----
$ curl -s -I "https://salsa.debian.org/moin-team/emeraldtree/-/jobs/artifacts/debian/master/raw/aptly?job=aptly" | grep -E -i "^(http|location)"
HTTP/2 302
location: https://salsa.debian.org/moin-team/emeraldtree/-/jobs/4575438/artifacts/raw/aptly
----
Follows this format:
BRANCH=debian/master
DIRECTORY=aptly
JOB_NAME=aptly
https://salsa.debian.org/moin-team/emeraldtree/-/jobs/artifacts/${BRANCH}/raw/${DIRECTORY}?job=${JOB_NAME}
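(For example, the aptly signing key could be fetched without hard-coding a job number:)
---
# Sketch: fetch the latest aptly public key via the branch/job form above;
# wget follows the GitLab redirect automatically.
BRANCH=debian/master
JOB_NAME=aptly
wget -q -O /etc/apt/trusted.gpg.d/emeraldtree.asc \
  "https://salsa.debian.org/moin-team/emeraldtree/-/jobs/artifacts/${BRANCH}/raw/aptly/public-key.asc?job=${JOB_NAME}"
---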
Just a side note: be careful about expiring artifacts. In some projects (depending on the settings) only the latest artifact is kept and older ones might be expired (deleted) after some time. I don't think that this is the case for moin-team/emeraldtree, after a quick check... but I'm unsure where this is properly checked on GitLab.