• Publishing multiple packages with aptly in Salsa CI

    From Paul Boddie@21:1/5 to All on Sun Sep 10 23:00:01 2023
    Hello,

    A few weeks ago, I asked about techniques for making new packages available to other new packages so that the autopkgtest job could be run successfully in a pipeline in the Salsa CI environment. Eventually, this was made to work by taking advantage of the aptly job defined in the standard Debian CI pipeline.

    To recap, the aptly job publishes the package built during the execution of a pipeline by creating a dedicated apt-compatible package repository just for that package. Such repositories can then be made available to various jobs in the pipeline of another package, allowing the successful installation of that package and its dependencies, including new packages, and the successful execution of jobs like autopkgtest.
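
    In concrete terms, making such a repository visible to apt in another package's pipeline boils down to something like the sketch below. The project path, job reference, suite and component are placeholders, and [trusted=yes] is only an illustrative shortcut; configuring the repository's signing key is the proper approach. (The Salsa CI pipeline also has variables for declaring extra repositories, SALSA_CI_EXTRA_REPOSITORY if I remember the name correctly, which may be a cleaner route.)

        # Illustrative fragment for the consuming package's debian/salsa-ci.yml.
        # NEW_PACKAGES_REPO is a placeholder for the artifacts of the publishing
        # pipeline's aptly job.
        variables:
          NEW_PACKAGES_REPO: "https://salsa.debian.org/TEAM/OTHER-PACKAGE/-/jobs/JOB_ID/artifacts/raw/aptly"

        .add-new-packages-repo:
          before_script:
            - echo "deb [trusted=yes] $NEW_PACKAGES_REPO unstable main" > /etc/apt/sources.list.d/new-packages.list
            - apt-get update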

    However, one other thing I wanted to achieve was to take the complete set of new packages and to publish them in a single package repository. This would allow people to install and test the built packages in a more convenient fashion than asking them to hunt down each built package from job artefacts or to build the packages themselves.

    Obviously, the aptly job in the standard Debian CI pipeline publishes a single package (or maybe a collection of packages built from a single source
    package), but I wanted to aggregate all packages published by a collection of aptly repositories. Fortunately, it seems that this is possible by augmenting the existing aptly job definition as shown in the following file:

    https://salsa.debian.org/moin-team/moin/-/blob/debian/master/debian/salsa-ci.yml
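
    Stripped of the details, the augmentation amounts to overriding the standard aptly job roughly as follows. This is a simplified sketch rather than the contents of the linked file, and the script names are invented for illustration:

        # Simplified sketch of the aptly override in debian/salsa-ci.yml; the
        # real script names live in the package's debian directory, and this
        # assumes the inherited job does not depend on its own before_script.
        aptly:
          variables:
            GIT_STRATEGY: clone            # keep the Git checkout so the debian/ scripts are present
          before_script:
            - debian/salsa-ci/add-repositories.sh    # point apt at the other packages' aptly repositories
            - debian/salsa-ci/fetch-packages.sh      # fetch their source and binary packages for aptly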

    Ignoring the "ls" command which was there to troubleshoot this rather opaque environment, one script adds repository definitions to the apt configuration, whereas the other performs the appropriate "apt source" and "apt download" commands, making the source and binary package files available for aptly to process. Consequently, aptly will include all the dependency packages in the final package repository.

    (Since the scripts reside in my package's debian directory, I also had to request the availability of the Git repository for the job. Generally, I have found it challenging to have these job definitions make effective use of scripts due to environmental inconsistencies.)

    I imagine that publishing packages like this is not particularly desirable, at least if done widely, but I hope it shows that it can be done relatively easily, at least if all the right incantations have been discovered.

    Paul

  • From Santiago Ruano Rincón@21:1/5 to All on Fri Sep 15 12:50:01 2023
    Hi,

    > Hello,
    >
    > A few weeks ago, I asked about techniques for making new packages available to other new packages so that the autopkgtest job could be run successfully in a pipeline in the Salsa CI environment. Eventually, this was made to work by taking advantage of the aptly job defined in the standard Debian CI pipeline.
    >
    > To recap, the aptly job publishes the package built during the execution of a pipeline by creating a dedicated apt-compatible package repository just for that package. Such repositories can then be made available to various jobs in the pipeline of another package, allowing the successful installation of that package and its dependencies, including new packages, and the successful execution of jobs like autopkgtest.
    >
    > However, one other thing I wanted to achieve was to take the complete set of new packages and to publish them in a single package repository. This would allow people to install and test the built packages in a more convenient fashion than asking them to hunt down each built package from job artefacts or to build the packages themselves.
    >
    > Obviously, the aptly job in the standard Debian CI pipeline publishes a single package (or maybe a collection of packages built from a single source package), but I wanted to aggregate all packages published by a collection of aptly repositories. Fortunately, it seems that this is possible by augmenting the existing aptly job definition as shown in the following file:
    >
    > https://salsa.debian.org/moin-team/moin/-/blob/debian/master/debian/salsa-ci.yml

    [snip]

    Please consider discussing this on the Salsa CI mailing list: debian-salsa-ci@alioth-lists.debian.net (taking the liberty to CC it).

    Ignoring the "ls" command which was there to troubleshoot this rather opaque environment, one script adds repository definitions to the apt configuration,
    whereas the other performs the appropriate "apt source" and "apt download" commands, making the source and binary package files available for aptly to process. Consequently, aptly will include all the dependency packages in the final package repository.

    (Since the scripts reside in my package's debian directory, I also had to request the availability of the Git repository for the job. Generally, I have
    found it challenging to have these job definitions make effective use of scripts due to environmental inconsistencies.)

    With my Salsa CI maintainer's hat on, I am happy to receive MRs ;-)


    > I imagine that publishing packages like this is not particularly desirable, at least if done widely, but I hope it shows that it can be done relatively easily, at least if all the right incantations have been discovered.

    It depends. Right now, I am looking into making it easier to test reverse dependencies, which is somewhat related. As long as Salsa CI also makes it possible to control the number of jobs, so as not to abuse the Salsa infrastructure, any enhancement is welcome.

    cheers,

    -- Santiago

  • From Philip Hands@21:1/5 to Someone on Fri Sep 15 16:40:02 2023
    Hi,

    Someone wrote:

    >> However, one other thing I wanted to achieve was to take the complete set of new packages and to publish them in a single package repository. This would allow people to install and test the built packages in a more convenient fashion than asking them to hunt down each built package from job artefacts or to build the packages themselves.
    >>
    >> Obviously, the aptly job in the standard Debian CI pipeline publishes a single package (or maybe a collection of packages built from a single source package), but I wanted to aggregate all packages published by a collection of aptly repositories. Fortunately, it seems that this is possible by augmenting the existing aptly job definition as shown in the following file:
    >>
    >> https://salsa.debian.org/moin-team/moin/-/blob/debian/master/debian/salsa-ci.yml

    For another angle, see:

    https://salsa.debian.org/philh/user-setup/-/pipelines/576662

    In which I have a `harvest-repos` job that grabs artifacts from `build` jobs in other pipelines, and an `aptly-plus` job with an added `needs: harvest-repos`, which combines the artifacts from the build and harvest-repos jobs and lumps them all together.
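
    In GitLab CI terms, the shape of that pair of jobs is roughly the following. This is only a sketch: the other project's path and the artifact handling are guesses for illustration rather than what the real setup does:

        # Rough shape of the two jobs; details are placeholders.
        harvest-repos:
          stage: build
          image: debian:stable-slim
          variables:
            # Placeholder: artifacts download URL of a build job in another pipeline.
            OTHER_BUILD_ARTIFACTS: "https://salsa.debian.org/TEAM/OTHER-PACKAGE/-/jobs/artifacts/debian/master/download?job=build"
          script:
            - apt-get update && apt-get install --no-install-recommends -y ca-certificates curl unzip
            - curl --fail --location --output artifacts.zip "$OTHER_BUILD_ARTIFACTS"
            - unzip artifacts.zip -d harvested
          artifacts:
            paths:
              - harvested/

        aptly-plus:
          extends: aptly                   # reuse the standard aptly job definition
          needs:
            - job: build
              artifacts: true
            - job: harvest-repos
              artifacts: true
          # With both sets of artifacts available, the aptly step can publish
          # the locally built and harvested packages together.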

    In the resulting aptly repo you can see that it includes both the
    local package (user-setup) under the 'u' directory, and 'grub-installer'
    under the 'g' directory:

    https://salsa.debian.org/philh/user-setup/-/jobs/4671054/artifacts/browse/aptly/pool/main/

    That's done with:

    https://salsa.debian.org/installer-team/branch2repo/

    Quite a lot of that is already part of the standard salsa-CI pipeline,
    and my aim is that branch2repo will pretty much disappear, with its
    components being optional bits of the standard pipeline, and maybe a few variable settings.

    Cheers, Phil.
    --
    Philip Hands -- https://hands.com/~phil

  • From Paul Boddie@21:1/5 to All on Fri Sep 15 17:50:01 2023
    On Friday, 15 September 2023 16:33:25 CEST Philip Hands wrote:

    > For another angle, see:
    >
    > https://salsa.debian.org/philh/user-setup/-/pipelines/576662
    >
    > In which I have a `harvest-repos` job that grabs artifacts from `build` jobs in other pipelines, and an `aptly-plus` job with an added `needs: harvest-repos`, which combines the artifacts from the build and harvest-repos jobs and lumps them all together.
    >
    > In the resulting aptly repo you can see that it includes both the local package (user-setup) under the 'u' directory, and 'grub-installer' under the 'g' directory:
    >
    > https://salsa.debian.org/philh/user-setup/-/jobs/4671054/artifacts/browse/aptly/pool/main/

    Yes, that was the desired effect: to have packages corresponding to several source packages made available in the pool and thus in the created package repository.

    > That's done with:
    >
    > https://salsa.debian.org/installer-team/branch2repo/
    >
    > Quite a lot of that is already part of the standard salsa-CI pipeline, and my aim is that branch2repo will pretty much disappear, with its components being optional bits of the standard pipeline, and maybe a few variable settings.

    Indeed. As noted before, my modifications were effectively (1) to keep the source repository around to get access to my scripts, (2) to be able to add package repositories to the apt configuration, and (3) to download packages so that aptly can process them.

    (1) is just preferable to writing scripts in the horrible YAML syntax and then doing fragment inclusion, which I found can lead to opaque error messages. Being able to obtain a set of scripts would be quite helpful generally as the environment can vary substantially from one kind of job to another.
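
    For illustration, the YAML-only alternative is something along these lines, keeping the commands in a hidden key and splicing them in with GitLab's !reference tag (the names are invented for the example):

        # Keeping the commands in the YAML itself and splicing them in.
        .scripts:
          add-repositories:
            - echo "deb [trusted=yes] $NEW_PACKAGES_REPO unstable main" > /etc/apt/sources.list.d/new-packages.list
            - apt-get update

        aptly:
          before_script:
            - !reference [.scripts, add-repositories]

    This works, but mistakes tend to surface as the kind of opaque error messages mentioned above, which is why I prefer plain scripts in the debian directory.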

    (2) is something that could be part of the standard job definitions, particularly since other jobs like autopkgtest need to know about additional repositories in certain cases, brand new packages being one of them.

    (3) is just a small enhancement for this specific scenario.

    I thought that someone must have done this before, even if the standard pipeline didn't support it. I imagine that some coordination would be
    desirable to prevent fragmentation and people like me introducing our own ways of addressing this particular need.

    Paul
