I'd rather propose choice C, because I understand, to some extent,
both sides who support either A or B. I maintain bulky C++ packages,
and I also have a little experience reviewing packages on behalf of
the ftp-team.
A -- Some (e.g. C++) packages may frequently enter the NEW queue
with an OLD src and NEW bins (e.g. on a SOVERSION bump). A portion of
devs feel such frequent re-review is not necessary, because it
drastically slows down the maintainer's work. In the worst case, when
the package has finally passed the NEW queue, the maintainer may have
to go through the NEW queue again upon the next upload. (This is very
likely to happen for tensorflow, etc.)
B -- Uploads with OLD src and OLD bins are not required to go through
the NEW queue, even if a decade has passed, as long as the src names
and bin names are kept unchanged. One argument in support of B is that
the d/copyright file may silently rot (get outdated), as uploads
without an updated d/copyright won't be rejected. Checking packages
when they bump SOVERSION is, to some extent, a "periodic" check.
This has worked very well for packages with a stable ABI. But for
packages without a stable ABI, especially bulky (C++) packages, it is
painful for both uploaders and ftp checkers.
Given the understanding of both options, I propose choice C:
C. Lottery NEW queue:

    if (src is new):
        # completely new package
        require manual-review
    elif (src is old) but (bin is new):
        if not-checked-since-last-stable-release:
            # approximates the advantage of choice B
            require manual-review
        elif src.version already exists in archive:
            # choice A wants to avoid this case
            auto-accept
        else:
            if (lottery := random()) < threshold:
                require manual-review
            else:
                # I expect a faster pace of Debian development.
                auto-accept
In this way, the concerns of people supporting both A and B can be
partly addressed. The old-src-new-bin case has a large chance to pass
NEW as long as the package has been reviewed once since the last stable
release. The burden on the ftp-team can be reduced, and the pace of
maintainers can be faster, with less chance of being blocked and unable
to do anything but wait.
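For concreteness, the decision logic of choice C can be sketched as a
small Python function. The predicate names and the 20% threshold are
purely illustrative, not an actual dak implementation:

```python
import random

def new_queue_decision(src_is_new, bin_is_new,
                       checked_since_stable, src_version_in_archive,
                       threshold=0.2):
    """Sketch of choice C: decide whether an upload needs manual review."""
    if src_is_new:
        # completely new source package: always reviewed
        return "manual-review"
    if bin_is_new:
        if not checked_since_stable:
            # approximates the advantage of choice B
            return "manual-review"
        if src_version_in_archive:
            # the case choice A wants to avoid re-reviewing
            return "auto-accept"
        # the lottery: an occasional spot check
        if random.random() < threshold:
            return "manual-review"
        return "auto-accept"
    # old src and old bins never go through NEW today
    return "auto-accept"
```

With threshold = 0.2, roughly one in five eligible old-src-new-bin
uploads would still get a human look.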
I would love to have this clearly documented, since in case B I would
stop wasting my time and the ftpmasters' time with nagging that is
then never rectified.
I personally clearly prefer A, and I wish we could clarify this
situation.
Me too, I prefer A personally as well. Debian might be the only major
distribution that checks licenses this thoroughly. Unconditionally
allowing an old-src-new-bin upload to pass is basically impossible, I
speculate. Choice C might be more practical and feasible.
There must be many outdated d/copyright files in our archive. Letting
eligible uploads pass automatically with a high probability is not
likely to cause problems, even if yet another outdated d/copyright
sneaks in.
[1] https://lists.debian.org/debian-devel/2021/07/msg00231.html
[2] https://ftp-master.debian.org/new/onetbb_2021.4.0-1~exp1.html
2. New binary package "steals" binary from another source. This is sometimes
OK. Sometimes it's accidental. It could also be malicious (I don't remember
if I've ever actually seen this done for an intentional "steal" or not, I
might have).
On Fri, Jan 21, 2022 at 01:28:54PM -0500, Scott Kitterman wrote:
2. New binary package "steals" binary from another source. This is sometimes OK. Sometimes it's accidental. It could also be malicious (I don't remember if I've ever actually seen this done for an intentional "steal" or not, I might have).
Stealing a binary does not go through NEW.
1. When the SO name changes and the binary package name is adjusted accordingly, it is not super rare for the maintainer to mess something up in the renaming and end up with an empty binary package, which does no one any good. I note that for debhelper compat 15 there appears to be some related work in progress. Perhaps this is, or can be extended to be, sufficient to eventually make this kind of error a thing of the past.
Hi Mo,
Am Fri, Jan 21, 2022 at 09:51:12AM -0500 schrieb M. Zhou:
I'd rather propose choice C, because I understand, to some extent,
both sides who support either A or B. I maintain bulky C++ packages,
and I also have a little experience reviewing packages on behalf of the ftp-team.
A -- Some (e.g. C++) packages may frequently enter the NEW queue
with an OLD src and NEW bins (e.g. on a SOVERSION bump). A portion of
devs feel such frequent re-review is not necessary, because it
drastically slows down the maintainer's work. In the worst case, when
the package has finally passed the NEW queue, the maintainer may have
to go through the NEW queue again upon the next upload. (This is very
likely to happen for tensorflow, etc.)
I have heard this argument and my mail was simply to find out what
fellow developers think about this. IMHO the issue is sufficiently
important to have some kind of documented consensus about this.
Can we have better automated tooling, either in Lintian or when
source packages are rebuilt, that can take care of this?
The other thing that's perhaps worth considering here is that,
unfortunately, there are some upstreams that are extremely
irresponsible about library ABI backwards compatibility, bumping the
SONAME at essentially every release. I recall one extreme case a few
years ago where there were over ten(!) SONAME bumps for a particular
library over 12 months.
But if we're going to do that, then we could also just support static
libraries, and simply rebuild all of the packages that link statically
with libshaky, thus solving the security argument for shared
libraries. This also avoids the fairness problem where some packages
are regularly going through ftpmaster review and others aren't...
On Fri, 2022-01-21 at 13:55 -0500, Theodore Ts'o wrote:
The other thing that's perhaps worth considering here is that, unfortunately, there are some upstreams that are extremely irresponsible about library
ABI backwards compatibility, bumping the SONAME at essentially every release.
You could avoid NEW for these SONAME bumps by using a single binary
package and ensuring that the symbols/shlibs depend on the right
version ranges. Or add Provides libfoo-abi-N or libfoo-abi (= N)
and have the symbols and/or shlibs generate dependencies on that.
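A rough sketch of what that could look like (all package and SONAME
names are made up, and this is untested; deb-shlibs(5) and
deb-control(5) describe the real file formats):

```
# debian/control (excerpt): a single library package advertising its
# current ABI through a versioned virtual package.
Package: libfoo
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Provides: libfoo-abi-4

# debian/libfoo.shlibs: have dpkg-shlibdeps generate a dependency on
# the ABI virtual package for anything linked against libfoo.so.4.
libfoo 4 libfoo-abi-4
```

On the next SONAME bump, only the Provides line and the shlibs entry
change; the binary package name stays the same, so the upload never
enters NEW, while old reverse dependencies become uninstallable until
rebuilt.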
I have heard this argument and my mail was simply to find out what
fellow developers think about this. IMHO the issue is sufficiently important to have some kind of documented consensus about this.
It's not only the copyright that the ftp-masters are responsible for. New binaries fill a place in the Debian namespace and they *are* the keepers of that.
And https://lists.debian.org/debian-devel/2021/07/msg00231.html
It's not only the copyright that the ftp-masters are responsible for. New binaries fill a place in the Debian namespace and they *are* the keepers
of that.
The other thing that's perhaps worth considering here is that, unfortunately, there are some upstreams that are extremely irresponsible about library
ABI backwards compatibility, bumping the SONAME at essentially every release. I recall one extreme case a few years ago where there
were over ten(!) SONAME bumps for a particular library over 12 months.
You could avoid NEW for these SONAME bumps by using a single binary
package and ensuring that the symbols/shlibs depend on the right
version ranges. Or add Provides libfoo-abi-N or libfoo-abi (= N)
and have the symbols and/or shlibs generate dependencies on that.
In the past I've suggested that a solution to static linking and binary
packages containing source could be to have a service scanning every
binary package for static/source files (.a, Rust, Golang, etc.), noting
the relevant package/version tuples, and then searching the buildinfo
files for binary packages that were built with those packages installed
and automatically rebuilding the affected packages, or having a service
that would let you manually rebuild packages affected by security issues.
https://wiki.debian.org/StaticLinking
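A minimal sketch of the buildinfo-searching step, assuming a local
directory of .buildinfo files (the directory layout and function name
are hypothetical; Installed-Build-Depends is a real field recorded in
.buildinfo files):

```python
from pathlib import Path

def builds_using(buildinfo_dir, package, version):
    """Return the .buildinfo files whose build environment included
    the given package at the given version, i.e. candidates for a
    rebuild after that package receives a security fix."""
    affected = []
    for path in sorted(Path(buildinfo_dir).glob("*.buildinfo")):
        # Installed-Build-Depends lists "name (= version)" entries,
        # one per continuation line.
        if f"{package} (= {version})" in path.read_text():
            affected.append(path.name)
    return affected
```

A real service would of course parse the deb822 stanzas properly
rather than substring-match, but the principle is the same.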
That only works if there are no other packages depending on those
shared libraries which are coming from other source packages.
But my claim is that if upstream can't manage to maintain a stable
ABI, then maybe we shouldn't be trying to ship shared libraries.
Officially, though, that's not allowed, since it's considered bad.
If we have that solution for Rust and Golang, then maybe it can also
make life easier for upstreams that can't maintain an ABI.
On Sun, 2022-01-23 at 17:43 -0500, Theodore Y. Ts'o wrote:
That only works if there are no other packages depending on those
shared libraries which are coming from other source packages.
I don't think that is true; I believe you can put multiple things in
the dependencies field of a shlibs file and dpkg-shlibdeps will propagate
that to reverse dependencies just fine. From the manual pages it looks
like the same applies to symbols files. I found that there are
*already* packages on my system that do something similar (see below).
Hi Ted,
I think this is the second time you write something like this, but for dynamically linked libraries the rebuild does happen, driven by the Release Team (please use transition trackers for that), because we automatically track transitions [1]. Unless people don't follow the convention that the binary package name matches the SONAME. But nowadays we find those more and more thanks to autopkgtest (reverse dependencies that fail because they can't find the appropriate library). It is becoming increasingly difficult to hide the
fact that your package is not named appropriately.
So could the Release Team figure out a way to automatically rebuild
packages that have source dependencies on static libraries?
This would solve the problem of new binary packages causing a full
ftpmasters policy review of packages, at least for those who need to
create new binary packages each time SONAME gets bumped.
I didn't comment at first because I thought someone else would raise
the idea. But it seems people still like the idea of a NEW queue. Not
me. The NEW queue is a hindrance.
I'd rather propose choice C, because I understand, to some extent,
both sides who support either A or B. I maintain bulky C++ packages,
and I also have a little experience reviewing packages on behalf of
the ftp-team.
I just don't think the solution is to ignore copyright or licensing statements.
I didn't comment at first because I thought someone else would raise
the idea. But it seems people still like the idea of a NEW queue. Not
me. The NEW queue is a hindrance.
For the record, I don't "like" the NEW queue.
I don't like current copyright laws, and I suspect a fair number of
people involved in Free Software don't either.
I just don't think the solution is to ignore copyright or licensing statements.
For me, the copyright check is just a bad excuse. People upload
non-distributable stuff everywhere and it seems the world continues to
go round. What amount of non-distributable packages is stopped by the
NEW queue?
I just don't think the solution is to ignore copyright or licensing statements.
That's not the goal. The question, which keeps being raised in part
because I don't think it's gotten a good answer, is what the basis is for treating copyright and licensing bugs differently than any other bug in Debian?
The need for pre-screening was obvious when we had export control issues,
but my understanding is that those have gone away. Are we working from
legal advice telling us that this pre-screening is required for some legal purpose? If so, is it effective for the legal purpose at which it is
aimed? Is this system left over from old advice? Have we checked our assumptions recently?
NEW processing is a lot of friction for the project as a whole and a lot
of work for the ftp team. If we were able to do less work at the cost of
a minimal increase in bugs, or at the cost of handling bugs a bit differently, maybe that would be a good thing?
In other words, it's unclear what requirements we're attempting to meet
and what the basis of those requirements is, which makes it hard to have a conversation about whether the current design is the best design for the problem we're trying to solve.
Quoting Vincent Bernat (2022-01-25 21:38:01)
I didn't comment at first because I thought someone else would raise
the idea. But it seems people still like the idea of a NEW queue. Not
me. The NEW queue is a hindrance.
For the record, I don't "like" the NEW queue.
I don't like current copyright laws, and I suspect a fair number of
people involved in Free Software don't either.
I just don't think the solution is to ignore copyright or licensing statements.
For me, the copyright check is just a bad excuse. People upload non-distributable stuff everywhere and it seems the world continues to go round. What amount of non-distributable packages is stopped by the NEW
queue?
I think we should forego the NEW queue. If people want to check
packages, they can do it once they are in unstable with regular bugs.
Current checks are partly done by Lintian and I suppose people could
watch new Lintian warnings and detect bad packages quickly.
This could be done when src is not NEW as a test.
Jonas Smedegaard <jonas@jones.dk> writes:
Quoting Vincent Bernat (2022-01-25 21:38:01)
I didn't comment at first because I thought someone else would raise
the idea. But it seems people still like the idea of a NEW queue. Not
me. The NEW queue is a hindrance.
I don't like current copyright laws, and I suspect a fair number of
people involved in Free Software don't either.
I just don't think the solution is to ignore copyright or licensing statements.
To me, the elephant in the room is this question: Does the way the NEW
queue currently works provide good (good enough?) assurances to
ourselves that we are *not* ignoring copyright or licensing?
On Tue, Jan 25, 2022 at 09:38:01PM +0100, Vincent Bernat wrote:
I think we should forego the NEW queue. If people want to check
packages, they can do it once they are in unstable with regular bugs.
Without the NEW queue, there would be no point at which packaging receives any sort of review. I'd prefer Debian to deliver at least some level of quality.
Otherwise, we'd fall to the level of NPM. And there are ample examples of what that would mean.
On Tue, Jan 25, 2022 at 09:38:01PM +0100, Vincent Bernat wrote:
I think we should forego the NEW queue. If people want to check
packages, they can do it once they are in unstable with regular bugs.
Without the NEW queue, there would be no point at which packaging receives any sort of review. I'd prefer Debian to deliver at least some level of quality.
Otherwise, we'd fall to the level of NPM. And there are ample examples of what that would mean.
Current checks are partly done by Lintian and I suppose people could
watch new Lintian warnings and detect bad packages quickly.
Lintian is just a dumb machine that can ease human reviews but not replace them.
This could be done when src is not NEW as a test.
I've managed to trample upon someone else's package just yesterday -- and it escaped automated checks because a binary of that name already existed in
the archive, just not on any arch which I test.
For practical reasons we have to obey the laws, no matter how oppressive
they are. But I don't see why we should do more than e.g. Fedora, which
has corporate backing with an actual legal team.
On Tue, Jan 25, 2022 at 09:38:01PM +0100, Vincent Bernat wrote:
For me, the copyright check is just a bad excuse. People upload
non-distributable stuff everywhere and it seems the world continue to go
round. What amount of non-distributable packages is stopped by the NEW
queue?
I think we should forego the NEW queue. If people want to check
packages, they can do it once they are in unstable with regular bugs.
Without the NEW queue, there would be no point at which packaging receives any sort of review. I'd prefer Debian to deliver at least some level of quality.
Otherwise, we'd fall to the level of NPM. And there are ample examples of what that would mean.
I think we should forego the NEW queue. If people want to check
packages, they can do it once they are in unstable with regular bugs.
Current checks are partly done by Lintian and I suppose people could
watch new Lintian warnings and detect bad packages quickly. This could
be done when src is not NEW, as a test. People could lose their upload
rights if they are caught abusing the system (and get DM rights for some selected packages instead).