PS: To preempt any questions as for why, the background for my decision
to stop maintaining any packages is this thread, but it's really just
the straw that broke the camel's back
https://alioth-lists.debian.net/pipermail/pkg-rust-maintainers/2022-August/022938.html
On August 25, 2022 10:52:56 AM GMT+02:00, "Sebastian Dröge" <slomo@debian.org> wrote:
A bit off-topic, but I think we really ought to discuss (address?) this elephant in the room once more. I don't have the answers, but Sebastian's email yet again clearly illustrates how the status quo is hurting the project. This clear example comes in addition to worries raised before about what the status quo does to recruitment of new developers.
PS: I do not imply that the elephant in the room is the ftpmasters. I'm thinking of the *process*. The people involved put in admirable work in carrying out said process.
Quoting Gard Spreemann (2022-08-26 08:49:21)
A bit off-topic, but I think we really ought to discuss (address?)
this elephant in the room once more. I don't have the answers, but
Sebastian's email yet again clearly illustrates how the status quo
is hurting the project. This clear example comes in addition to
worries raised before about what the status quo does to recruitment
of new developers.
PS: I do not imply that the elephant in the room is the
ftpmasters. I'm thinking of the *process*. The people involved put
in admirable work in carrying out said process.
The way I see it, the process is clear: provide *source* to build from.
If there is "source" built from another source, then that other source
is the true source.
If ftpmasters sometimes approve intermediary works as source, then that
is not a reason to complain that they are inconsistent - it is a reason
to acknowledge that ftpmasters try their best just as the rest of us,
and that the true source is the true source regardless of missing it sometimes.
Yes, this is painful. Yes, upstreams sometimes consider us stupid to
care about this. Nothing new there, and not a reason to stop doing it.
If you disagree, then please *elaborate* on what you find sensible -
don't just assume we all agree and state only that the process is an elephant.
The way I see it, the process is clear: provide *source* to build from.
If there is "source" built from another source, then that other source
is the true source.
I'm afraid I cannot respond to a message of this length. As I
mentioned previously, all the ftpteam really have the bandwidth to do
is process what's in NEW.
On Fri, 26 Aug 2022 at 09:09:25 +0200, Jonas Smedegaard wrote:
The way I see it, the process is clear: provide *source* to build from.
If there is "source" built from another source, then that other source
is the true source.
I hope you agree that we are doing this in order to get some desirable properties of building from source, rather than because the rules are
the rules and we follow them without any critical thinking?
I feel strongly that if our policies are not meeting their goals of
advancing Free Software and providing benefit to our users, then we
should reconsider whether they are the right policies.
I don't think building from the least-derived form is always the right
thing to do. For instance, take rust-glib-sys, a package of Rust bindings
for the GLib library, which is one of the libraries whose handling
prompted this thread. It contains a description of the GLib library's API/ABI, basically the Rust equivalent of a C header file. It could
have been written by hand, but that would be tedious and error-prone,
so instead its upstream maintainers chose to machine-generate the API/ABI description from GLib. The thing that would be closest to "true source"
would presumably be to redo that machine-generation process, from first principles, at build-time.
However, if we did that by using the version of GLib in the archive,
then the API of the Rust bindings would not be entirely determined
by the content of the rust-glib-sys package as tracked by its version
number, which seems really bad for library stability: you could take
the same version number, build it twice in different environments, and
get two different APIs! rust-glib-sys could have a minimum supported
GLib version, but then would unpredictably have or not have additional
APIs beyond the ones present in that version, depending on the version
of GLib that happened to be installed at build-time.
If the generation process is not always 100% backwards-compatible, which
I suspect it is not, then that just makes it worse: now you can't even
rely on a rebuilt version of rust-glib-sys being compatible with the
version from before it was rebuilt! If the API description is curated by
the upstream maintainers, then they have an opportunity to detect incompatible changes and release them with a suitably changed version
number, or even as a separate parallel-installable version, so that
dependent projects can migrate at their own pace; but that can't happen
if the API description is generated at build time.
Or, the other way to generate rust-glib-sys from "true source" would be
to bundle a complete copy of GLib source code in it, and generate the
API description from that (which, as an implementation detail of the way
it was done, requires compiling GLib and then introspecting the binary,
and cannot be done while cross-compiling). This is not great from either
a technical or social point of view. From a technical point of view, rust-glib-sys' build system would have to be taught to compile GLib,
which uses a totally different build system (Meson rather than Cargo)
and is quite a large library, resulting in a much slower and less
reliable build. From a social point of view, again, GLib is not small,
and I don't think either the rust-glib-sys maintainer or the ftp team
would welcome the requirement to review another complete copy of GLib, transcribe all its potential copyright holders into debian/copyright
and check that it has been done, and so on.
Generating the API description in a way that does not arbitrarily vary
based on installed software might also require bundling and building a complete copy of GObject-Introspection, which, again, is not small.
All of this is for a functional interface description, analogous to a C header file without comments. In other contexts elsewhere in the project,
we rely on the functional parts of an interface description as not being strongly protected by copyright (after all, the whole GNU system started
as a compatible implementation of the interfaces provided by 1980s Unix,
and we have code in Debian that is a compatible reimplementation of
Windows interfaces), and we strongly limit the modifications that we
are prepared to make to interface descriptions (because Free Software
is important to us, we require the technical and legal ability to make modifications, but because API/ABI compatibility is also important to us,
we treat many of those modifications as something that we will refuse
to do unless there is a really compelling reason).
More generally, I don't think it's always useful to talk about "the"
source or "the" preferred form for modification, as though there is only
one. I think it would be more appropriate to consider whether the form
in which some software is provided is suitable for exercising your Free Software rights (as described in the FSF's "four essential freedoms",
for example) within the scope of whatever package we're talking about at
the time, or whether it's unsuitable for that use. If it's suitable, then it's source, or close enough; if it's unsuitable, then that needs fixing.
If we insist on a particularly puritanical view of what is source and
what is the preferred form for modification, then I think there is a significant risk of producing a distribution which is unquestionably Free Software, but either is missing useful Free software because it would be
too hard to get that software into a form that meets our self-imposed policies, or can only contain that software as a result of individual developers putting a heroic amount of effort into meeting those policies - particularly if we always go back to the "true source" and generate from there every time (which I will note that the ftp team specifically do not insist on unless there is a technical reason to do so, they merely require source to be *available*).
If we require contributors to do a considerable amount of work that
does not advance the project's goals, at best that's a waste of our most limited resource (developers' time and motivation), and at worst it's a recipe for burned-out contributors, which we absolutely should not want,
both because we're a community that cares about the well-being of our contributors (or at least I hope we are!) and because even from a purely amoral/utilitarian point of view, contributors giving up on the project
harm our ability to achieve our goals.
smcv
"Simon" == Simon McVittie <smcv@debian.org> writes:
I don't think building from the least-derived form is always the
right thing to do.
For instance, take rust-glib-sys, a package of Rust bindings
for the GLib library, which is one of the libraries whose handling
prompted this thread.
To be honest, in terms of volunteer review work, waiting
for several months is not something new. In academia, it may
take several months to years to get a response on a journal paper.
I have tried to think of possible ways to improve the process, but
several observations eventually changed my mind, and I'm willing
to accept the status quo.
* There is a trade-off between rigorousness and efficiency.
Any change in the process may introduce disadvantages, and
the most difficult thing is to reach an agreement.
* We will add more work for the ftp team if we get them involved in the
discussion of possible (but unsure) ways to improve NEW.
My ultimate opinion on NEW processing is neutral, and my only
hope for the ftp team is that they increase the pace of recruiting new members.
To be concrete: it is much harder to write a concrete proposal
for debian-vote@l.d.o than to discuss possibilities.
I understand we may have the enthusiasm to sprint on something.
However, in terms of the long-term endeavor of Debian development,
a negligible popcon number is no less disappointing than
a long-term wait to clear the NEW queue.
If one's enthusiasm on working on some package is eventually
worn out after a break, then try to think of the following question:
Is it really necessary to introduce XXX to Debian?
Must I do this to have fun?
Strong motivations such as "I use this package, seriously" are not
likely to wear out very easily over time. Packages maintained
with a strong motivation are better cared for than other packages in our
archive.
Why not calm down, and try to do something else as interesting
as Debian development while waiting on the NEW queue?
Oh no, then we instead insist that related work stops.
There are a lot of examples of busywork in Debian, such as documenting licenses, packaging dependencies, removing non-free files that are only
in source packages, runtime selection of correct CPU instructions,
fixing build failures, porting reverse dependencies to newer versions
of APIs, etc. All of these are things that contributors complain about,
and get burned out by, when we require or even suggest them. All of them,
however are necessary in some way. I think the requirements around
source and building are just another example of this.
contributing work. In some sense, contributing to Debian becomes mostly
about waiting. (Sure, there is something to be said about extremely
short, fragmented attention spans being unhealthy – but some
contributions are naturally short and easy, and we certainly don't want
to drive those away.)
If one's enthusiasm on working on some package is eventually
worn out after a break, then try to think of the following
question:
 Is it really necessary to introduce XXX to Debian?
I hope we won't try to define what "necessary" means, or have it
become a criterion for inclusion :-)
 Must I do this to have fun?
I don't think Debian contribution has ever been a necessary condition
for fun. That's an incredibly high bar. If we were only to attract
people whose only idea of fun was contributing to Debian, I think we'd
become a very unhealthy project (and one severely lacking in
contributors).
Strong motivations such as "I use this package, seriously" are not
likely to wear out very easily through time. Packages maintained
with a strong motivation are better cared among all packages in our archive.
I humbly disagree. Even from my own point of view, I may well be very
motivated to package something I use seriously all the time,
seriously. But then I see its dependency chain of 10 unpackaged items,
start thinking about the probability that they'll *all* clear the NEW
queue, and how long that would take, and I give up. And then there's
the problem of attracting smaller contributions, as mentioned above: I
really believe that people get put off from putting in 30 minutes of
work for a nice MR on Salsa if they can't expect their work to hit the
archives for months and months (suppose for example they contributed to
a package whose SONAME is being bumped).
Why not calm down, and try to do something else as interesting
as Debian development while waiting on the NEW queue?
Sure. That's what I do. My list of joyful and less joyful things to
fill my days with is enormous. **BUT: I worry for the project if our
solution to the problem at hand is "maybe just contribute less to
Debian".** Is that really what we want?
That's why I still hope the ftp team will recruit more people. This is
a very direct and constructive way to speed up everything.
More volunteers = higher bandwidth.
Recruiting more people doesn't seem to have a serious disadvantage.
I forecast this thread will eventually end up with
"calm down and take a break" solution again.
On 2022-08-27 15:53, M. Zhou wrote:
That's why I still hope ftp team to recruit more people. This is
a very direct and constructive way to speed up everything.
More volunteers = higher bandwidth.
Recruiting more people doesn't seem to have a serious disadvantage.
It does not seem to work. Either people don't want to do that, or the FTP team is too picky about the candidates.
On Sat, 2022-08-27 at 09:50 +0200, Gard Spreemann wrote:
I agree with your disagreement but I keep my opinion. My track record
involves maintaining loads of reverse-dependency libraries. I've
already gone through all kinds of pains from the NEW queue and
eventually learned to take a break immediately after uploading
something to NEW.
That said, if someone presents a GR proposal I'll join. In Debian,
it is not that easy to push something forward unless it hurts everyone.
Our NEW queue mechanism has been there for decades, and people are
already accustomed to it (including me). From multiple past
discussions, I don't see the NEW queue problem hurting
too many people. If nothing gets changed in the NEW queue mechanism,
people may gradually get used to it, following the "if it ain't
broke, don't fix it" rule. The voices will gradually vanish.
If one's enthusiasm on working on some package is eventually
worn out after a break, then try to think of the following question:
Is it really necessary to introduce XXX to Debian?
In my fuzzy memory, the last discussion on NEW queue improvement
involved the disadvantages of allowing SOVERSION bumps to directly
pass the NEW queue. I'm not going to trace back, because I know
this will not be implemented unless someone proposes a GR.
I don't think building from the least-derived form is always the
right thing to do.
Personally, I believe that instances of that represent bugs in how the upstream source trees and build processes are organised.
I feel like the right way to organise this upstream is for GLib itself
to be building the bindings for itself (as Bastian Roucariès suggests).
Alternatively, if the GLib bindings really need to be separate, then
GLib could build the XML description of its APIs, include that in a
package, then have rust-glib-sys build-depend on that, include a
Built-Using header, and get binNMUed after every GLib update.
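In packaging terms, that alternative could be sketched roughly like this; the XML-shipping package name and the substitution variable are hypothetical, shown only to illustrate the shape of the dependency:

```
Source: rust-glib-sys
# Hypothetical package shipping GLib's machine-generated XML/GIR API description:
Build-Depends: libglib2.0-gir-xml-dev

Package: librust-glib-sys-dev
# Record the exact GLib the bindings were generated from, so that the
# package can be binNMUed after every GLib upload:
Built-Using: ${glib:Built-Using}
```

The Built-Using field itself is a real Debian control field; everything else here is a sketch of the proposal above, not an existing packaging arrangement.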
On Sat, Aug 27, 2022 at 09:53:40AM -0400, M. Zhou wrote:
In my fuzzy memory, the last discussion on NEW queue improvement
involves the disadvantages by allowing SOVERSION bump to directly
pass the NEW queue. I'm not going to trace back, because I know
this will not be implemented unless someone proposes a GR.
I'm considering this once being back from vacation. However, the problem in
such a GR is that even if the outcome of the vote says SOVERSION bumps
should pass NEW, we need some code that implements this.
So we also need someone to volunteer for this.
Hello,
On Sun 28 Aug 2022 at 07:45AM +02, Andreas Tille wrote:
I'm considering this once being back from vacation. However, the problem in
such a GR is that even if the outcome of the vote says
SOVERSION bumps should pass NEW, we need some code that implements this.
So we also need someone to volunteer for this.
I think we still want the binary package namespace checking?
I.e., a GR just saying "ftpteam should not do a full licensing/copyright check for packages in binNEW".
Then no software changes are required.
Sean Whitton <spwhitton@spwhitton.name> wrote:
I think we still want the binary package namespace checking?
I.e., a GR just saying "ftpteam should not do a full
licensing/copyright check for packages in binNEW".
Then no software changes are required.
I think that a GR to prohibit developers from looking for bugs is at
least in principle inconsistent with not hiding problems.
Scott Kitterman <debian@kitterman.com> writes:
Sean Whitton <spwhitton@spwhitton.name> wrote:
I think we still want the binary package namespace checking?
I.e., a GR just saying "ftpteam should not do a full
licensing/copyright check for packages in binNEW".
Then no software changes are required.
I think that a GR to prohibit developers from looking for bugs is at
least in principle inconsistent with not hiding problems.
Saying that a project delegate, acting as a delegate, should not block
binNEW uploads for a specific sort of check that's currently mandatory is
not at *all* the same thing as prohibiting developers from looking for
bugs. It doesn't do that at all. Anyone who does ftpmaster work would
still be able to (and encouraged to!) look for and file bugs just like any other developer. If those bugs are RC, they would be treated like any
other RC bug.
But the project is entitled to override the decisions of a project
delegate by GR if it so chooses (constitution 4.1.3), and one of the
reasons why the project may decide to do so is if we collectively believe
the project delegates have misjudged the trade-offs of making a particular process mandatory on the grounds that it catches some number of RC bugs.
The project may, for example, decide that yes, this process catches some
RC bugs, but the number of bugs caught are not worth the other impacts of that process, and the RC bugs can be dealt with via other means.
If I look at a package and determine it's only in New due to a new
binary package name and that means the project has prohibited me from
looking for other issues in the package until some time later when it's
not in New, then I feel pretty precisely like I'm prohibited from doing something.
On Sun, Aug 28 2022 at 07:39:12 AM +02:00, Andreas Tille <andreas@an3as.eu> wrote:
BTW, the vast majority of new packages I'm packaging are new dependencies
for existing packages to get their new versions.
Maybe we should be able to tag packages in NEW with categories
like this (similar to how a DD can give back a failed build via the web
interface). This may need to go through a GR. If we prioritize
packages in NEW that are required for updating other dependencies, that
can help things move faster. When I upload packages to NEW, I know some
packages are a priority and some are not, but there is no way to
express that currently.
Or we may also be able to reuse the urgency field in the changelog, if
the idea that "uploaders should be able to prioritize some packages in
NEW and ftp masters should check the priority queue before the normal
queue" is acceptable to the project.
However, GObject-Introspection has worked approximately the way it
currently works since at least 2008, and was apparently fine.
Hello,
On Sat 27 Aug 2022 at 04:22PM +02, Vincent Bernat wrote:
On 2022-08-27 15:53, M. Zhou wrote:
That's why I still hope ftp team to recruit more people. This is
a very direct and constructive way to speed up everything.
More volunteers = higher bandwidth.
Recruiting more people doesn't seem to have a serious disadvantage.
It does not seem to work. Either people don't want to do that, or the FTP
team is too picky about the candidates.
Some combination of both, but I don't think I'm suffering from bias if I
say that it's at least 80% the former. Very few people who say they'd
like to be trained confirm they'd still like to once they've had a look
at the docs for trainees, and after that, hardly any do enough trainee reviews for the other team members to feel confident they can let them
at it on their own.
I am the only trainee who made it through in recent years and that's
because I was highly systematic about doing lots of reviews each month.
There are some technical improvements that would be possible. For
example, feedback to trainees is entirely done via IRC; I would much
prefer us to be doing that by e-mail. But other team members disagree
with me, I think, and I do recognise I like e-mail more than most people
do. There are ways the tools could be better.
In general, however, existing team members, including myself, are pretty sceptical that technical improvements would be worth the time it would
take to implement them effectively. dak as a whole is less well
maintained than other core Debian software, but the NEW queue parts are pretty good!
So, the bulk of the problem boils down to project members not being
interested in doing the work. I certainly understand this. It feels
just like grading student essays. Everyone finds that highly draining
at first, until you develop a sort of detachment from the activity:
your mind goes through the motions much as your hands can go through
the motions of preparing a familiar dish. You have to learn that you
won't make worse judgements if you become detached in this way, just as
you won't prepare a worse version of the dish. I applied what I'd
learned from grading to the NEW queue, and now it's pretty fun and even
relaxing when you're not in a frame of mind to do harder thinking. But
like I said, most people don't want to do any of this, and of course
being a trainee is *not* like that.
And then recruitment is less efficient -- not enough feedback on trainee reviews -- because there aren't enough team members. The usual
compounding effect.
Something of an aside from the current debate, but GObject
introspection introduced significant problems with cross-compiling,
because the introspection process produces different results when done
on different architectures, so you couldn't simply do it when cross-compiling.
Maybe someone has fixed this in the nearly a decade since I looked
into it and the point is moot? But I doubt it.
Something of an aside from the current debate, but GObject
introspection introduced significant problems with cross-compiling,
because the introspection process produces different results when done
on different architectures, so you couldn't simply do it when cross-compiling.
It's in need of some significant core rework, IMHO.
It occurred to me that if the compiler/linker could be made to generate
the GI data and then something else split it out of the binary (similar
to how debug symbols work) then this issue could potentially be solved,
does that sound feasible at all?
Sean Whitton writes ("Re: Comments on proposing NEW queue improvement (Re: Current NEW review process saps developer motivation)"):
On Sat 27 Aug 2022 at 04:22PM +02, Vincent Bernat wrote:
It does not seem to work. Either people don't want to do that, either the FTP
team is too picky on the candidates.
Some combination of both, but I don't think I'm suffering from bias if I
say that it's at least 80% the former. Very few people who say they'd
like to be trained confirm they'd still like to once they've had a look
at the docs for trainees, and after that, hardly any do enough trainee
reviews for the other team members to feel confident they can let them
at it on their own.
I am in this picture. Some years ago now I volunteered. I was
introduced to the internal ftpmaster documentation and processes. At
the time, these documents were not even published - including,
astonishingly, some elements which read like a manifesto. (I don't
know if these documents are published nowadays.)