Michael Catanzaro wrote:
> 1) More time to catch regressions
In theory. In practice, it mostly means more wasted time until a regression
is FIXED, i.e., it is entirely counterproductive. Many regressions are only
noticed once the update goes stable, because that's when most users start
trying it.
> (1) doesn't require the service pack approach and could be implemented
> by just adding a longer waiting period to bodhi. I'm of the mind that
> all updates should spend two weeks in testing before they go stable,
> regardless of karma, since otherwise how will people have time to
> notice regressions?
I don't see how even longer delays in testing would help. The majority that
does not have updates-testing enabled won't get the update before it goes
stable no matter how long it sits in testing. The users who do enable
updates-testing are those who want updates fast, and they will get it within
the first couple of days.
> Currently updates tend to stay in testing for a week or less, which isn't
> even enough time for GNOME Software to prompt the user about new updates.
That is a defect/misfeature in GNOME Software and needs to be fixed there.
But I'd expect most updates-testing users to be using command-line tools (or
at least more advanced GUIs such as Yumex-DNF) anyway (and this very issue
is one of the reasons).
> 2) A reasonable way to QA a snapshot of what gets released
> (2) allows us to solve the problem of having no effective QA for our
> updates. Maybe the current team is not big enough to QA service packs
> as well as the next release, but at least with service packs there is
> something that can reasonably be QAed.
I don't see how a huge mix of unrelated updates is in any way easier to QA
than the individual small, targeted updates. In fact, I think it is the
exact opposite: there is no way to adequately QA the whole service pack, it
just does not scale.
> It also allows for regular install media respins, which I think you want?
> That will make Linus happy, too.
The respins can be done from the current updates; in fact, that is already
being done, just not "officially". The respins, by their nature,
automatically accumulate the updates, and so do not need them to be bundled
up in advance.
And I believe the respins to be much easier to QA if the updates have all
already been QAed individually. As I explained above, I don't think a
"service pack" can be adequately QAed as a unit.
> 3) Less-frequent updates
That is a BAD thing. I want to get my bugfixes as quickly as possible. (In
fact, I'm unhappy that our infrastructure does not support instant pushes.)
I have Apper set to check for updates hourly because the default daily is
too slow for me! Sure, we don't have hourly pushes (sadly), but daily means
up to an ADDITIONAL day of delay.
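To spell out the arithmetic behind that ADDITIONAL day, here is a minimal
sketch; the two check intervals are just the frequencies mentioned above,
and the delay formulas are the generic worst/average case for any periodic
poll, not anything specific to Apper or DNF:

  # How much extra latency a periodic update check adds on top of the
  # repository push itself. Worst case, an update lands right after a
  # check, so you wait one full interval; on average, half an interval.
  check_intervals_hours = {"hourly check": 1, "daily check (default)": 24}

  for name, interval in check_intervals_hours.items():
      worst_case = interval      # update published right after a check
      average = interval / 2     # update published at a random time
      print(f"{name}: up to {worst_case} h extra, ~{average:.1f} h on average")

So with daily checks, an update that was already delayed by the push
schedule can sit unnoticed for up to another 24 hours on the user's side.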
> (3) is quite important, since in Workstation we reboot to install
> updates, which is quite annoying.
And that is another defect/misfeature in GNOME Software that needs to be
fixed there.
> (No point in arguing about that here, it's a done deal....)
So instead of fixing broken software, we should break the whole distribution
to work around it? No thanks!
> Here you would only have one reboot per service pack, plus security
> updates.
The "plus security updates" caveat makes even that mostly moot.
> 4) Small total size of updates
> (4) would be true if a package would otherwise have been updated
> multiple times between service packs.
Which is the exception rather than the rule. I think the win would be
negligible.
The negative effect of the whole thing hitting at once would be much more
noticeable. So the updates would actually feel slower, and probably even BE
slower, because the mirrors would be swamped, just as they are on a release
day.
> I will pick on LibreOffice. Individually, the updates rarely ever break
> anything, and fix bugs, so they're good updates taken alone. But
> collectively, it's problematic because there's a new update every week,
> and LibreOffice is big: it takes a lot of time to download and install.
> Does the inconvenience of making the update take longer offset the
> value provided by the update? It's a function of how often you use
> LibreOffice, how much the relevant bugs affect you, your download
> bandwidth, and your hard drive speed. I rarely use LO, so weekly
> updates are definitely not worth it for me: I'd rather have the updates
> once a month, at most, but it's just an inconvenience. For the
> maintainer, weekly updates are surely worth it, or they wouldn't be
> happening. For most users, probably not, but I guess it depends on the
> bug. For bandwidth-limited users, it's not an inconvenience, but a
> massive disaster. We're currently lacking criteria to say how frequent
> is too frequent. I think 3-4 weeks between updating one package is
> probably as fast as is ever appropriate, except in special
> circumstances (maintainer discretion there), but weekly updates as a
> rule is problematic, especially for such a large package.
The frequency of updates of LibreOffice is something to discuss with the
LibreOffice maintainer, changing that does not require forcing a "service
pack" approach on ALL updates. That said, I find the current update rate of
LibreOffice just fine.
As for bandwidth-limited users, what would REALLY be a massive disaster is
your "service pack" approach! If even LibreOffice alone is a problem,
imagine all the updates of a whole month at once!
(Now I'd argue that if your bandwidth is limited and/or expensive, then
Fedora is probably not the best distribution for you. Fedora is about
quickly following upstream ("First"). But the "service pack" approach would
just make it worse for those users.)
> I'd like to get to the point where a reboot to install weekly updates
> takes about a minute. Currently, it's about five minutes per weekly
> update. I recently made the mistake of installing all of texlive, so
> that I wouldn't have to bang my head against a wall when trying to
> figure out what Fedora package to install to get a particular texlive
> package; that caused my weekly updates to take 2-3 hours apiece, so my
> computer was unusable for the rest of the evening when I made the
> mistake of applying updates. (texlive is a good example of another set
> of packages that must be updated very rarely, even if the content of
> the updates itself is fine.)
That, too, would only be made WORSE by your "service pack" approach.
But of course, the real issue there is that you used offline updates. I
install my updates with Apper, which does not force me to reboot. I will
typically just shut down when going to sleep, and when I boot the computer
after waking up, it boots into all the shiny new updates, with no time lost
at all. And for a typical application update, newly started instances of the
application already run the new version even before any reboot. I only
restart the session or reboot immediately if it is really necessary, which
99% of the time it is not.
Still, even if you insist on using GNOME Software / offline updates, your
"service pack" approach would only exacerbate the problem, because those
monthly updates would be even LARGER (roughly 4 times the size of a weekly
update, maybe 3 to 3.5 times if many of the updates overlap).
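To make that back-of-envelope estimate concrete, here is a minimal sketch;
the weekly size and the overlap fraction are made-up, purely illustrative
numbers, not measurements of actual Fedora update traffic:

  # Rough model: a cumulative monthly pack ships only the latest build of
  # each package, so updates repeated within the month are deduplicated,
  # but everything else still has to be downloaded, now all at once.
  weekly_update_mb = 300.0   # hypothetical average size of one weekly push
  weeks_per_pack = 4         # one "service pack" collects a month of updates
  overlap_fraction = 0.2     # hypothetical share of weekly traffic that
                             # re-updates a package already updated that month

  four_weekly_pushes = weekly_update_mb * weeks_per_pack
  saved_by_dedup = weekly_update_mb * (weeks_per_pack - 1) * overlap_fraction
  service_pack_mb = four_weekly_pushes - saved_by_dedup

  print(f"four weekly pushes: {four_weekly_pushes:.0f} MB spread over a month")
  print(f"one monthly pack:   {service_pack_mb:.0f} MB in a single download")
  print(f"ratio to one weekly push: {service_pack_mb / weekly_update_mb:.1f}x")

With those assumptions the pack comes out around 3.4 times a single weekly
push: the bytes saved by deduplication are modest unless many packages are
updated repeatedly within the month, while the whole download now lands at
once, which is exactly the mirror-load and bandwidth problem described
above.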
> For most applications (not core system applications), major version
> updates are fine, and the policy could be changed. For lower-level
> system components, updates need to be balanced against regression
> potential.
Of course, the point of contention is then what counts as a "core system
application". If you mean things like LibreOffice, well, that is exactly the
kind of application where updates would make MOST users happy (and in fact,
I consider the current update policy for LibreOffice to be way too
conservative: it is essentially never updated to a newer upstream release
series, unfortunately).
> Some packages (e.g. web engines) really need to be updated to the latest
> major version no matter what.
Yet a web engine is also a core system component, so your concept is already
falling apart.
> Well, I think maintainer judgment comes into play here. If the version
> number bump does not reflect the regression potential, that should be
> fine. But taking the time to explain that when requesting the update is
> not the end of the world.
> In my view, the neutral party would be charged with stopping the really
> suspicious updates only, or allowing them but slowing the pace.
That neutral party would just be another roadblock for updates (when we
already have too many: the policies on what types of updates are acceptable,
the policies on testing time / karma that Bodhi enforces, etc.).
And I don't see at all how "slowing the pace" would help in any way. It
would just annoy people and make neither side of the debate happy. Those who
want the updates will be unhappy because they go out later for no good
reason, and those who don't want them will be unhappy too because they
eventually go out anyway.
> Frankly, I think it's most important to make Bodhi work. I haven't been
> able to search for updates since the new Bodhi interface went live,
> except by manually editing the URL, which is crazy. Why is there a
> search box if it doesn't work? (Turns out it works, but only in
> Firefox. ;) Also, I couldn't figure out how to add builds to an update.
> (Admittedly, that was next to impossible in the old web interface too.)
> If not for fedpkg, I don't think I would even be able to release
> updates.
I agree about Bodhi 2 being a PITA. I especially dislike that JavaScript is
now required for significant parts of the UI (you cannot even submit updates
without it; Bodhi 1 did just fine with a plain HTML form and JavaScript only
for optional autocompletion), and that that JavaScript is not portable to
all browsers. There are significant issues in KHTML (e.g., it lets you fill
in the whole update form, but then the submit button does nothing, grrr!);
I have found QtWebKit (Konqueror+KWebKitPart) to work fine with it, though.
What browser do you have issues with? Epiphany? Midori? It's strange,
because webkitgtk should really work if QtWebKit works.
Kevin Kofler