Re: FYI: vte 0.56.2 + No more ftp tarballs
- From: Egmont Koblinger <egmont gmail com>
- To: Tristan Van Berkom <tristan vanberkom codethink co uk>
- Cc: Debarshi Ray <dray redhat com>, Kalev Lember <klember redhat com>, GNOME Release Team <release-team gnome org>, Christian Persch <chpe gnome org>
- Subject: Re: FYI: vte 0.56.2 + No more ftp tarballs
- Date: Fri, 26 Apr 2019 11:29:52 +0200
+cc chpe to close the loop :)
On Fri, Apr 26, 2019 at 6:54 AM Tristan Van Berkom
<tristan vanberkom codethink co uk> wrote:
> Hi,
>
> Thanks for bringing this up.
>
> On Thu, 2019-04-25 at 17:08 +0200, Debarshi Ray via release-team wrote:
> > Hey Egmont,
> >
> > Thanks for the heads up!
> >
> > On Thu, Apr 25, 2019 at 11:59 AM Egmont Koblinger <egmont gmail com> wrote:
> > > - VTE 0.56.2 was released ahead of schedule in order to fix a crash
> > > (RH #1701590). Could you please update the Rawhide (F30) package? (F29
> > > with VTE 0.54.x is unaffected.)
> > >
> > > - VTE git master (0.57 series) switched to the meson build system and
> > > dropped autotools. Since there are no longer any autogenerated files
> > > to distribute, we will no longer manually create tarballs and upload
> > > them to the static ftp/http download area. Instead, you should check
> > > out particular tags from git, or download the autogenerated tarballs
> > > from gitlab at https://gitlab.gnome.org/GNOME/vte/tags.
> > >
> > > (gnome-terminal is likely to follow soon, I won't send a separate FYI,
> > > you'll notice it when it happens.)

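(To spell that out with an example: here's a minimal sketch of fetching
the autogenerated tarball for a given tag, assuming GitLab's usual
/-/archive/ URL layout; the tag below is just an example.)

```python
# Minimal sketch: download the autogenerated tarball for one vte tag,
# assuming GitLab's usual /-/archive/<tag>/ URL layout.
import urllib.request

TAG = "0.56.2"  # example tag; pick whichever release you need
url = f"https://gitlab.gnome.org/GNOME/vte/-/archive/{TAG}/vte-{TAG}.tar.gz"

with urllib.request.urlopen(url) as resp, open(f"vte-{TAG}.tar.gz", "wb") as out:
    out.write(resp.read())
print(f"saved vte-{TAG}.tar.gz")
```
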
> > I guess I could live with downloading the autogenerated GitLab
> > tarballs for the purposes of building Fedora packages. I wonder if
> > this has a bearing on how the GNOME release team builds things to
> > prepare the upstream GNOME releases. I have CC'ed them.

> First of all, I think this is certainly a freedom that we want
> maintainers to have; avoiding steps and process just makes things
> better.

Thanks for your understanding, and apologies if this wasn't
communicated or discussed beforehand properly.

To give a bit more context: VTE's main maintainer Christian hasn't
created tarballs for quite a few years now because of some worries he
has with the encryption code contained within. For a long time,
creating a VTE tarball was left to whichever GNOME developer noticed
that it was missing and did it on his own, often lagging behind by
weeks if not months; at times a gnome-terminal was available on FTP
that required a VTE not yet available there. At one point I started
nagging Rishi to create the tarballs whenever one was due, which was
still far from ideal. Later on I was granted FTP access and started
creating them myself. That arrangement is still far from ideal, too:
I spent quite a lot of time evaluating whether my tarballs are just as
good as Rishi's, given that he works on Fedora while I use Ubuntu, and
the two distributions patch autotools in different ways. Later on, at
another point, I again wasn't quite confident how upgrading my computer
to a newer release, which ships automake 1.16 instead of 1.15, would
affect the tarballs, and whether we could afford that change between a
.0 and a .1 release (and since we backport to multiple stable branches
in parallel, avoiding any chance of breakage within a major branch
would have meant setting up virtual machines). Amid all these changes
around who generates the tarballs and how, we still don't know whether
any of them was relevant to the API docs missing from the website:
Infrastructure/library-web#6.

With a great contribution from Inigo to switch to meson, and
Christian's work on cleaning up and finalizing the patch (and my
pushing in the bug to make it happen), we finally no longer have
autogenerated files: the tarball is nothing more than the files inside
git bundled together. This not only makes it reproducible (not
necessarily in the sense that the tarball is bitwise the same, as you
said below, but in the sense that the files contained within are
reproducible), but also eliminates the possibility of a bug sneaking
in, whether because of a hardware error on our personal computers,
because someone compromises my computer, or even, which we cannot
exclude, because I or whoever else creates these tarballs intentionally
adds something nasty. Obviously I'd never do that, but how can you be
sure of that, and how can I be sure nobody else would? The change also
eliminates a SPOF (single point of failure).

In addition to all of the above, generating the tarballs manually is
such an additional (and from now on truly unnecessary) burden that I'd
firmly prefer not to carry it anymore (I'll keep doing it for the older
branches as long as we backport to them and make new releases, of
course), and apparently neither does Christian. So if we stick to the
old model of manually generated tarballs, someone has to step up and
volunteer to do them, keeping in mind that if they occasionally fail to
create one, it'll cause more trouble than good for distributions, and
even put their upgrading in time for a release cut at risk.

> For the technical part of creating a GNOME release, I don't think we're
> quite ready for that at the moment, but it should not be too much work
> to update the script which converts `gnome-build-meta` to be aware of
> modules which don't release as tarballs, here:
> https://gitlab.gnome.org/GNOME/releng/blob/master/tools/smoketesting/convert-to-tarballs.py

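(Purely to illustrate the idea, and emphatically not the actual
convert-to-tarballs.py code: a hypothetical sketch of how such a script
could resolve a module's download location; the "git-only" flag, the
URL layouts and the module names are all assumptions.)

```python
# Hypothetical sketch only -- not the real convert-to-tarballs.py logic.
# A module flagged as "released via git tags only" resolves to GitLab's
# autogenerated tag archive instead of a tarball on the download server.
# The flag, names and URL layouts below are assumptions for illustration.

GIT_ONLY_MODULES = {"vte"}  # assumed per-module flag, e.g. read from config

def tarball_location(module: str, version: str) -> str:
    if module in GIT_ONLY_MODULES:
        # GitLab's usual autogenerated archive for a tag
        return (f"https://gitlab.gnome.org/GNOME/{module}/-/archive/"
                f"{version}/{module}-{version}.tar.gz")
    # classic static download area layout
    major_minor = ".".join(version.split(".")[:2])
    return (f"https://download.gnome.org/sources/{module}/"
            f"{major_minor}/{module}-{version}.tar.xz")

print(tarball_location("vte", "0.57.0"))
print(tarball_location("gnome-terminal", "3.32.1"))
```
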
> Also, I'm not familiar with how release notes for major GNOME releases
> are currently created, but I *think* there is a NEWS file aggregation
> script which is based on the uploaded tarballs; that should probably
> receive some attention too.

I'm not sure what NEWS file you're referring to, but of course I may
easily be missing something.

What I'm aware of is an autogenerated changelog on the FTP site which
is occasionally missing (e.g. for the first release in each of the
gnome-terminal 3.25, 3.27 & 3.29 series).

> However, there are other concerns than just this.
>
> * I think our gitlab instance is not really appropriate infrastructure
>   for hosting releases, compared to the ftp server, which has mirrors
>   to help balance the load.
>
>   This might result in intense CI pipelines for downstream distro
>   builds running around the world, continuously trying to download our
>   modules directly from our poor little gitlab instance.

Is "distros running around the world" really such an issue? I assume
they all cache their downloaded version, don't they?

> * If I understand correctly (I might be wrong), I think gitlab creates
>   tarballs on the fly instead of persisting them permanently.
>
>   I raise this because I don't know with certainty that gitlab tarball
>   creation is reproducible (in the bit-for-bit sense). This could mean
>   that an upgrade of our gitlab or its dependencies could potentially
>   result in the same tarballs being offered up but no longer matching
>   their previous checksums.

This is something that IMHO the downloader needs to deal with, in this
case by checking the git-evtag, or by verifying the files individually
via other means (assuming, of course, that they don't trust the
integrity that comes with an HTTPS/SSL download, but do trust that an
additional checksum downloaded from the very same source matches; a
concept I've never grasped).

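(As a rough sketch of what "checking the files individually" could look
like, here is one way to compare a downloaded tarball against the
corresponding git tag, assuming a local clone to compare with;
git-evtag is of course the more thorough option. The paths and the tag
below are just examples.)

```python
# Rough sketch: compare the per-file SHA-256 of a downloaded tarball with
# the output of `git archive <tag>` from a local clone of the repository.
# The clone path ("vte.git") and the tag are example assumptions.
import hashlib
import io
import subprocess
import tarfile

def file_hashes(fileobj):
    """Map path -> sha256 for every regular file in a tar archive."""
    hashes = {}
    with tarfile.open(fileobj=fileobj, mode="r:*") as tar:
        for member in tar:
            if member.isfile():
                # drop the leading "vte-<version>/" directory component
                path = member.name.split("/", 1)[1]
                data = tar.extractfile(member).read()
                hashes[path] = hashlib.sha256(data).hexdigest()
    return hashes

# the tarball downloaded from gitlab (or wherever it came from)
with open("vte-0.57.0.tar.gz", "rb") as f:
    downloaded = file_hashes(f)

# the same tag exported straight from a local clone
archive = subprocess.run(
    ["git", "-C", "vte.git", "archive", "--prefix=vte-0.57.0/", "0.57.0"],
    check=True, stdout=subprocess.PIPE).stdout
from_git = file_hashes(io.BytesIO(archive))

print("contents match" if downloaded == from_git else "MISMATCH")
```
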
> I wonder if we might prefer an approach where we automate the creation
> of tarballs for modules whose maintainers have decided to rid
> themselves of the process of creating tarballs?

This sounds like a good idea. In particular, to address all the
worries you listed above, I think the easiest and safest short-term
solution would be to create a script which automatically fetches the
tarballs from git for newly appearing tags (for specified versions
only, e.g. vte >= 0.57.0) and pushes them to the FTP area via a
scripted "ftpadmin install" or the like. Does that sound feasible?

thanks a lot,
egmont