Re: Idea: daily packs



Hello,

Just throwing out an idea that crossed my mind...

Colin Walters <walters verbum org> writes:

> Hey Owen, thanks for looking at this!
>
> On Fri, 2012-08-17 at 20:52 -0400, Owen Taylor wrote:
>
>> The one part of this idea that I haven't figured out is how you would keep
>> the next day's 'ostree --pull' from downloading the next huge tarball instead
>> of a few thousand smaller files.
>
> [...]
>
> Now, there are a few reasons that the .tar.gz is going to be a lot more
> efficient for initial download:
>
> * Compression - right now archive mode objects aren't.  We could
>   gzip individual objects on the build server, but then we'd have to
>   uncompress them when constructing buildroots.  Right now on decent
>   hardware, the build system can make buildroots in 2-5 *seconds*.
>   Keeping this fast is really important to the developer experience.

Idea: enable support for compressed responses (HTTP content encoding)
in the web server. The files would stay uncompressed on both the
server and the client, yet travel over the wire compressed. If
on-the-fly compression costs too much CPU on the server, a gzip'd copy
of each archive object can be kept *on the server* and served as-is by
the web server (for example with something like the Nginx gzip_static
module [1]; I don't know whether Apache supports such a thing).
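
To make that concrete, here is a rough sketch of what the Nginx side
could look like (the location and MIME type below are placeholders,
not a tested configuration):

  # Hypothetical snippet for an archive repo exported under /repo/
  location /repo/objects/ {
      gzip_static on;   # serve a pre-built "<object>.gz" if one exists
      gzip        on;   # otherwise compress on the fly
      gzip_types  application/octet-stream;  # objects aren't text/html
  }

The pre-built .gz siblings would be generated once on the build server
next to each archive object, so the web server never burns CPU
compressing the same object for every request.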

This way, clients never have to deal with compressed packs/objects
themselves: libsoup decompresses them on the fly as they are being
downloaded.
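
On the client side nothing special should be needed. Here is a minimal
sketch (not the actual ostree pull code, and the repository URL is
made up) of fetching a single object with libsoup 2.4, relying on its
SoupContentDecoder feature to advertise "Accept-Encoding: gzip" and to
gunzip the response body transparently:

  /* Build with: gcc fetch.c $(pkg-config --cflags --libs libsoup-2.4) */
  #include <libsoup/soup.h>
  #include <stdio.h>

  int
  main (void)
  {
    SoupSession *session;
    SoupMessage *msg;

    /* The content-decoder feature makes libsoup send "Accept-Encoding:
     * gzip" and decode a "Content-Encoding: gzip" response before the
     * body reaches us, so the object arrives uncompressed. */
    session = soup_session_async_new_with_options (
        SOUP_SESSION_ADD_FEATURE_BY_TYPE, SOUP_TYPE_CONTENT_DECODER,
        NULL);

    /* Hypothetical object URL, for illustration only. */
    msg = soup_message_new ("GET",
        "http://example.com/repo/objects/ab/cdef.file");
    soup_session_send_message (session, msg);  /* blocks until complete */

    if (SOUP_STATUS_IS_SUCCESSFUL (msg->status_code))
      printf ("fetched %" G_GOFFSET_FORMAT " uncompressed bytes\n",
              msg->response_body->length);

    g_object_unref (msg);
    g_object_unref (session);
    return 0;
  }

The gzip_static case looks exactly the same from the client's point of
view: it is just a response that happens to carry
"Content-Encoding: gzip".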
  
> [...]
>
> I think we should investigate the "50% or more packfile" heuristic at
> least.

+1


Cheers,

---
[1] http://wiki.nginx.org/HttpGzipStaticModule
-- 
Adrian Perez <aperez igalia com> - Sent from my toaster
Igalia - Free Software Engineering
