Re: Glib resource framework

* Alexander Larsson <alexl redhat com> wrote:

> > Maybe we could actually support "bundles" like it is done on MacOS.
> > The idea of compiling data into a binary file give me a blast from the
> > past from the MacOS 9 days.
> See:

Congratulations. You've copied one of the worst concepts
of the windoze world.

"Desktop distributions like Fedora or Ubuntu work remarkably
 well, and have a lot of applications packaged. However,
 they are not as reliable as you would like."

#1 Fedora has always been bad (IIRC only SuSE is even worse).

#2 Ubuntu is probably one of the most stable distros around;
   we're successfully using it for mission-critical systems with
   millions of users. Do you really think that a single application
   development team can ensure application *and* system stability
   without close cooperation with the distro maintainers?

"Most Linux users have experienced some package update that broke
 their system, or made their app stop working."

Yes, especially when taking untested upstream packages directly.
(Speaking of which: certain upstreams are even still stupid enough
to use AC_TRY_RUN, etc. ...)

"Linux users quickly learn to disable upgrades before leaving for
 some important presentation or meeting."

Sane distros don't enable this by default.

"Every package installs into a single large "system" where
 everything interacts in unpredictable ways. For example,
 upgrading a library to fix one app might affect other applications"

This is exactly what professional distros and their quality
engineering mechanisms are for. Installing software from
untrusted sources always carries high risks.

"Package installation modify the system at runtime, including
 running scripts on the users machine. This can give different
 results due to different package set, install order, hardware, etc."

Most of those problems come from bad upstreams or distro maintainers.

"Installing applications not packaged for your distribution"

Has anyone considered that there might be valid reasons for this?
Maybe the distro maintainers consider that particular package
not stable enough?

"Installing a newer version of an application that requires
 newer dependencies than what is in your current repositories"

Blame the bad distro maintainers.

"Keeping multiple versions of the same app installed"

Why, exactly?

"Keeping older versions of applications running as you update
 your overall system"

Such problems come from bad upstreams or package maintainers.

"I imagine a system where the OS is a well defined set of
 non-optional core libraries, services and apps. ........ "

Plan9?

"The platform is a small set of highly ABI stable and reliable
 core packages. It would have things like libc, coreutils,
 libz, libX11, libGL, dbus, libpng, Gtk+, Qt, and bash." 

Yeah, dbus, libpng, gtk+ and qt considered "highly ABI stable".
I had to double-check the calendar to see whether I'd missed
a few months and it's April 1st again.

"All applications are shipped as bundles, single files that
 contain everything (libraries, files, tools, etc) the
 application depends on."

Essentially dropping the concept of shared libraries.

"Bundles are self-contained, so they don't interact with
 other bundles that are installed."

Why not just use containers, or at least chroots, for that?

"Installing them is as easy as dropping them in a known directory."

Bypassing all fundamental security considerations.

"Then the bundle file itself is mounted as a fuse filesystem
 in a well known prefix, say /opt/bundle."

Congratulations on the performance drop.

Why not just use chroots for that and leave the rest
as it is?
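To make the chroot alternative concrete, here is a minimal sketch.
All names (myapp.bundle, bin/myapp) are illustrative and not from
the thread; the point is only that a self-contained directory tree
can be entered with plain chroot instead of a fuse mount:

```shell
# Hypothetical bundle layout -- names are made up for illustration.
BUNDLE="$(mktemp -d)/myapp.bundle"
mkdir -p "$BUNDLE/bin" "$BUNDLE/lib"

# The bundle carries its own private copies of everything it needs;
# here a trivial stand-in for the application binary:
printf '#!/bin/sh\necho hello from the bundle\n' > "$BUNDLE/bin/myapp"
chmod +x "$BUNDLE/bin/myapp"

# With root privileges the same tree could simply be entered via
# chroot (assuming the bundle also ships its own shell and loader),
# with no fuse filesystem in the I/O path:
#   chroot "$BUNDLE" /bin/myapp
# Without root, the bundled program still runs directly:
"$BUNDLE/bin/myapp"
```

No mount step, no fuse daemon; isolation of the filesystem view is
exactly what chroot already provides.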

 Enrico Weigelt, metux IT service --

 phone:  +49 36207 519931  email: weigelt metux de
 mobile: +49 151 27565287  icq:   210169427         skype: nekrad666
 Embedded Linux / Porting / Open-Source Quality Management / Distributed Systems
