Re: Using python + pygtk in Desktop modules (was Re: Revisiting the Gnome Bindings)



On Tue, 28 Sep 2004 18:29:38 +0200, Murray Cumming wrote:
> Are you suggesting that a RedHat 10.2 pyMyApp package will not work when I
> upgrade to RedHat 10.3? IF yes, then why exactly, if pyMyApp uses the #!
> technique?

It would work if you kept Python 2.2 around.
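For context, the "#!" technique Murray mentions is pinning a script to one specific interpreter in its first line. A minimal sketch (the path /usr/bin/python2.2 is illustrative, and the whole point is that it only works as long as that interpreter stays installed):

```python
#!/usr/bin/python2.2
# The kernel reads the line above and hands this script to that exact
# interpreter. The script therefore keeps behaving identically across
# distro upgrades *only* if python2.2 is kept around in parallel with
# whatever newer runtime the distro ships.
import sys

major, minor = sys.version_info[0], sys.version_info[1]
print("running under Python %d.%d" % (major, minor))
```

Drop python2.2 in the upgrade and every script pinned this way stops launching, which is exactly the failure mode being discussed.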

Firstly, I know this discussion started out as including Python in the
*desktop* release where the same compatibility rules don't apply, but it
seems to have wandered into the area of using it as a developer platform
too.

Let's review why ABI (as opposed to API) stability is so important for
GNOME libraries.

The ELF soname versioning system and parallel installability mean that
theoretically you can break ABI whenever you like (though not API). In
practice doing so is extraordinarily inefficient, because the whole point
of shared libraries is to speed things up by letting the system share the
library's code pages between applications. Compatibility also saves disk
space: rather than having 10 foo-compat packages installed, you only need
the latest version of foo, and all the apps written against old versions
still work.
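To make the soname mechanism concrete, here is a sketch of the usual on-disk layout, simulated with symlinks (the names and version numbers are illustrative; a real library would be built with something like `gcc -shared -Wl,-soname,libfoo.so.1`):

```python
# Simulate the standard ELF shared-library symlink chain:
#   libfoo.so       -> libfoo.so.1       (dev link, used at link time)
#   libfoo.so.1     -> libfoo.so.1.4.2   (soname, used at run time)
#   libfoo.so.1.4.2                      (the real file)
# Apps record only "libfoo.so.1", so any compatible 1.x drop-in keeps
# them working; an ABI break has to ship as libfoo.so.2 in parallel.
import os
import tempfile

libdir = tempfile.mkdtemp()
real = os.path.join(libdir, "libfoo.so.1.4.2")
open(real, "w").close()  # stand-in for the actual shared object
os.symlink("libfoo.so.1.4.2", os.path.join(libdir, "libfoo.so.1"))
os.symlink("libfoo.so.1", os.path.join(libdir, "libfoo.so"))

runtime_name = os.readlink(os.path.join(libdir, "libfoo.so.1"))
print("dynamic linker resolves libfoo.so.1 ->", runtime_name)
```

The dev link moves forward with each release; the soname link only moves when the ABI is still compatible, which is why keeping compatibility in one library beats installing N incompatible copies.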

It's also a pain for security reasons: if you find a bug in libfoo, the
fix has to be backported to all the different incompatible versions and
builds, the security update balloons to 10x the size, dialup users will
probably skip it, and we have Sasser all over again.

Therefore, the longer you can keep backwards compatibility within the same
library, the better: not only is it more convenient for users (realistically,
most distros today do *not* install all the *-compat packages you might
want), but it also improves system efficiency and security.

Python breaks that, significantly. The Python team clearly have no serious
commitment to backwards compatibility, and installing N versions of Python
in parallel hacks around the problem rather than solving it. If Python
became a part of the developer platform, within a few years we could find
that GNOME, in order to avoid breaking backcompat, requires 3 different
versions of the CPython runtime to be installed. Add on a few more years
and now maybe it's 4 or 5 versions. This is clearly crazy and absolutely
not what we want: just imagine if you ran 5 apps that happened to use
different Python versions - you'd have 5 runtimes loaded at once! This is
exactly the sort of thing we want to avoid. Not to mention that it sends
mixed messages to 3rd party ISVs (I mean the ones who actually care
whether their software breaks for a user).

Possible solutions:

- Evangelise stability to the Python team, try and get them onto a more
  standard versioning scheme whereby in release X.Y the major version is X
- Not use Python in the developer platform. This leaves us without
  anything that fills the niche Visual Basic has always filled on Windows
- Somebody might fork Python
- Pick the current version of Python then never upgrade it, or maybe
  backport fixes. That boils down to a less aggressive version of the
  previous point, really.
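The first option above amounts to a promise that any X.y interpreter can run code written against an earlier X release. A hypothetical check a launcher could make under such a scheme (the function and its name are mine, not an existing API):

```python
import sys

def runtime_is_compatible(required_major, required_minor):
    """Under an X.Y scheme where only the major version X may break
    compatibility, an interpreter with the same major and an
    equal-or-newer minor should be able to run the application."""
    have = sys.version_info
    return have[0] == required_major and have[1] >= required_minor

# An app written against, say, 2.2 would then run on any 2.y >= 2.2
# without a parallel 2.2 runtime needing to be installed at all.
print(runtime_is_compatible(sys.version_info[0], 0))
```

That single guarantee is what makes the soname model work for C libraries, and it is exactly what Python's release practice does not currently offer.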

Think about it like this: the reason you need a beefy machine to run
VMware is because you're loading two operating systems at once. But if
platform libraries break backwards compatibility all the time and simply
fall back on parallel installation you're doing pretty much the same thing!

thanks -mike



