Re: the same page

> There's no reason that gnome 2 (or 3 or 4 or 5) can't include both the bloatware
> eyecandy for the yanks with loads of cash and GHz athlons, and also sane
> alternatives for the rest of the world.  For example, nautilus is very
> pretty, but gmc will do the job if you run an "obsolete" machine.

And that's just great, except that gmc has serious usability problems.
In most circumstances it fails to be useful to hackers who like lots
of powerful features and/or the shell (and who in all probability
won't be using Nautilus either), while at the same time being too
obscure and hard to use for novices. IMO, gmc was the worst piece of
the GNOME desktop and desperately needed to be replaced.

Also, there's a lot more slowing these systems down than the
"eyecandy". Often it's the things that allow a group of programmers
that isn't growing by leaps and bounds to increase the usefulness and
features of their programs (and even the number of programs they
write) by leaps and bounds: that is, the convenience layers. One of
the phenomena I think GNOME is seeing is that we're able to do more
with less as our infrastructure grows more complete. Looking just at
panel code, a functional equivalent (i.e. no enhancements or
improvements) to the GNOME 1.4 panel would take a lot less code and a
lot less time today than has been put into the panel to date.

Another example might be pango. We want to handle all sorts of
non-"roman" languages really well, right? This seems like a worthwhile
goal. But getting there requires (well, required) yet another
abstraction layer. This is apt to slow things down (I don't
specifically know whether it did or does in pango's case, but I'd be
surprised if it didn't). Now add a lot of these abstractions
together... CORBA, pango, Bonobo, GnomeVFS, gobject (the list goes
on)... even glib is going to slow things down. All sorts of useful
techniques like signals,
components, and libraries have associated performance penalties. Most of
GNOME's "performance problems" stem from a layer effect rather than
flashy features and graphics.
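
To make that layer effect concrete, here's a minimal sketch (mine, not
anything from the GNOME tree, and using G_DECLARE/G_DEFINE conveniences
that postdate this mail) timing a plain C function call against a
GObject signal emission doing the same work. The DemoCounter type and
its "tick" signal are made up purely for illustration; the point is
just that the signal path goes through lookup and marshalling machinery
that a direct call doesn't.

/* Compile with: gcc demo.c `pkg-config --cflags --libs gobject-2.0` */
#include <glib-object.h>

#define DEMO_TYPE_COUNTER (demo_counter_get_type ())
G_DECLARE_FINAL_TYPE (DemoCounter, demo_counter, DEMO, COUNTER, GObject)

struct _DemoCounter
{
  GObject parent_instance;
  long    count;
};

G_DEFINE_TYPE (DemoCounter, demo_counter, G_TYPE_OBJECT)

static guint tick_signal;

static void
demo_counter_class_init (DemoCounterClass *klass)
{
  /* A signal with no arguments and no return value. */
  tick_signal = g_signal_new ("tick", G_TYPE_FROM_CLASS (klass),
                              G_SIGNAL_RUN_LAST, 0,
                              NULL, NULL, NULL, G_TYPE_NONE, 0);
}

static void
demo_counter_init (DemoCounter *self)
{
  self->count = 0;
}

static void
on_tick (DemoCounter *self, gpointer user_data)
{
  self->count++;
}

static void
direct_tick (DemoCounter *self)
{
  self->count++;
}

int
main (void)
{
  DemoCounter *counter = g_object_new (DEMO_TYPE_COUNTER, NULL);
  GTimer *timer = g_timer_new ();
  int i, n = 1000000;

  /* The same increment, once as a direct call... */
  for (i = 0; i < n; i++)
    direct_tick (counter);
  g_print ("direct calls:     %.3f s\n", g_timer_elapsed (timer, NULL));

  /* ...and once routed through the signal machinery. */
  g_signal_connect (counter, "tick", G_CALLBACK (on_tick), NULL);
  g_timer_start (timer);
  for (i = 0; i < n; i++)
    g_signal_emit (counter, tick_signal, 0);
  g_print ("signal emissions: %.3f s\n", g_timer_elapsed (timer, NULL));

  g_timer_destroy (timer);
  g_object_unref (counter);
  return 0;
}

None of which is an argument against signals, of course; the
flexibility is usually worth the constant factor. It's just where the
cycles go.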

A lot of slowness can also come from subtle issues. Getting the last 15%
can indeed be a costly endeavor. For example, GTK2 will be double
buffered, which will (probably) increase memory load; and the use of
anti-aliased fonts is going to slow things down some. Want a system
without those things? Sure: use GNOME 1.2.
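
For what it's worth, that particular cost is at least somewhat
tunable. A small sketch, assuming the per-widget knob GTK2 exposes as
gtk_widget_set_double_buffered(); the drawing-area example is mine,
not from any real app:

/* Compile with: gcc demo.c `pkg-config --cflags --libs gtk+-2.0` */
#include <gtk/gtk.h>

int
main (int argc, char *argv[])
{
  GtkWidget *window, *area;

  gtk_init (&argc, &argv);

  window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  area = gtk_drawing_area_new ();

  /* Double buffering is on by default for every widget; a
   * performance-critical widget can trade flicker-free redraws
   * for a smaller memory footprint by opting out. */
  gtk_widget_set_double_buffered (area, FALSE);

  gtk_container_add (GTK_CONTAINER (window), area);
  g_signal_connect (window, "destroy", G_CALLBACK (gtk_main_quit), NULL);
  gtk_widget_show_all (window);

  gtk_main ();
  return 0;
}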

Of course, GNOME could be a lot faster and many of these layers aren't
as optimized as we'd like. Bonobo has seen a lot of substantial
performance fixes. Sure, it would have been nice if these had been
done right in the first place, but a lot of problems only come to the
surface with use.

> > Writing
> > future Gnome releases that don't provide additional cool features for
> > those users due to hardware restraints obviously doesn't gain those
> > users anything so they're not likely to upgrade, and it doesn't provide
> > the alternative to modern desktops or operating systems so higher-end
> > users won't use it either.  Ultimately Gnome gains a few low-end
> > computer users but loses out on potentially many higher-end users.
> 
> New releases fix bugs, provide new features of used applications, and

Bugfixes are a reasonable argument. But at some point these "new
features" are going to cost something. When is the right time to add
anti-aliasing? When is the right time to add good i18n (probably very
important to those villages in Nepal!)? When is the right time to add a
virtual filesystem?

> generally say that we care about accessibility to the less well off as well
> as to the physically disabled.  Think 3rd world projects to provide IT to
> Nepalese villages, for example.  They will often be running on 2nd hand
> cast-off hardware: if GNOME is too bloated, then they won't be using it and
> we lose out on a whole market share.

Or they'll use XFCE or an older version of KDE or GNOME. There is also a
worthwhile distinction between free software and a company. Companies
*have* to find markets, even small ones, if they are going to survive.
A lot of times this is a compromise and means they won't be able to
get the whole enchilada. How many contenders to Microsoft have been
relegated to niche markets and then eventually faded away?

Do you think they just happen to hang on to that niche market? I bet a
lot of times they realize that they aren't going to get the whole market
given x sales, y development costs, and z cash reserves, and change
their focus to "surviving" instead. I'm interested in gunning for the
big
picture, maybe capturing strategic small markets as they align well with
our talents and product (for example, 3D graphics and CAD might be a
market that aligns well with GNOME for historical and practical
reasons). 

The reason that I believe focusing on being the low budget alternative
is the wrong approach is that you're setting yourself up to be destroyed
by a natural tendency. Hardware is, on average, getting faster, not
slower. Those 2nd hand cast-offs the Nepalese village gets in two or
three years will be a lot faster than the machines they got today. If
we're not growing and expanding to use that power they will look at
GNOME and say "wow, that's primitive compared to FoobarOS 2004". 

The marked exception to this is when something is prohibitively
expensive for a *large* market, that is to say, when it is relegated
to niche status because of its cost. An example of this would
be the Macintosh vs. the Xerox Alto. But here's the thing... operating
systems don't comprise enough of the cost of the machine for them to be
the prohibitively expensive element at this point. And if Microsoft's
not stupid, they'll make sure it never is (unless they, ha ha, manage to
create something that's actually so good that nobody else can match it,
or *grimace* manage to lock people in much tighter than they can today).
GNOME being free and running on old hardware would not open up a market
that's even nearly as large as the market Microsoft and Apple are
targeting today.

Why did Cyrix get screwed? Why didn't AMD? I think a big reason was that
AMD realized they needed to push forward as well as down on prices. If
you're not pushing forward, you won't have anything to show when the
"time is right". The way technology tends to work, one of the best ways
to push prices down is to push forward. Weird, but it tends to be true.

So we can forge on ahead and make desktop environments for GHz
machines (we aren't yet, fwiw, but we probably will be in a year or
two), and two years from now GNOME 3.0 (by then rather old) will run
happily on the GHz castoffs donated to the Nepalese village. Or we can
focus on making GNOME run well today on the 130 MHz machines donated
to the village; but by the time that product is done they'll have
faster machines, and GNOME won't look all that attractive. Even
assuming the village in Nepal will use GNOME rain or shine, they're
going to be much happier in the long run if we push ahead rather than
targeting them. They may get something more useful in the short run,
but in the longer run they're going to have faster hardware and want a
GNOME that can really use it.

I hope environments like XFCE, which are selectively built from pieces
of projects like GNOME to be high performance AND pick up desirable
qualities, prosper. That, in my opinion, is a better route to go. XFCE
will probably always be a fringe desktop for this reason, but it will
also serve a certain niche into the foreseeable future while GNOME is
still growing by leaps and bounds (once your market share and
technical achievement have stagnated, as in MS's case, it may well
make sense to pursue these smaller markets). Eventually, though, I bet
GNOME will be sufficiently advanced that XFCE simply won't be able to
keep up, and an old, equivalently performance-hungry version of GNOME
will be more featureful than XFCE, at which point GNOME will take over
that market share. Think towards the future... it's one of the
advantages of not having to ensure a revenue stream today.

> -Dick

-Seth
