Re: gtk 2.4.x installation



--- David Nečas (Yeti) <yeti physics muni cz> wrote:

> On Mon, Jan 01, 2007 at 05:54:58AM -0800, Sergei Steshenko wrote:
> > 
> > Slightly (?) offtopic (?).
> 
> Probably.
> 
> > Or, 'make' is OK, but 'make check' fails - I think, for 'libpng' - that is,
> > again, on MacOS.
> 
> So there's something strange going on on that system.  It is
> the *purpose* of `make check' to fail then.
> 
> > And this fact severely undermines the statement about the high quality of FOSS.
> > 
> > And I have no time and energy to file all these bugs.
> 
> Hm, are you sure it's the fact that undermines the quality
> then?
> 
> Anyway, I agree we have a problem.  The problem is proving
> the (in)correctness of build systems.  We do not have any
> tools for that (besides actually trying the build m*2^n times
> where m is the number of supported systems and n is the
> number of possible dependencies).  So what solution do you
> offer -- or at least suggest?
> 
> Yeti
> 
> 

At least try to build everything using my tool - everything (except
for the thin layer of standard C/C++ libraries and X, and even that
can be avoided) is installed in "non-standard", or rather non-default,
places, and so far no end-user effort is required - that is, when the
build succeeds at all.

If a build fails with the outcome ('configure' succeeds, but 'make'
fails), you guys just fix the build mechanism.
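
For illustration, here is a minimal sketch of the pattern my tool
automates - building a package into a non-default prefix under $HOME,
with no root privileges; the package and source directory are
hypothetical:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical example: build one package into a private prefix
# under $HOME, with no root privileges.
my $prefix = "$ENV{HOME}/apps";              # non-default install location
my $src    = "$ENV{HOME}/src/libpng-1.2.8";  # hypothetical source directory

chdir $src or die "cannot chdir to $src: $!";

# Each step must succeed before the next one runs, so a
# ('configure' OK, 'make' fails) outcome is detected immediately.
for my $cmd ("./configure --prefix=$prefix",
             "make",
             "make check",
             "make install") {
    print "+ $cmd\n";
    system($cmd) == 0 or die "step failed: $cmd\n";
}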

About the supported systems - (QEMU + a minimal OS install) is
probably our friend, i.e. I suggest using this combination as the
sandbox (== test build environment).
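
A sketch of what I mean (the disk image name is hypothetical; with
'-snapshot' QEMU discards all disk writes on exit, so every test build
starts from the same pristine state):

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical disk image holding a minimal OS install.
my $image = 'minimal-os.qcow2';

my @qemu = (
    'qemu-system-x86_64',
    '-m', '512',       # modest RAM is enough for a build sandbox
    '-snapshot',       # discard disk writes on exit
    '-nographic',      # serial console only, convenient for scripting
    '-drive', "file=$image,format=qcow2",
);

system(@qemu) == 0 or die "qemu exited with status $?\n";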

And another point - from my experience, I distrust by definition the
kind of release notes I see on these lists. My whole developer +
verification-engineer experience tells me that one has the moral right
to claim that something can be built / works only if he/she describes
the conditions under which the building and the testing were performed
- otherwise it is more of an empty claim than not.

That's the reason I publish a build log when I make a release - at
least people can see which versions were involved, and if a dependency
does not appear in the log, it means it was taken from the system
level.

And that's why I introduced the '-check_with_ldd' switch/functionality
- I want to really check myself and see which libraries are actually
involved. I haven't yet come up with a similar solution for included
files.
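
The idea behind the switch is simple (a minimal sketch, not the actual
implementation; the binary path is hypothetical): run ldd on each
built binary and record which shared libraries get resolved, and from
where:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical binary to inspect.
my $binary = "$ENV{HOME}/apps/bin/some_app";

# ldd prints one "libname => /resolved/path (address)" line per
# shared-library dependency.
open my $ldd, '-|', 'ldd', $binary or die "cannot run ldd: $!";
while (my $line = <$ldd>) {
    if ($line =~ /^\s*(\S+)\s*=>\s*(\S+)/) {
        my ($lib, $path) = ($1, $2);
        # Flag libraries resolved outside our private prefix.
        my $origin = $path =~ m{^\Q$ENV{HOME}\E/apps} ? 'own' : 'system';
        printf "%-30s %-8s %s\n", $lib, $origin, $path;
    }
}
close $ldd;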

So, I think, release announcements should follow suit - build
environments should be fully disclosed, and information on the
dependencies' versions, preferably extracted from the build results,
should be published.
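
One way to extract such version information mechanically (a sketch; it
assumes each dependency ships a pkg-config file, which is true for the
GTK+ 2.x stack):

#!/usr/bin/perl
use strict;
use warnings;

# Ask pkg-config for the versions of the dependencies actually used;
# the module names below are the standard ones for the GTK+ 2.x stack.
for my $module (qw(glib-2.0 atk pango gtk+-2.0)) {
    chomp(my $version = `pkg-config --modversion $module 2>/dev/null`);
    printf "%-10s %s\n", $module, $version || '(not found)';
}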

When I was a VLSI verification engineer, one of the requirements was
that designers built and tested their blocks against the existing
baseline model of the rest of the chip and presented compilation and
test-run logs - only afterwards were their blocks accepted into the
next, to-be-released baseline model.

But the proprietary VLSI world differs from the SW world in a good way
- mistakes used to cost $100K in the mid-nineties (that was the cost
of the masks for manufacturing a new chip), and now, AFAIK, it's about
$1M.

So, as I once heard from a VP of engineering, that's why VLSI revision
control and verification approaches used to appear much more robust and
advanced than "pure" SW ones.

Last but not least - if you don't like my build tools, don't use them.

But develop something that:

* builds and installs everything in a separate directory;
* downloads sources automatically;
* takes care of dependencies automatically - at least, in the
hard-coded way I did it;
* does not require root privileges;
* uses hierarchical data structures (Perl, Python, Ruby, Lua) and
not "flat" shell syntax - the latter is very difficult to understand
and maintain.

The point about data structures is important - when the build data is
a hierarchical data structure, one can easily perform all kinds of
automated manipulations on it - for example, one can write a loop that
temporarily excludes each dependency in turn and checks whether
'configure' catches the missing dependency.
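
For example (a minimal sketch with hypothetical package names and a
hypothetical build driver - my tool's real schema differs, this only
shows the principle):

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical hierarchical build description (a hash of hashes).
my %build = (
    zlib   => { tarball => 'zlib-1.2.3.tar.gz',   deps => [] },
    libpng => { tarball => 'libpng-1.2.8.tar.gz', deps => ['zlib'] },
    'gtk+' => { tarball => 'gtk+-2.4.14.tar.gz',
                deps    => ['glib', 'pango', 'libpng'] },
);

# Because the build data is a real structure and not flat shell text,
# we can exclude each dependency in turn and re-run the build, checking
# whether 'configure' actually notices that the dependency is missing.
for my $pkg (sort keys %build) {
    for my $dep (@{ $build{$pkg}{deps} }) {
        my @reduced = grep { $_ ne $dep } @{ $build{$pkg}{deps} };
        local $build{$pkg}{deps} = \@reduced;  # temporary exclusion
        print "building $pkg without $dep ...\n";
        # build_package($pkg, \%build);  # hypothetical build driver
    }
}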

By the way, in Perl one can dump both data and code in Perl form,
i.e. one can write a piece of code that dumps the build data in a
normalized Perl form, and then this data (+ code) can be imported into
another Perl script for various manipulations. At the moment only the
data part can be dumped by my tool, but a module that is able to dump
(data + code) is in the released tarball - it's just not used yet.
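
This is standard Data::Dumper functionality: with $Data::Dumper::Deparse
set, code references are decompiled via B::Deparse and dumped as Perl
source alongside the data, and the dump can be read back with eval. A
sketch with a hypothetical build record (my tool uses its own module
for this, as noted above):

#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;

$Data::Dumper::Deparse  = 1;  # decompile code refs via B::Deparse
$Data::Dumper::Sortkeys = 1;  # normalized, reproducible key order

# Hypothetical build record mixing data and code.
my $build = {
    tarball   => 'libpng-1.2.8.tar.gz',
    configure => sub { system './configure', "--prefix=$ENV{HOME}/apps" },
};

my $dumped = Dumper($build);  # valid Perl source text
print $dumped;

# Another script can re-import the structure, code included.
my $restored = do {
    our $VAR1;        # Dumper's output assigns to $VAR1
    eval $dumped or die $@;
};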

--Sergei.

Applications From Scratch: http://appsfromscratch.berlios.de/

__________________________________________________