Re: regression testing

On 27/04/07, Joe Shaw <joe joeshaw org> wrote:
We also have a test suite stored in Novell Forge, which I run every
now and then, mostly when I'm gearing up to do a release.  It was
originally put in Novell Forge because it was SVN and maintaining it
in GNOME CVS would cause major pain for people who were interested
only in the code.  A lot of those test files were making sure we
extracted the right info, got all the metadata, etc.  They weren't
really there to test "broken" files so much.  At some point I'll look
into moving that stuff over into our current SVN.

A big problem I had when I was fixing and dealing with a lot of the MS
Word crashers and misbehaving documents is that they contained private
information and I couldn't add them to a regression suite.  Since the
files were badly formed for whatever reason (or the parsers were
broken), we couldn't recreate the situation in another file.


Crashing errors are quite easy to detect, as plenty of evidence ends
up in the error logs. I was thinking more along the lines of
monitoring beagle for subtler errors, such as only partially indexing
files, which may go unnoticed for a while. Soft errors like that are
much worse, as they might give people the impression that beagle is
just not very good (not that I'm saying there are any such cases).
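One way to catch that kind of soft error is to compare what actually gets extracted from each test file against a known-good record, so a partially indexed file shows up as a diff instead of passing silently. Here is a minimal sketch of that idea; the golden data and the `extract_properties()` stub are hypothetical stand-ins, since a real harness would query beagle itself (e.g. via beagle-query) for each file's indexed properties.

```python
# Minimal sketch of a "soft error" regression check: compare the
# properties actually extracted from each test file against a golden
# record, so partial indexing shows up as a reported diff rather
# than only a crash in the logs.

GOLDEN = {
    # Hypothetical corpus: file -> set of properties we expect indexed.
    "report.doc": {"title", "author", "page-count", "text"},
    "photo.jpg": {"width", "height", "exif-date"},
}

def extract_properties(path):
    # Stand-in for the real extractor; here it simulates a partial
    # result for one file to show what a failure report looks like.
    simulated = {
        "report.doc": {"title", "text"},          # metadata dropped
        "photo.jpg": {"width", "height", "exif-date"},
    }
    return simulated[path]

def check_corpus(golden):
    # Return {path: sorted list of missing properties} for every file
    # whose extraction came back incomplete.
    failures = {}
    for path, expected in golden.items():
        missing = expected - extract_properties(path)
        if missing:
            failures[path] = sorted(missing)
    return failures

if __name__ == "__main__":
    for path, missing in check_corpus(GOLDEN).items():
        print(f"{path}: missing {', '.join(missing)}")
```

Run against a real corpus, the golden file would be generated once from a release known to index correctly, and any later shrinkage in the extracted property set would fail the check.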

It would also make fair comparisons with other projects, like
tracker, possible.
