Re: [Vala] [RFC] Testing framework improvements



Hello,

Julien Peeters <jj peeters    > writes:
> I'd like to add some enhancements like better reporting, better
> assertion support with pretty printing, etc. In that case this
> framework may effectively become a separate library. And, if
> contributors are interested in it, why not continue to improve it?
> With your external runner, for instance.

Yes, it makes sense.

> I have been working on it for only a few weeks, so the direction is
> not really well defined at the moment.

I have some ideas about what I would like to see, so I will write them
down here and try to contribute some of them as I find time:

 - Automation for forked tests. The runner should be able to wrap all tests in
   something like:

   if (Test.trap_fork (0, TestTrapFlags.SILENCE_STDOUT
                          | TestTrapFlags.SILENCE_STDERR)) {
       Object fixture = Object.new (fixturetype);
       run (fixture);
       fixture = null; // force destruction
       print ("test-completed");
       Process.exit (0); // the forked child must exit explicitly
   }
   Test.trap_assert_passed ();
   Test.trap_assert_stdout ("test-completed");

   It should be possible to turn it on and off globally from the
   command line, or in a particular suite by declaring something.

   I am not sure whether this is already supported by some option in
   GLib.Test -- GLib.Test.init can select a "mode", but it does not
   seem to be documented whether that means anything to the framework
   itself or whether it only affects what GLib.Test.quick(),
   GLib.Test.slow() etc. return.

 - Reporting for subunit (https://launchpad.net/subunit/) (or some other
   test reporting format), to be used for combining the unit tests with
   black-box tests written with ldtp, expect, pexpect etc. into a
   uniform test suite.
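
   As an illustration, the subunit v1 protocol is just lines on stdout,
   so a minimal reporter could look like this sketch (the "test:",
   "success:" and "failure:" tokens are subunit's; the function names
   are made up):

     // Emit subunit v1 result lines on stdout (sketch only).
     void report_test_start (string name) {
         stdout.printf ("test: %s\n", name);
     }

     void report_test_pass (string name) {
         stdout.printf ("success: %s\n", name);
     }

     void report_test_fail (string name, string details) {
         stdout.printf ("failure: %s [\n%s\n]\n", name, details);
     }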

 - Running the main loop until a signal is emitted. This is needed for
   testing asynchronous interfaces.

   The basic interface would be:
   
     bool wait_for_signal (Object object, string signal_name, int timeout);

   which would create a main loop, register a callback on the signal
   that quits it, register a timer that quits it, and run the loop. It
   would return whether the signal occurred. (In fact, it should take a
   closure that is expected to trigger the signal, so that it detects
   both synchronous and asynchronous emissions.)
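
   A minimal sketch of such a helper (assumptions: the low-level
   GLib.Signal API is used because the signal name is only known at run
   time, and since it takes a plain function pointer rather than a
   closure, a static helper stands in for the quit callback):

     static void quit_loop (MainLoop loop) {
         loop.quit ();
     }

     bool wait_for_signal (Object object, string signal_name, int timeout) {
         var loop = new MainLoop ();
         bool timed_out = false;

         // Quit the loop when the signal fires; "swapped" makes the
         // user data (the loop) the first callback argument.
         ulong sig_id = GLib.Signal.connect_swapped (
             object, signal_name, (GLib.Callback) quit_loop, loop);

         // Quit the loop on timeout too, but remember that we did.
         uint timer_id = Timeout.add ((uint) timeout, () => {
             timed_out = true;
             loop.quit ();
             return false; // one-shot timer
         });

         loop.run ();

         GLib.SignalHandler.disconnect (object, sig_id);
         if (!timed_out)
             Source.remove (timer_id);
         return !timed_out;
     }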

   A simple extension on top of that is:

     delegate bool Condition ();

     bool wait_for_condition (Object object, string signal_name,
                              int timeout, Condition condition) {
         do {
             if (condition ())
                 return true;
         } while (wait_for_signal (object, signal_name, timeout));
         return false;
     }

   (It should not restart the timer on every iteration, but that's the
   basic idea; a deadline-based variant is sketched below.) That is, if
   you need to wait for something to happen, you specify the condition
   you are waiting for and a signal that might change whether it holds;
   the helper then waits for the condition to become true within the
   specified time, re-checking it every time the signal is emitted.
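
   The deadline-based variant could look like this (sketch only, using
   GLib.Timer to measure the elapsed time):

     bool wait_for_condition (Object object, string signal_name,
                              int timeout, Condition condition) {
         var timer = new Timer (); // starts measuring immediately
         while (true) {
             if (condition ())
                 return true;
             // Milliseconds left until the overall deadline.
             int left = timeout - (int) (timer.elapsed () * 1000);
             if (left <= 0 || !wait_for_signal (object, signal_name, left))
                 return false;
         }
     }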

   Lastly, the fixture should provide a "completed" signal and a
   void complete() method that triggers it, to be used with something
   like:

     bool wait_for_completion(int timeout) {
         return wait_for_signal(this, "completed", timeout);
     }
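
   The corresponding fixture members could be as simple as this (a
   sketch, assuming they live on the Fixture class):

     public signal void completed ();

     public void complete () {
         completed ();
     }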

   I've already done something like this in UnitTest++ for work, so
   I hope to do it relatively quickly.

 - FixtureWithTempDir subclass of Fixture. This would create a temporary
   directory in the constructor and delete it if the test succeeded
   (unless an option is given on the command line). It should also have
   some utilities for dumping data into that directory, initializing it
   from a template, and running a shell script (from a string) to
   initialize it (I have already started such code somewhere).

   Every time I have tried to test something, at work or elsewhere, I
   needed a temporary directory for some of the tests. Most of the time,
   dumping a compiled-in string to a file in that directory and checking
   whether another file created by the test contained something
   particular was exactly what I needed, so having some support for that
   would be nice.
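
   A rough sketch of the core of such a fixture (all names are assumed,
   and DirUtils.make_tmp stands in for whatever creates the directory):

     class FixtureWithTempDir : Object {
         public string dir { get; private set; }

         construct {
             try {
                 // Unique directory under the system temp dir.
                 dir = DirUtils.make_tmp ("vala-test-XXXXXX");
             } catch (Error e) {
                 error ("cannot create temporary directory: %s", e.message);
             }
         }

         // Dump a compiled-in string into a file in the directory.
         public void write_file (string name, string contents) throws Error {
             FileUtils.set_contents (Path.build_filename (dir, name), contents);
         }

         // Read back a file that the tested code created there.
         public string read_file (string name) throws Error {
             string contents;
             FileUtils.get_contents (Path.build_filename (dir, name), out contents);
             return contents;
         }
     }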

 - FixtureWithDBusSession subclass of Fixture (well, of FixtureWithTempDir
   really, because it needs to create the socket and put the configs
   somewhere, and it's quite likely the test will need both anyway).

   This should take the service description XML from some template
   directory, or compiled in, start a private instance of the DBus
   daemon with a custom socket, and tweak the environment to redirect
   default connections to that instance. That way one can test a service
   while another instance is regularly running in the system, or test
   DBus code against a mock service. The wait_for_* tools above can then
   nicely be used to check that the service emits something and so on.
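
   The bus setup itself could be sketched roughly as follows (names
   assumed; dbus-daemon prints the address of its socket when given
   --print-address):

     // Start a private session bus and redirect default connections
     // to it.  Sketch only; error handling is minimal.
     string start_private_bus (string config_path) throws SpawnError {
         Pid pid;
         int bus_stdout;
         Process.spawn_async_with_pipes (null,
             { "dbus-daemon", "--config-file=" + config_path,
               "--print-address" },
             null, SpawnFlags.SEARCH_PATH, null,
             out pid, null, out bus_stdout, null);

         // The daemon prints the bus address on its stdout.
         var stream = FileStream.fdopen (bus_stdout, "r");
         string? line = stream.read_line ();
         assert (line != null);
         string address = line.strip ();

         // Point the default session connection at the private bus.
         Environment.set_variable ("DBUS_SESSION_BUS_ADDRESS", address, true);
         return address;
     }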

 - Data-driven test cases. If a data source is declared in some way, all
   tests that take an extra parameter are run once for each object read
   from the data source.

   I have not really used this kind of thing yet, but I did think of it
   a few times when writing tests where I needed to run the same code on
   several inputs to exercise different code paths in the library.

   My initial idea is to have an interface defining a method that
   returns an enumerable of GLib.Objects. If the fixture implements this
   interface, then each test method that takes an extra parameter of a
   GLib.Object-derived type is called once for each of those objects.
   Each of those invocations is run as a separate test case, including
   being separately forked if forking is on. A possible shape of the
   interface is sketched below.

   There could be helper methods or subinterfaces that implement the
   enumerator by deserializing the objects from somewhere (a JSON file,
   an XML file, a database etc.).
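
   A possible shape of that interface (hypothetical names; a plain
   GLib.List stands in for the enumerable here):

     public interface TestDataSource : Object {
         // One GLib.Object per data-driven invocation of each test
         // method that takes an extra Object-derived parameter.
         public abstract List<Object> get_test_data ();
     }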

   Note: To avoid having to link against the various libraries (like
   dbus-glib, libxml etc.) when one has no tests that need them, the
   suite could be a static library rather than a dynamic one. That way
   the linker will not pull in objects you don't actually use, and
   therefore will not need the underlying libraries either.

Regards,
Jan



