Re: [Vala] Testing framework



From: Chris Daley <chebizarro gmail com>

Sent: Tuesday, 5 January 2016, 1:16
Subject: Re: [Vala] Testing framework

I've done some thinking about this over the last couple of months and used
the holiday period to finish off a few things, namely a port of Gherkin to
Vala. You can grab it from here if you want to check it out:
https://github.com/chebizarro/gherkin-vala

I'm going to be sketching out what I think is a reasonable
roadmap over the next week or so, with a view to an alpha release around the
end of February. If anyone is interested in contributing, or has any
specific ideas about what sort of features they would find the most useful,
please get in touch.


Hi,

Great work on porting Gherkin 3 to Vala.

I wanted to put forward a few ideas that I have been slowly
researching over the past year or so, in the hope they are
useful for any development of testing tools for Vala.

Gherkin
-------
Gherkin is a language for structuring human language in a
way that allows business analysts, developers and testers to define
features of an application.

Some computer languages have tools available to developers to convert
Gherkin to an outline of a computer program. The following meaningless
example shows the structure. This uses PHP's Behat to convert Gherkin
to PHP:

Feature: test

Background: given this is a test # features/test.feature:3

Scenario: testing of app # features/test.feature:5
  When i run something
  Then it passes

1 scenario (1 undefined)
2 steps (2 undefined)
0m0.17s (9.41Mb)

--- FeatureContext has missing steps. Define them with these snippets:

/**
 * @When i run something
 */
public function iRunSomething()
{
    throw new PendingException();
}

/**
 * @Then it passes
 */
public function itPasses()
{
    throw new PendingException();
}

It is the feature context that forms the basis of generating
automated acceptance tests from the features specified in
Gherkin. The developer then fills in the gaps with code that
drives the tests. In PHP web development this is done with a tool
like Mink, which can drive various headless web browsers.

Automatic code generation for Vala
----------------------------------
I can think of two approaches to generating code. I have
tried neither.

The first is to use libvala to generate a Vala AST and then output
the AST. Potentially libvala could be modified to generate a
Genie version of the AST. This appeals to me.

The other approach would be similar to Valadoc:
https://git.gnome.org/browse/valadoc/tree/src/libvaladoc/api/formalparameter.vala#n131

Code generation could also be useful for anyone wanting to
develop a tool similar to RSpec, so a common approach using libvala
may be helpful. I think Anjuta's CTags plugin also uses libvala for
autosuggestion of function names etc. See:
https://github.com/GNOME/anjuta/blob/master/plugins/symbol-db/anjuta-tags/ctags-visitor.vala
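
To make the first approach a little more concrete, here is a minimal
sketch, loosely modelled on the ctags-visitor linked above, of a
Vala.CodeVisitor that parses a file and prints its method names; a
test-stub generator would emit step definitions instead of printing.
The file names are placeholders and the libvala package name depends on
the installed Vala version (e.g. valac --pkg libvala-0.30 list-methods.vala):

// Minimal sketch: walk a parsed Vala source file and list its methods.
class MethodLister : Vala.CodeVisitor {

    public override void visit_source_file (Vala.SourceFile file) {
        file.accept_children (this);
    }

    public override void visit_namespace (Vala.Namespace ns) {
        ns.accept_children (this);
    }

    public override void visit_class (Vala.Class cl) {
        cl.accept_children (this);
    }

    public override void visit_method (Vala.Method m) {
        stdout.printf ("found method: %s\n", m.name);
    }
}

int main (string[] args) {
    if (args.length < 2) {
        stderr.printf ("usage: %s FILE.vala\n", args[0]);
        return 1;
    }

    var context = new Vala.CodeContext ();
    Vala.CodeContext.push (context);
    context.add_source_filename (args[1]);

    var parser = new Vala.Parser ();
    parser.parse (context);

    var lister = new MethodLister ();
    foreach (Vala.SourceFile file in context.get_source_files ()) {
        file.accept (lister);
    }

    Vala.CodeContext.pop ();
    return 0;
}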

Acceptance Testing Drivers
--------------------------
This is probably the hardest part given the wide range of
interfaces available.

Gherkin comes from Cucumber, which is written in Ruby with web application
development in mind, so I think most tools there use web
interfaces. In the Vala world this could be done with libsoup
for a text analysis of the web interface, or by embedding
WebKit or Gecko, which would also allow JavaScript to be tested.
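
For example, a Gherkin step driving a web interface with libsoup might
end up looking something like this sketch (synchronous libsoup 2.4 API;
the URL and the expected text are made up, compile with --pkg libsoup-2.4):

// Sketch of a step implementation that checks a page over HTTP.
void then_the_home_page_mentions_welcome () {
    var session = new Soup.Session ();
    var message = new Soup.Message ("GET", "http://localhost:8080/");

    // Blocks until the response arrives.
    session.send_message (message);
    assert (message.status_code == 200);

    // Flatten the response body and do a simple text check.
    var body = (string) message.response_body.flatten ().data;
    assert ("Welcome" in body);
}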

Vala is often used for desktop GUI development, so the Linux
Desktop Testing Project ( http://ldtp.freedesktop.org/wiki/ ),
which uses the Assistive Technology Service Provider Interface (
https://en.wikipedia.org/wiki/Assistive_Technology_Service_Provider_Interface
), may be relevant.

Of course software is also developed for technical users. So
there are potentially command line interfaces, D-Bus interfaces, shared
library interfaces and so on to cater for.
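
For D-Bus interfaces, GLib's GTestDBus may be worth a look: it starts a
private session bus for the duration of a test so a service can be
exercised in isolation. A rough sketch, where the service directory and
the way the service is called are placeholders:

// Sketch: run a test against a D-Bus service on a private session bus.
void test_my_dbus_service () {
    var bus = new TestDBus (TestDBusFlags.NONE);

    // Directory containing the .service files for the services under test.
    bus.add_service_dir ("./services");
    bus.up ();

    try {
        var connection = Bus.get_sync (BusType.SESSION);
        // Call the service here, e.g. through connection.call_sync () or a
        // generated proxy, and assert on the results.
        assert (connection != null);
    } catch (Error e) {
        Test.fail ();
    } finally {
        bus.down ();
    }
}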

For command line interfaces I'm starting to think GLib's
trap_subprocess may be useful:
http://valadoc.org/#!api=glib-2.0/GLib.Test.trap_subprocess

I'm trying to write functional tests for Genie, but tests for some
features need to stop the compilation process, e.g. attempting
to override a protected method in a class. The test should
trap the error from valac and make sure it matches the
expected error.
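
A minimal sketch of how that could look with GLib.Test.trap_subprocess,
assuming a Genie sample called override_protected.gs that valac should
reject; the file name and the expected error pattern are placeholders:

// Sketch: assert that valac rejects a Genie sample and emits the expected
// error. "override_protected.gs" and the error pattern are made up.
int main (string[] args) {
    Test.init (ref args);

    Test.add_func ("/genie/override-protected", () => {
        if (Test.subprocess ()) {
            // Child process: run valac, forward its stderr to our stderr,
            // and fail if the compilation unexpectedly succeeds.
            string compiler_stderr;
            int status;
            try {
                Process.spawn_command_line_sync ("valac override_protected.gs",
                                                 null, out compiler_stderr, out status);
            } catch (SpawnError e) {
                error ("could not run valac: %s", e.message);
            }
            stderr.printf ("%s", compiler_stderr);
            assert (status != 0);   // raw wait status; non-zero when valac fails
            return;
        }

        // Parent process: re-run this test in a subprocess with captured
        // output, then check it passed and that stderr matched the error.
        Test.trap_subprocess (null, 0, (TestSubprocessFlags) 0);
        Test.trap_assert_passed ();
        Test.trap_assert_stderr ("*cannot override*");
    });

    return Test.run ();
}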


Test Output
-----------
This is probably moving away from specification by example
with Gherkin and moving towards unit testing. Specifically
it's about my experience with GLib testing framework reports.

Each test can output the results as TAP (Test Anything Protocol)
by using the --tap switch. I would recommend TAP because it
is processed more easily by a lot of tools.e.g. Jenkins build
server.


In the past I have done unit tests using a script such as:

#!/bin/sh

tests="arrays_multiline
variables_declarations
variables_type_inference"

# Build test binaries
for test in $tests
do
valac $test.gs
done

# Run test binaries
gtester --keep-going -o results.xml $tests

# Fix missing info section from results - see https://bugzilla.gnome.org/show_bug.cgi?id=668035
info="<gtester>\n<info>\n<package>Unknown</package>\n<version>Unknown</version>\n<revision></revision>\n</info>"
sed -i 's|<gtester>|'$info'|g' results.xml

# Generate HTML report
gtester-report results.xml > report.html


I have not figured out a convenient way to get the output as TAP,
although a script for automake is available:
https://git.gnome.org/browse/glib/tree/tap-driver.sh

Unit Test of Binaries
---------------------
Vala and Genie produce binaries, of course.

So far I have produced a test binary that is a Position
Independent Executable (
https://securityblog.redhat.com/2012/11/28/position-independent-executables-pie/ )
with all the symbols in the dynamic symbol table. This
allows each unit test to treat the test binary as a
shared object so it can test individual functions, but the binary can also
be run as an executable for functional testing.
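
As an illustration of the shared-object side of that, a unit test can
open the test binary with GModule and call an individual function through
its exported symbol. Everything here (binary path, symbol name, signature)
is hypothetical; compile with --pkg gmodule-2.0:

// Sketch: load a PIE test binary as if it were a shared object and call
// one of its exported functions.
[CCode (has_target = false)]
delegate int AddFunc (int a, int b);

void test_add_function () {
    var module = Module.open ("./test-binary", ModuleFlags.BIND_LAZY);
    assert (module != null);

    void* symbol;
    assert (module.symbol ("my_app_add", out symbol));

    unowned AddFunc add = (AddFunc) symbol;
    assert (add (2, 3) == 5);
}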

I haven't figured out how to do this for a production binary.
This would probably be a static build, so the production binary has
the unit tests aggregated after it in memory and the linker
should use the symbol table and not the dynamic symbol table.

For a test binary you may also want to include the --debug
flag to valac to include Vala filenames. I think this was tried
for code coverage reports with Valum, see
https://coveralls.io/builds/4147345 for example. Not sure of the results.


Vala Bugs to be Aware Of
------------------------
May be of interest:
https://bugzilla.gnome.org/show_bug.cgi?id=704072
https://bugzilla.gnome.org/show_bug.cgi?id=597999
https://bugzilla.gnome.org/show_bug.cgi?id=739725

Hope all this helps frame any work you may do on Vala
testing.

All the best with it,

Al

