Re: [xml] XSD patch and tests

On Sat, 2003-08-02 at 12:06, Daniel Veillard wrote:
On Sat, Aug 02, 2003 at 12:41:27PM -0400, Daniel Veillard wrote:
On Thu, Jul 31, 2003 at 10:20:35PM -0500, Charles Bozeman wrote:
decimal types as strings. I have also included tests (that belong in
test/schemas) for boolean, hexBinary, and decimal types.

   for hexBinary I understand, but for decimal, what is the gain?
It seems to me that the information stored about the decimal is now a
textual form, and retrieving the value requires reparsing it. What is
the purpose of the change to the decimal implementation? A conformance
problem? A limitation in the approach (the REC asks for 18 digits of
precision, and I had those)?
  I would like to apply your patch, but on the other hand I don't
understand the reason for most of the changes made, and I am a bit
annoyed about not being able to get the value for the decimal types.
Could you provide a bit more information?

  I used James Clark's regression tests before and after applying your
patch; I see serious conformance degradation, and memory leaks appearing
in the integer-related types (diff appended). You can run the XSD
regression tests using the Python script:
Yikes, I didn't run the Python script after I had made some later
changes; obviously I screwed something up. I figured performance might
take a hit, but I didn't think it would be too much.
it uses the Relax-NG engine but tests the XSD Datatype implementation
conformance. The tests are in test/xsdtest/, which should be in both
CVS and the distribution.
  I glanced at the code too, and there seem to be some problems, like:
    - XML_SCHEMA_FACET_TOTALDIGITS: uses the xmlSchemaValDecimal2ULong
      function to get the number of digits
    - That function uses sscanf((const char *)decimal->istr, "%lu", &base)
    - I don't think the sscanf function can process 18-digit strings as
      required by the spec (at least not in a portable fashion)
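One way around the sscanf("%lu") limitation is to never convert at all
and count the digits directly in the lexical form. The helper below is a
hypothetical sketch (not from either patch, and the name is invented);
it skips the sign, the decimal point, and leading zeros, but counts
trailing fractional zeros as written, which a real totalDigits facet
check might need to treat more carefully:

```c
#include <assert.h>
#include <ctype.h>

/* Hypothetical helper: count the significant digits of a lexical
 * decimal such as "123456789012345678". Working on the string avoids
 * sscanf("%lu"), which cannot portably hold 18 digits when unsigned
 * long is 32-bit. */
static int
count_total_digits(const char *str)
{
    int digits = 0;
    int seen_nonzero = 0;

    for (; *str != '\0'; str++) {
        if (!isdigit((unsigned char) *str))
            continue;               /* skip sign, '.', etc. */
        if (*str != '0')
            seen_nonzero = 1;
        if (seen_nonzero)           /* leading zeros don't count */
            digits++;
    }
    return (digits == 0) ? 1 : digits;  /* plain "0" is one digit */
}
```

Because it never builds a binary value, the digit count is exact for
any length, portably.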
This is little different from the current code, which uses the
decimal.lo value to get the number of digits to compare against.
See the note in section 3.2.3 (decimal):
"All ·minimally conforming· processors ·must· support decimal
 numbers with a minimum of 18 decimal digits"
As pointed out in a comment in my implementation, this would require a
long long type, which is not portable; that's the reason for the
implementation based on 3 unsigned longs.
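To illustrate the idea (this is a rough sketch of the 3-unsigned-longs
approach described above, not libxml2's actual code; the struct, field
names, and helpers are assumptions), each unsigned long can act as a
limb holding 8 decimal digits, so 18-digit values fit even where
unsigned long is only 32 bits:

```c
#include <assert.h>

/* 10^8: each limb holds 8 decimal digits */
#define LIMB 100000000UL

/* Hypothetical 3-limb decimal, lo = least significant limb;
 * the lo/mi/hi names echo the layout mentioned in the mail. */
typedef struct {
    unsigned long lo, mi, hi;
} dec3;

static dec3
dec3_parse(const char *digits)
{
    dec3 v = {0, 0, 0};

    for (; *digits != '\0'; digits++) {
        unsigned long d = (unsigned long)(*digits - '0');
        /* multiply the 3-limb number by 10 and add d, with carries;
         * lo * 10 + d stays below 10^9, so it fits in 32 bits */
        unsigned long t0 = v.lo * 10 + d;
        unsigned long t1 = v.mi * 10 + t0 / LIMB;
        v.hi = v.hi * 10 + t1 / LIMB;
        v.mi = t1 % LIMB;
        v.lo = t0 % LIMB;
    }
    return v;
}

/* compare most-significant limb first */
static int
dec3_cmp(dec3 a, dec3 b)
{
    if (a.hi != b.hi) return (a.hi > b.hi) ? 1 : -1;
    if (a.mi != b.mi) return (a.mi > b.mi) ? 1 : -1;
    if (a.lo != b.lo) return (a.lo > b.lo) ? 1 : -1;
    return 0;
}
```

Comparison then works limb by limb without ever needing a 64-bit
integer type, which is the portability point being made here.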
  So I think more discussion will be needed before changing the decimal
implementation. In the meantime, could you separate the hexBinary patches
so that part can be fixed?
Will do.
Sometimes I get ideas and just want to see how far I can go with them.
Thanks for your patience.


C. Bozeman
