Re: large document support



On Fri, Sep 19, 2003 at 07:25:37AM -0500, datazone airmail net wrote:
> Good Day,
>          I searched the list but I have not seen anyone mention this
> before: is there a "usable" limit to the size of document that gnumeric
> can work with?
> I recently installed the 1.2.0 release, and feature-wise it is very good.
> However, it takes over twice as long to open a csv file compared to Excel
> on identical hardware.  The box is a P4 2GHz with 512MB running RH9; the
> file is 10MB with over 30 cols and over 40,000 rows.  After it is opened,
> certain operations, like changing the cell formatting of a col or even just
> saving the file in standard gnumeric xml format, bring the box to its
> knees and require me to kill gnumeric after waiting for a long time.
> Has anyone else seen this problem, and is there a way to work around it?

Standard operations should scale well.  Please send a sample and
example of anything that seems like a problem.

XML export is a known issue.  It boils down to memory consumption.
The default XML exporter uses a DOM interface, which basically requires
keeping the entire thing in memory as a giant string.  That can get
huge.  There is an experimental SAX-based exporter in 1.2.0:
        EXPERIMENTAL SAX based Gnumeric (*.gnumeric)

That is significantly faster.  It will become the default handler
for 1.3.x, but did not receive enough testing for us to enable it as
the default for 1.2.
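The DOM-vs-SAX distinction above is the general tradeoff, not Gnumeric-specific code (Gnumeric's real exporters are C built on libxml2).  A rough illustrative sketch in Python, with made-up Sheet/Cell element names, shows why the DOM path eats memory while the streaming path does not:

```python
import io
import xml.dom.minidom as minidom
from xml.sax.saxutils import XMLGenerator

# Stand-in for a sheet; imagine 40,000 rows instead of two.
rows = [["a", "b"], ["c", "d"]]

def dump_dom(rows):
    # DOM approach: build the entire document tree in memory, then
    # serialize it to one giant string -- memory grows with sheet size.
    doc = minidom.Document()
    sheet = doc.createElement("Sheet")
    doc.appendChild(sheet)
    for r, row in enumerate(rows):
        for c, value in enumerate(row):
            cell = doc.createElement("Cell")
            cell.setAttribute("Row", str(r))
            cell.setAttribute("Col", str(c))
            cell.appendChild(doc.createTextNode(value))
            sheet.appendChild(cell)
    return doc.toxml()

def dump_sax(rows, out):
    # SAX/streaming approach: emit events straight to the output stream,
    # so only the current cell is in flight, whatever the sheet size.
    gen = XMLGenerator(out)
    gen.startDocument()
    gen.startElement("Sheet", {})
    for r, row in enumerate(rows):
        for c, value in enumerate(row):
            gen.startElement("Cell", {"Row": str(r), "Col": str(c)})
            gen.characters(value)
            gen.endElement("Cell")
    gen.endElement("Sheet")
    gen.endDocument()
```

Both produce equivalent markup; the difference is that the streaming writer's peak memory is constant while the DOM tree is proportional to the whole workbook, which is exactly what hurts on a 10MB, 40,000-row file in 512MB of RAM.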

See if that helps,
    Jody


