Solving a Memory Leak Problem...



I have the following IDL:

typedef sequence<float> Row;
typedef sequence<Row>   Table;

interface Test
{
    Table getTable();
};


And here is the code that implements the function:

/*** BEGIN CODE ***/
static Table *
impl_Test_getTable(impl_POA_Test * servant, CORBA_Environment * ev)
{
   Table *table;
   Row *row;

   table = Table__alloc();
   table->_buffer = Table_allocbuf(2); /* 2 rows */
   table->_length = 2;
   table->_maximum = 2;                /* must not exceed the allocbuf size */

   row = &table->_buffer[0];
   row->_buffer = Row_allocbuf(3);     /* 3 columns */
   row->_length = 3;
   row->_maximum = 3;
   row->_buffer[0] = 5.42;
   row->_buffer[1] = 3.14;
   row->_buffer[2] = 12.7;

   row = &table->_buffer[1];
   row->_buffer = Row_allocbuf(3);     /* 3 columns */
   row->_length = 3;
   row->_maximum = 3;
   row->_buffer[0] = 24.3;
   row->_buffer[1] = 13.4;
   row->_buffer[2] = 1.22;

   return table;
}
/*** END CODE ***/

Pretty simple, but obviously if the client calls this function
repeatedly, the memory allocated on the server side is never released.
What is the best way to free this memory once it has been marshalled
and returned to the client?

Also, I can't find much documentation on programming multi-dimensional
sequences. Are there any best practices I should know about, or is the
above code acceptable (except for the leak problem)?

Thanks,
-- 
\ Craig Patrick McDaniel
/_\ Software Engineer
/_/_\ n + 1, Inc.
/_/_/_\ http://www.nplus1.net
/_/_/_/_\ (502) 479-5557


