Issue 66846 - Out of memory when opening large tables
Summary: Out of memory when opening large tables
Status: REOPENED
Alias: None
Product: Base
Classification: Application
Component: code
Version: OOo 2.0.3
Hardware: All
Importance: P3 Trivial with 12 votes
Target Milestone: ---
Assignee: AOO issues mailing list
QA Contact:
URL:
Keywords: needmoreinfo
Depends on:
Blocks:
 
Reported: 2006-06-29 09:25 UTC by mw69
Modified: 2013-08-07 15:45 UTC
CC List: 5 users

See Also:
Issue Type: DEFECT
Latest Confirmation in: ---
Developer Difficulty: ---


Description mw69 2006-06-29 09:25:10 UTC
When connected to an existing database via JDBC and I open a table with
millions of rows, OOo Base seems to read *all* rows into memory until the Java
VM runs out of memory. If I raise the Java heap limit (e.g. -Xmx512m), it only
takes longer for the out-of-memory error to occur.
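The failure mode described above, materializing an entire result set in memory instead of fetching rows incrementally, can be illustrated with a small self-contained sketch. This is not OOo's actual code; Python's `sqlite3` stands in for the JDBC connection, and the table name and row counts are invented for illustration:

```python
import sqlite3

# In-memory database standing in for the remote MySQL/PostgreSQL server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO big_table (payload) VALUES (?)",
    (("row-%d" % i,) for i in range(100_000)),
)

cur = conn.execute("SELECT id, payload FROM big_table")

# What the bug report describes: pull *all* rows into memory at once.
# With millions of wide rows this is what exhausts the Java heap.
# all_rows = cur.fetchall()

# What a table view needs instead: pull rows in small batches and let
# earlier batches be garbage-collected once they scroll out of view.
seen = 0
while True:
    batch = cur.fetchmany(1000)  # at most 1000 rows resident per step
    if not batch:
        break
    seen += len(batch)

print(seen)  # prints 100000: every row visited, one batch in memory at a time
```

Raising `-Xmx` only enlarges the bucket; as the reporter observes, any fetch-everything strategy still fills it eventually.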
Comment 1 christoph.lukasiak 2006-07-05 10:17:34 UTC
Which database do you connect to, and which driver do you use?
Comment 2 mw69 2006-07-05 18:13:57 UTC
MySQL 5.0.20 with the JDBC driver MySQL Connector/J 3.1.13 or 3.1.12
Comment 3 christoph.lukasiak 2006-09-25 15:00:11 UTC
change owner
Comment 4 christoph.lukasiak 2006-10-25 15:39:42 UTC
As discussed, sending you the issue.
Comment 5 marc.neumann 2006-10-26 14:37:27 UTC
Hi,
I can reproduce this.

MySQL connected via JDBC and a table with 3,000,000 records.

When I double-click on this table, OOo tries to open it and nothing happens
for quite a while (~60 min); after that you get an error message from Java
about being out of heap memory.

Setting the target and sending to the right developer.

Bye Marc
Comment 6 mw69 2006-11-17 09:01:03 UTC
Same effect with PostgreSQL 8.1.4 and postgresql-jdbc-407
Comment 7 skiani 2008-01-04 14:08:07 UTC
In my book this is a problem even with small tables. Any table larger than
about 1.3 GB is a problem (we deal with multi-terabyte Postgres tables). There
is the same problem with viewing large (>1.3 GB) CSV text files: OOo tries to
load the whole table into RAM. It should just window what it needs for viewing.
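The "window what it needs" approach suggested here can be sketched as LIMIT/OFFSET paging: the viewer asks the server only for the slice of rows currently on screen. A minimal sketch under assumed conditions, again using Python's `sqlite3` as a stand-in, with a hypothetical table `t` and page size:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v TEXT)")
conn.executemany("INSERT INTO t (v) VALUES (?)",
                 (("v%d" % i,) for i in range(10_000)))

PAGE = 50  # hypothetical number of rows visible in the table grid

def fetch_window(conn, first_row):
    """Fetch only the rows the grid is currently displaying."""
    return conn.execute(
        "SELECT id, v FROM t ORDER BY id LIMIT ? OFFSET ?",
        (PAGE, first_row),
    ).fetchall()

window = fetch_window(conn, 0)      # first screenful
print(len(window), window[0][0])    # prints: 50 1

window = fetch_window(conn, 9_980)  # scrolled near the end of 10,000 rows
print(len(window), window[0][0])    # prints: 20 9981
```

Memory use is then bounded by the page size rather than the table size, regardless of whether the table holds thousands of rows or billions.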
Comment 8 bigandy 2009-03-08 22:09:11 UTC
I can reproduce this bug on OOo 3.0.1.
The application crashes with a "java heap error" when I insert a picture
larger than 2.6 MB into HSQLDB tables.

System: Fedora Linux 9, 10; JRE 1.6, 1.5

This bug is a major obstacle to deploying OOo Base in our computing systems...
Comment 9 ludob 2009-06-19 07:29:10 UTC
Last week I reported issue 102625: a SELECT * FROM table is sent to the
database just to get the key values when using ODBC. It seems JDBC has the
same problem.

OOO310m11 also has this problem.

Looking into the OOo source code, all of this seems to happen in
dbaccess/source/core/api, which is the part NOT depending on the connectivity
layer (ODBC, JDBC, MySQL, ...).
Comment 10 ocke.janssen 2011-02-03 13:39:25 UTC
Fixed in cws dba34d.

I tried it with a CSV file of 1 million rows, which now takes only 1-2 seconds
to open on my machine.
Comment 11 ocke.janssen 2011-03-18 09:25:46 UTC
Please verify. Thanks.
Comment 12 marc.neumann 2011-03-24 14:43:34 UTC
Not fixed in cws dba34d. If I open a 3-million-row table via JDBC on a MySQL server, I still get a "java heap space" error.
Comment 13 ocke.janssen 2011-04-01 12:10:49 UTC
That is a problem of the JDBC driver used and the JRE settings.