[Users] Displaying big cubes/lines; memory limits

Bert Bril Bert.Bril at opendtect.org
Tue Mar 3 11:05:00 CET 2009


Hi all,


Regularly I get e-mails from people who are surprised when OpendTect
stops ('crashes') when they try to load entire cubes. They wonder
whether there is some kind of limit. The answer is: yes, there is a
limit, and it depends on the amount of memory your computer has.

Why don't we catch the situation and give an error message?
Unfortunately, the amount of actually available memory on a computer
system is very hard to determine. In practice, the only real way to
find out is to try to allocate the memory that is needed; if the
allocation fails, there simply wasn't that much free memory available.
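The idea is simple enough to show in a few lines. Here is a minimal
Python sketch of that 'try it and see' approach (OpendTect itself is
written in C++, so this only illustrates the principle, it is not what
the program actually does):

    # Minimal sketch: the only reliable test is the allocation itself.
    def can_allocate(n_bytes):
        try:
            block = bytearray(n_bytes)   # forces a real, zero-filled allocation
            del block
            return True
        except MemoryError:
            return False

    # E.g. test whether 2 GB could be allocated right now:
    print(can_allocate(2 * 1024 ** 3))

Even then, by the time such a test fails the user has already started a
big load, and on some systems the process may be killed before the
failure can be reported at all.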

Thus, although OpendTect lets you create a volume viewer for your
terabyte-sized seismic cube, that doesn't mean you can actually load the
data. A memory monitoring program can be very helpful if you doubt
whether something will fit in memory (Windows: start the Task Manager;
Linux: use a utility like 'top'). Those programs also do some guessing,
but they're rather good at it.
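If you'd rather script such a check, here is a rough Linux-only Python
sketch that reads /proc/meminfo. It does the same kind of guessing
(free memory plus buffers and cache) that 'top' does, so treat the
result as an estimate:

    # Rough estimate of 'available' memory on Linux, in the spirit of 'top'.
    def estimate_free_kb():
        vals = {}
        with open('/proc/meminfo') as f:
            for line in f:
                key, rest = line.split(':', 1)
                vals[key] = int(rest.split()[0])      # values are in kB
        return vals['MemFree'] + vals.get('Buffers', 0) + vals.get('Cached', 0)

    print('Roughly %.1f GB free' % (estimate_free_kb() / (1024.0 * 1024.0)))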

You can also predict what will still fit in memory - there is an easy
rule of thumb to calculate this. Let's follow a support call I got
this morning:

> > When I try to load cube of the size:
> > -         # lines = 600,
> > -         # Xlines= 1250 and
> > -         # smpl.= 500,
> > I am getting the error. If I define smaller volume data are loaded
> > successfully.

The governing rule is the 'rule of ten': you'll need about 10 bytes per
sample, i.e. 10 times the total number of samples. In this case,
OpendTect will use about 10 * 600 * 1250 * 500 bytes, approx. 3.5 GB of
memory. If you have 4 GB of RAM or less in your computer, this is
hopeless. And if you're lucky, OpendTect will not crash but report the
failure (although catching this is also rather hard - memory management
is a continuous problem on computers).
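
If you want to script that back-of-the-envelope calculation, a small
Python sketch could look like this (the factor 10 is the rule of thumb
above, not an exact figure):

    BYTES_PER_SAMPLE = 10          # the 'rule of ten', a rough estimate

    def cube_memory_gb(n_inlines, n_crosslines, n_samples):
        total_samples = n_inlines * n_crosslines * n_samples
        return total_samples * BYTES_PER_SAMPLE / (1024.0 ** 3)

    # The cube from the support call above: 600 x 1250 x 500 samples
    print('Estimated need: %.1f GB' % cube_memory_gb(600, 1250, 500))   # ~3.5 GB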

What can you do in such a situation? Well, don't try to display all
samples! Say you have only 2 GB of RAM in your computer. In that case
I'd recommend displaying no more than 1/4 of the samples of such a
cube, so that one cube uses no more than 1/2 of your computer's memory.
Some solutions:

* Don't select the entire area or Z range
* Increase the survey inline and crossline steps by a factor of 2, i.e.
  load only every second line (see the quick check after this list)
* Work with inlines and crosslines rather than using the volume viewer
* Buy more RAM (it's cheap nowadays!)
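
To see what the second bullet buys you (the quick check promised
above): taking only every second inline and every second crossline
divides the number of samples, and thus the memory need, by four. With
the cube_memory_gb() sketch from earlier:

    # Same 600 x 1250 x 500 cube, decimated by 2 in both directions:
    print('%.1f GB' % cube_memory_gb(600 // 2, 1250 // 2, 500))   # ~0.9 GB

which fits comfortably within the 'half of 2 GB' budget mentioned above.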


- Bert

-- 
-- Bert Bril / OpendTect developer at dGB
-- mailto:Bert.Bril at opendtect.org , http://opendtect.org




