Quinn-Curtis Forums
 ArrayIndexOutOfBounds after long run

TOPIC REVIEW
jminer Posted - 13 Nov 2006 : 17:06:06
I'm using QCRTGraph for Java in a semi-real-time application drawing 6-10 waveforms, with a display update every 1000 ms and a sample update every 10 ms. It works great for about an hour or so. Thereafter, I get an ArrayIndexOutOfBoundsException: 262144 at com.quinncurtis.chart2djava.DoubleArray.arrayCopy(DoubleArray:44)
...
... traceback through quinn curtis drawing
...
java.awt.EventDispatchThread.run

Eventually, I get a java out of memory crash.

Any suggestions?
4 LATEST REPLIES (Newest First)
jminer Posted - 14 Nov 2006 : 17:33:57
Good advice, thanks. The value 200000 was merely to get under the index 262144 where I seemed to get into trouble. A more reasonable value is definitely in order.

Update on last run: Setting up the truncation did the trick, no more crashes.

You folks have a great product and excellent support. I am impressed.

...Jim
quinncurtis Posted - 14 Nov 2006 : 12:06:50
To keep the updates as fast as possible, keep the minimum number of data points in the dataset only slightly larger than a screen's worth of data. For efficiency, the maximum does not need to be more than 2x that, unless you want to be able to scroll back historically through the data. Your value of 1000 for the minimum seems about right; the value of 200K for the maximum seems large unless you need access to all of the historical data values via the dataset.
jminer Posted - 14 Nov 2006 : 11:17:53
The code I'm using is rather involved. I'm not explicitly truncating the datasets; I guess I assumed that was done behind the scenes, newbie error. I'm doing a run now with dataset truncation turned on:
autoTruncate=true
autoTruncMin=1000
autoTruncMax=200000
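Conceptually, min/max auto-truncation works like the sketch below: once a dataset grows past the max threshold, the oldest points are discarded so only the newest min-threshold points remain. This is a minimal illustration of the idea, not QCRTGraph's internal code; the class and method names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of min/max auto-truncation: when the buffer
// exceeds truncMax points, the oldest points are dropped so that
// only the newest truncMin points remain.
class TruncatingBuffer {
    private final int truncMin;
    private final int truncMax;
    private final List<Double> points = new ArrayList<>();

    TruncatingBuffer(int truncMin, int truncMax) {
        this.truncMin = truncMin;
        this.truncMax = truncMax;
    }

    void add(double value) {
        points.add(value);
        if (points.size() > truncMax) {
            // Drop the oldest (size - truncMin) points in one pass.
            points.subList(0, points.size() - truncMin).clear();
        }
    }

    int size() { return points.size(); }
}

public class Main {
    public static void main(String[] args) {
        TruncatingBuffer buf = new TruncatingBuffer(1000, 200000);
        for (int i = 0; i <= 200000; i++) {
            buf.add(i * 0.01);
        }
        // After the max is exceeded once, only truncMin points remain.
        System.out.println(buf.size()); // prints 1000
    }
}
```

With truncation like this, memory use stays bounded regardless of run length, which is why the crash disappears once it is enabled.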

I will let you know how it turns out.

Thanks...Jim
quinncurtis Posted - 13 Nov 2006 : 17:26:04
Can you supply us with a simple example program using simulated data that reproduces the problem?

Are you continuously truncating the datasets, or are you accumulating millions of data points? At 10 traces * 100 samples/sec * 3600 secs/hour, that is 3.6 million data points per hour, or roughly 30 MB/hour.
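The accumulation arithmetic above can be checked directly, assuming each sample is stored as an 8-byte double:

```java
public class Main {
    public static void main(String[] args) {
        int traces = 10;
        int samplesPerSec = 100;   // one sample every 10 ms
        int secsPerHour = 3600;

        long pointsPerHour = (long) traces * samplesPerSec * secsPerHour;
        long bytesPerHour = pointsPerHour * 8; // 8 bytes per double

        System.out.println(pointsPerHour);           // prints 3600000
        System.out.println(bytesPerHour / 1000000);  // prints 28 (MB)
    }
}
```

About 28 MB/hour of raw sample data, consistent with the ~30 MB/hour estimate above once per-trace overhead is included, so an untruncated run exhausts the heap in a matter of hours.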

Quinn-Curtis Forums © 2000-2018 Quinn-Curtis, Inc. Go To Top Of Page
Powered By: Snitz Forums 2000 Version 3.4.07