dynamic arrays - bad_alloc with 200 GB of memory available in C++


I'm new to C++. I'm studying 'compressive sensing' and need to work with huge matrices; MATLAB was too slow, so I programmed the algorithm in C++.

The thing is, I store big arrays (around 100 MB - 1 GB each), roughly 20 of them. Everything works fine up to about 30 GB of memory, but when the process needs more than 40 GB it stops. I think it's a memory problem. I tested it on Linux and Windows (64-bit OS, 64-bit MinGW compilers, 200 GB RAM, Intel Xeon). Is there some limitation?

size_t tm = n * m * l;
double *x = new double[tm];

I use around 20 arrays like this one. Typical sizes are n, m ~= 1000 and l ~= 30.
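For reference, a minimal self-contained sketch of one such allocation with the typical sizes quoted above; the names n, m, l mirror the snippet, and the cast to size_t before multiplying is an assumption about how to keep the product out of 32-bit int arithmetic:

#include <cstddef>

int main() {
    int n = 1000, m = 1000, l = 30;                          // typical dimensions from the question
    std::size_t tm = static_cast<std::size_t>(n) * m * l;    // product computed in 64-bit size_t
    double *x = new double[tm];                              // ~240 MB for these sizes
    // ... fill and use x ...
    delete[] x;                                              // one delete[] per new[]
    return 0;
}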

Thank you.

20 arrays and a problem at around 40 GB of total memory use suggests the program breaks when a single array exceeds 2 GB. That should not happen: a 64-bit address space should use a 64-bit size_t for object sizes. It appears MinGW incorrectly uses a 31-bit size (i.e. losing the sign bit as well).
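A quick way to check whether the 2 GB boundary really is the culprit is to print sizeof(size_t) in the failing build together with the byte count of the array that triggers the bad_alloc. This is only a diagnostic sketch, with made-up dimensions chosen so a single array crosses 2^31 bytes:

#include <cstddef>
#include <iostream>

int main() {
    // A genuine 64-bit build should print 8; a 4 here would explain a ~2 GB ceiling.
    std::cout << "sizeof(size_t) = " << sizeof(std::size_t) << '\n';

    // Hypothetical dimensions for which one array exceeds 2 GB.
    std::size_t n = 1000, m = 1000, l = 300;
    std::size_t bytes = n * m * l * sizeof(double);
    std::cout << "one array = " << bytes << " bytes\n";   // 2,400,000,000 > 2^31
    return 0;
}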

I don't know how it allocates memory, but perhaps this is fixable by bypassing the broken allocation routine and going straight to the OS allocator, e.g. on Windows call VirtualAlloc (skip HeapAlloc, it's not designed for such large allocations).
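A minimal sketch of that approach on Windows, using illustrative dimensions taken from the question: VirtualAlloc with MEM_RESERVE | MEM_COMMIT requests the pages directly from the OS, and VirtualFree releases them.

#include <windows.h>
#include <cstddef>
#include <iostream>

int main() {
    std::size_t n = 1000, m = 1000, l = 30;
    std::size_t bytes = n * m * l * sizeof(double);

    // Reserve and commit the pages straight from the OS, bypassing the CRT heap.
    double *x = static_cast<double*>(
        VirtualAlloc(nullptr, bytes, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE));
    if (x == nullptr) {
        std::cerr << "VirtualAlloc failed, error " << GetLastError() << '\n';
        return 1;
    }

    x[0] = 1.0;   // committed pages are zero-initialised by the OS
    // ... use the array ...

    VirtualFree(x, 0, MEM_RELEASE);   // size must be 0 when using MEM_RELEASE
    return 0;
}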

