
# How to create a 4-dimensional array

Hi,
I want to create a 4-dimensional array in which the first three dimensions are fixed size and the final index runs from 0 to some N that is not known in advance.
For example, double array[500][25][10][<NOT FIXED>]. I can't create it statically, because the index sizes are too large. I have also tried a 4-dimensional vector, but it gives me two problems:
(i) the size I need to store in the 4th dimension is more than vector[0][0][0].max_size();
(ii) storing and retrieving take more time with a vector.

So please let me know if there is any other solution to store a large array in which the first three indices are FIXED and the final one is not.

Thanks.
Apr 24 '13 #1
weaknessforcats

What you describe is easily achieved by allocating your own array on the heap. There are examples in the linked article.
Apr 24 '13 #2
Hi,
Can you give some example, or a link, for creating my own array on the heap?
Apr 25 '13 #3
weaknessforcats
The example is in the article linked to in my post #2.
Apr 25 '13 #4
Banfa
On the whole, if you have optimisation switched on, storing in and retrieving from a vector is as fast as for an array.
Apr 25 '13 #5
Hi,

double array[500][25][10][<NOT FIXED>]
I am achieving this using vector<vector<vector<vector<double>>>> dBuffer(....);
As of now I am using vector[500][25][10][UnknownSize].
UnknownSize is decided at run time. It is working fine.
But I am facing 2 problems:
(i) the size I store in the 4th dimension is more than
vector[0][0][0].max_size();
(ii) storing and retrieving take more time in the vector for large indices.

Please suggest any other method.
Apr 25 '13 #6
weaknessforcats
On my system, a vector<double> can hold 62 million doubles. You have more than that?
Apr 25 '13 #7
Hi, the most I was able to store in my double vector is 536870911 elements, but I need a larger index than this.
Apr 26 '13 #8
weaknessforcats
How many doubles do you need?
Apr 26 '13 #9
Banfa
Given that a double is 8 bytes (which it normally is), 536870911 of them take about 4 GiB which, if you are using a 32-bit operating system, is likely to be the limit of the virtual memory space for your program.

If that is the case, using an array won't help, because the limit is the process memory space, not what an individual array or vector can hold.

If you are using that much data, use a file or a memory-mapped file.
Apr 26 '13 #10
Hi,
I need to create a double vector of
500*32*16*<unknown_size>. The final elements I will push at run time.
Apr 26 '13 #11
Hi Banfa,

1) If I increase the virtual memory, can I push more elements?
2) Moreover, I am not aware of memory-mapped files. Could you please give some example of how to achieve this?
Apr 26 '13 #12
weaknessforcats
Are you able to use a database engine like Oracle? I'm starting to think that what you need has more in it than one array: a database.
Apr 26 '13 #13
I can use a database engine, but it will take more time to insert data and retrieve it back again. My requirement is roughly this: I will take around 300 MB of data into a buffer, process it and convert it into double values (this will be around 3.5 GB in size), then write it into a file. But if I use a database, I have to insert every time, and at the end I have to read everything back from the database before I can write it into the file.
Apr 27 '13 #14
weaknessforcats
Except Oracle doesn't work that way, and it can handle terabyte tables.

As can, I believe, Microsoft SQL Server.

I am starting to worry that you will spend a ton of time writing a database handler rather than processing your doubles.

I have personally written a segmented database from scratch to handle tables that spanned six hard discs. It worked, but I spent a lot of time getting it to work.

BTW: During this talk of a 4D array I need you to be clear that there are only 1D arrays in both C and C++. The "dimensions" are just a way of getting the compiler to do your pointer arithmetic.
Apr 27 '13 #15
Banfa
BTW: During this talk of a 4D array I need you to be clear that there are only 1D arrays in both C and C++. The "dimensions" are just a way of getting the compiler to do your pointer arithmetic.
I have to admit to being slightly surprised that you didn't say this in your first post.

Are you sure a database is required? It sounds like the process is
1. Read Data
2. Process Data
3. Write File
If (BIG IF) the data can be processed linearly and written straight to file, then you don't actually need to store any more than the original 300 MByte of data, which isn't so bad.
Apr 29 '13 #16
weaknessforcats
Are you sure a database is required?
You may be right, but it's hard to say, since the number of doubles required is unknown. That raises the possibility that the data will exceed the maximum file size, which would force a segmented data structure. The result is spending time on things other than processing the data.
Apr 29 '13 #17
Sherin
Try This Code

```c
#include <stdio.h>

int main()
{
    int i, j, k, l, size;

    int a[2][2][2][2];

    size = 2;

    a[0][0][0][0] = 5;
    a[0][0][0][1] = 3;
    a[0][0][1][0] = 5;
    a[0][0][1][1] = 3;
    a[0][1][0][0] = 6;
    a[0][1][0][1] = 7;
    a[0][1][1][0] = 6;
    a[0][1][1][1] = 7;
    a[1][0][0][0] = 8;
    a[1][0][0][1] = 9;
    a[1][0][1][0] = 8;
    a[1][0][1][1] = 9;
    a[1][1][0][0] = 9;
    a[1][1][0][1] = 7;
    a[1][1][1][0] = 9;
    a[1][1][1][1] = 7;

    for (i = 0; i < size; i++) {
        for (j = 0; j < size; j++) {
            for (k = 0; k < size; k++) {
                for (l = 0; l < size; l++) {
                    printf("Value of a[%d][%d][%d][%d] :- %d ",
                           i, j, k, l, a[i][j][k][l]);
                    printf("\n");
                }
            }
        }
    }
    return 0;
}
```
Jan 20 '21 #18
Banfa
@Sherin, I don't think you have appreciated the problem this thread is about, which is not having a 4-dimensional array but rather processing an extremely large volume of data. Because of that, the rather trivial example code posted is of little to no use in solving the actual problem.
Jan 25 '21 #19
SwissProgrammer
@weaknessforcats

"there are only 1D arrays in both C and C++. The "dimensions" are just a way of getting the compiler to do your pointer arithmetic."

That is interesting. Instead of looking at it as multi-dimensional, maybe I should look at it as single dimensional with a potential for dimensions inside of that first dimension like a tree. Start with one dimension and get that to work, then work on one more dimension at a time getting each of those to work before going on to more of the tree.

Thanks: This is an easier way to look at it (for me).

I am working on (still at it) a shortest path algorithm (with variably changing constraints) that can handle a 1,000+ by 1,000+ grid. I was considering multi-dimensional vectors and trying to get that to work. I think that I shall go back to designing based upon a single dimension vector now; and potentially use other dimensions for variable conditions within that first dimension's elements.

Maybe as a single dimension like this?
vector <std::wstring> Grid_X_Y (1000000);
to start. Then I can concatenate the elements as data is added or changed.

Or maybe with multiple dimensions like this?
vector<std::vector<std::wstring>> Grid_X_Y(1000000, std::vector<std::wstring>(10));
to start. Then I can have up to 10 values per each of the first dimension's elements. This might make it easier to choose a value later, rather than having to parse each wstring as in the single-dimension example.

Maybe this process might help the OP.

Thank you weaknessforcats.

Thank you bytes.com.

Glory to God.
May 18 '21 #20