Hi all,
Having a problem with addressing large amounts of memory. I have a
simple piece of code here that is meant to allocate a large piece of
memory on a ppc64 machine. The code is:
/*
Test to see what happens when we try to allocate a massively huge
piece of memory.
*/
#include <iostream>
#include <string>
#include <stdexcept>

using namespace std;

int main(int argc, char** argv) {
    cout << "Attemping to allocate.." << endl;
    const int ROWS = 635000;
    const int COLS = 2350;
    // Allocate.
    try {
        int** test = new int*[ROWS];
        for (int i = 0; i < ROWS; i++) {
            test[i] = new int[COLS];
            for (int j = 0; j < COLS; j++) {
                test[i][j] = 0;
            }
        }
        cout << "Allocation succeeded!" << endl;
        cout << "Press a key to deallocate and continue.." << endl;
        string blank;
        getline(cin, blank);
        // Deallocate.
        for (int k = 0; k < ROWS; k++) {
            delete[] test[k];
        }
        delete[] test;
        cout << "Deallocation completed!" << endl;
        cout << "Press a key to terminate.." << endl;
        getline(cin, blank);
    }
    catch (bad_alloc& e) {
        cout << "Allocation failed.." << endl;
    }
    return 0;
}
If I set ROWS and COLS to 5000 and 5000, it works just fine. However,
if I set ROWS to 635000 and COLS to 2350, it will give me the
following error upon deallocation:
HugeMemory.exe(29468) malloc: *** Deallocation of a pointer not
malloced: 0x20afd2000; This could be a double free(), or free() called
with the middle of an allocated block; Try setting environment
variable MallocHelp to see tools to help debug
Note that the allocation step succeeds, and that I only receive this
error after allowing the code to deallocate the array.
Any ideas?
Thanks,
Ryan

we********@gmail.com wrote:
[snip: original post quoted in full]
Looks suspect; has the machine got nigh on 6 GB of virtual memory
available? Make sure your operator new behaves correctly by attempting
to allocate way more than the machine can provide.
--
Ian Collins.
On May 12, 2:02 am, welch.r...@gmail.com wrote:
[snip: original post quoted in full]
There are a lot of allocations there.
Looks to me like malloc internally has an overflow in its pointer arithmetic.
Perhaps you are using a 32-bit malloc on a 64-bit setup somehow?
Try the following and see if it works:
const unsigned rows = 635000;
const unsigned cols = 2350;
int (*p)[cols] = new int[rows][cols];
for (unsigned i = 0; i < rows; ++i) {
    for (unsigned j = 0; j < cols; ++j)
        p[i][j] = 0;
}
cin.get();
delete[] p;
Greetings, Branimir
On May 11, 8:24 pm, Ian Collins <ian-n...@hotmail.com> wrote:
welch.r...@gmail.com wrote:
[snip: original post quoted in full]
Looks suspect; has the machine got nigh on 6 GB of virtual memory
available? Make sure your operator new behaves correctly by attempting
to allocate way more than the machine can provide.
--
Ian Collins.
Hmm.. the machine has 8 GB of RAM, so that probably isn't the issue.
I'll try maxing it out to see what happens.
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
On May 12, 2:02 am, welch.r...@gmail.com wrote:
[snip: original post quoted in full]
There are a lot of allocations there.
Looks to me like malloc internally has an overflow in its pointer arithmetic.
Perhaps you are using a 32-bit malloc on a 64-bit setup somehow?
Try the following and see if it works:
const unsigned rows = 635000;
const unsigned cols = 2350;
int (*p)[cols] = new int[rows][cols];
for (unsigned i = 0; i < rows; ++i) {
    for (unsigned j = 0; j < cols; ++j)
        p[i][j] = 0;
}
cin.get();
delete[] p;
Greetings, Branimir
I can't seem to get that code to compile; it complains about the line
before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?
I think you're probably right, it has something to do with pointer
arithmetic. I'm just not sure what. The malloc() failure is happening
in the for loop where I'm deallocating each row of the array; I've
figured out that much. Beyond that, I'm not sure.
I've tried the following compiler options too, but they don't warn me
of anything:
g++ -o HugeMemory.exe -O3 -mcpu=powerpc64 -arch ppc64 -faltivec -Wall -Wconversion HugeMemory.cpp

we********@gmail.com wrote:
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
> There are a lot of allocations there. Looks to me like malloc
> internally has an overflow in its pointer arithmetic. Perhaps you are
> using a 32-bit malloc on a 64-bit setup somehow?
> [snip: suggested code]
I can't seem to get that code to compile; it complains about the line
before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?
The code is fine, with the exception of an integer overflow warning from
gcc.
I think you're probably right, it has something to do with pointer
arithmetic. I'm just not sure what. The malloc() failure is happening
in the for loop where I'm deallocating each row of the array; I've
figured out that much. Beyond that, I'm not sure.
Update your code to scan the rows for duplicate addresses. If you find
one, something is wrong!
--
Ian Collins.
On May 12, 5:37 am, welch.r...@gmail.com wrote:
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
On May 12, 2:02 am, welch.r...@gmail.com wrote:
[snip: original post and suggested code]
I can't seem to get that code to compile; it complains about the line
before the for loop. I'm using gcc 4.0.1 for apple/darwin. Any ideas?
What is the error message?
Greetings, Branimir.
On 2007-05-12 02:02, we********@gmail.com wrote:
[snip: original post quoted in full]
I have absolutely no idea, but you could try to make the code a bit
simpler by allocating everything in one large block instead:
#include <iostream>
#include <string>
#include <stdexcept>

using namespace std;

int main() {
    cout << "Attemping to allocate.." << endl;
    const int ROWS = 635000;
    const int COLS = 2350;
    // Allocate.
    try {
        int* test = new int[ROWS * COLS];
        for (int i = 0; i < ROWS * COLS; i++) {
            test[i] = 0;
        }
        cout << "Allocation succeeded!" << endl;
        cout << "Press a key to deallocate and continue.." << endl;
        string blank;
        getline(cin, blank);
        // Deallocate.
        delete[] test;
        cout << "Deallocation completed!" << endl;
        cout << "Press a key to terminate.." << endl;
        getline(cin, blank);
    }
    catch (bad_alloc& e) {
        cout << "Allocation failed.." << endl;
    }
    return 0;
}
Do you still get the same error (or some other)? If you do there's
probably something wrong with your standard library.
--
Erik Wikström
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
On May 12, 2:02 am, welch.r...@gmail.com wrote:
[snip: original post quoted in full]
There are a lot of allocations there.
Looks to me like malloc internally has an overflow in its pointer arithmetic.
Perhaps you are using a 32-bit malloc on a 64-bit setup somehow?
Try the following and see if it works:
const unsigned rows = 635000;
const unsigned cols = 2350;
int (*p)[cols] = new int[rows][cols];
for (unsigned i = 0; i < rows; ++i) {
    for (unsigned j = 0; j < cols; ++j)
        p[i][j] = 0;
}
cin.get();
delete[] p;
Greetings, Branimir
This is what happens:
HugeMemory2.cpp:4: error: expected unqualified-id before 'for'
HugeMemory2.cpp:4: error: expected constructor, destructor, or type
conversion before '<' token
HugeMemory2.cpp:4: error: expected unqualified-id before '++' token
HugeMemory2.cpp:9: error: expected constructor, destructor, or type
conversion before '.' token
HugeMemory2.cpp:10: error: expected unqualified-id before 'delete'
I thought maybe it was because there's no 'int' after 'unsigned' but
that didn't help. :(
On May 12, 6:09 am, Erik Wikström <Erik-wikst...@telia.com> wrote:
On 2007-05-12 02:02, welch.r...@gmail.com wrote:
[snip: original post and suggested single-block code]
Do you still get the same error (or some other)? If you do there's
probably something wrong with your standard library.
--
Erik Wikström
Nope! That code succeeds. However, that code should require around 6
GB of RAM, correct? If I use top to check the memory usage of the
process, I see:
3688 HugeMemory 0.0% 0:39.81 1 13 34 232K 6.51M 1.56G 1.63G
So it's not even coming close.. or am I reading that incorrectly?
Quite strange..
On Sat, 12 May 2007 09:43:27 -0700, welch.ryan wrote:
On May 12, 6:09 am, Erik Wikström <Erik-wikst...@telia.com> wrote:
> [snip: quoted suggestion and single-block code]
Nope! That code succeeds. However, that code should require around 6 GB
of RAM, correct? If I use top to check the memory usage of the process,
I see:
3688 HugeMemory 0.0% 0:39.81 1 13 34 232K 6.51M 1.56G 1.63G
So it's not even coming close.. or am I reading that incorrectly? Quite
strange..
Could be a 32-bit overflow in top or somewhere else. The reported numbers
are almost exactly 4G short of what one would expect.
--
Markus Schoder
On May 12, 12:59 pm, Markus Schoder <a3vr6dsg-use...@yahoo.de> wrote:
On Sat, 12 May 2007 09:43:27 -0700, welch.ryan wrote:
On May 12, 6:09 am, Erik Wikström <Erik-wikst...@telia.com> wrote:
[snip: quoted suggestion and single-block code]
Nope! That code succeeds. However, that code should require around 6 GB
of RAM, correct? If I use top to check the memory usage of the process,
I see:
3688 HugeMemory 0.0% 0:39.81 1 13 34 232K 6.51M 1.56G 1.63G
So it's not even coming close.. or am I reading that incorrectly? Quite
strange..
Could be a 32bit overflow in top or somewhere else. The reported numbers
are almost exactly 4G short of what one would expect.
--
Markus Schoder
Interesting.. okay, so let's suppose top is reporting it incorrectly,
and that code actually does truly successfully allocate all of that
memory. Then the question is, why is it that my original code (using a
multidimensional approach) fails, yet allocating it as one large block
seems to work?
On May 11, 11:45 pm, Ian Collins <ian-n...@hotmail.com> wrote:
welch.r...@gmail.com wrote:
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
[snip: quoted exchange]
Update your code to scan the rows for duplicate addresses. If you find
one, something is wrong!
--
Ian Collins.
Okay, I wrote something that I *think* would detect duplicate
addresses. Don't laugh at the implementation..
#include <iostream>
#include <string>
#include <stdexcept>
#include <map>

using namespace std;

int main(int argc, char** argv) {
    cout << "Attempting to allocate.." << endl; // I can spell correctly now..
    const int ROWS = 635000;
    const int COLS = 2350;
    // Keep track of all used addresses.
    map<int*, int> addresses;
    // Allocate.
    try {
        int** test = new int*[ROWS];
        for (int i = 0; i < ROWS; i++) {
            test[i] = new int[COLS];
            addresses[ test[i] ] += 1;
            for (int j = 0; j < COLS; j++) {
                test[i][j] = 0;
            }
        }
        // Check for duplicate addresses.
        cout << "Checking for duplicate addresses.." << endl;
        map<int*, int>::iterator iter = addresses.begin();
        map<int*, int>::iterator end = addresses.end();
        while (iter != end) {
            if (iter->second > 1) {
                cout << "--Duplicate address detected: " << iter->first << endl;
            }
            iter++;
        }
        cout << "Allocation succeeded!" << endl;
        cout << "Press a key to deallocate and continue.." << endl;
        string blank;
        getline(cin, blank);
        // Deallocate.
        for (int k = 0; k < ROWS; k++) {
            int** ptr = test + k;
            delete[] ptr;
        }
        delete[] test;
        cout << "Deallocation completed!" << endl;
        cout << "Press a key to terminate.." << endl;
        getline(cin, blank);
    }
    catch (bad_alloc& e) {
        cout << "Allocation failed.." << endl;
    }
    return 0;
}
That code detects no duplicate addresses.
One additional thing I've noticed (I don't know if this helps): the
addresses in these error messages are about 8 apart from each other,
for example:
HugeMemory.exe(537) malloc: *** Deallocation of a pointer not
malloced: 0x2009350; This could be a double free(), or free() called
with the middle of an allocated block; Try setting environment
variable MallocHelp to see tools to help debug
HugeMemory.exe(537) malloc: *** Deallocation of a pointer not
malloced: 0x2009358; This could be a double free(), or free() called
with the middle of an allocated block; Try setting environment
variable MallocHelp to see tools to help debug
HugeMemory.exe(537) malloc: *** Deallocation of a pointer not
malloced: 0x2009360; This could be a double free(), or free() called
with the middle of an allocated block; Try setting environment
variable MallocHelp to see tools to help debug
On May 12, 6:35 pm, welch.r...@gmail.com wrote:
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
[snip: quoted code and compile errors]
Strange. Are you sure you entered the code correctly?
I guess that error is triggered by something else in your code,
as I tried it with Comeau online, g++ 3.4.4 and the latest VC++.
Greetings, Branimir.

we********@gmail.com wrote:
>
Interesting.. okay, so let's suppose top is reporting it incorrectly,
and that code actually does truly successfully allocate all of that
memory. Then the question is, why is it that my original code (using a
multidimensional approach) fails, yet allocating it as one large block
seems to work?
The evidence is building a good case for a bug in your allocator; time
to try a tool/platform-specific forum to see if it is a known problem.
--
Ian Collins.
On May 12, 4:49 pm, Ian Collins <ian-n...@hotmail.com> wrote:
welch.r...@gmail.com wrote:
Interesting.. okay, so let's suppose top is reporting it incorrectly,
and that code actually does truly successfully allocate all of that
memory. Then the question is, why is it that my original code (using a
multidimensional approach) fails, yet allocating it as one large block
seems to work?
The evidence is building a good case for a bug in your allocator; time
to try a tool/platform-specific forum to see if it is a known problem.
--
Ian Collins.
I'm starting to think you're right.. any suggestions for such a forum?
On May 12, 4:10 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
On May 12, 6:35 pm, welch.r...@gmail.com wrote:
On May 11, 9:15 pm, Branimir Maksimovic <b...@hotmail.com> wrote:
[snip: quoted code, compile errors, and reply]
Maybe it's a bad line ending or something. I can't find my handy dandy
perl script to fix them..
<we********@gmail.com> wrote in message
news:11*********************@q75g2000hsh.googlegroups.com...
[snip: original post quoted in full]
How long does it take for your code to run? I'm asking because the
code you posted seems very inefficient.
You are invoking "new[]" 635,000 times. An alternative is to make only two
calls to "new[]": one for the row pointers, and a second to allocate
the pool of memory for the int data. Then, in a loop, you point each
row pointer at the right place in the int data pool.
Not only will this more than likely bypass your problem with the allocator,
your code should also see a significant increase in speed, both in
allocation and in deallocation (at least in this area of code). You
should test this, of course, but I would be very surprised if there
isn't a significant speed increase.
Here are the internals of your code snippet rewritten, making only two
calls to new[] and then two calls to delete[] to deallocate the memory.

int* pool;
int** test = new int*[ROWS];  // allocate row pointers
pool = new int[ROWS * COLS];  // allocate memory pool for data

for (int i = 0; i < ROWS; i++)
{
    test[i] = pool;
    pool += COLS;
}

// Deallocate.
delete[] test[0];  // deallocate pool
delete[] test;     // deallocate row pointers.
- Paul
You're right, that's definitely way more efficient. In my case,
though, I perform one massive allocation initially and then the code
can run for hours to weeks, so the initial allocation time wasn't a
real issue for me.
However, the interesting thing is that I don't get that malloc() error
anymore upon deletion. I still don't know why it was happening in the
first place, but at least this method works.
Thanks for the help, everyone; I really appreciate it!
Cheers,
Ryan