
STL string append function on SLES9 (32-bit and 64-bit machines)

Hi

I am facing a problem with the STL string append on our 32-bit SLES9 machine.
(It does not happen on SLES9 64-bit machines or on SuSE 10.)

When we read a 179 MB file into memory, the call
m_oString.append(awchDestinationStr, a_nTotalNumOfUnicodes) crashes.

It leads to a crash in operator new() in libstdc++ on the 32-bit SLES9
machine, but works fine on SLES9 64-bit machines and on SuSE 10.

When we read a smaller file (I tried a 30 MB file) it works.

Is it a problem with the library or with the architecture (32 bit vs. 64 bit)?

Mohan


Jul 13 '06 #1
3 Replies


* Mohan:
> I am facing a problem with the STL string append on our 32-bit SLES9 machine.
> (It does not happen on SLES9 64-bit machines or on SuSE 10.)
>
> When we read a 179 MB file into memory, the call
> m_oString.append(awchDestinationStr, a_nTotalNumOfUnicodes) crashes.
>
> It leads to a crash in operator new() in libstdc++ on the 32-bit SLES9
> machine, but works fine on SLES9 64-bit machines and on SuSE 10.
>
> When we read a smaller file (I tried a 30 MB file) it works.
>
> Is it a problem with the library or with the architecture (32 bit vs. 64 bit)?

It's probably a problem with your code. When you have the possibility
of memory exhaustion, use std::set_new_handler to set up a failure
handler, and let that one log the incident and exit cleanly (not much
you can do). Alternatively, catch the std::bad_alloc exception
somewhere where the program can make further progress.
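
For example, a minimal sketch (the handler, the message text, and the
deliberately large append are illustrative only, not taken from the
original code):

#include <cstdlib>
#include <iostream>
#include <new>
#include <string>

// Installed via std::set_new_handler: called when operator new fails.
void on_out_of_memory()
{
    std::cerr << "out of memory, exiting\n";
    std::exit(EXIT_FAILURE);
}

int main()
{
    std::set_new_handler(on_out_of_memory);

    // Alternative (without installing the handler above): catch
    // std::bad_alloc at a point where the program can still make progress.
    try
    {
        std::string s;
        s.append(179u * 1024 * 1024, 'x');   // deliberately large append
    }
    catch (std::bad_alloc const&)
    {
        std::cerr << "allocation failed, will fall back to chunked processing\n";
    }
}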

Make sure that you don't do needless copying of data. 180 MB is already
in the neighbourhood of 1/10 of the available address space (or perhaps
1/20). It does not take many copies, e.g. temporaries introduced by copy
construction, before memory is exhausted.
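
One concrete way to cut out a class of hidden copies, assuming the final
length is known up front, is to reserve it before appending, so the string
never has to reallocate while it grows (each reallocation briefly keeps
both the old and the new buffer alive). This is a sketch only; I am guessing
that m_oString is a std::wstring, and the wrapper function is mine:

#include <cstddef>
#include <string>

// Hypothetical wrapper around the append from the original post.
std::wstring build_string(const wchar_t* awchDestinationStr,
                          std::size_t a_nTotalNumOfUnicodes)
{
    std::wstring m_oString;                    // assumed to be a wide string
    m_oString.reserve(a_nTotalNumOfUnicodes);  // one allocation, no regrowth
    m_oString.append(awchDestinationStr, a_nTotalNumOfUnicodes);
    return m_oString;
}

Note also that wchar_t is four bytes with glibc, so if the file's characters
are widened on the way in, the in-memory copy can be several times the size
of the file on disk, which makes the address-space arithmetic above even
tighter.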

Find the data size that the program is comfortable with and document it,
or if you need to handle data of any size, do the right thing and
process it one chunk at a time, with most of it on disk at any time.

Having said all this, please consult the FAQ about how to post.

Especially the FAQ item "How do I post a question about code that
doesn't work correctly?", currently at <url:
http://www.parashift.com/c++-faq-lite/how-to-post.html#faq-5.8>.

--
A: Because it messes up the order in which people normally read text.
Q: Why is it such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
Jul 13 '06 #2

"Mohan" <mo**********@in.bosch.comwrote in message
news:e9**********@ns2.fe.internet.bosch.com...
> Hi
>
> I am facing a problem with the STL string append on our 32-bit SLES9 machine.
> (It does not happen on SLES9 64-bit machines or on SuSE 10.)
>
> When we read a 179 MB file into memory, the call
> m_oString.append(awchDestinationStr, a_nTotalNumOfUnicodes) crashes.
>
> It leads to a crash in operator new() in libstdc++ on the 32-bit SLES9
> machine, but works fine on SLES9 64-bit machines and on SuSE 10.
>
> When we read a smaller file (I tried a 30 MB file) it works.
>
> Is it a problem with the library or with the architecture (32 bit vs. 64 bit)?

It sounds like you're running out of memory, especially with new() throwing.
Do the machines have the same amount of memory?

180 MB is rather large to load into memory at one time anyway. It would
probably be better to redesign the application so that it does not have to
load the entire file at once.
Jul 13 '06 #3

"Mohan" <mo**********@in.bosch.comwrote:
> I am facing a problem with the STL string append on our 32-bit SLES9
> machine. (It does not happen on SLES9 64-bit machines or on SuSE 10.)
>
> When we read a 179 MB file into memory,

How much memory do you have? I hope a lot, because you're
stressing the hell out of it. Do you have to read the whole
file in at once? Can't you process it a bit at a time?

> the call m_oString.append(awchDestinationStr, a_nTotalNumOfUnicodes) crashes.
>
> It leads to a crash in operator new() in libstdc++ on the 32-bit SLES9
> machine, but works fine on SLES9 64-bit machines and on SuSE 10.
>
> When we read a smaller file (I tried a 30 MB file) it works.

Well, then, there you go. Use smaller files, break larger files into
smaller ones, or process files incrementally.

Suggestion:

#include <fstream>
#include <vector>

int main()
{
    std::ifstream in("input.dat", std::ios::binary);    // file to be processed (placeholder name)
    std::ofstream out("output.dat", std::ios::binary);  // new output file (placeholder name)
    std::vector<char> chunk(1024 * 1024);               // 1 MB buffer
    while (in)                                          // input stream still good
    {
        in.read(&chunk[0], chunk.size());               // read small chunk of data into RAM
        std::streamsize n = in.gcount();                // bytes actually read
        if (n == 0) break;                              // eof, nothing left
        // ... process chunk[0 .. n) here ...
        out.write(&chunk[0], n);                        // append processed data to output file
    }
}   // files close automatically when the streams go out of scope

> Is it a problem with the library or with the architecture (32 bit vs. 64 bit)?

More likely a problem with inflicting cruel and unusual punishment
on your system's resources.
--
Cheers,
Robbie Hatley
East Tustin, CA, USA
lone wolf intj at pac bell dot net
(put "[usenet]" in subject to bypass spam filter)
http://home.pacbell.net/earnur/
Jul 14 '06 #4
