Bytes IT Community

"Aborted" (while reading file ?)

I am trying to read a file full of numbers separated by spaces (and then
do some cool stuff with it).
My input file looks like this:

1 0 1 0 1 0 1
1 1 0 0 1 1 0

(The number of columns and the number of lines are arbitrary, but all
lines have the same number of columns.)
Now, when I run my program (compiled with g++) I get "Aborted" at the
very end. All my outputs are written to my output file perfectly; the
program crashes only after it has done its job.

Now, tracing through it, I found that it crashes because it goes
through one extra iteration of the loop (when it isn't supposed to) ..
see "HEY HEY HEY".

Here's all the relevant code (hopefully):

/* open input vector file */
ifstream inputfile ("input.dat");

/* create an outputfile */
string outputfile = argv[1];
outputfile = outputfile + ".results";
ofstream outfile(outputfile.c_str());

/* setting buffer, tokens and stimulus for file parsing */
char buf[255];
char *tokens;
vector <int> stimulus;

/* MAIN EVALUATION LOOP FOR EACH LINE OF INPUTS STARTS HERE */

while (!inputfile.eof())
{
    cout << "HEY HEY HEY\n";
    /* clear the stimulus vector */
    stimulus.clear();

    /* read first line of input */
    inputfile.getline(buf, 255);

    /* split line by spaces into the stimulus vector */
    tokens = strtok(buf, " ");
    while (tokens != NULL)
    {
        stimulus.push_back(atoi(tokens));
        tokens = strtok(NULL, " ");
    }

    /* initialize NET values to -1 */
    initializenet(-1);

    /* set all inputs from stimulus */
    for (int i = 0; i < inputnets.size(); i++)
    {
        NET[inputnets.at(i)] = stimulus.at(i);
    }

    /* go through the evaluate loop */
    evaluateloop();

    /* write evaluated outputs to file */
    for (int i = 0; i < outputnets.size(); i++)
    {
        outfile << NET[outputnets.at(i)] << " ";
    }
    outfile << endl;
}
return 0;
}

Like I said earlier, the code goes through all my lines of input
vectors, evaluates them, and writes the correct output to my "outfile".
But then it shamelessly crashes because it goes through the loop one
extra time (I know this because it printed "Aborted" after one more
"HEY HEY HEY", and my output file looks correct).

By the way, my input file will not have an extra blank line after the
last line, so for example:

--------------file starts here--------------
1 0 1 0 1
1 1 0 0 1
--------------file ends here----------------

I am also probably screwing up some pointer stuff (haha, that's
expected).

So what's causing my program to die?

Sep 17 '05 #1
2 Replies


ma******@gmail.com wrote:
[quoted question snipped]

It must be the commonest newbie mistake:
while (!inputfile.eof())
{
    cout << "HEY HEY HEY\n";
    /* clear the stimulus vector */
    stimulus.clear();

    /* read first line of input */
    inputfile.getline(buf, 255);

This is the wrong way to read a file.
while (inputfile.getline(buf, 255))
{
    cout << "HEY HEY HEY\n";
    /* clear the stimulus vector */
    stimulus.clear();

    ...

This is the right way.

eof() tells you why the last read failed. It does not tell you that the
next read is going to fail. That is why you are going round the loop one
time too many: on the extra pass getline() reads nothing, stimulus stays
empty, and stimulus.at(i) throws std::out_of_range, which is the uncaught
exception that aborts your program.

john
Sep 17 '05 #2

P: n/a
That did it.
Thanks. :-)

Sep 18 '05 #3

This discussion thread is closed; replies have been disabled.