Bytes IT Community

Regex on whole (large) text file

Hi,

I'm sorry if these questions are trivial, but I've searched the net and
haven't had any luck finding the information I need.

I need to perform some regular expression search and replace on a large
text file. The patterns I need to match are multi-line, so I can't do it
one line at a time. Instead, I currently read the entire text file into
a string using the code below.

File fin = new File("input.txt");
FileInputStream fis = new FileInputStream(fin);
BufferedReader in = new BufferedReader(new InputStreamReader(fis));
String aLine = null;
String theText = "";
while ((aLine = in.readLine()) != null) {
    theText = theText + aLine + "\n";
}

The problem with this is that the first couple of thousand lines read in
very fast, but then it gets slower and slower; by the time we approach
line 4000, each individual line takes noticeably long to read.

Is there a better way to read in an entire text file into a string?

Is storing the entire text file in a string a bad idea? And if so, what
are the alternatives?

Is it possible to perform multiple-line regular expressions on a text
file without loading the whole text file into memory?

Thanks in advance,
Rune
Jul 17 '05 #1
8 Replies


Rune Johansen wrote:
[...]
Is there a better way to read in an entire text file into a string?


Absolutely. Instead of using + to concatenate strings, you should use a
StringBuffer and convert to a String at the end:

File fin = new File("input.txt");
FileInputStream fis = new FileInputStream(fin);
BufferedReader in = new BufferedReader(new InputStreamReader(fis));
String aLine = null;
StringBuffer theText = new StringBuffer((int)fin.length());
while ((aLine = in.readLine()) != null) {
    // Question: Why are you converting all the line breaks to \n?
    theText.append(aLine).append("\n");
}
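For the multi-line matching itself, once the file is in one String, here is a minimal sketch using java.util.regex with Pattern.DOTALL so that `.` also matches line breaks. The `<tag>` pattern and the input are invented placeholders, since we don't know your actual patterns:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MultiLineReplace {
    // Replace every <tag>...</tag> span, even when it crosses line
    // breaks; DOTALL makes '.' match '\n' as well.
    static String collapse(String text) {
        Pattern p = Pattern.compile("<tag>.*?</tag>", Pattern.DOTALL);
        Matcher m = p.matcher(text);
        return m.replaceAll("[snipped]");
    }

    public static void main(String[] args) {
        String text = "before <tag>line one\nline two</tag> after";
        System.out.println(collapse(text)); // before [snipped] after
    }
}
```

The reluctant `.*?` keeps one match from swallowing everything between the first opening tag and the last closing tag.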

Ray

--
XML is the programmer's duct tape.
Jul 17 '05 #2

Raymond DeCampo wrote:
Absolutely. Instead of using + to concatenate strings, you should
use a StringBuffer and convert to a String at the end
Thanks a lot! This indeed solves the problem.
Question: Why are you converting all the line breaks to \n?


How do I preserve the line breaks? Before I added the \n, the whole
string, when written to a file, was in a single line.

Rune
--
3D images and anims, include files, tutorials and more:
rune|vision: http://runevision.com **updated Apr 27**
POV-Ray Ring: http://webring.povray.co.uk
Jul 17 '05 #3


"Rune Johansen" <rune[insert_current_year_here]@runevision.com> wrote in
message news:Ts********************@news000.worldonline.dk ...
[...]
How do I preserve the line breaks? Before I added the \n, the whole
string, when written to a file, was in a single line.


Read characters (or bytes) instead of lines. Reading lines is useless unless
you want to process individual lines.

Silvio Bierman
Jul 17 '05 #4

Silvio Bierman wrote:
Read characters (or bytes) instead of lines. Reading lines
is useless unless you want to process individual lines.


Okay, I now read characters instead, using the following method:

public static String readTextFile(String filename) {
    try {
        File fin = new File(filename);
        FileInputStream fis = new FileInputStream(fin);
        BufferedReader in = new BufferedReader(new InputStreamReader(fis));
        char[] chrArr = new char[(int)fin.length()];
        while (in.ready() == false) {}
        in.read(chrArr);
        in.close();
        return new String(chrArr);
    }
    catch (FileNotFoundException e) { return ""; }
    catch (IOException e) { return ""; }
}

Except for the poor exception handling, is there anything obvious that
could be improved here?

Rune
Jul 17 '05 #5


"Rune Johansen" <rune[insert_current_year_here]@runevision.com> wrote in
message news:65********************@news000.worldonline.dk ...
[...]
Except for the poor exception handling, is there anything obvious that
could be improved here?


Rune,

You could drop the BufferedReader and read from the InputStreamReader
directly. This will be somewhat faster. In cases where you would be
processing the file as a character stream you should use the
BufferedReader, though.

Silvio Bierman
Jul 17 '05 #6

Silvio Bierman wrote:
[...]

Rune,

You could drop the BufferedReader and read from the InputStreamReader
directly. This will be somewhat faster. In cases where you would be
processing the file as a character stream you should use the
BufferedReader, though.

Silvio Bierman


I definitely disagree with that. My understanding is that
FileInputStream, FileOutputStream, FileReader and FileWriter are not
buffered and will go to the file system for every byte/character. So
they should almost always be wrapped with the appropriate buffered stream.

I would however, ditch the FileInputStream/InputStreamReader combination
in favor of FileReader.

The other potential issue is that Reader.read(char[]) is not guaranteed
to fill the array (although in practice I think it usually does). So the
paranoid way to do this would be a loop that ensures that all the
desired characters are read.

Finally, I don't think the while loop you have adds any value and will
just eat CPU cycles if it does anything.

And speaking of finally, you should use a finally clause to close your
streams.
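Putting those two points together, a minimal sketch of the read loop plus a finally clause; the 8 KB buffer size is an arbitrary choice:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;

public class ReadWholeFile {
    // Read the entire file into a String. The loop appends only the
    // characters each read() actually returned, and finally guarantees
    // the reader is closed even if an IOException is thrown mid-read.
    static String readTextFile(String filename) throws IOException {
        Reader in = new BufferedReader(new FileReader(filename));
        try {
            StringBuffer sb = new StringBuffer();
            char[] buf = new char[8192];   // arbitrary chunk size
            int n;
            while ((n = in.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
            return sb.toString();
        } finally {
            in.close();
        }
    }
}
```

This also sidesteps sizing the array from fin.length(), which counts bytes, not characters, and can disagree with the decoded length for multi-byte encodings.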

Ray

--
XML is the programmer's duct tape.
Jul 17 '05 #7


"Raymond DeCampo" <rd******@spam.twcny.spam.rr.spam.com.spam> wrote in
message news:52********************@twister.nyroc.rr.com.. .
[...]

I definitely disagree with that. My understanding is that
FileInputStream, FileOutputStream, FileReader and FileWriter are not
buffered and will go to the file system for every byte/character. So
they should almost always be wrapped with the appropriate buffered stream.

I would however, ditch the FileInputStream/InputStreamReader combination
in favor of FileReader.

The other potential issue is that InputStream.in() is not guaranteed to
fill the array (although in practice I think it usually does). So the
paranoid way to do this would be in a loop that ensures that all the
desired characters are read.

Finally, I don't think the while loop you have adds any value and will
just eat CPU cycles if it does anything.

And speaking of finally, you should use a finally clause to close your
streams.

Ray

--
XML is the programmer's duct tape.


Raymond,

The non-buffered streams and readers do not go to the filesystem for
every byte/character, but for every read action instead. If you plan to
read, say, 1M bytes in a single read, a plain stream will do a single
filesystem-level read, where a buffered reader will read multiples of
its buffer size until the 1M bytes are read. If the raw variants were so
dumb, it would be impossible for the buffered ones to use them
efficiently.

As I said, if you intend to read single (or very small counts of)
bytes/characters frequently, the buffered variants will group the reads
and therefore give better performance.

It is a common misconception that you should always use buffered
streams/readers. If this were true, the plain ones would probably have
been left out of the API.
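That difference can be made visible with a small counting wrapper. Everything below is an illustrative sketch (the class name is invented, and a StringReader stands in for a real file):

```java
import java.io.BufferedReader;
import java.io.FilterReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ReadCounter extends FilterReader {
    // Counts how many read calls actually reach the underlying reader.
    int calls = 0;

    ReadCounter(Reader in) { super(in); }

    public int read() throws IOException {
        calls++;
        return super.read();
    }

    public int read(char[] cbuf, int off, int len) throws IOException {
        calls++;
        return super.read(cbuf, off, len);
    }

    public static void main(String[] args) throws IOException {
        String data = new String(new char[100000]);

        // Char-at-a-time on the plain reader: one underlying call per
        // character, plus one more to see end-of-stream.
        ReadCounter plain = new ReadCounter(new StringReader(data));
        while (plain.read() != -1) {}
        System.out.println("plain: " + plain.calls + " calls");

        // The same char-at-a-time loop behind a BufferedReader: the
        // buffer groups them into a handful of large underlying reads.
        ReadCounter counted = new ReadCounter(new StringReader(data));
        BufferedReader buffered = new BufferedReader(counted);
        while (buffered.read() != -1) {}
        System.out.println("buffered: " + counted.calls + " calls");
    }
}
```

A single large read(char[]) on the plain reader would also reach the source only a handful of times, which is Silvio's point: buffering pays off for many small reads, not for one big one.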

Regards,

Silvio Bierman
Jul 17 '05 #8

Silvio Bierman wrote:
[...]
It is a common misconception that you should always use buffered
streams/readers.

Hmm, that makes sense. Thanks for the clarification.

Ray

--
XML is the programmer's duct tape.
Jul 17 '05 #9

This discussion thread is closed.