How to limit use of virtual memory in my Win application?

VM
How can I limit the use of the PC's virtual memory? I'm running a process
that basically takes a txt file and loads it to a datatable. The problem is
that the file is over 400,000 lines long (77 MB) and after a while I get the
Windows message saying that the virtual memory's getting really low. Plus
the machine gets really sluggish (with multi-threading). Is it possible to
use the virtual memory until it reaches a certain limit and then use HDD
space?

Thanks.
Nov 16 '05 #1
As far as I know, there is no way to manipulate Windows virtual memory from within an application. Virtual memory is handled by the operating system, which decides what is stored there. If your application is taking large amounts of RAM, you may want to raise your app's minimum RAM requirement, or read the file in chunks rather than looping through all 400,000+ lines at once (a rough sketch is below). One interesting thing to note is how you are accessing the file. Which of the stream classes are you using?
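For example, something roughly like this would pull in only a block of lines at a time instead of the whole file; the method name, table name, and chunk parameters are just illustrations:

<code>

// Rough sketch: read only a block of lines from the text file instead of all
// 400,000+ at once. startLine and maxLines are hypothetical parameters.
private DataTable LoadChunk(string fileName, int startLine, int maxLines)
{
    DataTable table = new DataTable("AuditChunk"); // placeholder table name
    using (StreamReader sr = new StreamReader(fileName))
    {
        string line;
        int lineNumber = 0;
        while ((line = sr.ReadLine()) != null && lineNumber < startLine + maxLines)
        {
            if (lineNumber >= startLine)
            {
                DataRow row = table.NewRow();
                // Split 'line' and store it in the appropriate fields of 'row'.
                table.Rows.Add(row);
            }
            lineNumber++;
        }
    }
    return table;
}

</code>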

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:eF*************@TK2MSFTNGP11.phx.gbl...
How can I limit the use of the PC's virtual memory? I'm running a process
that basically takes a txt file and loads it to a datatable. The problem
is
that the file is over 400,000 lines long (77 MB) and after a while I get
the
Windows message saying that the virtual memory's getting really low. Plus
the machine gets really sluggish (with multi-threading). Is it possible to
use the virtual memory until it reaches a certain limit and then use HDD
space?

Thanks.

Nov 16 '05 #2
VM
Thanks for your reply. I'm using the StreamReader class.

This is how I am currently doing it:
private DataTable LoadFile(string sFileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");
    StreamReader sr = new StreamReader(sFileName);
    string sAuditRecord = sr.ReadLine();

    while (sAuditRecord != null)
    {
        DataRow rowAudit = DT_Audit.NewRow();
        // Split sAuditRecord and store it in the appropriate fields of rowAudit
        DT_Audit.Rows.Add(rowAudit);
        sAuditRecord = sr.ReadLine();
    }
    sr.Close();
    return DT_Audit;
}

"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:uj**************@TK2MSFTNGP10.phx.gbl...
As far as I know, there is no way to manipulate Windows VM from within an
application. The VM is handled by the operating system and it decides on
what is stored there. If you are encountering problems with your application taking large amounts of RAM, you may want to modify your minimum
requirements for your app to include more RAM or try to read the file in
chunks rather than looping through all 400,000+ lines. One interesting thing to note is how you are accessing the file. Which of the stream classes are
you using to access the file?

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:eF*************@TK2MSFTNGP11.phx.gbl...
How can I limit the use of the PC's virtual memory? I'm running a process that basically takes a txt file and loads it to a datatable. The problem
is
that the file is over 400,000 lines long (77 MB) and after a while I get
the
Windows message saying that the virtual memory's getting really low. Plus the machine gets really sluggish (with multi-threading). Is it possible to use the virtual memory until it reaches a certain limit and then use HDD
space?

Thanks.


Nov 16 '05 #3
Note, too, that virtual memory is your physical memory plus the hard disk space allocated to the page file, so one option is to have your users allocate a larger page file.

However, I think the previous poster's suggestion to restructure your code so it makes better use of the available resources is the better solution.

Eric

Nov 16 '05 #4
Ok, I would ***strongly*** recommend switching to FileStream and using byte arrays. You can do async file access with FileStream (you set this in the constructor of the FileStream class), which could speed things up by as much as 50%. According to MS, if you set the buffer too large you take a performance hit, and too small an even worse one, so you will have to play with the buffer size on different test machines until you find a number you are comfortable with. I have appended a code snippet that takes your code and changes it from StreamReader to FileStream.

HTH,

Kyril

<code>

private DataTable LoadFile(string sfileName)
{
    DataTable DT_Audit = new DataTable("AZMViewTable");
    byte[] byteData; // holder for the data we are going to read from the file.

    // Try playing around with the buffer size. I use 2048 as a default, but
    // that may not be the best for your application.
    // You might also want to look into FileStream's async methods and multiple threads.
    using (FileStream fs = new FileStream(sfileName, FileMode.Open,
        FileAccess.Read, FileShare.None, 2048, true))
    {
        byteData = new byte[fs.Length]; // size the byte array to the file.
        int offset = 0;
        while (offset < byteData.Length) // Read can return fewer bytes than requested.
        {
            int bytesRead = fs.Read(byteData, offset, byteData.Length - offset);
            if (bytesRead == 0)
                break;
            offset += bytesRead;
        }
    } // the using block closes the stream and the file.

    // Once you have the file read in as a byte array, it is very simple to
    // use things like StringReader or other readers.
    string data = System.Text.Encoding.Default.GetString(byteData, 0,
        byteData.Length); // transform the bytes to a readable string.
    using (StringReader sr = new StringReader(data)) // StringReader is a good candidate for reading very large strings.
    {
        string sAuditRecord = sr.ReadLine();
        while (sAuditRecord != null)
        {
            DataRow rowAudit = DT_Audit.NewRow();
            // Split sAuditRecord and store it in the appropriate fields of rowAudit
            DT_Audit.Rows.Add(rowAudit);
            sAuditRecord = sr.ReadLine();
        }
        return DT_Audit;
    }
}
</code>
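If you want to try the async route, the FileStream pattern looks roughly like this; the 64K buffer and the names are placeholders, and a real implementation would keep issuing BeginRead calls until the whole file has been read:

<code>

// Rough sketch of FileStream's BeginRead/EndRead pattern, which the 'true'
// flag in the constructor enables. Buffer size and names are placeholders.
private FileStream asyncFs;
private byte[] asyncBuffer;

private void StartAsyncRead(string fileName)
{
    asyncFs = new FileStream(fileName, FileMode.Open, FileAccess.Read,
        FileShare.Read, 65536, true); // true = asynchronous I/O
    asyncBuffer = new byte[asyncFs.Length];
    asyncFs.BeginRead(asyncBuffer, 0, asyncBuffer.Length,
        new AsyncCallback(OnReadComplete), null);
}

private void OnReadComplete(IAsyncResult ar)
{
    int bytesRead = asyncFs.EndRead(ar); // runs on a thread-pool thread
    asyncFs.Close();
    // A single call may return fewer bytes than the file length; a full
    // implementation would call BeginRead again until everything is in.
    // Parse asyncBuffer (first bytesRead bytes) into the DataTable here,
    // then marshal back to the UI thread before binding it to the grid.
}

</code>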
"VM" <vo******@yahoo.com> wrote in message
news:uA****************@TK2MSFTNGP09.phx.gbl...
Thanks for your reply. I'm using the StreamReader class.

This is how I am currently doing it:
private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
StreamReader sr = new StreamReader(sFileName);
sAuditRecord = sr.ReadLine();

while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}

"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:uj**************@TK2MSFTNGP10.phx.gbl...
As far as I know, there is no way to manipulate Windows VM from within an
application. The VM is handled by the operating system and it decides on
what is stored there. If you are encountering problems with your

application
taking large amounts of RAM, you may want to modify your minimum
requirements for your app to include more RAM or try to read the file in
chunks rather than looping through all 400,000+ lines. One interesting

thing
to note is how you are accessing the file. Which of the stream classes
are
you using to access the file?

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:eF*************@TK2MSFTNGP11.phx.gbl...
> How can I limit the use of the PC's virtual memory? I'm running a process > that basically takes a txt file and loads it to a datatable. The
> problem
> is
> that the file is over 400,000 lines long (77 MB) and after a while I
> get
> the
> Windows message saying that the virtual memory's getting really low. Plus > the machine gets really sluggish (with multi-threading). Is it possible to > use the virtual memory until it reaches a certain limit and then use
> HDD
> space?
>
> Thanks.
>
>



Nov 16 '05 #5
VM
Thanks very much for the info.
Before making these changes, I wanted to know what you thought of this idea
regarding the table and the 400,000+ rows.
What I'm trying to do is display this txt file in a Windows datagrid. Basically, I call a method (with a fileName parameter) that creates and fills a table from the 400K-line file and returns the table (now with 400K rows) to the form. Then I attach the table to the grid. I wrote this without knowing that the program would have to read such immense files. Since I don't have to display all 400,000 records in the grid (the most the user will see at a time is about 40 records), how would you load the file into a table and attach it to the grid in small chunks?

Thanks again.

"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:en**************@TK2MSFTNGP09.phx.gbl...
Ok, I would ***strongly*** recommend switching to FileStream and using byte arrays. You can do async file access with FileStream (you would set this in the cTor of the FileStream class) which could speed things up as much as 50% (according to MS, if you set your buffer too high, you will get a
performance hit, too low and you get an even worse performance hit). You
will have to play with the buffer size on different test machines until you get a number that you are comfortable with. I have appended a code snippet
that I wrote that takes your code and changes from StreamReader to
FileStream.

HTH,

Kryil

<code>

private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
byte[] byteData; //holder for data that we are going to read from the
file.

//try playing around with the buffer size. I use 2048 as a default, but
that may not be the best for your application.
//You might also want to look into using FileStream's Async methods and
multiple threads.
using(FileStream fs = new FileStream(sfileName, FileMode.Open,
FileAccess.Read, FileShare.None, 2048, true))
{
byte[] byteData = new byte[fs.Length]; //initialize the byte array with the size of the file.
fs.Read(byteData, 0, byteData.Length); //read the data.
fs.Close(); //close the stream and the file.
}

//once you have the file read in as a byte array, it is very simple to
use things
//like StringReader or other Readers
string data = System.Text.Encoding.Default.GetString(byteData, 0,
byteData.Length); //transform the bytes to a readable string.
using(StringReader sr = new StringReader(data)) //StringReader is a good candidate to use for reading very large strings.
{
sAuditRecord = sr.ReadLine();
while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}
}
</code>
"VM" <vo******@yahoo.com> wrote in message
news:uA****************@TK2MSFTNGP09.phx.gbl...
Thanks for your reply. I'm using the StreamReader class.

This is how I am currently doing it:
private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
StreamReader sr = new StreamReader(sFileName);
sAuditRecord = sr.ReadLine();

while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}

"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:uj**************@TK2MSFTNGP10.phx.gbl...
As far as I know, there is no way to manipulate Windows VM from within an application. The VM is handled by the operating system and it decides on what is stored there. If you are encountering problems with your

application
taking large amounts of RAM, you may want to modify your minimum
requirements for your app to include more RAM or try to read the file in chunks rather than looping through all 400,000+ lines. One interesting

thing
to note is how you are accessing the file. Which of the stream classes
are
you using to access the file?

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:eF*************@TK2MSFTNGP11.phx.gbl...
> How can I limit the use of the PC's virtual memory? I'm running a

process
> that basically takes a txt file and loads it to a datatable. The
> problem
> is
> that the file is over 400,000 lines long (77 MB) and after a while I
> get
> the
> Windows message saying that the virtual memory's getting really low.

Plus
> the machine gets really sluggish (with multi-threading). Is it
possible to
> use the virtual memory until it reaches a certain limit and then use
> HDD
> space?
>
> Thanks.
>
>



Nov 16 '05 #6
VM
I tried your suggestion, but the application just freezes while reading the file. Would it work even if the file is 78 MB (79,146,798 bytes)? It's a huge file.
"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:en**************@TK2MSFTNGP09.phx.gbl...
Ok, I would ***strongly*** recommend switching to FileStream and using byte arrays. You can do async file access with FileStream (you would set this in the cTor of the FileStream class) which could speed things up as much as 50% (according to MS, if you set your buffer too high, you will get a
performance hit, too low and you get an even worse performance hit). You
will have to play with the buffer size on different test machines until you get a number that you are comfortable with. I have appended a code snippet
that I wrote that takes your code and changes from StreamReader to
FileStream.

HTH,

Kryil

<code>

private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
byte[] byteData; //holder for data that we are going to read from the
file.

//try playing around with the buffer size. I use 2048 as a default, but
that may not be the best for your application.
//You might also want to look into using FileStream's Async methods and
multiple threads.
using(FileStream fs = new FileStream(sfileName, FileMode.Open,
FileAccess.Read, FileShare.None, 2048, true))
{
byte[] byteData = new byte[fs.Length]; //initialize the byte array with the size of the file.
fs.Read(byteData, 0, byteData.Length); //read the data.
fs.Close(); //close the stream and the file.
}

//once you have the file read in as a byte array, it is very simple to
use things
//like StringReader or other Readers
string data = System.Text.Encoding.Default.GetString(byteData, 0,
byteData.Length); //transform the bytes to a readable string.
using(StringReader sr = new StringReader(data)) //StringReader is a good candidate to use for reading very large strings.
{
sAuditRecord = sr.ReadLine();
while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}
}
</code>
"VM" <vo******@yahoo.com> wrote in message
news:uA****************@TK2MSFTNGP09.phx.gbl...
Thanks for your reply. I'm using the StreamReader class.

This is how I am currently doing it:
private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
StreamReader sr = new StreamReader(sFileName);
sAuditRecord = sr.ReadLine();

while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}

"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:uj**************@TK2MSFTNGP10.phx.gbl...
As far as I know, there is no way to manipulate Windows VM from within an application. The VM is handled by the operating system and it decides on what is stored there. If you are encountering problems with your

application
taking large amounts of RAM, you may want to modify your minimum
requirements for your app to include more RAM or try to read the file in chunks rather than looping through all 400,000+ lines. One interesting

thing
to note is how you are accessing the file. Which of the stream classes
are
you using to access the file?

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:eF*************@TK2MSFTNGP11.phx.gbl...
> How can I limit the use of the PC's virtual memory? I'm running a

process
> that basically takes a txt file and loads it to a datatable. The
> problem
> is
> that the file is over 400,000 lines long (77 MB) and after a while I
> get
> the
> Windows message saying that the virtual memory's getting really low.

Plus
> the machine gets really sluggish (with multi-threading). Is it
possible to
> use the virtual memory until it reaches a certain limit and then use
> HDD
> space?
>
> Thanks.
>
>



Nov 16 '05 #7
Hmmm, good question... lol

The thing is, with your app the biggest performance hit is not in displaying the UI, it is in getting the data. Disk access is slow; even on GHz HT Pentiums it is slow compared to RAM, and that is where you will take the hardest hit. I would first read the file with the FileStream into a DataTable. Then I would keep that DataTable in memory or some other persisted medium (if you were using ASP.NET, I would say stuff it into the Cache and really speed things up). Then I would create a method that returns only 40 records in a temporary DataTable:

<pseudo-code>
public DataTable GetRecords(int startRecord, int numberOfRecords)
{
    DataTable tempTable = dataTableSource.Clone(); // same columns, no rows
    int end = Math.Min(startRecord + numberOfRecords, dataTableSource.Rows.Count);
    for (int i = startRecord; i < end; i++)
    {
        tempTable.ImportRow(dataTableSource.Rows[i]);
    }

    return tempTable;
}
</pseudo-code>
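Calling it from the form might then look something like this; the grid name, page size, and button handler are just placeholders:

<code>

// Hypothetical form-side usage: keep the full table in a field and bind
// only one 40-row page to the grid at a time.
private DataTable dataTableSource;  // filled once by LoadFile(...)
private int currentPage = 0;
private const int PageSize = 40;    // assumed page size

private void ShowPage(int page)
{
    currentPage = page;
    DataTable pageTable = GetRecords(currentPage * PageSize, PageSize);
    dataGridAudit.DataSource = pageTable; // dataGridAudit is a placeholder grid name
}

private void btnNext_Click(object sender, System.EventArgs e)
{
    ShowPage(currentPage + 1); // GetRecords clamps to the row count
}

</code>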

Not the most elegant solution, but you are dealing with the most primitive database known: text files! ;) I would make extra, extra sure to dispose of ANYTHING that you don't need. It is going to take the GC a while to recover the RAM used to read the text file and the string the StringReader parses, so take extra care to dispose of anything else you create while the GC is handling that in the background.

HTH,

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:O4**************@TK2MSFTNGP10.phx.gbl...
Thanks very much for the info.
Before making these changes, I wanted to know what you thought of this
idea
regarding the table and the 400,000+ rows.
What I'm trying to do is display this txt file in a windows datagrid.
Basically, I call a method (with parm fileName) that creates and fills a
table with the 400K file and returns the table (that now has 400k rows) to
the form. Then I attach the table to the grid. I wrote this without
knowing
that the program had to read such immense files. Since I don't have to
display all 400,000 records in the grid (the most the user will see at a
time is 40 recs in the grid), theoretically, how would you load the file
into a table and attach it to the grid in small chunks?

Thanks again.

"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:en**************@TK2MSFTNGP09.phx.gbl...
Ok, I would ***strongly*** recommend switching to FileStream and using

byte
arrays. You can do async file access with FileStream (you would set this

in
the cTor of the FileStream class) which could speed things up as much as

50%
(according to MS, if you set your buffer too high, you will get a
performance hit, too low and you get an even worse performance hit). You
will have to play with the buffer size on different test machines until

you
get a number that you are comfortable with. I have appended a code
snippet
that I wrote that takes your code and changes from StreamReader to
FileStream.

HTH,

Kryil

<code>

private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
byte[] byteData; //holder for data that we are going to read from the
file.

//try playing around with the buffer size. I use 2048 as a default,
but
that may not be the best for your application.
//You might also want to look into using FileStream's Async methods and
multiple threads.
using(FileStream fs = new FileStream(sfileName, FileMode.Open,
FileAccess.Read, FileShare.None, 2048, true))
{
byte[] byteData = new byte[fs.Length]; //initialize the byte array

with
the size of the file.
fs.Read(byteData, 0, byteData.Length); //read the data.
fs.Close(); //close the stream and the file.
}

//once you have the file read in as a byte array, it is very simple to
use things
//like StringReader or other Readers
string data = System.Text.Encoding.Default.GetString(byteData, 0,
byteData.Length); //transform the bytes to a readable string.
using(StringReader sr = new StringReader(data)) //StringReader is a

good
candidate to use for reading very large strings.
{
sAuditRecord = sr.ReadLine();
while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in
row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}
}
</code>
"VM" <vo******@yahoo.com> wrote in message
news:uA****************@TK2MSFTNGP09.phx.gbl...
> Thanks for your reply. I'm using the StreamReader class.
>
> This is how I am currently doing it:
> private DataTable LoadFile(string sfileName)
> {
> DataTable DT_Audit = new DataTable("AZMViewTable");
> StreamReader sr = new StreamReader(sFileName);
> sAuditRecord = sr.ReadLine();
>
> while (sAuditRecord != null)
> {
> rowAudit = DT_Audit.NewRow();
> //Split string sAuditRecord and store it in appropriate fields in
> row
> DT_Audit.Rows.Add (rowAudit);
> sAuditRecord = sr.ReadLine();
> }
> sr.Close();
> return DT_Audit;
> }
>
> "Kyril Magnos" <ky**********@yahoo.com> wrote in message
> news:uj**************@TK2MSFTNGP10.phx.gbl...
>> As far as I know, there is no way to manipulate Windows VM from within an >> application. The VM is handled by the operating system and it decides on >> what is stored there. If you are encountering problems with your
> application
>> taking large amounts of RAM, you may want to modify your minimum
>> requirements for your app to include more RAM or try to read the file in >> chunks rather than looping through all 400,000+ lines. One interesting
> thing
>> to note is how you are accessing the file. Which of the stream classes
>> are
>> you using to access the file?
>>
>> Kyril
>>
>> "VM" <vo******@yahoo.com> wrote in message
>> news:eF*************@TK2MSFTNGP11.phx.gbl...
>> > How can I limit the use of the PC's virtual memory? I'm running a
> process
>> > that basically takes a txt file and loads it to a datatable. The
>> > problem
>> > is
>> > that the file is over 400,000 lines long (77 MB) and after a while I
>> > get
>> > the
>> > Windows message saying that the virtual memory's getting really low.
> Plus
>> > the machine gets really sluggish (with multi-threading). Is it possible > to
>> > use the virtual memory until it reaches a certain limit and then use
>> > HDD
>> > space?
>> >
>> > Thanks.
>> >
>> >
>>
>>
>
>



Nov 16 '05 #8
It should work just fine. I will test it here with a large file and post the
results.

~Kyril

"VM" <vo******@yahoo.com> wrote in message
news:uY**************@TK2MSFTNGP11.phx.gbl...
I tried your suggestion but the file just freezes when reading the file.
Would it work even if the file is 78MB (79146798 bytes) long ? It's a huge
file.
"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:en**************@TK2MSFTNGP09.phx.gbl...
Ok, I would ***strongly*** recommend switching to FileStream and using

byte
arrays. You can do async file access with FileStream (you would set this

in
the cTor of the FileStream class) which could speed things up as much as

50%
(according to MS, if you set your buffer too high, you will get a
performance hit, too low and you get an even worse performance hit). You
will have to play with the buffer size on different test machines until

you
get a number that you are comfortable with. I have appended a code
snippet
that I wrote that takes your code and changes from StreamReader to
FileStream.

HTH,

Kryil

<code>

private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
byte[] byteData; //holder for data that we are going to read from the
file.

//try playing around with the buffer size. I use 2048 as a default,
but
that may not be the best for your application.
//You might also want to look into using FileStream's Async methods and
multiple threads.
using(FileStream fs = new FileStream(sfileName, FileMode.Open,
FileAccess.Read, FileShare.None, 2048, true))
{
byte[] byteData = new byte[fs.Length]; //initialize the byte array

with
the size of the file.
fs.Read(byteData, 0, byteData.Length); //read the data.
fs.Close(); //close the stream and the file.
}

//once you have the file read in as a byte array, it is very simple to
use things
//like StringReader or other Readers
string data = System.Text.Encoding.Default.GetString(byteData, 0,
byteData.Length); //transform the bytes to a readable string.
using(StringReader sr = new StringReader(data)) //StringReader is a

good
candidate to use for reading very large strings.
{
sAuditRecord = sr.ReadLine();
while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in
row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}
}
</code>
"VM" <vo******@yahoo.com> wrote in message
news:uA****************@TK2MSFTNGP09.phx.gbl...
> Thanks for your reply. I'm using the StreamReader class.
>
> This is how I am currently doing it:
> private DataTable LoadFile(string sfileName)
> {
> DataTable DT_Audit = new DataTable("AZMViewTable");
> StreamReader sr = new StreamReader(sFileName);
> sAuditRecord = sr.ReadLine();
>
> while (sAuditRecord != null)
> {
> rowAudit = DT_Audit.NewRow();
> //Split string sAuditRecord and store it in appropriate fields in
> row
> DT_Audit.Rows.Add (rowAudit);
> sAuditRecord = sr.ReadLine();
> }
> sr.Close();
> return DT_Audit;
> }
>
> "Kyril Magnos" <ky**********@yahoo.com> wrote in message
> news:uj**************@TK2MSFTNGP10.phx.gbl...
>> As far as I know, there is no way to manipulate Windows VM from within an >> application. The VM is handled by the operating system and it decides on >> what is stored there. If you are encountering problems with your
> application
>> taking large amounts of RAM, you may want to modify your minimum
>> requirements for your app to include more RAM or try to read the file in >> chunks rather than looping through all 400,000+ lines. One interesting
> thing
>> to note is how you are accessing the file. Which of the stream classes
>> are
>> you using to access the file?
>>
>> Kyril
>>
>> "VM" <vo******@yahoo.com> wrote in message
>> news:eF*************@TK2MSFTNGP11.phx.gbl...
>> > How can I limit the use of the PC's virtual memory? I'm running a
> process
>> > that basically takes a txt file and loads it to a datatable. The
>> > problem
>> > is
>> > that the file is over 400,000 lines long (77 MB) and after a while I
>> > get
>> > the
>> > Windows message saying that the virtual memory's getting really low.
> Plus
>> > the machine gets really sluggish (with multi-threading). Is it possible > to
>> > use the virtual memory until it reaches a certain limit and then use
>> > HDD
>> > space?
>> >
>> > Thanks.
>> >
>> >
>>
>>
>
>



Nov 16 '05 #9
VM
For such a huge file, what would the best buffer size be?

Thanks.
"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:Oa**************@tk2msftngp13.phx.gbl...
It should work just fine. I will test it here with a large file and post the results.

~Kyril

"VM" <vo******@yahoo.com> wrote in message
news:uY**************@TK2MSFTNGP11.phx.gbl...
I tried your suggestion but the file just freezes when reading the file.
Would it work even if the file is 78MB (79146798 bytes) long ? It's a huge file.
"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:en**************@TK2MSFTNGP09.phx.gbl...
Ok, I would ***strongly*** recommend switching to FileStream and using

byte
arrays. You can do async file access with FileStream (you would set this
in
the cTor of the FileStream class) which could speed things up as much
as 50%
(according to MS, if you set your buffer too high, you will get a
performance hit, too low and you get an even worse performance hit).
You will have to play with the buffer size on different test machines until

you
get a number that you are comfortable with. I have appended a code
snippet
that I wrote that takes your code and changes from StreamReader to
FileStream.

HTH,

Kryil

<code>

private DataTable LoadFile(string sfileName)
{
DataTable DT_Audit = new DataTable("AZMViewTable");
byte[] byteData; //holder for data that we are going to read from the file.

//try playing around with the buffer size. I use 2048 as a default,
but
that may not be the best for your application.
//You might also want to look into using FileStream's Async methods and multiple threads.
using(FileStream fs = new FileStream(sfileName, FileMode.Open,
FileAccess.Read, FileShare.None, 2048, true))
{
byte[] byteData = new byte[fs.Length]; //initialize the byte array

with
the size of the file.
fs.Read(byteData, 0, byteData.Length); //read the data.
fs.Close(); //close the stream and the file.
}

//once you have the file read in as a byte array, it is very simple to use things
//like StringReader or other Readers
string data = System.Text.Encoding.Default.GetString(byteData, 0,
byteData.Length); //transform the bytes to a readable string.
using(StringReader sr = new StringReader(data)) //StringReader is a

good
candidate to use for reading very large strings.
{
sAuditRecord = sr.ReadLine();
while (sAuditRecord != null)
{
rowAudit = DT_Audit.NewRow();
//Split string sAuditRecord and store it in appropriate fields in
row
DT_Audit.Rows.Add (rowAudit);
sAuditRecord = sr.ReadLine();
}
sr.Close();
return DT_Audit;
}
}
</code>
"VM" <vo******@yahoo.com> wrote in message
news:uA****************@TK2MSFTNGP09.phx.gbl...
> Thanks for your reply. I'm using the StreamReader class.
>
> This is how I am currently doing it:
> private DataTable LoadFile(string sfileName)
> {
> DataTable DT_Audit = new DataTable("AZMViewTable");
> StreamReader sr = new StreamReader(sFileName);
> sAuditRecord = sr.ReadLine();
>
> while (sAuditRecord != null)
> {
> rowAudit = DT_Audit.NewRow();
> //Split string sAuditRecord and store it in appropriate fields in
> row
> DT_Audit.Rows.Add (rowAudit);
> sAuditRecord = sr.ReadLine();
> }
> sr.Close();
> return DT_Audit;
> }
>
> "Kyril Magnos" <ky**********@yahoo.com> wrote in message
> news:uj**************@TK2MSFTNGP10.phx.gbl...
>> As far as I know, there is no way to manipulate Windows VM from within an
>> application. The VM is handled by the operating system and it
decides on
>> what is stored there. If you are encountering problems with your
> application
>> taking large amounts of RAM, you may want to modify your minimum
>> requirements for your app to include more RAM or try to read the
file in
>> chunks rather than looping through all 400,000+ lines. One

interesting > thing
>> to note is how you are accessing the file. Which of the stream classes >> are
>> you using to access the file?
>>
>> Kyril
>>
>> "VM" <vo******@yahoo.com> wrote in message
>> news:eF*************@TK2MSFTNGP11.phx.gbl...
>> > How can I limit the use of the PC's virtual memory? I'm running a
> process
>> > that basically takes a txt file and loads it to a datatable. The
>> > problem
>> > is
>> > that the file is over 400,000 lines long (77 MB) and after a while I >> > get
>> > the
>> > Windows message saying that the virtual memory's getting really low. > Plus
>> > the machine gets really sluggish (with multi-threading). Is it

possible
> to
>> > use the virtual memory until it reaches a certain limit and then use >> > HDD
>> > space?
>> >
>> > Thanks.
>> >
>> >
>>
>>
>
>



Nov 16 '05 #10
Hi VM,

Sorry about the delay in getting back to you. I am still trying a few things to see how best to implement this. I am currently using a buffer size of 65536 (64K) and it reads from the FileStream quickly enough; the trouble is when I load the results into a DataTable. My memory usage was up to 400 MB yesterday afternoon! So I am looking at some other options, such as a string array or something along those lines. I will keep you updated.
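As a rough idea of the direction I am leaning (nothing final, and the names are placeholders):

<code>

// Sketch of the string-array idea: keep the raw lines in an ArrayList instead
// of a DataTable, and only build DataRows for the page the grid is showing.
// The 64K buffer matches what I have been testing with.
private ArrayList LoadLines(string fileName)
{
    ArrayList lines = new ArrayList();
    using (FileStream fs = new FileStream(fileName, FileMode.Open,
        FileAccess.Read, FileShare.Read, 65536))
    using (StreamReader sr = new StreamReader(fs))
    {
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            lines.Add(line);
        }
    }
    return lines;
}

// Build a small DataTable for just the 40 or so lines the grid shows.
private DataTable BuildPage(ArrayList lines, int start, int count)
{
    DataTable page = new DataTable("AZMViewPage"); // placeholder table name
    int end = Math.Min(start + count, lines.Count);
    for (int i = start; i < end; i++)
    {
        DataRow row = page.NewRow();
        // Split (string)lines[i] and store it in the appropriate fields of row.
        page.Rows.Add(row);
    }
    return page;
}

</code>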

Kyril

"VM" <vo******@yahoo.com> wrote in message
news:uc*************@tk2msftngp13.phx.gbl...
For such a huge file, what would the best buffer size be?

Thanks.
"Kyril Magnos" <ky**********@yahoo.com> wrote in message
news:Oa**************@tk2msftngp13.phx.gbl...
It should work just fine. I will test it here with a large file and post

the
results.

~Kyril

"VM" <vo******@yahoo.com> wrote in message
news:uY**************@TK2MSFTNGP11.phx.gbl...
>I tried your suggestion but the file just freezes when reading the file.
> Would it work even if the file is 78MB (79146798 bytes) long ? It's a huge > file.
>
>
> "Kyril Magnos" <ky**********@yahoo.com> wrote in message
> news:en**************@TK2MSFTNGP09.phx.gbl...
>> Ok, I would ***strongly*** recommend switching to FileStream and using
> byte
>> arrays. You can do async file access with FileStream (you would set this > in
>> the cTor of the FileStream class) which could speed things up as much as > 50%
>> (according to MS, if you set your buffer too high, you will get a
>> performance hit, too low and you get an even worse performance hit). You >> will have to play with the buffer size on different test machines
>> until
> you
>> get a number that you are comfortable with. I have appended a code
>> snippet
>> that I wrote that takes your code and changes from StreamReader to
>> FileStream.
>>
>> HTH,
>>
>> Kryil
>>
>> <code>
>>
>> private DataTable LoadFile(string sfileName)
>> {
>> DataTable DT_Audit = new DataTable("AZMViewTable");
>> byte[] byteData; //holder for data that we are going to read from the >> file.
>>
>> //try playing around with the buffer size. I use 2048 as a default,
>> but
>> that may not be the best for your application.
>> //You might also want to look into using FileStream's Async methods and >> multiple threads.
>> using(FileStream fs = new FileStream(sfileName, FileMode.Open,
>> FileAccess.Read, FileShare.None, 2048, true))
>> {
>> byte[] byteData = new byte[fs.Length]; //initialize the byte array
> with
>> the size of the file.
>> fs.Read(byteData, 0, byteData.Length); //read the data.
>> fs.Close(); //close the stream and the file.
>> }
>>
>> //once you have the file read in as a byte array, it is very simple to >> use things
>> //like StringReader or other Readers
>> string data = System.Text.Encoding.Default.GetString(byteData, 0,
>> byteData.Length); //transform the bytes to a readable string.
>> using(StringReader sr = new StringReader(data)) //StringReader is a
> good
>> candidate to use for reading very large strings.
>> {
>> sAuditRecord = sr.ReadLine();
>> while (sAuditRecord != null)
>> {
>> rowAudit = DT_Audit.NewRow();
>> //Split string sAuditRecord and store it in appropriate fields in
>> row
>> DT_Audit.Rows.Add (rowAudit);
>> sAuditRecord = sr.ReadLine();
>> }
>> sr.Close();
>> return DT_Audit;
>> }
>> }
>> </code>
>> "VM" <vo******@yahoo.com> wrote in message
>> news:uA****************@TK2MSFTNGP09.phx.gbl...
>> > Thanks for your reply. I'm using the StreamReader class.
>> >
>> > This is how I am currently doing it:
>> > private DataTable LoadFile(string sfileName)
>> > {
>> > DataTable DT_Audit = new DataTable("AZMViewTable");
>> > StreamReader sr = new StreamReader(sFileName);
>> > sAuditRecord = sr.ReadLine();
>> >
>> > while (sAuditRecord != null)
>> > {
>> > rowAudit = DT_Audit.NewRow();
>> > //Split string sAuditRecord and store it in appropriate fields in
>> > row
>> > DT_Audit.Rows.Add (rowAudit);
>> > sAuditRecord = sr.ReadLine();
>> > }
>> > sr.Close();
>> > return DT_Audit;
>> > }
>> >
>> > "Kyril Magnos" <ky**********@yahoo.com> wrote in message
>> > news:uj**************@TK2MSFTNGP10.phx.gbl...
>> >> As far as I know, there is no way to manipulate Windows VM from within > an
>> >> application. The VM is handled by the operating system and it decides > on
>> >> what is stored there. If you are encountering problems with your
>> > application
>> >> taking large amounts of RAM, you may want to modify your minimum
>> >> requirements for your app to include more RAM or try to read the file > in
>> >> chunks rather than looping through all 400,000+ lines. One interesting >> > thing
>> >> to note is how you are accessing the file. Which of the stream classes >> >> are
>> >> you using to access the file?
>> >>
>> >> Kyril
>> >>
>> >> "VM" <vo******@yahoo.com> wrote in message
>> >> news:eF*************@TK2MSFTNGP11.phx.gbl...
>> >> > How can I limit the use of the PC's virtual memory? I'm running a
>> > process
>> >> > that basically takes a txt file and loads it to a datatable. The
>> >> > problem
>> >> > is
>> >> > that the file is over 400,000 lines long (77 MB) and after a
>> >> > while I >> >> > get
>> >> > the
>> >> > Windows message saying that the virtual memory's getting really low. >> > Plus
>> >> > the machine gets really sluggish (with multi-threading). Is it
> possible
>> > to
>> >> > use the virtual memory until it reaches a certain limit and then use >> >> > HDD
>> >> > space?
>> >> >
>> >> > Thanks.
>> >> >
>> >> >
>> >>
>> >>
>> >
>> >
>>
>>
>
>



Nov 16 '05 #11
