
How can I prevent CPU usage from increasing to 100 percent when copying files from a CD?

I wrote a simple program in VB6 to copy all the files from a directory
on a CD-ROM to my hard disk. There are about 10 files, each about
30MB.

The program uses Get and Put to get data from the CD into a buffer and
then put it into the disk. See code below. It works, but it slows
down drastically before it copies all the files. Windows Task Manager
shows the CPU usage gradually increasing as the files are copied,
until it reaches 100 percent after about 6 files. That's when it
starts to get slow.

How can I prevent the CPU usage increasing? Here's the guts of the
code. All this is within another loop that goes through all the files
in sCDPath with Dir$. I've tried buffer sizes from 50,000 to 2
million bytes; it makes no difference.

iCDFN = FreeFile
Open sCDPath & sFileName For Binary Access Read As #iCDFN
iDiskFN = FreeFile
Open sDiskPath & sFileName For Binary Access Write As #iDiskFN
Do
    If lStartPos + lIncrement > lFileLen Then _
        lIncrement = lFileLen - lStartPos + 1
    sBuffer = String(lIncrement, " ")
    Get #iCDFN, , sBuffer
    Put #iDiskFN, , sBuffer
    lStartPos = lStartPos + lIncrement
    If lStartPos > lFileLen Then Exit Do
Loop
Close #iCDFN
Close #iDiskFN

TIA - Bryan Rickard
Jul 17 '05 #1
22 Replies

On 7 Jan 2004 18:06:29 -0800, bw*@unicon.com (Bryan Rickard) wrote:
I wrote a simple program in VB6 to copy all the files from a directory
on a CD-ROM to my hard disk. There are about 10 files, each about
30MB.

The program uses Get and Put to get data from the CD into a buffer and
then put it into the disk. See code below. It works, but it slows
down drastically before it copies all the files. Windows Task Manager
shows the CPU usage gradually increasing as the files are copied,
until it reaches 100 percent after about 6 files. That's when it
starts to get slow.


<snip>

What actually gets slow ?

You are doing a heck of a lot of work
- you should expect the CPU usage to be high

If other Apps are responding sluggishly then put a DoEvents after each
block read/write
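
For example, roughly (a sketch only, using the same variables as the code
in the original post):

    Do
        Get #iCDFN, , sBuffer
        Put #iDiskFN, , sBuffer
        DoEvents    ' yield after each block so the UI and other apps stay responsive
        lStartPos = lStartPos + lIncrement
        If lStartPos > lFileLen Then Exit Do
    Loop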
Jul 17 '05 #2

Bryan Rickard wrote:
I wrote a simple program in VB6 to copy all the files from a directory
on a CD-ROM to my hard disk. There are about 10 files, each about
30MB.

<snip>


You could use an API call. Look at
http://www.mentalis.org/apilist/CopyFile.shtml
for more details.

Best regards,
Dikkie Dik
Jul 17 '05 #3

Thank you, Dikkie. Actually I could use FileCopy directly in Visual
Basic, but I wanted to show the user a ProgressBar showing progress of
each file transfer. That's why I wanted to break the copying into
chunks. FileCopy doesn't allow anything else to happen in the
application until the whole file has been copied. I don't think the
direct API call would be any different.

- Bryan

Dikkie Dik <Ab***@SpamBusters.com> wrote in message news:<3f***********************@dreader-2.news.scarlet-internet.nl>...
<snip>

Jul 17 '05 #4

The copying gets slow. See the following log extract. After the sixth
file it starts to slow down and gets worse.

c9301.z: size = 34,149,859 bytes, time = 22 seconds, speed = 1.552 Mb/sec.
c9302.z: size = 33,209,029 bytes, time = 18 seconds, speed = 1.845 Mb/sec.
c9303.z: size = 33,427,953 bytes, time = 17 seconds, speed = 1.966 Mb/sec.
c9304.z: size = 33,270,682 bytes, time = 16 seconds, speed = 2.079 Mb/sec.
c9305.z: size = 32,773,719 bytes, time = 15 seconds, speed = 2.185 Mb/sec.
c9306.z: size = 32,319,428 bytes, time = 14 seconds, speed = 2.309 Mb/sec.
c9307.z: size = 31,733,326 bytes, time = 31 seconds, speed = 1.024 Mb/sec.
c9308.z: size = 31,580,633 bytes, time = 51 seconds, speed = 0.619 Mb/sec.
c9309.z: size = 31,101,258 bytes, time = 71 seconds, speed = 0.438 Mb/sec.
l93.z: size = 75,610,329 bytes, time = 340 seconds, speed = 0.222 Mb/sec.

There is a DoEvents after each block read/write.

- Bryan
er*****@nowhere.com (J French) wrote in message news:<3f***************@news.btclick.com>...
<snip>

What actually gets slow ?

You are doing a heck of a lot of work
- you should expect the CPU usage to be high

If other Apps are responding sluggishly then put a DoEvents after each
block read/write

Jul 17 '05 #5

See Randy Birch's stuff for a way to use APIs to copy files with
progress reports (and cancel ability):

http://vbnet.mvps.org/code/callback/...upcallback.htm
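
Very roughly, the shape of it is below. This is only a sketch from memory,
so take the exact Declare and callback details from Randy's page:

    ' Sketch only - check the page above for the exact declarations.
    Private Declare Function CopyFileEx Lib "kernel32" Alias "CopyFileExA" _
        (ByVal lpExistingFileName As String, ByVal lpNewFileName As String, _
         ByVal lpProgressRoutine As Long, ByVal lpData As Long, _
         ByRef pbCancel As Long, ByVal dwCopyFlags As Long) As Long

    ' Progress callback (must live in a .bas module so AddressOf can see it).
    ' The LARGE_INTEGER sizes are usually taken as Currency in VB6;
    ' multiply by 10000 to get the actual byte counts.
    Public Function CopyProgress(ByVal TotalFileSize As Currency, _
        ByVal TotalTransferred As Currency, ByVal StreamSize As Currency, _
        ByVal StreamTransferred As Currency, ByVal dwStreamNumber As Long, _
        ByVal dwCallbackReason As Long, ByVal hSourceFile As Long, _
        ByVal hDestinationFile As Long, ByVal lpData As Long) As Long
        ' Update the ProgressBar here from TotalTransferred / TotalFileSize.
        CopyProgress = 0    ' PROGRESS_CONTINUE
    End Function

    ' Called something like:
    '   lRet = CopyFileEx(sSource, sDest, AddressOf CopyProgress, 0, lCancel, 0)
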
"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b**************************@posting.google.c om...
<snip>

Jul 17 '05 #6


"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b**************************@posting.google.c om...
<snip>


You are reallocating sBuffer each time through the loop. If you
calculate the leftover first, you can copy that amount, then copy the
rest of the file using the same block.

' Get the length of the source file.
lFileLen = LOF(iCDFN)

' Calculate the left over.
nLeftOver = lFileLen Mod lIncrement

' Create a buffer for the nLeftOver amount.
sBuffer = String$(nLeftOver, " ")

' Read and write the nLeftOver amount.
Get #iCDFN, , sBuffer
Put #iDiskFN, , sBuffer
lStartPos = nLeftOver

' Create a buffer for a block (do only once)
sBuffer = String$(lIncrement, " ")

' Read and write the remaining blocks of data.
Do Until lStartPos >= lFileLen
    ' Read and write one block of data.
    Get #iCDFN, , sBuffer
    Put #iDiskFN, , sBuffer
    lStartPos = lStartPos + lIncrement
Loop
Jul 17 '05 #7

On Thu, 8 Jan 2004 15:54:49 -0800, "Steve Gerrard"
<no*************@comcast.net> wrote:

<snip>

You are reallocating sBuffer each time through the loop. If you
calculate the leftover first, you can copy that amount, then copy the
rest of the file using the same block.


Spot on - memory reallocation !

It might be an idea to use Byte Arrays

I am not sure about copying the 'residue' first
- it is an interesting idea
- but theoretically (if not in practice) it removes the possibility of
alignment of clusters/sectors

It just makes me feel uncomfortable ...
Jul 17 '05 #8


"J French" <er*****@nowhere.com> wrote in message
news:3f***************@news.btclick.com...
On Thu, 8 Jan 2004 15:54:49 -0800, "Steve Gerrard"
<no*************@comcast.net> wrote:

You are reallocating sBuffer each time through the loop. If you
calculate the leftover first, you can copy that amount, then copy the
rest of the file using the same block.


Spot on - memory reallocation !

It might be an idea to use Byte Arrays

I am not sure about copying the 'residue' first
- it is an interesting idea
- but theoretically (if not in practice) it removes the possibility of
alignment of clusters/sectors

It just makes me feel uncomfortable ...


I agree, it does seem odd. I actually got the basis of that code from a
MS article on file copying in networks, and that is how they did it. (It
was an old article addressing a collision problem in older 10 Mbps
networks.)

I suppose if you calculate the number of "std" blocks as well, you could
do all of them first, then do the residue block, or something like that.
Jul 17 '05 #9

"Steve Gerrard" <no*************@comcast.net> wrote in message news:<bK********************@comcast.com>...
<cut>

I've done this sort of thing often but never thought to put the
"leftover" chunk first like that... I'll have to try it. One minor
change:

' Get the length of the source file.
lFileLen = LOF(iCDFN)

' Calculate the left over.
nLeftOver = lFileLen Mod lIncrement

If nLeftOver > 0 Then
    ' Create a buffer for the nLeftOver amount.
    sBuffer = String$(nLeftOver, " ")

    ' Read and write the nLeftOver amount.
    Get #iCDFN, , sBuffer
    Put #iDiskFN, , sBuffer
    lStartPos = nLeftOver
End If

' Create a buffer for a block (do only once)
sBuffer = String$(lIncrement, " ")

' Read and write the remaining blocks of data.
Do Until lStartPos >= lFileLen
    ' Read and write one block of data.
    Get #iCDFN, , sBuffer
    Put #iDiskFN, , sBuffer
    lStartPos = lStartPos + lIncrement
Loop

Jul 17 '05 #10

Steve, thank you very much! That took care of the problem.

I wonder, though, why VB doesn't reallocate the same memory when you
redefine a fixed-length string. You would think it would.

- Bryan

"Steve Gerrard" <no*************@comcast.net> wrote in message news:<bK********************@comcast.com>...
"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b**************************@posting.google.c om...
I wrote a simple program in VB6 to copy all the files from a directory
on a CD-ROM to my hard disk. There are about 10 files, each about
30MB.

The program uses Get and Put to get data from the CD into a buffer and
then put it into the disk. See code below. It works, but it slows
down drastically before it copies all the files. Windows Task Manager
shows the CPU usage gradually increasing as the files are copied,
until it reaches 100 percent after about 6 files. That's when it
starts to get slow.

How can I prevent the CPU usage increasing? Here's the guts of the
code. All this is within another loop that goes through all the files
in sCDPath with Dir$. I've tried buffer sizes from 50000 to 2
million, makes no difference.

iCDFN = FreeFile
Open sCDPath & sFileName For Binary Access Read As #iCDFN
iDiskFN = FreeFile
Open sDiskPath & sFileName For Binary Access Write As #iDiskFN
Do
If lStartPos + lIncrement > lFileLen Then _
lIncrement = lFileLen - lStartPos + 1
sBuffer = String(lIncrement, " ")
Get #iCDFN, , sBuffer
Put #iDiskFN, , sBuffer
lStartPos = lStartPos + lIncrement
If lStartPos > lFileLen Then Exit Do
Loop
Close iCDFN
Close iDiskFN

TIA - Bryan Rickard


You are reallocating sBuffer each time through the loop. If you
calculate the leftover first, you can copy that amount, then copy the
rest of the file using the same block.

' Get the length of the source file.
lFileLen = LOF(nSource)

' Calculate the left over.
nLeftOver = lFileLen Mod lIncrement

' Create a buffer for the nLeftOver amount.
sBuffer= String$(nLeftOver, " ")

' Read and write the nLeftOver amount.
Get #iCDFN, , strBuffer
Put #iDiskFN, , strBuffer
lStartPos = nLeftOver

' Create a buffer for a block (do only once)
sBuffer = String$(lIncrement, " ")

' Read and write the remaining blocks of data.
Do Until lStartPos > lFileLen
' Read and write one block of data.
Get #iCDFN, , strBuffer
Put #iDiskFN, , strBuffer
lStartPos = lStartPos + lIncrement
Loop

Jul 17 '05 #11



"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b*************************@posting.google.co m...
Steve, thank you very much! That took care of the problem.

I wonder, though, why VB doesn't reallocate the same memory when you
redefine a fixed-length string. You would think it would.

<snip>


Actually I'm somewhat surprised it makes such a difference. I can see
that the new block has to be allocated before the old block is released,
but you would think that about the third or fourth time it would be able
to reuse the first block. Maybe something else smaller gets allocated in
the meantime, so the free memory gets "fragmented" like a disk, and the
new block has to keep being allocated "at the top".
Jul 17 '05 #13


"Bob Butler" <bu*******@earthlink.net> wrote in message
news:fa*************************@posting.google.co m...
"Steve Gerrard" <no*************@comcast.net> wrote in message news:<bK********************@comcast.com>... <cut>

I've done this sort of thing often but never thought to put the
"leftover" chunk first like that... I'll have to try it. One minor
change:

' Get the length of the source file.
lFileLen = LOF(nSource)

' Calculate the left over.
nLeftOver = lFileLen Mod lIncrement


If nLeftOver>0 Then
' Create a buffer for the nLeftOver amount.
sBuffer= String$(nLeftOver, " ")

' Read and write the nLeftOver amount.
Get #iCDFN, , strBuffer
Put #iDiskFN, , strBuffer
lStartPos = nLeftOver


End If
' Create a buffer for a block (do only once)
sBuffer = String$(lIncrement, " ")

' Read and write the remaining blocks of data.
Do Until lStartPos > lFileLen
' Read and write one block of data.
Get #iCDFN, , strBuffer
Put #iDiskFN, , strBuffer
lStartPos = lStartPos + lIncrement
Loop


Even though this has been working for me, I liked J. French's point that
there is something odd about writing the leftover block first. I think
you could calculate the size, run all the other blocks, and write the
leftover at the end just as easily, which somehow sounds better, since
you probably made your block size a multiple of 512 to match the sector
size.

My code without the check for nLeftOver = 0 has been working fine for
quite a while, so I think the zero-length Get and Put must be harmless.
However, it does seem like better programming to check for it.
Jul 17 '05 #14

On Fri, 9 Jan 2004 11:41:22 -0800, "Steve Gerrard"
<no*************@comcast.net> wrote:

<snip>

It just makes me feel uncomfortable ...


I agree, it does seem odd. I actually got the basis of that code from a
MS article on file copying in networks, and that is how they did it. (It
was an old article addressing a collision problem in older 10 Mbps
networks.)

I suppose if you calculate the number of "std" blocks as well, you could
do all of them first, then do the residue block, or something like that.


That is pretty much how I do it
- but I use a While/Wend

While BytesDone < TotalFileLen

alternatively :-
While BytesToCopy > 0

There is also something to be said for 'pre-extending' the destination
file to its final length before actually copying any data
- it saves the OS faffing about
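
For instance (a sketch only, using the file numbers from the original code):

    ' Write the last byte first - in Binary mode the position argument is
    ' 1-based, so this makes the file lFileLen bytes long straight away.
    Put #iDiskFN, lFileLen, CByte(0)
    Seek #iDiskFN, 1    ' then go back to the start before the real copy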
Jul 17 '05 #15

On 9 Jan 2004 15:53:14 -0800, bw*@unicon.com (Bryan Rickard) wrote:
Steve, thank you very much! That took care of the problem.

I wonder, though, why VB doesn't reallocate the same memory when you
redefine a fixed-length string. You would think it would.


I agree, but even so it would still be quite a lot of unnecessary work
for the App.

Also, did you get my point about using Byte Arrays
- when VB reads data into a String it converts single bytes into 2
byte Unicode
- and when it writes it, it converts 2 byte Unicode to single bytes
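
Something along these lines (just a sketch; lBytesDone is a new Long I've
added, the other names are from the earlier code):

    Dim bBytes() As Byte
    ReDim bBytes(1 To lIncrement)     ' explicit bounds = exactly lIncrement bytes
    Do While lBytesDone + lIncrement <= lFileLen
        Get #iCDFN, , bBytes          ' Get/Put move exactly as many bytes as the array holds
        Put #iDiskFN, , bBytes
        lBytesDone = lBytesDone + lIncrement
    Loop
    If lBytesDone < lFileLen Then     ' the odd bytes at the end
        ReDim bBytes(1 To lFileLen - lBytesDone)
        Get #iCDFN, , bBytes
        Put #iDiskFN, , bBytes
    End If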
Jul 17 '05 #16


"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b**************************@posting.google.c om...
<snip>


Bryan..

I know this doesn't address your question, but just out of curiosity, why
don't you just shell out? The operating system has optimized file-copying
functions that I think we should make use of instead of bloating our code /
re-inventing the wheel.

You can either shell out, use API, or the file system object.
Jul 17 '05 #17

bw*@unicon.com (Bryan Rickard) wrote in
news:2b*************************@posting.google.co m:
Steve, thank you very much! That took care of the problem.

I wonder, though, why VB doesn't reallocate the same memory when you
redefine a fixed-length string. You would think it would.

- Bryan


Because it is not a fixed-length string. String returns a Variant
containing a variable-length string of the specified length, filled with
the specified character. Please read that carefully: String returns a
Variant while String$ returns a String. Variant type conversions can be
very slow in VB. So if sBuffer is declared as a String (you do use Option
Explicit and correctly type all of your variables, right?) then sBuffer =
String(lIncrement, " ") where lIncrement = 3 million creates a
3-million-character string in a Variant and then copies that to a
3-million-character String.

Steve Gerrard is also right when he says to build the string buffer only
when the buffer size changes. As for writing the little piece first, I
gave that up some years back when I realized that by doing so I lose any
benefit of trying to help the OS by picking the right size buffer.

As for lIncrement size: pick an exact multiple of the destination
volume's cluster size, and do keep it reasonably small. If there are not 3
million free bytes in the pool then a 3 MB string is going to get
virtualized. That is less likely to happen with lengths of 64 KB or so.
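
For instance (a sketch - 4096 here is just an assumed cluster size, check the
actual destination volume):

    Const CLUSTER_SIZE As Long = 4096
    lIncrement = 16& * CLUSTER_SIZE        ' 64 KB, an exact multiple of the cluster size
    sBuffer = String$(lIncrement, " ")     ' String$, not String, so no Variant round trip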

--
ATB

Charles Kincaid
Jul 17 '05 #18

"Raoul Watson" <Wa*****@IntelligenCIA.com> wrote in message news:<n1****************@nwrdny01.gnilink.net>...

Bryan..

I know this doesn't address your question but just out of curiosity why
don't you just shell out? The operating system has optimized file copying
functions that I think we should make use of instead of bloating our code /
re-inventing the wheel.

You can either shell out, use API, or the file system object.


Raoul - do you mean something like RetVal = Shell("explorer.exe", 1)?
If so, I need something more customized for this application, at the
very least it would need to open with the CD drive contents showing,
but a new Explorer window always opens at My Documents (annoyingly).
Thanks for the input anyway, I might find a need for that one day, and
I hadn't thought of it.

- Bryan
Jul 17 '05 #19


"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b**************************@posting.google.c om...
"Raoul Watson" <Wa*****@IntelligenCIA.com> wrote in message

news:<n1****************@nwrdny01.gnilink.net>...

Bryan..

I know this doesn't address your question but just out of curiosity why
don't you just shell out? The operating system has optimized file copying functions that I think we should make use of instead of bloating our code / re-inventing the wheel.

You can either shell out, use API, or the file system object.


Raoul - do you mean something like RetVal = Shell("explorer.exe", 1)?
If so, I need something more customized for this application, at the
very least it would need to open with the CD drive contents showing,
but a new Explorer window always opens at My Documents (annoyingly).
Thanks for the input anyway, I might find a need for that one day, and
I hadn't thought of it.

- Bryan


Not really. What I mean is: have a file selection dialog, and once the source
file is identified use the Windows API or the file system object to copy it, or
simply shell out to a batch file, like Shell "copy.bat" where copy.bat would
have copy d:\whateverdir\whateverfile c:\whereever
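
Roughly like this, if you want to skip the batch file and go straight through
the command interpreter (a sketch only; the paths are just placeholders):

    Shell Environ$("COMSPEC") & " /c copy D:\whateverdir\whateverfile C:\wherever", vbHide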

If you need it, give me a holler, I'll e-mail you.
Jul 17 '05 #20


"Bryan Rickard" <bw*@unicon.com> wrote in message
news:2b**************************@posting.google.c om...
"Raoul Watson" <Wa*****@IntelligenCIA.com> wrote in message

news:<n1****************@nwrdny01.gnilink.net>...

Bryan..

I know this doesn't address your question but just out of curiosity why
don't you just shell out? The operating system has optimized file copying functions that I think we should make use of instead of bloating our code / re-inventing the wheel.

You can either shell out, use API, or the file system object.


Raoul - do you mean something like RetVal = Shell("explorer.exe", 1)?
If so, I need something more customized for this application, at the
very least it would need to open with the CD drive contents showing,
but a new Explorer window always opens at My Documents (annoyingly).
Thanks for the input anyway, I might find a need for that one day, and
I hadn't thought of it.

- Bryan


Here is one quick way..
Dim SourceF As String, DestF As String
SourceF = "D:\WHATEVERDIR\MYFILE.DAT"   ' Define source file name.
DestF = "C:\WHEREVER\MYFILE.DAT"        ' Define target file name.
FileCopy SourceF, DestF                 ' Copy source to target.

Jul 17 '05 #21

er*****@nowhere.com (J French) wrote in message news:<3f****************@news.btclick.com>...
Also, did you get my point about using Byte Arrays
- when VB reads data into a String it converts single bytes into 2
byte Unicode
- and when it writes it, it converts 2 byte Unicode to single bytes


I tried using Byte arrays. They work fine with fixed-length arrays, but
for the left-over bytes I had to use a dynamic array, and the Put
statement added an extra run of hundreds of nulls (hex 0) onto the
end of the left-over bytes. Here's the code I used:

Dim bBytes(65536) As Byte
Dim bLeftOverBytes() As Byte
...
Do
    If lStartPos + lIncrement > lFileLen Then
        lLeftOver = lFileLen - lStartPos + 1
        ReDim bLeftOverBytes(lLeftOver)
        Get #iCDFN, , bLeftOverBytes
        Put #iDiskFN, , bLeftOverBytes  'writes too many bytes
        Exit Do
    Else  'this is the normal loop, it works fine
        lStartPos = lStartPos + lIncrement
        Get #iCDFN, , bBytes
        Put #iDiskFN, , bBytes
    End If
Loop

Any ideas?
Jul 17 '05 #22

Charles Kincaid <ki*****@swbell.net> wrote in message news:<Xn******************************@38.144.126. 67>...

<snip>


Thank you, Charles, good inputs.
Jul 17 '05 #23

This discussion thread is closed

Replies have been disabled for this discussion.