alternative file.copy

It seems that when I do File.Copy, svchost.exe hangs; I mean, if I make 40
threads of File.Copy (40 copies of files at the same time), the system goes
down and stops responding. This is when I'm working with CIFS (shares). Is
there another solution for copying files in .NET besides File.Copy?
Nov 21 '05 #1
"luis molina Micasoft" <lu************ ****@discussion s.microsoft.com >
schrieb:
it seems that when i do file.copy the svchost.exe is hanged, i
mean if i make 40 threads of file.copy


Reduce the number of threads...

--
Herfried K. Wagner [MVP]
<URL:http://dotnet.mvps.org/>

Nov 21 '05 #2
Luis,

You make me curious: are you using threads to copy files, and if so, what is
the reason for that?

Cor

"luis molina Micasoft" <lu************ ****@discussion s.microsoft.com >
....
it seems that when i do file.copy the svchost.exe is hanged, i mean if i
make
40 threads of file.copy , 40 copys of files at same time the system is
going
down and stop responding, this is when i'm working with cifs (shares).
there
is another solution to copy files than file.copy in .net?

Nov 21 '05 #3
Luis,
There are any number of alternatives to File.Copy, some of which work better
than others.

However! I'm not so sure that File.Copy is the problem as much as 40
File.Copy calls running at one time is.

To copy a file, you (or the Framework) need to open the source file, open the
destination file, read a chunk of data, write a chunk of data, and repeat
reading & writing until there is no more data.

Depending on the size of the files & the size of the chunks, I would expect
40 copies to bring a system to its knees... and I would expect that adding
network shares to the mix would only compound the problem.

Are you using the ThreadPool or creating the threads yourself?

I would consider using a thread safe Queue to hold the copy requests and
only start a handful of Workers. Each worker would get one copy request from
the Queue, copy the file, then get another request.
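
For what it's worth, a rough, untested sketch of that idea in VB.NET might
look something like this (all the class and member names are made up for
illustration):

Imports System.Collections
Imports System.IO
Imports System.Threading

' A copy request is just a source/destination pair.
Public Class CopyRequest
    Public Source As String
    Public Destination As String

    Public Sub New(ByVal source As String, ByVal destination As String)
        Me.Source = source
        Me.Destination = destination
    End Sub
End Class

Public Class CopyWorkers
    ' Queue.Synchronized wraps the Queue so several workers can share it safely.
    Private m_Queue As Queue = Queue.Synchronized(New Queue)

    Public Sub Add(ByVal request As CopyRequest)
        m_Queue.Enqueue(request)
    End Sub

    Public Sub Start(ByVal workerCount As Integer)
        Dim i As Integer
        For i = 1 To workerCount               ' e.g. 3 or 4 workers, not 40
            Dim t As New Thread(AddressOf Work)
            t.IsBackground = True
            t.Start()
        Next
    End Sub

    Private Sub Work()
        Do While m_Queue.Count > 0
            Dim request As CopyRequest
            Try
                request = DirectCast(m_Queue.Dequeue(), CopyRequest)
            Catch ex As InvalidOperationException
                Exit Do                        ' another worker emptied the queue first
            End Try
            File.Copy(request.Source, request.Destination, True)
        Loop
    End Sub
End Class

You would Add() all the requests first, then call Start(3) or Start(4) and
let the workers drain the Queue.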

I would also consider using Asynchronous File I/O to copy the contents with
Stream.BeginRead & Stream.BeginWrite, allowing the overlapping of the reads
& writes, which may require reducing the number of Workers.
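
Something along these lines, say with 64 KB chunks (again only an untested
sketch; it keeps one read outstanding while the previous chunk is written, and
a full version could use BeginWrite on the destination as well):

Imports System
Imports System.IO

Module OverlappedCopy
    ' Copies one file while keeping a read outstanding during each write
    ' (double buffering). The buffer size is only an example.
    Public Sub CopyOverlapped(ByVal sourcePath As String, ByVal destPath As String)
        Const BufferSize As Integer = 65536
        ' The last constructor argument (True) asks for asynchronous I/O.
        Dim source As New FileStream(sourcePath, FileMode.Open, FileAccess.Read, _
                                     FileShare.Read, BufferSize, True)
        Dim dest As New FileStream(destPath, FileMode.Create, FileAccess.Write, _
                                   FileShare.None, BufferSize, True)
        Try
            Dim readBuffer(BufferSize - 1) As Byte
            Dim writeBuffer(BufferSize - 1) As Byte

            Dim ar As IAsyncResult = source.BeginRead(readBuffer, 0, BufferSize, Nothing, Nothing)
            Do
                Dim bytesRead As Integer = source.EndRead(ar)
                If bytesRead = 0 Then Exit Do

                ' Swap buffers and start the next read before writing the chunk just read.
                Dim temp() As Byte = writeBuffer
                writeBuffer = readBuffer
                readBuffer = temp
                ar = source.BeginRead(readBuffer, 0, BufferSize, Nothing, Nothing)

                dest.Write(writeBuffer, 0, bytesRead)
            Loop
        Finally
            source.Close()
            dest.Close()
        End Try
    End Sub
End Module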

I'm not sure which ones, but I suspect there are a handful of Performance
Counters you could use to monitor the process and dynamically adjust the
number of Workers to match your system's performance: if the system is not
being stressed, increase the workers; if it is being stressed, decrease
them... I know MSDN Magazine had an article a few years ago on how to use
Performance Counters to adjust the number of workers...

Hope this helps
Jay

"luis molina Micasoft" <lu************ ****@discussion s.microsoft.com > wrote
in message news:93******** *************** ***********@mic rosoft.com...
it seems that when i do file.copy the svchost.exe is hanged, i mean if i
make
40 threads of file.copy , 40 copys of files at same time the system is
going
down and stop responding, this is when i'm working with cifs (shares).
there
is another solution to copy files than file.copy in .net?

Nov 21 '05 #4
Yes, you got me: I'm not using a ThreadPool, and I'm trying to copy 20 or 40
files from UNC shares at the same time, with an independent thread for each
one of them. I'm going to try the ThreadPool and asynchronous I/O; thanks for
all.

"Jay B. Harlow [MVP - Outlook]" wrote:
....

Nov 21 '05 #5
Jay,

I am quite sure that using threads for disk copying is useless. It gives you
even more thread-management processing time, and you even have to check, in a
strange way, that you are not using the same filename twice in multiple
threads, with the chance that it will blow up your program because you cannot
control that part of the process.

Cor
Nov 21 '05 #6
Cor,
> I am quite sure that using threads for disk copying is useless.
I suspect that, for you, you are correct! :-|

> you even have to check, in a strange way, that you are not using the same
> filename twice in multiple threads
Which is what the thread-safe Queue is for. Here I am referring to a
System.Collections.Queue object, using either Queue.Synchronized to create a
thread-safe Queue, or encapsulating the Queue in your own class with SyncLock
statements to make it thread safe.

Seeing as a Queue is a First In, First Out (FIFO) construct, you would put all
the copy requests (an object with source name & destination name properties)
into the Queue. The workers would then read a request & process the copy.
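
If you go the SyncLock route rather than Queue.Synchronized, the wrapper can
be as small as this (illustrative only; CopyRequest is just the
source/destination pair object described above):

Imports System.Collections

' A minimal thread-safe wrapper around Queue using SyncLock.
Public Class RequestQueue
    Private m_Queue As New Queue
    Private m_Lock As New Object

    Public Sub Enqueue(ByVal request As CopyRequest)
        SyncLock m_Lock
            m_Queue.Enqueue(request)
        End SyncLock
    End Sub

    ' Returns Nothing when the Queue is empty instead of throwing,
    ' so a worker can simply test the result and stop.
    Public Function TryDequeue() As CopyRequest
        SyncLock m_Lock
            If m_Queue.Count = 0 Then Return Nothing
            Return DirectCast(m_Queue.Dequeue(), CopyRequest)
        End SyncLock
    End Function
End Class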

> with the chance that it will blow up your program because you cannot
> control that part of the process.
My suggestion (using a thread-safe Queue) is a standard pattern for
multi-threading; it's the pattern that the ThreadPool is based on (as
suggested by the method ThreadPool.QueueUserWorkItem).

Asynchronous File I/O is a standard pattern within the Framework.

http://msdn.microsoft.com/library/de...nousfileio.asp

I hope you realize my real suggestion to Luis was to limit the number of
requests; I was then offering alternatives that he may not have considered.

I do agree that one needs to be more careful when writing multi-threaded
applications (or else not use multi-threading at all); however, if you reread
Luis's original question, you will find he is already using multi-threading!

Hope this helps
Jay
"Cor Ligthert" <no************ @planet.nl> wrote in message
news:uw******** ******@TK2MSFTN GP11.phx.gbl... Jay,

I am quiet sure that using threads for diskcopying is useless. It gives
you even more threadmanaging processtime, and you even have even to check
in a strange way that you are not using twice the same filename in
multiple threads with the change that it will blow up your program because
you cannot control a part of this process.

Cor

Nov 21 '05 #7
Luis,
Using the ThreadPool may not help per se.

My point is about the number of requests going on at once. The ThreadPool
will effectively limit you to 25 concurrent requests, and 25 requests may
still be too many.

If I used the ThreadPool I would still try to limit the number of actual
copy requests going on...
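
Just to illustrate (a sketch, nothing definitive): queue up every file pair,
then hand the ThreadPool only a few "drain the queue" work items instead of
one work item per file:

Imports System.Collections
Imports System.IO
Imports System.Threading

Module ThrottledCopy
    ' A synchronized Queue of {source, destination} pairs shared by the workers.
    Private Requests As Queue = Queue.Synchronized(New Queue)

    Public Sub CopyAll(ByVal sources As String(), ByVal destinations As String())
        Dim i As Integer
        For i = 0 To sources.Length - 1
            Requests.Enqueue(New String() {sources(i), destinations(i)})
        Next
        For i = 1 To 4                          ' four concurrent copies, not 20-40
            ThreadPool.QueueUserWorkItem(AddressOf Drain)
        Next
    End Sub

    Private Sub Drain(ByVal state As Object)
        Do While Requests.Count > 0
            Dim pair As String()
            Try
                pair = DirectCast(Requests.Dequeue(), String())
            Catch ex As InvalidOperationException
                Exit Do                         ' another worker emptied the queue first
            End Try
            File.Copy(pair(0), pair(1), True)
        Loop
    End Sub
End Module

One wrinkle: CopyAll returns before the copies finish, so a real program
would still need some way of waiting for the Queue to drain.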

Hope this helps
Jay

"luis molina Micasoft" <lu************ ****@discussion s.microsoft.com > wrote
in message news:1A******** *************** ***********@mic rosoft.com...
yes you got me, im not using a threadpool, and im trying to copy 20 or 40
files from uncs shares at same time with a independent thread for each one
of
them, im gonna try with the threadpool and the asyncronous io, thanks for
all.

"Jay B. Harlow [MVP - Outlook]" wrote:
....


Nov 21 '05 #8
Nak
Hi Luis,

This is probably going to sound stupid, but aren't multiple asynchronous
file operations slow because of hard drive technology? I'll give you an
example: copying a load of MP3s (50 MB) from one drive to the other chugs
along quite happily, but when you throw just one more file copy into the mix
at the same time, the entire operation slows down. My understanding is that
this is because the hard drive can't read/write that many sectors at the
same time.

Probably not helpful, but I thought I'd say it anyway! :-)

Nick.

"luis molina Micasoft" <lu************ ****@discussion s.microsoft.com > wrote
in message news:93******** *************** ***********@mic rosoft.com...
it seems that when i do file.copy the svchost.exe is hanged, i mean if i
make
40 threads of file.copy , 40 copys of files at same time the system is
going
down and stop responding, this is when i'm working with cifs (shares).
there
is another solution to copy files than file.copy in .net?

Nov 21 '05 #9

This thread has been closed and replies have been disabled. Please start a new discussion.
