A file *is* a byte array. Be aware, though, that in .NET a single array is capped at roughly 2 GB (it's indexed by Int32), and allocating even 200 MB as one contiguous block puts real pressure on RAM. You will probably want to use a FileStream instead, as it is designed to buffer file contents as it reads them rather than loading the whole file at once.
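To make that concrete, a buffered read with FileStream looks roughly like this (a minimal C# sketch; the path and chunk size are placeholder assumptions, not tuned values):

```csharp
using System;
using System.IO;

class ChunkedReader
{
    static void Main()
    {
        // Hypothetical source path -- substitute your own.
        string path = @"C:\temp\bigfile.dat";

        // 64 KB chunks instead of one 200 MB byte array.
        byte[] buffer = new byte[64 * 1024];

        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            int bytesRead;
            while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process or forward the chunk here; only 64 KB is
                // ever held in memory at a time.
            }
        }
    }
}
```

The using block guarantees the stream is closed even if the read throws.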
Is the problem that you need to throttle the transfer across the network so that it doesn't swamp the LAN?
In any case, a novel approach that addresses all your problems would be to set up an internal FTP server (I believe FTP is available as an optional component of IIS on Windows Server). If you need to throttle the transfer, the FTP server has bandwidth settings, so your 200 MB file transfers won't swamp the network. You can even have the web service communicate directly with it, since neither needs any domain rights (both can run as a locked-down web user account). Wouldn't that solve your problem?
Alternatively, I think the FileSystemWatcher is viable, whether you put the service on the web server and "push" the file across the LAN, or put it on the destination and "pull" the file. If you need to throttle the copy, you would read the file a small chunk at a time, pausing for some milliseconds (or seconds) between chunks. But in that case, I think the internal FTP server solution is far more robust and easier to implement.
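Here's a rough sketch of the watcher-plus-throttled-copy idea in C# (the directories, chunk size, and delay are illustrative assumptions you'd tune for your LAN):

```csharp
using System;
using System.IO;
using System.Threading;

class ThrottledCopier
{
    static void Main()
    {
        // Hypothetical source directory -- adjust to your environment.
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\outbound");
        watcher.Created += (sender, e) =>
            ThrottledCopy(e.FullPath, Path.Combine(@"\\destserver\drop", e.Name));
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching... press Enter to quit.");
        Console.ReadLine();
    }

    static void ThrottledCopy(string source, string dest)
    {
        // 32 KB per tick -- raise or lower to suit your bandwidth budget.
        byte[] buffer = new byte[32 * 1024];

        using (FileStream input = File.OpenRead(source))
        using (FileStream output = File.Create(dest))
        {
            int bytesRead;
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, bytesRead);
                Thread.Sleep(50); // pause between chunks so you don't swamp the LAN
            }
        }
    }
}
```

One caveat: the Created event fires as soon as the file appears, possibly while it is still being written, so in practice you'd retry until the source file can be opened exclusively before starting the copy.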
"Sue" <iy*********@hotmail.com> wrote in message
news:11**********************@g43g2000cwa.googlegroups.com...
Having a shared directory is not an option (don't ask why; the higher-ups have made that decision). A web service is the last option to consider. I was asked to look into whether the huge (200 MB) file can be stored in a byte array and streamed across the network...
Also, do you know what the max limit for a byte array is?