Multiple file download from network drive via web

Hello all,

I have been trying to determine the best way to do this, but can't
seem to get any solution to work exactly the way I want. The scenario
is that I have some xml files being placed on a network drive of one
of our servers. I need to copy these files to my web server for
processing. Ideally I was going to write a service that would monitor
that directory so that, when a new file appeared, it was automatically
copied, and I may still look at doing that.
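(If I do end up writing that monitoring service, I'm picturing something
roughly like the sketch below, built on FileSystemWatcher. The
\\fileserver\xmldrop share and the target folder are just placeholder
names standing in for the real paths.)

using System;
using System.IO;

class XmlDropWatcher
{
    // Placeholder paths -- swap in the real share and web folder.
    const string SourceShare = @"\\fileserver\xmldrop";
    const string TargetFolder = @"C:\Inetpub\wwwroot\putthemhere";

    static void Main()
    {
        FileSystemWatcher watcher = new FileSystemWatcher(SourceShare, "*.xml");
        watcher.Created += delegate(object sender, FileSystemEventArgs e)
        {
            // Copy each new .xml file over to the web server as it appears.
            // (A real service would wait/retry until the file is fully written.)
            File.Copy(e.FullPath, Path.Combine(TargetFolder, e.Name), true);
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for new xml files; press Enter to stop.");
        Console.ReadLine();
    }
}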

However, right now I am just trying to copy the whole directory's
contents via a link in my ASP.NET/C# web app. One thing I did was
create a virtual directory on my IIS box, set it up as an application,
and point it at the UNC path. That seems to work. Then I found some
sample code:

using System.IO;
using System.Net;

// Fetch one file over HTTP from the virtual directory and save it
// under the local web root.
string RemoteFolder = @"http://www.bogussite.com/123/";
string RemoteFile = "123.xml";
string url = RemoteFolder + RemoteFile;

HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse();

using (StreamReader sr = new StreamReader(webResponse.GetResponseStream()))
using (StreamWriter sw = new StreamWriter(@"C:\Inetpub\wwwroot\putthemhere\123.xml"))
{
    string filecontent = sr.ReadToEnd();
    sw.Write(filecontent);
}

This works great, but it only copies one file and doesn't seem to like
wildcards; *.xml would be ideal. I also looked at trying
WebClient.DownloadFile, but that did not seem to work for multiple
files either.
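(For what it's worth, the effect I'm after is basically the wildcard
copy sketched below, done straight against the share with System.IO.
The \\fileserver\xmldrop path is a placeholder for the real UNC, and
the ASP.NET worker identity would need read rights on the share for
this to work.)

using System.IO;

// Sketch: copy every .xml file from the network share into the web folder.
// Both paths are placeholders for the real share and target directory.
string source = @"\\fileserver\xmldrop";
string target = @"C:\Inetpub\wwwroot\putthemhere";

foreach (string file in Directory.GetFiles(source, "*.xml"))
{
    File.Copy(file, Path.Combine(target, Path.GetFileName(file)), true);
}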

Any thoughts would be appreciated.
Greg
Jan 16 '08 #1
2 Replies
HTTP, unlike FTP, does not support directory commands or multiple-file
downloads.

You could add an aspx page or web service that returns a directory
listing. You could also turn on directory browsing; with directory
browsing enabled, a GET of the site root returns a directory listing.
You would have to parse that list yourself because it's not in a
standard format.
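For example, a simple .ashx handler along these lines could return the
names of the xml files, one per line. This is only a sketch; the UNC
path is a placeholder and the handler name is made up:

<%@ WebHandler Language="C#" Class="ListXmlFiles" %>

using System.IO;
using System.Web;

// Returns the .xml file names found in the share, one per line.
public class ListXmlFiles : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";
        foreach (string file in Directory.GetFiles(@"\\fileserver\xmldrop", "*.xml"))
        {
            context.Response.Write(Path.GetFileName(file) + "\n");
        }
    }

    public bool IsReusable
    {
        get { return true; }
    }
}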

-- bruce (sqlwork.com)
"gh***@medex.com" wrote:
Hello all,

I have been trying to determine the best way to do this, but can't
seem to get any solution to work exactly the way I want. The scenario
is that I have some xml files being placed on a network drive of one
of our servers. I needed to copy this files to my web server to do
processing. Ideally I was going to write a server service that
monitor wold monitor that directory so that when a new file appeared
it was automatically copied and I may still look at doing that.

However right now I am just trying to perform a copy of the whole
directory contents via a link in my ASP.NET/C# web app. One thing I
did was to create a virtual directory on my iis box, set it up as an
application and point it to the UNC of the file. That seems to work.
Then Ifound some sample code:

string RemoteFolder = @"http://www.bogussite.com/123/";
string RemoteFile = "123.xml";
string url = RemoteFolder + RemoteFile;
HttpWebRequest webRequest =
(HttpWebRequest)WebRequest.Create(url);
HttpWebResponse webResponse =
(HttpWebResponse)webRequest.GetResponse();
StreamReader sr = new
StreamReader(webResponse.GetResponseStream());
string filecontent = sr.ReadToEnd();
StreamWriter sw = new StreamWriter(@"C:\Inetpub\wwwroot
\putthemhere\123.xml");
sw.Write(filecontent);
sw.Flush();
sw.Close();
sr.Close();

This works great, but only copies one file and doesn't seem to like
wild cards, *.xml would be ideal. I also look at trying to use
WebClient.download file but that did not seem to work for multiple
files either.

Any thoughts would be appreciated.
Greg
Jan 16 '08 #2
Welcome to programming....
Sometimes you have to write the code yourself; the code you have is a
good start :)
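For example, once you have the list of file names (from a directory
listing page, or hard-coded to start with), a loop like this rough
sketch will pull them down one at a time. The base URL, file names and
target folder here are only placeholders:

using System.Net;

// Sketch: download each file in the list through the virtual directory.
string baseUrl = "http://www.bogussite.com/123/";
string targetFolder = @"C:\Inetpub\wwwroot\putthemhere\";
string[] files = new string[] { "123.xml", "456.xml" };

using (WebClient client = new WebClient())
{
    foreach (string name in files)
    {
        client.DownloadFile(baseUrl + name, targetFolder + name);
    }
}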
George.

Jan 17 '08 #3
