
Advice needed for file archiving utility..

Jim
I work in the environmental industry, and we have large GC/MS
instruments that acquire raw data to an attached computer. These
computers are networked and have shared folders so the data can be
accessed from the network.

I would like to create a fairly simple utility that copies data from
the computers' shared folders onto the network. After a file has
successfully copied, it will also check whether the file's or folder's
modified date is older than a set retention time; if it is older than
the retention time and a file with the same or newer modified date
resides on the network, then it deletes the file from the local machine.
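Per file, the logic I have in mind is roughly the VB.NET sketch below
(a minimal sketch using System.IO; the RetentionDays value and the
folder paths are placeholders I made up):

Imports System.IO

Module ArchiveSketch
    ' Placeholder retention window, in days.
    Const RetentionDays As Integer = 30

    Sub ArchiveFolder(ByVal localShare As String, ByVal networkRoot As String)
        For Each localFile As String In Directory.GetFiles(localShare)
            Dim target As String = Path.Combine(networkRoot, Path.GetFileName(localFile))

            ' Copy the raw data file up to the network share.
            File.Copy(localFile, target, True)

            ' Delete locally only when the file is past retention AND an
            ' equal-or-newer copy is confirmed on the network.
            Dim localTime As DateTime = File.GetLastWriteTime(localFile)
            If localTime < DateTime.Now.AddDays(-RetentionDays) _
               AndAlso File.Exists(target) _
               AndAlso File.GetLastWriteTime(target) >= localTime Then
                File.Delete(localFile)
            End If
        Next
    End Sub
End Module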

What I'm wondering is:

1. What would be the fastest, most efficient way to copy these files?
For example, using the Windows API versus the built-in functions of VB.

2. Are there any advantages to using .NET vs. VB6 to write this
program? Does .NET have new or updated functions that would benefit me
in writing such an application?

3. What suggestions do you have for error trapping? For instance, how
would I log specific files that didn't copy? Is there a particular
order or logic I should follow: checking for files that need to be
copied, whether they actually copied, and whether the computer I'm
trying to connect to is even turned on? If it isn't, put that computer
at the bottom of the queue and try again later; if it still isn't on,
email the administrator to let him know that computer didn't get
backed up, etc. (the rough sketch after this list shows the flow I'm
picturing). Any advice in this area or past experiences appreciated.
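To make (3) concrete, here is the retry/notify flow I'm picturing, as
a VB.NET sketch. It assumes .NET 2.0 (generics, System.Net.Mail); the
share name, log file name, mail server, and addresses are all made-up
placeholders:

Imports System.IO
Imports System.Collections.Generic
Imports System.Net.Mail

Module RetrySketch
    Const MaxAttempts As Integer = 2

    Sub ProcessComputers(ByVal computers As IEnumerable(Of String))
        Dim pending As New Queue(Of String)(computers)
        Dim attempts As New Dictionary(Of String, Integer)

        While pending.Count > 0
            Dim name As String = pending.Dequeue()
            If TryBackUp(name) Then Continue While

            Dim tries As Integer = 0
            attempts.TryGetValue(name, tries)
            tries += 1
            attempts(name) = tries

            If tries < MaxAttempts Then
                ' Machine unreachable or copy failed: push it to the
                ' back of the queue and retry later.
                pending.Enqueue(name)
            Else
                Log("Backup failed for " & name)
                NotifyAdmin(name)
            End If
        End While
    End Sub

    Function TryBackUp(ByVal computer As String) As Boolean
        ' Placeholder: the reachability check plus the real copy work
        ' would go here.
        Return Directory.Exists("\\" & computer & "\data")
    End Function

    Sub Log(ByVal message As String)
        File.AppendAllText("backup.log", _
            DateTime.Now.ToString() & " " & message & Environment.NewLine)
    End Sub

    Sub NotifyAdmin(ByVal computer As String)
        ' Hypothetical mail server and addresses.
        Dim client As New SmtpClient("mailserver")
        client.Send("backup@example.com", "admin@example.com", _
            "Backup failed", computer & " did not get backed up.")
    End Sub
End Module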

Thanks for your help ahead of time!
Nov 21 '05 #1
Jim,
Have you looked into folder replication using "Distributed File System"?
"Jim" <gu**@hotmail.com> wrote in message
news:2b*************************@posting.google.co m...
<snip original post>

Nov 21 '05 #2
From my initial look at DFS & Replication, it looks as if it would
take care of replicating the local client computers' shared data, but
I would still need a utility that removes data from the local computer
based on the retention time set for that particular computer, and only
after it has checked to make sure that data is on the network file
server.

Or... can DFS & Replication also be set up to remove data based on a
retention period? If so, a URL would be useful.

If not, then I would still need a utility that removes the data,
something along the lines of the sketch below.
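Here is that sketch in VB.NET, a delete-only pass driven by
per-computer retention settings; the computer names, paths, and day
counts are made-up examples, and it deletes a local file only after
confirming an equal-or-newer copy on the file server:

Imports System.IO
Imports System.Collections.Generic

Module PruneSketch
    Sub Main()
        ' Hypothetical per-computer retention windows, in days.
        Dim retentionDays As New Dictionary(Of String, Integer)
        retentionDays.Add("GCMS01", 30)
        retentionDays.Add("GCMS02", 60)

        For Each entry As KeyValuePair(Of String, Integer) In retentionDays
            PruneFolder("\\" & entry.Key & "\data", _
                        "\\fileserver\archive\" & entry.Key, entry.Value)
        Next
    End Sub

    Sub PruneFolder(ByVal share As String, ByVal archive As String, ByVal days As Integer)
        Dim cutoff As DateTime = DateTime.Now.AddDays(-days)
        For Each localFile As String In Directory.GetFiles(share)
            Dim archived As String = Path.Combine(archive, Path.GetFileName(localFile))
            Dim localTime As DateTime = File.GetLastWriteTime(localFile)

            ' Delete only past-retention files whose copy on the file
            ' server is at least as new as the local one.
            If localTime < cutoff AndAlso File.Exists(archived) _
               AndAlso File.GetLastWriteTime(archived) >= localTime Then
                File.Delete(localFile)
            End If
        Next
    End Sub
End Module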
"solex" <so***@nowhere.com> wrote in message news:<eN**************@TK2MSFTNGP10.phx.gbl>...
Jim,
Have you looked into folder replication using "Distributed File System"?
"Jim" <gu**@hotmail.com> wrote in message
news:2b*************************@posting.google.co m...
I work in the environmental industry and we have large gcms
instruments that aquire raw data to an attached computer. These
computers are networked, and have shared folders so the data can be
accessed from the network.

I would like to create a fairly simple utility that copies data from
the computers shared folders onto the network. It will also depending
on if the file successfully copied, check if the file/folder(s)
modified date is older than a set retention time - if it is older then
the retention time and the a file with the same or newer modifed date
resides on the network than it delete the file off the local machine.

What Im wondering is :

1. What would be the fastest most efficient way to copy these files.
For example using API versus the built in functions of VB.

2. Im wondering if there are any advtanges to using .NET vs VB6 to
write this program. Does .NET have new updated functions that would
benefit me in writing such an application.

3. What kind of suggestions do you have for error trapping. For
instance how would I log specific files that didnt copy. Is there a
particular order or logic I should have when checking for files that
need to be copied, if they are copied, is the computer im trying to
connect to even turned on, if it isnt than put that computer at the
bottom of the queue and try again later, if it still isnt on keep then
email administrator to let him know that computer didnt get backed up,
etc.. Any advice in this area or past experiences appreciated.

Thanks for your help ahead of time!

Nov 21 '05 #3
