Hi Eduardo:
I hope I understand your question correctly.
1. Generally, this is something you can only answer by testing.
Are you running a script on your server that pushes out new changes to
a list of other servers? If so, you can't expect this to happen
instantly. Set a goal, like one hour maximum for all changes to
propagate, then optimize until you meet that goal.
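As a rough sketch of that kind of push script (the UNC share names here are placeholders, assuming the target servers expose a writable share):

```csharp
using System;
using System.IO;

class PushUpdate
{
    // Copy every file in sourceDir to targetDir, overwriting old versions.
    public static void CopyAll(string sourceDir, string targetDir)
    {
        foreach (string file in Directory.GetFiles(sourceDir))
        {
            string dest = Path.Combine(targetDir, Path.GetFileName(file));
            File.Copy(file, dest, true); // true = overwrite existing file
            Console.WriteLine("Copied {0} -> {1}", file, dest);
        }
    }

    static void Main()
    {
        // Hypothetical UNC shares on the target servers
        string[] targets = { @"\\server1\wwwroot", @"\\server2\wwwroot" };
        foreach (string target in targets)
        {
            CopyAll(@"C:\release", target);
        }
    }
}
```

Timing a loop like this over your real server list is the simplest way to see whether the one-hour goal is realistic.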
2. Web servers really only respond to requests. If you need something
that acts proactively, you'll need a Windows service, a scheduled task,
or a web application with a background thread that actively monitors
the filesystem or a database for changes.
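For the filesystem-monitoring case, a minimal sketch using the framework's FileSystemWatcher (the folder path is a placeholder; in a real deployment this would run inside a service rather than a console app):

```csharp
using System;
using System.IO;

class ChangeMonitor
{
    static void Main()
    {
        // Watch a hypothetical drop folder for new or changed files
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\updates");
        watcher.Created += new FileSystemEventHandler(OnChanged);
        watcher.Changed += new FileSystemEventHandler(OnChanged);
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Monitoring... press Enter to quit.");
        Console.ReadLine();
    }

    static void OnChanged(object sender, FileSystemEventArgs e)
    {
        // React to the change here, e.g. kick off the update push
        Console.WriteLine("{0}: {1}", e.ChangeType, e.FullPath);
    }
}
```

Note that FileSystemWatcher can raise multiple events for a single save, so real code usually debounces before acting.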
3. The safest way would be to use client certificates with SSL. This
increases the complexity and cost, however, so you might consider
restricting access by IP address - which has drawbacks also (IPs can
change, IPs can be spoofed). You could also set up user accounts with
passwords for the remote services to log in with. The first step would
be to set a goal for how secure you want to be.
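If you go the IP-restriction route, the check itself is trivial; here's a sketch (the addresses are placeholders) of a helper you could call from Application_BeginRequest or an HttpModule, using Request.UserHostAddress as the client IP:

```csharp
using System;

class IpFilter
{
    // Hypothetical allow-list of the site IPs you trust
    static readonly string[] Allowed = { "192.168.1.10", "192.168.1.11" };

    // Return true only if clientIp is on the allow-list.
    public static bool IsAllowed(string clientIp)
    {
        foreach (string ip in Allowed)
        {
            if (ip == clientIp)
                return true;
        }
        return false;
    }
}
```

IIS can also do this for you via its IP address restrictions, which keeps the policy out of your code.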
HTH,
--
Scott
http://www.OdeToCode.com
On Thu, 9 Sep 2004 10:33:56 -0300, "Eduardo Rosa"
<ed******@yahoo.com.br> wrote:
I'm developing a system in ASP.NET C#. It will be deployed to a lot of
sites on diverse servers. When I need to update the system by sending
out a newer version, all sites have to be updated immediately. I
thought about keeping the script on my server, to be accessed by the sites.
But I have some questions:
1. Could that overload my server?
2. What is the best way to do this, perhaps a web server?
3. How can I keep it secure, accepting only the sites that I want?
thanks a lot
and sorry for my English....