Bytes IT Community

Does anyone have any suggestions for synchronizing multiple websites?

Does anyone have any suggestions for synchronizing multiple websites?

We have a dev server, a testing server, and a live server. Now, we
have 3 developers, and we are having trouble tracking which files on
which server are the latest. I know it should be obvious, but this is
a bit complicated, as many of you may have experienced.

Any suggestions for software and/or methods/techniques for file
management?

Thanks in advance.

J
Mar 24 '08 #1
3 Replies


On 24 Mar, 15:52, javelin <google.1.jvm...@spamgourmet.com> wrote:
> Does anyone have any suggestions for synchronizing multiple websites?

rsync. Good version control on your dev server too. I'd suggest
Subversion as the current best choice.

Also, stop trying to "synchronise" content. Worry only about
propagating in one direction, and about being able to propagate stuff
reliably. That way it doesn't matter if you overwrite content
unnecessarily; it would just be the same content anyway.

If you use Subversion, "tagging" is important to keep track of stuff
too. Don't propagate straight from the dev server, make it a rule to
always tag a build first, then propagate from that tag.
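A sketch of the tag-then-propagate rule in Subversion, using a throwaway local `file://` repository so the whole flow is visible (the repo path and tag name are illustrative):

```shell
set -e
rm -rf /tmp/repo /tmp/wc /tmp/deploy

# A scratch repository standing in for the team's real one:
svnadmin create /tmp/repo
svn checkout -q file:///tmp/repo /tmp/wc
mkdir /tmp/wc/trunk /tmp/wc/tags
echo '<html>hello</html>' > /tmp/wc/trunk/index.html
svn add -q /tmp/wc/trunk /tmp/wc/tags
svn commit -q -m "Initial site" /tmp/wc

# Tagging is a cheap server-side copy; the tag records exactly what shipped:
svn copy -q -m "Tag build 1.0" \
    file:///tmp/repo/trunk file:///tmp/repo/tags/build-1.0

# Propagate from the tag, never from a moving trunk; export gives a clean
# tree with no .svn metadata, suitable for pushing to a server:
svn export -q file:///tmp/repo/tags/build-1.0 /tmp/deploy
```

If a deployment ever needs to be reproduced or rolled back, the tag says precisely which files went out.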

Don't let habits develop like "just tweaking it a bit" on the live
server, then back-propagating. That way lies madness.

Don't edit files during propagation, even if you do it
automatically. If you _must_ have something different (a .properties
file for DB connections, etc.) then treat each one as a separate file
per server. Try to limit these too.
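One way the per-server-files idea can look in practice (file names and property values here are made up for illustration): keep one versioned file per server and select, rather than edit, at deploy time.

```shell
set -e
rm -rf /tmp/site
mkdir -p /tmp/site/conf

# One config file per server, all under version control:
echo 'db.host=devdb.internal'  > /tmp/site/conf/db.dev.properties
echo 'db.host=livedb.internal' > /tmp/site/conf/db.live.properties

# At deploy time, copy the file for the target server into place.
# Nothing is edited in flight, so the deploy step stays mechanical:
TARGET=live
cp "/tmp/site/conf/db.$TARGET.properties" /tmp/site/conf/db.properties
cat /tmp/site/conf/db.properties
```

The selected file (`db.properties`) is generated, so it should be excluded from version control; only the per-server sources are tracked.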
Mar 24 '08 #2

javelin wrote:
> We have a dev server, a testing server, and a live server. Now, we
> have 3 developers, and we are having trouble tracking which files on
> which server are the latest. I know it should be obvious, but this is
> a bit complicated, as many of you may have experienced.

I'm sure there are software packages to handle this scenario, but for me
it was easier to develop my own mechanism, which runs as a CGI
application on my webserver itself. I have just test and production, but
six developers.

One webpage allows each developer to download files from the test system
(acquiring a lock on the file in the process) and a separate page shows
you which files are newer on the test system.

Each time a file is uploaded to the test system an automatic backup of
the old file is taken. This also happens when a file is promoted from
the test to the production system.

It's simple, but sufficient for our needs. The only slightly unusual
aspect of our system is that any of the developers can remove a lock
held by someone else; this means that we don't need an administrator.

--
Steve Swift
http://www.swiftys.org.uk/swifty.html
http://www.ringers.org.uk
Mar 25 '08 #3

On 25 Mar, 06:33, Steve Swift <Steve.J.Sw...@gmail.com> wrote:
> Each time a file is uploaded to the test system an automatic backup of
> the old file is taken.

Don't use backups; use a real VCS, which stores deltas and full history.

> The only slightly unusual
> aspect of our system is that any of the developers can remove a lock
> held by someone else; this means that we don't need an administrator.

Locks are just plain wrong anyway. Use a copy-modify-merge approach
(each developer checks out a copy, modifies it, and merges) rather than
lock-modify-unlock. If you have more than one developer, it's a vast
saving.
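A VCS like Subversion handles this automatically on `svn update`, but the underlying mechanism is an ordinary three-way merge against the common ancestor. A minimal illustration with `diff3` (file names and contents invented for the demo):

```shell
set -e
# Two developers each edit a private copy; changes are merged against the
# common ancestor, so no one ever holds a lock on the file.
printf 'red\ngreen\nblue\n' > /tmp/base.txt    # common ancestor (the checkout)
printf 'RED\ngreen\nblue\n' > /tmp/alice.txt   # developer A edits line 1
printf 'red\ngreen\nBLUE\n' > /tmp/bob.txt     # developer B edits line 3

# Non-overlapping edits merge cleanly; overlapping edits would be flagged
# as conflicts for a human to resolve:
diff3 -m /tmp/alice.txt /tmp/base.txt /tmp/bob.txt > /tmp/merged.txt
cat /tmp/merged.txt
```

Both developers' changes land in the merged result with no coordination required, which is why copy-modify-merge scales with team size where locking does not.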
Mar 25 '08 #4

This discussion thread is closed; replies have been disabled.