Bytes IT Community

What is a good approach to PostgreSQL database backups?

scubak1w1
P: 53
Hello,

What is a good approach to PostgreSQL backups? (yep, a newbie)

(The server crashed and IT was not able to bring it back from the Acronis images - so we're rebuilding from scratch - ugh! I want to have a secondary system; that lesson is learned!)

Ideally, I'd like the compressed backup that pgAdmin can produce to run on some sort of Task Scheduler... but there is seemingly no built-in functionality for this...

So presumably I should write a DOS (sic) script to do a pg_dump once a week for every database, and a pg_dumpall for the whole cluster (so I get the groups, roles, etc.) say every month? (to a separate network location)
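For concreteness, a minimal sketch of what I'm imagining - the install path, server name, database names, and network share below are all made up:

```shell
@echo off
rem Weekly compressed (custom-format) dump of each database, written to a
rem separate network share. Paths, host, and database names are placeholders.
set BACKUPDIR=\\fileserver\pg_backups

for %%D in (mydb1 mydb2) do (
    "C:\Program Files\PostgreSQL\8.4\bin\pg_dump.exe" -h dbserver -U postgres -F c -f "%BACKUPDIR%\%%D.backup" %%D
)
```

In practice I'd also want to work a date stamp into the filenames so older backups aren't overwritten, and a pgpass.conf file (under %APPDATA%\postgresql\) so it can run unattended without a password prompt.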

Is there a 'lessons learned' style documentation online that I can ingest?

I was reading about Slony - but for the size of the database cluster (at least presently) it seems overkill? Or should I look at that?

Thanks in advance for any pointers, documentation links, etc...

Cheers:
GREG...
Nov 17 '09 #1

✓ answered by scottiebo


2 Replies


scottiebo
P: 1
@scubak1w1
There actually is. If you look at the backup window in pgAdmin, you'll see that it's running a command called pg_dump.exe. Take the whole command line it runs and you can have Task Scheduler run that for you automatically.
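A sketch of that, assuming the pg_dump.exe line below stands in for whatever pgAdmin actually shows on your machine (the paths, task name, and schedule are placeholders):

```shell
rem C:\scripts\pg_backup.bat -- the pg_dump.exe command line copied from
rem pgAdmin's backup window (this one is just an illustrative stand-in):
"C:\Program Files\PostgreSQL\8.4\bin\pg_dump.exe" -h localhost -U postgres -F c -f "\\fileserver\pg_backups\mydb.backup" mydb

rem Then register it with Task Scheduler: weekly, Sundays at 02:00
schtasks /Create /SC WEEKLY /D SUN /ST 02:00 /TN "PostgreSQL backup" /TR "C:\scripts\pg_backup.bat"
```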


> So presumably write a DOS (sic) script to do a pg_dump once a week for every database and a pg_dumpall for the whole cluster (so I get the groups, roles, etc) say every month?? (to a separate network location)

You can just do a pg_dumpall -g to get only the users / groups / tablespaces - you don't need to actually dump EVERYTHING out.
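In script form that's a single extra line next to the weekly pg_dump calls (paths and host are placeholders again):

```shell
rem Dump only the cluster-wide objects (roles, role memberships, tablespaces).
rem This output is tiny, so there's no harm running it weekly too.
"C:\Program Files\PostgreSQL\8.4\bin\pg_dumpall.exe" -h dbserver -U postgres -g > "\\fileserver\pg_backups\globals.sql"
```

Restoring is just feeding globals.sql to psql before restoring the individual per-database dumps.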


> Is there a 'lessons learned' style documentation online that I can ingest?
>
> I was reading about Slony - but for the size of the database cluster (at least presently) it seems overkill? Or should I look at that?

Slony probably is overkill; take a look at point-in-time recovery (PITR):

http://www.enterprisedb.com/docs/en/...archiving.html
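Very roughly, PITR means switching on WAL archiving in postgresql.conf and taking periodic base backups; here's a sketch of the two relevant settings (the archive share is a placeholder - see the linked docs for the full base-backup and recovery procedure):

```shell
# postgresql.conf -- continuous archiving (the basis of PITR)
archive_mode = on
# %p and %f are filled in by the server; this copies each completed
# WAL segment to a network share (placeholder path, Windows syntax):
archive_command = 'copy "%p" "\\\\fileserver\\wal_archive\\%f"'
```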




--
Scott Mead
EnterpriseDB
Nov 17 '09 #2

scubak1w1
P: 53
VERY much appreciated... (and sorry for posting a non-technical reply... :-) )
Nov 17 '09 #3
