Bytes IT Community

One Database vs. Multiple Databases

TC
I need to design a system which represents multiple "projects" in SQL
Server. Each project has the same data model, but is independent of all
others. My inclination is to use one database to store all projects.
Looking at the numbers involved, however, I wonder if I would get
better performance by storing each project in its own database.

Suppose I have 50 projects, each with two users and 10,000 rows; it
seems to me I'd rather have 50 x 2 users working in a table with 10,000
rows than 1 x 100 users working in a table with 500,000 rows.

On the other hand, the single database approach seems more elegant from
a design perspective. I wouldn't be creating multiple copies of an
identical data model, and I wouldn't be creating new databases as a
business procedure, every time a new project is required.

Here are my questions:
1. For the scenario described above, am I correct to assume I will get
better performance by using multiple databases, or does SQL Server have
some clever way of achieving the same performance in a single database?
2. Is the multiple database approach common? If anyone has tried it,
please tell me about how it works in practice.
-TC

Jun 20 '06 #1
9 Replies


TC wrote:
> I need to design a system which represents multiple "projects" in SQL
> Server. Each project has the same data model, but is independent of all
> others. My inclination is to use one database to store all projects. [...]


1. Not unless your implementation is very bad indeed. 100 users and
500,000 rows is a small database by most standards.

2. Sometimes. Partitioning a database can make sense for administrative
and support reasons, or as part of a solution where data is distributed
over multiple servers. But without other changes, partitioning a
database isn't likely to achieve much, if anything, in terms of
performance. Given the potential complexity of supporting that kind of
solution, there are certainly much easier and more effective ways to
optimise performance.

--
David Portas, SQL Server MVP

Whenever possible please post enough code to reproduce your problem.
Including CREATE TABLE and INSERT statements usually helps.
State what version of SQL Server you are using and specify the content
of any error messages.

SQL Server Books Online:
http://msdn2.microsoft.com/library/m...S,SQL.90).aspx
--

Jun 20 '06 #2


TC wrote:
> I need to design a system which represents multiple "projects" in SQL
> Server. Each project has the same data model, but is independent of all
> others. My inclination is to use one database to store all projects. [...]


I would go with one database per project (so multiple databases):
- If your data model changes, you will be able to migrate only the
projects you want, when you want.
- It is easier to separate a project, restart a project, etc. if you
need to.
- You can back up and restore each project independently.
- Giving a user access to a particular project is easier.
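The per-project approach does imply a provisioning step every time a new project starts. A minimal sketch of what that might look like in T-SQL (the database, login, and user names here are hypothetical):

```sql
-- Provision a new, independent database for a project
CREATE DATABASE ProjectAlpha;
GO

USE ProjectAlpha;
GO

-- Grant one of the project's two users access to this database only
CREATE LOGIN AliceLogin WITH PASSWORD = 'Str0ng!Passw0rd';
CREATE USER Alice FOR LOGIN AliceLogin;
EXEC sp_addrolemember 'db_datareader', 'Alice';
EXEC sp_addrolemember 'db_datawriter', 'Alice';
```

Scoping the user to one database is what makes the per-project security story simple; the cost is repeating this script (and every schema change) for all 50 projects.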

Jun 20 '06 #3

I would go the other way - one database for all. It will be more
scalable and flexible. The amount of data is not too much, and speed
will not be an issue as long as the tables are properly indexed.

I actually did a data conversion merging several hundred databases
(also called projects) into one.

Jun 20 '06 #4

TC (go*********@yahoo.com) writes:
> I need to design a system which represents multiple "projects" in SQL
> Server. Each project has the same data model, but is independent of all
> others. My inclination is to use one database to store all projects. [...]


Whether to use one or many databases has nothing to do with performance
whatsoever. If performance is the only motive for you to consider
separate databases, just forget about it given the volumes you indicated.

There may be other reasons for using separate databases. One project
says "oops, we deleted our data". With a separate database, a restore
is a quick thing. Or some projects may start to call for diverging
requirements, so that they no longer fit into the same model. There
can also be security considerations.

But all of that depends on business requirements that are unknown to
me. Since maintaining 50 databases with the same model adds overhead,
a single database with a good data model is a good way to start.
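The "oops, we deleted our data" case is where separate databases shine: one project can be rolled back without touching the others. A sketch of that single-project restore, with hypothetical database and backup file names:

```sql
-- Restore one project's database from its last full backup,
-- leaving every other project's database untouched
RESTORE DATABASE ProjectAlpha
FROM DISK = 'D:\Backups\ProjectAlpha.bak'
WITH REPLACE;
```

With a single shared database, the same recovery means restoring a copy elsewhere and copying one project's rows back by hand.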
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Jun 20 '06 #5


"TC" <go*********@yahoo.com> wrote in message
news:11*********************@h76g2000cwa.googlegro ups.com...
> I need to design a system which represents multiple "projects" in SQL
> Server. [...]
>
> Suppose I have 50 projects, each with two users and 10,000 rows; it
> seems to me I'd rather have 50 x 2 users working in a table with 10,000
> rows than 1 x 100 users working in a table with 500,000 rows.
This is a small database by today's standards.

In general a single database will probably give you better performance since
only one copy of query plans will be cached, as opposed to 50 (assuming you
use stored procs, etc.).

Disk I/O will probably be lower as well, since SQL Server can do a
better job of reading rows in batches.

So performance-wise, single probably wins out.

In terms of maintenance, etc., a single database is generally better.
Assume you develop an updated version of a stored proc, or need to
change a table. Would you rather do it once, or 50 times?
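The plan-caching point can be made concrete. In a single database, one parameterised procedure (and one cached plan) serves every project; the sketch below assumes a hypothetical Tasks table keyed by ProjectID:

```sql
-- One procedure, one cached plan, shared by all 50 projects.
-- In the per-database design, 50 copies of this proc would exist,
-- each compiled and cached separately.
CREATE PROCEDURE dbo.GetProjectTasks
    @ProjectID int
AS
    SELECT Task_ID, Task_Name, Due_Date
    FROM dbo.Tasks
    WHERE Project_ID = @ProjectID;
```

Updating the procedure is then a single ALTER PROCEDURE, rather than a rollout across 50 databases.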



Jun 21 '06 #6

TC wrote:
I need to design a system which represents multiple "projects" in SQL
Server. Each project has the same data model, but is independent of all
others.


Others have addressed the pros and cons of using a single database
versus multiple databases, and my response would be that the real
considerations are maintenance and security: two issues you have not
discussed.

But what I would like to add to this discussion is based on your
first two sentences.

From what you've written I can't see how you can justify, except
for security purposes, more than one set of tables with the same
data model. I think Date and Codd said something about it so you
might want to read what they wrote. But based solely on the above
sentences ... the correct solution is to add a column to your
tables named PROJECT_ID.
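A minimal sketch of that design, with hypothetical table and column names (a Tasks table stands in for whatever per-project data the system holds):

```sql
-- All projects share one set of tables; PROJECT_ID keeps them separate
CREATE TABLE dbo.Projects (
    Project_ID   int NOT NULL PRIMARY KEY,
    Project_Name varchar(100) NOT NULL
);

CREATE TABLE dbo.Tasks (
    Task_ID    int NOT NULL PRIMARY KEY,
    Project_ID int NOT NULL REFERENCES dbo.Projects (Project_ID),
    Task_Name  varchar(100) NOT NULL
);

-- Every query filters on Project_ID, so index it
CREATE INDEX IX_Tasks_Project ON dbo.Tasks (Project_ID);
```

With roughly 10,000 rows per project and 50 projects, the Tasks table holds about 500,000 rows, which the indexed Project_ID column narrows to one project's slice per query.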
--
Daniel A. Morgan
University of Washington
da******@x.washington.edu
(replace x with u to respond)
Puget Sound Oracle Users Group
www.psoug.org
Jun 21 '06 #7

As everybody has mentioned, 500k records and 100 users is small by any
definition. But regardless, performance should not be a consideration
until it forces itself to become so.

Think about your idea from a maintenance perspective. At some point
you'll need to add a column to one of those tables, or even make a
simple stored procedure change. Imagine the pain this will cause you
down the road, trying to synchronize changes in all those databases.
Multiply every small hassle you'll ever come across in the future by
the number of "Projects" in your universe. Yikes!

Stick to one database per Application and you'll live a long and
healthy life.

Jason Kester
Expat Software Consulting Services
http://www.expatsoftware.com/

---
Get your own Travel Blog, with itinerary maps and photos!
http://www.blogabond.com/

Jun 22 '06 #8

I recently recommended a multiple database solution, since my
assessment of the client's needs and data was that they needed the
flexibility of separate databases per data set. Also, my deadline for
completion was very short, and this product was not expected to be
used beyond the near future.

Each external client's (about 20 clients) data set is different - even
the same client data set could vary - and the process was to massage
each set of data via stored procedures. I initially opted for one
database per client, to allow for reuse of lookup data (holidays,
fiscal periods, accounts, etc.), but later realized that what I was
designing, an Excel workbook to analyze accounting data, would be
easier to modify if I kept everything the same and simply modified the
connection string.

Ultimately, the choice of multiple databases keeps the stored procs for
the XLW, and the XLW itself, the same, while the only modification
occurs in the stored procedures that build the data, since the data
requirements could vary wildly. The multiple database approach also
allows multiple data sets to be massaged at the same time - building
the final data set could take several days, so concurrency problems
could develop - and with procedures varying that much, the independence
was a benefit, albeit at the cost of a lot of redundancy.

Given enough time to design a solution that preserved security - no
client-specific information, nor information useful for hacking the
server's data, could be stored locally, other than the connection
string - and that ran as a single database, I could certainly design
one for them.
James Igoe

ja********@gmail.com || http://code.comparative-advantage.com


Jun 22 '06 #9

TC
I want to thank everyone for their thoughtful responses. You've helped
me get perspective on this issue.

-TC

Jun 22 '06 #10

This discussion thread is closed.