Recommended SQL Table Size

I'm writing a web application that will collect as many as 200 entries per
day (~20 fields per entry). The data fields will always be the same, so if
the sky's the limit I could just keep adding these same entries to one
enormous SQL table -- collecting 200 entries per day, 7,000 per month, ~84,000
per year, and on, and on, for the rest of the program's existence. I realize
that this is probably not practical and would cause a serious strain on the
database when modifying entries.

SHOULD I, or HOW SHOULD I, try to split up this information? Should I create
a new table for every day, month, or year? Eventually I hope many people will
be accessing this site and searching this content, so it should be
efficient.
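
For concreteness, the kind of table I have in mind would look roughly like
this (the table and column names below are only placeholders, not my actual
schema):

    -- Placeholder schema: one row per collected entry; the real table
    -- would have ~20 data fields of various types.
    CREATE TABLE Entries (
        EntryId   INT IDENTITY(1,1) PRIMARY KEY,
        CreatedAt DATETIME NOT NULL DEFAULT GETDATE(),  -- when the entry was collected
        Field01   NVARCHAR(100) NULL,
        Field02   NVARCHAR(100) NULL,
        -- Field03 through Field19 omitted for brevity
        Field20   NVARCHAR(100) NULL
    );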

Any help is appreciated,
Jacob
Nov 15 '05 #1
Probably not the right group for this, but 84,000 records is nothing. I've
written apps that collect more than 84,000 records in an hour or so. Just
make sure to properly index your table, and you won't have any issues with
updating any of the records or performing any processes for clean-up or
whatever you might need.
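
For example, a sketch along these lines (the table and column names are only
illustrative, not your actual schema):

    -- A non-clustered index on the timestamp column keeps date-range
    -- lookups fast even as the table grows.
    CREATE NONCLUSTERED INDEX IX_Entries_CreatedAt
        ON Entries (CreatedAt);

    -- A search like this can then seek on the index instead of
    -- scanning the whole table:
    SELECT EntryId, CreatedAt, Field01
    FROM Entries
    WHERE CreatedAt >= '2005-11-01' AND CreatedAt < '2005-12-01';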

If I have my numbers right:
2,480 Terrarium Clients Reporting
13 Records per report
6 Minutes per report

That is 32,240 records (about 10 fields) per 6 minutes, or 322,400 per hour.
We ran our servers for months and months on end before even thinking about
clearing out any data. To date, the public Terrarium server has only been
cleared of data once, and that was during the first SQL Server attack well
over a year ago.

--
Justin Rogers
DigiTec Web Consultants, LLC.

"Jacob" <ja**********@REMOVETHIShotmail.com> wrote in message
news:Fr4Ob.5843$A74.2285@fed1read02...
I'm writing a web application that will collect as many as 200 entries per
day(~20 fields per entry). The data fields will always be the same and so,
if the sky's the limit, I could just keep adding these same entries to one
enormous SQL table-- collecting 200 entries per day, 7000 per month, ~84,000
per year, and on, and on, for the rest of the programs existence. I realize
that this is probably not practical and would cause a serious strain on the
database when modifying entries.

SHOULD I or HOW SHOULD I try and split up this information. Should I create
a new table for every day, month, year? Eventually I hope many people will
be accessing this site, and searching this content, so it should be
efficient.

Any help is appreciated,
Jacob

Nov 15 '05 #2

"Jacob" <ja**********@REMOVETHIShotmail.com> wrote in message
news:Fr4Ob.5843$A74.2285@fed1read02...
I'm writing a web application that will collect as many as 200 entries per
day(~20 fields per entry). The data fields will always be the same and so, if the sky's the limit, I could just keep adding these same entries to one
enormous SQL table-- collecting 200 entries per day, 7000 per month, ~84,000 per year, and on, and on, for the rest of the programs existence. I realize that this is probably not practical and would cause a serious strain on the database when modifying entries.
WHOAAA :-)

Right.

Sorry if this sounds sarcastic.

84,000 rows per year.

This means basically that you have 840,000 rows in 10 years

and 8,400,000 rows in 100 years.

AND EVEN THEN THIS WOULD NOT BE A LARGE TABLE.

I normally start considering a table "non-trivial" at around 1 million rows,
and not small until around 10 million. For you, that is on the order of 100
years of data.

Never seen a db table with 1,000,000,000 rows? That is possible, and without
too many problems.
> SHOULD I, or HOW SHOULD I, try to split up this information?

No, forget it. Get some REAL DATA VOLUME first.

> Should I create a new table for every day, month, or year? Eventually I hope
> many people will be accessing this site and searching this content, so it
> should be efficient.

Nope. Read up some books on how capable your database REALLY is.

Thomas Tomiczek
THONA Software & Consulting Ltd.
(Microsoft MVP C#/.NET)
Nov 15 '05 #3
How do you index a database table?
Is there any reference site on this?

"Justin Rogers" <Ju****@games4dotnet.com> wrote in message
news:e6**************@TK2MSFTNGP11.phx.gbl...
Probably not the right group for this, but 84,000 records is nothing. I've written apps
that collect more than 84,000 records in an hour or so. Just make sure to
properly
index your table, and you won't have any issues with updating any of the records or
performing any processes for clean-up or whatever you might need.

If I have my numbers right:
2,480 Terrarium Clients Reporting
13 Records per report
6 Minutes per report

That is 32,240 records (about 10 fields) per 6 minutes or 322,400 per hour. We ran
our servers for months and months on end before even thinking about clearing out any
data. To date, the public Terrarium server has only been cleared of data once and that
was during the first SQL Server attack well over a year ago.

--
Justin Rogers
DigiTec Web Consultants, LLC.

"Jacob" <ja**********@REMOVETHIShotmail.com> wrote in message
news:Fr4Ob.5843$A74.2285@fed1read02...
I'm writing a web application that will collect as many as 200 entries per day(~20 fields per entry). The data fields will always be the same and so, if the sky's the limit, I could just keep adding these same entries to one enormous SQL table-- collecting 200 entries per day, 7000 per month, ~84,000 per year, and on, and on, for the rest of the programs existence. I realize that this is probably not practical and would cause a serious strain on the database when modifying entries.

SHOULD I or HOW SHOULD I try and split up this information. Should I create a new table for every day, month, year? Eventually I hope many people will be accessing this site, and searching this content, so it should be
efficient.

Any help is appreciated,
Jacob


Nov 15 '05 #4
That's what I needed to hear. :)

Jacob
"Thomas Tomiczek [MVP]" <t.********@thona-consulting.com> wrote in message
news:eY****************@TK2MSFTNGP12.phx.gbl...

"Jacob" <ja**********@REMOVETHIShotmail.com> wrote in message
news:Fr4Ob.5843$A74.2285@fed1read02...
I'm writing a web application that will collect as many as 200 entries per day(~20 fields per entry). The data fields will always be the same and so,
if the sky's the limit, I could just keep adding these same entries to one enormous SQL table-- collecting 200 entries per day, 7000 per month,

~84,000
per year, and on, and on, for the rest of the programs existence. I

realize
that this is probably not practical and would cause a serious strain on

the
database when modifying entries.


WHOAAA :-)

Right.

Sorry if this sounds sarcastic.

84.000 rows per year.

This means basically that you ahve 840.000 rows in 10 years

and 8.400.000 rows in 100 years.

AND EVEN THEN THIS WOULD NOT BE A LARGE TABLE.

I normally start considering a table "non-trivial" at around 1 million

rows and not small at 10. THis means for you in 100 years.

Never seen a db table with 1.000.000.000 rows? This is possible and without too many problems.
SHOULD I or HOW SHOULD I try and split up this information. Should I

create

No, forget it. Get some REAL DATA VOLUME first.
a new table for every day, month, year? Eventually I hope many people

will
be accessing this site, and searching this content, so it should be
efficient.


Nope. Read up some books on how capable your database REALLY is.

Thomas Tomiczek
THONA Software & Consulting Ltd.
(Microsoft MVP C#/.NET)

Nov 15 '05 #5
You're right, I should have found a different newsgroup. But your
generous answer was what I needed to hear. :)

Thanks,
Jacob
"Justin Rogers" <Ju****@games4dotnet.com> wrote in message
news:e6**************@TK2MSFTNGP11.phx.gbl...
Probably not the right group for this, but 84,000 records is nothing. I've written apps
that collect more than 84,000 records in an hour or so. Just make sure to
properly
index your table, and you won't have any issues with updating any of the records or
performing any processes for clean-up or whatever you might need.

If I have my numbers right:
2,480 Terrarium Clients Reporting
13 Records per report
6 Minutes per report

That is 32,240 records (about 10 fields) per 6 minutes or 322,400 per hour. We ran
our servers for months and months on end before even thinking about clearing out any
data. To date, the public Terrarium server has only been cleared of data once and that
was during the first SQL Server attack well over a year ago.

--
Justin Rogers
DigiTec Web Consultants, LLC.

"Jacob" <ja**********@REMOVETHIShotmail.com> wrote in message
news:Fr4Ob.5843$A74.2285@fed1read02...
I'm writing a web application that will collect as many as 200 entries per day(~20 fields per entry). The data fields will always be the same and so, if the sky's the limit, I could just keep adding these same entries to one enormous SQL table-- collecting 200 entries per day, 7000 per month, ~84,000 per year, and on, and on, for the rest of the programs existence. I realize that this is probably not practical and would cause a serious strain on the database when modifying entries.

SHOULD I or HOW SHOULD I try and split up this information. Should I create a new table for every day, month, year? Eventually I hope many people will be accessing this site, and searching this content, so it should be
efficient.

Any help is appreciated,
Jacob


Nov 15 '05 #6
