Bytes | Software Development & Data Engineering Community

Is this too much data for SQL?

I need to log data collected from instruments in a laboratory at a
very high rate. I need to log timestamps and values measured from
various devices at a rate as high as once a second.
This means my table will grow to tens or hundreds of gigabytes within
a year. I want to know if SQL Server will be able to handle this much
data. In addition, is it better to separate this data into different
tables? Does a table become hard to search (search by the timestamp
field I am logging) once it gets to a certain size?

Does anyone have experience or advice about this problem? Is there
possibly another product out there which is better suited for logging
time-based data like this than SQL Server?

Thanks
Jul 20 '05 #1

"NewbieNewsGrouper" <ne***************@hotmail.com> wrote in message
news:1e**************************@posting.google.c om...
I need to log data collected from instruments in a laboratory at a
very high rate. I need to log timestamps and values measured from
various devices at a rate as high as once a second.
This means my table will grow to tens or hundreds of gigabytes within
a year. I want to know if SQL server will be able to handle this much
data. In addition, is it better to separate this data into different
tables? Does a table become hard to search (search by the timestamp
field I am logging) once it gets to a certain size?

Does anyone have experience or advice aboutt his problem? Is there
possibly another product out there which is better suited for logging
time-based data like this than SQL Server?

Thanks


I don't have any personal experience of applications like the one you're
describing, but there are certainly SQL Server databases which are handling
those volumes of data and transactions:

http://www.microsoft.com/sql/techinf...calability.asp
http://www.tpc.org/tpcc/results/tpcc...rf_results.asp

It is possible to partition data using partitioned views - you can find more
details in Books Online.
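As a rough sketch of that approach (all table and column names here are made up for illustration, not taken from the original poster's schema), a partitioned view over monthly log tables might look like:

```sql
-- Hypothetical monthly member tables. The CHECK constraint on the
-- partitioning column is what lets SQL Server skip irrelevant tables
-- when a query filters on LogTime.
CREATE TABLE LogData_200501 (
    LogTime  DATETIME NOT NULL
        CHECK (LogTime >= '20050101' AND LogTime < '20050201'),
    DeviceID INT      NOT NULL,
    Value    FLOAT    NOT NULL,
    CONSTRAINT PK_LogData_200501 PRIMARY KEY (LogTime, DeviceID)
);

CREATE TABLE LogData_200502 (
    LogTime  DATETIME NOT NULL
        CHECK (LogTime >= '20050201' AND LogTime < '20050301'),
    DeviceID INT      NOT NULL,
    Value    FLOAT    NOT NULL,
    CONSTRAINT PK_LogData_200502 PRIMARY KEY (LogTime, DeviceID)
);
GO

-- The view unions the member tables; a query with a WHERE clause on
-- LogTime only touches the partition(s) whose CHECK constraint matches.
CREATE VIEW LogData AS
SELECT LogTime, DeviceID, Value FROM LogData_200501
UNION ALL
SELECT LogTime, DeviceID, Value FROM LogData_200502;
```

Each month you would add a new member table and recreate the view; Books Online covers the exact rules the member tables must follow.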

Simon
Jul 20 '05 #2
NewbieNewsGrouper (ne***************@hotmail.com) writes:
> I need to log data collected from instruments in a laboratory at a
> very high rate. I need to log timestamps and values measured from
> various devices at a rate as high as once a second.
> This means my table will grow to tens or hundreds of gigabytes within
> a year. I want to know if SQL Server will be able to handle this much
> data.
It will. And once a second is not going to put SQL Server to the test.

With one caveat: you need to have a reasonable table and index structure.
Now, what that table and index structure should look like, I cannot tell,
because I don't know your application. It may also depend on how you will
use that data. Maybe once the data has been collected, you should
move it over to Analysis Services, where you can build cubes, dimensions
and so on.

So the answer is: yes, SQL Server can do it, but you have to design
carefully.
> In addition, is it better to separate this data into different
> tables? Does a table become hard to search (search by the timestamp
> field I am logging) once it gets to a certain size?


Again, that depends on your indexing. But depending on your requirements,
partitioning may be a good idea. For instance, if many queries will run
against today's test data, it can be helpful to keep it in a separate
table.
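As a minimal sketch of the kind of table and index structure this implies (names are assumptions for illustration): clustering on the timestamp means new rows append at the end of the table, so the once-a-second inserts stay cheap and range searches by time stay fast.

```sql
-- Hypothetical log table for the poster's scenario.
CREATE TABLE InstrumentLog (
    LogTime  DATETIME NOT NULL,
    DeviceID INT      NOT NULL,
    Value    FLOAT    NOT NULL
);

-- Clustered index on the timestamp (plus device to break ties):
-- inserts arrive in LogTime order, so there is no page splitting,
-- and time-range queries become index seeks rather than table scans.
CREATE CLUSTERED INDEX IX_InstrumentLog_Time
    ON InstrumentLog (LogTime, DeviceID);

-- A search by the timestamp field can then seek directly:
SELECT LogTime, DeviceID, Value
FROM   InstrumentLog
WHERE  LogTime >= '20050720' AND LogTime < '20050721';
```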

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
I once supported a similar application which recorded lab measurement data
to SQL Server. The application architecture was such that data was
initially inserted to staging tables and then imported into a separate
schema for reporting and analysis. It is common to segregate operational
and reporting data so that you can optimize your schema (including indexes)
for the different requirements.
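A minimal sketch of that staging pattern (table names are hypothetical): raw measurements land in a narrow, index-free staging table so inserts are as cheap as possible, and a periodic job moves them into the indexed reporting schema.

```sql
-- Fast inserts: a heap with no indexes to maintain.
CREATE TABLE StagingLog (
    LogTime  DATETIME NOT NULL,
    DeviceID INT      NOT NULL,
    Value    FLOAT    NOT NULL
);
GO

-- Periodic import into the (indexed) reporting table, then clear the
-- stage. Wrapping both statements in one transaction means rows are
-- never lost or double-counted if the job fails midway.
BEGIN TRANSACTION;

INSERT INTO ReportingLog (LogTime, DeviceID, Value)
SELECT LogTime, DeviceID, Value FROM StagingLog;

DELETE FROM StagingLog;

COMMIT TRANSACTION;
```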

As Simon and Erland said, the key to performance with large tables is
appropriate indexing. SQL Server can certainly handle VLDBs (very large
databases), but be aware that performance is ultimately constrained by
your hardware.

--
Hope this helps.

Dan Guzman
SQL Server MVP

"NewbieNewsGrouper" <ne***************@hotmail.com> wrote in message
news:1e**************************@posting.google.c om...
I need to log data collected from instruments in a laboratory at a
very high rate. I need to log timestamps and values measured from
various devices at a rate as high as once a second.
This means my table will grow to tens or hundreds of gigabytes within
a year. I want to know if SQL server will be able to handle this much
data. In addition, is it better to separate this data into different
tables? Does a table become hard to search (search by the timestamp
field I am logging) once it gets to a certain size?

Does anyone have experience or advice aboutt his problem? Is there
possibly another product out there which is better suited for logging
time-based data like this than SQL Server?

Thanks

Jul 20 '05 #4
It is usually standard practice to send your historic data to a separate
database. At my company, we have a huge database with millions of
transactions. Right now, the database is well over 350 GB (note, however,
this is on a million-dollar, 64-processor machine processing millions
of records a day).

Each evening, a DTS job runs which copies specific data (based on
timestamps) out of the production database and into a data warehouse.
That server can then be accessed offline. The benefits of this
alternative database are: 1) it does not get the busy hits that the
production system gets; 2) you can have DTS transfer the data
in a way that is beneficial to any reports you may want to run
(submit the data in a certain order, or into tables structured
differently from your production environment).
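The core of such a nightly copy is just a timestamp-bounded INSERT...SELECT; DTS would wrap something like this in a scheduled package (database and table names here are hypothetical):

```sql
-- Copy yesterday's rows from production into the warehouse.
-- Truncate GETDATE() to midnight via a yyyymmdd round-trip.
DECLARE @day DATETIME;
SET @day = CONVERT(CHAR(8), DATEADD(day, -1, GETDATE()), 112);

INSERT INTO Warehouse.dbo.InstrumentLog (LogTime, DeviceID, Value)
SELECT LogTime, DeviceID, Value
FROM   Production.dbo.InstrumentLog
WHERE  LogTime >= @day
  AND  LogTime <  DATEADD(day, 1, @day);
```

Because the copy is driven by the timestamp column, it only ever touches one day's slice of the production table.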

As far as size goes, you can pretty much go as big as you want,
assuming you have the disk space. If you are going to be collecting
data for over a year, I assume this data will be difficult to
reproduce, so you will want to back it up. The data warehouse is an
excellent tool for this.

Hope this helps.
Mark
Jul 20 '05 #5
I am currently engaged in pulling operating system metrics from BMC
Patrol agents on over 1,000 separate servers into one SQL database. I
have found a dramatic improvement in DB performance when inserting data
into a staging table and then aggregating it into reporting tables.
I currently process over 100 million records daily using a fully
loaded HP DL760 attached to an IBM FAStT SAN.
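The aggregation step might look roughly like this (table and column names are assumptions), rolling raw per-second samples from the staging table up into one reporting row per device per hour:

```sql
-- Roll staged samples up to hourly summaries. The DATEADD/DATEDIFF
-- pair truncates each LogTime down to the start of its hour.
INSERT INTO HourlyMetrics
    (HourStart, DeviceID, AvgValue, MinValue, MaxValue, SampleCount)
SELECT DATEADD(hour, DATEDIFF(hour, 0, LogTime), 0) AS HourStart,
       DeviceID,
       AVG(Value),
       MIN(Value),
       MAX(Value),
       COUNT(*)
FROM   StagingLog
GROUP BY DATEADD(hour, DATEDIFF(hour, 0, LogTime), 0), DeviceID;
```

Reporting queries then scan the much smaller hourly table instead of millions of raw rows.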
Jul 20 '05 #6
In article <42**************************@posting.google.com>,
atyoung75@yahoo.com says...
> I am currently engaged in pulling operating system metrics from BMC
> Patrol agents on over 1,000 separate servers into one SQL database.
<snip>

It's only too much data if you can't get the results you want.

--
sp*********@rrohio.com
(Remove 999 to reply to me)
Jul 20 '05 #7


