Access and SQL Server Speed Comparison

Hi,

I have a table with 8 million records. Some of the fields need to be
parsed and modified. For example, "ABC DEF" will be parsed into two
fields, "ABC" and "DEF". And if there is "123" in a field, it will
be replaced by "456". For example, "XYZ 123 AAA" will become "XYZ 456
AAA". Update queries and append queries will also be used in the
project.

Given the above information, which will be the faster database for this
- MS Access 2003 or SQL Server 2000?
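
For concreteness, here is a rough T-SQL sketch of the kind of update being
described. The table and column names (dbo.SourceData, RawText, Part1,
Part2) are invented for illustration, and splitting at the first space is
just an assumption about the real rule:

UPDATE dbo.SourceData
SET Part1 = LEFT(RawText, CHARINDEX(' ', RawText + ' ') - 1),          -- text before the first space
    Part2 = LTRIM(SUBSTRING(RawText,
                  CHARINDEX(' ', RawText + ' ') + 1, LEN(RawText))),   -- text after the first space
    RawText = REPLACE(RawText, '123', '456')                           -- swap "123" for "456"
WHERE RawText IS NOT NULL  -- all three SET expressions see the pre-update value of RawText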

Mar 1 '07 #1
13 replies, 5547 views
SQL Learner wrote:
I have a table with 8 million records. Some of the fields need to be
parsed and modified. For example, "ABC DEF" will be parsed into two
fields, "ABC" and "DEF". And if there is "123" in a field, it will
be replaced by "456". For example, "XYZ 123 AAA" will become "XYZ 456
AAA". Update queries and append queries will also be used in the
project.

Given the above information, which will be the faster database for this
- MS Access 2003 or SQL Server 2000?
Note that an Access front-end can be connected to either a Jet back-end or
a SQL Server back-end. (The former gives a self-contained .MDB file.)

Beware of premature optimization. How often do these sorts of changes
occur? Once a year, month, day, hour? If it's infrequent, then
speeding them up may be less important than optimizing other factors,
e.g. ease of use, ease of development, low memory footprint.
Mar 2 '07 #2
Hi Ed,

Thank you for your reply. I just need to know which one is faster,
since there can be many queries in this project. If both are about the
same, then I will just use Access. This is just a one-time event,
but the update queries can be slow. The example that I provided was
simplified for demonstration purposes only. Let's assume I need to run
such a query on an 8 million record table 10 times. Given that I have
2 GB of RAM, a very fast hard disk, and a fast dual-core CPU, would SQL
Server be faster? If yes, by how much?

- Grasshopper -
Mar 2 '07 #3

"SQL Learner" <ex***********@gmail.comwrote in message
news:11*********************@s48g2000cws.googlegro ups.com...
Hi,

I have a table with 8 million records. Some of the fields need to be
parsed and modified. For example, "ABC DEF" will be parsed into two
fields, "ABC" and "DEF". And if there is "123" in a field, it will
be replaced by "456". For example, "XYZ 123 AAA" will become "XYZ 456
AAA". Update queries and append queries will also be used in the
project.

Given the above information, which will be the faster database for this
- MS Access 2003 or SQL Server 2000?
On equal hardware, SQL Server 2000 will run the query faster. Depending on
your skills, it may take longer to write the queries for SQL Server 2000
than it takes to write and run them in Access...
Mar 2 '07 #4
Hi Russ,

In your estimation, how much faster would that be? Would it be
significant?

- Grasshopper -

Mar 2 '07 #5
SQL Learner,

A benchmark test would probably be the only way to provide a definitive
answer.

-- Bill

"SQL Learner" <ex***********@gmail.comwrote in message
news:11**********************@j27g2000cwj.googlegr oups.com...
Hi Russ,

In your estimation, how much faster would that be? Would it be
significant?

- Grasshopper -

Mar 2 '07 #6
SQL Learner (ex***********@gmail.com) writes:
I have a table with 8 million records. Some of the fields need to be
parsed and modified. For example, "ABC DEF" will be parsed into two
fields, "ABC" and "DEF". And if there is "123" in a field, it will
be replaced by "456". For example, "XYZ 123 AAA" will become "XYZ 456
AAA". Update queries and append queries will also be used in the
project.

Given the above information, which will be the faster database for this
- MS Access 2003 or SQL Server 2000?
Benchmark, don't ask. Performance questions are very difficult to
answer up front, because there are so many "it depends". What I can
say is that if you do this in Access, you should do it in a local
Jet database, not in a linked SQL Server table.

If you ask me to place my bets, I put them on Access for this one.
String processing is not SQL Server's best game.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Mar 2 '07 #7
"Erland Sommarskog" <es****@sommarskog.sewrote in message
news:Xn**********************@127.0.0.1...
SQL Learner (ex***********@gmail.com) writes:
>I have a table with 8 million records. Some of the fields need to be
parsed and modified. For example, "ABC DEF" will be parsed into two
fields, "ABC" and "DEF". And if there is "123" in a field, it will
be replaced by "456". For example, "XYZ 123 AAA" will become "XYZ 456
AAA". Update queries and append queries will also be used in the
project.

Given the above information, which will be the faster database for this
- MS Access 2003 or SQL Server 2000?

Benchmark, don't ask. Performance questions are very difficult to
answer up front, because there are so many "it depends". What I can
say is that if you do this in Access, you should do it in a local
Jet database, not in a linked SQL Server table.

If you ask me to place my bets, I put them on Access for this one.
String processing is not SQL Server's best game.
And further, I'll again suggest Perl.

It may very well be faster to export the data, manipulate it in Perl, which
is designed for string processing, and reimport it.

Especially if this appears to be a one-time thing.
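
As a rough sketch of the re-import side only (the table name and file path
below are placeholders, assuming a tab-delimited export file):

BULK INSERT dbo.WorkingNotes
FROM 'C:\data\cleaned.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')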


--
Greg Moore
SQL Server DBA Consulting
sql (at) greenms.com http://www.greenms.com
Mar 3 '07 #8
Greg D. Moore (Strider) (mo****************@greenms.com) writes:
And further, I'll again suggest Perl.

It may very well be faster to export the data, manipulate it in Perl, which
is designed for string processing, and reimport it.

Especially if this appears to be a one-time thing.
I could consider doing a thing like this in Perl, since I'm fluent in
that language, and thus I could cut down the total time, that is
development + execution. Assuming, that is, it is a one-off.

If there is a need to do this on a regular basis, other factors should
determine whether to do it in Access or SQL Server. Just moving the data
to SQL Server - or to a file - and then back is likely to be a loser.

Of course, it's perfectly possible to use ADO from Perl to access the
Access database.

--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Mar 3 '07 #9

"SQL Learner" <ex***********@gmail.comwrote in message
news:11**********************@j27g2000cwj.googlegr oups.com...
Hi Russ,

In your estimation, how much faster would that be? Would it be
significant?

- Grasshopper -
My guess would have been a factor of 10, since SQL Server is far more
efficient in its memory management.

My experimentation produced the following results on a dual Xeon with 2
GB of memory:

I have a working notes table that contains 9.6 million rows.

--3 minutes to copy 9.6 million rows into an experimental table
--Added 3 varchar columns to hold experimental results

--20 minutes to split column into two fields
UPDATE dbo.WNTemp
SET Split1 = SUBSTRING(Comments, 1, 3),
    Split2 = SUBSTRING(Comments, 4, 3)

--12 minutes to set the columns back to NULL, so the SUBSTRING split itself
accounts for roughly 8 of the 20 minutes.
UPDATE dbo.WNTemp SET Split1 = NULL, Split2 = NULL

--Export to Access mdb yields 2GB file

--Same query in Access, using the Mid function in place of SUBSTRING

--Access at 3 minutes gave a modal warning that no undo would be possible
--At 34 minutes Access gave an error and stopped at 8.1 million records
--According to Task Manager, Access read 3.8 GB and wrote 1.6 GB during the
partial update

My conclusion is that Access is only moderately slower, but I would guess
that the difference would grow if the table were wider.

Although a partial update is nice in this circumstance, I believe it would
be a negative result in a production environment.
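
For completeness, the "123" to "456" replacement from the original question
was not part of the timing above. A sketch of that step would be a plain
REPLACE into one of the spare varchar columns (calling the third column
Split3 is my assumption; it was not named above):

--Untimed sketch: the "123" -> "456" replacement
UPDATE dbo.WNTemp SET Split3 = REPLACE(Comments, '123', '456')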


Mar 3 '07 #10
Erland and Greg,

Perl is not my thing. Besides, in this project there are actually two
tables that need to be linked. The thing is that since many records
can only be linked using partial criteria (e.g., "123" to match "abc123
xyz"), I still need to do this in a database.

Russ,

If I do this through ODBC, using Access as the front end (queries only)
and SQL Server as the back end, how much of a slowdown would there be
compared to running the query on SQL Server without the ODBC link to
Access?

Thank you all.

- Grasshopper -

Mar 6 '07 #11

"SQL Learner" <ex***********@gmail.comwrote in message
news:11*********************@h3g2000cwc.googlegrou ps.com...
Erland and Greg,

Perl is not my thing. Besides, in this project there are actually two
tables that need to be linked. The thing is that since many records
can only be linked using partial criteria (e.g., "123" to match "abc123
xyz"), I still need to do this in a database.

Russ,

If I do this through ODBC, using Access as the front end (queries only)
and SQL Server as the back end, how much of a slowdown would there be
compared to running the query on SQL Server without the ODBC link to
Access?
Not much if Access is run on the server with the data. If you are dragging 8
million rows across a network you are in for a long, long, long night.
>
Thank you all.

- Grasshopper -

Mar 6 '07 #12
Not much if Access is run on the server with the data. If you are dragging 8
million rows across a network you are in for a long, long, long night.
Hi Russ,

That was what I meant - dragging 8 million rows across a network.
Thank you for your answer. In that case, I will run Access on the
server.

- Grasshopper -

Mar 6 '07 #13
"SQL Learner" <ex***********@gmail.comwrote in message
news:11*********************@h3g2000cwc.googlegrou ps.com...
Erland and Greg,

Perl is not my thing. Besides, in this project there are actually two
tables that need to be linked. The thing is that since many records
can only be linked using partial criteria (e.g., "123" to match "abc123
xyz"), I still need to do this in a database.
Well, Perl's not my thing either. ;-)

That's why I keep several Perl buddies handy for stuff like this.

Quite honestly, someone who knows Perl regexes inside and out can probably
knock out the pattern-matching script in 30 minutes or so.

But yeah, if you're the only guy on the job, that's probably a non-starter.

>
Russ,

If I do this through ODBC, using Access as the front end (queries only)
and SQL Server as the back end, how much of a slowdown would there be
compared to running the query on SQL Server without the ODBC link to
Access?
Yeah, definitely run it on the server. Network lag alone would kill you.

>
Thank you all.

- Grasshopper -
--
Greg Moore
SQL Server DBA Consulting
sql (at) greenms.com http://www.greenms.com
Mar 6 '07 #14

