Bytes | Software Development & Data Engineering Community
Access and processor usage

I am just wondering why, with nothing else running, executing an
update query against a very large table causes Access to use less
than 10% of the processor. Then it says "There is not enough disk
space or memory to undo the changes". I have 2 GB RAM, a Core 2 Duo
E6300 processor, and plenty of disk space. Why doesn't Access peg
the CPU?

Joel

Apr 4 '07 #1
Hi, Joel.
> why, with nothing else running and executing an
> update query against a very large table, does Access seem to be
> causing less than 10% processor usage. Then it says "There is not
> enough disk space or memory to undo the changes".

The operation requires more page locks than your current MaxLocksPerFile
setting allows. Please see the following Web page for instructions on how
to fix this:

http://support.microsoft.com/kb/286153/
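For reference, the workaround the KB article describes can also be applied
per-session from VBA, without editing the Registry. This is only a sketch:
the table and field names are placeholders, and 500,000 is an arbitrary
example ceiling, not a recommendation.

```vba
' Raise Jet's page-lock ceiling for this session only, then run the
' update.  dbFailOnError rolls the whole statement back on failure.
Sub RunBigUpdate()
    ' Jet's default MaxLocksPerFile is 9,500 page locks; a bulk
    ' UPDATE on a very large table can easily exceed that.
    DAO.DBEngine.SetOption dbMaxLocksPerFile, 500000
    CurrentDb.Execute "UPDATE MyLargeTable SET SomeField = 0", _
                      dbFailOnError
End Sub
```

The SetOption call overrides the Registry value only until Access is
closed, so it won't affect other applications that use Jet.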
> Why doesn't
> Access peg the CPU?

It doesn't need to. There's no disk space or RAM inside the CPU to
permanently store the data or calculations. The CPU fetches an
instruction, executes it, fetches the next one, et cetera. A bulk
update like yours is bound by disk I/O, not by computation, so Jet
never comes close to using the CPU's full capacity. This is a good
thing, not a bad thing.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.

Apr 4 '07 #2
If you are pulling data from a SQL server (MS SQL Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottleneck.
The fix is to use ADO (COM-based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - you need to step up to SQL Server.

If neither of the above is your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 GB data limit (more like a 500 MB limit in practice). If you
are working with genuinely big data, you need to step up to server-based
tools like .NET and SQL Server. ADO.NET has way more bandwidth than
COM-based ADO.

BTW, the Core 2 Duo is sweet, eh? I just upgraded my workstation to one
last month - the shop that upgraded me said that you can maximize its
full potential by using 4 GB of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Apr 4 '07 #3
Hi, Rich.
> If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
> and you are using an ODBC connection, that would be your bottle neck.

Have you ever received the "There is not enough disk space or memory to undo
the changes" warning message when _not_ using Jet tables? Isn't this a
Jet-specific message?

> The fix is to use ADO (com based ADO) to pull your data (much more
> bandwidth than ODBC).

Years ago, we tested linked (ODBC) Oracle tables with DSNs and ADO with
DSN-less connections, and the speeds were similar. We didn't get markedly
"increased bandwidth" using ADO. And addressing the bandwidth (network
throughput) isn't going to solve Joel's warning message that the action
query can't be undone because there's not enough disk space or memory on his
workstation. Even if you speed up the network, thereby increasing the
bandwidth, where are those bytes going? Either to memory or to disk, which
Access is already complaining there isn't enough of to undo the changes when
<Ctrl><Z> is pressed.
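For the record, a DSN-less ADO connection of the kind we benchmarked looks
roughly like the sketch below. The provider string, TNS alias, and
credentials are placeholders, not the ones we actually tested with.

```vba
' Sketch of a DSN-less ADO connection to Oracle via the Oracle OLE DB
' provider (bypassing the ODBC driver manager).  All names below are
' placeholders.
Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
cn.Open "Provider=OraOLEDB.Oracle;Data Source=MyTnsAlias;" & _
        "User ID=MyUser;Password=MyPassword;"
' ... run queries through cn ...
cn.Close
```

In our tests, swapping the transport this way made little measurable
difference compared to linked ODBC tables with DSNs.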
> If Access is your backend, then Access is the
> bottleneck - need to step up to Sql Server.

He can fix the problem with a change to his Jet Engine settings, instead of
replacing the problem with an expensive and time-consuming upgrade to SQL
Server.

> Access
> has a 1 gig data limit (more like a 500 meg limit).

Access 95 and 97 can hold 1 GB of data, while Access 2000 and newer can hold
2 GB. With the horsepower Joel has, it's doubtful he's using a version of
Access older than Access 2000.

> If you are working
> with genuinely big data you need to step up to server based tools like
> .Net and sql server.

If the data fits into a 2 GB database file, it isn't "genuinely big" yet.
Tens or hundreds of terabytes is "genuinely big." 2 GB ain't much, but if
the database is pushing that size limit, then it's time to migrate the data
to a stronger and bigger database engine. Several of them are free, such as
SQL Server 2005 Express, Oracle 10g Express, and IBM DB2 Express-C. The
first two hold up to 4 GB of data, and the last isn't limited by data file
size.

> ADO.Net has way more bandwidth than com based ADO.

I didn't know that. What database connection technology can ADO.Net use
that's superior to the database connection technologies ADO is limited to?

HTH.
Gunny

Apr 5 '07 #4
On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SPAM> wrote:

You make a lot of good points. I also seriously doubt that as part of
developing ADO.NET MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together, develop reproducible test scenarios,
and publish the results. I have floated this idea from time to time,
but alas, no takers.

-Tom.

Apr 5 '07 #5
Hi, Tom.
> The best way I know of to settle claims and counterclaims is for some
> developers to get together and develop reproducible test scenarios,
> and publish the results.

EULAs on commercial database engines restrict users from publicly
publishing the detailed results of tests. One may get into legal hot water
if one doesn't publish the test results under the vendor's specific
guidelines, so I can see why folks may be reluctant.

HTH.
Gunny

Apr 5 '07 #6
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SPAM> wrote
in news:13*************@corp.supernews.com:

> Hi, Tom.
>> The best way I know of to settle claims and counterclaims is for some
>> developers to get together and develop reproducible test scenarios,
>> and publish the results.
>
> EULAs on commercial database engines restrict users from publicly
> publishing the detailed results of tests. One may get into legal hot
> water if one doesn't publish the test results under the vendor's
> specific guidelines, so I can see why folks may be reluctant.
I worry about this a lot.

--
lyle fairfield

Ceterum censeo Redmond esse delendam
Apr 5 '07 #7
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SPAM>
wrote in news:13*************@corp.supernews.com:

> EULAs on commercial database engines restrict users from publicly
> publishing the detailed results of tests. One may get into legal
> hot water if one doesn't publish the test results under the
> vendor's specific guidelines, so I can see why folks may be
> reluctant.
I wish someone with deep pockets would take that one on. I don't
believe it could possibly survive in court. Indeed, there are a lot of
things in EULAs that wouldn't likely survive a legal challenge, if
only there were someone financially able to challenge them.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 5 '07 #8
On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SPAM> wrote:

Let me see: EULA speech versus the First Amendment.
I think there are ways to do this without getting into hot water.

-Tom.

Apr 6 '07 #9
Hi, Tom.

Borrowing from David's comment, how deep are your pockets? ;-)

Gunny

Apr 6 '07 #10
Tom van Stiphout wrote:
> On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
> <Fo**************************@Spameater.orgZERO_SPAM> wrote:
>
> Let me see: EULA speech versus the First Amendment.
> I think there are ways to do this without getting into hot water.
>
> -Tom.

I agree in general, but just to be the devil's advocate: wouldn't you be
concerned about a person's qualifications and testing techniques if they
were posting data that made your product look bad?

I'm sure we've all heard of vendors who slant benchmark testing to
benefit their product. I seem to recall this was very common with
graphics card vendors about 5-6 years ago. Perhaps it still is...

--
---------------
John Mishefske, Microsoft Access MVP
Apr 6 '07 #11