
Access and processor usage

I am just wondering why, with nothing else running, Access uses less
than 10% of the processor while executing an update query against a
very large table. Then it says "There is not enough disk space or
memory to undo the changes". I have 2 GB of RAM, a Core 2 Duo E6300
processor, and plenty of disk space. Why doesn't Access peg the CPU?

Joel

Apr 4 '07 #1
10 replies | 6978 views
Hi, Joel.
> why, with nothing else running, Access uses less than 10% of the
> processor while executing an update query against a very large table.
> Then it says "There is not enough disk space or memory to undo the
> changes".
The operation will require more page locks than your current MaxLocksPerFile
setting allows. Please see the following Web page for instructions on how
to fix this:

http://support.microsoft.com/kb/286153/
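
If you'd rather not edit the registry, you can also raise the limit for
the current session only, from VBA, via the SetOption method the KB
article describes. A minimal sketch, with a placeholder lock count and a
made-up table and field name (adjust both to your actual query):

    Sub RunBigUpdate()
        ' Raise the page-lock ceiling for this session only;
        ' the registry default is left untouched.
        DBEngine.SetOption dbMaxLocksPerFile, 200000

        ' dbFailOnError rolls the update back if anything goes wrong.
        Dim db As DAO.Database
        Set db = CurrentDb
        db.Execute "UPDATE MyLargeTable SET SomeField = 0", dbFailOnError
    End Sub

The setting reverts when Access closes, so there's nothing to clean up
afterward.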
> Why doesn't Access peg the CPU?
It doesn't need to. An update query against a very large table is
disk-bound, not CPU-bound: most of the elapsed time is spent waiting for
pages to be read from and written back to disk, and the CPU sits idle
during those waits. Your CPU has enough capacity and speed that Jet
never comes close to using its full resources when executing the query.
This is a good thing, not a bad thing.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"WannaKatana" <Gi*************@gmail.comwrote in message
news:11**********************@y66g2000hsf.googlegr oups.com...
>I am just wondering why, with nothing else running and executing an
update query against a very large table, does Access seem to be
causing less than 10% processor usage. Then it says "There is not
enough disk space or memory to undo the changes". I have 2 gb RAM,
Core 2 duo e6300 processor and plenty of disk space. Why doesn't
Access peg the CPU?

Joel

Apr 4 '07 #2
If you are pulling data from a SQL server (MS SQL Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottleneck.
The fix is to use ADO (COM-based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - you need to step up to SQL Server.
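
For example, a DSN-less, COM-based ADO pull might look like this sketch
(needs a reference to the Microsoft ActiveX Data Objects library; the
server, database, and table names here are made up):

    Sub PullViaADO()
        Dim cn As ADODB.Connection
        Dim rs As ADODB.Recordset

        ' DSN-less connection straight to the server.
        Set cn = New ADODB.Connection
        cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        ' A forward-only, read-only recordset is the cheapest way
        ' to stream rows.
        Set rs = New ADODB.Recordset
        rs.Open "SELECT * FROM MyLargeTable", cn, _
                adOpenForwardOnly, adLockReadOnly

        Do Until rs.EOF
            ' ...process the current row here...
            rs.MoveNext
        Loop

        rs.Close
        cn.Close
    End Sub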

If neither of the above is your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 gig data limit (in practice more like a 500 meg limit). If you
are working with genuinely big data, you need to step up to server-based
tools like .NET and SQL Server. ADO.NET has way more bandwidth than
COM-based ADO.

BTW, the Core 2 Duo is sweet, eh? I just upgraded my workstation to one
last month - the shop that upgraded me said you can maximize its full
potential by using 4 gigs of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Apr 4 '07 #3
Hi, Rich.
> If you are pulling data from a SQL server (MS SQL Server, Oracle, ...)
> and you are using an ODBC connection, that would be your bottleneck.
Have you ever received the "There is not enough disk space or memory to undo
the changes" warning message when _not_ using Jet tables? Isn't this a
Jet-specific message?
> The fix is to use ADO (COM-based ADO) to pull your data (much more
> bandwidth than ODBC).
Years ago, we tested linked (ODBC) Oracle tables with DSNs and ADO with
DSN-less connections, and the speeds were similar. We didn't get markedly
"increased bandwidth" using ADO. And addressing the bandwidth (network
throughput) isn't going to solve Joel's warning message that the action
query can't be undone because there's not enough disk space or memory on his
workstation. Even if you speed up the network, thereby increasing the
bandwidth, where are those bytes going? Either to memory or to disk, which
Access is already complaining there isn't enough of to undo the changes when
<CTRL><Z> is pressed.
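
As an aside, if Joel wants a rollback safety net that doesn't depend on
the UI's undo buffer at all, an explicit DAO transaction gives him a
controlled rollback point. A minimal sketch (the SQL is a placeholder,
and note that a huge transaction still consumes lock resources, so the
MaxLocksPerFile advice above still applies):

    Sub UpdateWithExplicitTransaction()
        Dim ws As DAO.Workspace
        Dim db As DAO.Database

        Set ws = DBEngine.Workspaces(0)
        Set db = ws.Databases(0)   ' the current database in this workspace

        ws.BeginTrans
        On Error GoTo Bail
        db.Execute "UPDATE MyLargeTable SET SomeField = 0", dbFailOnError
        ws.CommitTrans
        Exit Sub

    Bail:
        ws.Rollback   ' undo every change made since BeginTrans
    End Sub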
> If Access is your backend, then Access is the
> bottleneck - you need to step up to SQL Server.
He can fix the problem with a change of his Jet Engine settings, instead of
replacing the problem with an expensive and time-consuming upgrade to SQL
Server.
> Access
> has a 1 gig data limit (in practice more like a 500 meg limit).
Access 95 and 97 can hold 1 GB of data, while Access 2000 and newer can hold
2 GB. With the horsepower Joel has, it's doubtful he's using a version of
Access older than Access 2000.
> If you are working
> with genuinely big data, you need to step up to server-based tools like
> .NET and SQL Server.
If the data fits into a 2 GB database file, it isn't "genuinely big" yet.
Tens or hundreds of terabytes is "genuinely big." 2 GB ain't much, but if
the database is pushing that size limit, then it's time to migrate the data
to a stronger and bigger database engine. Several of them are free, such as
SQL Server 2005 Express, Oracle 10g Express, and IBM DB2 Express-C. The
first two hold up to 4 GB of data, and the last isn't limited by data file
size.
> ADO.NET has way more bandwidth than COM-based ADO.
I didn't know that. What database connection technology can ADO.NET use
that's superior to the database connection technologies ADO is limited to?

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Rich P" <rp*****@aol.comwrote in message
news:46*********************@news.qwest.net...
If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottle neck.
The fix is to use ADO (com based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - need to step up to Sql Server.

If neither of the above are your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 gig data limit (more like a 500 meg limit). If you are working
with genuinely big data you need to step up to server based tools like
.Net and sql server. ADO.Net has way more bandwidth than com based ADO.

BTW, core2Duo is sweet, heh? I just upgraded my workstation to one last
month - the shop that upgraded me said that you can maximize its full
potential by using 4 gigs of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***

Apr 5 '07 #4
On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SPAM> wrote:

You make a lot of good points. I also seriously doubt that, as part of
developing ADO.NET, MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together, develop reproducible test scenarios,
and publish the results. I have floated this idea from time to time,
but alas, no takers.
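
Even a crude harness would do as a starting point. Here's a minimal
sketch of the sort of thing I mean, using VBA's Timer function (the
query is a placeholder; a real test would run several times against
identical data and average the results):

    Sub TimeUpdateQuery()
        Dim db As DAO.Database
        Dim tStart As Single

        Set db = CurrentDb
        tStart = Timer   ' seconds since midnight, as a Single

        db.Execute "UPDATE MyLargeTable SET SomeField = SomeField + 1", _
                   dbFailOnError

        Debug.Print "Elapsed: " & Format(Timer - tStart, "0.00") & " seconds"
    End Sub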

-Tom.

Apr 5 '07 #5
Hi, Tom.
> The best way I know of to settle claims and counterclaims is for some
> developers to get together, develop reproducible test scenarios,
> and publish the results.
EULAs on commercial database engines restrict users from publicly
publishing detailed test results. One may get into legal hot water
if one doesn't publish the results under the vendor's specific
guidelines, so I can see why folks may be reluctant.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Tom van Stiphout" <no*************@cox.netwrote in message
news:pa********************************@4ax.com...
On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SP AMwrote:

You make a lot of good points. I also seriously doubt that as part of
developing ADO.NET MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results. I have coined this idea from time to time,
but alas, no takers.

-Tom.

>>Hi, Rich.
>>If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottle neck.

Have you ever received the "There is not enough disk space or memory to
undo
the changes" warning message when _not_ using Jet tables? Isn't this a
Jet-specific message?
>>The fix is to use ADO (com based ADO) to pull your data (much more
bandwidth than ODBC).

Years ago, we tested linked (ODBC) Oracle tables with DSN's and ADO with
DSN-less connections, and the speeds were similar. We didn't get markedly
"increased bandwidth" using ADO. And addressing the bandwidth (network
throughput) isn't going to solve Joel's warning message that the action
query can't be undone because there's not enough disk space or memory on
his
workstation. Even if you speed up the network, thereby increasing the
bandwidth, where are those bytes going? Either to memory or to disk,
which
Access is already complaining there isn't enough of to undo the changes
when
<CTRL><Zis pressed.
>>If Access is your backend, then Access is the
bottleneck - need to step up to Sql Server.

He can fix the problem with a change of his Jet Engine settings, instead
of
replacing the problem with an expensive and time-consuming upgrade to SQL
Server.
>>Access
has a 1 gig data limit (more like a 500 meg limit).

Access 95 and 97 can hold 1 GB of data, while Access 2000 and newer can
hold
2 GB. With the horsepower Joel has, it's doubtful he's using a version of
Access older than Access 2000.
>>If you are working
with genuinely big data you need to step up to server based tools like
.Net and sql server.

If the data fits into a 2 GB database file, it isn't "genuinely big" yet.
Tens or hundreds of terabytes is "genuinely big." 2 GB ain't much, but if
the database is pushing that size limit, then it's time to migrate the
data
to a stronger and bigger database engine. Several of them are free, such
as
SQL Server 2005 Express, Oracle 10g Express, and IBM DB2 Express-C. The
first two hold up to 4 GB of data, and the last isn't limited by data file
size.
>>ADO.Net has way more bandwidth than com based ADO.

I didn't know that. What database connection technology can ADO.Net use
that's superior to the database connection technologies ADO is limited to?

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Rich P" <rp*****@aol.comwrote in message
news:46*********************@news.qwest.net...
>>If you are pulling data from a sql server (MS Sql Server, Oracle, ...)
and you are using an ODBC connection, that would be your bottle neck.
The fix is to use ADO (com based ADO) to pull your data (much more
bandwidth than ODBC). If Access is your backend, then Access is the
bottleneck - need to step up to Sql Server.

If neither of the above are your scenario, and you really are pulling in
hundreds of megs of data, then Access is still the bottleneck. Access
has a 1 gig data limit (more like a 500 meg limit). If you are working
with genuinely big data you need to step up to server based tools like
.Net and sql server. ADO.Net has way more bandwidth than com based ADO.

BTW, core2Duo is sweet, heh? I just upgraded my workstation to one last
month - the shop that upgraded me said that you can maximize its full
potential by using 4 gigs of memory.

Rich

*** Sent via Developersdex http://www.developersdex.com ***

Apr 5 '07 #6
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SP AMwrote
in news:13*************@corp.supernews.com:
> Hi, Tom.
>> The best way I know of to settle claims and counterclaims is for some
>> developers to get together, develop reproducible test scenarios,
>> and publish the results.
>
> EULAs on commercial database engines restrict users from publicly
> publishing detailed test results. One may get into legal hot
> water if one doesn't publish the results under the vendor's
> specific guidelines, so I can see why folks may be reluctant.
I worry about this a lot.

--
lyle fairfield

Ceterum censeo Redmond esse delendam
Apr 5 '07 #7
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SP AM>
wrote in news:13*************@corp.supernews.com:
> EULAs on commercial database engines restrict users from publicly
> publishing detailed test results. One may get into legal
> hot water if one doesn't publish the results under the
> vendor's specific guidelines, so I can see why folks may be
> reluctant.
I wish someone with deep pockets would take that one on. I don't
believe it could possibly survive in court. Indeed, there are a lot of
things in EULAs that likely wouldn't survive a legal challenge, if
only there were someone financially able to challenge them.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Apr 5 '07 #8
On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SPAM> wrote:

Let me see: EULA speech versus the First Amendment.
I think there are ways to do this without getting into hot water.

-Tom.

Apr 6 '07 #9
Hi, Tom.

Borrowing from David's comment, how deep are your pockets? ;-)

Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Tom van Stiphout" <no*************@cox.netwrote in message
news:mm********************************@4ax.com...
On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_SP AMwrote:

Let me see: EULA speech versus the First Amendment.
I think there are ways to do this without getting into hot water.

-Tom.
>>Hi, Tom.
>>The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results.

EULA's on commercial database engines restrict users from publicly
publishing the detailed results of tests. One may get into legal hot
water
if one doesn't publish the test results under the vendor's specific
guidelines, so I can see why folks may be reluctant.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
Blogs: www.DataDevilDog.BlogSpot.com, www.DatabaseTips.BlogSpot.com
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Tom van Stiphout" <no*************@cox.netwrote in message
news:pa********************************@4ax.com. ..
>>On Wed, 4 Apr 2007 19:28:31 -0700, "'69 Camaro"
<Fo**************************@Spameater.orgZERO_ SPAMwrote:

You make a lot of good points. I also seriously doubt that as part of
developing ADO.NET MSFT all of a sudden found a much more efficient
way of moving data back and forth.

The best way I know of to settle claims and counterclaims is for some
developers to get together and develop reproducible test scenarios,
and publish the results. I have coined this idea from time to time,
but alas, no takers.

-Tom.

<clip>

Apr 6 '07 #10
Tom van Stiphout wrote:
> On Thu, 5 Apr 2007 07:23:07 -0700, "'69 Camaro"
> <Fo**************************@Spameater.orgZERO_SPAM> wrote:
>
> Let me see: EULA speech versus the First Amendment.
> I think there are ways to do this without getting into hot water.
>
> -Tom.
I agree in general, but just to play devil's advocate: wouldn't you be
concerned about a person's qualifications and testing techniques if they
were posting data that made your product look bad?

I'm sure we've all heard of vendors who slant benchmark testing to
benefit their product. I seem to recall this was very common with
graphics card vendors about 5-6 years ago. Perhaps it still is...

--
---------------
John Mishefske, Microsoft Access MVP
Apr 6 '07 #11

This thread has been closed and replies have been disabled. Please start a new discussion.

Similar topics

3 replies | by Amit Dedhia: Hi I am developing a Dot net application (involving image processing) on a uni processor. It works well on my machine. I then take all my code on a multi processor, build and run the application...

5 replies | by dba_db2 at nospam gmx.net: We have got a brand new multiprocessor machine IBM pSeries server with 2 processors. Now we want to buy a IBM DB2 license for this box. Does anyone know whether it is possible to buy a single...

6 replies | by Nick via AccessMonster.com: Hello all, I've been working on a VBA application in Access for a few months now. This morning, my Access application began to hang in memory, using 97-100% of the CPU, and the only way to...

1 reply | by Jeff Mitchell: I have a logging application that records various performance metrics into a database every 5 minutes, and to this I'd like to add a performance counter that shows the processor usage in a manner akin...

4 replies | by kaiteriteri: I have a time-consuming VB.net application that I'd like to thread over 2 processors (that's all I've got in my machine!) and, hopefully, get it done in half the time. On running, the application...

3 replies | by Ken: When I run my VB.NET application, it loads a large amount of data upon startup. When I checked Task Manager, it showed that roughly 25% +/- of the processor was being used. When I run the same...

2 replies | by Cathy_db: Hi, how can I check if in our .NET environment the applications are using all 4 of our processors (2 CPUs, hyperthreaded)? The issue is that our test environment with 1 CPU (hyperthreaded) shows...

16 replies | by Neil: I posted a few days ago that it seems to me that the Access 2007 rich text feature does not support: a) full text justification; b) programmatic manipulation. I was hoping that someone might...

3 replies | by Akash: Hi, I am using the GetProcessTimes API for calculating CPU usage. On a multi-core processor I have observed that the %CPU usage is a multiple of the number of cores of the processor. Is there any...
