Bytes IT Community

Need work around for possible data leaks.

We currently use Access 2003 in our company and have had this issue in every
version from Access 97 to 2003. We deal with large databases and run a lot
of queries over tables with millions of records in them.

The problem is that when we pull a dataset out of a large table, we do not
get the same result every time. For example, the transaction count for
store 1 shows 3,000 on one run, 3,015 on the next, and 2,089 on the one
after that. All of this is over a static database that only gets updated at
night. To work around it, we typically pull the data out into a temporary
table and run the reports over that, so we are only going over, say,
100,000 rows instead of 2,000,000. Most of the time this seems to work;
lately, however, with the amount of data we are using, it takes several
tries to copy the data correctly, and then several more to get rid of it.
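
The temp-table step described above can be sketched in Jet SQL; the table
and field names here are hypothetical, just to show the shape of it:

```sql
-- Make-table query: copy one store's rows into a temporary table
-- (table and field names are hypothetical).
SELECT TransactionID, StoreID, SaleDate, Amount
INTO tmpStoreTransactions
FROM Transactions
WHERE StoreID = 1;

-- Run the reports against tmpStoreTransactions, then clean up:
DROP TABLE tmpStoreTransactions;
```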

We have tried this on multiple machines, ranging from an older P2 at
1.0 GHz with 256 MB of RAM to a P3 at 2.5 GHz with 1.5 GB. The only
difference between the machines was that the more RAM a machine had, the
more data it could hold before the results became corrupted, which is why
I suspect memory leaks.

Is there any work around that we can try?

Thank you,
Patrick L. Lykins

Jan 9 '07 #1
8 Replies


Greetings,

Try to think about your problem this way. Would anyone ever try to haul a
semi trailer with a four-cylinder pickup truck? Of course not. If you have
big data, you need something with a big engine, and that would be SQL
Server. Just as the little four-cylinder pickup would spin its wheels
trying to pull the semi trailer, Access spins its wheels trying to pull
big data. An Access .mdb file has a 2 GB size limit in Access 2000-2003
(1 GB in Access 97), and that limit covers everything: data, forms, code
modules, and all other objects.

The solution to your problem is to upgrade to SQL Server - use the right
tool for the right job.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Jan 9 '07 #2

Lykins wrote:
The problem comes in that when we pull a dataset out of a large table
we do not get the same result every time. Example is transaction count
for store 1 shows 3000 one time, the next run it is 3015, the next is
2089.
Patrick, your data is either screwed (corrupted), your application is
screwed or your data structure is screwed. Or all three are screwed.

Are you using Jet or another database engine for your main data?
--
Tim http://www.ucs.mun.ca/~tmarshal/
^o<
/#) "Burp-beep, burp-beep, burp-beep?" - Quaker Jake
/^^ "Be Careful, Big Bird!" - Ditto "TIM-MAY!!" - Me
Jan 9 '07 #3

"Lykins" <ly****@gmail.com> wrote in
news:1168355262.639605.23460@s34g2000cwa.googlegroups.com:
We currently use Access 2003 in our company and have had this issue
in every version from Access 97 to 2003. We deal with large databases
and run a lot of queries over tables with millions of records in
them. [...]

Is there any workaround that we can try?

Thank you,
Patrick L. Lykins
99.44% of Access errors which are reported here are application errors:
errors in concept, errors in logic, errors in design, errors in SQL,
errors in code.
So my first suggestion is to hire an independent consultant to examine all
of those things in your database and to submit a full evaluation to your
database administrator and to senior management.

My second suggestion is to consider something much more powerful than
256 MB of RAM, P2s or P3s, and Jet for handling millions of records. An
independent consultant could help with suggestions about this as well.


--
lyle fairfield
Jan 9 '07 #4

Hi, Patrick.
Is there any work around that we can try?
The #1 cause of missing records in a query is poor database design. The #2
cause is poor query design. The next two most common causes are data
corruption and accidental user deletions; depending upon the environment,
one may be more prevalent than the others.

Ensure that all of the tables have primary keys and are normalized. Ensure
that an experienced DBA or query designer is writing and testing your
queries. If the database is corrupted and has that many records, then a
number of bizarre things will happen, not just this one query returning a
different number of records each time it's run. So if the file is
corrupted, then you'll likely have other clues that this is the problem.
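
As a minimal illustration of the first point, a missing primary key can be
added with Jet DDL; the table and field names here are hypothetical:

```sql
-- Give the transactions table a primary key so each row is
-- uniquely identifiable (names are hypothetical).
ALTER TABLE Transactions
ADD CONSTRAINT PK_Transactions PRIMARY KEY (TransactionID);
```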

With that many records, I'd recommend upgrading the back end to a
client/server database, such as Oracle or SQL Server. Oracle Express and
SQL Server 2005 Express are both free and hold up to 4 GB of data. And if
you do upgrade to an express version, put the client/server database on a
workstation with more horsepower.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Lykins" <ly****@gmail.com> wrote in message
news:11*********************@s34g2000cwa.googlegroups.com...
We currently use Access 2003 in our company and have had this issue
in every version from Access 97 to 2003. We deal with large databases
and run a lot of queries over tables with millions of records in
them. [...]

Is there any workaround that we can try?

Thank you,
Patrick L. Lykins

Jan 10 '07 #5

Hi, Patrick.

In case you're interested in the free client/server databases to store your
data, you can download Oracle Express from the following Web page:

http://www.oracle.com/technology/pro.../xe/index.html

Or download SQL Server 2005 Express:

http://www.microsoft.com/downloads/d...displaylang=en

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SPAM> wrote in
message news:N8******************************@adelphia.com...
Hi, Patrick.
>Is there any work around that we can try?

The #1 cause of missing records in a query is poor database design. The
#2 cause is poor query design. [...]

With that many records, I'd recommend upgrading the back end to a
client/server database, such as Oracle or SQL Server. [...]

HTH.
Gunny


Jan 10 '07 #6

Use SQL Server as a back end if you have that many records.

Lykins wrote:
We currently use Access 2003 in our company and have had this issue
in every version from Access 97 to 2003. We deal with large databases
and run a lot of queries over tables with millions of records in
them. [...]
Jan 11 '07 #7

Try SQL Server.

Lykins wrote:
We currently use Access 2003 in our company and have had this issue
in every version from Access 97 to 2003. We deal with large databases
and run a lot of queries over tables with millions of records in
them. [...]
Jan 11 '07 #8

Thanks to everyone who responded. I must have had a caching problem
because I did not see any replies as of yesterday until this morning.

We have issues with most of our larger applications. The data resides
either in Visual FoxPro tables or on one of our three MS SQL 2000 servers.
All of my reporting is done through VFP 6.0 programs, ASP, or ColdFusion,
and we use ODBC connections to access all of it. However, we have about
two users who are VPs, and two who work directly under them, who are
"power users," and we let them create their own queries. Unfortunately,
the size of our company and the amount of detail is just too great. I have
given them some small programs to create a small subset of the data, and
that had worked until recently. I have been working on an ASP program that
would act similarly to Access without all of the bells and whistles. I
believe I may just need to go down that path, because I am now 100% sure
that the issue is the size of the data being handled: one of the tables
accessed through SQL 2000 is 102.4 GB now, and most reside in the
10-75 GB range.
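
Since the heavy tables already live on SQL Server 2000, one option is to
let the server do the aggregation and pull back only the summary over the
ODBC link (for example via a pass-through query). A sketch in T-SQL, with
hypothetical table and column names:

```sql
-- Aggregate server-side so only one summary row per store crosses
-- the ODBC connection (table and column names are hypothetical).
SELECT StoreID, COUNT(*) AS TransactionCount
FROM dbo.Transactions
WHERE SaleDate >= '2007-01-01'
GROUP BY StoreID
ORDER BY StoreID;
```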

Thanks again,
Patrick

On Jan 9, 1:26 pm, Rich P <rpng...@aol.com> wrote:
Greetings,

Try to think about your problem this way. Do you think anyone would
ever try to haul a semi trailer with a 4 cylinder pickup truck? Of
course not. If you have big data - you need something with a big
engine - that would be SQL Server. [...]

The solution to your problem would be to upgrade to SQL Server - use
the right tool for the right job.

Rich
Jan 15 '07 #9

This discussion thread is closed; replies have been disabled.