Bytes | Software Development & Data Engineering Community
Need workaround for possible data leaks.

We currently use Access 2003 in our company and have had these issues
in every version from Access 97 to 2003. We deal with large databases
and run a lot of queries over tables with millions of records in them.

The problem is that when we pull a dataset out of a large table, we do
not get the same result every time. For example, the transaction count
for store 1 shows 3,000 on one run, 3,015 on the next, and 2,089 on the
one after that. All of this is over a static database that only gets
updated at night. To work around this, we typically pull the data out
into a temporary table and run the reports over that, so that we are
only going over, say, 100,000 rows instead of 2,000,000. Most of the
time this seems to work; lately, however, with the amount of data we
are using, it takes several tries to copy the data correctly, and then
several more to get rid of it.
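The temp-table approach the post describes, materializing a report-sized subset and querying only the copy, can be sketched as follows. This is an illustration only: it uses SQLite in place of Jet/Access, and the table and column names (`transactions`, `report_subset`, `store`, `amount`) are made up for the example.

```python
import sqlite3

# In-memory stand-in for the nightly-updated transactions table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE transactions (store INTEGER, amount REAL)")
con.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [(1, 10.0)] * 3000 + [(2, 5.0)] * 1500,
)

# Materialize only the rows the report needs into a temp table, then run
# the report queries against the (much smaller) fixed snapshot.
con.execute(
    "CREATE TEMP TABLE report_subset AS "
    "SELECT store, amount FROM transactions WHERE store = 1"
)
(count,) = con.execute("SELECT COUNT(*) FROM report_subset").fetchone()
print(count)  # 3000 on every run, since the copy is a fixed snapshot
```

Because the report only ever touches the snapshot, its counts cannot drift between runs, which is presumably why the workaround helped while it still fit in memory.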

We have tried this on multiple machines, from an older P2 at 1.0 GHz
with 256 MB of RAM to a P3 at 2.5 GHz with 1.5 GB of RAM. The only
difference between the machines was that the more RAM a machine had,
the more data it could hold before the results became corrupted, which
is why I suspect memory leaks.

Is there any workaround that we can try?

Thank you,
Patrick L. Lykins

Jan 9 '07 #1
8 replies, 1598 views
Greetings,

Try to think about your problem this way. Would anyone ever try to haul
a semi trailer with a 4-cylinder pickup truck? Of course not. If you
have big data, you need something with a big engine, and that would be
SQL Server. Just as the little 4-cylinder pickup would spin its wheels
trying to pull the semi trailer, Access spins its wheels trying to pull
big data. Access also has a 2 GB file size limit per database file
(1 GB in Access 97), and that limit includes data, forms, and code
modules.

The solution to your problem would be to upgrade to SQL Server: use the
right tool for the right job.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Jan 9 '07 #2
Lykins wrote:
The problem comes in that when we pull a dataset out of a large table
we do not get the same result every time. Example is transaction count
for store 1 shows 3000 one time, the next run it is 3015, the next is
2089.
Patrick, your data is either screwed (corrupted), your application is
screwed or your data structure is screwed. Or all three are screwed.

Are you using Jet or another database engine for your main data?
--
Tim http://www.ucs.mun.ca/~tmarshal/
^o<
/#) "Burp-beep, burp-beep, burp-beep?" - Quaker Jake
/^^ "Be Careful, Big Bird!" - Ditto "TIM-MAY!!" - Me
Jan 9 '07 #3
"Lykins" <ly****@gmail.comwrote in news:1168355262.639605.23460
@s34g2000cwa.googlegroups.com:
[quoted message snipped]
99.44% of the Access errors reported here are application errors:
errors in concept, errors in logic, errors in design, errors in SQL,
errors in code.

So my first suggestion is to hire an independent consultant to examine
all those things with respect to your database and to submit a full
evaluation of them to your database administrator and to senior
management.

My second suggestion is to consider something much more powerful than
256 MB of RAM, P2s or P3s, and Jet to handle millions of records. An
independent consultant might help with suggestions about this as well.


--
lyle fairfield
Jan 9 '07 #4
Hi, Patrick.
> Is there any workaround that we can try?
The #1 cause of missing records in a query is poor database design. The #2
cause is poor query design. The next two most common causes are data
corruption and accidental user deletions. Depending upon the environment,
one may be more prevalent than the other.

Ensure that all of the tables have primary keys and are normalized. Ensure
that an experienced DBA or query designer is writing and testing your
queries. If the database is corrupted and has that many records, then a
number of bizarre things will happen, not just this one query returning a
different number of records each time it's run. So if the file is
corrupted, then you'll likely have other clues that this is the problem.

With that many records, I'd recommend upgrading the back end to a
client/server database, such as Oracle or SQL Server. Oracle Express and
SQL Server 2005 Express are both free and hold up to 4 GB of data. And if
you do upgrade to an express version, put the client/server database on a
workstation with more horsepower.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Lykins" <ly****@gmail.comwrote in message
news:11*********************@s34g2000cwa.googlegro ups.com...
[quoted message snipped]

Jan 10 '07 #5
Hi, Patrick.

In case you're interested in the free client/server databases to store your
data, you can download Oracle Express from the following Web page:

http://www.oracle.com/technology/pro.../xe/index.html

Or download SQL Server 2005 Express:

http://www.microsoft.com/downloads/d...displaylang=en

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SP AMwrote in
message news:N8******************************@adelphia.com ...
[quoted message snipped]

Jan 10 '07 #6
Use SQL Server for a backend if you have that many records.

Lykins wrote:
[quoted message snipped]
Jan 11 '07 #7
Try SQL Server.

Lykins wrote:
[quoted message snipped]
Jan 11 '07 #8
Thanks to everyone who responded. I must have had a caching problem,
because I did not see any replies until this morning.

We have issues with most of our larger applications. The data resides
either in a Visual FoxPro table or on one of our three MS SQL Server
2000 servers. All of my reporting is done through VFP 6.0 programs,
ASP, or ColdFusion, and we use ODBC connections to access all of it.
However, we have about two users who are VPs, and two who work directly
under them, who are "power users," and we let them create their own
queries. Unfortunately, the size of our company and the amount of
detail is just too great. I have given them some small programs to
create a small subset of the data, and that had worked until recently.
I have been working on an ASP program that would act like Access
without all of the bells and whistles. I believe I may just need to go
down that path, because I am now 100% sure that the issue is the size
of the data Access is trying to handle: one of the tables it accesses
through SQL Server 2000 is 102.4 GB now, and most reside in the
10-75 GB range.

Thanks again,
Patrick

On Jan 9, 1:26 pm, Rich P <rpng...@aol.com> wrote:
[quoted message snipped]
Jan 15 '07 #9

This thread has been closed and replies have been disabled. Please start a new discussion.

