Bytes | Software Development & Data Engineering Community


Need a workaround for possible data leaks.

We currently use Access 2003 in our company and have had this issue
in every version from Access 97 to 2003. We deal with large
databases and run a lot of queries over tables with millions of records
in them.

The problem is that when we pull a dataset out of a large table, we do
not get the same result every time. For example, the transaction count
for store 1 shows 3,000 on one run, 3,015 on the next, and 2,089 on the
run after that. All of this is over a static database that is only
updated at night. To work around it, we typically pull the data out into
a temporary table and run the reports over that, so that we are only
going over, say, 100,000 rows instead of 2,000,000. Most of the time
this seems to work; lately, however, with the amount of data we are
using, it takes several tries to get the data to copy correctly, and
then several more to get rid of it.

We have tried this on multiple machines, from an older P2 at 1.0 GHz
with 256 MB of RAM up to a P3 at 2.5 GHz with 1.5 GB of RAM. The only
difference between the machines was that the more RAM a machine had,
the more data it could hold before the results became corrupted, which
is why I suspect memory leaks.

Is there any workaround that we can try?

Thank you,
Patrick L. Lykins

Jan 9 '07 #1
Greetings,

Try to think about your problem this way. Would anyone ever try to
haul a semi trailer with a 4-cylinder pickup truck? Of course not. If
you have big data, you need something with a big engine, and that would
be SQL Server. Just as the little 4-cylinder pickup would spin its
wheels trying to pull the semi trailer, Access spins its wheels when
trying to pull big data. Access also has a hard file-size limit: 2 GB
per .mdb file in Access 2000 and later (1 GB in Access 97), and that
includes data, forms, code modules, and all other objects.

The solution to your problem is to upgrade to SQL Server: use the right
tool for the right job.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Jan 9 '07 #2
Lykins wrote:
The problem comes in that when we pull a dataset out of a large table
we do not get the same result every time. Example is transaction count
for store 1 shows 3000 one time, the next run it is 3015, the next is
2089.
Patrick, your data is either screwed (corrupted), your application is
screwed or your data structure is screwed. Or all three are screwed.

Are you using Jet or another database engine for your main data?
--
Tim http://www.ucs.mun.ca/~tmarshal/
^o<
/#) "Burp-beep, burp-beep, burp-beep?" - Quaker Jake
/^^ "Be Careful, Big Bird!" - Ditto "TIM-MAY!!" - Me
Jan 9 '07 #3
"Lykins" <ly****@gmail.com> wrote in
news:1168355262.639605.23460@s34g2000cwa.googlegroups.com:
> [original post quoted in full; snipped]
99.44% of the Access errors reported here are application errors:
errors in concept, errors in logic, errors in design, errors in SQL,
errors in code.
So my first suggestion is to hire an independent consultant to examine
all of those things with respect to your database and to submit a full
evaluation of them to your database administrator and to senior
management.

My second suggestion is to consider something much more powerful than
256 MB of RAM, P2s or P3s, and Jet to handle millions of records. An
independent consultant might help with suggestions about this as well.


--
lyle fairfield
Jan 9 '07 #4
Hi, Patrick.
> Is there any work around that we can try?
The #1 cause of missing records in a query is poor database design. The #2
cause is poor query design. The next two most common causes are data
corruption and accidental user deletions. Depending upon the environment,
one may be more prevalent than the other.

Ensure that all of the tables have primary keys and are normalized. Ensure
that an experienced DBA or query designer is writing and testing your
queries. If the database is corrupted and has that many records, then a
number of bizarre things will happen, not just this one query returning a
different number of records each time it's run. So if the file is
corrupted, then you'll likely have other clues that this is the problem.

With that many records, I'd recommend upgrading the back end to a
client/server database, such as Oracle or SQL Server. Oracle Express and
SQL Server 2005 Express are both free and hold up to 4 GB of data. And if
you do upgrade to an express version, put the client/server database on a
workstation with more horsepower.

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"Lykins" <ly****@gmail.com> wrote in message
news:11*********************@s34g2000cwa.googlegroups.com...
> [original post quoted in full; snipped]

Jan 10 '07 #5
Hi, Patrick.

In case you're interested in the free client/server databases to store your
data, you can download Oracle Express from the following Web page:

http://www.oracle.com/technology/pro.../xe/index.html

Or download SQL Server 2005 Express:

http://www.microsoft.com/downloads/d...displaylang=en

HTH.
Gunny

See http://www.QBuilt.com for all your database needs.
See http://www.Access.QBuilt.com for Microsoft Access tips and tutorials.
http://www.Access.QBuilt.com/html/ex...ributors2.html for contact
info.
"'69 Camaro" <Fo**************************@Spameater.orgZERO_SPAM> wrote in
message news:N8**********************@adelphia.com...
> [previous reply and the original post quoted in full; snipped]
Jan 10 '07 #6
Use SQL Server for a backend if you have that many records.

Lykins wrote:
> [original post quoted in full; snipped]
Jan 11 '07 #7
Try SQL Server.

Lykins wrote:
> [original post quoted in full; snipped]
Jan 11 '07 #8
Thanks to everyone who responded. I must have had a caching problem,
because I did not see any replies yesterday; they only appeared this
morning.

We have issues with most of our larger applications. The data resides
either in a Visual FoxPro table or on one of our three MS SQL Server
2000 servers. All of my reporting is done through VFP 6.0 programs,
ASP, or ColdFusion, and we use ODBC connections throughout. However, we
have about two users who are VPs, and two who work directly under them,
who are "power users," and we let them create their own queries.
Unfortunately, the size of our company and the amount of detail is just
too great. I have given them some small programs to create a small
subset of the data, and that worked until recently. I have been working
on an ASP program that would act like Access without all of the bells
and whistles. I believe I may just need to go down that path, because I
am now certain the issue is the size of the data Access is trying to
handle: one of the tables it accesses through SQL Server 2000 is
102.4 GB now, and most reside in the 10-75 GB range.

Thanks again,
Patrick

On Jan 9, 1:26 pm, Rich P <rpng...@aol.com> wrote:
> [reply #2 quoted in full; snipped]
Jan 15 '07 #9

This thread has been closed and replies have been disabled. Please start a new discussion.

