
exceeded the 2Gb limit

I work with a lot of large databases (TIGER, Census, phone books...),
and occasionally I run a process that exceeds the 2GB limit. When
this happens I've NOT been able to erase data, compact, or otherwise
recover the database. Because of backups, this hasn't been a big
problem in terms of data loss, but I can lose hours of processing
time when running address matching or parsing.

Does anyone know what the fix is?

TIA
Tom
Nov 13 '05 #1
Well, the first, obvious response is that if you are frequently exceeding
2GB, then Jet might not be the best back-end (nor is MSDE, which has the
same limit). Obvious alternatives are Microsoft SQL Server, MySQL, or
Interbase/Firebird. PostgreSQL is also a good choice, and version 8 now
runs natively on Windows and looks really nice, but it is also too new to
have proven itself yet.

If you don't go the database server route, the workarounds pretty much
involve splitting the back-end in some manner or other. The downside is
that you can't enforce relational integrity across files at the database
level, but that is often acceptable.

Splitting an MDB back-end usually means either putting some tables in one
back-end and some in another, or splitting the largest table vertically
into two tables, each holding some of the fields and linked 1-to-1 on the
primary key. You can then create a query that joins the two halves back
together and treat it pretty much as if the table had never been split.
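For instance, a minimal sketch of the rejoin in VBA/DAO (table, field,
and query names are all hypothetical):

' Re-join a table that was split 1-to-1 on its primary key across
' two back-ends, so the rest of the app can treat it as one table.
Dim db As DAO.Database
Set db = CurrentDb
db.CreateQueryDef "qryParcels", _
    "SELECT A.ParcelID, A.Street, A.City, B.Owner, B.Assessment " & _
    "FROM Parcels_Main AS A INNER JOIN Parcels_Ext AS B " & _
    "ON A.ParcelID = B.ParcelID;"

Once the saved query exists, most code, forms, and reports can use
qryParcels wherever the original table name appeared.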

On 8 Oct 2004 00:02:21 -0700, tw*@gate.net (Tom Warren) wrote:
> [original question quoted; snipped]


Nov 13 '05 #2
Tom Warren wrote:
> [original question quoted; snipped]

Use a DBMS that can handle more than 2GB, e.g. SQL Server (not MSDE,
which has the same 2GB limit).

--
Pretentious? Moi?
Nov 13 '05 #3
Trevor Best <nospam@localhost> wrote in message
news:<41*********************@auth.uk.news.easynet.net>...
> [original question quoted; snipped]
>
> Use a DBMS that can handle more than 2GB, e.g. SQL Server (not MSDE,
> which has the same 2GB limit).


I have two SQL Servers, but mainly for distributed-processing (speed)
reasons I use Jet. Thanks to everyone for their opinions on what data
store I should use, but all I'm really looking for is a way to recover
the occasional maxed-out database.

TIA
Tom
Nov 13 '05 #4
Tom Warren wrote:
> [earlier quotes snipped]
>
> I have two SQL Servers, but mainly for distributed-processing (speed)
> reasons I use Jet. Thanks to everyone for their opinions on what data
> store I should use, but all I'm really looking for is a way to recover
> the occasional maxed-out database.


Good luck with that one. The only time I've ever reached the 2GB limit it
was due to bloat, and compacting fixes that. I've never seen growth like
that from data alone.

--
Pretentious? Moi?
Nov 13 '05 #5
You may be able to recover using the JetComp tool.

It compacts the file without opening it first, so it may work.

ACC2000: Jet Compact Utility Available in Download Center (273956)
http://support.microsoft.com/default...B;EN-US;273956

Another possibility, which you may have already considered, is to break the
process up and compact at intervals.
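A rough sketch of that in VBA/DAO (paths are hypothetical, and every
connection to the file must be closed before compacting):

' Compact the working MDB between processing batches.
' CompactDatabase writes a compacted copy alongside the original,
' so the drive needs free space for both files at once.
Dim src As String, tmp As String
src = "C:\data\work.mdb"
tmp = "C:\data\work_tmp.mdb"
DBEngine.CompactDatabase src, tmp
Kill src          ' remove the bloated original
Name tmp As src   ' swap the compacted copy into place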

Also, you may wish to consider breaking the database up into a series of
separate databases, each with a major table involved in the process, then
importing the results into the final database with the correct DRI.
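Jet SQL can append straight from another MDB file, e.g. (table names and
the path are made up):

' Pull finished results from a staging MDB into the final database.
Dim db As DAO.Database
Set db = CurrentDb
db.Execute _
    "INSERT INTO Addresses " & _
    "SELECT * FROM MatchedAddresses IN 'C:\data\staging.mdb';", _
    dbFailOnError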

Since you are erasing data during the process, is it possible to discard
it before it ever hits the database?
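For instance, filter in the append query itself so rejected rows never
take up space in the file (again, hypothetical names):

' Only rows that pass the match threshold get written at all.
Dim db As DAO.Database
Set db = CurrentDb
db.Execute _
    "INSERT INTO Parsed (Street, City, Zip) " & _
    "SELECT Street, City, Zip FROM RawImport " & _
    "WHERE MatchScore >= 80;", dbFailOnError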

Otherwise, get a Developer Edition of SQL Server, stick it on your
workstation, and play with it the way you can with Jet.

Best of luck.

--
Slainte

Craig Alexander Morrison
"Tom Warren" <tw*@gate.net > wrote in message
news:f4******** *************** **@posting.goog le.com...
Trevor Best <nospam@localho st> wrote in message

news:<41******* *************** *@auth.uk.news. easynet.net>...
Tom Warren wrote:
I work with a lot of large databases (Tiger, Census, PhoneBooks...),
and occasionally I run a process that exceeds the 2Gb. limit. When
this happens I've NOT been able to erase data, compact or otherwise
recover the database. Because of backups, this hasn't been a big
problem with data lost, but I can loose hours of processing time when
running address matching or parsing.

Does anyone know what the fix is?


Use a DBMS that handles over 2GB. e.g. SQL Server (not MSDE, that has a
2GB limit also).


I have 2 SQL servers and mainly for distributed processing (speed)
reasons I use jet. Thanks to all for their opinions on what data store
I should use, but all I'm really looking for is a fix for recovering
the occasional maxed out database.

TIA
Tom

Nov 13 '05 #6
Craig,

Thanks for a smart idea, but JetComp did not work, so I'm still looking.

Thanks again.
Tom
"Craig Alexander Morrison" <re***@newsgrou ps.com> wrote in message news:<41******@ 212.67.96.135>. ..
You may be able to recover using the tool JetComp.

It compacts the file without opening it first, that may work.

ACC2000: Jet Compact Utility Available in Download Center (273956)
http://support.microsoft.com/default...B;EN-US;273956

Another possibility, which you may have already considered, is to break the
process up and compact at intervals.

Also you may wish to consider breaking up the database into a series of
separate databases each with a major table involved in the process. Then
import the results into the final database with the correct DRI.

As you are erasing data during the process is it possible to disregard it
before it hits the database.

Otherwise get a Developer Edition of SQL Server and stick it on your
workstation and play with it like you can with Jet.

Best of luck.

--
Slainte

Craig Alexander Morrison
"Tom Warren" <tw*@gate.net > wrote in message
news:f4******** *************** **@posting.goog le.com...
Trevor Best <nospam@localho st> wrote in message

news:<41******* *************** *@auth.uk.news. easynet.net>...
Tom Warren wrote:

> I work with a lot of large databases (Tiger, Census, PhoneBooks...),
> and occasionally I run a process that exceeds the 2Gb. limit. When
> this happens I've NOT been able to erase data, compact or otherwise
> recover the database. Because of backups, this hasn't been a big
> problem with data lost, but I can loose hours of processing time when
> running address matching or parsing.
>
> Does anyone know what the fix is?

Use a DBMS that handles over 2GB. e.g. SQL Server (not MSDE, that has a
2GB limit also).


I have 2 SQL servers and mainly for distributed processing (speed)
reasons I use jet. Thanks to all for their opinions on what data store
I should use, but all I'm really looking for is a fix for recovering
the occasional maxed out database.

TIA
Tom

Nov 13 '05 #7

This thread has been closed and replies have been disabled. Please start a new discussion.
