Bytes | Software Development & Data Engineering Community

Access DB too large, ideas?

anoble1
245 128KB
I have an Access database that is over a GB. I have two tables that I dump monthly info into. When I open one of those tables, it takes over five minutes just to open.

Any suggestions? I don't have a SQL Server or money for one.
Oct 27 '14 #1
twinnyfo
3,653 Expert Mod 2GB
anoble1,

What do you store in your tables (I know, data--but what kinds)? Is your DB fully normalized? That can help significantly with wasted resources. Also, have you compacted and repaired your DB recently? Be sure to make a backup before you compact and repair.
Oct 27 '14 #2
anoble1
245 128KB
You may have to help me out a bit here. I compact and repair every few days, for sure. Do you mean what data types I use?
Oct 27 '14 #3
Seth Schrock
2,965 Expert 2GB
FYI you can get SQL Server Express for free.
Oct 27 '14 #4
anoble1
245 128KB
I don't know much about SQL Server, and I do not have a dedicated PC for it; I just have my own PC. I am the only person who uses this, though. I wonder if I could install it on my PC, store the data there, and just link to it?
Oct 27 '14 #5
twinnyfo
3,653 Expert Mod 2GB
My question was in terms of types of data. If you have attachments included in your database, your file could grow rather large. Keep in mind that I have no idea what your DB is used for, nor how your tables are built. But, from a "basic, generic MS Access DB user" standpoint, a DB over 1 GB in size is almost inconceivable to me.

A coworker of mine has a DB that is 600 MB, but it is a RAW DATA pull that contains hundreds of text fields and the entire thing is 100% un-normalized. I use a RAW text DB pull that is about 300 MB, but after I extract the necessary data and normalize it for my purposes, it is about 150 MB (still a lot).

My point is, I'm just wondering what you are storing that amounts to 1 GB of data. Not that there is anything wrong with what you have or how you are doing it; I'm just a little amazed at the size of the file for a low-budget operation.
Oct 27 '14 #6
anoble1
245 128KB
Attached is the data that I import into Access using the Wizard. No attachments.
Attached Images
File Type: jpg Capture.jpg (44.4 KB, 177 views)
Oct 27 '14 #7
twinnyfo
3,653 Expert Mod 2GB
Only 15 fields? It looks like it is all text, too. It might be possible to normalize the data during the import process. I assume you receive the data in a spreadsheet? How many lines are imported each time?

Our friend JForbes could probably calculate how many rows you would need to max out Access with only 15 fields, but it must be a lot.
Oct 27 '14 #8
anoble1
245 128KB
It varies: 15,000-50,000 records in a sheet. When I open the table, it says "Running query" at the bottom of Access. I wonder if it would help if I made an import function.
Oct 27 '14 #9
twinnyfo
3,653 Expert Mod 2GB
I would definitely recommend looking into that. I use an import function rather than a straight import. Your data looks highly un-normalized, and fixing that could save considerable space.

And.... you can practice your coding skills!

:-)
Oct 27 '14 #10
jimatqsi
1,271 Expert 1GB
Do you compact the database from time to time? The actual size could be less than you think. Look under the Manage option for Compact and Repair (or under Tools in older versions of Access).

Jim
Oct 27 '14 #11
twinnyfo
3,653 Expert Mod 2GB
Hey, Jim! Good to see you again. Seems like you have been away for a short while. Hope things are going well!
Oct 27 '14 #12
jforbes
1,107 Expert 1GB
Haha, yeah, that would be a lot of records... a lot. Sorry for being late to the party, Twinnyfo; I spent some time in Germany for work. It was a blast, and they appreciate math. =)

jimatqsi is on it. If you create a lot of records and then delete them, your DB size won't drop on its own.

Make sure you clear out (delete) all your temp tables after you perform an import. Another guess is that you have some leftover tables that are no longer used. Even at that many records once a month, you're nowhere near a gig. I can do the math tomorrow, salesmen permitting. ;)
Oct 28 '14 #13
twinnyfo
3,653 Expert Mod 2GB
All,

I am sure there must be "something" creating such a large DB file. But without knowing what data are in the tables, it is all conjecture.

My first thought was (Post #2), as was with many others, to Compact and Repair.

Outside of that, proper Table Structure could help.

@Anoble1, can you post the table structure you have for this table (or tables, as the case may be)? Please include the field names and data types. This may help us recommend some structural changes.
Oct 28 '14 #14
anoble1
245 128KB
I compact and repair the database a few times a week. I have a feeling something is wrong with these data types. I took over this database, and the import has always been done with Access's automated import from Excel.

After looking at Capture2, some of this stuff I am not familiar with, such as Field Size and Format. It sounds like I need to make a new table.
Attached Images
File Type: jpg Large DB 1.JPG (46.9 KB, 188 views)
File Type: jpg Capture2.jpg (62.5 KB, 142 views)
Oct 29 '14 #15
anoble1
245 128KB
@twinnyfo Wow, I found one of my issues. I looked at the structure of the table and went to its properties. There was a filter set up in the table's properties. The table opens fast now, like it should.
Oct 29 '14 #16
twinnyfo
3,653 Expert Mod 2GB
anoble1,

Yes, I would definitely recommend creating new tables.

Based on what I see (not what I know) about your tables, it looks like there could be several ways to optimize your Detail Sales Table (which should be several tables).

I'll talk "generics" here; use what you know about your tables to make informed decisions.

Let's assume I am keeping track of sales at my business using a DB.

I need a table called tblCustomers. Everything that I do has to do with customers, so my customers table will include CustomerID, first and last names, business names, contact info, addresses, and the like. Anything that is specifically associated with my customers and exclusive to one customer goes in this table.

I also have a table called tblProducts, because what is my business without products to sell, right? This table will include all the information pertaining to the product: ProductID, SKU, Vendor (where I get my products from), Description, Cost (what I pay for it), and Price (what I sell it for). Anything that is specifically associated with my products and exclusive to one product goes in this table.

Then, I have a table called tblInvoices. Every time I sell something, I create an invoice. Believe it or not, all we need in this invoice table is an InvoiceID, InvoiceDate and CustomerID. That's all I need. Because.....

I have another table, called tblInvoiceDetails. This table uses the InvoiceID as a Foreign Key, then lists the ProductID and Quantity.

You get the idea. Now, when I want to look at sales invoices, every invoice will have a record (or records) associated with it in the invoice details table. Based on those tables, using join queries, etc., we know the customer name, address, contact info, product ID, description, cost, price, total cost (Quantity x Cost) and total price (Quantity x Price).

Here is going to be your challenge (and we will be glad to help you along the way): you must import your Excel spreadsheet (which I have no doubt you receive as a "flat file," much like your table), break that spreadsheet into its various parts (Customers, Products, Invoices, Invoice Details, etc.), and append those values to your database.

Believe it or not, this actually sounds fun to me--but also a headache at the same time, because it will require many hours of coding, building queries, testing, retesting and testing again, to make sure it works right.

But, again, we're glad to work through these steps with you.
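To make the split concrete, here is a rough sketch of turning flat spreadsheet rows into the four tables above. It is written in Python with sqlite3 purely so it is self-contained and runnable anywhere; Access SQL and DAO syntax differ, and every table, field, and sample value here is illustrative, not taken from anoble1's actual data.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalized schema, roughly as described above (names are illustrative).
cur.executescript("""
CREATE TABLE tblCustomers (
    CustomerID   INTEGER PRIMARY KEY,
    CustomerName TEXT UNIQUE
);
CREATE TABLE tblProducts (
    ProductID   INTEGER PRIMARY KEY,
    PartNum     TEXT UNIQUE,
    Description TEXT
);
CREATE TABLE tblInvoices (
    InvoiceID   INTEGER PRIMARY KEY,
    InvoiceDate TEXT,
    CustomerID  INTEGER REFERENCES tblCustomers,
    UNIQUE (InvoiceDate, CustomerID)
);
CREATE TABLE tblInvoiceDetails (
    InvoiceID INTEGER REFERENCES tblInvoices,
    ProductID INTEGER REFERENCES tblProducts,
    Quantity  INTEGER
);
""")

# Flat rows as they might arrive in the spreadsheet (one row per line item).
flat = [
    ("Acme Co",  "2014-10-01", "P-100", "Widget", 3),
    ("Acme Co",  "2014-10-01", "P-200", "Gasket", 1),
    ("Bolt Ltd", "2014-10-02", "P-100", "Widget", 7),
]

for customer, inv_date, part, desc, qty in flat:
    # Insert-if-missing: repeated customers/products collapse to one row each.
    cur.execute("INSERT OR IGNORE INTO tblCustomers (CustomerName) VALUES (?)",
                (customer,))
    cur.execute("INSERT OR IGNORE INTO tblProducts (PartNum, Description) VALUES (?, ?)",
                (part, desc))
    (cust_id,) = cur.execute("SELECT CustomerID FROM tblCustomers WHERE CustomerName = ?",
                             (customer,)).fetchone()
    (prod_id,) = cur.execute("SELECT ProductID FROM tblProducts WHERE PartNum = ?",
                             (part,)).fetchone()
    # Treat one customer + date as one invoice (your real business rule may differ).
    cur.execute("INSERT OR IGNORE INTO tblInvoices (InvoiceDate, CustomerID) VALUES (?, ?)",
                (inv_date, cust_id))
    (inv_id,) = cur.execute(
        "SELECT InvoiceID FROM tblInvoices WHERE InvoiceDate = ? AND CustomerID = ?",
        (inv_date, cust_id)).fetchone()
    cur.execute("INSERT INTO tblInvoiceDetails (InvoiceID, ProductID, Quantity) "
                "VALUES (?, ?, ?)", (inv_id, prod_id, qty))

# A join query rebuilds the original flat view on demand.
rows = cur.execute("""
    SELECT c.CustomerName, i.InvoiceDate, p.PartNum, d.Quantity
    FROM tblInvoiceDetails d
    JOIN tblInvoices  i ON i.InvoiceID  = d.InvoiceID
    JOIN tblCustomers c ON c.CustomerID = i.CustomerID
    JOIN tblProducts  p ON p.ProductID  = d.ProductID
    ORDER BY i.InvoiceDate, p.PartNum
""").fetchall()
print(rows)
```

Note how the three flat rows become two customers, two products, two invoices, and three detail rows: the repeated customer and product text is stored once, which is exactly where the space savings come from.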
Oct 29 '14 #17
anoble1
245 128KB
Yeah, I have other databases like that. I'm not sure that would be the best thing here, though. However, I made a simple import via VBA. It takes a while to import, though.

Expand|Select|Wrap|Line Numbers
' Open the existing spreadsheet
' (GetObject on a file path returns a Workbook object)
Set appExcel = GetObject("C:\Users\anoble\Desktop\PSSR Compensation\PSSR ACCRUAL, BALDWIN ALLOCATION, PARTS COUNTER BONUS\DETAIL SALES UPLOAD\detail sales report-part.xls")

' Don't show the spreadsheet on screen
'appExcel.Application.Visible = False

Set workSheet = appExcel.Worksheets(1)

' Open the target recordset ONCE, before the loop.
' (Re-opening it for every spreadsheet row is a big slowdown.)
sqlStatement = "SELECT ID, BRANCH, LINE, ACCOUNT, ORDERNum, PARTNum, QTY, " & _
               "DESCRIPTION, EXTINVOICEPRICE, BDNET, BDAVN, VENDOR, " & _
               "CLASS, SOURCE, PERIOD, CMAST FROM tblDetailSales;"
Set records = dbs_curr.OpenRecordset(sqlStatement, dbOpenDynaset, dbSeeChanges, dbOptimistic)

i = 2
While workSheet.Range("A" & i).Value <> ""
    records.AddNew
    records!BRANCH = workSheet.Range("A" & i).Value
    records!Line = workSheet.Range("B" & i).Value
    records!ACCOUNT = workSheet.Range("C" & i).Value
    records!ORDERNum = workSheet.Range("D" & i).Value
    records!PARTNum = workSheet.Range("E" & i).Value
    records!QTY = workSheet.Range("F" & i).Value
    records!Description = workSheet.Range("G" & i).Value
    records!EXTINVOICEPRICE = workSheet.Range("H" & i).Value
    records!BDNET = workSheet.Range("I" & i).Value
    records!BDAVN = workSheet.Range("J" & i).Value
    records!VENDOR = workSheet.Range("K" & i).Value
    records!Class = workSheet.Range("L" & i).Value
    records!Source = workSheet.Range("M" & i).Value
    records!PERIOD = workSheet.Range("N" & i).Value
    records!CMAST = workSheet.Range("O" & i).Value
    records.Update
    i = i + 1
Wend

records.Close

' Release objects
Set records = Nothing
Set workSheet = Nothing
Set appExcel = Nothing

DoCmd.Close acForm, "frmRunningQuery"
MsgBox "Done"
Oct 30 '14 #18
twinnyfo
3,653 Expert Mod 2GB
anoble1,

Have you tried linking the spreadsheet to your DB? This would enable you to use it much like you use a table. This would 1) simplify the code tremendously and 2) probably increase the speed.

It also looks like your code is doing a raw import, much the same as what you started with. However, this could just be your initial work on the code?
Oct 30 '14 #19
anoble1
245 128KB
I have used this import in the past. I wonder what other or better VBA options there are?
Oct 30 '14 #20
twinnyfo
3,653 Expert Mod 2GB
Like I said, if the spreadsheet is linked, you can create a recordset from the spreadsheet (or append from it with a single query), rather than going line by line, cell by cell.
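To show the shape of that, here is a small sketch of the set-based approach, again in Python with sqlite3 standing in for Access (in Access you would link the sheet via External Data > Excel and then run one append query; all names below are illustrative). The linked spreadsheet becomes just another table, and one INSERT ... SELECT replaces the entire cell-by-cell loop.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Stand-in for the linked spreadsheet: once linked, Access exposes the
# sheet as a table you can query directly (three fields shown for brevity).
cur.execute("CREATE TABLE lnkDetailSales (BRANCH TEXT, PARTNum TEXT, QTY INTEGER)")
cur.executemany("INSERT INTO lnkDetailSales VALUES (?, ?, ?)",
                [("01", "P-100", 3), ("01", "P-200", 1), ("02", "P-100", 7)])

# The permanent table the monthly data is appended to.
cur.execute("""CREATE TABLE tblDetailSales (
                   ID      INTEGER PRIMARY KEY,
                   BRANCH  TEXT,
                   PARTNum TEXT,
                   QTY     INTEGER)""")

# One set-based append query replaces the whole row-by-row loop.
cur.execute("""
    INSERT INTO tblDetailSales (BRANCH, PARTNum, QTY)
    SELECT BRANCH, PARTNum, QTY FROM lnkDetailSales
""")

count = cur.execute("SELECT COUNT(*) FROM tblDetailSales").fetchone()[0]
print(count)
```

Because the database engine moves all the rows in a single statement, there is no per-cell COM traffic to Excel at all, which is where most of the time in the VBA loop goes.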

BTW, I love your avatar!
Oct 30 '14 #21
