Bytes | Software Development & Data Engineering Community

What's the maximum sensible size for an Access application?

Hi All,

I developed a nice little database for analysis of equipment log files.

The project proved very useful and people started sending me more and more data.

It turned out I ran up against the 2 GB limit quickly, so I split the db and started a separate db for each piece of equipment, using a front-end db with linked tables. Life was good.

Currently I have 12 GB of data in linked tables, with queries running through the front end on 30 million records.

Queries run really slowly (a couple of hours at least) and will clearly only get worse with more data...

I'm wondering if Access is still a sensible architecture, or if I would get better performance from a SQL Server implementation. Our corporate network is really slow, so I'm keen to host the data on my PC, and if I understand correctly I can't run SQL Server without a server?

Another thought is to process data into OLAP cubes offline and then query the cubes instead of the tables.

I'm way out of my depth here (only 3 months using Access/VBA) and would really appreciate an experienced user's perspective on what's sensible.

Thanks,

Mark.
Aug 28 '10 #1
Delerna
1,134 Expert 1GB
You will definitely be better off moving over to SQL Server, and it is certainly possible to run SQL Server on a PC; I do it on mine.
However, you will not gain the full benefit of SQL Server if you don't run it on a proper server. The server specs play a large role in any performance gain you will get from SQL Server, and indeed the specs will vary according to the use you will put the database to.

If you want to try it out for a while before making a decision, I would suggest that you install SQL Server Express onto your PC along with SQL Server Management Studio Express.

Both of these are freely available from the Microsoft site.
Aug 31 '10 #2
TheSmileyCoder
2,322 Expert Mod 2GB
If you're running queries on 30 million records, it may also be worth spending some time on designing your indexes properly, and looking into whether you have set all your field types correctly.

12 GB in 3 months sounds like a lot to me (others may think differently).
Have you tried doing a compact & repair from time to time? If you're running a lot of inserts, specifically on indexed tables, your database can grow quite fast.

Another option could be to split old data out into separate back-ends (one for each month, or one for each piece of equipment), depending on what you need.

What information are you storing to get 30 million records?
Aug 31 '10 #3
Hey Delerna,

Thanks for the tips. I'd thought SQL Server had to run on a server OS. I'll install the Express edition on my PC and see what happens with the dbs in SQL Server.

Thanks,

Mark.
Aug 31 '10 #4
Hey SmileyOne,

It's actually 12 GB of raw data.
Each piece of equipment writes 1-2 MB of data to a log file each day. Each line of the log file contains date, time, event, codes, and additional metrics. Some of the files go back several years, so it quickly adds up.

One of the queries locates 1 of ~300 possible event codes, parses data out of the log string, and collates statistics for causes over all records. Not sure if indexing will help. Will try.
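That collation step is essentially a filtered GROUP BY. A minimal sketch of the idea, run here in SQLite from Python as a stand-in (table, column names and data are all invented, not Mark's actual schema):

```python
import sqlite3

# Toy log table: one row per event occurrence, with an invented "cause" column.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log (event_code TEXT, cause TEXT)")
con.executemany("INSERT INTO log VALUES (?, ?)", [
    ("E042", "jam"), ("E042", "jam"), ("E042", "sensor"), ("E007", "power"),
])

# Collate cause statistics for one event code across all records.
stats = con.execute("""
    SELECT cause, COUNT(*) AS n
    FROM log
    WHERE event_code = 'E042'
    GROUP BY cause
    ORDER BY n DESC
""").fetchall()
print(stats)  # [('jam', 2), ('sensor', 1)]
```

An index on `event_code` lets the WHERE clause seek straight to the matching rows instead of scanning the whole table.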

I get much worse performance querying tables through relationships than by pre-parsing the data into tables and running queries directly on those tables...

It's a steep learning curve for me on relational dbs. I have no feel for how many records can be processed in a reasonable time.

Thanks for your interest

Mark.
Aug 31 '10 #5
Delerna
1,134 Expert 1GB
Have you tried doing a compact & repair from time to time? If you're running a lot of inserts, specifically on indexed tables, your database can grow quite fast.
It's been so long since I used access as a backend I had forgotten about that.
Aug 31 '10 #6
Delerna
1,134 Expert 1GB
Not sure if indexing will help
Not sure with Access, as I had moved over to SQL Server before I became aware of indexes.

Proper indexing will provide huge boosts to query performance.
I have seen queries on unindexed tables that took many, many minutes reduced to running within seconds, just by putting an index on the table.

Indexing is a topic well worth your time getting familiar with if you do move over to SQL Server.
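To give a feel for the difference, here's a toy demonstration using SQLite from Python as a stand-in (the table, data, and event codes are made up on the spot; the same principle applies in Access or SQL Server):

```python
import sqlite3
import time

# Build a table of 200,000 fake event rows spread over 300 event codes.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log (id INTEGER PRIMARY KEY, event_code TEXT, detail TEXT)")
con.executemany(
    "INSERT INTO log (event_code, detail) VALUES (?, ?)",
    ((f"E{i % 300:03d}", "some log detail") for i in range(200_000)),
)

def timed_count(code):
    """Count rows for one event code and report how long the query took."""
    start = time.perf_counter()
    n = con.execute("SELECT COUNT(*) FROM log WHERE event_code = ?", (code,)).fetchone()[0]
    return n, time.perf_counter() - start

rows_before, t_before = timed_count("E042")   # full table scan
con.execute("CREATE INDEX idx_log_event ON log (event_code)")
rows_after, t_after = timed_count("E042")     # index seek

print(rows_before, rows_after)                # same answer either way
print(f"scan: {t_before:.4f}s  indexed: {t_after:.4f}s")
```

The answer is identical with or without the index; only the time to find the rows changes, and the gap grows with table size.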
Aug 31 '10 #7
TheSmileyCoder
2,322 Expert Mod 2GB
I would try putting an index on event codes if that is what you often query for.

One thing I have seen before in log databases is that some information is stored repeatedly. Most error codes, for instance, will have a description associated with them. Do you store the description in each record, or have you made a separate table of the descriptions, so that you only store each error description once?
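For instance, sketched with SQLite in Python (the table and column names are just placeholders):

```python
import sqlite3

# The description for each error code is stored once, in a lookup table;
# log rows carry only the short code.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ErrorCode (
    code        TEXT PRIMARY KEY,
    description TEXT NOT NULL
);
CREATE TABLE LogEntry (
    id   INTEGER PRIMARY KEY,
    code TEXT NOT NULL REFERENCES ErrorCode(code)
);
""")
con.execute("INSERT INTO ErrorCode VALUES ('E042', 'Feed motor stalled')")
con.executemany("INSERT INTO LogEntry (code) VALUES (?)", [("E042",)] * 3)

# The description is joined in only when needed for display or reporting.
rows = con.execute("""
    SELECT l.id, e.description
    FROM LogEntry AS l JOIN ErrorCode AS e ON e.code = l.code
""").fetchall()
print(rows)
```

Three log rows, but the description text exists on disk exactly once.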
Sep 1 '10 #8
Marknut
42
SQL Server is definitely the answer for 30 million records, but you should research proper implementation. You'll have a more robust tool, but still poor performance without indexes and primary keys. You should also make sure your tables are holding the correct kind of data.

I.e., if you had a database of the United States, you shouldn't have 3 text fields in a table to identify a state, county, and city. You should have 3 tables, each with a code and a text identifier. Then any other tables that reference a state, county, or city should only reference their IDs. I hope that makes sense.
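Sketched out with SQLite in Python (all names and values are invented, purely to show the shape):

```python
import sqlite3

# State, county, and city each get their own table with an id;
# other tables would reference only the ids, never the text.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE State  (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE County (id INTEGER PRIMARY KEY, state_id INTEGER REFERENCES State(id), name TEXT);
CREATE TABLE City   (id INTEGER PRIMARY KEY, county_id INTEGER REFERENCES County(id), name TEXT);
""")
con.execute("INSERT INTO State (name) VALUES ('Ohio')")
con.execute("INSERT INTO County (state_id, name) VALUES (1, 'Franklin')")
con.execute("INSERT INTO City (county_id, name) VALUES (1, 'Columbus')")

# The full text is reassembled only when needed, via joins on the ids.
row = con.execute("""
    SELECT s.name, co.name, ci.name
    FROM City ci
    JOIN County co ON co.id = ci.county_id
    JOIN State s  ON s.id = co.state_id
""").fetchone()
print(row)  # ('Ohio', 'Franklin', 'Columbus')
```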
Sep 1 '10 #9
Delerna
1,134 Expert 1GB
Yes, in line with what Marknut said.
Database normalization is another topic well worth the time you spend getting familiar with.

Edit
And in the scheme of things, a good understanding of it should come before indexing.
You may or may not already know these things.
Sep 1 '10 #10
OK great!

Thanks for all the replies. I indexed the error codes tonight, which didn't seem to make much difference. Will look into your other suggestions over the weekend.

Thanks again, really appreciate your time to reply.

Mark.
Sep 2 '10 #11
TheSmileyCoder
2,322 Expert Mod 2GB
It's a bit off the topic of the size of the database, but do your queries operate on memo-type fields? (And do you have memo-type fields in your tables?)
Sep 2 '10 #12
Hey SmileyOne,

Funny you should ask!

Each line in the log file has date, time, event code and a number of parameters (dependent on the specific code), linked together with descriptive plain text in the localized language.
Lines go beyond the 255-character limit of the Text data type, up to 350 chars, so I have to use the Memo data type.

I realized I needed to get the data into a more efficient format, so during the initial log import I parse out the date, time, event codes and params into columns in the table, and query on those text/number fields rather than querying the memo field.
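Roughly speaking, the import step does something like the following (shown as a Python sketch; the actual work is in VBA, and the line format and field names here are invented for illustration):

```python
import re

# A guessed-at log line shape: date, time, event code, then free text with
# key=value parameters embedded in it.
LINE = re.compile(
    r"^(?P<date>\d{4}-\d{2}-\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
    r"(?P<code>\w+)\s+(?P<rest>.*)$"
)

def parse_line(line):
    """Split one raw log line into the columns stored in the table."""
    m = LINE.match(line)
    if m is None:
        return None  # unparseable lines stay in the memo field only
    params = re.findall(r"(\w+)=([\w.\-]+)", m.group("rest"))
    return {
        "date": m.group("date"),
        "time": m.group("time"),
        "code": m.group("code"),
        "params": dict(params),
    }

rec = parse_line("2010-08-15 09:31:02 E042 feed jam cleared pos=113 retries=2")
print(rec)
```

Once the fields live in their own typed columns, the queries never have to touch the memo text at all.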

I spent the last few days creating relationships to get rid of the text fields, replacing them with IDs to lookup tables, to see if that improves performance, but am finding it difficult...

The lookup tables for equipment (serial number, location, etc.) and event codes (12-char text, description, meaning, etc.) were obvious. I now have this information related to logs by ID number, but can't decide how to represent the event code parameters, as the number of parameters varies by event code.

Should I have one table for all event code parameters (which will waste a lot of space) or a separate table for each event code type (which will mean >500 tables to be created and related)?

I'm googling for tips on normalization now. My Labor Day weekend is shot...



Thanks,

Mark.
Sep 3 '10 #13
TheSmileyCoder
2,322 Expert Mod 2GB
I doubt it would be a good idea to create over 500 tables. The amount of work to create and maintain them would be way too large.

How many columns do you have in your "main" table? And how do you query it? What do you query by (your WHERE clause, for instance)?



One thing I suddenly (think I) remember about SQL Server Express is that it has a limit of 4 GB per data file (but if I recall correctly you can split the database into multiple data files). I'm not really sure on this point; it's just something you should research in advance.
Sep 6 '10 #14
OK thanks. Single parameter table it is.
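Roughly what I have in mind, sketched in SQLite via Python (the table and column names are my working guesses, not a finished design):

```python
import sqlite3

# One shared parameter table instead of 500 per-code tables: each row holds
# one (event, position, value) triple, so events with 2 parameters and
# events with 5 fit the same two tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Event (
    id   INTEGER PRIMARY KEY,
    code TEXT NOT NULL
);
CREATE TABLE EventParam (
    event_id INTEGER NOT NULL REFERENCES Event(id),
    position INTEGER NOT NULL,   -- 1st, 2nd, ... parameter of that code
    value    TEXT NOT NULL,
    PRIMARY KEY (event_id, position)
);
""")
con.execute("INSERT INTO Event (id, code) VALUES (1, 'E042')")
con.executemany("INSERT INTO EventParam VALUES (1, ?, ?)",
                [(1, "113"), (2, "2")])

# Fetch one event's parameters back in order.
params = con.execute(
    "SELECT position, value FROM EventParam WHERE event_id = 1 ORDER BY position"
).fetchall()
print(params)  # [(1, '113'), (2, '2')]
```

The trade-off is that values are stored as text and the meaning of each position depends on the event code, so the code lookup table has to document what each parameter is.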

I'd thought I should get the db set up and working in Access before considering SQL Server, just because of the extra learning curve. I also use a lot of VBA for string parsing and environment setup (tables, linking, formatting), so I need to think about how all that will happen in SQL Server.

Thanks again for your help. I'm real close now to something workable.

Mark.
Sep 6 '10 #15
