Bytes IT Community

File Length - Compact

P: 44
Hi there
I have an MS Access 2007 database which frequently grows to over 2 GB. I want to be warned when the file length gets to, say, 1.5 GB. I have got so far with the following code - please bear in mind that I am an absolute beginner!!! - but don't know where to put this. Do I put it in a Form, or in every form, or on the Start Up form, or where?

Also, ideally, I would like this Module to also Compact the database rather than just reminding me to do so manually.

The code so far is:
Public Sub Module_Compact()

    Dim PrintMsgBox As Byte

    ' 1.5 GB is roughly 1,610,612,736 bytes (1.5 * 1024 ^ 3)
    If FileLen(CurrentDb.Name) > 1610612736 Then

        PrintMsgBox = MsgBox("File Size getting too big.  Please Compact.", vbOKOnly + vbExclamation, "WARNING - FILE SIZE TOO BIG")

    End If

End Sub
Many thanks to you all for any help you can give me.

Viv
Jul 31 '12 #1

✓ answered by dsatino
37 Replies


dsatino
100+
P: 393
The following will work, but I think you need to look into why your DB is constantly growing too large. 2 GB is not all that large by any means, but the fact that you can compact it and regain any significant space probably means you're using too many temporary tables and need a better strategy for what you're doing. It definitely sounds like you need to split your DB as well.


As for where to put it...I'd say at the end of any process that is causing your DB to grow substantially.

Public Sub Module_Compact()
    Dim x As Integer
    Dim tf As Boolean

    tf = False

    ' 2147483648 bytes is the 2 GB file-size limit for an Access database
    If FileLen(CurrentDb.Name) > (0.9 * 2147483648#) Then
        x = MsgBox("File Size getting too big. Database will now compact.", vbOKOnly, "WARNING - FILE SIZE TOO BIG")
        tf = True
    ElseIf FileLen(CurrentDb.Name) > (0.75 * 2147483648#) Then
        x = MsgBox("File Size getting too big. Would you like to compact it now?", vbYesNo, "WARNING - FILE SIZE TOO BIG")
        If x = vbYes Then tf = True    ' vbYes = 6
    End If

    ' Switch Compact on Close on (or off) and, if a compact is due, quit so it runs
    Application.SetOption "Auto Compact", tf
    If tf = True Then DoCmd.Quit
End Sub
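
As a usage sketch only, one natural place to call it is at the very end of the routine that runs the make-table queries; the query names below are made up for illustration.

Public Sub RunYarnCalculations()
    ' Run the heavy make-table queries with warnings suppressed.
    DoCmd.SetWarnings False
    DoCmd.OpenQuery "MTQ Knitting Instructions"   ' assumed make-table query name
    DoCmd.OpenQuery "MTQ Stitches"                ' assumed make-table query name
    DoCmd.OpenQuery "MTQ Projected Weights"       ' assumed make-table query name
    DoCmd.SetWarnings True

    Module_Compact   ' check the file size once the heavy work is done
End Sub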
Jul 31 '12 #2

P: 44
Hi dsatino

Thank you so much for such a speedy reply. And what's more, it has worked. Just one more thing... once the database has compacted, it closes (as instructed in your DoCmd.Quit). I would like the database to remain open - do I just leave out the DoCmd.Quit line?

And yes, I do agree that I have too many temporary tables - these are caused by Make Table Queries overwriting the originals. However, if I don't use these MTQs, and just use a series of queries, I get an error message "Query Too Complex", hence using tables.

Many thanks for your interest.

Viv Denham
Jul 31 '12 #3

dsatino
100+
P: 393
You can leave that line out, yes. But then it won't compact until you close the DB. So it really depends on whether you want it to happen directly after the dialogue box or not.

If you're going to keep the MTQ strategy then I strongly suggest that you put the temporary tables in another DB. Constantly compacting your DB could cause corruption and loss of your DB in its entirety.
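
If the temporary tables do go into a separate file, that file can then be compacted on its own without risking the main database. A rough sketch, assuming a side database called KnitsTemp.accdb in the same folder as the front end (both names are made up), and that nothing has it open at the time:

Public Sub CompactTempDatabase()
    ' Sketch only: compacts a closed side database that holds the
    ' temporary tables.  The path and file names are assumptions, and
    ' the side database must not be open anywhere when this runs.
    Dim strTemp As String
    Dim strNew As String

    strTemp = CurrentProject.Path & "\KnitsTemp.accdb"
    strNew = CurrentProject.Path & "\KnitsTemp_compacted.accdb"

    ' CompactDatabase writes a fresh copy, so the original is only
    ' removed once the new file has been written successfully.
    DBEngine.CompactDatabase strTemp, strNew
    Kill strTemp
    Name strNew As strTemp
End Sub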
Jul 31 '12 #4

P: 44
Hi dsatino

Once again, thanks for such a speedy reply. For my purposes, I really want the Compact to happen so that the file size does not get too big and Access crash altogether, resulting in corrupt data. So I guess having Access close is the better of the 2 evils.

And yes, again, I agree with you that I need to split the database. This has been a steep learning curve for me - I am making a database for all the Machine Knitting designs I knit, and am completely self-taught (I am a 65-year old silver surfer!!!). So I will do as you suggest, and split the database. Many, many thanks for your help.

Regards
Viv Denham
Jul 31 '12 #5

NeoPa
Expert Mod 15k+
P: 31,419
Anyone getting into Access at that stage of their lives deserves a whopping thumbs up from all of us here. Be prepared for things to get tricky and complicated Viv, but full marks for your attitude :-)
Jul 31 '12 #6

P: 44
Tee Hee!!! Thanks for that - made me laugh out loud. My husband says "Too Late! Things have been tricky for years with this database!" I originally had the brainwave in 2001 with the intention of automating my knitting calculations using Excel - but then my son suggested Access - and then the trouble began. I had dark hair then, now it's white!!!

Best regards to you all, and thanks for all your help.

Viv
Jul 31 '12 #7

P: 44
Hi - Just as I thought I'd got it sussed.....

The solution provided by DSATINO works perfectly. However, I have noticed that every time it requires the database to be Compacted, it then unchecks the checkbox for Compact On Close in the Access Options.

This means that when the database is NOT too big to require a Compact, it does not get Compacted on closing the database.

I've got mixed feelings about this. Compact on Close is something I've always used, but lately I've been seeing people saying not to use it at all. What are your opinions? Is there a way to retain the tick in the Compact on Close checkbox and still use dsatino's code, or should I not bother with the bloat?
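
One way to keep the tick, as a sketch only: drop the SetOption call and rely on Compact on Close staying switched on, just forcing Access to close when the size check trips.

Public Sub Module_Compact_KeepTick()
    ' Variant sketch: leaves the "Auto Compact" option alone so the
    ' Compact on Close tick is never cleared, and simply closes Access
    ' when the file gets near the 2 GB limit.
    If FileLen(CurrentDb.Name) > (0.9 * 2147483648#) Then
        MsgBox "File Size getting too big. Database will now close and compact.", _
               vbOKOnly + vbExclamation, "WARNING - FILE SIZE TOO BIG"
        DoCmd.Quit    ' Compact on Close (already ticked) does the rest
    End If
End Sub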

Many thanks for your ideas.

Viv
Jul 31 '12 #8

dsatino
100+
P: 393
The problem with compacting is that it's actually a misnomer: you're not actually compacting the DB, rather you are replacing it. What Access does is create a new DB, write all of the good data over to the new DB, delete the current DB, and then rename the new DB to that of the deleted DB.

If anything glitches during the write process, you could be left with a corrupt DB and no way to recover your data. Anybody that's worked with Access long enough has probably had this occur.

To be honest, I really think you need to re-evaluate the process that is causing the bloat.

If you want to .zip the DB and post it, I'll take a look at the process for you and point you in the right direction.
Jul 31 '12 #9

P: 44
Hi dsatino

Thank you so much for your extremely kind offer to look at my database. I have zipped it, but can't for the life of me see how to insert it into my reply to you. I'll keep trying to find out and will send it to you asap.

Thanks again
Viv
Jul 31 '12 #10

dsatino
100+
P: 393
Click the 'Advanced' button, then from that screen click the 'Manage Attachments' button and upload it through the popup screen.
Jul 31 '12 #11

zmbd
Expert Mod 5K+
P: 5,397
@VivDenham
Hello and Welcome to Bytes!

I must say it sounds like you’ve really started quite the ambitious project!

Before you get too much farther into your project may I respectfully offer a few links that might help you with your current and future projects:

With all of the MTQs being discussed, and the problematic query SQL, I suspect that you may need to review how the database is "normalized" and maybe the overall logic of the RDBMS design.

As you already have an RDBMS set up, let's start with "Normalization". For me, the following tutorial was a good refresher on the concept, and it has a really good explanation for those just starting out: Database Normalization and Table Structures.

IMHO: This is the number one problem in database design and is very much like playing Chess… very easy to learn the basic rules and a #### difficult game when played. Using a lot of MTQs is very symptomatic of either a normalization issue or some other issue… You really shouldn't be running into issues with 2 GB. As dsatino points out, 2 GB isn't a large storage space; however, 2 GB should hold something like 1 million pages of unformatted 10pitch/Dbls/1"margn typed text (say around 20 sets of the Encyclopedia Britannica at 25ish-vol/set, including some of the pictures). For the average user, that's a lot of space. My Mom (65yrs) has a very old PC with a 4 GB hard drive (I know) and it has over 1,000 pictures stored on the drive (thankfully, backed up to DVD now - love vacation I.T. work).

As for the other issues, the solution I have is this tutorial covering the basics of relational database management system (RDBMS) design. Now I understand that you've already dived in and gotten wet (YEA!!! Can't swim if you don't get in the pool!); however, I've found that all too often the textbooks do a very poor job of presenting the actual steps behind setting up a database from scratch. That said, I have found a really good step-by-step tutorial for basic design here: A Tutorial for Access
Although this may sound a tad late for your current project, this tutorial may yet give you some insight into any issues you may run into with it in the near term and should help you with any future projects (it may inspire you to start from scratch … no…. don't blame you! Several DBs I use, made by others, have way too much data to mess with and… well… I don't fix them).

Of particular note is the tables page within the tutorial - notice the information that is given (the table name, the field name, the field type, key or index, etc.) AND the format used to present that data. When you post back, this is the type of information we'll need to help you along in your project. You don't need a fancy grid table to do this... just a line-by-line layout.

Now for one more issue you will run into with any text from Microsoft… lookup fields in tables... I avoid lookup fields at the table level! And here is why: Are Lookup Fields in Tables Evil? ... despite the fact that these are supported in MS Access, they are not supported in other databases. More importantly, writing queries based on tables with look-up fields later on will be somewhat problematic. IMHO: The ONLY exception I've seen for this deals with a SharePoint integration.

There will be other people who argue that look-up fields are just fine at the table level... I side with the MVPs who do this stuff for a living… unless using SharePoint, don't use them at the table level… only use them in queries, forms, and reports.

Finally, so that you can really get the help you need, and the other experts, mod, and users can follow what you’re asking in the future, the following are really a must read:
How to ask good questions
Posting guidelines

My apologies for the long post… just a lot of ground to cover--- and I tend to ramble... a little...

Most Respectfully
-Z
What do you mean it's still 95 outside... it's 3am!... oh... time for a nap :)
Aug 1 '12 #12

P: 44
Hi Zmbd - thanks for your reply - gosh, 3.00 a.m. and you are still awake enough to think properly!!!

I have read the article on Normalisation, and I'm pretty confident that my tables comply with this. In the past, I've tried to get over using MTQs by using SQL Server, but had to give up with that when it came to the calculations.

By calculations I mean that I have an item of knitting, say a Jumper. Before I start knitting, I need to know that I have enough yarn to finish it.

Therefore, I calculate the Knitting Instructions (number of stitches and number of rows). This is the first layer of queries - Knitting Instructions.

Then I calculate the Stitches for every part of the Jumper (i.e. Back, Front, Sleeve, Collar) and for every section of that part (i.e. Welt, Body, Armhole, Neck). This is the 2nd layer of queries - Stitches - using calculations from the queries Knitting Instructions.

Then I calculate the Projected Weight of yarn for each of these sections of knitting within each part of the Jumper, by using the Stitches calculated above and multiplying by the Weight per Stitch from the Tension Square details. This is the 3rd layer of queries.

Then I add all these Projected Weights together to give the total weight for the Jumper - the 4th layer of queries - and here I get the message "Query Too Complex" (and sometimes even at an earlier stage than this).

Hence my son told me about using Make Tables to give Access less work to do running all these cascading queries.

And this works perfectly well, and gives me the results I want. Except that every time I run the MTQs I get bloat and need to Compact. The actual file size, compacted, is 375,000 KB. Without all the Make Tables, it would only be half this size, but without them, I wouldn't be able to calculate the Projected Weight of Yarn required.
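
One approach that is often suggested for this sort of bloat, sketched below with made-up table and query names: keep the working tables permanently and refill them with delete-plus-append queries instead of recreating them with make-table queries each run; the tables (and any indexes) stay put, and the file tends to grow rather less between compacts.

Public Sub RefreshStitchTotals()
    ' Sketch only: table and query names are hypothetical.  The working
    ' table [T_Work_Stitches] is assumed to already exist with the same
    ' columns that the query [Q_Stitches] produces.
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Empty the working table instead of dropping and recreating it...
    db.Execute "DELETE * FROM [T_Work_Stitches]", dbFailOnError

    ' ...then refill it from the second layer of calculations.
    db.Execute "INSERT INTO [T_Work_Stitches] SELECT * FROM [Q_Stitches]", dbFailOnError

    Set db = Nothing
End Sub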

This probably sounds all very trivial to you, someone who is used to dealing with companies trying to make a success of their businesses. But for someone from the Knitting World, it is vital to know you have enough yarn before you start on a project.

So, once again, thank you for all your (and everyone else's) efforts at Bytes. My son tells me that I am trying to get Access to do something it is not capable of, that I'm using a database to do something it's not meant to do. But I'm determined to get as far as I can with it.

Thanks again

Viv
Aug 1 '12 #13

zmbd
Expert Mod 5K+
P: 5,397
@VivDenham
giggle
You've tossed us a Rubik's Cube!
giggle - I suspect that the people here love puzzles

Makes perfect sense as for what you're trying to do. My Wife crochets and I used to know how to do really (really, really (x100)) simple knitted stuff (ok, so they were usually just pot holders, tried a sock once... three needles... Chemist, not Seamstress)

I'm using my Mom and Dad's PC; thus, no MS Office for the next week. However, I'd love to see your database and tinker with it once I'm home, and dsatino seems to be fairly sharp with this too. Given that D has already offered to take a look, you should upload the file when you get a chance. This really sounds like something MSA should be able to do... even within a single query... or better yet a form with a few dropdowns...

(giggle... bring on the puzzle!!!)

Did you figure out DSatino's instructions on how to upload your zipped DB?

-z
Ok... I really need to take a nap before the kids get up and the Wife wants me to help (3yr-Twins(S&D) and a 9yr DD that thinks she's 20... when did that start! :) )
Aug 1 '12 #14

P: 44
Hi -z

I did try to upload the database, but got an error message that a Security Token was missing. I'm awaiting an update from Bytes Support team. Then I'll try again. If any of you would have any time to look at the database, it will be interesting to know what you feel.

I know I've built in lots of "Catch-Outs" (e.g. I've put a "Locked" field on every form, linked to a Module for every form). I had found that I was clicking on input fields and inadvertently changing the data. Now I have every field locked until I click the "Locked" checkbox to unlock them. Long-winded I know, but 65-year-old idiot-proof!!!
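
For what it's worth, the same idea can be done with a couple of lines per form; a minimal sketch, assuming an unbound check box named chkLocked (the control name and the use of AllowEdits are illustrative only):

Private Sub Form_Current()
    ' Start every record locked; an unbound check box stays clickable
    ' even while AllowEdits is False.
    Me.chkLocked = True
    Me.AllowEdits = False
End Sub

Private Sub chkLocked_AfterUpdate()
    ' Ticked = locked, unticked = editable.
    Me.AllowEdits = Not Me.chkLocked
End Sub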

Anyway, as soon as I hear from Support, I'll try to upload the DB again. Watch this space!!!

Viv
Aug 1 '12 #15

P: 44
Hi there

It has occurred to me that the reason my upload is failing is that the size of the zipped file is 78,452 KB - is this too big? If so, would it help if I deleted the contents of the Make Tables and then uploaded it? Would you be able to then run the MTQs yourselves? (You need the contents of the Make Tables for the later queries to work as they pick up their data from the MTs).

Let me know if this would help and I will try again. Many thanks.

Viv
Aug 1 '12 #16

dsatino
100+
P: 393
Personally I have no idea what the size limit is, but deleting the contents of the temp tables will definitely reduce the size. I would go ahead and give it a shot and see what happens.
Aug 1 '12 #17

zmbd
Expert Mod 5K+
P: 5,397
Yes... Make a backup, clear the make tables, and then do a compact too before re-zipping the file.
-z
Aug 1 '12 #18

P: 44
Hi there

Deleted the Make Tables and this reduced the zipped file size from 78,452 KB to 26,748 KB. Tried to upload, but again got the Error Message that the Security Token is missing. Am still awaiting an update from Bytes Support team. I will keep trying.

Many thanks for your continued interest.

Viv
Aug 1 '12 #19

NeoPa
Expert Mod 15k+
P: 31,419
Viv,

See Attach Database (or other work) for the steps you should follow to attach your work. If you look at the page that opens when you click on {Manage Attachments} you'll see a list of maximum file sizes that pertain to various file types. Clearly, yours should be a ZIP file (if you've followed the instructions above correctly), so the maximum size to upload is 5 MB.

Good luck :-)
Aug 1 '12 #20

P: 44
Hi NeoPa

Thanks for that - I have been doing exactly what I should. However, the problem seems to be this "Security Tokens" issue. I have just tried uploading using IE9, Google Chrome and Mozilla Firefox browsers - and all give me the same message about Security Tokens being missing.

So it looks like I am in the hands of Bytes Support team for them to come up with a solution to this bug.

Many thanks for trying to help.

Viv
Aug 1 '12 #21

NeoPa
Expert Mod 15k+
P: 31,419
That may not be as easy as it sounds Viv, as it seems to be a problem specific to you as far as I can tell. I just tested a random database upload and had no trouble with my file which was <2MB in size. What size did your ZIP file turn out to be for interest?
Aug 2 '12 #22

P: 44
Hi - I think this problem may well have been me! I'm not at all up with the various meanings of KB, MB, and GB, so although I was aware of the 5 MB size limit, I thought that my zipped file of 26,748 KB was under the 5 MB (thinking that 5 MB was 50,000 KB). However, when my son returned from a trip away, he told me that I need to get the DB to under 5,000 KB.

So I will try deleting more stuff to see if I can get it below the 5 MB (roughly 5,000 KB) size limit.

As I said, I am a 65-year old beginner, and must have been having a senior moment. Apologies...

Viv
Aug 2 '12 #23

twinnyfo
Expert Mod 2.5K+
P: 3,210
Viv,

As I have been following this thread, I am getting "curiouser and curiouser" as to the size of your database... Do you have a lot of photos included in your DB? I have (what I consider) a huge database with thousands of records, about a hundred forms, 50-60 reports and probably twice as many queries, and it only comes out to 25 MB.

If you do have a lot of pics, we can also give you tips on how to remove the pics from the DB and refer to them in code. This would also save you some space.

I am also interested to see your queries, because I find it fascinating that you could actually calculate out the amount of yarn needed for a piece and estimate its weight.

I am intrigued!! Send us this DB, because we puzzlers would love to help you figure it out!!

Regards,
twinnyfo
Aug 2 '12 #24

P: 44
Hi there

Yippee! I've finally managed to upload my database for you to look at.

To reduce its size, I've removed all Products except Tops (button on the Welcome Screen). I've also reduced the number of Tops to 6. And I've removed all the Make Tables.

Therefore, to get the queries to work for Knitting Instructions onwards, you will need to go to Step 5 on the menu - Items Knitted - Add New. In that form, if you uncheck the Record Locked Y/N box, and then close the form, this will force the MTQs to run. That should then populate the queries for Step 6 onwards.

Hope it all works. Thank you so much for your time.

Viv
Attached Files
File Type: zip Database - Viv's Knits - All pictures removed.zip (2.05 MB, 96 views)
Aug 2 '12 #25

twinnyfo
Expert Mod 2.5K+
P: 3,210
Viv,

First, I am incredibly impressed that you have built this database. It is HUGE and very complex. I am VERY impressed!

This is not an immediate fix to your ballooning database size, but may help you understand why it's doing what it's doing.

From what I understand about the generation of tables in Access, the DB will set aside a certain amount of memory based on the data type--whether all of that memory is needed or not. So, every time you have a memo field, make a table, and establish records for that table, Access assumes you will be using 5,000 characters. I did not look at all your memo fields, but typically you should only use memo fields for consistently large blocks of text, for this very same reason. For many of your tables, which have "Notes", you may be able to change the data type of these fields to Text, which only assumes 255 characters. If this is too tight a constraint, you may have to stick with the Memo type.

Another issue, based on the same reasons, is the pictures. It appears that you have the pictures saved as OLE Objects within the tables. May I recommend that you establish a separate directory alongside your database just for your photos. Then, in the table, instead of an OLE Object field, have a small text field (it could be limited to 30 characters) with the file name of the photo. If ALL of your photos are in the same format (i.e. .jpg), you only need the file name and not the extension. Then, whenever you need to display any photos, instead of using the embedded OLE Object, you point to the file in your photos directory and display it on the screen. There are some very easy ways to do this, as sketched below.
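
A minimal sketch of the display side, assuming a text field named PhotoFile holding just the file name and an Image control named imgPhoto; the folder, field, and control names are all assumptions:

Private Sub Form_Current()
    ' Folder where the exported photos are assumed to live.
    Const PHOTO_DIR As String = "C:\VivsKnits\Photos\"
    Dim strPath As String

    strPath = PHOTO_DIR & Nz(Me.PhotoFile, "") & ".jpg"

    If Nz(Me.PhotoFile, "") <> "" And Dir(strPath) <> "" Then
        Me.imgPhoto.Picture = strPath    ' file found, show it
        Me.imgPhoto.Visible = True
    Else
        Me.imgPhoto.Visible = False      ' no photo recorded, or file missing
    End If
End Sub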

Although much less important, many of your Primary Keys are set as Long Integers. There is nothing wrong with this, and it is very common (and required) if you have many, MANY records, but since you only have a few records per table, these keys could be regular Integers, which could save some space because you have so many tables.

Because of these particular issues with your database, ballooning will continue (not sure why the Compact on Close would ever want to be turned OFF in your case). These minor (although very time-consuming) mods might assist with the constant ballooning.

I hope this info helps.....

Still marvelling at the complexity of this DB!!!!
Aug 2 '12 #26

dsatino
100+
P: 393
Oh...my...lord...........

First, pleeeeeeaaaase don't take any of this the wrong way.

Ok, now where to begin. I guess the first thing is to honestly congratulate you on getting this far. The fact that you've managed to make this thing work with the structure you have is based on how highly organized and specific it is. Given what you're doing, you definitely made the right choice in creating a database.

That said, and this is the part where I beg you not to take offense, I practically had a brain aneurysm when I saw your structure. What you need is to 'normalize' this database. I'm not going to explain what that means because you can just Google it, but normalization would cut down the number of tables you have by probably 80-90%. You definitely have about 60 more forms than you need and probably about 30 or so more queries and modules than you need.


Now, before I even look at automating your calculations, do you want to take a look at the concept of database normalization first? Between what I've seen from your organizational skills and the required skills needed for knitting products, I think you'd understand the normalization concept quite easily.
Aug 2 '12 #27

P: 44
Hi twinnyfo

Thanks for all that info - I'll take a look and see if I can use your tips to save space. I have, in the past, tried various methods to reduce the size of the pictures and photos. For the Photos, I resample to 5% or 10%, depending on which Form they appear in. For the Pictures, these are created in CorelDRAW. I experimented with inserting them as bitmaps, JPEGs, attachments etc., but the least space was taken by leaving them as CorelDRAW files.

I am still intending to split the database but I've got one more field that I want to rename first - when the repercussions of doing that are sorted, then I'll do the split.

I did a "test run" at splitting the DB, and the Back End ended up at 60,000 kb and the Front End was 307,000 kb. At least that way, if I do have to keep compacting, it will only be the temporary tables etc that may get corrupted, not the raw data.

Having learned the hard way, now every time I make even minor changes to any of the tables, queries, forms etc, I close the DB, save it with a Version Number and name, make a copy and then work on that new copy. Some days I end up with 20 new copies, but at least I don't lose my work!!!

Thanks, once again for looking at my database, and for your tips. Tomorrow I shall be back to the drawing board.

Viv
Aug 2 '12 #28

P: 44
Hi dsatino

Thanks for the comments. Could you give me an example of where the DB is not normalised, please?

Thanks
Viv
Aug 2 '12 #29

zmbd
Expert Mod 5K+
P: 5,397
@VivDenham
Viv, I've only read your last question here and haven't had a chance to review the database (can't wait to get home!)

Without getting too techy: http://www.databasedev.co.uk/flatfile-vs-rdbms.html

Also, revisit: Creating an Access Database - Step 1: Database Design this will help you with the flat (non-normalized) concept.
-z
OK… 10hour trip home with 2 toddlers and a 9yr DD… time for a nap!
Aug 3 '12 #30

P: 44
Hi

I've re-read both the above (and many other items found by Googling). However, as far as I understand it, the main requirements of Normalization are a) to only record 1 item in a table, b) not to duplicate data, and c) not to have redundant data not linked to that ID.

In my database, T-01 - Style - Tops - Arm records all the different types of Arms that a Jumper could have. This table contains the ID, Arm name, a picture, and any notes relating to that Arm. This is the only place this information is recorded. And none of the data is redundant, it is all pertinent to the ID.

Then, when actually deciding on the Style of Jumper I am going to knit, I go to the table T-01 - Style - Tops, and record an ID and Name for that Style, and select the Arm type that I want (i.e. whether it will be a long arm or short).

So as I understand it, the Arm is only recorded once, in the Arm table, and the Jumper Style is only recorded once, in the Style - Tops table. These 2 are then linked with Relationships.

Am I misunderstanding something?

Many thanks for trying to point me in the right direction.

Viv
Aug 3 '12 #31

dsatino
100+
P: 393
Very sorry for the delay - I've been hoping to put together an example for you, but I'm really busy, so for the sake of not ignoring you...

Your general understanding of keys and non-redundancy is basically correct, but your implementation is where the issue is.

Part of the normalization concept is to store similar data together in a single table (usually). In your db you have separate tables for cuffs, collars, necks, etc. These may be physically different, but from a data standpoint they are the same in that they are attributes of a top and thus should be stored together. I promise I'll show you an example when I get the time to build one out.
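
As a concrete illustration of "similar data stored together", a single components table might look something like the sketch below; the table and field names are invented for the example, not taken from any file in this thread:

Public Sub CreateComponentTableSketch()
    ' Rough sketch of one table holding cuffs, collars, necks, welts,
    ' etc. as rows instead of as separate tables.  All names here are
    ' illustrative only.
    Dim strSQL As String

    strSQL = "CREATE TABLE ComponentType (" & _
             "ComponentTypeID AUTOINCREMENT CONSTRAINT PK_ComponentType PRIMARY KEY, " & _
             "ComponentName TEXT(50), " & _
             "VariantName TEXT(50), " & _
             "Notes TEXT(255))"

    ' Rows would then read like: Cuff / Ribbed, Collar / Polo, Neck / V.
    CurrentDb.Execute strSQL, dbFailOnError
End Sub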
Aug 3 '12 #32

P: 44
Hi dsatino

Thanks for all your efforts - the cogs in my brain are whizzing round just now.

I can see the logic of having a table for attributes of Tops (i.e. component parts of a jumper) and that leads me to think that I have a stage of the logic (and hence, a table) missing - perhaps I should have a table called Tops - Component Parts, and these component parts would be things like Welt Type, Waist Type, Armhole Type, Neck Type, Cuff Type, Arm Type, Sleeve Head Type, etc.

But to my thinking, that just means 1 extra table. How would I then get all the different types of cuff, armholes, etc into that 1 table?

And further down the line, even if I could get all these components with their multitudes of cuffs types, armhole types, etc all into the one table, how would I then run all the cascading queries, without getting the error Query Too Complex. I would still have to use MTQs, wouldn't I? And that would still lead to the bloat?

I can't wait to see what you come up with. But once again, thanks for all the time and effort you are spending on me. It is so very much appreciated.

Viv
Aug 3 '12 #33

dsatino
100+
P: 393
Ok, here's the quick example. I'm not saying it's perfectly right since I don't know enough about your products, but I hope it gives you the general idea of what I'm conveying. There are only 5 tables, and although I only added a small bit of the data, you can add all of the products into them.

You'd need other tables for the stitching info, but the same concept would apply.

As for the calculations, I honestly haven't even looked at that piece yet, but I assure you that it can be done under this structure and without the MTQ's.
Attached Files
File Type: zip VivsExample.zip (24.0 KB, 44 views)
Aug 3 '12 #34

P: 44
Hi dsatino

Thanks for the example db - I shall spend the weekend going over it and sorting out how I can use the format myself.

Thanks again.
Viv
Aug 3 '12 #35

P: 44
Hi dsatino

Before I go too far down the Normalisation road, I wonder if you would have a quick look at what I've done so far. My brain works in such a way that I have to have tables, queries, etc listed in the correct order that the process takes, so I am just checking that I am understanding things before I do too much work which may be incorrect.

Many thanks, once again for all your help.

Viv
Aug 4 '12 #36

dsatino
100+
P: 393
The naming of the tables is mostly unimportant. It's definitely good practice to give them meaningful names, but not necessary. I think if the names you've given the tables help you then keep the convention you've been using, but I would definitely suggest a slight modification in which you get rid of the spaces and hyphens and use underscores instead. For example:

T-01 - 04 - Products - Product Components would be something like
T_01_04_Products_ProductComponents

This doesn't have anything to do with normalization, but it's just easier when referencing the tables in code.


As for the normalization part, I think you're getting the table structure part, but not quite the data redundancy part.

For example, in your "T-01 - 05 - Product Component Types" table, you are still using the Product and Component names in the component type name. There is no need for this because the tables are related through the foreign keys and hence you don't need to store the foreign table descriptions in this table.

The main reason you don't want to do it this way is that it defeats your ability to make changes in a single location. Let's say, for example, that you wanted to change your product "Scarves" to "Neckwear" instead. Under your current method, you would have to make this change in every instance in your database, which means numerous tables and could be thousands of rows. This basically defeats the purpose of normalization.

If you look back at the DB I posted you'll see a single query. This query essentially constructs all the names you have, at any level you need to see it. Now to see the concept I'm explaining, close the query, open the "ProductType" table, change "Tops" to "Shirts", close the table, and then re-run the query.


This is basically the concept of normalization in action.
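
The same idea expressed in code, with the descriptive names pulled in through joins at query time; apart from the ProductType table mentioned above, the table and field names here are assumptions:

Public Sub ListComponentNames()
    ' Sketch of a name-building query along the lines described above;
    ' only ProductType is a known table name, the rest are assumed.
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim strSQL As String

    Set db = CurrentDb
    strSQL = "SELECT pt.ProductTypeName, c.ComponentName " & _
             "FROM ProductType AS pt " & _
             "INNER JOIN Component AS c ON pt.ProductTypeID = c.ProductTypeID;"

    Set rs = db.OpenRecordset(strSQL)
    Do Until rs.EOF
        ' Renaming "Tops" to "Shirts" in ProductType changes every row printed here.
        Debug.Print rs!ProductTypeName & " - " & rs!ComponentName
        rs.MoveNext
    Loop
    rs.Close
    Set rs = Nothing
End Sub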
Aug 6 '12 #37

P: 44
Thanks, dsatino. I realise I'm doing things very much in a "Belt and Braces" fashion - just can't trust myself to rely on the Tables to work correctly for my purposes.

So, it's back to the drawing board. Thanks again.

Viv
Aug 6 '12 #38
