I'm currently working with a database structured as follows:
1. The source document is an Excel file containing thousands of rows (items) and multiple columns (relating to each individual item), holding both text and numbers.
2. The source document is generated daily, i.e. a new file is produced each day.
3. The requirement is to perform calculations on each row (item), including cumulative totals, averages, maximums, minimums, etc., over a number of previous days.
4. And to prepare summary report(s) based on 3. For example:
Item name; Date; Average for last 7 days; Maximum for last 7 days; Minimum for last 7 days; etc.
Please advise on the best way to achieve these tasks.
I'm trying to import the source documents into an Access table (all days in a single table), then automatically create sub-tables / split the table by row (item), since the calculations are per individual item.
The problems I'm experiencing are: 1. How to link all the tables on a single field (or multiple fields) in a single click.
2. How to generate a query based on 1 in a single click.
3. What is the best way to perform vertical calculations in Access, e.g. a total for the last 7 days?
Also, are there any other resources, such as Access add-ins or anything of a similar nature, that would make these tasks easier?
Please advise.
My advice to you would be, first and foremost, to determine your projected data size.
Thousands of records per day means it won't be long before you're considering upsizing to SQL Server. An Access 2000 database has size limitations that you must take into account. Don't be frightened off by this though... just bear it in mind. You'll be importing a fair few spreadsheets before you reach the 2GB limit.
If you have no control over the source document, the questions to ask yourself are these: where does the source document come from, is it consistent, and is the data clean? If you get the source document from multiple suppliers, do they all validate their data in the manner you would wish? Invariably not, so you have to design for the lowest common denominator, taking the poorest example of the bunch into account, and put routines in place to validate and clean any imported data to meet the integrity tests that ANY RDBMS platform demands.
In your situation you have a clean slate, so to speak, and the design for import and storage is a matter for you, as the designer, to be comfortable with. It won't be long before you're saying to yourself...
hmmm, if only I'd done it this way or that... we all do that, so nothing new there.
On the face of it, my advice for the import would be to store the production data (the actual successfully imported data you intend to work with) in a single table... most of anything you develop from there on, whether queries or forms, will then reference a single object rather than many.
It is common for a spreadsheet to contain DENORMALISED data... that is to say, a customer's name might appear in the spreadsheet several thousand times, as might a unique reference number (particularly where the data has come from other DBMS systems). The question for me would then be this: 'Once I have imported the data into my single table, are there any routines I could run that would speed things up?' Yes, there are... extract all those unique customer names and their unique references (if indeed they are unique) and place them in a table of their own; you can then search quickly on those, as opposed to, say, 4 million rows in the main table. And it goes on from there... extract all the product codes and put them in their own table, etc.
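To make that concrete, here is a minimal sketch of the extraction step. I'm using SQLite via Python as a runnable stand-in for Access/Jet SQL (in Access you'd use a make-table query, e.g. `SELECT DISTINCT ... INTO customers FROM main_import`); all table and column names here are hypothetical examples, not anything from your file.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical denormalised main table, as it might look after a raw import:
# the same customer name and reference repeated on many rows.
cur.execute("CREATE TABLE main_import (customer_name TEXT, customer_ref TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO main_import VALUES (?, ?, ?)",
    [("Acme", "C001", 10.0), ("Acme", "C001", 25.0), ("Bloggs", "C002", 5.0)],
)

# Extract the unique names/references into a small lookup table of their own,
# so later searches hit this table instead of millions of rows in the main one.
cur.execute(
    "CREATE TABLE customers AS "
    "SELECT DISTINCT customer_name, customer_ref FROM main_import"
)

print(cur.execute("SELECT * FROM customers ORDER BY customer_ref").fetchall())
# → [('Acme', 'C001'), ('Bloggs', 'C002')]
```

The same pattern applies to product codes or any other repeated attribute.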
What do we mean by extract? I mean you 'query' the main table, using parameters, to return the datasets you require. This can be one or many queries. If you design intelligently enough, you will realise you can reduce your physical QUERY objects to a minimum by making a query as generic as possible and amending it at runtime merely by passing parameters to it.
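A sketch of that 'one generic query, many uses' idea, again using SQLite through Python as a stand-in for an Access parameter query (table and column names are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE production (item TEXT, import_date TEXT, value REAL)")
cur.executemany("INSERT INTO production VALUES (?, ?, ?)", [
    ("Widget", "2024-01-01", 10.0),
    ("Widget", "2024-01-02", 12.0),
    ("Gadget", "2024-01-01", 7.0),
])

# One generic query object; the item and start date change at runtime via
# parameters, instead of saving a separate stored query per item.
GENERIC = (
    "SELECT import_date, value FROM production "
    "WHERE item = ? AND import_date >= ? ORDER BY import_date"
)

rows = cur.execute(GENERIC, ("Widget", "2024-01-01")).fetchall()
print(rows)
# → [('2024-01-01', 10.0), ('2024-01-02', 12.0)]
```

In Access the equivalent is a saved query with `[Enter Item]`-style parameters, or a query whose parameters you set from VBA before opening it.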
The single-table principle will also aid the VERTICAL calculations you speak of (MIN, MAX, SUM, AVG, to name but a few), because you will only ever have the one column to contend with, as opposed to several columns, which would be the case if you were to split your data over numerous tables, i.e. a table of imports each for Monday, Tuesday, Wednesday, Saturday... oops, let me scroll a minute... hmmm, is that this week's Saturday table or last week's, and why aren't all these table objects listed in alphabetical order by day, and so on. I'm sure you can visualise what I am talking about... if not, you will in time.
Conceptually, what you want is a common Access activity (importing spreadsheets and working with them); how you do it is a matter of technique.
Import into a single table, run consistency checks over the data, and if it's OK, append it to the main production table within your system. That is the method I employ: import... check... move... truncate (or DELETE FROM) the import table. (Access has the lovely feature of shifting any import failures to a table of import errors that it creates on the fly, which you will doubtless see at some stage in the database window if you use the .MDB format.)
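The import... check... move... truncate cycle can be sketched as four statements. Again this is SQLite via Python standing in for Access (where the import step would be `TransferSpreadsheet` and the rest append/delete queries); the validation rules and table names are hypothetical examples:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE staging (item TEXT, value REAL)")
cur.execute("CREATE TABLE production (item TEXT, value REAL)")
cur.execute("CREATE TABLE import_errors (item TEXT, value REAL)")

# 1. Import: the day's spreadsheet lands in the staging table (simulated here).
cur.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [("Widget", 10.0), ("", 3.0), ("Gadget", None)],
)

# 2. Check: siphon rows that fail validation into an errors table for review.
cur.execute(
    "INSERT INTO import_errors "
    "SELECT * FROM staging WHERE item = '' OR value IS NULL"
)

# 3. Move: append only the clean rows to the production table.
cur.execute(
    "INSERT INTO production "
    "SELECT * FROM staging WHERE item <> '' AND value IS NOT NULL"
)

# 4. Truncate: empty the staging table, ready for tomorrow's file.
cur.execute("DELETE FROM staging")

print(cur.execute("SELECT COUNT(*) FROM production").fetchone()[0])     # → 1
print(cur.execute("SELECT COUNT(*) FROM import_errors").fetchone()[0])  # → 2
```

The key point is that production only ever receives rows that have passed your checks, so the queries and reports built on it can trust the data.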
I won't go into the usual 'how do you deal with UPDATES' bit, because I'm assuming the original source document provided to you never changes at the provider's end and that you are the last in the chain, so to speak?
If I am reading you correctly, your db is a relatively new build? If that's the case, then each of the elements you referred to (imports, queries, calculations, reports) will come incrementally in the development process. Access will eat it. Overall, have a fun time building it.
On the question of add-ins and suchlike? There's tons of stuff out there. For instance, say you want to output a dataset to Excel for yourself, made up of aggregated query data defined at runtime... no problem, hit the button and out it goes. You just have to incorporate this myriad of functionality when the time comes and it is appropriate to do so.
I wish you luck with the build. The good folks on here will doubtless be pleased to help and advise along the way, provided of course sufficient information is to hand... remember, we sit here with a blindfold on, whereas you know your system and aims :)))
Jim