
Separating the database

I have a back-end database with just one main table in it (no forms/queries
etc).
The network PCs have the front-end database (with all the forms,
queries, macros etc). These front ends are linked to the table belonging to
the back-end database.
Records have been entered since January this year. However, as soon as the
size of the back-end database exceeded 1 gigabyte I got concerned that it
might encounter data corruption at some stage soon, as Access apparently
"maxes out" at 2 GB.
So I simply moved almost half the records into a separate new
database - calling it ArchiveJan-Apr 05.
People on the network will still need to retrieve records in this archived
database. This seems like it will be a fixed procedure from now on...
creating new databases every couple of months to relieve the size of the
main back-end database.
I don't want to be creating new front ends for all PCs every time I create a
new archived database.
I know I could simply (in each existing front end) create a linked table to
the table of a newly created archive database, but I don't know which macro
command to use.
But instead of opening the archived databases separately, ideally a user
would press a command button in their existing front-end database and it
would access the appropriate linked table to the respective archived
database table (if that makes sense).
Or is there a better solution?
Thanks for any assistance.
Nov 13 '05 #1
12 Replies


1 GB is massive for any database - but for one with just one table
it's a little crazy. You very much need to find out what's causing
this.
I presume you have been compacting the back end regularly?
How many records are in this table, and how many fields?

As for how to set up the front end for accessing archive databases, you can
include a function in the front end that re-points the linked
tables to tables in another database, allowing the user to access
old archives without a second, third, etc. front end. I can send you
the code if you need it - but you probably don't. I think a closer look
at what's causing the file size is more important.

Nov 13 '05 #2

How many users created 1 GB of data in six months?
Are you sure a 'compact' of the database isn't all that's required?

Nov 13 '05 #3

"Gary" <g4****@hotmail.com> wrote in message
news:42***********************@news.optusnet.com.au...
[original question quoted in full - snipped]


This does seem like a very rapid rate of growth. What happens to the size
of the back end if it is compacted? Are you storing anything unusual like
OLE objects, pictures, etc. within the database?
Anyway, if you really do have to keep archive copies, you could add a table
of archives giving, say, the file path, file description, date, etc., so the
front end has a startup form which shows a listbox or subform listing all
available back ends. The code to re-link back ends must have been posted many
times before, so all I will add is that in my experience deleting the links
and re-creating them from scratch is quicker than trying to refresh the
links.
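For what it's worth, a minimal sketch of the delete-and-recreate approach (the procedure name, and the assumption that every TableDef with a non-empty Connect string is a link to the same back end, are illustrative, not from any post above):

```vba
Sub RelinkByRecreate(ByVal strBackEnd As String)
    ' Hypothetical sketch: drop every linked table, then re-create the
    ' links against strBackEnd with DoCmd.TransferDatabase acLink.
    Dim db As DAO.Database
    Dim td As DAO.TableDef
    Dim colNames As New Collection
    Dim varName As Variant

    Set db = CurrentDb

    ' Collect the linked-table names first - we can't delete while iterating
    For Each td In db.TableDefs
        If Len(td.Connect) > 0 Then colNames.Add td.Name
    Next td

    For Each varName In colNames
        db.TableDefs.Delete CStr(varName)
        DoCmd.TransferDatabase acLink, "Microsoft Access", _
            strBackEnd, acTable, CStr(varName), CStr(varName)
    Next varName
End Sub
```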


Nov 13 '05 #4

Justin - not if you keep a connection to the target database open. When
you link your first table, open a recordset to it and keep it open until
the rest have been re-linked. I have a 200+ table setup that can be
entirely re-linked over a fairly slow network in under 16 secs doing
this... just a thought :)

Nov 13 '05 #5

"Justin Hoffman" <j@b.com> wrote in message
news:da**********@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com...

Anyway, if you really do have to archive copies then you could add a table
of archives giving, say, the file path, file description, date, etc so the
front end has a startup form which shows a listbox or subform showing all
available backends.


Another approach might be to just have linked tables from the archives to
the FE, with a button on the main form to select the datasource.

Regards,
Keith.
www.keithwilby.com
Nov 13 '05 #6

Thanks for all the advice.
I did neglect to point out that each record allows users to link 4 pictures
(OLE objects), where the pics are stored in a separate network folder.
There are approx 250 records. And yes, I do compact the databases every day.

"Gary" <g4****@hotmail.com> wrote in message
news:42***********************@news.optusnet.com.au...
[original question quoted in full - snipped]

Nov 13 '05 #7

"BillCo" <co**********@gmail.com> wrote in message
news:11**********************@g14g2000cwa.googlegroups.com...
Justin - not if you keep a connection to the target database open. When
you link your first table, open a recordset to it and keep it open until
the rest have been re-linked. I have a 200+ table setup that can be
entirely re-linked over a fairly slow network in under 16 secs doing
this... just a thought :)

Are you saying that for this particular database you have compared the two
techniques of refreshing the links versus re-creating the links? If so, how
long did the delete/re-create routine take?
I have found a number of things work better by deleting the links. Speed is
one, and I also have vague memories of ODBC links causing trouble where the
back-end table structure or field data types changed; certain issues were
not resolved until I deleted and re-created the links, although I don't
remember the precise details.
Nov 13 '05 #8

"Keith" <ke*********@baeAWAYWITHITsystems.com> wrote in message
news:42**********@glkas0286.greenlnk.net...
"Justin Hoffman" <j@b.com> wrote in message
news:da**********@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com...

Anyway, if you really do have to archive copies then you could add a
table of archives giving, say, the file path, file description, date, etc
so the front end has a startup form which shows a listbox or subform
showing all available backends.


Another approach might be to just have linked tables from the archives to
the FE, with a button on the main form to select the datasource.

Regards,
Keith.
www.keithwilby.com


It might well be easier, but the extra-table idea lets you present a
nicely ordered list with user-friendly names for the files, such as 'Archive:
Jan 2005 - Jun 2005' instead of '0105-0605.mdb'. You could see at a glance
what you were connected to, and you would offer only a limited set of
back ends, so your average user would not 'browse off into outer space'
looking for a back end to connect to. It would be more work to maintain, but
more user-friendly.
Perhaps a middle-ground approach would be to give the back ends descriptive
file names and have the listbox get a list of all files in a folder.
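A minimal sketch of that middle-ground idea, assuming a listbox whose Row Source Type is "Value List" (the folder path and control name are hypothetical):

```vba
Function ListArchiveFiles(ByVal strFolder As String) As String
    ' Build a semicolon-delimited value list of .mdb files in a folder,
    ' suitable for assigning to a listbox's RowSource.
    Dim strFile As String
    Dim strList As String
    strFile = Dir(strFolder & "\*.mdb")
    Do While Len(strFile) > 0
        strList = strList & strFile & ";"
        strFile = Dir()   ' next matching file
    Loop
    ListArchiveFiles = strList
End Function

' Usage, e.g. in the startup form's Load event:
' Me.lstArchives.RowSource = ListArchiveFiles("\\server\data\archives")
```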

Nov 13 '05 #9


"Justin Hoffman" <j@b.com> wrote in message news:da**********@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com...
I have found a number of things work better by deleting the links - speed is
one thing and I also have vague memories of odbc links causing trouble where
the back end table structure changes or field data types change and certain
issues were not resolved until I deleted and re-created the links - although
I don't remember precise details.


I agree with you, Justin.
I *always* delete and recreate the links, since I had very strange issues when I only refreshed them.
And this was not only with ODBC, but also with plain FE-BE links between mdb's.
The link did not reflect *all* of the changed table.
I also don't remember the precise details, but it was weird!
(Maybe we try to forget these nasty details ;-) )

From the moment I discovered this, I changed the relink code in *all* my databases.
I promised myself I will *always* delete and recreate links!

Arno R
Nov 13 '05 #11

Gary <g4****@hotmail.com> wrote:
Thanks for all the advice.
I did neglect to point out that each record allows users to link 4
pictures (OLE objects) where the pics are stored in a separate
network folder.
There are approx 250 records. And yes, I do compact the databases
every day.
But are you storing the pictures in the DB as OLE objects? All you need
to do is store the network path to the image file and then get Access to
display the image on a form/report. Storing a copy as an OLE object is a
massive waste. If you only store the text path, I suspect your DB will be
tiny.
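A minimal sketch of the path-only approach (the field name PhotoPath and control name imgPhoto are assumptions, not from any post above):

```vba
Private Sub Form_Current()
    ' Reload the image from the stored network path each time the
    ' record changes; no picture data is kept in the database itself.
    If Not IsNull(Me!PhotoPath) Then
        Me.imgPhoto.Picture = Me!PhotoPath
    Else
        Me.imgPhoto.Picture = "(none)"   ' clear when no path is stored
    End If
End Sub
```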
"Gary" <g4****@hotmail.com> wrote in message
news:42***********************@news.optusnet.com.au...
[original question quoted in full - snipped]


--
regards,

Bradley

A Christian Response
http://www.pastornet.net.au/response
Nov 13 '05 #12

> Are you saying for this particular database you have compared the two
> techniques of refreshing the links versus re-creating the links? If so, how
> long did the delete/re-create routine take?
> I have found a number of things work better by deleting the links - speed is
> one thing and I also have vague memories of odbc links causing trouble where
> the back end table structure changes or field data types change and certain
> issues were not resolved until I deleted and re-created the links - although
> I don't remember precise details.


Sorry for not getting back to you sooner. Yes - I have a back end with
around 200 tables which is archived regularly. Accessing the archive is
done in code in two ways from the front end:
Firstly, by deleting the linked tables in the front end and re-creating them
in code - this takes 10-15 mins to complete.
Secondly, by looping through the tables, re-pointing the links and
refreshing them. Done this way, it takes approx 15 seconds. The
key to making it fast is to open a recordset on the destination back end
before the routine and release it when all the tables are re-linked.
This code snippet should give you what you need:

Function SetLinks(ByVal strDestDatabase As String)

    Dim db As DAO.Database
    Dim td As DAO.TableDef
    Dim rstLinks As DAO.Recordset
    Set db = CurrentDb

    'set recordset to destination backend here (this holds the
    'connection open, which is what makes the re-link fast)

    'qryLinks is a query on MSysObjects returning all linked tables
    Set rstLinks = db.OpenRecordset("qryLinks")

    Do While Not rstLinks.EOF
        Set td = db.TableDefs(rstLinks!Name)
        'the table name part of the link is kept in SourceTableName,
        'so only the DATABASE part of the connect string changes
        td.Connect = ";DATABASE=" & strDestDatabase
        td.RefreshLink
        rstLinks.MoveNext
    Loop

    'release recordset to destination backend here
    rstLinks.Close: Set rstLinks = Nothing
    Set db = Nothing

End Function

I've never had any problems with this or seen any weird side effects.

hope this helps,
Bill

Nov 13 '05 #13

This discussion thread is closed. Replies have been disabled for this discussion.