Bytes | Software Development & Data Engineering Community
Database Design Recommendation Sought - Objective: Reduce Sluggishness / Improve Efficiency

MLH
I have an RDBMS app consisting of 3 primary mdb's...
1) a front-end with a few STATIC tables and the rest of the menagerie
of objects
2) a back-end with most of my DYNAMIC tables. I'll call it my main
backend.
3) another back-end = zip.mdb with about 43,000 zips/cities/states

The app has been operating stably (is that a word?) for some years.
No probs. The main backend is 63.3 megs now and contains tens of
thousands of letters - legal documents sent to attys, litigants, gvt,
judicial bodies - you name it. Pure text in a memo field - the entire
letter, from "Dear Mr. Jones" to "yours truly, John Doe."

Probably upwards of 40 megs of the main backend's size is due
to the contents of this single memo field. Is there any reason on this
earth for me to consider breaking out the contents of this memo field
by putting into a linked table in what would then be a THIRD backend
database? Any reason at all? Mind you, I would have no other
objectives other than efficiency in mind for doing this. Rarely (if
ever) is the field's contents searched in ANY fashion. If I want to
find a particular letter, there are PLENTY of other fields to query
it out precisely. And the last thing worthy of mention, I think, is
the fact that these letters are being maintained for legal
responsibility only. It is damned near certain that only 1 out of
a thousand will ever need to be reproduced again for any reason.
If I knew which one, I would HAPPILY delete the other 999 if you
know what I mean. The other fields in the correspondence table
are far more valuable, in my opinion, to my overall needs as they
serve as a record of my having sent the letter, to whom, when,
where, etc... and even are topic-categorized to give one a pretty
good idea as to what the letter was concerning without even reading
it. So, it's like separating useless (well, not quite useless) from more
useful things (i.e., the wheat from the chaff). What I'm really trying to
say here, dammit, is that if I could really afford a data loss
somewhere - it could be one of these confounded letters that
are eating up my disk.

Bottom line question for the gurus is: Do I really need to seriously
consider carving out the huge block of memo data and putting
it into its own database?

Thank-you for your patience in reading this rather lengthy post.
Nov 13 '05 #1

One recommendation that is usually posted for document-related DBs is to
store the document in the operating system's folders and just have a
pointer (Hyperlink) in the DB to the document.

If you needed to search the actual document for words or phrases you'd
have to write a VBA routine that would loop thru each document, open it
& search that document for the search string & repeat until found/not
found. Not very efficient, but you did say that you rarely, if ever, do
that.
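
A rough sketch of such a routine, assuming the letters are saved as plain-text files in one folder (the folder path, file pattern, and function name here are hypothetical, just to illustrate the loop):

```vba
' Hypothetical sketch: scan every .txt letter in a folder for a search
' string. Crude but workable for the rare lookup MLH describes.
Public Function FindLetter(strFolder As String, strSearch As String) As String
    Dim strFile As String
    Dim strText As String
    Dim intFile As Integer

    strFile = Dir(strFolder & "\*.txt")
    Do While Len(strFile) > 0
        intFile = FreeFile
        Open strFolder & "\" & strFile For Input As #intFile
        strText = Input(LOF(intFile), #intFile)   ' read whole file
        Close #intFile
        If InStr(1, strText, strSearch, vbTextCompare) > 0 Then
            FindLetter = strFolder & "\" & strFile   ' first match wins
            Exit Function
        End If
        strFile = Dir   ' next file in the folder
    Loop
    FindLetter = ""     ' not found
End Function
```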

If you have Windows XP/2000 I believe you could store the documents in
compressed folders. Don't know if you could search each document when
it is compressed, or the effects on processing speed.
--
MGFoster:::mgf00 <at> earthlink <decimal-point> net
Oakland, CA (USA)


MLH wrote:
<snip>

Nov 13 '05 #2
Bri
Whether the Memo field is in the Main BE or in its own BE is not likely
to make any difference in the amount of space it takes up. You don't say
what version of Access you are using, but if you are using AC2K or
higher, then you should make sure that the Memo field has Unicode
Compression set to Yes. From a sluggishness-reduction point of view,
make sure that no Forms/Reports/Comboboxes/Queries/Recordsets/etc.
use SELECT * FROM on the table with the Memo unless the Memo is
required. Only SELECT the fields that are necessary. If you have a Form
that displays this Memo field but you don't always need it to be
viewed, remove it and make a new Form containing only that field, which
you can call from the original Form with a Command Button. That way the
Memo is only retrieved on the rare occasion that it's needed.
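
For instance (all field and table names here are hypothetical - substitute the real correspondence table's names), the browsing form's record source would leave the Memo out, and only the pop-up form would fetch it:

```sql
-- Record source for the browsing form/listbox: no Memo retrieved.
SELECT LetterID, SentDate, Recipient, Topic
FROM tblCorrespondence
ORDER BY SentDate DESC;

-- Record source for the pop-up form that shows one letter on demand.
SELECT LetterID, LetterText
FROM tblCorrespondence
WHERE LetterID = [Forms]![frmLetters]![LetterID];
```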

HTH

--
Bri

Nov 13 '05 #3
MLH <CR**@NorthState.net> wrote in news:d7nk61h0lrl3dru7137jvqjf5naq23i2tt@
4ax.com:
<snip>


Keeping track of documents may be done very efficiently with Microsoft
Indexing Service and the ADO Microsoft Indexing Service Provider. With it
we keep track of what is stored in files, and a file is similar to a memo
field. The file can be a Word doc, or whatever.

From ADO help:
"The Microsoft OLE DB Provider for Microsoft Indexing Service provides
programmatic read-only access to file system and Web data indexed by
Microsoft Indexing Service. ADO applications can issue SQL queries to
retrieve content and file property information."

Although it may take a little time to familiarize oneself with this
technology, it is so fast and so powerful that it's likely to be worth it.

My experience with it has involved large handbooks of many documents.
A field nurse has been able to look up "blood near transfusion" to find all
the relevant documents dealing with these related topics.
A school principal has been able to look up "bomb threat".
If Indexing Service is running, these lookups are seemingly
instantaneous, even if there are many thousands of documents.

--
Lyle

"The aim of those who try to control thought is always the same. They find
one single explanation of the world, one system of thought and action that
will (they believe) cover everything; and then they try to impose that on
all thinking people."
- Gilbert Highet
Nov 13 '05 #4
Lyle Fairfield wrote:
<snip>


Lyle,
That's another good reason to use ADO ;-). After reading your post I
searched thru MS's website for info on Index Server & found the
following troubling quote:

"Do not run antivirus programs while Index Server is running, or when
you back up the Index Server catalog."
http://support.microsoft.com/kb/247093

I'd be loath to run my PC w/o my antivirus running ALL the time (since
I'm usually connected to the Internet - and just on general principles).

There also seems to be some confusion (at least to me) about what the
service is meant for. There are a lot of articles on using the IS w/
IIS and virtual directories, and only a few (that I could find) about
using the IS on local PCs. I'm assuming the IS can be used in both
situations. How did you use IS (and how did you get such cool jobs? :-))

Rgds,
--
MGFoster:::mgf00 <at> earthlink <decimal-point> net
Oakland, CA (USA)

Nov 13 '05 #5
MLH <CR**@NorthState.net> wrote in
news:d7********************************@4ax.com:
Bottom line question for the gurus is: Do I really need to seriously
consider carving out the huge block of memo data and putting
it into its own database?


Not so far as I can see, unless, as Lyle points out, searchability
is highly desired.

The big issue with putting the memos in a different back end is that
you then lose referential integrity enforcement, so you could end up
with orphaned memos.

In terms of bulk, I don't see memos as being that big -- they are
certainly tiny in comparison to OLE fields, for instance, embedding
a Word document in a Jet table. I did an app that used that method
and the average document size in the file system was about 25K, but
when stored in Jet, it took well over twice as much space.

Memo fields are going to be quite efficient.

I just don't see what you're worrying about, unless the issue really
is the indexing. And if that's the issue, I agree that, unless
you're going to build your own text indexing system within Access
(it can be done and it's very fast, but it's not trivial) you
probably should do it externally.

That ADO can interface with the indexing system is a nice benefit,
but if it all requires running IIS locally on the workstation,
that's disastrous. It really only makes sense if IIS is running on
the server and you can access its indexing services remotely from a
workstation, via ADO.

Otherwise, it's a nice check off on the list of features, but
basically useless in the real world.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 13 '05 #6
One can query Indexing Service directly in Windows XP Pro. (Home too, I
THINK, but am not sure.)

My Computer->Right Click->Manage
Services and Applications->Indexing Service
(If it's not running then it must be started of course and if it's never
run it will need considerable time to build a catalog)
Click on a catalog, the default catalogs I have seen have been named
"system"
Click on Query the Catalog
Type in your query

Help->Help Topic->Index->queries: gives one an extensive list of query
formats.

That's it.

One can get such exotic info as music genre and image dimensions, or one
can write an app using ADO to do this.

I gave up running anti-virus software some months ago after having done
so for several years. I don't remember any of the problems outlined in
that article having occurred.

So what are desktop search engines for? ... beats me.

My use of IS has been in ASP-ADO-Internet applications.
MGFoster wrote:
<snip>

--
Lyle
Nov 13 '05 #7
This tiny script finds files indexed in the catalog named "System" with
"lyle" near "MS-Access" in them. It runs instantaneously (seemingly).

Dim r As ADODB.Recordset
Set r = New ADODB.Recordset
With r
    .Open "SELECT Path, FileName, Size" & _
          " FROM SCOPE()" & _
          " WHERE CONTAINS ('Lyle near MS-Access')", _
          "Provider=MSIDXS; Data Source=System;", _
          adOpenKeyset, adLockBatchOptimistic
    While Not .EOF
        Debug.Print .Fields("Path"), .Fields("Filename"), .Fields("Size")
        .MoveNext
    Wend
End With

It prints out:
c:\documents and settings\lyle fairfield\application data\phoenix\profiles
\default\cqwkl68s.slt\bookmarks.html bookmarks.html 16245

c:\documents and settings\lyle fairfield\application data\msn6\userdata
\{32ba96f0-79bc-01c3-0200-0000de2a727f}\favorites.xml favorites.xml 18322

c:\windows\homesite55plus.log homesite55plus.log
812496

c:\documents and settings\lyle fairfield\application data\mozilla\profiles
\default\u9dwipze.slt\bookmarks.html bookmarks.html 29828

c:\program files\regcleaner\backups\2005.02.12.21.36.08.55.reg
2005.02.12.21.36.08.55.reg 130898

c:\program files\k-meleon\profiles\default\sd8r1yxg.slt\history.txt
history.txt 345181

c:\documents and settings\lyle fairfield\my documents\excel\ffdba
\ffdba_invoices.xls ffdba_invoices.xls 3337728

Of course one must have started and configured indexing service before this
works. Configuring is quite important.

--
Lyle

"The aim of those who try to control thought is always the same. They find
one single explanation of the world, one system of thought and action that
will (they believe) cover everything; and then they try to impose that on
all thinking people."
- Gilbert Highet
Nov 13 '05 #8
Per MLH:
a front-end with a few STATIC tables and the other menagerie of
objects


Any reason for keeping the static tables in the front end?

Somebody else should verify this before anybody takes it to heart, but my
recollection is that I had performance problems when running queries that linked
across DBs - and that was just in a 100% JET environment. Sounds like your
dynamic tables are in SQL Server or Oracle... or something like that.

As far as storing the documents in the OS's file system goes, I've done that on
every app I've written that stored documents, and it's worked - but there's a
significant 'gotcha' in the scheme: you'll never know if a document disappears
until you try to reference it and, by that time, odds are that either there'll
be no recovery source available or you'll be faced with a massive task of
searching through many backups to find the missing doc.
--
PeteCresswell
Nov 13 '05 #9
MLH
Thank-you all for your comments 'n suggestions.

I think there's no reason at all why I should change
anything I'm doing. Several years of stability to stand
on is reason enough, I think. I believe I will begin deleting
the documents from the memo field after a 5-yr holding
period. That's nicely achieved from within Access.
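
A minimal sketch of that cleanup as a Jet update query (table and field names are hypothetical - adjust to the real correspondence table):

```sql
-- Blank out the letter body (but keep the record and its metadata)
-- for letters sent more than 5 years ago.
UPDATE tblCorrespondence
SET LetterText = Null
WHERE SentDate < DateAdd("yyyy", -5, Date());
```

One caveat: deleting memo contents won't shrink the mdb by itself; compact the back end afterwards to actually reclaim the disk space.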

Yes, after reading all your comments, I think I'm good.
Again, thank you all for contributing.
Nov 13 '05 #10
"(Pete Cresswell)" <x@y.z.invalid> wrote
Somebody else should verify this
before anybody takes it to heart,
but my recollection is that I had
performance problems when running
queries that linked across DBs


If you join tables in the front end with tables in a back end or server, you
may retrieve more than you'd like across the network, prior to selection,
even if the criteria are indexed fields. On the other hand, judicious
placement of lookup tables (to be used as a Row Source in Combo or List Boxes)
that do not participate in such joins can boost performance. The key is
"judicious" placement of tables.

Larry Linson
Microsoft Access MVP

Nov 13 '05 #11
"Larry Linson" <bo*****@localhost.not> wrote in
news:7lEae.1513$Nc.1326@trnddc05:
"(Pete Cresswell)" <x@y.z.invalid> wrote
Somebody else should verify this
before anybody takes it to heart,
but my recollection is that I had
performance problems when running
queries that linked across DBs


If you join tables in the front end with tables in a back end or
server, you may retrieve more than you'd like across the network,
prior to selection, even if the criteria are indexed fields. On
the other hand, judicious placement of lookup tables (to be used as a
Row Source in Combo or List Boxes) that do not participate in such
joins can boost performance. The key is "judicious" placement of
tables.


Since those tables have to be periodically refreshed, do those go in
the front end, or in a temp db residing in the same location on the
workstation as the front end? I'd definitely want to architect as
the latter, simply because I wouldn't want to bloat the front end.

My goal with a front end is that it should never ever need to be
compacted (since it won't ever under any circumstances *be*
compacted).

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 13 '05 #12
MLH <CR**@NorthState.net> wrote:
I think there's no reason at all why I should change
anything I'm doing. Several years of stability to stand
on is reason enough I think.
Agreed.
I believe I will begin deleting
the documents from the memo field after 5-yr holding
period. That's nicely achieved from within Access.


Is that your decision or your users'/managers' decision? IOW, why 5 years? Why not 3
years or 10 years?

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
Nov 13 '05 #13
"David W. Fenton" <dX********@bway.net.invalid> wrote
Since those tables have to be
periodically refreshed, do those
go in the front end, or in a temp
db residing in the same location
on the workstation as the front
end? I'd definitely want to architect
as the latter, simply because I
wouldn't want to bloat the front end.


Typical tables I might include in a front-end: US States and Canadian
Provinces and their abbreviations; Organization Names and Codes for the
Company whose application it is -- things that are unlikely to be changed
any more frequently than new copies of the front end are released.

I agree that tables that require periodic refreshing might well be handled
in a separate database, though I admit to having had some directly in the
front-end on past projects. The one project that comes specifically to mind
did a check and auto-refresh from the linked server DB. On the other hand,
that front-end was being distributed roughly once a month most of the time I
worked on that application, and there was never a problem with the front-ends
growing out of control.

Larry Linson
Microsoft Access MVP

Nov 13 '05 #14

This thread has been closed and replies have been disabled. Please start a new discussion.
