
Speed problem

Ron
Hi All,

I've written a program that takes care of recordkeeping. I've got tables
for clients, orders, and details for those orders, plus all sorts of reports, etc.
On my main machine, everything runs okay. I've just imported data from an old
DOS program that did basically the same thing, and things still run fairly
okay on the main machine. I did notice a slight slowdown, but it's within the
realm of usability. However, over the network, some things are happening at a
snail's pace.

I've split it into FE/BE: only the data is in the BE, and all
forms/queries/reports are in the FE.

Some forms are still fast, even on a "terminal" (a second or third computer
hooked to the network). But some act like they're pulling data from Mars over
a slow link. And some reports...jeez, it takes a couple of minutes to run some
of them (granted, they do bring together a bunch of tables via queries). But
it's only on the terminals that things are much slower; on the main machine
(where the data lives), performance is acceptable.

Now, here's the kicker. This all worked lightning fast with NO data (or very
little: under a couple thousand orders and maybe 70k-100k details). But when I
added the data I already had in DOS (75,000 orders and 500,000 details), it
started slowing WAY down (again, mostly on the terminals).

Should ALL the fields in each table be indexed? If I've built queries that
include most of the fields in a table, do they all have to be indexed to
speed things up? I assumed only my linked fields (the ones used in
relationships) needed to be indexed, so that's the way I set things up. No?

How can I speed this puppy up without saying "sorry, too much data--we'll
have to delete some"?

TIA for any advice,
ron
Aug 18 '06 #1
3 Replies


In general, you only need to index the primary key and any fields that
get sorted or searched on.
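For what it's worth, here's a minimal sketch of adding an index on a field
that gets searched or sorted. The table name, field name, and back-end path
are placeholders for your own, and note that Jet won't run DDL against a
linked table, so open the back-end file directly:

' Sketch only: index a frequently searched field in the back end.
' "tblOrders", "OrderDate" and the path are placeholder names.
Sub AddSearchIndex()
    Dim db As DAO.Database
    Set db = DBEngine.OpenDatabase("\\server\share\MyData_be.mdb")
    db.Execute "CREATE INDEX idxOrderDate ON tblOrders (OrderDate)", dbFailOnError
    db.Close
End Sub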

Are you opening a form that is bound to all of your records? You should
generally create a lookup form and then open a second form based on a query
that returns only the records you want.

If you open a form that is linked to 75,000 records, it will not be
efficient.
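As a rough sketch of the lookup-form idea (the form, field, and control names
here are made up), the lookup form can pass a WhereCondition so only the
selected client's rows come across the network:

' Behind a button on the lookup form - names are placeholders,
' and ClientID is assumed to be a numeric field.
Private Sub cmdOpenOrders_Click()
    DoCmd.OpenForm "frmOrders", acNormal, , "ClientID = " & Me.cboClient
End Sub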

Aug 19 '06 #2

Ron
Thanks for the reply. Yeah, I know that about the 75k records (actually, my
BIG table is 540k). Any form I have based on the big tables looks at only one
client, their handful of invoices, etc., all via queries.

I've looked through the advice others have gotten about this and implemented
some changes, but it still seems rather slow and lumbering for the additional
users. The main machine, as I said, is pretty good. It's the networked users
that are suffering.

Maybe a client/server setup is the answer? I dunno.


Aug 19 '06 #3

1) No, you don't have to index all of the fields. Only the fields
used for selection or joins.

2) Have you been through Tony's optimisation page? (Keep a persistent
connection to the back end open, change the Subdatasheet Name property
from [Auto] to [None], etc.)

3a) Reports slow down because they are sorting and grouping.
3b) Forms slow down because of subforms and combo boxes.

4) That should probably fix it, but you could also consider increasing
the size of the Jet cache and extending the refresh interval (a rough
sketch of points 2 and 4 follows below).
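Not anything from Tony's page verbatim, just a sketch of what "keep the link
open" and the cache/refresh tweaks might look like; the back-end path and the
numbers are only examples:

' Module-level variable so the back-end connection stays alive for the session.
Private dbPersistent As DAO.Database

Sub OpenPersistentConnection()
    ' Holding a database open on the back end keeps the .ldb lock file open,
    ' so each query doesn't pay the connect/lock cost again.
    Set dbPersistent = DBEngine.OpenDatabase("\\server\share\MyData_be.mdb")
End Sub

Sub TuneJetSettings()
    DBEngine.SetOption dbMaxBufferSize, 4096           ' larger Jet cache, in KB
    Application.SetOption "Refresh Interval (sec)", 60 ' refresh bound data less often
End Sub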

(david)

Aug 20 '06 #4

This discussion thread is closed.