
Database slowdown due to query?

I have made some alterations to my company's database so that variable
rates are calculated based on postcode area.

This has recently been expanded from a query featuring 12 IIf
statements to one that now features 48. The IIf statements take the
following form:

1: IIf([Fees]=1500 And [Category]="A" And [PostcodeID]="1",100,0)

So I have 6 different levels of Fees, 2 categories and 4 postcode ID
areas: 6 x 2 x 4 = 48 IIf statements.

Is there a more efficient way to do this?

I had designed a table that included the fields Fee, Cat, PCArea and
Premium (with an autonumber PK, PremiumID). The calculation in the
query then read:

S3: IIf([Fees]=[fee] And [Category]=[Cat] And [PostcodeID]=[PCArea],[Premium],0)

But this didn't seem to make much of a difference.

I am working here with a database of approx 25,000 records and some
very old machines which run a front end and access the server for a
back end. As you can imagine, the system has slowed down considerably
since I introduced this. It's not so much moving from one form to
another that does the calculations; it's running the report from that
form, which displays all the calculations, that takes the time (approx
5 mins on some of the machines!). The system currently has approx 6
concurrent users (there are more in the office, but they are not all
accessing it all the time).

Any suggestions would be gratefully received. The current thinking here
is more up-to-date computers?

Cheers

Nov 13 '05 #1


It would help if you described your tables (columns, data types,
relationships with other tables), showed the query you are using, and
gave some sample data.

Making a table to hold the Fees, Categories, PostCodes & Premiums is a
good idea. You should be able to join it to your other table(s) to get
the correct Premiums; a rough sketch follows.
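
Assuming your main table is called Policies and your lookup table is
called Premiums, with the lookup fields named as in your S3 expression
(both table names are guesses here, so substitute your own), the
report's record source could be something like:

SELECT pol.*, prem.Premium
FROM Policies AS pol INNER JOIN Premiums AS prem
ON (pol.Fees = prem.Fee
AND pol.Category = prem.Cat
AND pol.PostcodeID = prem.PCArea);

That way each record picks up its Premium through a single join lookup
instead of evaluating 48 IIf expressions per row. Indexing Fee, Cat and
PCArea in the Premiums table (and the matching fields in the main
table) should help the join, too.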

--
MGFoster:::mgf00 <at> earthlink <decimal-point> net
Oakland, CA (USA)

Nov 13 '05 #2
