Thelma Lubkin <th****@alpha2.csd.uwm.edu> wrote in
news:e5**********@uwm.edu:
Bob Quintal <rq******@sympatico.ca> wrote:
: Thelma Lubkin <th****@alpha2.csd.uwm.edu> wrote in
: news:e5**********@uwm.edu:
:> I've inherited a SQL statement that looks approximately like
:> this:
:>
:> SELECT Sum(tblA.A9A) As [1990],
:> Sum(tblA.A9A)/Sum(tblA.I9N) As [1990P],
:> Sum(tblA.A0A) As [2000],
:> Sum(tblA.A0A)/Sum(tblA.I9N) As [2000P]
:> FROM [geoB]
:> INNER JOIN tblA
:> ON [geoB].STFID = tblA.STFID;
:> <snip>
:> I'm uncomfortable with that Sum(tblA.I9N)
:> which will be repeated for each year in the SQL.
:> <snip>
: You've also inherited a denormalized table, which is a bigger
: concern. I don't think that you'll get much improvement, as
: the Jet engine appears to calculate all columns in a single
: pass through the records.
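
That said, if the repeated Sum(tblA.I9N) still bothers you, one
workaround is to total everything once in a derived table and do the
divisions in an outer query. A rough, untested sketch (the Pop alias
is invented; the rest follows your statement):

```sql
SELECT T.[1990], T.[1990] / T.Pop AS [1990P],
       T.[2000], T.[2000] / T.Pop AS [2000P]
FROM (SELECT Sum(tblA.A9A) AS [1990],
             Sum(tblA.A0A) AS [2000],
             Sum(tblA.I9N) AS Pop
      FROM [geoB] INNER JOIN tblA
        ON [geoB].STFID = tblA.STFID) AS T;
```

Whether Jet actually evaluates Sum(tblA.I9N) once per reference or
once per pass I can't say for certain, so treat this as a readability
improvement more than a performance one.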
Thanks. The question was partly for my education,
because I am discussing a redesign of the tables with
the person maintaining the database. I'm not sure how
far the redesign can go, though, because some of the
tables used are from an outside source.
--thelma
: --
: Bob Quintal
: PA is y I've altered my email address.
I deal with a lot of data that I receive as Text files (some
.sdf, some delimited). I'll usually write a routine that imports
the data to a temp table, then runs queries to either update or
replace the data in properly normalized tables, as applicable.
If the amount of data is not large, one could simply use select
queries against the table to normalize the data for ease in
processing.
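
For what it's worth, the update-or-append step usually boils down to
two queries against the temp table. A rough sketch in Jet SQL (all
table and field names here are invented for illustration):

```sql
-- Update rows that already exist in the normalized table
UPDATE tblCustomer INNER JOIN tmpImport
   ON tblCustomer.CustID = tmpImport.CustID
SET tblCustomer.CustName = tmpImport.CustName;

-- Append rows that don't exist yet
INSERT INTO tblCustomer (CustID, CustName)
SELECT tmpImport.CustID, tmpImport.CustName
FROM tmpImport LEFT JOIN tblCustomer
  ON tmpImport.CustID = tblCustomer.CustID
WHERE tblCustomer.CustID Is Null;
```

Run the update first, then the append; the LEFT JOIN / Is Null test
picks up only the rows the update didn't match.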
--
Bob Quintal
PA is y I've altered my email address.