
# Adding a calculated field from a stored query to a new query

jrol
5 Nibble
I have a question about the best way to solve a problem in MS Access. I have a query with the following fields:

```text
place, date,     payment, frequency
aaaa,  1/1/2022, 1000,    monthly
aaaa,  2/2/2022, 1000,    monthly
bbbb,  1/1/2022, 12000,   annual
bbbb,  2/1/2022, 0,       annual
cccc,  1/1/2022, 4000,    quarterly
```

The payment is 0 for any month in which there is no payment.

I am trying to get to a result of what the approximate monthly payment is at any point in time. If the frequency is monthly, the payment is just the payment. However, if the frequency is annual, I would like to show its monthly rate (12,000/12 from the example above).

The result set for any given month would be:

```text
place, date,     approxMonthly
aaaa,  1/1/2022, 1000
bbbb,  1/1/2022, 1000
cccc,  1/1/2022, 1333.33
```

I could open a recordset on the old query, create a new query, and populate it with the new calculated values, but I'm wondering if I am missing something easier.
Apr 12 '22 #1
zmbd
5,501 Expert Mod 4TB

+ Personally, I would put your frequency in terms of months; it makes your calculations simpler.

+ When you get something like this:

```text
PK  Place  ActionDate  Payment     Frequency
3   bbbb   2022-01-01  $12,000.00  annual
4   bbbb   2022-02-01  $0.00       annual
```

is your monthly payment going to equal zero?

+ A simple query like this would return your example. Here I use Switch(), which isn't strictly SQL; some would have you use a nested IIf() instead. For a small database the impact of Switch() shouldn't matter. Better yet, express the frequency in terms of months and you need neither function. :)
```sql
SELECT tbl_payments.PK, tbl_payments.Place, tbl_payments.ActionDate
     , tbl_payments.Payment, tbl_payments.Frequency
     , Switch(Left([tbl_payments].[Frequency],1)="a", 12,
              Left([tbl_payments].[Frequency],1)="q", 3,
              Left([tbl_payments].[Frequency],1)="m", 1) AS weighting
     , [Payment]/[weighting] AS MntPym
FROM tbl_payments
WHERE tbl_payments.ActionDate = #1/1/2022#;
```
Returning something like:

```text
PK  Place  ActionDate  Payment     Frequency  weighting  MntPym
1   aaaa   2022-01-01  $1,000.00   monthly     1         $1,000.00
3   bbbb   2022-01-01  $12,000.00  annual     12         $1,000.00
5   cccc   2022-01-01  $4,000.00   quarterly   3         $1,333.33
```
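For anyone who wants to try the weighting idea outside Access, here is a minimal sketch in Python using SQLite (an assumption on my part, not part of the thread); Access's Switch() becomes a CASE expression, and the tbl_payments names mirror the hypothetical table above:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE tbl_payments
               (PK INTEGER, Place TEXT, ActionDate TEXT,
                Payment REAL, Frequency TEXT)""")
con.executemany("INSERT INTO tbl_payments VALUES (?,?,?,?,?)", [
    (1, "aaaa", "2022-01-01", 1000.0,  "monthly"),
    (3, "bbbb", "2022-01-01", 12000.0, "annual"),
    (5, "cccc", "2022-01-01", 4000.0,  "quarterly"),
])

# SQLite has no Switch(); a CASE expression plays the same role,
# keying off the first letter of the frequency text.
rows = con.execute("""
    SELECT Place, Payment,
           CASE substr(Frequency, 1, 1)
                WHEN 'a' THEN 12
                WHEN 'q' THEN 3
                WHEN 'm' THEN 1
           END AS weighting,
           Payment / (CASE substr(Frequency, 1, 1)
                           WHEN 'a' THEN 12
                           WHEN 'q' THEN 3
                           WHEN 'm' THEN 1
                      END) AS MntPym
    FROM tbl_payments
    WHERE ActionDate = '2022-01-01'
    ORDER BY Place
""").fetchall()
for r in rows:
    print(r)
```

As in the Access version, storing the frequency directly as a number of months would remove the CASE/Switch() entirely.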
Apr 13 '22 #2
NeoPa
32,547 Expert Mod 16PB
Hi Z.

Long Time No See. Welcome back :-)

Unfortunately, the OP has not thought this through very well. What happens when you have four months of payments on a quarterly basis? Two quarterly payments, but no explicit indicator of the number of months passed, and hence of the expected number of payments. This would go south fast, as the data hasn't been designed well enough to make this possible without extensive handling of the many different possible scenarios.

Easy enough for a human but using strict logic available to SQL, not so much.

I would suggest they throw this away and start again with better consideration of all the factors.

PS. If you don't like a post you've submitted you can simply edit it. Generally no need to delete & resubmit - though that works too of course.
Apr 13 '22 #3
jrol
5 Nibble
What I am doing is using the data to map out the cash flow in each month. A user enters the payment, the term, the start date, and the frequency, and it creates the stream of cash flows for that particular item. I then combine multiple items to get the total monthly cash flow.
Like the following:

```text
site  freq    jan22  feb22  mar22  apr22  may22  jun22  jul22  aug22  sep22  oct22  nov22  dec22  jan23
a     mon      1000   1000   1000   1000   1000   1000   1000   1000   1000   1000   1000   1000   1000
b     quart       0   2000      0      0   2000      0      0   2000      0      0   2000      0      0
c     annual      0      0      0   5000      0      0      0      0      0      0      0      0      0
```

I have a function for each frequency and run the terms through a Switch() when a new site is created (or edited). From this I can map out the monthly cash flow for hundreds of sites over multiple years based on the terms.
This all works perfectly for my needs.
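The per-frequency stream generation described above can be sketched roughly like this in Python (hypothetical names and a simplifying assumption that a payment falls due in the first month of each period; the poster's actual functions may differ):

```python
# Months between payments for each frequency label.
MONTHS_BETWEEN = {"monthly": 1, "quarterly": 3, "annual": 12}

def cash_flow_stream(payment, frequency, term_months):
    """Emit one value per month over the term: the payment in months
    where one falls due, 0 otherwise."""
    step = MONTHS_BETWEEN[frequency]
    return [payment if m % step == 0 else 0 for m in range(term_months)]

print(cash_flow_stream(2000, "quarterly", 6))  # [2000, 0, 0, 2000, 0, 0]
```

Summing the streams of many sites month by month then gives the combined monthly cash flow.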

I don't understand your response. I have a 0 payment in each month of a quarterly stream, reflecting the amount owed for that period.

What I was looking for was actually a less precise method of accounting for the monthly payments, by treating frequencies greater than a month as a monthly payment (5000/12 for each month for site c above).
Apr 13 '22 #4
jrol
5 Nibble
I could just run the actual terms through again, adjust the payment amount (divide by 12 for annual), and treat it as a monthly payment in a new query, reusing the current structure. The reason I asked the question is that I do not need the whole stream, so I didn't want to go through all the calculations. I just need one value for each site:

```text
a  1000
b  2000/3
c  5000/12
```

for a monthly approximation of the actual cash flow.
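The approximation above is just a division by a months-per-period factor; a minimal Python sketch (names are hypothetical, not from the thread):

```python
# Months covered by one payment at each frequency.
MONTHS_PER_PAYMENT = {"monthly": 1, "quarterly": 3, "annual": 12}

def approx_monthly(payment, frequency):
    """Spread one payment evenly over the months it covers."""
    return payment / MONTHS_PER_PAYMENT[frequency]

sites = [("a", 1000, "monthly"), ("b", 2000, "quarterly"), ("c", 5000, "annual")]
for site, payment, freq in sites:
    print(site, round(approx_monthly(payment, freq), 2))
```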

Go easy, this is the first time I have ever used a site like this, and I am self-taught, so I always assume there is an easier way to accomplish something.
Apr 13 '22 #5
jrol
5 Nibble
I see this. I could do that if the payment was the same all the time; I could do it with the weighting factor. My problem is that the payment changes through time (say, an increase of 5% every year on the anniversary of the start date for a multi-year stream). So I was counting on using the current payment (or the last payment, if it was 0 that month, as in the quarterly example). As you can see in my other post, I already used a Switch() to generate the cash flow streams to begin with. I just need to hold the "last payment that wasn't zero" and divide it by the weighting factor based on frequency (12 for annual, 3 for quarterly, and 1, or skip the division, for monthly).

So without re-running the cash flows through the Switch(), I was just looking for a way to hold the last payment, as each row may have a payment or a zero.
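The "hold the last payment that wasn't zero" idea can be sketched in Python as a single pass per site (hypothetical names; rows are assumed sorted by date within each place):

```python
def last_nonzero_payments(rows):
    """rows: iterable of (place, date, payment), date-ordered per place.
    Returns the latest non-zero payment (and its date) for each place,
    skipping the zero-payment filler months."""
    last = {}
    for place, date, payment in rows:
        if payment != 0:
            last[place] = (date, payment)
    return last

rows = [
    ("bbbb", "2022-01-01", 12000),
    ("bbbb", "2022-02-01", 0),
    ("cccc", "2022-01-01", 4000),
]
print(last_nonzero_payments(rows))
# {'bbbb': ('2022-01-01', 12000), 'cccc': ('2022-01-01', 4000)}
```

Dividing each held payment by the frequency weighting then yields the monthly approximation per site.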
Apr 13 '22 #6
zmbd
5,501 Expert Mod 4TB
@jrol:
NeoPa is one of my favorite mentors; I've learned a lot from him over the years. And never fear, we're all about the self-taught!

As NeoPa pointed out, and as I tried to illustrate with the 01/01 and 02/01 records, what happens in that case?

From your second post, trying to follow the table: if you have columns for each of your months, then the dataset is not really optimal for what you're trying to do. I'll PM you in a moment with a link to our insights covering normalization.

@NeoPa :
> It's been a rough couple of years.
> Something went wonky on my end: stupid updates. By the time I figured out what was happening I'd deleted the old message (actually it didn't appear to have posted, who knows)... and the rest is history. :)
> I have the impression that the OP's schema may not be 1NF, let alone 2NF or better; thus, the data set might be a bit cumbersome. I think we would need to see the business model to really decide how to re-organize the data, if that's even feasible.
Apr 14 '22 #7
jrol
5 Nibble
Thanks for the help. Now if I can figure out where a PM would show up....
Apr 14 '22 #8
zmbd
5,501 Expert Mod 4TB
Apr 14 '22 #9
NeoPa
32,547 Expert Mod 16PB
Hi Z (and Jrol).

I'm less bashful about splashing the Database Normalisation and Table Structures link everywhere I can :-D This is perfectly acceptable to post in any thread where it may be seen to pertain (Just in case anyone was unclear of the rules and/or playing it cautiously).

@Jrol.

Essentially, for every [Place] you're looking for the single record that has a value set for [Payment] and, within that filtered subset, has the maximum (latest) date. The only way I can think of to get that data from this design is to use a SubQuery filter. This is because every [Place] could be using a different date for its last payment.

Try:

```sql
SELECT [T].[Place], [T].[Date], [T].[Payment], [T].[Frequency]
FROM   [YourTable] AS [T]
WHERE  ([T].[Payment]<>0)
  AND  ([T].[Date]=(SELECT   TOP 1
                             [Date]
                    FROM     [YourTable]
                    WHERE    ([Place]=[T].[Place])
                    ORDER BY [Date] DESC))
```
This is air-SQL, as I haven't done something like this for a while, so try it out. If it fails, post back with the exact error message and a description of exactly what happened.
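As a rough check of the correlated-subquery idea, here is a sketch in SQLite via Python's sqlite3 (my own translation, not from the thread; Access's TOP 1 becomes LIMIT 1, and [Date] is renamed PayDate to sidestep the reserved word). Note that with this sample data, bbbb's latest row carries a zero payment, so bbbb drops out entirely; that is the caveat the follow-up query in the next post addresses:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE YourTable (Place TEXT, PayDate TEXT, Payment REAL)")
con.executemany("INSERT INTO YourTable VALUES (?,?,?)", [
    ("bbbb", "2022-01-01", 12000.0),
    ("bbbb", "2022-02-01", 0.0),
    ("cccc", "2022-01-01", 4000.0),
])

# Keep rows whose date equals the latest date recorded for that place,
# and whose payment is non-zero.
rows = con.execute("""
    SELECT T.Place, T.PayDate, T.Payment
    FROM YourTable AS T
    WHERE T.Payment <> 0
      AND T.PayDate = (SELECT PayDate FROM YourTable
                       WHERE Place = T.Place
                       ORDER BY PayDate DESC LIMIT 1)
""").fetchall()
print(rows)  # bbbb is missing: its latest row has Payment = 0
```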
Apr 15 '22 #10
NeoPa
32,547 Expert Mod 16PB
I played around some more and came up with a version that is more likely to work as required. The Sub-Query can alternatively be saved as a QueryDef and used by name.
```sql
SELECT [YourTable].[Place], [YourTable].[Date]
     , [YourTable].[Payment], [YourTable].[Frequency]
FROM   [YourTable]
       INNER JOIN
       (SELECT   [Place]
               , Max([Date]) AS [MaxDate]
        FROM     [YourTable]
        WHERE    ([Payment]<>0)
        GROUP BY [Place]) AS [sQ]
  ON   ([YourTable].[Place]=[sQ].[Place])
 AND   ([YourTable].[Date]=[sQ].[MaxDate])
```
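The Max()/GROUP BY join version can be exercised the same way in SQLite (again my own translation with the thread's placeholder names, and [Date] renamed PayDate). Because the subquery filters out zero payments before taking the maximum date, each place keeps its latest non-zero payment:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE YourTable (Place TEXT, PayDate TEXT, Payment REAL)")
con.executemany("INSERT INTO YourTable VALUES (?,?,?)", [
    ("aaaa", "2022-02-02", 1000.0),
    ("bbbb", "2022-01-01", 12000.0),
    ("bbbb", "2022-02-01", 0.0),
])

# Join each row to its place's latest non-zero-payment date.
rows = con.execute("""
    SELECT Y.Place, Y.PayDate, Y.Payment
    FROM YourTable AS Y
    INNER JOIN (SELECT Place, MAX(PayDate) AS MaxDate
                FROM YourTable
                WHERE Payment <> 0
                GROUP BY Place) AS sQ
      ON Y.Place = sQ.Place AND Y.PayDate = sQ.MaxDate
    ORDER BY Y.Place
""").fetchall()
print(rows)
# [('aaaa', '2022-02-02', 1000.0), ('bbbb', '2022-01-01', 12000.0)]
```

Unlike the correlated-subquery version, bbbb survives here even though its most recent row is a zero-payment month.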
Apr 15 '22 #11
zmbd
5,501 Expert Mod 4TB
The only issue I see with NeoPa's suggestion relates to my question in post #7:

+ If the dates are being stored across record fields, as implied in post #4:
[Field1 - Date 1][Field2 - Date 2][Field3 - Date 3][Field4 - Date 4]
then the only option I really know of to handle this in Access uses VBA. This happens a lot in worksheets; at least there you can use Max() on the row and one or more of the lookup functions in a dashboard (I have to do this a lot with legacy datasets in the lab where I work; Excel is not a database, and array formulas are a form of crazy uniquely unto themselves). But I digress...

+ Of course, if the dates are being stored in a table column, as inferred from the OP, which is what I based my first suggestion on, then NeoPa's suggestions are closer to what you are after. Combine that with my suggestion in post #2 to convert your text frequencies into months and you should be well on your way.
Apr 15 '22 #12