I have a stored procedure running in production to process historic data from a DB2 8.2 database.
It reads from a partitioned table and, based on some condition,
inserts the result into another table and updates the same row
(using an updatable cursor). Right now it takes around 2 minutes to
process 10,000 rows, so when I did a test run against 3 million
rows it took around 30 hours to complete.
This concerns me because in production it has to run against 50 million rows,
which would mean roughly 500 hours. I think
I'm missing something in my stored procedure that is causing the
performance impact.
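To make the numbers concrete, here is the arithmetic behind my estimates. Note that at the 2-minute-per-10,000-row rate, the 3-million-row run should have taken only about 10 hours, so throughput seems to get worse as the volume grows:

```python
# Back-of-the-envelope runtime estimates from the measured figures.
rows_per_batch = 10_000
minutes_per_batch = 2

rows_test = 3_000_000
# Linear projection from the 10k-row rate:
projected_test_hours = rows_test / rows_per_batch * minutes_per_batch / 60  # 10.0
observed_test_hours = 30  # what I actually measured, i.e. 3x worse than linear

rows_prod = 50_000_000
# Extrapolating linearly from the observed 3M-row run:
projected_prod_hours = observed_test_hours * rows_prod / rows_test  # 500.0
```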
I ran EXPLAIN against the four cursors I'm using, and all of them
use an index scan. Statistics are also up to date and 100% accurate.
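For reference, this is roughly how I checked the access plans. The statement text is abbreviated, and I have substituted a literal expression for the `IN_LAST_RUN_TIMESTAMP` host variable; the explain tables were created from EXPLAIN.DDL beforehand:

```sql
-- Capture the plan for one of the cursor queries
-- (explain tables must already exist, e.g. created from sqllib/misc/EXPLAIN.DDL)
EXPLAIN PLAN FOR
SELECT col1, col2, col3
FROM smt.order_detail
WHERE ORDER_CSI = 1
  AND DWH_EFCTV_TIMSTM >= CURRENT TIMESTAMP - 1 DAY
   OR DWH_UPD_TIMSTM = CURRENT TIMESTAMP - 1 DAY;

-- Then format the captured plan from the command line:
-- db2exfmt -d MYDB -1 -o plan.txt
```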
The cursors I'm using are as follows:
DECLARE cur1 CURSOR WITH HOLD FOR
  SELECT col1, col2, col3
  FROM smt.order_detail
  WHERE ORDER_CSI = 1
    AND DWH_EFCTV_TIMSTM >= IN_LAST_RUN_TIMESTAMP
     OR DWH_UPD_TIMSTM = IN_LAST_RUN_TIMESTAMP
  FOR UPDATE OF col4, col5, col6;

DECLARE cur2 CURSOR WITH HOLD FOR
  SELECT col1, col2, col3
  FROM smt.order_detail
  WHERE ORDER_CSI IN (7, 8, 9)
    AND DWH_EFCTV_TIMSTM >= IN_LAST_RUN_TIMESTAMP
     OR DWH_UPD_TIMSTM = IN_LAST_RUN_TIMESTAMP
  FOR UPDATE OF col4, col5, col6;

DECLARE cur3 CURSOR WITH HOLD FOR
  SELECT col1, col2, col3
  FROM smt.order_detail
  WHERE ORDER_CSI = 10
    AND DWH_EFCTV_TIMSTM >= IN_LAST_RUN_TIMESTAMP
     OR DWH_UPD_TIMSTM = IN_LAST_RUN_TIMESTAMP
  FOR UPDATE OF col4, col5, col6;

DECLARE cur4 CURSOR WITH HOLD FOR
  SELECT col1, col2, col3, col4, col5, col6, col7
  FROM smt.order_detail_lookup
  WHERE VIN_ATTRB_CD = IN_VIN_ENCODING_ATTRB_CD
  WITH UR;
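The processing loop itself is roughly of this shape. This is a simplified sketch: the IF condition, the target table `smt.order_result`, and the `v_*` variables are placeholders, not my real names:

```sql
DECLARE v_done INT DEFAULT 0;
DECLARE v_col1, v_col2, v_col3 VARCHAR(50);  -- placeholder types
DECLARE CONTINUE HANDLER FOR NOT FOUND SET v_done = 1;

OPEN cur1;
FETCH cur1 INTO v_col1, v_col2, v_col3;
WHILE v_done = 0 DO
  -- "some condition" on the fetched row decides what happens
  IF v_col2 IS NOT NULL THEN
    INSERT INTO smt.order_result (col1, col2, col3)  -- placeholder target table
    VALUES (v_col1, v_col2, v_col3);
    UPDATE smt.order_detail
       SET col4 = v_col1, col5 = v_col2, col6 = v_col3
     WHERE CURRENT OF cur1;                          -- positioned update
  END IF;
  FETCH cur1 INTO v_col1, v_col2, v_col3;
END WHILE;
CLOSE cur1;
```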
Can you give me any tips, please?