Until today I always thought that as long as you see a lot of logical reads
compared to physical reads, you're good. But it seems that isn't so.
Doesn't a logical read mean the page is being read from memory, with no
I/O involved? So why do logical reads translate into CPU consumption?
I ran into exactly this scenario last week when our applications were
running something: each time an application started, the CPU would drop
from 99% idle to 48% idle. I took a snapshot for each application, and
all I saw was logical reads. Here are some outputs from the
snapshots:
App 1
Buffer pool data logical reads = 5463749
Buffer pool data physical reads = 0
Buffer pool index logical reads = 1022588
Buffer pool index physical reads = 0
App 2
Buffer pool data logical reads = 25831618
Buffer pool data physical reads = 0
Buffer pool index logical reads = 4836091
Buffer pool index physical reads = 0
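Just to show my working: as I understand it, the buffer pool hit ratio from these counters is (logical reads - physical reads) / logical reads. Here is a quick sketch of that calculation (my own helper function, not any DB2 API), using the App 1 data counters:

```python
def hit_ratio_pct(logical_reads: int, physical_reads: int):
    """Buffer pool hit ratio: percentage of page requests served from memory."""
    if logical_reads == 0:
        return None  # no activity, ratio undefined
    return 100.0 * (logical_reads - physical_reads) / logical_reads

# App 1 data counters from the snapshot above
print(hit_ratio_pct(5463749, 0))  # 100.0 -- every read satisfied from the buffer pool
```

So by the usual rule of thumb these applications look perfect, which is exactly why the CPU behaviour surprised me.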
There are more applications with similar output. As you can see, there
are no physical reads and a lot of logical reads. And as I said, each
time an application was running, the CPU dropped to 48% idle. Does this
suggest the logical reads are taking all the CPU?
All applications were seen to be running the routine:
CALL SYSIBM.SQLCOLUMNS(?,?,?,?,?)
Has anyone heard of this routine? What does it do?