Bytes IT Community

ASP Speed Tricks

I just posted an article I wrote called ASP Speed Tricks. It covers
techniques to optimize output of database data in HTML, for
both simple tables and complex tables. More advanced ASP authors might
be interested in the complex table optimizations. Please check it out at:

http://www.somacon.com/aspdocs/

Hope you enjoy,
Shailesh

Please reply at http://www.somacon.com/contact.php

Jul 19 '05 #1
17 Replies


a) Use GetRows/GetString rather than looping through recordsets

b) In the "complex table" scenario, do a JOIN to retrieve all the data as a
single resultset. Combine with .GetRows.
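A minimal sketch of the GetRows approach (the connection string, table, and column names here are invented for illustration, not taken from the article):

```vbscript
' Fetch everything into an array, then close the connection before
' any HTML is written. Names below are placeholders.
Dim cn, rs, arrData, row
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=myServer;" & _
        "Initial Catalog=myDB;Integrated Security=SSPI"
Set rs = cn.Execute("SELECT OrderID, OrderDate, Total FROM Orders")
If Not rs.EOF Then arrData = rs.GetRows()   ' 2-D array: (field, row)
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing                 ' connection released early

If IsArray(arrData) Then
    For row = 0 To UBound(arrData, 2)
        Response.Write arrData(0, row) & vbTab & _
                       arrData(1, row) & vbTab & _
                       arrData(2, row) & vbCrLf
    Next
End If
```

Because GetRows copies the whole resultset into a variant array, the recordset and connection can be torn down before any output work begins.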

It's not just about speed - you need to look at scalability as well.
Eliminating the need for multiple recordset objects, and reducing the amount
of time that connections etc. are held open, increases scalability
in terms of the number of users your webapp can support.

Cheers
Ken

"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:IQ**********************@twister.columbus.rr.com...
<snip>
Jul 19 '05 #2

Good points. I did not cover GetRows or GetString, as they preclude the
possibility of using VBScript to perform additional formatting to cell
contents. I know that the formatting can be done in the SQL query
itself, but this is usually a little harder and less flexible.

I talked about using Join to retrieve all the 'complex' data as a single
recordset in the section "Methods to Avoid". The rationale for avoiding
this is that the primary table data will be repeated for each joined row
of the secondary table, which in most circumstances would cause excess
overhead. But I can imagine circumstances when this would work out faster.

I am reading up on disconnected recordsets. Otherwise, I don't have
much experience with scalability, as the app I've been working on is for
the Intranet. I should mention this in the article.

Thank you very much for your comments.

Shailesh

Ken Schaefer wrote:
<snip>


Jul 19 '05 #3


"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:_p**********************@twister.columbus.rr.com...
: Good points. I did not cover GetRows or GetString, as they preclude the
: possibility of using VBScript to perform additional formatting to cell
: contents. I know that the formatting can be done in the SQL query
: itself, but this is usually a little harder and less flexible.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

With .GetRows() you can do the formatting as you access the array elements.

.GetString() is faster, but not suitable for situations where flexible
formatting options are required.
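For instance, per-cell formatting while walking a GetRows array might look like this (a sketch; arrData and the column meanings are assumed for illustration, not from the thread):

```vbscript
' arrData is assumed to hold the result of rs.GetRows(), with
' column 0 = name, column 1 = a date, column 2 = a money amount.
Dim row
For row = 0 To UBound(arrData, 2)
    ' Apply VBScript formatting per cell as the array is traversed
    Response.Write Server.HTMLEncode(arrData(0, row)) & vbTab
    Response.Write FormatDateTime(arrData(1, row), vbShortDate) & vbTab
    Response.Write FormatCurrency(arrData(2, row)) & vbCrLf
Next
```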

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
: I talked about using Join to retrieve all the 'complex' data as a single
: recordset in the section "Methods to Avoid". The rationale for avoiding
: this is that the primary table data will be repeated for each joined row
: of the secondary table, which in most circumstances would cause excess
: overhead. But I can imagine circumstances when this would work out faster.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Personally I think the use of multiple recordsets would generally cause more
overhead. I suppose it depends how much info is repeated for each "parent"
record.

Cheers
Ken
Jul 19 '05 #4

I agree - for very large recordsets I suspect the time is spent allocating
memory. However, I don't think it's practical (or normal) to dump 20,000
records to a webpage :-)

For a webpage that has a lot of reports, you can do something like this:

' Open connection
' Open RS and GetRows (first resultset)
' Open RS and GetRows (second resultset)
' Open RS and GetRows (third resultset)
' Close Connection
'
' Lots of other stuff here
'
'
' Write out first resultset
' Write out second resultset
' Write out third resultset

That way, your connection is only open for a small part of the webpage,
which means (usually) that you can support more clients.
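Fleshed out, the outline above might look something like this (the connection string and query names are placeholders, not from the thread):

```vbscript
' Open the connection, pull all three resultsets into arrays,
' and close the connection before the rest of the page runs.
Dim cn, rs, arrReport1, arrReport2, arrReport3
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=myServer;" & _
        "Initial Catalog=myDB;Integrated Security=SSPI"

Set rs = cn.Execute("SELECT * FROM Report1")
If Not rs.EOF Then arrReport1 = rs.GetRows()
rs.Close

Set rs = cn.Execute("SELECT * FROM Report2")
If Not rs.EOF Then arrReport2 = rs.GetRows()
rs.Close

Set rs = cn.Execute("SELECT * FROM Report3")
If Not rs.EOF Then arrReport3 = rs.GetRows()
rs.Close : Set rs = Nothing

cn.Close : Set cn = Nothing   ' connection held only for the fetches

' ... lots of other page logic here ...

' Later, write out each resultset from its disconnected array.
```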

Cheers
Ken
"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:Jz**********************@twister.columbus.rr.com...
: I just found that GetString can be slower than looping through the
: recordset. I modified my example simpletable5.asp to check.
:
: This is the code snippet with GetString:
:
: Response.write objRS.GetString(, , vbTab, vbCrLf, "&nbsp;")
:
: For 20,000 records, the server time was about 2.13 seconds.
:
: This is the current simpletable5.asp algorithm, which produces
: the same output with looping through the Recordset:
:
: Do While Not objRS.EOF
: Response.write objField0
: Response.write vbTab
: Response.write objField1
: Response.write vbTab
: Response.write objField2
: Response.write vbTab
: Response.write objField3
: Response.write vbTab
: Response.write vbCrLf
: objRS.MoveNext
: Loop
:
: For 20,000 records, the server time was about 1.25 seconds.
:
: This is rather surprising to me! I think it is because GetString needs
: to do memory allocation to create the output string, whereas
: Response.write just dumps out the data as soon as it is generated.
:
: Shailesh
:
: Ken Schaefer wrote:
: <snip>
Jul 19 '05 #5

It would be nice to one day be able to dump 20,000 records to a webpage.
The long list would definitely be useful for something like an embedded
search box, as in http://www.somacon.com/phpref/ . Anyway, GetString is
only slower for the larger recordsets, as you pointed out. It is only
somewhat faster than the optimized looping for the smaller recordsets.
Here are the test results for the other cases:

Records   GetString Time   Looping Time   KBytes
  1000        0.08             0.10           25
  2000        0.13             0.17           52
 10000        0.50             0.67          262
 20000        2.13             1.25          536

Time is seconds to run the entire page, and is the approximate average
of 5-10 test runs. KBytes is the size of the output of the page.

Shailesh

Shailesh Humbad wrote:
<snip>
Jul 19 '05 #6

"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:IQ**********************@twister.columbus.rr.com...
<snip>

Another thing bad for scalability AND performance is this

' Connect to the data source
objCN.ConnectionString = "DSN=datasource"
objCN.Open

You should advise using OLE DB instead of ODBC.
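For comparison, a DSN-less OLE DB connection might look like this (SQLOLEDB is shown as a typical example; the server and database names are placeholders):

```vbscript
' DSN-less OLE DB connection - avoids the extra ODBC driver layer.
' Use the provider that matches your database; names are placeholders.
Dim objCN
Set objCN = CreateObject("ADODB.Connection")
objCN.ConnectionString = "Provider=SQLOLEDB;Data Source=myServer;" & _
                         "Initial Catalog=myDB;Integrated Security=SSPI"
objCN.Open
```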
success

--
compatible web farm Session replacement for Asp and Asp.Net
http://www.nieropwebconsult.nl/asp_session_manager.htm



Jul 19 '05 #7

"Bob Barrows" <re*******@yahoo.com> wrote in message
news:uf**************@TK2MSFTNGP12.phx.gbl...
> Shailesh Humbad wrote:
> > Set rs = server.createobject("adodb.recordset")
> > objCN.ProcedureOrQueryName param1,..,paramM, rs
> Bob Barrows

And I would recommend not using Server.CreateObject but CreateObject,
avoiding another extra code layer that is not necessary on W2k and higher.
(It was, however, on NT4)...
;-)


Jul 19 '05 #8

Egbert Nierop (MVP for IIS) wrote:
<snip>


Oops, I forgot :-)
I'm still on NT4 - it will be easier to remember when we finally move to
IIS5, I hope :-)

Thx,
Bob

Jul 19 '05 #9

In my Intranet medical office application, many of the pages allow the
user to select a patient, physician, or insurance, and these lists can
easily run to thousands of entries. An asynchronous server call has too
much latency for common, repeated use. On the other hand, I
agree that scroll bars and drop down lists are awful user interface
elements. I have written an extensive article on this, and have come up
with what I think is a better solution. It implements the Winamp "Jump
To File" feature in Javascript. If you're interested, read about it here:

http://somacon.com/jumpto/userinterface.php

And try it out and get the source code here:

http://www.somacon.com/phpref/

This interface requires that all the options are pre-loaded in the page
that is sent to the browser. Since people are moving to higher
bandwidth connections all the time, and the Intranet is already fast
enough, this is increasingly acceptable in practice.

Shailesh
Chris Barber wrote:
There are better ways to do this that don't entail sending 20,000 records to
the client browser. Some form of asynchronous call to the server could process
the search term and return a list of associated entries. After all, who is
ever going to scroll down a list that has 20,000 entries? I get exasperated
just scrolling through about 50-100.

Chris.

"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:pf**********************@twister.columbus.rr.com...
<snip>
Jul 19 '05 #10

"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:IQ**********************@twister.columbus.rr.com...
<snip>

Taking into account the excellent points made in this thread, here's
some code in defense of GetString

<%
Dim sql,cn,rs,timeGS,timeRS
Dim fld0,fld1,fld2,fld3
sql = "tblData"

timeGS=Timer
Response.Write "<pre>"
Set cn = CreateObject("ADODB.Connection")
cn.Open gConn '<-- Your OLEDB DSN-Less Connection String Here
Set rs = cn.Execute(sql,,&H2)
Do While Not rs.EOF
Response.Write rs.GetString(2,30,vbTab,vbCRLF)
Loop
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
Response.Write "</pre>"
timeGS = Timer - timeGS

timeRS = Timer
Set cn = CreateObject("ADODB.Connection")
cn.Open gConn '<-- Your OLEDB DSN-Less Connection String Here
Set rs = cn.Execute(sql,,&H2)
Set fld0 = rs(0)
Set fld1 = rs(1)
Set fld2 = rs(2)
Set fld3 = rs(3)
Response.Write "<pre>"
Do While Not rs.EOF
Response.Write rs(0)
Response.Write vbTab
Response.Write rs(1)
Response.Write vbTab
Response.Write rs(2)
Response.Write vbTab
Response.Write rs(3)
Response.Write vbTab
Response.Write vbCRLF
rs.MoveNext
Loop
Response.Write "</pre>"
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
timeRS = Timer - timeRS

Response.Clear
Response.Write "<br>GetString: " & timeGS
Response.Write "<br>Recordset: " & timeRS
%>

Results:
GetString: 2.324219
Recordset: 4.566406

The GetString method ran nearly twice as fast (1.965x) as the Recordset
method in the 20,000 record scenario.

Notes:
1. The key thing to note is that you can use the rows parameter of the
GetString method to "burst" output in chunks. A little empirical testing
showed that in this scenario, 30 rows at a time seemed optimal. I'm sure
there's some black magic formula that defines the ideal size to make
best use of the Response.Write optimizations available in VBScript, I
just have no idea what that formula is. If someone could provide insight
into this, I'd be much obliged. Sorry, no such optimization exists in
JScript (early bound vs. late bound).
2. I let the Recordset method go second to ensure it had the benefit of
connection pooling. But it's a small point and not worth quibbling over.
3. There are ways to apply additional formatting at the cell level when
using the GetString method. Namely, set the DataFormat property of
each field using the StdDataFormat object available via
MSSTDFMT.DLL. However, I'm of the opinion that formatting should be
handled via stylesheets, i.e., declare a class attribute for the
enclosing table/block tag and take care of formatting in an external
CSS.
4. Your complex table example was a little difficult to follow; however,
if I understand it correctly, you wanted to present both detailed data
and summary/aggregate data in the same listing. One solution that you
did not investigate would be to return a result set which is the union
of the detail data and the summary/aggregate data. This has the
following consequences. First, it eliminates the redundant data
associated with a joined result set. Second, it keeps the data
processing where it belongs, namely at the database. Third, it does
not change the nature of what's returned (a result set is a result set
is a result set). As such, all the same benefits/optimizations of the
GetString method continue to apply.
5. I would have liked to have used sp as connection, but in this case
that would actually represent an additional translation layer. Sorry
Bob.
6. Not that it should matter, since the recordset is being called in
exactly the same way in both scenarios, but for completeness here's what
I used to represent tblData. Note: Numbers is a table which contains
numbers from 1 to 8000 (inclusive). The database was SQL Server 2000.

CREATE TABLE [dbo].[Numbers]( [Number] [smallint] IDENTITY (1, 1) NOT
NULL)

CREATE VIEW dbo.tblData
AS
SELECT TOP 20000
N1.Number AS Field1,
N2.Number AS Field2,
CAST('FOO' AS TEXT(50)) AS Field3,
CAST('BAR' AS TEXT(50)) AS Field4
FROM
dbo.Numbers N1 CROSS JOIN
dbo.Numbers N2

HTH
-Chris


Jul 19 '05 #11

"Chris Hohmann" <hohmannATyahooDOTcom> wrote in message
news:%2***************@TK2MSFTNGP11.phx.gbl...
5. I would have liked to have used sp as connection, but in this case
that would actually represent an additional translation layer. Sorry
Bob.


That should read:
5. I would have liked to have used sp as connection method...
Jul 19 '05 #12

I had read somewhere that one of the techniques to optimize performance
is to reduce the number of response.write statements. If I recall
correctly, there is a performance hit (however minor) for each statement
which can add up if there's a large number of statements. As such, I
build my HTML and place it into a variable which at the end of the
script is written using a single response.write statement.

So the example provided would become...

strHTML = ""
Do While Not rs.EOF
strHTML = strHTML & rs(0)
strHTML = strHTML & vbTab
strHTML = strHTML & rs(1)
strHTML = strHTML & vbTab
strHTML = strHTML & rs(2)
strHTML = strHTML & vbTab
strHTML = strHTML & rs(3)
strHTML = strHTML & vbTab
strHTML = strHTML & vbCRLF
rs.MoveNext
Loop

response.write strHTML

Granted, I've never actually performed any tests to confirm the
optimization benefits of this; however, I'm comfortable assuming that
there is a benefit.

David H

Chris Hohmann wrote:
<snip>

Jul 19 '05 #13

Mr. Holley, you need to read my article. That is exactly the naive
method that is obscenely slow. I have to say it again, it is OBSCENELY
slow.

http://www.somacon.com/aspdocs/

David C Holley wrote:
I had read somewhere that one of the techniques to optimize performance
is to reduce the number of response.write statements. If I recall
correctly, there is a performance hit (however minor) for each statement
which can add up if there's a large number of statements. As such, I
build my HTML and place it into a variable which at the end of the
script is written using a single response.write statement.

So the example provided would become...

strHTML = ""
> Do While Not rs.EOF
> strHTML = strHTML & rs(0)
> strHTML = strHTML & vbTab
> strHTML = strHTML & rs(1)
> strHTML = strHTML & vbTab
> strHTML = strHTML & rs(2)
> strHTML = strHTML & vbTab
> strHTML = strHTML & rs(3)
> strHTML = strHTML & vbTab
> strHTML = strHTML & vbCRLF
> rs.MoveNext

response.write strHTML

Granted I've never actually performed any tests to confirm the
optimization benefits of this, however I'm comfortable in assuming that
there is a benefit.

David H

Chris Hohmann wrote:
"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:IQ**********************@twister.columbus.rr. com...
I just posted an article I wrote called ASP Speed Tricks. It covers
techniques to optimize output of database data in HTML, for
both simple tables and complex tables. More advanced ASP authors

might
be interested in the complex table optimizations. Please check it out

at:
http://www.somacon.com/aspdocs/

Hope you enjoy,
Shailesh

Please reply at http://www.somacon.com/contact.php


Taking into account the excellent points made in this thread, here's
some code in defense of GetString

<%
Dim sql,cn,rs,timeGS,timeRS
Dim fld0,fld1,fld2,fld3
sql = "tblData"

timeGS=Timer
Response.Write "<pre>"
Set cn = CreateObject("ADODB.Connection")
cn.Open gConn '<-- Your OLEDB DSN-Less Connection String Here
Set rs = cn.Execute(sql,,&H2)
Do While Not rs.EOF
Response.Write rs.GetString(2,30,vbTab,vbCRLF)
Loop
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
Response.Write "</pre>"
timeGS = Timer - timeGS

timeRS = Timer
Set cn = CreateObject("ADODB.Connection")
cn.Open gConn '<-- Your OLEDB DSN-Less Connection String Here
Set rs = cn.Execute(sql,,&H2)
Set fld0 = rs(0)
Set fld1 = rs(1)
Set fld2 = rs(2)
Set fld3 = rs(3)
Response.Write "<pre>"
Do While Not rs.EOF
Response.Write rs(0)
Response.Write vbTab
Response.Write rs(1)
Response.Write vbTab
Response.Write rs(2)
Response.Write vbTab
Response.Write rs(3)
Response.Write vbTab
Response.Write vbCRLF
rs.MoveNext
Loop
Response.Write "</pre>"
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
timeRS = Timer - timeRS

Response.Clear
Response.Write "<br>GetString: " & timeGS
Response.Write "<br>Recordset: " & timeRS
%>

Results:
GetString: 2.324219
Recordset: 4.566406

The GetString method ran nearly twice as fast (1.965x) as the Recordset
method in the 20,000 record scenario.

Notes:
1. The key thing to note is that you can use the rows parameter of the
GetString method to "burst" output in chunks. A little empirical testing
showed that in this scenario, 30 rows at a time seemed optimal. I'm sure
there's some black magic formula that defines the ideal size to make
best use of the Response.Write optimizations available in VBScript, I
just have no idea what that formula is. If someone could provide insight
into this, I'd be much obliged. Sorry, no such optimization exists in
JScript (early bound vs. late bound).
2. I let the Recordset method go second to insure it had the benefit of
connection pooling. But it's a small point and not worth quibbling over.
3. There are ways to apply additional formatting at the cell level when
using the GetString method. Namely, use set the DataFormat property of
each field using the StdDataFormat object available via the
MSSTDFMT.DLL. However, I'm of the opinion that formatting should be
handles via stylesheets, i.e.. declare a class attribute for the
embodying table/block tag and take care of formatting in an external
CSS.
4. Your complex table example was a little difficult to follow, however
if I understand it correctly you wanted to present both detailed data
and summary/aggregate data in the same listing. One solution that you
did not investigate would be to return a result set which is the union
of the detail data and the summary/aggregate data. This has the
following consequences. First it eliminates the redundant data
associated with an joined result set. Secondly, it keeps the data
processing where it belongs, namely at the database. Thirdly, is does
not change the nature of what's returned (a result set is a result set
is a result set). As such, all the same benefits/optimizations of the
GetString method continue to apply.
5. I would have liked to have used sp as connection, but in this case
that would actually represent an additional translation layer. Sorry
Bob.
6. Not that is should matter, since the recordset is being called in
exactly the same way in both scenarios, but for completeness here's what
I used to represent tblData. Note, Numbers is a table which contains
numbers from 1 to 8000 (inclusive). The database was SQL Server 2000.

CREATE TABLE [dbo].[Numbers]( [Number] [smallint] IDENTITY (1, 1) NOT
NULL)

CREATE VIEW dbo.tblData
AS
SELECT TOP 20000
N1.Number AS Field1,
N2.Number AS Field2,
CAST('FOO' AS VARCHAR(50)) AS Field3,
CAST('BAR' AS VARCHAR(50)) AS Field4
FROM
dbo.Numbers N1 CROSS JOIN
dbo.Numbers N2

HTH
-Chris


Jul 19 '05 #14

I have taken the suggestions from the thread into account and revised
the article. If you have the interest, please check it out at:

http://www.somacon.com/aspdocs/

Mr. Hohmann, I tested your GetString technique and found it was indeed
faster. It was not 2x like your example, because your example did not
use the field references.

I tested Mr. Nierop's suggestion of using OLE DB, and found it was
about 10% faster for simple table output.

I also tried Mr. Barrows' suggestion of using "Set .ActiveConnection =
objCN" instead of ".ActiveConnection = objCN". Running this in a loop,
the Set call actually turned out to be twice as slow, but both methods
were relatively fast. I think my testing procedure of looping may be
wrong. In any case, my impression from the book I'm reading is that the
property will use the existing connection if it is set to a Connection
object that is already opened. Do you have a test that shows a new
connection being made when Set is not used?

Trying Mr. Barrows' suggestion of using a stored procedure was also not
fruitful, and it actually ended up taking longer than using the
recordset to create the query each time. I created a query called
"dummyquery" in the database, and executed it using "objCN.dummyquery
objField1, objRS2". This ran in about 5 seconds, as opposed to 4
seconds for "objRS.Open "SELECT ...", and 1 second for the prepared
command "objRS.Open objCmd". Any ideas?

I revised the complex tables introduction, and it should be more
understandable now. The UNION clause cannot be used because it simply
concatenates the results of the sub-queries. I want to execute a
secondary query for each row of a primary query. UNION also requires
that the number of fields in each combined sub-query be the same.

Again, thanks for all your comments.

Shailesh

Reply at:
http://www.somacon.com/contact.php

Chris Hohmann wrote:
"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:IQ**********************@twister.columbus.rr. com...
I just posted an article I wrote called ASP Speed Tricks. It covers
techniques to optimize output of database data in HTML, for
both simple tables and complex tables. More advanced ASP authors


might
be interested in the complex table optimizations. Please check it out


at:
http://www.somacon.com/aspdocs/

Hope you enjoy,
Shailesh

Please reply at http://www.somacon.com/contact.php


Taking into account the excellent points made in this thread, here's
some code in defense of GetString

<%
Dim sql,cn,rs,timeGS,timeRS
Dim fld0,fld1,fld2,fld3
sql = "tblData"

timeGS=Timer
Response.Write "<pre>"
Set cn = CreateObject("ADODB.Connection")
cn.Open gConn '<-- Your OLEDB DSN-Less Connection String Here
Set rs = cn.Execute(sql,,&H2)
Do While Not rs.EOF
Response.Write rs.GetString(2,30,vbTab,vbCRLF)
Loop
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
Response.Write "</pre>"
timeGS = Timer - timeGS

timeRS = Timer
Set cn = CreateObject("ADODB.Connection")
cn.Open gConn '<-- Your OLEDB DSN-Less Connection String Here
Set rs = cn.Execute(sql,,&H2)
Set fld0 = rs(0)
Set fld1 = rs(1)
Set fld2 = rs(2)
Set fld3 = rs(3)
Response.Write "<pre>"
Do While Not rs.EOF
Response.Write rs(0)
Response.Write vbTab
Response.Write rs(1)
Response.Write vbTab
Response.Write rs(2)
Response.Write vbTab
Response.Write rs(3)
Response.Write vbTab
Response.Write vbCRLF
rs.MoveNext
Loop
Response.Write "</pre>"
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
timeRS = Timer - timeRS

Response.Clear
Response.Write "<br>GetString: " & timeGS
Response.Write "<br>Recordset: " & timeRS
%>

Results:
GetString: 2.324219
Recordset: 4.566406

The GetString method ran nearly twice as fast (1.965x) as the Recordset
method in the 20,000 record scenario.

Notes:
1. The key thing to note is that you can use the rows parameter of the
GetString method to "burst" output in chunks. A little empirical testing
showed that in this scenario, 30 rows at a time seemed optimal. I'm sure
there's some black magic formula that defines the ideal size to make
best use of the Response.Write optimizations available in VBScript; I
just have no idea what that formula is. If someone could provide insight
into this, I'd be much obliged. Sorry, no such optimization exists in
JScript (early bound vs. late bound).
2. I let the Recordset method go second to ensure it had the benefit of
connection pooling. But it's a small point and not worth quibbling over.
3. There are ways to apply additional formatting at the cell level when
using the GetString method. Namely, set the DataFormat property of
each field using the StdDataFormat object available via
MSSTDFMT.DLL. However, I'm of the opinion that formatting should be
handled via stylesheets, i.e., declare a class attribute for the
embodying table/block tag and take care of formatting in an external
CSS.
4. Your complex table example was a little difficult to follow; however,
if I understand it correctly, you wanted to present both detailed data
and summary/aggregate data in the same listing. One solution that you
did not investigate would be to return a result set which is the union
of the detail data and the summary/aggregate data. This has the
following consequences. First, it eliminates the redundant data
associated with a joined result set. Secondly, it keeps the data
processing where it belongs, namely at the database. Thirdly, it does
not change the nature of what's returned (a result set is a result set
is a result set). As such, all the same benefits/optimizations of the
GetString method continue to apply.
5. I would have liked to have called the stored procedure as a method of
the connection object, but in this case that would actually represent an
additional translation layer. Sorry Bob.
6. Not that it should matter, since the recordset is being called in
exactly the same way in both scenarios, but for completeness here's what
I used to represent tblData. Note, Numbers is a table which contains
numbers from 1 to 8000 (inclusive). The database was SQL Server 2000.

CREATE TABLE [dbo].[Numbers]( [Number] [smallint] IDENTITY (1, 1) NOT
NULL)

CREATE VIEW dbo.tblData
AS
SELECT TOP 20000
N1.Number AS Field1,
N2.Number AS Field2,
CAST('FOO' AS VARCHAR(50)) AS Field3,
CAST('BAR' AS VARCHAR(50)) AS Field4
FROM
dbo.Numbers N1 CROSS JOIN
dbo.Numbers N2

HTH
-Chris


Jul 19 '05 #15

Responses inline:
"Shailesh Humbad" <hu******@hotmail.com> wrote in message
news:8R**********************@twister.columbus.rr. com...
I have taken the suggestions from the thread into account and revised
the article. If you have the interest, please check it out at:

http://www.somacon.com/aspdocs/

Mr. Hohmann, I tested your GetString technique and found it was indeed
faster. It was not 2x like your example, because your example did not
use the field references.
Whoops. I went through the trouble of declaring the field references but
did not use them. I revised my code and got results similar to the ones
posted in your revised article. Although the wording of the results
should be as follows:

GetString is about 25-67% faster than "Optimized Looping".

OR

"Optimized Looping" is about 20-40% slower than GetString.
I revised the complex tables introduction, and it should be more
understandable now. The UNION clause cannot be used because it simply
concatenates the results of the sub-queries. I want to execute a
secondary query for each row of a primary query. UNION also requires
that the number of fields in each combined sub-query be the same.
Your ComplexTable3 example generates a summary line for every record in
tblData. I think what you intended was to print a summary line for each
distinct value in Field1. I've modified your code accordingly and also
presented the union query solution I discussed in my prior post. My
results show that the union query solution runs 14 times faster than the
ComplexTable3 solution.

<%
Function timeGR
Dim sql,cn,rs,arr,j,jMax
timeGR=Timer
Set cn = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.Recordset")
cn.Open gConn
cn.spDataComplex rs
arr = rs.GetRows
rs.Close : Set rs = Nothing
cn.Close : Set cn = Nothing
jMax = Ubound(arr,2)
Response.Write "<table class='datatable'>"
For j = 0 To jMax
If arr(1,j)<0 Then
' Summary row: Field2 carries the negated count, so print the header only
Response.Write "<tr><th colspan='4'>"
Response.Write Abs(arr(1,j))
Response.Write "</th></tr>"
Else
Response.write "<tr><td>"
Response.write arr(0,j)
Response.write "</td><td>"
Response.write arr(1,j)
Response.write "</td><td>"
Response.write arr(2,j)
Response.write "</td><td>"
Response.write arr(3,j)
Response.write "</td></tr>"
End If
Next
Response.Write "</table>"
timeGR = Timer - timeGR
End Function

Function timeRS
Dim StartTime, EndTime

StartTime = Timer

Dim objCN ' ADO Connection object
Dim objRS ' ADO Recordset object
Dim strsql ' SQL query string
Dim objRS2 ' Another ADO Recordset object
Dim objField0, objField1, objField2, objField3, Field1

' Create a connection object
Set objCN = Server.CreateObject("ADODB.Connection")

' Connect to the data source
objCN.ConnectionString = gConn
objCN.Open

' Create the a recordset object, and initialize it
Set objRS = Server.CreateObject("ADODB.RecordSet")
With objRS
.CursorType = adOpenForwardOnly
.LockType = adLockReadOnly
.CursorLocation = adUseServer
Set .ActiveConnection = objCN
End With
' Create the second recordset object, and initialize it
Set objRS2 = Server.CreateObject("ADODB.RecordSet")
With objRS2
.CursorType = adOpenForwardOnly
.LockType = adLockReadOnly
.CursorLocation = adUseServer
Set .ActiveConnection = objCN
End With

' Execute the SQL query
objRS.Open "SELECT Field1,Field2,Field3,Field4 FROM tblData"

' Set up field references after opening recordset
Set objField0 = objRS(0)
Set objField1 = objRS(1)
Set objField2 = objRS(2)
Set objField3 = objRS(3)
' Write out the results in a table
Response.write "<table class="&Chr(34)&"datatable"&Chr(34)&">"
Field1=-1
Do While Not objRS.EOF
' Use the pre-prepared Recordset object to issue the dummy query
If objRS(0) <> Field1 Then
Response.write "<tr><th colspan=4>"
objRS2.Open "SELECT COUNT(*) FROM tblData WHERE Field1="&objRS(0)
Response.write objRS2(0)
objRS2.Close
Response.write "</th></tr>"
Field1 = objRS(0)
End If
Response.write "<tr><td>"
Response.write objField0
Response.write "</td><td>"
Response.write objField1
Response.write "</td><td>"
Response.write objField2
Response.write "</td><td>"
Response.write objField3
Response.write "</td></tr>"
Response.write vbCrLf
objRS.MoveNext

Loop
Response.write "</table>"

objRS.Close
objCN.Close
Set objCN = Nothing
Set objRS = Nothing
Set objRS2 = Nothing

EndTime = Timer
timeRS = EndTime-StartTime
End Function

Dim o
o = "<br>Recordset: " & timeRS & "<br>GetRows: " & timeGR
Response.Clear
Response.Write o
%>

Here are the revised/new definitions of the database objects I used to
simulate your examples. As before, the database is Microsoft SQL Server
2000 and the Numbers table contains numbers from 1 to 8000 (inclusive):

CREATE TABLE [dbo].[Numbers]( [Number] [smallint] IDENTITY (1, 1) NOT
NULL)

CREATE VIEW dbo.tblData
AS
SELECT TOP 20000
N % 100 AS Field1,
N AS Field2,
CAST('FOO' AS VARCHAR(50)) AS Field3,
CAST('BAR' AS VARCHAR(50)) AS Field4
FROM
(
SELECT
N1.Number*200-1+N2.Number AS N
FROM
dbo.Numbers N1 CROSS JOIN
dbo.Numbers N2
WHERE
N1.Number <= 100 AND
N2.Number <=200
) AS Numbers20000
ORDER BY Field1

CREATE PROCEDURE spDataComplex
AS
SELECT
Field1,
Field2,
Field3,
Field4
FROM
tblData
UNION ALL
SELECT
Field1,
0 - Count(*),
'',
''
FROM
tblData
GROUP BY
Field1
ORDER BY
Field1, Field2
Again, thanks for all your comments.

You're welcome

-Chris

Jul 19 '05 #16

It's all about adjusting paradigms :) I've adjusted mine half a dozen
times already through the feedback from this thread.

I have been using Access, or more precisely Jet. But I just installed
MSDE (SQL Server 2000 Desktop Edition) and tested it too. In these
tests, I am modifying a few lines in my script called complextable3.asp.

Here are the test results for Jet and ODBC:

' TEST 1: Prepared Command Object pointing to stored Query using Jet
' objCmd.CommandText = "dummyquery"
' .CommandType = adCmdStoredProc, .Prepared = True
' **2.1 seconds**
objParam.Value = objRS(0)
objRS2.Open objCmd

TEST 2: Same as TEST 1, but using ODBC
** 1.1 seconds **

' TEST 3: Prepared Command Object as text query using Jet
' objCmd.CommandText = "SELECT COUNT(*) FROM tblData WHERE Field1=?",
' .CommandType = adCmdText, .Prepared = True
' **2.1 seconds**
objParam.Value = objRS(0)
objRS2.Open objCmd

TEST 4: Same as TEST 3, but using ODBC
** 1.1 seconds**

-----------------------------------------------------

' TEST 5: objCN.Execute of stored procedure named "dummyquery" using Jet
' 3.4 seconds
Set objRS2 = objCN.Execute("EXECUTE dummyquery "&objRS(0))

' TEST 6: Same as TEST 5, but using ODBC
** 5.5 seconds **

' TEST 7: stored procedure as function call using Jet
' ** 3.4 seconds **
objCN.dummyquery objRS2, objRS(0)

' TEST 8: Same as TEST 7, but using ODBC
' ** 5.5 seconds **

-----------------------------------------------------
SQL Server TEST 1:
' objCmd.CommandText = "dummyquery"
' .CommandType = adCmdStoredProc, .Prepared = True
' **5 to 25 seconds**
objParam.Value = objRS(0)
objRS2.Open objCmd

SQL Server TEST 2:
' objCmd.CommandText = "SELECT COUNT(*) FROM tblData WHERE Field1=?",
' .CommandType = adCmdText, .Prepared = True
' **5 to 25 seconds**
objParam.Value = objRS(0)
objRS2.Open objCmd

SQL Server TEST 3:
' **5.4 seconds**
Set objRS2 = objCN.Execute("EXECUTE dummyquery "&objRS(0))

SQL Server TEST 4:
' **5.4 seconds**
objCN.dummyquery objRS2, objRS(0)
Surprising results? First, with Jet or ODBC, the results show that
using a prepared command object is the fastest method, with either the
Native Jet OLE DB provider or the ODBC Provider. The ODBC provider is
*twice as fast* as the Jet provider in this case, clocking in at 1.1
seconds! Furthermore, there is no difference between defining the query
in ASP versus saving the query in the database.

Using the connection object to execute the identical stored query was
*slower* using either of two techniques, and either Provider. In this
case, there was also no difference between defining the query in ASP
versus executing the query saved in the database.

All of the SQL Server tests were slower in comparison to Jet or ODBC.
Worse, using a command object defined in ASP was unreliable, taking
anywhere from 5-25 seconds, and even timing out sometimes. This is for
either an ASP-defined query or a Stored Procedure. On the other hand,
using the connection object to execute either a stored procedure or an
ASP-defined query was reliably about 5 seconds. However, this is slower
than Jet or ODBC.

I'm not sure that these results mean anything, because I have tested
just a single scenario. But the results do debunk the conventional
wisdom that SQL Server > Jet > ODBC always. In fact, ODBC can sometimes
be faster than both Jet and SQL Server. It's very likely that this is a
fluke of the test, so take the results with a grain of salt. I really
wish I could explain *why* the results are the way they are, but it
seems like that would be impossible without looking at some source code.

Shailesh

Bob Barrows wrote:
Shailesh Humbad wrote:
I also tried Mr. Barrows' suggestion of using "Set .ActiveConnection =
objCN" instead of ".ActiveConnection = objCN". Running this in a
loop, the Set call actually turned out to be twice as slow, but both
methods were relatively fast. I think my testing procedure of
looping may be wrong. In any case, my impression from the book I'm
reading is that the property will use the existing connection if it
is set to a Connection object that is already opened. Do you have a
test that shows a new connection being made when Set is not used?

I just ran some tests myself and it seems as if I have been the victim of
either obsolete or originally incorrect information.

This code:
cn.Open
rs.ActiveConnection=cn
cn.Close
rs.Open sSQL,,,,adcmdtext

results in this error: The connection cannot be used to perform this
operation. It is either closed or invalid in this context.

So obviously a new connection is not being spawned. Sorry about that. I must
adjust some of my paradigms.

I don't know why the Set keyword slows things down. It shouldn't ...
Trying Mr. Barrows' suggestion of using a stored procedure was also
not fruitful, and it actually ended up taking longer than using the
recordset to create the query each time. I created a query called
"dummyquery" in the database, and executed it using "objCN.dummyquery
objField1, objRS2". This ran in about 5 seconds, as opposed to 4
seconds for "objRS.Open "SELECT ...", and 1 second for the prepared
command "objRS.Open objCmd". Any ideas?


This is the opposite of the results I got from my tests. Are you using
Access or SQL Server?

Bob Barrows


Jul 19 '05 #17

I figured it out. SQL Server was slower on the sub query because it was
being performed using the same connection object as the main query. By
creating a second connection object, and using a prepared query on the
second connection object, the time to run the script goes down to 1.2
seconds, which is comparable to what was being achieved with ODBC.
Maybe there's an unwritten rule for SQL Server optimization under ADO,
that each opened recordset should have its own connection for maximum
speed?

Here's the code containing all the test cases. The optimized case is the
one left uncommented. For this particular sub query, which returns only a
single value, it might run even faster by using an output return
parameter on the sub query, rather than creating a full record set.
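A quick sketch of that output-parameter idea, reusing objCN2 and objRS from the script below. The procedure name dummyquery2 and its signature are hypothetical, and the ADO constants are assumed to be defined (e.g. via adovbs.inc):

```vbscript
' Hypothetical sketch: return the count through an output parameter, so the
' sub query never builds a recordset. Assumes a procedure along the lines of:
'   CREATE PROCEDURE dummyquery2 @f1 int, @cnt int OUTPUT
'   AS SELECT @cnt = COUNT(*) FROM tblData WHERE Field1 = @f1
Dim objCmd2
Set objCmd2 = Server.CreateObject("ADODB.Command")
With objCmd2
    .ActiveConnection = objCN2
    .CommandType = adCmdStoredProc
    .CommandText = "dummyquery2"
    .Parameters.Append .CreateParameter("f1", adInteger, adParamInput, 4)
    .Parameters.Append .CreateParameter("cnt", adInteger, adParamOutput, 4)
End With

' Inside the main loop, in place of "objRS2.Open objCmd":
objCmd2("f1") = objRS(0)
objCmd2.Execute , , adExecuteNoRecords   ' adExecuteNoRecords = &H80
Response.Write objCmd2("cnt")
```

Since no recordset is created or closed per row, this should shave off whatever per-row cost objRS2 carries; I have not benchmarked it.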

<%@ Language=VBScript %>
<% Option Explicit %>
<!-- #include file="adovbs.inc" -->
<%' adovbs.inc supplies the ADO constants (adOpenForwardOnly, adCmdText, etc.)
  ' used below; copy it into this directory or adjust the path. %>
<html>
<head>
<style type="text/css">
/* set the table layout to fixed width */
table.datatable { table-layout: fixed; width: 400px; }
table.datatable td { width: 100px; }
</style>
</head>
<body>
<%
Dim StartTime, EndTime

StartTime = Timer

Dim objCN ' ADO Connection object
Dim objRS ' ADO Recordset object
Dim strsql ' SQL query string
Dim objRS2 ' Another ADO Recordset object
Dim objCmd ' ADO Command object
Dim objParam ' ADO Parameter object
Dim objField0, objField1, objField2, objField3
Dim objCN2

' Create a connection object
Set objCN = Server.CreateObject("ADODB.Connection")
Set objCN2 = Server.CreateObject("ADODB.Connection")

' Connect to the data source
'objCN.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;"&_
' "Data Source=C:\Inetpub\wwwroot\data\test.mdb;"&_
' "Persist Security Info=False"
'objCN.ConnectionString = "DSN=datasource"
'objCN.ConnectionString = "DSN=datasourcesql;User ID=;Password=;"
objCN.ConnectionString = "Provider=SQLOLEDB.1;"&_
"User ID=;Data Source=localhost;"&_
"Initial Catalog=sqltest;Password=;"
objCN.Open
objCN2.ConnectionString = "Provider=SQLOLEDB.1;"&_
"User ID=;Data Source=localhost;"&_
"Initial Catalog=sqltest;Password=;"
objCN2.Open

' Create the a recordset object, and initialize it
Set objRS = Server.CreateObject("ADODB.RecordSet")
With objRS
.CursorType = adOpenForwardOnly
.LockType = adLockReadOnly
.ActiveConnection = objCN
.CursorLocation = adUseServer
.Source = "SELECT Field1,Field2,Field3,Field4 FROM tblData"
End With
' Create the second recordset object, and initialize it
Set objRS2 = Server.CreateObject("ADODB.RecordSet")
With objRS2
.CursorType = adOpenForwardOnly
.LockType = adLockReadOnly
.ActiveConnection = objCN2
.CursorLocation = adUseServer
End With
' Create command object
Set objCmd = Server.CreateObject("ADODB.Command")
With objCmd
.ActiveConnection = objCN2
.Prepared = True
.CommandType = adCmdText
.CommandText = "SELECT COUNT(*) FROM tblData WHERE Field1=?"
'.CommandType = adCmdStoredProc
'.CommandText = "dummyquery"
End With
' Set up Parameter object
Set objParam = objCmd.CreateParameter("Param1",adInteger,adParamInput,4)
' Append the parameter to the Command object's parameters collection
objCmd.Parameters.Append objParam

' Execute the SQL query
objRS.Open

' Set up field references after opening recordset
Set objField0 = objRS(0)
Set objField1 = objRS(1)
Set objField2 = objRS(2)
Set objField3 = objRS(3)

' Write out the results in a table
Response.write "<table class="&Chr(34)&"datatable"&Chr(34)&">"
Do While Not objRS.EOF
Response.write "<tr><td>"
Response.write objField0
Response.write "</td><td>"
Response.write objField1
Response.write "</td><td>"
Response.write objField2
Response.write "</td><td>"
Response.write objField3
Response.write "</td></tr>"
Response.write vbCrLf

Response.write "<tr><th colspan=4>"

' Prepared Command Object
objParam.Value = objRS(0)
objRS2.Open objCmd

'objRS2.Open "SELECT COUNT(*) FROM tblData"&_
'	" WHERE Field1="&objRS(0)

'Set objRS2 = objCN.Execute("EXECUTE dummyquery "&objRS(0))
'Set objRS2 = objCN2.Execute("{call "&_
'	" dummyquery('"&objRS(0)&"')}")
'Set objRS2 = objCN.Execute("SELECT COUNT(*) FROM tblData"&_
'	" WHERE Field1="&objRS(0))
'objCN.dummyquery objRS(0), objRS2

Response.write objRS2(0)
objRS2.Close
Response.write "</th></tr>"

objRS.MoveNext
Loop
Response.write "</table>"

objRS.Close
objCN.Close
Set objCN = Nothing
objCN2.Close
Set objCN2 = Nothing
Set objRS = Nothing
Set objRS2 = Nothing
Set objCmd = Nothing

EndTime = Timer
Response.write "<p>processing took "&(EndTime-StartTime)&" seconds<p>&nbsp;"
%>
</body>
</html>

Jul 19 '05 #18
