Bytes IT Community

Scalability questions for extended stored procedure

Hi,
First, I would like to apologize for cross-posting in three groups, as
I did not know which one would be appropriate. If someone points me to
the correct one, I will use that in the future.

I am trying to find out the scalability of a user-written extended
stored procedure. I have created a DLL from a C program
(modified/plagiarized from some of the examples). The main function of
this extended SP is to act as a pass-through to a third-party ODBC
driver. All it is supposed to do is take the pass-through SQL
statement, user ID, and password, connect to a remote server (an IBM
3090 mainframe in our case) over a system ODBC connection, and pass
the returned result set back to the stored procedure calling this
extended SP. I am trying to find answers to the following questions:

1. What are the limitations of this approach?
2. What would happen if, say, 2,000 concurrent calls per minute were
made to the extended SP from the web app?
3. What would happen if this continued for, say, 4 hours? Will memory
usage grow to the point of crippling the server, assuming there is no
memory leak in the DLL?
4. Are there any connection-pooling concerns I should pay attention
to, specifically from an extended SP point of view?
5. Apart from compiling the DLL with the "MultiThread" option, should
I be using other options to make sure the DLL is thread-safe?

SQL Server environment:
OS - Windows 2000 Advanced Server SP4
SQL - SQL Server 2000 Enterprise Edition SP3
Hardware - 8-way, 2-node cluster with 6 GB RAM

Any help regarding this is greatly appreciated.

Prahalad

Jan 25 '06 #1
3 Replies


ppateel (pp*****@gmail.com) writes:
> I am trying to find out the scalability of a user-written extended
> stored procedure. [...]

I don't have any experience of this situation myself, but my gut
reaction is that memory pressure depends very much on how much data
the queries bring back. As long as they return, on average, 100 rows
with 10 columns, I would not worry. But if your 2,000 concurrent calls
bring back 100 MB each time, then memory may be a problem.

More of an issue may be all the context switching at that call
frequency. That is something you could try with a dummy XP: just
hammer the SQL machine with calls to that XP.

As for connection pooling, that is something you need to check with
the ODBC driver in question. But normally you get pooling only if the
connection string is the same. So if these 2,000 calls come in with
different username/password pairs, there will not be much pooling.
--
Erland Sommarskog, SQL Server MVP, es****@sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pro...ads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinf...ons/books.mspx
Jan 25 '06 #2

Hello Mr. Pateel,

Did you work at Sears in 1999? I worked with a gentleman there in that
timeframe who had a similar name, but it might have been Pattel.

In any case, you MUST be doing a web site for a really high-volume,
high-profile customer. At 2,000 SQL calls a minute, I'd think about
batching your data and going about it differently. Think of the old
VTAM days, where you asked for a screen full of data, showed it to the
user, and waited for their input. So instead of a SQL call for every
field, make one call to a stored-procedure equivalent on the 390 for
each screen. Stored procedures can gather data from a variety of
tables. Return all the data that MIGHT be needed for the screen, and
let the web app throw away what it doesn't need.

If you truly have 2,000 different users hitting 2,000 web pages a
minute, call Microsoft directly through your sales rep. They would be
GLAD to help you tune that.

Jan 30 '06 #3

No, I did not work at Sears. Also, the last name is about as common as
Smith, so... you get the idea.

We are using stored procedures. These calls are not SQL statement
calls but calls to a DB2 stored procedure on the mainframe. The
scenario I portrayed was a worst case. We have about 25% of that
requirement for about 25% of the time, but on certain days we do reach
it, as different applications use the same middleware to get data from
the mainframe.

Jan 30 '06 #4

This discussion thread is closed.