SELECTs vs Stored Procedures

Hi

When developing VB.NET WinForms apps bound to a SQL Server data source, is it
preferable to use SELECTs or stored procedures to read and write data from/to
SQL Server? Why?

Thanks

Regards
Nov 21 '05 #1
Hi,

Stored procedures are better. They are faster and harder to use a
sql injection attack against.
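As a rough illustration, calling a stored procedure with parameters from VB.NET
might look like this (a minimal sketch; the procedure name, parameter and
connection string are made-up placeholders):

    Imports System.Data
    Imports System.Data.SqlClient

    Module SprocExample
        Sub Main()
            ' Hypothetical procedure: dbo.up_GetOrdersByCustomer(@CustomerID int)
            Dim table As New DataTable
            Using cn As New SqlConnection("...your connection string...")
                Dim cmd As New SqlCommand("dbo.up_GetOrdersByCustomer", cn)
                cmd.CommandType = CommandType.StoredProcedure
                cmd.Parameters.Add("@CustomerID", SqlDbType.Int).Value = 42
                Dim da As New SqlDataAdapter(cmd)
                da.Fill(table)   ' the adapter opens and closes the connection
            End Using
            ' table can now be bound to a grid or other control
        End Sub
    End Module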

sql injection
http://msdn.microsoft.com/msdnmag/is.../SQLInjection/

http://msdn.microsoft.com/library/de...AGHT000002.asp

Ken
-------------------
"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source, is it
preferable to use SELECTs or stored procedures to read and write data
from/to SQL Server? Why?

Thanks

Regards

Nov 21 '05 #2
John,

In addition to Ken's answer: in both cases you are still issuing SELECTs.

The difference is that a stored procedure lives on the server, so more
processing is done on the server, which can make retrieving your data
somewhat (or sometimes a lot) quicker.

I hope this helps,

Cor
Nov 21 '05 #3
Hi John

They have their plans precompiled, so they are faster.

Yes, they definitely prevent SQL injection to a decent level.

Shivprasad Koirala
C# , VB.NET , SQL SERVER , ASP.NET Interview Questions
http://www.geocities.com/dotnetinterviews/

Nov 21 '05 #4
>Stored procedures are better. They are faster and harder to use a
sql injection attack against.


Seconded - also, with stored procs, you're only sending the name of
the stored proc to call and the parameters across the wire, not the
whole big SQL statement - that can make quite a difference on slower
links!

Marc
Nov 21 '05 #5
Ken Tucker [MVP] wrote:
Hi,

Stored procedures are better. They are faster and harder to
use a sql injection attack against.

sql injection
http://msdn.microsoft.com/msdnmag/is.../SQLInjection/

http://msdn.microsoft.com/library/de...ry/en-us/dnpag
2/html/PAGHT000002.asp
Faster than what? Parameterized dynamic queries? Nope. Both are compiled
at runtime and execution plans are cached for both.

SQL injection attacks are not possible when you use solely parameters
in your dynamic SQL. Furthermore, just using a proc doesn't make you safe:
a dynamic search stored procedure which concatenates SQL internally is
still vulnerable to SQL injection.
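To put that in VB.NET terms, here is a minimal sketch (the Customers table,
column and connection are placeholders, and the System.Data /
System.Data.SqlClient imports plus an enclosing class or module are assumed):

    Function FindCustomersByLastName(ByVal cn As SqlConnection, _
                                     ByVal userInput As String) As DataTable
        ' UNSAFE (don't do this): input such as  x' OR '1'='1  becomes part
        ' of the statement itself and changes its meaning.
        ' Dim unsafeSql As String = _
        '     "SELECT * FROM dbo.Customers WHERE LastName = '" & userInput & "'"

        ' SAFE: the value travels only as a typed parameter, never as SQL text.
        Dim cmd As New SqlCommand( _
            "SELECT * FROM dbo.Customers WHERE LastName = @lastName", cn)
        cmd.Parameters.Add("@lastName", SqlDbType.NVarChar, 50).Value = userInput

        Dim result As New DataTable
        Dim da As New SqlDataAdapter(cmd)
        da.Fill(result)   ' the adapter opens and closes the connection as needed
        Return result
    End Function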

Frans

Ken
-------------------
"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source,
is it preferable to use SELECTs or stored procedures to read and
write data from/to SQL Server? Why?

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #6
John wrote:
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source,
is it preferable to use SELECTs or stored procedures to read and write
data from/to SQL Server? Why?


As long as you use solely parameterized queries, it makes no
difference.
So do:
SELECT * FROM dbo.Foo WHERE field1 = @param
and not:
SELECT * FROM dbo.Foo WHERE field1 = 'value'

Also, always specify schema names for tables; this lets the query's
execution plan be cached and looked up more efficiently by SQL Server's
plan-matching mechanism.
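The same rule applies to writes. A minimal, hedged sketch (dbo.Foo and its
columns are just the placeholder names from above; a SqlConnection cn and the
SqlClient imports are assumed):

    Dim cmd As New SqlCommand( _
        "INSERT INTO dbo.Foo (field1, field2) VALUES (@field1, @field2)", cn)
    cmd.Parameters.Add("@field1", SqlDbType.NVarChar, 50).Value = "some value"
    cmd.Parameters.Add("@field2", SqlDbType.Int).Value = 1
    cn.Open()
    cmd.ExecuteNonQuery()   ' the values travel as parameters, never as SQL text
    cn.Close()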

Frans

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #7
sh**********@yahoo.com wrote:
Hi john

They have their plans precompiled, so they are faster.


Stored procs don't have their plans precompiled. At least not since
SQL Server 7. Only on DB2 are procedures really precompiled and stored
in a precompiled form (and not always even there, but most people opt for
that option, although it can hurt runtime optimization).

The reason for this is that statistics are only known at runtime, and
only for a fresh query can the optimizer truly optimize
the plan based on those statistics. See BOL about execution plans.

As he's talking about SELECTs, the main gain of a proc doesn't really
apply here (keeping processing on the server so you avoid pumping loads of
data to the client first and then back to the server).

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #8
Frans,
Faster than what?

Please read the question again.

Then you will see:
"to use SELECTs or stored procedures to read and write data from/to SQL
Server"

Therefore, in my opinion, this question should not only be seen as being about SELECTs.
I have seen that SPs can make a difference in speed.

Although we agree very much about what I see as the deeper meaning of the rest
of your answers in this thread: I sometimes get the idea that people think
that an SP is a solution for everything and automatically puts an application
on a higher level.

Cor
Nov 21 '05 #9
"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source, is it
preferable to use SELECTs or stored procedures to read and write data
from/to SQL Server? Why?

Thanks

Regards


If the data is sensitive, you may not want to allow direct access to your table.
So if you had an employee table, your users would not be able to execute any
old SQL statement, e.g.:
select firstname, lastname, salary from employee order by salary desc
but you could allow limited access to the employees table by writing a
stored procedure which returns only non-sensitive data for a given
employeeID. You give users permission to run the stored procedure but no
read permissions on the table.
Nov 21 '05 #10
Cor Ligthert [MVP] wrote:
Frans,
Faster than what?

Please, read the question again.

Than you will see
"to use SELECTs or stored procedure to read and write data from/to
SQL Server"

Therefore this question should in my opinion not only be seen about
Select. I have seen that SP's can make a difference in speed.


And I've seen the other side of it, where they were slower (because
they lack the flexibility to tailor the query to the actual need of
the action at runtime; you have to write the procs up front).

I indeed missed 'write' in the question, which makes the question
a bit weird: how do you compare a SELECT statement sent directly
with an insert proc? :)

Frans

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #11
A number of valid (and some not so valid) points have been mentioned. Let me add
this: when I poll a room full of 1000 SQL Server developers asking "How many
of you don't use stored procedures?" I usually get a handful of people to
raise their hands. The vast majority (over 95%) use and depend on stored
procedures. They have found that they provide a single, focused way to
manage queries, protect the database, provide better performance (when they
are written correctly) and build a more secure, more scalable application.

hth
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source, is it
preferable to use SELECTs or stored procedures to read and write data
from/to SQL Server? Why?

Thanks

Regards

Nov 21 '05 #12
I would just like to add that Stored Procedures offer you the added benefit
of making changes to your queries without having to recompile your entire
application.


"William (Bill) Vaughn" wrote:
A number of valid (and some not so valid) have been mentioned. Let me add
this: When I poll a room full of 1000 SQL Server developers asking "How many
of you don't use Stored Procedures?" I usually get a handful of people to
raise their hands. The vast majority (over 95%) use and depend on stored
procedures. They have found that they provide a single, focused way to
manage queries, protect the database, provide better performance (when they
are written correctly) and build a more secure, more scalable application.

hth
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source, is it
preferable to use SELECTs or stored procedures to read and write data
from/to SQL Server? Why?

Thanks

Regards


Nov 21 '05 #13
I'll have to add a hearty, "hear, hear" to that. I can't tell you how many
times I've saved myself tons of work by simply changing a stored procedure
versus having to make changes in code, recompile the app, and redeploy the
app. Along that same idea, it is sweet to be able to camp out on the server
for a short time testing a change to a stored procedure while your user
continues with their normal workday, totally oblivious to any fixes being
made.

"DHass [MCP]" <DH******@discussions.microsoft.com> wrote in message
news:55**********************************@microsof t.com...
I would just like to add that Stored Procedures offer you the added benefit
of making changes to your queries without having to recompile your entire
application.


"William (Bill) Vaughn" wrote:
A number of valid (and some not so valid) have been mentioned. Let me add
this: When I poll a room full of 1000 SQL Server developers asking "How
many
of you don't use Stored Procedures?" I usually get a handful of people to
raise their hands. The vast majority (over 95%) use and depend on stored
procedures. They have found that they provide a single, focused way to
manage queries, protect the database, provide better performance (when
they
are written correctly) and build a more secure, more scalable
application.

hth
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no
rights.
__________________________________

"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
> Hi
>
> When developing VB.NET WinForms apps bound to a SQL Server data source, is
> it preferable to use SELECTs or stored procedures to read and write data
> from/to SQL Server? Why?
>
> Thanks
>
> Regards
>


Nov 21 '05 #14

"Earl"
..
I'll have to add a hearty, "hear, hear" to that. I can't tell you how many
times I've saved myself tons of work by simply changing a stored procedure
versus


Yes, and almost everybody can do that, while changing the programs can
only be done by the few people who have access to those programs.

Or is that not what you wanted to say?

:-)))))

Cor
Nov 21 '05 #15
As we say here in the States, close enough for government work.

"Cor Ligthert [MVP]" <no************@planet.nl> wrote in message
news:%2****************@TK2MSFTNGP12.phx.gbl...

"Earl"
.
I'll have to add a hearty, "hear, hear" to that. I can't tell you how
many times I've saved myself tons of work by simply changing a stored
procedure versus


Yes, and almost everybody can do that, while changing the programs can
only be done by the few people who have access to those programs.

Or is that not what you want to say?

:-)))))

Cor

Nov 21 '05 #16
> Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).
In more detail, for those who might be interested:

SQL Server 2000 stores only the source for stored procedures and triggers.
When a stored procedure or trigger is first executed, the source is compiled
into an execution plan. If the stored procedure or trigger is executed again
before the execution plan is aged out of memory, the relational engine
detects the existing plan and reuses it; otherwise a new plan is built.

The above process flow is similar to the process that SQL Server follows for
all SQL statements.

The main performance advantage that stored procedures and triggers have in
SQL 2000 is that their SQL statements are always the same, so the
relational engine can match them to existing execution plans.

Stored procedures had a more pronounced performance advantage over other SQL
statements in earlier versions of SQL Server, which did not attempt to reuse
execution plans for batches that were NOT stored procedures or triggers.
regards

Michel Posseth [MCP]


"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in message
news:xn***************@news.microsoft.com... sh**********@yahoo.com wrote:
Hi john

They have their plans precompiled, so they are faster.


Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).

The reason for this is that only at runtime statistics are known and
only in the situation of a fresh query the optimizer can truly optimize
the plan based on the statistics. See BOL about execution plans.

As he's talking about SELECT's, the gain of a proc is not really an
issue (having processing on the server so you avoid pumping loads of
data to the client first and then back to the server).

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------

Nov 21 '05 #17
DHass [MCP] wrote:
I would just like to add that Stored Procedures offer you the added
benefit of making changes to your queries without having to recompile
your entire application.
Sure, until you have to add a parameter to your insert/update proc
set, because a column was added to a table. Have a nice time updating
the tier(s) above that proc-api!

FB


"William (Bill) Vaughn" wrote:
A number of valid (and some not so valid) have been mentioned. Let
me add this: When I poll a room full of 1000 SQL Server developers
asking "How many of you don't use Stored Procedures?" I usually get
a handful of people to raise their hands. The vast majority (over
95%) use and depend on stored procedures. They have found that they
provide a single, focused way to manage queries, protect the
database, provide better performance (when they are written
correctly) and build a more secure, more scalable application.

hth
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no
rights. __________________________________

"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing VB.NET WinForms apps bound to a SQL Server data
source, is it preferable to use SELECTs or stored procedures
to read and write data from/to SQL Server? Why?

Nov 21 '05 #18
the main performance advantage that stored procedures and triggers have in
SQL 2000 is that their SQL statements are always the same , therefore the
realational engine matches them with any execution plans


Not exactly; nothing prohibits you from crafting dynamic SQL within a stored
procedure.

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/
SLODUG - Slovene Developer Users Group www.codezone-si.info
Nov 21 '05 #19
Ah, close.
Stored procedure query plans change each time they're compiled. No, they're
not recompiled each time they're executed, but when they're forced out of the
cache, or when the DBA or developer forces recompilation for one reason or
another. That's a problem. When you test an SP with a specific set of
parameters, a plan matching those parameters is created. When the parameters
change, a new plan is NOT created to match--not if there is a plan in the
cache. Hopefully the old plan works efficiently with the new parameters--but
it might not.
SQL Server 2005 changes this a bit--it can recompile parts of the plan
without affecting other parts.

hth

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"m.posseth" <mi*****@nohausystems.nl> wrote in message news:Om**************@TK2MSFTNGP14.phx.gbl...
Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).


in more detail regarding this for those who might be interested :

SQL server 2000 stores only the source for stored procedures and triggers
when a stored procedure or trigger is first executed the source is compiled
into an execution plan . If the stored procedure or trigger is again
executed before the execution plan is aged from memory, the relational
engine detects the existing plan and reuses it, otherwise a new plan is
built

The above process flow is simular to the process that SQL server follows for
all SQL statements

the main performance advantage that stored procedures and triggers have in
SQL 2000 is that their SQL statements are always the same , therefore the
realational engine matches them with any execution plans

Stored procedures had a more pronounced performance advantage over other SQL
statements in earlier versions of SQL server , they did not attempt to reuse
execution plans for batches that were NOT stored procedures or triggers


regards

Michel Posseth [MCP]






"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in message
news:xn***************@news.microsoft.com...
sh**********@yahoo.com wrote:
Hi john

They have their plans precompiled, so they are faster.


Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).

The reason for this is that only at runtime statistics are known and
only in the situation of a fresh query the optimizer can truly optimize
the plan based on the statistics. See BOL about execution plans.

As he's talking about SELECT's, the gain of a proc is not really an
issue (having processing on the server so you avoid pumping loads of
data to the client first and then back to the server).

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------


Nov 21 '05 #20
How is this different than what applications have to do otherwise? Is
drag-and-drop or hand-coding the application any different because you have
moved the update logic to the middle tier?

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in message
news:xn***************@news.microsoft.com...
DHass [MCP] wrote:
I would just like to add that Stored Procedures offer you the added
benefit of making changes to your queries without having to recompile
your entire application.


Sure, until you have to add a parameter to your insert/update proc
set, because a column was added to a table. Have a nice time updating
the tier(s) above that proc-api!

FB


"William (Bill) Vaughn" wrote:
> A number of valid (and some not so valid) have been mentioned. Let
> me add this: When I poll a room full of 1000 SQL Server developers
> asking "How many of you don't use Stored Procedures?" I usually get
> a handful of people to raise their hands. The vast majority (over
> 95%) use and depend on stored procedures. They have found that they
> provide a single, focused way to manage queries, protect the
> database, provide better performance (when they are written
> correctly) and build a more secure, more scalable application.
>
> hth
> --
> ____________________________________
> William (Bill) Vaughn
> Author, Mentor, Consultant
> Microsoft MVP
> www.betav.com/blog/billva
> www.betav.com
> Please reply only to the newsgroup so that others can benefit.
> This posting is provided "AS IS" with no warranties, and confers no
> rights. __________________________________
>
> "John" <Jo**@nospam.infovis.co.uk> wrote in message
> news:ON**************@TK2MSFTNGP12.phx.gbl...
> > Hi
> >
> > When developing VB.NET WinForms apps bound to a SQL Server data
> > source, is it preferable to use SELECTs or stored procedures
> > to read and write data from/to SQL Server? Why?

Nov 21 '05 #21
Well, I totally agree with this. However, I would like to add that if we
are talking about parameters that change, we mean the parameter signature
of the proc and not the parameter values,
as only a signature change requires a forced recompile while testing.

At least, that is how I learned it in my 70-229 MCSE course.

regards

Michel Posseth [MCP]

"William (Bill) Vaughn" <bi**************@nwlink.com> wrote in message
news:O$*************@TK2MSFTNGP12.phx.gbl...
Ah, close.
Stored procedure query plans change--each time they're compiled. No, they're
not recompiled each time they're executed but when they're forced out of the
cache or the DBA or developer forces recompilation for one reason or
another. That's a problem. When you test a SP with a specific set of
parameters, a plan matching those parameters is created. When the parameters
change, a new plan is NOT created to match--not if there is a plan in the
cache. Hopefully the old plan works efficiently with the new parameters--but
it might not.
SQL Server 2005 changes this a bit--it can recompile parts of the plan
without affecting other parts.

hth

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"m.posseth" <mi*****@nohausystems.nl> wrote in message
news:Om**************@TK2MSFTNGP14.phx.gbl...
Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).


in more detail regarding this for those who might be interested :

SQL server 2000 stores only the source for stored procedures and triggers
when a stored procedure or trigger is first executed the source is
compiled
into an execution plan . If the stored procedure or trigger is again
executed before the execution plan is aged from memory, the relational
engine detects the existing plan and reuses it, otherwise a new plan is
built

The above process flow is simular to the process that SQL server follows
for
all SQL statements

the main performance advantage that stored procedures and triggers have in
SQL 2000 is that their SQL statements are always the same , therefore the
realational engine matches them with any execution plans

Stored procedures had a more pronounced performance advantage over other
SQL
statements in earlier versions of SQL server , they did not attempt to
reuse
execution plans for batches that were NOT stored procedures or triggers
regards

Michel Posseth [MCP]


"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in message
news:xn***************@news.microsoft.com...
sh**********@yahoo.com wrote:
Hi john

They have their plans precompiled, so they are faster.


Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).

The reason for this is that only at runtime statistics are known and
only in the situation of a fresh query the optimizer can truly optimize
the plan based on the statistics. See BOL about execution plans.

As he's talking about SELECT's, the gain of a proc is not really an
issue (having processing on the server so you avoid pumping loads of
data to the client first and then back to the server).

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------


Nov 21 '05 #22
Well, even then the statement still stands: how dynamic does the SQL have to
be before it stops benefiting from the execution plan cache, which works on any
SQL statement whether it is stored in a proc or not?
That is exactly why Frans Bouma is right in his statements above.

regards

Michel Posseth

"Miha Markic [MVP C#]" <miha at rthand com> wrote in message
news:e1**************@TK2MSFTNGP14.phx.gbl...
the main performance advantage that stored procedures and triggers have
in
SQL 2000 is that their SQL statements are always the same , therefore the
realational engine matches them with any execution plans


Not exactly, nothing prohibits you to craft dynamic sql within stored
procedure.

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/
SLODUG - Slovene Developer Users Group www.codezone-si.info

Nov 21 '05 #23
Yup, changing parameter VALUES does not affect the cached plan--it's used
regardless of how well it works given the parameter values provided. If you
change the signature, you're certainly going to get a new SP and thus a new
plan when it first executes.

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"m.posseth" <mi*****@nohausystems.nl> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
Well i totally agree with this , however i would like to add that if you
are talking about parameters that change , we are talking about the
parameter signature of the Proc and not the parameter values
as only a signature change requires a forced recompile while testing .

at least that is how i learned it at my 70-229 MCSE course

regards

Michel Posseth [MCP]

"William (Bill) Vaughn" <bi**************@nwlink.com> wrote in message
news:O$*************@TK2MSFTNGP12.phx.gbl...
Ah, close.
Stored procedure query plans change--each time they're compiled. No,
they're not recompiled each time they're executed but when they're forced
out of the cache or the DBA or developer forces recompilation for one
reason or another. That's a problem. When you test a SP with a specific
set of parameters, a plan matching those parameters is created. When the
parameters change, a new plan is NOT created to match--not if there is a
plan in the cache. Hopefully the old plan works efficiently with the new
parameters--but it might not.
SQL Server 2005 changes this a bit--it can recompile parts of the plan
without affecting other parts.

hth

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no
rights.
__________________________________

"m.posseth" <mi*****@nohausystems.nl> wrote in message
news:Om**************@TK2MSFTNGP14.phx.gbl...
Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).


in more detail regarding this for those who might be interested :

SQL server 2000 stores only the source for stored procedures and triggers
when a stored procedure or trigger is first executed the source is
compiled
into an execution plan . If the stored procedure or trigger is again
executed before the execution plan is aged from memory, the relational
engine detects the existing plan and reuses it, otherwise a new plan is
built

The above process flow is simular to the process that SQL server follows
for
all SQL statements

the main performance advantage that stored procedures and triggers have
in
SQL 2000 is that their SQL statements are always the same , therefore the
realational engine matches them with any execution plans

Stored procedures had a more pronounced performance advantage over other
SQL
statements in earlier versions of SQL server , they did not attempt to
reuse
execution plans for batches that were NOT stored procedures or triggers
regards

Michel Posseth [MCP]


"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in message
news:xn***************@news.microsoft.com...
sh**********@yahoo.com wrote:

Hi john

They have their plans precompiled, so they are faster.

Stored procs don't have their plans precompiled. At least not since
Sqlserver 7. Only on DB2 procedures are really precompiled and stored
in a precompiled fashion (not always as well, but most people opt for
that option, although it can hurt runtime optimization).

The reason for this is that only at runtime statistics are known and
only in the situation of a fresh query the optimizer can truly optimize
the plan based on the statistics. See BOL about execution plans.

As he's talking about SELECT's, the gain of a proc is not really an
issue (having processing on the server so you avoid pumping loads of
data to the client first and then back to the server).

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------



Nov 21 '05 #24
William (Bill) Vaughn wrote:
How is this different than what applications have to do otherwise? Is
drag-and-drop or hand-coding the application any different because
you have moved the update logic to the middle tier?


Well, with dynamic SQL generated at runtime, you don't have to do
anything in that scenario in most cases, while in the case of a proc,
you have to change more than one tier. That's not all: often there are
two people or even two teams involved in such a change, which can be
cumbersome, as it spans two or more tiers.

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #25
On Tue, 16 Aug 2005 02:45:48 +0100, "John" <Jo**@nospam.infovis.co.uk> wrote:

¤ Hi
¤
¤ When developing VB.NET WinForms apps bound to a SQL Server data source, is it
¤ preferable to use SELECTs or stored procedures to read and write data from/to
¤ SQL Server? Why?
¤

I would agree with most of the other sentiments toward a preference with respect to stored
procedures.

The only issue I have with SPs is that development can be cumbersome if your data store is not
supported in an integrated development environment. You can't see the SQL code, or other database
objects, so you typically have to open up a separate tool to work with the SPs.

In addition, some designs may need to implement dynamic SQL, which may not translate well to an SP.

Otherwise, the benefits of using SPs far outweigh those for naked SQL.
Paul
~~~~
Microsoft MVP (Visual Basic)
Nov 21 '05 #26
Hi Frans. Always a pleasure to see your posts on this subject, even if we
disagree on others. ;)

So, you're saying, putting "dbo." in front of your table names, actually
improves performance?

Fregas

"Frans Bouma [C# MVP]" wrote:
John wrote:
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source,
is it preferable to use SELECTs or stored procedures to read and write
data from/to SQL Server? Why?


As long as you use solely parameterized queries, it's not making a
difference.
so do:
SELECT * FROM dbo.Foo where field1 = @param
and not:
SELECT * FROM dbo.foo where field1 = 'value'

Also, specify schema names for tables always, this makes the query's
execution plan get cached better and SQLServer's mechanism to find back
an execution plan works more efficiently.

Frans

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------

Nov 21 '05 #27
OK guys, we've heard a lot of pros and cons about stored procedures versus
dynamic (hopefully parameterized) SQL.

How do you handle optional parameters? For example, I want to query for
orders by customerid, a date range and/or productID. Any or all of these
parameters could be included:

SELECT O.*
FROM Orders O
WHERE O.OrderID = @OrderID
AND O.ProductID = @ProductID
AND O.CustomerID = @CustomerID
AND O.DateOrder BETWEEN @startDate AND @endDate

How do I make the parameters optional? I've tried using IsNull() and
Coalesce(), but these have slowed performance quite a bit for me even in
stored procs. With parameterized dynamic SQL it's not an issue, because I
can change the query to not include one of the pieces of the WHERE clause.
Would I just have to make a bunch of different stored procs for every
combination of the parameters? Bleh...
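For reference, a minimal sketch of the dynamic-but-parameterized approach I
mean, using the names from the example above (treat it as illustrative: a
value of 0 or Date.MinValue stands in for "not supplied", and the SqlClient
imports plus an enclosing class or module are assumed):

    Function BuildOrderSearch(ByVal cn As SqlConnection, _
                              ByVal customerId As Integer, _
                              ByVal productId As Integer, _
                              ByVal startDate As Date, _
                              ByVal endDate As Date) As SqlCommand
        ' Only the filters that were actually supplied end up in the WHERE
        ' clause, but every value still travels as a parameter.
        Dim sql As New System.Text.StringBuilder("SELECT O.* FROM dbo.Orders O WHERE 1 = 1")
        Dim cmd As New SqlCommand()
        cmd.Connection = cn

        If customerId > 0 Then
            sql.Append(" AND O.CustomerID = @CustomerID")
            cmd.Parameters.Add("@CustomerID", SqlDbType.Int).Value = customerId
        End If
        If productId > 0 Then
            sql.Append(" AND O.ProductID = @ProductID")
            cmd.Parameters.Add("@ProductID", SqlDbType.Int).Value = productId
        End If
        If startDate > Date.MinValue AndAlso endDate > Date.MinValue Then
            sql.Append(" AND O.DateOrder BETWEEN @StartDate AND @EndDate")
            cmd.Parameters.Add("@StartDate", SqlDbType.DateTime).Value = startDate
            cmd.Parameters.Add("@EndDate", SqlDbType.DateTime).Value = endDate
        End If

        cmd.CommandText = sql.ToString()
        Return cmd
    End Function

You then Fill a DataTable from the returned command as usual.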

This is very common in applications that require search functionality, BTW.

I'm not issuing this as a challenge or to be a prick...I really want to know.

Fregas

"John" wrote:
Hi

When developing VB.NET WinForms apps bound to a SQL Server data source, is it
preferable to use SELECTs or stored procedures to read and write data from/to
SQL Server? Why?

Thanks

Regards

Nov 21 '05 #28
Ah well, we write the SP to set default values for the parameters (for
example "NULL"). The SP logic then incorporates that fact into its WHERE
clause logic.

WHERE (x = @MyXParm OR @myXParm IS NULL)
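From the VB.NET side that just means either leaving a parameter out (so the
proc's default of NULL applies) or passing DBNull explicitly. A small sketch
with made-up names, assuming an available SqlConnection cn and the SqlClient
imports:

    Dim cmd As New SqlCommand("dbo.up_SearchOrders", cn)
    cmd.CommandType = CommandType.StoredProcedure
    ' Supplied filter: pass the value.
    cmd.Parameters.Add("@CustomerID", SqlDbType.Int).Value = 42
    ' Unsupplied filter: either omit the parameter entirely (the proc's
    ' default of NULL kicks in) or pass DBNull.Value explicitly.
    cmd.Parameters.Add("@ProductID", SqlDbType.Int).Value = DBNull.Value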
--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________

"Fregas" <Fr****@discussions.microsoft.com> wrote in message
news:2C**********************************@microsof t.com...
Ok, guys, we've heard a lot of pros cons about stored procedures versus
dynamic (hopefully parameterized) sql.

How do you handle optional parameters? For example, I want to query for
orders by customerid, a date range and/or productID. Any or all of these
parameters could be included:

SELECT O.*
FROM Orders
WHERE OrderID = @OrderID
AND ProductID = ProductID
AND CustomerID = @CustomerID
AND DateOrder between @startDate and @enddate

how do i make the parameters optional? I've tried using IsNull() and
Coalesce() but these have slowed performance quite a bit for me even in
stored procs. With parameterized dynamic sql, its not an issue, because I
can change the query to not include one of the pieces of the WHERE clause.
Would I just have to make a bunch of different stored procs for every
combination of the parameters? Bleh...

This is very common in applications that require search functionality,
BTW.

I'm not issuing this as a challenge or to be a prick...I really want to
know.

Fregas

"John" wrote:
Hi

When developing vb.bet winform apps bound to sql server datasource, is it
preferable to use SELECTs or stored procedure to read and write data
from/to
SQL Server? Why?

Thanks

Regards

Nov 21 '05 #29
Fregas wrote:
Hi frans. Always a pleasure to see your posts on this subject, even
if we disagree on others. ;)
:P
So, you're saying, putting "dbo." in front of your table names,
actually improves performance?
Yes, as it makes it easier for the SQL optimizer to find a cached plan
and reuse it.

Frans

Fregas

"Frans Bouma [C# MVP]" wrote:
John wrote:
Hi

When developing VB.NET WinForms apps bound to a SQL Server data
source, is it preferable to use SELECTs or stored procedures
to read and write data from/to SQL Server? Why?


As long as you use solely parameterized queries, it's not making a
difference.
so do:
SELECT * FROM dbo.Foo where field1 = @param
and not:
SELECT * FROM dbo.foo where field1 = 'value'

Also, specify schema names for tables always, this makes the query's
execution plan get cached better and SQLServer's mechanism to find
back an execution plan works more efficiently.

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #30
William (Bill) Vaughn wrote:
Ah well, we write the SP to set default values for the parameters
(for example "NULL"). The SP logic then incorporates that fact into
its WHERE clause logic.

WHERE (x = @MyXParm OR @myXParm IS NULL)


... which throws out your 'procs are faster' argument ;), because this
is slower than normal, task-tailored dynamic parameterized SQL.

Also, how is:

CREATE PROCEDURE dbo.pr_DeleteCustomer
@customerID int
AS
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

more secure than firing this as dynamic SQL:
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

After all, say I use an app and sometimes have to delete customers
through the app. I then have access to the db table and, with QA, can
run the proc, passing in a random ID value.
FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #31
Hi,

In my opinion, this kind of discussion always assumes that the SQL
Server is addressed directly by its IP over the Internet.

That is, in my opinion, not how it should be done with .NET.

In, for instance, a normal ASP.NET application, no SQL code is transported
over the public network (or, if IIS and SQL Server are separated by a router,
that traffic should be confined to a very private part of the network).

If it is a Windows Forms application, then in my opinion a web service is
the way to go when it is not on an internal network. Securing this kind of
operation on an internal network is, in my view, the same as securing anything
else workers can do; they probably have easier internal ways to get the
information than attacking SQL parameters at some ISO/OSI layer.

Just my thought,

Cor
Nov 21 '05 #32
Yeah Bill, I seem to remember having performance problems with these kinds of
WHERE clauses. It's about the same as using IsNull() or Coalesce(). They just
didn't perform as well as dynamic SQL.

In defense of stored proc security, Frans: in your example, a hacker who
gains access to the database by discovering the application's SQL Server
login could then run the delete stored procedure pr_DeleteCustomer, but
they'd have to know the procedure name and the IDs of the customers to delete
them. However, they couldn't necessarily do DELETE FROM Customers, TRUNCATE,
run a DTS package, etc., because the application's login doesn't have
permissions to do those things. That's the theory anyway.

Craig

"Frans Bouma [C# MVP]" wrote:
William (Bill) Vaughn wrote:
Ah well, we write the SP to set default values for the parameters
(for example "NULL"). The SP logic then incorporates that fact into
its WHERE clause logic.

WHERE (x = @MyXParm OR @myXParm IS NULL)


... which throws out your 'procs are faster' argument ;), because this
is slower than normal, task-tailored dynamic parameterized SQL.

Also, how is:

CREATE PROCEDURE dbo.pr_DeleteCustomer
@customerID int
AS
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

more secure than, firing this:
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

as dynamic sql? After all -> say I use an app and I have to delete
customers through the app sometimes. I then have access to the db
table, and can with QA run the proc, giving in a random ID value.
FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------

Nov 21 '05 #33
I'm not sure if I'm understanding you correctly, but I'll add my 2 cents.

I would say that in a web hosting environment for example, the sql server is
usually accessible over the internet, so developers can access it using
Enterprise Manager. Therefore, security is a higher concern than if the sql
server is only on a local internal network. However, in either case the
bandwidth usage in sending/receiving data to and from sql server DOES make a
difference. It might be less noticable if the sql server is on your local
network, but it would still count.

In the case of a winforms application (or maybe SmartClients) in which the
database must be accessed over the internet because the users will be
connecting from both inside and outside the company network, then I agree
that some other kind of access to your data, such as web services, is a
better choice than each client accessing the database directly. Although
realistically, you would probably want to use some kind of remoting system to
send/receive business objects, rather than just using web services for
everything.

CSLA for example (Rocky Lhotka's framework), has something called the
DataPortal which sends and receives your business objects back to your web
server or windows client application. It is designed so that the windows
clients (or even multiple web services, if you have a web farm) can ask the
DataPortal for all business objects. The DataPortal can reside on its own
server and be the only thing that talks to the database, as opposed to having
potentially hundreds of windows clients all opening connections to the DB.
This makes the application more scalable, because there are less connections
and more connection pooling.

Even though there's a lot of things I don't like about CSLA, I would
probably take this same approach if I were using windows clients, or if I had
to cluster my web server. I think CSLA handles this part for you pretty
nicely. I might still use web services for some things, or just to avoid the
hassle of remoting if I was building a relatively small, low user count,
application, but my understanding is that web services would not perform as
well in most cases. Also I prefer to work with objects directly, rather than
have some web service wrapping them up for me or talking to the DB directly.

Craig

"Cor Ligthert [MVP]" wrote:
Hi,

In my opinion is in this kind of discussion always expected that the SQL
server is by its IP directly addressed over the Internet.

That is in my opinion not the way by dotNet.

In by instance a normal ASPNET application there is not any code transported
over the network (or is should be with an by a router seperated IIS and SQL
Server, which should than be done in a very private part of the network).

While if it is a windowsforms application than in my opinion a webservice is
the way to go if it is not in an Internal network. To secure this kind of
operatinon in an Internal network is in my idea the same as securing that
workers do anything, they probably have easier methods internal to get the
information than securing SQL parameters on parts of the ISO/OSI layers.

Just my thought,

Cor

Nov 21 '05 #34
Craig,
I'm not sure if I'm understanding you correctly, but I'll add my 2 cents.


You understand me 100%,

:-)

Cor
Nov 21 '05 #35
Fregas wrote:
In defense of stored proc security Frans, in your example, a hacker
who gains access to the database by discovering the application's SQL
Server login could then run the delete stored procedure
pr_DeleteCustomer, but they'd have to know the procedure name and ID
of the customers to delete them. However they couldn't necessarily
do DELTE FROM CUSTOMER, TRUNCATE, run a DTS package, etc., becuase
the application's login doesn't have permissions to do those things.
Thats the theory anyway.
Though, when an app uses dynamic SQL (parameterized) and uses a login
created specially for that app (or a couple, for example one for admin usage
and one for fetches of generic data), how can I truncate the table
then? There's no difference, as the security defined on the table is
the same as that defined on the proc: if I have the credentials defined on
the object, I can execute an action on it, be that executing it or deleting
something.

Having a proc like the one I gave doesn't prevent a user from deleting
a customer, hence it is not more secure.

Frans

Craig

"Frans Bouma [C# MVP]" wrote:
William (Bill) Vaughn wrote:
Ah well, we write the SP to set default values for the parameters
(for example "NULL"). The SP logic then incorporates that fact
into its WHERE clause logic.

WHERE (x = @MyXParm OR @myXParm IS NULL)


... which throws out your 'procs are faster' argument ;), because
this is slower than normal, task-tailored dynamic parameterized SQL.

Also, how is:

CREATE PROCEDURE dbo.pr_DeleteCustomer
@customerID int
AS
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

more secure than, firing this:
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

as dynamic sql? After all -> say I use an app and I have to delete
customers through the app sometimes. I then have access to the db
table, and can with QA run the proc, giving in a random ID value.

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #36
Well, I think you might be mistaken, Frans. The security for the login to
run dynamic SQL is a BIT broader than for the login that only runs procs.

With dynamic sql, the application would have to have direct access to the
table (at the very least read/write permissions.) Therefore, a hacker who
manages to get the login for the application using dynamic sql can then do a
query like:

DELETE FROM Customers

which deletes all the customers, basically the equivalent of truncate.

However, if you use only stored procs, you can give the application's login
ONLY access execute procs and NOT the underlying table:

up_DELETE_CUSTOMER @CustomerID

in which case the hacker cannot simply blow out the whole table at once, but
has to know the id of every customer in order to do the same damage. It must
be said, that the hacker could potentially find another "SELECT" stored proc,
in order to find the customerID's if he didn't know them already somehow.

I agree that stored procs don't necessarily prevent the hacker from doing
the same damage, but they make it more difficult, which is all security can hope
to do anyway.

If the app's login has DBO access, then the hacker can also TRUNCATE and so
forth, but not many applications actually need this level of access.
However, many developers (myself included) are guilty of the following: we
develop the application using a DBO account for convenience, but don't remove
those extra permissions in production, because we're well...lazy. I know at
my company, few of the web applications we're developing are available
outside our VPN, so security is already pretty tight.

I'm not saying procs are better by any means, just that I can understand the
security benefits of their use even though I choose not to use them. In the
performance realm, I have found dynamic SQL about as fast as procedures, and
procs are actually SLOWER in many cases, like some of the optional-parameter
queries I've mentioned elsewhere on here. Also, it is much more difficult to
reuse code within/between stored procs, thus adding development time and
maintenance headaches to your application.

Craig
"Frans Bouma [C# MVP]" wrote:

Though, when an app uses dyn. sql (parameterized) and uses a login
specially for that app (or a couple, for example one for admin usage,
and one for fetches of generic data), how can I truncate the table
then? There's no difference, as teh security defined on the table is
the same as defined on the proc: if I have the credentials defined on
the object, I can execute an action on it, be it execute it or delete
something.

Having a proc like the one I gave doesn't prevent a user from deleting
a customer, hence not more secure.

Frans

Craig

"Frans Bouma [C# MVP]" wrote:
William (Bill) Vaughn wrote:

> Ah well, we write the SP to set default values for the parameters
> (for example "NULL"). The SP logic then incorporates that fact
> into its WHERE clause logic.
>
> WHERE (x = @MyXParm OR @myXParm IS NULL)

... which throws out your 'procs are faster' argument ;), because
this is slower than normal, task-tailored dynamic parameterized SQL.

Also, how is:

CREATE PROCEDURE dbo.pr_DeleteCustomer
@customerID int
AS
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

more secure than, firing this:
DELETE FROM dbo.Customers WHERE CustomerID = @customerID

as dynamic sql? After all -> say I use an app and I have to delete
customers through the app sometimes. I then have access to the db
table, and can with QA run the proc, giving in a random ID value.

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------

Nov 21 '05 #37
Fregas wrote:
Well, I think you might be mistaken, Frans. The security for the
login to run dynamic sql is a BIT broader then for the login to only
run procs.
No, that's not the case, read on.
With dynamic sql, the application would have to have direct access to
the table (at the very least read/write permissions.) Therefore, a
hacker who manages to get the login for the application using dynamic
sql can then do a query like:

DELETE FROM Customers

which deletes all the customers, basically the equivalent of truncate.

However, if you use only stored procs, you can give the application's
login ONLY access execute procs and NOT the underlying table:

up_DELETE_CUSTOMER @CustomerID

in which case the hacker cannot simply blow out the whole table at
once, but has to know the id of every customer in order to do the
same damage. It must be said, that the hacker could potentially find
another "SELECT" stored proc, in order to find the customerID's if he
didn't know them already somehow.
Exactly. And not only that: if a hacker is able to delete one customer,
it's already not safe. If the customerID is numeric, like an int, he
might have to write a loop to delete all of them, but that's just a matter
of time.
I agree that stored procs don't necessarily prevent the hacker from
doing the same damage, but makes it more difficult, which is all
security can hope to do anyway.
It's either SAFE or not, not 'a bit safe'. 'A bit safe' isn't
safe, it's unsafe. So if someone argues that stored procedures are
secure and dynamic SQL isn't, that person then has to prove how that
can be achieved, because that's much harder than 99% of the people using
stored procedures think (for starters: moving your complete app
into the db, passing secret keys to procs which only act if the
key is valid, etc.).

People tend to believe that as long as it is a bit safe, they're at
least a bit more secure, but that's not true. If a hacker is after your
data, he'll get it, as soon as there's one tiny hole he can peek
through.

Though, the vast majority of 'security' related concerns aren't really
about hackers from the outside, but about the 'manager' with MS
Access who wants to run ad-hoc reports on your 3TB machine. In that
scenario, you can perfectly well limit access with normal db-role based
security on tables and a couple of logins instead of one general login
which can do everything.
If the app's login has DBO access, then the hacker can also TRUNCATE
and so forth, but not many applications actually need this level of
access. However, many developers (myself included) are guilty of the
following: we develop the application using a DBO account for
convenience, but don't remove those extra permissions in production,
because we're well...lazy. I know at my company, few of the web
applications we're developing are available outside our VPN, so
security is already pretty tight.
It's often laziness, especially of the 'I heard this so it
must be true, I don't have time to investigate it further' kind. Databases
can be locked down pretty well, though often it's a matter of doing it
at a higher level rather than a lower level. The problem is that
when a DBA says "I decide what's used and I decide it will be stored
procedures and I write them and that's final", the developers who write
the code of the actual application are then faced with an API which
might not be optimal for their application, or which consists of 90% CRUD
procs and is therefore not more secure than any other approach, as those
don't bring 100% security to the table.

Instead of working together, they work against each other, which is
weird, as they all work for the same company. It's not new, though. A
lot of developers have to argue with system administrators a lot,
because the system administrator doesn't want to take a new app into
production if it hasn't been tested for a month on some obscure test server.
I'm not saying procs are better by any means, just that i can
understand the security benefits of their use even though I choose
not to do so. In the performance realm, I have found dynamic sql
about as fast as procedures, and procs are actually SLOWER in many
cases, like some of the optional parameter queries i've mentioned
elsewhere on here. Also, it is much more difficult to reuse code
within/between stored procs, thus adding development time and
maintenance headaches to your application.
It's a bit of a weird thing: a team is asked to develop an n-tier
application, and they are free to decide what they use in the BL and PL
tier, but the datalayer is written by a person not part of the team,
and at the same time, has a very big say in the rest of the design of
the application.

Because, if you use procedures, it's harder to use objects in the BL
and PL tier. It's not hard to fill a set of customer entities with the
result of a select proc call. It's MUCH harder to fetch a graph of
entities efficiently with a set of procs (as in: impossible without
writing a new proc for each occasion). By opting for procedures, you
decide to go the dataset route, which is perhaps not what the team
wants to use in their app, and THEY write the application.

Instead of using the DBA's knowledge (so add the DBA as a db
consultant to the team, so the DBA knows which tables are targeted
often, and because the DBA knows which tables are large, he/she can
advise on using subqueries instead of joins or vice versa, through the
usage of the abstract DAL), the DBA and the developers aren't working
together, which is counterproductive.

"But it is more secuuuuuureee!!!"
no. An n-tier application has numerous areas where a hacker can inject
code, alter data during the dataflow etc. And most hackers who look for
databases aren't interested in deleting data, they're interested in
fetching data. So the obvious place to stick a listener component is
outside the db, right below the PL.

Frans.

Craig
"Frans Bouma [C# MVP]" wrote:

Though, when an app uses dyn. sql (parameterized) and uses a login
specially for that app (or a couple, for example one for admin
usage, and one for fetches of generic data), how can I truncate the
table then? There's no difference, as the security defined on the
table is the same as defined on the proc: if I have the credentials
defined on the object, I can execute an action on it, be it execute
it or delete something.

Having a proc like the one I gave doesn't prevent a user from
deleting a customer, hence not more secure.

Frans

Craig

"Frans Bouma [C# MVP]" wrote:

> William (Bill) Vaughn wrote:
>
> > Ah well, we write the SP to set default values for the
> > parameters (for example "NULL"). The SP logic then
> > incorporates that fact into its WHERE clause logic.
> >
> > WHERE (x = @MyXParm OR @myXParm IS NULL)
>
> ... which throws out your 'procs are faster' argument ;),
> because this is slower than normal, task-tailored dynamic
> parameterized SQL.
>
> Also, how is:
>
> CREATE PROCEDURE dbo.pr_DeleteCustomer
> @customerID int
> AS
> DELETE FROM dbo.Customers WHERE CustomerID = @customerID
>
> more secure than, firing this:
> DELETE FROM dbo.Customers WHERE CustomerID = @customerID
>
> as dynamic sql? After all -> say I use an app and I have to
> delete customers through the app sometimes. I then have access
> to the db table, and can with QA run the proc, giving in a
> random ID value.

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #38
Frans,

You would be surprised: I agree with the overall trend of your message. I
want to add something.
It's often laziness,


That is no problem for me if it is for investigation; however, a problem
for me is that people often don't review their programs and leave things as
they are once they have tried something and it seems to work. That, in
my opinion, makes very big holes.

Just my thought,

Cor
Nov 21 '05 #39
Hi Frans,

I respectfully disagree. Most things in this world, including technology,
are a matter of degrees. I think most security analysts understand that they
can't make any system "unhackable" but it's a matter of balancing how much
effort and inconvenience they put in versus the payoff of being more secure.
I don't think it's a simple matter of "Stored procedures are secure, dynamic
sql is not". It depends on a lot of factors. On the whole, I think you have more
security options with stored procedures, but that certainly doesn't make your
database "completely secure" whatever that is. You're looking at it as a
black and white question, and few things fit into neat categories like that.
exactly. And not only that: if a hacker is able to delete one customer
it's already not safe. If the customerID is numeric, like an int, he
might have to write a loop to delete all of them, but that's a matter
of time.

And that extra time might cause the hacker to get noticed by intrusion
detection, get logged off because the network is only open for a specific
window of time, etc. If the key is a guid or something not easily "loopable"
then it's going to be even harder. Many of the recommendations Microsoft
gives for locking down IIS, for example, simply cause the hacker to have to put in a
lot more work to do the same damage, in the hope that the hacker will get
caught or reach the extent of their knowledge and give up.

People tend to believe that as long as it is a bit safe, they're at
least a bit more secure, but that's not true. If a hacker is after your
data, he'll get it, as soon as there's one tiny hole he can peek
through.
Again, I disagree. For example, I have a Belkin router. That router has
firewall protection. It has basic security. Am I completely safe? No, I'm
not. Am I as safe as if I had a professional come in and install a
Checkpoint firewall? Not nearly. Am I safer than if I had no firewall
protection at all? Definitely.

Though, the vast majority of 'security' related concerns aren't
about hackers from the outside, but about the 'manager' with MS Access
who wants to run ad-hoc reports on your 3TB machine. In that
scenario, you can perfectly limit access with normal db-role based
security on tables and a couple of logins instead of one general login
which can do everything.
I do agree with this, although I would say that managers, end users, etc.
shouldn't have any access to the database, but should be working through the
application so both security and business rules can be applied. Obviously if
they're developing something like Crystal Reports, then the security will
need to be at the database level, in which case you're right. Stored procs
aren't really helping security much at all, since the developer needs access
to certain tables or views.
Instead of working together, they work against each other, which is
weird, as they all work for the same company. It's not new though. A
lot of developers have to argue with system administrators a lot,
because the system administrator doesn't want to take a new app into
production if it isn't tested for a month on some obscure test server.

It's a bit of a weird thing: a team is asked to develop an n-tier
application, and they are free to decide what they use in the BL and PL
tier, but the datalayer is written by a person not part of the team,
and at the same time, has a very big say in the rest of the design of
the application.

Because, if you use procedures, it's harder to use objects in the BL
and PL tier. It's not hard to fill a set of customer entities with the
result of a select proc call. It's MUCH harder to fetch a graph of
entities efficiently with a set of procs (as in: impossible without
writing a new proc for each occasion). By opting for procedures, you
decide to go the dataset route, which is perhaps not what the team
wants to use in their app, and THEY write the application.

Instead of using the DBA's knowledge (so add the DBA as a db
consultant to the team, so the DBA knows which tables are targeted
often, and because the DBA knows which tables are large, he/she can
advise on using subqueries instead of joins or vice versa, through the
usage of the abstract DAL), the DBA and the developers aren't working
together, which is counterproductive.
I'm fortunate enough to have not experienced that, although I've heard it
from others. I've only worked in two places that had a DBA, and in both of
those places it was a pleasant relationship. We created our own queries,
procedures, views, etc. and the DBA helped maintain indexes, gave us tips, did
backups, etc. They stayed out of our way, we stayed out of theirs and for
the most part helped each other when needed. My current DBA is great and my
managers understand that developers need to create data structures rather than
have them dictated to them.

This is a bit off topic, but I have to say I dislike the whole idea of a team
project where certain members are working on one tier like the database or
the business objects, and other people working on other tiers like the
presentation. I'd rather break people up into developing certain areas of
the app and making them work from top to bottom. Although it's good for a
designer to create images and chop up the HTML, you really need developers to
consume objects and create the UI, and that's easier if that developer has
also created his business objects and database structures (whether just
tables, views or stored procs), because he's working with his own API(s).

I've seen the "build one tier at a time" approach to development fail many
times, even with only one or two developers. We had a guy at my current job
get fired because he fell so far behind. He created so many objects for
things and test scripts and so forth but never built his UI. He was saving
that for last. The problem was that the "test scripts" didn't accurately
test his API very well and so he had a lot of code but no screens to show for
it and no real way of knowing if his API worked. I much prefer to build one
piece of the app, usually one or two screens at a time: design an object
model, build my tables, build my data access code, build my objects,
integrate them into the UI. This way I have something to show and I know
that this piece of functionality really works.

Also, I knew another good developer who wrote all his tables, then all his
CRUD stuff in stored procedures, then all his business objects, then all his
UI. This led to a bit of wasted effort because things often change during
development. Many of his stored procs and business objects didn't ever get
used, or had to be rewritten because priorities changed, new requirements
were discovered and better ways of writing these layers were found while
developing the app.

So I build one "module" at a time, from top to bottom.

"But it is more secuuuuuureee!!!"
no. An n-tier application has numerous areas where a hacker can inject
code, alter data during the dataflow etc. And most hackers who look for
databases aren't interested in deleting data, they're interested in
fetching data. So the obvious place to stick a listener component is
outside the db, right below the PL.


I agree with this too. Very astute observation Frans.

Fregas

Nov 21 '05 #40
Craig,
....
I respectfully disagree. [... cut everything except the first and the last
sentence ...] I agree with this too. Very astute observation Frans.


I hope that you are not angry if I say that I don't understand this: do you
agree or disagree?

I never try to read just the words (in any language), but rather what the writer wants
to say. In my opinion you are both trying to say the same thing, with some
slight points that can be discussed, but they don't change the overall trend of your
messages.

:-)

Cor
Nov 21 '05 #41
Countless religious arguments have been fought over this in the past two decades.
Seriously, do a search and you'll find an endless number of threads or
webpages devoted to this topic, where the advocate of one side or the other will
assure you that they are correct and anyone else is a heathen.

Two quick points:

-Dynamic SQL, presuming you do it right (e.g. parameterized), has cached
query plans just like stored procedures (see the sp_executesql sketch below).

-Most experienced developers use stored procedures because they add one more
level of abstraction from the underlying schema (it's a sort of
aspect-oriented programming in that there's a clear and easy way to intercept
and modify calls), not to mention that they're a great way to centralize database
I/O. In the most beautiful designs the only objects accessible to consumers
are a small set of externally visible stored procedures.
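To make the first point concrete: when you attach SqlParameter objects to a
SqlCommand with CommandType.Text, the SqlClient provider sends the statement
to the server as an sp_executesql call, and that gets a cached, reusable plan
just like a proc does. Roughly this (table and column names are only for
illustration):

  EXEC sp_executesql
      N'SELECT CustomerID, CompanyName FROM dbo.Customers WHERE Country = @country',
      N'@country nvarchar(30)',
      @country = N'Germany'

  -- a second call with another value reuses the same cached plan,
  -- because the statement text is identical
  EXEC sp_executesql
      N'SELECT CustomerID, CompanyName FROM dbo.Customers WHERE Country = @country',
      N'@country nvarchar(30)',
      @country = N'Mexico'

So "procs are precompiled, dynamic SQL is not" doesn't really hold for
parameterized statements.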

"John" wrote:
Hi

When developing vb.bet winform apps bound to sql server datasource, is it
preferable to use SELECTs or stored procedure to read and write data from/to
SQL Server? Why?

Thanks

Regards

Nov 21 '05 #42
Fregas wrote:
I respectfully disagree. Most things in this world, including
technology, are a matter of degrees. I think most security analysts
understand that they can't make any system "unhackable" but it's a
matter of balancing how much effort and inconvenience they put in
versus the payoff of being more secure. I don't think it's a simple
matter of "Stored procedures are secure, dynamic sql is not".
Sadly, for most pro/con discussions it IS a matter of procs -> good,
dyn. sql -> bad, and security, and whatnot, is dragged into this. Of
course it's a matter of balance, but I've seen more than one person in
this thread claim procedures are more secure and dyn. sql is not, without
giving any example of how to achieve that.

Using generated crud procs or handwritten crud procs is NOT secure, not
even MORE secure than dyn. sql. Furthermore, with the options you have
in modern databases, you can make your db as secure as you want, even
with dyn. sql. The theoretical options you have with procs, where you
offer a small set of procs which have a fine-grained set of checks to
see if the caller is actually legit, are nice but also limit the
actual app's structure on top of it. And besides that: is that level
of security also in place in the BL tier?
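To illustrate with a made-up search proc (the names and columns are only for
the example): a proc which concatenates its parameter into the statement is
just as injectable as concatenated dynamic sql sent from the client, while
the parameterized variant is not:

  -- vulnerable: the parameter value becomes part of the statement text
  CREATE PROCEDURE dbo.pr_FindCustomers_Bad
      @name varchar(100)
  AS
      EXEC('SELECT * FROM dbo.Customers WHERE CompanyName LIKE ''' + @name + '%''')
  GO

  -- not injectable: the value stays a parameter, never part of the text
  CREATE PROCEDURE dbo.pr_FindCustomers_Ok
      @name varchar(100)
  AS
      EXEC sp_executesql
          N'SELECT * FROM dbo.Customers WHERE CompanyName LIKE @name + ''%''',
          N'@name varchar(100)',
          @name = @name
  GO

So 'it's a proc' in itself buys you nothing; it's the parameterization that
closes the injection hole, and you get exactly that with parameterized
dynamic sql sent from the client as well.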
It depends on a lot of factors. On the whole, I think you have more
security options with stored procedures, but that certainly doesn't
make your database "completely secure" whatever that is. You're
looking at it as a black and white question, and few things fit into
neat categories like that.
true, I placed it as a black/white situation, though that's how the
pro-proc claims were made in this thread, without any example, proof of
concept etc.

The pro-proc people also forget why procedures are often chosen: DBAs
who write the procs become way more important for the organization; THEY
call the shots, not vice versa. DB vendors also want people to write
procedures instead of dyn. sql: it ties an app to the db and makes
it very hard (== a high price) to port it to a competitor's db, so they
promote procs as THE way to develop data-access code. But
data-access code doesn't live alone, it's part of an application.
exactly. And not only that: if a hacker is able to delete one
customer it's already not safe. If the customerID is numeric, like
an int, he might have to write a loop to delete all of them, but
that's a matter of time.


And that extra time might cause the hacker to get noticed by
intrusion detection, get logged off because the network is only open
for a specific window of time, etc. If the key is a guid or
something not easily "loopable" then it's going to be even harder.


Don't rely on it if your data is worth something.
Many of the recommendations Microsoft gives for locking down IIS, for
example, simply cause the hacker to have to put in a lot more work to
do the same damage, in the hope that the hacker will get caught or
reach the extent of their knowledge and give up.
Well, I think we have to make some distinction here. Let's call it
unauthorized access to the data. In that category fall:
managers/information workers with access who run ad-hoc reports on large
databases, bringing them to their knees; hackers who want to delete data for
fun; and hackers who want to steal your data for money.

The last category is the most fatal for you, and it also doesn't stop when
a hurdle pops up. Also, how would you stop these people with a
set of crud procs?

People tend to believe that as long as it is a bit safe, they're at
least a bit more secure, but that's not true. If a hacker is after
your data, he'll get it, as soon as there's one tiny hole he can
peek through.


Again, I disagree. For example, I have a Belkin router. That router
has firewall protection. It has basic security. Am I completely
safe? No, I'm not. Am I as safe as if I had a professional come in
and install a Checkpoint firewall? Not nearly. Am I safer than
if I had no firewall protection at all? Definitely.


Not a good comparison. A firewall is supposed to be locked tight. A
stored proc API is open by definition: it offers a set of functionality
to get to the data/act on the data. Now, to be able to do something
with the data, you need procs which do that something for you, correct?
So these procs are there. And thus the hacker has a nice proc to get
things done.

10 to 1 the app the hacker used to get access, for example a leaky
website, has access to that proc, otherwise, why add that proc if the
website/app can't use it?
Instead of using the DBA's knowledge (so add the DBA as a db
consultant to the team, so the DBA knows which tables are targeted
often, and because the DBA knows which tables are large, he/she can
advise on using subqueries instead of joins or vice versa, through the
the usage of the abstract DAL), the DBA and the developers aren't
working together, which is counterproductive.


I'm fortunate enough to have not experienced that, although I've
heard it from others. I've only worked in two places that had a DBA,
and in both of those places it was a pleasant relationship. We
created our own queries, procedures, views, etc. and the DBA helped
maintain indexes, gave us tips, did backups, etc. They stayed out of
our way, we stayed out of theirs and for the most part helped each
other when needed. My current DBA is great and my managers
understand that developers need to create data structures rather than
have them dictated to them.


That's a great example of how it should be :).
This is a bit off topic, but I have to say I dislike the whole idea of
a team project where certain members are working on one tier like the
database or the business objects, and other people working on other
tiers like the presentation. I'd rather break people up into
developing certain areas of the app and making them work from top to
bottom. Although it's good for a designer to create images and chop
up the HTML, you really need developers to consume objects and create
the UI, and that's easier if that developer has also created his
business objects and database structures (whether just tables, views
or stored procs), because he's working with his own API(s).
That won't work in large applications. The disadvantage of having your
team work on all tiers is that when times get tough, a developer
can decide to create a shortcut in the application, which will work at
first, but fall apart later on. When you have a team per tier, the GUI
team for example can't do anything with the db, they have to call the
BL API methods, and the BL team just focuses on writing the business
logic, not GUI oriented stuff. So you get cleaner code: the GUI code
isn't polluted with stuff that can hurt its scalability, and the BL
code isn't polluted with stuff related to the GUI which can hurt its
scalability as well. Also, the GUI can't take shortcuts to the DB but
has to call the BL tier.
I've seen the "build one tier at a time" approach to development fail
many times, even with only one or two developers. We had a guy at my
current job get fired because he fell so far behind. He created so
many objects for things and test scripts and so forth but never built
his UI. He was saving that for last. The problem was that the "test
scripts" didn't accurately test his API very well and so he had a lot
of code but no screens to show for it and no real way of knowing if
his API worked. I much prefer to build one piece of the app, usually
one or two screens at a time: design an object model, build my
tables, build my data access code, build my objects, integrate them
into the UI. This way I have something to show and I know that this
piece of functionality really works.
being fired because you wrote the UI last is very weird; then
management definitely didn't know what's important. Pretty pictures are
nice to show to customers, but it's the logic behind them that really
counts. I'd rather have a raw UI with great BL that works, than a shiny
UI with rushed BL that falls apart if I push too hard. ;)
So I build one "module" at a time, from top to bottom.


With smaller projects (where 'smaller' is subjective, so don't take
this the wrong way, it's not meant negatively) it's not always necessary to
have a team per tier, but it's IMHO very important to have specs per tier and
code the specs separately as if there is no other tier. Not only do you
then learn whether the tiers are well separated, you'll also avoid
the pitfall where you write code in the GUI which actually should have been in
the BL and vice versa. ;)

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Nov 21 '05 #43
Only if your stored procs are recompiling on execution, which they should
not. Also, by keeping it in stored procs, scalability is increased: you have a
single point of access, one proc to change, and you avoid having to change,
test, QA, and approve the app when data access needs change. I suppose if the
majority of the changes are in the app to start with, this wouldn't matter so
much, but having to search through a .NET app to find all the references is
not as easy as changing a single proc used for the purpose. A good rule of thumb is
generally to keep the database in the database layer and the application in
the application layer. Notice the "generally".
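To show what I mean by a single point of change (a made-up proc, just to
illustrate the idea): if the app only ever calls the proc, you can later
change where the data actually comes from without touching, retesting or
redeploying the app, as long as the parameters and the shape of the result
stay the same.

  CREATE PROCEDURE dbo.pr_GetCustomerOrders
      @customerID int
  AS
      SELECT OrderID, OrderDate, Total
      FROM dbo.Orders
      WHERE CustomerID = @customerID
  GO

  -- later, old orders get moved to an archive table; only the proc changes
  ALTER PROCEDURE dbo.pr_GetCustomerOrders
      @customerID int
  AS
      SELECT OrderID, OrderDate, Total
      FROM dbo.Orders
      WHERE CustomerID = @customerID
      UNION ALL
      SELECT OrderID, OrderDate, Total
      FROM dbo.OrdersArchive
      WHERE CustomerID = @customerID
  GO

The same kind of seam can of course also live in a view or in the app's data
access layer; it's a question of where you want it.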

Faster than what? parameterized dyn. queries? Nope. Both are compiled
at runtime equally and execution plans are cached for both.

Sqlinjection attacks are not possible when you use solely parameters
in your dyn. sql. Furthermore, just by using a proc doesn't mean your
dyn. search stored procedure which concatenates SQL internally is not
vulnerable for sql injection.

Frans

Ken
-------------------
Ken
-------------------
"John" <Jo**@nospam.infovis.co.uk> wrote in message
news:ON**************@TK2MSFTNGP12.phx.gbl...
Hi

When developing vb.bet winform apps bound to sql server datasource,
is it preferable to use SELECTs or stored procedure to read and
write data from/to SQL Server? Why?

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------

Nov 21 '05 #44
Ah, I see what you're saying, and where you're going with this, but if the memory
is sufficient, and your procs are the single gateways to particular business
functions, the proc won't age out of the proccache nearly as fast as a
parameterized query.
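That's something you can measure rather than argue about; on SQL Server 2000
the plan cache is visible through master.dbo.syscacheobjects (run it on a test
box, the view is server-wide, and the LIKE filter is just an example):

  SELECT cacheobjtype, objtype, usecounts, sql
  FROM master.dbo.syscacheobjects
  WHERE sql LIKE '%Customers%'
  ORDER BY usecounts DESC

  -- objtype 'Proc' rows are stored procedures, 'Prepared' rows are the
  -- parameterized statements; usecounts climbs whenever a plan is reused

Run the workload both ways and see which plans actually stick around and get
reused in your situation.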

"Frans Bouma [C# MVP]" wrote:
sh**********@yahoo.com wrote:
Hi john

They have there plans precompiled , so are faster.


Stored procs don't have their plans precompiled. At least not since
SQL Server 7. Only on DB2 are procedures really precompiled and stored
in a precompiled fashion (not always either, but most people opt for
that option, although it can hurt runtime optimization).

The reason for this is that only at runtime statistics are known and
only in the situation of a fresh query the optimizer can truly optimize
the plan based on the statistics. See BOL about execution plans.

As he's talking about SELECTs, the gain of a proc is not really an
issue (having processing on the server so you avoid pumping loads of
data to the client first and then back to the server).

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------

Nov 21 '05 #45
Now who would be doing that without a good reason?

"Miha Markic [MVP C#]" wrote:
the main performance advantage that stored procedures and triggers have in
SQL 2000 is that their SQL statements are always the same, therefore the
relational engine matches them with any existing execution plans


Not exactly; nothing prohibits you from crafting dynamic sql within a stored
procedure.

--
Miha Markic [MVP C#] - RightHand .NET consulting & development
www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/
SLODUG - Slovene Developer Users Group www.codezone-si.info

Nov 21 '05 #46
