
architecture question

Jon
I'm working on a project and the lead developer wants to put all of the
business logic on the database side. He wants the proc to do all of the
logic (create XML on the fly, parse out data, etc.) and then return the
final result to the business layer. I pushed back and said that the business
layer should do all of the logic: just have SQL return the data needed and
then have the code do all of the XML creation, data parsing, etc. I was
told I was wrong and that it's not best practice.

Now, this app can be hit by a total of 100-500 users a day, and having
the db do all of the processing can cause a performance hit. Then he
mentioned something about using a temp table on the database side: do a
select, insert the results into a temp table, then select from the temp
table and do the business logic there. I pushed back on that as well.

Now, I'm not a DBA but a developer, and I've seen DBs take performance hits
when doing logic like this on the database side. So my question is: which
would be the best solution here?
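
To make the two options concrete, here is roughly the shape of the code-side
approach I'm arguing for: SQL just returns the rows, and the business layer
builds the XML. This is only a sketch; the table and column names are made up
for illustration.

using System.Data.SqlClient;
using System.Text;
using System.Xml;

class OrderReportBuilder
{
    // SQL returns plain rows; the business layer shapes them into XML.
    public static string BuildOrdersXml(string connString)
    {
        StringBuilder sb = new StringBuilder();
        XmlWriterSettings settings = new XmlWriterSettings();
        settings.Indent = true;
        using (XmlWriter writer = XmlWriter.Create(sb, settings))
        using (SqlConnection conn = new SqlConnection(connString))
        {
            SqlCommand cmd = new SqlCommand(
                "SELECT Id, CustomerName, Total FROM Orders", conn);
            conn.Open();
            writer.WriteStartElement("Orders");
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    writer.WriteStartElement("Order");
                    writer.WriteAttributeString("id", dr.GetInt32(0).ToString());
                    writer.WriteElementString("Customer", dr.GetString(1));
                    writer.WriteElementString("Total", dr.GetDecimal(2).ToString());
                    writer.WriteEndElement();
                }
            }
            writer.WriteEndElement();
        }
        return sb.ToString();
    }
}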
Jul 6 '07
GlennDoten wrote:
Arne Vajhøj wrote:
>GlennDoten wrote:
>>Arne Vajhøj wrote:
There can be both performance and multiple application support reasons
to put logic in SP's in the database.

I would say that this is an extremely rare design avenue to take.
>>Some sort of code (in this forum most likely C#) must call the stored
procedures. At worst, business logic may be placed into those methods
but I cannot think of a single reason why business logic would ever
have to go any "lower" than that (i.e., right in a sproc). And even
putting business logic into the data access layer like this is
practically as bad an idea as putting it right in the sproc, for the
same reason that some sort of code (again, C#) would need to call
into the data access layer and business logic should really never go
any lower than that. With models like this you still have full
multiple application support.

You are making an assumption that all the applications are made
in the same language/framework. That is not always the case.

Yes, I am. That's why I clearly called out the fact that this is the C#
forum so, yes, some assumptions are being made.
It is a very valid assumption, considering the forum, that C# is
used.

Assuming that nothing other than .NET is used is not a very
realistic assumption.
>>And I would counter the performance claim (in the general sense,
anyhow) by pointing out that a database probably needs to be
denormalized to some extent to make it faster way before anyone
should ever put business logic into a sproc.

If denormalization (or, more nicely, materialized views) helps the
problem, then it is unlikely that the use of SP's will help, and vice versa.
The one is a query problem; the other is primarily a tier-crossing
overhead problem.

My experience with many databases is that when you start to think you
need to put business logic into sprocs the first thing to look at is an
overly normalized database. A highly normalized database can easily
cause performance problems. But it's always better to denormalize a bit
than to put any business logic into sprocs.
Two different problems.
Views wouldn't help an overly normalized database. But I'm not sure what
you mean by that point.
I did not say "view" - I said "materializ ed view".

And it is a common remedy for join performance. Sligthly more
advanced than just denormalising the database.

(which is called something else in SQLServer but the concept
is wellknown)
>>>Whether such reasons exist in this case or whether there are good
reasons not to do so is not possible to say based on the
information present.

Yes, each application is different. But I do think there are some
pretty basic guiding principles that should be adhered to, or in this
case principles that should be completely avoided.

Why do you think MS put the ability to write SP's in C# in 2005?

I think that was a marketing ploy and nothing more. No one I've worked
with thinks that writing sprocs in C# makes any sense, mostly because
the performance is horrible. Word is that's what the SQL team thinks
too. I can see creating and sharing C# user-defined types in SQL Server,
but that's about it with regards to CLR integration into SQL Server.
Oracle, Sybase, DB2 etc. have SP's in Java.

Rather surprising that everybody has it if no one uses it.
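
For reference, a minimal SQLCLR procedure in C# looks roughly like this.
Just a sketch; the table, columns and names are invented.

using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public class StoredProcedures
{
    // Deployed with CREATE ASSEMBLY and CREATE PROCEDURE ... EXTERNAL NAME.
    [SqlProcedure]
    public static void GetOrdersForCustomer(int customerId)
    {
        // The context connection runs inside the caller's session.
        using (SqlConnection conn = new SqlConnection("context connection=true"))
        {
            SqlCommand cmd = new SqlCommand(
                "SELECT Id, Total FROM Orders WHERE CustomerId = @cid", conn);
            cmd.Parameters.AddWithValue("@cid", customerId);
            conn.Open();
            // Stream the result set back to the caller.
            SqlContext.Pipe.ExecuteAndSend(cmd);
        }
    }
}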

Arne
Aug 27 '07 #21
GlennDoten wrote:
Arne Vajhøj wrote:
>Enkidu wrote:
>>Arne Vajhøj wrote:
Generally it is expected that:
- SP's perform better
- all logic in code scales better
>>You say "SP's perform better". That's a sweeping statement and more
often than not might be wrong. It needs skill and a detailed
knowledge of the database and how it stores data to make a SP perform
better than a simple (or even complex) query. I'm sceptical.

There is no reason to believe a single-statement SP is faster, but
there are good reasons to believe a multi-statement SP is faster.

But, um, yes, single-statement sprocs are indeed faster than their
inline SQL counterparts. Every time. Enkidu, there's no skill involved
there. If you time running, say, a single SELECT run via inline SQL and
the same SELECT run via a sproc then sproc will always be faster. A lot
faster? Not really. But faster nonetheless.
The performance benefit of SP's for a single statement is usually
very small.

For obvious reasons - the exact same things are done.

For multi-statement SP's the performance benefit can be
rather big.

Again for obvious reasons: roundtrips between app and
db are eliminated.
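
A rough sketch of the difference (proc and table names invented): driving the
statements from the app costs one roundtrip per statement, while a single SP
call runs them all server side.

using System;
using System.Data;
using System.Data.SqlClient;

class RoundtripSketch
{
    // Variant 1: the app drives the logic - one roundtrip per statement.
    static void ArchiveOrdersFromApp(SqlConnection conn, int[] orderIds)
    {
        // conn is assumed to be open already
        foreach (int id in orderIds)
        {
            SqlCommand copy = new SqlCommand(
                "INSERT INTO OrdersArchive SELECT * FROM Orders WHERE Id = @id", conn);
            copy.Parameters.AddWithValue("@id", id);
            copy.ExecuteNonQuery();   // roundtrip

            SqlCommand del = new SqlCommand("DELETE FROM Orders WHERE Id = @id", conn);
            del.Parameters.AddWithValue("@id", id);
            del.ExecuteNonQuery();    // roundtrip
        }
    }

    // Variant 2: one SP call - all statements run server side.
    static void ArchiveOrdersViaSp(SqlConnection conn, DateTime olderThan)
    {
        SqlCommand cmd = new SqlCommand("archive_old_orders", conn);
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@olderThan", olderThan);
        cmd.ExecuteNonQuery();        // single roundtrip
    }
}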

Arne
Aug 27 '07 #22
Enkidu wrote:
You say "SP's perform better". That's a sweeping statement and more
often than not might be wrong. It needs skill and a detailed knowledge
of the database and how it stores data to make a SP perform better than
a simple (or even complex) query. I'm sceptical.
Usually you would use the exact same SQL in the SP and non-SP
solution, so the extra skill required for a SP solution
compared to a non-SP solution is: zero.

Arne

Aug 27 '07 #23
On Aug 20, 3:59 am, GlennDoten <gdo...@gmail.com> wrote:
But, um, yes, single-statement sprocs are indeed faster than their
inline SQL counterparts. Every time. Enkidu, there's no skill involved
there. If you time running, say, a single SELECT run via inline SQL and
the same SELECT run via a sproc then sproc will always be faster. A lot
faster? Not really. But faster nonetheless.
Glenn, that's a sweeping statement that's no longer always true. SQL
Server 2005 does a much better job reusing execution plans for
parameterized statements, even when the parameter values change
between queries. Here's MSDN, under the sp_executesql entry:

<MSDN>
In earlier versions of SQL Server, the only way to be able to reuse
execution plans is to define the Transact-SQL statements as a stored
procedure and have the application execute the stored procedure. This
generates additional administrative overhead for the applications.
Using sp_executesql can help reduce this overhead while still letting
SQL Server reuse execution plans. sp_executesql can be used instead of
stored procedures when executing a Transact-SQL statement several
times, when the only variation is in the parameter values supplied to
the Transact-SQL statement. Because the Transact-SQL statements
themselves remain constant and only the parameter values change, the
SQL Server query optimizer is likely to reuse the execution plan it
generates for the first execution.
</MSDN>

I put together a little benchmark to compare a simple query with its
equivalent stored procedure. I ran both 10,000 times. Here's what I
saw:

Stored procedure 10000 iterations. Elapsed time: 00:00:09.9843111
Statement 10000 iterations. Elapsed time: 00:00:09.9530613

(To be fair, sometimes the stored procedure would come out ahead;
they were always within half a percent of each other.)

Here's my source:

using System;
using System.Data.SqlClient;
using System.Data;

namespace StatementVersusSP
{
    class Program
    {
        static string server = "XXXX";
        static string database = "XXXX";
        static string connString = "Data Source={0};Initial Catalog={1};Integrated Security=True";

        static void Main(string[] args)
        {
            Compare(10000);
            Console.ReadLine();
        }

        static private void Compare(int iterations)
        {
            // preload, so both paths have cached plans before timing starts
            for (int i = 0; i < 10; i++)
            {
                using (SqlConnection conn = new SqlConnection(String.Format(connString, server, database)))
                {
                    QueryWithStoredProcedure(conn, i);
                }
            }

            for (int i = 0; i < 10; i++)
            {
                using (SqlConnection conn = new SqlConnection(String.Format(connString, server, database)))
                {
                    QueryWithStatement(conn, i);
                }
            }
            // end preload

            DateTime start;
            DateTime end;

            start = DateTime.Now;
            for (int i = 0; i < iterations; i++)
            {
                using (SqlConnection conn = new SqlConnection(String.Format(connString, server, database)))
                {
                    QueryWithStoredProcedure(conn, i);
                }
            }
            end = DateTime.Now;
            Console.WriteLine(String.Format("Stored procedure {0} iterations. Elapsed time: {1}", iterations, end - start));

            start = DateTime.Now;
            for (int i = 0; i < iterations; i++)
            {
                using (SqlConnection conn = new SqlConnection(String.Format(connString, server, database)))
                {
                    QueryWithStatement(conn, i);
                }
            }
            end = DateTime.Now;
            Console.WriteLine(String.Format("Statement {0} iterations. Elapsed time: {1}", iterations, end - start));
        }

        private static void QueryWithStoredProcedure(SqlConnection conn, int i)
        {
            // defined as:
            //   create proc test_sp
            //     @id int
            //   as
            //     SELECT * from TEST WHERE ID = @id
            SqlCommand command = new SqlCommand("test_sp", conn);
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add(new SqlParameter("@id", i));
            conn.Open();
            using (IDataReader dr = command.ExecuteReader())
            {
                dr.Read();
            }
        }

        private static void QueryWithStatement(SqlConnection conn, int i)
        {
            SqlCommand command = new SqlCommand("SELECT * FROM Test WHERE ID = @id", conn);
            command.CommandType = CommandType.Text;
            command.Parameters.Add(new SqlParameter("@id", i));
            conn.Open();
            using (IDataReader dr = command.ExecuteReader())
            {
                dr.Read();
            }
        }
    }
}

Michael

Aug 28 '07 #24
Finally!

Due to maintenance reasons sp's are a "bad" option. IMHO.

However, unfortunately it doesn't apply to sequences of statements, for
example, when you have to do complex joins and use intermediary tables and
(maybe) updates and inserts. Probably caching issues or space for
improvement.

Somehow, on my SQL 2005 Express sp's are somewhat better in general.
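
To show the kind of sequence I mean, here is a rough sketch (invented table
and column names) of a statement sequence with an intermediate temp table,
sent from C# as a single batch; an SP body would contain the same statements.

using System;
using System.Data.SqlClient;

class TempTableBatch
{
    // The whole sequence runs server side in one roundtrip; only the final
    // result set comes back to the application.
    static void RunReport(SqlConnection conn, decimal minTotal)
    {
        // conn is assumed to be open already
        SqlCommand cmd = new SqlCommand(
            "SELECT Id, CustomerName, Total " +
            "INTO #big_orders FROM Orders WHERE Total > @min; " +
            "UPDATE #big_orders SET Total = Total * 1.1; " +   // some intermediate step
            "SELECT * FROM #big_orders ORDER BY Total DESC;", conn);
        cmd.Parameters.AddWithValue("@min", minTotal);

        using (SqlDataReader dr = cmd.ExecuteReader())
        {
            while (dr.Read())
            {
                Console.WriteLine("{0}: {1}", dr["CustomerName"], dr["Total"]);
            }
        }
    }
}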
<mp*******@gmail.com> wrote in message
news:11**********************@l22g2000prc.googlegroups.com...
[snip: Michael's benchmark post quoted in full; see above]

Aug 28 '07 #25

