Bytes IT Community

Passing Business Objects through nTier Web App

Hi all,

I am hoping that someone with some experience developing nTier apps can give
me some advice here.

I am writing an nTier web app that began with a Data Access Layer (DAL),
Business Logic Layer (BLL) and User Interface Layer (UIL).

The problem I found with this was circular referencing...

My objects would be defined in the BLL, so let's say for example that I want
to instantiate a new BLL.Customer object in the UIL, and then run
Customer.AddCustomer() which would in turn pass the object into the DAL,
let's call this method DAL.AddCustomer(BLL.Customer myCustomer) which would
insert into the DB.

The problem is that the BLL needs to reference the DAL and the DAL needs to
reference the BLL (to receive the custom business object), hence a circular
referencing error. I understand that I could turn this custom object into
some sort of generic object[] or collection and pass that instead, or
alternatively pass the field values one by one (not practical with
10+ values).

What I did was to create a 4th 'vertical' layer which I called the ORL
(Object Reference Layer), the purpose of which is to allow all other layers
to reference the same objects so they can be passed between themselves
without issues. The drawback is that for this to work properly you need to
have the objects themselves defined in the ORL, but the methods defined
statically in the BLL.

My question is this...

Is this good programming?

Obviously it would be ideal to have the object constructor and instance
methods declared in the same class, but I can't seem to get this to work
effectively any other way.

I would appreciate any advice.

- Stu
Nov 17 '05 #1
25 Replies


> The problem is that the BLL needs to reference the DAL and the DAL needs to reference the BLL (to receive the custom business object), hence a circular referencing error.


That doesn't sound right. The bottom layer shouldn't reference the top
layer. So the DAL should not reference the BLL.

Just like in networking protocols. TCP is built on top of IP. It would
break the layer separation if IP referenced a TCP property.

My advice would be to examine why the DAL references the BLL, and redesign
the DAL to remove the reference.

Greetings,
Wessel
Nov 17 '05 #2

Thanks for your reply Wessel.

The reason the DAL needs to reference the BLL is because it needs to know
about the object it's being passed.

For example, if I have a Customer object in the BLL and I pass it as a
parameter in DAL.UpdateCustomer(BLL.Customer myCustomer), the DAL needs to
know what sort of object a 'Customer' is, and so I need to reference the BLL
layer from the DAL layer.

If I didn't do this, I would not be able to pass 'Customer' as a parameter
between layers; I would instead need to either pass the individual
properties one by one, or use another type of object that the DAL already
knows about (e.g. an object array).

"Wessel Troost" <no*****@like.the.sun> wrote in message
news:op.sutprqbkf3yrl7@asbel...
[snip]

Nov 17 '05 #3

Hi Stuart,
> For example if I have a Customer object in the BLL and I pass it as a parameter in DAL.UpdateCustomer(BLL.Customer myCustomer), the DAL needs to know what sort of object a 'Customer' is, and so I need to reference the BLL layer from the DAL layer.

Well, so why don't you move the definition of Customer from the BLL to the
DAL? That seems like a proper place to put it in any case.

Good luck,
Wessel
Nov 17 '05 #4

Hi Wessel,

Thanks for the input.

I see what you're saying, but I would tend to think it needs to stay in the
BLL.

As I understand it, the purpose of the DAL is simply to separate the
datasource interactions from the actual business logic of the application.
So for example, if I decided to move from SQL 2000 to MySQL in the future, I
would only need to update the DAL, not rewrite the entire thing.

Also, if I was to move the Customer object into the DAL, then I would really
need to move all my other business objects, since virtually all of them
interact with a datasource in some way; and obviously if I did that then I
wouldn't have a BLL anymore - the DAL & BLL would merge and it would become a
2 tier app.

The 3 Tier architecture seems to be used a lot, so I would think that someone
would have come across this issue before.

- Stu

"Wessel Troost" <no*****@like.the.sun> wrote in message
news:op.sutsvjttf3yrl7@asbel...
[snip]

Nov 17 '05 #5

Hi,

I have used a different approach. My "DAL" has no knowledge of the BLL;
all it cares about is interacting with the DB. It receives a SqlCommand from
the BLL, executes it, and returns the values.

It's in the BLL where each object (such as Customer) creates its commands and
then uses the DAL to have them executed.

This works in my situation because I know this system will ALWAYS use a SQL DB.

If you want more independence it gets complex; you could have an
abstract factory to create the DAL that interacts with the specific DB
back-end.

Alternatively, you could create abstract classes/interfaces in a separate
project, then have both the BLL and the DAL reference it; this way you avoid
the circular references.
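A minimal sketch of this split, assuming SQL Server via System.Data.SqlClient (the connection string, table and method names are illustrative):

```csharp
using System.Data.SqlClient;

// The DAL knows nothing about business types; it only executes the
// commands it is handed.
public static class Db
{
    // Illustrative connection string.
    private const string ConnString =
        "Server=.;Database=Shop;Integrated Security=true";

    public static int ExecuteNonQuery(SqlCommand cmd)
    {
        using (SqlConnection conn = new SqlConnection(ConnString))
        {
            cmd.Connection = conn;
            conn.Open();
            return cmd.ExecuteNonQuery();
        }
    }
}

// In the BLL, each business object builds its own command and hands it over.
public class Customer
{
    public string Name;

    public SqlCommand BuildInsertCommand()
    {
        SqlCommand cmd = new SqlCommand(
            "INSERT INTO Customers (Name) VALUES (@name)");
        cmd.Parameters.AddWithValue("@name", Name);
        return cmd;
    }

    public void Insert()
    {
        Db.ExecuteNonQuery(BuildInsertCommand());
    }
}
```

Only Db ever opens a connection; the BLL never touches the database directly.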
cheers,

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:Rs******************@news-server.bigpond.net.au...
[snip]


Nov 17 '05 #6

Thanks Ignacio,

It's not so much that I think I will ever move from SQL Server; I am really
just trying to enforce a complete separation of data access and business
logic, and I would think that by creating SqlCommands within the BLL you are
getting very close to removing the DAL altogether.

Like I said, I've no real interest in accessing multiple DBs, so I think an
abstract factory would be overkill. However, the separate project with
classes that can be referenced by both the DAL and the BLL seems like what I
have already in the form of an ORL (Object Reference Layer).

I'm really just curious to know if this is best practice considering this
scenario? The reason is that by using this technique I can't use instance
methods, so effectively my classes & properties are defined in the ORL and
the methods are defined (statically) in the BLL. It's a bit odd programming
in this way, but it works.

- Stu

"Ignacio Machin ( .NET/ C# MVP )" <ignacio.machin AT dot.state.fl.us> wrote
in message news:Ov**************@TK2MSFTNGP09.phx.gbl...
[snip]



Nov 17 '05 #7

Hi Stuart,
> Also, if I was to move the Customer object into the DAL, then I would


Well, I know two kinds of DAL: handwritten and generated. Generation
enforces the separation-of-layers rule; you can't generate objects the
database doesn't know about. So I assume you're using a handwritten DAL.

Usually in such a scenario, the Customer class would correspond to a
database table called (something like) Customers. The DAL would then
contain a class named Customer, with optional operations like
UpdateCustomer() and the like.

However, you say your Customer class resides in the BLL. So I'm curious:
what does your DAL contain?

Greetings,
Wessel
Nov 17 '05 #8


Typically, a data access tier is meant to encapsulate the access to
persistent storage such that there is no implementation-dependent code
within the BLL. It's often little more than a thin wrapper on a set of
stored procedures. It shouldn't really know anything at all about your
domain model classes.

The problem you've got is where to put the mapping code to connect the
domain model classes to the data access logic. One option is a separate
mapping library which knows about the data access layer, knows about the
domain model, and mediates between them.

http://www.martinfowler.com/eaaCatalog/dataMapper.html

A less clean but simpler solution [simpler for small implementations,
I've found that a mapping-based architecture simplifies large / complex
systems] is to access the DAL directly from within the domain classes.
So, you write a DAL with methods which take simple parameters and return
DataTables and/or DataSets. The code mapping the data to the object is
then encapsulated within the domain model class, which is nice, but
knows a little too much about the underlying storage medium, which is
not nice.
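A minimal sketch of the mapper idea (names are illustrative): the DAL hands back plain DataTables, and the mapper is the only code that knows both the domain class and the shape of the data.

```csharp
using System.Data;

public class Customer
{
    public int Id;
    public string Name;
}

// The mapper mediates between the data access layer's DataTables and the
// domain model, so neither side needs to reference the other.
public static class CustomerMapper
{
    public static Customer FromRow(DataRow row)
    {
        Customer c = new Customer();
        c.Id = (int)row["Id"];
        c.Name = (string)row["Name"];
        return c;
    }
}
```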

--
Steve Walker
Nov 17 '05 #9

Hi Wessel,

My Customer class used to reside in the BLL, but in order for the DAL to
access the Customer object I needed to move it into another layer which I
call an Object Reference Layer (ORL).

Now that Customer is in the ORL (think of it as a vertical layer rather than
a traditional horizontal layer) I am able to access the object from every
other layer (UIL, BLL, DAL) and I can pass the object between these layers.

This saves me from declaring the same class multiple times. If I was to
declare the Customer class in the DAL, then although I could access the
object from the BLL, I could not access it from the UIL (User Interface
Layer).

However, using this technique, I cannot use instance methods. Therefore all
of my methods are static, and in the BLL my methods are used for enforcing
business rules as well as passing through to the DAL from the UIL, e.g.

// in the BLL
public static bool InsertCustomer(ORL.Customer myCustomer)
{
    return DAL.InsertCustomer(myCustomer);
}

In the DAL, all of my methods are static and they perform InsertCustomer,
UpdateCustomer, DeleteCustomer, etc actions, and they are always called from
the BLL, never directly from the UIL.

- Stu

"Wessel Troost" <no*****@like.the.sun> wrote in message
news:op.sutznkuof3yrl7@asbel...
[snip]

Nov 17 '05 #10

Hi Steve,

Thanks for the response.

I think your suggestion of writing the DAL as 'stored procedure' wrappers
that accept a simple input and return a DataSet or DataTable is pretty
much what I have aimed to do.

The important thing for me is to be able to pass the actual business object
as a single parameter through to the DAL, and I have achieved this by
creating an Object Reference Layer.

At this point I think I have what I want: the DAL knows nothing about the
BLL, but it knows about the ORL, so it can accept a Customer object.

"Steve Walker" <st***@otolith.demon.co.uk> wrote in message
news:W$**************@otolith.demon.co.uk...

[snip]

Nov 17 '05 #11

Hi Stuart,

Personally, I use strongly-typed datasets as my business objects. I have a
project that only contains the strongly-typed datasets, and both the BLL
project and the DAL project can then reference the datasets project. The
UIL project also references the strongly-typed datasets project, which makes
for easy data binding.

HTH,

Mike Rodriguez

"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:ID*****************@news-server.bigpond.net.au...
[snip]

Nov 17 '05 #12

I hadn't come across this article yet.

Thanks very much Nigel.

:)

"Nigel Norris" <no****@nospam.com> wrote in message
news:OB**************@TK2MSFTNGP09.phx.gbl...
Stuart,

Have you read Microsoft's writings on this topic?

"Designing Data Tier Components and Passing Data Through Tiers"

See:
http://msdn.microsoft.com/library/de...tml/boagag.asp

It may leave you more confused, mind you, because it's not entirely clear
in answering your question.

-------
Nigel Norris

"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:ID*****************@news-server.bigpond.net.au...
[snip]


Nov 17 '05 #13

Hi Michael,

Sounds like you're doing the same thing I am, only using datasets rather than
business objects. I would have thought that your application would take a
massive performance hit by using datasets in this way, especially if you use
a lot of objects. I suppose you could help that with caching (if it's an option).

- Stu

"Michael Rodriguez" <mi**@nospamforme.com> wrote in message
news:OX*************@TK2MSFTNGP09.phx.gbl...
[snip]
Nov 17 '05 #14

Hey,

Just talking from experience, I have achieved the separation of business
objects (BLL) from database-specific calls (DAL) by having mappers, as
some of the people here have talked about.

You can read my comments about the design of my blog, which currently can
run on MySql and Sql Server, by going here:
http://www.laimisnet.com/default.aspx?entryid=18
Anyway, the design fit my needs, and I can add another storage
mechanism without modifying the BLL.
Nov 17 '05 #15

Very interesting reading Laimis.

I am curious, is the mapper implemented as a separate project in the
solution?

Obviously this is a very good solution for anyone wanting a great deal of
flexibility in which data store is used.

"laimis" <si*****@iit.edu> wrote in message
news:O1*************@TK2MSFTNGP09.phx.gbl...
[snip]

Nov 17 '05 #16

Stuart Hilditch wrote:
[snip]

Google Dependency Inversion Principle and Interface Segregation Principle.

I am by no means an expert, but in the past I have created another assembly
containing an interface that both the DAL and BLL know about.
This interface pretty much only contains properties for all the db table
columns.
Thus, the DAL classes know about the interface and can interact with
it, and the BLL classes implement the interface.
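A minimal sketch of that arrangement (all names are illustrative, and the DAL body is stubbed so the sketch stays self-contained):

```csharp
// Shared assembly: only the property contract.
public interface ICustomer
{
    int Id { get; }
    string Name { get; }
}

// BLL: the business class implements the shared interface and keeps its
// behaviour as instance methods.
public class Customer : ICustomer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// DAL: works purely against the interface, so it never references the BLL.
public static class CustomerData
{
    public static bool InsertCustomer(ICustomer c)
    {
        // A real DAL would issue a parameterised INSERT using c.Id and c.Name.
        return c != null && !string.IsNullOrEmpty(c.Name);
    }
}
```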

HTH
JB
Nov 17 '05 #17

I did originally use interfaces in the DAL, but it resulted in so much messy
code (some of my objects have 25+ properties) that I decided it was far
cleaner to simply reference the object from the ORL.

- Stu

"John B" <jb******@yahoo.com> wrote in message
news:42***********************@news.sunsite.dk...
[snip]

Nov 17 '05 #18

Stuart,

One thing you haven't given any indication of is the scale or requirements
for the project. In practice I think that the appropriate answer to tackling
this sort of problem depends on answers to questions such as:

- How big a project is this?
- How important is flexibility in future?
- Do you need to have the DAL as a full-blown tier (distributable), or is it
simply an internal software layer?
- What is your testing approach - does the DAL need to be replaceable for
testing?

I'd probably come up with very different answers for a 100 user one-off
project than for something that was going to be the key business application
for the next decade. I think all projects should have some sort of
'complexity budget' - where complexity in this case is measured as the ratio
of 'infrastructure' code to useful application logic. As you add structure
(to reduce complexity in your application logic) you add complexity in extra
code. So it's a tradeoff of whether the benefit outweighs the cost.

So for a really simple project where the DAL was an internal layer, I might
just break the layering rule and have the DAL create my business objects. I'd
keep some separation between the database logic and the 'mapping' component,
per laimis's ideas, but I wouldn't worry too much about making it perfect.

Personally I'd prefer this 'impure' layering to having all my business logic
in static methods - that's too high a price to pay. I want my business
objects to encapsulate data and behaviour, not separate them out for
artificial reasons.

If you want to stick to strict layering, there are two basic approaches:
- pass the data between layers in some simple shared structure, and copy in
the BL to the business objects
- use some form of inversion of control or factory class to pass the
necessary logic into the DAL to allow it to create business objects without
knowing about them

In the first case, use DataSets or custom-defined Data Transfer Objects.
This works well for remoting, as well.

The inversion of control solutions generally seem too complex for me - if
it's getting that complex then I'd be thinking of a full blown O/R mapper
instead of a DAL. However one simple variation that I've considered but not
used in practice is to use strongly typed datasets as the basis for the
business objects in the BL and pass in datasets to the DAL. In the style
'here's a dataset - please fill it for me'. The DAL works against simple
untyped dataset interfaces, but the dataset infrastructure creates the
appropriate strongly typed objects.
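A rough sketch of that variation, with hypothetical table, column, and type names; the DAL sees only the plain DataSet base class, while the caller supplies a typed dataset:

```csharp
using System.Data;
using System.Data.SqlClient;

// DAL - compiled without any reference to the BLL or to typed datasets.
public class CustomerDal
{
    private readonly string connectionString;  // supplied by configuration

    public CustomerDal(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public void FillCustomers(DataSet ds)
    {
        using (SqlConnection cn = new SqlConnection(connectionString))
        {
            SqlDataAdapter da = new SqlDataAdapter(
                "SELECT CustomerId, Name, Email FROM Customers", cn);
            // Fills the "Customers" table; if ds is a typed dataset,
            // its own infrastructure creates the strongly typed rows.
            da.Fill(ds, "Customers");
        }
    }
}

// BL - passes its strongly typed (designer-generated) dataset down.
// CustomersDataSet is a hypothetical generated typed dataset.
CustomersDataSet ds = new CustomersDataSet();
new CustomerDal(connectionString).FillCustomers(ds);
foreach (CustomersDataSet.CustomersRow row in ds.Customers)
{
    Console.WriteLine(row.Name);
}
```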

Good luck..

Nigel

"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:FJ******************@news-server.bigpond.net.au...
I did originally use interfaces in the DAL, but it resulted in so much
messy code (some of my objects have 25+ properties) that I decided it was
far cleaner to simply reference the object from the ORL.

- Stu

"John B" <jb******@yahoo.com> wrote in message
news:42***********************@news.sunsite.dk...
Stuart Hilditch wrote:
Hi all,

I am hoping that someone with some experience developing nTier apps can
give me some advice here.

I am writing an nTier web app that began with a Data Access Layer (DAL),
Business Logic Layer (BLL) and User Interface Layer (UIL).

The problem I found with this was circular referencing...

My objects would be defined in the BLL, so let's say for example that I
want to instantiate a new BLL.Customer object in the UIL, and then run
Customer.AddCustomer() which would in turn pass the object into the DAL,
let's call this method DAL.AddCustomer(BLL.Customer myCustomer) which
would insert into the DB.

The problem is that the BLL needs to reference the DAL and the DAL needs
to reference the BLL (to receive the custom business object), hence a
circular referencing error. I understand that I could turn this custom
object into some sort of generic object[] or collection and pass it
then, or alternatively pass the method field values one by one (not
practical with 10+ values)

What I did was to create a 4th 'vertical' layer which I called the ORL
(Object Reference Layer), the purpose of which is to allow all other
layers to reference the same objects so they can be passed between
themselves without issues. The drawback is that for this to work
properly you need to have the objects themselves defined in the ORL, but
the methods defined statically in the BLL.

My question is this...

Is this good programming?

Obviously it would be ideal to have the object constructor and instance
methods declared in the same class, but I can't seem to get this to work
effectively any other way.

I would appreciate any advice.

- Stu

Google Dependency Inversion Principle and Interface Segregation
Principle.

I am by no means an expert but..
In the past I have created another assembly containing an interface that
both the DAL and BLL know about.
This interface pretty much only contains properties for all the db table
columns.
Thus, the DAL classes know about the interface and can interact with that
and the BLL classes implement this interface.
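That arrangement might look something like this (all names are illustrative):

```csharp
// Shared assembly - the only thing both layers reference.
public interface ICustomerData
{
    int Id { get; set; }
    string Name { get; set; }
    string Email { get; set; }
}

// BLL - the business object implements the shared interface,
// keeping data and behaviour together in one class.
public class Customer : ICustomerData
{
    private int id;
    private string name;
    private string email;

    public int Id { get { return id; } set { id = value; } }
    public string Name { get { return name; } set { name = value; } }
    public string Email { get { return email; } set { email = value; } }

    public void AddCustomer()
    {
        CustomerDal.AddCustomer(this);  // passes itself as ICustomerData
    }
}

// DAL - works against the interface, never the concrete BLL type,
// so there is no reference back to the BLL assembly.
public class CustomerDal
{
    public static void AddCustomer(ICustomerData c)
    {
        // ... INSERT using c.Id, c.Name, c.Email ...
    }
}
```

The cost Stuart mentions is real: with 25+ properties, both the interface and the implementing class must declare every one.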

HTH
JB


Nov 17 '05 #19

Thanks very much Nigel,

That's great feedback and it's given me something to think about. I do
agree that being limited to static methods is a price to pay. I think I
might start looking into transferring these objects from the BLL to the DAL
by way of a dataset; it sounds like a practical option.

Thanks again!

"Nigel Norris" <no****@nospam.com> wrote in message
news:OY**************@TK2MSFTNGP09.phx.gbl...
<snip>

Nov 17 '05 #20

"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:y1******************@news-server.bigpond.net.au...
Hi Michael,

Sounds like you're doing the same thing I am, only using datasets rather
than business objects. I would have thought that your application would
take a massive performance hit by using datasets in this way, especially if
you use a lot of objects. I suppose caching can help (if it's an option).

- Stu


Hi Stu,

Why would you think there would be a massive performance hit? Where do you
think that would occur? I know datasets can be slow if they are loaded with
thousands of records, but I'm designing my app to avoid scenarios like
that.

I chose to use datasets for several reasons:

1.) They pass easily through web services. You can pass custom types,
but it requires manually changing the reference.cs file each time.
2.) They have built-in support for remembering which rows get added,
edited, deleted, etc. This is especially true in a data grid. If you pass
your data object to a data grid, how do you know which rows got updated?
3.) They allow for very easy data binding on Windows forms.
4.) You can give custom sql scripts to a data adapter, then let it do all
of the data manipulation work for you. There is no need to loop through
each record manually and figure out whether it needs to be added, deleted,
etc.
5.) You can use DataSet1.GetChanges() to pass only the changes to your
web service and then to your data layer.
...
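Point 5 above might be used roughly like this (a sketch; the web service and the adapter-building helper are hypothetical):

```csharp
using System.Data;
using System.Data.SqlClient;

// Client side: extract only the added/modified/deleted rows.
DataSet changes = fullDataSet.GetChanges();
if (changes != null)   // GetChanges() returns null when nothing changed
{
    // Ship the much smaller delta through the web service...
    myWebService.SaveCustomers(changes);
    // ...and mark the local copy as clean once the save succeeds.
    fullDataSet.AcceptChanges();
}

// Server/data-layer side: let the adapter apply the delta row by row.
public void SaveCustomers(DataSet changes)
{
    // Hypothetical helper that wires up the adapter's
    // InsertCommand, UpdateCommand, and DeleteCommand.
    SqlDataAdapter da = BuildCustomerAdapter();
    da.Update(changes, "Customers");
}
```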

Thanks,

Mike
Nov 17 '05 #21

Hi Mike,

I would tend to think they were better in cases where you use thousands of
records.

I agree they are very handy, but there is a real instantiation &
marshalling cost incurred every time you create a new object.

From
http://msdn.microsoft.com/library/de...tml/BOAGag.asp

"High instantiation and marshalling costs. DataSets result in the creation
of several subobjects (DataTable, DataRow, and DataColumn), which means that
DataSets can take longer to instantiate and marshal than XML strings or
custom entity components. The relative performance of DataSets improves as
the amount of data increases, because the overhead of creating the internal
structure of the DataSet is less significant than the time it takes to
populate the DataSet with data."

I would not use them for individual objects, but for large sets of data they
are great.

When retrieving data from a datastore, a datareader is the way to go if
performance is a consideration. Check out this benchmark ...

http://www.devx.com/vb2themax/Articl...7/1954?pf=true
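A typical reader-to-entity loop, sketched here with a hypothetical connection string, column layout, and Customer class:

```csharp
using System.Collections;
using System.Data;
using System.Data.SqlClient;

ArrayList customers = new ArrayList();  // .NET 1.1-era; List<Customer> on 2.0
using (SqlConnection cn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT CustomerId, Name, Email FROM Customers", cn))
{
    cn.Open();
    // CloseConnection ties the connection's lifetime to the reader.
    using (SqlDataReader rdr =
        cmd.ExecuteReader(CommandBehavior.CloseConnection))
    {
        while (rdr.Read())
        {
            Customer c = new Customer();
            c.Id = rdr.GetInt32(0);
            c.Name = rdr.GetString(1);
            c.Email = rdr.GetString(2);
            customers.Add(c);
        }
    }
}
```

The forward-only, read-only reader avoids building the DataTable/DataRow structures a DataSet carries, which is where the benchmark's advantage comes from.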

- Stu

"Michael Rodriguez" <mi**@nospamforme.com> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
<snip>
Nov 17 '05 #22

Hi Stu,

I've read those benchmarks and they did give me some concern. However,
there's something I still don't understand about using custom entities to
store your data. If you bind a grid to your business object, how do you
know which rows get changed? How do you know which rows need to be added
and/or deleted?

Also, do you have to pass your business objects through a web service (I
do)? Web services aren't great for passing custom types like that...

Thanks,

Mike
"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:B0******************@news-server.bigpond.net.au...
<snip>

Nov 17 '05 #23

Michael Rodriguez wrote:
Hi Stu,

I've read those benchmarks and it did give me some concern. However,
there's something I still don't understand about using custom entities to
store your data. If you bind a grid to your business object, how do you
know which rows get changed? How do you know which rows need to be added
and/or deleted?
All that should be catered for in IBindingList.
I haven't actually implemented deletion, but have done adds.
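For what it's worth, on .NET 2.0 the simplest route to IBindingList is BindingList<T>, which implements it for you; a sketch (not John B's actual code, and assuming a Customer business object with a parameterless constructor):

```csharp
using System.ComponentModel;

// A grid bound to a BindingList<T> sees adds (and removes) because
// the list raises ListChanged through its IBindingList implementation.
BindingList<Customer> customers = new BindingList<Customer>();
customers.ListChanged += delegate(object sender, ListChangedEventArgs e)
{
    // e.ListChangedType reports ItemAdded, ItemChanged, ItemDeleted, etc.;
    // e.NewIndex locates the affected row.
    Console.WriteLine(e.ListChangedType + " at index " + e.NewIndex);
};
customers.Add(new Customer());  // raises ListChanged with ItemAdded
```

Per-property change notification (so edits to an existing row also raise ListChanged) additionally requires the business object to implement INotifyPropertyChanged.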
Also, do you have to pass your business objects through a web service (I
do)? Web services aren't great for passing custom types like that...

I'm not sure

<snip>

JB
Nov 17 '05 #24

Hi Mike,

I think it really depends on the context of the problem. Generally I use a
mix of business objects and datasets. I find that custom entities give me a
lot more control: you can use them as a collection, and there are a number
of interfaces that will give you more control over binding, etc.

I also use datasets when a large quantity of data is moving from my Data
Access Layer to my User Interface Layer relatively unchanged, but I would
usually implement caching in that case. Unfortunately I have not had any
experience with web services, but you can easily serialize business objects
so I imagine it can be done.
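Serializing a business object is straightforward with XmlSerializer, which is what ASP.NET web services use under the covers; for example (assuming a Customer class with public read/write properties and a parameterless constructor):

```csharp
using System.IO;
using System.Xml.Serialization;

XmlSerializer serializer = new XmlSerializer(typeof(Customer));

// Serialize the object's public state to an XML string.
StringWriter writer = new StringWriter();
serializer.Serialize(writer, myCustomer);
string xml = writer.ToString();

// Deserialize on the other side of the wire.
Customer roundTripped =
    (Customer)serializer.Deserialize(new StringReader(xml));
```

Note that XmlSerializer only carries public fields and properties, so any private state in the business object is lost in the round trip.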

Hope this helps.

- Stu

"Michael Rodriguez" <mi**@nospamforme.com> wrote in message
news:eo**************@TK2MSFTNGP15.phx.gbl...
<snip>


Nov 17 '05 #25

Stu,

It's true that business objects can be serialized and passed through a web
service. The problem is that the default proxy class does not know about
your custom type. This means every time you update your web reference, you
have to go into the Reference.cs file and add a reference to your custom
type. That was definitely the case for .NET 1.1. I haven't verified yet
whether or not that has changed for 2.0.

Mike

"Stuart Hilditch" <st*************@gmail.com> wrote in message
news:Tm******************@news-server.bigpond.net.au...
<snip>

Nov 17 '05 #26

This discussion thread is closed

Replies have been disabled for this discussion.