Bytes | Software Development & Data Engineering Community

Data/Business Object Tier Best Practices

I am developing a Windows Forms application in VB.NET that will use .NET
remoting to access the data tier classes.

A very simple way I have come up with is to create typed (.xsd) datasets.
For example, dsParts.xsd, which I include in the data tier. I then create a
class that looks like this:
Public Class CPart
    Inherits dsParts

    Public Sub New(ByVal iPartID As Integer)
        Dim cm As New OleDb.OleDbCommand
        cm.CommandType = CommandType.Text
        cm.CommandText = "Select * from tblParts where PartID=" & iPartID
        modData.FillDataTable(cm, Me.tblParts, ConnectionStrings.QASpec)
        ' FillDataTable is a common method where I pass in a command and a
        ' connection string; it then fills the passed table object (ByRef)
        ' with the results of the command.
        ' I could fill more than one data table here if this XML data schema
        ' had more than one table.
        ' I can now add more methods to CPart and override methods of the
        ' underlying dataset if required.
        ' CPart is a datasource which can be used in place of a standard
        ' dataset object, which is great for data binding.

        ' One thing I haven't got to yet is overriding or adding additional
        ' methods to the other table classes in the underlying base class;
        ' not sure how I will accomplish that part.
    End Sub
End Class
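
In case it helps, FillDataTable is roughly along these lines (a simplified
sketch, not the exact code; the real one has error handling):

```vbnet
' Simplified sketch of the shared helper module.
Public Module modData
    Public Sub FillDataTable(ByVal cm As OleDb.OleDbCommand, _
                             ByVal table As DataTable, _
                             ByVal connectionString As String)
        Dim cn As New OleDb.OleDbConnection(connectionString)
        Try
            cm.Connection = cn
            Dim da As New OleDb.OleDbDataAdapter(cm)
            ' Fill opens the connection if it is closed and appends the
            ' command's results into the passed-in table.
            da.Fill(table)
        Finally
            cn.Dispose()
        End Try
    End Sub
End Module
```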

To me this is a simple way of creating your data classes, because you can
create your XML schema easily by dragging tables from Server Explorer
directly onto the schema. Then, when you inherit from the XML data schema
(typed dataset), you get all of the table fields as properties in your class
by default.

Doing it any other way just seems like A LOT OF WORK. The other way would be
to create data classes and manually type in every field as a property. You
do not get databinding capability that way (though I hear there is a way to
make such classes bindable at runtime). One thing you definitely won't get
is design-time databinding; with the typed-dataset method above, we can bind
the typed datasets to our 3rd-party grid controls easily at design time.

Then, with your own data classes, you have to implement them in a
collection; for example, CParts and CPart would be two different classes.
With inheriting from a typed dataset, it just seems like a lot of this work
is done for you, and the project can be completed months earlier.

What do you guys think? Is this an accepted practice, or am I way off here?
Are there other alternatives? Pros/cons? I am looking for advice on this,
as I have to decide soon on the design of the data tier.

Thanks for your input.

D.

Nov 21 '05 #1
Whether to use typed DataSets or custom entity objects is a controversial
topic. My rule of thumb is to use typed DataSets when the situation calls
for it and to consider custom entity objects when appropriate. Most of the
time I opt for typed DataSets, because they can be more productive and a lot
of developers are used to programming in a relational model. Custom entity
classes and collections are useful when you have a lot of business rules
that you want to enforce on your data.

The only issue I have with your code is that I would consider factoring the
SQL statement out of the typed dataset class and moving it into a separate
class.

Some resources:
http://msdn.microsoft.com/asp.net/de...CustEntCls.asp
http://www.codeproject.com/dotnet/In...romDataSet.asp


Nov 21 '05 #2
CMM
Having developed entity objects for years to represent data, and having
carried that same ORM ideology into .NET for some time until I gave typed
datasets a chance, I can honestly say that ORM is a BIG WASTE OF TIME. Typed
datasets are huge time savers and provide all the benefits of custom
objects. Developers just have to lose some of their old practices, which
were never good ideas to begin with. You have to learn to separate business
rules and validation from the data object itself. One of the first things
old-school developers try to do is hijack the typed dataset, inherit some
class from it, and try to add all sorts of code to it. This makes your life
harder, as the dataset is recreated and your code changes are lost whenever
you use the very productive and useful designer to change the dataset.
Datasets are for data. Validation objects act on the dataset. Data access
objects act on the dataset. It's all very clean, manageable, and productive.

Also, the benefits of using typed datasets ripple into other things. If you
hesitated to use binding in .NET because of your experiences in VB6, and you
don't want to appear "lazy", you're losing out on another huge time saver.
Data binding in .NET is very good (once you master some of its weird
intricacies, namely the BindingContext/BindingManager stuff)! It should not
be dismissed.
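
For example, here's the kind of thing I mean (the form, control, and dataset
names here are hypothetical):

```vbnet
' Two controls bound to the same table, kept in sync automatically.
txtName.DataBindings.Add("Text", dsParts1, "tblParts.PartName")
grdParts.DataSource = dsParts1
grdParts.DataMember = "tblParts"

' The BindingManagerBase tracks the "current" row for everything bound to
' tblParts, so changing Position moves the grid and the textbox together.
Dim bm As BindingManagerBase = Me.BindingContext(dsParts1, "tblParts")
bm.Position += 1
```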

There are times when it's appropriate to use ORM, but for the most part it
is redundant and requires a huge development effort in exchange for
relatively minor advantages. If you have a huge development team that can
handle it, then maybe it's the way to go. But the benefits of typed datasets
are huge.

Just my 2c.


Nov 21 '05 #3
This little thread got me to go back and give another try to IDE-generated
typed datasets; you made them sound like the killers I thought they might be
back a few years ago.

Thing is, while they are neat and can jumpstart some coding, I still find
them (as the IDE generates them) unwieldy when used against a lot of
real-world tables.

The big thing everyone pushes is that typed datasets are better because
they're easier to read and so lend themselves more to OOPers... and I don't
see that myself.

I don't know about everyone else, but I often get tables that don't have the
most happy column names. I don't think I've ever seen a column named
"HomeAddressPartOne" or "ApartmentNumber"... in fact, I don't think I've
seen many "FirstName" column names over the years. I get more along the
lines of columns named by Unix guys, such as Fnm, Lnm, Adr1, and so on.

While you can figure them out in a lot of cases, several times I've been
told to figure out column contents by hitting another lookup table. Hey, I'm
all for better table design, but not all projects let you make new tables.
Maybe it's just that I've spent a lot of time on Oracle, and maybe SQL
Server DBAs always follow the Microsoft Access documentation style, with
clearly and obviously named columns having embedded spaces and such (that
was a joke).

But in the end, when I generate those typed datasets, I just have to go in
and manually change the interfaces if I really want to get the grail of
Humanly Comprehendable Objects.

Tell me that all this is moot and that I've just missed something in the
wizard... a place to simply tell the generator to use aliases and not mess
with them every time a schema is refreshed and no changes were found in the
base tables. That would be great; I'd love to hear about it.

The second thing is an oldie but a goodie that CMM mentioned: after changing
those properties manually, along comes a minor schema change (pretty common
during development), and with it comes the loss of all our manual interface
changes.

The third thing is that I used to read that typed datasets were somehow
faster performance-wise than vanilla datasets, but I've since read that that
really isn't the case, depending on how you code (here's a source; call up
the page and do a find for the word "faster":
http://bdn.borland.com/borcon2004/ar...,32284,00.html ).

In all, it's true that the up-front coding can be jumpstarted by using the
IDE to make an .xsd, but I still find that after you've done the brunt of
your own entity objects, you end up spending less time dealing with
schema-change problems, in that you simply add new properties and you're
done, without worrying over how much tedious re-tweaking you'll have to do
if someone else opens up the project and accidentally regenerates the .xsd.

As to the pain of binding custom objects and custom collections to GUIs:
CMM said that typed datasets and binding are easy enough "once you master
the intricacies" of BindingContext/BindingManager... the same can be said
for binding custom objects that aren't typed datasets. You can bind GUIs to
objects and to custom collections once you master some of the intricacies of
"complex" binding.

I'm not trying to start a fight; I also would just like to know which is
best in most cases, since I keep coming back to preferring my own object and
binding code to all those fragile lines generated by the freebie wizard.

Looking forward to being told that I'm wrong, I live to learn :)

robert smith
kirkland, wa
www.smithvoice.com


Nov 21 '05 #5
CMM
Oh, I totally agree with the column-name-mapping problem. .Getfirst_name()
sure is ugly. But you might be missing or misunderstanding some things...

1) You can use the TableMappings property of the DataAdapter to map database
column names to make them look however you want (so that the DB's
first_name maps to the dataset's FirstName field). You can access this via
the property editor... but it's not as pretty or easy to use as it should be
(I'd love to see a graphical implementation where I can map column names
visually using drag and drop). The point is: leave the DataSet's generated
code ALONE!!!! Jeez!
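
In code it looks roughly like this (the table and column names here are made
up for illustration, as is myConnectionString):

```vbnet
Dim cn As New OleDb.OleDbConnection(myConnectionString)
Dim da As New OleDb.OleDbDataAdapter("SELECT Fnm, Lnm FROM tblCust", cn)

' "Table" is the default source-table name that Fill produces; map it and
' its ugly columns onto the friendly names the typed dataset expects.
Dim map As System.Data.Common.DataTableMapping = _
    da.TableMappings.Add("Table", "Customers")
map.ColumnMappings.Add("Fnm", "FirstName")
map.ColumnMappings.Add("Lnm", "LastName")

Dim ds As New DataSet
da.Fill(ds)   ' ds.Tables("Customers") now exposes FirstName/LastName
```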

2) You have to unlearn what you have learned (Yoda quote). Use the
design-time-created DataAdapters... they're NOT just for WinForms; they're
totally applicable to the middle tier as well. You can host them in a
component or something. Let the designer create the SQL for you (if it can),
then go in and modify it to your heart's content. 80% of the work (table
mappings, filling the dataset) is done for you. Sometimes even 100%.

3) The typed dataset does not in ANY WAY have to look like your database
tables. With carefully crafted SELECT/UPDATE/INSERT statements you can get
away with almost anything. Your SELECT can return 100 fields... but your
UPDATE only has to work on a subset of them if it wants.

4) Just one more tip: discover the DataView. When using binding, I almost
always wrap a table in a DataView... you gain a whole bunch of new
functionality.
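
Something like this (table, column, and grid names are illustrative):

```vbnet
' Wrapping a table in a DataView adds filtering and sorting for free,
' without touching the underlying table.
Dim view As New DataView(ds.Tables("Parts"))
view.RowFilter = "Discontinued = false"   ' only show active parts
view.Sort = "PartName ASC"                ' sort for the bound grid
grdParts.DataSource = view                ' bind the view, not the table
```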

I am not saying Typed Datasets are perfect. There is a fundamental change in
thinking that you must undergo. It might not be for you. But, I know I've had
my fill of ORM. I hate it.

One more thing: there is no way in hell object binding is equal to
dataset/datatable binding. First off, every property in your class has to
have a correlating PropertyChanged event, or else you lose all sorts of
validation events. You also lose, AFAIK, the very useful RowError
functionality that is used by all DataGrids (including 3rd-party ones).
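
To show what I mean by the per-property event pattern, here's an
illustrative class (the names are made up); binding only notices edits to a
property if you also expose a matching <Name>Changed event:

```vbnet
Public Class Part
    Private _name As String

    ' Windows Forms binding looks for an event named <Property>Changed.
    Public Event NameChanged As EventHandler

    Public Property Name() As String
        Get
            Return _name
        End Get
        Set(ByVal value As String)
            If _name <> value Then
                _name = value
                ' Without this, bound controls never hear about the change.
                RaiseEvent NameChanged(Me, EventArgs.Empty)
            End If
        End Set
    End Property
End Class
```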

As for typed datasets being "slower": that's hogwash. It's one of those
things that, while theoretically true, would never have an effect in
real-world use. I myself don't like the way they serialize to XML (even
binary XML) over tiers... but this is something addressed in .NET 2.0.

"smith" wrote:
This little thread got me to go back and give another try to IDE-generated
Typed Datasets, you made them sound like the killers I thought that they
might be back a few years ago.

Thing is while they are neat and can jumpstart some coding I still find them
(as the IDE generates them) unweildy when used against a lot of real world
tables.

The big thing everyone pushes is that typed datasets are better because
they're easier to read and so lend themselves more to OOPers... and I don't
see that myself..

I don't know about everyone else but I often get tables that don't have the
most happy column names. I don't think I've ever seen a column named
"HomeAddressPar tOne", "ApartmentNumbe r" ... in fact I don't think I've see
many "FirstName" column names over the years. I get more along the lines
of cooumns named by Unix guys such as Fnm, Lnm, Adr1 and so on.

While you can figure them out in a lot of cases, several times I've been
told to figure out column contents by hitting another lookup table. Hey,
I'm all for better table design but not all projects let you make new
tables. Maybe it's just that I've spent a lot of time on Oracle and maybe
SqlServer DBAs always follow the Microsoft Access documentation style with
clearly and obviously named columns having embedded spaces and such (that
was a joke).

But in the end, when I generate those typed datasets I just have to go in
and manually change the interfaces if I really want to get the grail of
Humanly Comprehendable Objects.

Tell me that all this is moot and that I've just missed something in the
wizard... a place to simply tell the generator to use aliases and not much
with them every time a schema is refershed and no changes were found in the
base tables. That would be great, I'd love to hear about it.

The second thing is an oldie but a goodie that CMM mentioned: After
changing those properties manually, along somes a minor schema change
(pretty common during development) and with that comes the loss of all our
manual interface changes.

The thrid thing is that I used to read that typed datasets were somehow
faster performance-wise than vanilla datasets but I've since read that that
really isn't the case depending on how you code (here's a source, call up
the page and do a find for the word "faster"
http://bdn.borland.com/borcon2004/ar...,32284,00.html ).

In all, it's true that the up-front coding can be jumpstarted by using the
IDE to make an xsd, but still I find that after you've done the brunt of
your own entity objects you end up spending less time dealing with schema
change problems in that you simply add new properties and you're done
without worrying over how much tedious re-tweaking you'll have to do if
someone else opens up the project and accidentally regenerates the xsd.

As to the pain of binding custom objects and custom collections to GUis, CMM
said that typed datasets and binding are easy enough "once you master the
intricasies" of BindingContext/BindingManager ... the same can be said for
binding custom objects that aren't typed datasets, you can bind guis to
objects and to custom collections once you master some of hte intracasies of
"complex" binding.

I'm not trying to start a fight, I also would just like to know which is
best in most cases since I keep coming back to prefering my own object and
binding code to all those fragile lines generated by the freebie wizard.

Looking forward to being told that I'm wrong, I live to learn :)

robert smith
kirkland, wa
www.smithvoice.com

"CMM" <CM*@discussion s.microsoft.com > wrote in message
news:A4******** *************** ***********@mic rosoft.com...
Having been developing entity objects for years to represent data and
carrying that same ORM ideology to .NET for some time until I gave typed
datasets a chance, I can honestly say that ORM is a BIG WASTE OF TIME.
Typed
datasets are huge time savers and provide all the benefits of custom
objects.
Developers just have to lose some of their old practices which were never
good ideas to begin with. You have to learn to seperate business rules and
validation from the data object itself. One of the first thing old school
developers try to do is hijack the Typed Dataset, inherit some class from
it,
and try to add all sorts of code to it. This makes your life harder... as
the
dataset is recreated and your code changes lost whenever you use the very
productive and useful designer to change the dataset. Datasets are for
data.
Validation objects act on the dataset. Data Access objects act on the
dataset. It's all very clean and manageable and productive.

Also, the benefit of using typed datasets ripples into other things. If you
hesitated to use binding in .NET because of your experiences in VB6 and you
don't want to appear "lazy"... you're losing out on another huge time saver.
Data binding in .NET is very good (once you master some of its weird
intricacies... namely the BindingContext/BindingManager stuff)! It should not
be dismissed.

There are times when it's appropriate to use ORM, but for the most part it is
redundant and requires a huge development effort in exchange for relatively
minor advantages. If you have a huge development team that can handle it,
then maybe it's the way to go. But the benefits of typed datasets are huge.

Just my 2c.

"Jorge Matos" wrote:
Whether to use Typed DataSets or custom entity objects is a controversial
topic. My rule of thumb is to use Typed DataSets when the situation calls
for it and consider using custom entity objects when appropriate. Most of
the time I opt for Typed DataSets, because it can be more productive to use
them and a lot of developers are used to programming in a relational model.
Custom entity classes and collections are useful when you have a lot of
business rules that you want to enforce on your data.

The only issue I have with your code is that I would consider factoring out
the SQL statement from the typed dataset class you have and moving that into
a separate class.

Some resources:
http://msdn.microsoft.com/asp.net/de...CustEntCls.asp
http://www.codeproject.com/dotnet/In...romDataSet.asp

"D Witherspoon" wrote:

> I am developing a Windows Forms application in VB.NET that will use .NET
> remoting to access the data tier classes.
>
> A very simple way I have come up with is by creating typed (.xsd) datasets.
> For example dsParts.xsd, and including that in the data tier. I then will
> create a class that looks like this:
>
> Public Class CPart
>     Inherits dsParts
>     Public Sub New(ByVal iPartID As Integer)
>         Dim cm As New OleDb.OleDbCommand
>         cm.CommandType = CommandType.Text
>         cm.CommandText = "Select * from tblParts where PartID=" & iPartID
>         modData.FillDataTable(cm, Me.tblParts, ConnectionStrings.QASpec)
>         'FillDataTable is a common method where I pass in a command and
>         'connection string; it then fills the passed table object (ByRef)
>         'with the results of the command.
>         'I could fill more than one data table here if this XML data
>         'schema had more than one table.
>         'I can now add more methods to CPart and override methods of the
>         'underlying dataset if required.
>         'CPart is a datasource which can be used in place of a standard
>         'dataset object, which is great for data binding.
>
>         'One thing I haven't got to yet is overriding or adding additional
>         'methods to the other table classes in the underlying base class;
>         'not sure how I will accomplish that part.
>     End Sub
> End Class
>
> To me this is a simple way of creating your data classes, because you can
> create your XML schema easily by dragging tables from the Server Explorer
> directly onto the schema. Then when you inherit the XML data schema (typed
> dataset) you get all of the table fields as properties in your class by
> default.
>
> Doing it any other way just seems like A LOT OF WORK. Other ways would be
> to create data classes and manually type in every field as a property. You
> do not get your databinding capability (though I hear there is a way to
> make these bindable at runtime). One thing you definitely won't get is
> design-time databinding (with the method mentioned above, we can bind the
> typed datasets to our 3rd party grid controls easily at design time).
>
> Then with your data classes you have to implement them in a collection.
> For example, CParts and CPart would be two different classes. Inheriting
> from a typed dataset just means a lot of this work is done for you, and
> the project can be completed months earlier.
>
> What do you guys think? Is this an accepted practice, or am I way off
> here? Are there other alternatives? Pros/cons? I am looking for advice
> on this as I have to decide soon on the design of the data tier.
>
> Thanks for your input.
>
> D.


Nov 21 '05 #6
On Fri, 18 Mar 2005 20:41:02 -0800, CMM
<CM*@discussion s.microsoft.com > wrote:
"D Witherspoon" wrote:
> What do you guys think? Is this an accepted practice? or am I way off
> here? Are there other alternatives? Pro's/Con's? I am looking for advice
> on this as I have to decide soon on the design of the data tier.
There are alternatives, and one of them is to accept a compromise
position: use datatables in your data access layer and place over that a
business object layer. In this layer you have a set of classes which map to
the users' business objects and not to the relational data model. To tie
them together you have constructors that accept typed or untyped data
records. This approach enables you to easily build entities that mimic the
real-world situation. A trivial example would be customers: you may have a
set of accounts, and someone comes to you and says that in reality those ten
customers are ten branch accounts, and they want to handle them for
invoicing purposes as one. It is easy in your business objects to create a
parent-child relationship and handle it; though you can do it at the
database level, that involves non-standard proprietary SQL and a lot of
headaches, and anyway the user will change their mind just as you get code
sign-off.

It is also much easier to build business rules and complex data validation
schemes into business objects, and to deal with highly nested data. And you
can do it without compromising the use of datatables.
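A bare-bones sketch of that layering (all names invented): the business object takes a data-layer row in its constructor and is free to model the parent/branch relationship the database never knew about.

```vb
' Sketch: business object wrapping a data-tier row, with a
' parent/branch relationship handled purely in the object layer.
Public Class Customer
    Private _row As DataRow
    Private _branches As New ArrayList

    Public Sub New(ByVal row As DataRow)
        _row = row
    End Sub

    Public ReadOnly Property Name() As String
        Get
            Return CStr(_row("CustomerName"))
        End Get
    End Property

    ' Ten "customers" that are really branch accounts can be attached
    ' here and invoiced through the parent.
    Public Sub AddBranch(ByVal branch As Customer)
        _branches.Add(branch)
    End Sub
End Class
```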

Doug Taylor


Nov 21 '05 #7
Thanks CMM, I'll go back in again and spend some time to see more 'whats and
wheres' and I appreciate your pointers.

I do remember that complex binding was not the most intuitive thing back
when I started doing it, and if I don't have to do it in a project for a
while I have to go back to the books for a refresher.

Of course, that's the same for a lot of things in programming. Like most
folks I've spent years doing loads of database work, but after spending a
month or so heads-down in a GUI I'll admit that I have moments when I go
back to the back-end and say to myself things like "now... what was the
best parameter syntax again?" :). We weren't born with any code syntax in
our heads, so all things kind of turn out equal and relative; if you do
complex object databinding code day in and day out, your fingers will start
doing the patterns faster... and I'm sure it's the same for the intricacies
of typed datasets, because it's the same for just about anything we all use
often enough. (Boy, I spent years doing VB5/6 six to seven days a week and
could write code "in my head" in a lot of cases, but recently I loaded up a
virtual machine to show someone a VB Classic technique - one that I
developed and was first to document, so I should have known it pretty well -
and it was harder shifting from VB7 to VB5/6 than it is shifting from VB7 to
FlashMX2004... amazing how the mind so quickly drops rote memories.)

Thanks again for your information; it is sincerely appreciated. And if you
have some specific intermediate/advanced resources that you could list, I
would like to read them - most of the tutorials and books show only how to
use the IDE to make a typed dataset and pretty much leave it at that.

smith
"CMM" <CM*@discussions.microsoft.com> wrote in message
news:47**********************************@microsoft.com...
Oh, I totally agree with the column name mappings problem. .Getfirst_name()
sure is ugly. But you might be missing or misunderstanding some things...

1) You can use the TableMappings property of the DataAdapter to map
database column names to make them look however you want (so that the DB's
first_name maps to the dataset's FirstName field). You can access this via
the property editor... but it's not as pretty or easy to use as it should
be (I'd love to see a graphical implementation where I can map column names
visually using drag and drop). The point is: Leave the DataSet generated
code ALONE!!!! Jeez!
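A small sketch of that mapping in code, for anyone who prefers it to the property editor (table and column names are invented; cn is assumed to be an open OleDbConnection and ds the target dataset):

```vb
' Sketch: alias terse DB columns to friendly dataset names.
Dim da As New OleDb.OleDbDataAdapter("SELECT fnm, lnm FROM tblCust", cn)
Dim map As Data.Common.DataTableMapping = _
    da.TableMappings.Add("Table", "Customers")
map.ColumnMappings.Add("fnm", "FirstName")
map.ColumnMappings.Add("lnm", "LastName")
da.Fill(ds)  ' rows arrive in ds.Customers with FirstName/LastName columns
```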

2) You have to unlearn what you have learned (Yoda quote). Use the
design-time created DataAdapters... they're NOT just for WinForms... they're
totally applicable to the middle tier as well. You can host them in a
component or something. Let them create the SQL for you (if they can), then
go in and modify it to your heart's content. 80% of the work (table
mappings, filling the dataset) is done for you. Sometimes even 100%.

3) The typed dataset does not in ANY WAY have to look like your database
tables. With carefully crafted SELECT/UPDATE/INSERT statements you can get
away with almost anything. Your SELECT can return 100 fields... but your
UPDATE only has to work on a subset of them if it wants.

4) Just one more tip: Discover the DataView. When using binding, I almost
always wrap a table in a DataView... you gain a whole bunch of new
functionality.
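For example (the column and grid names are invented):

```vb
' Sketch: bind through a DataView to pick up sorting and filtering
' for free, without touching the underlying table.
Dim dv As New DataView(ds.tblParts)
dv.Sort = "PartName ASC"
dv.RowFilter = "Discontinued = false"
grdParts.DataSource = dv
```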

I am not saying Typed Datasets are perfect. There is a fundamental change in
thinking that you must undergo. It might not be for you. But I know I've had
my fill of ORM. I hate it.

One more thing: There is no way in hell object binding is equal to
dataset/datatable binding. First off, every property in your class has to
have a correlating PropertyChanged event or else you lose all sorts of
validation events. You also lose (AFAIK) the very useful RowError
functionality that is used by all DataGrids (including 3rd party ones).

As for typed datasets being "slower" - that's hogwash. It's one of those
things that, while theoretically true, would never have an effect in
real-world use. I myself don't like the way they serialize to XML (even
binary XML) over tiers... but this is something addressed in .NET 2.0.

Nov 21 '05 #8
CMM
IMHO, I think the learning curve is worth it... and you'd discover that at
the end, the solution *IS* ORM... minus the hassle but with a lot more
functionality.

You just have to get over some stubborn mental stuff.

For instance, there is absolutely nothing wrong with returning a
TypedDataset that will always only have one row. Who cares? It works, right?
But at first a lot of us are like, "no way, I'll just create a flat class to
handle it." Well, that's stupid. What if you end up wanting to manipulate a
bunch of them in a collection... serializing collections over tiers sucks
and is extremely error-prone (not all types are serializable), and they are
MUCH less functional than tables (sorting, mapping, serializing, binding,
etc.).
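In other words, something like this is fine. (This sketch reuses the FillDataTable helper and connection string from the original post; the parameter handling is only illustrative.)

```vb
' Sketch: return a typed dataset even when only one row is expected.
Public Function GetPart(ByVal partID As Integer) As dsParts
    Dim ds As New dsParts
    Dim cm As New OleDb.OleDbCommand("SELECT * FROM tblParts WHERE PartID = ?")
    cm.Parameters.Add("PartID", partID)
    modData.FillDataTable(cm, ds.tblParts, ConnectionStrings.QASpec)
    Return ds  ' callers read ds.tblParts(0) or bind the whole table
End Function
```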

Also, don't dismiss the GUI design-time tools just because at first glance
they look like the crappy VB.Classic tools. Design-time DataAdapters are a
godsend. Designing your TypedDatasets using the designer is fun. Setting up
binding at design time is also easy.

Books suck. No book I have ever seen properly explains the stuff...
especially the quirks of databinding. Check out some of these links:

How databinding really works
http://groups-beta.google.com/group/...af8230f57c38de

Mapping Data Source Tables to Dataset Tables
http://msdn.microsoft.com/library/de...asetTables.asp

Roadmap for WindowsForms databinding
http://support.microsoft.com/default...;EN-US;Q313482
Good luck.



Nov 21 '05 #9
In response to your issue about serializing datasets, you may want to check
this out:

http://bethmassi.blogspot.com/2004/1...-datasets.html

Cheers,
-Beth
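If I understand it right, the .NET 2.0 change CMM alludes to boils down to one property; a sketch (for 1.x, the link above covers the workaround instead):

```vb
' Sketch (.NET 2.0): opt a dataset into true binary serialization
' before sending it across remoting boundaries.
Dim ds As New dsParts
ds.RemotingFormat = SerializationFormat.Binary
' The remoting formatter now produces a compact binary payload
' instead of the verbose XML used in .NET 1.x.
```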

"smith" wrote:
This little thread got me to go back and give another try to IDE-generated
Typed Datasets; you made them sound like the killers I thought they might
be back a few years ago.

Thing is, while they are neat and can jumpstart some coding, I still find
them (as the IDE generates them) unwieldy when used against a lot of
real-world tables.

The big thing everyone pushes is that typed datasets are better because
they're easier to read and so lend themselves more to OOPers... and I don't
see that myself.

I don't know about everyone else, but I often get tables that don't have
the most happy column names. I don't think I've ever seen a column named
"HomeAddressPartOne" or "ApartmentNumber"... in fact I don't think I've
seen many "FirstName" column names over the years. I get more along the
lines of columns named by Unix guys, such as Fnm, Lnm, Adr1 and so on.

While you can figure them out in a lot of cases, several times I've been
told to figure out column contents by hitting another lookup table. Hey,
I'm all for better table design, but not all projects let you make new
tables. Maybe it's just that I've spent a lot of time on Oracle, and maybe
SqlServer DBAs always follow the Microsoft Access documentation style with
clearly and obviously named columns having embedded spaces and such (that
was a joke).

But in the end, when I generate those typed datasets I just have to go in
and manually change the interfaces if I really want to get the grail of
Humanly Comprehensible Objects.

Tell me that all this is moot and that I've just missed something in the
wizard... a place to simply tell the generator to use aliases and not mess
with them every time a schema is refreshed and no changes were found in the
base tables. That would be great; I'd love to hear about it.

The second thing is an oldie but a goodie that CMM mentioned: after
changing those properties manually, along comes a minor schema change
(pretty common during development), and with that comes the loss of all our
manual interface changes.

The third thing is that I used to read that typed datasets were somehow
faster performance-wise than vanilla datasets, but I've since read that that
really isn't the case depending on how you code (here's a source - call up
the page and do a find for the word "faster":
http://bdn.borland.com/borcon2004/ar...,32284,00.html ).

In all, it's true that the up-front coding can be jumpstarted by using the
IDE to make an .xsd, but I still find that after you've done the brunt of
your own entity objects you end up spending less time dealing with schema
change problems, in that you simply add new properties and you're done,
without worrying over how much tedious re-tweaking you'll have to do if
someone else opens up the project and accidentally regenerates the .xsd.

As to the pain of binding custom objects and custom collections to GUIs,
CMM said that typed datasets and binding are easy enough "once you master
the intricacies" of BindingContext/BindingManager... the same can be said
for binding custom objects that aren't typed datasets: you can bind GUIs to
objects and to custom collections once you master some of the intricacies of
"complex" binding.

I'm not trying to start a fight; I also would just like to know which is
best in most cases, since I keep coming back to preferring my own object and
binding code to all those fragile lines generated by the freebie wizard.

Looking forward to being told that I'm wrong, I live to learn :)

robert smith
kirkland, wa
www.smithvoice.com



Nov 21 '05 #10

This thread has been closed and replies have been disabled. Please start a new discussion.

