Bytes IT Community

Data/Business Object Tier Best Practices

I am developing a Windows Forms application in VB.NET that will use .NET
remoting to access the data tier classes.

A very simple way I have come up with is creating typed (.xsd) datasets, for example dsParts.xsd, and including them in the data tier. I then create a class that looks like this:
Public Class CPart
    Inherits dsParts

    Public Sub New(ByVal iPartID As Integer)
        Dim cm As New OleDb.OleDbCommand
        cm.CommandType = CommandType.Text
        'Use a parameter rather than string concatenation to avoid SQL injection
        cm.CommandText = "Select * from tblParts where PartID = ?"
        cm.Parameters.Add(New OleDb.OleDbParameter("PartID", iPartID))
        modData.FillDataTable(cm, Me.tblParts, ConnectionStrings.QASpec)
        'FillDataTable is a common method where I pass in a command and a connection string;
        'it then fills the passed table object (ByRef) with the results of the command.
        'I could fill more than one data table here if this XML data schema had more than one table.
        'I can now add more methods to CPart and override methods of the underlying dataset if required.
        'CPart is a datasource which can be used in place of a standard dataset object,
        'which is great for data binding.

        'One thing I haven't got to yet is overriding or adding additional methods to the
        'other table classes in the underlying base class; not sure how I will accomplish that part.
    End Sub
End Class

To me this is a simple way of creating your data classes, because you can create your XML schema easily by dragging tables from Server Explorer directly onto the schema. Then when you inherit the XML data schema (typed dataset), you get all of the table fields as properties in your class by default.

Doing it any other way just seems like A LOT OF WORK. Other ways would be to create data classes and manually type in every field as a property. You do not get your data-binding capability (though I hear there is a way to make these bindable at runtime). One thing you definitely won't get is design-time data binding (with the method mentioned above, we can bind the typed datasets to our third-party grid controls easily at design time).

Then with your data classes you have to implement them in a collection. For example, CParts and CPart would be two different classes. Inheriting from a typed dataset just seems like a lot of this work is done for you, and the project can be completed months earlier.
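For comparison, the hand-coded route would look something like this minimal sketch (all names here are illustrative, not taken from a real project): every field typed in as a property, plus a separate CollectionBase-derived collection for the plural form:

```vbnet
Imports System.Collections

' Hand-coded entity: every field becomes a manually written property.
Public Class CPart
    Private _partID As Integer
    Private _partName As String

    Public Property PartID() As Integer
        Get
            Return _partID
        End Get
        Set(ByVal Value As Integer)
            _partID = Value
        End Set
    End Property

    Public Property PartName() As String
        Get
            Return _partName
        End Get
        Set(ByVal Value As String)
            _partName = Value
        End Set
    End Property
End Class

' Hand-coded strongly typed collection (.NET 1.x CollectionBase style).
Public Class CParts
    Inherits CollectionBase

    Public Sub Add(ByVal part As CPart)
        List.Add(part)
    End Sub

    Default Public ReadOnly Property Item(ByVal index As Integer) As CPart
        Get
            Return DirectCast(List(index), CPart)
        End Get
    End Property
End Class
```

Multiply that boilerplate by every table and every column, and the appeal of the generated typed dataset is obvious.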

What do you guys think? Is this an accepted practice, or am I way off here? Are there other alternatives? Pros/cons? I am looking for advice on this, as I have to decide soon on the design of the data tier.

Thanks for your input.

D.

Jul 21 '05 #1
16 Replies


Whether to use typed datasets or custom entity objects is a controversial topic. My rule of thumb is to use typed datasets when the situation calls for it and to consider custom entity objects when appropriate. Most of the time I opt for typed datasets, because they can be more productive to use and a lot of developers are used to programming in a relational model. Custom entity classes and collections are useful when you have a lot of business rules that you want to enforce on your data.

The only issue I have with your code is that I would consider factoring the SQL statement out of the typed dataset class and moving it into a separate class.
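A minimal sketch of that refactoring, assuming a hypothetical PartDataAccess class (the dsParts typed dataset stays pure generated code, and the SQL lives in a class that regeneration can never clobber):

```vbnet
Imports System.Data
Imports System.Data.OleDb

' Hypothetical data-access class: the SQL lives here, outside the
' typed dataset, so regenerating dsParts never loses hand-written code.
Public Class PartDataAccess
    Private _connectionString As String

    Public Sub New(ByVal connectionString As String)
        _connectionString = connectionString
    End Sub

    Public Function GetPart(ByVal partID As Integer) As dsParts
        Dim ds As New dsParts
        Dim cm As New OleDbCommand("SELECT * FROM tblParts WHERE PartID = ?")
        cm.Parameters.Add(New OleDbParameter("PartID", partID))
        cm.Connection = New OleDbConnection(_connectionString)

        ' Fill opens and closes the connection itself.
        Dim da As New OleDbDataAdapter(cm)
        da.Fill(ds, "tblParts")
        Return ds
    End Function
End Class
```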

Some resources:
http://msdn.microsoft.com/asp.net/de...CustEntCls.asp
http://www.codeproject.com/dotnet/In...romDataSet.asp

"D Witherspoon" wrote:

Jul 21 '05 #2

CMM
Having developed entity objects for years to represent data, and having carried that same ORM ideology into .NET for some time until I gave typed datasets a chance, I can honestly say that ORM is a BIG WASTE OF TIME. Typed datasets are huge time savers and provide all the benefits of custom objects. Developers just have to lose some of their old practices, which were never good ideas to begin with. You have to learn to separate business rules and validation from the data object itself. One of the first things old-school developers try to do is hijack the typed dataset, inherit some class from it, and try to add all sorts of code to it. This makes your life harder, as the dataset is recreated and your code changes are lost whenever you use the very productive and useful designer to change the dataset. Datasets are for data. Validation objects act on the dataset. Data access objects act on the dataset. It's all very clean, manageable, and productive.
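A rough sketch of the separation being described (PartValidator and the PartName column are hypothetical names): the typed dataset stays plain generated code, and a validator acts on it from the outside, recording problems through RowError:

```vbnet
Imports System.Data

' Hypothetical validator: it acts ON the dataset rather than living
' inside it, so the generated typed dataset is never subclassed.
Public Class PartValidator
    Public Function Validate(ByVal parts As DataTable) As Boolean
        Dim valid As Boolean = True
        Dim row As DataRow
        For Each row In parts.Rows
            row.ClearErrors()
            If row.IsNull("PartName") OrElse CStr(row("PartName")).Trim() = "" Then
                'RowError shows up automatically in bound DataGrids.
                row.RowError = "Part name is required."
                valid = False
            End If
        Next
        Return valid
    End Function
End Class
```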

Also, the benefits of using typed datasets ripple into other things. If you hesitated to use binding in .NET because of your experiences in VB6, or because you don't want to appear "lazy"... you're losing out on another huge time saver. Data binding in .NET is very good (once you master some of its weird intricacies, namely the BindingContext/BindingManager stuff)! It should not be dismissed.

There are times when it's appropriate to use ORM, but for the most part it is redundant and requires a huge development effort in exchange for relatively minor advantages. If you have a huge development team that can handle it, then maybe it's the way to go. But the benefits of typed datasets are huge.

Just my 2c.

"Jorge Matos" wrote:

Jul 21 '05 #3

This little thread got me to go back and give another try to IDE-generated typed datasets; you made them sound like the killers I thought they might be back a few years ago.

Thing is, while they are neat and can jumpstart some coding, I still find them (as the IDE generates them) unwieldy when used against a lot of real-world tables.

The big thing everyone pushes is that typed datasets are better because they're easier to read and so lend themselves more to OOPers... and I don't see that myself.

I don't know about everyone else, but I often get tables that don't have the most happy column names. I don't think I've ever seen a column named "HomeAddressPartOne" or "ApartmentNumber"... in fact, I don't think I've seen many "FirstName" column names over the years. I get more along the lines of columns named by Unix guys, such as Fnm, Lnm, Adr1, and so on.

While you can figure them out in a lot of cases, several times I've been told to figure out column contents by hitting another lookup table. Hey, I'm all for better table design, but not all projects let you make new tables. Maybe it's just that I've spent a lot of time on Oracle, and maybe SQL Server DBAs always follow the Microsoft Access documentation style, with clearly and obviously named columns having embedded spaces and such (that was a joke).

But in the end, when I generate those typed datasets, I just have to go in and manually change the interfaces if I really want to get the grail of Humanly Comprehensible Objects.

Tell me that all this is moot and that I've just missed something in the wizard... a place to simply tell the generator to use aliases and not muck with them every time a schema is refreshed and no changes were found in the base tables. That would be great; I'd love to hear about it.

The second thing is an oldie but a goodie that CMM mentioned: after changing those properties manually, along comes a minor schema change (pretty common during development), and with it comes the loss of all our manual interface changes.

The third thing is that I used to read that typed datasets were somehow faster performance-wise than vanilla datasets, but I've since read that that really isn't the case, depending on how you code (here's a source; call up the page and do a find for the word "faster":
http://bdn.borland.com/borcon2004/ar...,32284,00.html ).

In all, it's true that the up-front coding can be jumpstarted by using the IDE to make an .xsd, but I still find that after you've done the brunt of your own entity objects, you end up spending less time dealing with schema-change problems: you simply add new properties and you're done, without worrying over how much tedious re-tweaking you'll have to do if someone else opens up the project and accidentally regenerates the .xsd.

As to the pain of binding custom objects and custom collections to GUIs: CMM said that typed datasets and binding are easy enough "once you master the intricacies" of BindingContext/BindingManager... the same can be said for binding custom objects that aren't typed datasets. You can bind GUIs to objects and to custom collections once you master some of the intricacies of "complex" binding.

I'm not trying to start a fight; I too would just like to know which is best in most cases, since I keep coming back to preferring my own object and binding code to all those fragile lines generated by the freebie wizard.

Looking forward to being told that I'm wrong, I live to learn :)

robert smith
kirkland, wa
www.smithvoice.com

"CMM" <CM*@discussions.microsoft.com> wrote in message
news:A4**********************************@microsoft.com...

Jul 21 '05 #4

CMM
Oh, I totally agree with the column-name-mappings problem. .Getfirst_name() sure is ugly. But you might be missing or misunderstanding some things...

1) You can use the TableMappings property of the DataAdapter to map database column names to make them look however you want (so that the DB's first_name maps to the dataset's FirstName field). You can access this via the property editor... but it's not as pretty or easy to use as it should be (I'd love to see a graphical implementation where I can map column names visually using drag and drop). The point is: leave the DataSet generated code ALONE!!!! Jeez!
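As a rough sketch of point 1 (the table name tblCustomers and the fnm/lnm columns are invented for illustration), the mapping is set up on the adapter before filling:

```vbnet
Imports System.Data
Imports System.Data.Common
Imports System.Data.OleDb

Public Class CustomerDataAccess
    ' Map terse database column names (fnm, lnm) to friendly dataset
    ' names without ever touching the generated dataset code.
    Public Function GetCustomers(ByVal connectionString As String) As DataSet
        Dim da As New OleDbDataAdapter("SELECT fnm, lnm FROM tblCustomers", connectionString)
        Dim map As DataTableMapping = da.TableMappings.Add("Table", "Customers")
        map.ColumnMappings.Add("fnm", "FirstName")
        map.ColumnMappings.Add("lnm", "LastName")

        Dim ds As New DataSet
        da.Fill(ds)   'ds.Tables("Customers") now exposes FirstName/LastName
        Return ds
    End Function
End Class
```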

2) You have to unlearn what you have learned (Yoda quote). Use the design-time-created DataAdapters.... they're NOT just for WinForms... they're totally applicable to the middle tier as well. You can host them in a component or something. Let them create the SQL for you (if they can), then go in and modify it to your heart's content. 80% of the work (table mappings, filling the dataset) is done for you. Sometimes even 100%.

3) The typed dataset does not in ANY WAY have to look like your database
tables. With carefully crafted SELECT/UPDATE/INSERT statements you can get
away with almost anything. Your SELECT can return 100 fields... but your
UPDATE only has to work on a subset of them if it wants.

4) Just one more tip: discover the DataView. When using binding, I almost always wrap a table in a DataView... you gain a whole bunch of new functionality.
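A minimal sketch of point 4 (dsParts and tblParts are assumed from earlier in the thread; the PartName column and DataGrid usage are illustrative):

```vbnet
Imports System.Data
Imports System.Windows.Forms

Public Class PartBinding
    ' Wrap the typed dataset's table in a DataView to get sorting and
    ' filtering for free when binding, without modifying the table.
    Public Sub BindParts(ByVal ds As dsParts, ByVal grid As DataGrid)
        Dim view As New DataView(ds.tblParts)
        view.Sort = "PartName ASC"       'sort for display only
        view.RowFilter = "PartID > 0"    'show a subset of the rows
        grid.DataSource = view           'bind the view, not the table
    End Sub
End Class
```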

I am not saying typed datasets are perfect. There is a fundamental change in thinking that you must undergo. It might not be for you. But I know I've had my fill of ORM. I hate it.

One more thing: there is no way in hell object binding is equal to dataset/datatable binding. First off, every property in your class has to have a correlating PropertyChanged event, or else you lose all sorts of validation events. You also lose (AFAIK) the very useful RowError functionality that is used by all DataGrids (including 3rd-party ones).
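For reference, the convention being alluded to is the WinForms rule that a bindable class raises a <PropertyName>Changed event for each property it wants two-way binding on; a hypothetical example:

```vbnet
' .NET 1.x simple-binding convention: raise <PropertyName>Changed so
' bound controls are notified when the property changes in code.
Public Class Person
    Private _firstName As String

    Public Event FirstNameChanged As EventHandler

    Public Property FirstName() As String
        Get
            Return _firstName
        End Get
        Set(ByVal Value As String)
            If _firstName <> Value Then
                _firstName = Value
                RaiseEvent FirstNameChanged(Me, EventArgs.Empty)
            End If
        End Set
    End Property
End Class
```

A DataTable gives you all of this notification plumbing without writing a line of it.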

As for typed datasets being "slower", that's hogwash. It's one of those things that, while theoretically true, would never have an effect in real-world use. I myself don't like the way they serialize to XML (even binary XML) over tiers... but this is something addressed in .NET 2.0.

"smith" wrote:

Jul 21 '05 #5

Thanks CMM, I'll go back in again and spend some time to see more 'whats and wheres', and I appreciate your pointers.

I do remember that complex binding was not the most intuitive thing back when I started doing it, and if I don't have to do it in a project for a while, I have to go back to the books for refreshers.

Of course, that's the same for a lot of things in programming. Like most folks, I've spent years doing loads of database work, but after spending a month or so heads-down in a GUI, I'll admit I have moments when I go back to the back end and say to myself things like "now... what was the best parameter syntax again?" :). We weren't born with any code syntax in our heads, so all things kind of turn out equal and relative; if you do complex object data-binding code day in and day out, your fingers will start doing the patterns faster... and I'm sure it's the same for the intricacies of typed datasets, because it's the same for just about anything we all use often enough. (Boy, I spent years doing VB5/6 six to seven days a week and could write code "in my head" in a lot of cases, but recently I loaded up a virtual machine to show someone a VB Classic technique, one that I developed and was first to document so I should have known it pretty well, and it was harder shifting from VB7 to VB5/6 than it is shifting from VB7 to Flash MX 2004... amazing how the mind so quickly drops rote memories.)

Thanks again for your information; it is sincerely appreciated. And if you
have some specific intermediate/advanced resources that you could list, I
would like to read them - most of the tutorials and books show only how to
use the IDE to make a typed dataset and pretty much leave it at that.

smith
"CMM" <CM*@discussions.microsoft.com> wrote in message
news:47**********************************@microsof t.com...
Oh I totally agree with the column name mappings problem. .Getfirst_name()
sure is ugly. But you might be missing or misunderstanding some things...

1) You can use the TableMappings property of the DataAdapter to map
database column names to make them look however you want (so that the DB's
first_name maps to the dataset's FirstName field). You can access this via
the property editor... but it's not as pretty or easy to use as it should
be (I'd love to see a graphical implementation where I can map column names
visually using drag and drop). The point is: leave the DataSet's generated
code ALONE!!!! Jeez!
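
For what it's worth, point 1 can be sketched in code like this (the table,
column, and connection-string names here are made up for illustration):

```vbnet
' Illustrative only: map ugly DB column names to friendly dataset names.
Imports System.Data
Imports System.Data.Common
Imports System.Data.OleDb

Module MappingSketch
    Sub Main()
        Dim connStr As String = "<your connection string>"   ' placeholder
        Dim da As New OleDbDataAdapter( _
            "SELECT first_name, last_name FROM tblCustomers", connStr)

        ' "Table" is the adapter's default source-table name.
        Dim map As DataTableMapping = da.TableMappings.Add("Table", "Customers")
        map.ColumnMappings.Add("first_name", "FirstName")
        map.ColumnMappings.Add("last_name", "LastName")

        Dim ds As New DataSet()
        da.Fill(ds)   ' fills ds.Tables("Customers") with FirstName/LastName
    End Sub
End Module
```

The same mappings are what the designer writes for you when you set them in
the property editor.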

2) You have to unlearn what you have learned (Yoda quote). Use the
design-time created DataAdapters... they're NOT just for WinForms...
they're totally applicable to the middle tier as well. You can host them in
a component or something. Let them create the SQL for you (if it can), then
go in and modify it to your heart's content. 80% of the code (table
mappings, filling the dataset) is done for you. Sometimes even 100%.

3) The typed dataset does not in ANY WAY have to look like your database
tables. With carefully crafted SELECT/UPDATE/INSERT statements you can get
away with almost anything. Your SELECT can return 100 fields... but your
UPDATE only has to work on a subset of them if it wants.

4) Just one more tip: discover the DataView. When using binding, I almost
always wrap a table in a DataView... you gain a whole bunch of new
functionality.

I am not saying Typed Datasets are perfect. There is a fundamental change
in
thinking that you must undergo. It might not be for you. But, I know I've
had
my fill of ORM. I hate it.

One more thing: there is no way in hell object binding is equal to
dataset/datatable binding. First off, every property in your class has to
have a correlating PropertyChanged event or else you lose all sorts of
validation events. You also lose, AFAIK, the very useful RowError
functionality that is used by all DataGrids (including 3rd party ones).
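
To illustrate what that means for a hand-rolled class (a minimal sketch of
the .NET 1.x binding convention: one <PropertyName>Changed event per bound
property; the Person/Name names are invented):

```vbnet
Public Class Person
    Private _name As String

    ' Binding looks for an event named exactly <PropertyName>Changed;
    ' without it, bound controls won't hear about programmatic changes.
    Public Event NameChanged As EventHandler

    Public Property Name() As String
        Get
            Return _name
        End Get
        Set(ByVal value As String)
            If _name <> value Then
                _name = value
                RaiseEvent NameChanged(Me, EventArgs.Empty)
            End If
        End Set
    End Property
End Class
```

A DataRow gives you all of this plumbing for free, which is the point being
made above.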

As for typed datasets being "slower," that's hogwash. It's one of those
things that, while theoretically true, would never have an effect in
real-world use. I myself don't like the way they serialize to XML (even
binary XML) over tiers... but this is something addressed in .NET 2.0.

Jul 21 '05 #6

CMM
IMHO, I think the learning curve is worth it... and you'd discover that at
the end, the solution *IS* ORM... minus the hassle but with a lot more
functionality.

You just have to get over some stubborn mental stuff.

For instance, there is absolutely nothing wrong with returning a typed
dataset that will always have only one row. Who cares? It works, right? But
at first a lot of us are like, "no way, I'll just create a flat class to
handle it." Well, that's stupid. What if you end up wanting to manipulate a
bunch of them in a collection... well, serializing collections over tiers
sucks and is extremely error-prone (not all types are serializable), and
collections are MUCH less functional than tables (sorting, mapping,
serializing, binding, etc.).

Also, don't dismiss the GUI design-time tools just because at first glance
they look like the crappy VB.Classic tools. Design-time DataAdapters are a
godsend. Designing your typed datasets in the Designer is fun. Setting up
binding at design time is also easy.

Books suck. No book I have ever seen properly explains this stuff...
especially the quirks of databinding. Check out some of these links:

How databinding really works
http://groups-beta.google.com/group/...af8230f57c38de

Mapping Data Source Tables to Dataset Tables
http://msdn.microsoft.com/library/de...asetTables.asp

Roadmap for WindowsForms databinding
http://support.microsoft.com/default...;EN-US;Q313482
Good luck.

"smith" wrote:
Thanks CMM, I'll go back in again and spend some time to see more 'whats and
wheres' and I appreciate your pointers.
[...]


Jul 21 '05 #7

In response to your issue about serializing datasets, you may want to check
this out:

http://bethmassi.blogspot.com/2004/1...-datasets.html

Cheers,
-Beth

"CMM" <CM*@discussions.microsoft.com> wrote in message
news:47**********************************@microsof t.com...
Oh I totally agree with the column name mappings problem. .Getfirst_name()
sure is ugly. But you might be missing or misunderstanding some things...
[...]

"smith" wrote:
This little thread got me to go back and give another try to IDE-generated
Typed Datasets; you made them sound like the killers I thought they might
be back a few years ago.

Thing is, while they are neat and can jumpstart some coding, I still find
them (as the IDE generates them) unwieldy when used against a lot of real
world tables.

The big thing everyone pushes is that typed datasets are better because
they're easier to read and so lend themselves more to OOPers... and I don't
see that myself.

I don't know about everyone else, but I often get tables that don't have
the most happy column names. I don't think I've ever seen a column named
"HomeAddressPartOne" or "ApartmentNumber"... in fact I don't think I've
seen many "FirstName" column names over the years. I get more along the
lines of columns named by Unix guys, such as Fnm, Lnm, Adr1 and so on.

While you can figure them out in a lot of cases, several times I've been
told to figure out column contents by hitting another lookup table. Hey,
I'm all for better table design, but not all projects let you make new
tables. Maybe it's just that I've spent a lot of time on Oracle, and maybe
SqlServer DBAs always follow the Microsoft Access documentation style with
clearly and obviously named columns having embedded spaces and such (that
was a joke).

But in the end, when I generate those typed datasets I just have to go in
and manually change the interfaces if I really want to get the grail of
Humanly Comprehensible Objects.

Tell me that all this is moot and that I've just missed something in the
wizard... a place to simply tell the generator to use aliases and not muck
with them every time a schema is refreshed and no changes were found in the
base tables. That would be great, I'd love to hear about it.

The second thing is an oldie but a goodie that CMM mentioned: after
changing those properties manually, along comes a minor schema change
(pretty common during development) and with that comes the loss of all our
manual interface changes.

The third thing is that I used to read that typed datasets were somehow
faster performance-wise than vanilla datasets, but I've since read that
that really isn't the case depending on how you code (here's a source; call
up the page and do a find for the word "faster":
http://bdn.borland.com/borcon2004/ar...,32284,00.html ).

In all, it's true that the up-front coding can be jumpstarted by using the
IDE to make an .xsd, but still I find that after you've done the brunt of
your own entity objects you end up spending less time dealing with schema
change problems, in that you simply add new properties and you're done
without worrying over how much tedious re-tweaking you'll have to do if
someone else opens up the project and accidentally regenerates the .xsd.

As to the pain of binding custom objects and custom collections to GUIs,
CMM said that typed datasets and binding are easy enough "once you master
the intricacies" of BindingContext/BindingManager... the same can be said
for binding custom objects that aren't typed datasets: you can bind GUIs to
objects and to custom collections once you master some of the intricacies
of "complex" binding.

I'm not trying to start a fight; I also would just like to know which is
best in most cases, since I keep coming back to preferring my own object
and binding code to all those fragile lines generated by the freebie
wizard.

Looking forward to being told that I'm wrong, I live to learn :)

robert smith
kirkland, wa
www.smithvoice.com

"CMM" <CM*@discussions.microsoft.com> wrote in message
news:A4**********************************@microsof t.com...
> Having been developing entity objects for years to represent data and
> carrying that same ORM ideology to .NET for some time until I gave
> typed
> datasets a chance, I can honestly say that ORM is a BIG WASTE OF TIME.
> Typed
> datasets are huge time savers and provide all the benefits of custom
> objects.
> Developers just have to lose some of their old practices which were
> never
> good ideas to begin with. You have to learn to separate business rules
> and
> validation from the data object itself. One of the first things old
> school
> developers try to do is hijack the Typed Dataset, inherit some class
> from
> it,
> and try to add all sorts of code to it. This makes your life harder...
> as
> the
> dataset is recreated and your code changes lost whenever you use the
> very
> productive and useful designer to change the dataset. Datasets are for
> data.
> Validation objects act on the dataset. Data Access objects act on the
> dataset. It's all very clean and manageable and productive.
>
> Also, the benefits of using typed datasets ripple to other things. If
> you
> hesitated using binding in .NET because of your experiences in VB6 and
> you
> don't want to appear "lazy"... you're losing out on another huge time
> saver.
> Data binding in .NET is very good (once you master some of its weird
> intricacies... namely the BindingContext/BindingManager stuff)! It
> should
> not
> be dismissed.
>
> There are times when it's appropriate to use ORM, but for the most part
> it
> is
> redundant and requires a huge development effort in exchange for
> relatively
> minor advantages. If you have a huge development team that can handle
> it,
> then maybe it's the way to go. But, the benefits of typed datasets are
> huge.
>
> Just my 2c.
>
> "Jorge Matos" wrote:
>
>> Whether to use Typed Datasets or Custom Entity objects is a
>> controversial
>> topic. My rule of thumb is to use Typed DataSets when the situation
>> calls
>> for it and consider using Custom entity objects when appropriate.
>> Most
>> of
>> the time I opt for Typed DataSets because it can be more productive to
>> use
>> them and a lot of developers are used to programming in a relational
>> model.
>> Custom entity classes and collections are useful when you have a lot
>> of
>> business rules that you want to enforce on your data.
>>
>> The only issue I have with your code is that I would consider
>> factoring
>> out
>> the SQL statement from the typed dataset class you have and moving
>> that
>> into
>> a separate class.
>>
>> Some resources:
>> http://msdn.microsoft.com/asp.net/de...CustEntCls.asp
>> http://www.codeproject.com/dotnet/In...romDataSet.asp
>>
>>
>>
>> "D Witherspoon" wrote:
>>
>> > I am developing a Windows Forms application in VB.NET that will use
>> > .NET remoting to access the data tier classes.
>> > [...]


Jul 21 '05 #8

<< clapping >>

I agree. Although I wouldn't go as far as saying that ORM is a waste of
time, I have found it MUCH MUCH more productive to go with datasets when
developing WinForms applications because, like you said, you can easily
implement very complex databinding. One thing we do to help speed up
development is to code-generate the typed datasets directly from our SELECT
queries in our middle tier. You can call the xsd.exe utility directly to
generate them. All you have to do is execute your queries into an untyped
dataset (with MissingSchemaAction temporarily set to AddWithKey while
you're generating) and pass the dataset's schema to the xsd.exe utility to
create the typed dataset code:

'-- Fill a dataset from your data adapter like normal
Dim ds As New DataSet("MyData")
Dim da As New SqlClient.SqlDataAdapter(myCommand)
da.MissingSchemaAction = MissingSchemaAction.AddWithKey
da.Fill(ds)

'-- Write out the schema (discard the data)
ds.WriteXmlSchema("MyData.xsd")

'-- Generate the typed dataset (Process.Start is a Shared method, so no
'-- Process instance needs to be created)
System.Diagnostics.Process.Start( _
    "C:\Program Files\Microsoft Visual Studio .NET 2003\SDK\v1.1\Bin\xsd.exe", _
    "MyData.xsd /dataset /language:VB")
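
And once xsd.exe has produced MyData.vb, using the generated class is the
usual typed-dataset fare. A sketch, continuing the snippet above (the
MyData class name comes from the name passed to the DataSet constructor):

```vbnet
'-- Compile the generated MyData.vb into the project, then:
Dim typedDs As New MyData()
da.Fill(typedDs)     ' same adapter and table mappings as before
'-- Rows are now reachable through the strongly typed table properties
'-- instead of ds.Tables("...") string lookups.
```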

Have Fun! You're on the right track!
-Beth
"CMM" <CM*@discussions.microsoft.com> wrote in message
news:A4**********************************@microsof t.com...
Having been developing entity objects for years to represent data and
carrying that same ORM ideology to .NET for some time until I gave typed
datasets a chance, I can honestly say that ORM is a BIG WASTE OF TIME.
[...]

Jul 21 '05 #9

Hi,

I have previously written fairly large projects using datasets/dataviews
with good results.
However, ever since having got hold of VB.Net Beta 2, I have been looking at
generics - List(Of type), and been messing around with the class designer.

I am now starting a new project, where the backend is presently an
antiquated DB2 database running on an AS/400 which may or may not move to SQL
Server, and the UI will be both Windows Forms and Explorer.

Playing around with the class designer, I have created my model, which is
essentially classes and collections of classes. The designer allows me to
visualise very clearly the interrelationships between the various classes. In
the past, I have always started with the database and built the application
from there on up. Since the prospect of actually getting the data in and out
of the AS/400 is so daunting, I decided to build the middle tier (business
rules) first and then worry about the data access layer.

However, I have now hit a mental brick wall. I can't see how and where I
will be able to update/insert the back-end data from these classes. I
downloaded the TimeTracker v 2.0 Starter Kit from asp.net, but since I'm
not very good with C#, I haven't been able to really figure out the logic
between the n tiers.

I'm in a bit of a quandary - I would really like to continue using the
class designer to build the application, but would also like to continue to
use datasets etc. because of their ease of use.

Is there somewhere I can go which will point me in the direction of a hybrid
model - or must I decide now which way I want to go?

Many thanks
Jeremy Holt
Jul 21 '05 #10

Jeremy,

This can be very interesting to discuss in these newsgroups.

http://forums.microsoft.com/MSDN/default.aspx

Problems like the ones you have now are, in my opinion, what beta testing
is for.

Cor

Jul 21 '05 #11

On Thu, 23 Jun 2005 16:55:01 -0700, "JeremyHolt"
<Je********@discussions.microsoft.com> wrote:
[...]

I'm in a bit of a quandary - I would really like to continue using the class
designer to build the application, but would also like to continue to use
datasets etc because of their ease of use.

Is there somewhere I can go which will point me in the direction of a hybrid
model - or must I decide now which way I want to go?
Jeremy,

You are basically using a model which is known as business objects,
i.e. you are putting a layer in place between the Database Layer and
the Presentation Layer that understands your business rules. An
alternative approach of course is to place these business rules in the
database as stored procedures and triggers.

There are good reasons for both of these approaches.

Your problem, as you say, is to tie the business objects to the database.
There are two common approaches to this: the first is to have as a private
member a typed record in each class and a typed recordset in the
collection, and use that to interact with the database. The second is to
give your objects the ability to access the database directly. Again, both
approaches are valid. There are a number of articles on the internet that
discuss the merits of these approaches.
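
A rough sketch of the first approach, borrowing the dsParts/tblParts names
from earlier in the thread (the Description column and the Part class are
invented here for illustration; the row type would come from your generated
typed dataset):

```vbnet
Public Class Part
    ' Typed record held as a private member; the class adds the rules.
    Private _row As dsParts.tblPartsRow

    Public Sub New(ByVal row As dsParts.tblPartsRow)
        _row = row
    End Sub

    Public Property Description() As String
        Get
            Return _row.Description
        End Get
        Set(ByVal value As String)
            ' Business rule lives in the object; storage stays in the row,
            ' so a DataAdapter can still persist the change later.
            If value Is Nothing OrElse value.Trim().Length = 0 Then
                Throw New ArgumentException("Description is required.")
            End If
            _row.Description = value
        End Set
    End Property
End Class
```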

Doug Taylor


Jul 21 '05 #12

"JeremyHolt" <Je********@discussions.microsoft.com>
wrote in news:58**********************************@microsof t.com:
I am now starting a new project, where the backend is presently an
antiquated DB2 database running on an AS/400 which may or may not move
to SQL Server, and the UI will be both Windows Forms and Explorer.


This is just part 1; parts 2 and 3 will be more of what you are looking for,
but it might still prove useful:
http://www.codeproject.com/useritems...inessLogic.asp
--
Chad Z. Hower (a.k.a. Kudzu) - http://www.hower.org/Kudzu/
"Programming is an art form that fights back"

Get your ASP.NET in gear with IntraWeb!
http://www.atozed.com/IntraWeb/
Jul 21 '05 #13



"Doug Taylor" wrote:
On Thu, 23 Jun 2005 16:55:01 -0700, "JeremyHolt"
<Je********@discussions.microsoft.com> wrote:
Hi,

I have previously written fairly large projects using datasets/dataviews
with good results.
However, ever since having got hold of VB.Net Beta 2, I have been looking at
generics - List(Of type), and been messing around with the class designer.

I am now starting a new project, where the backend is presently an
antiquated DB2 database running on an AS/400 which may or may not move to SQL
Server, and the UI will be both Windows Forms and Explorer.

Playing around with the class designer, I have created my model, which is
essentially classes and collections of classes. The designer allows me to
visualise very clearly the interrelationships between the various classes. In
the past, I have always started with the database and built the application
from there on up. Since the prospect of actually getting the data in and out
of the AS/400 is so daunting, I decided to build the middle tier (business
rules) first and then worry about the data access layer.

However, I have now hit a mental brick wall. I can't see how and where I
will be able to update/insert the back-end data from these classes. I
downloaded the TimeTracker v 2.0 Starter Kit from asp.net, but since I'm not
very good with C#, I haven't been able to really figure out the logic between
the n tiers.

I'm in a bit of a quandary - I would really like to continue using the class
designer to build the application, but would also like to continue to use
datasets etc because of their ease of use.

Is there somewhere I can go which will point me in the direction of a hybrid
model - or must I decide now which way I want to go?


[Doug's reply, quoted above in full - snipped]


Doug,

Many thanks for your thoughts on this. I had previously thought about a
private typed record in the class, but then found myself inserting the data
into the class properties from the dataset - this seemed rather
counter-intuitive: if I already have the typed dataset, why would I want to
transfer its data back into my "typed" class? I also have a problem
visualising how one would handle the "collection".

In my business model, I have defined the following classes:
Class Product
    Public ProductID As Integer
    Public SampleSize As Double
    Public Description As String
    Public Grade As String
    Public MoistureContent As Double
    Public Ash As Double
    Public Rotten As Double
End Class

Class LaboratoryAnalysis
    Public LabID As Integer
    Public AnalysisDate As Date   ' "Date" alone would clash with the Date type
    Public Products As List(Of Product)
End Class

Class Truck
    Public TruckID As Integer
    Public NumberPlate As String
    Public Origin As String
    Public Quantity As Integer
    Public LabAnalysis As List(Of LaboratoryAnalysis)
End Class

Class Warehouse
    Public WarehouseID As Integer
    Public Trucks As List(Of Truck)
End Class

I tend to think in terms of stored procs/triggers filling/updating datasets,
producing dataviews which are consumed by the UI. Single records (the master
part of master/details) are obtained by passing a parameter to the stored
proc. As I said in my previous post, I know it works, and it's relatively
easy to do.

However, I am constantly drawn back to the idea of classes/collection of
classes, because of the elegance of the Class Designer.

In my "dataset world" in order to fill my Warehouse I would have a stored
proc which queried the database and would do an inner join accross the tables
to give me the "collection" of Trucks - in reality a dataview, which would
continue to drill down through the LaboratoryAnalysis to the Products that
were analysed in the laboratory etc. My query would thus be able to give me
how many tons of Rotten Apples I have in the Warehouse.

In my "Generics world" my Business Objects code would be something like:

Dim TotalWeight As Integer
Dim RottenWeight As Double
For Each T As Truck In Warehouse.Trucks
    TotalWeight += T.Quantity
    For Each L As LaboratoryAnalysis In T.LabAnalysis
        For Each P As Product In L.Products
            RottenWeight += (T.Quantity * P.Rotten)
        Next
    Next
Next

Assuming I'm along the right lines so far, how would I get the data, and
update the data into my collections? I guess something like:

Class Product
    Public ProductID As Integer
    Public SampleSize As Double
    Public Description As String
    Public Grade As String
    Public MoistureContent As Double
    Public Ash As Double
    Public Rotten As Double

    Sub Load(ByVal ProductID As Integer)
        Dim ds As New dsProducts
        ' Fill the dataset, e.g. with:
        ' SELECT ProductID, SampleSize, Description, Grade,
        '        MoistureContent, Ash, Rotten
        ' FROM Products
        ' WHERE ProductID BETWEEN COALESCE(@ProductID, -1)
        '                     AND COALESCE(@ProductID, 65535)
        For Each row As dsProducts.ProductsRow In ds.Products
            Me.SampleSize = row.SampleSize
            Me.Description = row.Description
            ' etc.
        Next
    End Sub
End Class
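
To flesh out the "Fill dataset" step above, a minimal sketch of filling the typed dataset with a parameterised OleDb command (rather than concatenated SQL) might look like this. The dsProducts schema, the Products table name, and the connection string are all assumptions for illustration:

```vb
' Sketch only - assumes a typed dataset dsProducts with a Products
' table whose columns match the SELECT list; not the poster's code.
Imports System.Data.OleDb

Public Class ProductLoader
    Public Shared Function Load(ByVal productID As Integer, _
                                ByVal connStr As String) As dsProducts
        Dim ds As New dsProducts
        Using cn As New OleDbConnection(connStr)
            Dim cm As New OleDbCommand( _
                "SELECT ProductID, SampleSize, Description, Grade, " & _
                "MoistureContent, Ash, Rotten " & _
                "FROM Products WHERE ProductID = ?", cn)
            ' OleDb parameters are positional; the name is cosmetic.
            cm.Parameters.AddWithValue("@ProductID", productID)
            Dim da As New OleDbDataAdapter(cm)
            da.Fill(ds.Products)   ' fill the typed table directly
        End Using
        Return ds
    End Function
End Class
```

Filling the typed table directly keeps all the DataSet change-tracking intact, and the parameter avoids the quoting problems of building the WHERE clause by string concatenation.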

However, this is where I run into my mental "brick wall":
1) Should Class Product not just expose the dataset, saving me the effort of
the loop to fill the class properties with the row details?
2) How do I update Class Product? Assuming that I'm using a data grid
(ASP.NET or WinForms), if the source of these UI elements is a dataview, then
I can use all of ADO.NET's capabilities, i.e. Merge, GetChanges(dataset)
etc.

If I carry on down the road of my "Generics" model, do I have to lose all the
"cool" things about ADO.NET? Alternatively, how do I model my "traditional
database oriented" design using the Class Designer?

I do hope that I have been able to clearly express my confusion, and hope
that someone could point me in the right direction.

Many thanks
Jeremy Holt
Jul 21 '05 #14

Cor

In which forum would you suggest I post?

Thanks
Jeremy

"Cor Ligthert" <no************@planet.nl> wrote in message
news:%2****************@TK2MSFTNGP09.phx.gbl...
Jeremy,

This could be very interesting to discuss in these newsgroups:

http://forums.microsoft.com/MSDN/default.aspx

In my opinion, beta testing is exactly for the kind of problems you have now.

Cor

Jul 21 '05 #15

> Cor

In which forum would you suggest I post?


http://forums.microsoft.com/msdn/Sho...spx?ForumID=32

Jul 21 '05 #16

On Fri, 24 Jun 2005 09:09:06 -0700, "JeremyHolt"
<Je********@discussions.microsoft.com> wrote:


"Doug Taylor" wrote:
On Thu, 23 Jun 2005 16:55:01 -0700, "JeremyHolt"
<Je********@discussions.microsoft.com> wrote:
>Hi,
>
>I have previously written fairly large projects using datasets/dataviews
>with good results.
>However, ever since having got hold of VB.Net Beta 2, I have been looking at
>generics - List(Of type), and been messing around with the class designer.
>
>I am now starting a new project, where the backend is presently an
>antiquated DB2 database running on an AS/400 which may or may not move to SQL
>Server, and the UI will be both Windows Forms and Explorer.
>
>Playing around with the class designer, I have created my model, which is
>essentially classes and collections of classes. The designer allows me to
>visualise very clearly the interrelationships between the various classes. In
>the past, I have always started with the database and built the application
>from there on up. Since the prospect of actually getting the data in and out
>of the AS/400 is so daunting, I decided to build the middle tier (business
>rules) first and then worry about the data access layer.
>
>However, I have now hit a mental brick wall. I can't see how and where I
>will be able to update/insert the back-end data from these classes. I
>downloaded the TimeTracker v 2.0 Starter Kit from asp.net, but since I'm not
>very goods with C#, haven't been able to really figure out the logic between
>the n tiers.
>
>I'm in a bit of a quandry - I would really like to continue using the class
>designer to build the application, but would also like to continue to use
>datasets etc because of their ease of use.
>
>Is there somewhere I can go which will point me in the direction of a hybrid
>model - or must I decide now which way I want to go?
Jeremy,

You are basically using a model which is known as business objects,
i.e. you are putting a layer in place between the Database Layer and
the Presentation Layer that understands your business rules. An
alternative approach of course is to place these business rules in the
database as stored procedures and triggers.

There are good reasons for both of these approaches.

Your problem is as you say is to tie the business objects to the
database, there are two common approaches to this, the first is to
have as a private member a typed record in each class and a typed
recordset in the collection and use that to interact with the
database. The second is to give your objects the ability to access
the database directly. Again both approaches are valid. There are a
number of articles on the internet that discuss the merits of these
approaches.

Doug Taylor
>
>Many thanks
>Jeremy Holt
>


Doug,

Many thanks for your thoughts on this. I had previoulsy thought about a
private typed record in the class, but then found myself inserting the data
into the class properties from the dataset - this seemed rather
counter-intuituve in that if I already have the typed dataset, why would I
want to transfer this data back into my "typed" class. I also have a problem
visualising how one would handle the "collection".

In my business model, I have defined the following classes:
Class Product
ProductID as integer
SampleSize as double
Description as string
Grade as string
MoistureContent as double
Ash as double
Rotten as double
End Class

Class LaboratoryAnalysis
LabID as integer
Date as date
Products as List(Of Product)
End Class

Class Truck
TruckID as integer
NumberPlate as string
Origin as string
Quantity as integer
LabAnalysis as List(Of LaboratoryAnalysis)
End Class

Class Warehouse
WarehouseID as integer
Trucks as List(Of Truck)
End Class

I tend to think in terms of stored procs/triggers filling/updating datasets,
producing dataviews which are consumed by the UI. Single records (master part
of master/details) are obtained through passing a parameter to the stored
proc. As I said in my previous post, I know it works, and its relatively easy
to do.

However, I am constantly drawn back to the idea of classes/collection of
classes, because of the elegance of the Class Designer.

In my "dataset world" in order to fill my Warehouse I would have a stored
proc which queried the database and would do an inner join accross the tables
to give me the "collection" of Trucks - in reality a dataview, which would
continue to drill down through the LaboratoryAnalysis to the Products that
were analysed in the laboratory etc. My query would thus be able to give me
how many tons of Rotten Apples I have in the Warehouse.

In my "Generics world" my Business Objects code would be something like:

Dim TotalWeight as integer
Dim RottenWeight as double
For each T as Truck in Warehouse.Trucks
TotalWeight += T.Quantity
For each L as LaboratoryAnalysis IN T.LabAnalysis
For each P as Product in L.Products
RottenWeight += (T.Quantity * P.Rotten)
Next
Next
Next

Assuming I'm along the right lines so far, how would I get the data, and
update the data into my collections? I guess something like:

Class Product
ProductID as integer
SampleSize as double
Description as string
Grade as string
MoistureContent as double
Ash as double
Rotten as double

Sub Load(ProductID as integer)
Dim ds as dsProducts
' Fill dataset
'SELECT ProductID, SampleSize, Description, Grade, MoistureContent, Ash,
Rotten FROM Products WHERE ProductID BETWEEN COALESCE(-1, '@ProductID) AND
COALESCE(65535, @ProductID)
For Each row as dsProducts.ProductsRow in dsProducts
Me.SampleSize = row.SampleSize
Me.Description = row.Description
etc.
Next
End Sub
End Class

However, this is where I run into my mental "brick wall":
1) Should Class Product not just expose the dataset, saving me the effort of
the loop to fill the class properties with the row details?
2) How do I update Class Product? Assuming that I'm using a DataGridSource
(asp.net or win.forms), if the souce of these UI elements is a dataview, then
I can use all of ADO.Net's capababilities, i.e. Merge, GetChanges(dataset)
etc.

If I carry on down the road of my "Generics" model do I have to lose all the
"cool" things about ADO.Net? Alternativley, how do I model my "traditional
database oriented" design using Class Designer?

I do hope that I have been able to clearly express my confusion, and hope
that someone could point me in the right direction.


Firstly, I'll state that I haven't yet looked at the latest release of
Visual Studio, so I'm not sure of the capabilities of the List object,
but I assume you can inherit from it.

Assume you have an inherited list ProductList in your 'LIMS'
(Laboratory Information Management System) with a private member
dsProducts.

If, in your Product class, you expose a Friend property of type
dsProducts.ProductsRow, then the load method (of the product
collection) could look something like:

For Each row As dsProducts.ProductsRow In ds.Products
    Product.ProductRow = row
Next

where Product is retrieved from the appropriate item of the list.

The Friend ProductRow is then a copy of its data in the
datatable. I personally don't like this approach as it tends to be
very memory hungry.

The alternative approach is to track the state of the object in your
class - i.e. is it a new object, one retrieved from the database, or
one that has been modified - and then provide appropriate Insert,
Delete and Update methods.

As in all of these things, the best approach will depend on the
environment you are working in and on factors such as data ownership
and data volumes.

If the use of computers within a laboratory is of interest to you,
there is a discussion group (the LIMS LIST) at
Ma*******@lims.taratec.com
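
A minimal sketch of that state-tracking alternative (all names and the SQL placeholders are illustrative, not from the thread):

```vb
' Illustrative sketch of a business object that tracks its own
' persistence state and dispatches Save to the right SQL statement.
Public Class Product
    Public Enum ObjectState
        IsNew       ' created in memory, never saved
        Unchanged   ' loaded from the database
        Modified    ' loaded, then changed
        Deleted     ' marked for deletion
    End Enum

    Private _state As ObjectState = ObjectState.IsNew
    Private _description As String

    Public Property Description() As String
        Get
            Return _description
        End Get
        Set(ByVal value As String)
            _description = value
            If _state = ObjectState.Unchanged Then
                _state = ObjectState.Modified
            End If
        End Set
    End Property

    ' Save issues INSERT, UPDATE or DELETE depending on state.
    Public Sub Save()
        Select Case _state
            Case ObjectState.IsNew
                ' INSERT INTO Products (...) VALUES (...)
            Case ObjectState.Modified
                ' UPDATE Products SET ... WHERE ProductID = ...
            Case ObjectState.Deleted
                ' DELETE FROM Products WHERE ProductID = ...
        End Select
        _state = ObjectState.Unchanged
    End Sub
End Class
```

This is essentially what a DataRow's RowState gives you for free; the trade-off is that hand-rolled objects avoid the memory overhead Doug mentions, at the cost of writing the tracking and the SQL yourself.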


Jul 21 '05 #17
