O/R Mapper

I was spending time learning to use strongly typed collections instead
of DataSet/DataTable, and using the Enterprise Library application blocks.

Recently I discovered there are a lot of O/R mapper projects: CSLA.NET,
NHibernate and many others.
Actually NHibernate looks unique compared to the others, and easier too,
but I don't know if it's the most efficient O/R mapper out there or
whether I should look for something better.

If anyone has experience with these mappers, please help me find the
most efficient solution.
Thanks...
Jan 8 '06 #1
I have a love-hate relationship with ORMs.

On the one hand, I'm intrigued by the idea of being able to work with
OOP-oriented classes vs DAL code constructs.
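To illustrate the contrast, here is a minimal, framework-neutral C# sketch
(the Customer class and the column names are invented for the example):

// Untyped DataTable access vs. a strongly typed entity class.
using System;
using System.Data;

public class Customer
{
    public int Id;
    public string Name;
}

public static class Demo
{
    public static void Main()
    {
        // DAL-style: column names are strings, so typos and wrong casts
        // only blow up at run time.
        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        DataRow row = table.NewRow();
        row["Id"] = 1;
        row["Name"] = "Acme";
        table.Rows.Add(row);
        string fromTable = (string)table.Rows[0]["Name"];

        // OOP-style: the compiler checks member names and types.
        Customer c = new Customer();
        c.Id = 1;
        c.Name = "Acme";
        string fromObject = c.Name;

        Console.WriteLine(fromTable + " / " + fromObject);
    }
}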

On the other hand, they all have some sort of learning curve. There are a
number of new ones out; DataObjects.net looks promising:
http://www.x-tensive.com/Products/DataObjects.NET/

EasyObjects.NET is another:
http://www.easyobjects.net/

Then you have Gentle.Net (sourceforge.net) and the ones you mentioned. I
think it really boils down to this: Do I want to make the commitment of time
needed to really master this and use it on a continued basis in my
applications?
Peter
--
Co-founder, Eggheadcafe.com developer portal:
http://www.eggheadcafe.com
UnBlog:
http://petesbloggerama.blogspot.com


"Islamegy®" wrote:
I was spending time learning to use strongly typed collections instead
of DataSet/DataTable, and using the Enterprise Library application blocks.

Recently I discovered there are a lot of O/R mapper projects: CSLA.NET,
NHibernate and many others.
Actually NHibernate looks unique compared to the others, and easier too,
but I don't know if it's the most efficient O/R mapper out there or
whether I should look for something better.

If anyone has experience with these mappers, please help me find the
most efficient solution.
Thanks...

Jan 9 '06 #2
I also have a love/hate relationship. I have been using XPO by DevExpress
for the last few months. On one hand it is very easy to implement a lot of
things. I have some reservations about complex queries, etc. and the lack of
performance-type control, but so far for this project it has been going
smoothly. It still remains to be seen how things will go when things become
more complex.

Rob Trainer

"Peter Bromberg [C# MVP]" <pb*******@yahoo.nospammin.com> wrote in message
news:5A**********************************@microsoft.com...
I have a love-hate relationship with ORMs.

On the one hand, I'm intrigued by the idea of being able to work with
OOP-oriented classes vs DAL code constructs.

On the other hand, they all have some sort of learning curve. There are a
number of new ones out; DataObjects.net looks promising:
http://www.x-tensive.com/Products/DataObjects.NET/

EasyObjects.NET is another:
http://www.easyobjects.net/

Then you have Gentle.Net (sourceforge.net) and the ones you mentioned. I
think it really boils down to this: Do I want to make the commitment of time
needed to really master this and use it on a continued basis in my
applications?
Peter
--
Co-founder, Eggheadcafe.com developer portal:
http://www.eggheadcafe.com
UnBlog:
http://petesbloggerama.blogspot.com


"Islamegy®" wrote:
I was spending time learning to use strongly typed collections instead
of DataSet/DataTable, and using the Enterprise Library application blocks.

Recently I discovered there are a lot of O/R mapper projects: CSLA.NET,
NHibernate and many others.
Actually NHibernate looks unique compared to the others, and easier too,
but I don't know if it's the most efficient O/R mapper out there or
whether I should look for something better.

If anyone has experience with these mappers, please help me find the
most efficient solution.
Thanks...

Jan 9 '06 #3
Strongly recommending LLBLGen Pro:

http://www.llblgen.com/

Easy to use, and with some of the best customer support I've ever experienced :)
Jan 9 '06 #4
Peter Bromberg [C# MVP] wrote:
I have a love-hate relationship with ORMs.

On the one hand, I'm intrigued by the idea of being able to work with
OOP-oriented classes vs DAL code constructs.

On the other hand, they all have some sort of learning curve. There
are a number of new ones out; DataObjects.net looks promising:

Then you have Gentle.Net (sourceforge.net) and the ones you
mentioned. I think it really boils down to this: Do I want to make
the commitment of time needed to really master this and use it on a
continued basis in my applications?


so you want to write the same dreaded plumbing code over and over
again instead? :) It's your time, your project and your deadline of
course ;)

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 9 '06 #5
"Islamegy®" <NU****************@yahoo.com> a écrit dans le message de news:
uZ**************@TK2MSFTNGP10.phx.gbl...

| If anyone has experience with these mappers, please help me to find the
| most efficient solution.

Beware of mappers that map database tables to classes; this is not the best
way to do things, as it tries to impose a relational model onto your object
model.

The best mappers are the ones that allow you to design your object model and
then let the mapper take care of creating the database to fit the object
model.

If you can write your application, using a mapper, and you don't need to
know anything about databases, then you have a good mapper.

Joanna

--
Joanna Carter [TeamB]
Consultant Software Engineer
Jan 9 '06 #6
Joanna Carter [TeamB] wrote:
"Islamegy®" <NU****************@yahoo.com> a écrit dans le message de
news: uZ**************@TK2MSFTNGP10.phx.gbl...
If anyone has experience with these mappers, please help me to
find the most efficient solution.
Beware of mappers that map database tables to classes; this is not
the best way to do things, as it tries to impose a relational model
onto your object model.


Ohh yeah, THE doom-scenario. :-/

Perhaps you should elaborate a bit why this is 'bad', as it really
doesn't make a difference. If you have a hierarchy of entities
(supertype/subtype) and each entity is mapped onto its own table/view,
you still have 1:1 mappings between entity and table/view but also have
inheritance. If you start from the object model, you'll end up with the
same tables and with the same mappings.

Furthermore, a lot of people have to write software which has to work
with legacy databases. Your proposal implies that these people can't
use o/r mapping, because they have to work with an 'inferior' setup...
The best mappers are the ones that allow you to design your object
model and then let the mapper take care of creating the database to
fit the object model.
and why is that 'better'? Ever tried to add additional fields to
entities in a system with, say, 500 entities, which is already in
production? Happy migrating with your automatic schema updater :)

Another downside is that often those mappers add meta-data into the
relational model which then makes it impossible to re-use the model +
data in other applications which aren't able to use the o/r mapper core.
If you can write your application, using a mapper, and you don't need
to know anything about databases, then you have a good mapper.


... till your app is in production and a functionality change has to
be implemented. Oops, now suddenly someone has to write migration
scripts for the 1TB of data to the new schema...

software development doesn't end when v1 is done, Joanna.

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 9 '06 #7
Islamegy® wrote:
I was spending time learning to use strongly typed collections instead
of DataSet/DataTable, and using the Enterprise Library application blocks.

Recently I discovered there are a lot of O/R mapper projects: CSLA.NET,
NHibernate and many others.
Actually NHibernate looks unique compared to the others, and easier too,
but I don't know if it's the most efficient O/R mapper out there or
whether I should look for something better.

If anyone has experience with these mappers, please help me find the
most efficient solution.
Thanks...


I have used NHibernate, DeKlarit and LLBLGen; the last definitely has my
vote: good support on the forums, a very nice price for what you get, easy
to grasp, frequent useful updates and a passionate author! What more can
you ask for?

Regards,

Olle
Jan 9 '06 #8

"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in message
news:xn***************@news.microsoft.com...
Perhaps you should elaborate a bit why this is 'bad', as it really
doesn't make a difference. If you have a hierarchy of entities
(supertype/subtype) and each entity is mapped onto its own table/view,
you still have 1:1 mappings between entity and table/view but also have
inheritance. If you start from the object model, you'll end up with the
same tables and with the same mappings.


I find it is very liberating to be free from that shackled thinking of such
a 1:1 relationship.

Not that I'm arguing that ORM products like yours aren't valuable - they are
incredibly powerful for certain scenarios. But they are too limiting in
others, in my experience. Much like products that were called 4th GL's back
in the day - the most powerful RAD tools on the planet but at the same time
limiting when you really needed to push the limits.

I find it particularly odd that design methodologies don't start with the
data, but we jump straight there when we start writing. I've done it myself
many times. But if you were starting a new consulting engagement, you don't
walk into a company and ask "what data do you store?" but rather you ask
"what is it that you DO here?" There are several UML diagrams meant to
document that information, and they are usually done before any design of
the application or its data begins. Doesn't it make sense to flow from use
case documentation to functional design of the business layer that must
perform those things, and then finally to the required data - rather than
from use case to data to business layer? As I understand it, Kathleen
Dollard's book argues pretty strongly for that approach.
Jan 9 '06 #9
Daniel Billingsley wrote:

"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> wrote in
message news:xn***************@news.microsoft.com...
Perhaps you should elaborate a bit why this is 'bad', as it really
doesn't make a difference. If you have a hierarchy of entities
(supertype/subtype) and each entity is mapped onto its own
table/view, you still have 1:1 mappings between entity and
table/view but also have inheritance. If you start from the object
model, you'll end up with the same tables and with the same
mappings.
I find it is very liberating to be free from that shackled thinking
of such a 1:1 relationship.


In an inheritance hierarchy, where each subtype's own fields are
mapped onto the subtype's own table/view, every subtype is actually
mapped on multiple tables/views.

Though the subtype's own fields are mapped on 1 table/view.
If you want to map one entity on multiple tables, why would you want
to do that? The only reason I can think of is a very very bad
relational model.
Not that I'm arguing that ORM products like yours aren't valuable -
they are incredibly powerful for certain scenarios. But they are too
limiting in others, in my experience. Much like products that were
called 4th GL's back in the day - the most powerful RAD tools on the
planet but at the same time limiting when you really needed to push
the limits.
what have O/R mappers to do with 4GL tools? I can't project the flaws
of 4GLs onto O/R mappers to see why o/r mappers are limiting in
several scenarios, so it would help if you could elaborate WHY they
are limiting in some scenarios, and also: which scenarios exactly?

For example: our o/r mapper supports lists based on fields from
related entities, so you can create views from entities you define.
This can greatly benefit a developer if s/he has to fetch data for
reporting or other read-only purposes with aggregates, groupby clauses
etc. etc.

Solely focusing on objects is then pretty limiting, agreed, but most
data-access solutions which are worth using offer a way to fetch data
in other flexible ways.
I find it particularly odd that design methodologies don't start with
the data, but we jump straight there when we start writing. I've
done it myself many times. But if you were starting a new consulting
engagement, you don't walk into a company and ask "what data do you
store?" but rather you ask "what is it that you DO here?" There are
several UML diagrams meant to document that information, and they are
usually done before any design of the applicaton or its data begins.
ever heard of NIAM/ORM (http://www.orm.net ) ? Abstract relational
modelling done in the functional research phase. No-one talks about
data, as data are just bits and bytes and have no meaning without
context, and it's the context that's important.

After the functional research phase has ended and the technical
research phase has ended and software has to be written, you start
thinking about which code you can replace with code pulled off the shelf so
you gain time. No-one considers writing his own grid control a valuable
way to spend time, but apparently a lot of people find it completely
reasonable to spend a lot of time writing data-access plumbing code.

I always say: first see which code you have to write, then check which
code can be replaced by standard components. It's very simple, and you
keep a clear overview because you know what the standard component
replaces.
Doesn't it make sense to flow from use case documentation to
functional design of the business layer that must perform those
things, and then finally to the required data - rather than from use
case to data to business layer? As I understand it, Kathleen
Dollard's book argues pretty strongly for that approach.


again, no-one talks about data. Customer Has Order, Order BelongsTo
Customer. Customer IsIdentifiedBy CustomerID etc. etc. Simple fact
based sentences which formulate the analysis you've been doing and
result in an abstract relational model. You can then use it to produce
a physical datamodel in your RDBMS, or generate classes from it or both.

The idea of using an existing relational model implemented in a
datamodel is that you can easily reverse engineer these models to an
abstract level which represents an equal abstract model which was also
used to create the datamodel in the first place!

Or do you think people with 500+ tables in their database write those
tables by hand directly in an SQL editor? No of course not, they use
modelling tools which give them a higher level overview.

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 10 '06 #10
"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> a écrit dans le
message de news: xn***************@news.microsoft.com...

| > Beware of mappers that map database tables to classes; this is not
| > the best way to do things as it tries to impose a relational model
| > into your object model.
|
| Ohh yeah, THE doom-scenario. :-/

No, not doom, just not the best.

| Perhaps you should elaborate a bit why this is 'bad', as it really
| doesn't make a difference. If you have a hierarchy of entities
| (supertype/subtype) and each entity is mapped onto its own table/view,
| you still have 1:1 mappings between entity and table/view but also have
| inheritance. If you start from the object model, you'll end up with the
| same tables and with the same mappings.

Not true. Just look at the typical example of composition, like an
Invoice:

database version
=============
Invoice
ID
Reference
CustomerID
Date

InvoiceLine
ID
InvoiceID
Qty
ProductID
UnitPrice

object version
==========
InvoiceLine
ID
Qty
Product
UnitPrice

Invoice
ID
Reference
CustomerID
Date
Lines

Note the shift from each Line having to refer to its "owner" Invoice to the
Lines being a part of the Invoice with no back reference necessary.
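In C#, that object version might look like this minimal sketch (plain
illustrative classes; Product is just a stub here, not part of any
particular OPF):

// The Invoice owns its Lines; no InvoiceID back reference is needed
// on the line itself.
using System;
using System.Collections.Generic;

public class Product
{
    public int Id;
    public string Name;
}

public class InvoiceLine
{
    public int Id;
    public int Qty;
    public Product Product;   // object reference instead of a ProductID column
    public decimal UnitPrice;
}

public class Invoice
{
    public int Id;
    public string Reference;
    public int CustomerId;
    public DateTime Date;
    public List<InvoiceLine> Lines = new List<InvoiceLine>();
}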

| Furthermore, a lot of people have to write software which has to work
| with legacy databases. Your proposal implies that these people can't
| use o/r mapping, because they have to work with an 'inferior' setup...

Not true. A good OPF allows you to map objects to legacy databases and
customise storage/retrieval beyond the simplistic 1:1 mapping.

| > The best mappers are the ones that allow you to design your object
| > model and then let the mapper take care of creating the database to
| > fit the object model.
|
| and why is that 'better'? Ever tried to add additional fields to
| entities in a system with, say, 500 entities, which is already in
| production? Happy migrating with your automatic schema updater :)

Yes, it works very nicely thank you. Don't judge all OPFs by those you
already know; some of us took the time and effort to ensure a comprehensive
design.

| Another downside is that often those mappers add meta-data into the
| relational model which then makes it impossible to re-use the model +
| data in other applications which aren't able to use the o/r mapper core.

Again, only in inferior mappers.

| > If you can write your application, using a mapper, and you don't need
| > to know anything about databases, then you have a good mapper.
|
| ... till your app is in production and a functionality change has to
| be implemented. Ooops, now suddenly someone has to write migration
| scripts for the 1TB of data to the new schema...

Not true. You really haven't seen a good OPF yet, have you?

| software development doesn't end when v1 is done, Joanna.

Which is why my OPF is several versions old now :-)

Joanna

--
Joanna Carter [TeamB]
Consultant Software Engineer
Jan 13 '06 #11
Joanna Carter [TeamB] wrote:
"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> a écrit dans
le message de news: xn***************@news.microsoft.com...
| > Beware of mappers that map database tables to classes; this is not
| > the best way to do things as it tries to impose a relational model
| > into your object model.

Ohh yeah, THE doom-scenario. :-/
No, not doom, just not the best.


what the definition of 'best' is in your statement is still vague.
Perhaps you should elaborate a bit why this is 'bad', as it really
doesn't make a difference. If you have a hierarchy of entities
(supertype/subtype) and each entity is mapped onto its own
table/view, you still have 1:1 mappings between entity and
table/view but also have inheritance. If you start from the object
model, you'll end up with the same tables and with the same
mappings.


Not true. If you just look at the typical example of Composition like
an Invoice :

database version
=============
Invoice
ID
Reference
CustomerID
Date

InvoiceLine
ID
InvoiceID
Qty
ProductID
UnitPrice

object version
==========
InvoiceLine
ID
Qty
Product
UnitPrice

Invoice
ID
Reference
CustomerID
Date
Lines

Note the shift from each Line having to refer to its "owner" Invoice
to the Lines being a part of the Invoice with no back reference
necessary.


Ok, though that would make fast single-entity data manipulation pretty
difficult, as you always have to have the Invoice object around.

But, tell me: why is this 'better'? Of course wacky setups are
thinkable where there are clear differences, the question is if this
will bring you any advantage.

Btw, you still have one entity mapped onto one db object, which is
what I was arguing. Of course I wasn't arguing that every field in the
table/view has to be present in the entity and vice versa. I was arguing
that if you pick an entity type E, it has one target db object, and if
you use inheritance, it inherits the fields and target objects of its
supertypes, but that's it.
| > The best mappers are the ones that allow you to design your object
| > model and then let the mapper take care of creating the database to
| > fit the object model.

and why is that 'better'? Ever tried to add additional fields to
entities in a system with, say, 500 entities, which is already
in production? Happy migrating with your automatic schema updater :)
Yes, it works very nicely thank you. Don't judge all OPFs by those
you already know; some of us took the time and effort to ensure a
comprehensive design.


Yes, I know. 3 years full time now and counting.

Btw I was referring to the model you proposed: add fields / hierarchy
elements to classes and let a tool propagate the changes to the db.
(which you seem to declare 'best'). The propagation of changes to the
db can be cumbersome in a huge schema with lots of data.
| > If you can write your application, using a mapper, and you don't need
| > to know anything about databases, then you have a good mapper.

... till your app is in production and a functionality change has
to be implemented. Oops, now suddenly someone has to write
migration scripts for the 1TB of data to the new schema...


Not true. You really haven't seen a good OPF yet, have you ?


So, your 'superior' mapper always cranks out the proper upgrade
scripts, and takes into account the full load of the db? Remember: you
claimed that starting with classes and letting the o/r mapper update the
db according to the changes made to the class model is 'best'. So, 'not
true', what does that really mean?
software development doesn't end when v1 is done, Joanna.


Which is why my OPF is several versions old now :-)


mine too, so that's not an argument. Besides, I was referring to the
business application built with the o/r mapper, which is for example a
year in production and needs an update.

It has a lot of tables and data. You suggest the developer should just
update the class model and let the o/r mapper alter the db schema; I
argued that that will be difficult in a schema in production with a lot
of data.

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 13 '06 #12
"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> a écrit dans le
message de news: xn***************@news.microsoft.com...

| what the definition of 'best' is in your statement is still vague.

I did not state that my OPF was the best, just that deriving classes from
tables was not the best. Unfortunately, until we get a "perfect" OODB, we
will all be working with compromise due to the impedance mismatch between
the two paradigms.

| But, tell me: why is this 'better' ? Of course wacky setups are
| thinkable where there are clear differences, the question is if this
| will bring you any advantage.

Because this is an accurate modelling of the real compositional nature of
such classes; it is not abstracted to "make it fit" a relational database.

| Yes, I know. 3 years full time now and counting.

Heheh, then it should start getting easier now :-))

| Btw I was referring to the model you proposed: add fields / hierarchy
| elements to classes and let a tool propagate the changes to the db.
| (which you seem to declare 'best'). The propagation of changes to the
| db can be cumbersome in a huge schema with lots of data.

As I have already said, I did not say that this method was "the best", just
that it avoids having an object model coerced into a relational one.

| So, your 'superior' mapper always cranks out the proper upgrade
| scripts, and takes into account the full load of the db ? Remember: you
| claimed starting with classes and let the o/r mapper update the db
| according to the changes made to the class model be 'best'. So, 'not
| true', what does that really mean?

We don't write migration scripts; the OPF looks after that for us, since we
wrote the scripting engine.

Our OPF evaluates execution times and attempts to optimise the generated
SQL; if it still can't achieve desired speeds, then we have a facility for
injecting a custom generator for any given type.

Most complex data retrieval that would require joins, etc tends to be for
what I would call reporting classes. IOW, they are complex queries for
something like, how many of a certain product were sold to Mr Bloggs in
2004. For these scenarios, we write a class like ProductSalesSummary and
ensure that the OPF contains a custom query, handcrafted if necessary.

I am not saying that your design doesn't have value; it certainly makes for
quick migration of legacy databases, but for new-build projects we have
found the object-led approach works very well.

Joanna

--
Joanna Carter [TeamB]
Consultant Software Engineer
Jan 13 '06 #13
Joanna Carter [TeamB] wrote:
"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> a écrit dans
le message de news: xn***************@news.microsoft.com...
what the definition of 'best' is in your statement is still vague.
I did not state that my OPF was the best, just that deriving classes
from tables was not the best. Unfortunately, until we get a "perfect"
OODB, we will all be working with compromise due to the impedance
mismatch between the two paradigms.


Well, if you follow this analogy:
- create NIAM model from reality
- create E/R model from niam model
- create tables from E/R model

- create entity types back from tables,

you get back your e/r model. With the proper tooling you get back the
niam model from the reverse engineered e/r model.

The advantage is that you can use NIAM/ORM for functional analysis and
also use the results of that for the db schema, which results in a
relational model built on entity definitions, which are also the base
of what you will work with when you reverse engineer it. At the same
time you have the freedom to adjust the entity definitions in your
classes to the application, without jeopardizing the route from
functional research + abstract model -> e/r model -> tables.

In a lot of organisations, this is a proper way of designing the
database, often done by a person whose job it is to design the
schemas/maintain the schemas because that person is the specialist for
relational models. Although I'd love to say that programs can do that
job for that person, it's often a lot of 'interpretation' of the
reality to come to the proper result.
But, tell me: why is this 'better' ? Of course wacky setups are
thinkable where there are clear differences, the question is if
this will bring you any advantage.


Because this is an accurate modelling of the real compositional
nature of such classes; it is not abstracted to "make it fit" a
relational database.


though neither would the other way around.
Btw I was referring to the model you proposed: add fields /
hierarchy elements to classes and let a tool propagate the changes
to the db. (which you seem to declare 'best'). The propagation of
changes to the db can be cumbersome in a huge schema with lots of
data.


As I have already said, I did not say that this method was "the
best", just that it avoids having an object model coerced into a
relational one.


yeah well, if you start with the classes and create the relational
model from that, you of course cut corners to make it as easy as
possible for the classes to get persisted and let the o/r mapper do its
job, which will compromise what otherwise would have been a solid
relational model.

It's hard, but doable, to reproduce an abstract model of entities
which aren't 1:1 copies of table definitions, and which are also usable
in hierarchies for example (supertype/subtype), which already moves away
from the relational approach, as inheritance isn't modelable in a
relational model without semantic interpretation of the relational
model and/or the contained data.
So, your 'superior' mapper always cranks out the proper upgrade
scripts, and takes into account the full load of the db ?
Remember: you claimed starting with classes and let the o/r mapper
update the db according to the changes made to the class model be
'best'. So, 'not true', what does that really mean?


We don't write migration scripts; the OPF looks after that for us,
since we wrote the scripting engine.


Though how a production database is migrated often depends on the
amount of data in the tables, not the structure of the tables itself,
and because a production database with a lot of data is hard to back up
/ restore easily, migrating these databases can be a task which is
often done by hand, tested in labs first etc.
Our OPF evaluates execution times and attempts to optimise the
generated SQL; if it still can't achieve desired speeds, then we have
a facility for injecting a custom generator for any given type.
Clever.
Most complex data retrieval that would require joins, etc tends to be
for what I would call reporting classes. IOW, they are complex
queries for something like, how many of a certain product were sold
to Mr Bloggs in 2004. For these scenarios, we write a class like
ProductSalesSummary and ensure that the OPF contains a custom query,
handcrafted if necessary.
I do similar things: you can define a list based on fields in the
entities in the project (which support inheritance) which are related,
and you can also do that in code, dynamically, using the compile-time
checked query language to build the query with aggregates, group-by
etc. for reporting data.

Not a lot of the o/r mappers out there offer that kind of feature, as
most of them use the entity as their smallest block to work with, while
if they would use the attribute (entity field), dynamic lists and the
like would be possible.
I am not saying that your design doesn't have value; it certainly
makes a quick migration for legacy databases but for new-build
projects, we have found the object-led approach works very well.


I also think what the core information provider for your design is, is
important. If you, as I stated above, use NIAM/ORM or a similar highly
abstract modelling technique to model the reality with the customer
during the functional research phase, you have other sources of
information than when you start with a UML model. It's not that the
other way of doing things doesn't work or works less well, it's just
that a person should realize that starting with the DB requires a
properly designed relational model, and starting with classes requires
a properly designed class model. If people don't realize that, no
matter what technique they choose, it will be cumbersome.

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 13 '06 #14
"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> a écrit dans le
message de news: xn***************@news.microsoft.com...

| Well, if you follow this analogy:
| - create NIAM model from reality
| - create E/R model from niam model
| - create tables from E/R model
|
| - create entity types back from tables,
|
| you get back your e/r model. With the proper tooling you get back the
| niam model from the reverse engineered e/r model.

Having never found a use for things like NIAM (I had to look it up to see
what it was), I can't see what you mean. I start my modelling by defining
classes that contain not only data but also functionality, as OO design is
not just about data in isolation.

Then, if instances of that class require persisting, I then go on to decide
which attributes (data) require persisting and then map those attributes to
fields of tables in the OPF. Many classes in my systems do not actually
need persisting at all as they are "worker" classes, used to manipulate
other objects.

| In a lot of organisations, this is a proper way of designing the
| database, often done by a person who's job it is to design the
| schemas/maintain the schemas because that person is the specialist for
| relational models. Although I'd love to say that programs can do that
| job for that person, it's often a lot of 'interpretation' of the
| reality to come to the proper result.

We find that, using our approach, employing a DBA would be serious overkill,
as all the tables are created automatically, adjusted automatically, and the
only SQL involved is simple Select, Update, Insert and Delete statements.

| though neither would the other way around.

Having to add foreign key fields and referential integrity constraints into
a database where they do not exist in the object model is corrupting the
original model. Relational theory was invented to accommodate the need to
store data in a relational database. If you were able to use a well designed
OODB, then you would no longer need relational databases. Why should I have
to store the invoice lines separately from the invoice if they are an
integral part of that invoice? Only because a relational database requires
me to do it that way.

| yeah well, if you start with the classes and create the relational
| model from that, you of course cut corners to make it as easy as
| possible for the classes to get persisted and let the o/r mapper do its
| job, which will compromise what otherwise would have been a solid
| relational model.

I don't cut corners in class design, I use the OPF to translate the object
model into a relational one. However, due to the minimal requirements of
implementing an object storage mechanism, the majority of what you call a
"solid" relational model simply never gets done; it is the job of the
business classes themselves to maintain inter-object integrity.

| It's hard, but doable to reproduce an abstract model of entities
| which aren't 1:1 copies of table definitions, and which are also usable
| in hierarchies for example (supertype/subtype) which already move away
| from the relational approach as inheritance isn't modelable in a
| relational model without semantic interpretation of the relational
| model and / or containing data.

It looks like we must agree to differ. You seem to think that everything has
to be solved using relational modelling, whereas I only use RDBs as a
temporary store for objects.

| Though how a production database is migrated is often depending on the
| amount of data in the tables, not the structure of the tables itself,
| and because a production database with a lot of data is hard to backup
| / restore easily, migrating these databases can be a task which is
| often done by hand, tested in labs first etc.

If I were to migrate from one database manufacturer to another, I would
simply use one type of connection to read in the objects and then write them
out using a different type of connection. Both connections are written to
accommodate any peculiarities of the particular DB to which they are talking.

| I do similar things: you can define a list based on fields in the
| entities in the project (which support inheritance) which are related,
| and you can also do that in code, dynamically. Using the compile-time
| checked query language then to build the query with aggregates, groupby
| etc. for reporting data.
|
| Not a lot of the o/r mappers out there offer that kind of features, as
| most of them use the entity as their smallest block to work with, while
| if they would use the attribute (entity field), dynamic lists and the
| like would be possible.

Certainly, we use an attribute (property/field) as the smallest granularity.

We have been known to create database views that manage inherited classes,
so that the base/derived classes are stored in "partial" tables linked on
the instance ID, but we tend to keep most classes to a full class per table,
even if that means the same columns in multiple tables. This is an
optimisation that pays off in retrieval times at the expense of being able
to easily retrieve heterogeneous lists of all instances of a hierarchy.

| I also think what is the core information provider for your design is
| important. If you, as I stated above, use NIAM/ORM or similar highly
| abstract modelling technique to design the reality with the customer
| during the functional research phase, you have other sources of
| information than when you start with a UML model. It's not that the
| other way of doing things doesn't work or works less good, it's just
| that a person should realize that by starting with the DB requires a
| properly designed relational model, and starting with classes requires
| a properly designed class model. If people don't realize that, no
| matter what technique they choose, it will be cumbersome.

We have never needed NIAM/ORM; we usually start with UML class diagrams and
use case diagrams. When we have a tested, working object model, then we look
at persistence. Don't forget, in theory we could just keep all the objects
in memory as in something like Prevayler, only writing the whole object
graph to disk when the application quits.

We really are approaching application design from two different directions;
you from the relational driven background where the world can be ruled by
RDBs :-) , I from the OO background where RDBs are mere (and inefficient)
slaves to the object world :-)

Joanna

--
Joanna Carter [TeamB]
Consultant Software Engineer
Jan 13 '06 #15
Many factors will probably play into your decision, including:
- Can you start with the domain model, and are you free to design your
schema?
- Or do you have an existing database schema which you cannot modify?

Also, the amount of data you want to process, the deployment scenarios,
the number of concurrent users, the kind of data access (OLAP vs. Data
Processing/Number Crunching) might all have an impact on your decision.
You may have to support multiple different SQL dialects, e.g. MS SQL
vs. MySQL vs. Oracle vs. DB/2. If that is the case, you need to figure
out how to handle the differences yourself, or you can use an OR framework
that does it for you.

You might also consider performance issues, e.g. what happens if you
process 100,000 objects in a list, each of them referencing another set
of objects. When you access the list then do you want to load all the
100,000 objects immediately? When you access one of the 100,000
objects, do you want the OR framework to load the other referenced
objects as well? Think of a network of objects: Where do you want to
"cut-off" loading? Do you want to use lazy loading? How does this
related to caching? Do you want to be able to support a distributed
cache? Read/write? Read-caching only?
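To make the cut-off question concrete, here is a minimal lazy-loading
sketch in C# (the loader delegate is a stand-in for whatever fetch
mechanism a real framework supplies):

// The related Order list is fetched only on first access; nothing is
// paid until the collection is actually touched.
using System;
using System.Collections.Generic;

public class Order
{
    public int Id;
}

public class Customer
{
    private readonly Func<List<Order>> _loadOrders; // supplied by the mapper
    private List<Order> _orders;                    // null until first touch

    public Customer(Func<List<Order>> loadOrders)
    {
        _loadOrders = loadOrders;
    }

    public List<Order> Orders
    {
        get
        {
            if (_orders == null)
            {
                _orders = _loadOrders(); // one extra round trip, on demand
            }
            return _orders;
        }
    }
}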

The point I want to make is that selecting the "best" OR framework does
not reduce the number of questions you need to answer. At best you get
a different set of questions.

Also, assuming that introducing an OR framework completely hides the
database access/product/etc. is naive at best. You still need to know
what is going on under the hood. For instance, you probably want to
know what kind of SQL statement is being sent to the server.
From a project I remember a case where a simple method retrieved two
strings from the database. As the team didn't have a lot of
experience with the OR tool, they just did what they thought was the
straightforward approach. In the end the SQL statement was a join over
a dozen tables and extremely slow. In that particular case the issue was
solved with a fairly simple stored procedure wrapped by a simple method
call.
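The wrapper amounted to little more than this kind of call (the procedure
and parameter names below are hypothetical, reconstructed for
illustration):

// Hand-tuned stored procedure replacing the mapper-generated
// twelve-table join; the method signature stays trivially simple.
using System;
using System.Data;
using System.Data.SqlClient;

public static class LabelLookup
{
    public static string[] GetLabels(string connectionString, int entityId)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("dbo.GetEntityLabels", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@EntityId", entityId);
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                if (!reader.Read())
                {
                    throw new InvalidOperationException("No row returned");
                }
                return new string[] { reader.GetString(0), reader.GetString(1) };
            }
        }
    }
}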

As a side note: beware of people who think that mapping tables to
classes is identical to mapping a good object-oriented design to
tables, and that you end up with the same set of tables in both cases.
It usually raises a red flag if someone does not know the difference
between a class diagram and an ERD.

Bottom line, my suggestions would be:
- Depending on your project size, run a little experiment with different
approaches, determine selection criteria, and finally make the
make-or-buy decision.
- If you roll your own mapping, you should be extremely confident that
you can do better than those who implemented the available frameworks,
or you should have an excellent justification.

Best regards,
Manfred.
---
Manfred Lange
http://www.manfred-lange.com
http://manfredlange.blogspot.com
ml at agileutilities dot com
Islamegy® wrote:
I was spending time learning to use strongly typed collections instead
of DataSet/DataTable, and using the Enterprise Library application blocks.

Recently I discovered there are a lot of O/R mapper projects: CSLA.NET,
NHibernate and many others.
Actually NHibernate looks unique compared to the others, and easier too,
but I don't know if it's the most efficient O/R mapper out there or
whether I should look for something better.

If anyone has experience with these mappers, please help me find the
most efficient solution.
Thanks...


Jan 13 '06 #16
"Ever tried to add addional fields to
entities in a system with, say 500 entities, and which is already in
production ? happy migrating with your automatic schema updater :) "

This doesn't need to be an issue in all cases. We have designed our
product in a way that the customer can add his own fields/properties to
any domain object. These custom fields/properties even 'survive'
updates of the software.

If we would want to add a specific field/property to all domain objects
then we would simply extend the common base class for all domain
objects.
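One common way to get that kind of extensibility (a sketch of the general
pattern only, not necessarily our exact implementation) is a property bag
on the common base class, persisted to a side table keyed by object ID:

// Customers can attach their own fields to any domain object without
// touching the mapped class definitions or the generated schema.
using System.Collections.Generic;

public abstract class DomainObject
{
    private readonly Dictionary<string, object> _customProperties =
        new Dictionary<string, object>();

    public void SetCustomProperty(string name, object value)
    {
        _customProperties[name] = value;
    }

    public object GetCustomProperty(string name)
    {
        object value;
        return _customProperties.TryGetValue(name, out value) ? value : null;
    }
}

public class Invoice : DomainObject
{
    public int Id;
}

Because the custom values live outside the mapped columns, a software
update that changes the regular schema leaves them untouched.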

I think it depends on the requirements and in particular on the
object-oriented design of your system, and how the domain model is
mapped onto the schema.

The story certainly changes if other factors play a role, e.g. the
database schema is given and cannot be changed.

Best regards,
Manfred.
---
http://www.manfred-lange.com
http://manfredlange.blogspot.com
ml at agileutilities dot com

Jan 13 '06 #17
"Manfred" <ml@agileutilities.com> a écrit dans le message de news:
11**********************@g43g2000cwa.googlegroups. com...

| This doesn't need to be an issue in all cases. We have designed our
| product in a way that the customer can add his own fields/properties to
| any domain object. These custom fields/properties even 'survive'
| updates of the software.

Have you been reading my source code ? :-))

| The story certainly changes if other factors play a role, e.g. the
| database schema is given and cannot be changed.

Heheh, especially if you have a DBA who insists that you only use their
stored procs, etc :-(

Joanna

--
Joanna Carter [TeamB]
Consultant Software Engineer
Jan 13 '06 #18
"Manfred" <ml@agileutilities.com> a écrit dans le message de news:
11**********************@g14g2000cwa.googlegroups. com...
>You might also consider performance issues, e.g. what happens if you
process 100,000 objects in a list, each of them referencing another set
of objects. When you access the list then do you want to load all the
100,000 objects immediately? When you access one of the 100,000
objects, do you want the OR framework to load the other referenced
objects as well? Think of a network of objects: Where do you want to
"cut-off" loading? Do you want to use lazy loading? How does this
related to caching? Do you want to be able to support a distributed
cache? Read/write? Read-caching only?
<

I have found useful the idea of "proxy" instances, where objects in lists
are only partially loaded, with just the properties essential to browsing,
e.g. ID, Name, Description; the rest of the properties are only loaded when
you select an object for editing.
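A minimal sketch of such a proxy (the loader delegate is illustrative; a
real OPF would supply it):

// List views bind to the cheap summary; the remaining columns are
// fetched only when the user opens the object for editing.
using System;

public class ProductSummary
{
    public int Id;
    public string Name;
    public string Description;
}

public class Product : ProductSummary
{
    public decimal UnitPrice;
    public string LongSpecText;
}

public class ProductProxy : ProductSummary
{
    private readonly Func<int, Product> _loadFull; // supplied by the OPF
    private Product _full;

    public ProductProxy(Func<int, Product> loadFull)
    {
        _loadFull = loadFull;
    }

    public Product OpenForEdit()
    {
        if (_full == null)
        {
            _full = _loadFull(Id); // fetch the rest of the columns once
        }
        return _full;
    }
}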
>The point I want to make is that selecting the "best" OR framework does
not reduce the number of questions you need to answer. At best you get
a different set of questions.
<

Agreed
>From a project I remember a case where a simple method retrieved two
strings from the database. As the team didn't have a lot of
experience with the OR tool, they just did what they thought was the
straightforward approach. In the end the SQL statement was a join over
a dozen tables and extremely slow. In that particular case the issue was
solved with a fairly simple stored procedure wrapped by a simple method
call.
<

We have also used a method whereby we issue a series of single-table
selects in sequence, basing the where clause of each on the results of the
first query.
>As a side note: beware of people who think that mapping tables to
classes is identical to mapping a good object-oriented design to
tables, and that you end up with the same set of tables in both cases.
It usually raises a red flag if someone does not know the difference
between a class diagram and an ERD.
<

Amen to that !

Joanna

--
Joanna Carter [TeamB]
Consultant Software Engineer
Jan 13 '06 #19

Islamegy® wrote:
I was spending time learning to use strongly typed collections instead
of DataSet/DataTable, and using the Enterprise Library application blocks.
I tried this route, too. It takes too long and you won't really be able
to build in all the features that a framework can offer.
Recently I discovered there are a lot of O/R mapper projects: CSLA.NET,
NHibernate and many others.
Actually NHibernate looks unique compared to the others, and easier too,
but I don't know if it's the most efficient O/R mapper out there or
whether I should look for something better.
It may depend on whether your database is supported by the framework. Most
C# folks are sucked into the MS universe and settle for SQL Server.
Most of the .Net frameworks will support that.
If anyone has experience with these mappers, please help me find the
most efficient solution.


It depends, of course, on your application. If you are looking at
SmartClient applications and not web apps, take a look at DevForce from
IdeaBlade (www.ideablade.com). It has an ORM component. However, it
also has data-binding classes for the UI part of the application. IMO,
they understand the needs and realities of most business application
development. And they solve them with some great tools. For some of the
best documentation around, check out their Concepts guide. As a
customer, I have also found their support to be extremely responsive,
perceptive, and smart.

Conceptually, you can picture DevForce as a usable implementation of
CSLA, which seems to be a highly over-engineered approach. R. Lhotka,
the CSLA author, is a member of their advisory board.

Jan 13 '06 #20
Joanna Carter [TeamB] wrote:
"Frans Bouma [C# MVP]" <pe******************@xs4all.nl> a écrit dans
le message de news: xn***************@news.microsoft.com...
Well, if you follow this analogy:
- create NIAM model from reality
- create E/R model from niam model
- create tables from E/R model

- create entity types back from tables,

you get back your e/r model. With the proper tooling you get back
the niam model from the reverse engineered e/r model.
Having never found a use for things like NIAM (I had to look it up to
see what it was), I can't see what you mean. I start my modelling by
defining classes that contain, not only data, but also functionality
as OO design is not just about data in isolation.

Then, if instances of that class require persisting, I then go on to
decide which attributes (data) require persisting and then map those
attributes to fields of tables in the OPF. Many classes in my
systems do not actually need persisting at all as they are "worker"
classes, used to manipulate other objects.


Ok, I think we then use a different approach towards creating
software. Which is perfectly fine of course, though it requires that
the proper tooling/methods are used which fit with the approach chosen.
I'm perhaps pretty old-school in this regard (Yourdon, Halpin etc.)
In a lot of organisations, this is a proper way of designing the
database, often done by a person whose job it is to design the
schemas/maintain the schemas because that person is the specialist
for relational models. Although I'd love to say that programs can
do that job for that person, it's often a lot of 'interpretation'
of the reality to come to the proper result.


We find that, using our approach, employing a DBA would be serious
overkill as all the tables are created automatically, adjusted
automatically and the only SQL involved is simple Select, Update,
Insert and Delete statements


Heh :) well, as you might have read some of my anti-stored-procedure
articles, you might have guessed I'm not that fond of the average
DBA "stored procedures are the way to go" approach, though I think they
have their place, more on the admin side, where tuning of RDBMS
parameters is required (e.g.: parameters outside the application scope,
like index bucket fill rate etc.)
though neither would the other way around.


Having to add foreign key fields and referential integrity
constraints into a database where they do not exist in the object
model is corrupting the original model. Relational theory was
invented to accommodate the need to store data in a relational
database. If you were able to use a well designed OODB, then you
would no longer need relational databases. Why should I have to store
the invoice lines separately from the invoice if they are an integral
part of that invoice? Only because a relational database requires me
to do it that way.


If everything were as small as the sole entity, it would
indeed be the right approach. Though when I want a list of all
customers' company names and the order dates of their last orders, I
effectively create a new entity ('relation' in Codd's terms) which is a
first-class citizen of the relational model, but a bit of an ugly
stepchild in an OODB.
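Concretely, that derived list is just a projection; in ADO.NET terms
(table and column names are illustrative):

// The result set (company name + last order date) is itself a relation:
// trivial in SQL, but it corresponds to no stored class in an OODB.
using System;
using System.Data.SqlClient;

public static class LastOrderList
{
    public static void Print(string connectionString)
    {
        const string sql =
            "SELECT c.CompanyName, MAX(o.OrderDate) AS LastOrderDate " +
            "FROM Customer c " +
            "JOIN [Order] o ON o.CustomerId = c.CustomerId " +
            "GROUP BY c.CompanyName";

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1:d}",
                        reader.GetString(0), reader.GetDateTime(1));
                }
            }
        }
    }
}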

Also, OODBs have been around for a long time now, but never gained any
ground, while OO languages have been around even longer, so something
must make them not really applicable to real-world scenarios.

Though I agree: if (not when) an application is fully suited to an
OODB, you don't need an o/r mapper.
yeah well, if you start with the classes and create the relational
model from that, you of course cut corners to make it as easy as
possible for the classes to get persisted and let the o/r mapper
do its job, which will compromise what otherwise would have been a
solid relational model.


I don't cut corners in class design, I use the OPF to translate the
object model into a relational one. However, due to the minimal
requirements of implementing an object storage mechanism, the
majority of what you call a "solid" relational model simply never
gets done; it is the job of the business classes themselves to
maintain inter-object integrity.


I think we disagree on that based on where we both come from and how
we look at the problem.
It's hard, but doable to reproduce an abstract model of entities
which aren't 1:1 copies of table definitions, and which are also
usable in hierarchies for example (supertype/subtype) which
already move away from the relational approach as inheritance
isn't modelable in a relational model without semantic
interpretation of the relational model and / or containing data.


It looks like we must agree to differ. You seem to think that
everything has to be solved using relational modelling, whereas I
only use RDBs as a temporary store for objects.


I don't think 'everything' has to be solved with relational modeling;
I'm just practical. Just try it for a small, simple customer-order
app: go to http://www.orm.net (the successor of NIAM), and read the basic
rules; it's very simple. You can model the reality without talking
about databases, classes or other things which map 1:1 to a technical
construct (table/class, whatever). I think that's a great advantage, but
I agree with you that we think differently, and as long as we both
agree that the other approach is also doable but just 'different', it's
fine by me. :) I know a lot of people use E/R modelling and are very
successful, but also know that a lot of people use DDD and are very
successful.
I do similar things: you can define a list based on fields in the
entities in the project (which support inheritance) which are
related, and you can also do that in code, dynamically. Using the
compile-time checked query language then to build the query with
aggregates, groupby etc. for reporting data.

Not a lot of the o/r mappers out there offer that kind of
feature, as most of them use the entity as their smallest block
to work with, while if they would use the attribute (entity
field), dynamic lists and the like would be possible.


Certainly, we use an attribute (property/field) as the smallest
granularity.

We have been known to create database views that manage inherited
classes, so that the base/derived classes are stored in "partial"
tables linked on the instance ID, but we tend to keep most classes to a
full class per table, even if that means the same columns in multiple
tables. This is an optimisation that pays off in retrieval times at
the expense of being able to easily retrieve heterogeneous lists of
all instances of a hierarchy.


though wouldn't you agree that if said database is, for example, also to
be used by another application not using your o/r mapper, the database
would look 'un-normalized' or 'not properly normalized', which would
be a bit of a problem?
I also think what is the core information provider for your design
is important. If you, as I stated above, use NIAM/ORM or similar
highly abstract modelling technique to design the reality with the
customer during the functional research phase, you have other
sources of information than when you start with a UML model. It's
not that the other way of doing things doesn't work or works less
good, it's just that a person should realize that by starting with
the DB requires a properly designed relational model, and starting
with classes requires a properly designed class model. If people
don't realize that, no matter what technique they choose, it will
be cumbersome.


We have never needed NIAM/ORM; we usually start with UML class
diagrams and use case diagrams. When we have a tested, working object
model, then we look at persistence. Don't forget, in theory we could
just keep all the objects in memory as in something like Prevayler,
only writing the whole object graph to disk when the application
quits.


sure, though querying and reporting would be a big pain ;)
We really are approaching application design from two different
directions; you from the relational driven background where the world
can be ruled by RDBs :-) , I from the OO background where RDBs are
mere (and inefficient) slaves to the object world :-)


hehe :) I can live with that conclusion :)

Cheers

Frans

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 14 '06 #21
Manfred wrote:
Also, assuming that introducing an OR framework completely hides the
database access/product/etc. is naive at best. You still need to know
what is going on under the hood. For instance, you probably want to
know what kind of SQL statement is being sent to the server.
Isn't that contradicting your statement below that it apparently
isn't important which relational model is created by the mapper? After
all, mapping classes to tables suggests that tables follow classes,
which means that what you're doing with the classes drives the
structure of the tables, as the tables are just storage for what you've
written in code.
From a project I remember a case where a simple method retrieved
two strings from the database. As the team didn't have a lot of
experience with the OR tool, they just did what they thought was the
straightforward approach. In the end the SQL statement was a join
over a dozen tables and extremely slow. In that particular case the
issue was solved with a fairly simple stored procedure wrapped by a
simple method call.

As a side note: beware of people who think that mapping tables to
classes is identical to mapping a good object-oriented design to
tables, and that you end up with the same set of tables in both cases.
It usually raises a red flag if someone does not know the difference
between a class diagram and an ERD.


again, claiming things without proving why isn't helping the
discussion.

Let me help you, as I'm getting pretty tired of claims from people
without any proof. If you have something to say about what is bad or
good, explain why, or don't claim things.

Let's say we define the following entities in the scope of our
application (among other entities):

Employee
Manager (subtype of employee)
BoardMember (subtype of manager)
CompanyCar
FamilyCar (subtype of CompanyCar)
SportsCar (subtype of CompanyCar).

We also identify the rule that only a boardmember can have a company
car.
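
As a C# class model, those six entities could look like the sketch below; the names come from the list above, but the member shapes are my own assumption:

    // The six entities as a plain C# hierarchy. The Car field on BoardMember
    // encodes the rule that only a boardmember can have a company car.
    public class Employee
    {
        public string Name;
    }

    public class Manager : Employee { }

    public class BoardMember : Manager
    {
        public CompanyCar Car; // the m:1 relation to companycar
    }

    public class CompanyCar
    {
        public string Brand;
    }

    public class FamilyCar : CompanyCar { }

    public class SportsCar : CompanyCar { }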

If you use NIAM/ORM, you'd define a supertype employee, then a subtype
manager, then a subtype of that called boardmember. You'd also define a
supertype companycar and 2 subtypes of it called familycar and
sportscar. You'd also create an m:1 relation between boardmember and
companycar, as that realizes the rule that only boardmembers can
have a company car.

You can do the exact same thing in UML.

No tables have been created yet. OK. Creating tables from this
abstract model raises an interesting question: you can model
hierarchies in 2 ways: every subtype in a separate table with a 1:1
relation over the PK, or you can flatten the hierarchy and store
everything in a single table with a discriminator column (there are
other ways, which are actually combinations of these two).

As there is a relation between boardmember and companycar which is
thus non-existent for manager and employee, it's better to opt for the
one-table-per-entity modelling there. However, for the
companycar-familycar/sportscar hierarchy it doesn't matter, so we
flatten that hierarchy.

The tables we end up with are:
- employee
- manager (fk on pk to pk of employee)
- boardmember (fk on pk to manager, fk to companycar)
- companycar (with discriminatorcolumn cartype)

so 4 tables instead of 6.

Now, let's say the system architect uses NIAM/ORM or another abstract
modelling technique and creates those 4 tables. We then use the
table->class approach, and what do we get?

4 entities.

So we miss 2. Now, from an OO perspective, this is lacking, and I
fully agree (in LLBLGen Pro you can easily re-create the complete
hierarchy as I explained earlier, recreating the complete abstract
model as it was intended: at first it finds the 4 entities and the
hierarchy of employee-manager-boardmember, and you then create the
subtypes familycar and sportscar).

Though what's the real deal? Isn't all this rather a matter of
semantics? If I have just 4 types, and just a companycar, does that
prevent me from writing software that works? No, not at all, because
people have been writing software with that paradigm for over 3 decades
now. It's just that it isn't that EASY to work with, because you can
make mistakes (if you don't manually interpret the cartype value, you
might treat a companycar row as a nice sportscar while it's a lame
familycar!)
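
In code, 'interpreting the cartype value' boils down to a small factory when materializing rows from the flattened companycar table. A sketch reusing the classes above; the column names and the discriminator values "F" and "S" are assumptions:

    using System.Data;

    public class CompanyCarFactory
    {
        // Turn a flattened companycar row back into the right CLR type by
        // inspecting the discriminator column; forgetting this switch is
        // exactly how a lame familycar gets treated as a nice sportscar.
        public static CompanyCar Materialize(IDataRecord row)
        {
            CompanyCar car;
            switch (row["cartype"].ToString())
            {
                case "F": car = new FamilyCar(); break;
                case "S": car = new SportsCar(); break;
                default: car = new CompanyCar(); break;
            }
            car.Brand = row["brand"].ToString();
            return car;
        }
    }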

Though never forget: the data in the rows in the db is just data, and
gets its semantic value, in the sense of inheritance and what TYPE it
represents, from the context in which it's read. This means the types
go beyond what the relational model offers: 'companycar' is the type
offered by the relational model, but the types we're looking at are
based on semantic interpretation of a value in a column, not on the
structure of the type offered by the relational model.

Is the approach of using a proper relational model as the basis of
your application that weird, though? No; as I just described, the
abstract elements used as building blocks are effectively the same, be
it an entity in a NIAM model or a class in a UML model. The UML model
combines behavior with the data structure and the NIAM model doesn't,
but that difference isn't really important for the structure of the
model.

Now, did I end up with different tables than a person who would have
started with UML? No. Employee, manager and boardmember are all mapped
to their own table. companycar, familycar and sportscar are all mapped
to the same table.

Perhaps the cause of all this is the difference between a datamodel
and a relational model, I'm not sure. I hope people now realize what
the differences are between using an entity model vs. a pure table
model, and can decide which one to choose.

For the record: I always talked about entities and the relational
model of entities, not about pure tables; as I've described above, you
can have multiple entities with just 1 table.
Bottom line, my suggestions would be:
- Depending on your project size, run a little experiment with
different approaches, determine selection criteria, and finally
make the make-or-buy decision.
- If you roll your own mapping, you should be extremely confident that
you can do better than those who implemented the available frameworks,
or you should have an excellent justification.


As a person who has now worked full time for 3 years on one of the
market leaders, I can assure you: writing your own is simply too
time-consuming. Even if you just implement the very basic mapping code,
it's still a lot of work, and also unnecessary, simply because there are
enough different O/R mappers out there which are solid and mature and
all have their own POV on the matter, so it should be easy to find the
O/R mapper which fits the way you want to work with data.

I tried to write a little article on that some time ago:
http://weblogs.asp.net/fbouma/archiv...09/240225.aspx

Personally, I don't think anyone is helped by 'this is good' and
'this is bad' kind of babbling, simply because there isn't a silver
bullet. People who are perfectly comfortable with SQL, table-focused
database access etc. have a hard time with DDD and O/R mappers but feel
at home with stored procs and datatables. We can all try to lecture
them that they're doing stupid things and using bad practices
(exaggerated), but at the end of the day, if they have succeeded in
writing a working, successful application, we're not justified in
lecturing them, because they did manage to write a proper application.

FB

--
------------------------------------------------------------------------
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET blog: http://weblogs.asp.net/fbouma
Microsoft MVP (C#)
------------------------------------------------------------------------
Jan 14 '06 #22
