Bytes IT Community

Smart client - general data access best practice question

Kind of an open question on best-practice for smart-client design. I'd
really appreciate anyone's views (preferably with reasoning, but I'll take
what I get...). Or if anybody has any useful links on the subject? (and yes,
I have already googled it at length, but still no strong decision)

=============

After a long stint of pure-desktop / pure-server applications, I'm currently
working on a number of smart-client projects in C# using .Net 2.0.

Obviously we need to do our data access through the web-service, but we have
various options available to us, including:
* Define rich classes at the server (exposed directly as e.g. the return
value from [WebMethod] functions), then use WSDL.exe to extract a
representation of those objects to use at the client
* Using datasets
* Using a hand-crafted, documented schema (presumably xml) on the
web-service and manually constructing (separate) rich objects at both ends
* Not using objects on the web-service, but simpler "parameter-per-property"
based invokes
* Others
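To make the first option concrete, here is a minimal sketch of what I mean by "rich classes exposed directly from [WebMethod] functions". The type and member names (Customer, Order, GetCustomer) are purely illustrative:

```csharp
// Hypothetical server-side types; names are illustrative only.
using System;
using System.Web.Services;

public class Customer
{
    // Public fields/properties are what XmlSerializer (and hence
    // WSDL.exe on the client side) will pick up.
    public int Id;
    public string FirstName;
    public string LastName;
    public Order[] Orders; // child collections surface as arrays at the client
}

public class Order
{
    public int OrderId;
    public decimal Total;
}

public class CustomerService : WebService
{
    [WebMethod]
    public Customer GetCustomer(int id)
    {
        // ...load from the database and map onto the serializable type...
        Customer c = new Customer();
        c.Id = id;
        return c;
    }
}
```

Running WSDL.exe against this service generates client-side counterparts of Customer and Order from the schema in the WSDL, so no hand-written client stubs are needed.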

Personally I favor the first option, as it
a: best fits my (pre-conceived? naive?) ideas of OO
b: doesn't bind the client and server too closely together (i.e. I could
probably call the web-service from a VB6 or Java client if I wanted - no
dependency on the more complex dataset object, and no dependency on a .Net
client (as would exist if I did binary serialization))
c: seems reasonably efficient code-wise, as I don't need to write all my own
code to stub out the client objects and web-service methods (WSDL.exe does
that for me)
d: allows me to pass complex data in a single round-trip to the server,
rather than multiple calls which would then need more complex transaction
management
e: thanks to the BindingSource and the flag to WSDL.exe that enables
INotifyPropertyChanged, meets all of my UI data-binding needs
f: avoids messing with datasets, which (maybe incorrectly) I perceive as
slightly messy and less efficient than performing my own database access at
the server

OK, this means that at the client I have objects with reduced functionality
(arrays instead of object-collections), but in the few cases where I
genuinely need something more sophisticated I can always wrap it in a facade
or other wrapper, so no huge issue here.
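For reference, the proxy generation I'm describing is a single command line; the namespace, output file, and service URL below are placeholders:

```shell
REM Generate the client proxy; /enableDataBinding makes the generated
REM types implement INotifyPropertyChanged, for use with BindingSource.
wsdl.exe /enableDataBinding /namespace:MyApp.Proxies ^
    /out:CustomerServiceProxy.cs http://server/CustomerService.asmx?WSDL
```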

---

So - before I go to town, am I barking up the wrong tree? Am I making my
life hard for the future, or is this a reasonable approach? I don't have any
immensely complex/specialized requirements, so it should be similar to other
people's experiences...?

All thoughts appreciated.

Marc
Dec 13 '05 #1
3 Replies


Marc,

the approach we take is to have all functionality (eg business rules,
persistence, everything) reside on the server, and deploy purely as Web
Services. This allows for rich [but thin] client access, AJAX browser
access, SmartPhone, PDA, B2B etc.

It has taken a while (many many years actually) to develop a
framework/platform for building systems like this, but it has paid off, as
we can now deploy any business functionality (eg ERP, CRM, Financial) purely
as hosted Web Services.

The client simply has access to server objects - objects in the pure sense
being Fields and Methods. A change to a text box, click of a checkbox etc
results in a request to the server to make the corresponding change
(<Delta field="FirstName" value="Fred"/> for a Textbox called 'FirstName');
business rules are executed, validation, calculations, calls to the
persistent store etc and the net results are sent back to the client for
distribution (eg <Field id="FullName" value="Smith,Fred"/> if there was a
server rule that made FullName = LastName +","+ FirstName). Methods are
invoked similarly (reflection on the server is used) and repeating data for
display in grids and listviews has a particular schema that the client
understands.

Our server sessions are stateful, so no cookies or state restoration are
required (a parameter of each call is the SessionID).
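As I read this, a round-trip might look something like the following. The Delta and Field elements are taken from the examples above; the surrounding Request/Response envelope and sessionId attribute are my guesses at the missing details:

```xml
<!-- Client -> server: user typed "Fred" into the FirstName textbox -->
<Request sessionId="ABC123">
  <Delta field="FirstName" value="Fred"/>
</Request>

<!-- Server -> client: net results after rules and validation ran -->
<Response>
  <Field id="FullName" value="Smith,Fred"/>
</Response>
```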

I have a Whitepaper if you're interested.

Good luck,

Radek

"Marc Gravell" <mg******@rm.com> wrote in message
news:eg**************@TK2MSFTNGP11.phx.gbl...

Dec 14 '05 #2

I would be interested in that paper - while I admire the central nature of
this approach, doesn't this *significantly* increase the bandwidth and
latency of the application? An interesting discussion.

Marc

"Radek Cerny" <ra*********@c1s.com.au> wrote in message
news:%2****************@TK2MSFTNGP12.phx.gbl...


Dec 14 '05 #3

Bandwidth usage is minimal - firstly, the sessions are stateful so it's only
deltas that are exchanged, and that traffic is gzipped using SOAP extensions
anyway. High latency does hurt, but anything better than 200ms is fine, and
under 500ms is usable.
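For anyone unfamiliar with the technique, this is roughly the shape of a compressing SoapExtension. This is a sketch only, not Radek's implementation: a real one needs matching client and server halves and content-encoding negotiation, both omitted here.

```csharp
// Sketch of a gzip SoapExtension (.NET 2.0); illustrative, untested.
using System;
using System.IO;
using System.IO.Compression;
using System.Web.Services.Protocols;

public class GzipExtension : SoapExtension
{
    private Stream wireStream;   // stream closest to the network
    private Stream bufferStream; // stream the framework serializes into

    public override Stream ChainStream(Stream stream)
    {
        wireStream = stream;
        bufferStream = new MemoryStream();
        return bufferStream;
    }

    public override void ProcessMessage(SoapMessage message)
    {
        switch (message.Stage)
        {
            case SoapMessageStage.AfterSerialize:
                // Compress the serialized SOAP before it hits the wire.
                bufferStream.Position = 0;
                using (GZipStream gz = new GZipStream(wireStream,
                        CompressionMode.Compress, true))
                {
                    Copy(bufferStream, gz);
                }
                break;
            case SoapMessageStage.BeforeDeserialize:
                // Decompress incoming SOAP for the framework to read.
                using (GZipStream gz = new GZipStream(wireStream,
                        CompressionMode.Decompress, true))
                {
                    Copy(gz, bufferStream);
                }
                bufferStream.Position = 0;
                break;
        }
    }

    private static void Copy(Stream from, Stream to)
    {
        byte[] buffer = new byte[4096];
        int read;
        while ((read = from.Read(buffer, 0, buffer.Length)) > 0)
            to.Write(buffer, 0, read);
    }

    public override object GetInitializer(Type serviceType) { return null; }
    public override object GetInitializer(LogicalMethodInfo info,
        SoapExtensionAttribute attr) { return null; }
    public override void Initialize(object initializer) { }
}
```

The extension is then registered in web.config (or applied via a SoapExtensionAttribute) on both ends of the call.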
We have several live clients, some using PDA (O2 XDA II) devices in realtime
via GPRS as well as Rich Windows clients.

We are looking forward to playing with XAML - I believe the server could
publish form definitions as well as provide all of the functionality.

Whitepaper on its way to your inbox.

Radek
"Marc Gravell" <mg******@rm.com> wrote in message
news:uv*************@TK2MSFTNGP15.phx.gbl...
Dec 14 '05 #4
