
Data Tier Optimization

I am building a web application with a single form that will be populated with data from 8-10 tables. The easiest way to implement the data tier would be to use a DataSet and DataAdapter for each table, but is that the optimal approach?

How do I optimize the back-end data retrieval and the corresponding updates (inserts, updates, and deletes) within the .NET Framework?

Thanks in advance,

Mervin Williams
Nov 18 '05 #1
Mervin:

If it's all on one form (and even if it's not), having a separate DataSet for each DataTable is definitely something you don't want to do. One DataSet holding all of the DataTables, with DataRelations where possible, is probably the best approach (unless there's a good logical reason for separate DataSets, but if you're talking about one form it's hard to imagine a scenario where you'd want or need more than one). You can then serialize that one DataSet if you need to, pass it to and from a web service, or use whatever other method you want to access the data. You'll still want a separate DataAdapter for each table.
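
To make that concrete, here's a rough sketch in C# (the table names, columns, and connection string are just placeholders, not anything from your schema): one DataSet, a DataTable per source table filled by its own SqlDataAdapter, and a DataRelation tying them together.

using System.Data;
using System.Data.SqlClient;

public class FormDataAccess
{
    // Builds one DataSet holding a DataTable per source table, each filled by
    // its own SqlDataAdapter, with a DataRelation between parent and child.
    public static DataSet LoadFormData(string connectionString)
    {
        DataSet ds = new DataSet("FormData");

        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            SqlDataAdapter customers = new SqlDataAdapter(
                "SELECT CustomerID, Name FROM Customers", conn);
            SqlDataAdapter orders = new SqlDataAdapter(
                "SELECT OrderID, CustomerID, OrderDate FROM Orders", conn);

            // Fill opens and closes the connection as needed.
            customers.Fill(ds, "Customers");
            orders.Fill(ds, "Orders");
        }

        // Relate the tables so the form can navigate parent/child rows
        // within the single DataSet.
        ds.Relations.Add("Customers_Orders",
            ds.Tables["Customers"].Columns["CustomerID"],
            ds.Tables["Orders"].Columns["CustomerID"]);

        return ds;
    }
}

The whole thing can then be bound to the form, or serialized and passed across tiers as one unit.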

As far as optimization goes, there are different schools of thought. You can encapsulate the data logic in any tier, although the presentation tier is usually a really bad choice. If you use stored procedures, you can keep it all in the back-end layer, which I find preferable because I can make changes without recompiling and redistributing my program, and it allows for optimum performance, security, and flexibility. Many disagree and think the logic should live in the middle tier, and they have a point; you can also combine the two. However, calling stored procedures, passing in parameters where needed, and letting them do the heavy lifting is the approach I prefer.
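
For the update side, here's a rough sketch of wiring a DataAdapter's UpdateCommand to a stored procedure (the proc name usp_Customer_Update, its parameters, and the columns are made-up placeholders, not an existing API):

using System.Data;
using System.Data.SqlClient;

public class CustomerUpdater
{
    // Pushes changes from the Customers DataTable back to the database
    // through a stored procedure, one call per modified row.
    public static void SaveCustomers(DataSet ds, string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            SqlCommand update = new SqlCommand("usp_Customer_Update", conn);
            update.CommandType = CommandType.StoredProcedure;
            update.Parameters.Add("@CustomerID", SqlDbType.Int, 0, "CustomerID");
            update.Parameters.Add("@Name", SqlDbType.NVarChar, 100, "Name");

            SqlDataAdapter adapter = new SqlDataAdapter();
            adapter.UpdateCommand = update;

            // Update() invokes the proc for each row whose RowState is Modified.
            // Added or deleted rows would need an InsertCommand / DeleteCommand
            // wired up the same way.
            adapter.Update(ds, "Customers");
        }
    }
}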

HTH,

Bill
"Mervin Williams" <mw*******@innovasolutions.net> wrote in message
news:e4**************@TK2MSFTNGP12.phx.gbl...
I am building a web application that will have a single form that will be
populated with data from 8-10 tables. The easiest way to implement the data tier would be to use DataSets and Data Adapters for each table. But, is
that the optimal approach?

How do I optimize the back-end data retrieval and corresponding updates
(inserts, updates, and deletes) within the .NET Framework?

Thanks in advance,

Mervin Williams

Nov 18 '05 #2
I am curious: when does it make sense to break up a DataSet with multiple tables? For example, if you have multiple forms and need data on each form, do you suggest splitting the one DataSet up logically to match the forms, assuming the forms map to the schema of the database? Or is one DataSet still better? I have seen many examples using only one DataSet, and it seems to me that this is not a good approach.

Thanks

"William Ryan eMVP" <do********@comcast.nospam.net> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
Mervin:

If it's all on one form (and even if it's not) having a seperate dataset for each datattable is definitely something you don't want to do. One DataSet
holding the datatables, (Unless logically there's a good reason for using
seperate datasets - but if you're talking about one form, it's hard to
imagine a scenario where you'd want/need more than one dataset), using
datarelations where possible is probably the best approach. You can then
serialize one set if you need to , pass it to/from a web service or use
whatever other method you want to access the data. You'll want different
dataadapters.

As far as optimization, there are different schools of thought, but you can encapsulate that in any tier although the presentation tier is usually a
really bad choice. If you use stored procs you cna keep it all in the
backend layer which I find preferable b/c I can make change to my program
without recompiling and redistributing and it allows for optimum
performance, secuirty and flexibility. Many disagree and think the logic
should be in the middle tier and they have a point. You can also combine
the two. However, calling procs, passing in params where needed and letting them do the heavy lifting is a way that I prefer.

HTH,

Bill
"Mervin Williams" <mw*******@innovasolutions.net> wrote in message
news:e4**************@TK2MSFTNGP12.phx.gbl...
I am building a web application that will have a single form that will be populated with data from 8-10 tables. The easiest way to implement the

data
tier would be to use DataSets and Data Adapters for each table. But, is
that the optimal approach?

How do I optimize the back-end data retrieval and corresponding updates
(inserts, updates, and deletes) within the .NET Framework?

Thanks in advance,

Mervin Williams


Nov 18 '05 #4

