My question is basically about whether it's best to do a small number of
database accesses and do most of the data organisation in SQL stored
procedures, or one larger fetch from the database and do the data
organisation in the ASP.NET code:
I am building a menu system for an online store. At the top will be the main
'departments' and on the side will be the sub-departments of whichever main
department is selected, with its sub-sub departments underneath each one.
There will also be a breadcrumb at the top showing the route through the
departments. It looks something like:
Store/Computing/Computers/Handhelds
Clothes >Computing Home Music
Computers
- Desktops
- Laptops
- >Handhelds
Accessories
- Media
- Cases
Drives
- CD
- DVD
- Hard Disk
The possible issue I have at the moment is the number of database
accesses per click on a menu item. I thought I was being efficient by doing
much of the work of organising the data in SQL 2005 stored procedures,
but I am wondering whether there's a better way.
Currently, there are 3-4 database accesses to display the above page (1 for
the breadcrumb, 1 or 2 for the sub menu and one for the main menu). Of course
there will be another database fetch to display the details of the department
selected in the centre of the page.
Although I am caching as I go along, I wonder if it would be better to get
all the departments (and their descendants) into a dataset and do the
organisation there? i.e. one big fetch instead of lots of little ones?
JJ
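The "one big fetch" option JJ is weighing can be sketched briefly. Assuming the departments table is a simple adjacency list of (id, parent_id, name) rows (an assumption, not JJ's actual schema), one query pulls everything and the nesting is rebuilt in application code. This is an illustrative sketch in Python rather than ASP.NET, but the shape carries over directly:

```python
def build_menu(rows):
    """Group a flat list of (id, parent_id, name) rows into a nested tree.

    Top-level departments have parent_id None. Runs in O(n) after the
    single fetch, so the whole tree can be cached and every menu,
    sub-menu and breadcrumb served without further database access.
    """
    nodes = {rid: {"id": rid, "name": name, "children": []}
             for rid, _, name in rows}
    roots = []
    for rid, parent_id, _ in rows:
        if parent_id is None:
            roots.append(nodes[rid])          # a main department
        else:
            nodes[parent_id]["children"].append(nodes[rid])
    return roots

# A fragment of the store hierarchy from the post, as flat rows:
rows = [
    (1, None, "Computing"),
    (2, 1, "Computers"),
    (3, 2, "Desktops"),
    (4, 2, "Handhelds"),
]
menu = build_menu(rows)
```

The trade-off is that the single query returns more rows than any one page needs, but the result is computed once and cached, instead of 3-4 round trips per click.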
JJ,
The amount of data you will have is minimal on the client.
Why don't you load all your data into an XML file and then
traverse the XML using a scripting language?
Funnily enough, I am doing the same thing at the moment
to produce a drill-down report.
Just an idea.
rotsey
"JJ" <ab*@xyz.comwro te in message
news:u3******** ******@TK2MSFTN GP04.phx.gbl...
My question is basically about whether its best to do a small number
database accesses and do most of the data organisation in sql stored
procedures, or one larger fetch from the database and do the database
organisation in the asp.net code:
I am building a menu system for an online store. At the top will be the
main 'departments' and on the side will be the sub-departments of
whichever main department is selected, with its sub-sub departments
underneath each one. There will also be a breadcrumb at the top showing
the route through the departments. It lookds something like:
Store/Computing/Computers/Handhelds
Clothes >Computing Home Music
Computers
- Desktops
- Laptops
- >Handhelds
Accessories
- Media
- Cases
Drives
- CD
- DVD
- Hard Disk
The possible issue I may have at the moment is the number of database
accesses per click on an menu item. I though I was being efficient by
doing much of the work of organising the data by using SQL 2005 stored
procedures, but I am wondering whether there's a better way.
Currently, there are 3-4 database accesses to display the above page (1
for breadcumb, 1 or 2 for sub menu and one for main menu). Ofcourse there
will be another database fetch to display the details of the department
selected in the centre of the page.
Although I am caching as I go along I wonder if it would it be better to
get all the departments (and their decendants) into a dataset and do the
organisation there? i.e. one big fetch instead of lots of little ones??
JJ
Well, I did think of that at first, but all the solutions that I came upon
dealt with using SQL stored procedures. It is working at the moment using
recursive SQL statements. But the organisation of such a nested menu system
does seem to point to an XML structure.
I think I did attempt it, but as I am using self-referencing tables (i.e. one
row may be the parent or child of another) I think I got a little confused.
How would you hold the XML file in memory - could you cache it, or would you
actually write to a physical file?
Thanks,
JJ
"Rotsey" <ma***********@ RemoveThis.optu snet.com.auwrot e in message
news:%2******** *******@TK2MSFT NGP06.phx.gbl.. .
JJ,
The amount of data you will have is minimal on the client.
Why don't you load all you data to a XML file and then
traverse the XML using a script language.
Funnily enough I am doing the same thing at the moment
to produce a drill down report.
Just an idea.
rotsey
"JJ" <ab*@xyz.comwro te in message
news:u3******** ******@TK2MSFTN GP04.phx.gbl...
>My question is basically about whether its best to do a small number database accesses and do most of the data organisation in sql stored procedures, or one larger fetch from the database and do the database organisation in the asp.net code:
I am building a menu system for an online store. At the top will be the main 'departments' and on the side will be the sub-departments of whichever main department is selected, with its sub-sub departments underneath each one. There will also be a breadcrumb at the top showing the route through the departments. It lookds something like:
Store/Computing/Computers/Handhelds
Clothes >Computing Home Music Computers - Desktops - Laptops - >Handhelds
Accessories - Media - Cases
Drives - CD - DVD - Hard Disk
The possible issue I may have at the moment is the number of database accesses per click on an menu item. I though I was being efficient by doing much of the work of organising the data by using SQL 2005 stored procedures, but I am wondering whether there's a better way.
Currently, there are 3-4 database accesses to display the above page (1 for breadcumb, 1 or 2 for sub menu and one for main menu). Ofcourse there will be another database fetch to display the details of the department selected in the centre of the page.
Although I am caching as I go along I wonder if it would it be better to get all the departments (and their decendants) into a dataset and do the organisation there? i.e. one big fetch instead of lots of little ones??
JJ
JJ,
You could serialise your data to memory easily using a MemoryStream,
then to a string, and cache it quite easily in a Cache object.
The key to what I did was that I created class objects in the hierarchy
I wanted the data in, then used the XmlSerializer. That way, when
I traversed the XML, it would be relatively fast.
Your data model may be a bit more tricky from what you say,
but there is no reason why you couldn't do the same thing and
duplicate any data that your relationships require to keep the
correct hierarchy.
Do you see what I mean?
rotsey
"JJ" <ab*@xyz.comwro te in message
news:eM******** ******@TK2MSFTN GP06.phx.gbl...
Well, I did think of that at first, but all the solutions that I came upon
dealt with using sql stored procedures. It is working at the moment using
recursive SQL statements. But the organisation of such a nested menu
system does seem to point to an XML structure.
I think I did attempt it, but as I am using self referencing tables (ie.
one may be the parent or child of another) I think I got a little
confused....
How would you hold the XML file in memory - could you cache it? or would
you actually write to a physical file?
Thanks,
JJ
"Rotsey" <ma***********@ RemoveThis.optu snet.com.auwrot e in message
news:%2******** *******@TK2MSFT NGP06.phx.gbl.. .
>JJ,
The amount of data you will have is minimal on the client.
Why don't you load all you data to a XML file and then traverse the XML using a script language.
Funnily enough I am doing the same thing at the moment to produce a drill down report.
Just an idea.
rotsey
"JJ" <ab*@xyz.comwro te in message news:u3******* *******@TK2MSFT NGP04.phx.gbl.. .
>>My question is basically about whether its best to do a small number database accesses and do most of the data organisation in sql stored procedures, or one larger fetch from the database and do the database organisatio n in the asp.net code:
I am building a menu system for an online store. At the top will be the main 'departments' and on the side will be the sub-departments of whichever main department is selected, with its sub-sub departments underneath each one. There will also be a breadcrumb at the top showing the route through the departments. It lookds something like:
Store/Computing/Computers/Handhelds
Clothes >Computing Home Music Computers - Desktops - Laptops - >Handhelds
Accessories - Media - Cases
Drives - CD - DVD - Hard Disk
The possible issue I may have at the moment is the number of database accesses per click on an menu item. I though I was being efficient by doing much of the work of organising the data by using SQL 2005 stored procedures, but I am wondering whether there's a better way.
Currently, there are 3-4 database accesses to display the above page (1 for breadcumb, 1 or 2 for sub menu and one for main menu). Ofcourse there will be another database fetch to display the details of the department selected in the centre of the page.
Although I am caching as I go along I wonder if it would it be better to get all the departments (and their decendants) into a dataset and do the organisatio n there? i.e. one big fetch instead of lots of little ones??
JJ
I think I understand. I have got so far with it before and cached the stream
from the database, but the relationships are a little tricky, and displaying
the data in nested repeaters was tricky too.
I found lots of examples for two separate tables and how to relate them in
your dataset, turn them into XML and then use them in repeaters/menus etc.,
but none for self-referencing tables.
Think I'll give it a go after Christmas.
Thank you very much for your suggestion,
JJ
"Rotsey" <ma***********@ RemoveThis.optu snet.com.auwrot e in message
news:%2******** *******@TK2MSFT NGP06.phx.gbl.. .
JJ,
You could seriailse your data to memory easily using a memorystream
then to a string and and cache it quite easily in a cache object
The key to what I did was I created class objects in the hierarchy
I wanted the data then used xmlserialiser. That way when
I traversed the XML it would be relatively fast.
Your data model may be a bit more tricky from what you say.
But there is no reason why you could do the same thing but
duplicate any data that your relationships required to keep the
correct hierarchy.
Do you see what I mean?
rotsey
"JJ" <ab*@xyz.comwro te in message
news:eM******** ******@TK2MSFTN GP06.phx.gbl...
>Well, I did think of that at first, but all the solutions that I came upon dealt with using sql stored procedures. It is working at the moment using recursive SQL statements. But the organisation of such a nested menu system does seem to point to an XML structure.
I think I did attempt it, but as I am using self referencing tables (ie. one may be the parent or child of another) I think I got a little confused....
How would you hold the XML file in memory - could you cache it? or would you actually write to a physical file?
Thanks,
JJ
"Rotsey" <ma***********@ RemoveThis.optu snet.com.auwrot e in message news:%2******* ********@TK2MSF TNGP06.phx.gbl. ..
>>JJ,
The amount of data you will have is minimal on the client.
Why don't you load all you data to a XML file and then traverse the XML using a script language.
Funnily enough I am doing the same thing at the moment to produce a drill down report.
Just an idea.
rotsey
"JJ" <ab*@xyz.comwro te in message news:u3****** ********@TK2MSF TNGP04.phx.gbl. .. My question is basically about whether its best to do a small number database accesses and do most of the data organisation in sql stored procedures , or one larger fetch from the database and do the database organisati on in the asp.net code:
I am building a menu system for an online store. At the top will be the main 'departments' and on the side will be the sub-departments of whichever main department is selected, with its sub-sub departments underneath each one. There will also be a breadcrumb at the top showing the route through the departments. It lookds something like:
Store/Computing/Computers/Handhelds
Clothes >Computing Home Music Computers - Desktops - Laptops - >Handhelds
Accessorie s - Media - Cases
Drives - CD - DVD - Hard Disk
The possible issue I may have at the moment is the number of database accesses per click on an menu item. I though I was being efficient by doing much of the work of organising the data by using SQL 2005 stored procedures , but I am wondering whether there's a better way.
Currently, there are 3-4 database accesses to display the above page (1 for breadcumb, 1 or 2 for sub menu and one for main menu). Ofcourse there will be another database fetch to display the details of the department selected in the centre of the page.
Although I am caching as I go along I wonder if it would it be better to get all the departments (and their decendants) into a dataset and do the organisation there? i.e. one big fetch instead of lots of little ones??
JJ
This thread has been closed and replies have been disabled. Please start a new discussion.