Bytes | Software Development & Data Engineering Community

Generating overnight data vs Live - suggestions needed.

We have an MIS system with approximately 100 reports. Each report can
take up to several minutes to run due to the complexity of the queries
(most are hundreds of lines long). Each report can be run by many
users, so in effect we have a slow system.

I want to separate the complex part of the queries into a process that
runs each night. The reports will then only have to query
pre-formatted data with minimal parameters, as the hard part will have
been completed while the users are not in. Ideally we will generate
(possibly via a stored procedure) a set of data for each report and
hold this on the server. We can then query with simpler parameters,
such as by date, and get the data back quite quickly.
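A nightly pre-compute job along these lines might look like the
following T-SQL sketch. All object names here (ComplexSalesView,
SalesReportData_Staging, the columns) are hypothetical stand-ins for
whatever the real schema uses:

```sql
-- Hypothetical nightly job: materialise the expensive view once,
-- so reports can query a plain indexed table instead.
CREATE PROCEDURE dbo.GenerateSalesReportData
AS
BEGIN
    SET NOCOUNT ON;

    -- Build into a staging table so report readers are never
    -- blocked while the slow query runs.
    TRUNCATE TABLE dbo.SalesReportData_Staging;

    INSERT INTO dbo.SalesReportData_Staging (SaleDate, Region, Amount)
    SELECT SaleDate, Region, SUM(Amount)
    FROM dbo.ComplexSalesView   -- the slow, many-hundred-line view
    GROUP BY SaleDate, Region;
END;
```

The reports themselves then reduce to something like
`SELECT ... FROM dbo.SalesReportData WHERE SaleDate = @Date`, which an
index on SaleDate can satisfy in moments.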

The whole process of how we obtain the data is very complex. There are
various views which gather data from the back office system. These are
very complex in themselves, and when queries are run against them,
joining in other tables to bring in more data, it gets nicely
complicated.

The only problem is that the users want to have access to LIVE data
from the back office system, specifically the Sales team who want to
access this remotely. My method only allows for data from the night
before, so is there an option available to me which will allow me to
do this? The queries can't be improved a great deal, so they will
take as long as they take. Running them once is the only way I can
see to improve the performance in any significant way.

True, I could just let them carry on as they are and suffer the
performance on live data, but I'd like to do something to improve
the situation for them.

Any advice would be appreciated.

Thanks

Ryan
Jul 20 '05 #1
Yep, that definitely makes a difference. Taken me a while to get back
to this, but we're looking at a solution along these lines.

By taking a copy of the data from the view into a table, replicating
this onto another database (for generating), and then running stored
procedures (one per complex query) against it, we've been able to
reduce the time for a test generate query from 5 minutes to about
3 seconds. We've timed the other parts and can probably generate all
of our data in less than 5 minutes for 100 complex queries that would
normally take hours.

Once this data is generated we'll simply swap it in for the current
data, which should be pretty quick. Our report now loads in less than
a second as opposed to 5 minutes or so.
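The swap itself can be made near-instantaneous by renaming tables
inside a short transaction. A sketch, assuming the fresh data was
built into a hypothetical staging table dbo.SalesReportData_Staging
alongside the live dbo.SalesReportData:

```sql
-- Hypothetical near-instant swap: the freshly generated staging
-- table replaces the live reporting table in one short transaction.
BEGIN TRANSACTION;
    EXEC sp_rename 'dbo.SalesReportData', 'SalesReportData_Old';
    EXEC sp_rename 'dbo.SalesReportData_Staging', 'SalesReportData';
    EXEC sp_rename 'dbo.SalesReportData_Old', 'SalesReportData_Staging';
COMMIT TRANSACTION;
-- The old table becomes the next run's staging table,
-- so nothing is ever dropped while reports are running.
```

Reports querying dbo.SalesReportData only wait for the duration of the
renames, not for the generation itself.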

This means we can probably update the data once an hour and push it
onto the query database, which will be more than quick enough for the
users. This has the knock-on effect of being able to produce the
reports in seconds instead of minutes.

Erland, thanks for the advice on this it should make quite a
difference.

Erland Sommarskog <so****@algonet.se> wrote in message news:<Xn**********************@127.0.0.1>...
> Ryan (ry********@hotmail.com) writes:
> > The only problem is that the users want to have access to LIVE data
> > from the back office system, specifically the Sales team who want to
> > access this remotely. My method only allows for data from the night
> > before, so is there an option available to me which will allow me to
> > do this? The queries can't be improved a great deal, so they will
> > take as long as they take. The idea of running them once is the only
> > way I can see to improve the performance in any significant way.
>
> If users want current data, you cannot pre-compute it for them. You
> know, have the cake and eat it.
>
> What you could do is to set up a report server which you replicate
> to, so that you take the load off the back-office system.
>
> Yet another alternative is to have three databases:
> A - the source system.
> B - the view system that the users use.
> C - a computation system.
>
> You keep the computation database updated by means of log shipping
> (this is better than replication in this case). Once you have applied
> a log, you pre-compute all data. When this step is done, you flush the
> pre-computed tables on B, and insert the data from C. Now you go and
> pick up a new log from A. A fancier variant is to have a seamless
> switch between B and C which you flip constantly. I believe there are
> such things.
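One way to get that kind of seamless switch on SQL Server 2000 is to
have reports query a view whose definition is flipped between two
pre-computed copies. A sketch, with hypothetical table and column
names:

```sql
-- Hypothetical seamless flip: reports always query dbo.ReportData,
-- a view that is redefined to point at whichever copy was loaded
-- last. ALTER VIEW is quick, so readers barely notice the switch.
ALTER VIEW dbo.ReportData
AS
    SELECT SaleDate, Region, Amount
    FROM dbo.ReportData_B;  -- next cycle loads dbo.ReportData_A
                            -- and flips the view back to it
```

Each refresh cycle loads the idle copy, then issues the ALTER VIEW, so
there is never a window where the reporting table is empty.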

Jul 20 '05 #2
Ryan (ry********@hotmail.com) writes:
> By taking a copy of the data from the view into a table and then
> replicating this onto another database (for generating) and then
> running stored procedures (from each complex query) against this,
> we've been able to reduce the time from 5 minutes for a test generate
> query to about 3 seconds. We've timed the other parts and can probably
> generate all of our data in less than 5 minutes for 100 complex
> queries that would normally take hours.

300 seconds down to three! That's not bad!

> Erland, thanks for the advice on this it should make quite a
> difference.

I'm only glad to have helped! And thanks for reporting back!
--
Erland Sommarskog, SQL Server MVP, so****@algonet.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techinf...2000/books.asp
Jul 20 '05 #3
