Performance Bottleneck in ASP.NET

I have a performance issue that needs resolving, and I'm
not sure which of the options we have come up with is
best. Let me explain.

Our site has a report designer that allows users to create
dynamic report content. The output for these reports is
HTML, but they can be exported to a number of formats for
download (e.g. Excel). The contents of the exported report
must be identical to the original report, so the export
cannot re-read from the database, as the data is volatile.

To overcome this we persist the original HTML report
(which can itself be exported), the report options XML
document, and the dataset used to create the report (as
XML) to the file system. If a user wants to export to
another format, we load these files and create the
requested report type.
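
To give a rough idea, the persistence step looks something
like the sketch below (the class, method, and path names
are simplified stand-ins, not our actual code):

// Rough sketch of the persistence step (names and paths are stand-ins).
// Writing the schema with the data matters because each report's
// DataSet has its own shape.
using System.Data;
using System.IO;

public class ReportPersister
{
    public void Persist(string reportDir, string html, string optionsXml, DataSet data)
    {
        Directory.CreateDirectory(reportDir);

        using (StreamWriter writer = new StreamWriter(Path.Combine(reportDir, "report.html")))
        {
            writer.Write(html);
        }

        using (StreamWriter writer = new StreamWriter(Path.Combine(reportDir, "options.xml")))
        {
            writer.Write(optionsXml);
        }

        // All data tables plus their schema go into one XML file.
        data.WriteXml(Path.Combine(reportDir, "data.xml"), XmlWriteMode.WriteSchema);
    }
}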

Having to write to the file system before returning the
generated report is causing a performance bottleneck when
the server is under load.

Here are the options we have come up with:
* Store the XML options and dataset in session state (as
properties of a wrapper class).
* Write the files to the file system asynchronously after
the data is retrieved from SQL, while report generation
continues (see the sketch after this list).
* Is there a better way to structure this? Could caching
be used here?
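
For the second option, the rough idea would be a
fire-and-forget write like this (hypothetical names;
ThreadPool.QueueUserWorkItem is what we'd have available):

// Fire-and-forget sketch of the asynchronous write option (hypothetical
// names). The request thread queues the disk write, then carries on
// with report generation instead of blocking on I/O.
using System.Data;
using System.Threading;

public class AsyncPersister
{
    public void BeginPersist(string dataPath, DataSet data)
    {
        // Copy the DataSet so the background write has a stable snapshot.
        object[] state = new object[] { dataPath, data.Copy() };
        ThreadPool.QueueUserWorkItem(new WaitCallback(WriteData), state);
        // Caller returns immediately and continues generating the report.
    }

    private void WriteData(object state)
    {
        object[] args = (object[])state;
        string dataPath = (string)args[0];
        DataSet data = (DataSet)args[1];
        data.WriteXml(dataPath, XmlWriteMode.WriteSchema);
        // Exceptions here never reach the user, so they must be logged.
    }
}

One catch I can see: QueueUserWorkItem borrows threads from
the same pool ASP.NET uses to service requests, so under
heavy load the background writes would compete with
incoming requests.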

The dataset has the potential to be quite large (up to
5000 rows in some cases), and will contain multiple data
tables. This may be a problem for session state usage.

I haven't used async processing very much, so I'm a bit
hesitant to use it, but if it is the most efficient
solution then that's the way I'll move forward.

Any advice or recommendations here would be greatly
appreciated...

Glenn.
Nov 18 '05 #1
Hi Glenn,

I wonder if there isn't a way to get a "snapshot" of the data into a unique
temporary table in SQL Server. That would preserve the contents until the
user decides what formats it needs.

You'd have to identify that particular table as the datasource for the
export format about to be requested, but adding a Session ID field to the
table might get around that.

This would offload the slow file system portions onto the database server.
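
Something along these lines is what I'm picturing (the
connection string, naming scheme, and query are
hypothetical); SELECT ... INTO creates the snapshot table
with the same schema as the source query in one statement:

// Hedged sketch of the snapshot idea (all names are hypothetical).
using System.Data.SqlClient;

public class ReportSnapshot
{
    // Copies the report's source data into a per-session snapshot table
    // and returns the table name for later export requests.
    public string Create(string connStr, string sessionId, string sourceQuery)
    {
        // One snapshot table per session; sanitise sessionId before use.
        string table = "ReportSnapshot_" + sessionId.Replace("-", "");
        string sql = "SELECT * INTO " + table + " FROM (" + sourceQuery + ") AS src";

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(sql, conn);
            cmd.ExecuteNonQuery();
        }
        return table;
    }
}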

BTW, I have a process that writes XML to the file system and then
processes it into PDFs. It also uses XSL transformations to generate HTML
for the web and Excel spreadsheets. I can confirm that it certainly can be slow.

Ken
Nov 18 '05 #2
Thanks for that, Ken.

Unfortunately the dataset usually contains cross-tabbed
data, so the table schema is different for the vast
majority of instances.

I had toyed with the idea of creating a report table that
contained the session ID, the name of a separate report
data table, and the date created. I could then schedule a
stored proc to run periodically to clean up any "old"
report data tables.
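
Something like this sketch is what I had in mind for the
cleanup (the table and column names are hypothetical, and
the 24-hour cutoff is arbitrary):

-- Hypothetical cleanup proc: drops report data tables registered more
-- than a day ago, then removes their registry rows.
CREATE PROCEDURE dbo.CleanupReportTables
AS
BEGIN
    DECLARE @table sysname
    DECLARE @sql nvarchar(400)

    DECLARE stale CURSOR FOR
        SELECT DataTableName FROM ReportRegistry
        WHERE DateCreated < DATEADD(hour, -24, GETDATE())

    OPEN stale
    FETCH NEXT FROM stale INTO @table
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'DROP TABLE ' + QUOTENAME(@table)
        EXEC(@sql)
        DELETE FROM ReportRegistry WHERE DataTableName = @table
        FETCH NEXT FROM stale INTO @table
    END

    CLOSE stale
    DEALLOCATE stale
END

Scheduled as a SQL Agent job, that would keep the orphaned
report tables from piling up.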

If we had a 64-bit server I would just store it using
InProc session state, but we don't!

Nov 18 '05 #3
You certainly don't have to persist the data to disk to
enable this kind of functionality, but it is one possible
solution.

I understand your datasets can be large, but the bottom
line is you HAVE to persist them somewhere, right? So...
are you memory constrained? If so, a few options come to
mind. One is persisting to disk, as you are currently
doing. Another is using session state but offloading the
session storage to another machine using the ASP.NET
state service or SQL Server session state. While you take
some performance hit from serialization/deserialization
and from moving the data across the network, my guess is
it is not as high as the disk I/O hit (especially if it's
a single spindle). These methods also give you the added
benefit of surviving a worker process recycle on the web
server end.
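
For reference, moving session out of process is just a
config change along these lines (the machine names and
connection strings below are placeholders):

<!-- web.config: out-of-process session state (placeholder addresses). -->
<!-- ASP.NET state service running on another box: -->
<sessionState mode="StateServer"
              stateConnectionString="tcpip=stateserver:42424" />

<!-- ...or SQL Server-backed session state: -->
<sessionState mode="SQLServer"
              sqlConnectionString="data source=dbserver;Integrated Security=SSPI" />

Note that anything stored out of process must be
serializable: DataSet is, but your wrapper class would
need to be marked [Serializable] as well.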

If you're not memory constrained, store it in session. I'd
highly recommend quantifying your expected load and stress
testing to that level to determine whether you are, in
fact, memory constrained.

In any case, if the user chooses to export to Excel or
what have you, set the MIME type and stream the result
straight to the client.
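
i.e., something along these lines for the Excel case (the
content variable and file name are placeholders):

// Stream the export straight to the browser instead of via the disk.
// "excelContent" is a placeholder for the generated report text.
Response.Clear();
Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("Content-Disposition", "attachment; filename=report.xls");
Response.Write(excelContent);
Response.End();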

Hope that is helpful in some way.

Nov 18 '05 #4
