Bytes | Software Development & Data Engineering Community

Performance Bottleneck in ASP.NET

I have a performance issue that needs resolving, and I'm
not sure which of the options we have come up with is
best. Let me explain.

Our site has a report designer that allows users to create
dynamic report content. The output for these reports is
HTML, but they can be exported to a number of formats for
download (e.g. Excel). The contents of an exported report
must be identical to the original report, so we cannot
simply re-read from the DB: the data is volatile.

To overcome this we persist the original HTML report
(which can itself be exported), the report options XML
document, and the dataset used to create the report (as
XML) to the file system. If a user wants to export to
another format, we load these files and create the
necessary report type.

Having to write to the file system before returning the
generated report is causing a performance bottleneck when
the server is under load.

Here are the options we have come up with:
* Store the XML options and dataset in session state (as
properties of a wrapper class).
* Write the files to the file system asynchronously once
the data is retrieved from SQL, while report generation
continues.
* Is there a better way to structure this? Could caching
be used here?

The dataset has the potential to be quite large (up to
5000 rows in some cases), and will contain multiple data
tables. This may be a problem for session state usage.

I haven't used async processing very much, so I'm a bit
hesitant to use it, but if it is the most efficient
solution then that's the way I'll move forward.
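For reference, the asynchronous-write option might look
roughly like this under .NET 1.1, queuing the disk write on
a thread-pool thread so the request thread can carry on
generating the report (class and member names here are
invented for illustration):

```csharp
using System.IO;
using System.Threading;

// Hypothetical helper: hands a report artefact off to a
// worker thread so the request thread is not blocked on
// disk I/O.
public class AsyncPersister
{
    private string path;
    private string content;

    public AsyncPersister(string path, string content)
    {
        this.path = path;
        this.content = content;
    }

    // Queue the write and return immediately.
    public void Queue()
    {
        ThreadPool.QueueUserWorkItem(new WaitCallback(Write));
    }

    private void Write(object state)
    {
        using (StreamWriter w = new StreamWriter(path))
        {
            w.Write(content);
        }
    }
}
```

One caveat: anything still queued when the worker process
recycles is lost, so a failed or missing file needs to be
detectable before the user asks for an export.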

Any advice or recommendations here would be greatly
appreciated...

Glenn.
Nov 18 '05 #1
Hi Glenn,

I wonder if there isn't a way to get a "snapshot" of the data into a unique
temporary table in SQL Server. That would preserve the contents until the
user decides which formats he needs.

You'd have to identify that particular table as the datasource for the
export format about to be requested, but adding a Session ID field to the
table might get around that.

This would offload the slow file system portions onto the database server.
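A rough sketch of that idea (the table and column names are
invented for illustration; a real report would build the
SELECT from its own query):

```csharp
using System.Data.SqlClient;

// Hypothetical snapshot step: copy the report's source rows
// into a shared snapshot table, tagged with the session ID
// so each user's later export can find its own data.
public class SnapshotWriter
{
    public static void Take(string connStr, string sessionId)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            string sql =
                "INSERT INTO ReportSnapshot (SessionID, CustomerID, Amount) " +
                "SELECT @sid, CustomerID, Amount FROM Sales";
            SqlCommand cmd = new SqlCommand(sql, conn);
            cmd.Parameters.Add("@sid", sessionId);
            cmd.ExecuteNonQuery();
        }
    }
}
```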

BTW, I have a process that writes XML to the file system and then
processes it into PDFs. It also uses XSL Transformations to generate HTML
for the Web and Excel spreadsheets. I can confirm that it certainly can be
slow.

Ken
Nov 18 '05 #2
Thanks for that, Ken.

Unfortunately the dataset usually contains cross-tabbed
data, so the table schema is different in the vast
majority of cases.

I had toyed with the idea of creating a report table that
contained the session ID, the name of a separate report
data table, and the date created. I could then schedule a
stored proc to run periodically to clean up any "old"
report data tables.
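That registry-plus-cleanup idea might be sketched in T-SQL
something like this (all names and the 24-hour retention
window are invented for illustration):

```sql
-- Hypothetical registry of per-session report data tables,
-- plus a scheduled cleanup proc to drop stale ones.
CREATE TABLE ReportRegistry (
    SessionID   VARCHAR(50) NOT NULL,
    TableName   SYSNAME     NOT NULL,
    DateCreated DATETIME    NOT NULL DEFAULT GETDATE()
)
GO

CREATE PROCEDURE CleanupReportTables AS
BEGIN
    DECLARE @name SYSNAME
    DECLARE stale CURSOR FOR
        SELECT TableName FROM ReportRegistry
        WHERE DateCreated < DATEADD(hour, -24, GETDATE())
    OPEN stale
    FETCH NEXT FROM stale INTO @name
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC ('DROP TABLE ' + @name)
        DELETE FROM ReportRegistry WHERE TableName = @name
        FETCH NEXT FROM stale INTO @name
    END
    CLOSE stale
    DEALLOCATE stale
END
GO
```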

If we had a 64-bit server I would just store it using
InProc session state, but we don't!

Nov 18 '05 #3
You certainly don't have to persist the data to disk to
enable this kind of functionality, but it is one possible
solution.

I understand your datasets can be large, but the bottom
line is you HAVE to persist them somewhere, right? So...
are you memory constrained? If so, a few options come to
mind. One is persisting to disk, as you are currently
doing. Another is using session state, but offloading the
session storage to another machine via the ASP.NET State
Service or SQL Server session state. While you take some
perf hit from serialization/deserialization and from
moving the data across the network, my guess is it is not
as high as the disk I/O hit (especially if it's a single
spindle). These methods also give you the added benefit
of surviving a worker process recycle on the web server.
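For reference, moving session storage out of process is
mostly a config change (the machine name and port below are
illustrative; note that everything stored in session must
then be serializable):

```xml
<!-- web.config fragment: session state held by the ASP.NET
     State Service on a separate box, so it survives worker
     process recycles. -->
<sessionState mode="StateServer"
              stateConnectionString="tcpip=stateserver:42424"
              cookieless="false"
              timeout="20" />
```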

If you're not memory constrained, store in session. I'd
highly recommend quantifying your expected load and stress
testing to that level to determine if you are, in fact,
memory constrained.

In any case, if the user chooses to export to Excel or
what have you, set the MIME type and stream the file to
the client.
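Streaming the export might look something like this in an
ASP.NET page; the content type and file name are
illustrative:

```csharp
using System.Web;

// Sketch: send an already-generated export to the browser
// with the right MIME type, rather than persisting it first.
public class ReportDownload
{
    public static void Send(HttpResponse response, byte[] export)
    {
        response.Clear();
        response.ContentType = "application/vnd.ms-excel";
        response.AddHeader("Content-Disposition",
                           "attachment; filename=report.xls");
        response.BinaryWrite(export);
        response.End();  // stop further page rendering
    }
}
```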

Hope that is helpful in some way.

Nov 18 '05 #4
