Bytes IT Community

Using SQl to store aspx pages and memory problems

I have a system that was originally developed in asp - the pages are saved in
SQL (there are over 10,000 pages) and written to a temp directory on the server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

It's not impossible for a single client to request 100-plus pages in one
session - as each page is requested it is retrieved from SQL, saved to the
temp directory and compiled - the problems are:
- Performance is well down on the original asp system - I believe that
this is due to the compiling of the pages in asp.net
- The memory usage of the server also goes through the roof - there is a
considerable increase in memory as each page is loaded
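In outline, the per-request flow is something like this (a simplified C# sketch - the table, column and path names here are examples, not the actual schema):

```csharp
// Simplified sketch of the per-request flow: fetch the stored .aspx source,
// write it to a temp folder, then redirect so ASP.NET compiles and runs it.
// The Pages table, PageId/Content columns and the ~/temp path are examples.
using System.Data.SqlClient;
using System.IO;
using System.Web;

public static class PageServer
{
    public static void ServePage(HttpContext context, int pageId)
    {
        string markup;
        using (SqlConnection conn = new SqlConnection("...connection string..."))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Content FROM Pages WHERE PageId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", pageId);
            conn.Open();
            markup = (string)cmd.ExecuteScalar();
        }

        string url = "~/temp/page" + pageId + ".aspx";
        File.WriteAllText(context.Server.MapPath(url), markup);

        // The redirect triggers dynamic compilation on the first hit -
        // this is where the CPU and memory cost shows up.
        context.Response.Redirect(url);
    }
}
```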

Are there any solutions to these problems??

--
matthew
Nov 19 '05 #1
21 Replies


TJS
what exactly does this mean?

"as each page is requested it is retrieved from SQL saved to the
temp directory and compiled"

why are you saving it twice?

if it's saved in sql why not just write it back to the client?
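For pages with no server-side code, writing the stored markup straight back might look like this (a minimal sketch - the GetPageHtml data-access helper is hypothetical):

```csharp
// Minimal sketch: stream stored HTML straight to the response -
// no temp file, no dynamic compilation.
using System.Web;

public class StoredHtmlHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        int pageId = int.Parse(context.Request.QueryString["id"]);
        string html = Database.GetPageHtml(pageId); // hypothetical helper
        context.Response.ContentType = "text/html";
        context.Response.Write(html);
    }
}
```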


Nov 19 '05 #2

Many of the pages have server-side code - they are aspx files - not html
files. E.g. some of the files may need to get information such as the current
date the file was displayed - or they are customized based on the browser
that views them - so they get modified at run-time. Many have server-side
controls that display charts (the ChartFX control).

The way I manage this is to write the saved aspx file to a temp directory on
the server and redirect the request to this file.

I also have html files that are retrieved from SQL but these are simply
written back to the client - the problem is with aspx files.

Any help would be appreciated.

Matthew
--
matthew
Nov 19 '05 #3

This is just such a dumb way to do things. You should just start over, and
brain the idiot who did this to you.

Bob Lehmann


Nov 19 '05 #4

TJS
why are you doing this? This is not a good design. I'm surprised it
works at all.

Just use regular aspx files and be done with it.


Nov 19 '05 #5

Bob,

I will respond to your response - as the designer I won’t feel as bad as TJS
who basically called me an idiot.

There are many reasons why I am doing it this way - and I don't believe that
they are all stupid.

1. Firstly, the system was inherited from asp - in asp this idea caused no
problems at all - other than the overhead of obtaining the text from SQL
server and writing it to disk, it all worked very well.

2. I'm not sure - but the idea of having 10,000 aspx files sitting on the
web server does not make sense - and it would not solve my problem anyway -
the compiling time would still be there and so would the memory issues.

3. It provides an excellent mix of creating a file that has a combination
of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.

So - re-designing is not possible or desired - what I have
works very well - except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I move
to aspx is not a helpful answer and sounds more like a cop-out - what I need
is someone willing to understand the reasons for designing it this way and
for someone to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients as
text within a database - this text could have been saved as plain html and
forwarded to the client - but I have included a chart within these invoices
and this chart is an ActiveX control (ChartFX - fully interactive) - for the
ChartFX control to display correctly it must be served from a server -
either an aspx or asp page - I also need to load the control with data - so the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants to
view it - much of the data used to create the invoice is not available.

Also - the fact that SQL is involved is not the problem - the problem comes
about as a result of having 10,000 aspx files and the memory and speed that
it takes to display them.

So - can anyone help? Can anyone offer some insight into how this can be
made more efficient - is it possible to save the compiled code on the
server and load this up with the file when it is displayed? This would solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this
would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:
This is just such a dumb way to do things. You should just start over, and
brain the idiot who did this to you.

Nov 19 '05 #6

Bob

I made you sound like the good guy - I am sorry - but that's not true.

It was TJS who was a little less unreasonable.
--
matthew
Nov 19 '05 #7

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:5A**********************************@microsoft.com...
I made you sound like the good guy - I am sorry - but that's not true.


You're forgiven. Thanks for clearing that up.

Bob Lehmann
Nov 19 '05 #8

TJS
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp
for that piece. classic and .net can run side by side.

Nov 19 '05 #9

TJS,

Sorry - I made a mistake in my first response - I did try to correct the issue.

Thanks for providing some constructive comments - going back to ASP for
these pages is something that I have considered.

By looking at this from another direction - maybe a solution can be provided.

What if I use an external program to return a portion of the HTML text that
is returned to the user - I am sure that this configuration is not unusual.
Basically when the aspx page is called it calls an external function
and this returns text formatted in HTML - this can simply be sent back to the
client using the write statement.

Just say though that the external program decided that it wanted to embed
some server-specific information into its html text - say the server name -
to do this you could embed a <%= %> statement into the HTML. Is there not
any way of parsing the returned HTML and embedded <%= %> statements to result
in pure html text that is then written to the browser?
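One way to realise this without invoking the ASP.NET compiler at all is to restrict the embedded expressions to a known set of tokens and substitute them with a regular expression before writing the result out - a sketch (the token names are examples):

```csharp
// Sketch: replace a known set of <%= token %> placeholders in stored HTML
// with server-side values, producing pure HTML to write to the client.
// The token names (e.g. ServerName) are examples.
using System.Collections.Generic;
using System.Text.RegularExpressions;

public static class TokenExpander
{
    static readonly Regex TokenPattern =
        new Regex(@"<%=\s*(\w+)\s*%>", RegexOptions.Compiled);

    public static string Expand(string html, IDictionary<string, string> values)
    {
        return TokenPattern.Replace(html, delegate(Match m)
        {
            string value;
            // Unknown tokens are left as-is rather than throwing.
            return values.TryGetValue(m.Groups[1].Value, out value)
                ? value
                : m.Value;
        });
    }
}
```

So with values["ServerName"] set, Expand("<p><%= ServerName %></p>", values) should come back with the token replaced, ready for Response.Write.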

This does not sound like an unreasonable thing to do - and from the
perspective of the external program - if this was possible - it would add
significantly to the flexibility of the system.

Hopefully this example has made some sense - does this sound possible?

--
matthew
"TJS" wrote:
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp
for that piece. classic and .net can run side by side.

Nov 19 '05 #10

DLM
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time an
ASP.NET page is requested, and this is true with the page sitting in its
virtual directory on IIS. What you seem to be doing is increasing the round
trip time for any given request/response for a page to include the time it
takes to get the page from SQL Server. And this added time is database access
time, which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very
start, and not make the ASP .NET worker process wait for the files before it
can dynamically compile pages. In a sense you are using SQL Server as a web
server to your web server, which is highly unusual. You need to find a way to
place all your pages on the web server and use SQL Server only for data
retrieval/update.
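If moving all the pages out of SQL Server is not an option, caching the retrieved markup in memory at least removes the database round trip on repeat requests - a sketch using ASP.NET's built-in cache (LoadPageFromSql is a hypothetical data-access helper):

```csharp
// Sketch: cache page markup fetched from SQL Server so repeat requests
// skip the database round trip. LoadPageFromSql is a hypothetical helper.
using System;
using System.Web;
using System.Web.Caching;

public static class PageCache
{
    public static string GetPage(int pageId)
    {
        string key = "page:" + pageId;
        string html = (string)HttpRuntime.Cache[key];
        if (html == null)
        {
            html = LoadPageFromSql(pageId);
            HttpRuntime.Cache.Insert(key, html, null,
                DateTime.UtcNow.AddMinutes(20), Cache.NoSlidingExpiration);
        }
        return html;
    }

    static string LoadPageFromSql(int pageId)
    {
        // e.g. SELECT Content FROM Pages WHERE PageId = @id
        throw new NotImplementedException();
    }
}
```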

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES
Nov 19 '05 #11

First, I didn't call matvdl an idiot.

Second, I did say that the methodology was dumb, and the person
_who_designed_it was an idiot.

Third, is your <Shift> key broken?

Fourth, I agree with everything else you've said.

Bob Lehmann
"TJS" <no****@here.com> wrote in message
news:er**************@TK2MSFTNGP10.phx.gbl...
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp
for that piece. classic and .net can run side by side.

Nov 19 '05 #12

> What if I use an external program to return a portion of the HTML text

Yes, adding another layer of complexity to a poorly designed system will
surely be beneficial.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:D8**********************************@microsof t.com...
TJS,

Sorry - I made a mistake on my first response I did try to correct the issue.
Thanks for providing some constructive comments - going back to ASP for
these pages is something that I have considered.

By looking at this from another direction - maybe a solution can be provided.
What if I use an external program to return a portion of the HTML text that is returned the user - I am sure that this configuration is not unusual.
Basically when the aspx page is called it goes and calls an external function and this returns text formatted in HTML - this can simply sent back to the
client using the write statement.

Just say though that the external program decided that it wanted to embed
some server specific information into its html text - say the server name - to do this you could embed a <%= %> statement into the HTML. Is there not anyway of parsing the returned HTML and embedded <%= %> statements to result in pure html text that is then written to the browser?

This does not sound like an unreasonable thing to do - and from the
perspective of the external program - if this was possible - would add
significantly to the flexibility of the system.

Hopefully this example has made some sense - does this sound possible?

--
matthew
"TJS" wrote:
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsof t.com...
Bob,

I will respond to your response - as the designer I won't feel as bad as TJS
who basically called me an idiot.

There are many reasons why I am doing it this way - and I don't believe that
they are all stupid.

1. Firstly the system was inherited from asp - in asp this idea caused no problems at all and worked very well - other than the overhead of
obtaining
the text from SQL server and writing it to disk - it all worked very well.
2. I'm not sure - but the idea of having 10,000 aspx files sitting on the web server does not make sense - and it would not solve my problem
anyway -
the compiling time would still be there and so would the memory issues.
3. It provides an excellent mix of creating a file that has a combination of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.
So - re-designing is not possible or desired - what I have
works very well - except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I move
to aspx is not a helpful answer and sounds more like a cop-out - what I need
is someone willing to understand the reasons for designing it this way and to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients as
text within a database - this text could have been saved as plain html and forwarded to the client - but I have included a chart within these
invoices
and this chart is an ActiveX control (ChartFX - fully interactive) - for the
ChartFX control to display correctly it must be served from a server - either an aspx or asp page - I also need to load the control with data - so the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants to
view it - much of the data used to create the invoice is not available.
Also - the fact that SQL is involved is not the problem - the problem
comes
about as a result of having 10,000 aspx files and the memory and speed
that
it takes to display them.

So - can anyone help? - can anyone offer some insight into how this can be made more efficient - is it possible to save the compiled code into the server and load this up with the file when it is displayed. This would solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:

> This is just such a dumb way to do things. You should just start over,
> and brain the idiot who did this to you.
>
> Bob Lehmann
>
> "matvdl" <ma****@discussions.microsoft.com> wrote in message
> news:75**********************************@microsoft.com...
> > I have a system that was originally developed in asp - the pages are
> > saved in SQL (there are over 10,000 pages) and saved to a temp
> > directory in the server when requested by a client.
> >
> > I have updated this system and changed the pages that are saved to the
> > server as aspx - everything works fine and pages can be served - but
> >
> > Its not impossible for a single client to request 100 plus pages in one
> > session - as each page is requested it is retrieved from SQL saved to
> > the temp directory and compiled - problems are.
> > - Performance is well down on the original asp system - I believe
> > that this due to the compiling of the pages in asp.net
> > - The memory usage of server also goes through the roof - there is a
> > considerable increase in memory as each page is loaded
> >
> > Are there any solutions to these problems??
> >
> > --
> > matthew


Nov 19 '05 #13

P: n/a
TJS
Quit complaining, I capitalized your name. And yes, you basically did
call him an idiot and he felt the same way and said as much.
Unfortunately, he thought it was me but later did clear that up.

At least we agree about his approach.
"Bob Lehmann" <no****@dontbotherme.zzz> wrote in message
news:ep**************@tk2msftngp13.phx.gbl...
First, I didn't call matvdl an idiot.

Second, I did say that the methodology was dumb, and the person
_who_designed_it was an idiot.

Third, is your <shift > key broken?

Fourth, I agree with everything else you've said.

Bob Lehmann
"TJS" <no****@here.com> wrote in message
news:er**************@TK2MSFTNGP10.phx.gbl...
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic
asp
for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsoft.com...
> Bob,
>
> I will respond to your response - as the designer I won't feel as bad
> as TJS who basically called me an idiot.
>
> There are many reason why I am doing it this way - and I don't believe
> that they are all stupid.
>
> 1. Firstly the system was inherited from asp - in asp this idea caused
> no problems at all and worked very well - other than the overhead of
> obtaining the text from SQL server and writing it to disk - it all
> worked very well.
>
> 2. I'm not sure - but the idea of having 10,000 aspx files sitting on
> the web server does not make sense - and it would not solve my problem
> anyway - the compiling time would still be there and so would the
> memory issues.
>
> 3. It provides an excellent mix of creating a file that has a
> combination of data that was only available at the time the file was
> created and combining it with functionality available at the time it
> was displayed.
>
> So - the ability of me to re-design is not possible or desired - wat I
> have works very well - except for the problems of memory and speed.
>
> This was a good idea in asp - so to say that it is simply stupid when I
> move to aspx is not an helpful answer and sounds more like a cop-out -
> what I need is someone willing to understand the reasons for designing
> it this way and for someone to attempt to answer my questions.
>
> For example - I have saved the copy of each invoice I send out to
> clients as text within a database - this text could have been saved as
> plain html and forwarded to the client - but I have included a chart
> within these invoices and this chart is a active x control (chart fx -
> fully interactive) - for the chart fx control to display correctly it
> must be served from a server - either aspx or asp page - I also need to
> load the control with data - so the only way to do this was through an
> aspx file.
>
> NOW - it is not possible to re-create the invoice each time someone
> wants to view it - much of the data used to create the invoice is not
> available.
>
> Also - the fact that SQL is involved is not the problem - the problem
> comes about as a result of having 10,000 aspx files and the memory and
> speed that it takes to display them.
>
> So - can anyone help? - can anyone offer some insight into how this can
> be made more efficient - is it possible to save the compiled code into
> the server and load this up with the file when it is displayed. This
> would solve the speed issue.
>
> Is it possible to get a page to unload once it has been compiled - this
> would solve the memory issue.
>
> Some constructive answers would be helpful.
> --
> matthew
>
>
> "Bob Lehmann" wrote:
>
>> This is just such a dumb way to do things. You should just start over,
>> and brain the idiot who did this to you.
>>
>> Bob Lehmann
>>
>> "matvdl" <ma****@discussions.microsoft.com> wrote in message
>> news:75**********************************@microsoft.com...
>> > I have a system that was originally developed in asp - the pages are
>> > saved in SQL (there are over 10,000 pages) and saved to a temp
>> > directory in the server when requested by a client.
>> >
>> > I have updated this system and changed the pages that are saved to
>> > the server as aspx - everything works fine and pages can be served -
>> > but
>> >
>> > Its not impossible for a single client to request 100 plus pages in
>> > one session - as each page is requested it is retrieved from SQL
>> > saved to the temp directory and compiled - problems are.
>> > - Performance is well down on the original asp system - I believe
>> > that this due to the compiling of the pages in asp.net
>> > - The memory usage of server also goes through the roof - there is a
>> > considerable increase in memory as each page is loaded
>> >
>> > Are there any solutions to these problems??
>> >
>> > --
>> > matthew



Nov 19 '05 #14

P: n/a
DLM,

Thanks for your response.

I don't believe that SQL is my problem - the time it takes to do the remote
calls to the external program has not presented any real limitations or
caused any significant time-delays. The problem that I have is trying to
deal with the quantity of files that I have and the compiling of those pages
once they have been returned. Any time issues that I have had are the result
of parsing and compiling the page.

I understand your point - that asp.net was designed to have the pages
pre-compiled before requests were made for the pages - I guess this is one
of the fundamental differences between asp and asp.net. From experience the
design that has been implemented had no significant disadvantages when
working with asp - in fact I believe that it had some very significant
advantages - mainly due to the unlimited number of files it could manage.

Question - is it practical to have 10,000 pages in a single asp.net
application? I wouldn't have thought so.

I have no alternative but to deal with this many files, as it is a legal
requirement of the industry I work in to be able to provide the original
copies of invoices to clients - I can't go back to the original data and
attempt to re-produce the same invoice in case the data has changed.

So - considering this is what I am stuck with - my problem mainly relates to
the memory increases - the slowness of the request could be dealt with by
more grunt - but the ever increasing memory is more difficult. I believe
that this is the result of the creation of a new class each time a new page
is displayed - although this page gets un-loaded once the info is returned -
are the increases in memory the result of the new class definition in the
asp.net application?

If this is the case - is it possible to remove this class definition once it
has been created? I have tried to delete the file - but this does not appear
to fix the problem.

Does any of this make sense?
--
matthew
"DLM" wrote:
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time an ASP
.NET page is requested, and this is true with the page sitting in its
virtual directory on IIS. What you seem to be doing is increasing the round
trip time for any given request/response for a page to include the time it
takes to get the page from SQL Server. And this added time is database access
time, which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very
start, and not make the ASP .NET worker process wait for the files before it
can dynamically compile pages. In a sense you are using SQL Server as a web
server to your web server, which is highly unusual. You need to find a way to
place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES

Nov 19 '05 #15

P: n/a
The following suggestion may or may not work for you, but I'll chime in.

First off, to the folks that called the approach your existing app uses
"idiotic", I've seen classic ASP applications design this way that work
extremely well, are scalable and extremely manageable. However, the devil is
in the details. Just putting "pages" in IMAGE or TEXT fields on a SQL
database isn't some sort of magic pixie dust that solves all problems. It
takes a lot of thought and experimentation to get just right, and the
rewards in most cases offset the negative aspects of these designs.

Having said that, your approach is, as you've figured out by now, wrong.
Others have already mentioned the perf hit you're taking by forcing ASP.NET
to function in a way it wasn't designed to. But there are ways to get around
the problem.

First off, I'll assume that you have more content than code. That is, *most*
of your existing ASP/ASPX pages in the database contain stuff, like invoices
and whatnot, that do not require intervention by the parser to load and
display. If this is not the case then stop reading now =)

There are a lot of web applications nowadays that use their databases as
primary content storage repositories. Blogs are a good example of this.
However, they do not store code in the database, they store content. If you
can adapt to this type of scenario then all you have to do is create a sort
of "master page" or maybe an IHttpHandler implementation that looks at the
request (even a path), pulls the *content* from the database and then
displays it on a sort of templated page container. Think of a site like
MSDN. Lots of articles, yet all the pages look the same. The content is what
varies. The trick is to simply treat your "pages" as opaque chunks of
content, and "stream" them into a placeholder page that is rendered the same
way every time. You can pretty much do anything you want once you're hooked
into the ASP.NET processing pipeline.

If you do this you'll get rid of your perf problem and you'll still have the
flexibility and manageability of the database storage. This is web
application design 101 - always separate content from function and
structure.

Of course getting from point A to point B could be tricky, but maybe this
gives you some ideas =)
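A minimal sketch of the IHttpHandler idea above, assuming a hypothetical ContentStore.GetContentById helper standing in for the SQL Server lookup; the handler would be mapped to a path or extension in web.config:

```csharp
using System.Web;

// Serves database-stored content as opaque HTML - nothing is written
// to disk and nothing is compiled per request.
public class ContentHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The id comes from the query string; the lookup itself is an
        // assumption - substitute your own data-access call.
        string id = context.Request.QueryString["id"];
        string html = ContentStore.GetContentById(id);

        context.Response.ContentType = "text/html";
        context.Response.Write(html); // stream the stored content as-is
    }

    // The handler keeps no per-request state, so one instance can be reused.
    public bool IsReusable { get { return true; } }
}

// Stub standing in for the real SQL Server lookup (hypothetical).
internal static class ContentStore
{
    public static string GetContentById(string id)
    {
        return "<html><body>content for " + id + "</body></html>";
    }
}
```

The point is that ASP.NET compiles only this one handler, however many thousands of content rows sit in the database.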

--
Klaus H. Probst, MVP
http://www.simulplex.net/
"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:DF**********************************@microsoft.com...
DLM,

Thanks for your response.

I don't believe that SQL is my problem - the time it takes to do the remote calls to the external program has not presented any real limitations or
caused any significant time-delays. The problem that I have is trying to
deal with the quanitity of files that I have and the combiling of those pages once they have been returned. Any time issues that I have had are the result of parsing and compilling the page.

I understand your point - that asp.net was designed to have the pages
pre-compiled before requests where made for the pages - I guess this is one of the fundamental differances between asp and asp.net. From experiance the design that has been implemented had no significant disadvantages when
working with asp - in fact I believe that it had some very significant
advantages - mainly due to the unlimited number of files it could manage.

Question - is it practicale to have 10,000 pages in a single asp.net
application. I wouldn't have thought so?

I have no alternative but to deal with this many files as it is a legal
requirment of the industry I work in is to be able to provide the original
copies of invoices to clients - I can't go back to the original data and
attempt to re-produce the same invoice just in case the data has changed.

So - considering this is what I am stuck with - my problem mainly relates to the memory increases - the slowness of the request could be dealt with by the more grunt - but the ever increasing memory is more difficult. I believe
that this is the result of the creation of a new class each time a new page is displayed - although this page gets un-loaded once the info is returned - are the increases in memory the result of the new class definition in the
asp.net application?

If this is the case - is it possible to remove this class definition once it has been created? I have tried to delete the file - but this does not appear to fix the problem.

Does any of this make sense.
--
matthew
"DLM" wrote:
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time a ASP .NET Page is requested, and this is true with the page sitting in it's
virtual directory on IIS. What you seem to be doing is increasing the round trip time for any given request/response for a page to include the time it takes to get the page from SQL Server. And this added time is database access time which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very start, and not make the ASP .NET worker process wait for the files before it can dynamically compile pages. In a sense you are using SQL Server as a web server to your web server, which is highly unusual. You need to find a way to place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES

Nov 19 '05 #16

P: n/a
To understand why the solution you had working in classic asp doesn't perform
in .net, you have to understand the internals of how these systems work.

In asp, every page was interpreted at runtime. It looked for your <% %> tags
and interpreted the code between them in real time as it ran each page. If your
page didn't contain much server-side code and was mostly content then there
wasn't much to do. Basically the system was designed to look at each page
each time it was hit, as necessary. This is why almost all systems (except
yours) are faster in .net.

.net on the other hand "compiles" every page into IL code when the page
changes. This IL code is then interpreted in real time by the runtime engine.
Compiling is a lot of work. It outputs new code, checks syntax, checks type
casts, checks variable declarations, function calls, and a million other
things. asp would just crap out if it didn't like what it saw. The key here
is that .aspx files are compiled once per folder, and a folder is recompiled
every time a file in it changes. On a normal system, you take a hit once for each
folder that has a page in it requested. Then nothing changes for a long time,
so the compiler may not do anything for months.

In your system, every time you push a page out from SQL to disk you are
instructing the compiler that there is a new page that it must compile (not
to mention the file system write which could possibly be the slowest thing
you could do on a Windows box). Now if you start loading all 10000 pages
into the same folder, the work it has to do will get worse as the more pages
are requested since there will be more pages to compile.

The key to making your system work is to do some major rearchitecting or
leave it in classic asp. Why do you want to change? If you rearchitect, you
need to only store "data" in the database and keep code in files on the
server compiled. Then you load the pages with data from the db. This is how
content management systems work.

An example would be your invoice pages. I assume that many of your "pages"
in the database are invoices. You need to just store them as text along with
whatever data you need to make the page in SQL Server (table called Invoice?)
and then have a page called invoice.aspx on the server that uses code to grab
the data and format it for display(including your graph control).

It really has never been a "good" design to put code in a database. It just
happens that classic asp was stupid enough that you could get away with it.
You need to leverage the systems for what they are good at; IIS/.net for page
serving and SQL server for data storage.
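The single invoice.aspx idea described above might look roughly like this, assuming a hypothetical InvoiceStore.GetInvoiceHtml data-access helper and an asp:Literal declared in the page markup:

```csharp
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

// One compiled page serves every invoice; the per-invoice text comes
// from the database as data, not as code.
public class InvoicePage : Page
{
    // Declared in markup as <asp:Literal id="InvoiceBody" runat="server" />
    protected Literal InvoiceBody;

    protected void Page_Load(object sender, EventArgs e)
    {
        string id = Request.QueryString["id"];
        // The stored text is treated as opaque content, so nothing new
        // is written to disk or compiled per request.
        InvoiceBody.Text = InvoiceStore.GetInvoiceHtml(id);
    }
}

// Stub standing in for the real SQL Server lookup (hypothetical).
internal static class InvoiceStore
{
    public static string GetInvoiceHtml(string id)
    {
        return "<h1>Invoice " + id + "</h1>";
    }
}
```

The graph control would live in the compiled page as well, with only its data coming from the invoice row.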
Nov 19 '05 #17

P: n/a
Klaus,

Thank you for your help - I must admit that at the beginning I was a little
taken aback by people's negative responses to the problem - I believe that
people were far too quick to make judgements without bothering to understand
the problem - I think I mentioned in almost every response that
I have around 10,000 pages - and no one bothered to provide much help with
the specific problem - I think the method I have used is not too bad - it just
needs to be modified to work efficiently with asp.net.

Yes - most of the pages are mostly content and there is not a lot of code
in each page. So I am trying to work out how to implement something similar
to your suggestion - but still provide the flexibility of enabling server
side code where needed.

What can make this a little difficult is my lack of experience in asp.net
- but hopefully that will improve.

The solution I was looking at would be to create some specific server side
asp.net controls for some of the more specific tasks that I have - for
instance - I need to load charts at the server and therefore need to be able
to provide the data to these charts - sometimes this could be a lot of data.
I want the data to be saved in the page so I thought that an asp.net control
could be the way to go - I could pass the data to the control and this
control would already be defined on the server.

So - if I have a string of HTML text with an embedded asp.net server control
and I use the response.write function, will asp.net recognize that there is a
control within the document and load it accordingly?

So there wouldn't be any <% %> within the html - just the server controls.

If this doesn't work, the only way that I can imagine doing it would be to
implement my own method of tags - search for them at run-time, extract
the code between them and use the Eval function to evaluate the code. It does
not seem like a good solution - but I think it would work.

Part of my aim is to limit the dependence of the need to store data - once
the invoice or file is produced ideally I need future requests to only look
at this one file for it to display correctly - hence the reasoning behind
embedding the data within the file - but that does not mean that the code
could be pulled out of the file and placed elsewhere. Having as few
dependencies as possible on what is required to display the page was part of
the original design and one that ensures that future system changes will not
inadvertently make viewing older invoices fail. Something that has worked
well over the past 5 years.

Are these the sorts of ideas you were thinking of?
--
matthew
"Klaus H. Probst" wrote:
The following suggestion may or may not work for you, but I'll chime in.

First off, to the folks that called the approach your existing app uses
"idiotic", I've seen classic ASP applications design this way that work
extremely well, are scalable and extremely manageable. However, the devil is
in the details. Just putting "pages" in IMAGE or TEXT fields on a SQL
database isn't some sort of magic pixie dust that solves all problems. It
takes a lot of thought and experimentation to get just right, and the
rewards in most cases offset the negative aspects of these designs.

Having said that, your approach is, as you've figured out by now, wrong.
Others have already mentioned the perf hit you're taking by forcing ASP.NET
to function in a way it wasn't designed to. But there are ways to get around
the problem.

First off, I'll assume that you have more content than code. That is, *most*
of your existing ASP/ASPX pages in the database contain stuff, like invoices
and whatnot, that do not require intervention by the parser to load and
display. If this is not the case then stop reading now =)

There are a lot of web applications nowadays that use their databases as
primary content storage repositories. Blogs are a good example of this.
However, they do not store code in the database, they store content. If you
can adapt to this type of scenario then all you have to do is create a sort
of "master page" or maybe an IHttpHandler implementation that looks at the
request (even a path), pulls the *content* from the database and then
displays it on a sort of templated page container. Think of a site like
MSDN. Lots of articles, yet all the pages look the same. The content is what
varies. The trick is to simply treat your "pages" as opaque chunks of
content, and "stream" them into a placeholder page that is rendered the same
way every time. You can pretty much do anything you want once you're hooked
into the ASP.NET processing pipeline.

If you do this you'll get rid of your perf problem and you'll still have the
flexibility and manageability of the database storage. This is web
application design 101 - always separate content from function and
structure.

Of course getting from point A to point B could be tricky, but maybe this
gives you some ideas =)

--
Klaus H. Probst, MVP
http://www.simulplex.net/
"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:DF**********************************@microsoft.com...
DLM,

Thanks for your response.

I don't believe that SQL is my problem - the time it takes to do the

remote
calls to the external program has not presented any real limitations or
caused any significant time-delays. The problem that I have is trying to
deal with the quanitity of files that I have and the combiling of those

pages
once they have been returned. Any time issues that I have had are the

result
of parsing and compilling the page.

I understand your point - that asp.net was designed to have the pages
pre-compiled before requests where made for the pages - I guess this is

one
of the fundamental differances between asp and asp.net. From experiance

the
design that has been implemented had no significant disadvantages when
working with asp - in fact I believe that it had some very significant
advantages - mainly due to the unlimited number of files it could manage.

Question - is it practicale to have 10,000 pages in a single asp.net
application. I wouldn't have thought so?

I have no alternative but to deal with this many files as it is a legal
requirment of the industry I work in is to be able to provide the original
copies of invoices to clients - I can't go back to the original data and
attempt to re-produce the same invoice just in case the data has changed.

So - considering this is what I am stuck with - my problem mainly relates

to
the memory increases - the slowness of the request could be dealt with by

the
more grunt - but the ever increasing memory is more difficult. I believe
that this is the result of the creation of a new class each time a new

page
is displayed - although this page gets un-loaded once the info is

returned -
are the increases in memory the result of the new class definition in the
asp.net application?

If this is the case - is it possible to remove this class definition once

it
has been created? I have tried to delete the file - but this does not

appear
to fix the problem.

Does any of this make sense.
--
matthew
"DLM" wrote:
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time a ASP .NET Page is requested, and this is true with the page sitting in it's
virtual directory on IIS. What you seem to be doing is increasing the round trip time for any given request/response for a page to include the time it takes to get the page from SQL Server. And this added time is database access time which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very start, and not make the ASP .NET worker process wait for the files before it can dynamically compile pages. In a sense you are using SQL Server as a web server to your web server, which is highly unusual. You need to find a way to place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES


Nov 19 '05 #18

P: n/a
DancesWithBamboo,

Thank you for your help - your response, I believe, hits the nail on the head.

I am trying to look at my options for pulling the code out of the pages and
placing it in asp.net - not sure exactly how to do it - but I am looking at
different options. If you look at one of my recent replies to Klaus I have
outlined a few of my ideas.

I found the asp solution very flexible and it provided a good solution to
the issues I was having - but I was forced to move to asp.net due to
reliability issues and a need to use webservices. Asp pages still get served
occasionally from the server as they were created 3-4 years ago and the
problems I have been having with asp.net have not really caused a great
problem until recently. I guess it's time to do some more upgrading.

Thanks for your help.

--
matthew
"DancesWithBamboo" wrote:
To understand why the solution you had working in clasic asp doesn't perform
in .net, you have to understand the internals of how these systems work.

In asp, every page was intrepreted runtime. It looked for your <% %> tags
and intrepeted the code between them real time as it ran each page. If your
page didn't caontain much server-side code and was mostly content then there
wasn't much to do. Basically the system was designed to look at each page
each time it was hit as necessary. This is why almost all systems (except
your) are faster in .net.

.net on the other hand "compiles" every page into IL code when the page
changes. This IL code is then intrepeted real time by the runtime engine.
Compiling is a lot of work. It outputs new code, checks syntax, checks type
casts, checks variable declarations, function calls, and another million
things. asp would just crap out if it didn't like what it saw. The key here
is that .aspx files are compiled once for every folder and every folder every
time a file changes in it. On a normal system, you take a hit once for each
folder that has a page in it requested. Then nothing changes for a long time
so the compiler may not do anything for months.

In your system, every time you push a page out from SQL to disk you are
instructing the compiler that there is a new page that it must compile (not
to mention the file system write which could possibly be the slowest thing
you could do on a Windows box). Now if you start loading all 10000 pages
into the same folder, the work it has to do will get worse as the more pages
are requested since there will be more pages to compile.

The key to making your system work is to do some major rearchitecting or
leave it in clasic asp. Why do you want to change? If you rearchitect, you
need to only store "data" in the database and keep code in files on the
server compiled. Then you load the pages with data from the db. This is how
content management systems work.

An example would be your invoice pages. I assume that many of your "pages"
in the database are invoices. You need to just store them as text along with
whatever data you need to make the page in SQL Server (table called Invoice?)
and then have a page called invoice.aspx on the server that uses code to grab
the data and format it for display(including your graph control).

It really has never been a "good" design to put code in a database. It just
happens that classic asp was stupid enough that you could get away with it.
You need to leverage the systems for what they are good at; IIS/.net for page
serving and SQL server for data storage.

Nov 19 '05 #19

P: n/a
Hi,
Convert the aspx pages to controls and use LoadControl from a single aspx by
looking up the ID of the SQL-stored control! You could also cache the
controls to improve performance.

regards
Brucek

"matvdl" wrote:
I have a system that was originally developed in asp - the pages are saved in
SQL (there are over 10,000 pages) and saved to a temp directory in the server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew

Nov 19 '05 #20

You could also use ParseControl to build the controls from a string pulled
straight from the database - LoadControl can only load from a file.
Basically, in pseudocode:
1. Put a PlaceHolder in an aspx page
2. Get the ID of the stored control from the calling page
3. Fetch the ASCX markup text from the database
4. control = ParseControl(text)
5. Placeholder.Controls.Add(control)

Repeat steps 2-5 in a loop if there is more than one control per page.

This way you get one aspx page with one or more content controls, and no
redirects etc.
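The steps above might look roughly like this in the host page's code-behind (GetMarkupFromDb and the ContentHolder placeholder are hypothetical names invented for the sketch):

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    string id = Request.QueryString["id"];
    string markup = GetMarkupFromDb(id);   // hypothetical DB lookup helper
    // TemplateControl.ParseControl builds a control tree from a string
    // instead of a file; note it does not accept <script runat="server">
    // code blocks, only declarative markup.
    Control parsed = ParseControl(markup);
    ContentHolder.Controls.Add(parsed);    // ContentHolder: <asp:PlaceHolder>
}
```

Because nothing is written to disk, this avoids both the file-system write and the per-file compilation that were hurting the original design.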

Regards
Bruce

Nov 19 '05 #21

Matthew,

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:EF**********************************@microsof t.com...
> So - if I have a string of HTML text with an embedded asp.net server
> control, and I use the response.write function, will asp.net recognize
> that there is a control within the document and load it accordingly?

No, the template parser is long done by the time you get to write to the
HTTP output stream.

> If this doesn't work, the only way I can imagine doing it would be to
> implement my own method of tags - search for them at run-time, extract
> the code between them, and use the Eval function to evaluate the code.
> That does not seem like a good solution - but I think it would work.
You can embed type names or something like that, and then use some sort of
class factory implementation to instantiate the "controls" that need to
render themselves. Pass the HttpContext reference to them or some such so
they can play. Remember that a "control" in ASP.NET is something that just
renders some HTML. They're nothing special and they don't have to be loaded
from disk or have an ASCX extension (although that has its advantages).
Still, avoid that sort of thing as much as you can.
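One hedged way to read the class-factory suggestion: store a type name in the custom tag and resolve it at run time (the tag convention and helper name here are invented for illustration):

```csharp
// Resolve an embedded type name to a control instance at run time.
// Any Control can render itself into the response; nothing requires
// it to live in an .ascx file on disk.
Control CreateControl(string typeName)
{
    Type t = Type.GetType(typeName, true);   // throws if the type is unknown
    return (Control)Activator.CreateInstance(t);
}
```

The host page would scan the stored text for its tags, call CreateControl for each, and add the results to a placeholder before rendering.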
> Part of my aim is to limit the dependence on the need to store data -
> once the invoice or file is produced, ideally future requests need only
> look at this one file for it to display correctly - hence the reasoning
> behind embedding the data within the file - but that does not mean that
> the code could be pulled out of the file and placed elsewhere. Having as
> few dependencies as possible on what is required to display the page was
> part of the original design, and one that ensures future system changes
> will not inadvertently make viewing older invoices fail. Something that
> has worked well over the past 5 years.


Ideally you never store data with presentation metadata (your markup), but
I will grant that in some cases it makes sense. Storing the data with the
markup is not a good idea just because you think the application will
change someday - you design applications to be resilient and adaptable to
most foreseeable changes.

--
Klaus H. Probst, MVP
http://www.simulplex.net/
Nov 19 '05 #22
