Using SQL to store aspx pages and memory problems

I have a system that was originally developed in asp - the pages are saved in
SQL (there are over 10,000 pages) and saved to a temp directory on the server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server to aspx - everything works fine and pages can be served - but

It's not impossible for a single client to request 100-plus pages in one
session - as each page is requested it is retrieved from SQL, saved to the
temp directory and compiled. The problems are:
- Performance is well down on the original asp system - I believe that
this is due to the compiling of the pages in asp.net
- The memory usage of the server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew
Nov 19 '05 #1
TJS
What exactly does this mean?

"as each page is requested it is retrieved from SQL saved to the
temp directory and compiled"

Why are you saving it twice?

If it's saved in SQL, why not just write it back to the client?

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
I have a system that was originally developed in asp - the pages are saved
in
SQL (there are over 10,000 pages) and saved to a temp directory in the
server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew

Nov 19 '05 #2
Many of the pages have server-side code - they are aspx files - not html
files. E.g. some of the files may need to get information such as the current
date the file was displayed - or they are customized based on the browser
that views them - so they get modified at run-time. Many have server-side
controls that display charts (the ChartFX control).

The way I manage this is to write the saved aspx file to a temp directory on
the server and redirect the request to this file.

I also have html files that are retrieved from SQL but these are simply
written back to the client - the problem is with aspx files.
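
Roughly, the serving code does something along these lines (this is only a
sketch - the table, column, helper and temp path names here are illustrative,
not the real ones):

using System;
using System.IO;
using System.Web;
using System.Data.SqlClient;

public class StoredPageServer
{
    // Pull the stored aspx text from SQL, write it to a temp folder under
    // the web root, then redirect the browser to it so ASP.NET compiles
    // and runs it on that first request.
    public static void ServeStoredPage(HttpContext context, int pageId, string connectionString)
    {
        string pageText;
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT PageText FROM StoredPages WHERE PageId = @id", conn))
        {
            cmd.Parameters.Add(new SqlParameter("@id", pageId));
            conn.Open();
            pageText = (string)cmd.ExecuteScalar();
        }

        string fileName = "page_" + pageId + ".aspx";
        string physicalPath = context.Server.MapPath("~/temp/" + fileName);
        using (StreamWriter writer = new StreamWriter(physicalPath, false))
        {
            writer.Write(pageText);
        }

        // ASP.NET parses and compiles the freshly written page when the
        // redirected request arrives.
        context.Response.Redirect("temp/" + fileName);
    }
}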

Any help would be appreciated.

Matthew
--
matthew
"TJS" wrote:
what exactly does this mean ?

"as each page is requested it is retrieved from SQL saved to the
temp directory and compiled "

why are you saving it twice ?

if it 's saved in sql why not just write it back to the client?

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
I have a system that was originally developed in asp - the pages are saved
in
SQL (there are over 10,000 pages) and saved to a temp directory in the
server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew


Nov 19 '05 #3
This is just such a dumb way to do things. You should just start over, and
brain the idiot who did this to you.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
I have a system that was originally developed in asp - the pages are saved in SQL (there are over 10,000 pages) and saved to a temp directory in the server when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew

Nov 19 '05 #4
TJS
Why are you doing this? This is not a good design. I'm surprised it
works at all.

Just use regular aspx files and be done with it.


"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:AA**********************************@microsof t.com...
Many of the pages have server side code - they are aspx files - not html
files. e.g some of the files may need to get information such as current
date the file was displayed - or they are customized based on the browser
that views them - so they get modifed at run-time. Many have server side
controls that display charts. (chartfx control)

The way I manage this is to write the saved aspx file to a temp directory
on
the server and redirect the request to this file.

I also have html files that are retrieved from SQL but these are simply
written back to the client - the problem is with aspx files.

Any help would be appreciated.

Matthew
--
matthew
"TJS" wrote:
what exactly does this mean ?

"as each page is requested it is retrieved from SQL saved to the
temp directory and compiled "

why are you saving it twice ?

if it 's saved in sql why not just write it back to the client?

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
>I have a system that was originally developed in asp - the pages are
>saved
>in
> SQL (there are over 10,000 pages) and saved to a temp directory in the
> server
> when requested by a client.
>
> I have updated this system and changed the pages that are saved to the
> server as aspx - everything works fine and pages can be served - but
>
> Its not impossible for a single client to request 100 plus pages in one
> session - as each page is requested it is retrieved from SQL saved to
> the
> temp directory and compiled - problems are.
> - Performance is well down on the original asp system - I believe
> that
> this due to the compiling of the pages in asp.net
> - The memory usage of server also goes through the roof - there is a
> considerable increase in memory as each page is loaded
>
> Are there any solutions to these problems??
>
> --
> matthew


Nov 19 '05 #5
Bob,

I will respond to your response - as the designer I won't feel as bad as TJS,
who basically called me an idiot.

There are many reasons why I am doing it this way - and I don't believe that
they are all stupid.

1. Firstly, the system was inherited from asp - in asp this idea caused no
problems at all - other than the overhead of obtaining the text from SQL
Server and writing it to disk, it all worked very well.

2. I'm not sure - but the idea of having 10,000 aspx files sitting on the
web server does not make sense - and it would not solve my problem anyway -
the compiling time would still be there and so would the memory issues.

3. It provides an excellent mix: a file that combines data that was only
available at the time the file was created with functionality available at
the time it is displayed.

So - a re-design is not possible or desired - what I have works very well -
except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I move
to aspx is not a helpful answer and sounds more like a cop-out - what I need
is someone willing to understand the reasons for designing it this way and
to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients as
text within a database - this text could have been saved as plain html and
forwarded to the client - but I have included a chart within these invoices
and this chart is an ActiveX control (ChartFX - fully interactive) - for the
ChartFX control to display correctly it must be served from a server - either
an aspx or asp page - and I also need to load the control with data - so the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants to
view it - much of the data used to create the invoice is not available.

Also - the fact that SQL is involved is not the problem - the problem comes
about as a result of having 10,000 aspx files and the memory and speed that
it takes to display them.

So - can anyone help? Can anyone offer some insight into how this can be
made more efficient? Is it possible to save the compiled code on the
server and load it together with the file when it is displayed? That would
solve the speed issue.

Is it possible to get a page to unload once it has been compiled? That
would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:
This is just such a dumb way to do things. You should just start over, and
brain the idiot who did this to you.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
I have a system that was originally developed in asp - the pages are saved

in
SQL (there are over 10,000 pages) and saved to a temp directory in the

server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew


Nov 19 '05 #6
Bob

I made you sound like the good guy - I am sorry - but that's not true.

It was TJS who was a little less unreasonable.
--
matthew
"matvdl" wrote:
Bob,

I will respond to your response - as the designer I won’t feel as bad as TJS
who basically called me an idiot.

There are many reason why I am doing it this way - and I don't believe that
they are all stupid.

1. Firstly the system was inherited from asp - in asp this idea caused no
problems at all and worked very well - other than the overhead of obtaining
the text from SQL server and writing it to disk - it all worked very well.

2. I'm not sure - but the idea of having 10,000 aspx files sitting on the
web server does not make sense - and it would not solve my problem anyway -
the compiling time would still be there and so would the memory issues.

3. It provides an excellent mix of creating a file that has a combination
of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.

So - the ability of me to re-design is not possible or desired – wat I have
works very well – except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I move
to aspx is not an helpful answer and sounds more like a cop-out - what I need
is someone willing to understand the reasons for designing it this way and
for someone to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients as
text within a database - this text could have been saved as plain html and
forwarded to the client - but I have included a chart within these invoices
and this chart is a active x control (chart fx - fully interactive) - for the
chart fx control to display correctly it must be served from a server -
either aspx or asp page - I also need to load the control with data - so the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants to
view it - much of the data used to create the invoice is not available.

Also - the fact that SQL is involved is not the problem - the problem comes
about as a result of having 10,000 aspx files and the memory and speed that
it takes to display them.

So - can anyone help? - can anyone offer some insight into how this can be
made more efficient - is it possible to save the compiled code into the
server and load this up with the file when it is displayed. This would solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this
would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:
This is just such a dumb way to do things. You should just start over, and
brain the idiot who did this to you.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
I have a system that was originally developed in asp - the pages are saved

in
SQL (there are over 10,000 pages) and saved to a temp directory in the

server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew


Nov 19 '05 #7
"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:5A**********************************@microsof t.com...
I made yousund like the good guy - I am sorry - but thats not true.


You're forgiven. Thanks for clearing that up.

Bob Lehmann
Nov 19 '05 #8
TJS
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp
for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsof t.com...
Bob,

I will respond to your response - as the designer I won't feel as bad as
TJS
who basically called me an idiot.

There are many reason why I am doing it this way - and I don't believe
that
they are all stupid.

1. Firstly the system was inherited from asp - in asp this idea caused no
problems at all and worked very well - other than the overhead of
obtaining
the text from SQL server and writing it to disk - it all worked very well.

2. I'm not sure - but the idea of having 10,000 aspx files sitting on the
web server does not make sense - and it would not solve my problem
anyway -
the compiling time would still be there and so would the memory issues.

3. It provides an excellent mix of creating a file that has a combination
of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.

So - the ability of me to re-design is not possible or desired - wat I
have
works very well - except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I
move
to aspx is not an helpful answer and sounds more like a cop-out - what I
need
is someone willing to understand the reasons for designing it this way and
for someone to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients
as
text within a database - this text could have been saved as plain html and
forwarded to the client - but I have included a chart within these
invoices
and this chart is a active x control (chart fx - fully interactive) - for
the
chart fx control to display correctly it must be served from a server -
either aspx or asp page - I also need to load the control with data - so
the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants
to
view it - much of the data used to create the invoice is not available.

Also - the fact that SQL is involved is not the problem - the problem
comes
about as a result of having 10,000 aspx files and the memory and speed
that
it takes to display them.

So - can anyone help? - can anyone offer some insight into how this can be
made more efficient - is it possible to save the compiled code into the
server and load this up with the file when it is displayed. This would
solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this
would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:
This is just such a dumb way to do things. You should just start over,
and
brain the idiot who did this to you.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
> I have a system that was originally developed in asp - the pages are
> saved

in
> SQL (there are over 10,000 pages) and saved to a temp directory in the

server
> when requested by a client.
>
> I have updated this system and changed the pages that are saved to the
> server as aspx - everything works fine and pages can be served - but
>
> Its not impossible for a single client to request 100 plus pages in one
> session - as each page is requested it is retrieved from SQL saved to
> the
> temp directory and compiled - problems are.
> - Performance is well down on the original asp system - I believe
> that
> this due to the compiling of the pages in asp.net
> - The memory usage of server also goes through the roof - there is a
> considerable increase in memory as each page is loaded
>
> Are there any solutions to these problems??
>
> --
> matthew


Nov 19 '05 #9
TJS,

Sorry - I made a mistake in my first response; I did try to correct the issue.

Thanks for providing some constructive comments - going back to ASP for
these pages is something that I have considered.

By looking at this from another direction - maybe a solution can be provided.

What if I use an external program to return a portion of the HTML text that
is returned to the user - I am sure that this configuration is not unusual.
Basically, when the aspx page is called it calls an external function
and this returns text formatted as HTML - this can simply be sent back to the
client using the write statement.

Just say, though, that the external program decided that it wanted to embed
some server-specific information into its html text - say the server name -
to do this you could embed a <%= %> statement in the HTML. Is there not
any way of parsing the returned HTML and its embedded <%= %> statements to
produce pure html text that is then written to the browser?
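
One way I could imagine doing this myself, without invoking the ASP.NET
parser at all, is a plain token substitution over the returned HTML before
writing it out. This is only a sketch - the placeholder names and the list of
values it understands are made up:

using System;
using System.Collections;
using System.Text.RegularExpressions;
using System.Web;

public class InlineTagExpander
{
    // Expands a small, fixed set of <%= name %> tokens in an HTML string.
    // This is not the ASP.NET parser - only the names listed here are known.
    public static string Expand(string html, HttpContext context)
    {
        Hashtable values = new Hashtable();
        values["ServerName"] = context.Server.MachineName;
        values["RequestTime"] = DateTime.Now.ToString("yyyy-MM-dd HH:mm");

        return Regex.Replace(html, @"<%=\s*(\w+)\s*%>", delegate(Match m)
        {
            object v = values[m.Groups[1].Value];
            return v == null ? m.Value : v.ToString();
        });
    }
}

The caller would then just do
Response.Write(InlineTagExpander.Expand(htmlFromDb, Context)).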

This does not sound like an unreasonable thing to do - and from the
perspective of the external program - if this were possible - it would add
significantly to the flexibility of the system.

Hopefully this example has made some sense - does this sound possible?

--
matthew
"TJS" wrote:
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp
for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsof t.com...
Bob,

I will respond to your response - as the designer I won't feel as bad as
TJS
who basically called me an idiot.

There are many reason why I am doing it this way - and I don't believe
that
they are all stupid.

1. Firstly the system was inherited from asp - in asp this idea caused no
problems at all and worked very well - other than the overhead of
obtaining
the text from SQL server and writing it to disk - it all worked very well.

2. I'm not sure - but the idea of having 10,000 aspx files sitting on the
web server does not make sense - and it would not solve my problem
anyway -
the compiling time would still be there and so would the memory issues.

3. It provides an excellent mix of creating a file that has a combination
of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.

So - the ability of me to re-design is not possible or desired - wat I
have
works very well - except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I
move
to aspx is not an helpful answer and sounds more like a cop-out - what I
need
is someone willing to understand the reasons for designing it this way and
for someone to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients
as
text within a database - this text could have been saved as plain html and
forwarded to the client - but I have included a chart within these
invoices
and this chart is a active x control (chart fx - fully interactive) - for
the
chart fx control to display correctly it must be served from a server -
either aspx or asp page - I also need to load the control with data - so
the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants
to
view it - much of the data used to create the invoice is not available.

Also - the fact that SQL is involved is not the problem - the problem
comes
about as a result of having 10,000 aspx files and the memory and speed
that
it takes to display them.

So - can anyone help? - can anyone offer some insight into how this can be
made more efficient - is it possible to save the compiled code into the
server and load this up with the file when it is displayed. This would
solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this
would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:
This is just such a dumb way to do things. You should just start over,
and
brain the idiot who did this to you.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
> I have a system that was originally developed in asp - the pages are
> saved
in
> SQL (there are over 10,000 pages) and saved to a temp directory in the
server
> when requested by a client.
>
> I have updated this system and changed the pages that are saved to the
> server as aspx - everything works fine and pages can be served - but
>
> Its not impossible for a single client to request 100 plus pages in one
> session - as each page is requested it is retrieved from SQL saved to
> the
> temp directory and compiled - problems are.
> - Performance is well down on the original asp system - I believe
> that
> this due to the compiling of the pages in asp.net
> - The memory usage of server also goes through the roof - there is a
> considerable increase in memory as each page is loaded
>
> Are there any solutions to these problems??
>
> --
> matthew


Nov 19 '05 #10
DLM
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time an
ASP.NET page is requested, and this is true with the page sitting in its
virtual directory on IIS. What you seem to be doing is increasing the round
trip time for any given request/response for a page to include the time it
takes to get the page from SQL Server - and this added time is database access
time, which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very
start, and not make the ASP .NET worker process wait for the files before it
can dynamically compile pages. In a sense you are using SQL Server as a web
server to your web server, which is highly unusual. You need to find a way to
place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES
Nov 19 '05 #11
First, I didn't call matvdl an idiot.

Second, I did say that the methodology was dumb, and the person
_who_designed_it was an idiot.

Third, is your <shift> key broken?

Fourth, I agree with everything else you've said.

Bob Lehmann
"TJS" <no****@here.com> wrote in message
news:er**************@TK2MSFTNGP10.phx.gbl...
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp
for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsof t.com...
Bob,

I will respond to your response - as the designer I won't feel as bad as
TJS
who basically called me an idiot.

There are many reason why I am doing it this way - and I don't believe
that
they are all stupid.

1. Firstly the system was inherited from asp - in asp this idea caused no problems at all and worked very well - other than the overhead of
obtaining
the text from SQL server and writing it to disk - it all worked very well.
2. I'm not sure - but the idea of having 10,000 aspx files sitting on the web server does not make sense - and it would not solve my problem
anyway -
the compiling time would still be there and so would the memory issues.

3. It provides an excellent mix of creating a file that has a combination of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.

So - the ability of me to re-design is not possible or desired - wat I
have
works very well - except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I
move
to aspx is not an helpful answer and sounds more like a cop-out - what I
need
is someone willing to understand the reasons for designing it this way and for someone to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients as
text within a database - this text could have been saved as plain html and forwarded to the client - but I have included a chart within these
invoices
and this chart is a active x control (chart fx - fully interactive) - for the
chart fx control to display correctly it must be served from a server -
either aspx or asp page - I also need to load the control with data - so
the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants to
view it - much of the data used to create the invoice is not available.

Also - the fact that SQL is involved is not the problem - the problem
comes
about as a result of having 10,000 aspx files and the memory and speed
that
it takes to display them.

So - can anyone help? - can anyone offer some insight into how this can be made more efficient - is it possible to save the compiled code into the
server and load this up with the file when it is displayed. This would
solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this
would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:
This is just such a dumb way to do things. You should just start over,
and
brain the idiot who did this to you.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:75**********************************@microsof t.com...
> I have a system that was originally developed in asp - the pages are
> saved
in
> SQL (there are over 10,000 pages) and saved to a temp directory in the server
> when requested by a client.
>
> I have updated this system and changed the pages that are saved to the > server as aspx - everything works fine and pages can be served - but
>
> Its not impossible for a single client to request 100 plus pages in one > session - as each page is requested it is retrieved from SQL saved to > the
> temp directory and compiled - problems are.
> - Performance is well down on the original asp system - I believe
> that
> this due to the compiling of the pages in asp.net
> - The memory usage of server also goes through the roof - there is a > considerable increase in memory as each page is loaded
>
> Are there any solutions to these problems??
>
> --
> matthew


Nov 19 '05 #12
> What if I use an external program to return a portion of the HTML text

Yes, adding another layer of complexity to a poorly designed system will
surely be beneficial.

Bob Lehmann

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:D8**********************************@microsof t.com...
TJS,

Sorry - I made a mistake on my first response I did try to correct the issue.
Thanks for providing some constructive comments - going back to ASP for
these pages is something that I have considered.

By looking at this from another direction - maybe a solution can be provided.
What if I use an external program to return a portion of the HTML text that is returned the user - I am sure that this configuration is not unusual.
Basically when the aspx page is called it goes and calls an external function and this returns text formatted in HTML - this can simply sent back to the
client using the write statement.

Just say though that the external program decided that it wanted to embed
some server specific information into its html text - say the server name - to do this you could embed a <%= %> statement into the HTML. Is there not anyway of parsing the returned HTML and embedded <%= %> statements to result in pure html text that is then written to the browser?

This does not sound like an unreasonable thing to do - and from the
perspective of the external program - if this was possible - would add
significantly to the flexibility of the system.

Hopefully this example has made some sense - does this sound possible?

--
matthew
"TJS" wrote:
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic asp for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsof t.com...
Bob,

I will respond to your response - as the designer I won't feel as bad as TJS
who basically called me an idiot.

There are many reason why I am doing it this way - and I don't believe
that
they are all stupid.

1. Firstly the system was inherited from asp - in asp this idea caused no problems at all and worked very well - other than the overhead of
obtaining
the text from SQL server and writing it to disk - it all worked very well.
2. I'm not sure - but the idea of having 10,000 aspx files sitting on the web server does not make sense - and it would not solve my problem
anyway -
the compiling time would still be there and so would the memory issues.
3. It provides an excellent mix of creating a file that has a combination of data that was only available at the time the file was created and
combining it with functionality available at the time it was displayed.
So - the ability of me to re-design is not possible or desired - wat I
have
works very well - except for the problems of memory and speed.

This was a good idea in asp - so to say that it is simply stupid when I move
to aspx is not an helpful answer and sounds more like a cop-out - what I need
is someone willing to understand the reasons for designing it this way and for someone to attempt to answer my questions.

For example - I have saved the copy of each invoice I send out to clients as
text within a database - this text could have been saved as plain html and forwarded to the client - but I have included a chart within these
invoices
and this chart is a active x control (chart fx - fully interactive) - for the
chart fx control to display correctly it must be served from a server - either aspx or asp page - I also need to load the control with data - so the
only way to do this was through an aspx file.

NOW - it is not possible to re-create the invoice each time someone wants to
view it - much of the data used to create the invoice is not available.
Also - the fact that SQL is involved is not the problem - the problem
comes
about as a result of having 10,000 aspx files and the memory and speed
that
it takes to display them.

So - can anyone help? - can anyone offer some insight into how this can be made more efficient - is it possible to save the compiled code into the server and load this up with the file when it is displayed. This would solve
the speed issue.

Is it possible to get a page to unload once it has been compiled - this would solve the memory issue.

Some constructive answers would be helpful.
--
matthew
"Bob Lehmann" wrote:

> This is just such a dumb way to do things. You should just start over,> and
> brain the idiot who did this to you.
>
> Bob Lehmann
>
> "matvdl" <ma****@discussions.microsoft.com> wrote in message
> news:75**********************************@microsof t.com...
> > I have a system that was originally developed in asp - the pages are> > saved
> in
> > SQL (there are over 10,000 pages) and saved to a temp directory in the> server
> > when requested by a client.
> >
> > I have updated this system and changed the pages that are saved to the> > server as aspx - everything works fine and pages can be served - but> >
> > Its not impossible for a single client to request 100 plus pages in one> > session - as each page is requested it is retrieved from SQL saved to> > the
> > temp directory and compiled - problems are.
> > - Performance is well down on the original asp system - I believe> > that
> > this due to the compiling of the pages in asp.net
> > - The memory usage of server also goes through the roof - there is a> > considerable increase in memory as each page is loaded
> >
> > Are there any solutions to these problems??
> >
> > --
> > matthew
>
>
>


Nov 19 '05 #13
TJS
Quit complaining, I capitalized your name. And yes, you basically did
call him an idiot and he felt the same way and said as much.
Unfortunately, he thought it was me, but later did clear that up.

At least we agree about his approach.
"Bob Lehmann" <no****@dontbotherme.zzz> wrote in message
news:ep**************@tk2msftngp13.phx.gbl...
First, I didn't call matvdl an idiot.

Second, I did say that the methodology was dumb, and the person
_who_designed_it was an idiot.

Third, is your <shift > key broken?

Fourth, I agree with everything else you've said.

Bob Lehmann
"TJS" <no****@here.com> wrote in message
news:er**************@TK2MSFTNGP10.phx.gbl...
first, I didn't call you an idiot, Bob Lehmann did...........
second, you can't take a bad approach and make it efficient....
third, if the classic asp system works, you could leave it in classic
asp
for that piece. classic and .net can run side by side.

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:77**********************************@microsof t.com...
> Bob,
>
> I will respond to your response - as the designer I won't feel as bad
> as
> TJS
> who basically called me an idiot.
>
> There are many reason why I am doing it this way - and I don't believe
> that
> they are all stupid.
>
> 1. Firstly the system was inherited from asp - in asp this idea caused no > problems at all and worked very well - other than the overhead of
> obtaining
> the text from SQL server and writing it to disk - it all worked very well. >
> 2. I'm not sure - but the idea of having 10,000 aspx files sitting on the > web server does not make sense - and it would not solve my problem
> anyway -
> the compiling time would still be there and so would the memory issues.
>
> 3. It provides an excellent mix of creating a file that has a combination > of data that was only available at the time the file was created and
> combining it with functionality available at the time it was displayed.
>
> So - the ability of me to re-design is not possible or desired - wat I
> have
> works very well - except for the problems of memory and speed.
>
> This was a good idea in asp - so to say that it is simply stupid when I
> move
> to aspx is not an helpful answer and sounds more like a cop-out - what
> I
> need
> is someone willing to understand the reasons for designing it this way and > for someone to attempt to answer my questions.
>
> For example - I have saved the copy of each invoice I send out to clients > as
> text within a database - this text could have been saved as plain html and > forwarded to the client - but I have included a chart within these
> invoices
> and this chart is a active x control (chart fx - fully interactive) - for > the
> chart fx control to display correctly it must be served from a server -
> either aspx or asp page - I also need to load the control with data -
> so
> the
> only way to do this was through an aspx file.
>
> NOW - it is not possible to re-create the invoice each time someone wants > to
> view it - much of the data used to create the invoice is not available.
>
> Also - the fact that SQL is involved is not the problem - the problem
> comes
> about as a result of having 10,000 aspx files and the memory and speed
> that
> it takes to display them.
>
> So - can anyone help? - can anyone offer some insight into how this can be > made more efficient - is it possible to save the compiled code into the
> server and load this up with the file when it is displayed. This would
> solve
> the speed issue.
>
> Is it possible to get a page to unload once it has been compiled - this
> would solve the memory issue.
>
> Some constructive answers would be helpful.
> --
> matthew
>
>
> "Bob Lehmann" wrote:
>
>> This is just such a dumb way to do things. You should just start over,
>> and
>> brain the idiot who did this to you.
>>
>> Bob Lehmann
>>
>> "matvdl" <ma****@discussions.microsoft.com> wrote in message
>> news:75**********************************@microsof t.com...
>> > I have a system that was originally developed in asp - the pages are
>> > saved
>> in
>> > SQL (there are over 10,000 pages) and saved to a temp directory in the >> server
>> > when requested by a client.
>> >
>> > I have updated this system and changed the pages that are saved to the >> > server as aspx - everything works fine and pages can be served - but
>> >
>> > Its not impossible for a single client to request 100 plus pages in one >> > session - as each page is requested it is retrieved from SQL saved to >> > the
>> > temp directory and compiled - problems are.
>> > - Performance is well down on the original asp system - I believe
>> > that
>> > this due to the compiling of the pages in asp.net
>> > - The memory usage of server also goes through the roof - there
>> > is a >> > considerable increase in memory as each page is loaded
>> >
>> > Are there any solutions to these problems??
>> >
>> > --
>> > matthew
>>
>>
>>



Nov 19 '05 #14
DLM,

Thanks for your response.

I don't believe that SQL is my problem - the time it takes to do the remote
calls to the external program has not presented any real limitations or
caused any significant time delays. The problem that I have is trying to
deal with the quantity of files that I have and the compiling of those pages
once they have been returned. Any time issues that I have had are the result
of parsing and compiling the page.

I understand your point - that asp.net was designed to have the pages
pre-compiled before requests were made for the pages - I guess this is one
of the fundamental differences between asp and asp.net. From experience, the
design that has been implemented had no significant disadvantages when
working with asp - in fact I believe that it had some very significant
advantages - mainly due to the unlimited number of files it could manage.

Question - is it practical to have 10,000 pages in a single asp.net
application? I wouldn't have thought so.

I have no alternative but to deal with this many files, as it is a legal
requirement of the industry I work in to be able to provide the original
copies of invoices to clients - I can't go back to the original data and
attempt to re-produce the same invoice in case the data has changed.

So - considering this is what I am stuck with - my problem mainly relates to
the memory increases - the slowness of the request could be dealt with by
more grunt - but the ever-increasing memory is more difficult. I believe
that this is the result of the creation of a new class each time a new page
is displayed - although this page gets unloaded once the info is returned -
are the increases in memory the result of the new class definition in the
asp.net application?

If this is the case - is it possible to remove this class definition once it
has been created? I have tried to delete the file - but this does not appear
to fix the problem.

Does any of this make sense?
--
matthew
"DLM" wrote:
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time a ASP
.NET Page is requested, and this is true with the page sitting in it's
virtual directory on IIS. What you seem to be doing is increasing the round
trip time for any given request/response for a page to include the time it
takes to get the page from SQL Server. And this added time is database access
time which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very
start, and not make the ASP .NET worker process wait for the files before it
can dynamically compile pages. In a sense you are using SQL Server as a web
server to your web server, which is highly unusual. You need to find a way to
place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES

Nov 19 '05 #15
The following suggestion may or may not work for you, but I'll chime in.

First off, to the folks that called the approach your existing app uses
"idiotic", I've seen classic ASP applications design this way that work
extremely well, are scalable and extremely manageable. However, the devil is
in the details. Just putting "pages" in IMAGE or TEXT fields on a SQL
database isn't some sort of magic pixie dust that solves all problems. It
takes a lot of thought and experimentation to get just right, and the
rewards in most cases offset the negative aspects of these designs.

Having said that, your approach is, as you've figured out by now, wrong.
Others have already mentioned the perf hit you're taking by forcing ASP.NET
to function in a way it wasn't designed to. But there are ways to get around
the problem.

First off, I'll assume that you have more content than code. That is, *most*
of your existing ASP/ASPX pages in the database contain stuff, like invoices
and whatnot, that do not require intervention by the parser to load and
display. If this is not the case then stop reading now =)

There are a lot of web applications nowadays that use their databases as
primary content storage repositories. Blogs are a good example of this.
However, they do not store code in the database, they store content. If you
can adapt to this type of scenario then all you have to do is create a sort
of "master page" or maybe an IHttpHandler implementation that looks at the
request (even a path), pulls the *content* from the database and then
displays it on a sort of templated page container. Think of a site like
MSDN. Lots of articles, yet all the pages look the same. The content is what
varies. The trick is to simply treat your "pages" as opaque chunks of
content, and "stream" them into a placeholder page that is rendered the same
way every time. You can pretty much do anything you want once you're hooked
into the ASP.NET processing pipeline.
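
A bare-bones sketch of that kind of handler, just to make the idea concrete
(the connection string, table name and the inline template here are
placeholders, and error handling is left out):

using System.Data.SqlClient;
using System.Web;

// Maps a request such as content.ashx?id=123 to a row in a content table
// and streams the stored HTML into a fixed page template.
public class ContentHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string id = context.Request.QueryString["id"];
        string body;

        using (SqlConnection conn = new SqlConnection(
            "server=(local);database=Content;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Html FROM ContentPages WHERE PageId = @id", conn))
        {
            cmd.Parameters.Add(new SqlParameter("@id", id));
            conn.Open();
            body = (string)cmd.ExecuteScalar();
        }

        // The surrounding template is compiled and served the same way every
        // time; only the content pulled from the database varies per request.
        context.Response.ContentType = "text/html";
        context.Response.Write("<html><body>");
        context.Response.Write(body);
        context.Response.Write("</body></html>");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}

The handler gets wired up under <httpHandlers> in web.config, and from there
you can template the output however you like.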

If you do this you'll get rid of your perf problem and you'll still have the
flexibility and manageability of the database storage. This is web
application design 101 - always separate content from function and
structure.

Of course getting from point A to point B could be tricky, but maybe this
gives you some ideas =)

--
Klaus H. Probst, MVP
http://www.simulplex.net/
"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:DF**********************************@microsof t.com...
DLM,

Thanks for your response.

I don't believe that SQL is my problem - the time it takes to do the remote calls to the external program has not presented any real limitations or
caused any significant time-delays. The problem that I have is trying to
deal with the quanitity of files that I have and the combiling of those pages once they have been returned. Any time issues that I have had are the result of parsing and compilling the page.

I understand your point - that asp.net was designed to have the pages
pre-compiled before requests where made for the pages - I guess this is one of the fundamental differances between asp and asp.net. From experiance the design that has been implemented had no significant disadvantages when
working with asp - in fact I believe that it had some very significant
advantages - mainly due to the unlimited number of files it could manage.

Question - is it practicale to have 10,000 pages in a single asp.net
application. I wouldn't have thought so?

I have no alternative but to deal with this many files as it is a legal
requirment of the industry I work in is to be able to provide the original
copies of invoices to clients - I can't go back to the original data and
attempt to re-produce the same invoice just in case the data has changed.

So - considering this is what I am stuck with - my problem mainly relates to the memory increases - the slowness of the request could be dealt with by the more grunt - but the ever increasing memory is more difficult. I believe
that this is the result of the creation of a new class each time a new page is displayed - although this page gets un-loaded once the info is returned - are the increases in memory the result of the new class definition in the
asp.net application?

If this is the case - is it possible to remove this class definition once it has been created? I have tried to delete the file - but this does not appear to fix the problem.

Does any of this make sense.
--
matthew
"DLM" wrote:
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time a ASP .NET Page is requested, and this is true with the page sitting in it's
virtual directory on IIS. What you seem to be doing is increasing the round trip time for any given request/response for a page to include the time it takes to get the page from SQL Server. And this added time is database access time which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very start, and not make the ASP .NET worker process wait for the files before it can dynamically compile pages. In a sense you are using SQL Server as a web server to your web server, which is highly unusual. You need to find a way to place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES

Nov 19 '05 #16
To understand why the solution you had working in classic asp doesn't perform
in .net, you have to understand the internals of how these systems work.

In asp, every page was interpreted at runtime. It looked for your <% %> tags
and interpreted the code between them in real time as it ran each page. If your
page didn't contain much server-side code and was mostly content then there
wasn't much to do. Basically the system was designed to look at each page
each time it was hit, as necessary. This is why almost all systems (except
yours) are faster in .net.

.net on the other hand "compiles" every page into IL code when the page
changes. This IL code is then executed by the runtime engine.
Compiling is a lot of work. It outputs new code, checks syntax, checks type
casts, checks variable declarations, function calls, and a million other
things. asp would just crap out if it didn't like what it saw. The key here
is that .aspx files are compiled once per folder, and a folder is recompiled
every time a file in it changes. On a normal system, you take a hit once for
each folder that has a page in it requested. Then nothing changes for a long
time, so the compiler may not do anything for months.

In your system, every time you push a page out from SQL to disk you are
telling the compiler that there is a new page that it must compile (not
to mention the file system write, which could well be the slowest thing
you could do on a Windows box). Now if you start loading all 10,000 pages
into the same folder, the work it has to do will get worse as more pages
are requested, since there will be more pages to compile.

The key to making your system work is to do some major rearchitecting, or
leave it in classic asp. Why do you want to change? If you rearchitect, you
need to store only "data" in the database and keep the code in compiled
files on the server. Then you load the pages with data from the db. This is
how content management systems work.

An example would be your invoice pages. I assume that many of your "pages"
in the database are invoices. You need to just store them as text, along with
whatever data you need to make the page, in SQL Server (a table called
Invoice?) and then have a page called invoice.aspx on the server that uses
code to grab the data and format it for display (including your graph control).
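
As a rough illustration only - the Invoice table, its columns and the
connection string below are made up, not a prescription - a single compiled
invoice.aspx could look something like this in its code-behind:

using System;
using System.Data.SqlClient;
using System.Web.UI;
using System.Web.UI.WebControls;

// Code-behind for one invoice.aspx that serves every stored invoice.
// The page class is compiled once; only the data changes per request.
public class InvoicePage : Page
{
    // Matches <asp:Literal id="InvoiceBody" runat="server" /> in the markup.
    protected Literal InvoiceBody;

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        string invoiceId = Request.QueryString["id"];
        using (SqlConnection conn = new SqlConnection(
            "server=(local);database=Billing;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT BodyHtml FROM Invoice WHERE InvoiceId = @id", conn))
        {
            cmd.Parameters.Add(new SqlParameter("@id", invoiceId));
            conn.Open();
            InvoiceBody.Text = (string)cmd.ExecuteScalar();
        }
        // The chart data would be loaded the same way and handed to the
        // chart control here instead of being baked into a stored page.
    }
}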

It really has never been a "good" design to put code in a database. It just
happens that classic asp was stupid enough that you could get away with it.
You need to leverage the systems for what they are good at; IIS/.net for page
serving and SQL server for data storage.
Nov 19 '05 #17
Klaus,

Thank you for your help - I must admit that at the beginning I was a little
taken aback by people's negative responses to the problem - I believe that
people were far too quick to make judgements without bothering to understand
the problem - I think in almost every response I mentioned that I have around
10,000 pages - and no one bothered to provide much help with the specific
problem. I think the method I have used is not too bad - it just needs to be
modified to work efficiently with asp.net.

Yes - most of the pages are mostly content and there is not a lot of code
in each page. So I am trying to work out how to implement something similar
to your suggestion - but still provide the flexibility of enabling server
side code where needed.

What can make this a little difficult is my lack of experience in asp.net
- but hopefully that will improve.

The solution I was looking at would be to create some specific server-side
asp.net controls for some of the more specific tasks that I have - for
instance, I need to load charts at the server and therefore need to be able
to provide the data to these charts - sometimes this could be a lot of data.
I want the data to be saved in the page, so I thought that an asp.net control
could be the way to go - I could pass the data to the control and this
control would already be defined on the server.

So - if I have a string of HTML text with an embedded asp.net server control
and I use the response.write function, will asp.net recognize that there is a
control within the document and load it accordingly?

So there wouldn't be any <% %> within the html - just the server controls.
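
(From what I can tell, Response.Write on its own will not instantiate server
controls - the closest thing I have found is TemplateControl.ParseControl,
which can turn a markup fragment into live controls at run time, roughly as
in the sketch below, although I gather it will not accept <% %> code blocks.
The PlaceHolder name and the method here are only a sketch.)

using System.Web.UI;
using System.Web.UI.WebControls;

public class FragmentHostPage : Page
{
    // Matches <asp:PlaceHolder id="ContentHolder" runat="server" /> in the markup.
    protected PlaceHolder ContentHolder;

    // Parses a stored markup fragment into server controls and adds them to
    // the page. The fragment may contain server controls but no <% %> code
    // blocks - ParseControl rejects those.
    protected void LoadFragment(string markupFromDatabase)
    {
        Control parsed = ParseControl(markupFromDatabase);
        ContentHolder.Controls.Add(parsed);
    }
}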

If this doesn't work, the only way that I can imagine doing it would be to
implement my own method of tags - search for them at run-time, extract
the code between them and use the Eval function to evaluate the code. It does
not seem like a good solution - but I think it would work.

Part of my aim is to limit dependencies - once the invoice or file is
produced, ideally I need future requests to only look at this one file for it
to display correctly - hence the reasoning for embedding the data within the
file - but that does not mean that the code could not be pulled out of the
file and placed elsewhere. Having as few dependencies as possible on what is
required to display the page was part of the original design, and one that
ensures that future system changes will not inadvertently make viewing older
invoices fail. Something that has worked well over the past 5 years.

Are these the sorts of ideas you were thinking of?
--
matthew
"Klaus H. Probst" wrote:
The following suggestion may or may not work for you, but I'll chime in.

First off, to the folks that called the approach your existing app uses
"idiotic", I've seen classic ASP applications design this way that work
extremely well, are scalable and extremely manageable. However, the devil is
in the details. Just putting "pages" in IMAGE or TEXT fields on a SQL
database isn't some sort of magic pixie dust that solves all problems. It
takes a lot of thought and experimentation to get just right, and the
rewards in most cases offset the negative aspects of these designs.

Having said that, your approach is, as you've figured out by now, wrong.
Others have already mentioned the perf hit you're taking by forcing ASP.NET
to function in a way it wasn't designed to. But there are ways to get around
the problem.

First off, I'll assume that you have more content than code. That is, *most*
of your existing ASP/ASPX pages in the database contain stuff, like invoices
and whatnot, that do not require intervention by the parser to load and
display. If this is not the case then stop reading now =)

There are a lot of web applications nowadays that use their databases as
primary content storage repositories. Blogs are a good example of this.
However, they do not store code in the database, they store content. If you
can adapt to this type of scenario then all you have to do is create a sort
of "master page" or maybe an IHttpHandler implementation that looks at the
request (even a path), pulls the *content* from the database and then
displays it on a sort of templated page container. Think of a site like
MSDN. Lots of articles, yet all the pages look the same. The content is what
varies. The trick is to simply treat your "pages" as opaque chunks of
content, and "stream" them into a placeholder page that is rendered the same
way every time. You can pretty much do anything you want once you're hooked
into the ASP.NET processing pipeline.

If you do this you'll get rid of your perf problem and you'll still have the
flexibility and manageability of the database storage. This is web
application design 101 - always separate content from function and
structure.

Of course getting from point A to point B could be tricky, but maybe this
gives you some ideas =)

--
Klaus H. Probst, MVP
http://www.simulplex.net/
"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:DF**********************************@microsof t.com...
DLM,

Thanks for your response.

I don't believe that SQL is my problem - the time it takes to do the remote
calls to the external program has not presented any real limitations or
caused any significant time-delays. The problem that I have is trying to
deal with the quantity of files that I have and the compiling of those pages
once they have been returned. Any time issues that I have had are the result
of parsing and compiling the page.

I understand your point - that asp.net was designed to have the pages
pre-compiled before requests were made for the pages - I guess this is one
of the fundamental differences between asp and asp.net. From experience the
design that has been implemented had no significant disadvantages when
working with asp - in fact I believe that it had some very significant
advantages - mainly due to the unlimited number of files it could manage.

Question - is it practical to have 10,000 pages in a single asp.net
application? I wouldn't have thought so.

I have no alternative but to deal with this many files, as a legal
requirement of the industry I work in is to be able to provide the original
copies of invoices to clients - I can't go back to the original data and
attempt to re-produce the same invoice just in case the data has changed.

So - considering this is what I am stuck with - my problem mainly relates to
the memory increases - the slowness of the requests could be dealt with by
more grunt - but the ever increasing memory is more difficult. I believe
that this is the result of the creation of a new class each time a new page
is displayed - although this page gets un-loaded once the info is returned -
are the increases in memory the result of the new class definition in the
asp.net application?

If this is the case - is it possible to remove this class definition once it
has been created? I have tried to delete the file - but this does not appear
to fix the problem.

Does any of this make sense?
--
matthew
"DLM" wrote:
Go to this link:
http://www.theserverside.net/books/i...ngPerf/pag.tss

We all know that there is an initial performance 'hit' the first time an ASP.NET page is requested, and this is true with the page sitting in its virtual directory on IIS. What you seem to be doing is increasing the round trip time for any given request/response for a page to include the time it takes to get the page from SQL Server. And this added time is database access time, which is probably longer than any HTTP request.

I have never seen this kind of architecture before. The whole idea is to
serve pages from a highly scalable architecture such as IIS from the very start, and not make the ASP.NET worker process wait for the files before it can dynamically compile pages. In a sense you are using SQL Server as a web server to your web server, which is highly unusual. You need to find a way to place all your pages on the web server and use SQL Server only for data
retrieval/update.

TAKE A LOOK AT MICROSOFT PATTERNS & PRACTICES


Nov 19 '05 #18
DancesWithBamboo,

Thankyou for your help - your response I believe hits the nail on the head.

I am trying to look at my options on pulling the code out of the pages and
placing it in asp.net - not sure exactly how to do it - but am looking at
different options. If you look at one of my recent replies to Klaus I have
outlined a few of my ideas.

I found the asp solution very flexible and it provided a good solution to
the issues I was having - but I was forced to move to asp.net due to
reliability issues and a need to use web services. Asp pages still get served
occasionally from the server, as they were created 3-4 years ago, and the
problems I have been having with asp.net have not really caused a great
problem until recently. I guess it's time to do some more upgrading.

Thanks for your help.

--
matthew
"DancesWithBamboo" wrote:
To understand why the solution you had working in clasic asp doesn't perform
in .net, you have to understand the internals of how these systems work.

In asp, every page was interpreted at runtime. It looked for your <% %> tags
and interpreted the code between them in real time as it ran each page. If your
page didn't contain much server-side code and was mostly content then there
wasn't much to do. Basically the system was designed to look at each page
each time it was hit, as necessary. This is why almost all systems (except
yours) are faster in .net.

.net on the other hand "compiles" every page into IL code when the page
changes. This IL code is then interpreted in real time by the runtime engine.
Compiling is a lot of work. It outputs new code, checks syntax, checks type
casts, checks variable declarations, function calls, and another million
things. asp would just crap out if it didn't like what it saw. The key here
is that .aspx files are compiled once per folder, and a folder is recompiled
every time a file in it changes. On a normal system, you take a hit once for
each folder that has a page requested in it. Then nothing changes for a long
time, so the compiler may not do anything for months.

In your system, every time you push a page out from SQL to disk you are
instructing the compiler that there is a new page that it must compile (not
to mention the file system write, which could possibly be the slowest thing
you could do on a Windows box). Now if you start loading all 10,000 pages
into the same folder, the work it has to do will get worse as more pages
are requested, since there will be more pages to compile.

The key to making your system work is to do some major rearchitecting or
leave it in classic asp. Why do you want to change? If you rearchitect, you
need to store only "data" in the database and keep the code in compiled files
on the server. Then you load the pages with data from the db. This is how
content management systems work.

An example would be your invoice pages. I assume that many of your "pages"
in the database are invoices. You need to just store them as text along with
whatever data you need to make the page in SQL Server (table called Invoice?)
and then have a page called invoice.aspx on the server that uses code to grab
the data and format it for display (including your graph control).
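
A rough cut of that invoice.aspx could look like the code below - the
InvoiceLine table, column names and connection string are placeholders for
whatever your real schema is, and the page assumes a DataGrid declared in the
.aspx markup:

// Sketch of a single invoice.aspx that is compiled once and reused for every
// invoice; only the data comes from SQL Server per request.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI;
using System.Web.UI.WebControls;

public class InvoicePage : Page
{
    // Declared in the markup as <asp:DataGrid id="LineItems" runat="server" />
    protected DataGrid LineItems;

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        int invoiceId = int.Parse(Request.QueryString["id"]);

        using (SqlConnection cn = new SqlConnection(
                   "server=.;database=Billing;integrated security=SSPI"))
        {
            SqlDataAdapter da = new SqlDataAdapter(
                "SELECT * FROM InvoiceLine WHERE InvoiceId = @id", cn);
            da.SelectCommand.Parameters.Add("@id", SqlDbType.Int).Value = invoiceId;

            DataSet ds = new DataSet();
            da.Fill(ds, "Lines");

            // Bind the stored data to the one compiled page; the graph control
            // would be fed from the same DataSet.
            LineItems.DataSource = ds.Tables["Lines"];
            LineItems.DataBind();
        }
    }
}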

It really has never been a "good" design to put code in a database. It just
happens that classic asp was stupid enough that you could get away with it.
You need to leverage the systems for what they are good at: IIS/.net for page
serving and SQL Server for data storage.

Nov 19 '05 #19
Hi,
Convert the aspx pages to controls and use LoadControl from a single aspx by
looking up the ID of the SQL-stored control! You could also cache the
controls to improve performance.
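
If it helps, the caching side could be as simple as parking the markup text
pulled from SQL in the ASP.NET cache (rather than the control instances
themselves), along the lines of the sketch below - GetMarkupFromSql and the
30 minute window are just placeholders:

// Cache the control markup fetched from SQL so repeat requests skip the
// database round trip. A fresh control instance is still built per request;
// only the text is reused.
using System;
using System.Web;
using System.Web.Caching;

public class MarkupCache
{
    public static string GetMarkup(int controlId)
    {
        string key = "control-markup-" + controlId;
        string markup = (string)HttpRuntime.Cache[key];

        if (markup == null)
        {
            markup = GetMarkupFromSql(controlId);   // hypothetical data-access helper
            HttpRuntime.Cache.Insert(
                key, markup, null,
                DateTime.Now.AddMinutes(30),        // arbitrary expiry window
                Cache.NoSlidingExpiration);
        }

        return markup;
    }

    private static string GetMarkupFromSql(int controlId)
    {
        // e.g. SELECT MarkupText FROM StoredControl WHERE Id = @id
        throw new NotImplementedException("placeholder for the real query");
    }
}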

regards
Brucek

"matvdl" wrote:
I have a system that was originally developed in asp - the pages are saved in
SQL (there are over 10,000 pages) and saved to a temp directory in the server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew

Nov 19 '05 #20
You could also use ParseControl to build them from a string pulled from the
database; LoadControl would have to load from a file.
Basically in pseudocode:
1. Put a placeholder in an aspx page
2. Get the id of the stored control from the calling page
3. Get the text of the ascx from the database
4. New Control = ParseControl(text)
5. Placeholder.Controls.Add(Control)

Place steps 2-5 in a loop if there is more than one control per page

This way you get one aspx with one or more content controls, without
redirects etc.
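
In C# code-behind those steps come out roughly as below. StoredControl,
MarkupText and the connection string are placeholder names, and ContentHolder
is the <asp:PlaceHolder> from step 1:

// One host page, compiled once; the stored markup is parsed into a control
// tree at run-time and dropped into the placeholder.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI;
using System.Web.UI.WebControls;

public class HostPage : Page
{
    // Step 1: <asp:PlaceHolder id="ContentHolder" runat="server" /> on the .aspx
    protected PlaceHolder ContentHolder;

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        // Step 2: id of the stored control comes in on the query string.
        int id = int.Parse(Request.QueryString["id"]);

        // Step 3: fetch the ascx text from the database.
        string markup;
        using (SqlConnection cn = new SqlConnection(
                   "server=.;database=Docs;integrated security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(
                   "SELECT MarkupText FROM StoredControl WHERE Id = @id", cn))
        {
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
            cn.Open();
            markup = (string)cmd.ExecuteScalar();
        }

        // Steps 4 and 5: parse the text into a control and add it to the page.
        Control c = ParseControl(markup);
        ContentHolder.Controls.Add(c);
    }
}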

Regards
Bruce
"Bruce" wrote:
Hi,
Convert the aspx pages to controls and use Loadcontrol from a single aspx by
looking up the ID of the SQL stored control! As well you could cache the
controls to improve performance.

regards
Brucek

"matvdl" wrote:
I have a system that was originally developed in asp - the pages are saved in
SQL (there are over 10,000 pages) and saved to a temp directory in the server
when requested by a client.

I have updated this system and changed the pages that are saved to the
server as aspx - everything works fine and pages can be served - but

Its not impossible for a single client to request 100 plus pages in one
session - as each page is requested it is retrieved from SQL saved to the
temp directory and compiled - problems are.
- Performance is well down on the original asp system - I believe that
this due to the compiling of the pages in asp.net
- The memory usage of server also goes through the roof - there is a
considerable increase in memory as each page is loaded

Are there any solutions to these problems??

--
matthew

Nov 19 '05 #21
Matthew,

"matvdl" <ma****@discussions.microsoft.com> wrote in message
news:EF**********************************@microsoft.com...
So - if I have a string of HTML text with an embedded asp.net server control and I use the response.write function, will asp.net recognize that there is a control within the document and load it accordingly?
No, the template parser is long done by the time you get to write to the HTTP
output stream.
If this doesn't work, the only way that I can imagine doing it would be to
implement my own method of tags - search for them at run-time, extract
the code between them and use the Eval function to evaluate the code. It does
not seem like a good solution - but I think it would work.
You can embed type names or something like that, and then use some sort of
class factory implementation to instantiate the "controls" that need to
render themselves. Pass the HttpContext reference to them or some such so
they can play. Remember that a "control" in ASP.NET is something that just
renders some HTML. They're nothing special and they don't have to be loaded
from disk or have an ASCX extension (although that has its advantages).
Still, avoid that sort of thing as much as you can.
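
For what it's worth, one reading of the type-name idea is sketched below. The
IInvoiceRenderer interface, the factory and the type names are purely
illustrative - not an existing API, just the general shape of the thing:

// Store a type name alongside the invoice record, create the renderer through
// a small factory, and let it write straight to the response.
using System;
using System.Web;

public interface IInvoiceRenderer
{
    void Render(HttpContext context, string storedMarkup);
}

public class RendererFactory
{
    public static IInvoiceRenderer Create(string typeName)
    {
        // typeName would come out of the database row, e.g.
        // "MyApp.Renderers.ChartInvoiceRenderer, MyApp".
        Type t = Type.GetType(typeName, true);
        return (IInvoiceRenderer)Activator.CreateInstance(t);
    }
}

public class PlainInvoiceRenderer : IInvoiceRenderer
{
    public void Render(HttpContext context, string storedMarkup)
    {
        // A "control" in this sense is just something that emits HTML.
        context.Response.Write(storedMarkup);
    }
}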
Part of my aim is to limit the dependencies around the stored data - once
the invoice or file is produced, ideally I need future requests to only look
at this one file for it to display correctly - hence the reasoning for
embedding the data within the file - but that does not mean that the code
couldn't be pulled out of the file and placed elsewhere. Having as few
dependencies as possible on what is required to display the page was part of
the original design and one that ensures that future system changes will not
inadvertently make viewing older invoices fail. Something that has worked
well over the past 5 years.


Ideally you never store data with presentation metadata (your markup), but I
will recognize that in some cases it makes sense. But storing the data with
the markup is not a good idea just because you think the application will
change someday - you design applications to be resilient and adaptable to
most foreseeable changes.

--
Klaus H. Probst, MVP
http://www.simulplex.net/
Nov 19 '05 #22

This thread has been closed and replies have been disabled. Please start a new discussion.
