
Search engine friendly URLs slowing up site

I've made the decision to use search engine friendly URLs in my site, which
means stripping all parameters out of the URL and converting it to a
hierarchical URL like this:
Change:
/mysite/default.aspx?MenuID=contactus
To:
/mysite/ContactUs.aspx

The problem I'm having is that it's really slowed things down, by at least
0.5 seconds to 1 second longer just to pull up a lightweight static page.
Then, when navigating with the browser's back button: with the old
(parameter) URLs it navigated back instantly, but now there's that long
delay to translate the URL. In short, here's my code:

'Get the incoming hierarchical URL here.
Dim oldpath As String = context.Request.Path.ToLower()

'More code here to...
'Parse the page name out of the URL and pass it to the default page as a
'parameter. The default page will then run logic against this, query the db
'and populate a user control in the content section of the default page
'with dynamic content from the db.

context.RewritePath("default.aspx?MenuID=" & sPage)

This is the slow way to do it.

The fast way was just to pass a URL like this:
default.aspx?MenuID=123

Evidently the holdup is in context.RewritePath.

This line of code is many times slower than executing hundreds of lines of
normal code, including hitting the db several times. Is there a way to speed
this up? Also, what considerations do I need to take into account when I'm
caching pages on the client and/or caching pages on the server?
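
For reference, the wiring is roughly like this (a boiled-down sketch, not my
exact code: the Application_BeginRequest hookup and the one-line parse are
just illustrative stand-ins for the real logic):

' In Global.asax (or an equivalent HttpModule). Boiled-down sketch only;
' the real code has more logic for parsing the page name and deciding
' when to rewrite.
Sub Application_BeginRequest(ByVal sender As Object, ByVal e As EventArgs)
    Dim context As HttpContext = HttpContext.Current

    ' Incoming hierarchical URL, e.g. /mysite/contactus.aspx
    Dim oldpath As String = context.Request.Path.ToLower()

    ' Pull the page name ("contactus") out of the path.
    Dim sPage As String = System.IO.Path.GetFileNameWithoutExtension(oldpath)

    ' Hand default.aspx the parameter it has always expected.
    context.RewritePath("default.aspx?MenuID=" & sPage)
End Sub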

Thanks.

--
mo*******@nospam.com
Nov 18 '05 #1

Are you sure that it's context.RewritePath that runs slowly,
and not this part (excerpt from your post)?
"'Parse the page name out of the url and pass it to the default page as a
parameter.
The default page will then run logic against this, query the db and
populate a user control in the content section of the default page with
dynamic content from the db."

George.
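
One quick way to confirm where the time goes would be to time each step and
write it to the page trace. This is only a rough sketch (ParsePageName
stands in for whatever your parsing code is, and it assumes tracing is
enabled so the numbers show up in trace.axd):

' Rough timing sketch: measure the parse step and RewritePath separately.
Dim t0 As DateTime = DateTime.Now

Dim oldpath As String = context.Request.Path.ToLower()
Dim sPage As String = ParsePageName(oldpath)   ' placeholder for your parsing code

Dim t1 As DateTime = DateTime.Now
context.RewritePath("default.aspx?MenuID=" & sPage)
Dim t2 As DateTime = DateTime.Now

context.Trace.Write("rewrite", _
    "parse: " & t1.Subtract(t0).TotalMilliseconds.ToString() & " ms, " & _
    "RewritePath: " & t2.Subtract(t1).TotalMilliseconds.ToString() & " ms")

If RewritePath itself really is the expensive line, that will show up
immediately in the two numbers.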

Nov 18 '05 #2
I haven't done any benchmarking to test each line, but this is what I do
know:
1) I've been running the site for about a month using the method of passing
parameters in the URL (from the client browser), and that ran fast.
2) The only difference between that (old) way and the way I'm doing it now
is the context.RewritePath part. I use context.RewritePath to capture a
fake page name being passed in the URL (instead of a parameter) and change
the URL back to the original URL (of a month ago) where I pass the
parameter in the URL (/default.aspx?MenuID=123). Then all the same original
code executes to populate the products grid or call up static pages like
"AboutUs.aspx". In other words, the difference between the two ways is the
context.RewritePath, because after doing this the same original code
executes as before. I also have a copy of the old site running in a test
domain where I can run them side by side, and the new way is much slower.

--
mo*******@nospam.com

Nov 18 '05 #3
Hi Moondaddy,

As for the performance (speed) concerns when using context.RewritePath to
provide URL rewriting, here are some of my suggestions:

When we use context.RewritePath to perform URL rewriting, it is only used
for those search-engine-friendly or human readable/hackable URLs, and the
rewrite occurs in our custom HttpModule. At runtime, when the original
request comes in and enters the custom module, the module checks the raw
URL and replaces it with the actual dynamic URL (one with querystring
params rather than a static URL). All of these steps cost some additional
time, and that's unavoidable since we want to map those readable/hackable
URLs to the actual URL. However, in most scenarios we can still use the
actual dynamic URL in our own code or page links, such as
http://servername/appname/page.aspx?a=a&b=b

Also, I still think the process where we parse the original incoming
request URL and replace it with the actual URL is a very important step. As
mentioned in my tech articles on URL rewriting, such solutions have a
rewrite engine that checks the raw URL and replaces it with the actual URL,
for example using regex and an XML configuration file, so as to improve
performance. So I think this is the key point of the whole rewriting engine
we need to take care of. Do you think so?
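
For example, the rules can be compiled once and kept in a Shared field so
that each request only pays for a cheap lookup instead of re-parsing or
re-reading configuration. A rough sketch (the class and member names are
only illustrative, and a real engine would load its patterns from the XML
config file rather than hard-coding one):

Imports System.Web
Imports System.Text.RegularExpressions

Public Class UrlRewriteHelper
    ' Compiled once per AppDomain; matches e.g. /mysite/contactus.aspx
    Private Shared ReadOnly FriendlyUrl As New Regex( _
        "/(?<page>[a-z0-9]+)\.aspx$", _
        RegexOptions.IgnoreCase Or RegexOptions.Compiled)

    Public Shared Sub Rewrite(ByVal context As HttpContext)
        Dim m As Match = FriendlyUrl.Match(context.Request.Path)
        ' Only rewrite friendly page names; leave default.aspx itself alone.
        If m.Success AndAlso m.Groups("page").Value.ToLower() <> "default" Then
            context.RewritePath("default.aspx?MenuID=" & m.Groups("page").Value)
        End If
    End Sub
End Class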

In addition, as for client or server side caching, I think this is a common
approach whether or not you're using rewriting. And I think you can first
try a performance test without using URL rewriting, and then look at the
step where you
"Parse the page name out of the url and pass it to the default page as a
parameter".
Improving this step will mostly improve the whole rewrite mechanism. Thanks.
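
As a sketch of the server-side caching point: after RewritePath,
default.aspx still receives the MenuID querystring parameter, so an
OutputCache directive on that page would normally vary by it, for example
(the duration is just a sample value, and it is worth verifying the cache
really does vary per friendly URL once the rewrite is in place):

<%@ OutputCache Duration="60" VaryByParam="MenuID" %>

On the client side, the browser caches by the friendly URL it actually
requested, so each friendly URL gets its own cache entry like any other
page.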

Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)

Get Preview at ASP.NET whidbey
http://msdn.microsoft.com/asp.net/whidbey/default.aspx

Nov 18 '05 #4
Hi Moondaddy,

Have you had a chance to check out the suggestions in my last reply, or do
you have any further ideas on this issue? If anything is unclear or if
there's anything else we can help with, please feel free to post here. Thanks.

Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)

Get Preview at ASP.NET whidbey
http://msdn.microsoft.com/asp.net/whidbey/default.aspx

Nov 18 '05 #5

