
Search engine friendly URLs slowing up site

I've made the decision to use search-engine-friendly URLs on my site, which
means stripping all parameters out of the URL and converting it
to a hierarchical URL like this:
Change:
/mysite/default.aspx?MenuID=contactus
To:
/mysite/ContactUs.aspx

The problem I'm having is that it's really slowed things down, by at least 0.5
seconds to 1 second longer just to pull up a lightweight static page. Then,
when navigating with the browser's back button, the old parameter URLs
navigated back instantly, but now there's that long delay to translate the URL.
In short, here's my code:

'Get the incoming hierarchical URL here.
Dim oldpath As String = context.Request.Path.ToLower()

'More code here to...
'Parse the page name out of the URL and pass it to the default page as a parameter.
'The default page will then run logic against this, query the db and populate a
'user control in the content section of the default page with dynamic content from the db.

context.RewritePath("default.aspx?MenuID=" & sPage)

This is the slow way to do it.

The fast way was just to pass a URL like this:
default.aspx?MenuID=123

Evidently the holdup is in context.RewritePath.

This line of code is many times slower than executing hundreds of lines of
normal code, including hitting the db several times. Is there a way to speed
this up? Also, what considerations do I need to take into account when I'm
caching pages on the client, and/or caching pages on the server?
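
For reference, here is a minimal, self-contained sketch of the HttpModule rewrite
pattern described above, assuming ASP.NET 1.x-era APIs; the module name, route
table entries, and paths are illustrative assumptions, not code from the actual site:

Imports System
Imports System.Collections
Imports System.Web

' Illustrative sketch only: the friendly-path-to-MenuID table is built once per
' AppDomain, so each request costs a single hashtable lookup rather than parsing.
Public Class UrlRewriteModule
    Implements IHttpModule

    Private Shared ReadOnly Routes As Hashtable = BuildRoutes()

    Private Shared Function BuildRoutes() As Hashtable
        Dim t As New Hashtable
        ' Keys are stored lowercase so the lookup can reuse Request.Path.ToLower().
        t("/mysite/contactus.aspx") = "contactus"
        t("/mysite/aboutus.aspx") = "aboutus"
        Return t
    End Function

    Public Sub Init(ByVal app As HttpApplication) Implements IHttpModule.Init
        AddHandler app.BeginRequest, AddressOf OnBeginRequest
    End Sub

    Private Sub OnBeginRequest(ByVal sender As Object, ByVal e As EventArgs)
        Dim context As HttpContext = CType(sender, HttpApplication).Context
        Dim key As String = context.Request.Path.ToLower()
        If Routes.Contains(key) Then
            ' Rewrite internally to the original parameterized URL;
            ' the browser still sees the friendly URL.
            context.RewritePath("/mysite/default.aspx?MenuID=" & CStr(Routes(key)))
        End If
    End Sub

    Public Sub Dispose() Implements IHttpModule.Dispose
        ' No resources to release.
    End Sub
End Class

The module would be registered under <httpModules> in web.config.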

Thanks.

--
mo*******@nospam.com
Nov 18 '05 #1

Are you sure that it's context.RewritePath that runs slowly,
and not this part (excerpt from your post)?
"Parse the page name out of the URL and pass it to the default page as a
parameter. The default page will then run logic against this, query the db and
populate a user control in the content section of the default page with
dynamic content from the db."
George.

Nov 18 '05 #2
I haven't done any benchmarking to test each line, but this is what I do
know:
1) I've been running the site for about a month using the method of passing
parameters in the URL (from the client browser), and that ran fast.
2) The only difference between that (old) way and the way I'm doing it now
is the context.RewritePath part. I use context.RewritePath to capture a fake
page name being passed in the URL (instead of a parameter) and change the URL
back to the original URL (of a month ago) where I pass the parameter in the
URL (/default.aspx?MenuID=123). Then all the same original code executes to
populate the products grid or call up static pages like "AboutUs.aspx". In
other words, the difference between the two ways is the context.RewritePath,
because after doing this, the same original code executes as before. I also
have a copy of the old site running in a test domain where I can run them
side by side, and the new way is much slower.

--
mo*******@nospam.com
Nov 18 '05 #3
Hi Moondaddy,

As for the performance (speed) concerns when using context.RewritePath to
provide URL rewriting, here are some of my suggestions:

When we use context.RewritePath to perform URL rewriting, it is only used
for those search-engine-friendly or human-readable/hackable URLs, and
the rewrite occurs in our custom HttpModule. At runtime, when the original
request comes in and enters the custom module, the module checks the raw URL
and replaces it with the actual dynamic URL (with querystring params rather
than a static URL). All these steps cost some additional time, and that is
unavoidable since we want to map those readable/hackable URLs to the actual
URL. However, in most scenarios we can still use the actual dynamic URL in
our own code or page links, such as http://servername/appname/page.aspx?a=a&b=b

Also, I still think the process of parsing the original incoming request
URL and replacing it with the actual URL is a very important step. And as
mentioned in my tech articles on URL rewriting, they'll have a certain
rewrite engine that checks the raw URL and replaces it with the actual URL,
such as by using regexes and an XML configuration file, so as to improve
performance. So I think this is the key point of the whole rewriting engine
we need to take care of. Do you think so?
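
As an illustration of that idea (the class name, pattern, and paths below are
assumptions, not code from the articles mentioned), a rule like this can be
compiled once and reused on every request:

Imports System
Imports System.Text.RegularExpressions

' Illustrative only: a single precompiled rule in the spirit of the
' regex-plus-configuration-file rewrite engines described above.
Public Class RewriteRules
    ' Matches friendly URLs like /mysite/ContactUs.aspx and captures the page name.
    Private Shared ReadOnly PageRule As New Regex( _
        "^/mysite/(?<page>[a-z0-9]+)\.aspx$", _
        RegexOptions.IgnoreCase Or RegexOptions.Compiled)

    ' Returns the internal URL to rewrite to, or Nothing when the rule does not apply.
    Public Shared Function MapUrl(ByVal path As String) As String
        Dim m As Match = PageRule.Match(path)
        If m.Success Then
            Return "/mysite/default.aspx?MenuID=" & m.Groups("page").Value
        End If
        Return Nothing
    End Function
End Class

Because the pattern is built once (RegexOptions.Compiled) rather than parsed per
request, the per-request cost in the module stays small even as the rule set grows.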

In addition, as for client-side or server-side caching, I think this is a
common approach whether or not you're using rewriting. And I think you
can first run a performance test without using URL rewriting, and then look at
the step where you
"parse the page name out of the URL and pass it to the default page as a
parameter".
Improving this step will mostly improve the whole rewrite mechanism. Thanks.
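
On the caching point, one common option (an assumption here, not something
prescribed in this thread) is to output-cache default.aspx and vary the cached
copies by the rewritten MenuID parameter, for example:

<%@ OutputCache Duration="300" VaryByParam="MenuID" %>

It is worth verifying, for the ASP.NET version in use, that the output-cache key
is taken from the rewritten query string when RewritePath is involved.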

Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)

Get Preview at ASP.NET whidbey
http://msdn.microsoft.com/asp.net/whidbey/default.aspx

Nov 18 '05 #4
Hi Moondaddy,

Have you had a chance to check out the suggestions in my last reply, or do you
have any further ideas on this issue? If anything is unclear or if there's
anything else we can help with, please feel free to post here. Thanks.

Regards,

Steven Cheng
Microsoft Online Support

Get Secure! www.microsoft.com/security
(This posting is provided "AS IS", with no warranties, and confers no
rights.)

Get Preview at ASP.NET whidbey
http://msdn.microsoft.com/asp.net/whidbey/default.aspx

Nov 18 '05 #5

This thread has been closed and replies have been disabled. Please start a new discussion.
