I have a site that I need to generate a sitemap for. It is way too big for a desktop program or any of the online tools to crawl. It technically only has 4 pages, but it runs from a database with a couple thousand pages. My plan is to generate a giant sitemap with a PHP file that queries the database, then copy and paste the output into a real XML file, but I hit a problem. I have tried the following code, but no matter what I do, it always seems to take out the < and >:
<?php
$db_host = "localhost";
$db_user = "name";
$db_pwd  = "pass";
$db_name = "db name";

$con = mysql_connect($db_host, $db_user, $db_pwd);
mysql_select_db($db_name);

// Echo one sitemap <url> entry per row in the table
$result = mysql_query("SELECT * FROM db");
while ($data = mysql_fetch_row($result)) {
    echo("
    <url><br />
    <loc>http://www.mysite.com/-$data[0].html</loc><br />
    <priority>0.90</priority><br />
    <changefreq>daily</changefreq><br />
    <lastmod>2008-09-02</lastmod><br />
    </url><br />");
}
?>
And when run, it shows:
http://www.mysite.com/-data.html
0.90
daily
2008-09-02
It echoes everything except the XML tags, which are exactly what I need. Is there a correct way of doing this, or some way to trick it?
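One thing I was planning to try next (I'm not sure this is the right fix) is telling the browser the output is XML instead of HTML, dropping the <br /> tags, and wrapping everything in the <urlset> element the sitemap protocol describes. Below is a rough sketch of what I mean; the Content-Type header and the urlset namespace are my guesses from the sitemaps.org docs, and the query is the same one as above:

<?php
// Sketch only: send an XML content type so the browser doesn't
// treat <url>, <loc>, etc. as unknown HTML tags and hide them.
header("Content-Type: text/xml; charset=utf-8");

$con = mysql_connect("localhost", "name", "pass");
mysql_select_db("db name");

// XML declaration plus the <urlset> wrapper from sitemaps.org
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

$result = mysql_query("SELECT * FROM db");
while ($data = mysql_fetch_row($result)) {
    // Plain newlines instead of <br /> so the XML stays valid
    echo "  <url>\n";
    echo "    <loc>http://www.mysite.com/-" . $data[0] . ".html</loc>\n";
    echo "    <priority>0.90</priority>\n";
    echo "    <changefreq>daily</changefreq>\n";
    echo "    <lastmod>2008-09-02</lastmod>\n";
    echo "  </url>\n";
}

echo "</urlset>\n";
?>

If that gets the tags through, I could probably also skip the copy-and-paste step entirely by saving the script's output straight to a file (for example, running php sitemap.php > sitemap.xml from the command line), but I haven't tested that part yet.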
Thanks in advance,
Bryan