Bytes | Developer Community

Retrieve and display 3rd party pages

Hi,

I'm wondering if you can show me an example PHP page that allows me to
retrieve and display 3rd-party pages. I.e.,

when my php script is called as http://site/path/getit.php?file_key
it can retrieve from http://3rd.party/path2/name-with-file_key.htm
and display the result back to the browser.

I don't know PHP and am just starting to learn it, so I hope you can give me a
full PHP page, not a fragment. I don't think it'd be too long, though. Luckily,
I'm a programmer, so you don't need to comment it much, unless it is too
convoluted.

Further, if you can throw in handling of gzipped files, that'd be super.

If you have Firefox, it can automagically gunzip gzipped files, like

http://www.gnu.org/software/bash/manual/bashref.html.gz

So I hope the script http://site/path/getit.php?bashref

can display the content of

http://www.gnu.org/software/bash/manual/bashref.html.gz

thanks
--
Tong (remove underscore(s) to reply)
http://xpt.sourceforge.net/
--
Posted via a free Usenet account from http://www.teranews.com

Oct 29 '06 #1
Hi,

You could fetch a remote URL using PHP's URL wrappers. For example,
readfile can stream the contents of a remote URL:

readfile('http://site/content/page.html');
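Putting that together, a minimal getit.php along the lines you described might look like this. The third-party base URL and filename pattern below are placeholders from your example, and buildRemoteUrl is just an illustrative helper:

```php
<?php

// Map a file key onto the third-party URL. The base URL and
// naming pattern are placeholders -- substitute the real ones.
function buildRemoteUrl($fileKey)
{
    return 'http://3rd.party/path2/name-with-' .
        rawurlencode($fileKey) . '.htm';
}

// Called as getit.php?file_key
$fileKey = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';

if (preg_match('/^[A-Za-z0-9_-]+$/', $fileKey)) {
    header('Content-Type: text/html');
    // Fetching a remote URL this way requires allow_url_fopen
    readfile(buildRemoteUrl($fileKey));
} else {
    header('HTTP/1.0 400 Bad Request');
    echo 'Missing or invalid file key.';
}

?>
```

The key whitelist regex is there so the script can't be used as an open proxy for arbitrary URLs.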

You could also do this with mod_proxy:

ProxyPass /my/mirror/ http://site/content/

Requests for /my/mirror/* will be internally sent to the remote site.
If the page contains internal links that use absolute URLs, you may
want to use mod_proxy_html to rewrite those URLs or use an output
buffer handler. More info on using mod_proxy_html:

<http://www.wlug.org.nz/ApacheReverseProxy>
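If you take the mod_proxy route, the full configuration might look roughly like this (paths are from the example above; check which proxy modules your Apache build actually ships):

```apache
# Forward /my/mirror/* to the remote site, and fix up
# Location headers in redirects on the way back
ProxyPass        /my/mirror/ http://site/content/
ProxyPassReverse /my/mirror/ http://site/content/

# Rewrite absolute links inside the proxied HTML
# (requires the third-party mod_proxy_html module)
<Location /my/mirror/>
    SetOutputFilter proxy-html
    ProxyHTMLURLMap http://site/content/ /my/mirror/
</Location>
```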

You could also use a proxy application such as PHP Proxy
(www.phpproxy.com).

Since many browsers can automatically decompress gzip-compressed
content, you wouldn't need to worry about decompressing it yourself. If
you still wanted to do this however, you could use a shell command.
Some code to do that:

<?php

// Sample usage:
//
//   zcatUrl('http://site/file.html.gz');
//
// Decompresses the remote gzipped URL and streams the content
// to the browser.

function zcatUrl($url)
{
    // Collect wget's stderr in a temp file so it can be reported on failure
    $errors = tempnam('/tmp', 'zcatUrl-');
    $cmd = 'wget -O - ' . escapeshellarg($url) .
           ' 2>' . escapeshellarg($errors) . ' | zcat';
    passthru($cmd, $exitCode);

    if ($exitCode != 0) {
        $msg = "Command \"$cmd\" failed with exit code $exitCode";
        $err = false;
        if (is_file($errors)) {
            ($err = file_get_contents($errors)) or
                trigger_error("Failed to read file $errors.");
        }
        $msg .= $err !== false ? ": $err" : '.';
        trigger_error($msg);
    }

    if (is_file($errors)) {
        unlink($errors) or
            trigger_error("Failed to delete file $errors");
    }
}

?>
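If you'd rather not shell out to wget and zcat at all, PHP's zlib extension can decompress the fetched bytes directly. A sketch (zcatUrl2 is just an illustrative name; gzdecode() needs PHP 5.4+, and on older versions you'd strip the 10-byte gzip header and call gzinflate() instead):

```php
<?php

// Fetch a gzipped file (remote URLs need allow_url_fopen) and
// return the decompressed contents, or false on failure --
// a pure-PHP alternative to piping wget through zcat.
function zcatUrl2($url)
{
    $data = file_get_contents($url);
    return $data === false ? false : gzdecode($data);
}

// Round-trip check with the in-memory counterparts:
echo gzdecode(gzencode('hello world')); // prints "hello world"

?>
```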

Oct 29 '06 #2
On Sat, 28 Oct 2006 22:12:45 -0700, petersprc wrote:
> You could fetch a remote URL using PHP's URL wrappers. For example,
> readfile can stream the contents of a remote URL:
>
> readfile('http://site/content/page.html');
>
> You could also do this with mod_proxy...
>
> Since many browsers can automatically decompress gzip-compressed
> content, you wouldn't need to worry about decompressing it yourself.
First of all, thanks a lot for this comprehensive reply. You've even
covered issues that I hadn't thought about.

Just for the archive: following your advice, and the scripts found at
http://ca.php.net/readfile
I hacked together the following PHP script, which tested OK:

<?php
header("Content-Type: text/html");
// "gzip" is the registered token; older clients also accept "x-gzip"
header("Content-Encoding: gzip");

set_time_limit(0);

// readfile() streams straight to the browser, so no output
// buffering is needed
readfile('http://rute.2038bug.com/index.html.gz');
?>
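A refinement I might try next (untested sketch): only send Content-Encoding: gzip when the client actually advertises support for it, and decompress server-side otherwise:

```php
<?php

// Whether the client's Accept-Encoding header advertises gzip.
function clientAcceptsGzip($acceptEncoding)
{
    return strpos((string) $acceptEncoding, 'gzip') !== false;
}

// Serve the sample file, decompressing server-side when the client
// can't handle gzip. (Skipped under the CLI SAPI, where there is
// no HTTP request to answer.)
if (PHP_SAPI !== 'cli') {
    $url = 'http://rute.2038bug.com/index.html.gz';
    header('Content-Type: text/html');
    set_time_limit(0);

    $accept = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';

    if (clientAcceptsGzip($accept)) {
        header('Content-Encoding: gzip');
        readfile($url);
    } else {
        echo gzdecode(file_get_contents($url)); // gzdecode() is PHP 5.4+
    }
}

?>
```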

Oct 29 '06 #3
