
max_execution_time and fork

I use recursive readdir as the engine for numerous file
processing routines, including generating thumbnails using
getimagesize, imagecreatetruecolor, imagejpeg, etc., where each
resize operation instantiates a new resizer class instance.
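Roughly, the walk looks like this sketch (the thumbnail size,
naming, and JPEG-only filter here are just illustrative, and the
resizer class is elided):

<?php
// Sketch of the recursive readdir walk described above; thumb
// size and naming are made up for illustration.
function make_thumbnails($dir)
{
    $handle = opendir($dir);
    if ($handle === false) {
        return;
    }
    while (($entry = readdir($handle)) !== false) {
        if ($entry == '.' || $entry == '..' || strpos($entry, 'thumb_') === 0) {
            continue;  // skip dot entries and already-made thumbnails
        }
        $path = $dir . '/' . $entry;
        if (is_dir($path)) {
            make_thumbnails($path);            // recurse into subdirectory
        } elseif (preg_match('/\.jpe?g$/i', $entry)) {
            list($w, $h) = getimagesize($path);
            $src = imagecreatefromjpeg($path);
            $tw  = 120;                        // thumb width (arbitrary)
            $th  = (int) ($h * $tw / $w);      // preserve aspect ratio
            $dst = imagecreatetruecolor($tw, $th);
            imagecopyresampled($dst, $src, 0, 0, 0, 0, $tw, $th, $w, $h);
            imagejpeg($dst, $dir . '/thumb_' . $entry, 85);
            imagedestroy($src);
            imagedestroy($dst);
        }
    }
    closedir($handle);
}

make_thumbnails($_SERVER['DOCUMENT_ROOT']);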

But my website (on a shared host) is large, so I run past
max_execution_time if I try to make all the thumbnails at once,
starting from the top of my document_root. My virtual file system
includes a php.ini which I can edit, but bumping max_execution_time
(although the file saves) seems to have no effect, and I don't have
permission to run /usr/sbin/apachectl restart.

Perhaps the php.ini change
will start to work the next time the server reboots.
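In the meantime I can at least try raising the limit from inside
the script itself, something like this (on some shared hosts these
calls are disabled and fail silently):

<?php
// Try to raise the limit at runtime; shared hosts often disable
// this (safe_mode / disable_functions), in which case it has no
// effect and raises no error.
ini_set('max_execution_time', '3600');
set_time_limit(3600);                // restart the timer: 3600 s from now
echo ini_get('max_execution_time');  // verify whether the override took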

Failing that, is there some way to fork a new process
to run the resizer?

I could try to use exec to run command-line PHP, but there must be
a more elegant way to do this. I'd use Perl and ImageMagick, but my
shared host doesn't have the right Perl libs installed, and that's
another can of worms.
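For completeness, the exec route would presumably look something
like this (resize_dir.php is a made-up worker script; the trailing
& detaches it so the web request returns immediately):

<?php
// Hypothetical: hand one directory to command-line PHP in the
// background. resize_dir.php is an invented helper script name.
$dir = $_SERVER['DOCUMENT_ROOT'] . '/photos';
$cmd = 'php resize_dir.php ' . escapeshellarg($dir) . ' > /dev/null 2>&1 &';
exec($cmd);
// The request returns at once; stdout and stderr are discarded,
// so the worker should write its own log.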
Apr 28 '07 #1
5 Replies


Why not just generate the thumbnails for the current directory,
then provide navigation links to the subdirectories? I just started
on a photo album system that works exactly that way:
http://code.google.com/p/zippyphotos/. Check it out; maybe we can
work out some requirements together.
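A per-directory pass might look like this sketch (non-recursive;
the dir parameter and thumb_ naming are illustrative, and the input
would need real sanitizing against path traversal):

<?php
// Sketch: thumbnail only one directory; subdirectories become
// navigation links instead of being descended into.
$base = isset($_GET['dir']) ? $_GET['dir'] : '.';
foreach (scandir($base) as $entry) {
    if ($entry[0] == '.') {
        continue;
    }
    $path = $base . '/' . $entry;
    if (is_dir($path)) {
        // link to the subdirectory rather than recursing
        printf('<a href="?dir=%s">%s/</a><br>',
               urlencode($path), htmlspecialchars($entry));
    } elseif (preg_match('/\.jpe?g$/i', $entry) &&
              strpos($entry, 'thumb_') !== 0) {
        // resize here with your resizer class; one directory's worth
        // of images stays well inside max_execution_time
        echo '<img src="' . htmlspecialchars($base . '/thumb_' . $entry) . '"><br>';
    }
}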

On Apr 28, 12:00 am, sandy <relativ...@isrelative.com> wrote:
> [snip]

Apr 28 '07 #2

On Apr 28, 5:00 am, sandy <relativ...@isrelative.com> wrote:
> [snip]
You could obtain a list of the jobs (or folders) to be processed,
put it in a session, then use an iframe within a main page:

the main page loads the iframe and a JavaScript function which can
refresh the iframe;
when the iframe finishes, it calls processjob.php, which completes job1;
job1 is removed from the array;
the PHP script outputs <script>parent.refreshIframe()</script> to
the iframe;
the iframe reloads and starts the next job.

This is of course a single-thread approach; if you want to emulate
more threads, use more iframes, where the "parent iframe refresher"
now takes the iframe name as an argument. Don't use too many, or
you will probably hit other hard limits on average CPU usage within
a certain time, or just on RAM.
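As a sketch, processjob.php might look like this (the jobs array
and the do_one_job() helper are illustrative stand-ins, assuming
the main page already filled $_SESSION['jobs']):

<?php
// processjob.php -- sketch: run one job per request, then ask the
// parent page to reload this iframe for the next one.
session_start();

function do_one_job($folder)
{
    // stand-in: thumbnail one folder's images here
}

if (!empty($_SESSION['jobs'])) {
    $job = array_shift($_SESSION['jobs']);  // take job1 off the queue
    do_one_job($job);                       // one job stays under the limit
}

if (!empty($_SESSION['jobs'])) {
    // more work left: tell the parent window to refresh this iframe
    echo '<script>parent.refreshIframe();</script>';
} else {
    echo 'All jobs done.';
}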

Apr 28 '07 #3

shimmyshack wrote:
> [snip]
I like that idea; I'll try it. The best solution would be to get a
co-located machine, or a virtual one, so that I control the server.
In the meantime that sounds like a workable hack. Thank you.
Apr 28 '07 #4

On Apr 28, 2:00 pm, sandy <relativ...@isrelative.com> wrote:
> [snip]
I like circumventing the restrictions placed on a shared host; it's
cheaper! The above approach uses jobs of unit size "folder", though.
You could take a more fine-grained approach where you list every
job in a database; the iframes would then be included in each
webpage whenever there are jobs to do, and visitors start a job if
any are pending. Provided you have low-volume usage and record the
last job start time in the db, this would make a great "cron"
emulator, which might be something else your shared service doesn't
allow access to.
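A sketch of that visitor-driven "cron" tick (the jobs table, its
columns, and run_job() are invented for illustration):

<?php
// On each page view, claim and run at most one overdue job.
// Table/column names and run_job() are made up for this sketch.
$db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');

function run_job($id)
{
    // stand-in: do one unit of work (e.g. thumbnail one folder)
}

$row = $db->query(
    "SELECT id FROM jobs
     WHERE done = 0 AND last_start < NOW() - INTERVAL 5 MINUTE
     ORDER BY id LIMIT 1"
)->fetch();

if ($row) {
    // stamp the job first so two simultaneous visitors don't both run it
    $claim = $db->prepare('UPDATE jobs SET last_start = NOW() WHERE id = ?');
    $claim->execute(array($row['id']));

    run_job($row['id']);

    $done = $db->prepare('UPDATE jobs SET done = 1 WHERE id = ?');
    $done->execute(array($row['id']));
}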

Apr 28 '07 #5

shimmyshack wrote:
> [snip]
Not sure about this. I use a home-rolled CMS that reads a source
directory structure, looking for images, image captions, HTML
fragments, link references, etc., and then initializes a schema.
Then I write out the whole website as static HTML. That way I can
manage hundreds of sources and pages.

Pages that *have* to be dynamic remain that way, but most of the
site ends up as static HTML.

...so this is admin functionality for me only, not something I want
to let users invoke in any way.
Apr 28 '07 #6
