
Retrying handled errors

I just have a question about trapping and retrying errors, especially
file locking, database locks, or duplicate key errors.

Is there a way, after you trap an error, to retry the same line that
caused the error?

In many other languages you have the option of retrying certain errors.
In effect, it's like a return to the exact same line. You can then retry
a certain number of times and produce an error if it keeps failing.

Jul 9 '06 #1
8 Replies



ImOk wrote:
> Is there a way after you trap an error to retry the same line that
> caused the error?
All I can think of is to use a loop:

$retries = 0;
while ($result === null && $retries < 3) {
    $retries++;  // do stuff: retry the call that sets $result
}

or you can kill the script on error and then reload the page:

header("Location: {$_SERVER['PHP_SELF']}?retries=" . ($retries + 1));
exit;

meh.. there is a way.. somehow.

Flamer.

Jul 10 '06 #2

flamer di******@hotmail.com wrote:
> ImOk wrote:
>> Is there a way after you trap an error to retry the same line that
>> caused the error?
PHP5 doesn't have that feature, though I'd love to have it for some
situations too.

I think C# has this feature, but I'm not sure.

The best you can do is nest a try/catch block inside a loop and use a
control flag:

$retry = 0;
while ($retry < 5) {
    try { /* ... do the work, then */ break; }
    catch (Exception $e) { $retry++; }
}

Jul 10 '06 #3

"ImOk" <jo**********@gmail.comwrote in message
news:11**********************@h48g2000cwc.googlegr oups.com...
>I just have a question about trapping and retrying errors, especially
>file locking or database locks or duplicate key errors.

I've done something like that, although there is no direct language support
for it; you just have to build it yourself. Anyway, I've got a database
interface class which I use for all queries; it has a built-in error
reporting mechanism which sends errors to my email address. Now, when the
error message contains the word "deadlock" I try to rerun the query, keeping
track of how many times a rerun has been attempted. Before each rerun there
is a random-length sleep of 0.5 ... 1.0 seconds, and then the query is
retried. If it still fails after ten reruns, I get the error report, but in
most cases it runs successfully after 3 or 4 attempts. It's really handy. We
always get those pesky deadlocks once a day when a major task is performed
during which the server hits a performance peak, but thanks to the deadlock
solver the queries eventually get run, even though it takes a little longer.
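Roughly, the retry loop amounts to something like this (a minimal sketch;
the wrapper name, the bare mysql_query() call and the "deadlock" string
match are illustrative, not the actual class):

function query_with_retry($sql, $maxReruns = 10) {
    for ($attempt = 0; $attempt <= $maxReruns; $attempt++) {
        $result = mysql_query($sql);
        if ($result !== false) {
            return $result;                   // query succeeded
        }
        if (stripos(mysql_error(), 'deadlock') === false) {
            break;                            // not a deadlock, no point retrying
        }
        usleep(mt_rand(500000, 1000000));     // random 0.5 ... 1.0 second pause
    }
    // still failing after all the reruns: report it (e.g. mail the error) and give up
    trigger_error('Query failed: ' . mysql_error(), E_USER_WARNING);
    return false;
}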

--
"ohjelmoija on organismi joka muuttaa kofeiinia koodiksi" -lpk
sp**@outolempi.net | Gedoon-S @ IRCnet | rot13(xv***@bhgbyrzcv.arg)
Jul 10 '06 #4


ImOk wrote:
> Is there a way after you trap an error to retry the same line that
> caused the error?
If you use set_error_handler to collect the errors and debug_backtrace
to see what caused the error, it is possible to retry *some* errors.

I could only get it working if the error occurred inside a function. If
you check the error message given to the handler you may be able to
handle other errors too, though I'm not sure it can be done.

Anyhow, the code below will retry 3 times when an error occurs in any
and all functions.

<?php

error_reporting(0);

function retryErrorHandler($errNo, $errMsg) {

    static $attempts = 1;
    $maxAttempts = 3;

    $backTrace = debug_backtrace();

    print "<br><br>Attempt $attempts<br>$errMsg<br>";

    if ($attempts >= $maxAttempts) {
        $attempts = 1;
    } else if (isset($backTrace[1]) &&
               isset($backTrace[1]['function']) && isset($backTrace[1]['args'])) {

        // $backTrace[0] is info about the call to retryErrorHandler;
        // we need info about the previous function that triggered the error,
        // so $backTrace[1] is used if a function + its arguments are available

        $attempts += 1;
        $func = $backTrace[1]['function'];
        $args = $backTrace[1]['args'];

        print "Retry function $func <br>";

        // Hmm... the error handler is only called the second time round
        // if the handler is reset
        set_error_handler('retryErrorHandler');
        call_user_func_array($func, $args);

    } else {
        print "Can't retry error: $errMsg<br>";
        $attempts = 1;
    }

}

$old_error_handler = set_error_handler('retryErrorHandler');

// trigger a few errors...

// will not retry: not enough info returned by debug_backtrace to try again
$i = UNDEFINED_THING;

// not enough info to retry include() errors - the dodgy filename isn't in
// the backtrace details; presumably because include is a language construct,
// not a function call
include('is not a file.php');

// will retry three times
fopen('no file', 'r');

// will retry three times
mysql_connect('mysql wont connect');

// will retry user functions as well as internal ones
testFunc();

function testFunc() {
    $x = 3 / 0;
}

?>

Jul 10 '06 #5

This looks like a very good idea. I will have to study it.

But you only need retries for cases where there is a chance of
recovery. Syntax errors or missing includes, for example, will never
recover; neither will divide by zero.

The main purpose of retries, as far as I'm concerned, has to do with
file and resource connections and locks.

Also, using an error handler for unique key checking is better: you
don't have to perform a SELECT first to see if the key already exists.
Just INSERT, see if you get an error, and retry with a different key.
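As a rough sketch of that idea (the table, the columns and the
generate_key() helper are hypothetical; 1062 is MySQL's duplicate-key
error code):

for ($try = 0; $try < 5; $try++) {
    $key = generate_key();   // hypothetical helper that produces a candidate key
    if (mysql_query("INSERT INTO items (id, name) VALUES ('$key', 'widget')")) {
        break;               // the insert worked, so the key was unique
    }
    if (mysql_errno() != 1062) {
        break;               // some other error - no point retrying with a new key
    }
    // duplicate key (1062): loop around and try a different key
}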
Jul 10 '06 #6


ImOk wrote:
> But you only need retries for cases where there is a chance of
> recovery. Syntax errors or missing includes, for example, will never
> recover; neither will divide by zero.

Syntax errors are fatal errors which can't be caught with a user
function (normally, anyways).
> Also, using an error handler for unique key checking is better: you
> don't have to perform a SELECT first to see if the key already exists.
> Just INSERT, see if you get an error, and retry with a different key.

This could also be done using INSERT ... ON DUPLICATE KEY...
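In MySQL that looks like this (the table and columns are just for
illustration, and 'name' is assumed to have a UNIQUE index):

mysql_query("INSERT INTO counters (name, hits) VALUES ('home', 1)
             ON DUPLICATE KEY UPDATE hits = hits + 1");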
Jul 10 '06 #7

Richard Levasseur wrote:
> Syntax errors are fatal errors which can't be caught with a user
> function (normally, anyways).

True, the compiler in theory should catch them up front. But this is
PHP. Anything can happen.

Is INSERT ... ON DUPLICATE KEY standard SQL? It's useful if you want to
update a record if it exists, but not if you mean to insert a new
record with a unique key.

Jul 10 '06 #8

ImOk wrote:
> True, the compiler in theory should catch them up front.

The specific instance I was thinking of was reading about being able to
catch fatal errors by registering an error handler and a shutdown
function. The general idea (IIRC) is that in case of a fatal error, the
error message ends up in the output buffer, which you then parse in your
shutdown function and do what you want with.
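A rough sketch of that output-buffer trick (the check for the string
"Fatal error" is an assumption about PHP's default error output, not
something from the original description):

function check_for_fatal() {
    $output = ob_get_contents();
    if ($output !== false && strpos($output, 'Fatal error') !== false) {
        // do whatever you want with it: log it, mail it, show a friendly page...
        error_log('Caught fatal error: ' . $output);
    }
    ob_end_flush();
}

ob_start();
register_shutdown_function('check_for_fatal');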
> But this is PHP. Anything can happen.

That's because PHP is magical ;)
> Is INSERT ... ON DUPLICATE KEY standard SQL? It's useful if you want
> to update a record if it exists, but not if you mean to insert a new
> record with a unique key.
(It's in MySQL, for those who don't know.)

I don't know, but I can't find any evidence of it being standard; there
is also UPDATE ... IF EXISTS, and the same question probably applies to
that. I wouldn't be surprised if other databases had a similar clause,
though.

Personally, when it comes to duplicates, I try to push that down to the
database as much as possible using some sort of sequences approach.
Just insert NULL and let it automagically make the unique value (not so
easy in MySQL when you need two auto-increments, though - another
reason I like Postgres).
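A minimal sketch of that (MySQL AUTO_INCREMENT; the table is
hypothetical):

mysql_query("INSERT INTO items (id, name) VALUES (NULL, 'widget')");  // id is AUTO_INCREMENT
$newId = mysql_insert_id();  // the key the database just generated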
Jul 11 '06 #9
