http://dev.mysql.com/doc/refman/5.1/en/innodb-deadlock-detection.html
http://dev.mysql.com/doc/refman/5.1/en/innodb-deadlocks.html
http://www.mysqlperformanceblog.com/2007/02/25/pitfalls-of-converting-to...

I don't know how often ER_LOCK_DEADLOCK will occur on a Drupal website - but we should be able to handle it.

ER_LOCK_DEADLOCK is MySQL-specific. I don't know if other database engines have something similar. We should have an exception that is thrown in this circumstance, regardless of which database engine is being used.
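Something along these lines could translate the MySQL error into a generic exception. This is only a rough sketch - the class name and the wrapper function are invented here, not existing API:

class ExceptionTransactionDeadlock extends Exception {}

function example_query_deadlock_aware($query, array $args = array()) {
  try {
    return db_query($query, $args);
  }
  catch (PDOException $e) {
    // MySQL reports ER_LOCK_DEADLOCK as driver error code 1213.
    if (isset($e->errorInfo[1]) && $e->errorInfo[1] == 1213) {
      throw new ExceptionTransactionDeadlock($e->getMessage());
    }
    throw $e;
  }
}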

It might make sense for each transaction to be wholly contained in its own function. As the function's local variables will be reset each time it is called, it will be easier to write transaction functions that do exactly the same thing during reattempts.

We could then have a function called execute_transaction() that would repeatedly reattempt the transaction function if it fails due to ER_LOCK_DEADLOCK.

Comments

Damien Tournoud’s picture

There is simply no way we can reattempt transactions, and I'm pretty reluctant to do that anyway. Drupal is a transactional application, and we want to follow a 'fail early' strategy.

Why do you believe this is critical?

jbrown’s picture

If node_save() randomly doesn't work then that's pretty bad.

I have no idea how often this problem occurs.

Would it really be that hard to reattempt transactions?

Damien Tournoud’s picture

I have no idea how often this problem occurs.

Typically: never. We don't use row-level locking that much (we still use it in merge queries in MySQL, but are probably going to change that), and all transactions typically take their locks in the same order.
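(For reference, the merge queries being referred to are built with db_merge(); a rough illustration, with an invented table:)

// Illustration only: an upsert built with db_merge(). Table and field names
// are invented; under MySQL this kind of query can take row-level locks.
db_merge('example_counter')
  ->key(array('name' => 'page_views'))
  ->fields(array('value' => 1))
  ->execute();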

If something really bad happens and MySQL detects a deadlock (for example, a transaction cannot commit in a reasonable amount of time), it probably means that the server is overloaded and it's better not to reissue the transaction.

Would it really be that hard to reattempt transactions?

Not only is it hard, it's completely impossible right now. Transactions are nested and we just cannot redo the whole processing.

jbrown’s picture

Priority: Normal » Critical

We could do something like this:

function execute_transaction($function) {
  $args = func_get_args();
  array_shift($args);

  do {
    $reattempt = FALSE;

    // Start a fresh transaction for each attempt; it is committed when
    // $transaction goes out of scope at the end of the iteration.
    $transaction = db_transaction();

    try {
      call_user_func_array($function, $args);
    }
    catch (ExceptionTransactionDeadlock $e) {
      // Roll back the failed attempt before trying again.
      $transaction->rollback();
      $reattempt = TRUE;
    }
  }
  while ($reattempt);
}

Transaction functions would have to be written carefully so they perform exactly the same transaction the second time around.
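A hypothetical usage sketch (the table, function name, and values are invented for illustration):

// Hypothetical usage: the unit of work lives in its own function so that a
// reattempt repeats exactly the same statements from scratch.
function example_log_insert($uid, $message) {
  db_insert('example_log')
    ->fields(array(
      'uid' => $uid,
      'message' => $message,
      'created' => REQUEST_TIME,
    ))
    ->execute();
}

execute_transaction('example_log_insert', 1, 'Saved a node.');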

jbrown’s picture

Priority: Critical » Normal

Okay - so we'll fail in this situation?

Damien Tournoud’s picture

Our transactions can have any arbitrary level of nesting. Database modifications are mixed with other processes that possibly cannot be rolled back (like querying external services, etc.). There is no way to make this work in D7.

For example: think about node_save(). During the saving process, we call hook_node_presave(), then hook_node_insert() or hook_node_update(). Those hooks can do arbitrary things. After that (and during the *same transaction*) we save the fields, which (in the default SQL storage) start their own nested transactions and have their own set of hooks that can do arbitrary things.
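Roughly, the nesting pattern looks like this (function and table names invented; in Drupal 7 an inner db_transaction() is implemented as a savepoint inside the outer transaction):

// Both layers open a transaction with db_transaction(). The inner call only
// creates a savepoint, so the inner unit of work cannot be rolled back and
// retried independently of everything the outer transaction has already done.
function example_outer_save(&$record) {
  $transaction = db_transaction();
  drupal_write_record('example_outer', $record);  // hypothetical table
  example_inner_save($record);                    // hooks, field storage, etc.
  // Committed when $transaction goes out of scope.
}

function example_inner_save(&$record) {
  $transaction = db_transaction();                // effectively a SAVEPOINT
  drupal_write_record('example_inner', $record);  // hypothetical table
}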

Damien Tournoud’s picture

Okay - so we'll fail in this situation?

Failing early is always better in transactional applications. The user knows how to redo what they were doing; we don't really have all the information to know how.

jbrown’s picture

For GET requests it might make sense to drupal_goto() to the current page.

For form submissions, there should probably be a form error saying that something bad happened.
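For the form case, something like this could surface the failure to the user (a sketch only - the form, submit handler, and save helper are all hypothetical, and it assumes the generic deadlock exception proposed above):

// Hypothetical submit handler: catch the deadlock exception and tell the user
// to try again rather than showing an error page. For GET requests, a caller
// could instead catch it and drupal_goto(current_path()) to retry the page.
function example_settings_form_submit($form, &$form_state) {
  try {
    example_settings_save($form_state['values']);  // hypothetical save helper
  }
  catch (ExceptionTransactionDeadlock $e) {
    drupal_set_message(t('Your changes could not be saved because the site was busy. Please try again.'), 'error');
  }
}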

Damien Tournoud’s picture

Status: Active » Closed (won't fix)

That's what the error page is for, I believe. That makes this a "won't fix".