I chose Drupal over WordPress for its functionality and used it to develop a website (visible in its nascent form at http://chriswalkerimages.com).

As you can see, the website is unusably slow, both on the production server (GoDaddy shared "deluxe" Linux) and on localhost (XAMPP, Windows 7 x64, no performance issues).

I am using page caching for anonymous users as well as browser caching, so while the first page request frequently takes > 30 seconds, subsequent requests complete in an almost reasonable timeframe. Of course, this does not help load times for registered users.
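
(For reference, the same performance settings can also be pinned from settings.php; a minimal sketch using standard Drupal 7 variable overrides - the values are only examples:)

// In sites/default/settings.php: force the performance settings I have enabled
// in the UI (Drupal 7 variable overrides; the values are examples only).
$conf['cache'] = 1;                     // page cache for anonymous users
$conf['block_cache'] = 1;               // block cache
$conf['page_cache_maximum_age'] = 300;  // max-age sent to browsers/proxies, in seconds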

During development with the Zen theme, I thought the slow speeds were due to the theme registry being rebuilt on every page load. This is not the case: theme registry rebuilding has been disabled in every possibly related .conf file and double-checked in the MySQL tables.

I enabled Xdebug on my localhost and examined the results using KCachegrind: it appears that almost all of the loading time is consistently due to these PHP functions:

drupal_load and system_list call drupal_get_filename (~58 times combined).
drupal_get_filename calls drupal_system_listing (~16 times).
drupal_system_listing calls file_scan_directory (~32 times).
Most of the time is spent in file_scan_directory, which is called (including self-calls) ~1184 times.
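
(For anyone who wants to reproduce the profiling: these are roughly the Xdebug 2 settings I used in php.ini - the output directory is just an example for XAMPP on Windows.)

; php.ini - write one cachegrind file per request so it can be opened in KCachegrind
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = "C:\xampp\tmp"
xdebug.profiler_output_name = "cachegrind.out.%p"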

As far as I can tell, Drupal is scanning many directories every time a page is requested, looking for enabled and bootstrap modules. I believe the module list ought to be cached in an SQL table, but it doesn't appear that this table is being used.
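
(If I'm reading system_list() correctly, that list should live in the cache_bootstrap table; here is a quick sketch to check whether the entry exists, e.g. from drush php-eval or a throwaway script - the cache ID 'system_list' is my assumption from reading the code:)

// Rough check: does the cached module list exist? ('system_list' is my guess at
// the cache ID from reading system_list(); 'cache_bootstrap' is the standard bin.)
$cached = cache_get('system_list', 'cache_bootstrap');
var_dump($cached ? array_keys($cached->data) : 'no system_list cache entry');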

I've deleted all the modules I added and disabled all unnecessary modules. This temporarily brought the number of file_scan_directory calls down to roughly 600, but it is now back up to double that for an unknown reason.

What should I attempt next? Is this a common problem? I can attach example cachegrind files if that would help. Searching indicates that Drupal 7 is generally slow, but this is agonizing and entirely unusable (visitors who don't know about the problem sometimes believe the site is broken and begin clicking links in rapid succession before the server has a chance to serve a single page, then leave the site).

Thanks,

Chris

Comments

I can't offer anything technical (above my head at this point), but I can confirm the drastic speed difference between 6 and 7. I would love to see whatever you come up with - Drupal 7 is far too slow for me to recommend on simple shared hosting environments (my typical small client).

Nice research, Chris. I am new to Drupal and can't offer any advice, but would encourage you to file a bug/issue ticket (I guess it is done at http://drupal.org/project/issues).
It's frustrating that in the various Drupal 7 speed posts I've seen, nobody with knowledge of the project has been able to say whether this issue is being looked at, recommend any settings that a hosting provider could change to help with the speed problem, or provide any kind of feedback at all.

I have the same issue with my Drupal 7 site, and I'm still investigating.
In your case, you are using big images, and that makes browsing slower.

Hammadi - you are right, the images still need further compression, but if you examine the page load times using Firebug or similar, most of the time is spent waiting for the server before any data has been transmitted.

See bug report here: http://drupal.org/node/1082892

Trying to figure out this one too. Some detailed discussion in another post (I'm not technical enough to understand whether these are related).

Here's the other thread: http://drupal.org/node/1018126

My Drupal 7 install is on Bluehost (shared hosting), which I understand isn't optimal, and I can confirm that it gets throttled. What scares me is that I am the only one who even knows the site exists. The thread above points to index.php demanding lots of resources.

Just posting this to subscribe - thanks for pushing the work to figure out what's going on.

I'm a proper cheapskate with shared hosting at GoDaddy. My current Drupal 6 site runs just fine and dandy and it's running a bunch of contributed modules. My test Drupal 7 site on exactly the same account (so same shared DB server) performs, quite frankly, abysmally.

A few other threads I've seen have hinted that Drupal 7 is far more resource-hungry than 6, and I also see hints that Drupal 7 is intended to target larger "clients" - meaning the intention is to be attractive to big companies with the financial means to have huge server resources at their disposal.

I sincerely hope that Drupal isn't abandoning individuals who want to build websites on a budget.

I don't know if you noticed, but your site doesn't even come up - just a 505 error page. Anyway, my Drupal 7 install is so slow that basic admin requests die due to time-outs, and my pages won't even come up half the time!

-Otto

From what I've been reading, this is an issue with taxonomies, so instead of creating categories, why not create basic pages and then add articles? It looks like there is a bug report in, so why not bypass the issue until the bug is resolved? For example, I have 25 different news "categories" on the site I am putting up, but rather than use categories, I am using a basic page for each feed. I can still have the "categories" feel by adding articles rather than pages, and I can update the system once the bug is fixed. I have no clue about web building, but after 46 years in real life, I have learned that most things can be bypassed until a bad situation is resolved.

I have run into the exact same problem.
I had nearly sent the login details on to my client for proofing when my site started to take 60 seconds to load. Then it timed out at 120 seconds and threw the following error:
Fatal error: Maximum execution time of 180 seconds exceeded in E:\inetpub\wwwroot\XXX\includes\file.inc on line 1992

After pulling my hair out all afternoon, I decided to poke around in that core file to see what was going on.

This is the function that was causing the error:

function file_scan_directory($dir, $mask, $options = array(), $depth = 0) {
  // Merge in defaults.
  $options += array(
    'nomask' => '/(\.\.?|CVS)$/',
    'callback' => 0,
    'recurse' => TRUE,
    'key' => 'uri',
    'min_depth' => 0,
  );
  $options['key'] = in_array($options['key'], array('uri', 'filename', 'name')) ? $options['key'] : 'uri';
  $files = array();
  if (is_dir($dir) && $handle = opendir($dir)) {
    while (FALSE !== ($filename = readdir($handle))) {
      if (!preg_match($options['nomask'], $filename) && $filename[0] != '.') {
        $uri = "$dir/$filename";
        $uri = file_stream_wrapper_uri_normalize($uri);
        if (is_dir($uri) && $options['recurse']) {
          // Give priority to files in this folder by merging them in after any subdirectory files.
          $files = array_merge(file_scan_directory($uri, $mask, $options, $depth + 1), $files);
        }
        elseif ($depth >= $options['min_depth'] && preg_match($mask, $filename)) {
          // Always use this match over anything already set in $files with the
          // same $$options['key'].
          $file = new stdClass();
          $file->uri = $uri;
          $file->filename = $filename;
          $file->name = pathinfo($filename, PATHINFO_FILENAME);
          $key = $options['key'];
          $files[$file->$key] = $file;
          if ($options['callback']) {
            $options['callback']($uri);
          }
        }
      }
    }
    closedir($handle);
  }
  return $files;
}

The if statement was causing me the problems. I have commented it out for the time being so I can use my site. The function now looks like this:

function file_scan_directory($dir, $mask, $options = array(), $depth = 0) {
  // Merge in defaults.
  $options += array(
    'nomask' => '/(\.\.?|CVS)$/',
    'callback' => 0,
    'recurse' => TRUE,
    'key' => 'uri',
    'min_depth' => 0,
  );
  $options['key'] = in_array($options['key'], array('uri', 'filename', 'name')) ? $options['key'] : 'uri';
  $files = array();
  if (is_dir($dir) && $handle = opendir($dir)) {
    while (FALSE !== ($filename = readdir($handle))) {
      if ($depth >= $options['min_depth'] && preg_match($mask, $filename)) {
        // Always use this match over anything already set in $files with the
        // same $options['key'].
        // Restored from the original function so $uri is still defined here.
        $uri = file_stream_wrapper_uri_normalize("$dir/$filename");
        $file = new stdClass();
        $file->uri = $uri;
        $file->filename = $filename;
        $file->name = pathinfo($filename, PATHINFO_FILENAME);
        $key = $options['key'];
        $files[$file->$key] = $file;
        if ($options['callback']) {
          $options['callback']($uri);
        }
      }
    }
    closedir($handle);
  }
  return $files;
}

In other words, I removed this snippet of code:

if (!preg_match($options['nomask'], $filename) && $filename[0] != '.') {
        $uri = "$dir/$filename";
        $uri = file_stream_wrapper_uri_normalize($uri);
        if (is_dir($uri) && $options['recurse']) {
          // Give priority to files in this folder by merging them in after any subdirectory files.
          $files = array_merge(file_scan_directory($uri, $mask, $options, $depth + 1), $files);
        }
        else

Does anyone know what this function is doing, and is there a better solution to get my site running again?

Thanks to a colleague of mine, we found a solution. Here is what he said:

Did you perchance recently go through and delete a bunch of modules you were no longer using, as a sort of last-minute cleanup or something?

The problem (with excessive file_scan_directory() calls) seems to be Drupal scanning for missing modules it thinks are still installed somewhere. (See: http://drupal.org/node/1082892, related to that other post you found)

After installing all the modules it was scanning for (and/or disabling the offending module from the system DB table and clearing caches), everything is fine. Went from 10,000+ calls to file_scan_directory() to zero.
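
(For anyone who would rather disable the stale entries than reinstall the modules, something along these lines should work - the module name is just an example from the list below, and back up your database first:)

-- Mark a missing module as disabled in the system table (example name only),
-- then clear all caches (admin/config/development/performance, or "drush cc all").
UPDATE system SET status = 0 WHERE type = 'module' AND name = 'wysiwyg';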

Here are the modules it was looking for (I installed all of these):

wysiwyg
token
recently_read
references (node reference)
plupload
views_bulk_operations
views_field_view
pathauto

LIFE SAVED!

Thank you for the hint!!!
I had the same problem.
I went to my system table and made sure that all installed (status = 1) modules really do exist.
After I deleted the records for the modules that were not found (most probably deleted from the contrib folder without uninstalling them properly), no call to file_scan_directory() took longer than a second.

NEVER DELETE A MODULE without uninstalling it properly first.
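
(If it helps, the clean removal can also be scripted; a rough sketch using the standard Drupal 7 API, where 'MODULE' is a placeholder for the module's machine name - run it before deleting the module's folder:)

// Disable and uninstall a module cleanly before removing its folder from disk.
include_once DRUPAL_ROOT . '/includes/install.inc';
module_disable(array('MODULE'));
drupal_uninstall_modules(array('MODULE'));
drupal_flush_all_caches();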

Is there an easy way, or a script, to find out which modules are missing?

Thanks

SELECT filename FROM system WHERE status = 1;

Should do the trick. Check the result against your module directories.

This script is really helpful for fixing the performance issue!

Normunds Puzo, CIO from IDYNAMIC
IDYNAMIC in Drupal Marketplace

I quickly threw together a function that checks for missing .module files based on the SQL query that someone else provided above:

<?php
function MYMODULE_deadmodulecheck() {
  $startingtime = microtime(true);
  $o = '<p>Checking for dead modules ...</p>';
  // List all enabled entries in the system table.
  $result = db_select('system')
    ->fields('system', array('filename'))
    ->condition('status', '1', '=')
    ->range(0, 150)
    ->execute();
  // $n counts active entries; $m counts entries whose file is missing on disk.
  $n = 0;
  $m = 0;
  foreach ($result as $node) {
    $n++;
    $path = DRUPAL_ROOT . '/' . $node->filename;
    if (!file_exists($path)) {
      $o .= "#$n $path<br>";
      $m++;
    }
  }
  $timedif = round(microtime(true) - $startingtime, 3);
  $o .= "Total of $n active modules registered in database. $m dead entries found.<br>";
  $o .= 'Query Time: ' . $timedif . ' seconds';
  return $o;
}
?>

Hi, I'm still quite a newbie. Where does this script go - in which file, or how do I create a new file to run it?
Thanks!

You can just drop a PHP file in your Drupal root with these contents:

<?php
/**
 * Root directory of Drupal installation.
 */
define('DRUPAL_ROOT', getcwd());
require_once DRUPAL_ROOT . '/includes/bootstrap.inc';
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);

function nobueno() {
  $startingtime = microtime(true);
  $o = '<p>Checking for dead modules ...</p>';
  // List all enabled entries in the system table.
  $result = db_select('system')
    ->fields('system', array('filename'))
    ->condition('status', '1', '=')
    ->range(0, 150)
    ->execute();
  // $n counts active entries; $m counts entries whose file is missing on disk.
  $n = 0;
  $m = 0;
  foreach ($result as $node) {
    $n++;
    $path = DRUPAL_ROOT . '/' . $node->filename;
    if (!file_exists($path)) {
      $o .= "#$n $path<br>";
      $m++;
    }
  }
  $timedif = round(microtime(true) - $startingtime, 3);
  $o .= "Total of $n active modules registered in database. $m dead entries found.<br>";
  $o .= 'Query Time: ' . $timedif . ' seconds';
  return $o;
}

echo nobueno();
?>

Then hit it in your browser. That's not very secure, but it worked for me... even though it didn't reveal anything and I still have a painfully slow website :(


trevor simonton
web developer
http://www.trevorsimonton.com/

Thanks for that - removing the dead module fixed the issue for us. This must be a fairly common situation slowing down a large percentage of Drupal sites. This check really should be part of the Status report. In any case, we will be using it to check all our sites and will perhaps even submit it as a module.
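
(To illustrate the Status report idea, here is a rough, untested sketch of the check as a hook_requirements() implementation - the module name "deadmodulecheck" is hypothetical:)

/**
 * Implements hook_requirements().
 *
 * Flags enabled modules whose files are missing from disk so they show up
 * on the Status report ("deadmodulecheck" is a hypothetical module name).
 */
function deadmodulecheck_requirements($phase) {
  $requirements = array();
  if ($phase == 'runtime') {
    $missing = array();
    $result = db_query("SELECT name, filename FROM {system} WHERE status = 1");
    foreach ($result as $row) {
      if (!file_exists(DRUPAL_ROOT . '/' . $row->filename)) {
        $missing[] = $row->name;
      }
    }
    $requirements['deadmodulecheck'] = array(
      'title' => 'Missing module files',
      'value' => $missing ? implode(', ', $missing) : 'None found',
      'severity' => $missing ? REQUIREMENT_ERROR : REQUIREMENT_OK,
    );
  }
  return $requirements;
}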

For me, none of the solutions above worked, but then clearing my site cache magically fixed the problem! Try it out!


trevor simonton
web developer
http://www.trevorsimonton.com/

Walker,

Newbie here, using the forum to try to get my own help with installing the Commerce Kickstart 2 demo store. I saw an update to your posting, got curious, and followed the link.

Your site popped up amazingly fast... and the photography is refreshingly breathtaking and incredibly beautiful. Whatever web problems you think you are having, I don't experience them. But you are a very fine photographer... that is for sure!

"Tropfen"