There is an interesting performance comparison of different CMSes over at mamboserver.com. While Drupal comes out well, there still seems to be room for improvement.

Comments

jsilence

Well, certainly not something that will boost performance to infinity, but...

While exploring the Drupal code I saw that the function conf_init() in bootstrap.inc constructs candidate configuration filenames from the URL and then tries to find and load them. Depending on where you installed Drupal, this can mean three to ten (or more) unsuccessful file lookups for every page load (!).
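
Roughly, that kind of URL-derived lookup looks like the sketch below (simplified, with hypothetical paths and function name; this is not the literal bootstrap.inc code):

<?php
// Build candidate settings files from the host name, most specific first,
// so even a single-site install probes every candidate before the default.
function find_settings_file() {
  $parts = explode('.', $_SERVER['HTTP_HOST']);        // e.g. www.example.com
  $candidates = array();
  for ($i = count($parts); $i > 0; $i--) {
    $candidates[] = 'sites/' . implode('.', array_slice($parts, -$i)) . '/settings.php';
  }
  $candidates[] = 'sites/default/settings.php';
  foreach ($candidates as $file) {
    if (file_exists($file)) {                          // one stat() per candidate
      return $file;
    }
  }
  return NULL;
}
?>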

Actually, I fail to see the benefit of this method. If someone has multiple sites running on the same Drupal installation, what is the problem with configuring the site-specific configuration files explicitly? One file that is valid for the whole installation (conf.php), and explicit includes for the multisites.
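
In other words, something along these lines, where each virtual host names its own configuration explicitly instead of Drupal guessing it (hypothetical file and variable names, not an existing Drupal mechanism):

<?php
// One file valid for the whole installation, included unconditionally.
include_once 'includes/conf.php';

// Each virtual host declares its own site explicitly, e.g. with
// "SetEnv DRUPAL_SITE example.com" in its Apache configuration.
$site = getenv('DRUPAL_SITE');
if ($site && file_exists('sites/' . $site . '/settings.php')) {
  include_once 'sites/' . $site . '/settings.php';     // exactly one lookup
}
?>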

Secondly, bootstrap.inc executes modules_invoke_all("init", ...) for a couple of core modules, but none of these modules actually HAS an _init function. Dropping this call could spare a few cycles.
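
For context, a hook broadcast works roughly like this simplified sketch (hypothetical helper name; module_list() is the real function that returns the enabled modules), so invoking a hook that nobody implements still costs one function_exists() check per module:

<?php
function invoke_all_sketch($hook) {
  $return = array();
  foreach (module_list() as $module) {          // e.g. node, user, filter, ...
    $function = $module . '_' . $hook;          // node_init, user_init, ...
    if (function_exists($function)) {           // never true if no module implements the hook
      $result = $function();
      if (isset($result)) {
        $return[] = $result;
      }
    }
  }
  return $return;
}
?>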

Just my 2 cents...

-silence

--
If you can't walk, try to run.

moshe weitzman

1. If you profile those file_exists() calls, you'll see that they are negligible (a quick way to measure this is sketched below).

2. Which modules? I don't see how this is possible, since modules register their init hooks dynamically; no hard-coding is done. See system_listing().
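
A minimal way to check the file_exists() claim from point 1 yourself (standalone sketch, made-up paths, uses PHP 5's microtime(TRUE)):

<?php
$candidates = array(
  'sites/www.example.com/settings.php',
  'sites/example.com/settings.php',
  'sites/com/settings.php',
  'sites/default/settings.php',
);
$start = microtime(TRUE);
for ($i = 0; $i < 1000; $i++) {
  foreach ($candidates as $file) {
    file_exists($file);            // the probe conf_init() repeats on every request
  }
}
printf("%d probes took %.4f seconds\n", 1000 * count($candidates), microtime(TRUE) - $start);
?>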

Bèr Kessels

Of course we can start optimising Drupal in a general way, but I think the problem lies in the fact that Drupal can do anything, anywhere. Each type of website needs its own specific configuration. A site with huge numbers of small articles (a newsfeed, etc.) performs differently from a site with mainly user traffic (a community site). Those in turn behave differently from multisite configurations, and differently again on various platforms.

So I think we should try to optimise Drupal to work as well as possible for the bulk of the sites, but leave room for those specific cases (how many people actually run a multiple-site Drupal? maybe 5% at most) to optimise their own performance.

For this to work we must first define "the bulk". Otherwise optimisation might, in the end, work for 5% of the sites while 90% are still running slowly.

[Ber | webschuur.com]

dries

I'm not sure whether I agree with your statement: what type-specific optimizations did you have in mind? I fail to see why a generic cache system can't serve different types of sites equally well, or why that would affect the configuration.

Bèr Kessels

I did not mean to say that we should exclude less popular uses of Drupal. Not at all. Neither did I mean to say that we should not do specific optimisations.

My idea is that we should first define where Drupal is used most often.
Then we can see what can be optimised for those cases.
And last, we should see what is left over to optimise for the special cases.

To take the example of multiple sites: hardly anyone ever uses that. So if we have to choose between the bulk performing a little better and a few sites performing a little worse, we should go for the first.

Another example: page caching is nice for sites that have fairly static pages. If your site consists of constantly changing lists of content (I am thinking of logged-in users in side blocks, or aggregated content on pages), optimising the cache will be a real pain. But if we see that 90% of Drupal users run Drupal as a weblogging system with fairly static pages, we can optimise the cache for them. By choosing to optimise a caching system for that majority of cases first, we will benefit much more from it.
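
A minimal sketch of that kind of page cache, assuming a hypothetical lookup helper rather than Drupal's actual cache API: whole pages keyed by URL and only served to anonymous visitors, so the mostly-static weblog case skips nearly all of the work:

<?php
function page_cache_try() {
  if (!empty($_COOKIE['logged_in'])) {
    return;                                    // personalised pages bypass the cache
  }
  $key = md5($_SERVER['REQUEST_URI']);
  $page = cached_page_load($key);              // hypothetical lookup helper
  if ($page && $page['expire'] > time()) {
    print $page['body'];
    exit;                                      // cache hit: no node or block queries at all
  }
}
?>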

Another example: a site set up to serve images, lots of images, will benefit most from optimised database calls for those images. It might not benefit much from caching, not even advanced caching, because the bulk of the data served to end users is not cached but served by Apache directly, or is perhaps first pulled through a file layer that checks permissions. Now, if we see that not even 3% of Drupal sites serve large amounts of images, there is no need to optimise this. Let that optimising be done by a developer who scratches his own itch when he sets up a Drupal site for images.
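
A sketch of that "pulled through a file layer that checks permissions" case (hypothetical access-check function, illustration only), which shows why routing every image through PHP is much heavier than letting Apache serve it directly:

<?php
$path = 'files/' . basename($_GET['file']);    // basename() keeps the request inside files/
if (!user_may_view_file($path)) {              // hypothetical permission check
  header('HTTP/1.0 403 Forbidden');
  exit;
}
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
readfile($path);                               // PHP streams what Apache could serve for free
?>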

So to summarise: the core, which is aimed at the largest number of implementations (I hope that is the case, otherwise we are doing something wrong), should perform best. Optimising all the other cruft, such as big image modules or multiple-site implementations, will not help Drupal in general much further!

[Ber | webschuur.com]

wernst

The main reason people come to Drupal is its flexibility in handling many different types of sites equally well. To start excluding some "unpopular" sites is to send people to other CMSes.

-Warr

B.X

That is true, as long as you don't use locale.module.

rjspence

... is that it would be better to provide a page or two of optimization techniques so that individuals can choose what they would like to optimize. Like the gentleman above wrote, there are improvements that can be made. Fewer queries that perform the same job is always good. I like the idea of having a very fast template system with few errors and tweaked SQL lookups. Look at the fastest in the group: it blows everything out of the water in every test, which shows that it can be done. Now granted, I haven't tried e107 or whatever it's called, but it looks one of two different ways to me: either the coder has tweaked the code intentionally and put many hours into it, or there are not many functions and queries in the program to begin with. Drupal doesn't do too badly in the test, however.
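
One small illustration of the "fewer queries" point, as a sketch with Drupal-style table names (not taken from any particular module): the classic case is replacing a per-row lookup with a single join.

<?php
// Before: one query, plus one more query for every node returned.
$result = db_query("SELECT nid, uid, title FROM node WHERE status = 1");
while ($node = db_fetch_object($result)) {
  $account = db_fetch_object(db_query("SELECT name FROM users WHERE uid = %d", $node->uid));
}

// After: a single query returns the same information.
$result = db_query("SELECT n.nid, n.title, u.name FROM node n INNER JOIN users u ON n.uid = u.uid WHERE n.status = 1");
?>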

robertdouglass

I find it natural that we would optimize anything we can as soon as a problem is identified and a solution presented. Deciding what to optimize is a moot point if nobody really knows where slowdowns occur. I mean only to say that the first order of business seems to be more profiling. Where do we spend most of our CPU cycles? The best way to improve performance in the long run will be to integrate regular profiling of code sections into the submit/release process. When people see where the CPU cycles vanish, optimization will naturally follow.

Of course there are possible optimizations that would exclude sites of type X, or only benefit sites of type Y, but I think the normal code review and submission process is the place to make those decisions. If we want to focus more attention on performance, making profiles of the existing code a regularly updated feature of this site is the way to go.
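
A minimal sketch of that kind of per-section profiling (hypothetical helper names; a serious setup would use a real profiler such as Xdebug):

<?php
$GLOBALS['timers'] = array();

function timer_begin($name) {
  $GLOBALS['timers'][$name]['start'] = microtime(TRUE);
}

function timer_end($name) {
  $t = &$GLOBALS['timers'][$name];
  $t['total'] = (isset($t['total']) ? $t['total'] : 0) + (microtime(TRUE) - $t['start']);
}

// Usage: wrap sections like timer_begin('bootstrap'); ... timer_end('bootstrap');
// then dump $GLOBALS['timers'] at the end of the request to see where the time goes.
?>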

- Robert Douglass

-----
visit me at www.robshouse.net