Here are the results of some benchmark tests that I have been running to compare the performance of a shared server with a dedicated one. Perhaps they will be useful to somebody who is trying to decide what hardware they need to host a new site:

All tests were done with Drupal 4.6.5, PHP4 and MySQL using "siege" (see http://www.joedog.org/siege) to establish a certain "base loading level" and the apache benchmark tool "ab" to generate timing statistics.

The exact commands used were like this:

siege -c 32 -i -t 11m -d 5 -f url_list.txt

This simulated 32 concurrent users all requesting random pages from a list of URLs, with a 5 second delay after each page (to simulate the user reading the page). This was set to run for 11 minutes. After about 30 seconds I fired off the "ab" command in a second terminal:

ab -c 1 -t 600 http://server/page.htm

This would request the same page for 10 minutes & collect timing statistics.

Tests were repeated for varying lengths of time and for different numbers of simulated users. I was only really interested in the maximum practical number of users, so I'll only show those results:

On the shared host (supplied by oneandone.co.uk), with caching enabled, the maximum practical number of users was about 16. This gave an average page load time of 3.71 seconds, with 95% of pages loading in 5.07 seconds or less. With 32 simulated users the average load time was 8.51 seconds, with 25% of pages taking over 10.2 seconds, which was clearly too much. With caching disabled, to simulate "logged-in" users, performance dropped badly after only 4 users! (Note: this was done over a 1 Mbit ADSL connection. With 32 simulated users only half the available bandwidth was being used, so that shouldn't have been a limiting factor.)
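As a rough sanity check on these figures (my own arithmetic, not part of the original tests): with N simulated users, each pausing d seconds between pages that take t seconds to load, the offered load is roughly N/(d+t) requests per second:

```shell
# Offered load ~ users / (think time + page load time), in requests/sec.
# Values are the 16-user shared-host figures quoted above.
awk -v n=16 -v d=5 -v t=3.71 'BEGIN { printf "%.1f req/s\n", n / (d + t) }'
# -> 1.8 req/s
```

So even the "acceptable" shared-host run only amounts to a couple of requests per second of sustained load.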

On the dedicated host (1.7 GHz Celeron, 256 MB RAM) the performance was initially very bad, even with caching enabled. Once I enabled the Turck MMCache PHP accelerator, performance was about 10 times better! With 512 simulated users (on a local network connection), the average page load time was 2.68 seconds and 95% of pages loaded in 5.27 seconds or less.

If anybody has any views on whether these results are typical, that would be really helpful. Also, I'm not sure that my testing approach was particularly statistically valid, but hopefully it gives a good indication of what the hardware is capable of.

Comments

Dries’s picture

Weird. Page load times of more than a second are not typical. Especially with caching enabled, page load times should be on the order of milliseconds.

I suspect your machines ran out of RAM and that most of the time was spent swapping data from memory to disk, and back. If so, you might be able to solve this by tuning your Apache/MySQL configuration.

Adrian Freed’s picture

I fear a performance-related bug may have crept into the recent 4.6 releases. I have spent hours tuning and looking for server/Apache/MySQL/PHP configuration issues that would explain the 2-second load times for pages of very small Drupal sites on an idle, dedicated, average-speed server. With hardly any modules loaded it is only slightly faster. I thought this was an OS X related issue, so I decided to ignore it, as I will be deploying on Linux boxes eventually, but I have seen many posts now with these sorts of numbers. I installed devel to look at query times and nothing seems too unreasonable, so I am now guessing there is some PHP function throwing away cycles.

CnnmnSchnpps’s picture

Well, I haven't done any serious performance testing yet, but I do have a script that generates as many requests as it can. When I let it run with caching enabled, I usually get well over 1500 served requests per minute once the pages are cached. And I can browse the site at the same time from another machine with no noticeable performance degradation.

With caching disabled, the same script fetches about 200-300 pages per minute.
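For comparison with the per-second figures elsewhere in this thread, those per-minute rates convert as follows (my arithmetic, using the round numbers quoted above):

```shell
# Convert requests/minute to requests/second: rate / 60.
awk 'BEGIN { printf "cached: %.0f req/s, uncached: %.1f to %.1f req/s\n", 1500/60, 200/60, 300/60 }'
# -> cached: 25 req/s, uncached: 3.3 to 5.0 req/s
```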

I should really try for some concurrent users...

insomoz’s picture

Post your settings:

Server: Pentium 3.
Memory: 2gb
Avg Time: 0.0331692888737

1) Copy the code below.
2) Create a new file and upload it to your server.
3) Navigate to the uploaded file in your browser to run the test.
4) Paste your results here.

To run the test, change the $file parameter to point at your Drupal page.

<?php
/*
There are two settings:
   First, set $file to be the server and page that you want to benchmark.
   Second, set $iter to be the number of times you want it loaded.
*/

$file = "http://localhost/index.php";
$iter = 1000;

// Return the current time as a float (seconds + microseconds).
function getmtime()
{
    $a = explode(' ', microtime());
    return (double) $a[0] + $a[1];
}

$loadtime = 0;
for ($i = 0; $i < $iter; $i++)
{
    $start = getmtime();
    file($file);                       // fetch the page
    $intertime = getmtime() - $start;  // time for this single request
    $loadtime += $intertime;
    echo $intertime . "<br>";
}
$avgload = $loadtime / $iter;
echo "<p><b>" . $avgload . "</b>";
?>

When I ran the test against a phpinfo() file, I noticed this time:

Avg Time: phpinfo file 0.003170

insomoz’s picture

After installing eAccelerator (http://eaccelerator.net/) I got the Drupal benchmark down to 0.00799290847778.

I also disabled a few modules that I wasn't using and moved them to another folder.

ymcp’s picture

What speed is your CPU?

Expressing the average page load times as "hits per second" makes for an easier comparison:

your initial test: 0.03316 = 30 hits/s
your eaccel test: 0.00799 = 125 hits/s

my mmcache test: 0.0164 = 61 hits/s
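The conversion here is just the reciprocal of the average page load time; a quick sketch of the arithmetic using the figures above:

```shell
# hits/s = 1 / average page load time (seconds).
for t in 0.03316 0.00799 0.0164; do
  awk -v t="$t" 'BEGIN { printf "%s s -> %.0f hits/s\n", t, 1 / t }'
done
# -> 0.03316 s -> 30 hits/s
# -> 0.00799 s -> 125 hits/s
# -> 0.0164 s -> 61 hits/s
```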

I am curious why you are getting so much better performance. OK, you have masses more RAM, but when I doubled my RAM it made no difference. I have even tried using eAccelerator instead of mmcache, but that didn't make any difference.

Have you done some Apache / MySQL tuning perhaps?

It would be great if a few people could run the php timing script & report their results & server specs.

insomoz’s picture

cat /proc/cpuinfo
Intel(R) Pentium(R) 4 CPU 3.20GHz

With the server, there is only one webpage running, drupal.

The server was compiled with latest kernel,
and the server admin removed many of the un-needed modules that apache is compiled with.

KeepAlive On
MaxKeepAliveRequests 150
KeepAliveTimeout 5
MinSpareServers 30
MaxSpareServers 50
StartServers 15
MaxClients 75

my.cnf
skip-innodb
skip-locking
table_cache=1000
key_buffer=256M
long_query_time=2
max_connections=1000
max_connect_errors=10
tmp_table_size=64M
innodb_buffer_pool_size=128M
wait_timeout=20
connect_timeout=6
thread_cache_size=128
query_cache_limit=8M
query_cache_size=256M
query_cache_type=1
join_buffer_size=2M
read_buffer_size=2M
sort_buffer_size=4M
myisam_sort_buffer_size=64M
read_rnd_buffer_size=1M
#log-long-format
log-slow-queries = /var/mysql/slow_queries.log
thread_concurrency = 4

The MySQL settings could probably be tuned better; the server admin hasn't tuned this part.

ymcp’s picture

Ahhh... the penny drops... your earlier post said you were using a "Pentium 3", which meant the maximum clock speed would have been about 1GHz, which was confusing me!

If the figures you posted were really for a "Pentium 4, 3.2GHz", it isn't so surprising that you're getting roughly double the performance of my little "Celeron 1.7 GHz" test box.

Looks like CPU speed may be the limiting factor here.

Update:
On a 2.7GHz Celeron with 512Mb RAM, the perftest.php results were:
Drupal cache=off, eaccel=off: 0.2260 (4.4 pages/sec)
Drupal cache=off, eaccel=on: 0.1173 (8.5 pages/sec)
Drupal cache=on, eaccel=off: 0.0512 (19.5 pages/sec)
Drupal cache=on, eaccel=on: 0.0163 (61.3 pages/sec)
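To make the effect of each setting clearer, here are the speedup factors implied by that table, relative to the no-cache, no-accelerator baseline (my own arithmetic, not part of the original test):

```shell
# Speedup = baseline time / configuration time.
base=0.2260
for t in 0.2260 0.1173 0.0512 0.0163; do
  awk -v b="$base" -v t="$t" 'BEGIN { printf "%s s -> %.1fx\n", t, b / t }'
done
# -> 0.2260 s -> 1.0x
# -> 0.1173 s -> 1.9x
# -> 0.0512 s -> 4.4x
# -> 0.0163 s -> 13.9x
```

In other words, the Drupal cache and the PHP accelerator multiply together: each roughly doubles-to-quadruples throughput on its own, and combined they are worth nearly 14x.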

insomoz’s picture

Nice work with the benchmarks. Can you bench
Drupal cache=on, zend+eaccel=on?

ymcp’s picture

I have already done benchmark tests using mmcache and eAccelerator. The results were identical.

Is there really any point using "zend accelerator" and "eAccelerator" at the same time? Wouldn't this be more likely to produce some sort of conflict rather than any speed boost?

Note: I have no experience of using the zend accelerator, so if you think there is a benefit, please explain why.

There are several PHP accelerators, so I don't really want to spend time benchmarking them all. I've got a site to develop! ;-)

What is really confusing me at the moment is why I should be getting almost identical Drupal benchmark results for a 1.7 GHz Celeron and a 2.7 GHz Celeron. Given that the CPU is at 100% during the tests, I expected the faster box to perform better. Any thoughts?

ub_freak’s picture

Did anyone ever try using LiteSpeed (http://litespeedtech.com/) instead of Apache? I recently installed it and it works really well.

some results for anyone interested:

Lifting the server siege... done. Transactions: 5735 hits
Availability: 100.00 %
Elapsed time: 300.81 secs
Data transferred: 30.85 MB
Response time: 0.04 secs
Transaction rate: 19.07 trans/sec
Throughput: 0.10 MB/sec
Concurrency: 0.68
Successful transactions: 5735
Failed transactions: 0
Longest transaction: 2.33
Shortest transaction: 0.01

This was done using siege with 30 concurrent users for 5 minutes, with the Drupal cache on. I tried it on a little 1 GHz P4 with 256 MB RAM. Anyway, the results are quite good, and it works really well as a replacement for Apache.

ymcp’s picture

Thanks for that script... very useful.

On the shared server I get an average of 0.3659, which is pretty terrible! On the "dedicated" server I get 0.0164, ie 3651 pages per minute.

Dries mentioned that there might be a paging issue, so I re-ran some tests and gathered some "sar" stats for each run.

users %usr %sys %iowait page(kb/s) hits/minute
 64    28     8    0.38     29       1510
128    55    14    0.95     41       2963
256    76    19    0.16    122       4277
512    79    20    0.18     76       4280

Note: Page(kb/s) was calculated from "sar -B" by adding pgpgin/s and pgpgout/s. If I force the system to page excessively (by using open office and firefox at the same time), I can get over 4Mb/s page_io on this box.
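The page(kb/s) column was derived as described above; a sketch of that sum (the two sample lines below are illustrative made-up values, not real sar output):

```shell
# Total paging I/O = pgpgin/s + pgpgout/s, both in kB/s.
# Input format assumed here: label, pgpgin/s, pgpgout/s.
awk '{ printf "%s: %.2f kB/s\n", $1, $2 + $3 }' <<'EOF'
run1 14.50 14.50
run2 60.00 62.00
EOF
# -> run1: 29.00 kB/s
# -> run2: 122.00 kB/s
```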

So... it doesn't seem that paging is an issue in this case... but it seems that the CPU maxes out at slightly less than 256 simulated users. This is probably why the total number of hits per minute never goes much beyond 4200, regardless of the number of simulated users.

I'm going to try again with the RAM doubled to 512Mb, but I suspect I just need a faster CPU!

I have also tried removing all unused drupal modules, but that didn't make any difference.

Update: doubling the RAM made no noticeable difference.

mikhailian’s picture

My result is 0.24941328787804 on a Tektonic UM3 plan
(LVDS, 512Mb guaranteed RAM, Dual AMD Opteron 244)

James Andres’s picture

Hi,

I've been doing some work benchmarking our site and have been getting numbers between 0.5 and 2 seconds for page rendering time.

My method, right now, is pretty trivial:

Step 1) Grab a microtime timestamp in the index.php (right at the beginning)
Step 2) Grab it in the page.tpl.php (right at the end) by using global and figure out the difference in time.

This method, of course, factors in a bit of transit time to the client browser. Which methods were people using to get 0.03 second times?

Disclaimer: I'm quite new to this server benchmarking stuff, but learning quickly (under pressure :-)

James Andres

Lead Developer on Project Opus
www.projectopus.com

ymcp’s picture

So you're only testing with a single user, requesting a single web page? If that is taking up to 2 seconds, that's *really* bad. What would happen if there were more users?

Any successful web site is going to have to deal with multiple concurrent users, all requesting different pages at the same time.

At the very least I would suggest that you try running the php script that insomoz posted above. That will do multiple requests of a single page, as fast as the server is able to cope.

There's simply no point building a web site that will grind to a halt as soon as it becomes popular. Unless you are just building a tiny "personal" site that is only ever used by family / friends, you should try to ensure that the site can cope with a "slashdotting".

James Andres’s picture

Hey,

Oh, we definitely plan to have our site serve multiple users ;-) (hundreds to thousands to millions, but one step at a time.)

One thing I just noticed was that I forgot to reinstall APC and the other performance enhancers (we had a server crash over Christmas and I've been rebuilding the server piece by piece, with good documentation this time...).

Right now the page render time is around 0.1 - 0.4 seconds per page (much, much better).

I'm sure there is still much room for improvement. I'll give those scripts a try and post the results.

Anything else you guys suggest? As I said before, I'm not new to Linux or web programming, and I've also run quite a few small-scale servers. This is, however, the biggest thing I've set up yet.

update
I just ran the script posted by insomoz and got an average of 1.2 seconds. OUCH!

Time to look into this pronto.

Cheers,

James

Developer on Project Opus
w: www.projectopus.com
h: japhotogallery.webhop.org

drupalguest’s picture

Can someone tell me the ideal load average when using "top"?


insomoz’s picture

Well, I guess it depends on the host.
Some hosts will run servers with loads of over 50, but they usually run into all sorts of hardware failures.

A load under 1 is nice, and the server should run efficiently.
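The rule of thumb behind this (my framing, not from the comment above): on a box with C cores, a load average at or below C means runnable processes aren't queueing for the CPU. A sketch with illustrative values:

```shell
# Compare a load average against the core count; load <= cores => ok.
check_load() {
  awk -v l="$1" -v c="$2" 'BEGIN { print ((l + 0 <= c + 0) ? "ok" : "overloaded") }'
}
check_load 0.8 1   # a load under 1 on a single-core box
check_load 50 4    # the pathological case mentioned above
# -> ok
# -> overloaded
```

On a live box the current 1-, 5- and 15-minute averages are the first three fields of /proc/loadavg (or the tail of `top` / `uptime` output).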

James Andres’s picture

Hi,

Just a follow up for some things I have done to further boost performance.
1) Implemented the url_alias backport patch (biggest gains by far, 0.2 - 0.6 sec faster load times!!)
2) Worked on removing useless queries in modules and/or compiling queries to remove repetition. Helped a bit...
3) Turned off unused modules. This helps a lot in many cases (seems obvious).
4) Installed the devel module to keep track of queries, etc. (not performance enhancing, but helped me find issues faster.)

Planning to do the following:
1) Tweak memory usage of MySQL and Apache to better handle high traffic.
2) Possibly move some callback functions (e.g. AJAX callbacks) into their own PHP files to remove Drupal bootstrapping overhead. (I heard talk of a better short-circuited bootstrapping mechanism in 4.7 at OSCMS. WE WOULD LOVE THIS!)

Maybe this will give the curious some ideas.

James

Lead Developer on Project Opus
www.projectopus.com

ymcp’s picture

Hi James,

I suspect that the performance problems that I have noticed are also largely due to the fact that we currently have about 2000 url aliases. The site is still under development - when it "goes live" there will be over 4000 aliases, which is worrying me slightly.

Since 4.7 doesn't seem like it will be ready for a while, and I understand that 4.6 has been "frozen", it seems my only option is to apply this patch. (Note: I have tried the 4.7 beta & really like it... hope it isn't too long until it is properly released.)

So, which patch did you apply? Is it the one posted by matt westgate on January 23, 2006 in this thread?

I have never applied a Drupal patch before... I found some instructions. Is there anything else to watch out for?

kbahey’s picture

Also, add the following to your list:

1. Run as many Apache pre-forked processes as your memory allows.

2. Keep them running for a long time.

For example, I use this:

StartServers 8
MinSpareServers 2
MaxSpareServers 6
MaxClients 24

3. Turn off any Apache modules you do not plan to use.

4. Run as many MySQL threads as you can.

For example:

thread_cache            = 6
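A rough sizing rule behind tip 1 (my assumption, not kbahey's stated method): cap MaxClients so the pre-forked processes can never outgrow physical RAM, i.e. divide the memory you can give Apache by the resident size of one process. With hypothetical figures of ~480 MB for Apache and ~20 MB per process, that lands on the MaxClients value shown above:

```shell
# MaxClients ~= memory available to Apache / resident size per process.
# 480 MB and 20 MB are assumed example figures, not measurements.
awk 'BEGIN { printf "MaxClients ~= %d\n", 480 / 20 }'
# -> MaxClients ~= 24
```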

--
Drupal development and customization: 2bits.com
Personal: Baheyeldin.com
