It looks like with new changes (registry, etc.), it's now impossible to install Drupal 7-dev with only 16M of memory.
The installer fails at the Batch API stage with an obscure Batch API error.
Comment | File | Size | Author
---|---|---|---
#152 | drupal-reduce-ram-requirement-281405-152.patch | 613 bytes | webchick |
#145 | 281405_2.patch | 210.23 KB | mikejoconnor |
#142 | 281405.patch | 210.18 KB | mikejoconnor |
#134 | no-updates-for-you.patch | 104.98 KB | webchick |
#119 | no-cache-for-you.patch | 25.74 KB | webchick |
Comments
Comment #1
nlindley CreditAttribution: nlindley commented: This seems to come up quite a bit. Since the default limit is 8M, it would make sense to have a commented-out line somewhere in the .htaccess with php_value set to around 24M. Here's a quick and dirty patch to get things started. You may want to change the wording, the default limit, or the placement.
I don't think it's a good idea to have this uncommented by default since some hosts or users could already have higher limits.
Is there a way to check the current limit and set it only if it needs to be higher?
Comment #2
catch: The PHP5 default memory limit is now 16M, so we'll probably see fewer installs with only 8M. That doesn't help if we go over 16M, though...
@Damien - will it install with 16M of memory with #278592: Sync 6.x extra updates with HEAD applied?
I think we might need to have .update files to avoid memory bloat when loading install files.
Comment #3
catch: Also, there was a lot of discussion of this general issue over at #197720: Drupal requires more than 8M PHP memory_limit (although it was 8MB that time).
Comment #4
Damien Tournoud CreditAttribution: Damien Tournoud commented: I opened this issue just as a reminder. I don't think we should increase the requirement, but rather work on closing the issue by other means (#259412: Total module system revamp could be beneficial on that matter).
Comment #5
RobLoach: Just ran into this. Subscribing. It would be nice to at least warn the user.
Comment #6
alexanderpas CreditAttribution: alexanderpas commented: PHP ini_set() and ini_get(), maybe?
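That idea - check the current limit with ini_get() and raise it with ini_set() only if it's too low, as #1 asked about - could be sketched like this. The function names here are hypothetical helpers, not Drupal APIs:

```php
<?php

// Hypothetical helper: convert PHP shorthand like "16M" or "512K" to bytes.
function example_memory_to_bytes($value) {
  $value = trim($value);
  $number = (float) $value;
  switch (strtoupper(substr($value, -1))) {
    case 'G': return (int) ($number * 1073741824);
    case 'M': return (int) ($number * 1048576);
    case 'K': return (int) ($number * 1024);
    default:  return (int) $number;
  }
}

// Raise memory_limit only when the configured value is lower than needed.
function example_ensure_memory_limit($required = '24M') {
  $current = ini_get('memory_limit');
  // A limit of -1 means "unlimited", so there is nothing to do.
  if ($current == -1 || example_memory_to_bytes($current) >= example_memory_to_bytes($required)) {
    return TRUE;
  }
  // ini_set() returns FALSE when the value cannot be changed
  // (e.g. under some hosts' restrictions).
  return ini_set('memory_limit', $required) !== FALSE;
}
```

Note that ini_set('memory_limit', ...) can itself be disabled on shared hosts, which is why a warning rather than a silent attempt is still useful.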
Comment #7
Dave Reid: Shouldn't this fail at the 'install requirements' phase? I need to test this.
Comment #8
Gurpartap Singh CreditAttribution: Gurpartap Singh commented: Forcing a memory limit higher than the default 16MB as a requirement will likely be a pain for those on shared hosting accounts.
Comment #9
alexanderpas CreditAttribution: alexanderpas commented: We need to diagnose why this happens... but the question is how...
Comment #10
aaron CreditAttribution: aaron commented: Related to #309457: Allow profiles to specify required memory in .info file. Even though it's noble to try to reduce the memory, I believe it's only a stopgap, as eventually we'll cross that limit (as it seems we already have).
Comment #11
aaron CreditAttribution: aaron commented: Bumping to critical. I tried to install HEAD on a default 16M installation, and not only did it fail, it corrupted the database, giving the following on successive visits to install.php:
PDOException: SELECT menu_router.* FROM {menu_router} menu_router WHERE (path IN ()) ORDER BY fit DESC LIMIT 0, 1 - Array ( ) SQLSTATE[42000]: Syntax error or access violation: 1064 You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')) ORDER BY fit DESC LIMIT 0, 1' at line 4 in menu_get_item() (line 369 of /var/www/jq-7/includes/menu.inc).
Comment #12
Pancho: See also #346073: memory exhausted on install - wrong link.
Comment #14
RobLoach: Once again ran into this problem:
I think we should increase the required memory, because a regular user would just give up before even getting Drupal installed. We should increase it at least until we decrease the amount of memory that Drupal requires to install.
Comment #15
alexanderpas CreditAttribution: alexanderpas commented: -1 for that patch; see #2, #4 and #8.
Comment #16
Anonymous (not verified) CreditAttribution: Anonymous commented: Even if the 16 MB requirement is somehow maintained for core, adding a contrib module (or even any or all of the core modules) will likely take Drupal over the limit. A warning should at least be given for < 32MB, IMO.
Comment #17
Damien Tournoud CreditAttribution: Damien Tournoud commented: At the minimum, this is premature.
Comment #18
webchick: In addition, we should upgrade this from a warning to an error. Here's what happens when you install with a default MAMP 1.7.2 (PHP 5.2.6):
1. On the warnings page, you are told about the recommended memory size, as well as the file permissions stuff. You edit php.ini and bump up your memory_limit to 32MB or so.
2. BUT! You forget to restart Apache. Oops. Drupal's install happily proceeds along until the point where it attempts to install, and you get an HTTP 500 error. That's ... helpful.
3. Then, in an extra special dose of helpfulness, clicking on the "error page" link takes you to a nice WSOD. ;)
Comment #19
nlindley CreditAttribution: nlindley commented: It's mostly system administrators who will be modifying the php.ini file. Most users will modify the .htaccess file, which doesn't require the restart. That was the original reason I thought having a line in that distributed file would be beneficial.
Comment #20
RobLoach: Why is this premature, Damien? I hit it every time I set up a new development environment, in both Drupal 6 and HEAD...
Comment #21
alexanderpas CreditAttribution: alexanderpas commented: Current state:
memory_limit = 16M during install => Drupal breakage (HTTP 500), with no warning message even though all requirements are met.
Comment #22
CZ CreditAttribution: CZ commented: Error "Fatal error: Allowed memory size of 26214400 bytes exhausted" after installation with a memory limit of 26M.
But no error with a memory limit of 32M. So "< 32M" is correct.
Comment #23
quickcel CreditAttribution: quickcel commented: I tried a fresh install with 16MB and received the "HTTP 500 error occurred" message. Once I bumped it up to 32MB and started fresh, there were no errors at all.
Attached is a simple patch to require a PHP memory limit of 32MB and upgrade from a warning to an error.
Comment #24
catch: This should stay a warning, not an error - which PHP extensions are loaded and various other factors (install profile, etc.) can influence how much memory is required, so we don't want to stop someone who can install with 22MB of memory from doing so.
Comment #25
quickcel CreditAttribution: quickcel commented: OK - I can see how you wouldn't want a hard error for it, but what about bumping up the DRUPAL_MINIMUM_PHP_MEMORY_LIMIT constant?
If I keep it at 16MB and run the Drupal "minimal" installation profile, I still get a fatal out-of-memory error after clicking "Save and continue" in the Configure Site step. If I try again with 22MB on the minimal profile, it works without any errors, but the "standard" install gives an error even with 22MB, so I would say bump DRUPAL_MINIMUM_PHP_MEMORY_LIMIT to at least 32MB.
Comment #26
catch: Yes, I think the minimum limit should be bumped to 32M - just not the change from warning to error.
Sorry for not making that clear in #24.
Comment #27
quickcel CreditAttribution: quickcel commented: Gotcha - we're on the same page then. Attached is a patch with just the minimum bumped up to 32MB, with the severity still kept as a warning.
Comment #29
drifter CreditAttribution: drifter commented: Rerolled the patch from #27.
Comment #30
catch
Comment #31
webchick: Committed to HEAD. This invariably trips up every. single. person. in #drupal who installs Drupal 7 for the first time, and we're fast approaching alpha territory, so blatant issues like this need to be cleaned up.
We've had this issue open for almost 1.5 years, and so far have not had good luck reducing Drupal's memory footprint (although I know there's been efforts with module splitting, etc.). Maybe this will light an extra fire for us to figure this out. :)
Comment #32
cburschka: Update: I can no longer install the default profile with 32M.
Upping my memory limit to 35M works. 34M is still too little.
(Note that this is only for the web installer. If you want to install the default profile non-interactively via a script, you need at least 41M.)
Comment #33
cburschka: I have placed memory_get_peak_usage() calls in the installer and batch processor. Here's what happens:
It appears the culprit is not any single module (though menu and the default profile itself are big eaters), but rather the fact that the memory used up by installing each module is never freed up. If we managed to free memory after each installed module (particularly the menu module), the memory peak would drop back down to 20M, and could possibly even be stuffed under 16M.
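The instrumentation described above might look roughly like this (a sketch; the wrapper name and log format are illustrative - module_invoke() is the real D7 API that runs hook_install()):

```php
<?php

// Sketch: log the cumulative memory peak around each module's install step,
// so the per-module cost - and whether anything is ever freed - is visible.
function example_install_module_instrumented($module) {
  $before = memory_get_peak_usage();
  module_invoke($module, 'install');
  $after = memory_get_peak_usage();
  // Example output line: "menu: +4.20 MB (peak now 19.80 MB)".
  printf("%s: +%.2f MB (peak now %.2f MB)\n",
    $module, ($after - $before) / 1048576, $after / 1048576);
}
```

Because memory_get_peak_usage() is a high-water mark, comparing it with memory_get_usage() (as done in #35) distinguishes transient spikes from memory that is never released.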
Comment #34
cburschka: Stepping into the respective install processes of menu and the default profile, it appears the major part of the footprint comes from rebuilding the menu.
Comment #35
cburschka: It just occurred to me that I might be getting the wrong impression by looking at the peak usage. So here's another run with the current usage instead. It shows that the memory cost still rises cumulatively with every installed module, and does not go back down.
Comment #36
Anonymous (not verified) CreditAttribution: Anonymous commented: Subscribe.
Comment #37
webchick: Arancaytar: I'm curious - when you load the front page after install, what's the memory usage there? If it's still ~35MB, then we probably need to go down the road of splitting the code into multiple files. But if it's more like ~20MB, it might be worth exploring how we can make the installer periodically dump its cache as modules are installed, as you suggested.
Comment #38
cburschka: I enabled the Devel performance module and looked at the memory footprint. It seems that standard page loads are only around 20-21MB, so the installer does seem to be the bottleneck. I haven't tried generating bulk content, but I have a hunch that the registries and boot process outweigh the footprint of the content that is actually displayed.
Edit: On a front page with the full 50 nodes, the memory usage goes up to 25MB.
Comment #39
catch: 20-30MB without APC sounds about right to me on a normal install (I had it as low as 6 or 7MB without APC).
Those memory numbers are really, really interesting.
@Arancaytar - would you be up for running xdebug on the installer, then following the same process as in http://drupal.org/node/513984#comment-2279256 to get the memory usage per function? That should show which function or group of functions is hogging all the memory.
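For reference, an Xdebug 2-era php.ini fragment for per-function memory traces might look like this (settings taken from the Xdebug documentation; the output path is just an example):

```ini
; Trace every request automatically and write the trace file to /tmp.
xdebug.auto_trace = 1
xdebug.trace_output_dir = /tmp
; Include the memory delta of each function call in the trace.
xdebug.show_mem_delta = 1
; 0 = human-readable traces; 1 = machine-readable output for analyzers.
xdebug.trace_format = 0
```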
Comment #40
webchick: Can we please make sure that those instructions get put into the handbook, either in http://drupal.org/node/79237 or as a sub-page there? I have no idea how anyone would find them in some totally random issue without having a catch to point them there. ;)
Comment #41
catch: Added: http://drupal.org/node/659980
Comment #42
cburschka: I am able to run xdebug and get a raw trace file, but the log analyzer is written in a syntax that is not compatible with PHP 5.2.10.
Comment #43
cburschka: Ah, it needs 5.3.0+. Bleh. Well, it shouldn't be that tricky to convert it to create_function() syntax.
Comment #44
cburschka: Well, for what it's worth, here are my results. They look kind of weird in part (over 1 GB? What the?).
Also, the call counts are unbelievable. Consider that get_defined_constants() is called *only* from drupal_parse_info_format(). There is no way 3245 info files exist in the Drupal code base... but I do know that all this data was aggregated from a single installation run of the default profile.
Edit: And yep, 3245 calls to the info parser for a single default installation. Every single info file in the code base is parsed 33 times, triggering a call to get_defined_constants() every time. Statically caching drupal_parse_info_file() did not decrease the peak memory usage, but it did cut the installation time by a third.
Note that these statistics are useful for finding how memory intensive individual functions are, but not so much for finding the leaks that put the cumulative usage over the 32M limit.
Another resource hog is the stream wrapper handling being called 90,000 times, which seems entirely useless when parsing code files during installation.
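The static-caching experiment mentioned above might look roughly like this (a cut-down sketch of the pattern, not the actual patch; the real drupal_parse_info_file() in core has the same shape but more error handling):

```php
<?php

// Sketch of statically caching the info-file parser: each file is read and
// parsed once per request, instead of 33 times as measured above.
function example_parse_info_file($filename) {
  static $cache = array();
  if (!isset($cache[$filename])) {
    $data = file_exists($filename) ? file_get_contents($filename) : FALSE;
    // drupal_parse_info_format() does the expensive parsing work,
    // including the repeated get_defined_constants() calls.
    $cache[$filename] = ($data === FALSE) ? array() : drupal_parse_info_format($data);
  }
  return $cache[$filename];
}
```

As noted, this trades a little memory (the cached arrays persist for the request) for a large reduction in repeated parsing work, which is why it cut installation time but not peak memory.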
Comment #45
catch: Damn, this is cool (insert embarrassed smiley).
This is indeed not very useful for working out why memory usage is so high at the end, and I can't think of a way to isolate a trace file to just one request like that.
file_uri_scheme() is called a bunch of times on normal requests too, but I didn't see a way out of that yet.
"Statically caching drupal_parse_info_file() did not decrease the peak memory usage, but it did cut the installation time by a third."
Can we open a separate issue for this? 300% speedup on installation, if that's really the case, would be really nice to have.
Comment #46
cburschka: Just down by a third, not down to a third. Still, it was substantial. I split the subject off to #661420: Installation of modules is hugely inefficient.
Comment #47
catch: Just cross-posting this patch: http://drupal.org/node/358815 - which might be some of the same problem.
Comment #48
mcrittenden CreditAttribution: mcrittenden commented: Subscribe.
Comment #49
cburschka: I can cut the default profile's peak installation usage by around 7MB, almost getting it under the 32M limit, by inserting the menu routers one at a time instead of building up one big query object (i.e. executing the query once for each router).
The time cost of the repeated database queries appears to be between 0.5 and 1.5 seconds. That is not negligible, but it seems worth it for 7MB - particularly since the module installation issue has some proposals that can cut execution time to 6-7 seconds anyway.
Diff:
Before:
After:
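The one-at-a-time approach described in #49 can be sketched with the D7 database API like this (the field list here is simplified for illustration; the real menu_router table has many more columns):

```php
<?php

// Multi-insert (before): every router row accumulates inside the one
// InsertQuery object until execute(), so the whole menu sits in memory
// both as $menu and as the query's internal value arrays.
$query = db_insert('menu_router')->fields(array('path', 'fit', 'weight'));
foreach ($menu as $path => $item) {
  $query->values(array('path' => $path, 'fit' => $item['fit'], 'weight' => $item['weight']));
}
$query->execute();

// Per-row insert (after): only one row is buffered at a time, trading the
// extra 0.5-1.5s of query overhead for the ~7MB lower peak measured above.
foreach ($menu as $path => $item) {
  db_insert('menu_router')
    ->fields(array('path' => $path, 'fit' => $item['fit'], 'weight' => $item['weight']))
    ->execute();
}
```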
Comment #50
ShutterFreak CreditAttribution: ShutterFreak commented: Is it possible we are facing some known PHP issues here?
Before visiting this issue, I posted a comment listing 5 possible PHP problems in #661420: Installation of modules is hugely inefficient. I now realize my comment on the other issue is maybe more relevant to this one.
Best regards,
Olivier
Comment #51
alexanderpas CreditAttribution: alexanderpas commented: #49 seems to be a very proper use of prepared statements!
Instead of using a 12X-sized dataset 1 time, we're using a 1X-sized dataset 12 times.
(Just an example.)
Comment #52
catch: Yeah, if multi-inserts are taking up that much memory, that's a real problem elsewhere - can we open a new issue for that?
Comment #53
Crell CreditAttribution: Crell commented: 7 MB would surprise me, but perhaps not on a menu rebuild.
Insert statements keep the record they are going to insert as array data within the query object. If it's a multi-insert, then it's a multi-dimensional array. If you have a crap-ton of records to insert, you have a correspondingly large crap-ton of entries in that array. :-) It should scale linearly, but a menu rebuild is an edge case where you're inserting hundreds, perhaps a thousand records at once. I can see that causing memory issues.
At one point we discussed having a ->executeEveryXRecords() method on insert statements, but decided that was really the caller's responsibility to do. So we absolutely could chunk the insert up into pieces without dropping multi-insert completely here. What an optimal number is I don't know off hand, but we could start with something like 10 or 20 and test from there. Once the code is converted it's dead simple to tweak the number.
If we do that, though, we should absolutely wrap this function in a transaction. Honestly I'm surprised it isn't already. Wiping and rebuilding the menu tables really should be atomic. :-)
Comment #54
Crell CreditAttribution: Crell commented: Here's a patch that batches the inserts in groups of 20. I also checked, and the transaction really belongs a step higher, in menu_rebuild(), so I added it there, since we're increasing the number of queries that could potentially break the system entirely if they fail. :-)
We can haz benchmarks?
Comment #55
cburschka: The code-style fix in this patch conflicts with core; it was probably fixed already.
Unfortunately, my benchmark shows a very small improvement for this patch. I'll try to find out why.
Comment #56
cburschka: Oops - code error.
Of course, unless you reset $num_errors to 0, that is only going to fire for the first 20 records.
Comment #57
cburschka: The fixed patch is here.
New benchmarks:
Shows that 20 really does seem to be the optimal trade-off.
Comment #58
Anonymous (not verified) CreditAttribution: Anonymous commented: I created #650858: wrap menu_rebuild() operations in a transaction to wrap menu_rebuild() in a transaction, so I'm happy to see that get in here.
Looks like there's a bug with the chunking:
It looks like if we're inserting an exact multiple of 20 records, then we'll call $insert->execute() without any values. I guess we need something like:
Otherwise, looks good to me.
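The chunking plus the guard being suggested could look something like this (a hedged sketch; variable names are illustrative and not taken from the actual patch):

```php
<?php

// Flush the insert every 20 rows, starting a fresh query object and
// resetting the counter each time (otherwise the flush only ever fires
// for the first 20 records - the bug noted in #56).
$insert = db_insert('menu_router')->fields($fields);
$num_records = 0;
foreach ($menu_rows as $row) {
  $insert->values($row);
  if (++$num_records == 20) {
    $insert->execute();
    $insert = db_insert('menu_router')->fields($fields);
    $num_records = 0;
  }
}
// Guard for the exact-multiple-of-20 case: only execute the trailing
// insert if any unflushed rows remain. (#64 later notes the DB layer
// also self-guards against empty inserts.)
if ($num_records > 0) {
  $insert->execute();
}
```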
Comment #59
Anonymous (not verified) CreditAttribution: Anonymous commented: Discussed with Arancaytar in #drupal; rerolled as per #58.
Comment #60
Jon Nunan CreditAttribution: Jon Nunan commented: From a quick look, I thought the new DB layer already uses transactions, when available, on ->insert. Isn't this nesting transactions? I thought MySQL didn't support nesting. I could very well be wrong on all of this, though.
Comment #61
cburschka: Well, it's definitely working, so either MySQL is dropping the inner transaction without breaking, or there is no inner transaction going on.
Edit: The transaction wrapping matters in the degenerate case, when each row requires its own database query. Since MySQL supports inserting multiple rows in a single query, no transaction is needed or used there. However, it would be a good idea to test this patch with an SQLite database.
Comment #62
Anonymous (not verified) CreditAttribution: Anonymous commented: meatsack: We don't support transactions by default in MySQL yet:
see this issue #616650: Default MySQL to transactions ON if you want to help.
On the nesting - it's emulated in the database code; it's not a nested transaction on the DB server.
Comment #63
catch: There are no implicit transactions within the database layer, AFAIK - each has been added case by case; see node_save() for example.
Comment #64
Crell CreditAttribution: Crell commented: The DB layer handles nesting transactions in PHP-space safely. The MySQL InsertQuery implementation does not use transactions anyway; it uses multi-insert statements (which are the issue here). Insert queries also self-guard against empty inserts, so the edge case where we insert an exact multiple of 20 records is already handled.
Comment #65
moshe weitzman CreditAttribution: moshe weitzman commented: We are building a fortress around menu_rebuild() [lock API, transaction, custom batched inserts, ...]. That's a code smell to me. I wish we would refactor that rebuild, even at this stage of the release. I'm +1 on this patch; I'm just saying ...
Comment #66
cburschka: How should it be refactored, though? I'm not sure how the fortress is a bad thing here - it is a fairly critical system change, after all.
Comment #67
Crell CreditAttribution: Crell commented: Well, truthfully, I think a lock *should* include a transaction by default. We can't change that now, but for future reference... The "custom batched inserts" aren't a fortress, or even custom. It's a feature of how the database API works that you can do that when necessary. It looks like here, it's necessary.
Comment #68
catch: There are issues to refactor menu_rebuild(), but they involve comparing the entire database array to the entire hook_menu() + _alter() array, which doesn't bode well for bringing memory use down.
I've previously measured around 5MB taken up by menu_rebuild(), and it looks like we're cutting down a big percentage of that here, which is great. I don't see how we can refactor it without something like menu_router_save() and killing hook_menu() - but that's not for D7 and has its own issues, I think.
Comment #69
moshe weitzman CreditAttribution: moshe weitzman commented: OK, sure - the menu rebuild refactor can happen next release.
@Crell - I agree that locks should be surrounded by transactions by default. I do see that as a D7 thing, in case you were on the fence and needed a code reviewer :)
Comment #70
catch: Locks surrounded by transactions also sound D7-ish to me; that's not an API change which affects modules porting to any extent at all, and it probably saves a lot of work when people later try to convert to the locking framework instead of variable semaphores, etc.
Comment #71
Crell CreditAttribution: Crell commented: Well, it is D8-ish if we need to change the locking API at all, since I'm not entirely sure how we'd do it currently. :-) That's not a performance question, though, so it's off-topic for this thread. Let's open a new one for that if you want to push it. For now, #59 needs some updating as above.
Comment #72
cburschka: Which changes need to be made to #59?
The only explicit suggestion I can see is #64, about the edge case being implicitly handled - but that would just take us back to patch #58 without the edge-case condition...
Comment #73
Crell CreditAttribution: Crell commented: Actually #57, I think - but you're right, that's the good one.
Since the bot is ignoring us, I'm going to mark #57 RTBC. Let's hope we don't regret that. :-)
Comment #74
Juanlu001 CreditAttribution: Juanlu001 commented: Subscribing.
Comment #75
cburschka: After what we've seen today... let's wait for the bot. :P
Comment #76
catch: Also, it's not 100% clear which patch is the good one; can someone re-upload?
Comment #77
Anonymous (not verified) CreditAttribution: Anonymous commented: #57 is the good one. Crell pointed out that the "execute an insert without any values" case is handled by the DB layer, so we don't need to account for it. Re-uploaded...
Comment #78
Juanlu001 CreditAttribution: Juanlu001 commented: It may seem a stupid question (this is my first comment, despite some "Subscribing"), but does #57 really solve the problem? I mean, Arancaytar wrote that the peak memory usage was 32.95 MB after applying the patch, which is a remarkable improvement, but it's still over the 32M memory limit.
Comment #79
int CreditAttribution: int commented: No, but it's one step further...
Comment #80
Anonymous (not verified) CreditAttribution: Anonymous commented: I'd be interested in the memory numbers with the patch from #661420: Installation of modules is hugely inefficient applied.
Comment #81
cburschka: I got a bit over 29MB with both (around 37MB with just the other one). Both of these changes together would put us back under the limit, though it would be a good idea to investigate further and gain a bit of a safety margin.
Comment #82
Anonymous (not verified) CreditAttribution: Anonymous commented: Arancaytar: Thanks - woo, nice to be under 32M. I agree we should keep looking.
I posted a patch at comment #10 of #661420: Installation of modules is hugely inefficient to give us a real cache during install. Not sure if this would help or hinder the memory issues, but it's probably worth trying.
Comment #83
ShutterFreak CreditAttribution: ShutterFreak commented: Could we try to see how memory usage evolves as a function of the number of include() statements?
I'm not sure whether we have a simple way to replace some occurrences with the inline code just for a try. Probably some shell script with pattern substitution could do the job?
Comment #86
Crell CreditAttribution: Crell commented: Let's get #77 committed before it breaks again. It is a memory improvement and a safety improvement. We can do more after that part is in, in this and other threads. (Whether it gets us past the 32 MB magic barrier is irrelevant. Less memory usage == good.)
Comment #88
sun: Please revert.
s/Execute/Insert/
The added newline seems wrong here.
Comment #89
cburschka
Comment #90
cburschka: Removed the first newline, and changed the word (the first "Execute" is correct, since it refers to the query; the second one indeed should be "Insert").
The second blank is debatable - I like loops to stand clear, but a counter-argument is that the "any remaining" insert is in some ways the "final iteration" of the loop and semantically a part of the "code paragraph". Removed the second newline too.
Comment #91
Crell CreditAttribution: Crell commented: I would disagree as well and include the extra blank line there, as I see them as separate code stanzas. Frankly, "execute" is accurate as well, but I'm not going to belabor the point.
#90 and #77 are identical save for stylistic non-code stuff. Drieschick, please commit one of them soon. :-)
Comment #92
webchick: I went with #90. We could always clean up code style en masse in a separate patch.
As Crell mentions, menu_rebuild is pretty edge-casey in the number of queries it fires at the same time. But I do worry about this coming up elsewhere randomly in contrib as well. Hrm...
Anyway, I've committed this to HEAD as a starting point. Marking back to active since it sounded like we wanted to do further digging here.
Comment #93
cburschka: Yay! Installation of HEAD with the default profile now peaks at 32.95MB usage, as opposed to 40MB. We're nearly there! :)
Comment #94
Jon Nunan CreditAttribution: Jon Nunan commented: I've been able to install HEAD with only 32MB for a while now. Arancaytar, can you try an install with the ini set to 32MB?
I wonder if the profile functions themselves are pushing you over now? Or maybe PHP will use more memory if it has access to it...
Comment #95
ShutterFreak CreditAttribution: ShutterFreak commented: Revisiting #44, I am puzzled why readdir is invoked 127618 times during install.
Comment #96
Jon Nunan CreditAttribution: Jon Nunan commented: ShutterFreak, the benchmark in #44 was done before #358815: drupal_get_install_files() is slow with a large tree landed. It should be much less than that now.
Comment #97
cburschka: Nope, still failing.
However, it works with 33M:
Are you testing this with a web install? That is known to use less memory than the CLI script (see #33)...
Comment #98
catch: Memory usage is going to differ depending on which PHP extensions you have loaded, APC or not. It's quite reasonable to expect some variation between machines.
Comment #99
Crell CreditAttribution: Crell commented: So we're right at the 32 MB limit right now. That means, to be safe, we still want to try to shave a couple of megs off. Any idea where else we could do that? Any other big arrays we can kill?
Comment #100
cburschka: Note: If we don't get to a safety margin of about 2-4MB (at least), we should probably increase the memory requirement before the alpha release; see webchick's comment #31. A "fatal error, out of memory, installation partially failed" is not a good first impression when the alternative is a helpful message telling people to increase their memory limit...
Comment #101
int CreditAttribution: int commented: I don't think that is an alpha blocker - maybe an RC blocker...
The alpha release is only for dev.
Comment #102
Anonymous (not verified) CreditAttribution: Anonymous commented: Arancaytar: Can you hit me with a cluebat re: the steps you took to install via the CLI in #97? Do you just invoke that from the webroot of a clean install?
Comment #103
moshe weitzman CreditAttribution: moshe weitzman commented: The drush installsite command is pretty nice for CLI installs of any install profile.
Comment #104
moshe weitzman CreditAttribution: moshe weitzman commented: Figured I would chime in here with more interesting data. I just enhanced Drush so it reports memory usage and watchdog entries. Here is the latest output for the installsite command.
Some observations:
Comment #105
Anonymous (not verified) CreditAttribution: Anonymous commented: Thanks, moshe - could you give us the options for drush to make it report those memory and time numbers?
Comment #106
moshe weitzman CreditAttribution: moshe weitzman commented: drush installsite --debug --yes. This shows debug info like memory snapshots, and skips the confirmation, which would just throw off the timer.
Comment #107
Anonymous (not verified) CreditAttribution: Anonymous commented: moshe: Thanks, I'm going to try this with the fakecache patch and see if that makes a difference to memory usage.
Comment #108
webchick: So... the alpha is tomorrow. What's a safe value to set this minimum to? Could someone run a couple of tests?
Comment #109
cburschka: @webchick: Installation is almost definitely safe at 40M; even in drush it seems to peak at 36M.
(Furthermore, Drupal runs comfortably within 32M after installation/while using minimal.profile, so maybe the memory requirement should be added to default.profile as default_requirements instead?)
I got over 64MB usage when visiting the Simpletest overview the first time (probably on parsing all those files) which I haven't been able to reproduce. Trying it right now with a new install uses only 27MB. I'll investigate more thoroughly tonight; but even if it turns out to be a problem, this should probably be a simpletest_requirements() change, and not drag up the memory requirement of Drupal itself.
Comment #110
int CreditAttribution: int commented: The minimum required PHP memory should be different if we have other PHP modules on, like rewrite/APC/vhosts/...
So 40MB, given that the peak now is 32MB, seems OK to me.
But this only works at installation, and with the minimal profile.
So we must have at least four minimum requirements:
Installation, minimal profile: 40MB
Installation, standard profile: ?? MB
A normal Drupal site with all core modules on: ?? MB
Executing SimpleTest: ?? MB
And say that any module addition will increase the PHP memory that is required.
Comment #111
catch: Variable requirements have been discussed in #309457: Allow profiles to specify required memory in .info file - we shouldn't have that discussion here too.
It's almost tempting to remove one module from the default profile to try to push this under 32MB - I vote for color.module.
Comment #112
int CreditAttribution: int commented: Removing core modules from the default installation to reduce PHP memory isn't the solution...
Because, following that logic, we will have to remove one more in D8.
Either we implement a new feature to free the memory that we don't need, or we increase the memory requirement.
Comment #113
webchick: For the purposes of alpha 1, I just committed a patch to raise this limit temporarily to 40M. But let's keep at it here so we can get this back down to 32M prior to 7.0. :)
Comment #114
cburschka: For individual modules at run-time this is something else, but if we know for certain that the installation of the default profile takes a certain amount of memory (32M on the web, up to 36M in the CLI, but certainly under 40M), this is a pretty fixed number. The arguments about "depending on datasets" or unforeseen circumstances don't really apply in this case.
Comment #115
FiReaNGeL CreditAttribution: FiReaNGeL commented: Wouldn't a solution be to split the install process into many parts? Maybe via AJAX calls to install each module - of course, this would require JS to be enabled.
Comment #116
cburschka: That's not possible in a command-line install. The web installer already uses the fairly robust Batch API, which is why it's not using as much memory...
Comment #117
webchick: I ran into #550124-28: Remove prepared statement caching while playing around with hook_query_alter() last night. Basically, we're caching the compiled output of every single query on the page in memory.
I'm curious: if the following line in includes/database/database.inc is commented out:
...what does that do for our RAM footprint during install?
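For context, the prepared-statement caching pattern at issue looks roughly like this (an illustration of the general pattern, not the actual database.inc code):

```php
<?php

// Illustration only: every distinct query string compiled during the
// request stays in this array for the lifetime of the connection, which
// is the memory cost being discussed.
class ExampleConnection extends PDO {
  protected $statementCache = array();

  public function prepareCached($query) {
    $key = md5($query);
    if (!isset($this->statementCache[$key])) {
      // Commenting out this cache write is essentially the experiment
      // below: each prepare() becomes transient and is freed after use.
      $this->statementCache[$key] = parent::prepare($query);
    }
    return $this->statementCache[$key];
  }
}
```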
Comment #118
webchick: Oh, never mind. I just saw #106, so I can do this myself. :)
HEAD:
With "patch":
Well, that shaves off a little. But not several MBs.
But um. Hey! It's down to < 20 MB now?! Can someone confirm those numbers?
Comment #119
webchickHere's the patch if someone wants to play around with it.
Comment #120
webchickLet's try that *without* the 250,000 patches that I've reviewed over the past year. ;)
Comment #121
huysonzone CreditAttribution: huysonzone commented#1: htaccess.patch queued for re-testing.
Comment #122
moshe weitzman CreditAttribution: moshe weitzman commentedFYI, I just retested and webchick's patch shaves 2.5MB from a default install. Look for the MB number reported in each line below.
Comment #123
moshe weitzman CreditAttribution: moshe weitzman commentedBTW, I have no idea why my memory consumption is twice as much as webchick's. Maybe xdebug or some other PHP config is at play here. It would be great if webchick could repeat with the latest drush and latest D7.
Comment #124
Jon Nunan CreditAttribution: Jon Nunan commentedNumbers from one of my Windows machines with the latest drush and HEAD:
HEAD unpatched:
HEAD with no-cache-for-you (#120):
So around 2MB less memory. Something weird is going on with those long run times on this machine, but that'll be down to my setup.
Comment #125
catchCrell's benchmarks for prepared statement caching or not show no measurable difference if we remove it:
Cache no, Emulate no: 10.16 requests/second
Cache no, Emulate yes: 10.23 requests/second (same as #120)
Cache yes, Emulate no: 10.16 requests/second
Cache yes, Emulate yes: 10.25 requests/second (Current Drupal HEAD)
So I think we should do #120. However, it only comments out the caching, and Crell suggested we need additional cleanup elsewhere if we do it, so CNW.
Comment #126
catchLet's remove prepared statement caching in #550124: Remove prepared statement caching, which means no patch here.
Comment #127
webchickOk, I just committed #550124: Remove prepared statement caching, so at least one major memory suck is gone.
@moshe: I did a cvs up -dPC of drush HEAD and core, and I still get numbers similar to before:
It's interesting that you ask about xdebug because I recently totally screwed MAMP and had to reinstall, and it may indeed have been around that time that I did my original tests. But tonight, I'm sure I have xdebug installed, and my memory footprint remains low.
Here's a copy of my xdebug settings:
One thing I do not have is xdebug profiling on. Could that be the difference?
In any case, here's my phpinfo(), which is basically out-of-the-box MAMP: http://webchick.net/files/_temp/phpinfo.php.html
Moshe, could you post your config so we could try to get to the bottom of this memory discrepancy?
Also, if someone could run a test through the GUI and check RAM usage like Arancaytar did earlier, I think we might be able to set this down to 32M and call it good?
Comment #128
moshe weitzman CreditAttribution: moshe weitzman commentedI don't have xdebug profiling either. I tried with APC enabled and disabled and saw no difference. webchick's MAMP beats my newish unibody badly. /me weeps. Do you have a ramdisk or something?
Comment #129
webchickUm. I don't know what ramdisk is. I'm guessing no. :) It's just on my regular ol' hard drive in the /Applications folder on my 2.5-year old MacBook Pro. I /do/ have 4GB of RAM in this machine, but I don't think that would make a difference?
Comment #130
ctmattice1 CreditAttribution: ctmattice1 commented@moshe weitzman "do you have some ramdisk or something?"
I haven't heard anyone mention a ramdisk in YEARS; they kind of died with the 486. Boy, could they speed up a process, though: back then they were anywhere from 100 to 1000% quicker than the slow hard drives of the time. I wonder how feasible putting one of those bad boys on a system today would be.
Guess I'm showing my age.
Comment #131
moshe weitzman CreditAttribution: moshe weitzman commentedPutting all your mysql data into a RAMDISK or similar has dramatic speed advantages. Thats how our pifr test bots get through the whole test suite in 3 minutes. The setup for a pifr test bot is documented at http://qa.drupal.org/performance-tuning-tips-for-D7. See "Run MySQL in memory"
Comment #132
ctmattice1 CreditAttribution: ctmattice1 commentedThanks Moshe
I'll give it a shot on my CentOS box, I'd forgotten all about /tmpfs
Comment #133
catchOne other idea: someone who's hitting the 32M limit, please try deleting every single hook_update_N() in system.install (and ideally the same in other modules' .install files), then try updating and measuring the memory again. I reckon we could shave about 500KB or more off for non-APC hosts if we moved those out to a .update file.
Comment #134
webchickI'm not hitting the limit myself, but here's the drush test...
HEAD:
HEAD minus updates:
0.5 MB is not all that impressive. :\ Shaving 7 seconds off the run-time is nice, but that could just be a glitch.
Here's the patch I used. I just did it manually because my brain needed a nice, brainless break from my other work, so thanks for that. :)
Comment #136
alexanderpas CreditAttribution: alexanderpas commentedI think I just managed to install the bare essentials of Drupal, using the minimal profile, within 16MB.
However, to be able to view the "Configure Site" page I needed 17MB.
I've confirmed this by re-installing with 17MB (which brought me to the "Configure Site" page without a hitch), lowering the memory limit to 16MB, restarting the server, and refreshing (which resulted in a memory limit error).
Also, when I lowered the memory limit to 15MB, it installed all the tables but wasn't able to populate the menu_router table.
Conclusion: the "Configure Site" page needs over 16MB to even get displayed (and I think that means there is a bug somewhere).
To be able to submit that page and finish the installation, I needed 1MB of additional memory (which is to be expected).
Note that this is using the minimal profile, so actual Drupal installs will fare even worse.
(Environment: Ubuntu 9.10 off the shelf)
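One simple way to gather per-request numbers like these (illustrative only, not part of any patch in this issue; uses a PHP 5.3+ closure) is to register a shutdown handler near the top of install.php that logs the request's peak memory once the page has finished.

```php
<?php
// Log peak memory at the end of the request. The TRUE flag reports
// memory actually allocated from the system rather than just the
// amount currently in use by the script.
register_shutdown_function(function () {
  printf("peak: %.1f MB\n", memory_get_peak_usage(TRUE) / 1048576);
});
```

In a real test you would append this to a log file instead of printing, so the numbers survive the installer's redirects.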
Comment #137
obris CreditAttribution: obris commented#1: htaccess.patch queued for re-testing.
Comment #138
catchI tried with webchick's patch and also got about 500KB less usage, though peak memory for me was about 48MB.
Comment #139
sunKilling 0.5 MB of memory consumption and speeding up by 7 seconds, by splitting updates away into $module.update.inc sounds like a very good improvement to me. Note that $module.install files are also loaded during regular site operation, whenever the cached schema is rebuilt. Hence, that change will improve performance of installing/enabling/disabling/uninstalling modules.
Comment #140
Crell CreditAttribution: Crell commentedSplitting off update hooks makes a lot of sense, since those are even more rarely used than install hooks. It's also an easy-to-follow pattern.
Would we want to make that required or optional? (That is, allow modules with only one or two update hooks to leave them in the .install file.)
Comment #141
Anonymous (not verified) CreditAttribution: Anonymous commented+1 Required
Comment #142
mikejoconnor CreditAttribution: mikejoconnor commentedI think I got everything. Here is a patch to add module_load_update() and move all of the update functions, as well as the update_dependency functions to the new update.inc files. Personally I think it is a good idea to force this. After making this patch, I can honestly say it's not *that* bad.
Please review, it's my first run at this, so I'm sure I missed something.
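For orientation, here is a rough sketch of the helper the patch describes; the real code is in the attached patch, and the stub loaders below are assumptions standing in for Drupal's actual module_load_install()/module_load_include() so the flow can be run outside a Drupal checkout.

```php
<?php
// Stubs for Drupal's loaders, defined only when run standalone.
if (!function_exists('module_load_install')) {
  function module_load_install($module) {
    // Real Drupal: include_once the module's .install file.
    print "loading $module.install\n";
  }
}
if (!function_exists('module_load_include')) {
  function module_load_include($type, $module, $name = NULL) {
    // Real Drupal: include_once "$name.$type" from the module's directory.
    print "loading $name.$type\n";
    return TRUE;
  }
}

// Hypothetical helper in the spirit of the patch: load a module's
// $module.update.inc, plus its .install file, since schema helpers
// defined there are often needed while running updates.
function module_load_update($module) {
  module_load_install($module);
  return module_load_include('inc', $module, $module . '.update');
}

module_load_update('system');
```

Loading the .install file first addresses the point raised later in this thread that update functions frequently call schema helpers defined there.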
Comment #144
sun1) Wrong arguments for http://api.drupal.org/api/function/module_load_include/7.
2) I think we should additionally load the $module.install file when loading $module.update.inc. It's likely that schema and installation functions are needed when running updates.
90 criticals left. Go review some!
Comment #145
mikejoconnor CreditAttribution: mikejoconnor commentedSun,
Thanks for the review. Here is an update to fix the argument issue, and load the .install file.
Comment #146
catchI have a bit of a bad feeling about http://api.drupal.org/api/function/drupal_get_schema_versions/7, which is called by drupal_install_modules() and requires all .update files to be loaded into memory. We might not be able to get an actual saving here, although it's possible there's a way around this.
Comment #148
Anonymous (not verified) CreditAttribution: Anonymous commentedMaybe add a hook_schema_update() that registers the hook_update_N() implementations, to control this?
Comment #149
sunI'd suggest to move the $module.update.inc split + discussion into a separate issue, so this issue can stay and serve as general/meta issue.
@mikejoconnor: Could you create that issue and link to it here?
Comment #150
mikejoconnor CreditAttribution: mikejoconnor commentedMoved the .update.inc discussion to #788496: Break update functions into update.inc
Comment #151
mikejoconnor CreditAttribution: mikejoconnor commentedI'm currently able to install the standard profile with 27MB of RAM. With the minimal profile, I am able to install with 20MB.
Currently, I believe this issue is resolved, unless I'm missing something.
Comment #152
webchickOk. I committed the attached patch, which moved this memory requirement back down to 32M. Let's see if we get any more bug reports about out of memory errors during installation, but it looks like just about everyone who's posted stats since we backed out that DBTNG query cache is well below that limit.
Comment #154
David_Rothstein CreditAttribution: David_Rothstein commented@webchick, did you ever commit that?
I came across this issue for another reason, but then read your comment that you put it back down to 32M and thought "hm, that's strange, because I'm pretty sure it's back at 40M now", and looking at CVS, it doesn't look to me like the patch was ever committed in the first place...
(I'm marking "needs review" not RTBC because I don't know if 32M is actually enough anymore.)
Comment #155
David_Rothstein CreditAttribution: David_Rothstein commentedOh, I did not mean for this to show up on the criticals list, oops.
Comment #156
mcrittenden CreditAttribution: mcrittenden commentedThe "alpha blocker" tag is no longer relevant since we're on beta now.
Comment #157
David_Rothstein CreditAttribution: David_Rothstein commentedWhen I install Drupal via the command line, the script I use always prints out the maximum memory taken. With current HEAD, that comes out to around 26-27M using the Standard profile.
I just added some quick logging to the browser-based install too, and for the Standard install profile, no page request got above around 25-26M.
So, it seems safe to mark this RTBC. Other install profiles can (and will) exceed the 32M limit, but they could always add their own hook_requirements() if they need to.
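A hedged sketch of what such a profile-level check might look like ("myprofile" and the 64M figure are hypothetical, and Drupal's REQUIREMENT_ERROR constant is replaced with its literal value so the sketch runs standalone):

```php
<?php
// Hypothetical minimum for an install profile that needs more than core.
define('MYPROFILE_MINIMUM_MEMORY', '64M');

// Sketch of a hook_requirements() implementation in a profile,
// flagging the memory limit during the 'install' phase.
function myprofile_requirements($phase) {
  $requirements = array();
  if ($phase == 'install') {
    $limit = ini_get('memory_limit');
    $requirements['myprofile_memory'] = array(
      'title' => 'Memory for My Profile',
      'value' => $limit,
      // 2 stands in for Drupal's REQUIREMENT_ERROR constant.
      'severity' => 2,
      'description' => 'My Profile needs at least ' . MYPROFILE_MINIMUM_MEMORY . '.',
    );
  }
  return $requirements;
}

$reqs = myprofile_requirements('install');
print count($reqs) . "\n";
```

A real implementation would compare the parsed limit against the minimum and only raise the error when the limit is actually too low.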
Comment #158
juan_g CreditAttribution: juan_g commentedCurrently, in includes/bootstrap.inc:
Comment #159
webchickOk, let's try it again then. :P~
Committed to HEAD. For real this time. I think. :P
Comment #160
juan_g CreditAttribution: juan_g commentedIt's 32M now (current bootstrap.inc, at drupalcode.org).
Comment #161
HansKuiters CreditAttribution: HansKuiters commentedI just installed 7.0-beta1 on a Mac with MAMP (PHP 5.2.6) and 16M of memory. The install fails at the 'Install profile' step with the message: "An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: http://localhost:8888/drupal-7/install.php?profile=standard&locale=en&id... StatusText: OK". I changed the memory limit to 32M and the install is now successful.
Couldn't there be a memory check before install?
(I didn't read all the messages above. I don't know if I should start a new issue, since this concerns the beta.)
Comment #162
David_Rothstein CreditAttribution: David_Rothstein commented@capono: There is a memory check done before install. However, I think the problem you're encountering is because when the check fails, it only triggers a warning (not an error), and there seems to be a bug in Drupal 7 where if the requirements check only results in warnings but not any errors, the warnings are never displayed.
This should be a separate issue. I created one here: #951644: Requirement warnings (e.g. for PHP memory limit) are not shown on install or update unless there is a requirement error also
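For reference, here is a minimal sketch (the helper name is assumed, not Drupal's actual requirements code) of the kind of check the installer performs: parse PHP's memory_limit shorthand (e.g. "32M") into bytes and compare it against the recommended minimum discussed in this issue.

```php
<?php
// Convert a php.ini shorthand value like "32M" into bytes.
function parse_memory_limit($limit) {
  if ($limit === '-1' || $limit === -1) {
    return PHP_INT_MAX;  // Unlimited.
  }
  $units = array('K' => 1024, 'M' => 1048576, 'G' => 1073741824);
  $suffix = strtoupper(substr($limit, -1));
  if (isset($units[$suffix])) {
    return (int) $limit * $units[$suffix];
  }
  return (int) $limit;  // Plain byte count, no suffix.
}

$required = parse_memory_limit('32M');
$ok = parse_memory_limit('16M') >= $required;
echo $ok ? "ok\n" : "warn\n";
```

Per the comment above, the check exists but only produces a warning; the separate issue covers why that warning never reaches the user.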
Comment #163
HansKuiters CreditAttribution: HansKuiters commented@David_Rothstein: thank you.
Comment #165
David_Rothstein CreditAttribution: David_Rothstein commentedNote that there are (some) reports coming in of 32M not being enough on particular hosting environments. See #1008362: 32M is sometimes not enough memory to install Drupal 7 for the followup issue.
Comment #166
moshe weitzman CreditAttribution: moshe weitzman commentedI'm hearing reports that 128MB is not enough for Drupal 8. Now is a good time for folks in this issue to mobilize and start fixing up D8. You can post on #1744302: [meta] Resolve known performance regressions in Drupal 8 or start a new issue.