Status: Active
Project: Cache Router
Version: 6.x-1.0-rc2
Component: Code
Priority: Critical
Category: Bug report
Assigned: Unassigned
Reporter:
Created: 10 May 2011 at 02:14 UTC
Updated: 30 Sep 2011 at 03:36 UTC
Cache write error, failed to open file "/files/filecache/cache/8/variables-87cd8b8808600624d8c590cfc2e6e94b"
I only have this file in the cache/8 directory:
variables-87cd8b8808600624d8c590cfc2e6e94b.perm
Comments
Comment #1
sashken2 commented:
I have this problem too. I'm using filecache. After updating to 6.x-1.0-rc2, I see this error.
Comment #2
rkodrupal commented:
Same problem here with 6.x-1.x-dev... this is a new install.
Windows XAMPP 1.7.3 localhost.
Comment #3
lieb commented:
Starting to get a lot of these:
Cache write error, failed to open file .../sites/default/cache/cache_page/...
May need to roll back to the previous version or disable Cache Router completely.
Comment #4
rkodrupal commented:
Never got it to work on localhost, but after updating on the hosting provider it seems to work fine. Might be an XAMPP issue?
Comment #5
calypso2k commented:
OK, so... for some reason the cwd (current working directory) in engines/file.php is '/'. This is the reason for "Cache write error, failed to open file...".
engines/file.php, line 80:
+chdir($_SERVER["DOCUMENT_ROOT"].base_path());
This does the trick for me. Can you confirm?
Comment #6
jim kirkpatrick commented:
@calypso2k: Your added line gives me "Fatal error: Call to undefined function base_path() in /web/transitionnetwork.org/www/sites/all/modules/cacherouter/engines/file.php on line 80".
Upgrading - this is a biggie: it sometimes stops data being saved on our site, and somehow loses the contents of fields.
Will roll back to RC1 or earlier and see...
Comment #7
jim kirkpatrick commented:
OK, maybe I was hasty in the upgrade...
SOMETHING has changed since the updates to our site at the end of May, and that something is stopping users from saving data and emptying certain user reference fields. Not sure if this is related, but we get a LOT of these errors in our dblog.
The finger of doubt is pointing at Mollom too...
Anyone have any thoughts? Prior to RC1 we didn't get any issues like this at all... Is anyone else finding that users cannot save pages, or that fields are emptied?
Comment #8
jim kirkpatrick commented:
The error happens because locking/writing/opening a file fails. From engines/file.php, line 75 onward:
This begs two questions:
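The code snippet quoted with this comment has not survived in this copy of the thread. Based on the behaviour described in #10 and #12 (a non-blocking lock that silently drops the write when it cannot be acquired), the write path in engines/file.php presumably has roughly this shape - a hedged reconstruction, not the module's actual source, and the function name is hypothetical:

```php
<?php
// Hypothetical reconstruction of the locking write described above.
// If the non-blocking flock() fails, the function returns FALSE and
// the cache entry is simply never written: no retry, no cleanup.
function filecache_write($filename, $data) {
  $fh = @fopen($filename, 'cb');
  if ($fh === FALSE) {
    // This path produces "Cache write error, failed to open file ...".
    return FALSE;
  }
  if (!flock($fh, LOCK_EX | LOCK_NB)) {
    // Lock held elsewhere: the new data is silently discarded.
    fclose($fh);
    return FALSE;
  }
  ftruncate($fh, 0);
  fwrite($fh, $data);
  flock($fh, LOCK_UN);
  fclose($fh);
  return TRUE;
}
```

On this reading, a concurrent reader holding the lock is enough to make the cache entry go stale, which matches the symptoms reported later in the thread.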
Comment #9
rkodrupal commented:
Don't know if this comment will help or hinder, but I've been using the original file.php on my hosted server for close to three weeks now without seeing this error... it only appears on my localhost. If there's anything I can compare across the two versions that might help in your diagnosis, please let me know.
K
Comment #10
jim kirkpatrick commented:
I wasn't hasty in the upgrade. This is a critical data loss issue...
I've now seen many instances of forms saying they've been saved, only to reload with the old values in place. Users saving data believe they've done the job, but the system hasn't managed to save it. As soon as I turned off Cache Router (using the file engine) it all stopped and the site worked perfectly again. There were a LOT of cacherouter errors in the watchdog.
I believe it stems from the bit of code I posted from engines/file.php in comment #8 - the locks on the files mean that if a cache file is in use, the data is simply not saved: no retries, no other fixes that I can see... On a quiet site you'd probably never notice, but on our busy one it's a big issue.
As rkodrupal said in #9, the earlier version of file.php didn't have this issue, because it doesn't bother with file locks... I don't know if that is better or worse for data integrity, but we didn't see these issues until I updated the module.
I think that upon hitting a lock, the code should try again, usleep()ing for 250 microseconds at a time until an upper threshold is reached or the file lock becomes available. I don't know whether the rest of the code is supposed to just write through the cache in such circumstances, but I do know that as the code stands, user data is being lost.
I'll mention this over at #578522: File engine: Performance, filenames, and other cleanup in case...
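The retry idea above could be sketched like this (the 250-microsecond interval is the figure suggested in the comment; the 100 ms ceiling and the function name are my own assumptions):

```php
<?php
// Retry a non-blocking exclusive lock with short sleeps until either
// the lock is acquired or an upper waiting threshold is reached.
// Returns TRUE once the lock is held, FALSE if the threshold expires.
function filecache_lock_retry($fh, $max_wait_us = 100000, $interval_us = 250) {
  $waited = 0;
  while (!flock($fh, LOCK_EX | LOCK_NB)) {
    if ($waited >= $max_wait_us) {
      return FALSE;
    }
    usleep($interval_us);
    $waited += $interval_us;
  }
  return TRUE;
}
```

As #12 later points out, though, this cannot help when the blocking lock was taken earlier in the very same request.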
Comment #11
calypso2k commented:
I've got this error too in some environments.
Try just this:
engines/file.php, line 80:
+chdir($_SERVER["DOCUMENT_ROOT"]);
ONLY IF your site is in the domain root folder (not in a subfolder) - it seems that at this point the function base_path() does not exist.
And yes, if it can't open the file, data is lost - forms submit, but after reload you get the old settings.
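Combining the one-liners from #5 and #11, a guarded version that works whether or not base_path() is defined at this point might look like this (a sketch; the function name is hypothetical and not part of the module):

```php
<?php
// Work around the cwd being '/' when engines/file.php runs: change into
// the Drupal root so relative cache paths resolve. base_path() may not
// be defined yet at this point (see #6), so fall back to DOCUMENT_ROOT
// alone, which is only correct when the site lives in the domain root.
function filecache_fix_cwd() {
  if (function_exists('base_path')) {
    chdir($_SERVER['DOCUMENT_ROOT'] . base_path());
  }
  else {
    chdir($_SERVER['DOCUMENT_ROOT']);
  }
}
```

A cleaner long-term fix would be for the engine to build absolute paths rather than depending on the working directory at all.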
Comment #12
jim kirkpatrick commented:
Re "forms submit but old values appear": this is actually a symptom of the cache not updating properly - the actual data IS saved...
However, the contents of the file cache quickly go out of date with the actual data because of this locking issue... So when you save, you do save, but the page loads the old, incorrect values from the cache files because they were locked at the moment they needed updating. From there, the system's caches continue to drift further out of date with the real data.
Another big problem is that 'clear all caches' doesn't seem to actually clear the file cache - yikes! You can see the real data by running rm -R * in the file cache folder; I've got into the habit of doing this whenever needed now.
This is troubling because, whilst critical data is NOT lost on save, the site continues to use old values, meaning it's possible to save data using obsolete values from the cache. This could snowball, with each save using older data and getting ever more inconsistent.
The above explains my symptoms in #7, with forms 'saving' but not updating, and with users/content disappearing (because they had fields that were saved with incorrect cached data in them).
And further to my idea in #10 to loop/wait until a lock is released: I tried this, but it appears the cache files can be locked within the same request! Hence, within a request, no amount of waiting will help, since code earlier in the processing holds a lock that the current code cannot break.
I reckon this locking code is more trouble than it's worth. At the very least, if a lock is 'hit' when a write needs to be done, a list should be kept of obsolete cache files that MUST be deleted at the end of the request. This would ensure cache coherency at a small performance cost. Better to be slower than broken, though!
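The delete-at-end-of-request fallback suggested above could be implemented roughly as follows (a sketch with hypothetical helper names, not module code):

```php
<?php
// When a write is skipped because the cache file is locked, remember the
// now-stale file and delete it once the request finishes, so the next
// request rebuilds the entry instead of serving obsolete data.
$GLOBALS['filecache_stale'] = array();

function filecache_mark_stale($filename) {
  $GLOBALS['filecache_stale'][$filename] = TRUE;
}

function filecache_purge_stale() {
  foreach (array_keys($GLOBALS['filecache_stale']) as $filename) {
    // The lock holder may still have the file open, but on POSIX systems
    // unlink succeeds anyway and readers keep their existing handle.
    @unlink($filename);
  }
  $GLOBALS['filecache_stale'] = array();
}

// Run the purge after the response has been generated.
register_shutdown_function('filecache_purge_stale');
```

This trades a few extra cache misses for coherency, which matches the "better to be slower than broken" position above.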
Comment #13
jim kirkpatrick commented:
Updating title based on my findings.
Comment #14
socialnicheguru commented:
I am getting file cache write errors also.
I notice that some directories have the wrong owners.
Once I chown -R www-data:www-data files/filecache, they seem to go away.
Comment #15
jim kirkpatrick commented:
Hi SocialNicheGuru, that is an issue, but we fixed it by emptying the filecache folder and doing the chown thing you did. It did not fix the underlying issue: something in the way Cache Router handles errors means duff/incomplete/obsolete data is often returned from the cache.
We moved to memcache not long after I stopped posting here, but carried on using Cache Router as the module for it... Long story short, it's broken for memcache too as far as we can tell - lots of results being partly returned, because the code doesn't check whether the cacheable data is over 1 MB, and when a write fails it doesn't clear the cache entry out to ensure valid data is fetched on the next request.
We moved to the Memcache module, set a few tables to not be cached by memcached, and all our issues are gone.
I therefore still believe this module has a big data integrity problem that needs resolving; it caused our site a lot of very odd issues with both the File and Memcache backends. I don't recommend using this module for bigger sites until this issue is somehow resolved.
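The missing size check described above would only be a few lines. memcached's default item size limit is 1 MB, so a guard along these lines (a sketch; cache_set_guarded is a hypothetical name, not a module function) would refuse oversized writes instead of letting them fail partially:

```php
<?php
// memcached rejects items over its default 1 MB limit. Serialize first
// and check the size, so an oversized entry is never half-stored and
// any stale copy of the key is removed when the write is refused.
define('MEMCACHE_ITEM_LIMIT', 1048576);

function cache_set_guarded($memcache, $key, $data, $expire = 0) {
  $blob = serialize($data);
  if (strlen($blob) >= MEMCACHE_ITEM_LIMIT) {
    // Too big: make sure no stale entry lingers, then report failure
    // so the caller can fall back to the database.
    $memcache->delete($key);
    return FALSE;
  }
  // Memcache::set($key, $var, $flags, $expire).
  return $memcache->set($key, $blob, 0, $expire);
}
```

Deleting the key on failure is the important half: it is what guarantees the next read misses the cache and fetches valid data, which is exactly the behaviour the comment says is missing.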
Comment #16
5t4rdu5t commented:
Just for those (like me) who need clearer instructions: to solve this problem you need to issue
from the root dir of your site.