When I import a large CSV document, I get the following errors. Does anybody know why? This also happens when cron runs. Is there any fix out there?
* Warning: fopen(public://feeds/FeedsHTTPFetcherResult1295145004_1) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in FeedsFetcherResult->sanitizeFile() (line 88 of /home2/santhos1/public_html/deals/sites/all/modules/feeds/plugins/FeedsFetcher.inc).
* Warning: fgets(): supplied argument is not a valid stream resource in FeedsFetcherResult->sanitizeFile() (line 89 of /home2/santhos1/public_html/deals/sites/all/modules/feeds/plugins/FeedsFetcher.inc).
* Warning: fclose(): supplied argument is not a valid stream resource in FeedsFetcherResult->sanitizeFile() (line 90 of /home2/santhos1/public_html/deals/sites/all/modules/feeds/plugins/FeedsFetcher.inc).
* Warning: fopen(public://feeds/FeedsHTTPFetcherResult1295145004_1) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in ParserCSVIterator->__construct() (line 20 of /home2/santhos1/public_html/deals/sites/all/modules/feeds/libraries/ParserCSV.inc).
* Recoverable fatal error: Argument 2 passed to FeedsProcessor::process() must be an instance of FeedsParserResult, null given, called in /home2/santhos1/public_html/deals/sites/all/modules/feeds/includes/FeedsSource.inc on line 350 and defined in FeedsProcessor->process() (line 102 of /home2/santhos1/public_html/deals/sites/all/modules/feeds/plugins/FeedsProcessor.inc).
Comment | File | Size | Author
---|---|---|---
#97 | interdiff-1029102-96-97.txt | 486 bytes | greenSkin
#97 | feeds-remove-permanent-file-after-import-1029102-97.patch | 13.69 KB | greenSkin
#96 | interdiff-1029102-94-96.txt | 1.5 KB | MegaChriz
#96 | feeds-remove-permanent-file-after-import-1029102-96.patch | 13.65 KB | MegaChriz
#94 | interdiff-1029102-92-94.txt | 598 bytes | MegaChriz
Comments
Comment #1
hwasem CreditAttribution: hwasem commented
I know it has been a while, but I'm curious to know if you ever figured this problem out. I'm getting similar errors on my Drupal 7 Feeds import. I'm only importing around 100 items, but still getting the errors. I'm using Feeds 7.x-2.0-alpha4. It only happens during cron for me, as well.
Comment #2
SeriousMatters CreditAttribution: SeriousMatters commented
I am encountering the same error when running cron to import a large CSV file, using 7.x-2.0-alpha4.
The /sites/default/files/feeds directory is empty at the time of this error. I think this is the trigger of the "failed to open stream" warning.
On /import/my_importer, the status is stuck at "importing - 57% complete", which indicates cron is trying to continue an import started earlier.
If I go into the database and change both the status and fetcher_result fields in the feeds_source table to 'b:0;' (a serialized boolean FALSE), I can restart the import from 0%. And /sites/default/files/feeds then contains a fetcher result file.
So the problem is that when cron tries to continue a previous Feeds import using the saved /sites/default/files/feeds/FeedsHTTPFetcherResult1322621526_0, the file has been deleted for some reason.
I will try increasing the frequency of cron and/or decreasing the frequency of import to see if it helps.
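The manual reset described in this comment can be sketched with Drupal 7's database API (run via `drush php-eval` or a one-off script). This is only an illustration, not code from the thread: the importer machine name and feed nid are placeholders, and depending on your Feeds version the column holding the batch state may be named `state` rather than `status`.

```php
<?php
// Clear the stuck batch state and the stale fetcher result so the
// import can restart from 0%. serialize(FALSE) produces 'b:0;'.
db_update('feeds_source')
  ->fields(array(
    'state' => serialize(FALSE),          // stored as 'b:0;'
    'fetcher_result' => serialize(FALSE), // stored as 'b:0;'
  ))
  ->condition('id', 'my_importer') // placeholder importer machine name
  ->condition('feed_nid', 0)       // 0 for standalone import forms
  ->execute();
```

After this, starting the import again should create a fresh fetcher result file under public://feeds.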
Comment #3
marcus_w CreditAttribution: marcus_w commented
It's been almost a year now and still no fix? Does anyone have the slightest idea what might cause this? I've got the same issue.
My CSV has about 1500 lines...
Thanks,
Mark
Comment #4
dema502 CreditAttribution: dema502 commented
This patch helped me solve a similar problem with PostgreSQL.
Comment #5
emackn CreditAttribution: emackn commented
Have you tried the dev version?
Comment #6
mrfelton CreditAttribution: mrfelton commented
I had a batch running and my browser crashed. After that, the feed importer was stuck at 75% and the import button was greyed out, making it impossible for me to stop or restart the import. After running the following I was able to run the import again.
Comment #7
strr CreditAttribution: strr commented
A CSV import of a 10K file stops at about 9-12% - I tried more than 5 times.
I also tried the latest dev version. Is there any fix for this?
Here is some of my error output:
Warning: fopen(public://feeds/FeedsHTTPFetcherResult1337690085) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in FeedsFetcherResult->sanitizeFile() (line 87 of /is/htdocs/wp1021/www/website/modules/feeds/plugins/FeedsFetcher.inc).
Warning: fgets(): supplied argument is not a valid stream resource in FeedsFetcherResult->sanitizeFile() (line 88 of /is/htdocs/wp1021/www/website/modules/feeds/plugins/FeedsFetcher.inc).
Warning: fclose(): supplied argument is not a valid stream resource in FeedsFetcherResult->sanitizeFile() (line 89 of /is/htdocs/wp1021/www/website/modules/feeds/plugins/FeedsFetcher.inc).
Warning: fopen(public://feeds/FeedsHTTPFetcherResult1337690085) [function.fopen]: failed to open stream: "DrupalPublicStreamWrapper::stream_open" call failed in ParserCSVIterator->__construct() (line 19 of /is/htdocs/wp1021/www/website/modules/feeds/libraries/ParserCSV.inc).
Recoverable fatal error: Argument 2 passed to FeedsProcessor::process() must be an instance of FeedsParserResult, null given, called in /is/htdocs/wp1021/www/website/modules/feeds/includes/FeedsSource.inc on line 356 and defined in FeedsProcessor->process() (line 101 of /is/htdocs/wp1021/www/website/modules/feeds/plugins/FeedsProcessor.inc).
Comment #8
cimo75 CreditAttribution: cimo75 commented
Very same error as in #7.
Comment #9
strr CreditAttribution: strr commented
Please, can someone help get a fix started...
Comment #10
Rob_Feature CreditAttribution: Rob_Feature commented
Still seeing this in the latest dev as of the May 29, 2012 release. Unfortunately I don't know where to start on this... marking as active and major since it seems to kill feed importing.
Comment #11
that0n3guy CreditAttribution: that0n3guy commented
I'm seeing this issue for the first time today as well... I just updated to dev yesterday... so maybe it's a bug in dev.
Comment #12
tomcatuk CreditAttribution: tomcatuk commented
Same problem here. I'm looking at http://drupal.org/node/1454666, but presumably the problem is with a record in the feed. Is there a way to track down which record the feed failed on?
Comment #13
that0n3guy CreditAttribution: that0n3guy commented
@tomcatuk, it's not an issue with a stuck feed. My feed isn't locked, it's just stuck, so that patch doesn't help me (I tested it).
Comment #14
that0n3guy CreditAttribution: that0n3guy commented
Using the latest dev, I get this error, which is only slightly different than the one above:
Comment #15
that0n3guy CreditAttribution: that0n3guy commented
I got mine running again.
I took the CSV file that Feeds was trying to import and copied it to public://feeds/ (in my case, sites/default/files/feeds). Then I renamed it from "jobinfo.csv" to "FeedsHTTPFetcherResult1339699961". Now my cron works. I chose the name "FeedsHTTPFetcherResult1339699961" because that is the file the error says it's looking for.
This is kind of what SeriousMatters was saying in #2, I think.
So the real issue isn't really this error (although it should give a better notification than this and reset itself), but the question "where has the file gone?"
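The workaround in #15 amounts to re-creating the file the stuck import expects. As a sketch (the target filename must come from your own error message; the source path here is a placeholder, and Drupal must be bootstrapped, e.g. via `drush php-eval`, for the public:// stream wrapper to resolve):

```php
<?php
// Copy the source CSV back under the name the stuck import is
// looking for, so the next cron run can resume reading it.
copy('/path/to/jobinfo.csv',
     'public://feeds/FeedsHTTPFetcherResult1339699961');
```

This only unblocks the resume; as noted above, it doesn't explain why the file went missing in the first place.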
Comment #16
gordns CreditAttribution: gordns commented
I had this error and it seems to have been a permissions issue on the "files/feeds" folder (wrong owner). It was fixed by deleting the folder and letting the server recreate it on the next manual import.
Comment #17
tomcatuk CreditAttribution: tomcatuk commented
@that0n3guy you're a gem. Anyone else reading this: that0n3guy is on the money - I followed his suggestion in #15 and can now run cron again. I do think this is something that requires a more permanent fix, though.
Comment #18
ErnestoJaboneta CreditAttribution: ErnestoJaboneta commented
#16 seems to have been my problem as well. The owner was right, but I changed the permissions anyway since I couldn't delete the directory. I changed the permissions to 775 and the message seems to have gone away. Not sure what security issues this causes though... Never mind.
Comment #19
resullivan CreditAttribution: resullivan commented
I ran into this because I ran out of requests to the database. My server limits them to 75,000 per user per some amount of time. Right now mine is stuck at 69% with the button greyed out.
Comment #20
ErnestoJaboneta CreditAttribution: ErnestoJaboneta commented
If you have a GUID set up correctly, you can always start the import over and it will skip the already imported items. Though you'd have to go into the database table feeds_source and delete the row for the feed. This gets rid of the greyed-out button and the source of the feed.
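The row deletion described above can be sketched with the database API; the importer id and feed nid are placeholders for your own feed, and note that this removes the stored source configuration along with the stuck state.

```php
<?php
// Remove the stuck feeds_source row so the import can start over.
// With GUIDs configured, already-imported items will be skipped.
db_delete('feeds_source')
  ->condition('id', 'my_importer') // placeholder importer machine name
  ->condition('feed_nid', 123)     // placeholder feed node id
  ->execute();
```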
Comment #21
spuky CreditAttribution: spuky commented
#15 nails the problem, and I guess it only happens if your import takes longer than 6 hours...
I discovered that the import file has a status of 0 in the file_managed table, which means it gets deleted after 6 hours... so if your import is not done but the file's time is up, Drupal core just deletes it.
The reason is: http://api.drupal.org/api/drupal/modules%21system%21system.module/consta...
a constant in Drupal that defines how long temporary files should live...
I changed that value in modules/system/system.module on line 11.
I will report back tomorrow on whether it helped with my large import, but a final fix should be for Feeds to update the timestamp of the temp file for as long as it's importing from it...
Comment #22
node9 CreditAttribution: node9 commented
+1
I have a CSV feed that jams up at 3% each time cron fires off.
Edit: I kept running cron and noticed that each time it stops a bit further along: at 6%, then 10%, 13%, and so on. I expected the whole feed to run at once, but it runs in small increments. In my case, Feeds is not jammed; I just didn't realize cron only runs part of the feed each time.
Comment #23
pavel.karoukin CreditAttribution: pavel.karoukin commented
I had a similar problem. The file is too large and is set to import overnight only, across 3 days. So the file is automatically removed because the Feeds module marks it as a temporary file. A simple patch solved this problem for me - http://drupal.org/node/1801680#comment-6551192
Comment #24
jos_s CreditAttribution: jos_s commented
Thanks, that helped me a lot. I had the same problems, but was able to recover this way.
Comment #25
pavel.karoukin CreditAttribution: pavel.karoukin commented
Moving this from a duplicate issue to here:
I've found an easy solution - re-download the remote file if the local file was removed. Patch attached. This could potentially cause problems if the feed file changes between downloads. Not sure though. Discussion is welcome.
Comment #26
twistor CreditAttribution: twistor commented
Tagging this.
This is a tricky issue. We could have Feeds manage the files itself. The simple way would be to delete the file on completion, but we cannot guarantee that a feed will finish without failing, and files could potentially add up. See #955236: Handle fatal errors for entity create with better error handler function.
We could also name the files in a more predictable manner, like importer_source_timestamp, and delete old files that match a pattern when starting a new import.
Comment #27
ari-meetai CreditAttribution: ari-meetai commented
I can see this is a mix of issues, where the main one seems better titled as this one.
I've created a new issue, seemingly more related to #6 here, as my case involves a very big CSV file already on the server.
Comment #28
MTecknology CreditAttribution: MTecknology commented
This is still an issue. It seems to happen when:
1) Large file gets downloaded and processed in the background
2) Incremental cron jobs import pieces of the file (not all at once)
3) The next document import happens before enough cron jobs ran to import the document
It seems that this is directly related to the "Process in background" option. "Periodic import" with "Process in background" on large documents results in issues.
The web UI fix seems to be to set "Periodic import" high enough that enough cron runs happen to fully import the document before the next run. Alternatively, you could just uncheck the "Process in background" option.
I opted to just disable the process in background option, set periodic import to 6hr, and run cron every 15min.
The fix is likely to first add a warning about background processing, followed by one of:
1) On cron, check if a periodic import is running; if so, properly close that execution, throw an error/warning that the partial import was truncated, and start a new import.
2) On cron, check if a periodic import is running; if so, process the whole rest of the file, throw a warning that the partial import didn't finish before the next import, and continue as usual with the new import.
3) On cron, check if a periodic import is running; if so, process the next segment as it normally would without the new import, and throw an error/warning that the new import hasn't started yet because the old one is still running.
Three options that would resolve this bug... I don't know which would be best.
Comment #29
MTecknology CreditAttribution: MTecknology commented
This patch from a duplicate bug report seems to do the trick.
Edit: I just noticed that someone linked to this same patch earlier. Sorry about that. It does work, though. :)
Comment #30
rudiedirkx CreditAttribution: rudiedirkx commented
Comment #31
rudiedirkx CreditAttribution: rudiedirkx commented
Confirmed with an import of 51 items (2 batches) and manual cron runs (and manual file deletion). Perfect. I've also added a watchdog inside the `!file_exists()` check, but that was purely debugging and is absolutely not necessary in the patch.
Comment #32
twistor CreditAttribution: twistor commented
That patch doesn't quite fix the issue. It will just cause the file to be re-downloaded. This could lead to missing data: if there was data in the previous file that doesn't exist in the current file, or if new data was added, the pointer the CSV parser uses can be off.
That issue needs work as well, since the import should abort/restart if the file goes missing.
Comment #33
Ganginator CreditAttribution: Ganginator commented
I'm also experiencing this issue on a 537 KB .csv; it halts at 6% max.
I even have it set to not update existing nodes, which I assume means it should skip the ones it has already processed.
Comment #34
henkit CreditAttribution: henkit commented
FYI, patch #29 also fixed the problem I had when running cron, like the one below:
Recoverable fatal error: Argument 2 passed to feeds_tamper_feeds_after_parse() must be an instance of FeedsParserResult, null given in feeds_tamper_feeds_after_parse() (line 19 of ******/public_html/sites/all/modules/feeds_tamper/feeds_tamper.module).
Thanks!
Comment #35
checker CreditAttribution: checker commented
I think twistor is right. Re-downloading is not the right way. You should only use this patch (#29) if you import from a static file.
Comment #36
rudiedirkx CreditAttribution: rudiedirkx commented
Why isn't the fix to make it a permanent file (easy) and remove it after import (easy?)?
I don't use Feeds a lot, so I probably won't be making a patch.
Comment #37
mhenning CreditAttribution: mhenning commented
Two ways I found to get around this issue:
1) Increase the number of items processed on each run by editing the variable feeds_process_limit. This is set to 50 by default. I was unable to find any place in the admin user interface to set it, so I upped it to 300 in code:
variable_set('feeds_process_limit', 300);
This impacts all feeds, and I don't see an easy way to apply it to one particular feed.
2) Increase the time that temporary files stick around. I was able to change this to 24 hours (up from the default of 6 hours) without editing core files by adding this to sites/default/settings.php:
define('DRUPAL_MAXIMUM_TEMP_FILE_AGE', 86400);
Comment #38
rudiedirkx CreditAttribution: rudiedirkx commented
@mhenning You should turn notices on. Defining a constant twice results in notices if you have good error reporting, so that's probably not the best idea. The variable is good, but it's not a solution for BIIIG files.
Comment #39
mgifford
29: feeds-redownload_remote_file-1801680.patch queued for re-testing.
Comment #40
rudiedirkx CreditAttribution: rudiedirkx commented
@mgifford What about #32?
Comment #41
mgifford
Sorry, missed that.
Comment #42
crutch CreditAttribution: crutch commented
#37 - 2) - Drupal 7
When adding that to settings.php, it says the constant is declared twice. Would this be the correct way?
$conf['DRUPAL_MAXIMUM_TEMP_FILE_AGE'] = '86400';
Comment #43
rudiedirkx CreditAttribution: rudiedirkx commented
@crutch No. It's a constant, not a variable, so that's not a good method.
I'll try to implement #36 tomorrow.
Comment #44
rudiedirkx CreditAttribution: rudiedirkx commented
Comment #45
sinn CreditAttribution: sinn commented
Any progress?
Comment #46
stefan.r CreditAttribution: stefan.r commented
Rough implementation of #36.
Comment #47
stefan.r CreditAttribution: stefan.r commented
Comment #48
rudiedirkx CreditAttribution: rudiedirkx commented
@stefan.r How did you test that? Set `DRUPAL_MAXIMUM_TEMP_FILE_AGE` to something low and run cron manually?
Comment #49
stefan.r CreditAttribution: stefan.r commented
@rudiedirkx No, I have not tested this using cron yet, just on a single manual import. `DRUPAL_MAXIMUM_TEMP_FILE_AGE` should not affect this patch, since it's a permanent file now.
Comment #50
rudiedirkx CreditAttribution: rudiedirkx commented
"Should" is the reason we test =) You have to recreate the exact context where this was a problem and then try the fix. I'll try that later this week.
Comment #51
AlfTheCat CreditAttribution: AlfTheCat commented
Tried this on the latest dev, but the patch from #46 fails.
Comment #52
stefan.r CreditAttribution: stefan.r commented
@AlfTheCat: this patch is against 7.x-2.x-dev; you can get it at http://ftp.drupal.org/files/projects/feeds-7.x-2.x-dev.tar.gz
I tested and it still applies fine...
Comment #53
MTeck CreditAttribution: MTeck commented
I updated to the dev version and updated the database since there's a schema change. Then I triggered an update (foreground) from the web UI. It worked fine. Then I tried to trigger cron using Drush and saw that the import never finished. It would reach the end, then continue updating from the beginning. I watched it run through five times with drush cron and then used ^C to kill it. Trying to remove the patch or downgrade didn't help.
The patch in #46 seems solid and I imagine it should work perfectly. This is probably a new/separate bug.
Comment #54
robertwb CreditAttribution: robertwb commented
I applied the patch in #46 successfully. I will note a couple of things that I think may be relevant:
Comment #55
stefan.r CreditAttribution: stefan.r commented
To those of you who applied the patch: did it solve your issue, or did it merely "apply successfully"? :)
Comment #56
AlfTheCat CreditAttribution: AlfTheCat commented
On my end the patch still fails.
I'm using the latest dev, but I also tried manually downloading the mentioned tarball (which is the same version I downloaded via Drush); the patch keeps failing.
This is in the module's .info file:
Comment #57
stefan.r CreditAttribution: stefan.r commented
@AlfTheCat try the following...
Comment #59
stefan.r CreditAttribution: stefan.r commented
@rudiedirkx or anyone else: as per #50, has anyone managed to try whether this fixed their issue?
Comment #61
rudiedirkx CreditAttribution: rudiedirkx commented
I tried it and found that files aren't deleted. It seems that if you 'cancel' the batch, the file is orphaned. It also didn't delete the file when I ran cron 10 times to make it finish. It finished the file and moved on to the next (I set up EVERY cron run to import), but didn't delete the previous one.
Maybe it needs a cron task to delete orphaned files? Or maybe I'm doing it wrong.
Comment #62
rudiedirkx CreditAttribution: rudiedirkx commented
Unlocking the import also orphans the file (bad) and downloads a new one on the next cron run (good, in my test case).
Comment #63
stefan.r CreditAttribution: stefan.r commented
Comment #64
stefan.r CreditAttribution: stefan.r commented
Comment #65
stefan.r CreditAttribution: stefan.r commented
Comment #68
stefan.r CreditAttribution: stefan.r commented
Comment #69
stefan.r CreditAttribution: stefan.r commented
Comment #70
stefan.r CreditAttribution: stefan.r commented
@rudiedirkx: I am trying to reproduce the "files not getting deleted" issue but I can't. For me the file is always deleted when it finishes importing, whether batched, in the background, or in multiple iterations.
Do you see any "Imported in n s" messages in the Feeds log confirming the feed has finished importing? And is this using the CSV importer? Are there any error messages in your error log?
The patch feeds-remove_permanent_file_after_import-67.patch should delete the file after unlock as well now, and contains a rough implementation of the orphaned-file deletion on cron (the file needs to be older than a minute, though). Untested yet, but hoping it will help the issue move forward. If this is the right approach and it fixes people's problems, I can write automated tests.
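The orphaned-file deletion on cron could look roughly like the following. This is a sketch, not the actual patch code: `feeds_in_use_fetcher_files()` is a hypothetical helper standing in for however the patch collects the file URIs still referenced by current feed sources.

```php
<?php
/**
 * Rough sketch of a cron cleanup of orphaned fetcher result files.
 */
function example_feeds_cleanup() {
  // Hypothetical: URIs of fetcher files still referenced by sources.
  $in_use = feeds_in_use_fetcher_files();
  // All fetcher result files currently on disk.
  $files = file_scan_directory('public://feeds', '/^Feeds.*Fetcher/');
  foreach ($files as $uri => $file) {
    // Delete files no source references, if older than a minute.
    if (!isset($in_use[$uri]) && filemtime($uri) < REQUEST_TIME - 60) {
      file_unmanaged_delete($uri);
    }
  }
}
```

The one-minute grace period mirrors the patch's behavior and avoids deleting a file a just-started import is about to use.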
Comment #72
rudiedirkx CreditAttribution: rudiedirkx commented
The case where it finished successfully and didn't delete the file was probably a fluke.
But there's still the possibility that the batch doesn't finish (not relevant in cron) and the import is interrupted and the file is left there, with no way to finish the import. (Or will cron take over then..?)
I have a local test case now, so I'll try out more tomorrow.
Comment #73
stefan.r CreditAttribution: stefan.r commented
New patch attached, let's see if we can get this to turn green :)
@rudiedirkx if the file is just left there, cron should take care of it. During every cron run it will load the temporary files associated with all current feed sources and put them in a list. Any files still in the filesystem under `public://feeds/Feeds*Fetcher*` that are not on this list will be deleted.
Comment #74
stefan.r CreditAttribution: stefan.r commented
Comment #75
ressa CreditAttribution: ressa commented
I am using Elysian cron, and couldn't get the import started again after getting "Recoverable fatal error: Argument 2 passed to feeds_tamper_feeds_after_parse() must be an instance of FeedsParserResult" in my logs.
There were a few stuck jobs in the queue, so after unchecking "Process in background" for my feed, I deleted the queues, ran cron a few times, and the feed import seems to be working again now.
Comment #76
alexander.sibert CreditAttribution: alexander.sibert commented
I have an important issue and maybe this patch can solve it. I'm trying to import the GeoNames database as nodes.
http://download.geonames.org/export/dump/allCountries.zip
I use Feeds Tamper and Feeds, and if I import only the GeoNames ID as unique ID plus the title, and use a Keyword Filter to exactly match GeoNames ID 2838632, it doesn't import that line. The extracted file is 1.2 GB.
I parse the CSV directly from the private Drupal directory. If I import without Feeds Tamper (i.e. all lines), the import works, but it doesn't import everything.
I have now applied the last tested patch (#73) and enabled background processing via cron, and now it imports to 100% and automatically starts again?
Comment #77
Khalor CreditAttribution: Khalor commented
The patch applies cleanly, but it's hard to tell what effect it's having.
Comment #78
twistor CreditAttribution: twistor commented
This looks promising.
How does `$valid_fids[] = $source_object->fetcher_result->fid;` work if `fid` is protected?
What if we didn't use Drupal's file handling at all?
Comment #79
jomarocas CreditAttribution: jomarocas as a volunteer commented
My issue was duplicated nodes when the cron process ran. Disabling background processing and periodic import worked for me.
#28, thanks @MTecknology
Comment #80
gaele CreditAttribution: gaele commented
Comment #81
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
I hope that this will make it into the Feeds 7.x-2.0-beta4 release.
Comment #82
demonde CreditAttribution: demonde commented
I can confirm #73 fixes this error on a site with multiple complex CSV imports with 25 fields and up to 1000 entries.
Comment #83
sphism CreditAttribution: sphism commented
#73 doesn't apply cleanly for me:
and the FeedsSource.inc.rej is:
Ah ha... there's some new code in there about Rules.
Comment #84
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
I'm a bit worried about the following lines in the patch:
Well, first on to the tests. The tests in the attached patch cover the following cases:
Since the patch is tests-only, it should fail.
Comment #86
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
Here is a first step at fixing this issue. The approach is similar to the patch in #73, though it excludes the part that cleans up orphaned files from the Feeds 'in progress' directory. I'm still thinking about that part. Having a one-minute gap before considering files orphaned sounds too short to me. I'm considering solutions where a FeedsSource object is passed to the fetcher result, so that the file can be associated with the Feeds source in question. The problem is that this isn't reliable: to keep backwards compatibility we cannot enforce that a fetcher result object knows which FeedsSource it belongs to.
I expect some failures, or else I still need to add a test for cleaning up orphaned files from the Feeds 'in progress' directory.
Comment #88
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
Some of the additions introduced in the patch from #86 have been handled separately:
#2087091: FeedsSource->import() error
#2867189: Add a method to FeedsSource to unlock a feed.
#2867182: Add a finishImport() method to FeedsSource to bundle tasks that need to be done at the end of an import
So this patch is basically a reroll. Test failures are expected in the tests added by this patch.
Comment #90
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
`FeedsSource::clearStates()`. Updated `FeedsFileHTTPTestCase::setUpMultipleCronRuns()` to use process in background instead of periodic import. This is now more reliable to use since, with the changes from #2630694: Run background jobs directly in queue, initiating a background import no longer imports the first chunk. Probably still a few test failures.
Comment #92
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
The test failure in `FeedsFileHTTPTestCase::testTemporaryFileCleanUpAfterImport()` is fixed by clearing the PHP file-exists cache by calling `clearstatcache()`.
To clean up orphaned files, I've done the following:
- A `setContext()` method is added. This is used by the FeedsSource class to tell FeedsFetcherResult which Feeds source is using the fetcher result.
- In `FeedsFetcherResult::saveRawToFile()`, file usage is added in case it is known to which importer and feed node the fetcher result belongs. If no context is known, the file is saved as a temporary file just as before. This makes sense when using the Feeds Import Preview module, for example.
Comment #94
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
Oops. The constant 'FILE_STATUS_TEMPORARY' does not exist in D7.
Comment #96
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
Mistake in `FeedsSource::save()`. When saving the 'fid' value, the check should be whether `$this->fetcher_result` is an instance of FeedsFetcherResult, not FeedsFetcher.
Also spread the query in `feeds_cron()` across multiple lines.
Comment #97
greenSkin CreditAttribution: greenSkin commented
I was getting "SQLSTATE[22004]: Null value not allowed: 1138 Invalid use of NULL value" when running the 7214 update (adding the "fid" field to the "feeds_source" table). Here's an updated patch to #96 that adds 0 as the field's default value.
Comment #98
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
@greenSkin
Thanks for testing! I'm busy testing/reviewing the patch myself as well and noticed this one too. The 'fid' database column should actually have 'not null' set to FALSE, since in `FeedsSource::save()` a NULL value can be set on this column. I often get confused about that 'not null' setting on a database column.