I was just looking for something like that and your module looks great :)
I have a small problem with it and I don't know if it's a bug or my misconfiguration.
My feeds all have 0 in feed_nid, so the check in _feed_import returns "Invalid Feed ID:" errors.
What is this field supposed to do? I changed the if statement, but then I get:
substr() expects parameter 1 to be string, array given FeedsFetcher.inc:75 [warning]
WD php: PDOException: SQLSTATE[23000]: Integrity constraint violation: 1048 Column 'feed_nid' cannot be null: INSERT INTO {feeds_log} (id,[error]
feed_nid, log_time, request_time, type, message, variables, severity) VALUES (:db_insert_placeholder_0, :db_insert_placeholder_1,
:db_insert_placeholder_2, :db_insert_placeholder_3, :db_insert_placeholder_4, :db_insert_placeholder_5, :db_insert_placeholder_6,
:db_insert_placeholder_7); Array
(
[:db_insert_placeholder_0] => magazyny
[:db_insert_placeholder_1] =>
[:db_insert_placeholder_2] => 1358176634
[:db_insert_placeholder_3] => 1358176632
[:db_insert_placeholder_4] => import
[:db_insert_placeholder_5] => Imported in !s s
[:db_insert_placeholder_6] => a:1:{s:2:"!s";i:0;}
[:db_insert_placeholder_7] => 6
)
in feeds_log() (line 818 of /var/www/velo.pl/dev/public_html/sites/all/modules/feeds/feeds.module).
Edit: I don't know whether the problem is the feeds_sql fetcher I'm using, which may require an SQL string from a form submission under import/.
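For context, the constraint violation above happens because a standalone importer (one not attached to a node) has no node ID, so feed_nid arrives as NULL in the {feeds_log} insert. A minimal sketch of the failure mode and the usual workaround of defaulting to 0 — the function name here is hypothetical and this is not the actual Feeds code:

```php
<?php
// Illustrative sketch only (hypothetical helper, not Feeds module code):
// a standalone importer has no attached node, so its feed node ID is NULL,
// which violates the NOT NULL constraint on {feeds_log}.feed_nid.
// Coercing NULL to 0 is the usual workaround for standalone importers.
function log_row($importer_id, $feed_nid) {
  // Defensive default: standalone importers log with feed_nid 0.
  $feed_nid = ($feed_nid === null) ? 0 : $feed_nid;
  return ['id' => $importer_id, 'feed_nid' => $feed_nid];
}

$row = log_row('magazyny', null);
echo $row['feed_nid'], "\n"; // 0
```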
Comments
Comment #1
FreeFox commented
I have the same problem. Then I tried the drush command, where I can't make any configuration mistakes. This is the result:
$ drush feeds-import-all
Invalid Feed ID: mssql_users
Invalid Feed ID: mssql_ct_users
Invalid Feed ID: csv_customer_profiles
...
Please help.
Comment #2
j0rd commented
If you don't have your imports attached to nodes, it won't work because of some bad logic in the code.
Change this line.
I'll post a patch later, once I work out the other problems I'm having.
Comment #3
j0rd commented
Patch which resolves my issue.
Just imported 30,000 nodes with this, so I can confirm it works.
I also believe this command should raise feeds_process_limit from 100 to unlimited or something similar; otherwise the script only imports 100 nodes at a time.
For now I just ran `drush vset feeds_process_limit 1000000`, which removes the 100-node cap, but it will completely break all batch and cron jobs.
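To see why the vset above changes the behavior: Feeds reads a per-batch cap from a Drupal variable and processes at most that many items per run. A self-contained sketch of that mechanism — variable_get() is stubbed here so the snippet runs on its own, and process_batch() is a hypothetical stand-in, not the actual Feeds processor:

```php
<?php
// Illustrative sketch only: how a process limit read from a Drupal
// variable caps each batch. variable_get() is stubbed so this runs
// standalone; in Drupal it would read the value set via `drush vset`.
function variable_get($name, $default) {
  $vars = ['feeds_process_limit' => 100]; // pretend stored variable
  return isset($vars[$name]) ? $vars[$name] : $default;
}

// Hypothetical stand-in for a Feeds processor batch step.
function process_batch(array $items) {
  $limit = variable_get('feeds_process_limit', 50);
  // Only the first $limit items are handled in this batch run;
  // the rest wait for the next cron/batch invocation.
  return array_slice($items, 0, $limit);
}

$imported = process_batch(range(1, 250));
echo count($imported), "\n"; // 100
```

This is why a one-shot drush import appears to stop at 100 nodes unless the variable is raised, and also why raising it globally affects cron-driven batches.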
PS. thx for the module. Saved me a bunch of headaches today.
Comment #4
rerooting commented
Works! RTBC. Please commit!
Comment #5
j0rd commented
I've improved this patch a little further. It now takes --feed-id and an optional --feed-node-id argument.
If --feed-node-id is not used, it imports all feeds of that --feed-id. Previously there was a bug where it only imported the first one; now it imports all feeds with the same ID. If you want to be more specific, you can use --feed-node-id to import only one particular feed.
Comment #5.0
j0rd commented
More information
Comment #6
alexander.sibert commented
I can confirm the patch works. I am now trying to import a 1.3 GB tab-delimited CSV file in the background.
Comment #7
puddyglum commented
#3 is great for fixing the original issue.
#5 should be moved into a separate issue so it's clear what solved this one.
Comment #8
rc_82 commented
Tested #5. Works great.
Comment #9
schignel commented
Hello, I just applied #3, but when I run the importer I get the following error in the Feeds Importer Log:
"Feeds directory either cannot be created or is not writable."
Does anyone have any suggestions?
Thank you!
Comment #10
joelpittet