When importing a big table, the data can exceed the PHP input variables limit. This leads to problems with node saving, or to unserialized data appearing in the output on the D6 version:
Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in Unknown on line 0
Sample csv attached.
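For reference, the limit named in the warning is controlled by the max_input_vars directive in php.ini (it defaults to 1000 and has existed since PHP 5.3.9). A hedged sketch of raising it, assuming you can edit php.ini and that a higher limit is acceptable on your server:

```ini
; php.ini — raise the maximum number of accepted input variables.
; Default is 1000; pick a value that comfortably covers your largest table.
max_input_vars = 10000
```

You can confirm the effective value afterwards via phpinfo() or `php -r 'echo ini_get("max_input_vars");'`. Note that this only raises the ceiling; the patches in this thread aim to avoid hitting it at all.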
Comment | File | Size | Author
---|---|---|---
#13 | tablefield-big_tables_can_exceed-2161689-13.patch | 4.45 KB | jenlampton
#10 | tablefield-big_tables_can_exceed-2161689-10.patch | 4.42 KB | jenlampton
#8 | big_tables_can_exceed-2161689-8.patch | 4.47 KB | lolandese
#5 | big_tables_can_exceed-2161689-5.patch | 5.37 KB | efpapado
| р-_18.12.13_utf8.zip | 3.5 KB | Razunter
Comments
Comment #1
mrchristophy CreditAttribution: mrchristophy commented
Old issue I know, but...
My client is actually reporting this problem using quite a small CSV, although I can't replicate it myself. Any ideas why this would be happening for them but not me?
Comment #2
esbite CreditAttribution: esbite commented
If you're having issues on one server but not another, check the setting for max_input_vars in php.ini; maybe you can just increase it.
I'm also having this issue and it seems kinda redundant for us to post that many values, our client is only using the CSV import function anyway. I'd vote for a rewrite of the backend storage to circumvent this issue and open up for different widgets.
Comment #3
quantos CreditAttribution: quantos commented
Following. We seem to have multiple issues, this one included, since v 7.23. We don't seem able to import any CSV data at all right now, which I presume can't be connected to this max_input_vars issue, but I thought I'd mention it just in case.
Q.
Comment #4
jenlampton
tagging
Comment #5
efpapado CreditAttribution: efpapado at Ramsalt Lab commented
I submitted a patch that solves the problem for CSV files.
For maintainers/developers:
After the CSV upload, the parsed table is stored in $form_state. During the final submission of the form, an extra validation function checks whether max_input_vars has been exceeded. If so, the stored tablefield values are retrieved from $form_state and copied into $form_state['values']. The user gets a warning that any values changed in the browser after uploading the CSV will be lost, and is advised to re-upload the CSV with the correct values. Also, if the too-large table did not come from a CSV upload (which seems quite unlikely, as max_input_vars is set to 1000 by default and it is improbable that a user would type 800 table cells by hand!) they receive a Drupal error advising them to store their table data through a CSV upload.
If you decide to commit the patch, this is a common work of me and esolitos (https://www.drupal.org/u/esolitos) so please don't forget to credit him too :)
Comment #6
efpapado CreditAttribution: efpapado at Ramsalt Lab commented
Comment #7
lolandese CreditAttribution: lolandese at HCL Technologies Limited commented
Comment #8
lolandese CreditAttribution: lolandese at HCL Technologies Limited commented
After a reroll against the latest dev, the attached example CSV is imported correctly; however, the node does not save and remains stuck on the Edit page.
This is probably due to the provided validation function.
Getting these messages:
The dimensions of the table are too large to be processed on browser. You are advised to import your table through a CSV file, to prevent data loss. (at least it shows the validation is kicking in)
Status message Successfully imported р-_18.12.13_utf8.csv
Comment #9
lolandese CreditAttribution: lolandese at HCL Technologies Limited commented
Comment #10
jenlampton
In my testing the node saves correctly, and the validation messages appear on the node view page. I'm unable to reproduce the problem reported above.
I rerolled the patch again, made a few coding standards updates, and cleaned up the language in the error messaging. New patch for review.
Comment #11
jenlampton
One quick note... Though this patch does help when editing existing tables that exceed max_input_vars, it does not address what happens when you attempt to add a new row or column to an already-too-large table. I still get an AJAX error.
We should adapt the same error messaging for rebuilding the table in all cases, and let the user know that uploading a CSV file with the intended number of rows and/or columns is recommended.
Comment #12
lolandese CreditAttribution: lolandese at HCL Technologies Limited commented
I guess it needs work then.
Comment #13
jenlampton
This is a reroll of the previous patch that applies cleanly (it does not incorporate the new suggestions yet).