I'm working on helper functions that allow the AmazonS3 module to reference files on S3 buckets without going through the Drupal upload module, by means of an autocomplete widget. There is an issue when adding a file greater than 2.1 GB, due to the MySQL size constraint of the INT column type.
I was able to fix this in AmazonS3's own table, but since we're still using the core Field API, adding a new file to an entity requires inserting a record into the {file_managed} table. Because file_managed.filesize is an INT(10), the insertion fails with a PDOException: "Numeric value out of range: 1264" for the filesize column in row 1.
Would it be possible to either 1) update the file_managed schema so the filesize column uses a BIGINT(20)? I'm open to alternatives of course, since this is something we have to address for our site's D7 upgrade, which includes the handling of large DVD image files that are hosted on S3.
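For reference, the manual change I'm making for testing amounts to widening the column with a one-off update. Sketched here as a hypothetical hook_update_N() in a custom module using Drupal 7's db_change_field() — the module name and field spec are my assumptions, not a committed patch:

```php
/**
 * Hypothetical update sketch (not the core patch): widen
 * {file_managed}.filesize from INT to an unsigned BIGINT.
 */
function mymodule_update_7100() {
  // In the Schema API, 'size' => 'big' maps to BIGINT on MySQL;
  // the other keys mirror core's existing filesize spec.
  db_change_field('file_managed', 'filesize', 'filesize', array(
    'description' => 'The size of the file in bytes.',
    'type' => 'int',
    'size' => 'big',
    'unsigned' => TRUE,
    'not null' => TRUE,
    'default' => 0,
  ));
}
```

As noted below, always take a backup and test this on a development instance first, since it alters a core table.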
Relevant issues:
#1651096: Adding file metadata to {amazons3_file} fails with filesize > 2GB, use BIGINT instead
#1003692: PDOException: SQLSTATE[22003] after entering large value in integer and decimal field
| Comment | File | Size | Author |
| --- | --- | --- | --- |
| #16 | 1815886-16-filesize-bigint-d7.patch | 1.03 KB | bojanz |
| #13 | 1815886_13_update_only.patch | 621 bytes | slashrsm |
| #9 | 1815886-9-filesize-bigint.patch | 887 bytes | bojanz |
| #8 | 1815886-8-filesize-bigint.patch | 447 bytes | bojanz |
Comments
Comment #1
torgosPizza
Please excuse my "possible to either" typo. Originally I thought I had a viable alternative, but then determined it wasn't a very good one.
Comment #2
Anonymous (not verified) CreditAttribution: Anonymous commented
So you're asking for support and want to know if you can manually adjust the Drupal core table?
Yes, you can do that, you just need to make sure you take upgrades into consideration. Especially when #1003692: PDOException: SQLSTATE[22003] after entering large value in integer and decimal field is backported to D7.
Always take a backup. Always test in a development instance.
Comment #3
Anonymous (not verified) CreditAttribution: Anonymous commented
Note: a contrib module should not update anyone else's schema. That is asking for trouble.
Comment #4
torgosPizza
@earnie: Thanks, but I apologize for the confusion. I realized my wording made it seem like I was asking for support, which was not my intent.
What I meant was, this issue perhaps should be addressed with a Core schema update to allow larger file metadata to be inserted into {file_managed}. I know that I can manually adjust the database table myself, and in fact that's what I'm doing for testing purposes.
Exactly my point. I felt it was more sensible to open an issue (maybe this is better as a feature request?) that addresses the root cause: the restrictive table schema. Would the alternative be to include a note in the module that says "in case you get this error, update these database tables"?
If someone wants to use this contrib module to utilize S3 for large file delivery, are we going to require that they open up their database management software and change the schema themselves? IMHO that sounds like an even greater risk, and I can't think of any reason why we shouldn't enable Core to have the flexibility to manage files of this size.
Comment #5
torgosPizza
I guess this does make more sense as a feature request.
Comment #6
bojanz CreditAttribution: bojanz commented
This is obviously a bug report.
And we need to fix it in D8 first.
Comment #7
bojanz CreditAttribution: bojanz commented
Comment #8
bojanz CreditAttribution: bojanz commented
The current limit is actually 4 GB, since we're using an unsigned integer.
The 2 GB limit comes from filesize() on 32-bit PHP installs, and there's nothing we can do about that.
Still, for 64bit installs there is no need to limit the size of files that much (especially since we need to think about years and years into the future, for the lifespan of D7 and D8). So let's convert that column to a bigint.
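To make those limits concrete, here is the arithmetic behind the figures above (standard MySQL/PHP integer ranges; the variable names are just illustrative):

```php
// Signed 32-bit INT max: the ~2.1 GB ceiling that 32-bit PHP's
// filesize() runs into (PHP_INT_MAX on 32-bit builds).
$signed_int_max = 2147483647;          // 2^32 / 2 - 1

// Unsigned INT max: the current ceiling of {file_managed}.filesize.
$unsigned_int_max = 4294967295;        // 2^32 - 1, i.e. ~4 GB

// Unsigned BIGINT max after the conversion (as a string, since it
// exceeds PHP_INT_MAX even on 64-bit builds).
$unsigned_bigint_max = '18446744073709551615'; // 2^64 - 1
```

In other words, the column change removes the database-side ceiling entirely for any realistic file size; only the PHP-side filesize() limit on 32-bit installs remains.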
Comment #9
bojanz CreditAttribution: bojanz commented
Now with an update function.
Comment #10
slashrsm CreditAttribution: slashrsm commented
Looks good to me.
Comment #11
catch
Committed/pushed to 8.x, thanks!
This could use a backport I think.
Comment #12
tstoeckler
I think the wrong patch was committed here, i.e. the one without the update function.
http://drupalcode.org/project/drupal.git/commit/e1ce0ac
Comment #13
slashrsm CreditAttribution: slashrsm commented
Patch attached.
Comment #14
tstoeckler
Looks good.
Comment #15
webchick
Committed and pushed to 8.x. Thanks!
Moving to 7.x for the backport.
Comment #16
bojanz CreditAttribution: bojanz commented
Here we go.
Comment #17
naveenvalecha
Looks good to me.
Comment #18
slashrsm CreditAttribution: slashrsm commented
Agreed. We probably need to remove the D8 update hook once this gets in?
Comment #19
naveenvalecha
Since #13 was already committed to D8 (commit 6942242), we will open a new issue to revert that commit once #16 is committed to D7.
Comment #20
David_Rothstein CreditAttribution: David_Rothstein commented
Committed to 7.x - thanks! http://drupalcode.org/project/drupal.git/commit/6c89f39
Back to Drupal 8 for now to remove the update function there (or feel free to close and create a new issue).
Comment #21
torgosPizza
This is awesome. Thanks!
Comment #22
naveenvalecha
Closing this one, as there is no update path between major releases; the Migrate module handles major-version upgrades instead.
If anyone wants to reopen it, they can.
Thanks
Naveen Valecha
Comment #23
naveenvalecha