I'm stumped. My import of an uploaded XML file (using the XPath parser) works partially. It ALMOST finishes and imports a ton of nodes (400+). But then it stops and gives me this aggravating and non-descriptive error:
An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /myavail/batch?id=489&op=do StatusText: error ResponseText: 500 Internal Server Error Internal Server Error The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log. Apache Server at www.masoninnovation.org Port 80
My error log doesn't have this error.
More background: I'm importing an XML feed into a content type using a standalone form. I'm using XPath mapping... which is pretty simple and either works or doesn't, so I'm not sure how that could be causing this error. I'm using a fairly high-volume dedicated server on (regrettably) GoDaddy.com (this is a test project that I'm toying with in a subdirectory). I'm using Drupal 7.x dev (latest commit).
I'm hoping that someone else has experienced a general problem with 500 errors and has solved it. I've tried a few things:
1. On the chance that I was experiencing a timeout, I increased the PHP memory limit all the way to 256M! I also set the timeout settings to 0 (no limit).
2. I deactivated almost every module...
3. I simplified my content type and removed field permissions, conditional fields, user references, and other things that were unrelated to Feeds.
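For reference, the limit increases described in point 1 can be sketched as a settings.php fragment (a php.ini edit works too, where the host allows it); the values here are just the ones from the comment above:

```php
// Sketch of the limit increases from point 1, placed near the top of
// settings.php. Whether ini_set() is honored depends on the host's config.
ini_set('memory_limit', '256M');    // raise the PHP memory ceiling
ini_set('max_execution_time', 0);   // 0 = no execution time limit
```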
I have a couple of really bad guesses as to what's causing the problem:
A. One of my fields is a number (set up as an integer). I'm importing numeric data into that field. Does Feeds support importing into an integer field type? I would assume so, but I'm grasping here.
B. There's something wrong with my .htaccess file (not sure why - I haven't ever even edited it).
C. There's something about using a subdirectory install that impacts Feeds (I really doubt it).
D. Even Godaddy's expensive dedicated servers can't handle tough tasks like importing 500 nodes (won't rule it out).
Anyway. Just want to make sure there isn't any low-hanging fruit I'm missing that could resolve this issue (i.e. other people who have experienced this partial-import issue). I've also contacted GoDaddy to see if they can tell me what's wrong with their server.
Comment | File | Size | Author |
---|---|---|---|
#42 | feeds-error-500-partial-import-1219296-42.patch | 1.19 KB | Temoor |
#22 | Screen Shot 2011-11-23 at 12.47.02 PM.png | 38.93 KB | ursula |
Comments
Comment #1
paulgemini CreditAttribution: paulgemini commented
Changing issue to the correct version.
Comment #2
paulgemini CreditAttribution: paulgemini commented
I should add that the error pops up after about 3 minutes of sitting in the batch screen with the blue line stuck at "Initializing". Debugging fields is useless because I never get a chance to see them.
Comment #3
paulgemini CreditAttribution: paulgemini commented
OK, after some investigation, I noticed that my database was huge. With literally 2 nodes (I cleared my nodes and cleared my search index), it was 9 MB! The culprit?
The feeds log
Clearing that and will get back to you.
UPDATE: nope, didn't help.
Comment #4
johnmmarty CreditAttribution: johnmmarty commented
I'm seeing this same error message immediately upon clicking import.
Comment #5
johnmmarty CreditAttribution: johnmmarty commented
On the importer's Basic Settings screen, I checked the "Process in background" option and bam! Good to go. It seems like a workaround, but hey, who needs a progress bar.
Comment #6
paulgemini CreditAttribution: paulgemini commented
Nice workaround! Will try that.
Comment #7
paulgemini CreditAttribution: paulgemini commented
Nope. Still got the error. Just in a different format.
Comment #8
manu manu
Having the same issues with or without background processing.
I noticed that when it crashes, PHP was trying some huge memory allocations:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 3195729 bytes) in /path/to/html/includes/database/database.inc on line 2095
Background info:
- The memory limit is very high in my PHP CLI config file, as I process my feeds with Drush
- the feed is a 12 MB XML file parsed with the XPath XML parser
- the content type has a file (image) field fetched by Feeds
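Running the import from Drush, as described above, is often done with a loop over the Feeds batch API; a minimal sketch (runnable via something like `drush php-eval`, with 'my_importer' as a placeholder machine name):

```php
// Sketch: drive a Feeds import from the CLI instead of the browser batch.
// 'my_importer' is a placeholder importer machine name, not from this thread.
$source = feeds_source('my_importer');
// import() processes one batch at a time and returns its progress;
// FEEDS_BATCH_COMPLETE signals that everything has been processed.
while (FEEDS_BATCH_COMPLETE != $source->import()) {
  // Each iteration processes up to feeds_process_limit items.
}
```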
Comment #9
paulgemini CreditAttribution: paulgemini commented
Same deal. I upped my memory to an absurd level, removed all timeout limits, and I'm still getting this.
I also import an image. I'll try removing the image field from the import and report back.
Comment #10
johnmmarty CreditAttribution: johnmmarty commented
paulgemini, sorry it didn't work for you. I moved on to a larger import last night, 7000 records, and it hits 50 and quits every time. Very frustrating. I'm considering setting up the import as a cron job that processes files on the server. Then I would have to cut the file down to 140 files of 50 records. Oh, what fun.
Comment #11
paulgemini CreditAttribution: paulgemini commented
What web host do you use?
Comment #12
johnmmarty CreditAttribution: johnmmarty commented
I'm doing this on my local MacBook Pro. When I migrate, I'll use Navicat to sync my databases. I've used GoDaddy and SiteGround as hosts in the past, but as of now I'm not in production with this site.
Comment #13
johnmmarty CreditAttribution: johnmmarty commented
I upped all my php.ini limits last night and it was looking good; then I hit a MySQL issue. I tried to restart MySQL and it wouldn't start at all. I'll have to go digging in the logs when I get home to see what's happening.
Comment #14
kkuhnen CreditAttribution: kkuhnen commented
subscribe
Comment #15
Dale Baldwin CreditAttribution: Dale Baldwin commented
Are you sure it's not an XPath parser issue? I say this because I managed to get a feed off the same data with basic fields (title and body), but after that everything has failed.
The two errors I keep getting are
Error message
Cannot acquire lock for source tt_product_feed / 2941.
and
An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /?q=batch&id=42&op=do StatusText: Internal Server Error ResponseText:
Comment #16
Dale Baldwin CreditAttribution: Dale Baldwin commented
Another interesting thing I just found: I had a bit of a look through the MySQL database and my files folders. I found some of the images in my feed had been pulled down and saved, and there were database records for each of the images. I checked a bunch of other fields, such as the body fields, and there was nothing in there, so more and more it looks like there is an issue with Feeds writing text content into MySQL.
Comment #17
Dale Baldwin CreditAttribution: Dale Baldwin commented
The Apache error.log has this sitting in it:
[Fri Aug 05 16:15:05 2011] [error] [client 192.168.1.145] PHP Fatal error: Unsupported operand types in /data/drupal7/sites/all/modules/feeds/includes/FeedsConfigurable.inc on line 149, referer: http://192.168.1.118/?q=batch&op=start&id=7
147 public function getConfig() {
148 $defaults = $this->configDefaults();
149 return $this->config + $defaults;
}
Not sure if that helps anyone. I'm not great with PHP, but that's at least the end of what I've been able to discover today.
thanks
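The fatal at line 149 suggests $this->config was not an array when PHP's array union operator (+) ran. A self-contained sketch of that failure mode and a possible guard (DemoConfigurable is a stand-in class for illustration, not the real Feeds code):

```php
<?php
// Stand-in for FeedsConfigurable, showing why "$this->config + $defaults"
// can fatal: the "+" array union operator requires both operands to be
// arrays, and fatals with "Unsupported operand types" otherwise.
class DemoConfigurable {
  public $config;  // may end up NULL, e.g. if stored config failed to unserialize

  protected function configDefaults() {
    return array('process_in_background' => FALSE);
  }

  public function getConfig() {
    // Guard: fall back to an empty array instead of fataling on NULL + array.
    $config = is_array($this->config) ? $this->config : array();
    return $config + $this->configDefaults();
  }
}

$c = new DemoConfigurable();
$c->config = NULL;           // simulates the corrupted state
print_r($c->getConfig());    // defaults are returned, no fatal
```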
Comment #18
Dale Baldwin CreditAttribution: Dale Baldwin commented
Looked through a few more of the bug reports; try this: http://drupal.org/node/1213472#comment-4754062
It worked for me without a hitch.
Comment #19
John Bryan CreditAttribution: John Bryan commented
If you are not using taxonomy with feed items, then this comment is not relevant.
If you are trying to have RSS or XML (etc.) "Feed Item" nodes inherit a taxonomy term from the parent "Feed" importer node and seeing "An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows." :-
Multiple Feeds project issues have been created for this or similar messages, but the relevant one appears to be:
"taxonomy_node_get_terms doesn't work with drupal 7"
http://drupal.org/node/959984
Comment #20
askandlearn CreditAttribution: askandlearn commented
I am also getting this error.
/batch?id=21&op=do StatusText: Internal Server Error ResponseText:
There is nothing in the error log.
I am not doing taxonomy, nor do I have images. I've cleaned the CSV file of any odd characters. It is text only.
On a desktop Mac with a large memory limit in php.ini.
Any clues would be appreciated.
Comment #21
Jason Dean CreditAttribution: Jason Dean commented
I was getting this error with a fairly small import (no images, no taxonomy), using the Node processor and XPath XML parser.
I found that in my mapping settings, I'd selected two fields as unique identifiers. I thought that was ok, but as soon as I deselected one of them the entire import worked with no errors.
So I think this error can be a bit misleading... there seems to be a variety of potential causes. Just remember to check your mapping settings when you are trying to eliminate it from your import :)
Comment #22
ursula CreditAttribution: ursula commented
I am having a similar problem: importing 5000 nodes from a text file on the server. At 92%, it gives that AJAX error. I found the following in the logs:
PHP Fatal error: Maximum execution time of 30 seconds exceeded in /home/drupal/includes/database/database.inc on line 429, referer: http://example.com/batch?op=start&id=215
My database engine is running a big (non-Drupal) site, so sometimes updates take longer, because the other service is loading data.
I am fine with restarting the import. However, the import page now claims that it is still importing, and I don't seem to be able to stop the import attempt (see attached screenshot).
I'd be happy to manually remove entries from database tables. If I just knew which ones to remove ...
As a workaround, I am now deleting the 4500 nodes it already imported (and the 1000 I imported earlier).
Any help greatly appreciated.
Ursula
Comment #23
lafingguy CreditAttribution: lafingguy commented
Hi folks,
I'm having the same error 500 importing nodes as group nodes.
I am also getting a "Cannot acquire lock for source ..." error message.
I'm running on a MacBook Pro and previously loaded 30K users and profile2s without a hitch.
A couple of additional things I've noticed:
Why aren't more people having issues with this?
Comment #24
Ivan Simonov CreditAttribution: Ivan Simonov commented
I am also getting the "500 error" message.
I tried to import 2 products and got stuck with the button reading "50% processed".
Comment #25
lafingguy CreditAttribution: lafingguy commented
You can get rid of that button issue by going into the database and deleting the entry in feeds_source that matches the import that hung.
It's a hack but it does work :-(
Comment #26
Ivan Simonov CreditAttribution: Ivan Simonov commented
Thanks, lafingguy.
It's enough to clear the "state" field for this importer.
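In Drupal 7 code (e.g. via `drush php-eval`), the cleanup described in #25 and #26 might look like this sketch; 'my_importer' is a placeholder machine name, and which columns need resetting is an assumption, so back up the database first:

```php
// Sketch: reset the stuck batch state for one importer, per comments #25/#26.
// 'my_importer' is a placeholder machine name. Back up the database first!
db_update('feeds_source')
  ->fields(array(
    'state' => FALSE,           // forget the half-finished batch state
    'fetcher_result' => FALSE,  // drop the cached raw fetch result
  ))
  ->condition('id', 'my_importer')
  ->execute();
```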
Comment #27
bancarddata CreditAttribution: bancarddata commented
I, too, am experiencing a similar issue with the latest dev copy of Feeds. It is a CSV parser with 819 records in the CSV file. It hangs at 98% and then gives the same 500 AJAX error as originally posted. If I go in and clear the state for the importer, the import page then shows all 819 records were imported. I am not importing any files, but I do use quite a few text, taxonomy, numeric, etc. fields. I am also using the uc_feeds module, which adds some Ubercart-specific fields to the mappers. Whatever the problem is, at least for me, it seems like it is happening during some sort of end process that occurs after the CSV import has finished.
My Apache log shows a corresponding "PHP Fatal error: Allowed memory size of 167772160 bytes exhausted" error message. One thing I noticed is it reports the faulting script as "drupal/includes/database/log.inc on line 144". Is something going wrong during the part where it writes the log entries? Or is this error just a side-effect of the original AJAX error?
Edit: never mind; this issue appears to be coming from an add-on Feeds processor (the Garbage Collector at #661314: "Sync" or "cache" mode).
Comment #28
rakesh.gectcr
Yeah, it's working for me.
Go to your database
and delete the row entry from the table "feeds_source".
Thanks, @lafingguy :P
Comment #30
Ivan Simonov CreditAttribution: Ivan Simonov commented
Guys, check this issue: #1487670: CSV Parser: incorrect UTF-8 interpretation (looks like limited number of imported fields)
After I applied the fix, the errors disappeared.
I guess the frozen import and 500 error may depend on incorrect UTF-8 interpretation.
Comment #31
franz
No patch here, so this can't be "needs review".
Comment #32
hop CreditAttribution: hop commented
When I run Bulk Update, I see the error message: "An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /batch?id=306&op=do StatusText: Internal Server Error ResponseText:".
When I generate URL aliases with the Views Bulk Operations (VBO) module instead, it all works OK :)
I recommend patch http://drupal.org/node/1415930#comment-5511014
Comment #33
PatchRanger CreditAttribution: PatchRanger commented
Problem
Import with BatchAPI throws an AJAX error 500 and doesn't do the work.
Reason
BatchAPI can be broken by any output that your code tries to display:
1) dpm/dsm/other debug commands
2) warning or error messages displayed on the screen
Solution
Any of these writes messages to the screen, which breaks BatchAPI, so you need to suppress that output.
Sometimes you can't eliminate the cause of these messages because of invalid markup in the parsed source (this was my use case with XPathParser); in that case, just turn off the display of warning messages entirely.
Note: make sure to do it in two places (first globally, second in the settings of your FeedsParser; XPathParser has such an option).
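The global step above corresponds to Drupal 7's error display setting; a settings.php sketch (the ini_set line is a belt-and-braces assumption, not from the comment):

```php
// Sketch: stop Drupal 7 from printing errors/warnings into page output,
// where they would corrupt BatchAPI's JSON responses.
// Equivalent to choosing "None" at admin/config/development/logging.
$conf['error_level'] = 0;
ini_set('display_errors', FALSE);   // extra safety for raw PHP notices
```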
Marking this as fixed because it worked for me.
Feel free to re-open if it does not for you.
Comment #34
PatchRanger CreditAttribution: PatchRanger commented
It seems that "works as designed" is a more appropriate status for it.
Comment #35
oglok CreditAttribution: oglok commented
Hi guys!!
I get this error:
(Translated from Spanish:) An AJAX HTTP error occurred. HTTP Result Code: 500. Debugging information follows. Path: /batch?id=43&op=do StatusText: Internal Server Error ResponseText: {"status":true,"percentage":"33","message":"Se completaron 2 de 6.\u003Cbr \/\u003E"}
I have my website on shared hosting... do you think it could be a problem with the server?? I cannot access all the Apache settings files... so I don't know how to fix it!
Help, I need somebody's help!!!
Thanks guys!
Comment #36
PatchRanger CreditAttribution: PatchRanger commented
@oglok In my case the problem was in the parser.
What Feeds parser do you use?
I use FeedsCrawler, and the AJAX 500 error went away after this patch: #1777438: FeedsCrawler breaks BatchAPI.
There you can find a link, related to this problem, which may be useful to you.
Comment #37
xeniak CreditAttribution: xeniak commented
No, the suggestions in #33 don't help me at all.
Here's what I'm getting:
Grouping error?
Comment #38
xeniak CreditAttribution: xeniak commented
In our case, it's some kind of conflict with either the Messaging or Notifications module, or both. So it may not be a Feeds bug at all.
Comment #39
jenlampton
I'm having the same issue with the iCal parser. I can import just over 400 nodes (98% of my set) and then I get the error. I'm on Pantheon, and I recognize the "Service Unavailable" as the standard Rackspace Cloud message.
I expect this is happening because the batching here is not actually working (see #1363088: Feeds batch process not fully implemented - needed for large imports). I have increased my feeds_process_limit variable from 50 to 500 in order to import all my records at once, but since there's no batching, this maxes out Pantheon's resources.
Comment #40
twistor CreditAttribution: twistor commented
This issue is kind of a crapshoot.
I suspect there's a few different causes over the years.
@jenlampton, Batch processing won't affect your issue with iCal, since from what I can tell, date_ical batches in the parser correctly. Batch processing is only affected by parsers that parse everything at once, rather than respecting the batch limit.
You'd actually want to lower your process limit to get around resource issues. feeds_process_limit is what determines the batch size, so if you're running out of memory, then you want to decrease the value.
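Following that advice, the limit can be lowered with a one-off snippet (a sketch; 25 is an arbitrary example value, and the default is 50 as noted above):

```php
// Sketch: shrink the Feeds batch size so each batch request uses less memory.
// The default for feeds_process_limit is 50; 25 here is just an example.
variable_set('feeds_process_limit', 25);
```

The same change can be made from the command line with Drush's variable-set command.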
Comment #41
mrpotatohead CreditAttribution: mrpotatohead commented
Try enabling Clean URLs:
admin/settings/clean-urls
After trying a million other things all day, that finally did the trick for me.
Comment #42
Temoor CreditAttribution: Temoor commented
It seems that this issue has collected a lot of different bugs with similar symptoms: the batch fails with some error.
I used the FTP Fetcher and XPath XML parser. In my case it was a 500 error, which appeared when Feeds tried to save the progress state to the database.
The attached patch is only part of the solution; it allows using a file name instead of the file content in FeedsFetcherResult. The other part is to update the fetcher so it returns a file name instead of the content. See #2454615: Pass file name instead of content into FeedsFetcherResult for an example of a fetcher update.
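A hedged sketch of what such a fetcher update might look like; the method body is illustrative only (the assumption that $source_config['source'] holds a local file path is mine, and the actual patches in the linked issues differ):

```php
// Illustrative fetch() for a custom fetcher class: hand Feeds a file path
// via FeedsFileFetcherResult instead of loading the whole file into memory.
// Assumes $source_config['source'] holds a local file path.
public function fetch(FeedsSource $source) {
  $source_config = $source->getConfigFor($this);
  return new FeedsFileFetcherResult($source_config['source']);
}
```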
Comment #43
Temoor CreditAttribution: Temoor commented
Comment #45
twistor CreditAttribution: twistor commented
As I mentioned in #2454615: Pass file name instead of content into FeedsFetcherResult, FeedsFileFetcherResult can be used in this case.
Comment #46
yuseferi CreditAttribution: yuseferi commented
I have the same problem. It works correctly locally, but when I deployed my site to an Ubuntu 16 server and tried to import from YouTube, I hit this error, and there isn't any log.
Comment #47
flyke CreditAttribution: flyke commented
This answer is not exactly Feeds-related/specific, but it is very much related to the reported error, and since this thread is the top result in Google when searching for the error, I'm posting it here.
I'm having the same problems with a custom product importer using the Batch API (not using feeds).
Same code, same file works without problem on my local development environment, but fails on the production site:
Er is een AJAX HTTP fout opgetreden. HTTP-resultaatcode: 500 Debug informatie volgt. Pad: /nl/batch?id=1018&op=do Statustekst: Internal Server Error Antwoordtekst:
(this is the Dutch version of the original error reported in this issue)
Since this thread shows there can be multiple reasons for the error, here are some general things you can do
Comment #48
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
I'm working on an issue to prevent the raw data from being saved in the feeds_source table. At the same time, I'll see if I can optimize the code in terms of memory. The raw data is being passed around a lot, which causes several copies of it to be held in memory.
#2829097: Don't store raw source in feeds_source table
Comment #49
MegaChriz CreditAttribution: MegaChriz as a volunteer commented
Closed #1232514: CSV node import throwing AJAX 500 error?, which also reports a few useful possible causes.