I'm stumped. My import of an uploaded XML file (using the XPath parser) works partially. It ALMOST finishes and imports a ton of nodes (400+). But then it stops and gives me this aggravating, non-descriptive error:

An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /myavail/batch?id=489&op=do StatusText: error ResponseText: 500 Internal Server Error Internal Server Error The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log. Apache Server at www.masoninnovation.org Port 80

My error log doesn't have this error.

More background: I'm importing an XML feed into a content type using a standalone form. I'm using XPath mapping, which is pretty simple and either works or doesn't, so I'm not sure how that could be causing this error. I'm using a fairly high-volume dedicated server on (regrettably) Godaddy.com (this is a test project that I'm toying with in a subdirectory). I'm using Drupal 7.x dev (latest commit).

I'm hoping that someone else has experienced a general problem with 500 errors and has solved it. I've tried a few things:

1. On the chance that I was experiencing a timeout, I increased the PHP memory limit all the way to 256M! I also set the timeout settings to 0 (no limit). (See the snippet after this list.)

2. I deactivated almost every module...

3. I simplified my content type and removed field permissions, conditional fields, user references, and other things that were unrelated to Feeds.
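
For reference, here's roughly what I changed (a sketch via settings.php; note that Apache or FastCGI can enforce their own timeouts that PHP settings can't override):

// settings.php: the runtime equivalent of the php.ini changes in step 1.
ini_set('memory_limit', '256M');
// 0 removes PHP's execution-time limit; the web server may still time out.
ini_set('max_execution_time', 0);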

I have a couple of really bad guesses as to what's causing the problem:

A. One of my fields is a number (set up as an integer). I'm importing numeric data into that field. Does Feeds support importing into an integer field type? I would assume so, but I'm grasping here.

B. There's something wrong with my .htaccess file (not sure why - I haven't ever even edited it).

C. There's something about using a subdirectory install that impacts Feeds (I really doubt it).

D. Even Godaddy's expensive dedicated servers can't handle tough tasks like importing 500 nodes (won't rule it out).

Anyway. Just want to make sure there isn't any low-hanging fruit I'm missing that could resolve this issue (i.e., other people who have experienced this partial-import issue). I've also contacted Godaddy to see if they can help tell me what's wrong with their server.


Comments

paulgemini’s picture

Version: 7.x-2.0-alpha4 » 7.x-2.x-dev

changing issue to the correct version

paulgemini’s picture

I should add that the error pops up after about 3 minutes of sitting in the batch screen with the blue line stuck at "Initializing". Debugging fields is useless because I never get a chance to see them.

paulgemini’s picture

OK, after some investigation, I noticed that my database was huge. With literally 2 nodes (I cleared my nodes and my search index), it was 9 MB! The culprit?

The feeds log

Clearing that and will get back to you.

UPDATE: nope, didn't help.

johnmmarty’s picture

Version: 7.x-2.x-dev » 7.x-2.0-alpha4

I'm seeing this same error message immediately upon clicking import.

johnmmarty’s picture

On the importer's Basic Settings screen, I checked the "Process in Background" option and bam! Good to go. It seems like a workaround, though. But hey, who needs a progress bar?

paulgemini’s picture

Nice workaround! Will try that.

paulgemini’s picture

Nope. Still got the error. Just in a different format.

manu manu’s picture

Having the same issue with or without background processing.

I noticed that when it crashed, PHP was trying some huge memory allocations:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 3195729 bytes) in /path/to/html/includes/database/database.inc on line 2095

Background info:
- the memory limit is very high in my PHP CLI config file, as I process my feeds with Drush
- the feed is a 12 MB XML file parsed with the XPath XML parser
- the content type has a file (image) field fetched by Feeds

paulgemini’s picture

Same deal. I upped my memory to an absurd level, removed all timeout limits, and I'm still getting this.

I also import an image. I'll try removing the image field from the import and report back.

johnmmarty’s picture

paulgemini, sorry it didn't work for you. I moved on to a larger import last night: 7000 records, and it hits 50 and quits every time. Very frustrating. I'm considering setting up the import as a cron job that processes files on the server. Then I would have to cut the file down into 140 files of 50 records each. Oh, what fun.

paulgemini’s picture

What web host do you use?

johnmmarty’s picture

I'm doing this on my local MacBook Pro. When I migrate I'll use Navicat to sync my databases. I've used GoDaddy and SiteGround as hosts in the past, but as of now I'm not in production with this site.

johnmmarty’s picture

I upped all my php.ini limits last night and it was looking good; then I hit a MySQL issue. I tried to restart MySQL and it wouldn't start at all. I'll have to go digging in the logs when I get home to see what's happening.

kkuhnen’s picture

subscribe

Dale Baldwin’s picture

Are you sure it's not an XPath parser issue? I say this because I managed to get a feed off the same data with basic fields (title and body), but after that everything has failed.

The two errors I keep getting are

Error message
Cannot acquire lock for source tt_product_feed / 2941.

and

An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /?q=batch&id=42&op=do StatusText: Internal Server Error ResponseText:

Dale Baldwin’s picture

Another interesting thing I just found: I had a look through the MySQL database and my files folders. Some of the images in my feed had been pulled down and saved, and there were database records for each of them. But when I checked a bunch of other fields, such as the body fields, there was nothing in there. So it increasingly looks like there is an issue with Feeds writing text content into MySQL.

Dale Baldwin’s picture

Apache's error.log has this sitting in it:

[Fri Aug 05 16:15:05 2011] [error] [client 192.168.1.145] PHP Fatal error: Unsupported operand types in /data/drupal7/sites/all/modules/feeds/includes/FeedsConfigurable.inc on line 149, referer: http://192.168.1.118/?q=batch&op=start&id=7

147 public function getConfig() {
148 $defaults = $this->configDefaults();
149 return $this->config + $defaults;
}

Not sure if that helps anyone. I'm not great with PHP, but that's at least the end of what I've been able to discover today.
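
For what it's worth, a defensive guard there might look something like this (just a sketch, not a committed fix; it assumes $this->config can end up as something other than an array):

public function getConfig() {
  $defaults = $this->configDefaults();
  // If config was never set (or unserialized to FALSE), applying the "+"
  // operator to a non-array is what triggers "Unsupported operand types".
  if (!is_array($this->config)) {
    return $defaults;
  }
  return $this->config + $defaults;
}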

thanks

Dale Baldwin’s picture

Looked further through the bug reports; try this: http://drupal.org/node/1213472#comment-4754062

Worked for me without a hitch.

John Bryan’s picture

If you are not using taxonomy with Feed items, then this comment is not relevant.

If you are trying to have RSS or XML (etc.) "Feed Item" nodes inherit a taxonomy term from the parent "Feed" importer node and are seeing "An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows.":

Multiple Feeds project issues have been created for this or similar messages, but the relevant one appears to be:
"taxonomy_node_get_terms doesn't work with drupal 7"
http://drupal.org/node/959984

askandlearn’s picture

I am also getting this error.
/batch?id=21&op=do StatusText: Internal Server Error ResponseText:
There is nothing in the error log.

I am not doing taxonomy nor do I have images. I've cleaned out the csv file of any odd characters. It is text only.

On a desktop Mac with a large memory limit in php.ini.

Any clues would be appreciated.

Jason Dean’s picture

I was getting this error with a fairly small import, no images, no taxonomy, using Node Processor and Xpath XML parser.

I found that in my mapping settings, I'd selected two fields as unique identifiers. I thought that was OK, but as soon as I deselected one of them, the entire import worked with no errors.

So I think this error can be a bit misleading... there seems to be a variety of potential causes. Just remember to check your mapping settings when you are trying to eliminate it from your import :)

ursula’s picture

I am having a similar problem. I'm importing 5000 nodes from a text file on the server. At 92%, it gives that AJAX error. I found the following in the logs:
PHP Fatal error: Maximum execution time of 30 seconds exceeded in /home/drupal/includes/database/database.inc on line 429, referer: http://example.com/batch?op=start&id=215

My database server also runs a big (non-Drupal) site, so updates sometimes take longer because the other service is loading data.

I am fine with restarting the import. However, the import page now claims that it is still importing, and I don't seem to be able to stop the import attempt (see attached screenshot).

I'd be happy to manually remove entries from database tables. If I just knew which ones to remove ...

As a workaround, I am now deleting the 4500 nodes it already imported (and the 1000 I imported earlier).

Any help greatly appreciated.

Ursula

lafingguy’s picture

Hi folks,
I'm having the same 500 error importing nodes as group nodes.
I'm also getting a "Cannot acquire lock for source ..." error message.

I'm running on a macbook pro and previously loaded 30K users and profile2s without a hitch.

A couple of additional things I've noticed:

  1. The error happens after ~32 records, and the record size doesn't seem to matter
  2. If I delete the processed rows from the file, rename it, and then run it, I can avoid the hung state where the button says "XX% processed"

Why aren't more people having issues with this?

Ivan Simonov’s picture

I am also getting a "500 error" message.
I tried to import 2 products and got a button stuck at "50% processed".

lafingguy’s picture

You can get rid of that button issue by going into the database and deleting the entry in feeds_source that matches the hung import (see the sketch below).

It's a hack but it does work :-(
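
Using Drupal 7's database API, that's something like this (a sketch; it assumes your importer's machine name is 'my_importer' and the feed runs from a standalone form, i.e. feed_nid 0):

// Remove the stuck feeds_source row so the importer forgets its hung state.
// Warning: this also discards the stored source configuration, so back up first.
db_delete('feeds_source')
  ->condition('id', 'my_importer')
  ->condition('feed_nid', 0)
  ->execute();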

Ivan Simonov’s picture

Thanks, lafingguy.
Clearing the "state" field for this importer is enough.

bancarddata’s picture

Version: 7.x-2.0-alpha4 » 7.x-2.x-dev

I, too, am experiencing a similar issue with the latest dev copy of feeds. It is a CSV parser with 819 records in the CSV file. It hangs at 98% and then gives the same 500 AJAX error as originally posted. If I go in and clear the state for the importer, the import page then shows all 819 records were imported. I am not importing any files, but I do use quite a few text, taxonomy, numeric, etc. fields. I am also using the uc_feeds module which adds some Ubercart-specific fields to the Mappers. Whatever the problem is, at least for me, it seems like it is happening during some sort of end process that occurs after the CSV import has finished.

My Apache log shows a corresponding "PHP Fatal error: Allowed memory size of 167772160 bytes exhausted" error message. One thing I noticed is it reports the faulting script as "drupal/includes/database/log.inc on line 144". Is something going wrong during the part where it writes the log entries? Or is this error just a side-effect of the original AJAX error?

edit: Nevermind; this issue appears to be coming from an add-on Feeds processor (the Garbage Collector at #661314: "Sync" or "cache" mode).

rakesh.gectcr’s picture

Yeah, it's working for me.

Go to your database,

and delete the row for that entry from the "feeds_source" table.

Thanks @lafingguy :P

Ivan Simonov’s picture

Status: Active » Needs review

Guys, check this issue: #1487670: CSV Parser: incorrect UTF-8 interpretation (looks like limited number of imported fields)
After I applied the fix, the errors disappeared.
I guess the frozen import and the 500 error may depend on incorrect UTF-8 interpretation.

franz’s picture

Status: Needs review » Active

No patch here, so can't be "needs review".

hop’s picture

Component: Feeds Import » Miscellaneous
Assigned: Unassigned » hop

When I run Bulk Update, I see this error message: "An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /batch?id=306&op=do StatusText: Internal Server Error ResponseText:".

When I generated URL aliases with the Views Bulk Operations (VBO) module, it was all OK :)

I recommend the patch at http://drupal.org/node/1415930#comment-5511014

PatchRanger’s picture

Status: Active » Fixed

Problem
Import with the Batch API throws an AJAX 500 error and the import doesn't complete.

Reason
The Batch API can be broken by any output that your code tries to display:
1) dpm/dsm/other debug commands
2) warning or error messages displayed to the screen

Solution

  1. Make sure that none of your modules use dpm or dsm commands.
  2. Check your messages in the Feeds log. They are a problem because they are also thrown to the screen, which breaks the Batch API.
    Sometimes you can't eliminate the cause of these messages because of invalid markup in the parsed source (this was my use case with XPathParser); in that case, just turn off displaying warning messages entirely (see the snippet after this list).
    Note: make sure to do it twice (first globally, then in the settings of your FeedsParser; XPathParser has such an option).
  3. Try switching to importing with a background process by checking the "Process in Background" option.
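
For the global part of step 2, a minimal sketch (assuming Drupal 7, where 0 corresponds to ERROR_REPORTING_HIDE):

// settings.php: hide all on-screen PHP messages so stray warnings can't
// corrupt the JSON responses that the Batch API's AJAX calls expect.
$conf['error_level'] = 0;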

Marking this as fixed because it worked for me.
Feel free to re-open if it does not for you.

PatchRanger’s picture

Status: Fixed » Closed (works as designed)

It seems that "works as designed" is a more appropriate status for it.

oglok’s picture

Hi guys!!

I get this error:

An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /batch?id=43&op=do StatusText: Internal Server Error ResponseText: {"status":true,"percentage":"33","message":"Completed 2 of 6.\u003Cbr \/\u003E"}

I have my website on shared hosting... do you think it could be a problem with the server? I cannot access all of Apache's settings files, so I don't know how to fix it!

Help, I need somebody's help!!!

Thanks guys!

PatchRanger’s picture

@oglok In my case the problem was in the parser.
Which Feeds parser do you use?
I use FeedsCrawler, and the AJAX 500 error was gone after this patch: #1777438: FeedsCrawler breaks BatchAPI.
There you can find a link related to this problem that may be useful to you.

xeniak’s picture

Version: 7.x-2.x-dev » 7.x-2.0-alpha7
Status: Closed (works as designed) » Active

No, the suggestions in #33 don't help me at all.
Here's what I'm getting:

An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /batch?id=1113&op=do StatusText: Service unavailable (with message) ResponseText: {"status":true,"percentage":"18","message":""}PDOException: SQLSTATE[42803]: Grouping error: 7 ERROR: column "s.conditions" must appear in the GROUP BY clause or be used in an aggregate function LINE 1: SELECT s.sid AS sid, s.conditions AS conditions ^: SELECT s.sid AS sid, s.conditions AS conditions FROM {notifications_subscription} s LEFT OUTER JOIN {notifications_subscription_fields} f ON s.sid = f.sid WHERE (s.status = :db_condition_placeholder_0) AND (s.send_interval >= :db_condition_placeholder_1) AND(( (s.type = :db_condition_placeholder_2) AND(( (f.type = :db_condition_placeholder_3) AND (f.intval = :db_condition_placeholder_4) )))OR( (s.type = :db_condition_placeholder_5) AND(( (f.type = :db_condition_placeholder_6) AND (f.value = :db_condition_placeholder_7) )))OR( (s.type = :db_condition_placeholder_8) AND(( (f.type = :db_condition_placeholder_9) AND (f.value = :db_condition_placeholder_10) ))))AND (s.sid > :db_condition_placeholder_11) GROUP BY s.sid HAVING (COUNT(f.sid) = s.conditions) ORDER BY s.sid ASC LIMIT 100 OFFSET 0; Array ( [:db_condition_placeholder_0] => 1 [:db_condition_placeholder_1] => 0 [:db_condition_placeholder_2] => content_thread [:db_condition_placeholder_3] => node:nid [:db_condition_placeholder_4] => 48796 [:db_condition_placeholder_5] => content_type [:db_condition_placeholder_6] => node:type [:db_condition_placeholder_7] => article [:db_condition_placeholder_8] => content_type_term [:db_condition_placeholder_9] => node:type [:db_condition_placeholder_10] => article [:db_condition_placeholder_11] => 0 ) in Notifications_Event->get_subscriptions() (line 507 of /var/botswana/sites/all/modules/notifications/notifications.event.inc).

Grouping error?

xeniak’s picture

Status: Active » Closed (works as designed)

In our case, it's some kind of conflict with either the Messaging or Notifications module, or both. So it may not be a Feeds bug at all.

jenlampton’s picture

Version: 7.x-2.0-alpha7 » 7.x-2.0-alpha8
Status: Closed (works as designed) » Active

I'm having the same issue with the iCal parser. I can import just over 400 nodes (98% of my set) and then I get the error. I'm on Pantheon, and I recognize the "Service Unavailable" as the standard Rackspace Cloud message.

I expect this is happening because the batching here is not actually working (see #1363088: Feeds batch process not fully implemented - needed for large imports). I have increased my feeds_process_limit variable from 50 to 500 in order to import all my records at once, but since there's no batching, this maxes out Pantheon's resources.

twistor’s picture

This issue is kind of a crapshoot.

I suspect there have been a few different causes over the years.

@jenlampton, Batch processing won't affect your issue with iCal, since from what I can tell, date_ical batches in the parser correctly. Batch processing is only affected by parsers that parse everything at once, rather than respecting the batch limit.

You'd actually want to lower your process limit to get around resource issues. feeds_process_limit is what determines the batch size, so if you're running out of memory, then you want to decrease the value.
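
For example, a one-time sketch (run via drush php-eval or similar; Feeds falls back to its default of 50 when the variable is unset):

// Shrink each batch pass to lower peak memory per request.
variable_set('feeds_process_limit', 25);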

mrpotatohead’s picture

Try "enabling" Clean URLs :

admin/settings/clean-urls

After trying a million other things all day, that finally did the trick for me.

Temoor’s picture

It seems this issue has collected a lot of different bugs with similar symptoms: the batch fails with some error.
I used the FTP Fetcher and the XPath XML parser. In my case it was a 500 error that appeared when Feeds tried to save its progress state to the database.
The attached patch is only part of the solution; it allows using a file name instead of the file content in FeedsFetcherResult. The other part is to update the fetcher so that it returns a file name instead of content. See #2454615: Pass file name instead of content into FeedsFetcherResult for an example of such a fetcher update.

Temoor’s picture

Status: Active » Needs review

Status: Needs review » Needs work

The last submitted patch, 42: feeds-error-500-partial-import-1219296-42.patch, failed testing.

twistor’s picture

As I mentioned in #2454615: Pass file name instead of content into FeedsFetcherResult, FeedsFileFetcherResult can be used in this case.
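
A sketch of what that looks like in a custom fetcher's fetch() method (assuming the source config holds a local file path, as FeedsFileFetcher's does):

public function fetch(FeedsSource $source) {
  $source_config = $source->getConfigFor($this);
  // Hand back a pointer to the file instead of its raw contents, so the
  // data never has to be serialized into the feeds_source table.
  return new FeedsFileFetcherResult($source_config['source']);
}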

yuseferi’s picture

I have the same problem. It works correctly locally, but when I deployed my site to an Ubuntu 16 server and tried to import from YouTube, I ran into this, and there isn't any log.

flyke’s picture

This answer is not exactly Feeds-related/specific, but it is very much related to the reported error, and since this thread is the top result on Google when searching for the error, I'm posting it here.

I'm having the same problems with a custom product importer using the Batch API (not using feeds).
The same code and the same file work without problems in my local development environment, but fail on the production site:

An AJAX HTTP error occurred. HTTP Result Code: 500 Debugging information follows. Path: /nl/batch?id=1018&op=do StatusText: Internal Server Error ResponseText:

(this is the error originally reported in this issue; my site shows it in Dutch)

Since this thread shows there can be multiple reasons for the error, here are some general things you can do:

  • check the Drupal watchdog, of course, but chances are you'll find nothing there
  • check your PHP error log; this could show some useful info
  • wrap your import code in try/catch blocks (see the sketch after this list); this should allow the importer to continue to the next record when it encounters a problem. If it then imports fine, you know it was an error in your code, which you can try to debug via the catch block, creating more specific try/catch wrappers until you find the culprit that messes with your import
  • start a timer (not in code; use a real watch or something) from the second you start your importer until the moment it fails. If it always fails at the same round time (exactly 60 seconds, for example), then you know you should adjust a timeout parameter in your php.ini, which is currently set to roughly the number of seconds it takes your import to crash
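
A minimal sketch of the try/catch approach (hypothetical mymodule_* names):

/**
 * Batch operation: import records one at a time, logging failures instead
 * of letting a single bad record kill the whole batch.
 */
function mymodule_import_batch_op(array $records, array &$context) {
  foreach ($records as $record) {
    try {
      mymodule_import_record($record); // Your per-record import logic.
    }
    catch (Exception $e) {
      watchdog('mymodule_import', 'Skipped record @id: @message', array(
        '@id' => isset($record['id']) ? $record['id'] : 'unknown',
        '@message' => $e->getMessage(),
      ), WATCHDOG_ERROR);
    }
  }
}
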
MegaChriz’s picture

I'm working on an issue to prevent the raw data from being saved in the feeds_source table. At the same time, I'll see if I can optimize the code in terms of memory. The raw data is passed around a lot, which causes several copies of it to be held in memory.
#2829097: Don't store raw source in feeds_source table

MegaChriz’s picture

Closed #1232514: CSV node import throwing AJAX 500 error? which reports also a few useful possible causes.