Hi,
The memory limit is exceeded while exporting a view's results to XML. We increased the memory limit to 1 GB, but we are still not able to export the search results to XML; it says more memory is required. If there is less data, we are able to export. The number of rows in the search result is below 6,000.

Please help.

Attachment (#3): Views-data-export-batched.png (52.83 KB, Steven Jones)

Comments

Steven Jones’s picture

Category: bug » support
Status: Active » Fixed

Try using the batched export setting, which is designed to handle large exports.

You can enable it by editing the display's settings in the Views UI.

sumithb’s picture

Status: Fixed » Active

Hi,
Thank you for your response, but I don't see a 'batched export setting' on the view's display page.

Actually, the display is attached to another view whose result is exported to XLS. I am sorry I said XML; it is actually Excel (XLS). I want to export it as an Excel sheet.

Thank you

Steven Jones’s picture

Status: Active » Fixed
FileSize
52.83 KB

The principle is the same, XML or XLS.

When editing the display in the views UI, there should be a group called 'Data export settings' that looks like this:

Views-data-export-batched.png

Change the 'Batched export' setting to 'Yes'.

sumithb’s picture

Hi Steven Jones,
Sorry again that I missed the setting. Thank you very much.

Thanks and Regards

Sumith Babu

sumithb’s picture

Status: Fixed » Active

Hi,
I am exporting a search result from Solr, so the issue persists.

I set the 'Batched export' option to 'Yes', but the problem remains: I am not able to export to XML if the search result contains more than 2,000 records.

Thank you

Sumith

bramtenhove’s picture

Hi,

We're having the same issue here.
We are using Search API with Solr and Views Data Export to export records to CSV and XLS files. Even though we use batched export (with a segment size of 10), we hit the memory limit when exporting 750+ records.

Any thoughts on how to solve this?

Kind regards,
Bram ten Hove

xlyz’s picture

Issue summary: View changes

Same issue here, without Search API and with batched export enabled.

A 23-field XLS export needs to be reduced to 600 records to avoid exceeding the memory limit (set at 216 MB).

nbouhid’s picture

How do you know that you're getting memory errors? Are those errors being produced during the batch export? Does it finish?

imclean’s picture

We're also seeing an out-of-memory error with a few thousand records and batching enabled in segments of 10. We get about 12% of the way through before seeing this error:

Error message

An AJAX HTTP error occurred. HTTP Result Code: 200 Debugging information follows. Path: /batch?id=3806&op=do StatusText: OK ResponseText: Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 2919104 bytes) in /path/public_html/includes/database/database.inc on line 2171

PHP (FCGI) has 128MB
MySQL max_allowed_packet = 16M
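As a stopgap while tracking down the real cause, PHP's memory limit can also be raised for the site alone in Drupal's settings.php, which is a standard Drupal override point. A minimal sketch (the 512M value is only illustrative; raising the limit masks the growth rather than fixing it):

```php
<?php
// settings.php: override PHP's memory limit for this site only.
// 512M is an illustrative value, not a recommendation; the per-segment
// memory growth described in this thread will eventually hit any limit.
ini_set('memory_limit', '512M');
```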

imclean’s picture

Correction: we have 404 EntityForm results of about 120 fields each.

The number of fields could be a problem.

imclean’s picture

I've tested a few things now: I increased PHP's memory limit to 256 and then 512 MB. The timeout is now 240 seconds. I'm batching in segments of 1.

Using top, I've kept an eye on the PHP process's memory usage. At first this increases a bit, then drops back to around 80 MB.

Then it goes through the cycle again: memory usage increases to a slightly higher maximum for a little while, then drops back to 80 to 100 MB.

The cycle repeats, with the memory peaks increasing each time, until eventually (around 40% through this time) it reaches whatever limit you've set and fails. In this case: 512 MB.

Something is either left over from each cycle or it needs to take into account all previous data each time a segment is processed.
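The climbing peaks seen in top can also be checked from inside PHP. A minimal, self-contained sketch (plain PHP, not Drupal code; `process_segment` is a hypothetical stand-in for the per-segment export work) that records `memory_get_usage()` after each segment:

```php
<?php
// Hypothetical stand-in for the real per-segment export work.
function process_segment(array $rows) {
    return array_map('strtoupper', $rows);
}

// Record PHP's reported memory usage after each batch segment.
// Roughly flat readings mean per-segment state is being freed;
// steadily rising readings mean data is accumulating across segments.
$readings = array();
for ($segment = 0; $segment < 3; $segment++) {
    process_segment(array_fill(0, 1000, 'row'));
    $readings[] = memory_get_usage();
}
```

Comparing successive readings distinguishes a leak (or accumulated state) from a working set that is merely large but stable.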

imclean’s picture

Current Virtual Server Settings

Batch segments: 1
PHP Memory: 512MB
PHP_FCGI_MAX_REQUESTS: 100 (in php5.cgi)
Timeout: 240 seconds

Global Settings

MySQL: increased innodb_buffer_pool_size to 256 MB (up from 8 MB)

The memory limit was reached again at 40%.

To rule out a memory leak, I reduced PHP_FCGI_MAX_REQUESTS to 10. This has increased the turnover of new php-cgi processes, but the memory usage of each one keeps increasing from the previous one. The time before a new process starts is also increasing, as each one takes longer than the previous one.

At 20% it fails with an Apache error:

Connection reset by peer: mod_fcgid: error reading data from FastCGI server, referer: http://myurl.com/batch?op=start&id=3817

With PHP_FCGI_MAX_REQUESTS set to 5 it gets to 30% with the same "Connection reset by peer" error.

imclean’s picture

I suspect entityform's hook_views_pre_view() is being executed for each batch segment, adding the fields each time.
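If that suspicion is right, the usual fix is to guard the hook so the fields are added only once per request, no matter how many batch segments re-trigger the view. A self-contained sketch of the pattern (plain PHP; `mymodule_add_export_fields` and the field names are hypothetical, and real Drupal code would typically use `drupal_static()` rather than a function-local static):

```php
<?php
// Add the export fields for a view only once per request, even when a
// batched export invokes the view once per segment.
function mymodule_add_export_fields(array &$fields, $view_name) {
    static $added = array();
    if (isset($added[$view_name])) {
        return; // Fields were already added during an earlier segment.
    }
    $added[$view_name] = TRUE;
    for ($i = 0; $i < 120; $i++) {
        $fields[] = "field_$i";
    }
}

// Simulate five batch segments hitting the same view.
$fields = array();
for ($segment = 0; $segment < 5; $segment++) {
    mymodule_add_export_fields($fields, 'entityform_export');
}
// With the guard, the field list stays at 120 instead of growing to 600.
```

Without the guard, the field list (and memory) grows linearly with the number of segments, matching the climbing peaks reported above.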

imclean’s picture

Category: Support request » Bug report

...and that's exactly what's happening.

Entityform issue: #2173371: out of memory on large CSV exports of submissions

imclean’s picture

Category: Bug report » Support request

Not really a bug in views_data_export itself, so moving this back to a support request.

Steven Jones’s picture

Status: Active » Closed (fixed)

Closing, as this was fixed in #3