I did the initial import of the database from EndNote, about 2500 entries, and it loaded the entire file in one go.

With the features of Drupal 4.7 you should be able to do incremental imports, so that the memory requirements are not as large and there is no chance of a timeout.
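
To illustrate the idea (this is a language-neutral sketch, not the Biblio module's actual code): stream the XML export instead of loading the whole file, and hand records off in fixed-size chunks so memory stays bounded. The `record` element name and the chunk size of 100 are assumptions for the example.

```python
import io
import xml.etree.ElementTree as ET

def import_in_chunks(xml_bytes, chunk_size=100):
    """Yield lists of at most chunk_size parsed records from an XML stream."""
    chunk = []
    # iterparse streams the file; we never hold the full tree in memory.
    for event, elem in ET.iterparse(io.BytesIO(xml_bytes), events=("end",)):
        if elem.tag == "record":
            chunk.append({child.tag: child.text for child in elem})
            elem.clear()  # release the parsed element so memory does not grow
            if len(chunk) >= chunk_size:
                yield chunk
                chunk = []
    if chunk:
        yield chunk

# Example: 250 records processed in chunks of 100 -> batches of 100, 100, 50.
xml = b"<records>" + b"".join(
    b"<record><title>t%d</title></record>" % i for i in range(250)
) + b"</records>"
sizes = [len(c) for c in import_in_chunks(xml, 100)]
```

Each yielded chunk could then be saved and the request ended, with the next request resuming where the last one stopped.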

Comments

rjerome’s picture

Good idea, but I'm not sure how I would handle it. What format are you using for import?

I'll have to check and see what the biggest memory/time waster is, the parsing or the DB inserts (I suspect it's the inserts, since they are done one at a time through the standard "node_save" mechanism).

gordon’s picture

I am actually using the XML format to do imports.

I actually started developing a biblio module to do this early last year, but I never got as advanced as your module. I did, however, have incremental imports running under 4.7. It was most likely slower, as it would open the XML file each time, but it used a lot fewer resources.

rjerome’s picture

So you are saying that it parsed the XML file for each entry? (i.e., 2500 times?) Any chance I could see the code? Maybe I could use some of it.

By the way, how long did it take to import 2500 records? I presume you had to modify the script's run-time limits to get it to finish.

gordon’s picture

No, not on each entry, but on each request, and each request would process between 50 and 100 entries.

So it would take longer than processing them all at once, but it is nicer to the web server.
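
A minimal sketch of the per-request approach described above (the function name and batch size are made up for illustration): each request reloads the record list, skips what has already been processed, and handles only the next slice, so no single request runs long enough to time out.

```python
def process_next_batch(records, offset, batch_size=50):
    """Process the next slice of records starting at offset.

    Returns (number processed, new offset) so the caller can persist
    the offset between web requests and resume on the next one.
    """
    batch = records[offset:offset + batch_size]
    for record in batch:
        pass  # save the record here (e.g. via node_save in Drupal)
    return len(batch), offset + len(batch)

# Simulate several requests working through 120 records, 50 at a time.
records = list(range(120))
offset = 0
batches = []
while offset < len(records):
    done, offset = process_next_batch(records, offset, 50)
    batches.append(done)
# batches -> [50, 50, 20]
```

Reopening and re-skipping the file on every request is what makes this slower overall, but each individual request stays cheap.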

rjerome’s picture

I made some changes since you brought this up. Previously, large chunks of data (the complete contents of the file and the node array) were being passed by value; they are now passed by reference, so that should help memory-wise. There is still the issue of the inserts: they are done one at a time (because it was easier), but they could be done en masse by bypassing the node_insert function.
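
To sketch the "en masse" insert idea (an illustration only, using SQLite in Python rather than Drupal's node_save, with made-up table and column names): collecting the rows and issuing one batched insert replaces hundreds of individual round trips.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE biblio (id INTEGER PRIMARY KEY, title TEXT)")

rows = [(i, "title %d" % i) for i in range(500)]

# One executemany() call inside one transaction, instead of 500
# separate execute() calls (the one-at-a-time node_save pattern).
with conn:
    conn.executemany("INSERT INTO biblio (id, title) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM biblio").fetchone()[0]
```

The trade-off the thread hints at is that bypassing node_save also bypasses the hooks and validation it normally runs.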

ntripcevich’s picture

Version: 4.7.x-2.4 » 5.x-1.9

Following up on this.
I just imported 600+ records using the latest release of Biblio for D5.7, and I was also unable to upload EndNote 8+ XML files with more than 100 records. By breaking the library into chunks of 100 (as Groups in EndNote) and exporting each to XML, it went OK, but the problem described above seems to persist.
Thanks

bekasu’s picture

Status: Active » Closed (fixed)

Upgrade to Biblio 6.x and use the batch upload.
Marking issue closed.