The partial migration of stories from Geeklog into Drupal story nodes is a mapping of the *_stories table into Drupal's node table. A quick way to do the transform is to dump the stories out into a format suitable for load data infile:

    select 'story' as type,
        title,
        unix_timestamp(date) as created,
        '' as users,
        introtext as teaser, 
        bodytext as body, 
        unix_timestamp(date) as changed,
        '' as revisions 
            from tc_stories into outfile '/tmp/stories.dump';

This creates the load file. After you've loaded the database schema script from the Drupal distribution, this data can be inserted into the database with:

    load data infile '/tmp/stories.dump' 
        into table node    
        (type,title,created,users,teaser,body,changed,revisions);

This is not a perfect transformation, but it's a start. Geeklog topics are lost: to preserve categories, you would need to pre-load the topics into Drupal, then write a script that inserts items from *_stories as mapped in the example above, fetches the nid (node ID) of each newly inserted record, looks up the matching tid in term_data, and inserts the (nid, tid) pair into term_node.
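A set-based sketch of that mapping, under two assumptions that are not guaranteed by either schema: the terms were pre-loaded into term_data with names matching Geeklog's topic IDs (the tid column of tc_stories), and story titles are unique enough to join on:

```sql
-- Hypothetical: pair imported nodes with terms by joining back through
-- the original Geeklog table. Assumes term_data.name matches
-- tc_stories.tid (Geeklog's topic ID) and that titles are unique.
insert into term_node (nid, tid)
    select n.nid, d.tid
    from node n
    join tc_stories s on s.title = n.title
    join term_data d on d.name = s.tid;
```

If titles collide, joining on the created timestamp instead (both sides came from unix_timestamp(date)) is a safer key.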

Since the terms of our new site were only superficially similar to the categories we'd used in Geeklog, we chose instead to fix up the categories later by doing keyword searches on stories to get a list of nids, then pairing those to the new topics by hand using SQL via the mysql client.
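As a sketch of that by-hand pairing (the keyword and the tid value 5 are made up; substitute your own values from term_data):

```sql
-- Find imported stories mentioning the keyword ...
select nid, title from node where body like '%linux%';

-- ... then file them all under the matching new term (tid 5 assumed).
insert into term_node (nid, tid)
    select nid, 5 from node where body like '%linux%';
```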

Bug: after inserting stories using the above method, the stories will be in the archives, but will not appear on the main page (use update node set promote = 1 to fix this for all or selected items). A more serious bug is that the nodes do not appear in search results -- I don't know that much about the inner workings of Drupal, but I expect someone will post a comment explaining how to fix this.

Comments

handelaar’s picture

Here's what I did to get more of the default fields populated.

I've added 'promote=1' to all imported stories to make sure they're all front-page enabled. Change that to '0' if you don't want your imported nodes to appear on the front page.

In the first line here, change 'story' to 'blog' or some other node type if that's what you'll need:

select 'story' as type, 
title, 
unix_timestamp(date) as created, 
unix_timestamp(date) as changed, 
'2' as comment, 
'1' as promote, 
'0' as moderate, 
'1' as format, 
'1' as uid, 
'0' as score, 
'1' as status, 
'' as revisions, 
introtext as teaser, 
bodytext as body 
from gl_stories into outfile '/tmp/gldump.txt';

The corresponding import command at the Drupal end:

load data infile '/tmp/gldump.txt' into table node 
(type,title,created,changed,comment,promote,moderate,format,uid,score,status,revisions,teaser,body);

The next part, missing above, is very important.

You must also alter the value of 'node_nid' in the 'sequences' table to reflect how many new items have been added to the node table. So if you've added 100 rows this way, add 100 to the current value of node_nid. Otherwise Drupal will try to use node IDs which you've just taken when you try to add new content, and you'll get an error.
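A sketch of that adjustment, assuming the Drupal 4.x sequences table layout (name and id columns) and an import of 100 rows; substitute your own row count:

```sql
-- Bump the node ID counter past the imported rows (100 is an example).
update sequences set id = id + 100 where name = 'node_nid';
```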

Note also that my using '1' as the value for 'uid' in this example will assign all the imported items to the Drupal user with UID 1, a.k.a. the godlike admin user you created the site with.

handelaar’s picture

To get the imported nodes into the search index, you need to run cron.php.

Thanks to almaw from #drupal-support for that last tip.

seannyob’s picture

OK, so I absolutely had to migrate my user table too. Here's how I did it. I must say that this is a rather lousy hack, and very probably a terrible idea, and hopefully someone who is not a newbie to Drupal will figure out a more elegant solution. However, it worked for me, even though it's kind of clunky.

Please note that this hack doesn't correct for possible problems arising from duplicate uids in the two tables -- I'm assuming you're installing into a fairly blank Drupal database. If the Drupal users table auto-incremented uids (anyone?), this wouldn't be an issue, but it doesn't. If you already have lots of users in your Drupal database, a possible workaround is to manually edit the outfile after the select statement. (Anyone have a better idea?)

The password encryption methods are the same, so we're essentially copying encrypted passwords; things work fine in that department.

  1. Back up your drupal database:

    $ mysqldump -u you -p drupaldb > ~/drupalbackup.today.sql

    This will dump a valid sql file into your home dir.

  2. Verify you got a good mysqldump:

    $ cat ~/drupalbackup.today.sql

  3. We're going to run a mysql select query on our geeklog database to create a file with helpful data that we can plug into our drupal database.

    Much like our content migration, above, this is not complete but we'll deal with that in the following steps. To wit:

    Log into your geeklog database and run the following query:

    select uid,
    username as name,
    passwd as pass,
    email as mail,
    sig as signature,
    '1' as status,
    email as init
    from gl_users where uid > 4
    into outfile '~/user_migration.txt';
    

    ...or something like that, depending on your geeklog database. In mine, I didn't want to include the Administrator, Moderator, and sean accounts (uids 2-4), so I took everything > 4. Note that MySQL writes the outfile on the database server and does not expand ~, so you may need an absolute path such as '/tmp/user_migration.txt'.

    Verify you have data in a tab delimited file:

    $ cat ~/user_migration.txt

  4. Ok, now in our drupal database: (you already backed it up, right?)

    load data infile '~/user_migration.txt' into table users
    (uid, name, pass, mail, signature, status, init);
    
  5. If you care, the "created" column in the drupal users table is now blank. What I did, instead of worrying about converting things, was just grab a current UNIX time stamp [ http://www.csgnetwork.com/time2unixdscalc.html ] and, in the drupal database, run

    update users set created='1103330968' where uid > 2;
    

    where 2 is the uid of your highest existing drupal user before the migration. Of course this is lame, but I didn't want to deal with researching how to migrate the time. Someone else might know off the top of their head. All your users will have the same creation time, which will be whatever you put in there.

  6. (Here's the annoying part. I really hope a drupalguru will find a way to automate this for you.)

    Then, I logged in as admin in drupal and went to user administration. On each user's page, I checked "Registered User," which is the only role I've got defined and the only one other than anonymous by default. I did this for each user I imported. This populated the "data" column in the "users" table with the appropriate data. It also placed these users' uids in the "users_roles" table, which I did not bother writing a SQL statement for.
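A sketch of automating the users_roles part of that step in SQL, assuming the "Registered User" role has rid 2 (check your role table first) and that the imported accounts are every uid above your old maximum (2 in this walkthrough):

```sql
-- Hypothetical: grant the 'Registered User' role (rid 2 assumed) to
-- every imported account in one statement instead of clicking through
-- each user's page. This covers users_roles only; the 'data' column
-- was populated via the admin UI in the walkthrough above.
insert into users_roles (uid, rid)
    select uid, 2 from users where uid > 2;
```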

  7. Then I logged in as my old geeklog test user to my new Drupal CMS and enjoyed the view.
  8. Then I had a cup of cappuccino.

--
Sean K. O'Brien
CTO
Colley Graphics, LLC
http://www.colleygraphics.com

blantz’s picture

you can preserve the user registration date with unix_timestamp:

select uid,
username as name,
passwd as pass,
email as mail,
sig as signature,
'1' as status,
email as init,
unix_timestamp(regdate) as created
from gl_users where uid > 4
into outfile '~/user_migration.txt';

this script worked for drupal 4.7

dietman’s picture

Can this be done with excel and nodeimport?

I am trying to do this, because using mysql is too hard for me. Is there a video tutorial for migrating from geeklog?

thelusiv’s picture

I wrote a Python script to do this. It's on GitHub here:
geeklog2drupal.py
Documentation

fender-dupe’s picture

I am confused

How do I run Python?

How to do this?
1. Edit the file to point to your databases, and edit lines 51 and 56 - you must put user IDs in gl_uids_to_convert
2. python geeklog2drupal.py

Can you be more specific please?

I don't know how to run Python.

I am using a local XAMPP server; please advise where to put the file.