I'm having a problem using sourceMigration() on a multi-valued term reference. When importing a term_reference field with two values, $source_values contains something like:

array(LANGUAGE_NONE => array(array('tid' => 1), array('tid' => 2)));

this causes $source_key to be built as

array(array('tid' => 1), array('tid' => 2));

and within lookupDestinationID($source_id), the condition is being built as

$query = $query->condition("map.$key_name", array_shift($source_id), '=');

which builds incorrect SQL like:

SELECT map.destid1 AS destid1
FROM {migrate_map_catmsmaterials} map
WHERE  (map.sourceid1 = :db_condition_placeholder_0_0, :db_condition_placeholder_0_1)
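To make the mismatch concrete, here is an illustrative sketch (the values are made up) of the flat shape the map lookup expects for a single-column key versus the nested field structure it receives in this case:

// Flat shape the map lookup expects: one scalar per source key column.
$source_id = array(12);
$query = $query->condition("map.$key_name", array_shift($source_id), '=');
// -> WHERE map.sourceid1 = :db_condition_placeholder_0

// Shape actually received when the value is still a field structure:
$source_id = array(array('tid' => 1), array('tid' => 2));
// array_shift() now returns an array instead of a scalar, which leads
// to the broken condition shown in the SQL above.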

Comments

mikeryan’s picture

Status: Active » Postponed (maintainer needs more info)

I can't reproduce this, but maybe I'm misunderstanding what you mean by "multi-valued term reference". As an example, I'm looking at the WineWineMigration in migrate_example, which maps the BestWith terms thus:

    $this->addFieldMapping('migrate_example_wine_best_with', 'best_with')
         ->separator(',')
         ->sourceMigration('WineBestWith')
         ->arguments(array('source_type' => 'tid'));

One of the wines has multiple best-with values, so $source_values as passed to handleSourceMigration() is:

Array
(
    [0] => 12
    [1] => 15
)

It all works fine in this context. So, what's your context? In particular, how are you getting array(LANGUAGE_NONE => array(array('tid' => 1), array('tid' => 2))), which is a fully constructed field structure, going into handleSourceMigration(), which is called before the prepare handlers that construct those structures?

fgm’s picture

This happens because the source data is already a field structure before any processing. I'm moving fields on entities between two sites using the source in migrate_remote, which transmits field structures transparently instead of reducing them to plain value arrays, and some of those fields have non-single cardinality (two values on that field, for instance).

In this specific case, I worked around it by creating a derived ResMigration and MigrateResFieldMapping, much like the XML equivalents: they save the field data, massage it into the format expected by Migration::applyMappings(), take the data back out after mapping, inject it into the saved field structures, and pass the result on as the source data. It seems this type of work should not be necessary, though; the default behaviour should accept native data such as field structures.
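This is not the actual ResMigration code, but as a rough sketch of the same flatten-then-restore idea, the flattening half could also be done in prepareRow(), assuming a hypothetical field_terms property on the row that holds the remote field structure:

// Rough sketch only; field and property names are hypothetical.
public function prepareRow($row) {
  if (parent::prepareRow($row) === FALSE) {
    return FALSE;
  }
  // Keep the original field structure aside for later re-injection.
  $row->_original_field_terms = $row->field_terms;
  // Reduce it to the flat list of tids that applyMappings() and
  // handleSourceMigration() expect to work on.
  $row->field_terms = array();
  foreach ($row->_original_field_terms[LANGUAGE_NONE] as $item) {
    $row->field_terms[] = $item['tid'];
  }
  return TRUE;
}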

fgm’s picture

Status: Postponed (maintainer needs more info) » Active

Forgot to set the status back to Active.

mikeryan’s picture

Category: bug » feature
Status: Active » Closed (works as designed)

That's just not the API. At applyMappings() time, the input is flat data, which is naturally how the data comes in 95%+ of the time. It's in the prepare() functions, the last thing called before foo_save(), that transformation to field arrays occurs, and everything before that assumes flat data. That is not going to change. Even if handleSourceMigration() dealt with the field arrays, when prepare() got called the field handlers would either choke or push those arrays down a couple more levels.

If you want to copy field arrays directly, without flattening them out up front and having prepare() reconstruct them, the best place to do that would be in your own Migration class's prepare() method: at that point, any flat data migrated will already have been put into the proper field arrays, so copying in field arrays directly will maintain the self-consistency of the destination object.
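A minimal sketch of that suggestion, assuming a node destination and a hypothetical field_terms property on the source row that already holds a complete field array:

class MyRemoteNodeMigration extends Migration {
  // Constructor, source, destination and mappings omitted.

  public function prepare($node, stdClass $row) {
    // By now the field handlers have turned any flat mapped data into
    // proper field arrays, so copying a ready-made field array in
    // directly keeps the destination node self-consistent.
    if (!empty($row->field_terms)) {
      $node->field_terms = $row->field_terms;
    }
  }
}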