Token integration was added in the (now) v2 branch but as that code branch is a bit of a clusterfsck I don't want to continue it. Instead, I'd like to cleanly re-implement token integration in the v1 branch.
Comment | File | Size | Author |
---|---|---|---|
#25 | nodewords-6.x-1.14-patch-Tokens.zip | 77.59 KB | aiphes |
#20 | TokenSupport-1380362-20.patch | 9.95 KB | mrP |
#21 | BasicTokenSupport-1380362-21.patch | 9.95 KB | mrP |
Comments
Comment #1
AlfTheCat CreditAttribution: AlfTheCat commented
Hi Damien,
Are tokens already usable in the new (and awesome) 1.13 version?
All the best!
Comment #2
DamienMcKenna
@AlfTheCat: no, there's no Token support in the new release, unless you add it yourself via the hooks. Hopefully for the next release.
Comment #3
bjalford CreditAttribution: bjalford commented
Will this include Open Graph token support?
Comment #4
DamienMcKenna
@bjalford: It'll be possible to use tokens in Open Graph tags, is that what you mean?
Comment #5
bjalford CreditAttribution: bjalford commented
Yes, thanks.
Comment #6
DamienMcKenna
I've added the following note to the README.txt and project page:
Comment #7
DamienMcKenna
Decided to bump this to the v2 branch, which will be rebuilt from the current v1 codebase.
Comment #8
com_net CreditAttribution: com_net commented
I think token support was the main point of this module. How can I go back to version 1.11?
With 1.13, all my pages look completely wrong in the search results.
Comment #9
DamienMcKenna
You would have to go back to one of the v1.12 betas, but they had problems too.
Comment #10
notluap CreditAttribution: notluap commented
Any update on token integration? In my opinion, not having tokens working makes this module almost counterproductive, as Google will punish your site for having identical meta descriptions on every page.
Comment #11
DamienMcKenna
Marking all v2 issues as postponed while v1 is finished off.
Comment #12
hanoii
I haven't dug into the code yet, but I would ask you to reconsider adding token support to v1. It shouldn't really be that hard to implement; it's just a matter of properly adding a couple of token-replace calls before outputting the tags, or am I missing something? I am willing to try to submit a patch if this might be considered.
Comment #13
DamienMcKenna
@hanoii: I'll happily review a patch. My concern with tokens is that when you check empty($tag['value']) it will find the token string, not its output, so the current logic of using defaults if the overrides are empty will require more processing.
Comment #14
hanoii
This would only happen if the override has a token that expands to an empty string, right?
You could always check for empty after the replacement of tokens has happened.
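The replace-then-check ordering being discussed could be sketched roughly as follows. This is illustrative only, not code from any patch in this thread: the helper name is hypothetical, and it assumes the Drupal 6 contrib Token module, which provides token_replace().

```php
<?php
// Illustrative sketch, not part of any committed patch. Assumes the Drupal 6
// Token module's token_replace() is available. The helper name
// nodewords_example_prepare_value() is hypothetical.
function nodewords_example_prepare_value($override, $default, $node = NULL) {
  // Expand tokens such as [title] or [site-name] in the override first.
  $value = token_replace($override, 'node', $node);
  // Only after replacement do we know whether the override is really empty,
  // e.g. when it contained a single token that expanded to ''.
  if (trim($value) === '') {
    $value = token_replace($default, 'node', $node);
  }
  return $value;
}
```

The point is simply that the empty() check moves to after token expansion, so an override consisting only of a token that expands to nothing still falls back to the default.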
Comment #15
Dries Arnolds
hanoii, are you still working on a patch? I'll gladly review it.
Comment #16
chadmkidner CreditAttribution: chadmkidner commented
Hello, after a long time of putting off "upgrading" from the 1.2 (or 2.x) release, as it was working great for me, I finally did so yesterday (mostly due to the annoying error constantly notifying me that my release was no longer supported).
I have to comment, though: I was disappointed to find that token support was removed from the module. I have to agree with what others have already mentioned; this was certainly one of the primary reasons to use this module in the first place. If I wanted the same default meta info on every page I would simply hard-code it; however, as also previously mentioned, search engines penalize sites for that.
I think that if this is going to be a feature request, it needs to be filed under the 1.4 branch, as that seems to be the current version in development, and the 1.2 (or rather 2.x) version already has token support. I would personally say that this is a pretty vital feature for this module, and I hope the maintainers agree and choose to change this to active again soon.
If attention gets put into this re-implementation soon I would definitely be on board to help and test it for bugs. Thanks for all the work put into this module!
Comment #17
marcoBauli CreditAttribution: marcoBauli commented
Yet another +1 for reimplementing token support, and I am very willing to contribute with a bounty and testing.
PS: if token support is nasty and requires lots of work, it would at least be useful to be able to use the default 'page title' together with the 'site name divider' text. This, plus custom per-path meta tags, can handle quite a few combinations.
Comment #18
hanoii
I believe page title and site name are the standard default.
Comment #19
marcoBauli CreditAttribution: marcoBauli commented
@hanoii: the problem is that 'site name divider' works only if you first specify a 'page title', which makes custom page paths almost unusable.
As it is now, if you specify:
PATH: countries/*
PAGE TITLE:
SITE NAME DIVIDER: holidays and vacations |
does not "take" and nothing happens on your countries/* pages. To make it work you need to specify 'page title' also, for example:
PATH: countries/*
PAGE TITLE: Foo
SITE NAME DIVIDER: holidays and vacations |
but this outputs "Foo holidays and vacations | Sitename" on all countries/* pages, which is no good both for usability and SEO.
Unless I am missing something...
Comment #20
mrP CreditAttribution: mrP commented
The patch adds token support to nodewords_basic for description, abstract, keywords, and copyright only. The available tokens are quite limited (i.e., no taxonomy, vocabulary, body, or teaser tokens), though adding the others present in the old 2.x version should be pretty straightforward.
It's probably junk, but at least it's a start.
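Conceptually, that kind of integration boils down to running the stored tag values through token replacement before they are rendered. A hedged sketch only: the function name and the shape of the $tags array here are assumptions for illustration, not code taken from the attached patch, and it again assumes the Drupal 6 Token module's token_replace().

```php
<?php
// Illustrative only: run token replacement over a set of nodewords_basic
// tag values before output. The function name and the $tags array shape
// are assumptions, not taken from the actual patch.
function nodewords_example_replace_tokens(array &$tags, $node = NULL) {
  // Limit replacement to the tags the patch covers.
  foreach (array('description', 'abstract', 'keywords', 'copyright') as $name) {
    if (!empty($tags[$name]['value'])) {
      $tags[$name]['value'] = token_replace($tags[$name]['value'], 'node', $node);
    }
  }
}
```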
Comment #21
mrP CreditAttribution: mrP commented
...testbot please (but it still needs work, *obviously*)
Comment #22
DamienMcKenna
Changing the status so it doesn't get overlooked.
Comment #23
aiphes
The patch works on the 1.14 version, but tokens like node-teaser are missing.
Comment #24
Summit CreditAttribution: Summit commented
Also missing are the arg[0], arg[1], etc. tokens for URL arguments. These are very important to me.
@aiphes, can you zip your version including tokens? I am not able to get it to work, sorry.
Greetings, Martijn
Comment #25
aiphes
Here it is.
Comment #26
Summit CreditAttribution: Summit commented
Hi Aiphes,
Did you add URL tokens like arg[0] etc., or are you willing to?
Greetings, Martijn
Comment #27
aiphes
I just applied patch #21.
Comment #28
kinaia CreditAttribution: kinaia commented
Why aren't tokens added in the function nodewords_basic_title_prepare()?
Comment #29
JCB CreditAttribution: JCB commented
I can confirm that I just applied patch #21 to the latest dev version, and I now have token support, which meets my requirements.
Comment #30
slybud CreditAttribution: slybud commented
Hello,
Is the integration of this feature/patch on the Nodewords roadmap?
Are there any plans to extend this feature/patch to submodules like Open Graph?
Comment #31
DamienMcKenna
This won't be added to v1, but it will be added to v3: #2274037: Plan for Nodewords v6.x-3.0 release
Comment #32
Summit CreditAttribution: Summit commented
Super! Looking forward to the 6.x-3 release with tokens like arg[0] etc. working again!
Greetings, Martijn
Comment #33
DamienMcKenna
Closed two duplicates: #2159673: OG:Description should check Meta:Description first, #983414: Automatic Taxonomy Term Description using Tokens.
Comment #34
DamienMcKenna
Closed another duplicate: #1978906: Grab Description from another field other than teaser or Body
Comment #35
DamienMcKenna
Thank you for taking the time to work on this. However, the module is no longer supported, so I'm closing this issue.