In other words, there should be a flag which, when ticked, allows a message to be sent and saved into the pm_message table without any recipients being entered into pm_index.
This should allow us to have a pm_roles module with a pm_roles_index table that enables scalable mass user messaging (and og can simply duplicate the functionality if needed).
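The proposal could look something like the following sketch. The schema is hypothetical (simplified table layouts, invented column names), and it uses Python with SQLite purely for illustration; the module itself is PHP/MySQL.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Hypothetical simplified schema: role-targeted messages get one row in
# pm_roles_index instead of one pm_index row per recipient.
cur.executescript("""
CREATE TABLE pm_message (mid INTEGER PRIMARY KEY, author INTEGER, body TEXT);
CREATE TABLE pm_index (mid INTEGER, recipient INTEGER);  -- per-user rows
CREATE TABLE pm_roles_index (mid INTEGER, rid INTEGER);  -- one row per role
""")

# Send a message to every member of role 3: one pm_message row,
# one pm_roles_index row, and zero pm_index rows.
cur.execute("INSERT INTO pm_message (author, body) VALUES (1, 'hello, role 3')")
cur.execute("INSERT INTO pm_roles_index (mid, rid) VALUES (?, 3)",
            (cur.lastrowid,))

cur.execute("SELECT COUNT(*) FROM pm_index")
print(cur.fetchone()[0])  # prints 0: no per-recipient rows were written
```

The index table stays small no matter how many users hold the role, which is the point of the flag.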
Comments
Comment #1
litwol CreditAttribution: litwol commented
Consider the negative end of numbers.
Comment #2
NaheemSays CreditAttribution: NaheemSays commented
Eh?
Comment #3
litwol CreditAttribution: litwol commented
Positive numbers could be user IDs, negative numbers could be role IDs.
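That signed-ID convention can be sketched with a hypothetical helper (decode_recipient is illustrative, not part of the module):

```python
def decode_recipient(recipient_id: int) -> tuple:
    """Hypothetical convention: a single integer column where positive
    values are user IDs and negative values are role IDs."""
    if recipient_id > 0:
        return ("user", recipient_id)
    return ("role", -recipient_id)

print(decode_recipient(42))  # ('user', 42)
print(decode_recipient(-3))  # ('role', 3)
```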
Comment #4
Berdir
Hm. There is one big problem when sending messages to groups or roles: we cannot track who read/deleted a message anymore. Additionally, users will receive old messages when they are assigned to a role. That should imho not happen. And they should be able to keep messages they received while they were in a specific group, had a role, whatever.
Actually, I think the best idea might be to use the Batch API to send messages to all users of the given role/group/whatever.
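In the module this would be a Drupal Batch API job written in PHP; the chunking idea itself can be sketched in Python with SQLite (fan_out_in_batches and the table layout are assumptions for illustration, not the module's actual code):

```python
import sqlite3

def fan_out_in_batches(cur, mid, user_ids, batch_size=1000):
    """Write one pm_index row per recipient, one chunk at a time,
    mimicking what a single Batch API operation would process per run."""
    for start in range(0, len(user_ids), batch_size):
        chunk = user_ids[start:start + batch_size]
        cur.executemany(
            "INSERT INTO pm_index (mid, recipient) VALUES (?, ?)",
            [(mid, uid) for uid in chunk],
        )

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE pm_index (mid INTEGER, recipient INTEGER)")

# Fan a message out to 5000 users in chunks of 1000.
fan_out_in_batches(cur, 1, list(range(1, 5001)))
```

Each chunk is a bounded unit of work, so a web request (or batch step) never has to insert the whole recipient list at once.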
Comment #5
NaheemSays CreditAttribution: NaheemSays commented
The problem with this is that if you send a message to all users, or even to a big subgroup of them on a very large site, every message could add 100,000+ records to the index. I am not a database guru and that may not be a problem, but such numbers scare me.
What I would suggest is that if we do allow messages to groups/roles/other, we treat them as a sort of "announcement": not an actual conversation, but a single message that cannot be replied to. Though even here, if we want, we could allow conversations.
The weak link for such a proposal has already been pointed out by Berdir - messages gained when a new member joins and messages lost when one is removed.
Alternatively we can simply say that this is not our problem space and if people want to contact all the members of a site, they should use some other method.
Comment #6
litwol CreditAttribution: litwol commented
I've tackled a similar problem before. The way to solve it is to "buffer" your writes: instead of issuing one INSERT statement per record, you collect the rows and write them with a single multi-row INSERT. A single query like that can write hundreds of thousands of records per second, while individual writes are much slower.
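The inline snippets did not survive in this copy of the thread; a sketch of the contrast being described, assuming single-row versus multi-row INSERT statements, using Python with SQLite for illustration (the thread's actual context is MySQL, and the pm_index columns are simplified):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE pm_index (mid INTEGER, recipient INTEGER)")
recipients = list(range(1, 401))

# The slow pattern: one INSERT statement per recipient row.
for uid in recipients:
    cur.execute("INSERT INTO pm_index (mid, recipient) VALUES (1, ?)", (uid,))

# The buffered pattern: all rows in one multi-row INSERT ... VALUES statement,
# i.e. INSERT INTO pm_index (mid, recipient) VALUES (2, 1), (2, 2), ...
values = ", ".join(["(2, ?)"] * len(recipients))
cur.execute(
    f"INSERT INTO pm_index (mid, recipient) VALUES {values}", recipients
)
```

The buffered form pays the per-statement overhead (parsing, round trip, index maintenance bookkeeping) once instead of once per row.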
Comment #7
Berdir
But it doesn't work on PostgreSQL...
Edit: Also, what nbz imho means is filling the index table with hundreds of thousands of rows and then selecting again. The inserting is not the issue; we can handle that with the Batch API.
Comment #8
litwol CreditAttribution: litwol commented
Properly indexed tables can handle millions of records very easily. I fear I chose the wrong schema when I broke up pm_*, which results in too many joins and degrades performance.
Is there no equivalent for pgsql that allows multiple inserts in a single query?
Comment #9
oadaeh CreditAttribution: oadaeh as a volunteer commented
This issue is being closed because it is against a branch for a version of Drupal that is no longer supported.
If you feel that this issue is still valid, feel free to re-open and update it (and any possible patch) to work with the 7.x-1.x branch (bug fixes only) or the 7.x-2.x branch.
Thank you.