Summary: not sure how this changes things, since Discord was already banning groups about IVM etc. (early treatment).
Social media companies are preventing discussion of new conditions and emerging observations about a novel disease - unless they are confirmed in regulatory and health agency policy.
However, "long hauler syndrome" first emerged from discussion among sufferers.
Doctors - many still in denial - call it a psychological condition.
Others in public health (I think the UK health minister or head of some health body) recommend just exercising it out - when exercise exacerbates long hauler symptoms (!)
So without a social platform, how will pushback by the public ever get traction?
Are these tech companies leveraging public power - or govt power? Are they slipping from their mandate? Will company viability follow a downhill trend if this dents public trust?
With a policy of comprehensive censorship of emerging evidence, how will newly observed symptoms gain traction and get the attention of public health officials - as long haulers did 1.5 years ago?
Will this censorship serve as an accomplice to pharma - if pharma decides that discontent over rare (or not so rare) vaccine injuries is injurious to its stock price and should be suppressed?
In the age of Omicron, with rampant infection and resultant widespread natural immunity, does enhancing censorship NOW serve any health benefit, or is it in aid of (now declining) stock prices in the face of rising scrutiny of virus origins and vax development, and increased recognition of injury stats?
Article:
https://www.pcgamer.com/uk/discord-bans-anti-vaccination-content-and-misleading-covid-19-information/
Discord bans 'anti-vaccination content' and 'misleading' Covid-19 information
By Tyler Wilde
Feb 26, 2022
A new Discord rule prohibits unsupported medical claims that are "likely to cause physical or societal harm."
Discord is updating its community guidelines with a clause that bans sharing information it deems "false or misleading" and "likely to cause physical or societal harm" if acted upon. The rule could apply to lots of information, but Covid-19 rhetoric is the primary example given. The chat service doesn't want to be a source for "anti-vaccination content" or advice not accepted by the medical community, such as the use of unproven home remedies.
Link:
https://discord.com/blog/addressing-health-misinformation
Put briefly, Discord will not allow individuals to "post, promote, or organize communities around false or misleading health information that is likely to result in harm," wrote Discord senior platform policy specialist Alex Anderson in a blog post explaining the update.
Discord defines false or misleading health information as being any health information which "directly and unequivocally contradicts the most recent consensus reached by the medical community," and it offers a surprising amount of detail on what it means by that.
The following is a list of topics Discord warns not to make "false or misleading" claims about:
the safety, side effects, or efficacy of vaccines;
the ingredients, development, or approval of vaccines;
alternative, unapproved treatments for disease (including claims that promote harmful forms of self-medication, as well as claims advocating vaccine refusal or alternatives);
the existence or prevalence of a disease;
the transmission or symptoms of a disease;
health guidance, advisories, or mandates (including false claims about preventative measures and actions that could hinder the resolution of a public health emergency);
the availability or eligibility for health services; and,
content that implies a health conspiracy by malicious forces (including claims that could cause social unrest or prompt the destruction of critical infrastructure).
On its own, the list could be construed as a blanket prohibition on expressing distrust in any local health mandate or even recommending "alternative" traditional medicines. However, Anderson says that Discord will consider context like intent, and won't take action unless it believes messages are "likely to cause some form of harm."
"This policy is not intended to be punitive of polarizing or controversial viewpoints," he writes. "We allow the sharing of personal health experiences; opinions and commentary (so long as such views are based in fact and will not lead to harm); good-faith discussions about medical science and research; content intended to condemn or debunk health misinformation; and satire and humor that obviously and deliberately intends to mock false or misleading health claims."
People who hold polarizing or controversial viewpoints will probably disagree with the claim that they're not being targeted, although it bears mentioning that Discord users who mainly stick to smaller groups may not notice any change, regardless of what they say on the platform.
When I spoke to Discord about privacy in 2019, it told me that it doesn't proactively monitor the text and voice chat of any given server—with over 150 million monthly active users, how could it? Instead, moderators largely respond to user reports, which are most likely to come from big public servers.
Link:
https://www.pcgamer.com/uk/how-private-is-your-private-discord-server/
I think it remains unlikely that Discord is scanning the chat logs of every 20-person server looking for narratives related to Covid-19 vaccines and microchips, although there is some precedent for proactive moderation on Discord. In 2018, after a few publications reported that the relative privacy offered by Discord was turning it into a white supremacist hideout, the company made a publicized effort to rid itself of hate group servers. Following that example, it's possible that Discord will seek out and shut down servers that openly advertise themselves as anti-vax hubs, if any such servers exist. (If I had to guess, I'd say they do.)
Link:
https://slate.com/technology/2018/10/discord-safe-space-white-supremacists.html
The new Discord policies go into effect on March 28. "Malicious impersonation" is also forbidden by the new guidelines, with the note that "satire and parody are okay," and Discord has given itself permission to consider "relevant off-platform behaviors" when acting on user reports, such as "membership or association with a hate group, illegal activities, and hateful, sexual or other types of violent acts."
Link:
https://support.discord.com/hc/en-us/articles/4469957714327
Discord also says it will be cracking down on "false, malicious, or spammy" reports. "If you are found to be reporting in bad faith, we may take action against your account," the company says.
There are a number of other changes to the user guidelines, terms of service, and privacy policy. You can read more about all of them here.
Link:
https://support.discord.com/hc/en-us/articles/4469943799319
As someone who doesn't use Discord as a soapbox for vaccine-related commentary one way or another, the news mostly serves as a reminder that conversations which happen on the platform aren't entirely private, even on so-called private servers. It's a moderated social network, so if someone files a report, it is possible for Discord mods to look at your chat logs and issue warnings, suspensions, or bans. For those who want Discord-like features without joining a social network, companies like TeamSpeak still offer paid, private VOIP servers. (For the moment, I'm not too worried about the encryption of my D&D group's endless scheduling conversations.)
Link:
https://www.teamspeak.com/en/