veganism.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
Veganism Social is a welcoming space on the internet for vegans to connect and engage with the broader decentralized social media community.

#moderation


Mastodon Moderators: If you've had this verification scam/spam happening from accounts on your server, would you mind sharing with me privately the IP address and other details of the account?

I'm trying to compile information on this attack (we don't have any of this data from Hachyderm, because we have approval-based registrations due to LLM spam).

I've been able to pull together information on the different domains and text of the spam messages being used, but not much in the way of account details.

(Yes, this data is PII, and I promise to handle it responsibly)

Another #Goosestep toward full-on #authoritarianism!

This Bill Would Fine Social Media Companies $5 Million Every Day for Not Fighting '#Terrorism'

The #StopHateAct wants social media platforms to report their #moderation policies and outcomes to the government. And it’s not the only #censorial measure Rep. #JoshGottheimer wants.

Matthew Petti | 7.24.2025 5:25 PM

"The idea that the federal government even talked to social media platforms about their moderation was a major scandal. After the Twitter Files leak revealed that the Biden administration was privately leaning on one platform to suppress "misinformation," the courts blocked officials from communicating with social media companies for several months on free speech grounds.

"A bipartisan bill, however, would make it mandatory for social media companies to work with the federal government. The Stopping Terrorists Online Presence and Holding Accountable Tech Entities (STOP HATE) Act would require companies to provide triennial reports on their moderation policies—and violations they catch—to the U.S. attorney general.

"The bill requires companies to issue specific policies for groups the federal government designates as terrorists and the director of national intelligence to also begin reporting on terrorist usage of social media. Companies would be fined $5 million per day that they fail to comply.

"Reps. Josh Gottheimer (D–N.J.) and Don Bacon (R–Neb.) had first proposed the bill in November 2023. It died in committee at the time. Gottheimer and Bacon announced that they would be reintroducing the bill at a press conference on Wednesday alongside Anti-Defamation League CEO Jonathan Greenblatt.

" 'There is no reason why anyone, especially terrorists or anyone online, should access social media platforms to promote radical, hate-filled violence,' Gottheimer said at the press conference. He cited supportive social media comments about the May 2025 murder of two Israeli embassy staffers in Washington, and the AI platform Grok's sudden decision to declare itself 'MechaHitler' earlier this month.

"#Meta, the company that runs #Facebook and #Instagram, is already known to have a list of 'dangerous individuals and organizations' banned from the platform. When the list was leaked to The Intercept in 2021, it included around 1,000 entries taken straight from the U.S. government's foreign terrorist list, as well as various foreign and domestic entries sourced to private think tanks."

Read more:
reason.com/2025/07/24/this-bil

Reason.com · New bill would fine social media companies $5 million every day for not fighting 'terrorism' · By Matthew Petti

As a moderator: Thank you to everyone reporting the fake "you need to verify your account" posts flying around Mastodon, supposedly from a Mastodon Support Team; this lets us suspend the spammers as quickly as possible.

And wow, there are a lot today. I suspended two while sitting in my dentist's parking lot after just a one-hour appointment for a cleaning. And then one more when I got home after a one-hour drive.

"We often receive reports that don’t violate those rules. It’s simply someone saying they don’t like what another person said. We aren’t argument referees, but people expect us to be. If that argument becomes harassment or discrimination, it’s time for intervention. But even that isn’t always straightforward."

I recognize this.

Hi, friends.

I wrote an article on being a Mastodon moderator. I talk about the process, our mental health, the challenges of mutual aid, and more.

It leaves me a little vulnerable, but I think it’s an important story to tell.

If you like it, boosts are welcomed. If you have questions, feel free to reach out.

Okay, here goes…

markwrites.io/being-a-mastodon

Mark W.rites · Being a Mastodon Moderator · People ask me what it’s like to be a moderator. Our discussions reveal that a lot of what we do is a mystery. So, I’m gonna lay it out ...

⚠️ ❗ We have a #phishing #scam on #Mastodon. The fake accounts (often with 0 followers/followees) pose in replies as "administrators" or "support teams" of Mastodon, asking for account verification. Real admins would never do that!

❗ It's very important that you *report* them! Don't click their link❗

The mods of mastodon.social acted very fast to delete such an account, but the scammers now seem to be trying it on other instances, especially smaller ones.

So maybe some of you are wondering why we're kind of posting mod actions. And that's fair; most instances keep that stuff in mod spaces. But we kind of operate differently... ppl who trust us to be their home on the fedi should feel secure, and part of that is being transparent with our actions.

So yeah, as we add ppl (slowly) we're kind of working up the framework of how to handle stuff. And part of that is posting our mod decisions so ppl can see. And yeah okay, that might open up some harassment potential, but it also (we think) offers ppl on Bardic insight so they can be sure this is where they want to be.

Fedi kind of has a "Reddit style mod" prob sometimes and we def don't want to go that way!