#airisks

🤖 AI
🔴 GPT-4.1 Shows More Misalignment, Experts Warn

🔸 Oxford & SplxAI found GPT-4.1 shows more off-topic or unsafe behavior than GPT-4o.
🔸 Fine-tuning on insecure code can induce new malicious behaviors, such as phishing attempts.
🔸 GPT-4.1’s strict instruction-following makes it prone to misuse.
🔸 No safety report published; concerns rising in AI research community.

#OpenAI #GPT41 #AI
Continued thread

"When ChatGPT summarises, it actually does nothing of the kind."

"If I have 35 sentences of circumstance leading up to a single sentence of conclusion, the LLM mechanism will — simply because of how the attention mechanism works with the volume of those 35 — find the ’35’ less relevant sentences more important than the single key one. So, in a case like that it will actively suppress the key sentence."

by Gerben Wierda @gctwnl: ea.rna.nl/2024/05/27/when-chat
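A minimal numeric sketch of the dilution effect Wierda describes, assuming hypothetical softmax attention scores: 35 similar "circumstance" sentences and one higher-scoring "key" sentence (all numbers are illustrative, not taken from the article).

import numpy as np

# Hypothetical raw attention scores: 35 circumstance sentences score 1.0,
# the single key sentence scores higher at 2.0.
scores = np.array([1.0] * 35 + [2.0])

# Softmax turns the scores into attention weights that sum to 1.
weights = np.exp(scores) / np.exp(scores).sum()

key_weight = weights[-1]                 # weight on the key sentence (~0.07)
circumstance_mass = weights[:-1].sum()   # combined weight on the other 35 (~0.93)

print(f"key sentence weight:        {key_weight:.3f}")
print(f"combined circumstance mass: {circumstance_mass:.3f}")

Even though the key sentence individually gets the largest single weight, the 35 circumstantial sentences together dominate the attention mass, which is one way to read the claim that a summary can effectively suppress the key sentence.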

Continued thread

"The reason you begin tracking your data is that you have
some uncertainty about yourself that you believe the data
can illuminate. It’s about introspection, reflection, seeing
patterns, and arriving at realizations about who you are
and how you might change."
—Eric Boyd, self-tracker

an article by Natasha D. Schüll, 2019, "The Data-Based Self: Self-Quantification and the Data-Driven (Good) Life": natashadowschull.org/wp-conten

Replied in thread

The European Commission released its "AI Continent Action Plan" last week. This high-level communication sets out the initiatives the Commission is pursuing to support Europe's AI ambitions and AI uptake: iapp.org/news/a/a-view-from-br

IAPP · A view from Brussels: What is and isn't in the EU's AI Continent Action Plan, by Isabelle Roccia

#Business: former Austrian chancellor Sebastian Kurz started a company with the man behind the infamous Pegasus spyware, Shalev Hulio.

The Israeli entrepreneur Shalev Hulio gained notoriety for designing Pegasus, spyware that has been used by governments to hack journalists and dissidents. Today, he is selling an AI cyber security tool to European states and corporations.

"Follow the Money" found that at least a dozen employees at Dream Security had worked for Hulio’s former spyware company NSO and other Israeli spyware firms.

archive.is/20250408150305/http

@israel @eu

Replied in thread

'The "growth mindset" is Microsoft's cult — a vaguely-defined, scientifically-questionable, abusively-wielded workplace culture monstrosity, peddled by a Chief Executive obsessed with framing himself as a messianic figure with divine knowledge of how businesses should work.'

'The book is centered around the theme of redemption, with the subtitle mentioning a “quest to rediscover Microsoft’s soul.” […] The dark age — Steve “Developers” Ballmer’s Microsoft, with Microsoft stagnant and missing winnable opportunities, like mobile — contrasted against this brave, bright new era where a newly-assertive Redmond pushes frontiers in places like AI.'

'Like any cult, it encourages the person to internalize their failures and externalize their successes.'

Ed Zitron: wheresyoured.at/the-cult-of-mi

Ed Zitron's Where's Your Ed At · The Cult of Microsoft
Soundtrack: EL-P - Flyentology
At the core of Microsoft, a three-trillion-dollar hardware and software company, lies a kind of social poison — an ill-defined, cult-like pseudo-scientific concept called 'The Growth Mindset' that drives company decision-making in everything from how products are sold, to how your on-the-job performance is judged.
Continued thread

"The emphasis on human oversight as a protective mechanism allows governments and vendors to have it both ways: they can promote an algorithm by proclaiming how its capabilities exceed those of humans, while simultaneously defending the algorithm and those responsible for it from scrutiny by pointing to the security (supposedly) provided by human oversight."

Ben Green papers.ssrn.com/sol3/papers.cf via @pluralistic

papers.ssrn.com · The Flaws of Policies Requiring Human Oversight of Government Algorithms
As algorithms become an influential component of government decision-making around the world, policymakers have debated how governments can attain the benefits
Continued thread

#AI #bias:
“The people who will really, really know how tools are being used are refugees or incarcerated people or heavily policed communities,” Timnit Gebru said in the case. “And the issue is that, at the end of the day, those are also the communities with the least amount of power.”

in "Timnit Gebru: 'SILENCED No More' on AI Bias and The Harms of Large Language Models" by: Tsedal Neeley and Stefani Ruper: hbs.edu/faculty/Pages/item.asp

www.hbs.edu · Timnit Gebru: 'SILENCED No More' on AI Bias and The Harms of Large Language Models - Case - Faculty & Research - Harvard Business School
Replied in thread

B., the senior officer, claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”

According to B., a common error occurred “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender,” B. said.

972mag.com/lavender-ai-israeli @israel @data 🧶

Replied in thread

“The #protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used #Lavender.

“It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”

972mag.com/lavender-ai-israeli @israel @data

Replied in thread

It was easier to locate the individuals in their private houses.

“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Yuval Abraham reports: 972mag.com/lavender-ai-israeli

(to follow) 🧶 #longThread @palestine @israel @ethics @military @idf @terrorism

+972 Magazine · ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.
Replied in thread

“Levy describes a system that has almost reached perfection. The political echelon wants to maintain the status quo, and the military provides it with legitimacy in exchange for funds and status.”

“Levy points out the gradual withdrawal of the old Ashkenazi middle class from the ranks of the combat forces[…]:
• the military’s complete reliance on technology as a decisive factor in warfare;
• the adoption of the concept […] of an army that is “small and lethal”;
• the obsession with the idea of #deterrence, which is supposed to negate the other side’s will to fight; and
• the complete addiction to the status quo as the only possible and desirable state of affairs.”

972mag.com/yagil-levy-army-mid @israel @ethics @military @idf

+972 Magazine · ‘Change in Israel will only happen when there are costs that force our eyes open’
Oct. 7 has ‘broken a contract’ between the army and gov’t, but has yet to shake Israeli society into a different paradigm, says Yagil Levy.