Murdered Insurance CEO Had Deployed an AI to Automatically Deny Benefits for Sick People: Not to justify murder, but I’m sure there were a lot of desperate people very unhappy with Brian Thompson’s company.
Bad tech
2024-12-10: Misinformation expert misinforms court by using ChatGPT
2024-12-06
OpenAI’s new model tried to avoid being shut down: By disabling its oversight mechanism and copying itself elsewhere, though the context is important.
2024-12-06: OpenAI’s new model tried to avoid being shut down. Well, that’s a little unnerving, for …
2024-11-28: Some good work by Skeleton Claw:
2024-11-26
Helpline for Yakuza victims fears it leaked their personal info: A Japanese government agency was phished.
2024-11-26: From The Register: A local Japanese government agency dedicated to preventing organized crime has …
2024-11-23: The Guardian newspaper will no longer post on X
2024-11-21
A new era dawns. America’s tech bros now strut their stuff in the corridors of power: ‘This is the fusion of state and commercial power in a ruling elite.’
Don’t call it a Substack: Anil implores us not to refer to our writing using some other dubious entity’s brand name - ‘Imagine the author of a book telling people to “read my Amazon”’.
2024-11-04
Bluesky and enshittification: Even the most well-intentioned service remains at risk of enshittification while switching away from it still incurs significant costs.
2024-10-31
The billionaires hedge their bets: Purely out of immoral and misguided self-interest, many of the tech billionaires are cosying up to Trump.
2024-10-30
Why I Am (Still) a Liberal (For Now): ‘Thin’ vs ‘thick’ liberalism, and the Cruelty Culture of 2022-era Twitter.
2024-10-28: Vox's take on "Is AI the new nuclear weapons?"
2024-10-27
The AI-Generated Product Reviews Choking the Internet Are Now Illegal: The FTC makes certain types of fake reviews illegal in the US, whether humans or robots wrote them.
2024-10-23
Ed Newton-Rex, who organised the recent statement against AI companies ingesting everyone’s work for free, which artists across all levels of fame are signing, makes a good point:
There are three key resources that generative AI companies need to build AI models: people, compute, and data. They spend vast sums on the first two - sometimes a million dollars per engineer, and up to a billion dollars per model. But they expect to take the third - training data - for free.
Big tech AI companies aren’t stingy with their money in general. That’s one reason why they’re so unprofitable. It’s just that the magic third ingredient often attracts a compensation rate of $0.
Thom Yorke and Julianne Moore join thousands of creatives in AI warning: Even superstars are concerned about AI companies shovelling up the artistic output of humanity without constraint or compensation.
2024-10-22
It’s not just you, Google Search really has gotten worse: Researchers find that search engines are getting ever more flooded with nasty SEO product spam sites despite their best efforts.
Why changes to the block on Elon Musk’s X are driving users away: Musk doesn’t believe block buttons should block.
Online Safety and the “Great Decentralization” – The Perils and Promises of Federated Social Media: Is the Fediverse in need of better tooling around moderation?
2024-10-21
Governments spying on Apple, Google users through push notifications - US senator: Inevitably, the centralisation and lack of encryption around push notifications make them a useful source of surveillance material.
Governments Are Using Spyware on Citizens. Can They Be Stopped?: Companies like the NSO Group are happy to sell commercial spyware to governments who will predictably use it for bad things.
2024-10-21: The wrongness of 'If you've got nothing to hide then you've nothing to fear'
2024-10-20
The Subprime AI Crisis: Ed Zitron worries about what happens when the unprofitability of the current generative AI business becomes unsustainable.
Get Me Out Of Data Hell: The day Nikhil Suresh’s software engineering work pushed him so far into the pain zone that he had to quit there and then.