Hey there,
Here's what caught my attention this week:
blog.banast.as
Andrej Karpathy has this way of reframing things that makes you see the whole field differently. In a recent conversation with Dwarkesh Patel, he laid out his vision for the "decade of agents" - and introduced a metaphor that's been stuck in my head ever since.
When most people talk about AI, they reach for animal metaphors. Training models like dogs. Systems that "think" or "reason." But Karpathy flips it: we're not building animals. We're summoning ghosts.
This isn't just wordplay. LLMs are simulations of processes that used to happen in human brains. When you prompt GPT-4, you're not talking to an entity - you're summoning a ghost of human text-writing processes captured in training data. If the pattern exists in training, the ghost appears. If it doesn't, you get confabulation.
The implications ripple outward. The agent era won't arrive through sudden breakthroughs - it'll come through the unglamorous "march of nines," grinding from 99% to 99.9% reliability the same way self-driving has for a decade. The real innovation will be in-context learning, not bigger training runs. And the bottleneck isn't model capability - it's having good judges to provide reliable training signal.
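To see why each extra "nine" matters so much for agents, here's a back-of-the-envelope sketch. The step count and reliability figures are my own illustrative assumptions, not numbers from the conversation, and it assumes per-step failures are independent, which is a simplification:

```python
# Illustrative only: how per-step reliability compounds over a
# multi-step agent task. An agent that is 99% reliable per step
# fails most 100-step tasks; each added "nine" makes much longer
# tasks workable.

def task_success_rate(per_step_reliability: float, steps: int) -> float:
    """Probability the agent completes every step without an error,
    assuming independent per-step failures."""
    return per_step_reliability ** steps

for label, p in [("two nines", 0.99),
                 ("three nines", 0.999),
                 ("four nines", 0.9999)]:
    rate = task_success_rate(p, steps=100)
    print(f"{label} ({p}): a 100-step task succeeds {rate:.1%} of the time")
```

At two nines, a 100-step task succeeds only about a third of the time; at three nines, about 90%. That gap is the unglamorous grind Karpathy is pointing at.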
Understanding we're orchestrating ghosts instead of building minds doesn't make AI less impressive. It just helps us avoid the dead ends.
futurism.com
Researchers discovered that poems bypass AI safety guardrails 63% of the time, though they won't release the prompts. The real story is worse: the entire safety architecture is built on sand. If verse can do this much damage, the problem isn't the jailbreak - it's that these systems were never actually secure to begin with.
thepavement.xyz
The job-as-identity contract was already broken, but AI makes the rot impossible to ignore - what happens when machines finally call the bluff on a system built on grinding people down for centuries?
futurism.com
McDonald's deleted their AI Christmas ad after the backlash got too loud, but the real story isn't about the technology failing - it's about a company with unlimited resources choosing to cheap out, and that choice revealing something darker about how brands now think about quality.