Hey there,
Here's what caught my attention this week:
youtube.com
Jensen Huang spent over two hours explaining where computing is headed, and it's worth breaking down what matters.
General-purpose computing hit a wall. We're energy-constrained now, so you can't just keep scaling CPUs. The shift is toward specialized accelerators doing the heavy computational work - Huang called this the biggest technology transition of his 40-year career. GPUs stay critical because AI fundamentally runs on massive matrix multiplication, which is exactly what they're built for. NVIDIA is designing Blackwell and Rubin for AI models that don't even exist yet, banking on higher parameter counts, bigger context windows, and multimodal capabilities.
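To make that matrix-multiplication point concrete, here's a minimal sketch with toy dimensions (the shapes are made up, plain NumPy): most of a transformer layer's compute boils down to a few big batched matmuls like this, and that's exactly the workload GPUs parallelize well.

```python
# Minimal sketch with toy dimensions (batch, seq_len, d_model are made up here):
# most of a transformer layer's compute reduces to large matrix multiplications
# like this one - the workload GPUs are built to parallelize.
import numpy as np

batch, seq_len, d_model = 2, 128, 512            # toy sizes; real models are far larger
x = np.random.randn(batch, seq_len, d_model)     # token activations
w = np.random.randn(d_model, 4 * d_model)        # one feed-forward weight matrix

y = x @ w                                        # (2, 128, 2048) batched matmul
print(y.shape)
```

Scale those dimensions into the thousands and stack dozens of layers, and you get the flood of multiply-adds per token that makes accelerators non-negotiable.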
The developer role is changing. AI is software that writes itself through data and examples, not traditional programming. You're curating datasets and orchestrating compute instead of writing logic line by line. Domain-specific AI is the next phase - not general chatbots, but expert systems trained on deep, specialized data for biology, chemistry, robotics. Current LLMs are just the general-purpose foundation layer.
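As a rough illustration of "software that writes itself from examples" (toy spam-filter task, hypothetical features), compare a hand-written rule with a model whose behavior comes entirely from a curated dataset:

```python
# Minimal sketch of "software written from examples" (toy data, hypothetical task):
# instead of hand-coding the rule, you curate labeled examples and let the model
# infer the logic from them.
from sklearn.linear_model import LogisticRegression

# Hand-written version of the logic:
def is_spam_rule(num_links: int, has_greeting: bool) -> bool:
    return num_links > 3 and not has_greeting

# Learned version: the "program" comes from the dataset, not from written rules.
X = [[0, 1], [1, 1], [5, 0], [8, 0], [2, 1], [7, 0]]   # (num_links, has_greeting)
y = [0, 0, 1, 1, 0, 1]                                 # labels curated by a human

model = LogisticRegression().fit(X, y)
print(model.predict([[6, 0]]))   # behavior was learned, not written line by line
```

The developer's leverage shifts from writing the rule to choosing, labeling, and curating the examples.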
The real shift is toward agents that do things in the world: using tools, making API calls, controlling physical devices. Future chip architectures are being designed around physical systems - robots, cars, factories - that need real-time response more than raw power. This is cyber-physical computing, not just digital work.
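Here's a minimal sketch of what that agent pattern looks like in code - the model call is stubbed out, and the tool and the decide() policy are hypothetical, but the loop shape (model proposes an action, runtime executes it, result feeds back) is the part that matters:

```python
# Minimal agent-loop sketch. decide() stands in for an LLM choosing the next step,
# and get_weather() stands in for a real API call; both are hypothetical.
def get_weather(city: str) -> str:
    return f"22C and clear in {city}"

TOOLS = {"get_weather": get_weather}

def decide(history):
    # Stand-in policy: call the tool once, then answer from its result.
    if not any(step[0] == "tool_result" for step in history):
        return {"tool": "get_weather", "args": {"city": "Taipei"}}
    return {"answer": f"Looks like {history[-1][1]}."}

history = [("user", "What's the weather in Taipei?")]
while True:
    step = decide(history)
    if "answer" in step:                             # agent decides it's done
        print(step["answer"])
        break
    result = TOOLS[step["tool"]](**step["args"])     # act in the world via a tool
    history.append(("tool_result", result))
```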
Supply chain is the constraint nobody wants to address. Advanced fabs take five-plus years and tens of billions to build. Only a handful of companies can attempt it. The whole system is fragile - TSMC, ASML, packaging, substrates. Any disruption cascades everywhere. Future gains come from efficiency because we can't brute-force more power.
NVIDIA's advantage isn't hardware - it's the software ecosystem. CUDA, cuDNN, TensorRT, years of co-development with researchers. Full-stack ecosystems win: hardware plus software plus developer mindshare. Huang's survival philosophy is simple: reinvent or disappear. NVIDIA went from PC graphics to general computing to AI. Companies that don't reinvent around new paradigms either shrink or vanish.
We're rebuilding the entire computing stack to move AI from software into physical reality. That's the project of the next decade.
theverge.com
The Verge piece breaks down how everyone's betting trillions on LLMs while ignoring the gap between predicting words and actual reasoning - basically the entire industry is conflating pattern matching with intelligence. It's the kind of thing where in five years we're either gonna look back and say "yeah, we knew that" or pretend we never made the bet in the first place.
futurism.com
MIT modeled 20M jobs at risk from AI by mapping actual worker skills across 32k capabilities instead of just throwing out scary numbers. Finally someone's measuring displacement with real data instead of guessing which jobs sound like they could maybe be automated.
arstechnica.com
Supreme Court is hearing arguments on whether ISPs like Cox and Charter should be required to terminate subscribers who repeatedly pirate content - the music labels are basically trying to force internet providers to become copyright cops. Wild that we're at the point where they want your ISP to kick you offline because you torrented an album, but here we are.
theverge.com
OpenAI is shelving ads, shopping, health agents, and Pulse to focus entirely on ChatGPT speed, reliability, and personalization - basically admitting Google's Gemini caught them flat-footed. When you're scrambling like this, you strip away everything that isn't core and pray you can hold the line.