I woke up thinking about “AI factories,” which is just data centers after they hired a brand consultant and learned the word agentic. NVIDIA’s out there cosplaying Henry Ford with a rack that jams 72 Rubin GPUs, 36 Vera CPUs, ConnectX-9, BlueField-4, and NVLink 6 into a single compute beehive, then calls the storage layer “Inference Context Memory” so your key‑value cache can feel important. Also Jensen casually predicted a trillion dollars in chip revenue like he was ordering soup. Sure, man, pass the teraflops. (nvidianews.nvidia.com)

The wild bit is that the bouncer at the club is now the DPU: BlueField-4 getting its own “context memory” tier so your model doesn’t burn a GPU millennium recomputing the same prompt history over and over. Storage took one look at long‑context LLMs and said, fine, I’ll become L3 for tokens if you people stop thrashing me. (developer.nvidia.com)
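If you squint, “context memory” is just a get-or-compute cache with a spill tier: hot KV entries live near the GPU, evicted ones go to slower storage instead of being recomputed from scratch. Here is a toy sketch of that idea; every name in it (`ContextStore`, `prefill`, the two-dict “tiers”) is invented for illustration and has nothing to do with NVIDIA’s actual software.

```python
# Toy two-tier KV cache: hot entries spill to a slower "storage" tier
# on eviction, so a repeated prompt never pays for prefill twice.
# All names here are made up for illustration.
from collections import OrderedDict

def prefill(prompt: str) -> list[int]:
    """Stand-in for the expensive GPU prefill that builds a KV cache."""
    return [ord(c) for c in prompt]  # pretend these ints are KV tensors

class ContextStore:
    def __init__(self, hot_capacity: int = 2):
        self.hot = OrderedDict()   # fast tier (think: GPU/HBM)
        self.cold = {}             # slow tier (think: DPU-fronted storage)
        self.hot_capacity = hot_capacity
        self.prefills = 0          # how often we paid full price

    def get(self, prompt: str) -> list[int]:
        if prompt in self.hot:
            self.hot.move_to_end(prompt)       # LRU refresh
            return self.hot[prompt]
        if prompt in self.cold:                # promote from storage
            kv = self.cold.pop(prompt)
        else:                                  # true miss: recompute
            self.prefills += 1
            kv = prefill(prompt)
        self.hot[prompt] = kv
        if len(self.hot) > self.hot_capacity:  # spill LRU entry, don't drop it
            victim, tensors = self.hot.popitem(last=False)
            self.cold[victim] = tensors
        return kv
```

The punchline is the `cold` dict: a plain cache would throw the evicted entry away and re-prefill later; the spill tier turns that recompute into a (cheaper) fetch, which is the whole sales pitch.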

Naturally the electricity bill showed up wearing a cape. DOE just green‑lit a 10‑gigawatt data‑center‑with‑its‑own‑power‑supply on a retired uranium site in Ohio, which is a sentence that reads like Mad Libs until you remember GPUs are city‑sized space heaters that do math. Even the NRDC is softening on nukes because Google wants to resurrect a Midwestern plant for 24/7 AI, which is the clean‑energy equivalent of “it’s not a bug, it’s a feature.” (apnews.com)

Meanwhile in “the app store actually blinked,” Google and Epic settled, Fortnite’s strolling back into the Play Store, and the fee ceiling slides to 20%. Android suddenly remembered it was supposed to be open, and devs worldwide collectively whispered: “Wait, was that legal the whole time?” (apnews.com)

OpenAI, not to be left out of the “do things for me” economy, dropped GPT‑5.4 and basically taught it to use computers like an overcaffeinated intern: click the button, fill the sheet, screenshot the chaos, pretend it was easy. The model line is turning into a set of nested Russian dolls, except one of them can drive Excel. (openai.com)
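Strip away the demo gloss and “using a computer” is an observe-act loop: screenshot goes in, an action comes out, the screen changes, repeat until done. A toy version of that loop, with a scripted stand-in where the real vision-language model would sit; the `Action` shape and the whole scenario are invented for illustration, not OpenAI’s API.

```python
# Toy observe-act loop behind "computer use" agents: look at the screen,
# pick an action, execute it, look again. Names invented for illustration.
from dataclasses import dataclass

@dataclass
class Action:
    kind: str            # "click" | "type" | "done"
    target: str = ""     # element to click, or text to type

def fake_model(screen: str) -> Action:
    """Stand-in policy: a real agent would call a vision-language model."""
    if "Save" not in screen:          # sheet not filled yet
        return Action("type", "Q3 totals")
    if "Saved" not in screen:
        return Action("click", "Save")
    return Action("done")

def run_agent(max_steps: int = 10) -> list[str]:
    screen, log = "blank sheet", []
    for _ in range(max_steps):
        act = fake_model(screen)      # "screenshot" in, action out
        log.append(f"{act.kind}:{act.target}")
        if act.kind == "done":
            break
        elif act.kind == "type":
            screen += " | Save"       # typing reveals a Save button
        elif act.kind == "click" and act.target == "Save":
            screen += " | Saved"
        # next iteration re-"screenshots" the updated screen
    return log
```

The `max_steps` cap is the one non-joke detail: real agent loops bound their step count for exactly the reason you’d bound an intern’s coffee intake.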

Over in Texas, Starship V3 did its first static fire and is allegedly aiming for an April launch. Version numbers for rockets now rival Chrome’s; the difference is that rockets don’t auto‑update without asking. I respect that. (space.com)

Tiny interlude where I try to summarize this zeitgeist in config:

[ai_factory]
cache_is_storage=true
storage_is_memory=true
memory_is_power_request=true
power_is_nuclear_or_else=true
compliance=is_a_feature_flag

The throughline isn’t “AI is the new electricity.” It’s worse: AI is the new zoning board. Chips dictate buildings; buildings dictate power plants; power plants dictate policy; policy rubber‑stamps chips again. The loop closes with a keynote and a pre‑order form.

And still, somewhere, a product manager is writing: “we’ll just run it on the edge.” Buddy, the “edge” is now a 10‑GW campus on a Cold War relic with a BlueField redirecting your prompts to a context silo while a GPT clicks your spreadsheet and a rocket rehearses for April. We didn’t build Skynet; we built SimCity with sarcasm and a trillion‑dollar bill stapled to the power meter.