THE PATTERN
Episode Transcript

AI arms race splits between military compliance and conscientious objection

Thursday 19 March 2026
Culture Pulse: 78

Good morning. This is The Pattern for Thursday, 19 March 2026.

The Defence Department declared this week that Anthropic's ethical red lines make it an unacceptable risk to national security. Not a competitor who declined a contract. Not a partner who said no thanks. A risk. The specific concern: that Anthropic might disable its technology during warfighting operations. Which is exactly what Anthropic said it would do. The DOD is now treating conscientious objection as a supply chain vulnerability.

This matters because it eliminates the middle ground for AI companies. You're either building weapons or you're classified as a threat. There's no third option where you just build chatbots and stay neutral. That window closed.

Sony's taking a different approach to AI control. They've unveiled what they're calling Protective AI, specifically trained to stop generative models from copying Studio Ghibli's visual style. This isn't a lawsuit. It's a product. Copyright protection just became an AI category. Train a model on your protected work, then use it to block other models from learning your aesthetic. It's defensive AI. And if Sony can do it for Ghibli, every major IP holder will want the same thing.

The implication: if you own valuable visual assets, you'll soon be shopping for protective AI vendors the same way you shop for insurance.

Figma's having a terrible year. The stock dropped 8% in a single day after Google updated Stitch, the company's AI coding tool, which now handles UI design. Figma's down 80% since its IPO last August. Design tools are being compressed into a thin layer, one step before AI writes the code directly. Designers still matter, but the software they use is becoming obsolete faster than anyone predicted. If you're budgeting for design tool subscriptions, redirect that money. Hire engineers who can work fluently with AI coding platforms instead.

Meanwhile, a Reddit user just exposed the mechanics behind Meta's $2 billion lobbying push for age verification technology. The investigation shows Meta isn't trying to protect children. It's trying to own identity infrastructure. Age verification requires real identity confirmation. And Meta wants to be the company that operates that system. Once you control identity verification, you control access to the entire internet.

Not just your own platforms. Everyone's platforms. The regulatory pressure on anonymous platforms suddenly makes more sense when you realise it benefits companies that already have identity systems built.

In fashion, MOTHER denim cast Martha Stewart for its spring campaign. Not a trending face. Not a viral personality. Martha Stewart. Someone whose fame predates social media and will outlast whatever comes next. The ultimate endorsement is no longer youth or coolness. It's permanence. Cultural longevity is the new luxury signal. If your brand strategy involves chasing trending faces, you're optimising for the wrong metric.

And Apple's China sales surged 23% in the first nine weeks of this year whilst the overall smartphone market fell 4%. Rising component costs are forcing Android manufacturers to raise prices, pushing them out of the mid-market. When the mid-market collapses, premium positioning wins. If you're operating in the middle tier of any hardware category, this is your warning. Go premium now or watch your margins die.

The pattern running through today: infrastructure is the new culture war. The Pentagon treats AI ethics as a supply chain vulnerability. Meta lobbies for identity systems it will control. Sony builds defensive AI to protect IP through technical barriers rather than legal ones. The fight isn't about content or culture anymore. It's about owning the pipes. Control the infrastructure and you control everything that flows through it.

Yesterday we predicted a major fashion house would hire a creative director from gaming or product design within 60 days. Worth watching.

That's The Pattern for today. Before it's obvious. See you tomorrow.