Preferred Sources: How Google Rebranded Algorithmic Control as User Choice
05/03/2026
24/02/2026
When dominant institutions face sustained criticism, they rarely surrender control. Instead, they redesign the interface. Google’s “Preferred Sources” feature, rolled out widely by late 2025, has been presented as a corrective to years of complaints about search bias and algorithmic gatekeeping. Users can now “star” their preferred news outlets, which then appear more prominently in Top Stories. The framing is deliberate: if you distrust the algorithm, take control. Curate your own information environment.
That sounds empowering. In practice, it may be something else entirely.
For years, Google has been accused of ranking favouritism, systemic suppression of certain outlets and an ecosystem that concentrates traffic among a narrow group of dominant publishers. Whether every claim is justified is almost secondary to the structural reality: in the digital economy, visibility is revenue. Search placement determines traffic. Traffic determines advertising yield. Distribution determines survival. Google owns the distribution layer.
“Preferred Sources” does not dismantle that power. It reframes it. Instead of being accused of narrowing the information landscape, Google can now point to user choice. If your feed is homogeneous, that is because you curated it. If certain outlets dominate your Top Stories, that is because you starred them.
The problem is not that users are incapable of choosing. It is that human beings are not neutral editors. Decades of behavioural research demonstrate that we gravitate toward confirmation rather than contradiction. We seek coherence with our existing beliefs and social identity. When given the opportunity to personalise, most people do not construct a balanced portfolio of perspectives. They optimise for alignment. The mainstream reader stars mainstream outlets. The contrarian does the same with alternative media. The right and left each deepen their own silos.
The result is not decentralisation. It is self-segregation at scale.
From Google’s perspective, this has an elegant advantage. Allegations of bias lose force because the narrowing is voluntary. No stories need to be demoted. No publishers need to be removed. The ecosystem fragments along predictable tribal lines, and the platform can claim neutrality while retaining full control of the infrastructure beneath it.
This matters particularly from an advertising and media economics standpoint. Larger publishers with established brands and marketing budgets will aggressively campaign for users to “star” them. Smaller independent outlets, already struggling for discoverability, must now compete not only in the opaque arena of ranking algorithms but also in a behavioural marketplace shaped by platform design. The rich gain another mechanism to consolidate attention. The challengers face another barrier layered on top of the existing ones.
Context is critical. This feature does not exist in isolation. Google’s ecosystem includes AI Overviews that summarise content without necessarily driving clicks, previous ad bans that have materially affected monetisation for specific publishers, and ongoing disputes about ranking transparency. Against that backdrop, “Preferred Sources” appears less like structural reform and more like a reputational pressure valve. Critics argue the algorithm suppresses. Google responds by offering personalisation. The underlying distribution engine remains untouched.
The core issue is not speech removal but discovery. Modern power rarely operates through deletion. It operates through ranking. If defaults privilege incumbents, if AI intermediates user attention before a click is made, and if design nudges encourage ideological consolidation, then the range of exposure narrows without any visible act of censorship. That is a far more sophisticated form of control because it is harder to diagnose and easier to defend.
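To make the point concrete, consider a toy re-ranker. This is not Google's actual system, and the outlet names, scores, and boost factor are all invented for illustration; it simply shows how a modest multiplicative boost for "starred" sources can crowd everything else out of a short Top Stories slate without a single story being demoted or removed.

```python
# Toy illustration (not Google's implementation): a multiplicative
# boost for user-starred sources, applied before filling a fixed
# number of Top Stories slots.

def rank_stories(stories, starred, boost=2.0, slots=3):
    """Sort (source, relevance) pairs, boosting starred sources,
    and return the sources that fill the top slots."""
    scored = [
        (rel * (boost if src in starred else 1.0), src)
        for src, rel in stories
    ]
    scored.sort(reverse=True)
    return [src for _, src in scored[:slots]]

candidates = [
    ("OutletA", 0.9),  # strongest story on base relevance
    ("OutletB", 0.8),
    ("OutletC", 0.7),
    ("OutletD", 0.6),
]

# Without personalisation, the slate follows base relevance.
print(rank_stories(candidates, starred=set()))
# → ['OutletA', 'OutletB', 'OutletC']

# Star two weaker outlets and they occupy the slate anyway.
print(rank_stories(candidates, starred={"OutletC", "OutletD"}))
# → ['OutletC', 'OutletD', 'OutletA']
```

Nothing in the second result looks like suppression: every story is still indexed and rankable. The narrowing happens entirely through a boost the user chose to apply.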
Supporters will argue that the feature simply gives users what they want. There is truth in that. Platforms optimise for engagement, and engagement often increases when content reinforces identity rather than challenges it. Reinforcement reduces friction. Reduced friction increases time on site. Increased time on site increases advertising revenue. From a commercial standpoint, encouraging users to double down on familiar sources is entirely rational.
Commercial rationality, however, is not the same as informational neutrality.
If genuine pluralism were the goal, one might expect experiments in default diversity weighting, greater transparency around how “Top Stories” are assembled, or clearer disclosures about the interaction between starred sources and ranking logic. Instead, the burden shifts to the user while the levers of distribution remain firmly in corporate hands.
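For contrast, "default diversity weighting" is not an exotic idea. One hypothetical shape it could take is a greedy selector that discounts each candidate by how often its outlet already appears in the slate; the names, scores, and penalty factor below are assumptions for illustration, not any real product's logic.

```python
# Hypothetical sketch of default diversity weighting: repeats of the
# same outlet are discounted, so a strong single outlet cannot fill
# the whole slate on its own.

def diversified_slate(stories, slots=3, penalty=0.5):
    """Greedily pick stories, halving a candidate's effective score
    for each story from the same outlet already chosen."""
    chosen, counts = [], {}
    pool = list(stories)
    for _ in range(slots):
        best = max(pool, key=lambda s: s[1] * penalty ** counts.get(s[0], 0))
        pool.remove(best)
        chosen.append(best[0])
        counts[best[0]] = counts.get(best[0], 0) + 1
    return chosen

candidates = [
    ("OutletA", 0.9), ("OutletA", 0.85), ("OutletA", 0.8),
    ("OutletB", 0.6), ("OutletC", 0.5),
]

# Pure relevance would give OutletA all three slots; the discount
# surfaces a second and third voice instead.
print(diversified_slate(candidates))
# → ['OutletA', 'OutletB', 'OutletC']
```

The mechanism is trivial to build, which is precisely the point: if pluralism were the design goal, the levers exist. Choosing a starred-source boost over a diversity default is a design decision, not a technical constraint.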
Personal choice within a closed infrastructure is not equivalent to open competition. When a single platform controls indexing, ranking and discovery at global scale, interface tweaks do not alter the underlying power structure. They simply redistribute perception.
Whether intentional or emergent, the effect of “Preferred Sources” is to reduce scrutiny of ranking decisions while deepening ideological segmentation. It allows Google to move from being accused of narrowing the information landscape to facilitating a system in which users voluntarily narrow it themselves.
Censorship in modern media markets does not always arrive as deletion or deplatforming. It can just as easily take the form of behavioural design that concentrates attention while maintaining plausible deniability. When distribution remains centralised but responsibility is decentralised, the appearance of control can mask the consolidation of power.
The question is not whether users can star their favourite outlets. The question is who ultimately determines which voices are easy to find in the first place.