Can an algorithmic timeline alter what citizens believe? A Nature study published on 18 February 2026 suggests that X's "For You" feed can indeed nudge political attitudes, though not by magically rewriting a person's partisan identity overnight. In a seven-week randomized experiment with 4,965 active U.S.-based X users in summer 2023, researchers compared the algorithmic "For You" feed with the chronological "Following" feed. Users who were switched to the algorithmic feed became more conservative on policy priorities and in their evaluations of current issues, including the criminal investigations into Donald Trump and the war in Ukraine. Yet the same intervention did not significantly change affective polarization or self-reported partisanship. (nature.com)
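The design is worth pausing on: random assignment is what lets a simple between-group comparison carry causal weight. A minimal sketch of that logic in Python, with invented numbers purely for illustration (this is not the paper's data, code, or effect size):

```python
import random
import statistics

random.seed(1)

# Hypothetical outcome: an attitude score where higher = more conservative.
# Every value here is invented purely to illustrate the design.
def outcome(treated):
    score = random.gauss(0.0, 1.0)                 # individual variation
    return score + (0.15 if treated else 0.0)      # assumed treatment shift

# Randomly assign 4,965 participants to the "For You" feed (treatment)
# or the chronological "Following" feed (control).
assignment = [random.random() < 0.5 for _ in range(4965)]

treated = [outcome(True) for a in assignment if a]
control = [outcome(False) for a in assignment if not a]

# Under random assignment, a simple difference in group means is an
# unbiased estimate of the feed's causal effect on the attitude measure.
effect = statistics.mean(treated) - statistics.mean(control)
print(f"estimated effect: {effect:+.3f} (simulated true effect: +0.150)")
```

Because assignment is random, nothing about a user's prior politics predicts which feed they got, so any systematic gap between the groups can be attributed to the feed itself.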
That distinction matters psychologically. The study itself notes that opinions about policy and current events may be less rigid than party identity, and the results fit that idea well: the feed seems better at shifting what feels salient and persuasive than at changing who people think they are. X is also an unusually political platform; in a 2024 Pew survey, 59% of X users said keeping up with politics was one reason they used it. In such an environment, ranking and recommendation are not neutral plumbing. They shape attention, and attention often shapes judgment. (nature.com)
The most revealing finding concerns mechanism. The researchers found that X’s algorithmic feed showed more political content, disproportionately amplified conservative posts, demoted material from traditional news organizations, and increased users’ likelihood of following conservative political activist accounts. Even after people were switched back to the chronological feed, those new follows often remained, which helps explain why turning the algorithm off produced little immediate reversal. Psychologically, this looks less like sudden conversion than a layered process of exposure, engagement, and habit formation: first the feed changes what you see, then it changes whom you follow, and finally it changes the informational world you inhabit. (nature.com)
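To see why the effect could outlast the intervention, consider a toy simulation of that exposure-follow-habit loop. This is entirely my construction, not the study's model; every parameter below is an invented assumption:

```python
import random

random.seed(0)

# Toy parameters -- illustrative assumptions, not values from the study.
N_ACTIVIST = 50          # pool of activist accounts a user could encounter
P_EXPOSE_ALGO = 0.30     # per-step chance the algorithmic feed surfaces one
P_EXPOSE_CHRONO = 0.05   # per-step chance the chronological feed does
P_FOLLOW = 0.10          # chance an exposure converts into a follow

def activist_share(follows, on_algo):
    """Share of activist content in the feed: a baseline set by the ranking,
    plus a persistent contribution from accounts the user already follows."""
    base = P_EXPOSE_ALGO if on_algo else P_EXPOSE_CHRONO
    return base + 0.5 * len(follows) / N_ACTIVIST

def simulate(weeks_on_algo=7, weeks_after=7, steps_per_week=50):
    follows = set()
    weekly_shares = []
    for week in range(weeks_on_algo + weeks_after):
        on_algo = week < weeks_on_algo
        p_expose = P_EXPOSE_ALGO if on_algo else P_EXPOSE_CHRONO
        for _ in range(steps_per_week):
            if random.random() < p_expose:          # exposure step
                account = random.randrange(N_ACTIVIST)
                if random.random() < P_FOLLOW:      # engagement step: follow
                    follows.add(account)
        weekly_shares.append(activist_share(follows, on_algo))
    return weekly_shares

shares = simulate()
print("weekly activist share:", ["%.2f" % s for s in shares])
# The share jumps during weeks 1-7 (algorithm on) and stays elevated in
# weeks 8-14: the follows accumulated earlier do not unwind on their own.
```

The point of the sketch is only the shape of the dynamics: the ranking raises exposure, exposure occasionally becomes a follow, and follows keep supplying the same content even after the ranking reverts.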
The broader lesson is not that every algorithm inevitably radicalizes every user. The authors stress that the effect is specific to X and to the period studied, and may depend on the platform’s owner and design choices. Still, the paper delivers a sobering conclusion: recommendation systems do not merely mirror our preferences. Under some conditions, they quietly help manufacture them. (nature.com)