In the era of ChatGPT, Perplexity and Google AI Overviews, one classic SEO factor is making a comeback: content freshness. A recent study by Seer Interactive reveals that large language models (LLMs) display a strong preference for newer content over older material. This recency bias carries significant implications for content strategists.
For years, it was considered an SEO myth: that simply updating a publication date could boost a page’s Google ranking. But in the age of LLMs, there may be some truth to it. The study analyzed over 5,000 URLs found in log files from AI tools such as ChatGPT, Perplexity and Google AI Overviews, focusing on how publication dates relate to visibility. The results are striking: 89% of the content surfaced by LLMs was published between 2023 and 2025, while only 6% of AI interactions involved content older than six years.
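The kind of breakdown reported above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual methodology: the function name, the `(url, publication_year)` record format, and the sample data are all assumptions for the example.

```python
from collections import Counter

def recency_breakdown(records, recent_range=(2023, 2025),
                      old_cutoff_years=6, current_year=2025):
    """Summarize how publication years of LLM-surfaced URLs are distributed.

    records: list of (url, publication_year) pairs, e.g. extracted from
    log files (hypothetical format). Returns the share of content published
    within recent_range and the share older than old_cutoff_years.
    """
    years = [year for _, year in records if year is not None]
    total = len(years)
    recent = sum(1 for y in years if recent_range[0] <= y <= recent_range[1])
    old = sum(1 for y in years if current_year - y > old_cutoff_years)
    return {
        "recent_share": recent / total,
        "old_share": old / total,
        "by_year": Counter(years),  # full distribution for inspection
    }

# Hypothetical sample of surfaced URLs with publication years
sample = [
    ("https://example.com/a", 2024),
    ("https://example.com/b", 2023),
    ("https://example.com/c", 2025),
    ("https://example.com/d", 2018),
]
shares = recency_breakdown(sample)
```

With real log data, `recent_share` would correspond to the 89% figure and `old_share` to the 6% figure cited in the study.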
The recency effect varies by topic. In fast-moving fields like finance, the demand for up-to-date information is especially high – content published before 2020 is rarely surfaced. The travel industry also tends to favor more recent material. In contrast, more stable areas such as energy or DIY topics like patio construction still see older, well-crafted content being surfaced by AI systems.
Implications for content strategists
In fast-paced sectors like finance or technology, staying visible requires frequent updates and a steady stream of new content. This is where content recency becomes crucial – new or recently refreshed material tends to rank higher in AI systems. In more stable fields, however, maintaining durable evergreen content can still be effective, as long as it remains high-quality and relevant.
Older, high-quality content still holds value – strategic updates can significantly enhance its visibility. To maintain a strong presence in AI overviews and chatbot results, it’s important to keep content consistently updated while preserving its depth and substance. While large language models tend to favor recent material, they also take trustworthiness and relevance into account.