Survival of the Smartest: Navigating the ‘post-search’ world


The London Book Fair is always a whirlwind of caffeine and conversation, but this year, one session felt particularly vital for those of us looking toward the horizon. The panel, ‘How to Survive and Thrive in a Post-Search World’, featured insights from author Jamie Bartlett and Sarah Posner, Publishing Innovations Manager at Bonnier Books, and was chaired by Searsha Sadek from Shimmr.

Survival of the Smartest panel at LBF 2026
Left to right: Sarah Posner, Publishing Innovations Manager, Bonnier Books; Jamie Bartlett, author, How to Talk to AI (and how not to); and chair Searsha Sadek, Founder & Chief Product Officer, Shimmr

They reflected on the shift that is already changing how readers find their next favourite book: we are moving away from a world dominated by the traditional search bar and into one defined by Large Language Models (LLMs). But what does a ‘post-search’ world actually look like for publishers and creators?

From SEO to GEO

For years, the industry has focused on Search Engine Optimisation (SEO) – essentially using keywords to climb the rankings of Google or Amazon. However, the rise of AI has introduced a new term: Generative Engine Optimisation (GEO).

While SEO is often about shallow keywords, GEO requires a much more complex understanding of a book. AI models do not just look for words; they look for cultural identity, tone, and deep context. As Sarah noted, this requires a joined-up approach. It is no longer just a task for the marketing team; sales, ops, design, and metadata teams must all upskill to ensure their books are visible to these digital ‘recommendation engines’.

How AI ‘reads’ your backlist

One of the most exciting takeaways was that LLMs do not necessarily privilege recency. While traditional marketing often focuses on the launch window of a new release, AI models tend to recommend books with the widest cultural impact, often pulling from sources like Wikipedia or long-standing digital footprints.

This presents a massive opportunity for the backlist. By using AI to refresh keywords or optimise Amazon copy, publishers can breathe new life into titles that are ten years old but still hold cultural relevance. However, the panel shared a word of caution: AI discovery does not automatically equal sales. These titles still need human-led campaigns, budget, and resources to cross the finish line.

The art of talking to machines

Jamie Bartlett highlighted a crucial point: most users tap into only about 1% of what these machines can actually do. Ask ChatGPT for a ‘top ten’ list and you get a superficial answer; interrogate the machine with personal detail – describing specific moods, tropes, or past loves – and the AI can dig into the ‘long tail’ of literature to find something truly unique. But we must remember that these models are ‘grown’, not ‘built’. Even their creators understand only a fraction of why they answer the way they do. They are statistical probability systems, which leads to the industry’s biggest AI headache: hallucinations.

The hallucination hazard

‘Hallucinations’ occur when generative AI models produce confident, plausible-sounding, but false or unsubstantiated information, such as fake citations or fabricated facts. All current models ‘hallucinate’ between 5% and 30% of the time. Because they are designed to be eloquent and persuasive, it is very easy to believe them even when they are spouting nonsensical or inaccurate information. Jamie shared an anecdote about a book recommendation he received from ChatGPT: he bought the book, only to find the publisher knew nothing about the ‘facts’ the AI had claimed were inside it.

The best way to combat this is with transparency and a ‘human-in-the-loop’ approach. Use AI as a focus group to build personas or test ad copy, but never let it have the final word on metadata or factual content without a human check.

The return to authenticity

Perhaps the most heartening part of the discussion was the consensus that AI might actually trigger a backlash in favour of the ‘real’. As the digital world becomes flooded with AI-generated summaries, there is a growing movement towards authenticity and grittier, reality-based content. The panel also reminded us that word-of-mouth in bookshops and libraries is not going anywhere. In fact, the more we use AI to summarise documents and shortcut our learning, the more we risk losing our critical thinking and retention of knowledge.

The power of language, precision and eloquence

To thrive in this post-search era, the advice from the panel was to read as many books as you can. Mastery of language, precision, and eloquence are the keys to communicating effectively with AI. Those who can feel, extrapolate, and bring a human sense of ‘truth’ to their work will be the ones who stand out. In a world of probability-based word matching, our greatest asset remains our authenticity.

Are you experimenting with GEO for your backlist? Or are you sticking to tried-and-tested SEO? Let us know your thoughts in the BookMachine Campus community! Log in here.

Not a Campus member yet? Sign up here.
