Artificial intelligence was, not surprisingly, a very hot topic at BrightonSEO earlier this month. Here, we’ve rounded up the top insights on AI and its intersection with SEO from the expert speakers who graced the Lumar Stage at BrightonSEO.
The angles of these ‘AI and SEO’ conversations varied across the conference; some of the experts covered how AI-powered SERP features are impacting traditional organic search efforts, and others discussed how to use AI tools to improve one’s own SEO workflows.
The BrightonSEO Lumar Stage speakers (and their presentation titles) included:
How AI is Changing Search:
- Crystal Carter, Head of SEO Communications, Wix Studios — “Say My Name, Say My Name: SEO for Brand Recognition in LLMs”
- Fiona McGovern, VP, Account Management EMEA, Conductor – “Impact of AI on Search Results: Implications for Content Strategy and SEO”
Practical Applications of AI Tools for SEO:
- Katarina Dahlin, Senior Growth Hacker & SEO Consultant, Genero – “How to Optimize 3,500 Product Descriptions for E-commerce in One Day Using ChatGPT”
- Panos Kondylis, Director of SEO, SLEED – “Using AI to Create Scripts And Tools for SEO Operational Excellence”
- Michael Van Den Reym, Data-Driven Marketing Specialist, Immoscoop – “Creating Data-Driven SEO Apps with AI”
- Kane Bartlett, Associate Director of Growth, Holiday Extras – “Our AI Journey at Holiday Extras”
- Tejaswi Suresh, Director of Global SEO, Navan – “Programmatic SEO Framework for Enterprises”
Top Takeaways: Practical & Strategic AI Tips from BrightonSEO
- More and more people are using LLM tools like ChatGPT in place of traditional search engines like Google — but optimizing your brand’s content and entity components for inclusion in generative search responses requires a different approach than traditional SEO. (Learn how to approach AI optimization below!)
- LLMs crawl your site not to rank it but to interpret it. Ensuring your website content is easily parsed and ‘understood’ by large language models will be vital in securing brand visibility in generative search.
- AI tools provide opportunities for SEO professionals to streamline their workflows in numerous ways. The Lumar Stage speakers provided use cases and how-tos for large-scale content creation, keyword research, and building your own SEO apps with generative AI coding tools for unique reporting and SEO data analysis.
How AI is Changing Search:
Crystal Carter on how to build brand recognition & visibility in Large Language Models (LLMs)
“Gen-search ‘engine’ visibility matters,” says Crystal Carter, head of SEO communications at Wix Studios and co-host of the SERP’s Up Podcast.
“LLMs aren’t search engines, but many people use them as such,” she notes. And if an increasing number of people are using LLMs like ChatGPT for ‘search,’ then it behooves SEO professionals to take note — and add ‘AI optimization’ to their repertoires. (For context: Carter cited Similarweb data that showed ChatGPT now receives 2.63 billion monthly visits — and 9 billion monthly page views!)
The user journey for ‘searchers’ using LLM tools is notably different from that of traditional search engines like Google. It’s even distinct from users’ interactions with Google’s “AI Overviews” and other SERP-integrated AI features. This is because users simply “stumble upon” Google’s AI Overviews by following their usual search behaviors — as opposed to specifically navigating to a generative AI tool like ChatGPT for a more conversational approach to their queries.
Furthermore, users interact with LLM responses: they can follow up on their original queries in a way that the traditional Google search interface doesn’t support. Even with Google’s AI Overviews, users simply receive the information rather than conducting an ongoing ‘conversation’ as they would with an AI chatbot; the Overviews are just an added layer on top of the traditional search results, with no option for back-and-forth.
And, of course, the algorithms at play will be different across different platform types and companies. You’ll need to tailor your approach for each LLM.
To understand how to best optimize for brand inclusion in LLM results, it helps to understand how the different AI platforms and tools operate, so you can know how to monitor each platform and what — and when — to expect results.
Carter makes a distinction between two primary types of LLMs, based on whether they use:
- Pre-Trained Data (For example: Google Gemini and Claude) — These LLMs have a hard ‘knowledge cut-off’ date. These tools may or may not include website links or citations to show you where their ‘knowledge’ has come from.
- Augmented Pre-Trained Data (For example: Perplexity and Copilot) — These LLMs augment their fixed training set with present-day web crawls in an attempt to provide more up-to-date responses and information. They usually include website links and citations by default to ‘show their work’.
Brand optimization for pre-trained LLMs (with knowledge cut-off dates)
LLMs running on pre-trained data are using data that is fixed in time — when these platforms update their data, there will likely be a significant shift in visibility for any given brand or information. They are not being updated ‘in real time’ using information that has been published today.
Recall OpenAI’s disclaimer that “ChatGPT is not connected to the internet, and it can occasionally produce incorrect answers. It has limited knowledge of the world and events after 2021 and may also occasionally produce harmful instructions or biased content. We’d recommend checking whether responses from the model are accurate or not.”
Always check the LLM’s knowledge cut-off date. You can simply ask it yourself in a prompt to get this information.
For pre-trained LLMs, don’t expect to see brand new information — or brand new appearances of your own brand name — right away. If the LLM is trained on information from before you’ve made your optimizations, you won’t see any changes until the LLM has updated its training set. Carter suggests monitoring for news about training set updates across each pre-trained LLM to know when to check for changes in your brand’s visibility.
Carter shared the current training knowledge cut-off dates for some of the better-known pre-trained LLM tools:
- ChatGPT (OpenAI): 2021 (for free users) or 2023 (for paid users)
- Gemini (Google): 2023
- Claude (Anthropic): April 2024
When you come across incorrect or outdated information related to your brand in a generative AI tool’s response, you will generally have the option to provide feedback on the AI response. Carter recommends doing this so you can help train the next iteration of the tool before its release.
Optimization for web-augmented LLMs (impacted by present-day web search results)
Augmented LLMs attempt to stay more ‘up to date’ with their information about the world and events by augmenting their core fixed training set knowledge with present-day web searches. This allows them to incorporate more recent data and information into their generative responses.
You are more likely to see website links and citations in responses from LLMs using augmented data, as they are usually programmed to ‘show their work’ and share where their information is coming from. This is because it is likely not possible to vet this ‘live’ web content for quality assurance and trustworthiness to the same extent as the content used in their core training data.
In these web-augmented generative AI tools, you can expect to see new information appearing in the results sooner than you will in the ‘fixed knowledge’ LLMs.
When you start to see new, correct information about your brand in these LLM tools, Carter suggests providing feedback on these good responses to affirm to the model that it is ‘learning’ the correct information. (As with the fixed dataset LLMs, also provide feedback when the generated response is incorrect.)
Carter recommends checking your core queries regularly. If you have a concept or framework that is central to your brand, for example, regularly check in with these augmented LLMs to see whether or not they have started to connect your brand to these concepts.
Your SEO and LLM optimization also go hand-in-hand when it comes to web-augmented AI tools — prioritize your SEO efforts on the pages that you see are getting attention and links from the LLMs.
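One practical way to see which pages are ‘getting attention’ from these crawler-backed tools is to check your server logs for known AI crawler user agents. Below is a minimal sketch in Python; the bot names, log path, and log format are illustrative assumptions, so verify each platform’s current crawler strings in its own documentation.

```python
from collections import Counter
import re

# Illustrative list of AI/LLM-associated crawlers; verify against each
# platform's current documentation before relying on it.
AI_BOTS = ["GPTBot", "PerplexityBot", "ClaudeBot"]

LOG_PATH = "access.log"  # assumed: a standard combined-format access log

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if any(bot in line for bot in AI_BOTS):
            # The requested path sits inside the quoted request,
            # e.g. "GET /blog/post HTTP/1.1"
            match = re.search(r'"[A-Z]+ (\S+)', line)
            if match:
                hits[match.group(1)] += 1

# Pages most frequently fetched by AI crawlers: candidates to prioritize.
for path, count in hits.most_common(20):
    print(f"{count:>6}  {path}")
```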
Tips for LLM visibility across the board
Regardless of whether you’re trying to show up in a limited-knowledge LLM or a web-augmented LLM, Carter shares some general tips when beginning your quest for brand visibility in AI-generated responses:
- Optimize for LLM crawl intent
- Optimize for web-augmented LLMs first
- Manage your brand entities
- Get involved with the platforms (provide feedback on responses generated, create custom-branded GPTs, etc.)
Carter reminds us that LLM crawlers are not trying to rank your content in the same way that traditional search engines are — they are trying to interpret it. Make your content as easy to parse and ‘understand’ (from a machine-learning perspective) as possible.
To optimize your brand entities, you’ll need to ensure you are managing the information that appears in any relevant Wikipedia or Wikidata entity, claiming your Knowledge Panel, and implementing structured data via appropriate schema markup across your website — as well as making an effort to align your content to known entities via appropriate links.
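To make the structured data piece concrete, here is a minimal sketch of an Organization schema block rendered as JSON-LD via Python. The brand name, URLs, and sameAs links are placeholder assumptions; the point is simply that the sameAs property ties your domain to the entities (Wikipedia, Wikidata, verified profiles) that machines already recognize.

```python
import json

# Placeholder brand details; swap in your own entity information.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://www.example.com",
    # 'sameAs' aligns your site with known entities: Wikipedia/Wikidata
    # pages, Knowledge Panel-verified profiles, social accounts, etc.
    "sameAs": [
        "https://en.wikipedia.org/wiki/Example_Brand",
        "https://www.wikidata.org/wiki/Q0000000",
        "https://www.linkedin.com/company/example-brand",
    ],
}

# Embed the output in your page template inside a
# <script type="application/ld+json"> tag.
print(json.dumps(organization, indent=2))
```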
Fiona McGovern on changing content strategies to address AI’s impact on search
The search landscape is changing, and SEOs need to prepare for the latest AI SERP enhancements to stay ahead, says Fiona McGovern, VP of EMEA account management at Conductor.
When Google introduced AI Overviews in its search platform earlier this year, there were some notable issues with its AI output — for example, the case of Google’s AI recommending the ingestion of rocks. These mistakes can be amusing, but on a platform that many turn to for their most intimate questions, the potential to spread blatantly false information is concerning, says McGovern. But Google has continued to refine its AI output, adding extra guardrails for ‘YMYL’ (Your Money or Your Life) searches, the kind that can impact people’s health and finances, which are likely to prioritize content from actual experts. McGovern reminds us that web content influences AI outputs, so what we publish matters and can shape the AI responses.
Alongside the addition of AI Overviews to Google Search, the industry has seen a notable increase in the number of Reddit and other forum posts appearing at the top of the SERPs, suggesting that user-generated content (UGC) may once again be having a moment. But as Google refines its AI Overviews to pull from more authoritative sources, there may also be an opportunity here for brands that prioritize authorial expertise.
With the search landscape changing in numerous ways this year, how should SEO professionals respond? McGovern offered several suggestions for website teams that want to stay ahead:
- Monitor relevant keywords that receive AI Overviews: Understand when your brand is showing up in AI Overviews — and which brand-relevant keywords receive the AI Overviews box at the top of the SERPs. If your brand is not showing up there, whose is?
- Create more specific content for each stage of the user journey: Be as specific as possible in the content you create, suggested McGovern. This can help your site’s content get included (and linked to) in Google’s AI Overviews and ensure you’re present at each stage of the user journey. She uses the example of people searching for information about the northern lights — what very specific questions might they be asking as they explore this topic? Chances are they will want to know precisely when the best time of year is to try and see the northern lights, as well as exactly where they can go to improve their chances.
- Develop a UGC content strategy: With online forums having a heyday in search, do you have a strategy to get included in these conversations? It’s high time to capitalize on the opportunities that exist outside of your own .com, says McGovern. Future-thinking content strategies need to go beyond what you publish directly on your domain.
- Track Reddit as a competitor: Look into what brand-relevant keywords Reddit posts are showing up for in the SERPs. Get involved with the conversation or look for content ideas in the highly engaged discussions – especially for YMYL content, where your authors’ expertise may give your content the edge over forum posts.
- Authorship matters: Why should someone believe what you publish? Is the writer of any given content piece an actual expert, or is the user just as well off going to a forum of anonymous contributors to get the information they seek? Leverage expertise inside your organization and showcase the reasons why these authors are trustworthy authorities on the topic by linking to their LinkedIn profiles, other articles they’ve written, wiki entries, etc.
Practical Applications of AI Tools for SEO:
Katarina Dahlin on how to use ChatGPT to optimize thousands of e-commerce product descriptions in one day
While understanding how to appear in generative AI responses will be key for forward-thinking SEOs, understanding how to use generative AI tools in your own SEO workflows can exponentially improve your productivity as an SEO professional today.
Katarina Dahlin, SEO consultant at Genero, has done just that. On the Lumar Stage at BrightonSEO, Dahlin shared how she used ChatGPT to shave countless hours off the time required to optimize e-commerce product descriptions.
Dahlin scaled her website’s organic traffic from 0 to 50k monthly page visits in under a year. As her site grew, so did her number of product pages — soon leading to 9,500 unique product pages on her domain.
Strapped for time but wanting to continue her site’s growth momentum, Dahlin began looking for new processes that could help her optimize her thousands of product pages for better search engine ranking and visibility. That’s when she started building out new processes with ChatGPT.
She began building custom GPTs trained on her own tailored datasets and instructions to provide the best possible AI-generated content for her website’s unique needs.
To get the best responses, she built out individual custom GPTs for each product brand she offered on her site.
“Custom GPTs can write better content than regular ChatGPT when they know the products and the brand,” she explains. This means fewer ‘AI hallucinations,’ helping you avoid the most commonly encountered, quality-killing pitfalls associated with AI-generated content.
Training each custom GPT on a unique brand also helps you maintain that brand’s voice in the content outputs, notes Dahlin.
For other SEOs in the e-commerce space, Dahlin outlines her top suggestions for building your own custom GPTs for product descriptions:
- Scrape product and category descriptions using your website crawler’s custom extraction capabilities. (Note: Dahlin has also written about how to scrape content on her blog.)
- Start building your custom GPT via ChatGPT’s editor. (Note: you will need a paid account to do this.)
- Upload your scraped content data to your new GPT as a .csv file.
- Provide your new GPT with well-defined instructions and basic context about your required task and desired language, style, etc.
- Instruct the custom GPT to “never hallucinate” — and to always ask you for more input if needed to generate a result.
- Create your prompt template for product descriptions. (See Dahlin’s own product description prompt template here.)
- Be specific about how you want the content structured — For example, outline the recurring subheadings you want to use across every product description.
- Tell your GPT how you want the text styled in HTML — For example, “Subheadings should all be formatted as H2.”
- If you aren’t getting the results you want, Dahlin suggests showing examples to your custom GPTs in the prompts.
In Dahlin’s first iteration of the process, she still had to manually copy/paste the prompt and type in the product name to generate the response. It took about 5 minutes per product at this stage.
But Dahlin knew the process could be streamlined even more. She moved her process to the Make platform to start automating her workflows and eliminate the need for her to copy/paste the prompt herself.
She used her automation platform to connect Google Sheets to her custom GPTs and build out workflows that didn’t require her to manually input prompts one by one.
The result? By automating her workflows for the custom GPTs, Dahlin could now generate 3,500 product descriptions for her website in one day. Instead of generating 1 product description every 5 minutes, she could now generate 40 product descriptions in the same amount of time.
Watch Dahlin’s fully automated product description process on her website.
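Dahlin’s production setup runs through Make’s no-code connectors rather than hand-written code, but for SEOs who prefer scripting, roughly the same loop can be sketched with the OpenAI Python client. The column names, model choice, and system prompt below are illustrative assumptions, not her actual configuration:

```python
import csv
from openai import OpenAI  # official OpenAI Python package

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stand-in for the custom GPT's instructions: brand voice, structure, HTML rules.
SYSTEM_PROMPT = (
    "You are a product copywriter for Example Brand. Write a concise product "
    "description with H2 subheadings in HTML. If any detail is missing, ask "
    "for more input instead of inventing it."
)

with open("products.csv", newline="", encoding="utf-8") as src, \
        open("descriptions.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.writer(dst)
    writer.writerow(["product_name", "description_html"])
    # Assumed input columns: product_name, attributes
    for row in csv.DictReader(src):
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user",
                 "content": f"Product: {row['product_name']}\n"
                            f"Details: {row['attributes']}"},
            ],
        )
        writer.writerow([row["product_name"],
                         response.choices[0].message.content])
```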
Panos Kondylis on AI use cases for SEO operations
The applications of AI for SEO go well beyond generative content tools. Panos Kondylis, Director of SEO at SLEED, shared how his team developed 7 AI tools that automated time-consuming SEO tasks to improve their internal operations and workflows.
He estimates that building out these simple AI tools for SEO operations has saved his team over 30 hours per month for each person using the scripts. “We can analyze our content much more efficiently,” he says.
SEOs looking for practical ways to apply AI to their own workflows can take inspiration from these use cases and tool types that have accelerated SLEED’s SEO operations.
One example of a simple AI tool even non-programmer SEOs can build to improve their workflows:
Keyword N-Gram Analyzer
SEOs can build AI tools that help them cluster their keyword research data into N-Grams to spot patterns and opportunities for content planning and optimization.
How To:
- Upload your entire keyword research file into the AI tool (including both keyword and search volume data)
- Have the AI cluster the keyword data into 1-grams, 2-grams, and 3-grams
- Review the output clusters to quickly identify key topics, query types, and other patterns at a glance — and update your content plan accordingly.
Looking at the 1-grams (one-word clusters), you can quickly spot brand names and high-level topics that receive the highest search volumes — this is easy enough to do in a spreadsheet, so where this simple AI tool really shines is with 2-grams and 3-grams.
Looking at your 2-grams (two-word clusters), you can find the most common two-word phrases that appear across your keyword phrases and the search volumes associated with these 2-grams across your keyword list — for example, what is the combined search volume for keywords in your list that include phrases like “how to” or “how do” or “[product type] for”? Do you have content that aligns with these query types?
3-grams (three-word clusters) can help you get even more granular with your content planning. What is the collective search volume for keywords that include specific three-word phrases? What can this tell you about the query types that could be applied across your broader content strategy? (For example, “how to connect” or “best laptop for” or “not working on”.) If “best laptop for” is at the top of your 3-gram list in terms of search volume, Kondylis suggests you may want to consider building out pages about laptops for various needs, e.g., “laptops for gaming”, “laptops for university”, etc.
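To make the clustering step concrete, here is a minimal sketch of the underlying n-gram aggregation in Python, assuming a keyword research CSV with ‘keyword’ and ‘search_volume’ columns. (In practice, as Kondylis describes, you would simply upload the file to an AI tool and have it do this for you.)

```python
import csv
from collections import Counter

def ngrams(words, n):
    """Return consecutive n-word phrases from a tokenized keyword."""
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

# Assumed CSV columns: keyword, search_volume
volumes = {1: Counter(), 2: Counter(), 3: Counter()}
with open("keywords.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        words = row["keyword"].lower().split()
        volume = int(row["search_volume"])
        for n in (1, 2, 3):
            for gram in ngrams(words, n):
                volumes[n][gram] += volume  # combined volume per n-gram

for n in (1, 2, 3):
    print(f"\nTop {n}-grams by combined search volume:")
    for gram, volume in volumes[n].most_common(10):
        print(f"{volume:>10,}  {gram}")
```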
(Note: you can view Kondylis’s full talk deck here to see the full list of AI use cases presented.)
Kane Bartlett on how to get your company on board with AI tools
AI tools helped improve both internal operations and customer experiences at Holiday Extras, one of the first UK companies to become a ChatGPT Enterprise-level customer.
Kane Bartlett, associate director of growth, knew that enterprise-level AI tools could be transformative across the business. But getting the entire company up-to-speed with what these tools could do — and how to get the most value out of them — posed an initial challenge.
Getting people on board with AI is often a harder problem than learning to use the tools themselves, says Bartlett. Lucky for us, he’s outlined the core challenges he’s encountered with onboarding new AI tools — and how his team has overcome them.
There are several key problems to solve when you’re onboarding new AI tools for your company, says Bartlett:
Getting Buy-In for AI Tools:
Bartlett recommends ‘leading from the top’ when it comes to implementing new AI tools and processes. Make sure you have vocal support from your top leadership team as you roll out the AI tools to the broader company.
Getting AI on the Agenda:
Don’t just quietly add new AI tools to your company’s tech stack and hope your team starts using them — talk about it. Make sure you communicate regularly about what’s available — and how others across your organization are using the tools.
Understanding the ROI:
Spending money on new AI tools without a plan to reap the most benefit from them is not going to maximize your ROI. Before purchasing new tools, Bartlett recommends making sure you’ve got a plan for how your company can actually benefit from them.
The Skynet Problem:
People may have some worries about potential risks they associate with AI — Legal may be concerned with protecting IP and data from being used in broader machine learning datasets, for example.
Aim to take a transparent, accessible, and fun approach to help mitigate any worries — and get ahead of potential concerns with full clarity on how the platforms you’re working with protect your data and how they operate with regard to privacy, security, and ethics.
Bartlett recommends setting up an internal AI steering committee made up of employees from different disciplines across your organization. This will help your team collectively look at (and address) any perceived risks from the perspectives of multiple stakeholders and specialists.
The Ignorance Problem:
People are busy and new tools have a learning curve. Chances are, you’ll have a number of team members who feel like they don’t know where to start with the AI tools, or don’t know what they can even do with them — which may make adoption of the new technology a slow process.
Bartlett recommends showing examples of how different teams have used the tools and bringing potential use cases and case studies to your team to get them thinking about how they might implement AI in their own workflows to build out new efficiencies.
Start with hosting some ‘AI for beginners’ courses within your organization to demystify the tech. Bartlett gives the example of having a mandatory ‘Prompting 101’ workshop company-wide and also offering more advanced sessions for folks who want to dive deeper.
In your initial ‘AI Prompting Masterclass’, Bartlett recommends teaching the entire organization key skills like:
- How to prompt the AI tool to get the most useful responses
- How to structure data to provide extra context for machine learning
- How to work with the AI tool’s API
Michael Van Den Reym on how to create data-driven SEO apps with AI coding tools
Even if you’re not a coder or data scientist by trade, it’s never been easier to develop your own customized apps to make SEO data analysis a breeze. Michael Van Den Reym, a data-driven marketing specialist at Immoscoop, provides an overview of the AI-powered coding tools SEOs can use to develop their own time-saving apps, along with some example SEO apps.
Thanks to the rise in AI-powered coding and data analysis tools, we all have an opportunity to streamline our SEO workflows with automations. Van Den Reym points to several cost-effective options that can help non-coder SEOs get started:
- Google Colab: Lets you write and run Python code and share it easily with colleagues via Google Drive, with little to no setup required. (Free.)
- Deepnote: Helps you use AI to do data analysis in Python, work with data warehouses and huge spreadsheets, and create apps yourself. ($39/month)
- Claude: Helps you build interactive JavaScript applications and create publishable artifacts. ($20/month for the pro version.) (Note: Van Den Reym says Claude currently outperforms ChatGPT when it comes to writing and coding.)
So, now that you’ve got some new tools to try, what can you do with these tools that will help you with your SEO work?
Van Den Reym points out that there are some SEO tasks that can feel a bit… robotic. Identifying overly large images on your website and resizing each and every one? That can take ages if you’re doing it manually, but it doesn’t take a lot of human brainpower — Van Den Reym points to these sorts of repetitive SEO tasks as prime targets for the automations you can build with AI-powered coding tools.
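As an example of that kind of ‘robotic’ task, the script below flags and shrinks oversized images in bulk. It is a minimal sketch, assuming the Pillow library and a local folder of the site’s images; the width threshold and paths are arbitrary placeholders.

```python
from pathlib import Path
from PIL import Image  # Pillow: pip install pillow

MAX_WIDTH = 1600            # arbitrary threshold for "too large"
IMAGE_DIR = Path("images")  # assumed local copy of the site's images

for path in IMAGE_DIR.glob("**/*"):
    if path.suffix.lower() not in {".jpg", ".jpeg", ".png", ".webp"}:
        continue
    with Image.open(path) as img:
        width, height = img.size
        if width <= MAX_WIDTH:
            continue
        # Scale down proportionally to the maximum width.
        ratio = MAX_WIDTH / width
        resized = img.resize((MAX_WIDTH, round(height * ratio)))
    # Save outside the 'with' block so the original file handle is closed.
    resized.save(path)
    print(f"Resized {path.name}: {width}px wide -> {MAX_WIDTH}px")
```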
SEOs can also use AI coding tools to build out apps that help with large-scale data analysis. Van Den Reym provides the example of redirect mapping for a website migration. Using Deepnote, he was able to build an app that can automatically analyze your old URL list against your new one — and provide the top 3 suggestions for which pages to match up for redirects based on meta title keywords in the ‘old pages’ vs. ‘new pages’ list.
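Van Den Reym built his version with Deepnote and AI assistance; the core matching idea can be sketched roughly as follows. This is a simplified illustration rather than his actual app: it assumes two CSVs of URLs and meta titles, and it uses generic string similarity in place of his keyword-based matching.

```python
import csv
from difflib import SequenceMatcher

def load(path):
    """Assumed CSV columns: url, meta_title."""
    with open(path, newline="", encoding="utf-8") as f:
        return [(row["url"], row["meta_title"].lower())
                for row in csv.DictReader(f)]

old_pages = load("old_pages.csv")
new_pages = load("new_pages.csv")

for old_url, old_title in old_pages:
    # Score every new page by meta-title similarity to the old page,
    # then keep the three closest matches as redirect candidates.
    scored = sorted(
        new_pages,
        key=lambda page: SequenceMatcher(None, old_title, page[1]).ratio(),
        reverse=True,
    )
    top3 = [url for url, _ in scored[:3]]
    print(f"{old_url} -> candidates: {', '.join(top3)}")
```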
Ready to start building your own SEO automations with AI tools? Van Den Reym offers a few general tips for getting started: start simple, embrace trial and error, use multiple prompts (so you can check the outputs at every step within the program and refine the prompts at each stage), start with a design document to help you plan, and don’t forget to bring a critical, human eye to the project. Most importantly, experiment — a lot.
“How will you work one year from now?” Van Den Reym asks, pointing out that the best way to predict the future is to create it.