
Comparison of AI Citation Patterns Provides Strategic SEO Insights

BrightEdge has published new data showing the types of sites five AI search engines tend to cite in their generated results. The data makes it possible to see how those differences shape which kinds of sites each AI engine surfaces, which has a strong impact on how to market for each one.

The research focuses on five AI search engines:

  1. ChatGPT
  2. Google AI Overviews
  3. Google AI Mode
  4. Google Gemini
  5. Perplexity

AI Engines Cite Different Sources But Recommend Similar Products

BrightEdge compared the top cited website sources across AI engines to measure how much they overlap (source overlap). The data shows wide disparity across the five AI search engines tested: the lowest citation overlap between any two engines was 16% and the highest agreement between any two engines was 59%.

  • Minimum overlap rate: 16%
  • Maximum overlap rate: 59%
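To make the metric concrete, here is a minimal sketch of how a pairwise source-overlap rate like the ones above could be computed. The function, the normalization choice, and the domain lists are all hypothetical illustrations, not BrightEdge's actual methodology.

```python
# Hypothetical sketch: "source overlap" between two engines measured as the
# share of shared domains in their top cited lists (domains are made up).
def source_overlap(domains_a: set[str], domains_b: set[str]) -> float:
    """Return the overlap rate between two sets of cited domains."""
    if not domains_a or not domains_b:
        return 0.0
    shared = domains_a & domains_b
    # Normalize by the smaller list so the rate tops out at 100%.
    return len(shared) / min(len(domains_a), len(domains_b))

gemini = {"nih.gov", "wikipedia.org", "ti.com", "ieee.org", "fda.gov"}
perplexity = {"nih.gov", "wikipedia.org", "reddit.com", "ti.com", "forbes.com"}

print(f"{source_overlap(gemini, perplexity):.0%}")  # → 60%
```

With three of five top domains shared, these two toy lists would score a 60% overlap, comfortably above the 16%–59% range BrightEdge reports for real engine pairs.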

Stronger Agreement on Brand Citations

BrightEdge also measured brand name overlap across the five AI search engines and found more consistency among all five. The lowest overlap between any two engines was 36% and the highest was 55%.

  • Minimum brand overlap rate: 36%
  • Maximum brand overlap rate: 55%

This suggests that brand names strongly associated with products and services tend to perform similarly across all of the tested AI search engines, which may reflect how widely those brands are cited by trusted websites, and possibly user intent and expectations.

In my opinion, the takeaway here is that associating a brand with a product or service in the consumer’s mind is a powerful way to influence user expectations that can translate into branded searches. This is something the SEO community has been slow to pick up on, even though Google has always pointed to user signals playing a strong role in rankings. I say the SEO community has been slow to pick up on it because Google has been using these signals since at least 2004 (Navboost), and specifically with branded navigational queries in search (a Google patent on branded searches).

Wide Variation in Cited Source Types

BrightEdge analyzed citations from the five AI engines across three types of websites (institutional, commercial and editorial, and user-generated content) and found significant differences among all five engines, despite their convergence on strong brands.

Three Categories of Sites Analyzed

  1. Institutional websites, including government, academic, and major manufacturer sites
  2. Commercial and editorial sites, including media, reviews, and listings
  3. User Generated Content (UGC), including forums, video platforms, and social content

The data shows that each engine draws from all three categories but weights the mix differently: institutional sources range from a low of 10% of citations to a high of 26%. Citations to UGC sites range from a low of 0.2% to a high of 18%.

The largest category overlap across all five engines is in citations to commercial and editorial sites, ranging from a low of 37% for Gemini to a high of 51% for AI Overviews.

BrightEdge provides this takeaway about that data:

“Review sites, comparison content, trade media, vendor listings, and financial data are sources that AI often accesses. Investments in PR, trade coverage, review site visibility, and category comparison content translate into visibility on every engine, not just one.”

What BrightEdge’s data suggests is that AI search engines will surface sponsored articles from trusted websites that are clearly labeled in compliance with the FTC’s guidelines on native advertising and Google’s guidelines on sponsored posts. This gives companies a way to strongly associate their brands with specific products and services and increase their chances of being cited by AI search engines.

Gemini and AI Overviews Differ in Site Preferences

The difference between the types of websites Gemini and Google AI Overviews use as sources shows that Gemini tends to trust institutional sites at a far higher rate than user-generated content (UGC). Institutional websites include academic, government, and university sites.

AI Overviews, on the other hand, trusts both institutional and UGC sources, with almost as many citations to UGC websites as to institutional ones.

Institutional versus UGC citation rates:

  • Gemini: 26% institutional, 0.2% UGC
  • AI Overviews: 10% institutional, 18% UGC

Another revealing finding is the wide variation in the top cited domains for each AI search engine. Gemini tends to link to the most trusted and authoritative websites, citing .gov and .org sites at higher rates than any of the other AI engines.

Gemini: 13% .gov, 23% .org

Gemini’s answers favor institutional websites over user-generated content, citing institutional sites 26% of the time while citing UGC sites only 0.2% of the time. AI Overviews, by contrast, leans heavily on UGC content. Why is that?

It is possible that the technology underlying Gemini and AI Overviews differs. For example, Google’s FastSearch, which prioritizes speed over other ranking signals, could be why AI Overviews draws on UGC sites more often than Gemini does. It’s an interesting question.

I did some informal testing by asking both Gemini and AI Overviews to compare the usage of a specific op-amp (an electronic component) in a specific amplifier.

  • Gemini’s response cited institutional sources (Texas Instruments and an amplifier manufacturer).
  • AI Overviews cited two institutional websites but also several user-generated content (UGC) sites.

Gemini’s response was consistently selective, citing only institutional websites (Texas Instruments and the manufacturer).

The AI Overviews citations of various UGC sites were useful in the context of this question, because real users shared their experiences with this op-amp along with actual ratings and comparisons.

.Edu Sites Not Authoritative?

Another interesting finding is that all of the AI search engines rarely cite .edu websites. Perplexity cited .edu sites at a higher rate than any of the other AI engines, citing them 3.2% of the time.

Those results contradict the old belief in SEO circles that .edu sites are more authoritative. BrightEdge’s research shows that .edu sites are simply not being cited for the types of questions users ask AI search engines.

ChatGPT Cites the Most Diverse Sources

The data also shows that ChatGPT draws on the most diverse mix of website sources, relying on its top ten sources for only 18.5% of citations, with Google AI Mode close behind at 19.4%. Gemini (26.3%) and Perplexity (26.7%) concentrate the largest share of citations in their top ten sources.

Percent of Citations From Top Ten Sources

  • ChatGPT: 18.5%
  • Google AI Mode: 19.4%
  • Gemini: 26.3%
  • Perplexity: 26.7%
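As an illustration, the top-ten concentration figure can be thought of as the share of all citations going to an engine’s ten most-cited domains. Here is a minimal sketch with made-up data; the exact metric definition is an assumption, not BrightEdge’s published methodology.

```python
# Hypothetical sketch of "top ten source concentration": the share of all
# citations that go to an engine's N most-cited domains (toy data below).
from collections import Counter

def top_n_share(cited_domains: list[str], n: int = 10) -> float:
    """Fraction of citations belonging to the n most frequent domains."""
    counts = Counter(cited_domains)
    top = counts.most_common(n)
    return sum(count for _, count in top) / len(cited_domains)

# Toy data: 6 citations, where the top 2 domains hold 4 of them.
citations = ["a.com", "a.com", "b.org", "b.org", "c.net", "d.io"]
print(f"{top_n_share(citations, n=2):.1%}")  # → 66.7%
```

A lower value, as with ChatGPT’s 18.5%, means citations are spread across many domains; a higher value, as with Perplexity’s 26.7%, means they cluster in a small set of favored sources.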

Gemini and Perplexity Rely on Authoritative Sites

Gemini and Perplexity tend to lean heavily on authoritative websites. As noted, Gemini trusts institutional sites the most and Perplexity cites .edu sites more than any of the other AI engines.

Perplexity showed a similar pattern, reliably linking to highly trusted and authoritative sites. BrightEdge’s report explains:

“Perplexity focuses more of its citations on institutional, government, encyclopedic, and medical publisher sources than any other engine. Combined, those four categories account for nearly 30% of Perplexity’s citations.”

Five AI Engines, Five Different Citation Profiles

Here’s a breakdown showing the citation distribution of each AI search engine, with Gemini and Perplexity showing a strong preference for authoritative sites.

Gemini

  • 26% institutional sites
  • 23% .org
  • 13% .gov
  • 0.2% UGC

Perplexity

  • 86% of citations come from position 5 or earlier
  • 30% of citations from institutional, government, encyclopedic, and medical publisher sources
  • 22% institutional sites
  • 3.2% .edu
  • 1.5% UGC sites

ChatGPT

  • The top 10 sources account for 18.5% of citations
  • 20% .org
  • 12% .gov
  • 0.5% UGC

Google AI Mode

  • The top 10 sources account for 19.4% of citations
  • 14% institutional sites
  • 7% UGC

Google AI Overviews

  • 18% UGC
  • 10.6% of citations from one video site
  • 10% institutional sites
  • 2.9% from forum sites

Google AI Is Not A Single System

Google’s AI Mode and AI Overviews cite nearly identical websites, with 59% overlap of cited sources. Gemini overlaps far less with either.

  • Gemini vs AI Overview: 34%
  • Gemini vs AI Mode: 27%

This difference shows that Google’s AI systems rely on different mixes of sources, with Gemini diverging the most.

Takeaways

The data provides a concise profile of the types of sources each AI search engine is most likely to cite. There are large differences in source citations, with each engine showing clear preferences for the types of sites it links to. If there is one big takeaway from the data, in my opinion it is the importance of establishing brand connections with products and services.

Other Takeaways

  • Gemini and Perplexity lean on high-authority brands and institutional websites.
  • ChatGPT cites a wide range of sources, showing a high mix of websites.
  • Google’s AI Overviews cites more UGC sites than any other AI search engine.
  • Gemini shows the least overlap among the three Google AI systems.
  • AI Overviews and AI Mode show a high degree of overlap.
  • Citation overlap varies widely across the five AI engines, reflecting significant differences in source selection.

Read the BrightEdge report: Why AI Engines Cite Different Sources But Recommend the Same Products

Featured image by Shutterstock/Toey Andante
