Robots.txt Docs May Expand, Deep Links Get Guidelines, EU Steps In

Welcome to this week's Pulse: updates on how deep links appear in your snippets, how your robots.txt is parsed, how agentic features work in Search, and how EU data-sharing rules apply to AI chatbots.
Here is what matters for you and your work.
Google Lists Best Practices For "Learn More" Deep Links
Google has updated its snippets documentation with a new section on "Learn more" deep links in search results. The documentation lists three best practices that can increase the chances of these links appearing.
Important facts: Content should be immediately visible when the page loads, as content hidden behind expandable sections or tabbed interfaces can reduce the likelihood of these links appearing. Sections should use H2 or H3 headings. Anchor text needs to match the content on the page, and pages whose content loads only after scrolling or interaction may see their chances reduced even further.
Why This Matters
The three practices are the first specific guidance Google has published on this feature. Sites that use expandable FAQ sections, tabbed product information areas, or scroll-loaded content for essential information may see fewer deep links in their snippets than sites that serve the same content on page load.
The guidance follows a pattern Google has used for other Search features: content that requires no user interaction to reach is more likely to appear in enhanced displays.
Slobodan Manić, founder of No Hacks, made a related comment on LinkedIn:
“The docs are scoped to snippets (read: deep links in search results), but the language Google chose reads like a general preference. ‘Content that’s immediately visible to a person’ is a structural requirement, not a readability tip.”
Manić’s point extends his April 16 IMHO interview with Managing Editor Shelley Walsh, in which he argued that many websites are structurally broken for AI agents. His position is that search crawlers and AI agents now face the same structural problem, and the test is the same for both.
On existing pages, the question to audit is whether key information sits inside a click-to-expand element. If a page already has a “Learn more” deep link for one section, the structure of that section serves as a template for what works. Repeating that layout in other sections on the same page may improve their chances as well.
Google describes the guidelines as best practices that can “increase the likelihood” of deep links. That hedging matters: this is not a list of requirements, and following all three does not guarantee that links will appear.
Read our full article: Google Lists Best Practices For "Learn More" Deep Links
Google May Expand Its List of Unsupported Robots.txt Rules
Google may add rules to its robots.txt documentation based on analysis of real-world data collected through the HTTP Archive. Gary Illyes and Martin Splitt described the project in a recent Search Off the Record podcast.
Important facts: The Google team analyzed the most common unsupported rules in robots.txt files across millions of URLs captured by the HTTP Archive. Illyes said the team plans to document the 10 to 15 most-used rules beyond user-agent, allow, disallow, and sitemap. He also said the parser may accept more misspellings of "disallow" in the future, though he did not commit to a timeline or to specific typos.
Why This Matters
If Google documents more unsupported rules, sites using custom or third-party directives will have clear guidance on what Google ignores.
Anyone maintaining a robots.txt file with rules beyond user-agent, allow, disallow, and sitemap should check whether those rules are among the ones Google has never supported. HTTP Archive data is publicly queryable on BigQuery, so the same dataset Google analyzed is available to anyone who wants to run the numbers.
Typo tolerance is the most speculative part. Illyes’ comments suggest the parser already accepts some misspellings of “disallow,” and more may be honored later. Audit your file for spelling mistakes now and correct them, rather than assuming they will be forgiven.
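As a rough audit, a short script can separate directives Google documents as supported (user-agent, allow, disallow, sitemap, the four named in the podcast) from everything else, and flag near-misses that look like misspellings. This is a minimal sketch: the similarity threshold and the typo heuristic are our own illustration, not a description of how Google's parser actually behaves.

```python
import difflib

# The four directives Google currently documents as supported;
# anything else is either unsupported or a possible typo.
SUPPORTED = {"user-agent", "allow", "disallow", "sitemap"}

def audit_robots_txt(text: str):
    """Return (unsupported, possible_typos) directive lists for a robots.txt body."""
    unsupported, typos = [], []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line or ":" not in line:
            continue
        directive = line.split(":", 1)[0].strip().lower()
        if directive in SUPPORTED:
            continue
        # A close string match to a supported directive is probably a misspelling.
        if difflib.get_close_matches(directive, SUPPORTED, n=1, cutoff=0.8):
            typos.append(directive)
        else:
            unsupported.append(directive)
    return unsupported, typos

robots = """\
User-agent: *
Disalow: /private/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
"""
print(audit_robots_txt(robots))  # → (['crawl-delay'], ['disalow'])
```

A report like this separates rules you might simply delete (unsupported) from rules you should fix immediately (typos), which is the distinction the podcast discussion turns on.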
Read our full story: Google May Expand List of Unsupported Robots.txt Rules
EU Proposes Google Share Search Data With Competitors And AI Chatbots
The European Commission has published preliminary findings proposing that Google share search data with rival search engines across the EU and EEA, including AI chatbots that qualify as online search engines under the DMA. The measures are not yet binding: public consultation is open until May 1, with a final decision expected on July 27.
Important facts: The proposal covers four categories of data to be shared on fair, reasonable, and non-discriminatory terms: ranking, query, click, and view data. Eligibility extends to AI chatbot providers that meet the DMA’s definition of an online search engine. If the Commission upholds that eligibility in its final decision, eligible providers could gain access to anonymized Google Search data under the proposed rules.
Why This Matters
The proposal explicitly extends search data sharing eligibility to AI chatbots under the DMA. If that eligibility survives the consultation, the regulatory category of “search engine” now includes products that most marketing work has treated as a separate channel.
The impact depends on where you operate. For sites competing for EU/EEA visibility, the change could widen the pool of products that anonymized search signals flow into. AI products that compete with Google in that market could use the data to improve their retrieval and ranking systems, which may influence which content they cite.
Outside the EU, the direct regulatory effect is nil, but the definitional precedent is another matter. How the Commission draws the line between an “AI chatbot” and an “AI chatbot that qualifies as a search engine” will likely be cited in future proceedings.
Eligibility is the question to watch through May 1. If the Commission narrows the criteria for AI chatbots in response to the consultation, the impact stays contained. If it holds the line, that could set a significant precedent for how AI search is regulated.
Read our full story: Google May Have to Share Search Data with Competitors
Google Adds New Activity-Based Search Features
Google has introduced new Search features that continue its evolution toward task completion. Users can now track price drops for individual hotels with a new toggle in Search, and Google is adding the ability to launch AI agents directly from AI Mode.
Important facts: Hotel price tracking is available globally via a toggle in search results. When prices drop at a tracked hotel, Google sends an email alert. The AI agent launch from AI Mode lets users initiate agent-managed tasks within the search interface. Rose Yao, product lead for Google Search, posted about the features on X.
Why This Matters
Each task-based feature moves a process that used to start somewhere else into the Google environment. Hotel rate tracking has been available at the city level for months; the expansion to individual hotels adds a signal users can set within Google rather than on hotel or aggregator sites.
Direct booking visibility increasingly depends on presence within the Google ecosystem. Sites that rely on price-drop alerts as an incentive for return visits may see some of that engagement redistributed to Google’s tracking UI. For hotel companies, this raises the priority of ensuring each property’s pages are fully populated in Google Business Profile and hotel feeds.
On LinkedIn, Daniel Foley Carter linked the feature to a broader pattern:
“Google’s AI Overviews, AI Mode, and in-SERP functionality eat a lot of traffic opportunities. Everything Google told us not to do, it does itself. SPAM / LOW-VALUE CONTENT – don’t rehash other people’s content – Google does.”
The AI agent launch is the more speculative of the two. Google has not yet published detailed documentation explaining what types of tasks users can delegate or how sources will be cited. The feature suggests that agentic search, which Sundar Pichai has described in interviews, is arriving in Search incrementally rather than as a single launch.
Read Roger Montti’s full story: Google Adds New Activity-Based Search Features
Theme of the Week: The Rules Are Getting Written Down
Each story this week describes something informal becoming documented.
Google has signaled plans to expand what its robots.txt documentation covers. The company has published best practices that can increase the chances of “Learn more” deep links appearing. The European Commission has proposed measures that extend search engine data sharing to AI chatbots under the DMA. And the task-based features Sundar Pichai has described in interviews are appearing as toggles in the search bar.
For your day-to-day work, the ground is firming up. Fewer questions are judgment calls. What works and what doesn’t, what Google supports, and what counts as a search engine to the regulator are all being documented. That works to your advantage when it means clear audit criteria, and against you when “we weren’t sure” is no longer a defensible answer.
Top Stories of the Week:
Additional resources:
Featured Image: [Photographer]/Shutterstock



