
Google Testing Web Bot Auth to Authenticate AI Agent Requests

Google has published documentation describing its test of Web Bot Auth, an experimental IETF protocol that helps websites cryptographically verify automated requests from bots and AI agents.

The protocol adds another layer of authentication by allowing agents to sign HTTP requests with cryptographic keys. Websites can then check those signatures against published public keys to confirm that a request actually comes from the agent it claims to be.

What’s new

Web Bot Auth uses HTTP message signatures (RFC 9421) to allow automated clients to sign outgoing requests. The bot holds a private key, publishes its public key at a known URL, and signs each request. The receiving website checks the signature against the public key to verify the bot's identity.
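
To make the flow concrete, here is a minimal sketch of the sign-and-verify step using Ed25519 keys from Python's cryptography package. It is illustrative only: a real RFC 9421 implementation builds the signature base from covered components and carries it in the Signature-Input and Signature headers, which this sketch simplifies.

# Minimal sketch of the Web Bot Auth sign/verify flow using Ed25519.
# Illustrative only: a real RFC 9421 implementation derives the signature
# base from covered components and encodes parameters in the
# Signature-Input / Signature headers.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- Bot side: hold the private key and sign each outgoing request ---
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()          # published at a well-known URL

signature_base = b'"@authority": example.com\n"signature-agent": "https://agent.bot.goog"'
signature = private_key.sign(signature_base)   # sent alongside the request

# --- Site side: look up the published public key and check the signature ---
try:
    public_key.verify(signature, signature_base)
    print("Signature valid: the request comes from the claimed agent")
except InvalidSignature:
    print("Signature invalid: treat the request as unverified")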

Google says a subset of its AI agents now signs requests, identifying themselves as https://agent.bot.goog. Signed requests include a Signature-Agent HTTP header set to "https://agent.bot.goog", and the corresponding signature can be verified using public keys published in that domain's .well-known directory.
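
For illustration, a receiving site could use the Signature-Agent header to locate the agent's published keys. The key-directory path below follows the current draft's well-known convention and may change as the specification evolves; the header value is the one Google documents for its test.

# Hypothetical sketch: locating an agent's published keys from the
# Signature-Agent header. The well-known path reflects the current draft
# convention and may change as the spec develops.
import requests

def fetch_key_directory(request_headers: dict) -> dict | None:
    """Read Signature-Agent and fetch that origin's key directory."""
    agent = request_headers.get("Signature-Agent", "").strip('"')  # header is a quoted string
    if not agent.startswith("https://"):
        return None  # no usable Signature-Agent header: nothing to verify
    directory_url = f"{agent.rstrip('/')}/.well-known/http-message-signatures-directory"
    resp = requests.get(directory_url, timeout=5)
    resp.raise_for_status()
    return resp.json()  # JWKS-style set of public keys for signature checks

# Example: a request claiming to come from Google's agent domain
keys = fetch_key_directory({"Signature-Agent": '"https://agent.bot.goog"'})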

According to Google’s documentation, some bot detection services, CDNs, and WAFs already support the protocol. The IETF draft was authored by Thibault Meunier of Cloudflare and Sandor Major of Google, and Cloudflare publishes a reference implementation on GitHub.

The IETF Web Bot Auth working group was chartered in early 2026, with milestones for a standards-track specification and a Best Current Practice document.

What Google Hasn’t Done Yet

Not all Google agents participate. The documentation says Google is testing “some AI agents hosted on Google’s infrastructure” but doesn’t specify which ones beyond Google’s user-triggered fetchers.

Even for participating agents, not all requests are signed. The docs recommend that sites continue to rely on IP addresses, reverse DNS, and user-agent strings as the primary means of verification while signed traffic is phased in.

The Internet-Draft may also change as the working group develops the standard.

Why This Matters

Bot spoofing has been an ongoing problem. Scrapers and malicious actors can fake user-agent strings to disguise their traffic as Googlebot or other legitimate crawlers, making it difficult for site owners to tell real bot traffic from fake.

We covered this issue when Google’s Martin Splitt warned that “not everyone who claims to be a Googlebot is actually a Googlebot.” The verification methods available at the time were reverse DNS lookups and IP range checks. Web Bot Auth adds a layer that cannot be forged without the agent’s private key.
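
As a reference point, the long-standing reverse-DNS check for Googlebot looks roughly like the sketch below; Web Bot Auth would supplement this rather than replace it. The sample IP is just an example drawn from Google's published crawler ranges.

# Sketch of the classic reverse-DNS check Google has long recommended for
# verifying Googlebot. Web Bot Auth would sit on top of this, not replace it.
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve to confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse DNS lookup
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward DNS lookup
        return ip in forward_ips                             # must round-trip to the same IP
    except (socket.herror, socket.gaierror):
        return False

print(is_verified_googlebot("66.249.66.1"))  # example address from Google's crawler range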

For sites that already use a CDN or WAF that supports the protocol, verification can happen automatically. For everyone else, the experimental status means there is no urgency to act. The documentation recommends treating existing verification methods as the default and Web Bot Auth as an additional signal.

Looking Forward

Web Bot Auth is still going through the standards process, and Google’s implementation is still being tested.

For now, the practical change is limited. Websites may start seeing signed requests from a subset of Google agent traffic, while existing verification methods remain authoritative.

The open questions are whether more AI agents adopt signed requests, and whether hosting providers automate verification for websites that don’t want to manage it themselves.
