TrawlerPolicyData

TrawlerCrawling

GoogleApi.ContentWarehouse.V1.Model.TrawlerPolicyData

SEO Impact: 6 out of 10 (Medium)
Trawler can add a policy label to a FetchReply. The two main cases are:

  • "spam": added for specific spammer IPs listed in trawler_site_info; most crawls auto-reject replies carrying this label.
  • "roboted:useragent" (e.g. "roboted:googlebot"): added when the InfoOnlyUserAgents field is set in FetchParams.
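
Trawler itself is internal to Google, so how crawls consume this label is not public. The sketch below only illustrates the two documented label shapes around the Elixir model; everything except the Label field and its "spam"/"roboted:..." values is invented for illustration:

    # Hypothetical consumer of the policy label on a fetch reply. Only
    # the Label field and its documented values come from the model;
    # the module name and return values are invented for illustration.
    defmodule PolicyLabel do
      alias GoogleApi.ContentWarehouse.V1.Model.TrawlerPolicyData

      @spec classify(TrawlerPolicyData.t() | nil) ::
              :ok | :rejected_spam | {:roboted, String.t()}
      def classify(nil), do: :ok

      # "spam": most crawls auto-reject replies carrying this label.
      def classify(%TrawlerPolicyData{Label: "spam"}), do: :rejected_spam

      # "roboted:<useragent>", e.g. "roboted:googlebot".
      def classify(%TrawlerPolicyData{Label: "roboted:" <> agent}),
        do: {:roboted, agent}

      def classify(_), do: :ok
    end

Calling PolicyLabel.classify(%TrawlerPolicyData{Label: "roboted:googlebot"}) would return {:roboted, "googlebot"}, while a nil policy (no label attached) falls through to :ok.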

SEO Analysis

AI Generated

Part of Google's web crawling infrastructure (Trawler is Google's internal name for its web crawler). This model records policy decisions, such as spam rejection or robots exclusion, made when a page is fetched, which determine whether the fetched content is processed further. Crawl management directly impacts how quickly new content is discovered and how often existing content is refreshed in the index.

Actionable Insights for SEOs

  • Optimize crawl budget by fixing broken links and reducing redirect chains
  • Use robots.txt and sitemap.xml effectively to guide crawling (see the example after this list)
  • Monitor Google Search Console for crawl errors and indexing issues
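
To make the robots.txt point concrete, a minimal illustrative file might keep crawlers out of low-value URL space while advertising the sitemap. The paths and host below are placeholders, not recommendations for any specific site:

    # Illustrative robots.txt; paths and the sitemap URL are placeholders.
    User-agent: *
    Disallow: /search/
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml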

Attributes (2)

ExtraData
Default: nil | Full type: integer()

In the roboted case, the RobotsInfo.

Label
Default: nil | Full type: String.t

"spam" or "roboted:googlebot"