Google Search Console is a web service provided by Google for webmasters.

It allows webmasters to:

  • Submit and check a sitemap.
  • Check and set the crawl rate, and view statistics about when Googlebot accesses a particular site.
  • Write and test a robots.txt file, and discover pages that are accidentally blocked by it (see the example after this list).
  • List internal and external pages that link to the site.
  • Get a list of links which Googlebot had difficulty crawling, including the error that Googlebot received when accessing the URLs in question.
  • See which keyword searches on Google led to the site being listed in the SERPs, and the click-through rates of those listings.
  • Set a preferred domain, which determines how the site URL is displayed in SERPs.
  • Highlight to Google Search the elements of structured data that are used to enrich search results.
  • Demote Sitelinks for certain search results.
  • Receive notifications from Google for manual penalties.
  • Access an API to add, change and delete listings and to list crawl errors (a sketch follows this list).
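
For illustration, a minimal robots.txt of the kind the robots.txt tester checks. The paths and domain are placeholders; a misplaced Disallow rule here is exactly the sort of accidental blocking the tool helps uncover:

    User-agent: *
    Disallow: /private/
    Allow: /

    Sitemap: https://www.example.com/sitemap.xml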
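As a rough sketch of API access, the snippet below lists and submits sitemaps for a verified property through the Search Console API using the google-api-python-client library. It assumes a service account that has been granted access to the property; the credentials file name and the example.com URLs are placeholders, not part of the original description:

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Assumed placeholders: credentials file and verified property URL.
    SCOPES = ["https://www.googleapis.com/auth/webmasters"]
    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)

    # Build a client for the Search Console API.
    service = build("searchconsole", "v1", credentials=credentials)

    # List the sitemaps already submitted for the property.
    response = service.sitemaps().list(siteUrl="https://www.example.com/").execute()
    for sitemap in response.get("sitemap", []):
        print(sitemap["path"])

    # Submit a new sitemap for the property.
    service.sitemaps().submit(
        siteUrl="https://www.example.com/",
        feedpath="https://www.example.com/sitemap.xml").execute()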