Googlebot ignores robots.txt


I'm noticing that Googlebot is not respecting my robots.txt. I'm seeing Googlebot's user agent crawl pages that have been listed in my robots.txt file for many months. Some of them show up in GSC as "Indexed, though blocked by robots.txt", with "Last crawled" dates as recent as yesterday.

Additionally, I'm seeing Googlebot fetch my robots.txt file a few times a day, and the URLs in question are definitely blocked according to the Google robots.txt Tester.
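One thing worth ruling out (an aside, not something from the post itself): other crawlers sometimes spoof Googlebot's user-agent string, so log entries alone don't prove the requests came from Google. Google's documented way to verify a hit is a reverse-then-forward DNS check on the requesting IP. A sketch in Python, using an example Googlebot IP for illustration:

```python
import socket

def looks_like_googlebot_host(host: str) -> bool:
    # Genuine Googlebot traffic comes from hosts under these Google domains.
    return host.endswith((".googlebot.com", ".google.com"))

def is_real_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check, per Google's documented procedure:
    the PTR record for the IP must point into googlebot.com/google.com,
    and that hostname must resolve back to the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        return looks_like_googlebot_host(host) and ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        # No PTR record, or the lookup failed: treat as not Googlebot.
        return False

# Example (needs DNS access): a genuine Googlebot address such as
# 66.249.66.1 passes; a spoofer's IP fails at the PTR-record check.
```

If the hits verify as genuine Googlebot, the question becomes why Google is still requesting disallowed URLs; if they don't, the "Googlebot" traffic is an impostor and robots.txt compliance is moot.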

My robots.txt is in the following format:

Sitemap: ...

User-agent: *

# ...
Disallow: ...
Disallow: ...
etc. ~ 40 lines

# ...
Disallow: ...
Disallow: ...
etc. ~ 60 lines

# ...
Disallow: ...
Disallow: ...
etc. ~ 20 lines
