![URL is not available to Google / blocked by robots.txt, but it is not robots.txt is allowed - Google Search Central Community](https://storage.googleapis.com/support-forums-api/attachment/thread-141673123-16229526744087123644.png)

![console says robots.txt is blocking googlebot for images but its not (and tester says its ok too) - Google Search Central Community](https://storage.googleapis.com/support-forums-api/attachment/thread-19010848-4115745824209449958.png)

![URL Inspection Tool Shows Blocked Resources Via Robots.txt But Robots Tester Indicates "Allowed" - Google Search Central Community](https://storage.googleapis.com/support-forums-api/attachment/thread-3259265-11695333023515808037.jpg)

![How to Properly Use Robots.txt tester in Google Search Console? - BloggerSpice](https://3.bp.blogspot.com/-VhY2f7_Dm7g/VyiBMsFV93I/AAAAAAAAO8Q/4OuR7oHLaU4_XcyZSvDvMj4P1SuwJvFvgCLcB/s1600/robots.txt%2Btester%2Bdisallow.png)

![Mixed Directives: A reminder that robots.txt files are handled by subdomain and protocol, including www/non-www and http/https [Case Study]](https://searchengineland.com/wp-content/seloads/2020/04/robots-txt-tester.jpg)