Crawl errors using Jetpack generated sitemap
When I started my blog 6 months ago, I experimented with a few logos and deleted them after I settled on the current one. Unfortunately, those 8 pages still show up in my Google Search Console as “Submitted URL has crawl issue”. These 8 links are nothing but old versions of my logo, which I overwrote with my current logo.
In my Google Search Console I see 2 sitemaps:

1. A sitemap for my posts and pages, and
2. An image sitemap ( https://financialfreedomcountdown.com/image-sitemap-1.xml )

The 8 pages are part of the XML image sitemap generated by Jetpack. https://financialfreedomcountdown.com/ffc_logo/ is an example of one of the 8 URLs.
When I use the URL Inspection tool I get the message “URL is not on Google: Indexing errors”, with “Page fetch: Failed: Crawl anomaly”. It also tells me that the URL is “blocked by robots.txt”. But when I use the robots.txt Tester tool and type in the exact URL, it says “Allowed”.
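One way to double-check the robots.txt Tester result is to run the same allow/disallow check locally with Python's standard-library `urllib.robotparser`. A minimal sketch, assuming a hypothetical robots.txt body (substitute the live contents of the site's actual robots.txt file to test the real rules):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with the live file from
# the site's /robots.txt to test the real rules.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# One of the 8 attachment URLs from the Jetpack image sitemap.
url = "https://financialfreedomcountdown.com/ffc_logo/"
print(parser.can_fetch("Googlebot", url))  # True under the sample rules above
```

If this agrees with the robots.txt Tester (“Allowed”) but the URL Inspection tool still reports a block, the discrepancy is likely on Google's side (e.g. a stale cached copy of robots.txt), not in the rules themselves.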
This is so confusing: on one hand it seems that robots.txt is blocking these pages, but when I test it I don't see them being blocked, and when I open the URLs I can access all 8 pages in question.
Ideally I would like to find a way to delete these 8 URLs from my image sitemap, or have the image sitemap regenerated so they are no longer part of it. Any ideas?
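Whatever fix is applied, it helps to verify afterwards whether the stale URLs are still listed in the regenerated sitemap. A short sketch using Python's standard-library XML parser; the sitemap excerpt below is a trimmed, hypothetical sample (in practice you would download the live image-sitemap-1.xml and feed it in):

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical excerpt of an image sitemap -- replace with the
# live image-sitemap-1.xml contents to check the real file.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://financialfreedomcountdown.com/ffc_logo/</loc></url>
  <url><loc>https://financialfreedomcountdown.com/some-post/</loc></url>
</urlset>
"""

# Sitemap entries live in the standard sitemaps.org namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Check whether a stale attachment page is still listed.
stale = "https://financialfreedomcountdown.com/ffc_logo/"
print(stale in urls)  # True for this sample excerpt
```

If the stale URLs disappear from the regenerated file, the remaining “Submitted URL has crawl issue” entries in Search Console should clear on their own as Google re-crawls the sitemap.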
The page I need help with: [log in to see the link]