Hi @aaschere,
I hope you are keeping well and thank you for reaching out to us.
The robots.txt seems to have been recreated (I can view it live here), but I cannot see it on my FTP where it used to be (in the root folder). Where is that file located once SmartCrawl recreates it?
SmartCrawl generates robots.txt dynamically, hence a physical file is not created under the root folder.
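For context, WordPress itself serves robots.txt this way: a rewrite rule routes the request to the `do_robots()` template function, and plugins modify the output through the `robots_txt` filter, so no file ever needs to exist on disk. The Python below is only an illustrative sketch of that flow (WordPress does this in PHP, and the function names here are made up for the example):

```python
# Illustrative sketch of a dynamically generated robots.txt,
# loosely mirroring WordPress's do_robots() + robots_txt filter.
# Not SmartCrawl's actual code.

def base_robots(site_public: bool = True) -> str:
    """Default rules, similar to what WordPress core outputs."""
    if not site_public:
        # "Discourage search engines" setting enabled
        return "User-agent: *\nDisallow: /\n"
    return (
        "User-agent: *\n"
        "Disallow: /wp-admin/\n"
        "Allow: /wp-admin/admin-ajax.php\n"
    )

def add_sitemap(robots: str, sitemap_url: str) -> str:
    """A plugin-style filter that appends a Sitemap directive."""
    return robots + f"\nSitemap: {sitemap_url}\n"

# The response body is built per request and returned as text/plain.
output = add_sitemap(base_robots(), "https://example.com/sitemap.xml")
print(output)
```

Because the content is assembled per request, editing it in the plugin settings takes effect immediately without any FTP changes.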
Within the robots.txt file, the “allow” line reads: “/wp-admin/admin-ajax.php” which seems odd. Is that what it should read?
By default, WordPress allows /wp-admin/admin-ajax.php through robots.txt. This is because WordPress uses admin-ajax.php to refresh a page’s content without reloading the entire page.
Should the robots.txt file also include the link to the sitemaps? I’ve seen it both ways.
Including your sitemap in the robots.txt file guides search engine bots to its location and helps them crawl and index your site. This enhances the site’s crawlability and results in more effective indexing.
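For example, a robots.txt that references a sitemap typically looks like this (the domain below is a placeholder; your actual sitemap URL will differ):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```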
I hope I was able to answer all your queries. Please feel free to get back to us if you need any further clarification.
Kind Regards,
Nebu John
Thanks very much for the detailed answers. Regarding your last point about putting the sitemap in the file, the robots.txt generated by SmartCrawl does not have a link to the sitemaps. Is there a setting within the plugin that allows you to add this?
Quick update: I found where to create the robots.txt within the plugin settings, and now I can see the sitemap linked in the robots.txt file. That said, the “/wp-admin/admin-ajax.php” is now gone from allow, and there is no content in disallow. That doesn’t sound normal based on your previous message. Thoughts?
Thanks again.
– This reply was modified 2 years, 7 months ago by aaschere.
Hi @aaschere,
Hope this message finds you well and thanks for the update.
That said, the “/wp-admin/admin-ajax.php” is now gone from allow, and there is no content in disallow. That doesn’t sound normal based on your previous message. Thoughts?
As Nebu explained, that rule is added by WordPress itself. That said, SmartCrawl has a feature under Advanced Tools >> Robots.txt Editor >> Customize where you can add the wp-admin rules yourself, which would be:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Let us know if you require additional assistance.
Best regards,
Laura
Thank you. I guess my point is: if Nebu was suggesting those entries are important in the robots.txt, I would think the SmartCrawl plugin would add them by default. I see where I can add them, and I have read blog posts and resources where others recommend more or different entries in the robots.txt – I was hoping the plugin would use SEO/WordPress best practices to fill in what should be in there for the optimal SEO experience.
Hi @aaschere
By default, we add only this to robots.txt:
User-agent: *
Disallow:
and, depending on the scenario, we allow users to customize that file, as other SEO plugins on the market do. It seems that we have replied to all your queries at this point, so I will mark this thread as resolved.
Please note that I have passed your suggestion on to our developers so that we can consider what should be added to that file by default.
Kind Regards,
Kris