Search engines will crawl them, but they won’t give us the indexing and error information that we would see in Search Console.
Another way is to ping a URL of the search engine that we want to inform, passing it the sitemap URL to read. A ping is nothing more than a request to that URL, without any further complexity: the URL is loaded and the system takes notice. This solution also works for several search engines. We send a ping to a URL with the following format:

searchdomain/ping?sitemap=SITEMAP_URL

For example, to ping Google with a sitemap on my domain we would call a URL like the following:

https://www.google.com/ping?sitemap=https://mydomain/sitemap.xml
This is the most uncontrolled way, although it can serve to give the search engine a wake-up call and make it look at something. Many people also use it to try to force crawls: before an update, we ping sitemaps that contain the URLs we want it to visit. Although it is true that uploading sitemaps of URLs to perform updates seems to have some effect, to be honest I have not seen much difference between pinging those sitemaps and not pinging them.
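Since a ping is just a GET request, it is easy to script. A minimal sketch using only Python's standard library, assuming Google's `/ping` endpoint and a hypothetical `mydomain` sitemap (the helper names are mine, not a real API):

```python
from urllib.parse import urlencode
from urllib.request import urlopen


def build_ping_url(ping_endpoint: str, sitemap_url: str) -> str:
    # URL-encode the sitemap URL so characters like ":" and "/" survive
    # as a single query-string value.
    return ping_endpoint + "?" + urlencode({"sitemap": sitemap_url})


def ping_sitemap(ping_endpoint: str, sitemap_url: str) -> int:
    # The ping is nothing more than loading the URL; the response body
    # does not matter, only that the request arrives.
    with urlopen(build_ping_url(ping_endpoint, sitemap_url)) as resp:
        return resp.status


# Example (hypothetical domain):
# ping_sitemap("https://www.google.com/ping",
#              "https://mydomain/sitemap.xml")
```

A 200 response only confirms the ping was received, not that the sitemap was accepted or crawled, which is exactly why this is the most uncontrolled method.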
Sitemap.xml files only affect URLs that are in their own directory or deeper within it
That is, “mydomain/folder1/folder2/sitemaps.xml” should not contain links to files in “mydomain/folder1” or “mydomain”, only to those in “mydomain/folder1/folder2/…”
So, for example, creating the typical sitemaps folder and hosting one at “mydomain/sitemaps/posts.xml” is not a good idea, since in theory these links would not be followed.
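The scope rule above can be checked programmatically. A small sketch, assuming the `urlparse`-based approach and hypothetical example URLs, that tells you whether a page URL falls inside a sitemap's directory:

```python
from urllib.parse import urlparse


def in_sitemap_scope(sitemap_url: str, page_url: str) -> bool:
    """True if page_url lives in the sitemap's directory or deeper."""
    sm = urlparse(sitemap_url)
    page = urlparse(page_url)
    if sm.netloc != page.netloc:
        return False
    # Directory holding the sitemap file, e.g. "/folder1/folder2/"
    sitemap_dir = sm.path.rsplit("/", 1)[0] + "/"
    return page.path.startswith(sitemap_dir)


# In scope: same directory as the sitemap
# in_sitemap_scope("https://mydomain/folder1/folder2/sitemaps.xml",
#                  "https://mydomain/folder1/folder2/page.html")
# Out of scope: a parent directory
# in_sitemap_scope("https://mydomain/folder1/folder2/sitemaps.xml",
#                  "https://mydomain/folder1/page.html")
```

Running a check like this over your sitemaps before submitting them catches out-of-scope URLs that search engines would simply ignore.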
The best thing to do is to keep all of your sitemaps at the root of your domain (this way you can be sure you won’t run into these restrictions). You can also keep things well organized and, for example, have a “blog/sitemaps.xml” that only links to “blog/…”. However, it’s easier to make mistakes with these URLs than by forcing the files to stay at the root of the domain (without using folders).