Working in the SEO field, you periodically get clients that aren't appearing in search at all, sometimes even for branded search. Not the clients that have poor content and no SEO strategy implemented, but the ones who were ranking REALLY well and then, all of a sudden, everything took a nosedive. At this point, you're probably thinking: bad site launch. We've touched on potential issues with site launches and why your website might not be appearing in search before, but not a specific case like this. These issues can cost businesses thousands, if not hundreds of thousands, of dollars depending on their size.
Finding the Problem
We recently had a client come to us who had launched a new site on MIVA, an ecommerce platform, and was not appearing in search (branded or non-branded), except for their business listing when you typed the full business name. Anytime this happens, these are the first two things I look for:
- Check if the robots.txt file has disallowed all crawlers. (Note: if this is the case, the robots.txt file will contain a blanket "Disallow: /" rule.)
- See if all of the website's URLs are noindexed/nofollowed.
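Both of those checks can be scripted. Here's a minimal sketch in Python using only the standard library; the parsing is deliberately simplified for illustration (a real robots.txt parser would also scope each Disallow rule to its User-agent group, and meta tags can list their attributes in any order):

```python
import re


def has_blanket_disallow(robots_txt: str) -> bool:
    """Rough check for a robots.txt that blocks everything.

    Looks for a bare "Disallow: /" line; a production parser would
    also confirm the rule sits under "User-agent: *".
    """
    for line in robots_txt.splitlines():
        if re.match(r"(?i)^\s*disallow:\s*/\s*$", line):
            return True
    return False


def has_noindex_meta(html: str) -> bool:
    """Rough check for <meta name="robots" content="noindex...">.

    Assumes the name attribute comes before content; swap the regex
    or use a real HTML parser for production use.
    """
    pattern = r'(?is)<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html) is not None


print(has_blanket_disallow("User-agent: *\nDisallow: /"))               # True
print(has_noindex_meta('<meta name="robots" content="noindex, nofollow">'))  # True
```

Run these against the live robots.txt body and the rendered HTML of a few key pages; if both come back False, keep digging.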
If neither of those is present, there is probably a deeper problem. In this case, I started running crawls with Screaming Frog and SEMRush. Screaming Frog was showing me limited results that didn't include any of the site's primary pages or PDFs: only about 20 scripts and mm5 files appeared when there were thousands of pages and PDFs that should have been crawled. My SEMRush crawl identified that there was clearly an indexing issue, but it didn't specify what it was. I went back to the code and still couldn't find anything; it was driving me nuts! I then went into Google Search Console and had Googlebot fetch the home page to see what it was seeing that I wasn't. Sure enough, there was an X-Robots-Tag HTTP header telling bots not to index or follow.
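You don't need Search Console to spot this header; any HTTP client that shows response headers will do. Here's a small Python sketch using the standard library. The directive parsing is simplified: real responses can send multiple X-Robots-Tag headers, and directives can be scoped to a specific user agent (e.g. "googlebot: noindex"), which this doesn't handle.

```python
from urllib.request import Request, urlopen


def robots_directives(headers) -> list:
    """Pull index-blocking directives out of an X-Robots-Tag header.

    `headers` is any mapping with .get() (a plain dict works, as does
    the http.client.HTTPMessage that urlopen returns).
    """
    value = headers.get("X-Robots-Tag") or ""
    directives = [d.strip().lower() for d in value.split(",") if d.strip()]
    return [d for d in directives if d in ("noindex", "nofollow", "none")]


def check_url(url: str) -> list:
    """Fetch only the headers for `url` and report blocking directives."""
    req = Request(url, method="HEAD")
    with urlopen(req) as resp:  # network call; point this at your own site
        return robots_directives(resp.headers)


# Canned example mirroring what the Googlebot fetch revealed here:
print(robots_directives({"X-Robots-Tag": "noindex, nofollow"}))
# -> ['noindex', 'nofollow']
```

If this returns anything for your home page, crawlers are being told to stay away at the server level, no matter how clean the on-page code looks.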
I took this information to our technical team, and my colleague Cory informed me that this may be happening at the server level, which ended up being true. The code was removed; I re-fetched and requested indexing of the home page and created a new sitemap that I quickly resubmitted. Within a day the website was starting to see traffic again.
Throughout this process, I was working with MIVA support, but we were the ones who found the issue and its cause, not MIVA. When we pointed it out, they noted that they insert the tag during development and normally remove it at launch with the client's permission. In this case, the client had given permission, but MIVA forgot to remove it. This was a HUGE mistake: it effectively made our client's website invisible to search engines.
When I was trying to solve this problem, there were limited resources out there—just a couple of somewhat-relevant MIVA forum threads. I hope that in the future if this happens to MIVA users, they’ll find this post, and it will save some businesses a lot of money.
Had/Have a Similar Issue?
If you've had or are currently dealing with a similar situation, think I missed some checks in the process, or have any other feedback, I'd love to hear about it in the comments.