Greetings, boils and ghouls!
Today is Halloween – one of the most popular U.S. holidays, with more than 21 million searches per month.
Clearly, this means Halloween needs to be celebrated by the digital marketing community.
So I decided to mark this Halloween in the spookiest way possible.
I asked 19 SEO pros to share the most terrifying stories – the most horrifying situations they’d ever faced during their career.
This post is not only here to entertain you, but to also remind you how one small mistake can destroy the whole site’s SEO performance.
Hopefully, next year at this time, you won’t be featured in one of these gruesome Halloween roundups!
Jason Barnard, Founder, Kalicube
Back in 2013, I was called in to have a manual penalty lifted. I spent a year disavowing thousands of spammy backlinks.
I also took the opportunity to clean up the site – deleting doorway pages, switching to SSL and HTTP/2, reorganizing the categorization, optimizing the images, adding some funky Schema, and a few other nice tricks.
After three months, the penalty was lifted, and six months later we were seeing 10 percent+ traffic increases every month.
One day I noticed the Google +1 count had shot up from 20 to 1,020.
Turns out, the boss had become frustrated with the “slow” progress and bought them from what he called a “reputable” online service.
A few days later, manual penalty.
Back to square one.
A client decided to open up a marketplace system within their existing ecommerce platform.
All those vendors got their own profile and unique URLs for their product range.
On top of that, the general ecommerce environment added new facets for every individual vendor. The platform proudly launched with 120 vendors.
All those new facets and vendor pages added up to over 1 billion new URLs for a domain that used to have 120,000 indexable URLs.
Nobody involved in that project understood the implications of the new setup for SEO and it took us six months to clean up again.
Clark Boyd, Founder, Candid Digital
I briefly worked at an agency whose “USP” was that they used freelancers to perform all the usual SEO tasks.
My first project there was to try and coordinate 382 new landing pages, all of which were due to launch on the same day for a big event.
The agency used to sell in these preposterous projects on the proviso that the “freelancer network” could deliver.
The assets were delivered to me on time by the freelancers, but the client was less than impressed with the quality.
No, that’s too diplomatic.
They hated it.
With two days until launch, we were still 382 pages short of our target.
In the end, I and one colleague worked round the clock to write titles, descriptions, and many, many paragraphs.
I’m not sure it was any good, but the content was at least a little better than what we had…
Craig Campbell, Founder, Craig Campbell SEO
I was working with someone on their website to help them rank better within the Lancashire area for a number of terms.
We were making good progress with technical SEO – then they decided to move from WordPress to Wix in order to “save on costs.”
And yes, it looks good; but the rankings are now tanking.
Adam Connell, Founder, Blogging Wizard
When I was doing agency work a few years ago, my team and I spent 3+ years working with a client to develop content assets, build links, and increase rankings/traffic.
One day traffic and rankings plummeted. I opened up the site to start figuring out why. The problem was obvious – the blog didn’t exist anymore.
Turns out that a customer service rep at their host “accidentally” deleted their entire blog. Along with all their backups. And their ToS got them out of any responsibility.
The impact was significant: 400+ blog posts and content assets wiped out in an instant. They had to be resurrected from drafts in emails and old Word documents.
Even if your client only hires you to work on content/SEO, and even if they have an agency managing their website – make sure they have redundant backups for everything.
And I mean everything.
Rachel Costello, Technical SEO Executive, DeepCrawl
It seemed like just another ordinary day in the office – how was I to know that a client was about to tell me something that would send a terrible chill down my spine?
One morning I was checking the crawl error report in Google Search Console for a new ecommerce client.
There had been a huge spike in crawl errors, from fewer than a hundred to thousands.
I started checking through the website itself. Category page after category page was either completely empty or had just one or two products left on it, when the last time I checked the previous day they were full.
I then checked the back-end in the CMS and saw, to my horror, that over two-thirds of all of the products had been disabled even though there was still stock left, meaning all of these product pages we’d been working hard to improve were now serving 404s.
I set up a call with the client as soon as I could to find out what had happened.
They told me that “The SEO consultant we worked with before told us it was fine to disable products whenever we want. So at the end of each season, we disable all of the products and if they’re seasonal we just launch them again with new URLs when we want to showcase them on the website again. That’s still OK, right?”
Needless to say, some training on stock management was scheduled immediately. However, the thought of all that wasted link equity over the years still haunts me to this day.
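A lightweight monitor could have caught those disappearing product pages the same day. Here is a minimal sketch in Python, assuming you maintain a list of your indexable product URLs (the helper names here are hypothetical, not the client's actual tooling):

```python
# Minimal sketch of a daily product-URL health check (the URL list and
# helper names are hypothetical, not from the story above).
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def fetch_status(url):
    """Return the HTTP status code for a single URL."""
    req = Request(url, method="HEAD", headers={"User-Agent": "seo-health-check"})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code

def broken_pages(statuses):
    """Given a {url: status_code} map, return the URLs not serving HTTP 200."""
    return sorted(url for url, code in statuses.items() if code != 200)
```

Run it on a schedule and alert when `broken_pages` is non-empty: a sudden wave of 404s shows up hours after the CMS change, not weeks later in a crawl error report.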
Blake Denman, Founder of RicketyRoo
Early on when I started my agency, I was rebuilding a website for a small business on WordPress.
I built out the redesign on my local machine and would migrate the site late one night.
When I migrated the new site, there was a critical error and the website was showing a 500 error.
I tried again, the same result. I tried, and tried, and tried, nothing was working.
It was about 12:30 am and I froze. I didn’t know what to do.
From 12:30 am on, I resorted to rebuilding the entire site in the live environment.
I finished the site at 5:30 a.m.
I later found my critical mistake. Even though I had double, triple, quadruple-checked the database info, I made the slightest mistake in the password.
Nick Eubanks, Founder, From the Future
This one is so simple, but it was costing our client so much money (tens of millions of dollars per month), and the culprit was nothing more than a misplaced canonical tag…
This client had an internal page, right off the root directory, that was built to target a keyword with an exact match MSV of ~130,000, but there was a canonical to the site’s homepage.
After a simple site crawl, once it was identified, we simply removed the tag and the page popped to Position 5 (and now generates literally tens of millions of dollars in online revenue each and every month).
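Spotting stray canonicals like that is easy to script. A minimal standard-library sketch (the function names are mine, not the agency's crawler):

```python
# Minimal sketch: flag pages whose rel="canonical" points somewhere else.
# Fetching the HTML is left out; only the parsing/check step is shown.
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collect href values of <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def canonical_mismatch(page_url, html):
    """Return the canonical target if it differs from the page's own URL,
    else None (trailing slashes ignored for the comparison)."""
    parser = CanonicalParser()
    parser.feed(html)
    for href in parser.canonicals:
        if href and href.rstrip("/") != page_url.rstrip("/"):
            return href
    return None
```

A crawl that runs `canonical_mismatch` on every indexable page surfaces exactly this class of bug: a high-value page quietly telling Google "the homepage is the real version of me."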
Dan Foland, Director of SEO, Postali
I once worked for an agency that handled the SEO for some of the largest and most well-respected healthcare systems in the U.S.
Every time one of our largest and most notable clients pushed an update live from their dev server, it also pushed sitewide noindex tags and robots.txt disallow rules live.
Jenny Halasz, President, JLH Marketing
Working for a very big ecommerce brand about five weeks before Christmas, we had a post about the best gifts reach a huge audience and earn the #1 ranking for “Christmas gifts” on Google.
The increased traffic caused the site to start throwing 503 errors!
We responded really fast with two SEO plays:
- 302 redirect traffic on that specific URL to another domain hosted elsewhere until we could spool up additional bandwidth.
- Change the meta description to include the phone number so that if people couldn’t access the site and hit back, they could call.
Both worked really well. We definitely lost some sales, but were able to recover a lot.
Milosz Krasinski, Founder, Chilli Fruit Web Consulting
My client turned out to be a multinational scam agency. Since I was also managing their hosting, I was halfway involved in it.
Thankfully, it all got sorted.
Ron Lieback, Founder and CEO, Content Mender
Back in 2008, during the first year of running Ultimate Motorcycling, we hired an agency to migrate us from Drupal to WordPress.
At the time we were doing around a million uniques per month, and the content was stronger than ever.
But after the migration, our rankings tanked by more than half, and the SEO company “lost” about 15,000 URLs and over 30,000 images.
Yes – lost; it was super scary because I thought the entire business would go under – advertisers pay based on exposure, and we couldn’t afford to go under.
That was the last time I trusted an agency, but it forced me to learn SEO for myself, which led to where I am today.
I can’t stand a hack, but thankfully that one came into my life. 🙂 As for rankings, it took nearly two years to recover, but persistence and patience paid off. And then some.
Karen Neicy, Director of Experience Strategy, OGK Creative
I once inherited a client website that got hacked because it was using some outdated plugins.
Turns out, the links implanted on the site ended up ranking it for all sorts of “adult” keywords.
So the client was getting traffic from some pretty unsavory verticals.
It was a major brand, and we were getting press inquiries about why they were showing up for such distasteful search terms.
I was like, we’re handling it, but why were you searching for those things in the first place?
It took weeks to correct, but we installed malware protection, removed the bad links (most of them were in the form of anchored blog comments), removed the outdated plugins, and moved the site to HTTPS with a secure certificate.
Andrew Optimisey, Founder, Optimisey Cambridge MeetUp
This is a classic SEO tale of woe.
I’d just started in a new job, SEO was just one of the things I “looked after” (I was very much the one-eyed man in the kingdom of the blind).
All was going well and then I had some holiday booked so was away from work for a bit.
Two days into my holiday I had panicked messages from my boss (via LinkedIn, email, phone – they’d tried almost everything to get hold of me – I was in a low/no signal area).
In short, the developer team had released a bunch of updates and… the robots.txt now contained “User-agent: *” followed by “Disallow: /” – blocking every crawler from the entire site.
It took them two days to notice that the traffic jumped off a cliff. After having the “hair on fire” moment, they started trying to call me.
The moderately happy ending is that it was a relatively quick fix (thanks to the devs and Google Search Console), and the dev team didn’t make that mistake again!
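That blanket disallow is exactly the kind of thing a one-line deploy gate can catch. A minimal sketch using Python's built-in robots.txt parser (the deploy hook around it is assumed, not shown):

```python
# Minimal sketch of a post-deploy robots.txt sanity check, using the
# standard library's robots.txt parser.
from urllib.robotparser import RobotFileParser

def blocks_everything(robots_txt, user_agent="Googlebot"):
    """True if this robots.txt would block the given crawler from the root."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(user_agent, "/")
```

A CI step that fails the build when `blocks_everything(new_robots)` is true turns a two-day traffic cliff into a rejected deploy.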
Bill Sebald, Founder, Greenlane
At an old agency, we worked with a very well-known ecommerce brand.
They were immovable in their #1 rankings for many, many years. (I’d really love to tell you who it was, but I cannot – still, it’s all terrifyingly true!)
They decided to buy their #2 competitor in a very expensive buy-out (who was also immovable in their respective rankings). It was a huge story that month in the trades.
This competitor had an exact match keyword as their domain. (The EMD update hadn’t happened yet.) The keyword had more than a million searches per month. It was a phenomenal opportunity.
We were asked for our opinion on an SEO approach.
We said, “They are mighty, and you are mighty. We recommend you run the site and keep it as close to its current state, even if you change the fulfillment to your own infrastructure. After all, you’ll be owning your #1 and #2 spot – that’s a huge advantage against Amazon. Own that ‘above the fold’ real estate.”
The advice was not taken.
Instead, the site was purchased and promptly dismantled until Google eventually found very little importance in the domain. It dropped right out of the top spot it had enjoyed for 10+ years.
When the purchasers came back and asked how they could fix their mistake, we told them their best bet was to restore the site to its original state. But that was now impossible. The whole process had been fumbled.
To this day, that domain is sitting with no site attached to it. It’s just sitting in a very large company’s portfolio. It’s a domain that has so much power, and it’s just being squandered. Now that is terrifying.
Deepak Shukla, Founder, Pearl Lemon
I remember starting a business that was to be called Kukumber (an agency) and made some cool videos that I published on my site.
My intern Catherine told me she wanted to “publish the videos on other sites.” Great idea, I thought.
What I didn’t know is that she found a multi-channel video uploader and didn’t create original descriptions or even use an article spinner.
And I didn’t ask about her process, nor did I consider that she might not know the difference between duplicating content and syndicating it, or the risks of mass-uploading the same content.
Within a week, my videos and website were slapped with a manual penalty and you could not “Google” Kukumber for love nor money.
And that’s how the story of Pearl Lemon started.
Sal Surra, Senior SEO Specialist, Angie’s List
I accidentally put a meta noindex tag on a template for an enterprise site that generates millions of dollars from ad impressions on organic search results.
Because we used Google Analytics, it took us a couple of days to realize what had happened and get it fixed.
An issue that lasted just a couple of days resulted in multimillion-dollar losses.
That was a really bad day.
Glad I could keep the job.
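A pre-release scan for stray noindex directives could have flagged that template before it shipped. A minimal standard-library sketch (the harness that feeds it rendered pages is hypothetical):

```python
# Minimal sketch: detect a meta robots noindex in rendered HTML,
# the kind of check that could run against templates before release.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect content values of <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def has_noindex(html):
    """True if any meta robots tag on the page contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Rendering each template with sample data and asserting `not has_noindex(html)` for pages that should be indexable is a cheap guard against a multimillion-dollar template edit.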
Marcus Tandler, Founder, Ryte
A large telecommunication service provider used to offer white label shops for their local stores.
Local stores would be paid an affiliate commission for all product sales.
Good idea, but horribly executed because all white label shops were just put in a directory on the main domain with no canonical tag in sight. 🙂
This created enormous amounts of cannibalization issues since there were now multiple duplicates of the very same store.
Of course, this also led to the most SEO-savvy local store ranking for all products with their white label shop instead of the TSP’s own original online shop.
The local store’s subdirectory even ranked for most of the brand terms, resulting in a massive affiliate payout for this local store owner.
The worst part about this horror story: They didn’t even notice.
They noticed the decline in sales in the original online shop but they were excited about the uplift in the affiliate marketing channel.
It was not until six months later (!) that they started using Ryte Search Success and discovered this huge SEO screw-up.
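The textbook fix is a canonical tag on every white-label page pointing back at the original shop’s URL. A minimal sketch of the URL mapping, assuming a hypothetical /shops/&lt;store&gt;/ path structure (not the TSP’s actual setup):

```python
# Minimal sketch: map a white-label shop URL back to the original product
# URL so a rel="canonical" can point there. The /shops/<store>/ layout
# is a hypothetical example, not the real site's structure.
from urllib.parse import urlsplit, urlunsplit

def canonical_for(white_label_url, shops_prefix="/shops/"):
    """Strip the /shops/<store>/ segment so every duplicate canonicalizes
    to the original product URL on the main domain."""
    parts = urlsplit(white_label_url)
    path = parts.path
    if path.startswith(shops_prefix):
        # Drop the prefix and the store slug: /shops/store-x/p/123 -> /p/123
        rest = path[len(shops_prefix):]
        _, _, product_path = rest.partition("/")
        path = "/" + product_path
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))
```

With that mapping emitted into each white-label page’s head, the duplicates consolidate their signals to the original shop instead of cannibalizing it.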
Dan Taylor, Founder, Dan Taylor SEO
I was working with a large, international travel brand that was facing issues from not being on HTTPS; due to their legacy infrastructure, there was a limit on the number of redirects they could implement on any given site.
The first solution proposed by development was to have the different country managers implement 10,000 redirects manually through the CMS – the country managers rejected this as insane.
So the second option: they found another “SEO agency” who agreed with the development team that redirects weren’t necessary for a protocol migration, and that you could just change the preferred URL in Google Search Console.
The end result: both protocol versions live, both indexed. And because the majority of the international sites were in English for other regions, with no hreflang, it was all duplicate content – the straw that broke the camel’s back.
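For a protocol migration, the redirect map itself is trivial to generate: every HTTP URL should 301 to its HTTPS twin. A minimal sketch, assuming a URL export from the CMS (the export itself is hypothetical):

```python
# Minimal sketch: generate (source, target) pairs for an HTTP -> HTTPS
# migration from a flat list of URLs, e.g. a CMS export.
from urllib.parse import urlsplit, urlunsplit

def https_redirects(urls):
    """Return (source, target) redirect pairs for every http:// URL."""
    pairs = []
    for url in urls:
        parts = urlsplit(url)
        if parts.scheme == "http":
            target = urlunsplit(("https",) + tuple(parts[1:]))
            pairs.append((url, target))
    return pairs
```

Feeding the output into whatever redirect layer the infrastructure does support (edge rules, a rewrite map, a load balancer) is far cheaper than hand-keying 10,000 entries into a CMS – and far safer than skipping the redirects entirely.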
What are some of YOUR most horrific SEO tales? Scare us all in the comments, below.