5 Common Split Testing and SEO Mistakes (and How to Avoid Them)

SEO Advice / May 6, 2022

Split testing and Search Engine Optimization (SEO) are two essential tools for any website owner. However, they can affect one another: if you split test incorrectly, your site's SEO can be damaged.

This post will show you five common split testing and SEO mistakes, as well as how you can avoid them.

Mistake #1: Not Setting Canonical URLs for Duplicate Pages

Split testing works by creating two slightly different versions of the same page, then showing them to different users at random. Split testing software tracks metrics such as subscriptions, purchases, or click-throughs over time to determine which version of a page converts better.
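To make that mechanism concrete, here is a minimal sketch in Python of the bucketing-and-tallying logic described above. It is not any particular tool's implementation, and the visitor IDs and conversion events are invented for illustration:

    import hashlib

    def assign_variant(visitor_id: str) -> str:
        """Deterministically bucket a visitor into variant 'A' or 'B'.

        Hashing the ID means a returning visitor always sees the same variant.
        """
        digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"

    # Hypothetical traffic log: (visitor_id, converted?)
    events = [("user-1", True), ("user-2", False), ("user-3", True), ("user-4", False)]

    stats = {"A": {"visitors": 0, "conversions": 0}, "B": {"visitors": 0, "conversions": 0}}
    for visitor_id, converted in events:
        variant = assign_variant(visitor_id)
        stats[variant]["visitors"] += 1
        stats[variant]["conversions"] += int(converted)

    print(stats)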

Unfortunately, serving two near-identical versions of a page can make it unclear to search engines which version they should index and display in search results. When that happens, the wrong version may end up ranking (or neither may rank well), and all of your valuable split testing work goes to waste.

You can avoid confusing search engines by making sure to set a ‘canonical URL’ for the two pages being tested. This will tell search engines which version to view as the definitive one.

Normally, you’ll want to set the control page as canonical while running a test. This may change once you’ve determined a winning page, but to begin with, avoiding confusion is the best policy.

While you can manually define a canonical URL by editing the <head> section of the page's HTML, it's much easier to simply set it from within WordPress. If you're using a tool such as Yoast SEO, you can set the page's canonical URL under the Advanced settings screen:

[Image: Yoast SEO's Canonical URL field.]
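For reference, the manual approach amounts to a single line inside the page's <head>. The URL here is a placeholder; while a test is running, both the control and the variant would point it at the control page:

    <link rel="canonical" href="https://example.com/landing-page/" />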

Properly defining canonical URLs doesn’t take a lot of time, and it ensures you avoid one of the most common split testing and SEO problems.

Mistake #2: Running a Test for Too Long

It’s important to avoid running split tests for too long. If search engines detect two very similar pages on your site for long periods of time, they may interpret it as an attempt to ‘game’ your position in the Search Engine Results Pages (SERPs).

If that happens, Google warns, the penalties can be severe:

…the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.

You can easily avoid this split testing and SEO problem by running your tests for a limited amount of time. More specifically, run your test until you’ve achieved a statistically significant result, then stop.

You don't need to know any math to see whether your results are statistically significant: all quality split testing programs will tell you when you've achieved it. For example, Divi Leads displays this data when you run an A/B test.
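For the curious, the check such tools run is often something like a two-proportion z-test. Here is a rough Python sketch of that calculation; the visitor and conversion counts are invented, and real testing tools use more sophisticated methods:

    import math

    def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
        """Return the z statistic and two-sided p-value for two conversion rates."""
        rate_a = conversions_a / visitors_a
        rate_b = conversions_b / visitors_b
        pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        std_error = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (rate_b - rate_a) / std_error
        # Two-sided p-value from the standard normal CDF
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    z, p = two_proportion_z_test(120, 2400, 156, 2400)
    print(f"z = {z:.2f}, p = {p:.4f}")  # a p-value below 0.05 is a common threshold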

Mistake #3: Manually Blocking Search Engine Crawlers on Duplicate Content

One popular method of preventing search engine crawlers from reaching duplicate pages is to edit your site's robots.txt file. This is a plain text file, stored at the root of your domain, that tells bots which of your pages they may crawl.

You could add rules to robots.txt that block crawlers from all the major search engines, but this method is a bad idea for two reasons. Firstly, it can make it appear that you're showing search engines different content than you show your visitors.
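For context, a blocking rule of this kind is only a couple of lines. The path below is a hypothetical test variant; the file itself always lives at the root of the domain (e.g. example.com/robots.txt):

    User-agent: *
    Disallow: /landing-page-variant-b/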

Google explicitly recommends against blocking crawlers in robots.txt, warning that:

…if search engines can’t crawl pages with duplicate content, they can’t automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages.

Secondly, anyone can view your robots.txt file. Since it's publicly accessible, your competitors could check which pages you're instructing search engines not to index, work out which pages you're testing, and use that insight to optimize their own sites. For obvious reasons, you should avoid this.

Instead of editing robots.txt, Google recommends using either a 302 (temporary) redirect or a canonical URL to tell crawlers how to treat your similar pages.
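As one illustration of the redirect option: on an Apache server, a temporary redirect can be declared in the site's .htaccess file. The paths below are placeholders, and the exact mechanism will depend on your hosting setup:

    # Send visitors to the test variant temporarily; the original URL stays indexed
    Redirect 302 /landing-page/ /landing-page-variant-b/

Because the 302 status marks the redirect as temporary, search engines keep the original URL in their index rather than replacing it with the variant.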

Source: www.elegantthemes.com