If you’re just starting to dip a toe in SEO, it can be easy to stumble into one of the many pitfalls that small businesses encounter when running a website. It takes foresight and careful review to make sure every SEO box is checked for a high-value, profit-making website. But there are several major, common technical SEO issues that you can fix right now. In a time when Google and other search engines are constantly tweaking the standards for what makes a web page appear on the first page of a search result, your website will always be stronger for having avoided these mistakes.
Duplicate Content
According to a SEMrush research analysis of on-site SEO issues, 50% of websites have problems with duplicate content.
Duplicate content as defined by Google is “substantive blocks of content within or across domains that either completely match other content or are appreciably similar.”
This can happen knowingly or not, and it usually isn’t done with malicious intent. However, search engines will have a hard time determining which version is the important one and will rank one over the other. Those pages may even begin to compete with each other in search results!
Occasionally, duplicate content might be seen as an attempt to manipulate search engines by placing that content across multiple domains. If Google perceives this, even if it was unintentional, they state specifically that, “the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.” If your business is dependent on organic search traffic, that can be devastating!
Issue – Domain Variations
Sometimes a site might have variations in how the domain is displayed. That means if a site can be reached with and without a “www” prefix and the same content lives at both, it’s creating duplicate content. Likewise, sites that can be reached on both “http” and “https” versions will have duplicate content problems.
Solution
Use Search Console to tell Google how you want your site to be indexed (www.example.com or example.com). Check with your hosting company to get your preferred domain set up correctly. You should also set up 301 redirects from any domain variations to the correct version so that all search engines and visitors land in the same place. 301 redirects are also very important when moving from the “http” to the “https” version of your site.
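If you want to verify the redirects yourself, a short script can request each domain variation and confirm it lands on your preferred version. Here’s a minimal sketch in Python using the third-party requests library; the domains shown are placeholders for your own.

```python
import requests

# Placeholder: the preferred version of your site.
PREFERRED = "https://www.example.com/"

# Common domain variations that should all 301-redirect to the preferred URL.
VARIATIONS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for url in VARIATIONS:
    # allow_redirects=False lets us inspect the first response directly.
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    target = response.headers.get("Location", "(no redirect)")
    ok = status == 301 and target.rstrip("/") == PREFERRED.rstrip("/")
    print(f"{url} -> {status} {target} {'OK' if ok else 'CHECK THIS'}")
```

Each variation should report a 301 pointing at your preferred URL; anything else is worth a closer look.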
Issue – URL Variations From Additional Parameters
When a page has additional parameters, usually tracking codes or a session ID, it can sometimes get indexed by search engines along with the original page, creating a duplicate content problem.
Examples of this are:
www.example.com/service-page?type=sale
www.example.com/service-page?type=sale&source=email
www.example.com/service-page?SESID=12345
If each of the example URLs gets indexed along with the original, a search engine would assume that there are 4 different pages with the same content.
Solution
Check your site for any internal links to content that might have these unnecessary URL parameters added. You can use a tool like Screaming Frog to quickly crawl your site and identify where those might be and get them corrected.
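If you’d rather script a quick spot check than run a full crawler, the sketch below fetches a single page and flags any internal links that carry query parameters. It only looks at one page (the start URL is a placeholder), but the same idea can be extended into a small crawl.

```python
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser
import urllib.request

START_URL = "https://www.example.com/"  # placeholder: a page on your own site


class LinkCollector(HTMLParser):
    """Collects href values from anchor tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


html = urllib.request.urlopen(START_URL, timeout=10).read().decode("utf-8", "ignore")
parser = LinkCollector()
parser.feed(html)

site_host = urlparse(START_URL).netloc
for href in parser.links:
    absolute = urljoin(START_URL, href)
    parts = urlparse(absolute)
    # Flag internal links that carry query parameters (tracking codes, session IDs, etc.).
    if parts.netloc == site_host and parts.query:
        print("Parameterized internal link:", absolute)
```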
At the very least, set up a canonical tag on the pages that have URL variations and point it to the correct URL for each page. This is a simple solution that indicates to search engines which page to show in search results and gives that page the SEO credit it deserves. Even if it means having a page that points to itself, you’ll still get the benefit of doing this.
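The canonical tag is a single line in the page’s head, for example <link rel="canonical" href="https://www.example.com/service-page">. The sketch below is a rough way to confirm that parameterized versions of a page declare the canonical URL you expect; the URLs are placeholders and the simple pattern matching is an assumption for a spot check, not a production-grade parser.

```python
import re
import urllib.request

# Placeholder parameterized URLs and the canonical URL each should declare.
PAGES = {
    "https://www.example.com/service-page?type=sale": "https://www.example.com/service-page",
    "https://www.example.com/service-page?SESID=12345": "https://www.example.com/service-page",
}

# Simple pattern for spot checks; assumes the rel attribute appears before href.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url, expected in PAGES.items():
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    match = CANONICAL_RE.search(html)
    found = match.group(1) if match else "(no canonical tag)"
    status = "OK" if found.rstrip("/") == expected.rstrip("/") else "CHECK THIS"
    print(f"{url}\n  canonical: {found} -> {status}")
```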
Issue – Syndicated Content
Good content is important for SEO and should be shared, but content syndicated incorrectly can dilute any SEO value your site would have gained. Content that lives on other sites along with yours may confuse search engines about which version is the original. They’ll do their best to show what they consider the most appropriate URL in search results, but it might not be the one you want.
Solution
Syndicate your content carefully! Include a link back to the original on each site you syndicate your content to. This goes a long way in indicating to search engines which version is the preferred one to index. You should also ask others that use your content to add a noindex meta tag, or even a canonical tag if it’s a direct copy.
How to Check
Look into your Search Console or Bing Webmaster Tools accounts. In Search Console, check under HTML Improvements. You’re looking for duplicate title tag and meta description issues. Bing Webmaster Tools has an SEO Report section under Reports & Data that will give you similar information you can act on.
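You can also spot-check a sample of pages yourself. The sketch below (with placeholder URLs) fetches each page and reports any title tag used by more than one URL; the same approach works for meta descriptions.

```python
import re
import urllib.request
from collections import defaultdict

# Placeholder URLs; substitute a sampling of your own pages.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/service-page",
    "https://www.example.com/about",
]

titles = defaultdict(list)
for url in URLS:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else "(missing title)"
    titles[title].append(url)

# Any title shared by more than one URL is a duplicate worth investigating.
for title, urls in titles.items():
    if len(urls) > 1:
        print(f'Duplicate title "{title}" used on:')
        for u in urls:
            print("  ", u)
```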
Lack of a Mobile Friendly Website
More people than ever are using mobile devices to find information, products, and services. Google has tried to make it very clear to website designers and owners that having a mobile friendly website is incredibly important. They have embraced the mobile-first philosophy so much that websites built with that in mind tend to get better search rankings, especially in mobile searches.
Issue
Have you ever tried watching a movie or show through your fingers? You can only see part of the action and have to move your head or hands around to see everything that’s happening. But chances are you’ll miss something. Looking at a non-mobile friendly website is exactly like that. You’ll need to use your fingers to adjust the website size on your device or have to slide content sideways to read each line of text.
How to Check
One simple way to test your website is to bring it up on a mobile device: a tablet or, even better, a phone. If the site content spills over the sides of the device, forcing you to drag it left and right with your finger to read, it’s not mobile friendly. Easy navigation may be a problem too. A responsive website should scale with the browser.
Alternatively, on a desktop or laptop, grab the left or right edge of the browser window and change its size. If a horizontal scrollbar shows up at the bottom, it’s not a responsive website.
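For a rough scripted check, one common signal of a responsive design is a viewport meta tag in the page’s head. The sketch below only tests for that tag, so treat it as a heuristic rather than a substitute for viewing the site on a real phone; the URL is a placeholder.

```python
import re
import urllib.request

URL = "https://www.example.com/"  # placeholder: your own home page

html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", "ignore")

# Responsive sites almost always declare a viewport meta tag such as
# <meta name="viewport" content="width=device-width, initial-scale=1">.
has_viewport = re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE)

if has_viewport:
    print("Viewport meta tag found - the page is at least set up for responsive layouts.")
else:
    print("No viewport meta tag found - the page is probably not mobile friendly.")
```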
Solution
You’ll need to get your website redesigned or recoded to be responsive to various device viewport sizes. Content should adjust and flow appropriately. It needs to include mobile friendly navigation to allow visitors to find what they’re looking for and to move easily from page to page.
Unfriendly URL Structure
How are your website’s URLs? Are they easy to read? A good URL structure should be on the shorter side and use dashes as spaces (Ex: www.example.com/services/my-first-service/). This makes URLs easier to share and remember. Search engines are also able to read them more easily, and if they match your title and content, they’ll be able to index them more appropriately. Besides, URLs are visible as part of search results too, lending another visual cue for people looking for answers.
Issue
If you have a website developer who creates pages for you, they may be less concerned about the URL structure than they are about getting the page(s) done within a deadline. Sometimes a misconfigured or old content management system could be generating an unfriendly URL structure for you unintentionally as well.
What does an unfriendly URL look like? Here are a few common examples.
Example 1: www.example.com/?p=34182
Example 2: www.example.com/ku0808676.html
Sometimes a dynamically generated URL will include a variable that’s needed to display the content, or the system will generate a URL that looks like a jumble of letters and numbers, as in the first two examples. This is more common with older content management systems but can still happen with a poorly set up new one.
Example 3: www.example.com/ilikecreatingcontent.html
Example 3 looks like a jumble of letters but, if you look closer, it’s a bunch of words smashed together. This usually happens when pages are created manually but can happen with an improperly configured content management system.
Example 4: www.example.com/services/my_first_service/
The simple mistake in Example 4 is the use of an underscore (_) as a space between words in the URL. While it seems that visually it would make more sense, that’s just not how a search engine sees it. Underscores are treated as characters, so my_first_service in the example above still reads as one long word.
Example 5: www.example.com/blog/10-ways-to-do-something-that-will-generate-tons-of-website-traffic-and-explodes-your-revenu
With the importance of Content Marketing, especially as it relates to SEO, overly wordy URLs are quickly becoming one of the most common problems today. This type of URL defeats the purpose of an easy to remember and share URL and degrades the user experience. Some search engines, Bing for example, might consider long URLs an attempt at keyword stuffing, which could be treated as a spam signal.
Note: Images and documents should also use friendly URL structures.
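If you build page URLs by hand or in your own code, a small helper can turn a title into a friendly slug: lowercase words separated by dashes, trimmed to a reasonable length. The sketch below is just one way to do it, and the 60-character cap is an arbitrary choice, not a rule from any search engine.

```python
import re


def make_slug(title: str, max_length: int = 60) -> str:
    """Turn a page title into a short, dash-separated, lowercase URL slug."""
    slug = title.lower()
    # Replace anything that isn't a letter or digit with a dash.
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    # Trim to the length cap without cutting a word in half.
    if len(slug) > max_length:
        slug = slug[:max_length].rsplit("-", 1)[0]
    return slug


print(make_slug("My First Service"))  # my-first-service
print(make_slug("10 Ways to Do Something That Will Generate Tons of Website Traffic"))
```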
How to Check
Simple! Look at how your URLs are displayed in a browser. Look at a sampling of simple and longer URLs across your whole site. Check your blog too, if you have one. If any match the examples earlier, you have a problem.
Solution
The solution to fix unfriendly URLs seems easy … simply change the URL structure of the pages that need it, or update how your content management system generates them through its settings. But you have to consider the impact that change will have across the web and in search engines.
You’ll need to set up 301 redirects for each of those pages as well. This will ensure that any SEO value held by the old URLs is transferred to the new versions. The redirects will also catch visitors coming from external links on other websites and from search engine results. Depending on the size of your website, hundreds of 301 redirects may be needed. The last thing you want to do is lose website traffic and leads. If you don’t understand 301 redirects, please leave it to a professional!
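Once your redirects are live, it’s worth confirming that every old URL really does return a 301 to its new home. Here’s a minimal sketch, assuming you keep the old-to-new mapping in a two-column CSV file (a made-up format for this example) and have the third-party requests library installed:

```python
import csv
import requests

# Hypothetical mapping file with two columns per row: old_url,new_url
MAPPING_FILE = "redirects.csv"

with open(MAPPING_FILE, newline="") as f:
    for old_url, new_url in csv.reader(f):
        response = requests.get(old_url, allow_redirects=False, timeout=10)
        status = response.status_code
        location = response.headers.get("Location", "")
        ok = status == 301 and location.rstrip("/") == new_url.rstrip("/")
        print(f"{old_url} -> {status} {location or '(no redirect)'} "
              f"{'OK' if ok else 'CHECK THIS'}")
```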
Unsecure Website (SSL Certificate)
Google is asking website owners and developers to take information security more seriously by implementing SSL certificates. To encourage this, Google made HTTPS a ranking signal in 2014, giving appropriately secured sites a bit of a boost in search results. An appropriate SSL certificate for your domain ensures that:
- Visitors’ activity is encrypted, protecting it from malicious tracking and information theft
- Man-in-the-middle attacks are prevented
- User trust increases, which in turn brings other business benefits
- Data integrity is maintained, preventing files from being corrupted in transfer
Issue
Traditional HTTP just isn’t good enough to protect against information theft. Coupled with the sophistication of modern cyber-attacks and the huge number of businesses that lack the resources to monitor their websites regularly, this leaves millions of people and businesses vulnerable.
How to Check
This one is super simple. Look at the URL in a browser window and check for https in it. Most browsers will display a lock icon too, depending on how the domain is set up. Still can’t tell? You can copy the URL from the browser and paste it into an email or document.
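If you’d like to go a step further than the browser lock icon, Python’s standard library can open a TLS connection and report who the certificate was issued to and when it expires. The hostname below is a placeholder for your own domain.

```python
import socket
import ssl

HOSTNAME = "www.example.com"  # placeholder: your own domain

context = ssl.create_default_context()
# The handshake raises an SSLError if the certificate is invalid or untrusted.
with socket.create_connection((HOSTNAME, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
        cert = tls.getpeercert()
        print("Certificate issued to:", dict(x[0] for x in cert["subject"]))
        print("Valid until:", cert["notAfter"])
```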
Solution
The good news is that many hosting companies are now offering low-cost SSL certificates with their hosting packages. Some of them are even free! They are also making it easy to get one implemented, usually with a phone call or by turning it on within your hosting account.
Be careful! Unless you’re building a website on a fresh, new domain, you should have an expert on hand to ensure the SSL setup is done properly. If it’s not, Google and other search engines could have issues with the redirects that can take months to get resolved. Also, if there’s a mix of secure and unsecured content on your website, search engines will consider your site’s security weakened and you won’t see the benefits of the switch at all.
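One quick way to spot that mix of secure and unsecured content is to scan a page for anything still referenced over plain http. The sketch below does a rough, single-page scan with a placeholder URL; a real audit should cover every template on your site, and keep in mind that ordinary links to other http pages are not the same problem as scripts, images, or stylesheets loaded over http.

```python
import re
import urllib.request

URL = "https://www.example.com/"  # placeholder: a page on your https site

html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", "ignore")

# Look for src/href attributes that still point at plain http URLs.
# Review the results: scripts, images, and stylesheets are true mixed content;
# plain links to other http pages are a lesser concern.
insecure = re.findall(r'(?:src|href)=["\'](http://[^"\']+)["\']', html, re.IGNORECASE)

if insecure:
    print("Insecure references found:")
    for ref in sorted(set(insecure)):
        print("  ", ref)
else:
    print("No http:// references found on this page.")
```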
Lack of a Search Engine Friendly XML Sitemap
Search engine crawlers use sitemaps, preferably XML sitemaps, to crawl and understand valuable information about a website. For example, meta-information about when a webpage was last updated or added, its importance compared to other pages within your site, and how frequently content is updated.
Issue
Without a sitemap, a search engine will attempt to discover most of your pages, but it won’t know the information mentioned earlier. You’ll likely end up with a less frequently crawled website, which can delay the indexing of new or updated pages. Sometimes pages may be left out of a search engine’s index completely!
Your website may have a sitemap linked somewhere in its navigation or footer. Search engines consider those HTML sitemaps. They can find any pages linked there, but those pages frequently get neglected and are missing links to new content. While it’s great for visitors to have access to an HTML sitemap, this method lacks the important meta-information that an XML sitemap provides.
How to Check
Login to your Search Console or Bing Webmaster Tools account and navigate to Sitemaps within Crawl (Search Console) or Configure My Site (Bing Webmaster Tools) if you have them. There you’ll be able to see any sitemaps that have been found by those search engines.
You can also bring up your website’s home page and append /robots.txt to the end. This should bring up a file that will list out any XML sitemaps linked to your site. If you don’t have one of those files or are missing sitemap links, you more than likely don’t have an XML sitemap.
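That robots.txt check can also be scripted. Here’s a minimal sketch with a placeholder domain:

```python
import urllib.request

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder domain

try:
    robots = urllib.request.urlopen(ROBOTS_URL, timeout=10).read().decode("utf-8", "ignore")
except Exception as error:
    robots = ""
    print("Could not fetch robots.txt:", error)

# robots.txt advertises sitemaps with lines like:
# Sitemap: https://www.example.com/sitemap.xml
sitemaps = [line.split(":", 1)[1].strip()
            for line in robots.splitlines()
            if line.lower().startswith("sitemap:")]

if sitemaps:
    print("Sitemaps listed in robots.txt:")
    for sitemap in sitemaps:
        print("  ", sitemap)
else:
    print("No Sitemap entries found in robots.txt.")
```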
Solution
If you have a WordPress website or similar content management system, find a trusted plugin/tool that will generate one for you and will submit it to search engines regularly.
Don’t have a content management system? You can find a free service on the web that can take your base domain and generate one for you but you’ll need to add it to your website and submit it yourself.
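If you’d rather build the file yourself, an XML sitemap is just a list of url entries inside a urlset element. The sketch below writes a minimal one from a hard-coded list of placeholder URLs; a real generator would pull the page list and last-modified dates from your site or CMS.

```python
from datetime import date
from xml.sax.saxutils import escape

# Placeholder URLs; a real generator would pull these from your site or CMS.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/my-first-service/",
    "https://www.example.com/blog/",
]

today = date.today().isoformat()
lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for page in PAGES:
    lines.append("  <url>")
    lines.append(f"    <loc>{escape(page)}</loc>")
    lines.append(f"    <lastmod>{today}</lastmod>")
    lines.append("  </url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("Wrote sitemap.xml with", len(PAGES), "URLs")
```

Once the file is on your server, submit its URL in Search Console and Bing Webmaster Tools and regenerate it whenever content changes.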
Take note! A proper XML sitemap should be updated and submitted to search engines every time there is new content. If this is out of your league, contact your website developer or find someone that can set one up professionally.
The reason all of these SEO rules exist is to cultivate a smooth, convenient browsing experience that helps people find what they need when they need it. Correcting and avoiding technical SEO issues like these will establish your website as a respected, user-friendly destination that serves the interests of internet users.
If your SEO problems have gotten out of hand and you’re struggling to stay on top of them, we’re here to help. At DTSquared, we regularly optimize websites for all essential SEO practices and keep them in line with all search engine rules and guidelines. Let’s optimize your website today!