8 Technical SEO Problems That Are Holding Back Your Content

Optimize Your Rankings

Technical SEO has certainly fallen out of fashion somewhat with the rise of content marketing, and rightly so.

Content marketing attracts and delivers real value to users, and can put you on the map, putting your brand in front of far more eyeballs than fixing a canonical tag ever could.

While content is at the heart of everything we do, there is a risk that ignoring a site's technical set-up and diving straight into content creation will not deliver the required returns. Failure to properly audit and resolve technical concerns can disconnect your content efforts from the benefits they should be bringing to your website.

The following eight issues should be considered before committing to any significant campaign: 

1. Not hosting important content on the main site 

For whatever reason, websites often keep their best content off the main website, either on subdomains or on separate sites altogether. Usually this is because it is considered easier from a development point of view. The issue with this? It's straightforward.

If content is not in your main site's directory, Google won't treat it as part of your main site. Any links earned by subdomains will not be passed to the main site in the same way as they would if the content sat in a directory on the site.

Sistrix published a great case study on the jobs site Monster, which recently migrated two subdomains into its main site and saw a 116% uplift in visibility in the UK. The graph speaks for itself:

We recently worked with a client who came to us with a thousand referring domains pointing at a blog subdomain. This represented one third of their total referring domains. Can you imagine how much time and effort it would take to build one thousand referring domains?

The cost of migrating content back into the main site is minuscule in comparison with earning links from one thousand referring domains, so the business case was straightforward, and the client saw a sizeable boost from this.
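
A minimal sketch of the migration mechanics, assuming an Apache server and a hypothetical blog.example.com subdomain being folded into www.example.com/blog/ – every old URL should 301 redirect to its new location in a single hop:

    # .htaccess served for blog.example.com (illustrative only)
    RewriteEngine On
    # Permanently redirect every request on the old subdomain to the
    # matching path under /blog/ on the main site
    RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/blog/$1 [R=301,L]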

2. Not making use of internal links 

The best way to get Google to crawl your content and pass equity between sections of the website is through internal links.

I like to think of a website's link equity as heat which flows through the site via its internal links. Some pages are linked to generously and so are really hot; other pages are really cold, receiving little heat from the rest of the site. Google will struggle to find and rank these cold pages, which massively limits their effectiveness.

Suppose you've created a great piece of functional content around one of the key pain points your customers face. There's plenty of search volume in Google and your site already has a decent amount of authority, so you expect to gain visibility quickly – but you publish the content and nothing happens!

You've hosted your content in some cold directory, miles away from anything that regularly receives visits, and it's suffering as a result.

This works both ways, of course. Say you have a page with lots of external links pointing to it, but no outbound internal links – this page will be super hot, but it's hoarding the equity that could be used elsewhere on the site.

Take a look at this great piece of content created about Bears Ears National Park.

Ignoring the fact that it breaks rule No. 1 and sits on a subdomain, it's pretty cool, right?

But they've only included a single link to the main site, and it is buried in the credits at the bottom of the page. Why not make the logo a link back to the main site?

You will probably have plenty of content pages that are great magnets for links, but more than likely these are not your key business pages. You need to ensure relevant links are included between hot pages and key pages.
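
As a trivial sketch of what that looks like in the markup (URLs here are made up): the hosted piece could link its logo back to the main site, and the body copy could point readers at a key commercial page.

    <!-- Logo links back to the main site rather than being a dead image -->
    <a href="https://www.example.com/">
      <img src="/images/logo.png" alt="Example brand home">
    </a>

    <!-- Contextual link passing equity from a hot content page to a key page -->
    <p>Ready to get started? See our
      <a href="https://www.example.com/services/">services page</a>.</p>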

3. Poor crawl efficiency 

Crawl efficiency is a massive issue we see all the time, especially with bigger sites. Essentially, Google only has a limited budget of pages it will crawl on your site at any one time. Once it has exhausted that budget it will move on and return at a later date.

If your website has an unreasonably large number of URLs, Google may get stuck crawling unimportant areas of your website while failing to index new content quickly enough.

The most common cause of this is an excessively large number of query parameters being crawlable.

You may see parameters like the following at work on your website:

  • https://www.example.com/dresses 
  • https://www.example.com/dresses?category=maxi 
  • https://www.example.com/dresses?category=maxi&colour=blue 
  • https://www.example.com/dresses?category=maxi&size=8&colour=blue 

Parameters like these are rarely search friendly. Creating several variations of a single URL for engines to crawl individually is one major crawl budget black hole.

River Island's faceted navigation creates a unique parameter for every combination of filters you can click:

This creates thousands of different URLs for each category on the site. While they have implemented canonical tags to specify which pages they want in the index, this does not control which pages are crawled, and a lot of their crawl budget will be wasted here.
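
For reference, a canonical tag on one of the parameterised URLs from the earlier example might look like the snippet below. It tells engines which version to index, but, as noted, it does not stop them crawling the variants.

    <!-- In the <head> of https://www.example.com/dresses?category=maxi&colour=blue -->
    <link rel="canonical" href="https://www.example.com/dresses">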

Google have released their own guidelines on how to properly implement faceted navigation, which is definitely worth a read.

As a general rule, however, we recommend blocking these parameters from being crawled, either by marking the links themselves with a nofollow attribute, by using robots.txt, or via the URL parameters tool within Google Search Console.
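
A minimal sketch of the robots.txt approach, reusing the parameter names from the example URLs above (adjust to your own facets) – wildcard rules like these stop compliant crawlers from requesting the faceted variations:

    # robots.txt – parameter names are illustrative
    User-agent: *
    Disallow: /*?*category=
    Disallow: /*?*colour=
    Disallow: /*?*size=

The nofollow alternative is simply a matter of adding rel="nofollow" to the facet links themselves, e.g. <a href="/dresses?colour=blue" rel="nofollow">Blue</a>.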

All priority pages should be linked to from elsewhere on the site anyway, not just from the faceted navigation. River Island has already got this part covered:

Another common cause of crawl inefficiency arises from having multiple versions of the website accessible, for instance:

  • https://www.example.com 
  • http://www.example.com 
  • https://example.com 
  • http://example.com 

Even if the canonical tag specifies the main URL as the default, this won't stop search engines from crawling the other versions of the site if they remain accessible. This is particularly relevant if those other versions have a considerable number of backlinks.

Keeping all versions of the site accessible makes four versions of every page crawlable, which will kill your crawl budget. Redirect rules should be set up so that any request for a non-canonical version of a page is 301 redirected to the preferred version in a single step.
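
As a sketch of what that single-step redirect might look like – assuming Apache handles the TLS, and that HTTPS with www is the preferred version – the three non-canonical host/protocol combinations are all sent straight to the canonical one:

    # .htaccess for example.com (illustrative only)
    RewriteEngine On
    # If the request is not HTTPS, or the host is not www.example.com,
    # 301 redirect to the preferred version in a single hop
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]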

One final example of wasted crawl efficiency is broken or redirected internal links. We once had a client query the amount of time it was taking for content in a certain directory to get indexed. From crawling the directory, we saw immediately that every single internal link within it pointed to a version of the page without a trailing slash, and a redirect then forced the trailing slash on.

Essentially, for every link followed, two pages were requested. While broken and redirected internal links are not a massive priority for most sites – the benefit of fixing them rarely outweighs the resource required – it is definitely worth resolving priority issues (such as links within the main navigation, or, in our case, whole directories of redirecting links), especially if you have a problem with how quickly your content is being indexed.
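
A fix like the following – pointing internal links at the final URL rather than the redirecting one – removes the extra request (URLs are hypothetical):

    <!-- Before: every click triggers a 301 to add the trailing slash -->
    <a href="https://www.example.com/guides/blue-gadgets">Blue gadget guide</a>

    <!-- After: link straight to the final, trailing-slash version -->
    <a href="https://www.example.com/guides/blue-gadgets/">Blue gadget guide</a>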

Just imagine if your site had all three issues! Endless crawlable parameters across four versions of the site, all with double the number of pages requested!

4. A lot of thin content 

In the post-Google Panda world we live in, this really is a no-brainer. If your website has a lot of thin content pages, then sprucing up one page with 10x better content is not going to be enough to hide the deficiencies your website already has.

The Panda algorithm essentially builds a score of your website based on the amount of unique, valuable content you have. Should the majority of your pages not meet the minimum threshold required to be considered valuable, your rankings will fall.

While everybody wants the next big viral idea on their website, when doing our initial content audit it's more important to look at the existing content on the site and ask the following questions: Is it valuable? Is it performing? If not, can it be improved to serve a need? Removing content may be required for pages that cannot be improved.

Content hygiene is more important initially than the "big hero" ideas, which come at a later point in the relationship.

5. A lot of content with overlapping keyword targeting 

We still see websites making this mistake in 2017. For instance, if our main keyword is blue gadgets and it is being targeted on a service page, we might decide to write a blog post about blue gadgets too! Because it's our main service offering, let's put a promotional spot on our homepage about blue gadgets. Oh, and of course, you also have a "features of our blue gadgets" page.

No! Just stop, please! The rule of one keyword per page has been around for about as long as SEO, but we still see this mistake being made.

You should have one master hub page which contains all the top-line information about the topic your keyword is referencing.

You should only use separate pages where there is significant search volume around long-tail variations of the term, and on these pages target the long-tail keyword and the long-tail keyword only.

Then link prominently between your main topic page and your long-tail pages.

If you have any additional pages which do not provide any search benefit, such as a features page, then consider consolidating the content onto the hub page, or preventing the page from being indexed with a meta robots noindex attribute.

So, for example, we have our main blue gadgets page, and from it we link out to a blog post on why blue gadgets are better than red gadgets. Our blue gadgets features page has been removed from the index and the homepage has been de-optimized for the term.
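
A sketch of how the features page in this example could be kept out of the index while still passing equity through its links, using the meta robots attribute mentioned above:

    <!-- In the <head> of the hypothetical blue gadgets features page -->
    <meta name="robots" content="noindex, follow">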

6. Lack of website authority 

But content marketing attracts authority naturally, you say! Yes, this is 100% true, but not all types of content marketing do. At Busy With SEO, we've found the best ROI on content creation comes from evergreen, functional content which satisfies search intent.

When we take on a new client we do a massive keyword research project which identifies every possible long-tail search around the client's products and services. This gives us more than enough content ideas to go about bringing in top-of-funnel traffic that we can then attempt to strategically push down the funnel through the creative use of different channels.

The great thing about this strategy is that it requires no promotion. Once it becomes visible in search, it brings in traffic consistently without any additional budget.

One consideration before undertaking this strategy is the amount of authority a website already has. Without a level of authority, it is extremely difficult to get a web page to rank for anything of value, regardless of the content.

Links still matter in 2017. While brand relevance is the new No. 1 ranking factor (certainly for highly competitive niches), links are still very much No. 2.

Without an authoritative website, you may have to step away from creating informational content for search intent, and instead focus on more link-bait types of content.

7. Lack of data 

Without data it is impossible to make an informed decision about the success of your campaigns. We use a wealth of data to make informed decisions before creating any piece of content, then use that same data to measure our performance against those objectives.

Content should be consumed and shared, customers retained and engaged.

Keyword tools like Storybase will provide plenty of long-tail keywords on which to base your content. Ahrefs' Content Explorer can help validate content ideas by comparing the performance of similar ideas.

I also love using Facebook page insights on custom audiences (built from website traffic or an email list) to extract vital information about our customer demographics.

Then there is Google Analytics.

Returning visits and pages per session measure customer retention.

Time on page, exit rate and social shares can measure the success of the content.

Number of new users and bounce rate are a good indication of the engagement of new users.

If you're not tracking the above metrics, you may be pursuing a strategy which simply does not work. Worse still, how can you build on your past successes?
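
If the basic tracking isn't in place yet, the standard Google Analytics (analytics.js) snippet is enough to start collecting these metrics – a minimal sketch with a placeholder property ID:

    <!-- Google Analytics – replace UA-XXXXXXX-Y with your property ID -->
    <script>
    window.ga=window.ga||function(){(ga.q=ga.q||[]).push(arguments)};ga.l=+new Date;
    ga('create', 'UA-XXXXXXX-Y', 'auto');
    ga('send', 'pageview');
    </script>
    <script async src="https://www.google-analytics.com/analytics.js"></script>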

8. Slow page load times 

This one is a no-brainer. Amazon estimated that a one-second increase in page load times would cost them $1.6 billion in sales. Google have published videos, documents and tools to help webmasters address page load issues.

I see poor page load times as a symptom of a much wider issue: the website in question clearly hasn't considered the user at all. Why else would they ignore arguably the biggest usability factor?

These websites typically tend to be cumbersome, offer little value, and what content they do have is woefully self-serving.

Striving to resolve page speed issues is a commitment to improving the experience users have of your website. That kind of mentality is crucial if you want to build an engaged user base.

Some, if not all, of these topics justify their own blog post. The overriding message from this post is about maximizing the return on investment for your efforts.

Everybody is looking for the next big bang idea, but most aren't ready for it yet. Technical SEO should work hand in hand with your content marketing efforts, letting you squeeze out the ROI your content deserves.

