The 3 Core Components of a Strong SEO Strategy
To optimize a site, you need to improve ranking factors in three areas — technical website setup, content, and links. So, let’s go through them in turn.
The Technical Setup
For your website to rank, three things must happen:
- First, a search engine needs to find your pages on the web.
- Then, it must scan them to understand their topics and identify their keywords.
- And finally, it needs to add them to its index — a database of all the content it has found on the web. This way, its algorithm can consider displaying your website for relevant queries.
Seems simple, doesn’t it? Certainly, nothing to worry about. After all, if you can visit your site without any problem, Google should be able to as well, right?
Unfortunately, there is a catch. A web page looks different for you and the search engine. You see it as a collection of graphics, colors, text with its formatting, and links.
To a search engine, it’s nothing but text.
As a result, any elements it cannot render as text remain invisible to the search engine. So even though your website looks fine to you, Google might find its content inaccessible.
Let me show you an example. Here’s how a typical search engine sees one of our articles. It’s this one, by the way, if you want to compare it with the original.
Notice a few things about it:
- The page is just text. Although we carefully designed it, the only elements a search engine sees are text and links.
- As a result, it cannot see the image on the page (note the element marked with an arrow). It only recognizes the image’s file name. If that image contained an important keyword we’d want the page to rank for, it would be invisible to the search engine.
That’s where technical setup, also called on-site optimization, comes in.
It ensures that your website and pages allow Google to scan and index them without any problems.
And the most important factors affecting it include:
Website navigation and links
Search engines crawl sites just like you would. They follow links. Search engine crawlers land on a page and use links to find other content to analyze. But as you’ve seen above, they cannot see images. So, set the navigation and links as text-only.
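To illustrate, here’s what crawler-friendly navigation might look like — a minimal sketch, with hypothetical page names and paths:

```html
<!-- Text-based navigation: crawlers can read the anchor text and follow the links -->
<nav>
  <a href="/blog/">Blog</a>
  <a href="/guides/keyword-research/">Keyword Research</a>
</nav>

<!-- Image-only link: without alt text, the crawler sees no words at all -->
<a href="/guides/keyword-research/"><img src="kw-guide-button.png" alt=""></a>
```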
Simple URL structure
Search engines don’t like reading lengthy strings of words with a complex structure. So, if possible, keep your URLs short. Set them up to include as little as possible beyond the main keyword for which you want to optimize the page.
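As an illustration — assuming `example.com` is your domain and “keyword research” is the keyword you’re targeting — compare:

```
https://example.com/keyword-research            ← short, built around the keyword
https://example.com/blog?id=412&cat=7&ref=nav   ← long, says nothing about the topic
```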
Page speed
Search engines use load time — the time it takes for a user to be able to read the page — as an indicator of quality. Many website elements can affect it. Image size, for example. Use Google’s PageSpeed Insights tool for suggestions on how to improve your pages.
Dead links or broken redirects
A dead link sends a visitor to a nonexistent page. A broken redirect points to a resource that might no longer be there. Both provide a poor user experience and also prevent search engines from indexing your content.
Sitemap and Robots.txt files
A sitemap is a simple file that lists all the URLs on your site. Search engines use it to identify which pages to crawl and index. A robots.txt file, on the other hand, tells search engines which parts of your site not to crawl (for example, specific policy pages you don’t want to appear in search). Create both to speed up the crawling and indexing of your content.
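For instance, a minimal pair of files might look like this — the `example.com` URLs and the `/legal/` path are placeholders, not a recommendation for your site:

```
# robots.txt — tells crawlers what to skip and where the sitemap lives
User-agent: *
Disallow: /legal/
Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml — one <url> entry per page you want crawled -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/keyword-research</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```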
Duplicate content
Pages containing identical or very similar content confuse search engines. They often find it near impossible to determine which version to display in search results. For that reason, search engines treat duplicate content as a negative factor and, upon finding it, can penalize a website by not displaying any of those pages at all.
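A widely used remedy, if you do end up with near-duplicate pages, is a canonical tag that points search engines at your preferred version (the URL below is a placeholder):

```html
<!-- Placed in the <head> of each duplicate page; names the preferred version -->
<link rel="canonical" href="https://example.com/keyword-research">
```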