With the advent of web algorithms and analytics, most of us have become aware of the importance of site crawlability. If a site cannot be “crawled”, how can it rank in the search engine results at all? In short, you want your site crawled.
If you have no idea of what I mean, though, let me explain using an analogy.
Think of it this way: an actual spider encounters a framed picture on the wall and wants to see what is depicted there. Should it crawl over the picture, go around it, or just forget about it? The thing is, unless it walks across the whole surface, it cannot take in the full picture and judge whether it is any good.
In the same sense, an online spider or Internet bot (really just a program) “crawls” your site for web indexing. It records links, updates, content, and so on. Through this indexing, it gives Google, Yahoo, Bing, and the others a bigger picture of what the site and its pages are about. Only then do these search engines analyze, with their own tools, what has been indexed to determine your site's authority and rank.
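To make the idea concrete, here is a minimal sketch of the two things a crawler does on each page: extract the links it should follow next, and extract the text that feeds the index. This uses only Python's standard library; real search-engine crawlers are, of course, vastly more sophisticated, and the sample HTML below is made up for illustration.

```python
from html.parser import HTMLParser

class LinkAndTextParser(HTMLParser):
    """Collects href links and visible text from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        # Every <a href="..."> is a path the crawler could follow next.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # The visible text is the raw material a search index is built from.
        if data.strip():
            self.text_parts.append(data.strip())

def index_page(html):
    """Return (links, text) extracted from one page of HTML."""
    parser = LinkAndTextParser()
    parser.feed(html)
    return parser.links, " ".join(parser.text_parts)

page = '<html><body><h1>Hello</h1><a href="/about">About us</a></body></html>'
links, text = index_page(page)
print(links)  # ['/about']
print(text)   # Hello About us
```

A full crawler would then fetch each discovered link and repeat the process, which is why internal linking matters so much for crawlability: pages nothing links to may simply never be found.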
And therein lie your site’s chances of landing on the first page of search results, if not in the top ten.
Getting to the Top
As Google put it, “It is important that the URLs you want…to display in Search results can be first found by Google.” Everything else should then be geared toward the goal of landing at the top of the organic search results. Not that Google said that exactly, but that is the point of getting your site crawled in the first place: you want to be found quickly by your intended audience, and to keep that audience.
Why is this so important? Studies have shown that, with people’s short attention spans, they tend to click only on the first ten links. What are the odds that they will find and click on your link if it sits on page 5, 12, 27, 48…100? Very slim. Most people do not have the patience even to skim quickly through the pages.
Now, for one reason or another, you might want to purposefully hide certain URLs. That can be done, although you should decide which content you still want to appear in results, so your site does not go missing entirely and can still be found.
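The standard mechanisms for this are a `robots.txt` file at the site root (which asks crawlers not to fetch certain paths) and the `robots` meta tag (which lets a page be crawled but asks that it not be indexed). The paths below are placeholders, not a recommendation for any particular site:

```
# robots.txt -- served from the root of the domain
User-agent: *
Disallow: /private/
Allow: /private/press/

Sitemap: https://www.example.com/sitemap.xml
```

And on an individual page you want crawled but kept out of search results:

```
<meta name="robots" content="noindex">
```

Note the difference: `Disallow` blocks crawling, not indexing, so a blocked URL can still appear in results if other sites link to it; `noindex` only works if the page is *not* blocked from crawling, since the bot has to fetch the page to see the tag.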
Content, Content, Content!
Content is at the heart of this whole search business. It can make or break your site, no matter how well you do your SEO. Good SEO does bring in visitors, but they leave if they do not find the information they need, and so much for user conversion.
It does not matter how big your site is, how great it looks, or how much content it has. If the engines determine through their crawlers that the site gravely lacks unique content, or that you indulge in the much frowned-upon black-hat SEO methods, there is little to zero chance of getting anywhere near the top. Instead, there is a big chance of your site being penalized and sent into virtual oblivion.
Not quite making it “there”, though, does not necessarily mean you have been penalized. It could be that you are doing things incorrectly and falling into common pitfalls, which affects the way bots crawl and index your site. So, as anywhere else, avoid the pitfalls. Audit your site. Use the search engines’ own tools to find what you have missed.
Always apply the best practices. That should assist you on your journey to the top.
This is my first #ThursdayTips, and my first tips post after the Free Site Audit Test Tools one. In my July monthly recap, I mentioned that I was going to move the schedule from Tuesday to Thursday. It may seem that I write only SEO tips, but I don’t. In fact, you can check out my Tips Jar and WordSmithereens Niche pages. Those were written in the pre-Tuesday Tips days, and SEO is simply what I have in “abundance” right now. I do plan to share more non-SEO stuff, I promise 🙂
Once again I enjoy all the techno brain snacks you bring to the table. Great post, Gi!
Aaaw…shucks…. 🙂 Thanks, Roo!