Building a website is both a time-consuming and somewhat tedious process, so before you begin your journey, it's vital to understand that your website needs to meet a certain standard if it is to have any chance of succeeding on the World Wide Web.
If it doesn't, search engines such as Google won't send you any traffic, or, worse, may hit you with a penalty or de-index your website altogether. Therefore, you need to understand what search engine optimisation is and why it's a vital part of succeeding on the Internet.
SEO is an important part of web development. If you run a website or a blog, the goal is to allow users to locate your website on the World Wide Web.
Search Engine Optimisation isn't easy, but in the following section we will give you lots of tips and tricks on how to make your website as visible as possible.
SEO is an acronym for "Search Engine Optimisation", a method used to make a website more visible in search engines on the Internet.
The purpose of SEO is to make a website appear on the first page or two of the search results. Previously, search engines based their rankings on keywords, but today the process has become more complicated. A high search engine ranking now depends more on the quality of the content and the number of links a website has.
Google remains the dominant search engine worldwide, and therefore we will focus on the Big G. You need to understand how Google works, so you know what actions you should take to make sure Google can find your website.
When Google (and other search engines) first started, keywords were the most important thing, and Web developers used a META tag in the HTML code to convey keywords to Google. The problem with an algorithm based on keywords is that it was easy to abuse. Therefore, Google made significant changes to its algorithm and stopped ranking sites based on keywords alone.
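For the historical record, the keywords META tag looked like the sketch below (the example keywords are made up; Google has long since stopped using this tag for ranking):

```html
<head>
  <!-- Historical example only: Google no longer uses this tag for ranking -->
  <meta name="keywords" content="cheap shoes, buy shoes online, shoe shop">
</head>
```

Because the tag was invisible to visitors, site owners could stuff it with any keywords they liked, which is exactly why it became easy to abuse.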
Before 2011, the manipulation of search results was a big problem, and the search engine was full of low-quality sites that offered very little. In 2011, Google introduced a new algorithm (Google Panda) to combat this manipulation. The focus of the algorithm shifted from pure keywords to the user experience.
Over the past few years, Google has launched a couple of new algorithms. In 2012 came Google Penguin, and in 2013 Google Hummingbird was born. This new system makes it difficult for Webmasters to cheat their way to a high search engine ranking.
Google has moved away from keyword-based search in favour of semantic search, and now uses so-called Latent Semantic Indexing (LSI) - the indexing of keywords that are related in meaning to each other.
The latest from Google is the so-called "mobilegeddon." This means that sites that don't have a mobile friendly design will appear lower down in search results when being searched on mobile devices. Sites that can be read on any device, including mobile phones, will be favoured by Google.
You can check whether your website is mobile friendly at google.com.
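A common baseline for a mobile-friendly page is a responsive viewport declaration in the page's head, combined with CSS that adapts to the screen width. A minimal sketch:

```html
<head>
  <!-- Tell mobile browsers to render at the device width
       instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```

Without this tag, most mobile browsers assume a desktop-width page and shrink it to fit, which is one of the things that makes a site read as "not mobile friendly".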
Google's algorithm has changed a lot over the last few years, at least in the way the search engine ranks websites.
In the past, sites that had hundreds or thousands of links from other sites generally got a higher PageRank, but Google fought against this, and websites trying their hand at "dirty tricks", such as link buying, risk being blacklisted.
Today Google wants to see more quality, and bases your search engine ranking on how much it trusts your site.
Such trust is not achieved overnight. Four factors play a major role in gaining Google's trust, and thus improving SEO:
Authority: This means how important or serious Google considers your website to be. For example, if another website that Google already trusts links to your website, this will have a positive effect and help your site gain greater authority.
Content: Quality content is absolutely king. If your website has well-researched and thorough content, this will increase trust.
Age: Older domains will always have more trust with Google, especially if the domain has focused on the same category for many years. New domains will have little trust with Google at first.
Social signals: Social websites can certainly help to boost Google's confidence in your site. If many users like your website and share it on Facebook and Twitter, Google will interpret this as a sign that your site is of high quality.
White-Hat SEO is a method used to improve rankings without resorting to "dirty tricks." Black-Hat SEO means improving rankings by using "dirty tricks", and Gray-Hat SEO sits in a gray zone between the two. White-Hat SEO involves measures that Google itself recommends.
We have previously mentioned three mainstays that help to gain the trust of Google: authority, content and the age of the domain. These principles apply regardless of which method you use.
These are some of the recommendations for effective White-Hat SEO:
If you try to achieve a high search engine ranking in a short period of time, this may have a devastating effect. Google can detect this and issue your website with a penalty, which will result in a low search engine ranking.
Google Sandbox is a loose term that describes Google's way of "punishing" webpages. Sites that do not have Google's trust, or that have lost it, can end up in the Google Sandbox; this is also called sandboxing or the Sandbox Effect.
If you attempt to manipulate search results in Google, your website may end up in the sandbox. A classic example is a fresh website on a new domain whose owners try to influence its ranking in a short time, without first earning Google's trust.
Before 2011 it was relatively easy to register a new domain and achieve a high ranking in Google in a short time. Webmasters used so-called Black-Hat SEO techniques to manipulate rankings. Google's new algorithms put a stop to this, partly because fresh domains no longer automatically get Google's trust.
If you over-optimise your website, it can end up in Google's sandbox. Over-optimisation can, for example, mean generating many links from the same source in a short period of time. If all the links contain the same keywords, then you're in trouble.
A website can also end up in Google's sandbox if it does not have unique content and has more or less copied content from other sites, though large established sites are generally immune to this.
I'm sure we've all come across well-known big websites that are doing things clearly against Google's guidelines. One of the most common violations is inserting do-follow footer links to other sites that belong to them. Amazon and Expedia are two prime examples, yet Google never takes any action against them, even though this is something it clearly tells Webmasters not to do.
So why do small websites that do this get penalised, while the big boys get let off? Well, there are probably several reasons. First, the big websites usually invest heavily in Google AdWords, which is what has made Google the multibillion-dollar business it is today. I'm sure Google (just like any other business) wants to look after its best customers and is prepared to overlook a few violations if it keeps everybody happy.
Another reason, and probably the biggest, is that these big websites do offer a lot of value to their customers. For example, Amazon is a massive online shopping site that sells thousands upon thousands of products at very reasonable prices. It is also a household name that offers a safe and trusted online shopping experience. If Google were to take action against Amazon and other big names and push them way back in the search results, who would take their place? The answer is small, untrusted websites that people are not familiar with. Google doesn't want this. It wants the biggest and best sites to appear at the top of its search results, even if they are not playing 100% fair.
This is why you should never copy the tactics of another website that is achieving a high ranking: Google (unfairly) treats every website differently, and certainly appears to give special treatment to the biggest websites.
There are quite a few sites that can be used to check whether a website has been blacklisted by Google, such as Sandbox Checker.
Black-Hat SEO is a concept that involves optimising rankings with "dirty tricks." This can be done by pointing thousands of links, in various formats, at what is basically a low-quality website in order to increase its search relevance.
It can also be done by writing lots of keywords in the META tag. The intention is that Google will detect keywords, thereby increasing the relevance of the particular website. If you try this method, your website will probably be blacklisted by Google, though big sites such as Amazon seem to get away with it.
Invisible text is another Black-Hat SEO technique. This is possible with CSS, for example by setting the font colour to the same colour as the background of the webpage. It is then possible to stuff the page with keywords without anyone seeing them, except search engines. This method will cause Google to blacklist your website.
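As an illustration of what not to do, the hidden-text trick amounts to a CSS rule like the following (the class name is made up for this sketch):

```css
/* Black-Hat example - do NOT use this. White text on a white
   background hides keyword-stuffed paragraphs from visitors,
   but crawlers still read the underlying HTML. */
.hidden-keywords {
  color: #ffffff;            /* text colour matches... */
  background-color: #ffffff; /* ...the page background */
}
```

Modern search engines render pages much as a browser does, so this kind of hidden text is easily detected and is a reliable way to earn a penalty.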
"Link farming" is a classic example of Black-Hat SEO. It involves creating thousands of links that point to your website. These are links that have no function other than to fool search engines.
"Content farming" is about writing articles with "garbage text" that is not very useful for humans but is designed only to manipulate search engines. This is a method that was common a few years back, but Google has managed to combat this with new algorithms.
Gray-Hat SEO sits between Black-Hat and White-Hat SEO. You should know what you are doing before attempting Gray-Hat SEO, because this method can get your website banned by Google if you do something wrong.
A keyword is not necessarily just one word, but may consist of several words, or a phrase.
Keyword Planner is a tool from Google for users of Google AdWords. It is an auxiliary tool for placing advertisements on web pages. The tool provides keyword suggestions and tells you how many people search for a given keyword each month.
PageRank is an algorithm that Google uses to rank sites in its search engine. The basis for the ranking is authority, which proves quite difficult to influence, as you have to work pretty hard on your website to achieve it.
Authority means how important or serious Google considers your website. When other sites link to your website, it increases your website's authority. The greater authority the linking site has, the more authority your website will receive.
"The problem" is that PageRank is updated very rarely, only every three to six months. A change will not happen overnight, even if you have managed to acquire quality links to your website.
PageRank uses a scale from 0 (worst) to 10 (best) and has the following categories:
You can use the website checkpagerank.net to check the PageRank of a specific website.
A freshly installed WordPress website is basically quite well optimized for search engines, but there are still some steps you can take.
The first thing you should do with a new WordPress website is to change the permalink structure. By default, WordPress uses a query-string parameter (?p=123) in the URL. This should be changed so that the title of the post appears at the end of the URL.
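The difference looks like this (example.com and the post title are hypothetical; the pretty structure corresponds to the "Post name" option under Settings → Permalinks in the WordPress admin):

```text
Default permalink (query string):
  https://example.com/?p=123

"Post name" permalink structure:
  https://example.com/sample-post-title/
```

The second form puts the post's title, and therefore its keywords, directly in the URL, which is both friendlier to readers and more descriptive to search engines.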
You can use an extension like SEO by Yoast to help you.
Also take advantage of built-in functions such as categories and tags as this makes it easier for visitors to find content on your website.