Search Engine Optimization After Mids
Lecture 8
SEO
SEO stands for Search Engine Optimization. It is the process of optimizing a website for search engines so that it achieves a higher ranking in search results when people search for keywords related to its products and services. In short, SEO is the practice of increasing the quantity and quality of traffic to a website through organic search engine results.
Search results are presented as an ordered list, and sites higher on the list tend to receive more traffic. For a given search query, the result in the number one position receives roughly 40 to 60% of the total traffic generated for that query, and only 2 to 3% of visitors go beyond the first page of results.
How Does Search Engine Optimization Work?
Search engines such as Google have their own algorithms, or rules, for deciding the order in which pages appear for a search query. These algorithms determine SERP rankings based on many ranking factors, but they place particular emphasis on a few key metrics when evaluating the quality of a page and deciding its ranking.
Key Metrics Used by Search Engines
Links: Links from other sites, called backlinks, help determine a site's ranking in SERPs. A link is treated as a vote of quality from another website, since a site owner would not normally link to a poor-quality site.
Content: The quality of content is another vital parameter in determining a site's ranking. Content should be unique and relevant to the given search query.
Page Structure: Web pages are written in HTML, and the HTML code of a page is also used by search engines to evaluate it. Include important keywords in the title, URL, and other meta tags, and make sure the site is crawlable.
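To make this concrete, here is a minimal sketch (not from the lecture) of how a program can read those on-page signals. It uses only Python's standard library; the sample HTML and its wording are hypothetical.

```python
# Sketch: extract the on-page signals search engines read (title, meta tags)
# using only Python's standard-library HTML parser.
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}  # meta name -> content

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page showing keyword placement in the title and meta tags.
sample = """<html><head>
<title>Buy Handmade Leather Wallets | Example Store</title>
<meta name="description" content="Handmade leather wallets with free shipping.">
<meta name="keywords" content="leather wallet, handmade wallet">
</head><body>...</body></html>"""

parser = MetaExtractor()
parser.feed(sample)
print(parser.title)  # Buy Handmade Leather Wallets | Example Store
print(parser.meta)   # {'description': '...', 'keywords': '...'}
```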
White Hat SEO Techniques
Good content
Proper use of titles, keywords and meta tags
Ease of Navigation
Site Performance
Quality Inbound/Back Links
Mobile Friendliness
Black Hat SEO Techniques
Keyword Stuffing
Cloaking
Hidden Text
Doorway Pages
Article Spinning
Duplicate Content
Page Swapping
Link Farms
URL Hijacking
Improper Use of Snippets
Keyword Stuffing: Search engines analyse the keywords and key phrases on webpages in order to index websites. To exploit this, some SEO practitioners inflate keyword density to get a higher ranking, which is considered a black hat technique. A keyword density between two and four percent is considered optimal; pushing density beyond that irritates your readers and hurts your ranking.
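As a rough illustration (not part of the lecture), keyword density can be computed as occurrences of the keyword divided by total words. A minimal Python sketch, with hypothetical sample text:

```python
# Sketch: keyword density = keyword occurrences / total words * 100.
# Values far above the 2-4% range described above suggest stuffing.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0

page = ("Our gold coins are pure gold. Buy gold coins online and track "
        "the gold price daily with our free gold price alerts.")
density = keyword_density(page, "gold")
print(f"density: {density:.1f}%")
if density > 4.0:
    print("warning: possible keyword stuffing")
```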
Cloaking: It refers to coding webpages so that search engines see one set of content while visitors see another, e.g., a user searching for "gold price" clicks on a search result titled "current gold price" and is greeted with a travel and tourism site. This practice violates search engines' guidelines, which say to create content for users, not for search engines.
Hidden Text: Text that search engines can see but readers cannot is known as hidden text. This technique is used to incorporate irrelevant keywords and to hide text or links in order to increase keyword density or improve internal link structure. Common ways to hide text are setting the font size to zero, using CSS to position text off-screen, and creating white text on a white background.
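Those tricks leave recognizable traces in a page's CSS. As an aside not from the lecture, here is a heuristic Python sketch that flags inline styles matching the patterns listed above; a real detector would also have to resolve external stylesheets.

```python
# Heuristic sketch: flag inline styles that match common hidden-text tricks
# (zero font size, far off-screen positioning, white text on white background).
import re

SUSPICIOUS_PATTERNS = [
    r"font-size\s*:\s*0",        # text rendered at zero size
    r"left\s*:\s*-\d{3,}px",     # text pushed far off-screen
    r"color\s*:\s*#?fff\b.*background(-color)?\s*:\s*#?fff\b",  # white on white
]

def hidden_text_flags(inline_style: str) -> list:
    style = inline_style.lower()
    return [p for p in SUSPICIOUS_PATTERNS if re.search(p, style)]

print(hidden_text_flags("font-size:0; color:#fff; background:#fff"))
```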
Doorway Pages: Poorly written pages that are rich in keywords but contain no relevant information, existing mainly to redirect users to an unrelated page, are called doorway pages. Black hat SEO professionals use these pages to pass user traffic on to unrelated sites.
Article Spinning: It involves rewriting a single article to produce multiple copies, each of which looks like a new article. The content of such articles is repetitive, poorly written, and of low value to visitors. Such articles are uploaded regularly to create the illusion of fresh content.
Duplicate Content: Content copied from one website and published on another as original content is known as duplicate content. This black hat technique amounts to plagiarism.
Page Swapping (Bait-and-Switch): In this technique, you first get the webpage indexed and ranked in search engine listings, then change the content of the page entirely. The user who clicks the result in the SERP is thus diverted to a different page than the one that was ranked.
Link Farms: A link farm is a website or collection of websites intended to increase the link popularity of a site by increasing its number of incoming links. It is considered black hat SEO because link farm sites have low-quality, irrelevant content.
URL Hijacking (Typosquatting): Here, a domain name that is a misspelled version of a popular
website or a competitor's site is registered in an attempt to mislead the visitors. For example,
whitehouse.com may mislead users who want to visit whitehouse.gov.
Improper Use of Snippets: In this black hat SEO technique, snippets that are not relevant to your site or page are used to drive traffic to a website, for example, using a review snippet even though the page contains no reviews.
How a Search Engine Works
The work of a search engine is divided into three stages: crawling, indexing, and retrieval.
Crawling
This is the first stage, in which a search engine uses web crawlers to discover webpages on the World Wide Web. A web crawler is a program used by search engines such as Google to build an index. It is designed for crawling, the process in which the crawler browses the web and stores information about the pages it visits in the form of an index.
So search engines use web crawlers, or spiders, to perform crawling, and the crawler's task is to visit a web page, read it, and follow the links to the site's other pages. Each time the crawler visits a webpage, it makes a copy of the page and adds its URL to the index. After adding the URL, it revisits the site regularly, for example every month or two, to look for updates or changes.
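A toy version of this visit-read-follow loop, not from the lecture, might look as follows in Python. It assumes the third-party requests and beautifulsoup4 packages, and the starting URL is hypothetical; a production crawler would also honor robots.txt, rate limits, and revisit schedules.

```python
# Minimal crawler sketch: fetch a page, store a copy, follow same-site links.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def crawl(start_url: str, max_pages: int = 20) -> dict:
    copies = {}                           # url -> page text (the crawler's copy)
    frontier, seen = [start_url], {start_url}
    site = urlparse(start_url).netloc
    while frontier and len(copies) < max_pages:
        url = frontier.pop(0)             # visit pages breadth-first
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue
        soup = BeautifulSoup(html, "html.parser")
        copies[url] = soup.get_text(" ", strip=True)
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if urlparse(link).netloc == site and link not in seen:
                seen.add(link)            # follow links to the site's other pages
                frontier.append(link)
    return copies

# pages = crawl("https://example.com")    # hypothetical starting URL
```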
Indexing
In this stage, the copies of webpages made during crawling are returned to the search engine and stored in a data centre. Using these copies, the search engine builds its index. Every webpage you see in search engine listings has been crawled and added to the index by the web crawler. Your website must be in the index; only then will it appear on search engine result pages.
We can say that the index is like a huge book which contains a copy of each web page
found by the crawler. If any webpage changes, the crawler updates the book with new
content.
So the index comprises the URLs of the webpages visited by the crawler, together with the information the crawler collected. Search engines use this information to provide relevant answers to users' queries. If a page is not added to the index, it will not be available to users. Indexing is a continuous process; crawlers keep visiting websites to find new data.
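As a toy illustration (an assumption of this write-up, not the lecture's material), the "huge book" can be modeled as an inverted index: a mapping from each word to the set of URLs where it appears, built from the crawler's page copies.

```python
# Sketch: build an inverted index (word -> set of URLs) from page copies.
import re
from collections import defaultdict

def build_index(pages: dict) -> dict:
    index = defaultdict(set)
    for url, text in pages.items():
        for word in set(re.findall(r"[a-z']+", text.lower())):
            index[word].add(url)          # record every page containing the word
    return index

# Hypothetical page copies, as a crawler might have collected them.
pages = {
    "https://example.com/gold":   "current gold price and gold coins",
    "https://example.com/silver": "silver price today",
}
index = build_index(pages)
print(index["price"])   # both URLs: 'price' appears on each page
print(index["gold"])    # only the first URL
```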
Retrieval
This is the final stage, in which the search engine provides the most useful and relevant answers, in a particular order, in response to the search query submitted by the user. Search engines use algorithms to improve the search results so that only genuine information reaches users; PageRank, for example, is a popular algorithm used by search engines. It sifts through the pages recorded in the index and shows on the first page of results the webpages it considers best.
Difference between Search Engine and Portal
Search Engine
A search engine is a program designed to enable users to find information or content on the World Wide Web. It helps retrieve the desired information in minimum time: you input specific keywords or phrases, and it retrieves a list of items matching them. Thus, it does not provide information directly; it retrieves pages related to the keywords or other search terms. Popular search engines include Google, Bing, and Yahoo! Search.
Portal
A portal is a private location on the internet which acts as a point of access to information available on the World Wide Web. A portal is accessed through a unique URL plus a unique username and password, i.e., apart from the URL, a personal login is required to see the content of a portal. Popular portals include facebook.com, gmail.com, and twitter.com.
Google Algorithm
Google’s algorithms are a complex system used to retrieve data from its search index
and instantly deliver the best possible results for a query. The search engine uses a
combination of algorithms and numerous ranking factors to deliver webpages ranked
by relevance on its search engine results pages (SERPs).
In its early years, Google made only a handful of updates to its algorithms. Now, Google makes thousands of changes every year. Most of these updates are so slight that they go completely unnoticed. On occasion, however, the search engine rolls out major algorithmic updates that significantly impact the SERPs, such as Panda, Penguin, and Hummingbird.
Compulsory task 2: Read the first two articles and write a summary in your own words, handwritten.
How to Find Suitable Keywords
Before using keyword research tools, you need to understand your product or service and its target audience; the better you know your product or service, the better the keywords you can create. The next step is to assess the value of each keyword. One way to evaluate your website and its keywords is a SWOT analysis.
SWOT stands for Strengths, Weaknesses, Opportunities and Threats. These are the four
factors used by organizations to evaluate their business and viability as an enterprise. In
a similar way, these factors can be used to evaluate the current and future growth of a
website.
Strengths and weaknesses are internal factors which are under the control of an
organization. Opportunities and threats are the external factors which are beyond the
control of an organization. After the SWOT analysis, you can find
the strengths and weaknesses of your website as well as
the opportunities and threats to your website.
SWOT
Strengths
Strengths are the advantages your website has that your competitors' sites lack. These competitive advantages help your website gain an edge over similar sites, e.g., relevant and unique content, user-friendly design, and a quick sign-up and checkout process can all be strengths of your website.
Weaknesses
Weaknesses are the features of your website that slow your progress or prevent you from achieving your objectives and goals. To identify them, compare a feature of your website with the same feature on a competitor's website; if the competitor executes the feature better than you do, treat it as a weakness to be improved.
Opportunities
Opportunities are external elements that can help you improve your website's performance and popularity. These elements are beyond your control but, if exploited wisely, can help you achieve your objectives and goals. Common opportunities for websites include new technology to improve the visitor experience, the Web 2.0 trend focused on social networking, internet on mobile phones, online transactions, and innovative marketing strategies.
Threats
Threats are external factors that may prevent a site from achieving its objectives and goals. These factors are beyond your control, and if you ignore them your website cannot make progress. Common threats to a website include new entrants (websites), software piracy, unfavorable government regulations, changing customer needs, competitors imitating your ideas and features, and fraudulent activities.
How to Choose the Best Keywords
Keywords are the words or phrases people type into search engines to find the desired information. Relevant keywords and phrases help increase profitable traffic to your website. So keep the following points in mind when looking for the best keywords and phrases.
1) Long-tail keywords: These are usually phrases of 3 to 5 words. Such keywords are considered ideal for SEO because people in the purchase stage of the buying cycle tend to use longer phrases when searching for products or services. Long phrases are also less competitive than shorter keywords and help you attract targeted traffic.
2) Latent semantic indexing: Also known as LSI. In this method, related terms are incorporated alongside the main key phrases, e.g., if a page contains information about cars, the search engine expects to see related terms such as models, makes, and car parts. The more semantically related words the search engine finds on your page, the more relevant and authoritative it will consider the page.
Keyword Finding Tools
Keyword Spy
It is an SEO tool designed for keyword research. As the name suggests, it allows you to spy on competitors' keywords, create targeted campaigns, and see competitors' rankings by geographical location. Its free version offers several features; the Domain Spy tool is one of them, which lets you type a domain into the search box and provides data such as which keywords the site spends most on, how much it spends on paid search, and who its competitors are.
Keyword Discovery
It is one of the best keyword tools. It collects data from more than 200 search engines, including popular ones like Google and Yahoo. It provides regional or country-specific databases, industry-related keyword lists, online-shopping keyword lists, etc. The tool also features various research options such as filters, misspellings, keyword permutations, and trends.
Keyword Tool.io
It is a basic keyword research tool which is useful if you are looking for long-tail suggestions for your keywords. It is also free; its basic version can be used without even creating an account. The basic version uses Google Autocomplete to create a list of related long-tail keywords, but it does not provide search volume or cost-per-click data; for that information you have to upgrade to Keyword Tool Pro.
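The autocomplete approach such tools build on can be sketched directly. The example below queries Google's suggest endpoint, which is unofficial and undocumented, so it may change or be rate-limited at any time; the seed keyword is hypothetical and the third-party requests package is assumed.

```python
# Sketch: fetch long-tail completions for a seed keyword from Google's
# unofficial suggest endpoint (undocumented; may change without notice).
import requests

def autocomplete(seed: str) -> list:
    response = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()[1]   # payload shape: [query, [suggestions, ...]]

# print(autocomplete("leather wallet"))   # hypothetical seed keyword
```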
WordTracker
The tool helps you find the best keywords related to your search terms and shows how much traffic each keyword gets. It is generally used by small organizations to research keywords, develop a planned SEO platform, and build new links.
Moz's Keyword Difficulty Tool
The tool is designed by Moz. It helps analyze the competitiveness of a keyword: when you enter a keyword, it shows the top ten rankings for that keyword, then assigns the keyword a difficulty score based on the webpages ranking for it. It also allows you to export the data to a CSV for deeper analysis.