SEO Class
History of SEO
• The early days of search engine optimization go back to the mid-1990s. The most important
aspect of a search engine algorithm appeared to be entirely "on-page" based and was
focused almost exclusively on meta tags and their related text.
• During the late 1990s, ethical SEOs and spammers alike realized that search engine
results could be manipulated by the simple process of adjusting a site's meta tags to
match the desired keywords. Google's arrival in 1998 and the introduction of its
"off-page", link-based approach signaled the beginning of the end for the exclusively
meta-tag-driven approach.
• Because of link spamming, Google introduced domain relevance and authority signals
Types of SEO
● White Hat
● Black Hat
● Grey Hat
• On-page SEO is the practice of optimizing individual web pages in order to rank higher
and earn more relevant traffic in search engines. On-page refers to both the content and
HTML source code of a page that can be optimized, as opposed to off-page SEO, which
refers to links and other external signals. It includes providing good content, good
keyword selection, placing keywords in the right places, giving an appropriate title to
every page, etc.
On Page SEO Factors
• HTTP response code errors
• Site speed
• Internal links pointing to the page
• Correct rel="canonical" use
• Absence of Broken Links
• Perfect HTML Code
• Valid CSS & JS
Off Page SEO Factors
• Acquiring backlinks
• Leveraging social interaction with your site
• Promoting your content via social channels
• Video sharing
• Social bookmarking
• Guest blogging
Introduction to SERP
Search Engines
• How does a Search Engine Work?
• Crawling- Process of fetching all the web pages linked to a website. This task is performed by a
software called a crawler or a spider (or Googlebot, in case of Google).
• Indexing- Process of creating an index for all the fetched web pages and keeping them in a giant
database from which they can later be retrieved. Essentially, indexing is identifying
the words and expressions that best describe the page and assigning the page to particular
keywords.
• Processing- When a search request comes, the search engine processes it, i.e., it compares the
search string in the search request with the indexed pages in the database.
• Calculating Relevancy- It is likely that more than one page contains the search string, so the
search engine starts calculating the relevancy of each of the pages in its index to the search
string.
• Retrieving Results- The last step in search engine activities is retrieving the best matched results.
Basically, it is nothing more than simply displaying them in the browser.
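The crawl → index → process → retrieve pipeline described above can be sketched as a toy inverted index. This is an illustrative sketch only: the page names and text are invented, and real engines add stemming, link analysis, positional indexes and far more sophisticated relevancy scoring.

```python
from collections import defaultdict

# Hypothetical "crawled" pages (page names and text are invented).
pages = {
    "page1.html": "cheap running shoes for marathon training",
    "page2.html": "marathon training plans for beginners",
    "page3.html": "best running shoes reviewed",
}

# Indexing: map each word to the set of pages that contain it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

def search(query):
    # Processing + calculating relevancy: score pages by how many query
    # words they contain, then retrieve results best-first.
    words = query.lower().split()
    scores = {url: sum(1 for w in words if url in index.get(w, set()))
              for url in pages}
    return [url for url, s in sorted(scores.items(), key=lambda kv: -kv[1]) if s > 0]

print(search("running shoes"))  # ['page1.html', 'page3.html']
```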
Search Engine Ranking
• When you search for any keyword using a search engine, it displays thousands of results
found in its database. Page rank is measured by the position of a web page in the search
engine results. If a search engine puts your web page in the first position, your web
page's rank is number 1 and it is assumed to be the page with the highest rank.
Keyword Selection
• Keyword selection is the first search-specific discipline. It is crucial and has
implications for much else within search
• Search Volumes : You should use words or phrases that have sufficient search volumes
for your needs. You can find out about search volumes by checking tools such as
https://www.wordtracker.com/ and https://searchvolume.io/.
• Competitive Advantage : A place to look for keywords is where you enjoy some
competitive advantage. How are your products or services differentiated? What are the
real strengths of your business compared to your closest competitors?
• Competition : Try to find words or phrases that appear ignored or underutilized by your
competitors. An alternative but higher risk approach is to see what keywords are used
by competitor sites and then attempt to outmaneuver them by better use of links,
content and meta tags.
• Relevance : The keyword terms you select must be relevant, salient and part of the
vocabulary used by the audience you are seeking to attract. If the target audience is
consumers, they are unlikely to use jargon. The opposite may be true if you are seeking
B2B prospects.
Keyword Research
• What are Keywords?
• Keywords are the words and phrases that searchers enter into search engines.
• They are also the search terms for which you want a certain page to rank.
Types of keywords
3. On Page
● Title Tag
● Meta Descriptions & Meta Keywords
● Heading Tags
● URL Optimisation
1. Title Tag
• In the HTML code of your web page, they should appear as:
<head>
<title> Your Title Goes Here </title>
</head>
• Each of your pages & posts should have its own unique title, which includes the main
keywords for that page
• Title tag length should be between 50 and 70 characters
• Capitalize the first letter of each word
• Put in your most relevant and desired keywords you want to rank for in the title
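The title rules above can be sketched as two small helper functions. The function names are hypothetical, and the 50-70 character range is the one suggested in these notes:

```python
def make_title(raw_title):
    # Capitalize the first letter of every word, as suggested above.
    return " ".join(word[:1].upper() + word[1:] for word in raw_title.split())

def title_length_ok(title):
    # Check the 50-70 character range suggested above for <title> text.
    return 50 <= len(title) <= 70

title = make_title("best running shoes for marathon training reviewed")
print(title, title_length_ok(title))
```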
2. Meta Descriptions
• You can add a meta description in the <head> section of your site’s HTML. It should look
something like this:
<head>
<meta name="description" content="Here is a precise description of my awesome webpage.">
</head>
• Include relevant keywords
• Length should be 50–300 characters
• Write legible, readable copy
• Do not duplicate meta descriptions
• Consider using rich snippets
3. Meta Keywords
● The meta keywords element is invisible to visitors but visible to search engines. The
keywords you put into this element's content attribute were historically used as a
ranking factor; major search engines now ignore them.
● <meta name="keywords" content="seo, search engine optimisation, search engine optimization, search engine ranking">
4. Heading Tags
• Headings are defined with the <h1> to <h6> tags.
• <h1> defines the most important heading. <h6> defines the least important
heading.
• <h1> headings should be used for main headings, followed by <h2> headings,
then the less important <h3>, and so on.
• Include keywords in heading tags to improve ranking
• Use only one <h1> tag per page
5. URL Optimisation
• URL - Uniform Resource Locator
• Do not use modifiers like top, 2018, etc. in the URL
• Use hyphens for space
• Do not use underscores or any other symbols
• Do not use capital letters in URL
• It should be human readable
• Ignore stop words
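The URL rules above can be sketched as a small slug function. The function name and the stop-word list are illustrative assumptions (real stop-word lists are much longer):

```python
import re

# A small, illustrative stop-word list; real lists are longer.
STOP_WORDS = {"a", "an", "the", "of", "and", "in", "on", "for", "to"}

def slugify(title):
    # Apply the URL rules above: lowercase, strip symbols, hyphens for
    # spaces, no underscores, stop words ignored.
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(w for w in words if w not in STOP_WORDS)

print(slugify("The Best Guide to Image Optimisation!"))  # best-guide-image-optimisation
```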
ASSIGNMENT - Write the title, URL and meta description of a website & add meta keywords
4. Linking
● Internal Linking
● External Linking
● Inbound Linking
● Outbound linking
● Image Optimisation
1. Internal Linking
• Internal links are hyperlinks that point to the same domain, i.e., links that point
to another page on the same domain
• They allow users to navigate in a website
• Help in ranking of websites
• Internal links are used to build SEO friendly site architecture.
• Helps web crawlers to crawl websites and find all pages easily
• Interlinking with keywords help in SEO
• HTML code for internal linking
• <a href="http://domain.com/sub-category">Link anchor text</a>
2. External Linking
• External links are hyperlinks that point to any domain other than the domain on which
the link exists.
• Structure is,
• <a href="http://domain.com/sub-category">Link anchor text</a>
• External links are one of the important factors that contribute to the ranking of
websites.
• The quantity and quality of external links that you use matters
• Adding trustworthy and informative links to your pages will improve the credibility of
your website.
• Valuable external links will also help to improve authority of your website.
3. Inbound and Outbound Links
• Inbound Linking
• Inbound links are links from other webpages that direct customers and search
engines to your web page.
• They are also called backlinks.
• Outbound Linking
• Outbound links are links that direct you to another specific webpage.
4. Image Optimization (tool: TinyPNG)
• Image optimization is the process of delivering high-quality images in the right
format, dimensions, size and resolution while keeping the file size as small as possible.
• Add relevant images to the content
• Add alt tags to the images
• Keep alt tags short and descriptive
• Large images make a page heavy, so reduce image size
• Page loading time is an important factor for SEO ranking.
• Add keyword rich file names and use hyphens for space
• Use relevant keyword for title of images
• <img src="https://site.com/image.png" alt="your alt text">
Keyword Density
● Keyword density is the percentage of times a keyword or phrase appears on a web page
compared to the total number of words on the page.
● Keyword density tells you how often a search term appears in a text in relation to the
total number of words it contains.
● For example: if a keyword appears three times in a 100 word text the keyword density
would be 3%.
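The definition above translates directly into a short function. This sketch handles single-word keywords only, for simplicity:

```python
def keyword_density(text, keyword):
    # Occurrences of the keyword as a percentage of all words in the text.
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

# The example above: a keyword appearing 3 times in a 100-word text -> 3%.
sample = ("seo " * 3 + "word " * 97).strip()
print(keyword_density(sample, "seo"))  # 3.0
```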
Content Optimization
● Use Natural Language
o Use natural variants
o Contextual variations
o Latent Semantic Indexing
● Increase Content Length
o Length = Strength in SEO
o Valuable content + Best Length = Best
● Optimise images
o Use relevant images
o Rename your images with keywords and use hyphens for space
o Add alt text
● Write quality content
o Content is king
o Engage people
o Free of spelling and grammatical errors
● Add Social Sharing Buttons
o Makes easy for people to share content
o Social shares indicate that people value your content
o Brand recognition
Broken Links
● Broken Links are links that send visitors to a webpage that no longer exists.
● Broken links are links that don't work.
● Search engines see links as a vote for a website’s quality.
● Links to your website and links within your website can affect where your website ranks
in search results. Because of this, it’s best practice to either remove or update broken
links.
www or non-www
● The preferred domain is the one that you would like used to index your site's pages.
● Links may point to your site using both the www and non-www versions of the URL (for
instance, http://www.example.com and http://example.com).
● The preferred domain is the version that you want to use for your site in the search
results.
● The rule can be added in .htaccess
Duplicate content
● Duplicate content is content that appears on the internet in more than one place.
● Check duplicate content issues and fix them for better ranking
● Getting rid of duplicate content is a very important task for an SEO
● Tool: copyscape.com
5. Responsiveness
● Responsiveness
● Robots.txt
● 404 Page Optimisation
● 301/ 302 Redirects
● XML Sitemap
Responsiveness
• A responsive website automatically changes to fit the device you are reading it
on.
• Improved site usability
• Faster page speed
• Decreased bounce rate
• Less duplicate content
Robots.txt
• Robots.txt is a text file webmasters create to instruct robots how to crawl pages on
their website.
• Structure is,
• User-agent:
• Disallow:
• Here the user agent is the search engine crawler, and Disallow lists the files or URLs
that the user agent should not crawl or index.
• You can use * to reference all crawlers or specify the name of a crawler.
• User-agent: * (applies to all crawlers)
• User-agent: Googlebot (instructions for Googlebot only)
• Disallow: / (blocks the entire site)
• Disallow: /private-file.html (blocks a specific page)
• Allow: /folder/file (allows a specific file inside an otherwise disallowed folder)
• Allow tells which pages or subfolders can be accessed.
• Robots.txt is case sensitive: the file name must be all lowercase, and path rules
match URL case exactly.
• 301/302 Redirects
• 301 redirect is a permanent redirect which passes between 90-99% of link juice
to the redirected page.
• 302 redirect is a temporary redirect which does not carry or pass the link juice to
the redirected page.
● XML Sitemap
• An XML sitemap is a file where you list the web pages of your site to tell Google and
other search engines about the organization of your site's content.
• It helps the robots to intelligently crawl the website.
• Creating the sitemap is not too complicated. For each URL you list in your sitemap,
you can also add additional information, such as the images on that page. Google gives
an example for the URL http://example.co.uk/sample.xml:
• Sitemap structure
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
• <urlset> : Encapsulates the file and references the current protocol standard.
• <url> : Parent tag for each URL entry. The remaining tags are children of this tag.
• <loc> : URL of the page. This URL must begin with the protocol (such as http) and end
with a trailing slash, if your web server requires it. This value must be less than 2,048
characters.
• <lastmod> : (optional) The date of last modification of the file. This date should be
in W3C Datetime format. The format allows you to omit the time portion, if desired, and
use YYYY-MM-DD.
• <changefreq> : How frequently the page is likely to change. This value provides general
information to search engines and may not correlate exactly to how often they crawl the
page. Valid values are:
• always
• hourly
• daily
• weekly
• monthly
• yearly
• never
• <priority> : The priority of this URL relative to other URLs on your site. Valid values
range from 0.0 to 1.0. This value does not affect how your pages are compared to pages
on other sites—it only lets the search engines know which pages you deem most
important for the crawlers.
• The default priority of a page is 0.5.
• Please note that the priority you assign to a page is not likely to influence the position of
your URLs in a search engine's result pages. Search engines may use this information
when selecting between URLs on the same site, so you can use this tag to increase the
likelihood that your most important pages are present in a search index.
• Also, please note that assigning a high priority to all of the URLs on your site is not
likely to help you. Since the priority is relative, it is only used to select between URLs on
your site.
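A sitemap with the structure shown above can be generated with the standard library. This is a minimal sketch (the helper name is hypothetical, and the URL and values are the ones from the example):

```python
import xml.etree.ElementTree as ET

def build_sitemap(entries):
    # Build a minimal sitemap from (loc, lastmod, changefreq, priority)
    # tuples, mirroring the tag structure shown above.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = str(priority)
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([("http://www.example.com/", "2005-01-01", "monthly", 0.8)])
print(sitemap_xml)
```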
Static vs Dynamic Web Pages
• Static web pages remain the same until someone changes them manually; dynamic web
pages show different content to different visitors.
• Static web pages are simple in terms of complexity; dynamic web pages are complicated.
• In static web pages, information changes rarely; in dynamic web pages, information
changes frequently.
• Static web pages take less time to load than dynamic web pages.
• Static web pages do not use a database; dynamic web pages do.
• Static web pages are written in languages such as HTML, JavaScript and CSS; dynamic
web pages are written in languages such as CGI, AJAX, ASP and ASP.NET.
● Canonical Tag
● Broken Links
● No Follow/ Do Follow
● Social sharing
● Canonical Tag
• A canonical tag tells a search engine that a specific URL represents the master
copy of a page.
• Using canonical tag prevents problems caused by identical or duplicate
content.
• A canonical tag tells search engines which version of a URL you want to appear in
search results.
• Structure is,
• <link rel="canonical" href="https://example.com/sample-page/" />
• Placed in the <head> of a page, this tag indicates that the page on which it appears
should be treated as a duplicate of the specified URL
● Broken Links
• Broken links are links that send visitors to a webpage that no longer exists or they
are links that don't work
• Find broken links in any of the websites using a broken link checker tool
● No Follow/ Do Follow
• Nofollow provides a way for webmasters to tell search engines not to follow a
particular link.
• This gives webmasters more granular control: instead of telling search engines and
bots not to follow any links on the page, it lets you instruct robots not to crawl a
specific link.
• Do follow links allow Google to follow them and reach our website, thus giving us
link juice and a backlink.
• Structure is,
• <a href="https://domain.com" rel="nofollow">Anchor text</a>
● Social Sharing
• Social signals refer to a webpage's collective shares, likes and overall social media
visibility.
• These activities contribute to a page's organic search ranking.
7. On Page
● Breadcrumb Navigation
● W3C Validation
● Sitelinks
● Schema Markup
● Page Speed
BreadCrumb Navigation
• Breadcrumbs are links that allow a user to track their path from the page they are
currently viewing to the homepage of your website.
• They appear close to the top of your page and reflect the structure of your site.
• They help users understand the layout of your site
• Enable user to scan your site
• Easy to understand and follow
W3C Validator
• The W3C validator checks the validity of web documents.
• It's an important step towards ensuring the technical quality of web pages.
Sitelinks
• Sitelinks are hyperlinks shown below a specific search engine result that link to the
website's subpages.
Citation
● Citation is the way you tell your readers that certain material in your work came from
another source.
● Giving credit to the original author by citing sources is the only way to use other
people's work without plagiarizing.
Schema Markup ( Structured Data)
● Schema Markup is a code that you can add to your website to improve the way search
engines read and represent your page in SERPs.
● Schema tells the search engines what your data means, not just what it says.
● Adding schema markup to your website improves the way your pages display in SERPs
by enhancing the rich snippets that are displayed beneath the page title.
● The schema markup vocabulary is available at http://schema.org/
● Google structured data tool
● Microdata is a form of Structured data that works with HTML5.
(https://developer.mozilla.org/en-US/docs/Web/HTML/Microdata)
● Search results with rich snippets will have a better click-through rate.
● JSON-LD (JavaScript Object Notation for Linked Data) supports all types of content.
● Microdata supports only HTML documents; it is an HTML5 specification.
(https://developer.mozilla.org/en-US/docs/Web/HTML/Microdata)
Example (Microdata)
<div itemscope itemtype="http://schema.org/Movie">
  <h1 itemprop="name">Avatar</h1>
  <span>Director: <span itemprop="director">James Cameron</span> (born August 16, 1954)</span>
  <span itemprop="genre">Science fiction</span>
  <a itemprop="trailer" href="../movies/avatar-theatrical-trailer.html">Trailer</a>
</div>
● @context: When two people communicate with one another, the conversation takes place
in a shared environment, typically called "the context of the conversation". This shared
context allows the individuals to use shortcut terms, like the first name of a mutual friend,
to communicate more quickly but without losing accuracy. A context in JSON-LD works
in the same way. It allows two applications to use shortcut terms to communicate with
one another more efficiently, but without losing accuracy.
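For comparison with the Microdata example, the same movie can be expressed as JSON-LD. As a sketch, here it is built as a Python dict and serialized with the stdlib json module; the nested director object is an illustrative simplification, and schema.org actually expects a VideoObject for the trailer property:

```python
import json

# The Movie example above expressed as JSON-LD. In a real page this JSON
# would sit inside a <script type="application/ld+json"> tag. The director
# and trailer property shapes are simplified for illustration.
movie = {
    "@context": "http://schema.org",
    "@type": "Movie",
    "name": "Avatar",
    "director": {"@type": "Person", "name": "James Cameron", "birthDate": "1954-08-16"},
    "genre": "Science fiction",
    "trailer": "../movies/avatar-theatrical-trailer.html",
}

print(json.dumps(movie, indent=2))
```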
10: Off Page Optimisation
● Directory Submission
● Social Bookmarking
● Social Profile Creation
● Classified Submission
● Classified Submission is the process by which you submit ads to classified
submission sites.
● Classified submission can also enhance your visibility and presence in search
engines.
● Article Submission
● Infographic Submission
● Image Sharing
● Forum Posting
● PPT Sharing
● PDF Sharing
● Press Release
● Audio Sharing
● Video Sharing
● Guest Post
ASSIGNMENT - https://neilpatel.com/blog/google-webmaster-tools/
5. Competitor Analysis
ASSIGNMENT -
https://www.youtube.com/watch?v=ZHItK2EVLQA
https://webdesign.tutsplus.com/tutorials/analyzing-your-website-with-the-screaming-frog-seo-spider--cms-21669
https://www.soravjain.com/semrush-competitor-analysis-tool
https://www.youtube.com/watch?v=tq9NnLzaMXU
ASSIGNMENT -
https://ahrefs.com/blog/how-to-use-ahrefs/
https://www.youtube.com/watch?v=amtMnIRGbG0
ASSIGNMENT
https://www.youtube.com/watch?v=ozWNNNxLW8E
https://www.youtube.com/channel/UClgihdkPzNDtuoQy4xDw5mA
https://blog.hubspot.com/marketing/google-tag-manager-guide