SEO Report
HOD
CE DEPARTMENT
R.K.SHAH
We would like to take this opportunity to express our sincere thanks to
all those who have helped us and provided direction, technical
information and advice at all stages of our project.
We are deeply indebted to everyone at our company, Moksha
Business Solutions, particularly our HR manager, Mr. Hussain Ahemad, and our
project leader, Mr. Aashish Khajuria, who have been a consistent source
of encouragement and whose deep knowledge gave us a
full understanding of the project.
We sincerely wish to thank all those people who spared their
valuable time for guiding and helping us to complete this
project.
We are also thankful to Mr. Dharmendra Suttariya and
Mr. Rahul Shah for their generous support and guidance, which
did much to shape the tone and scope of the project.
Finally, we would like to thank our friends and our family
members for their co-operation in making this project a success.
HANIF KHAN
JAYESH SINH
SEARCH ENGINE OPTIMIZATION
1. SOFTWARE INTRODUCTION
SOFTWARE DEFINITION.
SOFTWARE CHARACTERISTICS.
2. Process Model In Production (SPIRAL MODEL)
PARTITIONING
MODELLING
TESTING
8. Customer Evaluation
REQUIREMENT SPECIFICATIONS.
FUTURE EVOLUTION THROUGH SPIRAL MODEL.
CONCLUSION.
9. Form Layouts
10. Bibliography
COMPANY PROFILE
We assure our clients fresh leads that are not internet-generated. All
leads are verified twice over the phone; this is an accepted practice at
Moksha Business Solutions to confirm the genuineness of the leads generated.
Vision of Company
Moksha Business Solutions' vision is to provide exceptional services and
impeccable quality in web design, web development, e-commerce
solutions and website maintenance services. The company is constantly
striving to get better and to increase its niche and areas of expertise.
PREFACE
W E B S I T E : -
A website is a collection of related web pages, images and other content
hosted on one or more web servers. A static website is one whose content
is not expected to change frequently and is maintained manually by a
person or persons using some type of editor software.
Types of websites:-
There are many varieties of Web sites, each specializing in a particular type of
content or use, and they may be arbitrarily classified in any number of ways. A few
such classifications might include:
• Affiliate: an enabled portal that renders not only its own custom CMS
content but also syndicated content from other content providers for an
agreed fee. There are usually three relationship tiers: affiliate agencies
(e.g. Commission Junction), advertisers (e.g. eBay) and consumers (e.g. Yahoo!).
• Blog (or web log) site: site used to log online readings or to post online
diaries, which may include discussion forums (e.g. Blogger, Xanga).
• Corporate website: used to provide background information about a
business, organization, or service.
• Commerce site or ecommerce site: for purchasing goods, such as
Amazon.com.
• Community site: a site where persons with similar interests communicate
with each other, usually by chat or message boards, such as MySpace.
• Database site: a site whose main use is the search and display of a specific
database's content such as the Internet Movie Database or the Political
Graveyard.
• Development site: a site whose purpose is to provide information and
resources related to software development, Web design and the like.
• Directory site: a site that contains varied contents which are divided into
categories and subcategories, such as Yahoo! directory, Google directory and
Open Directory Project.
• Download site: strictly used for downloading electronic content, such as
software, game demos or computer wallpaper.
• Employment site: allows employers to post job requirements for a position
or positions to be filled using the Internet to advertise world wide. A
prospective employee can locate and fill out a job application or submit a
résumé for the advertised position.
• Game site: a site that is itself a game or "playground" where many people
come to play, such as MSN Games and Pogo.com.
• Information site: contains content that is intended to inform visitors, but not
necessarily for commercial purposes, such as: RateMyProfessors.com, Free
Internet Lexicon and Encyclopedia. Most government, educational and non-
profit institutions have an informational site.
• Java applet site: contains software to run over the Web as a Web application.
• Mirror (computing) site: A complete reproduction of a website.
• News site: similar to an information site, but dedicated to dispensing news
and commentary.
• Personal homepage: run by an individual or a small group (such as a family)
that contains information or any content that the individual wishes to
include.
• Political site: A site on which people may voice political views.
• Review site: A site on which people can post reviews for products or
services.
• Search engine site: a site that provides general information and is intended
as a gateway or lookup for other sites. A pure example is Google, and the
most widely known extended type is Yahoo!.
• Web portal site: a website that provides a starting point, a gateway, or portal,
to other resources on the Internet or an intranet.
• Wiki site: a site which users collaboratively edit (such as Wikipedia).
U R L : -
A URL (Uniform Resource Locator) is the address of a document or other
resource on the World Wide Web, such as http://www.example.com/index.html.
Browsers, crawlers and users all rely on URLs to locate a particular web page.
I N T E R N E T : -
As of January 11, 2007, 1.093 billion people use the Internet according
to Internet World Stats. The most prevalent language for communication
on the Internet is English. This may be a result of the Internet's origins,
as well as English's role as the lingua franca. It may also be related to the
poor capability of early computers to handle characters other than those
in the basic Latin alphabet. After English (30% of Web visitors) the
most-requested languages on the World Wide Web are Chinese 14%,
Japanese 8%, Spanish 8%, German 5%, and French 5% (from Internet
World Stats, updated January 11, 2007). By continent, 36% of the
world's Internet users are based in Asia, 29% in Europe, and 21% in
North America (updated January 11, 2007). Common services that run
over the Internet include:
• E-mail
• File-sharing
• Instant messaging
• Internet fax
• Search engine
• World Wide Web
• Marketing
• Voice Telephony (VoIP)
• Streaming Media
• Collaboration
• Remote Access
W H A T I S A S E A R C H E N G I N E : -
Search engine optimization means ensuring that your Web pages are
accessible to search engines and are focused in ways that help improve
the chances they will be found.
This guide is not a primer on ways to trick or "spam" the search engines.
In fact, there are no "search engine secrets" that will guarantee a top
listing. But there are a number of small changes you can make to your
site that can sometimes produce big results.
Let's go forward and first explore the two major ways search engines get
their listings; then you will see how search engine optimization can
especially help with crawler-based search engines.
There are basically three types of search engines: crawler-based search
engines, human-powered directories, and hybrid search engines that
combine the two.
Crawler-based search engines build their listings automatically. The
crawler will periodically return to the sites it has visited to check for any
information that has changed; the frequency with which this happens
is determined by the administrators of the search engine.
With a human-powered directory, by contrast, changing your web pages
has no effect on your listing. Things that are useful for improving a
listing with a crawler-based search engine have nothing to do with
improving a listing in a directory. The only exception is that a good
site, with good content, might be more likely to get reviewed for free
than a poor site.
Human-powered search engines rely on humans to submit information,
which is subsequently indexed and catalogued. Only information that is
submitted is put into the index.
In both cases, when you query a search engine to locate information, you
are actually searching through the index that the search engine has
created; you are not actually searching the Web. These indices are giant
databases of information that is collected and stored and subsequently
searched. This explains why sometimes a search on a commercial search
engine, such as Yahoo! or Google, will return results that are in fact dead
links. Since the search results are based on the index, if the index hasn't
been updated since a Web page became invalid the search engine treats
the page as still an active link even though it no longer is. It will remain
that way until the index is updated.
In the web's early days, a search engine typically presented either
crawler-based results or human-powered listings. Today, it is extremely
common for both types of results to be presented. Usually, a hybrid
search engine will favor one type of listing over the other. For example,
MSN Search is more likely to present human-powered listings from
LookSmart. However, it also presents crawler-based results (as
provided by Inktomi), especially for more obscure queries.
Another common element that algorithms analyze is the way that pages
link to other pages on the Web. By analyzing how pages link to each
other, an engine can both determine what a page is about (if the
keywords of the linked pages are similar to the keywords on the original
page) and whether that page is considered "important" and deserving of
a boost in ranking. Just as the technology is becoming increasingly
sophisticated at ignoring keyword stuffing, it is also becoming savvier
about webmasters who build artificial links into their sites in order to
achieve an artificial ranking.
Soon after, many search engines appeared and vied for popularity. These
included Excite, Infoseek, Inktomi, Northern Light, and AltaVista. In
some ways, they competed with popular directories such as Yahoo!.
Later, the directories integrated or added on search engine technology
for greater functionality.
• Google
• Yahoo! Search
• MSN Search
• Windows Live Search
• Lycos
• AltaVista
• Alltheweb
• WebCrawler
• Northern Light
• Aliweb
G O O G L E : -
Around 2001, the Google search engine rose to prominence. Its success
was based in part on the concept of link popularity and PageRank. The
number of other websites and web pages that link to a given page is
taken into consideration with PageRank, on the premise that good or
desirable pages are linked to more than others. The PageRank of the linking
pages and the number of links on those pages contribute to the PageRank
of the linked page. This makes it possible for Google to order its
results by how many websites link to each found page. Google's
minimalist user interface is very popular with users, and has since
spawned a number of imitators.
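As a rough illustration (this is the simplified formula from Brin and Page's
original paper, not necessarily what Google uses today), the PageRank of a
page A that is linked to by pages T1…Tn can be written as:
PR(A) = (1 − d) + d × (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))
where C(T) is the number of outgoing links on page T and d is a damping
factor, usually set around 0.85. A link from a page that is itself important,
and that has few other outgoing links, therefore passes on more value.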
Google and most other web search engines utilize not only PageRank but more
than 150 criteria to determine relevancy. The algorithm "remembers"
where it has been, indexes the number of cross-links, and relates
these into groupings. PageRank is based on citation analysis, which was
developed in the 1950s by Eugene Garfield at the University of
Pennsylvania; Google's founders cite Garfield's work in their original
paper. In this way virtual communities of web pages are found. Teoma's
search technology uses a communities approach in its ranking algorithm.
NEC Research Institute has worked on similar technology. Web link
analysis was first developed by Jon Kleinberg and his team while
working on the CLEVER project at IBM's Almaden Research Center.
Google is currently the most popular search engine.
You'll see only pages that are relevant to the terms you type.
Google only produces results that match all of your search terms or, through use of
a proprietary technology, results that match very close variations of the words
you've entered (e.g., if you enter "comic book", it may return results for "comic
books" as well). The search terms or their variants must appear in the text of the
page or in the text of the links pointing to the page. This spares you the frustration
of viewing a multitude of results that have nothing to do with what you're looking
for.
Y A H O O ! : -
The two founders of Yahoo!, David Filo and Jerry Yang, Ph.D.
candidates in Electrical Engineering at Stanford University, started their
guide in a campus trailer in February 1994 as a way to keep track of
their personal interests on the Internet. Before long they were spending
more time on their home-brewed lists of favorite links than on their
doctoral dissertations. Eventually, Jerry and David's lists became too
long and unwieldy, and they broke them out into categories. When the
categories became too full, they developed subcategories ... and the core
concept behind Yahoo! was born. In 2002, Yahoo! acquired Inktomi and
in 2003, Yahoo! acquired Overture, which owned Alltheweb and
AltaVista. Despite owning its own search engine, Yahoo! initially kept
using Google to provide its users with search results on its main website
Yahoo.com. However, in 2004, Yahoo! launched its own search engine
based on the combined technologies of its acquisitions, providing a
service that gave pre-eminence to the Web search engine over the
directory.
M I C R O S O F T S E A R C H : -
• The queries one can make are currently limited to searching for
key words, which may result in many false positives, especially
using the default whole-page search. Better results might be
achieved by using a proximity-search option with a search-bracket
to limit matches within a paragraph or phrase, rather than matching
random words scattered across large pages. Another alternative is
using human operators to do the research for the user with
organic search engines.
• Dynamically generated sites may be slow or difficult to index, or
may produce an excessive number of results, perhaps generating 500 times
more web pages than average. Example: for a dynamic web page which
changes its content based on entries taken from a database, a
search engine might be asked to index 50,000 static web pages
for 50,000 different parameter values passed to that dynamic page.
• Many dynamically generated websites are not indexable by search
engines; this phenomenon is known as the invisible web. There are
search engines that specialize in crawling the invisible web, by
crawling sites that have dynamic content, require forms to be filled
out, or are password protected.
• Relevancy: sometimes the engine can't find what the person is
looking for.
• Some search engines do not rank results by relevance, but by the
amount of money the matching websites pay.
• In 2006, hundreds of generated websites used tricks to manipulate
search engines into displaying them in the top results for numerous
keywords. This can lead to some search results being polluted with
link spam or bait-and-switch pages which contain little or no
information about the matching phrases. The more relevant
web pages are pushed further down in the results list, perhaps by
500 entries or more.
• Secure pages (content hosted on HTTPS URLs) pose a challenge
for crawlers, which either can't browse the content for technical
reasons or won't index it for privacy reasons.
H O W T H E S E A R C H E N G I N E W O R K S : -
A crawler-based search engine operates in three broad steps:
1. Web crawling
2. Indexing
3. Searching
Some search engines, such as Google, store all or part of the source page
(referred to as a cache) as well as information about the web pages,
whereas others, such as AltaVista, store every word of every page they
find. This cached page always holds the actual search text since it is the
one that was actually indexed, so it can be very useful when the content
of the current page has been updated and the search terms are no longer
in it. This problem might be considered to be a mild form of link rot, and
Google's handling of it increases usability by satisfying user
expectations that the search terms will be on the returned webpage.
This satisfies the principle of least astonishment since the user normally
expects the search terms to be on the returned pages. Increased search
relevance makes these cached pages very useful, even beyond the fact
that they may contain data that may no longer be available elsewhere.
When a user comes to the search engine and makes a query, typically by
giving key words, the engine looks up the index and provides a listing of
best-matching web pages according to its criteria, usually with a short
summary containing the document's title and sometimes parts of the text.
Most search engines support the use of the Boolean operators AND, OR and
NOT to further specify the search query; for example, a query such as
hotels AND "new york" NOT hostels asks for pages containing both "hotels"
and "new york" but not "hostels". An advanced feature is proximity search,
which allows users to define the distance between keywords.
The vast majority of search engines are run by private companies using
proprietary algorithms and closed databases, though some are open
source.
All crawler-based search engines have the basic parts described above,
but there are differences in how these parts are tuned. That is why the
same search on different search engines often produces different results.
Some of the significant differences between the major crawler-based
search engines are summarized on the Search Engine Features Page.
Information on this page has been drawn from the help pages of each
search engine, along with knowledge gained from articles, reviews,
books, independent research, tips from others and additional information
received directly from the various search engines.
W H A T I S A W E B C R A W L E R : -
A web crawler (also known as a spider or robot) is a program that browses
the World Wide Web in a methodical, automated manner. Web crawlers are
mainly used to create a copy of all the visited pages for later processing
by a search engine that will index the downloaded pages to provide fast
searches. Crawlers can also be used for automating
maintenance tasks on a Web site, such as checking links or validating
HTML code. Also, crawlers can be used to gather specific types of
information from Web pages, such as harvesting e-mail addresses
(usually for spam).
Crawling policies:-
There are three important characteristics of the Web that generate a
scenario in which Web crawling is very difficult: its large volume, its
fast rate of change, and dynamic page generation, which together produce
a wide variety of possible crawlable URLs.
The large volume implies that the crawler can only download a fraction
of the Web pages within a given time, so it needs to prioritize its
downloads. The high rate of change implies that by the time the crawler
is downloading the last pages from a site, it is very likely that new pages
have been added to the site, or that pages have already been updated or
even deleted.
Examples of web crawlers:-
World Wide Web Worm: - (McBryan, 1994) was a crawler used to build a simple
index of document titles and URLs. The index could be searched by using the grep
UNIX command.
Google Crawler:- (Brin and Page, 1998) is described in some detail, but the
reference covers only an early version of its architecture, which was based on C++
and Python. The crawler was integrated with the indexing process, because text
parsing was done both for full-text indexing and for URL extraction. A URL
server sends lists of URLs to be fetched by several crawling processes.
During parsing, the URLs found were passed to the URL server, which checked
whether each URL had been previously seen; if not, the URL was added to its
queue.
Cobweb: - (da Silva et al., 1999) uses a central "scheduler" and a series of
distributed "collectors". The collectors parse the downloaded Web pages and send
the discovered URLs to the scheduler, which in turn assigns them to the collectors.
The scheduler enforces a breadth-first search order with a politeness policy to
avoid overloading Web servers. The crawler is written in Perl.
Mercator :- (Heydon and Najork, 1999) is a modular web crawler written in Java.
Its modularity arises from the usage of interchangeable "protocol modules" and
"processing modules". Protocols modules are related to how to acquire the Web
pages (e.g.: by HTTP), and processing modules are related to how to process Web
pages. The standard processing module just parses the pages and extracts new
URLs, but other processing modules can be used to index the text of the pages, or
to gather statistics from the Web.
WebFountain:- (Edwards et al., 2001) is a distributed, modular crawler similar
to Mercator but written in C++. It features a "controller" machine that coordinates
a series of "ant" machines. After repeatedly downloading pages, a change rate is
inferred for each page, and a non-linear programming method must be used to solve
the equation system for maximizing freshness. The authors recommend using this
crawling order in the early stages of the crawl, and then switching to a uniform
crawling order, in which all pages are visited with the same frequency.
PolyBot:- (Shkapenyuk and Suel, 2002) is a distributed crawler written in C++ and
Python, which is composed of a "crawl manager", one or more "downloaders" and
one or more "DNS resolvers". Collected URLs are added to a queue on disk and
processed later to search for already-seen URLs in batch mode. The politeness policy
considers both third and second level domains (e.g. www.example.com and
www2.example.com are third level domains) because third level domains are
usually hosted by the same Web server.
FAST Crawler: - (Risvik and Michelsen, 2002) is the crawler used by the FAST
search engine, and a general description of its architecture is available. It is a
distributed architecture in which each machine holds a "document scheduler" that
maintains a queue of documents to be downloaded by a "document processor" that
stores them in a local storage subsystem. Each crawler communicates with the
other crawlers via a "distributor" module that exchanges hyperlink information.
Labrador: - is a closed-source web crawler that works with the Open Source
project Terrier search engine. In addition to the specific crawler architectures listed
above, there are general crawler architectures published by Cho (Cho and Garcia-
Molina, 2002) and Chakrabarti (Chakrabarti, 2003).
Web crawler architectures:-
Search engines do not always provide the right information, but rather
often subject the user to a deluge of disjointed irrelevant data.
• The final step is the search, locate, and match process itself.
Location and frequency of the keywords' occurrence within the
document are the most common criteria used by search engines in
matching and ranking. Words located in the title of a document are
awarded a greater weight, as are words located in HTML Meta
tags. Those located in subject descriptions and those located higher
up (i.e., earlier) on the page are also more highly weighted. The
frequent recurrence of the keywords results in a greater weight;
however, frequency is subject to certain limitations.
Not all search engines are going to be successful. To date, Yahoo! is the
only one to turn a profit. If a sustainable business model is not
eventually found, these companies will fail. The current strategy is to
fold a search engine into a larger portal site. An increasing number of
personalized services (e.g., paging services, weather reports, and chat
rooms) are being added by the portals to increase the likelihood of the
user logging on and staying put. Portals are becoming one-stop web
organizers. America Online and Yahoo! currently dominate the portal
race. However, Lycos's recent acquisition of HotBot, Disney's
acquisition of Infoseek and newcomer Northern Light are examples of
different solutions to the positioning question.
The Role of Search Engine Rank in Driving Traffic to Your
Website:
Having a desirable search engine rank is ideal for driving traffic to your
website. Generally, the majority of a website's traffic comes through
internet users' use of the search engines. A good search engine rank is
really important considering that over 80% of traffic for most websites is
directed via search engines and most users of search engines only click
through to websites that have a search engine rank within the first three
pages of the search engine results.
There are a number of search engines today. Each search engine has its
own algorithms which are rules that determine how websites are placed
in their search engine rank. Thus, a search engine optimization strategy
that provides a desirable search engine rank in one engine may not
produce good results in another search engine. So, when trying to
achieve a beneficial search engine rank, you really need to focus on one,
or maybe two search engines for search engine rank purposes. If your
website gets a good search engine rank in the other search engines, you
can consider that a blessing.
Getting a good search engine rank with Google will undoubtedly drive
loads of traffic to your website. Statistics have repeatedly shown that
Google is the most used search engine with Yahoo! coming in second.
These two major search engines also power some of the smaller search
engines, meaning that the smaller engines draw their results from the
major ones. So, if you get a good search
engine rank in Google or Yahoo!, you will likely get a decent search
engine rank in some of the search engines they power such as MSN,
AOL, Ask Jeeves, and Alta Vista.
Another reason to strive for a top search engine rank in Google is that it
is an organic search engine, thus the results the search engine displays
are based on the quality and relevance of the information on a website to
the key terms searchers use rather than being based on paid advertising
and who can pay the most to reach the top in search engine rank.
When you do a search using Google, you will notice that sponsored links
that appear at the top and at the right of the screen are clearly marked so
the searcher knows they are paid ads. All other results generated are
generated based on algorithms, not paid advertisements. With Yahoo!,
you can also tell which results are generated as a result of sponsored
links. If you cannot achieve a search engine rank in the first three pages
of Google or Yahoo!, you can always opt for paid advertising that will
get you listed in the sponsored links.
The difference between organic search engine rank listings and pay-per-
click/cost-per-click search engine rank is that organic listings don't cost
you anything, while with per-click listings you are charged the amount of
your keyword bid for every click-through to your website. Whether you
get a search engine rank in the search results naturally or through paid
advertising, the search engine rank is vital for driving traffic to your
website.
Building a website is just the beginning. Most websites fail for lack of
traffic. In order to get traffic, the most cost-effective step you can take is
to prepare your website properly so that it can come up in the first few
pages of results when someone searches for your most important
keywords or keyword phrases. The preparation of web pages so that
they are search engine friendly is known as Search Engine
Optimization, or SEO. On-page factors that SEO typically addresses include:
• Titles, Headings
• Meta tags
• Clean Design
• Navigation
• Content - Keywords
• URLs
• Sitemaps
• File size
• Site size
• Domain name
• Site age
• Images - ALT
• Outgoing links
• And others…
On-page optimization alone will not guarantee a top rating within a search
engine; only off-page optimization can offer that. However,
off-page optimization is far more effective when on-page optimization
is in place.
But, even if you have web pages that are well-designed and well-written,
you may still be buried in the rankings. Google, the most popular search
engine, relies heavily on the link popularity of websites in its formula (or
algorithm) determining the most relevant answers to any particular
query. If your website does not have inbound links from other
websites, it will not achieve high rankings in highly competitive
categories. Boosting the link popularity of a particular web page or
website is also known as off-page or off-site search engine optimization.
We rewrite your headlines and important text so that they are search
engine friendly, and most importantly, so that they are user friendly! It
should always be remembered that your site should be built for the
satisfaction of the people who will read it and not for the satisfaction of
search engine robots.
The goal of your site is to sell your product or service, and SEO service is
aimed at helping you to get more traffic and to convert visitors into buyers.
The following is a checklist you should run through every time you
create a new webpage:
File Name:-
The file name of a page is taken into account when search engines spider
your website. You should therefore always try to use your main keyword
in the file name; for example, a page about web design services might be
called web-design-services.html. If you are using dynamic pages, you could
use mod_rewrite rules in the .htaccess file to rewrite your file names and URLs.
Title Tags:-
The page title is what most search engines will show on their search
result pages. It is vital that you put extra effort into getting this right,
and it should not (only) be the website's name.
Example:
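(A minimal illustrative title tag; the wording is hypothetical.)
<title>Web Design and SEO Services - Moksha Business Solutions</title>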
Meta Tags:-
Although most search engines don't pay much attention to meta tags, it is
still important to add them for on-page optimization. Some
search engines still use them when displaying your page in their result
lists. Try to make them different for every webpage.
Example:
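(Illustrative meta description and meta keywords tags; the wording is hypothetical.)
<meta name="description" content="Moksha Business Solutions offers web design, web development and e-commerce solutions." />
<meta name="keywords" content="web design, web development, e-commerce solutions" />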
Heading Tags:-
Right after your page title, your heading tags are of most importance.
Search engines use them to gauge the importance of your keywords. If
you find your heading tags too big, use CSS to define how they
look.
Example:
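(An illustrative heading, with a CSS rule to keep it from looking too big; the names are hypothetical.)
<h1>Web Design Services</h1>
<style> h1 { font-size: 1.2em; } </style>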
Body Text:-
Spread your keywords throughout the page's paragraphs (<p>) and
repeat them a couple of times. Try not to make the text unnatural,
because you still want people to be able to read it. Readability should
matter more here than on-page optimization.
A bold or italic word carries more weight with most search engines, so try
to use your main keyword in bold (<b>) and italic (<i>). Note that
using <strong> and <em> instead of <b> and <i> does not matter;
Google has stated that it treats these tags exactly the same.
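(A short illustrative sentence using such emphasis; the wording is hypothetical.)
<p>We offer affordable <b>search engine optimization</b> and <i>web design</i> services.</p>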
Image Alt Tags:-
Search engines don't just look at text; they also take images into account.
Try to use image ALT attributes, and name your images using your main keyword.
Example:
<img src="images/onpage_optimization.jpg" alt="On page optimization" />
Promotional Comments:-
Some search engines, like the Inktomi search engine, read comments in
the <!-- --> format. If you place a keyword-rich paragraph inside such a
comment at the top of the page, this could help your keyword weighting and
keyword relevancy.
Example:
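(Illustrative only; the wording is hypothetical, and most modern engines ignore comment text entirely.)
<!-- Affordable web design and search engine optimization services -->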
There are two important factors in any Search Engine Optimization campaign: on-page optimization and off-page optimization.
Once you've optimized your web pages and uploaded them to your
server, your next step will be to submit your main pages to the Search
Engines. However, don't submit your pages to Google. Your pages will
rank much higher if you allow this Search Engine to find your pages on
its own.
You may want to consider creating a site map for your site and submit
this page to Google instead. A site map is a page that outlines how your
pages are set up and linked together. If you design a site map with links
to all of your pages, the Search Engine robots can easily spider and
index them.
Taking the time to optimize each of your web pages is the most
important step you can take towards ranking high in the Search Engines
and driving more traffic to your web site.
"Search engine submission" refers to the act of getting your web site
listed with search engines. Another term for this is search engine
registration.
Getting listed does not mean that you will necessarily rank well for
particular terms, however. It simply means that the search engine knows
your pages exist.
Think of it as a lottery. Search engine submission is akin to your
purchasing a lottery ticket. Having a ticket doesn't mean that you will
win, but you must have a ticket to have any chance at all.
"Search engine optimization" refers to the act of altering your site so that
it may rank well for particular terms, especially with crawler-based
search engines (later in this guide, we will explain what these are).
These terms also highlight the fact that doing well with search engines is
not just about submitting right, optimizing well or getting a good rank
for a particular term. It's about the overall job of improving how your
site interacts with search engines, so that the audience you seek can find
you.
On To Submission
The next few "essentials" pages cover the basics of search engine
submission. If all you do is follow the instructions on these essentials
pages, you'll receive traffic from search engines. However, if you have
time, you should also read beyond the essentials to understand how
optimization can increase your traffic and other ways you can market
your site with search engines.
Off-page optimization is concerned with factors such as:
• Page Rank
• Back links
• Link Exchange
• Anchor text
• Relevancy
• Directories
• Traffic
• Bookmark
Off-page optimization is optimization done off the page: getting
relevant links from other sites, exchanging links with quality relevant sites,
choosing relevant anchor text in the best locations on the different
pages of different sites, and so on.
Links are the ultimate driving force behind all search engines today. A
quality back link not only helps in search engine ranking but can also
establish your brand as unique and bring quality targeted
traffic to your site.
The importance of link popularity varies with each search engine, but
the basic premise is that every link to your web site is an endorsement of
your site's quality, and the more endorsements you have, the higher your
site is likely to be listed. Search engine optimization helps build
quality text links to your site, thus increasing the visibility of your site.
Reciprocal linking, or two-way linking, is where you provide a return link
to the other website in return for a link to yours; such links are less valuable
than one-way direct links. This type of linking is done when it is difficult
to get one-way links. Many search engines use a search algorithm which
analyzes the quality of the sites linking to you.
Blogging:-
With all this being said, it is important to note that links from high-
quality sites are better for rankings than links from low-quality sites. A
few links from reputable sites are worth more than a lot of links from
unknown sites. So, if you can, try getting links from such sites instead.
There are a lot of SEO experts who say that off-page optimization is
more important than on-page optimization. While that is not completely
wrong, it is very difficult (but not impossible) for good off-page
optimization to work without excellent content, and content is ultimately
what is on your web pages.
How Search Engines Rank Web Pages:-
One of the main rules in a ranking algorithm involves the location
and frequency of keywords on a web page. Call it the location/frequency
method, for short.
Search engines will also check to see if the search keywords appear near
the top of a web page, such as in the headline or in the first few
paragraphs of text. They assume that any page relevant to the topic will
mention those words right from the beginning.
Search engines may also penalize pages or exclude them from the index,
if they detect search engine "spamming." An example is when a word is
repeated hundreds of times on a page, to increase the frequency and
propel the page higher in the listings. Search engines watch for common
spamming methods in a variety of ways, including following up on
complaints from their users.
Off-the-page factors are those that a webmaster cannot easily influence.
Chief among these is link analysis. By analyzing how pages link to each
other, a search engine can both determine what a page is about and
whether that page is deemed to be "important" and thus deserving of a
ranking boost. In addition, sophisticated techniques are used to screen
out attempts by webmasters to build "artificial" links designed to boost
their rankings.
Another off the page factor is click through measurement. In short, this
means that a search engine may watch what results someone selects for a
particular search, and then eventually drop high-ranking pages that aren't
attracting clicks, while promoting lower-ranking pages that do pull in
visitors. As with link analysis, systems are used to compensate for
artificial links generated by eager webmasters.
With the addition of this document to their website, the people at Google
appear to be trying to frighten people away from search engine
optimization altogether. Although they say that "Many SEOs provide
useful services for website owners", they finish the sentence by
describing the range of what those useful services are:- "from writing
copy to giving advice on site architecture and helping to find relevant
directories to which a site can be submitted".
They say that an SEO's useful services include:- writing copy, giving
advice on site architecture and helping to find relevant directories....
These can be part of search engine optimization, of course, but they are
not what is widely understood by the term search engine optimization;
i.e. optimizing pages to rank highly. Even writing copy doesn't suggest
anything to do with SEO copywriting, and giving advice on site
architecture is to do with website design and not search engine
optimization, although an SEO can advise on it with respect to crawling.
The document goes on to say, "there are a few unethical SEOs who have
given the industry a black eye through their overly aggressive marketing
efforts and their attempts to unfairly manipulate search engine results".
The implication is that search engine optimizers who go further than the
sort of things that Google mentions, and actually optimize pages to
improve rankings (manipulate search results), are unethical. Google
clearly views any sort of optimizing to improve rankings as unethical.
Later in the document, Google lists a number of 'credentials' that reputable
search engine optimizers should have. In Google's view, a search engine
optimization company should employ a reasonable number of staff
(individual SEOs are not reputable), they should offer "a full and
unconditional money-back guarantee", they should report "every spam
abuse that it finds to Google", and more, and they warn people against
those who don't measure up. But there isn't a search engine optimizer in
the world, individual or company, who doesn't fall foul of Google's
'credentials'. There are people who can write copy (not SEO copy),
advise on site structure and even find directories to submit to, but they
aren't search engine optimizers and, in terms of rankings, they are of
limited value.
Suppose there are 1000 hotels in New York, each of which has a
website. When somebody types "New York hotels" into a search engine,
all 1000 websites are equally relevant to the search. Because of the way
that Google and other engines have been designed, they normally
display the results 10 at a time. But which of the 1000 hotel sites will be
displayed in the first 10, which of them will be displayed in the second
10......and which will be placed right at the bottom of the pile?
It is well-known that searchers don't look very far down the results, so
the sites that are nearer the top will take all the business, and those that
are further down will get none. But which sites will be at the top?
Google uses its algorithms to determine the order of the results. It is
patently obvious that all 1000 equally relevant websites will not be
displayed on the first results page (the top 10). It is also obvious that
equally relevant sites cannot be displayed where they belong. Some
necessarily become more equal than others.
So what if the owner of one of the websites decides to try and push his
site to the top? Is that wrong? Of course not. The site is just as relevant
as the top ones; it's just that Google cannot satisfy all the relevant sites.
This is where ethical search engine optimization comes in.
So what are search engines like Google so afraid of? SEOs have exactly
the same aim as the engines - relevant search results. The difference is
that search engines don't care about individual websites, whereas search
engine optimizers and website owners do. That's the only difference.
Engines don't care if a particular website is in the top 10; SEOs care very
much that a particular website is in the top 10. But they can't get an off-
topic site there because the search engine algorithms see to that. And
that's an important point - search engine optimization can only get pages
to the top of relevant results. The search engines' own algorithms keep
off-topic pages out.
Any individual person may say they find a page relevant to a particular
keyword. That doesn't mean that it is among the most relevant pages on
the Web for that individual or for other individuals. This is a subjective
assessment.
Search engines work at the group level. To a search engine, every page
in the index is relevant for every possible keyword. The question is
"How relevant?" A search engine applies algorithms to determine a
relevance score and orders its search results by that relevance score,
most relevant first. Thus the results at the top of any set of search results
are literally the most relevant. This is still a subjective assessment, as it
is effectively made by the programmers of the search engine algorithms.
However, as the assessment is made automatically by the algorithm,
according to pre-determined criteria, there is also an element of
objectivity to it. The function of a search engine is to deliver search
results in response to keywords that the individual searchers in its target
market find to be relevant - in other words, for its assessment of
relevance to match its users' assessment.
• To actually make the page more relevant by changing the content and
link structure, but still using content that people will see and links
that people will follow. This corresponds with White Hat
techniques.
There is one thing you should never forget to take into consideration
when you write and publish your pages: never over-optimize
your content. If there is no sense in what you are writing, it is less
useful to your visitors. Visitors may be put off by that, and
moreover the search engines will detect excessive use of
keywords (keyword stuffing) and the like. So always keep in mind
that your texts should be relevant and useful to your actual audience.
It is essential to identify at least one main target keyword. This is the
keyword you would like your visitors to query the search engine for,
and eventually see your site at the top of the result page for. Your main
target keyword should not be a very competitive keyword if your
business has just started out. A keyword does not necessarily mean one
word; it can be a key phrase.
After you have picked your target keyword(s), you should concentrate
on optimizing your web site for this keyword(s). This does not mean
to stuff your site with the keyword, but rather to keep in mind to
mention this keyword from time to time in a relevant sentence.
Now that you have in mind what you are optimizing for, you can go
on with web site search engine optimization on, let's say, your
homepage first. Search Engine Optimization is not only done on the
texts you are writing. The most important parts of your page are the
title, the meta description and the meta keywords, the images ALT
attributes, the first body text (beside the other contents as well), the
headlines (esp. H1, H2), and the anchor texts of links, amongst others.
Page Title
You should avoid stuffing your page's title with keywords only. This
was a widespread technique in the past, but nowadays it is
considered spamming and puts you one step closer to being penalized.
The page's title should include your business and the purpose
or main point of the current page (or the website itself). The first five
words are the most important ones, so try to include one of your main
target keywords at the beginning of the tag. However, only do that if
the keyword is relevant to the displayed page! Do not put in senseless
keywords that cannot be found anywhere else on the page; this is
likely to be considered spamming, as already mentioned.
Meta Description
This should also not be stuffed with keywords only. It is best practice
to write one or two very relevant sentences that summarize the
contents of the page. The sentences should be grammatically correct
in order not to be considered spam.
Meta Keywords
Always keep in mind that you should not put in too many keywords.
There are websites around that use more than 30 keywords, which is
far too many. Why? Simply because too many keywords decrease
the keyword density for the actual target keywords that your visitors
will search for. Besides, you should order the keywords by relevancy:
start with the most relevant main keyword and finish with the less
relevant ones, so your selected main target keywords appear at the very
beginning of this tag. Also keep in mind that you should not list any
keywords that never appear on your web site, as this may be considered spam.
The ALT attribute of images
Search engines cannot read the content of an image itself, so the ALT
attribute text is what they use to judge what an image is about; include a
main keyword where it genuinely describes the image, as in the <img>
example earlier.
Headlines
It is likely that more weight is put on text that appears in headlines. If your main
keywords appear in the headlines, their relevancy may be increased. However,
never try to wrap all your body text in one headline tag; this will certainly be
of no use and is a way of trying to trick the search engines. Always use
headlines in a common, useful way, as if you were writing an article for a
newspaper.
Anchor Text
The anchor text apparently plays an important role in keyword relevancy too.
Having internal links to pages that say something about, or related to, your
main keywords is of great value. It is also good practice to embed links in a
sentence; links may receive more relevance by being logically inserted into a
sentence. In any case, you should always also have something like a menu for
the convenience of the human visitor!
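(An illustrative in-sentence link; the file name and wording are hypothetical.)
<p>Read more about our <a href="web-design-services.html">web design services</a> for small businesses.</p>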
Keyword Density
The keyword density is the number of occurrences of the same keyword
compared to the overall body text.
Keyword density does not play as big a role in website search engine
optimization as it did a year or two ago. However, increasing the keyword
density in your body text and meta keywords can increase the keywords'
relevancy a bit. Despite that, never try to stuff your texts with the same
keyword over and over; too high a density may backfire on your relevancy.
There is no perfect keyword density (e.g. 1%), though you should always make
sure to mention your main keyword at least once in the body text.
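(A worked example under the definition above:)
keyword density (%) = (occurrences of the keyword ÷ total words in the body text) × 100
So a 500-word page that mentions its key phrase 5 times has a keyword density of 5 ÷ 500 × 100 = 1%.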
• Link Building
If you have not yet promoted your business anywhere, and you are
not yet indexed by any of the search engines, this section may be of
interest to you. If you are optimizing an established website and do
not need to do the basic promotion again, take a look at our SEO
techniques to find out how to successfully outperform
your competitors.
If you write high-quality articles about your area of business or
any other website, it is not unlikely that some sites will link to
your article's permanent link. These are quality one-way links that
increase your link popularity as well.
The next step is to show others that you are an expert in your area.
Forums or bulletin boards are the key to spreading your online business
or website. Be active on forums that are relevant to your business,
and you'll not only draw the attention of the forum visitors and
members, but also draw the attention of the search engines to your
signature or to relevant threads with your link inside.
Having done the basic promotion you can think of advanced SEO
techniques in order to increase your link popularity and quality.
One of the most important factors that influence the search engine
ranking of your site is the number of back links you have. Almost every
search engine takes the number of back links into account while
evaluating and ranking your site in the SERPs. The number of back links
is the most important factor for ranking well in Google.
Back Links:-
Back links are basically incoming links to a particular website or
webpage. The more back links a website or webpage possesses, the more
popular or important it is considered to be. Back links are also
popularly known as incoming links, inbound links, in-links and inward
links.
Ever wondered why back links are so important in search engine
rankings? Well, almost all the search engines, especially Google, value
websites that are rich in useful and informative content. And whenever a
site places a link to your site, Google considers that link a vote for
your site: it assumes the other site cast a vote for yours because
the content of your site is useful to its visitors. Hence, the greater the
number of votes, the greater the value your site has from the search
engine's perspective. And if you are hankering after page rank, then back
links are the only way to go. Back links help your website attain a higher
search engine ranking and definitely a higher page rank.
• Blogs:-
One of the best ways of creating one-way links to your site is by
posting comments on blogs. Just include the URL of your site
when posting comments on blogs, and soon you will build a good
number of one-way links to your site.
• Forums:-
Posting in forums is also an excellent way for building back links. But
don't post your URL directly in the discussion boards or else the spam
busters would get you banned and removed from the forum. The best
way to do it is through your signature. Include the URL of your site in
your signature and then whenever you post a comment in the forum you
will be leaving behind a link to your site along with your name.
• Social Networks:-
Social Bookmarking:-
Articles:-
You may write articles on various topics and post them on article sites,
including the URL of your site in the resource box or
in the author's bio section. There are many sites which frequently
publish articles from such article sites, and along with the article
they also publish the author's bio, which contains the link to your site. If
your article is informative and intriguing, it will soon be published
on many other sites and in newsletters. This way you will have a large
number of back links pointing to your site.
Press Release:-
Directories:-
Submit your website to as many web directories as you can. These web
directories attract a large number of visitors daily, so you may get a
good number of visitors from them. Submission to directories
also helps in building quality back links for your website. Consider
submitting your website to leading web directories such as Dmoz.org.
Link Exchange:-
Though it is a very old method for building traffic and back links, link
exchange still works. Exchange links with quality and relevant sites for
building back links.
The above-mentioned methods will help you build a great number of
back links for your site. But always remember that the content of your
site is what really matters. Concentrate on making exceptionally good
content for your site, and others will voluntarily link to it.
What is a Sitemap?
A sitemap is a single page on your site that lists, and links to, the major
pages of your website.
Sitemaps are very important for two main reasons. First, your sitemap
provides food for the search engine spiders that crawl your site. The
sitemap will give the spider links to all the major pages of your site,
allowing every page included on your sitemap to be indexed by the
spider. This is a very good thing! Having all of your major pages
included in the search engine database will make your site more likely to
come up in the search engine results when a user performs a query.
Your sitemap pushes the search engine toward the individual pages of
your site instead of making them hunt around for links. A well planned
site map can ensure your Web site is fully indexed by search engines.
Sitemaps are also very valuable for your human visitors. They help them
understand your site's structure and layout, while giving them quick
access to your entire site. A sitemap is also helpful for lost users in need
of a lifeline.
Often, if visitors find themselves lost or stuck inside your site, they will
begin to look for a way out. Having a detailed sitemap will
show them how to get back on track and find what they were looking for.
Without it, your visitor would have just closed the browser or headed
back over to the search engines. Conversion lost.
Your sitemap should be linked from your homepage. Linking it this way
will force search engines to find it that way and then follow it all the
way through the site. If it's linked from other pages it is likely the spider
will find a dead end along the way and just quit. Small sites can place
every page on their sitemap, but larger sites should not. You do not want
the search engines to see a never-ending list of links and assume you are
a link farm. Most SEO experts believe you should have no more than 25
to 40 links on your sitemap. This will also make it easier to read for your
human visitors. Remember, your sitemap is there to assist your visitors,
not confuse them. The title of each link should contain a keyword
whenever possible and should link to the original page. We recommend
writing a short description (10-25 words) under each link to help visitors
learn what the page is about. Having short descriptions will also
contribute to your depth of content with the search engines. Once
created, go back and make sure that all of your links are correct. If you
have 15 pages on your sitemap, then all 15 pages need to link to every
other sitemap page. Otherwise both visitors and search engine spiders
will find broken links and lose interest.
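(A minimal illustrative fragment of such a sitemap page; the page names and descriptions are hypothetical.)
<h1>Site Map</h1>
<ul>
  <li><a href="web-design.html">Web Design</a> - custom website design for small businesses.</li>
  <li><a href="seo-services.html">SEO Services</a> - on-page and off-page search engine optimization.</li>
  <li><a href="contact.html">Contact Us</a> - phone, e-mail and office address.</li>
</ul>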
Remember to Update!
Just as you can't leave your website to fend for itself, the same applies to
your sitemap. When your site changes, make sure your sitemap is
updated to reflect that. What good are directions to a place that's been
torn down? Keeping your sitemap current will make you an instant
favorite with visitors and search engines.
1. WHITE HAT
1. Internal Linking
By far one of the easiest ways to stop your website from ranking well on
the search engines is to make it difficult for search engines to find their
way through it. Many sites use some form of script to enable fancy drop-
down navigation and the like. Many of these scripts cannot be crawled by the
search engines, resulting in unindexed pages.
While many of these effects add visual appeal to a website, if you are
using scripts or some other form of navigation that will hinder the
spidering of your website, it is important to add text links to the bottom
of at least your homepage linking to all your main internal pages,
including a sitemap of your internal pages.
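(A sketch of such a footer text-link block; the page names are hypothetical.)
<p>
<a href="index.html">Home</a> |
<a href="services.html">Services</a> |
<a href="about.html">About Us</a> |
<a href="sitemap.html">Site Map</a>
</p>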
2. Reciprocal Linking
Exchanging links with other webmasters is a good way (not the best, but
good) of attaining additional incoming links to your site. While the value
of reciprocal links has declined a bit over the past year, they certainly
still have their place.
3. Content Creation
Don't confuse "content creation" with doorway pages and such. When
we recommend content creation we are discussing creating quality,
unique content that will be of interest to your visitors and which will add
value to your site.
The more content-rich your site is the more valuable it will appear to the
search engines, your human visitors, and to other webmasters who will
be far more likely to link to your website if they find you to be a solid
resource on their subject. Creating good content can be very time-
consuming; however it will be well worth the effort in the long run. As
an additional bonus, these new pages can be used to target additional
keywords related to the topic of the page.
You know more about your business than those around you, so why not
let everyone know? Whether it is in the form of articles, forum posts, or
a spotlight piece on someone else's website, creating content that other
people will want to read and post on their sites is one of the best ways to
build links to your website that don't require a reciprocal link back.
5. Site Optimization
The manipulation of your content, wording, and site structure for the
purpose of attaining high search engine positioning is the backbone of
SEO and the search engine positioning industry. Everything from
creating solid title and Meta tags to tweaking the content to maximize its
search engine effectiveness is key to any successful optimization effort.
2. BLACK HAT
1. Keyword Stuffing
Keyword stuffing is the practice of loading a page with a keyword, repeated
over and over, in an attempt to inflate its apparent relevancy; as noted
earlier, search engines watch for this and may penalize pages that use it.
2. Hidden Text
Hidden text is text set in the same color as the background, or very
close to it. While the major search engines can easily detect text set to
the same color as the background, some webmasters try to get around
this by creating an image file the same color as the text and setting the
image file as the background. While undetectable at this time to the
search engines, this is blatant spam, and websites using this tactic are
usually quickly reported by competitors and blacklisted.
3. Cloaking
Cloaking is the practice of serving one version of a page to search engine
spiders and a different version to human visitors.
4. Doorway Pages
Doorway pages are pages created purely to rank well for specific search
terms and funnel visitors on to another page.
5. Redirects
Redirects, in this context, automatically send a visitor from the page a
search engine indexed to a different page.
6. Duplicate Sites
A throwback tactic that rarely works these days. When affiliate programs
became popular, many webmasters would simply create a copy of the site
they were promoting, tweak it a bit, and put it online in hopes that it
would outrank the site it was promoting and capture its sales. As the
search engines would ideally like to see unique content across all of their
results, this tactic was quickly banned, and the search engines have
methods for detecting and removing duplicate sites from their index. If
a site is changed just enough to avoid automatic detection, with hidden
text or such, it can once again be reported to the search engines and be
banned that way.
The Internet! There has never been a venue of this magnitude for reaching
so many potential customers in the history of mankind.
Currently, there are over 391 million people online throughout the world,
and this number is growing daily at an unprecedented rate. Internet studies
now show that over 85% of the people online use search engines on a daily
basis when they're surfing the Internet.
M I S T A K E S M A D E B Y A N S E O : -
Many web designers fall into the trap of designing a fancy website that
looks great but includes elements that will cripple its search engine
rankings. It is a huge waste of time, effort, and sometimes money to
create a beautiful looking site that does not attract any visitors. What
good is all that beauty if no one can find it? Here are some common
design elements that should be avoided whenever possible.
• Irrelevant keywords:-
There is such a thing as using too many keywords that are unrelated to your
site. For example, a site selling carpeting equipment might stuff in
keywords like "David Beckham". Sure, tons of individuals searching for
David Beckham will end up at the site, only to realize there is no
connection with "David Beckham". Nothing is gained from such traffic. On
the other hand, having the right keywords can be essential to bringing in
more potential customers. Make sure the keywords used are specific and give
an accurate overall picture of the whole site.
• Too few links:-
If your site has a very limited number of ingoing and outgoing links,
chances are your traffic will be extremely limited too. The best remedy is
to exchange links with other sites. Links from other websites to your own
play an important part in increasing your page rank with the search
engines. However, it is also important that these links come from good
quality sites rather than low quality or banned sites; if not, they will
not contribute much worth to your site.
• Image Links:-
Using images instead of text links for your site's navigation menu can
confuse search engine bots and may make your site difficult for them to
spider completely. If you must use graphic elements for links, be sure to
also include a set of text links for the spiders to follow. Many sites do
this in their footer, either above or below the copyright information.
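One common pattern, sketched here with hypothetical file names: graphic
buttons for human visitors, plus a duplicate set of plain text links in the
footer for the spiders:

    <!-- Graphic navigation for visitors; the alt text also gives
         spiders something to read -->
    <a href="products.html"><img src="btn-products.gif" alt="Products"></a>
    <a href="contact.html"><img src="btn-contact.gif" alt="Contact"></a>

    <!-- Duplicate text links in the footer for the spiders -->
    <p><a href="products.html">Products</a> |
       <a href="contact.html">Contact</a></p>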
• Frames:-
If you use frames to display your site, you run the risk that the search
engine's spider will not pick up all of your content. Most bots will only
spider the first HTML file they encounter, ignoring your other frames. If
you use frames and notice that only a fraction of your pages make it
into the search engine's index, the frames could be the problem. Also,
many users are turned off by the sight of multiple frames and will not
stick around your site long enough to purchase your product or click an
advertisement.
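If frames are unavoidable, one partial safeguard is a noframes block
containing real content and text links, sketched here with hypothetical
file names:

    <frameset cols="25%,75%">
      <frame src="menu.html">
      <frame src="content.html">
      <noframes>
        <body>
          <!-- Readable content and text links for spiders and
               browsers that do not support frames -->
          <p><a href="menu.html">Menu</a> |
             <a href="content.html">Main content</a></p>
        </body>
      </noframes>
    </frameset>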
• Duplicate Content:-
It is very bad practice to use another website's content. If your site
contains exactly the same text as another website, and Google figures out
that your page was published later than the other site's, Google and the
other search engines will consider your site to be spamming. Duplicate
content will not necessarily result in an immediate penalty; however, your
ranking may well be negatively affected.
Furthermore, it is not recommended to have duplicate content, i.e. large
blocks of body text, on several pages of your own site. That means that if
you have three links that all lead to differently named pages containing
the same information, you are likely to be considered to be tricking the
search engines as well. This may result in a loss of relevance.
• Over Optimization:-
Over-optimization occurs when you put too much weight on your website's
search engine optimization; for example, when you have no real relevant
content for your visitors, only keyword-stuffed body text. Too high a
keyword density may increase the likelihood of an over-optimization
penalty. This also applies to your page title, your meta description and
meta keywords, and the ALT attributes of your images.
None of these elements should be stuffed with keywords, nor should they be
irrelevant. The title, for example, should always reflect the contents of
the page. Although you won't be penalized just for having lots of keywords
in the title, you will see a decrease in page relevancy that may well
result in a loss of search engine ranking.
Therefore, avoid optimizing your pages too aggressively.
S E O G L O S S A R Y : -
Search Engine Optimization (SEO) has become an essential weapon in the arsenal
of every online business. Unfortunately, for most business owners and marketing
managers (and even many webmasters), it's also somewhat of an enigma. This is
partly due to the fact that it's such a new and rapidly changing field, and partly
due to the fact that SEO practitioners tend to speak in a language all of their own
which, without translation, is virtually impenetrable to the layperson. This
glossary seeks to remedy that situation, explaining specialist SEO terms in plain
English...
• ALGORITHM
• ARTICLE PR
The submission of free reprint articles to many article submission sites and
article distribution lists in order to increase your website's search engine
ranking and Google Page Rank. (In this sense, the "PR" stands for Page Rank.)
Like traditional public relations, article PR also conveys a sense of authority
because your articles are widely published. And because you're proving your
expertise and freely dispensing knowledge, your readers will trust you and will
be more likely to remain loyal to you. (In this sense, the "PR" stands for
Public Relations.)
• ARTICLE SUBMISSION SITES
Websites which act as repositories of free reprint articles. They are sites
where authors can submit their articles free of charge, and where webmasters
can find articles to use on their websites free of charge. Article submission
sites generate revenue by selling advertising space on their websites.
• BACKLINKS
• COPY
• COPYWRITER
• CRAWL
Google finds pages on the World Wide Web and records their details in its index
by sending out ‘spiders’ or ‘robots’. These spiders make their way from page to
page and site to site by following text links. To a spider, a text link is like a door.
• DOMAIN NAME
The virtual address of your website (normally in the form
www.yourbusinessname.com). This is what people will type when they want to
visit your site. It is also what you will use as the address in any text links back to
your site.
• EZINES
An electronic magazine. Most publishers of ezines are desperate for content and
gladly publish well written, helpful articles and give you full credit as author,
including a link to your website.
• FLASH
• GOOGLE
The search engine with the greatest coverage of the World Wide Web, and which
is responsible for most search engine-referred traffic. Of approximately 11.5
billion pages on the World Wide Web, it is estimated that Google has indexed
around 8.8 billion. This is one reason why it takes so long to increase your
ranking!
• GOOGLE PAGE RANK (PR)
How Google scores a website's importance. It gives all sites a mark out of 10.
By downloading the Google Toolbar, you can view the PR of any site you visit.
• GOOGLE TOOLBAR
A free tool you can download. It becomes part of your browser toolbar. Its most
useful features are its Page Rank display (which allows you to view the PR of
any site you visit) and its AutoFill function (when you're filling out an online
form, you can click AutoFill and it enters all the standard information
automatically, including name, address, zip code/postcode, phone number, email
address, and business name, plus a password-protected credit card number). Once
you've downloaded and installed the toolbar, you may need to set up how you'd
like it to look and work by clicking Options (setup is very easy). NOTE: Google
does record some information (mostly regarding sites visited).
• HTML
HTML (Hypertext Markup Language) is the coding language used to create much
of the information on the World Wide Web. Web browsers read the HTML code
and display the page that code describes.
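A minimal, purely illustrative example of HTML code and the page it
describes:

    <html>
      <head>
        <title>My Page</title>
      </head>
      <body>
        <!-- The browser reads this markup and displays the paragraph -->
        <p>Hello, World Wide Web.</p>
      </body>
    </html>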
• INTERNET
• JAVASCRIPT
• KEYWORD
A word which your customers search for and which you use frequently on your
site in order to be relevant to those searches. This is known as targeting a
keyword. Most websites actually target 'keyword phrases' because single
keywords are too generic and it is very difficult to rank highly for them.
• KEYWORD DENSITY
A measure of how often a keyword appears on a page, expressed as a percentage
of the page's total word count. Too high a keyword density may trigger an
over-optimization penalty.
• KEYWORD PHRASE
A phrase which your customers search for and which you use frequently on your
site in order to be relevant to those searches.
• LINK
A word or image on a web page which the reader can click to visit another page.
There are normally visual cues to indicate to the reader that the word or image is a
link.
• LINK PATH
Using text links to connect a series of pages (i.e. page 1 connects to page 2,
page 2 connects to page 3, page 3 connects to page 4, and so on). Search engine
'spiders' and 'robots' use text links to jump from page to page as they gather
information, so it's a good idea to allow them to traverse your entire site via
text links.
• LINK PARTNER
A webmaster that is willing to put a link to your website on their website. Quite
often link partners engage in reciprocal linking.
• LINK POPULARITY
The number of links to your website. Link popularity is the single most important
factor in a high search engine ranking. Webmasters use a number of methods to
increase their site's link popularity including article PR, link exchange (link
partners / reciprocal linking), link buying, and link directories.
• LINK TEXT
The part of a text link that is visible to the reader. When generating links to your
own site, they are most effective (in terms of ranking) if they include your
keyword.
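For example (the keyword phrase and URL are invented), link text that
contains your keyword is far more effective for ranking than generic
wording:

    <!-- Less effective: generic link text -->
    <a href="http://www.example.com/">Click here</a>

    <!-- More effective: the keyword phrase as the link text -->
    <a href="http://www.example.com/">handmade leather wallets</a>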
• META TAG
A short note within the header of the HTML of your web page which describes
some aspect of that page. These meta tags are read by the search engines and used
to help assess the relevance of a site to a particular search.
• NATURAL SEARCH RESULTS
The ‘real’ search results. The results that most users are looking for and which
take up most of the window. For most searches, the search engine displays a long
list of links to sites with content which is related to the word you searched for.
These results are ranked according to how relevant and important they are.
• RANK
Your position in the search results that display when someone searches for a
particular word at a search engine.
• RECIPROCAL LINK
A mutual agreement between two webmasters to exchange links (i.e. they both
add a link to the other’s website on their own website). Most search engines
(certainly Google) are sophisticated enough to detect reciprocal linking and they
don’t view it very favorably because it is clearly a manufactured method of
generating links. Websites with reciprocal links risk being penalized.
• ROBOTS.TXT FILE
A file which is used to inform the search engine spider which pages on a site
should not be indexed. This file sits in your site's root directory on the web
server. (Alternatively, you can do a similar thing by placing tags in the
header section of your HTML for search engine robots/spiders to read.)
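As a sketch of the in-page alternative mentioned above, a robots meta tag in
the HTML header asks spiders not to index a page (the robots.txt file itself
is a plain text file of User-agent and Disallow lines):

    <head>
      <!-- Ask robots not to index this page or follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>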
• SANDBOX
Many SEO experts believe that Google 'sandboxes' new websites. Whenever it
detects a new website, it withholds that site's rightful ranking for a period
while it determines whether the site is a genuine, credible, long-term site. It
does this to discourage the creation of SPAM websites (sites which serve no
useful purpose other than to boost the ranking of some other site). Likewise,
if Google detects a sudden increase (i.e. many hundreds or thousands) in the
number of links back to your site, it may sandbox them for a period (or in fact
penalize you by lowering your ranking or blacklisting your site altogether).
• SEO
Search Engine Optimization. The art of making your website relevant and
important so that it ranks high in the search results for a particular word.
• SEO COPYWRITER
A ‘copywriter’ who is not only proficient at web copy, but also experienced in
writing copy which is optimized for search engines (and will therefore help you
achieve a better search engine ranking for your website).
• SEARCH ENGINE
A search engine is an online tool which allows you to search for websites which
contain a particular word or phrase. The most well known search engines are
Google, Yahoo, and MSN.
• SITE MAP
A single page which contains a list of text links to every page in the site (and
every page contains a text link back to the site map). Think of your site map as
being at the center of a spider-web.
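A sketch of a simple site map page (with hypothetical page names),
containing a text link to every page in the site:

    <h1>Site Map</h1>
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="products.html">Products</a></li>
      <li><a href="about.html">About Us</a></li>
      <li><a href="contact.html">Contact</a></li>
    </ul>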
• SPAM
• SPIDER
Google finds pages on the World Wide Web and records their details in its index
by sending out ‘spiders’ or ‘robots’. These spiders make their way from page to
page and site to site by following text links.
• SPONSORED LINKS
Paid advertising which displays next to the natural search results. Customers can
click on the ad to visit the advertiser’s website. This is how the search engines
make their money. Advertisers set their ads up to display whenever someone
searches for a word which is related to their product or service. These ads look
similar to the natural search results, but are normally labeled “Sponsored Links”,
and normally take up a smaller portion of the window. These ads work on a Pay-
Per-Click (PPC) basis (i.e. the advertiser only pays when someone clicks on their
ad).
• SUBMIT
You can submit your domain name to the search engines so that their ‘spiders’ or
‘robots’ will crawl your site. You can also submit articles to ‘article submission
sites’ in order to have them published on the Internet.
• TEXT LINKS
A word on a web page which the reader can click to visit another page. Text links
are normally blue and underlined. Text links are what ‘spiders’ or ‘robots’ use to
jump from page to page and website to website.
• URL
• WEB COPYWRITER
• WEBMASTER
• WORDCOUNT
The number of words on a particular web page.
• WORLD WIDE WEB
The vast array of documents published on the Internet. It is estimated that the
World Wide Web now consists of approximately 11.5 billion pages.
WE DO NOT CLAIM THAT OUR WORK IS UNIQUE, BUT WE CLAIM THAT
OUR EFFORTS ARE ALWAYS UNIQUE.
C O N C L U S I O N : -
The process of SEO is not easy to tackle, largely because so many pieces
of a site factor into the final results. Promoting a site that writers on the
web are unlikely to link to is as deadly as creating a fantastic website no
one will see. SEO is also a long-term process, both in application and
results - those who expect quick rankings after completing a few
suggestions in this guide will be deeply disappointed. Search engines
can often be frustratingly slow to respond to improvements that will
eventually garner significant boosts in traffic.
An optimization campaign also takes time. Search engines may not see
or react to changes you’ve made on your site or links you’ve received
for months. For very small companies, it may be smart to run your own
optimization campaign. But for most businesses, it is smart to use a
professional search engine optimization company.
Patience is not the only virtue that should be used for successful SEO.
The strategy itself must have a strong foundation in order to succeed.
B I B L I O G R A P H Y & R E F E R E N C E S : -
B O O K S : -
SEO E-BOOK.
W E B S I T E S : -
www.searchenginechannel.com
www.searchenginewatch.com
www.seobook.com
www.seoworld.com
www.searchengines.com