
AI SEARCH ENGINE

A PROJECT REPORT

Submitted by
Kushank Chauhan(21BCS7352)
Sonu Kumar (21BCS7403)
Simran (21BCS7355)
Pushkar (21BCS7292)
Roshan (21BCS7447)

in partial fulfilment of the requirements for the award of the degree of

BACHELOR OF ENGINEERING
IN
COMPUTER SCIENCE & ENGINEERING

Chandigarh University
September 2023
BONAFIDE CERTIFICATE

Certified that this project report "AI SEARCH ENGINE" is the bonafide work of "KUSHANK CHAUHAN, SONU KUMAR, SIMRAN, PUSHKAR, ROSHAN", who carried out the project work under our supervision.

<<Signature of the HoD>>                      <<Signature of the Supervisor>>

SIGNATURE                                     SIGNATURE

<<Name of the Head of the Department>>        <<Name>>
HEAD OF THE DEPARTMENT                        SUPERVISOR
<<Department>>                                <<Academic Designation>>
                                              <<Department>>

Submitted for the project viva-voce examination held on.

INTERNAL EXAMINER EXTERNAL EXAMINER


TABLE OF CONTENTS

CHAPTER 1. INTRODUCTION .......................................................... 11
1.1. Identification of Client/Need/Relevant Contemporary Issue .................. 11
1.2. Identification of Problems ................................................. 11
1.3. Identification of Tasks .................................................... 11
1.4. Timeline ................................................................... 11
1.5. Organization of the Report ................................................. 11

CHAPTER 2. LITERATURE REVIEW/BACKGROUND STUDY ................................... 12
2.1. Timeline of the Reported Problem ........................................... 12
2.2. Existing Solutions ......................................................... 12
2.3. Bibliometric Analysis ...................................................... 12
2.4. Review Summary ............................................................. 12
2.5. Problem Definition ......................................................... 12
2.6. Goals/Objectives ........................................................... 12

CHAPTER 3. DESIGN FLOW/PROCESS .................................................. 13
3.1. Evaluation & Selection of Specifications/Features .......................... 13
3.2. Design Constraints ......................................................... 13
3.3. Analysis of Features and Finalization Subject to Constraints ............... 13
3.4. Design Flow ................................................................ 13
3.5. Design Selection ........................................................... 13
3.6. Implementation Plan/Methodology ............................................ 13

CHAPTER 4. RESULTS ANALYSIS AND VALIDATION ...................................... 14
4.1. Implementation of Solution ................................................. 14

CHAPTER 5. CONCLUSION AND FUTURE WORK ........................................... 15
5.1. Conclusion ................................................................. 15
5.2. Future Work ................................................................ 15

REFERENCES ...................................................................... 16
APPENDIX ........................................................................ 17
1. Plagiarism Report ............................................................ 17
2. Design Checklist ............................................................. 17
USER MANUAL ..................................................................... 18
ABSTRACT

A search engine is software for searching the World Wide Web. In the modern world, we use search engines many times a day and often wonder how they really work. We search the web for weather conditions, driving directions, recipes, and more; we really look for whatever comes to mind. Without search engines, we would be practically powerless today. The main goal of this report is to understand how people interact with search engines. The way people search and their expectations of machines are changing rapidly, and AI-powered search is playing a pivotal role.

CHAPTER NO. 1
1.1 INTRODUCTION

Artificial intelligence (AI) is a broad field of computer science focused on building smart machines capable of performing tasks that normally require human intelligence. AI is a multidisciplinary field with many approaches, but advances in machine learning, especially deep learning, are causing paradigm shifts in practically every technology industry. With the help of artificial intelligence, machines can replicate and even enhance the capabilities of the human mind. From increasingly capable self-driving cars to the rise of smart assistants like Siri and Alexa, artificial intelligence is becoming more ubiquitous, and companies across all industries are investing in it.

In the contemporary world, artificial intelligence has become integrated into many different aspects of people's lives. It is used in education, healthcare, investing, audience analysis, cybersecurity, as well as in home life and transportation. AI, which people initially feared would make them lose control of their lives, has instead become a kind of personal assistant that helps them get through their daily routines. Even Steve Wozniak, the co-founder of Apple, retracted his statement that artificial intelligence would one day replace people. One useful application of artificial intelligence is online search. AI has helped people retrieve reliable, contextual information from the internet through search engines and search engine optimization. However, there are concerns about the use of AI in search. While it helps people pull information from the internet efficiently, purely context-based search mechanisms can also surface content that people do not want, including racist, ageist, and sexist material.

We often hear about artificial intelligence and its use in search engines. AI has brought huge benefits to search. Before AI was incorporated, search engines worked mostly by matching the query against an index. This approach invited keyword spamming: spammers stuffed websites with keywords, and search engines surfaced those pages because they had no reliable way to identify genuinely relevant information.

This spamming led to a poor user experience, as more and more irrelevant websites showed up on the first page of search results. Search engines responded with software updates that devalued or banned websites engaged in spamming or carrying thin, irrelevant content.

In recent years, search engines have incorporated AI into their software and thereby produce more accurate, spam-free results. This incorporation of AI has caused a massive shift in how a search engine works. Search engines can now understand context and read content much as humans do. They also learn from user behaviour and deliver precise answers to users' queries.

Now, search engines are an integral part of our lives, and we interact with them in many ways. The next big change will be voice search. The integration of AI and speech recognition is leading to more voice-based search queries and answers. You can now literally talk to a search engine: it can listen, understand, give appropriate answers to your questions, and even take action.

In short, in the near future you will interact with a search engine much as you would with a friend. You will converse with it and take actions, such as calling friends or ordering online, using your voice. Search engines now have enormous capabilities never seen before, and they will continue to interact with you and influence your life in many ways.

1.2 IDENTIFICATION OF PROBLEMS

1. A lot of 404 errors

You can crawl your website or check whether the Google search results list any error pages. A quick way to check is to type site:website.com into Google, which lists all of your indexed pages for review. Nobody wants to visit a web page whose links, or even some of them, no longer work. So why would Google include a page with a lot of broken links in its search results?

2. Lack of links

Lack of links is without a doubt the main problem I observe on client websites that keeps them from rising in search engine results. Most of you probably already know this, but it still needs saying: the importance of ongoing link development is poorly understood by a large portion of site owners. If you build links effectively enough, search engines will overlook numerous other problems. It is crucial to obtain links from reputable websites, popular websites, and websites that are relevant to your own.

3. Unclean URLs

Search engine spiders must be extremely efficient, so they are wary of anything that looks risky, even if it does not actually pose a problem. Dynamically generated sites put them at risk because the spiders can get trapped in an endless loop on the website. So be sure to remove any special characters from your URLs, including question marks, equals signs, ampersands, and others. Compared to a shorter URL without special characters, URLs with lengthy, complex query strings have a tougher time being indexed.

4. Not Enough Patience

Being patient may be the largest mental adjustment clients need to make. Implementing effective SEO takes patience. There are methods that can boost rankings much more quickly, similar to how crash diets can be effective in the short term but harm you over the long term. Yet, with the right approach, you can be well on your way to implementing possibly the most cost-effective kind of marketing available today. You want to make sure you do adequate work up front, which regrettably requires spending more time and resources, both of which are typically scarce.
5. Repetitive Title Tags

Repetitive or duplicate title tags are another common problem. Using the same title tag on many pages, using title tags that are too long or confusing, or using irrelevant or misleading title tags makes it harder for search engines to tell pages apart and can hurt rankings, since search engines expect each page to carry a meaningful, unique title that reflects its content.

1.3 IDENTIFICATION OF TASKS

A search engine's three primary tasks are gathering data about websites, categorising those websites, and developing an algorithm that makes it simple for users to locate pertinent web pages. Google, for instance, is by far the most well-known search engine.

1.4 TIMELINE
CHAPTER NO. 2

2.1 TIMELINE OF THE REPORTED PROBLEM

An HTTP status code of 404 indicates that the requested webpage or resource could not be
located on the server. The user is usually sent the error message "404 Not Found." Here's a
timeline of 404 errors:

In the early days of the World Wide Web, the earliest version of HTTP (HTTP/0.9) did not define a standardised response for when a requested resource was not found. Instead, servers would frequently send a generic error message like "File not found" or "Page not found," giving the user little information about what went wrong.

The 404 status code was standardised with HTTP/1.0 (RFC 1945, 1996) and carried forward in the HTTP/1.1 specification published in 1997. This allowed web servers to communicate more detailed information about the error to the user, such as the type of resource that was not found and whether the condition was temporary or permanent.

Since then, 404 errors have become a common occurrence on the web, as users often
mistype URLs or follow broken links. Web developers and designers have implemented
various strategies to minimise 404 errors, such as creating custom error pages with
helpful information and redirecting users to relevant content.

In the early days of the internet, dirty URLs, also known as "ugly" or "non-SEO-friendly" URLs, were widely used. These URLs generally featured a long string of parameters, letters, and numbers, making them difficult for users to remember or share, and they often lacked information about the page's content. For instance, a typical dirty URL might look like this: http://www.example.com/index.php?id=12345&sort=asc&page=1

However, as search engine optimisation (SEO) and user-friendly site design became more prominent, clean URLs became increasingly common. Clean URLs are more descriptive, shorter, and simpler to read and remember. They also incorporate pertinent keywords, which makes them more search engine friendly. For instance, a clean URL might look like this: http://www.example.com/products/widgets

Clean URLs are now used by the majority of websites as a best practice for boosting user experience and search engine visibility, although unclean URLs may still be found on some outdated websites or legacy systems.

Title tags have developed over time, but one issue has remained: repetitive or spammy title tags. In the early days of the internet, it was common for website owners to cram as many keywords as possible into their title tags in order to rank better in search engines. This technique, known as "keyword stuffing," was quickly identified as spammy and manipulative.

As search engines improved, they began to penalise websites that used keyword stuffing and other manipulative techniques. As a result, title tags have become more natural and descriptive, accurately reflecting the content of the page.

Even today, though, some website owners use repetitive or spammy title tags in an attempt to influence search rankings. This might include using the same title tag on many pages, using title tags that are too long or confusing, or using irrelevant or deceptive title tags. Websites that engage in these practices continue to be penalised by search engines, and website owners are usually advised to use meaningful, unique title tags that accurately represent the content of each page.

2.2 EXISTING SOLUTIONS

1. A lot of 404 errors:

Google, for example, uses a complex algorithm to analyse the relevance and quality of web pages in order to deliver the most accurate search results for a user's query. Even the most sophisticated search engines, however, encounter 404 errors, which occur when a requested page or resource on a website is not found. Here are several ways to deal with a high number of 404 errors in an AI-powered search engine:

Crawl the website on a regular basis: Make sure the search engine scans the website on a
frequent basis to find any broken or missing links. This helps to guarantee that the search
engine's index is current and free of obsolete or erroneous material.

Implement redirects: Redirect any broken or missing links on the website to appropriate sites or
resources. This ensures that consumers are routed to the proper material while still allowing
search engines to crawl and index the pages.

Remove dead links: To avoid 404 errors, remove any broken or outdated links from your
website.
Improve website structure: Make sure the website is well organised and has a clear page hierarchy. This makes it easier for search engines to explore the site and identify relevant material.

Use a custom 404 page: Create a personalised 404 page that offers users useful information and guides them to relevant material on your website. This improves the user experience and can also help search engines understand the structure of the site.

By applying these solutions, you can minimise the number of 404 errors encountered by an AI-based search engine, improving the user experience and helping the search engine provide more accurate results.
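As a rough illustration of the regular crawling and dead-link checks described above, the following Python sketch requests a list of URLs and reports those that return a 404. The URL list, the timeout value, and the use of the requests library are illustrative assumptions, not part of the original report.

# Minimal 404 checker sketch (illustrative; URLs and settings are assumptions).
import requests

def find_broken_pages(urls, timeout=10):
    """Return the subset of `urls` that respond with HTTP 404."""
    broken = []
    for url in urls:
        try:
            # HEAD keeps the check lightweight; fall back to GET if HEAD is rejected.
            response = requests.head(url, allow_redirects=True, timeout=timeout)
            if response.status_code == 405:
                response = requests.get(url, allow_redirects=True, timeout=timeout)
            if response.status_code == 404:
                broken.append(url)
        except requests.RequestException as exc:
            # Network errors are reported separately from clean 404s.
            print(f"Could not check {url}: {exc}")
    return broken

if __name__ == "__main__":
    pages = [
        "http://www.example.com/products/widgets",
        "http://www.example.com/old-page-that-may-be-gone",
    ]
    for url in find_broken_pages(pages):
        print("404:", url)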

2. Lack of links:

A search engine's ability to crawl and index web pages depends on the availability of links to those pages. A lack of links pointing to a website or to particular pages can hurt its visibility and search engine rankings. Here are several ways to address a lack of links in an AI-powered search engine:

Create excellent content: The best strategy for gaining links from other websites is to publish good, interesting material on your own. Make an effort to create material that is distinctive, engaging, and gives value to your audience. This may include blog posts, whitepapers, infographics, videos, and other formats.

Engage in link building and promotion: To increase exposure and attract links, share your material on social media and engage with your audience.

Make linkable assets: Make resources, tools, or other materials that other websites may
link to. You might, for example, make an industry report, a calculator, or an interactive
infographic.

By applying these techniques, you can attract more links to your website, which can boost its exposure and its rankings in an AI-based search engine.

3. A clean URL:

Follow these steps to build a clean URL for an AI-powered search engine:

Remove extraneous characters: Remove any special characters, symbols, or spaces to make the URL simpler and easier to understand.

Use keywords: Include relevant keywords in the URL to help people and search engines understand what the page is about; this can also help your website's SEO (search engine optimisation).

Use hyphens: Separate words in the URL with hyphens (-) to make it easier to read and remember.

Keep it brief: Keep the URL as short as feasible while still conveying the relevant information; a shorter URL is easier to remember and share.

Make it descriptive: Make sure the URL describes the page's content, to help users and search engines understand what the page is about.

Use lowercase: Use lowercase characters in the URL to avoid confusion caused by case sensitivity.

You may establish a clean and user-friendly URL for your AI-based search engine by
following these steps.
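A minimal sketch of the URL-cleaning steps listed above (remove special characters, use hyphens, keep it short, and lowercase everything) might look like the following Python function; the exact rules and the 60-character limit are illustrative assumptions.

# Clean-URL (slug) sketch: lowercase, hyphen-separated, no special characters.
import re

def make_clean_url(base, title, max_length=60):
    """Turn a page title into a short, descriptive, lowercase URL path."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop special characters
    slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # spaces -> single hyphens
    slug = slug[:max_length].rstrip("-")            # keep it brief
    return f"{base.rstrip('/')}/{slug}"

# Example: a "dirty" query-string page becomes a descriptive clean URL.
print(make_clean_url("http://www.example.com/products", "Blue Widgets & Accessories (2023)"))
# -> http://www.example.com/products/blue-widgets-accessories-2023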

4. Repetitive Title tags:


Repetitive title tags can have a detrimental influence on a website's user experience and search engine rankings. Here are some ways to address this issue with AI-powered search engines:

Use natural language processing: AI-powered search engines can analyse web page content with natural language processing techniques to detect similar or identical title tags. This helps webmasters identify and correct redundant title tags.

Use dynamic title tags: AI-powered search engines can analyse user behaviour and preferences to recommend dynamic title tags based on the user's search query. This helps avoid redundant title tags and provides a more personalised user experience.

Apply machine learning: Machine learning techniques can analyse title tags and find patterns that signal repetition, helping webmasters identify and fix redundant title tags more effectively.

Run A/B tests: AI-powered search engines can help test alternative title tags to see which ones are most effective at increasing user engagement and search engine rankings, helping webmasters create more effective title tags while avoiding redundancy.

By adopting these tactics, webmasters can avoid repetitive title tags and improve the user experience and search engine rankings of their websites.
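As a small, hedged sketch of the duplicate-detection idea above, the following Python snippet groups pages by title tag and flags titles used on more than one page. The page list and the use of exact-match grouping (rather than full NLP similarity) are simplifying assumptions for illustration.

# Duplicate title-tag detector sketch (exact-match grouping as a simplification).
from collections import defaultdict

def find_repetitive_titles(pages):
    """`pages` maps URL -> title tag; returns titles that appear on multiple URLs."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

pages = {
    "http://www.example.com/products/widgets": "Widgets | Example Store",
    "http://www.example.com/products/gadgets": "Widgets | Example Store",
    "http://www.example.com/about": "About Us | Example Store",
}
for title, urls in find_repetitive_titles(pages).items():
    print(f"Repeated title '{title}' on: {', '.join(urls)}")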

2.3 BIBLIOMETRIC ANALYSIS

In AI-based search, bibliometric analysis entails employing quantitative approaches to investigate patterns and trends in the literature linked to artificial intelligence (AI). This sort of study can reveal insights into the AI research landscape, such as which fields are garnering the most attention, which authors and institutions are doing the most work, and which journals and conferences are publishing the most significant work.

Citation analysis is a standard bibliometric technique for identifying highly cited publications, authors, and journals. It can help AI researchers and practitioners identify crucial concepts, influential experts, and relevant venues for research dissemination. Furthermore, co-citation analysis may be used to discover clusters of related research, which can help identify new trends and areas of overlap between distinct AI subfields.

Topic modelling is another helpful bibliometric approach; it uses machine learning techniques to discover latent themes or subjects in massive volumes of text. Topic modelling may be used to identify the most relevant AI research subjects and to examine how these themes have evolved over time.
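As an illustrative sketch of the topic-modelling idea described above, the snippet below fits a small LDA model over a handful of toy abstracts using scikit-learn; the corpus, the number of topics, and the choice of scikit-learn's LatentDirichletAllocation are all assumptions made for the example, not part of the analysis reported here.

# Toy topic-modelling sketch with scikit-learn's LDA (corpus and settings are assumptions).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "deep learning improves image recognition in search engines",
    "search engine optimization and keyword ranking for web pages",
    "neural networks and deep learning for natural language queries",
    "crawling indexing and ranking web pages for relevant results",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(abstracts)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the top words per discovered topic.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")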

Overall, bibliometric analysis might be a valuable technique for comprehending the AI research
environment and finding interesting topics for future study. Researchers and practitioners in AI
can discover insights that would be difficult or impossible to achieve using traditional
qualitative approaches by employing quantitative tools to analyse massive volumes of data.

In Figure 1a, the publishing output has been summarised. It is evident that the number of publications in the field of artificial intelligence published globally has been increasing. This growth has been partitioned into three time periods: 2007 to 2011 (older articles), 2012 to 2015, and 2016 to 2020 (recent articles). The number of publications surged between 2007 and 2011, remained steady between 2012 and 2015, and then increased significantly between 2016 and 2020. Figure 1b (which only includes citations from the WoS Core Collection) illustrates the continual exponential growth in the number of citations, from 1 in 2007 to 1,815 in 2020.

In all, 61 nations are contributing to work on artificial intelligence in the IT industry. Although it would not be appropriate to compare publications by nation directly, this is highlighted here to illustrate the dispersion of the research domain. The top nations, as determined by their participation, are shown in Figure 3: 321 publications from China, 96 from the United States, 77 from Iran, 68 from Turkey, 67 from India, 57 from France, 54 from Germany, 36 from Canada, 31 from Italy, 31 from Spain, and 158 from elsewhere. The USA was found to be the second-largest contributor to this research field, after China.

PROS AND CONS:

Efficiency and scalability

AI can create content much faster than people, which is probably the biggest benefit. An AI tool
can produce an article in minutes. It would take a human writer much longer to do all the
research and write it.

Multiply the quick turnaround by the number of articles, and an AI tool can produce a
significant amount of content.

AI also helps with language localization for different geographic areas and can create personalized social media content for various platforms.

Cost-effective

Hiring quality content writers typically costs a few hundred dollars per project, depending on
the length of the article, the number of pieces and the needed technical knowledge. And this
may be money well spent for high-quality, well-researched content.

Some AI writing tools are free, while others charge a monthly subscription rate. The pricing
typically runs about $100 for tens of thousands of words.

AI-generated content may be better suited for simpler content than articles needing expertise
and authority.

Improves SEO
AI content generators scan thousands of online documents to absorb information. From all these documents, the generators choose keywords to improve search engine optimization (SEO). The AI tool can suggest keywords for the content writer. By using these keyword suggestions throughout an article, the content appears higher in the search engine rankings, provided it follows the rest of the guidelines about being authoritative and being written by a person.
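As a very rough sketch of how a tool might suggest keywords from reference documents, the snippet below counts word frequencies (minus common stopwords) across sample texts; the documents, the stopword list, and the frequency-based heuristic are assumptions for illustration and are far simpler than what commercial AI tools actually do.

# Naive keyword-suggestion sketch: rank words by frequency across reference texts.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "is", "are", "on", "with"}

def suggest_keywords(documents, top_n=5):
    counts = Counter()
    for text in documents:
        words = re.findall(r"[a-z]+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [word for word, _ in counts.most_common(top_n)]

docs = [
    "AI powered search engines rank pages by relevance and quality.",
    "Search engine optimization helps pages rank for relevant keywords.",
]
print(suggest_keywords(docs))   # e.g. ['search', 'pages', 'rank', ...]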

Overcome writer's block

Writer's block is a common hurdle for many people. At times, writers may have trouble creating
authoritative content for a subject they know little about.

To help overcome this hurdle, AI tools can create detailed outlines and key points to help the
writer determine what should be included in the article. AI tools can help the person overcome
writer's block and spark ideas to get started.

There are some considerations for AI-generated content. Some content is best written by a
human writer. Here are some cons of AI-generated content.

Quality concerns and possible plagiarism

AI relies on data and algorithms for content. The intended tone may get lost. AI tools can cover
black and white areas of a topic, but gray areas are more subjective.

Search engines may also flag the content as being similar to already published material, since it pulls from the same sources. AI tools piece content together from various sites and reword it. Without adding proper flow, this process goes against Google's "stitching and combining content" guidelines. Content needs to be authoritative and informative, which is hard to achieve when piecing information together from various sites without proper human review.

Algorithms devalue content

Google released its helpful content update in August 2022, which highlights "helpful content written by people, for people." It goes on to state that the search engine looks for content created by humans that offers a cohesive and satisfying experience, rather than content written primarily for SEO.

The update aims to demote content that is created strictly to rank higher in search engine results. AI tools optimise for SEO signals without truly understanding the text, so the results focus on keywords rather than on being informative to the reader.

Lack of creativity and personalization

Creative content makes articles more engaging. People tend to share articles they feel a connection to, but AI does not have the emotional intelligence to craft a story; instead, it adds facts to an outline.

AI relies on existing web content and data to develop wording. It does not understand the user intent behind queries and still lacks a common-sense grasp of human behavior.

Human editing still required

People still need to read through AI-generated content. It might save time, but people still need to be involved, and articles must be quality-checked. AI tools combine information from several websites into one piece, so there may be mix-ups to fix, such as product descriptions that confuse textures and colors, because AI tools do not truly understand what adjectives mean.

Can't generate new ideas

AI tools use existing data for content, which means they cannot come up with fresh ideas or original material. This makes it hard to produce new content around the latest trending ideas and topics.

Ways to use AI-generated content

AI-generated content is best used as a writing assistant instead of relying strictly on technology.
Here are some ways to use AI tools for assistance with content:

 Research. For writers having issues organizing a topic or coming up with ideas, AI-
generated content can help them get started. Some tools give ideas about what to include
for broader topics to help narrow down the research process.
 Overcome writer's block. For writers who know their keyword or topic, AI-generated
content tools can help them get started by offering a few hundred words on the subject.
Some tools recommend headers so writers can get moving and adapt their content.
 Proofread current material. To make sure a drafted article is optimized, writers can
run it through AI tools for a grade. The tool can also highlight keywords and phrases
that should be used. AI tools can also assist with checking grammar and correcting
spelling mistakes.
 Write short content. AI tools can produce a lot of content in a short amount of time, so
they are a great way to reduce boredom with repetitive tasks. While some
communications require more of an emotional side, some short descriptions do not.
Product descriptions, metatags, ad copy and social media posts are examples of short
text for content generators.
 Translate language. For written material to appeal to all audiences, AI generators can
help translate content into different languages.
 Create templates. AI tools can help create emails or other templates. Some AI tools
offer different types of ready-made templates for people to plug in customized
information.

REFERENCES

[1] Keong, B. V., & Anthony, P. (2011, June). Meta search engine powered by DBpedia. In 2011 International Conference on Semantic Technology and Information Retrieval (pp. 89-93). IEEE.
[2] Ma, Y., Ping, K., Wu, C., Chen, L., Shi, H., & Chong, D. (2020). Artificial Intelligence powered Internet of Things and smart public service. Library Hi Tech, 38(1), 165-179.
[3] Gozzo, M., Woldendorp, M. K., & De Rooij, A. (2022, February). Creative collaboration with the "brain" of a search engine: Effects on cognitive stimulation and evaluation apprehension. In ArtsIT, Interactivity and Game Creation: Creative Heritage. New Perspectives from Media Arts and Artificial Intelligence. 10th EAI International Conference, ArtsIT 2021, Virtual Event, December 2-3, 2021, Proceedings (pp. 209-223). Cham: Springer International Publishing.
[4] Kanowitz, S. (2023). AI-powered career recommendation engine delivers more job options. https://gcn.com/cloud-infrastructure/2023/02/ai-powered-career-recommendation-engine-delivers-more-job-options/382792/
[5] Wilkin, N. (2022, April). Demo of Graide: AI powered assistive grading engine. In Proceedings of the Ninth ACM Conference on Learning@Scale. Association for Computing Machinery (ACM).
[6] Talarico, D. (2018). Getting your SEO ready for AI: Artificial intelligence and search engine optimization. Recruiting & Retaining Adult Learners, 20(6), 3-3.
[7] Mohadikar, U., Chattergee, O., Bhaisare, K., Kurve, N., Sarode, R., Puri, S., & Bairagi, R. A. Automated search engine by virtual assistant using artificial intelligence. www.ijrpr.com, ISSN 2582-7421.
[8] Chen, T. J. (2023). ChatGPT and other artificial intelligence applications speed up scientific writing. Journal of the Chinese Medical Association, 10-1097.
[9] Ha, N. T. T. (2022). Application of AI and machine learning in search engine optimization. Technology and Society Studies, 397.
[10] Russell, D. M. (2015). What do you need to know to use a search engine? Why we still need to teach research skills. AI Magazine, 36(4), 61-70.
[11] Lee, T., Kim, S., & Kim, K. (2019, October). A research on the vulnerabilities of PLC using search engine. In 2019 International Conference on Information and Communication Technology Convergence (ICTC) (pp. 184-188). IEEE.
[12] Kim, C., Sung, R. J., Ahn, S. G., Min, J., & Kwon, K. W. (2018, May). Low power search engine using non-volatile memory based TCAM with priority encoding and selective activation of search line and match line. In 2018 IEEE International Symposium on Circuits and Systems (ISCAS) (pp. 1-4). IEEE.
[13] Wang, N., Wen, X., Zhu, J., & Jiao, J. (2022, March). Design of Intelligent Power Search Engine Selection System Based on Micro Service Architecture. In Cyber Security Intelligence and Analytics: The 4th International Conference on Cyber Security Intelligence and Analytics (CSIA 2022), Volume 2 (pp. 846-850). Cham: Springer International Publishing.
[14] Kejriwal, M. (2021). A meta-engine for building domain-specific search engines. Software Impacts, 7, 100052.
[15] Daugherty, P. R., Wilson, H. J., & Chowdhury, R. (2018). Using artificial intelligence to promote diversity. MIT Sloan Management Review.

2.4 REVIEW SUMMARY:

A search engine is software for searching the World Wide Web. In the modern world, we use search engines many times a day and often wonder how they really work. We search the web for weather conditions, driving directions, recipes, and more; we really look for whatever comes to mind. Without search engines, we would be practically powerless today. The main goal of this report is to understand how people interact with search engines. The way people search and their expectations of machines are changing rapidly, and AI-powered search is playing a pivotal role.

As AI systems make predictions, they learn from those predictions to get smarter over time.
You see this in common consumer products like Gmail's Smart Compose feature.
A few years ago, Gmail could predict simple phrases and words that you intended to type next.
Today, it has learned so well from billions of emails, it can now finish entire sentences for you.
That same type of AI technology is now getting good enough to write entire articles on its own.
This ability to learn makes AI more powerful than the traditional software that came before it.
It's also why AI is fundamental to any search engine being used today.
Today's search is way too complex for humans or traditional machines to handle. It's estimated
that Google alone processes a whopping 63,000 search queries every second, or north of two
trillion searches a year.
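(At 63,000 queries per second, that works out to roughly 63,000 × 86,400 × 365 ≈ 2 × 10^12, or about two trillion, queries per year.)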
Even the largest team of humans couldn't process this volume of search effectively. And
traditional software, which has no ability to find patterns in data and make predictions, isn't up
to the task either.
It is quite simply impossible to serve up accurate search results in real-time at this speed and
scale without AI.
That's why today AI powers almost every part of a search engine, including things like:
 Indexing all the pages published online and understanding their contents
 Interpreting search queries by understanding human language
 Matching queries to the most accurate and highest quality results
 Evaluating and reevaluating content quality to consistently improve search results
 And much, much more... (a simplified sketch of the indexing and matching steps appears below)
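As a simplified illustration of the indexing and matching steps listed above, the following Python sketch builds a tiny inverted index and matches a query against it; real search engines use far more sophisticated language understanding and ranking, and the sample documents here are assumptions made for the example.

# Tiny inverted-index sketch: index a few documents, then match a query's terms.
from collections import defaultdict
import re

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def build_index(documents):
    """Map each term to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in tokenize(text):
            index[term].add(doc_id)
    return index

def match(index, query):
    """Return document ids containing all query terms (simple AND matching)."""
    terms = tokenize(query)
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())
    return result

documents = {
    1: "AI powered search engines understand context",
    2: "Weather forecast and driving directions",
    3: "How AI search engines rank web pages",
}
index = build_index(documents)
print(match(index, "AI search"))   # -> {1, 3}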

2.5 PROBLEM DEFINITION

While AI-powered search engines have some advantages over traditional search engines, there are also drawbacks to be aware of. Here are some examples:

Inadequate transparency: AI search engines use complicated algorithms to produce search results, making it difficult to understand how and why specific results are shown. This lack of transparency can be frustrating for users who want to know why some pages are ranked higher than others.

Bias: AI-powered search engines, like any other AI system, are prone to bias. For example, if the training data used to build the AI system is biased in any manner, the search results may reflect this bias. This can be especially troublesome when dealing with sensitive topics such as politics or social issues.

Privacy concerns: AI-powered search engines frequently collect a considerable amount of data about users, such as search history and personal information. This data can be used to improve search results, but it can also raise privacy problems if it is not handled correctly.

Reliance on technology: AI search engines are heavily reliant on technology, which can be a disadvantage if the technology fails or is disrupted in any manner. For example, if the AI system is hacked or hits a bug, this might result in erroneous search results or in the system failing completely.

Limited human interaction: Because AI search engines are meant to work independently, they may be unable to handle complicated inquiries or provide the same degree of personalised service as a human operator. For users who need more specialised or nuanced search results, this can be a drawback.

2.6 GOALS/OBJECTIVES:

A search engine's primary objective is to help users find the most relevant and valuable information available on the internet. To reach this goal, search engines pursue several objectives, including:

Indexing: First, search engines must locate and collect data from across the internet. Indexing is a process in which software programs called spiders or crawlers traverse the web and store information about web pages in a vast database.

Ranking: After a significant number of web pages have been indexed, the search engine must
evaluate which pages are most relevant to a particular query. This is known as ranking, and it
entails analysing numerous variables such as keywords, page quality, relevancy, popularity, and
others to decide the order in which search results are shown.
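As a hedged, minimal sketch of the ranking idea described above, the snippet below scores documents against a query with TF-IDF and cosine similarity using scikit-learn; the tiny corpus and the choice of TF-IDF are assumptions for illustration, and this stands in for only one of the many variables (keywords, quality, popularity) that real ranking considers.

# TF-IDF ranking sketch: order a few documents by similarity to the query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "AI powered search engines match queries to relevant pages",
    "Recipes for quick weeknight dinners",
    "How search engines rank and index web pages",
]
query = "how do search engines rank pages"

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(documents)
query_vector = vectorizer.transform([query])

scores = cosine_similarity(query_vector, doc_vectors).flatten()
ranked = sorted(zip(scores, documents), reverse=True)
for score, doc in ranked:
    print(f"{score:.3f}  {doc}")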

Providing relevant results: A search engine's ultimate goal is to give the most relevant results for the user's query. This requires a thorough understanding of the user's intent and context, as well as the capacity to analyse and interpret natural language queries.

Speed and efficiency: In order to keep users interested and pleased, search engines must offer
search results swiftly and efficiently. Fast indexing and ranking algorithms, as well as optimised
infrastructure and data storage systems, are required.

Ensuring quality and safety: Search engines must ensure that the results they deliver are of high quality and safe for users to access. This involves filtering out spam, malware, and other potentially harmful content, as well as providing helpful information about the sources and authenticity of search results.
