Emerging Trends
________________________________________________________
INTRODUCTION:
The 21st century has been marked by the rapid advancement and pervasive application of
information technology. Today, information technology is an essential part of our daily lives, acting as
a significant catalyst for change in various aspects of business and society. It has revolutionized the
way we address economic and social issues, proving to be a game changer in many respects. The
landscape of information technology is ever-evolving, with new technologies emerging constantly.
While many new technologies are introduced almost daily, not all succeed—some fade away, while
others thrive and capture the attention of users. Information technology impacts virtually every field
of human life, with various technological trends such as cloud computing, mobile computing, social
media, and ubiquitous computing growing rapidly.
Cloud computing allows us to share hardware and software resources over the internet as a service,
accessible globally on a pay-per-use basis. Mobile computing enables data access and processing on
handheld devices like smartphones and tablets.
Social media platforms like Facebook, Twitter, WhatsApp, YouTube, and LinkedIn facilitate global
interaction and contribute to social betterment by highlighting important social issues. The Internet
of Things (IoT), a network of objects and computing devices embedded with microchips, sensors, and
actuators, is making our world smarter.
Emerging trends in information technology set new standards and gain popularity among users. In
this chapter, we will explore some of these emerging trends—cloud computing, mobile computing,
social media, and IoT—examining their rapid evolution and discussing their roles and future impacts
on the digital economy and interactions within digital societies.
The term "Artificial Intelligence" combines "Artificial," meaning made by humans, and
"Intelligence," meaning the ability to think. So, AI means
human-made thinking power.
Artificial Intelligence aims to create expert systems that show intelligent behavior, learn, explain
things, and give advice to users. AI also tries to make machines understand, think, learn, and act like
humans.
Machine Learning:
Machine Learning is a part of Artificial Intelligence (AI). It enables computers to learn and improve
automatically without being specifically programmed. By learning from past tasks, machines become
smarter and respond better to new inputs and situations. Machine Learning involves creating
computer programs that can access and learn from data on their own.
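As a small illustration, the following Python sketch (assuming the scikit-learn library is installed) trains a decision tree on a handful of made-up labelled examples and then predicts the label of a new, unseen input; the tiny dataset and its features are invented purely for illustration.

    # A minimal machine learning sketch using scikit-learn (assumed installed).
    # The toy data below is invented purely for illustration.
    from sklearn.tree import DecisionTreeClassifier

    # Each sample: [hours_studied, classes_attended]; label: 1 = pass, 0 = fail
    X = [[2, 3], [1, 1], [6, 8], [7, 9], [3, 2], [8, 7]]
    y = [0, 0, 1, 1, 0, 1]

    model = DecisionTreeClassifier()   # the learning algorithm
    model.fit(X, y)                    # learn from past (training) data

    # Predict the outcome for a new, unseen input
    print(model.predict([[5, 6]]))     # e.g. [1] -> likely to pass

A real system works in the same way, only with far larger training datasets and more carefully chosen features.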
Natural Language Processing (NLP):
Natural Language Processing (NLP) focuses on the interaction between computers and humans using natural language. It enables computers to read text, hear speech, interpret it, gauge sentiment, and identify important parts of human language. For example, when you say, "Alexa, play this song," the device plays the song thanks to NLP and other AI elements like machine learning and deep learning.
Natural Language Processing (NLP) is the driving force behind many common applications, including:
Word processors, such as Microsoft Word, and writing assistants, such as Grammarly, that use NLP to check grammar.
Interactive Voice Response (IVR) systems used in call centers to handle user requests.
In fact, it is possible to search the web or control our devices using our voice. All this has been made
possible by NLP. An NLP system can perform text-to-speech and speech-to-text conversion.
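As a small illustration, the sketch below uses the NLTK library (assuming it and the language resources noted in the comments are installed) to tokenize a sentence and gauge its sentiment; the sample sentence is invented for illustration.

    # A minimal NLP sketch using the NLTK library (assumed installed).
    # nltk.download('punkt') and nltk.download('vader_lexicon') may be needed first.
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    text = "Alexa, play this song. I really love it!"

    # Break the text into individual words (tokenization)
    tokens = nltk.word_tokenize(text)
    print(tokens)

    # Gauge the sentiment expressed in the sentence
    scores = SentimentIntensityAnalyzer().polarity_scores(text)
    print(scores)   # e.g. {'neg': 0.0, 'neu': ..., 'pos': ..., 'compound': ...}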
Immersive Experiences:
An "immersive experience" draws a person into a new or augmented reality, enriching everyday life
through technology. It often combines multiple technologies to create this effect.
With three-dimensional (3D) videography, the joy of watching movies in theaters has reached new heights. Video games are also being developed to provide players with immersive experiences.
Immersive experiences are also utilized in training, for example in driving and flight simulations. These experiences can be created through virtual reality (VR) and augmented reality (AR) technologies.
Our perception of reality is shaped by our senses. This led to the idea that if we present our senses
with artificial information, our perception of reality could change accordingly.
Today, this is accomplished with VR headsets. To enhance realism, VR systems incorporate other sensory inputs such as sound, smell, motion, and temperature. VR is a relatively new field with applications in gaming, military training, medical procedures, entertainment, social science, psychology, engineering, and other areas where simulation aids understanding and learning.
While VR replaces the user's surroundings with a simulated environment, augmented reality (AR) superimposes computer-generated content on the user's view of the real world. One of the pioneers in using AR is IKEA, which allows customers to select furniture from its catalog and see how it would fit in their space.
Robotics:
Robotics is a field of study that combines computer science and engineering. It focuses on designing,
building, operating, and using robots. The aim of robotics is to create intelligent machines that can
support and assist humans in their daily activities while ensuring safety.
Robotic technology aims to develop machines that can perform tasks in place of humans and mimic
human actions. Robots are used in various situations and for multiple purposes. Currently, they are
employed in hazardous environments like inspecting radioactive materials, detecting and
deactivating bombs, and manufacturing processes. They are also utilized in environments where
humans cannot survive, such as space, underwater, and in high temperatures.
Humanoid robots are robots designed to resemble humans. They find applications in various fields
such as industries, medical science, bionics, scientific research, and military.
NASA's Mars Exploration Rover (MER) mission involves sending robotic spacecraft to study the
planet Mars. Mitra is an Indian-made humanoid robot equipped
with artificial intelligence, visual data processing, facial
recognition, and the ability to imitate human gestures and facial
expressions.
BIG DATA
The enormous amount of digital activity today results in the creation of massive and intricate datasets, collectively referred to as Big Data. Such datasets cannot be effectively processed and analyzed using traditional tools due to their sheer volume and unstructured nature. They include our posts, instant messages, chats, shared photographs, tweets, blog articles, news items, opinion polls and comments, audio/video chats, and more.
Big Data presents numerous challenges, including integration, storage, analysis, searchability,
processing, transfer, querying, and visualization. Despite these challenges, Big Data often contains
valuable information and insights that are highly beneficial for businesses. Consequently, there is a
significant focus on developing software and methodologies to effectively process and analyze Big
Data.
Big Data is commonly described in terms of five characteristics, often called the five Vs:
(A) Volume: Big data is characterized by its enormous size. When a dataset is so large that traditional database tools struggle to process it, it is considered big data.
(B) Velocity: This refers to the speed at which data is generated and stored. Big data is generated at a
much faster rate compared to traditional datasets.
(C) Variety: Big data consists of diverse types of data, including structured, semi-structured, and
unstructured data. Examples include text, images, videos, and web pages.
(D) Veracity: Big data can sometimes be inconsistent, biased, noisy, or have
issues with data collection methods. Veracity concerns the trustworthiness of
the data, as processing incorrect data can lead to misleading results.
(E) Value: Big data not only comprises a large amount of data but also
contains hidden patterns and valuable insights that can benefit businesses.
However, investing resources in processing big data requires assessing its
potential value to avoid wasted efforts.
Data Analytics:
Data analytics is the process of examining datasets to extract insights using specialized systems and software. It is increasingly popular across industries, helping organizations make informed decisions. In fields like science and technology, data analytics aids researchers in validating or refuting scientific models, theories, and hypotheses.
Pandas, a Python library, simplifies many common data analysis tasks, making it a valuable tool for data analytics.
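For instance, a few lines of Pandas are often enough to load a dataset and summarize it; the file name sales.csv and its columns region and amount below are hypothetical.

    # A minimal data analytics sketch using the Pandas library.
    # 'sales.csv' and its columns ('region', 'amount') are hypothetical examples.
    import pandas as pd

    df = pd.read_csv("sales.csv")                   # load the dataset into a DataFrame
    print(df.describe())                            # summary statistics of numeric columns
    print(df.groupby("region")["amount"].mean())    # average sales per region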
INTERNET OF THINGS (IoT)
As mentioned earlier, the Internet of Things (IoT) is a network of physical objects embedded with microchips, sensors, and actuators. The web is already a system for communication, so why not use it to connect all these devices efficiently? The Web of Things (WoT) makes this possible by using web services to connect anything in the physical world, not just people. This could lead to the development of smart homes, offices, cities, and more.
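Conceptually, a WoT device exposes its state through ordinary web (HTTP) requests. The sketch below uses Python's requests library against a purely hypothetical smart-bulb address and JSON fields; real devices and WoT platforms define their own interfaces.

    # A conceptual WoT sketch using the 'requests' library (assumed installed).
    # The device URL and its JSON fields are hypothetical.
    import requests

    bulb = "http://192.168.1.20/api/bulb"      # hypothetical address of a smart bulb

    state = requests.get(bulb).json()        # read the device's current state
    print(state)                             # e.g. {"power": "off", "brightness": 40}

    requests.put(bulb, json={"power": "on", "brightness": 80})   # switch it on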
Sensors
When you hold your mobile phone vertically or horizontally, the display adjusts accordingly. This is
made possible by two sensors: the accelerometer and gyroscope (gyro). The accelerometer detects
the phone's orientation, while the gyroscope tracks any rotation or twist of your hand, adding to the
information provided by the accelerometer.
Sensors are frequently used in real-world applications for monitoring and observation. The
advancement of smart electronic sensors is playing a significant role in the evolution of the Internet
of Things (IoT), leading to the development of new sensor-based intelligent systems.
A smart sensor is a device that collects information from its surroundings. It has built-in computing capability that helps it perform specific tasks when it senses something, and it processes the information to make sense of it before sharing it.
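The "process before sharing" idea can be sketched in a few lines of Python; the temperature values, threshold, and read_temperature() function below are purely hypothetical stand-ins for real sensor hardware.

    # A conceptual smart-sensor sketch: process readings locally and
    # report only meaningful events. All values are hypothetical.
    import random

    def read_temperature():
        # Stand-in for reading an actual hardware sensor
        return random.uniform(20.0, 45.0)

    THRESHOLD = 35.0    # report only when it gets this hot

    for _ in range(10):
        reading = read_temperature()
        if reading > THRESHOLD:                 # local processing / decision
            print(f"ALERT: temperature {reading:.1f} degrees C exceeds threshold")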
Smart Cities
As cities grow rapidly, the pressure on resources such as land and water increases, as does air pollution.
Managing issues such as waste, traffic congestion, public safety, and infrastructure becomes more
challenging. City planners worldwide are seeking smarter approaches to address these challenges
and create sustainable and liveable urban environments.
CLOUD COMPUTING
Cloud computing is a growing trend in information technology. It involves delivering computer-based services over the Internet or "the cloud," accessible to users from anywhere using any device. These services include software, hardware (like servers), databases, and storage, among others. Cloud service providers, companies that offer these resources, typically charge users based on their usage, similar to how we pay for electricity. We often use cloud services to store pictures and files as backups or to host websites on the Internet.
Cloud computing enables users to run large applications or process extensive data without needing
the necessary storage or processing power on their personal computer, as long as they are
connected to the Internet. Among its many benefits, cloud computing offers cost-effective, on-
demand resources. Users can access the resources they need from the cloud at a very reasonable
cost.
Cloud Services
A simpler approach to grasp the concept of the cloud is to view everything as a service. A "service" refers to any feature offered by the cloud. There are three main models for categorizing the various computing services provided through the cloud: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
For example, suppose we have developed a web application using MySQL and Python. To run this application online, we can use a pre-configured Apache server from the cloud, which already has MySQL and Python installed. This means we do not need to install MySQL and Python on the cloud, nor do we have to configure the web server (such as Apache or NGINX). This kind of ready-made deployment environment is what Platform as a Service (PaaS) provides.
With PaaS, users have full control over the deployed application and its configuration. It provides a
deployment environment for developers at a much lower cost, reducing the complexity of purchasing
and managing the underlying hardware and software.
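As a rough illustration of the application side, the sketch below uses Flask, one possible Python web framework (not prescribed by PaaS), to define a tiny web application; on a PaaS, the provider's pre-configured environment would run it without us installing Python, MySQL, or the web server ourselves.

    # A minimal web application sketch using Flask (one possible Python framework).
    # On a PaaS, the pre-configured environment runs this without us setting up
    # Python, the web server, or the database ourselves.
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def home():
        return "Hello from an application deployed on a PaaS!"

    if __name__ == "__main__":
        app.run()   # run locally; the PaaS runtime would normally start this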
In all of the standard service models mentioned above, users can access on-demand infrastructure,
platforms, or software, typically paying based on usage. This model eliminates the need for
significant upfront investment, which is particularly beneficial for new or evolving organizations. To
leverage the benefits of cloud computing, the Government of India has launched an ambitious
initiative called "GI Cloud," also known as "MeghRaj" (https://cloud.gov.in).
GRID COMPUTING
A grid is a computer network composed of geographically dispersed and heterogeneous
computational resources. Unlike cloud computing, which primarily focuses on providing services, a
grid is more application-specific and functions like a virtual supercomputer with immense processing
power and storage. The resources within a grid, known as nodes, temporarily unite to tackle a single
large task and achieve a common goal.
Nowadays, countless computational nodes, ranging from handheld mobile devices to personal
computers and workstations, are connected to LANs or the Internet. This connectivity makes it
economically feasible to reuse or utilize their resources, such as memory and processing power. Grid
computing leverages these resources to solve computationally intensive scientific and research
problems without the need to procure expensive hardware.
Grids can be classified into two types: (i) Data grids, which are used to manage large, distributed datasets requiring multi-user access, and (ii) CPU or Processor grids, where processing tasks are either transferred from one PC to another as needed or divided into subtasks that are processed in parallel across multiple nodes.
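The idea of dividing a task into subtasks that run in parallel can be illustrated on a single machine with Python's multiprocessing module; in a real grid, the chunks would be distributed to remote nodes through middleware rather than to local processes.

    # Illustration of splitting work into subtasks run in parallel.
    # Here the "nodes" are local processes; a real grid distributes chunks
    # to remote machines via middleware such as the Globus Toolkit.
    from multiprocessing import Pool

    def subtask(chunk):
        # Each node computes a partial result (here, a simple sum of squares)
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]      # divide the task into 4 parts
        with Pool(processes=4) as pool:
            partials = pool.map(subtask, chunks)     # process subtasks in parallel
        print(sum(partials))                         # combine the partial results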
Grid computing differs from IaaS (Infrastructure as a Service) cloud services. In IaaS, a service
provider rents the necessary infrastructure to users. In contrast, grid computing involves multiple
computing nodes collaborating to solve a common computational problem.
To set up a grid by connecting numerous nodes for both data and CPU resources, middleware is
needed to implement the distributed processor architecture. One such software toolkit is the Globus
Toolkit (http://toolkit.globus.org/toolkit), which is open-source and includes software for security,
resource management, data management, communication, fault detection, and more.
BLOCKCHAINS
Traditionally, digital transactions are performed by storing data in a centralized database, with
transactions updated sequentially on this database. This is the method used by ticket booking
websites and banks. However, centralizing all data in one location poses risks of data breaches or
loss.
Blockchain technology, in contrast, operates on the principle of a decentralized and shared database,
where each computer has a copy of the database. A block in this context is a secure chunk of data or
a valid transaction. Each block contains a header, which is visible to all other nodes, while only the
owner has access to the block's private data. These blocks are linked together, forming a blockchain.
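A highly simplified sketch of how blocks can be chained through hashes is shown below; it ignores consensus, digital signatures, and peer-to-peer replication, which real blockchains require.

    # A highly simplified blockchain sketch: each block stores the hash of the
    # previous block, so tampering with any block breaks the chain.
    import hashlib, json, time

    def make_block(data, prev_hash):
        block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    genesis = make_block("genesis", "0")
    block1 = make_block("A pays B 10 units", genesis["hash"])
    block2 = make_block("B pays C 4 units", block1["hash"])

    print(block2["prev_hash"] == block1["hash"])   # True: the blocks are linked

Changing the data inside block1 would change its hash and break the link stored in block2, which is what makes tampering detectable.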
The most popular application of blockchain technology is in digital currency. However, due to its
decentralized nature, openness, and security, blockchain is increasingly being recognized as a means
to ensure transparency, accountability, and efficiency in business and governance systems.
For example, in healthcare, improved data sharing between providers can lead to more accurate
diagnoses, more effective treatments, and an overall increase in the ability of healthcare
organizations to deliver cost-effective care.
Another potential application of blockchain technology is in land registration records, which can help
prevent disputes over land ownership and encroachments. A blockchain-based voting system can
address issues like vote alterations and other electoral problems. Since all transactions are recorded
in the ledger, voting becomes more transparent and authentic. Blockchain technology can be applied
across various sectors, including banking, media, telecom, travel, hospitality, and many other areas.
*********