Emerging Technologies Handout
Steam Engine
Figure 1: The development of machine tools and the rise of the factory system
Figure 2: New technological systems were introduced, such as electrical power and telephones
1.1.3.3. IR 3.0 - Digital Revolution
Figure 3: Transition from mechanical and analog electronic technology to digital electronics
1.1.3.4. IR 4.0 - Robotics, IoT, and Cyber-Physical Systems
Figure 4: Any connected device, e.g., Computer Numerical Control (machines driven by instructions)
Notice: A cyber-physical system is a mechanism that is controlled or monitored by computer-based algorithms, tightly integrated with the Internet and its users, e.g., AI.
1.2. Role of Data for Emerging Technologies
1. Data is regarded as the new oil and a strategic asset, since we are living in the age of big data.
2. Data drives, or even determines, the future of science, technology, the economy, and possibly everything in our world today and tomorrow.
3. More importantly, data presents enormous challenges that in turn bring incredible innovation and economic opportunities through understanding, exploring, and utilizing it.
1.3. Enabling devices and network (Programmable devices)
In the world of digital electronic systems, there are four basic kinds of devices:
o Memory
o Microprocessors
o Logic devices
o Networks
Memory: stores random information such as the contents of a spreadsheet or database.
Microprocessors: execute software instructions to perform a wide variety of tasks such as
running a word processing program or video game.
Logic devices: provide specific functions, including device-to-device interfacing, data
communication, signal processing, data display, timing and control operations, and almost
every other function a system must perform.
Network: a collection of computers, servers, mainframes, network devices, peripherals, or other devices connected to one another to allow the sharing of data, e.g., the Internet.
Why is a computer referred to as a programmable device?
Because what makes a computer a computer is that it follows a set of instructions.
Many electronic devices are computers that perform only one operation, but they are still
following instructions that reside permanently in the unit.
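The idea that "a computer follows a set of instructions" can be made concrete with a short sketch. The three-opcode machine below is entirely hypothetical (the `run` function and the `ADD`/`MUL`/`SET` opcodes are invented for illustration): the "hardware" (the loop) is fixed, and only the instruction list changes the behavior.

```python
# A minimal sketch of a programmable device: the machine (the loop) is
# fixed, while its behavior is determined entirely by the instruction list.

def run(program, value=0):
    """Execute a list of (opcode, operand) instructions on a single register."""
    for opcode, operand in program:
        if opcode == "ADD":
            value += operand
        elif opcode == "MUL":
            value *= operand
        elif opcode == "SET":
            value = operand
        else:
            raise ValueError(f"unknown opcode: {opcode}")
    return value

# The same machine, given different programs, performs different tasks:
print(run([("MUL", 2)], 21))             # doubles its input -> 42
print(run([("ADD", 1), ("MUL", 3)], 4))  # (4 + 1) * 3 -> 15
```

A device that performs "only one operation" corresponds to a machine whose program is burned in permanently; a general-purpose computer is one whose program can be swapped.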
1.3.1. List of some Programmable devices
Achronix Speedster SPD60
Actel’s
Altera Stratix IV GT and Arria II GX
Atmel’s AT91CAP7L
Cypress Semiconductor’s programmable system-on-chip (PSoC) family
Lattice Semiconductor’s ECP3
Lime Microsystems’ LMS6002
Silicon Blue Technologies
Xilinx Virtex-6 and Spartan-6, and XMOS Semiconductor L series
A full range of network-related equipment is referred to as Service Enabling Devices (SEDs), which can include:
Traditional channel service unit (CSU) and data service unit (DSU)
Modems
Routers
Switches
Conferencing equipment
Network appliances (NIDs and SIDs)
Hosting equipment and servers
1.4. Human to Machine Interaction
HMI refers to the communication and interaction between a human and a machine via a user
interface.
Nowadays, natural user interfaces such as gestures have gained increasing attention, as they allow humans to control machines through natural and intuitive behaviors.
What is interaction in human-computer interaction?
HCI is the study of how people interact with computers and to what extent computers are or
are not developed for successful interaction with human beings.
As its name implies, HCI consists of three parts: the user, the computer itself, and the ways
they work together.
How do users interact with computers?
The user interacts directly with the hardware for human input and output, such as displays, e.g., through a graphical user interface. The user interacts with the computer over this software interface using the given input and output (I/O) hardware.
How important is human-computer interaction?
The goal of HCI is to improve the interaction between users and computers by making
computers more user-friendly and receptive to the user's needs.
The main advantages of HCI are simplicity, ease of deployment and operations, and cost savings for smaller set-ups. It also reduces solution design time and integration complexity.
1.4.1. Disciplines Contributing to HCI
Cognitive psychology: Limitations, information processing, performance prediction,
cooperative working, and capabilities.
Computer science: Including graphics, technology, prototyping tools, user interface
management systems.
Linguistics.
Engineering and design.
Artificial intelligence.
Human factors.
1.5. Present and Future Trends in Emerging Technologies
Emerging technology trends:
5G Networks
Artificial Intelligence (AI)
Autonomous Devices
Blockchain
Augmented Analytics
Digital Twins
Enhanced Edge Computing and
Immersive Experiences in Smart Spaces
Chapter One Review Questions
1. Where did the Industrial Revolution start and why did it begin there?
2. What does “emerging” mean, what are emerging technologies, and how are they identified?
3. What makes “emerging technologies” happen, and what impact will they have on individuals, society, and the environment?
4. Discuss the economic and ideological causes of the American, the French, and the Chinese Revolutions, and consider the larger historical contexts in which these events took place.
5. Discuss and compare the course of the American, the French, and the Chinese revolutions, and analyze the reasons for and significance of the different outcomes of these three revolutions.
6. Discuss the successes and the shortcomings of the conservative reaction to the French Revolution as seen in the actions of the Congress of Vienna and the Holy Alliance.
7. How do recent approaches to “embodied interaction” differ from earlier accounts of the
role of cognition in human-computer interaction?
8. Why is it important to take care in designing a good computer-human interface?
9. Discuss the pros and cons of human-computer interaction technology.
3. Artificial Intelligence
3.1 What is Artificial Intelligence?
Artificial Intelligence is composed of two words: Artificial and Intelligence. Artificial means "man-made," and intelligence means "thinking power," or “the ability to learn and solve problems”; hence, Artificial Intelligence means "a man-made thinking power."
So, we can define Artificial Intelligence (AI) as the branch of computer science by which we can create intelligent machines that can behave like humans, think like humans, and make decisions.
Intelligence is composed of:
o Reasoning
o Learning
o Problem solving
o Perception
o Linguistic intelligence
An AI system is composed of an agent and its environment. An agent (e.g., human or robot) is
anything that can perceive its environment through sensors and acts upon that environment
through effectors. Intelligent agents must be able to set goals and achieve them. In classical
planning problems, the agent can assume that it is the only system acting in the world, allowing
the agent to be certain of the consequences of its actions. However, if the agent is not the only actor, it must be able to reason under uncertainty. This calls for an agent that can not only assess its environment and make predictions but also evaluate its predictions and adapt based on its assessment. Machine perception is the ability to use input from sensors (such as cameras and microphones) to deduce aspects of the world, e.g., Computer Vision.
Figure 3.12: Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL)
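The sensor-effector cycle described above can be sketched in a few lines. The `Thermostat` class and all names here are illustrative inventions, not a real API: the agent perceives its environment (a temperature reading) through a sensor and acts on it through an effector (a heater switch).

```python
# A minimal sketch of the agent/environment cycle: perceive via sensors,
# act via effectors. All names here are illustrative.

class Thermostat:
    """A trivial reflex agent: perceives temperature, acts via a heater effector."""

    def __init__(self, target):
        self.target = target

    def act(self, percept):
        # Map the sensor reading to an effector command.
        return "HEAT_ON" if percept < self.target else "HEAT_OFF"

agent = Thermostat(target=20.0)
readings = [18.5, 19.9, 20.3, 21.0]        # simulated sensor input
actions = [agent.act(t) for t in readings]  # effector commands
# actions == ["HEAT_ON", "HEAT_ON", "HEAT_OFF", "HEAT_OFF"]
```

In classical planning terms this agent can be certain of its actions' consequences; an agent sharing the world with other actors would additionally need to reason under uncertainty.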
3.1.1 Need for Artificial Intelligence
1. To create expert systems that exhibit intelligent behavior, with the capability to learn, demonstrate, explain, and advise their users.
2. To help machines find solutions to complex problems, as humans do, and apply them as algorithms in a computer-friendly manner.
3.1.2 Goals of Artificial Intelligence
Following are the main goals of Artificial Intelligence:
1. Replicate human intelligence
2. Solve Knowledge-intensive tasks
3. An intelligent connection of perception and action
4. Building a machine that can perform tasks requiring human intelligence, such as:
o Proving a theorem
o Playing chess
o Planning a surgical operation
o Driving a car in traffic
5. Creating a system that can exhibit intelligent behavior, learn new things by itself, and demonstrate, explain, and advise its user.
3.1.3 What Comprises Artificial Intelligence?
To achieve the objectives of AI, the following disciplines are involved:
o Mathematics
o Biology
o Psychology
o Sociology
o Computer Science
o Neuron Study
o Statistics
Figure 3.3 Artificial Intelligence is multidisciplinary
C. The golden years - early enthusiasm (1956-1974):- algorithms that could solve mathematical problems were developed, and the first intelligent humanoid robot, named WABOT-1, was built in Japan.
D. The first AI winter (1974-1980):- a period of severe shortage of government funding for AI research.
E. A boom of AI (1980-1987):- AI came back with "expert systems," and the first national conference of the American Association for Artificial Intelligence was held at Stanford University.
F. The second AI winter (1987-1993):- another collapse in funding, as expert systems such as XCON proved very costly to maintain.
G. The emergence of intelligent agents (1993-2011):- IBM's Deep Blue beat world chess champion Garry Kasparov, becoming the first computer to defeat a reigning world chess champion. AI entered the home in the form of Roomba, a robotic vacuum cleaner, and companies like Facebook, Twitter, and Netflix also started using AI.
H. Deep learning, big data and artificial general intelligence (2011-present):-
The year 2011: IBM's Watson won Jeopardy, a quiz show in which it had to solve complex questions as well as riddles. Watson proved that it could understand natural language and solve tricky questions quickly.
The year 2012: Google launched the Android app feature "Google Now," which was able to provide information to the user as a prediction.
The year 2014: the chatbot "Eugene Goostman" won a competition in the famous "Turing test."
The year 2018: IBM's "Project Debater" debated complex topics with two master debaters and performed extremely well. Google also demonstrated "Duplex," a virtual-assistant AI program that booked a hairdresser appointment over the phone, and the lady on the other side didn't notice that she was talking with a machine.
Now AI has developed to a remarkable level. Concepts such as deep learning, big data, and data science are now booming. Nowadays, companies like Google, Facebook, IBM, and Amazon are working with AI and creating amazing devices. The future of Artificial Intelligence is inspiring and will come with high intelligence.
Stage 4 – Reasoning Machines: - These algorithms have some ability to attribute mental states
to themselves and others – they have a sense of beliefs, intentions, knowledge, and how their
own logic works. This means they could reason or negotiate with humans and other machines.
Stage 5 – Self Aware Systems / Artificial General Intelligence (AGI):- These systems have
human-like intelligence – the most commonly portrayed AI in media – however, no such use is
in evidence today.
Stage 6 – Artificial Superintelligence (ASI):- AI algorithms can outsmart even the most
intelligent humans in every domain.
Stage 7 – Singularity and Transcendence: - We might go beyond the limits of the human body and connect to other forms of intelligence on the planet – animals, plants, weather systems, and the natural environment. Whether such a singularity is achievable, and whether human consciousness could ever be digitized, remain open and contested questions.
Figure: Types of Artificial Intelligence – based on capabilities (Narrow AI, General AI, Strong AI) and based on functionality (Reactive Machines, Limited Memory, Theory of Mind, Self-Awareness)
Weak AI or Narrow AI: is able to perform a dedicated task with intelligence; it is the most common and currently available form of AI, and it cannot perform beyond its field or limitations. Apple Siri, IBM's Watson supercomputer, Google Translate, chess-playing programs, purchasing suggestions on e-commerce sites, self-driving cars, speech recognition, and image recognition are some examples of Narrow AI.
General AI: could perform any intellectual task with efficiency, like a human, on its own. Currently, no such system exists, and General AI is still under research.
Super or Strong AI: a level of intelligence at which machines could surpass human intelligence and perform any task better than a human, with cognitive properties. Key characteristics of strong AI include the ability to think, reason, solve puzzles, make judgments, plan, learn, and communicate on its own. Super AI is still a hypothetical concept of Artificial Intelligence.
Reactive Machines: the most basic type of AI; they do not store memories or past experiences for future actions and focus only on current scenarios. IBM's Deep Blue system and Google's AlphaGo are examples of reactive AI.
Limited Memory: can store past experiences or some data for a short period of time. Self-driving cars are the best example of limited-memory AI.
Theory of Mind: understands human emotions and beliefs and is able to interact socially like humans. This type of AI is still not developed, but researchers are making many efforts and improvements toward developing such machines.
Self-Awareness: the future of AI, smarter than the human mind. It does not yet exist in reality and remains a hypothetical concept.
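The distinction between reactive machines and limited-memory AI can be sketched in code. Everything below is an illustrative toy (the percept strings, policies, and thresholds are invented): a reactive agent is a pure function of the current percept, while a limited-memory agent also consults a short window of past percepts.

```python
from collections import deque

# Illustrative sketch: reactive vs. limited-memory behavior.

def reactive_policy(percept):
    """Reactive machine: the decision depends only on the current input."""
    return "BRAKE" if percept == "obstacle" else "DRIVE"

class LimitedMemoryAgent:
    """Keeps only a short window of past percepts, loosely like a
    self-driving car tracking recent observations."""

    def __init__(self, window=3):
        self.history = deque(maxlen=window)  # old percepts fall off the end

    def decide(self, percept):
        self.history.append(percept)
        # Slow down if obstacles were seen repeatedly in the recent window.
        if list(self.history).count("obstacle") >= 2:
            return "SLOW"
        return reactive_policy(percept)

agent = LimitedMemoryAgent()
decisions = [agent.decide(p) for p in ["clear", "obstacle", "obstacle", "clear"]]
# decisions == ["DRIVE", "BRAKE", "SLOW", "SLOW"]
```

Note the final "clear" percept still yields "SLOW": the short memory, not the current input alone, shapes the decision, which is exactly what a purely reactive machine cannot do.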
3.4.1 How do humans think?
Intelligence, or the cognitive process, is composed of three main stages: first, observe and input the information or data into the brain; second, interpret and evaluate the input received from the surrounding environment; third, make decisions as a reaction to what was received, interpreted, and evaluated.
AI researchers are simulating the same stages in building AI systems or models. This process
represents the main three layers or components of AI systems.
3.4.2 Mapping human thinking to artificial intelligence components
Because AI is the science of simulating human thinking, it is possible to map the human thinking
stages to the layers or components of AI systems.
In the first stage, humans acquire information from their surrounding environments through
human senses, such as sight, hearing, smell, taste, and touch, through human organs, such as
eyes, ears, and other sensing organs, for example, the hands.
In AI models, this stage is represented by the sensing layer, which perceives information from
the surrounding environment. This information is specific to the AI application. For example,
there are sensing agents such as voice recognition for sensing voice and visual imaging
recognition for sensing images. Thus, these agents or sensors take the role of the hearing and
sight senses in humans.
The second stage is related to interpreting and evaluating the input data. In AI, this stage is
represented by the interpretation layer, that is, reasoning and thinking about the gathered input
that is acquired by the sensing layer.
The third stage is related to taking action or making decisions. After evaluating the input data,
the interacting layer performs the necessary tasks. Robotic movement control and speech
generation are examples of functions that are implemented in the interacting layer.
3.5 Influencers of artificial intelligence
The following influencers of AI are described in this section:
Big data: Structured data versus unstructured data: Big data refers to huge amounts
of data. Big data requires innovative forms of information processing to draw insights,
automate processes, and help decision making.
Advancements in computer processing speed and new chip architectures: The
meaning of big data expanded beyond the volume of data after the release of a paper by
Google on MapReduce and the Google File System (GFS), which evolved into the
Apache Hadoop open source project. The Hadoop file system is a distributed file system
that may run on a cluster of commodity machines, where the storage of data is distributed
among the cluster and the processing is distributed too. This approach greatly increases the speed with which data is processed, but it also includes an element of complexity with the introduction of new structured, unstructured, and multi-structured data types.
Large manufacturers of computer chips such as IBM and Intel are prototyping “brain-
like” chips whose architecture is configured to mimic the biological brain’s network of
neurons and the connections between them called synapses.
Cloud computing and APIs: Cloud computing is a general term that describes the
delivery of on-demand services, usually through the internet, on a pay-per-use basis.
Companies worldwide offer their services to customers over cloud platforms. These
services might be data analysis, social media, video storage, e-commerce, and AI
capabilities that are available through the internet and supported by cloud computing. AI
APIs are usually delivered on an open cloud-based platform on which developers can
infuse AI capabilities into digital applications, products, and operations by using one or
more of the available APIs. All the significant companies in the AI services market
deliver their services and tools on the internet through APIs over cloud platforms, for
example: IBM delivers Watson AI services over IBM Cloud, Amazon AI services are
delivered over Amazon Web Services (AWS), Microsoft AI tools are available over the
MS Azure cloud, Google AI services are available in the Google Cloud Platform. These
services benefit from cloud platform capabilities, such as availability, scalability,
accessibility, rapid deployment, flexible billing options, simpler operations, and
management.
The emergence of data science: After a large enough volume of data is collected, patterns emerge. Data scientists then use learning algorithms on these patterns. Data science uses machine learning and AI to process big data.
3.6 Applications of AI
Artificial Intelligence has various applications in today's society. It is becoming essential for our time because it can solve complex problems efficiently in multiple industries, such as Agriculture, Healthcare, Entertainment, Finance, Education, Social Media, Travel & Transport, the Automotive Industry, Robotics, Gaming, Data Security, Commuting, Email, Social Networking, Online Shopping, Mobile Use, etc. AI is making our daily lives more comfortable and faster.
3.7 AI tools and platforms
AI platforms are defined as a hardware architecture or software framework (including application frameworks) that allows AI software to run. They involve the use of machines to perform tasks otherwise performed by human beings. The platform simulates the cognitive functions that human minds perform, such as problem-solving, learning, reasoning, and social intelligence, as well as general intelligence.
Many tools are used in AI, including versions of search and mathematical optimization, logic,
methods based on probability and economics.
AI has developed a large number of tools to solve the most difficult problems in computer
science, like:
Search and optimization
Logic
Probabilistic methods for uncertain reasoning
Classifiers and statistical learning methods
Neural networks
Control theory
Languages
The most common artificial intelligence platforms include Microsoft Azure Machine Learning, Google Cloud Prediction API, IBM Watson, TensorFlow, Infosys Nia, Wipro HOLMES, API.AI, Premonition, Rainbird, Ayasdi, MindMeld, and Meya.
Chapter 4: Internet of Things (IoT)
Introduction
IoT is a network of connected devices that interact and exchange information with each other. The technology allows two or more devices to connect with each other and to send and receive information through the internet. IoT is a system of interrelated computing devices, mechanical and digital machines, objects, animals, or people that are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
Activity 4.1.
o What is the Internet of Things?
o Explain the key features of IoT.
o What role does IoT play in the day-to-day lives of people and organizations?
Definitions:-
1. The internet of things, or IoT, is a system of interrelated computing devices, mechanical
and digital machines, objects, animals or people that are provided with unique identifiers
(UIDs) and the ability to transfer data over a network without requiring human-to-human
or human-to-computer interaction.
2. IoT is the networking of smart objects in which smart objects have some constraints such
as limited bandwidth, power, and processing accessibility for achieving interoperability
among smart objects.
3. IoT is the interaction of everyday objects with computing devices through the Internet that enables the sending and receiving of useful data.
4. The term Internet of Things (IoT), according to the 2020 conceptual framework, is expressed through a simple formula:
IoT = Services + Data + Networks + Sensors
A thing in the internet of things can be any natural or man-made object that can be assigned an
Internet Protocol (IP) address and is able to transfer data over a network.
AI − IoT essentially makes virtually anything “smart”, meaning it enhances every aspect of life with the power of data collection, artificial intelligence algorithms, and networks. For example, it can enable your refrigerator and cabinets to detect when milk and your favorite cereal run low, and to then place an order with your preferred grocer.
Connectivity − New enabling technologies for networking, and specifically IoT networking,
mean networks are no longer exclusively tied to major providers. IoT creates these small
networks between its system devices.
Sensors − IoT loses its distinction without sensors. They act as defining instruments, which
transform IoT from a standard passive network of devices into an active system capable of real-
world integration.
Active Engagement − Much of today's interaction with connected technology happens through passive engagement; IoT introduces a new paradigm of active content, product, and service engagement.
Small Devices − Devices, as predicted, have become smaller, cheaper, and more powerful over
time. IoT exploits purpose-built small devices to deliver its precision, scalability, and versatility.
The internet of things (IoT) has found its application in several areas such as connected industry,
smart-city, smart-home, smart-energy, connected car, smart agriculture, connected building and
campus, health care, and logistics, among other domains. Any device that transfers data over a network is a member of the IoT. Examples include medical devices, such as a heart monitor implant; a biochip transponder in a farm animal; and Ring, a doorbell that links to your smartphone.
IoT utilizes existing and emerging technology for sensing, networking, and robotics. Its new and
advanced elements bring major changes in the delivery of products, goods, and services; and the
social, economic, and political impact of those changes.
The development of computers began in the 1950s. The Internet, itself a significant component
of the IoT, started out as part of DARPA (Defense Advanced Research Projects Agency) in
1962, and evolved into ARPANET in 1969. In the 1980s, commercial service providers began
supporting public use of ARPANET, allowing it to evolve into our modern Internet. Global
Positioning Satellites (GPS) became a reality in early 1993, with the Department of Defense
providing a stable, highly functional system of 24 satellites. This was quickly followed by privately owned commercial satellites being placed in orbit. Satellites and landlines provide basic communications for much of the IoT. One additional and important component in developing a functional IoT was the remarkably far-sighted decision in IPv6 to increase the address space. Steve Leibson, of the Computer History Museum, states, “The address space expansion means that we could assign an IPV6 address to every atom on the surface of the earth, and still have enough addresses left to do another 100+ earths.” Put another way, we are not going to run out of
internet addresses anytime soon. Kevin Ashton stated that Radio Frequency Identification
(RFID) was a prerequisite for the Internet of Things. He concluded if all devices were “tagged,”
computers could manage, track, and inventory them. To some extent, the tagging of things has
been achieved through technologies such as digital watermarking, barcodes, and QR codes.
Inventory control is one of the more obvious advantages of the Internet of Things.
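Leibson's point about address space rests on simple arithmetic: IPv6 addresses are 128 bits long, so the space holds 2^128 distinct addresses. A quick check:

```python
# IPv6 addresses are 128 bits long, so the address space holds 2**128 addresses.
ipv6_addresses = 2 ** 128
print(ipv6_addresses)  # 340282366920938463463374607431768211456, roughly 3.4e38

# For comparison, IPv4's 32-bit space holds only about 4.3 billion addresses.
ipv4_addresses = 2 ** 32
print(ipv4_addresses)  # 4294967296
```

The roughly 10^38 addresses of IPv6 dwarf the IPv4 space by a factor of about 8 * 10^28, which is why assigning an address to every IoT device (or, per the quote, far more) is not a practical constraint.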
Activity 4.2
o State the history of the Internet of Things (IoT).
o What was the role of Kevin Ashton in the history of IoT?
4.1.3. IoT – Advantages
The advantages of IoT span across every area of lifestyle and business. Here is a list of some of
the advantages that IoT has to offer:
o Improved Customer Engagement - Current analytics suffer from blind spots and significant flaws in accuracy, and, as noted, engagement remains passive. IoT completely transforms this to achieve richer and more effective engagement with audiences.
o Technology Optimization - The same technologies and data which improve the customer experience also improve device use and aid in more potent improvements to technology. IoT unlocks a world of critical functional and field data.
o Reduced Waste - IoT makes areas of improvement clear. Current analytics give us superficial insight, but IoT provides real-world information leading to the more effective management of resources.
o Enhanced Data Collection - Modern data collection suffers from its limitations and its
design for passive use. IoT breaks it out of those spaces and places it exactly where
humans really want to go to analyze our world. It allows an accurate picture of everything.
Activity 4.3
The use of IoT provides a number of advantages. What are they?
4.1.4. IoT – Disadvantages
Here is a list of some of the disadvantages of IoT. These are:
o Poor security and confidentiality
o System corruption
o Interoperability of systems (no international standard of compatibility for IoT)
o Collecting and managing the data from multiple devices will be challenging
Activity 4.4
o Briefly discuss the cons of IoT related to security and compatibility.
o Briefly discuss the security requirements at different layers of IoT.
4.1.5. Challenges of IoT
Though IoT delivers an impressive set of advantages, it also presents a significant set of challenges. Here is a list of some of its major issues:
Activity 4.5
o What are the most frequently raised challenges that IoT has been facing?
4.2. How does it work?
An IoT ecosystem consists of web-enabled smart devices that use embedded processors, sensors
and communication hardware to collect, send and act on data they acquire from their
environments.
IoT devices share the sensor data they collect by connecting to an IoT gateway. The devices do
most of the work without human intervention, although people can interact with the devices.
The architecture of IoT devices comprises four major components: sensing, network, data
processing, and application layers (as depicted in Figure 4.2). A detailed description of these
layers is given below.
1. Sensing Layer - The main purpose of the sensing layer is to identify any phenomena in the device's periphery and obtain data from the real world. This layer consists of several
sensors. Sensors in IoT devices are usually integrated through sensor hubs. A sensor hub is a
common connection point for multiple sensors that accumulate and forward sensor data to
the processing unit of a device. Actuators can also intervene to change the physical
conditions that generate the data. Sensors in IoT devices can be classified into three broad
categories.
A. Motion Sensors: Motion sensors measure the change in motion as well as the orientation
of the devices. There are two types of motions one can observe in a device: linear and
angular motions.
B. Environmental Sensors: Sensors such as Light sensors, Pressure sensors, etc. are
embedded in IoT devices to sense changes in environmental parameters in the device's periphery. Environmental sensors are used in many applications to improve user
experience (e.g., home automation systems, smart locks, smart lights, etc.).
C. Position sensors: Deal with the physical position and location of the device. The most
common position sensors used in IoT devices are magnetic sensors and Global
Positioning System (GPS) sensors. Magnetic sensors are usually used as a digital
compass and help to fix the orientation of the device display. On the other hand, GPS is
used for navigation purposes in IoT devices.
2. Network Layer - Acts as a communication channel to transfer data, collected in the sensing
layer, to other connected devices. Diverse communication technologies (e.g., Wi-Fi,
Bluetooth, Zigbee, ZWave, LoRa, cellular network, etc.) are used to allow data flow between
other devices within the same network.
3. Data Processing Layer:- takes data collected in the sensing layer and analyses the data to
make decisions based on the result. This layer may share the result of data processing with
other connected devices via the network layer.
4. Application Layer- implements and presents the results of the data processing layer to
accomplish disparate applications of IoT devices. A user-centric layer executes various tasks
for the users. There exist diverse IoT applications, which include smart transportation, smart
home, personal care, healthcare, etc.
Figure 4.1 Architecture of IoT
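The four layers described above can be sketched as a simple pipeline, with each layer handing its output to the next. Every function name, reading, and threshold below is illustrative, not part of any real IoT framework.

```python
# A minimal sketch of the four-layer IoT architecture: sensing -> network ->
# data processing -> application. All names and values are illustrative.

def sensing_layer():
    """Obtain data from the real world (here, a simulated sensor reading)."""
    return {"sensor": "temperature", "value": 31.5}

def network_layer(reading):
    """Transfer the collected data (stand-in for Wi-Fi, Zigbee, LoRa, etc.)."""
    return dict(reading)  # in reality: serialize and transmit via a gateway

def data_processing_layer(reading):
    """Analyse the data and make a decision based on the result."""
    return "COOLING_ON" if reading["value"] > 28.0 else "COOLING_OFF"

def application_layer(decision):
    """Present the result to the user-facing application."""
    return f"smart-home app: {decision}"

result = application_layer(data_processing_layer(network_layer(sensing_layer())))
print(result)  # smart-home app: COOLING_ON
```

The layering matters for design: sensors and actuators can change without touching the application, and the network transport can be swapped (e.g., Bluetooth for Zigbee) without changing how decisions are made.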
Activity 4.6
o There are four components in the IoT architecture. What are they?
o Explain the functions of each layer of IoT.
o What is the difference between the sensors used in IoT devices?
4.2.2. Devices and Networks
IoT devices are meant to work in concert for people at home, in industry or in the enterprise. As
such, the devices can be categorized into three main groups: consumer, enterprise and industrial.
Consumer connected devices include smart TVs, smart speakers, toys, wearables, and smart
appliances. Smart meters, commercial security systems and smart city technologies such as those
used to monitor traffic and weather conditions are examples of industrial and enterprise IoT
devices. Other technologies, including smart air conditioning, smart thermostats, smart lighting,
and smart security, span home, enterprise, and industrial uses. In the enterprise, smart sensors
located in a conference room can help an employee locate and schedule an available room for a
meeting, ensuring the proper room type, size and features are available.
Efficient and autonomic management of IoT networks is needed. Developing an IoT network
management solution is not an easy task because of the intrinsic constraints of IoT networks
(architecture, technologies, physical layer).
Indeed, it is necessary to take into account several elements such as scalability, interoperability,
energy efficiency, topology control, Quality of Service (QoS), fault tolerance, and security. Security, context awareness, and a standard message model are still at an early stage and should be resolved in a new management platform. Therefore, this work proposes a platform for IoT
networks and devices management, called M4DN.IoT (Management for Device and Network
in the Internet of Things). The structure of the platform is expandable, allowing the addition of
new types of network devices or applications. In addition, the platform provides standard web
services, such as device discovery, data storage, and user authentication, which are basic
requirements for creating IoT applications.
Activity 4.7
o List and discuss at least three examples of IoT devices and their applications.
o Dear learner, please make a note about Management for Device and Network in the Internet of Things (M4DN.IoT), to be discussed later in groups.
4.3. IoT Tools and Platforms
There are many vendors in the industrial IoT platform marketplace, offering remarkably similar capabilities and methods of deployment. A few are listed below, along with their key features.
IoT Platform – Key features

KAA
o Manage an unlimited number of connected devices
o Set up cross-device interoperability
o Perform real-time device monitoring
o Perform remote device provisioning and configuration
o Collect and analyze sensor data
o Analyze user behavior and deliver targeted notifications
o Create cloud services for smart products

SiteWhere
o Run any number of IoT applications on a single SiteWhere instance
o Spring delivers the core configuration framework
o Add devices through self-registration
o Integrates with third-party integration frameworks such as Mule Anypoint
o Default database storage is MongoDB
o Eclipse Californium for CoAP messaging
o InfluxDB for event data storage
o Grafana to visualize SiteWhere data

ThingSpeak
o Collect data in private channels
o Share data with public channels
o MATLAB analytics and visualizations
o Alerts
o Event scheduling
o App integrations
o Worldwide community

DeviceHive
o Directly integrates with Alexa
o Visualization dashboard of your choice
o Supports big data solutions such as ElasticSearch, Apache Spark, Cassandra, and
Kafka for real-time and batch processing
o Connect any device
o Comes with Apache Spark and Spark Streaming support
o Supports libraries written in various programming languages, including Android
and iOS libraries
o Allows running batch analytics and machine learning on top of your device data

Zetta
o Supports a wide range of hacker boards
o Allows you to assemble smartphone apps, device apps, and cloud apps

ThingsBoard
o Real-time data visualization and remote device control
o Customizable rules, plugins, widgets, and transport implementations
o Allows monitoring client-side and provisioning server-side device attributes
o Supports multi-tenant installations out-of-the-box
o Supports transport encryption for both MQTT and HTTP(S) protocols
Activity 4.8
o Briefly discuss some of the IoT development tools that are listed below:
Tessel 2, Eclipse IoT, Arduino, PlatformIO, IBM Watson, Raspbian, OpenSCADA,
Node-RED, Kinoma Create, DeviceHive
Smart Parking: Real-time monitoring of parking spaces available in the city, enabling
residents to identify and reserve the closest available spaces.
Waste Management: Detection of rubbish levels in containers to optimize trash collection
routes. Garbage cans and recycle bins with RFID tags allow the sanitation staff to see when
garbage has been put out.
4.3.3. IoT Based Smart Farming
Green Houses: Control micro-climate conditions to maximize the production of fruits and
vegetables and their quality.
Compost: Control of humidity and temperature levels in alfalfa, hay, straw, etc. to prevent
fungus and other microbial contaminants.
Animal Farming/Tracking: Location and identification of animals grazing in open pastures
or location in big stables, Study of ventilation and air quality in farms and detection of
harmful gases from excrements.
Offspring Care: Control of growing conditions of the offspring in animal farms to ensure
their survival and health.
Field Monitoring: Reducing spoilage and crop waste with better monitoring, accurate
ongoing data obtaining, and management of the agriculture fields, including better control of
fertilizing, electricity and watering.
Activity 4.9
What is the application of IoT in agriculture?
What is the application of IoT for the use of consumers?
What is the application of IoT in healthcare?
What is the application of IoT in insurance companies?
What is the application of IoT in manufacturing industries?
What is the application of IoT in retail industries?
What is the application of IoT in transportation?
What is the application of IoT in utilities?
Chapter 5: Augmented Reality (AR)
After completing this chapter, students will be able to:
Explain augmented reality
Explain the features of augmented reality
Explain the difference between augmented reality (AR), virtual reality (VR), and mixed
reality (MR)
Explain the architecture of augmented reality systems
Describe the application areas of augmented reality
Overview of augmented reality
Augmented reality (AR) is a form of emerging technology that allows users to overlay computer-
generated content in the real world. The augmentation is typically done in real-time and in
semantic context with environmental elements.
By using the latest AR techniques and technologies, the information about the surrounding real
world becomes interactive and digitally usable. Through this augmented vision, a user can
digitally interact with and adjust information about their surrounding environment.
Unlike virtual reality, which creates a totally artificial environment, augmented reality uses the
existing environment and overlays new information on top of it. AR provides a live direct or
indirect view of a physical, real-world environment whose elements are augmented by
computer-generated sensory input such as sound, video, graphics, or GPS data.
With constant development in computer vision and the exponential advancement of computer
processing power, virtual reality (VR), augmented reality (AR), and mixed reality (MR)
technology is becoming more and more prominent. With some overlap in the applications and
functions of these emerging technologies, sometimes these terms get confused or are used
incorrectly. The main differences between them are explained in the figure below:
VR: is fully immersive, tricking your senses into thinking you are in a different environment
or world apart from the real world. It is also called a computer-simulated reality.
AR: is an enhanced version of reality created by the use of technology to overlay digital
information on an image of something being viewed through a device (such as a smartphone
camera). In augmented reality, users see and interact with the real world while digital content is
added to it.
MR: sometimes referred to as hybrid reality, is the merging of real and virtual worlds to
produce new environments and visualizations where physical and digital objects co-exist and
interact in real-time.
One of the most obvious differences among augmented reality, virtual reality, and mixed reality
is their hardware requirements. In addition, VR content is 100% digital and is enjoyed in a fully
immersive environment; AR overlays digital content on top of the real world; and MR is a
digital overlay that allows interactive virtual elements to integrate and interact with the real-
world environment.
The Architecture of AR Systems
The first Augmented Reality Systems (ARS) were usually designed around three main blocks,
as illustrated in the figure below:
(1) Infrastructure Tracker Unit: responsible for collecting data from the real world and sending
it to the Processing Unit. Some designs used a Video In module to acquire the required data for
the Infrastructure Tracker Unit.
(2) Processing Unit: mixed the virtual content with the real content and sent the result to the
Video Out module of the Visual Unit.
(3) Visual Unit: The Visual Unit can be classified into two types of system, depending on the
visualization technology followed:
1. Video see-through: It uses a Head-Mounted Display (HMD) that employs video
mixing and displays the merged images on a closed-view HMD.
2. Optical see-through: It uses an HMD that employs optical combiners to merge the
images within an open-view HMD.
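The three-block flow above can be sketched as a minimal pipeline, with stub data standing in for camera frames and an HMD (all names here are hypothetical, for illustration only):

```python
# Toy model of the classic ARS architecture: an Infrastructure Tracker
# Unit captures real-world data, a Processing Unit mixes in virtual
# content, and a Visual Unit renders the merged result.

def tracker_unit():
    """Collect data from the real world (here: a stub camera frame)."""
    return {"real": ["wall", "table"], "virtual": []}

def processing_unit(frame, virtual_content):
    """Mix the virtual content with the real content."""
    frame["virtual"].extend(virtual_content)
    return frame

def visual_unit(frame):
    """Render the merged frame (video see-through style)."""
    return frame["real"] + frame["virtual"]

frame = tracker_unit()
merged = processing_unit(frame, ["3d-label"])
print(visual_unit(merged))  # ['wall', 'table', '3d-label']
```

In a real system the tracker would run on camera input, the processing unit would align virtual objects with tracked features, and the visual unit would drive a closed-view or open-view HMD.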
Applications of AR Systems
AR can be applied to many different disciplines such as education, medicine, entertainment,
military, etc.
• Medical training and education.
• Assistance in medical procedures and routine tasks.
AR in entertainment: AR can be used in various entertainment industries.
AR in games: AR games have been praised for increasing physical activity in people. For
example, the Pokémon GO game app.
AR in music: AR can enhance live performances by illustrating the story told by a track or
displaying the way it was created by the band.
AR on TV: One way of integrating augmented reality into television is adding supplementary
information to what is happening on the TV screen, such as match scores, betting options, and
the like.
AR in eSports: turns eSports shows into interactive experiences allowing the watchers to
become participants.
AR in the theater: in this sector, augmented reality can serve not only for entertainment
purposes but also for the purposes of accessibility.
Chapter 6: ETHICS AND PROFESSIONALISM OF EMERGING TECHNOLOGIES
6.1. Technology and Ethics
Activity 6.1
From your civic and ethical education course, what do you understand about the word
ethics?
The Internet boom has provided many benefits for society, allowing the creation of new tools
and new ways for people to interact. Technology can serve to promote or restrict human rights.
The Information Society should foster the use of emerging technologies in such a way as to
maximize the benefits that they provide while minimizing the harms. In many cases, this
promotion may be less a matter of technological control than of oversight: establishing the
proper legal or regulatory system to ensure that technology capable of abuse is not in fact
abused and that the benefits of technology are shared among all.
Ethics is particularly important for the accountancy profession, with a code for professional
ethics based on five basic principles – integrity, objectivity, competence and due care,
confidentiality, and professional behaviour. However, the emergence of new technologies
raises some new challenges for the profession to address.
Activity 6.2
What do you think is the need for ethics in data science? Is it really important to include
ethical rules when dealing with big data? If your answer is yes, why?
The increasing use of big data, algorithmic decision-making, and artificial intelligence can
enable more consistent, evidence-based and accurate judgments or decisions, often more
quickly and efficiently. However, these strengths can potentially have a darker side too,
throwing up questions around the ethical use of these fairly new technologies.
For example, outputs can be based on biased data, which could lead to discriminatory
outcomes. Indeed, where systems learn from real-world data, there is a significant risk that
those systems simply recreate the past and subsequently build in errors or systemic biases.
Activity 6.3
As we discussed in chapter three, AI is all about making a machine learn and decide as
humans do. Do you think that it is necessary to rely on machines and give them all the
opportunity to decide? Why?
Additionally, questions are being asked regarding the interaction between computers and
humans. How much reliance can we place on data and models, and what is the role of human
judgment, as well as how do we ensure that we understand the decision-making process?
Whatever the power of the machine, humans will still need to be involved, so that people can
be held accountable, or explain the reasons behind a decision.
Activity 6.4
Do you think that integrating ethical rules with emerging technologies is important? If
your answer is yes, why? What are the challenges of integrating ethical rules with new
technologies?
A central problem of the ethics of technology is that it tends to arrive too late. In many cases,
ethical issues are only recognized when the technology is already on the market and problems
arise during its widespread use.
Ethics can then become a tool to clean up a mess that might have been avoidable.
It is probably not contentious to say it would be desirable to have ethical input at the earlier
stages of technology design and development. Indeed, there are ethical theories and approaches
that explicitly aim at an early integration of ethics into the technology life cycle.
Activity 6.5
List down common ethical rules that must be applied in all technologies.
1. Contribute to society and to human well-being, acknowledging that all people are
stakeholders in computing.
2. Avoid harm.
3. Be honest and trustworthy.
4. Be fair and take action not to discriminate
5. Respect the work required to produce new ideas, inventions, creative works, and
computing artifacts.
6. Respect privacy.
7. Honor confidentiality
Activity 6.6
1. Strive to achieve high quality in both the processes and products of professional work.
2. Maintain high standards of professional competence, conduct, and ethical practice.
3. Know and respect existing rules pertaining to professional work.
4. Accept and provide appropriate professional review.
5. Give comprehensive and thorough evaluations of computer systems and their impacts,
including analysis of possible risks.
6. Perform work only in areas of competence.
7. Foster public awareness and understanding of computing, related technologies, and
their consequences.
8. Access computing and communication resources only when authorized or when
compelled by the public good.
9. Design and implement systems that are robustly and usably secure.
Activity 6.7
1. Ensure that the public good is the central concern during all professional computing
work.
4. Articulate, apply, and support policies and processes that reflect the principles of the
Code.
6. Use care when modifying or retiring systems. Interface changes, the removal of
features, and even software updates have an impact on the productivity of users and the
quality of their work.
7. Recognize and take special care of systems that become integrated into the
infrastructure of society.
Activity 6.8
Discuss some specific professional ethical principles related to big data, AI, and IoT.
6.2. Digital privacy
Activity 6.9
What do you think about privacy in general? Is it really important?
In this digital world, what do you mean by digital privacy? Give concrete examples?
Digital privacy is often used in contexts that promote advocacy on behalf of individual and
consumer privacy rights in digital spheres, and is typically used in opposition to the business
practices of many e-marketers, businesses, and companies that collect and use such
information and data.
Activity 6.10
Do you think private information like passwords and PIN numbers should be guarded
or shared with the public? Why?
In the context of digital privacy, information privacy is the notion that individuals should
have the freedom, or right, to determine how their digital information, mainly that pertaining to
personally identifiable information, is collected and used.
Every country has various laws that dictate how information may be collected and used by
companies.
Activity 6.11
Do you think communication privacy means that the only one accessing a message will
be the sender's intended receiver? If your answer is yes, why?
In the context of digital privacy, communication privacy is the notion that individuals should
have the freedom, or right, to communicate information digitally with the expectation that their
communications are secure; meaning that messages and communications will only be
accessible to the sender's original intended recipient. However, communications can be
intercepted or delivered to other recipients without the sender's knowledge, in a multitude of
ways. Communication privacy necessarily requires consideration of technological methods of
protecting information and communication in digital mediums, the effectiveness and
ineffectiveness of such methods and systems, and the development and advancement of new
and current technologies.
Activity 6.12
Do you think that it is good to make yourself exposed to every information on the
internet? Why?
Individual privacy is the notion that individuals have a right to exist freely on the internet, in
that they can choose what types of information they are exposed to, and more importantly that
unwanted information should not interrupt them.
Activity 6.13
Give some examples you have to consider to make your data as well as communication
private?
Data Minimization
Transparency
Accuracy
Security: Adequate physical and IT security measures will be implemented to ensure that
the collection, use, and maintenance of identifiable information are properly safeguarded
and the information is promptly destroyed in accordance with approved records control
schedules.
Activity 6.14
Emerging technologies can provide improved accuracy, better quality and cost efficiencies for
businesses in every sector. They can enhance trust in the organization’s operations and
financial processes, which is crucial for sustainable success. But this can produce a paradox:
the very solutions that can be used to better manage risk, increase transparency and build
confidence are often themselves the source of new risks, which may go unnoticed.
Activity 6.15
What are the challenges of using technologies like AI, IoT, and big data?
With technology moving at a fast pace, security has always been a challenge. As security
professionals, we need to keep pace with ever-changing technology and be aware of AI,
IoT, big data, machine learning, etc.
Activity 6.16
What role can technologies such as AI, IoT, Machine Learning and Big Data play in
enhancing the security of an organization?
Emerging technologies are already impacting how we live and work. They are also changing
how we approach, plan, and integrate security operations. For security, both physical and
cyber, the equation is the same, catalyzing many new potential applications for emerging
technologies.
2. Real-time horizon scanning and data mining for threats and information sharing
6. Safety and security equipment (including bullet and bomb proof) made with lighter
and stronger materials
7. Situational awareness capabilities via GPS for disaster response and crisis response
scenarios
Activity 6.17
AI has a wide application in health and manufacturing industries. What are the
challenges the world face when implementing the applications of AI in the previously
mentioned industries?
AI is only as good as the data it is exposed to, which is where certain challenges may present
themselves. How a business teaches and develops its AI will be the major factor in its
usefulness. Humans could be the weak link here, as people are unlikely to want to input masses
of data into a system.
Activity 6.18
Write down the challenges of using robots in the manufacturing industry. Debate the
pros and cons of giving jobs to humans or to robots in the manufacturing industry.
With automation and robotics moving from production lines out into other areas of work and
business, the potential for humans losing jobs is great here too. As automation technologies
become more advanced, there will be a greater capability for automation to take over more and
more complex jobs. As robots learn to teach each other and themselves, there is the potential
for much greater productivity but this also raises ethical and cyber security concerns.
Activity 6.19
As more and more connected devices (such as smartwatches and fitness trackers) join the
Internet of Things (IoT) the amount of data being generated is increasing. Companies will have
to plan carefully how this will affect the customer-facing application and how to best utilize
the masses of data being produced. There are also severe security implications of mass
connectivity that need to be addressed.
Almost all the technologies mentioned above have some relation to Big Data. The huge amount
of data being generated on a daily basis has the potential to provide businesses with better
insight into their customers as well as their own business operations. Although data can be
incredibly useful for spotting trends and analyzing impacts, surfacing all this data to humans in
a way that they can understand can be challenging. AI will play a role here.
6.5.2. Threats
Activity 6.20
Write down some risks in emerging technologies like driverless cars, drones, and IoT?
New and emerging technologies pose significant opportunities for businesses if they utilize
them well and understand their true value early on.
Some risks of emerging technology are:
Driverless car
Wearables
Drones
Internet of things
Chapter Six Review Questions
1. What is the importance of ethics in emerging technologies?
2. List down some general ethical rules.
3. List down some professional responsibilities related to ethical rules.
4. What is digital privacy? What is its importance?
5. Briefly explain digital privacy principles
6. Why do we need accountability in using emerging technologies?
7. Is trust necessary to use an emerging technology platform? Why?
8. Briefly explain the challenges in using:
a. AI?
b. Robots?
c. IoT?
9. Briefly explain the risks we face in augmented reality, IoT and AI?
10. Do you think that dealing with big data demands high ethical regulations,
accountability, and responsibility of the person as well as the company? Why?
CHAPTER 7: OTHER EMERGING TECHNOLOGIES
Introduction
In this chapter, we are going to discuss other emerging technologies like nanotechnology,
biotechnology, blockchain technology, cloud and quantum computing, autonomic computing,
computer vision, embedded systems, cybersecurity, and 3D printing.
7.1 Nanotechnology
Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is
about 1 to 100 nanometres. Nanoscience and nanotechnology are the study and application of
extremely small things and can be used across all the other science fields, such as chemistry,
biology, physics, materials science, and engineering. One nanometer is a billionth of a meter or
10⁻⁹ meters. Here are a few illustrative examples:
There are 25,400,000 nanometers in an inch.
A sheet of newspaper is about 100,000 nanometers thick.
On a comparative scale, if a marble were a nanometer, then one meter would be the size
of the Earth.
Nanoscience and nanotechnology involve the ability to see and to control individual atoms and
molecules. Everything on Earth is made up of atoms—the food we eat, the clothes we wear, the
buildings and houses we live in, and our own bodies. But something as small as an atom is
impossible to see with the naked eye.
In fact, it’s impossible to see with the microscopes typically used in high school science classes.
The microscopes needed to see things at the nanoscale were invented relatively recently, about
30 years ago. As small as a nanometer is, it is still large compared to the atomic scale. An atom
has a diameter of about 0.1 nm. An atom's nucleus is much smaller, about 0.00001 nm. Atoms are the
building blocks for all matter in our universe. You and everything around you are made of atoms.
Nature has perfected the science of manufacturing matter molecularly. For instance, our bodies
are assembled in a specific manner from millions of living cells. Cells are nature's
nanomachines. At the atomic scale, elements are at their most basic level. On the nanoscale, we
can potentially put these atoms together to make almost anything.
We define Nanoscience as the study of phenomena and manipulation of materials at atomic,
molecular and macromolecular scales, where properties differ significantly from those at a larger
scale; and nanotechnologies as the design, characterization, production, and application of
structures, devices, and systems by controlling shape and size at the nanometer scale.
The properties of materials can be different at the nanoscale for two main reasons:
1. Nanomaterials have a relatively larger surface area when compared to the same mass of
material produced in a larger form. This can make materials more chemically reactive (in
some cases materials that are inert in their larger form are reactive when produced in their
nanoscale form), and affect their strength or electrical properties.
2. Quantum effects can begin to dominate the behaviour of matter at the nanoscale
particularly at the lower end – affecting the optical, electrical and magnetic behavior of
materials. Materials can be produced that are nanoscale in one dimension (for example,
nanowires, nanorods, and nanotubes), in two dimensions (plate-like shapes like
nanocoating, nanolayers, and graphene) or in all three dimensions (for example,
nanoparticles).
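The surface-area effect in point 1 can be checked with a line of arithmetic: for a cube of side L, surface area scales as L² while volume (and hence mass) scales as L³, so the surface-area-to-volume ratio 6/L grows as the particle shrinks. A short sketch:

```python
# Why nanomaterials are more chemically reactive: the surface-area-to-
# volume ratio of a cube of side L is 6*L^2 / L^3 = 6/L, which grows
# as the particle gets smaller.

def surface_to_volume_ratio(side):
    """Ratio of surface area (6*L^2) to volume (L^3) for a cube."""
    return 6 * side**2 / side**3  # = 6 / side

for side_nm in (1_000_000, 1_000, 10):  # 1 mm, 1 um, 10 nm (all in nm)
    ratio = surface_to_volume_ratio(side_nm)
    print(f"side = {side_nm:>9} nm  ->  ratio = {ratio:.6f} per nm")
```

So the same mass of material, divided into 10 nm particles instead of 1 mm grains, exposes roughly 100,000 times more surface per unit volume to its surroundings.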
Applications of nanotechnology:
Medicine: customized nanoparticles the size of molecules that can deliver drugs directly
to diseased cells in your body. When it's perfected, this method should greatly reduce the
damage treatment such as chemotherapy does to a patient's healthy cells.
Electronics: it has some answers for how we might increase the capabilities of electronic
devices while we reduce their weight and power consumption.
Food: it has an impact on several aspects of food science, from how food is grown to how
it is packaged.
Agriculture: nanotechnology can possibly change the whole agriculture part and
nourishment industry anchor from generation to preservation, handling, bundling,
transportation, and even waste treatment.
Vehicle manufacturers: Much like aviation, lighter and stronger materials will be
valuable for making vehicles that are both quicker and more secure.
7.2 Biotechnology
Biotechnology is technology based on biology. Biotechnology harnesses cellular and
biomolecular processes to develop technologies and products that help improve our lives and
our health.
We have used the biological processes of microorganisms for more than 6,000 years to make
useful food products, such as bread and cheese, and to preserve dairy products. Brewing and
baking bread are examples of processes that fall within the concept of biotechnology. Such
traditional processes usually utilize the living organisms in their natural form (or further
developed by breeding), while the more modern form of biotechnology will generally involve a
more advanced modification of the biological system or organism.
One example of modern biotechnology is genetic engineering. Genetic engineering is the process
of transferring individual genes between organisms or modifying the genes in an organism to
remove or add a desired trait or characteristic.
Today, biotechnology covers many different disciplines (e.g. genetics, biochemistry, molecular
biology, etc.). New technologies and products are developed every year within the areas of e.g.
medicine (development of new medicines and therapies), agriculture (development of genetically
modified plants, biofuels, and biological treatment) or industrial biotechnology (production of
chemicals, paper, textiles, and food).
Application of biotechnology
Agriculture (Green Biotechnology): Biotechnology has contributed a lot to modifying the
genes of organisms, producing Genetically Modified Organisms (GMOs) such as crops,
animals, plants, fungi, bacteria, etc.
Medicine (Medicinal Biotechnology): This helps in the formation of genetically modified
insulin known as humulin.
Aquaculture (Fisheries): It helps in improving the quality and quantity of fish. Through
biotechnology, fish are induced to breed via gonadotropin-releasing hormone.
Environment (Environmental biotechnology): is used in waste treatment and pollution
prevention. Environmental biotechnology can more efficiently clean up many wastes than
conventional methods and greatly reduce our dependence on methods for land-based
disposal.
blocks. Once recorded, the data in any given block cannot be altered retroactively without the
alteration of all subsequent blocks, which requires the consensus of the network majority.
Although Blockchain records are not unalterable, a Blockchain may be considered secure by
design and exemplifies a distributed computing system.
The Blockchain network has no central authority; it is the very definition of a democratized
system. Since it is a shared and immutable ledger, the information in it is open for anyone and
everyone to see. Hence, anything that is built on the Blockchain is by its very nature transparent
and everyone involved is accountable for their actions.
A Blockchain carries no transaction cost. (An infrastructure cost yes, but no transaction cost.)
The Blockchain is a simple yet ingenious way of passing information from A to B in a fully
automated and safe manner. One party to a transaction initiates the process by creating a block.
This block is verified by thousands, perhaps millions of computers distributed around the net.
The verified block is added to a chain, which is stored across the net, creating not just a unique
record, but a unique record with a unique history. Falsifying a single record would mean
falsifying the entire chain in millions of instances. That is virtually impossible. Bitcoin uses this
model for monetary transactions, but it can be deployed in many other ways.
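The chaining idea described above can be sketched in a few lines: each block's hash covers its data plus the previous block's hash, so falsifying one record invalidates every later block. This is a toy model, not Bitcoin's actual block format:

```python
import hashlib

def block_hash(data, prev_hash):
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(records):
    """Build a chain of blocks, each linked to its predecessor's hash."""
    chain, prev = [], "0" * 64  # the genesis block links to all zeros
    for data in records:
        h = block_hash(data, prev)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Re-derive every hash; any tampering breaks the links."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(block["data"], prev):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["A pays B 5", "B pays C 2"])
print(is_valid(chain))             # True
chain[0]["data"] = "A pays B 500"  # tamper with an early record...
print(is_valid(chain))             # False: the chain no longer verifies
```

In a real network the re-verification is done independently by thousands of nodes, which is why falsifying a record "in millions of instances" is virtually impossible.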
2. Transparency
A person's identity is hidden via complex cryptography and represented only by
their public address. So, if you were to look up a person's transaction history, you
will not see their real name. While the person's real identity is secure, you will still
see all the transactions that were done by their public address. This level of
transparency has never existed before within a financial system. It adds that extra,
and much needed, level of accountability which is required by some of the biggest
institutions.
3. Immutability
It means that once something has been entered into the Blockchain, it cannot be tampered with.
The reason the Blockchain gets this property is the cryptographic hash function.
Hashing means taking an input string of any length and producing an output of a
fixed length. In the context of cryptocurrencies like Bitcoin, the transactions are
taken as input and run through a hashing algorithm (Bitcoin uses SHA-256),
which gives an output of a fixed length.
Let’s see how the hashing process works. We are going to use the SHA-256
(Secure Hashing Algorithm 256).
As you can see, in the case of SHA-256, no matter how big or small your input is, the
output will always have a fixed 256-bit length. This becomes critical when you are
dealing with a huge amount of data and transactions. So basically, instead of
remembering the input data, which could be huge, you can just remember the hash and
keep track of it.
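For instance, using Python's standard hashlib module, every input produces a 64-hex-character (256-bit) digest regardless of size:

```python
import hashlib

# SHA-256 always yields a 256-bit digest (64 hex characters), whether
# the input is a single word or a megabyte of transaction data.
for message in ["hi", "hi there", "x" * 1_000_000]:
    digest = hashlib.sha256(message.encode()).hexdigest()
    print(len(digest) * 4, "bits:", digest[:16], "...")
```

Note also that changing even one character of the input produces a completely different digest, which is what makes tampering detectable.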
The blockchain is immutable, so no one can tamper with the data that is inside it.
The blockchain is transparent, so anyone can track the data if they want to.
Application of Blockchain
has the potential to speed up file transfer and streaming times. Such an improvement is not only
convenient. It’s a necessary upgrade to the web’s currently overloaded content-delivery systems.
Cloud computing is a means of networking remote servers that are hosted on the Internet. Rather
than storing and processing data on a local server, or a PC's hard drive, one of the following
three types of cloud infrastructure is used.
1. Public cloud. Here a third-party provider manages the servers, applications, and storage,
much like a public utility. Anyone can subscribe to the provider's cloud service, which is
usually operated through the provider's own data center.
2. Private cloud. This is hosted in a company's own on-site data center, although some
companies host through a third-party provider instead. Either way, the computing
infrastructure exists as a private network accessible over the Internet.
3. The third option is a hybrid cloud. Here private clouds are connected to public clouds,
allowing data and applications to be shared between them.
Currently, the only organization which provides a quantum computer in the cloud is IBM. They
allow free access to anyone who wishes to use their 5-qubit machine. Earlier this year they
installed a 17-qubit machine. So far over 40,000 users have taken advantage of their online
service to run experiments. Google demonstrated the fastest quantum computer with 53 qubits,
completing in 200 seconds a computation estimated to take a supercomputer 10,000 years.
Qubit is short for quantum bit. With a classic computer, data is stored in tiny
transistors that hold a single bit of information, either the binary value of 1 or 0. With a quantum
computer, the data is stored in qubits. Thanks to the mechanics of quantum physics, where
subatomic particles obey their own laws, a qubit can exist in two states at the same time. This
phenomenon is called superposition. So, a qubit can have a value of 1, 0, or some value
between. Two qubits can hold even more values. Before long, you are building yourself an
exponentially more powerful computer the more qubits you add.
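The idea of superposition can be illustrated with a toy classical simulation. The sketch below models a single qubit as a pair of amplitudes and applies a Hadamard gate, a standard operation that puts a basis state into an equal superposition. The function names are ours, and this simulates the mathematics on an ordinary computer; it is not a real quantum computation:

```python
import math

# A qubit is a pair of complex amplitudes (a, b) for states |0> and |1>,
# with |a|^2 + |b|^2 = 1. Measuring yields 0 with probability |a|^2.
def hadamard(state):
    """Apply the Hadamard gate, putting a basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Probability of measuring 0 and of measuring 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)            # starts as a definite |0>
qubit = hadamard(qubit)       # now an equal superposition of |0> and |1>
p0, p1 = probabilities(qubit)
print(round(p0, 2), round(p1, 2))  # 0.5 0.5
```

After the gate, the qubit is genuinely "both" values until measured, which is the property quantum algorithms exploit.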
Advantages of quantum computing
1. Making complex calculations that would overwhelm classical computers.
2. Helping discover new drugs by unlocking the complex structure of chemical molecules.
3. Supporting financial trading, risk management, and supply chain optimization.
4. Enabling data to be transferred over the internet with much stronger encryption, thanks to its ability to handle far more complex computations.
Self-Protecting: An autonomic application/system should be capable of detecting and
protecting its resources from both internal and external attacks and maintaining overall
system security and integrity.
Context-Aware: An autonomic application/system should be aware of its execution
environment and be able to react to changes in the environment.
Open: An autonomic application/system must function in a heterogeneous world and
should be portable across multiple hardware and software architectures. Consequently, it
must be built on standard and open protocols and interfaces.
Anticipatory: An autonomic application/system should be able to anticipate, to the extent possible, its needs and behaviours and those of its context, and be able to manage itself proactively.
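The self-managing behaviour these characteristics describe is often organized as a monitor-analyze-plan-execute loop. The sketch below is a deliberately simplified, hypothetical example; the load thresholds and scaling actions are invented for illustration:

```python
# Minimal sketch of an autonomic manager's monitor-analyze-plan-execute loop.
def analyze(cpu_load):
    """Turn a monitored CPU-load sample into a plan (thresholds are assumed)."""
    if cpu_load > 0.9:
        return "scale_up"
    if cpu_load < 0.2:
        return "scale_down"
    return "steady"

def execute(plan, replicas):
    """Carry out the plan by adjusting the number of service replicas."""
    if plan == "scale_up":
        return replicas + 1
    if plan == "scale_down":
        return max(1, replicas - 1)
    return replicas

replicas = 2
for cpu_load in [0.95, 0.95, 0.5, 0.1]:   # monitored samples over time
    replicas = execute(analyze(cpu_load), replicas)
print(replicas)  # 3
```

A real autonomic system would sense its environment continuously and act on many more signals, but the loop structure is the same: observe, decide, act, repeat.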
There are many types of computer vision that are used in different ways:
Image segmentation partitions an image into multiple regions or pieces to be examined
separately.
Object detection identifies a specific object in an image. Advanced object detection recognizes many objects in a single image: a football field, an offensive player, a defensive player, a ball, and so on. These models use X, Y coordinates to create a bounding box and identify everything inside it.
Facial recognition is an advanced type of object detection that not only recognizes a human face in an image but also identifies a specific individual.
Edge detection is a technique used to identify the outside edge of an object or landscape
to better identify what is in the image.
Pattern detection is a process of recognizing repeated shapes, colours and other visual
indicators in images.
Image classification groups images into different categories.
Feature matching is a type of pattern detection that matches similarities in images to help
classify them.
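Several of these techniques come down to simple arithmetic on pixel values. For instance, a crude form of edge detection can be sketched by taking differences between neighbouring pixels; the tiny "image" below is a made-up grid used only for illustration:

```python
# A tiny grayscale "image" with a vertical boundary between dark and bright.
image = [
    [0, 0, 0, 9, 9, 9],
    [0, 0, 0, 9, 9, 9],
    [0, 0, 0, 9, 9, 9],
    [0, 0, 0, 9, 9, 9],
]

def horizontal_gradient(img):
    """Difference between left and right neighbours; large values mark edges."""
    h, w = len(img), len(img[0])
    return [[abs(img[y][x + 1] - img[y][x - 1]) for x in range(1, w - 1)]
            for y in range(h)]

grad = horizontal_gradient(image)
print(grad[0])  # [0, 9, 9, 0] -- the edge lies between columns 2 and 3
```

Production systems use richer operators (Sobel filters, learned convolutions), but the underlying idea of detecting sharp intensity changes is the same.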
Fig: Basic structure of an embedded system
Sensor − It measures a physical quantity and converts it into an electrical signal that can be read by an observer or by an electronic instrument such as an A-D converter. A sensor stores the measured quantity in memory.
A-D Converter − An analog-to-digital converter converts the analog signal sent by the
sensor into a digital signal.
Processor & ASICs − Processors process the data to measure the output and store it in memory.
D-A Converter − A digital-to-analog converter converts the digital data fed by the
processor to analog data.
Actuator − An actuator compares the output given by the D-A converter with the actual (expected) output stored in it and stores the approved output.
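The flow from sensor to actuator described above can be sketched as a simple processing pipeline. All the values, resolutions, and the control law below are hypothetical, chosen only to make the signal path concrete:

```python
# Hypothetical end-to-end sketch: sensor -> A-D -> processor -> D-A -> actuator.
def sensor():
    """Analog reading in volts (simulated here as a constant)."""
    return 1.65

def adc(volts, vref=3.3, bits=10):
    """Analog-to-digital: quantize a voltage into a 10-bit code (0..1023)."""
    return round(volts / vref * (2 ** bits - 1))

def process(code, bits=10):
    """Example control law: invert the reading (e.g. a cooling drive)."""
    return (2 ** bits - 1) - code

def dac(code, vref=3.3, bits=10):
    """Digital-to-analog: convert the digital command back to a voltage."""
    return code / (2 ** bits - 1) * vref

# One pass through the pipeline: the actuator receives this drive voltage.
drive = dac(process(adc(sensor())))
print(round(drive, 2))  # 1.65
```

In a real embedded system the loop runs continuously under timing constraints, and the processing step may be fixed-function hardware (an ASIC) rather than software.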
7.8. Cyber-security
It is the protection of computer systems from the theft of or damage to their hardware, software,
or electronic data, as well as from the disruption or misdirection of the services they provide.
Cybersecurity is often confused with information security, but it focuses on protecting computer systems from unauthorized access or from being otherwise damaged or made inaccessible. Information security is a broader category that looks to protect all information assets, whether in hard copy or in digital form. The term cybercrime describes an unlawful activity in which computers or computing devices such as smartphones, tablets, and Personal Digital Assistants (PDAs), whether stand-alone or part of a network, are used as a tool and/or a target of criminal activity. It is often committed by people of a destructive and criminal mindset, whether for revenge, greed, or adventure. Combating it is a multi-disciplinary affair that spans hardware and software through to policy and people, all aimed at both preventing cybercrimes from occurring in the first place and minimizing their impact.
Cyber-security measures
The following are some security measures to be taken to prevent cybercrimes:
Staff awareness training: - Human error is the leading cause of data breaches, so you need
to equip staff with the knowledge to deal with the threats they face. Training courses will
show staff how security threats affect them and help them apply best-practice advice to
real-world situations.
Application security: - Web application vulnerabilities are a common point of intrusion
for cybercriminals. As applications play an increasingly critical role in business, it is vital
to focus on web application security.
Network security: - Network security is the process of protecting the usability and
integrity of your network and data. This is achieved by conducting a network penetration
test, which scans your network for vulnerabilities and security issues.
Leadership commitment: - Leadership commitment is the key to cyber resilience.
Without it, it is very difficult to establish or enforce effective processes. Top management
must be prepared to invest in appropriate cybersecurity resources, such as awareness
training.
Password management: - Almost half of the UK population uses ‘password’, ‘123456’ or
‘qwerty’ as their password. You should implement a password management policy that
provides guidance to ensure staff create strong passwords and keep them secure.
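A password management policy like the one described can be partly enforced in software. The checker below is an illustrative sketch of one possible policy; the length threshold and the small deny-list are assumptions for this example, not an established standard:

```python
import re

# Illustrative policy: at least 12 characters, mixed case, digits, symbols,
# and not one of the commonly used passwords mentioned above.
COMMON = {"password", "123456", "qwerty"}

def is_strong(password):
    """Return True if the password satisfies every rule of the sketch policy."""
    if password.lower() in COMMON:
        return False
    checks = [
        len(password) >= 12,
        re.search(r"[a-z]", password),   # lowercase letter
        re.search(r"[A-Z]", password),   # uppercase letter
        re.search(r"\d", password),      # digit
        re.search(r"[^A-Za-z0-9]", password),  # symbol
    ]
    return all(bool(c) for c in checks)

print(is_strong("qwerty"))          # False: on the deny-list
print(is_strong("Tr0ub4dor&3x!"))   # True: long and mixed
```

In practice such checks complement, rather than replace, measures like password managers and multi-factor authentication.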
is much broader, but the term is often associated with filament-based plastic printers, which are
the pride and joy of many a hobbyist and self-described maker. But there are also binder jet
printers, laser metal 3D printers, as well as glass and clay 3D printers.
Additive manufacturing (AM) describes types of advanced manufacturing that are used to create three-dimensional structures out of plastics, metals, polymers, and other materials that can be sprayed through a nozzle or aggregated in a vat. These constructs are added layer by layer in real time based on a digital design. The simplicity and low cost of AM machines, combined with the scope of their potential creations, could profoundly alter global and local economies and affect international security.
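The layer-by-layer idea at the heart of AM can be made concrete with a toy slicer. The sketch below cuts a simple solid (a cone) into horizontal layers, each with its own cross-section radius; the dimensions and function name are invented for illustration:

```python
# Illustrative slicer: divide a cone into printable horizontal layers.
def slice_cone(height_mm, base_radius_mm, layer_height_mm):
    """Return (z, radius) pairs, one per layer, from the base upward."""
    layers = []
    z = 0.0
    while z < height_mm:
        # The cone narrows linearly toward the tip.
        radius = base_radius_mm * (1 - z / height_mm)
        layers.append((round(z, 2), round(radius, 2)))
        z += layer_height_mm
    return layers

for z, r in slice_cone(height_mm=1.0, base_radius_mm=10.0, layer_height_mm=0.25):
    print(f"z={z} mm, radius={r} mm")
```

Real slicing software performs the same decomposition on arbitrary digital designs, then converts each cross-section into machine toolpaths.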
Review Questions
1. What is nanotechnology? Write down some applications of nanotechnology.
2. Briefly explain biotechnology and its importance in agriculture, medicine, and the environment.
3. What is blockchain technology? Briefly explain how it works.
4. Briefly explain cloud computing and quantum computing.
5. What is autonomic computing? Write down some of its characteristics.
6. What is computer vision? List some real-world applications.
7. Briefly explain embedded systems and their components.
8. What is cybersecurity? List some cybersecurity threats. Write down the advantages of cybersecurity.
9. Briefly explain additive manufacturing.
Bibliography
G. Carro Fernández, S. Martín Gutiérrez, E. Sancristóbal Ruiz, F. Mur Pérez, and M. Castro Gil, pp. 51–58, Jun. 2012.
F. Griffiths and M. Ooi, “The fourth industrial revolution - Industry 4.0 and IoT [Trends in
Future I&M],” IEEE Instrum. Meas. Mag., vol. 21, pp. 29–43.
J. Hoerni, “Semiconductors and the second industrial revolution,” pp. 38–39, 1982.
J. Wan et al., “Software-Defined Industrial Internet of Things in the Context of Industry 4. 0,”
vol. 16, no. 20, pp. 7373–7380, 2016.
H. Xu, W. Yu, D. Griffith, and N. Golmie, “A Survey on Industrial Internet of Things : A Cyber-
Physical Systems Perspective,” IEEE Access, vol. PP, no. c, p. 1, 2018.
Smith, F.J., Data science as an academic discipline. Data Science Journal, 5, 2006. pp.163–164.
Thomas L. Floyd, Digital Fundamentals with PLD Programming, Pearson Prentice Hall, 2006.
Prakash G. Gupta, Data Communications and Computer Networking, Prentice-Hall, 2006.
“What is Data Science?” [Online]. Available: https://datascience.berkeley.edu/about/what-is-data-science/. [Accessed: September 7, 2019].
“Top 15 Artificial Intelligence Platforms - Compare Reviews, Features, Pricing in 2019,” PAT
RESEARCH: B2B Reviews, Buying Guides & Best Practices, 15-Jul-2019. [Online]. Available:
Brewster, C., Roussaki, I., Kalatzis, N., Doolin, K., & Ellis, K. (2017). IoT in agriculture:
Designing a Europe-wide large-scale pilot. IEEE communications magazine, 55(9), 26-33.
Ramakrishna, G.Kiran Kumar, A.Mallikarjuna Reddy, Pallam Ravi (2018). A Survey on various
IoT Attacks and its Countermeasures. International Journal of Engineering Research in
Computer Science and Engineering (IJERCSE), 5(4), 2394-2320.
Elijah, O., Rahman, T. A., Orikumhi, I., Leow, C. Y., & Hindia, M. N. (2018). An overview of
the Internet of Things (IoT) and data analytics in agriculture: Benefits and challenges. IEEE
Internet of Things Journal, 5(5), 3758-3773.
Foote, K. D. (2016). A brief history of the internet of things. Data Education for Business and IT Professionals. Available online: http://www.diversity.net/brief-history-internet-things/ (accessed on 12 November 2018).
Gupta, B. B., & Quamara, M. (2018). An overview of the Internet of Things (IoT): Architectural
aspects, challenges, and protocols. Concurrency and Computation: Practice and Experience,
e4946.
John Terra (2019). Everything You Need to Know About IoT Applications.
https://www.simplilearn.com/iot-applications-article
Mohamed, K. S. (2019). The Era of Internet of Things: Towards a Smart World. In the Era of
Internet of Things (pp. 1-19). Springer, Cham.
Vyas, D. A., Bhatt, D., & Jha, D. (2015). IoT: trends, challenges, and future scope. IJCSC, 7(1), 186-197.
Antonioli, M., Blake, C., & Sparks, K. (2014). Augmented reality applications in education. The
Journal of Technology Studies, 96-107.
Kipper, G., & Rampolla, J. (2012). Augmented Reality: an emerging technologies guide to AR.
Elsevier.
Kirner, C., Cerqueira, C., & Kirner, T. (2012). Using augmented reality artifacts in education and cognitive rehabilitation. Virtual Reality in Psychological, Medical and Pedagogical Applications, 247-270.
Margetis, G., Papagiannakis, G., & Stephanidis, C. (2019). Realistic Natural Interaction with
Virtual Statues in X-Reality Environments. International Archives of the Photogrammetry,
Remote Sensing and Spatial Information Sciences, 42(2/W11).
Thimbleby, H. (2013). Technology and the future of healthcare. Journal of public health
research, 2(3).
Thomas, B. H. (2012). A survey of visual, mixed, and augmented reality gaming. Computers in
Entertainment (CIE), 10(1), 3.
https://www.icaew.com/technical/ethics/ethics-and-new-technologies. [Accessed: 25-Aug-2019].
“IT Privacy Policy, Office of Privacy and Open Government, U.S. Department of Commerce.” [Online]. Available: http://www.osec.doc.gov/opog/privacy/digital_policy.html. [Accessed: 25-Aug-2019].
“How can you build trust when emerging technologies bring new risks?” [Online]. Available:
https://www.ey.com/en_gl/digital/how-can-you-build-trust-when-emerging-technologies-bring-
new-risks. [Accessed: 02-Sep-2019].
“‘Emerging Technologies are Already Impacting Security Strategies,’” IFSEC India, 11-Jan-
2019. [Online]. Available: https://www.ifsec.events/india/visit/news-and-updates/emerging-
technologies-are-already-impacting-security-strategies. [Accessed: 02-Sep-2019].
C. Lovatt, “5 Big Technology Challenges For Enterprises In The Future.” [Online]. Available:
https://blog.cutover.com/technology-challenges-enterprises-future. [Accessed: 08-Sep-2019].
“What are the ethical implications of emerging tech?,” World Economic Forum. [Online].
Available: https://www.weforum.org/agenda/2015/03/what-are-the-ethical-implications-of-
emerging-tech/. [Accessed: 25-Aug-2019].
B. Dainow, “Ethics in Emerging Technology,” ITNOW, vol. 56, pp. 16–18, Aug. 2014.
“Nanotechnology - Definition, and Introduction. What is nanotechnology?” Nanowerk. [Online]. Available: https://www.nanowerk.com/nanotechnology/introduction/introduction_to_nanotechnology_1.php. [Accessed: 08-Sep-2019].
“Biotechnology and its Applications - Study Material for NEET (AIPMT) & Medical Exams |
askIITians.” [Online]. Available: /biology/biotechnology-and-its-applications/. [Accessed: 08-
Sep-2019].
M. Parashar and S. Hariri, “Autonomic Computing: An Overview,” in Unconventional
Programming Paradigms, vol. 3566, J.-P. Banâtre, P. Fradet, J.-L. Giavitto, and O. Michel, Eds.
Berlin, Heidelberg: Springer Berlin Heidelberg, 2005, pp. 257–269.
“3D Printing and Additive Manufacturing – What’s the Difference?” All3DP, 24-Jan-2019.
[Online]. Available: https://all3dp.com/2/3d-printing-and-additive-manufacturing-what-s-the-
difference/. [Accessed: 02-Sep-2019].
B.-H. Lu, H.-B. Lan, and H.-Z. Liu, “Additive manufacturing frontier: 3D printing electronics,” Opto-Electron. Adv., vol. 1, no. 1, pp. 17000401–17000410, 2018.