
Presents

IT and Analytics Guide 2021-22

For queries and suggestions, mail to:


[email protected]
Contents
Introduction
Global Technology Trends in 2021
Managing through Market Disruption and Beyond
Cloud Computing
Internet of Things (IoT)
Artificial Intelligence
Machine Learning
Virtual Reality and Augmented Reality
Natural Language Processing
Blockchain
Cryptocurrency
5G
Recent Trends in 5G
Social Media Analytics
Analytics
Data Mining
Data Algorithms
Examples of companies using Analytics
Quantum Computing
Cybersecurity
NFT
Future of Data Sharing
Robotic Process Automation
Edge Computing
Virtual Events
Virtual workspace
Cloud Service Types
Product Management and Project Management
Product Management (Role)
Discussion Topics
APPENDIX
Introduction
The role of IT in business is seen in how it can help the company become more productive, increase
performance, save money, improve the customer experience, streamline communications and
enhance managerial decision-making. It also plays a role in helping companies expand globally
and in providing staff access to company information wherever and whenever they need it.
Information Technology (IT) can be defined as the utilization of hardware, software, services and
infrastructure to create, store, exchange and leverage information in its various forms to
accomplish multiple business objectives. Additionally, the term encompasses the workers that
develop, implement, maintain and utilize information technology directly and indirectly.
• IT Hardware – computers, tablets, mobile phones, printers, servers, etc.
• Software – productivity applications, network applications, security applications, etc.
• IT Services – integration, maintenance, repair, application development, managed services,
etc.
• IT Infrastructure – the Internet backbone, fiber optic networks, data centers, etc.
• Information – text data, documents, voice, video and images, etc.
• Business Objectives – migration, communication, collaboration, efficiency, insight,
production, e-commerce, etc.

Global Technology Trends in 2021


Digital workplaces are the future
The rapid growth of online collaboration tools in response to lockdowns and stay-at-home orders has pushed organizations to serve all their clients remotely. Building virtual spaces was crucial to help their people stay connected. Platforms like Zoom, Microsoft Teams, Google Meet and Skype for Business have gained importance in organizations and educational institutions for conducting virtual meetings. Dynamically changing employee preferences and a shrinking office footprint will make hybrid team operations the norm. With new features being added every day, organizations need to gear up to evolve the digital workplace at a pace never experienced before, to deliver a state-of-the-art user experience and meet the expectations of their clients and employees.

Cloud transformation accelerates


Much of our lives has shifted online. The recent surge in remote work, exploding e-commerce platform usage and endless streaming of content from the comfort of our homes have spurred greater cloud adoption and consumption, leading to double-digit growth rates in the cloud space. Many organizations, however, are still in the early stages of their cloud journey, but increasing digitalization will accelerate the move to cloud infrastructure and help businesses unlock productivity gains and drive efficiency and innovation at greater scale and speed.

Cyber security remains a top priority


The onset of COVID-19 caused a spike in fraudulent emails and text messages. A KPMG survey suggests that 41% of organizations have experienced an increase in incidents, primarily from spear-phishing and malware attacks that target employees working remotely. Securing internal networks remains a top priority, alongside educating an organization's workforce about cybersecurity.

Democratization of technology will give rise to citizen developers


With digital transformation occurring at a rapid pace, there is a dire need for developers. Centralized IT resources are being outpaced by new technological demands. New, intuitive and user-friendly development platforms that involve low or no coding allow professionals across different areas of an organization to build applications that can improve business processes. They also drive customer engagement with speed and at a fraction of the cost, effectively democratizing technology and narrowing the transformation gap.

Intelligent automation and artificial intelligence happen at scale


Processes are being rapidly automated and digitized, backed by advanced machines underpinned by artificial intelligence. Chatbots, for example, are integrated into business processes. They execute tasks based on data and learning, allowing humans to focus on value-creating work. With advanced technologies becoming more accessible through the cloud, intelligent automation and artificial intelligence are moving from the experimental phase to deployment at larger scales. This is reflected in the maturing conversations regarding ethics, regulation, control and other related issues.
Healthcare is connected
COVID-19 acted as a catalyst for changes that were long overdue in the healthcare industry. Health systems globally have leveraged digital technologies to deliver care through new channels and to support disease monitoring and contact tracing. Going forward, expansion of digital infrastructure will be necessary to build more resilient and connected health systems. This would enable the delivery of patient-centered integrated care, support insight-driven decision making and foster innovation, including advancements in operational and support functions.

Technology drives sustainability


Leaders have started recognizing ESG themes not just as a global challenge or a regulatory issue, but as a key opportunity to rebuild their organizations to support a sustainable economy, create a competitive edge, influence buyer decisions and attract talent. Technology is the driving force: it enables data flows and allows organizations to analyze and predict their impact on the environment. It creates transparency and drives efficiencies in value chains. It optimizes processes and systems to reduce waste, helps conserve energy and guides daily behavior to change for the better.

Managing through Market Disruption and Beyond

Approach for Software / SaaS Companies (KPMG Report)


Cloud Computing
Cloud computing is the practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer. It reduces upfront infrastructure cost so that the organization can focus on its projects. Cloud resources are usually not only shared by multiple users but also dynamically re-allocated on demand, which works well for allocating resources to users in different time zones.
Service Models [Source – NASSCOM Report]

Further readings:

• https://www.zdnet.com/article/what-is-cloud-computing-everything-you-need-to-know-about-the-cloud/
Deployment models

Internet of Things (IoT)


The internet of things is a system of interrelated computing devices, mechanical and digital machines,
objects, animals or people that are provided with unique identifiers and the ability to transfer data over a
network without requiring human-to-human or human-to-computer interaction.
How Does IoT Work?
An IoT ecosystem consists of web-enabled smart devices that use embedded processors, sensors and communication hardware to collect, send and act on data they acquire from their environments. IoT devices share the sensor data they collect by connecting to an IoT gateway or other edge device, where data is either sent to the cloud to be analyzed or analyzed locally. Sometimes these devices communicate with other related devices and act on the information they get from one another. The devices do most of the work without human intervention, although people can interact with the devices, for instance, to set them up, give them instructions or access the data. The connectivity, networking and communication protocols used with these web-enabled devices largely depend on the specific IoT applications deployed.
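
To make this flow concrete, here is a minimal Python sketch of a sensor node publishing readings to a gateway over MQTT, a protocol commonly used in IoT. It assumes the paho-mqtt library (1.x client API) and a hypothetical broker at "gateway.local"; the topic name and values are illustrative, not a real deployment.

import json
import time
import random
import paho.mqtt.client as mqtt

# Hypothetical gateway/broker address and topic; real deployments add TLS,
# authentication and retry logic.
client = mqtt.Client(client_id="temp-sensor-01")
client.connect("gateway.local", 1883)
client.loop_start()  # background network loop

for _ in range(3):
    reading = {
        "device_id": "temp-sensor-01",
        "temperature_c": round(random.uniform(20.0, 30.0), 2),  # stand-in for a real sensor read
        "ts": time.time(),
    }
    # The gateway (or a cloud bridge subscribed to this topic) decides whether
    # to analyze the data locally or forward it to the cloud.
    client.publish("factory/line1/temperature", json.dumps(reading))
    time.sleep(1)

client.loop_stop()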

Real World Applications of Internet of Things (IoT) across different sectors

GE Digital’s Predix Platform: GE created Predix Platform to address the unique needs of industrial
companies on the path to digitization. Predix Platform is a distributed application platform that’s purpose-
built for the digital industrial era.
It captures and analyzes the unique volume, velocity, and variety of machine data in a secure, industrial-
strength cloud environment.

Further Readings:
• https://www.ge.com/digital/iiot-platform
• https://www.ge.com/digital/blog/industrial-iot-how-connected-things-are-changing-manufacturing

Wipro’s Internet of Things (IoT) solutions: Wipro’s IoT solutions address connectivity of ‘legacy’ and ‘new’ things. Its Multi-Protocol IoT Gateway Framework amalgamates heterogeneous protocols that complement IoT at the edge, allowing select data to be transmitted to backend systems. The solutions span Fleet & Asset Management for real-time tracking of assets, integration with cloud-based data storage and analytics, and a flexible rules-based Smart Drone Framework for field operations.

Further Reading:
• https://www.wipro.com/en-IN/infrastructure/internet-of-things-iot/

IBM’s Cognitive IoT for Healthcare: IBM Watson Health is transforming healthcare by
helping organizations across the healthcare industry leverage data, technology and expertise to
solve clinical, operational and financial problems.

Further Readings:
• https://researcher.watson.ibm.com/researcher/view_group.php?id=7866
• https://www.ibm.com/in-en/cloud/internet-of-things
Trends in IoT

Artificial Intelligence

In computer science, the term artificial intelligence (AI) refers to any human-like intelligence
exhibited by a computer, robot, or other machine.

Artificial intelligence, machine learning, and deep learning:


The easiest way to understand the relationship between artificial intelligence (AI), machine
learning, and deep learning is as follows:
• Think of artificial intelligence as the entire universe of computing technology that exhibits
anything remotely resembling human intelligence.
• Machine learning is a subset of AI applications that learns by itself. It actually reprograms
itself, as it digests more data, to perform the specific task it's designed to perform with
increasingly greater accuracy.
• Deep learning is a subset of machine learning applications that teaches itself to perform a
specific task with increasingly greater accuracy, without human intervention.
Types of artificial intelligence:
• Weak AI: Also called Narrow AI or Artificial Narrow Intelligence (ANI)—it is AI trained and
focused to perform specific tasks.
• Strong AI: Also called Artificial General Intelligence (AGI), it is AI that more fully replicates
the autonomy of the human brain—AI that can solve many types or classes of problems and
even choose the problems it wants to solve without human intervention.

Artificial intelligence applications:


Artificial intelligence has made its way into a number of areas. Here are five examples.
1. AI in healthcare: The biggest bets are on improving patient outcomes and reducing costs.
Companies are applying machine learning to make better and faster diagnoses than humans.
One of the best-known healthcare technologies is IBM Watson. It understands natural language
and is capable of responding to questions asked of it. The system mines patient data and other
available data sources to form a hypothesis, which it then presents with a confidence scoring
schema.
2. AI in business: Robotic process automation is being applied to highly repetitive tasks normally
performed by humans. Machine learning algorithms are being integrated into analytics and
CRM platforms to uncover information on how to better serve customers.
3. AI in education: AI can automate grading, giving educators more time. AI can assess students
and adapt to their needs, helping them work at their own pace.
4. AI in finance: AI in personal finance applications, such as Mint or Turbo Tax, is disrupting
financial institutions. Applications such as these collect personal data and provide financial
advice. Today, software performs much of the trading on Wall Street.
5. AI in manufacturing: This is an area that has been at the forefront of incorporating robots
into the workflow. Industrial robots used to perform single tasks and were separated from
human workers, but as the technology advanced that changed.

AI – Recent Applications
Necessity is the biggest driver of (re)invention. The Covid-19 pandemic has dramatically
accelerated corporate digital transformation. Companies have developed new digital capabilities
backed by Artificial Intelligence, in an effort to build resilience and retool for the post-pandemic
world.

Edge AI transplants brains to factory tools and machinery


The next wave of artificial intelligence is called “edge AI” or “AI on the edge”. It is a network infrastructure that makes it possible for AI algorithms to run on the edge of a network, that is, closer to or even on the devices that acquire the data. The sudden and drastic changes in network traffic that have accompanied Covid-19 lockdowns and the transition to working from home will potentially accelerate the move towards edge computing.

Some of the advantages of edge computing include saving bandwidth and improving efficiency by
processing information closer to the users and the devices that require it, rather than sending that
data to be processed in central locations in a virtual cloud. When AI is locally embedded,
manufacturers can reduce latency issues and enhance the generation of insights while lowering
cloud service utilization and the costs. Connectivity cost also decreases, as processing part of the
data locally lowers bandwidth and cellular data usage. Since intelligence is run locally, plants
located in remote areas with poor communication infrastructure are less subject to connectivity
losses that can hinder mission-critical and time-sensitive decision making.

Edge AI “steals” a portion of the intelligence from the cloud infrastructure and brings it to machinery. Octonion, a start-up that integrates artificial intelligence into low-power microcontrollers, exemplifies how intelligence can be imbued into industrial products. This technology supports companies in making smart decisions in real time, locally, by using continuous learning models and machine health scores. Examples include deploying edge AI on industrial pumps and motors to improve monitoring of machine capabilities and to develop predictive maintenance techniques.

Automated and explainable AI makes financial organizations smarter


Banking and non-banking financial companies expect an 86% increase in investments in AI by the year 2025. User-friendly AI platforms that enable business employees to build models that are easy to understand and that deliver trustworthy output are essential for the large-scale deployment of AI. An example of a fully automated AI platform is DreamQuark’s Brain for sales and customer engagement teams in the financial sector. The application is built on deep-learning algorithms and is effective in detecting more than 40% of credit fraudsters. It can also examine and analyze customer transaction data along with customers’ preferences for different products like pensions, retirement and savings insurance schemes.

Health data is gold


The healthcare sector’s big data market is expected to reach nearly $70 billion by the year 2025. This rapid acceleration of health data acquisition gives the industry an unprecedented opportunity to leverage and deploy groundbreaking AI capabilities. Patient care also sees potential improvement with smarter use of health data. In association with Bain’s product and experience innovation team, a leading European distributor of medical supplies and services has applied AI, including machine learning, to the treatment of hard-to-heal wounds by developing a mobile application for healthcare professionals. The app is approved and used as a medical device. It utilizes image recognition to identify a wound and check whether it is infected or inflamed. Its use has led to a substantial reduction in unnecessary antibiotics and has cut the healing time of hard-to-heal wounds from several years to just a few months.

Machine Learning
Machine learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves. As the name suggests, it gives computers the very thing that makes them more similar to humans: the ability to learn. Machine learning is actively being used today, perhaps in many more places than one would expect.

Machine Learning Methods


• Supervised machine learning algorithms
• Unsupervised machine learning algorithms
• Semi-supervised machine learning algorithms
• Reinforcement machine learning algorithms
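
To make the first two methods concrete, here is a minimal scikit-learn sketch on synthetic data (the dataset and all names are illustrative assumptions, not from any real system). The supervised model learns from labels; the unsupervised one finds structure without them.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))            # 100 samples, 4 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # labels exist -> supervised setting

# Supervised: learn a mapping from features to known labels
clf = LogisticRegression().fit(X, y)
print("predicted labels:", clf.predict(X[:5]))

# Unsupervised: no labels; PCA just looks for structure in the features
pca = PCA(n_components=2).fit(X)
print("explained variance:", pca.explained_variance_ratio_)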

Applications

Marketing
“75% of enterprises using AI and machine learning enhance customer satisfaction by more than
10%.”
Measuring marketing’s many contributions to revenue growth is becoming more accurate and real-time thanks to analytics and machine learning. The following are some of the ways machine learning is revolutionizing marketing today and in the future:
• Using a concerted approach to applying AI and machine learning across a retailer’s value
chains has the potential to deliver a 50% improvement of assortment efficiency and a 30%
online sales increase using dynamic pricing.
• Machine learning is streamlining the creation, fine-tuning and revenue contributions of up-sell and cross-sell strategies by automating the entire process.
• Lead scoring accuracy is improving, leading to increased sales that are traceable back to
initial marketing campaigns and sales strategies.
• Identifying and defining the sales projections of specific customer segments and micro-segments using RFM (recency, frequency and monetary) modelling within machine learning apps is becoming pervasive (a small RFM sketch follows this list).
• Optimizing the marketing mix by determining which sales offers, incentives and programs are presented to which prospects through which channels is another way machine learning is revolutionizing marketing.
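
As promised in the RFM item above, here is a minimal pandas sketch of RFM (recency, frequency, monetary) scoring. The transactions table, column names and binning scheme are illustrative assumptions.

import pandas as pd

tx = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "date": pd.to_datetime(["2021-01-05", "2021-03-01", "2021-02-10",
                            "2021-01-20", "2021-02-15", "2021-03-05"]),
    "amount": [120.0, 80.0, 300.0, 40.0, 60.0, 55.0],
})
now = tx["date"].max()

rfm = tx.groupby("customer_id").agg(
    recency=("date", lambda d: (now - d.max()).days),  # days since last purchase
    frequency=("date", "count"),                       # number of purchases
    monetary=("amount", "sum"),                        # total spend
)
# Rank each dimension into 3 bins (1 = worst, 3 = best) and combine
rfm["score"] = (
    pd.qcut(rfm["recency"], 3, labels=[3, 2, 1]).astype(int)
    + pd.qcut(rfm["frequency"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
    + pd.qcut(rfm["monetary"], 3, labels=[1, 2, 3]).astype(int)
)
print(rfm.sort_values("score", ascending=False))
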
Finance
Leading banks and financial services companies are deploying AI technology, including machine
learning (ML), to streamline their processes, optimize portfolios, decrease risk and underwrite
loans amongst other things. Due to the high volume of historical financial data generated in the
industry, ML has found many useful applications in finance. The following are some of the
current applications of machine learning in finance:
Portfolio Management – Robo-Advisors: These provide automated financial guidance and services. They offer portfolio management that uses algorithms and statistics to automatically establish and manage a client's investment portfolio.

Algorithmic Trading: This is the use of algorithms to conduct trades autonomously. It is mostly hedge fund managers who make use of automated trading systems, and hence of machine learning, in finance. It allows traders to automate certain processes, ensuring a competitive advantage.
High-Frequency Trading (HFT): Machines in charge of HFT are nothing new. During 2009-2010, anywhere from 60% to 70% of U.S. trading was attributed to HFT. Some of the biggest players include companies like Tokyo-based Nomura Securities, Two Sigma Securities, Citadel Securities, Tower Research Capital and DRW, but there are many more operating in financial markets worldwide.
Fraud Detection: Fraud is a massive problem for financial institutions and one of the foremost reasons to leverage machine learning in finance. ML systems can scan through vast data sets, detect unusual activities (anomalies) and flag them instantly. ML is also the perfect candidate to tackle the problem of false positives, which occur regularly in finance.
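
As a minimal illustration of anomaly-based fraud flagging, here is a sketch using scikit-learn's IsolationForest on invented transaction features. Production fraud systems combine many models, rules and review queues; this only shows the core idea of flagging outliers.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Invented features per transaction, e.g. [amount, velocity-style signal]
normal = rng.normal(loc=[50.0, 1.0], scale=[10.0, 0.5], size=(500, 2))
fraud = rng.normal(loc=[900.0, 4.0], scale=[50.0, 1.0], size=(5, 2))  # a few extreme outliers
X = np.vstack([normal, fraud])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)  # -1 = anomaly, 1 = normal
print("flagged transactions:", np.where(flags == -1)[0])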

Operations
Machine learning makes it possible to discover patterns in supply chain data by relying on algorithms that quickly pinpoint the factors most influential to a supply network's success. Key factors influencing inventory levels, supplier quality, demand forecasting, procure-to-pay, order-to-cash, production planning, transportation management and more are becoming known for the first time. Ways machine learning is revolutionizing supply chain management include:
1. Machine learning algorithms and the apps running them are capable of analyzing large,
diverse data sets fast, improving demand forecasting accuracy.
2. Reducing freight costs, improving supplier delivery performance, and minimizing supplier
risk are three of the many benefits machine learning is providing in collaborative supply
chain networks.
3. Machine Learning and its core constructs are ideally suited for providing insights into
improving supply chain management performance not available from previous
technologies.
4. Machine learning excels at visual pattern recognition, opening up many potential
applications in physical inspection and maintenance of physical assets across an entire
supply chain network.

Industry Examples
IBM
IBM’s Watson has been following self-learning behavior models and has done everything from diagnosing certain types of cancer more effectively than oncologists to writing songs and producing movie trailers. In the case of cancer treatments, Watson can read half a million medical research papers in 15 seconds and was trained at Memorial Sloan Kettering in New York to suggest diagnoses and treatments to doctors.

Amazon
On the retail side, everything from product recommendations to supply chain, forecasting, and capacity planning runs on machine learning, while programs like Macie and Glue scan for sensitive data breaches and perform data cleansing, respectively. And of course there are Alexa, Prime Air, and Amazon Go, which all function through AI algorithms, while rumors of an AI fashion designer are feeding the Amazon AI flame.

Google
Google was one of the pioneers of machine learning with suggested searches and ever-evolving
search ranking algorithms. Google’s Machine Intelligence efforts have focused on deep learning,
which involves multiple layers of neural networks—built to simulate human thought processes—
that allow Google’s technology to process data more thoroughly.
Netflix
The online streaming giant announced an AI algorithm called Dynamic Optimizer that analyzes each and every frame of video in each of the roughly 13,000 titles it streams and compresses it without sacrificing image quality.

Virtual Personal Assistants


Siri, Alexa and Google Now are some popular examples of virtual personal assistants. As the name suggests, they assist in finding information when asked over voice. All you need to do is activate them and ask “What is my schedule for today?”, “What are the flights from Germany to London?”, or similar questions. To answer, your personal assistant looks up the information, recalls your related queries, or sends a command to other resources (like phone apps) to collect it. You can even instruct assistants to perform certain tasks, like “Set an alarm for 6 AM tomorrow morning” or “Remind me to visit the Visa Office the day after tomorrow”.

Virtual Reality and Augmented Reality


Virtual reality
Users are completely immersed in a computer-generated reality when they experience virtual
reality. Virtual reality is the most well-known of immersive technologies. Gaming and
entertainment were early adopters of virtual reality.

Industry Applications
Automotive industry - VR allows engineers and designers to experiment easily with the look
and build of a vehicle before commissioning expensive prototypes. Brands such as BMW and
Jaguar Land Rover already use VR to hold early design and engineering reviews to check the
visual design and object obscuration of the vehicle - all before any money has been spent on
physically manufacturing the parts.

Healthcare - Healthcare is an important application where VR can have a significant impact. Healthcare professionals now use virtual models to prepare themselves for working on real bodies, and VR has even been used as pain relief for burn injuries.

Augmented Reality
Do you remember the Pokémon GO craze? That is the most well-known application of augmented reality: technology that overlays digital information on the real world. Rather than provide a fully immersive virtual experience, augmented reality enhances the real world with images, text, and other virtual information via devices such as heads-up displays, smartphones, tablets, smart lenses, and AR glasses.

Industry Applications
Training - Every industry needs to train new recruits. AR is used to create training programs in which trainees receive step-by-step instructions, making training more engaging and interactive.
Assembly industry - In industries such as automotive or semiconductor manufacturing, where workers assemble components, workers used to rely on paper instructions or memorize all the steps. With augmented reality they are given step-by-step instructions, simplifying their job.

Warehouse logistics - AR applications are increasingly being used for order picking in warehouses. These applications combine several capabilities, such as image recognition, barcode scanning and indoor navigation, all integrated with the warehouse management system.

Natural Language Processing


Natural Language Processing (NLP) refers to an AI method of communicating with intelligent systems using a natural language such as English. Processing of natural language is required when you want an intelligent system like a robot to perform as per your instructions, when you want to hear decisions from a dialogue-based clinical expert system, and so on.
These underlying tasks are often used in higher-level NLP capabilities, such as:
Content categorization: A linguistic-based document summary, including search and indexing,
content alerts and duplication detection
Topic discovery and modelling: Accurately capture the meaning and themes in text collections,
and apply advanced analytics to text, like optimization and forecasting.
Contextual extraction: Automatically pull structured information from text-based sources.
Sentiment analysis: Identifying the mood or subjective opinions within large amounts of text,
including average sentiment and opinion mining.
Speech-to-text and text-to-speech conversion: Transforming voice commands into written
text, and vice versa.
Document summarization: Automatically generating synopses of large bodies of text.
Machine translation: Automatic translation of text or speech from one language to another.
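
As a minimal illustration of one of these tasks, sentiment analysis, here is a sketch using NLTK's VADER analyzer. It assumes nltk is installed and downloads the vader_lexicon resource on first run; the example sentences are invented.

import nltk
nltk.download("vader_lexicon", quiet=True)  # one-time resource download
from nltk.sentiment import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()
for text in ["The new release is fantastic!",
             "Support never replied and the app keeps crashing."]:
    # 'compound' ranges from -1 (most negative) to +1 (most positive)
    print(text, "->", sia.polarity_scores(text)["compound"])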

Blockchain

Blockchain is a technology which can be used to develop applications such as social networks, messengers, games, exchanges, storage platforms, voting systems, prediction markets, online shops and much more. In this sense, it is similar to the internet, which is why some have dubbed it “the Internet 3.0”. A blockchain is, in the simplest of terms, a time-stamped series of immutable records of data that is managed by a cluster of computers not owned by any single entity. Each of these blocks of data (i.e., a block) is secured and bound to the others using cryptographic principles (i.e., the chain).
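
A toy Python sketch can make the "time-stamped series of immutable records" idea tangible: each block stores the hash of the previous block, so tampering with any block breaks the chain. This shows only the chaining concept; real blockchains add consensus, signatures and distribution across many computers.

import hashlib
import json
import time

def make_block(data, prev_hash):
    block = {"ts": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps({k: block[k] for k in ("ts", "data", "prev_hash")},
                         sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 2", chain[-1]["hash"]))

# Verification: every block must reference the previous block's hash
ok = all(chain[i]["prev_hash"] == chain[i - 1]["hash"] for i in range(1, len(chain)))
print("chain valid:", ok)
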
Cryptocurrency
A cryptocurrency (or crypto currency) is a digital asset designed to work as a medium of
exchange using cryptography to secure the transactions and to control the creation of additional
units of the currency. Cryptocurrencies are classified as a subset of digital currencies and are also
classified as a subset of alternative currencies and virtual currencies. Bitcoin and its derivatives
use decentralized control as opposed to centralized electronic money/centralized banking
systems. The decentralized control is related to the use of bitcoin’s blockchain transaction
database in the role of a distributed ledger.

5G

• 5G is the 5th generation mobile network. It is a new global wireless standard after 1G, 2G, 3G,
and 4G networks. 5G enables a new kind of network that is designed to connect virtually
everyone and everything together including machines, objects, and devices.
• 5G is based on OFDM (Orthogonal frequency-division multiplexing), a method of modulating
a digital signal across several different channels to reduce interference.
• 5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra-low
latency, more reliability, massive network capacity, increased availability, and a more uniform
user experience to more users.
Barriers to 5G Adoption:
1. One major obstacle is that network providers will need to install a lot of new, and
expensive, infrastructure.
2. With 5G signals tending to travel relatively short distances, network providers will need to
deploy more antennas and base stations to ensure broad coverage.

5G in India:

1. During its AGM in 2020, RIL chairman Mukesh Ambani announced plans for a 'homegrown 5G' network in India. At the event, Ambani revealed that Jio has developed a 5G solution from scratch, that it will be ready for trials as soon as 5G spectrum is available, and that it can be ready for field deployment in 2021.
2. Currently, India does not have 5G services and the government has not yet allocated
spectrum to telecom operators for running field trials aimed at promoting domestic
ecosystems for 5G.

Applications:
Recent Trends in 5G
The fifth-generation of mobile networking technology, or 5G, has been top of mind for the telecom
industry in recent years and the buzz has trickled into almost every other market as businesses look
for ways to manage change through new connectivity options.
Here are some of the most important 5G trends that solution providers should be aware of this year
as the latest cellular technology becomes a viable connectivity option.
5G’s Impact on IoT
5G is going to help further IoT because of the latency and bandwidth improvements it can offer.
The IoT opportunities that will especially benefit from mobile and cellular connectivity include
transportation, manufacturing, farming, and smart cities use cases. And 5G could even make new
and emerging use cases and applications a true reality for the first time, such as connected cars,
which require lightning-fast, low-latency technologies.
Aside from the cutting-edge use cases, many industries right now need highly reliable low-latency
wireless links that can power applications as quickly as possible for their existing IoT use cases.
Connected Communities
Smart cities have become a major IoT trend in recent years as metro areas all around the world
equip indoor and outdoor areas with sensors to collect data and gain insights to better manage their
assets, resources and services.
5G is the technology that smart city and connected community use cases have been looking for. Existing 4G networks are limited in their support for simultaneous connections and suffer from high power consumption and a high price per bit. 5G, on the other hand, is expected to drive smart city applications by addressing these issues and, in return, harness the newly captured data to improve city operations.
5G And Security
As the number of 5G implementations increases, the need for strong security will become even more critical. Carriers such as AT&T, Verizon and T-Mobile have been bolstering their next-generation networks with added encryption and additional defenses at the edge of the network.
But 5G, unlike previous iterations of cellular technology, will be made up of a mostly software-
based network, so securing 5G is a different kind of endeavor. The applications that will ride on
top of the 5G network, such as IoT and smart city apps, will also require additional layers of
security to lock down the new devices and connections that will be joining the network.
5G At the Edge
The link between 5G and edge computing is all about latency. 5G promises to fuel innovation at the edge by powering brand-new use cases, enabling more data collection and faster processing than ever before, while giving businesses and organizations another connectivity option.
By combining 5G and edge computing, organizations will be able to outfit devices like smart
cameras and sensors to collect more data, which will drive more compute use cases at the edge.
This will result in expanding opportunities for solution providers in collecting data at the edge,
channel partners told CRN.
According to research firm IDC, the worldwide edge computing market is forecast to reach
approximately $250 billion in 2024 with a compound annual growth rate of 12.5 percent over the
next four years. 5G technology is expected to act as a catalyst for that market growth.

Social Media Analytics


Social media analytics is the practice of gathering data from social media websites and analyzing
that data using social media analytics tools to make business decisions. The most common use
of social media analytics is to mine customer sentiment to support marketing and customer
service activities.

This is considered the basic foundation for enabling an enterprise to:


● Execute focused engagements like one-to-one and one-to-many
● Enhance social collaboration over a variety of business functions, such as customer service,
marketing, support, etc.
● Maximize the customer experience

Social media is a good medium to understand real-time consumer choices, intentions and
sentiments. The most prevalent application of social media analytics is to get to know the
customer base on a more emotional level to help better target customer service and marketing.

The initial step in a social media analytics program is to figure out which business objectives can gain an advantage from the data that is collected and evaluated. Standard goals include maximizing business earnings, decreasing customer service expenditures, obtaining feedback on services and products, and enhancing public opinion about a business division or specific product. As soon as the business goals are determined, key performance indicators (KPIs) for objective evaluation of the data must be outlined.

The advantages of implementing social media tools include:

• Competitive Advantage: SMA tools allow organizations to gain a competitive edge over their competitors by facilitating a much better comprehension of their brands. This usually includes an understanding of how customers make use of particular services or products, what issues customers face while using these services or products, and customers' views about a particular company or product.
• Learn from the Customers: In many cases, customers may have effective solutions for some of
the issues faced by an organization. For example, if a product is in the market without proper
documentation, the chances of use errors increase. Some users may solve these problems
through trial-and-error, and then post their findings in forums, which can help the company
determine whether better documentation is required, and what users really need to know.
• Improve Products and Services: This is the key goal of SMA. There are countless tweets, blogs,
comments and complaints regarding products and services. This huge volume of information
contains consumer sentiments that can be used to evaluate users' experience with a particular
product or service. This information can then be used to help companies perform better.

Analytics
Data analytics is the science of analyzing raw data in order to make conclusions about that
information. Many of the techniques and processes of data analytics have been automated into
mechanical processes and algorithms that work over raw data for human consumption.

Data analytics techniques can reveal trends and metrics that would otherwise be lost in the mass
of information. This information can then be used to optimize processes to increase the overall
efficiency of a business or system.

Types of Data Analytics


Since big data is not a new concept for businesses, enterprises are leveraging different types of data analytics tools to extract meaningful information from their data. Here are the most relevant types of data analytics.

• Prescriptive Analytics: This data analytics concept prescribes what action to take to remove
future problems or capitalize on a promising trend. Prescriptive analytics essentially provides
an organization with a laser-like focus to answer a specific question. It also helps them to
determine the best solution for a future opportunity or avoid future risks.
• Predictive analytics: It uses big data to identify past patterns to predict the future. Predictive
analytics draws its power from numerous methods and technologies, such as big data, data
mining, statistical modeling, machine learning and assorted mathematical processes, among
others. By utilizing this model, an organization can use past and current data to reliably
forecast trends and behaviors.
• Descriptive analytics: This data analytics method provides insight into what has happened historically and gives businesses trends to examine in more detail. Descriptive analytics defines a preliminary stage of data processing that creates a summary of historical data to yield meaningful information and possibly prepare the data for further analysis.
• Diagnostic Analytics: With this analytics technique, historical data can be measured against other data to answer the question of why something happened. Essentially, data scientists turn to this technique when trying to determine the "why" behind an outcome. Diagnostic analytics can be beneficial in the sales cycle, for instance, to categorize customers by their likely product preferences and sales cycle.
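
As a minimal illustration of the first two stages most teams start with, here is a pandas sketch of descriptive (what happened) and diagnostic (why it happened) analytics on invented sales data.

import pandas as pd

sales = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "revenue": [100, 120, 90, 130],
    "ad_spend": [10, 14, 8, 15],
})

# Descriptive: summarize what happened historically
print(sales["revenue"].describe())

# Diagnostic: measure revenue against other data to ask why it moved
print("revenue vs ad spend correlation:",
      sales["revenue"].corr(sales["ad_spend"]))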

Applications of Data Analytics

The applications of data analytics are broad. Analyzing big data can optimize efficiency in many
different industries. Improving performance enables businesses to succeed in an increasingly
competitive world. One of the earliest adopters is the financial sector. Data analytics has an
important role in the banking and finance industries, used to predict market trends and assess risk.
Credit scores are an example of data analytics that affects everyone. These scores use many data
points to determine lending risk. Data analytics is also used to detect and prevent fraud to improve
efficiency and reduce risk for financial institutions.

The use of data analytics goes beyond maximizing profits and ROI, however. Data analytics
can provide critical information for healthcare (health informatics), crime prevention, and
environmental protection. These applications of data analytics use these techniques to
improve our world. Though statistics and data analysis have always been used in scientific
research, advanced analytic techniques and big data allow for many new insights. These
techniques can find trends in complex systems.

Data Analytics Tools


• R Programming

• Tableau

• Python

• SAS

• Excel

• SPSS
• Power BI

SPSS

SPSS is short for Statistical Package for the Social Sciences, and it’s used by various kinds of
researchers for complex statistical data analysis.
SPSS is used by market researchers, health researchers, survey companies, government entities,
education researchers, marketing organizations, data miners, and many more for the processing
and analyzing of survey data.
SPSS offers four programs that assist researchers with their complex data analysis needs.

• Statistics Program – basic statistical functions like frequencies, cross tabulation, and bivariate statistics
• Modeler Program – build and validate predictive models using advanced statistical procedures
• Text Analytics for Surveys Program – uncover powerful insights from responses to open-ended survey questions
• Visualization Designer – use data to create a wide variety of visuals like density charts and radial boxplots

Power BI
Microsoft’s Power BI is a cloud-based business analytics service for analyzing and visualizing data. Power BI gives you a platform to connect to hundreds of data sources and bring your data to life with live dashboards and reports.

Why is Power BI Important?


1. Telling stories through charts and data visualizations
2. Examining "what if" scenarios within data
3. Forecasting to make sure departments meet business metrics
4. Executive dashboards for managers for insight into departments
5. Automatic data refresh which provides near-real-time analytics of trends and indicators

Tableau
Tableau is a data visualization tool used for data science and business intelligence. It presents the impact of data visually and comes with real-time data analytics capabilities and cloud support.
Power BI vs. Tableau

• Data sources – Power BI: limited access to other databases and servers when compared to Tableau. Examples: SQL Server Database, Access Database, SQL Server Analysis Services Database, Oracle Database, IBM DB2 Database, IBM Informix database (Beta). Tableau: access to numerous database sources and servers. Examples: Excel, Text File, Access, JSON File, PDF File, Spatial File, Statistical File, Other Files (such as Tableau .hyper, .tds, .twbx), Connect to a Published Data Source on Tableau Online or Server, Actian Matrix, Actian Vector, Amazon Athena, Amazon Aurora, Amazon EMR, Amazon Redshift.
• Data capacity – Power BI: each workspace/group can handle up to 10 GB of data; for more than 10 GB, the data needs to be in the cloud (Azure); if it is in local databases, Power BI just selects or pulls the data from the database and does not import it. Tableau: works on a columnar structure which stores only unique values for each column, making it possible to fetch billions of rows.
• Machine learning – Power BI: integrated with Microsoft Azure, which helps in analyzing the data and understanding the trends and patterns of the product/business. Tableau: Python machine learning capabilities are inbuilt, making it efficient for performing ML operations over datasets.
• Performance – Power BI: can handle a limited volume of data. Tableau: can handle a huge volume of data with better performance.
• Target audience – Power BI: naive users and experienced users. Tableau: even though access is easy and simple, analysts and experienced users use it for their analytics purposes.
• Pricing – Power BI: very cheap when compared to Tableau. Tableau: costlier than Power BI, and costs more when connected to third-party applications.
• Real-time dashboards – Power BI: with real-time streaming you can stream data and update dashboards in real time; any visual or dashboard created in Power BI can also display and update real-time data and visuals. Tableau: provides a feature for real-time data; the Connect Live feature is used for real-time data analysis.

Data Mining
Data mining is the practice of automatically searching large stores of data to discover patterns and
trends that go beyond simple analysis. Data mining uses sophisticated mathematical algorithms to
segment the data and evaluate the probability of future events. In simple terms it is used to turn
raw data into useful information.
Steps in Data Mining
1) Data Cleaning – Data cleaning is the process of preparing data for analysis by removing or modifying data that is incorrect, incomplete, irrelevant, duplicated, or improperly formatted. Such data is usually not necessary or helpful when analyzing data, because it may hinder the process or produce inaccurate results. There are several methods for cleaning data, depending on how it is stored and the answers being sought. Data cleaning is not simply about erasing information to make space for new data, but rather about maximizing a data set's accuracy without necessarily deleting information (a small pandas sketch of this and the transformation step follows the step list).

2) Data Integration – Data integration is a data pre-processing technique that involves combining data from multiple heterogeneous data sources into a coherent data store and providing a unified view of the data.
3) Data Selection – Data Selection is the process where data relevant to the analysis task are
retrieved from the database. Sometimes data transformation and consolidation are performed
before the data selection process.
4) Data Transformation – Data transformation is the process of changing the format, structure,
or values of data. For data analytics projects, data may be transformed at two stages of the data
pipeline. Organizations that use on-premises data warehouses generally use an ETL process, in
which data transformation is the middle step. Today, most organizations use cloud-based data
warehouses, which can scale compute and storage resources with latency measured in seconds or
minutes. The scalability of the cloud platform lets organizations skip preload transformations and
load raw data into the data warehouse, then transform it at query time, a model called ELT.
5) Data Mining – Data mining involves six common classes of tasks:
• Anomaly detection (outlier/change/deviation detection) – The identification of unusual data
records, that might be interesting or data errors that require further investigation.
• Association rule learning (dependency modeling) – Searches for relationships between
variables. For example, a supermarket might gather data on customer purchasing habits. Using
association rule learning, the supermarket can determine which products are frequently bought
together and use this information for marketing purposes. This is sometimes referred to as
market basket analysis.
• Clustering – is the task of discovering groups and structures in the data that are in some way
or another "similar", without using known structures in the data.
• Classification – is the task of generalizing known structure to apply to new data. For example,
an e-mail program might attempt to classify an e-mail as "legitimate" or as "spam".
• Regression – attempts to find a function that models the data with the least error, that is, for estimating the relationships among data or datasets.
• Summarization – providing a more compact representation of the data set, including
visualization and report generation.
6) Pattern Evaluation – Pattern evaluation is the task of identifying the truly interesting patterns representing knowledge, based on given interestingness measures. It uses summarization and visualization to make the data understandable to the user.
7) Knowledge Representation – Knowledge representation is the presentation of knowledge to the user for visualization in terms of trees, tables, rules, graphs, charts, matrices, etc.
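
As referenced in step 1, here is a minimal pandas sketch of the cleaning and transformation steps on an invented order table. Real pipelines wrap this logic in ETL/ELT tooling; the column names and rules are illustrative assumptions.

import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 2, 2, 3, 4],
    "amount": ["100", "250", "250", None, "80"],
    "country": ["in", "IN", "IN", "us", "US"],
})

# Step 1, cleaning: drop duplicates, fix types, handle missing values
clean = (orders.drop_duplicates("order_id")
               .assign(amount=lambda d: pd.to_numeric(d["amount"]))
               .dropna(subset=["amount"]))

# Step 4, transformation: normalize formats and derive a new structure
clean["country"] = clean["country"].str.upper()
by_country = clean.groupby("country", as_index=False)["amount"].sum()
print(by_country)
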
Data Mining Algorithms

Data Mining Applications


Data Mining is highly useful in the following domains –
• Market Analysis and Management
• Corporate Analysis and Risk Management
• Fraud Detection

Data Algorithms
Data mining is an interdisciplinary subfield of computer science: the computing process of discovering patterns in large data sets. It is an essential process in which intelligent methods are applied to extract data patterns.
Techniques used in traditional methods include:

● Regression (linear model)


● Logistic Regression (non-linear model)
● Clustering
● Factor Analysis
● Time Series
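
To ground the first technique in this list, here is a minimal scikit-learn sketch of a linear regression on invented data; the feature and target are illustrative assumptions.

import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])  # e.g. years of experience
y = np.array([30.0, 35.0, 42.0, 48.0, 55.0])       # e.g. salary in thousands

model = LinearRegression().fit(X, y)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("prediction for 6 years:", model.predict([[6.0]])[0])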

K-means:

K-means clustering, also known as the nearest centroid classifier or the Rocchio algorithm, is a method of vector quantization that is considerably popular for cluster analysis in data mining. K-means is used to create k groups from a set of objects so that the members of a group are more similar to one another. Cluster analysis is a family of algorithms designed to form groups such that the group members are more similar to each other than to non-group members. K-means assigns data points to a cluster such that the sum of the squared distances between the data points and the cluster's centroid (the arithmetic mean of all the data points that belong to that cluster) is at a minimum. The less variation we have within clusters, the more homogeneous (similar) the data points are within the same cluster.
K-means is a relatively efficient method. However, we need to specify the number of clusters in advance, and the final results are sensitive to initialization and often terminate at a local optimum.

To process the learning data, the K-means algorithm in data mining starts with a first group of
randomly selected centroids, which are used as the beginning points for every cluster, and then
performs iterative (repetitive) calculations to optimize the positions of the centroids. It halts
creating and optimizing clusters when either:
● The centroids have stabilized — there is no change in their values because the clustering
has been successful.
● The defined number of iterations has been achieved
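
Here is a minimal scikit-learn sketch of the procedure just described: k centroids are initialized, points are assigned iteratively, and the algorithm stops when the centroids stabilize or the iteration limit is hit. The two-blob dataset is synthetic.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Two obvious blobs of points in 2-D
X = np.vstack([rng.normal(0.0, 0.5, size=(50, 2)),
               rng.normal(5.0, 0.5, size=(50, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("centroids:\n", km.cluster_centers_)
print("within-cluster sum of squares:", km.inertia_)  # the quantity being minimized
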
Apriori Algorithm:
The Apriori algorithm is a classical algorithm in data mining. It is used for mining frequent item sets and relevant association rules. It is devised to operate on a database containing a lot of transactions, for instance, items bought by customers in a store. With the quick growth of e-commerce applications, vast quantities of data accumulate in months rather than years. Data mining, also known as Knowledge Discovery in Databases (KDD), is used to find anomalies, correlations, patterns, and trends to predict outcomes. Apriori is very important for effective market basket analysis, and it helps customers purchase their items with more ease, which increases the sales of the markets. It has also been used in the field of healthcare for the detection of adverse drug reactions (ADRs): it produces association rules that indicate which combinations of medications and patient characteristics lead to ADRs.
Usage of the Apriori Algorithm in the Complementary Goods Concept:
Complements are items bought and used together; they help lift each other's sales in a customer's basket. Examples of complements could be bread and butter, or flights and taxi services.
How Apriori Helps:
1. Item placement in stores: complementary items can be placed together or closer.
2. On e-commerce websites, whenever an item is bought, recommending its complementary
items, since these are frequently bought together.
3. On unavailability of an item, recommending its substitute.
4. Giving combo offers on an item and its complement to lift sales or clear stock.
5. Whenever there is a price hike or drop for an item, monitoring the impact on the sales/demand
of its substitute. This helps in taking conscious and planned pricing decisions.
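A minimal plain-Python sketch of the Apriori idea follows: count item supports, prune infrequent
items, and then count only pairs built from frequent items. The baskets and the support threshold
are invented for illustration.

from itertools import combinations
from collections import Counter

baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "jam"},
    {"butter", "milk"},
    {"bread", "butter", "jam"},
]
min_support = 2  # an itemset is "frequent" if it appears in at least 2 baskets

# Pass 1: count single items and keep the frequent ones
item_counts = Counter(item for basket in baskets for item in basket)
frequent_items = {i for i, c in item_counts.items() if c >= min_support}

# Pass 2 (the Apriori pruning step): only pairs made of frequent items
# can themselves be frequent, so other pairs are never counted at all
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket & frequent_items), 2):
        pair_counts[pair] += 1
frequent_pairs = {p: c for p, c in pair_counts.items() if c >= min_support}

# Rule bread -> butter: confidence = support(pair) / support(bread)
print(frequent_pairs)
print(pair_counts[("bread", "butter")] / item_counts["bread"])  # 0.75

Here bread and butter co-occur in three of five baskets, so a store might place them together or
bundle them in a combo offer, exactly as in the list above.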
Naive Bayes -
Naive Bayes (NB) is a simple supervised method and a special form of discriminant analysis. It is
a generative model and therefore returns probabilities. It is the opposite of the One Rule (OneR)
classification strategy: all attributes contribute equally and independently to the decision. Naive
Bayes makes predictions using Bayes' Theorem, which derives the probability of a prediction from
the underlying evidence, as observed in the data. Naive Bayes works surprisingly well even when
the independence assumption is clearly violated, because classification doesn't need accurate
probability estimates so long as the greatest probability is assigned to the correct class.
Naive Bayes also works well for text categorization. NB affords fast model building and scoring
and can be used for both binary and multi-class classification problems. The naive Bayes classifier
is very useful in high-dimensional problems, where multivariate methods like QDA and even LDA
break down. For example, a fruit may be considered to be an apple if it is red, round, and about
3 inches in diameter. Even if these features depend on each other or on the existence of the other
features, all of these properties independently contribute to the probability that the fruit is an
apple, and that is why the method is known as 'naive'.
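To mirror the apple example, here is a hedged scikit-learn sketch with an invented toy data set;
a Gaussian Naive Bayes model stands in for whichever NB variant a real system would use.

import numpy as np
from sklearn.naive_bayes import GaussianNB

# Features: [redness 0-1, roundness 0-1, diameter in inches]
X = np.array([[0.9, 0.95, 3.0],   # apple
              [0.8, 0.90, 3.2],   # apple
              [0.2, 0.30, 1.0],   # banana
              [0.1, 0.25, 1.2]])  # banana
y = np.array(["apple", "apple", "banana", "banana"])

model = GaussianNB().fit(X, y)

# Each feature contributes to the class probability independently,
# which is exactly the "naive" assumption described above
print(model.predict([[0.85, 0.9, 3.1]]))        # -> ['apple']
print(model.predict_proba([[0.85, 0.9, 3.1]]))  # per-class probabilities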
What is Big Data and why is Big Data Analytics important?

Big Data refers to a volume of data so huge that it cannot be stored and processed using the
traditional computing approach within a given time frame. But how huge does the data need to be
to be termed Big Data? There is a lot of misconception surrounding what amount of data can be
termed Big Data. Usually, data in gigabytes, terabytes, petabytes, exabytes, or anything larger is
considered Big Data. This is where the misconception arises: even a small amount of data can be
referred to as Big Data depending on the context in which it is used.
Big Data Analytics and Data Science

Big Data analytics involves the use of advanced analytics techniques and tools on data obtained
from different sources and in different sizes, including machine learning, data mining, natural
language processing, and statistics. The data is extracted, prepared, and blended to provide
analysis for businesses. Data analytics involves qualitative as well as quantitative techniques to
improve business productivity and profits. Data analytics tools are used by researchers, analysts,
and engineers in business organizations to access the data efficiently.

Real-time Benefits of Big Data Analytics


There has been enormous growth in the field of Big Data analytics as the benefits of the
technology have become clear. Many industries use big data analytics, but banking is seen as the
field making the maximum use of it.
Banking analytics, or applications of data mining in banking, can help improve how banks
segment, target, acquire and retain customers. Additionally, improvements to risk management,
customer understanding, and fraud detection enable banks to maintain and grow a more profitable
customer base. As per Deloitte research, three business drivers increase the importance of
analytics within the banking industry:
● Regulatory reform – Major legislation such as Dodd-Frank, the CARD Act, FATCA
(Foreign Account Tax Compliance Act) and Basel III have changed the business environment
for banks. Given the focus on systemic risk, regulators are pushing banks to demonstrate
better understanding of data they possess, turn data into information that supports business
decisions and manage risk more effectively. Each request has major ramifications on data
collection, governance and reporting. Over the next several years, regulators will finalize
details in the recently passed legislation. However, banks should start transforming their
business models today to comply with a radically different regulatory environment.
● Customer profitability – Personalized offerings are expected to play a big role in attracting
and retaining the most profitable customers, but studies show that a small percentage of banks
have strong capabilities in this area. The CARD Act and Durbin Amendment make it even
more important to understand the behavioral economics of each customer and find ways to
gain wallet share in the most profitable segments.
● Operational efficiency – While banks have trimmed a lot of fat over the past few years, there
is still plenty of room for improvement, including reducing duplicative systems, manual
reconciliation tasks and information technology costs.

The education sector is also making use of data analytics in a big way. There are new options for
research and analysis using data analytics. The institutional data can be used for innovations by
technical tools available today. Due to immense opportunities, Data analytics has become an
attractive option to study for students as well.

How Data Analytics Can Be Used in the Education Sector


Statistical Models: These can be used to forecast the grades of students in a class. Based on
certain parameters collected, if the model predicts that a student will have a poor CGPA, it can
generate a warning to the instructor indicating that the student will have to work harder to reach
the desired mark. The teacher can then see which subjects a student is weak in and implement
study plans accordingly (a minimal sketch of such a model appears after this list).
Judging Panels: During interviews of students applying to schools and universities, the judging
panel can model the student's performance on the entry test to gauge potential, since entry-test
scores and CGPAs tend to be highly correlated. They may also record absentee rates and use
models to study how attendance affects student performance.
Dropout Rate: A lot of students in different parts of the world drop out of schools and colleges
because of various reasons. Predictive models help in evaluating the risks of student dropouts
using data analysis and in turn help in taking precautionary measures against it.
Virtual Interview: There is also an AI-based virtual interview platform that mimics an actual
face-to-face interview. It can be used to automatically evaluate the candidate's body language.
The same technique can be used in classrooms to examine who is paying proper attention, who is
not, and who is pretending.
Cloud: Teachers can leverage cloud technology to give students maximum access to study
material. The technology can also encourage independent learning: students can take ownership
of their learning and learn even outside school hours. The University of Michigan uses a tool
called E2Coach that automatically sends its students personalized course performance messages
based on a continually updated algorithm.
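Returning to the "Statistical Models" point above, a minimal sketch of a grade-forecasting model
might look like the following; the data, features and warning threshold are all invented for
illustration, and scikit-learn is assumed to be available.

import numpy as np
from sklearn.linear_model import LinearRegression

# Features: [attendance %, assignment average %, entry-test percentile]
X = np.array([[95, 88, 90], [60, 55, 40], [80, 75, 70], [50, 45, 35]])
y = np.array([9.1, 5.8, 7.6, 5.1])  # observed CGPAs

model = LinearRegression().fit(X, y)

new_student = [[55, 50, 45]]
predicted_cgpa = model.predict(new_student)[0]
if predicted_cgpa < 6.0:  # hypothetical warning threshold
    print(f"Warning: predicted CGPA {predicted_cgpa:.1f}, extra support needed")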

Use Cases
IBM has its own project that has been using analytics and helping schools succeed. The
University of Florida has also been using this platform to extract the student data and use it to
monitor and predict their student performance. Finger Lakes Community College and The Keller
Graduate School of Management have already adopted the IBM Analytics platform to track their
students and both the institutes have seen a rise in the performance of their students.
The insights provided by the big data analytics tools help in knowing the needs of customers
better. This helps in developing new and better products. Improved products and services with
new insights can help the firm enormously. This may help the customers too as they get better
offerings satisfying their needs effectively. All in all, Data analytics has become an essential part
of the companies today.

Examples of companies using Analytics

Coca-Cola

Coca Cola is known to have ploughed extensive research and development resources into
artificial intelligence (AI) to ensure it is squeezing every drop of insight it can from the data it
collects.
Fruits of this research were unveiled earlier this year when it was announced that the decision to
launch Cherry Sprite as a new flavor was based on monitoring data collected from the latest
generation of self-service soft drinks fountains, which allow customers to mix their own drinks.

Healthy options - The company combines weather data, satellite images, information on crop
yields, pricing factors, and acidity and sweetness ratings to ensure that orange crops are grown
in an optimum way and maintain a consistent taste. An algorithm then finds the best combination
of variables to match products to local consumer tastes in the 200-plus countries around the world
where its products are sold.

Social data mining - Coca Cola closely tracks how its products are represented across social
media, and in 2015 was able to calculate that its products were mentioned somewhere in the
world an average of just over once every two seconds.
Knowing this gives insight into who is consuming their drinks, where their customers are, and
what situations prompt them to talk about their brand. The company has used AI-driven image
recognition technology to spot when photographs of its products, or those of competitors, are
uploaded to the internet, and uses algorithms to determine the best way to serve them
advertisements.
Netflix Recommendation System

With over 100 million subscribers, the company collects huge amounts of data, which is the key
to achieving the industry status Netflix boasts. Whenever you access the Netflix service, the
recommendations system helps you find a show or movie to enjoy with minimal effort. The
likelihood that you will watch a particular title in the catalogue is estimated based on a number
of factors, including your interactions with the service (such as your viewing history and how you
rated other titles), other members with similar tastes and preferences on the service, and
information about the titles, such as their genre, categories, actors, release year, etc.

To best personalize the recommendations following factors are considered:


• the time of day you watch,
• the devices you are watching Netflix on and
• how long you watch

All of these pieces of data are used as inputs to the algorithms. The recommendations system does
not include demographic information (such as age or gender) as part of the decision-making
process.
To improve the recommendation system, feedback from every visit to the Netflix service is
collected and used to continually re-train the algorithms, improving the accuracy of their
prediction of what you're most likely to watch.
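Netflix's actual algorithms are proprietary; as a hedged illustration of the general idea of
taste-based recommendation, here is a much-simplified collaborative-filtering sketch with an
invented ratings matrix.

import numpy as np

# Rows = members, columns = titles; 0 means "not watched/rated"
ratings = np.array([[5, 4, 0, 1],
                    [4, 5, 0, 2],
                    [1, 0, 5, 4]], dtype=float)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

target = 0  # recommend for the first member
sims = np.array([cosine(ratings[target], ratings[m])
                 for m in range(len(ratings))])
sims[target] = 0  # ignore self-similarity

# Predicted score per title = similarity-weighted average of other members
predicted = sims @ ratings / sims.sum()
unseen = np.where(ratings[target] == 0)[0]
print("recommend title index:", unseen[np.argmax(predicted[unseen])])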
Amazon Fresh and Whole Foods

Amazon knows the online customer backwards and forwards, but when it comes to
understanding the brick-and-mortar shopper, they lack insight.
Amazon didn't buy Whole Foods for the business - they bought it for the data.
What exactly is in the Whole Foods data that Amazon would want? Answer: Grocery buying
habits and patterns. Preferences. Correlations between purchases of different products and even
different categories. With massive amounts of data from Whole Foods shoppers, Amazon will
ultimately be able to tailor the grocery shopping experience to the individual. Amazon has
already mastered the process of upselling, i.e., offering additional items that go with the items
the consumer is looking to buy. With consumables like groceries, Amazon will know when you
run out of cereal and will present you with the offer to buy more at precisely the right time.
Alternatively, the new box of cereal may just show up at your door at the moment you take that
last bite.

Adidas

Due to the size of Adidas, keeping track of all products on its e-commerce activity is a difficult
task. Johannes Wagner, senior business analyst at Adidas Group explained, “Across Europe,
there are two brands (Adidas and Reebok), 17 markets and over 9,000 individual articles”.
Combined, this creates a mammoth task with a huge number of data points.

This high volume of data means that, before implementing data analytics, the workload was very
high and all areas of the business were being pushed for time. Secondly, merchandisers were
unable to perform in-season management of their items as they were not equipped with the
appropriate solutions, meaning they had to contact the analysis team instead. The solution was
based around the collection of transactional data from multiple sources and enabling this data to
dictate the direction of business decisions through analytics insights. Using an analytics platform,
Wagner explained how the nature of analytics allowed merchandisers to perform data tasks and
in-season management on their own without consulting analyst teams. This increased the teams’
independence and allowed the analysts to spend more time on high-value tasks. This, along with
newfound granular product tracking, had an overall outcome of higher profit margins.

Amazon Alexa

Alexa is built on natural language processing (NLP), a procedure for converting speech into
words, sounds, and ideas. Amazon records your words. Because interpreting sounds takes a lot of
computational power, the recording of your speech is sent to Amazon's servers, where it can be
analyzed more efficiently.
Amazon breaks down your “orders” into individual sounds. It then consults a database containing
various words’ pronunciations to find which words most closely correspond to the combination
of individual sounds. It then identifies important words to make sense of the tasks and carry out
corresponding functions. For instance, if Alexa notices words like “sport” or “basketball”, it
would open the sports app. Amazon’s servers send the information back to your device and Alexa
may speak. If Alexa needs to say anything back, it would go through the same process described
above, but in reverse order.
The main command has three parts: the wake word, the invocation name, and the utterance.
Wake word - The word ("Alexa") that wakes up the device. The wake word puts Alexa into
listening mode, ready to take instructions from the user.
Invocation name - The keyword used to trigger a specific "skill". Users can combine the
invocation name with an action, command or question. Every custom skill must have an
invocation name to start it.
Utterance - The phrase the user says when making a request to Alexa (a single word such as
"Taurus", spoken to a horoscope skill, can be an utterance). Alexa identifies the user's intent from
the given utterance and responds accordingly. The utterance, in short, decides what the user wants
Alexa to perform.

Next, the Alexa-enabled device sends the user's instruction to a cloud-based service called Alexa
Voice Service (AVS). Think of the Alexa Voice Service as the brain of Alexa-enabled devices: it
performs all the complex operations such as Automatic Speech Recognition (ASR) and Natural
Language Understanding (NLU). The Alexa Voice Service processes the request, identifies the
user's intent, and then makes a web service request to a third-party server if needed.
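Amazon's actual pipeline is far more sophisticated, but a toy sketch can illustrate how a spoken
command decomposes into wake word, invocation name, and utterance. The parsing rules, the
two-word invocation assumption, and the example skill below are all invented for illustration.

def parse_command(command: str, wake_word: str = "alexa", trigger: str = "ask"):
    words = command.lower().split()
    if not words or words[0] != wake_word:
        return None  # no wake word: the device keeps sleeping
    if trigger in words:
        i = words.index(trigger)
        # assume a two-word invocation name, e.g. "daily horoscope"
        invocation = " ".join(words[i + 1:i + 3])
        utterance = " ".join(words[i + 3:])
        return {"invocation": invocation, "utterance": utterance}
    return {"invocation": None, "utterance": " ".join(words[1:])}

print(parse_command("Alexa ask daily horoscope what is the reading for Taurus"))
# -> {'invocation': 'daily horoscope', 'utterance': 'what is the reading for taurus'}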

IBM Watson Analytics

IBM Watson Analytics is an intelligent, self-service data analysis and visualization application
for discovering patterns and insights in your data. It guides you through the process of discovery
and automates the predictive analysis and related cognitive processes that come afterward.
Because of the natural language processing capability of IBM Watson Analytics, you can interact
with your data as if you are having a conversation with it. As such, you can extract answers from
structured and unstructured information with ease. IBM Watson represents a new era of
computing called cognitive computing. It is a cloud-based data discovery service intended to
provide the benefits of advanced analytics without the complexity. Watson Analytics empowers
even novice users to understand and make use of data science techniques ranging from machine
learning to predictive modeling.

Furthermore, IBM Watson Analytics lets you instantly find new and emerging trends in your
data. The service even presents them visually through your dashboards so you can detect
patterns faster.

Quantum Computing

Quantum computing is an area of computing focused on developing computer technology based
on the principles of quantum theory, which explains the behaviour of energy and material on the
atomic and subatomic levels.

Classical computers that we use today can only encode information in bits that take the value of 1
or 0, which restricts their ability. Quantum computing, on the other hand, uses quantum bits, or
qubits. It harnesses the unique ability of subatomic particles to exist in more than one state, i.e.,
as a 1 and a 0 at the same time. Superposition and entanglement are the two features of quantum
physics on which these supercomputers are based. They empower quantum computers to handle
operations at speeds exponentially higher than conventional computers, at much lower energy
consumption.
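Superposition can be made concrete with ordinary linear algebra. The hedged numpy sketch below
simulates a single qubit classically (no real quantum hardware involved): the Hadamard gate puts
the qubit into an equal superposition, and "measurement" samples 0 or 1 with probabilities given
by the squared amplitudes.

import numpy as np

qubit = np.array([1, 0], dtype=complex)        # start in state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

qubit = H @ qubit                  # equal superposition of |0> and |1>
probs = np.abs(qubit) ** 2         # -> [0.5, 0.5]

# Simulated measurements: roughly half 0s and half 1s
samples = np.random.choice([0, 1], size=10, p=probs)
print(probs, samples)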

Quantum computing could contribute greatly in the fields of finance, military affairs, intelligence,
drug design and discovery, aerospace designing, utilities (nuclear fusion), polymer design,
Artificial Intelligence (AI) and Big Data search, and digital manufacturing.

Its potential and projected market size has engaged some of the most prominent technology
companies to work in the field of quantum computing, including IBM, Microsoft, Google, D-
Waves Systems, Alibaba, Nokia, Intel, Airbus, HP, Toshiba, Mitsubishi, SK Telecom, NEC,
Raytheon, Lockheed Martin, Rigetti, Biogen, Volkswagen, and Amgen.

Some applications of Quantum Computing:

Optimization: Many optimization problems search for a global minimum. Using quantum
annealing, such optimization problems may be solved faster than on classical supercomputers.

Machine Learning / Big Data: ML and deep learning researchers are seeking efficient ways
to train and test models on large data sets. Quantum computing can help make the process
of training and testing faster.

Simulation: Simulation is a useful tool to anticipate possible errors and take action. Quantum
computing methods can be used to simulate complex systems.
Material Science: Chemistry and material science are limited by the calculations of the complex
interactions of atomic structures. Quantum solutions are promising a faster way to model these
interactions.

Cybersecurity
Cyber security refers to the body of technologies, processes, and practices designed to protect
networks, devices, programs, and data from attack, damage, or unauthorized access. Cyber security
may also be referred to as information technology security.

Cyber security is important because government, military, corporate, financial, and medical
organizations collect, process, and store unprecedented amounts of data on computers and other
devices. A significant portion of that data can be sensitive information, whether that be intellectual
property, financial data, personal information, or other types of data for which unauthorized access
or exposure could have negative consequences. Organizations transmit sensitive data across
networks and to other devices in the course of doing businesses, and cyber security describes the
discipline dedicated to protecting that information and the systems used to process or store it. As
the volume and sophistication of cyber-attacks grow, companies and organizations, especially
those that are tasked with safeguarding information relating to national security, health, or financial
records, need to take steps to protect their sensitive business and personnel information.

Challenges of Cyber Security

For effective cyber security, an organization needs to coordinate its efforts throughout its entire
information system. Elements of cyber security encompass all of the following:

Network security: The process of protecting the network from unwanted users, attacks and
intrusions.

Application security: Apps require constant updates and testing to ensure these programs are
secure from attacks.

Endpoint security: Remote access is a necessary part of business, but can also be a weak point
for data. Endpoint security is the process of protecting remote access to a company’s network.

Data security: Inside of networks and applications is data. Protecting company and customer
information is a separate layer of security.

Identity management: Essentially, this is a process of understanding the access every individual
has in an organization.

Database and infrastructure security: Everything in a network involves databases and physical
equipment. Protecting these devices is equally important.

Cloud security: Many files are in digital environments or “the cloud”. Protecting data in a 100%
online environment presents a large number of challenges.
Mobile security: Cell phones and tablets involve virtually every type of security challenge in and
of themselves.

Disaster recovery/business continuity planning: In the event of a breach, natural disaster or
other event, data must be protected and business must go on. For this, you'll need a plan.

End-user education: Users may be employees accessing the network or customers logging on to
a company app. Educating users in good habits (password changes, 2-factor authentication, etc.)
is an important part of cybersecurity.

There remains a lot of speculation about what happens after the pandemic, but six things appear to
be certain:

Some organizations will need to move to new operating models. For these companies,
immediately after the crisis, cybersecurity and IT access rights will require careful examination
and handling. Remote worker monitoring and support will become vital. And for workers who
transition from home back to the office, cybersecurity professionals must ensure stringent system
and access scrutiny before allowing the shifted systems to connect back to the network.

Companies will need to reset their security systems to ensure there are no outliers. Both physical
and digital systems will need to be restarted, to check for any digital holes in the fence. System
and data access rights granted during the pandemic to enable remote work will require auditing to
determine whether they should be revoked or updated. IT systems will need to be analyzed for
cracks, foul paths or fraudulent identities. The reason is that cybercriminals may have found ways
to gain entry into otherwise hardened facilities.

New cyber risks that appeared during the pandemic must be understood. For instance, security
experts will need to scrutinize the digital capabilities of critical business functions, making sure
they can withstand cyberattacks during a lockdown. They will examine critical supply chains,
including digital supply chains, to ensure continuity during a health crisis.

Corporate IT security architectures should be reassessed. This includes access mechanisms,
support needs for remote access on a mass scale, and risk/context-based security authentication
mechanisms.

Updates to remote access and bring-your-own-device (BYOD) policies must be made. They
should include cybersecurity hygiene controls.

Advanced technology must be deployed. Threat detection and response capabilities must include
advanced capabilities supported by next-generation technologies like big data, artificial
intelligence and machine learning. These are needed to detect and respond to adverse behaviour at
machine speed, without human interventions. Further, organizations will want to explore insurance
against losses from cyberattacks incurred during a pandemic scenario.

Cyber Security Examples:

Encryption
Encrypting data in storage, transit and use.
Authentication
Securely identifying people and digital entities.

Authorization
Defining and implementing privileges for computing resources.

Network Security
Securing networks with techniques such as a network perimeter.

Sandboxing
Running untrusted software in a virtual environment where it can do no harm.

Internal Controls
Internal controls such as the requirement that different people write code, review the code and
launch it into production.

Security by Design
Architecting and designing systems, applications and infrastructure to be secure.

Secure Coding
A series of principles and practices for developing code that is free of security vulnerabilities.

Secure Testing
Testing cycles designed to discover security vulnerabilities.

Defense in Depth
The principle that no layer of security assumes anything about the others; for example, an
application that doesn't assume a firewall has prevented external access.

Physical Security
Physical security such as a data center with access controls.

Audit Trail
Logging that records interactions with systems, applications, databases and infrastructure such
that malicious activity can be detected and reconstructed.

Defensive Computing
Users who are aware of cybersecurity and are careful in their use of technology.

Non-Repudiation
The ability to prove that a commercial transaction took place.

Security Infrastructure
Foundational tools that offer security services such as a virus scanner or intrusion detection
system.
Monitoring
Monitoring systems, applications and infrastructure and promptly investigating suspicious
activity.

Vulnerability Management
Tracking known vulnerabilities to software and hardware and applying fixes in a timely manner.

Response to Breaches
Defending your services, resources and data from an attack.
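To give a concrete taste of two controls from the list above (hashing for audit trails and
message authentication), here is a small illustrative sketch using only Python's standard library;
the log record is invented and the key is generated in-process purely for the demo.

import hashlib, hmac, secrets

# Audit trail / integrity: a hash fingerprints a log record so that any
# later tampering becomes detectable
record = b"2021-06-01 10:32 user=alice action=wire_transfer amount=500"
fingerprint = hashlib.sha256(record).hexdigest()

# Authentication: an HMAC tag proves the record came from a key holder
key = secrets.token_bytes(32)  # shared secret key
tag = hmac.new(key, record, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)  # constant-time comparison

print(fingerprint[:16], verify(record, tag))  # True for the untampered record
print(verify(record + b"0", tag))             # False once the record changes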

Cyber Security Trends:

Rise of Automotive Hacking


The first cyber security trend for 2021 is the rise of automotive hacking. Modern vehicles come
packed with automated software that creates seamless connectivity for drivers: cruise control,
engine timing, door locks, airbags, and advanced driver-assistance systems. These vehicles use
Bluetooth and Wi-Fi technologies to communicate, which also opens them to several
vulnerabilities and threats from hackers. Gaining control of a vehicle or using its microphones
for eavesdropping is expected to rise in 2021 with greater use of automated vehicles. Self-driving
or autonomous vehicles use an even more complex mechanism that requires strict cybersecurity
measures.

Integrating AI with Cyber Security


With AI being introduced in all market segments, this technology, combined with machine
learning, has brought tremendous changes to cybersecurity. AI has been paramount in building
automated security systems, natural language processing, face detection, and automatic threat
detection. However, it is also being used to develop smart malware and attacks that bypass the
latest security protocols. AI-enabled threat detection systems can predict new attacks and notify
admins of any data breach instantly, making this the next cyber security trend for 2021.

Mobile is the New Target


Cybersecurity trend data show a considerable increase (50 percent) in mobile banking malware
and attacks in 2019, making our handheld devices a potential target for hackers. All our photos,
financial transactions, emails, and messages make the threat to individuals greater. Smartphone
viruses and malware may well capture the attention of cybersecurity trends in 2021.

Cloud is Also Potentially Vulnerable


With more and more organizations now established on clouds, security measures need to be
continuously monitored and updated to safeguard data from leaks. Although cloud applications
such as Google's or Microsoft's are well equipped with security on their end, it is the user end
that acts as a significant source of errors, malicious software, and phishing attacks.
Data Breaches: Prime target
Data will continue to be a leading concern for organizations around the world. Whether for an
individual or an organization, safeguarding digital data is the primary goal now. Any minor flaw
or bug in your browser or software is a potential vulnerability for hackers to access personal
information. Strict new measures such as the General Data Protection Regulation (GDPR),
enforced from May 25, 2018, offer data protection and privacy for individuals in the European
Union (EU). Similarly, the California Consumer Privacy Act (CCPA), applied from January 1,
2020, safeguards consumer rights in California.

IoT with 5G Network: The New Era of Technology and Risks


The next raging cyber security trend for 2021 is IoT on 5G networks. With 5G networks having
begun rolling out globally in 2020, a new era of inter-connectivity will become a reality with the
Internet of Things (IoT). This communication between multiple devices also opens them to
vulnerabilities from outside influence, attacks, or unknown software bugs. Even Google Chrome,
the world's most used browser, was found to have serious flaws. 5G architecture is still relatively
new in the industry and requires a lot of research to find and close loopholes so the system is
secure from external attack. Every stage of the 5G network might bring a plethora of network
attacks that we are not yet aware of. Here manufacturers need to be very strict in building
sophisticated 5G hardware and software to control data breaches.

Automation and Integration


Here's the next cyber security trend: with the size of data multiplying every day, it is evident that
automation must be integrated to give more sophisticated control over information. Modern
hectic work demands also pressure professionals and engineers to deliver quick and proficient
solutions, making automation more valuable than ever. Security measures are incorporated
during the agile process to build more secure software in every aspect. Large and complex web
applications are even harder to safeguard, making automation, as well as cyber security, a vital
part of the software development process.

Targeted Ransomware
Another significant cybersecurity trend that we can't ignore is targeted ransomware. Industries in
developed nations especially rely heavily on specific software to run their daily activities, and
ransomware targets are becoming more focused: the WannaCry attack on National Health Service
hospitals in England and Scotland corrupted more than 70,000 medical devices. Though
ransomware generally threatens to publish the victim's data unless a ransom is paid, it can also
affect large organizations and even nations.

State-Sponsored Cyber Warfare


There won't be any let-up in attempts by western and eastern powers to gain superiority. The
tension between the US and Iran, or attacks by Chinese hackers, often makes worldwide news;
though the attacks are few, they have a significant impact on events such as elections. And with
more than 70 elections due to be held this year, criminal activity during this period will surge.
Expect high-profile data breaches, along with political and industrial secrets, to top the cyber
security trends for 2021.

Insider Threats
Human error is still one of the primary reasons for data breaches. Any bad day or intentional
loophole can bring down a whole organization, with millions of records stolen. Verizon's data
breach report offers strategic insights into cybersecurity trends: employees directly or indirectly
caused 34 percent of total attacks. So make sure you create more awareness within the
organization to safeguard data in every way possible.

NFT

What is an NFT? What does NFT stand for?


NFT stands for non-fungible token, which is a form of cryptocurrency that is unlike any other. It
is a digital certificate of asset possession; a cricket match ticket, for example, denotes possession
of one seat at the game. NFTs are cryptographic assets on the blockchain with unique
identification codes and metadata that differentiate them from one another, according to
Investopedia. NFTs may be used to establish ownership of artwork, buildings, collections, and
anything else that is unique. Since an NFT is transparent and stored on the blockchain, it is easy
for anyone to see who owns the token.

What Makes NFT Unique?


NFTs are non-fungible, which means they are one-of-a-kind. Your cellphone, laptop,
PlayStation 5 or makeup kit, and everything else in your environment that has the same value to
anyone and can easily be replaced, is fungible. NFTs are items that exist only in digital form and
can be purchased and owned; artwork is a good example of something that cannot be replaced.

How Do NFTs Function?


NFTs, as previously said, are digital certificates that can be purchased or sold just like any other
valued object. When you purchase an NFT, you will receive a certificate that is protected by
Blockchain Technology and identifies you as the owner of that digital asset. Without requesting
permission, anyone can build, buy, and sell an NFT. Cybercriminals would have a hard time
hacking or tampering with these properties since they are kept in an encrypted peer-to-peer
network.

Are NFTs Similar to Bitcoin?


While both NFTs and Bitcoin are considered cryptocurrencies, NFTs differ in that they cannot be
used directly as a medium for commercial transactions. Bitcoin is fungible, meaning that it can
be sold or substituted for other currencies; the value of one bitcoin, for example, is equal to the
value of another bitcoin. This is not the case with NFTs. Even if they are on the same platform or
in the same set, two NFTs are never identical. Consider an NFT a one-of-a-kind collectible card
that can only be held by one person at any given time.
What Is the Future of NFTs?
NFT is a relatively new concept in India, and experts say it will take some time for this trend to
gain traction. "India has lakhs of traditional artisans who could profit from using NFTs to check
their original work," says Rahul Pagidipati, CEO of ZebPay. Add to that the growing number of
digital media artists who can cover their work with a tokenized "wrapper" to demonstrate that it
is an original work. NFTs will gain popularity over time as more mainstream artists become
aware of the unique cryptocurrency.

What Are Some of the Most Expensive NFTs?


• CGI artist Mike Winkelmann (Beeple) sold 21 of his works for 3.5 million dollars.
• Rick and Morty images were sold for 2.3 million dollars.
• Land in the popular blockchain game Axie Infinity was sold for 1.5 million dollars.
• NFTs representing CryptoPunks characters were sold for 750,000 dollars.

Future of Data Sharing


Data on its own has limited value but, when aggregated with other data and forged into insights
and applications, data can transform businesses, create enormous value, and help solve some of
the biggest societal problems.
The challenges of data aggregation have led to the emergence of platforms and ecosystems that
facilitate data sharing. These are growing in size and number (like smart cities), and more and
more of them are enabling integrated, citizen- and business-centric solutions that use data from
disparate sources. Tech giants have established themselves as early movers in shaping the data-
sharing marketplace. As the benefits of data aggregation increase, there are four models that appear
to facilitate broad data sharing within and across industries.

Vertical Platform: Within individual industries, vertical platforms have formed in order to share
data and provide solutions to targeted needs, such as predictive maintenance, supply chain
optimization, operational efficiency, and network optimization. Airbus’s Skywise and Penske’s
Fleet Insight, for example, provide benchmarking and other services using aggregated data from
airlines and logistics providers, respectively. Volkswagen recently opened its Industrial Cloud to
external companies, inviting platform partners to both contribute software applications that
increase the carmaker’s production efficiency and improve their own operations by scaling their
applications.

Super Platforms: Big companies, both tech giants and traditional industry incumbents, have
recognized the value in data sharing and are positioning themselves to capture a significant share
by expanding their existing vertical platforms to become super platforms. Super platforms
aggregate data across both verticals and data entities to support the development of applications
that address new sets of use cases. Most super platforms so far have been consumer focused, but
there are also early examples that aggregate data for industrial and B2B uses. Siemens’
MindSphere, ABB Ability, and Honeywell Forge, among others, are competing to be the digital
entry point for the factory. Super platforms can also address big societal issues, such as energy
efficiency. Schneider Exchange connects Schneider Electric customers with an open ecosystem of
analytics and solution providers that can develop applications by tapping directly into data from
the entire electrical system, from generation to transmission to commercial and residential use.
The issue with super platforms is gaining assurance over where and how the data is being used.

Shared Infrastructure: Most large companies are migrating at least some of their data and IT
infrastructure to cloud services provided by so-called hyperscalers, such as Amazon Web Services
(AWS) and Microsoft Azure, some of which also provide super platforms. Hyperscalers facilitate
data sharing by providing both cloud storage infrastructure and the applications that put data to
use for consumers and businesses. They already host massive amounts of data from all kinds of
industries, and they are in a natural position to aggregate data by building connections across
companies.
As data sharing generates more value, and as more data migrates to the cloud, providers can
differentiate themselves by offering data connectivity services that help clients capture and retain
business. Down the road, cloud providers can offer additional features that allow companies to
control access to their data, trace it as it is being shared across ecosystems, and monitor—and
potentially charge for—its use. Ecosystem orchestrators looking to bolster data sharing can shift
the entire ecosystem to the cloud platform with the best sharing functionality.

Distributed Data Space: There are two drivers of innovation from data sharing: aggregation and
access. Aggregation of data from disparate sources can lead to more innovation as hidden
relationships are revealed. Aggregation of more and more data from the same source across time
and space can facilitate benchmark comparisons and generate insights into trends. Likewise,
greater access and more open platforms unlock innovation by allowing a broader base of talent to
solve problems. Innovation contests such as Kaggle and DrivenData can help connect data sets
and problems with analytical talent.
But data concentrated in a few companies’ hands can also hamper innovation if those companies
aggregate only limited data types or seek to control access. As tech giants and others build out
infrastructure and services to consolidate data, the impact of network effects propels them into
powerful positions in the market. The distributed data space concept is capturing the attention of
think tanks, academics, researchers, and investors. For example, the UK’s Open Data Institute is
exploring the value of data sharing as well as models such as data trusts and other data institutions.

Robotic Process Automation


What is RPA?
Robotic process automation (RPA) is the term used for software tools that partially or fully
automate human activities that are manual, rule-based, and repetitive. It works by replicating the
actions of an actual human interacting with one or more software applications to perform tasks
such as entering data, processing standard transactions, or responding to simple customer service
queries. The best example of RPA is the "chatbot" that has become ubiquitous on websites to
handle standard queries.
Advantages of RPA:
• It frees humans from monotonous, low-value-added tasks like data entry and makes them
available for higher-value tasks that require human creativity, ingenuity, and decision making.
• It helps to ensure that outputs are complete, correct, and consistent between tasks and between
human workers
• It helps to ensure that tasks can be completed more quickly because the robotic process
automation tool can find and retrieve any necessary data in the background

Characteristics of RPA:
• Flexibility: An RPA bot can be programmed to complete almost any repetitive task.
• Ease of integration: Thanks to screen scraping and existing integrations, RPA bots do not need
deep integration work and can input to and evaluate the output of almost all Windows
applications.
• Ease of implementation: Setting up RPA is as simple as setting up a macro in Excel by
recording your actions. While a drag-and-drop interface is available for setting up most
automation, next-generation RPA bots learn the activities to be automated from employees'
actions, which is also called cognitive or intelligent automation.
• Cost: RPA further reduces the cost of the process. Business process outsourcing solutions are
no longer economical when those processes can be automated, yielding better results at less
cost than outsourcing.
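To make "manual, rule-based, and repetitive" concrete, here is a toy sketch of the kind of task a
bot might take over: pick up each new CSV invoice from an inbox folder, validate it with one
fixed rule, and file it. The folder names, CSV schema and validation rule are all assumptions for
illustration, not a real RPA product's API.

import csv, shutil
from pathlib import Path

INBOX, DONE, REJECTED = Path("inbox"), Path("processed"), Path("rejected")
for folder in (INBOX, DONE, REJECTED):
    folder.mkdir(exist_ok=True)

for invoice in INBOX.glob("*.csv"):
    with invoice.open(newline="") as f:
        rows = list(csv.DictReader(f))
    # The same rule, applied identically to every file: every row must
    # carry a positive amount
    valid = bool(rows) and all(float(r.get("amount") or 0) > 0 for r in rows)
    shutil.move(str(invoice), str(DONE if valid else REJECTED))
    print(f"{invoice.name}: {'processed' if valid else 'rejected'}")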

Different types of RPA:

Attended Automation
Attended automation refers to the kind of automation where the bot or agent passively resides on
the user's machine and is invoked by the user at certain instances. Triggering has to happen
actively through the user's action, since the points of triggering are hard to detect
programmatically.

The best example of attended automation is customer service. A customer's inquiry might
mandate a few basic checks that would otherwise have to be performed manually by the service
representative, with the output mostly a matter of inference. The "agent" or "bot" takes care of
scraping the information and pasting it into the relevant fields, taking a monotonous task away
from the representative. The automation also ensures that there is no error in the copying or
pasting of information.

Unattended Automation
Unattended automation is an enhanced version of RPA. It is used for tasks that can run in the
background, processing the essential data to produce the output. This saves a lot of time for
back-end employees, who deal less with customers and more with data and processes. Various
triggers are used for unattended automation, such as data input in a specific field, bot-initiated
launching, workflow-initiated triggers, and time-slot-based bots.
Hybrid RPA
Large organizations today have both a support environment and a back-end environment, so they
need RPA that offers the best of both to make their processes more robust and efficient. Thus,
they use both attended and unattended automation.

Difference between RPA and AI


AI is a complementary sibling to RPA robots. RPA and AI work together to expand automation
into all sorts of new areas, allowing more, and more complex, tasks to be automated.

RPA cleans up your underlying processes to provide an easily integrated framework on top of
your existing digital systems. Without this underlying foundation, the barrier to entry for
integrating AI is much higher: AI would need to be manually woven into your core processes.

Trends in RPA:
RPA, a topic of keen interest in the C-suite, is quickly making strides across several industries,
including manufacturing, retail, telecommunications, BFSI, and insurance. However, the list
doesn't end there; it rather begins. The key trends to look forward to are:
RPA Will be the New ERP
The community of global system integrators (GSIs) and audit-based consulting organizations will
motivate and train a huge number of workers to embrace automation, much as they did with
enterprise resource planning (ERP) software in the 1990s.

These organizations recognize that the automation industry is ready for explosive growth and see
an undeniable opportunity to sell business systems and enablement services that help their
customers reap new rewards, much like they once did with ERP.

Rise of SPA
SPA, or Smart Process Automation, is essentially an extension of RPA. The prior generation of
RPA was capable of automating structured data with a pre-defined set of rules. With advances in,
and easy incorporation of, machine learning, SPA bots serve as an alternative to RPA's "if-then"
rules and statements.

A Mix of Manual and Digital Efforts


Software robots will automate much of people's work by taking over the unpredictable, dreary
and monotonous tasks, leading to a mix of both manual and automated effort.

The future of RPA will see modern innovations such as advanced data analytics, business process
automation, artificial intelligence, blockchain and optical character recognition (OCR) combined
with RPA to offer powerful automation. We can likewise expect increased growth among robotic
process automation companies.

More Prominent RPA Vendor Differentiation Insights


Companies currently face a bewildering choice of more than 150 robotic process automation
(RPA)-branded products, which vary substantially in productivity claims, design quality and
approach. A much greater effort is needed to comprehend these significant technical nuances. A
clearer picture of RPA products will start emerging in 2021, with vendors falling into two general
classes: those that give quick tactical advantages across desktop environments and those that
deliver more strategic transformation across a large-scale enterprise.
Elimination of Paper Work
Robotic process automation is a data-driven automation process, helping companies manage the
monotonous tasks, activities and mundane time-consuming duties previously performed by
humans.
Dependence on Work-from-home Robots
Companies are envisioning a strategy of "a robot for each individual". That vision will soon turn
into reality, with each individual having their own robot (or digital assistant). It is predicted that
the RPA market will change in order to support this continued shift to remote working models.

Employee Experience will Become Significant


Over the past couple of years, we've seen an expanded spotlight on the customer experience (CX),
and companies now foresee a similar approach to the employee experience (EX).

Take the 2020 COVID-19 pandemic, for instance. Countless workers endured increased anxiety
and concern related to the recession and what it would mean for them and their families. The
whole automation market must address this with processes and operations that help improve the
employee experience, which is crucial for increasing morale, engagement and productivity.

References:
• https://research.aimultiple.com/what-is-robotic-process-automation/
• https://www.sapcle.com/blog/?p=1081
• https://www.uipath.com/blog/ai-rpa-differences-when-to-use-them-together

Edge Computing

Edge computing refers to computing done at or near the source of the data, instead of relying on
the cloud at one of a dozen data centers to do all the work. The word "edge" in this context means
literal geographic distribution. Edge computing is based on a networking philosophy focused on
bringing computing as close to the source of data as possible in order to reduce latency and
bandwidth use. In simpler terms, edge computing means running fewer processes in the cloud and
moving those processes to local places, such as a user's computer, an IoT device, or an edge
server. Bringing computation to the network's edge minimizes the amount of long-distance
communication that has to happen between a client and a server.
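As a hedged illustration of the bandwidth argument, the sketch below aggregates raw sensor
readings locally at the edge and ships only a small summary onward; send_to_cloud is a
hypothetical stand-in for a real uploader, and the sensor data is simulated.

import random, statistics

def send_to_cloud(payload: dict) -> None:  # hypothetical uploader stub
    print("uploading:", payload)

# 1,000 raw readings collected locally at the edge node
readings = [random.gauss(25.0, 0.5) for _ in range(1000)]

# Edge-side aggregation: the long-distance hop carries one small summary
# instead of every raw value, cutting bandwidth use and latency
summary = {
    "sensor": "temp-01",
    "count": len(readings),
    "mean": round(statistics.mean(readings), 2),
    "max": round(max(readings), 2),
}
send_to_cloud(summary)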

Advantages of Edge Computing


• Scalability
• Decreased latency
• Decrease in bandwidth use and associated cost
• Decrease in server resources and associated cost
• Added functionality

How is Edge Computing different from other computing models?


The first computers were large, bulky machines that could only be accessed directly or via
terminals that were basically an extension of the computer. With the invention of personal
computers, computing could take place in a much more distributed fashion. For a time, personal
computing was the dominant computing model. Applications ran and data was stored locally on a
user's device, or sometimes within an on-premise data center.
Cloud computing, a more recent development, offered a number of advantages over this locally
based, on-premise computing. Cloud services are centralized in a vendor-managed "cloud" (or
collection of data centers) and can be accessed from any device over the Internet.
However, cloud computing can introduce latency because of the distance between users and the
data centers where cloud services are hosted. Edge computing moves computing closer to end
users to minimize the distance that data has to travel, while still retaining the centralized nature of
cloud computing.

Why we need Edge Computing


With the increasing need for speed and low latency, ever more data, and new use cases where
edge computing makes sense, the benefits of edge computing are coming into focus.
There are many drivers for edge computing other than the sheer data-related and IoT-related ones.
Moreover, most of the so-called next-generation applications that will really need extremely low
latency and extremely high availability are not here yet, and it's not certain they will arrive
anytime soon, even with the 5G that many of them are waiting for. There simply isn't any
guarantee that things like autonomous vehicles, or virtual and augmented reality at scale, will
become a reality soon. Even once 5G is really here, there is no guarantee that true self-driving
cars will ever be a reality outside specific areas; there is far more to it than meets the eye. VR and
AR may find their place here and there, but adoption in industrial applications will be slower than
many like to believe.
Vendors of edge computing equipment, of course, aren't waiting, given the use cases and
scenarios in which edge already makes sense. Just as some anticipated use cases probably won't
happen at all, others that we don't think about today will pop up. That doesn't change the overall
case for edge computing, and there is already plenty of low-hanging fruit in customer-facing areas
such as retail and commercial applications, which is where edge computing vendors are focusing.

Edge Computing Vs. Cloud Computing


It’s important to understand that cloud and edge computing are different, non-interchangeable
technologies that cannot replace one another. Edge computing is used to process time-sensitive
data, while cloud computing is used to process data that is not time-driven.
Besides latency, edge computing is preferred over cloud computing in remote locations, where
there is limited or no connectivity to a centralized location. These locations require local storage,
similar to a mini data center, with edge computing providing the perfect solution for it.
Edge computing is also beneficial for specialized and intelligent devices. While these devices are
akin to PCs, they are not regular computing devices designed to perform multiple functions: they
are intelligent and respond to particular machines in a specific way. However, this specialization
becomes a drawback for edge computing in certain industries that require immediate responses.

References:
• https://www.theverge.com/circuitbreaker/2018/5/7/17327584/edge-computing-cloud-google-
microsoft-apple-amazon
• https://www.cloudflare.com/learning/serverless/glossary/what-is-edge-computing/
• https://www.simplilearn.com/edge-computing-vs-cloud-computing-article

Virtual Events
Tech marketers are beginning to embrace virtual events as a way of reaching diverse audiences to
meet a wide range of objectives — from brand awareness and demand generation to customer
experience and education. Most big vendors have already added virtual events to their marketing
mix. In response to this increasing demand, the number and diversity of players in the virtual event
field has exploded. Tech marketers face the daunting task of selecting the right platform for their
needs. Many of the available platforms provide the basic functionality — reminiscent of a physical
event with various venues and incorporating familiar online communication and collaboration
tools such as videos, chat, and collateral downloads. In this nascent market, certain vendors stand
out, distinguished by innovative features, exceptional service, or ability to scale delivery. As the
tools become more widely used, best practices develop and mature, as do some of the players in
the market. Choosing the right platform and services helps avoid the pitfalls experienced by early
adopters and increases the likelihood of the event delivering the chosen marketing objectives.
Virtual events powered by Artificial Intelligence
In a pandemic-riddled world, several industries including IT, Retail, Healthcare, Automotive,
Education, BFSI to name a few, are actively transitioning to virtual events. From internal trainings,
press releases, product launches, trade shows, or even a client conference, virtual events are
becoming the norm rather than an exception.
While the beginning of 2020 threw the events industry into a tizzy, the latter part of 2020 proved
that virtual events are not only here to stay, but will grow and morph into lean, mean delivery
machines. With more than 93% of event marketers planning to invest in virtual platforms,
according to the Post-Covid Event Outlook Report, virtual event platforms, it seems, will hold
sway going forward.
With the adoption of Artificial Intelligence for virtual events, where AI-powered bots provide
virtual companionship for audiences, the efficacy of these platforms to anchor customer
engagement, deliver personalized experiences, build positive disposition for brands and drive
demand generation has become even more promising. For instance, using machine learning, bots
can observe and learn engagement behaviour patterns and act as your personal virtual concierge.
Imagine this: in a virtual event, anywhere between 50 and 500 documents get uploaded for
attendees to browse through and read at their convenience. Trying to sift and filter these
documents is tedious and time-consuming, but bots can provide suggestions by learning about
your interests and even auto-suggest people you could network with. From converting voice to
text and making session notes to directly emailing them to you, virtual event bots have become
an inseparable part of platforms that aim to deliver greater personalization, increase audience
engagement and improve audience retention.

The future of virtual events


Pre-Covid, around 40% of the marketing spends were on on-ground physical events. But with a
very challenging global environment for business, the on-ground event spends will see a decrease
of up to 20-25% and will move to virtual events. Businesses believe that events will continue to
be critical for business growth, and in fact 80% of marketers believe that business leaders also
support the move to virtual platforms and technology adoption is no longer a barrier.
As technology continues to power virtual event platforms, rich data analytics adds a layer to the
entire experience of the virtual event and is proving to be a valuable extended marketing arm for
businesses. Data is being used to provide key actionable insights, delivering improved return on
investment (RoI) and generating leads that in turn can lead to higher sales. Event industry reports
suggest that virtual events are seen as an increasingly viable alternative for C-level executives,
as they save time on travel and are more convenient.
The other advantage of virtual event platforms is accessibility. With device-compatible platforms,
events can be accessed and experienced from virtually anywhere. Training programs, product
launches, conferences and seminars can all be attended while traveling, from an airport, from a
café or from the convenience of one's home. This opens up a whole host of possibilities for virtual
event marketers. Alliances, product placements and sponsorships can create innovative,
personalized experiences and engagement that create business opportunities for customers. For
instance, an online delivery technology service provider delivering wine and cheese to attendees'
homes prior to an event can create a significant positive impact, making the whole experience
exciting and memorable and opening doors to potential business collaborations.
Industry sectors such as healthcare and pharmaceuticals have adopted virtual event platforms for
medical conferences and seminars, which are used extensively for knowledge sharing, networking,
trainings and discussions. BFSI has also seen the advantages of virtual event platforms and is
adopting them with a vengeance. Augmented Reality, Virtual Reality and intelligent AI-driven bots
extract maximum mileage from virtual events, with analytics adding a rich layer to events that are
no longer boring, detached and distant. Whether communicating with large audiences, using
events as levers for career growth (such as online job fairs) or for knowledge enhancement through
global online trainings, industries recognize the benefits and are willing to invest in virtual
platforms. Virtual events are streamlined, focused and targeted, and can be tailored to deliver
experiences par excellence.

Virtual workspace
Many people have been working outside their workplaces since the beginning of the pandemic,
and for some of them this was the first time they had worked remotely for such an extended period.

Clearly, this is the greatest remote work experiment ever performed, and since there was no choice
but to make it succeed, many businesses have gone to great lengths to fully understand their true
business needs and the conditions under which their workers can remain efficient.

Now, the big question is how we can make up for the real-life interactions between people
that normally get lost when everyone is working remotely. The first issue we must address is
replacing voice-only communications with face-to-face video chats. After all, we are just people,
and there is a lot to learn simply by looking at the other person's face. That is how we learn to trust
and care for one another.

The problem here is the consistency of our internet speeds, which can often be a stumbling block,
along with head pose, background and lighting. Many of us have heard about Zoom's virtual
background function, which uses deep learning to seamlessly replace your current background
with a virtual one.

The real breakthrough comes from Nvidia, which used GANs to solve the last three problems.
Rather than sending the entire video stream, the sender transmits only data about the most
important facial features (eyes, nose, mouth). Thanks to GANs and this small amount of
information, the receiver can reconstruct the same live video of the face at much higher quality
while using far less bandwidth. Furthermore, the newly created video can easily be adjusted to
achieve the ideal pose for natural eye contact between the participants.
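The following is a highly simplified sketch of that keypoint-based idea, not Nvidia's actual implementation: the two stand-in functions take the place of the trained detector and GAN networks, and the coordinates are made up. It only illustrates why sending a few facial keypoints per frame needs far less bandwidth than sending whole frames.

```python
# An illustrative sketch only: stand-ins for the trained neural networks.
from dataclasses import dataclass

@dataclass
class Keypoints:
    eyes: tuple
    nose: tuple
    mouth: tuple

def extract_keypoints(frame) -> Keypoints:
    # Sender side: a landmark-detection network would run here.
    # Hard-coded coordinates stand in for real detections.
    return Keypoints(eyes=(120, 80), nose=(150, 120), mouth=(150, 160))

def reconstruct_frame(reference_image: str, kp: Keypoints) -> str:
    # Receiver side: a GAN would warp the reference image to match the
    # received keypoints, producing a full-quality frame.
    return f"frame rendered from '{reference_image}' with keypoints {kp}"

# Per frame, only a handful of coordinates cross the network instead of a
# full image, which is where the large bandwidth saving comes from.
kp = extract_keypoints(frame="live camera frame")
print(reconstruct_frame("initial reference photo", kp))
```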

From virtual meetings to virtual workspaces


People have long been enthralled by the prospect of building a virtual environment: a
computer-simulated world in which everybody has their own avatar and can explore the
virtual world, engage in its events, and interact with others simultaneously and independently.

Although most virtual world environments, such as Second Life, are designed for entertainment,
social, and educational purposes, there is a good opportunity to adapt them to our need for a virtual
workspace where everyone can meet and communicate with one another.

A virtual reality workplace may also encourage us to get up from our desks and walk around while
completely absorbed in the virtual world. Facebook has already created Horizon as a virtual
reality gaming environment, but who says we can't use the same concepts for work?
Companies have also begun to use Rec Room, a popular virtual world game, for meetings, virtual
outings, and team events. In a virtual environment you would be able to maintain the same
workplace habits, in addition to generating more interest in meetings, trainings, and close
collaboration. So, if you want to go to the pantry and have small talk over lunch, or just have a
random conversation over coffee, you can do so. This way, we'll be able to transfer the office to
the cloud! Rather than choosing between the office and the house, we would essentially add all
of the office's functionality to our homes.

Cloud Service Types

SaaS: Software as a Service


Software as a Service, also known as cloud application services, represents the most commonly
utilized option for businesses in the cloud market. SaaS utilizes the internet to deliver applications,
which are managed by a third-party vendor, to its users. A majority of SaaS applications run
directly through your web browser, which means they do not require any downloads or
installations on the client side.

SaaS Delivery
Due to its web delivery model, SaaS eliminates the need to have IT staff download and install
applications on each individual computer. With SaaS, vendors manage all potential technical
issues, such as data, middleware, servers, and storage, resulting in streamlined maintenance and
support for the business.
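As an illustration of this delivery model from the client's side, the sketch below consumes a SaaS product purely over HTTPS, with nothing installed locally. The endpoint, path, and token are hypothetical placeholders, not a real vendor's API.

```python
# Hypothetical SaaS endpoint and token: placeholders, not a real vendor API.
import requests

API_BASE = "https://api.example-saas.com/v1"  # hosted entirely by the vendor
TOKEN = "replace-with-your-api-token"

# No local install or maintenance: an authenticated HTTPS call (or a browser
# session) is all the client side amounts to.
response = requests.get(
    f"{API_BASE}/reports/monthly",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```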

SaaS Advantages
SaaS provides numerous advantages to employees and companies by greatly reducing the time and
money spent on tedious tasks such as installing, managing, and upgrading software. This frees up
plenty of time for technical staff to spend on more pressing matters and issues within the
organization.

SaaS Characteristics
There are a few ways to help you determine when SaaS is being utilized:
• Managed from a central location
• Hosted on a remote server
• Accessible over the internet
• Users not responsible for hardware or software updates

When to Use SaaS


SaaS may be the most beneficial option in several situations, including:
• Start-ups or small companies that need to launch ecommerce quickly and don’t have time
to deal with server issues or software installation
• Short-term projects that require quick, easy, and affordable collaboration
• Applications that aren’t needed too often, such as tax software
• Applications that need both web and mobile access

SaaS Limitations & Concerns


Interoperability. Integration with existing apps and services can be a major concern if the SaaS
app is not designed to follow open standards for integration. In this case, organizations may need
to design their own integration systems or reduce dependencies with SaaS services, which may not
always be possible.

Vendor lock-in. Vendors may make it easy to join a service and difficult to get out of it. For
instance, the data may not be portable, technically or cost-effectively, across SaaS apps from other
vendors without significant cost or in-house engineering rework. Not every vendor follows
standard APIs, protocols, and tools, yet those features could be necessary for certain business
tasks.

Lack of integration support. Many organizations require deep integrations with on-premise apps,
data, and services. The SaaS vendor may offer limited support in this regard, forcing organizations
to invest internal resources in designing and managing integrations. The complexity of integrations
can further limit how the SaaS app or other dependent services can be used.

Data security. Large volumes of data may have to be exchanged with the backend data centers of
SaaS apps in order to perform the necessary software functionality. Transferring sensitive business
information to a public-cloud based SaaS service may result in compromised security and
compliance, in addition to significant cost for migrating large data workloads.

Customization. SaaS apps offer minimal customization capabilities. Since a one-size-fits-all
solution does not exist, users may be limited to the specific functionality, performance, and
integrations offered by the vendor. In contrast, on-premise solutions that come with several
software development kits (SDKs) offer a high degree of customization options.

Lack of control. SaaS solutions involve handing control over to the third-party service provider.
This control is not limited to the software, in terms of version, updates, or appearance, but
extends to the data and governance. Customers may therefore need to redefine their data security
and governance models to fit the features and functionality of the SaaS service.

Feature limitations. Since SaaS apps often come in a standardized form, the choice of features
may be a compromising tradeoff against security, cost, performance, or other organizational
policies. Furthermore, vendor lock-in, cost, or security concerns may mean it’s not viable to switch
vendors or services to serve new feature requirements in the future.

Performance and downtime. Because the vendor controls and manages the SaaS service,
customers now depend on the vendor to maintain the service’s security and performance. Planned
and unplanned maintenance, cyber-attacks, or network issues may impact the performance of the
SaaS app even with adequate service level agreement (SLA) protections in place.

Examples of SaaS
Popular examples of SaaS include:
• Google Workspace (formerly GSuite)
• Dropbox
• Salesforce
• Cisco WebEx
• SAP Concur
• GoToMeeting

PaaS: Platform as a Service


Cloud platform services, also known as Platform as a Service (PaaS), provide cloud components
for software and are used mainly for applications. PaaS delivers a framework that developers can
build upon to create customized applications. All servers, storage, and networking can be managed
by the enterprise or a third-party provider, while developers maintain management of the
applications.

PaaS Delivery
The delivery model of PaaS is similar to SaaS, except instead of delivering the software over the
internet, PaaS provides a platform for software creation. This platform is delivered via the web,
giving developers the freedom to concentrate on building the software without having to worry
about operating systems, software updates, storage, or infrastructure. PaaS allows businesses to
design and create applications that are built into the PaaS with special software components. These
applications, sometimes called middleware, are scalable and highly available as they take on
certain cloud characteristics.
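As a rough illustration, the sketch below is the entirety of what a developer might write for a simple web app targeted at a PaaS; the platform supplies the operating system, runtime, patching, and scaling. The example uses Flask, and deployment details (such as a Procfile or app.yaml) vary by platform.

```python
# The developer writes only the application; the platform provides the OS,
# runtime, patching, and scaling.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from a PaaS-hosted app!"

if __name__ == "__main__":
    # Run locally with `python app.py`; on the platform, a production server
    # (e.g. gunicorn) is usually declared in the platform's config file.
    app.run(host="0.0.0.0", port=8080)
```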

PaaS Advantages
No matter the size of your company, using PaaS offers numerous advantages, including:
• Simple, cost-effective development and deployment of apps
• Scalable
• Highly available
• Developers can customize apps without the headache of maintaining the software
• Significant reduction in the amount of coding needed
• Automation of business policy
• Easy migration to the hybrid model

PaaS Characteristics
PaaS has many characteristics that define it as a cloud service, including:
• Builds on virtualization technology, so resources can easily be scaled up or down as your
business changes
• Provides a variety of services to assist with the development, testing, and deployment of
apps
• Accessible to numerous users via the same development application
• Integrates web services and databases

When to Use PaaS


Utilizing PaaS is beneficial, sometimes even necessary, in several situations. For example, PaaS
can streamline workflows when multiple developers are working on the same development project.
If other vendors must be included, PaaS can provide great speed and flexibility to the entire
process. PaaS is particularly beneficial if you need to create customized applications. This cloud
service also can greatly reduce costs and it can simplify some challenges that come up if you are
rapidly developing or deploying an app.
PaaS Limitations & Concerns

Data security. Organizations can run their own apps and services using PaaS solutions, but the
data residing in third-party, vendor-controlled cloud servers poses security risks and concerns.
Your security options may be limited as customers may not be able to deploy services with specific
hosting policies.

Integrations. The complexity of connecting the data stored within an onsite data center or off-
premise cloud is increased, which may affect which apps and services can be adopted with the
PaaS offering. Particularly when not every component of a legacy IT system is built for the cloud,
integration with existing services and infrastructure may be a challenge.

Vendor lock-in. Business and technical requirements that drive decisions for a specific PaaS
solution may not apply in the future. If the vendor has not provisioned convenient migration
policies, switching to alternative PaaS options may not be possible without affecting the business.

Customization of legacy systems. PaaS may not be a plug-and-play solution for existing legacy
apps and services. Instead, several customizations and configuration changes may be necessary for
legacy systems to work with the PaaS service. The resulting customization can result in a complex
IT system that may limit the value of the PaaS investment altogether.

Runtime issues. In addition to limitations associated with specific apps and services, PaaS
solutions may not be optimized for the language and frameworks of your choice. Specific
framework versions may not be available or perform optimally with the PaaS service. Customers
may not be able to develop custom dependencies with the platform.

Operational limitation. Customized cloud operations with management automation workflows
may not apply to PaaS solutions, as the platform tends to limit operational capabilities for end
users. Although this is intended to reduce the operational burden on end users, the loss of
operational control may affect how PaaS solutions are managed, provisioned, and operated.

Examples of PaaS
Popular examples of PaaS include:
• AWS Elastic Beanstalk
• Windows Azure
• Heroku
• Force.com
• Google App Engine
• OpenShift

IaaS: Infrastructure as a Service


Cloud infrastructure services, known as Infrastructure as a Service (IaaS), are made of highly
scalable and automated compute resources. IaaS is fully self-service for accessing and monitoring
computers, networking, storage, and other services. IaaS allows businesses to purchase resources
on-demand and as-needed instead of having to buy hardware outright.
IaaS Delivery
IaaS delivers cloud computing infrastructure, including servers, network, operating systems, and
storage, through virtualization technology. These cloud servers are typically provided to the
organization through a dashboard or an API, giving IaaS clients complete control over the entire
infrastructure. IaaS provides the same technologies and capabilities as a traditional data center
without having to physically maintain or manage all of it. IaaS clients can still access their servers
and storage directly, but it is all outsourced through a “virtual data center” in the cloud. As opposed
to SaaS or PaaS, IaaS clients are responsible for managing aspects such as applications, runtime,
OSes, middleware, and data. However, providers of the IaaS manage the servers, hard drives,
networking, virtualization, and storage. Some providers even offer more services beyond the
virtualization layer, such as databases or message queuing.
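As an illustration of this self-service model, the sketch below provisions a virtual server through a provider's API rather than buying hardware. It uses AWS's boto3 SDK and assumes credentials are already configured; the AMI ID is a placeholder, not a real image.

```python
# Provisioning a server through the provider's API instead of buying hardware.
# Assumes AWS credentials are configured; the AMI ID is a placeholder.
import boto3

ec2 = boto3.resource("ec2", region_name="ap-south-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t2.micro",          # pay only for what you consume
    MinCount=1,
    MaxCount=1,
)
print("Launched instance:", instances[0].id)
```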

IaaS Advantages
IaaS offers many advantages, including:
• The most flexible cloud computing model
• Easy to automate deployment of storage, networking, servers, and processing power
• Hardware purchases can be based on consumption
• Clients retain complete control of their infrastructure
• Resources can be purchased as-needed
• Highly scalable

IaaS Characteristics
Characteristics that define IaaS include:
• Resources are available as a service
• Cost varies depending on consumption
• Services are highly scalable
• Multiple users on a single piece of hardware
• Organization retains complete control of the infrastructure
• Dynamic and flexible

When to Use IaaS


Just as with SaaS and PaaS, there are specific situations when IaaS is most advantageous.
• Startups and small companies may prefer IaaS to avoid spending time and money on
purchasing and creating hardware and software.
• Larger companies may prefer to retain complete control over their applications and
infrastructure, but they want to purchase only what they actually consume or need.
• Companies experiencing rapid growth like the scalability of IaaS, and they can change out
specific hardware and software easily as their needs evolve.

Anytime you are unsure of a new application’s demands, IaaS offers plenty of flexibility and
scalability.

IaaS Limitations & Concerns


Many limitations associated with SaaS and PaaS models – such as data security, cost overruns,
vendor lock-in and customization issues – also apply to the IaaS model. Particular limitations to
IaaS include:

Security. While the customer is in control of the apps, data, middleware, and the OS platform,
security threats can still be sourced from the host or other virtual machines (VMs). Insider threat
or system vulnerabilities may expose data communication between the host infrastructure and
VMs to unauthorized entities.

Legacy systems operating in the cloud. While customers can run legacy apps in the cloud, the
infrastructure may not be designed to deliver specific controls to secure the legacy apps. Minor
enhancement to legacy apps may be required before migrating them to the cloud, possibly leading
to new security issues unless adequately tested for security and performance in the IaaS systems.

Internal resources and training. Additional resources and training may be required for the
workforce to learn how to effectively manage the infrastructure. Customers are responsible for
data security, backup, and business continuity; however, with limited control over the
infrastructure, monitoring and managing the resources may be difficult without adequate training
and resources available in-house.

Multi-tenant security. Since the hardware resources are dynamically allocated across users as
made available, the vendor is required to ensure that other customers cannot access data deposited
to storage assets by previous customers. Similarly, customers must rely on the vendor to ensure
that VMs are adequately isolated within the multitenant cloud architecture.

Examples of IaaS
Popular examples of IaaS include:
• DigitalOcean
• Linode
• Rackspace
• Amazon Web Services (AWS)
• Cisco Metacloud
• Microsoft Azure
• Google Compute Engine (GCE)

SaaS vs PaaS vs IaaS


Each cloud model offers specific features and functionalities, and it is crucial for your organization
to understand the differences. Whether you need cloud-based software for storage options, a
smooth platform that allows you to create customized applications, or complete control over your
entire infrastructure without having to physically maintain it, there is a cloud service for you. No
matter which option you choose, migrating to the cloud is the future of business and technology.

The differences between the cloud service types are often explained with a pizza analogy: making
pizza at home corresponds to on-premises IT (you manage everything); "take and bake"
corresponds to IaaS (the vendor supplies the base ingredients, you do the rest); pizza delivered to
your door corresponds to PaaS (you just provide the table and drinks); and dining out corresponds
to SaaS (everything is managed for you).
Newer types of services are emerging, some of them are listed below:

Storage as a Service (STaaS)


Storage as a Service is a business model in which a large company rents space in their storage
infrastructure to a smaller company or individual. The economy of scale in the service provider’s
infrastructure theoretically allows them to provide storage much more cost-effectively than most
individuals or corporations can provide their own storage when the total cost of ownership is
considered. Storage as a Service is generally seen as a good alternative for a small or mid-sized
business that lacks the capital budget and/or technical personnel to implement and maintain their
own storage infrastructure.
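As a small illustration of consuming storage as a service, the sketch below pushes a file into a provider's object store via its API, using AWS S3 through boto3. The bucket name and file are placeholders, and credentials are assumed to be configured.

```python
# Renting space in a provider's storage infrastructure via its API.
# Assumes AWS credentials are configured; names below are placeholders.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="quarterly-report.pdf",    # local file to store
    Bucket="example-company-archive",   # placeholder bucket name
    Key="reports/2021/quarterly-report.pdf",
)
print("File stored off-premise; billed per GB actually used.")
```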

Communications as a Service (CAAS)


Communications as a Service (CAAS) is an outsourced enterprise communications solution that
can be leased from a single vendor. Such communications can include voice over IP (VoIP or
Internet telephony), instant messaging (IM), collaboration and video conference applications using
fixed and mobile devices. The CAAS vendor is responsible for all hardware and software
management and offers guaranteed Quality of Service (QoS). CAAS allows businesses to
selectively deploy communications devices and modes on a pay-as-you-go, as-needed basis.
Network as a Service (NAAS)
Network as a Service (NAAS) is a framework that integrates current cloud computing offerings
with direct, yet secure, client access to the network infrastructure. NAAS is a newer cloud
computing model in which clients have access to additional computing resources collocated with
switches and routers. NAAS offerings can include flexible and extended Virtual Private Networks
(VPN), bandwidth on demand, custom routing, multicast protocols, security firewalls, intrusion
detection and prevention, Wide Area Network (WAN) services, content monitoring and filtering,
and antivirus.

Monitoring as a Service (MAAS)


Monitoring-as-a-service (MAAS) is a framework that facilitates the deployment of monitoring
functionalities for various other services and applications within the cloud. The most common
application for MAAS is online state monitoring, which continuously tracks certain states of
applications, networks, systems, instances or any element that may be deployable within the cloud.
MAAS makes it easier for users to deploy state monitoring at different levels of Cloud services.
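As a toy illustration of online state monitoring, the sketch below polls a service endpoint and records its state. The URL is a placeholder, and a real MAAS offering would add alerting, dashboards, and history at scale.

```python
# Polling a deployed service and recording its state over time.
import time
import requests

TARGET = "https://app.example.com/health"  # placeholder endpoint

def check_once() -> str:
    try:
        r = requests.get(TARGET, timeout=5)
        return "UP" if r.status_code == 200 else f"DEGRADED ({r.status_code})"
    except requests.RequestException:
        return "DOWN"

for _ in range(3):  # a real monitor would loop indefinitely and alert
    print(time.strftime("%H:%M:%S"), check_once())
    time.sleep(10)
```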

Product Management and Project Management

Software Development Life Cycle

SDLC stands for Software Development Life Cycle, a process consisting of a series of planned
activities to develop or alter software products. Below is an overview of the SDLC, the SDLC
models available, and their application in the industry.

Why is the SDLC important?


The SDLC aims to produce high-quality software that meets or exceeds customer expectations
and reaches completion within time and cost estimates.
It consists of a detailed plan describing how to develop, maintain, replace, and alter or enhance
specific software, and provides a framework defining the tasks performed at each step in the
software development process.

ISO/IEC 12207 is an international standard for software life-cycle processes. It aims to be the
standard that defines all the tasks required for developing and maintaining software.
Waterfall Model
The Waterfall Model was the first process model to be introduced and is also
referred to as a linear-sequential life cycle model. It is very simple to
understand and use: each phase must be completed before the next phase can
begin, and there is no overlap between phases. The earliest SDLC approach
used for software development, the Waterfall Model illustrates the software
development process in a linear sequential flow, meaning that any phase
begins only when the previous phase is complete.

Waterfall Model – Design


The sequential phases in the Waterfall model are −
• Requirement Gathering and Analysis − All possible requirements of the system to be
developed are captured in this phase and documented in a requirement specification document.
• System Design − The requirement specifications from the first phase are studied and the
system design is prepared. This design helps in specifying hardware and system requirements
and in defining the overall system architecture.
• Implementation − With inputs from the system design, the system is first developed in small
programs called units, which are integrated in the next phase. Each unit is developed and tested
for its functionality, which is referred to as Unit Testing.
• Integration and Testing − All the units developed in the implementation phase are integrated
into a system after testing of each unit. Post integration, the entire system is tested for any
faults and failures.
• Deployment of System − Once functional and non-functional testing is done, the product
is deployed in the customer environment or released into the market.
• Maintenance − Some issues come up in the client environment, and patches are released to
fix them; better versions are also released to enhance the product. Maintenance delivers these
changes in the customer environment.

Waterfall Model - Advantages


The advantages of waterfall development are that it allows for departmentalization and control.
A schedule can be set with deadlines for each stage of development and a product can proceed
through the development process model phases one by one.
Development moves from concept, through design, implementation, testing, installation,
troubleshooting, and ends up at operation and maintenance. Each phase of development proceeds
in strict order.
Some of the major advantages of the Waterfall Model are as follows −
• Simple and easy to understand and use
• Easy to manage due to the rigidity of the model. Each phase has specific deliverables and a
review process. Phases are processed and completed one at a time.
• Works well for smaller projects where requirements are very well understood & documented.
Waterfall Model - Disadvantages
The disadvantage of waterfall development is that it does not allow for much reflection or
revision. Once an application is in the testing stage, it is very difficult to go back and change
something that was not well documented or thought through in the concept stage.
The major disadvantages of the Waterfall Model are as follows −
● No working software is produced until late in the life cycle.
● Not suitable for projects where requirements are at a moderate to high risk of changing;
risk and uncertainty are therefore high with this process model.
● It is difficult to measure progress within stages, and the model cannot accommodate changing
requirements.
● Integration is done as a "big bang" at the very end, which does not allow technological or
business bottlenecks or challenges to be identified early.

Agile Methodology
Agile software development refers to a group of software development methodologies based on
iterative development, where requirements and solutions evolve through collaboration between
self-organizing cross-functional teams.

What is Scrum?
Scrum is a framework that helps teams work together. Often thought of as an agile project
management framework, Scrum describes a set of meetings, tools, and roles that work in concert
to help teams structure and manage their work.
Scrum Phases and Processes
Scrum roles
1) Scrum Master

• A Scrum Master is a facilitator and servant leader who encourages and demands self-
organization from the development team.
• A Scrum Master enables close cooperation across all roles and functions, and addresses
resourcing issues and deviations from scrum practices.
• A Scrum Master protects the team from external and internal distractions.
• A Scrum Master removes impediments so the team can focus on the work at hand and follow
scrum practices.
• A Scrum Master is not typically a manager or lead, but an influential leader and coach
who does not exercise direct command and control.

2) Product Owner
• A Product Owner owns the Product Backlog and writes user stories and acceptance criteria.
• A Product Owner is responsible for prioritizing the Product Backlog and decides the release
date and content.
• A Product Owner accepts or rejects product backlog items.
• A Product Owner has the power to cancel the Sprint if the Sprint goal has become redundant.
• A Product Owner is responsible for the Return on Investment (ROI) of the
product.
3) Development Team
• The Development Team builds the product that the Product Owner indicates: the application
or website, for example. The team in Scrum is "cross-functional".
• The Development Team includes all the expertise necessary to deliver a potentially shippable
product each Sprint.
• The Development Team is self-organizing, with a very high degree of autonomy and
accountability.
• The Development Team decides how many items to build in a Sprint, and how best to
accomplish that goal.
• The Development Team is a small, cross-functional, self-organizing team which owns
collective responsibility for developing, testing and releasing the product increment.
• The Development Team does not appoint a team lead, since decisions are taken collectively
by the team.

Benefits of Agile working


• Agile approaches empower those involved, build accountability, encourage diversity of ideas,
allow the early release of benefits, and promote continuous improvement.
• Agile helps build client and user engagement because changes are incremental and
evolutionary rather than revolutionary; it can therefore be effective in supporting the cultural
change that is critical to the success of most transformation projects.
• Agile allows decision ‘gremlins’ to be tested and rejected early: the tight feedback loops
provide benefits in agile that are not as evident in waterfall.
PDCA

PDCA (plan-do-check-act or plan-do-check-adjust) is an iterative four-step management method
used in business for the control and continuous improvement of processes and products. It is also
known as the Deming circle/cycle/wheel, the Shewhart cycle, the control circle/cycle, or
plan-do-study-act (PDSA). A fundamental principle of the scientific method and PDCA is
iteration: once a hypothesis is confirmed (or negated), executing the cycle again extends the
knowledge further. Repeating the PDCA cycle can bring its users closer to the goal, usually a
perfect operation and output.
Another fundamental function of PDCA is the "hygienic" separation of each phase, for if the
phases are not properly separated, measurements of effects due to various simultaneous actions
(causes) risk becoming confounded.
PDCA (and other forms of scientific problem solving) is also known as a system for developing
critical thinking. At Toyota this is also known as "Building people before building cars". Toyota
and other lean manufacturing companies propose that an engaged, problem-solving workforce
using PDCA in a culture of critical thinking is better able to innovate and stay ahead of the
competition through rigorous problem solving and the subsequent innovations.

KAIZEN
Kaizen is a Japanese term with two parts: KAI, meaning change, and ZEN, meaning good, so
KAIZEN means "change for good" or "continuous improvement."
It is a Japanese business philosophy of continuously improving operations and involving all
employees, from the CEO to the assembly line workers. Kaizen sees improvement in productivity
as a gradual and methodical process. Kaizen, the Sino-Japanese word for "improvement," also
applies to processes, such as purchasing and logistics, that cross organizational boundaries into
the supply chain.
The 9 knowledge areas of Project Management (per the PMBOK) are: integration, scope, time,
cost, quality, human resource, communications, risk, and procurement management.
The 5 phases of Project Management are: initiating, planning, executing, monitoring &
controlling, and closing.

Difference between Agile and Waterfall Model

• Agile separates the project development lifecycle into sprints; in Waterfall, the software
development process is divided into distinct phases.
• Agile follows an incremental approach; Waterfall is a sequential design process.
• Agile is known for its flexibility; Waterfall is a structured software development methodology
and can often be quite rigid.
• Agile can be considered a collection of many different projects; in Waterfall, software
development is completed as one single project.
• Agile allows changes to project development requirements even after initial planning is
complete; in Waterfall, there is no scope for changing requirements once development starts.
• Agile follows an iterative development approach, so planning, development, prototyping and
other phases may appear more than once; in Waterfall, phases like designing, development
and testing are completed only once.
• In Agile, the test plan is reviewed after each sprint; in Waterfall, the test plan is rarely
discussed during the test phase.
• Agile is a process in which requirements are expected to change and evolve; Waterfall is ideal
for projects with definite requirements where changes are not at all expected.
• In Agile, testing is performed concurrently with software development; in Waterfall, the
"Testing" phase comes after the "Build" phase.
• Agile introduces a product mindset in which the software satisfies the needs of its end
customers and changes as per the customer's demands; Waterfall shows a project mindset
focused entirely on accomplishing the project.
• Agile works exceptionally well with Time & Materials or non-fixed funding, but may increase
stress in fixed-price scenarios; Waterfall reduces risk in firm fixed-price contracts by getting
risk agreement at the beginning of the process.
• Agile prefers small but dedicated teams with a high degree of coordination and
synchronization; in Waterfall, team coordination and synchronization are very limited.
• In Agile, the product owner and team prepare requirements just about every day during the
project; in Waterfall, business analysts prepare requirements before the project begins.
• In Agile, the test team can take part in requirements changes without problems; in Waterfall,
it is difficult for testers to initiate any change in requirements.
Product Management (Role)
A PM takes holistic responsibility for the product, from the little details to the big picture: setting
the vision and strategy, defining success, and making decisions. It is a highly collaborative role.
The product manager usually serves as the main liaison between engineering and other roles such
as design, quality assurance, user research, data analytics, marketing, sales, customer support,
business development, legal, content writing, other engineering teams, and the executive team.
It is usually the product manager's job to identify when one of those teams should be brought in,
and to fill in for them if they don't exist.

Function of a PM
The day-to-day work of a product manager varies over the course of the product life cycle. In the
beginning, they figure out what to build; in the middle, they help the team make progress; at the
end, they prepare for the launch of the product.

While the product life cycle varies by company (and sometimes even by team), it usually follows
a general pattern of Research & Plan, Design, Implement & Test, and Release.

Research and Planning


All products and features start with research and planning. This is the time when the PM is starting
to think about what to build next. The next idea may come from a customer request, competitive
analysis, new technology, user research, the sales or marketing teams, brainstorming, or the big
vision for the product. Depending on the scope of the role, a big part of the product manager’s job
in this phase is creating or proposing a roadmap.

Design
Product design does not just mean user interface (UI) design or drawing out what the product will
look like. Product design is defining the features and functionality of the product. The PM’s role
in product design varies substantially between companies and teams.

Implement & Test


During the implementation stage, the product manager keeps track of how the project is
going and makes adjustments. One of the most important parts of the job at this stage is
helping the engineers work efficiently. The product manager checks in regularly with the
team to learn how things are going. Often an engineer will be blocked because she’s waiting for
work from another team. In this case, the PM will need to find other tasks for the engineer and, in
the meantime, work with the other team to get the work finished more promptly.

Release
When the development process is finished, the product manager needs to make sure the launch
goes smoothly. The launch process varies from team to team but usually involves things like
running through the launch checklist, etc.
Difference between Project Manager and Product Manager
While some product managers have project management as a large part of their job, most do not.
Project managers are mostly concerned with timelines and coordination. While they might be
responsible for gathering the project requirements, they don’t have much say in identifying and
choosing the requirements.

Product managers are responsible for identifying problems and opportunities, picking which ones
to go after, and then making sure the team comes up with great solutions, either by thinking of the
solution themselves or by working with the designers and engineers. This is why product sense—
having the intuition to recognize the difference between a good product and a bad product—is so
important for product managers.

Discussion Topics

Data is the new oil


Digital footprints are hard to erase due to fast-growing cloud services, and data is being
increasingly used by companies to improve their services; for example, a clothing brand tries to
gather data on customer preferences. Data collected from individuals is owned and manipulated
by the companies that collect it (such as Google, Apple, Facebook, and Amazon – the much-
vaunted GAFA oligopoly – and others such as ride-sharing, food delivery and grocery apps). A
lot of such privately held data could be used for governance and policy purposes. For instance, data
from ride-sharing companies such as Uber and mapping tools such as Google Maps can provide
key insights into how people in cities travel, and help develop solutions for making travel easier.
But since the data is owned by private companies, policy makers and researchers have no access
to it.
Links:
• https://www.hindustantimes.com/editorials/why-data-is-the-new-oil/story-Hdm3yVcArJrRx0uokyYDcL.html
• https://www.wired.com/insights/2014/07/data-new-oil-digital-economy/

Big Data and Information Privacy – A future challenge


Big Data is a technology that has come into prominence in recent years. Its applications, ease of
access and accuracy have made it very popular in diverse fields. But, very recently, its cons have
also come to the fore. The Cambridge Analytica and Facebook scandal exposed the weak flank of
the system, i.e., the ease with which information can be misused. Information privacy in Big Data
must be taken very seriously, as it can influence even voter behavior and, by extension, alter our
governments. Politics is emerging as one of the key markets for data analytics firms: Cambridge
Analytica was increasingly engaged in helping politicians understand voter behavior through data
(later found to be stolen), and Donald Trump was allegedly one of its clients. In light of this privacy
loophole, for Big Data technology to stay relevant and benevolent, certain checks and balances
need to exist. Stringent privacy norms and punitive laws need to be put in place so that companies
are more careful with user data. Data must only be used with consumer/user consent. Consumers
and users must also be made mandatorily aware of the exact details of the information that will be
extracted from their profiles, the extent to which it will be used, and so on.
Links:
• https://dataconomy.com/2017/07/10-challenges-big-data-security-privacy/
• https://www.researchgate.net/publication/324482789_Security_and_Privacy_Challenges_in_Big_Data

Impact of Technology on Jobs: Will Automation & Artificial Intelligence reduce jobs
Technology intervention is inevitable in every sphere. It raises the bar of productivity,
efficiency and safety to a level not achievable by humans alone. Adoption of technology,
global reach and faster communication have overhauled manufacturing, servicing, product
delivery and the employment associated with these sectors. But this is not the first time the
world has experienced significant shifts in employment due to new technology. History shows
that technology has been a creator of jobs and has opened new avenues. Whether the course
will be the same this time is a debatable issue.
Link:
• https://www.hindustantimes.com/education/debate-robots-artificial-intelligence-will-make-humans-jobless-in-50-years/story-beG3KbHf9VBnw4AsvdwQbJ.html
• https://www.skynettoday.com/editorials/ai-automation-job-loss

How Data Protection Act will change the way data is used as of now?
With technology influencing every facet of life around us, and an ever-growing quantum of
personal information being shared online and offline, it has become crucial to strike a balance
between the cultural revolution brought about by this digital transformation and the associated
implications for personal data protection. With most organizations on a digitization spree, the
Personal Data Protection Bill (PDPB) is a valuable step towards a sustainable solution that would
help India strengthen its personal data security position, as well as empower and equip individuals
to manage their personal data. The PDPB will serve as a model for ensuring that Indian citizens
have autonomy in the digital economy and will permit them to regain control over their personal
data. The Bill seeks to establish an overarching data privacy framework by standardizing the
collection, usage, storage and transmission of personal data, ensuring its adequate protection. It
also establishes an independent authority, the Data Protection Authority of India (DPAI),
empowered to oversee the enforcement of the law.
Links:
• https://m.economictimes.com/tech/internet/changes-likely-in-proposed-data-privacy-rules-only-critical-data-may-need-to-be-housed-in-india/articleshow/70355298.cms
• https://www.yelloveedub.com/blog/gdpr-regulation-change
Digital payments are secure and India is ready to go 100% cashless
There has been a massive expansion of the formal banking footprint over the last few years,
especially due to the efforts of the Jan Dhan Yojana, a central government initiative. The number
of bank account holders has doubled during this period as per official figures. Indians are known
worldwide for their IT skills, much of which is required in building the infrastructure needed for
a cashless economy; the brain power to create this infrastructure exists. India is also home to the
phenomenal success story of the digital wallet and payments app Paytm, by all means one of the
top unicorns in the world today, with a substantial valuation. In addition, we have other such
gateways as MobiKwik and PhonePe. Corruption can be controlled to a large degree with the
contraction of the cash economy, as all transactions will now get recorded digitally. A cashless
economy will also be good for the social aspects of the economy: women will receive their
payments in their own bank accounts, reducing their dependency on the men of the family, who
usually control household expenses. On the other hand, a cash economy is the backbone of several
unorganized industries employing millions of people; shutting these industries would be akin to
taking away these people’s employment opportunities. It is true that Indian tech personnel are
responsible for several top tech giants worldwide, so we do have the requisite talent to develop
the infrastructure. But these same Indians thrived in an atmosphere where corruption was minimal;
thus, the bureaucratic and government machinery will need much cleaning before these people
can make a similar impact here.
Links:
• https://itslyf.com/are-digital-payments-secure-enough-to-go-cashless/
• https://www.careerride.com/view/are-digital-payments-secure-enough-for-the-indian-economy-to-go-cashless-30807.aspx

Is cryptocurrency the future exchange currency?


The decision by India's Supreme Court to lift the central bank’s ban on cryptocurrency trading
could soon translate into notable growth in trading volumes, according to cryptocurrency
exchanges in the country. India was doing very well in terms of trading volumes, contributing
about $50 million to $60 million per day before the RBI ban, according to Singhal. “In India, a
huge damage was done due to the lack of awareness and RBI’s decision,” said Kumar Gaurav,
founder and CEO at online crypto banking platform Cashaa. Volumes subsequently dipped after
the central bank issued banking restrictions and commercial banks responded by advising account
holders not to engage in cryptocurrency transactions. For instance, Kotak Mahindra Bank, one of
the largest lenders in India, has diligently sent multiple notification emails to account holders in
the last two years warning against the use of credit cards for cryptocurrency exchanges.
Links:
• https://www.compareremit.com/money-transfer-guide/the-future-of-cryptocurrencies-in-india/
• https://www.coindesk.com/after-court-victory-indian-exchanges-gear-up-for-crypto-trading-surge
How is technology impacting the banking sector?

Positive impact of Technology on Banking Sector:


The biggest revolution in banking is digitization. Banking processes are faster and more reliable
than before, and the maintenance and retrieval of documents and records has become much
quicker and easier. Computerization also improves the core banking system: with CBS (core
banking system), all branches have access to common centralized data and are interconnected.
With the innovation of the MICR cheque processing system, cheque processing has become faster
and more efficient. USSD (Unstructured Supplementary Service Data) was launched by the
Government so that people with no internet connectivity can also access their bank accounts
without visiting a branch. With increasing internet reach, internet banking was developed and is
now offered by almost every bank; through it, every transaction and inquiry can be performed
online without visiting the bank, bringing more transparency to transactions. The scope for fraud
in banks is being minimized through the use of passwords and double authentication in online
banking.

Negative impact of Technology on Banking Sector:

The biggest negative impact of technology is the loss of jobs, as automation has replaced a number
of jobs in the banking sector. With technology also comes the threat of cyber attack: through a
loophole in the system, millions of records can be lost in the blink of an eye. And while these
technologies save time, they can also make people careless, causing the loss of personal details,
as happened in 2016 when the debit card details of several big banks’ customers were
compromised.

Links:

• https://www.indiabix.com/group-discussion/how-is-technology-impacting-the-banking-sector/
• https://bankinnovation.net/allposts/biz-lines/lending/the-impact-of-technology-on-banking-revolution-or-evolution/
• https://www.information-age.com/technology-finance-banking-sector-123471800/
APPENDIX

Video links (for reference) –


• Cloud Computing - https://youtu.be/36zducUX16w
• Machine Learning - https://youtu.be/tfnhpezpkk0
• Augmented Reality - https://youtu.be/uisezip_Pwc
• Cryptocurrency - https://www.youtube.com/watch?v=8ngvgnx4kow
• SEO - https://www.youtube.com/watch?v=d7uxlkwdyc0
• On page SEO - https://www.youtube.com/watch?v=ecesgy9bdnc
• Off Page SEO - https://www.youtube.com/watch?v=yanx-bqh4dy
• Keyword optimization - https://www.youtube.com/watch?v=taoa_Zy2XUw
• SEM - https://www.youtube.com/watch?v=laltdylusp0
• CRM - https://www.youtube.com/watch?v=hneqq7knfwo
• Social media analytics - https://www.youtube.com/watch?v=1fg58kjkme4
• Analytics - https://www.youtube.com/watch?v=26glyvcyzii
• Data analytics tool - https://www.youtube.com/watch?v=obvzhfvesvi
• Excel vs Tableau - https://www.youtube.com/watch?v=beo-6Fi4FGY
• Power BI vs Tableau - https://www.youtube.com/watch?v=siqcqm87uf4
• Apriori Algorithm - https://www.youtube.com/watch?v=lzii6n4vgds
• What is big data and why is it important - https://www.youtube.com/watch?v=k7zu3nxeigy
• Predictive analytics in banking - https://www.youtube.com/watch?v=8ijnzes05dq
• Data analytics in education - https://www.youtube.com/watch?v=l3eo8gymwcc
• What does a data scientist do - https://www.youtube.com/watch?v=rxxrs4hrdvs
• Importance of big data analytics - https://www.youtube.com/watch?v=VAd-fg9EKtY
