IT and Analytics
Further readings:
• https://www.zdnet.com/article/what-is-cloud-computing-everything-you-need-to-know-about-the-cloud/
Deployment models
GE Digital’s Predix Platform: GE created Predix Platform to address the unique needs of industrial
companies on the path to digitization. Predix Platform is a distributed application platform that’s purpose-
built for the digital industrial era.
It captures and analyzes the unique volume, velocity, and variety of machine data in a secure, industrial-
strength cloud environment.
Further Readings:
• https://www.ge.com/digital/iiot-platform
• https://www.ge.com/digital/blog/industrial-iot-how-connected-things-are-changing-manufacturing
Wipro’s Internet of Things (IoT) solutions: Wipro’s Internet of Things (IoT) solutions address connectivity of ‘legacy’ and ‘new’ things. Its Multi-Protocol IoT Gateway Framework amalgamates heterogeneous protocols that complement IoT at the edge, allowing select data to be transmitted to backend systems. The solutions span fleet and asset management for real-time tracking of assets, integration with cloud-based data storage and analytics, and a flexible rules-based Smart Drone Framework for field operations.
Further Reading:
• https://www.wipro.com/en-IN/infrastructure/internet-of-things-iot/
IBM’s Cognitive IoT for Healthcare: IBM Watson Health is transforming healthcare by
helping organizations across the healthcare industry leverage data, technology and expertise to
solve clinical, operational and financial problems.
Further Readings:
• https://researcher.watson.ibm.com/researcher/view_group.php?id=7866
• https://www.ibm.com/in-en/cloud/internet-of-things
Trends in IoT
Artificial Intelligence
In computer science, the term artificial intelligence (AI) refers to any human-like intelligence
exhibited by a computer, robot, or other machine.
AI – Recent Applications
Necessity is the biggest driver of (re)invention. The Covid-19 pandemic has dramatically
accelerated corporate digital transformation. Companies have developed new digital capabilities
backed by Artificial Intelligence, in an effort to build resilience and retool for the post-pandemic
world.
Some of the advantages of edge computing include saving bandwidth and improving efficiency by
processing information closer to the users and the devices that require it, rather than sending that
data to be processed in central locations in a virtual cloud. When AI is locally embedded,
manufacturers can reduce latency issues and enhance the generation of insights while lowering
cloud service utilization and the costs. Connectivity cost also decreases, as processing part of the
data locally lowers bandwidth and cellular data usage. Since intelligence is run locally, plants
located in remote areas with poor communication infrastructure are less subject to connectivity
losses that can hinder mission-critical and time-sensitive decision making.
Edge AI “steals” a portion of the intelligence from the cloud infrastructure and brings it to machinery. Octonion, a start-up that integrates artificial intelligence into low-power microcontrollers, exemplifies how intelligence can be imbued into industrial products. This technology supports companies in making smart decisions in real time, locally, by using continuous learning models and machine health scores. Examples include deploying edge AI on industrial pumps and motors to improve the monitoring of machine capabilities and to develop predictive maintenance techniques.
Machine Learning
Machine learning is an application of artificial intelligence (AI) that provides systems the ability
to automatically learn and improve from experience without being explicitly programmed.
Machine learning focuses on the development of computer programs that can access data and
use it to learn for themselves. As the name suggests, it gives the computer something that makes it more similar to humans: the ability to learn. Machine learning is actively being used today, perhaps in many more places than one would expect.
Applications
Marketing
“75% of enterprises using AI and machine learning enhance customer satisfaction by more than
10%.”
Measuring marketing’s many contributions to revenue growth is becoming more accurate and
real-time thanks to analytics and machine learning. The following are some of the ways machine learning is revolutionizing marketing today and in the future:
• Using a concerted approach to applying AI and machine learning across a retailer’s value
chains has the potential to deliver a 50% improvement of assortment efficiency and a 30%
online sales increase using dynamic pricing.
• Machine learning is streamlining the creation, fine-tuning and revenue contributions of up-sell and cross-sell strategies by automating the entire process.
• Lead scoring accuracy is improving, leading to increased sales that are traceable back to
initial marketing campaigns and sales strategies.
• Identifying and defining the sales projections of specific customer segments and micro-segments using RFM (recency, frequency and monetary) modelling within machine learning apps is becoming pervasive (a small RFM sketch follows this list).
• Optimizing the marketing mix by determining which sales offers, incentives and programs are presented to which prospects through which channels is another way machine learning is revolutionizing marketing.
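A small sketch of the RFM modelling mentioned above, using pandas on an invented transaction table (customer_id, order_date and amount are hypothetical column names): recency, frequency and monetary scores are derived and combined into simple micro-segment labels.

    import pandas as pd

    # Toy transaction data (hypothetical columns)
    tx = pd.DataFrame({
        "customer_id": [1, 1, 2, 2, 2, 3],
        "order_date": pd.to_datetime(
            ["2021-01-05", "2021-03-01", "2021-02-20", "2021-03-10", "2021-03-25", "2020-12-15"]),
        "amount": [120.0, 80.0, 40.0, 60.0, 55.0, 300.0],
    })

    snapshot = tx["order_date"].max() + pd.Timedelta(days=1)

    # Recency (days since last order), Frequency (order count), Monetary (total spend)
    rfm = tx.groupby("customer_id").agg(
        recency=("order_date", lambda d: (snapshot - d.max()).days),
        frequency=("order_date", "count"),
        monetary=("amount", "sum"),
    )

    # Score each dimension 1-3 (recent buyers score high) and combine into a segment label
    rfm["r_score"] = pd.qcut(rfm["recency"], 3, labels=[3, 2, 1]).astype(int)
    rfm["f_score"] = pd.qcut(rfm["frequency"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
    rfm["m_score"] = pd.qcut(rfm["monetary"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
    rfm["segment"] = rfm["r_score"].astype(str) + rfm["f_score"].astype(str) + rfm["m_score"].astype(str)

    print(rfm)

Campaigns can then be targeted per segment label (for example, win-back offers for low-recency, high-monetary customers).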
Finance
Leading banks and financial services companies are deploying AI technology, including machine
learning (ML), to streamline their processes, optimize portfolios, decrease risk and underwrite
loans amongst other things. Due to the high volume of historical financial data generated in the
industry, ML has found many useful applications in finance. The following are some of the
current applications of machine learning in finance:
Portfolio Management – Robo-Advisors: Robo-advisors provide automated financial guidance and services. They offer portfolio management services that use algorithms and statistics to automatically establish and manage a client's investment portfolio.
Algorithmic Trading: is the use of algorithms to conduct trades autonomously. It is mostly hedge fund managers who make use of automated trading systems, and therefore of machine learning in finance. It allows traders to automate certain processes, ensuring a competitive advantage.
High-Frequency Trading (HFT): Machines in charge of HFT are nothing new. During 2009-2010, anywhere from 60% to 70% of U.S. trading was attributed to HFT. Some of the biggest
players include companies like Tokyo-based Nomura Securities, Two Sigma Securities, Citadel
Securities, Tower Research Capital and DRW, but there are many more operating in financial
markets worldwide.
Fraud Detection: Fraud is a massive problem for financial institutions and one of the foremost reasons to leverage machine learning in finance. This is because ML systems can scan through vast data sets, detect unusual activities (anomalies) and flag them instantly. ML is also a strong candidate for tackling the problem of false positives, which occur regularly in finance.
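As a hedged illustration of the anomaly-detection idea described above (not any particular institution's system), the sketch below flags unusual transactions with scikit-learn's IsolationForest on synthetic data; the features (amount, hour of day) are invented for the example.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)

    # Synthetic transactions: [amount, hour_of_day] -- mostly normal, plus a few extreme outliers
    normal = np.column_stack([rng.normal(50, 15, 500), rng.normal(14, 3, 500)])
    fraudulent = np.array([[900.0, 3.0], [750.0, 2.0], [1200.0, 4.0]])
    X = np.vstack([normal, fraudulent])

    # Isolation Forest isolates points that are "few and different"
    model = IsolationForest(contamination=0.01, random_state=0)
    labels = model.fit_predict(X)          # -1 = anomaly, 1 = normal

    flagged = X[labels == -1]
    print(f"{len(flagged)} transactions flagged for review")
    print(flagged)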
Operations
Machine learning makes it possible to discover patterns in supply chain data by relying on algorithms that quickly pinpoint the most influential factors in a supply network's success. Key
factors influencing inventory levels, supplier quality, demand forecasting, procure-to-pay, order-
to-cash, production planning, transportation management and more are becoming known for the
first time. The ways machine learning is revolutionizing supply chain management include:
1. Machine learning algorithms and the apps running them are capable of analyzing large, diverse data sets fast, improving demand forecasting accuracy (a small forecasting sketch follows this list).
2. Reducing freight costs, improving supplier delivery performance, and minimizing supplier
risk are three of the many benefits machine learning is providing in collaborative supply
chain networks.
3. Machine Learning and its core constructs are ideally suited for providing insights into
improving supply chain management performance not available from previous
technologies.
4. Machine learning excels at visual pattern recognition, opening up many potential
applications in physical inspection and maintenance of physical assets across an entire
supply chain network.
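As referenced in item 1, here is a deliberately simple forecasting sketch: a regression on lagged demand, using invented monthly figures. Production systems would add seasonality, promotions and external signals, but the structure is similar.

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Hypothetical monthly demand for one SKU
    demand = pd.Series([120, 132, 128, 140, 151, 149, 160, 172, 168, 181, 190, 188])

    # Lag features: predict this month's demand from the previous two months
    df = pd.DataFrame({
        "lag1": demand.shift(1),
        "lag2": demand.shift(2),
        "y": demand,
    }).dropna()

    model = LinearRegression().fit(df[["lag1", "lag2"]], df["y"])

    # Forecast the next month from the two most recent observations
    next_features = pd.DataFrame([[demand.iloc[-1], demand.iloc[-2]]], columns=["lag1", "lag2"])
    print(f"Forecast for next month: {model.predict(next_features)[0]:.1f} units")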
Industry Examples
IBM
IBM’s Watson has been following self-learning behavior models and has done everything from diagnosing certain types of cancers more effectively than oncologists to writing songs and producing movie trailers. In the case of cancer treatments, Watson can read half a million medical
research papers in 15 seconds and was trained at Memorial Sloan Kettering in New York to be
able to suggest diagnoses and treatments to doctors.
Amazon
On the retail side, everything from product recommendations to supply chain, forecasting, and capacity planning runs on machine learning, while programs like Macie and Glue scan for sensitive data breaches and perform data cleansing, respectively. Of course, let’s not forget
Alexa, Prime Air, and Amazon Go, which all function through AI algorithms, while rumors of
an AI fashion designer are feeding the Amazon AI flame.
Google
Google was one of the pioneers of machine learning with suggested searches and ever-evolving
search ranking algorithms. Google’s Machine Intelligence efforts have focused on deep learning,
which involves multiple layers of neural networks—built to simulate human thought processes—
that allow Google’s technology to process data more thoroughly.
Netflix
The online streaming giant announced an AI algorithm called Dynamic Optimizer to analyze each and every frame of video in each of the roughly 13,000 titles it streams and compress them without sacrificing image quality.
Virtual Reality
Industry Applications
Automotive industry - VR allows engineers and designers to experiment easily with the look
and build of a vehicle before commissioning expensive prototypes. Brands such as BMW and
Jaguar Land Rover already use VR to hold early design and engineering reviews to check the
visual design and object obscuration of the vehicle - all before any money has been spent on
physically manufacturing the parts.
Augmented Reality
Do you remember the Pokémon GO craze? That’s the most well-known application of
augmented reality—technology that overlays digital information on the real world. Rather than
provide a fully immersive virtual experience, augmented reality enhances the real-world with
images, text, and other virtual information via devices such as heads-up displays, smartphones,
tablets, smart lenses, and AR glasses.
Industry Applications
Training - every industry needs to train new recruits. AR is used to create training programs in which step-by-step instructions are given to the trainees, making the training more engaging and interactive.
Assembly industry - in industries such as automotive or semiconductor, where workers assemble components, they used to rely on paper instructions or memorize all the steps. With augmented reality, they are given step-by-step instructions, simplifying their job.
Warehouse logistics - AR applications are increasingly being used for order picking in warehouses. These applications combine several other capabilities, such as image recognition, barcode scanning and indoor navigation, all integrated with the warehouse management system.
Blockchain
Blockchain is a technology which can be used to develop applications, such as social networks,
messengers, games, exchanges, storage platforms, voting systems, prediction markets, online
shops and much more. In this sense, it is similar to the internet, which is why some have dubbed
it “The Internet3.0”. A blockchain is, in the simplest of terms, a time-stamped series of
immutable records of data that is managed by a cluster of computers not owned by any single
entity. Each of these blocks of data (i.e., block) is secured and bound to each other using
cryptographic principles (i.e., chain).
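To make the block-and-chain idea concrete, here is a minimal, hypothetical Python sketch of a hash-chained ledger; it illustrates time-stamping and tamper detection only, not mining or consensus.

    import hashlib, json, time

    def block_hash(block):
        """Hash a block's timestamp, data, and link to the previous block."""
        payload = {k: block[k] for k in ("timestamp", "data", "previous_hash")}
        return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

    def make_block(data, previous_hash):
        block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
        block["hash"] = block_hash(block)
        return block

    def is_valid(chain):
        """Every block must match its own hash and point at the hash of the block before it."""
        return all(
            chain[i]["hash"] == block_hash(chain[i])
            and (i == 0 or chain[i]["previous_hash"] == chain[i - 1]["hash"])
            for i in range(len(chain))
        )

    chain = [make_block("genesis", "0" * 64)]
    chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
    chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))

    print(is_valid(chain))                      # True
    chain[1]["data"] = "Alice pays Bob 500"     # tampering with a past record
    print(is_valid(chain))                      # False -- the chain detects the change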
Cryptocurrency
A cryptocurrency (or crypto currency) is a digital asset designed to work as a medium of
exchange using cryptography to secure the transactions and to control the creation of additional
units of the currency. Cryptocurrencies are classified as a subset of digital currencies and are also
classified as a subset of alternative currencies and virtual currencies. Bitcoin and its derivatives
use decentralized control as opposed to centralized electronic money/centralized banking
systems. The decentralized control is related to the use of bitcoin’s blockchain transaction
database in the role of a distributed ledger.
5G
• 5G is the 5th generation mobile network. It is a new global wireless standard after 1G, 2G, 3G,
and 4G networks. 5G enables a new kind of network that is designed to connect virtually
everyone and everything together including machines, objects, and devices.
• 5G is based on OFDM (Orthogonal frequency-division multiplexing), a method of modulating
a digital signal across several different channels to reduce interference.
• 5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra-low
latency, more reliability, massive network capacity, increased availability, and a more uniform
user experience to more users.
Barriers to 5G Adoption:
1. One major obstacle is that network providers will need to install a lot of new, and
expensive, infrastructure.
2. With 5G signals tending to travel relatively short distances, network providers will need to
deploy more antennas and base stations to ensure broad coverage.
5G in India:
1. During the AGM 2020, RIL chairman Mukesh Ambani announced the plans for
'homegrown 5G' network in India. At the event, Ambani revealed that Jio has developed a
5G solution from scratch and that it will be ready for trials as soon as 5G spectrum is
available and can be ready for field deployment in 2021.
2. Currently, India does not have 5G services and the government has not yet allocated
spectrum to telecom operators for running field trials aimed at promoting domestic
ecosystems for 5G.
Applications:
Recent Trends in 5G
The fifth-generation of mobile networking technology, or 5G, has been top of mind for the telecom
industry in recent years and the buzz has trickled into almost every other market as businesses look
for ways to manage change through new connectivity options.
Here are some of the most important 5G trends that solution providers should be aware of this year
as the latest cellular technology becomes a viable connectivity option.
5G’s Impact on IoT
5G is going to help further IoT because of the latency and bandwidth improvements it can offer.
The IoT opportunities that will especially benefit from mobile and cellular connectivity include
transportation, manufacturing, farming, and smart cities use cases. And 5G could even make new
and emerging use cases and applications a true reality for the first time, such as connected cars,
which require lightning-fast, low-latency technologies.
Aside from the cutting-edge use cases, many industries right now need highly reliable low-latency
wireless links that can power applications as quickly as possible for their existing IoT use cases.
Connected Communities
Smart cities have become a major IoT trend in recent years as metro areas all around the world
equip indoor and outdoor areas with sensors to collect data and gain insights to better manage their
assets, resources and services.
5G is the technology that the smart city and connected communities use cases have been looking
for. Existing 4G networks are limited in their support for simultaneous connections and suffer from high power consumption and a high price per bit. 5G, on the other hand, is expected to drive smart city applications by addressing these issues and, in turn, harness the newly captured data to improve city operations.
5G And Security
As the number of 5G implementations increases, the need for good security will become even more
critical. Carriers, such as AT&T, Verizon, and T-Mobile have been bolstering their next-
generation networks with added encryption and additional defenses at the edge of the network.
But 5G, unlike previous iterations of cellular technology, will be made up of a mostly software-
based network, so securing 5G is a different kind of endeavor. The applications that will ride on
top of the 5G network, such as IoT and smart city apps, will also require additional layers of
security to lock down the new devices and connections that will be joining the network.
5G At the Edge
The link between 5G and edge computing is all about latency. 5G promises to fuel innovation at the edge by powering brand-new use cases, enabling more data collection and faster processing than ever before, while giving businesses and organizations another connectivity option.
By combining 5G and edge computing, organizations will be able to outfit devices like smart
cameras and sensors to collect more data, which will drive more compute use cases at the edge.
This will result in expanding opportunities for solution providers in collecting data at the edge,
channel partners told CRN.
According to research firm IDC, the worldwide edge computing market is forecast to reach
approximately $250 billion in 2024 with a compound annual growth rate of 12.5 percent over the
next four years. 5G technology is expected to act as a catalyst for that market growth.
Social Media Analytics
Social media is a good medium to understand real-time consumer choices, intentions and
sentiments. The most prevalent application of social media analytics is to get to know the
customer base on a more emotional level to help better target customer service and marketing.
The initial step in a social media analytics program is to figure out which business objectives
can gain an advantage from the data that is collected and evaluated. Standard goals include
maximizing business earnings, decreasing customer service expenditures, obtaining feedback on
services and products, and enhancing public opinion about a business division or specific
product. As soon as the business goals are determined, key performance indicators (KPIs) to
perform objective evaluation of the data must be outlined.
• Competitive Advantage: SMA tools allow the organizations to gain a competitive edge over
their competitors by facilitating a much better comprehension of their brands. This usually
includes an understanding of how the customers make use of particular services or products,
what issues are faced by the customers while using these services or products, and how customers view a particular company or product.
• Learn from the Customers: In many cases, customers may have effective solutions for some of
the issues faced by an organization. For example, if a product is in the market without proper
documentation, the chances of use errors increase. Some users may solve these problems
through trial-and-error, and then post their findings in forums, which can help the company
determine whether better documentation is required, and what users really need to know.
• Improve Products and Services: This is the key goal of SMA. There are countless tweets, blogs, comments and complaints regarding products and services. This huge volume of information contains consumer sentiments that can be used to evaluate users' experience with a particular product or service. This information can then be used to help companies perform better (a toy sentiment-scoring sketch follows this list).
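A purely illustrative, toy lexicon-based sketch of scoring sentiment in social posts; the word lists and posts are invented, and real SMA tools use far richer language models.

    import re

    # Toy sentiment lexicons (invented for illustration)
    POSITIVE = {"love", "great", "fast", "helpful", "easy"}
    NEGATIVE = {"broken", "slow", "hate", "confusing", "refund"}

    def score(post: str) -> int:
        """Positive-minus-negative word count as a crude sentiment score."""
        words = re.findall(r"[a-z']+", post.lower())
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    posts = [
        "Love the new app, setup was easy and support was helpful",
        "Checkout is slow and the tracking page is broken, I want a refund",
    ]

    for p in posts:
        s = score(p)
        label = "positive" if s > 0 else "negative" if s < 0 else "neutral"
        print(f"{label:8s} | {p}")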
Analytics
Data analytics is the science of analyzing raw data in order to make conclusions about that
information. Many of the techniques and processes of data analytics have been automated into
mechanical processes and algorithms that work over raw data for human consumption.
Data analytics techniques can reveal trends and metrics that would otherwise be lost in the mass
of information. This information can then be used to optimize processes to increase the overall
efficiency of a business or system.
• Prescriptive Analytics: This data analytics concept prescribes what action to take to remove
future problems or capitalize on a promising trend. Prescriptive analytics essentially provides
an organization with a laser-like focus to answer a specific question. It also helps them to
determine the best solution for a future opportunity or avoid future risks.
• Predictive analytics: It uses big data to identify past patterns to predict the future. Predictive
analytics draws its power from numerous methods and technologies, such as big data, data
mining, statistical modeling, machine learning and assorted mathematical processes, among
others. By utilizing this model, an organization can use past and current data to reliably
forecast trends and behaviors.
• Descriptive analytics: This data analytics method provides insight into what has happened
historically and will provide businesses with trends to get in-depth detail. Descriptive
analytics defines a preliminary stage of data processing that creates a summary of historical
data to yield meaningful information and possibly prepare the data for further analysis.
• Diagnostic Analytics: With this analytics technique, historical data can be measured against
other data to answer the question of why something happened. Essentially, data scientists turn to this technique when trying to determine the "why" behind something that happened.
Diagnostic analytics can be beneficial in the sales cycle, for instance, to categorize customers
by their likely product preferences and sales cycle.
The applications of data analytics are broad. Analyzing big data can optimize efficiency in many
different industries. Improving performance enables businesses to succeed in an increasingly
competitive world. One of the earliest adopters is the financial sector. Data analytics has an
important role in the banking and finance industries, used to predict market trends and assess risk.
Credit scores are an example of data analytics that affects everyone. These scores use many data
points to determine lending risk. Data analytics is also used to detect and prevent fraud to improve
efficiency and reduce risk for financial institutions.
The use of data analytics goes beyond maximizing profits and ROI, however. Data analytics
can provide critical information for healthcare (health informatics), crime prevention, and
environmental protection. These applications of data analytics use these techniques to
improve our world. Though statistics and data analysis have always been used in scientific
research, advanced analytic techniques and big data allow for many new insights. These
techniques can find trends in complex systems.
Data Analytics Tools
• Tableau
• Python
• SAS
• Excel
• SPSS
• Power BI
SPSS
SPSS is short for Statistical Package for the Social Sciences, and it’s used by various kinds of
researchers for complex statistical data analysis.
SPSS is used by market researchers, health researchers, survey companies, government entities,
education researchers, marketing organizations, data miners, and many more for the processing
and analyzing of survey data.
SPSS offers four programs that assist researchers with their complex data analysis needs.
• Text Analytics for Surveys Program: Uncover powerful insights from responses to open-ended survey questions.
• Visualization Designer: Use data to create a wide variety of visuals like density charts and radial boxplots.
Power BI
Microsoft’s Power BI is a cloud-based, business analytics service for analyzing and visualizing
data. Power BI gives you a platform to connect to hundreds of data sources and bring your data to life with live dashboards and reports.
Tableau
Tableau is a data visualization software that is used for data science and business intelligence. It
presents the impact of data visually and comes with real-time data analytics capabilities and
cloud support.
Power BI vs Tableau
Data sources:
• Power BI: Limited access to other databases and servers when compared to Tableau. Examples: SQL Server Database, Access Database, SQL Server Analysis Services Database, Oracle Database, IBM DB2 Database, IBM Informix database (Beta).
• Tableau: Has access to numerous database sources and servers. Examples: Excel, Text File, Access, JSON File, PDF File, Spatial File, Statistical File, Other Files (such as Tableau .hyper, .tds, .twbx), Connect to a Published Data Source on Tableau Online or Server, Actian Matrix, Actian Vector, Amazon Athena, Amazon Aurora, Amazon EMR, Amazon Redshift.
Data capacity:
• Power BI: Each workspace/group can handle up to 10 GB of data. For more than 10 GB, the data needs to be in the cloud (Azure); if it is in local databases, Power BI just selects or pulls the data from the database and does not import it.
• Tableau: Works on a columnar-based structure that stores only unique values for each column, making it possible to fetch billions of rows.
Machine learning:
• Power BI: Integrated with Microsoft Azure, which helps in analyzing the data and understanding the trends and patterns of the product/business.
• Tableau: Python machine learning capabilities are built in, making it efficient for performing ML operations over datasets.
Performance:
• Power BI: Can handle a limited volume of data.
• Tableau: Can handle a huge volume of data with better performance.
Target audience:
• Power BI: Naive users and experienced users.
• Tableau: Even though access is easy and simple, analysts and experienced users use it for their analytics purposes.
Pricing:
• Power BI: Very cheap when compared to Tableau.
• Tableau: Costlier than Power BI; more needs to be paid when it is connected to third-party applications.
Real-time dashboards:
• Power BI: With Power BI real-time streaming, you can stream data and update dashboards in real time. Any visual or dashboard that can be created in Power BI can also be created to display and update real-time data and visuals.
• Tableau: Provides a feature for real-time data; the Connect Live feature is used for real-time data analysis.
Data Mining
Data mining is the practice of automatically searching large stores of data to discover patterns and
trends that go beyond simple analysis. Data mining uses sophisticated mathematical algorithms to
segment the data and evaluate the probability of future events. In simple terms it is used to turn
raw data into useful information.
Steps in Data Mining
1) Data Cleaning – Data cleaning is the process of preparing data for analysis by removing or
modifying data that is incorrect, incomplete, irrelevant, duplicated, or improperly formatted.
This data is usually not necessary or helpful when it comes to analyzing data because it may hinder
the process or provide inaccurate results. There are several methods for cleaning data depending
on how it is stored along with the answers being sought. Data cleaning is not simply about erasing
information to make space for new data, but rather finding a way to maximize a data set’s accuracy
without necessarily deleting information.
2) Data Integration – Data integration is a data pre-processing technique that involves combining data from multiple heterogeneous data sources into a coherent data store and providing a unified view of the data.
3) Data Selection – Data Selection is the process where data relevant to the analysis task are
retrieved from the database. Sometimes data transformation and consolidation are performed
before the data selection process.
4) Data Transformation – Data transformation is the process of changing the format, structure, or values of data. For data analytics projects, data may be transformed at two stages of the data pipeline. Organizations that use on-premises data warehouses generally use an ETL process, in which data transformation is the middle step. Today, most organizations use cloud-based data warehouses, which can scale compute and storage resources with latency measured in seconds or minutes. The scalability of the cloud platform lets organizations skip preload transformations and load raw data into the data warehouse, then transform it at query time — a model called ELT. (A toy pandas sketch of steps 1-4 appears after this list.)
5) Data Mining – Data mining involves six common classes of tasks:
• Anomaly detection (outlier/change/deviation detection) – The identification of unusual data records that might be interesting, or data errors that require further investigation.
• Association rule learning (dependency modeling) – Searches for relationships between
variables. For example, a supermarket might gather data on customer purchasing habits. Using
association rule learning, the supermarket can determine which products are frequently bought
together and use this information for marketing purposes. This is sometimes referred to as
market basket analysis.
• Clustering – is the task of discovering groups and structures in the data that are in some way
or another "similar", without using known structures in the data.
• Classification – is the task of generalizing known structure to apply to new data. For example,
an e-mail program might attempt to classify an e-mail as "legitimate" or as "spam".
• Regression – attempts to find a function that models the data with the least error, that is, for estimating the relationships among data or datasets.
• Summarization – providing a more compact representation of the data set, including
visualization and report generation.
6) Pattern Evaluation – Pattern evaluation is defined as identifying the truly interesting patterns representing knowledge, based on given interestingness measures. It uses summarization and visualization to make the data understandable to the user.
7) Knowledge Representation – Knowledge representation is the presentation of knowledge to the user for visualization in terms of trees, tables, rules, graphs, charts, matrices, etc.
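As flagged under step 4, here is a toy pandas sketch of steps 1-4 (cleaning, integration, selection and transformation); all column names and values are invented for illustration.

    import pandas as pd

    # 1) Data cleaning: drop duplicates and remove an obviously invalid value
    sales = pd.DataFrame({
        "customer_id": [1, 1, 2, 3],
        "amount": [100.0, 100.0, -5.0, 250.0],   # -5.0 is a data-entry error
    })
    sales = sales.drop_duplicates()
    sales.loc[sales["amount"] < 0, "amount"] = float("nan")
    sales = sales.dropna(subset=["amount"])

    # 2) Data integration: combine with a second, heterogeneous source
    customers = pd.DataFrame({"customer_id": [1, 2, 3], "region": ["north", "south", "north"]})
    merged = sales.merge(customers, on="customer_id", how="left")

    # 3) Data selection: keep only the rows and columns relevant to the analysis task
    selected = merged.loc[merged["region"] == "north", ["customer_id", "amount"]].copy()

    # 4) Data transformation: change values (here, scale amounts to a 0-1 range)
    selected["amount_scaled"] = (selected["amount"] - selected["amount"].min()) / (
        selected["amount"].max() - selected["amount"].min()
    )

    print(selected)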
Data Mining Algorithms
Data mining is known as an interdisciplinary subfield of computer science and basically is a
computing process of discovering patterns in large data sets. It is considered as an essential
process where intelligent methods are applied in order to extract data patterns.
Techniques of working with Traditional Methods.
K-means:
K-means clustering, which is also known as the nearest centroid classifier or the Rocchio algorithm, is a method of vector quantization that is considerably popular for cluster analysis in data mining.
K-means is used to create k groups from a set of objects just so that the members of a group are
more similar. Cluster analysis is a family of algorithms designed to form groups such that the
group members are more similar versus non-group members. It assigns data points to a cluster
such that the sum of the squared distance between the data points and the cluster’s centroid
(arithmetic mean of all the data points that belong to that cluster) is at the minimum. The less
variation we have within clusters, the more homogeneous (similar) the data points are within the
same cluster.
K-means is a relatively efficient method. However, we need to specify the number of clusters in advance, and the final results are sensitive to initialization and often terminate at a local optimum.
To process the learning data, the K-means algorithm in data mining starts with a first group of
randomly selected centroids, which are used as the beginning points for every cluster, and then
performs iterative (repetitive) calculations to optimize the positions of the centroids. It halts
creating and optimizing clusters when either:
● The centroids have stabilized — there is no change in their values because the clustering
has been successful.
● The defined number of iterations has been achieved
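A minimal sketch of the procedure described above, using scikit-learn's KMeans on synthetic two-dimensional points (an illustrative choice; any implementation following the same steps would do). Note that k is specified up front and repeated initializations reduce sensitivity to the starting centroids.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Synthetic data: three blobs of points around different centres
    X = np.vstack([
        rng.normal(loc=[0, 0], scale=0.5, size=(50, 2)),
        rng.normal(loc=[5, 5], scale=0.5, size=(50, 2)),
        rng.normal(loc=[0, 5], scale=0.5, size=(50, 2)),
    ])

    # k must be chosen in advance; n_init restarts reduce sensitivity to initialization
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

    print("Cluster centroids:\n", km.cluster_centers_)
    print("Within-cluster sum of squares (inertia):", round(km.inertia_, 2))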
Apriori Algorithm:
Apriori algorithm is a classical algorithm in data mining. It is used for mining frequent item sets and relevant association rules. It is devised to operate on a database containing a lot of transactions, for instance, items bought by customers in a store. With the quick growth of e-commerce applications, vast quantities of data accumulate in months rather than years. Data mining, also known as Knowledge Discovery in Databases (KDD), is used to find anomalies, correlations, patterns, and trends to predict outcomes. Apriori is very important for effective market basket analysis, and it helps customers purchase their items with more ease, which increases the sales of the markets. It has also been used in the field of healthcare for the detection of adverse drug reactions (ADRs). It produces association rules that indicate what combinations of medications and patient characteristics lead to ADRs.
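To make the frequent-item-set idea concrete, here is a hedged sketch; it assumes the third-party mlxtend library as one possible implementation (not something the text prescribes) and uses an invented one-hot basket table.

    import pandas as pd
    from mlxtend.frequent_patterns import apriori, association_rules

    # One-hot encoded market baskets (True = item present in the transaction)
    baskets = pd.DataFrame([
        {"bread": True,  "butter": True,  "milk": False, "beer": False},
        {"bread": True,  "butter": True,  "milk": True,  "beer": False},
        {"bread": False, "butter": False, "milk": True,  "beer": True},
        {"bread": True,  "butter": True,  "milk": False, "beer": True},
        {"bread": True,  "butter": False, "milk": True,  "beer": False},
    ])

    # Frequent item sets with support >= 40%, then rules with confidence >= 70%
    frequent = apriori(baskets, min_support=0.4, use_colnames=True)
    rules = association_rules(frequent, metric="confidence", min_threshold=0.7)

    print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])

On this toy data, a rule such as {bread} -> {butter} surfaces because the two items co-occur in most baskets, which is exactly the complementary-goods pattern discussed next.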
Usage of Apriori Algorithm in Complementary Goods Concept:
Complements are the items bought and used together. These items help to lift the sales of each
other in a customer basket. Examples of complements could be Bread and Butter or Flights and
Taxi services etc.
How Apriori Helps!!
1. Item placement in the stores. Complementary items can be placed together/closer.
2. In the e-commerce website, whenever an item is bought, recommending its complementary items, as these are items bought together.
3. On unavailability of an item, recommending its substitute.
4. Giving combo offers on the item and its complement to lift the sales or clear the stocks.
5. Whenever there is a price hike/drop of an item, monitoring the impact on the sales/demand of
its substitute. This helps in taking conscious and planned pricing decisions.
Naive Bayes -
Naive Bayes (NB) is a simple supervised function and is a special form of discriminant analysis. It's a generative model and therefore returns probabilities. It's the opposite classification strategy of One Rule (OneR). All attributes contribute equally and independently to the decision. Naive Bayes makes
predictions using Bayes' Theorem, which derives the probability of a prediction from the
underlying evidence, as observed in the data. Naive Bayes works surprisingly well even if the independence assumption is clearly violated, because classification doesn't need accurate probability estimates so long as the greatest probability is assigned to the correct class.
Naive Bayes also works on text categorization. NB affords fast model building and scoring and
can be used for both binary and multi-class classification problems. The naive Bayes classifier is
very useful in high-dimensional problems because multivariate methods like QDA and even LDA
will break down. For example, a fruit may be considered to be an apple if it is red, round, and
about 3 inches in diameter. Even if these features depend on each other or upon the existence of
the other features, all of these properties independently contribute to the probability that this fruit
is an apple and that is why it is known as ‘Naive’.
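As a small, hedged illustration of the text-categorization use mentioned above, the sketch below trains a multinomial Naive Bayes classifier on a few invented messages with scikit-learn.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Tiny, invented training set for spam vs legitimate mail
    messages = [
        "win a free prize now", "limited offer claim your reward",
        "meeting moved to 3pm", "please review the attached report",
    ]
    labels = ["spam", "spam", "legitimate", "legitimate"]

    # Bag-of-words features feed the multinomial Naive Bayes classifier
    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(messages, labels)

    print(model.predict(["claim your free reward now"]))     # likely 'spam'
    print(model.predict(["report for tomorrow's meeting"]))  # likely 'legitimate'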
What is Big Data and why is Big Data Analytics important?
Big Data refers to a huge volume of data that cannot be stored and processed using the traditional computing approach within a given time frame. But how huge does this data need to be to be termed Big Data? There is a lot of misconception surrounding what amount of data can be termed Big Data. Usually, data which is in gigabytes, terabytes, petabytes, exabytes or anything larger than this in size is considered Big Data. This is where the misconception arises: even a small amount of data can be referred to as Big Data depending on the context in which it is being used.
Big Data Analytics and Data Science
The analytics involves the use of advanced techniques and tools of analytics on the data obtained
from different sources in different sizes. Big Data analytics involves the use of analytics
techniques like machine learning, data mining, natural language processing, and statistics. The
data is extracted, prepared and blended to provide analysis for the businesses. Data analytics
involves qualitative as well as quantitative techniques to improve business productivity and
profits. The data analytics tools are used by researchers, analysts, and engineers for business
organizations to access the data efficiently.
The education sector is also making use of data analytics in a big way. There are new options for
research and analysis using data analytics. The institutional data can be used for innovations by
technical tools available today. Due to immense opportunities, Data analytics has become an
attractive option to study for students as well.
Use Cases
IBM has its own project that has been using analytics and helping schools succeed. The
University of Florida has also been using this platform to extract the student data and use it to
monitor and predict their student performance. Finger Lakes Community College and The Keller
Graduate School of Management have already adopted the IBM Analytics platform to track their
students and both the institutes have seen a rise in the performance of their students.
The insights provided by the big data analytics tools help in knowing the needs of customers
better. This helps in developing new and better products. Improved products and services with
new insights can help the firm enormously. This may help the customers too as they get better
offerings satisfying their needs effectively. All in all, Data analytics has become an essential part
of the companies today.
Coca-Cola
Coca Cola is known to have ploughed extensive research and development resources into
artificial intelligence (AI) to ensure it is squeezing every drop of insight it can from the data it
collects.
Fruits of this research were unveiled earlier this year when it was announced that the decision to
launch Cherry Sprite as a new flavor was based on monitoring data collected from the latest
generation of self-service soft drinks fountains, which allow customers to mix their own drinks.
Healthy options - The Company combines weather data, satellite images, information on crop
yields, pricing factors and acidity and sweetness ratings, to ensure that orange crops are grown
in an optimum way, and maintain a consistent taste. The algorithm then finds the best
combination of variables in order to match products to local consumer tastes in the 200-plus
countries around the world where its products are sold.
Social data mining - Coca Cola closely tracks how its products are represented across social
media, and in 2015 was able to calculate that its products were mentioned somewhere in the
world an average of just over once every two seconds.
Knowing this gives insight into who is consuming their drinks, where their customers are, and
what situations prompt them to talk about their brand. The company has used AI-driven image
recognition technology to spot when photographs of its products, or those of competitors, are
uploaded to the internet, and uses algorithms to determine the best way to serve them
advertisements.
Netflix Recommendation System
With over 100 million subscribers, the company collects huge amounts of data, which is the key to achieving the industry status Netflix boasts. Whenever you access the Netflix service, the recommendations system helps you find a show or movie to enjoy with minimal effort. The likelihood that you will watch a particular title in the catalogue is estimated based on a number of factors, including your interactions with the service (such as your viewing history and how you rated other titles), other members with similar tastes and preferences on the service, and information about the titles, such as their genre, categories, actors, release year, etc.
All of these pieces of data are used as inputs that are processed in the algorithms. The
recommendations system does not include demographic information (such as age or gender) as
part of the decision-making process.
To improve the recommendation system, feedback from every visit to the Netflix service is collected and used to continually re-train the algorithms with those signals, improving the accuracy of their prediction of what you're most likely to watch.
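The "other members with similar tastes" idea can be sketched with a tiny user-item matrix and cosine similarity; this is an illustrative collaborative-filtering toy with invented titles, not Netflix's actual recommendation pipeline.

    import numpy as np

    # Rows = members, columns = titles; 1 = watched, 0 = not watched (toy data)
    titles = ["Drama A", "Comedy B", "Thriller C", "Documentary D"]
    watched = np.array([
        [1, 1, 0, 0],   # member 0
        [1, 1, 1, 0],   # member 1
        [0, 0, 1, 1],   # member 2
    ])

    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

    target = 0  # recommend for member 0
    similarities = np.array([cosine(watched[target], watched[m]) for m in range(len(watched))])
    similarities[target] = 0.0                     # ignore self-similarity

    # Score unseen titles by how much similar members watched them
    scores = similarities @ watched
    scores[watched[target] == 1] = -np.inf         # do not recommend titles already seen

    print("Recommend:", titles[int(np.argmax(scores))])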
Amazon Fresh and Whole Foods
Amazon knows the online customer backwards and forwards, but when it comes to
understanding the brick-and-mortar shopper, they lack insight.
Amazon didn’t buy Whole Foods for the business - they bought it for the data.
What exactly is in the Whole Foods data that Amazon would want? Answer: Grocery buying
habits and patterns. Preferences. Correlations between purchases of different products and even
different categories. With massive amounts of data from Whole Foods shoppers, Amazon will
ultimately be able to tailor the grocery shopping experience to the individual. Amazon has
already mastered the process of upselling, i.e., offering additional items that go with the items
the consumer is looking to buy. With consumables like groceries, Amazon will know when you
run out of cereal and will present you with the offer to buy more at precisely the right time.
Alternatively, the new box of cereal may just show up at your door at the moment you take that
last bite.
Adidas
Due to the size of Adidas, keeping track of all products on its e-commerce activity is a difficult
task. Johannes Wagner, senior business analyst at Adidas Group explained, “Across Europe,
there are two brands (Adidas and Reebok), 17 markets and over 9,000 individual articles”.
Combined, this creates a mammoth task with a huge number of data points.
This high volume of data means that, before implementing data analytics, the workload was very
high and all areas of the business were being pushed for time. Secondly, merchandisers were
unable to perform in-season management of their items as they were not equipped with the
appropriate solutions, meaning they had to contact the analysis team instead. The solution was
based around the collection of transactional data from multiple sources and enabling this data to
dictate the direction of business decisions through analytics insights. Using an analytics platform,
Wagner explained how the nature of analytics allowed merchandisers to perform data tasks and
in-season management on their own without consulting analyst teams. This increased the teams’
independence and allowed the analysts to spend more time on high-value tasks. This, along with
newfound granular product tracking, had an overall outcome of higher profit margins.
Amazon Alexa
Alexa is built on natural language processing (NLP), a procedure of converting speech into words, sounds, and ideas. Amazon records your words. Because interpreting sounds takes up a lot of computational power, the recording of your speech is sent to Amazon's servers to be analyzed more efficiently.
Amazon breaks down your “orders” into individual sounds. It then consults a database containing
various words’ pronunciations to find which words most closely correspond to the combination
of individual sounds. It then identifies important words to make sense of the tasks and carry out
corresponding functions. For instance, if Alexa notices words like “sport” or “basketball”, it
would open the sports app. Amazon’s servers send the information back to your device and Alexa
may speak. If Alexa needs to say anything back, it would go through the same process described
above, but in reverse order.
The main command has 3 main parts: Wake word, Invocation name, Utterance.
Wake word - When users say ‘Alexa’, it wakes up the device. The wake word puts Alexa into listening mode, ready to take instructions from users.
Invocation name - The invocation name is the keyword used to trigger a specific “skill”. Users can combine the invocation name with an action, command or question. All custom skills must have an invocation name to start them.
Utterance - ‘Taurus’ is an example of an utterance. Utterances are the phrases users say when making a request to Alexa. Alexa identifies the user's intent from the given utterance and responds accordingly. So basically, the utterance decides what the user wants Alexa to perform.
Then, Alexa-enabled devices send the user's instruction to a cloud-based service called Alexa Voice Service (AVS). Think of the Alexa Voice Service as the brain of Alexa-enabled devices: it performs all the complex operations such as Automatic Speech Recognition (ASR) and Natural Language Understanding (NLU). The Alexa Voice Service processes the response and identifies the user's intent, then makes a web service request to a third-party server if needed.
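A purely hypothetical sketch of how a command string could be split into wake word, invocation name and utterance; this is a toy parser for illustration only, not the Alexa Skills Kit or the AVS API, and the command format it assumes is invented.

    import re

    WAKE_WORD = "alexa"

    def parse_command(command: str):
        """Split 'Alexa, ask <invocation name> to <utterance>' into its three parts (toy parser)."""
        text = command.lower().strip()
        match = re.match(rf"{WAKE_WORD}[, ]+ask (?P<invocation>.+?) to (?P<utterance>.+)", text)
        if not match:
            return None
        return {"wake_word": WAKE_WORD, **match.groupdict()}

    print(parse_command("Alexa, ask daily horoscope to give the reading for Taurus"))
    # {'wake_word': 'alexa', 'invocation': 'daily horoscope', 'utterance': 'give the reading for taurus'}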
IBM Watson Analytics
IBM Watson Analytics is an intelligent, self-service data analysis and visualization application for discovering patterns and insights in your data. It guides you through the process of discovery and automates the predictive analysis and related cognitive processes that come afterward.
Because of the natural language processing capability of IBM Watson Analytics, you can interact
with your data as if you are having a conversation with it. As such, you can extract answers from
structured and unstructured information with ease. IBM Watson represents a new era of
computing called Cognitive computing. It is a cloud-based data discovery service intended to
provide the benefits of advanced analytics without the complexity. Watson Analytics empowers
even novice users to understand and make use of data science techniques ranging from machine learning to predictive modeling.
Furthermore, IBM Watson Analytics lets you instantly find new and emerging trends in your
data. The service even presents it in a visual manner through your dashboards so you can detect
patterns faster.
Quantum Computing
Classical computers that we use today can only encode information in bits that take the value of 1
or 0. This restricts their ability. Quantum computing, on the other hand, uses quantum bits or
qubits. It harnesses the unique ability of subatomic particles that allows them to exist in more than one state, i.e., a 1 and a 0 at the same time. Superposition and entanglement are two features
of quantum physics on which these supercomputers are based. This empowers quantum computers
to handle operations at speeds exponentially higher than conventional computers and with much lower energy consumption.
Quantum computing could contribute greatly in the fields of finance, military affairs, intelligence,
drug design and discovery, aerospace designing, utilities (nuclear fusion), polymer design,
Artificial Intelligence (AI) and Big Data search, and digital manufacturing.
Its potential and projected market size has engaged some of the most prominent technology
companies to work in the field of quantum computing, including IBM, Microsoft, Google, D-
Waves Systems, Alibaba, Nokia, Intel, Airbus, HP, Toshiba, Mitsubishi, SK Telecom, NEC,
Raytheon, Lockheed Martin, Rigetti, Biogen, Volkswagen, and Amgen.
Optimization: Many optimization problems involve searching for a global minimum point solution. By using quantum annealing, such optimization problems may be solved faster than by using supercomputers.
Machine Learning / Big data: ML and deep learning researchers are seeking efficient ways to train and test models using large data sets. Quantum computing can help to make the process of training and testing faster.
Simulation: Simulation is a useful tool to anticipate possible errors and take action. Quantum
computing methods can be used to simulate complex systems.
Material Science: Chemistry and material science are limited by the calculations of the complex
interactions of atomic structures. Quantum solutions are promising a faster way to model these
interactions.
Cybersecurity
Cyber security refers to the body of technologies, processes, and practices designed to protect
networks, devices, programs, and data from attack, damage, or unauthorized access. Cyber security
may also be referred to as information technology security.
Cyber security is important because government, military, corporate, financial, and medical
organizations collect, process, and store unprecedented amounts of data on computers and other
devices. A significant portion of that data can be sensitive information, whether that be intellectual
property, financial data, personal information, or other types of data for which unauthorized access
or exposure could have negative consequences. Organizations transmit sensitive data across
networks and to other devices in the course of doing businesses, and cyber security describes the
discipline dedicated to protecting that information and the systems used to process or store it. As
the volume and sophistication of cyber-attacks grow, companies and organizations, especially
those that are tasked with safeguarding information relating to national security, health, or financial
records, need to take steps to protect their sensitive business and personnel information.
For effective cyber security, an organization needs to coordinate its efforts throughout its entire information system. Elements of cyber security encompass all of the following:
Network security: The process of protecting the network from unwanted users, attacks and
intrusions.
Application security: Apps require constant updates and testing to ensure these programs are
secure from attacks.
Endpoint security: Remote access is a necessary part of business, but can also be a weak point
for data. Endpoint security is the process of protecting remote access to a company’s network.
Data security: Inside of networks and applications is data. Protecting company and customer
information is a separate layer of security.
Identity management: Essentially, this is a process of understanding the access every individual
has in an organization.
Database and infrastructure security: Everything in a network involves databases and physical
equipment. Protecting these devices is equally important.
Cloud security: Many files are in digital environments or “the cloud”. Protecting data in a 100%
online environment presents a large number of challenges.
Mobile security: Cell phones and tablets involve virtually every type of security challenge in and
of themselves.
There remains a lot of speculation about what happens after the pandemic, but six things appear to
be certain:
Some organizations will need to move to new operating models. For these companies,
immediately after the crisis, cybersecurity and IT rights will require careful examination and
handling. Remote worker monitoring and support will become vital. And for workers who
transition from home back to the office, cybersecurity professionals must ensure stringent system and access scrutiny prior to allowing the shifted systems to connect back to the network.
Companies will need to reset their security systems to ensure there are no outliers. Both physical
and digital systems will need to be restarted, to check for any digital holes in the fence. System
and data access rights granted during the pandemic to enable remote work will require auditing to
determine whether they should be revoked or updated. IT systems will need to be analyzed for
cracks, foul paths or fraudulent identities. The reason is that cybercriminals may have found ways
to gain entry into otherwise hardened facilities.
New cyber risks that appeared during the pandemic must be understood. For instance, security
experts will need to scrutinize the digital capabilities of critical business functions, making sure
they can withstand cyberattacks during a lockdown. They will examine critical supply chains,
including digital supply chains, to ensure continuity during a health crisis.
Updates to remote access and bring-your-own-device (BYOD) policies must be made. They
should include cybersecurity hygiene controls.
Advanced technology must be deployed. Threat detection and response capabilities must include
advanced capabilities supported by next-generation technologies like big data, artificial
intelligence and machine learning. These are needed to detect and respond to adverse behaviour at
machine speed, without human interventions. Further, organizations will want to explore insurance
against losses from cyberattacks incurred during a pandemic scenario.
Encryption
Encrypting data in storage, transit and use (a small symmetric-encryption sketch appears after these definitions).
Authentication
Securely identifying people and digital entities.
Authorization
Defining and implementing privileges for computing resources.
Network Security
Securing networks with techniques such as a network perimeter.
Sandboxing
Running untrusted software in a virtual environment where it can do no harm.
Internal Controls
Internal controls such as the requirement that different people write code, review the code and
launch it into production.
Security by Design
Architecting and designing systems, applications and infrastructure to be secure.
Secure Coding
A series of principles and practices for developing code that is free of security vulnerabilities.
Secure Testing
Testing cycles designed to discover security vulnerabilities.
Defense in Depth
The principle that each layer of security doesn't assume anything. For example, an application that doesn't assume that a firewall has prevented external access.
Physical Security
Physical security such as a data center with access controls.
Audit Trail
Logging that records interactions with systems, applications, databases and infrastructure such
that malicious activity can be detected and reconstructed.
Defensive Computing
Users who are aware of cybersecurity and are careful in their use of technology.
Non-Repudiation
The ability to prove that a commercial transaction took place.
Security Infrastructure
Foundational tools that offer security services such as a virus scanner or intrusion detection
system.
Monitoring
Monitoring systems, applications and infrastructure and promptly investigating suspicious
activity.
Vulnerability Management
Tracking known vulnerabilities to software and hardware and applying fixes in a timely manner.
Response to Breaches
Defending your services, resources and data from an attack
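To make the Encryption element above concrete, here is a minimal sketch of symmetric encryption using the cryptography package's Fernet recipe (the choice of library, and the sample record, are assumptions for illustration).

    from cryptography.fernet import Fernet

    # Generate and keep a secret key; anyone holding it can decrypt
    key = Fernet.generate_key()
    cipher = Fernet(key)

    record = b"customer=Jane Doe; card=4111-1111-1111-1111"
    token = cipher.encrypt(record)          # safe to store or transmit
    restored = cipher.decrypt(token)        # only possible with the key

    print(token[:30], b"...")
    print(restored == record)               # True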
Targeted Ransomware
Another significant trend in cybersecurity is that we can't seem to ignore for 2020 is targeted
ransomware. Especially in the developed nation's industries rely heavily on specific software to
run their daily activities. These ransomware targets are more focused, such as the Wanna Cry
attack on the National Health Service hospitals in England Scotland corrupted more than 70,000
medical devices. Though generally, ransomware asks to threaten to publish victim's data unless a
ransom is paid still, it can affect the large organization or in case of nations too.
Insider Threats
Human error is still one of the primary reasons for data breaches. Any bad day or intentional loophole can bring down a whole organization, with millions of records stolen. Verizon's data breach report offers strategic insights on cybersecurity trends, indicating that employees directly or indirectly carried out 34 percent of total attacks. So, make sure you create more awareness within your premises to safeguard data in every way possible.
NFT
Vertical Platform: Within individual industries, vertical platforms have formed in order to share
data and provide solutions to targeted needs, such as predictive maintenance, supply chain
optimization, operational efficiency, and network optimization. Airbus’s Skywise and Penske’s
Fleet Insight, for example, provide benchmarking and other services using aggregated data from
airlines and logistics providers, respectively. Volkswagen recently opened its Industrial Cloud to
external companies, inviting platform partners to both contribute software applications that
increase the carmaker’s production efficiency and improve their own operations by scaling their
applications.
Super Platforms: Big companies, both tech giants and traditional industry incumbents, have
recognized the value in data sharing and are positioning themselves to capture a significant share
by expanding their existing vertical platforms to become super platforms. Super platforms
aggregate data across both verticals and data entities to support the development of applications
that address new sets of use cases. Most super platforms so far have been consumer focused, but
there are also early examples that aggregate data for industrial and B2B uses. Siemens’
MindSphere, ABB Ability, and Honeywell Forge, among others, are competing to be the digital
entry point for the factory. Super platforms can also address big societal issues, such as energy
efficiency. Schneider Exchange connects Schneider Electric customers with an open ecosystem of
analytics and solution providers that can develop applications by tapping directly into data from
the entire electrical system, from generation to transmission to commercial and residential use.
The issue with super platforms is assuring where and how the data is being used.
Shared Infrastructure: Most large companies are migrating at least some of their data and IT
infrastructure to cloud services provided by so-called hyperscalers, such as Amazon Web Services
(AWS) and Microsoft Azure, some of which also provide super platforms. Hyperscalers facilitate
data sharing by providing both cloud storage infrastructure and the applications that put data to
use for consumers and businesses. They already host massive amounts of data from all kinds of
industries, and they are in a natural position to aggregate data by building connections across
companies.
As data sharing generates more value, and as more data migrates to the cloud, providers can
differentiate themselves by offering data connectivity services that help clients capture and retain
business. Down the road, cloud providers can offer additional features that allow companies to
control access to their data, trace it as it is being shared across ecosystems, and monitor—and
potentially charge for—its use. Ecosystem orchestrators looking to bolster data sharing can shift
the entire ecosystem to the cloud platform with the best sharing functionality.
Distributed Data Space: There are two drivers of innovation from data sharing: aggregation and
access. Aggregation of data from disparate sources can lead to more innovation as hidden
relationships are revealed. Aggregation of more and more data from the same source across time
and space can facilitate benchmark comparisons and generate insights into trends. Likewise,
greater access and more open platforms unlock innovation by allowing a broader base of talent to
solve problems. Innovation contests such as Kaggle and DrivenData can help connect data sets
and problems with analytical talent.
But data concentrated in a few companies’ hands can also hamper innovation if those companies
aggregate only limited data types or seek to control access. As tech giants and others build out
infrastructure and services to consolidate data, the impact of network effects propels them into
powerful positions in the market. The distributed data space concept is capturing the attention of
think tanks, academics, researchers, and investors. For example, the UK’s Open Data Institute is
exploring the value of data sharing as well as models such as data trusts and other data institutions.
Characteristics of RPA:
• Flexibility: An RPA bot can be programmed to complete almost any repetitive task.
• Ease of integration: Thanks to screen scraping and existing integrations, RPA bots need little or no custom integration work and can supply input to, and evaluate the output of, almost any Windows application.
• Ease of implementation: Setting up RPA can be as simple as recording your actions, much like creating a macro in Excel. A drag-and-drop interface covers most automation setup, and next-generation RPA bots learn the activities to be automated from employees' actions, an approach also called cognitive or intelligent automation (a minimal data-entry sketch follows this list).
• Cost: RPA further reduces the cost of a process. Business process outsourcing is no longer economical when the same processes can be automated, yielding better results at lower cost.
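As an illustration of the "record and replay" idea behind these characteristics, the following Python sketch reads rows from a spreadsheet and types them into whichever form currently has keyboard focus using simulated keystrokes. It assumes the third-party openpyxl and pyautogui packages and a hypothetical invoices.xlsx file; a real RPA product would add recording, error handling, and screen scraping on top of this.

```python
# Illustrative RPA-style data-entry bot (assumptions: openpyxl and pyautogui
# are installed, "invoices.xlsx" exists, and the target form has focus).
import time

import pyautogui
from openpyxl import load_workbook


def enter_invoices(path="invoices.xlsx"):
    sheet = load_workbook(path).active
    for row in sheet.iter_rows(min_row=2, values_only=True):  # skip header row
        invoice_no, amount, vendor = row                      # hypothetical columns
        for value in (invoice_no, amount, vendor):
            pyautogui.typewrite(str(value), interval=0.05)    # type the field value
            pyautogui.press("tab")                            # move to the next field
        pyautogui.press("enter")                              # submit the record
        time.sleep(1)  # give the target application time to save


if __name__ == "__main__":
    time.sleep(5)  # time to click into the target form before the bot starts
    enter_invoices()
```

The point of the sketch is that the bot works through the existing user interface, which is why RPA can sit on top of legacy Windows applications without formal integration.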
Attended Automation
Attended Automation refers to automation in which the bot, or agent, resides passively on the user's machine and is invoked by the user at specific moments. The trigger has to come from an explicit user action, because the right points of triggering are hard to detect programmatically.
The classic example of attended automation is customer service. A customer's inquiry might require a few basic checks that would otherwise be performed manually by the service representative, with the output largely a matter of inference. The 'agent' or 'bot' scrapes the required information and pastes it into the relevant fields, taking a monotonous task away from the representative. The automation also ensures that no errors creep in while copying or pasting the information.
Unattended Automation
Unattended Automation is an enhanced form of RPA. It is used for tasks that can run in the background, processing the essential data to produce the output. This saves a great deal of time for back-end employees, who deal less with customers and more with data and processes.
Various triggers are used for unattended automation, such as data input in a specific field, bot-initiated launching, workflow-initiated launching, and time-slot based bots; a time-slot based bot is sketched below.
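Here is a minimal sketch of a time-slot based unattended bot, assuming a hypothetical nightly reconciliation job: the bot wakes up on a schedule, processes whatever work is waiting, and finishes without any human trigger. The job logic and schedule are placeholders.

```python
# Illustrative unattended, time-slot based bot: runs a hypothetical
# reconciliation job every day at 02:00 with no human trigger.
import datetime
import time


def reconcile_transactions():
    # Placeholder for the real back-office work (read a queue, match records,
    # write a report). Here we only log that the job ran.
    print(f"[{datetime.datetime.now():%Y-%m-%d %H:%M}] reconciliation complete")


def run_daily(job, hour=2, minute=0):
    while True:
        now = datetime.datetime.now()
        next_run = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
        if next_run <= now:
            next_run += datetime.timedelta(days=1)    # today's slot has passed
        time.sleep((next_run - now).total_seconds())  # sleep until the time slot
        job()


if __name__ == "__main__":
    run_daily(reconcile_transactions)
```

In a production deployment the scheduling would usually be delegated to an orchestrator or the RPA platform itself, but the principle is the same: the trigger is the clock or a workflow event, not a person.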
Hybrid RPA
Large organizations today have both a customer-support environment and a back-end environment, so they need RPA that offers the best of both worlds to make their processes more robust and efficient. They therefore combine attended and unattended automation.
RPA also cleans up the underlying processes, providing an easily integrated framework on top of existing digital systems. Without that foundation, the barrier to entry for integrating AI is much higher, because AI would need to be manually woven into the core processes.
Trends in RPA:
RPA, a topic of keen interest among the C-suite, is quickly making strides across several industries, including manufacturing, retail, telecommunications, BFSI, and insurance. The list does not end there; it is only beginning. The key trends to look forward to are:
RPA Will be the New ERP
The community of global system integrators (GSIs) and audit-based consulting organizations will motivate and train a huge number of workers to embrace automation, much as they did with enterprise resource planning (ERP) software in the 1990s.
These organizations recognize that the automation industry is ripe for explosive growth and see a clear opportunity to sell business systems and enablement services that help their customers reap new rewards, just as they once did with ERP.
Rise of SPA
SPA, or Smart Process Automation, is essentially an extension of RPA. The earlier generation of RPA could automate structured data with a pre-defined set of rules. With recent advances and the easy incorporation of Machine Learning, SPA bots serve as an alternative to RPA's 'if-then' rules and statements.
The future of RPA will see modern technologies such as advanced data analytics, business process automation, Artificial Intelligence, Blockchain, and Optical Character Recognition (OCR) combined with RPA to deliver more powerful automation. We can likewise expect accelerated growth among robotic process automation companies.
Take the COVID-19 pandemic of 2020, for instance. Countless workers endured increased anxiety about the recession and what it would mean for them and their families. The automation market must address this with processes and operations that improve the employee experience, which is crucial for morale, engagement, and productivity.
References:
• https://research.aimultiple.com/what-is-robotic-process-automation/
• https://www.sapcle.com/blog/?p=1081
• https://www.uipath.com/blog/ai-rpa-differences-when-to-use-them-together
Edge Computing
Edge computing refers to the computing done at or near the source of the data, instead of relying
on the cloud at one of a dozen data centers to do all the work. The word edge in this context means
literal geographic distribution. Edge computing is based on a networking philosophy focused on
bringing computing as close to the source of data as possible in order to reduce latency and
bandwidth use. In simpler terms, edge computing means running fewer processes in the cloud and
moving those processes to local places, such as on a user’s computer, an IoT device, or an edge
server. Bringing computation to the network’s edge minimizes the amount of long-distance
communication that has to happen between a client and server.
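The following Python sketch illustrates the idea: a hypothetical edge gateway keeps raw sensor readings local, and only forwards a compact summary (plus any anomalies) to a cloud endpoint, cutting long-distance traffic. The endpoint URL and alert threshold are placeholders.

```python
# Illustrative edge-computing pattern: process readings locally, send only
# a small summary upstream. The cloud URL and threshold are placeholders.
import json
import statistics
import urllib.request

CLOUD_ENDPOINT = "https://example.com/telemetry"  # hypothetical cloud API
ALERT_THRESHOLD = 90.0                              # e.g. degrees Celsius


def summarize(readings):
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "alerts": [r for r in readings if r > ALERT_THRESHOLD],
    }


def push_to_cloud(summary):
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # one small request instead of thousands of raw samples


if __name__ == "__main__":
    raw = [71.2, 73.8, 72.5, 95.1, 70.9]  # readings collected on the edge device
    push_to_cloud(summarize(raw))
```

The design choice is the essence of edge computing: latency-sensitive filtering and aggregation happen next to the data source, and only the distilled result travels over the wide-area network.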
Virtual Events
Tech marketers are beginning to embrace virtual events as a way of reaching diverse audiences to
meet a wide range of objectives — from brand awareness and demand generation to customer
experience and education. Most big vendors have already added virtual events to their marketing
mix. In response to this increasing demand, the number and diversity of players in the virtual event
field has exploded. Tech marketers face the daunting task of selecting the right platform for their
needs. Many of the available platforms provide the basic functionality — reminiscent of a physical
event with various venues and incorporating familiar online communication and collaboration
tools such as videos, chat, and collateral downloads. In this nascent market, certain vendors stand
out, distinguished by innovative features, exceptional service, or ability to scale delivery. As the
tools become more widely used, best practices develop and mature, as do some of the players in
the market. Choosing the right platform and services helps avoid the pitfalls experienced by early
adopters and increases the likelihood of the event delivering the chosen marketing objectives.
Virtual events powered by Artificial Intelligence
In a pandemic-riddled world, several industries including IT, Retail, Healthcare, Automotive,
Education, BFSI to name a few, are actively transitioning to virtual events. From internal trainings,
press releases, product launches, trade shows, or even a client conference, virtual events are
becoming the norm rather than an exception.
While the beginning of 2020 threw the events industry into disarray, the latter part of the year proved that virtual events are not only here to stay but will keep growing and evolving into lean, effective delivery machines. With more than 93% of event marketers planning to invest in virtual platforms, according to the Post-Covid Event Outlook Report, virtual event platforms look set to hold sway going forward.
With the adoption of Artificial Intelligence for virtual events, where AI-powered bots provide virtual companionship for audiences, the promise of these platforms to anchor customer engagement, deliver personalized experiences, build a positive disposition toward brands, and drive demand generation has grown even stronger. Using Machine Learning, for instance, bots can observe and learn attendees' engagement patterns and act as a personal virtual concierge.
Imagine this: in a virtual event, anywhere between 50 and 500 documents get uploaded for attendees to browse and read at their convenience. Sifting and filtering these documents is tedious and time-consuming, but bots can learn your interests, suggest relevant documents, and even auto-suggest people you could network with. From converting voice to text and making session notes to emailing them directly to you, virtual event bots have become an inseparable part of platforms that aim to deliver greater personalization, increase audience engagement, and improve audience retention.
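A minimal sketch of the suggestion idea follows, assuming hypothetical document titles and attendee interests: the bot scores each uploaded document by keyword overlap with what the attendee has already shown interest in and surfaces the best matches. Real event platforms would use richer behavioural signals and trained models rather than this simple heuristic.

```python
# Illustrative content-suggestion bot for a virtual event: rank uploaded
# documents by keyword overlap with an attendee's interests. The documents
# and interests below are made-up examples.
def score(doc_keywords, interests):
    doc, liked = set(doc_keywords), set(interests)
    return len(doc & liked) / len(doc | liked)  # Jaccard similarity


def suggest(documents, interests, top_n=3):
    ranked = sorted(documents.items(),
                    key=lambda item: score(item[1], interests),
                    reverse=True)
    return [title for title, _ in ranked[:top_n]]


if __name__ == "__main__":
    documents = {
        "Edge AI in manufacturing": ["edge", "ai", "manufacturing"],
        "Cloud cost optimisation": ["cloud", "cost", "finops"],
        "Predictive maintenance 101": ["ai", "maintenance", "sensors"],
    }
    attendee_interests = ["ai", "edge", "sensors"]  # inferred from viewed sessions
    print(suggest(documents, attendee_interests, top_n=2))
```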
Virtual workspace
Many people have been working outside their workplaces since the beginning of the pandemic, and for some of them this was the first time they had done so for such a long stretch.
This is clearly the greatest remote-work experiment ever performed, and since there was no choice but to make it succeed, many businesses have gone to great lengths to understand the business's true needs and the conditions under which their workers can remain efficient.
The big question now is how to make up for the real-life interactions between people that are normally lost when everyone works remotely. The first issue to address is replacing voice-only communication with face-to-face video chats. After all, we are just people, and there is a lot to learn simply by looking at the other person's face; that is how we learn to trust and care for one another.
The problems here are the consistency of our internet speeds, which can be a stumbling block, along with head pose, background, and lighting. Many of us have seen Zoom's virtual background feature, which uses deep learning to seamlessly replace your real background with a virtual one.
The real breakthrough comes from Nvidia, which used GANs to tackle the last three problems. Rather than sending the entire video stream, the sender transmits only data about the most important facial features (eyes, nose, mouth). Thanks to GANs, the receiver can reconstruct the same live video of the face from this small amount of information, at much higher quality and using far less bandwidth. Furthermore, the reconstructed video can easily be adjusted to achieve the ideal head pose for natural eye contact between participants.
Although most virtual world environments, such as SecondLife, are designed for entertainment,
social, and educational purposes, there is still a good chance to adapt this to our need for a virtual
workspace where everyone can meet and communicate with one another.
A virtual reality workplace may also encourage us to get up from our desks and walk around while
completely absorbed in the virtual world. Horizon has already been created by Facebook as a
virtual reality gaming environment, but who says we can't use the same concepts for work?
Companies have also begun to use RecRoom, a popular virtual world game, for meetings, virtual
outings, and team events. You would be able to maintain the same workplace habits in a virtual
environment, in addition to getting more interest in meetings, trainings, and close collaboration.
So, if you want to go to the pantry and have a small talk over lunch or just have a random
conversation over coffee, you can do so. We'll be able to transfer the office to the cloud this way!
Rather than choose between the office and the house, we would essentially add all of the office's
functionality to our homes.
SaaS Delivery
Due to its web delivery model, SaaS eliminates the need to have IT staff download and install
applications on each individual computer. With SaaS, vendors manage all potential technical
issues, such as data, middleware, servers, and storage, resulting in streamlined maintenance and
support for the business.
SaaS Advantages
SaaS provides numerous advantages to employees and companies by greatly reducing the time and
money spent on tedious tasks such as installing, managing, and upgrading software. This frees up
plenty of time for technical staff to spend on more pressing matters and issues within the
organization.
SaaS Characteristics
There are a few ways to help you determine when SaaS is being utilized:
• Managed from a central location
• Hosted on a remote server
• Accessible over the internet
• Users not responsible for hardware or software updates
SaaS Disadvantages
Vendor lock-in. Vendors may make it easy to join a service and difficult to get out of it. For
instance, the data may not be portable–technically or cost-effectively–across SaaS apps from other
vendors without incurring significant cost or inhouse engineering rework. Not every vendor
follows standard APIs, protocols, and tools, yet the features could be necessary for certain business
tasks.
Lack of integration support. Many organizations require deep integrations with on-premise apps,
data, and services. The SaaS vendor may offer limited support in this regard, forcing organizations
to invest internal resources in designing and managing integrations. The complexity of integrations
can further limit how the SaaS app or other dependent services can be used.
Data security. Large volumes of data may have to be exchanged with the backend data centers of
SaaS apps in order to perform the necessary software functionality. Transferring sensitive business
information to public-cloud based SaaS service may result in compromised security and
compliance in addition to significant cost for migrating large data workloads.
Lack of control. SaaS solutions involve handing control over to the third-party service provider.
These controls are not limited to the software–in terms of the version, updates, or appearance–but
also the data and governance. Customers may therefore need to redefine their data security and
governance models to fit the features and functionality of the SaaS service.
Feature limitations. Since SaaS apps often come in a standardized form, the choice of features
may be a compromising tradeoff against security, cost, performance, or other organizational
policies. Furthermore, vendor lock-in, cost, or security concerns may mean it’s not viable to switch
vendors or services to serve new feature requirements in the future.
Performance and downtime. Because the vendor controls and manages the SaaS service, customers depend on the vendor to maintain the service's security and performance. Planned
and unplanned maintenance, cyber-attacks, or network issues may impact the performance of the
SaaS app despite adequate service level agreement (SLA) protections in place.
Examples of SaaS
Popular examples of SaaS include:
• Google Workspace (formerly GSuite)
• Dropbox
• Salesforce
• Cisco WebEx
• SAP Concur
• GoToMeeting
PaaS Delivery
The delivery model of PaaS is similar to SaaS, except instead of delivering the software over the
internet, PaaS provides a platform for software creation. This platform is delivered via the web,
giving developers the freedom to concentrate on building the software without having to worry
about operating systems, software updates, storage, or infrastructure. PaaS allows businesses to
design and create applications that are built into the PaaS with special software components. These
applications, sometimes called middleware, are scalable and highly available as they take on
certain cloud characteristics.
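As a sketch of what "concentrating on the software" means in practice, the tiny web application below (using Flask purely as an example framework) is essentially all a developer would hand to a PaaS; the platform supplies the operating system, runtime, web servers, scaling, and patching. The app itself is a hypothetical placeholder.

```python
# Minimal web app a developer might deploy to a PaaS. The platform, not the
# developer, provides the OS, server fleet, scaling, and patching.
# Assumes the Flask package is available; the routes are placeholders.
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/health")
def health():
    return jsonify(status="ok")  # platforms typically probe an endpoint like this


@app.route("/")
def index():
    return "Hello from a PaaS-hosted app"


if __name__ == "__main__":
    app.run()  # local testing only; the PaaS runs the app behind its own web server
```

The notable design point is what is absent: no provisioning scripts, no operating-system configuration, and no load-balancer setup appear in the developer's code.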
PaaS Advantages
No matter the size of your company, using PaaS offers numerous advantages, including:
• Simple, cost-effective development and deployment of apps
• Scalable
• Highly available
• Developers can customize apps without the headache of maintaining the software
• Significant reduction in the amount of coding needed
• Automation of business policy
• Easy migration to the hybrid model
PaaS Characteristics
PaaS has many characteristics that define it as a cloud service, including:
• Builds on virtualization technology, so resources can easily be scaled up or down as your
business changes
• Provides a variety of services to assist with the development, testing, and deployment of
apps
• Accessible to numerous users via the same development application
• Integrates web services and databases
PaaS Disadvantages
Data security. Organizations can run their own apps and services using PaaS solutions, but the
data residing in third-party, vendor-controlled cloud servers poses security risks and concerns.
Your security options may be limited as customers may not be able to deploy services with specific
hosting policies.
Integrations. The complexity of connecting the data stored within an onsite data center or off-
premise cloud is increased, which may affect which apps and services can be adopted with the
PaaS offering. Particularly when not every component of a legacy IT system is built for the cloud,
integration with existing services and infrastructure may be a challenge.
Vendor lock-in. Business and technical requirements that drive decisions for a specific PaaS
solution may not apply in the future. If the vendor has not provisioned convenient migration
policies, switching to alternative PaaS options may not be possible without affecting the business.
Customization of legacy systems. PaaS may not be a plug-and-play solution for existing legacy
apps and services. Instead, several customizations and configuration changes may be necessary for
legacy systems to work with the PaaS service. The resulting customization can result in a complex
IT system that may limit the value of the PaaS investment altogether.
Runtime issues. In addition to limitations associated with specific apps and services, PaaS
solutions may not be optimized for the language and frameworks of your choice. Specific
framework versions may not be available or perform optimally with the PaaS service. Customers
may not be able to develop custom dependencies with the platform.
Examples of PaaS
Popular examples of PaaS include:
• AWS Elastic Beanstalk
• Windows Azure
• Heroku
• Force.com
• Google App Engine
• OpenShift
IaaS Advantages
IaaS offers many advantages, including:
• The most flexible cloud computing model
• Easy to automate deployment of storage, networking, servers, and processing power
• Hardware purchases can be based on consumption
• Clients retain complete control of their infrastructure
• Resources can be purchased as-needed
• Highly scalable
IaaS Characteristics
Characteristics that define IaaS include:
• Resources are available as a service
• Cost varies depending on consumption
• Services are highly scalable
• Multiple users on a single piece of hardware
• Organization retains complete control of the infrastructure
• Dynamic and flexible
Anytime you are unsure of a new application’s demands, IaaS offers plenty of flexibility and
scalability.
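To illustrate the "resources as a service" idea, the sketch below provisions a single virtual machine through the AWS SDK for Python (boto3). The AMI ID, region, and instance type are placeholders, credentials are assumed to be configured in the environment, and other IaaS providers expose similar programmable APIs.

```python
# Illustrative IaaS provisioning with boto3 (AWS SDK for Python).
# The AMI ID is a placeholder; credentials come from the environment.
import boto3


def launch_server():
    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder machine image
        InstanceType="t3.micro",          # pay-as-you-go sizing
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print(f"Launched {instance_id}; terminate it when it is no longer needed")
    return instance_id


if __name__ == "__main__":
    launch_server()
```

Because servers are created and destroyed by API calls like this, consumption-based billing and rapid scaling follow naturally, which is exactly the flexibility the characteristics above describe.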
IaaS Disadvantages
Security. While the customer is in control of the apps, data, middleware, and the OS platform,
security threats can still be sourced from the host or other virtual machines (VMs). Insider threat
or system vulnerabilities may expose data communication between the host infrastructure and
VMs to unauthorized entities.
Legacy systems operating in the cloud. While customers can run legacy apps in the cloud, the
infrastructure may not be designed to deliver specific controls to secure the legacy apps. Minor
enhancement to legacy apps may be required before migrating them to the cloud, possibly leading
to new security issues unless adequately tested for security and performance in the IaaS systems.
Internal resources and training. Additional resources and training may be required for the
workforce to learn how to effectively manage the infrastructure. Customers will be responsible for
data security, backup, and business continuity. Due to inadequate control into the infrastructure
however, monitoring and management of the resources may be difficult without adequate training
and resources available inhouse.
Multi-tenant security. Since the hardware resources are dynamically allocated across users as
made available, the vendor is required to ensure that other customers cannot access data deposited
to storage assets by previous customers. Similarly, customers must rely on the vendor to ensure
that VMs are adequately isolated within the multitenant cloud architecture.
Examples of IaaS
Popular examples of IaaS include:
• DigitalOcean
• Linode
• Rackspace
• Amazon Web Services (AWS)
• Cisco Metacloud
• Microsoft Azure
• Google Compute Engine (GCE)
In the image shown below, pizza is used as an example to understand the differences between the
different cloud services:
Newer types of services are emerging, some of them are listed below:
Software Development Life Cycle (SDLC)
SDLC stands for Software Development Life Cycle. SDLC is a process that consists of a series of planned activities to develop or alter software products. Below is an overview of SDLC, the SDLC models available, and their application in the industry.
ISO/IEC 12207 is an international standard for software life-cycle processes. It aims to be the
standard that defines all the tasks required for developing and maintaining software.
Waterfall Model
The Waterfall Model was the first process model to be introduced and is also referred to as a linear-sequential life cycle model. It is very simple to understand and use. In a waterfall model, each phase must be completed before the next phase can begin, and there is no overlap between phases.
The Waterfall model, the earliest SDLC approach used for software development, illustrates the software development process as a linear sequential flow: any phase in the development process begins only when the previous phase is complete.
Agile Methodology
Agile software development refers to a group of software development methodologies based on
iterative development, where requirements and solutions evolve through collaboration between
self-organizing cross-functional teams.
What is Scrum?
Scrum is a framework that helps teams work together. Often thought of as an agile project management framework, Scrum describes a set of meetings, tools, and roles that work in concert to help teams structure and manage their work.
Scrum Phases and Processes
Scrum roles
1) Scrum Master
• A Scrum Master is a facilitator and servant leader who encourages and expects self-organization from the development team.
• A Scrum Master enables close cooperation across all roles and functions, and addresses resourcing issues and non-adherence to scrum practices.
• A Scrum Master protects the team from external and internal distractions.
• A Scrum Master removes impediments so the team can focus on the work at hand and follow scrum practices.
• A Scrum Master is not typically a manager or lead, but an influential leader and coach who does not exercise direct command and control.
2) Product Owner
• A Product Owner owns the Product Backlog and writes user stories and acceptance criteria.
• A Product Owner is responsible for ensuring the Product Backlog is prioritized, and decides the release date and content.
• A Product Owner accepts or rejects product backlog items.
• A Product Owner has the power to cancel the Sprint if the Sprint goal has become redundant.
• A Product Owner is responsible for the Return on Investment (ROI) of the product.
3) Development Team
• The Development Team builds the product that the Product Owner indicates: the application
or website, for example. The Team in Scrum is “cross-functional”
• The Development Team includes all the expertise necessary to deliver the potentially shippable
product each Sprint
• The Development Team is self-organizing, with a very high degree of autonomy and
accountability.
• The Development Team decides how many items to build in a Sprint, and how best to
accomplish that goal.
• The Development Team is a cross functional, small and self-organizing team which owns the
collective responsibility of developing, testing and releasing the Product increment.
• The Development Team may not appoint any team lead since decisions are taken collectively
by the team.
PDCA (plan do check act or plan do check adjust) is an iterative four-step management method
used in business for the control and continuous improvement of processes and products. It is also
known as the Deming circle/cycle/wheel, the Shewhart cycle, the control circle/cycle, or plan–do–
study–act (PDSA). A fundamental principle of the scientific method and PDCA is iteration—once
a hypothesis is confirmed (or negated), executing the cycle again will extend the knowledge
further. Repeating the PDCA cycle can bring its users closer to the goal, usually a perfect operation
and output.
Another fundamental function of PDCA is the "hygienic" separation of each phase: if the phases are not properly separated, measurements of effects due to various simultaneous actions (causes) risk becoming confounded.
PDCA (and other forms of scientific problem solving) is also known as a system for developing
critical thinking. At Toyota this is also known as "Building people before building cars". Toyota
and other lean manufacturing companies propose that an engaged, problem-solving workforce
using PDCA in a culture of critical thinking is better able to innovate and stay ahead of the
competition through rigorous problem solving and the subsequent innovations.
KAIZEN
Kaizen is a Japanese term with two words – KAI meaning change and ZEN meaning good, so
KAIZEN means "change for good" or "continuous improvement."
It is a Japanese business philosophy concerning processes that continuously improve operations and involve all employees. Kaizen sees improvement in productivity as a gradual and methodical process: it refers to business activities that continuously improve all functions and involve all employees, from the CEO to the assembly-line workers. Kaizen is the Sino-Japanese word for "improvement", and it also applies to processes, such as purchasing and logistics, that cross organizational boundaries into the supply chain.
9 knowledge areas of Project Management
5 phases of Project Management
Agile vs. Waterfall
• Agile separates the project development lifecycle into sprints; Waterfall divides the software development process into distinct phases.
• Agile is quite a flexible method that allows changes to the project development requirements even after the initial planning has been completed; in Waterfall there is no scope for changing the requirements once project development starts.
• Agile follows an iterative development approach, so planning, development, prototyping and other software development phases may appear more than once; in the Waterfall model, all project development phases such as designing, development and testing are completed once.
• In Agile, the test plan is reviewed after each sprint; in Waterfall, the test plan is rarely discussed during the test phase.
• Agile development is a process in which the requirements are expected to change and evolve; Waterfall is ideal for projects with definite requirements where changes are not at all expected.
• Agile introduces a product mindset in which the software product satisfies the needs of its end customers and changes as per the customers' demands; Waterfall shows a project mindset and places its focus completely on accomplishing the project.
• Agile works exceptionally well with Time & Materials or non-fixed funding, but may increase stress in fixed-price scenarios; Waterfall reduces risk in firm fixed-price contracts by getting risk agreement at the beginning of the process.
• Agile prefers small but dedicated teams with a high degree of coordination and synchronization; in Waterfall, team coordination and synchronization are very limited.
• In Agile, the test team can take part in requirements changes without problems; in Waterfall, it is difficult for the test team to initiate any change in requirements.
Product Management (Role)
A PM takes holistic responsibility for the product, from the little details to the big picture. A PM
needs to set vision and strategy and finally define success and make decisions. It is a highly
collaborative role. The product manager usually serves as the main liaison between the engineering
and other roles such as design, quality assurance, user research, data analysts, marketing, sales,
customer support, business development, legal, content writers, other engineering teams, and the
executive team. It’s usually the job of the product manager to identify times when one of those
teams should be brought in, and to fill in for them if they don’t exist.
Function of a PM
The day-to-day work of a product manager varies over the course of the product life cycle. In the beginning, they figure out what to build; in the middle, they help the team make progress; and at the end, they prepare for the launch of the product.
While the product life cycle varies by company (and sometimes even by team), it usually follows
a general pattern of Research & Plan, Design, Implement & Test, and Release.
Design
Product design does not just mean user interface (UI) design or drawing out what the product will
look like. Product design is defining the features and functionality of the product. The PM’s role
in product design varies substantially between companies and teams.
Release
When the development process is finished, the product manager needs to make sure the launch
goes smoothly. The launch process varies from team to team but usually involves things like
running through the launch checklist, etc.
Difference between Project Manager and Product Manager
While some product managers have project management as a large part of their job, most do not.
Project managers are mostly concerned with timelines and coordination. While they might be
responsible for gathering the project requirements, they don’t have much say in identifying and
choosing the requirements.
Product managers are responsible for identifying problems and opportunities, picking which ones
to go after, and then making sure the team comes up with great solutions, either by thinking of the
solution themselves or by working with the designers and engineers. This is why product sense—
having the intuition to recognize the difference between a good product and a bad product—is so
important for product managers.
Discussion Topics
Impact of Technology on Jobs: Will Automation & Artificial Intelligence reduce jobs?
Technology intervention is inevitable in every sphere. It raises the bar of productivity, efficiency and safety to a level that is not achievable by humans alone. Adoption of technology, global reach and faster communication have overhauled manufacturing, servicing, product delivery and the employment associated with these sectors. But this is not the first time the world has experienced significant shifts in employment due to new technology. History shows that technology has been a creator of jobs and has opened new avenues. Whether the course will be the same this time is a debatable issue.
Link:
• https://www.hindustantimes.com/education/debate-robots-artificial-intelligence-will-make-
humans-jobless-in-50-years/story-beG3KbHf9VBnw4AsvdwQbJ.html
• https://www.skynettoday.com/editorials/ai-automation-job-loss
How will the Data Protection Act change the way data is used today?
With technology influencing every facet of life around us, and the quantum of personal information
being shared online or offline, it has become essential, and at the same time crucial, to strike a
balance between the cultural revolution brought about by this very digital transformation and the
associated implications of personal data protection. With most organizations on a digitization
spree, PDPB is a valuable step towards a sustainable solution that would aid India in strengthening
its personal data security concerns and position, as well as empower and equip individuals to
manage their personal data. PDPB will serve as a model for ensuring that Indian citizens have
autonomy in the digital economy. It will also permit them to regain control over their personal
data. PDPB focuses on this mix, and seeks to establish an overarching data privacy framework by
standardizing collection, usage, storage and transmission of personal data, ensuring adequate
protection of personal data. The Bill also establishes an independent authority, the Data Protection
Authority of India (DPAI), which will be empowered to oversee the enforcement of the law.
Links:
• https://m.economictimes.com/tech/internet/changes-likely-in-proposed-data-privacy-rules-
only-critical-data-may-need-to-be-housed-in-india/articleshow/70355298.cms
• https://www.yelloveedub.com/blog/gdpr-regulation-change
Digital payments are secure and India is ready to go 100% cashless
There has been a massive expansion of the formal banking footprint over the last few years, especially due to the efforts of the Jan Dhan Yojana, a central government initiative; official figures show the number of bank account holders has doubled during this period. Indians are known worldwide for their IT skills, much of which is required in building the infrastructure needed for a cashless economy, so the brain power to create this infrastructure exists. India is also home to the phenomenal success story of the digital wallet and payments app Paytm, by all means one of the top unicorns in the world today, with a substantial valuation, alongside other gateways such as MobiKwik and PhonePe. Corruption can be controlled to a large degree with the
contraction of the cash economy. This will happen as all transactions will now get recorded via
digital transactions. A cashless economy will also be good for the social aspects of the economy.
Women will also now get their payments in their bank accounts, thus reducing their dependency
on men of the family, who usually control the household expenses. A cash economy is also the
backbone of several unorganized industries, employing millions of people. Shutting these
industries will be akin to taking away these peoples’ employment opportunities. It is true that
Indian tech personnel are responsible for several top tech giants worldwide, so we do have the
requisite talent to develop the infrastructure. But these same Indians thrived in an atmosphere
where corruption was minimal. Thus, the bureaucratic and government machinery will need much
cleaning before these people can make a similar impact here.
Links:
• https://itslyf.com/are-digital-payments-secure-enough-to-go-cashless/
• https://www.careerride.com/view/are-digital-payments-secure-enough-for-the-indian-
economy-to-go-cashless-30807.aspx
How is technology impacting the banking sector?
The biggest negative impact of technology is the loss of jobs, as automation has replaced a number of roles in the banking sector. Technology also brings the threat of cyber attack: through a single loophole in the system, millions of records can be lost in the blink of an eye. While these technologies save time, they can also make people careless, leading to the loss of personal details, as happened in 2016 when the debit card details of customers of several big banks were compromised.
Links:
• https://www.indiabix.com/group-discussion/how-is-technology-impacting-the-banking-
sector/
• https://bankinnovation.net/allposts/biz-lines/lending/the-impact-of-technology-on-banking-
revolution-or-evolution/
• https://www.information-age.com/technology-finance-banking-sector-123471800/
APPENDIX