COMP 20013 - Introduction to Computing

UNIT VII: TRENDS AND ISSUES IN INFORMATION AND COMMUNICATIONS


TECHNOLOGY (ICT)

OVERVIEW
This module covers the advancement and application of information technology. Some of the trends in information technology are cloud computing, mobile applications, analytics, the Internet of Things, and data security.

LEARNING OUTCOMES
At the end of this module, the student is expected to:

1. Demonstrate awareness of current ICT trends and social issues.
2. Explain current ICT trends and social issues and the impact they are having on society.
3. Initiate discipline and relate knowledge of ICT trends and issues to studies and work.

COURSE MATERIALS
TRENDS IN ICT

The 21st century has been defined by the application of and advancement in information technology. Information technology has become an integral part of our daily lives. According to the Information Technology Association of America, information technology is defined as "the study, design, development, application, implementation, support or management of computer-based information systems."
Information technology has served as a big change agent in different aspects of business and society. It has proven to be a game changer in resolving economic and social issues.

Some of the advanced developments in information technology are:


1. Cloud Computing
One of the most talked about concepts in information technology is cloud computing. Cloud computing is defined as the utilization of computing services, i.e. software as well as hardware, as a service over a network. Typically, this network is the internet.
More and more businesses around the world are turning to cloud computing to help
support their business development demands. Cloud services allow companies to offload data
management, backend development, and even design so that their talent can focus on
innovation. To achieve better IT results, companies must build or reconfigure the appropriate
policies and workflow for a cloud-based approach. Cloud computing is expected to continue
being one of the most vital future trends in information technology.
Cloud computing offers three broad types of services: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
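To make the idea of consuming hardware and storage "as a service" concrete, below is a minimal sketch using the AWS SDK for Python (boto3); the bucket and file names are hypothetical placeholders, and working AWS credentials are assumed to be configured.

```python
import boto3

# Rent storage over the network instead of buying hardware:
# upload a local file to an S3 bucket (an IaaS-style storage service).
# "my-company-backups" and "report.pdf" are made-up names for illustration.
s3 = boto3.client("s3")
s3.upload_file("report.pdf", "my-company-backups", "backups/report.pdf")

# List what is currently stored in the rented bucket.
response = s3.list_objects_v2(Bucket="my-company-backups")
for item in response.get("Contents", []):
    print(item["Key"], item["Size"])
```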

Some of the benefits of cloud computing are as follows:

1. Cloud computing reduces the IT infrastructure cost of the company.
2. Cloud computing promotes the concept of virtualization, which enables servers and storage devices to be utilized across the organization.
3. Cloud computing makes maintenance of software and hardware easier, as installation is not required on each end user's computer.
Some issues concerning cloud computing are privacy, compliance, security, legal concerns, abuse, and IT governance.

2. Internet of Things
The Internet of Things (IoT) is transforming our physical world into a complex and
dynamic system of connected devices on an unprecedented scale.
Advances in technology are making possible a more widespread adoption of IoT, from
pill-shaped micro-cameras that can pinpoint thousands of images within the body, to smart
sensors that can assess crop conditions on a farm, to the smart home devices that are
becoming increasingly popular. But what are the building blocks of IoT? And what are the
underlying technologies that drive the IoT revolution?
The explosive growth of the “Internet of Things” is changing our world and the rapid
drop in price for typical IoT components is allowing people to innovate new designs and
products at home.
Internet of Things (IoT) devices are rapidly making their way into corporate spaces.
From gathering new data to the automation of infrastructure, companies are finding many
benefits from adding connectivity and intelligence to physical infrastructure. According to
CompTIA, adding digital capabilities to everyday components will drastically increase the
scope of IT responsibilities.
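As a rough illustration of how a connected device reports data, the sketch below simulates a temperature sensor and sends one reading over the network; the endpoint URL and field names are hypothetical, and a real deployment would use the IoT platform's own API and authentication.

```python
import json
import random
import urllib.request

# Simulate one reading from a temperature sensor.
reading = {"device_id": "greenhouse-01", "temperature_c": round(random.uniform(18, 32), 1)}

# Send it to a (placeholder) data-collection endpoint as JSON.
request = urllib.request.Request(
    "https://example.com/iot/readings",          # hypothetical endpoint
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:  # requires a real endpoint to succeed
    print(response.status)
```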

3. Mobile Application
Another emerging trend within information technology is mobile applications (software applications on smartphones, tablets, etc.).
Mobile applications, or mobile apps, have been a success since their introduction. They are designed to run on smartphones, tablets and other mobile devices. They are available as downloads from various mobile platforms such as Apple, BlackBerry, and Nokia. Some mobile apps are available for free, whereas others involve a download cost. The revenue collected is shared between the app distributor and the app developer.

4. Human Computer Interaction


Human-computer interaction (HCI) is a multidisciplinary field of study focusing on the
design of computer technology and, in particular, the interaction between humans (the users)
and computers. While initially concerned with computers, HCI has since expanded to cover
almost all forms of information technology design.
HCI surfaced in the 1980s with the advent of personal computing, just as machines
such as the Apple Macintosh, IBM PC 5150 and Commodore 64 started turning up in homes
and offices in society-changing numbers. For the first time, sophisticated electronic systems
were available to general consumers for uses such as word processors, games units and
accounting aids. Consequently, as computers were no longer room-sized, expensive tools
exclusively built for experts in specialized environments, the need to create human-computer
interaction that was also easy and efficient for less experienced users became increasingly
vital. From its origins, HCI would expand to incorporate multiple disciplines, such as computer
science, cognitive science and human-factors engineering.
HCI soon became the subject of intense academic investigation. Those who studied
and worked in HCI saw it as a crucial instrument to popularize the idea that the interaction
between a computer and the user should resemble a human-to-human, open-ended dialogue.
Initially, HCI researchers focused on improving the usability of desktop computers (i.e.,
practitioners concentrated on how easy computers are to learn and use). However, with the
rise of technologies such as the Internet and the smartphone, computer use would
increasingly move away from the desktop to embrace the mobile world.

5. Data Analytics
The field of analytics has grown manyfold in recent years. Analytics is a process that helps in discovering informational patterns within data. The field of analytics is a combination of statistics, computer programming and operations research.
The field of analytics has shown growth in the areas of data analytics, predictive analytics and social analytics.
Data analytics is a tool used to support the decision-making process. It converts raw data into meaningful information.
Predictive analytics is a tool used to predict future events based on current and historical information.
Social media analytics is a tool used by companies to understand and accommodate customer needs.
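The difference between data analytics and predictive analytics can be sketched in a few lines; this is a minimal example assuming pandas and NumPy are installed, and the monthly revenue figures are invented for illustration.

```python
import numpy as np
import pandas as pd

# Data analytics: convert raw records into meaningful summary information.
sales = pd.DataFrame({"month": [1, 2, 3, 4, 5, 6],
                      "revenue": [120, 135, 150, 149, 170, 188]})  # made-up figures
print(sales["revenue"].describe())      # summary statistics to support decision-making

# Predictive analytics: use current and historical data to estimate a future value.
slope, intercept = np.polyfit(sales["month"], sales["revenue"], 1)  # fit a linear trend
print("Forecast for month 7:", slope * 7 + intercept)
```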
The ever-changing field of information technology has seen great advancement and changes in the last decade. From these emerging trends, it can be concluded that its influence on business is ever growing, and it will help companies to serve customers better.

6. Artificial Intelligence
Artificial intelligence (AI) requires significant computer resources (which can be
procured in the cloud); various algorithms allow learning (which can be baked into products
or provided as a service) and contextual awareness (which can come from IoT devices or
massive collections of data). By adding a layer of intelligence to the technical solutions they
are building, companies can both manage a more extensive IT architecture and solve a
broader range of problems.

7. Data Security
Data security is one of the top trending technologies in computer science. As IT services rely on digital technology to work faster, data security becomes a top priority. It is difficult to improve security efforts when technology is updating so rapidly. Many businesses have increased investments in security, but beyond the technical aspects, organizations will also begin building business processes that enhance security. In order to adapt to rapid IT development, companies will have to shift their security mindset from technology-based defenses to proactive steps that include technology, process, and education. Data security will remain important among the latest technology trends in information technology.

ISSUES IN ICT
1. Data Privacy
Data privacy refers to the act of providing the integrity, confidentiality, and availability of personal information that is collected, stored and processed. Data privacy, also called information privacy, is the aspect of IT that deals with the ability an organization or individual has to determine what data in a computer system can be shared with third parties.
Data privacy is challenging since it attempts to use data while protecting an individual's
privacy preferences and personally identifiable information. The fields of computer security,
data security, and information security all design and use software, hardware and human
resources to address this issue.
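One small technical measure that supports confidentiality is pseudonymizing personally identifiable information before it is stored; the sketch below is only an illustration, and the salt value is a made-up constant rather than a real secret.

```python
import hashlib

SALT = b"example-salt-value"  # hypothetical; a real system would manage this as a secret

def pseudonymize(value: str) -> str:
    """Return a one-way SHA-256 digest of the salted value."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()

# Store the digest instead of the raw email, so records can still be
# counted or joined without exposing the personal information itself.
print(pseudonymize("juan.delacruz@example.com"))
```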
To ensure data privacy, the Philippines passed Republic Act No. 10173, also known as the Data Privacy Act of 2012.

Figure 7.1. RA 10173 - Data Privacy Act Infographics



Figure 7.2. Data Privacy Rights (Infographic by Jessa Malapit)

2. Cybersecurity
The cybersecurity challenge is two-fold: first, cyberattacks are growing in size and sophistication; second, millions of cybersecurity jobs remain unfilled.
Organizations cannot take IT security lightly. An analysis of worldwide identity and
access management by the International Data Corporation revealed that 55% of consumers
would switch platforms or providers due to the threat of a data breach, and 78% would switch
if a breach impacted them directly. Customers aren’t willing to put their data at risk.
The problem is there aren’t enough IT professionals with cybersecurity expertise. Forty
percent of IT decision-makers say they have cybersecurity skills gaps on their teams. It’s also
identified as the most challenging hiring area in IT.

There isn’t an immediate solution to this problem, but a long-term fix is to build your
cyber workforce from the inside. Invest in cybersecurity training and upskill your current staff.
Hiring and outsourcing isn’t always a viable (or cheap) solution. Current IT professionals who
know the industry are more apt to transition into successful cybersecurity professionals.

Figure 7.3. Electronic and Cybercrime Prevention Act



UNIT ASSESSMENTS
1. What new technology coming out in the next 10 years do you think will disrupt the global IT
industry?
2. Make an analysis of how cybersecurity is being implemented in the Philippines.
3. From recent technology updates, what new devices are being connected to the internet?
4. How do students apply the concept of cloud computing?
5. Give examples of cybersecurity attacks which became headlines in the past year
(Philippines or abroad)
6. Related to question #5, give an example very specific to intrusion of data privacy.
7. Give examples of data security measures which are being implemented in certain institutions, e.g. banks, schools, offices.
8. Identify and discuss one or two applications of the Internet of Things that you think might be useful in this time of health crisis.
9. If you are to create a mobile application, conceptualize an application that might be effective
in this situation of health crisis.
10. Why do you think access to correct and accurate data is essential these days with respect
to politics, health, world events, and the like?

References:
https://www.interaction-design.org/literature/topics/human-computer-interaction
https://online.stanford.edu/courses/xee100-introduction-internet-things
https://www.globalknowledge.com/us-en/resources/resource-library/articles/12-challenges-facing-it-professionals/#2
https://www.bizvibe.com/blog/it-solutions-outsourcing/latest-technology-trends-information-technology/
https://www.managementstudyguide.com/emerging-trends-in-information-technology.htm
https://www.coursera.org/learn
https://insidemanila.ph/article/293/heres-what-we-know-so-far-about-the-dfa-data-breach
https://www.privacy.gov.ph/data-privacy-act/
https://lawphil.net/statutes/repacts/ra2012/ra_10175_2012.html
COMP 20013 - Introduction to Computing

UNIT VIII: SPECIAL INTEREST TOPICS IN ICT

OVERVIEW
This module gives an introduction to three special interest topics related to information technology: Artificial Intelligence (AI), Data Science, and Social Networking and Society.
The topic on artificial intelligence defines AI, lists the milestones in AI's history, and explains two buzzwords related to AI: machine learning and deep learning. It also discusses the different fields where we see applications of AI.
The topic on data science, on the other hand, discusses how this field of science came about. The emergence of big data and the need to analyze this huge amount of data prompted the beginnings of data science. The topic also explains the roles and skills of a data scientist.
Spending time on social networking sites has become a part of almost everybody's daily routine. The topic on social networking delves into the pros and cons of social media. It also briefly discusses the most popular social networking sites.

PART 1: ARTIFICIAL INTELLIGENCE


LEARNING OUTCOMES

At the end of this module, the student is expected to:


1. Explain the difference between AI, machine learning, and deep learning.
2. Provide applications of AI in different industries and in daily use.
3. Identify important milestones in the history of AI.
4. Explain supervised learning, unsupervised learning, and other concepts related to AI.

COURSE MATERIALS
Artificial Intelligence
(https://www.investopedia.com/terms/a/artificial-intelligence-ai.asp
https://www.britannica.com/technology/artificial-intelligence
https://pathmind.com/wiki/ai-vs-machine-learning-vs-deep-learning)

Artificial intelligence (AI) refers to the simulation of human intelligence in machines that
are programmed to think like humans and mimic their actions. AI is frequently applied to the
project of developing systems endowed with the intellectual processes characteristic of humans,
such as the ability to reason, discover meaning, generalize, or learn from past experience.

John McCarthy, widely recognized as one of the godfathers of AI, defined it as “the science
and engineering of making intelligent machines.”

Other definitions of artificial intelligence:

• A branch of computer science dealing with the simulation of intelligent behavior in


computers.
• The capability of a machine to imitate intelligent human behavior.

• A computer system able to perform tasks that normally require human intelligence, such
as visual perception, speech recognition, decision-making, and translation between
languages.

History of Artificial Intelligence


(https://www.javatpoint.com/history-of-artificial-intelligence)

o Year 1943: The first work which is now recognized as AI was done by Warren McCulloch and Walter Pitts in 1943. They proposed a model of artificial neurons.
o Year 1949: Donald Hebb demonstrated an updating rule for modifying the connection
strength between neurons. His rule is now called Hebbian learning.
o Year 1950: Alan Turing, an English mathematician, pioneered machine learning in 1950. Turing published "Computing Machinery and Intelligence," in which he proposed a test that can check a machine's ability to exhibit intelligent behavior equivalent to human intelligence, now called the Turing test.
o Year 1955: Allen Newell and Herbert A. Simon created the "first artificial intelligence program," which was named the "Logic Theorist." This program proved 38 of 52 mathematical theorems and found new and more elegant proofs for some theorems.
o Year 1956: The term "Artificial Intelligence" was first adopted by American computer scientist John McCarthy at the Dartmouth Conference. For the first time, AI was established as an academic field.
o Year 1966: Researchers emphasized developing algorithms that could solve mathematical problems. Joseph Weizenbaum created the first chatbot in 1966, which was named ELIZA.
o Year 1972: The first intelligent humanoid robot, named WABOT-1, was built in Japan.
o The period between 1974 and 1980 was the first AI winter. An AI winter refers to a period in which computer scientists dealt with a severe shortage of government funding for AI research.
o During AI winters, public interest in artificial intelligence decreased.
o Year 1980: After the first AI winter, AI came back with "expert systems". Expert systems were programs that emulate the decision-making ability of a human expert.
o In 1980, the first national conference of the American Association of Artificial Intelligence was held at Stanford University.
o The period between 1987 and 1993 was the second AI winter.
o Investors and governments again stopped funding AI research because of the high cost and the lack of efficient results. Expert systems such as XCON, though initially cost effective, became very expensive to maintain.
o Year 1997: In 1997, IBM's Deep Blue beat world chess champion Garry Kasparov and became the first computer to beat a world chess champion.
o Year 2002: For the first time, AI entered the home in the form of Roomba, a robotic vacuum cleaner.
o Year 2006: By 2006, AI had entered the business world. Companies like Facebook, Twitter, and Netflix started using AI.
o Year 2011: In 2011, IBM's Watson won Jeopardy!, a quiz show in which it had to solve complex questions as well as riddles. Watson proved that it could understand natural language and solve tricky questions quickly.

o Year 2012: Google launched "Google Now," an Android app feature that was able to provide information to the user as predictions.
o Year 2014: In 2014, the chatbot "Eugene Goostman" won a competition based on the famous Turing test.
o Year 2018: The "Project Debater" from IBM debated on complex topics with two master
debaters and also performed extremely well.
o Google has demonstrated an AI program "Duplex" which was a virtual assistant and which
had taken hairdresser appointment on call, and lady on other side didn't notice that she
was talking with the machine.

Applications of Artificial Intelligence


(https://www.javatpoint.com/application-of-ai
https://www.valluriorg.com/blog/artificial-intelligence-and-its-applications/)

Figure 8.1 Applications of AI


(Image Source: gettingsmart.com)

The applications for artificial intelligence are endless. The technology can be applied to
many different sectors and industries.

AI in Healthcare: Companies are applying machine learning to make better and faster diagnoses
than humans. One of the best-known technologies is IBM’s Watson. It understands natural
language and can respond to questions asked of it. The system mines patient data and other
available data sources to form a hypothesis, which it then presents with a confidence scoring
schema.

AI in Finance: The finance industry is implementing automation, chatbots, adaptive intelligence, algorithmic trading, and machine learning into financial processes. AI is used to detect and flag activity in banking and finance such as unusual debit card usage and large account deposits, all of which help a bank's fraud department. Applications of AI are also being used to help streamline trading and make it easier. This is done by making the supply, demand, and pricing of securities easier to estimate.
AI in Business: Robotic process automation is being applied to highly repetitive tasks normally
performed by humans. Machine learning algorithms are being integrated into analytics and CRM
(Customer relationship management) platforms to uncover information on how to better serve
customers. Chatbots have already been incorporated into websites and e-commerce platforms to provide immediate service to customers. Automation of job positions has also become a talking point among academics and IT consultancies.
AI in Education: It automates grading, giving educators more time. It can also assess students
and adapt to their needs, helping them work at their own pace.

AI in Automotive Industry: Some automotive companies are using AI to provide virtual assistants to their users for better performance. For example, Tesla has introduced TeslaBot, an intelligent virtual assistant. Various companies are currently working on developing self-driving cars, which can make journeys safer and more secure. Just like humans, self-driving cars need sensors to understand the world around them and a brain to collect and process information and choose specific actions based on the information gathered. Autonomous vehicles are equipped with advanced tools to gather information, including long-range radar, cameras, and LiDAR (light detection and ranging).

AI in Gaming: AI can be used for gaming purposes. AI machines can play strategic games like chess, where the machine needs to think of a large number of possible positions.
AI in Data Security: The security of data is crucial for every company, and cyber-attacks are growing very rapidly in the digital world. AI can be used to make data safer and more secure. Examples such as the AEG bot and the AI2 platform are used to detect software bugs and cyber-attacks more effectively.

AI in Social Media: Social media sites such as Facebook, Twitter, and Snapchat contain billions of user profiles, which need to be stored and managed in a very efficient way. AI can organize and manage massive amounts of data. AI can analyze lots of data to identify the latest trends, hashtags, and the requirements of different users.

AI in Travel & Transport: AI is becoming highly demanding for travel industries. AI is capable
of doing various travel related works such as from making travel arrangement to suggesting the
hotels, flights, and best routes to the customers. Travel industries are using AI-powered chatbots
which can make human-like interaction with customers for better and fast response.

AI in Robotics: Artificial intelligence has a remarkable role in robotics. Usually, general robots are programmed to perform some repetitive task, but with the help of AI we can create intelligent robots which can perform tasks from their own experience without being pre-programmed. Humanoid robots are the best examples of AI in robotics; recently, the intelligent humanoid robots Erica and Sophia have been developed, which can talk and behave like humans.

AI in Entertainment: We currently use some AI-based applications in our daily lives through entertainment services such as Netflix or Amazon. With the help of ML/AI algorithms, these services show recommendations for programs or shows. The role of AI in film, television and media can also be felt in marketing and advertising, personalization of user experience, and search optimization. (https://emerj.com/ai-sector-overviews/ai-in-movies-entertainment-visual-media/)

Machine learning (ML)


(https://www.deeplearningbook.org/contents/ml.html)

Machine Learning is seen as a subset of artificial intelligence. It is the study of computer


algorithms that improve automatically through experience. Machine learning algorithms build a
mathematical model based on sample data, known as "training data", in order to make
predictions or decisions without being explicitly programmed to do so.
Machine learning enables us to tackle tasks that are too difficult to solve with fixed
programs written and designed by human beings. From a scientific and philosophical point of
view, machine learning is interesting because developing our understanding of it entails
developing our understanding of the principles that underlie intelligence.

Learning Algorithms
(https://www.deeplearningbook.org/contents/ml.html)

A machine learning algorithm is an algorithm that is able to learn from data.


Learning has been defined by Mitchell (1997) as follows: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E."

In this relatively formal definition of the word “task,” the process of learning itself is not the
task. Learning is our means of attaining the ability to perform the task. For example, if we want a
robot to be able to walk, then walking is the task. We could program the robot to learn to walk, or
we could attempt to directly write a program that specifies how to walk manually.

Categories of Machine Learning


(https://en.wikipedia.org/wiki/Machine_learning Bishop, C.M. (2006), Pattern Recognition and Machine Learning)

Machine learning approaches are traditionally divided into three broad categories, depending
on the nature of the "signal" or "feedback" available to the learning system:

• Supervised learning: The computer is presented with example inputs and their desired
outputs, given by a "teacher", and the goal is to learn a general rule that maps inputs to
outputs.
• Unsupervised learning: No labels are given to the learning algorithm, leaving it on its own to
find structure in its input. Unsupervised learning can be a goal in itself (discovering hidden
patterns in data) or a means towards an end (feature learning).
• Reinforcement learning: A computer program interacts with a dynamic environment in which
it must perform a certain goal (such as driving a vehicle or playing a game against an
opponent). As it navigates its problem space, the program is provided feedback that's
analogous to rewards, which it tries to maximize.
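The distinction between supervised and unsupervised learning is easier to see in code; the following is a minimal sketch assuming scikit-learn is installed, with tiny invented data points used only for illustration.

```python
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

X = [[1, 1], [2, 1], [8, 9], [9, 8]]   # example inputs (feature vectors)
y = [0, 0, 1, 1]                       # desired outputs ("labels") from a teacher

# Supervised learning: learn a general rule that maps inputs to outputs.
classifier = LogisticRegression().fit(X, y)
print(classifier.predict([[1.5, 1.0], [8.5, 9.0]]))   # apply the learned rule to new inputs

# Unsupervised learning: no labels are given; find structure in the input on its own.
clusters = KMeans(n_clusters=2, n_init=10).fit(X)
print(clusters.labels_)                # hidden grouping discovered in the data
```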

Deep learning
(https://www.investopedia.com/terms/d/deep-learning.asp
https://orbograph.com/deep-learning-how-will-it-change-healthcare/)

Deep Learning is an artificial intelligence (AI) function that imitates the workings of the
human brain in processing data and creating patterns for use in decision making. Deep learning
is a subset of machine learning in artificial intelligence that has networks capable of learning
unsupervised from data that is unstructured or unlabeled. It is also known as deep neural learning or a deep neural network.

Deep learning, a subset of machine learning, utilizes a hierarchical level of artificial neural
networks to carry out the process of machine learning. The artificial neural networks are built like
the human brain, with neuron nodes connected together like a web. While traditional programs
build analysis with data in a linear way, the hierarchical function of deep learning systems enables
machines to process data with a nonlinear approach.

Figure 8.2. An illustration of a deep learning neural network (Source: University of Cincinnati)

Deep learning, also known as hierarchical learning or deep structured learning, is a type
of machine learning that uses a layered algorithmic architecture to analyze data.
In deep learning models, data is filtered through a cascade of multiple layers, with each
successive layer using the output from the previous one to inform its results. Deep learning
models can become more and more accurate as they process more data, essentially learning
from previous results to refine their ability to make correlations and connections.
Deep learning is loosely based on the way biological neurons connect with one another to process information in the brains of animals. Similar to the way electrical signals travel across the cells of living creatures, each subsequent layer of nodes is activated when it receives stimuli from its neighboring neurons.
In artificial neural networks (ANNs), the basis for deep learning models, each layer may
be assigned a specific portion of a transformation task, and data might traverse the layers multiple
times to refine and optimize the ultimate output.
These “hidden” layers serve to perform the mathematical translation tasks that turn raw
input into meaningful output.
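The layer-by-layer flow described above can be sketched with plain NumPy; the weights below are random placeholders rather than values learned from training data, so the example only shows how an input is transformed as it passes through a hidden layer to the output.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(4)                              # one input sample with 4 features

W1, b1 = rng.random((4, 3)), rng.random(3)     # hidden-layer weights and biases (placeholders)
W2, b2 = rng.random((3, 1)), rng.random(1)     # output-layer weights and biases (placeholders)

hidden = np.maximum(0, x @ W1 + b1)            # hidden layer: linear step + ReLU nonlinearity
output = 1 / (1 + np.exp(-(hidden @ W2 + b2))) # output layer: sigmoid squashes to (0, 1)
print(output)                                  # each layer uses the previous layer's output
```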

Watch:

Understanding Artificial Intelligence and its Future


https://www.youtube.com/watch?v=SN2BZswEWUA

Deep Learning in 5 Minutes


https://www.youtube.com/watch?v=6M5VXKLf4D4

Read:

Recent use of Machine Learning


https://ph.yahoo.com/news/covid-19-symptom-clusters-223755338.html

UNIT ASSESSMENTS/ACTIVITIES

1. Explain artificial intelligence


2. Differentiate machine learning from deep learning
3. Give other specific applications of AI in the fields of manufacturing, education, business
(those not mentioned above)
4. How is machine learning different from traditional programming?
5. How does Netflix use AI?
6. List down the applications of AI in games which were mentioned in the History of AI.
7. List down other games which applied AI which were not mentioned in the History of AI.
8. What do you understand by training data?
9. Differentiate supervised and unsupervised learning. You may give an example
10. Discuss what a Turing Test is.

References:
https://www.investopedia.com/terms/a/artificial-intelligence-ai.asp
https://www.britannica.com/technology/artificial-intelligence
https://pathmind.com/wiki/ai-vs-machine-learning-vs-deep-learning
https://www.javatpoint.com/history-of-artificial-intelligence
https://www.javatpoint.com/application-of-ai
https://www.valluriorg.com/blog/artificial-intelligence-and-its-applications/
https://emerj.com/ai-sector-overviews/ai-in-movies-entertainment-visual-media/
https://www.deeplearningbook.org/contents/ml.html
https://en.wikipedia.org/wiki/Machine_learning
Bishop, C.M. (2006), Pattern Recognition and Machine Learning
https://www.investopedia.com/terms/d/deep-learning.asp
https://orbograph.com/deep-learning-how-will-it-change-healthcare/

PART 2: DATA SCIENCE

LEARNING OUTCOMES

At the end of this module, the student is expected to:


1. Explain what the field of data science is
2. Identify the skills/expertise needed to be a data scientist
3. Discuss what big data is and how it relates to data science

COURSE MATERIALS

What Is Data Science?


(https://www.investopedia.com/terms/d/data-science.asp)

Data science provides meaningful information based on large amounts of complex data
or big data. Data science, or data-driven science, combines different fields of work in statistics
and computation to interpret data for decision-making purposes.

Data is drawn from different sectors, channels, and platforms including cell phones, social
media, e-commerce sites, healthcare surveys, and Internet searches. The increase in the amount
of data available opened the door to a new field of study based on big data—the massive data
sets that contribute to the creation of better operational tools in all sectors.

The continually increasing access to data is possible due to advancements in technology


and collection techniques. Individuals' buying patterns and behavior can be monitored and predictions made based on the information gathered.

However, the ever-increasing data is unstructured and requires parsing for effective
decision making. This process is complex and time-consuming for companies—hence, the
emergence of data science.

What is Big Data?


(https://www.forbes.com/sites/peterpham/2015/08/28/the-impacts-of-big-data-that-you-may-not-have-heard-of)
(https://www.investopedia.com/terms/b/big-data.asp)

Historically, data was used as an ancillary to core business and was gathered for specific
purposes. Retailers recorded sales for accounting. Manufacturers recorded raw materials for
quality management. But as the demand for Big Data analytics emerged, data no longer serves
only its initial purpose. Companies able to access huge amounts of data possess a valuable asset
that when combined with the ability to analyze it, has created a whole new industry.
Big data refers to the large, diverse sets of information that grow at ever-increasing rates.
It encompasses the volume of information, the velocity or speed at which it is created and
collected, and the variety or scope of the data points being covered. Big data often comes from
multiple sources and arrives in multiple formats.
Successful players in Big Data are well recognized by the market. Some examples of companies with big data are Amazon, Facebook, Google, Twitter, and SAP, to name a few.

History of Data Science


(https://www.forbes.com/sites/gilpress/2013/05/28/a-very-short-history-of-data-science/#5ca4865a55cf)

1962 John W. Tukey writes in “The Future of Data Analysis”: … Data analysis, and the parts of
statistics which adhere to it, must…take on the characteristics of science rather than those of
mathematics… data analysis is intrinsically an empirical science
1974 Peter Naur publishes Concise Survey of Computer Methods in Sweden and the United
States. Naur offers the following definition of data science: “The science of dealing with data,
once they have been established, while the relation of the data to what they represent is delegated
to other fields and sciences.”
1977 The International Association for Statistical Computing (IASC) is established as a Section
of the ISI. “It is the mission of the IASC to link traditional statistical methodology, modern computer
technology, and the knowledge of domain experts in order to convert data into information and
knowledge.”

1989 Gregory Piatetsky-Shapiro organizes and chairs the first Knowledge Discovery in
Databases (KDD) workshop. In 1995, it became the annual ACM SIGKDD Conference on
Knowledge Discovery and Data Mining (KDD).

1996 Members of the International Federation of Classification Societies (IFCS) meet in Kobe,
Japan, for their biennial conference. For the first time, the term “data science” is included in the
title of the conference (“Data science, classification, and related methods”).

1997 In his inaugural lecture for the H. C. Carver Chair in Statistics at the University of Michigan,
Professor C. F. Jeff Wu (currently at the Georgia Institute of Technology), calls for statistics to be
renamed data science and statisticians to be renamed data scientists.

May 2005 Thomas H. Davenport, Don Cohen, and Al Jacobson publish “Competing on Analytics,”
a Babson College Working Knowledge Research Center report, describing “the emergence of a
new form of competition based on the extensive use of analytics, data, and fact-based decision
making... Instead of competing on traditional factors, companies are beginning to employ
statistical and quantitative analysis and predictive modeling as primary elements of competition.

July 2008 “The Skills, Role & Career Structure of Data Scientists & Curators: Assessment of
Current Practice & Future Needs,” defines data scientists as “people who work where the
research is carried out--or, in the case of data centre personnel, in close collaboration with the
creators of the data--and may be involved in creative enquiry and analysis, enabling others to
work with digital data, and developments in database technology.”

January 2009 Hal Varian, Google’s Chief Economist, tells the McKinsey Quarterly: “The ability to
take data—to be able to understand it, to process it, to extract value from it, to visualize it, to
communicate it—that’s going to be a hugely important skill in the next decades… Because now
we really do have essentially free and ubiquitous data. So the complimentary scarce factor is the
ability to understand that data and extract value from it… I do think those skills—of being able to
access, understand, and communicate the insights you get from data analysis—are going to be
extremely important.

May 2011 David Smith writes in "’Data Science’: What's in a name?”: “The terms ‘Data Science’
and ‘Data Scientist’ have only been in common usage for a little over a year, but they've really
taken off since then: many companies are now hiring for ‘data scientists’, and entire conferences
are run under the name of ‘data science’

September 2011 D.J. Patil writes in “Building Data Science Teams”: “Starting in 2008, Jeff
Hammerbacher (@hackingdata) and I sat down to share our experiences building the data and
analytics groups at Facebook and LinkedIn. In many ways, that meeting was the start of data
science as a distinct professional specialization.

2012 Tom Davenport and D.J. Patil publish “Data Scientist: The Sexiest Job of the 21st Century”
in the Harvard Business Review

The Data Scientist


(https://searchenterpriseai.techtarget.com/definition/data-scientist
https://towardsdatascience.com/how-data-science-will-impact-future-of-businesses-7f11f5699c4d)

A data scientist is a professional responsible for collecting, analyzing and interpreting


extremely large amounts of data. The data scientist role is an offshoot of several traditional
technical roles, including mathematician, scientist, statistician and computer professional. This
job requires the use of advanced analytics technologies, including machine learning and
predictive modeling.

Figure 8.3 Domains of Data Science


(Image Source: kainos.com)

Since data scientists have an in-depth understanding of data, they work very well in moving
organizations towards deep learning, machine learning, and AI adoption as these companies
generally have the same data-driven aims. They also help in software development services for software that involves large amounts of data and analytics.

Data scientists help companies of all sizes to figure out the ways to extract useful
information from an ocean of data to help optimize and analyze their organizations based on these
findings. Data scientists focus on asking data-centric questions, analyzing data, and applying
statistics & mathematics to find relevant results.

Data scientists have their background in statistics & advanced mathematics, AI and
advanced analysis & machine learning. For companies that want to run an AI based project, it is
crucial to have a data scientist on the team in order to customize algorithms, make the most of
their data, and weigh data-centric decisions.

UNIT ASSESSMENTS/ACTIVITIES
1. What fields of science are associated with data science?
2. Why do you think there is a lack of data scientists in the industry?
3. What are the usual sources of big data?
4. What data do you provide by using your social networking account, e.g. Facebook?

References:
https://www.investopedia.com/terms/d/data-science.asp
https://www.forbes.com/sites/peterpham/2015/08/28/the-impacts-of-big-data-that-you-may-not-have-heard-of
https://www.investopedia.com/terms/b/big-data.asp
https://www.forbes.com/sites/gilpress/2013/05/28/a-very-short-history-of-data-science
https://searchenterpriseai.techtarget.com/definition/data-scientist
https://towardsdatascience.com/how-data-science-will-impact-future-of-businesses

PART 3: SOCIAL NETWORKING AND SOCIETY

LEARNING OUTCOMES
At the end of this module, the student is expected to:
1. Discuss where each of the popular social media sites is commonly used
2. Analyze the benefits of social media to society
3. Discuss the disadvantages of social media

COURSE MATERIALS
What is Social Networking

A social networking service (also social networking site or social media) is an online platform
which people use to build social networks or social relationships with other people who share
similar personal or career interests, activities, backgrounds or real-life connections. Social
networking sites allow users to share ideas, digital photos and videos, posts, and to inform others
about online or real-world activities and events with people in their network.

Popular Social Media Sites


(Source: A Study on Positive and Negative Effects of Social Media on Society, W.Akram, R.Kumar, 2017)

The following are the most popular social media sites

Facebook. This is the largest social media network on the Internet, both in terms of total number of users and name recognition. Facebook came into existence on February 4, 2004, and within 12 years has managed to collect more than 1.59 billion monthly active users, which automatically makes it one of the best mediums for connecting people from all over the world with your business. It is estimated that more than 1 million small and medium-sized businesses use the platform to advertise their business.
Twitter. We might think that restricting our posts to 140 characters is no way to advertise our business, but we will be shocked to know that this social media platform has more than 320 million active monthly users who make use of the 140-character limit to pass on information. Businesses can use Twitter to interact with prospective clients, answer questions, release the latest news and, at the same time, use targeted ads with specific audiences. Twitter was founded on March 21, 2006, and has its headquarters in San Francisco, California.
Google+. Google+ is one of the popular social media sites these days. Its SEO value alone makes it a must-use tool for any small business. Google+ was launched on December 15, 2011, and has joined the major leagues, enlisting 418 million active users as of December 2015.
YouTube. YouTube, the biggest and most well-known video-based social networking site, was established on February 14, 2005, by three former PayPal employees. It was later purchased by Google in November 2006 for $1.65 billion. YouTube has more than 1 billion site visitors every month and is the second most popular search engine behind Google.

Pinterest. Pinterest is a relative newcomer in the social networking field. This platform consists of digital bulletin boards where businesses can pin their content. Pinterest announced in September 2015 that it had acquired 100 million users. Small businesses whose target audience is mostly made up of women should invest in Pinterest, as the greater part of its visitors are women.
Instagram. Instagram is a visual social media platform. The site has more than 400 million active users and is owned by Facebook. A significant number of its users use it to post information about travel, fashion, food, art and similar subjects. The platform is also distinguished by its unique filters together with video and photo editing features. Almost 95 percent of Instagram users also use Facebook.
Tumblr. Tumblr is one of the most difficult social networking platforms to use, but at the same time it is one of the most interesting sites. The platform permits several different post formats, including quote posts, chat posts, video and photo posts as well as audio posts, so you are never limited in the kind of content that you can share. Like Twitter, reblogging, which is similar to retweeting, is quick and easy. The social networking site was founded by David Karp in February 2007 and at present hosts more than 200 million blogs.
Flickr. Flickr, pronounced "flicker," is an online image and video hosting platform that was created by the then Vancouver-based Ludicorp on February 10, 2004, and later acquired by Yahoo in 2005. The platform is popular with users who share and embed photographs. Flickr has more than 112 million users and has its footprint in more than 63 countries. Millions of photographs are shared daily on Flickr.
Reddit. This is a social news and entertainment networking site where registered users can submit content, such as direct links and text posts. Users are also able to organize and determine their position on the site's pages by voting submissions up or down. Submissions with the most votes appear in the top category or on the main page.
Snapchat. Snapchat is an image messaging application that was created by Reggie Brown, Evan Spiegel and Bobby Murphy when they were students at Stanford University. The application was officially released in September 2011, and within a short span of time it grew hugely, recording an average of 100 million daily active users as of May 2015. More than 18 percent of all social media users use Snapchat.
WhatsApp. WhatsApp Messenger is a cross-platform instant messaging client for smartphones, PCs and tablets. The application needs an Internet connection to send images, texts, documents, audio and video messages to other users that have the app installed on their devices. Launched in January 2010, WhatsApp Inc. was purchased by Facebook on February 19, 2014, for about $19.3 billion. Today, more than 1 billion people make use of the service to communicate with their friends, loved ones and even clients.
TikTok
(Source: https://www.commonsensemedia.org/blog/parents-ultimate-guide-to-tiktok
https://slate.com/technology/2018/09/tiktok-app-musically-guide.html)

TikTok is a Chinese video-sharing social networking service owned by ByteDance, a Beijing-based internet technology company founded in 2012 by Zhang Yiming. It is used to create short dance, lip-sync, comedy and talent videos. It lets you watch and share videos, often to a soundtrack of the top hits in music, right from your phone. As with the lip-synching app Dubsmash, users can watch and record videos of themselves lip-synching to popular music and sound bites.

Impact of Social Media on Society


Social media have had profound impacts on the modern world. According to a recent study by Allcott et al. (2020), Facebook has reached 2.3 billion monthly active users worldwide (Facebook 2018). The authors said that as of 2016, the average user was spending 50 minutes per day on Facebook and its sister platforms Instagram and Messenger (Facebook 2016). There may be no technology since television that has so dramatically reshaped the way people get information and spend their time (Allcott et al., 2020).
Although there are a number of positive effects of social media, there have, however, been speculations about its negative impact. The study by Allcott focused on Facebook and its effect on its users. It says that the results leave little doubt that Facebook provides large benefits for its users. Among the benefits: it is an important source of news and information, it is a source of entertainment, it is a means to organize a charity or an activist group, and it is a vital social lifeline for those who are otherwise isolated (Allcott et al., 2020).
On the downside, the conclusion of the study says, "We find that while (Facebook) deactivation makes people less informed, it also makes them less polarized by at least some measures, consistent with the concern that social media have played some role in the recent rise of polarization in the United States." The study further said that although the negative effects could be real concerns, they could be smaller than what might have been expected given prior studies and research on the topic (Allcott et al., 2020).

Akram and Kumar (2017) listed the positive effects of social media on society. They are:
➢ Connectivity – easier for people to connect with anyone regardless of location,
➢ Education – easy for experts and professionals to educate via social media, regardless of location and educational background, and it is also free,
➢ Help – a person's issues can be shared with a group for help and support,
➢ Information and updates – availability of the most recent happenings around the planet,
➢ Advertising – businesses can be promoted to a very wide audience,
➢ Noble causes – an effective way to solicit contributions for needy people, and
➢ Helps in building communities – people of different communities can connect to discuss and share related stuff.
While the negative effects are:
➢ Cyber harassment – because of anonymity on the net, it is exceptionally easy to bully people on the internet,
➢ Hacking – personal information can be stolen through hacking,
➢ Addiction – people spend much more time than is necessary and lose their sense of productiveness,
➢ Fraud and scams – fraudulent activities involving money come in many forms, and
➢ Reputation – damage to reputation by spreading false stories on the internet.

In general, social media has contributed positively to society in many ways. One advantage which everybody would be able to relate to would be our connectivity. Connecting with people has never been so easy. Friends and family we have not seen or talked with for quite some time suddenly become just a message away. It has given people more opportunities for socialization and for keeping updated on what's going on with friends, family, business
partners, or just mere acquaintances. The other advantages, on education, ease of sharing information, the help it provides through linking with people who can give guidance and assistance, and building communities, are major benefits that people enjoy with social media.
Some of the negative effects could be avoided by making sure our user profile is secure
so that it will not be available to people we do not know. It will also help to use strong passwords.
People in social media should also study and examine businesses and investment opportunities
being offered before entering into any deal online. It is necessary to be circumspect when dealing
with people we only talk with, most of the time, using only chats or messages. Setting a time limit for using social media should also be a good practice, as it makes us monitor our use and makes us conscious of just how much time we have already spent on social media.
Depending on the individual and their discipline in the use of social media, the benefits may outweigh the disadvantages, or the downside may overwhelm the advantages.

UNIT ASSESSMENTS
1. Give three social media sites and differentiate them
2. From the study made by Allcott, et.al, would you say that there are more harmful effects of
the use of Facebook?
3. From the advantages of social media (Akram & Kumar), give three which are most important
to you.
4. Which among the harmful effects of social media have you experienced? Elaborate on your answer.

References:
https://en.wikipedia.org/wiki/Social_networking_service
A Study on Positive and Negative Effects of Social Media on Society, W.Akram, R.Kumar, 2017
https://www.commonsensemedia.org/blog/parents-ultimate-guide-to-tiktok
https://slate.com/technology/2018/09/tiktok-app-musically-guide.html
The Welfare Effects of Social Media, by Hunt Allcott, Luca Braghieri, Sarah Eichmeyer and Matthew Gentzkow, American Economic Review 2020, 110(3): 629–676
https://doi.org/10.1257/aer.20190658

Final Assessment:

1. From among the ICT trends, issues, and special interest topics, which one interests you most and why? Discuss a specific/particular application of your selected ICT trend/issue/topic. Discuss the advantages and disadvantages of the specific ICT trend/issue/topic that you selected.
2. From among the ICT trends, issues, and special interest topics, which one do you think will have the biggest impact on the next 4 years of your life? Discuss why. Discuss a specific/particular application of your selected ICT trend/issue/topic. Discuss the advantages and disadvantages of the specific ICT trend/issue/topic that you selected.
3. From among the ICT trends, issues, and special interest topics, which one are you least interested in and why?
4. What is the most significant learning you have had so far from this course, Introduction to Computing? Discuss why.
