
Machine Learning Assignment – I – Questions

Last date of submission: on or before 27.04.2022

1. Explain the steps for designing a learning system in detail.
2. How do you find a maximally specific hypothesis with the help of the Find-S algorithm?
3. Explain the Back-propagation learning algorithm with an example.
4. How does a multi-layered network learn using the gradient descent algorithm? Discuss.
5. State Bayes' theorem and illustrate it with an example.
6. Explain the Gibbs algorithm with an example.

1. Machine learning is inherently a ____________ field.

• Interdisciplinary
• Multi-disciplinary
• Single
• None

Answer: B

2. A computer program is said to learn from __________ E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

• Training
• Experience
• Database
• Algorithm

Answer: B
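
As a concrete reading of the T/P/E definition in question 2, the small sketch below instantiates it for the checkers example that question 18 refers to; the class and field names are illustrative only, not part of any standard API.

```python
from dataclasses import dataclass

# Illustrative instantiation of the T/P/E definition from question 2, filled in
# with the checkers example referenced later in question 18. Names are made up.
@dataclass
class LearningProblem:
    task: str          # T: the class of tasks
    performance: str   # P: how performance at tasks in T is measured
    experience: str    # E: the experience the program learns from

checkers = LearningProblem(
    task="playing checkers",
    performance="percent of games won against opponents",
    experience="playing practice games against itself",
)
print(checkers)
```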

3. __________ methods have been used to train computer-controlled vehicles to steer correctly when driving on a variety of road types.

• Machine Learning
• Data Mining
• Neural networks
• Robotics

Answer: A

4. "Any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples." This assumption is known as the __________.

• Hypothesis
• Inductive Hypothesis
• Learning
• Concept Learning

Answer: B

5. The _________ algorithm computes the version space containing all hypotheses from H that are consistent with an observed sequence of training examples.

• Inductive Hypothesis
• Artificial Neural Network
• Candidate Elimination
• None

Answer: C

6. The _________, denoted VS_{H,D}, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with the training examples in D.

• Space
• Vertical space
• Version space
• Version span

Answer: C
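
For concreteness, here is a brute-force "list-then-eliminate" sketch of the idea behind questions 5 and 6: the version space is just the subset of a (finite) hypothesis space that is consistent with every training example. This is a toy illustration under assumed names and data, not the full Candidate-Elimination algorithm with S and G boundaries.

```python
# Keep only the hypotheses in H that are consistent with every example in D.
# Hypotheses here are plain callables; all names and toy data are illustrative.

def is_consistent(h, examples):
    """True if hypothesis h classifies every (instance, label) pair correctly."""
    return all(h(x) == label for x, label in examples)

def version_space(hypotheses, examples):
    """The subset of the hypothesis space consistent with the training data."""
    return [h for h in hypotheses if is_consistent(h, examples)]

# Toy hypothesis space over single integer instances.
H = [
    lambda x: x > 0,       # "all positive numbers are positive examples"
    lambda x: x > 5,       # "only numbers greater than 5 are positive"
    lambda x: x % 2 == 0,  # "only even numbers are positive"
]
D = [(6, True), (3, False)]
print(len(version_space(H, D)))  # 2 hypotheses remain consistent with D
```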

7. Quinlan and Rivest (1989) describe experiments applying the MDL principle to choose the _______ for a decision tree.

• Best size
• Big size
• Small size
• Overfit

Answer: A

8. The Minimum Description Length principle is a version of _________ that can be interpreted within a Bayesian framework.

• ID3
• Selection measure
• Occam's razor
• PAC

Answer: C
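
As a rough illustration of the MDL idea behind questions 7 and 8, the sketch below scores hypothetical decision trees by total description length (bits to encode the tree plus bits to encode the examples it misclassifies) and prefers the smallest total; the tree sizes and error counts are invented purely for illustration.

```python
import math

def description_length(tree_bits, n_errors, n_examples):
    """Total code length: model cost plus the cost of listing the exceptions."""
    # Identifying each misclassified example costs about log2(n_examples) bits.
    return tree_bits + n_errors * math.log2(n_examples)

# (tree size in bits, training errors) for three hypothetical candidate trees.
candidates = {"small": (20, 12), "medium": (60, 3), "large": (200, 0)}
n_examples = 100
best = min(candidates, key=lambda name: description_length(*candidates[name], n_examples))
print(best)  # "medium": the best trade-off between tree size and fit to the data
```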

9. A perceptron takes a vector of real-valued inputs, calculates a linear combination of these inputs, then outputs ____________.

• 1 or -1
• 0 or 1
• -1 or 0
• none

Answer: A
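
A minimal sketch of the perceptron rule described in question 9: form a linear combination of the inputs and threshold it to 1 or -1. The weights, bias, and inputs below are made-up illustrative values.

```python
def perceptron_output(weights, bias, inputs):
    """Return 1 if w . x + bias > 0, otherwise -1."""
    activation = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation > 0 else -1

print(perceptron_output([0.5, -0.6], 0.1, [1.0, 2.0]))  # -1 for this toy input
```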

10. If the training examples are not linearly separable, the delta rule converges toward a __________ approximation to the target concept.

• Overfit
• Underfit
• Doesn't fit
• Best fit

Answer: D
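
The best-fit behaviour in question 10 can be sketched with the incremental delta rule (gradient descent on a linear unit), w <- w + eta * (t - o) * x. Even when the examples cannot be fit exactly, the weights settle toward a minimum-squared-error approximation. The data, learning rate, and epoch count below are illustrative choices, not prescribed values.

```python
import random

def delta_rule(examples, n_features, eta=0.01, epochs=200):
    w = [0.0] * (n_features + 1)            # last entry acts as the bias weight
    for _ in range(epochs):
        for x, t in examples:
            xb = list(x) + [1.0]            # constant input feeding the bias
            o = sum(wi * xi for wi, xi in zip(w, xb))
            w = [wi + eta * (t - o) * xi for wi, xi in zip(w, xb)]
    return w

random.seed(0)
# Noisy 1-D targets scattered around t = 2x + 1, so no exact linear fit exists.
data = [((x,), 2 * x + 1 + random.uniform(-0.5, 0.5)) for x in range(-5, 6)]
print(delta_rule(data, n_features=1))       # weights close to [2.0, 1.0]
```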

11. Theoretical results have been developed that characterize the fundamental relationship
among the number of __________________ examples observed.

Answer:

Training

12. The __________ of L is any minimal set of assertions B such that for any target concept c and corresponding training examples D_c.

Answer:

Inductive Bias

13. To apply the MDL principle in practice, we must choose _______________ appropriate for the given learning task.

Answer:

Specific encodings or representations

14. _____________________: Prefer the simplest hypothesis that fits the data.

Answer:
Occam’s razor

15. _____________ is a significant practical difficulty for decision tree learning and many other learning methods.

Answer:

Overfitting

16. One successful method for finding high-accuracy hypotheses is a technique called ___________.

Answer:

Post-pruning

17. __________ learning methods provide a robust approach to approximating real-valued, discrete-valued, and vector-valued target functions.

Answer:

Neural network

18. In learning to play checkers, the system might learn from ________ training examples consisting of individual checkers board states and the correct move for each.

Answer:

Direct

19. Learning algorithms often acquire only some approximation to the target function, and for this reason the process of learning the target function is often called __________________.

Answer:

Function approximation

20. PAC is an acronym for ____________________.

Answer:

Probably Approximately Correct
