MLWP Lab Experiments
Machine Learning Using Python Lab
Experiment-2

AIM: For a given set of training data examples stored in a .CSV file, implement and demonstrate the Candidate Elimination algorithm to output a description of the set of all hypotheses consistent with the training examples.

Introduction:
The Candidate Elimination algorithm incrementally builds the version space, given a hypothesis space H and a set E of examples. The examples are added one by one; each example possibly shrinks the version space by removing the hypotheses that are inconsistent with it. The algorithm does this by updating the general and the specific boundary for each new example. You can consider it an extended form of the Find-S algorithm that considers both positive and negative examples: positive examples are handled as in the Find-S algorithm, while negative examples are used to specialize the general boundary.

➢ Concept learning: Concept learning is basically the learning task of the machine (learn from training data).
➢ General Hypothesis: Does not specify any feature values; G = {'?', '?', '?', ...}, with one '?' per attribute.
➢ Specific Hypothesis: Specifies feature values; S = {'pi', 'pi', 'pi', ...}, where the number of pi values depends on the number of attributes.
➢ Version Space: An intermediate between the general and the specific hypothesis. It is not just one hypothesis but the set of all possible hypotheses consistent with the training data set.

Algorithm:
Step 1: Load the data set.
Step 2: Initialize the General Hypothesis and the Specific Hypothesis.
Step 3: For each training example:
Step 4: If the example is positive:
            if attribute_value == hypothesis_value:
                do nothing
            else:
                replace the attribute value with '?' (basically generalizing it)
Step 5: If the example is negative:
            make the general hypothesis more specific.

Data Set:

Sky   | AirTemp | Water | Forecast | EnjoySport
Sunny | Warm    | Warm  | Same     | Yes
Sunny | Warm    | Warm  | Same     | Yes
Rainy | Cold    | Warm  | Change   | No
Sunny | Warm    | Cool  | Change   | Yes

Program:

import csv

with open("trainingdata.csv") as f:
    csv_file = csv.reader(f)
    data = list(csv_file)

# initial specific hypothesis: the first training example, class label dropped
# (data[0] is the header row; it matches neither "Yes" nor "No" below,
# so the loop leaves it alone)
s = data[1][:-1]

# initial general hypothesis: one maximally general row per attribute
g = [['?' for i in range(len(s))] for j in range(len(s))]

for i in data:
    if i[-1] == "Yes":
        # positive example: generalize s wherever it disagrees
        for j in range(len(s)):
            if i[j] != s[j]:
                s[j] = '?'
                g[j][j] = '?'
    elif i[-1] == "No":
        # negative example: specialize g on the attributes where it disagrees with s
        for j in range(len(s)):
            if i[j] != s[j]:
                g[j][j] = s[j]
            else:
                g[j][j] = '?'
    print("\nSteps of Candidate Elimination Algorithm", data.index(i) + 1)
    print(s)
    print(g)

# keep only the rows of g that were actually specialized
gh = []
for i in g:
    for j in i:
        if j != '?':
            gh.append(i)
            break

print("\nFinal specific hypothesis:\n", s)
print("\nFinal general hypothesis:\n", gh)
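For reference, a minimal trainingdata.csv matching the data set above would contain a header row followed by the four examples:

Sky,AirTemp,Water,Forecast,EnjoySport
Sunny,Warm,Warm,Same,Yes
Sunny,Warm,Warm,Same,Yes
Rainy,Cold,Warm,Change,No
Sunny,Warm,Cool,Change,Yes

On this four-attribute input the program should converge to the boundaries below (the intermediate boundaries are also printed after each example):

Final specific hypothesis:
 ['Sunny', 'Warm', '?', '?']

Final general hypothesis:
 [['Sunny', '?', '?', '?'], ['?', 'Warm', '?', '?']]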
Experiment-3

AIM: Write a program to demonstrate the working of the decision tree based ID3 algorithm. Use an appropriate data set for building the decision tree and apply this knowledge to classify a new sample.

Introduction:
In decision tree learning, ID3 (Iterative Dichotomiser 3) is an algorithm invented by Ross Quinlan, used to generate a decision tree from a dataset. ID3 is the precursor to the C4.5 algorithm, and is typically used in the machine learning and natural language processing domains. The ID3 algorithm begins with the original set as the root node. On each iteration, it iterates through every unused attribute of the set and calculates the entropy (or the information gain) of that attribute. It then selects the attribute which has the smallest entropy (or the largest information gain). The set is then split or partitioned by the selected attribute to produce subsets of the data.

Steps of the ID3 algorithm:
➢ Select the root node (S) based on the lowest entropy and the highest information gain.
➢ On each iteration, the algorithm calculates the entropy and the information gain of every attribute, considering only attributes that are still unused.
➢ Select the node with the lowest entropy or the highest information gain.
➢ Then split set S to produce the subsets of the data.
➢ The algorithm continues to recur on each subset, making sure that the attributes are fresh (unused), and so creates the decision tree.

Data Set:

Outlook  | Temperature | Humidity | Windy | PlayTennis
Sunny    | Hot         | High     | FALSE | No
Sunny    | Hot         | High     | TRUE  | No
Overcast | Hot         | High     | FALSE | Yes
Rainy    | Mild        | High     | FALSE | Yes
Rainy    | Cool        | Normal   | FALSE | Yes
Rainy    | Cool        | Normal   | TRUE  | No
Overcast | Cool        | Normal   | TRUE  | Yes
Sunny    | Mild        | High     | FALSE | No
Sunny    | Cool        | Normal   | FALSE | Yes
Rainy    | Mild        | Normal   | FALSE | Yes
Sunny    | Mild        | Normal   | TRUE  | Yes
Overcast | Mild        | High     | TRUE  | Yes
Overcast | Hot         | Normal   | FALSE | Yes
Rainy    | Mild        | High     | TRUE  | No
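Before reading the program, it is worth working the root-node computation by hand. The short sketch below (not part of the lab program) computes the entropy of the full PlayTennis column, which has 9 Yes and 5 No labels out of 14 rows:

import math

# entropy of the class column: 9 "Yes" and 5 "No" out of 14 examples
p_yes, p_no = 9 / 14, 5 / 14
entropy_s = -(p_yes * math.log2(p_yes) + p_no * math.log2(p_no))
print(round(entropy_s, 3))   # prints 0.94

The information gain of an attribute is this value minus the weighted average entropy of the subsets produced by splitting on it; on this data set, Outlook gives the largest gain (about 0.247) and therefore becomes the root. Note that the program below actually uses the closely related gain ratio, which divides the gain by the split's intrinsic information.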
Program:

import numpy as np
import math
import csv

def read_data(filename):
    with open(filename, 'r') as csvfile:
        datareader = csv.reader(csvfile, delimiter=',')
        headers = next(datareader)
        metadata = []
        traindata = []
        for name in headers:
            metadata.append(name)
        for row in datareader:
            traindata.append(row)
    return (metadata, traindata)

class Node:
    def __init__(self, attribute):
        self.attribute = attribute
        self.children = []
        self.answer = ""

    def __str__(self):
        return self.attribute

def subtables(data, col, delete):
    # partition the data by the values of column col
    dict = {}
    items = np.unique(data[:, col])
    count = np.zeros((items.shape[0], 1), dtype=np.int32)

    for x in range(items.shape[0]):
        for y in range(data.shape[0]):
            if data[y, col] == items[x]:
                count[x] += 1

    for x in range(items.shape[0]):
        dict[items[x]] = np.empty((int(count[x]), data.shape[1]), dtype="|S32")
        pos = 0
        for y in range(data.shape[0]):
            if data[y, col] == items[x]:
                dict[items[x]][pos] = data[y]
                pos += 1
        if delete:
            dict[items[x]] = np.delete(dict[items[x]], col, 1)

    return items, dict

def entropy(S):
    items = np.unique(S)
    if items.size == 1:
        return 0

    counts = np.zeros((items.shape[0], 1))
    sums = 0

    for x in range(items.shape[0]):
        counts[x] = sum(S == items[x]) / (S.size * 1.0)

    for count in counts:
        sums += -1 * count * math.log(count, 2)
    return sums

def gain_ratio(data, col):
    items, dict = subtables(data, col, delete=False)

    total_size = data.shape[0]
    entropies = np.zeros((items.shape[0], 1))
    intrinsic = np.zeros((items.shape[0], 1))

    for x in range(items.shape[0]):
        ratio = dict[items[x]].shape[0] / (total_size * 1.0)
        entropies[x] = ratio * entropy(dict[items[x]][:, -1])
        intrinsic[x] = ratio * math.log(ratio, 2)

    total_entropy = entropy(data[:, -1])
    iv = -1 * sum(intrinsic)

    # information gain = class entropy minus weighted subset entropies
    for x in range(entropies.shape[0]):
        total_entropy -= entropies[x]
    return total_entropy / iv

def create_node(data, metadata):
    # pure node: every remaining example has the same class label
    if (np.unique(data[:, -1])).shape[0] == 1:
        node = Node("")
        node.answer = np.unique(data[:, -1])[0]
        return node

    gains = np.zeros((data.shape[1] - 1, 1))
    for col in range(data.shape[1] - 1):
        gains[col] = gain_ratio(data, col)

    split = np.argmax(gains)

    node = Node(metadata[split])
    metadata = np.delete(metadata, split, 0)

    items, dict = subtables(data, split, delete=True)

    for x in range(items.shape[0]):
        child = create_node(dict[items[x]], metadata)
        node.children.append((items[x], child))

    return node

def empty(size):
    # indentation string for pretty-printing the tree
    s = ""
    for x in range(size):
        s += "   "
    return s

def print_tree(node, level):
    if node.answer != "":
        print(empty(level), node.answer)
        return
    print(empty(level), node.attribute)
    for value, n in node.children:
        print(empty(level + 1), value)
        print_tree(n, level + 2)

metadata, traindata = read_data("tennisdata.csv")
data = np.array(traindata)
node = create_node(data, metadata)
print_tree(node, 0)

Experiment-4

AIM: A) Exercises to solve real-world problems using machine learning methods: Linear Regression.

Introduction:
Linear Regression is a type of regression algorithm that models the relationship between a dependent variable and a single independent variable. The relationship shown by a Simple Linear Regression model is linear (a sloped straight line), hence the name Simple Linear Regression. The key point in Simple Linear Regression is that the dependent variable must be a continuous/real value; the independent variable, however, can be measured on continuous or categorical values. The Simple Linear Regression algorithm has two main objectives:

o Model the relationship between the two variables, such as the relationship between income and expenditure, or experience and salary.
o Forecast new observations, such as weather forecasting according to temperature, or the revenue of a company according to the investments in a year.

Data Set:

Age | Income
25  | 25000
23  | 22000
24  | 26000
28  | 29000
34  | 38600
32  | 36500
42  | 41000
55  | 81000
45  | 47500
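The program that follows fits the line income = b0 + b1 · age using the closed-form least-squares estimates, computed directly from the deviations about the means:

b1 = (Σ xᵢyᵢ − n·x̄·ȳ) / (Σ xᵢ² − n·x̄²)
b0 = ȳ − b1·x̄

where x is age, y is income, x̄ and ȳ are their means, and n is the number of points.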
Program:

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

# To read data from the Age_Income.csv file
dataFrame = pd.read_csv('Age_Income.csv')

# To place the data into age and income vectors
age = dataFrame['Age']
income = dataFrame['Income']

# number of points
num = np.size(age)

# To find the means of the age and income vectors
mean_age = np.mean(age)
mean_income = np.mean(income)

# calculating cross-deviation and deviation about age
CD_ageincome = np.sum(income * age) - num * mean_income * mean_age
CD_ageage = np.sum(age * age) - num * mean_age * mean_age

# calculating regression coefficients
b1 = CD_ageincome / CD_ageage
b0 = mean_income - b1 * mean_age

# to display coefficients
print("Estimated Coefficients :")
print("b0 = ", b0, "\nb1 = ", b1)

# To plot the actual points as a scatter plot
plt.scatter(age, income, color="b", marker="o")

# To predict the response vector
response_Vec = b0 + b1 * age

# To plot the regression line
plt.plot(age, response_Vec, color="r")

# Placing labels
plt.xlabel('Age')
plt.ylabel('Income')

# To display the plot
plt.show()
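As a quick sanity check on the hand-computed coefficients, the same fit can be obtained with NumPy's built-in least-squares polynomial fit. This is a minimal sketch assuming the same Age_Income.csv file as above:

import numpy as np
import pandas as pd

# a degree-1 polynomial fit is exactly simple linear regression;
# np.polyfit returns the coefficients highest power first: (slope, intercept)
dataFrame = pd.read_csv('Age_Income.csv')
b1, b0 = np.polyfit(dataFrame['Age'], dataFrame['Income'], deg=1)
print("b0 =", b0, "\nb1 =", b1)   # should match the coefficients printed above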