
ASSIGNMENT II

Artificial Neural Networks (Jaringan Syaraf Tiruan)

Name  : Ghifary Ahada Azra
NIM   : F1D021092
Class : A

PROBLEM:
Given the data:
X = [ 1, 0, 0, 0, 1, 0;
      0, 1, 1, 1, 1, 0;
      0, 0, 1, 0, 0, 1;
      0, 0, 1, 0, 1, 0;
      0, 1, 0, 0, 0, 1;
      1, 0, 1, 0, 1, 1;
      0, 0, 1, 1, 0, 0;
      0, 1, 0, 1, 0, 0;
      1, 0, 0, 1, 0, 1;
      0, 1, 1, 1, 1, 1;
      0, 0, 0, 1, 1, 0.9;
      0, 0, 0, 1, 0.8, 1;
      0, 0, 0, 1, 1, 0.75;
      0, 0, 0, 1, 1, 1 ]

class = [A, B, A, A, A, A, B, B, B, B, C, C, C, C]

A. Perceptron with a bipolar step activation function

B. Backpropagation with a sigmoid activation function

Solution:
A. Perceptron with a bipolar step activation function
import numpy as np

class Perceptron:
    def __init__(self, input_size):
        # Initialize the weights and bias with random values between -1 and 1
        self.weights = np.random.uniform(-1, 1, input_size)
        self.bias = np.random.uniform(-1, 1)

    def activation(self, x):
        # Bipolar step activation function
        return 1 if x >= 0 else -1

    def predict(self, inputs):
        # Weighted sum of the inputs plus the bias
        summation = np.dot(inputs, self.weights) + self.bias
        # Apply the bipolar step activation function
        return self.activation(summation)

    def train(self, training_data, labels, learning_rate, epochs):
        for epoch in range(epochs):
            for i in range(len(training_data)):
                inputs = training_data[i]
                target = labels[i]
                prediction = self.predict(inputs)
                error = target - prediction
                # Update the weights and bias based on the error
                self.weights += learning_rate * error * inputs
                self.bias += learning_rate * error

# Training data
x_train = np.array([
    [1, 0, 0, 0, 1, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [0, 0, 1, 0, 1, 0],
    [0, 1, 0, 0, 0, 1],
    [1, 0, 1, 0, 1, 1],
    [0, 0, 1, 1, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [1, 0, 0, 1, 0, 1],
    [0, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 1, 0.9],
    [0, 0, 0, 1, 0.8, 1],
    [0, 0, 0, 1, 1, 0.75],
    [0, 0, 0, 1, 1, 1]
])

clas = ['A', 'B', 'A', 'A', 'A', 'A', 'B', 'B', 'B', 'B', 'C', 'C', 'C', 'C']

# Numeric targets for the perceptron (A = -1, B = 0, C = 1).
# Note: a single bipolar step perceptron only outputs -1 or 1, so with one
# output unit it cannot fully separate three classes.
labels = np.array([{'A': -1, 'B': 0, 'C': 1}[c] for c in clas])

# Initialize the perceptron with the matching number of inputs
input_size = len(x_train[0])
perceptron = Perceptron(input_size)

# Train the perceptron
learning_rate = 0.1
epochs = 1000
perceptron.train(x_train, labels, learning_rate, epochs)

# Test the perceptron on the training data
for i, data in enumerate(x_train):
    prediction = perceptron.predict(data)
    if prediction == -1:
        predicted_class = 'A'
    elif prediction == 0:
        predicted_class = 'B'
    else:
        predicted_class = 'C'
    print(f"Data {i+1}: Predicted Class: {predicted_class}, Actual Class: {clas[i]}")
Output:
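As a quick check of how well the perceptron fits the training data, the per-sample predictions above can be summarized into a single training accuracy. The following is a minimal sketch, not part of the original solution; the helper perceptron_accuracy is hypothetical and assumes the perceptron, x_train, and clas objects defined above, with the same -1/0/1-to-letter mapping used when printing.

# Hypothetical helper: summarize the perceptron's predictions as training accuracy.
# Assumes perceptron, x_train, and clas from the listing above.
def perceptron_accuracy(perceptron, x_train, clas):
    mapping = {-1: 'A', 0: 'B', 1: 'C'}  # same mapping as the printout above
    correct = 0
    for data, actual in zip(x_train, clas):
        predicted = mapping[perceptron.predict(data)]
        if predicted == actual:
            correct += 1
    return correct / len(clas)

print(f"Training accuracy: {perceptron_accuracy(perceptron, x_train, clas):.2f}")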

B. Backpropagation with a sigmoid activation function

import tensorflow as tf

# Dictionary for converting the class labels into integers
label_dict = {'A': 0, 'B': 1, 'C': 2}
labels = np.array([label_dict[c] for c in clas])

# Build the neural network model
model = tf.keras.Sequential([
    # Hidden layer with a sigmoid activation function
    tf.keras.layers.Dense(4, activation='sigmoid', input_shape=(6,)),
    # Output layer with a softmax activation function
    tf.keras.layers.Dense(3, activation='softmax')
])

# Compile the model
model.compile(optimizer='sgd',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
epochs = 100
model.fit(x_train, labels, epochs=epochs, verbose=0)

# Test the model on the training data
predictions = model.predict(x_train)

# Map the predicted class index back to its letter label
inv_label_dict = {v: k for k, v in label_dict.items()}

for i, data in enumerate(x_train):
    predicted_class = inv_label_dict[np.argmax(predictions[i])]
    print(f"Data {i+1}: Predicted Class: {predicted_class}, Actual Class: {clas[i]}")
Output:
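To complement the per-sample printout, the overall fit of the backpropagation model on the training data can also be reported with Keras' built-in evaluation. This is a minimal sketch, not part of the original solution, assuming the model, x_train, and labels defined above.

# Evaluate the trained model on the training data (loss and accuracy).
# Assumes model, x_train, and labels from the listing above.
loss, accuracy = model.evaluate(x_train, labels, verbose=0)
print(f"Training loss: {loss:.4f}, training accuracy: {accuracy:.2f}")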
