Combine DLfile
There are 16 programs to explain the various concepts in Python programming, such as:
Syntax,
Loops,
if-else,
Data Structures,
Strings,
File Handling,
Exception Handling,
Random Numbers,
Command Line Arguments,
Use of Libraries
1 Hello World
Hello World
x = "akshita"
print(3 * x)
akshitaakshitaakshita
Learning: How to declare a variable, add, concatenate and print the result.
a = 10
b = 220
c = a + b  # Add two numbers
print(a, " + ", b, " --> ", c)
Assignment 2.1: WAP to add three numbers and print the result.
Assignment 2.2: WAP to concatenate three strings and print the result.
a = 100
b = 200
c = 500
f = a + b + c
print(f)
800
s = str(a) + str(b) + str(c)  # str() makes + concatenate the values as text
print(a, "+", b, "+", c, "-->", s)
# Run the program with (1) Two strings and (2) Two numbers
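A possible sketch for Assignment 2.2; the sample strings and variable names are my own:

```python
# Assignment 2.2 (sketch): concatenate three strings and print the result
s1 = "Good"
s2 = " Morning"
s3 = " India"
s = s1 + s2 + s3  # + joins strings end to end
print(s1, "+", s2, "+", s3, "-->", s)
```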
4 Loop
for i in range(0,10):
print ( i )
0
1
2
3
4
5
6
7
8
9
for i in range(0,20,2):
print ( i )
0
2
4
6
8
10
12
14
16
18
for i in range(0, -10, -1):
    print(i)
0
-1
-2
-3
-4
-5
-6
-7
-8
-9
for i in range(1,11):
print (5," * ", i , " = ", i * 5)
5 * 1 = 5
5 * 2 = 10
5 * 3 = 15
5 * 4 = 20
5 * 5 = 25
5 * 6 = 30
5 * 7 = 35
5 * 8 = 40
5 * 9 = 45
5 * 10 = 50
4.5.1 Version 1
s = 0
for i in range(1, 11):
    s = s + i
print("Sum is --> ", s)
Sum is -->  55
4.5.2 Version 2
Sum is --> 55
for i in range(1,11):
print(7," * ", i ," = ", i * 7)
7 * 1 = 7
7 * 2 = 14
7 * 3 = 21
7 * 4 = 28
7 * 5 = 35
7 * 6 = 42
7 * 7 = 49
7 * 8 = 56
7 * 9 = 63
7 * 10 = 70
for i in range(1,11):
print(9," * ", i ," = ", i * 9)
9 * 1 = 9
9 * 2 = 18
9 * 3 = 27
9 * 4 = 36
9 * 5 = 45
9 * 6 = 54
9 * 7 = 63
9 * 8 = 72
9 * 9 = 81
9 * 10 = 90
for i in range(1,11):
print(18," * ", i ," = ", i * 18)
18 * 1 = 18
18 * 2 = 36
18 * 3 = 54
18 * 4 = 72
18 * 5 = 90
18 * 6 = 108
18 * 7 = 126
18 * 8 = 144
18 * 9 = 162
18 * 10 = 180
if n % 2 == 0:
    print(n, " is even")
else:
    print(n, " is odd")

if f == 0:
    print("Prime")
else:
    print("Not Prime")
Enter a No: 5
Prime
if a == b:
    print("a == b")
elif a >= b:
    print("a > b")
else:
    print("a < b")
Assignment 5.1: WAP to find the max among three numbers input by the user. [Try the max() function]
Assignment 5.2: WAP to add all numbers divisible by 7 and 9 from 1 to n, where n is given by the user.
Assignment 5.3: WAP to add all prime numbers from 1 to n, where n is given by the user.
if i % 7 == 0 and i % 9 == 0:
sum_divisible += i
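Possible sketches for Assignments 5.1 and 5.3; the sample values and variable names are my own (the exercises read n from the user via input()):

```python
# Assignment 5.1 (sketch): max among three numbers
a, b, c = 12, 45, 7          # sample values; the exercise reads these via input()
big = max(a, b, c)           # built-in max() compares all three at once
print("Max -->", big)

# Assignment 5.3 (sketch): sum of all primes from 1 to n
n = 20                       # sample value
total = 0
for num in range(2, n + 1):
    f = 0                    # divisor flag, as in the prime check above
    for i in range(2, num):
        if num % i == 0:
            f = 1
            break
    if f == 0:               # no divisor found --> prime
        total += num
print("Sum of primes -->", total)
```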
6 Functions
def Add(a,b):
c=a+b
return c
Add(10,20) --> 30
Add(20,50) --> 70
Add(80,200) --> 280
def IsPrime(n):
for i in range(2, n//2 + 1):
if n%i==0: return 0
return 1
IsPrime(20) --> 0
IsPrime(23) --> 1
IsPrime(200) --> 0
IsPrime(37) --> 1
6.3 Add 1 to n
def AddN(n):
    s = sum(range(n + 1))
    return s
AddN(10) --> 55
AddN(20) --> 210
AddN(50) --> 1275
AddN(200) --> 20100
Assignment 6.1: WAP using a function that adds all odd numbers from 1 to n, where n is given by the user.
Assignment 6.2: WAP using a function that adds all prime numbers from 1 to n, where n is given by the user.
def sum_of_odds(n):
    """Calculate the sum of all odd numbers from 1 to n."""
    total_sum = 0
    for i in range(1, n + 1, 2):
        total_sum += i
    return total_sum

n = int(input("Enter the value of n: "))
result = sum_of_odds(n)
def is_prime(num):
    """Check if a number is prime."""
    if num <= 1:
        return False
    if num <= 3:
        return True
    if num % 2 == 0 or num % 3 == 0:
        return False
    i = 5
    while i * i <= num:
        if num % i == 0 or num % (i + 2) == 0:
            return False
        i += 6
    return True

n = int(input("Enter the value of n: "))
sum_primes = 0
for i in range(1, n + 1):
    if is_prime(i):
        sum_primes += i
print("The sum of all prime numbers from 1 to", n, "is", sum_primes)
7 Math library
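The code for this section did not survive extraction; a minimal sketch of common math-library calls (the selection of functions is my own):

```python
import math

print("sqrt(16) -->", math.sqrt(16))          # 4.0
print("floor(3.7) -->", math.floor(3.7))      # 3
print("ceil(3.2) -->", math.ceil(3.2))        # 4
print("pi -->", math.pi)
print("factorial(5) -->", math.factorial(5))  # 120
```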
8 Strings
# Strip + Split
print("var split --> ", var.strip().split(','))
https://colab.research.google.com/drive/1DEI24YMyWk0TdNh34fkFuMYUR0Ltfytl#scrollTo=V1oualoWhJWu&printMode=true 0/23
8/13/24, 10:11 PM Learn_Python.ipynb - Colab
var = "Indian Army"
print("String --> ", var)
print("var[::1] --> ", var[::1])
print("var[::2] --> ", var[::2])
print("var[::-1] --> ", var[::-1])
print("var[::-2] --> ", var[::-2])
var = var[::-1]
print("var after reverse --> ", var)
var[::-2] --> yr adI
var after reverse --> ymrA naidnI
8.9 Palindrome
s1 --> False
s2 --> True
s3 --> True
s4 --> False
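The palindrome code itself was lost in extraction; a sketch that reproduces the s1..s4 pattern of results above (the sample strings are my own):

```python
def is_palindrome(s):
    s = s.lower()        # ignore case
    return s == s[::-1]  # compare with its reverse slice

s1, s2, s3, s4 = "python", "madam", "Noon", "india"
for name, s in [("s1", s1), ("s2", s2), ("s3", s3), ("s4", s4)]:
    print(name, "-->", is_palindrome(s))
```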
9 Random Numbers/String
0.4192513686511823
0.8252917405886293
0.1728
70
34
6
6
96.94103499219948
10.376334587064289 -
6.087968259755884
-2.942454747413537
-1.12
1
import random as r
A=[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
[2 , 9, 5, 7]
[5 , 2]
[97 , 12]
[-87 , -24, 27, 25, 1]
import string as s  # needed for ascii_letters (imported explicitly in a later cell)
passwd = r.sample(s.ascii_letters, 6)
print("Selected Char --> ", passwd)
passwd1 = "".join(passwd)
print("passwd1 --> ", passwd1)
passwd2 = "+".join(passwd)
print("passwd2 --> ", passwd2)
passwd3 = "*".join(passwd)
print("passwd3 --> ", passwd3)
otp = r.sample(s.digits, 5)
print("Selected num1 --> ", otp)
otp = "".join(otp)
print("otp1 --> ", otp)

otp = r.sample(s.digits, 5)
print("Selected num2 --> ", otp)
otp = "".join(otp)
print("otp2 --> ", otp)

otp = r.sample(s.digits, 5)
print("Selected num3 --> ", otp)
otp = "".join(otp)
print("otp3 --> ", otp)
import string as s
import random as r
print("String + Digits --> ", s.ascii_letters + s.digits)

mixPasswd = r.sample(s.ascii_letters + s.digits, 5)
print("\nSelected Str1 --> ", mixPasswd)
mixPasswd = "".join(mixPasswd)
print("mixPasswd1 --> ", mixPasswd)

mixPasswd = r.sample(s.ascii_letters + s.digits, 6)
print("\nSelected Str2 --> ", mixPasswd)
mixPasswd = "".join(mixPasswd)
print("mixPasswd2 --> ", mixPasswd)

splChar = "#@!~%^&*()_+=-[]{}|"
mixPasswd = r.sample(splChar + s.ascii_letters + s.digits, 8)
print("\nSelected Str3 --> ", mixPasswd)
mixPasswd = "".join(mixPasswd)
print("mixPasswd3 --> ", mixPasswd)
Selected Str3 --> ['c', '8', 'C', '+', '4', '~', '-', 'A']
mixPasswd3 --> c8C+4~-A
10 Exception Handling
for i in range(-5,6):
print ("100/",i," --> ", 100/i)
for i in range(-5,6):
try:
print ("100/",i," --> ", 100/i)
    except:
        print("error")
10.3 Exception handling for list index out of range
L=[1,2,3,4,5]
for i in range(8):
try:
print (i," --> ",L[i])
except:
print ("error")
0 --> 1
1 --> 2
2 --> 3
3 --> 4
4 --> 5
error
error
error
Learning: How to use a list: add, delete and search in the list.
Note: Read more about list and try yourself
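A sketch of the add / delete / search operations mentioned above (the sample values are my own):

```python
L = [3, 6, 9]
L.append(12)        # add at the end
L.insert(1, 99)     # add at index 1
print("After add -->", L)

L.remove(99)        # delete by value
del L[0]            # delete by index
print("After delete -->", L)

print("9 in L -->", 9 in L)        # search by membership test
print("index of 9 -->", L.index(9))
```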
11.2 List Iteration
L = ["Pratham", 'Sharma', 3.14, 3]
print("Original List: ", L)
for i in range(0, len(L)):
    print(L[i])
del L[1]
print("List After Deleting --> ", L)
L = [3, 6, 9, 12, 5, 3, 2]
print("Original List --> ", L)
L.sort(reverse=True)
print("After Sort (Descending) --> ", L)
L1 = [3, 6, 9]
L2 = [12, 5, 3, 2]
L3 = L1 + L2
print("L1 --> ", L1)
print("L2 --> ", L2)
print("L3 --> ", L3)
L1 --> [3, 6, 9]
L2 --> [12, 5, 3, 2]
L3 --> [3, 6, 9, 12, 5, 3, 2]
L = [12, 5, 3, 2, 7]
print("Original List --> ", L)
newL = [ i * 5 for i in L ]
print ("After Multiply with constant --> ", newL)
Original List --> [12, 5, 3, 2, 7]
After Multiply with constant --> [60, 25, 15, 10, 35]
if 6 in L:
    print("Present")
else:
    print("Not Present")

if 10 not in L:   # note: `10 in L == False` would chain-compare and misbehave
    print("Not Present")
else:
    print("Present")
KeyError: 3
CGPA = {1:8.9, 2:5.6, 4:6.7, 7:9.1, 8:5.3}
print("Dictionary --> ", CGPA)
print("Keys --> ", list(CGPA.keys()))
print("Values --> ", list(CGPA.values()))
CGPA={1:8.9,2:5.6,4:6.7,7:9.1,8:5.3}
print ("Original Dictionary --> ", CGPA)
CGPA[4] = 9.2
print ("After Updating (4) --> ", CGPA)
del CGPA[1]
print ("After Deleting (1) --> ", CGPA)
del CGPA
print("After Delete --> ", CGPA)  # NameError: the whole dictionary is gone
HomeTown={"Robin":"Delhi", "Govind":"Gwalior", "Anil":"Morena", "Pankaj":"Agra"}
print ("Original Dictionary --> ", HomeTown)
Data Structure 3 - Tuple
# Method 1
T = ("Pratham", 'Sharma', 3.14, 3)
# Method 2
T = tuple(["Pratham", 'Sharma', 3.14, 3]) # Convert list to tuple
#T = tuple(("Pratham", 'Sharma', 3.14, 3)) # Also Works
i = 0
while i < len(T):
    print(T[i])
    i += 1
Declare Tuple
for s in T:
print ( s )
# Example 1:
T = (3, 6, 9, 12, 5, 3, 2)
print ("T -->", T)
# Example 2:
T = (3, 6, 9, 12, 5, 3, 2)
print ("T -->", T)
T = (3, 6, 9, 12, 5, 3, 2)
print("T -->", T)
print("Sum -->", sum(T))
print("Average -->", sum(T)/len(T))
print("Average -->", sum(T)//len(T))
Example 1
T = (3, 6, 9, 12, 5, 3, 2)  # Integer Tuple
print("T -->", T)
print("Max -->", max(T))
print("Min -->", min(T))
Example 2
T = ("Ram", "Shyam", "Human", "Ant")  # String Tuple
print("T -->", T)
print("Max -->", max(T))
print("Min -->", min(T))
T1 = (3, 6, 9)
T2 = (12, 5, 3, 2)
print("T1 -->", T1)
print("T2 -->", T2)
T3 = T1 + T2
print("T3 -->", T3)
T4 = T1 + T2 + T1 + T2
print("T4 -->", T4)
T[2] = 900 # Error; 'tuple' object does not support item assignment
print ("T -->", T)
T1 = list(T)
T1.append(9.8)
T = tuple(T1)
print("After Add -->", T)

T1 = list(T)
T1.insert(2, "Rahul")
T = tuple(T1)
print("After Insert -->", T)
12 Data Structure 4 - Set
12.1 Declare Set
print("Symmetric Diff a - b --> ", a.symmetric_difference(b))
print("Symmetric Diff b - a --> ", b.symmetric_difference(a))
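For context, a self-contained sketch of the common set operations alongside the symmetric-difference calls above (the sample sets a and b are my own):

```python
a = {1, 2, 3, 4}
b = {3, 4, 5, 6}
print("Union -->", a.union(b))                      # elements in either set
print("Intersection -->", a.intersection(b))        # elements in both sets
print("Diff a - b -->", a.difference(b))            # in a but not in b
print("Symmetric Diff -->", a.symmetric_difference(b))  # in exactly one of the two
```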
13 File Handling
Learning: How to open a file, read the file and write to the file
13.1 Writing 1 to 10 in a file
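The code for this section was lost in extraction; a minimal sketch (the filename numbers.txt is my own choice):

```python
# Write 1 to 10 into a file, one number per line
with open("numbers.txt", "w") as f:
    for i in range(1, 11):
        f.write(str(i) + "\n")

# Read the file back and print each line
with open("numbers.txt", "r") as f:
    for line in f:
        print(line.strip())
```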
8/20/24, 9:22 PM Copy of Welcome To Colab - Colab
import numpy as np
import matplotlib.pyplot as plt

# The enclosing `def` line was lost in extraction; a plausible reconstruction:
def gradient_descent(gradient_f, x, learning_rate, num_iterations):
    history = [x]
    for _ in range(num_iterations):
        grad = gradient_f(x)          # slope at the current point
        x = x - learning_rate * grad  # step against the gradient
        history.append(x)
    return x, history
plt.figure(figsize=(12, 6))
plt.xlabel('x')
plt.ylabel('f(x)')
plt.title('Gradient Descent on the Function f(x) = 3x^2 - 3x + 4')
plt.legend()
plt.grid(True)
plt.show()
# Print results
print(f"Result from Gradient Descent: x = {min_x}, f(x) = {min_value}")
print(f"Theoretical Minimum: x = {theoretical_min_x}, f(x) = {theoretical_min_value}")
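Since much of this cell was lost in extraction, here is a self-contained sketch of the whole experiment for f(x) = 3x^2 - 3x + 4 (the starting point, learning rate and iteration count are my own choices; plotting omitted):

```python
def f(x):
    return 3 * x**2 - 3 * x + 4

def gradient_f(x):
    return 6 * x - 3              # f'(x)

def gradient_descent(x0, learning_rate=0.1, num_iterations=100):
    x, history = x0, [x0]
    for _ in range(num_iterations):
        x = x - learning_rate * gradient_f(x)  # step against the gradient
        history.append(x)
    return x, history

min_x, history = gradient_descent(x0=3.0)
theoretical_min_x = 0.5           # solve f'(x) = 0  =>  x = 1/2
print(f"Result from Gradient Descent: x = {min_x:.4f}, f(x) = {f(min_x):.4f}")
print(f"Theoretical Minimum: x = {theoretical_min_x}, f(x) = {f(theoretical_min_x)}")
```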
https://colab.research.google.com/drive/1y2BmOL0LxC4tIEiaaNnDJoFQqpmcn0-d#scrollTo=-Rh3-Vt9Nev9&printMode=true 3/4
import numpy as np

class Neuron:
    def __init__(self, n_inputs, bias=0., weights=None):
        self.b = bias
        if weights:
            self.ws = np.array(weights)
        else:
            self.ws = np.random.rand(n_inputs)

    def _f(self, x):
        return x  # activation; the original definition was lost (identity assumed)

    def __call__(self, xs):
        # multiply the inputs with the weights, sum them, and add the bias
        return self._f(xs @ self.ws + self.b)
-0.04999999999999999
8/28/24, 4:18 PM Welcome To Colab - Colab
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
import matplotlib.pyplot as plt
np.random.seed(0)
X = np.random.rand(1000, 2)
y = np.where(X[:, 0] + X[:, 1] > 1, 1, 0)

# Train/test split (this cell was lost in extraction; an 80/20 split is assumed)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
model = Sequential()
model.add(Dense(1, input_dim=2, activation='sigmoid')) # Single layer with 1 neuron
y_pred = model.predict(X_test)
y_pred_classes = np.where(y_pred > 0.5, 1, 0)
https://colab.research.google.com/#scrollTo=1RE2yiwqjRw4&printMode=true 2/6
11/11/24, 9:46 AM Untitled6.ipynb - Colab
# Import libraries
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import mnist
import matplotlib.pyplot as plt
# Reconstructed images
plt.subplot(2, 10, i + 11)
plt.imshow(reconstructed_images[i].reshape(28, 28), cmap='gray')
plt.axis('off')
plt.show()
https://colab.research.google.com/drive/16zHOXNl1Qw9lD2K9YbmaMgftf3dau9Pp#printMode=true 1/2
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import mnist
plt.show()
/usr/local/lib/python3.10/dist-packages/keras/src/layers/convolutional/base_conv.py:107: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.
  super().__init__(activity_regularizer=activity_regularizer, **kwargs)
Epoch 1/5
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 65s 33ms/step - accuracy: 0.8968 - loss: 0.3309 - val_accuracy: 0.9856 - val_loss: 0.0448
Epoch 2/5
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 82s 33ms/step - accuracy: 0.9864 - loss: 0.0458 - val_accuracy: 0.9882 - val_loss: 0.0350
Epoch 3/5
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 62s 33ms/step - accuracy: 0.9912 - loss: 0.0286 - val_accuracy: 0.9898 - val_loss: 0.0310
Epoch 4/5
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 78s 31ms/step - accuracy: 0.9923 - loss: 0.0245 - val_accuracy: 0.9906 - val_loss: 0.0341
Epoch 5/5
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 85s 33ms/step - accuracy: 0.9942 - loss: 0.0197 - val_accuracy: 0.9907 - val_loss: 0.0287
313/313 - 3s - 8ms/step - accuracy: 0.9907 - loss: 0.0287
Test accuracy: 0.9907000064849854
# Without Dropout
plt.subplot(1, 2, 1)
plt.plot(history_no_dropout.history['accuracy'], label='Train Accuracy')
plt.plot(history_no_dropout.history['val_accuracy'], label='Validation Accuracy')
plt.title('Model Without Dropout')
plt.xlabel('Epochs')
plt.ylabel('Accuracy')
plt.legend()

# With Dropout
plt.subplot(1, 2, 2)
plt.plot(history_with_dropout.history['accuracy'], label='Train Accuracy')
plt.plot(history_with_dropout.history['val_accuracy'], label='Validation Accuracy')
plt.title('Model With Dropout')
plt.xlabel('Epochs')
plt.ylabel('Accuracy')
plt.legend()
plt.tight_layout()
plt.show()

loss_no_dropout, accuracy_no_dropout = model_no_dropout.evaluate(X_test, y_test)
loss_with_dropout, accuracy_with_dropout = model_with_dropout.evaluate(X_test, y_test)
/usr/local/lib/python3.10/dist-packages/keras/src/layers/core/dense.py:87: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.
  super().__init__(activity_regularizer=activity_regularizer, **kwargs)
Epoch 1/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 10s 84ms/step - accuracy: 0.6430 - loss:
0.6481 - val_accuracy: 0.7750 - val_loss: 0.5422
Epoch 2/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.8231 - loss:
0.4992 - val_accuracy: 0.8250 - val_loss: 0.4421
Epoch 3/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.8607 - loss:
0.4148 - val_accuracy: 0.8562 - val_loss: 0.3733
Epoch 4/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.8604 - loss:
0.3505 - val_accuracy: 0.8562 - val_loss: 0.3330
Epoch 5/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 26ms/step - accuracy: 0.8815 - loss:
0.3260 - val_accuracy: 0.8500 - val_loss: 0.3102
Epoch 6/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.8861 - loss:
0.3155 - val_accuracy: 0.8562 - val_loss: 0.2971
Epoch 7/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 36ms/step - accuracy: 0.8925 - loss:
0.2850 - val_accuracy: 0.8562 - val_loss: 0.2886
Epoch 8/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 31ms/step - accuracy: 0.9048 - loss:
0.2825 - val_accuracy: 0.8562 - val_loss: 0.2850
Epoch 9/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 18ms/step - accuracy: 0.9051 - loss:
0.2756 - val_accuracy: 0.8687 - val_loss: 0.2818
Epoch 10/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9014 - loss:
0.2510 - val_accuracy: 0.8562 - val_loss: 0.2766
Epoch 11/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - accuracy: 0.9138 - loss:
0.2324 - val_accuracy: 0.8625 - val_loss: 0.2791
Epoch 12/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - accuracy: 0.9210 - loss:
0.2429 - val_accuracy: 0.8625 - val_loss: 0.2725
Epoch 13/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 21ms/step - accuracy: 0.9153 - loss:
0.2370 - val_accuracy: 0.8625 - val_loss: 0.2705
Epoch 14/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 16ms/step - accuracy: 0.9192 - loss:
0.2314 - val_accuracy: 0.8625 - val_loss: 0.2721
Epoch 15/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 29ms/step - accuracy: 0.9319 - loss:
0.2082 - val_accuracy: 0.8625 - val_loss: 0.2666
Epoch 16/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9216 - loss:
0.2049 - val_accuracy: 0.8687 - val_loss: 0.2726
Epoch 17/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - accuracy: 0.9237 - loss:
0.2079 - val_accuracy: 0.8813 - val_loss: 0.2709
Epoch 18/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - accuracy: 0.9231 - loss:
0.2077 - val_accuracy: 0.8500 - val_loss: 0.2738
Epoch 19/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9461 - loss:
0.1544 - val_accuracy: 0.8687 - val_loss: 0.2673
Epoch 20/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 23ms/step - accuracy: 0.9568 - loss:
0.1641 - val_accuracy: 0.8687 - val_loss: 0.2717
Epoch 21/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 27ms/step - accuracy: 0.9415 - loss:
0.1690 - val_accuracy: 0.8687 - val_loss: 0.2752
Epoch 22/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9543 - loss:
0.1475 - val_accuracy: 0.8750 - val_loss: 0.2680
Epoch 23/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.9591 - loss:
0.1356 - val_accuracy: 0.8750 - val_loss: 0.2750
Epoch 24/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9718 - loss:
0.1278 - val_accuracy: 0.8687 - val_loss: 0.2748
Epoch 25/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9663 - loss:
0.1365 - val_accuracy: 0.8875 - val_loss: 0.2821
Epoch 26/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.9652 - loss:
0.1278 - val_accuracy: 0.8687 - val_loss: 0.2796
Epoch 27/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.9838 - loss:
0.1022 - val_accuracy: 0.8875 - val_loss: 0.2832
Epoch 28/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.9722 - loss:
0.1124 - val_accuracy: 0.8750 - val_loss: 0.2850
Epoch 29/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9733 - loss:
0.1173 - val_accuracy: 0.8750 - val_loss: 0.2845
Epoch 30/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.9821 - loss:
0.0952 - val_accuracy: 0.8750 - val_loss: 0.2843
Epoch 31/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.9787 - loss:
0.0995 - val_accuracy: 0.8625 - val_loss: 0.2976
Epoch 32/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9845 - loss:
0.0956 - val_accuracy: 0.8687 - val_loss: 0.2914
Epoch 33/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.9847 - loss:
0.0814 - val_accuracy: 0.8750 - val_loss: 0.2946
Epoch 34/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9823 - loss:
0.0829 - val_accuracy: 0.8687 - val_loss: 0.2979
Epoch 35/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9848 - loss:
0.0744 - val_accuracy: 0.8750 - val_loss: 0.3102
Epoch 36/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9888 - loss:
0.0736 - val_accuracy: 0.8813 - val_loss: 0.2900
Epoch 37/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.9893 - loss:
0.0609 - val_accuracy: 0.8750 - val_loss: 0.3087
Epoch 38/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9873 - loss:
0.0616 - val_accuracy: 0.8750 - val_loss: 0.3063
Epoch 39/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9878 - loss:
0.0640 - val_accuracy: 0.8750 - val_loss: 0.3166
Epoch 40/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.9933 - loss:
0.0552 - val_accuracy: 0.8875 - val_loss: 0.3114
Epoch 41/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9884 - loss:
0.0527 - val_accuracy: 0.8750 - val_loss: 0.3219
Epoch 42/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9889 - loss:
0.0513 - val_accuracy: 0.8813 - val_loss: 0.3199
Epoch 43/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9965 - loss:
0.0425 - val_accuracy: 0.8875 - val_loss: 0.3195
Epoch 44/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9946 - loss:
0.0464 - val_accuracy: 0.8813 - val_loss: 0.3318
Epoch 45/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9953 - loss:
0.0387 - val_accuracy: 0.8750 - val_loss: 0.3297
Epoch 46/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9958 - loss:
0.0327 - val_accuracy: 0.8687 - val_loss: 0.3469
Epoch 47/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9978 - loss:
0.0314 - val_accuracy: 0.8750 - val_loss: 0.3387
Epoch 48/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.9994 - loss:
0.0327 - val_accuracy: 0.8813 - val_loss: 0.3388
Epoch 49/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 1.0000 - loss:
0.0321 - val_accuracy: 0.8813 - val_loss: 0.3462
Epoch 50/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 1.0000 - loss:
0.0229 - val_accuracy: 0.8875 - val_loss: 0.3498
Epoch 1/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.5152 - loss:
0.7853 - val_accuracy: 0.7188 - val_loss: 0.5864
Epoch 2/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.6228 - loss:
0.6686 - val_accuracy: 0.7937 - val_loss: 0.5273
Epoch 3/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.6706 - loss:
0.6189 - val_accuracy: 0.8313 - val_loss: 0.4856
Epoch 4/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.6945 - loss:
0.5651 - val_accuracy: 0.8438 - val_loss: 0.4484
Epoch 5/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.7278 - loss:
0.5187 - val_accuracy: 0.8562 - val_loss: 0.4172
Epoch 6/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.7402 - loss:
0.5468 - val_accuracy: 0.8625 - val_loss: 0.3909
Epoch 7/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.7315 - loss:
0.5248 - val_accuracy: 0.8625 - val_loss: 0.3676
Epoch 8/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.7987 - loss:
0.4617 - val_accuracy: 0.8687 - val_loss: 0.3507
Epoch 9/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.7858 - loss:
0.4741 - val_accuracy: 0.8687 - val_loss: 0.3344
Epoch 10/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8021 - loss:
0.4215 - val_accuracy: 0.8750 - val_loss: 0.3219
Epoch 11/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8331 - loss:
0.3714 - val_accuracy: 0.8750 - val_loss: 0.3135
Epoch 12/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8179 - loss:
0.4187 - val_accuracy: 0.8813 - val_loss: 0.3047
Epoch 13/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8565 - loss:
0.3632 - val_accuracy: 0.8750 - val_loss: 0.2969
Epoch 14/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8331 - loss:
0.3782 - val_accuracy: 0.8687 - val_loss: 0.2915
Epoch 15/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8367 - loss:
0.4313 - val_accuracy: 0.8750 - val_loss: 0.2847
Epoch 16/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8407 - loss:
0.3637 - val_accuracy: 0.8750 - val_loss: 0.2769
Epoch 17/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8581 - loss:
0.3331 - val_accuracy: 0.8750 - val_loss: 0.2717
Epoch 18/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8755 - loss:
0.3441 - val_accuracy: 0.8750 - val_loss: 0.2714
Epoch 19/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8439 - loss:
0.3797 - val_accuracy: 0.8750 - val_loss: 0.2688
Epoch 20/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8549 - loss:
0.3881 - val_accuracy: 0.8875 - val_loss: 0.2685
Epoch 21/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.8679 - loss:
0.3121 - val_accuracy: 0.8813 - val_loss: 0.2637
Epoch 22/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8549 - loss:
0.3219 - val_accuracy: 0.8687 - val_loss: 0.2599
Epoch 23/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 6ms/step - accuracy: 0.8244 - loss:
0.3870 - val_accuracy: 0.8687 - val_loss: 0.2599
Epoch 24/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.8538 - loss:
0.3253 - val_accuracy: 0.8687 - val_loss: 0.2603
Epoch 25/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - accuracy: 0.8482 - loss:
0.3312 - val_accuracy: 0.8687 - val_loss: 0.2591
Epoch 26/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.8662 - loss:
0.3036 - val_accuracy: 0.8687 - val_loss: 0.2552
Epoch 27/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - accuracy: 0.8561 - loss:
0.3585 - val_accuracy: 0.8625 - val_loss: 0.2550
Epoch 28/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - accuracy: 0.8735 - loss:
0.3187 - val_accuracy: 0.8687 - val_loss: 0.2524
Epoch 29/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.8841 - loss:
0.3073 - val_accuracy: 0.8687 - val_loss: 0.2500
Epoch 30/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.8419 - loss:
0.3552 - val_accuracy: 0.8687 - val_loss: 0.2533
Epoch 31/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 8ms/step - accuracy: 0.8861 - loss:
0.3104 - val_accuracy: 0.8687 - val_loss: 0.2538
Epoch 32/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.8707 - loss:
0.3173 - val_accuracy: 0.8687 - val_loss: 0.2510
Epoch 33/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 9ms/step - accuracy: 0.8880 - loss:
0.3089 - val_accuracy: 0.8625 - val_loss: 0.2485
Epoch 34/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 7ms/step - accuracy: 0.8619 - loss:
0.3288 - val_accuracy: 0.8625 - val_loss: 0.2529
Epoch 35/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8815 - loss:
0.3180 - val_accuracy: 0.8625 - val_loss: 0.2525
Epoch 36/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8688 - loss:
0.3215 - val_accuracy: 0.8625 - val_loss: 0.2504
Epoch 37/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8829 - loss:
0.2907 - val_accuracy: 0.8625 - val_loss: 0.2489
Epoch 38/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8665 - loss:
0.3169 - val_accuracy: 0.8625 - val_loss: 0.2502
Epoch 39/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8518 - loss:
0.3335 - val_accuracy: 0.8687 - val_loss: 0.2480
Epoch 40/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8722 - loss:
0.3199 - val_accuracy: 0.8687 - val_loss: 0.2475
Epoch 41/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8779 - loss:
0.2914 - val_accuracy: 0.8687 - val_loss: 0.2485
Epoch 42/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8829 - loss:
0.2764 - val_accuracy: 0.8687 - val_loss: 0.2472
Epoch 43/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.8906 - loss:
0.3278 - val_accuracy: 0.8687 - val_loss: 0.2463
Epoch 44/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8838 - loss:
0.2934 - val_accuracy: 0.8687 - val_loss: 0.2460
Epoch 45/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8769 - loss:
0.2908 - val_accuracy: 0.8687 - val_loss: 0.2435
Epoch 46/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8931 - loss:
0.2945 - val_accuracy: 0.8687 - val_loss: 0.2458
Epoch 47/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step - accuracy: 0.9131 - loss:
0.2687 - val_accuracy: 0.8687 - val_loss: 0.2485
Epoch 48/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8891 - loss:
0.2726 - val_accuracy: 0.8750 - val_loss: 0.2484
Epoch 49/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8665 - loss:
0.3065 - val_accuracy: 0.8750 - val_loss: 0.2484
Epoch 50/50
20/20 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8853 - loss:
0.2774 - val_accuracy: 0.8750 - val_loss: 0.2430
7/7 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8196 - loss: 0.6601