Sunday, May 17, 2020

Machine Learning Foundations: Exercise 3 - Improve accuracy of MNIST using Convolutions.

Exercise 3: Improve accuracy of MNIST using Convolutions (codelab link)

Improve MNIST to 99.8% accuracy or better using only a single convolutional layer and a single MaxPooling2D layer.
The number of filters affects both the accuracy and the training time; see the comparison sketch after the listing below.

Code: 

import tensorflow as tf

# Callback to stop training once the target accuracy is reached
class RayCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
        if logs.get('accuracy') > 0.998:
            print("\nReached 99.8% accuracy so cancelling training!")
            self.model.stop_training = True

# Load the MNIST handwritten digit dataset
mnist = tf.keras.datasets.mnist
(training_images, training_labels), (test_images, test_labels) = mnist.load_data()

# Create the callback, then reshape and normalize the training data
callbacks = RayCallback()
training_images = training_images.reshape(60000, 28, 28, 1)
training_images = training_images / 255.0

# Create a 5-layer model:
# Conv2D: 16 filters of size 3x3 over each image -> MaxPooling2D shrinks each feature map to 1/4
# Flatten -> Dense(128) -> Dense(10) output
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(16, (3, 3), activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])

# Set the optimizer and loss function
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

# Train the model until accuracy > 99.8% (or 10 epochs, whichever comes first)
model.fit(training_images, training_labels, epochs=10, callbacks=[callbacks])
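
The listing loads test_images and test_labels but never uses them. A minimal sketch for checking how the trained model does on the held-out test set, assuming the same reshape and normalize preprocessing as the training data:

# Evaluate on the 10,000-image MNIST test set (not part of the original gist)
test_images = test_images.reshape(10000, 28, 28, 1)
test_images = test_images / 255.0
test_loss, test_acc = model.evaluate(test_images, test_labels)
print('Test accuracy:', test_acc)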
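To see how the filter count trades off accuracy against training time, a rough comparison could look like the sketch below. The build_model helper is hypothetical and simply rebuilds the same architecture with a configurable filter count; it reuses training_images and training_labels from the listing above.

import time

# Hypothetical helper: same architecture as above, but with a configurable filter count
def build_model(num_filters):
    m = tf.keras.models.Sequential([
        tf.keras.layers.Conv2D(num_filters, (3, 3), activation='relu', input_shape=(28, 28, 1)),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax')
    ])
    m.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
    return m

# Train each variant for a few epochs and report final training accuracy and wall-clock time
for num_filters in (16, 32, 64):
    variant = build_model(num_filters)
    start = time.time()
    history = variant.fit(training_images, training_labels, epochs=5, verbose=0)
    elapsed = time.time() - start
    print(num_filters, 'filters:',
          'accuracy', round(history.history['accuracy'][-1], 4),
          'time', round(elapsed, 1), 's')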

