Write an MNIST classifier that trains to 99% accuracy or above, and does it without a fixed number of epochs -- i.e. you should stop training once you reach that level of accuracy.
import tensorflow as tf

# Callback to stop training once training accuracy exceeds 99%
class RayCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        if logs.get('accuracy', 0) > 0.99:
            print("\nReached 99% accuracy so cancelling training!")
            self.model.stop_training = True

# Load the MNIST handwritten digit data set
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Normalize pixel values to [0, 1] and create the callback
callbacks = RayCallback()
x_train = x_train / 255.0
x_test = x_test / 255.0

# Build a simple 3-layer model: Flatten -> Dense(128, ReLU) -> Dense(10, softmax)
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation=tf.nn.relu),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])

# Set the optimizer and loss function
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train until accuracy > 99% (at most 15 epochs)
model.fit(x_train, y_train, epochs=15, callbacks=[callbacks])

# Evaluate with test data
model.evaluate(x_test, y_test)
Result:
Epoch 1/15
1875/1875 [==============================] - 3s 2ms/step - loss: 0.2570 - accuracy: 0.9265
Epoch 2/15
1875/1875 [==============================] - 4s 2ms/step - loss: 0.1133 - accuracy: 0.9667
Epoch 3/15
1875/1875 [==============================] - 3s 2ms/step - loss: 0.0778 - accuracy: 0.9765
Epoch 4/15
1875/1875 [==============================] - 3s 2ms/step - loss: 0.0580 - accuracy: 0.9822
Epoch 5/15
1875/1875 [==============================] - 3s 2ms/step - loss: 0.0444 - accuracy: 0.9859
Epoch 6/15
1875/1875 [==============================] - 3s 2ms/step - loss: 0.0345 - accuracy: 0.9893
Epoch 7/15
1863/1875 [============================>.] - ETA: 0s - loss: 0.0268 - accuracy: 0.9916
Reached 99% accuracy so cancelling training!
1875/1875 [==============================] - 3s 2ms/step - loss: 0.0271 - accuracy: 0.9916
313/313 [==============================] - 0s 1ms/step - loss: 0.0843 - accuracy: 0.9774
[0.08431357890367508, 0.977400004863739]
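For reference, a small variation on the same callback idea: the monitored metric and the threshold can be parameterized, which also makes it easy to stop on validation accuracy instead of training accuracy. This is a minimal sketch, assuming TensorFlow 2.x metric names ('accuracy' / 'val_accuracy'); the ThresholdCallback name and the 0.99 target are illustrative, not part of the original solution.

import tensorflow as tf

# Generic threshold callback: stop once the chosen metric reaches the target value
class ThresholdCallback(tf.keras.callbacks.Callback):
    def __init__(self, monitor='accuracy', target=0.99):
        super().__init__()
        self.monitor = monitor
        self.target = target

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        value = logs.get(self.monitor)
        if value is not None and value >= self.target:
            print(f"\nReached {self.target:.0%} {self.monitor}, stopping training.")
            self.model.stop_training = True

# Usage (illustrative): stop on validation accuracy with a 10% validation split
# model.fit(x_train, y_train, epochs=15, validation_split=0.1,
#           callbacks=[ThresholdCallback(monitor='val_accuracy', target=0.99)])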