I've been using TensorFlow since its early days and had grown fairly used to the define-and-run style, but with TensorFlow 2.0 about to be released (2.0rc1 as of September 2019), I figured I should get familiar with the new interface, so I studied it.

Reading Effective TensorFlow 2.0, the major change highlighted is "Eager execution" and the recommendation is "Keras layers and models". It seems that going forward, the way to work is with the Keras API under eager execution.
Topic: classifying shogi piece images
Since around last year I've been building a dataset of shogi piece images, intending to do image recognition for shogi. This time I use that dataset.
There are 14 piece types, doubled for the two sides (sente and gote) to 28, plus one class for an empty square: 29 classes in total. For each class I've prepared roughly 200-300 labeled 96x96 color images.
Preparing the dataset
The labeled images are split into training, validation, and test datasets at a fixed ratio. This time I split them roughly 8:1:1, giving:
- training: 6277 images
- validation: 816 images
- test: 745 images
To make them easy to use with tf.keras.preprocessing.image.ImageDataGenerator (described later), each dataset is laid out in one directory per label:
dataset
├── test
│   ├── BLANK
│   │   ├── 0de35ef1668e6396720e6fd6b22502b9.jpg
│   │   ├── 15767c0eb70db908f98a9ac9304227c8.jpg
│   │   ├── 18b0f691ac7b1ba6d71eac7ad32efdd5.jpg
│   │   ...
│   ├── B_FU
│   │   ├── 01aad8b7a32ca76e1ed82d72d9305510.jpg
│   │   ├── 04bc08425fb859f883802228e723c215.jpg
│   │   ├── 080950c64fb64840da3835d67fb969b8.jpg
│   │   ...
│   ├── B_GI
│   ...
├── training
│   ├── BLANK
│   ├── B_FU
│   ...
└── validation
    ├── BLANK
    ├── B_FU
    ...
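The split script itself isn't shown in the post; a minimal sketch of how such an 8:1:1 split into the layout above could be done (the `split_dataset` helper is my own, not from the post) might look like:

```python
import random
import shutil
from pathlib import Path

def split_dataset(src_dir, dst_dir, ratios=(0.8, 0.1, 0.1), seed=42):
    """Copy each label's images into training/validation/test subdirectories."""
    random.seed(seed)
    src, dst = Path(src_dir), Path(dst_dir)
    for label_dir in sorted(p for p in src.iterdir() if p.is_dir()):
        images = sorted(label_dir.glob('*.jpg'))
        random.shuffle(images)
        n_train = int(len(images) * ratios[0])
        n_val = int(len(images) * ratios[1])
        splits = {
            'training': images[:n_train],
            'validation': images[n_train:n_train + n_val],
            'test': images[n_train + n_val:],
        }
        for split_name, files in splits.items():
            out_dir = dst / split_name / label_dir.name
            out_dir.mkdir(parents=True, exist_ok=True)
            for path in files:
                shutil.copy(path, out_dir / path.name)
```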
transfer learning
First, the easy approach: transfer learning with a pre-trained MobileNetV2 model.
The tf.keras.applications package bundles several pre-trained models, so they can be used simply by calling them.
TensorFlow Hub also publishes pre-trained models for reuse, but at the moment few of them target TensorFlow 2.0, and the MobileNetV2 ones are restricted to a 224x224 input size, among other things, which makes them a bit awkward to use.
That may get fleshed out over time, but for now I decided to use tf.keras.applications.MobileNetV2.
INPUT_IMAGE_SIZE = (96, 96)

tf.keras.applications.MobileNetV2(
    input_shape=INPUT_IMAGE_SIZE + (3,),
    include_top=False,
    pooling='avg',
    weights='imagenet')
input_shape can be any of 96, 128, 160, 192, or 224 (the default is 224), and the corresponding ImageNet pre-trained weights are downloaded and used.
Without any options the model outputs logits for a 1000-class classification, but for transfer learning those are unnecessary; we only need to extract the features from the preceding stage, so include_top=False is specified.
With input_shape=(96, 96, 3), this outputs features of shape (None, 3, 3, 1280).
Additionally specifying pooling='avg' applies average pooling, so the output becomes a 2D tensor of shape (None, 1280) (pooling='max' can also be specified).
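These shapes can be checked quickly; the snippet below (not from the post) uses weights=None just to skip the ImageNet download, since the shapes are identical either way:

```python
import tensorflow as tf

# include_top=False with no pooling: a spatial feature map remains
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None)
print(base.output_shape)    # (None, 3, 3, 1280)

# pooling='avg' adds global average pooling over the 3x3 spatial grid
pooled = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, pooling='avg', weights=None)
print(pooled.output_shape)  # (None, 1280)
```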
These 1280 values are used as a feature vector, and only the fully connected layer is trained and optimized.
feature extraction
When the MobileNetV2 base network is used frozen, the features it outputs for a given input image should never change, so the features for all images can be extracted up front.
Thanks to eager execution, the output values can be obtained simply by calling out = model(images).numpy().
Walk the directories, load the image data, build the model inputs, and save each output feature vector paired with its label index. The data will be shuffled before use anyway, so the order doesn't matter.
Here I saved them as numpy arrays in npz format.
import os

import numpy as np
import tensorflow as tf

INPUT_IMAGE_SIZE = (96, 96)

def dump_features(data_dir, features_dir):
    with open(os.path.join(data_dir, 'labels.txt'), 'r') as fp:
        labels = [line.strip() for line in fp.readlines()]
    model = tf.keras.applications.MobileNetV2(
        input_shape=INPUT_IMAGE_SIZE + (3,),
        include_top=False,
        pooling='avg',
        weights='imagenet')
    model.trainable = False
    features, label = [], []
    for root, dirs, files in os.walk(data_dir):
        for filename in files:
            if not filename.endswith('.jpg'):  # skip labels.txt etc.
                continue
            image = tf.io.read_file(os.path.join(root, filename))
            image = tf.io.decode_image(image, channels=3)
            image = tf.image.convert_image_dtype(image, dtype=tf.float32)
            features.append(model(tf.expand_dims(image, axis=0)).numpy().flatten())
            label.append(labels.index(os.path.basename(root)))
    np.savez(os.path.join(features_dir, 'out.npz'), inputs=features, targets=label)
It does take some time on a CPU, of course, but for a few thousand images it finishes after a few minutes' wait.
>>> import numpy as np
>>> npz = np.load('features/training.npz')
>>> npz['inputs']
array([[0.19352797, 0.        , 0.        , ..., 0.        , 1.0640727 ,
        2.7432559 ],
       [0.        , 0.        , 0.        , ..., 0.        , 1.4123839 ,
        3.057496  ],
       [0.18237096, 0.        , 0.04695423, ..., 0.33184642, 0.8117143 ,
        1.5288072 ],
       ...,
       [0.        , 0.        , 0.06801239, ..., 0.        , 0.13248181,
        1.6491733 ],
       [0.2550986 , 0.        , 0.10878149, ..., 0.        , 0.0785673 ,
        0.0311639 ],
       [0.        , 0.        , 0.        , ..., 0.        , 0.        ,
        2.0744066 ]], dtype=float32)
>>> npz['inputs'].shape
(6277, 1280)
>>> npz['targets']
array([24, 24, 24, ..., 14, 14, 14])
>>> npz['targets'].shape
(6277,)
Now we just train the fully connected layer to be optimized for these inputs and outputs.
Preparing training data with tf.data.Dataset
def dataset(category):
    npz = np.load(os.path.join(features_dir, f'{category}.npz'))
    inputs = npz['inputs']
    targets = npz['targets']
    size = inputs.shape[0]
    return tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(size), size

training_data, training_size = dataset('training')
validation_data, validation_size = dataset('validation')
For both training and validation, passing a tuple of inputs and targets to tf.data.Dataset.from_tensor_slices prepares the input data to feed the Model for training.
>>> for images, labels in training_data.batch(32).take(1):
...     print(images.shape, labels.shape)
(32, 1280) (32,)
Defining the Model with tf.keras.Sequential
The Model to be trained is described by passing a list of tf.keras.layers.Layer to tf.keras.Sequential.
with open(os.path.join(args.data_dir, 'labels.txt')) as fp:
    labels = [line.strip() for line in fp.readlines()]
classes = len(labels)
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer((1280,)),
    tf.keras.layers.Dropout(rate=0.2),
    tf.keras.layers.Dense(
        classes,
        activation='softmax',
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
])
model.summary()
For the 1280-dimensional input feature vector, a Dropout layer is inserted and a Dense layer produces the 29-class classification. The final layer's activation is softmax.
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dropout (Dropout)            (None, 1280)              0
_________________________________________________________________
dense (Dense)                (None, 29)                37149
=================================================================
Total params: 37,149
Trainable params: 37,149
Non-trainable params: 0
_________________________________________________________________
summary gives a quick overview of the model, which is handy.
Training
First, the loss and metrics are defined with Model.compile(), which determines what to measure and what to minimize.
model.compile(
    optimizer=tf.keras.optimizers.RMSprop(),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(),
    metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
In the dataset prepared for this transfer learning, each target label is a single int value representing an index, so when the labels haven't been converted to one-hot vectors, the Sparse-prefixed variants such as tf.keras.losses.SparseCategoricalCrossentropy are used.
If you'd rather use one-hot representations, it seems tf.keras.utils.to_categorical can be used for the conversion.
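As a small illustration of the correspondence (not from the post): to_categorical turns integer indices into one-hot rows, and the two loss variants agree on matching targets:

```python
import numpy as np
import tensorflow as tf

labels = np.array([2, 0, 1])  # integer class indices
onehot = tf.keras.utils.to_categorical(labels, num_classes=3)
print(onehot)
# [[0. 0. 1.]
#  [1. 0. 0.]
#  [0. 1. 0.]]

probs = np.array([[0.1, 0.2, 0.7],
                  [0.8, 0.1, 0.1],
                  [0.2, 0.6, 0.2]], dtype=np.float32)
# Same per-example cross-entropy, computed from the two label formats
sparse = tf.keras.losses.sparse_categorical_crossentropy(labels, probs)
dense = tf.keras.losses.categorical_crossentropy(onehot, probs)
print(np.allclose(sparse.numpy(), dense.numpy()))  # True
```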
optimizer㯠RMSprop
ãããã©ã«ãã§ä½¿ããããããã
With everything prepared, training starts; just call Model.fit().
history = model.fit(
    training_data.repeat().batch(batch_size),
    steps_per_epoch=training_size // batch_size,
    epochs=100,
    validation_data=validation_data.batch(batch_size),
    validation_steps=validation_size // batch_size,
    callbacks=[tf.keras.callbacks.TensorBoard()])
print(history.history)
For the training data we use the training_data built above. Since it is consumed repeatedly, .repeat() is specified.
Because processing happens via .batch(), the number of steps, i.e. the data size divided by the batch size, is specified for each.
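For example, with the dataset sizes above and a hypothetical batch_size of 64 (the post doesn't show the actual value), the step counts work out as:

```python
training_size, validation_size = 6277, 816
batch_size = 64  # an assumption; not stated in the post

steps_per_epoch = training_size // batch_size
validation_steps = validation_size // batch_size
print(steps_per_epoch, validation_steps)  # 98 12
```

These happen to match the "Train for 98 steps, validate for 12 steps" line in the training log, which suggests the batch size really was 64.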
callbacks allows special processing to run at points such as the end of each epoch; here, passing tf.keras.callbacks.TensorBoard() makes it write log data under ./logs for inspection with TensorBoard.
Train for 98 steps, validate for 12 steps
Epoch 1/100
2019-09-16 00:13:08.160855: I tensorflow/core/profiler/lib/profiler_session.cc:184] Profiler session started.
98/98 [==============================] - 1s 14ms/step - loss: 2.2567 - sparse_categorical_accuracy: 0.3463 - val_loss: 1.4933 - val_sparse_categorical_accuracy: 0.5755
Epoch 2/100
98/98 [==============================] - 0s 3ms/step - loss: 1.2355 - sparse_categorical_accuracy: 0.6362 - val_loss: 1.0762 - val_sparse_categorical_accuracy: 0.6966
Epoch 3/100
98/98 [==============================] - 0s 3ms/step - loss: 0.8975 - sparse_categorical_accuracy: 0.7380 - val_loss: 0.8647 - val_sparse_categorical_accuracy: 0.7630
Epoch 4/100
98/98 [==============================] - 0s 3ms/step - loss: 0.7226 - sparse_categorical_accuracy: 0.7943 - val_loss: 0.7586 - val_sparse_categorical_accuracy: 0.7943
Epoch 5/100
98/98 [==============================] - 0s 3ms/step - loss: 0.6036 - sparse_categorical_accuracy: 0.8364 - val_loss: 0.6855 - val_sparse_categorical_accuracy: 0.8073
...
Since only the Dense layer is being trained, it's very fast: about 0.3 seconds per epoch even on a CPU, so training flies by. You can watch the loss decrease and sparse_categorical_accuracy increase.
Anyway, after running 100 epochs, checking in TensorBoard:
[TensorBoard graphs: loss, sparse_categorical_accuracy]
Orange is training, blue is validation. Accuracy on training_data keeps climbing, but the result on validation_data seems to stall right around 90%, just barely reaching it or not.
Well, considering how much of these shogi-piece images' features a MobileNetV2 trained on ImageNet can really be expected to extract, maybe this is about right.
Saving the trained Model
Using the trained Dense layer, a new tf.keras.Sequential is built that connects it to the MobileNetV2 base network. This yields a Model that works as an image classifier, taking (96, 96, 3) input and producing a (29) output.
classifier = tf.keras.Sequential([
    tf.keras.applications.MobileNetV2(
        input_shape=INPUT_IMAGE_SIZE + (3,),
        include_top=False,
        pooling='avg',
        weights='imagenet'),
    model,
])
classifier.trainable = False
classifier.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
mobilenetv2_1.00_96 (Model)  (None, 1280)              2257984
_________________________________________________________________
sequential (Sequential)      (None, 29)                37149
=================================================================
Total params: 2,295,133
Trainable params: 0
Non-trainable params: 2,295,133
_________________________________________________________________
Being able to easily rewire and reuse models like this is convenient.
Then the whole Model is saved; just call .save().
classifier.save('transfer_classifier.h5')
Evaluation on test_data
Using the saved Model, let's evaluate accuracy on the test data, which was not used for training.
As with feature extraction, walk the directories, zip the image files with their label indices, and generate a tf.data.Dataset.
The saved Model can be loaded with tf.keras.models.load_model(). As during training, compile it with SparseCategoricalCrossentropy as the loss and SparseCategoricalAccuracy as the metric, to define the values to look at in the evaluation.
import os
import pathlib

import tensorflow as tf

def evaluate(data_dir, model_path):
    with open(os.path.join(data_dir, 'labels.txt'), 'r') as fp:
        labels = [line.strip() for line in fp.readlines()]
    label_to_index = {label: index for index, label in enumerate(labels)}

    def load_image(image_path):
        image = tf.io.decode_jpeg(tf.io.read_file(image_path), channels=3)
        return tf.image.convert_image_dtype(image, tf.float32)

    image_paths = pathlib.Path(os.path.join(data_dir, 'test')).glob('*/*.jpg')
    image_paths = list(image_paths)
    label_index = [label_to_index[path.parent.name] for path in image_paths]
    images_ds = tf.data.Dataset.from_tensor_slices(
        [str(path) for path in image_paths]).map(load_image)
    labels_ds = tf.data.Dataset.from_tensor_slices(label_index)
    test_data = tf.data.Dataset.zip((images_ds, labels_ds)).shuffle(len(image_paths))

    model = tf.keras.models.load_model(model_path)
    model.trainable = False
    model.compile(
        loss=tf.keras.losses.SparseCategoricalCrossentropy(),
        metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
    model.summary()

    test_result = model.evaluate(test_data.batch(1))
    print(test_result)
745/745 [==============================] - 13s 17ms/step - loss: 0.4405 - sparse_categorical_accuracy: 0.9114
[0.44054528336796983, 0.9114094]
So this transfer-learning Model's accuracy over all 745 test_data images turned out to be 91.14%.
fine tuning
With transfer learning, accuracy around 90% seems to be the limit. So what happens if the MobileNetV2 base network is also included in the training?
Model definition
model = tf.keras.Sequential([
    tf.keras.applications.MobileNetV2(
        input_shape=INPUT_IMAGE_SIZE + (3,),
        include_top=False,
        pooling='avg',
        weights='imagenet'),
    tf.keras.layers.Dropout(rate=0.1),
    tf.keras.layers.Dense(
        len(labels),
        activation='softmax',
        kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
mobilenetv2_1.00_96 (Model)  (None, 1280)              2257984
_________________________________________________________________
dropout (Dropout)            (None, 1280)              0
_________________________________________________________________
dense (Dense)                (None, 29)                37149
=================================================================
Total params: 2,295,133
Trainable params: 2,261,021
Non-trainable params: 34,112
_________________________________________________________________
The Model structure itself is unchanged: just MobileNetV2 connected to Dense. But since the MobileNetV2 part is now a training target, Trainable params jumps from 37,149 to 2,261,021.
Augmentation with tf.keras.preprocessing.image.ImageDataGenerator
There is a tf.keras.preprocessing module, which contains utilities for data preprocessing.
For images, there is a class that bundles up the data augmentation processing I used to build myself by combining various tf.image operations.
training_datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=2,
    width_shift_range=2,
    height_shift_range=2,
    brightness_range=(0.8, 1.2),
    channel_shift_range=0.2,
    zoom_range=0.02,
    rescale=1./255)
An ImageDataGenerator is created in this fashion, specifying parameters and ranges for the augmentation. Rotation, horizontal and vertical shifts, zooming, brightness changes, and more can all be specified.
Feeding data into this generator produces an iterator. If the data is already loaded, use flow; to specify only a directory and have the image files read from there, use flow_from_directory.
training_data = training_datagen.flow_from_directory(
    os.path.join(data_dir, 'training'),
    target_size=(96, 96),
    classes=labels,
    batch_size=batch_size)
With this, every time data is read from training_data, batches of images transformed according to the parameters given when the ImageDataGenerator was created are generated and output.
Pushing the parameters to extremes produces all sorts of results: tilted images, blown-out whites, and so on. The appropriate parameters will differ depending on which image classification task is being targeted.
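For illustration, an exaggerated configuration (these values are arbitrary, chosen only to make the distortion obvious) might look like:

```python
import tensorflow as tf

# Deliberately extreme ranges: expect strongly tilted, shifted,
# and blown-out images from this generator
extreme_datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=30,
    width_shift_range=0.3,
    height_shift_range=0.3,
    brightness_range=(0.3, 1.8),
    channel_shift_range=0.8,
    zoom_range=0.3,
    rescale=1./255)
```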
Anyway, we train using the output of this ImageDataGenerator.
model.compile(
    optimizer=tf.keras.optimizers.RMSprop(),
    loss=tf.keras.losses.CategoricalCrossentropy(),
    metrics=[tf.keras.metrics.CategoricalAccuracy()])
ImageDataGenerator.flow_from_directory defaults to class_mode='categorical', which outputs the targets as one-hot vectors, so when training with those we use CategoricalCrossentropy and CategoricalAccuracy rather than the Sparse variants.
Then, training.
Since training_data comes from an ImageDataGenerator, we use Model.fit_generator rather than Model.fit.
history = model.fit_generator(
    training_data,
    epochs=100,
    validation_data=validation_data,
    callbacks=[
        tf.keras.callbacks.TensorBoard(),
        tf.keras.callbacks.ModelCheckpoint(
            os.path.join(weights_dir, 'finetuning_weights-{epoch:02d}.h5'),
            save_weights_only=True),
    ])
I added tf.keras.callbacks.ModelCheckpoint() to the callbacks; it saves the model's state at each epoch. With save_weights_only, it ignores the model definition and saves only the variable values.
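The round trip can be sketched with a tiny stand-in model (the real thing would rebuild the MobileNetV2 + Dense Sequential above and load one of the finetuning_weights-NN.h5 checkpoint files; this toy model is only to show the mechanism):

```python
import tensorflow as tf

# A small stand-in model; saving weights only records the variable values
model = tf.keras.Sequential([
    tf.keras.layers.InputLayer((1280,)),
    tf.keras.layers.Dense(29, activation='softmax'),
])
model.save_weights('demo_weights.h5')

# Rebuild the identical architecture, then restore the saved variables,
# which is how a save_weights_only checkpoint is resumed
restored = tf.keras.Sequential([
    tf.keras.layers.InputLayer((1280,)),
    tf.keras.layers.Dense(29, activation='softmax'),
])
restored.load_weights('demo_weights.h5')
```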
Epoch 1/100
99/99 [==============================] - 53s 539ms/step - loss: 0.9509 - categorical_accuracy: 0.7492 - val_loss: 4.0943 - val_categorical_accuracy: 0.3885
Epoch 2/100
99/99 [==============================] - 52s 520ms/step - loss: 0.2898 - categorical_accuracy: 0.9246 - val_loss: 8.5777 - val_categorical_accuracy: 0.1973
Epoch 3/100
99/99 [==============================] - 51s 515ms/step - loss: 0.2019 - categorical_accuracy: 0.9498 - val_loss: 7.5333 - val_categorical_accuracy: 0.2512
Epoch 4/100
99/99 [==============================] - 52s 521ms/step - loss: 0.1509 - categorical_accuracy: 0.9651 - val_loss: 11.6409 - val_categorical_accuracy: 0.0907
...
This, unsurprisingly, takes quite a while on a CPU: about 200s/epoch on my MacBook Pro.
Using the GPU runtime on Google Colaboratory apparently shortens this to about 52s/epoch.
Incidentally, TensorFlow 2.0 can't be used on the TPU runtime yet; apparently TPUs are still unsupported as of 2.0rc.
Anyway, running 100 epochs on one of those:
[TensorBoard graphs: loss, categorical_accuracy]
Training (orange) has its loss decrease and accuracy increase steadily from the start, but validation (blue) is completely unstable for the first several epochs. Persist for 30-40 epochs, though, and the validation loss suddenly starts dropping and accuracy climbs rapidly. I don't quite understand this, but maybe this is what it looks like when fine-tuning finishes adapting the base network...
In any case, the final accuracy looks quite good.
Evaluation
As with transfer learning, save the trained model with model.save() and run evaluate using the same evaluation script.
745/745 [==============================] - 14s 19ms/step - loss: 0.0542 - sparse_categorical_accuracy: 0.9906
[0.054189728634688225, 0.99060404]
It achieved 99.06% accuracy, overwhelmingly higher than transfer learning.
So what is it actually getting wrong?
7 images out of 745:
(correct: △成香, predicted: △と金)
(correct: △香車, predicted: △銀将)
(correct: △香車, predicted: △歩兵)
(correct: ▲成香, predicted: ▲成銀)
(correct: ▲成香, predicted: ▲と金)
(correct: ▲成香, predicted: ▲成銀)
(correct: ▲歩兵, predicted: △成銀)
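The post doesn't show how this list was produced; one way to sketch it (the `list_misclassified` helper is mine, and predict + argmax is a guess at the method; it assumes the `images_ds`, `image_paths`, `label_index`, and `labels` from the evaluation script, taken before the .shuffle() call so the order still matches):

```python
import numpy as np
import tensorflow as tf

def list_misclassified(model, images_ds, image_paths, label_index, labels):
    """Print and return (path, correct, predicted) for each image the model gets wrong.

    images_ds must be in the same order as image_paths / label_index,
    i.e. the un-shuffled dataset.
    """
    probs = model.predict(images_ds.batch(32))
    predicted = np.argmax(probs, axis=1)
    wrong = []
    for path, true_idx, pred_idx in zip(image_paths, label_index, predicted):
        if pred_idx != true_idx:
            wrong.append((path, labels[true_idx], labels[pred_idx]))
            print(f'{path}  correct: {labels[true_idx]}  predicted: {labels[pred_idx]}')
    return wrong
```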
For lance (香車) and promoted lance (成香) there is a fair amount of variation but I haven't been able to collect much data, so those mistakes do seem plausible. And it can mostly distinguish orientation (sente vs gote)... or so I thought, until that last one, which bizarrely misclassified a pawn (歩兵) as the opponent's promoted silver (成銀). Well, promoted silver doesn't have much data either, so maybe the model is latching onto features in odd places.
At least it works well enough as a classifier that this kind of analysis is possible.
Eventually I intend to take this Model and get inference running in JavaScript or a mobile app, that sort of thing.