- import: import the required modules
- Preprocessing: perform the data preprocessing needed for training.
- Modeling (model): define the model.
- Compile (compile): configure the model for training (optimizer, loss, metrics).
- Training (fit): train the model.
Images: loading the data
1) load_data() returns the data already split into train and validation sets.
(x_train, y_train), (x_valid, y_valid) = fashion_mnist.load_data()
Preprocessing
1) Image normalization: every pixel is an 8-bit value in 0~255, so dividing by 255.0 scales it into the 0~1 range.
x_train = x_train / 255.0
x_valid = x_valid / 255.0
2) Flatten: 2D -> 1D
x = Flatten(input_shape=(28, 28))  # check x_train's shape: 28*28 -> 784
print(x(x_train).shape)
3) Dense layer: the activation of the final output layer is 'softmax' for multi-class classification, or 'sigmoid' for binary classification.
Dense(10, activation='softmax')
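To see why softmax fits a multi-class output layer: it turns raw scores into probabilities that sum to 1 across the classes (a numpy-only sketch, not Keras code):

```python
import numpy as np

z = np.array([2.0, 1.0, 0.1])        # raw scores (logits) for 3 classes
probs = np.exp(z) / np.exp(z).sum()  # softmax: positive values summing to 1
predicted_class = probs.argmax()     # the class with the highest probability
```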
Model definition (Sequential)
1) Stack the layers inside a Sequential model
- 2D -> 1D
- Dense layers with activation='relu'
- The output size of the last layer equals the number of classes to classify
model = Sequential([
    # Flatten the 2D input into a 1D vector
    Flatten(input_shape=(28, 28)),  # 28 * 28 -> flat 784-dim vector
    # Dense layers
    Dense(1024, activation='relu'),
    Dense(512, activation='relu'),
    Dense(256, activation='relu'),
    Dense(128, activation='relu'),
    Dense(64, activation='relu'),
    # Softmax for classification (10 classes)
    Dense(10, activation='softmax'),
])
Check how the number of units (and parameters) shrinks layer by layer, e.g. with model.summary().
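For a Dense layer the parameter count follows (inputs + 1) × units, i.e. weights plus one bias per unit. A quick arithmetic sketch for the model above, matching what model.summary() would report:

```python
# (input_dim, units) for each Dense layer after Flatten(784)
layers = [(784, 1024), (1024, 512), (512, 256), (256, 128), (128, 64), (64, 10)]
params = [(i + 1) * u for i, u in layers]  # weights + biases per layer
total = sum(params)                        # total trainable parameters
```

The first layer alone contributes 785 * 1024 = 803,840 parameters, and each subsequent layer contributes fewer as the unit counts halve.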
Compile (compile)
- optimizer: 'adam'
- loss:
  - sigmoid activation: binary_crossentropy
  - softmax activation:
    - one-hot encoded labels: categorical_crossentropy
    - integer labels (no one-hot): sparse_categorical_crossentropy
- metrics: 'acc' or 'accuracy'
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['acc'])
Training (fit)
ModelCheckpoint: creates a checkpoint so the best model so far (by val_loss) is saved at each epoch
- checkpoint_path
- ModelCheckpoint
checkpoint_path = "my_checkpoint.ckpt"
checkpoint = ModelCheckpoint(filepath=checkpoint_path,
                             save_weights_only=True,
                             save_best_only=True,
                             monitor='val_loss',
                             verbose=1)
Training
history = model.fit(x_train, y_train,
                    validation_data=(x_valid, y_valid),
                    epochs=20,
                    callbacks=[checkpoint])
Loading weights after training completes (ModelCheckpoint)
# pass the filename the checkpoint was saved to
model.load_weights(checkpoint_path)
To evaluate after training:
model.evaluate(x_valid, y_valid)
Visualizing training loss / accuracy
import numpy as np
import matplotlib.pyplot as plt

plt.figure(figsize=(12, 9))
plt.plot(np.arange(1, 21), history.history['loss'])
plt.plot(np.arange(1, 21), history.history['val_loss'])
plt.title('Loss / Val Loss', fontsize=20)
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend(['loss', 'val_loss'], fontsize=15)
plt.show()
plt.figure(figsize=(12, 9))
plt.plot(np.arange(1, 21), history.history['acc'])
plt.plot(np.arange(1, 21), history.history['val_acc'])
plt.title('Acc / Val Acc', fontsize=20)
plt.xlabel('Epochs')
plt.ylabel('Acc')
plt.legend(['acc', 'val_acc'], fontsize=15)
plt.show()
Numeric data: loading the data
1) tensorflow-datasets (tfds): split into train and validation sets.
train_dataset = tfds.load('iris', split='train[:80%]')
valid_dataset = tfds.load('iris', split='train[80%:]')
#valid_dataset = tfds.load('iris', split='train[-20%:]')
Preprocessing
- one-hot encode the label values
- split each record into feature (x) and label (y)
def preprocess(data):
    x = data['features']
    y = data['label']
    y = tf.one_hot(y, 3)  # one-hot encode the integer label into 3 classes
    return x, y
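preprocess still has to be applied to the tfds pipelines before fitting, via the usual map/batch pattern. A self-contained sketch using a tiny in-memory stand-in for the iris records (the values and batch size here are illustrative assumptions):

```python
import tensorflow as tf

# Hypothetical stand-in for tfds iris records: dicts with 'features' and 'label'
data = tf.data.Dataset.from_tensor_slices({
    'features': [[5.1, 3.5, 1.4, 0.2], [6.7, 3.0, 5.2, 2.3]],
    'label': [0, 2],
})

def preprocess(record):
    x = record['features']
    y = tf.one_hot(record['label'], 3)  # one-hot encode the integer label
    return x, y

batches = data.map(preprocess).batch(2)  # same map/batch pattern as the real pipeline
x, y = next(iter(batches))
```

With the real datasets this becomes something like model.fit(train_dataset.map(preprocess).batch(batch_size), ...), where batch_size is your choice.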
๋ชจ๋ธ ์ ์ (Sequential)
model = tf.keras.models.Sequential([
    # input_shape is (4,) because X has 4 features.
    Dense(512, activation='relu', input_shape=(4,)),  # tabular data is already 1D, so no Flatten is needed
    Dense(256, activation='relu'),
    Dense(128, activation='relu'),
    Dense(64, activation='relu'),
    Dense(32, activation='relu'),
    # Softmax for classification; number of classes = 3
    Dense(3, activation='softmax'),
])
From compile (compile) onward, the steps are the same as in the image example above, with one caveat: because preprocess one-hot encodes the labels here, the loss should be categorical_crossentropy rather than sparse_categorical_crossentropy.
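A minimal sketch of that compile step (the single-layer model here is just a placeholder for the real stack above):

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([Dense(3, activation='softmax', input_shape=(4,))])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',  # labels are one-hot encoded
              metrics=['acc'])
```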