Question: If you have 10 filters that are 3×3×3 in one layer of a neural network, how many parameters does that layer have?
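Worked answer (assuming the usual convention of one bias term per filter): each 3×3×3 filter has 27 weights plus 1 bias, so the layer has 10 × 28 = 280 parameters. A quick check in R:

# Each 3x3x3 filter: 27 weights + 1 bias; 10 such filters in the layer
n_filters <- 10
weights_per_filter <- 3 * 3 * 3
total_params <- n_filters * (weights_per_filter + 1)
total_params  # 280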
Convolution
Pooling
Fully Connected
Typical keras workflow: define the model structure, compile it, and then train it with the fit() method of your model.

library(keras)

# Define model structure
# (input_shape, num_classes, batch_size, and epochs must be defined beforehand)
cnn_model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3),
                activation = "relu", input_shape = input_shape) %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_conv_2d(filters = 64, kernel_size = c(3, 3), activation = "relu") %>%
  layer_dropout(rate = 0.25) %>%
  layer_flatten() %>%
  layer_dense(units = 128, activation = "relu") %>%
  layer_dropout(rate = 0.5) %>%
  layer_dense(units = num_classes, activation = "softmax")

# Compile model
cnn_model %>% compile(
  loss = loss_categorical_crossentropy,
  optimizer = optimizer_adadelta(),
  metrics = c('accuracy')
)

# Train model
cnn_history <- cnn_model %>%
  fit(
    x_train, y_train,
    batch_size = batch_size,
    epochs = epochs,
    validation_split = 0.2
  )
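After training, the same pipe style extends to evaluation and prediction. A minimal sketch, assuming x_test and y_test exist and are preprocessed the same way as the training data:

# Evaluate loss and accuracy on held-out test data
cnn_model %>% evaluate(x_test, y_test)

# Predicted class probabilities -> hard class labels (0-based)
probs <- cnn_model %>% predict(x_test)
predicted_classes <- apply(probs, 1, which.max) - 1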
Search cost of neural architecture search (NAS) methods:
NASNet: 1800 GPU days (roughly 5 years on a single GPU)
AmoebaNet: 3150 GPU days
DARTS: 4 GPU days
ENAS: about 1000x cheaper than standard NAS