
DeepLearning_GPT3_questions

Question 1
Which of the following is a type of regularization that encourages weight values to be small but non-zero?
None of the above
L2 regularization
Dropout regularization
L1 regularization
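As a study aid: L2 regularization adds a penalty proportional to the sum of squared weights, which pushes weights toward small values but rarely exactly to zero. A minimal plain-Python sketch (function names are illustrative):

```python
def l2_penalty(weights, lam=0.01):
    """L2 (ridge) penalty: lam * sum of squared weights.

    Large weights are penalized quadratically, so training prefers
    many small but non-zero weights over a few large ones.
    """
    return lam * sum(w * w for w in weights)

def loss_with_l2(data_loss, weights, lam=0.01):
    # Total objective = data loss + L2 penalty on the weights.
    return data_loss + l2_penalty(weights, lam)
```

For example, `l2_penalty([3.0, -4.0], lam=0.1)` evaluates to `0.1 * (9 + 16) = 2.5`.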
Question 2
Which of the following is a type of regularization that encourages sparse weight matrices?
None of the above
Dropout regularization
L2 regularization
L1 regularization
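As a study aid: L1 regularization penalizes the sum of absolute weight values, and its proximal update (soft-thresholding) sets small weights exactly to zero, producing sparse weight matrices. A minimal sketch, with illustrative names:

```python
def l1_penalty(weights, lam=0.01):
    """L1 (lasso) penalty: lam * sum of absolute weight values."""
    return lam * sum(abs(w) for w in weights)

def soft_threshold(w, lam):
    """Proximal operator of the L1 penalty.

    Shrinks each weight toward zero by lam and sets weights with
    |w| <= lam exactly to zero -- this is where sparsity comes from.
    """
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0
```

For example, `soft_threshold(0.05, 0.1)` returns `0.0` (the weight is zeroed out), while `soft_threshold(0.5, 0.1)` returns `0.4`.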
Question 3
What is the purpose of early stopping as a regularization technique?
To minimize the training loss
To minimize the sum of training and validation loss
To prevent overfitting
To minimize the validation loss
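As a study aid: early stopping halts training when validation loss stops improving, preventing the model from overfitting the training set. A minimal sketch of the loop, assuming caller-supplied `train_step` and `eval_step` callbacks (names are illustrative):

```python
def train_with_early_stopping(train_step, eval_step, max_epochs=100, patience=5):
    """Stop when validation loss has not improved for `patience` epochs.

    In practice you would also save the weights at `best_epoch` and
    restore them after stopping.
    """
    best_val, best_epoch, wait = float("inf"), 0, 0
    for epoch in range(max_epochs):
        train_step(epoch)            # one pass over the training data
        val_loss = eval_step(epoch)  # loss on the held-out validation set
        if val_loss < best_val:
            best_val, best_epoch, wait = val_loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                break                # validation loss has stalled
    return best_epoch, best_val
```

With a validation-loss curve that dips and then rises (the classic overfitting signature), the loop returns the epoch at the dip rather than running to `max_epochs`.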
Question 4
Which of the following is a technique used for regularization in deep learning?
Softmax
Stochastic Gradient Descent
Dropout
Gradient Descent
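As a study aid: dropout randomly zeroes activations during training so the network cannot rely on any single unit. A minimal sketch of inverted dropout in plain Python (names are illustrative):

```python
import random

def dropout(activations, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during
    training and scale survivors by 1/(1-p), so the expected
    activation is unchanged and no rescaling is needed at test time.
    """
    if not training or p == 0.0:
        return list(activations)  # identity outside training
    rng = rng or random.Random()
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]
```

At inference time (`training=False`) the layer is a no-op, which is why frameworks distinguish train and eval modes.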
Question 5
Which of the following is a benefit of using multilayer perceptrons with multiple hidden layers?
They are less computationally expensive.
They require less labeled training data.
They are more easily interpretable.
They are less likely to overfit.
Question 6
Which of the following is a disadvantage of using multilayer perceptrons?
They can suffer from the vanishing gradient problem.
They are easy to interpret.
They do not require labeled training data.
They are computationally efficient.
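As a study aid: the vanishing gradient problem arises because backpropagation multiplies per-layer derivatives, and saturating activations like the sigmoid have derivatives of at most 0.25. A minimal sketch of how the gradient shrinks with depth (illustrative setup: unit weights, pre-activations at 0):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

def chain_gradient(n_layers):
    """Gradient reaching the first layer of an n-layer sigmoid chain:
    the product of per-layer derivatives, here 0.25 per layer."""
    g = 1.0
    for _ in range(n_layers):
        g *= sigmoid_grad(0.0)
    return g
```

Even in this best case (each derivative at its 0.25 maximum), ten layers already shrink the gradient to roughly 1e-6, which is why deep sigmoid MLPs train slowly in their early layers.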
Question 7
Which of the following is true about the backpropagation algorithm?
It is guaranteed to find the global minimum of the loss function.
It does not require the use of an activation function.
It is used to compute gradients of a loss function with respect to the weights of a neural network.
It is only used for feedforward neural networks.
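As a study aid: backpropagation computes the gradient of the loss with respect to each weight by applying the chain rule backward through the network. A minimal sketch for a single tanh neuron with squared-error loss (names are illustrative), checkable against a numerical gradient:

```python
import math

def forward(w, b, x):
    return math.tanh(w * x + b)

def loss(w, b, x, y):
    return 0.5 * (forward(w, b, x) - y) ** 2

def backprop(w, b, x, y):
    # Forward pass: cache intermediate values.
    z = w * x + b
    a = math.tanh(z)
    # Backward pass: chain rule, output to input.
    dL_da = a - y            # d(loss)/d(activation)
    da_dz = 1.0 - a * a      # tanh'(z)
    dL_dz = dL_da * da_dz
    return dL_dz * x, dL_dz  # dL/dw, dL/db
```

Comparing against a central-difference numerical gradient is the standard sanity check for a hand-written backward pass.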
Question 8
Which of the following is not a method for avoiding overfitting in multilayer perceptrons?
Regularization
Removing hidden layers
Dropout