DeepLearning_GPT3_questions

Question 17
Which of the following is not a common approach to unsupervised pretraining in deep learning?
Deep Belief Networks
Autoencoders
Convolutional Neural Networks
Restricted Boltzmann Machines
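To make the autoencoder option concrete, here is a minimal sketch of a linear autoencoder with a single hidden unit, trained by gradient descent to reconstruct 2-D points lying on a line. The data, initialization, and hyperparameters are all illustrative choices for this example, not part of any standard recipe.

```python
import random

random.seed(0)
data = [(t, 2.0 * t) for t in [-1.0, -0.5, 0.5, 1.0]]  # points on y = 2x

# Encoder: h = w1*x + w2*y ; decoder: (v1*h, v2*h)
w1, w2, v1, v2 = 0.1, 0.2, 0.1, 0.2
lr = 0.05
for _ in range(2000):
    for x, y in data:
        h = w1 * x + w2 * y            # encode to 1-D code
        rx, ry = v1 * h, v2 * h        # decode (reconstruct)
        dx, dy = rx - x, ry - y        # reconstruction error
        # Gradients of the loss 0.5*(dx^2 + dy^2)
        dv1, dv2 = dx * h, dy * h
        dh = dx * v1 + dy * v2
        dw1, dw2 = dh * x, dh * y
        w1 -= lr * dw1; w2 -= lr * dw2
        v1 -= lr * dv1; v2 -= lr * dv2

# After training, the total reconstruction error should be near zero,
# since the data lie on a 1-D subspace the single code unit can capture.
err = sum((v1 * (w1 * x + w2 * y) - x) ** 2 +
          (v2 * (w1 * x + w2 * y) - y) ** 2 for x, y in data)
```

Because the inputs span only one direction, a one-unit linear autoencoder can reconstruct them exactly; this is the sense in which autoencoder pretraining learns a compressed representation of the input.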
Question 18
Which of the following is not a commonly used regularization technique in deep learning?
Random forest regularization
Dropout
L2 regularization
L1 regularization
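As an illustration of the dropout option, here is a minimal sketch of "inverted" dropout in plain Python. The function name and signature are invented for this example; real frameworks provide their own dropout layers.

```python
import random

def dropout(values, p_drop, training=True, seed=None):
    # Inverted dropout: zero each unit with probability p_drop and scale
    # the survivors by 1/(1 - p_drop), so the expected activation is
    # unchanged and no rescaling is needed at test time.
    if not training or p_drop == 0.0:
        return list(values)
    rng = random.Random(seed)
    keep = 1.0 - p_drop
    return [v / keep if rng.random() >= p_drop else 0.0 for v in values]
```

At evaluation time (`training=False`) the activations pass through untouched, which is why the inverted scaling is applied during training.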
Question 19
What is the main problem with using the vanilla gradient descent algorithm for training deep neural networks?
It can lead to overfitting
It can get stuck in local optima
It can be too slow to converge
It is computationally expensive
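The local-optima option can be demonstrated directly. The sketch below runs plain gradient descent on a non-convex function with two minima; the test function, starting point, and learning rate are illustrative choices for this example.

```python
def gradient_descent(grad, x0, lr, steps):
    # Plain (vanilla) gradient descent update: x <- x - lr * grad(x)
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x^2 - 1)^2 + 0.2*x has its global minimum near x = -1 and a
# shallower local minimum near x = +1; its derivative is 4x^3 - 4x + 0.2.
grad_f = lambda x: 4 * x**3 - 4 * x + 0.2

# Starting at x0 = 0.5, descent settles into the local minimum near +1
# instead of the global minimum near -1.
x_star = gradient_descent(grad_f, x0=0.5, lr=0.05, steps=500)
```

The same effect in a deep network's high-dimensional, non-convex loss surface is what the "stuck in local optima" answer refers to.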
Question 20
Which of the following is not a commonly used activation function in deep learning?
ReLU
Tanh
Sigmoid
Linear
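The three common activations listed above are one-liners in plain Python:

```python
import math

def relu(x):
    # Rectified linear unit: passes positives, zeroes out negatives.
    return max(0.0, x)

def sigmoid(x):
    # Squashes any real input into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any real input into the interval (-1, 1).
    return math.tanh(x)
```

A purely linear "activation" is the odd one out: stacking layers with linear activations collapses to a single linear map, so it adds no representational power.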
Question 21
What is the purpose of the softmax function in deep learning?
To calculate the output of the neural network
To normalize the output of the neural network to a probability distribution
To activate the neurons in the neural network
To compute the gradient of the loss function with respect to the weights
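A minimal softmax implementation makes the "probability distribution" answer concrete; subtracting the maximum logit before exponentiating is the standard numerical-stability trick:

```python
import math

def softmax(logits):
    # Subtract the max logit so exp() never overflows; this shift does
    # not change the result because it cancels in the ratio.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

The outputs are non-negative and sum to 1, so they can be read as class probabilities.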
Question 22
What is the purpose of the backpropagation algorithm in deep learning?
To update the weights in the neural network
To propagate the input forward through the network
To calculate the output of the neural network
To compute the gradient of the loss function with respect to the weights
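Backpropagation is the chain rule applied layer by layer. The sketch below works it out by hand for a single sigmoid neuron with squared-error loss; the function name and the particular loss are illustrative choices for this example.

```python
import math

def forward_backward(x, w, b, target):
    # Forward pass
    z = w * x + b                      # pre-activation
    y = 1.0 / (1.0 + math.exp(-z))    # sigmoid activation
    loss = 0.5 * (y - target) ** 2    # squared-error loss
    # Backward pass: chain rule, dL/dw = dL/dy * dy/dz * dz/dw
    dL_dy = y - target
    dy_dz = y * (1.0 - y)             # derivative of the sigmoid
    dL_dw = dL_dy * dy_dz * x
    dL_db = dL_dy * dy_dz
    return loss, dL_dw, dL_db
```

The gradients it returns can be checked against a finite-difference estimate of the loss, which is the standard sanity check for a backprop implementation.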
Question 23
What is the difference between supervised and unsupervised learning?
There is no difference between the two
Supervised learning requires less training data than unsupervised learning
Supervised learning is more accurate than unsupervised learning
Supervised learning requires labeled data, while unsupervised learning does not
Question 24
Which of the following is not a commonly used activation function in deep learning?
ReLU
Polynomial
Sigmoid
Tanh