
DeepLearning_GPT3_questions

Question 65
What is the purpose of the forget gate in an LSTM cell?
To determine the output of the LSTM cell
To control how much of the cell state is updated
To determine the input to the output gate
To decide whether to update the cell state or not
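
For context, a minimal numpy sketch of the standard LSTM cell-state update (weight shapes and values are made up): the forget gate f_t squashes to (0, 1) and scales how much of the previous cell state c_{t-1} survives before the new candidate is added.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

hidden = 4
rng = np.random.default_rng(0)

# Hypothetical previous cell state, previous hidden state, and current input.
c_prev = rng.normal(size=hidden)
h_prev = rng.normal(size=hidden)
x_t = rng.normal(size=3)

# Forget gate: f_t = sigmoid(W_f @ [h_prev, x_t] + b_f), entries in (0, 1).
W_f = rng.normal(size=(hidden, hidden + 3))
b_f = np.zeros(hidden)
f_t = sigmoid(W_f @ np.concatenate([h_prev, x_t]) + b_f)

# The forget gate decides how much of the old cell state is kept;
# the input gate and candidate (omitted here) decide what gets added.
c_kept = f_t * c_prev
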
Question 66
Which of the following is NOT a type of gate in an LSTM?
Update gate
Input gate
Forget gate
Output gate
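
For reference, the update gate belongs to the GRU, not the LSTM. A quick way to see the gate inventory, assuming PyTorch, is the shape of the stacked input-to-hidden weights: nn.LSTM packs four blocks (input, forget, cell candidate, output), while nn.GRU packs three (reset, update, new).

import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16)
gru = nn.GRU(input_size=8, hidden_size=16)

print(lstm.weight_ih_l0.shape)  # torch.Size([64, 8]): 4 * hidden_size rows (i, f, g, o)
print(gru.weight_ih_l0.shape)   # torch.Size([48, 8]): 3 * hidden_size rows (r, z, n)
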
Question 67
What is the purpose of the teacher forcing technique in training RNNs?
To provide the network with the correct input at each time step during training.
To speed up the convergence of the network.
To improve the generalization ability of the network.
To prevent overfitting.
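
A rough sketch of teacher forcing in a decoder training loop; decoder_step is a hypothetical callable returning (logits, hidden), and the tensors are PyTorch-style. The key line is feeding the ground-truth token, not the model's own prediction, as the next input.

import torch

def train_step_teacher_forcing(decoder_step, targets, hidden):
    """targets: (seq_len,) ground-truth token ids; decoder_step is a
    hypothetical callable returning (logits, new_hidden)."""
    loss = 0.0
    inp = targets[0]                      # e.g. a start-of-sequence token
    for t in range(1, len(targets)):
        logits, hidden = decoder_step(inp, hidden)
        loss = loss + torch.nn.functional.cross_entropy(
            logits.unsqueeze(0), targets[t].unsqueeze(0))
        inp = targets[t]                  # teacher forcing: feed the true token,
                                          # not logits.argmax(), as the next input
    return loss
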
Question 68
What is the difference between a unidirectional and bidirectional RNN?
A unidirectional RNN can process data in both directions, while a bidirectional RNN can only process data in one direction.
Both unidirectional and bidirectional RNNs can only process data in one direction.
Both unidirectional and bidirectional RNNs can process data in both directions.
A unidirectional RNN can only process data in one direction, while a bidirectional RNN can process data in both directions.
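
A short sketch, assuming PyTorch, of the practical difference: a bidirectional LSTM runs one pass left-to-right and one right-to-left and concatenates the two hidden states, so the output feature dimension doubles.

import torch
import torch.nn as nn

x = torch.randn(20, 1, 8)   # (seq_len, batch, input_size)

uni = nn.LSTM(input_size=8, hidden_size=16)
bi = nn.LSTM(input_size=8, hidden_size=16, bidirectional=True)

out_uni, _ = uni(x)
out_bi, _ = bi(x)

print(out_uni.shape)  # torch.Size([20, 1, 16]): one direction
print(out_bi.shape)   # torch.Size([20, 1, 32]): forward and backward states concatenated
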
Question 69
What is the long short-term memory (LSTM) architecture designed to address in RNNs?
The exploding gradient problem.
The vanishing gradient problem.
The overfitting problem.
The underfitting problem.
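
A toy numeric illustration with made-up factors: the gradient through the LSTM cell-state path c_t = f_t * c_{t-1} + ... is essentially a product of forget-gate activations, which can sit near 1, while a plain tanh recurrence keeps multiplying by factors well below 1.

import numpy as np

steps = 50
forget_gates = np.full(steps, 0.95)   # hypothetical forget-gate activations near 1
tanh_factors = np.full(steps, 0.5)    # hypothetical per-step shrink factors in a vanilla RNN

print(np.prod(forget_gates))  # ~0.077: decays slowly, gradient still usable
print(np.prod(tanh_factors))  # ~8.9e-16: effectively zero after 50 steps
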
Question 70
What is the vanishing gradient problem in recurrent neural networks (RNNs)?
The weights of the network become too small.
The weights of the network become too large.
The gradients become too small during backpropagation.
The gradients become too large during backpropagation.
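
A small numpy sketch of the mechanism (weights and inputs are made up): backpropagation through time multiplies the gradient by the recurrent Jacobian once per step, so with small weights and saturating tanh units its norm shrinks roughly geometrically with sequence length.

import numpy as np

rng = np.random.default_rng(0)
hidden = 32
W = rng.normal(scale=0.1, size=(hidden, hidden))   # hypothetical small recurrent weights

# Forward pass of a plain tanh RNN (inputs made up, no learning involved).
hs = []
h = np.zeros(hidden)
for t in range(30):
    h = np.tanh(W @ h + rng.normal(size=hidden))
    hs.append(h)

# Backpropagation through time: each step multiplies the gradient by the
# transposed recurrent Jacobian, W^T @ diag(1 - h_t^2).
grad = np.ones(hidden)
for t in reversed(range(30)):
    grad = W.T @ (grad * (1.0 - hs[t] ** 2))

print(np.linalg.norm(grad))   # tiny: the gradient has effectively vanished
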
Question 71
Which of the following is a potential application of UNet in medical image analysis?
All of the above
Detecting anomalies in a time series
Identifying objects in an image
Segmenting tumor regions in an MRI
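
A heavily simplified sketch of the U-Net idea for that last use case, assuming PyTorch (a real U-Net has several resolution levels and many more channels): an encoder downsamples, a decoder upsamples, and a skip connection concatenates encoder features back in, so the network can emit a per-pixel class map such as a tumor mask over an MRI slice.

import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """One down/up level only; a real U-Net stacks several of these."""
    def __init__(self, in_ch=1, num_classes=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        # The decoder sees upsampled features concatenated with the skip connection.
        self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, num_classes, 1)   # per-pixel class logits

    def forward(self, x):
        skip = self.enc(x)
        mid = self.mid(self.down(skip))
        up = self.up(mid)
        dec = self.dec(torch.cat([up, skip], dim=1))
        return self.head(dec)

mri_slice = torch.randn(1, 1, 64, 64)   # (batch, channels, H, W), made-up data
logits = TinyUNet()(mri_slice)
print(logits.shape)                     # torch.Size([1, 2, 64, 64]): a class score per pixel
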
Question 72
How does UNet handle class imbalance in image segmentation tasks?
UNet does not handle class imbalance
By undersampling the majority classes
By weighting the loss function for underrepresented classes
By oversampling the minority classes
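
A sketch of the loss-weighting idea, assuming PyTorch (the original U-Net paper used a precomputed pixel-wise weight map, a more elaborate variant of the same principle): rare classes get larger weights in the cross-entropy loss so the dominant background does not swamp the gradient.

import torch
import torch.nn as nn

# Made-up segmentation batch: per-pixel logits for 2 classes and ground-truth labels
# where class 0 (background) covers almost every pixel.
logits = torch.randn(1, 2, 64, 64)
labels = torch.zeros(1, 64, 64, dtype=torch.long)
labels[0, 30:34, 30:34] = 1              # a small foreground (e.g. tumor) region

# Inverse-frequency class weights: the rare class contributes more per pixel.
counts = torch.bincount(labels.flatten(), minlength=2).float()
weights = counts.sum() / (2.0 * counts)

criterion = nn.CrossEntropyLoss(weight=weights)
loss = criterion(logits, labels)
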