When using the following code to construct a neural network in MindSpore, you can inherit the Cell class and override the __init__ and construct methods.
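The referenced code is not included in this copy. As a minimal sketch of the pattern, the hypothetical Cell base class below imitates mindspore.nn.Cell in plain Python so the example runs without MindSpore installed; in real MindSpore code you would inherit from mindspore.nn.Cell and also call super().__init__() first.

```python
# Hypothetical stand-in for mindspore.nn.Cell, so the pattern runs
# without MindSpore: subclass it, then override __init__ and construct.
class Cell:
    def __call__(self, *inputs):
        # MindSpore routes calls to construct(); mimic that here.
        return self.construct(*inputs)

class LinearNet(Cell):
    def __init__(self, weight, bias):
        # Define the network's parameters/layers in __init__.
        self.weight = weight
        self.bias = bias

    def construct(self, x):
        # Define the forward computation in construct.
        return x * self.weight + self.bias

net = LinearNet(weight=2.0, bias=1.0)
print(net(3.0))  # forward pass runs through construct: 7.0
```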
Global gradient descent, stochastic gradient descent, and batch gradient descent are all gradient descent algorithms. Which of the following statements about these algorithms is true?
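The variants differ in how much data each update step uses. A toy NumPy sketch (hypothetical 1-D least-squares problem, not from the source) contrasting a full-dataset update with single-sample updates:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x  # true weight is 3.0

def grad(w, xs, ys):
    # Gradient of the mean squared error 0.5*(w*x - y)^2 w.r.t. w.
    return np.mean((w * xs - ys) * xs)

# Full-dataset (global/batch) gradient descent: each step uses all samples.
w_batch = 0.0
for _ in range(100):
    w_batch -= 0.1 * grad(w_batch, x, y)

# Stochastic gradient descent: each step uses a single random sample.
w_sgd = 0.0
for i in rng.integers(0, len(x), size=1000):
    w_sgd -= 0.01 * grad(w_sgd, x[i:i + 1], y[i:i + 1])

print(round(w_batch, 3), round(w_sgd, 3))  # both approach 3.0
```

The full-dataset update is smooth but touches every sample per step; the stochastic update is cheap per step but noisier.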
Which of the following is the activation function used in the hidden layers of the standard recurrent neural network (RNN) structure?
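For reference, one hidden-state update of a vanilla RNN can be sketched in NumPy as follows (the dimensions and weight initialization here are arbitrary illustration choices, not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)
hidden = 4
W_xh = rng.normal(size=(hidden, 3)) * 0.1       # input-to-hidden weights
W_hh = rng.normal(size=(hidden, hidden)) * 0.1  # hidden-to-hidden weights
b_h = np.zeros(hidden)

def rnn_step(h_prev, x_t):
    # Standard RNN hidden-layer update: a squashing nonlinearity is
    # applied to the combined input and recurrent pre-activation.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = rnn_step(np.zeros(hidden), rng.normal(size=3))
print(h.shape)  # (4,)
```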
In a fully-connected structure, a hidden layer with 1000 neurons is used to process an image with a resolution of 100 x 100. Which of the following is the correct number of parameters?
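The arithmetic behind this kind of question: flattening a 100 x 100 image gives 10,000 inputs, and a fully-connected layer of 1000 neurons has one weight per (input, neuron) pair, plus one bias per neuron if biases are counted:

```python
inputs = 100 * 100           # flattened 100 x 100 image
neurons = 1000               # hidden-layer size
weights = inputs * neurons   # one weight per (input, neuron) pair
biases = neurons             # one bias per neuron, if the question counts them
print(weights)               # 10000000 weights
print(weights + biases)      # 10001000 including biases
```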
When you use MindSpore to execute the following code, which of the following is the output?
import numpy as np
from mindspore import Tensor, dtype

x = Tensor(np.array([[1, 2], [3, 4]]), dtype.int32)
x.dtype
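MindSpore may not be available locally; the NumPy analogue of the same dtype inspection behaves as shown below (MindSpore reports its own dtype object for the tensor rather than a NumPy dtype):

```python
import numpy as np

# NumPy analogue of constructing an int32 array and inspecting its dtype.
x = np.array([[1, 2], [3, 4]], dtype=np.int32)
print(x.dtype)  # int32
```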
"AI application fields include only computer vision and speech processing." Which of the following is true about this statement?
In a hyperparameter-based search, the hyperparameters of a model are searched based on the data and the model's performance metrics.
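As a concrete illustration, a minimal grid search (the toy objective here is a hypothetical stand-in for a measured validation metric, not from the source) that selects hyperparameters by performance:

```python
# Toy stand-in for a validation metric: lower is better,
# with the best score near lr=0.1 and depth=3.
def validation_loss(lr, depth):
    return (lr - 0.1) ** 2 + (depth - 3) ** 2

# Enumerate the hyperparameter grid and keep the best-scoring combination.
grid = [(lr, depth) for lr in (0.01, 0.1, 1.0) for depth in (2, 3, 4)]
best = min(grid, key=lambda p: validation_loss(*p))
print(best)  # (0.1, 3)
```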
AI inference chips need to be optimized and are thus more complex than those used for training.
The sigmoid, tanh, and softsign activation functions cannot avoid the vanishing gradient problem when the network is deep.
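This claim can be checked numerically: the derivatives of sigmoid, tanh, and softsign are all bounded (sigmoid's never exceeds 0.25), so backpropagation through many layers repeatedly multiplies by factors at most 1, shrinking the gradient. A NumPy sketch:

```python
import numpy as np

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)                  # peaks at 0.25 when z = 0

def tanh_grad(z):
    return 1.0 - np.tanh(z) ** 2          # peaks at 1.0 when z = 0

def softsign_grad(z):
    return 1.0 / (1.0 + np.abs(z)) ** 2   # peaks at 1.0 when z = 0

# Even at sigmoid's best case (z = 0), backpropagating through 50 layers
# multiplies the gradient by at most 0.25 per layer.
print(sigmoid_grad(0.0) ** 50)  # 0.25**50, vanishingly small
```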