AlexNet's contributions in this work include: ( ).
A、Use the rectified linear unit (ReLU) as the nonlinear activation function
B、Use Dropout to randomly ignore individual neurons during training, avoiding over-fitting of the model
C、Use overlapping max pooling to avoid the averaging effect of average pooling
D、Use NVIDIA GTX 580 GPUs to reduce the training time
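For reference, the sketch below illustrates the techniques named in options A–C in PyTorch. It is a minimal, hypothetical fragment, not the paper's actual architecture: the layer sizes and the 10-class output are illustrative placeholders.

```python
# Minimal sketch (assumed layer sizes, not AlexNet's real ones) showing
# ReLU, overlapping max pooling, and Dropout as used in options A-C.
import torch
import torch.nn as nn

features = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
    nn.ReLU(inplace=True),                  # A: ReLU nonlinear activation
    nn.MaxPool2d(kernel_size=3, stride=2),  # C: overlapping pooling (window 3 > stride 2)
)

classifier = nn.Sequential(
    nn.Dropout(p=0.5),                      # B: randomly zero neurons during training
    nn.Linear(64 * 27 * 27, 10),            # placeholder class count
)

x = torch.randn(1, 3, 224, 224)             # dummy image batch
out = classifier(features(x).flatten(1))
print(out.shape)                            # torch.Size([1, 10])
```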