
AlexNet's contributions in this work include: ( ).


A. Use the rectified linear unit (ReLU) as the nonlinear activation function
B. Use dropout to randomly ignore individual neurons during training, avoiding overfitting
C. Use overlapping pooling to avoid the averaging effect of average pooling
D. Use an NVIDIA GTX 580 GPU to reduce training time

Published: 2025-11-03 08:05:42
Recommended reference answer (provided by the 快搜搜题库 official teachers)
Answer: A, B, C, and D. AlexNet used the rectified linear unit (ReLU) as its nonlinear activation function; applied dropout to randomly ignore individual neurons during training, avoiding overfitting; used overlapping pooling to avoid the averaging effect of average pooling; and trained on an NVIDIA GTX 580 GPU to reduce training time.
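The techniques named in options A–C can be sketched in a few lines of plain Python. This is only a minimal 1-D illustration of the ideas (ReLU, inverted dropout, and overlapping max pooling with a window larger than the stride), not AlexNet's actual GPU implementation; all function names and parameters here are illustrative choices:

```python
import random

def relu(xs):
    # ReLU: elementwise max(0, x) as the nonlinear activation (option A)
    return [max(0.0, x) for x in xs]

def dropout(xs, p=0.5, train=True, rng=None):
    # Inverted dropout (option B): during training, zero each unit with
    # probability p and scale survivors by 1/(1-p); at test time, pass through.
    # (The original paper instead scaled activations at test time.)
    if not train:
        return list(xs)
    rng = rng or random.Random(0)
    return [0.0 if rng.random() < p else x / (1.0 - p) for x in xs]

def overlapping_max_pool(xs, size=3, stride=2):
    # Overlapping max pooling (option C): window size > stride, as in
    # AlexNet's 3x3 windows with stride 2, here reduced to 1-D.
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, stride)]
```

For example, `overlapping_max_pool([1, 3, 2, 5, 4, 0, 6])` takes the maxima of the overlapping windows `[1,3,2]`, `[2,5,4]`, and `[4,0,6]`, yielding `[3, 5, 6]`; a max over overlapping windows preserves strong activations rather than smoothing them the way average pooling does.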