An Introduction to Optimizers in PyTorch
Algorithms
| Algorithm | Description |
| --- | --- |
| Adadelta | Implements the Adadelta algorithm. |
| Adafactor | Implements the Adafactor algorithm. |
| Adagrad | Implements the Adagrad algorithm. |
| Adam | Implements the Adam algorithm. |
| AdamW | Implements the AdamW algorithm. |
| SparseAdam | Implements a masked version of the Adam algorithm suitable for sparse gradients. |
| Adamax | Implements the Adamax algorithm (a variant of Adam based on the infinity norm). |
| ASGD | Implements Averaged Stochastic Gradient Descent. |
| LBFGS | Implements the L-BFGS algorithm. |
| NAdam | Implements the NAdam algorithm. |
| RAdam | Implements the RAdam algorithm. |
| RMSprop | Implements the RMSprop algorithm. |
| Rprop | Implements the resilient backpropagation algorithm. |
| SGD | Implements stochastic gradient descent (optionally with momentum). |
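To make the last entry concrete, here is a minimal pure-Python sketch of the SGD-with-momentum update rule that `torch.optim.SGD` applies per parameter (this is an illustrative reimplementation under common textbook conventions, not PyTorch's actual code; the function name `sgd_momentum_step` and the hyperparameter values are placeholders chosen for the example):

```python
def sgd_momentum_step(params, grads, bufs, lr=0.1, momentum=0.9):
    """One SGD-with-momentum step over parallel lists of scalar
    parameters, gradients, and momentum buffers:
        v <- momentum * v + g   (update momentum buffer)
        p <- p - lr * v         (update parameter)
    """
    for i, g in enumerate(grads):
        bufs[i] = momentum * bufs[i] + g
        params[i] = params[i] - lr * bufs[i]
    return params, bufs

# Minimize f(x) = x^2 (gradient 2x), starting from x = 5.0.
params, bufs = [5.0], [0.0]
for _ in range(100):
    grads = [2.0 * params[0]]
    params, bufs = sgd_momentum_step(params, grads, bufs)
# After 100 steps, params[0] has converged close to the minimum at 0.
```

In PyTorch itself, the same loop would instead call `optimizer.zero_grad()`, `loss.backward()`, and `optimizer.step()`, with the momentum buffers managed internally by the optimizer.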
Last updated 11 months ago
This article was written by a human and refined with AI. When reprinting, please credit the original: PyTorch中的优化器简介