Introduction to Optimizers in PyTorch
Algorithms
| Algorithm | Description |
| --- | --- |
| Adadelta | Implements the Adadelta algorithm. |
| Adafactor | Implements the Adafactor algorithm. |
| Adagrad | Implements the Adagrad algorithm. |
| Adam | Implements the Adam algorithm. |
| AdamW | Implements the AdamW algorithm. |
| SparseAdam | Implements a masked version of the Adam algorithm suitable for sparse gradients. |
| Adamax | Implements the Adamax algorithm (a variant of Adam based on the infinity norm). |
| ASGD | Implements Averaged Stochastic Gradient Descent. |
| LBFGS | Implements the L-BFGS algorithm. |
| NAdam | Implements the NAdam algorithm. |
| RAdam | Implements the RAdam algorithm. |
| RMSprop | Implements the RMSprop algorithm. |
| Rprop | Implements the resilient backpropagation algorithm. |
| SGD | Implements stochastic gradient descent (optionally with momentum). |
Last updated 11 months ago
This article was written by a human and refined with AI. When reposting, please credit the original source: Introduction to Optimizers in PyTorch