Python torch.optim Module
This page lists the most popular functions and classes defined in the
torch.optim module, ordered by how often they are used across 40,000
open source Python projects.
1. Adam() (used in 1,166 projects)
2. SGD (used in 734 projects)
3. RMSprop() (used in 193 projects)
4. Optimizer() (used in 173 projects)
5. Adadelta() (used in 125 projects)
6. Adagrad() (used in 105 projects)
7. Adamax() (used in 61 projects)
8. ASGD (used in 27 projects)
9. SparseAdam() (used in 27 projects)
10. LBFGS (used in 26 projects)
11. AdamW() (used in 19 projects)
12. Rprop() (used in 18 projects)
13. set_parameters() (used in 9 projects)
14. optimizer() (used in 8 projects)
15. zero_grad() (used in 6 projects)
16. method() (used in 6 projects)
17. step() (used in 5 projects)
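The entries above mostly share one workflow: construct an optimizer over a model's parameters, then call zero_grad(), backward(), and step() each iteration. The sketch below shows that loop with the most popular entry, Adam; the model, learning rate, and data are made-up examples, not taken from this page.

```python
import torch

torch.manual_seed(0)

# Toy regression problem: targets follow a fixed linear rule (a fabricated
# example), so the loss should fall as the optimizer fits the weights.
w_true = torch.tensor([[2.0], [-1.0], [0.5]])
x = torch.randn(64, 3)
y = x @ w_true

model = torch.nn.Linear(3, 1)
opt = torch.optim.Adam(model.parameters(), lr=0.1)  # entry 1 in the list
loss_fn = torch.nn.MSELoss()

losses = []
for _ in range(50):
    opt.zero_grad()                 # entry 15: clear accumulated gradients
    loss = loss_fn(model(x), y)
    loss.backward()                 # compute gradients via autograd
    opt.step()                      # entry 17: apply the Adam update
    losses.append(loss.item())
```

Swapping in SGD, RMSprop, or AdamW changes only the constructor line; note that LBFGS is the exception, since its step() expects a closure that re-evaluates the loss.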