From 2a16af5de96e6174a9506f487cec88e2e5df8c2e Mon Sep 17 00:00:00 2001
From: Benjamin-eecs
Date: Wed, 7 Dec 2022 02:35:03 +0800
Subject: [PATCH] chore: update README

---
 README.md | 23 ++++++++++++++++++++---
 1 file changed, 20 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 3dc1155f8..74acd889c 100644
--- a/README.md
+++ b/README.md
@@ -146,7 +146,7 @@ Check out section [Explicit Gradient (EG)](#explicit-gradient-eg) functional API
 
 We design a bilevel-optimization updating scheme, which can be easily extended to realize various differentiable optimization processes.
 
-[figure: bilevel-optimization updating scheme (previous image)]
+[figure: bilevel-optimization updating scheme (updated image)]
 
 As shown above, the scheme contains an outer level whose parameters $\phi$ can be learned end-to-end through the inner-level parameter solution $\theta^{\prime}(\phi)$ by using the best-response derivatives $\partial \theta^{\prime}(\phi) / \partial \phi$.
@@ -435,8 +435,25 @@ If you find TorchOpt useful, please cite it in your publications.
 ```
 
 ## The Team
-
-TorchOpt is a work by [Jie Ren](https://github.com/JieRen98), [Xidong Feng](https://github.com/waterhorse1), [Bo Liu](https://github.com/Benjamin-eecs), [Xuehai Pan](https://github.com/XuehaiPan), [Luo Mai](https://luomai.github.io), and [Yaodong Yang](https://www.yangyaodong.com).
+
+<table align="center">
+  <tr>
+    <td align="center"><a href="https://github.com/JieRen98">Jie Ren</a></td>
+    <td align="center"><a href="https://github.com/waterhorse1">Xidong Feng</a></td>
+    <td align="center"><a href="https://github.com/Benjamin-eecs">Bo Liu</a></td>
+    <td align="center"><a href="https://github.com/XuehaiPan">Xuehai Pan</a></td>
+    <td align="center"><a href="https://luomai.github.io">Luo Mai</a></td>
+    <td align="center"><a href="https://www.yangyaodong.com">Yaodong Yang</a></td>
+  </tr>
+</table>
+
 
 ## License
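
To make the bilevel-optimization scheme described in the patched README section concrete, here is a minimal sketch of one inner/outer update. It assumes TorchOpt's documented module-level differentiable optimizer API (`torchopt.MetaSGD(module, lr=...)` with `optimizer.step(loss)`); the toy linear model, random data, and the $\phi$-weighted regularizer that couples the two levels are hypothetical, chosen only to make the dependence $\theta^{\prime}(\phi)$ explicit.

```python
import torch
import torch.nn as nn
import torchopt

net = nn.Linear(4, 1)                  # inner-level parameters (theta)
phi = nn.Parameter(torch.tensor(0.1))  # outer-level parameter (phi)

inner_optim = torchopt.MetaSGD(net, lr=0.1)  # differentiable inner optimizer
x, y = torch.randn(8, 4), torch.randn(8, 1)

# Inner level: one differentiable update producing theta'(phi).
# The hypothetical phi-weighted regularizer makes the inner loss, and hence
# the update, depend on phi; MetaSGD keeps the computation graph intact.
inner_loss = ((net(x) - y) ** 2).mean() + phi * net.weight.square().sum()
inner_optim.step(inner_loss)

# Outer level: evaluate theta'(phi) and backpropagate end-to-end, which
# realizes the best-response derivative d theta'(phi) / d phi implicitly.
outer_loss = ((net(x) - y) ** 2).mean()
(phi_grad,) = torch.autograd.grad(outer_loss, phi)
print(phi_grad)
```

The same pattern extends to TorchOpt's other differentiable optimizers (e.g. `MetaAdam`) and to multiple inner steps by calling `step` in a loop before taking the outer gradient.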