* **OPT: Open Pre-trained Transformer Language Models** <br>
S Zhang et al. 2022<br>
[[link](https://arxiv.org/abs/2205.01068)] [[BlogPost](https://ai.facebook.com/blog/democratizing-access-to-large-scale-language-models-with-opt-175b/)]
* Intro: Stefan Kesselheim

### 25 April 2022 (moved from 18 April due to Easter!)