|
|
[[Home](Home)] [[Activities](JULAIN Activities)][[Journal Club](JULAIN Journal Club)]
|
|
|
|
|
|
---
|
|
|
# JULAIN journal club, 21 Sept 2020
|
|
|
|
|
|
|
People already try to do this; see, for instance, "Deep Transformer Models for Time Series Forecasting".
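The implementations linked below all build on the same core operation, scaled dot-product attention. As a reminder, here is a minimal NumPy sketch of just that formula, softmax(QK^T / sqrt(d_k))V; this is an illustrative toy, not code taken from any of the linked repositories:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Returns the attended output and the attention weights.
    """
    d_k = Q.shape[-1]
    # Similarity scores between each query and each key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

Each row of `weights` is a probability distribution over the keys, so the output is a convex combination of the value vectors.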
|
|
* Educational: http://nlp.seas.harvard.edu/2018/04/03/attention.html
|
|
|
* MinGPT: minimal PyTorch implementation, https://github.com/karpathy/minGPT
|
|
|
* HuggingFace: https://github.com/huggingface/transformers, geared more toward fine-tuning pre-trained transformers for NLP tasks, or training them from scratch.
|
|
|
They also have a tokenization library https://github.com/huggingface/tokenizers, which implements Byte-Pair encoding (see above) among other methods.
|
|
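To make the Byte-Pair encoding mentioned above concrete, here is a minimal plain-Python sketch of the greedy merge-learning loop (the toy corpus is a made-up example, and this is not how the `tokenizers` library actually implements it):

```python
from collections import Counter

def byte_pair_encode(word_counts, num_merges):
    """Learn BPE merges from a corpus given as {word: count}.

    Each word is represented as a tuple of symbols, initially its characters;
    each merge fuses the currently most frequent adjacent symbol pair.
    """
    vocab = {tuple(word): count for word, count in word_counts.items()}
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, count in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += count
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Replace every occurrence of the best pair with the merged symbol.
        new_vocab = {}
        for symbols, count in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] = count
        vocab = new_vocab
    return merges, vocab
```

In practice one would train and apply the library's BPE models directly; this sketch only shows the core merge loop that makes frequent character sequences into single tokens.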
|
|
|
|
|
|
---
|
|
|
[[Home](Home)] [[Activities](JULAIN Activities)][[Journal Club](JULAIN Journal Club)]
|
|