"largest transformer based language model ever trained at 24x the size of BERT and 5.6x the size of GPT-2"
Ryobot のブックマーク 2019/08/14 14:35
MegatronLM: Training Billion+ Parameter Language Models Using GPU Model Parallelism[GPT-2][BERT][Transformer] "largest transformer based language model ever trained at 24x the size of BERT and 5.6x the size of GPT-2"2019/08/14 14:35
"largest transformer based language model ever trained at 24x the size of BERT and 5.6x the size of GPT-2"
このブックマークにはスターがありません。 最初のスターをつけてみよう!
nv-adlr.github.io2019/08/14
Larger language models are dramatically more useful for NLP tasks such as article completion, question answering, and dialog systems. Training the largest neural language model has recently been th...
2 users have bookmarked this · 1 comment
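The post's title refers to intra-layer (tensor) model parallelism. As a rough single-process illustration of that idea, not NVIDIA's actual implementation, the NumPy sketch below splits a transformer MLP's two weight matrices across simulated workers, the first column-wise and the second row-wise, so the shards need only one sum (standing in for an all-reduce) at the end; all shapes and worker counts here are illustrative assumptions.

```python
import numpy as np

# Minimal single-process simulation of Megatron-style tensor parallelism
# for a transformer MLP block: Y = GeLU(X A) B.
# Each "worker" stands in for one GPU; sizes are illustrative, not from the post.

def gelu(x):
    # tanh approximation of GeLU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

rng = np.random.default_rng(0)
d_model, d_ff, n_workers = 8, 32, 4

X = rng.standard_normal((2, d_model))     # input activations (batch of 2 tokens)
A = rng.standard_normal((d_model, d_ff))  # first MLP weight
B = rng.standard_normal((d_ff, d_model))  # second MLP weight

# Split A column-wise and B row-wise across workers. Because GeLU is
# elementwise, each worker applies it to its own shard with no communication.
A_shards = np.split(A, n_workers, axis=1)
B_shards = np.split(B, n_workers, axis=0)

# Each worker computes a partial output; summing the partials plays the role
# of the single all-reduce that real tensor parallelism performs at the end.
partials = [gelu(X @ A_i) @ B_i for A_i, B_i in zip(A_shards, B_shards)]
Y_parallel = np.sum(partials, axis=0)

# Reference: the same MLP computed without any sharding.
Y_serial = gelu(X @ A) @ B
assert np.allclose(Y_parallel, Y_serial)
print("parallel and serial outputs match:", np.allclose(Y_parallel, Y_serial))
```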