
Just a sample of the Echomail archive

Cooperative anarchy at its finest, still active today. Darkrealms is the Zone 1 Hub.

   ENGLISH_TUTOR      English Tutoring for Students of the Eng      4,347 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 3,286 of 4,347   
   Denis Mosko to All   
   The Transformer is a deep learning model   
   17 Sep 20 07:22:02   
   
   MSGID: 2:5064/54.1315 5f62e4e2   
   CHRS: CP866 2   
   TZUTC: 0300   
   TID: hpt/w32-mgw 1.4.0-sta 30-03-12   
   The Transformer is a deep learning model used primarily in the field of NLP.
      
   Like RNNs, Transformers are designed to handle sequential data, such as   
   natural language, for tasks such as translation and text summarization.   
   However, unlike RNNs, Transformers do not require that the sequential data be   
   processed in order. For example, if the input data is a natural language   
   sentence, the Transformer does not need to process the beginning of it before   
   the end. Due to this feature, the Transformer allows for much more   
   parallelization than RNNs and therefore reduced training times.   
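The order-independence described above can be sketched with a minimal scaled dot-product self-attention in plain NumPy (a toy illustration, not the full Transformer; all names and shapes here are made up): every position attends to every other position in a single matrix product, so the whole sequence is processed at once rather than step by step as in an RNN.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a whole sequence at once.

    X: (seq_len, d) array of token embeddings. One matrix product
    compares every position with every other position simultaneously --
    no recurrence, which is what makes the computation parallelizable.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                   # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ X                              # mix of all positions per output

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))     # toy "sentence" of 5 tokens, 8-dim embeddings
out = self_attention(X)
print(out.shape)                # (5, 8): one output per input position
```

Note that nothing in the function walks the sequence front to back: reordering the computation of output rows would give the same result, unlike an RNN's hidden-state chain.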
      
   Since their introduction, Transformers have become the model of choice for   
   tackling many problems in NLP, replacing older recurrent neural network models   
   such as the long short-term memory (LSTM). Since the Transformer model
   facilitates more parallelization during training, it has enabled training on   
   larger datasets than was possible before it was introduced. This has led to   
   the development of pretrained systems such as BERT and GPT, which have been   
   trained with huge general language datasets, and can be fine-tuned to specific   
   language tasks.   
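The pretrain-then-fine-tune split can be caricatured in a toy NumPy sketch (everything here is a made-up stand-in: a fixed random projection plays the role of the frozen pretrained encoder, and only a small logistic-regression head is trained on the downstream task):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a pretrained encoder (BERT/GPT in practice): a fixed,
# frozen feature extractor whose weights are NOT updated below.
W_frozen = rng.normal(size=(8, 16))

def encode(X):
    return np.tanh(X @ W_frozen)    # frozen "pretrained" features

# Toy labelled data for a hypothetical downstream binary task.
X = rng.normal(size=(64, 8))
y = (X[:, 0] > 0).astype(float)

# "Fine-tuning" reduced to its simplest form: fit a small task head
# (logistic regression) on top of the frozen features.
H = encode(X)
w = np.zeros(16)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(H @ w)))     # predicted probabilities
    w -= 0.1 * H.T @ (p - y) / len(y)      # gradient step on the head only

train_acc = (((1.0 / (1.0 + np.exp(-(H @ w)))) > 0.5) == y.astype(bool)).mean()
```

Real fine-tuning usually also updates the encoder's own weights at a small learning rate; freezing them, as here, is just the simplest variant to show why a large pretrained model can be adapted with little task-specific data.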
      
   Why?   
   --- GoldED+/W32-MINGW 1.1.5-b20120519 (Kubik 3.0)   
    * Origin: In the beginning was the Word. In the end there will be the origin. (2:5064/54.1315)
   SEEN-BY: 1/123 50/109 90/1 120/340 123/131 221/0 6 226/30 227/114   
   SEEN-BY: 227/702 229/101 275 424 426 664 240/1120 1634 1895 2100 5138   
   SEEN-BY: 240/5411 5832 5853 8001 8002 8005 249/206 317 250/25 261/38   
   SEEN-BY: 280/5003 313/41 317/3 320/219 322/757 331/313 333/808 335/206   
   SEEN-BY: 335/364 370 342/200 382/147 450/1024 463/68 467/888 2454/119   
   SEEN-BY: 4500/1 5000/111 5001/100 5005/49 5015/42 46 5019/40 5020/830   
   SEEN-BY: 5020/846 1042 2140 4441 5053/51 5054/8 5064/41 54 56 5075/128   
   SEEN-BY: 5080/68 102 5083/1 444   
   PATH: 5064/54 5020/1042 221/6 335/364 240/1120 5832 229/426   
      



(c) 1994,  bbs@darkrealms.ca