Building A Recurrent Neural Network For Contextual Language Responses (IMAGE)
Caption
Contextual language encoding model with narrative stimuli. Each word in the story is first projected into a 985-dimensional embedding space. Sequences of word representations are then fed into an LSTM network that was pre-trained as a language model.
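The caption describes a two-stage architecture: a word-embedding lookup followed by an LSTM pre-trained as a language model. The sketch below is a minimal PyTorch illustration of that structure, not the Huth lab's implementation; the 985-dimensional embedding size is taken from the caption, while the vocabulary size, hidden size, and layer count are hypothetical placeholders.

```python
# Minimal sketch of the caption's architecture, assuming a PyTorch-style model.
# embed_dim=985 comes from the caption; vocab_size, hidden_dim, and num_layers
# are illustrative assumptions, not values from the original work.
import torch
import torch.nn as nn

class ContextualLanguageModel(nn.Module):
    def __init__(self, vocab_size=50_000, embed_dim=985, hidden_dim=985, num_layers=2):
        super().__init__()
        # Project each word in the story into a 985-dimensional embedding space.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # LSTM over the word sequence; in the described model this network is
        # pre-trained as a language model (next-word prediction).
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=num_layers, batch_first=True)
        # Output head used during language-model pre-training.
        self.decoder = nn.Linear(hidden_dim, vocab_size)

    def forward(self, word_ids, hidden=None):
        # word_ids: (batch, seq_len) integer token indices
        embedded = self.embedding(word_ids)            # (batch, seq_len, 985)
        outputs, hidden = self.lstm(embedded, hidden)  # contextual word representations
        logits = self.decoder(outputs)                 # next-word logits for pre-training
        return logits, outputs, hidden

# Usage: encode a short word-ID sequence and inspect the contextual features.
model = ContextualLanguageModel()
tokens = torch.randint(0, 50_000, (1, 12))  # one story snippet, 12 words
logits, contextual_features, _ = model(tokens)
print(contextual_features.shape)            # torch.Size([1, 12, 985])
```

In an encoding-model setting, the LSTM's hidden states (the `contextual_features` above) would serve as the stimulus features, while the language-model head is only used during pre-training.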
Credit
Huth lab, UT Austin
Usage Restrictions
None
License
Licensed content