**************************************************************

**Author**: `Sean Robertson <https://github.com/spro>`_

- We will be building and training a basic character-level RNN to classify
- words. This tutorial, along with the following two, show how to
- preprocess data for NLP modeling "from scratch", in particular not using
- many of the convenience functions of `torchtext`, so you can see how
- preprocessing for NLP modeling works at a low level.
+ We will be building and training a basic character-level Recurrent Neural
+ Network (RNN) to classify words. This tutorial, along with two other
+ Natural Language Processing (NLP) "from scratch" tutorials,
+ :doc:`/intermediate/char_rnn_generation_tutorial` and
+ :doc:`/intermediate/seq2seq_translation_tutorial`, shows how to
+ preprocess data for NLP modeling. In particular, these tutorials do not
+ use many of the convenience functions of `torchtext`, so you can see how
+ preprocessing for NLP modeling works at a low level.

A character-level RNN reads words as a series of characters -
outputting a prediction and "hidden state" at each step, feeding its

(-2.68) Dutch
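The per-character processing described above can be sketched as follows. This is a hedged sketch, not the tutorial's exact code: the alphabet and the one-hot layout here are assumptions chosen to match the `lineToTensor` helper referenced later in the diff.

```python
import string
import torch

# Assumed alphabet (ASCII letters plus a few punctuation marks); the real
# tutorial builds its alphabet the same way, but treat this as illustrative.
all_letters = string.ascii_letters + " .,;'"
n_letters = len(all_letters)  # 57 with this assumed alphabet

def line_to_tensor(line):
    # One-hot encode each character of a word:
    # result has shape (len(line), 1, n_letters), one time step per character.
    tensor = torch.zeros(len(line), 1, n_letters)
    for i, ch in enumerate(line):
        tensor[i][0][all_letters.find(ch)] = 1
    return tensor

t = line_to_tensor("Dutch")  # 5 characters -> 5 time steps
```

The RNN then consumes this tensor one time step at a time, producing a prediction and a new hidden state at each step.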

- **Recommended Reading:**
+ Recommended Preparation
+ =======================

- I assume you have at least installed PyTorch, know Python, and
- understand Tensors:
+ Before starting this tutorial it is recommended that you have installed PyTorch,
+ and have a basic understanding of the Python programming language and Tensors:

-  https://pytorch.org/ for installation instructions
-  :doc:`/beginner/deep_learning_60min_blitz` to get started with PyTorch in general
+   and learn the basics of Tensors
-  :doc:`/beginner/pytorch_with_examples` for a wide and deep overview
-  :doc:`/beginner/former_torchies_tutorial` if you are a former Lua Torch user

@@ -181,10 +186,6 @@ def lineToTensor(line):
# is just 2 linear layers which operate on an input and hidden state, with
# a ``LogSoftmax`` layer after the output.
#
- # .. figure:: https://i.imgur.com/Z2xbySO.png
- #    :alt:
- #
- #

import torch.nn as nn

@@ -195,13 +196,13 @@ def __init__(self, input_size, hidden_size, output_size):
        self.hidden_size = hidden_size

        self.i2h = nn.Linear(input_size + hidden_size, hidden_size)
-        self.i2o = nn.Linear(input_size + hidden_size, output_size)
+        self.h2o = nn.Linear(hidden_size, output_size)
        self.softmax = nn.LogSoftmax(dim=1)

    def forward(self, input, hidden):
        combined = torch.cat((input, hidden), 1)
        hidden = self.i2h(combined)
-        output = self.i2o(combined)
+        output = self.h2o(hidden)
        output = self.softmax(output)
        return output, hidden
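For reference, the revised module from the hunk above can be exercised end-to-end as follows. The class body mirrors the "+" lines of the diff; the layer sizes (`n_letters`, `n_hidden`, `n_categories`) are illustrative assumptions, not values taken from this commit.

```python
import torch
import torch.nn as nn

class RNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        self.hidden_size = hidden_size
        self.i2h = nn.Linear(input_size + hidden_size, hidden_size)
        self.h2o = nn.Linear(hidden_size, output_size)  # output now derived from hidden state
        self.softmax = nn.LogSoftmax(dim=1)

    def forward(self, input, hidden):
        combined = torch.cat((input, hidden), 1)  # concatenate along the feature dim
        hidden = self.i2h(combined)
        output = self.h2o(hidden)                 # was self.i2o(combined) before this commit
        output = self.softmax(output)
        return output, hidden

# Illustrative sizes: 57-letter alphabet, 128 hidden units, 18 language categories.
n_letters, n_hidden, n_categories = 57, 128, 18
rnn = RNN(n_letters, n_hidden, n_categories)

hidden = torch.zeros(1, n_hidden)         # initial hidden state
x = torch.zeros(1, n_letters)
x[0][0] = 1                               # one-hot vector for a single character
output, hidden = rnn(x, hidden)           # output: log-probabilities, shape (1, n_categories)
```

Feeding `hidden` back in on the next character is what makes this a recurrent network; the change in this commit only reroutes the output head through the freshly computed hidden state.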