Commit 2282a2e

Code files added
1 parent f33cc30 commit 2282a2e

# Understanding the LSTM cell

What makes LSTM cells so special? How do LSTM cells capture long-term dependencies? How does the cell know what information to keep in, and what information to discard from, the memory?

This is all achieved by special structures called gates. As shown in the following figure, a typical LSTM cell consists of three gates: the input gate, the output gate, and the forget gate:

![image](images/1.png)

These three gates are responsible for deciding what information to add to, output from, and forget from the memory. With these gates, an LSTM effectively keeps information in the memory only as long as it is required.

In an RNN cell, we used the hidden state $h_t$ for two purposes: storing information and making predictions. Unlike the RNN, the LSTM cell splits the hidden state into two states, called the cell state and the hidden state:

* The cell state, also called the internal memory, is where all the information is stored.
* The hidden state is used for computing the output.

Both the cell state and the hidden state are shared across every time step. Now we will dive deep into the LSTM cell and see exactly how these gates are used and how the hidden state is computed.

## Forget Gate

The forget gate $f_t$ is responsible for deciding what information should be removed from the cell state (memory).

Consider the following sentences: *Harry is a good singer. He lives in New York. Zayn is also a good singer.*

As soon as we start talking about Zayn, the network understands that the subject has changed from Harry to Zayn, and that the information about Harry is no longer required. The forget gate then removes (forgets) the information about Harry from the cell state.

The forget gate is controlled by a sigmoid function. At time step $t$, we pass the input $x_t$ and the previous hidden state $h_{t-1}$ to the forget gate. It returns a value close to 0 if a particular piece of information should be removed from the cell state, and a value close to 1 if the information should be kept. The forget gate $f$ at time step $t$ is expressed as follows:

$$f_{t}=\sigma\left(U_{f} x_{t}+W_{f} h_{t-1}+b_{f}\right)$$

Where:

* $U_f$ is the input-to-hidden weights of the forget gate
* $W_f$ is the hidden-to-hidden weights of the forget gate
* $b_f$ is the bias of the forget gate

The following figure shows the forget gate. As you can see, the input $x_t$ is multiplied with $U_f$, and the previous hidden state $h_{t-1}$ is multiplied with $W_f$; both are added together and passed to the sigmoid function, which returns values between 0 and 1.

![image](images/2.png)
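The forget-gate equation above can be sketched in NumPy. This is a minimal illustration, not the book's code: the sizes (hidden size 4, input size 3) and the random weights are assumptions chosen only for demonstration.

```python
import numpy as np

def sigmoid(z):
    # squashes any real value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

hidden_size, input_size = 4, 3
rng = np.random.default_rng(0)

# illustrative weights for the forget gate, matching the equation above
U_f = rng.standard_normal((hidden_size, input_size))   # input-to-hidden
W_f = rng.standard_normal((hidden_size, hidden_size))  # hidden-to-hidden
b_f = np.zeros(hidden_size)                            # bias

x_t = rng.standard_normal(input_size)   # current input x_t
h_prev = np.zeros(hidden_size)          # previous hidden state h_{t-1}

# f_t = sigmoid(U_f x_t + W_f h_{t-1} + b_f)
f_t = sigmoid(U_f @ x_t + W_f @ h_prev + b_f)
print(f_t)  # every entry lies strictly between 0 and 1
```

Each entry of `f_t` acts as a per-dimension "keep fraction" for the corresponding entry of the cell state.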
## Input Gate

The input gate is responsible for deciding what information should be stored in the cell state.

Let's consider the same example: *Harry is a good singer. He lives in New York. Zayn is also a good singer.*

After the forget gate removes information from the cell state, the input gate decides what information to keep in the memory. Here, since the information about Harry is removed from the cell state by the forget gate, the input gate decides to update the cell state with the information about Zayn.

Similar to the forget gate, the input gate is controlled by a sigmoid function, which returns values between 0 and 1. A value close to 1 means the particular information will be stored/updated in the cell state, and a value close to 0 means it will not. The input gate $i$ at time step $t$ is expressed as follows:

$$i_{t}=\sigma\left(U_{i} x_{t}+W_{i} h_{t-1}+b_{i}\right)$$

Where:

* $U_i$ is the input-to-hidden weights of the input gate
* $W_i$ is the hidden-to-hidden weights of the input gate
* $b_i$ is the bias of the input gate

The following figure shows the input gate:

![image](images/3.png)
## Output gate

We will have a lot of information in the cell state (memory). The output gate is responsible for deciding what information should be taken from the cell state to give as the output.

Consider the following sentences: *Zayn's debut album was a huge success. Congrats ____.*

The output gate will look at all the information in the cell state and select the correct information to fill the blank. Here, the blank refers to the person being congratulated, so the output gate will predict *Zayn* (a noun) to fill the blank. Similar to the other gates, it is controlled by a sigmoid function. The output gate $o$ at time step $t$ is expressed as follows:

$$o_{t}=\sigma\left(U_{o} x_{t}+W_{o} h_{t-1}+b_{o}\right)$$

Where:

* $U_o$ is the input-to-hidden weights of the output gate
* $W_o$ is the hidden-to-hidden weights of the output gate
* $b_o$ is the bias of the output gate

The output gate is shown in the following figure:

![image](images/4.png)
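Notice that the three gate equations share exactly the same functional form; only the weights differ. A small sketch makes this explicit (the shapes, seed, and the helper name `gate` are illustrative assumptions, not anything from the book):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gate(U, W, b, x_t, h_prev):
    # generic gate: sigmoid(U x_t + W h_{t-1} + b)
    return sigmoid(U @ x_t + W @ h_prev + b)

hidden_size, input_size = 4, 3
rng = np.random.default_rng(1)
x_t = rng.standard_normal(input_size)
h_prev = np.zeros(hidden_size)

# one (U, W, b) triple per gate: forget (f), input (i), output (o)
params = {name: (rng.standard_normal((hidden_size, input_size)),
                 rng.standard_normal((hidden_size, hidden_size)),
                 np.zeros(hidden_size))
          for name in ("f", "i", "o")}

f_t = gate(*params["f"], x_t, h_prev)  # forget gate
i_t = gate(*params["i"], x_t, h_prev)  # input gate
o_t = gate(*params["o"], x_t, h_prev)  # output gate
```

Because the form is shared, real implementations often stack the three gates (plus the candidate state) into one big matrix multiplication for speed; the separate triples here simply mirror the per-gate equations in the text.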
## Updating the cell state

We just learned how all three gates in the LSTM work. But the question is: how do we actually update the cell state, adding the relevant new information and deleting the information that is no longer required, with the help of these gates?

__First, we will see how to add new relevant information to the cell state:__

To hold all the new information that can be added to the cell state, we create a new vector called $g_t$, known as the candidate state or internal state vector. Unlike the gates, which are regulated by the sigmoid function, the candidate state is regulated by the tanh function. But why? The sigmoid function returns values between 0 and 1, that is, it is always positive. We need to allow the values of $g_t$ to be either positive or negative, so we use the tanh function, which returns values between -1 and +1. The candidate state $g$ at time $t$ is expressed as follows:

$$g_{t}=\tanh \left(U_{g} x_{t}+W_{g} h_{t-1}+b_{g}\right)$$

Where:

* $U_g$ is the input-to-hidden weights of the candidate state
* $W_g$ is the hidden-to-hidden weights of the candidate state
* $b_g$ is the bias of the candidate state

Thus, the candidate state holds all the new information that can be added to the memory, as shown in the following figure:

![image](images/5.png)

But how do we decide whether the information in the candidate state is relevant? How do we decide whether or not to add the new information in the candidate state to the cell state? We learned that the input gate is responsible for deciding whether to add new information to the cell state. So, if we multiply $g_t$ and $i_t$, we get only the relevant information to be added to the memory.

That is, the input gate returns a value close to 0 if the information is not required and close to 1 if it is. Say $i_t = 0$; then multiplying $g_t$ and $i_t$ gives 0, which means the information in $g_t$ is not required and we don't update the cell state with $g_t$. When $i_t = 1$, multiplying $g_t$ and $i_t$ gives $g_t$, which implies we can update the cell state with the information in $g_t$.

Adding the new information to the cell state with the input gate $i_t$ and the candidate state $g_t$ is shown in the following figure:

![image](images/6.png)
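The two extremes just described ($i_t = 0$ blocks the candidate, $i_t = 1$ passes it through) can be checked numerically. A minimal sketch, where the shapes and random weights are illustrative assumptions:

```python
import numpy as np

hidden_size, input_size = 4, 3
rng = np.random.default_rng(2)

# illustrative candidate-state weights, matching the equation above
U_g = rng.standard_normal((hidden_size, input_size))
W_g = rng.standard_normal((hidden_size, hidden_size))
b_g = np.zeros(hidden_size)

x_t = rng.standard_normal(input_size)
h_prev = np.zeros(hidden_size)

# g_t = tanh(U_g x_t + W_g h_{t-1} + b_g): values lie in (-1, 1),
# so candidate information can be either positive or negative
g_t = np.tanh(U_g @ x_t + W_g @ h_prev + b_g)

# the input gate i_t decides how much of g_t enters the memory
i_t = np.zeros(hidden_size)
assert np.allclose(i_t * g_t, 0.0)  # i_t = 0: g_t is blocked entirely
i_t = np.ones(hidden_size)
assert np.allclose(i_t * g_t, g_t)  # i_t = 1: g_t passes through unchanged
```

In practice $i_t$ takes fractional values between these extremes, scaling how much of each candidate entry is written into memory.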
__Now, we will see how to remove information from the previous cell state that is no longer required.__

We learned that the forget gate is used for removing information that is not required in the cell state. So, if we multiply the previous cell state $c_{t-1}$ and the forget gate $f_t$, we retain only the relevant information in the cell state.

Say $f_t = 0$; then multiplying $c_{t-1}$ and $f_t$ gives 0, which means the information in the cell state $c_{t-1}$ is not required and should be removed (forgotten). When $f_t = 1$, multiplying $c_{t-1}$ and $f_t$ gives $c_{t-1}$, which implies that the information in the previous cell state is required and should not be removed.

Removing information from the previous cell state $c_{t-1}$ with the forget gate $f_t$ is shown in the following figure:

![image](images/7.png)
Thus, in a nutshell, we update the cell state by multiplying $g_t$ and $i_t$ to add new information, and by multiplying $c_{t-1}$ and $f_t$ to remove information. We can express the cell state equation as follows:

$$c_{t}=f_{t} c_{t-1}+i_{t} g_{t}$$
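The cell-state equation translates directly into elementwise NumPy operations. The random values below are stand-ins, chosen only to exercise the update and the two forget-gate extremes discussed above:

```python
import numpy as np

rng = np.random.default_rng(3)
hidden_size = 4

c_prev = rng.standard_normal(hidden_size)        # previous cell state c_{t-1}
f_t = rng.uniform(0, 1, hidden_size)             # forget gate output in (0, 1)
i_t = rng.uniform(0, 1, hidden_size)             # input gate output in (0, 1)
g_t = np.tanh(rng.standard_normal(hidden_size))  # candidate state in (-1, 1)

# c_t = f_t * c_{t-1} + i_t * g_t  (all products are elementwise)
c_t = f_t * c_prev + i_t * g_t

# sanity checks for the two forget-gate extremes
assert np.allclose(np.zeros(hidden_size) * c_prev, 0.0)     # f_t = 0 forgets everything
assert np.allclose(np.ones(hidden_size) * c_prev, c_prev)   # f_t = 1 keeps everything
```

Note that all the multiplications here are elementwise (Hadamard) products, not matrix products: each memory dimension is forgotten and updated independently.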
## Updating the hidden state

We just learned how the information in the cell state is updated. Now we will see how the information in the hidden state $h_t$ is updated. We learned that the hidden state is used for computing the output. But how can we compute the output?

We know that the output gate is responsible for deciding what information should be taken from the cell state to give as the output. Thus, multiplying $o_t$ with the tanh of the cell state, $\tanh(c_t)$ (to squash it between -1 and +1), gives the output. The hidden state $h_t$ is expressed as follows:

$$h_{t}=o_{t} \tanh \left(c_{t}\right)$$

The following figure shows how the hidden state $h_t$ is computed by multiplying $o_t$ and $\tanh(c_t)$:

![image](images/8.png)

And finally, once we have the hidden state value, we can apply the softmax function and compute $\hat{y}_t$ as shown:

$$\hat{y}_{t}=\operatorname{softmax}\left(V h_{t}\right)$$

Where $V$ is the hidden-to-output layer weights.

In the next section, we will see how exactly forward propagation is performed in the LSTM cell.
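Putting all the pieces of this section together, one full LSTM time step might be sketched as follows. This is a hedged illustration, not the book's forward-propagation code: the parameter dictionary `p`, the helper names, and all sizes are assumptions for demonstration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def lstm_step(x_t, h_prev, c_prev, p):
    # one LSTM time step, following the equations in this section
    f_t = sigmoid(p["U_f"] @ x_t + p["W_f"] @ h_prev + p["b_f"])  # forget gate
    i_t = sigmoid(p["U_i"] @ x_t + p["W_i"] @ h_prev + p["b_i"])  # input gate
    o_t = sigmoid(p["U_o"] @ x_t + p["W_o"] @ h_prev + p["b_o"])  # output gate
    g_t = np.tanh(p["U_g"] @ x_t + p["W_g"] @ h_prev + p["b_g"])  # candidate state
    c_t = f_t * c_prev + i_t * g_t   # update the cell state
    h_t = o_t * np.tanh(c_t)         # update the hidden state
    y_hat = softmax(p["V"] @ h_t)    # output prediction
    return h_t, c_t, y_hat

hidden, inputs, outputs = 4, 3, 5
rng = np.random.default_rng(4)
p = {f"U_{k}": rng.standard_normal((hidden, inputs)) for k in "fiog"}
p.update({f"W_{k}": rng.standard_normal((hidden, hidden)) for k in "fiog"})
p.update({f"b_{k}": np.zeros(hidden) for k in "fiog"})
p["V"] = rng.standard_normal((outputs, hidden))

h_t, c_t, y_hat = lstm_step(rng.standard_normal(inputs),
                            np.zeros(hidden), np.zeros(hidden), p)
```

Running the step returns the new hidden state, the new cell state, and a probability vector `y_hat` that sums to 1, ready to be fed into the next time step.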
