
… featuring forget, input, and output gates with sigmoid activation. Any value that is multiplied by zero is forgotten, while all other values are kept and cascade down the cell and network (Staudemeyer & Morris, 2019).
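As a minimal illustration of this gating behavior (a sketch for exposition only, not the project's implementation), the element-wise product below shows how sigmoid outputs near zero erase values while outputs near one let them pass through unchanged:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical cell-state values and raw gate pre-activations (illustrative numbers).
cell_values = np.array([0.8, -1.2, 0.5, 2.0])
gate_logits = np.array([-10.0, 10.0, 0.0, 10.0])  # large negative logit -> gate output near 0

gate = sigmoid(gate_logits)   # approximately [0.0, 1.0, 0.5, 1.0]
kept = gate * cell_values     # values multiplied by ~0 are forgotten; the rest cascade onward
print(kept)                   # approximately [0.0, -1.2, 0.25, 2.0]
```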

Figure 5: A general layout of an LSTM cell. The inputs are the previous cell state ($c_{t-1}$), the previous hidden state ($h_{t-1}$), and the input at timestep $t$ ($x_t$). The cell state acts as a highway that transfers relevant information down the cell. As the highway continues, information is added or removed through the other gates. The hidden state acts as the cell's memory, containing gates that regulate important information. The forget gate, the first step in an LSTM cell, processes the previous hidden state $h_{t-1}$ and the input $x_t$ through a sigmoid function, yielding the forget gate output $f_t$. The input gate updates the cell state, with the previous hidden state $h_{t-1}$ and input $x_t$ passing through it. Simultaneously, the same information passes through a tanh function to create candidate values for the cell state. The input gate output and candidate values are then pointwise multiplied to update the cell state. The previous hidden state …
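To make the sequence of operations in Figure 5 concrete, the following is a minimal NumPy sketch of one LSTM timestep; the weight and bias names ($W_f$, $W_i$, $W_c$, $W_o$, $b_f$, $b_i$, $b_c$, $b_o$) are illustrative assumptions, not parameters taken from the project code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W_f, W_i, W_c, W_o, b_f, b_i, b_c, b_o):
    """One LSTM timestep following the order of gates described in Figure 5."""
    concat = np.concatenate([h_prev, x_t])    # combine previous hidden state and current input

    f_t = sigmoid(W_f @ concat + b_f)         # forget gate: decides what to erase from the cell state
    i_t = sigmoid(W_i @ concat + b_i)         # input gate: decides which candidate values to admit
    c_hat = np.tanh(W_c @ concat + b_c)       # candidate values for the cell state

    c_t = f_t * c_prev + i_t * c_hat          # update the cell-state "highway"

    o_t = sigmoid(W_o @ concat + b_o)         # output gate: decides what the cell exposes
    h_t = o_t * np.tanh(c_t)                  # new hidden state (the cell's memory output)
    return h_t, c_t
```

In practice, deep learning frameworks provide this cell as a built-in LSTM layer; the sketch above is only meant to mirror the per-gate steps shown in the figure.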
