M.S. AAI Capstone Chronicles 2024
Detecting Fake News Using Natural Language Processing

The previous cell state is updated by pointwise multiplication with the output of the forget gate, then added to the pointwise product of the input gate's output and the candidate cell state. This yields the final cell state: (forget gate output) * (previous cell state) + (input gate output) * (candidate cell state). Finally, the output gate determines the next hidden state, crucial for prediction, by passing the previous hidden state and current input through a sigmoid function and multiplying the result by a tanh of the updated cell state. The final products of this LSTM cell are the cell state and the updated hidden state (Phi, 2020). While we initially employed a unidirectional LSTM network, we achieved greater success with a bidirectional LSTM network. This architecture comprises interconnected LSTM cells that process information in both a forward and a backward direction, capturing context from past and future inputs. This is particularly advantageous for NLP tasks: for example, the word “bark” can have different meanings depending on context, and bidirectional LSTMs have the architecture to draw insight from context clues (Zhao, 2023).
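The gating arithmetic described above can be sketched in a few lines of NumPy. This is a minimal illustration, not our training code: the stacked weight matrices `W`, `U`, bias `b`, and the helper names `lstm_step` and `sigmoid` are hypothetical, and the parameter sizes are arbitrary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b stack the parameters for the
    forget (f), input (i), candidate (g), and output (o) gates."""
    n = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b      # pre-activations for all four gates
    f = sigmoid(z[0:n])               # forget gate
    i = sigmoid(z[n:2 * n])           # input gate
    g = np.tanh(z[2 * n:3 * n])       # candidate cell state
    o = sigmoid(z[3 * n:4 * n])       # output gate
    c_t = f * c_prev + i * g          # cell state update from the text
    h_t = o * np.tanh(c_t)            # output gate scales a tanh of c_t
    return h_t, c_t

# Tiny example with random parameters (sizes chosen for illustration).
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in))
U = rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.standard_normal(n_in), h, c, W, U, b)
```

Because the output gate lies in (0, 1) and tanh lies in (-1, 1), every component of the hidden state stays strictly inside (-1, 1), which is why the hidden state is well-suited to feed downstream prediction layers.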
Figure 6: A basic layout of the data flow between bidirectional LSTM cells. A previous input, current input, and future input are fed into a backward-facing LSTM network and then into a forward-facing LSTM network. Output values are passed into a sigmoid function.
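The forward/backward data flow in Figure 6 can be sketched as follows. To keep the example short, a simplified tanh recurrence stands in for each full LSTM cell; the helper names `run_direction` and `bidirectional` and the parameter shapes are assumptions for illustration only.

```python
import numpy as np

def run_direction(xs, W, U, b):
    """Run a simplified tanh recurrence (standing in for an LSTM cell)
    over the sequence xs, returning the hidden state at every step."""
    h = np.zeros(U.shape[0])
    states = []
    for x in xs:
        h = np.tanh(W @ x + U @ h + b)
        states.append(h)
    return states

def bidirectional(xs, params_fwd, params_bwd):
    """Process xs forward and backward, then concatenate the two
    hidden states at each time step, as in Figure 6."""
    fwd = run_direction(xs, *params_fwd)
    bwd = run_direction(xs[::-1], *params_bwd)[::-1]  # re-align to forward order
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

# Toy sequence of 5 inputs with arbitrary sizes.
rng = np.random.default_rng(1)
n_in, n_hid, T = 3, 4, 5
make_params = lambda: (rng.standard_normal((n_hid, n_in)),
                       rng.standard_normal((n_hid, n_hid)),
                       np.zeros(n_hid))
xs = [rng.standard_normal(n_in) for _ in range(T)]
out = bidirectional(xs, make_params(), make_params())
```

The concatenated state at each position carries context from both earlier and later inputs, which is what lets the network disambiguate a word like "bark" from its surroundings.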