Types of recurrent neural networks
Bidirectional recurrent neural networks
Long short-term memory
Gated recurrent units
Medical uses for recurrent neural networks
Recurrent neural networks and medical imaging
Recurrent neural networks and diagnostics
References
Further reading
Recurrent neural networks are a class of artificial neural networks used in artificial intelligence (AI), natural language processing (NLP), deep learning, and machine learning. They process temporal, sequential data, such as time-series data or language in the form of speech or text.
The networks function by creating connections between nodes, and these connections form a cycle. The cycle causes the output from specific nodes to affect the later inputs to those same nodes. The exact effect varies depending on the behavior and topology of the recurrent neural network.
Information moves across the network from one layer to another, each state influencing the next. It does so through the loops built into the network's connections. Because of these loops, recurrent neural networks can use memory to store previous computations. This ability is also what allows them to exhibit dynamic temporal behavior, such as processing time-sequence data.
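As an illustration, here is a minimal sketch of that recurrence in Python with NumPy; the names (rnn_forward, W_hh, and so on) are illustrative assumptions rather than the notation of any particular library.

```python
import numpy as np

# A minimal sketch of the recurrence at the core of a simple RNN.
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 8

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden weights: the "loop"
b_h = np.zeros(hidden_size)

def rnn_forward(inputs):
    """Step through a sequence: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b)."""
    h = np.zeros(hidden_size)      # the hidden state serves as the network's memory
    states = []
    for x_t in inputs:             # one step per element of the time sequence
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)   # previous computation feeds back in
        states.append(h)
    return states

sequence = rng.normal(size=(5, input_size))        # a toy time series of five steps
hidden_states = rnn_forward(sequence)
```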
Recurrent neural networks are modeled after the brain's neural connections, and their nodes are accordingly often referred to as neurons. They are a clear example of the intersection of fields such as neuroscience and computer science that characterizes cognitive science. Because of this, much of the language used to discuss recurrent neural networks is borrowed from neuroscience.
Types of recurrent neural networks
Recurrent neural networks can take on a number of different architectures. Each has its own characteristics and structure, allowing it to model a different kind of temporal relationship. The main types include bidirectional recurrent neural networks, long short-term memory networks, and gated recurrent units.
Bidirectional recurrent neural networks
Bidirectional recurrent neural networks combine two recurrent neural networks that are trained together. One network processes the sequence from start to end, while the other works in the opposite direction.
This bidirectional structure allows the model to draw on both past and future context when interpreting each element of a sequence. This feature sets it apart from other types of recurrent neural networks, and the dual nature of bidirectional recurrent neural networks makes them useful in circumstances where context is required.
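A minimal sketch of this idea, continuing the hypothetical rnn_forward example above (for simplicity the two directions share weights here; in practice each direction has its own):

```python
def bidirectional_forward(inputs):
    """Concatenate a start-to-end pass with an end-to-start pass at every time step."""
    forward_states = rnn_forward(inputs)               # reads the sequence start to end
    backward_states = rnn_forward(inputs[::-1])[::-1]  # reads end to start, then re-aligns
    # Each combined state reflects both past and future context for its time step.
    return [np.concatenate([f, b]) for f, b in zip(forward_states, backward_states)]
```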
Long short-term memory
Long short-term memory recurrent neural networks are designed to handle long time-series data, meaning that they can recall information collected many time steps earlier in a sequence.
This model has three gates: the input gate, the output gate, and the forget gate. These gates control functions of the network such as saving or removing memory.
The input gate decides which new information moves into the cell state. The output gate regulates which information is read out of the cell state and, in doing so, determines the network's next hidden state. Finally, the forget gate removes from the cell state any information deemed irrelevant or insignificant.
Through the cell state, the network automatically controls what it discards as irrelevant and what it retains as relevant features. The use of long short-term memory networks also helps prevent the vanishing gradient problem found in standard recurrent networks.
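The gate arithmetic can be summarized in a short sketch; the weight names and shapes below are illustrative assumptions, not the notation of any particular source.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM time step: three gates control what enters, leaves, and is erased from the cell state."""
    W_i, W_f, W_o, W_c, b_i, b_f, b_o, b_c = params
    z = np.concatenate([x_t, h_prev])    # current input joined with the previous hidden state

    i_t = sigmoid(W_i @ z + b_i)         # input gate: which new information moves into the cell state
    f_t = sigmoid(W_f @ z + b_f)         # forget gate: which cell-state contents to erase
    o_t = sigmoid(W_o @ z + b_o)         # output gate: which information is read out
    c_tilde = np.tanh(W_c @ z + b_c)     # candidate cell contents

    c_t = f_t * c_prev + i_t * c_tilde   # updated cell state: the long-term memory
    h_t = o_t * np.tanh(c_t)             # the output gate determines the next hidden state
    return h_t, c_t
```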
Gated recurrent units
Gated recurrent units are a gating mechanism used in recurrent neural networks, developed in 2014 as a simpler alternative to long short-term memory.
Gating mechanisms selectively update the hidden states of the network at hand. This controls the movement of information through the network at each time step, effectively dictating what information enters and exits the network.
Gated recurrent units have two gating mechanisms: the update gate and the reset gate. The update gate decides how much prior knowledge should be used in the future, while the reset gate selects how much past memory can or should be forgotten. Within the reset gate there is also the current memory gate, which manages the data in the system by ensuring that the input is zero-mean and by introducing non-linearity into the input.
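A minimal sketch of one gated-recurrent-unit step follows; the weight names are illustrative assumptions, not a standard library interface.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, params):
    """One GRU time step: the update and reset gates blend old memory with a candidate state."""
    W_z, W_r, W_h, b_z, b_r, b_h = params
    z = np.concatenate([x_t, h_prev])

    z_t = sigmoid(W_z @ z + b_z)     # update gate: how much prior knowledge to carry forward
    r_t = sigmoid(W_r @ z + b_r)     # reset gate: how much past memory to forget
    # Candidate ("current memory") state, computed from the input and the reset-scaled memory;
    # the tanh keeps it roughly zero-centered and introduces non-linearity.
    h_tilde = np.tanh(W_h @ np.concatenate([x_t, r_t * h_prev]) + b_h)

    return (1.0 - z_t) * h_prev + z_t * h_tilde   # blend the old state with the candidate
```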
Medical uses for recurrent neural networks
Recurrent neural networks play a role in innovation in medical image computing and computer-assisted intervention. Deep learning using recurrent neural networks can be applied to the diagnosis of disease, while specific forms of recurrent neural networks may be used to improve the quality of medical imaging.
Recurrent neural networks and medical imaging
Research has been conducted into the use of long short-term memory recurrent neural networks for denoising medical images. In 2019, such a method was tested for removing white noise and salt-and-pepper noise from lung computed tomography (CT) images, with high efficiency. This suggests the technology could be applied to improve the clarity and quality of medical scans.
Recurrent neural networks and diagnostics
A long short-term memory recurrent neural network model has been tested, with strong results, as an analytical tool for intensive care unit (ICU) admissions.
The model was trained to recognize 128 disorders and illnesses from a set of 13 clinical measurements drawn from electronic health records (EHRs). When used to diagnose patients in this setting, it outperformed alternative approaches.
References
- Hung, C. L. (2023). Deep learning in biomedical informatics. In Zheng, Y. & Wu, Z. (eds.), Intelligent Nanotechnology. Materials Today. Elsevier. pp. 307–329.
- Schuster, M. & Paliwal, K. K. (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45(11), 2673–2681.
- Rajeev, R., Samath, J. A., & Karthikeyan, N. K. (2019). An intelligent recurrent neural network with long short-term memory (LSTM) based batch normalization for medical image denoising. Journal of Medical Systems, 43, 1–10.
- Sharma, D. K., et al. (2022). Deep learning applications for disease diagnosis. In Gupta, D., et al. (eds.), Deep Learning for Medical Applications with Unique Data. Academic Press. pp. 31–51.
- *Lipton, Z. C., Kale, D. C., Elkan, C., et al. (2015). Learning to diagnose with LSTM recurrent neural networks. arXiv preprint arXiv:1511.03677.
- Medsker, L. R. & Jain, L. C. (2001). Recurrent Neural Networks: Design and Applications. CRC Press. pp. 64–67.
- *Salehinejad, H., Sankar, S., Barfett, J., et al. (2017). Recent advances in recurrent neural networks. arXiv preprint arXiv:1801.01078.
*Important notice: arXiv publishes preliminary scientific reports that are not peer reviewed; they should not be regarded as conclusive, used to guide clinical practice or health-related behavior, or treated as established information.
Further reading