A probabilistic neural network (PNN)[1] is a feedforward neural network widely used in classification and pattern-recognition problems. In the PNN algorithm, the parent probability distribution function (PDF) of each class is approximated by a Parzen window, a non-parametric density-estimation method. Using the estimated PDF of each class, the class-conditional probability of a new input is computed, and Bayes' rule is then employed to assign the new input to the class with the highest posterior probability. This minimizes the probability of misclassification.[2] This type of artificial neural network (ANN) was derived from the Bayesian network[3] and a statistical algorithm called Kernel Fisher discriminant analysis.[4] It was introduced by D. F. Specht in 1966.[5][6] In a PNN, the operations are organized into a multilayered feedforward network with four layers:

  • Input layer
  • Pattern layer
  • Summation layer
  • Output layer
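As a concrete illustration, the four layers can be sketched in Python with NumPy. The Gaussian Parzen kernel, equal class priors, and the single smoothing parameter `sigma` are assumptions of this sketch, not requirements of the architecture:

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Classify input x with a probabilistic neural network.

    train_X: (n_samples, n_features) stored training cases (pattern layer)
    train_y: (n_samples,) integer class labels
    sigma:   smoothing parameter of the Gaussian Parzen window
    """
    # Pattern layer: Euclidean distance from x to every training case,
    # passed through a Gaussian radial basis kernel.
    d2 = np.sum((train_X - x) ** 2, axis=1)
    activations = np.exp(-d2 / (2.0 * sigma ** 2))

    # Summation layer: average the kernel values per class to estimate
    # each class-conditional density at x.
    classes = np.unique(train_y)
    densities = np.array([activations[train_y == c].mean() for c in classes])

    # Output layer: Bayes' rule with equal priors reduces to picking
    # the class with the largest estimated density.
    return classes[np.argmax(densities)]
```

Because all training cases are stored as pattern neurons, "training" amounts to memorizing the data; all the work happens at classification time.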

Layers

PNNs are often used in classification problems.[7] When an input is presented, the first layer computes the distance from the input vector to each of the stored training vectors. This produces a vector whose elements indicate how close the input is to each training case. The second layer sums these contributions for each class of inputs and produces as its net output a vector of probabilities. Finally, a competitive transfer function applied to the output of the second layer picks the maximum of these probabilities, and produces a 1 (positive identification) for that class and a 0 (negative identification) for the non-targeted classes.

Input layer

Each neuron in the input layer represents a predictor variable. For a categorical variable with N categories, N−1 neurons are used. The input layer standardizes the range of the values by subtracting the median and dividing by the interquartile range. The input neurons then feed the values to each of the neurons in the hidden (pattern) layer.
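The median/IQR standardization described above can be sketched as follows. This is a minimal illustration; in practice the median and IQR are computed once on the training set and reused for new inputs:

```python
import numpy as np

def robust_standardize(X):
    """Scale each predictor by subtracting its median and dividing
    by its interquartile range (IQR), as done in the input layer."""
    median = np.median(X, axis=0)
    q75, q25 = np.percentile(X, [75, 25], axis=0)
    iqr = q75 - q25
    return (X - median) / iqr
```

Using the median and IQR rather than the mean and standard deviation makes the scaling robust to outliers, which is consistent with the PNN's overall insensitivity to them.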

Pattern layer

This layer contains one neuron for each case in the training data set. Each neuron stores the values of the case's predictor variables along with its target value. A pattern neuron computes the Euclidean distance of the test case from the neuron's center point (the stored training case) and then applies a radial basis function kernel with smoothing parameter sigma.
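With a Gaussian kernel (the common choice), a single pattern neuron's activation for a test case can be sketched as:

```python
import numpy as np

def pattern_activation(x, center, sigma):
    """Euclidean distance from the test case x to the neuron's stored
    center (a training case), passed through a Gaussian radial basis
    function with smoothing parameter sigma."""
    d = np.linalg.norm(x - center)
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))
```

The activation is 1 when the test case coincides with the stored case and decays toward 0 as the distance grows; sigma controls how quickly it decays.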

Summation layer

For a PNN there is one summation neuron for each category of the target variable. The actual target category of each training case is stored with its pattern neuron; the weighted value coming out of a pattern neuron is fed only to the summation neuron that corresponds to that pattern neuron's category. Each summation neuron adds the values for the class it represents.
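Sketched in code, the summation step simply accumulates pattern-neuron outputs per class:

```python
import numpy as np

def summation_layer(activations, labels):
    """Sum pattern-neuron outputs per class; each pattern neuron
    contributes only to the summation neuron of its own class.

    activations: (n_samples,) pattern-layer outputs
    labels:      (n_samples,) class label of each pattern neuron
    Returns the class labels and the per-class sums.
    """
    classes = np.unique(labels)
    sums = np.array([activations[labels == c].sum() for c in classes])
    return classes, sums
```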

Output layer

The output layer compares the weighted votes for each target category accumulated in the summation layer and uses the largest vote to predict the target category.
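The competitive step that produces a 1 for the winning class and a 0 for every other class reduces to an argmax:

```python
import numpy as np

def compete(class_scores):
    """Return a one-hot vector: 1 for the class with the largest
    accumulated vote, 0 for all others."""
    out = np.zeros_like(class_scores)
    out[np.argmax(class_scores)] = 1
    return out
```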

Advantages

There are several advantages and disadvantages to using a PNN instead of a multilayer perceptron (MLP):[8]

  • PNNs are much faster to train than multilayer perceptron networks.
  • PNNs can be more accurate than multilayer perceptron networks.
  • PNNs are relatively insensitive to outliers.
  • PNNs generate accurate predicted target probability scores.
  • PNNs approach Bayes-optimal classification.

Disadvantages

  • PNNs are slower than multilayer perceptron networks at classifying new cases.
  • PNNs require more memory space to store the model.

Applications based on PNN

  • Probabilistic neural networks in modelling structural deterioration of stormwater pipes.[9]
  • Probabilistic neural networks in gastric endoscope sample diagnosis based on FTIR spectroscopy.[10]
  • Probabilistic neural networks in population pharmacokinetics.[11]
  • Probabilistic neural networks in the class prediction of leukemia and embryonal tumor of the central nervous system.[12]
  • Ship identification using probabilistic neural networks.[13]
  • Probabilistic-neural-network-based sensor configuration management in a wireless ad hoc network.[14]
  • Probabilistic neural networks in character recognition.
  • Remote-sensing image classification.[15]

References

  1. Mohebali, Behshad; Tahmassebi, Amirhessam; Meyer-Baese, Anke; Gandomi, Amir H. (2020). Probabilistic neural networks: a brief overview of theory, implementation, and application. Elsevier. pp. 347–367. doi:10.1016/B978-0-12-816514-0.00014-X. S2CID 208119250.
  2. Zeinali, Yasha; Story, Brett A. (2017). "Competitive probabilistic neural network". Integrated Computer-Aided Engineering. 24 (2): 105–118. doi:10.3233/ICA-170540.
  3. "Probabilistic Neural Networks". Archived from the original on 2010-12-18. Retrieved 2012-03-22.
  4. "Archived copy" (PDF). Archived from the original (PDF) on 2012-01-31. Retrieved 2012-03-22.{{cite web}}: CS1 maint: archived copy as title (link)
  5. Specht, D. F. (1967-06-01). "Generation of Polynomial Discriminant Functions for Pattern Recognition". IEEE Transactions on Electronic Computers. EC-16 (3): 308–319. doi:10.1109/PGEC.1967.264667. ISSN 0367-7508.
  6. Specht, D. F. (1990). "Probabilistic neural networks". Neural Networks. 3: 109–118. doi:10.1016/0893-6080(90)90049-Q.
  7. "Probabilistic Neural Networks :: Radial Basis Networks (Neural Network Toolbox™)". www.mathworks.in. Archived from the original on 4 August 2012. Retrieved 6 June 2022.
  8. "Probabilistic and General Regression Neural Networks". Archived from the original on 2012-03-02. Retrieved 2012-03-22.
  9. Tran, D. H.; Ng, A. W. M.; Perera, B. J. C.; Burn, S.; Davis, P. (September 2006). "Application of probabilistic neural networks in modelling structural deterioration of stormwater pipes" (PDF). Urban Water Journal. 3 (3): 175–184. doi:10.1080/15730620600961684. S2CID 15220500. Archived from the original (PDF) on 8 August 2017. Retrieved 27 February 2023.
  10. Li, Q. B.; Li, X.; Zhang, G. J.; Xu, Y. Z.; Wu, J. G.; Sun, X. J. (2009). "[Application of probabilistic neural networks method to gastric endoscope samples diagnosis based on FTIR spectroscopy]". Guang Pu Xue Yu Guang Pu Fen Xi. 29 (6): 1553–7. PMID 19810529.
  11. Berno, E.; Brambilla, L.; Canaparo, R.; Casale, F.; Costa, M.; Della Pepa, C.; Eandi, M.; Pasero, E. (2003). "Application of probabilistic neural networks to population pharmacokineties". Proceedings of the International Joint Conference on Neural Networks, 2003. pp. 2637–2642. doi:10.1109/IJCNN.2003.1223983. ISBN 0-7803-7898-9. S2CID 60477107.
  12. Huang, Chenn-Jung; Liao, Wei-Chen (2004). "Application of Probabilistic Neural Networks to the Class Prediction of Leukemia and Embryonal Tumor of Central Nervous System". Neural Processing Letters. 19 (3): 211–226. doi:10.1023/B:NEPL.0000035613.51734.48. S2CID 5651402.
  13. Araghi, Leila Fallah; Khaloozadeh, Hamid; Arvan, Mohammad Reza (19 March 2009). "Ship Identification Using Probabilistic Neural Networks (PNN)" (PDF). Proceedings of the International MultiConference of Engineers and Computer Scientists. Hong Kong, China. 2. Retrieved 27 February 2023.
  14. "Archived copy" (PDF). Archived from the original (PDF) on 2010-06-14. Retrieved 2012-03-22.{{cite web}}: CS1 maint: archived copy as title (link)
  15. Zhang, Y. (2009). "Remote-sensing Image Classification Based on an Improved Probabilistic Neural Network". Sensors. 9 (9): 7516–7539. Bibcode:2009Senso...9.7516Z. doi:10.3390/s90907516. PMC 3290485. PMID 22400006.