REVIEW OF COMPUTER VISION TECHNOLOGIES BASED ON DEEP NEURAL NETWORKS FOR USE IN RAIL TRANSPORT
DOI: https://doi.org/10.35546/kntu2078-4481.2025.3.2.48
Keywords: computer vision, DCNN, problems, classification, architecture, filter (kernel), activation function, deep learning, transfer learning, loss function
Abstract
This paper reviews computer vision technologies based on deep neural networks for use in railway transport. Under the conditions of military aggression, the railway infrastructure of Ukraine requires modernization, which involves the use of computer vision systems based on deep neural networks to ensure accuracy and safety. For computer vision tasks in railway transport (classification and identification of objects, as well as their tracking), the use of DCNNs (Deep Convolutional Neural Networks) is recommended. The review of computer vision technologies built on deep convolutional neural networks made it possible to identify their problems, to describe the general architecture and the purpose of its layers (convolutional layer; pooling layer; activation layer; fully connected layer; normalization layer; dropout layer; dense layer), and to summarize the results of the corresponding analysis by Ukrainian and foreign researchers according to the classification of such networks: based on spatial exploitation; based on depth; with multiple connections; based on feature-map exploitation (squeeze-and-excitation); based on channel boosting; and based on the attention mechanism. At the present stage, the TensorFlow software environment is most often used to implement DCNNs in computer vision tasks. The results of this review can serve as a methodological basis for the modernization of the railway infrastructure of Ukraine under martial law.
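To make the layer structure listed above concrete, a minimal illustrative sketch of a DCNN classifier in the TensorFlow (Keras) environment follows. It is not taken from the reviewed works: the input resolution, filter counts, dropout rate and number of object classes are assumptions chosen only to show how the convolutional, pooling, activation, normalization, dropout and dense (fully connected) layers are typically combined.

# Minimal DCNN sketch in TensorFlow/Keras; all hyperparameters are illustrative assumptions.
import tensorflow as tf

NUM_CLASSES = 5  # assumed number of object classes (e.g. track-component categories)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),                        # assumed 64x64 RGB input
    tf.keras.layers.Conv2D(32, (3, 3), padding="same"),       # convolutional layer (filters/kernels)
    tf.keras.layers.BatchNormalization(),                      # normalization layer
    tf.keras.layers.Activation("relu"),                        # activation layer
    tf.keras.layers.MaxPooling2D((2, 2)),                      # pooling layer
    tf.keras.layers.Conv2D(64, (3, 3), padding="same"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Dropout(0.25),                             # dropout layer
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),             # fully connected (dense) layer
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # output layer for classification
])

# Loss function and optimizer are chosen here only for illustration.
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

In a transfer-learning setting, the convolutional part of such a network is typically replaced by a pre-trained backbone (e.g. VGG) with frozen weights, and only the dense layers are retrained on railway imagery.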
References
Hanhalo I. M., Lisovyi D. O., Zhebka V. V. Object recognition using computer vision technologies. Telecommunication and Information Technologies. 2022. No. 4(77). Pp. 46–52. DOI: 10.31673/2412-4338.2022.044652
Zinchenko O. V., Zvenihorodskyi O. S., Kysil T. M. Convolutional neural networks for solving computer vision problems. Telecommunication and Information Technologies. 2022. No. 2(75). Pp. 4–12. DOI: 10.31673/2412-4338.2022.020411
Mamuta M. S., Kravchenko I. V., Mamuta O. D., Tuzhanskyi S. Ye. Evaluation of image classification for transfer learning in convolutional neural networks. Optoelectronic Information-Power Technologies. 2023. No. 1(45). Pp. 64–69. DOI: 10.31649/1681-7893-2023-45-1-64-70
Tymchyshyn R. M., Volkov O. Ye., Hospodarchuk O. Yu., Bohachuk Yu. P. Modern approaches to solving computer vision problems. Control Systems and Computers. 2018. No. 6. Pp. 46–73. DOI: https://doi.org/10.15407/usim.2018.06.046
Boesch G. Very Deep Convolutional Networks (VGG) Essential Guide. 2021. [Electronic resource]. URL: https://viso.ai/deep-learning/vgg-very-deep-convolutional-networks/ (accessed: 20.08.2025)
Harrington R. M., Lima A. de O., Fox-Ivey R., Nguyen T., Laurent J., Dersch M. S., Edwards J. R. Use of deep convolutional neural networks and change detection technology for railway track inspections. Journal of Rail and Rapid Transit. 2023. Vol. 237. Iss. 2. Pp. 137–145. URL: https://doi.org/10.1177/09544097221093486
Krizhevsky A., Sutskever I., Hinton G. E. ImageNet Classification with Deep Convolutional Neural Networks. Communications of the ACM. 2017. Vol. 60. No. 6. Pp. 84–90. URL: https://doi.org/10.1145/3065386
Nafizul Haque Kh. What is Convolutional Neural Network – CNN (Deep Learning). 2023. [Electronic resource]. URL: https://nafizshahriar.medium.com/what-is-convolutional-neural-network-cnn-deep-learning-b3921bdd82d5 (accessed: 20.08.2025).
Tang Y., Qian Yu. High-speed railway track components inspection framework based on YOLOv8 with high-performance model deployment. High-speed Railway. 2024. No. 2(1). Pp. 42–50. URL: https://doi.org/10.1016/j.hspr.2024.02.001
Tomka Yu., Talakh M., Dvorzhak V., Ushenko O. Implementation of a Convolutional Neural Network Using TensorFlow Machine Learning Platform. Optoelectronic Information-Power Technologies. 2023. Vol. 44. No. 2. Pp. 55–65.