USING SPATIAL TRANSFORMER TECHNOLOGY IN THE TASK OF NEURAL NETWORK CLASSIFICATION OF AGRICULTURAL FIELDS BASED ON SENTINEL SATELLITE IMAGES
DOI: https://doi.org/10.35546/kntu2078-4481.2026.1.50

Keywords: high-resolution optical satellite images, Sentinel, classification of agricultural lands, deep learning, neural network, transformer, attention pooling, spatial attention

Abstract
This paper investigates the feasibility of using spatial transformation mechanisms for neural network classification of agricultural land based on multi-temporal Sentinel-1 and Sentinel-2 satellite data. It is proposed to improve an existing information technology by combining temporal attention for adaptive aggregation of multi-temporal observations with a spatial transformer module that models global spatial dependencies in feature maps. The existing information technology employs a neural network with the U-Net architecture and an EfficientNetV2-L encoder pre-trained on the ImageNet-21k dataset, which uses attention mechanisms to aggregate temporal features when processing multispectral and radar images. The paper considers several options for integrating temporal and spatial attention: sequential schemes (with the transformer placed before or after the temporal aggregation mechanism) and parallel schemes with different feature-combination methods (concatenation with a 1 × 1 projection, gated additive merging, and weighted summation). Experimental studies were conducted on a sample comprising more than 90,000 land plots in the Berlin-Brandenburg region (Germany), divided into arable and non-arable lands, which ensures the representativeness of the results. The results indicate that using a spatial transformer does not always increase classification accuracy. The best results were achieved with the transformer placed before the temporal attention mechanism, whereas the other integration schemes showed a decrease in quality. This indicates the need for judicious, context-dependent use of transformer mechanisms in agricultural land classification tasks, where local textural and spectral features dominate.
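The three parallel feature-combination methods named above (concatenation with a 1 × 1 projection, gated additive merging, and weighted summation) can be illustrated with a minimal numpy sketch. All shapes, variable names, and parameter initializations here are illustrative assumptions, not the authors' implementation; in the actual network these operations would act on learned feature maps inside the U-Net decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature maps of shape (C, H, W): one from the temporal
# attention branch, one from the spatial transformer branch.
C, H, W = 4, 8, 8
f_temporal = rng.standard_normal((C, H, W))
f_spatial = rng.standard_normal((C, H, W))

def fuse_concat_1x1(a, b, proj):
    """Concatenate along the channel axis, then project back to C channels
    with a 1x1 convolution (a (C, 2C) matrix applied independently per pixel)."""
    stacked = np.concatenate([a, b], axis=0)            # (2C, H, W)
    return np.einsum('oc,chw->ohw', proj, stacked)      # (C, H, W)

def fuse_gated(a, b, gate_logits):
    """Gated additive merging: a sigmoid gate (per channel and pixel)
    decides how much of each branch passes through."""
    g = 1.0 / (1.0 + np.exp(-gate_logits))              # values in (0, 1)
    return g * a + (1.0 - g) * b

def fuse_weighted_sum(a, b, alpha):
    """Weighted summation with a single scalar mixing weight alpha."""
    return alpha * a + (1.0 - alpha) * b

# Illustrative (random) parameters standing in for learned weights.
proj = rng.standard_normal((C, 2 * C)) / np.sqrt(2 * C)
gate_logits = rng.standard_normal((C, H, W))

out_concat = fuse_concat_1x1(f_temporal, f_spatial, proj)
out_gated = fuse_gated(f_temporal, f_spatial, gate_logits)
out_wsum = fuse_weighted_sum(f_temporal, f_spatial, 0.5)
print(out_concat.shape, out_gated.shape, out_wsum.shape)
```

All three schemes preserve the (C, H, W) feature-map shape, so they are drop-in interchangeable at the fusion point; they differ in how many parameters they add (a full 2C×C projection, a gating tensor, or a single scalar).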
References
Boulila W., et al. A transformer-based approach empowered by self-attention for semantic segmentation in remote sensing. Heliyon. 2024. № 10(8). e29396. DOI: https://doi.org/10.1016/j.heliyon.2024.e29396
Chen M., Li L. Hierarchical transfer learning with transformers to improve semantic segmentation in remote sensing land use. Remote Sensing. 2025. № 17(2). 290. DOI: https://doi.org/10.3390/rs17020290
Chen H., He G., Peng X., Wang G., Yin R. A multi-scale feature fusion deep learning network for the extraction of cropland based on Landsat data. Remote Sensing. 2024. № 16(21). 4071. DOI: https://doi.org/10.3390/rs16214071
Chen X., Li D., Liu M., Jia J. CNN and transformer fusion for remote sensing image semantic segmentation. Remote Sensing. 2023. № 15(18). 4455. DOI: https://doi.org/10.3390/rs15184455
Gao M., Lu T., Wang L. Crop mapping based on Sentinel-2 images using semantic segmentation model of attention mechanism. Sensors. 2023. № 23(15). 7008. DOI: https://doi.org/10.3390/s23157008
Lian Z., Zhan Y., Zhang W., Wang Z., Liu W., Huang X. Recent advances in deep learning-based spatiotemporal fusion methods for remote sensing images. Sensors. 2025. № 25(4). 1093. DOI: https://doi.org/10.3390/s25041093
Gong Y., Chen C., Zheng Y. Hybrid deep learning model for multi-source remote sensing data fusion: integrating DenseNet and Swin Transformer for spatial alignment and feature extraction. Informatica. 2025. № 49(24). DOI: https://doi.org/10.31449/inf.v49i24.8395
Xu H., Song J., Zhu Y. Evaluation and comparison of semantic segmentation networks for rice identification based on Sentinel-2 imagery. Remote Sensing. 2023. № 15(6). 1499. DOI: https://doi.org/10.3390/rs15061499
Wang R., Ma L., He G., Johnson B. A., Yan Z., Chang M., Liang Y. Transformers for remote sensing: a systematic review and analysis. Sensors. 2024. № 24(11). 3495. DOI: https://doi.org/10.3390/s24113495
Chumychov D., Nikulin S. Technology of neural network classification of satellite images for detecting non-arable land plots. Electrical Engineering and Information Systems. 2025. № 108. P. 114–127. DOI: https://doi.org/10.32782/EIS/2025-108-15
Chumychov D., Nikulin S. Using contours of objects in Sentinel satellite images for the classification of agricultural lands by means of neural networks. Computer-Integrated Technologies: Education, Science, Production. 2025. № 61. P. 213–226. DOI: https://doi.org/10.36910/6775-2524-0560-2025-61-30
Safari P., India M., Hernando J. Self-attention networks in speaker recognition. Applied Sciences. 2023. № 13(11). 6410. DOI: https://doi.org/10.3390/app13116410
Slimani N., Jdey I., Kherallah M. Improvement of satellite image classification using attention-based vision transformer. Proceedings of the 16th International Conference on Agents and Artificial Intelligence. 2024. P. 80–87. DOI: https://doi.org/10.5220/0012298400003636
Gackstetter D., Yu K., Körner M. Self-attention and frequency-augmentation for unsupervised domain adaptation in satellite image-based time series classification. ISPRS Journal of Photogrammetry and Remote Sensing. 2025. № 224. P. 113–132. DOI: https://doi.org/10.1016/j.isprsjprs.2025.03.024
Gao Y., Jiang X., Li Z., Song X., Li W. M3LNet: multi-frequency multi-scale multi-modal learning for multisource image classification. Proceedings of the 2nd Asia Symposium on Image and Graphics. 2025. P. 85–91. DOI: https://doi.org/10.1145/3718441.3718454
Liu Y., Guo Y., Georgiou T., et al. Fusion that matters: convolutional fusion networks for visual recognition. Multimedia Tools and Applications. 2018. № 77. P. 29407–29434. DOI: https://doi.org/10.1007/s11042-018-5691-4
Fan Y., Niu L., Liu T. Multi-branch gated fusion network for image quality improvement in maritime perception systems. Journal of Marine Science and Engineering. 2022. № 10(12). 1839. DOI: https://doi.org/10.3390/jmse10121839
Yuan J., Shi Z., Chen S. Feature fusion in deep-learning semantic image segmentation: a survey. Science and Technologies for Smart Cities. Cham: Springer, 2022. P. 261–276. DOI: https://doi.org/10.1007/978-3-031-06371-8_18
Copernicus Browser [Electronic resource]. URL: https://browser.dataspace.copernicus.eu (accessed: 28.11.2025)
Field Boundaries for Agriculture (fiboa) [Electronic resource]. URL: https://fiboa.org (accessed: 30.11.2025)