Penggunaan Metode You Only Look Once dalam Penentu Pindah Tanaman Cabai Besar Ternotifikasi Telegram

  • Feli Ramasari Politeknik Negeri Padang
  • Firdaus Firdaus Politeknik Negeri Padang
  • Sri Nita Politeknik Negeri Padang
  • Kartika Kartika Universitas Malikussaleh
Keywords: YOLOv4, Raspberry Pi, Telegram, bounding boxes, permanent leaves


Capsicum annuum L. is a vegetable commodity with high economic value for meeting Indonesia's domestic demand, with a chili harvest area of around 11,400 ha per month. One stage in chili cultivation is transplanting after sowing, which should occur once a seedling has 4-5 permanent leaves. If transplanting is delayed, the plant does not get an adequate adaptation period to complete its vegetative growth, so it ages quickly and enters the generative stage prematurely. Digital image processing with a deep learning approach can be applied to identify and count the permanent leaves of large chili plants. One suitable method is You Only Look Once version 4 (YOLOv4), a development of the convolutional neural network for object detection that outputs bounding-box predictions using a CSPDarknet53 backbone. The YOLOv4 model is deployed on a Raspberry Pi 4 Model B single-board computer and sends notifications through the Bot feature of the Telegram application. The image dataset consists of 135 images, with 1,409 samples of the permanent class and 1,945 samples of the young class. The system was implemented successfully, achieving an identification and counting accuracy of 92.85%, an inference time of 11.88 seconds per plant, and a notification delivery time of 25.29 seconds.
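The decision pipeline the abstract describes can be sketched as follows: YOLOv4 detections are grouped by class ("permanent" vs. "young" leaves, the two classes in the paper's dataset), and a Telegram Bot message is sent once a seedling shows at least 4 permanent leaves. This is a minimal illustration, not the authors' implementation: the detections are mocked as (class, confidence) pairs, and the 0.5 confidence cut-off is an assumption not stated in the abstract. On the Raspberry Pi the pairs would come from the trained YOLOv4 model.

```python
TRANSPLANT_THRESHOLD = 4  # seedlings with 4-5 permanent leaves are ready (per the paper)
CONFIDENCE_MIN = 0.5      # assumed confidence cut-off, not given in the abstract


def count_leaves(detections):
    """Count bounding boxes per class above the confidence threshold.

    `detections` is a list of (class_name, confidence) pairs,
    one per predicted bounding box.
    """
    counts = {"permanent": 0, "young": 0}
    for class_name, confidence in detections:
        if confidence >= CONFIDENCE_MIN and class_name in counts:
            counts[class_name] += 1
    return counts


def ready_to_transplant(detections):
    """True once the seedling shows enough permanent leaves."""
    return count_leaves(detections)["permanent"] >= TRANSPLANT_THRESHOLD


def notify_telegram(token, chat_id, message):
    """Send the alert through the Telegram Bot API (sendMessage method)."""
    import urllib.parse
    import urllib.request

    url = f"https://api.telegram.org/bot{token}/sendMessage"
    data = urllib.parse.urlencode({"chat_id": chat_id, "text": message}).encode()
    urllib.request.urlopen(url, data=data)


# Mocked YOLOv4 output for one plant: four confident "permanent" boxes.
boxes = [("permanent", 0.91), ("permanent", 0.88), ("young", 0.76),
         ("permanent", 0.83), ("permanent", 0.67)]
print(ready_to_transplant(boxes))  # -> True: time to transplant
```

In the deployed system this check would run after each inference pass, with `notify_telegram` called only on the transition from not-ready to ready, so that a single seedling does not trigger repeated alerts.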




How to Cite
Ramasari, F., Firdaus, F., Nita, S., & Kartika, K. (2021, November 27). Penggunaan Metode You Only Look Once dalam Penentu Pindah Tanaman Cabai Besar Ternotifikasi Telegram. Elektron : Jurnal Ilmiah, 45-52.