Autonomous Camera Tracking System Using Image Processing for Dynamic Educational Content Creation
DOI: https://doi.org/10.33019/jurnalecotipe.v12i1.4545

Keywords: Camera Tracking, Image Processing, MediaPipe, Online Learning Activity

Abstract
In online learning, where video content plays a crucial role in educational materials, effective video production systems have become essential. Traditionally, at least two people are needed to operate the cameras, which poses a challenge where human resources are limited. This study addresses the issue by developing a camera position tracking system based on image processing, using the MediaPipe framework for real-time tracking of presenters. The system enables a DSLR camera to automatically adjust its position to follow the presenter's movement, detected within a range of 1.5 to 8 meters, with optimal operation at light intensities between 125 and 190 lux. The system works by converting the detected position data into stepper-motor pulses that move the camera, enabling efficient, cost-effective video production with minimal human intervention.
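The core conversion the abstract describes — turning a detected presenter position into stepper-motor pulses — can be sketched as a small function. This is a minimal illustration, not the authors' implementation: the deadband, pulse scale, and parameter names are hypothetical, and the normalized x-coordinate stands in for the horizontal position of a MediaPipe pose landmark (MediaPipe reports landmark coordinates normalized to the frame width).

```python
def position_to_pulses(norm_x, frame_center=0.5, deadband=0.05,
                       pulses_per_unit=400):
    """Convert a normalized horizontal position (0.0-1.0, as reported by a
    MediaPipe pose landmark) into a signed stepper-motor pulse count.

    All tuning values here are illustrative assumptions, not figures from
    the paper. A small deadband around the frame center suppresses jitter
    when the presenter is already well framed.
    """
    error = norm_x - frame_center
    if abs(error) < deadband:
        return 0  # presenter close enough to center: do not move
    # Positive error: presenter is right of center, so pan right;
    # negative error pans left. Magnitude scales with the offset.
    return int(round(error * pulses_per_unit))
```

In a real control loop, the returned pulse count would be sent to the motor driver each frame, so the camera converges on the presenter rather than jumping in one step.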
License
Copyright (c) 2025 Jurnal Ecotipe (Electronic, Control, Telecommunication, Information, and Power Engineering)

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright in each article is the property of the author.
- The author acknowledges that the Jurnal Ecotipe (Electronic, Control, Telecommunication, Information, and Power Engineering) has the right of first publication under a Creative Commons Attribution 4.0 International License.
- The author may distribute the published manuscript separately in other, non-exclusive versions (for example, depositing it in the author's institutional repository or publishing it in a book), provided the manuscript's first publication in the Jurnal Ecotipe (Electronic, Control, Telecommunication, Information, and Power Engineering) is acknowledged.