I am a PhD student in the Department of Electrical & Computer Engineering (ECE) at the University of Hawaii at Manoa (UHM). I am a research assistant at the Drew Research Lab (DRL), where I work on robotics projects involving drones, hovercraft, and swarm robotics.
I completed my BSc in Electrical and Electronic Engineering (EEE) at Chittagong University of Engineering and Technology (CUET). I also worked as a Research Assistant in the Department of Mechanical Engineering at CUET, where I was engaged in a project centered on an IoT-based livestock health surveillance system.
I have gained hands-on robotics experience and earned honors in competitions including the RC Mud Track Race and the Robo-race at CUET. My undergraduate projects include building a quadcopter drone, a line-following robot, a wall-following robot, and an IoT-based home automation system. Serving as an instructor for Arduino and PCB design workshops has strengthened my ability to communicate complex ideas effectively. My research interests are integrated robotics, IoT systems, and innovative drone technologies and applications. I also have a solid foundation in the Robot Operating System 2 (ROS 2).
PhD Student
January 2025 - Present
Department of Electrical & Computer Engineering (ECE)
University of Hawaii at Manoa (UHM)
BSc in Electrical & Electronic Engineering
January 2018 - June 2023
Chittagong University of Engineering & Technology (CUET)
Thesis: Real-Time Sign Language Recognition and Conversion into Text Using Deep Learning
Higher Secondary Certificate Examination (H.S.C)
Institute: Birshreshtha Noor Mohammad Public School and College.
Secondary School Certificate Examination (S.S.C)
Institute: Habiganj Govt. High School, Habiganj.
Graduate Research Assistant (GRA)
January 2025 - Present
Drew Research Lab
University of Hawaii at Manoa (UHM)
Research Assistant (RA)
September 2023 - April 2024
Chittagong University of Engineering & Technology (CUET)
Project Title: A livestock health surveillance system based on the Internet of Things (IoT) that enables the continuous tracking and health monitoring of cattle via a mobile application.
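As a rough illustration of the kind of pipeline such a system uses, the sketch below shows a sensor node publishing cattle vitals to an IoT backend over MQTT with the paho-mqtt library. The broker address, topic name, sensor fields, and reporting interval are illustrative assumptions, not the project's actual configuration.

```python
# Hypothetical sketch: a collar-mounted node publishing cattle vitals over MQTT.
# Broker, topic, field names, and values are placeholders for illustration.
import json
import time
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"    # hypothetical MQTT broker
TOPIC = "farm/cattle/42/vitals"  # hypothetical per-animal topic

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()  # run the network loop in the background

while True:
    reading = {
        "temperature_c": 38.6,   # placeholder; a real node reads its sensors
        "heart_rate_bpm": 65,
        "timestamp": time.time(),
    }
    client.publish(TOPIC, json.dumps(reading))
    time.sleep(60)  # assumed one reading per minute
```

A mobile application subscribed to the same topics can then display the readings and raise alerts.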
M. Modak, M. Bhowmick and M. R. Tanvir Hossain, "Real-Time Sign Language Recognition and Conversion into Text Using Deep Learning," 2025 International Conference on Quantum Photonics, Artificial Intelligence, and Networking (QPAIN), Rangpur, Bangladesh, 2025, pp. 1-6, doi: 10.1109/QPAIN66474.2025.11171668.
M. Modak, M. M. Pritom, S. C. Banik and M. S. Rabbi, "Internet of Things-Based Health Surveillance Systems for Livestock: A Review of Recent Advances and Challenges," IET Wireless Sensor Systems, e70013, 2025, doi: 10.1049/wss2.70013.
F. Aziz, P. Bhattacharjee, M. Modak and M. S. Rabbi, "Automating Object Sorting by Color with a Robotic Arm and Conveyor Belt System," 2025 International Conference on Electrical, Computer and Communication Engineering (ECCE), Chittagong, Bangladesh, 2025, pp. 1-6, doi: 10.1109/ECCE64574.2025.11013097.
A. Bhowmic, M. Modak, M. J. Chak and N. Mohammad, "IoT-Based Home Energy Management System to Minimize Energy Consumption Cost in Peak Demand Hours," 2023 10th IEEE International Conference on Power Systems (ICPS), Cox's Bazar, Bangladesh, 2023, pp. 1-6, doi: 10.1109/ICPS60393.2023.10428752.
M. Shahjalal, T. Shams, Md. Islam, W. Alam, M. Modak, S. Hossain, V. Ramadesigan, Md. R. Ahmed, H. Ahmed, Atif Iqbal, A review of thermal management for Li-ion batteries: Prospects, challenges, and issues, Journal of Energy Storage, Volume 39, 2021, 102518, ISSN 2352-152X, https://doi.org/10.1016/j.est.2021.102518.
**Under Review**
Mrinmoy Modak, Md Nahian Abdullah, Md Foysal Mia, "Advancements in Sleep Stage Prediction: A Comprehensive Review of EEG-Based Approaches" [Computers in Biology and Medicine]
Title: Real-Time Sign Language Recognition and Conversion into Text Using Deep Learning
Abstract: Sign language is a fundamental form of communication for deaf and hard-of-hearing people, yet they face challenges when trying to type using hand gestures, their natural mode of communication. The key focus of this thesis is building a deep learning model that reliably translates hand gestures into expressive text. The concept was inspired by the realization that many available typing-aid solutions do not adequately address this issue. The proposed system is trained on a large American Sign Language (ASL) dataset covering a wide variety of gestures. Hand gestures are detected with the MediaPipe library, then processed and converted into text by an LSTM-RNN-based model. The model translates hand gestures into text with a high degree of precision, making it a helpful tool for the deaf community.
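As a rough illustration of the pipeline described above, the sketch below shows how MediaPipe hand landmarks can feed an LSTM classifier built with TensorFlow/Keras. The sequence length, layer sizes, and label set are assumptions for illustration, not the exact configuration used in the thesis.

```python
# Minimal sketch of a gesture-to-text pipeline: MediaPipe landmarks -> LSTM classifier.
# Assumed hyperparameters (sequence length, layer sizes, number of classes) are placeholders.
import numpy as np
import cv2
import mediapipe as mp
import tensorflow as tf

mp_hands = mp.solutions.hands
hands = mp_hands.Hands(static_image_mode=False, max_num_hands=1)

def extract_landmarks(frame):
    """Return a flat (21*3,) vector of hand landmarks, or zeros if no hand is found."""
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        return np.array([[p.x, p.y, p.z] for p in lm]).flatten()
    return np.zeros(21 * 3)

# LSTM classifier over short landmark sequences (hyperparameters assumed).
SEQ_LEN, NUM_CLASSES = 30, 26  # e.g. 30-frame windows, 26 ASL letter classes
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, 21 * 3)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```

In use, landmark vectors from consecutive frames are stacked into a sequence, classified by the model, and the predicted sign is appended to the output text stream.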