The purpose of the Sign-Interfaced Machine Operating Network (SIMON) is to translate sign language into numbers. We used a machine learning classifier to map images of hands to their corresponding symbols or actions. SIMON interprets American Sign Language (ASL) and displays the translation on an LCD screen through the following steps. A machine learning model first classifies the given image of an ASL sign into a single numerical digit. That value is then serialized to ASCII and sent to the microcontroller, which deserializes it and posts the result to the LCD screen. The hardware used to complete this project included an ATMega328P Xplained Mini board, a Windows laptop, a webcam, and an LCD screen. The goal of the final project was achieved: we were able to recognize ASL digit images and display the desired result on the LCD screen.
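The classify, serialize, and deserialize steps described above can be sketched in Python. This is a minimal illustration, not the project's actual code: the function names and the single-byte ASCII protocol are assumptions, and `classify_digit` is a placeholder for the trained machine learning model.

```python
def classify_digit(image) -> int:
    """Placeholder for the ML classifier: maps an image of an ASL
    hand sign to a digit 0-9. A real implementation would run the
    trained model on the webcam frame."""
    raise NotImplementedError

def serialize(digit: int) -> bytes:
    """Serialize a digit to its ASCII byte, as it would be sent
    over the serial link to the ATMega328P."""
    if not 0 <= digit <= 9:
        raise ValueError("expected a single digit 0-9")
    return str(digit).encode("ascii")  # e.g. 7 -> b"7"

def deserialize(payload: bytes) -> int:
    """Mirror of the microcontroller's deserialization step:
    recover the digit from the received ASCII byte before
    posting it to the LCD."""
    return int(payload.decode("ascii"))

# Round trip: the digit survives serialization and deserialization.
assert deserialize(serialize(7)) == 7
```

In the actual system the serialized byte travels over a serial connection to the microcontroller rather than staying in one process; this sketch only shows that an ASCII round trip preserves the classified digit.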