Project gallery captions:
Illumination of the SmathLamp model in darkness
The SmathLamp automatically turning off in lit surroundings (prototype during experimentation)
The SmathLamp during experimentation (earlier prototype)
Integration of the SmathLamp with the motion sensor during experimentation
Motion-sensor detection visualized in a web app for the project demonstration
Integration of the SmathLamp device with motion detection
Updated, most recent SmathLamp prototype device
Well... What do I have to say?
Inspiration
Light pollution, the excessive brightening of the night sky by artificial light, has largely been overlooked in comparison to other forms of pollution (Goronczy, 2020). It correlates strongly with human and environmental health, and has led to psychological and scene-evaluation challenges in natural settings (Benfield et al., 2018). Škvareninová et al. (2017) showed that light pollution influences the phenological phases of urban tree species, delaying seasonal changes; it has also been tied to seasonal disorders and disturbances of the sleep cycle. Light pollution has further been linked to the eutrophication of reservoirs (Ściężor, 2019). Street lamps remain a leading contributor, causing significant increases in energy consumption, financial costs, and resident dissatisfaction (Terrich & Balsky, 2022). Lamps with strong blue emission are particularly significant polluters, affecting wildlife, human health, and stellar visibility (Falchi et al., 2011). In 2010, energy costs associated with excessive street lighting totaled approximately $7 billion (Gallaway et al., 2010), and by 2025 the European Union may reach annual costs of approximately 42.5 billion from unnecessary street lighting (Sȩdziwy & Kotulski, 2016). While shielding lighting fixtures and avoiding over-lighting can limit light pollution, the application and impact of these measures are limited (Falchi et al., 2011). Saraiji and Oommen (2012) suggest an integrated approach, analyzing interactions between street and facade lighting to control light pollution. To address this issue, we propose a smart street-lighting system in which a motion detector and cost-effective sensors obtain motion data and surrounding illumination. The smart street light illuminates only when movement is present and other lighting is absent, significantly reducing light pollution and cutting street-illumination costs.
What it does
We present SmathLamp, a miniature prototype of a smart illumination device that receives input from various sensors to control the lighting projection of street lamps. Using cost-effective sensors, the SmathLamp's brightness depends on the lumens of the surrounding space, including natural light and facade lighting. Using a computationally inexpensive motion detector, the SmathLamp detects and tracks human and object movement, so it only illuminates for passing cars, people, or animals. Its motion detector is unaffected by wind and adaptable to light weather conditions such as rain or snow.
How we built it
A model of our SmathLamp device was constructed on a circuit from an Arduino Uno mini kit. Using a 5 V circuit with light sensors, distance sensors, photoresistors, and LEDs, we read the sensors' analog output into software that tracks motion and illumination around the device. We connected the photoresistors to a power source and analog inputs, and connected all LEDs and resistors to ground.
Using the HTML5 canvas and image processing, we obtained a video feed from a webcam to create a motion sensor. Frames are processed in sequence: the system iterates over consecutive frames, detecting any change in the RGB values of each pixel (see eq. 1). If no RGB change is detected between iterations, the output is a black pixel; if a change is detected, a non-black pixel is generated. A threshold is applied to quantify the change, producing an array of black and non-black pixels. From this we determine the percentage and significance of motion per frame, which yields a signal controlling the illumination of the SmathLamp (see eq. 2).
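The per-pixel differencing and thresholding described above can be sketched as a single pass over two flat RGBA pixel arrays (the format returned by the canvas `getImageData().data` call). The threshold value here is an illustrative assumption, not the tuned value from the project.

```javascript
// Compute the motion significance M(i) between two frames (eq. 1 & 2).
// prevFrame/currFrame: flat RGBA arrays, 4 bytes per pixel.
// A pixel counts as "non-black" when its summed RGB change exceeds threshold.
function motionSignificance(prevFrame, currFrame, width, height, threshold = 30) {
  const totalPixels = width * height;
  let movingPixels = 0;
  for (let p = 0; p < totalPixels; p++) {
    const i = p * 4; // RGBA stride
    const change =
      Math.abs(currFrame[i]     - prevFrame[i])     + // Δr
      Math.abs(currFrame[i + 1] - prevFrame[i + 1]) + // Δg
      Math.abs(currFrame[i + 2] - prevFrame[i + 2]);  // Δb
    // Thresholding: small changes become "black" (no motion)
    if (change > threshold) movingPixels++;
  }
  // M(i): fraction of non-black pixels in the thresholded difference frame
  return movingPixels / totalPixels;
}
```

A value near 0 means a static scene; larger values indicate more of the frame is in motion.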
Using an input of j packets with n frames per second and t seconds per packet, we can find information about the maximum and average motion per t seconds, which is fed into the SmathLamp circuit (see eq. 3 & 4).
F_i represents the frame at iteration i; each iteration compares consecutive frames:
ΔF_i = F_(i+1) − F_i (eq. 0)
P(x,y,i): grayscale change value at (x,y) in iteration i; Δr: change in red pixel values across each frame; Δg: change in green pixel values across each frame; Δb: change in blue pixel values across each frame.
P(x,y,i) = Δr(x,y,i) + Δg(x,y,i) + Δb(x,y,i) (eq. 1)
F_i: input frame at iteration i; P(x,y,i): grayscale change value at (x,y) in frame F_i; b: black pixel, b̄: non-black pixel; T(x,y,i): thresholded output value of the pixel at (x,y) in frame F_i; M(i): motion significance at iteration i.
M(i) = [ ∑_(x,y) 1(T(x,y,i) ≠ b) ] / [ ∑_(x,y) 1 ], where a black pixel b satisfies (dr/di, dg/di, db/di) = (0, 0, 0) (eq. 2)
1(condition) is the indicator function, which equals 1 if the condition is true and 0 otherwise.
M_max(j): highest motion significance within packet j, where j is the packet number, t the duration of each packet in seconds, and n the number of frames per second. M_avg(j): average motion significance within packet j, fed to the SmathLamp device to track motion significance over time.
M_max(j) = max_(1 ≤ i ≤ tn−1) {M(i)} (eq. 3)
M_avg(j) = (1/t) ∫_0^t M(τ) dτ (eq. 4)
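In discrete terms, eq. 3 and eq. 4 reduce to the maximum and mean of the per-frame motion significances collected over one packet. A minimal sketch (the packet format, a plain array of M(i) values, is an assumption for illustration):

```javascript
// Summarize one packet of per-frame motion significances M(i),
// producing M_max(j) (eq. 3) and M_avg(j) (eq. 4) for the SmathLamp circuit.
function summarizePacket(motionValues) {
  if (motionValues.length === 0) return { max: 0, avg: 0 };
  const max = Math.max(...motionValues);                // M_max(j)
  const sum = motionValues.reduce((a, m) => a + m, 0);
  const avg = sum / motionValues.length;                // M_avg(j), discrete mean
  return { max, avg };
}
```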
By integrating the SmathLamp circuit with the motion sensor, the SmathLamp receives M_max and M_avg for each packet of inputs, determining when to illuminate for passing street movement.
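The integration step amounts to a small decision rule: stay dark when the surroundings are already lit, stay dark when the packet shows no significant motion, and otherwise illuminate. The thresholds and the 0–255 output range below are assumptions for the sketch, not the project's tuned values.

```javascript
// Illustrative decision logic combining the ambient-light reading
// (photoresistor analog value) with the packet motion summary.
function lampBrightness(ambientLight, motionMax, opts = {}) {
  const {
    darkThreshold = 300,   // below this analog reading, surroundings are dark
    motionThreshold = 0.05, // minimum M_max to count as significant motion
    maxBrightness = 255,   // PWM-style output range
  } = opts;
  if (ambientLight >= darkThreshold) return 0; // surroundings already lit
  if (motionMax < motionThreshold) return 0;   // no significant movement
  return maxBrightness;                        // dark + motion: illuminate
}
```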
Challenges we ran into
Given the ambition of this project, we ran into numerous challenges: making our custom software compatible between macOS and Windows, constructing the circuit model for the SmathLamp, and, most complex of all, designing the motion sensor. We lacked many resources, including IoT relays, which would have improved our demonstration by connecting the camera and image processing to an Arduino driving a desk lamp as a simpler stand-in for a street lamp. We also faced challenges programming the motion sensor, where we had to standardize all image frames, apply a filtering threshold, and tune that threshold to cut out minor distractions such as background noise and lighting fluctuations not caused by meaningful movement. Using differential equations, we performed numerous calculations modeling the motion process while applying noise filters. Finally, we faced challenges loading data into the cloud through ThingSpeak, where we had to create specific channels, load the API key and channel ID using JavaScript, and limit the data upload rate to once every 15 seconds without flooding the system, which reduces processing time (see eq. 4).
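The 15-second upload constraint above matches ThingSpeak's free-tier rate limit, so the client has to drop or buffer readings that arrive too fast. A minimal sketch of that throttle (the send callback stands in for the actual HTTP request to the ThingSpeak update endpoint; the injectable clock is there only so the logic can be tested without a network):

```javascript
// Rate-limit uploads to at most one per intervalMs (default 15 s).
// sendFn: callback performing the actual upload (e.g. a fetch to ThingSpeak).
// now: clock function, injectable for testing.
function createUploader(sendFn, intervalMs = 15000, now = Date.now) {
  let lastSent = -Infinity;
  return function upload(value) {
    const t = now();
    if (t - lastSent >= intervalMs) {
      lastSent = t;
      sendFn(value); // window elapsed: forward the reading
      return true;
    }
    return false;    // dropped: still inside the rate-limit window
  };
}
```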
Accomplishments that we're proud of
Overall, our biggest accomplishment was the successful management of our project by drawing on the key strengths and talents of each team member. Our team had very limited programming experience, and some of us were barely fluent in any programming language. Yet, by using each member's skills, we created a substantial project incorporating mathematical modeling, programming, computational analysis, Arduino circuitry, and presentation experience. Although we worked through many tutorials and sought help from the awesome mentors who attended SmathHacks this year, we learned a lot and gained significant skills in constructing multi-sensor circuits and configuring them with custom-made software across multiple operating systems. On the technical side, we successfully created a motion sensor by mathematically modeling motion across successive image frames, programmed it, and connected it to the cloud through ThingSpeak. We not only improved the motion sensor by applying numerous filters, but also integrated it with the rest of the SmathLamp project, including the circuit. This was especially challenging and time-consuming, but we are glad it worked out nevertheless.
What we learned
Despite the significant learning curve at the beginning of the project, our ambitious idea gave us an eye-opening and unforgettable experience. One major takeaway was interdisciplinary collaboration: our team, composed of diverse backgrounds and skill sets, drew on mathematics, programming, engineering, and project management. By leveraging each other's strengths and compensating for weaknesses, we handled the difficult technical challenges that arose throughout the project, particularly the lack of time and resources. In a technical sense, we vastly improved our knowledge of circuit design, Arduino programming, video and image processing, and cloud integration. Some of the most important technical skills we learned involved installing new drivers to interpret hardware data and connect it to software, understanding how digital and analog sensors operate within microcontrollers, finding documentation on the complex circuitry and programming concepts specific to this Arduino-based project, processing webcam data, and connecting data to the cloud through JavaScript. A significant part of the project was mathematical modeling, where we applied our knowledge of calculus to program motion-detection software specifically compatible with our SmathLamp device. Whereas other motion detectors simply detect motion without filters, ours is built for both compatibility and the ability to detect motion with respect to object size, filtering out meaningless motion (e.g. wind). By integrating these systems into this project model, we gained a significant amount of hands-on experience and technical skills that can be applied to similar projects in the future. We are also very thankful for the numerous seminars offered through the SmathHacks Hackathon, especially the Arduino and GitHub seminars.
We also gained exposure to other tools, such as geospatial mapping and deep-learning image classification models, although limited time and other constraints kept us from expanding the project to include them. These seminars were vital to our success, and it would have been difficult to achieve what we did without the wonderful support of these seminars and seasoned professionals.
What's next for SMathLamp
Since our current device is only an Arduino-based model, our next step is building the actual device, compatible with city street lights. A good start could even be the street lights located on NCSSM's campus! Integrating advanced sensors such as infrared or thermal sensors would allow the SmathLamp to discern between different types of motion, providing more nuanced control over lighting levels and further reducing energy consumption. In addition, enhanced connectivity through wireless communication protocols could facilitate real-time data analysis and remote monitoring, informing better decisions about street-lamp planning. For mass production, a supply chain producing standardized circuit boards and other materials would be needed, allowing integration into street lamps and similar city infrastructure and creating a network across these lamps. Someday, this device may synergize with broader urban infrastructure systems, such as traffic management and environmental monitoring, without overspending on excessive lighting that would otherwise cause massive light pollution and waste millions of US dollars in city budgets each year.
References
[1] Benfield, J. A., Nutt, R. J., Taff, B. D., Miller, Z. D., Costigan, H., & Newman, P. (2018). A laboratory study of the psychological impact of light pollution in national parks. Journal of Environmental Psychology, 57, 67–72. https://doi.org/10.1016/j.jenvp.2018.06.006
[2] Goronczy, E. E. (2020). What Is Light Pollution? Springer eBooks, 5–7. https://doi.org/10.1007/978-3-658-29723-7_2
[3] Falchi, F., Cinzano, P., Elvidge, C. D., Keith, D. M., & Haim, A. (2011). Limiting the impact of light pollution on human health, environment and stellar visibility. Journal of Environmental Management, 92(10), 2714–2722. https://doi.org/10.1016/j.jenvman.2011.06.029
[4] Gallaway, T., Olsen, R. N., & Mitchell, D. M. (2010). The economics of global light pollution. Ecological Economics, 69(3), 658–665. https://doi.org/10.1016/j.ecolecon.2009.10.003
[5] Hunter, T. B., & Crawford, D. L. (1991). Economics of Light Pollution. International Astronomical Union Colloquium, 112, 89–96. https://doi.org/10.1017/s0252921100003778
[6] Saraiji, R., & Oommen, M. S. (2012). Light Pollution Index (LPI): An Integrated Approach to Study Light Pollution with Street Lighting and Façade Lighting. LEUKOS, 9(2), 127–145. https://doi.org/10.1582/leukos.2012.09.02.004
[7] Ściężor, T. (2019). Light pollution as an environmental hazard. Czasopismo Techniczne, 8, 129–142. https://doi.org/10.4467/2353737xct.19.084.10863
[8] Sȩdziwy, A., & Kotulski, L. (2016). Towards Highly Energy-Efficient Roadway Lighting. Energies, 9(4), 263. https://doi.org/10.3390/en9040263
[9] Škvareninová, J., Tuhárska, M., Škvarenina, J., Babálová, D., Slobodníková, L., Slobodník, B., Středová, H., & Minďaš, J. (2017). Effects of light pollution on tree phenology in the urban environment. Moravian Geographical Reports, 25(4), 282–290. https://doi.org/10.1515/mgr-2017-0024
[10] Terrich, T., & Balsky, M. (2022). The Effect of Spill Light on Street Lighting Energy Efficiency and Light Pollution. Sustainability, 14(9), 5376. https://doi.org/10.3390/su14095376
Acknowledgements
We would like to acknowledge all SmathHacks staff, faculty, advisors, as well as sponsors and mentors.
