Ashish Patel 🇮🇳’s Post

𝗗𝗮𝘆-𝟰𝟵𝟵 𝗖𝗼𝗺𝗽𝘂𝘁𝗲𝗿 𝗩𝗶𝘀𝗶𝗼𝗻 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴
Spiking Approximations of the MaxPooling Operation in Deep SNNs, by the University of Waterloo, Canada
Follow me for similar posts: Ashish Patel
-------------------------------------------------------------------
𝗜𝗻𝘁𝗲𝗿𝗲𝘀𝘁𝗶𝗻𝗴 𝗙𝗮𝗰𝘁𝘀:
🔸 This paper was published on arXiv in 2022.
🤝 This work is the first to present two different hardware-friendly methods for the spiking-MaxPooling operation in SNNs, with their evaluation on Loihi. It is also the first to deploy SNNs with MaxPooling layers (via the MJOP method) on Loihi boards.
-------------------------------------------------------------------
𝗜𝗠𝗣𝗢𝗥𝗧𝗔𝗡𝗖𝗘
👉 Spiking Neural Networks (SNNs) are an emerging domain of biologically inspired neural networks that have shown promise for low-power AI.
👉 A number of methods exist for building deep SNNs, with Artificial Neural Network (ANN)-to-SNN conversion being highly successful.
👉 MaxPooling layers in Convolutional Neural Networks (CNNs) are an integral component that downsamples intermediate feature maps and introduces translational invariance, but the absence of hardware-friendly spiking equivalents limits the conversion of such CNNs to deep SNNs.
👉 This paper presents two hardware-friendly methods to implement MaxPooling in deep SNNs, facilitating easy conversion of CNNs with MaxPooling layers to SNNs.
👉 For the first time, SNNs with spiking-MaxPooling layers are executed on Intel's Loihi neuromorphic hardware (with the MNIST, FMNIST, and CIFAR10 datasets), showing the feasibility of the approach.
#computervision #artificialintelligence #deeplearning #data #technology
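For readers unfamiliar with the operation being approximated: the paper's spiking methods target the standard MaxPooling downsampling used in CNNs. As a reference point (not the paper's spiking implementation), here is a minimal pure-Python sketch of conventional non-overlapping k×k MaxPooling; the function name `maxpool2d` is illustrative.

```python
def maxpool2d(fmap, k=2):
    """Naive MaxPooling: downsample a 2D feature map by taking the
    maximum over each non-overlapping k x k window."""
    h, w = len(fmap), len(fmap[0])
    return [[max(fmap[i * k + di][j * k + dj]
                 for di in range(k) for dj in range(k))
             for j in range(w // k)]        # output width = w // k
            for i in range(h // k)]         # output height = h // k

# A 4x4 feature map pools down to 2x2 with k=2.
fmap = [[1, 3, 2, 0],
        [4, 2, 1, 5],
        [0, 1, 3, 2],
        [6, 0, 2, 4]]
print(maxpool2d(fmap))  # [[4, 5], [6, 4]]
```

The difficulty the paper addresses is that on neuromorphic hardware activations are binary spike trains rather than real-valued maps, so this `max` over a window has no direct equivalent and must be approximated with spiking mechanisms.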

1 more to go for 500. Keep growing 👏
