About
My name is Sanjay. I graduated from the Massachusetts Institute of Technology (MIT) in May 2025 with a Bachelor of Science in Electrical Engineering and Computer Science (Course 6-2). Starting in Fall 2025, I will be a PhD student in Electrical and Computer Engineering at Carnegie Mellon University (CMU), focusing on computer architecture and integrated circuits.
I am always looking for interesting work, research, and internships in developing electrical and computer systems -- for example, CPU/GPU design, FPGAs/ASICs, integrated circuits, and embedded systems. My primary interest is in computer architecture. I enjoy developing novel systems, such as parallelizing CPUs and developing for low-power devices. Much of my past research and internship work has focused on building such computer systems, e.g. developing a highly multithreaded RISC-V CPU, implementing an application-specific accelerator for graph pattern mining, utilizing processing in memory, and creating various interfaces on FPGAs or in simulation. I am also interested in creating embedded systems for distributed IoT (Internet of Things) sensing; some of my past work has involved developing extremely low-power, interconnected sensing systems.
At MIT, my coursework ranged from low-level systems programming to CPU design to semiconductor device fabrication. I took classes in circuits, signal processing, computer architecture, FPGA-focused digital design, and microcomputer (embedded) systems development. I also took a fabrication class on developing nanoscale systems, including transistors built with novel materials (e.g. MoS2). Through my semiconductor coursework, I have experience with the layout process for mixed-signal systems targeting an Intel 22nm silicon tapeout.
Some fun things... I have built robot systems, small and large, for fun, such as a giant, interconnected and interactive PacMan game and a printer built with LEGO MINDSTORMS. Some of my designs have appeared at the international LEGO World event and as part of official LEGO software. I have also been learning French, since I spent a summer in the French-speaking part of Switzerland. Minor planet 33762 is named after me, Sanjayseshan. I also enjoy travelling and taking pictures with a Nikon DSLR, including photos from events (formals, graduation, etc.), and I maintain a selection of pictures from my travels.
Publications and Presentations
Hanly, B. (1st), Ospina, L. (1st), Seshan, S. (1st), Paul, D.J., Niroui, F., "Two-dimensional MoS2 transistors" (Poster), Microsystems Annual Research Conference (MARC), Jan. 2024
Designed and fabricated a nanoscale 2D transistor using MoS2 channels for use in building logic gates. This work was developed as part of MIT 6.S059 and was presented at the Microsystems Annual Research Conference (MARC) in Jan. 2024.
Projects
Built digital circuits using thin-film transistors (TFTs) on silicon. Paper is linked here.
Built a PPG (photoplethysmography) sensor using an Arduino and 3D-printed components.
Built a ripple tank using an Arduino to demonstrate wave interference patterns.
Designed and implemented a PSoC (programmable system-on-chip) based oscilloscope with two analog inputs and one analog output, including full frequency analysis and user customizability. Built as part of the MIT 6.115 course.
Implemented a graph-based vector search accelerator on a Xilinx FPGA in SystemVerilog, built using the Vivado toolchain and accompanying test software. Write-up can be found here.
Abstract—With applications in large-scale modeling such as social network analysis and image and video segmentation, graphs are increasingly used to encode and find complex relationships between data for machine learning models, creating a growing need to optimize these models. To better support model-specific algorithm efficiency, there has been work to create specialized hardware accelerators focusing on aspects such as memory access, latency, and resource allocation. However, current accelerators for graph problems are not scalable and can only be optimized for a single application, such as graph random walks or matrix multiplication, and existing systems run on CPUs or GPUs. Following from prior accelerator work on FPGAs, we plan to implement a graph-based vector search algorithm, based on iQAN, that runs on an FPGA to achieve better performance than existing systems and to support more versatile applications.
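To give a sense of the traversal such an accelerator targets, below is a minimal software sketch of the core loop in graph-based vector search: a greedy best-first walk over a proximity graph. The data layout, distance metric, and beam width are illustrative assumptions and do not reflect iQAN's exact algorithm or the accelerator's microarchitecture.

```python
import heapq
import numpy as np

def greedy_graph_search(query, vectors, neighbors, entry, beam_width=8, k=4):
    """Best-first traversal of a proximity graph (illustrative sketch).

    vectors:   (N, D) array of database vectors
    neighbors: adjacency list, neighbors[i] = ids adjacent to node i
    entry:     fixed entry node where the search starts
    """
    dist = lambda i: float(np.linalg.norm(vectors[i] - query))

    visited = {entry}
    candidates = [(dist(entry), entry)]   # min-heap of nodes to expand
    best = [(-dist(entry), entry)]        # max-heap (negated) of best results

    while candidates:
        d, node = heapq.heappop(candidates)
        # Stop when the closest unexpanded candidate is already worse than
        # the worst result we are keeping -- no further improvement possible.
        if len(best) >= beam_width and d > -best[0][0]:
            break
        for nxt in neighbors[node]:
            if nxt in visited:
                continue
            visited.add(nxt)
            d_nxt = dist(nxt)
            heapq.heappush(candidates, (d_nxt, nxt))
            heapq.heappush(best, (-d_nxt, nxt))
            if len(best) > beam_width:
                heapq.heappop(best)       # drop the current worst result

    return sorted((-nd, i) for nd, i in best)[:k]  # k nearest (distance, id)
```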
Completed the layout and tapeout process for a mixed-signal, CMOS-based differential amplifier in Cadence as part of the MIT 6.2080 course.
Designed and implemented a pipelined, dual-core RISC-V 32-bit processor with a shared cache hierarchy in Bluespec SystemVerilog. Synthesized the design to run on an AWS-based FPGA.
Built a RISC-V processor entirely in JavaScript for web-based simulation; it can be used to create visualizations for computer architecture courses. See here.
Designed and tested a custom wearable wristband sensor to detect hand movements. Developed a gesture recognition system leveraging the custom wearable and machine learning analysis. Paper at arXiv:2009.13322.
The goal of this project is to create an inexpensive, lightweight, wearable assistive device that can measure hand or finger movements accurately enough to identify a range of hand gestures. One eventual application is to provide assistive technology and sign language detection for the hearing impaired. My system, called LiTe (Light-based Technology), uses optical fibers embedded into a wristband. The wrist is an optimal place for the band since the light propagation in the optical fibers is impacted even by the slight movements of the tendons in the wrist when gestures are performed. The prototype incorporates light-dependent resistors to measure these light propagation changes. When creating LiTe, I considered a variety of fiber materials, light frequencies, and physical shapes to optimize the tendon movement detection so that it can be accurately correlated with different gestures.

I implemented and evaluated two approaches for gesture recognition. The first uses an algorithm that combines moving averages of sensor readings with gesture sensor reading signatures to determine the current gesture. The second uses a neural network trained on a labelled set of gesture readings to recognize gestures. Using the signature-based approach, I was able to achieve a 99.8% accuracy at recognizing distinct gestures. Using the neural network, the recognition accuracy was 98.8%. This shows that high accuracy is feasible using both approaches. The results indicate that this novel method of using fiber optics-based sensors is a promising first step to creating a gesture recognition system.
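As a rough illustration of the signature-based approach described above, the sketch below smooths the sensor channels with a moving average and labels the current reading with the nearest stored gesture signature. The window size, signature format, and Euclidean distance metric are assumptions for illustration, not LiTe's exact parameters.

```python
import numpy as np

def moving_average(readings, window=10):
    """Smooth each sensor channel with a simple moving average.

    readings: (T, C) array of raw light-sensor samples (T timesteps, C channels).
    Returns an array of the same shape with per-channel smoothing.
    """
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(readings[:, c], kernel, mode="same")
         for c in range(readings.shape[1])]
    )

def classify_gesture(readings, signatures, window=10):
    """Match the latest smoothed reading against stored gesture signatures.

    signatures: dict mapping gesture name -> (C,) reference vector recorded
    while that gesture was held (illustrative format).
    """
    smoothed = moving_average(readings, window)
    current = smoothed[-1]  # most recent smoothed reading across channels
    # Pick the signature with the smallest Euclidean distance to the reading.
    return min(signatures, key=lambda g: np.linalg.norm(signatures[g] - current))
```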
Developed a system that combines on-vehicle and infrastructure-based sensors to provide a more complete view of the environment, including a custom vision recognition system, sensor fusion model, and networking protocol design for the system. Presented the work at the Intel International Science and Engineering Fair in Phoenix, AZ. Paper at arXiv:2009.03458.
Studies predict that demand for autonomous vehicles will increase tenfold between 2019 and 2026. However, recent high-profile accidents have significantly impacted consumer confidence in this technology. The cause for many of these accidents can be traced back to the inability of these vehicles to correctly sense the impending danger. In response, manufacturers have been improving the already extensive on-vehicle sensor packages to ensure that the system always has access to the data necessary to ensure safe navigation. However, these sensor packages only provide a view from the vehicle's perspective and, as a result, autonomous vehicles still require frequent human intervention to ensure safety.

To address this issue, I developed a system, called Horus, that combines on-vehicle and infrastructure-based sensors to provide a more complete view of the environment, including areas not visible from the vehicle. I built a small-scale experimental testbed as a proof of concept. My measurements of the impact of sensor failures showed that even short outages (1 second) at slow speeds (25 km/hr scaled velocity) prevent vehicles that rely on on-vehicle sensors from navigating properly. My experiments also showed that Horus dramatically improves driving safety and that the sensor fusion algorithm selected plays a significant role in the quality of the navigation. With just a pair of infrastructure sensors, Horus could tolerate sensors that fail 40% of the time and still navigate safely. These results are a promising first step towards safer autonomous vehicles.
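As a loose sketch of the fusion idea, the snippet below combines an on-vehicle and an infrastructure position estimate with a confidence-weighted average and falls back to whichever source survives an outage. The data format, weighting scheme, and example numbers are illustrative assumptions, not Horus's actual fusion algorithm.

```python
from typing import Optional, Tuple

Estimate = Tuple[float, float]  # (position along the track, confidence in [0, 1])

def fuse_position(vehicle: Optional[Estimate], infra: Optional[Estimate]) -> Optional[float]:
    """Confidence-weighted fusion of two position estimates (illustrative).

    Either source may be None to model a sensor outage; when both are
    present, the estimates are averaged in proportion to their confidences.
    """
    if vehicle is None and infra is None:
        return None  # total outage: no estimate available this tick
    if vehicle is None:
        return infra[0]
    if infra is None:
        return vehicle[0]
    (pv, cv), (pi, ci) = vehicle, infra
    total = cv + ci
    if total == 0:
        return (pv + pi) / 2  # no confidence information: plain average
    return (cv * pv + ci * pi) / total

# Example: the on-vehicle sensor dropped out, the infrastructure sensor still reports.
print(fuse_position(None, (12.4, 0.8)))         # -> 12.4
print(fuse_position((12.0, 0.6), (12.4, 0.8)))  # -> weighted slightly toward 12.4
```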
Programmed a 3D interactive robot simulator that navigates a minefield.
Track Boston-area public transit: find the location of your bus or train as well as predicted arrivals/departures. link
Uses machine learning to give live feedback on users' exercises. Built with OpenPose and TensorFlow to process live camera data on a Raspberry Pi and provide detailed feedback. link
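As a rough sketch of how pose keypoints can drive exercise feedback, the snippet below computes the knee angle from three keypoints and flags a squat that has not reached depth. The keypoint layout and threshold are hypothetical and stand in for the project's actual OpenPose/TensorFlow pipeline.

```python
import math

def joint_angle(a, b, c):
    """Angle at keypoint b (in degrees) formed by points a-b-c, e.g. hip-knee-ankle."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def squat_feedback(hip, knee, ankle, depth_threshold=100.0):
    """Simple text feedback on squat depth from pose keypoints (illustrative)."""
    angle = joint_angle(hip, knee, ankle)
    if angle > depth_threshold:
        return f"Knee angle {angle:.0f} deg: try to squat deeper."
    return f"Knee angle {angle:.0f} deg: good depth!"

# Example with hypothetical (x, y) pixel coordinates from a pose estimator.
print(squat_feedback(hip=(320, 240), knee=(330, 330), ankle=(325, 420)))
```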
Developed a web-based tournament management system for FIRST events, with support for managing team submissions along with a judging and scoring system for teams.
Some examples include a PacMan game and a model printer. Others can be found on YouTube.
Each day, there are 850 water main breaks in the US. Early detection of leaks is critical to managing costs and saving water. I did a case study of my own local township and learned that around 20% of its water is unaccounted for; this is a relatively low figure, as in some regions it can reach 40%. Unfortunately, existing solutions only detect leaks that are already present, by which point the water is already being lost. My goal was therefore to detect pipe deterioration before an actual break occurs and water is lost. To do this, I used the acoustic properties of pipes to infer damage. youtube link, code, article
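As a loose illustration of inferring pipe condition from sound, the sketch below compares the energy of a recorded tap in a chosen frequency band against a healthy-pipe baseline. The sampling rate, band, and threshold are illustrative assumptions rather than the project's calibrated values.

```python
import numpy as np

def band_energy(signal, sample_rate=44100, band=(1000, 5000)):
    """Energy of an acoustic recording within a frequency band (illustrative).

    signal: 1-D NumPy array of audio samples from a tap on the pipe.
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.sum(spectrum[mask] ** 2))

def deterioration_score(recording, baseline, sample_rate=44100):
    """Band-energy ratio against a healthy-pipe baseline recording.

    A score far from 1.0 suggests the pipe's acoustic response has shifted.
    """
    return band_energy(recording, sample_rate) / band_energy(baseline, sample_rate)

# Hypothetical usage with two recordings loaded as NumPy arrays:
# score = deterioration_score(suspect_tap, healthy_tap)
# if abs(score - 1.0) > 0.5:   # illustrative threshold
#     print("Pipe section flagged for inspection")
```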
Activities
- MIT SPLASH 2022: Taught a class on Graph Algorithms tailored towards High School students interested in Computer Science.
- MIT SPARK 2023: Taught a class on Gravitation and Electrostatics for Middle School students.
- Maseeh Dorm Council: Representative for Spring 2022, Chair for Campus Preview Weekend (CPW) for incoming freshmen for the 2022-2023 and 2023-2024 school years, and Chair for REX (freshman orientation events) in Fall 2023. Organized dozens of events in these roles and represented the dorm at orientation-related activities. Photographer for the Maseeh formal and boat cruise.
- Volunteered for youth robotics development (e.g. writing programming lessons).
- Alpha- and beta-tested both hardware and software products under NDA for the LEGO Group in Billund, Denmark. Designed three robots for the official LEGO MINDSTORMS App released in 2021.
Experience
Contact
Feel free to reach out to me at [email protected].