Explanation: My innovation is Kofiko, a Matlab program that helps analyze visual processing in monkeys. I developed it at Rockefeller University’s Laboratory of Neurobiology, where I received input from the post-doctoral research scientists. I built it on PsychToolBox, an open-source toolbox that bridges psychology and computer science and can be extended by engineering researchers like me. I am currently training three monkeys to perform a task (described next), and these monkeys will be tested using my program. The program uses two computer screens: one is the interface for the researcher, and the other is a touch monitor for the monkey. A juicer hardware control box handles the reward system.
As the researcher, I control what appears on the touch screen for the monkey to touch. First, I show the monkey a sample image, e.g. a car. Then, I display other images in succession. If the monkey believes that he (all the monkeys in my experiment are male) is currently seeing an image that matches the car, he touches the green button; otherwise, he touches the red button. When the monkey presses the buttons at the correct times, he automatically receives a few drops of juice from the juicer, as I programmed the software to do.
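The decision and reward logic of one trial can be sketched as follows. Kofiko itself is written in Matlab, so this is only a language-neutral Python illustration; the function name, button labels, and reward size are hypothetical, not Kofiko's actual code.

```python
JUICE_DROPS = 3  # assumed reward size; the real value is set in the software


def run_trial(sample_image, test_image, button_pressed):
    """Score one match-to-sample trial.

    button_pressed is "green" (monkey signals a match) or "red" (no match).
    Returns the number of juice drops the juicer should dispense.
    """
    is_match = (test_image == sample_image)
    # The press is correct when "green" coincides with a true match
    # and "red" coincides with a non-match.
    correct = (button_pressed == "green") == is_match
    return JUICE_DROPS if correct else 0
```

For example, if the sample is a car and a car reappears, pressing green earns juice, while pressing green on a non-matching image earns nothing.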
After a monkey has worked through the entire image library (which I borrowed from a researcher who had used it in an experiment on humans), I examine the statistics and computations the software has already completed. I programmed the software to keep track of correct and incorrect button presses, to track the monkeys’ hand and eye movements, and to match every correct and incorrect choice with the corresponding image.
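The bookkeeping described above amounts to a per-trial log from which accuracy and per-image error counts can be computed. Again, Kofiko is a Matlab program; the Python sketch below only illustrates the idea, and all names and fields are hypothetical rather than Kofiko's real data format.

```python
from dataclasses import dataclass, field


@dataclass
class TrialRecord:
    # Illustrative fields; the actual log also includes hand and eye movement data.
    sample_image: str
    test_image: str
    correct: bool


@dataclass
class SessionLog:
    trials: list = field(default_factory=list)

    def record(self, sample_image, test_image, correct):
        """Store one trial's outcome alongside the images shown."""
        self.trials.append(TrialRecord(sample_image, test_image, correct))

    def accuracy(self):
        """Fraction of trials answered correctly."""
        if not self.trials:
            return 0.0
        return sum(t.correct for t in self.trials) / len(self.trials)

    def errors_by_image(self):
        """Match every incorrect choice with the corresponding test image."""
        errors = {}
        for t in self.trials:
            if not t.correct:
                errors[t.test_image] = errors.get(t.test_image, 0) + 1
        return errors
```

Keeping the image identity with each response is what lets the researcher ask which stimuli the monkey confuses, not just how often he is right.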
Using a PET scanner that I designed and built two years ago at the Stanford School of Medicine, I can see exactly which parts of the monkey’s brain are stimulated while he completes his tasks on Kofiko. That scanner won me Finalist recognition in both the Siemens Competition (one of the top 30 individual high school researchers in the United States) and the Synopsys Championship n+1 Prize for the Next Breakthrough; it is a piece of hardware I spent up to 20 hours a week on during the school year and up to 50 hours a week on over the summer. I designed it in Altium Designer, simulated it in LTSpice, constructed it with a soldering iron, a heat gun, and components purchased from Advanced Circuits, and tested it with an oscilloscope, a radioactive point source, and a Matlab program I coded up. It was because I had already successfully built my own PET scanner that Professor Charles Gilbert of Rockefeller University brought me into his lab to work with his monkeys.
Inspiration: I have always been interested in cognitive science (a field consisting of neuroscience, computer science, psychology, anthropology, and linguistics) and industrial design, and after reading many articles about inspiring teenage researchers, I decided to do something remarkable myself. I took AP Physics C: Mechanics, AP Physics C: Electricity & Magnetism, and AP Biology in my senior year. I self-studied for AP Computer Science by reading the textbook and aced the exam, excited to apply computer science in the biomedical realm. I really wanted to do something related to the brain. I decided to create a piece of hardware and write software that can analyze the brain of my favorite animal, the monkey.
Specifically, I decided to analyze one part of the monkey’s brain: the visual cortex. Why the visual cortex? Because I love playing the game SET, which is a test of visual recognition. That game made me especially interested in the visual processing and object recognition regions of the monkey’s brain.
It was ultimately my love of learning, making, and hacking that inspired me to start the project. When I began, I read journal articles and realized that the questions I cared about are urgently relevant today. By studying how the brain perceives objects at lightning speed, I can learn enough to apply that knowledge to improving computers. New visual algorithms could teach computers to recognize different people and so strengthen our nation’s security. Cars could be programmed to notify drivers of children on the street; self-driving cars could be programmed to see obstacles on the road and prevent accidents. Computers could spot terrorists at airports, eliminating the need to wait in long lines at security checks. The world would move more efficiently; people would have safer, easier, and simpler lives. However, I am actually more passionate about the biomedical applications than the security applications, because security applications can compromise people’s privacy, which I value. The same advances in computer visual perception can be applied back to the biomedical fields. Computers could quickly check tumors and visually determine whether they are benign or malignant. Moreover, blind or nearsighted people could regain eyesight with the help of computers that convert light signals into electrical signals and transfer them to the neurons of the human brain responsible for visual perception of specific objects.
Brain-computer interfaces are going to be the next big thing. What I am doing will help bring them about, and I am inspired by all of these applications. My interest keeps growing, and I am acquiring the skills necessary to be at the forefront of this field.