Introduced to computing at the age of five, Animesh Tripathi can talk code and algorithms as fluently as a second language. From creating animations to building websites, he had explored the diversity of the programming world by the time he was in ninth grade. From his first assignment, which earned him $50, to his latest, worth $2,000, he has worked with a number of start-ups and won competitions at the international level.
When a friend aspiring to join the Indian Air Force was diagnosed as colour-blind, Animesh was introduced to the fact that 7% of the world's population shares this condition. One in 12 men and one in 200 women in the world live with colour blindness, either because of a genetic condition or a disease. From Bill Clinton to Mark Zuckerberg, people with deficient colour vision can see things as clearly as others, but they are unable to fully distinguish between red, green and blue light. (The most common type is red/green colour blindness.)
Tripathi was intrigued by how colour-blind people interact with the Web and digital devices, and began looking for relevant research and solutions. To his surprise, there existed no exact science for making images friendlier to the colour-blind. So he decided to test methods from scratch, writing algorithms in MATLAB to optimise images so that they would be clearer even to people with deficient colour vision.
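The article does not detail Tripathi's algorithms, but a common starting point in the colour-correction literature is "daltonization": first simulate how a colour-deficient viewer sees an image, then shift the information lost in that simulation into channels the viewer can still distinguish. Below is a minimal sketch in Python rather than MATLAB, using the widely cited Viénot-style RGB-to-LMS matrices for protanopia; the matrix values and the error-redistribution heuristic are standard illustrative choices from the literature, not his published method:

```python
import numpy as np

# Linear RGB -> LMS cone-response transform (a commonly used matrix;
# an illustrative choice, not Tripathi's actual algorithm)
RGB_TO_LMS = np.array([[17.8824, 43.5161, 4.1194],
                       [3.4557, 27.1554, 3.8671],
                       [0.0300, 0.1843, 1.4671]])
LMS_TO_RGB = np.linalg.inv(RGB_TO_LMS)

# Protanopia simulation: the missing L (red) response is reconstructed
# from the M and S channels, so whites stay white
PROTAN = np.array([[0.0, 2.02344, -2.52581],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])

def simulate_protanopia(rgb):
    """Approximate how a protanope (red-deficient viewer) sees an RGB value.

    `rgb` is a float array in [0, 1], shape (..., 3).
    """
    lms = rgb @ RGB_TO_LMS.T
    return (lms @ PROTAN.T) @ LMS_TO_RGB.T

def daltonize(rgb, strength=0.7):
    """Shift the colour information lost to protanopia into visible channels."""
    error = rgb - simulate_protanopia(rgb)          # what the viewer misses
    # Redistribute the red-channel error into green and blue
    # (a common heuristic; the 0.7 weights are illustrative)
    redistribute = np.array([[0.0, 0.0, 0.0],
                             [0.7, 1.0, 0.0],
                             [0.7, 0.0, 1.0]])
    return np.clip(rgb + strength * (error @ redistribute.T), 0.0, 1.0)
```

Applied per pixel across an image, a pipeline like this makes hues that a red-deficient viewer would confuse (for example, pure red versus dark grey) diverge in the green and blue channels instead.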
Tripathi likes the edgy, flamboyant flat look that the user interfaces of smart devices are transitioning to, but believes the accessibility features built into them need to serve even the mildest conditions users may face.
With his research and results, he entered the Google Science Fair 2013 and made it to the Regional Finals for the Asia-Pacific. With four image-optimisation algorithms, each a candidate for the most effective approach to colour correction, he entered the Intel International Science and Engineering Fair this year, coming fourth. Soon he had a group of 250 volunteers helping him pick the best-looking of the 40 images he sent out to them. With this survey, based on the Ishihara test, Tripathi zeroed in on a method with 76.6% accuracy and went on to build a Google Chrome plug-in that optimises any page for the colour-blind at the click of a button. The plug-in is ready to go live, but he is waiting for the patents to be sorted out before launching it.
Tripathi likes to travel, hike and trek when he is not figuring out ways to improve the UI and accessibility options on devices. He looks forward to working with Google to deploy colour correction for the colour-blind as an integrated feature in Chrome and Android OS.
Animesh heads off later this month to pursue engineering in Computer Science at the University of Illinois, where he plans to work on image processing and artificial intelligence. He sounds excited at the prospect of working with more start-ups, and even hints at the possibility of setting up a company of his own.