The early months of 2021 have been a productive period for Dr. Brandon Reagen, an assistant professor of Electrical and Computer Engineering (ECE) and a member of the NYU Center for Cybersecurity (CCS). His recent initiatives to enhance data security have included two different projects utilizing an increasingly popular form of encryption called fully homomorphic encryption (FHE).
As explained in a Research Brief issued by NYU’s Tandon School of Engineering in March, homomorphic data encryption allows computing operations to be performed on encrypted data without decrypting it first. It has proven effective in protecting sensitive data in medical, military, government and other settings. However, computing on encrypted data imposes a high latency penalty, stemming mostly from non-linear operators like ReLU (the rectified linear unit). In response, Reagen and Dr. Siddharth Garg, an associate professor of ECE, proposed DeepReDuce, a set of optimizations that remove ReLUs and, in doing so, reduce private inference latency. Working with two Ph.D. candidates, Nandan Kumar Jha and Zahra Ghodsi (the latter now a postdoctoral researcher at the University of California, San Diego), and advancing an idea from an earlier paper authored by the same team (except for Jha), the researchers found that dropping ReLUs from classic networks significantly reduced inference latency while maintaining high accuracy. The results of this research will be presented in July at the 2021 International Conference on Machine Learning.
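To make the core idea concrete, here is a minimal sketch of homomorphic computation using a toy Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of their plaintexts, so a third party can compute on data it cannot read. This is an illustration only, not the FHE schemes used in the research, and the hardcoded tiny primes are hypothetical and wildly insecure.

```python
import math
import random

# Toy Paillier cryptosystem -- additively homomorphic, for illustration only.
# These tiny primes are a hypothetical demo choice; real keys use huge primes.
p, q = 61, 53
n = p * q                      # public modulus
n2 = n * n
g = n + 1                      # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)   # private key (Carmichael function of n)
mu = pow(lam, -1, n)           # modular inverse of lam mod n (valid when g = n + 1)

def encrypt(m):
    """Encrypt integer m (0 <= m < n) under the public key (n, g)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext using the private key (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n2         # multiply ciphertexts = add the hidden plaintexts
print(decrypt(c_sum))          # → 42, computed without ever decrypting c1 or c2
```

Fully homomorphic schemes extend this property to both addition and multiplication, which is what makes arbitrary computation, including neural network inference, possible on encrypted data.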
More recently, Reagen has joined with Dr. Mihalis Maniatakos, a research assistant professor of electrical and computer engineering at Tandon, to pursue a different application of FHE. The research team will use FHE to create microchips that can move through a supply chain in their encrypted state, greatly reducing the risk of tampering. Their work is supported by a three-and-a-half-year, $14 million grant from the Defense Advanced Research Projects Agency (DARPA), and they will be working in collaboration with the data security company Duality.
“With increased privacy concerns and tightening data protection regulations, organizations across industries are looking for secure computing methods to train and execute AI models without exposing sensitive or confidential data,” said Maniatakos in a May 13 article posted on the Tandon news site. “For example, FHE could be of great value for such applications as personal DNA sequencing to scan for predisposition to some diseases. FHE lets you encrypt your genome, send it to a service that predicts your predisposition to diseases, and get the result back and decrypt it. The outsourced service cannot see your genome and cannot use it against you in any way.”