Jeffrey Pawlick and Quanyan Zhu
Internet tracking technologies and wearable electronics provide a vast amount of data to machine learning algorithms. This stock of data stands to increase with the development of the Internet of Things and cyber-physical systems. Clearly, these technologies promise benefits. But they also raise the risk of sensitive information disclosure. To mitigate this risk, machine learning algorithms can add noise to their outputs according to the formulations provided by differential privacy. At the same time, users can fight for privacy by injecting noise into the data that they report.
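To make the two noise-adding strategies concrete, the sketch below (not taken from the paper) illustrates learner-side output perturbation with the Laplace mechanism and user-side obfuscation of reported data; the function names, noise scales, and clipping assumptions are purely illustrative.

```python
import numpy as np

def laplace_mechanism(query_result, sensitivity, epsilon, rng=None):
    """Learner-side differential privacy: perturb a query output.

    Adds Laplace noise with scale sensitivity/epsilon, which satisfies
    epsilon-differential privacy for a query with the given L1 sensitivity.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return query_result + rng.laplace(loc=0.0, scale=scale)

def obfuscate_report(true_value, noise_scale, rng=None):
    """User-side obfuscation: a user perturbs data before reporting it.

    The noise_scale parameter is a hypothetical knob; in practice a user
    would choose it by trading off privacy against service utility.
    """
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=noise_scale,
                                    size=np.shape(true_value))

# Illustration: a mean over 1000 user reports, protected on both sides.
rng = np.random.default_rng(0)
true_data = rng.normal(loc=50.0, scale=10.0, size=1000)

# Users inject noise before reporting their data.
reported = obfuscate_report(true_data, noise_scale=5.0, rng=rng)

# The learner releases the mean of the reports with output noise.
# Assuming reports are clipped to [0, 100], the mean over n reports
# has L1 sensitivity of roughly 100 / n.
mean_of_reports = np.clip(reported, 0.0, 100.0).mean()
private_mean = laplace_mechanism(mean_of_reports,
                                 sensitivity=100.0 / 1000,
                                 epsilon=0.5,
                                 rng=rng)
print(private_mean)
```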