About me
Hello! I’m Aaryan (Pronouns: He/Him), a Computer Science Ph.D. student in the madS&P and madPL groups at the School of Computer, Information, and Data Sciences at the University of Wisconsin-Madison, working with Prof. Ethan Cecchetti. I am broadly interested in privacy and programming languages. I completed my bachelor’s degree in Computer Science with honors at Penn State University, where I was advised by Prof. Danfeng Zhang.
Research
I am broadly interested in security and privacy, programming languages, and program analysis.
Privacy nutrition labels are representations of privacy policies (data collection, sharing, and handling practices) that summarize those policies in a way that is easier for end-users to interpret, much like nutrition labels on food items. Google adopted privacy nutrition labels on the Play Store, showing end-users an app’s data handling practices before they install it. An Android developer wanting to publish their app on the Play Store must therefore provide an accurate description of how the app handles user data. Current work lets Android developers add annotations to their Java source code stating what they believe their code does with user data, and a tool extracts a high-level privacy label suitable for the Play Store. Think of this as adding specifications to method signatures describing what data is accessed and transmitted off the user’s device.
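The annotation idea above can be sketched in a few lines. This is a hypothetical Python illustration, not the actual tooling (which operates on Java/Android source); the decorator name, data-type strings, and aggregation function are all invented for exposition.

```python
# Hypothetical sketch: source-level annotations declare data practices,
# and a tool aggregates them into a high-level, Play-Store-style label.
from collections import defaultdict

REGISTRY = defaultdict(set)  # data type -> set of declared practices

def collects(data_type: str, shared_off_device: bool = False):
    """Mark a function as accessing `data_type`, optionally transmitting it."""
    def wrap(fn):
        REGISTRY[data_type].add("collected")
        if shared_off_device:
            REGISTRY[data_type].add("shared")
        return fn
    return wrap

@collects("location", shared_off_device=True)
def upload_position(lat, lon):
    ...  # sends coordinates to a server

@collects("contacts")
def pick_contact():
    ...  # reads the local address book

def privacy_label() -> dict:
    """Aggregate all annotations into a summary label per data type."""
    return {dt: sorted(ps) for dt, ps in REGISTRY.items()}
```

Here `privacy_label()` would report that location data is collected and shared off-device, while contacts are only collected locally.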
In my project, we aim to show that code satisfies these specifications and handles user data exactly as the developer expects. Our goal is a tool that can guarantee highly accurate privacy nutrition labels and can be used by any Android developer. To do this, we track information flow statically using a security type system. We organize privacy labels into a lattice (a partially ordered set in which every two elements have a supremum and an infimum). The security type system assigns a privacy label to each value and uses type checking to ensure, at compile time, that data never flows from a high context to a low one, as defined formally by the partial order on privacy labels. This ensures that the privacy labels on the Play Store reflect the behavior of the application’s source code.
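The lattice and flow check can be made concrete with a toy example. This is a minimal sketch of the general technique, not our actual type system: the label names ("public", "analytics", "personal") and the three-element chain are invented for illustration.

```python
# A toy privacy-label lattice (here a three-element chain) and the
# flow check a security type system would apply at compile time.
# flows_to[a] lists the labels that data labeled `a` may flow into,
# i.e. all labels b with a <= b in the partial order.
ORDER = {
    "public":    {"public", "analytics", "personal"},
    "analytics": {"analytics", "personal"},
    "personal":  {"personal"},
}

def flows_to(src: str, dst: str) -> bool:
    """Data labeled `src` may flow to a sink labeled `dst` iff src <= dst."""
    return dst in ORDER[src]

def join(a: str, b: str) -> str:
    """Least upper bound: the least label that both a and b may flow to."""
    uppers = ORDER[a] & ORDER[b]
    return next(u for u in uppers if all(flows_to(u, v) for v in uppers))

def check_assign(rhs_label: str, lhs_label: str) -> bool:
    """A type checker rejects assignments where the right-hand side's
    label does not flow to the left-hand side's label."""
    return flows_to(rhs_label, lhs_label)
```

For instance, `check_assign("personal", "analytics")` fails, capturing the rule that sensitive data must never reach a less-sensitive sink.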
During undergrad, I worked on a language and novel type system for automatically enforcing approximate differential privacy for programs that use Gaussian noise. The type system is sound, and it transforms a probabilistic differentially private program into a deterministic one that makes the privacy costs explicit using the characteristic function of the privacy loss random variable. The deterministic program can then be verified with an existing off-the-shelf verifier, such as Dafny or CPAchecker.
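For background, these are the standard definitions from the differential privacy literature (the specifics of our type system are not shown here): a mechanism $M$ is $(\epsilon, \delta)$-differentially private if, for all adjacent datasets $D, D'$ and all measurable sets $S$ of outputs,

```latex
\[
  \Pr[M(D) \in S] \le e^{\epsilon} \, \Pr[M(D') \in S] + \delta .
\]
% The privacy loss random variable compares the two output distributions
% on an outcome o drawn from M(D):
\[
  L_{D,D'} = \ln \frac{\Pr[M(D) = o]}{\Pr[M(D') = o]}, \qquad o \sim M(D),
\]
% and its characteristic function \phi(t) = E[ e^{i t L_{D,D'}} ]
% gives a composable account of the privacy cost.
```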
Another interesting project was my final project for CS 782 - Advanced Security and Privacy (taught by Prof. Kassem Fawaz), where I used large language models and static call-graph analysis to check consistency between commit messages and code changes on GitHub. I have also done an independent study with Prof. Loris D’Antoni on the capabilities of large language models for program synthesis, where I explored grammar-constrained decoding, a technique for generating output constrained by a context-free grammar.
Personal
Outside of work, I like weightlifting, running, swimming, and playing tennis.
News
January 2025: I started an independent study with Prof. Ethan Cecchetti
November 2024: I attended the Midwest Security Workshop
June 2024: I attended the Oregon Programming Languages Summer School (OPLSS)
September 2023: I started my Ph.D. at UW-Madison Computer Sciences
June 2023: I attended my first academic conference, Programming Language Design and Implementation (PLDI)