Jun20
UCLA Outstanding Research Award

I am honored to be one of the four graduating doctoral students to receive an Outstanding Research in CS award from the engineering school (HSSEAS) at UCLA.

May20
Defended PhD Thesis

I successfully defended my thesis (online).
It was an honor to have Adnan Darwiche, Jens Palsberg, Miryung Kim, Sumit Gulwani, and my advisor Todd Millstein on my committee!
Slides:   Data-Driven Learning of Specifications and Invariants

May20
PLDI Distinguished Paper Award

Our Hanoi paper received a distinguished paper award at PLDI 2020. This year, $5$ of the $341$ submitted papers received this award.
In UCLA news:   Computer Science
Other mentions:   PLDI tweet

Feb20

My work on data-driven inference of representation invariants, with Anders Miltner and Prof. David Walker at Princeton University and my advisor Prof. Todd Millstein, will appear at PLDI 2020.

We present a counterexample-driven algorithm to infer provably sufficient representation invariants that certify correctness of data-structure implementations. Our implementation, Hanoi, can automatically infer representation invariants for several common recursive data structures, such as sets, lists, and trees.
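To give a flavor of the counterexample-driven approach, here is a toy sketch: it infers a representation invariant for a "set as a list" implementation by iteratively collecting reachable ("good") states and misbehaving ("bad") states reported by a bounded checker. The candidate pool, the operations, and the checker are all illustrative simplifications, not Hanoi's actual interface.

```python
from itertools import product

def sorted_nondec(xs):
    return all(a <= b for a, b in zip(xs, xs[1:]))

def no_dups(xs):
    return len(xs) == len(set(xs))

# Candidate invariants, weakest first (a real tool synthesizes these).
CANDIDATES = [
    ("true",             lambda xs: True),
    ("no-dups",          no_dups),
    ("sorted",           sorted_nondec),
    ("sorted & no-dups", lambda xs: sorted_nondec(xs) and no_dups(xs)),
]

# The "set as a list" implementation under scrutiny.
def insert(xs, x):
    return xs if x in xs else sorted(xs + [x])

def member(xs, x):
    for y in xs:
        if y == x:
            return True
        if y > x:      # early exit -- only sound on sorted lists!
            return False
    return False

UNIVERSE = [list(t) for n in range(4) for t in product([1, 2, 3], repeat=n)]

def verify(inv):
    """Bounded checker standing in for a solver: returns a labeled
    counterexample, or None when inv is sufficient."""
    reachable = [[]]
    for _ in range(3):
        reachable += [insert(xs, x) for xs in reachable for x in (1, 2, 3)]
    for xs in reachable:               # inv must hold on reachable states
        if not inv(xs):
            return ("reachable", xs)
    for xs in UNIVERSE:                # inv must rule out misbehaving states
        if inv(xs) and (len(xs) != len(set(xs))   # size would be wrong
                        or any(member(xs, x) != (x in xs) for x in (1, 2, 3))):
            return ("bad", xs)
    return None

def infer_invariant():
    good, bad = [[]], []
    while True:
        name, inv = next((n, p) for n, p in CANDIDATES
                         if all(p(s) for s in good)
                         and not any(p(s) for s in bad))
        cex = verify(inv)
        if cex is None:
            return name
        kind, state = cex
        (good if kind == "reachable" else bad).append(state)

print(infer_invariant())  # -> sorted & no-dups
```

A real implementation replaces the bounded checker with solver queries and synthesizes candidate predicates instead of picking them from a fixed pool.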

Aug19
Syntactic Profiling Patent Granted

My first patent, with the PROSE group at Microsoft, on data profiling using program synthesis, was granted! The USPTO seems super backlogged; our application was pending for over two years.

Aug19
On PLDI '20 ERC

I was invited to serve on the external review committee of the PLDI 2020 conference.

Jul19
Presentation @ CAV '19
( New York City, NY )

I presented our results on overfitting in program synthesis at CAV 2019.
Slides:   Overfitting in Synthesis

Jun19
On POPL '20 AEC

I was invited to serve on the artifact evaluation committee of the POPL 2020 conference. See the SIGPLAN Empirical Evaluation Checklist for the “why” and “how” of conducting rigorous evaluations, and consider submitting the supporting artifacts for your research papers.
Deadlines:   10th July (papers)  ·  21st October (artifacts)

Jun19
CAV Student Travel Grant

I received a travel grant to attend the 31st CAV conference to be held in New York City next month.

May19
UCLA Dissertation-Year Fellowship

I am honored to be one of the final-year doctoral students who received the Dissertation-Year Fellowship (DYF) awarded by the UCLA Graduate Division.

Apr19

My work investigating overfitting in SyGuS, with my advisor Prof. Todd Millstein, Aditya Nori, and Rahul Sharma, will appear at CAV 2019 and be published in the Springer-Verlag LNCS series.

We define overfitting in the context of CEGIS-based SyGuS, and show that there exists a tradeoff between expressiveness and performance. We present two mitigation strategies: (1) a black-box approach that any existing tool can use, and (2) a white-box technique called hybrid enumeration.
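The tradeoff can be seen even in a toy enumerative synthesizer: a richer grammar subsumes a smaller one but inflates the search space, so the same solution takes more candidates to find. The grammars, the size-ordered enumerator, and the escalation loop below are illustrative stand-ins of our own; the paper's hybrid enumeration is considerably more sophisticated.

```python
import itertools

LEAVES = [("x", lambda x: x), ("0", lambda x: 0), ("1", lambda x: 1)]
OPS_LIA = [("+", lambda a, b: a + b), ("-", lambda a, b: a - b)]
OPS_NIA = OPS_LIA + [("*", lambda a, b: a * b)]   # strictly more expressive

def enumerate_exprs(ops, max_size):
    """Yield (expr-string, function) pairs in order of increasing AST size."""
    by_size = {1: list(LEAVES)}
    yield from by_size[1]
    for size in range(2, max_size + 1):
        by_size[size] = []
        for ls in range(1, size - 1):   # size = 1 (op) + left + right
            pairs = itertools.product(by_size[ls], by_size[size - 1 - ls])
            for (sl, fl), (sr, fr) in pairs:
                for so, fo in ops:
                    e = (f"({sl} {so} {sr})",
                         lambda x, fl=fl, fr=fr, fo=fo: fo(fl(x), fr(x)))
                    by_size[size].append(e)
                    yield e

def synthesize(ops, examples, max_size=5):
    """Return (first expression consistent with examples, #candidates tried)."""
    tried = 0
    for s, f in enumerate_exprs(ops, max_size):
        tried += 1
        if all(f(x) == y for x, y in examples):
            return s, tried
    return None, tried

def escalate(grammars, examples):
    """Try grammars from least to most expressive -- loosely the spirit of
    escalating through grammar levels before paying for expressiveness."""
    for ops in grammars:
        sol, _ = synthesize(ops, examples)
        if sol is not None:
            return sol
    return None

examples = [(0, 1), (1, 3), (3, 7)]     # spec: f(x) = 2x + 1
print(synthesize(OPS_LIA, examples))    # found after few candidates
print(synthesize(OPS_NIA, examples))    # same answer, more candidates tried
```

Both grammars find `(x + (x + 1))`, but the nonlinear grammar burns through noticeably more candidates to get there; escalation recovers the small-grammar cost whenever the small grammar suffices.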

Apr19 – Jun19
VSRC @ Princeton University
( Princeton, NJ )

I am visiting the PL group at Princeton University for the spring quarter. I will be collaborating with Prof. David Walker’s group and my advisor Prof. Todd Millstein, who is currently visiting there as well, on invariant synthesis and related problems.

Mar19
On SAS '19 and OOPSLA '19 AECs

I was invited to serve on the artifact evaluation committees of SAS 2019 and SPLASH-OOPSLA 2019 conferences. Artifact evaluation ensures that the results claimed in research papers are easily and accurately reproducible.
Unlike OOPSLA, SAS requests authors of all papers (not just accepted papers) to submit their artifacts, which are due immediately after the paper submission deadline. The SAS program committee will also have access to the artifact reviews by the artifact evaluation committee.
SAS Deadlines:   25th April (papers)  ·  25th April (artifacts)
OOPSLA Deadlines:   5th April (papers)  ·  8th July (artifacts)

Mar19
Joined SyGuS-Comp OC

I was invited to join the organizing committee of the annual SyGuS competition. Preparations for the 6th SyGuS-Comp, to be held with SYNT@CAV 2019, have already begun! Please take a look at the SyGuS language standard v2.0, and submit your benchmarks and/or solvers.
Deadlines:   1st May (benchmarks)  ·  14th June (solvers)

Feb19
On DebugML (at ICLR '19) PC

I was invited to serve on the program committee of the DebugML workshop at ICLR 2019. The goal of this workshop is to discuss a variety of topics, such as interpretability, verification, and human-in-the-loop debugging, that help developers build robust ML models. Please consider submitting your work.

Jan19

An extended version of my undergraduate thesis work on slicing functional programs, with Prasanna Kumar, Prof. Amey Karkare, and my then-advisor Prof. Amitabha Sanyal, will appear at CC 2019.

We propose a static analysis for slicing functional programs that precisely captures structure-transmitted dependencies and provides a weak form of context sensitivity, weakened to guarantee decidability. We also present an incremental version of this technique that is efficient in practice.
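For readers unfamiliar with slicing, here is a minimal backward slicer over straight-line assignments. It conveys only the basic idea of a slice; the paper's analysis handles full functional programs and tracks dependencies transmitted through data-structure components, which this sketch does not.

```python
def backward_slice(stmts, target):
    """stmts: list of (lhs, set of variables read); returns the subset of
    statements that the final value of `target` depends on."""
    needed, keep = {target}, []
    for lhs, rhs in reversed(stmts):
        if lhs in needed:
            keep.append((lhs, rhs))
            needed.discard(lhs)   # this definition satisfies the demand...
            needed |= rhs         # ...but creates demand for its inputs
    return list(reversed(keep))

# r depends (transitively) only on c and a; b and d are sliced away.
prog = [("a", {"x"}), ("b", {"y"}), ("c", {"a"}), ("d", {"b"}), ("r", {"c"})]
print(backward_slice(prog, "r"))
# -> [('a', {'x'}), ('c', {'a'}), ('r', {'c'})]
```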

Dec18
Web Manager for SyGuS Group

I was invited to serve as the web manager for the SyGuS group. We recently revamped our website, and are now maintaining repositories for the language standard, tools and benchmarks on our GitHub organization.

Nov18
Presentation @ OOPSLA '18
( Boston, MA )

I presented FlashProfile at SPLASH-OOPSLA 2018.

Oct18
Invited to Microsoft PhD Summit
( Redmond, WA )

I am super excited to attend this two-day workshop at the Microsoft Research Redmond campus and meet other Microsoft Research PhD fellows & award winners.

Sep18 – Mar19
Internship @ Microsoft Research
( Bengaluru, India )

I am interning with Rahul Sharma in the Systems group. I am investigating the expressiveness-vs-performance tradeoff in SyGuS tools, and techniques to mitigate it efficiently.

Jul18
FLoC Olympic Games Medal

For the second time, our loop invariant inference tool LoopInvGen (based on PIE) won the Inv track of SyGuS-Comp 2018. We received a FLoC Olympic Games medal, which is awarded every 4 years.
We solved $91\%$ of the benchmarks; we were the fastest solver on $89\%$ of them, and produced the shortest invariants for $74\%$ of them.
In UCLA news:   Computer Science

Jul18

I was invited to serve on the artifact evaluation committee of the SPLASH-OOPSLA 2018 conference. Artifact evaluation ensures that the results claimed in research papers are easily and accurately reproducible.

Jun18

My research on pattern-based profiling, with the PROSE group at Microsoft and my advisor Prof. Todd Millstein, will appear at SPLASH-OOPSLA 2018 and be published in the PACMPL journal.

We present a technique, called FlashProfile, to generate hierarchical data profiles. Existing tools, including commercial ones, generate a single flat profile, and are often overly general or incomplete. Furthermore, we show that data profiles can improve the accuracy and efficiency of programming-by-example (PBE) techniques.
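As a rough illustration of pattern-based profiling, one can abstract each string into a sequence of syntactic atoms and group the strings by the resulting pattern. The atom classes and notation below are ours and far simpler than FlashProfile's learned hierarchical patterns.

```python
import re
from collections import Counter

# Atom classes, tried in priority order; unmatched characters are kept
# literally.  These classes are illustrative, not FlashProfile's.
ATOMS = [(r"[0-9]+",      "<Digits>"),
         (r"[A-Z][a-z]+", "<TitleWord>"),
         (r"[A-Za-z]+",   "<Alpha>"),
         (r"\s+",         "<Space>"),
         (r".",           None)]

def pattern_of(s):
    out, i = [], 0
    while i < len(s):
        for rx, name in ATOMS:
            m = re.match(rx, s[i:])
            if m:
                out.append(name if name else m.group())
                i += m.end()
                break
    return "".join(out)

def profile(strings):
    """A flat profile: patterns with their frequencies, most common first."""
    return Counter(pattern_of(s) for s in strings).most_common()

print(profile(["PB-123", "PB-4567", "pb-89", "N/A"]))
# -> [('<Alpha>-<Digits>', 3), ('<Alpha>/<Alpha>', 1)]
```

A hierarchical profile would additionally refine each pattern, e.g., splitting `<Alpha>` into uppercase and lowercase variants when that better describes a cluster of strings.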

Nov17 – Jun18
(Remote) RSDE @ Microsoft Research
( Redmond, WA )

As a part-time remote RSDE, I am continuing the exciting work with Ben Zorn, Rishabh Singh (now at Google Brain), and Alex Polozov on generating insights about tabular data in spreadsheets.

Jul17
Winner of SyGuS-Comp '17 (Inv)

Our loop invariant inference tool LoopInvGen (based on PIE) won the Inv track of SyGuS-Comp 2017. On average, LoopInvGen was ~70x faster than the runner-up (CVC4).

Jul17
Attended the 1st DSSS

This two-week-long DeepSpec Summer School discussed state-of-the-art techniques for specifying and verifying full functional correctness of everything from low-level hardware instructions to user-level software. The lectures covered several systems built on top of the Coq proof assistant, such as the Verified Software Toolchain (VST), CertiKOS, Vellvm, and more.

Jun17 – Sep17
Internship @ Microsoft Research
( Redmond, WA )

I am interning with Ben Zorn and Rishabh Singh in the RiSE group, and am working on detecting and repairing inconsistencies in spreadsheet data.

May17
Oral Qualifying Examination

I passed the OQE, and have now advanced to candidacy!

Feb17
MSR PhD Fellowship

I am honored to be one of the 10 PhD candidates in the US awarded the Microsoft Research PhD Fellowship for 2017–19.
In UCLA news:   Computer Science  ·  Engineering School (HSSEAS)
Other mentions:   Microsoft Research blog post  ·  Microsoft Research tweet  ·  HSSEAS tweet

Jan17
Talk @ Microsoft Research
( Bengaluru, India )

I presented the PIE & LoopInvGen work at the Microsoft Research India lab. (Invited by Rahul Sharma)

Jun16 – Dec16
Internship @ Microsoft
( Redmond, WA )

I am interning with the PROSE group led by Sumit Gulwani. I am working on efficient syntactic profiling techniques for strings and applying them to improve existing program synthesis techniques.

Jun16
Presentation @ PLDI '16
( Santa Barbara, CA )

I presented PIE & LoopInvGen at PLDI 2016.

May16
Talk @ Stanford University
( Stanford, CA )

I presented the PIE & LoopInvGen work at the Software Research Lunch. (Invited by Rahul Sharma)

May16
Talk @ UC Berkeley
( Berkeley, CA )

I presented the PIE & LoopInvGen work to the programming languages and software verification group at UC Berkeley. (Invited by Prof. Sanjit Seshia)

May16
Written Qualifying Examination

I passed the WQE — one step closer to getting my PhD!

May16
Attended the 6th SSFT
( Menlo College, CA )

This one-week-long Summer School on Formal Techniques discussed state-of-the-art tools for formal verification, with practical exercises in CVC4, PVS, Why3, and others. I would definitely recommend it to anyone interested in the field.

Mar16 – Jun16
( Dept. of Computer Science, UCLA )

I am a teaching assistant for the upper-division Programming Languages (CS 131) course, taught by my advisor Prof. Todd Millstein, for the second time (after Fall 2014). For those interested, my notes are publicly available in the S16_TA_CS131 repo.

Jan16

My research on data-driven precondition inference, with Rahul Sharma and my advisor Prof. Todd Millstein, will appear at PLDI 2016.

We present a technique, called the precondition inference engine (PIE), which uses on-demand feature learning to automatically infer a precondition that guarantees a given postcondition. We also use PIE to construct a novel automatic verification system called LoopInvGen.
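The data-driven flavor of this approach can be sketched in a few lines: sample inputs, label each by whether the postcondition holds, and search for a boolean combination of features separating the two sets. The function, feature pool, and brute-force conjunction learner below are deliberately naive stand-ins; PIE grows new features on demand and uses a richer boolean learner.

```python
from itertools import combinations

def f(xs, i):
    return xs[i]               # function under analysis

def post(xs, i):               # postcondition: f does not raise
    try:
        f(xs, i)
        return True
    except IndexError:
        return False

# A fixed feature pool; PIE instead synthesizes features on demand.
FEATURES = [("i >= 0",        lambda xs, i: i >= 0),
            ("i < len(xs)",   lambda xs, i: i < len(xs)),
            ("i >= -len(xs)", lambda xs, i: i >= -len(xs)),
            ("len(xs) > 0",   lambda xs, i: len(xs) > 0)]

def learn_precondition(samples):
    pos = [s for s in samples if post(*s)]
    neg = [s for s in samples if not post(*s)]
    # Smallest conjunction true on every positive, false on every negative.
    for k in range(1, len(FEATURES) + 1):
        for conj in combinations(FEATURES, k):
            if (all(all(p(*s) for _, p in conj) for s in pos)
                    and all(any(not p(*s) for _, p in conj) for s in neg)):
                return " and ".join(name for name, _ in conj)
    return None

samples = [(xs, i) for xs in ([], [1], [1, 2], [1, 2, 3])
                   for i in range(-4, 5)]
print(learn_precondition(samples))
# -> i < len(xs) and i >= -len(xs)
```

Note that the learned precondition correctly permits negative indices down to `-len(xs)`, since Python indexing allows them; a purely specification-driven guess of `0 <= i < len(xs)` would be needlessly strong.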

Dec15
Talk @ SoCalPLS '15F
( Pomona College, CA )

I presented my work on data-driven precondition inference at the 15th Southern California Programming Languages and Systems (SoCalPLS) workshop.