Publications

Canary Extraction in Natural Language Understanding Models

Published in ACL Main Conference, 2022

In this work we demonstrate a white-box model inversion attack on Natural Language Understanding models. We show that an adversary with access to the model parameters can extract sensitive information from the training data.

Recommended citation: Parikh, Rahil, Christophe Dupuy, and Rahul Gupta. "Canary Extraction in Natural Language Understanding Models." arXiv preprint arXiv:2203.13920 (2022). https://arxiv.org/pdf/2203.13920.pdf

Harmonicity Plays a Critical Role in DNN Based Versus in Biologically-Inspired Monaural Speech Segregation Systems

Published in ICASSP, 2022

In this work we demonstrate that end-to-end deep neural network speech segregation models rely on the harmonic structure of speech to group and segregate sources. We show that these networks fail entirely to separate inharmonic sources and cannot learn to segregate speech when trained on mixtures of inharmonic speech.

Recommended citation: Parikh, Rahil, et al. "Harmonicity Plays a Critical Role in DNN Based Versus in Biologically-Inspired Monaural Speech Segregation Systems." ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2022. https://arxiv.org/pdf/2203.04420.pdf

Acoustic To Articulatory Speech Inversion Using Multi-Resolution Spectro-Temporal Representations Of Speech Signals

Published in GlobalSIP, 2018

In this work we develop a speech inversion system that predicts vocal tract parameters from cortical features of acoustic speech. We demonstrate that the cortical features are correlated with the vocal tract parameters, highlighting a link between the auditory theory of speech perception and the motor theory of speech production.

Recommended citation: Parikh, Rahil, et al. "Acoustic To Articulatory Speech Inversion Using Multi-Resolution Spectro-Temporal Representations Of Speech Signals." arXiv preprint arXiv:2203.05780 (2022). https://arxiv.org/pdf/2203.05780.pdf