I am the Founding Scientist at Permanence AI, where I work on building safe and effective language models. Before that, I was a Research Scientist at ASAPP. I received my Ph.D. from the University of Texas at Austin under the joint supervision of Prof. Sriram Vishwanath and Prof. Alex Dimakis.
I completed my B.E. in Electrical Engineering from The Cooper Union for the Advancement of Science and Art in 2012 and received my M.S. in Electrical and Computer Engineering from UT Austin in 2014. I am an alumnus of the Wireless Networking and Communications Group (WNCG), the Laboratory for Informatics, Networks and Communications (LINC), the Machine INtelligence and Decision Systems initiative (UT-MINDS), and the Signal Processing, Communications & Computer Engineering group (S*PROCOM²).
Shivanshu Gupta, Clemens Rosenbaum, and Ethan R. Elenberg. ‘‘GistScore: Learning Better Representations for In-Context Example Selection with Gist Bottlenecks’’, in Proc. International Conference on Machine Learning (ICML), 2024. [paper] [code] [poster] [preview video]
Shivanshu Gupta, Clemens Rosenbaum, and Ethan R. Elenberg. ‘‘GistScore: Learning Better Representations for In-Context Example Selection with Gist Bottlenecks’’, in Proc. North American Chapter of the Association for Computational Linguistics Student Research Workshop (NAACL SRW), 2024. Best Paper Award. [arXiv]
Loay Mualem, Ethan R. Elenberg, Moran Feldman, and Amin Karbasi. ‘‘Submodular Minimax Optimization: Finding Effective Sets’’, in Proc. International Conference on Artificial Intelligence and Statistics (AISTATS), 2024. [arXiv]
Anmol Kabra and Ethan R. Elenberg. ‘‘Domain Private Transformers for Multi-Domain Dialog Systems’’, in Proc. Findings of the Association for Computational Linguistics: EMNLP, 2023. [arXiv] [code] [poster]
Ramya Ramakrishnan, Ethan R. Elenberg, Hashan Narangodage, and Ryan McDonald. ‘‘Multi-Step Dialogue Workflow Action Prediction’’. [arXiv]
Paloma Sodhi, Felix Wu, Ethan R. Elenberg, Kilian Q. Weinberger, and Ryan McDonald. ‘‘On the Effectiveness of Offline RL for Dialogue Response Generation’’, in Proc. International Conference on Machine Learning (ICML), 2023. [arXiv] [poster]
Nihal V. Nayak, Ethan R. Elenberg, and Clemens Rosenbaum. ‘‘CEREAL: Few-Sample Clustering Evaluation’’. [arXiv]
Geoff Pleiss, Tianyi Zhang, Ethan R. Elenberg, and Kilian Q. Weinberger. ‘‘Identifying Mislabeled Data using the Area Under the Margin Ranking’’, in Proc. Neural Information Processing Systems (NeurIPS), 2020, pp. 17044-17056. [arXiv] [code] [project page]
Hardik Jain, Ethan R. Elenberg, Ankit Singh Rawat, and Sriram Vishwanath. ‘‘Coded Access Architectures for Dense Memory Systems’’, in Proc. Future Technologies Conference (FTC), 2020. [doi] [preprint]
Jeremy Wohlwend, Ethan R. Elenberg, Samuel Altschul, Shawn Henry, and Tao Lei. ‘‘Metric Learning for Dynamic Text Classification’’, in Proc. EMNLP Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo), 2019, pp. 143-152. Oral Presentation (top 8% of accepted papers). [arXiv (extended)] [doi] [code]
Maurice Diesendruck, Ethan R. Elenberg, Rajat Sen, Guy W. Cole, Sanjay Shakkottai, and Sinead A. Williamson. ‘‘Importance Weighted Generative Networks’’, in Proc. European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD), 2019. [pdf] [arXiv (extended)]
Ethan R. Elenberg, Rajiv Khanna, Alexandros G. Dimakis, and Sahand Negahban. ‘‘Restricted Strong Convexity Implies Weak Submodularity’’, The Annals of Statistics, vol. 46, no. 6B, pp. 3539-3568, 2018. [arXiv] [doi]
Ethan R. Elenberg, Alexandros G. Dimakis, Moran Feldman, and Amin Karbasi. ‘‘Streaming Weak Submodularity: Interpreting Neural Networks on the Fly’’, in Proc. Neural Information Processing Systems (NeurIPS), 2017, pp. 4047-4057. Oral presentation (top 6% of accepted papers). [pdf] [arXiv (preprint)] [poster] [NeurIPS video] [preview video] [code]
Rajiv Khanna, Ethan R. Elenberg, Alexandros G. Dimakis, Joydeep Ghosh, and Sahand Negahban. ‘‘On Approximation Guarantees for Greedy Low Rank Optimization’’, in Proc. International Conference on Machine Learning (ICML), 2017, pp. 1837-1846. [pdf] [arXiv (preprint)]
Rajiv Khanna, Ethan R. Elenberg, Alexandros G. Dimakis, Sahand Negahban, and Joydeep Ghosh. ‘‘Scalable Greedy Feature Selection via Weak Submodularity’’, in Proc. International Conference on Artificial Intelligence and Statistics (AISTATS), 2017, pp. 1560-1568. [pdf] [arXiv (preprint)]
Anthony Bonato, David Ryan D’Angelo, Ethan R. Elenberg, David F. Gleich, and Yangyang Hou. ‘‘Mining and Modeling Character Networks’’, in Proc. Workshop on Algorithms and Models for the Web Graph (WAW), 2016, pp. 100-114. [arXiv] [doi] [code]
Ethan R. Elenberg, Rajiv Khanna, Alexandros G. Dimakis, and Sahand Negahban. ‘‘Restricted Strong Convexity Implies Weak Submodularity’’, in Proc. NeurIPS Workshop on Learning in High Dimensions with Structure, 2016. [pdf]
Ethan R. Elenberg, Karthikeyan Shanmugam, Michael Borokhovich, and Alexandros G. Dimakis. ‘‘Distributed Estimation of Graph 4-profiles’’, in Proc. International World Wide Web Conference (WWW), 2016, pp. 483-493. [arXiv (extended)] [doi] [slides] [code]
Ethan R. Elenberg, Karthikeyan Shanmugam, Michael Borokhovich, and Alexandros G. Dimakis. ‘‘Beyond Triangles: A Distributed Framework for Estimating 3-profiles of Large Graphs’’, in Proc. ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2015, pp. 229-238. [arXiv (extended)] [doi] [slides] [code]
Jonathan I. Tamir, Ethan R. Elenberg, Anurag Banerjee, and Sriram Vishwanath. ‘‘Wireless Index Coding Through Rank Minimization’’, in Proc. IEEE International Conference on Communications (ICC), 2014, pp. 5220-5225. [pdf] [doi]
Joe Baylon, Ethan R. Elenberg, and Samantha G. Massengill. ‘‘iSCISM: interference Sensing and Coexistence in the ISM Band’’, High Frequency Electronics, vol. 11, no. 4, pp. 30-46, Apr. 2012. [pdf]