@def title = "Ezgi's Garden"
@def date = Date(2021, 06, 05)
@@row
@@container
<img
class="left"
id="profpic"
src="assets/pic_ezgi.jpeg"
>
Hi, I am Ezgi (she/they). I am a PhD candidate in Electrical and Computer Engineering at New York University, where I'm advised by Elza Erkip. I hold an (integrated) MEng degree in Electrical and Electronic Engineering from Imperial College London. Recent collaborators include Jona Ballé, Aaron B. Wagner, and Deniz Gündüz.
I am a collaborative researcher and enjoy working with people from diverse backgrounds. My current research is driven by a passion for connecting theory and practice in compression and telecommunication problems, particularly in distributed scenarios. I leverage tools from deep learning, signal processing, data compression, and information theory to obtain interpretable results. When I am not busy with research, I enjoy hiking and expanding my food and coffee ☕ palate.
I am always happy to chat about topics at the intersection of information theory and deep/machine learning -- feel free to drop me an email at ezgi(dot)ozyilkan(at)nyu(dot)edu!
Useful links: Google Scholar | LinkedIn | arXiv | GitHub
@@
@@
I'm actively looking for a research internship for summer 2025. Feel free to reach out if you think there's a good fit!
- October 2024: Our recent preprint titled "Learning-Based Compress-and-Forward Schemes for the Relay Channel" got accepted to the IEEE Journal on Selected Areas in Communications (JSAC) and will appear, as part of this special issue, in 2025!
- September 2024: I presented our work titled "Neural Compress-and-Forward for the Relay Channel" at SPAWC 2024, in the beautiful Italian city of Lucca! Here is the poster.
- July 2024: Our workshop proposal (compression + machine learning) for NeurIPS 2024 has been accepted! Details will follow shortly :)
- July 2024: Our recent work titled "Neural Compress-and-Forward for the Relay Channel" got accepted to the IEEE International Workshop on Signal Processing Advances in Wireless Communications (SPAWC) 2024!
- May 2024: In our recent preprints ([1] and [2]), we propose neural "compress-and-forward" (CF) schemes for the relay channel that leverage my previous work on neural distributed compression. Our proposed neural CF schemes operate close to the maximum achievable rate in a "primitive relay channel" and also yield interpretable results :)
- April 2024: Our recent work titled "Neural Distributed Compressor Discovers Binning" got accepted to the IEEE Journal on Selected Areas in Information Theory (JSAIT), as part of the special issue dedicated to Toby Berger.
- April 2024: Our recent preprint on robust distributed lossy compression was accepted to the 2024 IEEE International Symposium on Information Theory Workshops (ISIT'24 Wkshps)!
- March 2024: In our recent preprint, we extend our neural distributed lossy compression framework to more robust/general compression settings -- for example, where side information may be absent. We demonstrate that our learned compressors mimic the theoretical optimum and yield interpretable results :)
- February 2024: Our recent survey titled "Distributed Compression in the Era of Machine Learning: A Review of Recent Advances" will appear at the Conference on Information Sciences and Systems (CISS'24) as an invited paper! Preprint is available here.
- January 2024: The full program for our 'Learn to Compress' workshop @ ISIT'24 (including keynote speakers and call for papers) is out.
- December 2023: "Distributed Deep Joint Source-Channel Coding with Decoder-Only Side Information" was accepted to the inaugural 2024 IEEE International Conference on Machine Learning for Communication and Networking (ICMLCN)! Preprint is available here.
- November 2023: Our proposal "Learn to Compress" has been accepted as a workshop at ISIT 2024. The proposal was put forward by Aaron Wagner (Cornell University), Elza Erkip (NYU), and me. We will release more details about this workshop in December -- but in the meantime, feel free to check out our workshop website!
- October 2023: A draft of the journal version of our previous ISIT 2023 paper is available on arXiv! We demonstrate that the neural distributed compressor mimics the theoretical optimum for a wider range of example sources :)
- July 2023: I presented our work titled "Neural Distributed Compressor Does Binning" at the Neural Compression Workshop @ ICML'23. Here are the slides.
- July 2023: I was selected as the best reviewer for the Neural Compression Workshop @ ICML'23.
- July 2023: Our recent ISIT'23 work was accepted for an oral presentation at the Neural Compression Workshop @ ICML'23.
- June 2023: I presented our work titled "Learned Wyner--Ziv Compressors Recover Binning" at the International Symposium on Information Theory (ISIT) 2023. Here are the slides!
- June 2023: I presented a poster about our upcoming ISIT'23 paper at the North American School of Information Theory (NASIT) 2023.
- May 2023: I presented a poster titled "Neural Distributed Compressor Does Binning" at the UC Berkeley Simons Institute's workshop on Information-Theoretic Methods for Trustworthy Machine Learning.
- April 2023: "Learned Wyner--Ziv Compressors Recover Binning" was accepted to the International Symposium on Information Theory (ISIT) 2023. Preprint is available here!
- December 2022: "Learned Disentangled Latent Representations for Scalable Image Coding for Humans and Machines" was accepted to the Data Compression Conference (DCC) 2023.
- August 2022: I presented a poster titled "Neural Distributed Source Coding" at the North American School of Information Theory (NASIT) 2022.
- June 2022: Interning at InterDigital's Emerging Technologies Lab in Los Altos, CA.