During ICML reviews I noticed that my personal take on reviewing is becoming increasingly distinct from my peers'. Personally, I want to go to a conference and come away with renewed creativity and productivity. Thus, I like works that are thought-provoking, groundbreaking, or particularly innovative, even if the execution is a bit off.
Here’s something to noodle on while you finalize your ICML submissions. Have you ever heard of Max Martin? You probably haven’t, which is something considering he (currently) has 21 #1 hits in the United States. Lennon (26) and McCartney (32) have more, but Max Martin has the advantage of still being alive to catch up.
Yesterday I posted on my favourite papers from the beginning of ICML (some of those papers were actually presented today, although the posters were
ICML is too big for me to “review” it per se, but I can provide a myopic perspective. The heavy-hitting topics were Deep Learning, Reinforcement Learning, and Optimization, but there was a heavy tail of topics receiving attention. Deep learning felt less dominant this year; its success has led to multiple application-specific alternative venues (e.g.
The outcome of the election for the IMLS (which runs ICML) adds Emma Brunskill and Hugo Larochelle to the board.
Welcome to my ICML 2019 jetlag special, because what else do you do when you wake up earlier than everyone else but write a blog post? Here’s a paper that was presented yesterday which I really liked.
and I are organizing a workshop at ICML this year, titled
Workshop URL: https://sites.google.com/view/implicitmodels/
Given how recent and impactful this topic is, I’m personally excited to see how we might foster discussion across communities.
Two exciting NLP papers at ICML 2018!
The ICML 2018 accepted papers are out, and I am excited about two papers that I will briefly outline here. I think both papers are phenomenally good and will bring structured prediction in NLP back to modern deep learning architectures.
ICML ended two weeks ago, but I still wanted to write about my reading list, as there were some quite interesting papers (the proceedings are here).
ICML 2017 has just ended. While Sydney is remote for those in Europe and North America, the conference center is a wonderful venue (with good coffee!), and the city is a lot of fun. Everything went smoothly, and the organizers did a great job. You can get a list of papers that I liked from my Twitter feed, so instead I’d like to discuss some broad themes I sensed.
At ICML I recently published a paper that I somehow decided to title “A Divergence Bound for Hybrids of MCMC and Variational Inference and an Application to Langevin Dynamics and SGVI”. This paper gives one framework for building “hybrid” algorithms between Markov chain Monte Carlo (MCMC) and variational inference (VI).
ICML, ICLR, and NeurIPS are all considering or experimenting with code and data submission as part of the review or publication process, with the hypothesis that it aids reproducibility of results. Reproducibility has been a rising concern, with discussions in papers, workshops, and invited talks.
The fundamental driver is of course lack of reproducibility.
ICML 2019 has an option for supplementary code submission that authors can use to provide additional evidence to bolster their experimental results. Since we have been getting a lot of questions about it, here is an FAQ for authors.
1. Is code submission mandatory?
The ICML 2019 Conference will be held from June 10-15 in Long Beach, CA, about a month earlier than last year. To encourage reproducibility as well as high-quality submissions, this year we have three major changes in place.
There is an abstract submission deadline on Jan 18, 2019.