The performance of machine learning methods is heavily dependent on the choice of data representation (or features) on which they are applied. The rapidly developing field of representation learning is concerned with questions surrounding how we can best learn meaningful and useful representations of data. We take a broad view of the field and include topics such as deep learning and feature learning, metric learning, compositional modeling, structured prediction, reinforcement learning, and issues regarding large-scale learning and non-convex optimization. The range of domains to which these techniques apply is also very broad, from vision and speech recognition to text understanding, gaming, and music.
A non-exhaustive list of relevant topics:
- Unsupervised, semi-supervised, and supervised representation learning
- Representation learning for planning and reinforcement learning
- Metric learning and kernel learning
- Sparse coding and dimensionality expansion
- Hierarchical models
- Optimization for representation learning
- Learning representations of outputs or states
- Implementation issues, parallelization, software platforms, hardware
- Applications in vision, audio, speech, natural language processing, robotics, neuroscience, or any other field
The program will include keynote presentations from invited speakers, oral presentations, and posters.
ICLR features two tracks: a Conference Track and a Workshop Track. Submissions of extended abstracts to the Workshop Track will be accepted after the decision notifications for Conference Track submissions are sent. Some of the submitted Conference Track papers that are not accepted to the conference proceedings will be invited for presentation in the Workshop Track.
The goal is to improve the quality of the overall reviewing process. By using OpenReview, authors can update their paper and respond to comments at any time. In addition, anybody in the community can comment on submissions, and reviewers can leverage public discussions to improve their understanding and rating of papers.
Authors are asked to submit their paper by November 4th (5:00pm Eastern Daylight Time, EDT) to:
The submission deadline will be strictly enforced. There is no strict limit on paper length. However, we strongly recommend keeping the paper to 8 pages, plus 1 page for the references and as many pages as needed in an appendix section (all in a single PDF). The appropriateness of using additional pages beyond the recommended length will be judged by reviewers. Authors are encouraged to update their submission as desired and to participate in the public discussion of their paper, as well as of any other paper submitted to the conference. Submissions are not anonymous, but reviews will be anonymized. For detailed instructions about the format of the paper, please visit iclr.cc.
Submissions that are identical (or substantially similar) to versions that have been previously published or accepted for publication, or that have been submitted in parallel to other conferences or journals, are not allowed and violate our dual submission policy. However, papers that cite previous related work by the authors, and papers that have appeared on non-peer-reviewed websites (such as arXiv) or have been presented at workshops (i.e., venues that do not have official publication proceedings), do not violate the policy.
To prepare your submission to ICLR 2017, please use the LaTeX style files provided below: