Abstract
Classical optimization techniques have found widespread use in machine learning. Convex optimization has occupied center stage, and significant effort continues to be devoted to it. New problems constantly emerge in machine learning, e.g., structured learning and semi-supervised learning, while at the same time fundamental problems such as clustering and classification continue to be better understood. Moreover, machine learning is now very important for real-world problems with massive datasets, streaming inputs, the need for distributed computation, and complex models. These challenging characteristics of modern problems and datasets indicate that we must go beyond the "traditional optimization" approaches common in machine learning. What is needed is optimization "tuned" for machine learning tasks. For example, techniques such as non-convex optimization (for semi-supervised learning and sparsity constraints), combinatorial optimization and relaxations (structured learning), stochastic optimization (massive datasets), decomposition techniques (parallel and distributed computation), and online learning (streaming inputs) are relevant in this setting. These techniques naturally draw inspiration from other fields, such as operations research, polyhedral combinatorics, theoretical computer science, and the optimization community. Motivated by these concerns, we would like to address these issues in the framework of this workshop.
Background and Objectives |
The closest in spirit to our workshop are the previously held workshops on 'Mathematical Programming in Machine Learning / Data Mining' from 2005--2007. These workshops were quite extensive and provided a solid platform for encouraging exchange between machine learning and optimization researchers. Our proposal draws inspiration from these previous workshops, but we have a somewhat sharper focus because our proposal is for a short one-day workshop. Another relevant workshop was the BigML NIPS*2007 workshop, which focused on the algorithmic challenges faced in large-scale machine learning tasks, with an emphasis on parallelization and online learning. Our workshop proposal addresses the following major issues, some of which have not been previously tackled as a combined optimization and machine learning effort. In particular, the aim of the workshop is to:
|