Overview: This page collects material and references related to submodular optimization, with applications in machine learning and AI in particular.
My primary research interests are in mathematical programming. I have been working on the design and analysis of efficient algorithms for discrete optimization concerning matroids and submodular functions. I am also interested in applications of discrete optimization techniques to the algebraic and numerical computation that arises in systems analysis and control. Research Institute for Mathematical Sciences.
Beyond Convexity: Submodularity in Machine Learning. Description: Convex optimization has become a main workhorse for many machine learning algorithms during the past ten years. When minimizing a convex loss function for, e.g., training a Support Vector Machine, we can rest assured that an optimal solution will be found efficiently, even for large problems. In recent years, another fundamental problem structure has emerged as very useful in a variety of machine learning applications: submodularity.
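As a quick illustration of the kind of structure the tutorial description refers to (this sketch is not taken from any of the pages above), the snippet below implements the classic greedy heuristic for maximizing a monotone submodular function, here a simple set-coverage objective, under a cardinality constraint. The function names `coverage` and `greedy_max` and the example data are illustrative assumptions only.

```python
# Minimal sketch: greedy maximization of a monotone submodular function
# (set coverage) under a cardinality constraint. Names and data are
# illustrative, not taken from the referenced pages.

def coverage(selected, sets):
    """Coverage objective f(S) = size of the union of the chosen sets
    (a standard monotone submodular function)."""
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy_max(sets, k):
    """Greedily pick k sets, each time adding the one with the largest
    marginal gain. For monotone submodular objectives this achieves at
    least a (1 - 1/e) fraction of the optimum (Nemhauser, Wolsey, and
    Fisher, 1978)."""
    selected = []
    for _ in range(k):
        best, best_gain = None, -1
        for i in range(len(sets)):
            if i in selected:
                continue
            gain = coverage(selected + [i], sets) - coverage(selected, sets)
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
    return selected

# Example: pick 2 of these ground sets to cover as many elements as possible.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 7}]
print(greedy_max(sets, 2))  # [2, 0] -> covers all 7 elements
```

The key property the greedy rule exploits is diminishing returns: adding an element to a smaller set never helps less than adding it to a larger one, which is what makes the simple marginal-gain selection provably near-optimal.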
Advances in Neural Information Processing Systems 23 (NIPS 2010). The papers below appear in Advances in Neural Information Processing Systems 23, edited by J. D. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R. S. Zemel, and A. Culotta. They are proceedings from the conference "Neural Information Processing Systems 2010." Repeated Games against Budgeted Adversaries, Jacob D. Abernethy and Manfred K. Warmuth.