Stop Using word2vec
When I started playing with word2vec four years ago, I needed (and luckily had) tons of supercomputer time. But because of advances in our understanding of word2vec, computing word vectors now takes fifteen minutes on a single run-of-the-mill computer with standard numerical libraries. Word vectors are awesome, but you don’t need a neural network – and definitely don’t need deep learning – to find them.
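The claim that standard numerical libraries suffice can be sketched roughly as follows: count co-occurrences, weight them with positive pointwise mutual information (PPMI), and take a truncated SVD. This is a minimal illustrative sketch, not the post's exact recipe; the toy corpus, sentence-wide window, and dimensionality are invented here.

```python
import numpy as np
from itertools import combinations

# Tiny invented corpus for illustration only.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs are pets".split(),
]

# Co-occurrence counts; the "window" here is simply the whole sentence.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for a, b in combinations(sent, 2):
        counts[idx[a], idx[b]] += 1
        counts[idx[b], idx[a]] += 1

# Positive pointwise mutual information (PPMI): clamp PMI at zero.
total = counts.sum()
row = counts.sum(axis=1, keepdims=True)
col = counts.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log(counts * total / (row * col))
ppmi = np.maximum(pmi, 0)
ppmi[~np.isfinite(ppmi)] = 0  # zero out -inf/nan from empty cells

# Truncated SVD yields dense word vectors -- no neural network involved.
U, S, Vt = np.linalg.svd(ppmi)
k = 2
vectors = U[:, :k] * S[:k]
print(vectors.shape)  # one k-dimensional vector per vocabulary word
```

On a real corpus you would use a sparse co-occurrence matrix and a truncated solver (e.g. scipy.sparse.linalg.svds), but the structure of the computation is the same.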
We held an ISUCON7 review session at Speee, so I am publishing the materials. ISUCON7 Review Session, 2017/11/08 at Speee Lounge. original repo: https://github.com/isucon/isucon7-qualify Overview: I analyzed the strategies of several of the top teams that made it through the ISUCON7 qualifier. †空中庭園†《ガーデンプレイス》 repo: https://github.com/ryotarai/isucon7q http://eagletmt.hateblo.jp/entry/2017/10/24/010832 https://mozami.me/2017/10/24/isucon7_qualify.html スギャブロエックス repo: https://github.com/gfx/isucon7-qualify http://memo.su
Many language generation tasks require the production of text conditioned on both structured and unstructured inputs. We present a novel neural network architecture which generates an output sequence conditioned on an arbitrary number of input functions. Crucially, our approach allows both the choice of conditioning context and the granularity of generation, for example characters or tokens, to be
A significant amount of the world's knowledge is stored in relational databases. However, the ability for users to retrieve facts from a database is limited due to a lack of understanding of query languages such as SQL. We propose Seq2SQL, a deep neural network for translating natural language questions to corresponding SQL queries. Our model leverages the structure of SQL queries to significantly