Li Wan, Matthew Zeiler, Sixin Zhang, Yann LeCun, Rob Fergus
Dept. of Computer Science, Courant Institute of Mathematical Sciences, New York University

Introduction

We introduce DropConnect, a generalization of Hinton's Dropout for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer.
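To make the distinction concrete, here is a minimal NumPy sketch, not the paper's code: Dropout draws a Bernoulli mask over a layer's output activations, whereas DropConnect draws one over the weight matrix itself. The function names, the keep probability p, and the omission of the nonlinearity and of inference-time rescaling are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def fc_dropout(x, W, b, p=0.5):
    # Fully-connected layer with Dropout: each output activation is
    # kept with probability p and zeroed otherwise (one Bernoulli
    # draw per activation). Nonlinearity omitted for brevity.
    a = W @ x + b                      # ordinary fully-connected output
    mask = rng.random(a.shape) < p     # Bernoulli mask over activations
    return mask * a

def fc_dropconnect(x, W, b, p=0.5):
    # Fully-connected layer with DropConnect: the weights themselves
    # are masked, one Bernoulli draw per connection.
    mask = rng.random(W.shape) < p     # Bernoulli mask over weights
    return (mask * W) @ x + b

# Toy usage: a 4-input, 3-output layer with random parameters.
x = rng.standard_normal(4)
W = rng.standard_normal((3, 4))
b = np.zeros(3)
print(fc_dropout(x, W, b))
print(fc_dropconnect(x, W, b))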