
Torch (machine learning)
From Wikipedia, the free encyclopedia
Torch
Original author(s): Ronan Collobert, Koray Kavukcuoglu, Clement Farabet
Initial release: October 2002[1]
Stable release: 7.0 / September 1, 2015[2]
Written in: Lua, LuaJIT, C, CUDA and C++
Operating system: Linux, Android, Mac OS X, iOS
Type: Library for machine learning and deep learning
License: BSD License
Website: torch.ch
torch[edit]

The core package of Torch is torch. It provides a flexible N-dimensional array or Tensor, which supports basic routines for indexing, slicing, transposing, type-casting, resizing, sharing storage and cloning. This object is used by most other packages and thus forms the core object of the library. The Tensor also supports mathematical operations like max, min, sum, statistical distributions like uniform, normal and multinomial, and BLAS operations like dot product, matrix-vector multiplication and matrix-matrix multiplication.

The following exemplifies using torch via its REPL interpreter:

> a = torch.randn(3,4)

> =a
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
-1.0434  2.2291  1.0525  0.8465
[torch.DoubleTensor of dimension 3x4]

> a[1][2]
-0.34010116549482

> a:narrow(1,1,2)
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
[torch.DoubleTensor of dimension 2x4]

> a:index(1, torch.LongTensor{1,2})
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
[torch.DoubleTensor of dimension 2x4]

> a:min()
-1.7844365427828
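The type-casting, resizing and storage-sharing routines mentioned above can be sketched as follows; this is an illustrative fragment using the Torch7 Tensor API, with shapes chosen arbitrarily:

```lua
local a = torch.randn(3, 4)   -- 3x4 DoubleTensor of normally distributed values

local f = a:float()           -- type-cast: copies the data into a FloatTensor
local c = a:clone()           -- deep copy with its own storage
local v = a:view(2, 6)        -- new 2x6 shape sharing the same storage as a

v[1][1] = 0                   -- visible through a as well, since storage is shared
a:resize(6, 2)                -- in-place resize of a itself
```

Because view shares storage while clone does not, writing through v changes a but leaves c untouched.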
Objects created with the torch factory can also be serialized, as long as they do not contain references to objects that cannot be serialized, such as Lua coroutines and Lua userdata. However, userdata can be serialized if it is wrapped by a table (or metatable) that provides read() and write() methods.

nn[edit]

The nn package is used for building neural networks. It is divided into modular objects that share a common Module interface. Modules have a forward() and a backward() method that allow them to feedforward and backpropagate, respectively. Modules can be joined together using module composites, like Sequential, Parallel and Concat, to create complex task-tailored graphs. Simpler modules like Linear, Tanh and Max make up the basic component modules. This modular interface provides first-order automatic gradient differentiation.

What follows is an example use-case for building a multilayer perceptron using Modules:

> mlp = nn.Sequential()
> mlp:add( nn.Linear(10, 25) ) -- 10 input, 25 hidden units
> mlp:add( nn.Tanh() ) -- some hyperbolic tangent transfer function
> mlp:add( nn.Linear(25, 1) ) -- 1 output

> =mlp:forward(torch.randn(10))
-0.1815
[torch.Tensor of dimension 1]
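The serialization behaviour described above is exposed through torch.save and torch.load; a minimal sketch (the file name is illustrative):

```lua
local a = torch.randn(3, 4)

torch.save('a.t7', a)          -- serialize the Tensor to disk
local b = torch.load('a.t7')   -- deserialize it back

assert(a:equal(b))             -- round trip preserves the contents
```

The same calls work for any serializable object, including whole nn networks, which is how trained models are typically checkpointed in Torch.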
Loss functions are implemented as subclasses of Criterion, which has a similar interface to Module. It also has forward() and backward() methods, for computing the loss and backpropagating gradients, respectively. Criteria are helpful to train a neural network on classical tasks. Common criteria are the mean squared error criterion, implemented in MSECriterion, and the cross-entropy criterion, implemented in ClassNLLCriterion.

What follows is an example of a Lua function that can be iteratively called to train an mlp Module on input Tensor x, target Tensor y with a scalar learningRate:

function gradUpdate(mlp, x, y, learningRate)
  local criterion = nn.ClassNLLCriterion()
  local pred = mlp:forward(x)
  local err = criterion:forward(pred, y)
  mlp:zeroGradParameters()
  local t = criterion:backward(pred, y)
  mlp:backward(x, t)
  mlp:updateParameters(learningRate)
end
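A function like gradUpdate would be driven by a loop such as the following sketch; the network, the single toy example and the iteration count are made up for illustration (note that ClassNLLCriterion expects log-probabilities, hence the LogSoftMax output layer):

```lua
local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))
mlp:add(nn.Tanh())
mlp:add(nn.Linear(25, 2))
mlp:add(nn.LogSoftMax())   -- ClassNLLCriterion expects log-probabilities

local x = torch.randn(10)  -- one input example
local y = 1                -- its class label (here, class 1 of 2)

for i = 1, 100 do
  gradUpdate(mlp, x, y, 0.01)
end
```

In practice the loop would draw (x, y) pairs from a training set rather than repeat one example.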
The nn package also has a StochasticGradient class for training a neural network using stochastic gradient descent, although the optim package provides many more options in this respect, such as momentum and weight-decay regularization.

Other packages[edit]

Many packages other than the above official packages are used with Torch. These are listed in the torch cheatsheet. These extra packages provide a wide range of utilities such as parallelism, asynchronous input/output, image processing, and so on.

Applications[edit]

Facebook has released a set of extension modules as open source software.[12]
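The StochasticGradient class mentioned above can be sketched as follows; the network and the toy dataset are illustrative, and follow the convention that a dataset exposes a size() method and indexed {input, target} pairs:

```lua
local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 2))
mlp:add(nn.LogSoftMax())

local criterion = nn.ClassNLLCriterion()
local trainer = nn.StochasticGradient(mlp, criterion)
trainer.learningRate = 0.01
trainer.maxIteration = 5

-- StochasticGradient expects a dataset with a size() method and
-- dataset[i] returning an {input, target} pair.
local dataset = {}
function dataset:size() return 20 end
for i = 1, dataset:size() do
  dataset[i] = { torch.randn(10), (i % 2) + 1 }
end

trainer:train(dataset)
```

The optim package instead takes a closure that returns the loss and the gradient, which is what makes its richer update rules possible.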

