DeepCL: an open-source deep learning CNN library (C++)


A convolutional neural network training library implemented in C++; a Q-learning module and a Python interface are currently being integrated and refined.

OpenCL library to train deep convolutional networks

  • C++
  • OpenCL
  • Deep convolutional networks
  • (New!) includes Q-learning module (draft)
  • (New!) Python wrappers available (draft too :-) )

Functionalities:

  • convolutional layers
  • max-pooling
  • normalization layer
  • random translations, as in Flexible, High Performance Convolutional Neural Networks for Image Classification
  • random patches, as in ImageNet Classification with Deep Convolutional Neural Networks
  • multinet, i.e. multi-column deep convolutional networks (MCDNN)
  • simple command-line network specification, as per notation in Multi-column Deep Neural Networks for Image Classification
  • optional zero-padding (pad-zeros) for convolutional layers
  • various activation functions available:
    • tanh
    • scaled tanh (1.7159 * tanh(2x/3); see the sketch after this list)
    • linear
    • sigmoid
    • relu
    • softmax
  • fully-connected layers
  • various loss layers available:
    • square loss
    • cross-entropy
    • multinomial cross-entropy (also known as multinomial logistic loss)
  • Q-learning
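
For reference, here is a minimal C++ sketch of some of the activation functions listed above, written from their standard definitions rather than taken from DeepCL's source; the 1.7159 constant in the scaled tanh is LeCun's recommended value.

    // Minimal sketches of activations listed above, using their standard
    // definitions; this is not DeepCL's actual implementation.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    float tanhAct(float x)    { return std::tanh(x); }
    // Scaled tanh as listed above: 1.7159 * tanh(2x/3).
    float scaledTanh(float x) { return 1.7159f * std::tanh(2.0f / 3.0f * x); }
    float sigmoid(float x)    { return 1.0f / (1.0f + std::exp(-x)); }
    float relu(float x)       { return std::max(0.0f, x); }

    // Softmax works on a whole vector: exponentiate (shifted by the maximum
    // for numerical stability) and normalise so the outputs sum to 1.
    std::vector<float> softmax(const std::vector<float> &z) {
        const float maxZ = *std::max_element(z.begin(), z.end());
        std::vector<float> out(z.size());
        float sum = 0.0f;
        for (std::size_t i = 0; i < z.size(); ++i) {
            out[i] = std::exp(z[i] - maxZ);
            sum += out[i];
        }
        for (float &v : out) v /= sum;
        return out;
    }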

Example usage:

  • intends to target 19 x 19 Go boards, e.g. something similar to Clark and Storkey, or Maddison, Huang, Sutskever and Silver
    • obtained 36.3% test accuracy on the next-move prediction task, using 33.6 million training examples from the kgsgo v2 dataset
    • command line used: ./deepclrun dataset=kgsgoall netdef=32c5{z}-32c5{z}-32c5{z}-32c5{z}-32c5{z}-32c5{z}-500n-361n numepochs=3 learningrate=0.0001 (the netdef notation is unpacked in the sketch after these examples)
    • 3 epochs, at 1.5 days per epoch, on an Amazon GPU instance comprising half an NVidia GRID K520 GPU (about half as powerful as a GTX 780)
  • obtained 99.5% test accuracy on MNIST, using netdef=rt2-8c5{padzeros}-mp2-16c5{padzeros}-mp3-150n-10n numepochs=20 multinet=6 learningrate=0.002
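
The netdef strings above are a compact, hyphen-separated layer notation. Below is a small, hypothetical C++ decoder for these tokens; it is not DeepCL's actual parser, just an illustration under a common reading of the notation: "32c5{z}" = 32 feature maps with a 5x5 convolution and zero padding, "mp2" = 2x2 max pooling, "rt2" = random translations of up to 2 pixels, and "150n" = a fully-connected layer of 150 neurons (multinet=6 would then train 6 such columns and combine their predictions, per the multinet feature above).

    // Hypothetical decoder for the netdef tokens shown above; this is an
    // illustration of the notation, not DeepCL's actual parser.
    #include <cctype>
    #include <iostream>
    #include <sstream>
    #include <string>

    void describeToken(const std::string &token) {
        if (token.empty()) return;
        std::size_t cPos = token.find('c');
        if (cPos != std::string::npos &&
                std::isdigit(static_cast<unsigned char>(token[0]))) {
            // e.g. "32c5{z}" or "8c5{padzeros}": maps, filter size, padding flag
            int maps = std::stoi(token.substr(0, cPos));
            int filterSize = std::stoi(token.substr(cPos + 1));
            bool zeroPad = token.find("{z}") != std::string::npos ||
                           token.find("{padzeros}") != std::string::npos;
            std::cout << maps << " feature maps, " << filterSize << "x" << filterSize
                      << " convolution" << (zeroPad ? ", zero-padded" : "") << "\n";
        } else if (token.rfind("mp", 0) == 0) {
            std::cout << token.substr(2) << "x" << token.substr(2) << " max pooling\n";
        } else if (token.rfind("rt", 0) == 0) {
            std::cout << "random translations of up to " << token.substr(2)
                      << " pixels\n";
        } else if (token.back() == 'n') {
            std::cout << "fully-connected layer, " << token.substr(0, token.size() - 1)
                      << " neurons\n";
        } else {
            std::cout << "unrecognised token: " << token << "\n";
        }
    }

    int main() {
        // The MNIST netdef from the example above.
        const std::string netdef = "rt2-8c5{padzeros}-mp2-16c5{padzeros}-mp3-150n-10n";
        std::istringstream stream(netdef);
        std::string token;
        while (std::getline(stream, token, '-')) {
            describeToken(token);
        }
        return 0;
    }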