A Neural Net Training Interface on TensorFlow


Tensorpack is a training interface based on TensorFlow.



It's Yet Another TF high-level API, combining speed, readability and flexibility.

  1. Focus on training speed.

    • Speed comes for free with tensorpack -- it uses TensorFlow in an efficient way with no extra overhead. On common CNNs, it trains 1.2~5x faster than the equivalent Keras code.

    • Data-parallel multi-GPU/distributed training is available out of the box. It scales as well as Google's official benchmark.

    • See tensorpack/benchmarks for some benchmark scripts.

  2. Focus on large datasets.

    • There is no need to read or preprocess data in TensorFlow ops, as if TF were a new language to learn. Tensorpack helps you load large datasets (e.g. ImageNet) in pure Python with automatic parallelization.

  3. It's not a model wrapper.

    • There are too many symbolic function wrappers in the world. Tensorpack includes only a few common models. But you can use any symbolic function library inside tensorpack, including tf.layers/Keras/slim/tflearn/tensorlayer/....

See the tutorials to learn more about these features.


We refuse toy examples. Instead of showing you 10 arbitrary networks trained on toy datasets, tensorpack examples faithfully replicate papers and care about reproducing numbers, demonstrating its flexibility for actual research.


Reinforcement Learning:

Speech / NLP:



Dependencies:

  • Python 2.7 or 3
  • Python bindings for OpenCV (optional, but required by many features)
  • TensorFlow >= 1.3.0 (optional if you only want to use tensorpack.dataflow alone as a data-processing library)
# install git, then:
pip install -U git+https://github.com/tensorpack/tensorpack.git
# or add `--user` to avoid system-wide installation.

Citing Tensorpack:

If you use Tensorpack in your research or wish to refer to the examples, please cite with:

@misc{wu2016tensorpack,
  title={Tensorpack},
  author={Wu, Yuxin and others},
  howpublished={\url{https://github.com/tensorpack/}},
  year={2016}
}
  • Last commit: 06-04
  • Created: 2015-12-25