GitHub repository: https://github.com/tensorlayer/TensorLayerX
OpenI mirror (for access within mainland China): https://openi.pcl.ac.cn/OpenI/TensorLayerX
Documentation
TensorLayerX has extensive documentation for both beginners and professionals.
Deep Learning course
Bilibili link
Design Features
Compared with TensorLayer, TensorLayerX (TLX) is a brand-new, separate project designed to be platform-agnostic.
Compared to the TensorLayer version:
- Model Zoo: build a series of model zoos containing classic and SOTA models, covering CV, NLP, RL and other fields.
- Deploy: in the future, TensorLayerX will support the ONNX protocol, enabling model export, import and deployment.
- Parallel: to improve the efficiency of neural network model training, parallel computing is indispensable.
Resources
- TLX2ONNX: ONNX model exporter for TensorLayerX. ✅
- Examples for tutorials. ✅
- GammaGL: a multi-backend graph learning library based on TensorLayerX. ✅
- OpenIVA: an easy-to-use product-level deployment framework. ✅
- TLXZoo: pretrained models/backbones. 🚧
- TLXCV: a collection of Computer Vision applications. 🚧
- TLXNLP: a collection of Natural Language Processing applications. 🚧
- TLXRL: a collection of Reinforcement Learning applications; check RLZoo for the old version. ✅
More resources can be found here
Quick Start
Installation
Via Docker
Docker is an open-source application container engine. The TensorLayerX Docker Repository provides images with different versions of TensorLayerX pre-installed.
# pull from docker hub
docker pull tensorlayer/tensorlayerx:tagname
Via pip
# install from pypi
pip3 install tensorlayerx
Build from source
# install from Github
pip3 install git+https://github.com/tensorlayer/tensorlayerx.git
For more installation instructions, please refer to the Installation guide.
Define a model
You can immediately use tensorlayerx to define a model, using your favourite framework in the background, like so:
import os
os.environ['TL_BACKEND'] = 'tensorflow'  # modify this line, switch to any framework easily!
# os.environ['TL_BACKEND'] = 'mindspore'
# os.environ['TL_BACKEND'] = 'paddle'
# os.environ['TL_BACKEND'] = 'torch'

import tensorlayerx as tlx
from tensorlayerx.nn import Module
from tensorlayerx.nn import Linear

class CustomModel(Module):

    def __init__(self):
        super(CustomModel, self).__init__()
        self.linear1 = Linear(out_features=800, act=tlx.ReLU, in_features=784)
        self.linear2 = Linear(out_features=800, act=tlx.ReLU, in_features=800)
        self.linear3 = Linear(out_features=10, act=None, in_features=800)

    def forward(self, x, foo=False):
        z = self.linear1(x)
        z = self.linear2(z)
        out = self.linear3(z)
        if foo:
            out = tlx.softmax(out)
        return out

MLP = CustomModel()
MLP.set_eval()
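For intuition, here is what the model above computes, written as a plain-NumPy sketch (illustrative only, not TLX code): a 784→800 and an 800→800 linear layer each followed by ReLU, a final 800→10 linear layer with no activation, and an optional softmax when `foo=True`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomly initialised weights standing in for the three Linear layers.
W1, b1 = rng.standard_normal((784, 800)) * 0.01, np.zeros(800)
W2, b2 = rng.standard_normal((800, 800)) * 0.01, np.zeros(800)
W3, b3 = rng.standard_normal((800, 10)) * 0.01, np.zeros(10)

def forward(x, foo=False):
    z = np.maximum(x @ W1 + b1, 0.0)   # linear1 + ReLU
    z = np.maximum(z @ W2 + b2, 0.0)   # linear2 + ReLU
    out = z @ W3 + b3                  # linear3, no activation
    if foo:
        # Numerically stable softmax over the class dimension.
        e = np.exp(out - out.max(axis=-1, keepdims=True))
        out = e / e.sum(axis=-1, keepdims=True)
    return out

x = rng.standard_normal((4, 784))      # a batch of 4 flattened 28x28 images
probs = forward(x, foo=True)
print(probs.shape)                     # (4, 10)
```

Each row of `probs` sums to 1, as expected after softmax; in the TLX version the same shapes flow through `MLP(x, foo=True)` on whichever backend `TL_BACKEND` selects.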
Contributing
Join our community as a code contributor; find out more in our Help Wanted list and Contributing guide!
Contact
Citation
If you find TensorLayerX useful for your project, please cite the following papers:
@article{tensorlayer2017,
  author  = {Dong, Hao and Supratak, Akara and Mai, Luo and Liu, Fangde and Oehmichen, Axel and Yu, Simiao and Guo, Yike},
  journal = {ACM Multimedia},
  title   = {{TensorLayer: A Versatile Library for Efficient Deep Learning Development}},
  url     = {http://tensorlayer.org},
  year    = {2017}
}

@inproceedings{tensorlayer2021,
  title        = {TensorLayer 3.0: A Deep Learning Library Compatible With Multiple Backends},
  author       = {Lai, Cheng and Han, Jiarong and Dong, Hao},
  booktitle    = {2021 IEEE International Conference on Multimedia \& Expo Workshops (ICMEW)},
  pages        = {1--3},
  year         = {2021},
  organization = {IEEE}
}