
SemiFlow


English | 中文

SemiFlow is a deep learning framework with auto-differentiation, built on NumPy.

News!

Jan 24, 2022. SemiFlow now supports converting SemiFlow models to ONNX.

July 21, 2021. SemiFlow now supports distributed deep learning. The first parallel solution is Parameter Server.

July 2021. We are introducing model.save and model.load!

Installation

git clone https://github.com/nanguoyu/SemiFlow.git
cd SemiFlow
pip install .
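
To verify the installation, a quick smoke test (this assumes the package installs under the name SemiFlow, as the imports in the examples below use):

python -c "from SemiFlow.Model import Sequential; print('SemiFlow OK')"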

Quick start

MNIST_MLP

A classification model trained on the MNIST dataset.

# Import SemiFlow

from SemiFlow.layer import Dense
from SemiFlow.Model import Sequential
from SemiFlow.utils.dataset import mnist
import numpy as np

# Prepare MNIST data.
train_set, test_set = mnist(one_hot=True)

x_train, y_train = train_set[0][:128], train_set[1][:128]
x_test, y_test = test_set[0][:128], test_set[1][:128]
x_train = x_train.reshape(x_train.shape[0], x_train.shape[1] * x_train.shape[2])
x_test = x_test.reshape(x_test.shape[0], x_test.shape[1] * x_test.shape[2])

# Specify training settings

num_classes = 10
batch_size = 32
epochs = 30

# Init a sequential model
model = Sequential()

# Add the first layer and specify the input shape
model.add(Dense(units=256, activation='relu', input_shape=(784,)))
# Add more layer
model.add(Dense(units=128, activation='relu'))
model.add(Dense(units=64, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

# Print model structure
model.summary()

# Compile model and specify optimizer and loss function
model.compile(loss='categorical_crossentropy', optimizer='RMSprop', learning_rate=0.05)

# Train model
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_split=0.2,
                    validation_data=(None, None))
                
# Evaluate model on test data
score = model.evaluate(x_test, y_test, verbose=0)
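
The News section above introduces model.save and model.load. A minimal sketch of how they might be used here; the file name is made up and the exact signatures are assumptions, not confirmed API:

# Persist the trained model and restore it later (hypothetical file name;
# assumes load is an instance method on a freshly constructed model)
model.save('mnist_mlp.model')
restored = Sequential()
restored.load('mnist_mlp.model')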

MNIST_CNN

A convolutional model trained on the MNIST dataset.

# Import SemiFlow

from SemiFlow.layer import Dense, Conv2D, Flatten, MaxPooling2D
from SemiFlow.Model import Sequential
from SemiFlow.utils.dataset import mnist
import numpy as np

# Prepare MNIST data.
train_set, test_set = mnist(one_hot=True)

x_train, y_train = train_set[0][:128], train_set[1][:128]
x_test, y_test = test_set[0][:128], test_set[1][:128]

# Reshape to (height, width, channel)
x_train = x_train.reshape((-1, 28, 28, 1))

x_test = x_test.reshape((-1, 28, 28, 1))

# Specify training settings

num_classes = 10
batch_size = 32
epochs = 30

# Init a sequential model
model = Sequential()

# Add the first layer and specify the input shape
model.add(Conv2D(32, kernel_size=(3, 3),
                 activation='relu',
                 input_shape=(28, 28, 1),
                 dtype='float32'))
# Add other Conv2D layer
model.add(Conv2D(64, (3, 3), activation='relu'))
# Add a MaxPooling2D layer
model.add(MaxPooling2D(pooling_size=(3, 3)))
# Add a Flatten layer
model.add(Flatten())
# Add a Dense layer
model.add(Dense(units=64, activation='relu'))
# Add another Dense layer as output layer
model.add(Dense(num_classes, activation='softmax'))

# Print model structure
model.summary()

# Compile model and specify optimizer and loss function
model.compile(loss='categorical_crossentropy', optimizer='RMSprop', learning_rate=0.05)

# Train model
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_split=0.2,
                    validation_data=(None, None))
                
# Evaluate model on test data
score = model.evaluate(x_test, y_test, verbose=0)
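
The News section also mentions converting SemiFlow models to ONNX. The sketch below is purely hypothetical: the module path and function name are assumptions, not SemiFlow's confirmed API.

# Hypothetical export; SemiFlow.onnx and convert() are assumed names
from SemiFlow.onnx import convert
onnx_model = convert(model)
# The resulting proto could then be saved with the standard onnx package:
# import onnx; onnx.save(onnx_model, 'mnist_cnn.onnx')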

Distributed Machine Learning

Distributed machine learning is now available! SemiFlow supports the parameter server architecture.

Parameter server example

Client/Worker example code is in distributed_parameter_client.py

Server/Master example code is in distributed_parameter_server.py
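
The example scripts above use SemiFlow's own API. The toy sketch below only illustrates the parameter-server idea in plain NumPy (all names are illustrative, not SemiFlow's API): the server owns the parameters, each worker computes a gradient on its data shard, and the server applies the averaged update.

import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5, 3.0])
shards = []
for _ in range(2):                         # two simulated workers
    x = rng.normal(size=(64, 4))
    shards.append((x, x @ true_w))

params = np.zeros(4)                       # server-side parameters
lr = 0.1

def worker_gradient(params, x, y):
    # Local least-squares gradient on one worker's data shard
    return x.T @ (x @ params - y) / len(y)

for step in range(200):
    # Workers pull current params and compute gradients (sequential here)
    grads = [worker_gradient(params, x, y) for x, y in shards]
    # Server averages the gradients and updates the shared parameters
    params -= lr * np.mean(grads, axis=0)

print(params)  # converges toward true_w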

Features

  • Dense/Fully-connected layer
  • Model manager for training
  • Optimizer
  • Activation function
    • ReLU
    • Sigmoid
    • tanh
  • Loss (standard definitions are sketched after this list)
    • MSE (mean squared error)
    • MAE (mean absolute error)
    • BCE (binary cross-entropy)
    • CE (categorical cross-entropy)
  • Complex Layer
    • Conv2D layer
    • MaxPooling2D layer
    • Flatten layer
    • RNN layer
  • Stochastic gradient descent
  • Momentum
  • RMSProp
  • Big dataset support
    • Train MNIST
    • cifar10
  • Save model
  • Load model
  • Distributed machine learning
    • Parameter Server
  • SemiFlow-ONNX
  • CUDA support
  • Examples and other docs
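
For reference, the textbook definitions of the listed losses, written in NumPy (these are the standard formulas, not necessarily SemiFlow's exact implementation):

import numpy as np

def mse(y_true, y_pred):
    # Mean squared error
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    # Mean absolute error
    return np.mean(np.abs(y_true - y_pred))

def bce(y_true, y_pred, eps=1e-12):
    # Binary cross-entropy; eps avoids log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def ce(y_true, y_pred, eps=1e-12):
    # Categorical cross-entropy over one-hot targets
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=-1))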

Other

There is an independent computation graph part. In it, we developed a deep learning engine, similar to TensorFlow, that supports auto-differentiation over a computation graph. There is an example for regressing a line. This part is dated and will not be updated; we plan to introduce a subclass of Model containing a computation graph in the future.
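
To illustrate the computation-graph idea, here is a minimal reverse-mode autodiff sketch (illustrative only, not SemiFlow's engine):

class Node:
    # Each node records its value and (parent, local_gradient) pairs
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents
        self.grad = 0.0

    def __add__(self, other):
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Node(self.value * other.value, [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Accumulate the gradient and propagate it through the recorded graph
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

# d(w*x + b)/dw = x, d/dx = w, d/db = 1
w, x, b = Node(2.0), Node(3.0), Node(1.0)
y = w * x + b
y.backward()
print(w.grad, x.grad, b.grad)  # 3.0 2.0 1.0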

Features

  • Computational graph
    • feedforward
    • NumPy-style operators
    • gradient computation
  • Auto-differentiation
  • Tensor support

Blogs

Reference
