Brain-inspired computing, Information storage, Integrated opto-electronics, Intelligent systems and instruments
Prof. Shi joined Tsinghua University (THU) in 2013 and founded the Center for Brain Inspired Computing Research (CBICR) in 2014. His research interests cover brain-inspired computing across the full stack, from fundamental theory and neuromorphic devices to chips, software, systems, and applications.
To overcome the bottlenecks of the von Neumann architecture and work toward artificial general intelligence, Professor Shi and his team have investigated brain-inspired computing models and algorithms, brain-inspired chips, and brain-inspired computers, developing the basic theory and core technologies of artificial general intelligence.
In November 2015, the first cross-modal, mixed-architecture neuromorphic brain-inspired chip, named "Tianjic", was designed and taped out. The chip can simulate large-scale neural networks with competitive advantages in speed, real-time performance, and cost. Related work was published in Science Robotics in December 2016.
In October 2017, Professor Shi and his team successfully developed the second generation of the Tianjic neuromorphic chip. The chip is based on advanced 28-nm semiconductor technology and integrates 10 million synapses and approximately 40,000 neurons. It supports Spiking Neural Networks (SNNs) and Artificial Neural Networks (ANNs), including Convolutional Neural Network (CNN), Multi-Layer Perceptron (MLP), and Long Short-Term Memory (LSTM) architectures. Compared with other neuromorphic chips, such as IBM's TrueNorth, the chip significantly improves density, speed, and bandwidth.
In 2017, Professor Shi's team also developed a software toolchain for neuromorphic chips, which supports the automatic mapping and compilation of neural network algorithms from TensorFlow, Caffe, and other machine learning platforms onto the Tianjic chip. In addition, the team set up the first brain-inspired computing demonstration platform in China.
In 2018, Professor Shi's team proposed WAGE, a framework for both inference and training of deep neural networks that simultaneously quantizes weights, activations, errors, and gradients to low-bit-width integers. This model empowers low-power neuromorphic chips with online learning abilities.
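The core operation in this kind of framework is mapping floating-point tensors onto a small set of integer-representable levels. The sketch below shows a WAGE-style uniform fixed-point quantizer; the function name, step-size convention, and clipping range are illustrative assumptions, not the paper's exact code.

```python
import numpy as np

def quantize(x, bits):
    """Uniformly quantize x to `bits`-bit fixed point in (-1, 1).
    The step size sigma = 2**(1 - bits) is an assumed WAGE-style
    convention: values are rounded to the nearest multiple of sigma
    and clipped so they fit in a signed `bits`-bit integer grid."""
    sigma = 2.0 ** (1 - bits)  # smallest representable step
    return np.clip(np.round(x / sigma) * sigma, -1 + sigma, 1 - sigma)

# Example: quantize a small weight vector to 2-bit values.
# With bits=2, sigma=0.5, so the only levels are {-0.5, 0.0, 0.5}.
w = np.array([-0.9, -0.3, 0.05, 0.4, 0.95])
wq = quantize(w, bits=2)  # -> [-0.5, -0.5, 0.0, 0.5, 0.5]
```

Applying the same quantizer to weights, activations, errors, and gradients is what lets the entire training loop run in integer arithmetic on a low-power chip.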
In the same year, Professor Shi's team also proposed a new algorithm named Spatio-Temporal Backpropagation (STBP), which exploits the spatio-temporal features of SNNs to address the non-differentiability of the spiking activation. An approximated derivative for spike activity was proposed, which is suitable for gradient-descent-based training. The algorithm applies to both fully connected and convolutional architectures, enabling deep SNNs to be trained with classic backpropagation.
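The non-differentiability arises because a spike is a Heaviside step of the membrane potential. The sketch below illustrates one common form of such an approximated derivative, a rectangular surrogate window around the threshold; the threshold value, window width, and function names here are assumptions for illustration, not the exact formulation used by the team.

```python
import numpy as np

V_TH = 1.0  # firing threshold (assumed value)
A = 1.0     # width of the rectangular surrogate window (assumed)

def spike(u):
    """Forward pass: a neuron fires (outputs 1) when its membrane
    potential u reaches the threshold. This step function has zero
    derivative almost everywhere, so it cannot be trained directly."""
    return (u >= V_TH).astype(np.float32)

def surrogate_grad(u):
    """Backward pass: a rectangular approximation of d(spike)/du.
    It is nonzero only within A/2 of the threshold, so gradients
    flow through neurons whose potential is near firing."""
    return (np.abs(u - V_TH) < A / 2).astype(np.float32) / A

u = np.array([0.2, 0.9, 1.0, 3.0])
out = spike(u)            # -> [0., 0., 1., 1.]
grad = surrogate_grad(u)  # -> [0., 1., 1., 0.]
```

During training, the exact step function is used in the forward pass while the surrogate replaces its derivative in the backward pass, which is what makes backpropagation through spike times possible.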