...

Back in 2014, Hao Dong was working in China to get his start-up company off the ground. He had finished his master’s degree at Imperial College London two years earlier and had been building the business ever since. That’s when his college friend Luo Mai called him with a proposal. Luo and Hao had studied together at Imperial, but while Hao returned to China, Luo had stayed in London to pursue a Ph.D.

Luo wanted to convince his friend to return to Imperial to study and work on deep learning, a field that was, he told Hao, “the next big thing”.

A year later, Hao found himself back in London, starting his own Ph.D. and working in a research group on deep learning and computer vision.


From lab frustration to TensorLayer in less than a year

In 2016, Hao was working on model compression experiments. At the time, there were two popular wrapper libraries for Google’s TensorFlow: Keras and TFLearn. Hao tried both for his project but quickly grew frustrated with them: “I started to use Keras and TFLearn, but I noticed that it’s difficult to understand and modify the backend code.”

So Hao decided to implement his own wrapper around TensorFlow. Under the supervision of Professor Yike Guo, he developed TensorLayer, a library designed to make it easy to adjust model parameters quickly and whose well-structured code would be easy to understand.

Soon after the first version of the library was finished, other researchers in Hao’s lab started using it for their own experiments. In September 2016, TensorLayer was first published on GitHub. Since then, the project has gathered over 2,700 stars and has been forked almost 700 times. In October 2017, TensorLayer won the Best Open Source Software award at the annual ACM Multimedia conference.


TensorLayer keeps its academic roots close

Although there are several other TensorFlow wrapper libraries, including the well-known Keras and TFLearn, TensorLayer occupies a unique spot within the deep learning ecosystem.

Most notably, there is TensorLayer’s focus on research. Even after the library grew beyond Hao’s lab, it has kept a closer relationship with the academic community than other frameworks. The majority of users, according to Hao, are university researchers, and they don’t just use the library; many of them also contribute their own code to it. “One very cool thing about our community is that because we have a lot of academic and Ph.D. users, we always have the latest reference implementations of algorithms,” Hao said.

Luo and Hao believe there are two main reasons for TensorLayer’s success: one structural, the other cultural.

They built their library out of frustration over how difficult it was to customize Keras to their needs, so customizability and transparency became TensorLayer’s main design principles. “We are trying to advertise a more modular design, where if you change one layer, the other layers won’t be affected,” Luo explained. That way, Luo said, it is even possible to use TensorLayer together with Keras, TFLearn, and other TensorFlow libraries, as the sketch below illustrates.
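To make that concrete, here is a minimal sketch in the spirit of TensorLayer’s 1.x layer-stacking API as documented at the time (the exact layer names, signatures, and labels such as 'drop1' and 'relu1' are illustrative and may differ between versions): each layer wraps the previous one, so replacing a single layer, for example changing a hidden size or a dropout rate, leaves the rest of the stack untouched, and the result is still a plain TensorFlow tensor that other libraries can consume.

```python
# Minimal sketch of a TensorLayer 1.x-style stack (TensorFlow 1.x graph mode);
# layer names and exact signatures are assumptions and may vary by version.
import tensorflow as tf
import tensorlayer as tl

# Placeholder for flattened 28x28 images (e.g. MNIST).
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')

# Each layer wraps the previous one, so swapping a single layer
# (e.g. changing n_units or the dropout keep rate) leaves the others untouched.
net = tl.layers.InputLayer(x, name='input')
net = tl.layers.DropoutLayer(net, keep=0.8, name='drop1')
net = tl.layers.DenseLayer(net, n_units=800, act=tf.nn.relu, name='relu1')
net = tl.layers.DenseLayer(net, n_units=10, act=tf.identity, name='output')

# The stack exposes an ordinary TensorFlow tensor, so it can be mixed with
# vanilla TensorFlow ops or with layers from other TensorFlow-based libraries.
y = net.outputs
```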

The other reason for TensorLayer’s success is its documentation, which is available not only in English but also in Chinese. “We have a lot of users from the large Chinese Internet companies,” Luo told us. While other, mostly American- or European-made, products focus on English and expect foreign users to either speak it or rely on translation apps, Hao and Luo use their language skills and connections to their homeland to provide native Chinese documentation.


“China is one of the biggest players in deep learning today”

That just might be the most important feature of their library. “China is one of the biggest players in deep learning today. Of the 10 biggest Internet companies, 3 to 4 of them are Chinese,” Luo said (author’s note: the exact figure depends on which statistic you look at). And although proficiency in English is very important for Chinese engineers, Luo said that if Chinese documentation is available, they will favor that product over one that requires working through English documentation.

Luo believes the Chinese market for machine learning is well equipped for the future. China is the largest online market today. “There are more than 800 million users with Internet access, and their data is digitized to a very high degree,” he explained. More than in other parts of the world, purchases are made with mobile devices, creating massive amounts of data that can be used to train AI models.

This huge market attracts innovation, and many Chinese mobile app companies have discovered that artificial intelligence can help grow their business. “Companies can use AI to maximize their profit, so they have a very strong motivation to integrate AI tech into their products,” Luo said.

But it’s not only the market that moves AI forward in China. “The Chinese government currently has a strong motivation to push the AI industry,” Luo explained. He continued: “China is transforming from a manufacturing country to a service-oriented country. In this process, AI can play a very important role to replace some of the jobs in the manufacturing industry.”

Hao gave an example of the Chinese government’s massive investments. “When people start an AI company in certain cities in China, the government will give them $4 for every $1 they put in. I think that is a crazy amount,” he said. And it doesn’t stop there, Hao explained: the government also provides AI startups with free office space, and even housing for their employees.


Manpower is the issue

But Luo is also cautious. “I see something like a bubble there in the Chinese market,” he said. As AI companies spring up like mushrooms after the rain, it becomes increasingly hard for them to hire qualified engineers at reasonable rates. This is where Europe and the US still have an edge, Luo thinks.

While the foundations of AI and machine learning were laid in the US several decades ago, China has little experience of its own to build on and is painfully short of the manpower needed to keep up with its own pace of development. “The more human intelligence we put into the development of AI, the more intelligent it gets,” Luo said. “In the end, I believe this will become a competition of manpower.”


The cost bottleneck

Another bottleneck that Hao and Luo see for advancing AI is the immense cost of training models and running experiments today. Although research institutions like Imperial College are well equipped with clusters and supercomputers, this is still not enough for the resource demands of machine learning workloads, especially when an experiment requires frequent hyper-parameter tuning. “I’m working on a project using a genetic model,” Hao told us. “It takes 4 days to get the results of a single experiment.”

The democratization of AI, one of the industry’s buzzwords these days, would require this cost to drop dramatically. New chip technologies could help solve this problem, but Luo also sees a different way: “When we look at the utilization of computer clusters, in particular private clusters owned by small businesses and research institutes, it is actually not very high. The majority of the machines are idle most of the time.”

A promising idea, Luo believes, could be to build a globally shared AI infrastructure. He told us about PlanetLab, a Princeton University project that connects clusters and computers around the world so they can share their resources: participants donate idle computing time to the network so that others can use it.

With a system like PlanetLab, Luo thinks, AI research could utilize the existing computing power much better and bring down costs. “I think when the cost of AI is lower, there can be much more innovation from the general public — especially if researchers, university students, and even high-school students could all use the same platform and run their own models.”


Where to from here?

With TensorLayer, Hao and Luo are trying to provide a tool that helps people get into machine learning more easily while remaining useful for experts. In the future, Hao says, they want to focus on integrating more modules for specific applications into the library, for example object detection. They also want to expand support for distributed learning and model management.

Both Hao and Luo are going to continue their research at Imperial College. At the same time, they will work with the TensorLayer community on improving the library further and adding their research output to it. When Hao told us about a friend of his who started a company in China and received millions of dollars of funding and free housing, Luo laughed: “I hadn’t heard of this before. I have to admit that sounds pretty attractive. Maybe we should go do that.”

TensorLayer is a wrapper library for TensorFlow. It is actively developed by Hao, Luo, and a growing group of other contributors. Its documentation is available in English and Chinese, with a Korean translation coming soon. New contributions are always welcome.