Inaugural Sparks Interview: Prof. Zhang Jiang — Complexity Theory

X-Order
8 min read · Jan 19, 2019


This is the first of the Sparks Interview Series.

Prof. Zhang Jiang

Zhang Jiang is a Professor at the School of Systems Science, Beijing Normal University, Founder of the Jizhi Club and Jizhi Academy, and a Special Advisor to the Tencent Research Institute. His main areas of focus include Artificial Intelligence (AI), Machine Learning (ML), Computational Social Science, and Complex Networks.


Employment

In 2006, Prof. Zhang graduated from Beijing Jiaotong University with a major in Economics and Management. From 2006 to 2008, he worked as a postdoctoral researcher at the Complex Systems Research Center of the Institute of Mathematics and Systems Science (Chinese Academy of Sciences). From June 2008 to July 2012, he served as a lecturer at the School of Systems Science (Beijing Normal University).

Achievements

Prof. Zhang has been actively involved in international academic exchanges, having visited the University of Michigan, the University of Vermont, the Santa Fe Institute, and Arizona State University, to name a few. In 2008, he won the championship in the science category of the 11th Young Teachers’ Teaching Basic Skills Competition at Beijing Normal University. In 2011 and 2016 respectively, his projects on the Allometric Law for Weighted Food Webs and on Collective Attention Flow on the Internet were supported by the National Natural Science Foundation of China. To date, he has published more than 20 SCI-indexed papers.

Academia and Research

Currently, Prof. Zhang is involved in research on machine learning (ML) on graphs.

One of his research directions combines complex networks with ML, applying deep learning directly to networks. He and his fellow researchers use ML algorithms they have written to analyze the networks formed between enterprises and the mechanisms behind them. They then mine enterprise data (e.g., in cooperation with Tencent Research's QQ Data Center) to produce valuable information.

The other part of his research relates to environmental issues, focusing on forward-looking research planning, termed "Participate 2050". The development of the world now reflects two major characteristics: (1) the singularity of AI development is approaching, and (2) the degrading environment is showing signs of a possible major collapse in the future. Combined, these two characteristics form a system of technological development and economic progress tied to the entire Earth system.

He hopes to be involved in practical predictions with the assistance of scientific tools, for example, using a systems-science and interdisciplinary approach to predict what the world's environment will be like in 2050.

‘World in 2050’ Movement

However, not everything is a bed of roses for Prof. Zhang; there have been many headwinds in the research field. He laments that the biggest bottleneck is the lack of a clear and thorough understanding of Complexity Theory. Prof. Zhang recommends readers take a look at “Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies” by Geoffrey West. The book presents much clearer viewpoints on interdisciplinary issues such as homeostasis and the relationship between life and death, which can answer some of the open problems in Complexity Theory.

Book Recommendation: “Scale”

Complexity Theory

Despite all the challenges, Prof. Zhang believes that Complexity Theory possesses several advantages that traditional research theories do not have.

Firstly, topics such as social and computer-science systems, which can be factual in nature, can receive interdisciplinary input rather than relying solely on a specific industry's domain experts and databases.

Secondly, Complexity Theory provides a more holistic perspective. It highlights that many phenomena are not purely the result of an individual or a single body, but rather of a structural mechanism that creates a cascade of influences.

Complexity Theory has also recently been grouped together with ML and deep learning in the research community. Although these three approaches draw on the same toolboxes, many are unaware that they are used in different contexts. Complex-network or Agent-Based Model (ABM) methods always assume some simple rules; people construct those rules by analyzing the data before reasoning. In addition, Prof. Zhang reminded us that despite these assumptions, ABM models are falsifiable, depending on the information we provide to construct the model. If we pick a vague topic, it will invariably be challenging to falsify. On the other hand, if we use a more concrete subject like the evolution of the Bitcoin network, it is falsifiable because there is plenty of data to support the model.
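To make the contrast concrete, below is a minimal ABM sketch in Python (not a model from the interview): each agent follows one hand-written threshold rule, and the aggregate adoption curve is the falsifiable output that could be checked against real data such as the growth of the Bitcoin network. All names and parameter values are illustrative assumptions.

```python
import random

# Minimal agent-based model (ABM) sketch: each agent follows one simple,
# hand-written rule, and the aggregate outcome is what gets compared to data.
# All parameter values here are illustrative assumptions.

N_AGENTS = 1000          # number of agents
ADOPT_THRESHOLD = 0.3    # an agent adopts if >30% of sampled peers have adopted
N_STEPS = 50
SAMPLE_SIZE = 10

adopted = [False] * N_AGENTS
for seed in random.sample(range(N_AGENTS), 10):   # a few initial adopters
    adopted[seed] = True

history = []
for _ in range(N_STEPS):
    for i in range(N_AGENTS):
        if adopted[i]:
            continue
        peers = random.sample(range(N_AGENTS), SAMPLE_SIZE)
        if sum(adopted[p] for p in peers) / SAMPLE_SIZE > ADOPT_THRESHOLD:
            adopted[i] = True
    history.append(sum(adopted))

# The adoption curve in `history` is the model's falsifiable prediction:
# with concrete observed data, we can check whether the simulated curve
# matches the empirical one, and reject the rule if it does not.
print(history[-1], "agents adopted after", N_STEPS, "steps")
```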

ML and deep learning, by contrast, work with many generalizations, making experimental feedback and data convergence faster. With these understandings in mind, he shared how Complexity Theory can come together with AI, such as in his project on ML-based graph recognition.

Taking a step back, how can one measure the value of a good technology? He feels that value and production are inextricably linked. One idea he proposed is to quantify human brainpower by how many cognitive resources an individual utilizes. Nonetheless, this field is still in its infancy, and commercializing such products remains overly complex.

Following from the example above, it is counterproductive to pack too many functions into Complexity Theory simulation studies. A real-life phenomenon is always complex, but too many variables and parameters make it impossible to identify which variable actually drove the observed phenomenon.

To end this topic, Prof. Zhang shared three pieces of important advice with us.

(1) We always need to validate and recalibrate the model after constructing it. If a parameter has a known counterpart in the actual system, you should set it from that system. For example, when you want to create a market simulation, you would need to feed in the real income distribution. Setting parameters randomly is acceptable but not advisable. Grounding the parameters in real data greatly reduces the workload and improves the objectivity of the parameter settings.

(2) We can use ML to adjust the parameters in our research. Some researchers have conducted parallel studies of the real stock market and a simulated stock market. The agents adopt some simple rules in the given model, and a genetic algorithm slowly eliminates the agents whose behavior does not match reality, leading to a more intelligent selection of agents. Through such learning, the corresponding parameters can be obtained (see the sketch after this list).

(3) We can use ML to automatically set the parameters.
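As a rough illustration of points (2) and (3), the following Python sketch calibrates a single parameter of a toy simulation with a simple genetic algorithm. The "market model" here is a deliberately simple placeholder, not the stock-market studies mentioned above, and all values are assumptions made for the example.

```python
import random

# Sketch of genetic-algorithm calibration: evolve a population of candidate
# parameter values so the simulation output matches an observed series.
# The simulation below is a toy stand-in (exponential growth with rate r).

def simulate(r, n_steps=20, start=100.0):
    """Toy simulation: the price grows by rate r at each step."""
    prices, p = [], start
    for _ in range(n_steps):
        p *= (1.0 + r)
        prices.append(p)
    return prices

# Stand-in for observed real-market prices (generated here with r = 0.03).
observed = simulate(0.03)

def fitness(r):
    """Negative squared error between simulated and observed series."""
    sim = simulate(r)
    return -sum((s - o) ** 2 for s, o in zip(sim, observed))

population = [random.uniform(-0.1, 0.1) for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                    # eliminate poorly fitting candidates
    population = survivors + [
        s + random.gauss(0, 0.005) for s in survivors for _ in range(2)
    ]                                              # mutate survivors to refill the population

best = max(population, key=fitness)
print(f"calibrated growth rate: {best:.4f}")       # should land close to 0.03
```

In the studies Prof. Zhang described, the toy model would be replaced by a full agent-based market simulation, with selection acting on the agents' rules rather than a single growth rate.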

Applications of Complexity Theory

Blockchain

As X-Order is a token-economy think tank, we are interested in how Complexity Theory could help decentralized blockchain networks develop. Giving a brief history of Complexity Theory, Prof. Zhang shared that research on ABM simulation methods was relatively common in the 1990s. In the 2000s, the mainstream method was complex networks. From 2010 until today, the mainstream has been data-driven methods within Complexity Theory.

The interesting and advantageous thing about blockchains is that the data is public, whereas many real-world transactions are not. We can therefore analyze this data with complex-network methods, including those based on "scale", and may find answers to many basic scientific research questions.
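As a hypothetical illustration (not taken from the interview), a public ledger can be turned into a weighted transaction graph and examined with standard complex-network measures. The sketch below assumes the networkx library and uses made-up addresses and amounts.

```python
import networkx as nx

# Build a weighted, directed transaction graph from (sender, receiver, amount)
# records. The edge list below is a made-up placeholder for real on-chain data.

transactions = [
    ("addr_a", "addr_b", 1.5),
    ("addr_a", "addr_c", 0.7),
    ("addr_b", "addr_c", 2.1),
    ("addr_c", "addr_d", 0.3),
    ("addr_d", "addr_a", 4.0),
]

G = nx.DiGraph()
for sender, receiver, amount in transactions:
    # accumulate transferred value as the edge weight
    w = G[sender][receiver]["weight"] + amount if G.has_edge(sender, receiver) else amount
    G.add_edge(sender, receiver, weight=w)

# Basic structural measures of the kind used in complex-network studies
print("degree distribution:", dict(G.degree()))
print("PageRank (weighted):", nx.pagerank(G, weight="weight"))
```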

Internet

Prof. Zhang and his team have conducted research on what they term "collective attention". The subject is similar to the "evolution of public concerns on the Internet." One shortcoming of this research is that the data sets they acquired were relatively small and old. They are now trying to acquire and analyze data on the Internet and related domain names.

They will tabulate the user traffic and the traffic transferred between websites and analyze the data using a set of network methods. Thereafter, the data will be visualized, the evolution trend will be identified, and predictions will be made.
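A minimal sketch of such a pipeline might look like the following, assuming the networkx library; the site names, traffic figures, and the naive linear extrapolation are all illustrative assumptions rather than the team's actual method.

```python
import networkx as nx

# Tabulate traffic transfers between websites as a weighted directed network,
# then use a simple trend on each node's incoming traffic to flag candidates
# that may rise in the future. All numbers are invented for illustration.

# Traffic transfers (source site, destination site, visitors moved) at two times
snapshot_t1 = [("news.example", "video.example", 120),
               ("news.example", "shop.example", 40),
               ("video.example", "shop.example", 15)]
snapshot_t2 = [("news.example", "video.example", 100),
               ("news.example", "shop.example", 90),
               ("video.example", "shop.example", 60)]

def incoming_traffic(snapshot):
    """Weighted in-degree of every site: total attention flowing into it."""
    G = nx.DiGraph()
    G.add_weighted_edges_from(snapshot)
    return {node: G.in_degree(node, weight="weight") for node in G.nodes}

t1, t2 = incoming_traffic(snapshot_t1), incoming_traffic(snapshot_t2)

# Naive linear extrapolation of each site's incoming attention to the next step
for site in t2:
    growth = t2[site] - t1.get(site, 0)
    forecast = t2[site] + growth
    print(f"{site}: forecast incoming traffic = {forecast}")
```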

So far, there is an understanding of what the ecosystem looks like. Through visualization, the data can be better understood by knowing where the nodes are. In addition, the forecasts are also being optimized. In principle, these methods could be used to predict which websites have dark-horse potential, which areas will rise in the future, and which domain names will become popular. He and his team have published some papers on this.

Humans’ Adaptation in an AI-driven Society

He believes that even if we became the slaves of AI, it would not matter at all. To him, the core value of human beings lies in having humanity. Purpose, desires, and interests form something unchangeable and constant that humans will always possess. He also shared that it is highly plausible for robots to have consciousness in the future. The Gödel machine is a design concept that could illustrate how such consciousness might work.

Robot on robot horse riding off into the sunset, steered by a Gödel machine

Reaping the Benefits of Cutting-Edge Research

Prof. Zhang mentioned that there are people working on virtually anything within your sphere of imagination. The point is to return to yourself and discover your own dreams and what you want to do.

The book (“Scale”) he mentioned earlier touched on a few basic questions. For example, on life and death, its answers are simpler to understand than traditional perspectives in Complexity Theory.

The book also served as an inspiration to him, as it allowed him to see the relationship between economic development and environmental change. He has decided to spend the next decade researching this issue because he feels only systems science can answer it.

Principle of Scale

Prof. Zhang explains that the "Principle of Scale" concerns the relationship between two macro variables, such as the relationship between metabolism and body weight, or between GDP and population.
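A common way to make such a relationship precise is the power law Y = c·X^β, whose exponent can be estimated by linear regression in log-log space. The sketch below, which assumes numpy, recovers a known exponent from synthetic data; it is an illustration of the idea, not an analysis from the book or the interview.

```python
import numpy as np

# Sketch of the "Principle of Scale": a power-law relation Y = c * X**beta
# between two macro variables, e.g. metabolic rate vs. body mass (Kleiber's
# ~3/4 exponent) or GDP vs. population. We generate synthetic data with a
# known exponent and recover it by linear regression in log-log space.

rng = np.random.default_rng(0)
mass = np.logspace(0, 6, 200)                                      # body mass, arbitrary units
metabolism = 2.0 * mass**0.75 * rng.lognormal(0, 0.1, mass.size)   # noisy power law

beta, log_c = np.polyfit(np.log(mass), np.log(metabolism), 1)
print(f"estimated scaling exponent beta = {beta:.2f}")   # close to 0.75
print(f"estimated prefactor c = {np.exp(log_c):.2f}")
```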

For example, it is very important to understand the metabolism of different systems. All living complex systems have inflows and outflows — the exchange of energy and information. If we can delve deeper along the lines of metabolism, a certain structure will be uncovered. This structure is presented in the form of a network. The growth of the network is not arbitrary, and its evolution must follow some nature principles.

Metabolism

He concluded with two takeaways from the book:

(1) The book presented strong quantitative and precise concepts and arguments.

(2) The author also highlighted macro characteristics which can be observed holistically. However, the book's weakness is that it does not explain issues at the micro level, such as the timing of stock-price rises and falls.
