Swarm Dynamics: Flocking and Communication-Based Optimization
Abstract
In this thesis, we study swarm dynamics, both in the modeling of flocking behavior and in applications to numerical optimization. In general, the study of swarm dynamics concerns the collective behavior of multi-agent systems, in which interactions between individuals lead to the emergence of higher-level, self-organized structures in large crowds. Examples arise in a variety of research fields, including physics, biology, human society, and artificial intelligence. Owing to this wide range of applications, swarm dynamics has been an important research topic over the past decades.
The first part of our study is devoted to flocking dynamics driven by the Cucker-Smale system. Flocking behavior is addressed at both the particle and hydrodynamic levels, for single- and multi-species systems. When studying the hydrodynamic model, a canonical issue is the lack of a closure for the pressure. The main contribution of our work is to prove hydrodynamic alignment for multi-species systems without assuming a specific thermodynamic closure. The proof is based on graph connectivity and a weak dispersion bound. Beyond the analytical results, we also discuss simulation techniques for collective dynamics, including particle and hydrodynamic simulations, and present numerical examples that demonstrate the flocking behavior.
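To make the particle-level dynamics concrete, the following is a minimal sketch of a one-dimensional Cucker-Smale particle simulation with an explicit Euler step and the standard communication weight ψ(r) = (1 + r²)^(−β). The parameter choices (β, time step, number of agents) are illustrative only, not those used in the thesis; the expected qualitative outcome is velocity alignment, i.e. a shrinking velocity spread.

```python
import random

def psi(r, beta=0.5):
    """Cucker-Smale communication weight psi(r) = (1 + r^2)^(-beta)."""
    return (1.0 + r * r) ** (-beta)

def cs_step(xs, vs, dt=0.05):
    """One explicit Euler step of the 1D Cucker-Smale particle system:
    x_i' = v_i,  v_i' = (1/N) * sum_j psi(|x_j - x_i|) * (v_j - v_i)."""
    n = len(xs)
    new_vs = [
        v + dt / n * sum(psi(abs(xj - x)) * (vj - v) for xj, vj in zip(xs, vs))
        for x, v in zip(xs, vs)
    ]
    new_xs = [x + dt * v for x, v in zip(xs, new_vs)]
    return new_xs, new_vs

random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(20)]
vs = [random.uniform(-1.0, 1.0) for _ in range(20)]
spread0 = max(vs) - min(vs)          # initial velocity diameter
for _ in range(200):
    xs, vs = cs_step(xs, vs)
spread1 = max(vs) - min(vs)          # velocity diameter after T = 10
print(f"velocity spread: {spread0:.3f} -> {spread1:.3f}")
```

Because the communication weight here is bounded below on the region the particles occupy, the velocity diameter contracts at an exponential rate, which is the particle-level flocking behavior the thesis analyzes.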
The second part of the thesis focuses on a swarm-based approach to numerical optimization. Based on the idea of consensus formation, we propose a new Swarm-Based Gradient Descent (SBGD) method. To improve the flexibility of exploration, the method employs multiple interacting agents that search for solutions in parallel. The key innovation of SBGD is the introduction of individual masses, which distinguish the quality of different agents and dictate their choice of step size. Agents exchange information through dynamic mass transitions: as the iterations proceed, good agents gradually become heavy 'leaders' and help the crowd find better minimizers. The marked improvement of SBGD over the classical gradient descent method and the widely used Adam method is demonstrated in a series of benchmark tests.
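The interplay of masses, step sizes, and mass transitions described above can be sketched in a few lines. The specific step-size rule (step size shrinks with an agent's share of the total mass) and the mass-transfer rule (the worst agent cedes a fixed fraction of its mass to the best one) used below are simplified illustrative stand-ins, not the exact SBGD protocol of the thesis; the objective and all parameter values are likewise hypothetical.

```python
import random

def f(x):
    """Objective for the demo: a simple quadratic with minimizer x = 2."""
    return (x - 2.0) ** 2

def grad(x):
    return 2.0 * (x - 2.0)

def sbgd(f, grad, agents, masses, steps=100, h=0.1):
    """Simplified swarm-based gradient descent: each agent takes a
    gradient step whose size shrinks with its share of the total mass
    (light agents explore with larger steps, heavy 'leaders' refine
    with smaller ones).  At every iteration the worst agent transfers
    a fraction of its mass to the best one, so good agents gradually
    become heavy leaders."""
    for _ in range(steps):
        total = sum(masses)          # total mass is conserved
        agents = [x - h * (1.0 - m / total) * grad(x)
                  for x, m in zip(agents, masses)]
        vals = [f(x) for x in agents]
        best = vals.index(min(vals))
        worst = vals.index(max(vals))
        dm = 0.1 * masses[worst]     # shift 10% of the worst agent's mass
        masses[worst] -= dm
        masses[best] += dm
    return agents, masses

random.seed(1)
agents = [random.uniform(-3.0, 3.0) for _ in range(6)]
masses = [1.0] * 6
agents, masses = sbgd(f, grad, agents, masses)
print(f"best value found: {min(f(x) for x in agents):.2e}")
```

Note the division of labor this induces: mass flows toward agents with low objective values, so the heaviest agents sit near the best minimizer found so far, while the remaining light agents keep taking large exploratory steps.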