TY - GEN
T1 - Quantitative Analysis of Dynamical Complexity in Cultured Neuronal Network Models for Reservoir Computing Applications
AU - Moriya, Satoshi
AU - Yamamoto, Hideaki
AU - Hirano-Iwata, Ayumi
AU - Kubota, Shigeru
AU - Sato, Shigeo
N1 - Funding Information:
This study was supported by the Cooperative Research Project Program of the Research Institute of Electrical Communication of Tohoku University, the Program on Open Innovation Platform with Enterprises, Research Institute and Academia (OPERA) from the Japan Science and Technology Agency (JST), JSPS KAKENHI (No. 17K18864, 18J12197, and 18H03325), and JST PRESTO (No. JPMJPR18MB).
PY - 2019/7
Y1 - 2019/7
N2 - Reservoir computing is a machine learning paradigm that was proposed as a model of cortical information processing in the brain. It processes information using the spatiotemporal dynamics of a large-scale recurrent neural network and is expected to improve power efficiency and speed in neuromorphic computing systems. Previous theoretical investigations have shown that brain networks exhibit an intermediate state between full coherence and random firing, which is suitable for reservoir computing. However, how reservoir performance is influenced by connectivity, especially the connectivity structures revealed in recent connectomics analyses of brain networks, remains unclear. Here, we constructed modular networks of integrate-and-fire neurons and investigated the effect of modular structure and the excitatory-inhibitory neuron ratio on network dynamics. The dynamics were evaluated using three measures: synchronous bursting frequency, mean correlation, and functional complexity. We found that in a purely excitatory network, the complexity was independent of the modularity of the network. On the other hand, networks with inhibitory neurons exhibited complex network activity when the modularity was high. Our findings reveal a fundamental aspect of reservoir performance in brain networks, contributing to the design of bio-inspired reservoir computing systems.
AB - Reservoir computing is a machine learning paradigm that was proposed as a model of cortical information processing in the brain. It processes information using the spatiotemporal dynamics of a large-scale recurrent neural network and is expected to improve power efficiency and speed in neuromorphic computing systems. Previous theoretical investigations have shown that brain networks exhibit an intermediate state between full coherence and random firing, which is suitable for reservoir computing. However, how reservoir performance is influenced by connectivity, especially the connectivity structures revealed in recent connectomics analyses of brain networks, remains unclear. Here, we constructed modular networks of integrate-and-fire neurons and investigated the effect of modular structure and the excitatory-inhibitory neuron ratio on network dynamics. The dynamics were evaluated using three measures: synchronous bursting frequency, mean correlation, and functional complexity. We found that in a purely excitatory network, the complexity was independent of the modularity of the network. On the other hand, networks with inhibitory neurons exhibited complex network activity when the modularity was high. Our findings reveal a fundamental aspect of reservoir performance in brain networks, contributing to the design of bio-inspired reservoir computing systems.
KW - complexity
KW - modular networks
KW - reservoir computing
UR - http://www.scopus.com/inward/record.url?scp=85073246903&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85073246903&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2019.8852207
DO - 10.1109/IJCNN.2019.8852207
M3 - Conference contribution
AN - SCOPUS:85073246903
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2019 International Joint Conference on Neural Networks, IJCNN 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 International Joint Conference on Neural Networks, IJCNN 2019
Y2 - 14 July 2019 through 19 July 2019
ER -