Cite this article: GAO Juan, LIU Xin-wei. Barzilai-Borwein gradient tracking method for distributed optimization over directed networks[J]. Control Theory & Applications, 2023, 40(9): 1637-1645.
Barzilai-Borwein gradient tracking method for distributed optimization over directed networks
Received: 2022-02-21  Revised: 2023-06-10
DOI: 10.7641/CTA.2022.20125
2023, 40(9): 1637-1645
Keywords: distributed optimization; multi-agent systems; directed graphs; Barzilai-Borwein method; optimization algorithm; convergence rate
Funding: Supported by the National Natural Science Foundation of China (12071108, 11671116, 91630202)
Author  Affiliation  E-mail
GAO Juan  Hebei University of Technology  gaojuan514@163.com
LIU Xin-wei*  Hebei University of Technology  mathlxw@hebut.edu.cn
Abstract
      This paper studies the distributed optimization problem over directed networks, in which the global objective function is the average of all smooth and strongly convex local objective functions on the network. Motivated by the capability of Barzilai-Borwein step sizes to improve the performance of gradient methods, a distributed Barzilai-Borwein gradient tracking method is proposed. Unlike distributed gradient algorithms in the literature that use fixed step sizes, the proposed method allows each agent to compute its step size automatically from its local gradient information. By using row- and column-stochastic weights simultaneously, the method avoids the computation and communication required for eigenvector estimation. It is proved that the iterative sequence generated by the proposed method converges linearly to the optimal solution when the objective functions are smooth and strongly convex. Simulation results on a distributed logistic regression problem show that the proposed method performs better than some advanced distributed gradient algorithms with fixed step sizes.
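The abstract's core ingredients can be illustrated with a small numerical sketch. The code below is not the paper's exact algorithm: it combines a standard gradient-tracking update over a directed ring (an AB-style scheme with a row-stochastic matrix `A` and a column-stochastic matrix `B`) with a safeguarded local Barzilai-Borwein (BB1) step size computed by each agent from its own iterate and gradient differences. The network, the local quadratic objectives, and the safeguard interval are all illustrative choices.

```python
import numpy as np

# Illustrative sketch: distributed gradient tracking over the directed ring
# 0 -> 1 -> 2 -> 0 (plus self-loops). Each agent i holds a local quadratic
# f_i(x) = 0.5 * a[i] * (x - b[i])**2; the global objective is their average.
a = np.array([1.0, 1.5, 2.0])        # local curvatures
b = np.array([1.0, 3.0, -2.0])       # local minimizers
x_star = np.sum(a * b) / np.sum(a)   # minimizer of the average objective

def local_grad(x):
    """Gradient of each f_i evaluated at that agent's own iterate x[i]."""
    return a * (x - b)

# Row-stochastic A (rows sum to 1) and column-stochastic B (columns sum to 1),
# both respecting the directed edge set {self-loops, 0->1, 1->2, 2->0}.
A = np.array([[0.6, 0.0, 0.4],
              [0.3, 0.7, 0.0],
              [0.0, 0.2, 0.8]])
B = np.array([[0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])

alpha_min, alpha_max = 1e-3, 0.05    # safeguard interval (illustrative values)
alpha = np.full(3, alpha_max)        # per-agent step sizes
x = np.zeros(3)
g = local_grad(x)
y = g.copy()                         # gradient trackers, initialized at g_0

for _ in range(5000):
    x_new = A @ x - alpha * y        # consensus step minus tracked-gradient step
    g_new = local_grad(x_new)
    # Safeguarded BB1 step per agent, from purely local differences.
    # (On these quadratics the BB ratio is exactly 1/a[i] and gets capped by
    # alpha_max; on general problems it adapts from iteration to iteration.)
    s, u = x_new - x, g_new - g
    su = s * u
    with np.errstate(divide="ignore", invalid="ignore"):
        bb = np.where(np.abs(su) > 1e-12, (s * s) / su, alpha)
    alpha = np.clip(bb, alpha_min, alpha_max)
    y = B @ y + g_new - g            # tracks the sum of the local gradients
    x, g = x_new, g_new

print(x, x_star)  # each agent's iterate should be near x_star
```

Because `B` is column-stochastic, the sum of the trackers `y` equals the sum of the local gradients at every iteration, so the update direction approaches the global gradient as the agents reach consensus; the row-stochastic `A` drives that consensus without requiring eigenvector estimation.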