Citation: CHEN Tao, WANG Li-jie, LIU Yang, XU Li-li, YU Hai-sheng. Bayesian approximate broad learning system with dropout structure[J]. Control Theory & Applications, 2025, 42(8): 1632-1640.
Bayesian approximate broad learning system with dropout structure
Received: 2023-02-26  Revised: 2025-01-24
DOI: 10.7641/CTA.2024.30087
2025, 42(8): 1632-1640
Keywords: broad learning system; dropout; Gaussian process; Bayesian approximation; Lagrange multipliers; regression analysis
Funding: Supported by the National Natural Science Foundation of China (62103214, 62373208, 62273189), the China Postdoctoral Science Foundation (2021M700077, 2023T160348), the Young Taishan Scholars Program of Shandong Province (tsqnz20221133, tsqn202306218), and the Shandong Provincial Natural Science Foundation (ZR2024QF026, ZR2024YQ032).
Author  Affiliation  E-mail
CHEN Tao  College of Automation, Qingdao University  qdu.chentao@qdu.edu.cn
WANG Li-jie*  College of Automation, Qingdao University  lijiewang1@gmail.com
LIU Yang  College of Automation and Electronic Engineering, Qingdao University of Science and Technology
XU Li-li  Faculty of Arts and Sciences, Beijing Normal University
YU Hai-sheng  College of Automation, Qingdao University; Shandong Key Laboratory of Industrial Control Technology
Abstract
      The existing broad learning system (BLS) and its improved algorithms share a common problem: as the complexity of data in practical scenarios grows, the network structure becomes extremely complex, which in turn greatly increases the consumption of computing resources. To address this problem, this paper proposes a Bayesian approximate broad learning system with a dropout structure (Dropout-BABLS). First, the dropout algorithm is used to randomly discard hidden-layer nodes of the broad learning system. Second, Gaussian process regression and Bayesian theory are combined to approximate the loss function induced by dropout on the output, which determines the objective function of Dropout-BABLS; the augmented Lagrange multiplier method is then used to solve for the optimal output weights of that objective. Finally, the algorithm is analyzed and evaluated on 10 regression datasets from the UCI machine learning repository and 6 self-built time-series datasets. The results show that the proposed Dropout-BABLS algorithm maintains comparable output accuracy while reducing training time by 25% to 50%.
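The pipeline described in the abstract (random feature and enhancement nodes, dropout over the hidden-layer nodes, then a solve for the output weights) can be sketched in a simplified form. This is an illustrative assumption, not the authors' code: the paper's Bayesian approximation of the dropout loss and its augmented-Lagrangian solver are replaced here by a plain ridge-regularized least-squares solution, and all names, node counts, and the dropout rate are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def bls_train(X, Y, n_feature=20, n_enhance=40, drop_rate=0.3, lam=1e-2):
    """Train a toy BLS: dropout prunes hidden nodes, ridge solves the weights."""
    # Random feature-mapping nodes (random linear map + tanh).
    Wf = rng.standard_normal((X.shape[1], n_feature))
    Z = np.tanh(X @ Wf)
    # Random enhancement nodes built on top of the feature nodes.
    We = rng.standard_normal((n_feature, n_enhance))
    H = np.tanh(Z @ We)
    A = np.hstack([Z, H])  # hidden-layer output matrix [feature | enhancement]
    # Dropout: randomly discard a fraction of the hidden nodes; the mask is
    # kept so the pruned nodes stay inactive at prediction time.
    mask = rng.random(A.shape[1]) >= drop_rate
    A = A * mask
    # Ridge-regularized normal equations for the output weights -- a stand-in
    # for the augmented-Lagrangian optimization used in the paper.
    W_out = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)
    return Wf, We, mask, W_out

def bls_predict(X, Wf, We, mask, W_out):
    """Predict with the same random maps and the same pruned-node mask."""
    Z = np.tanh(X @ Wf)
    H = np.tanh(Z @ We)
    A = np.hstack([Z, H]) * mask
    return A @ W_out

# Toy regression: target is the (noisy) sum of the inputs.
X = rng.standard_normal((200, 5))
Y = X.sum(axis=1, keepdims=True) + 0.01 * rng.standard_normal((200, 1))
params = bls_train(X, Y)
pred = bls_predict(X, *params)
```

Because the hidden nodes are never retrained, dropping a fraction of them shrinks the linear system solved for `W_out`, which is the intuition behind the training-time savings the abstract reports; the accuracy-preserving part relies on the Bayesian approximation that this sketch omits.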