电器与能效管理技术 ›› 2025, Vol. 0 ›› Issue (12): 1-8.doi: 10.16628/j.cnki.2095-8188.2025.12.001

• 研究与分析 •

基于知识蒸馏和增量学习的电能质量扰动分类

丁峰1,2,3, 秦超3, 薛敏涓2,3, 吴亦然4, 施天灵4, 汪飞4   

  1. 1 同济大学 交通学院, 上海 201804
    2 电磁能技术全国重点实验室, 上海 200030
    3 上海船舶设备研究所, 上海 200030
    4 上海大学 机电工程与自动化学院, 上海 200444
  • 收稿日期:2025-10-14 出版日期:2025-12-30 发布日期:2025-12-31
  • About the authors: DING Feng (1993—), male, senior engineer, mainly engaged in research on marine power station systems. | QIN Chao (1983—), female, senior engineer, mainly engaged in research on marine power station systems. | XUE Minjuan (2000—), female, mainly engaged in research on marine power station systems.

Power Quality Disturbance Identification Based on Knowledge Distillation and Incremental Learning

DING Feng1,2,3, QIN Chao3, XUE Minjuan2,3, WU Yiran4, SHI Tianling4, WANG Fei4   

  1. 1 College of Transportation, Tongji University, Shanghai 201804, China
    2 National Key Laboratory of Electromagnetic Energy, Shanghai 200030, China
    3 Shanghai Marine Equipment Research Institute, Shanghai 200030, China
    4 School of Mechatronic Engineering and Automation, Shanghai University, Shanghai 200444, China
  • Received:2025-10-14 Online:2025-12-30 Published:2025-12-31

摘要:

为了准确快速识别电能质量扰动,提出一种融合知识蒸馏和增量学习的卷积神经网络模型。先构建高识别准确率的教师模型,通过知识蒸馏技术将教师模型对旧类别的识别知识有效转移至学生模型;再通过改进传统知识蒸馏损失函数,引入动态权重机制,使学生模型在高效蒸馏旧知识的同时,还能对新知识进行增量学习。与常规深度学习模型相比,所提模型无需整体重新训练即可适应新扰动,既能保证高识别准确率,又能显著缩短训练时间并节省计算资源。

关键词: 电能质量, 增量学习, 知识蒸馏, 卷积神经网络

Abstract:

To accurately and quickly identify power quality disturbances, a convolutional neural network model combining knowledge distillation and incremental learning is proposed. First, a teacher model with high identification accuracy is constructed, and the teacher model's knowledge of the old disturbance categories is effectively transferred to the student model through knowledge distillation. Then, by improving the traditional knowledge distillation loss function and introducing a dynamic weight mechanism, the student model achieves efficient distillation of old knowledge while incrementally learning new categories. Compared with conventional deep learning models, the proposed model adapts to new disturbances without full retraining, significantly reducing training time and saving computing resources while maintaining high identification accuracy.
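The loss structure described in the abstract combines a distillation term (transferring the teacher's soft predictions over the old classes to the student) with a standard cross-entropy term for the current labels, balanced by a dynamic weight. A minimal NumPy sketch of such a combined loss is given below; the weight schedule `alpha = n_old / n_total`, the temperature `T`, and all function names are illustrative assumptions, since the paper's exact formulation is not reproduced in the abstract.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax, computed stably."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target cross-entropy against the teacher's distribution
    over the old classes, scaled by T^2 (standard KD practice)."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    return -(T ** 2) * np.mean(np.sum(p_teacher * log_p_student, axis=-1))

def cross_entropy(student_logits, labels):
    """Hard-label cross-entropy over all (old + new) classes."""
    log_p = np.log(softmax(student_logits) + 1e-12)
    return -np.mean(log_p[np.arange(len(labels)), labels])

def incremental_kd_loss(student_logits, teacher_logits, labels,
                        n_old, n_total, T=2.0):
    """Combined loss with a dynamic weight: the more old classes there
    are relative to the total, the more weight falls on distillation.
    This alpha schedule is a placeholder, not the paper's formula."""
    alpha = n_old / n_total
    kd = distillation_loss(student_logits[:, :n_old],
                           teacher_logits[:, :n_old], T)
    ce = cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

At `n_old == n_total` the weight collapses to pure distillation, and as new disturbance classes are added the cross-entropy term gains influence, which is the qualitative behavior the abstract attributes to the dynamic weight mechanism.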

Key words: power quality, incremental learning, knowledge distillation, convolutional neural network

中图分类号: