Research on rolling bearing fault feature extraction based on entropy feature

Bibliographic Details
Main Authors: Zihan Wang (Author), Yong Jian Sun (Author)
Format: Book
Published: Annals of Mathematics and Physics - Peertechz Publications, 2021-08-16.
Description
Summary: In large machinery, the rolling bearing is one of the most commonly used components. When a rolling bearing fails, it can disrupt the normal operation of the equipment or even create a safety hazard, so bearing faults need to be monitored and diagnosed in advance. The most important step in fault diagnosis is feature extraction, which is the subject of this paper. The approximate entropy, sample entropy, and information entropy are analyzed, and features are extracted from them to diagnose rolling bearing faults. The specific research contents are as follows: (1) The concepts of approximate entropy, sample entropy, and information entropy are briefly introduced, and the three entropies of rolling bearing vibration signals under different fault modes are calculated. The feasibility and shortcomings of the features extracted from these three entropies for characterizing rolling bearing faults are analyzed. (2) To compensate for their individual defects, a fault feature extraction method based jointly on approximate entropy, sample entropy, and information entropy is proposed, and its feasibility is verified. (3) Simulation experiments are carried out to evaluate the accuracy of fault feature extraction based on the joint analysis of approximate entropy, sample entropy, and information entropy.
DOI: 10.17352/amp.000025
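
The abstract describes computing approximate entropy, sample entropy, and information (Shannon) entropy of a vibration signal and combining them into a joint feature vector. The following is a minimal illustrative sketch of such features, not the authors' implementation; the embedding dimension m = 2, the tolerance r = 0.2 × signal standard deviation, and the 32 histogram bins are commonly used assumptions rather than values taken from the paper.

    import numpy as np

    def _phi(x, m, r):
        """Average log-fraction of templates within tolerance r (Chebyshev distance)."""
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        counts = np.zeros(n)
        for i in range(n):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            counts[i] = np.sum(d <= r) / n          # self-match included, as ApEn defines
        return np.mean(np.log(counts))

    def approximate_entropy(x, m=2, r=None):
        x = np.asarray(x, dtype=float)
        r = 0.2 * np.std(x) if r is None else r     # common choice: 0.2 * signal std
        return _phi(x, m, r) - _phi(x, m + 1, r)

    def sample_entropy(x, m=2, r=None):
        x = np.asarray(x, dtype=float)
        r = 0.2 * np.std(x) if r is None else r
        n = len(x) - m                              # same template count for m and m+1

        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(n)])
            total = 0
            for i in range(n - 1):
                d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                total += np.sum(d <= r)             # self-matches excluded
            return total

        a, b = count_matches(m + 1), count_matches(m)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def information_entropy(x, bins=32):
        """Shannon entropy of the amplitude histogram of the signal."""
        hist, _ = np.histogram(x, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    if __name__ == "__main__":
        # Hypothetical stand-in for a bearing vibration segment.
        rng = np.random.default_rng(0)
        signal = np.sin(2 * np.pi * 0.05 * np.arange(1024)) + 0.3 * rng.standard_normal(1024)
        features = [approximate_entropy(signal),
                    sample_entropy(signal),
                    information_entropy(signal)]
        print("joint entropy feature vector:", features)

In the joint analysis described in the abstract, one such three-element feature vector would be computed per vibration segment and then compared across fault modes; the classifier or decision rule applied to these vectors is not specified in the abstract.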