Generic meta-transfer learning model with special neuronal processing parameters for few-shot bearing fault diagnosis
Abstract
Society now operates in a data-rich environment, and advances in information technology have made deep learning widely used in bearing fault diagnosis. These methods typically require large amounts of data. In many practical cases, however, only a few samples are available when a fault occurs, rather than sufficient data for analysis, so bearing fault diagnosis is frequently a few-shot problem. In this work, a generic meta-transfer learning model with special neuronal processing parameters (MSNPP) is proposed. MSNPP avoids the overfitting commonly encountered when traditional meta-learning approaches are applied to few-shot problems, while maintaining strong performance when extracting features with deep networks. Moreover, MSNPP discovers connections between different tasks by analyzing a few samples and quickly adapts to new tasks. In MSNPP, a technique known as neuron transfer (NT) manipulates neurons by scaling and shifting them; these scaling and shifting parameters serve as meta-learned hyperparameters that are transferred across tasks. Experimental results show that MSNPP prevents the overfitting seen in conventional meta-learning approaches and achieves satisfactory accuracy and robustness on few-shot fault-diagnosis problems.
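The abstract does not specify how NT parameterizes its scaling and shifting, so the following is only a minimal sketch of one common interpretation: frozen pretrained weights are adapted per neuron by a meta-learned multiplicative scale and additive shift, while the base weights themselves are not updated during task adaptation. All names here (`neuron_transfer`, the shapes chosen) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def neuron_transfer(weights, bias, scale, shift):
    """Hypothetical per-neuron scale-and-shift adaptation.

    weights: (out_neurons, in_features) frozen pretrained weights
    bias:    (out_neurons,)             frozen pretrained biases
    scale:   (out_neurons,)             meta-learned scaling parameters
    shift:   (out_neurons,)             meta-learned shifting parameters
    """
    # Each neuron's weight vector is scaled; its bias is shifted.
    w_adapted = weights * scale[:, None]
    b_adapted = bias + shift
    return w_adapted, b_adapted

# Toy layer: 3 neurons, 4 inputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))
b = rng.standard_normal(3)

# Identity initialization (scale=1, shift=0) leaves the pretrained
# layer unchanged, so meta-training starts from the source model.
scale = np.ones(3)
shift = np.zeros(3)
W2, b2 = neuron_transfer(W, b, scale, shift)
```

In this reading, only `scale` and `shift` would be optimized across tasks, which keeps the number of task-specific parameters small and is one plausible reason such schemes resist overfitting on few-shot data.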