Featured Publication
Improving meta-learning model via meta-contrastive loss
Abstract:
Recently, addressing the few-shot learning problem with the meta-learning framework has achieved great success. Regularization is a powerful technique widely used to improve machine learning algorithms, yet little research has focused on designing appropriate meta-regularizations to further improve the generalization of meta-learning models in few-shot learning. In this paper, we propose a novel meta-contrastive loss that can be regarded as a regularization to fill this gap. Our method is motivated by the observation that the limited data in few-shot learning is only a small part of the data sampled from the whole distribution, and different sampled parts can lead to biased representations of the whole data. Thus, the models trained on the few training data (support set) and test data (query set) might be misaligned in the model space, so that the model learned on the support set cannot generalize well to the query data. The proposed meta-contrastive loss is designed to align the models of the support and query sets to overcome this problem, improving the performance of the meta-learning model in few-shot learning. Extensive experiments demonstrate that our method improves the performance of different gradient-based meta-learning models on various learning problems, e.g., few-shot regression and classification.
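The core idea in the abstract, aligning the model adapted on the support set with the model adapted on the query set, can be illustrated with a toy sketch. This is not the paper's actual meta-contrastive loss (which is contrastive, built from positive and negative model pairs); it is a simplified stand-in that uses a squared-distance alignment penalty between the two adapted parameter vectors of a linear model, assuming a MAML-style inner loop. All function names here are hypothetical.

```python
import numpy as np

def adapt(theta, X, y, lr=0.1, steps=5):
    """Task-specific adaptation: gradient steps on squared loss for a linear model."""
    w = theta.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def alignment_penalty(w_support, w_query):
    """Hypothetical stand-in for the meta-contrastive term: squared distance
    between the models adapted on the support set and on the query set."""
    return float(np.sum((w_support - w_query) ** 2))

rng = np.random.default_rng(0)
theta = np.zeros(2)                       # meta-initialization
true_w = np.array([1.5, -0.7])            # ground-truth task parameters
X_s, X_q = rng.normal(size=(10, 2)), rng.normal(size=(10, 2))
y_s, y_q = X_s @ true_w, X_q @ true_w     # support and query targets

w_s = adapt(theta, X_s, y_s)              # model adapted on the support set
w_q = adapt(theta, X_q, y_q)              # model adapted on the query set
penalty = alignment_penalty(w_s, w_q)
```

In a full meta-learning loop, this penalty would be added to the usual query loss in the outer (meta) objective, pushing the meta-initialization toward regions where support-adapted and query-adapted models agree.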
Keywords:
CLC Number:
Authors:
Pinzhuo TIAN;Yang GAO
Affiliation:
Department of Computer Science and Technology,Nanjing University,Jiangsu 210023,China
Source:
Citation:
[1] Pinzhuo TIAN; Yang GAO. Improving meta-learning model via meta-contrastive loss [J]. Frontiers of Computer Science, 2022(05): 101-107.
The machine-assigned CLC number is generated automatically by Yutian Data Technology from publicly available online sources and is provided for study and research reference only.