Classic style (research topic + method)
1、Attention Is All You Need *(the seminal Transformer architecture paper)*
2、BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
3、Deep Residual Learning for Image Recognition *(the ResNet paper)*
4、Generative Adversarial Networks *(the original GAN paper)*
Concise technical style
*A Survey on Contrastive Self-Supervised Learning*
*Diffusion Models for Medical Image Segmentation*
*Neural Architecture Search with Reinforcement Learning*
*Few-Shot Learning via Meta-Transfer Networks*
Question-driven style
*Why Does MAML Outperform Fine-Tuning in Few-Shot Learning?*
*How to Train Your ViT? Data, Augmentation, and Regularization in Vision Transformers*
*When Does Label Smoothing Help?*
Named-method style
*LoRA: Low-Rank Adaptation of Large Language Models*
*YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors*
*DALL·E 3: Improving Image Generation with Better Captions*
Interdisciplinary applications
*AI for Science: A New Paradigm in Scientific Discovery*
*Predicting Molecular Properties with Graph Neural Networks*
*RoboAgent: Generalist Robot via Multi-Task Learning*
Trending directions (2023–2024 hot topics)
*LLM-Augmenter: Enhancing Reasoning in Large Language Models*
*World Models for Autonomous Driving*
*Detecting AI-Generated Text: Is Watermarking Enough?*
Humorous/creative style (use with caution)
*I'm Sorry Dave, I'm Afraid I Can't Do That* *(a paper discussing AI ethics)*
*The Bitter Lesson: Scaling Wins in AI* *(referencing a classic viewpoint from the reinforcement learning community)*