#foundation_models #self_supervised #BERT #Masked_Language_Model #Next_Sentence_Prediction #Transformer #WordPiece #bidirectional #fine-tuning #pre-training #self-attention #transfer_learning

Apr 7, 2026 • 4 min read

BERT: How to Read Language in Both Directions👈👉

The paper that taught an AI that could only read sentences left to right to understand the whole context, by giving it fill-in-the-blank quizzes.

Jason Lee
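
To make that teaser concrete before we dive in, here is a minimal sketch of the fill-in-the-blank (masked language modeling) idea. It assumes the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint; both are illustrative choices, not details specified in this post.

```python
# Minimal sketch of BERT's fill-in-the-blank (masked LM) objective.
# Assumes: `pip install transformers torch` and the public
# `bert-base-uncased` checkpoint (illustrative choices).
from transformers import pipeline

# Load a pretrained BERT together with its masked-language-model head.
fill = pipeline("fill-mask", model="bert-base-uncased")

# BERT attends to the whole sentence at once, so words on BOTH
# sides of [MASK] shape the prediction (bidirectional context).
for pred in fill("The man went to the [MASK] to buy some milk."):
    print(f"{pred['token_str']:>10}  p={pred['score']:.3f}")
```

A left-to-right model filling the same blank could only use "The man went to the"; BERT also sees "to buy some milk" on the right, which is exactly the bidirectional reading the title refers to.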