I am a Master's student (MLT) at the Language Technologies Institute, School of Computer Science, Carnegie Mellon University, advised by Eduard Hovy. My research interests lie in natural language processing and machine learning, with a focus on efficiency.
Previously, I was a research engineer at NAVER CLOVA, working on pretraining and parameter-efficient finetuning of Korean GPT-3. I received a Master's degree in Computer Science and Engineering from Sogang University, advised by Jungyun Seo in the Natural Language Processing Lab.
Before working on the quantitative side of language, I explored various aspects of language in the Humanities (Bachelor's degree in American Culture) and Social Sciences (Bachelor's degree in Psychology) departments at Sogang University.
Most recent publications on Google Scholar.
‡ indicates equal contribution.
On the Effect of Pretraining Corpora on In-context Learning by a Large-scale Language Model
Seongjin Shin‡, Sang-Woo Lee‡, Hwijeen Ahn, Sungdong Kim, HyoungSeok Kim, Boseop Kim, Kyunghyun Cho, Gichang Lee, Woomyoung Park, Jung-Woo Ha, Nako Sung
NAACL'22: Conference of the North American Chapter of the Association for Computational Linguistics. 2022.
Cross-Cultural Similarity Features for Cross-Lingual Transfer Learning of Pragmatically Motivated Tasks
Hwijeen Ahn‡, Jimin Sun‡, Chan Young Park‡, Yulia Tsvetkov, David R Mortensen
EACL'21: Conference of the European Chapter of the Association for Computational Linguistics. 2021.
NLPDove at SemEval-2020 Task 12: Improving Offensive Language Detection with Cross-lingual Transfer
Hwijeen Ahn‡, Jimin Sun‡, Chan Young Park‡, Jungyun Seo
SemEval'20: International Workshop on Semantic Evaluation. 2020.
Extensive Use of Morpheme Features in Korean Dependency Parsing
Hwijeen Ahn, Minyoung Seo, Chanmin Park, Juae Kim, Jungyun Seo
IEEE BigComp'19: IEEE International Conference on Big Data and Smart Computing. 2019.
Download full resume in PDF.