Please see my Google Scholar profile for an up-to-date list.
*: equal contribution
2025
Handling Korean Out-of-Vocabulary Words with Phoneme Representation Learning
Nayeon Kim*, Eojin Jeon*, Jun-Hyung Park, SangKeun Lee
PAKDD 2025
Continual Debiasing: A Bias Mitigation Framework for Natural Language Understanding Systems
Mingyu Lee, Junho Kim, Jun-Hyung Park, SangKeun Lee
Expert Systems with Applications (ESWA)
2024
MolTRES: Improving Chemical Language Representation Learning for Molecular Property Prediction
Jun-Hyung Park, Yeachan Kim, Mingyu Lee, Hyuntae Park, SangKeun Lee
EMNLP 2024
Zero-shot Commonsense Reasoning over Machine Imagination
Hyuntae Park*, Yeachan Kim*, Jun-Hyung Park, SangKeun Lee
Findings of EMNLP 2024
MELT: Materials-aware Continued Pre-training for Language Model Adaptation to Materials Science
Junho Kim*, Yeachan Kim*, Jun-Hyung Park, Yerim Oh, Suho Kim, SangKeun Lee
Findings of EMNLP 2024
Moleco: Molecular Contrastive Learning with Chemical Language Models for Molecular Property Prediction
Jun-Hyung Park*, Hyuntae Park*, Yeachan Kim, Woosang Lim, SangKeun Lee
EMNLP 2024 Industry Track
SEED: Semantic Knowledge Transfer for Language Model Adaptation to Materials Science
Yeachan Kim, Jun-Hyung Park, SungHo Kim, Juhyeong Park, Sangyun Kim, SangKeun Lee
EMNLP 2024 Industry Track
Coconut: Contextualized Commonsense Unified Transformers for Graph-Based Commonsense Augmentation of Language Models
Jun-Hyung Park, Mingyu Lee, Junho Kim, SangKeun Lee
Findings of ACL 2024
2023
DIVE: Towards Descriptive and Diverse Visual Commonsense Generation
Jun-Hyung Park*, Hyuntae Park*, Youjin Kang, Eojin Jeon, SangKeun Lee
EMNLP 2023
Leap-of-Thought: Accelerating Transformers via Dynamic Token Routing
Yeachan Kim, Junho Kim, Jun-Hyung Park, Mingyu Lee, SangKeun Lee
EMNLP 2023
SMoP: Towards Efficient and Effective Prompt Tuning with Sparse Mixture-of-Prompts
Joon-Young Choi, Junho Kim, Jun-Hyung Park, Wing-Lam Mok, SangKeun Lee
EMNLP 2023
Client-Customized Adaptation for Parameter-Efficient Federated Learning
Yeachan Kim*, Junho Kim*, Wing-Lam Mok, Jun-Hyung Park, SangKeun Lee
Findings of ACL 2023
Dynamic Structure Pruning for Compressing CNNs
Jun-Hyung Park, Yeachan Kim, Junho Kim, Joon-Young Choi, SangKeun Lee
AAAI 2023
2022
Break it Down into BTS: Basic, Tiniest Subword Units for Korean
Jun-Hyung Park*, Nayeon Kim*, Joon-Young Choi, Eojin Jeon, Youjin Kang, SangKeun Lee
EMNLP 2022
Tutoring Helps Students Learn Better: Improving Knowledge Distillation for BERT with Tutor Network
Jun-Hyung Park*, Junho Kim*, Mingyu Lee, Wing-Lam Mok, Joon-Young Choi, SangKeun Lee
EMNLP 2022
Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking
Jun-Hyung Park*, Mingyu Lee*, Junho Kim, Kang-Min Kim, SangKeun Lee
EMNLP 2022
Quantized Sparse Training: A Unified Trainable Framework for Joint Pruning and Quantization of DNNs
Jun-Hyung Park, Kang-Min Kim, SangKeun Lee
ACM Transactions on Embedded Computing Systems (TECS)
Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference
Jun-Hyung Park*, Yong-Ho Jung*, Joon-Young Choi, Mingyu Lee, Junho Kim, Kang-Min Kim, SangKeun Lee
Findings of ACL 2022
Examining the Impact of Adaptive Convolution on Natural Language Understanding
Jun-Hyung Park, Byung-Ju Choi, SangKeun Lee
Expert Systems with Applications (ESWA)