About me

UPDATE

  • In Oct. 2023, I moved to KRAFTON.

  • From Mar. 2022 to Feb. 2025, I am working as a research scientist at AITRICS as an alternative to mandatory military service, under the supervision of Prof. Sung Ju Hwang. I plan to return to academia after completing my military service.

I am a Ph.D. student at the Machine Learning and Artificial Intelligence (MLAI) Lab at KAIST, under the supervision of Prof. Sung Ju Hwang.

My research interests include:

  • Efficient Deep Learning with Limited Data
    • Domain Adaptation and Generalization
  • Natural Language Processing and Understanding
  • Graph Representation Learning
  • Text-to-Speech Synthesis

Education

Preprints / Workshop Publications

  • [W1] Knowledge Graph-Augmented Language Models for Knowledge-Grounded Dialogue Generation
    [workshop] [paper]
    Minki Kang*, Jin Myung Kwak*, Jinheon Baek*, Sung Ju Hwang
    Knowledge Retrieval and Language Models (KRLM) Workshop @ ICML 2022

Conference Publications

  • [C12] Knowledge-Augmented Language Model Verification
    [paper]
    Jinheon Baek, Soyeong Jeong, Minki Kang, Jong C. Park, Sung Ju Hwang
    EMNLP 2023

  • [C11] Knowledge-Augmented Reasoning Distillation for Small Language Models in Knowledge-Intensive Tasks
    [paper]
    Minki Kang, Seanie Lee, Jinheon Baek, Kenji Kawaguchi, Sung Ju Hwang
    NeurIPS 2023

  • [C10] ZET-Speech: Zero-shot adaptive Emotion-controllable Text-to-Speech Synthesis with Diffusion and Style-based Models
    [paper]
    Minki Kang*, Wooseok Han*, Sung Ju Hwang, Eunho Yang
    Interspeech 2023

  • [C9] Grad-StyleSpeech: Any-speaker Adaptive Text-To-Speech Synthesis with Diffusion Models
    [paper]
    Minki Kang*, Dongchan Min*, Sung Ju Hwang
    ICASSP 2023

  • [C8] Sparse Token Transformer with Attention Back Tracking
    [paper]
    Heejun Lee, Minki Kang, Youngwan Lee, Sung Ju Hwang
    ICLR 2023

  • [C7] Self-Distillation for Further Pre-training of Transformers
    [paper]
    Seanie Lee, Minki Kang, Juho Lee, Sung Ju Hwang, Kenji Kawaguchi
    ICLR 2023

  • [C6] KALA: Knowledge-Augmented Language Model Adaptation
    [paper]
    Minki Kang*, Jinheon Baek*, Sung Ju Hwang
    NAACL 2022

  • [C5] Edge Representation Learning with Hypergraphs
    [paper]
    Jaehyeong Jo*, Jinheon Baek*, Seul Lee*, Dongki Kim, Minki Kang, Sung Ju Hwang
    NeurIPS 2021

  • [C4] Learning to Perturb Word Embeddings for Out-of-distribution QA
    [paper]
    Seanie Lee*, Minki Kang*, Juho Lee, Sung Ju Hwang
    ACL 2021

  • [C3] Accurate Learning of Graph Representations with Graph Multiset Pooling
    [paper]
    Jinheon Baek*, Minki Kang*, Sung Ju Hwang
    ICLR 2021

  • [C2] Neural Mask Generator: Learning to Generate Adaptive Word Maskings for Language Model Adaptation
    [paper]
    Minki Kang*, Moonsu Han*, Sung Ju Hwang
    EMNLP 2020

  • [C1] Episodic Memory Reader: Learning What to Remember for Question Answering from Streaming Data
    [paper]
    Moonsu Han*, Minki Kang*, Hyunwoo Jung, Sung Ju Hwang
    ACL 2019

(*: equal contribution)

Experiences

  • KRAFTON
    Oct. 2023 -

    Natural Language DL Team @ AI Research Center

  • AITRICS
    Mar. 2022 - Oct. 2023

    Researcher @ Virtual Human Team

  • Technical University of Munich
    Apr. 2019 - Aug. 2019

    Exchange Student @ Electrical and Computer Engineering Department

  • Kakao
    Jan. 2018 - Feb. 2018

    Intern @ Context Department