Download the latest CV: Link
EDUCATION
- Ph.D., Speech-Language-Hearing Sciences, The Graduate Center, City University of New York, USA, 2021
- M.A., English Linguistics (ABD), Korea University, Seoul, Korea, 2016
- B.A., English Language and Literature, Korea University, Seoul, Korea, 2014
PROFESSIONAL EXPERIENCE
- 2022 – 2024, Director of AI Research, i-Scream arts, Korea
- Led a team developing multimodal AI solutions for digital drawing analysis and psychological evaluation, focusing on user sentiment analysis through AI-based analysis of image color and sketch elements.
- 2021 – 2022, AI Researcher, i-Scream kids, Korea
- Designed and implemented AI engines for social network applications, enhancing user experience and platform interactivity.
- 2016 – 2021, Research Assistant, Speech-Production-Acoustics Perception Lab, CUNY, USA
- Collaborated on projects involving articulatory phonology and motor control with Dr. Douglas H. Whalen, contributing to understanding speech variability and perception.
- 2021 – 2022, Research Assistant, Haskins Laboratories, USA
- Focused on speech production and perception research, particularly articulatory variability in native and non-native speakers, using electromagnetic articulography (EMA) and ultrasound.
- 2021 – 2022, Research Assistant, EMCS/NAMZ Labs, Korea
- Conducted research in automatic speech recognition and synthesis, contributing to advancements in language processing technologies.
PEER-REVIEWED JOURNAL ARTICLES
- Roon, K. D., Chen, W.-R., Iwasaki, R., Kang, J., Kim, B., Shejaeya, G., Tiede, M. K., & Whalen, D. H. (2022). Comparison of auto-contouring and hand-contouring of ultrasound images of the tongue surface. Clinical Linguistics & Phonetics, 36(12), 1112–1131.
- Lee, S., Kang, J., & Nam, H. (2020). Identification of English vowels by non-native listeners: Effects of listeners’ experience of the target dialect and talkers’ language background. Second Language Research, 1–27.
- Roon, K. D., Kang, J., & Whalen, D. H. (2020). Effects of ultrasound familiarization on production and perception of nonnative contrasts. Phonetica, 77(5), 350–393.
CONFERENCE PROCEEDINGS
- Kang, J., Nam, H., & Whalen, D. H. (2020). Estimating the “good” variability in speech production using invertible neural networks. 12th International Seminar on Speech Production.
- Kang, J., Nam, H., Chen, W., & Whalen, D. H. (2019). Benign vs. harmful variability in second language vowel production. Proceedings of International Congress of Phonetic Sciences 2019.
- Whalen, D. H., Kang, J., Iwasaki, R., Shejaeya, G., Kim, B., Roon, K. D., Mark, K., Preston, J. L., Phillips, E., McAllister, T., & Boyce, S. E. (2019). Accuracy assessments of hand and automatic measurements of ultrasound images of the tongue. Proceedings of International Congress of Phonetic Sciences 2019.
- Chen, W.-R., Tiede, M., Kang, J., Kim, B., & Whalen, D. H. (2019). An electromagnetic articulography-facilitated deep neural network model of tongue contour detection from ultrasound images. The Journal of the Acoustical Society of America, 146(4), 3081–3081.
- Kang, J., Whalen, D. H., & Nam, H. (2018). The effect of native language on the second language vowel variability. The Journal of the Acoustical Society of America, 144(3), 1868.
- Chen, W., Saltzman, E., Nam, H., & Kang, J. (2018). Benign vs. destructive variability in speech production: an uncontrolled manifold approach. UConn Language Fest.
- Kang, J., Whalen, D. H., & Nam, H. (2017). Non-linear dimensionality reduction for correlated tongue measurement points. The Journal of the Acoustical Society of America, 141(5), 3581–3581.
- Nam, H., Kang, J., & Saltzman, E. (2017). Uncontrolled manifold method to speech production. The Journal of the Acoustical Society of America, 141(5), 3584–3584.
- You, H., Yang, H., Kang, J., Cho, Y., Hwang, S. H., Hong, Y., Cho, Y., Kim, S., & Nam, H. (2016). Development of articulatory estimation model using deep neural network. Phonetics and Speech Sciences, 8(3), 31–38.
INVITED TALKS
- Guest lecturer (2020). Articulatory Phonology, Fall 2020, The Graduate Center, CUNY. An introduction to TADA, the Haskins Laboratories Task-Dynamics Application.
- Guest lecturer (2018). The Graduate Center, CUNY. An introduction to PsychoPy3 for designing your own online experiment.
- Guest lecturer (2018). Doctoral Research, Spring 2018, The Graduate Center, CUNY. Designing behavioral experiments using PsychoPy3 and analyzing data with MATLAB.
AWARDS & SCHOLARSHIPS
- Dissertation Fellowship (2020). The Graduate Center, CUNY
- Doctoral Student Research Grant (2019). The Graduate Center, CUNY
- The Moe and Hannah Bergman Award for Conference Travel (2017, 2019). The Graduate Center, CUNY
- Best Student Paper Award (2015). International Conference on Speech Sciences, Seoul, Korea
- Research Assistant Scholarships (2014, 2015). Korea University, Seoul, Korea
- Graduation with Great Honor (2014). Academic Affairs, Korea University, Seoul, Korea
SKILLS & ABILITIES
- Programming: Python, MATLAB, R, Praat scripting, vanilla JS, React, SQLite, basic C++
- Machine Learning & Deep Learning: TensorFlow, PyTorch, scikit-learn, LangChain, various LLM APIs
- Speech Recognition & Processing: HTK, Kaldi, ESPnet, Transformer-based models
- Experimental Design & Data Analysis: MATLAB, Python, PsychoPy, jsPsych
- Data Visualization: Plotly, basic D3.js, Observable
- Productivity Tools: Notion, Slack, Jira; basic Figma
PATENTS
- System and method for providing AI stress-related psychological examination service. Kang et al. Korea Patent No. 1026989840000. Granted: 2024.08.21
- Method for analyzing visual perception processing ability and spatiotemporal composition ability based on digital figure inspection. Kang et al. Korea Patent No. 1026833910000. Granted: 2024.07.04
- Method for analyzing a user’s emotions from their drawing data. Kang et al. Korea Patent No. 1026736890000. Granted: 2024.06.04
- Method for analyzing the emotions of infants and children based on multiple drawing data. Kang et al. Korea Patent No. 1026736900000. Granted: 2024.06.04
- A method for analyzing a user’s emotions based on the color scheme of user-generated images. Kang et al. Korea Patent No. 1026736860000. Granted: 2024.06.04
- User sentiment analysis method using color analysis module. Kang et al. Korea Patent No. 1026403500000. Granted: 2024.02.20
- Method for developing a Color Emotion Model (CEM). Kang et al. Korea Patent No. 1026552700000. Granted: 2024.04.02
- A Systematic Method of Analyzing Digital Drawings using AI. Kang et al. Korea Patent No. 1025480740000. Granted: 2023.06.22
- Method and System of AI-based Image Color Analysis. Kang et al. Korea Patent No. 1025022100000. Granted: 2023.02.16
- A method for analyzing a user’s emotions using multiple images created by the user. Kang et al. Korea Patent No. 1026736880000. Granted: 2024.06.04
- System for analyzing a user’s emotions from drawing data. Kang et al. Korea Patent No. 1026736910000. Granted: 2024.06.04
- A method for analyzing a user’s emotions by extracting sketch elements, monochromatic and color scheme adjectives, color elements, and emotional elements from an image, and then using these elements as a basis for the analysis. Kang et al. Korea Patent No. 1026736870000. Granted: 2024.06.04
- Method for generating color usage degree, color characteristic distribution maps, and word clouds as result data from color analysis of picture data. Kang et al. Korea Patent No. 1025215920000. Granted: 2023.04.10
- A method for analyzing the user’s personality based on the output data in the form of visual feedback. Kang et al. Korea Patent No. 1025215930000. Granted: 2023.04.10
- A system that analyzes picture colors and analyzes users based on artificial intelligence. Kang et al. Korea Patent No. 1025215940000. Granted: 2023.04.10
- Method for analyzing colors applied to objects in user drawing data. Kang et al. Korea Patent No. 1025215910000. Granted: 2023.04.10
- Method for analyzing a user based on usage information and picture data for each production tool used in the process of creating the picture data. Kang et al. Korea Patent No. 1025555800000. Granted: 2023.07.11
- A picture analysis system that allows users to analyze picture data produced using a digital drawing tool. Kang et al. Korea Patent No. 1025555790000. Granted: 2023.07.11