
Case Study: Utilizing AI Translation Solutions in Global Workshops (Offline)


AI translation solution presented through a real prompter


Executive Summary


This case study presents our in-house AI-based real-time translation solution, developed to overcome communication difficulties in multilingual corporate workshops. The initial model was built on the OpenAI API, but it mistranslated company-specific terminology and employee names. To address this, we integrated our independently developed Context-Aware Knowledge Base, which significantly improved translation accuracy while keeping latency low, enabling seamless two-way communication. This document introduces that successful case.



The Challenge


The number of workshops involving employees from different language backgrounds has increased within the company. However, language barriers hindered idea sharing and caused some participants to feel excluded, posing a significant obstacle to the fundamental goals of the workshops. In offline workshops in particular, understanding speakers' presentations in real time and exchanging immediate two-way feedback proved difficult.


Key Issues


  • Lack of Real-time Communication: Consecutive interpretation interrupted the flow of workshops and reduced participant concentration.


  • Information Distortion: There was a possibility that subtle nuances or core information could be omitted or distorted during the interpretation process. In particular, when company-specific information or employee names were translated incorrectly, the credibility of the translation solution was further compromised.


  • Imbalanced Participation: Employees not proficient in a specific language had limited comprehension, making it difficult for them to access information smoothly.



The Solution


 To address these issues, we built a custom real-time two-way translation system.


Step 1: Initial Model Implementation


First, we integrated OpenAI's language model API with Speech-to-Text (STT) technology. When a speaker spoke Korean into the microphone, the system instantly displayed English text on the screen; when an English speaker spoke, it produced Korean text. This laid the foundation for basic real-time communication.
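
For readers curious about the plumbing, the following is a minimal sketch of such a pipeline. It assumes the official openai Python SDK; the model names, function names, and file path are illustrative assumptions rather than our exact production configuration.

```python
# Minimal sketch of the Step 1 pipeline (STT + translation).
# Model names and file paths are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def transcribe(audio_path: str) -> str:
    """Convert a short speech segment to text (STT)."""
    with open(audio_path, "rb") as f:
        transcript = client.audio.transcriptions.create(model="whisper-1", file=f)
    return transcript.text


def translate(text: str, target_lang: str) -> str:
    """Translate transcribed text into the other language for on-screen display."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": f"Translate the user's message into {target_lang}. "
                        "Return only the translation."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content


# Example: a Korean utterance captured from the microphone buffer
korean_text = transcribe("segment_001.wav")
print(translate(korean_text, "English"))  # shown on the workshop screen
```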


Step 2: Innovative Improvement - Integration of In-house Knowledge Base (The Breakthrough)


However, the initial model showed limitations in recognizing company-specific information. It performed translation, but without understanding the organization's own meaning and context: it sometimes mistranslated company slogans, internal project names, and employee names with similar pronunciations, leading to awkward situations.


To solve this problem, we developed and applied a unique technology called the 'Context-Aware Knowledge Engine'.


  • Knowledge Base Construction: We refined internal data such as employee rosters from the HR system, in-house glossaries, technical documents, and historical slogans to build a knowledge base for AI learning.


  • Knowledge Injection: When a translation request occurs, the AI model does not merely perform a generic translation; it references the established knowledge base in real time. This lets the AI first determine whether a word in context is a company-specific term or a person's name, and then produce the most accurate translation (a simplified sketch follows this list).


  • Latency Optimization: To ensure the flow of conversation is not interrupted, we cached frequently used terms and optimized the model-calling process, reducing response times to the millisecond range.
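
The idea can be illustrated with a short, hedged sketch in Python. It combines the two mechanisms above: knowledge-base entries that appear in the utterance are injected into the prompt, and repeated utterances are served from a small cache. The glossary file name, its structure, and the model name are assumptions for illustration only, not the actual Context-Aware Knowledge Engine.

```python
# Hedged sketch of knowledge injection plus term caching (illustrative only).
import json
from openai import OpenAI

client = OpenAI()

# Knowledge base: company terms and names with their approved renderings,
# e.g. {"이로운": "Rowoon Lee", "Project Nautilus": "프로젝트 노틸러스"}.
with open("knowledge_base.json", encoding="utf-8") as f:
    KNOWLEDGE_BASE: dict[str, str] = json.load(f)

_cache: dict[tuple[str, str], str] = {}  # (source text, target language) -> translation


def translate_with_context(text: str, target_lang: str) -> str:
    key = (text, target_lang)
    if key in _cache:                      # frequently repeated phrases skip the API call
        return _cache[key]

    # Inject only the entries that actually appear in this utterance,
    # so the prompt stays short and latency stays low.
    matched = {src: dst for src, dst in KNOWLEDGE_BASE.items() if src in text}
    glossary_hint = "\n".join(f"- '{src}' must be rendered as '{dst}'"
                              for src, dst in matched.items())

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": f"Translate the user's message into {target_lang}. "
                        "Follow this company glossary strictly:\n"
                        f"{glossary_hint or '- (no special terms in this sentence)'}"},
            {"role": "user", "content": text},
        ],
    )
    result = response.choices[0].message.content
    _cache[key] = result
    return result
```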



Real-World Examples


 The difference before and after integrating the knowledge base was striking.

Employee Name

  • Before knowledge base integration: Korean speaker: "이로운 님의 아이디어입니다." → Screen: "It's an idea from E-Lown." (awkward phonetic rendering)

  • After knowledge base integration: Korean speaker: "이로운 님의 아이디어입니다." → Screen: "It's an idea from Rowoon Lee." (accurate English name)


Company Slogan

  • Before knowledge base integration: English speaker: "Our slogan is 'We love Challenges.'" → Screen: "우리의 슬로건은 '우리는 도전을 사랑한다.'" (awkward literal translation)

  • After knowledge base integration: English speaker: "Our slogan is 'We love Challenges.'" → Screen: "우리의 슬로건은 '우리는 도전을 즐긴다.'" (rendered as the official slogan)


Technical Term

  • Before knowledge base integration: English speaker: "We're launching Project Nautilus." → Screen: "우리는 앵무조개 프로젝트를 시작합니다." (literal translation)

  • After knowledge base integration: English speaker: "We're launching Project Nautilus." → Screen: "우리는 프로젝트 노틸러스를 시작합니다." (correctly recognized as a proper noun)
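
To make these corrections concrete, the knowledge-base entries behind them might look like the following. The structure and mapping format are hypothetical, shown only to illustrate the mechanism; the example values are taken from the cases above.

```python
# Hypothetical knowledge-base entries (illustrative structure, not actual company data).
KNOWLEDGE_BASE = {
    # Employee names: Korean name -> official English name
    "이로운": "Rowoon Lee",
    # Company slogan: map to the officially approved wording in the other language
    "We love Challenges": "우리는 도전을 즐긴다",
    # Internal project names: treat as proper nouns and transliterate, never translate
    "Project Nautilus": "프로젝트 노틸러스",
}
```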



AI Translation Solution as a Multilingual Support Catalyst with Slides


The Results


After the new translation solution was introduced, our workshops changed completely.


  • 95% Improvement in Communication Efficiency: All participants could understand presentation content in real-time and freely exchange opinions without language barriers.


  • Significant Increase in Participant Satisfaction: Employees, especially those less confident in English or Korean, expressed high satisfaction, stating, "It feels as if everyone is speaking the same language".


  • Increased Workshop Productivity: Unnecessary interpretation wait times were eliminated, and active idea exchange led to the successful achievement of the workshops' fundamental goals.


Going beyond mere translation, the solution served as a powerful catalyst for communication, uniting members who speak different languages.



Next Steps


 TecAce, true to its slogan "A company that enjoys challenges", is preparing for the next phase of innovation.


  • On-device LLM Application: We plan to implement lightweight on-device LLMs to significantly reduce server dependency, costs, and latency. This will provide a fast and stable real-time translation experience even in offline environments.


  • Multi-device UX Expansion: We plan to implement an expandable user experience (UX) usable across various devices such as smartphones, smartwatches, and smart glasses. This will allow natural application in more business scenarios, including business trips, field meetings, and remote work, not just workshops.


  • Supervision: We plan to add a supervision module with real-time performance monitoring, error detection, and quality-evaluation functions, so that LLM-based apps and services operate stably and consistently deliver the expected performance. This will enable early detection of, and response to, issues such as degraded translation quality, increased latency, and security risks (a simple sketch of such checks follows below).
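
As a rough illustration of what such a supervision module could track, the sketch below wraps each translation call with latency measurement and a couple of simple quality checks. The thresholds, logger name, and check logic are assumptions made for illustration, not the planned production design.

```python
# Hedged sketch of a supervision wrapper; thresholds and checks are illustrative.
import logging
import time

logger = logging.getLogger("translation.supervision")

LATENCY_BUDGET_S = 1.0  # flag calls slower than this budget


def supervised_translate(translate_fn, text: str, target_lang: str) -> str:
    """Run a translation function while recording latency and basic quality signals."""
    start = time.perf_counter()
    result = translate_fn(text, target_lang)
    latency = time.perf_counter() - start

    logger.info("latency=%.3fs chars_in=%d chars_out=%d", latency, len(text), len(result))

    if latency > LATENCY_BUDGET_S:
        logger.warning("Latency budget exceeded (%.3fs); consider caching or a lighter model.", latency)
    if not result.strip():
        logger.error("Empty translation returned for input: %r", text)
    if len(result) > 4 * max(len(text), 1):
        logger.warning("Output much longer than input; possible hallucinated content.")

    return result
```

In practice, a wrapper like this could sit around the translate_with_context function sketched earlier, e.g. supervised_translate(translate_with_context, text, "English").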


If you are considering implementing an AI translation solution, we invite you to start with TecAce!



