LATEST TECH ARTICLES
![[DeepTecTok #2] Model Improvement through Fine-tuning of Translation-specialized LLM](https://static.wixstatic.com/media/2ea07e_5ee47f346b9a4a04b9ac9897d66d2543~mv2.webp/v1/fill/w_300,h_169,al_c,q_90,enc_avif,quality_auto/2ea07e_5ee47f346b9a4a04b9ac9897d66d2543~mv2.webp)
[DeepTecTok #2] Model Improvement through Fine-tuning of Translation-specialized LLM
The following text has been translated from Korean to English using AssistAce. Sungjin (James) Kim, Ph.D. | LinkedIn | YouTube. Case Study of Translation LLM Fine-Tuning. Introduction: The technology of Large Language Models (LLMs) is advancing rapidly. There are various ways to utilize LLMs, including prompting, embedding, and fine-tuning. In this article, we focus on fine-tuning, which requires a significant amount of GPU computing resources. While it is possible to sec...
May 21, 2024
![[DeepTecTok #1] AI LLM Structure and Use of Transformer](https://static.wixstatic.com/media/2ea07e_c18edb08403046dbac265a3cf45e61d7~mv2.webp/v1/fill/w_300,h_169,al_c,q_90,enc_avif,quality_auto/2ea07e_c18edb08403046dbac265a3cf45e61d7~mv2.webp)
[DeepTecTok #1] AI LLM Structure and Use of Transformer
Sungjin (James) Kim, Ph.D. | LinkedIn. Table of Contents: Introduction; The Structure and Operating Principle of the Transformer (Encoder Operation, Decoder Operation); Types and Applications of Transformer-based LLMs (Encoder-Only Approach, Decoder-Only Approach, Encoder-Decoder Approach); Latest Methods; Implications. Introduction: Generative AI (GenAI), including Large Language Models (LLMs), has seen significant advancements since the introduction of the transformer architecture [1]. Int...
Mar 18, 2024


Closing the Gap: Solutions for the Growing GPU Rich and Poor Divide in AI Technology
Status: In the AI realm, the technological landscape is increasingly divided into two distinct categories: 'GPU rich' and 'GPU poor'. This division, seen predominantly in the AI industry, is creating a significant gap in capabilities and advancement between companies and regions, driven primarily by the unequal distribution and accessibility of Graphics Processing Units (GPUs). Giants like Google, Microsoft, and M...
Dec 11, 2023
![[Case Study] GPU Training Impossible Deadline Achieved](https://static.wixstatic.com/media/2ea07e_55c35d2e8cc14e23b51f0467529db353~mv2.webp/v1/fill/w_300,h_169,al_c,q_90,enc_avif,quality_auto/2ea07e_55c35d2e8cc14e23b51f0467529db353~mv2.webp)
[Case Study] GPU Training Impossible Deadline Achieved
BACKGROUND The customer was looking to start an LLM project but was in a time crunch. The current solution was to purchase Nvidia’s DGX...
Jul 12, 2023