Samsung Electronics and online platform Naver unveiled plans to jointly develop chips designed to handle large AI workloads, combining Samsung's next-generation memory technologies with Naver's hyperscale AI and compression algorithm expertise.
The companies signed an MoU to collaborate and formed a working-level task force.
Samsung EVP of Memory Global Sales and Marketing Han Jinman (pictured left) stated the companies will develop cutting-edge semiconductors to solve the memory bottleneck in large-scale AI systems.
Chung Suk-Geun, head of Naver’s AI-focused CLOVA unit (pictured right), said the partnership can create a new class of products that can “better tackle the challenges of today’s AI technologies”.
The companies aim to improve large-scale data processing using Samsung’s chip design and manufacturing expertise and Naver’s experience in the development and verification of AI algorithms.
They explained the performance and efficiency limitations of current computing systems create significant challenges in meeting heavy computational requirements, fuelling the need for AI-optimised chips, which in turn requires extensive convergence of the semiconductor and AI disciplines.
Samsung added it will work with Naver to optimise its memory and storage products supporting high-speed data processing to advance large-scale AI systems.
Naver stated it will initially use HyperCLOVA, a hyperscale language model with more than 200 billion parameters, to improve its compression algorithms and create a simplified model that increases computational efficiency.
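To illustrate the kind of compression the article describes, the sketch below shows symmetric int8 post-training quantization, one common technique for shrinking model weights to cut memory use and speed up inference. This is a generic, hypothetical example; Naver's actual HyperCLOVA compression algorithms are not public, and the function names here are illustrative only.

```python
import numpy as np

# Hypothetical illustration of one common model-compression technique:
# symmetric per-tensor int8 quantization of float32 weights.
# (Not Naver's actual algorithm, which has not been disclosed.)

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 values plus a single scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights for computation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 4x smaller than float32; rounding error per weight
# is bounded by half a quantization step (scale / 2).
print(q.dtype, w_hat.dtype)
print(float(np.max(np.abs(w - w_hat))))
```

At the scale of a 200-billion-parameter model, the same idea (applied per layer, often with finer-grained scales) is what turns a model too large for a single device into a "simplified model" that is cheaper to serve.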