⏱️ 09/27 (Fri.) 13:30-14:00 at R2 - 2nd Conference Room
Generative AI has surged in popularity this year, and major software and hardware companies have invested heavily in it. This trend will soon extend to mobile and edge devices, such as PCs, mobile phones, AR/VR headsets, and home products.
Beyond plain LLM inference, Vision LLMs combined with 4K Vision AI will also become mainstream. Running Vision LLM inference on mobile or edge devices while shrinking the form factor of the end product raises not only the known problems of memory configuration, computing power, and memory bandwidth; solving the overheating problem that comes with advanced packaging such as CoWoS and WoW (wafer-on-wafer) stacking will be an even more severe challenge.
The Inventec Magna™ NPU IP series provides ultra-low-power solutions with the highest computing density for Vision LLM, systematically addressing the problems above.
Graduated from the Institute of Electrical Engineering, National Chiao Tung University, where his research topic was the application of the Cerebellar Model Articulation Controller (CMAC) neural network to robot-arm control. He is currently head of the AI NPU IP design team at the Inventec AI Research Center. His main expertise lies in digital signal processing IP, image processing IP, and neural network processor IP, and he has conducted in-depth research on ultra-low-energy design for large language models (LLMs).