NEUCHIPS' Purpose-Built Accelerator Designed to Be Industry's Most Efficient Recommendation Inference Engine

LOS ALTOS, Calif., May 31, 2022 (GLOBE NEWSWIRE) -- NEUCHIPS is excited to announce its first ASIC, RecAccel™ N3000, built on TSMC's 7 nm process and designed specifically for accelerating deep learning recommendation models (DLRM). NEUCHIPS has partnered with industry leaders in Taiwan's semiconductor and cloud server ecosystem and plans to deliver its RecAccel™ N3000 AI inference platform on Dual M.2 modules for Open Compute Platform-compliant servers, as well as PCIe Gen 5 cards for standard data center servers, in 2H 2022.

"In 2019, when Facebook open-sourced their Deep Learning Recommendation Model and challenged the industry to deliver a balanced AI inference chip platform, we decided to pursue the challenge," said Dr. Lin, NEUCHIPS CEO, co-founder of Global Unichip Corp (a TSMC subsidiary), and professor at National Tsing Hua University, Taiwan. "Our continued improvements in MLPerf DLRM benchmarking and whole-chip emulation give us confidence that our RecAccel™ AI hardware architecture, co-designed with our software, will scale to deliver industry leadership and exceed our target of 20M inferences per second at 20 Watts."

The NEUCHIPS RecAccel™ N3000 inference platform includes sophisticated hardwired accelerators, patented query scheduling, and a comprehensive software stack optimized for high accuracy and hardware utilization while maintaining the energy efficiency required in data centers. Other key features include the following:

  •  Proprietary 8-bit coefficient quantization, calibration, and hardware support that deliver 99.95% of FP32 accuracy.
  •  Patented embedding engine with novel cache design and DRAM traffic optimization that reduces LPDDR5 access by 50% and increases bandwidth utilization by 30%.
  •  Dedicated MLP compute engines that deliver state-of-the-art energy efficiency at the engine level, and 1 microjoule per inference at the SoC level.
  •  Proven software stack that delivers very high scalability across multiple cards.
  •  Support for leading recommender AI models including DLRM, WND, DCN, and NCF.
  •  Robust security based on a hardware root of trust.
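NEUCHIPS does not disclose the details of its proprietary quantization scheme. As a rough illustration of the general idea behind 8-bit coefficient quantization with calibration, the sketch below shows generic symmetric per-tensor int8 quantization: a calibration pass picks a scale from the observed weight range, weights are rounded into int8, and the reconstruction error against FP32 can then be measured. All function names here are illustrative, not NEUCHIPS APIs.

```python
import numpy as np

def calibrate_scale(weights: np.ndarray) -> float:
    """Symmetric per-tensor calibration: map the largest magnitude to the int8 range."""
    return float(np.max(np.abs(weights))) / 127.0

def quantize_int8(weights: np.ndarray, scale: float) -> np.ndarray:
    """Round FP32 weights to int8 using the calibrated scale."""
    return np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int8 codes back to FP32 for accuracy comparison."""
    return q.astype(np.float32) * scale

# Measure how much accuracy a round-trip through int8 costs on random weights.
rng = np.random.default_rng(0)
w = rng.standard_normal(10_000).astype(np.float32)
scale = calibrate_scale(w)
w_hat = dequantize(quantize_int8(w, scale), scale)
err = np.abs(w - w_hat).mean() / np.abs(w).mean()
print(f"mean relative quantization error: {err:.4f}")
```

Real deployments refine this with per-channel scales, calibration over activation statistics, and hardware-aware rounding, which is presumably where vendor-specific schemes earn accuracy figures like 99.95% of FP32.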

About NEUCHIPS:

NEUCHIPS develops purpose-built AI inference chip platforms from the ground up, co-developing hardware and software to meet customers' requirements for performance, accuracy, power, and cost-efficiency. NEUCHIPS is a founding member of MLCommons™. For more information, please visit https://www.neuchips.ai or contact contact@neuchips.ai.

Image: NEUCHIPS INC. logo
This content was issued through the press release distribution service at Newswire.com.
