CPU

LeddarTech Collaborates With Arm on Future-Ready Software-Defined Vehicles

Retrieved on: 
Wednesday, March 13, 2024

"We are delighted to announce that LeddarTech and Arm, two industry leaders, are collaborating to optimize and showcase LeddarTech's advanced AI-based sensor fusion and perception software, LeddarVision, using the latest Arm Automotive Enhanced compute technology," stated Frantz Saintellemy, President and CEO of LeddarTech.

Key Points: 
  • "We are delighted to announce that LeddarTech and Arm, two industry leaders, are collaborating to optimize and showcase LeddarTech's advanced AI-based sensor fusion and perception software, LeddarVision, using the latest Arm Automotive Enhanced compute technology," stated Frantz Saintellemy, President and CEO of LeddarTech.
  • "This partnership, leveraging the unique capabilities of both organizations, is poised to enhance CPU capabilities for ADAS, accelerate time-to-market and facilitate the rollout of software-defined vehicles.
  • "Software has become ubiquitous from bumper to bumper in modern-day vehicles, and traditional product development timelines don't allow for innovation to happen at the pace the industry needs," said Suraj Gajendra, Vice-President of Products and Solutions, Automotive Line of Business, Arm.
  • "This collaboration with LeddarTech enables OEMs and Tier 1s to easily leverage Arm's latest generation Automotive Enhanced hardware and software solutions, integrated with LeddarVision, to accelerate the delivery of new AI-enabled ADAS and automated driving capabilities."

ZutaCore's HyperCool Liquid Cooling Technology to Support NVIDIA's Advanced H100 and H200 GPUs for Sustainable AI

Retrieved on: 
Wednesday, March 13, 2024

SAN JOSE, Calif., March 13, 2024 /PRNewswire/ -- ZutaCore®, a leading provider of direct-to-chip, waterless liquid cooling solutions, today announced support for the NVIDIA H100 and H200 Tensor Core GPUs to help data centers maximize AI performance while delivering sustainability. Several leading server manufacturers are engaged with ZutaCore to complete the certification and testing on these GPU platforms. During the GTC 2024 Conference, ZutaCore will be showcasing H100 and H200 waterless dielectric cold plates supporting 1500W and beyond in the Boston Limited, Hyve Solutions, and Pegatron booths.

Key Points: 
  • "Next-generation GPUs have unique cooling requirements that are most effectively solved by waterless, direct-to-chip liquid cooling technology for current GPU of 1500W while increasing rack-processing density by 300%," said Erez Freibach, Co-founder and CEO at ZutaCore.
  • The increasing need for sustainable AI solutions highlights the importance of sustainable practices in data centers.
  • "With the worldwide AI server market expected to reach $49B by 2027, this announcement from ZutaCore supporting next generation GPUs designs is a significant milestone in the industry."
  • ZutaCore technology, including H100 and H200 dielectric cold plates, will be on display in the Boston Limited booth #1621, Pegatron booth #533, and Hyve Solutions booth #1129.

MinIO Introduces Enterprise Object Store with Advanced Features Designed For Exascale Data Infrastructure

Retrieved on: 
Tuesday, March 12, 2024

REDWOOD CITY, Calif., March 12, 2024 /PRNewswire/ -- MinIO, the leader in high-performance object storage for AI, today announced the MinIO Enterprise Object Store. This new product facilitates the creation and management of exabyte-scale data infrastructure for commercial customers. Building on MinIO's technical leadership in object storage, the MinIO Enterprise Object Store is expressly designed for the performance and scale challenges introduced by massive AI workloads.

Key Points: 
  • REDWOOD CITY, Calif., March 12, 2024 /PRNewswire/ -- MinIO, the leader in high-performance object storage for AI, today announced the MinIO Enterprise Object Store.
  • Building on MinIO's technical leadership in object storage, the MinIO Enterprise Object Store is expressly designed for the performance and scale challenges introduced by massive AI workloads.
  • The new features available in the MinIO Enterprise Object Store include the following (a minimal client sketch appears after this list):
    Catalog: The MinIO Enterprise Catalog feature solves the problem of object storage namespace and metadata search.
    Key Management Server: The MinIO Enterprise Key Management Server is a highly available, operationally simple KMS implementation optimized for massive data infrastructure.
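
For orientation, here is a minimal sketch of writing and reading an object through MinIO's standard S3-compatible Python SDK. It does not demonstrate the new Enterprise Catalog or Key Management Server interfaces, which the announcement does not detail; the endpoint, credentials, and bucket and object names below are placeholders.

```python
# Minimal MinIO client sketch (pip install minio). Endpoint, credentials,
# and names are placeholders; this uses the core S3-compatible API only.
from io import BytesIO
from minio import Minio

client = Minio(
    "objectstore.example.com",        # placeholder endpoint for your deployment
    access_key="YOUR-ACCESS-KEY",     # placeholder credentials
    secret_key="YOUR-SECRET-KEY",
    secure=True,
)

bucket = "training-data"              # example bucket name
if not client.bucket_exists(bucket):
    client.make_bucket(bucket)

# Upload a small object from memory, then read it back.
payload = b"example object body"
client.put_object(bucket, "samples/example.txt", BytesIO(payload), length=len(payload))

response = client.get_object(bucket, "samples/example.txt")
print(response.read().decode())
response.close()
response.release_conn()
```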

Akamai and Neural Magic Partner to Accelerate Deep Learning AI

Retrieved on: 
Tuesday, March 12, 2024

CAMBRIDGE, Mass., March 12, 2024 /PRNewswire/ -- Akamai Technologies (NASDAQ: AKAM), the cloud company that powers and protects life online, and Neural Magic, a developer of software that accelerates artificial intelligence (AI) workloads, today announced a strategic partnership intended to supercharge deep learning capabilities on Akamai's distributed computing infrastructure. The combined solution gives enterprises a high-performing platform to run deep learning AI software efficiently on CPU-based servers. As an Akamai Qualified Computing Partner, Neural Magic's software will be made available alongside the products and services that power the world's most distributed platform for cloud computing, security, and content delivery.

Key Points: 
  • The combined solution gives enterprises a high-performing platform to run deep learning AI software efficiently on CPU-based servers.
  • Neural Magic's solution enables deep learning models to run on cost-efficient CPU-based servers rather than on expensive GPU resources (an illustrative CPU-inference sketch follows this list).
  • "Scaling Neural Magic's unique capabilities to run deep learning inference models across Akamai gives organizations access to much-needed cost efficiencies and higher performance as they move swiftly to adopt AI applications."
  • Akamai and Neural Magic share a common origin, both having been born out of Massachusetts Institute of Technology (MIT).
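
As a generic illustration of the pattern described above (deep learning inference on CPU-only servers), here is a short sketch using ONNX Runtime pinned to its CPU execution provider. It is a stand-in only: Neural Magic's DeepSparse runtime is the sparsity-aware engine referenced in the announcement, and the model file and input shape below are assumptions.

```python
# CPU-only inference sketch with ONNX Runtime (pip install onnxruntime numpy).
# "model.onnx" is a hypothetical exported model; the input shape is assumed.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",                           # hypothetical model file
    providers=["CPUExecutionProvider"],     # pin inference to the CPU
)

input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumed image-style input

outputs = session.run(None, {input_name: batch})
print(outputs[0].shape)
```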

2CRSi SA: 2CRSi announces the signing of a memorandum of understanding (MOU) worth more than 12 million USD over 5 years with ICT Security Agency in East Africa

Retrieved on: 
Wednesday, March 13, 2024

Strasbourg (France), 21 February 2024 - 2CRSi (ISIN: FR0013341781), a leader in the design and manufacturing of high-performance, energy-efficient computer servers, announces the signing of a memorandum of understanding worth more than 12 million USD over 5 years with an East African country.

Key Points: 
  • Strasbourg (France), 21 February 2024 - 2CRSi (ISIN: FR0013341781), a leader in the design and manufacturing of high-performance, energy-efficient computer servers, announces the signing of a memorandum of understanding worth more than 12 million USD over 5 years with an East African country.
  • 2CRSi will provide a complete infrastructure (CPU, GPU and data storage) integrating more than 600 servers and network equipment.
  • The memorandum of understanding also provides for 2CRSi to train local teams in the deployment and management of Cloud solutions.
  • Through this protocol, 2CRSi delivers not only a set of hardware, but also skills in the latest Cloud technologies.

NVIDIA and HP Supercharge Data Science and Generative AI on Workstations

Retrieved on: 
Thursday, March 7, 2024

LAS VEGAS, March 07, 2024 (GLOBE NEWSWIRE) -- HP Amplify — NVIDIA and HP Inc. today announced that NVIDIA CUDA-X™ data processing libraries will be integrated with HP AI workstation solutions to turbocharge the data preparation and processing work that forms the foundation of generative AI development.

Key Points: 
  • LAS VEGAS, March 07, 2024 (GLOBE NEWSWIRE) -- HP Amplify — NVIDIA and HP Inc. today announced that NVIDIA CUDA-X™ data processing libraries will be integrated with HP AI workstation solutions to turbocharge the data preparation and processing work that forms the foundation of generative AI development.
  • RAPIDS cuDF and other NVIDIA software will be available as part of Z by HP AI Studio on HP AI workstations to provide a full-stack development solution that speeds data science workflows (a short cuDF sketch follows this list).
  • “Pandas is the essential tool of millions of data scientists processing and preparing data for generative AI,” said Jensen Huang, founder and CEO at NVIDIA.
  • The close collaboration between HP and NVIDIA allows data scientists to streamline development by working on local systems to process even large generative AI workloads.
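
To illustrate the pandas-style workflow referenced above, here is a minimal RAPIDS cuDF sketch: the DataFrame API mirrors pandas, so a typical aggregation ports over almost unchanged. The input file and column names are assumptions, and a supported NVIDIA GPU is required to run it.

```python
# Minimal cuDF sketch: pandas-like groupby aggregation executed on the GPU.
# "events.csv" and the column names are illustrative assumptions.
import cudf

df = cudf.read_csv("events.csv")
summary = (
    df.groupby("user_id")["latency_ms"]   # assumed columns
      .mean()
      .sort_values(ascending=False)
)
print(summary.head())
```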

STMicroelectronics powers up the intelligent edge with second-generation STM32 microprocessors, bringing performance boost and industrial resilience

Retrieved on: 
Thursday, March 7, 2024

ST’s new STM32MP2 MPUs will power the next generations of equipment that create the fabric of this evolving digital world.

Key Points: 
  • ST’s new STM32MP2 MPUs will power the next generations of equipment that create the fabric of this evolving digital world.
  • These include industrial controllers and machine-vision systems, scanners, medical wearables, data aggregators, network gateways, smart appliances, and industrial and domestic robots.
  • The devices are fully supported with STM32 development resources that are familiar to engineers working with ST’s STM32 microcontrollers (MCUs) and STM32 microprocessors.
  • * STM32 is a registered and/or unregistered trademark of STMicroelectronics International NV or its affiliates in the EU and/or elsewhere.

Microchip Launches New dsPIC® DSC-Based Integrated Motor Drivers that Bring Controllers, Gate Drivers and Communications to a Single Device

Retrieved on: 
Monday, February 26, 2024

“By integrating multiple device functions into one chip, the dsPIC DSC-based integrated motor drivers can reduce system-level costs and board space.”

Key Points: 
  • “By integrating multiple device functions into one chip, the dsPIC DSC-based integrated motor drivers can reduce system-level costs and board space.”
  • The integrated motor driver devices can be powered by a single power supply up to 29V (operation) and 40V (transient).
  • Operating between 70 and 100 MHz, the dsPIC DSC-based integrated motor drivers provide high CPU performance and can support efficient deployment of field-oriented control (FOC) and other advanced motor control algorithms (a sketch of the FOC transforms follows this list).
  • To learn more about Microchip’s growing portfolio of integrated motor drivers, visit the dsPIC DSC-based integrated motor drivers webpage.
  • The MCSK includes a dsPIC33CK low-voltage motor control development board, a 24V three-phase BLDC motor, an AC/DC adapter, a USB cable and other accessories.
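
For context on what a field-oriented control loop computes, here is a sketch of the Clarke and Park transforms at its core, written in plain floating-point Python for readability. On a dsPIC DSC the equivalent math would typically be implemented in fixed-point C with Microchip's motor control libraries; the signal names and example values below are illustrative only.

```python
# Clarke/Park transform sketch: the coordinate changes at the heart of FOC.
import math

def clarke(i_a: float, i_b: float) -> tuple[float, float]:
    """Project balanced three-phase currents (i_a + i_b + i_c = 0) onto the alpha/beta frame."""
    i_alpha = i_a
    i_beta = (i_a + 2.0 * i_b) / math.sqrt(3.0)
    return i_alpha, i_beta

def park(i_alpha: float, i_beta: float, theta: float) -> tuple[float, float]:
    """Rotate alpha/beta currents into the rotor-aligned d/q frame at electrical angle theta."""
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

# Example: balanced sinusoidal phase currents sampled at rotor angle theta.
theta = 0.35
i_a = math.cos(theta)
i_b = math.cos(theta - 2.0 * math.pi / 3.0)
print(park(*clarke(i_a, i_b), theta))   # approximately (1.0, 0.0) for this aligned case
```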