Shanghai ZENTEK Co., Ltd.

ZENTEK News

NVIDIA Jetson Thor [ZENTEK Pre-Sale Opens Simultaneously!] Unleashes Real-Time Inference for General Robotics and Physical AI




This robotics computer, designed for millions of developers worldwide and based on NVIDIA Blackwell, delivers up to 2,070 FP4 TFLOPS of computational performance, efficiently handling complex applications such as embodied AI, high-speed sensor data processing, and general-purpose robotics tasks.







With the launch of the NVIDIA Jetson Thor module for physical AI, robots worldwide are set to achieve a significant leap in intelligence. This new robotics computer will serve as the "brain" for robotic systems in both research and industrial fields.



👉 The NVIDIA Jetson AGX Thor Developer Kit is now open for pre-order. Developers and enterprise users can place priority orders through ZENTEK for early access to this next-generation leap in physical AI compute.



Robots rely on rich sensor data and low-latency AI processing capabilities. Running real-time robotics applications requires powerful AI computing performance and memory to handle concurrent data streams from multiple sensors. The newly released Jetson Thor offers 7.5x the AI computational performance, 3.1x the CPU performance, and 2x the memory capacity compared to its predecessor, NVIDIA Jetson Orin, enabling real-time processing at the edge.
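
As a rough, back-of-envelope cross-check (not part of the original announcement), the claimed uplift is consistent with the published peak figures, assuming the 7.5x comparison is made on peak sparse AI throughput (Thor FP4 vs. Orin INT8):

```python
# Back-of-envelope check of the stated generational uplift.
# Assumption: the 7.5x figure compares peak sparse AI throughput
# (Jetson Thor FP4 vs. Jetson AGX Orin INT8).
thor_peak_fp4_tflops = 2070      # Jetson Thor, per the announcement
claimed_ai_uplift = 7.5          # claimed gain over Jetson AGX Orin

implied_orin_tops = thor_peak_fp4_tflops / claimed_ai_uplift
print(f"Implied Jetson AGX Orin throughput: ~{implied_orin_tops:.0f} TOPS")
# Prints ~276 TOPS, in line with the 275 TOPS (sparse INT8) rating
# published for Jetson AGX Orin 64GB.
```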



This performance leap will empower robotics developers to process high-speed sensor data and perform visual inference at the edge. Previously, such workflows were too slow to operate in dynamic real-world environments. This opens up new possibilities for multimodal AI applications, including humanoid robots.



Leading humanoid robotics company Agility Robotics has already utilized NVIDIA Jetson in its fifth-generation robot Digit and plans to adopt Jetson Thor as the computational core for the sixth-generation Digit. This upgrade will further enhance Digit’s real-time perception and decision-making capabilities, supporting increasingly complex AI functionalities and behavioral demands. Digit is already commercially deployed, performing logistics tasks such as stacking, loading, and palletizing in warehouse and manufacturing environments.



Agility Robotics CEO Peggy Johnson stated: "The powerful edge processing capabilities of Jetson Thor will elevate Digit to a new level—enhancing its real-time responsiveness and expanding its functionality to broader and more complex task areas. With Jetson Thor, we can apply the latest physical AI advancements to optimize operational efficiency in our customers’ warehouses and factories."



Boston Dynamics, with over 30 years of experience in robotics and a track record of developing industry-leading robots, is integrating Jetson Thor into its humanoid robot Atlas. This will equip Atlas with computational capabilities previously only available in servers, enabling accelerated AI workloads at the edge, high-bandwidth data processing, and large memory support.



Beyond humanoid robots, Jetson Thor will accelerate various robotics applications, including surgical assistance robots, intelligent tractors, delivery robots, industrial robotic arms, and visual AI agents. It provides real-time inference capabilities for larger and more complex AI models at the edge.



A Giant Leap in Real-Time Inference for Robotics



Jetson Thor is designed for generative inference models, supporting next-generation physical AI agents. These agents are driven by large transformer models, vision language models (VLMs), and vision-language-action models (VLAs), enabling real-time operation at the edge and minimizing reliance on the cloud.



Optimized through the Jetson software stack, Jetson Thor meets the low-latency and high-performance demands of real-time applications. It supports all major generative AI frameworks and AI inference models, delivering significant real-time performance advantages. These models include general-purpose ones such as Cosmos Reason, DeepSeek, Llama, Gemini, and Qwen, as well as robotics-specific models like Isaac GR00T N1.5, allowing developers to easily experiment with and run inference locally.
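
To illustrate what "running inference locally" can look like in practice, here is a minimal sketch, assuming an OpenAI-compatible inference server is already serving a model on the module itself; the endpoint URL, port, and model name are placeholders for illustration, not products named in the announcement.

```python
from openai import OpenAI

# Point the client at a local, OpenAI-compatible server running on the Jetson
# itself (placeholder URL and model name; adjust to your own deployment).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # placeholder: any locally served model
    messages=[
        {"role": "system", "content": "You are the onboard assistant of a mobile robot."},
        {"role": "user", "content": "Summarize the obstacles detected in the last scan."},
    ],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

Because the model is served on the device, the request never leaves the robot, which is what keeps latency low and reliance on the cloud minimal.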



With multi-sensor input, NVIDIA Jetson Thor further expands its real-time inference capabilities. Through FP4 precision and speculative decoding optimizations, its performance is expected to improve even further.



Leveraging the full lifecycle support of the NVIDIA CUDA ecosystem, future software updates are anticipated to further enhance Jetson Thor’s throughput and responsiveness.



The Jetson Thor module also supports the full NVIDIA AI software stack, accelerating nearly all physical AI workflows. This includes NVIDIA Isaac for robotics, NVIDIA Metropolis for video analytics AI agents, and NVIDIA Holoscan for sensor processing.



Using these software tools, developers can easily build and deploy various applications, such as visual AI agents that analyze real-time camera streams to monitor worker safety, humanoid robots that perform tasks in unstructured environments, and intelligent operating room systems that provide guidance to surgeons based on multi-camera stream data.
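
As one hedged sketch of the "visual AI agent" pattern described above: grab a frame from a camera with OpenCV and ask a locally served vision-language model a safety question about it. The endpoint, model name, and camera index are illustrative assumptions, not specific products from the article.

```python
import base64

import cv2
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

# Grab a single frame from the first attached camera (index 0 is an assumption).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("could not read a frame from the camera")

# JPEG-encode the frame and hand it to a locally served vision-language model.
_, jpeg = cv2.imencode(".jpg", frame)
image_b64 = base64.b64encode(jpeg.tobytes()).decode("ascii")

response = client.chat.completions.create(
    model="local-vlm",  # placeholder name for whatever VLM is deployed on-device
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Is anyone in this frame missing a hard hat or safety vest?"},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```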



Jetson Thor Driving Breakthroughs in Research



Research labs at Stanford University, Carnegie Mellon University, and ETH Zurich are leveraging Jetson Thor to push the boundaries of perception, planning, and navigation models, exploring their potential across various applications.


A research team at Carnegie Mellon University’s Robotics Institute is using NVIDIA Jetson to power autonomous robots that navigate complex unstructured environments and perform tasks such as medical triage and search and rescue.



Sebastian Scherer, Research Associate Professor at the university and head of the AirLab, stated: "Our research outcomes depend on the upper limits of available computing power. Years ago, there was a significant gap between computer vision and robotics because computer vision workloads couldn’t meet the speed requirements for real-time decision-making. But today, models and computational power are robust enough for robots to handle far more intricate tasks than before."



Scherer anticipates that upgrading the team’s existing NVIDIA Jetson AGX Orin systems to the Jetson AGX Thor Developer Kit will not only enhance the performance of AI models—including their award-winning robotic edge perception model MAC-VO—and improve sensor fusion capabilities but also enable experiments with robot swarms.



Unlocking the Full Potential of Jetson Thor



The Jetson Thor product series includes developer kits and production-ready modules. The developer kit contains a Jetson T5000 module, a reference carrier board with rich interfaces, an active cooler with a fan, and a power adapter.




NVIDIA Jetson AGX Thor Developer Kit



The Jetson ecosystem caters to diverse application needs, helping enterprise developers shorten time-to-market. Hardware partners such as Advantech, Aetina, ConnectTech, MiVue, and Tianzhun are developing production-ready Jetson Thor systems with flexible I/O interfaces, customizable configurations, and multiple form factors.



Sensor and actuator companies including Analog Devices, e-con Systems, Infineon, Leopard Imaging, RealSense, and SenseCloud are leveraging the NVIDIA Holoscan Sensor Bridge—a platform that simplifies sensor fusion and data streaming—to transmit sensor data from devices such as cameras, radar, and lidar directly into the GPU memory of Jetson Thor with ultra-low latency.
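
For a flavor of how sensor data moves through this stack, the sketch below uses the Holoscan SDK's Python API to stream frames from a V4L2 camera into an on-GPU visualization operator. It is a minimal illustration only: the device path is an assumption, and the Holoscan Sensor Bridge hardware path itself (Ethernet sensor data delivered into GPU memory) is configured separately and not shown here.

```python
from holoscan.core import Application
from holoscan.operators import HolovizOp, V4L2VideoCaptureOp
from holoscan.resources import UnboundedAllocator


class CameraPreviewApp(Application):
    """Minimal pipeline: V4L2 camera capture -> Holoviz display."""

    def compose(self):
        # Capture frames from a locally attached camera (device path assumed).
        source = V4L2VideoCaptureOp(
            self,
            name="camera",
            allocator=UnboundedAllocator(self, name="camera_pool"),
            device="/dev/video0",
        )
        # Display the frames; in a real robot this stage would typically be an
        # inference operator (e.g. an Isaac or Metropolis model) instead.
        viz = HolovizOp(self, name="viz")
        # Wire the camera's output port to the visualizer's input port.
        self.add_flow(source, viz, {("signal", "receivers")})


if __name__ == "__main__":
    CameraPreviewApp().run()
```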



Currently, thousands of software companies can upgrade their traditional visual AI and robotics applications using multi-AI-agent workflows running on Jetson Thor. Leading companies such as Openzeka, Rebotnix, Solomon, and Vaidio are among the early adopters.



Over 2 million developers worldwide use NVIDIA technologies to accelerate robotics development. Read the technical blog and watch the developer kit tutorial to get started with Jetson Thor now:


https://developer.nvidia.com/blog/introducing-nvidia-jetson-thor-the-ultimate-platform-for-physical-ai/



To experience Jetson Thor firsthand, sign up for the upcoming hackathon hosted by Seeed Studio and LeRobot from Hugging Face.



The NVIDIA Jetson AGX Thor Developer Kit is now available for $3,499. The NVIDIA Jetson T5000 module is also available, priced at $2,999 per unit for orders of 1,000 or more. Purchase now through authorized NVIDIA partners.



NVIDIA also announced that the NVIDIA DRIVE AGX Thor Developer Kit for autonomous vehicles and mobile solutions is now open for pre-orders, with shipments expected to begin in September.


Images or videos related to NVIDIA products (in whole or in part) are copyright © NVIDIA Corporation.