QCar 2 Product Data Sheet


QCAR 2

Sensor-rich autonomous vehicle for self-driving applications


The QCar 2 is the feature vehicle of the Self-Driving Car Studio, an open-architecture, 1/10th scale vehicle designed
for academic self-driving initiatives. Powered by the uncompromising NVIDIA Jetson Orin AGX and equipped with a variety of
inertial, visual and ranging sensors, it is ready to take your research, education and outreach to the next level.
Working individually or in a fleet, the QCar 2 is the ideal vehicle for validating concepts related to self-driving stacks,
machine vision and learning, traffic and fleet management, platooning, city and highway maneuvering, and more.

Features

Product may not appear exactly as shown.

Performant
Powered by the NVIDIA Jetson Orin AGX running Ubuntu and supporting the latest JetPack.

Reliable
Robust, student-proof, 1/10th scale mechanical design.

Open Software Architecture
Design and deploy applications using Simulink, Python, C/C++, ROS 2, TensorFlow, and more (see the Python/ROS 2 sketch below).

Expandable
Relevant inertial, visual and ranging sensors for self-driving, plus a variety of expansion IO & USB
ports for customization.
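
As a sketch of the open software architecture, the example below shows what a minimal on-vehicle ROS 2 node written in Python might look like: it slows the car to a stop when the 2D lidar reports an obstacle directly ahead. The /scan and /cmd_vel topic names, the forward-beam index, and the 0.5 m threshold are illustrative assumptions rather than QCar 2 defaults; the actual topics depend on the drivers and launch files in use.

```python
# Minimal ROS 2 node sketch (Python/rclpy). Topic names and thresholds are
# assumptions for illustration; they are not QCar 2 defaults.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class SimpleStop(Node):
    """Creep forward, but stop when the lidar sees an obstacle straight ahead."""

    def __init__(self):
        super().__init__('simple_stop')
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)   # assumed topic
        self.scan_sub = self.create_subscription(
            LaserScan, '/scan', self.on_scan, 10)                     # assumed topic

    def on_scan(self, scan: LaserScan) -> None:
        if not scan.ranges:
            return
        ahead = scan.ranges[0]          # assume index 0 is the forward beam
        cmd = Twist()
        cmd.linear.x = 0.0 if ahead < 0.5 else 0.3   # stop within 0.5 m, else creep
        self.cmd_pub.publish(cmd)


def main() -> None:
    rclpy.init()
    node = SimpleStop()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```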

Research Studio

The Self-Driving Car Studio comes with everything you need to jumpstart your research.

Vehicles
• QCar 2*
• QCar*
(single vehicle or fleet)

Ground Control Station
• High-performance computer with RTX graphics card with Tensor AI cores
• Three monitors
• High-performance router
• Wireless gamepad
• QUARC Complete license

Studio Space
• Driving map featuring intersections, parking spaces, single & double lane roads and roundabouts
• Supporting infrastructure including traffic lights, signs and cones

* Subject to change

Product Details

Labeled callouts on the product diagram:
• Slamtec 2D RPLidar A2M12
• Intel RealSense D435
• NVIDIA Jetson Orin AGX
• WiFi & Gigabit connectivity
• 2.7" LCD TFT 400x240
• USB & HDMI connectivity options
• Headlamps, turn signals
• User buttons (not visible)
• 6-axis IMU (not visible)
• 21W drivetrain with 720 count encoder & tach
• LED strips
• 360° 2D CSI cameras

Device Specifications

Dimensions: 39 x 19 x 20 cm

Weight (with batteries): 2.7 kg

Power: 3S 11.1 V LiPo (3300 mAh) with XT60 connector

Operation time (approximate): ~2 hours 11 min (stationary w/ sensor feedback) & 30 min (driving w/o sensor feedback)

Onboard computer: NVIDIA Jetson Orin AGX
• GPU: NVIDIA Ampere architecture, 1792 CUDA cores / 56 Tensor cores @ 930 MHz, 200 TOPS
• CPU: 2.2 GHz 8-core ARM Cortex-A78, 64-bit
• Memory: 32 GB 256-bit LPDDR5 @ 204.8 GB/s

Lidar: 2D LIDAR with 16k points, 5-15 Hz scan rate, 0.2-12 m range

Cameras:
• Intel RealSense D435 RGBD camera
• 360° 2D CSI cameras using 4x 160° FOV wide-angle lenses, 21 fps to 120 fps

Encoders: 720-count motor encoder (pre-gearing) with hardware digital tachometer

IMU: 6-axis IMU (gyroscope & accelerometer)

Safety features: Hardware 'safe' shutdown button; auto power-off to protect batteries

Expandable IO:
• 2 user PWM output channels
• Motor throttle control
• Steering control
• 2 unipolar user analog input channels, 12-bit, +3.3V
• Motor current analog inputs
• 3 encoder channels (motor position plus up to two additional encoders)
• 1 reconfigurable digital I/O
• 3 user buttons
• 2 general purpose 3.3V high-speed serial ports*
• 1 high-speed 3.3V SPI port (up to 25 MHz)*
• 1 1.8V I2C port (up to 1 MHz)*
• 1 3.3V I2C port (up to 1 MHz)*
• 2 CAN bus interfaces (supporting CAN FD)
• 1 USB port
• 1 USB-C host port
• 1 USB-C DRP

Connectivity:
• Wi-Fi 802.11a/b/g/n/ac, 867 Mbps, with dual antennas
• 1x HDMI
• 1x 10/100/1000 BASE-T Ethernet

Additional QCar features:
• Headlamps, brake lights, turn signals and reverse lights
• Individually programmable RGB LED strip (33x LEDs)
• 2.7" LCD TFT 400x240 for diagnostic monitoring
• Dual microphones
• Speaker

Supported Software and APIs:
• QUARC for Simulink®
• Quanser APIs
• TensorFlow
• Python™ 2.7 / 3 & ROS 2
• CUDA®
• cuDNN
• TensorRT
• OpenCV
• VisionWorks®
• VPI™
• GStreamer
• Jetson Multimedia APIs
• Docker containers with GPU support
• Simulink® with Simulink Coder
• Simulation and virtual training environments (Gazebo and Quanser Interactive Labs)
• Multi-language development supported with Quanser Stream APIs for inter-process communication

* Subject to change
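
Since the specifications list 360° 2D CSI cameras alongside GStreamer and OpenCV support, the sketch below illustrates one common way to grab frames from a CSI camera on a Jetson platform: an OpenCV capture backed by a GStreamer pipeline built around nvarguscamerasrc. Whether the QCar 2 cameras are exposed through this element or through the Quanser APIs depends on the installed drivers, and the sensor-id, resolution and frame rate shown are placeholders rather than QCar 2 specifics.

```python
# Hedged sketch: reading a CSI camera through GStreamer in OpenCV on a Jetson.
# Assumes the camera is exposed via the standard nvarguscamerasrc element and
# that OpenCV was built with GStreamer support; numeric values are placeholders.
import cv2

PIPELINE = (
    "nvarguscamerasrc sensor-id=0 ! "
    "video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! "
    "nvvidconv ! video/x-raw, format=BGRx ! "
    "videoconvert ! video/x-raw, format=BGR ! "
    "appsink drop=true"
)


def main() -> None:
    cap = cv2.VideoCapture(PIPELINE, cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError("could not open CSI camera via GStreamer pipeline")
    try:
        for _ in range(100):                     # grab a short burst of frames
            ok, frame = cap.read()
            if not ok:
                break
            print("frame shape:", frame.shape)   # e.g. (720, 1280, 3), BGR order
    finally:
        cap.release()


if __name__ == "__main__":
    main()
```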

About Quanser:
For 30 years, Quanser has been the world leader in innovative technology for engineering education and research. With roots in control, mechatronics, and robotics,
Quanser has advanced to the forefront of the global movement in engineering education transformation in the face of unprecedented opportunities and challenges
triggered by autonomous robotics, IoT, Industry 4.0, and cyber-physical systems.
Products and/or services pictured and referred to herein and their accompanying specifications may be subject to change without notice. Products and/or services mentioned herein are
trademarks or registered trademarks of Quanser Inc. and/or its affiliates. ©2024 Quanser Inc. All rights reserved.
