QCar 2
Sensor-rich autonomous vehicle for self-driving applications.
The QCar 2 is the feature vehicle of the Self-Driving Car Studio, an open-architecture, 1/10th scale vehicle designed for
academic self-driving initiatives. Driven by the powerful NVIDIA Jetson AGX Orin and equipped with a comprehensive suite of inertial, visual, and ranging sensors, it is built to take your research, education, and outreach to the next level.
Product Details
Working individually or in a fleet, the QCar 2 is the ideal vehicle for validating concepts related to self-driving stacks, machine vision and learning, traffic and fleet management, platooning, city and highway maneuvering, and many more.
Check out the ACC Self-Driving Car Student Competition for more information on how Quanser and the QCar are empowering the next generation of engineers.
Dimensions | 39 x 19 x 20 cm |
Weight (with batteries) | 2.7 kg |
Power | 3S 11.1V LiPo (3300 mAh) with XT60 connector |
Operation time (approximate) | ~2 hr 11 min (stationary with sensor feedback) and 30 min (driving without sensor feedback) |
Onboard Computer | NVIDIA Jetson AGX Orin: 2.2 GHz 8-core ARM Cortex-A78 64-bit CPU • 930 MHz NVIDIA Ampere architecture GPU with 1792 CUDA and 56 Tensor cores (200 TOPS) • 32 GB 256-bit LPDDR5 memory @ 204.8 GB/s |
Lidar | 16k points, 5-15 Hz scan rate, 0.2-12 m range |
Cameras | Intel RealSense D435 RGBD camera • 360° 2D coverage from 4x CSI cameras with 160° FOV wide-angle lenses, 21 fps to 120 fps |
Encoders | 720-count motor encoder (pre-gearing) with hardware digital tachometer (see the speed-estimation sketch after this table) |
IMU | 6-axis IMU (gyroscope & accelerometer) |
Safety Features | Hardware ‘safe’ shutdown button • auto power-off to protect batteries |
Expandable IO | •2 user PWM output channels
•Motor throttle control •Steering control •2 unipolar user analog input channels, 12-bit, +3.3V •motor current analog inputs •3 encoder channels (motor position plus up to two additional encoders) •11 reconfigurable digital I/O •3 user buttons •2 general purpose 3.3V high-speed serial ports* •1 high-speed 3.3V SPI port (up to 25 MHz)* •1 1.8V I2C port (up to 1 MHz)* •1 3.3V I2C port (up to 1 MHz)* •2 CAN bus interfaces (supporting CAN FD) •1 USB port •1 USB-C host port •1 USB-C DRP * Subject to change |
Connectivity | Wi-Fi 802.11a/b/g/n/ac, 867 Mbps, with dual antennas • 1x HDMI • 1x 10/100/1000 BASE-T Ethernet |
Additional QCar features | Headlamps, brake lights, turn signals, and reverse lights • individually programmable RGB LED strip (33 LEDs) • dual microphones • speaker • 2.7″ 400×240 TFT LCD for diagnostic monitoring |
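To make the encoder specification above concrete, the sketch below shows one way to turn raw counts into a linear speed estimate in Python. Only the 720 counts per motor revolution figure comes from the table; the gear ratio and wheel radius are placeholder assumptions and should be replaced with the values in the QCar 2 documentation.

```python
# Back-of-the-envelope speed estimate from the QCar 2's 720-count motor encoder.
# COUNTS_PER_MOTOR_REV comes from the spec table above; GEAR_RATIO and
# WHEEL_RADIUS_M are placeholder assumptions, not official QCar 2 values.
import math

COUNTS_PER_MOTOR_REV = 720      # encoder counts per motor revolution (pre-gearing)
GEAR_RATIO = 10.0               # assumed motor-to-wheel reduction
WHEEL_RADIUS_M = 0.033          # assumed wheel radius in metres


def wheel_speed(delta_counts: int, dt_s: float) -> float:
    """Estimate linear speed in m/s from a count change observed over dt_s seconds."""
    motor_rev_per_s = delta_counts / COUNTS_PER_MOTOR_REV / dt_s
    wheel_rev_per_s = motor_rev_per_s / GEAR_RATIO
    return 2.0 * math.pi * WHEEL_RADIUS_M * wheel_rev_per_s


if __name__ == "__main__":
    # Example: 1200 counts accumulated over a 10 ms sample period.
    print(f"estimated speed: {wheel_speed(1200, 0.01):.2f} m/s")
```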
The Self-Driving Car Studio is the most academically extensible solution Quanser has created. As a turnkey research studio, it equips researchers with the vehicles, accessories, infrastructure, devices, software, and resources needed to get from a concept to a published paper faster and consistently over generations of projects and students. This capability is enabled first by the powerful and versatile 1/10 scale QCar 2s, which offer an unprecedented level of instrumentation, reconfigurability, and interoperability. To complement the cars, the studio also comes with a rich infrastructure component that includes:
- Programmable traffic lights
- Durable and multipurpose driving map featuring intersections, parking spaces, single- and double-lane roads, and roundabouts
- A collection of scaled accessories such as signs and pylons
- A preconfigured High-Performance PC with three monitors and network infrastructure.
Bridging from research to teaching applications, the studio comes with a one-year subscription to QLabs Virtual QCar 2, which mirrors and enriches the studio experience with a digitally twinned cityscape and a virtual QCar 2 fleet. This environment allows researchers to expand the range of scenarios they can build to validate algorithms and behaviours. By bridging and blending code between the physical and the virtual, researchers can explore new avenues for complex intelligence and behavioural analysis. For teaching, the virtual environment gives each student the opportunity to explore and develop when and where they want before testing and validating on the physical cars. Finally, across both teaching and research, our rich collection of research examples and recommended curriculum for MATLAB/Simulink, Python, and ROS enables adoption without extensive development of the fundamentals. Overall, the Self-Driving Car Studio is a multidisciplinary and multipurpose turnkey laboratory that can accelerate research, diversify teaching, and engage students from recruitment to graduation.
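To illustrate the physical-to-virtual code portability mentioned above, here is a minimal ROS 2 (rclpy) node that subscribes to a lidar scan and logs the nearest return. It is only a sketch: the topic name scan is an assumption, and the actual topic and message configuration depend on how the QCar 2 (or its virtual counterpart) publishes its lidar data.

```python
# Minimal ROS 2 node that reports the nearest lidar return.
# Assumes scans are published as sensor_msgs/LaserScan on "scan";
# the actual topic name on the QCar 2 may differ.
import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class NearestObstacle(Node):
    def __init__(self):
        super().__init__("nearest_obstacle")
        self.create_subscription(LaserScan, "scan", self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        # Ignore invalid returns (inf/NaN or outside the sensor's rated range).
        valid = [r for r in msg.ranges
                 if math.isfinite(r) and msg.range_min <= r <= msg.range_max]
        if valid:
            self.get_logger().info(f"nearest return: {min(valid):.2f} m")


def main():
    rclpy.init()
    node = NearestObstacle()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```

The same node can run unchanged against the physical lidar or a simulated one, provided both publish sensor_msgs/LaserScan on the same topic.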
Software support includes simulation and virtual environments (Gazebo and Quanser Interactive Labs) and Quanser Stream APIs for inter-process communication.
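The Quanser Stream APIs themselves are documented with the product software; as a rough, non-authoritative illustration of the kind of inter-process streaming they provide, the stand-in below uses only Python's standard socket and struct modules to push a length-prefixed frame of range data between two threads over localhost. The port number and message layout are arbitrary assumptions, not the Stream protocol.

```python
# Stand-in for inter-process streaming using only Python's standard library;
# this is NOT the Quanser Stream API, just an illustration of the pattern.
# Port number and message layout (uint32 count + float32 ranges) are assumptions.
import socket
import struct
import threading

HOST, PORT = "127.0.0.1", 18000   # hypothetical local endpoint
ready = threading.Event()


def serve_once():
    """Accept one client and send it a single length-prefixed frame of fake ranges."""
    with socket.create_server((HOST, PORT)) as server:
        ready.set()                                        # listening socket is up
        conn, _ = server.accept()
        with conn:
            ranges = [1.0 + 0.01 * i for i in range(360)]  # fake 360-point scan, metres
            conn.sendall(struct.pack(f"<I{len(ranges)}f", len(ranges), *ranges))


def recv_exact(conn, n):
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed early")
        buf += chunk
    return buf


def read_frame():
    """Connect, read the count prefix, then unpack the float payload."""
    with socket.create_connection((HOST, PORT)) as conn:
        (count,) = struct.unpack("<I", recv_exact(conn, 4))
        return struct.unpack(f"<{count}f", recv_exact(conn, 4 * count))


if __name__ == "__main__":
    threading.Thread(target=serve_once, daemon=True).start()
    ready.wait()
    scan = read_frame()
    print(f"received {len(scan)} ranges, first = {scan[0]:.2f} m")
```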