Driven-by-QCraft heralds 3rd-Generation Autonomous Driving hardware

QCraft has released the 3rd-generation hardware for its “Driven-by-QCraft” solution. The sensor suite compactly integrates multiple types of advanced high-precision sensors to achieve 360-degree, blind-spot-free perception with solid stability and real-time performance.

As a safety guarantee, every module, including the sensors, computing platform, power system and communication system, is designed with full redundancy.

Most importantly, QCraft will adopt the NVIDIA DRIVE Orin™ system-on-a-chip (SoC) to power the next generation of the Driven-by-QCraft solution. Armed with NVIDIA Orin, QCraft expects to accelerate its L4 autonomous driving solution towards automotive-grade quality and commercialization.

Designed to support multi-vehicle, multi-scenario and multi-city road operations

QCraft currently operates a fleet of around 100 autonomous driving vehicles, all powered by the same hardware solution. To date, the fleet has reached ten cities globally, including Silicon Valley in the U.S. and Beijing, Shenzhen, Suzhou and other major cities in China. The hardware solution has powered ten types of vehicles and copes with all sorts of scenarios, including urban congestion, storms and tunnels.

Longzhou ONE, QCraft’s first robobus for public roads and a pioneering robobus in China, has been operated in six cities (Suzhou, Shenzhen, Beijing, Wuhan, Wuxi and Chongqing), comprising the largest fleet of its kind in the country.

In 2020, QCraft launched China’s first 5G robobus project to achieve regular operation, in Suzhou. One year later, the company is expanding its footprint with the launch of a ride-hailing 5G robobus project in downtown Wuxi. With robobuses deployed on three bus routes totalling 15 kilometres (9.3 miles), QCraft’s initial service connects major shopping centres and subway stations to residential communities within a region of about 10 square kilometres in the busiest area of the city.

“Driven-by-QCraft”, the company’s autonomous driving solution, underpins this rapid, responsive deployment. The solution has two modules: Onboard Software and Onboard Hardware.

The hardware solution has been developed in step with the software it runs. QCraft owns the full onboard software stack, including perception, mapping & localization, planning & decision making, and control. The seamless compatibility of hardware and software enables quick deployment and wide application across different urban scenarios and vehicle models.
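Purely as an illustration of how such a modular stack fits together (not QCraft’s actual software, whose interfaces are not public), the hypothetical Python sketch below runs a single tick of a perception, localization, planning and control loop; every type and function name here is an assumption.

```python
# Illustrative sketch only: one tick of a perception -> localization ->
# planning -> control loop. All types and interfaces are assumptions.
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class Pose:
    x: float
    y: float
    heading_rad: float


def perceive(sensor_frame: Dict[str, Any]) -> List[Dict[str, Any]]:
    """Detect and track traffic participants from the fused sensor frame."""
    return [{"type": "pedestrian", "distance_m": 8.5}]


def localize(sensor_frame: Dict[str, Any]) -> Pose:
    """Estimate the vehicle pose against a prebuilt high-definition map."""
    return Pose(x=0.0, y=0.0, heading_rad=0.0)


def plan(pose: Pose, obstacles: List[Dict[str, Any]]) -> List[Pose]:
    """Choose a short-horizon trajectory that respects the detected obstacles."""
    return [Pose(pose.x + 1.0, pose.y, pose.heading_rad)]


def control(trajectory: List[Pose]) -> Dict[str, float]:
    """Turn the planned trajectory into actuator commands."""
    return {"steer": 0.0, "throttle": 0.1, "brake": 0.0}


frame = {"lidar": [], "cameras": [], "radar": []}
print(control(plan(localize(frame), perceive(frame))))
```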

The widespread adoption of Longzhou vehicles continuously generates massive amounts of data, so the capacity to collect and use that data autonomously and efficiently plays a crucial role in the rapid advancement of autonomous driving technology. The QCraft Dataflow Platform accelerates development by automating large-scale data collection, cleaning and labelling, and by supporting data-driven, simulation-based verification and evaluation across all development stages.
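As a hypothetical sketch only, the Python below shows the kind of collect, clean, label and simulate loop such a platform automates; none of the function names correspond to the QCraft Dataflow Platform’s actual APIs.

```python
# Hypothetical sketch of the automated data pipeline described above.
# None of these functions correspond to QCraft's actual APIs.
from typing import Dict, List


def collect(fleet_logs: List[str]) -> List[Dict]:
    """Pull raw drive logs uploaded by the fleet."""
    return [{"log": path, "frames": []} for path in fleet_logs]


def clean(raw: List[Dict]) -> List[Dict]:
    """Drop corrupted or duplicate records before labelling."""
    return [record for record in raw if record.get("log")]


def auto_label(records: List[Dict]) -> List[Dict]:
    """Pre-label objects with offline models, to be spot-checked by humans."""
    for record in records:
        record["labels"] = []  # placeholder for model-generated labels
    return records


def evaluate_in_simulation(labelled: List[Dict], build: str) -> Dict:
    """Replay labelled scenarios against a candidate software build."""
    return {"build": build, "scenarios": len(labelled), "pass_rate": 1.0}


data = auto_label(clean(collect(["drive_0001.bag", "drive_0002.bag"])))
print(evaluate_in_simulation(data, build="candidate-build"))
```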

Multi-sensor fusion: 360-degree blind-spot-free perception

To perceive traffic participants more reliably, QCraft has built a multi-sensor fusion system that achieves 360-degree perception without blind spots. Thanks to its modular design, the suite can be easily deployed and upgraded; it comprises two long-range measurement lidars (main lidars), three short-range blind-spot-filling lidars, four millimetre-wave radars, nine cameras and one IMU set.
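For reference, the sensor counts above can be summarised in a simple configuration structure; the sketch below is illustrative only and its field names are assumptions, but the quantities match those listed in this article.

```python
from dataclasses import dataclass


@dataclass
class SensorSuite:
    """Sensor counts as listed for the 3rd-generation Driven-by-QCraft suite."""
    main_lidars: int = 2         # long-range measurement lidars
    blind_spot_lidars: int = 3   # short-range blind-spot-filling lidars
    mmwave_radars: int = 4       # millimetre-wave radars
    cameras: int = 9
    imu_sets: int = 1

    def total_sensors(self) -> int:
        return (self.main_lidars + self.blind_spot_lidars
                + self.mmwave_radars + self.cameras + self.imu_sets)


print(SensorSuite().total_sensors())  # 19 sensing units in total
```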

  • 360-degree blind-spot-free perception: Traditional sensor solutions are prone to blind spots, which can be dangerous for high-speed cars and large buses. QCraft has launched the first 360-degree blind-spot-free sensor solution in China, solving the problem of dead angles around the vehicle for the first time. The sensors are mutually redundant and cover areas even less than 10 cm from the vehicle.
  • Left-right mutual redundancy of the sensor suite: Unlike a cell phone, TV or computer, a malfunctioning autonomous car can cause fatal accidents. With this in mind, QCraft has built “multi-insurance” into the sensor suite through redundancy. With three groups of sensors, even if one or two of them fail, the autonomous driving system can still keep the perception module operating normally and bring the vehicle to a safe stop (see the sketch after this list).
  • High-synchronization lidar solution: The sensors are installed in three groups, and the lidars in each group always rotate simultaneously in the same direction, giving a very high degree of synchronization. This avoids dislocation and ghosting in the point cloud when dynamic objects are nearby, and ensures that all point cloud data can be collected and processed at the same time, maximizing the use of all available information.
  • Intelligent environment adaptation for cameras: Through advanced software algorithms, the system handles both overexposure and underexposure under different light conditions and corrects the smearing caused by motion blur while driving. A camera specially designed for traffic lights can accurately identify their shape and colour from 150 metres away at night. In addition, the high-resolution cameras meet vehicle manufacturing standards for extreme environments and can operate in temperatures ranging from -40 to 125°C.
  • Minimization of camera blind spots: Using seven surround-view 5-megapixel cameras, QCraft expands the vertical perception range by rotating the cameras 90 degrees, reducing camera blind spots by more than 90%. The cameras can distinguish small objects at close range, such as traffic cones and children. The cameras’ line-by-line exposure direction is also kept consistent with the lidars’ scanning direction, which improves the early fusion of camera and lidar data.
  • Sensor self-cleaning: Outdoor temperature fluctuations and rainy weather can condense water on the camera lens, blurring the image. QCraft’s sensors have a self-cleaning function that automatically removes water mist, dust and other dirt.
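As referenced in the mutual-redundancy bullet above, the following minimal sketch illustrates how a perception module might degrade gracefully when one or two of three mutually redundant sensor groups fail; the grouping and decision logic are assumptions for illustration, not QCraft’s implementation.

```python
# Illustrative sketch only: graceful degradation across three mutually
# redundant sensor groups. Group names and decision logic are assumptions.
from typing import Dict

SENSOR_GROUPS = ("group_a", "group_b", "group_c")  # hypothetical grouping


def perception_mode(group_health: Dict[str, bool]) -> str:
    """Decide how the perception module should behave given group health."""
    healthy = sum(1 for ok in group_health.values() if ok)
    if healthy == len(SENSOR_GROUPS):
        return "normal_operation"
    if healthy >= 1:
        # The remaining group(s) keep the perception module running while the
        # vehicle is brought to a safe stop, as described in the article.
        return "degraded_operation_and_safe_stop"
    return "emergency_brake"


print(perception_mode({"group_a": True, "group_b": False, "group_c": True}))
# -> degraded_operation_and_safe_stop
```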

Computing platform: Three levels of mutual redundancy design

The computing platform of the Driven-by-QCraft solution includes a central computing unit, a backup computing unit and an on-board computing unit. Under normal circumstances, the central computing unit runs the autonomous driving software. If it fails for any reason, the backup computing unit immediately takes over vehicle control and determines the vehicle’s movement.

This redundancy design allows the vehicle’s protection mechanisms to pull it over to the side of the road or bring it to a stop in an emergency.
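A minimal sketch of this kind of fail-over logic, assuming a simple heartbeat watchdog (the class, timeout and mode names are illustrative assumptions, not QCraft’s design):

```python
# Illustrative sketch only: heartbeat-based fail-over from the central unit to
# the backup unit, then to a minimal-risk manoeuvre. Names and the timeout are
# assumptions, not QCraft's design.
import time


class ComputeFailover:
    def __init__(self, heartbeat_timeout_s: float = 0.2):
        self.heartbeat_timeout_s = heartbeat_timeout_s
        self.last_central_heartbeat = time.monotonic()

    def central_alive(self) -> bool:
        return (time.monotonic() - self.last_central_heartbeat) < self.heartbeat_timeout_s

    def select_controller(self, backup_alive: bool) -> str:
        if self.central_alive():
            return "central_unit"        # normal case: the central unit drives
        if backup_alive:
            return "backup_unit"         # backup takes over vehicle control
        return "minimal_risk_manoeuvre"  # pull over or brake via base protections


failover = ComputeFailover()
failover.last_central_heartbeat -= 1.0   # simulate a missed heartbeat window
print(failover.select_controller(backup_alive=True))  # -> backup_unit
```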

Power system: Comprehensive power-path layering and protection design

With layered power-path management, power can be dynamically allocated according to real-time weather and road conditions, giving priority to the core function modules and regulating the power supply of auxiliary modules. This kind of management helps to:

  • extend the vehicle’s operating mileage;
  • effectively identify and isolate faulty units within the system, avoiding fault cascades, protecting core functional modules from random failures, and reducing operation and maintenance costs.

Combined with the redundant power supply and sensor suite, the power system maintains the minimum set of subsystems needed to keep the vehicle safe to drive, even when one or several core functional modules are accidentally damaged.
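As an illustrative sketch only, the Python below shows one way priority-based power allocation with fault isolation could work; the module names, priorities and power budget are assumptions, not QCraft’s actual power-path design.

```python
# Illustrative sketch only: priority-based power allocation with fault
# isolation. Module names, priorities and the budget are assumptions.
from typing import Dict, List, Tuple

# (module, priority, draw_in_watts); a lower priority number is more critical.
MODULES: List[Tuple[str, int, float]] = [
    ("main_lidars", 0, 60.0),
    ("central_compute", 0, 120.0),
    ("cameras", 1, 30.0),
    ("cabin_display", 2, 40.0),
    ("seat_heating", 3, 80.0),
]


def allocate(budget_w: float, faulty: set) -> Dict[str, float]:
    """Grant power to healthy modules in priority order until the budget runs out."""
    allocation: Dict[str, float] = {}
    for name, _priority, draw in sorted(MODULES, key=lambda m: m[1]):
        if name in faulty:
            continue  # isolate faulty units so failures do not cascade
        if draw <= budget_w:
            allocation[name] = draw
            budget_w -= draw
    return allocation


print(allocate(budget_w=260.0, faulty={"cabin_display"}))
# -> {'main_lidars': 60.0, 'central_compute': 120.0, 'cameras': 30.0}
```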

Multi-sensor fusion suite: Suitable for different models, and cost-controllable

The hardware solution introduced by QCraft is available for different vehicle models and is the very first hardware solution that can be used for both robotaxis and robobuses. Because a single hardware solution can be applied across vehicle models and cities, data from the whole fleet can be integrated, making it much easier to close the data loop. With tremendous amounts of data from this universal hardware, QCraft can accelerate software iteration and continuously upgrade all vehicles in the fleet over the air (OTA).

Furthermore, the hardware solution can be configured to the needs of diverse scenarios. According to QCraft’s forecast, the cost of a sensor suite will drop below 100,000 yuan within the next two to three years.

Computing platform: Aiming to achieve automotive grade

QCraft has also announced the adoption of NVIDIA’s Orin in its latest-generation hardware.

Among current automotive AI chips, NVIDIA’s Orin X chip sits at the top of the pyramid and has been called “the strongest AI chip for intelligent driving on earth.” With its computing power, the Orin X chip can perform massive concurrent operations with excellence and support the complex deep neural networks that process the data generated by the autonomous driving system to make decisions.

In addition, NVIDIA’s Orin X chip is automotive grade and complies with the ASIL-D standard of ISO 26262 at the system level, which is very important for application scenarios with strict safety requirements.

Computing platforms based on Orin chips achieve a power efficiency of 2 to 3 TOPS per watt. This efficiency helps enable the large-scale implementation of L4 autonomous vehicles.
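As a rough back-of-the-envelope check, assuming NVIDIA’s published peak figure of 254 INT8 TOPS for a single DRIVE Orin SoC, an efficiency of 2 to 3 TOPS per watt implies a power draw in the region of 85 to 127 W per SoC:

```python
# Rough worked example: implied per-SoC power draw at the quoted efficiency.
# 254 INT8 TOPS is NVIDIA's published peak figure for one DRIVE Orin SoC.
peak_tops = 254
for efficiency_tops_per_watt in (2.0, 3.0):
    watts = peak_tops / efficiency_tops_per_watt
    print(f"{efficiency_tops_per_watt} TOPS/W -> about {watts:.0f} W per SoC")
```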

With the gradual commercialization of advanced driver assistance systems, the affordable price of NVIDIA’s Orin X chip brings great benefits by reducing the overall cost of the entire autonomous driving solution.

The custom-on-demand “Driven-by-QCraft” solution targets the efficient deployment of autonomous vehicles across a wide range of settings. The Longzhou autonomous vehicles rolled out by QCraft can be applied to many scenarios, including online ride-hailing, buses and shuttles.

As intelligent, connected vehicles evolve away from purely private ownership towards shared mobility, robotaxis are moving towards new formats that offer larger space and greener mobility services. To create a more forward-looking robotaxi, QCraft started from the robobus, exploring forms better suited to shared mobility.

Moreover, QCraft also provides customers with a toolchain for autonomous driving research and development. It supports clients’ data-driven algorithm development, letting them quickly improve their autonomous driving systems based on their own closed data loop.

Looking to the future, QCraft plans to apply “Driven-by-QCraft” to more smart transport scenarios, alongside more Longzhou models and continued technological advancement, with the ultimate aim of making autonomous driving a reality.

Post source: QCraft

