AVL DRIVINGCUBE enables the reproducible testing of driver assistance systems and driving features for self-driving vehicles using a real vehicle within a virtual environment in a variety of different traffic situations. For that purpose, test drives are performed with a real, ready-to-drive vehicle on a chassis dynamometer or powertrain testbed.
With the help of realistic virtual driving scenarios it is possible to test peripheral sensors, control systems and actuators inside the vehicle in a fully reproducible and reliable manner. Automated vehicle functions are thus sufficiently validated during development and even before testing on the proving ground.
The range of environment simulations carried out with AVL DRIVINGCUBE can now be extended to include GNSS signals, bringing simulation closer to reality than ever before. The vehicle’s GNSS receiver (e.g. GPS) is stimulated realistically using GNSS signals generated on the testbed. This way, engineers can identify exactly how sensors, automated driving features and other actuators respond inside the vehicle. As a result, GNSS-based vehicle positioning – a core functionality of automated driving – can be tested reliably.
For generating GNSS signals, Rohde & Schwarz GNSS stimulators are used (R&S®SMBV100B or R&S®SMW200A), which allow the generation of signals for all of the available satellite navigation systems (GPS, Glonass, Galileo, BeiDou, QZSS, SBAS) across all frequency bands (L1, L2, L5). This also makes them suitable for testing multi-frequency receivers, which are playing an increasingly important role in automated driving.
“In Rohde & Schwarz, we now have a strong and reliable partner for GNSS stimulation. By generating consistent GNSS signals in connection with environment simulation, AVL DRIVINGCUBE now provides a test system that allows users to validate GNSS-based driver assistance systems and autonomous driving features,” explained Dr.-Ing. Tobias Düser, Head of Advanced Solution Lab at AVL Deutschland GmbH.
Christoph Pointner, Head of Signal Generators at Rohde & Schwarz, added: “We are very pleased to bring our expertise in the field of signal generation to this collaboration with AVL and contribute to such an important innovation and trendsetting solution for testing automated driving features.”
The additional GNSS stimulation not only makes testbed testing more realistic; above all, it is a further step in moving testing from the road to the rig. This significantly reduces the number of test drives required and brings major savings in the kilometres driven.
A flexible system, easily integrated
Rohde & Schwarz GNSS stimulators form a flexible, modular system that can be adapted to your requirements and is easily integrated into the AVL DRIVINGCUBE environment. The stimulator is controlled automatically from the simulation platform.
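To illustrate what automatic control of such an instrument from a simulation platform might look like, here is a minimal sketch of building a remote-control command sequence. The SCPI command strings below are illustrative placeholders, not the documented R&S remote-control syntax; the actual commands would come from the instrument manual, and a real setup would send them over VISA.

```python
# Hypothetical sketch: assembling a (placeholder) SCPI command sequence to
# enable selected satellite systems and frequency bands on a GNSS stimulator.
# Command names are assumptions for illustration only.

def build_gnss_setup(systems, bands):
    """Return a list of remote-control commands enabling the given
    satellite systems and frequency bands, then starting generation."""
    cmds = ["*RST"]  # reset the instrument to a known state
    for name in systems:
        cmds.append(f"SOURce:BB:GNSS:SYSTem:{name}:STATe ON")
    for band in bands:
        cmds.append(f"SOURce:BB:GNSS:RF:BAND:{band}:STATe ON")
    cmds.append("SOURce:BB:GNSS:STATe ON")  # start signal generation
    return cmds

commands = build_gnss_setup(["GPS", "GALileo"], ["L1", "L5"])
print(commands)

# With hardware attached, the commands could be sent via pyvisa, e.g.:
#   import pyvisa
#   inst = pyvisa.ResourceManager().open_resource("TCPIP::<address>::INSTR")
#   for cmd in commands:
#       inst.write(cmd)
```

Keeping the command assembly separate from the I/O layer makes the sequence easy to log and replay from the simulation platform.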
GNSS extensions for AVL DRIVINGCUBE are available with immediate effect.
Operating on NXP’s i.MX 8QuadMax Applications Processor, the toolchain will give automotive manufacturers (OEMs) the ability to tap into Unity’s real-time 3D rendering technology.
The collaboration enables drivers to experience Unity-powered content — from real-time navigation and advanced visualisation for self-driving vehicles to personalised driving features, 3D maps, access to Unity games and more — regardless of the vehicle trim level or cost.
“Every driver wants to stay connected to their devices, but for the value-minded driver that is often not an option unless they are willing to buy a higher-priced trim level or luxury car,” said Tim McDonough, General Manager of Industrial, Unity Technologies. “Unity’s collaboration with NXP provides the opportunity for immersive HMI systems to be placed into all vehicles of all costs, allowing consumers to have a high fidelity and immersive interface so they can play games, connect their smartphone and engage with their vehicle.”
Unity’s real-time 3D platform allows OEMs to shorten timelines for HMI prototype development and production and increases efficiency and speed by offering a best-in-class experience to UI/UX designers, artists, developers, and engineers. The platform also offers a unified HMI toolchain covering the full product cycle of design, prototyping, development and mass product deployment. Customisable content created by Unity’s millions of developers can also be found on the Unity Asset Store. The Unity NXP collaboration makes it easier for OEMs to bring their HMI designs to production with NXP chipsets.
“NXP’s i.MX 8 QuadMax applications processors are powering entirely new, personalized and interactive control hubs that are fueling immersive infotainment experiences of the future,” said Ron Martino, Vice President and General Manager, i.MX applications processors for NXP Semiconductors. “With expanded digital clusters, our innovative infotainment and in-vehicle automotive solutions adjust to driver preferences seamlessly, while advanced HMI support enables voice commands, gestures, augmented reality, and advanced personalization – all with an eye toward driver safety.”
In addition to its collaboration with NXP, Unity works with eight of the 10 largest automotive OEMs in the world by helping improve the way they design, build, service and sell automobiles. Unity continues to invest in its automotive and manufacturing business by bringing in the best talent in 3D, AR and VR as well as automotive experts from companies like BMW, Toyota, Volvo, and the Volkswagen Group.
Through this collaboration, FLIR will integrate a fully physics-based thermal sensor into ANSYS’ leading-edge driving simulator to model, test, and validate thermal camera designs within an ultra-realistic virtual world. The new solution will reduce original equipment manufacturers’ (OEM) development time by optimising thermal camera placement for use with features such as automatic emergency braking (AEB) and pedestrian detection, and within future AVs.
Having the ability to test in virtual environments complements the existing systems available to FLIR customers and partners, including the FLIR automotive development kit (ADK) featuring a FLIR Boson thermal camera, the FLIR starter thermal dataset and the regional, city-specific thermal datasets. The FLIR thermal dataset programs were created for machine learning in advanced driver assistance systems (ADAS), AEB, and AV development.
Current AV and ADAS sensors face challenges in darkness or shadows, sun glare and inclement weather such as fog. Thermal cameras, however, can effectively detect and classify objects in these conditions. Integrating FLIR Systems’ thermal sensor into ANSYS VRXPERIENCE enables simulation of thousands of driving scenarios across millions of miles in mere days. Furthermore, engineers can simulate difficult-to-produce scenarios where thermal provides critical data, including detecting pedestrians in crowded, low-contrast environments.
“By adding ANSYS’ industry-leading simulation solutions to the existing suite of tools for physical testing, engineers, automakers, and automotive suppliers can improve the safety of vehicles in all types of driving conditions,” said Frank Pennisi, President of the Industrial Business Unit at FLIR Systems. “The industry can also recreate corner cases that drivers can see every day but are difficult to replicate in physical environments, paving the way for improved neural networks and the performance of safety features such as AEB.”
“FLIR Systems recognises the limitations of relying solely on gathering machine learning datasets in the physical world to make automotive thermal cameras as safe and reliable as possible for automotive uses,” said Eric Bantegnie, Vice President and General Manager at ANSYS. “Now with ANSYS solutions, FLIR can further empower automakers to speed the creation and certification of assisted-driving systems with thermal cameras.”
In addition to the city-specific data sets, FLIR has more than a decade of experience in the automotive industry. FLIR has provided more than 700,000 thermal sensors as part of its night vision warning systems for a variety of carmakers, including GM, Audi and Mercedes-Benz. Also, FLIR recently announced that its thermal sensor has been selected by Veoneer, a Tier 1 automotive supplier, for its level-four AV production contract with a top global automaker, planned for 2021.
On the section operated by the German Aerospace Center (DLR), cameras will record anonymised data on the driving behaviour of different kinds of road users. Using the stretch of road, which has been financed by the State of Lower Saxony and the DLR, Volkswagen hopes to gather new knowledge for the purposes of assisted driving.
Last year, the DLR installed the data collection technology along the stretch of road, with which all vehicle positions are precisely measured in order to record all traffic events. Volkswagen will use the information to improve assisted driving software. The testing area is an open research and development platform that offers a unique combination of different testing options, such as the simulation of various traffic flows.
Dr. Frank Welsch, Chief Development Officer of Volkswagen Passenger Cars, emphasised the importance of the test track: “In order to research assisted driving, data from standard daily traffic is absolutely necessary. The Lower Saxony testing area allows us not only to collect such data in a completely real-world environment, but also to expand on it using simulations.”
The data will be collected along the stretch of road in anonymised form, such that only so-called trajectories can be evaluated and no data specific to individual vehicles – such as licence plates or drivers’ faces – is collected. Trajectories are the lines and curves that describe only the movements of the vehicles.
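As a minimal sketch of what such an anonymised trajectory might look like as data (the structure and field names here are assumptions for illustration, not DLR's actual format), a track reduces to timestamped positions from which quantities like speed can be derived without any identifying information:

```python
from dataclasses import dataclass, field

@dataclass
class Trajectory:
    """An anonymised vehicle track: timestamped road positions only.
    No licence plate, vehicle ID or image data is stored."""
    points: list = field(default_factory=list)  # (t_seconds, x_m, y_m)

    def add(self, t, x, y):
        self.points.append((t, x, y))

    def average_speed(self):
        """Mean speed in m/s over the track, using straight-line segments."""
        if len(self.points) < 2:
            return 0.0
        dist = 0.0
        for (t0, x0, y0), (t1, x1, y1) in zip(self.points, self.points[1:]):
            dist += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        duration = self.points[-1][0] - self.points[0][0]
        return dist / duration if duration > 0 else 0.0

track = Trajectory()
track.add(0.0, 0.0, 0.0)
track.add(2.0, 50.0, 0.0)     # 50 m covered in 2 s
print(track.average_speed())  # -> 25.0 m/s (90 km/h)
```

Because only geometry and timing survive, such records can be evaluated for traffic research while remaining unlinkable to individual drivers.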
Beyond the data collection technology, pWLAN technology has also been installed for development purposes. It enables direct communication between vehicles and with the traffic infrastructure. This so-called Car2X technology is already standard equipment on the new Golf and will be used in future on the ID.3.
Guests at the opening of the Lower Saxony Testing Area included the Minister for Economic Affairs, Labour, Transport and Digitalisation of Lower Saxony and Deputy Minister-President Bernd Althusmann, the Minister for Science and Culture of Lower Saxony, Björn Thümler, and the member of German Parliament Thomas Jarzombek who serves as the federal government’s coordinator for German Aerospace and Digitalisation.
The DLR project is financed by the Ministry for Economic Affairs, Labour, Transport and Digitalisation of Lower Saxony, as well as the Ministry for Science and Culture of Lower Saxony. Along with Volkswagen, the measurement data from the stretch of the A39 motorway will also be used by Continental AG, Wolfsburg AG, IAV GmbH, OECON GmbH, ADAC Niedersachsen/Sachsen-Anhalt e.V., NordSys GmbH and Siemens AG.
The makeup of the new material is simple. A combination of a lattice structure and plastic film controls air vibrations to limit the transmission of wide frequency band noise (500-1200 hertz), such as road and engine noise.
Currently, most materials used to isolate this frequency band consist mainly of heavy rubber board. Nissan’s new acoustic meta-material weighs one-fourth as much as these while providing the same degree of sound isolation.
Because of its simple structure, the material is roughly as cost-competitive in mass production as current materials, and possibly more so. The material can therefore also be applied to vehicles where the use of sound insulation materials is currently limited due to cost or weight.
Nissan started its research on meta-material technology around 2008. At the time, meta-material was used in high-sensitivity antennas used for electromagnetic wave research. Nissan worked to extend the applicability of meta-material technology to include sound waves, leading to the successful invention of acoustic meta-material.
Making vehicles lighter helps limit the environmental impact of driving by improving energy efficiency. It also enhances enjoyment, as the quiet vehicle cabin makes driving more comfortable.
In the concept vehicle, SkipGen 3.0 is paired with the next-generation cockpit domain controller, SPYDR 3.0. At its core, the single-brain SPYDR 3.0 acts as a hypervisor and is capable of driving up to eleven displays. Both SkipGen 3.0 and SPYDR 3.0 are connected and powered by Panasonic’s proprietary software and cloud platform, OneConnect. In addition to sending and receiving key infotainment messages on the move, this advanced cockpit system can seamlessly run multimedia streaming or gaming applications for passenger and rear-seat entertainment.
“Infotainment will integrate more functions in the cockpit of next generation vehicles. Panasonic’s software and communications cloud platform, OneConnect, acts as the bridge between a multitude of sensors being enabled in each vehicle and the real-time maps and data necessary to improve the accuracy and usability of those sensors,” said Panasonic Automotive CTO, Panasonic Automotive Systems of America, Andrew Poliak. “And that bridge of personalised relevant content will extend to all aspects of commuting, from their personal vehicle, to a shared ride, to their next flight.”
Combining forces, Panasonic Automotive and Karma Automotive, the Southern California-based creator of luxury electric vehicles, integrated a connected system into the Karma SC-1 Vision Concept car. “We are more than just a car company,” said Karma CEO Dr. Lance Zhou. “Karma is a high-tech incubator, and a supplier to others who need our resources to speed their product development. We join like-minded partners such as Panasonic Automotive to showcase their emerging innovations and technologies. Karma’s 2020 Revero GT proudly features Ficosa production antenna, shifter, switches and mirror technology. Ficosa is a Panasonic family company.”
SkipGen 3.0 IVI is Panasonic’s third-generation in-vehicle infotainment system, running on Google’s Android Automotive OS (Android 10) and equipped with Qualcomm’s third-generation Snapdragon 8155/6155 processor. As Google’s reference hardware supplier, Panasonic designs SkipGen 3.0 to deliver the most advanced spectrum of infotainment assistance and entertainment features available; many of these features can be seamlessly controlled and activated by voice.
With SkipGen 3.0’s embedded connectivity and SiriusXM tuner built-in, the system will also proudly support SiriusXM with 360L. SiriusXM’s most advanced audio platform delivers content via both satellite and streaming to give drivers and their passengers more than 200 live SiriusXM channels, the ability to make on-demand programming choices, “For You” recommendations, as well as SiriusXM’s Personalized Stations Powered by Pandora. The SkipGen platform also features SiriusXM’s latest module hardware, X28.
SPYDR 3.0 is the next evolution of Panasonic’s cockpit domain controller. SPYDR 3.0 features 4K display resolution with multimedia streaming and can effectively support up to eleven information or entertainment displays in a vehicle. As such, this platform is capable of driving a variety of HUD, infotainment, rear-seat and passenger-seat displays, all from a single-brain system. Content streaming can range from interactive gaming on today’s most popular systems to streaming video via the owner’s application of choice. The show exhibit will demonstrate a live-play gaming demo.
Panasonic’s OneConnect global platform ensures vehicles are maintained and up-to-date by providing predictive maintenance reminders to the driver, while providing analytics via the platform to the OEM and end consumer. As represented in the concept vehicle, OneConnect analytics can be customised to focus on electric vehicle data to create algorithms that improve battery efficiency to optimise short and long-term state of health of the vehicle and one’s total investment. OneConnect analytics and data can be stored or accessed through SkipGen or SPYDR and transferred between the OEM, the vehicle and the end consumer.
ZF’s coASSIST is the first step into the modular Level 2+ hardware and software suite and highlights ZF’s capabilities as the full system supplier. From 2020, ZF will equip production vehicles with this new ZF system for a major Asian manufacturer.
The company sees so-called Level 2+ semi-automated driving systems as a pragmatic and feasible approach to help enhance safety and comfort in passenger cars. By combining an advanced sensor suite including cameras and radars with a central electronic control unit, functions such as adaptive cruise control, traffic sign recognition, lane change assist, lane keeping assist and traffic jam support are enabled.
ZF’s coASSIST system offers drivers significant comfort and safety benefits at a price well under $1,000, while meeting projected Euro NCAP 2024 test protocols and delivering popular Level 2+ functions. It will be in production with a major Asian manufacturer by the end of 2020 utilising the full ZF system including Mobileye’s EyeQ. Following the initial start of production, the system will also introduce ZF’s first launch of its Gen21 Medium-Range Radar.
“At ZF we believe that Level 2+ systems that meet advanced safety test protocols and help relieve the stress on the driving task will be the primary driver for personal passenger vehicles in the near future and provide a platform to help identify, and then test and validate for unusual global road scenarios,” said Christophe Marnat, Executive Vice President for the ZF Electronics and Advanced Driver Assist Systems division.
Making these systems affordable is also important as it enables more market penetration and brings the system advantages to consumers more rapidly. This will also help them to acclimate to semi-automated functions as the driver is always in the loop. ZF is the system integrator and offers a full range of L2+ systems including ZF coASSIST, ZF coDRIVE and ZF coPILOT.
ZF coASSIST is the cost-effective Level 2+ solution that helps meet Euro NCAP performance requirements while delivering the most popular Level 2+ ADAS functions, utilising EyeQ technology from Mobileye, an Intel company. ZF coDRIVE extends the functionality of traffic jam and highway driving support. 360° surround camera perception and the processing capability of Mobileye’s EyeQ technology enable automated lane changes and automatic overtaking. ZF coPILOT is designed for advanced computing power and processing scalability from Level 2+ up to Level 4 and is jointly developed with NVIDIA. It offers functions like feet-free and hands-free operation, automated lane change and overtaking, automated garage parking and route learning, and utilises ZF’s ProAI controller.
“This full suite of offerings is the most comprehensive in the industry and allows customers to choose the right level of functionality to assist in driving scenarios such as highway driving, traffic jams in urban or highway situations, driver-initiated and automatic lane changes, and automated parking assistance. Following the launch of coASSIST this year, ZF’s full range of Level 2+ systems will be available over the following one to three years,” said Marnat.
The two partners are integrating Sennheiser’s patented AMBEO 3D audio technology with Continental’s Ac2ated Sound system. Continental’s innovative concept abandons conventional speaker technology altogether, exciting select surfaces in the vehicle interior to produce sound.
Combined with Sennheiser’s AMBEO Mobility, the concept achieves a breathtaking 3D sound reproduction that envelops passengers in an incredibly detailed and vivid soundscape and lets them enjoy their in-car entertainment to the fullest. In comparison to conventional audio systems, Ac2ated Sound enables a reduction of weight and space of up to 90%. In this way, the system not only produces the highest audio quality but is also perfectly suited for electric vehicles, where saving space and weight is a high priority.
“For Ac2ated Sound we have brought together the highest levels of expertise in the areas of acoustics, infotainment and vehicle design. In Sennheiser we have found an audio expert who helped us make our pioneering audio system even better,” said Helmut Matschi, member of the Executive Board and Head of the business area Vehicle Networking and Information at Continental. “Together, we have developed an audio system that creates premium sound out of nowhere. Additionally, Ac2ated Sound reduces space and weight. At Continental, we call this sustainability that’s music to your ears.”
“We are delighted to bring our audio expertise and AMBEO Mobility software into the pioneering Ac2ated Sound system from Continental, calibrating and fine-tuning the sound quality to deliver a completely immersive and natural sound experience that opens new audio perspectives and realities,” explained Dr. Andreas Sennheiser, co-CEO of Sennheiser.
Co-CEO Daniel Sennheiser added: “Our AMBEO immersive audio solutions deliver the ultimate quality in sound capture, processing and playback. Crucially, the ability to enjoy breathtaking immersive sound does not require specific 3D audio sources – AMBEO Mobility’s spatialisation algorithm can turn any stereo material into an immersive experience. By intelligently analyzing the content, the patented algorithm artistically remixes the sound to provide an emotional experience, transporting the listener into the music.”
Inspired by the technology of classical string instruments, which use their wooden body as a resonance chamber, specially developed actuators excite specific surfaces in the vehicle interior. The result is an extremely natural sound experience for the occupants, who feel as if they are sitting in a concert hall surrounded by sound. Additionally, in comparison with conventional speaker systems, the audio solution has a much lower weight and significantly reduced box volume.
By comparison, conventional audio systems with their multitude of components weigh up to 40 kilograms (more than 88 pounds). By using already existing surfaces, Ac2ated Sound is distinctly more efficient in saving space, achieving a reduction of between 75 and 90% compared with existing conventional systems on the market.
At the same time, the invisible audio technology gives vehicle designers and manufacturers the freedom to do more with an automobile’s interior, as they no longer need to account for large speaker faces taking up valuable space. With Ac2ated Sound, many components are unnecessary because the surfaces in the vehicle vibrate just like speaker diaphragms. Actuators cause components such as the A-pillar trim, door trim, roof lining and rear shelf to vibrate so that they emit sound in different frequency ranges.
Continental and Sennheiser will present their futuristic audio system in a private exhibit at CES 2020, where visitors can experience how the system responds to the challenges for the next generation of vehicles by reducing weight and saving space without sacrificing sound quality.
The sun causes twice as many car accidents as any other weather-related condition due to temporary blindness. The National Highway Traffic Safety Administration reports thousands of sun-glare-related car accidents each year, and another study indicates the risk of a car crash is 16% higher in bright sunlight than in normal weather. The traditional sun visor is not equipped to adequately address this safety concern: at best, it blocks some of the sun from your eyes, but it blocks part of your view along with it.
Bosch is offering a solution with the revolutionary Virtual Visor, a transparent LCD paired with an intuitive camera, which replaces the traditional vehicle sun visor completely. As the first reimagined visor in nearly a century, Bosch’s technology utilises intelligent algorithms to intuitively block the sun’s glare but not the view of the road ahead.
“For most drivers around the world, the visor component as we know it is not enough to avoid hazardous sun glare – especially at dawn and dusk when the sun can greatly decrease drivers’ vision,” said Dr. Steffen Berns, president of Bosch Car Multimedia. “Some of the simplest innovations make the greatest impact, and Virtual Visor changes the way drivers see the road.”
The Virtual Visor, honoured as a Best of Innovation in the CES 2020 Innovation Awards, will debut at CES 2020 in Las Vegas. The awards competition recognises products across 28 categories; Virtual Visor received Best of Innovation in the In-Vehicle Entertainment & Safety category, earning the highest ratings from a panel of judges that includes designers, engineers and members of the tech media.
Virtual Visor links an LCD panel with a driver- or occupant-monitoring camera to track the shadow the sun casts on the driver’s face. The system uses artificial intelligence to locate the driver within the image from the driver-facing camera. It also utilises AI to determine the landmarks on the face ‒ including where the eyes, nose and mouth are located ‒ so that it can identify shadows on the face. The algorithm analyses the driver’s view, darkening only the section of the display through which light hits the driver’s eyes. The rest of the display remains transparent, no longer obscuring a large section of the driver’s field of vision.
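The geometric core of this idea can be sketched in a few lines. The following is a simplified illustration, not Bosch's implementation: treat the LCD as a grid of cells in a plane, take the eye position (as the camera would estimate it) and the direction of the sun, and darken the cell pierced by the sun ray through the eye. All names, the grid layout and the coordinate convention are assumptions.

```python
# Simplified Virtual Visor geometry: the LCD panel is a grid of square cells
# in the plane z = 0; the driver's eye sits behind it at z < 0. The cell to
# darken is where the eye-to-sun ray crosses the panel plane.

def cell_to_darken(eye, sun_dir, cell_size=0.02, grid=(12, 6)):
    """Return (col, row) of the LCD cell on the eye-to-sun line, or None
    if the glare ray misses the panel.

    eye     -- (x, y, z) eye position in metres, z < 0 behind the panel
    sun_dir -- vector pointing from the eye toward the sun
    """
    ex, ey, ez = eye
    dx, dy, dz = sun_dir
    if dz == 0:
        return None          # ray parallel to the panel plane
    t = -ez / dz             # ray parameter where it crosses z = 0
    if t <= 0:
        return None          # sun is behind the driver: no glare
    px, py = ex + t * dx, ey + t * dy   # intersection point on the panel
    col, row = int(px // cell_size), int(py // cell_size)
    if 0 <= col < grid[0] and 0 <= row < grid[1]:
        return (col, row)
    return None              # glare passes outside the visor area

# Eye 30 cm behind the panel, sun directly ahead through the panel:
print(cell_to_darken((0.05, 0.03, -0.30), (0.0, 0.0, 1.0)))  # -> (2, 1)
```

In the real system the per-frame face-landmark tracking replaces the fixed eye position, and the darkened region would cover a small patch of cells around the intersection rather than a single cell.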
“We discovered early in the development that users adjust their traditional sun visors to always cast a shadow on their own eyes,” said Jason Zink, technical expert for Bosch in North America and one of the co-creators of the Virtual Visor. “This realization was profound in helping simplify the product concept and fuel the design of the technology.”
The creative use of liquid crystal technology to block a specific light source decreases dangerous sun glare, driver discomfort and accident risk; it also increases driver visibility, comfort and safety.
From the original ideation and concept phase to testing and prototyping, Virtual Visor is a bottom-up solution made possible through the innovation culture established at Bosch. Employees are encouraged to apply lean startup methodologies to confirm customer benefits, market potential and feasibility for new ideas, which are then validated by peers and approved for development.
“We’ve built a culture around empowering our associates by putting them in the driver’s seat,” said Mike Mansuetti, President of Bosch in North America. The Virtual Visor was developed by a team in North America as part of Bosch internal innovation activities. “As a leading global technology provider, we understand that innovation can come from any level of an organisation, and we want to see that grow.”