Modern vehicles are equipped with a broad range of different sensor types to provide good overall local perception. This local perception is the basis for Advanced Driver Assistance Systems (ADAS), which increase safety or provide extra comfort to the driver. Sensor-based perception, however, is subject to certain limitations. The limited range up to which a sensor can provide reliable information and environmental influences such as weather are just two examples of limiting factors that have to be taken into account.
Horizon.KOM and RemoteHorizon.KOM, described in the following, are approaches to provide global vehicular perception based on different data sources, offering additional types of information with a wider range of view and without limiting factors such as environmental influences. Prototype implementations are available and can be downloaded under the Apache License (Version 2.0).
In the field of Advanced Driver Assistance Systems (ADAS), the aggregation of contextual information consistently gains significance. Combining various information from a broad range of sources allows developers to design new and more beneficial applications. To gather contextual information, vehicles are equipped with several types of sensors, providing good overall local perception. ADAS applications can harness information from a single sensor or use sensor fusion, combining multiple different sensor types, to provide a safety or comfort functionality. Some manufacturers offer vehicle configurations with full 360° sensor coverage.

The long-range radar, typically located at the front or rear of a vehicle, is often used to monitor other road participants while driving. An example application is Adaptive Cruise Control, which adjusts the speed to maintain a predefined distance to the vehicle in front. Short- or medium-range radar typically covers a wider but shorter area. An example application is the Rear Collision Warning system, which observes road participants in the near rear field of the vehicle. Lidar uses multiple laser beams to scan a certain section of the vehicle's environment. Facing the front of the vehicle, for example, its information can be harnessed to implement emergency braking applications. Mono/stereo video sensors are among the most promising detection devices for the future. Often located behind the windshield, video sensors cover a wide area and enable object recognition, classification, and tracking. Stereo video sensors can additionally determine distances. An example application is Traffic Sign Recognition. Ultrasound sensors are typically used for close-range detection. Example applications are Blind Spot Detection and Park Assist.
Figure 1: Overview of different perception sensors (from left to right): long-range radar, short-/medium-range radar, lidar, ultrasound, and combined overview.
Although the combination of these different sensor types leads to good local perception, the sensors are still subject to important limitations that have to be taken into account when using the provided sensor information. External influences interfere with the sensing process, affecting the gathered information in terms of integrity and trustworthiness. As an everyday example, different weather conditions affect each sensor type differently. While fog has a heavy impact on the operability of video and lidar sensors, its impact on radar sensors is comparatively small. Heavy rain, however, affects radar, lidar, and video sensors alike [Motorblog].
As a second example, by design all previously mentioned sensor types need to be placed on the vehicle where they have a clear view in order to operate properly. At the same time, objects sensed by these sensors are obstacles that block the sensor from sensing the area behind them. To picture this problem: a road sign cannot be sensed by a video sensor when it is blocked by a large truck or not visible due to the road geometry.
In order to ease development and to enable new ADAS applications that have not been possible before, one has to overcome these traditional sensor limitations. Preferably, information about the road geometry, road furniture, or any other imaginable aspect of the environment should always be available and reliable. This also includes information that cannot be sensed by the previously mentioned sensors at all, such as the strength of the mobile signal connecting the vehicle to the Internet. The information, where it exists, should be accessible without any environmental influence and over a large range of view (e.g. several kilometers), independent of the road geometry or blocking obstacles.
This leads to the concept of an electronic horizon, also called eHorizon. Based on digital map material, and optionally other sources, information about the road segments in driving direction is provided. The developed Horizon.KOM can be understood as a virtual (software) sensor that achieves these preferences. The concept is illustrated in Figure 2. The area perceived with local sensors is marked in green. In driving direction, an eHorizon of the road geometry is provided, marked by the grey lines. This eHorizon also contains several other attributes, e.g. traffic signs.
Figure 2: Example illustration of an eHorizon. The eHorizon extends the local sensor view.
In the following we first introduce Horizon.KOM, a local dynamic eHorizon provider. It is based on Java and prototypically implemented as an Android application. Afterwards we introduce RemoteHorizon.KOM, a server-based eHorizon service.
The developed Horizon.KOM can be understood as a virtual (software) sensor aiming to achieve the previously mentioned preferences. Information related to the vehicle's position and context, such as road geometry, road furniture and more, is made available on the vehicle data bus. The information offered inside the vehicle is transmitted using the ADASIS specification (http://adasis.org/). Horizon.KOM is physically not limited in terms of range and information quantity. However, to keep the application performing well, a maximum horizon size is always set. It is defined as a function of the vehicle's speed: high speeds require a longer horizon (e.g. on highways), while low speeds may require a wider rather than longer horizon (e.g. in city areas).
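A speed-dependent horizon sizing rule of this kind can be pictured with a minimal sketch. The class name, the look-ahead time, and the lower bound below are illustrative assumptions, not taken from the Horizon.KOM source:

```java
// Hypothetical sketch of a speed-dependent horizon size determination.
// Constants and names are illustrative assumptions, not the actual
// Horizon.KOM implementation.
class HorizonSizeDetermination {

    // Look-ahead time in seconds: at higher speeds the horizon must
    // cover more road ahead of the vehicle.
    private static final double LOOK_AHEAD_SECONDS = 60.0;

    // Lower bound so the horizon never collapses at low speeds
    // (e.g. in city traffic, where width matters more than length).
    private static final double MIN_LENGTH_METERS = 500.0;

    /** Horizon length in meters for a given speed in m/s. */
    static double horizonLengthMeters(double speedMetersPerSecond) {
        return Math.max(MIN_LENGTH_METERS,
                speedMetersPerSecond * LOOK_AHEAD_SECONDS);
    }

    public static void main(String[] args) {
        double city = horizonLengthMeters(30 / 3.6);     // ~30 km/h
        double highway = horizonLengthMeters(130 / 3.6); // ~130 km/h
        System.out.printf("city: %.0f m, highway: %.0f m%n", city, highway);
    }
}
```

With these assumed constants, a city speed of 30 km/h still gets the 500 m minimum horizon, while 130 km/h on a highway yields roughly 2.2 km of look-ahead.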
Figure 3: Schematic illustration of the dynamic data management structure.
Figure 3 shows the general structure of Horizon.KOM. Dynamic data is stored in the tree structure. The Dynamic Road Administration and Horizon Size Determination modules are responsible for keeping the dynamic data inside the horizon up to date and the horizon itself at a suitable size. The Map-Matching module updates the vehicle's position to keep track of movement and context changes. The Most Probable Path (MPP) is required to determine the path the vehicle will most likely take. It directly influences the Dynamic Road Administration module and therefore also the dynamic data inside the horizon itself. Each road segment is assigned a calculated probability value (Figure 4). Dead Reckoning is harnessed to improve the Map-Matching and the MPP determination and to achieve a higher accuracy.
Figure 4: Example of the probability distribution within the tree-structure.
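The probability distribution of Figure 4 can be sketched as a tree of road segments in which the MPP is found by descending into the most probable child at every branch. All class, field, and method names below are illustrative assumptions, not the actual Horizon.KOM data structures:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the tree structure with per-segment
// probabilities (cf. Figures 3 and 4). Not the actual Horizon.KOM code.
class RoadSegmentNode {
    final String id;
    final double probability; // probability that the vehicle takes this segment
    final List<RoadSegmentNode> children = new ArrayList<>();

    RoadSegmentNode(String id, double probability) {
        this.id = id;
        this.probability = probability;
    }

    RoadSegmentNode addChild(RoadSegmentNode child) {
        children.add(child);
        return child;
    }

    /** Follow the most probable child at every branch to obtain the MPP. */
    static List<String> mostProbablePath(RoadSegmentNode root) {
        List<String> path = new ArrayList<>();
        RoadSegmentNode current = root;
        while (current != null) {
            path.add(current.id);
            RoadSegmentNode best = null;
            for (RoadSegmentNode c : current.children) {
                if (best == null || c.probability > best.probability) {
                    best = c;
                }
            }
            current = best; // null at a leaf: the horizon ends here
        }
        return path;
    }

    public static void main(String[] args) {
        RoadSegmentNode root = new RoadSegmentNode("A", 1.0);
        RoadSegmentNode b = root.addChild(new RoadSegmentNode("B", 0.7));
        root.addChild(new RoadSegmentNode("C", 0.3));
        b.addChild(new RoadSegmentNode("D", 0.9));
        b.addChild(new RoadSegmentNode("E", 0.1));
        System.out.println(mostProbablePath(root)); // [A, B, D]
    }
}
```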
For a detailed description and further information, please see [BPB16] at the bottom of this page. The source code is licensed under the Apache 2.0 open source license. The files are available at the end of this page.
With RemoteHorizon.KOM, a server-based horizon provider solution has been created. It was developed from Horizon.KOM and provides additional features. The key idea was to introduce a server-client eHorizon offering always up-to-date information with a lightweight client-side implementation that does not require any locally stored data. Figure 5 illustrates the general model. The top of the figure represents the server side, hosting multiple remote eHorizon instances for different vehicles. The eHorizon Core can be understood as an advanced version of Horizon.KOM, which is synchronized with a counterpart instance on the client side (bottom of Figure 5). The Communication Core is a module handling all data flows between server and client. Its main goal is a highly scalable, cost-efficient, and generic communication between server and client that provides high robustness and fast data transmission (Figure 6). The Data Transmitter Add-on provides additional functionality to run the application as a background service, avoiding high data bursts and thus preventing the communication channel from being overstressed.
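The burst avoidance of the Data Transmitter Add-on can be pictured as a simple rate limiter. The token-bucket sketch below is purely illustrative and assumed, not taken from the RemoteHorizon.KOM source:

```java
// Hypothetical token-bucket limiter illustrating how a background
// service could smooth out data bursts on the communication channel.
// Not the actual Data Transmitter Add-on implementation.
class BurstLimiter {
    private final double capacityBytes; // maximum burst size
    private final double refillPerMs;   // sustained rate in bytes per millisecond
    private double tokens;
    private long lastRefillMs;

    BurstLimiter(double capacityBytes, double refillPerMs, long nowMs) {
        this.capacityBytes = capacityBytes;
        this.refillPerMs = refillPerMs;
        this.tokens = capacityBytes; // start with a full bucket
        this.lastRefillMs = nowMs;
    }

    /** Returns true if a message of the given size may be sent now. */
    boolean trySend(int messageBytes, long nowMs) {
        // Refill tokens for the elapsed time, capped at the burst capacity.
        tokens = Math.min(capacityBytes, tokens + (nowMs - lastRefillMs) * refillPerMs);
        lastRefillMs = nowMs;
        if (tokens >= messageBytes) {
            tokens -= messageBytes;
            return true;
        }
        return false; // defer transmission, e.g. re-queue in the background service
    }

    public static void main(String[] args) {
        BurstLimiter limiter = new BurstLimiter(1024, 1.0, 0); // 1 KiB burst, 1 B/ms
        System.out.println(limiter.trySend(1024, 0)); // true: within burst capacity
        System.out.println(limiter.trySend(512, 0));  // false: bucket drained
        System.out.println(limiter.trySend(512, 600)); // true: 600 ms of refill
    }
}
```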
Figure 5: System overview of RemoteHorizon.KOM.
The communication architecture is depicted in Figure 6. For the communication between the client side and the server side, we use the publish/subscribe pattern to realize an asynchronous and scalable communication. In our prototype we use the lightweight MQTT protocol with the Mosquitto message broker. For message serialization we make use of Google Protocol Buffers (protobuf), which is very efficient, language neutral, and in particular extensible. The protobuf data structure is part of the Communication Core project within the source code package of RemoteHorizon.KOM.
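The released MessageStructure files define the concrete schema; as a rough illustration only, a horizon update in protobuf could look like the following sketch, in which every message and field name is a hypothetical assumption rather than the actual RemoteHorizon.KOM schema:

```proto
// Hypothetical sketch of a horizon update message; the actual schema
// is defined in the released MessageStructure files.
syntax = "proto3";

package remotehorizon;

message HorizonUpdate {
  uint64 timestamp_ms = 1;            // server time of this update
  repeated RoadSegment segments = 2;  // segments currently inside the horizon
}

message RoadSegment {
  uint32 segment_id = 1;
  double probability = 2;             // MPP probability of this segment
  repeated Attribute attributes = 3;  // e.g. speed limits, traffic signs
}

message Attribute {
  string type = 1;      // e.g. "SPEED_LIMIT"
  string value = 2;     // e.g. "80"
  double offset_m = 3;  // position along the segment in meters
}
```

Extensibility here comes for free from protobuf: new optional fields can be added to any message without breaking older clients.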
Figure 6: Communication architecture.
For a detailed description and further information, please see [BXB16] at the bottom of this page. The source code is licensed under the Apache 2.0 open source license. The files are available at the end of this page.
The source code of Horizon.KOM and RemoteHorizon.KOM is available under the Apache 2.0 open source license.
Download the ProtoBuf source files of our proposed data structure: MessageStructure
Download the source code files of Horizon.KOM here: Horizon.KOM
Download the source code files of RemoteHorizon.KOM here: RemoteHorizon.KOM
[BMB17] Daniel Burgstahler, Martin Möbus, Tobias Meuser, Doreen Böhnstedt and Ralf Steinmetz: A Categorization Scheme for Information Demands of Future Connected ADAS (accepted for publication). In: Proceedings of the AmE 2017 - Automotive meets Electronics, March 2017. (ftp://ftp.kom.tu-darmstadt.de/papers/BMB17.pdf)
[BPB16] Daniel Burgstahler, Christoph Peusens, Doreen Böhnstedt and Ralf Steinmetz: Horizon.KOM: A First Step Towards an Open Vehicular Horizon Provider. In: Proceedings of the 2nd International Conference on Vehicle Technology and Intelligent Transport Systems (VEHITS), p. 79-84, SCITEPRESS, April 2016. ISBN 978-989-758-185-4.
[BXB16] Daniel Burgstahler, Athiona Xhoga, Christoph Peusens, Martin Möbus, Doreen Böhnstedt and Ralf Steinmetz: RemoteHorizon.KOM: Dynamic Cloud-based eHorizon. In: Proceedings of the AmE 2016 - Automotive meets Electronics, VDE VERLAG GMBH, March 2016. ISBN 978-3-8007-4167-0.
[BZM+15] Daniel Burgstahler, Sebastian Zöller, Martin Möbus, Tim Walter, Tobias Rückelt and Ralf Steinmetz: Navigate.KOM: Datenbankbasierter Informationsansatz für Fahrassistenzsysteme. In: Proceedings of the AmE 2015 - Automotive meets Electronics, p. 111-116, VDE VERLAG GMBH, February 2015. ISBN 978-3-8007-3890-8.