

Predictive Functions in the HYBex3 Concept Vehicle

30. January 2020 | Engineering Service, Featured Article


The hybridization of powertrains is an important step toward efficient and clean mobility. In particular, the ability to shift the operation of the combustion engine into ranges with a higher efficiency level and to enable purely electric driving modes is one of the main advantages of hybrid drives. This load point shifting can be further optimized on the basis of route data that includes the expected vehicle speed as well as the road gradient, and is considered state of the art in modern hybrid drives.

Combined with the development of predictive and automated driving functions, further potentials can be tapped. The key factor for an actual reduction of the energy requirement under real driving conditions is a precise forecast of the future development of a traffic situation. This forecast can be based on a multitude of potential sources, such as sensor data, high-resolution maps, and vehicle communication, whereby all the data is fused into a comprehensive environmental model.

Based on the information from this model, the longitudinal guidance of the vehicle and the powertrain control can be optimized. In cooperation with the Institute for Combustion Engines of RWTH Aachen University, Germany, FEV has developed a function structure that is capable of using a multitude of potential data sources. This creates a solution space for predictive speed profile optimization. The resulting speed profile can then be used to optimize the torque distribution between the hybrid components.

The function structure was integrated into a hybrid prototype vehicle constructed jointly with DENSO. A robust, real-time model predictive control algorithm is used to optimize the longitudinal guidance of the vehicle.

The HYBex3 concept vehicle

The HYBex3 ("HYBrid power exchange 3 modes") vehicle was developed jointly with DENSO AUTOMOTIVE Germany in order to determine the impact of a cost-effective DHT transmission concept on the driveability of the vehicle and to test it under real conditions. The base vehicle is a MINI Cooper with a turbocharged 100 kW three-cylinder combustion engine. The series-production transmission was replaced with the hybrid transmission under examination, which was specially developed for this application. The powertrain topology corresponds to a mixed hybrid with two electric machines (EM) in a P2/P3 layout. The P2 machine is located between the electrohydraulically actuated clutch and the two-stage spur gear unit. The synchronization elements are also actuated electrohydraulically. The P3 machine is positioned at the transmission output and therefore has a fixed transmission ratio to the wheel.

Various operating modes can be represented with this DHT transmission. For purely electric driving, the combustion engine is stopped and the clutch is opened. Electric machine P2 can then be operated in both gear stages. In addition to a high starting torque in first gear, this enables a maximum vehicle speed of 140 km/h in second gear.

In hybrid operation, serial or parallel driving is possible. In parallel operation, one of the two gear sets is engaged. In serial operation, the transmission is shifted to neutral; the combustion engine is then exclusively connected to electric machine P2 while electric machine P3 drives the wheels. All gear changes are synchronized entirely electrically, so the friction clutch can remain closed even in hybrid operation. Serial operation in the low speed range and parallel operation at higher speeds enable a significant increase in system efficiency.
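The mode arbitration described above can be sketched as a simple rule-based function. The thresholds and the SOC criterion below are illustrative assumptions, not the HYBex3 calibration:

```python
def select_operating_mode(v_kmh, soc, ev_soc_min=0.25, v_serial_max=50.0):
    """Simplified mode arbitration for a P2/P3 mixed hybrid (illustrative).

    - electric: engine off, clutch open, P2 drives via one of two gears
    - serial:   transmission in neutral, engine generates via P2, P3 drives
    - parallel: one gear set engaged, engine and machines drive together
    """
    if soc > ev_soc_min:          # enough battery charge: drive electrically
        return "electric"
    if v_kmh < v_serial_max:      # low speed range: serial operation
        return "serial"
    return "parallel"             # higher speeds: parallel operation
```

In a real hybrid control unit, this arbitration would additionally consider driver demand, component temperatures, and hysteresis to avoid mode chattering.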

The operating strategy provides for the combustion engine to be operated with very low dynamics, while fast load changes are implemented via the electric path. The transmission ratios enable a significant reduction of the combustion engine speed without compromising the overall dynamics of the powertrain. The operating strategy was optimized using a Design of Experiments: the parameters of the engine stop-start strategy were optimized simultaneously with those of the battery charging strategy. For the final parameterization, a compromise between the layouts for different driving cycles was selected.

The distribution of torque between the two electric machines, both in parallel operation and in fully electric driving, is determined by an online optimization patented by FEV. The search algorithm varies the torque distribution until the energetically optimal case is found, taking into account both the battery limits and the power limits of the electric machines in the current situation.
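A minimal sketch of such a search over the torque split is shown below. The wheel-referred torques, quadratic loss models, and limit values are illustrative assumptions for this sketch, not FEV's patented algorithm:

```python
def optimal_torque_split(t_wheel, t_p2_max, t_p3_max, loss_p2, loss_p3,
                         p_mech, p_batt_max, steps=101):
    """Grid search over the P2/P3 torque split for a required wheel torque.

    t_wheel:   required wheel torque [Nm], here wheel-referred for both EMs
    loss_*:    callables returning electrical losses [W] for a given torque
    p_mech:    mechanical power demand at the wheel [W]
    Returns (battery power, P2 torque, P3 torque) or None if infeasible.
    """
    best = None
    for i in range(steps):
        alpha = i / (steps - 1)            # share of wheel torque taken by P3
        t_p3 = alpha * t_wheel
        t_p2 = (1.0 - alpha) * t_wheel
        # respect the machine torque limits
        if abs(t_p2) > t_p2_max or abs(t_p3) > t_p3_max:
            continue
        # battery power = mechanical demand plus electrical losses
        p_elec = p_mech + loss_p2(t_p2) + loss_p3(t_p3)
        if abs(p_elec) > p_batt_max:       # respect the battery power limit
            continue
        if best is None or p_elec < best[0]:
            best = (p_elec, t_p2, t_p3)
    return best
```

With symmetric quadratic losses, the search settles on an even split; with asymmetric machines it shifts load toward the more efficient path.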

Predictive functions

The function structure developed for predictive longitudinal dynamics control is designed in such a way that a multitude of data sources, optimization routines, and powertrain structures can be represented within it.

The first step is an aggregation and fusion of the available data into an environmental model, followed by a prediction of the traffic situation. This enables an optimization of the speed profile, on the basis of which the acceleration control of the vehicle is carried out. The planned speed profile can also be used to adjust the state of charge strategy. Once the desired charging power has been determined, the torque distribution between the powertrain components is carried out on the basis of this power and the wheel torque requirement.

The precise forecast of the current traffic situation requires the aggregation of all available data. This includes, for instance, RADAR sensors, LIDAR sensors, or optical cameras that can identify traffic participants with the help of image recognition techniques. Usually, these sensors indicate the type (passenger car, truck, pedestrian, etc.), the relative positions and, potentially, the relative speeds of the detected objects. Further information can be obtained from the on-board navigation system, which indicates speed limits, road gradients and curvatures as well as, potentially, intersection data for the most probable path of the vehicle via an “electronic horizon”. If the navigation system is connected to the internet, data on average speeds along the planned route and on traffic jams can be provided.

Additional data can be obtained through the future connection of vehicles using 5G or ETSI ITS-G5. This vehicle-to-everything (V2X) communication can include, among other things, the positions, directions, and speeds of other vehicles, as well as the layout of intersections and the status of traffic light systems. Vehicle communication can therefore provide data that goes beyond the horizon detectable via on-board sensors.

Since the same object can therefore be detected multiple times by various data sources, the data aggregation must also include a functionality for data fusion. This is especially advantageous for hardware setups with different types of sensors, e.g. a RADAR sensor and camera sensor. The RADAR sensor can precisely define the distance to and the relative position of a vehicle driving ahead, but cannot determine the lateral position of the vehicle in relation to the road markings. In contrast, the camera sensor can only provide estimates regarding the relative speed and the distance, but can precisely determine whether the detected object is in the same lane as the vehicle under consideration. After the fusion of several data sources, an aggregated object list is created, which only contains valid and relevant data for all detected objects, and generates a corresponding environmental model.
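A greatly simplified fusion step of this kind can be sketched as follows, assuming plain object lists with hypothetical keys (`range_m`, `rel_speed_mps`, `type`, `same_lane`). Production systems use probabilistic association and tracking instead of this nearest-neighbor gating:

```python
def fuse_objects(radar_objs, camera_objs, gate_m=3.0):
    """Associate radar and camera detections by longitudinal distance gating.

    Illustrative only: the radar supplies precise range and relative speed,
    the camera supplies the object class and the lane assignment.
    Detections are dicts; the keys are assumptions for this sketch.
    """
    fused = []
    used = set()
    for r in radar_objs:
        match, best_gap = None, gate_m
        for i, c in enumerate(camera_objs):
            gap = abs(r["range_m"] - c["range_m"])
            if i not in used and gap < best_gap:
                match, best_gap = i, gap
        # radar attributes are kept as the more precise source
        obj = {"range_m": r["range_m"], "rel_speed_mps": r["rel_speed_mps"]}
        if match is not None:
            used.add(match)
            # camera attributes complement the fused object
            obj["type"] = camera_objs[match]["type"]
            obj["same_lane"] = camera_objs[match]["same_lane"]
        fused.append(obj)
    return fused
```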

Before an optimization of the vehicle trajectory can be carried out, there must be a forecast of the development of the current situation. This forecast is based on the relevant objects that the environmental model provides. The first step is the determination of the speed limit along the prediction horizon. Based on that and the current condition of detected vehicles driving ahead, the speed and position trajectory of these vehicles is forecast.
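As a minimal illustration of such a forecast, the sketch below propagates a detected lead vehicle with constant acceleration and clips its speed to the legal limit along the route. The constant-acceleration assumption and all names are illustrative simplifications:

```python
def forecast_lead_vehicle(s0, v0, a0, speed_limit, dt=1.0, horizon_s=10.0):
    """Constant-acceleration forecast of a lead vehicle over the horizon.

    s0, v0, a0:   current position [m], speed [m/s], acceleration [m/s^2]
    speed_limit:  callable mapping position [m] -> legal limit [m/s]
    Returns a list of (position, speed) tuples, one per time step.
    """
    s, v = s0, v0
    traj = []
    for _ in range(int(horizon_s / dt)):
        # propagate speed, clipped to standstill and to the local limit
        v = min(max(v + a0 * dt, 0.0), speed_limit(s))
        s += v * dt
        traj.append((s, v))
    return traj
```

The resulting position trajectory bounds the solution space for the ego vehicle's speed profile optimization.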

On this basis, a solution space is spanned in which the downstream optimization algorithm can operate. The function structure developed by FEV and the Institute for Combustion Engines enables the implementation of different algorithms for this purpose. Depending on the requirements, simple rule-based approaches as well as model predictive control or discrete dynamic programming methods can be represented.

Application in the vehicle

To test the function structure, a real-time-capable model predictive control (MPC) was implemented in the rapid prototyping control unit of the HYBex3 concept vehicle and various test scenarios were carried out. A first demonstration proved the functionality and real-time capability of the predictive longitudinal control in the HYBex3 concept vehicle. With an efficient implementation of the MPC using the qpOASES tool, an optimization of the speed curve for a horizon of 10 s can be carried out in less than 100 µs.
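The underlying optimization can be illustrated with a deliberately simple stand-in: a projected-gradient solver for a speed-tracking MPC with box-constrained acceleration. This is a pedagogical sketch, not the qpOASES-based implementation described above, and all weights and limits are assumptions:

```python
def mpc_speed_plan(v0, v_ref, n=10, dt=1.0, a_max=2.0, weight_a=1.0,
                   iters=3000, lr=0.01):
    """Plan accelerations minimizing sum((v_k - v_ref)^2) + weight_a*sum(a_k^2)
    subject to |a_k| <= a_max, with the dynamics v_{k+1} = v_k + a_k*dt.

    Solved by projected gradient descent on the (convex) quadratic cost.
    """
    a = [0.0] * n
    for _ in range(iters):
        # forward simulate the speed trajectory for the current plan
        v = [v0]
        for k in range(n):
            v.append(v[k] + a[k] * dt)
        # gradient of the cost w.r.t. each a_k (a_k influences v_{k+1..n})
        grad = []
        for k in range(n):
            g = 2.0 * weight_a * a[k]
            for j in range(k + 1, n + 1):
                g += 2.0 * (v[j] - v_ref) * dt
            grad.append(g)
        # gradient step, projected onto the box constraint |a_k| <= a_max
        a = [min(max(a[k] - lr * grad[k], -a_max), a_max) for k in range(n)]
    return a
```

A QP solver such as qpOASES reaches the same optimum in microseconds by exploiting the problem structure and warm starts, which is what makes the 10 s horizon feasible in a control unit.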

In the future, the modular design of the function structure can be used to expand the forecast horizon of the vehicle – for instance, with traffic lights ahead – or to represent predictive, automated driving functions such as Predictive Cruise Control (PCC).



Workflow-Based Information Management for Powertrain Testing Facilities

22. October 2019 | Featured Article, Software & Testing Solutions


Compared to the amount of raw measurement data generated and processed by self-driving vehicles, the amount produced during powertrain testing is quite easily manageable. However, the sheer variety of information encountered with powertrain tests and how the pieces interact place high demands on the tools used to process that information. An efficiently organized testing facility therefore needs that information to be structured and standardized sensibly and its information management tools to be networked intelligently. That is the only way to speed up information processing during the testing process and maximize the knowledge gains.

Due to the variety of information, it makes sense to divide it into domains (Figure 1). The flow of information between the domains and an adequate tool chain can then be specified and set up.

Figure 1: Information domains and flow in the testing process

FEVFLEX™ enables the configuration of project data, such as team definitions and availability, time frames and budgetary conditions, and allows for the transfer of this data from the ERP system and machine data acquisition systems to the first information domain – the testing assignment database. The powerful graphical user interfaces in FEVFLEX™ enable reliable planning and coupling of test programs and resources (test beds, measuring instruments, personnel).

A tied-in digital order management system allows written instructions to be issued to laboratories and workshops along with the master data, so the necessary measuring tools can be prepared and the test setup initiated. Information about the test item and the testing program, as well as the control unit data sets, is supplied by the respective specialist department.

It is clear that reliably functioning information tools also need to promote collaboration and the exchange of information between process partners during testing assignment planning so the information can be combined efficiently and without any loss.

At FEV, our specialist departments and testing facilities do this by using the identical front end of the testing assignment database seen in Figure 2.

The inspection order data is automatically transmitted to the second information domain: from it, FLEX Lab™ creates the configuration of the test bench automation system MORPHEE™, where it serves as the basis for performing the tests.

Figure 2: User interface of the testing assignment and testing database

In the testing database, individual test steps, such as engine characteristic map measurement, full-load curve, or an emission cycle, are specified, depending on the test types required (Figure 2). Because the information is inherited, only deviations from the planned testing requirements need to be recorded for the subsequent steps and documentation purposes. The test bench operator selects the appropriate step from the testing database via the automation system’s interface, thus creating a connection to the measurement data.

Linking the test assignment data with associated rules, the set points actually achieved during the test, and the time-synchronous measurement data from the various measuring systems produces a complete set of data for calculating test results and for further analyses. Computations are performed as needed. The automation system calculates control deviations or significant quality criteria, e.g. measuring point stability, in real time during measurement. Once the system processes them, they are transferred to the database. Additional calculations based on a standardized list of formulas are performed after the measurement results are imported into the testing database. The results of those calculations are stored separately.
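A measuring-point stability criterion of the kind calculated in real time by the automation system can be sketched as follows. The window length and threshold are illustrative assumptions:

```python
def is_stable(samples, window=30, max_rel_dev=0.01):
    """Simple measuring-point stability criterion (illustrative).

    The last `window` samples of a channel must all lie within
    `max_rel_dev` (relative) of their mean before the measurement
    point is considered stable enough to record.
    """
    if len(samples) < window:
        return False                      # not enough data yet
    w = samples[-window:]
    mean = sum(w) / window
    if mean == 0:
        return False                      # relative criterion undefined
    return max(abs(x - mean) for x in w) / abs(mean) <= max_rel_dev
```

In practice, several channels (speed, torque, temperatures) are checked simultaneously, each with its own window and tolerance.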

Upon conclusion of each step in testing, we have a pool of informative data available for quality assurance by the testing facility or for additional analyses, even for multiple projects, by the specialist department.

In the third domain – the operational database – the logbook functionality in FEVFLEX™ enables the logging of code-based operating states and error messages of the automation systems and measuring devices in the test field. This makes additional information available.

The test bench operator supplements that information with reports on the error patterns and root causes. If need be, personnel also prepare 8D reports, as seen in Figure 3, which are forwarded directly to the responsible workshops or laboratories via a messaging system for further follow-up.

Figure 3: Operational database and code-based logging of operating states and error messages

This makes the operational database an important tool for supporting operations in addition to the automated productivity analysis of individual projects or entire testing facilities. In the testing facility’s organization, this is handled by various equipment managers, each of whom receives reports about error codes within their purview as well as the anomalies, errors, and the root causes contained in the reports. A powerful interface provides them with extensive information, and they can intervene quickly and selectively if a risk is encountered.
Inheriting the test assignment data links together all information from the test steps and operations. All the information can still be traced, and the preparation of component histories, i.e. load spectra, measurements, and anomalies experienced during the test phase, is simplified considerably.

Quality assurance using online plausibility checks

To verify and examine the plausibility of measurement results while testing is being conducted, the interface between the test bench and testing database offers a data transfer tool with enhanced features, as seen in Figure 5. It successively imports raw data into the testing database during the current measurement process, performing automatic analyses as it does. The test bench operator receives continuous information about the test results via the on-screen visualization (Figure 4) and can, if necessary, intervene to make manual corrections, unless they are already made automatically.

Figure 4: User interfaces and visualization of the information management system, shown here at an operator station in a control room

Besides confirming adherence to the testing rules, the plausibility checks involve ensuring the measurement results are complete and comparing the measurements to an expected but not yet critical range of values. They enable early detection of changes or malfunctions in the test item or even the testing equipment. Furthermore, the transfer tool can perform a data-driven analysis of gas travel times when measuring emissions and perform a regression analysis in order to promptly calculate and examine the plausibility of specific emission values. The results of the plausibility check are also stored in the testing database.
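A completeness and expected-range check of this kind can be sketched as follows. The channel names and value bands are illustrative assumptions, not an FEV rule set:

```python
def plausibility_check(record, expected_ranges, required_channels):
    """Illustrative online plausibility check for one measurement record.

    record:           dict channel name -> measured value (or None)
    expected_ranges:  dict channel name -> (low, high) expected band;
                      assumed to cover every required channel
    Returns a list of findings: (channel, reason) for channels that are
    missing or outside their expected (non-critical) band.
    """
    findings = []
    for ch in required_channels:
        if ch not in record or record[ch] is None:
            findings.append((ch, "missing"))       # completeness check
            continue
        lo, hi = expected_ranges[ch]
        if not (lo <= record[ch] <= hi):
            findings.append((ch, "out of expected range"))
    return findings
```

Findings of this kind are what the on-screen visualization presents to the test bench operator for possible manual intervention.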

Figure 5: Online plausibility checks of measurement data using the data transfer tool with enhanced features

Online plausibility checks during data import thus contribute significantly to the quality assurance of testing operations.

Post-processing and reporting

Automated reporting is based on machine-readable report definitions as well as standardized report templates and naming. A typical quality measure used by testing facilities is the regular – at least daily – recording of reference measurements at characteristic operating points. The data status of the test item and the testing conditions are kept constant for these measurements. This allows changes or drifting measurement values over long periods of testing, or after modifications or repairs, to be detected quickly. The analysis feature built into the data transfer tool triggers the automated generation of a report in the evaluation tool UNIPLOT™, which supplements the data currently being measured with the reference measurements stored since the beginning of the test.
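The comparison against stored reference measurements can be sketched as a simple drift check per channel. The threshold and naming are illustrative assumptions:

```python
def drift_check(reference_history, latest, max_rel_drift=0.02):
    """Flag a drift of a reference measurement against its stored history.

    reference_history: earlier values of one channel, recorded at the
                       characteristic operating point under constant
                       test item status and testing conditions
    latest:            the newest reference measurement of that channel
    Returns (flagged, relative deviation from the historical mean).
    """
    mean = sum(reference_history) / len(reference_history)
    rel = abs(latest - mean) / abs(mean)
    return rel > max_rel_drift, rel
```

A flagged drift points either to a change in the test item (e.g. after a repair) or to a malfunction of the testing equipment, and is surfaced in the automated quality report.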

In addition to the quality reports, other project-specific testing reports have been defined. They are available shortly following completion of each test thanks to automatic processing of the test results to be presented. Calculations for individual projects are stored in the database as supplemental computing rules, thereby expanding the contents of the automated testing reports.

Global networking of testing facilities

If test runs are organized across locations – for instance, to avoid complicated logistics for test items and components when a different facility possesses the subject matter expertise – rapid and secure exchange of information within the worldwide corporate network becomes essential.

It is not mandatory for the databases to be situated in the same location as the test execution site or the specialist departments. To ensure efficient testing operations while meeting quality standards, part of the testing results must be available locally after a short time. This is made possible by replicating the data transfer tool at the local facility. The tool then performs the online plausibility checks and prepares the quality reports. At the same time, the local data transfer tool organizes the data transfer to the central database, where it starts additional calculations and the preparation of the project reports, as illustrated in Figure 6. The testing facility therefore has a comprehensive report on quality assurance and initial analysis available to it in just a few minutes.

Figure 6: Global networking

The testing assignment database and testing database are accessed directly using a virtual desktop infrastructure. With it, the expert team can specify new testing assignments or individual test steps, which are made available to the test bench operator as orders on the books in the central testing database. To aid communication and global collaboration, FEV also uses virtual control stations. Comparable to the central control room at a testing facility (Figure 4), a virtual control station is also used to transfer information on online plausibility checks and the status of the automation and the application tool.

Test steps can be continuously assigned and complete test results then promptly communicated between an expert team and a remote testing facility via the testing database. In a series of internationally organized projects conducted at FEV, it was shown that the entire set of test results, including all automated calculations and reports for a test step, can be available worldwide in no more than 15 minutes.

FEV’s shared testing database is thus the central platform for the group’s global network of testing activities.

The necessary standardizations and information management tools were developed by FEV and are being continuously refined. On that basis, our customers have an attractive range of products available to them for information management in testing facilities – from the automation system MORPHEE™ through data management in FEVFLEX™ and FLEX Lab™ to evaluation in UNIPLOT™.