Compared to the amount of raw measurement data generated and processed by self-driving vehicles, the amount produced during powertrain testing is quite easily manageable. However, the sheer variety of information encountered with powertrain tests and how the pieces interact place high demands on the tools used to process that information. An efficiently organized testing facility therefore needs that information to be structured and standardized sensibly and its information management tools to be networked intelligently. That is the only way to speed up information processing during the testing process and maximize the knowledge gains.
Given this variety, it makes sense to divide the information into domains (Figure 1). The flow of information between the domains, along with an adequate tool chain, can then be specified and set up.
FEVFLEX™ enables the configuration of project data, such as team definitions and availability, time frames, and budgetary conditions, and allows this data to be transferred from the ERP system and machine data acquisition systems to the first information domain – the testing assignment database. The powerful, graphical user interfaces in FEVFLEX™ enable reliable planning and coupling of test programs and resources (test beds, measuring instruments, personnel).
A tied-in, digital order management system allows instructions prepared in writing to be issued to laboratories and workshops along with the master data so the necessary measuring tools can be prepared and the test setup initiated. Information about the subject of testing and the testing program, as well as the control unit data sets, is supplied by the respective specialist department.
It is clear that reliably functioning information tools must also promote collaboration and the exchange of information between process partners during testing assignment planning so that the information can be combined efficiently and without loss.
At FEV, our specialist departments and testing facilities do this by using the identical front end of the testing assignment database seen in Figure 2.
The test order data is automatically transmitted to the second information domain. FLEX Lab™ creates the configuration of the test bench automation system, MORPHEE™, where it serves as the basis for performing the tests.
In the testing database, individual test steps, such as engine characteristic map measurement, full-load curve, or an emission cycle, are specified, depending on the test types required (Figure 2). Because the information is inherited, only deviations from the planned testing requirements need to be recorded for the subsequent steps and documentation purposes. The test bench operator selects the appropriate step from the testing database via the automation system’s interface, thus creating a connection to the measurement data.
Linking the test assignment data with associated rules, the set points actually achieved during the test, and the time-synchronous measurement data from the various measuring systems produces a complete set of data for calculating test results and for further analyses. Computations are performed as needed. The automation system calculates control deviations or significant quality criteria, e.g. measuring point stability, in real time during measurement. Once the system processes them, they are transferred to the database. Additional calculations based on a standardized list of formulas are performed after the measurement results are imported into the testing database. The results of those calculations are stored separately.
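The second stage described above – evaluating a standardized list of formulas after the measurement results are imported, and storing the derived results separately from the raw data – can be sketched as follows. This is a minimal illustration; the channel names, formulas, and constants are invented examples, not FEV's actual schema or formula catalog.

```python
# Sketch: applying a standardized formula list to one imported measurement
# record. All names and values are illustrative examples.

MEASUREMENT = {                 # raw channels from the automation system
    "torque_Nm": 240.0,
    "speed_rpm": 3000.0,
    "fuel_flow_kg_h": 12.6,
}

TWO_PI = 2 * 3.141592653589793

# Standardized formula list: each derived quantity is computed from the
# raw channels; results are kept separate from the raw data.
FORMULAS = {
    "power_kW": lambda m: m["torque_Nm"] * m["speed_rpm"] * TWO_PI / 60 / 1000,
    "bsfc_g_kWh": lambda m: m["fuel_flow_kg_h"] * 1000
                  / (m["torque_Nm"] * m["speed_rpm"] * TWO_PI / 60 / 1000),
}

def derive(measurement: dict) -> dict:
    """Evaluate every formula in the list and return the derived results."""
    return {name: f(measurement) for name, f in FORMULAS.items()}

results = derive(MEASUREMENT)
print(round(results["power_kW"], 1))   # mechanical power at this operating point
```

Keeping the formula list as data rather than hard-coded logic mirrors the article's point that project-specific computing rules can later be added to the database without changing the import tooling.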
Upon conclusion of each step in testing, we have a pool of informative data available for quality assurance by the testing facility or for additional analyses, even for multiple projects, by the specialist department.
In the third domain – the operational database – the logbook functionality in FEVFLEX™ enables the logging of code-based operating states and error messages from the automation systems and measuring devices in the test field, making additional information available.
The test bench operator supplements that information with reports on the error patterns and root causes. If need be, personnel also prepare 8D reports, as seen in Figure 3, which are forwarded directly to the responsible workshops or laboratories via a messaging system for further follow-up.
This makes the operational database an important tool for supporting operations, in addition to enabling automated productivity analysis of individual projects or entire testing facilities. In the testing facility's organization, this is handled by various equipment managers, each of whom receives reports about error codes within their purview as well as the anomalies, errors, and root causes contained in the reports. A powerful interface provides them with extensive information, and they can intervene quickly and selectively if a risk is encountered.
Inheriting the test assignment data links together all information from the test steps and operations. All the information can still be traced, and the preparation of component histories, i.e. load spectra, measurements, and anomalies experienced during the test phase, is simplified considerably.
Quality assurance using online plausibility checks
To verify and examine the plausibility of measurement results while testing is being conducted, the interface between the test bench and testing database offers a data transfer tool with enhanced features, as seen in Figure 5. It successively imports raw data into the testing database during the current measurement process, performing automatic analyses as it does. The test bench operator receives continuous information about the test results via the on-screen visualization (Figure 4) and can, if necessary, intervene to make manual corrections, unless they are already made automatically.
Besides confirming adherence to the testing rules, the plausibility checks involve ensuring the measurement results are complete and comparing the measurements to an expected but not yet critical range of values. They enable early detection of changes or malfunctions in the test item or even the testing equipment. Furthermore, the transfer tool can perform a data-driven analysis of gas travel times when measuring emissions and perform a regression analysis in order to promptly calculate and examine the plausibility of specific emission values. The results of the plausibility check are also stored in the testing database.
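The two basic checks named above – completeness of the measurement results and comparison against an expected but not yet critical range of values – can be sketched in a few lines. The channel names and ranges below are invented examples, not an actual FEV rule set.

```python
# Minimal sketch of an online plausibility check during data import:
# a completeness check plus an expected-range comparison per channel.
# Channel names and limits are illustrative assumptions.

EXPECTED_RANGES = {
    "coolant_temp_C": (80.0, 95.0),
    "nox_ppm": (0.0, 400.0),
    "lambda": (0.95, 1.05),
}

def check_record(record: dict) -> list:
    """Return a list of findings; an empty list means the record is plausible."""
    findings = []
    for channel, (lo, hi) in EXPECTED_RANGES.items():
        if channel not in record:
            findings.append(f"{channel}: missing")            # completeness check
        elif not lo <= record[channel] <= hi:
            findings.append(f"{channel}: {record[channel]} outside [{lo}, {hi}]")
    return findings

# A record with one out-of-range channel and one missing channel:
print(check_record({"coolant_temp_C": 88.0, "nox_ppm": 520.0}))
```

Because the ranges mark an expected rather than a critical window, a finding here is a prompt for the operator to investigate the test item or equipment, not necessarily to abort the test.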
Online plausibility checks during data import thus contribute significantly to the quality assurance of testing operations.
Post-processing and reporting
Automated reporting is based on machine-readable report definitions, standardized report templates, and consistent naming. A typical quality measure used by testing facilities is taking reference measurements regularly – at least daily – at characteristic operating points. The configuration of the test item and the testing conditions are kept constant for these measurements. This allows changes or drifting measurements over long periods of testing, or after modifications or repairs, to be detected quickly. The analysis feature built into the data transfer tool triggers the automated generation of a report in the evaluation tool, UNIPLOT™, supplementing the data currently being measured with the reference measurements stored since the beginning of the test.
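The drift detection behind such reference measurements amounts to comparing the latest reference value against the baseline of earlier ones. The sketch below illustrates this with a simple relative-deviation test; the 2 percent tolerance and the fuel-flow figures are assumed examples, not documented FEV settings.

```python
# Illustrative drift check on regular reference measurements: flag the
# latest value if it deviates from the mean of the stored references by
# more than a relative tolerance. Threshold and data are assumptions.

def drifted(history: list, latest: float, rel_tol: float = 0.02) -> bool:
    """True if `latest` deviates from the baseline by more than `rel_tol`."""
    baseline = sum(history) / len(history)
    return abs(latest - baseline) / baseline > rel_tol

fuel_refs = [12.50, 12.48, 12.52, 12.49]   # reference fuel flow since test start
print(drifted(fuel_refs, 12.51))  # within tolerance
print(drifted(fuel_refs, 13.10))  # shift -> inspect test item or equipment
```

In practice the comparison would run per channel and per operating point, but the principle is the same: a constant test configuration turns any systematic shift into a signal.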
In addition to the quality reports, other project-specific testing reports have been defined. They are available shortly following completion of each test thanks to automatic processing of the test results to be presented. Calculations for individual projects are stored in the database as supplemental computing rules, thereby expanding the contents of the automated testing reports.
Global networking of testing facilities
If test runs are organized across locations – for instance, to avoid complicated logistics for test items and components when a different facility possesses the subject matter expertise – rapid and secure exchange of information within the worldwide corporate network becomes essential.
It is not mandatory for the databases to be situated in the same location as the test execution site or the specialist departments. To ensure efficient testing operations while meeting quality standards, some of the test results must be available locally shortly after measurement. This is made possible by replicating the data transfer tool at the local facility. The tool then performs the online plausibility checks and prepares the quality reports. At the same time, the local data transfer tool organizes the data transfer to the central database, where it starts additional calculations and the preparation of the project reports, as illustrated in Figure 6. The testing facility therefore has a comprehensive report on quality assurance and initial analysis available to it in just a few minutes.
The testing assignment database and testing database are accessed directly using a virtual desktop infrastructure. With it, the expert team can specify new testing assignments or individual test steps, which are made available to the test bench operator as orders on the books in the central testing database. To aid communication and global collaboration, FEV also uses virtual control stations. Comparable to the central control room at a testing facility (Figure 4), a virtual control station is also used to transfer information on online plausibility checks and the status of the automation and the application tool.
Test steps can be continuously assigned and complete test results then promptly communicated between an expert team and a remote testing facility via the testing database. In a series of internationally organized projects conducted at FEV, it was shown that the entire set of test results, including all automated calculations and reports for a test step, can be available worldwide in no more than 15 minutes.
FEV’s shared testing database is thus the central platform for the group’s global network of testing activities.
The necessary standardizations and information management tools were developed by FEV and are being continuously refined. On that basis, our customers have an attractive range of products available to them for information management in testing facilities – from the automation system MORPHEE™, through data management in FEVFLEX™ and FLEX Lab™, to evaluation in UNIPLOT™.
In the coming year, FEV plans to open two new battery test centers – one in Germany and one in France. Additionally, new e-motor and e-axle test benches have been integrated into FEV's test centers and at customer sites. Based on long-term planning and construction experience with its own test cells and test centers, as well as in numerous customer projects, FEV provides an effective methodology for specification development, concept layout, and planning of e-mobility test benches, test cells, and test centers. This methodology covers hardware (test equipment, technical infrastructure, building), software (data management), and logistical and operational aspects.
Based on FEV's long-term experience, the sustainable success of new test cell and test center construction is highly influenced by the quality and completeness of the specification and planning phases. Precise requirements analysis, complete specification development, and well-designed concept development are the key factors that provide a solid foundation for the successful realization of these projects. Thanks to its extensive experience, FEV can actively organize and guide the described project phases in close collaboration with future users and customers to ensure the development of sustainable, cost-effective solutions that cover future requirements to the highest possible degree.
The final goal is to develop a technical solution covering building construction aspects; concepts for the test cells, test benches, laboratories, and workshops; the technical infrastructure, including supply media and energy supply; as well as operational and logistical issues. Drawing on long-term, global experience, FEV's experts provide the right solutions. They have the in-depth knowledge and experience gained in the construction of FEV's own test centers for the mobility of the future to support customers, and they use specific calculation and simulation tools to evaluate the different scenarios.
Boosting the test center performance
In state-of-the-art test centers, the visible parts, such as the buildings, the building infrastructure, and the test benches, can no longer be separated from the invisible part – the comprehensive information system with a high degree of automation.
Let's evaluate how this information system controls the workflow and use cases in a battery test center. When the battery pack, module, cells, or (sub-)components are received, a bar code is created that follows the Unit Under Test (UUT) throughout the entire workflow. The UUT is taken from a safe storage room and subsequently equipped with sensors and measuring devices in a preparation area. The availability and maintenance status of resources (equipment, test benches, employees) is documented in a database, supporting efficient and effective planning and assignment of UUTs and resources. After the installation of the UUT at the test bench, the test program is executed, followed by post-processing of the measurement data acquired via the automation system and further measuring devices. The measurement data is checked for plausibility and finally documented in standardized test reports. The information system allows data on the UUT, the assigned resources, the test program, and the test results to be logically linked throughout the workflow. The above information system is based on the FEVFLEX™ software suite.
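The logical linkage just described – the bar code created at goods receipt acting as the key that ties the UUT, the assigned resources, the test program, and the test results together – can be sketched as a simple registry. The field names below are illustrative assumptions, not the actual FEVFLEX™ schema.

```python
# Sketch: one record per UUT, keyed by the bar code assigned at goods
# receipt, accumulating links to resources, test program, and results
# as the workflow progresses. Names are illustrative.

from dataclasses import dataclass, field

@dataclass
class UnitUnderTest:
    barcode: str                                         # follows the UUT everywhere
    kind: str                                            # pack, module, cell, component
    resources: list = field(default_factory=list)        # bench, equipment, staff
    test_program: str = ""
    results: dict = field(default_factory=dict)

registry = {}

uut = UnitUnderTest(barcode="BAT-2024-0042", kind="module")
registry[uut.barcode] = uut                              # created at goods receipt

uut.resources += ["bench_07", "climate_chamber_2"]       # planning and assignment
uut.test_program = "durability_cycle_A"                  # execution
uut.results["capacity_Ah"] = 58.3                        # post-processing

print(registry["BAT-2024-0042"].test_program)
```

A single key per test item is what makes the later traceability – component histories, load spectra, anomalies – a lookup rather than a reconstruction.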
This modular, layer-based suite features dedicated modules for managing the main workflow of a test center, starting from the test demands up to the final test reports:
- Enterprise functionality at the layer of the overall test center:
FEVFLEX™ facilitates experiments in the field of simulation, benchmarking, and component and system test benches up to vehicle fleet tests, as well as combinations of those. At this layer, work orders are created by combining data from ERP and MES systems (e.g., customer and project data, cost centers) with information on the UUT, the test program, and the availability and status of resources (equipment, test benches, employees). Tasks are planned and subsequently assigned to test benches and resources. Moreover, FEVFLEX™ allows the UUT and its (sub-)components to be defined in a Bill of Materials (BOM) list – well known from benchmarking contexts – thereby supporting UUT life cycle control. In the final stage of the workflow, FEVFLEX™ handles test results from any source (benchmark or simulation data and measurement data obtained from the automation system and measuring devices), which are subsequently time-synchronized and pushed to data evaluation tools.
- Host system functionality as binding factor between test center and test benches:
FLEX Lab™ takes care of the overall data handling and parametrization of MORPHEE® automation systems at component and system test benches. At this layer, the FEVFLEX™ work orders are translated into the preparation of the automation system resulting in a base parametrization (including e.g. a measuring plan, channel limits, log lists, integration of measuring devices, test program).
Furthermore, FLEX Lab™ supports the management of MORPHEE® configurations, including back-up and versioning. Launching the execution of test programs at the test bench is secured via communication between the FLEX Lab™ host system and the MORPHEE® automation system. Finally, FLEX Lab™ pushes the measurement data acquired via the automation system to data evaluation tools, such as UniPlot.
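The BOM-based UUT definition mentioned at the enterprise layer can be pictured as a nested parts tree. The sketch below shows one minimal way to represent it and walk it; the structure and part names are illustrative, not the FEVFLEX™ data model.

```python
# Illustrative nested BOM for a UUT and its (sub-)components, with a
# recursive traversal as used for life cycle control. Names are examples.

def count_parts(bom: dict) -> int:
    """Count every node in the BOM tree, including the root."""
    return 1 + sum(count_parts(c) for c in bom.get("components", []))

pack_bom = {
    "part": "battery_pack",
    "components": [
        {"part": "module_1",
         "components": [{"part": "cell_1"}, {"part": "cell_2"}]},
        {"part": "bms_controller"},
    ],
}

print(count_parts(pack_bom))  # pack + module + 2 cells + controller
```

The same traversal pattern serves any per-component bookkeeping, e.g. attaching a test history or replacement record to each node.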
Finally, the workflow in FEVFLEX™ is supported by SCADA-based remote monitoring and run-time statistics:
- Remote monitoring supports immediate alerts and interventions in case of incidents
- Run-time statistics help facility managers sustainably eliminate weaknesses in their workflow
With the help of this comprehensive information system based on FEVFLEX™, an effective test bench usage of 95 percent was reached in FEV's battery durability test center.