Huaji Wang
Cranfield University
Publications
Featured research published by Huaji Wang.
IEEE/CAA Journal of Automatica Sinica | 2018
Chen Lv; Dongpu Cao; Yifan Zhao; Daniel J. Auger; Mark J.M. Sullman; Huaji Wang; Laura Millen Dutka; Lee Skrypchuk; Alexandros Mouzakitis
In present-day highly automated vehicles, there are occasions when the driving system disengages and the human driver is required to take over. This is of great importance to a vehicle's safety and ride comfort. In the U.S. state of California, the Autonomous Vehicle Testing Regulations require every manufacturer testing autonomous vehicles on public roads to submit an annual report summarizing the disengagements of the technology experienced during testing. On 1 January 2016, seven manufacturers submitted their first disengagement reports: Bosch, Delphi, Google, Nissan, Mercedes-Benz, Volkswagen, and Tesla Motors. This work analyses the data from these disengagement reports with the aim of gaining a better understanding of the situations in which a driver is required to take over, as this is potentially useful in improving Society of Automotive Engineers (SAE) Level 2 and Level 3 automation technologies. Disengagement events from testing are classified into groups based on their attributes, and the causes of disengagement are investigated and compared in detail. The mechanisms of the take-over transitions that occurred during disengagements, and the time they took, are also studied. Finally, recommendations for OEMs, manufacturers, and government organizations are discussed.
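A minimal sketch of the kind of analysis described above: grouping reported disengagement events by cause and summarizing take-over times. The file name, column names, and cause categories are assumptions for illustration, not the paper's actual data schema.

```python
# Hypothetical sketch: group disengagement records by cause category and
# summarize take-over (transition) times. The CSV layout, column names,
# and categories are assumptions, not the reports' actual format.
import pandas as pd

# Each row: one disengagement event reported by a manufacturer (hypothetical file).
reports = pd.read_csv("disengagement_reports.csv")

# Map free-text causes onto coarse groups (illustrative categories only).
cause_groups = {
    "software discrepancy": "system fault",
    "perception failure": "system fault",
    "unwanted manoeuvre": "planning/control",
    "construction zone": "environment",
    "other road users": "environment",
    "driver discomfort": "human factors",
}
reports["cause_group"] = reports["cause"].str.lower().map(cause_groups).fillna("other")

# Count events per manufacturer and cause group.
counts = reports.groupby(["manufacturer", "cause_group"]).size().unstack(fill_value=0)
print(counts)

# Summarize take-over transition time (seconds) per cause group.
print(reports.groupby("cause_group")["takeover_time_s"].describe())
```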
IEEE Transactions on Computational Social Systems | 2018
Yang Xing; Chen Lv; Zhaozhong Zhang; Huaji Wang; Xiaoxiang Na; Dongpu Cao; Efstathios Velenis; Fei-Yue Wang
Driver decisions and behaviors regarding the surrounding traffic are critical to traffic safety. It is important for an intelligent vehicle to understand driver behavior and assist with driving tasks according to the driver's status. In this paper, the consumer range camera Kinect is used to monitor drivers and identify driving tasks in a real vehicle. Specifically, seven common tasks performed by multiple drivers during driving are identified: normal driving; left-, right-, and rear-mirror checking; answering a mobile phone; texting on a mobile phone with one or both hands; and setting up an in-vehicle video device. The first four tasks are considered safe driving tasks, while the other three are regarded as dangerous and distracting. The driver behavior signals collected from the Kinect consist of color and depth images of the driver inside the vehicle cabin. In addition, 3-D head rotation angles and the upper-body (hand and arm, both sides) joint positions are recorded. The importance of these features for behavior recognition is then evaluated using random forests and the maximal information coefficient method. Next, a feedforward neural network (FFNN) is used to identify the seven tasks. Finally, model performance for task recognition is evaluated with different feature sets (body only, head only, and combined). The final detection result for the seven driving tasks across five participants achieved an average accuracy of greater than 80%, and the FFNN task detector proved to be an efficient model that can be implemented for real-time driver distraction and dangerous behavior recognition.
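An illustrative sketch of the pipeline outlined above, assuming pre-extracted per-frame feature vectors (head rotation angles plus joint positions) and task labels: rank features with a random forest, then classify the seven tasks with a feedforward neural network. File names, array shapes, and hyperparameters are assumptions, not the paper's configuration; the maximal-information-coefficient step is omitted.

```python
# Sketch of the feature-ranking + FFNN classification pipeline described above.
# All file names and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# X: per-frame feature vectors (e.g., 3-D head rotation angles and upper-body
# joint positions); y: task labels 0..6 (normal driving, mirror checks, etc.).
X = np.load("driver_features.npy")   # hypothetical file, shape (n_frames, n_features)
y = np.load("driver_labels.npy")     # hypothetical file, shape (n_frames,)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y)

# Feature importance via a random forest.
forest = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print("feature importances:", forest.feature_importances_)

# Feedforward neural network (MLP) for the seven-task classification.
ffnn = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
ffnn.fit(X_train, y_train)
print("test accuracy:", ffnn.score(X_test, y_test))
```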
Sensors | 2017
Yifan Zhao; Lorenz Görne; Iek-Man Yuen; Dongpu Cao; Mark J.M. Sullman; Daniel J. Auger; Chen Lv; Huaji Wang; Rebecca Matthias; Lee Skrypchuk; Alexandros Mouzakitis
Although at present legislation does not allow drivers in a Level 3 autonomous vehicle to engage in a secondary task, there may come a time when it does. Monitoring the behaviour of drivers engaging in various non-driving activities (NDAs) is crucial for deciding how well the driver will be able to take over control of the vehicle. One limitation of commonly used camera-based head tracking systems, which rely on facial features, is that sufficient features of the face must be visible; this limits the detectable angle of head movement, and thereby the measurable NDAs, unless multiple cameras are used. This paper proposes a novel orientation-sensor-based head tracking system comprising twin devices, one of which measures the movement of the vehicle while the other measures the absolute movement of the head. Measurement errors in the shaking and nodding axes were less than 0.4°, while the error in the rolling axis was less than 2°. Comparison with a camera-based system, through in-house and on-road tests, showed that the main advantage of the proposed system is its ability to detect angles larger than 20° in the shaking and nodding axes. Finally, a case study demonstrated that the shaking and nodding angles produced by the proposed system can effectively characterise drivers’ behaviour while engaged in the NDAs of chatting to a passenger and playing on a smartphone.
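A minimal sketch of the twin-sensor idea described above: the head angles relative to the cabin are obtained by subtracting the vehicle-mounted sensor's orientation from the head-mounted sensor's orientation. The variable names, the yaw/pitch/roll interface, and the simple per-axis subtraction (ignoring full 3-D rotation composition) are assumptions for illustration only, not the paper's implementation.

```python
# Illustrative twin-sensor sketch: relative head angles = head orientation
# minus vehicle orientation, per axis. Interface and names are assumptions.
import numpy as np

def wrap_deg(angle):
    """Wrap an angle (or array of angles) to the range [-180, 180) degrees."""
    return (np.asarray(angle) + 180.0) % 360.0 - 180.0

def relative_head_angles(head_ypr_deg, vehicle_ypr_deg):
    """Return (shaking/yaw, nodding/pitch, rolling/roll) of the head
    relative to the vehicle cabin, in degrees."""
    return wrap_deg(np.asarray(head_ypr_deg) - np.asarray(vehicle_ypr_deg))

# Example: the vehicle is turning while the driver looks towards the passenger seat.
head = [35.0, -5.0, 1.0]      # yaw, pitch, roll from the head-mounted sensor
vehicle = [10.0, 0.0, 0.5]    # yaw, pitch, roll from the vehicle-mounted sensor
print(relative_head_angles(head, vehicle))  # dominated by the shaking (yaw) angle
```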
Measurement | 2018
Yang Xing; Chen Lv; Dongpu Cao; Huaji Wang; Yifan Zhao
IEEE/CAA Journal of Automatica Sinica | 2018
Yang Xing; Chen Lv; Long Chen; Huaji Wang; Hong Wang; Dongpu Cao; Efstathios Velenis; Fei-Yue Wang
IEEE Intelligent Vehicles Symposium | 2018
Chen Lv; Huaji Wang; Dongpu Cao; Yifan Zhao; Mark J.M. Sullman; Daniel J. Auger; James Brighton; Rebecca Matthias; Lee Skrypchuk; Alexandros Mouzakitis
IET Intelligent Transport Systems | 2018
Yang Xing; Chen Lv; Huaji Wang; Dongpu Cao; Efstathios Velenis
IEEE/CAA Journal of Automatica Sinica | 2018
Hongyan Guo; Dongpu Cao; Hong Chen; Chen Lv; Huaji Wang; Siqi Yang
IEEE/ASME Transactions on Mechatronics | 2018
Chen Lv; Huaji Wang; Dongpu Cao; Yifan Zhao; Daniel J. Auger; Mark J.M. Sullman; Rebecca Matthias; Lee Skrypchuk; Alexandros Mouzakitis
IEEE Transactions on Vehicular Technology | 2018
Chao Lu; Huaji Wang; Chen Lv; Jianwei Gong; Junqiang Xi; Dongpu Cao