Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Xingjun Ma is active.

Publication


Featured research published by Xingjun Ma.


computer-based medical systems | 2017

Design and Evaluation of a Virtual Reality Simulation Module for Training Advanced Temporal Bone Surgery

Sudanthi N. R. Wijewickrema; Bridget Copson; Yun Zhou; Xingjun Ma; Robert Briggs; James Bailey; Gregor Kennedy; Stephen O'Leary

Surgical education has traditionally relied on cadaveric dissection and supervised training in the operating theatre. However, both of these forms of training have become inefficient due to issues such as the scarcity of cadavers and competing priorities taking up surgeons' time. Within this context, computer-based simulations such as virtual reality have gained popularity as supplemental modes of training. Virtual reality simulation offers repeated practice in a risk-free environment where standardised surgical training modules can be developed, along with systems that provide automated guidance and assessment. In this paper, we discuss the design and evaluation of such a training module, specifically aimed at training an advanced temporal bone procedure, namely cochlear implant surgery.


international joint conference on artificial intelligence | 2017

Adversarial generation of real-time feedback with neural networks for simulation-based training

Xingjun Ma; Sudanthi N. R. Wijewickrema; Shuo Zhou; Yun Zhou; Zakaria Mhammedi; Stephen O'Leary; James Bailey

Simulation-based training (SBT) is gaining popularity as a low-cost and convenient training technique in a vast range of applications. However, for an SBT platform to be fully utilized as an effective training tool, it is essential that feedback on performance is provided automatically in real time during training. The aim of this paper is to develop an efficient and effective feedback generation method for the provision of real-time feedback in SBT. Existing methods either have low effectiveness in improving novice skills or suffer from low efficiency, which prevents them from being used in real time. In this paper, we propose a neural network based method that generates feedback using the adversarial technique. The proposed method utilizes a bounded adversarial update to minimize an L1-regularized loss via back-propagation. We empirically show that the proposed method can be used to generate simple yet effective feedback. It was also observed to have high effectiveness and efficiency compared to existing methods, making it a promising option for real-time feedback generation in SBT.
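
As a rough illustration of the idea, the sketch below performs a bounded adversarial update on a novice's performance-feature vector, minimising a cross-entropy loss toward the expert class plus an L1 penalty via back-propagation; the resulting sparse, bounded perturbation can be read off as feedback. The skill classifier `model`, the feature tensor `x_novice`, and all hyper-parameters are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch, assuming a trained skill classifier `model` that maps a
# (1, d) performance-feature tensor to expert/novice logits.
import torch
import torch.nn.functional as F

def generate_feedback(model, x_novice, target_class=1, lam=0.1,
                      bound=0.5, steps=100, lr=0.05):
    """Perturb a novice feature vector toward the expert class.

    The L1 penalty keeps the perturbation sparse, so the suggested feedback
    touches only a few performance features; clamping keeps each change
    within a plausible range. All names/values here are illustrative.
    """
    delta = torch.zeros_like(x_novice, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    target = torch.tensor([target_class])

    for _ in range(steps):
        optimizer.zero_grad()
        logits = model(x_novice + delta)
        # Cross-entropy toward the expert class plus an L1 sparsity penalty.
        loss = F.cross_entropy(logits, target) + lam * delta.abs().sum()
        loss.backward()
        optimizer.step()
        # Bounded update: keep each feature change within +/- bound.
        with torch.no_grad():
            delta.clamp_(-bound, bound)

    return (x_novice + delta).detach(), delta.detach()
```

The non-zero entries of the returned perturbation indicate which features a trainee should change, and by roughly how much, to move toward expert-like behaviour.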


computer-based medical systems | 2017

Simulation for Training Cochlear Implant Electrode Insertion

Xingjun Ma; Sudanthi N. R. Wijewickrema; Yun Zhou; Bridget Copson; James Bailey; Gregor Kennedy; Stephen O'Leary

Cochlear implant surgery is performed to restore hearing in patients with a range of hearing disorders. To optimise hearing outcomes, trauma during the insertion of a cochlear implant electrode has to be minimised. Factors that contribute to the degree of trauma caused during surgery include: the location of the electrode, type of electrode, and the competence level of the surgeon. Surgical competence depends on knowledge of anatomy and experience in a range of situations, along with technical skills. Thus, during training, a surgeon should be exposed to a range of anatomical variations, where he/she can learn and practice the intricacies of the surgical procedure, as well as explore different implant options and consequences thereof. Virtual reality simulation offers a versatile platform on which such training can be conducted. In this paper, we discuss a prototype implementation for the visualisation and analysis of electrode trajectories in relation to anatomical variation, prior to its inclusion in a virtual reality training module for cochlear implant surgery.


medical image computing and computer assisted intervention | 2017

Providing Effective Real-Time Feedback in Simulation-Based Surgical Training

Xingjun Ma; Sudanthi N. R. Wijewickrema; Yun Zhou; Shuo Zhou; Stephen O'Leary; James Bailey

Virtual reality simulation is becoming popular as a training platform in surgical education. However, one important aspect of simulation-based surgical training that has not received much attention is the provision of automated real-time performance feedback to support the learning process. Performance feedback is actionable advice that improves novice behaviour. In simulation, automated feedback is typically extracted from prediction models trained using data mining techniques. Existing techniques suffer from either low effectiveness or low efficiency, resulting in their inability to be used in real time. In this paper, we propose a random forest based method that finds a balance between effectiveness and efficiency. Experimental results in a temporal bone surgery simulation show that the proposed method is able to extract highly effective feedback at a high level of efficiency.
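
A minimal sketch of how a random forest skill classifier could be queried for feedback is shown below: it suggests the single feature change that most increases the predicted probability of expert behaviour. The feature indexing, candidate values, and helper name are assumptions for illustration and not the paper's exact algorithm.

```python
# Hedged sketch: querying a scikit-learn random forest for actionable feedback.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def best_single_feature_feedback(forest, x, candidate_values):
    """Return (feature_index, new_value, gain) with the largest gain in P(expert).

    `candidate_values[j]` is a list of plausible settings for feature j,
    e.g. split thresholds harvested from the forest or values supplied by
    an expert. Assumes classes are labelled 0 = novice, 1 = expert.
    """
    base = forest.predict_proba(x.reshape(1, -1))[0, 1]
    best = (None, None, 0.0)
    for j, values in candidate_values.items():
        for v in values:
            x_mod = x.copy()
            x_mod[j] = v
            gain = forest.predict_proba(x_mod.reshape(1, -1))[0, 1] - base
            if gain > best[2]:
                best = (j, v, gain)
    return best

# Usage (illustrative): fit on labelled trials, then query a live sample.
# forest = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
# feature, value, gain = best_single_feature_feedback(forest, x_live, cands)
```

Because a fitted forest can be evaluated in microseconds per candidate, this style of lookup is cheap enough to run during a simulation session.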


artificial intelligence in education | 2018

Providing Automated Real-Time Technical Feedback for Virtual Reality Based Surgical Training: Is the Simpler the Better?

Sudanthi N. R. Wijewickrema; Xingjun Ma; Patorn Piromchai; Robert Briggs; James Bailey; Gregor Kennedy; Stephen O’Leary

In surgery, where mistakes have the potential for dire consequences, proper training plays a crucial role. Surgical training has traditionally relied upon experienced surgeons mentoring trainees through cadaveric dissection and operating theatre practice. However, with the growing demand for more surgeons and more efficient training programs, it has become necessary to employ supplementary forms of training such as virtual reality simulation. The use of such simulations as autonomous training platforms is limited by the extent to which they can provide automated performance feedback. Recent work has focused on overcoming this issue by developing algorithms to provide feedback that emulates the advice of human experts. These algorithms can mainly be categorized into rule-based and machine learning-based methods, and they have typically been validated through user studies against controls that received no feedback. To our knowledge, the two types of feedback generation methods have not yet been directly compared against each other. To address this gap, we introduce a rule-based method of providing technical feedback in virtual reality simulation-based temporal bone surgery, implement a machine learning based method that has been shown to outperform other similar methods, and compare their performance in teaching surgical skills in practice through a user study. We show that simpler rule-based methods can be equally or more effective in teaching surgical skills when compared to more complex methods of feedback generation.
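
For contrast with the learning-based approach, a rule-based checker can be as simple as a list of expert-authored conditions evaluated against each incoming metric sample. The sketch below is a hypothetical illustration; the metrics, thresholds, and advice strings are assumptions, not the rules used in the study.

```python
# Hedged sketch of a rule-based technical-feedback checker for a drilling
# simulator. All metrics, thresholds, and messages are illustrative.
from dataclasses import dataclass

@dataclass
class DrillSample:
    force: float                      # applied force (N)
    speed: float                      # drill stroke speed (mm/s)
    distance_to_facial_nerve: float   # distance to a critical structure (mm)

# Each rule is a (condition, advice) pair authored with domain experts.
RULES = [
    (lambda s: s.force > 1.5 and s.distance_to_facial_nerve < 2.0,
     "Reduce drilling force near the facial nerve."),
    (lambda s: s.speed > 5.0,
     "Slow down: maintain a steady, controlled stroke."),
]

def rule_based_feedback(sample: DrillSample) -> list[str]:
    """Return the advice strings whose conditions fire for this sample."""
    return [msg for cond, msg in RULES if cond(sample)]

# Usage: evaluate each sample streamed from the simulator in real time.
# print(rule_based_feedback(DrillSample(force=2.0, speed=3.0,
#                                       distance_to_facial_nerve=1.2)))
```

The appeal of this style is transparency: every piece of advice can be traced back to an explicit, expert-authored condition.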


international conference on learning representations | 2018

Characterizing Adversarial Subspaces Using Local Intrinsic Dimensionality

Xingjun Ma; Bo Li; Yisen Wang; Sarah M. Erfani; Sudanthi N. R. Wijewickrema; Grant Schoenebeck; Michael E. Houle; Dawn Song; James Bailey


computer vision and pattern recognition | 2018

Iterative Learning With Open-Set Noisy Labels

Yisen Wang; Weiyang Liu; Xingjun Ma; James Bailey; Hongyuan Zha; Le Song; Shu-Tao Xia


national conference on artificial intelligence | 2017

Unbiased Multivariate Correlation Analysis

Yisen Wang; Simone Romano; Vinh Nguyen; James Bailey; Xingjun Ma; Shu-Tao Xia


international conference on machine learning | 2018

Dimensionality-Driven Learning with Noisy Labels

Xingjun Ma; Yisen Wang; Michael E. Houle; Shuo Zhou; Sarah M. Erfani; Shu-Tao Xia; Sudanthi N. R. Wijewickrema; James Bailey


uncertainty in artificial intelligence | 2018

Learning Deep Hidden Nonlinear Dynamics from Aggregate Data

Yisen Wang; Bo Dai; Lingkai Kong; Hongyuan Zha; Xingjun Ma; Sarah M. Erfani; James Bailey; Le Song; Shu-Tao Xia

Collaboration


Dive into Xingjun Ma's collaborations.

Top Co-Authors

James Bailey

University of Melbourne

Yun Zhou

University of Melbourne

Shuo Zhou

University of Melbourne
