Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Shudong Sun is active.

Publication


Featured research published by Shudong Sun.


International Journal of Computer Integrated Manufacturing | 2015

Real-time information capturing and integration framework of the internet of manufacturing things

Yingfeng Zhang; Geng Zhang; Junqiang Wang; Shudong Sun; Shubin Si; Teng Yang

Currently, the typical challenges that manufacturing enterprises face are the lack of timely, accurate and consistent information about manufacturing things (resources) during manufacturing execution. Real-time information visibility and traceability allow decision makers to make better-informed shop-floor decisions. In this article, a real-time information capturing and integration architecture for the internet of manufacturing things (IoMT) is presented to provide a new paradigm by extending IoT techniques to the manufacturing field. Under this architecture and its key components, manufacturing things such as operators, machines, pallets and materials can be embedded with sensors so that they can interact with each other. Considering the challenges of processing a huge amount of real-time data into useful information and exchanging it among heterogeneous application systems, a Real-time Manufacturing Information Integration Service (RTMIIS) is designed to achieve seamless two-way connectivity and interoperability among the enterprise layer, the workshop floor layer and the machine layer. Finally, a near-life scenario is used to illustrate a proof-of-concept application of the proposed IoMT.
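
As a rough sketch of the capturing-and-integration idea (not the paper's RTMIIS design, whose interfaces are not given here), the Python fragment below models sensor-equipped manufacturing things publishing timestamped readings to a simple integration service that upper layers can query; all class and field names are hypothetical.

```python
import time
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Reading:
    # A timestamped observation from a sensor-equipped manufacturing thing.
    thing_id: str      # e.g. machine, pallet, or operator identifier
    kind: str          # e.g. "temperature", "location", "job_status"
    value: object
    timestamp: float = field(default_factory=time.time)

class IntegrationService:
    """Toy stand-in for a real-time integration layer: things push
    readings, enterprise and workshop applications pull the latest state."""
    def __init__(self):
        self._latest = defaultdict(dict)  # thing_id -> kind -> Reading

    def publish(self, reading: Reading) -> None:
        self._latest[reading.thing_id][reading.kind] = reading

    def snapshot(self, thing_id: str) -> dict:
        # Latest known state of one manufacturing thing.
        return {k: r.value for k, r in self._latest[thing_id].items()}

service = IntegrationService()
service.publish(Reading("machine-7", "job_status", "milling"))
service.publish(Reading("machine-7", "temperature", 61.5))
print(service.snapshot("machine-7"))  # {'job_status': 'milling', 'temperature': 61.5}
```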


Expert Systems With Applications | 2014

Fight sample degeneracy and impoverishment in particle filters: A review of intelligent approaches

Tiancheng Li; Shudong Sun; Tariq P. Sattar; Juan M. Corchado

During the last two decades there has been a growing interest in particle filtering (PF). However, PF suffers from two long-standing problems referred to as sample degeneracy and impoverishment. We review methods that are particularly efficient at Particle Distribution Optimization (PDO) to fight sample degeneracy and impoverishment, with an emphasis on intelligent approaches. These methods draw on Markov chain Monte Carlo methods, mean-shift algorithms, artificial-intelligence algorithms (e.g., particle swarm optimization, genetic algorithms and ant colony optimization), machine-learning approaches (e.g., clustering, splitting and merging) and their hybrids, forming a coherent standpoint from which to enhance the particle filter. The working mechanisms, interrelationships, pros and cons of these approaches are provided. In addition, approaches that are effective for dealing with high dimensionality are reviewed. While improving filter performance in terms of accuracy, robustness and convergence, the advanced techniques employed in PF often incur additional computational requirements that can in turn sacrifice the improvement obtained in real-life filtering. This fact, hidden in pure simulations, deserves the attention of the users and designers of new filters.
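
For context, degeneracy is commonly diagnosed with the effective sample size, and the baseline cure, multinomial resampling, is itself the source of impoverishment. Below is a minimal sketch of this standard diagnose-and-resample loop (not any specific PDO method from the review); the 0.5 threshold is a common convention, not the paper's.

```python
import numpy as np

def effective_sample_size(weights: np.ndarray) -> float:
    # Standard degeneracy diagnostic: N_eff = 1 / sum(w_i^2) for normalized weights.
    return 1.0 / np.sum(weights ** 2)

def multinomial_resample(particles: np.ndarray, weights: np.ndarray, rng) -> np.ndarray:
    # Baseline resampling: duplicates heavy particles and discards light ones,
    # which restores weight balance but reduces diversity (impoverishment).
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

rng = np.random.default_rng(0)
particles = rng.normal(size=1000)
weights = rng.exponential(size=1000)
weights /= weights.sum()
if effective_sample_size(weights) < 0.5 * len(particles):  # common threshold
    particles = multinomial_resample(particles, weights, rng)
    weights = np.full(len(particles), 1.0 / len(particles))
```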


Signal Processing | 2012

Deterministic resampling: Unbiased sampling to avoid sample impoverishment in particle filters

Tiancheng Li; Tariq P. Sattar; Shudong Sun

A novel resampling algorithm, called deterministic resampling, is proposed, which avoids the indiscriminate discarding of low-weighted particles and thereby avoids sample impoverishment. The diversity of particles is maintained by deterministically sampling support particles to improve residual resampling. A proof is given that our approach can be strictly unbiased and maintains the original state density distribution. Additionally, it is simple to implement in low-dimensional state-space applications. The core idea behind our approach is that it is important to (re)sample based on both the weights of particles and their state values, especially when the sample size is small. Simulations verify that our approach achieves better estimation accuracy than traditional methods with an affordable computational burden.
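
The paper's deterministic resampling itself is not reproduced here; as a hedged baseline, the sketch below implements classical residual resampling, which the abstract says the method improves on by also choosing the residual particles deterministically based on their state values.

```python
import numpy as np

def residual_resample(particles: np.ndarray, weights: np.ndarray, rng) -> np.ndarray:
    """Residual resampling: keep floor(N * w_i) copies of each particle
    deterministically, then fill the remaining slots by random draws from
    the residual weights. Deterministic resampling replaces this last
    random step with a deterministic, state-aware selection."""
    n = len(particles)
    counts = np.floor(n * weights).astype(int)       # guaranteed copies
    residual = n * weights - counts                   # leftover weight mass
    n_rest = n - counts.sum()
    if n_rest > 0:
        residual /= residual.sum()
        counts += rng.multinomial(n_rest, residual)   # random residual draws
    return np.repeat(particles, counts)

rng = np.random.default_rng(1)
p = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.1, 0.2, 0.3, 0.4])
print(residual_resample(p, w, rng))
```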


Computers & Industrial Engineering | 2014

Multi-agent based real-time production scheduling method for radio frequency identification enabled ubiquitous shopfloor environment

Yingfeng Zhang; George Q. Huang; Shudong Sun; Teng Yang

Highlights: a multi-agent real-time scheduling architecture is proposed to close the loop of production planning and control; four types of agents are designed to implement real-time scheduling according to real-time feedback; new crossover and mutation operations are designed in the Real-time Scheduling Agent (RSA) to improve efficiency.

The lack of timely feedback of shopfloor information during the manufacturing execution stage leads to significant difficulties in achieving real-time production scheduling. To address this problem, an overall architecture of multi-agent based real-time production scheduling is presented to close the loop of production planning and control. Several contributions are significant. Firstly, wireless devices such as radio frequency identification (RFID) readers are deployed at value-adding points in a ubiquitous shopfloor environment to form a Machine Agent for the collection and processing of real-time shopfloor data. Secondly, a Capability Evaluation Agent is designed to optimally assign tasks to the involved machines at the process planning stage based on the real-time utilization ratio of each machine. The third contribution is a Real-time Scheduling Agent with manufacturing-task scheduling/re-scheduling strategies and methods that respond to real-time feedback. Fourthly, a Process Monitor Agent model is designed for tracking and tracing manufacturing execution based on a critical-event structure. Finally, a case is used to demonstrate the proposed multi-agent based real-time production scheduling models and methods.
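
As an illustrative sketch only (the paper's agent protocols are not specified here), the fragment below mimics three of the four agent roles: a Machine Agent exposing real-time utilization, a Capability Evaluation Agent assigning tasks to the least-utilized machine, and a Real-time Scheduling Agent closing the feedback loop. The Process Monitor Agent is omitted, and all names and numbers are hypothetical.

```python
class MachineAgent:
    """Wraps RFID-captured shopfloor data for one machine (hypothetical)."""
    def __init__(self, machine_id: str):
        self.machine_id = machine_id
        self.busy_time = 0.0
        self.elapsed_time = 1.0   # avoid division by zero in the toy example

    def utilization(self) -> float:
        # Real-time utilization derived from captured shopfloor events.
        return self.busy_time / self.elapsed_time

class CapabilityEvaluationAgent:
    def assign(self, task, machines):
        # Assign the task to the least-utilized machine.
        return min(machines, key=lambda m: m.utilization())

class RealTimeSchedulingAgent:
    def __init__(self, evaluator):
        self.evaluator = evaluator

    def schedule(self, tasks, machines):
        plan = {}
        for task in tasks:
            m = self.evaluator.assign(task, machines)
            plan[task] = m.machine_id
            m.busy_time += 1.0    # crude feedback so later tasks spread out
        return plan

machines = [MachineAgent(f"M{i}") for i in range(3)]
scheduler = RealTimeSchedulingAgent(CapabilityEvaluationAgent())
print(scheduler.schedule(["job1", "job2", "job3", "job4"], machines))
```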


IEEE Transactions on Reliability | 2012

Integrated Importance Measure of Component States Based on Loss of System Performance

Shubin Si; Hongyan Dui; Xibin Zhao; Shenggui Zhang; Shudong Sun

This paper focuses on the integrated importance measure (IIM) of component states based on the loss of system performance. To describe the impact of each component state, we first introduce the performance function of the multi-state system. We then present the definition of the IIM of component states, demonstrate its physical meaning, and analyze the relationships between IIM and the Griffith, Wu, and Natvig importance measures. Secondly, we present the evaluation method of IIM for multi-state systems. Thirdly, the characteristics of the IIM of component states are discussed. Finally, we present a numerical example and an application to an offshore oil and gas production system to verify the proposed method. The results show that 1) the IIM of component states depends not only on the probability distributions and transition intensities of the states of the object component, but also on the change in system performance under the change of the state distribution of the object component; and 2) IIM can be used to identify the key state of a component that most affects system performance.
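
The exact IIM formula is not reproduced here; the toy computation below only illustrates the underlying idea, measuring how forcing a component out of a state changes the expected performance of a small multi-state series system. All numbers are made up for illustration.

```python
import itertools
import numpy as np

# Toy multi-state series system: system performance is the minimum of the
# two component performance levels. State-to-performance mapping and state
# probabilities are illustrative numbers, not from the paper.
perf_levels = {0: 0.0, 1: 0.5, 2: 1.0}     # performance of states 0, 1, 2
p1 = np.array([0.1, 0.3, 0.6])              # state distribution, component 1
p2 = np.array([0.2, 0.2, 0.6])              # state distribution, component 2

def expected_performance(pa, pb):
    return sum(pa[i] * pb[j] * min(perf_levels[i], perf_levels[j])
               for i, j in itertools.product(range(3), repeat=2))

base = expected_performance(p1, p2)
# Loss of expected system performance if component 1 is forced out of its
# best state (state 2 mass shifted to state 0): the kind of state-level
# impact that IIM is designed to rank.
degraded = p1.copy()
degraded[0] += degraded[2]
degraded[2] = 0.0
print(base, base - expected_performance(degraded, p2))
```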


Electronics Letters | 2013

Adapting sample size in particle filters through KLD-resampling

Tiancheng Li; Shudong Sun; Tariq P. Sattar

An adaptive resampling method is provided that determines the number of particles to resample so that the Kullback-Leibler distance (KLD) between the distribution of particles before and after resampling does not exceed a pre-specified error bound. The basis of the method is the same as Fox's KLD-sampling, but it is implemented differently. KLD-sampling assumes that samples come from the true posterior distribution and ignores any mismatch between the true and the proposal distribution. In contrast, here the KLD measure is incorporated into resampling, where the distribution of interest is exactly the posterior distribution. That is to say, for sample-size adjustment it is more theoretically rigorous and practically flexible to measure the fit of the distribution represented by weighted particles based on the KLD during resampling than during sampling. Simulations of target tracking demonstrate the efficiency of the method.
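
A minimal sketch of the idea, assuming a simple 1-D grid for counting occupied bins; the sample-size bound is Fox's KLD bound (via the Wilson-Hilferty approximation), and the epsilon/delta values and bin width are illustrative, not the paper's settings.

```python
import numpy as np
from scipy.stats import norm

def kld_sample_size(k: int, epsilon: float = 0.05, delta: float = 0.01) -> int:
    """Fox's KLD bound: number of samples so that the KL distance between
    the sample-based estimate and the true distribution over k occupied
    bins stays below epsilon with probability 1 - delta."""
    if k < 2:
        return 1
    z = norm.ppf(1.0 - delta)
    a = 2.0 / (9.0 * (k - 1))
    return int(np.ceil((k - 1) / (2.0 * epsilon) * (1.0 - a + np.sqrt(a) * z) ** 3))

def kld_resample(particles, weights, bin_width, rng):
    # Count the grid bins occupied by the particle support, derive the
    # sample-size bound, then resample that many particles.
    k = len(np.unique(np.floor(particles / bin_width)))
    n = kld_sample_size(k)
    idx = rng.choice(len(particles), size=n, p=weights)
    return particles[idx]

rng = np.random.default_rng(2)
particles = rng.normal(size=5000)
weights = np.full(5000, 1.0 / 5000)
print(len(kld_resample(particles, weights, bin_width=0.1, rng=rng)))
```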


International Journal of Production Research | 2009

Theory of constraints product mix optimisation based on immune algorithm

Junbiao Wang; Shudong Sun; Shubin Si; H. Yang

Product mix optimisation is one of the most fundamental problems in manufacturing enterprises. As an important component of the theory of constraints (TOC), product mix optimisation has been solved with the TOC heuristic (TOCh) and some intelligent search algorithms, yet these approaches often cannot effectively obtain a good solution, especially for large-scale product mix optimisation. Aiming at this problem, the present paper contributes the following. Firstly, a model of TOC product mix optimisation, which identifies and exploits the capacity-constrained resource (CCR) to maximise system throughput, is put forward and simplified by cutting down some constraints of non-CCRs. Secondly, an intelligent optimisation approach based on an immune algorithm (IA) and TOC is presented to search for optimal solutions to product mix optimisation, whether the instance is small-scale or large-scale. Thirdly, the immune mechanisms, such as the immune response mechanism, immune self-adaptive regulation and vaccination, are studied in detail; these not only greatly improve the searching ability and adaptability, but also evidently increase the global convergence rate of immune evolution. Fourthly, the proposed approach is implemented and applied to both small-scale and large-scale product mix optimisation. Finally, a comparison between the proposed approach and existing approaches is made. Simulation results show that the proposed approach is superior to existing approaches such as the TOCh, revised TOCh, integer linear programming (ILP), tabu search (TS) and genetic algorithms (GA).
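
For contrast with the immune algorithm, the sketch below implements the classical TOC heuristic baseline mentioned in the abstract: rank products by throughput per unit of CCR time and fill the CCR capacity greedily. The data are made-up illustrative numbers.

```python
# Classical TOC heuristic (TOCh) baseline: products are ranked by profit
# per minute on the capacity-constrained resource (CCR), and CCR capacity
# is allocated greedily in that order, capped by demand.
products = {            # name: (profit per unit, CCR minutes per unit, demand)
    "A": (60.0, 10.0, 100),
    "B": (50.0, 5.0, 80),
    "C": (40.0, 8.0, 50),
}
ccr_capacity = 1200.0   # available CCR minutes

mix, remaining = {}, ccr_capacity
for name, (profit, minutes, demand) in sorted(
        products.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    qty = min(demand, int(remaining // minutes))
    mix[name] = qty
    remaining -= qty * minutes
print(mix, "unused CCR minutes:", remaining)
```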


Signal Processing | 2016

Algorithm design for parallel implementation of the SMC-PHD filter

Tiancheng Li; Shudong Sun; Miodrag Bolic; Juan M. Corchado

The sequential Monte Carlo (SMC) implementation of the probability hypothesis density (PHD) filter suffers from low computational efficiency, since a large number of particles are often required, especially when there are many targets and dense clutter. To speed up the computation, an algorithmic framework for parallel SMC-PHD filtering based on multiple processors is proposed, built on a centralized distributed system consisting of one central unit (CU) and several independent processing elements (PEs). The algorithm fully parallelizes all four steps of the SMC-PHD filter, and the computational load is approximately equal among the parallel processors, yielding a high parallelization benefit when there are multiple targets and dense clutter. The parallelization is theoretically unbiased, as it provides the same result as the serial implementation without introducing any approximation. Experiments on multi-core computers demonstrate that the parallel implementation gains considerable speedup over the serial implementation of the same algorithm.
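
The CU/PE framework itself is not reconstructed here; as a hedged illustration of the load-balancing idea, the sketch below splits one step (a stand-in for the per-particle weight update) into near-equal chunks across worker processes. The likelihood function is a placeholder, not the SMC-PHD update.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def update_weights(chunk):
    # Stand-in for per-particle likelihood evaluation; a real SMC-PHD
    # filter would evaluate measurement likelihoods against all targets.
    particles, measurement = chunk
    return np.exp(-0.5 * (particles - measurement) ** 2)

def parallel_update(particles, measurement, n_workers=4):
    # Split particles into near-equal chunks so the load on each processing
    # element is approximately balanced, mirroring the framework's goal.
    chunks = [(c, measurement) for c in np.array_split(particles, n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return np.concatenate(list(pool.map(update_weights, chunks)))

if __name__ == "__main__":
    particles = np.random.default_rng(3).normal(size=100_000)
    weights = parallel_update(particles, measurement=0.5)
    print(weights.sum())
```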


IEEE Transactions on Reliability | 2015

Semi-Markov Process-Based Integrated Importance Measure for Multi-State Systems

Hongyan Dui; Shubin Si; Ming J. Zuo; Shudong Sun

Importance measures in reliability engineering are used to identify the weak components of a system and to signify the roles of components in contributing to the proper functioning of the system. Recently, an integrated importance measure (IIM) has been proposed to evaluate how the transition of component states affects system performance, based on the probability distributions and transition rates of component states. In the system operation phase, the bathtub curve describes how the transition rate of component states changes with time, and its three phases can be described by three different Weibull distributions; the behavior of a system under such distributions can be modeled by a semi-Markov process. Based on the reported IIM equations of component states, this paper therefore studies how the transition of component states affects system performance under a semi-Markov process. The measure can provide useful information for preventive actions (such as monitoring enhancement and construction improvement) and support for improving system performance. Finally, a simple numerical example is presented to illustrate the utilization of the proposed method.
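
For reference, the "three different Weibull distributions" remark maps onto the three regimes of the Weibull hazard rate; the short LaTeX note below is standard reliability background, not taken from the paper.

```latex
% Weibull hazard rate with shape \beta and scale \eta; the three bathtub
% phases correspond to three shape-parameter regimes.
\[
  h(t) = \frac{\beta}{\eta}\left(\frac{t}{\eta}\right)^{\beta-1},
  \qquad
  \begin{cases}
    \beta < 1 & \text{decreasing hazard (early failures)}\\
    \beta = 1 & \text{constant hazard (random failures, exponential case)}\\
    \beta > 1 & \text{increasing hazard (wear-out)}
  \end{cases}
\]
```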


Expert Systems With Applications | 2011

Identifying product failure rate based on a conditional Bayesian network classifier

Zhiqiang Cai; Shudong Sun; Shubin Si; Bernard Yannou

Research highlights: CBN introduces the conditional independence relationships among attribute variables; CBN provides an effective approach to classify the failure rate rank of products; CBN increases classification accuracy; CBN strikes an acceptable balance between classifier complexity and performance.

To identify the product failure rate grade under diverse configuration and operation conditions, a new conditional Bayesian network (CBN) model is proposed. By indicating the conditional independence relationships between attribute variables given the target variable, the model provides an effective approach to classifying the grade of failure rate. Furthermore, on the basis of the CBN model, the procedure for building a product failure rate grade classifier is elaborated with modeling and application. Finally, a case study is carried out, and the results show that, compared with other Bayesian network classifiers and the traditional decision tree C4.5, the CBN model not only increases the total classification accuracy but also reduces the complexity of the network structure.
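
The CBN structure itself is not given here; as a hedged baseline, the sketch below fits the naive Bayes classifier whose attribute-independence assumption the CBN relaxes by adding arcs among attribute variables. Data, dimensions, and variable names are made up for illustration.

```python
import numpy as np

def fit_naive_bayes(X, y, n_classes, n_values):
    # Naive Bayes baseline for discrete failure-rate-grade classification:
    # attributes are assumed independent given the class, which is exactly
    # the assumption the CBN model relaxes.
    n, d = X.shape
    prior = np.bincount(y, minlength=n_classes) / n
    # cond[c, j, v] = P(attribute j = v | class c), Laplace-smoothed.
    cond = np.ones((n_classes, d, n_values))
    for xi, yi in zip(X, y):
        for j, v in enumerate(xi):
            cond[yi, j, v] += 1
    cond /= cond.sum(axis=2, keepdims=True)
    return prior, cond

def predict(x, prior, cond):
    scores = np.log(prior) + sum(np.log(cond[:, j, v]) for j, v in enumerate(x))
    return int(np.argmax(scores))

X = np.array([[0, 1], [1, 1], [0, 0], [1, 0]])   # two binary attributes
y = np.array([0, 1, 0, 1])                        # failure-rate grade
prior, cond = fit_naive_bayes(X, y, n_classes=2, n_values=2)
print(predict(np.array([1, 1]), prior, cond))
```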

Collaboration


Dive into Shudong Sun's collaboration.

Top Co-Authors

Shubin Si
Northwestern Polytechnical University

Zhiqiang Cai
Northwestern Polytechnical University

Tiancheng Li
London South Bank University

Junqiang Wang
Northwestern Polytechnical University

Hongyan Dui
Northwestern Polytechnical University

Yingfeng Zhang
Northwestern Polytechnical University

Ning Wang
Northwestern Polytechnical University

Tariq P. Sattar
London South Bank University

Javier Bajo
Technical University of Madrid