Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Pengfei Shao is active.

Publication


Featured research published by Pengfei Shao.


Annals of Biomedical Engineering | 2014

Designing a Wearable Navigation System for Image-Guided Cancer Resection Surgery

Pengfei Shao; Houzhu Ding; Jinkun Wang; Peng Liu; Qiang Ling; Jiayu Chen; Junbin Xu; Shiwu Zhang; Ronald X. Xu

A wearable surgical navigation system is developed for intraoperative imaging of the surgical margin in cancer resection surgery. The system consists of an excitation light source, a monochromatic CCD camera, a host computer, and a wearable headset unit in either of two modes: head-mounted display (HMD) or Google Glass. In the HMD mode, a CMOS camera is installed on a personal cinema system to capture the surgical scene in real time and transmit the image to the host computer through a USB port. In the Google Glass mode, a wireless connection is established between the glass and the host computer for image acquisition and data transport. A software program written in Python calls OpenCV functions for image calibration, co-registration, fusion, and display with augmented reality. The imaging performance of the surgical navigation system is characterized in a tumor-simulating phantom. Image-guided surgical resection is demonstrated in an ex vivo tissue model. Surgical margins identified by the wearable navigation system are coincident with those acquired by a standard small-animal imaging system, indicating the technical feasibility of intraoperative surgical margin detection. The proposed surgical navigation system combines the sensitivity and specificity of a fluorescence imaging system with the mobility of a wearable goggle. It can potentially be used by a surgeon to identify residual tumor foci and reduce the risk of recurrent disease without interfering with the regular resection procedure.
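The abstract above describes a Python/OpenCV pipeline for calibration, co-registration, and fusion. As a minimal NumPy-only sketch of the same idea (not the authors' code), the snippet below warps a fluorescence frame into the scene camera's coordinates using an assumed, precomputed homography and blends it in as a green pseudo-color overlay:

```python
import numpy as np

def warp_homography(img, H, out_shape):
    """Nearest-neighbor inverse warp of a grayscale image by a 3x3 homography."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    dst = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)]).astype(float)
    src = np.linalg.inv(H) @ dst          # map output pixels back to source
    sx = np.round(src[0] / src[2]).astype(int)
    sy = np.round(src[1] / src[2]).astype(int)
    ok = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out = np.zeros(out_shape, img.dtype)
    out[ys.ravel()[ok], xs.ravel()[ok]] = img[sy[ok], sx[ok]]
    return out

def fuse(scene_bgr, fluo_gray, H, alpha=0.6, thresh=40):
    """Blend a co-registered fluorescence frame into the scene as a green overlay."""
    warped = warp_homography(fluo_gray, H, scene_bgr.shape[:2])
    overlay = np.zeros_like(scene_bgr)
    overlay[..., 1] = warped              # pseudo-color: green channel
    blend = ((1 - alpha) * scene_bgr + alpha * overlay).astype(np.uint8)
    fused = scene_bgr.copy()
    mask = warped > thresh                # overlay only above the noise floor
    fused[mask] = blend[mask]
    return fused

# Synthetic demo: gray scene, one bright fluorescent spot, identity homography.
scene = np.full((240, 320, 3), 90, np.uint8)
fluo = np.zeros((240, 320), np.uint8)
yy, xx = np.ogrid[:240, :320]
fluo[(xx - 160) ** 2 + (yy - 120) ** 2 <= 30 ** 2] = 255
fused = fuse(scene, fluo, np.eye(3))
```

In practice the homography would come from a fiducial-based calibration between the fluorescence and scene cameras; here it is the identity purely for the synthetic demo.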


PLOS ONE | 2016

A Wearable Goggle Navigation System for Dual-Mode Optical and Ultrasound Localization of Suspicious Lesions: Validation Studies Using Tissue-Simulating Phantoms and an Ex Vivo Human Breast Tissue Model.

Zeshu Zhang; Jing Pei; Dong Wang; Qi Gan; Jian Ye; Jian Yue; Benzhong Wang; Stephen P. Povoski; Edward W. Martin; Charles L. Hitchcock; Alper Yilmaz; Michael F. Tweedle; Pengfei Shao; Ronald X. Xu

Surgical resection remains the primary curative treatment for many early-stage cancers, including breast cancer. The development of intraoperative guidance systems for identifying all sites of disease and improving the likelihood of complete surgical resection is an area of active ongoing research, as this can decrease the need for subsequent additional surgical procedures. We develop a wearable goggle navigation system for dual-mode optical and ultrasound imaging of suspicious lesions. The system consists of a light source module, a monochromatic CCD camera, an ultrasound system, a Google Glass, and a host computer. It is tested in tissue-simulating phantoms and an ex vivo human breast tissue model. Our experiments demonstrate that the surgical navigation system provides useful guidance for localization and core needle biopsy of a simulated tumor within the tissue-simulating phantom, as well as core needle biopsy and subsequent excision of Indocyanine Green (ICG)-fluorescing sentinel lymph nodes. Our experiments support the contention that this wearable goggle navigation system can be very useful to, and fully integrated by, the surgeon for optimizing many aspects of oncologic surgery. Further engineering optimization and additional in vivo clinical validation are necessary before such a surgical navigation system can be fully realized in the everyday clinical setting.


Journal of Biomedical Optics | 2015

Three-dimensional fused deposition modeling of tissue-simulating phantom for biomedical optical imaging

Erbao Dong; Zuhua Zhao; Minjie Wang; Yanjun Xie; Shidi Li; Pengfei Shao; Liuquan Cheng; Ronald X. Xu

Abstract. Biomedical optical devices are widely used for clinical detection of various tissue anomalies. However, optical measurements have limited accuracy and traceability, partially owing to the lack of effective calibration methods that simulate actual tissue conditions. To facilitate standardized calibration and performance evaluation of medical optical devices, we develop a three-dimensional fused deposition modeling (FDM) technique for freeform fabrication of tissue-simulating phantoms. The FDM system uses transparent gel wax as the base material, titanium dioxide (TiO2) powder as the scattering ingredient, and graphite powder as the absorption ingredient. The ingredients are preheated, mixed, and deposited at the designated ratios layer by layer to simulate tissue structural and optical heterogeneities. By printing sections of a human brain model based on magnetic resonance images, we demonstrate the capability of simulating tissue structural heterogeneities. By measuring the optical properties of multilayered phantoms and comparing them with numerical simulation, we demonstrate the feasibility of simulating tissue optical properties. By creating a rat head phantom with embedded vasculature, we demonstrate the potential for mimicking physiologic processes of a living system.
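The per-layer mixing described above can be sketched under the commonly used first-order assumption that reduced scattering and absorption scale linearly with dopant mass fraction. The coefficients and function names below are hypothetical placeholders for illustration, not values from the paper:

```python
# Hedged sketch: TiO2 sets scattering, graphite sets absorption, mixed into
# gel wax at designated per-layer ratios. K_SCAT and K_ABS are hypothetical.
K_SCAT = 50.0  # mm^-1 of mu_s' per unit TiO2 mass fraction (hypothetical)
K_ABS = 20.0   # mm^-1 of mu_a  per unit graphite mass fraction (hypothetical)

def layer_recipe(mu_s_prime, mu_a):
    """Mass fractions (TiO2, graphite) needed to hit one layer's target optics."""
    return mu_s_prime / K_SCAT, mu_a / K_ABS

# Two-layer phantom: the deep layer scatters and absorbs twice as strongly.
targets = [(1.0, 0.01), (2.0, 0.02)]  # (mu_s', mu_a) in mm^-1, per layer
recipes = [layer_recipe(ms, ma) for ms, ma in targets]
```

A real calibration would fit these coefficients from measured phantoms, since linearity breaks down at high dopant loading.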


PLOS ONE | 2016

Benchtop and Animal Validation of a Projective Imaging System for Potential Use in Intraoperative Surgical Guidance.

Qi Gan; Dong Wang; Jian Ye; Zeshu Zhang; Xinrui Wang; Chuanzhen Hu; Pengfei Shao; Ronald X. Xu

We propose a projective navigation system for fluorescence imaging and image display in a natural mode of visual perception. The system consists of an excitation light source, a monochromatic charge-coupled device (CCD) camera, a host computer, a projector, a proximity sensor, and a complementary metal-oxide-semiconductor (CMOS) camera. With perspective transformation and calibration, our surgical navigation system achieves an overall imaging speed higher than 60 frames per second, with a latency of 330 ms, a spatial sensitivity better than 0.5 mm in both vertical and horizontal directions, and a projection bias of less than 1 mm. The technical feasibility of image-guided surgery is demonstrated in both agar-agar gel phantoms and an ex vivo chicken breast model embedding Indocyanine Green (ICG). The biological utility of the system is demonstrated in vivo in a classic model of ICG hepatic metabolism. Our benchtop, ex vivo and in vivo experiments demonstrate the clinical potential for intraoperative delineation of disease margins and image-guided resection surgery.
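The perspective transformation mentioned above maps camera coordinates into projector coordinates. As a generic sketch (not the authors' calibration code, and with made-up fiducial coordinates), a 3x3 homography can be estimated from four point correspondences with a standard direct linear transform (DLT):

```python
import numpy as np

def homography_from_points(src, dst):
    """Direct linear transform: least-squares 3x3 H with H @ src ~ dst."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 3)   # null-space vector = flattened H

def project(H, pt):
    """Map a camera-space point into projector space (homogeneous divide)."""
    w = H @ np.array([pt[0], pt[1], 1.0])
    return w[0] / w[2], w[1] / w[2]

# Four fiducials seen by the camera and their known projector coordinates
# (hypothetical values for the demo).
cam = [(0, 0), (640, 0), (640, 480), (0, 480)]
proj = [(100, 50), (1380, 50), (1380, 1010), (100, 1010)]
H = homography_from_points(cam, proj)
u, v = project(H, (320, 240))   # where to project the camera image center
```

Once H is known, every fluorescence pixel detected by the camera can be re-rendered at the matching physical location by the projector, which is what keeps the projection bias small.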


Journal of Biophotonics | 2018

Performance of a cost-effective and automated blood counting system for resource-limited settings operated by trained and untrained users

Dengling Xie; Yanjun Xie; Peng Liu; Lieshu Tong; Chuanzhen Hu; Pengfei Shao; Kaiqin Chu; Zachary J. Smith

Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this article we report a streamlined, easy-to-use method to count red blood cells (RBC), white blood cells (WBC) and platelets (PLT), and to perform a 3-part WBC differential, with a cost-effective and automated image-based blood counting system. The approach consists of using a compact, custom-built microscope with a large field of view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection utilizes volume-controlled capillary tubes, which are then dropped into a premixed, shelf-stable solution to stain and dilute in a single step. Sample measurement and analysis are fully automated, requiring no input from the user. The cost of the system is minimized through the use of custom-designed motorized components. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard on 120 adult blood samples, demonstrating agreement within Clinical Laboratory Improvement Amendments guidelines, with no statistical difference in performance among different operator groups. The system's cost-effectiveness, automation and performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.
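The "automatic algorithms" step above reduces, at its core, to segmenting and counting stained cells in a fluorescence image and scaling by dilution and imaged volume. The toy sketch below (not the authors' pipeline; real systems also size-filter and split clumped cells) counts 4-connected bright blobs and back-calculates concentration:

```python
import numpy as np
from collections import deque

def count_blobs(img, thresh):
    """Count 4-connected bright blobs in a grayscale fluorescence image."""
    mask = img > thresh
    seen = np.zeros(mask.shape, bool)
    count = 0
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        count += 1                      # new blob found; flood-fill it
        queue = deque([(i, j)])
        seen[i, j] = True
        while queue:
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
    return count

def cells_per_ul(count, dilution_factor, imaged_volume_ul):
    """Back-calculate whole-blood concentration from a diluted-sample count."""
    return count * dilution_factor / imaged_volume_ul

# Synthetic fluorescence field with three stained "cells".
img = np.zeros((60, 60), np.uint8)
for cy, cx in [(10, 10), (30, 40), (50, 20)]:
    img[cy - 2:cy + 3, cx - 2:cx + 3] = 200
n = count_blobs(img, 100)
```

The dilution factor and imaged volume here are illustrative parameters; the paper fixes the sampled volume with volume-controlled capillary tubes.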


Proceedings of SPIE | 2016

A Google Glass navigation system for ultrasound and fluorescence dual-mode image-guided surgery

Zeshu Zhang; Jing Pei; Dong Wang; Chuanzhen Hu; Jian Ye; Qi Gan; Peng Liu; Jian Yue; Benzhong Wang; Pengfei Shao; Stephen P. Povoski; Edward W. Martin; Alper Yilmaz; Michael F. Tweedle; Ronald X. Xu

Surgical resection remains the primary curative intervention for cancer treatment. However, the occurrence of a residual tumor after resection is very common, leading to recurrence of the disease and the need for re-resection. We develop a surgical Google Glass navigation system that combines near-infrared fluorescence imaging and ultrasonography for intraoperative detection of tumor sites and assessment of surgical resection boundaries, as well as for guiding sentinel lymph node (SLN) mapping and biopsy. The system consists of a monochromatic CCD camera, a computer, a Google Glass wearable headset, an ultrasound machine and an array of LED light sources. All of the above components, except the Google Glass, are connected to a host computer by a USB or HDMI port. A wireless connection is established between the glass and the host computer for image acquisition and data transport. A control program written in C++ calls OpenCV functions for image calibration, processing and display. The technical feasibility of the system is tested in both tumor-simulating phantoms and a human subject. When the system is used for simulated phantom resection tasks, the tumor boundaries, invisible to the naked eye, can be clearly visualized with the surgical Google Glass navigation system. The system has also been used in an IRB-approved protocol in a single patient during SLN mapping and biopsy at the First Affiliated Hospital of Anhui Medical University, demonstrating the ability to successfully localize and resect all apparent SLNs. In summary, our tumor-simulating phantom and human subject studies have demonstrated the technical feasibility of successfully using the proposed goggle navigation system during cancer surgery.
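The abstract does not specify the wireless transport protocol between the host and the Glass. As a hedged, generic illustration of host-to-headset frame transport (pure Python, not the authors' implementation), the sketch below streams one length-prefixed binary frame over a loopback TCP socket:

```python
import socket
import struct
import threading

def send_frame(sock, payload: bytes):
    """Length-prefixed frame send: 4-byte big-endian length, then payload."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_frame(sock) -> bytes:
    """Read one length-prefixed frame from the socket."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

# Loopback demo: a "headset" thread receives one encoded image frame.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
received = {}

def serve():
    conn, _ = server.accept()
    received["frame"] = recv_frame(conn)
    conn.close()

t = threading.Thread(target=serve)
t.start()
client = socket.socket()
client.connect(("127.0.0.1", port))
send_frame(client, b"\x89fakejpegbytes")   # stands in for a JPEG-encoded frame
client.close()
t.join()
server.close()
```

Length-prefixing is one simple way to delimit variable-size compressed frames on a byte stream; a production system would add compression, sequencing and reconnection handling.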


Proceedings of SPIE | 2016

A projective surgical navigation system for cancer resection

Qi Gan; Pengfei Shao; Dong Wang; Jian Ye; Zeshu Zhang; Xinrui Wang; Ronald X. Xu

Near-infrared (NIR) fluorescence imaging can provide precise, real-time information about tumor location during cancer resection surgery. However, many intraoperative fluorescence imaging systems are based on wearable devices or stand-alone displays, leading to distraction of the surgeon and suboptimal outcomes. To overcome these limitations, we design a projective fluorescence imaging system for surgical navigation. The system consists of an LED excitation light source, a monochromatic CCD camera, a host computer, a mini projector and a CMOS camera. A software program written in C++ calls OpenCV functions to calibrate and correct fluorescence images captured by the CCD camera upon excitation illumination by the LED source. The images are projected back onto the surgical field by the mini projector. The imaging performance of this projective navigation system is characterized in a tumor-simulating phantom. Image-guided surgical resection is demonstrated in an ex vivo chicken tissue model. In all experiments, the images projected by the projector match well with the locations of fluorescence emission. Our experimental results indicate that the proposed projective navigation system can be a powerful tool for preoperative surgical planning, intraoperative surgical guidance, and postoperative assessment of surgical outcomes. We have integrated the optoelectronic elements into a compact and miniaturized system in preparation for further clinical validation.


Proceedings of SPIE | 2016

A portable fluorescence microscopic imaging system for cholecystectomy

Jian Ye; Chaoyu Yang; Qi Gan; Rong Ma; Zeshu Zhang; Shufang Chang; Pengfei Shao; Shiwu Zhang; Chenhai Liu; Ronald X. Xu

In this paper we propose a portable fluorescence microscopic imaging system to prevent iatrogenic biliary injuries during cholecystectomy caused by misidentification of the cystic structures. The system consists of a light source module, a CMOS camera, a Raspberry Pi computer and a 5-inch HDMI LCD. Specifically, the light source module is composed of 690 nm and 850 nm LEDs, allowing the CMOS camera to acquire fluorescence and background images simultaneously. The system is controlled by the Raspberry Pi using Python with the OpenCV library under Linux. We chose Indocyanine Green (ICG) as the fluorescent contrast agent and tested the fluorescence intensities of ICG aqueous solutions at different concentration levels with our fluorescence microscopic system, in comparison with the commercial Xenogen IVIS system. The spatial resolution of the proposed fluorescence microscopic imaging system was measured with a 1951 USAF resolution target, and the dynamic response was evaluated quantitatively with an automatic displacement platform. Finally, we verified the technical feasibility of the proposed system in mouse bile duct models, performing both correct and incorrect gallbladder resections. Our experiments showed that the proposed system provides clear visualization of the confluence between the cystic duct and the common bile duct or common hepatic duct, suggesting that it is a potential method for guiding cholecystectomy. The proposed portable system costs a total of only $300, potentially promoting its use in resource-limited settings.


Proceedings of SPIE | 2016

Fabrication of Indocyanine Green and 2H,3H-perfluoropentane loaded microbubbles for fluorescence and ultrasound imaging

Yutong He; Qiang Wu; Rong Ma; Shufang Chang; Pengfei Shao; Ronald X. Xu

As a near-infrared (NIR) fluorescence dye, Indocyanine Green (ICG) has not gained broader clinical application owing to multiple limitations, such as concentration-dependent aggregation, low fluorescence quantum yield, poor physicochemical stability and rapid elimination from the body. Meanwhile, 2H,3H-perfluoropentane (H-PFP) has been widely studied in ultrasound imaging as a vehicle for targeted delivery of contrast agents and drugs. We synthesized a novel dual-modal fluorescence and ultrasound contrast agent by encapsulating ICG and H-PFP in lipid microbubbles using a liquid-driven coaxial flow focusing (LDCFF) process. Uniform microbubbles with sizes ranging from 1 to 10 μm and high ICG loading efficiency were achieved by this method. Our benchtop experiments showed that ICG/H-PFP microbubbles exhibited less aggregation, increased fluorescence intensity and better photostability compared with free ICG aqueous solution. Our phantom experiments demonstrated that ICG/H-PFP microbubbles enhanced imaging contrast in both fluorescence imaging and ultrasonography. Our animal experiments indicated that ICG/H-PFP microbubbles extended the ICG lifetime and facilitated dual-mode fluorescence and ultrasound imaging in vivo.


Optics in Health Care and Biomedical Optics VII | 2016

3D printing of tissue-simulating phantoms for calibration of biomedical optical devices

Zuhua Zhao; Ximing Zhou; Shuwei Shen; Guangli Liu; Li Yuan; Yuquan Meng; Xiang Lv; Pengfei Shao; Erbao Dong; Ronald X. Xu

Collaboration


Dive into Pengfei Shao's collaborations.

Top Co-Authors

Ronald X. Xu (University of Science and Technology of China)
Zeshu Zhang (University of Science and Technology of China)
Jian Ye (University of Science and Technology of China)
Qi Gan (University of Science and Technology of China)
Dong Wang (Ohio State University)
Chuanzhen Hu (University of Science and Technology of China)
Houzhu Ding (University of Science and Technology of China)
Jinkun Wang (University of Science and Technology of China)
Peng Liu (University of Science and Technology of China)
Shiwu Zhang (University of Science and Technology of China)