
Publication


Featured research published by Lu Lan.


Biomedical Optics Express | 2015

Assessing breast tumor margin by multispectral photoacoustic tomography

Rui Li; Pu Wang; Lu Lan; Frank P. Lloyd Jr.; Craig J. Goergen; Shaoxiong Chen; Ji-Xin Cheng

An unmet need exists for high-speed, highly sensitive intraoperative assessment of breast cancer margins during breast-conserving surgery. Here, we demonstrate a multispectral photoacoustic tomography system for breast tumor margin assessment using fat and hemoglobin as contrasts. The system provides ~3 mm imaging depth and ~125 μm axial resolution, and its results agreed with histological findings. High sensitivity in margin assessment was achieved, opening a compelling route to intraoperative margin assessment.


Advanced Materials | 2017

Semiconducting Polymer Nanoparticles for Centimeters-Deep Photoacoustic Imaging in the Second Near-Infrared Window

Jiayingzi Wu; Liyan You; Lu Lan; Hyeon Jeong Lee; Saadia T. Chaudhry; Rui Li; Ji-Xin Cheng; Jianguo Mei

A thienoisoindigo-based semiconducting polymer with strong near-infrared absorbance is synthesized, and its water-dispersed nanoparticles (TSPNs) are investigated as a contrast agent for photoacoustic (PA) imaging in the second near-infrared (NIR-II) window (1000-1350 nm). The TSPNs generate a strong PA signal in the NIR-II optical window, where background signals from endogenous contrast agents, including blood and lipid, are at their local minima. By embedding a TSPN-containing tube in chicken-breast tissue, an imaging depth of more than 5 cm at 1064 nm excitation is achieved with a contrast-agent concentration as low as 40 µg mL⁻¹. The TSPNs under the skin or in the tumor are clearly visualized at 1100 and 1300 nm, with negligible interference from the tissue background. TSPNs as a PA contrast agent in the NIR-II window open new opportunities for biomedical imaging of deep tissues with improved contrast.


Photoacoustics | 2017

Spectral analysis assisted photoacoustic imaging for lipid composition differentiation

Yingchun Cao; Ayeeshik Kole; Lu Lan; Pu Wang; Jie Hui; Michael Sturek; Ji-Xin Cheng

Recent advances in atherosclerotic plaque detection have shown that not only do lipid core size and depth play important roles in plaque rupture and thrombus formation, but lipid composition, especially cholesterol deposition, is equally important in determining lesion vulnerability. Here, we demonstrate a spectral-analysis-assisted photoacoustic imaging approach to differentiate and map lipid compositions within an artery wall. The approach classifies spectral curves, obtained from sliding windows along time-of-flight photoacoustic signals, via a numerical k-means clustering method. Evaluation on a vessel-mimicking phantom containing cholesterol and olive oil shows the accuracy and efficiency of this method, suggesting its potential for assessing atherosclerotic plaques.
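The pipeline described above — spectral curves from sliding windows, grouped by k-means — can be sketched in a few lines of numpy. The spectra below are synthetic stand-ins (two Gaussian peaks at different frequencies play the role of cholesterol vs. olive-oil signatures), and the minimal k-means here is illustrative, not the paper's implementation:

```python
import numpy as np

def kmeans(data, init_centroids, n_iter=10):
    """Minimal k-means: assign each spectrum to its nearest centroid,
    then recompute centroids, for a fixed number of iterations."""
    centroids = init_centroids.copy()
    for _ in range(n_iter):
        # Euclidean distance from every spectrum to every centroid
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(len(centroids)):
            if np.any(labels == k):
                centroids[k] = data[labels == k].mean(axis=0)
    return labels, centroids

rng = np.random.default_rng(0)
bins = np.arange(64)

# Two synthetic "lipid signatures": spectral peaks at different bins
# (purely illustrative stand-ins for cholesterol vs. olive-oil spectra)
cls_a = np.exp(-0.5 * ((bins - 10) / 4.0) ** 2)
cls_b = np.exp(-0.5 * ((bins - 30) / 4.0) ** 2)
spectra = np.vstack([cls_a + 0.05 * rng.standard_normal((20, 64)),
                     cls_b + 0.05 * rng.standard_normal((20, 64))])

# Seed the two centroids with one example drawn from each class
labels, _ = kmeans(spectra, spectra[[0, -1]].astype(float))
```

With well-separated spectra like these, the two clusters recover the two lipid classes; real photoacoustic data would of course need windowing, normalization, and a more careful initialization.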


Photoacoustics | 2018

Photoacoustic tomography of intact human prostates and vascular texture analysis identify prostate cancer biopsy targets

Brittani L. Bungart; Lu Lan; Pu Wang; Rui Li; Michael O. Koch; Liang Cheng; Timothy A. Masterson; Murat Dundar; Ji-Xin Cheng

Prostate cancer is poorly visualized on ultrasonography (US), so current biopsies require either a templated technique or guidance after fusion of US with magnetic resonance imaging. Here we determined the ability of photoacoustic tomography (PAT) and US, followed by texture-based image processing, to identify prostate biopsy targets. K-means clustering feature learning and testing were performed on separate datasets consisting of 1064 and 1197 nm PAT and US images of intact, ex vivo human prostates. The 1197 nm PAT images were found not to contribute to feature learning; thus, only 1064 nm PAT and US images were used for final feature testing. Biopsy targets, determined by the tumor-assigned pixels' center of mass, located 100% of the primary lesions and 67% of the secondary lesions. In conclusion, 1064 nm PAT and US texture-based feature analysis provided successful prostate biopsy targets.
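The biopsy-target step in the abstract — the center of mass of the tumor-assigned pixels — is a one-liner once the classifier has produced a binary mask. A toy sketch with a hypothetical mask (the mask shape and lesion location are made up for illustration):

```python
import numpy as np

# Hypothetical binary mask of tumor-assigned pixels from the
# texture-based classification (1 = tumor, 0 = background)
mask = np.zeros((10, 10), dtype=int)
mask[2:5, 3:6] = 1  # a small simulated lesion

# Biopsy target = center of mass of the tumor-assigned pixels,
# i.e. the mean (row, col) coordinate of all pixels labeled 1
target = np.argwhere(mask).mean(axis=0)
```

For multiple lesions (primary and secondary), one would first split the mask into connected components and take a center of mass per component.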


Light-Science & Applications | 2018

A fiber optoacoustic guide with augmented reality for precision breast-conserving surgery

Lu Lan; Yan Xia; Rui Li; Kaiming Liu; Jieying Mai; Jennifer Anne Medley; Samilia Obeng-Gyasi; Linda K. Han; Pu Wang; Ji-Xin Cheng

Lumpectomy, also called breast-conserving surgery, has become the standard surgical treatment for early-stage breast cancer. However, accurately locating the tumor during a lumpectomy, especially when the lesion is small and nonpalpable, is a challenge. Such difficulty can lead to either incomplete tumor removal or prolonged surgical time, which results in high re-operation rates (~25%) and increased surgical costs. Here, we report a fiber optoacoustic guide (FOG) with augmented reality (AR) for sub-millimeter tumor localization and intuitive surgical guidance with minimal interference. The FOG is preoperatively implanted in the tumor. Under external pulsed light excitation, the FOG omnidirectionally broadcasts acoustic waves through the optoacoustic effect by a specially designed nano-composite layer at its tip. By capturing the acoustic wave, three ultrasound sensors on the breast skin triangulate the FOG tip's position with 0.25-mm accuracy. An AR system with a tablet measures the coordinates of the ultrasound sensors and transforms the FOG tip's position into visual feedback with <1-mm accuracy, thus aiding surgeons in directly visualizing the tumor location and performing fast and accurate tumor removal. We further show the use of a head-mounted display to visualize the same information in the surgeons' first-person view and achieve hands-free guidance. Towards clinical application, a surgeon successfully deployed the FOG to excise a "pseudo tumor" in a female human cadaver. With the high-accuracy tumor localization by FOG and the intuitive surgical guidance by AR, the surgeon performed accurate and fast tumor removal, which will significantly reduce re-operation rates and shorten surgery time.

Optoacoustics: tumor localization

Implanting an optoacoustic beacon into breast tumors provides surgeons with sub-millimeter localization accuracy when performing surgery. The system has been developed by Lu Lan from Boston University and coworkers. A fiber optoacoustic guide (FOG), whose fiber tip is coated with ZnO nanoparticles and graphite to make it strongly light absorbing, is implanted into the tumor prior to surgery. When exposed to pulsed light excitation, the tip generates omnidirectional ultrasound waves through the optoacoustic effect. Three ultrasound sensors collect this signal, making it possible to accurately triangulate the exact location of the tumor during surgery. Combining the FOG with an augmented reality system offers surgeons an intuitive guidance aid that not only allows accurate and rapid tumor removal but will also help reduce the need for re-operation.
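The localization step — three skin-mounted ultrasound sensors triangulating the FOG tip from its acoustic signal — amounts to classical trilateration: each sensor's time of flight, multiplied by the speed of sound, gives a distance, and the three distance spheres are intersected. A minimal numpy sketch (the sensor layout, units, and tip position are illustrative, not from the paper):

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Closed-form trilateration: find the point at distances r1, r2, r3
    from sensors p1, p2, p3. Three spheres intersect in two points; we
    return the one below the sensor plane, since the FOG tip sits under
    the skin (sign convention depends on the sensor ordering)."""
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex @ (p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return p1 + x * ex + y * ey - z * ez

# Illustrative setup: three sensors on the skin plane z = 0 (units: cm),
# tip 2 cm deep; each distance = speed of sound x measured arrival time
sensors = [np.array(p, float) for p in [(0, 0, 0), (10, 0, 0), (0, 10, 0)]]
tip_true = np.array([3.0, 4.0, -2.0])
dists = [np.linalg.norm(tip_true - s) for s in sensors]
tip = trilaterate(*sensors, *dists)
```

In practice the measured distances carry noise, so a least-squares fit over the three (or more) sensors would replace the exact closed form.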


Proceedings of SPIE | 2017

An optoacoustic guide with augmented reality system towards precision breast conserving surgery (Conference Presentation)

Lu Lan; Kaiming Liu; Yan Xia; Jiayingzi Wu; Rui Li; Pu Wang; Linda K. Han; Ji-Xin Cheng

Breast-conserving surgery is a well-accepted breast cancer treatment. However, accurately localizing the tumor during surgery remains challenging, and the guidance provided by current methods is one-dimensional distance information, which is indirect and unintuitive. This leads to a high re-excision rate and prolonged surgical time. To solve these problems, we have developed a fiber-delivered optoacoustic guide (OG), which mimics a traditional localization guide wire and is preoperatively placed into the tumor mass, together with an augmented reality (AR) system that provides real-time visualization of the tumor location with sub-millimeter variance. Through a nano-composite light-diffusion sphere and light-absorbing layer formed on the tip of an optical fiber, the OG creates an omnidirectional acoustic source inside the tumor mass under pulsed laser excitation. The generated optoacoustic signal has a high dynamic range (~58 dB) and spreads over a large apex angle of 320 degrees. An acoustic radar with three ultrasound transducers is attached to the breast skin and triangulates the location of the OG tip. With the AR system sensing the location of the acoustic radar, the position of the OG tip inside the tumor relative to the AR display is calculated and rendered. This provides direct visual feedback of the tumor location to surgeons, which greatly eases surgical planning during the operation and saves surgical time. A proof-of-concept experiment using a tablet and a stereo-vision camera is demonstrated, achieving 0.25 mm tracking variance.


Cancer | 2016

MarginPAT: High-Speed Intraoperative Breast Tumor Margin Assessment Tool

Pu Wang; Lu Lan; Rui Li; Ji-Xin Cheng

As lumpectomy is well accepted for breast cancer treatment, a highly sensitive tool is needed for intraoperative margin assessment. We present a multimodal photoacoustic/ultrasound imaging system for high-speed intraoperative margin assessment.


Analytical Chemistry | 2017

Quantification of Lipid Metabolism in Living Cells through the Dynamics of Lipid Droplets Measured by Stimulated Raman Scattering Imaging

Chi Zhang; Junjie Li; Lu Lan; Ji-Xin Cheng


Medical Devices & Sensors | 2018

High-speed Intraoperative Assessment of Breast Tumor Margins by Multimodal Ultrasound and Photoacoustic Tomography

Rui Li; Lu Lan; Yan Xia; Pu Wang; Linda K. Han; Gary L. Dunnington; Samilia Obeng-Gyasi; George E. Sandusky; Jennifer Anne Medley; Susan T. Crook; Ji-Xin Cheng


Cancer Research | 2017

Abstract 1873: Intraoperative assessment of breast tumor margins using multimodal photoacoustic tomography (MarginPAT)

Kyle McElyea; George E. Sandusky; Rui Li; Lu Lan; Ji-Xin Cheng; Linda K. Han; Pu Wang

Collaboration


Dive into Lu Lan's collaborations.

Top Co-Authors

Linda K. Han

Indiana University Health
