Introduction: Breast cancer is one of the most commonly diagnosed cancers among American women; in 2017 it is expected to account for 30% of newly diagnosed cancers. Several studies have shown the statistical significance of adding ultrasound examination to mammography in whole-breast screening. Ultrasound is better suited to the dense-breast population, involves no ionizing radiation, and the technology is affordable and therefore available worldwide. Moreover, thanks to growing computational power, performance parameters of ultrasound devices, such as achievable data resolution, have improved significantly. The downside of the most widespread form, the handheld probe, is that the expert sees a 3D object only through its 2D cross-sections, and it is difficult to ensure full coverage of the breast during the examination. Systems for automated breast ultrasound (ABUS) are often able to produce a 3D data reconstruction, but their availability is still limited. We propose to improve the overall usability of the common ultrasound examination by introducing computer-aided tools that provide data analysis support and work with the position tracking devices often already present in modern ultrasound systems.

Results: The novel framework for breast examination is applicable when the expert uses a handheld probe complemented by a 3D position tracker. The proposed tool offers new functionality next to the common ultrasound image/video acquisition and lessens the intricacy of the relationship between the 3D object and its 2D frame-wise data. First, the framework evaluates the level of breast coverage in real time, highlighting parts that were not sufficiently examined, both space- and time-wise. Using our framework with the breast coverage feedback turned off, we conducted a blind validation test by tracking several radiologists who were asked to perform whole-breast screening on 75 patients. It revealed that on average 3.8% of the breast surface was missed, which is equivalent to around 10 cm² for medium-sized breasts. We also calculated other characteristics of the examinations, such as average time, the influence of the expert's experience, discrepancies between right- and left-breast examinations, etc. The second functional advantage of the framework is its ability to locally generate a 3D view of the present structures, enabling the expert to study a region of interest (e.g. a tumor) in 3D. Finally, the chosen regions/objects can be represented by features motivated by BI-RADS descriptors, capturing shape and textural characteristics, and classified.

Future work: The classification of regions of interest based on their local 3D reconstructions can be realized using convolutional neural networks. We plan to test the potential of this approach with special focus on categorization of tumors into BI-RADS assessment categories.

Conclusions: We have developed a novel tool which improves breast examination with a handheld ultrasound probe and lessens the 3D-2D discrepancy. In real time it monitors the level of breast coverage during the examination, provides local 3D views of objects of interest, and characterizes these objects by means of BI-RADS-like features useful for tissue classification.
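
The abstract does not specify how the real-time coverage evaluation is implemented. As a minimal sketch of one plausible realization, the Python/NumPy code below accumulates probe dwell time on a grid of breast-surface patches, taking positions from the 3D tracker, and reports patches below a dwell-time threshold. The grid resolution, patch size, probe footprint handling, and the MIN_DWELL_S threshold are illustrative assumptions, not values from the framework.

import numpy as np

# Hypothetical sketch: accumulate probe dwell time on a 2D breast-surface grid
# and report patches that were not examined long enough. Grid size, patch
# spacing and the dwell-time threshold are illustrative assumptions.
GRID_SHAPE = (64, 64)          # surface patches covering the breast
PATCH_MM = 4.0                 # patch edge length in mm
MIN_DWELL_S = 0.2              # minimum time a patch must be insonified

def update_coverage(dwell, probe_xy_mm, footprint_mm, dt):
    """Add dt seconds of dwell to every patch under the probe footprint.

    probe_xy_mm  -- (x, y) of the probe centre projected onto the surface,
                    as reported by the 3D position tracker
    footprint_mm -- (width, height) of the probe contact area
    """
    half_w = int(round(footprint_mm[0] / (2 * PATCH_MM)))
    half_h = int(round(footprint_mm[1] / (2 * PATCH_MM)))
    cx = int(round(probe_xy_mm[0] / PATCH_MM))
    cy = int(round(probe_xy_mm[1] / PATCH_MM))
    x0, x1 = max(cx - half_w, 0), min(cx + half_w + 1, GRID_SHAPE[0])
    y0, y1 = max(cy - half_h, 0), min(cy + half_h + 1, GRID_SHAPE[1])
    dwell[x0:x1, y0:y1] += dt
    return dwell

def missed_fraction(dwell):
    """Fraction of surface patches below the dwell-time threshold."""
    return float(np.mean(dwell < MIN_DWELL_S))

# Example: simulate a short sweep and report the uncovered surface fraction.
dwell = np.zeros(GRID_SHAPE)
for t in range(100):                          # 100 tracker samples
    update_coverage(dwell, (t * 1.5, 120.0), (40.0, 10.0), dt=0.05)
print(f"missed surface: {100 * missed_fraction(dwell):.1f} %")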
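The local 3D view of a region of interest can be understood as freehand compounding of the tracked 2D frames. The sketch below assumes the tracker provides a 4x4 homogeneous pose per frame and uses nearest-neighbour voxel insertion; the voxel size and volume extent are chosen arbitrarily for illustration and are not the framework's actual reconstruction method.

import numpy as np

# Hypothetical freehand-compounding sketch: scatter the pixels of tracked 2D
# ultrasound frames into a 3D voxel volume. Pose matrices, pixel spacing and
# voxel size are illustrative assumptions.
VOXEL_MM = 0.5
VOL_SHAPE = (128, 128, 128)

def insert_frame(volume, counts, frame, pose, pixel_mm):
    """Nearest-neighbour insertion of one 2D frame into the volume.

    frame    -- 2D grayscale image (rows x cols)
    pose     -- 4x4 homogeneous matrix mapping image coordinates (mm) to
                the volume coordinate system (mm), from the 3D tracker
    pixel_mm -- (row spacing, column spacing) of the image in mm
    """
    rows, cols = np.indices(frame.shape)
    # image-plane coordinates in mm; z = 0 in the frame's own system
    pts = np.stack([cols.ravel() * pixel_mm[1],
                    rows.ravel() * pixel_mm[0],
                    np.zeros(frame.size),
                    np.ones(frame.size)])
    world = pose @ pts                              # 4 x N points in mm
    idx = np.round(world[:3] / VOXEL_MM).astype(int)
    keep = np.all((idx >= 0) & (idx < np.array(VOL_SHAPE)[:, None]), axis=0)
    ix, iy, iz = idx[:, keep]
    np.add.at(volume, (ix, iy, iz), frame.ravel()[keep])
    np.add.at(counts, (ix, iy, iz), 1)

def compounded(volume, counts):
    """Average overlapping contributions; empty voxels stay zero."""
    return np.divide(volume, counts, out=np.zeros_like(volume),
                     where=counts > 0)

# Example: a single synthetic frame placed with an identity pose.
vol = np.zeros(VOL_SHAPE)
cnt = np.zeros(VOL_SHAPE)
insert_frame(vol, cnt, np.random.rand(64, 64), np.eye(4), (0.3, 0.3))
roi = compounded(vol, cnt)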
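The concrete BI-RADS-motivated descriptors are not listed in the abstract. As a hedged example of how shape and textural characteristics of a segmented region could be captured, the functions below compute area, circularity, elongation and simple echo-pattern statistics; the descriptor choices are assumptions for illustration only.

import numpy as np

# Hypothetical BI-RADS-motivated descriptors for a segmented 2D region of
# interest: rough shape (area, circularity, elongation) and texture
# (echo-pattern statistics).
def shape_features(mask):
    """mask -- boolean 2D array, True inside the lesion."""
    area = mask.sum()
    # perimeter approximated by boundary pixels (4-connectivity)
    interior = (np.roll(mask, 1, 0) & np.roll(mask, -1, 0) &
                np.roll(mask, 1, 1) & np.roll(mask, -1, 1))
    perimeter = (mask & ~interior).sum()
    circularity = 4 * np.pi * area / max(perimeter, 1) ** 2
    ys, xs = np.nonzero(mask)
    eig = np.linalg.eigvalsh(np.cov(np.stack([ys, xs])))
    elongation = np.sqrt(eig[1] / max(eig[0], 1e-9))
    return {"area_px": int(area),
            "circularity": float(circularity),
            "elongation": float(elongation)}

def texture_features(image, mask):
    """Echo-pattern statistics inside the lesion vs. its surroundings."""
    inside, outside = image[mask], image[~mask]
    return {"mean_echo": float(inside.mean()),
            "echo_contrast": float(inside.mean() - outside.mean()),
            "heterogeneity": float(inside.std())}

# Example: an elliptical, slightly brighter lesion in a noisy image.
yy, xx = np.mgrid[:128, :128]
lesion = ((yy - 64) / 20) ** 2 + ((xx - 64) / 10) ** 2 < 1
img = np.random.rand(128, 128) + 0.3 * lesion
print(shape_features(lesion), texture_features(img, lesion))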
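For the planned CNN-based categorization, a minimal sketch of a 3D convolutional classifier operating on locally reconstructed volume patches is given below (PyTorch). The architecture, the 32x32x32 input patch size, and the use of seven outputs for BI-RADS assessment categories 0 through 6 are illustrative assumptions, not the planned design.

import torch
import torch.nn as nn

# Hypothetical 3D-CNN sketch for classifying locally reconstructed volumes
# into BI-RADS assessment categories.
class RoiNet3D(nn.Module):
    def __init__(self, n_categories=7):       # BI-RADS categories 0-6
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                   # 32^3 -> 16^3
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                   # 16^3 -> 8^3
        )
        self.classifier = nn.Linear(32 * 8 * 8 * 8, n_categories)

    def forward(self, x):                      # x: (batch, 1, 32, 32, 32)
        return self.classifier(self.features(x).flatten(1))

# Example forward pass on random volume patches.
logits = RoiNet3D()(torch.randn(2, 1, 32, 32, 32))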

MEDICAL IMAGING & BIOMEDICAL DIAGNOSTICS

Author: Filip Sroubek

Coauthor(s): Michal Bartos PhD 1, Barbara Zitova PhD 1, Jan Danes Dr 2, Lukas Lambert Dr 2, Jan Vydra 3; 1 The Czech Academy of Sciences, Institute of Information Theory and Automation, Prague, CZ, 2 Department of Radiology, First Faculty of Medicine, Charles University in Prague and General University Hospital in Prague, Prague, CZ, 3 Medico, Prague, CZ.

Status: Work In Progress

Funding Acknowledgment: This work has been supported by the Technology Agency of the Czech Republic (TACR) Project no. TA04011392.