

Project Report
Submitted to: Dr. M Umair Ahmed Khan
Submitted by: Hamza Asjad
Registration number: 2021-BME-107
Course name: Medical Image Processing
Course Code BME 420
Semester: 7th
Department: Biomedical Engineering

UNIVERSITY OF ENGINEERING AND TECHNOLOGY


LAHORE

(NAROWAL CAMPUS)

Image Processing for Tissue Detection, Localization, Bounding Boxes and Coordinates
in Ultrasound Images Using MATLAB

Objective

The purpose of this report is to apply a step-by-step process for detecting and localizing fetal tissue
structures in an ultrasound image, drawing bounding boxes around the detected tissues, and determining
both image and real-time coordinates.

Introduction

Ultrasound imaging is one of the most commonly used non-invasive diagnostic methods in medicine, and
it has proven particularly valuable for monitoring fetal development. This report describes an image
processing procedure designed for the analysis of fetal ultrasound images, with special emphasis on
tissue detection, localization, and the extraction of the necessary coordinates. The procedure is divided
into four steps: contrast enhancement, edge detection, generation of a bounding box around the tissue of
interest, and finally, coordinate extraction. All of these steps are implemented automatically in MATLAB,
which can assist healthcare professionals in fetal monitoring.

Methodology

1. Preprocessing
Ultrasound images are often noisy and have uneven contrast, so several preprocessing steps are applied
to make the important structures clearer and to reduce artifacts as much as possible, as sketched below.
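
A minimal MATLAB sketch of such a preprocessing stage is given below. The file name mip.jpeg is taken
from the Results section; the choice of adapthisteq for contrast enhancement and a 3x3 median filter for
noise reduction is an illustrative assumption, not a fixed requirement.

img = imread('mip.jpeg');                % Load the ultrasound image
grayImg = rgb2gray(img);                 % Grayscale simplifies later processing
enhancedImg = adapthisteq(grayImg);      % Local contrast enhancement (CLAHE)
cleanImg = medfilt2(enhancedImg, [3 3]); % Median filter suppresses speckle noise
imshow(cleanImg);
title('Preprocessed Ultrasound Image');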

2. Edge Detection
Edge detection finds edges or boundaries in an image where the pixel intensity changes suddenly.
This is important for marking the many structures found in an ultrasound image.

This work uses the Canny edge detection algorithm, which is effective at detecting edges in noisy
environments such as ultrasound images.
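
As a rough illustration, the detector can be called with explicit low and high thresholds so that weak
speckle edges are suppressed; the threshold values below are only an assumed starting point and would
need tuning for each image. cleanImg is the preprocessed image from the sketch above.

edges = edge(cleanImg, 'Canny', [0.05 0.20]); % Thresholds are illustrative only
imshow(edges);
title('Canny Edge Map');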

3. Bounding Box Generation


Bounding boxes are rectangular outlines drawn around the detected tissues, allowing easy identification
of each tissue structure within the ultrasound image.

• Connected Component Analysis: Following edge detection, the regions present in the binary
image are assigned labels through connected component analysis. In this step, each distinct
region of connected pixels is treated as an individual object.

• Bounding Box Calculation: MATLAB's regionprops function is applied to all detected objects and
computes each bounding box as the minimum enclosing rectangle covering the detected tissue
region in the image.

• Box Drawing: Each bounding box is then overlaid on the original image, or on any processed
version of it.

The bounding boxes thus identify the location of each tissue within the ultrasound image, which may be
useful for further analysis or reporting, as sketched below.
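
The sketch below illustrates the labelling and box-drawing steps, assuming the binary edge image edges
and the grayscale image grayImg from the earlier steps. The use of bwareaopen to discard very small
regions, and its area threshold of 50 pixels, are assumptions added for illustration.

cleanEdges = bwareaopen(edges, 50);      % Drop tiny regions (threshold is assumed)
cc = bwconncomp(cleanEdges);             % Label connected components
props = regionprops(cc, 'BoundingBox');  % Minimum enclosing rectangle per region

imshow(grayImg); hold on;
for k = 1:numel(props)
    rectangle('Position', props(k).BoundingBox, 'EdgeColor', 'g', 'LineWidth', 2);
end
hold off;
title('Tissue Localization with Bounding Boxes');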

4. Coordinate Extraction
The image coordinates of the detected tissues are extracted to determine their locations, support
measurement, and, where possible, track them in real time. This can be useful in diagnostic and
monitoring procedures.

Centroid Computation: regionprops also computes the centroid of each detected region, which is the
geometric center of the tissue. This coordinate provides a reference point for every structure within the image.

Coordinate Output: The coordinates of all centroids are printed or saved, indicating precisely where
each tissue structure has been detected. These coordinates serve as a reference for reporting or
further spatial analysis (see the sketch at the end of this section).

Real-Time Coordinate Approximation: In the absence of direct support for real-time markers, the GINPUT
function was used instead. GINPUT lets the user pick out distinctive points in the image, which can
serve as coordinates for real-time spatial referencing.
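
A short sketch of the coordinate output step is given below; it assumes the connected-component
structure cc from the sketch in the previous section, and the CSV file name is only a placeholder.

props = regionprops(cc, 'Centroid');      % Centroid of each labelled region
centroids = vertcat(props.Centroid);      % N-by-2 matrix of (x, y) pixel coordinates

for k = 1:size(centroids, 1)
    fprintf('Tissue %d: x = %.1f, y = %.1f\n', k, centroids(k, 1), centroids(k, 2));
end

writematrix(centroids, 'tissue_centroids.csv'); % File name is a placeholder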

Procedure

Load and Preprocess the Image

The ultrasound image was loaded into MATLAB, and initial preprocessing was applied. Grayscale
conversion, contrast enhancement, and noise reduction were used to prepare the image:

• Grayscale conversion simplifies the image for edge detection.

Tissue Detection Using Edge Detection

The Canny edge detection algorithm is applied to outline the edges of the tissues in the ultrasound image.
This produces clear edges for detection, since the Canny detector is robust on noisy medical images.

Bounding Box Generation and Tissue Localization

The binary edge-detected image was labeled for its connected components, and boxes were drawn
around each connected component as follows:

• Every individual tissue structure was localized.

• To delineate the exact region of each detected tissue, bounding boxes were generated using the
BoundingBox property of the connected components.

Image Coordinate Mapping Using GINPUT Features

Due to the limitations of directly using ArUco markers in MATLAB, GINPUT is used as an alternative to
approximate image coordinates. With GINPUT, the user selects feature points within the image, providing
key points that can serve as references in real-time tracking applications.
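
A minimal sketch of this idea follows; grayImg is the grayscale ultrasound image. The pixel-to-millimetre
scale is a purely hypothetical value, included only to show how picked points could be converted into
approximate physical coordinates.

imshow(grayImg);
title('Click reference points, then press Enter');
[x, y] = ginput;                 % User-selected feature points (pixel coordinates)

pixelSpacing = 0.2;              % Assumed mm per pixel (hypothetical value)
realXY = [x, y] * pixelSpacing;  % Approximate physical coordinates in mm

disp('Reference points (mm):');
disp(realXY);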

Results

MATLAB Code:


img = imread('mip.jpeg');
grayImg = rgb2gray(img); % Convert to grayscale

imshow(grayImg);
title('Original Preprocessed Image (Step 1)');

edges = edge(grayImg, 'Canny'); % Perform edge detection

imshow(edges);
title('Tissue Detection using Edge Detection (Step 2)');

boundingBoxProps = regionprops(edges, 'BoundingBox', 'Centroid'); % Calculate bounding boxes and centroids

imshow(grayImg); hold on;
boundingBoxes = vertcat(boundingBoxProps.BoundingBox); % Extract bounding boxes
for i = 1:size(boundingBoxes, 1)
    rectangle('Position', boundingBoxes(i, :), 'EdgeColor', 'g', 'LineWidth', 2); % Draw each bounding box
end
title('Tissue Localization with Bounding Boxes (Step 3)');

centroidCoordinates = reshape([boundingBoxProps.Centroid], 2, []).'; % Extract centroid coordinates

imshow(grayImg); hold on;
plot(centroidCoordinates(:, 1), centroidCoordinates(:, 2), 'r+', 'MarkerSize', 10, 'LineWidth', 2);
title('Coordinates of Detected Points (Step 4)');
saveas(gcf, 'Coordinates_DetectedPoints_Step4.png');

disp('Click on the image to get real-time coordinates. Press Enter to finish.');
imshow(grayImg);
title('Image Coordinate Capture');
[x, y] = ginput; % Capture mouse clicks on the image
hold on;
plot(x, y, 'bo', 'MarkerSize', 8, 'LineWidth', 1.5); % Plot the clicked coordinates
disp('Image coordinates (X, Y):');
disp([x, y]); % Display the coordinates in the Command Window

The image processing steps in the above code produced the following results:

1. Preprocessed Image: The original ultrasound image was enhanced for contrast and filtered
to reduce noise, improving the visibility of tissue structures.

Figure 1: Preprocessed Image

2. Edge Detection: Canny edge detection highlighted the boundaries of the various tissue
structures, allowing precise localization.

Figure 2: Tissue detection using edge detection

3. Bounding Boxes and Tissue Localization: Each detected tissue structure was
surrounded by a bounding box, clearly indicating its boundaries. The centroid of each
bounding box was marked as well.

Figure 3: Detected tissues with bounding boxes



4. Image Coordinates: The coordinates of each centroid were displayed, indicating the
image positions of the detected tissues.

Figure 4: Tissue localization with coordinates

5. Image Coordinates (Alternative Method): Using GINPUT, several feature points within
the image were identified; these could serve as reference points for real-time applications
if real-time spatial tracking were needed.

Figure 5: GINPUT Feature Points for Real-Time Mapping

Real-time Coordinates
For real-time coordinates, the Python programming language is used, together with the OpenCV library
for computer vision tasks and NumPy for numerical operations.

Libraries:
• OpenCV: For video capture, color detection, and contour operations.
• NumPy: For array operations, such as defining HSV ranges.

Code:

import cv2
import numpy as np

# Function to detect specific colors and their coordinates
def detect_colors(frame, color_ranges):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    detected_colors = []
    for color_name, (lower, upper) in color_ranges.items():
        mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            if cv2.contourArea(contour) > 500:  # Ignore small regions
                x, y, w, h = cv2.boundingRect(contour)
                detected_colors.append((color_name, x, y, w, h))
    return detected_colors

# Real-time color detection
def real_time_multi_color_detection():
    color_ranges = {
        "Blue": ([100, 150, 0], [140, 255, 255]),
        "Green": ([40, 70, 70], [80, 255, 255]),
        "Red": ([0, 150, 50], [10, 255, 255]),
        "Yellow": ([20, 100, 100], [30, 255, 255]),
        "Orange": ([10, 100, 100], [20, 255, 255]),
        "Purple": ([130, 50, 50], [160, 255, 255]),
        "Pink": ([160, 50, 50], [170, 255, 255])
    }

    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break

        detected_colors = detect_colors(frame, color_ranges)
        for color_name, x, y, w, h in detected_colors:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), 2)
            cv2.putText(frame, f"{color_name} ({x},{y})", (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 2)

        cv2.imshow("Detected Colors", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

# Run the detection
real_time_multi_color_detection()

Output:

Figure 6: Real-time coordinates
Figure 7: Real-time coordinates

Conclusion

This project applied image processing algorithms to ultrasound images in MATLAB to achieve tissue
detection, tissue localization, generation of bounding boxes around the localized tissues, and extraction
of the tissue coordinates. Furthermore, real-time coordinate detection using Python and OpenCV for
detection of multiple colors demonstrated sufficiently accurate spatial monitoring and analysis.
