Here we will connect the AI model with calculations that find people violating social distancing rules.
In this article, we continue developing a Python console application for an AI-powered social distancing detector. After learning how to detect people's locations in an image, we are ready to calculate the distances between them and indicate which people are too close to each other.
As shown in the image below, if the distance between two people is below a predefined threshold, those people will be outlined with red rectangles. The complete companion code is here.
Calculating the Distance Between Two Points
We start by creating a method to calculate the distance between two people. To do so, we calculate the Euclidean distance between the centers of two bounding boxes:
def calculate_distance_between_rectangle_centers(rect_center_1, rect_center_2):
    x_abs_diff = abs(rect_center_1[0] - rect_center_2[0])
    y_abs_diff = abs(rect_center_1[1] - rect_center_2[1])

    return math.sqrt(x_abs_diff**2 + y_abs_diff**2)
The above function is a static method of the DistanceAnalyzer class in the distance_analyzer module (see distance_analyzer.py in the Part_06 folder).
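As a quick sanity check (the coordinate values below are made up for illustration and are not part of the companion code), two centers that differ by 3 pixels horizontally and 4 pixels vertically should be exactly 5 pixels apart:

from distance_analyzer import DistanceAnalyzer

# Centers (120, 80) and (123, 84) differ by (3, 4) pixels,
# so the expected Euclidean distance is 5.0.
distance = DistanceAnalyzer.calculate_distance_between_rectangle_centers(
    (120, 80), (123, 84))
print(distance)  # 5.0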
Finding People That Are Too Close
To find people who are too close, we need to check the distance between every pair of detected people, which in principle requires a double loop. However, the distance between the ith and jth person is the same as between the jth and ith person, so the inner loop can be reduced from j = 1..N to j = i+1..N. Here is the complete implementation of a method that, given the list of detected people, finds those who are too close (also see distance_analyzer.py):
def find_people_that_are_too_close(detection_results, distance_threshold):
    results = []

    rectangle_centers = DistanceAnalyzer.get_rectangle_centers(detection_results)

    N = len(detection_results)

    for i in range(N):
        for j in range(i+1, N):
            rect_center_1 = rectangle_centers[i]
            rect_center_2 = rectangle_centers[j]

            distance = DistanceAnalyzer.calculate_distance_between_rectangle_centers(
                rect_center_1, rect_center_2)

            if(distance <= distance_threshold):
                results.append((detection_results[i]['rectangle'],
                    detection_results[j]['rectangle']))

    return results
The method accepts two parameters:

detection_results – The list of detected people returned by the AI model. Note that each element in that list is an object that holds the detection score, bounding box, and label.

distance_threshold – Specifies how close people can be in the image (measured in pixels). If the calculated distance between two people is below this value, their bounding boxes are added to the returned results list.
Given the input parameters, the method finds the centers of the bounding boxes and then calculates the distances between them. It returns the rectangles surrounding the people who are too close. Let's see how to display those rectangles in the image.
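The get_rectangle_centers helper is not listed above. A minimal sketch of what it could look like inside the DistanceAnalyzer class, assuming each detection stores its bounding box under the 'rectangle' key as two (x, y) corner points, is:

def get_rectangle_centers(detection_results):
    # Assumed rectangle format: ((x1, y1), (x2, y2)) under the 'rectangle' key.
    # The center is the average of the two corner points.
    centers = []

    for result in detection_results:
        (x1, y1), (x2, y2) = result['rectangle']
        centers.append(((x1 + x2) / 2, (y1 + y2) / 2))

    return centers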
Showing Who Is Not Social Distancing
To indicate people who are too close, we can use OpenCV's rectangle function, as explained earlier. The code looks very similar to the code we used to draw bounding boxes and labels for objects detected in the image. However, as input, we take the list returned by find_people_that_are_too_close. Remember, that method returns a list in which each element contains two rectangles: the bounding boxes of a pair of people violating the social distancing rule, that is, people closer than distance_threshold.
We need to iterate over the elements of that list and display two rectangles at a time (see the image_helper module from Part_03):
def indicate_people_that_are_too_close(image, people_that_are_too_close, delay=0):
    opencv.namedWindow(common.WINDOW_NAME, opencv.WINDOW_GUI_NORMAL)

    for i in range(len(people_that_are_too_close)):
        for j in range(len(people_that_are_too_close[i])):
            rectangle_points = people_that_are_too_close[i][j]

            opencv.rectangle(image, rectangle_points[0], rectangle_points[1],
                common.RED, common.LINE_THICKNESS)

    opencv.imshow(common.WINDOW_NAME, image)

    opencv.waitKey(delay)
As shown above, we use the namedWindow function to create the window and imshow to display the image. The other components are nearly the same as in the case of indicating detected objects.
Putting Things Together
With all of the above pieces, we can create the main script as follows (see main.py in the Part_07 folder):
import sys

sys.path.insert(1, '../Part_03/')
sys.path.insert(1, '../Part_05/')
sys.path.insert(1, '../Part_06/')

from inference import Inference as model
from image_helper import ImageHelper as imgHelper
from video_reader import VideoReader as videoReader
from distance_analyzer import DistanceAnalyzer as analyzer

if __name__ == "__main__":
    # Load and prepare model
    model_file_path = '../Models/01_model.tflite'
    labels_file_path = '../Models/02_labels.txt'

    # Initialize model
    ai_model = model(model_file_path, labels_file_path)

    # Initialize video reader
    video_file_path = '../Videos/01.mp4'
    video_reader = videoReader(video_file_path)

    # Detection and preview parameters
    score_threshold = 0.4
    delay_between_frames = 5

    # Perform object detection in the video sequence
    while(True):
        # Get frame from the video file
        frame = video_reader.read_next_frame()

        # If frame is None, then break the loop
        if(frame is None):
            break

        # Perform detection
        results = ai_model.detect_people(frame, score_threshold)

        # Find people that are too close
        proximity_distance_threshold = 50
        people_that_are_too_close = analyzer.find_people_that_are_too_close(
            results, proximity_distance_threshold)

        # Indicate those objects in the image
        imgHelper.indicate_people_that_are_too_close(
            frame, people_that_are_too_close, delay_between_frames)
The script sets up the AI model, opens the sample video file, and then finds people who are too close. Here, I set the distance threshold to 50 pixels; feel free to experiment with this parameter. After running main.py, you will get the results shown in the introduction.
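If you prefer to experiment without editing the script, one option (not part of the companion code) is to read the threshold from the command line, for example with argparse:

import argparse

# Hypothetical command-line option, e.g.: python main.py --distance-threshold 75
parser = argparse.ArgumentParser(description='AI-powered social distancing detector')
parser.add_argument('--distance-threshold', type=int, default=50,
    help='Distance (in pixels) below which two people are flagged as too close')
args = parser.parse_args()

proximity_distance_threshold = args.distance_threshold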
Wrapping Up
We have finally connected the AI model with calculations that find people violating social distancing rules. However, the model we've been using is not very robust, and sometimes the solution fails to correctly indicate people that are too close. We will fix that in the last article by incorporating a state-of-the-art YOLO detector.
Dawid Borycki is a software engineer and biomedical researcher with extensive experience in Microsoft technologies. He has completed a broad range of challenging projects involving the development of software for device prototypes (mostly medical equipment), embedded device interfacing, and desktop and mobile programming. Borycki is an author of two Microsoft Press books: “Programming for Mixed Reality (2018)” and “Programming for the Internet of Things (2017).”