Graduation Year

2018

Document Type

Dissertation

Degree

Ph.D.

Degree Name

Doctor of Philosophy (Ph.D.)

Degree Granting Department

Computer Science and Engineering

Major Professor

Miguel A. Labrador, Ph.D.

Co-Major Professor

Yu Sun, Ph.D.

Committee Member

Sean Barbeau, Ph.D.

Committee Member

Sudeep Sarkar, Ph.D.

Committee Member

Idalides Vergara, Ph.D.

Keywords

Fall Detection, Floor Detection, Object-on-floor Detection, Distance Estimation, Tripping Analysis

Abstract

The World Health Organization estimates that there are more than 285 million blind and visually impaired people in the world. In the US, 25 million Americans suffer from total or partial vision loss. As a result of their impairment, they struggle with mobility problems, especially the risk of falling. According to the National Council on Aging, falls are among the leading causes of fatal injury and the most common cause of non-fatal trauma-related hospital admissions among older adults. Visibility, an organization that helps visually impaired people, reports that people with visual impairments are twice as likely to fall as their sighted counterparts.

The Centers for Disease Control and Prevention reported that 2.5 million American adults were treated for fall-related injuries in 2013, leading to over 800,000 hospitalizations and over 27,000 deaths. The total cost of fall injuries in the United States in 2013 was $31 billion, and this figure is expected to rise to $67.7 billion by 2020. Reducing the number of these unexpected hospital visits saves money and improves the quality of life of the affected population.

Technology has completely revolutionized how everyday activities are conducted and how various tasks are accomplished, and mobile devices are at the center of this paradigm shift. According to the Pew Research Center, 64% of American adults currently own a smartphone, and this number is trending upward. Mobile computing devices have evolved to include a plethora of sensors that can be leveraged to build solutions for humanity, including fall prevention.

Fall prevention is an area of research that focuses on improving safety so that falls do not occur. Many fall prevention systems use sensing devices to estimate the likelihood of a fall. Sensor data are usually processed using computer vision, data mining, and machine learning techniques.

This work pertains to the implementation of a smartphone-based fall prevention system for the elderly and visually impaired. The system consists of two modules: fall prevention and fall detection. The fall prevention module identifies tripping hazards in the user’s surroundings. The fall detection module detects when a fall happens and alerts a person of interest. The proposed system faces multiple challenges: it has to run in near real time, it has to run efficiently on smartphone hardware, it has to handle both structured and unstructured environments, and it must cope with many issues related to image analysis (occlusion, motion blur, computational complexity, etc.).
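
As a rough structural sketch of the two-module decomposition described above (every name below is a hypothetical placeholder chosen for illustration, not the dissertation's actual code):

```python
# Hypothetical placeholder components illustrating the two-module structure;
# the real floor/object/distance and fall-classification logic is not shown.

def detect_floor(frame):
    """Placeholder: would return a mask of the floor region in the frame."""
    return None

def detect_objects_on_floor(frame, floor_mask):
    """Placeholder: would return objects resting on the detected floor."""
    return []

def estimate_distance(obj):
    """Placeholder: would return the object's distance from the user in meters."""
    return 0.0

def fall_prevention_step(frame):
    """Fall prevention: identify tripping hazards in the current camera frame."""
    floor_mask = detect_floor(frame)
    hazards = detect_objects_on_floor(frame, floor_mask)
    return [(obj, estimate_distance(obj)) for obj in hazards]

def fall_detection_step(imu_window, is_fall, alert_contact):
    """Fall detection: flag a fall in recent IMU data and alert a person of interest."""
    if is_fall(imu_window):
        alert_contact("Possible fall detected")
```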

The fall prevention module is divided into three parts: floor detection, object-on-floor detection, and distance estimation. In the evaluation, the best floor detection approach achieved an accuracy of 92%, a precision of 88%, and a recall of 92%; the best object-on-floor detection approach achieved an accuracy of 90%, a precision of 56%, and a recall of 78%; and the best distance estimation approach achieved a mean squared error (MSE) of 0.45 meters.
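
The accuracy, precision, and recall figures above follow the standard confusion-matrix definitions, and the distance error is a mean squared error over the test set. A worked example with made-up numbers (not results from the dissertation):

```python
# Standard metric definitions; the counts and distances below are invented
# purely to illustrate the formulas, not taken from the dissertation.

def classification_metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # also called sensitivity or true positive rate
    return accuracy, precision, recall

def mean_squared_error(predicted, actual):
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)

# Hypothetical confusion-matrix counts for a detector:
acc, prec, rec = classification_metrics(tp=78, fp=12, tn=102, fn=8)
print(f"accuracy={acc:.2f}  precision={prec:.2f}  recall={rec:.2f}")

# Hypothetical distance estimates vs. ground truth, both in meters:
print(mean_squared_error([1.2, 2.4, 3.1], [1.0, 2.5, 3.6]))  # ~0.10 m^2
```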

The fall detection module is approached from two perspectives: using the inertial measurement units (IMUs) embedded in today’s smartphones, and using a 2D camera. In the evaluation, the IMU-based solution achieved an accuracy of 83%, a precision of 89%, and a recall of 58.2%, while the 2D-camera-based solution achieved an accuracy of 85.37% and a recall of 70.97%.
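
One widely used IMU-based approach is to threshold the magnitude of the 3-axis accelerometer signal, looking for a free-fall dip followed shortly by an impact spike. The sketch below assumes that scheme with made-up thresholds; the dissertation's actual IMU and 2D-camera classifiers may differ.

```python
import math

# Assumed thresholds for illustration only (accelerometer readings in units of g).
FREE_FALL_G = 0.4   # magnitude well below 1 g suggests free fall
IMPACT_G = 2.5      # magnitude well above 1 g suggests an impact
MAX_GAP = 25        # max samples between free fall and impact (0.5 s at 50 Hz)

def detect_fall(samples):
    """samples: list of (ax, ay, az) accelerometer readings in g."""
    free_fall_at = None
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < FREE_FALL_G:
            free_fall_at = i
        elif (magnitude > IMPACT_G and free_fall_at is not None
              and i - free_fall_at <= MAX_GAP):
            return True  # free-fall dip followed closely by an impact spike
    return False
```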
