Files
MahaThesis2015.pdf (34.71 MB)
ABSTRACT MODELING OF GESTURE AND INTERACTIVE EMOTION USING FUZZY SET FOR SOCIAL ROBOTICS
Author
Thafar, Maha Abduljabbar
Permalink:
http://rave.ohiolink.edu/etdc/view?acc_num=kent1447707351
Abstract Details
Year and Degree
2015, MS, Kent State University, College of Arts and Sciences / Department of Computer Science.
Abstract
Recently, there has been growing interest in the development of automated intelligent machines. Specifically, more research has been conducted on improving the interaction between machines and human beings. For example, robotics applications in various human enterprises, such as security, education, and health care, have attracted researchers' attention because they address profound life demands. These intelligent robotic applications require a seamless communication channel with humans. To make this interaction more natural, robots should be able to perceive, recognize, and respond to human emotional states. This field of robotics, known as social robotics, is supported by emotion recognition systems. Humans show their emotions through both verbal and nonverbal communication, such as speech, vocal information, facial expressions, and gestures. Although several approaches have been proposed to recognize a limited set of human emotions based on a single modality, little work has been done to integrate and fuse multiple modalities. There is little work on identifying gestures and their role in identifying interactive emotions, and there are no abstract models for the fusion of multiple modalities to identify emotions. In this thesis, different gestures are analyzed, classified, recognized, and associated with typical human emotions at different intensity levels. Attributes such as facial expression and speech are also considered in the system. A knowledge base, the `Gesture Catalog', is generated for social robotics. Semantic algebra rules are defined to identify relationships among several attributes, namely facial expressions, gestures, intensity levels, and emotions. Gesture parameterization is developed using fuzzy sets and classification.
Then, an abstract mathematical model is presented for modeling gestures and identifying emotional states, deploying tuples, Cartesian products, mapping functions, and set theory. Finally, algorithms are developed for gesture localization, gesture fuzzification, and emotion mapping.
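The abstract's two final steps, gesture fuzzification and emotion mapping, can be illustrated with a minimal sketch. This is not the thesis's actual model: the triangular membership functions, the intensity labels (low/medium/high), the example gestures, and the rule table are all hypothetical stand-ins chosen to show how fuzzy sets over a gesture attribute can be combined with rules to select an emotion.

```python
# Hypothetical sketch: fuzzify a gesture's intensity, then map
# (gesture, intensity-set) pairs to emotions via a small rule table.

def triangular(x, a, b, c):
    """Triangular fuzzy membership function with peak at b."""
    if x < a or x > c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x > b:
        return (c - x) / (c - b)
    return 1.0

# Assumed fuzzy sets for gesture intensity on a normalized [0, 1] scale.
INTENSITY_SETS = {
    "low":    (0.0, 0.0, 0.5),
    "medium": (0.0, 0.5, 1.0),
    "high":   (0.5, 1.0, 1.0),
}

def fuzzify_intensity(x):
    """Degree of membership of intensity x in each fuzzy set."""
    return {name: triangular(x, a, b, c)
            for name, (a, b, c) in INTENSITY_SETS.items()}

# Invented example rules; a real catalog would be far larger.
RULES = {
    ("arms_raised", "high"):   "joy",
    ("arms_crossed", "high"):  "anger",
    ("head_down", "medium"):   "sadness",
}

def map_emotion(gesture, intensity):
    """Return the emotion of the strongest-firing rule for this gesture."""
    memberships = fuzzify_intensity(intensity)
    best, strength = None, 0.0
    for (g, level), emotion in RULES.items():
        if g == gesture and memberships[level] > strength:
            best, strength = emotion, memberships[level]
    return best, strength
```

For instance, `map_emotion("arms_raised", 0.9)` fires the `("arms_raised", "high")` rule with a high membership degree and returns `"joy"`. The rule table plays the role the abstract assigns to the Cartesian product of attributes and a mapping function from attribute tuples to emotions.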
Committee
Arvind Bansal (Advisor)
Pages
210 p.
Subject Headings
Computer Science
Recommended Citations
APA Style (7th edition)
Thafar, M. A. (2015). Abstract modeling of gesture and interactive emotion using fuzzy set for social robotics [Master's thesis, Kent State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=kent1447707351

MLA Style (8th edition)
Thafar, Maha. Abstract Modeling of Gesture and Interactive Emotion Using Fuzzy Set for Social Robotics. 2015. Kent State University, Master's thesis. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=kent1447707351.

Chicago Manual of Style (17th edition)
Thafar, Maha. "Abstract Modeling of Gesture and Interactive Emotion Using Fuzzy Set for Social Robotics." Master's thesis, Kent State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=kent1447707351
Document number:
kent1447707351
Download Count:
111
Copyright Info
© 2015, all rights reserved.
This open access ETD is published by Kent State University and OhioLINK.