Abstract
Aggression in children is frequent during the early years of childhood. Among children with psychiatric disorders in general, and autism in particular, rates of challenging behaviours and aggression are higher. These can take different forms, such as hitting, kicking, and throwing objects. Social robots that are able to detect undesirable interactions within their surroundings can be used to target such behaviours. In this study, we evaluate the performance of five machine learning techniques in characterizing five possible undesired interactions between a child and a social robot. We examine the effects of adding different combinations of raw data and extracted features acquired from two sensors on the performance and speed of prediction. Additionally, we evaluate the performance of the best developed model with children. Experiments with the machine learning algorithms showed that XGBoost achieved the best performance across all metrics (e.g., an accuracy of 90%) and provided fast predictions (i.e., 0.004 s) for the test samples. Experiments with features showed that acceleration data contributed more to the predictions than gyroscope data, and that combining raw data with extracted features provided a better overall performance. Testing the best model with data acquired from children performing interactions with toys produced a promising performance for the shake and throw behaviours. The findings of this work can be used by social robot developers to address undesirable interactions in their robotic designs.
1 Introduction
Aggression is defined as behaviour by a living being that is meant to cause harm, violate rights, or hurt others psychologically or physically [1]. According to the American Psychological Association (APA), aggression can be intended to cause harm (i.e., hostile), not intended to harm (i.e., instrumental), or emotionally motivated (i.e., affective) [2]. Aggression among children was reported to be most frequent during the pre-school years and early childhood, taking the form of physical aggression such as biting and hitting [3, 4]. The reported rates of aggression are high among children with psychiatric disorders (i.e., 45.8% to 62.3%) and are one of the reasons for mental health referrals among children [5, 6].
Children with autism have higher rates of challenging behaviours and aggression compared to children with other developmental disabilities, and these behaviours were reported to start as early as infancy [7,8,9,10]. Rates of challenging behaviours are high (e.g., 49% to 69%) and increase with the severity of autism [11,12,13,14,15]. A challenging behaviour might manifest as a meltdown, tantrum, withdrawal, or a stereotypical behaviour that could pose a risk to the children themselves or to others around them [16, 17].
Technological advancement has accelerated the integration of robots in healthcare to rehabilitate, monitor, assist in surgeries, and improve the quality of patients’ lives [18,19,20]. Social robots in autism therapy have gained a lot of attention in the past decade due to reported positive outcomes, such as increased attention and imitation [21]. However, having this form of technology, which is meant to elicit behaviours, in the vicinity of children with autism might trigger negative or undesirable reactions. For example, some studies reported challenging behaviours and aggression during interaction sessions with social robots [22,23,24]. Some forms of challenging behaviours, such as throwing a small robot or a toy, might cause harm, especially if it hits another person’s head [25]. Design considerations are needed to account for such scenarios [26,27,28,29].
Social robots can be used to address the issue of aggression and undesirable behaviours among children to prevent progression and potential harm. In combination with other sensors and wearables, a social robot can identify the occurrence of these negative interactions of children with their surroundings (e.g., toys) and respond appropriately [24, 30]. The robot’s responses to the child can take different forms (e.g., gestures and sounds) and should be clear enough for the child to comprehend [31]. To date, limited work has been done to identify such interactions and the means to address them [23, 32, 33].
In this study, we evaluate the performance of five machine learning techniques in characterizing five possible undesired interactions plus an idle state. We examine the effects of adding different combinations of raw data and extracted features acquired from two sensors on the performance and speed of prediction. Additionally, we test the performance of the developed model with children.
The contributions are summarized as follows:
1. Evaluating the performance of different machine learning techniques and their prediction speed.
2. Studying the effects of different combinations of raw data and extracted features acquired from two sensors.
3. Testing the best performing model with interaction data acquired from children.
This paper is organized as follows. Section 2 provides the background. Section 3 describes the materials and methods. Section 4 presents the results, and Section 5 presents the discussion.
2 Background
Reactions in social robotics are essential to establish meaningful interactions. A few commercially available robots exhibit responses once manipulated or handled in a specific way. Professor Einstein (Hanson Robotics, Hong Kong) is one example of a small humanoid robot that resembles an actual human. Along with the capability of wirelessly integrating with a mobile app, this robot can track faces and perform certain preprogrammed interactions, such as telling a joke and pointing its hand. PARO is an interactive animaloid robot that is made to resemble a seal [34]. PARO can perform limited physical interactions using light, audio, and tactile sensors once handled in a specific way. For example, it emits sounds when stroked.
Sensors and wearable devices are being used to acquire data of different modalities to assess the activities and conditions of their users [35,36,37]. Solutions based on motion sensors have been used in healthcare applications, such as detecting falls among the elderly using wearable devices [38, 39]. For example, one study used an accelerometer embedded in a belt to detect falls with an accuracy of 99.4% using a machine learning classifier [39]. In another healthcare application, a study considered using a wearable device to predict the occurrence of challenging behaviours among children with autism using machine learning techniques [30]. Motion sensors are also being considered in applications that require direct interactions with robots, such as robot-based games [40, 41]. For example, one study considered a tri-axial accelerometer to detect a player’s motions relative to a robot, such as dodging and running [42].
Few studies have been conducted to classify the interactions that might occur between a child and a robot [23, 43]. One study used a ball-like robot to categorize interactions (e.g., kicking and pickup) using an accelerometer and a gyroscope embedded in the robot [33]. The study used data acquired from adult participants to train a supervised machine learning model that was then tested with children’s data, achieving an accuracy of 49%. In an earlier study, we considered the magnitude of raw acceleration data over a small window of 25 samples to characterize six possible interactions and scenarios between a child and a social robot [32]. The considered behaviours were hit, throw, drop, shake, pickup or carry, and being idle. The model, based on a neural network, achieved 80% accuracy when tested with data acquired from children. In another work, we investigated the influence of the reaction time of a robot’s response on children’s comprehension when an undesirable behaviour is performed with the robot [31]. The findings highlight the importance of providing a quick response once an unwanted interaction has been detected.
3 Materials and Methods
3.1 Participants
3.1.1 Data Collection
The data were collected from six adult participants performing different undesirable interactions with three different toys (Fig. 1a). The considered interactions were hitting, throwing, shaking, carrying, being idle, and dropping. A total of around six thousand instances were collected from the adult participants for the six classes. Idle was considered to cover the no-interaction case, while carry was considered because it might be a precursor to other interactions. The data were then annotated to highlight the interactions. Handcrafted features were extracted from the annotated data over a window size of 30 samples. More details about the collection procedures and access to the raw data can be found in earlier work [32, 44].
3.1.2 Evaluation
Data acquired from ten children were used in the evaluation of the best developed machine learning model. The total duration of their interactions was around 30 min (around 3 min per child), with an average of 176 instances per session. The children performed three scenarios with the three toys (Fig. 3a). In each scenario, the children were told an imaginative scenario to prompt an interaction (Table 1), for example, “You need to pick the robot up, and shake it to wake it up.” Parental consent was secured by the school and the children were accompanied by their teachers. The procedures for this work did not include invasive or potentially hazardous methods and were in accordance with the Code of Ethics of the World Medical Association (Declaration of Helsinki).
3.2 Experimental Setup
3.2.1 Detection Device
The device that was meant to detect the undesired interactions and to send commands to the social robot was based on a Raspberry Pi. Raspbian (v4.14, Debian Project) was used as the operating system. The Raspberry Pi was fitted with a Sense HAT board that contains different sensors and a display (Fig. 1c). The Sense HAT includes an IMU (LSM9DS1, STMicroelectronics) comprising an accelerometer, a gyroscope, and a magnetometer. The built-in accelerometer can acquire acceleration values of up to ± 16 g at 30 Hz, while the gyroscope can measure angular rates of up to ± 2000 dps. This device has the flexibility of being embedded in different robotic forms to acquire new data if required. A dedicated power bank was used to power the recognition devices inside the toys. The device was used to test the developed machine learning models for the detection of undesired interactions.
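As an illustrative sketch (assuming the standard `sense_hat` Python API), the IMU data used by the detection device could be read as follows; the hardware access is kept separate from the pure magnitude helper so the latter runs anywhere:

```python
import math

def acceleration_magnitude(sample):
    """Magnitude of a tri-axial accelerometer sample given as a dict with x, y, z keys (in g)."""
    return math.sqrt(sample["x"] ** 2 + sample["y"] ** 2 + sample["z"] ** 2)

def read_imu_samples(n_samples):
    """Read n accelerometer/gyroscope sample pairs from the Sense HAT (requires the hardware)."""
    from sense_hat import SenseHat  # imported lazily so the helper above stays hardware-free
    sense = SenseHat()
    samples = []
    for _ in range(n_samples):
        accel = sense.get_accelerometer_raw()  # {'x': ..., 'y': ..., 'z': ...} in g
        gyro = sense.get_gyroscope_raw()       # {'x': ..., 'y': ..., 'z': ...} in rad/s
        samples.append((accel, gyro))
    return samples
```

At rest, the accelerometer magnitude should read close to 1 g, which is useful as a sanity check before collecting interaction data.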
3.2.2 Social Robot
Three different toys were considered representing different forms of social robots. Each toy was embedded with a recognition device. The first toy was a stuffed panda, and the second toy was a stuffed toy robot, while the third toy was an excavator toy (Fig. 1a).
3.3 Development of Machine Learning Models
3.3.1 Algorithms
Five different machine learning algorithms were considered to evaluate their efficacy in distinguishing between the six classes. All the machine learning algorithms were developed in the Python programming language. The considered algorithms are listed below:
Decision Tree (DT): A decision tree is a tree-like method to help in making decisions by listing all the possible outcomes. A typical DT model consists of nodes (e.g., decision nodes, internal nodes, and leaf nodes) and a hierarchy of branches that are constructed from building steps such as splitting, stopping and pruning.
Random Forest (RF): RF is a classification algorithm that consists of multiple decision trees trained on different portions of a training set.
K-Nearest Neighbor (KNN): KNN is a non-parametric algorithm that finds the k closest examples in a dataset. KNN can be used in both classification and regression.
Multilayer Perceptron (MLP): MLP is a neural network that consists of an input layer and an output layer with one hidden layer in between. More complex configurations may include several hidden layers.
EXtreme Gradient Boosting (XGBoost): XGBoost is an ensemble machine learning model based on decision trees that makes use of the gradient boosting framework.
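For illustration, the five classifiers can be instantiated in Python with scikit-learn and the xgboost package. This is a hedged sketch on synthetic data standing in for the interaction dataset; hyperparameters are library defaults rather than the study's settings, and scikit-learn's GradientBoostingClassifier is substituted when xgboost is not installed:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

try:
    from xgboost import XGBClassifier
    boosted = XGBClassifier(eval_metric="mlogloss")
except ImportError:  # fallback when xgboost is unavailable
    boosted = GradientBoostingClassifier(random_state=0)

# Synthetic stand-in for the 6-class interaction data (hit, throw, drop, shake, carry, idle)
X, y = make_classification(n_samples=600, n_features=27, n_informative=10,
                           n_classes=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
    "XGBoost": boosted,
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, round(model.score(X_test, y_test), 3))
```

The same fit/score loop makes the side-by-side comparison in Table 2 straightforward to reproduce on any dataset with this shape.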
3.3.2 Data Format
Combinations of raw signal data and time-domain extracted features were used in testing the machine learning models (Fig. 2). The raw data contained the signals acquired from the gyroscope and the accelerometer. Additionally, the magnitude of acceleration (A) was calculated. The extracted time-domain features were the max, min, mean, and standard deviation. These features were extracted from the gyroscope and accelerometer raw data over a window size of 30 samples. The extracted features and raw data were used to develop and test the machine learning algorithms. Balanced data for each class were used in training the machine learning models. Unseen samples for each behaviour were used in testing. These samples were used to calculate the speed of prediction (i.e., test time) for each algorithm.
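The windowing described above can be sketched as follows; the exact ordering and composition of the final input vector are illustrative assumptions, not the study's published format:

```python
import numpy as np

WINDOW = 30  # samples per window, as in the study

def window_features(signal):
    """Extract max, min, mean, and std per axis over a (WINDOW, n_axes) block of raw samples."""
    return np.concatenate([signal.max(axis=0), signal.min(axis=0),
                           signal.mean(axis=0), signal.std(axis=0)])

def build_feature_vector(accel, gyro):
    """Combine raw data with extracted features for one window.
    accel, gyro: arrays of shape (WINDOW, 3)."""
    magnitude = np.linalg.norm(accel, axis=1, keepdims=True)  # A = sqrt(ax^2 + ay^2 + az^2)
    raw = np.hstack([accel, gyro, magnitude]).ravel()         # flattened raw window (30 x 7 values)
    feats = np.concatenate([window_features(accel), window_features(gyro)])  # 4 stats x 3 axes x 2 sensors
    return np.concatenate([raw, feats])

rng = np.random.default_rng(0)
vec = build_feature_vector(rng.normal(size=(WINDOW, 3)), rng.normal(size=(WINDOW, 3)))
```

Dropping either `raw` or `feats` from the final concatenation reproduces the raw-only and features-only configurations compared in the experiments.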
3.3.3 Evaluation Metrics
All trained models were evaluated based on accuracy, precision, recall, and F1-score. Additionally, the training and testing times for each algorithm were calculated. The evaluation metrics are defined as follows, where TP, TN, FP, and FN denote true positives, true negatives, false positives, and false negatives:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
F1-score = 2 × (Precision × Recall) / (Precision + Recall)
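These metrics can be computed with scikit-learn; below is a small sketch with hypothetical labels, using macro averaging as one reasonable choice for a six-class setting (not necessarily the study's choice):

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [0, 1, 2, 2, 1, 0, 2, 1]  # hypothetical ground-truth class labels
y_pred = [0, 1, 2, 1, 1, 0, 2, 2]  # hypothetical model predictions

acc = accuracy_score(y_true, y_pred)                       # 6 of 8 correct -> 0.75
prec = precision_score(y_true, y_pred, average="macro")
rec = recall_score(y_true, y_pred, average="macro")
f1 = f1_score(y_true, y_pred, average="macro")
```

Macro averaging weights every class equally, which matches the balanced training data used in this study.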
4 Results
4.1 Machine Learning Algorithms
The five algorithms were tested and their evaluation metrics were tabulated (Table 2). Additionally, the training and testing times were calculated. In terms of precision, XGBoost scored the best, followed by RF, while KNN scored the lowest. Similarly, XGBoost achieved the best results in terms of recall, F1-score, and accuracy. RF was the second best algorithm, while KNN was the worst performer. However, the KNN algorithm was the fastest to train, followed by DT. MLP took the longest time to train, followed by RF and then XGBoost. In terms of test time, DT was the fastest to predict the testing samples, followed by XGBoost, while KNN was the slowest. XGBoost was selected for the subsequent feature experiments because it achieved the best results with relatively fast training and testing times.
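Training and test times of the kind reported in Table 2 can be measured with a monotonic clock; a minimal sketch using a decision tree on random stand-in data:

```python
import time
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 27))    # stand-in feature matrix
y = rng.integers(0, 6, size=1000)  # stand-in labels for the 6 classes

clf = DecisionTreeClassifier(random_state=0)

t0 = time.perf_counter()
clf.fit(X, y)
train_time = time.perf_counter() - t0  # wall-clock training time in seconds

t0 = time.perf_counter()
preds = clf.predict(X[:100])
test_time = time.perf_counter() - t0   # wall-clock prediction time in seconds
```

`time.perf_counter()` is preferred over `time.time()` here because it is monotonic and has the highest available resolution for short intervals.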
4.2 Experiments with Features
Machine learning models based on XGBoost were trained to experiment with different combinations of features. The tested configurations included raw data only, extracted features only, and a combination of both. The experiments were performed on the separate data of the accelerometer and gyroscope sensors and on their combined data. The results for these tests, along with their corresponding training and test times, were calculated (Table 3).
For the accelerometer, the model with the combined raw and extracted-feature data achieved the best outcomes in terms of precision, recall, F1-score, and accuracy. Not far behind, the model based on the raw data alone achieved the second best outcomes. The feature-based model was the fastest in terms of training time. The test times for the three models were close.
Compared to the experiments conducted with the accelerometer, the scores for the gyroscope experiments were lower (e.g., accuracy of 89% vs 66%). Similar to the observations made for the accelerometer, the combined experiment for the gyroscope achieved the highest scores while the feature experiment achieved the fastest training time.
The experiments with the combined sensors achieved slightly better results compared to those of the accelerometer alone. The combined raw and extracted features of the two sensors achieved the best results compared to any other combination, but at an increased training time. Additionally, the test time increased slightly.
4.3 Evaluation Experiments with Children
The best trained model (i.e., based on XGBoost) using adult data was evaluated with data acquired from children interacting with the three toys while mimicking actual scenarios (Fig. 3a). The duration of the interactions was short (i.e., 3 min per child) and was limited to three scenarios demonstrating the shaking, hitting, and throwing behaviours. The changes in the magnitude of acceleration made it possible to identify the segments corresponding to the three scenarios (Fig. 3b).
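Segmenting a recording by changes in acceleration magnitude can be sketched with a simple threshold rule; the rest level and thresholds below are illustrative assumptions, not the study's values:

```python
import numpy as np

def active_segments(magnitude, rest=1.0, threshold=0.5, min_len=5):
    """Return (start, end) index pairs where |A - rest| exceeds threshold
    for at least min_len consecutive samples."""
    active = np.abs(np.asarray(magnitude) - rest) > threshold
    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(active) - start >= min_len:
        segments.append((start, len(active)))
    return segments

# Synthetic trace: at rest near 1 g, with a burst of activity in the middle
trace = np.ones(100)
trace[40:60] = 3.0
```

The `min_len` guard suppresses single-sample spikes so that only sustained activity is treated as an interaction segment.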
The data for the children were analyzed and the behaviours were predicted using the best trained model. The prediction outcomes for each scenario performed by each child were averaged and plotted as bar charts (Fig. 3c). In the first scenario, the model was able to identify shaking instances correctly; however, a few instances of hit and throw were also detected. The model was also able to detect hit instances in the second scenario, but along with shake and throw instances. In the last scenario, the model identified throw instances correctly, along with shake and a few hit instances.
5 Discussion
The dynamics of the physical interaction between a human and a robot can be complex. Hence, identification strategies and detection methods can be used to decipher these interactions. This study considered the integration of sensors and machine learning techniques aimed at detecting undesired interactions between children and the toys in their surroundings. Based on data acquired from adult participants, the best-trained model, based on XGBoost, showed promising potential in detecting undesired interactions between children and the three toys. Over the short duration of each experiment, the algorithm was able to identify the behaviours of interest. However, there were instances of incorrect predictions in each scenario. This could be attributed to the complexity of children’s interactions with the toys, which made some behaviours intertwine and their predictions overlap. During the hitting scenario, some instances were predicted as shake while others as throw. Predicting some instances as shake could be attributed to the gentle hitting performed by the children compared to the adult participants, which caused the toys to shake. As for the incorrect throw instances, these could be attributed to the way some of the children were holding the toy while hitting. Some children were carrying the toys; hence, the hit confused the model and was reported as a throw. Additionally, part of a full throw involves a hit as a result of an impact with a surface. These intricate dynamics imply the need for careful consideration when developing machine learning algorithms for applications that involve aggressive interactions.
How quickly an algorithm can predict an undesired behaviour plays an important role during interactions with, or in the presence of, a social robot. Many factors affect the time required to process new sensory data and for a machine learning model to provide a prediction. The selection of the machine learning algorithm is another crucial decision. Selecting an algorithm with many parameters to tune and a long training time makes it more challenging to optimize and experiment on the actual system. In our tests, XGBoost provided a relatively quick training time without compromising performance and with less tuning effort. Additionally, the selection of an algorithm directly affects the time needed to make a prediction (Table 2). Having a machine learning model with quick predictions is crucial in applications that require a robot to respond promptly to certain undesired interactions.
The type of data and the number of sensors affect the performance and speed of a machine learning algorithm (Table 3). Using extracted features, which form a smaller input vector than the raw data, provided faster training times, but at a slightly reduced overall performance. The time required to compute such features from the raw data may introduce an extra delay during the actual operation of a detection system. Using multiple sensors of different modalities might increase the overall accuracy; however, the time required to process their data will also increase. This study showed that using one sensor (i.e., an accelerometer) that measures a single modality is enough to reach a high prediction performance. While the gyroscope did not provide a noticeable improvement over the accelerometer in detecting undesired interactions, it might still be useful for other purposes. For example, the gyroscope can be used to detect different aspects of interactions, such as the orientation of a toy or robot, which might be useful for applications that require specific interactions to be performed.
Certain design aspects are essential in robots or toys that are meant to detect undesired interactions. The internal structure should be robust enough to withstand such aggressive behaviours (e.g., hitting) while the outer structure should be optimized to mitigate any potential harm. Additionally, embedded sensors should withstand the dynamics of such interactions and not drift or lose accuracy over time due to damage, heat, or misalignment. Compensation techniques through software implementations can be used to address some of these challenges. Another design consideration is the number of detected undesired interactions needed before a robot should make a response. For example, the needed number of detected hits within a time frame before a robot may react should be determined. A frequent response to every behaviour might appear unnatural while less frequent ones might make the interactions feel dull [31]. Designers of social robots might need to make a trade-off between different parameters and traits to meet the requirements of their robotic designs and intended applications [29].
The current work has certain limitations. The data used to develop the machine learning models were obtained from adult participants, while the target end users are children. Furthermore, the current study did not evaluate the developed model with children with different degrees of autism. More data need to be acquired from children with and without autism to capture the full spectrum of interactions among children. The tested detection devices and toys were limited to off-the-shelf options that might not be suitable for such applications. More dedicated, custom-made devices that withstand aggressive behaviours are needed to perform better evaluations. The conducted tests were limited to a few children; hence, experimental evaluations of robotic reactions with more children are needed.
6 Conclusion
The occurrence of aggression among neurotypical children and those with psychiatric disorders is high and can be concerning to their families, therapists, and caregivers. Technology, such as social robots, can be used to address such behaviours. However, social robots need to be able to detect such interactions. In this study, we demonstrated the possibility of detecting different interactions using a detection device and machine learning techniques. Detection algorithms, such as XGBoost, can accurately distinguish between different behaviours, including undesirable ones such as throwing. Furthermore, they can provide quick predictions for new data, hence reducing the overall time delay. Data acquired from a single tri-axial accelerometer alone can be sufficient to provide the necessary information for the machine learning model to make a sufficiently accurate prediction. However, integrating more sensors, such as a gyroscope, can be useful to capture different aspects of interactions. Having a social robot that responds quickly to interactions within its environment is possible using simple solutions.
The insights and findings of this work can be further explored by researchers in the field of social robotics to integrate new concepts and solutions into their designs. A social robot that can detect direct physical interactions between children and their environments can be used to address the issue of aggression and challenging behaviours among children with or without psychiatric disorders.
Data Availability
The datasets generated during and/or analysed during the current study are available from the corresponding author on request.
References
Connor DF (2012) Aggression and antisocial behavior in children and adolescents: research and treatment. Guilford Press
American Psychological Association (2018) Apa dictionary of psychology. https://dictionary.apa.org/aggression. Accessed 24 Jan 2022
NICHD Early Child Care Research Network (2004) Trajectories of physical aggression from toddlerhood to middle childhood: predictors, correlates, and outcomes. Monographs of the Society for Research in Child Development, pp i–143
Alink LR, Mesman J, Van Zeijl J, Stolk MN, Juffer F, Koot HM, Bakermans-Kranenburg MJ, Van IJzendoorn M H (2006) The early childhood aggression curve: development of physical aggression in 10-to 50-month-old children. Child Dev 77(4):954–966
Nock MK, Kazdin AE, Hiripi E, Kessler RC (2007) Lifetime prevalence, correlates, and persistence of oppositional defiant disorder: results from the national comorbidity survey replication. J Child Psychol Psychiatry 48(7):703–713
Sukhodolsky DG, Smith SD, McCauley SA, Ibrahim K, Piasecka JB (2016) Behavioral interventions for anger, irritability, and aggression in children and adolescents. J Child Adolesc Psychopharmacol 26(1):58–64
Estes A, Munson J, Dawson G, Koehler E, Zhou X-H, Abbott R (2009) Parenting stress and psychological functioning among mothers of preschool children with autism and developmental delay. Autism 13(4):375–387
Gurney JG, McPheeters ML, Davis MM (2006) Parental report of health conditions and health care use among children with and without autism: national survey of children’s health. Arch Pediatr Adol Med 160(8):825–830
Fodstad JC, Rojahn J, Matson JL (2012) The emergence of challenging behaviors in at-risk toddlers with and without autism spectrum disorder: a cross-sectional study. J Dev Phys Disabil 24(3):217–234
Kozlowski AM, Matson JL (2012) An examination of challenging behaviors in autistic disorder versus pervasive developmental disorder not otherwise specified: Significant differences and gender effects. Res Autism Spect Disord 6(1):319–325
Murphy O, Healy O, Leader G (2009) Risk factors for challenging behaviors among 157 children with autism spectrum disorder in ireland. Res Autism Spect Disord 3(2):474–482
Matson JL, Shoemaker M (2009) Intellectual disability and its relationship to autism spectrum disorders. Res Dev Disabil 30(6):1107–1114
Kanne SM, Mazurek MO (2011) Aggression in children and adolescents with asd: prevalence and risk factors. J Autism Dev Disord 41(7):926–937
Baghdadli A, Pascal C, Grisi S, Aussilloux C (2003) Risk factors for self-injurious behaviours among 222 young children with autistic disorders. J Intellect Disabil Res 47(8):622–627
Bodfish JW, Symons FJ, Parker DE, Lewis MH (2000) Varieties of repetitive behavior in autism: comparisons to mental retardation. J Autism Dev Disord 30(3):237–243
Machalicek W, O’Reilly MF, Beretvas N, Sigafoos J, Lancioni GE (2007) A review of interventions to reduce challenging behavior in school settings for students with autism spectrum disorders. Res Autism Spect Disord 1(3):229–246
Johnson NL, Lashley J, Stonek AV, Bonjour A (2012) Children with developmental disabilities at a pediatric hospital: Staff education to prevent and manage challenging behaviors. J Pediatr Nurs 27(6):742–749
Robinson H, MacDonald B, Broadbent E (2014) The role of healthcare robots for older people at home: A review. Int J Soc Robot 6(4):575–591
Broekens J, Heerink M, Rosendal H et al (2009) Assistive social robots in elderly care: a review. Gerontechnology 8(2):94–103
Cabibihan J-J, Alhaddad AY, Gulrez T, Yoon WJ (2021) Influence of visual and haptic feedback on the detection of threshold forces in a surgical grasping task. IEEE Robot Autom Lett 6(3):5525–5532
Cabibihan J-J, Javed H, Ang M Jr, Aljunied SM (2013) Why robots? a survey on the roles and benefits of social robots in the therapy of children with autism. Int J Soc Robot 5(4):593–618
Alhaddad AY, Javed H, Connor O, Banire B, Al Thani D, Cabibihan J-J (2019) Robotic trains as an educational and therapeutic tool for autism spectrum disorder intervention. In: Lepuschitz W, Merdan M, Koppensteiner G, Balogh R, Obdržálek D (eds) Robotics in education. Springer International Publishing, Cham, pp 249–262
Boccanfuso L, Barney E, Foster C, Ahn YA, Chawarska K, Scassellati B, Shic F (2016) Emotional robot to examine differences in play patterns and affective response of children with and without asd. In: The eleventh ACM/IEEE international conference on human robot interaction. IEEE Press, pp 19–26
Cabibihan J-J, Chellali R, So CWC, Aldosari M, Connor O, Alhaddad AY, Javed H (2018) Social robots and wearable sensors for mitigating meltdowns in autism—a pilot test. In: Ge SS, Cabibihan J-J, Salichs MA, Broadbent E, He H, Wagner AR, Castro-González Á (eds) Social robotics. Springer International Publishing, Cham, pp 103–114
Alhaddad AY, Cabibihan J-J, Bonarini A (2018) Head impact severity measures for small social robots thrown during meltdown in autism. Int J Social Robot, pp 1–16
Alhaddad AY, Cabibihan J-J, Hayek A, Bonarini A (2019) Safety experiments for small robots investigating the potential of soft materials in mitigating the harm to the head due to impacts. SN Appl Sci 1(5):476
Alhaddad AY, Cabibihan J-J, Hayek A, Bonarini A (2019) Influence of the shape and mass of a small robot when thrown to a dummy human head. SN Appl Sci 1(11):1468
Cabibihan J-J, Javed H, Sadasivuni KK, Alhaddad AY (2020) Smart robotic therapeutic learning toy. US Patent 10,792,581, Oct 6, 2020
Alhaddad AY, Mecheter A, Wadood MA, Alsaari AS, Mohammed H, Cabibihan JJ (2021) Anthropomorphism and its negative attitudes, sociability, animacy, agency, and disturbance requirements for social robots: a pilot study. In: International conference on social robotics. Springer, pp 791–796
Alban AQ, Ayesh M, Alhaddad AY, Al-Ali AK, So WC, Connor O, Cabibihan JJ (2021) Detection of challenging behaviours of children with autism using wearable sensors during interactions with social robots. In: 2021 30th IEEE international conference on robot & human interactive communication (RO-MAN). IEEE, pp 852–857
Alhaddad AY, Cabibihan JJ, Bonarini A (2020) Influence of reaction time in the emotional response of a companion robot to a child’s aggressive interaction. Int J Soc Robot 12:1279–1291. https://doi.org/10.1007/s12369-020-00626-z
Alhaddad AY, Cabibihan J-J, Bonarini A (2019) Recognition of aggressive interactions of children toward robotic toys. In: 2019 28th IEEE international conference on robot and human interactive communication (RO-MAN). IEEE, pp 1–8
Li B, Boccanfuso L, Wang Q, Barney E, Ahn YA, Foster C, Chawarska K, Scassellati B, Shic F (2016) Human robot activity classification based on accelerometer and gyroscope. In: 2016 25th IEEE international symposium on robot and human interactive communication (RO-MAN), pp 423–424
Shibata T (2012) Therapeutic seal robot as biofeedback medical device: qualitative and quantitative evaluations of robot therapy in dementia care. Proc IEEE 100(8):2527–2538
Ann OC, Theng LB (2014) Human activity recognition: a review. In: 2014 IEEE International conference on control system, computing and engineering (ICCSCE). IEEE, pp 389–393
Cabibihan J-J, Javed H, Aldosari M, Frazier T W, Elbashir H (2017) Sensing technologies for autism spectrum disorder screening and intervention. Sensors 17(1)
Alhaddad AY, Aly H, Gad H, Al-Ali A, Sadasivuni KK, Cabibihan J-J and Malik RA (2022) Sense and Learn: recent advances in wearable sensing and machine learning for blood glucose monitoring and trend-detection. Front Bioeng Biotechnol 10:876672. https://doi.org/10.3389/fbioe.2022.876672
Bagala F, Becker C, Cappello A, Chiari L, Aminian K, Hausdorff JM, Zijlstra W, Klenk J (2012) Evaluation of accelerometer-based fall detection algorithms on real-world falls. PLoS ONE 7(5):e37062
Sucerquia A, López JD, Vargas-Bonilla JF (2018) Real-life/real-time elderly fall detection with a triaxial accelerometer. Sensors 18(4):1101
Oliveira EL, Orrù D, Morreale L, Nascimento TP, Bonarini A (2018) Learning and mining player motion profiles in physically interactive robogames. Future Internet 10(3):22
Oliveira E, Orrù D, Nascimento T, Bonarini A (2017) Modeling player activity in a physical interactive robot game scenario. In: Proceedings of the 5th international conference on human agent interaction. ACM, pp 411–414
Oliveira EL, Orrù D, Nascimento T, Bonarini A (2017) Activity recognition in a physical interactive robogame. In: 2017 Joint IEEE international conference on development and learning and epigenetic robotics (ICDL-EpiRob). IEEE, pp 92–97
Feil-Seifer D, Matarić MJ (2011) Automated detection and classification of positive vs. negative robot interactions with children with autism using distance-based features. In: 2011 6th ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 323–330
Alhaddad AY, Cabibihan J-J, Bonarini A (2021) Datasets for recognition of aggressive interactions of children toward robotic toys. Data Brief 34:106697
Acknowledgements
The work was supported by a research grant from QU Marubeni Concept to Prototype Grant under the grant number M-CTP-CENG-2020-4. The statements made herein are solely the responsibility of the authors. Open Access funding provided by the Qatar National Library.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Alhaddad, A.Y., Cabibihan, JJ. & Bonarini, A. Real-Time Social Robot’s Responses to Undesired Interactions Between Children and their Surroundings. Int J of Soc Robotics 15, 621–629 (2023). https://doi.org/10.1007/s12369-022-00889-8