Can robots recover a service using interactional justice as frontline employees do?
https://doi.org/10.1007/s11628-023-00525-z
REVIEW ARTICLE
Received: 19 July 2022 / Accepted: 30 January 2023 / Published online: 12 February 2023
© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2023
Abstract
Interactional justice (e.g., empathy) plays a crucial role in service recovery. It relies on human social skills that would seem to preclude its automation. However, several considerations challenge this view. Interactional justice is not always necessary to recover a service, and progress in social robotics enables service robots to handle
social interactions. This paper reviews service recovery and social robotics litera-
ture and addresses whether service robots can use interactional justice as frontline
employees do during service recovery. Results show service robots can replicate
interactional justice norms, albeit with some caveats. Accordingly, we
propose a research agenda for future studies.
1 Introduction
Service failure occurs when a service provider does not meet customer expecta-
tions (Lin 2006). Anger—the most prominent emotion after a service failure—
strongly influences customers’ evaluation of the firm (Valentini et al. 2020).
Therefore, frontline employees (FLEs) must re-establish customer satisfaction
through service recovery procedures (Michel et al. 2009). Service research views
customers’ satisfaction with service recovery as a function of distributive, proce-
dural, and interactional justice (IJ) (Krishna et al. 2011). IJ would play a vital part in service recovery.
2 Theoretical background
Service failure occurs when a service provider does not meet customer expec-
tations regarding service production or delivery or when customers appraise a
service interaction as unsatisfactory (Lin 2006). Anger and irritation are the most
prominent emotions after a service failure (Harrison-Walker 2012) and influence
customers’ satisfaction (Valentini et al. 2020) and coping strategy (Balaji et al.
2017). Customers’ coping strategies depend on service failure severity (Sengupta et al. 2015). Customers coping with a low-severity service failure tend to rely on emotion-based strategies such as positive thinking and avoidance (Sengupta et al. 2015). They are also likely to suppress their negative emotions temporarily and opt for retaliatory behaviors (e.g., negative word of mouth, WOM) (Balaji et al. 2017). In contrast, customers coping with a high-severity service failure rely on problem-solving strategies such as action coping (Sengupta et al. 2015). Customers’ anger increases their intentions to com-
plain face-to-face (Luo & Mattila 2020) and to spread negative WOM while
decreasing re-patronage and reconciliation intentions (Harrison-Walker 2019).
Service recovery refers to the integrative actions a company takes to re-
establish customer satisfaction and loyalty after a service failure (Michel et al.
2009, p. 267). It relies on the outcome dimension ("what is done") and the pro-
cess dimension ("how it is done") (Dong et al. 2008). Service recovery comprises
seven steps (acknowledgment, empathy, apology, ownership, the fix, assurance,
and compensation; Krishna et al. 2011), mainly focused on the recovery phase
and justice perception—an evaluative judgment about the fairness of customers’
treatment by FLEs (DeWitt et al. 2008; Van Vaerenbergh et al. 2019). Justice
theory relies on three components: (1) distributive justice (i.e., the tangible out-
come of service recovery), (2) procedural justice (i.e., the procedure to conduct
the recovery), and (3) IJ (i.e., how a customer is treated during service recovery)
(Krishna et al. 2011). The three components impact customer service recovery
satisfaction (Ozgen & Kurt 2012). However, researchers emphasize the vital role of IJ
(Krishna et al. 2011).
IJ is a focus area because of the impact of customers’ emotions on evaluations of service recovery efforts (Gelbrich 2010; Salagrama et al. 2021). Customers
are emotionally invested in service recovery (Tektas 2017; Wen and Chi 2013),
and IJ provides customers with fair treatment during the complaint-handling pro-
cess (e.g., politeness, courtesy, apology) and redistributes intangible resources
such as self-esteem and a sense of control through empathy (Smith et al. 1999).
FLEs’ empathy enhances service quality and satisfaction (Wieseke et al. 2012),
service delivery (Umasuthan et al. 2017), and service evaluation (Min et al.
2015). For instance, FLEs with high emotional skills can leave a lasting impres-
sion and generate satisfaction after the service recovery (Fernandes et al. 2018).
A positive appraisal of the service recovery process elicits positive emotions, fur-
ther increasing customer satisfaction, trust, and loyalty (Valentini et al. 2020).
The COVID-19 pandemic has resulted in a sharp increase in demand for ServBots that reduce human contact (Global Robotics Industry 2021). ServBots are increasingly incorporated into the new service triad—FLEs, customers, and ServBots (Odekerken-Schröder et al. 2021). ServBots refer to system-based autonomous and adaptable interfaces that interact, communicate, and deliver service to an organization’s customers (Wirtz et al. 2018, p. 909). Three fundamental elements characterize them: sensors to understand their surroundings through imitation and association, proces-
sors to analyze the collected data and make decisions, and actuators to act in the real
world (Devillers 2017). ServBots can take different shapes (e.g., mechanoids, chat-
bots, humanoids, androids) and rely on varying levels of intelligence and autonomy
(e.g., pre-programmed strategy vs. automation autonomy; Roberts et al. 2020). The
marketing and service research literature—and most of the commercial applications
currently available—refer to ServBots as programmable, humanoid social robots
(e.g., ARI, Pepper, TIAGo).
The combination of humanoid shapes with social intelligence algorithms and
communication protocols (Zlotowski et al. 2015) allows ServBots to create the
appearance of autonomy and exhibit social skills during customer-ServBot interac-
tions (Forgas-Coll et al. 2022). ServBots can use physical features such as gaze or
gestures to express and perceive customers’ emotions to learn from and communi-
cate with them (Søraa et al. 2021). Therefore, ServBots are applied social robots
capable of producing and delivering services to customers while engaging emo-
tionally and interacting socially with them (Van Doorn et al. 2017). According to
the widely cited AI framework introduced by Huang and Rust (2021),
ServBots integrate three types of intelligence. First, mechanical AI allows ServBots
to perform routine tasks. Then, thinking AI allows ServBots to develop analytical
and intuitive skills. Finally, feeling AI allows ServBots to identify and understand
customers’ emotions to respond empathically (Scheper et al. 2022).
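To make this architecture concrete, the following minimal Python sketch is our own illustration (the class and function names are hypothetical and not taken from the reviewed papers) of how a single ServBot interaction turn could pass sensor data through mechanical, thinking, and feeling AI layers before actuation.

```python
from dataclasses import dataclass

@dataclass
class Percept:
    """What the ServBot's sensors captured during one interaction turn."""
    utterance: str          # transcribed customer speech
    face_valence: float     # -1.0 (very negative) .. 1.0 (very positive), from vision
    voice_arousal: float    # 0.0 (calm) .. 1.0 (agitated), from audio

def mechanical_ai(percept: Percept) -> str:
    """Routine-task layer: map simple requests to scripted actions."""
    if "check in" in percept.utterance.lower():
        return "run_check_in_script"
    return "no_routine_action"

def thinking_ai(percept: Percept) -> str:
    """Analytical layer: decide what kind of support the situation calls for."""
    service_failure = "wrong" in percept.utterance.lower() or percept.face_valence < -0.3
    return "offer_solution" if service_failure else "continue_dialogue"

def feeling_ai(percept: Percept) -> str:
    """Affective layer: pick a display that acknowledges the customer's state."""
    if percept.face_valence < -0.3 and percept.voice_arousal > 0.6:
        return "display_concern_and_apologize"   # likely anger
    if percept.face_valence < -0.3:
        return "display_empathy_and_reassure"    # likely anxiety or sadness
    return "display_friendly_interest"

def act(routine: str, support: str, display: str) -> None:
    """Actuator layer: in a real robot this would drive speech, gaze, and gestures."""
    print(f"routine={routine} | support={support} | display={display}")

if __name__ == "__main__":
    turn = Percept("My room was wrong and nobody helped me", face_valence=-0.7, voice_arousal=0.8)
    act(mechanical_ai(turn), thinking_ai(turn), feeling_ai(turn))
```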
Service research examines ServBots’ implementation in various service settings
(see Table 1). ServBots interact with customers to share information (Schepers and
Streukens 2022), book services (Pillai and Sivathanu 2020), facilitate the check-in
process at a hotel (Fan et al. 2020), and handle functional operations such as carry-
ing luggage (Christou et al. 2020) and bringing food to a table (Tuomi et al. 2021).
ServBots also have social skills to engage in entertaining interactions with custom-
ers by demonstrating empathy (Pozharliev et al. 2021), interest (Choi et al. 2020),
assurance, and reliability (Chiang and Trimi 2020). ServBots can also detect cus-
tomers’ emotions and work with FLEs to satisfy customers’ needs (Čaić et al. 2018).
Customers’ perceptions, acceptance, and intentions toward ServBots are consistent
throughout the literature. On the one hand, customers perceive ServBots as conveni-
ent and flexible (Amelia et al. 2021), reliable (Belanche et al. 2020), socially enter-
taining (Choi et al. 2020), efficient and productive (Xu et al. 2020), and useful (Cha
2020). On the other hand, customers perceive ServBots as emotionally limited (Pil-
lai and Sivathanu 2020), not secure (Flavián et al. 2021), apathetic (De Kervenoael
et al. 2020), artificial (Shin and Jeong 2020), and antisocial (Christou et al. 2020).
Most service studies rely on surveys and scenarios where customers did not inter-
act with ServBots. Therefore, customers’ attitudes and intentions relate more to their
beliefs, while actual experiences with ServBots could lead to different results. For
instance, a systematic review investigating the social acceptance of robots showed
that participants have more positive attitudes (67.7% vs. 18.2%) when exposed to
actual robots performing tasks rather than hypothetical robots (Savela et al. 2018).
Nevertheless, no field study has investigated how ServBots handle emotional
encounters and how customers appraise ServBots’ emotional displays. Besides,
long-term studies in a real-world setting reveal critical challenges when introducing
ServBots at the organizational frontline (Pinillos et al. 2016). For instance, provid-
ing customers with clear instructions to operate the ServBots is critical (Severinson-
Eklundh et al. 2003). Other studies showed the importance of more interactive dia-
logs and improved capacity to identify repeated customers (Gockley et al. 2005) and
to improve ServBots’ perceptual abilities to foster rich social interaction and engage
sporadic customers (Leite et al. 2013a, b).
The COVID-19 pandemic and companies’ need for standardization and economies of scale foster the implementation of ServBots at the organizational frontline and the replacement of FLEs (Global Robotics Industry 2021). Moreover, progress in social robotics increasingly narrows the gap between FLEs’ and ServBots’ emotional skills (Kerruish 2021). During service recovery, distributive and procedural justice follow standardized company policies that ServBots can replicate as efficiently as FLEs. But whether ServBots will effectively provide emotional and social services at a level humans can (Wirtz et al. 2018) is still under debate.
Table 1 Literature review on service robots (ServBots) in marketing and service research (Tasks / Citations / Pros / Cons)

Front desk agent in retail stores and agencies
- Pillai and Sivathanu (2020): Pros: ability to plan and book vacations. Cons: require FLEs’ intervention upon a difficulty; difficulty in responding to emotional language.
- De Gauquier et al. (2021): Pros: increase customers’ interactions and participation. Cons: n/a.
- Pitardi et al. (2021): Pros: offer more comfort/less embarrassment for specific products and services. Cons: n/a.
- Amelia et al. (2021): Pros: greeting customers; increase service convenience and flexibility of operating hours. Cons: raise security concerns; lack of human interaction.
- Flavián et al. (2021): Pros: deliver banking advice. Cons: raise security concerns.
- Rancati and Maggioni (2022): Pros: interact with customers through a service script; increase customers’ immersion. Cons: interactions with ServBots lose importance and meaningfulness over time.
- Schepers and Streukens (2022): Pros: sharing information, answering questions; greeting customers and interacting. Cons: n/a.

Concierge in hotels
- Xu et al. (2020): Pros: increase service efficiency and productivity; reduce costs. Cons: FLEs’ reluctance to change; looks too gimmicky; need FLEs’ support.
- Belanche et al. (2020): Pros: stability of the service performance. Cons: customers expect little improvement in ServBots’ performance over time.
- De Kervenoael et al. (2020): Pros: handle mundane, social interactions. Cons: low empathy.
- Ivkov et al. (2020): Pros: improve business outcome, experience, reliability, communication, and service assurance. Cons: low empathy and social influence.
- Shin and Jeong (2020): Pros: increase usefulness perception at the front desk. Cons: customers prefer FLEs because of humans’ sincere and genuine interactions.
- Fan et al. (2020): Pros: facilitate the check-in process. Cons: n/a.
- Choi et al. (2020): Pros: provide a unique and interesting experience; service consistency. Cons: more gimmicks than useful tools; require a leap in technology to effectively communicate with customers.
- Chiang and Trimi (2020): Pros: demonstrate assurance and reliability during service delivery. Cons: low level of empathy.
- Mingotto et al. (2021): Pros: answer questions and give information in multiple languages. Cons: reduced to simple interactions.
- Pozharliev et al. (2021): Pros: demonstrate artificial empathy that reassures customers with a low anxious attachment style. Cons: n/a.
- Borghi and Mariani (2021): Pros: connect with customers. Cons: n/a.
- Akdim et al. (2021): Pros: delivering uniform and transaction-focused service. Cons: lack of flexibility and coldness; customers have an implicit preference for FLEs.

Waiter at restaurants
- Cha (2020): Pros: useful for young customers. Cons: displeasing for old customers.
- Christou et al. (2020): Pros: functional operations such as carrying luggage. Cons: customers’ concern with the loss of employment; interacting with robots is weird and antisocial.
- Pelau et al. (2021): Pros: free will increases perceived empathy and care. Cons: anthropomorphism can harm customer experience in the absence of empathy.
- Tuomi et al. (2021): Pros: bring food to the table.
- Wu et al. (2021): Pros: increase the hedonic value if used as a gimmick. Cons: decreased utilitarian value.
- Belanche et al. (2021): Pros: n/a. Cons: customers are less likely to use service robots if they perceive them to be a cost-cutting measure.

Food delivery
- Ivanov and Webster (2021): Pros: n/a. Cons: customers expect to pay less for robotic delivery.
- Byrd et al. (2021): Pros: reduce costs; increase operating hours. Cons: robot delivery has software payment issues.

Social contact and assistance in elderly care
- Khaksar et al. (2016): Pros: handle human interactions through games; detect individuals’ emotional state and warn human staff. Cons: n/a.
- Čaić et al. (2018): Pros: collaborate with staff; encourage customers to be self-conscious. Cons: n/a.
The role of customers’ emotions in service recovery appraisal requires emotionally intelligent FLEs to identify, understand, and respond to customers’ emotions (Kozub et al. 2013). The need for higher-order cognitive processes such as empathy through IJ would make ServBots unsuitable for handling emotionally driven service recovery encounters, although suitable for mechanical and analytical tasks (Huang and Rust 2018; Davenport et al. 2020; Wirtz et al. 2018). However, a few recent studies examined customers’ per-
ception of ServBots causing a service failure (e.g., Belanche et al. 2020; Fan et al.
2020) or ServBots handling service recovery (Choi et al. 2021) and showed that
ServBots could use IJ norms during service recovery.
In addition, a closer look at the emotional dynamics of service recovery following a service failure allows us to reconsider the importance of FLEs’ empathy for displaying normative IJ. Service failure elicits customer anger, which arouses
FLEs’ facial mimicking of anger due to emotional contagion (Dallimore et al. 2007).
FLEs must regulate their emotions through emotional labor to display normative IJ
behaviors (Liu et al. 2019), which can be emotionally exhausting and interfere with
FLEs’ ability to process the customer’s complaint and respond efficiently (Smith
and Hart 1994). Indeed, angry customers expect instrumental (i.e., solution-oriented
response) rather than emotional (i.e., empathy-oriented response) support after a
service failure (Menon and Dubé 2007). In contrast, ServBots are not sensitive to
emotional contagion and can show complementary emotions instead of mimicking
angry facial expressions. They can replicate the standardized display of IJ behaviors,
and their analytic skills (Huang and Rust 2018) enable them to provide customers
with instrumental support.
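As a hedged illustration of this argument (not drawn from any cited system; the emotion labels, display names, and support strings are placeholders of our own), the sketch below contrasts emotional mimicry with a complementary-display policy paired with instrumental, solution-oriented support.

```python
# Hypothetical response policy illustrating the argument above: instead of
# mirroring a detected emotion (as emotional contagion pushes humans to do),
# the robot selects a complementary display plus instrumental support.

COMPLEMENTARY_DISPLAY = {
    "anger": "calm_concern",        # do not mirror anger back at the customer
    "anxiety": "warm_reassurance",
    "sadness": "gentle_sympathy",
    "neutral": "friendly_interest",
}

INSTRUMENTAL_SUPPORT = {
    "anger": "state the fix and the timeline immediately",
    "anxiety": "explain each recovery step before acting",
    "sadness": "acknowledge the inconvenience, then propose options",
    "neutral": "ask what outcome the customer expects",
}

def recovery_response(detected_emotion: str) -> dict:
    """Return the display and the support move for one recovery turn."""
    emotion = detected_emotion if detected_emotion in COMPLEMENTARY_DISPLAY else "neutral"
    return {
        "display": COMPLEMENTARY_DISPLAY[emotion],
        "support": INSTRUMENTAL_SUPPORT[emotion],
    }

if __name__ == "__main__":
    print(recovery_response("anger"))
    # {'display': 'calm_concern', 'support': 'state the fix and the timeline immediately'}
```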
Drawing on justice theory (Lin et al. 2011), service failure/recovery theory (Van
Vaerenbergh et al. 2019), emotional labor theory (Liu et al. 2019), and social robot-
ics literature, our paper aims to determine whether ServBots can use IJ during the
service recovery process as FLEs do. Our goals are to (1) identify the norms and boundaries of IJ during service recovery, (2) compare those norms and boundaries with current advances in social robotics, and (3) set the boundaries of ServBots’
empathic skills at the service encounter. To do so, we first systematically review
20 years of empirical research on IJ in service recovery to identify the IJ’s norms
and boundaries (Sect. 3). Second, we answer our research question by compar-
ing IJ’s norms and boundaries with the review of ten years of empirical research
on social robotics, artificial empathy, and emotion in human–computer interactions
(Sect. 4). Finally, we discuss the pros and cons of ServBots handling service recov-
ery and future research agenda (Sect. 5).
3 Systematic literature review

This section reports the methodology of the systematic literature review we performed to identify the norms and boundaries of IJ during service recovery. We took several steps to ensure our review process was replicable and transparent by following the process recently published in the Journal of Business Research (Kähkönen et al. 2021).
3.1 Search protocol
The research questions were formulated through dialogue between the authors and
other academic experts. Based on this question formulation process, the research
questions in this paper are (1) "What are the FLEs’ IJ norms during service recovery?" and (2) "What are the boundaries of the FLEs’ IJ norms during service recovery?".
Each included research article met our inclusion criteria, namely: (1) empirical research providing evidence on service recovery, (2) including measures or manipulations of IJ, (3) including an employee or customer perspective on service recovery, (4) conducted within a work or organizational context, (5) being peer-reviewed, (6) being available in English, and (7) located within the disciplines of business, management, and accounting. We searched for literature published in the past two decades, from 2000 to 2021. We excluded (1) non-empirical papers, (2) papers that represented only external stakeholders (e.g., citizens, suppliers, customers, shareholders, and regulators), (3) papers that did not include a service recovery, (4) papers that did not explicitly measure or manipulate IJ, and (5) papers outside of the service context. Papers were also excluded if it was unclear whether an employee perspective was included (e.g., experimental designs where the respondent’s role was unclear).
In the third stage, 52 accepted papers were scanned, and articles that failed to meet the inclusion criteria were eliminated. In this stage, nine studies were excluded based on full-text review because the papers did not include a customer or employee perspective on service recovery or because IJ was not measured or manipulated.
In the fourth stage, after full-text examination, the number of relevant articles was reduced to 42. The last stage of the selection process involved scanning the
reference lists of the 42 accepted papers. The screening and selection of the articles
were verified independently by two researchers to avoid possible selection bias.
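For illustration only, the inclusion logic described above could be encoded as a simple screening filter. The sketch below is our own schematic rendering of the stated criteria (the record fields are hypothetical), not the instrument actually used for the review.

```python
from dataclasses import dataclass

@dataclass
class Paper:
    """Minimal record for one candidate paper (fields are illustrative)."""
    year: int
    empirical: bool
    measures_ij: bool                       # measures or manipulates interactional justice
    service_recovery_context: bool
    employee_or_customer_perspective: bool
    peer_reviewed: bool
    in_english: bool
    discipline_business: bool

def passes_screening(p: Paper) -> bool:
    """Apply the review's inclusion window (2000-2021) and its seven conditions."""
    return (
        2000 <= p.year <= 2021
        and p.empirical
        and p.measures_ij
        and p.service_recovery_context
        and p.employee_or_customer_perspective
        and p.peer_reviewed
        and p.in_english
        and p.discipline_business
    )

if __name__ == "__main__":
    candidates = [
        Paper(2011, True, True, True, True, True, True, True),   # retained
        Paper(2015, False, True, True, True, True, True, True),  # excluded: not empirical
    ]
    retained = [p for p in candidates if passes_screening(p)]
    print(f"{len(retained)} of {len(candidates)} papers retained")
```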
3.2 Findings
The study of IJ in service recovery relies on the data of at least 16,735 respondents,
including 8925 customers, 708 FLEs, and 4177 students, collected through surveys
over the last two decades. These studies have been carried out in countries on six
continents and represent a wide range of service contexts. IJ definitions are consist-
ent throughout the 42 papers we reviewed (see Table 2).
Table 2 Description of the selected papers in the systematic literature review (Citation / Sample / Cultural context / Service context / Definition of IJ)

- de Ruyter and Wetzels (2000): n/a; The Netherlands; hairdresser, café, department store, bank; quality of interpersonal treatment during the enactment of organizational procedures.
- Maxham et al. (2002): 692 customers, 339 customers; USA; bank, home care service; FLEs’ fairness of interpersonal treatment throughout the recovery process (e.g., courtesy, honesty, interest in fairness, effort).
- Weun et al. (2004): 533 students, 537 non-students; Korea, USA; hotel, mail-order; fairness of interpersonal treatment.
- Chebat et al. (2005): 186 customers; Canada; bank; communication process (e.g., courtesy, politeness, adequacy of language level).
- Ok et al. (2005): 286 customers, 266 students; USA; restaurant; interpersonal behavior in the enactment of procedures and the delivery of outcomes (e.g., friendliness, honesty, interest, sensitivity, politeness).
- Hocutt et al. (2006): 221 students; USA; restaurant; quality of interpersonal treatment during the enactment of organizational procedures (e.g., empathic concern, courtesy, apology).
- Ok et al. (2006): 286 respondents; USA; restaurant; perceived fairness of interpersonal manner during service recovery implementation and outcome delivery (e.g., politeness, empathy, apology, courtesy).
- Shapiro et al. (2006): 247 students; USA; any service experience; content and manner of information exchange between FLEs and customers (e.g., apology, empathy, explanation).
- Aurier and Siadou-Martin (2007): 188 students; France; restaurant; FLEs’ interpersonal behavior in the enactment of procedures and the delivery of outcomes (e.g., honesty, courtesy, respect, politeness, candor).
- dos Santos et al. (2008): 306 customers; Brazil; car repair service; FLEs’ mannerisms and communication (e.g., honesty, politeness, empathy, effort, explanation, apology).
- Wirtz et al. (2010): 360 students, 343 employees; Singapore, Australia; airline, catering service; interaction quality provided to the customer (e.g., FLEs’ concern and friendliness).
- Lin et al. (2011): 225 students; Taiwan; online shopping; FLEs’ mannerisms, communication, and fairness in interpersonal treatment (e.g., sensitivity, dignity, respect, explanations, politeness, empathy, effort, information, honesty, attitude).
- Wang et al. (2011): 221 respondents; Taiwan; online shopping; FLEs’ fairness and mannerisms in interpersonal treatment (e.g., sensitivity, dignity, respect, explanations).
- de Matos et al. (2012): 298 customers; Brazil; any service experience; interpersonal treatment in response to customers’ complaints.
- Ellyawati et al. (2012): 102 respondents; Indonesia; retail setting; FLEs’ attitude toward customers during the problem-solving process (e.g., accessibility, speed, flexibility, righteousness, explanation, hospitality, sensitivity, interest, humility, empathy, assurance, openness, attention).
- Kuo et al. (2012): 252 students; Taiwan; online shopping; fairness of interpersonal treatment (e.g., courtesy, …).
- Siu et al. (2013): 200 customers; Hong Kong; restaurant; interpersonal treatment during the enactment of procedures (e.g., explanation/causal account, honesty, politeness, effort).
- Wu (2013): 668 customers; Taiwan; online shopping; perceived fairness of interpersonal treatment (e.g., justification for decisions, truthfulness, respect, propriety).
- Assefa (2014): 400 customers; Ethiopia; bank; perceived fairness of personal interactions with FLEs (e.g., courtesy, politeness, empathy, efforts, explanations).
- Choi et al. (2014): 356 students; USA; any service experience; interpersonal treatment (e.g., courtesy, politeness, efforts).
- Barakat et al. (2015): 589 customers; Brazil, USA; airline; genuine effort to address the issue.
- Räikkönen et al. (2015): 220 customers; Finland; travel tour operator; fairness of the interpersonal behavior in the complaint handling (e.g., apology, explanation, attentiveness, effort).
- Yeoh et al. (2015): 248 customers; USA; online shopping; perceived quality and fairness of the interpersonal treatment (e.g., dignity, respect, courtesy, explanations).
- Choi et al. (2016): 1,007 online customers; Australia, USA, Singapore; online shopping; perceived fairness of the interpersonal treatment (e.g., sincerity, help, empathy, respect, care, politeness).
- Gohary et al. (2016): 977 customers; Iran; mobile banking; mannerisms and interpersonal communication (e.g., sensitivity, respect, courtesy, interest, listening, effort, trust, explanation, empathy, apology, communication).
- Nadiri (2016): 178 customers; Turkey; bank; customers’ experienced justice from FLEs during the recovery process (e.g., apology, empathy, attentiveness, courtesy, communication style).
- Joosten et al. (2017): 260 customers; The Netherlands; any service experience; perceived fairness of the interaction in service recovery.
- Jung et al. (2017): 368 respondents; USA; online shopping; fairness of interaction and communication in solving the problems resulting from the service failure.
- Ortiz et al. (2017): 262 restaurant diners; Taiwan, USA; restaurant; FLEs’ attitudes and interpersonal relationships in dealing with the process of customer complaints.
- Petzer et al. (2017): 281 customers; South Africa, Norway; bank; fairness of interaction during the post-complaint interaction following the negative service encounter.
- Azab et al. (2018): 365 FLEs, 203 students; USA; restaurant, retail settings; how the recovery is implemented.
- Cai et al. (2018): 395 customers; USA; restaurant; customers’ experienced justice in interpersonal interactions with FLEs during the enactment of procedures.
- Tsao (2018): 424 respondents; Taiwan; hotel; customers’ perception of justice through communications and interactions with FLEs during the recovery process.
- Chen et al. (2019): 477 respondents; China, Korea; restaurant; customers’ feelings of FLEs’ treatment and mannerisms (…).
Authors agree to define IJ as the fairness (e.g., Räikkönen et al. 2015), mannerisms (e.g., Wang et al. 2011), and quality (e.g., Wirtz and McColl-Kennedy 2010) of interpersonal treatment (e.g., Lin et al. 2011), communication (e.g., Chebat et al. 2005), or behavior (e.g., Aurier and Siadou-Martin 2007) during service recovery. Most reviewed studies found that IJ increases customers’ satisfaction with service recovery (e.g., Hocutt et al. 2006; Jung and Seock 2017; Petzer et al. 2017; Räikkönen and Honkanen 2016; Tsao 2018; Radu et al. 2020). In particular, FLEs’ mannerisms are the most satisfying for customers (Siu et al. 2013). IJ norms refer to professional obligations to address customers’ complaints and can be sorted into six dimensions: (1) politeness (e.g., courtesy, engagement), (2) empathy (e.g., sensitivity, concern, care), (3) respect (e.g., fairness, honesty, dignity), (4) apology (e.g., acceptance of blame), (5) explanation (e.g., causal account, justification), and (6) interest (e.g., effort, attention, listening).
Table 3 Main results of interactional justice (IJ) in service recovery (Citations / Themes / Direct effects / Interaction effects and moderators / Mediators; DJ: distributive justice; PJ: procedural justice)

Antecedents of IJ
- …: … customers’ perception of IJ; FLE attractiveness has a positive effect on customers’ perception of IJ.
- Jung et al. (2017): Recovery type. Direct effects: an apology has a higher positive effect on customer perceptions of IJ than compensation only. Moderators: n/a. Mediators: n/a.

Consequences of IJ
- de Ruyter and Wetzels (2000), Maxham et al. (2002), Weun et al. (2004), Hocutt et al. (2006), dos Santos et al. (2008), Lin et al. (2011), Agnihotri et al. (2012), de Matos et al. (2012), Ellyawati et al. (2012), Kuo et al. (2012), Lii et al. (2012), Nikbin et al. (2012), Prasongsukarn et al. (2012), Siu et al. (2013), Wu (2013), Assefa (2014), Räikkönen et al. (2015), Yeoh et al. (2015), Gohary et al. (2016), Nadiri (2016), Joosten et al. (2017), Jung et al. (2017), Petzer et al. (2017), Tsao (2018), Mohd-Any et al. (2019), Radu et al. (2020): Service recovery satisfaction. Direct effects: IJ has a positive effect on customer satisfaction with service recovery. Moderators: IJ x PJ and DJ x IJ increase customer satisfaction with service recovery; likeability weakens the positive effect of IJ on customer satisfaction with service recovery; customer co-creation of the service recovery strengthens the effect of IJ on customer satisfaction with service recovery; psychological distance weakens the positive effect of IJ on customer satisfaction with service recovery. Mediators: … prior satisfaction on customer satisfaction with service recovery; positive emotions mediate the positive effect of IJ on customer satisfaction with service recovery; DJ mediates the positive effect of IJ on customer satisfaction with service recovery; service recovery satisfaction and trust mediate the positive effect of IJ on positive WOM and repurchase intention; customer satisfaction with service recovery mediates the positive effect of IJ on repurchase intention.
- Maxham et al. (2002), Ok et al. (2005), Joosten et al. (2017), Mohd-Any et al. (2019): Customer cumulative satisfaction. Direct effects: IJ has a positive effect on cumulative customer satisfaction. Moderators: n/a. Mediators: n/a.
- Ok et al. (2005), Hocutt et al. (2006), Lin et al. (2011), Kuo et al. (2012), Räikkönen et al. (2015), Yeoh et al. (2015), Choi et al. (2016), Ortiz et al. (2017), Cai et al. (2018): Word-of-mouth (WOM). Direct effects: IJ decreases negative WOM; IJ increases positive WOM. Moderators: IJ x PJ and DJ x IJ decrease negative WOM; service recovery effort strengthens the negative effect of IJ on intention to condemn. Mediators: emotions mediate the positive effect of IJ on customer intention to praise and the negative effect of IJ on customer intention to condemn.
- Chebat et al. (2005), Kuo et al. (2012), Ozgen and Kurt (2012), Gohary et al. (2016), Cai et al. (2018), Chen et al. (2019): Customer emotion. Direct effects: IJ increases post-recovery positive emotions and decreases post-recovery negative emotions. Moderators: customer co-creation of the service recovery strengthens the effect of IJ on positive customer emotions. Mediators: n/a.
- Ok et al. (2005), Lin et al. (2011), Kuo et al. (2012), Räikkönen et al. (2015), Choi et al. (2016), Gohary et al. (2016), Cai et al. (2018), Radu et al. (2020): Post-service recovery intentions. Direct effects: IJ increases customer intention to reconcile, repurchase, reuse, and revisit; IJ decreases customer likelihood to switch service provider. Moderators: customer co-creation of the service recovery strengthens the effect of IJ on customer intention to reuse; DJ x IJ increase customer repurchase intention. Mediators: n/a.
- de Ruyter and Wetzels (2000), Aurier et al. (2007), Lii et al. (2012), Mohd-Any et al. (2019): Trust. Direct effects: IJ increases customer trust. Moderators: n/a. Mediators: n/a.
- Chebat et al. (2005), Wang et al. (2011), Yeoh et al. (2015): Loyalty. Direct effects: IJ increases customer loyalty; IJ weakens the negative relationship between service failure severity and customer loyalty. Moderators: PJ x IJ increase customer loyalty. Mediators: positive and negative emotions mediate the positive effect of IJ on customer loyalty.
- Gohary et al. (2016): Perceived value. Direct effects: IJ increases customer perceived value. Moderators: customer co-creation of the service recovery strengthens the effect of IJ on customer perceived value. Mediators: n/a.
- Aurier et al. (2007): Service quality evaluation. Direct effects: IJ increases service interaction quality. Moderators: PJ x IJ increase interaction quality. Mediators: n/a.
- Ortiz et al. (2017): Empathy, anger, revenge. Direct effects: IJ increases customer empathy and decreases customer anger and revenge intention. Moderators: IJ has a positive effect on anger for the high-blame-attribution and high-failure-severity groups, but not for the low-blame-attribution and low-failure-severity groups. Mediators: n/a.
- Choi et al. (2016): Perceived breach and feelings of violation. Direct effects: IJ decreases customers’ feelings of violation. Moderators: IJ strengthens the negative relationship between PJ and perceived breach; IJ weakens the negative effect of DJ on feelings of violation; PJ x IJ decreases perceived breach. Mediators: n/a.
- Gohary et al. (2016): Intention to co-create service recovery in the future. Direct effects: IJ increases customers’ intention to co-create service recovery in the future. Moderators: n/a. Mediators: n/a.
- Wirtz et al. (2010): Opportunistic claiming. Direct effects: IJ decreases opportunistic claiming. Moderators: n/a. Mediators: n/a.
One study demonstrated that distributive and procedural justice explain customers’ emotions better than IJ (Cai and Qu 2018).
Moreover, it is unclear whether IJ is the most or the least essential justice component for explaining customers’ satisfaction with service recovery. Some studies found that IJ (especially empathy and courtesy) explains customers’ satisfaction with service recovery better than procedural (e.g., responsiveness) and distributive (e.g., free
meal) justice (Hocutt et al. 2006; Mohd-Any et al. 2019; Nadiri 2016). For instance,
a study showed that IJ, but not distributive justice, significantly affects post-recovery
overall satisfaction, indicating that interpersonal treatment plays a crucial role in
recovering dissatisfying experiences (Ok et al. 2005). However, other studies found
that distributive justice plays a more significant role than IJ in customers’ satisfac-
tion with service recovery, indicating the importance of tangible over intangible
aspects (e.g., Yeoh et al. 2015; Radu et al. 2020; Räikkönen and Honkanen 2016).
Finally, some studies found procedural justice exerted stronger effects on customers’
satisfaction with service recovery than distributive justice and IJ (Lii and Lee 2012).
Findings of our systematic review show that service failure severity, customer
characteristics, complementary effect with other justice components, and service
contexts moderate the effect of IJ on customers’ emotions and service recovery
satisfaction:
1. Service failure severity. In the case of a severe service failure, IJ alone cannot
overcome the negative effects of the service failure and elicit customers’ posi-
tive emotions: distributive (e.g., monetary compensation) and procedural justice
(e.g., prompt conflict resolution; Cheung and To 2017) are also required (Choi
and Choi 2014). In the case of a less severe service failure, however, monetary
compensation would not be necessary, as ensuring interactional and procedural
justice would be enough to enhance customer affection (Choi and Choi 2014).
2. Customer characteristics. IJ elicits a stronger positive (negative) effect on customers’ positive (negative) emotions and on overall satisfaction for repeat customers than for first-time customers (Chen and Kim 2019).
3. Complementary effects with other justice components. A study showed that IJ
complements distributive justice (Choi et al. 2016). When IJ is high, customers
understand that the firm is committed to taking responsibility for the service failure
and are less likely to doubt the fairness of compensation (i.e., distributive justice).
Distributive and procedural justice also correlate with IJ to elicit positive post-
recovery customers’ intentions (Gohary et al. 2016; Choi et al. 2016; Lin et al.
2011). For instance, a study on online shopping found that only distributive justice
positively affects repurchase intention, and only IJ has a negative effect on nega-
tive WOM (Lin et al. 2011). IJ alone might not be enough to regain customers’
choices in specific service contexts (Ortiz et al. 2017).
4. Service context. Online services entail less human contact, and customers would
favor procedural justice (Jung and Seock 2017). Another study showed that IJ
weakens the negative effect of service failure on customer loyalty in the context
of online services. However, only high procedural justice in service recovery
can elicit higher post-failure customer loyalty (Wang et al. 2011). Customers
engaged in airline service encounters also seem to favor procedural and distribu-
tive justice because of the time constraints and the need for a concrete solution
(Nikbin et al. 2015). IJ (e.g., apology and showing empathy) would be essential
but not enough to achieve customer satisfaction with service recovery (Nikbin
et al. 2015). However, other service contexts require more IJ, especially when the service outcome is uncertain or the service production is technical (Siehl et al. 1992).
For instance, customers experiencing banking failures value IJ more (Maxham
and Netemeyer 2002), probably because it would be more difficult for customers
to appraise the fairness of the service recovery on procedural or distributive com-
ponents (Seiders and Berry 1998). Another study showed that the effect of IJ on
trust is more critical in the restaurant and banking contexts than in retailing and
personal services, where only the interactions of procedural justice with IJ elicit
greater customer loyalty (de Ruyter and Wetzels 2000). IJ could play a significant
role in the hospitality industry, where interactions between FLEs and customers
are frequent and prolonged (Tsao 2018).
4 Can ServBots use interactional justice during service recovery?

Traditional service encounters rely on FLEs, who are socially capable of estab-
lishing emotional connectedness through empathy with customers (Wieseke et al.
2012). After a service failure, FLEs develop efforts to recover the service prosocially
through IJ (Valentini et al. 2020). They represent the firm to customers, deliver its
promises, and enhance its reputation and image—the service encounter is the focal
point in customer evaluation of the firm (Bettencourt and Brown 2003). However,
the rise of AI applications at the organizational frontline challenges the traditional
model of FLEs-based service encounters (Wirtz et al. 2018). ServBots will soon
be part of the service interactions to replace FLEs for functional operations (Tuomi
et al. 2021; Christou et al. 2020). What about interactional service operations? To
answer our research question, we first discuss the boundaries of IJ during service
recovery. Then, we compare the IJ norms identified through the systematic review
with the review of empirical research in social robotics.
4.1 Boundaries of interactional justice during service recovery

Several service researchers consider that ServBots are unable, and will probably
not be able soon, to handle interactional operations in emotionally driven service
encounters (Wirtz et al. 2018; Huang and Rust 2018; Davenport et al. 2020). Cus-
tomers still have implicit preferences for FLEs (Akdim et al. 2021) because of
humans’ sincere and genuine interactions (Shin and Jeong 2020). However, we iden-
tified IJ boundaries that qualify this somewhat definitive judgment on ServBots’ capacity to handle emotionally driven service recovery. First, we found that IJ complements rather than substitutes for the procedural and distributive justice dimensions.
Second, our results show IJ is not always the most significant dimension of justice to
recover service. Four moderators explain the inconsistent effect of IJ during service
recovery: service failure severity, customer characteristics, complementary effect
with the other dimensions of justice, and service contexts. For instance, in a service
recovery where time constraints and the need for a concrete solution are high, proce-
dural and distributive justice—which can be programmed into ServBots—are more
critical than IJ.
A third and final IJ boundary we identified through our systematic literature review challenges the opinion that empathy is critical to supporting customers in emotionally driven interactions and recovering a service. FLEs’ empathy toward customers is normalized and standardized through emotional labor. It does not refer to the psychophysiological process of feeling and sharing customers’ emotions—a non-imitable process specific to the human nervous system. Accordingly, there are at least three reasons to curb the role of FLEs’ empathy in emotionally driven encounters like service recovery, which narrows the gap ServBots must fill to compete with FLEs’ empathic capacity:
1. Empathy is only one of the six dimensions of IJ (i.e., politeness, empathy, respect,
apology, explanation, and interest), which is correlated to distributive and pro-
cedural justice. FLEs’ empathy through IJ in service recovery appears to be a
valuable yet standardized response that plays only a contributive role.
2. FLEs’ empathy through IJ is not a spontaneous emotional response but an
expected normative response. FLEs must regulate their automatic reactions
through "emotional labor" to produce normative empathic displays when inter-
acting with angry customers. ServBots can produce normative displays of IJ at a
high level of consistency and stability throughout interactions (Choi et al. 2020;
Belanche et al. 2020).
3. Anger is the most common emotion after a service failure, and angry customers
seek utilitarian and solution-based support rather than emotional support (Menon
and Dubé 2000, 2007), while ServBots have excellent analytical skills (Huang and
Rust 2018). Moreover, no study has tested the relevance of IJ during service recovery
in response to angry customers.
4.2 ServBots’ capacity to display interactional justice norms

In this subsection, we show that ServBots can display IJ’s six normative dimensions (see Table 4) as FLEs do during service recovery by reviewing more than 30 papers published in social robotics over the last ten years.
Let us consider first the IJ dimensions of politeness, respect, apology, explana-
tion, and interest. Several works showed that ServBots could appear courteous and
engaging (IJ dimension: politeness) through their capacity to dialogue and display
identifiable human emotions (Kharub et al. 2021; Rincon et al. 2019). ServBots can
deliver an empathic speech and voice (James et al. 2020), understand jokes (Fung
et al. 2016), and display small social behaviors like nodding (Sakurai et al. 2020) to
increase customers’ perception of ServBots’ social agency, sincerity, and engage-
ment. ServBots can also appear honest and trustworthy (IJ dimension: respect) through
human-like features (Cominelli et al. 2021; Roesler et al. 2021), cute faces (Pinney
et al. 2022), extroverted voice pitch (Niculescu et al. 2013), and small social behav-
iors (Sakurai et al. 2020). ServBots’ capacity to display emotional expressions also
increases customers’ perception of trustfulness (Acosta-Mitjans et al. 2019).
ServBots are also able to accept blame (Kerruish 2021) (IJ dimension: apol-
ogy) and provide explanations, feedback, and information (Kerruish 2021; Leite
et al. 2013a, b), and recommend activities (Rincon et al. 2019) with high scores
in social presence and sociability (Rossi et al. 2020) (IJ dimension: explanation).
Finally, ServBots can show helpfulness, attention, and listening skills (IJ dimen-
sion: interest) through facial expressions (Pasternak et al. 2021), social behaviors
(Rossi et al. 2020; Sakurai et al. 2020), empathic displays (Bagheri et al. 2021; Kühnlenz et al. 2013), and high social presence (Leite et al. 2014). These findings suggest that ServBots’ emotional and social skills for handling interactions during service encounters extend beyond the customers’ concerns and negative perceptions of ServBots reported in Table 1. In the following paragraphs, we discuss ServBots’ empathy more precisely, which is by far the most studied IJ dimension in the social robotics literature (Asada 2015).
Service is interactional, and to accept and consider ServBots, customers must recognize them as social agents capable of free will, "feeling" emotions, and displaying empathic and supportive behaviors. Artificial empathy is, therefore, a crucial
topic in social robotics (Asada 2015). Our literature review shows that ServBots
can identify customers’ emotions (Pasternak et al. 2021; Rincon et al. 2019) and
react emotionally to customers’ actions (Kerruish 2021; Dumouchel 2017), facial
expressions of emotions (Chumkamon et al. 2016; Bagheri et al. 2021; Burns et al.
2018), vocabulary (Fung et al. 2016), voice, and level of arousal (De Carolis et al.
2017). In turn, ServBots can manifest empathy to customers through an interface
display (Kerruish et al. 2021), facial expressions of emotions (Konijn and Hoorn
2020; Menne and Schwab 2018; Pasternak et al. 2021), speech, and voice (James
et al. 2020), small social movements like nodding (Sakurai et al. 2020), supportive
behaviors (Leite et al. 2013a, b, 2014), cuteness (Dumouchel 2017), and friendliness
(Niculescu et al. 2013).
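To illustrate schematically how the perception channels listed above (facial expression, vocabulary, voice arousal) might be fused before an empathic display is chosen, the following hypothetical Python sketch uses arbitrary weights, thresholds, and behavior labels of our own; it is not taken from any of the cited robots.

```python
# Schematic fusion of the perception channels discussed above. The channel
# scores and weights are placeholders; real systems would obtain them from
# vision, speech recognition, and prosody models.

NEGATIVE_WORDS = {"angry", "unacceptable", "terrible", "refund", "complaint"}

def lexical_negativity(utterance: str) -> float:
    """Share of words signalling a negative, complaint-like utterance."""
    words = utterance.lower().split()
    return sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words) / max(len(words), 1)

def fuse_affect(face_negativity: float, voice_arousal: float, utterance: str) -> float:
    """Weighted combination of the three channels, each in the 0..1 range."""
    return 0.4 * face_negativity + 0.3 * voice_arousal + 0.3 * lexical_negativity(utterance)

def empathic_display(distress: float) -> str:
    """Map fused distress to an illustrative display behavior."""
    if distress > 0.6:
        return "concerned_face + apology + slow_nodding"
    if distress > 0.3:
        return "attentive_gaze + supportive_speech"
    return "friendly_smile + small_talk"

if __name__ == "__main__":
    score = fuse_affect(0.8, 0.7, "This is unacceptable, I want a refund!")
    print(round(score, 2), "->", empathic_display(score))
```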
ServBots’ capacity to autonomously respond emotionally to diverse stimuli in real time (Hieida et al. 2018) and to achieve emotional intelligence and learn new emotional patterns through machine learning (Chumkamon et al. 2016; Glaskin 2012) reinforces customers’ perception of ServBots’ empathic skills.
Table 4 Emotional and empathic skills of social robots through the six dimensions of interactional justice (IJ dimensions / Social robots / Social skills / Citations)

Politeness (courtesy, engagement)
- AI-based robot "Zara": understand basic humor by outlining the buildup and punchline of a joke. Fung et al. (2016)

Empathy (sensitivity, concern, friendliness, openness, help, care)
- Care-o-Bot 3 (vs. Paro): empathize through an interface display that reacts to users’ actions; persuade users about the robot’s own entity, allowing for the sharing of emotions; shame users when the robot is mistreated. Kerruish (2021)
- Embodied conversational agent integrated with the Toyota Human-Support Robot: identify and replicate users’ facial expressions. Pasternak et al. (2021)
- Pepper: answer users’ facial expressions. Bagheri et al. (2021)
- Pepper: elicit emotional connection with users through empathy. Riddoch and Cross (2021)
- Healthbot: demonstrate empathy through speech and voice. James et al. (2020)
- … and connection through an articulated face.
- AI-based robot "Zara": increase users’ perception of the robot’s empathic skill through small social movements (i.e., nodding). Sakurai et al. (2020)
- Emotional Interactive Robot (EmIR): estimate users’ emotional state to express empathy. Rincon et al. (2019)
- Recurrent attention model: respond to diverse stimuli by using emotions in real time. Hieida et al. (2018)
- Robotis OP2: increase users’ perception of empathy through facial mimicry. Burns et al. (2018)
- Nao: increase users’ satisfaction through empathy. Kwon et al. (2018)
- Robot dinosaur Pleo: communicate emotions and elicit empathy in users through facial expressions of emotions. Menne and Schwab (2018)
- Paro: elicit empathy in users through cuteness and reactions to users’ actions. Dumouchel (2017)
- Nao: predict users’ emotional state through the valence of the voice and the level of arousal; communicate empathy to users. De Carolis et al. (2017)
- "CONBE" robot: identify users’ emotions through facial recognition and share emotions with users; achieve emotional intelligence and learn new emotional patterns through machine learning. Chumkamon et al. (2016)
- AI-based robot "Zara": empathize with users’ emotions through speech and face detection and vocabulary in real time. Fung et al. (2016)
- Social robot "Feelix": express emotions to users. Mirnig et al. (2015)
- "iCat" robot: display empathic and supportive behaviors. Leite et al. (2014)
- Einstein the robot: elicit users’ facial mimicry of emotional expressions. Hofree et al. (2014)
- Social robots "Olivia" and "Cynthia": elicit users’ perception of friendliness through humorous dialogue. Niculescu et al. (2013)
- "iCat" robot: increase users’ enjoyment through the display of supportive and empathic behaviors. Leite et al. (2013a, b)
- Nao: display and develop emotions based on the input provided. Glaskin (2012)

Respect (fairness, honesty, dignity, righteousness, trustfulness)
- Canbot-U03: increase users’ trust through cartoon or cute faces (vs. uncanny valley). Pinney et al. (2022)
- Humanoid robot: increase users’ trust through human-like features. Cominelli et al. (2021)
- Context Respectful Counselling Agent (CRECA): increase users’ perception of the robot’s trustworthiness through small social movements (i.e., nodding). Sakurai et al. (2020)
- Social robot "Eva": increase users’ perception of trustworthiness through the robot’s emotional expressions. Acosta-Mitjans et al. (2019)
- Social robots "Olivia" and "Cynthia": elicit users’ assurance and satisfaction through extroverted voice pitch. Niculescu et al. (2013)

Apology (acceptance of blame)
- Care-o-Bot 3 (vs. Paro): display sadness when users do not comply with the robot’s instructions. Kerruish (2021)

Explanation (reliable information, causal account, justification)
- Pepper: administer a cognition test with high scores in social presence and sociability. Rossi et al. (2020)
- Emotional Interactive Robot (EmIR): recommend activities to users; persuade users into accepting the suggested activities. Rincon et al. (2019)
- "iCat" robot: increase users’ satisfaction through valuable feedback. Leite et al. (2013a, b)

Interest (effort, endeavor, attention, listening)
- Embodied conversational agent integrated with the Toyota Human-Support Robot: increase interpersonal interaction through the display of facial expressions. Pasternak et al. (2021)
- Pepper: increase users’ engagement and sense of the robot’s presence through the robot’s empathic capacity. Bagheri et al. (2021)
- Pepper: increase social presence through social behavior. Rossi et al. (2020)
- Context Respectful Counselling Agent (CRECA): increase users’ perception of the robot’s attention and listening through small social movements (i.e., nodding). Sakurai et al. (2020)
- Emotional Interactive Robot (EmIR): make reminders about activities already scheduled. Rincon et al. (2019)
- "iCat" robot: maintain social presence over time. Leite et al. (2014)
- Social robot "EDDIE": increase users’ perception of the robot’s helpfulness by adapting behaviors to users’ emotions. Kühnlenz et al. (2013)
Social robotics studies show that customers are satisfied with ServBots’ empathic skills (Kwon et al. 2018).
They perceive them as autonomous social agents (Kerruish 2021) and, in turn,
empathize with ServBots (Hofree et al. 2014), increasing the sense of engagement
and interpersonal interaction (Leite et al. 2013a, b; Riddoch and Cross 2021). Sev-
eral studies showed that ServBots could express their feelings, shame customers
who mistreat them (Kerruish 2021), and elicit customers’ empathy toward them (De
Jong et al. 2021; Malinowska 2021; Mattiassi et al. 2021; Schmetkamp 2020).
Customers’ perception of ServBots as social agents is well demonstrated in stud-
ies where customers are asked to mistreat ServBots. For instance, 50% of partici-
pants refused to comply with the research instructions to hit the ServBots with a
mallet (Riddoch and Cross 2021). The other half of the participants who complied
with the instructions felt very uncomfortable. When asked why they refused to hit
the ServBot or felt uncomfortable doing it, participants answered that they felt an emotional connection with the ServBot that kept them from hurting it (Riddoch and Cross 2021). Customers also do not perceive a ServBot as a simple machine but as
a social agent through emotional capacity and anthropomorphic features (Carlson
et al. 2019). When witnessing the mistreatment of a ServBot versus a computer, cus-
tomers showed significantly more empathy for the ServBot (Carlson et al. 2019).
5 Discussion
Our results show that ServBots could handle service recovery by displaying IJ
norms and behaviors as FLEs do. However, implementing ServBots in emotionally-
driven service encounters raises several questions that need to be addressed. First, we discuss the potential risks of misusing ServBots’ empathy for customers’ well-being and free will. Second, we discuss the theoretical and method-
ological issues about using AI-based facial expression detection algorithms heavily
implemented in ServBots. Finally, several questions still need to be addressed before
implementing ServBots in emotionally-driven service encounters, and we propose a
research agenda for future studies.
The customers’ capacity to empathize with ServBots is a solid argument for assum-
ing that ServBots can handle emotionally-driven service encounters and provide
customers with genuine interpersonal communication (Malinowska 2021; Matti-
assi et al. 2021; Schmetkamp 2020). However, misuse of customers’ empathy for
ServBots could also present risks that are worth mentioning in the present study.
Those risks are mainly associated with: safety, privacy, data security, liability,
autonomy and independence, social connectedness and human interactions, objec-
tification and infantilization, deception and anthropomorphizing, and social justice
(Tan et al. 2021). Moreover, the commercial, self-interested nature of the relationship between ServBots and customers biases these ties, and companies’ profit motive could lead to abusive behavior. A ServBot interacting with a customer
will be able to synchronize with the customers’ gestures, emotions, and attitudes,
a mechanism known to influence and direct others’ behavior without their consent
(Tisseron 2015).
Moreover, ServBots will have different information about the world than cus-
tomers. For instance, a ServBot will not be able to identify the meaning of emo-
tions from what it experiences itself, as a human does (i.e., physiological correlates
of emotion), but by comparing it to patterns that have been stored in its software.
The encyclopedic description of an apple does not replace the experience of eating
this apple (Devillers 2017). However, decoding without sharing customers’ affec-
tive and mental states could lead to egocentric behaviors reminiscent of psychopaths
(Mullins-Nelson et al. 2006). The lack of empathy can result in significant social
and relational disability. Psychopaths rely almost exclusively on the cognitive inputs
of others at the expense of their emotional inputs. While they fail to share others’
emotional and mental states, they are perfectly capable of understanding and taking
advantage of them: this makes them very good at anticipating social and behavioral
intentions. This ability to understand the mental and emotional states without shar-
ing them translates into antisocial behaviors marked by disregard for others’ well-
being and lack of guilt, which forms the basis for socially manipulative behaviors
(Zaki and Ochsner 2016).
Does implementation of ServBots in service encounters expose companies to a
lack of empathy and, thus, to psychopathic organizational behavior? The question is
worth asking. On the one hand, implementing ServBots will lead to a loss of human
and physical interaction, resulting in an emotional disconnection from the customer.
This emotional disconnection alters the company’s ability to feel and share the emo-
tional states of its customers and thwarts the development of compassionate feelings
necessary for the development of customer-oriented behaviors, such as assistance.
Moreover, the fact that customers share their emotional states with ServBots does
not make the interaction more human if data processing is automated and standard-
ized. On the other hand, digital transformation gives access to an impressive amount
of data (big data) about customers. It dramatically increases the company’s ability
to understand customers’ affective and mental states without increasing its ability to
share them. Their habits, opinions, beliefs, behaviors, feelings, and intentions will
be digitized, stored, and processed to develop predictive models and streamline the
customer relationship (Søraa et al. 2021). At the risk of producing manipulative and
antisocial behaviors solely focused on the company’s interest (Lajante 2019a, b).
6 Research agenda
This paper showed that ServBots could be a viable option to replace FLEs and han-
dle emotionally driven service encounters like service recovery. However, we also
mentioned that, to date, no empirical study has investigated the antecedents and con-
sequences of customers’ acceptance and satisfaction with ServBots during service
recovery. Research on ServBots in marketing and service is still in its infancy, but it
is a promising research stream, and several questions need to be addressed. We espe-
cially identified five priority research directions for the future.
First, we showed that no study has tested the effect of specific discrete emotions associated with service failure (e.g., anger vs. anxiety) on customers’ expectations. How-
ever, negative emotions rely on different cognitive appraisals and produce different
responses (Scherer and Moors 2019). Service research showed that customers’ sup-
port needs vary depending on their emotions after a service failure: angry custom-
ers expect utilitarian support, whereas anxious customers expect emotional support (Menon and Dubé 2000, 2007). This would imply that angry customers accept interacting with a ServBot during service recovery if it can provide the same relevant solution as FLEs, regardless of the intensity of its empathic display; for them, artificial empathy through IJ would be more acceptable than for anxious customers, who value emotional and authentic empathy. The role of discrete emotions in customers’ acceptance of ServBots has been studied in other service contexts. For instance, a recent study showed that embarrassing service encounters, in which customers feel ashamed, lead to higher ServBot acceptance (Pitardi et al. 2021). Thus, the emotion shared by customers could moderate ServBots’ acceptance during service interactions.
Second, customers’ ServBot acceptance might depend on how authentic or inauthentic ServBots’ displays of emotion and empathy through IJ appear. Our systematic literature review showed that emotional contagion from FLEs to customers could
explain the positive effect of authentic emotional display on customers’ beliefs that
FLEs have treated them fairly (Azab et al. 2018). FLEs’ emotional labor is effective if authentic (deep acting) but ineffective if inauthentic (surface acting), leading to customers’ dissatisfaction with the service encounter (Liu et al. 2019). However,
empathizing with angry customers who display aggressive behaviors is difficult. It
can impair emotional labor authenticity and subsequent processes such as empathic
concern, prosocial behaviors, and customer satisfaction (Dallimore et al. 2007).
Conversely, ServBots do not need to regulate emotions to display normative empa-
thy through emotional labor. Although artificial, ServBots’ empathic display might
be more stable and consistent over time, offering customers more assurance and
relief during the service recovery (Choi et al. 2020). Therefore, studies comparing
the effect of FLEs’ versus ServBots’ emotional labor (deep versus surface acting)
on customers’ perception of IJ norms, service recovery satisfaction, and behaviors
would be worth considering.
Third, we compared FLEs directly to ServBots, assuming that ServBots would
replace FLEs in the future. However, recent studies suggested that ServBots will
work with FLEs rather than replace them (Paluch et al. 2021). This approach con-
siders the service triad—customers, service robots, frontline employees—as a more
likely future of ServBots at the organizational frontline (Odekerken-Schroder et al.
2021). However, it is still unclear whether FLEs will accept working with ServBots and
how customers will feel interacting with both FLEs and ServBots simultaneously
(Xu et al. 2020). Recent studies show that FLEs and ServBots divide tasks accord-
ing to their nature: ServBots would be responsible for operational tasks, while FLEs
would be responsible for interactional tasks (Rancati and Maggioni 2022). However,
the risk of FLEs’ rejection exists and should be investigated in future studies. As with the introduction of new management tools that assist and control workers’ performance in the organization, FLEs could perceive ServBots as a threat and consider boycotting, hijacking, or vandalizing them (Lux and Lajante 2017). Such adverse behaviors
toward ServBots could eventually affect service quality and customer satisfaction.
Fourth, the social robotics and service literature showed varying levels of technological sophistication in ServBots’ capacity to interact and display emotions and empathy. Although
some ServBots can produce and deliver a high-quality service to customers, not all
companies will have the resources to implement such sophisticated technologies.
Simpler applications of ServBots could appear as a company strategy to cut costs (Belanche et al. 2020), putting customers to work. Indeed, some companies could decide to replace FLEs with ServBots and ask customers to co-produce and co-deliver the service by guiding and operating the ServBots themselves, as already happens with self-checkout systems and self-ordering kiosks (Grewal et al. 2020). However, putting customers to work after a service failure might be risky. Such a "prosump-
tion" experience (Ritzer and Jurgenson 2010) could be a significant source of cus-
tomer frustration. It would reduce customers’ opportunities to spontaneously share
emotions and force them to rationalize their experiences to adapt to the ServBots’
technical constraints and limitations (Lajante 2019a). Therefore, it is necessary to
investigate customers’ ServBot acceptance and satisfaction with the service recovery according to the level of engagement required to co-produce and co-deliver the service with the ServBots.
Fifth, findings from our systematic review showed that the importance of IJ in
service recovery is context-dependent. For instance, customers recovering a service from online retailers (Jung and Seock 2017) and airline companies (Nikbin et al. 2015) are less sensitive to FLEs’ IJ norms than customers in banking (Maxham and Netemeyer 2002) or restaurant (Tsao 2018) service encounters. Moreover, different segmentation strategies and brand positioning can produce different customer expectations and sensitivities. For instance, customers’ ServBot acceptance might
be higher for low-cost retailers and airline companies focusing on transactional ser-
vices than for premium companies focusing on interactional services (Rancati and
Maggioni 2022). Finally, different stages of economic and technological development across cultures may impact customers’ ServBot acceptance. For
instance, empathy is a vital service quality dimension for Asian customers of high-tech service companies (He and Li 2010), while reliability is the primary service
quality dimension for Western customers (Andronikidis and Bellou 2010). There-
fore, such moderators could increase or decrease customers’ ServBots acceptance
and should be investigated in future studies.
Funding This work was supported by the Social Sciences and Humanities Research Council of Canada
(No 430-2019-00321, 2019).
Data Availability The data that support the findings of this study are available from the corresponding
author [[email protected]] on request.
References
Acosta-Mitjans A, Cruz-Sandoval D, Hervas R, Johnson E, Nugent C, Favela J (2019) Affective embod-
ied agents and their effect on decision making. Proc MDPI 31(1):71. https://doi.org/10.3390/proce
edings2019031071
Agnihotri D, Kulshreshta K, Tripathi V (2020) A study on service justice effectiveness on customer sat-
isfaction and repurchase intention in social media environment on major online shopping malls.
Finance India 34(2):541–562
Akdim K, Belanche D, Flavián M (2021) Attitudes toward service robots: analyses of explicit and
implicit attitudes based on anthropomorphism and construal level theory. Int J Contemp Hosp
Manag. https://doi.org/10.1108/IJCHM-12-2020-1406
Amelia A, Mathies C, Patterson PG (2021) Customer acceptance of frontline service robots in
retail banking: a qualitative approach. J Serv Manag 33(2):321–341. https://doi.org/10.1108/
JOSM-10-2020-0374
Andronikidis A, Bellou V (2010) Verifying alternative measures of the service-quality construct: consist-
encies and contradictions. J Mark Manag 26(5–6):570–587. https://doi.org/10.1080/0267257090
3498850
Asada M (2015) Towards artificial empathy. Int J Soc Robot 7(1):19–33. https://doi.org/10.1007/
s12369-014-0253-z
Assefa ES (2014) The effects of justice-oriented service recovery on customer satisfaction and loyalty in
retail banks in Ethiopia. EMAJ 4(1):49–58
Aurier P, Siadou-Martin B (2007) Perceived justice and consumption experience evaluations: a qualita-
tive and experimental investigation. Int J Serv Ind Manag 18(5):450–471. https://doi.org/10.1108/
09564230710826241
Azab C, Clark T, Jarvis CB (2018) Positive psychological capacities: the mystery ingredient in successful
service recoveries? J Serv Mark 32(7):897–912. https://doi.org/10.1108/JSM-11-2017-0407
Bagheri E, Roesler O, Cao HL, Vanderborght B (2021) A reinforcement learning based cognitive
empathy framework for social robots. Int J Soc Robot 13(5):1079–1093. https://doi.org/10.1007/
s12369-020-00683-4
Balaji MS, Roy SK, Quazi A (2017) Customers’ emotion regulation strategies in service failure encoun-
ters. Eur J Mark 51(5/6):960–982. https://doi.org/10.1108/EJM-03-2015-0169
Barakat LL, Ramsey JR, Lorenz MP, Gosling M (2015) Severe service failure recovery revisited: evi-
dence of its determinants in an emerging market context. Int Res J Mark 32(1):113–116. https://
doi.org/10.1016/j.ijresmar.2014.10.001
Barakova EI, De Haas M, Kuijpers W, Irigoyen N, Betancourt A (2018) Socially grounded game strategy
enhances bonding and perceived smartness of a humanoid robot. Connect Sci 30(1):81–98. https://
doi.org/10.1080/09540091.2017.1350938
Barrett LF (2017) How emotions are made: the secret life of the brain. Pan Macmillan, New York
Barrett LF, Adolphs R, Marsella S, Martinez AM, Pollak SD (2019) Emotional expressions reconsid-
ered: challenges to inferring emotion from human facial movements. Psychol Sci Public Interest
20(1):1–68. https://doi.org/10.1177/1529100619832930
Belanche D, Casaló LV, Flavián C, Schepers J (2020) Robots or frontline employees? Exploring cus-
tomers’ attributions of responsibility and stability after service failure or success. J Serv Manag
31(2):267–289. https://doi.org/10.1108/JOSM-05-2019-0156
Belanche D, Casaló LV, Flavián C (2021) Frontline robots in tourism and hospitality: service enhancement
or cost reduction? Electron Mark 31(3):477–492. https://doi.org/10.1007/s12525-020-00432-5
Bettencourt LA, Brown SW (2003) Role stressors and customer-oriented boundary-spanning behaviors
in service organizations. J Acad Mark Sci 31(4):394–408. https://doi.org/10.1177/0092070303
255636
Borghi M, Mariani MM (2021) Service robots in online reviews: online robotic discourse. Ann Tour Res.
https://doi.org/10.1016/j.annals.2020.103036
Burns R, Jeon M, Park CH (2018) Robotic motion learning framework to promote social engagement.
Appl Sci 8(2):241. https://doi.org/10.3390/app8020241
Byrd K, Fan A, Her E, Liu Y, Almanza B, Leitch S (2021) Robot vs human: expectations, performances,
and gaps in off-premise restaurant service modes. Int J Contemp Hosp Manag 33(11):3996–4016.
https://doi.org/10.1108/IJCHM-07-2020-0721
Cai R, Qu H (2018) Customers’ perceived justice, emotions, direct and indirect reactions to service
recovery: moderating effects of recovery efforts. J Hosp Mark Manag 27(3):323–345. https://doi.
org/10.1080/19368623.2018.1385434
Čaić M, Odekerken-Schröder G, Mahr D (2018) Service robots: value co-creation and co-destruction in
elderly care networks. J Serv Manag 29(2):178–205. https://doi.org/10.1108/JOSM-07-2017-0179
Carlson Z, Lemmon L, Higgins M, Frank D, Salek Shahrezaie R, Feil-Seifer D (2019) Perceived mis-
treatment and emotional capability following aggressive treatment of robots and computers. Int J
Soc Robot 11(5):727–739. https://doi.org/10.1007/s12369-019-00599-8
Cha SS (2020) Customers’ intention to use robot-serviced restaurants in Korea: relationship of cool-
ness and MCI factors. Int J Contemp Hosp Manag 32(9):2947–2968. https://doi.org/10.1108/
IJCHM-01-2020-0046
Chebat JC, Slusarczyk W (2005) How emotions mediate the effects of perceived justice on loyalty in
service recovery situations: an empirical study. J Bus Res 58(5):664–673. https://doi.org/10.1016/j.
jbusres.2003.09.005
Chen P, Kim YG (2019) Role of the perceived justice of service recovery: a comparison of first-time and
repeat visitors. Tour Hosp Res 19(1):98–111. https://doi.org/10.1177/1467358417704885
Cheung MF, To WM (2017) The effect of organizational responses to service failures on customer satis-
faction perception. Serv Bus 11(4):767–784
Chiang AH, Trimi S (2020) Impacts of service robots on service quality. Serv Bus 14(3):439–459. https://
doi.org/10.1007/s11628-020-00423-8
Choi B, Choi BJ (2014) The effects of perceived service recovery justice on customer affection, loyalty,
and word-of-mouth. Eur J Mark 48(1/2):108–131. https://doi.org/10.1108/EJM-06-2011-0299
Choi BC, Kim SS, Jiang Z (2016) Influence of firm’s recovery endeavors upon privacy breach on online
customer behavior. J Manag Inf Syst 33(3):904–933. https://doi.org/10.1080/07421222.2015.
1138375
Choi Y, Choi M, Oh M, Kim S (2020) Service robots in hotels: understanding the service quality per-
ceptions of human–robot interaction. J Hosp Mark Manag 29(6):613–635. https://doi.org/10.1080/
19368623.2020.1703871
Choi S, Mattila AS, Bolton LE (2021) To err is human (-oid): how do consumers react to robot service
failure and recovery? J Serv Res 24(3):354–371
Christou P, Simillidou A, Stylianou MC (2020) Tourists’ perceptions regarding the use of anthropomor-
phic robots in tourism and hospitality. Int J Contemp Hosp Manag 32(11):3665–3683. https://doi.
org/10.1108/IJCHM-05-2020-0423
Chumkamon S, Hayashi E, Koike M (2016) Intelligent emotion and behavior based on topological con-
sciousness and adaptive resonance theory in a companion robot. Biol Inspired Cogn Archit 18:51–
67. https://doi.org/10.1016/j.bica.2016.09.004
Cominelli L, Feri F, Garofalo R, Giannetti C, Meléndez-Jiménez MA, Greco A, Kirchkamp O (2021)
Promises and trust in human–robot interaction. Sci Rep 11(1):1–14. https://doi.org/10.1038/
s41598-021-88622-9
Dallimore KS, Sparks BA, Butcher K (2007) The influence of angry customer outbursts on service pro-
viders’ facial displays and affective states. J Serv Res 10(1):78–92. https://doi.org/10.1177/10946
70507304694
Davenport T, Guha A, Grewal D, Bressgott T (2020) How artificial intelligence will change the future of
marketing. J Acad Mark Sci 48(1):24–42. https://doi.org/10.1007/s11747-019-00696-0
De Ruyter K, Wetzels M (2000) Customer equity considerations in service recovery: a cross-industry per-
spective. Int J Serv Ind Manag 11(1):91–108. https://doi.org/10.1108/09564230010310303
De Carolis B, Ferilli S, Palestra G (2017) Simulating empathic behavior in a social assistive robot.
Multimed Tools Appl 76(4):5073–5094. https://doi.org/10.1007/s11042-016-3797-0
De Gauquier L, Brengman M, Willems K, Cao HL, Vanderborght B (2021) In or out? A field obser-
vational study on the placement of entertaining robots in retailing. Int J Retail Distrib Manag
49(7):846–874. https://doi.org/10.1108/IJRDM-10-2020-0413
De Jong D, Hortensius R, Hsieh TY, Cross ES (2021) Empathy and schadenfreude in human–robot
teams. J Cogn 4(1):35. https://doi.org/10.5334/joc.177
De Kervenoael R, Hasan R, Schwob A, Goh E (2020) Leveraging human-robot interaction in hospital-
ity services: incorporating the role of perceived value, empathy, and information sharing into
visitors’ intentions to use social robots. Tour Manag 78:104042. https://doi.org/10.1016/j.tourm
an.2019.104042
Devillers L (2017) Des robots et des hommes: Mythes, fantasmes et réalité. Éditions Plon, Paris
DeWitt T, Nguyen DT, Marshall R (2008) Exploring customer loyalty following service recovery:
the mediating effects of trust and emotions. J Serv Res 10(3):269–281. https://doi.org/10.1177/
1094670507310767
Dong B, Evans KR, Zou S (2008) The effects of customer participation in co-created service recovery.
J Acad Mark Sci 36(1):123–137. https://doi.org/10.1007/s11747-007-0059-8
Dos Santos CP, Fernandes DV (2008) The impact of service recovery processes on consumer trust
and loyalty in car repair services. Lat Am Bus Rev 8(2):89–113
Dumouchel P (2017) Of objects and affect artificial empathy, pure sociality, and affective coordina-
tion. Jpn Rev Cult Anthropol 18(1):99–113. https://doi.org/10.14890/jrca.18.1_99
Ekman P, Friesen WV (1978) Manual for the facial action coding system. Consulting Psychologists
Press
Ekman P, Dalgleish T, Power M (1999) Basic emotions. Handbook of cognition and emotion. Wiley,
Chichester
Ellyawati J, Purwanto BM, Dharmmes BS (2012) The effect of perceived justice on customer satisfac-
tion in the service recovery context: testing mediating variables. J Serv Sci Manag 5(2):87–100
Fan A, Wu L, Miao L, Mattila AS (2020) When does technology anthropomorphism help alleviate
customer dissatisfaction after a service failure? The moderating role of consumer technology
self-efficacy and interdependent self-construal. J Hosp Mark Manag 29(3):269–290. https://doi.
org/10.1080/19368623.2019.1639095
Fernandes T, Morgado M, Rodrigues MA (2018) The role of employee emotional compe-
tence in service recovery encounters. J Serv Mark 32(7):835–849. https://doi.org/10.1108/
JSM-07-2017-0237
Flavián C, Pérez-Rueda A, Belanche D, Casaló LV (2021) Intention to use analytical artificial intelli-
gence (AI) in services–the effect of technology readiness and awareness. J Serv Manag 33(2):293–
320. https://doi.org/10.1108/JOSM-10-2020-0378
Forgas-Coll S, Huertas-Garcia R, Andriella A, Alenyà G (2022) The effects of gender and personality of
robot assistants on customers’ acceptance of their service. Serv Bus 16:359–389
Fung P, Bertero D, Wan Y, Dey A, Chan RHY, Bin Siddique F, Lin R (2016) Towards empathetic
human–robot interactions. Cicling 9624:173–193. https://doi.org/10.1007/978-3-319-75487-1_14
Gelbrich K (2010) Anger, frustration, and helplessness after service failure: coping strategies and
effective informational support. J Acad Mark Sci 38(5):567–585. https://doi.org/10.1007/
s11747-009-0169-6
Glaskin K (2012) Empathy and the robot: a neuroanthropological analysis. Ann Anthropol Pract
36(1):68–87. https://doi.org/10.1111/j.2153-9588.2012.01093.x
Global Robotics Industry (2021) https://www.proquest.com/docview/2605420980?accountid=13631&
parentSessionId=16Xo9vf8N6k0UcSitm23kSo%2ByMh4CYha%2FsAYdi6lr2I%3D
Gockley R, Bruce A, Forlizzi J, Michalowski M, Mundell A, Rosenthal S, Sellner B, Simmons R, Snipes
K, Schultz A, Wang J (2005) Designing robots for long-term social interaction. IEEE/RSJ interna-
tional conference on intelligent robots and systems pp 1338–1343
Gohary A, Hamzelu B, Pourazizi L (2016) A little bit more value creation and a lot of less value destruc-
tion! Exploring service recovery paradox in value context: a study in travel industry. J Hosp Tour
Manag 29:189–203. https://doi.org/10.1016/j.jhtm.2016.09.001
Grewal D, Noble SM, Roggeveen AL, Nordfalt J (2020) The future of in-store technology. J Acad Mark
Sci 48(1):96–113. https://doi.org/10.1007/s11747-019-00697-z
Gross JJ (1999) Emotion regulation: past, present, future. Cogn Emot 13(5):551–573. https://doi.org/10.
1080/026999399379186
Gunes H, Hung H (2016) Is automatic facial expression recognition of emotions coming to a dead end?
The rise of the new kids on the block. Image vis Comput 55(1):6–8. https://doi.org/10.1016/j.ima-
vis.2016.03.013
Haidt J, Keltner D (1999) Culture and facial expression: open-ended methods find more expressions and
a gradient of recognition. Cogn Emot 13(3):225–266. https://doi.org/10.1080/026999399379267
Harrison-Walker L (2012) The role of cause and affect in service failure. J Serv Mark 26(2):115–123.
https://doi.org/10.1108/08876041211215275
Harrison-Walker LJ (2019) The effect of consumer emotions on outcome behaviors following service
failure. J Serv Mark 33(3):285–302. https://doi.org/10.1108/JSM-04-2018-0124
He H, Li Y (2010) Key service drivers for high-tech service brand equity: the mediating role of overall
service quality and perceived value. J Mark Manag 27(1/2):77–99. https://doi.org/10.1080/02672
57X.2010.495276
Hempel S (2020) Conducting your literature review. American Psychological Association, Washington
Heyes C (2018) Empathy is not in our genes. Neurosci Biobehav Rev 95:499–507. https://doi.org/10.
1016/j.neubiorev.2018.11.001
Hieida C, Horii T, Nagai T (2018) Deep emotion: a computational model of emotion using deep neural
networks. https://doi.org/10.48550/arXiv.1808.08447
Hocutt MA, Bowers MR, Donavan DT (2006) The art of service recovery: fact or fiction? J Serv Mark
20(3):199–207. https://doi.org/10.1108/08876040610665652
Hofree G, Ruvolo P, Bartlett MS, Winkielman P (2014) Bridging the mechanical and the human mind:
spontaneous mimicry of a physically present android. PLoS ONE 9(7):e99934. https://doi.org/10.
1371/journal.pone.0099934
Huang MH, Rust RT (2018) Artificial intelligence in service. J Serv Res 21(2):155–172. https://doi.org/
10.1177/1094670517752459
Huang MH, Rust RT (2021) A strategic framework for artificial intelligence in marketing. J Acad Mark
Sci 49(1):30–50
Ivanov S, Webster C (2021) Willingness-to-pay for robot-delivered tourism and hospitality services–
an exploratory study. Int J Contemp Hosp Manag 33(11):3926–3955. https://doi.org/10.1108/
IJCHM-09-2020-1078
Ivkov M, Blešić I, Dudić B, Pajtinková Bartáková G, Dudić Z (2020) Are future professionals willing to
implement service robots? Attitudes of hospitality and tourism students towards service robotiza-
tion. Electron 9(9):1442. https://doi.org/10.3390/electronics9091442
James J, Balamurali BT, Watson CI, MacDonald B (2020) Empathetic speech synthesis and testing for
healthcare robots. Int J Soc Robot 13:2119–2137. https://doi.org/10.1007/s12369-020-00691-4
Joosten H, Bloemer J, Hillebrand B (2017) Consumer control in service recovery: beyond decisional con-
trol. J Serv Manag 28(3):499–519. https://doi.org/10.1108/JOSM-07-2016-0192
Jung NY, Seock YK (2017) Effect of service recovery on customers’ perceived justice, satisfaction, and
word-of-mouth intentions on online shopping websites. J Retail Consum Serv 37:23–30. https://
doi.org/10.1016/j.jretconser.2017.01.012
Kähkönen T, Blomqvist K, Gillespie N, Vanhala M (2021) Employee trust repair: a systematic review of
20 years of empirical research and future research directions. J Bus Res 130:98–109. https://doi.
org/10.1016/j.jbusres.2021.03.019
Keltner D, Sauter D, Tracy J, Cowen A (2019) Emotional expression: advances in basic emotion theory. J
Nonverbal Behav 43(2):133–160. https://doi.org/10.1007/s10919-019-00293-3
Kerruish E (2021) Assembling human empathy towards care robots: the human labor of robot sociality.
Emot Space Soc 41:100840. https://doi.org/10.1016/j.emospa.2021.100840
Khaksar SMS, Khosla R, Chu MT, Shahmehr FS (2016) Service innovation using social robot to reduce
social vulnerability among older people in residential care facilities. Technol Forecast Soc Change
113:438–453. https://doi.org/10.1016/j.techfore.2016.07.009
Kharub I, Lwin M, Khan A, Mubin O (2021) Perceived service quality in HRI: applying the SERVBOT
framework. Front Robot AI 8:746674. https://doi.org/10.3389/frobt.2021.746674
Kołakowska A, Landowska A, Szwoch M, Szwoch W, Wrobel MR (2014) Emotion recognition and its
applications. Adv Intell Syst Comput 300:51–62. https://doi.org/10.1007/978-3-319-08491-6_5
Konijn EA, Hoorn JF (2020) Differential facial articulacy in robots and humans elicit different levels
of responsiveness, empathy, and projected feelings. Robotics 9(4):92. https://doi.org/10.3390/robot
ics9040092
Kozub K, Anthony O’Neill M, Palmer A (2014) Emotional antecedents and outcomes of service recov-
ery. J Serv Mark 28(3):233–243. https://doi.org/10.1108/JSM-08-2012-0147
Krishna A, Dangayach GS, Jain R (2011) Service recovery: literature review and research issues. J
Serv Res 3(1):71. https://doi.org/10.1007/s12927-011-0004-8
Kühnlenz B, Sosnowski S, Buß M, Wollherr D, Kühnlenz K, Buss M (2013) Increasing helpfulness
towards a robot by emotional adaption to the user. Int J Soc Robot 5(4):457–476. https://doi.org/
10.1007/s12369-013-0182-2
Kuo YF, Wu CM (2012) Satisfaction and post-purchase intentions with service recovery of online
shopping websites: perspectives on perceived justice and emotions. Int J Inf Manag Sci
32(2):127–138. https://doi.org/10.1016/j.ijinfomgt.2011.09.001
Kwon O, Kim J, Jin Y, Lee N (2018) Impact of human-robot interaction on user satisfaction with
humanoid-based healthcare. Int J Eng Technol 7(2):68–75. https://doi.org/10.14419/ijet.v7i2.
12.11038
Lajante M, Ladhari R (2019) The promise and perils of the peripheral psychophysiology of emotion
in retailing and consumer services. J Retail Consum Serv 50:305–313. https://doi.org/10.1016/j.
jretconser.2018.07.005
Lajante M, Lux G (2020) Perspective: why organizational researchers should consider psychophysi-
ology when investigating emotion? Front Psychol 11:1705. https://doi.org/10.3389/fpsyg.2020.
01705
Lajante M (2019a) The firm’s empathic capacity: a social neuroscience perspective for managing cus-
tomer engagement in the digital era. Augmented Customer Strategy: CRM in the Digital Age 183–
201. https://doi.org/10.1002/9781119618324.ch11
Lajante M (2019b) Augmented empathic capacity: an integrative framework for supporting customer
engagement throughout the automated customer journey. International Conference on Advances
in National Brand and Private Label Marketing pp 121–129. https://doi.org/10.1007/978-3-030-
18911-2_16
Landowska A (2014) Emotion monitoring–verification of physiological characteristics measurement pro-
cedures. Metrol Meas Syst 21(4):719–732. https://doi.org/10.2478/mms-2014-0049
Landowska A (2019) Uncertainty in emotion recognition. J Inf Commun Ethics Soc 17(3):273–291.
https://doi.org/10.1108/JICES-03-2019-0034
Lee JLM, Siu NYM, Zhang TJF (2020) Does brand equity always work? A study of the moderating effect
of justice perceptions and consumer attribution towards Chinese consumers. J Int Consum Mark
32(1):69–81. https://doi.org/10.1080/08961530.2019.1635551
Leite I, Martinho C, Paiva A (2013a) Social robots for long-term interaction: a survey. Int J Soc Robot
5(2):291–308
Leite I, Pereira A, Mascarenhas S, Martinho C, Prada R, Paiva A (2013b) The influence of empathy
in human–robot relations. Int J Hum Comput Stud 71(3):250–260. https://doi.org/10.1016/j.ijhcs.
2012.09.005
Leite I, Castellano G, Pereira A, Martinho C, Paiva A (2014) Empathic robots for long-term interaction.
Int J Soc Robot 6(3):329–341. https://doi.org/10.1007/s12369-014-0227-1
Lewinski P, Den Uyl TM, Butler C (2014) Automated facial coding: validation of basic emotions and
FACS AUs in FaceReader. J Neurosci Psychol Econ 7(4):227–236. https://doi.org/10.1037/npe00
00033
Lii YS, Lee M (2012) The joint effects of compensation frames and price levels on service recovery of
online pricing error. Manag Serv Qual 22(1):4–20. https://doi.org/10.1108/09604521211198083
Lin WB (2006) Correlation between personality characteristics, situations of service failure, customer
relation strength and remedial recovery strategy. Serv Mark Q 28(1):55–88. https://doi.org/10.
1300/J396v28n01_04
Lin HH, Wang YS, Chang LK (2011) Consumer responses to online retailer’s service recovery after a
service failure: a perspective of justice theory. Manag Serv Qual 21(5):511–534. https://doi.org/10.
1108/09604521111159807
Liu XY, Chi NW, Gremler DD (2019) Emotion cycles in services: emotional contagion and emotional
labor effects. J Serv Res 22(3):285–300. https://doi.org/10.1177/1094670519835309
Luo A, Mattila AS (2020) Discrete emotional responses and face-to-face complaining: the joint effect of
service failure type and culture. Int J Hosp Manag 90:102613. https://doi.org/10.1016/j.ijhm.2020.
102613
Lux G, Lajante M (2017) Introducing emotions in the appropriation of management tools: a propaedeu-
tic. Int J Work Organ Emot 8(3):213–233. https://doi.org/10.1504/ijwoe.2017.10009298
Malinowska JK (2021) What does it mean to empathize with a robot? Minds Mach 31(3):361–376.
https://doi.org/10.1007/s11023-021-09558-7
Mattiassi AD, Sarrica M, Cavallo F, Fortunati L (2021) What do humans feel with mistreated humans,
animals, robots, and objects? Exploring the role of cognitive empathy. Motiv Emot 45(4):543–555.
https://doi.org/10.1007/s11031-021-09886-2
Maxham JG III, Netemeyer RG (2002) Modeling customer perceptions of complaint handling over time:
the effects of perceived justice on satisfaction and intent. J Retail 78(4):239–252. https://doi.org/10.
1016/S0022-4359(02)00100-8
Menne IM, Schwab F (2018) Faces of emotion: investigating emotional facial expressions towards a
robot. Int J Soc Robot 10(2):199–209. https://doi.org/10.1007/s12369-017-0447-2
Menon K, Dubé L (2000) Ensuring greater satisfaction by engineering salesperson response to customer
emotions. J Retail 76(3):285–307. https://doi.org/10.1016/S0022-4359(00)00034-8
Menon K, Dubé L (2007) The effect of emotional provider support on angry versus anxious consumers.
Int Res J Mark 24(3):268–275. https://doi.org/10.1016/j.ijresmar.2007.04.001
Michel S, Bowen D, Johnston R (2009) Why service recovery fails. J Serv Manag 20(3):253–273. https://
doi.org/10.1108/09564230910964381
Min H, Lim Y, Magnini VP (2015) Factors affecting customer satisfaction in responses to negative online
hotel reviews: The impact of empathy, paraphrasing, and speed. Cornell Hosp Q 56(2):223–231.
https://doi.org/10.1177/1938965514560014
Mingotto E, Montaguti F, Tamma M (2021) Challenges in re-designing operations and jobs to embody AI
and robotics in services. Findings from a case in the hospitality industry. Electron Mark 31(3):493–
510. https://doi.org/10.1007/s12525-020-00439-y
Mirnig N, Strasser E, Weiss A, Kühnlenz B, Wollherr D, Tscheligi M (2015) Can you read my face? Int J
Soc Robot 7(1):63–76. https://doi.org/10.1007/s12369-014-0261-z
Mohd-Any AA, Mutum DS, Ghazali EM, Mohamed-Zulkifli L (2019) To fly or not to fly? An empiri-
cal study of trust, post-recovery satisfaction and loyalty of Malaysia Airlines passengers. J Serv
Theory Pract 29(5/6):661–690. https://doi.org/10.1108/JSTP-10-2018-0223
Mullins-Nelson JL, Salekin RT, Leistico AMR (2006) Psychopathy, empathy, and perspective-taking
ability in a community sample: Implications for the successful psychopathy concept. Int J Forensic
Ment Health 5(2):133–149. https://doi.org/10.1080/14999013.2006.10471238
Nadiri H (2016) Diagnosing the impact of retail bank customers’ perceived justice on their service recov-
ery satisfaction and postpurchase behaviours: an empirical study in financial centre of middle east.
Econ Res 29(1):193–216. https://doi.org/10.1080/1331677X.2016.1164925
Niculescu A, van Dijk B, Nijholt A, Li H, See SL (2013) Making social robots more attractive: the
effects of voice pitch, humor and empathy. Int J Soc Robot 5(2):171–191. https://doi.org/10.1007/
s12369-012-0171-x
Nikbin D, Marimuthu M, Hyun SS, Ismail I (2015) Relationships of perceived justice to service recovery,
service failure attributions, recovery satisfaction, and loyalty in the context of airline travelers. Asia
Pac J Tour Res 20(3):239–262
Odekerken-Schröder G, Mennens K, Steins M, Mahr D (2021) The service triad: an empirical study of
service robots, customers and frontline employees. J Serv Manag 33(2):246–292. https://doi.org/
10.1108/JOSM-10-2020-0372
Ok C, Back KJ, Shanklin CW (2005) Modeling roles of service recovery strategy: a relationship-focused
view. J Hosp Tour Res 29(4):484–507. https://doi.org/10.1177/1096348005276935
Ortiz J, Chiu TS, Wen-Hai C, Hsu CW (2017) Perceived justice, emotions, and behavioral intentions in
the Taiwanese food and beverage industry. Int J Confl Manag 28(4):437–463. https://doi.org/10.
1108/IJCMA-10-2016-0084
Ozgen O, Kurt SD (2012) Pre-recovery and post-recovery emotions in the service context: a preliminary
study. Manag Serv Qual 22(6):592–605. https://doi.org/10.1108/09604521211287561
Ozkan-Tektas O (2017) Perceived justice and post-recovery satisfaction in banking service failures: do
commitment types matter? Serv Bus 11(4):851–870
Ozuem W, Ranfagni S, Willis M, Rovai S, Howell K (2021) Exploring customers’ responses to online
service failure and recovery strategies during Covid-19 pandemic: an actor–network theory per-
spective. Psychol Mark 38(9):1440–1459. https://doi.org/10.1002/mar.21527
Paluch S, Tuzovic S, Holz HF, Kies A, Jörling M (2021) “My colleague is a robot”–exploring frontline
employees’ willingness to work with collaborative service robots. J Serv Manag 33(2):363–388.
https://doi.org/10.1108/JOSM-11-2020-0406
Pasternak K, Wu Z, Visser U, Lisetti C (2021) Let’s be friends! A rapport-building 3D embodied conver-
sational agent for the Human Support Robot. https://doi.org/10.48550/arXiv.2103.04498
Pelau C, Dabija DC, Ene I (2021) What makes an AI device human-like? The role of interaction quality,
empathy and perceived psychological anthropomorphic characteristics in the acceptance of artifi-
cial intelligence in the service industry. Comput Hum Behav 122:106855. https://doi.org/10.1016/j.
chb.2021.106855
Petzer DJ, De Meyer-Heydenrych CF, Svensson G (2017) Perceived justice, service satisfaction and
behavior intentions following service recovery efforts in a South African retail banking context. Int
J Bank Mark 35(2):241–253. https://doi.org/10.1108/IJBM-04-2016-0047
Pillai R, Sivathanu B (2020) Adoption of AI-based chatbots for hospitality and tourism. Int J Contemp
Hosp Manag 32(10):3199–3226. https://doi.org/10.1108/IJCHM-04-2020-0259
Pinillos R, Marcos S, Feliz R, Zalama E, Gómez-García-Bermejo J (2016) Long-term assessment of a
service robot in a hotel environment. Rob Auton Sys 79:40–57
Pinney J, Carroll F, Newbury P (2022) Human–robot interaction: the impact of robotic aesthetics on
anticipated human trust. PeerJ Comput Sci 8:e837. https://doi.org/10.7717/peerj-cs.837
Pitardi V, Wirtz J, Paluch S, Kunz WH (2021) Service robots, agency and embarrassing service encoun-
ters. J Serv Manag 33(2):389–414. https://doi.org/10.1108/JOSM-12-2020-0435
Pozharliev R, De Angelis M, Rossi D, Romani S, Verbeke W, Cherubino P (2021) Attachment styles
moderate customer responses to frontline service robots: evidence from affective, attitudinal, and
behavioral measures. Psychol Mark 38(5):881–895. https://doi.org/10.1002/mar.21475
Prasongsukarn K, Patterson PG (2012) An extended service recovery model: the moderating impact of
temporal sequence of events. J Serv Mark 26(7):510–520. https://doi.org/10.1108/0887604121
1266477
Prince EB, Martin KB, Messinger DS, Allen M (2017) Facial action coding system. SAGE Encycl Com-
mun Res Methods 1:487–491
Radu A, Surachartkumtonkun J, Weaven S, Thaichon P (2020) Examining antecedents of reconciliation
following service failure and recovery. J Strateg Mark 28(5):417–433. https://doi.org/10.1080/
0965254X.2018.1518920
Räikkönen J, Honkanen A (2016) Making it right the third time? Pursuing satisfaction and loyalty in a
double service recovery. Scand J Hosp Tour 16(4):333–351
Rancati G, Maggioni I (2022) Neurophysiological responses to robot–human interactions in retail stores.
J Serv Mark. https://doi.org/10.1108/JSM-04-2021-0126
Riddoch KA, Cross E (2021) “Hit the robot on the head with this mallet”–making a case for including
more open questions in HRI research. Front Robot AI. https://doi.org/10.3389/frobt.2021.603510
Rincon JA, Costa A, Novais P, Julian V, Carrascosa C (2019) A new emotional robot assistant that facili-
tates human interaction and persuasion. Knowl Inf Syst 60(1):363–383. https://doi.org/10.1007/
s10115-018-1231-9
Ritzer G, Jurgenson N (2010) Production, consumption, prosumption: the nature of capitalism in the age
of the digital “prosumer.” J Consum Cult 10(1):13–36. https://doi.org/10.1177/1469540509354673
Roberts SF, Koditschek DE, Miracchi LJ (2020) Examples of Gibsonian affordances in legged robotics
research using an empirical, generative framework. Front Neurorobot 14:12
Roesler E, Manzey D, Onnasch L (2021) A meta-analysis on the effectiveness of anthropomorphism in
human-robot interaction. Sci Robot 6(58):5425. https://doi.org/10.1126/scirobotics.abj5425
Romero J, Lado N (2021) Service robots and COVID-19: exploring perceptions of prevention efficacy
at hotels in generation Z. Int J Contemp Hosp Manag 33(11):4057–4078. https://doi.org/10.1108/
IJCHM-10-2020-1214
Rossi S, Conti D, Garramone F, Santangelo G, Staffa M, Varrasi S, Di Nuovo A (2020) The role of per-
sonality factors and empathy in the acceptance and performance of a social robot for psychometric
evaluations. Robotics 9(2):39. https://doi.org/10.3390/robotics9020039
Sakurai E, Kurashige K, Tsuruta S, Sakurai Y, Knauf R, Damiani E, Frati F (2020) Embodiment matters:
toward culture-specific robotized counselling. J Reliab Intell Environ 6(3):129–139. https://doi.org/
10.1007/s40860-020-00109-y
Salagrama R, Prashar S, Sai Vijay T (2021) Do customers exhibit gratitude after service recovery?
Understanding the moderating role of relationship type. Serv Bus 15(4):757–779
Savela N, Turja T, Oksanen A (2018) Social acceptance of robots in different occupational fields: a sys-
tematic literature review. Int J Soc Robot 10(4):493–502
Schepers J, Streukens S (2022) To serve and protect: a typology of service robots and their role in physi-
cally safe services. J Serv Manag 33(2):197–209. https://doi.org/10.1108/JOSM-11-2021-0409
Schepers JJL, Belanche Gracia D, Casaló LV, Flavián C (2022) How smart should a service robot be? J
Serv Res 25(4):565–582. https://doi.org/10.1177/10946705221107704
Scherer KR, Grandjean D (2008) Facial expressions allow inference of both emotions and their compo-
nents. Cogn Emot 22(5):789–801. https://doi.org/10.1080/02699930701516791
Scherer KR, Moors A (2019) The emotion process: event appraisal and component differentiation. Annu
Rev Psychol 70:719–745. https://doi.org/10.1146/annurev-psych-122216-011854
Schmetkamp S (2020) Understanding AI—can and should we empathize with robots? Rev Philos Psychol
11(4):881–897. https://doi.org/10.1007/s13164-020-00473-x
Seiders K, Berry LL (1998) Service fairness: what it is and why it matters. Acad Manag Perspect
12(2):8–20. https://doi.org/10.5465/ame.1998.650513
Sengupta A, Balaji MS, Krishnan BC (2015) How customers cope with service failure? A study of brand
reputation and customer satisfaction. J Bus Res 68(3):665–674. https://doi.org/10.1016/j.jbusres.
2014.08.005
Severinson-Eklundh K, Green A, Hüttenrauch H (2003) Social and collaborative aspects of interaction
with a service robot. Robot Auton Syst 42(3–4):223–234
Shin HH, Jeong M (2020) Guests’ perceptions of robot concierge and their adoption intentions. Int J Con-
temp Hosp Manag 32(8):2613–2633. https://doi.org/10.1108/IJCHM-09-2019-0798
Siehl C, Bowen DE, Pearson CM (1992) Service encounters as rites of integration: an information pro-
cessing model. Organ Sci 3(4):537–555. https://doi.org/10.1287/orsc.3.4.537
Siu NYM, Zhang TJF, Yau CYJ (2013) The roles of justice and customer satisfaction in customer
retention: a lesson from service recovery. J Bus Ethics 114(4):675–686. https://doi.org/10.1007/
s10551-013-1713-3
Smith ME, Hart G (1994) Nurses’ responses to patient anger: from disconnecting to connecting. J Adv
Nurs 20(4):643–651
Smith AK, Bolton RN, Wagner J (1999) A model of customer satisfaction with service encounters involv-
ing failure and recovery. J Mark Res 36(3):356–372. https://doi.org/10.1177/002224379903600305
Søraa RA, Nyvoll P, Tøndel G, Fosch-Villaronga E, Serrano JA (2021) The social dimension of domes-
ticating technology: interactions between older adults, caregivers, and robots in the home. Technol
Forecast Soc Change 167:120678. https://doi.org/10.1016/j.techfore.2021.120678
Tan SY, Taeihagh A, Tripathi A (2021) Tensions and antagonistic interactions of risks and ethics of using
robotics and autonomous systems in long-term care. Technol Forecast Soc Change 167:120686.
https://doi.org/10.1016/j.techfore.2021.120686
Tisseron S (2015) Le jour où mon robot m’aimera: Vers l’empathie artificielle. Albin Michel, Paris
Tsao WC (2018) Star power: the effect of star rating on service recovery in the hotel industry. Int J Con-
temp Hosp Manag 30(2):1092–1111. https://doi.org/10.1108/IJCHM-05-2016-0247
Tuomi A, Tussyadiah IP, Stienmetz J (2021) Applications and implications of service robots in hospital-
ity. Cornell Hosp Q 62(2):232–247. https://doi.org/10.1177/1938965520923961
Umasuthan H, Park OJ, Ryu JH (2017) Influence of empathy on hotel guests’ emotional service experi-
ence. J Serv Mark 31(6):618–635. https://doi.org/10.1108/JSM-06-2016-0220
Valentini S, Orsingher C, Polyakova A (2020) Customers’ emotions in service failure and recovery: a
meta-analysis. Mark Lett 31(2):199–216. https://doi.org/10.1007/s11002-020-09517-9
Van Doorn J, Mende M, Noble SM, Hulland J, Ostrom AL, Grewal D, Petersen JA (2017) Domo arigato
Mr Roboto: emergence of automated social presence in organizational frontlines and customers’
service experiences. J Serv Res 20(1):43–58
Van Vaerenbergh Y, Varga D, De Keyser A, Orsingher C (2019) The service recovery journey: conceptu-
alization, integration, and directions for future research. J Serv Res 22(2):103–119
Wang YS, Wu SC, Lin HH, Wang YY (2011) The relationship of service failure severity, service recov-
ery justice, and perceived switching costs with customer loyalty in the context of e-tailing. Int J Inf
Manage 31(4):350–359. https://doi.org/10.1016/j.ijinfomgt.2010.09.001
Wen B, Chi C (2013) Examine the cognitive and affective antecedents to service recovery satisfaction. Int
J Contemp Hosp Manag 25(3):306–327. https://doi.org/10.1108/09596111311310991
Weun S, Beatty SE, Jones MA (2004) The impact of service failure severity on service recovery evalu-
ations and post-recovery relationships. J Serv Mark 18(2):133–146. https://doi.org/10.1108/08876
040410528737
Wharton AS (2009) The sociology of emotional labor. Annu Rev Sociol 35:147–165. https://doi.org/10.
1146/annurev-soc-070308-115944
Wieseke J, Geigenmüller A, Kraus F (2012) On the role of empathy in customer-employee interactions. J
Serv Res 15(3):316–331. https://doi.org/10.1177/1094670512439743
Wirtz J, McColl-Kennedy JR (2010) Opportunistic customer claiming during service recovery. J Acad
Mark Sci 38(5):654–675. https://doi.org/10.1007/s11747-009-0177-6
Wirtz J, Patterson PG, Kunz WH, Gruber T, Lu VN, Paluch S, Martins A (2018) Brave new world: service
robots in the frontline. J Serv Manag 29(5):907–931. https://doi.org/10.1108/JOSM-04-2018-0119
Wu L (2013) The antecedents of customer satisfaction and its link to complaint intentions in online shop-
ping: an integration of justice, technology, and trust. Int J Inf Manage 33(1):166–176. https://doi.
org/10.1016/j.ijinfomgt.2012.09.001
Wu L, Fan A, Yang Y, He Z (2021) Robotic involvement in the service encounter: a value-centric experi-
ence framework and empirical validation. J Serv Manag 32(5):783–812. https://doi.org/10.1108/
JOSM-12-2020-0448
Xu S, Stienmetz J, Ashton M (2020) How will service robots redefine leadership in hotel manage-
ment? A Delphi approach. Int J Contemp Hosp Manag 32(6):2217–2237. https://doi.org/10.1108/
IJCHM-05-2019-0505
Yeoh PL, Woolford SW, Eshghi A, Butaney G (2015) Customer response to service recovery in online
shopping. J Serv Res 14(2):33–56
Yitzhak N, Giladi N, Gurevich T, Messinger DS, Prince EB, Martin K, Aviezer H (2017) Gently does it:
humans outperform a software classifier in recognizing subtle, nonstereotypical facial expressions.
Emotion 17(8):1187. https://doi.org/10.1037/emo0000287
Zaki J, Ochsner K (2016) Empathy. In: Feldman-Barrett L, Lewis M, Haviland-Jones JM (eds) Handbook
of emotions. The Guilford Press, New York
Złotowski J, Proudfoot D, Yogeeswaran K, Bartneck C (2015) Anthropomorphism: opportunities and
challenges in human–robot interaction. Int J Soc Robot 7(3):347–360
Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published maps
and institutional affiliations.
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under
a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted
manuscript version of this article is solely governed by the terms of such publishing agreement and
applicable law.