Practices that foster a learning or mastery orientation

In the 1970s, Carol Dweck introduced a concept, called goal orientation, that revolutionised the research on motivation.  Goal orientation refers to whether individuals primarily strive to enhance their knowledge, skills, and competence, referred to as a learning orientation, or generally attempt to demonstrate their abilities and expertise, referred to as a performance orientation.  Several threads of research indicate that a learning orientation should foster humility or diminish the deleterious effects of narcissism.  To illustrate

  • as Owens et al. (2013) revealed, people who adopt a learning orientation are more likely to exhibit humility, in which they acknowledge their limitations and respect diverse perspectives,
  • a growth mindset, in which individuals assume that competence and character are modifiable (Dweck, 2006), tends to foster humility (Porter & Schumann, 2018) as well as promote a learning orientation (e.g., Wibowo & Sumiati, 2022),
  • as Braun et al. (2025) revealed, when the culture or climate of an organisation prioritises learning and development over competition and rivalry, narcissistic leaders are not as inclined to harm their organisation merely to pursue their personal interests.

Indeed, a learning orientation seems to foster an openness of individuals towards diverse perspectives and novel ideas.  That is, when individuals adopt a learning orientation, they perform better in diverse teams (Pieterse et al., 2013).  This finding implies that a learning orientation encourages individuals to respect and to embrace diverse perspectives: one of the cornerstones of humility, as defined by Tangney et al. (2000).  Accordingly, initiatives that promote a learning orientation should foster humility or at least diminish the deleterious impact of narcissism.

Definitions of a learning orientation

Originally, the notion of a goal orientation emanated from the research that Carol Dweck and her colleagues conducted, primarily with primary school children, in the 1970s and 1980s (e.g., Diener & Dweck, 1978; Diener & Dweck, 1980; Dweck, 1986). In these studies, children received problems to complete. As these problems became increasingly challenging, some children continued to enjoy the challenge, remaining confident and engaged as well as adapting their strategies to solve these problems. Their principal goal was, seemingly, to develop and master knowledge, skills, and expertise, referred to as a learning orientation, as defined by Dweck and Elliott (1983). Other authors have applied different terms to delineate the same, or similar, behaviours—such as task-involved (Nicholls, 1984) or mastery-focused (Ames, 1984).

In contrast, other children became especially upset, disengaged, disinterested, and unconfident as these problems became more challenging, demonstrating a helpless rather than an adaptive response. Their principal goal was to demonstrate and validate, rather than develop and refine, their competence. This inclination is referred to as a performance orientation (Dweck & Elliott, 1983) and overlaps with the concepts of ego-involved (Nicholls, 1984) and ability-focused (Ames, 1984).  These goal orientations were later established in adults as well (Farr et al., 1993).

Initially, researchers assumed that individuals adopt either a learning orientation or a performance orientation.  Over the next couple of decades, this approach gradually evolved:

  • During the 1990s, researchers argued that a learning orientation and performance orientation should be conceptualised as two distinct dimensions because, in principle, individuals could demonstrate a strong motivation both to develop and to demonstrate their competence (Button et al., 1996).
  • During the late 1990s, researchers subdivided the notion of a performance orientation into two facets: a prove or approach dimension, in which individuals strive to demonstrate favourable attributes, and an avoid or avoidance dimension, in which individuals attempt to minimise or to conceal unfavourable characteristics (Elliott & Harackiewicz, 1996; VandeWalle, 1997).
  • Over a decade later, researchers also subdivided the notion of a learning or mastery orientation into two facets: mastery approach, in which individuals strive to improve their capabilities, and mastery avoid, in which individuals strive to minimise the deterioration of their capabilities (Baranik et al., 2013; Van Yperen et al., 2009). 

In short, both learning orientation and performance orientation can be divided into two facets: an approach version and an avoidance version.  Measures of goal orientation thus assign individuals a score on four dimensions: performance-approach, performance-avoid, learning-approach, and learning-avoid.  All four dimensions can be useful in particular circumstances.  Nevertheless, a strong learning orientation in general, or a learning-approach orientation in particular, is often especially useful.  For example,

  • when individuals adopt a learning orientation instead of a performance orientation, increases in their workload are not as likely to damage their satisfaction at work (Van Yperen & Janssen, 2002),
  • when individuals embrace a learning orientation instead of a performance orientation, they are more likely to share their resources and behave cooperatively—perhaps because they perceive colleagues as sources of knowledge and not as potential rivals (Poortvliet & Giebels, 2012),
  • when people are encouraged to adopt a learning orientation instead of a performance orientation, they are less inclined to cheat on tasks (Van Yperen et al., 2011),
  • when people adopt a learning orientation instead of a performance orientation, their working memory tends to improve; that is, they can retain, transform, integrate, and consider many facets or issues simultaneously (Linnenbrink et al., 2000).

Arguably, when individuals adopt a learning orientation, their goal—to develop capabilities rather than to achieve some outcome—tends to be close rather than remote, generally boosting motivation.  In addition, these individuals perceive challenges and criticisms as opportunities to improve, promoting resilience and flexibility. 

Limited time pressure

To foster humility as well as many other benefits, leaders, practitioners, and other individuals need to encourage people to adopt a learning or mastery orientation in many settings.  Fortunately, researchers have shown that various interventions, instructions, or initiatives can indeed promote a learning orientation.

To illustrate, the extent to which people feel rushed, in the workplace or in other settings, may affect goal orientation.  Specifically, as Beck and Schmidt (2013) demonstrated, when people feel rushed, they tend to demonstrate the hallmarks of an avoidant performance orientation.  In contrast, when people feel they are granted the luxury of time, they tend to exhibit a learning or mastery orientation.

In their study, undergraduate students first indicated the degree to which they agreed with statements about time pressure, such as “I am working under excessive time pressure”.  In addition, these participants completed questions that assessed goal orientation, and their exam performance was also measured.  Time pressure was negatively associated with a learning orientation but positively associated with a performance-avoid orientation.  A learning orientation, in turn, predicted better exam performance.  Accordingly, to evoke a learning orientation, and thus potentially to foster humility, individuals should be granted enough time, or control over how to allocate their time, to complete their tasks.

Presumably, when time is limited, people are concerned they may not fulfill their goals.  Accordingly, they orient their attention towards more immediate needs rather than future goals, such as professional development. 

A supportive interpersonal environment

When individuals feel their interpersonal environment is supportive, they tend to exhibit all the hallmarks of a learning orientation, such as embracing challenging tasks.  In these settings, individuals are not as worried or vigilant about their relationships and can instead shift their attention to their own learning and development.

To illustrate, in one study, conducted by Kiuru et al. (2014), the participants were children, between kindergarten and grade 4.  Teachers rated how they felt about each child as well as the extent to which each child was accepted by peers.  Parents rated their parenting style.  Children evaluated the extent to which they enjoyed or avoided challenging tasks.  When teachers, parents, and peers were supportive at one time, the children were more inclined to embrace challenging tasks at subsequent times, epitomising a learning orientation.

This finding can be ascribed to the tenets of attachment theory. That is, cues that signal a sense of support prime memories of the past, especially during childhood, in which individuals felt protected by a supportive and accessible caregiver.  While individuals experience this sense of protection, called a secure base, they feel safe enough to explore unfamiliar features of their environment and to develop their repertoire of capabilities, epitomising a learning orientation. As Mikulincer et al. (2001) revealed, even subliminal exposure to the names of supportive caregivers enhances creativity—a mindset that facilitates growth and development.   

Whether an environment is deemed as supportive may partly depend on the degree to which individuals tend to be trusting.  Consistent with this possibility, trust propensity, measured by questions like “I generally trust other people unless they give me a reason not to”, is positively associated with a learning orientation (Chughtai & Buckley, 2011). 

Teaching style

Some teaching practices are especially likely to foster a learning or mastery orientation in the classroom as well as promote intellectual humility.  Tenelle Porter, from Rowan University, and her associates conducted a study that uncovered four teaching practices that tend to foster mastery goals and intellectual humility:

  • discussions about how students can apply strategies to improve their intellectual abilities, priming a growth mindset,
  • feedback about the degree to which students are applying effective strategies, mobilising effort, and persisting in response to obstacles,
  • enabling students to assume distinct roles during classroom activities,
  • encouraging students to think conceptually about how to improve their learning.

Specifically, in this study, the participants were 547 students—typically around 12 years of age, derived from 26 classrooms and nine schools—together with their 17 teachers.  In the fall of one year and then six months later, the students completed two scales:

  • a measure of intellectual humility that comprised three questions, including “I am willing to admit it when I do not know something”,
  • a measure of the degree to which the classroom prioritises mastery or learning goals that includes six items, such as “In my class, it’s OK to make mistakes as long as you are learning” (Midgley et al., 2000).

Between these two periods, observers rated the degree to which the teachers applied the four teaching practices that purportedly foster mastery goals, as Catalán Molina et al. (2022) enumerated.  For example, the observers identified whether teachers

  • alluded to how the brain is like a muscle that can become stronger over time,
  • when delivering feedback, referred to the strategies or effort of students,
  • assigned each student a unique role while they worked in small workgroups,
  • encouraged students to apply particular strategies to solve problems,
  • asked students to explain the strategies they used to complete the assignment.

Finally, a year after the first survey, the students again completed the measure of intellectual humility.  The researchers used Mplus Version 8 (Muthén & Muthén, 1998–2017) to conduct multilevel Bayesian analysis, with informative priors to minimise bias (see Smit et al., 2020), in which students were nested within classrooms and schools.  When teachers applied the four sets of teaching practices, students became more likely to feel the classroom prioritised mastery.  This belief, in turn, enhanced intellectual humility over time.
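For readers unfamiliar with this kind of nested analysis, the Python sketch below illustrates how a comparable multilevel Bayesian model could be specified with the bambi library.  It is a minimal sketch only, not the authors' Mplus syntax: the variable names, the data file, and the prior are hypothetical assumptions for illustration.

    # A minimal sketch of a multilevel Bayesian model in which students are nested
    # within classrooms and schools.  Variable names, the data file, and the prior are
    # hypothetical; this is not the syntax used by Catalán Molina et al. (2022).
    import arviz as az
    import bambi as bmb
    import pandas as pd

    df = pd.read_csv("classroom_survey.csv")  # hypothetical long-format student data

    # Intellectual humility at the final wave, predicted by perceived mastery climate
    # and baseline humility, with random intercepts for classroom and school.
    model = bmb.Model(
        "humility_t3 ~ mastery_climate_t2 + humility_t1 + (1|classroom) + (1|school)",
        data=df,
        priors={"mastery_climate_t2": bmb.Prior("Normal", mu=0, sigma=0.5)},  # weakly informative prior
    )

    idata = model.fit(draws=2000, chains=4, random_seed=1)
    print(az.summary(idata, var_names=["mastery_climate_t2"]))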

Leadership style

Presumably, the approach that leaders adopt may affect the degree to which individuals feel rushed or supported.  Consequently, the leadership style of managers and supervisors may affect the goal orientation of staff.  Indeed, studies confirm that leadership style shapes learning orientation.  Coad and Berry (1998), for example, revealed that a learning orientation prevailed when managers demonstrated a transformational style—a style in which leaders promulgate an inspiring vision of the future as well as grant staff the tools and opportunities to pursue this vision.  

Other research has also clarified how the leadership style of managers could affect the learning orientation of staff.  To illustrate, Özsahin et al. (2011) conducted a study to explore this relationship in more detail.  In this study, 127 participating firms completed a questionnaire that assessed

  • the degree to which leaders facilitate change, such as “empowers people to implement new strategies”,
  • the extent to which leaders coordinate and plan tasks carefully, such as “assigns work to groups or individuals” and “clarifies role expectations and task objectives”,
  • the degree to which leaders establish trusting relationships with staff,
  • the degree to which staff feel that learning is valued.

All three leadership styles encouraged the pursuit of learning.  Presumably, when leaders inspire staff to embrace innovative practices, individuals feel motivated to develop novel strategies rather than merely achieve excellence, epitomising a learning orientation.  When leaders plan tasks carefully and establish trusting relationships, staff are confident they can fulfill the expectations of these leaders.  Therefore, these staff may feel protected and, consistent with attachment theory, thus able to explore unfamiliar tasks and possibilities.  

Leadership exploration

Rather than overarching leadership styles, some researchers have explored more specific behaviours that may foster a learning orientation.  Specifically, some managers often adopt an approach called exploitation in which they apply, extend, and refine the existing knowledge or skills of the organisation to improve performance.  In practice, these organisations tend to modify their extant products and services to accommodate potential customers. In contrast, other managers often adopt an approach called exploration in which they experiment with novel practices or perspectives to improve performance.  In practice, these organisations tend to develop novel products and services to attract potential customers (March, 1991).  Because exploitation tends to attract more immediate returns, organisations do not often prioritise exploration sufficiently (Fang et al., 2010).

As Matsuo (2020) proposed, if managers prioritise exploration, they first inspire staff to explore unfamiliar sources of knowledge and insight, fostering a learning orientation.  Second, these managers role model some of the practices that epitomise a learning orientation, such as attempting unfamiliar tasks.  To test this premise, a sample of 147 employees of a Japanese pharmaceutical company completed a survey that Matsuo (2020) assembled.  These employees completed questions that assess

  • the degree to which their managers undertake exploration activities, such as “(My supervisor searches) for new possibilities with respect to products, services, processes, or markets” (Mom et al., 2007),
  • the extent to which they adopt a learning orientation, such as “I look for opportunities to develop new skills and knowledge” (Bunderson & Sutcliffe, 2003),
  • their willingness to reflect upon their work, such as “I often reflect upon whether I am working effectively” and “I often review the methods I use to get the job done” (West, 2000).

As the data revealed, leadership exploration was positively associated with both a learning orientation and reflection.  Indeed, learning orientation mediated the association between leadership exploration and reflection.  Similar to a learning orientation, reflection may also foster humility.

Practices that may diminish time pressure: A four-day work week

Because time pressure tends to diminish the degree to which individuals feel inspired to develop their capabilities (Beck & Schmidt, 2013)—a key determinant or feature of humility (Owens et al., 2013)—initiatives that diminish the extent to which people feel rushed should foster humility.  Conceivably, a four-day work week may grant some individuals more time to develop capabilities and to engage in other rewarding pursuits.  This arrangement, therefore, could foster humility.

Four-day work weeks have been discussed and trialled since the mid 1900s (Dunham & Hawk, 1977; Hartman & Weaver, 1977) but then resurfaced with zeal this century.  Over 60 UK companies have recently experimented with this arrangement (Stewart, 2023).  Nevertheless, despite the proclaimed benefits of four-day work weeks—such as improvements in productivity, work engagement, job satisfaction, and wellbeing—some of the purported merits are unsubstantiated and exaggerated (Bird, 2010).

The Icelandic case study

One case study, often touted as evidence of the consequences, and specifically the benefits, of the four-day work week was conducted in Iceland.  Specifically, 66 workplaces trialled a reduced work week.  According to a report, in general, these arrangements enhanced the wellbeing of staff as well as the capacity of these individuals to balance their work and family responsibilities.  Furthermore, these arrangements did not appear to dent, and may have even improved, the services these organisations delivered (Haraldsson & Kellam, 2021).

Nevertheless, as Campbell (2024) underscored, some caveats of this study may constrain the implications of these results.  For example

  • 61 of the 65 workplaces did not reduce work to four days—but merely diminished work hours by three or fewer hours a week,
  • the organisations, including the Reykjavik City Council, provided services—in contrast to the manufacturing, construction, or fisheries industries that are prevalent in Iceland and may be more impeded by reduced work hours.

The case study in New Zealand

Besides the case study in Iceland, one of the other most prominent examples of a four-day work week was introduced by an established, medium-sized company in New Zealand that provides financial services.  Delaney and Casey (2022) published a report that outlines and analyses this case study in detail.

Specifically, in 2018, this company decreased the working hours of staff from 40 hours, across 5 days, to 32 hours, across four days, while maintaining staff remuneration and conditions.  The primary, but not sole, motivation of the Director was to increase productivity—predicated on the assumption that staff motivation, concentration, and efficiency may otherwise wane across the week.  However, because of workloads and reporting deadlines, some teams or members of staff could not fully embrace these arrangements.  Before this trial commenced, few staff had been granted flexible working arrangements, such as telecommuting, flexitime, or job sharing. 

The director announced that, if productivity was maintained after eight weeks, this arrangement would be permanent.  Teams were assigned the responsibility to define, measure, and report their productivity.  Energy usage, absenteeism, and other indices were also measured. Delaney and Casey (2022) also conducted focus groups and interviews to explore approaches to measure productivity, attitudes towards the trial, challenges the staff experienced, and implications of this trial.

As these discussions as well as documentation revealed, to implement this trial, staff were granted more opportunities to contribute towards workplace practices—to, for example, choose suitable measures of productivity, record the insights they gained from this trial, and suggest innovations to improve efficiency.  These opportunities also stimulated intellectual engagement, promoted collaboration across teams and staff, as well as ignited effort in staff.  To work more efficiently, staff indicated they

  • did not linger during coffee breaks or similar occasions,
  • experienced a sense of urgency and exhilaration,
  • either within or outside work hours, planned their days or weeks in advance,
  • may have worked more hastily, potentially compromising some quality,
  • sometimes experienced stress (Delaney & Casey, 2022).

Because of this stress, some individuals returned to a five-day week.  Other staff indicated they would like more flexibility or discretion on when they could work four days a week.  But staff who maintained the four-day week experienced many benefits to their lives because they could participate in more activities that revolve around family, community, study, wellbeing, and other duties. Nevertheless, some caveats were recognised:

  • First, the novelty of this trial might have boosted motivation and could wane over time.
  • Similarly, staff were motivated to bolster the success of this arrangement; otherwise, they would need to work five days a week in the future.  Consequently, during this trial, staff may have been especially motivated and productive.  

Because the trial was successful, the company maintained the four-day work week, although staff were instructed to remain available to work if needed—a position that most staff, but not all leaders, embraced.  Staff also agreed they would retain the right to work only four days if their productivity remained sufficient: the arrangement was deemed a privilege rather than a right or entitlement.

More comprehensive programs

Despite the significant media attention they attracted, the trials in Iceland and New Zealand were confined to specific industries and two nations.  In contrast, Fan et al. (2025) presented research on 141 organisations, located in the US, Canada, UK, Ireland, Australia, and New Zealand, that had trialled a four-day work week or similar reductions in work hours.  In all these organisations, the 2896 staff maintained their pay during this trial.  Specifically

  • before the trial began, over two months, organisations introduced a range of initiatives, discussions, and events to maintain productivity and minimise inefficiency—such as cancelling unnecessary meetings,
  • in general, staff could work longer than four days if they preferred,
  • the trials lasted six months.

As the findings revealed (Fan et al., 2025), reductions in work hours tended to predict decreases in burnout and improvements in job satisfaction, mental health, and even physical health.  These benefits were mediated by three key effects of reduced work hours:

  • the tendency of staff to feel they could work more effectively,
  • a decrease in sleep problems,
  • a decrease in fatigue, potentially improving productivity. 

Although productivity was not assessed objectively, over 90% of the organisations chose to sustain these arrangements.  And, importantly, even after 12 months, wellbeing remained elevated, suggesting the benefits of reduced hours may persist over time.

Reasons that a four-day work week may be effective

To introduce a four-day work week or a similar arrangement, practitioners need to know why, and thus when, reduced work hours could enhance productivity.  To help practitioners achieve this goal, Rae and Russell (2025) explored some of the physiological and cognitive changes that may underpin these benefits.  

First, as Rae and Russell (2025) underscored, if individuals work fewer hours, they are granted more time to detach psychologically from their workplace (Clinton et al., 2017).  This detachment has been shown to diminish rumination (see Weigelt et al., 2019) and to facilitate sleep (Pereira et al., 2016); consequently, delays and disruptions to sleep tend to diminish.

As sleep improves, individuals tend to reach more effective decisions.  That is, restored sleep improves the functioning of specific brain circuits (e.g., Libedinsky et al., 2011), such as frontal-striatal pathways and the default mode network (De Havas et al., 2012).  Better functioning of the frontal-striatal pathways enhances the capacity of individuals to set and to fulfill goals.  Activation of the default mode network may improve the ability of staff to solve problems creatively, and so forth.

Moderate levels of detachment from work also facilitate recovery, enhancing the degree to which staff feel alert and energised the next day (Wendsche & Lohmann-Haislah, 2017).  This recovery may also enhance attention, regulation, and performance.

Second, when staff need to confine their work to fewer days, they naturally and creatively seek opportunities to improve efficiency. They may agree that fewer people need to contribute to some task, such as a meeting or decision.  They might utilise technologies that facilitate productivity.  They may prevent distractions.  Or teams might agree to relinquish activities that are no longer productive or necessary. 

Effects of reduced work hours on sleep and recovery

Some research has indeed confirmed that working fewer hours improves sleep and facilitates recovery. As evidence of this premise, Schiller et al. (2017) published a study that investigated an ambitious project. Specifically, between 2005 and 2006, the Swedish government commissioned a project in which 33 government agencies or services were randomly assigned to one of two conditions.  In one condition, comprising 354 staff members of 17 agencies, the participating individuals were instructed to work 25% fewer hours than usual.  The agencies received additional funding to engage other staff to fulfill unmet work demands. This trial lasted 18 months. In the control condition, 226 staff members of 16 agencies worked their usual hours.

At three times—before, during, and after the trial—participants maintained a sleep diary.  For example, they answered questions about whether they slept well, whether they were worried about their sleep, and so forth. During the day, these participants answered questions that gauged the degree to which they felt drowsy. As the findings revealed

  • staff of the agencies that reduced work hours reported improvements in sleep duration and quality as well as decreases in drowsiness, worry, and stress,
  • these effects persisted even after controlling for age, education, and job control (Schiller et al., 2017).

A year later, Schiller et al. (2018) examined how staff utilised the additional time they were granted after their work hours were curtailed.  As this study revealed, the staff who worked fewer hours dedicated more time to activities that might facilitate recovery and recuperation, such as hobbies, reading, and listening to the radio. 

Concerns about a four-day work week: Perceptions of employees

To explore the potential concerns or complications of a four-day work week, from the perspective of employees, Jain et al. (2025) interviewed 14 adult staff from manufacturing, healthcare, finance, education, and other sectors.  The researchers prompted the participants to share their perspectives on this arrangement.  Although participants recognised some of the benefits of a four-day work week, such as the capacity to balance work and family responsibilities, these individuals also raised some concerns.  For example

  • some participants were concerned that attempting to cram five days of work into four days might either provoke haste and stress, prolong work days, or compromise revenue,
  • other participants felt that four-day work weeks may not be appropriate in sectors in which staff must collaborate daily with stakeholders from other organisations, unless these organisations also impose this arrangement,
  • finally, some participants were dubious about whether many organisations would adopt this model without government incentives or subsidies—because the financial risk to workplaces might otherwise be too onerous (Jain et al., 2025). 

Practices that may diminish time pressure: How to manage the planning fallacy

Planning fallacy can impede humility

Introduction

To diminish time pressure and thus inspire people to develop their capabilities—a key determinant or feature of humility (Owens et al., 2013)—people need to manage the planning fallacy. The planning fallacy is the tendency of individuals to underestimate the duration that is needed to complete most tasks (e.g., Kahneman & Tversky, 1979; for reviews, see Buehler et al., 2002; Buehler & Griffin, 2015).

To illustrate, in one study that Buehler et al. (1994) published, a class of students were instructed to estimate the date at which they would finish their thesis. On average, they predicted they would complete their thesis in 34 days; in actuality, they needed 56 days.  Because of this planning fallacy, individuals often do not fulfill their planned deadlines and thus tend to feel rushed.

In some studies, participants must complete an extensive task, comprising several distinct phases.  Participants must then predict the date at which they will complete the task.  In other studies, participants must complete a more confined task.  They must then predict the time at which they will complete this activity.  The planning fallacy has been observed in both of these circumstances, with completion times typically underestimated by 20% to 50% (see Dunning, 2007).  Nevertheless, when the task is completed over an extended time, this fallacy is especially pronounced.

Accounts to explain the planning fallacy

One explanation of this planning fallacy is that people, when planning a task, orient their attention to this activity rather than other key sources of information, called a focal bias.  To illustrate

  • people often underrate how similar the task is to previous activities they have completed and, hence, disregard the considerable time that was needed to complete these previous activities (Kahneman & Lovallo, 1993),
  • when constructing plans, individuals tend to consider the best possible outcome (Newby-Clark et al., 2000), overlooking the possibility of unexpected, but plausible, complications and obstacles,
  • when constructing plans, individuals tend to orient their attention to the overall goals, benefits, or phases of a task—and thus may overlook some of the specific actions that may demand considerable time (Kruger & Evans, 2004),
  • incentives to complete a task rapidly tend to amplify these various biases (Buehler et al., 1997).

Alternatively, the planning fallacy could partly be ascribed to anchoring and adjustment.  To demonstrate, in typical studies, participants are asked to estimate the date at which they will complete some task.  Their initial estimate, called the anchor, is perhaps the same week.  Next, they adjust their estimate from this anchor to future days or weeks. Importantly, as many studies show, individuals do not adjust sufficiently from initial anchors. Their initial estimate may bias their final estimate.

LeBoeuf and Shafir (2009) uncovered some findings that substantiate this possibility.  In one study, some participants were warned that adjustments are often inadequate.  This warning diminished or nullified the planning fallacy. In another study, some participants estimated the number of weeks, rather than days, they would need to complete some task.  These participants were more likely to estimate longer times.  Presumably

  • if participants focus on weeks instead of days, even a few adjustments from the anchor could generate a prediction of two or more months,
  • if participants orient their attention to days instead of weeks, a few adjustments will not diverge appreciably from the anchor (LeBoeuf & Shafir, 2009).

Strategies that may diminish the planning fallacy: Task deliberation

Some deliberations about the activities that individuals plan to complete may diminish the planning fallacy and thus might decrease time pressure.  To illustrate, when tasks are perceived as arduous and demanding, the planning fallacy might diminish.  For example, in one study, conducted by Jiga-Boy et al. (2010),

  • participants were asked to visualise a range of events, such as shifting house,
  • next, these participants indicated the degree to which this activity would demand effort on a 7-point scale as well as how close or far away this event feels in time,
  • events or activities that demand considerable effort felt closer in time.

Therefore, if a task seems taxing or demanding, the deadline might seem close. Individuals might thus feel they need more time to complete this activity, potentially diminishing the planning fallacy.  Accordingly, if people visualise the effort they might need to devote to a task, the planning fallacy might subside. 

Admittedly, Jiga-Boy et al. (2010) did not explicitly assess whether individuals who consider the effort they need to dedicate to a task are less susceptible to the planning fallacy.  However, as Hadjichristidis et al. (2014) revealed, if individuals are prompted to consider phases of a task that are difficult or unfamiliar, the planning fallacy tends to subside. 

Other research has also investigated whether deliberation about a task may limit the planning fallacy.  For example, in a study that Koole and Spijker (2000) conducted,

  • some participants were prompted to consider the time and place in which they will complete an activity, as vividly as possible, called an implementation intention,
  • other participants were not prompted to consider the time and place in which they will complete an activity,
  • when participants considered the time and place in which they would complete the activity, the planning fallacy dissipated.

Specifically, the individuals who had considered the time and place of this activity vividly tended to complete the task before other participants—but had underestimated the degree to which these implementation intentions would facilitate task completion. 

Intriguingly, if individuals plan the task in reverse order, the planning fallacy dissipates.  For example, in one study that Wiese et al. (2016) conducted,

  • participants were instructed to plan the activities they will undertake to prepare before a date,
  • some participants were instructed to first plan the last activity they need to complete, then to plan the second last activity, and so forth
  • relative to the other participants, these individuals recognised the plan would demand a longer time, diminishing the planning fallacy.

According to Wiese et al. (2016), this technique may be effective because, when individuals consider the activities they need to complete in reverse order,

  • they do not tend to construct the plan as fluently,
  • consequently, they are not as likely to overlook potential obstacles and complications,
  • furthermore, the participants are more attuned to the final task or deadline—and so this deadline feels closer, diminishing optimism.    

Nevertheless, some deliberation about the task may exacerbate the planning fallacy.  For example, in a study that Buehler and Griffin (2003) conducted,  

  • participants estimated the time they may need to complete their Christmas shopping,
  • some participants were asked to consider the main steps they will complete to fulfill this task, including when, where, and how they will undertake these tasks,
  • relative to a control group, the planning fallacy was more pronounced after individuals considered the main steps.

These findings are consistent with the proposition that planning fallacies can partly be ascribed to the inclination of people to orient their attention unduly to the forthcoming task rather than other cues, such as past experience (Buehler & Griffin, 2003).

Strategies that may diminish the planning fallacy: Mental images

Mental images can also diminish the planning fallacy.  For example, to plan a task, individuals could visualise the activities they need to complete as well as potential disruptions or challenges.  As Buehler, Griffin, et al. (2012) revealed, if people imagine this task from the perspective of someone else—as if watching themselves complete this task—the planning fallacy tends to subside.  In contrast, if people imagine this task from their own perspective, as if watching the scene from their own eyes, the planning fallacy is magnified.  

Buehler, Griffin, et al. (2012) proposed, and then validated, several reasons to explain this finding.  Specifically, if people imagine these tasks from the perspective of someone else

  • they feel a sense of detachment from this task,
  • therefore, they are more willing to consider, rather than trivialise, potential obstacles that could unfold,
  • their motivation to complete the activity rapidly is not as likely to bias their estimates of when they can finish the task.

Strategies that may diminish the planning fallacy: Practices that elicit uncertainty

Some researchers have explored whether practices or activities that elicit a sense of uncertainty or subordination, rather than certainty or power, may also diminish the planning fallacy.  To illustrate, in a study that Weick and Guinote (2010) published,

  • to elicit a sense of power, some participants were informed their opinions would affect the final decision about some policy,
  • other participants were informed their opinions would not affect this decision,
  • all participants were then told to estimate the time they would need to complete some assignment; the actual time was recorded as well,
  • if participants experienced a sense of power, they were more likely to underestimate the time that would be needed to complete some task—even after controlling for self-efficacy.

In another study, some of the participants were also asked to consider the time they needed to complete similar tasks in the past. This reflection diminished the effect of power on the planning fallacy (Weick & Guinote, 2010).  To explain this set of results, Weick and Guinote (2010) argued that

  • when people experience a sense of power, they orient their attention only to the goals they want to achieve rather than other sources of information, such as obstacles they experienced in the past on similar tasks,
  • if participants are prompted to consider these previous occasions, this tendency is curbed and the planning fallacy subsides.

For similar reasons, if a plan is hard to imagine vividly, the planning fallacy dissipates. To illustrate, in one study that Min and Arkes (2013) conducted,

  • the participants, who were engaged to be married, identified an activity they needed to complete to plan their wedding, such as selecting a reception hall,
  • these individuals needed to write about either two steps or five steps they would complete to fulfill this plan,
  • finally, participants were asked to specify the date at which they would complete this task,
  • if asked to specify five steps—a task that is relatively challenging—the planning fallacy diminished. 

A challenging task provokes a sense of unease, usually interpreted as a problem or complication, diminishing optimism and offsetting the planning fallacy.  Consistent with this interpretation, when participants were informed that feelings of difficulty are natural and do not signify a problem, the difficulty associated with the five steps did not as effectively curb the planning fallacy (Min & Arkes, 2013). 

Likewise, the research that Bordley et al. (2019) published implies that another technique could also elicit a sense of uncertainty and diminish the planning fallacy.  Specifically, when individuals estimate the time they might need to complete a task or subtask, they should record both an optimistic estimate and a pessimistic estimate.  Recording multiple values implies these estimates are uncertain—and this uncertainty diminishes the planning fallacy (Bordley et al., 2019), as a series of Monte Carlo simulations verified.
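To illustrate the general idea, the Python sketch below combines optimistic, most likely, and pessimistic estimates for a few hypothetical subtasks through a simple Monte Carlo simulation.  The subtasks, the numbers, and the choice of a triangular distribution are assumptions for illustration; the distributions and procedures in Bordley et al. (2019) may differ.

    # Monte Carlo sketch: combine optimistic, most likely, and pessimistic estimates
    # (in days) for hypothetical subtasks into a distribution of total completion time.
    import numpy as np

    rng = np.random.default_rng(42)

    subtasks = {
        "draft report": (2, 4, 9),
        "collect feedback": (1, 3, 8),
        "final revisions": (1, 2, 6),
    }

    n_simulations = 10_000
    totals = np.zeros(n_simulations)
    for optimistic, likely, pessimistic in subtasks.values():
        # Draw each subtask duration from a triangular distribution spanning the two extremes.
        totals += rng.triangular(optimistic, likely, pessimistic, size=n_simulations)

    naive_total = sum(likely for _, likely, _ in subtasks.values())
    print(f"Sum of 'most likely' estimates: {naive_total} days")
    print(f"Simulated mean completion time: {totals.mean():.1f} days")
    print(f"80th percentile (a more cautious deadline): {np.percentile(totals, 80):.1f} days")

Because the gap between the optimistic and pessimistic values widens the simulated distribution, the summary statistics make the uncertainty explicit—the mechanism that purportedly offsets the planning fallacy.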

Strategies that may diminish the planning fallacy: Individual versus team settings

In team settings, individuals are especially likely to underestimate the time that is needed to complete some project, amplifying the planning fallacy (Buehler et al., 2005).  Specifically, as Buehler et al. (2005) revealed

  • when teams discussed how long they need to complete a task, the estimates were especially optimistic and hence the planning fallacy was amplified,
  • teams tended to orient their discussions towards opportunities that could boost success rather than potential obstacles or complications.

Presumably, in team settings, individuals feel the need to impress other people.  That is,

  • in team settings, individuals want to be perceived as optimistic and efficient
  • this need to impress other people has been shown to magnify the planning fallacy (Pezzo et al., 2006)
  • consistent with this premise, when individuals do not disclose their estimates of when they will complete a task, the planning fallacy subsides (Pezzo et al., 2006).

Furthermore, managers are more likely to underestimate the duration that a large team, rather than a small team, needs to complete tasks, called the team scaling fallacy (Staats, Milkman, & Fox, 2012).   For example,

  • managers might assume that a team of 4 people may complete a task in 5 days and a team of 2 people may complete the same task in double that time: 10 days
  • in practice, however, the team of 4 people may actually complete the task in 9 days and the team of 2 people may complete the task in 11 days.

Staats, Milkman, and Fox (2012) ascribe the team scaling fallacy to the inclination of people to underestimate the complications of teams.  That is, when teams are large, the individuals experience several benefits: They can often specialise in the tasks they enjoy and can access more extensive knowledge.  However, large teams also evoke complications.  Information is not communicated as effectively. People may become unmotivated rather than assume responsibility.  And conflict is more likely. Individuals tend to be more attuned to the benefits, instead of the drawbacks, of large teams.  Therefore, they overestimate the efficiency of large teams.
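To make the illustrative numbers above concrete, the short Python calculation below converts them into person-days; the figures are the hypothetical ones from the example, not data reported by Staats, Milkman, and Fox (2012).

    # Person-day totals implied by the illustrative example above (hypothetical figures).
    teams = {
        "team of 4": {"people": 4, "assumed_days": 5, "actual_days": 9},
        "team of 2": {"people": 2, "assumed_days": 10, "actual_days": 11},
    }

    for label, team in teams.items():
        assumed = team["people"] * team["assumed_days"]  # manager's linear-scaling assumption
        actual = team["people"] * team["actual_days"]    # how the example plays out in practice
        overhead = actual - assumed                      # extra effort, largely coordination costs
        print(f"{label}: assumed {assumed} person-days, actual {actual} person-days "
              f"(+{overhead} person-days of overhead)")

In this example, the larger team incurs 16 additional person-days of overhead whereas the smaller team incurs only 2: precisely the coordination cost that managers tend to underweight.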

Strategies that may diminish the planning fallacy: Immediate versus future projects

The planning fallacy might depend on whether the project is scheduled to be completed in the immediate or remote future (Peetz et al., 2010).  According to construal level theory (Trope & Liberman, 2000, 2003; Trope et al., 2007), when events seem near, rather than distant, in time, individuals consider the tangible actions or features of a task.  They might, for example, describe someone as “using a keyboard”—a tangible action—rather than “communicating”—an intangible or abstract concept.  This orientation to concrete, tangible details can either amplify or inhibit the planning fallacy, depending on the primary thoughts of individuals.

To clarify, when individuals contemplate the duration that is needed to complete a task, they sometimes orient their attention to possible obstacles, such as “I might be interrupted by telephone calls”.  If they plan to complete the task soon, and thus orient their attention to concrete details, these obstacles seem especially salient.  The planning fallacy dissipates: That is, individuals predict they will not complete the task rapidly.

In contrast, some individuals orient their attention to avenues or activities that will facilitate performance, like “I will contact three friends to help”.  In this instance, if they plan to complete the task soon, these provisions and opportunities seem especially salient.  The planning fallacy will be amplified.

Peetz et al. (2010) conducted a study that substantiates these arguments.  Participants needed to estimate the duration they would need to complete some task, such as purchasing gifts for Christmas or completing a school assignment—either weeks or months before undertaking this activity.  In addition, the researchers recorded the thoughts participants entertained as they estimated these durations.  When the tasks needed to be completed in several weeks rather than months

  • the planning fallacy diminished if participants referred to obstacles
  • but the planning fallacy increased if participants referred to provisions and plans.

Accordingly, to curb the planning fallacy,

  • if the task needs to be completed very soon, individuals should be encouraged to identify two or three key obstacles
  • if the task needs to be completed in several months or years, individuals should be encouraged to identify many provisions and opportunities they could utilise to facilitate performance.

Controversies around the planning fallacy

As Love et al. (2019) revealed, the planning fallacy may not be observed in all settings or sectors.  Specifically, Love et al. examined a repository of data about transportation projects, worth about US$14 billion in aggregate.  For each project, the database recorded the approved budget and the final cost.  The rationale was

  • if the planning fallacy is prevalent, projects should often be delayed, and hence the final cost should generally exceed the approved budget,
  • however, as the analysis revealed, in almost half the cases, the final cost was less than the approved budget, suggesting the planning fallacy does not apply in large transportation projects.

Arguably, in some circumstances, planning fallacies or optimistic plans are especially detrimental. In these circumstances, the likelihood of planning fallacies might naturally abate.  That is, leaders may introduce a range of practices to prevent optimistic estimates that could be damaging to the organisation.