
How to Conduct a Task Analysis (With Examples)

Apr 16, 2024

Creating a to-do list and using a daily task tracker can go a long way toward helping you and your team get things done. But identifying and delegating tasks is only one part of the process. Performing a task analysis can help you refine the purpose of your task, break your task down into subtasks, and improve productivity and efficiency.

Team leaders in nearly any industry can perform a task analysis as a way to optimize internal practices, improve the customer experience, or even assist employees with autism spectrum disorder (ASD). Let’s take a look at what a task analysis is, how to perform one, and some real-world task analysis examples.

What Is Task Analysis?

Task analysis is the process of identifying the purpose and components of a complex task and breaking it down into smaller steps. Rather than trying to teach a new skill or process all at once, the purpose of task analysis is to separate it into individual steps that can be followed in a logical sequence.

The principles of task analysis are widely used in product design and industrial engineering, where the method helps teams better understand the way a customer uses a product and design more user-friendly workflows. Forward and backward chaining can even be applied to systems that use artificial intelligence (AI) to make data-driven decisions and solve problems.

You’ll often see principles of task analysis applied to special education settings, which can inform employers who have employees with disabilities. For example, applied behavior analysis (ABA) is a type of therapy that uses task analysis to teach complex skills to children with autism spectrum disorder or other developmental disabilities.

In ABA therapy, practitioners use techniques like forward chaining to break down a task into a sequence of discrete steps. A related approach, discrete trial training (DTT), can be used for teaching students everything from motor skills to daily living skills.

Types of Task Analysis

When using task analysis to plan a project or develop a new product, you can choose from one of two forms: cognitive and hierarchical. A cognitive task analysis is useful for tasks that require critical thinking or decision-making, while a hierarchical task analysis can be used for processes with a consistent structure or workflow.

Here’s how these two types of task analysis differ.

Cognitive task analysis

Let’s say you’re developing a new piece of software and you want to better understand how your customers will interact with the user interface. Rather than tell them how to perform a task, you simply give them a goal and watch how they achieve it.

Since different users will complete the task in a different way, you can use this analysis to identify pain points or understand how a customer’s knowledge and mindset inform their approach to completing the task.

Hierarchical task analysis

A hierarchical task analysis is one in which the process is fixed. In other words, you give the user a set of specific steps and watch how they perform each step of the task. You may discover that some steps are unnecessary or don’t serve the overall goal.

A hierarchical task analysis can be used to determine how long it takes to perform the total task process, and which steps can be eliminated with task automation.

How to Perform a Task Analysis in 4 Steps

The steps to conducting a task analysis will vary depending on whether you’re analyzing an internal process, a UX workflow, or a social or academic skill. But you can use these four steps to break down nearly any type of task and perform a task analysis as part of team project management or your own self-management process.

1. Define your goal

Start by defining the overall goal or task process that you want to analyze. This could be as simple as “Create a new user account and buy a product” or as in-depth as “Run a post-mortem meeting and send out meeting minutes to everyone who attended.” The more specific your goal, the more useful your task analysis will be.

2. Create a list of subtasks

Next, break your higher-level task down into manageable steps. The idea is to create a list of all the subtasks that go into performing the task, even those that you might take for granted. You never know which tasks are slowing the whole process down.

For example, if you’re testing a new app, the first step might be “Turn on your phone” and the last step might be “Turn off your phone.”

3. Make a flowchart or diagram

A process flow chart or workflow diagram can help you determine which type of analysis to perform. Is your workflow a linear process with a series of discrete tasks that need to be completed in a specific order? Consider performing a hierarchical task analysis to find steps that you can automate or eliminate.

Is it more of a “choose your own adventure” in which different users will complete the task in a different way? Conduct a cognitive task analysis to identify pain points and prerequisites based on how different categories of users complete the task.

4. Analyze the task

Now, you can run through the process and pay attention to the length, frequency, and difficulty of each subtask. Were there any steps that you missed or that took longer than expected to complete? If another user performed the task, did they have the skills and knowledge necessary to complete the entire process?

You can use this information to make changes to the product or process, create more accurate documentation, or improve your training or onboarding practices.
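
If you log your observations, a few lines of scripting can summarize them. Below is a minimal Python sketch, using made-up subtask names and timings, that averages how long each subtask took across test runs and counts how often a step was missed entirely:

```python
from statistics import mean

# Each observation maps a subtask to the seconds it took,
# or None if the participant missed that step entirely.
observations = [
    {"Turn on your phone": 5, "Open the app": 8, "Create account": 95},
    {"Turn on your phone": 6, "Open the app": 12, "Create account": 140},
    {"Turn on your phone": 4, "Open the app": 9, "Create account": None},
]

for subtask in observations[0]:
    times = [run[subtask] for run in observations if run[subtask] is not None]
    missed = sum(1 for run in observations if run[subtask] is None)
    print(f"{subtask}: avg {mean(times):.0f}s across {len(times)} runs, missed {missed}x")
```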

3 Task Analysis Examples

The principles of task analysis can be applied to a wide range of scenarios, so let’s take a look at a few examples of task analysis in the real world.

Task analysis in UX design

In UX design, a task analysis may take the form of a focus group or usability testing. If you’ve just designed a new app, you might want to see how easy it is for customers to download the app and sign up for a new account. The process might look like this:

  • Go to the App Store
  • Search for the app
  • Download the app
  • Open the app
  • Select “Create account”
  • Enter your email address
  • Verify your email address
  • Choose a username and password

Upon conducting a task analysis, you determine that Step 7, “Verify your email address,” actually consists of multiple subtasks, such as opening up an email app. You decide to move this step later in the process to avoid disrupting the workflow.

Task analysis in project management

As a project manager, it’s important to know how your team members are spending their time so you can improve productivity and team accountability. Let’s say you want to find ways to delegate tasks more efficiently by using task automation. You come up with a list of the steps you usually follow to delegate tasks:

  • Document action items during team meetings
  • Add action items to your task manager
  • Create a description for each task
  • Assign each task to a team member
  • Attach a due date to each task
  • Send out a reminder email

After performing a task analysis, you determine that you don’t actually have to do any of these steps manually. You can use an AI task manager like Anchor AI to identify and delegate action items, attach due dates, and send out reminders automatically.

Task analysis for learning disabilities

In employment settings, a task analysis can be used to help employees with learning disabilities who otherwise struggle to complete tasks. One study found that individuals with intellectual disabilities were able to complete office tasks like scanning, copying, and shredding when they were broken down into steps like:

  • Pick up documents from folder
  • Open the scanner cover
  • Place documents face-down on the scanner
  • Close the scanner cover
  • Press “Scan”
  • Remove documents
  • Return documents to the folder

Employees with learning disabilities may benefit from similarly specific instructions for other daily tasks, such as using time management tools or a password manager.

Streamline Task Management With Anchor AI

Performing a task analysis is a way of breaking down complex tasks into smaller steps so you can better understand how they all fit together. It’s used in workplaces, learning environments, and other settings to standardize processes, streamline workflows, and even teach social skills. You can use a task analysis to optimize internal processes or customer-facing workflows and eliminate unnecessary tasks altogether.

Anchor AI makes it easy to identify tasks and break them down into manageable steps with Max, your AI project manager. Simply invite Anchor AI to your next team meeting and Max will identify action items and delegate tasks automatically. Or, Ask Max for deeper insights into how specific tasks align with your overall project goals.

Sign up today to try it out for yourself and streamline task and project management!


What is task analysis?

Last updated: 28 February 2023

Reviewed by Miroslav Damyanov

Every business and organization should understand the needs and challenges of its customers, members, or users. Task analysis allows you to learn about users by observing their behavior. The process can be applied to many types of actions, such as tracking visitor behavior on websites, using a smartphone app, or completing a specific action such as filling out a form or survey.

In this article, we'll look at exactly what task analysis is, why it's so valuable, and provide some examples of how it is used.


Task analysis is learning about users by observing their actions. It entails breaking larger tasks into smaller ones so you can track the specific steps users take to complete a task.

Task analysis can be useful in areas such as the following:

  • Website users signing up for a mailing list or free trial. Track what steps visitors typically take, such as where they find your site and how many pages they visit before taking action. You'd also track the behavior of visitors who leave without completing the task.

  • Teaching children to read. For example, a task analysis for second-graders may identify steps such as matching letters to sounds, breaking longer words into smaller chunks, and teaching common suffixes such as "ing" and "ies."

Benefits of task analysis

There are several benefits to using task analysis for understanding user behavior:

  • Simplifies long and complex tasks
  • Allows for the introduction of new tasks
  • Reduces mistakes and improves efficiency
  • Develops a customized approach

Types of task analysis

There are two main categories of task analysis: cognitive and hierarchical.

Cognitive task analysis

Cognitive task analysis, also known as procedural task analysis, is concerned with understanding the steps needed to complete a task or solve a problem. It is visualized as a linear diagram, such as a flowchart. This is used for fairly simple tasks that can be performed sequentially.

Hierarchical task analysis

Hierarchical task analysis identifies a hierarchy of goals or processes. It is visualized as a top-to-bottom process, where the user needs top-level knowledge to proceed to subsequent tasks, as in Google's example following the user journey of a student completing a class assignment.

What is the difference between cognitive and hierarchical task analysis?

There are a few differences between cognitive and hierarchical task analysis. While cognitive task analysis is concerned with the user experience when performing tasks, hierarchical task analysis looks at how each part of a system relates to the whole.

When to use task analysis

A task analysis is useful for any project where you need to know as much as possible about the user experience. To be most helpful, a task analysis should be performed early in the process, before you invest too much time or money in features or processes you'll need to change later.

You can take what you learn from task analysis and apply it to other user design processes such as website design, prototyping, wireframing, and usability testing.

How to conduct a task analysis

There are several steps involved in conducting a task analysis.

1. Identify one major goal (the task) you want to learn about. One challenge is knowing which steps to include. If you are studying users performing a task on your website, do you want to start the analysis when they actually land on your site, or earlier? You may also want to know how they got there, such as by searching on Google.

2. Break the main task into smaller subtasks. "Going to the store" might be separated into getting dressed, getting your wallet, leaving the house, and walking or driving to the store. You can decide which subtasks are meaningful enough to include.

3. Draw a diagram to visualize the process. A diagram makes the process easier to understand (see the sketch after this list).

4. Write down a list of the steps to accompany the diagram, making it more useful to those who are not familiar with the tasks you analyzed.

5. Share and validate the results with your team to get feedback on whether your description of the tasks and subtasks, as well as the diagram, is clear and consistent.
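
As a rough illustration of steps 2 and 3, here is a minimal Python sketch that takes the hypothetical "going to the store" subtasks and emits a Graphviz DOT flowchart; the node and graph names are illustrative, not prescribed by any tool:

```python
# Break the main task into subtasks (step 2), then emit a Graphviz DOT
# flowchart (step 3) that can be rendered with: dot -Tpng task.dot
subtasks = [
    "Get dressed",
    "Get your wallet",
    "Leave the house",
    "Walk or drive to the store",
]

lines = ["digraph going_to_the_store {"]
for i, step in enumerate(subtasks):
    lines.append(f'  s{i} [label="{step}", shape=box];')  # one node per subtask
for i in range(len(subtasks) - 1):
    lines.append(f"  s{i} -> s{i + 1};")  # arrows give the sequence
lines.append("}")
print("\n".join(lines))
```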

Task analysis in UX

One of the most valuable uses of task analysis is for improving user experience (UX). The entire goal of UX is to identify and overcome user problems and challenges. Task analysis can be helpful in a number of ways:

  • Identifying the steps users take when using a product. Can some of the steps be simplified or eliminated?

  • Finding areas in the process that users find difficult or frustrating. For example, if many users abandon a task at a certain stage, you'll want to introduce changes that improve the completion rate.

  • Revealing, through hierarchical analysis, what users need to know to get from one step to the next. If there are gaps (i.e., not all users have the expertise to complete the steps), they should be filled.

Task analysis is a valuable tool for developers and project managers

Task analysis is a process that can improve the quality of training, software, product prototypes, website design, and many other areas. By helping you understand the user experience, it enables you to make improvements and solve problems. It's a tool that you can continually refine as you observe results.

By applying the most appropriate kind of task analysis (e.g., cognitive or hierarchical), you can make steady improvements to your products and processes. Task analysis is valuable for the entire product team, including product managers, UX designers, and developers.



How to Do a Task Analysis Like a Pro

Community Team

Task analysis is one of the cornerstones of instructional design. But what is it, really? The name says a lot: you analyze a task, step by step, to document how that task is completed.

At first glance, this seems like a straightforward thing. But even the easiest tasks can be quite complex. Things you do every day might seem simple when you first think about them. But what happens when you eliminate internalized or assumed knowledge? 

Take sending an email. Easy, right? Maybe four or five steps? 

  • Click the New Mail icon
  • Enter a Recipient
  • Enter a Subject
  • Enter your email text 

But what about carbon copy or blind carbon copy recipients? What if you need to attach an invoice or picture? What app do you use to create the email in the first place (or are you sending from Gmail in your browser)? For that matter, from which device are you sending the email? 

Suddenly that “simple” task is a set of processes, organized by device, operating system, and application, with various subtasks along the way accounting for mailing list complexities and the purpose of your email. As I was writing this I came up with about a dozen different variations, all of which would need to be closely analyzed and broken down precisely. 

Even the most ordinary task has a lot behind it.

This is why understanding how to do a task analysis is so important to becoming a successful instructional designer. When instructional designers create training, they’re teaching the learner how to accomplish something. Task analysis helps you focus on what they’re going to do and how they’ll do it (don’t worry so much about the why; that comes later).

The easiest way to illustrate the process is with an example. Let’s say you work at a midsize media company and your boss asks you to complete a task analysis on how the company’s social media manager does her job. They want this documented for training purposes for future hires. That means you’ll need to:

  • Identify the task to analyze
  • Break down the task into subtasks
  • Identify steps in subtasks

Let’s take a closer look at each of these steps.

Step 1: Identify the Task to Analyze

Tasks are the duties carried out by someone on the job. The social media manager carries out a lot of duties, so you need to be able to break them down into broad activities (aka tasks!) and focus on them one at a time. Don’t worry about all the little things that make up the task; we’ll get to that in a second. Here we’re looking to paint with broad strokes.

One of the social media manager’s tasks is to add new content to social media sites every morning. Your tasks should describe what a person does on the job and must start with an action verb.

So, in this case, the first task to analyze is “Add new content to social media.”

Step 2: Break Down the Task into Subtasks

Once you identify the task, you need to identify the subtasks, the smaller processes that make up the larger task. Remember in the email example above where I mentioned attachments and carbon-copying recipients? That’s the kind of thing you capture here. These should also be brief and start with an action verb.

Continuing the social media manager example, you need to find out the subtasks of adding new content to social media. You can figure this out by talking to or observing the social media manager. Through this process, you discover that the subtasks for adding new content to social media are:

  • Check the editorial calendar
  • Add new content to Twitter

You’re making good progress! You can now move on to Step 3.

Step 3: Identify Steps in Subtasks

Now it’s time to get into the nitty-gritty. You’ve identified the task and broken it down into subtasks. The final step, then, is to identify and list the steps for each subtask. 

Do this by breaking down all of the subtasks into specific step-by-step, chronological actions. The key here is to use a “Goldilocks” approach to detail: not too much and not too little. Use just the right amount so learners can follow the instructions easily. Again, as with tasks and subtasks, your steps need to start with an action verb. 

So, putting everything together from steps 1 and 2 and then breaking the subtasks into steps, your final task analysis would look like this:

1. Add new content to social media

1.1 Check the editorial calendar

1.1.1 Navigate to the calendar webpage

1.1.2 Click today’s date

1.1.3 Click newest article title to open article

1.1.4 Click inside article URL bar

1.1.5 Copy URL for article to clipboard

1.1.6 Highlight title text of article

1.1.7 Copy the title text to clipboard

1.1.8 Close the calendar

1.2 Add new content to Twitter

1.2.1 Navigate to Twitter account

1.2.2 Log in to Twitter account

1.2.3 Click Tweet button

1.2.4 Paste article title from clipboard

1.2.5 Paste article URL from clipboard

1.2.6 Click Tweet button to publish

There are several ways to approach task analysis. It’s a fine art deciding how far down the rabbit hole you need to go with detail. Instructional designers can debate for hours whether saying “log in” is enough or if that needs to be broken down further into “enter user name,” “enter password,” and “click the login button.” Again, it all comes down to figuring out how much detail is just right for your audience.
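
However deep you go, the outline above is just a tree, and storing it as one lets you regenerate the numbering automatically as you add or remove detail. Here is a minimal Python sketch, using a trimmed version of the hierarchy above:

```python
# Each node is (name, [children]); nesting depth is your level of detail.
task_analysis = ("Add new content to social media", [
    ("Check the editorial calendar", [
        ("Navigate to the calendar webpage", []),
        ("Click today's date", []),
        ("Copy the article title and URL to the clipboard", []),
    ]),
    ("Add new content to Twitter", [
        ("Log in to Twitter account", []),
        ("Paste the title and URL into a tweet and publish", []),
    ]),
])

def print_outline(node, number="1"):
    """Print the task tree with 1 / 1.1 / 1.1.1-style numbering."""
    name, children = node
    print(f"{number} {name}")
    for i, child in enumerate(children, start=1):
        print_outline(child, f"{number}.{i}")

print_outline(task_analysis)
```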

Wrapping Up

That’s it! As you can see, while creating a task analysis boils down to “just” three steps, there are a lot of nuanced decisions to make along the way. Remember the Goldilocks Rule and always consider your audience and the seriousness of the subject matter when deciding just how nitpicky you need your task analysis to be. After all, there’s a marked difference between how much detail a learner needs when they’re learning how to perform brain surgery versus filling out their timecard.

Do you have any do’s and don’ts of your own for completing a successful task analysis? If you do, please leave a comment below. We love to hear your feedback!

Follow us on Twitter and come back to E-learning Heroes regularly for more helpful advice on everything related to e-learning.


Task Analysis 101: What Is It and How To Improve Your UX Using Task Analysis?


Task analysis is a powerful way to discover and address friction points in the user experience before they do damage. It helps UX designers and product managers better understand users’ goals and the steps they must take to get their job done.

Task analysis enables you to design more efficient user experiences and ultimately drive your product growth.

In this article, we’ll be covering all the necessary steps to successfully perform a task analysis.

Let’s dive in!

  • Task analysis is the process of analyzing the number of steps (tasks) a user has to complete to get their job done (jobs-to-be-done, or JTBD) when using your product.
  • It helps UX designers and product managers understand user behavior and eliminate unnecessary steps in the user path.
  • The primary goal of task analysis is to detect flaws in the UX design that compromise the user journey, customer engagement, and customer satisfaction.
  • Cognitive task analysis will help you gauge how much mental effort is required to reach the desired outcome when using your product (aka how difficult it is for customers to use it for a given task).
  • Hierarchical task analysis allows UX researchers to examine the nooks and crannies of interface design and understand how each task contributes to the users’ prime goals.
  • Task analysis involves five steps:
    1. Defining the task that should be analyzed.
    2. Identifying customers’ end goals by segmenting them in the welcome flow.
    3. Breaking down complex tasks into small steps to find overloaded UX areas.
    4. Creating a task-analysis diagram based on the gathered data.
    5. Finding friction points and creating a strategy to fix them.

What is a task analysis in UX?

Task analysis is the process of analyzing the number of steps (tasks) a user has to complete to get their job done (jobs-to-be-done, or JTBD) when using your product. To put it simply, task analysis breaks down complex tasks into small steps to find overloaded UX areas.

This helps you take a deep dive into user behavior and eliminate unnecessary steps on the way to completing those jobs.

The more refined the UX, the fewer friction points users encounter and the higher the customer satisfaction.


Why is task analysis important in UX?

A task analysis is a process of putting yourself in the shoes of your customers and experiencing their user journey: how easy it is for them to complete the steps, which steps leave them confused or upset, and so on.

The end goal is to address all the downsides and deliver a best-in-class product experience.

But there’s more to it than that.

Have a deep understanding of users and their end goals

Task analysis helps UX designers and product managers to understand the whole picture of the user journey toward particular goals. You will uncover:

  • What triggers lead to the task, and what steps do they take to reach the end goal?
  • What does their learning process look like?
  • How does their competence in performing tasks affect the speed at which they complete tasks and the overall completion rate?
  • What does their everyday flow look like?
  • What hinders their journey?

The best part is that you can conduct task analysis for any user goal within the product and make well-informed decisions about product updates.

Identify how customers behave in the app

While running task analysis, you will map out all the steps users execute to achieve their goals. This gives you a clear understanding of their in-app behavior and enables you to spot roadblocks on both the product and UX layers.

See how users are influenced by their environment

Task analysis also shows how users are influenced by their in-app environment. For example, you can compare the differences in the user experience of users employing the mobile app and web version of your product.

Detect flaws and friction points

The prime goal of task analysis is to detect UX design flaws that compromise customer engagement and satisfaction.

Do you have an easy-to-use navigation menu, an intuitive design that prompts users toward the next task, and efficient workflows?

You can put everything to the test and see whether your app is logically built.

Types of task analysis

There are two types of task analysis — cognitive and hierarchical analysis.

Let’s learn the pros and cons of each.

Cognitive task analysis

Cognitive task analysis (CTA) studies users’ cognitive activity when performing specific tasks. In other words, CTA aims to gauge how much mental effort is required to reach the desired outcome when using your product (aka how hard it is for customers to use your product for a given task).

With CTA, you will understand:

  • Performance differences between basic users and power users or advocates
  • The extent of mental workloads
  • The motivation to use your product
  • The emotional side of your users engaging with your product (angry, happy, upset, confused, etc.)

The cognitive analysis consists of several steps:

  • Defining the task (goal) to analyze
  • Determining the critical decision points
  • Grouping users by behavior
  • Acting on findings

We can highlight two main benefits of cognitive task analysis:

  • Provides insight into user motivations
  • Helps establish the participants’ end goals

Disadvantages

The main disadvantage of cognitive analysis is its qualitative nature: you may not get precise or fully conclusive results.

Hierarchical task analysis

Hierarchical task analysis lays out every step a user performs to accomplish their goal. It involves a linear diagram, like signing up → creating an account → connecting to a Facebook account, and it also breaks down every major step into smaller subtasks (task decomposition).

Thus, the sign-up task might imply the following steps: signing up with Google → reading through a welcome screen → completing a four-step welcome survey, and so on.

The hierarchy of tasks enables UX researchers to examine the nooks and crannies of interface design and understand how each contributes to the users’ goals.

This way, you may spot that too many tasks in the sign-up process can overwhelm users and lead to a low completion rate.

The hierarchical analysis is essential for designing new features or reverse-engineering existing ones. With this, you can explore different approaches to achieving the same goal and find the most efficient path.

At an earlier stage, hierarchical task analysis enables you to build efficient product usability. When applied later, it helps identify hidden UX flaws and address them accordingly.

There are no obvious disadvantages as such. You’re good to go as long as you do task decomposition correctly and get detailed results.

When should you perform a task analysis?

Task analysis is an essential step in the product design process. It should be done in the early stages because it helps teams frame the problem and gather user requirements.

Basically, task analysis is the foundation of the product.

In reality, we can't expect to complete a task analysis once, build the most user-friendly product UX ever, and never return to this task again.

As the company grows, we build various features, incorporate new flows, and so on. Hence, we must ensure that updates are aligned with existing flows and in no way hinder the user experience.

Bottom line: Task analysis is an ongoing process that helps product teams design a user-friendly and appealing interface.

What data do you need for a task analysis process?

There are five pillars of task analysis. You should find answers to all of them while conducting one. This will help you decompose user goals efficiently and create the fastest path to value.

  • Trigger: Determine what triggers users to begin their journey. What caused the goal to occur?
  • Desired Outcome: What is the desired outcome that users aim for?
  • Base Knowledge: What base knowledge do users have before getting started?
  • Required Knowledge: What knowledge do users lack in order to complete the task?
  • Artifacts: What additional tools or information do the users rely on when performing the tasks?

Now let’s find out what a task analysis process consists of.

How to conduct a task analysis and improve UX?

In this chapter, we’ve laid out the entire task analysis process and how to act on findings.

Let’s begin.

Define the task that should be analyzed

Any analysis begins with a goal and questions behind it. Why do we need to conduct the research? What do we aim for? What is a starting point for analysis?

In our case, we must define the high-level task (the user goal) to analyze: the specific step in the user journey that users should perform (e.g., account creation).

Segment customers in the welcome flow and understand their goals

Customer segmentation refers to categorizing customers based on common characteristics for further analyses (e.g., behavior analysis, task analysis, customer journey analysis, etc.).

When it comes to task analysis, segmenting your customers from the outset gives you a deeper understanding of them: what niche they come from, how they heard about your company, what their job to be done is, and so on.

To gather such data, you need to implement a welcome flow (a welcome screen). This is a pop-up with a microsurvey that appears at the last step (or at the beginning) of the sign-up process.

Welcome screens usually serve two purposes: greeting customers and collecting data.

For example, Kontentino utilizes a welcome screen by Userpilot to define customers’ goals, workflows, and the type of company they represent.


Use feature tagging to identify what customers are doing in the app

Feature tagging is another solution for understanding what your users are doing in the app and what their path toward the goal looks like. In short, feature tagging allows you to analyze product usage behavior.

Thus, you will learn and document every click users make: which features they use more or less frequently, and so on.

With Userpilot, you can select any UI pattern of your app to track its usage.

Use this data to understand when users reach certain milestones in their journey.


Once you set up feature tracking and data starts flowing, you can segment customers by their in-app experiences (e.g., their interactions with the features).

This will help you identify segments that are having trouble with a specific feature.


Set up custom goals and monitor how users are progressing toward goals

Whenever you want to know how customers feel about recent changes to a product or design, this step is crucial.

For this, you can digitize all the steps of hierarchical task analysis and track how many users complete pre-defined milestones. You can also monitor the completion rate of intermediate steps (tasks) toward goals.

This will help you measure how successful product updates were and identify the best-performing features of your app. Additionally, you can understand which step (task) causes trouble: essentially, these are the steps with a low completion rate.
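
Whichever analytics tool supplies the raw numbers, the arithmetic behind this is a simple funnel: divide the users who reached each step by those who reached the previous one. A minimal Python sketch with hypothetical step names and counts:

```python
# Users observed reaching each milestone of a hypothetical signup task.
steps = [
    ("Open signup form", 1000),
    ("Enter email", 820),
    ("Verify email", 450),
    ("Choose username and password", 430),
]

# Step-to-step completion rate; a low rate marks a likely friction point.
for (name, reached), (next_name, next_reached) in zip(steps, steps[1:]):
    rate = next_reached / reached
    flag = "  <-- friction point?" if rate < 0.7 else ""
    print(f"{name} -> {next_name}: {rate:.0%}{flag}")
```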

With Userpilot, you can create goals and track their completion. It’s code-free and can be set up in just a few clicks.


Create a task analysis diagram based on the gathered data

Lastly, collect the new data (from the steps above) and make a graphical representation called a task-analysis diagram.

This will help you understand the overall number of tasks and subtasks, their sequence, and their hierarchy.

The diagram will also help you analyze the complexity of the process users are going through to achieve their goals.

Ultimately, you will uncover the tasks that users struggle with.


Discover friction points and fix them to improve the user experience

Once you finish the analysis, you will locate the friction points that hurt the user experience and might lead to churn.

Regardless of what task damages the user experience, your next step is fixing the problem.

Most drawbacks arise either in the onboarding flow or in a specific part of the user journey. No matter what part of the product has flaws, it usually comes down to overcomplicated navigation and unnecessary steps to get to value.

The next time you’re working on a new design, do UX research first. Interview customers, analyze competitors’ UX, and run task analysis. Make your UX flawless by using as many methods as you can.

Task analysis example

Here we will show you how bad UX can drastically impair the overall user experience.

Let’s look at two tools for keyword research (SEO): Semrush and Serpstat.

Our goal is to run a quick analysis of the most important components of SEO: the keywords our site ranks for and the number of backlinks.

We will be testing both tools and running a small task analysis to compare their UX.

Semrush

Type the query → Click on “Search” → Done! The tool shows the needed metrics right on the first screen.

But let’s make the task more difficult. Now, I want to analyze my anchor text list.

Click on “Backlinks” from the Domain Overview → View Details → Anchors. Three clicks and you’re on the destination page.


Takeaways: Super intuitive design. Flawless path to get to value. It took less than 10 seconds to open the needed report.

Serpstat

Type the query → Click on “Search” → scroll six screens down to reach the Backlinks overview → click on “Backlinks” → scroll two screens down → fail!

No jump link will get you to the Anchor report.

The workaround is to click “Anchor” from the navigation menu.


Takeaways: An unfriendly, unintuitive design. It took up to 40 seconds to figure out the next step toward the objective, plus some cognitive and emotional effort (irritation).

As you can see, task analysis is crucial if your goal is to build an outstanding product on the market.

Conducting a task analysis is important if you don’t want UX flaws to impair the user experience and lead to high churn.

Ideally, you should analyze customer behavior and understand your users’ needs and goals before creating a product or updating an existing feature.

Want to collect customer insights and understand their goals code-free? Book a demo call with our team and get started!


Task Analysis: What It Is and How It Improves Your UX

  • Task analysis is the process of investigating the tasks users complete to achieve a desired goal or outcome.

While that sounds simple enough, task analysis is often left out of the UX design process. This is unfortunate because task analysis can actually have a big impact on design decisions.

By observing and understanding the steps users go through to complete various tasks, you can learn everything from what goals users truly want to achieve with the product you’re building to how their previous knowledge will factor into how they approach a given task.

Let’s dig deeper into why task analysis is valuable, how to go about conducting one, and how it can be used to improve UX.

  • What is task analysis in UX?
  • When to conduct task analysis
  • Two common types of task analysis
  • How to conduct a task analysis
  • Using task analysis to improve UX
  • Key takeaways

1. What is task analysis in UX?

Task analysis is a process that helps UX designers learn how users actually go about completing tasks with a product. 

According to Maria Rosala of the Nielsen Norman Group, “a task refers to any activity that is usually observable and has a start and an end point.” So, in task analysis, UX designers first research how users complete tasks by asking them to perform a specific activity and observing how they do so, from start to finish.

Of course, as Rosala notes, it’s important to recognize that tasks are not goals. For instance, if a user’s goal is to see a nearby dentist, their tasks may include searching for dentists in the area, learning which ones accept their insurance and ensuring there are appointments available that fit their schedule.

None of these tasks is the ultimate goal, though. The user’s goal isn’t to complete a form that details their location and insurance information. Completing the form is a means to an end: seeing a local dentist. This is important for UX designers to keep in mind because the more easily users can complete tasks that help them meet their goals, the better the user experience will be. Whether it’s streamlining the number of steps in a task, eliminating potential points of confusion with clearer messaging, or introducing innovations that make completing a task easier, the UX designer’s focus should be on designing the task so it enables the user to meet their goals as easily and efficiently as possible.

2. When to conduct task analysis

Task analysis can have a big impact on key choices made throughout the design process. As a result, it should be conducted early in the process. It’s definitely not something that should happen after you’ve started making major design decisions.

Typically, the process of task analysis should start during user research (which usually happens in the empathize and define stages of the UX design process); that way, your task analysis findings can be baked into other key tasks in the design process, including requirements gathering, developing content strategy and site structure, wireframing, and prototyping.

3. Two common types of task analysis

There are several kinds of task analysis, but the two types that are used most regularly are cognitive task analysis and hierarchical task analysis.

Cognitive task analysis

Cognitive task analysis focuses on understanding the cognitive effort involved in completing tasks. This includes decision-making, problem-solving, memory, judgment, and attention. One of the important things to keep in mind with this kind of task analysis is that depending on the user, the findings may vary from task to task.

For example, an expert user may quickly and easily find a carton of milk and place it in an online shopping cart, whereas this task will take a new user substantially longer. Cognitive task analysis enables UX designers to explore how both kinds of users complete the task and how they can make the task easier for the new user.

Hierarchical task analysis

Hierarchical task analysis is the most commonly used kind of task analysis. It essentially involves breaking a task down into subtasks in order to understand the way the user interacts with a given product. UX Matters’ Peter Hornsby observes that this can help UX designers no matter what kind of project they’re working on: when creating a new product, a task analysis enables UX designers to examine different approaches to the same task and arrive at the best one, and when redesigning an existing product, it can help optimize interactions and task completion.

Keep in mind that it’s also possible to combine these two kinds of task analysis by noting where key decisions or other cognitive factors may come into play during the subtasks outlined in a hierarchical task analysis.

In fact, there are many different things that can be accounted for when analyzing a given task. Tarik Dzekman from UX Collective provides a long list that includes the context of the task, what triggers the task, how long the task takes, and how frequently the task will be performed. Dzekman cautions that it would be impossible to capture everything that plays a role in a single task through task analysis, but at a minimum most task analyses will capture the sequence of subtasks that make up a task and a description of the task.

4. How to conduct a task analysis

A task analysis consists of two discrete steps: gathering information to determine which tasks should be analyzed, and then analyzing those tasks.

Gathering information

The first step in task analysis involves user research. UX designers can use any of a myriad of user research techniques to uncover the key tasks users perform with a product and how they go about performing them. Everything from observing a user as they complete a task to interviewing them can be employed in this step. The ultimate goal is to identify the tasks that should be analyzed.

Analyzing tasks

After the UX designer decides on the tasks to analyze, separate documents breaking down each individual task should be created. While this document can be a simple list or a detailed flowchart, it will most commonly take the form of a hierarchical task-analysis diagram, which lays out the user’s goal, the tasks they must complete to achieve the goal, and the subtasks that go into each task, in a visual format that shows the sequence of and relationships between them.

Note: The Nielsen Norman Group has a great example of what the process of task analysis looks like—including a task analysis diagram.

A task-analysis diagram is useful in that it helps the UX designer visualize and understand the steps a user will go through to meet a specific goal. However, this diagram should also be viewed as a living document that can be altered and adjusted.

For example, if a user’s goal is to make a purchase from an online grocery store and they want to reorder something from a past order, they will have to log in to their account. However, some users may forget their password, forcing them to reset it. It would be valuable to acknowledge this potential step in the task-analysis diagram.

The need for updates and adjustments is why some UX designers prefer to use spreadsheets over diagrams for task analysis, although some use both a written list of tasks in combination with a diagram.

5. Using task analysis to improve UX

Of course, the most important thing about task analysis is that UX designers can apply what they’ve learned to their design solution, improving the user experience in the process. By understanding the steps a user goes through to complete a task, UX designers can come up with the best approach to support that task. This is valuable as it can eliminate points of confusion for the user, such as an excessive number of choices, or reduce the number of steps a user must take to complete a task.

It can also lead to innovations a UX designer may not have thought of otherwise. For example, perhaps when designing an online grocery store, a UX designer notices users heavily rely on shopping lists they keep in their mobile phones when filling their carts. This could lead the UX designer to create a way for users to sync their mobile phone’s shopping list with the store’s interface in order to streamline shopping. In referencing the task analysis, the UX designer knows they are coming up with solutions that will positively impact users’ interactions.

6. Key takeaways

You should now have a basic understanding of task analysis. To sum it up:

  • Tasks are observable activities that have a start and an end point.
  • Task analysis should be conducted early in the design process, usually starting during user research.
  • There are multiple kinds of task analysis, but the two that are used the most are cognitive task analysis and hierarchical task analysis.
  • A task analysis is conducted in two steps. First, through user research, the UX designer will gather information that will identify the tasks to be analyzed. Second, the UX designer will create a diagram or other document to break down a task.
  • UX designers apply what they’ve learned from task analysis to create the best user experience. This can lead to design improvements and innovations.

Now that you know about task analysis, you might want to learn more. If so, you’ll find the following articles useful:

  • What is user research, and what’s its purpose?
  • How to deal with cognitive load in UX and voice design
  • What is the UX design process? A complete, actionable guide


Task Analysis: Support Users in Achieving Their Goals


September 20, 2020


Task analysis refers to the broad practice of learning about how users work (i.e., the tasks they perform) to achieve their goals. Task analysis emerged out of instructional design (the design of training) and human factors and ergonomics (understanding how people use systems in order to improve safety, comfort, and productivity). Task analysis is crucial for user experience, because a design that solves the wrong problem (i.e., doesn’t support users’ tasks) will fail, no matter how good its UI.

In the realm of task analysis, a task refers to any activity that is usually observable and has a start and an end point. For example, if the goal is to set up a retirement fund, then the user might have to search for good deals, speak to a financial advisor, and fill in an application form, all of which are tasks. It’s important not to confuse goals with tasks. For instance, a user’s goal isn’t to fill in a form. Rather, a user might complete a form to register for a service they want to use (which would be the goal).

Task analysis is slightly different from job analysis (what an employee does in her role across a certain period of time — such as a week, month, or year) or workflow analysis (how work gets done across multiple people). In task analysis, the focus is on one user, her goal, and how she carries out tasks in order to achieve it. Thus, even though the name “task analysis” may suggest that the analysis is of just one task, task analysis may address multiple tasks, all in service of the same goal.


Studying users, their goals, and their tasks, is an important part of the design process. When designers perform task analysis, they are well equipped to create products and services that work how users expect and that help users achieve their goals easily and efficiently. Task analysis, as a method, provides a systematic way to approach this learning process. It can be flexibly applied to both existing designs (e.g., the use of an enterprise system) and system-agnostic processes (e.g., shopping for groceries).

The task-analysis process can be viewed as two discrete stages:

Stage 1: Gather information on goals and tasks by observing and speaking with users and/or subject-matter experts.

Stage 2: Analyze the tasks performed to achieve goals to understand the overall number of tasks and subtasks, their sequence, their hierarchy, and their complexity. The analyst typically produces diagrams to document this analysis.

Stage 1: Gather Information

In stage 1, typically, a combination of methods is used to learn about user goals and tasks. They include:

  • Contextual inquiry : The task analyst visits the user onsite and conducts a semistructured interview to understand the user’s role, typical activities, and the various tools and processes used and followed. Then the analyst watches the user work. After a period of observation, the user is asked questions about what the analyst observed.
  • Interviews using the critical incident technique : Users are asked to recall critical incidents, and the interviewer asks many followup questions to gather specific details about what happened. The stories provide detail on the tasks performed, the user’s goals, and where problems lie.
  • Record keeping : Users are asked to keep records or diary entries of the tasks they perform over a certain period of time. Additionally, tracking software can be used for monitoring user activity.
  • Activity sampling : Users are watched or recorded for a certain period of time in order to document which tasks are being performed, as well as their duration and frequency.
  • Simulations : The task analyst walks through the steps that a user might take using a given system.

When carrying out research, do not rely solely on self-reported behavior (i.e., through interviews or surveys) or simulations (remember: you are not the user!), but also observe the user at work in her own context. Otherwise, you could miss out on important nuances or details.

Stage 2: Analyze Tasks

In stage 2, the task analyst will structure the observations by certain attributes like order, hierarchy, frequency, or even cognitive demands, to analyze the complexity of the process users follow in order to achieve their goals. The result of this analysis is often a graphical representation called a task-analysis diagram.

There are many different types of diagrams that could be produced, such as standard flowcharts or operational-sequence diagrams. However, the most commonly known and used in task analysis is the hierarchical task-analysis diagram (HTA). The figure below shows an example of an HTA for the goal of creating a digital copy of a physical letter using a new home scanner.

[Figure: Hierarchical task-analysis diagram. The user's goal, making a digital copy of a physical letter using a new home scanner, is broken down into 4 tasks: 1. download software; 2. launch scanner program; 3. scan document; 4. save document. Each task can be broken down further; task 1 splits into 6 subtasks: 1.1 check printer model; 1.2 search online for printer software; 1.3 click link to download software; 1.4 enter app-store password; 1.5 click reset password; 1.6 enter a new app-store password. Subtasks 1.5 and 1.6 are performed only if a user has forgotten their password.]

An HTA diagram starts with a goal and scenario (in the same way that a customer-journey map does) and highlights the major tasks to be completed in order to achieve it. In human factors, these tasks are referred to as ‘operations’. Each of the tasks in the top layer can be broken down into subtasks. The number of levels of subtasks depends on the complexity of the process and how granular the analyst wants the analysis to be.

Not all users accomplish goals in the same way. For example, a novice user might perform more tasks than an expert user — the latter might skip certain steps. The HTA enables these differences to be captured through ‘plans’. A plan specifies, at each level, what the order of the steps is, and which steps might be undertaken when or by whom. For example, a user who can’t remember his password has to undertake steps 1.5 (Click Reset password ) and 1.6 (Enter a new app-store password) in order to accomplish the goal of downloading software for the scanner.
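
To make the structure concrete, here is a minimal Python sketch (an illustration, not an NN/g artifact) that models task 1 of the scanner example as data, with a plan that includes steps 1.5 and 1.6 only when the password was forgotten:

```python
# Task 1's subtasks from the diagram above, keyed by their numbers.
subtasks = {
    "1.1": "Check printer model",
    "1.2": "Search online for printer software",
    "1.3": "Click link to download software",
    "1.4": "Enter app-store password",
    "1.5": "Click reset password",
    "1.6": "Enter a new app-store password",
}

def plan_for_task_1(forgot_password: bool) -> list[str]:
    """The 'plan': steps 1.5 and 1.6 apply only if the password was forgotten."""
    steps = ["1.1", "1.2", "1.3", "1.4"]
    if forgot_password:
        steps += ["1.5", "1.6"]
    return steps

for step in plan_for_task_1(forgot_password=True):
    print(step, subtasks[step])
```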

While a task-analysis diagram is useful to illustrate the overall steps in a process and is an excellent communication tool — especially for complex systems — it can also be used as a starting point for further analyses. For example, the following attributes could be considered for the tasks in an HTA.

  • The overall number of tasks: Are there too many? Perhaps there are opportunities to create a design that streamlines the process and removes some steps.
  • The frequency of tasks: How often are certain tasks performed? Are some tasks highly repetitive?
  • The cognitive complexity of the tasks: What mental processes (i.e., thoughts, judgments, and decisions) are needed to complete a given task? (A whole branch of task analysis known as cognitive task analysis is concerned with these questions and with making mental schemas and processes visible.) If many mental operations are involved, the difficulty of the overall task increases, and the analyst should consider the likelihood of user error.
  • The physical requirements of the task: What does the user need to physically do? Could this physical requirement affect user performance and comfort? And how could these physical requirements affect users with disabilities?
  • The time taken to perform each task: Activity sampling or theoretical modeling (such as GOMS) can be used to estimate how long tasks would take users to complete; see the sketch after this list.
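On the last point, the Keystroke-Level Model, a simplified member of the GOMS family, estimates task time by summing standard times for low-level operators. The sketch below uses the operator times commonly cited from Card, Moran, and Newell; the operator sequence is a made-up illustration, not a measured task.

```python
# Commonly cited Keystroke-Level Model operator times, in seconds.
# R (system response time) varies by system and is omitted here.
KLM_SECONDS = {
    "K": 0.28,  # press a key (average typist)
    "P": 1.10,  # point at a target with a mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation for an action
}

def estimate_task_seconds(operators):
    """Sum the standard operator times for one task sequence."""
    return sum(KLM_SECONDS[op] for op in operators)

# Illustrative sequence for "enter app-store password and submit":
# think, home hands to keyboard, type 8 characters, then point and click OK.
sequence = ["M", "H"] + ["K"] * 8 + ["H", "P", "B", "B"]
print(f"Estimated time: {estimate_task_seconds(sequence):.1f}s")
```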

At the end of the task analysis, the analyst has a good understanding of all the different tasks users may perform to achieve their goals and the nature of those tasks. Armed with this knowledge, the analyst can design (or redesign) an efficient, intuitive, and easy-to-use product or service.

Task analysis is a systematic method of studying the tasks users perform in order to reach their goals. The method begins with research to collect tasks and goals, followed by a systematic review of the tasks observed. A task-analysis diagram or an HTA is often the product of task analysis; the HTA can be used to communicate to others the process users follow, as well as a starting point for further assessment.




Task analysis: How to optimize UX and improve task efficiency

User Research

Feb 6, 2024

Gain a fresh perspective on UX by analyzing how users approach tasks—here’s how to build a plan of action for designing goal-based user experiences.

Ella Webber

Thinking you know how users perform specific tasks within your product is very different from having data-backed insights into the ins and outs of how users actually approach it. From goals and touchpoints to the journey and overall experience, task analysis is a unique method for understanding your user’s perspective.

In this article, we’re covering all you need to know about task analysis—the process of studying and analyzing users’ jobs to be done, and how they complete those tasks—including when to do it, how to do it, and best practices according to industry experts.


What is task analysis in UX?

Task analysis is a UX research method for mapping out how users complete a specific task within your product, e.g. paying an invoice in accounting software, or updating their picture on a social app. It identifies major decision points, cognitive load, and points of friction they encounter when completing the task.

UX researchers and designers can use task analysis insights to create more intuitive products, and the technique comes in helpful at any phase of the design process, from concept testing to prototype testing and usability testing live websites .

After watching how users approach a task, you break it down into smaller sub-tasks—giving you a clear, step-by-step understanding of their thought-process and decision-making. Once you know the steps and desired process and outcomes, you’re in a better position to identify user needs, and optimize the user experience.

What are the types of task analysis?

There are two main types of task analysis, each of which lends itself to different objectives and stages of the UX design process. We’ve also included the pros and cons of each so you can choose the best one for your research.

Hierarchical task analysis

Cognitive task analysis

Hierarchical task analysis is about structuring sequences. It involves creating a tree diagram or flowchart depicting a hierarchy of tasks your user needs to complete a goal.

First, you outline a main task. Then, the main task is divided into a set of sub-tasks. These sub-tasks are divided into even smaller tasks, and those are segmented further. This continues until you’re left with only the core decisions and jobs-to-be-done.
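In data terms, that decomposition bottoms out at the leaves of a task tree. Here’s a minimal sketch in Python, assuming the hierarchy is stored as nested dicts; the encoding and the example task are illustrative.

```python
# An illustrative sub-task hierarchy for "update profile picture".
task = {
    "name": "Update profile picture",
    "subtasks": [
        {"name": "Open profile settings", "subtasks": [
            {"name": "Tap avatar"},
            {"name": "Choose 'Edit profile'"},
        ]},
        {"name": "Pick a new photo", "subtasks": [
            {"name": "Scroll the gallery"},
            {"name": "Select a picture"},
        ]},
        {"name": "Save changes"},
    ],
}

def leaf_steps(node):
    """Collect the atomic steps: nodes with no further subtasks."""
    subtasks = node.get("subtasks", [])
    if not subtasks:
        return [node["name"]]
    steps = []
    for subtask in subtasks:
        steps.extend(leaf_steps(subtask))
    return steps

print(leaf_steps(task))  # the core jobs-to-be-done, in order
```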

hierarchical task analysis example

Consider these pros and cons when choosing hierarchical task analysis:

Pros:

  • Comprehensive, detailed view of tasks
  • Clearly organizes tasks
  • Helps identify dependencies and relationships between tasks

Cons:

  • Time consuming to break down tasks into components
  • Difficult to map out for non-linear or complex tasks
  • Doesn’t take into account the cognitive load for completing tasks

Users spend valuable mental energy whenever they complete tasks while interacting with your product. Cognitive task analysis seeks to observe users’ underlying processes during task performance. It includes behavior, emotions, and, arguably most important, mental effort.

Since they’re more complex, cognitive task analysis representations can come in all shapes and sizes, including some of the representations included in hierarchical task analysis.

Some types include:

  • Narrative descriptions: Detailed reports, documents, and descriptions of cognitive processes
  • Flowcharts: Visual representations, emphasizing decision points and thought processes
  • Decision trees: Diagrams highlighting the choices users can make, underlining the mental load

Some pros and cons of cognitive task analysis are:

Pros:

  • Offers in-depth insights on users’ mental models
  • Especially useful for making more intuitive designs
  • Good for mapping out complex tasks that need problem-solving

Cons:

  • Can be resource intensive, requiring time and effort
  • No singular output method, making a deliverable complex to create
  • Can sometimes overlook external or “hard” factors central to task performance

To get insights for your task analysis, you’ll need to conduct user research to gather more information on how users achieve their goals. UX research methods like user interviews, card sorting, focus groups, UX surveys, and contextual inquiry can all be used to get insights into your users’ goals and mental models.

For example, Scott Hurff , Co-Founder and Chief Product Officer at Churnkey , uses a variety of methods to analyze billing-related tasks within Churnkey:

“At Churnkey, our product serves two customers: our direct customers (subscription businesses who use our platform) and then their customers.

“First, we hear about the problems being faced by our customers’ customers when dealing with billing-related topics, and then we dial down into what they’ve been trying to achieve.

“We then listen to our direct customers’ wishes about how their lives could be made easier, the roadblocks they’re facing when trying to complete certain tasks, and how to alleviate their sense of feeling overwhelmed with customer billing needs.

“Finally, we take all of these inputs, synthesize them, close-read them, and come up with new product concepts that we think will solve these problems in novel, useful ways.”

Task analysis is an adaptable technique, and you’ll likely find that a combination of UX research methods is key for getting a full picture of user experience.

The specific method you’ll opt for largely depends on what tasks you’re analyzing, and at which stage of the UX research process you’re conducting your analysis.

When to use task analysis in UX?

One of the benefits of task analysis is its versatility as a framework, offering value throughout the product development process . But before you break out the flowcharts, let’s look at when to use this technique.

1. Initial phase and discovery

Using task analysis at the beginning of the design process helps you explore and define user behavior in the context of a product. It gives your team the insights necessary for laying the foundation of user-centric design and contributes to the early stages of product research .

What task analysis can help with:

  • Brainstorming solutions
  • Identifying user paths and goals
  • Establishing user personas and pain points

2. Usability testing and validation

During the validation phase of the design process, task analysis can aid in ensuring you’re on the right path. It acts as a framework for defining benchmarks, successes, and failures for tasks.

  • Understanding user behavior and mental models
  • Creating realistic user scenarios to help guide usability testing
  • Identifying clear usability metrics , KPIs and decision points during user interaction

3. Iteration, improvement, and post-launch

UX teams can use task analysis as part of continuous product discovery to continuously refine a product for a better user experience. Task analysis is especially useful for identifying when and where to introduce features while maintaining user satisfaction after launch.

  • Creating and assessing opportunities for new features
  • Re-evaluating and updating user tasks
  • Assessing overall task efficiency long-term

💡 Looking for a UX research tool to uncover task-related insights at every stage of design and development? Maze provides a comprehensive suite of user research methods, such as card sorting, user interviews, feedback surveys, and more.

How to use task analysis in the UX design process

Now we’re clear on when to use task analysis, let’s dive into what this process looks like.

1. Set a main task to analyze

After recruiting research participants , the first step of task analysis is choosing a primary user task to analyze. The scenario that you choose should be clear, with a set beginning and end. Some examples of main tasks include feature onboarding, completing a purchase, or customizing a profile.

You’ll want to ask yourself:

  • What are the user’s goals and motivations for completing the task?
  • Who is performing the task? What are their skills, experience, and knowledge level?
  • What’s the aim of this task analysis? What insights are you hoping to gain?

Asking yourself these questions will help you get a clear view of the main task at hand. It will also set the stage for the next important step of task analysis.

2. Select UX research methods and conduct task analysis

After you’ve defined your main task, you’ll want to gather in-depth insights showing exactly how users complete it. Choose your user research methods and conduct research into how users complete a specific task.

For hierarchical task analysis, methods like website testing, on-screen recordings, or heatmaps will give you a better understanding of how users are completing the task. They show exactly where and what your users click on to finish user processes.

If you’re more interested in cognitive task analysis, qualitative research through user interviews and surveys is the way to go. By asking users open-ended research questions , you’ll receive a wealth of information on users’ mental effort and emotions during task completion. Open card sorting can also be especially helpful for getting more context on user mental models.

3. Break your main task into smaller sub-tasks

Breaking down the main task into smaller sub-tasks is crucial for understanding each small step in user-product interactions. This is where you’ll identify any friction points, otherwise hidden away within the digital product experience.

For example, let’s say your main task is completing a purchase on the website. Some of the sub-tasks would include:

Selecting products to add to cart

  • Browsing the catalog
  • Selecting products by clicking on them
  • Selecting product options and quantity

Accessing the shopping cart to make a purchase

  • Finding the shopping cart
  • Proceeding to check-out
  • Typing in personal information
  • Selecting a shipping method
  • Deciding to save information for further purchasing

Confirming and placing the order

  • Adding a payment method and securely inputting details
  • Selecting options to use saved payment methods
  • Accepting terms and conditions
  • Confirming the order

Breaking your main task into its relevant sub-tasks enables you to understand each individual aspect and approach your tasks one step at a time. This is crucial for effectively optimizing the entire process.

Once you’ve listed out your sub-tasks, it’s time to create a visual representation that maps out decision points on your user’s journey.

4. Create a diagram to map out major decision points

Creating a diagram for task analysis gives you a comprehensive view of the user processes at work and how to improve them. Essentially, you’re taking your main task, the sub-tasks you’ve identified, and turning them into a flowchart. Flowcharts help you visualize the user’s journey through a specific task and identify opportunities for improvement.

On a flowchart, your main task will be the starting point. A simple arrow starting from the main task brings you to the first sub-task, then the second, then the third, and so on. Eventually, you’ll get down to the final step of your user’s task.
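If you’d rather generate the flowchart from your sub-task list than draw it by hand, the open-source graphviz Python package can emit one. A minimal sketch, assuming the package and the Graphviz binaries are installed; the task chain is illustrative.

```python
from graphviz import Digraph  # pip install graphviz (also needs Graphviz binaries)

# Main task first, then each sub-task in the order users perform them.
steps = [
    "Complete a purchase",
    "Browse the catalog",
    "Add product to cart",
    "Proceed to checkout",
    "Enter shipping details",
    "Confirm the order",
]

flow = Digraph(comment="Task analysis flowchart")
for i, step in enumerate(steps):
    flow.node(str(i), step)
    if i > 0:
        flow.edge(str(i - 1), str(i))  # simple arrow to the next sub-task

print(flow.source)                 # DOT text, viewable in any Graphviz tool
# flow.render("task-flow", format="png")  # or render straight to an image
```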

The result will look something like the image below.

task analysis flowchart

With a complete flowchart, you’ll be able to identify any redundancies or inefficiencies in your digital product. It will also serve as a constant reference you can come back to for brainstorming areas for improvement.

Flowcharts are essential for hierarchical task analysis, as they clearly define what users need to do while interacting with your digital product. However, they’re only half the picture. The other half incorporates user insights into each decision, task, and friction point.

5. Create a narrative report with next steps

While flowcharts are important, they don’t give you the context surrounding particular friction points in your user’s experience. For that, you’ll need to create a narrative report—a detailed explanation of the task analysis in chronological order.

Here’s what your narrative report should include:

  • Introduction: Outline the purpose of your task analysis, the main task, and its importance
  • Sub-task descriptions: Describe in detail how each sub-task is performed, with relevant insights, processes, and context
  • Dependencies and decision points: Explain how sub-tasks are connected and the criteria for deciding between them
  • Potential improvements: Highlight any friction points and possible improvements in sub-tasks, informed by your insights
  • Conclusion: Summarize your key findings and how the improvements can enhance the user experience and completion of the main task

It’s during the elaboration of this narrative report that your UX research and data collection from the second step will come into play. Include insights you’ve collected on mental effort, decisions, and your user’s thoughts at each stage.

You’ll want to pay special attention to the key decision points throughout the task, and what you can do to remove friction and optimize the experience. Considering these issues will help formulate potential solutions to the task-related issues you’ve identified.

With this UX report , you’ll have a clearer idea of what’s next. Based on your insights, you may need to introduce new features, optimize existing ones, or redesign your task flow for a smoother experience. You’ll likely want to conduct further research once you’ve got your solutions to ensure you’re heading in the right direction.

Once your narrative report is complete, you can use it as a guide for actioning the insights you’ve collected. It’s also a key document for getting stakeholder buy-in , if applicable, and democratizing user research in your organization.

3 Best practices for effective task analysis

Knowing the rules for conducting task analysis doesn’t guarantee success—you need to keep some best practices in mind to ensure your task analysis is as fruitful as possible.

Here, Scott from Churnkey shares three best practices he uses to guide task analysis.

1. Show, don’t tell

It’s not enough to have a vague understanding of the problems your users face while interacting with your product. The more concrete examples you can get of their friction points, the better. A deep understanding of the issues they face is crucial for resolving them. And what better way to understand user issues than seeing them first-hand?

“Get on a call with a customer and have them share their screen,” says Scott. “Have them take you through the exact steps they’re following to solve their problem with today’s method. Record the call, as typically, this ends up being the ‘ideal case’, so you’ll want to see them complete this process more than once.”

Interviews and surveys definitely have their place in task analysis, but you want to use them alongside other research methods that prioritize showing over telling.

2. Study multiple users completing the same task

Not every user will approach a task in the same way. That’s why it's helpful to test different users completing the same process. This not only ensures you’re tackling analysis effectively but also potentially reveals new friction points or solutions.

Scott suggests taking notes on the differences of each unique case:

“Whenever possible, see if you can experience how other members of a team complete the task. Take note of the little adjustments and tweaks they have to make for each unique case.

“Chances are, you’ll experience a few little ‘tricks’ they use to complete the task in a different way. Take note of the origin points of the task and its eventual destination. What problems arose along the way? How did each completion differ slightly?”

Collecting insights from a variety of users is crucial for a broad understanding of how users complete tasks. It’s essential to get a wide array of perspectives to ensure you build a solution that optimizes the process for everyone.

3. Come back to the big picture

Breaking down a main task into smaller subsets is crucial for successful analysis. However, this can make it easy to get bogged down by complex processes and decision points. Revisiting the end goal allows you to explore new avenues for potential solutions and bring the data back to your objective.

Scott notes how returning to the main task and identifying the bigger themes can help form an altogether new angle:

“Step back and take away the broad themes of the task. What’s the actual thing being done? From how many different angles can you examine the existing solution to the task, and does that offer any potential for a fresh approach?”

It’s easy to get caught up in the details, but taking a step back can give you a fresh perspective on pre-existing knowledge to improve decision-making.

Enhance your user experience with Maze

Task analysis is an effective way to get a clear view of how your users interact with your product, and is a useful analysis method at any phase of the product design process.

Performing task analysis correctly ensures every step of your user’s experience is intuitive and frictionless.

If you’re looking for a tool that can support all stages of the task analysis process—from recruiting participants to creating reports—Maze is your answer. Maze is a holistic user research platform that provides multiple research methods to uncover actionable insights.

Task analysis—like every type of user research—is a whole lot easier with the right toolkit. Try Maze today to start optimizing every step of the varying paths users take in your product.

Frequently asked questions about task analysis

What are the types of task analysis?

The two main types of task analysis are hierarchical task analysis and cognitive task analysis. However, there are other types, including goal task analysis and sequential task analysis.

What are the steps of task analysis?

The steps of task analysis entail setting a main task to analyze, gathering information on that task through UX research methods, breaking down main tasks into smaller tasks, and creating a flowchart or narrative report.

What is an example of task analysis?

Let’s say your digital product requires users to make a profile with contact information. A hierarchical task analysis would entail identifying a key process, like uploading a profile picture, and then breaking it down into smaller processes—like scrolling their gallery and selecting a picture from their camera roll. UX teams can study these processes to identify friction points and their potential solutions.


Task Analysis

What is task analysis?

Task analysis begins by defining the user’s problems as scenarios and concludes with creating a task flow that outlines the journey from problem to solution.

For example, if a designer interviewing users interested in gardening realizes that the majority of them forget to water their plants every morning, the designer may include an alarm feature in the final design to address this problem.

The designer’s goal is to keep the tasks as simple as possible and eliminate any unnecessary steps, so the process stays straightforward.


UX Task Analysis: A Complete Guide + Example

It’s almost impossible to create an intuitive website without knowing your users’ goals and the struggles they face along the way. But how do you find out what those are? Luckily, there is an effective way to do that: task analysis. This article will help you master task analysis and gain the knowledge you need to design an efficient, user-centered product.

Key Takeaways

➡️ Task analysis in UX means mapping in detail how a user completes their goal in a digital product, along with the system actions each step depends on

📈 It is crucial when developing a new product or when updating an existing one

🎯 Understanding exactly how a user interacts with a system leads to design improvements, increased user satisfaction, and overall increased efficiency

🐝 To gather data for task analysis, one may use methods such as interviews, contextual inquiry, task-based usability testing, and more

✅ The output of a UX task analysis is most often a task analysis diagram

What is task analysis?

Task analysis is, simply put, the understanding of a user’s task. It’s a combination of understanding the user, their task, and their environment. Performing a task analysis yields a detailed understanding of the task sequence, its complexity, environmental conditions, tools, skills, and the information the user needs to perform the task and achieve their goal.

It encompasses a broad range of techniques from observations of the user in their natural environment to documenting how the users perform their tasks in an existing system. A good task analysis leads to actionable insights into user processes. This information can be directly applied in designing efficient user flows that liberate users from unnecessary work and delegate said work to the system. 

What are the types of approaches to user task analysis?

There are three approaches to task analysis, which can, however, be combined:

  • Contextual
  • Cognitive
  • Hierarchical

UX task analysis approaches

Now, we will break down each approach in more detail.

Contextual task analysis

A central, key step in contextual task analysis is contextual observations/interviews. The idea is that analysts must observe and interview users in their real-life work context to understand their needs and “hot button” motivators.

Julie A. Jacko, professor and author of the Human-Computer Interaction Handbook

Contextual analysis means obtaining a model of how a user completes a certain task in their natural environment. This enables you to understand how the product will fit the user’s environment, actual needs, and other tools they already use.

For example, if you want to test the usability of a bike-sharing app, you should test it with users outside, on the go. Nobody will be using this app from the comfort of their couch; on the contrary, one might expect it to be used in changing outdoor light conditions or in a hurry. This is the specific context within which the task analysis should be conducted.

To design products used in a distracting environment, one should consider providing safeguards against unintentional errors and including options to pick the task up again after a delay (Mayhew, 2007).

Contextual task analysis is indispensable for pinpointing novel business opportunities: at which point can we design technology solutions that help users do their tasks more efficiently? It also helps design the product so that it can be seamlessly integrated into the user’s existing processes and is easy for new users to pick up.

Lastly, understanding how users already interact with existing tools helps design an interface that’s inherently familiar to the users.

Cognitive task analysis

Cognitive task analysis focuses on understanding the deeper mental processes such as decision-making, attention, memory, and judgment that a task involves . By studying users’ cognitive processes, UX researchers and designers can gain insights into how users understand, learn, and perform tasks within a given interface or system.

This technique can include UX methods such as think-aloud protocols, observational studies, user interviews, and usability testing .

Hierarchical task analysis

Hierarchical task analysis studies user behavior by breaking complex tasks down into smaller subtasks . This approach helps to gain more detailed and precise information into the process of users completing complex tasks as each step can be analyzed separately. 

Each subtask can be analyzed using either of the two methods described above or a combination of both. This detailed information can later be visualized in the form of a diagram that describes the steps taken to accomplish a certain larger goal.


What is the main goal of Task Analysis in user research?

Task analysis is supposed to provide actionable insights into user processes which can be directly applied in designing efficient user flows that liberate users from unnecessary work and delegate said work to the system. It encompasses a range of techniques from observations of the users to documenting their performance.

When to do task analysis?

There are two use cases when task analysis is most beneficial:

  • When developing a brand-new product
  • When updating an existing system

Ideally done during all stages of the design process, task analysis is most useful at the beginning (Courage et al., 2007).

If you follow the Design Thinking process, incorporate task analysis into the Empathize and Define stages. Doing it at the beginning will ultimately save time and money during the later stages. Understanding how users work makes the design phase move much more quickly, helps prioritize features, and saves on testing, as the design will be more informed and fewer iterations will be needed.

However, task analysis can be just as successfully applied to updating an existing project and can still drive your updates to be more user-centric.

Preparing for a UX task analysis

The first step, of course, is to pick a specific task you want to analyze. Before starting the task analysis, decide on the scope and granularity: how much time you have, what user population you want to cover, how many types of tasks to include, and in how much detail you want to specify them.

Split your task into more specific tasks if needed, depending on the level of detail you decided on.

For example, if you are designing a collaboration platform, you may be interested in a larger picture – understanding how work moves from person to person and the users’ general jobs. On the other hand, if your product is targeting single users who don’t interact, you may want to start with the target user’s main goals and sub-goals and move down to the breakdown of specific steps they take to achieve these goals. 

How to conduct a UX task analysis?

There are 2 main parts to a UX task analysis:

  • Gather information about users
  • Analyze the data

The output of a UX task analysis is a task-analysis flow diagram.

1. Task Analysis: Gathering information

The objective is to understand users’ goals, mental models, and tasks in their natural environment. Who are they? What information do they have and lack? What mental models do they have of the activities that your product covers? And most importantly – what are their goals?

These are some methods that are used:

  • Contextual analysis or contextual inquiry – If you have the time and resources, bring the research to users by conducting site visits. Observing the users in their natural environment will allow you to document their steps and decisions as they solve tasks. Supplement your observations by asking questions about their goals and reasoning.
  • Interviewing – Make the interview behavioral rather than attitudinal: get users to walk you through their process and explain their decisions. Ask them to show you the artifacts and tools they would normally use and let them walk you through how they would use them. Artifacts could be, for example, a calendar, notes, or a paper form: anything they already produce to help themselves with the task.
  • Recording user activities – This might be a user taking self-recorded notes in a diary study or with the help of tracking software such as a session recording tool.
  • Focus group – A semi-structured discussion with multiple target users. Moderate the discussion to reach a consensus on what the task steps look like, what kind of decisions users have to make along the way, and what kind of goals they are achieving.
  • Task-based user testing – If you are not developing a product from scratch, but rather updating an existing one, conduct task-based usability testing and observe how users complete the tasks in the existing system. You can do this in person or remotely with the help of a usability testing tool. Keep records of all user actions, such as page views, click paths, and actions like purchase or download; the sketch after this list shows one way to summarize click paths.
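As one example of those records, raw click paths can be tallied to show which routes users actually take through the existing system. A minimal sketch in Python with made-up session data:

```python
from collections import Counter

# Illustrative click-path records from task-based usability testing:
# one ordered list of page names per participant session.
sessions = [
    ["home", "search", "product", "cart", "checkout"],
    ["home", "category", "product", "cart", "checkout"],
    ["home", "search", "product", "cart", "checkout"],
    ["home", "search", "product", "product", "cart", "checkout"],
]

path_counts = Counter(" > ".join(session) for session in sessions)
for path, count in path_counts.most_common():
    print(f"{count}x  {path}")
```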

An informal task analysis is better than none. Rigorous task analysis often requires so much time and effort that one is tempted to abandon the idea completely. If interviewing, focus groups, user testing, and other techniques are unavailable to you, consider informal, unobtrusive observation of real users using a product; it will be more valuable than doing nothing.

2. Putting the Analysis into Task Analysis

A variety of tools can help you to make sense of your data: 

  • Affinity diagrams – If you are just starting generative research to prioritize features of a new product, use affinity diagrams mapping users’ needs, goals, and preferences.

  • User personas – To understand your target users, craft rich persona descriptions that contain user backgrounds, goals, needs, knowledge, and environment information.

  • User scenarios – Moving closer to the task itself, you can write user scenarios: short stories starting with the user’s situation and describing the steps, tools, and artifacts the user uses to arrive at a happy ending. However, the ultimate method in hierarchical task analysis is the diagram.

The result of a UX task analysis is most often a flow diagram.

Flow diagrams

UX task analysis flow diagram

Flow diagrams are the most important outcome of task analysis. They document the core of the task: how users interact with a system as they move through and complete their task. They illustrate the sequence of steps and the dependencies. Depending on the scope of your analysis, your UX task analysis diagram can incorporate detailed elements such as user decisions, interactions with other individuals, pop-up dialog elements and menu items, other tools, etc.

For preexisting design solutions, the diagram will often be surprisingly elaborate and messy. The information from a detailed diagram allows you to see unnecessarily complicated information exchange between the system and the user and define actionable design recommendations.

A good practice for creating an organized flow diagram is color-coding the tasks so that you can immediately see which actions are done by the user and which actions are done by the system.
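If you generate the diagram in code, that color-coding can be automated by tagging each step with its actor. A minimal sketch using the open-source graphviz Python package (assuming it and the Graphviz binaries are installed); the steps and colors are illustrative.

```python
from graphviz import Digraph  # pip install graphviz (also needs Graphviz binaries)

# Each step is tagged with who performs it; the colors are arbitrary choices.
steps = [
    ("Search for 'jeans'", "user"),
    ("Return result list", "system"),
    ("Open a product page", "user"),
    ("Show size availability", "system"),
    ("Add jeans to cart", "user"),
]
fill = {"user": "lightblue", "system": "lightgrey"}

flow = Digraph(comment="Color-coded task flow")
for i, (label, actor) in enumerate(steps):
    flow.node(str(i), f"{label}\\n({actor})", style="filled", fillcolor=fill[actor])
    if i > 0:
        flow.edge(str(i - 1), str(i))

print(flow.source)  # DOT text; render with flow.render(...) if desired
```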

UX task analysis example

To make the process easier to understand, we are going to walk you through a simplified example of UX task analysis on an existing system. In this example, we will analyze the task of Marco, who wants to buy a new pair of jeans for the summer.

Marco’s goal: “Purchase the jeans from an online store.”

Step 1: Split the task into smaller subtasks.

  • Find the jeans on the website
  • Add jeans to the shopping cart
  • Proceed to checkout

Step 2: Research exactly how Marco completes the subtasks using a website usability testing tool and analyze how he proceeds (or, if he most often shops on a mobile device, use a mobile testing tool).

Step 3: Analyze Marco’s behavior and prepare data for creating a task analysis diagram.

Step 4: Create a diagram

Here’s an example of what the diagram could look like:

Flow diagram and sequence diagram of Marco's flow

Step 5: Next steps

Looking at the task diagram, how can we optimize the task flow to make it more efficient from Marco’s point of view? Can we reduce the number of steps, decisions, and information he needs to know?

This is where your own cutting-edge design solution comes in.

References and further reading

Courage, Redish, and Wixon: Task Analysis. In Human-Computer Interaction, 2007

Marine: Task Analysis: The Key UX Design Step Everyone Skips, https://www.searchenginewatch.com/2014/03/27/task-analysis-the-key-ux-design-step-everyone-skips/

Mayhew: Requirement Specifications within the Usability Engineering Lifecycle. In Human-Computer Interaction, 2007

Usability Body of Knowledge: Task Analysis, http://usabilitybok.org/task-analysis

People also ask (FAQ)

Task analysis in UX is a systematic approach to mapping out how a user completes their task within a system. It reveals each step that needs to be taken, by the user and by the system, along with the flow and the dependencies. In task analysis, larger goals are often broken down into smaller subtasks, which are all analyzed in detail; this approach can offer insight into inefficiencies and possible issues.

These are the main steps of conducting a UX task analysis:

  • Define a task you want to analyze
  • Break the task down into smaller subtasks
  • Gather information about users (e.g. task-based usability testing )
  • Create a flow diagram

Task analysis is important in UX because it helps you understand your users, and their behavior when completing tasks using your digital product . This understanding can reveal potential issues and areas for improvement, guide design decisions, and support the overall efficiency of the user flow.


Task analysis and how it can help build a project team


Building an effective project team is vital to successful outcomes. As a project manager, you know there’s nothing more important than working with skilled, productive, and collaborative team members to meet or exceed project goals. However, doing this can be significantly more challenging than it sounds.

Thankfully, task analysis provides insight into a potential team member’s work processes before a project begins. This allows you to see what individual users bring to their tasks and helps you build the most efficient team possible. Today we’ll discuss how to use task analysis to build your project team, and how monday.com can simplify task analysis and your project’s overall processes to encourage maximum productivity.

What is task analysis?

Task analysis is a behavior analysis that allows project managers to see how individual team members accomplish their tasks. During your task analysis, your primary goal will be to learn key things like:

  • How a person accomplishes their goals
  • The specific steps a person takes to perform the task
  • The type of problem-solving a person uses to complete complex tasks
  • What experience and skill set an individual brings to your team
  • How a person completes their tasks in different environments
  • The mood and thoughts a person has about their task

To conduct a high-level task analysis, there are specific steps to take. These are crucial to fully understanding how a task analysis works on an individual and team-wide level.

How does task analysis work?

A task analysis includes the following six steps:

  • Identify your goals:  Figure out which task you want to analyze and what your purpose for analyzing it is. Determine your observation’s start and endpoint so you know when to begin your data collection and note-taking.
  • Break the task down into smaller tasks:  Once you’ve identified your goals, you’ll want to break down the main task you’ll be observing into smaller ones. You can break this down into as many smaller tasks as you’d like, but generally, six to eight will allow for the best results.
  • Decide what type of analysis you’ll be doing:  There are five common types of task analysis, so you must decide which works best for achieving your goals.
  • Begin your task analysis:  Make sure you take lots of notes during your study so you can review them later. While taking notes, consider how challenging each smaller task was for the team member, their process in completing it, and how they performed both physically and cognitively. Also, note how long each smaller task takes and the total time to achieve the primary goal from start to finish.
  • Review your notes:  After the in-person analysis is complete, you’ll find it helpful to look things over on your own, away from the situation. Identify any areas that may present an issue to your project team and consider ways that you could optimize the individual’s workflow. If you’ve conducted a task analysis for each project team member, compare these results to see where you could optimize workflow, increase productivity, and improve collaboration team-wide.
  • Share your results:  Once you’ve completed your individual or team-wide analysis, share your results with the whole team. After your independent analysis, getting input from the entire team improves consistency and ensures the plan moving forward is feasible for everyone involved.

Task analysis in project development and management

Task analysis can help you develop and manage projects more efficiently. Thorough task analysis can allow you to optimize workflows and make the most of your team’s skill set, experience, and expertise. To be most effective, you’ll want to perform individual task analysis on each project team member and then look at the results from a team-wide perspective.

Task analysis helps project managers create more competent teams and manage project roadblocks.

Benefits and potential drawbacks of using task analysis to build a team

There are numerous benefits to using task analysis when building a team. Notably, using task analysis can help you:

  • Simplify complex tasks:  Challenging or complicated tasks are more manageable when broken down into smaller sub-tasks. Task analysis is an excellent way to help you achieve this. In return, you may find your team feels more empowered and confident in their roles instead of overwhelmed by tasks.
  • Reduce mistakes:  Conducting an in-depth task analysis helps you find potential errors or roadblocks that may be caused by how your team executes their tasks. By finding these potential issues early on, you can reduce the risk of mistakes happening further into the project and even apply that to future projects to refine your processes.
  • Improve existing procedures and processes:  A significant benefit of task analysis is the ability to improve current procedures and processes or even develop new ones that are more effective and productive. During your analysis, you should also be able to identify the resources and skills necessary for new or existing processes.

Of course, there are a few potential drawbacks every project manager should be aware of. A few potential issues in the task analysis process you should be mindful of include:

  • The process can be time-consuming.
  • Task analysis sometimes yields complex findings that are challenging to decipher.
  • Since your team members are only human, there could be discrepancies in how quickly or efficiently they perform a given task from one day to another.
  • When getting feedback from your team, you may encounter diverse viewpoints that make it challenging to reach a consensus on how you should perform tasks.

You may find it helpful to better understand task analysis, its benefits, and its potential drawbacks by seeing some real-life examples.

Examples of task analysis and how it can be applied

There are many applications for task analysis in building and managing project teams. Below are two examples that can help you understand how task analysis is applied in real life.

Example one

A project manager needs to build a team for a project designed to market a new product for their company. They use task analysis to observe candidates for the team. Then, they review their notes to match the right team members based on experience, skill sets, personality, and overall efficiency.

Example two

A project manager notices the project is slipping behind initial projections and is at risk of missing the final deadline. They conduct an in-depth task analysis of each team member and then cross-reference those results for the entire team. Using this information, they identify tasks that could be simplified and share their results with the team, who implements the suggested changes. As a result, productivity levels increase, and the group gradually gets back on track with initial projections to successfully meet their final deadline.

Task analysis and monday.com

Although worthwhile, task analysis can be a lengthy process. Using the monday.com Work OS can make this process simpler and quicker by:

  • Promoting real-time collaboration:  Share results with your team faster and more efficiently with the real-time collaboration capabilities of our Work OS.
  • Providing customizable dashboards:   Customizable dashboards  allow you to view and compare the information that means the most to you.
  • Offering different ways to view and compare your data:  Our Work OS provides multiple ways to view your data, including  Kanban boards  and Gantt charts .
  • Automating routine tasks:   Automate everyday tasks  and approvals to streamline your work processes and achieve maximum efficiency during your task analysis and projects.

By now, you have a pretty solid understanding of what task analysis is and why it’s important. But, we’ve answered a few frequently asked questions below just in case.

Frequently asked questions

What are the five steps of task analysis?

The five steps of task analysis include:

  • Identify your goals
  • Break the larger task into smaller sub-tasks
  • Decide which type of task analysis you’ll be conducting
  • Conduct your analysis
  • Share your results with your team members and other stakeholders

What are the five types of task analysis?

The five types of task analysis include:

  • Performance analysis
  • Cognitive task analysis
  • Content analysis
  • Learning analysis
  • Activity analysis

What is the importance of task analysis?

Task analysis is important because it helps your project team members understand how to complete each task step to their best abilities. Additionally, it reduces mistakes, streamlines processes, and increases productivity.

Learn what task analysis can do for your team

Task analysis can help you build the best team for each project by ensuring you have the necessary experience, skill set, and personalities to promote collaboration and maintain high productivity levels. Additionally, task analysis can help simplify complex tasks and help your hand-picked team meet or exceed deadlines. Using monday.com to perform your task analysis and compare data makes the process simpler and more efficient.


The Essential Intro to Task Analysis

Swetha Amaresan

Updated: November 20, 2019

Published: July 29, 2019

When DIY furniture arrives at your doorstep, there are typically two reactions you can have. The first is an excited, confident feeling that you'll have this thing built in no time. I envy those people because they usually are done in no time.


The other reaction is cautious optimism, where you nervously peer through the project materials trying to figure out which side of the directions to start with -- only to find that the entire document is in a language you can't read. These are the people who spend hours scratching their heads, convincing themselves they don't need a new couch -- they have perfectly good lawn chairs instead. Guess which group I belong to?

The best products are intuitive and user-friendly. Their interfaces and designs are organized and simple to navigate, making it easy for customers to achieve their goals. Whether you're a DIY furniture company or a SaaS business, creating an easy-to-use product is crucial to your organization's success.

If you're not sure whether your product is user-friendly, you can perform a task analysis to measure its usability. A task analysis is a product development test that records a customer's ability to complete a task. The outcome can provide insight into customer behavior and how you can improve your product's design.

In this post, we'll go into more depth on what a task analysis is, then provide a template you can use to run this test at your business.


What Is Task Analysis?

Task analysis is the process of observing customers using your product or service in real time to better understand how they go about performing certain tasks. Once the analysis is complete, you can learn which tasks your application should support and which features or interfaces should be adjusted to align with customer needs.

A task analysis can help you understand more than just which tasks customers can complete and how they go about doing so. It can also demonstrate the existing knowledge some users have and how that affects their performance with your product. This gives your team helpful information that further defines your target demographic.

Additionally, it's important to keep in mind that in this case, task analysis is being described as a method for improving a business's user experience. However, this process can also be used for several other purposes, such as helping people with disabilities perform certain tasks and putting together training materials.

The next section breaks down the two types of task analysis you can perform, as well as the differences between the two.

Types of Task Analysis

Below are two types of task analysis.

1. Hierarchical Task Analysis

This type takes a complex task and breaks it down into smaller, simpler sub-tasks. That way, you can direct the user to achieve a goal using a certain set of steps. This creates a controlled environment that lets you analyze specific aspects of the customer experience. We can view an example of this type using the image below.

[Image: Hierarchical task analysis diagram. Source: UXmatters]

2. Cognitive Task Analysis

This type asks participants to use their problem-solving, decision-making, and personal judgment to complete a task. Customers are given an objective but unlike in a hierarchical test, they can choose how they'll achieve the goal. The researcher takes notes on the participant's process and records the key pain points experienced during the test. This gives businesses an unbiased look into customer perception and how they interpret your interface and design. We can look at a cognitive task analysis in the image below.

[Image: Cognitive task analysis diagram. Source: NWLink]

Now that we're familiar with the two types of task analyses, let's take an in-depth look at each one using the examples below.

Task Analysis Examples

1. Hierarchical Task Analysis Example

Goal/End Task: Find your company's FAQ page online.

Sub-Task 1: Turn on the computer.

Sub-Task 2: Log in.

Sub-Task 3: Open the web browser.

Sub-Task 4: Search "[Your company] knowledge base."

Sub-Task 5: Select the link titled "[Your company] Knowledge Base."

Sub-Task 6: Scroll down the landing page until you see the heading "FAQ."

This task analysis can help you understand which steps in the process can be simplified or automated to save your users time. For instance, rather than forcing customers to search for your knowledge base and scroll down to read the FAQ, perhaps the FAQ can be set up as its own page that can be found directly from a web search.
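To make that structure concrete, here is a minimal sketch in Python of a hierarchical task analysis as a tree whose leaves are the observable sub-tasks. The Task class and the extra nesting level for the knowledge-base search are our own illustration, not part of any particular tool:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A node in a hierarchical task analysis: a goal with ordered sub-tasks."""
    name: str
    subtasks: list["Task"] = field(default_factory=list)

    def leaf_steps(self):
        """Flatten the hierarchy into the ordered, observable steps."""
        if not self.subtasks:
            return [self.name]
        steps = []
        for sub in self.subtasks:
            steps.extend(sub.leaf_steps())
        return steps

# The goal from the example above, with the knowledge-base search
# decomposed one level deeper to show how sub-tasks can nest.
find_faq = Task("Find the company FAQ page", [
    Task("Turn on the computer"),
    Task("Log in"),
    Task("Open the web browser"),
    Task("Locate the knowledge base", [
        Task('Search "[Your company] knowledge base"'),
        Task('Select the link titled "[Your company] Knowledge Base"'),
    ]),
    Task('Scroll down until you see the heading "FAQ"'),
])

for number, step in enumerate(find_faq.leaf_steps(), start=1):
    print(f"Sub-Task {number}: {step}")
```

Representing the analysis this way makes it easy to print a checklist for observers, or to count how many steps a proposed redesign would eliminate.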

2. Cognitive Task Analysis Example

Goal/End Task: Publish a new blog post.

In this example, you can analyze the decisions a user makes when asked to complete a generic task. For instance, one user might pull out a smartphone, unlock it, open the Notes app, and begin jotting down notes. Another user might pull out a laptop, turn it on, and begin typing in Microsoft Word.

A third user might perform the same steps as the previous user but, instead, open their web browser, search for their blog, and begin typing their blog post there. A fourth user might just shrug their shoulders and claim they've never done such a thing.

This task analysis can help you understand how different users navigate the process of solving the same problem in their unique way. Considering what the majority of users do when assigned a certain problem can tell you how a task is accomplished -- not how you believe it would be accomplished.

For example, you may have assumed most people would have opened a laptop or computer but could be surprised to find that some people prefer to write on their phones or with a pen and notepad. Recognizing these specific behaviors will help your team adjust features and align the product with how your customers want to use it.
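Because each participant chooses their own path in a cognitive task analysis, the raw output is a set of observed step sequences rather than one fixed list. Here is a rough sketch of how those observations might be tallied; the participant data below is invented for illustration:

```python
from collections import Counter

# Observed step sequences for the goal "publish a new blog post".
# These observations are invented for illustration.
observations = [
    ["unlock phone", "open Notes app", "draft post"],
    ["open laptop", "open Microsoft Word", "draft post"],
    ["open laptop", "open web browser", "log in to blog", "draft post"],
    ["gave up"],
]

# Tally the first action each participant took: the majority path shows
# how the task is actually begun, not how you assumed it would begin.
first_moves = Counter(path[0] for path in observations)
for move, count in first_moves.most_common():
    print(f"{move}: {count} of {len(observations)} participants")
```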

Both types of tests are useful to a product development team and can be conducted on any product at your business. To help your team get started, we put together the template below, which can be used for both types of task analyses; a small scoring sketch follows the template.

Task Analysis Template

Participant Name:

Observer Name:

(I): Independent Step

(V): Verbal Prompt

(P): Physical Prompt

1.
2.
3.
4.
5.
6.
7.
8.

For usability tests, check out our guide to first click testing.
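As an optional aid, the (I)/(V)/(P) prompt codes recorded in the template can be tallied after a session. The sketch below is a hypothetical illustration; the step names and codes are invented:

```python
# Prompt codes from the template: (I) independent, (V) verbal, (P) physical.
# The steps and codes below are an invented example session.
session = [
    ("Turn on the computer", "I"),
    ("Log in", "V"),
    ("Open the web browser", "I"),
    ("Search for the knowledge base", "P"),
]

labels = {"I": "Independent step", "V": "Verbal prompt", "P": "Physical prompt"}

# Print the observer's record, then summarize how many steps the
# participant completed without any prompting.
for number, (step, code) in enumerate(session, start=1):
    print(f"{number}. {step}: {labels[code]}")

independent = sum(1 for _, code in session if code == "I")
print(f"Independent steps: {independent} of {len(session)}")
```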


Principles of Task Analysis and Modeling: Understanding Activity, Modeling Tasks, and Analyzing Models

  • Living reference work entry
  • First Online: 17 November 2022


  • Célia Martinie, Philippe Palanque & Eric Barboni


Task analysis identifies user goals and tasks when using an interactive system. In the case of users performing real-life work, task analysis can be a cumbersome process that gathers a huge amount of unorganized information. Task models provide a means for analysts to organize the information gathered during task analysis in an abstract way and to detail it further if needed. This chapter presents the benefits of using task models for task analysis, with a practical view of the process for building task models. As task models can be large, it is important to provide the analyst with computer-based tools for editing task models and for simulating them. In this chapter, we illustrate the presented concepts with the HAMSTERS notation and its associated eponymous tool.

HAMSTERS tool: https://www.irit.fr/ICS/tools/


Author information

Authors and Affiliations

Institute of Research in Informatics of Toulouse (IRIT), Université Paul Sabatier – Toulouse III, Toulouse, France

Célia Martinie, Philippe Palanque & Eric Barboni


Corresponding author

Correspondence to Célia Martinie.

Editor information

Editors and Affiliations

Universite catholique de Louvain, Louvain-la-Neuve, Belgium

Jean Vanderdonckt

IRIT - Interactive Critical Sys Group, Paul Sabatier University, Toulouse, France

Philippe Palanque

I3S, INRIA wimmics/SPARKS team, Université Nice Sophia Antipolis, Sophia Antipolis Cedex, France

Marco Winckler


Copyright information

© 2022 Springer Nature Switzerland AG

About this entry

Cite this entry

Martinie, C., Palanque, P., Barboni, E. (2022). Principles of Task Analysis and Modeling: Understanding Activity, Modeling Tasks, and Analyzing Models. In: Vanderdonckt, J., Palanque, P., Winckler, M. (eds) Handbook of Human Computer Interaction. Springer, Cham. https://doi.org/10.1007/978-3-319-27648-9_57-1


DOI: https://doi.org/10.1007/978-3-319-27648-9_57-1

Received: 17 December 2021

Accepted: 17 May 2022

Published: 17 November 2022

Publisher Name: Springer, Cham

Print ISBN: 978-3-319-27648-9

Online ISBN: 978-3-319-27648-9



Task Analysis

Task Analysis is a method of observing participants in action as they perform their tasks. It helps you figure out how users perform tasks and how a system, product, or service should be designed so that users can achieve their intended goals. Task Analysis also helps determine what the user's goals are (i.e., what designers must design for), how users determine or measure the completion of tasks, and what personal, social, and cultural attributes influence the user's performance.

Quick details: Task Analysis

Structure: Structured

Preparation: Respondent recruitment, Tasks outline, Recording tools

Deliverables: Recordings, Transcripts, Data

More about Task Analysis

Task analysis can be used in a number of situations: when designing a website, when testing a prototype, or as part of user testing and validation. It is important that task analysis is performed during or before the design phase so that the insights obtained can be easily incorporated into the product, service, or system being designed.

Task Analysis can be performed one on one or online, depending on the project under consideration. To analyze the way a user performs tasks, complicated and time-consuming tasks can be broken down into subtasks, each of which can be observed and analyzed individually.

Types of Task Analysis

Task Analysis is of two types, depending on the end goal and composition. If the task analysis involves analyzing qualitative end-goals such as decision-making, emotions, problem-solving skills, or recall, it is termed Cognitive Task Analysis. If the Task Analysis instead involves breaking down a complex task into subtasks, analyzing the subparts, and deducing the nature of the whole from its composite parts, it is termed Hierarchical Task Analysis.

Cognitive
Purpose: To analyze qualitative end-goals such as decision-making, emotions, problem-solving skills, recall, etc.
Disadvantages: The qualitative nature of this type of task analysis may not give accurate or clear findings; the results may be vague.

Hierarchical
Purpose: To understand how subtasks are performed and to analyze the complex task based on participants performing the subtasks.
Advantages: If undertaken at an early stage, the results can help identify crisp improvement areas in terms of user experience.
Disadvantages: Depending on the degree of decomposition, the data obtained will be detailed or basic. Different researchers may be involved: one decomposing the task, one observing the participants performing it, and one analyzing the findings.

Advantages of Task Analysis

1. Great understanding of users and their end-goals

Task analysis allows the researcher to understand not only the participants' end goals but also their competence in performing the task, the triggers that lead to the task, the triggers that disrupt the user's flow during the task, and the tools the user employs to perform it.

2. High-level understanding of user environments

Task Analysis also gives an indication of the user's environment and whether or not that environment is conducive to performing the task.

3. Relevant at every stage of the project

Task Analysis can be conducted at any stage of the product or service development but the earlier it is conducted, the better.

4. Practical

Task analysis helps to highlight the practical aspects that come into play when a user is performing a task.

5. Determine gaps between set processes and actual steps in performing a task

This method also helps figure out whether there is a difference between the way the user actually performs the task and the way the user says they perform it.

Disadvantages of Task Analysis

1. Time consuming

If the task analysis is performed with a large sample of participants, the activity can be time consuming. Online tools may help in recording data, but actual observation requires the researcher to be present while the task is being performed.

2. Complex findings

Depending on whether the task analysis method is cognitive or hierarchical, the findings may be complex and not that easily analyzable.

3. Discrepancy in the pace of performing the task

If users rush through a task during the exercise that they would otherwise perform at a relaxed pace, the resulting system or product can drift away from the real user requirement. The same is true in reverse: a user who normally performs a task hastily may perform it in an unusually relaxed manner during the exercise.

4. Diverse Viewpoints

If the scope of the project is large, user testing may produce a large amount of diverse data that is difficult to collate and analyze.

Think Design's recommendation

Task Analysis goes a long way toward enhancing the usability of your product or organization if done correctly. Use this method if your objective is to assist users in performing their tasks, or if you intend to improve organizational efficiency by understanding tasks and then optimizing them.

Consider the following recommendations to improve the impact of your task analysis exercise:

  • Understanding the linkages and hierarchy of tasks is important to get a bird’s-eye perspective. Map your understanding in a hierarchical flow chart and then use that to analyze.
  • More often than not, there are handoffs involved during tasks. Understanding those handoffs and mapping people to tasks will improve your understanding of the kind of organization you are working for. More importantly, this will give you the linkages among people, tasks, and systems/tools, which are much needed while redefining a product or organization.
  • The purpose of this exercise is to understand the current state and how it could become a baseline for the future state. Hence, Task Analysis should be done as an analysis rather than as a documentation of what exists. Always question why tasks are being done the way they are being done, and find out how the situation can be improved.
  • As a UX practitioner, you are dealing with the emotions of the user, and you want to make amends in the areas where users are frustrated. When you map out tasks, it can greatly help to also capture the emotions users experience while performing them.


Task Analysis Examples for Use in the Classroom to Help Students Reach Goals

  • Carol Lee McCulloch

Make It Simple

Classrooms from preschool to high school can use the task analysis process for routine rules and learning skills. For example, in kindergarten and lower elementary settings, the daily routine laid out for students to follow provides opportunities for sub-tasking. If a teacher posts rules of conduct or expectations in a given subject area, a checklist can be provided to monitor behavioral and academic progress. If rules or procedures are too general for young children to grasp completely, a list of “how-to’s” can be charted for clarity. Here’s a simple task analysis example: if the general rule or procedure is “Be Respectful To Your Fellow Classmates,” it may be more helpful to list, step by step, the ways this can be accomplished: a) ask different classmates to play with you on the playground, b) speak kindly to each classmate, c) do not make fun of anyone, d) be a helper, not a troublemaker, and so on. The young student can then check off the steps he or she has accomplished; as a result, good classroom habits develop and the general concept is fully understood.

Strategies and Skills

For high school and college instructors, task analysis may be best applied by charting the strategies and skills required to accomplish the task. In other words, the instructor needs to know whether the student’s prerequisite skills are in place before designing the course of study. In English class, for example, a task analysis on how to write a simple research paper can prove very useful. The procedures-and-strategies approach is highly successful in teaching a how-to lesson: STRATEGIES are listed on one side of the chart with SKILLS REQUIRED directly across, and each section is subdivided to best explain what is expected and what a student should know in order to accomplish the goal. Another analysis approach lists sequential (boxed) steps which must be followed to complete a specific task. Long division in upper elementary, as well as organizing thoughts and processes in science and social studies class, has proven much easier to digest using this method of task analysis.

Positive Benefits

According to an article on “Linking Task Analysis to Student Learning,” from the Educational Resource Information Center, there are many perspectives and approaches to task analysis. But the one point that all theorists agree on is that “task analysis, at a minimum, assists the instructor or designer to understand the content to be taught. This alone is sufficient reason for recommending it.” Task analysis activities have definitely been useful in helping teachers, students, employers and employees stay on track throughout a specific learning process. Goals are more easily understood and accomplished if the expected outcome is presented in pieces. Let us know in the comments if you have any task analysis examples you wish to share!

Sources: “What’s the Purpose of Task Analysis?”; ERIC (Education Resources Information Center), “Linking Task Analysis with Student Learning.” Image by Aline Dassel from Pixabay.

This post is part of the series: Special Education Activities

With many innovative approaches to teaching children with disabilities, educators, coaches, and volunteers alike can find exciting, rewarding ways to share expertise with the special needs population!

  • Task Analysis Activities: Teaching Students to Complete Tasks
  • Incorporating Music Into Teaching Students With Special Needs
  • Sports Activities for the Disabled
  • How the School Based Support Team Works

Regis College


Task Analysis in Applied Behavior Analysis (ABA) Therapy: Strategies and Examples

June 30, 2023


Autism Spectrum Disorder (ASD) is defined by the American Psychiatric Association as “a complex developmental condition that involves persistent challenges in social interaction, speech and nonverbal communication, and restricted/repetitive behaviors.” The effects and severity of symptoms experienced by people diagnosed with ASD vary widely.

The U.S. Centers for Disease Control and Prevention estimates that autism spectrum disorders are present in 1 in 59 children. ASD is about four times as prevalent in boys as in girls, with 1 in 37 boys diagnosed as having ASD, compared to 1 in 151 girls.

The most popular treatment for children with autism spectrum disorder is applied behavior analysis (ABA), which the Association for Science in Autism Treatment describes as the use of interventions to improve “socially important behavior.” Behavior analytic interventions are based on learning theory and methods that have been studied scientifically and shown to be effective in improving the lives of people with autism spectrum disorders.

The antecedent-behavior-consequence (ABC) method of assessing functional behavior can be combined with an intervention such as task analysis as the basis for effective interventions in children with autism spectrum disorder. These types of assessments and interventions work to “increase appropriate skills and decreas[e] maladaptive behaviors,” as Psych Central reports. The goal of a task analysis is to break down and simplify complex tasks in order to provide step-by-step guidance on how to complete specific behaviors. This guide describes several specific task analysis techniques and presents examples of their application in diverse settings.

What Is Task Analysis?

The National Professional Development Center on Autism Spectrum Disorders defines task analysis as a teaching process that breaks down complex activities into a series of simple steps that students are able to learn more easily. Researchers have shown that task analysis meets the criteria for evidence-based practice by improving adoption of “appropriate behaviors and communication skills” by children in preschool, elementary school, and middle school.

Task analysis techniques fall into two broad categories, as the Autism Classroom blog explains:

  • The desired skill can be broken into discrete steps that are performed in sequence, such as the appropriate way to wash one’s hands. The steps are linked via “chaining,” which signals the completion of each step as a cue to begin the next step.
  • Alternatively, a task can be divided into short chunks of time, so a 20-minute activity may be broken into five four-minute segments. This approach is frequently associated with “shaping,” which teaches new behaviors by reinforcing “successive approximations” of the behavior rather than repeating previous approximations, as the Association for Science in Autism Treatment explains.

However, a simple definition of what task analysis is doesn’t explain why the approach has become so important in educating children with ASD. Three characteristics are vital to the success of task analysis as a teaching method:

  • Consistency: If three different people demonstrate to a student how to perform a specific activity, such as brushing teeth, the student will likely be shown three different methods, because each “teacher” performs the activity in a unique way. This can leave the student confused. Task analysis ensures that a single approach is presented and reinforced in all learning situations.
  • Individualization: Each student has unique strengths and weaknesses, so task analysis methods can be customized to meet the student’s specific circumstances. For example, when teaching a student to remain in a group for 20 minutes via shaping, the task increments can be varied to the abilities of the student, with some responding best to two-minute chunks and others to five-minute blocks.
  • Systematic instruction: One challenge students with ASD face is dealing with the many variables that complicate learning. Task analysis relies on “discrete trial programs” that divide activities into small steps that culminate in the end goal. For example, students who have learned four of the eight steps entailed in tying their shoes have successfully mastered those four steps, although they have not yet achieved the end goal.

The task analysis technique of chaining has two primary components, as ThoughtCo. explains (a short code sketch follows the list):

  • Forward chaining relies on the student learning from the start of the task sequence through each step of the task in sequence, so step two begins only after step one is completed. Each step is first modeled by the instructor and then imitated by the student, although some students will require hand-over-hand prompting followed by “fading” of the prompt as the student exhibits increasing mastery of the step.
  • Backward chaining begins by teaching the student the last step of the task, first by having the student observe the teacher and then by having the student assist the teacher. After the last step has been grasped (though not yet perfected), the instructor turns to the second-to-last step of the process and continues backward to the initial steps. An example is learning to do laundry: the student is first taught how to remove the clothes from the dryer and fold them, then how to transfer the clothes from the washer to the dryer, and all preceding steps in the process one-by-one in reverse order.
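In essence, the two chaining directions differ only in which end of the step sequence is opened up for teaching first. Here is a minimal sketch of that difference, using the laundry example; the step wording and mastery flags are invented for illustration:

```python
steps = [
    "sort the dirty laundry",
    "load the washer",
    "start the washer",
    "move the clothes to the dryer",
    "start the dryer",
    "remove the clothes and fold them",
]

def next_step_to_teach(mastered, backward=False):
    """Return the next unmastered step, scanning from the start of the
    chain (forward chaining) or from the end (backward chaining)."""
    order = reversed(steps) if backward else steps
    for step in order:
        if step not in mastered:
            return step
    return None  # the whole chain has been mastered

print("Forward chaining starts with:", next_step_to_teach(set()))
print("Backward chaining starts with:", next_step_to_teach(set(), backward=True))
print("Backward, once folding is mastered:",
      next_step_to_teach({"remove the clothes and fold them"}, backward=True))
```

In forward chaining, the first unmastered step from the front of the chain is taught next; in backward chaining, the scan runs from the end, so the learner always finishes the task and experiences its natural completion.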

Other effective task analysis techniques include these two approaches:

  • Discrete trial instruction: The teacher gives the student a short, clear instruction and provides a prompt to help the student complete the instruction, whether by modeling the target response or guiding the student’s own response. As the student progresses, the prompt is removed gradually. When the student responds accurately, the teacher offers immediate positive feedback; when the student’s response is incorrect, the teacher demonstrates or guides the student to perform the correct response.
  • Modeling: The student is shown the target behavior and is then instructed to imitate that behavior. Modeling has proven effective in teaching social, play, and self-help skills.

What Is the Purpose of Task Analysis?

The goal of applied behavior analysis is to help people with ASD learn the fundamental skills that will allow them to lead independent lives. Task analysis is one of several methods used by applied behavior analysts to understand and modify a person’s behavior.

The Autism Classroom describes task analysis as both “unexciting” and “critical to systematic instruction.” The advantages of task analysis over other ABA approaches are explained by Autism Speaks:

  • Task analysis is easy to adapt to the needs of each individual learner.
  • The techniques can be applied in multiple settings, including classrooms, homes, and the community.
  • The skills taught via task analysis are practical in the student’s everyday life.
  • Task analysis can be used in one-on-one instruction and in group settings.

When preparing an ABA program for a student, applied behavior analysts begin by assessing the student’s skills, as well as the goals and preferences of the student and the student’s family. Age-appropriate skills evaluated in the initial assessment serve as the foundation for the student’s specific treatment goals. These skills include the following:

  • Communication and language skills
  • Social interaction
  • Self-help (hygiene, healthy living, etc.)
  • Play and relaxation activities
  • Motor skills
  • Academic skills

The primary use of task analysis in ABA settings is to teach activities for daily living (ADLs), as Total Spectrum explains. ADLs are actions that most people complete on a daily basis, such as setting a table for dinner or purchasing an item and asking for change. For people with autism spectrum disorder, however, these skills are especially important, as these types of activities serve as the foundation for their independence.

Individuals with autism spectrum disorder gain a better understanding of basic living skills by focusing on the mastery of individual steps in a complex process. Task analysis can be applied to any process that can be broken into multiple steps. Once the steps have been identified and the directions created, instructors devise a learning plan that is customized to the needs and goals of the student. The instruction often relies heavily on visual support tools, such as cards, small replicas of objects, or the objects themselves.

In addition to helping the student with autism spectrum disorder, task analysis can improve the quality of life for all family members. Strong skills in communication, interpersonal relations, and social interactions help enable people with ASD to lead successful, independent lives. Autism Speaks outlines the purpose of task analysis and the many ways task analysis and other ABA approaches benefit individuals with ASD, their families, and their communities:

  • Task analysis replaces problem behaviors with new skills, so students learn “what to do” rather than simply “what to stop doing.”
  • Reinforcement increases on-task positive behaviors and minimizes negative behaviors.
  • Tasks that teach self-monitoring and self-control engender skills that are easily transferred to social and job-related capabilities.
  • Responding positively to a student’s behavior prevents unintentionally rewarding problem behavior.
  • Students are better able to focus on and comply with specific tasks, which motivates them to perform.
  • By improving cognitive skills, the tasks make it easier for students to learn other academic subjects.
  • Learning appropriate behaviors in specific situations helps students generalize skills and apply them outside the classroom.

Demonstrating the Task Analysis for Brushing Teeth

Teeth brushing is a daily routine for dental hygiene that most adults perform with little conscious thought, but it is an example of an activity that can be challenging for children with autism spectrum disorder. Behavioral Health Works describes the task analysis for brushing teeth. The teaching begins by reinforcing the reason for the activity: to have clean, healthy teeth.

The next steps may seem intuitive to adults, but the process can be formidable for children who have never brushed their teeth themselves and may fear the sensory components of teeth brushing or making a mistake. By dividing the task into a sequence of discrete actions, children are more confident that they can perform each subtask correctly. Task analysis has been shown to teach these types of skills much more quickly than alternative instruction methods.

Few adults would guess that the relatively simple act of brushing one’s teeth comprises at least 17 separate operations:

  • Pick up the toothbrush.
  • Turn on the water tap.
  • Wash and rinse the toothbrush.
  • Turn off the water.
  • Pick up the toothpaste tube.
  • Remove the cap from the tube.
  • Place a dab of toothpaste on the bristles of the toothbrush.
  • Put the cap back on the tube of toothpaste.
  • Use the bristle end of the brush to scrub all of the teeth gently. (This step may need to be broken into several subtasks, such as, “Start brushing the teeth in the top left corner of your mouth, then brush the top center, then the top right, then the bottom right,” etc.)
  • After brushing all the teeth, spit the toothpaste into the sink.
  • Turn on the water.
  • Rinse off the toothbrush.
  • Place the toothbrush back into its holder.
  • Pick up a rinsing cup.
  • Fill it partially with water.
  • Rinse the mouth with water from the cup.
  • Spit the water into the sink.

By breaking down the task into smaller activities, students are less likely to feel overwhelmed by the overall objective. However, students with ASD will likely need to master one or two of the steps at a time and then link the separate activities using either forward chaining or backward chaining, as ThoughtCo. describes:

  • For students who are able to learn multiple steps at one time, forward chaining can be used to link the steps in the proper sequence via modeling and verbal prompts. Once the student demonstrates mastery of the first few linked steps without guidance, the next linked steps of the task can be taught.
  • For students who lack strong language skills, backward chaining allows the teacher to perform the initial steps hand over hand while naming each step. This gives the student an opportunity to practice each step while simultaneously learning the corresponding vocabulary. Prompting is removed as the last steps of the process are taught, but reinforcement continues until the student has mastered the entire task.

The task analysis for brushing teeth can be facilitated by creating a visual schedule that indicates when the student has completed each step. The student can review the visual schedule before beginning the task, or the schedule can be placed on the counter so the student can refer to it as each step is performed.
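As a rough sketch, a plain-text version of such a check-off schedule could be generated from the step list; the abbreviated step wording below is our own:

```python
# Abbreviated steps from the tooth-brushing task analysis above.
steps = [
    "Pick up the toothbrush",
    "Wet and rinse the toothbrush",
    "Put toothpaste on the brush",
    "Brush all of the teeth gently",
    "Spit the toothpaste into the sink",
    "Rinse the toothbrush and put it back",
    "Rinse the mouth with water from the cup",
]

# One check-off line per step: the learner (or a helper) marks the box
# as each step is completed, or notes the prompt level that was needed.
for number, step in enumerate(steps, start=1):
    print(f"[ ] {number}. {step}")
```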

Demonstrating the Task Analysis for Washing Hands

One of the simplest and most effective ways to prevent illness — in oneself and in others — is by washing one’s hands. The CDC recommends that people wash their hands frequently each day:

  • Before and after preparing food
  • Before eating
  • Before and after treating a cut or wound
  • After using the bathroom
  • After blowing the nose, coughing, or sneezing
  • After touching an animal, animal feed, or animal waste
  • After handling pet food or pet treats
  • After touching garbage

The CDC divides hand washing into five separate operations:

  • Wet the hands with clean running water, turn off the tap, and apply soap.
  • Rub the hands together with the soap to create a lather that covers the front and back of the hands and goes between the fingers and under the fingernails.
  • Scrub the hands for a minimum of 20 seconds.
  • Thoroughly rinse the hands under clean running water and then turn off the tap.
  • Dry the hands using a clean towel or air dryer.

However, the task analysis for washing hands breaks down the process into several more discrete steps, as the New Behavioral Network describes:

  • Stand in front of the sink.
  • Run the water over the hands thoroughly.
  • Apply soap to the hands.
  • Scrub the hands for 20 seconds.
  • Turn the water back on.
  • Rinse the soap off the hands thoroughly.
  • Dry the hands.

As with the task analysis for teeth brushing, breaking down the complexities of such basic hygiene tasks into smaller pieces helps individuals with autism spectrum disorder to build a chain of learning that completes the overall task when the separate steps are linked together. The forward and backward chaining taught as part of these exercises can be transferred to other social and employment situations.

A Look at Other Task Analysis Examples

The range of applications for task analysis in ABA therapy is limited only by the imagination of teachers and the needs of students.

  • Accessible ABA highlights the many ways chaining can be combined with task analysis to teach students with autism spectrum disorder using the methods that are most effective for the way these students learn. A task analysis example demonstrating the versatility of this approach is learning how to put on a pair of pants, which may include steps for sliding each foot into each pant leg one at a time, pulling the pants up, and buttoning and zipping them.
  • Think Psych offers the task analysis example of teaching students with autism spectrum disorder how to eat yogurt, steps for which include opening the refrigerator, taking the yogurt container out, removing the lid of the container, retrieving a spoon from the utensil drawer, using the spoon to eat the yogurt, throwing the empty yogurt container in the trash, and placing the dirty spoon in the dishwasher.
  • The Autism Community in Action explains how to use task analysis to teach a student with autism spectrum disorder how to fold a towel, which starts by laying the towel flat on a table, taking the top corners of the towel in each hand, bringing the top edge down to the bottom edge, bringing the left edge of the towel to the right edge, smoothing the towel flat, and placing the folded towel in a basket or closet.
  • ThoughtCo. provides an example of task analysis with backward chaining to help a student learn how to do laundry. The instruction begins when the load of laundry is completed: The student begins by removing the laundry from the dryer and folding it, and after this step is mastered, the student is shown how to set the dryer and push the start button. The instruction works backward step-by-step through the washing and drying process, culminating with lessons on how to sort the dirty laundry and load it into the washer.

Preparing for a Satisfying Career in ABA Therapy

Task analysis and other ABA techniques are part of a comprehensive evidence-based practice that teaches students with autism spectrum disorder the life skills they will need to live independently. Visual presentation approaches and breaking down complex tasks into a series of simple steps are keys to helping children with ASD process information quickly and simply.

Graduate programs such as Regis College’s online Master of Science in Applied Behavior Analysis prepare students who are starting their careers or looking to advance in their field. Among the career options available to MS-ABA graduates are ABA training coordinator, clinical supervisor, and clinical director. Graduates often work at outpatient care centers or government agencies, or in private practice.

Learn More About ABA Therapy Strategies

Discover more about how Regis College’s online Master of Science in Applied Behavior Analysis degree program helps address the growing need for health professionals trained in task analysis and other ABA methods that help students with autism learn the skills they will need to lead independent lives.

Recommended Readings

What Is Autism Spectrum Disorder and What Should Parents and Educators Know?

Strategies in Behavior Therapy: Creating a Behavior Assessment

Exploring ABA Techniques and Their Role in Treatment

Accessible ABA, “Use Chaining and Task Analysis to Help Your Child with Autism”

American Psychiatric Association, “What Is Autism Spectrum Disorder?”

Association for Science in Autism Treatment, Applied Behavior Analysis (ABA)

Association for Science in Autism Treatment, Behavior Chaining

Association for Science in Autism Treatment, Discrete Trial Instruction

Association for Science in Autism Treatment, Modeling

Association for Science in Autism Treatment, Shaping

Association for Science in Autism Treatment, “Teaching Procedures Using Principles of Applied Behavior Analysis”

Association for Science in Autism Treatment, Verbal Behavior/Applied Verbal Behavior

Autism Classroom, “What You Need to Know About Task Analysis and Why You Should Use It”

The Autism Community in Action, “Developing Lifeskills: How to Teach a Skill”

Autism Grown Up, “Task Analysis as an Evidence-Based Practice”

Autism Speaks, “Applied Behavior Analysis: A Parent’s Guide”

Autism Speaks, Autism Facts and Figures

Autism Speaks, What Is Applied Behavior Analysis?

Behavioral Health Works, “Using a Task Analysis to Teach a Child to Brush Their Teeth”

BetterHelp, “Understanding the Antecedent Behavior Consequence Model”

Indian Journal of Psychological Medicine, “Comprehensive Management of Autism: Current Evidence”

National Professional Development Center on Autism Spectrum Disorders, “Evidence-Based Practice Brief: Task Analysis”

New Behavioral Network, Washing Your Hands — Task Analysis

Psych Central, “ABC’s of Behavior (Antecedent-Behavior-Consequence)”

Psych Central, “Shaping, Chaining & Task Analysis with an Example from Everyday Life”

Research Autism, Applied Behaviour Analysis and Autism

ThinkPsych, Free Resource: Task Analysis Sheet

ThinkPsych, Task Analysis for Getting Dressed and Hygiene

ThoughtCo., “Chaining Forward and Chaining Backwards”

ThoughtCo., “Teaching the Functional Skill of Tooth Brushing”

Total Spectrum, Task Analysis of ADL’s and Multi-Step Directions

U.S. Centers for Disease Control and Prevention, Data & Statistics on Autism Spectrum Disorder

U.S. Centers for Disease Control and Prevention, When and How to Wash Your Hands

Let’s move forward

Wherever you are in your career and wherever you want to be, look to Regis for a direct path, no matter your education level. Fill out the form to learn more about our program options or get started on your application today.

1.What degree program are you most interested in?
2.What degree program are you most interested in?
2.What DNP Program are you interested in?
2.What option are you most interested in?
2.What option are you interested in?
2.What track are you interested in?
3.Complete the form to download the brochure.

loading

  • Introduction
  • Conclusions
  • Article Information

The line represents the estimated efficiency frontier for model D; labels above line indicate efficient and near-efficient strategies. Efficiency frontier graphs for all models are shown in the Supplement . Grey shading, in which near-efficient strategies are located, shows area within 5% of the value for screening biennially during ages 50 to 74 years with digital breast tomosynthesis (DBT). For panels A and C, near-efficient strategies included those within 2.20 days of life gained per woman of the efficient frontier for all women and 3.22 days of life per Black woman. For B and D, near-efficient strategies included those within 5% of the efficiency frontier on a relative scale—on an absolute scale, equivalent to 1.27 percentage points for women overall and 1.21 points for Black women. Strategies vary by age at starting and stopping screening, interval between mammograms, and screening modality. A indicates annual; and B, biennial. a Near-efficient.

All strategies use digital breast tomosynthesis. Results shown as medians across 6 models of women overall (D [Dana-Farber Cancer Institute], E [Erasmus Medical Center], GE [Georgetown Lombardi Comprehensive Cancer Center-Albert Einstein College of Medicine], M [University of Texas MD Anderson Cancer Center], S [Stanford University], and W [University of Wisconsin-Madison and Harvard Pilgrim Health Care Institute]) and across 4 models of Black women (D, GE, M, and W). Differences in medians calculated by subtracting values in Table 2, Table 3, and eTables 8 and 9 in the Supplement may not be equivalent to the median of the differences across models, as shown in this figure.

a Efficient or near-efficient in most models.


Trentham-Dietz A, Chapman CH, Jayasekera J, et al. Collaborative Modeling to Compare Different Breast Cancer Screening Strategies: A Decision Analysis for the US Preventive Services Task Force. JAMA. 2024;331(22):1947-1960. doi:10.1001/jama.2023.24766


Collaborative Modeling to Compare Different Breast Cancer Screening Strategies: A Decision Analysis for the US Preventive Services Task Force

  • 1 Department of Population Health Sciences and Carbone Cancer Center, School of Medicine and Public Health, University of Wisconsin–Madison
  • 2 Department of Radiation Oncology and Center for Innovations in Quality, Safety, and Effectiveness, Baylor College of Medicine, Houston, Texas
  • 3 Health Equity and Decision Sciences (HEADS) Research Laboratory, Division of Intramural Research at the National Institute on Minority Health and Health Disparities, National Institutes of Health, Bethesda, Maryland
  • 4 University of Washington School of Medicine, Seattle
  • 5 Division of Cancer Prevention, National Cancer Institute, National Institutes of Health, Bethesda, Maryland
  • 6 Department of Medicine, Stanford University School of Medicine, Stanford, California
  • 7 Department of Biostatistics and Medical Informatics, School of Medicine and Public Health, University of Wisconsin–Madison
  • 8 Stanford University, Stanford, California
  • 9 Department of Data Science, Dana-Farber Cancer Institute, Boston, Massachusetts
  • 10 Harvard Pilgrim Health Care Institute, Boston, Massachusetts
  • 11 Erasmus MC—University Medical Center, Rotterdam, the Netherlands
  • 12 University of Texas MD Anderson Cancer Center, Houston
  • 13 Department of Industrial and Systems Engineering and Carbone Cancer Center, University of Wisconsin–Madison
  • 14 Departments of Medicine and Epidemiology and Population Health, Stanford University, Stanford, California
  • 15 Departments of Medicine and Epidemiology and Biostatistics, University of California San Francisco
  • 16 Kaiser Permanente Washington Health Research Institute, Seattle, Washington
  • 17 Department of Surgery, University of Vermont, Burlington
  • 18 Dartmouth Institute for Health Policy and Clinical Practice and Departments of Medicine and Community and Family Medicine, Dartmouth Geisel School of Medicine, Hanover, New Hampshire
  • 19 Division of Cancer Control and Population Sciences, National Cancer Institute, National Institutes of Health, Bethesda, Maryland
  • 20 Departments of Biomedical Data Science and Radiology, Stanford University, Stanford, California
  • 21 Albert Einstein College of Medicine, Bronx, New York
  • 22 Department of Public Health Sciences, University of California Davis
  • 23 Departments of Oncology and Medicine, Georgetown University Medical Center, and Georgetown Lombardi Comprehensive Institute for Cancer and Aging Research at Georgetown University Lombardi Comprehensive Cancer Center, Washington, DC

Question   What are the benefits and harms of different screening mammography strategies?

Findings   Six validated CISNET models found that, compared with no screening, biennial mammography screening with digital breast tomosynthesis from ages 40 to 74 years yielded, per 1000 women screened, a median of 8.2 breast cancer deaths averted (a 30% reduction in breast cancer mortality), 165 life-years gained, 1376 false-positive recalls, 201 benign biopsies, and 14 overdiagnosed cases. For each strategy, benefits were larger for Black women than for all women.

Meaning   Biennial mammography from ages 40 to 74 years has favorable benefit-to-harm tradeoffs.

Importance   The effects of breast cancer incidence changes and advances in screening and treatment on outcomes of different screening strategies are not well known.

Objective   To estimate outcomes of various mammography screening strategies.

Design, Setting, and Population   Comparison of outcomes using 6 Cancer Intervention and Surveillance Modeling Network (CISNET) models and national data on breast cancer incidence, mammography performance, treatment effects, and other-cause mortality in US women without previous cancer diagnoses.

Exposures   Thirty-six screening strategies with varying start ages (40, 45, 50 years) and stop ages (74, 79 years) with digital mammography or digital breast tomosynthesis (DBT) annually, biennially, or a combination of intervals. Strategies were evaluated for all women and for Black women, assuming 100% screening adherence and “real-world” treatment.

Main Outcomes and Measures   Estimated lifetime benefits (breast cancer deaths averted, percent reduction in breast cancer mortality, life-years gained), harms (false-positive recalls, benign biopsies, overdiagnosis), and number of mammograms per 1000 women.

Results   Biennial screening with DBT starting at age 40, 45, or 50 years until age 74 years averted a median of 8.2, 7.5, or 6.7 breast cancer deaths per 1000 women screened, respectively, vs no screening. Biennial DBT screening at age 40 to 74 years (vs no screening) was associated with a 30.0% breast cancer mortality reduction, 1376 false-positive recalls, and 14 overdiagnosed cases per 1000 women screened. Digital mammography screening benefits were similar to those for DBT but had more false-positive recalls. Annual screening increased benefits but resulted in more false-positive recalls and overdiagnosed cases. Benefit-to-harm ratios of continuing screening until age 79 years were similar or superior to stopping at age 74. In all strategies, women with higher-than-average breast cancer risk, higher breast density, and lower comorbidity level experienced greater screening benefits than other groups. Annual screening of Black women from age 40 to 49 years with biennial screening thereafter reduced breast cancer mortality disparities while maintaining similar benefit-to-harm trade-offs as for all women.

Conclusions   This modeling analysis suggests that biennial mammography screening starting at age 40 years reduces breast cancer mortality and increases life-years gained per mammogram. More intensive screening for women with greater risk of breast cancer diagnosis or death can maintain similar benefit-to-harm trade-offs and reduce mortality disparities.

Since 2009, the US Preventive Services Task Force (USPSTF) has recommended biennial mammography screening at ages 50 to 74 years, with clinical recommendations for discussion between patients and their primary care clinicians about individual risks and preferences for starting screening before age 50. 1 , 2 The USPSTF concluded in 2016 that the evidence was insufficient to assess the benefits and harms of digital breast tomosynthesis (DBT) as a primary screening method. In contrast to digital mammography, which uses a single radiograph projection per view, DBT involves multiple projections that are used to construct image slices, reducing tissue overlap. Screening facilities have been transitioning from digital mammography to DBT because of lower false-positive recall rates and higher cancer detection rates for DBT compared with digital mammography, 3 , 4 even though data do not show a reduction in rates of advanced cancer diagnosis. 5 , 6 Other changes since the 2016 recommendation include increasing breast cancer incidence among younger women and advances in treatment. 7 Importantly, Black and African American women (hereafter referred to as Black women) continue to experience higher breast cancer mortality than White women despite similar rates of mammography screening and lower (but steadily increasing) rates of breast cancer incidence. 8 The impact of these new data on the net benefit of screening mammography is unknown.

Population simulation models are a valuable tool for synthesizing evidence from observational and trial data to estimate the impact of different screening strategies. We used well-established Cancer Intervention and Surveillance Modeling Network (CISNET) models to estimate the benefits and harms of breast cancer screening strategies that varied by the ages to start and stop screening, modality, and interval for women overall and for Black women, including the impact of screening strategies on breast cancer mortality disparities for Black women. The results are provided to inform discussions about US breast cancer screening strategies by the USPSTF and other groups.

Six CISNET breast cancer models were used to estimate benefits and harms of mammography screening: Dana-Farber Cancer Institute (model D), Erasmus University Medical Center (model E), Georgetown Lombardi Comprehensive Cancer Center-Albert Einstein College of Medicine (model GE), University of Texas MD Anderson Cancer Center (model M), Stanford University (model S), and University of Wisconsin-Madison–Harvard Medical School (model W). These models were included in the 2 previous decision analyses conducted for the USPSTF. 9 , 10 Since the 2016 analysis, the models have incorporated several updates to inputs including screening performance characteristics for digital mammography and DBT, current breast cancer incidence trends, updated breast cancer stage and hormone receptor distributions, “real-world” treatment assignment and effects for women overall and for Black women. Detailed descriptions of each model are available elsewhere 11 - 17 and in an online technical report. 18 The University of Wisconsin Health Sciences institutional review board determined that this study was not human subjects research.

These analyses modeled a single cohort of US women with no personal history of breast cancer born in 1980 (ie, age 40 years in 2020) excluding women at the highest risk (ie, genetic susceptibility mutations or chest radiation at a young age). The models began with women at birth or age 20 or 25 years (since breast cancer is rare before this age; the initiation age varied by model) and accumulated all outcomes until death. The models evaluated women overall and Black women, and strata according to breast density, elevated risk, or comorbidity level. The term “women” was used while recognizing that not all individuals eligible for mammography screening self-identify as women. 19 Since model results are based on data for sex (ie, female) rather than gender identity, models apply to cisgender women and may not accurately reflect breast cancer risk for transgender men and nonbinary persons. This modeling analysis treated race as a social construct and aimed to provide evidence regarding the trade-offs of mammography screening strategies for self-identified Black women as an approach to reduce the observed disparities in breast cancer mortality. 20

All 6 models used a common set of data inputs for women overall and 4 models included race-specific inputs for Black women for breast cancer incidence, breast density, digital mammography and DBT performance, treatment assignment and efficacy, and causes of death other than breast cancer ( Table 1 ). 18 In addition, model-specific parameters were used to represent preclinical detectable times, lead-time, and age- and estrogen receptor (ER)/human epidermal growth factor receptor 2 (HER2)–specific stage distribution in screen-detected vs non–screen-detected cases on the basis of each model’s structure.

Five of the 6 models adapted an age-period-cohort modeling approach to estimate breast cancer incidence in the absence of screening among the overall and Black female population 21 , 22 ; model M used Surveillance, Epidemiology, and End Results (SEER) rates with a linear model based on rates in 1975 and calibrated over time. 12 Incidence was increased for subgroups with elevated risk or with greater breast density. Density was modeled by Breast Imaging Reporting and Data Systems (BI-RADS) categories: almost entirely fatty (“a”), scattered fibroglandular densities (“b”), heterogeneously dense (“c”), and extremely dense (“d”). 40 Density category was assigned at age 40 years and remained the same or decreased by 1 level at age 50 years and again at age 65, based on observed age-specific prevalence rates in the Breast Cancer Surveillance Consortium (BCSC). 18 Density was related to breast cancer risk and screening performance but was assumed not to affect molecular subtype or disease natural history (eg, tumor growth rates). Models incorporated screening sensitivity applied to each mammogram a woman received. Age-specific sensitivity values for digital mammography and DBT (hereafter referred to collectively as mammography) overall and by density category were also based on data from the BCSC. 18 Data from the BCSC reflect breast imaging in community practice across the US. 41
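
The density-assignment rule can be read as a small state machine: fixed at age 40, then unchanged or one level lower at ages 50 and 65. The sketch below is illustrative only; the drop probabilities are hypothetical placeholders, whereas the models calibrate these transitions to age-specific BCSC prevalence.

```python
import random

BIRADS = ["a", "b", "c", "d"]  # almost entirely fatty -> extremely dense

def density_trajectory(initial, p_drop_50, p_drop_65, rng=random):
    """Assign a BI-RADS density path per the rule described above: fixed at
    age 40, then the same or one level lower at age 50, and again at age 65.
    p_drop_50 and p_drop_65 are hypothetical transition probabilities."""
    level = BIRADS.index(initial)
    path = {40: initial}
    for age, p in ((50, p_drop_50), (65, p_drop_65)):
        if level > 0 and rng.random() < p:
            level -= 1
        path[age] = BIRADS[level]
    return path

# Example: a woman starting with heterogeneously dense ("c") breasts
print(density_trajectory("c", p_drop_50=0.2, p_drop_65=0.3))
```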

With treatment, screen detection at an earlier stage could lead to improved survival, reduced risk of death, and/or greater chance of cure with a smaller tumor size, depending on model. Treatment was assigned based on age, stage, and molecular subtype. To reflect real-world patterns of breast cancer care, the probability of receiving specific types of systemic treatment was based on data from the National Comprehensive Cancer Network as previously reported and, for newer therapies, expert opinion. 25 , 26 Efficacy of systemic therapy was based on the most recent published meta-analysis of clinical trials and, for newer therapies, clinical trial reports 28 , 29 ; treatment efficacy (in the setting of optimal stage–based and tumor subtype–based treatment) was assumed to be equal by race. 42 In contrast to efficacy, treatment effectiveness was modeled as lower for Black women due to multiple factors that may arise from systemic racism and lead to worse treatment quality (eg, delayed initiation, suboptimal regimens, dose reductions, and incomplete cycles). 43 - 46 Based on published data, treatment benefit was therefore reduced by 28% for ER-negative tumors and 56% for ER-positive tumors in models restricted to Black women. 27
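
To make the efficacy-vs-effectiveness distinction concrete, here is a minimal sketch of the adjustment as we read it. The function name and the example efficacy value are our own; the 28% and 56% reductions are the published values cited above.

```python
def real_world_benefit(efficacy, er_positive, black_cohort):
    """Scale trial-based treatment efficacy (proportional mortality reduction
    under optimal treatment) down to the modeled real-world effectiveness
    used in the Black-women models."""
    if not black_cohort:
        return efficacy
    # Published adjustment cited in the text: benefit reduced by 56% for
    # ER-positive tumors and 28% for ER-negative tumors.
    reduction = 0.56 if er_positive else 0.28
    return efficacy * (1.0 - reduction)

# Hypothetical efficacy of 0.30 (ie, a 30% mortality reduction):
print(real_world_benefit(0.30, er_positive=True, black_cohort=True))  # 0.132
```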

Probability of death from non–breast cancer causes was derived from Centers for Disease Control and Prevention (CDC) Wide-ranging Online Data for Epidemiologic Research (WONDER) and the Human Mortality Database; these values were replaced by comorbidity-specific values in subgroup analyses. 35 , 36

We compared model results for 36 mammography screening scenarios that varied by modality (digital mammography or DBT performed with concurrent or synthetic digital mammography), 47 - 52 starting age (40, 45, or 50 years), stopping age (74 or 79 years), and interval (annual, biennial, or hybrid). The 3 hybrid screening scenarios were (1) annual from ages 40 to 49 years, then biennial beginning at age 50; (2) annual from ages 45 to 54, then biennial beginning at age 55; and (3) annual from ages 45 to 49, then biennial beginning at age 50. The models assumed 100% adherence to screening.
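
For concreteness, the strategy space can be enumerated programmatically. The decomposition into 24 fixed-interval and 12 hybrid strategies below is our reading of the description above, not a breakdown stated by the authors.

```python
from itertools import product

MODALITIES = ("digital mammography", "DBT")
START_AGES = (40, 45, 50)
STOP_AGES = (74, 79)

# Fixed-interval strategies: 2 modalities x 3 starts x 2 stops x 2 intervals = 24
fixed = [(m, start, stop, interval)
         for m, start, stop, interval
         in product(MODALITIES, START_AGES, STOP_AGES, ("annual", "biennial"))]

# Hybrid strategies: annual over an initial age band, biennial thereafter;
# 2 modalities x 3 hybrid patterns x 2 stops = 12
HYBRID_ANNUAL_BANDS = ((40, 49), (45, 54), (45, 49))
hybrid = [(m, band, stop, "hybrid")
          for m, band, stop in product(MODALITIES, HYBRID_ANNUAL_BANDS, STOP_AGES)]

strategies = fixed + hybrid
assert len(strategies) == 36  # matches the 36 scenarios described above
```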

Benefits included percent reduction in breast cancer mortality, breast cancer deaths averted, and life-years gained (LYG) over the lifetimes of 1000 women screened compared with no screening. We also examined quality-adjusted life-years (QALYs) gained, which were calculated using age-specific utilities for women in the general population, 38 , 53 with disutilities applied for undergoing screening, diagnostic evaluation, and breast cancer treatment based on the stage at diagnosis (eTable 1 in the Supplement 37 , 39 ).
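
A toy version of the quality adjustment may help. All numbers below are hypothetical placeholders, not the age-specific utilities and disutilities actually used by the models (those appear in eTable 1 of the Supplement).

```python
def qalys(years_lived_by_age, utility_by_age, disutility_episodes):
    """Weight each life-year by an age-specific utility, then subtract
    short-lived decrements for screening, diagnostic workup, and treatment.

    years_lived_by_age:  {age: years lived at that age}
    utility_by_age:      {age: general-population utility weight, 0-1}
    disutility_episodes: iterable of (decrement, duration_in_years) tuples
    """
    qa = sum(y * utility_by_age[age] for age, y in years_lived_by_age.items())
    qa -= sum(dec * dur for dec, dur in disutility_episodes)
    return qa

# Hypothetical example: one year lived at age 50 (utility 0.85), with one
# false-positive workup costing a 0.10 decrement for about 5 weeks.
print(qalys({50: 1.0}, {50: 0.85}, [(0.10, 5 / 52)]))  # ~0.840
```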

Harms accumulated over the lifetime included recalls for additional imaging in women without cancer (hereafter referred to as false-positive recalls), benign results from biopsies recommended for findings on screening mammography (hereafter referred to as benign biopsies), and overdiagnosed cases of ductal carcinoma in situ (DCIS) and invasive breast cancer. Overdiagnosis was defined as the excess breast cancer cases diagnosed in the presence of screening that were not diagnosed in the absence of screening over the lifetime. The harm of overtreatment after overdiagnosis was captured by the treatment-related decrement in utility without a change in life expectancy.

Outcomes were tallied from age 40 years (the youngest age to start screening across strategies) to death and expressed per 1000 women. Results were summarized by the median and range across models for each outcome. We also generated efficiency frontiers by plotting the sequence of strategies that represented the largest incremental percent breast cancer mortality reduction (or LYG) per mammogram performed. Screening strategies on this frontier were considered the most efficient (ie, no alternative existed that provided equal or greater benefit with fewer screens or harms). Because a strategy with outcomes very similar to those of an efficient strategy may still be considered by decision-makers for other reasons (eg, consistency of starting and stopping ages across screening modalities), 54 we also identified “near-efficient” strategies, 55 defined as those within 5% of the value for screening biennially from ages 50 to 74 with DBT. Strategies that had more harms and/or fewer benefits were referred to as “inferior” to (inefficient or dominated by) other strategies.
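
As described, the frontier is the sequence of strategies offering the largest incremental benefit per additional mammogram, which amounts to the upper concave envelope of the (mammograms, benefit) points after dominated strategies are removed. A minimal sketch under that reading, with hypothetical numbers:

```python
def efficiency_frontier(strategies):
    """strategies: list of (name, mammograms_per_1000, benefit) tuples, where
    benefit is, eg, LYG per 1000 women vs no screening. Returns the
    strategies on the efficiency frontier."""
    pts = sorted(strategies, key=lambda s: (s[1], -s[2]))

    # Drop dominated strategies: anything with no more benefit than a
    # strategy using the same number of screens or fewer.
    nondominated, best = [], float("-inf")
    for s in pts:
        if s[2] > best:
            nondominated.append(s)
            best = s[2]

    # Keep the sequence whose incremental benefit per extra mammogram
    # decreases, ie, the upper concave hull.
    def slope(a, b):
        return (b[2] - a[2]) / (b[1] - a[1])

    hull = []
    for s in nondominated:
        while len(hull) >= 2 and slope(hull[-2], s) >= slope(hull[-2], hull[-1]):
            hull.pop()
        hull.append(s)
    return hull

# Hypothetical points: (strategy, mammograms/1000 women, LYG/1000 women)
frontier = efficiency_frontier([
    ("B50-74", 11_000, 120), ("B40-74", 16_000, 150),
    ("A40-74", 30_000, 180), ("A50-74", 22_000, 140),
])
print([name for name, *_ in frontier])  # ['B50-74', 'B40-74', 'A40-74']
```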

Analyses were repeated for Black women and for strata according to density category, elevated relative risk of breast cancer, or comorbidity level.

In sensitivity analyses, for comparison with previous modeling in 2009 and 2016, we repeated the analysis assuming all women with cancer received the most effective therapy (vs the real-world patterns used in the primary analyses).

The 6 models produced consistent results for the screening strategies (eTables 2 and 3 in the Supplement ). For instance, biennial screening with DBT from ages 40 to 74 years yielded a median 30.0% (range, 24.0%-33.7%) reduction in breast cancer mortality vs no screening, with 1376 (range, 1354-1384) false-positive recalls per 1000 women screened ( Table 2 ). Compared with biennial screening with DBT from ages 50 to 74 years, starting at age 40 averted 1.3 (range, 0.9-3.2) additional breast cancer deaths, with 503 (range, 493-506) additional false-positive recalls, 65 (range, 62-66) additional benign biopsies, and 2 (range, 0-4) more overdiagnosed cases per 1000 women screened ( Table 3 ).

Annual screening led to greater reductions in mortality than biennial strategies: screening annually from ages 40 to 74 years with DBT yielded a 37.0% median reduction (range, 33.6%-38.9%) ( Table 2 ) but resulted in more false-positive recalls, benign biopsies, and overdiagnosed cases.

With biennial screening from ages 40 to 74 years, digital mammography resulted in 1540 false-positive recalls and 210 benign biopsies per 1000 women screened vs 1376 and 201, respectively, with DBT ( Table 2 ). Across the 9 screening strategies stopping at age 74, use of DBT instead of digital mammography further decreased breast cancer mortality by approximately 1 percentage point, averted less than 1 additional breast cancer death per 1000 women, and reduced false-positive recalls by approximately 150 to 300 per 1000 women over their lifetimes (eTable 4 in the Supplement ).

Stopping screening at age 79 vs 74 years generally resulted in an additional 3- to 5-percentage-point mortality reduction, 1 additional breast cancer death averted, 64 to 172 more false-positive recalls per 1000 women, and 2 to 4 additional overdiagnosed cases, depending on strategy (eTable 5 in the Supplement ).

Among all possible strategies, 5 DBT screening strategies were identified as efficient or near-efficient for both percent mortality reduction and LYG in at least 5 of 6 models, including one with stopping age 74 years (biennial starting at age 50) and 4 with stopping age 79 (biennial starting at age 40; biennial starting at age 45; annual from ages 40 to 49 with biennial thereafter; and annual starting at age 40) ( Figure 1 ; eFigures 1 and 2 and eTable 6 in the Supplement ). Compared with screening biennially from ages 50 to 74 years, efficient strategies averted 1.7 to 4.3 more breast cancer deaths and yielded 41 to 168 more benign biopsies per 1000 women ( Figure 2 ). Five similar strategies were identified as efficient when limited to the 18 options with stopping age 74 (biennial starting at age 40, biennial starting at age 45, biennial starting at age 50, annual at ages 40 to 49 with biennial at ages 50 to 74, and annual at ages 40 to 74; eFigure 3 in the Supplement ).

Seven screening strategies were efficient or near-efficient for LYG or breast cancer mortality reduction among Black women ( Figure 1 ; eFigures 4 and 5 and eTable 7 in the Supplement ). Three strategies were efficient or near-efficient for both metrics among most models, including biennial from ages 40 to 79 years, biennial from ages 45 to 79, and annual from ages 40 to 79. Expanding biennial screening with DBT from ages 50 to 74 to ages 40 to 74 or 79 averted a median of 1.8 and 3.0 additional breast cancer deaths across models, respectively ( Figure 2 ).

Trade-offs between benefits and harms of different screening strategies for Black women followed patterns similar to those for all women combined (eTables 8-10 in the Supplement ). All strategies resulted in more breast cancer deaths averted and LYG for Black women compared with the same strategies for women overall. However, this gain in averted breast cancer deaths was insufficient to reduce breast cancer mortality disparities for Black women compared with women overall. Specifically, if Black women were screened with the same strategy as for women overall, breast cancer mortality for Black women would remain more than 40% greater than for women overall ( Table 4 ). Alternatively, if Black women were screened annually from ages 40 to 49 years with biennial screening from ages 50 to 79 and the overall population was screened biennially from ages 40 to 74, the ratio of breast cancer mortality rates for Black women vs women overall would be reduced from 1.44 (28.8/20.0) to 1.34 (26.8/20.0; a disparity reduction of 23%). Notably, Black women screened annually at ages 40 to 49 and biennially at ages 50 to 79 would experience fewer false-positives and mammograms per breast cancer death averted with greater life-years gained than women overall screened biennially at ages 40 to 74 (eTable 10 in the Supplement ).
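
The 23% figure follows from treating the excess of the mortality-rate ratio above 1.0 as the disparity. A quick check of the arithmetic quoted above, using the medians cited in the text:

```python
overall = 20.0        # breast cancer deaths per 1000, biennial ages 40-74
black_same = 28.8     # Black women screened with the same strategy
black_intense = 26.8  # Black women: annual 40-49, then biennial 50-79

r_before = black_same / overall      # 1.44
r_after = black_intense / overall    # 1.34
# Disparity is the excess of the ratio above parity (1.0)
reduction = (r_before - r_after) / (r_before - 1.0)
print(round(r_before, 2), round(r_after, 2), f"{reduction:.0%}")  # 1.44 1.34 23%
```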

Only 3 strategies were efficient in most models for women with dense breasts (BI-RADS category c and d), including biennial screening from ages 50 to 74 years, biennial screening from ages 40 to 79, and annual screening at ages 40 to 79 (eTable 11 in the Supplement ). Across all strategies efficient in at least 1 density category, breast cancer deaths averted using DBT for women with almost entirely fatty breasts ranged from 4.9 for biennial screening at ages 50 to 74 to 7.6 with annual screening at ages 40 to 79 and increased among women with extremely dense breasts from 8.3 to 14.6 (eTable 12 in the Supplement ).

Models showed greater benefits and fewer harms as breast cancer risk increased to 150% and 200% of average risk, with the same 3 screening strategies efficient for both elevated risk levels as for dense breasts (eTable 13 in the Supplement ). Incremental benefits of screening after age 74 years were reduced in the presence of severe comorbidities (eTable 14 in the Supplement ).

When all breast cancer cases received the most effective treatment for their cancer subtype and screening stopped at age 74 years, the percent reduction in breast cancer mortality increased as compared with the primary analysis, in which cases received treatment based on real-world treatment patterns (eTable 15 in the Supplement ).

This study used 6 well-established models to estimate the potential benefits and harms of different breast cancer screening strategies in the US. The models demonstrated that screening initiation at age 40 years had superior benefit-to-harm tradeoffs compared with no screening and other screening strategies. Benefits of DBT were comparable with those of digital mammography, but DBT resulted in fewer false-positive recalls and similar numbers of benign biopsies. Annual screening would lead to greater reductions in breast cancer mortality than biennial strategies but correspondingly more false-positive recalls and overdiagnosed cases. Since breast cancer death rates are higher for Black women, all screening strategies generated greater survival and mortality benefits for Black women than for women overall. However, to reduce racial disparities in breast cancer mortality in the absence of improved equity in the treatment setting, an increase in screening intensity such as annual screening of Black women from ages 40 to 49 would also be needed. Benefits for women with elevated risk or higher breast density were higher than for women overall, but the rankings of strategies were similar to those for women overall. In addition, several strategies with a stopping age of 79 were efficient. For women aged 75 to 79, comorbidities may be an important factor in decisions about when to cease breast cancer screening.

Compared with our 2016 analysis, 10 the predicted benefit-to-harm ratios with biennial strategies starting at age 40 or 45 years have modestly improved. Due to recent increases in breast cancer incidence among women aged 40-49 (154.1 to 160.5 per 100 000 from 1999 to 2018), life-years gained were notably higher for screening strategies that started at age 40 or 45. 7 , 56 Past analyses assumed optimal treatment selection; starting screening earlier partially compensated for less-than-optimal real-world treatment uptake in the current analysis. Also, with the growing evidence for lower false-positive recall rates with DBT than with digital mammography, 3 , 4 fewer harms were associated with earlier ages of screening initiation than occurred in prior analyses.

Prospective studies that include multiple rounds of breast cancer screening are needed to determine whether, compared with digital mammography, DBT results in a shift toward detecting breast cancer at earlier stages with a concomitant decrease in advanced stage. Initial studies suggest that DBT leads to increased detection of stage I invasive breast cancer as compared with digital mammography, although a reduction in advanced stage has not yet been demonstrated. 6 , 57 - 59 Screening benefit related to reductions in breast cancer deaths depends on the advantage of beginning treatment in earlier vs more advanced stages.

This analysis extended findings published in 2021 for a model (GE) that evaluated strategies for reducing breast cancer mortality disparities and improving health equity between Black and White women. 60 Our models are intended to generate findings for individuals who self-identify as Black, defining race as a social construct where the sociopolitical environment influences biological processes over the life course. 61 - 63 The current study showed that Black women gained more life-years per mammogram than women overall for each screening strategy. This was due in part to Black women having higher breast cancer mortality, especially among younger women, and gaining less benefit from intended therapy due to worse quality of care. If Black women obtained annual mammography from age 40 to 49 years with biennial screening afterward, mortality disparities were projected to decline while also achieving similar benefit-to-harm tradeoffs as biennial screening starting at age 40 for women overall. These results are similar to recently published findings by others, based on US mortality data, suggesting that more intensive screening could potentially reduce the Black/White disparity in breast cancer mortality. 64 If health care systems, policymakers, clinicians, and scientists work to fully eliminate disparities experienced by Black women, the balance of benefits and harms for screening could eventually change to the extent that more intensive screening strategies for Black women are no longer needed to increase equity. However, as described by Chapman et al, 60 until treatment disparities are substantially decreased or eliminated, screening Black women more intensively represents an immediate possible solution for improving equity. Optimal implementation of any strategy will also require improved equity in DBT access and timely diagnostic workup. 65

Our analysis considered breast cancer screening strategies using mammography, which has poorer performance in women with dense breasts compared with nondense breasts. Our models estimated that for any given mammography screening strategy, women with dense breasts had more deaths averted and greater life-years gained per mammogram than those with nondense breasts, but false-positive recall rates were higher. Evidence on the impact of supplemental screening with breast magnetic resonance imaging (MRI) or ultrasound for women with dense breasts is limited. 66 , 67 With federal regulations expanding breast density notification in September 2024 and the absence of consistent clinical guidelines for supplemental screening, 68 this is a critical area for future research and policymaking.

After accounting for recent trends in life expectancy (prior to the COVID-19 pandemic) and improvements in breast cancer therapies, strategies with screening until age 79 years were identified as efficient. This is consistent with a recent simulation study but contrasts with an emulated trial based on Medicare data showing that breast cancer mortality was not significantly reduced among women screened through age 79. 69 , 70 Current breast cancer screening trials in progress, including TMIST and WISDOM, are not recruiting women older than 74, and trials testing screening in older women are unlikely to be conducted. Evidence from other types of studies is needed to better understand outcomes of screening for older women.

Relative rankings of strategies were similar across the models. However, the models differ in meaningful ways in structure and assumptions. For example, some models incorporated a benefit from screening due to within-stage shift in detection and subsequent treatment (models E, S, and W) while others required a stage shift (models D and GE) or assigned greater benefit for screen-detected than clinically detected cases within each stage at detection (model M). Among the 5 models that included DCIS as well as invasive breast cancer, 3 models found that the overall number of overdiagnosed cases exceeded the number of breast cancer deaths averted for all screening strategies considered. Underlying incidence in the absence of screening and the proportion of tumors that were nonprogressive are unknown and unobservable; therefore, the different results across models with their respective assumptions about breast cancer natural history provide a range of possible estimates.

This research has many important strengths, including the collaboration of 6 independent modeling teams with consistent results and use of the most current data on incidence, screening performance, and modern, real-world therapy. Several caveats should also be considered in interpreting our results. First, the models portray the entire lifetime of women in the 1980 birth cohort and assume that future trends will continue along the same trajectories observed now. Second, we compared results for Black women with the overall female population, which leads to an underestimate of the impact of racism. This was a necessary simplification because these models did not produce estimates for other minoritized groups, non-Black women, or White women. In future research, models will be developed to examine results by racial and ethnic groups as well as interventions to improve health equity. Finally, some analyses were based on findings from fewer than 6 models for pragmatic reasons. In particular, some models were well positioned to examine analyses of racial disparities, 60 breast density, 71 or comorbidities 36 due to programming completed in previous projects.

Overall, this analysis suggests that biennial screening starting at age 40 or 45 years with digital mammography or DBT and continuing through age 74 or 79 provides greater gains in life-years and breast cancer mortality reduction per mammogram, and averts more deaths from breast cancer among Black women, than waiting to start screening at age 50. More intensive screening for populations of women with greater risk of breast cancer diagnosis or death can maintain similar benefit-to-harm trade-offs and reduce breast cancer mortality disparities. In the presence of recent changes in breast cancer incidence and improvements in screening technology and breast cancer therapy, mammography screening remains an important strategy to reduce breast cancer burden.

Accepted for Publication: November 9, 2023.

Published Online: April 30, 2024. doi:10.1001/jama.2023.24766

Corresponding Author: Amy Trentham-Dietz, PhD, MS, Department of Population Health Sciences and Carbone Cancer Center, School of Medicine and Public Health, University of Wisconsin-Madison, 610 Walnut St, WARF Room 307, Madison, WI 53726 ( [email protected] ).

Author Contributions: Drs Trentham-Dietz and Mandelblatt had full access to all the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.

Concept and design: Trentham-Dietz, Chapman, Lowry, Heckman-Stoddard, Kerlikowske, Sprague, Tosteson, Berry, Plevritis, X. Huang, de Koning, van Ravesteyn, Schechter, Miglioretti, Mandelblatt.

Acquisition, analysis, or interpretation of data: Trentham-Dietz, Chapman, Jayasekera, Lowry, Heckman-Stoddard, Hampton, Caswell-Jin, Gangnon, Ying Lu, H. Huang, Stein, Sun, Gil Quessep, Yang, Yifan Lu, Song, Munoz Medina, Li, Kurian, Kerlikowske, O'Meara, Sprague, Tosteson, Feuer, Berry, Plevritis, de Koning, van Ravesteyn, Lee, Alagoz, Schechter, Stout, Miglioretti, Mandelblatt.

Drafting of the manuscript: Trentham-Dietz, Chapman, Hampton, H. Huang, Mandelblatt.

Critical review of the manuscript for important intellectual content: Trentham-Dietz, Chapman, Jayasekera, Lowry, Heckman-Stoddard, Caswell-Jin, Gangnon, Ying Lu, Stein, Sun, Gil Quessep, Yang, Yifan Lu, Song, Munoz Medina, Li, Kurian, Kerlikowske, O'Meara, Sprague, Tosteson, Feuer, Berry, Plevritis, X. Huang, de Koning, van Ravesteyn, Lee, Alagoz, Schechter, Stout, Miglioretti, Mandelblatt.

Statistical analysis: Trentham-Dietz, Chapman, Jayasekera, Hampton, Gangnon, Ying Lu, H. Huang, Stein, Sun, Yang, Yifan Lu, Song, Munoz Medina, Li, Berry, Plevritis, X. Huang, Lee, Alagoz, Schechter, Stout, Miglioretti.

Obtained funding: Trentham-Dietz, Sprague, Berry, Plevritis, X. Huang, van Ravesteyn, Lee, Alagoz, Stout, Mandelblatt.

Administrative, technical, or material support: Trentham-Dietz, Jayasekera, Heckman-Stoddard, Gil Quessep, Kerlikowske, O'Meara, Tosteson, Feuer, Plevritis, X. Huang, Mandelblatt.

Supervision: Trentham-Dietz, Heckman-Stoddard, Plevritis, X. Huang, de Koning, van Ravesteyn, Lee, Alagoz, Stout, Miglioretti, Mandelblatt.

Conflict of Interest Disclosures: Dr Chapman reported receiving personal fees from ASCO Advantage Program/Daiichi Sankyo outside the submitted work. Dr Caswell-Jin reported receiving grants from Novartis, Effector Therapeutics, and QED Therapeutics outside the submitted work. Dr Li reported holding stock in Agenus Inc and Mink Therapeutics Inc outside the submitted work. Dr Berry reported receiving grants from MD Anderson Cancer Center of the University of Texas during the conduct of the study and being co-owner of Berry Consultants LLC, a company that designs bayesian adaptive clinical trials for pharmaceutical and medical device companies, National Institutes of Health (NIH) cooperative groups, patient advocacy groups, and international consortia, outside the submitted work. Dr Plevritis reported serving as a scientific advisor to Adela Biosciences. Dr X. Huang reported receiving grants from University of Texas MD Anderson Cancer Center during the conduct of the study. Dr van Ravesteyn reported receiving consulting fees (paid to institution) from Wickenstones outside the submitted work. Dr Alagoz reported receiving personal fees from Bristol Myers Squibb, Johnson & Johnson, and Exact Sciences and owning stock in Innovo Analytics LLC outside the submitted work. No other disclosures were reported.

Funding/Support: This report is based on research conducted by the CISNET Breast Cancer Working Group under National Cancer Institute grant U01CA253911. This research was also supported in part by National Cancer Institute (NCI) grants P30CA014520 and P01CA154292 and a Vilas Associate Award to Dr Trentham-Dietz from the University of Wisconsin-Madison. Dr Jayasekera was supported by the Division of Intramural Research at the National Institute on Minority Health and Health Disparities of the National Institutes of Health (NIH) and the NIH Distinguished Scholars Program. The Breast Cancer Surveillance Consortium ( http://www.bcsc-research.org/ ) and its data collection and data sharing activities are funded by the NCI (P01CA154292).

Role of the Funder/Sponsor: Investigators worked with USPSTF members, AHRQ staff, and the EPC review team to define the scope of the project and key questions to be addressed. AHRQ had no role in the conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript findings.

Disclaimer: The contents of this manuscript are solely the responsibility of the authors and do not necessarily represent the official views of the National Cancer Institute, the National Institute on Minority Health and Health Disparities, or the Veterans Affairs Administration. The opinions expressed in this document are those of the authors and do not reflect the official position of AHRQ, the US Department of Health and Human Services, or the National Cancer Institute.

Additional Contributions: We thank the following individuals for their contributions to this project: Tracy Wolff, MD, MPH (US Preventive Services Task Force Program); Howard Tracer, MD (AHRQ); Jillian Henderson, PhD, MPH, Elizabeth Webber, MS, and other members of the Kaiser Permanente Evidence-based Practice Center; members of the US Preventive Services Task Force who contributed to the decision analysis work plan; Doug Owens, MD, MS (Stanford University), Ya-Chen Tina Shih, PhD (MD Anderson Cancer Center), Jennifer Croswell, MD, MPH (Healthcare Delivery Research Program, National Cancer Institute), Sarah M. Temkin, MD (Office of Research on Women’s Health, National Institutes of Health), and 2 additional content experts for their review of the draft report; Julie McGregor, BA, Jennie Martin, MS, and Victoria Foster, MS, for project coordination at the University of Wisconsin-Madison; Linn Abraham, MS (Kaiser Permanente Washington), and Thomas Lawler, PhD (University of Wisconsin-Madison), for assistance with data; and Amy Knudsen, PhD (Massachusetts General Hospital), Claudia Seguin, BA (Massachusetts General Hospital), and Hannah Johnson, MPH (University of Wisconsin-Madison), for assistance with graphing. None of the persons acknowledged received any additional compensation for their contributions.

Additional Information: The small writing committee (Drs Trentham-Dietz, Chapman, Jayasekera, Lowry, Miglioretti, and Mandelblatt) wrote initial drafts and solicited input from the larger group.


US postured to lose without a Standing Combined Joint Task Force in INDOPACOM

In this op-ed, John D. Rosenberger discusses the need for a standing combined joint task force to face the China threat.

Photo caption: Adm. Samuel Paparo arrives for the U.S. Indo-Pacific Command change of command ceremony presided over by Secretary of Defense Lloyd J. Austin III on Joint Base Pearl Harbor Hickam, May 3, 2024. During the ceremony, Adm. Paparo assumed command from Adm. John Aquilino, who retired with 40 years of service in the Navy. (DoD Photo by U.S. Air Force Tech. Sgt. Jack Sanders)

Earlier this month, retired Army officer John D. Rosenberger wrote in Breaking Defense about the need for US Indo-Pacific Command to tackle the Combined Joint All-Domain Command and Control (CJADC2) challenge in the Pacific. In this companion piece, Rosenberger raises concerns about the other half of the equation: a Standing Combined Joint Task Force positioned specifically for the China threat.

If Beijing wants to execute a first strike and seize Taiwan, the ideal time might be right now, when INDOPACOM has no Standing Combined Joint Task Force (SCJTF) trained, poised, and ready to immediately employ the combat power of its Component Commands to defeat the PLA’s effort. Should this occur, INDOPACOM chief Adm. Samuel Paparo would have to form an ad hoc operational CJTF  that would take weeks to assemble and prepare for combat.

In other words, INDOPACOM has no means of orchestrating a synchronized Joint All Domain campaign to absorb the blow, then launch a devastating counter-offensive operation against the PLA — which has to be so effective it keeps China from considering the nuclear option. We stand postured to lose our first large-scale conventional battle against the PLA.

It doesn’t have to be this way. Paparo just took over at the command this month, and with a new commander should come a new way of doing business. Specifically, Paparo should look to create a permanent SCJTF, specifically postured for a China scenario.

Even Congress recognizes this imperative requirement. In the FY23 National Defense Authorization Act (NDAA), Congress directed the SECDEF to create a fully equipped and persistent operational Joint Task Force within INDOPACOM. A year later, in the FY24 NDAA, Congress directed the SECDEF to send Congress his implementation plan to bring this Combined Joint Task Force to life.

Two years later, in testimony before Congress last month, Secretary of Defense Lloyd Austin stated, “What I’ve asked my team to do is look at this and do an assessment to make sure that we get it right, and we understand the operational and cost issues associated with this.” In non-Pentagon speak: we’re going to study it some more and kick the can down the road.

Granted, DoD has announced it is standing up a new operational Joint Task Force-Micronesia, under the command of a two-star admiral, with responsibility for the homeland defense of Guam, the US Commonwealth of the Northern Mariana Islands, the Federated States of Micronesia, and the Republic of Palau, our partner nations along the first island chain. While this may satisfy political interests to secure basing rights where we need them, this is not the type of warfighting organization that Congress directed the SECDEF to form.

Although Congress chose the wrong Joint doctrinal term in its directive two years ago, its intent was clear. Congress directed the SECDEF to establish a SCJTF, led by a four-star commander, poised and ready to command, orchestrate, and synchronize the employment of the Combined Joint forces allocated to the INDOPACOM theater.

For clarity, doctrinal definitions are important. In Joint doctrine, “Operational JTFs are the most common type of JTF and are established in response to a SECDEF-approved military operation or crisis … a Standing JTF is a JTF originally established as an operational JTF, but that has an enduring mission that is projected to continue indefinitely .” Clearly, there is no military operation or crisis ongoing in the INDOPACOM AOR, as yet, therefore a SCJTF aligns with Congressional intent.

Let’s cut to the chase. If the PLA chose to launch a first strike to seize Taiwan, Paparo would be compelled to set up a CJTF on the fly, in the chaos of war, with no time for Crisis Action Planning. In other words, he would have to form a “pick-up” team, so to speak, cobbled together from personnel and equipment within his Component Commands, augmented by specialists from across all Services, Allies, and other government agencies.

The problem is that the US military’s ability to do this quickly is poor . Historically, this process has taken Combatant Commands months to complete. INDOPACOM and other Combatant Commands were never designed to be warfighting headquarters ; conducting large-scale, combat operations to protect America’s interests in the region was not what was envisioned when the COCOM structure stood up in 1986.

It’s difficult to overstate how serious this is. The business of mastering the complex operational-level warfighting tasks necessary to defeat the combined forces of the PLA cannot be achieved without a stable command and staff team that has trained together tirelessly to develop and sustain proficiency in these tasks. Strangers, however competent, thrown together to form an ad hoc operational CJTF won’t cut it. Conducting synchronized, Joint All-Domain Operations at the CJTF level is a tough, complex, uncompromising business involving the orchestration of thousands of moving pieces. It is no place for amateurs.

For example, the ability to collect, analyze, and disseminate actionable intelligence; to employ the full array of Joint sensors throughout the Joint Operations Area to find high-payoff targets; and to mass lethal and non-lethal effects against those targets at the right time and place, all while protecting forces and critical facilities from enemy counterstrikes, hinges on mastery of joint combined arms synchronization, the most complex of all operational and tactical warfighting tasks. The skill and ability to do this takes endless training and practice. Only continual Combined/Joint, multi-echelon training, and plenty of it, can transform the combat potential of a SCJTF into a dominant force that can withstand a first strike, rally quickly to seize the tactical and operational initiative, then transition to counter-offensive operations, the only means of achieving victory against the PLA.

Frustratingly, we see the opposite approach in INDOPACOM training exercises today. INDOPACOM typically tasks one of its component command headquarters to serve as a temporary operational JTF or CJTF for training exercises, e.g., Headquarters, US Army Pacific or Headquarters, III Marine Expeditionary Force. None of these or the other Component Command headquarters is fully staffed, trained, and equipped to serve as an operational CJTF under combat conditions. All are cobbled together for exercise purposes, all requiring substantial augmentation from other Component Commands, Allies, and national agencies to perform the role of a CJTF for a typical two-week training exercise. Moreover, they are given 18-24 months to prepare for each exercise.

This approach, forming temporary CJTFs using Component Command headquarters as a base, guarantees that few members of the temporary CJTF staff become experts in Joint warfare or gain knowledge of all the capabilities the other services bring to the fight. At the end of each exercise, the knowledge, skills, and abilities that CJTF commanders and their staffs gain during an exercise evaporate when they return to their previous assignments, where they re-focus on parochial service interests in theater. Whatever Joint warfighting skills they develop during training decay quickly over time. High staff personnel turnover disperses the rest. Year after year, the process repeats itself.

We need to come to our senses. If the PLA launches a first strike, there will be no time to stand up an operational CJTF in the confusion and chaos of battle, much less train the organization. Forming a SCJTF now, highly trained and poised to fight, is the only solution to the PLA’s most likely course of action. A SCJTF would put teeth in deterrence, forcing China to factor this formidable warfighting organization into its cost-benefit calculations.

Now is the time to reorganize and restructure INDOPACOM accordingly. And while it may feel less urgent, let’s do the same for EUCOM and CENTCOM as well. A restructure of Combatant Commands is long overdue. They were formed under strategic conditions that existed 38 years ago during the Cold War, specifically to fit the needs of their era. But that era is long gone.

Ignore the detractors within DoD who perpetually whine and argue the idea is inefficient and unaffordable. If we lose our first battle against the PLA, that will be a lame excuse.

Colonel (Ret.) John D. Rosenberger served 29 years in the US Army as an armored cavalry officer and 20 years as a defense contractor at the forefront of Army and Joint modernization. For two years, working directly for the SACEUR, he orchestrated the training of all NATO Combined Joint Task Forces in Europe in planning and executing large-scale conventional operations. The views expressed are those of the author and do not reflect the official position of JANUS Research Group, the Department of the Army, or the Department of Defense.

Title: SemEval-2024 Task 3: Multimodal Emotion Cause Analysis in Conversations

Abstract: The ability to understand emotions is an essential component of human-like artificial intelligence, as emotions greatly influence human cognition, decision making, and social interactions. In addition to emotion recognition in conversations, the task of identifying the potential causes behind an individual's emotional state in conversations is of great importance in many application scenarios. We organize SemEval-2024 Task 3, named Multimodal Emotion Cause Analysis in Conversations, which aims at extracting all pairs of emotions and their corresponding causes from conversations. Under different modality settings, it consists of two subtasks: Textual Emotion-Cause Pair Extraction in Conversations (TECPE) and Multimodal Emotion-Cause Pair Extraction in Conversations (MECPE). The shared task has attracted 143 registrations and 216 successful submissions. In this paper, we introduce the task, dataset, and evaluation settings, summarize the systems of the top teams, and discuss the findings of the participants.
Comments: 12 pages, 3 figures, 4 tables
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI); Multimedia (cs.MM)
Journal reference: Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024)
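
To make the task structure concrete, here is a minimal sketch, in Python, of pair-level scoring for emotion-cause pair extraction. It is an illustration under simplifying assumptions, not the shared task's official evaluation script, and the EmotionCausePair structure and pair_f1 function are hypothetical names: a predicted pair counts only if it exactly matches a gold pair on the utterance expressing the emotion, the emotion label, and the utterance containing the cause.

from dataclasses import dataclass

# Illustrative simplification of emotion-cause pair scoring; the official
# SemEval-2024 Task 3 evaluation settings are described in the paper.
@dataclass(frozen=True)
class EmotionCausePair:
    emotion_utt: int  # index of the utterance expressing the emotion
    emotion: str      # emotion label, e.g. "joy" or "anger"
    cause_utt: int    # index of the utterance containing the cause

def pair_f1(gold: set, pred: set) -> float:
    """Exact-match precision, recall, and F1 over emotion-cause pairs."""
    if not gold or not pred:
        return 0.0
    tp = len(gold & pred)  # pairs that are both annotated and predicted
    if tp == 0:
        return 0.0
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

# Toy conversation: utterance 1 ("I got the job!") causes the joy
# expressed in utterance 2; the system also predicts one spurious pair.
gold = {EmotionCausePair(2, "joy", 1)}
pred = {EmotionCausePair(2, "joy", 1), EmotionCausePair(2, "joy", 2)}
print(f"pair-level F1 = {pair_f1(gold, pred):.2f}")  # prints 0.67

Scoring at the pair level rewards a system only when it recovers the emotion and its cause together, which is what distinguishes this task from plain emotion recognition in conversations.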


