Learning and Development Trends: Your Questions Answered!

In Learning and Development (L&D), it feels like we are always trying to catch up with all there is to know in this fascinating field. Augmented reality! The latest in eLearning! Leadership and cultural transformation! Organizational learning! Constant upskilling! Rightskilling! Just-in-time training! The list goes on. So we decided to curate the latest in learning and development trends for all of you.

After surveying learning leaders and professionals in the L&D field and analyzing the data, we compiled our comprehensive Learning and Development Trends Report: L&D Challenges, Life-Centered Solutions: Transforming Business & Unlocking Human Potential for L&D professionals like you to reference as you gear up to make your next strategic decisions. (You’re welcome! We actually had a lot of fun making it. Who knew stats and data could be so fun?)

In May, we shared the findings of our 2022 L&D Trends Report in our webinar L&D Trends From The Real-World Challenge Perspective: Insights From Your Peers Working In The Craft. In true SweetRush style, this wasn’t your typical webinar. Instead of lecturing participants on the findings of the report, our panel of thought leaders and experts invited participants to bring their real-life challenges to discuss and answer live! Panelists included Andrei Hedstrom (Cofounder & CEO), Annie Hodson (Director of Client Solutions), Hernán Muñoz (Creative Director Team Lead), Judi Kling (Senior Learning Experience Designer), and Danielle Hart (Director of Marketing). 


Interactive Virtual Discussion

As we compiled the Trends Report, we noticed an overarching theme around building a culture of learning and, specifically: How might we foster a culture of learning that serves learners and helps them unlock their full potential? For the interactive portion of the webinar, we posed this question to the audience and asked them to share their unpopular opinions, what’s challenging for them, and what they are curious about. The audience’s responses guided the question-and-answer session that followed.


Webinar participants used virtual sticky notes to provide feedback on three topics: My Unpopular Opinion, What’s Challenging For Me, and I’m Curious About. The mural became a learning artifact that participants could reference long after the event ended. (Read more about what this means on page 34 of the Trends Report.)


Learning and Development Trends Q&A

We had a lively, engaged audience for the webinar who brought some tough questions about L&D trends to our panel. Here are the questions and insights they shared!

Q1. How do we motivate learners to take eLearning courses? 

Judi: It revolves around the culture of learning. We’ve seen our clients shift away from learning as an event and toward learning as continuous. This goes beyond motivating learners to take eLearning courses; it’s about creating experiences that support them holistically.

We can pair those eLearning courses with cross-departmental projects, mentoring, and/or social support groups.

This way, the eLearning courses become a piece of a bigger whole, and then there’s the motivation to go and see what’s in that eLearning course. 

Andrei: You need to understand the individual learner. And this is part of a larger question we see being asked: How do you fully appreciate the effort it takes to understand your learners and build in the right level of support and awareness for your team? Who’s designing and developing these experiences?

Part of that awareness is understanding that communications are needed to support learners in coming to the training in the first place. That’s why some of our client-partners work with us to develop branding and communications.

That’s another layer of dialing into those learners—understanding what’s important to them and spending some time and resources thinking about the sorts of things that will motivate them and get them to the training.

Q2. I don’t like my learning management system (LMS), and I am not able to switch. What should I do?

Judi: We’ve gotten really creative with LMSes and trying to get them to do things that you wouldn’t normally think would be possible. So we’ve had some social learning experiences that we’ve designed around training, and the groups go to the LMS and they click on a wrapper that looks very similar to a course. The LMS is tracking, but the learners can go outside of the LMS to do social learning. 

There are techniques that we’ve used to maximize our client-partners’ LMS without limiting our learning solutions.

Hernán: If we’re limited to an LMS, maybe it’s time to think outside the box. What other things are available to us? Maybe there are things that you can do outside of the LMS. An LMS could be a legacy item, but it’s a great opportunity to think about communication, strategy, marketing, and ways to maximize what you have outside of the LMS. 

Q3. Can we talk about screen fatigue?

Danielle: In the Trends Report, this concept of screen fatigue comes up a lot. At SweetRush, one of my favorite ways to learn is just to get together with a group of people who are really interested in the same topic and just chat about it.

Judi: We’ve had a lot of clients ask us about this over the past two years. It used to be that everybody wanted that “quick hit” in an LMS or eLearning course. 

Now, we are looking at pairing eLearning with experiences such as mentoring programs, cross-departmental projects, social support groups, and book clubs. 

We have a book club going here at SweetRush around Learning and Development topics. We pick a book, read it, come together, and learn together. It’s a social experience, and you’re learning. You’re doing what you need to do to advance your career, but it doesn’t feel like it. People need that right now.

Annie: With social learning, one of my favorite developments in recent years is the shift away from the idea that any kind of virtual instructor-led training (VILT) needs to be formal. It used to always have to be facilitated. It was: Here are the slides and facilitator guide. We have the activity, people come, and it’s this formal learning event.

I’ve seen a lot of our client-partners transitioning to this idea of a practice lab where people can come together and learn and not necessarily have a formal facilitator there. 

Q4. How do you get leadership on board with learning initiatives?

Andrei: It takes time, resources, and awareness. There’s a really interesting discussion going on right now across cultures and organizations. There’s a collective awareness that if you can affect an individual’s potential, you can affect the organization’s potential.

Annie: People in L&D say, “Leaders don’t even know what they don’t know sometimes,” or “Leaders think they know a lot about education or learning or L&D but they don’t.”

At SweetRush, curiosity is an important part of our work and how we operate, and curiosity is good for all of those leaders too! Promote design thinking mindsets. Begin from a place of empathy and encourage leaders by saying, “All of us have our blind spots.”

We all can learn from what’s going on around us. Before we launch a big initiative, have we stopped to ask the learners? Have we stopped to conduct a focus group? 

We have an activity that we do in our CoDesign workshops called The Talk Show where we talk to learners who are part of our target learning audience. We use this talk show format to ask them questions and get them sharing about what works, what doesn’t work, what they liked, what they wished for, and what was hard.

We let that empathy work ground the decisions that we make throughout the rest of the design process. 

Q5. How do we create something cutting-edge but still make sure it’s accessible for learners of all types?

Hernán: When we think about cutting-edge technology, we can still make sure accessibility is at the forefront so that it can be inclusive of all learners. There’s a big trend in accessibility in the gaming industry right now, for example, which our industry can look to. There’s a very popular game called Elden Ring that has about 70 features for accessibility. 

They are making games accessible in a variety of ways (color contrast, gameplay, mechanics of the controllers, audio settings). High-tech companies have heard the call to build more accessible products as well. 

No matter the technology we are using, we can make learning that is accessible through inclusive design. At SweetRush, we are leading the way and working with several other companies to create accessibility standards for the eLearning industry.

The following are some questions that appeared in the chat that the panelists did not have the chance to answer:

Q6. Can you tell me more about the Talk Show format?

Hernán: Talk Show is a technique we use to interview our learner audience so that we can find out how they like to learn.

We call it Talk Show because, when we do our in-person CoDesign workshops, we set up a little stage area that looks like a talk show set. The facilitator (aka the talk show host) greets the guests (our learner audience) on stage with a warm welcome and proceeds to interview them. As an interviewer, I rely on active listening, reiterating what I heard from the client, asking clarifying questions and following my own curiosity at certain points. 

Learn more about our Talk Show format in our Needs Analysis Playbook on page 38.

Q7. How long will it take for L&D professionals to become more of a stakeholder and have a seat at the table?

Andrei: It’s an incredible time to be in L&D. As L&D professionals, you’ve worked hard for decades to elevate the craft—once considered an afterthought—and now you’re being asked to create, contribute to, guide, and shape business strategy and transformation.

The pandemic may have fast-tracked this moment, but make no mistake, it was coming, supported by a growing body of evidence. Focusing on our people is the path to transform business and society for the better.

I’ve written an article about this in our Trends Report. Head to page 19 to read more. 

Q8. Can you talk more about adaptive learning organizations? 

Danielle: Adaptive learning organizations are able to quickly recognize change on the horizon and pivot to new strategies to support the needs of their business and the people they serve (leaders, employees, customers, etc.). They are collecting and analyzing the right data; they have frequent and direct communication with their stakeholders and audiences; and they make thoughtful (but not prolonged) decisions about restructuring, reorganizing, and reimagining their work.

During the pandemic, all learning organizations were forced to make rapid changes—the most obvious being the accelerated digital transformation of learning. Yet what changed and how smooth or painful that change was…every organization had its own unique experience. We’re fortunate to work with very large, global organizations, and it was impressive to see how a “big ship” can turn fast when there is a strong, unified culture and leadership. For example, many clients needed to shift resources when an area of the business was hit by the pandemic, and they collaborated with us on reskilling programs to help move that talent into other areas.

We also saw that our clients that have adaptive learning organizations were already assessing their digital learning portfolios by year two of the pandemic and looking to improve them. Continuous improvement and innovation are hallmarks of these organizations. I recommend Josh Bersin’s report on adaptive learning organizations to learn more!

Q9. What do you do when eLearning is not the employees’ preferred way of learning?

Judi: Here at SweetRush, we start with learner interviews to determine their feelings, needs, and wants around training. From there, we get what we call “audience insights.” These are statements that learners make, such as “I’m on the sales floor and don’t have time to sit down for eLearning.” 

Then, we think about how each insight will influence the learning solution. For cases where eLearning is not the preference, we brainstorm alternative learning experiences that fit their preferences. We think about questions such as: Where is our learner? In this place, how might they consume content? What does a learning experience that motivates them look like?

We’ve created just-in-time training where we place QR codes at key locations. The learner can scan the code and it brings up content in a variety of forms, such as a “how-to” document or a quick video. We’ve created podcasts for sales executives. This allows them to “train” while doing something else (driving to the client, sitting on an airplane, working out, and so on). We’ve also created sales onboarding where learners go on a physical scavenger hunt and collect customer service experiences from stores. 

Just about anything you can imagine in terms of an experience can be leveraged for learning. 

Thanks again to those of you who tuned in to the live webinar! If you’d like to watch the recorded webinar, you can do so here.

If you’re both excited by and a little overwhelmed with all the unfolding developments in L&D, you’re in great company. We all have questions, challenges, and insights. It’s how we approach them that makes the difference.

Reading our latest 2022 L&D Trends Report is a great starting place to orient yourself in this unique time in L&D. And reach out to start a conversation if you are finding yourself needing a partner for custom eLearning, L&D staffing, immersive technologies for learning, or transforming leaders and culture. We are here to help you with any of your people development and learning solutions.

Ask a Psychometrician, Part I: What’s a Psychometrician?


When I first learned that we had a psychometrician on our team, I pictured someone who spent their day at a workbench full of phrenology skulls, calipers, and bubbling beakers—maybe even with an adorable Muppet assistant named Beaker. 

Meep! I was intrigued. 

I caught up with Barbara Rowan, SweetRush’s resident psychometrician, to learn what a day in her life is really like. (Spoiler: It’s absolutely riveting—and caliper-free!)

Barbara Rowan, PhD, Psychometrician

TV: Help me out—what does a psychometrician actually do? Do you have an elevator speech you give at parties?

BR: Sure! A psychometrician is someone who’s an expert in assessment and measurement. We write tests, but we do so much more! We can also look at existing tests to make sure they’re reliable and valid—that is, that each item measures what we say it’s measuring.

We also review existing tests. It’s all about analyzing data that help us pinpoint exactly how well a test is functioning. For example, if your learners say your test questions are too hard, I can find which—if any—are too hard. Ideally, a test should have a balance of hard, easy, and average items. I can also tell you if your learners aren’t studying or, on the other hand, if your items are too easy. 

TV: I think I recognize the terms reliable and valid from my Statistics 101 days—do I have that right? 

BR: Reliability and validity are some of the first concepts we learn in statistics, but psychometrics takes them to a whole new depth. 

Let’s start with validity. Put simply, validity means that your assessment measures what it’s supposed to measure and you’re not accidentally measuring other skills and knowledge instead. 

TV: Can you share a bit about why that matters? 

BR: In short, moral and legal accountability. Assessments decide where—and whether—we’re admitted to schools or offered a job. 

For example, a class-action lawsuit was brought against the Educational Testing Service (ETS) by test takers who had received incorrect scores on an assessment some states used in teacher licensing decisions. ETS ended up paying $11.1 million to the plaintiffs. 

There have also been cases against high schools by learners who failed the exit exams required to graduate. These learners claim the test is not reliable or valid (we’ll talk more about reliability shortly!), which isn’t acceptable for such a high-stakes test. 

Several cases were brought against other high schools by students who are differently abled or non-native speakers of English. They felt that the schools’ assessments were biased against them. 

As you can see, test questions have a real impact on people’s lives and futures. It’s important for organizations to know that their assessments are measuring what they purport to measure. Those that build their assessments responsibly have performed all the right psychometric tests and documented the results in a technical manual. 

TV: This is so timely, with all of the discussion and reconsideration around standardized testing. I want to get to how to do things right—but I’m also morbidly curious about what it means for an assessment to measure something other than what it claims to measure. Can we delve into the dark side just for a moment? 

BR: Imagine that you are taking a literature test on the computer. You don’t do very well. 

So, why was your score lower than you expected? Perhaps it’s difficult for you to read a computer screen. Perhaps you have to scroll down the page to completely read the passage, so you can’t see the entire passage while answering the questions. Perhaps you don’t feel comfortable with using technology. Maybe English isn’t your first language. So, this literature test is not accurately measuring your ability to read a passage and answer questions. Instead, the test is highlighting the difficulties you have taking tests on computers or in the English language. 

Another example of this is when we are trying to measure one construct, but inadvertently measure another one. Imagine a math test with story problems. Not only are we measuring one’s math skills, but we could also inadvertently be measuring one’s reading skills. A poor test score could mean that the student doesn’t know how to perform the math calculations necessary OR it could mean that the reading level is too high for this particular learner.

So organizations that use tests to make any decision—especially high-stakes decisions—have a moral and legal obligation to ensure that their tests are fair and equitable for all test takers. At a minimum, organizations must perform the Big Three of psychometrics.

TV: I have a feeling you’re not referring to the auto industry when you say “the Big Three.” What does the Big Three mean to a psychometrician? 

BR: The Big Three are the top—you guessed it!—three indicators of whether an assessment is performing the way it should. They need to be measured with every assessment, every time. The good news is, they are easy to calculate with the right software.

The Big Three are: 

  1. Reliability 
  2. Item difficulty
  3. Item discrimination

Reliability means that learners get essentially the same score if they take the assessment more than once. Reliability also measures an assessment’s internal consistency, or how well each single item relates to a learner’s total score. There are several measures of reliability used in psychometrics, but Cronbach’s Alpha is the most widely used. Cronbach’s Alpha is a test-level statistic, but I also care about every individual item on the test. 
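To make Cronbach’s Alpha a little more concrete, here’s a minimal sketch in Python of how it’s computed from an item-response matrix. The data and function are purely illustrative (not SweetRush tooling), and a real analysis would use dedicated psychometric software.

```python
# Cronbach's alpha:
#   alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total scores))
# where k is the number of items on the test.

def cronbachs_alpha(scores):
    """scores: one row per learner, one 0/1 column per item."""
    k = len(scores[0])  # number of items

    def variance(values):  # population variance
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 4 learners x 3 items (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(cronbachs_alpha(responses))  # 0.75
```

Values closer to 1 indicate stronger internal consistency; 0.7 or higher is a common (though context-dependent) rule of thumb.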

Item difficulty is just as it sounds. This calculation gives us an indication of how difficult or how easy a question is. This is a statistic that we calculate for each question. We ultimately want the majority of our questions falling in the moderate level of difficulty.

Item discrimination indicates how well a question discriminates between learners who understand the content and learners who don’t. Ideally, we want questions that highly discriminate between those who do well on the test and those who do not. We definitely don’t want a question that low scorers are getting correct and high scorers are getting incorrect. That is a question that does not discriminate well.
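As an illustration (with invented data, and simplified relative to the statistics a psychometrician would actually run), item difficulty can be computed as the proportion of learners answering an item correctly, and a simple discrimination index as the difference between that proportion in the top-scoring and bottom-scoring halves of the group:

```python
# Item difficulty: proportion of learners who answered the item correctly
# (a higher value means an easier item).
# Item discrimination (upper-lower index): difficulty within the top-scoring
# group minus difficulty within the bottom-scoring group; a large positive
# value means high scorers do better on the item than low scorers.

def item_stats(scores, item):
    """scores: one row of 0/1 responses per learner; item: column index."""
    difficulty = sum(row[item] for row in scores) / len(scores)

    ranked = sorted(scores, key=sum, reverse=True)  # best total scores first
    half = len(ranked) // 2
    p_upper = sum(row[item] for row in ranked[:half]) / half
    p_lower = sum(row[item] for row in ranked[-half:]) / half
    return difficulty, p_upper - p_lower

# Hypothetical responses: 6 learners x 3 items
responses = [
    [1, 1, 1],
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(item_stats(responses, 0))  # (0.5, 1.0): moderate difficulty, discriminates well
```

An item that everyone gets right (or wrong) has a discrimination index of zero; it tells you nothing about who knows the content.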

In addition to the Big Three, we need to conduct validity studies.

There are so many types of validity! And most validity studies take several months or more to conduct. However, one of the quickest and easiest types of validity to establish is content validity. To establish content validity, I work with subject matter experts (SMEs) to review an assessment before administering it to the learner. Through this process, the SMEs review the questions to ensure that the content is correct and that all of the questions measure the construct, or subject, that we intended. To calculate other types of validity, learner sample size is critical. Ideally, we’d include between 300 and 500 learners—but we can work with a minimum of 200. Larger numbers of learners reduce error and give us more faith in the results.
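One widely used way to quantify SME judgments like these (a standard technique, not necessarily the exact process described above) is Lawshe’s content validity ratio, computed per test question:

```python
# Lawshe's content validity ratio (CVR) for a single test item:
#   CVR = (n_essential - N / 2) / (N / 2)
# where N is the number of SMEs on the panel and n_essential is the number
# who rated the item "essential". CVR ranges from -1 (no SME says essential)
# to +1 (every SME says essential).

def content_validity_ratio(n_essential, n_smes):
    return (n_essential - n_smes / 2) / (n_smes / 2)

# Hypothetical panel: 8 of 10 SMEs rate an item "essential"
print(content_validity_ratio(8, 10))  # 0.6
```

Items with low or negative CVRs are candidates for revision or removal before the assessment ever reaches learners.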

TV: Wow, that kind of deep study makes a lot of sense—especially for high-stakes assessments that affect people’s lives and futures. Is that kind of assessment situation the best case for a psychometrician? 

BR: Anytime an assessment is being written and anytime you need to vet an assessment you’ve already developed, you’ve got a case for a psychometrician. 

Do you think your questions are too hard? Too easy? Get hold of the data, and I can tell you.

As you’re building a course and deciding what your content needs to be, please bring in a psychometrician.

I need to partner with instructional designers (IDs) from the beginning, as they’re considering the learning objectives (LOs) for a solution. We need to ensure that their LOs can be measured—if they can’t, then our assessment results are meaningless.

For example, some LOs can’t be measured by the kinds of autograded assessments we see in many eLearning modules; they can only be measured by having learners create, write, or build something. If live assessment graders aren’t part of the project scope, we need to rethink the assessment and the LOs. 

Once we land on measurable LOs, the ID creates the learning journey and the content. I come back in when the assessment items need to be written. 

I think of my relationship with my ID friends as a system of checks and balances. I can’t do what they do, and they can’t do what I do—but we make one heck of a partnership!

TV: As a former ID, I appreciate that! And I hear you about the importance of measurable LOs. Can you share more about the risks of not involving a psychometrician in a learning solution design?

BR: A big part of these risks goes back to moral and legal accountability. Obviously, we want to build a sound assessment tool because it’s the right thing to do. But we also need to be sure we are protected in case a learner questions the results.

Bringing in a psychometrician early in the development process can get you answers to these key questions: 

  • How do you know your assessment is measuring what you say it is? If you’re using it to make decisions, you need to know that it’s performing well. 
  • How do you know that the decisions you make using your assessment data are the right decisions? You want to do your best work, and you want a testing instrument that has been properly vetted. 
  • How sure are you that your assessment is free of bias? You want a fair playing field for everyone taking your test—and you want to be able to show the work you’ve done to provide an equal opportunity for everyone. 

These questions aren’t a one-and-done, either: You should be reviewing your assessment every few years. A psychometrician can put your assessment questions to the test—and help you respond in case your assessment is questioned. 

Suppose I’m applying for a job, and an organization’s HR department administers a test. I feel the questions are biased, and I suspect that one group of test takers is performing much better than everyone else. If the organization hasn’t done its homework and studied the Big Three, I could very well be right. And if bias or a lack of reliability is discovered after the fact—or worse, if it was discovered but not addressed—the organization is liable. 

Even something as simple as test format can have an impact! My dissertation pitted paper and computer versions of the same STEM literacy exam against one another. I wanted to see if either format conveyed an advantage. Controlling for gender, age, ethnicity, and race, I found that the mean scores weren’t significantly different. (For those who speak stats: The t-test showed no significant difference between mean scores on the two delivery methods.) Even though the t-test was not significant, the two versions of the test were found to be tau equivalent. This means that the two versions were measuring the same construct, but on a different scale. To use these two test forms interchangeably, the scores would have to be rescaled to the same scale. Most people wouldn’t even think about the fact that the paper and computer versions could measure on different scales. I mean, each and every question is exactly the same across both versions. 

The lesson? Even two versions of the identical test don’t necessarily perform the same way or on the same scale when delivery methods differ. 
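For readers who want to see the mechanics behind that first check, here’s a minimal Python sketch of a two-sample t statistic comparing mean scores across two delivery formats. The scores below are invented for illustration (they are not the dissertation data), and a real study would also compute degrees of freedom and a p-value.

```python
import statistics

# Student's two-sample t statistic (equal variances assumed): the difference
# in mean scores divided by the standard error of that difference.

def two_sample_t(a, b):
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.fmean(a) - statistics.fmean(b)) / (
        pooled_var * (1 / na + 1 / nb)) ** 0.5

# Hypothetical scores on the same exam under two delivery methods
paper    = [78, 82, 75, 90, 84, 79]
computer = [76, 85, 74, 88, 83, 80]
print(round(two_sample_t(paper, computer), 3))  # close to zero: no mean difference detected
```

A t statistic near zero (relative to its critical value) is exactly the "no significant difference" result described above; tau equivalence is a separate, stronger comparison of what the two forms measure.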

TV: Wow. I am thoroughly cured of the illusion that I can write a solid assessment.

Let’s close on a lighter note. Can you tell me about an assessment you’re really proud of? 

BR: Absolutely! I was working with a client-partner at a global organization focused on improving community health. They were struggling with evaluating their new hires. These new hires were the people who went out to conduct workshops in local communities. But not all of the new hires who passed the evaluation actually did well in those communities. People who lacked the skills to do the job effectively were passing. 

This was a case for a psychometrician! I partnered with the organization to standardize the new hire assessment and ensure that everyone who passed was actually ready to go forth and serve the communities. 

I took a look at their old assessment, which consisted of a list of checkboxes. There was a lot of room for individual interpretation on these! I worked with the evaluation team to develop new rubrics using definable, observable criteria in three key areas of evaluation. 

When they took the first rubric into the field for testing, the team found that people actually failed. And in this case, that was a good thing! It meant that the rubric finally had a high level of discrimination—in other words, the people who didn’t have the skills to do the work didn’t pass. The evaluation team could follow up with people who didn’t pass to offer additional training—or initiate job fit conversations.

TV: That’s a great example! I love that it serves a cause—and helps good people do better work. Thanks so much for sharing more about what you do. 

Join us for Part II of Ask a Psychometrician, where Barbara will show us how to write great test questions—and make our assessments better, smarter, and fairer.

Got a Case for a Psychometrician? Here’s How to Tell 


Not sure if you’ve got a case for a psychometrician? You're not alone! 

Rodrigo Salazar-Kawer, Director of Talent Solutions


Rodrigo Salazar-Kawer, our Director of Talent Solutions, likens Barbara's value-add to the invention of the automobile. Before the Model T, people looking for speed were in the market for faster horses. They couldn't even conceive of something as fast as a car.

That’s the kind of power a psychometrician brings to your assessment! 

And if any of the following challenges sound familiar, you just might have a case for involving one: 

  • You're having trouble formulating assessment questions or quantifying data.

  • Your assessment or rubric is passing people who don't have the skills—and/or failing people who do.

  • Your assessment will be the basis for deciding high-stakes outcomes, such as admission or employment.

  • You're developing a certification program for a skill or job role.

  • You need to assess a learner's ability to explain, create, or do something.

  • Your instructional designers (IDs) or subject matter experts (SMEs) are writing assessment questions.

  • You need to reconsider an assessment that may be biased or outdated.

  • You aren't sure what's going on under the hood of your assessment and could use a second opinion. 


Like any test drive, there's no obligation—just an opportunity for Barbara to ask a lot of questions. It's a needs assessment...for your assessment. 

There's a range of models to choose from, too! Some client-partners are ready to bring in Barbara for their entire project from the outset—while others may prefer to work in phases. Phased work is a great option for clients who need to demonstrate results or secure budget incrementally. 

A Tale of Two Phases

A leading technology company believed its assessment questions were too easy and asked Barbara to review them. This client-partner noticed that too many learners were passing its exam. That was an immediate red flag! 

Barbara’s Phase I project was to examine the learning outcomes and content of the course. She discovered that most of the questions were well mapped to the learning outcomes—but the learning outcomes were too low-level. To make the questions more complex, her client-partner would need higher-level learning outcomes. 

That meant a lot of changes ahead. For Phase II, Barbara created new assessment questions based on the client-partner’s learning outcomes—and ensured that these outcomes were measurable by the autograded assessments they needed to use.

Backtracking is never fun! That’s why it’s best to bring a psychometrician in as you develop your learning outcomes and content blueprint. They’ll tell you what’s possible—and how it can be measured fairly and accurately. 

Want to chat about your assessment challenge? Get in touch.

Design Thinking Webinar: Your Questions Answered!

We’ve said it before: You, your learners, and your stakeholders are the experts on your learning needs. That’s why CoDesign℠, our learning design experience inspired by design thinking, invites all of these folks into the learning design process. It’s learning, co-designed by learners. (See what we did there?)

CoDesign isn’t just a great way to increase the efficiency and effectiveness of your learning solutions—it’s also a lot of fun! When Shakespeare said, “The play’s the thing,” he might well have been talking about a CoDesign session. But setting everyone up to play well requires great casting, cuing, and direction. 

In the webinar, Want To Create Learner-Centric Training Programs That Delight Your Stakeholders? Discover Design Thinking!, our star practitioners shared a detailed overview of the design thinking process and how it can help your team build a learning solution that shines. 

Our expert cast included:

  • Dani Howarth, Former Facilitation Team Lead, SweetRush
  • Andrei Bonilla, Creative Director Lead, SweetRush
  • John Cleave, Senior Learning Engineer, SweetRush
  • Judi Kling, Manager, Applied Learning Experience Design and Consulting, SweetRush
  • Guest star Jeanne Morris, Vice President of Education at the Society for Human Resource Management 

In this presentation, they shared a behind-the-scenes view on:

  • How design thinking contributes to better learning design outcomes
  • Ways that collaborative technology can support virtual learning co-creation
  • How collaborative, human-centered design enhances each element of learning design
  • How to assess your project’s readiness for a successful CoDesign

As always, our audience played a major part in the show, and their questions kept us thinking long after the final act. This encore performance covers everything we didn’t get to answer live—from use cases to ideation etiquette.

Design Thinking Process

Q1. Why do you recommend CoDesign to your client-partners?

Dani: There is always a benefit to hearing firsthand from learners! That is really the critical piece. There are many other benefits, including alignment on the program constraints and requirements, expedited design phase, and collaboration and camaraderie with our client-partners.

Getting all of the right people (including learners) in the same virtual room at the same time ensures that we understand the learner and business needs from the very beginning. Getting these needs right early helps to ensure that the rest of our work is successful. It also helps shave up to 34 business days from our project timelines!

Q2. How does CoDesign make learning solutions stronger? 

John: They’re built from data, not from assumptions. Numerous times, we’ve gone into a CoDesign experience with an initial vision for the training, only to have that vision completely change in the face of the realities of the situation we gather in our sessions. If we had gone the more traditional route, we would have missed the mark. But the CoDesign experience allowed us to leap beyond our assumptions to get to a better solution.

Judi: A CoDesign experience kicks off with upfront consideration and integration of instructional design, creative design, and technical requirements. This allows us to determine a much more realistic time estimate for developing the solution.

Q3. What makes a project a good fit for CoDesign?

John: CoDesign is most successful with teams that are prepared and eager to collaborate. It’s a good fit if your stakeholders are excited for the possibilities, your team is eager to dig in with ours, and your learners have the time and commitment to participate. 

The CoDesign experience is not a good fit if you come into it with a vision of the solution already locked into your mind. It works best if you let yourself remain open to the solutions that the experience will reveal. 

Q4. SweetRush quickly shifted from live CoDesign to virtual due to COVID. What did you gain in terms of supporting co-creation using collaborative technology?

John: Shifting to virtual helped broaden the pool of collaborators. It’s that much easier for you to bring in participants from across your organization and for us to loop in the subject matter experts (SMEs).

And often, a virtual space can be a safer space for collaboration. Participants who might be more reserved in face-to-face meetings feel freer to open up in a virtual setting. Also, getting together virtually gives us more flexibility in how we match people up for group work. We can group the participants who share a creative and collaborative style, making things proceed that much more smoothly. 

Q5. What if the “How Might We” (HMW) question is too big and complex?

The HMW is a lot like the audience suggestion in an improv performance: It ignites the team’s imagination and inspires them to generate ideas. (See below for some recent examples.) 

Dani: In most design thinking models, the HMW is usually formed during the second step of the process: Define. But in CoDesign, we frame our HMW question before we ever meet for that first collaborative session—because the reality is that our client-partners never arrive with a blank slate. They always have a business objective in mind for the engagement.

The broader the HMW, the more ideas that can be generated. After all, it’s meant to spark creative discussions! We tell our client-partners that it’s always adjustable, so it’s OK if we uncover things during the process and need to tweak the HMW.

[Image: Examples of "How might we" CoDesign questions]

Q6. Who should be invited to participate in CoDesign? 

Jeanne: One of the great things about the CoDesign process for us is that we’re reimagining one of the Society for Human Resource Management’s (SHRM’s) oldest and most popular premier products. So this design thinking process allowed us to take out our judgments and personal feelings. And we brought together this diverse group of folks in a virtual environment, where we could let go of our ownership over what was and instead think collaboratively about what could be. 

John: Jeanne, it sounds as if SHRM had the advantage of having the key decision-makers who were either directly involved with the project or close to it. When we’ve faced challenges is when we have a senior executive—somebody with a lot of opinions but not a lot of time—who flits into the fourth session or so and then changes all of the dynamics. They say, “No, that’s not right; we need to do this other thing.” 

Then, all of a sudden, all of this work that you've carefully constructed needs to pivot. To prevent those 11th-hour surprises, I'd recommend that if you have somebody who's going to have a very strong opinion, get them involved in the CoDesign process very early. Have them participate—or at least be kept in the loop—every step of the way.

Q7. What’s the best way to approach a team member who is shutting down others’ ideas in a Play, or Ideation, session?

Dani: It’s a buildup to the ideation session: Start with setting ground rules, including being open to new ideas. Also, the small group work in breakout rooms helps build trust and relationships. The small groups work so well because we consider each person’s creative style and personality before matching them up! In short, it takes a keen awareness of group dynamics—and purposeful steering by the facilitator throughout the CoDesign experience.

Q8. How can we help participants who are having a difficult time sharing their thoughts? 

Dani: The small breakout groups and purposeful groupings (see above, Q7) really help with this challenge! We make sure to include a SweetRush team member in each group, and the feedback we’ve received is that the client participants were most open when the groups were smaller. 

Virtual sessions also enable participants to ask questions or share their thoughts—for example, by raising a virtual hand or writing questions on a virtual sticky note. We even have an activity where everyone changes a sticky note to a different color if they want to know more about someone’s idea. 

Jeanne: Our people didn’t experience too many challenges. Everyone had fun—even the team members we expected to be shy. After our sessions, people were emailing me to say, “Thanks for including me! I had a great time, and I’m glad I’m part of this process.” It was all very positive for us.

[Image: Low- and high-fidelity CoDesign prototypes]

Q9. Can you share more about the prototypes you create? I’m curious about the level of fidelity; are they more like storyboards, outlines, or true prototypes?

Andrei: The fidelity of the prototypes changes from client to client and project to project. The important part is what you want to focus on. If you're still exploring your concept, then low-fidelity prototypes work best. But if everyone is clear on the concept, and it's time to start focusing on the details, high fidelity is the way to go. You can start having more fun by creating mock-ups and more visually appealing things. (See above for an insiders' look at our low- and high-fidelity prototypes.)

John: In some cases, you’ll be focusing on the interaction and the interface, so you’ll use low-fidelity wireframes to show what the learner is going to see. But there are other times when you don’t want to worry about the individual experiences. You want to worry about the collection of experiences. What is the sequence of experiences that will lead people to become lifelong learners? So, as Andrei said, you pick the right way to prototype based on the needs of the particular project.

We hope these habits and hacks will help your team get your own design thinking show on the road. And we promise that, as you continue to connect with your learners, your learning designs will be nothing less than heroic. (No, we're not being dramatic!)

Want to re-watch the first act? Check out the webinar recording.

Prefer the director’s cut? Flip to Chapter 4 of our eBook on human-centered life and learning.

Adaptive Learning: Your Questions Answered!

At our webinar, One Size Fits No One: Tailoring Learning Experiences to Individual Interests, Knowledge, and Skills with Adaptive Learning, John Cleave, SweetRush Senior Learning Engineer, Clare Dygert, Director of Learning Experience and Instructional Design, and Adrián Soto, Director of Immersive Technologies, discussed how to achieve maximum learner engagement without breaking the bank. 

The team also explored the benefits of adaptive learning and shared examples of how it:

  • Reduces the amount of time spent training
  • Deepens the impact of the training
  • Offers insightful data and analytics
  • Enhances the transfer of learning to the job 
  • Creates happier learners!

There was considerable interest in this topic, and the participants had some great questions. Here’s a quick summary along with the team’s responses. (The webinar is available for replay here.)

Q1. In my setting (healthcare), learners know the information (as evidenced in their daily work) but are required to complete annual training as proof of competency. Where does adaptive learning fit in when learners know the content but have to retake it every year? 

Clare: This is a tricky one because what you're talking about is compliance training. In this situation there isn't always a way around the requirement. There may be legal reasons why people have to be exposed to the same content or a liability issue for the organization that hires them. Generally speaking, adaptive learning is not a good solution for compliance training. That said, there is a good argument against this kind of "refresher"-style training, and it's called the "expertise reversal effect."[1]

The expertise reversal effect suggests that instructional methods that are good for low-knowledge learners will actually make things more difficult for high-knowledge learners. It's important, therefore, to understand the knowledge level of your target audience when designing adaptive learning solutions. You can then give your experienced and knowledgeable learners the opportunity to either go deeper into the content or to skip over the parts they already know.

John: For me, the single most difficult aspect of adaptive learning isn’t the technology or the instructional design. It’s getting stakeholders and leadership to realize that forcing people through content isn’t going to make them any smarter. When you can convince stakeholders that learning doesn’t happen simply because we show people content, you’ll open the door to a richer dialogue about alternative options—like adaptive learning. 

Q2. How do we build in remediation for learners who are falling behind?

John: A technique that works well for us is to create a core path through the subject matter that everyone needs to see. Then, along the way, present the learners with challenges designed to reveal their knowledge and skill level. These challenges shouldn’t be designed for learners to simply recall the content; instead they should be framed so that learners have to apply what they’ve learned or demonstrate that they understand the concept. If the learners aren’t successful in the challenges, you can direct them to additional content or activities within the learning experience where they can get more help and explore the subject more deeply. Once they achieve a level of mastery, they’ll be returned to the core path to continue the learning journey. 
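For readers who like to see the shape of this pattern, here's a rough sketch in Python. Everything in it (the topic names, the `assess` callback, the passing score) is invented for illustration; in practice this logic would live in your authoring tool or LMS, not in standalone code.

```python
# Illustrative sketch of a "core path with challenges" adaptive design:
# every learner walks the core path, and remediation content appears
# only when a challenge shows the topic hasn't been mastered yet.

CORE_PATH = [
    {"topic": "safety-basics", "remediation": ["safety-deep-dive", "safety-practice"]},
    {"topic": "equipment-setup", "remediation": ["setup-walkthrough"]},
]

PASSING_SCORE = 0.8  # mastery threshold (hypothetical)

def run_learning_journey(learner, assess):
    """Walk the core path, branching into remediation on failed challenges.

    `assess(learner, topic)` stands in for an application-level challenge
    and returns a score between 0.0 and 1.0.
    """
    journey = []
    for step in CORE_PATH:
        journey.append(step["topic"])
        score = assess(learner, step["topic"])
        while score < PASSING_SCORE:
            # Not mastered yet: send the learner to targeted extra content,
            # then retry the challenge. (A real system would cap retries.)
            journey.extend(step["remediation"])
            score = assess(learner, step["topic"])
    return journey
```

The key design point is that remediation is driven by demonstrated performance on the challenge, not by the learner's (or the designer's) guess about what they need.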

Clare: To build on this point, it’s important to think about the size of your courses or your units of instruction. The smaller the chunk size, the more flexibility you’ll have to build in adaptive learning. For example, if people are falling behind because they’re not mastering the instruction, you’ll need to identify where in the content they are getting stuck. If the content chunk is small, not only will it be easier to test and assess its efficacy, but it will also be easier to build in more targeted remediation.

Q3. Is it possible to create adaptive eLearning solutions without spending an unmanageable amount of time in development?

Adrián: From my experience working on virtual reality learning solutions, it’s manageable. That said, we are fortunate to have a full complement of team members on each of our projects, including project managers, learning designers, creative directors, developers, and engineers. If we know that we are going to be working on an adaptive learning solution—which does require more effort because we are creating more content—we map out the work in advance and scale up the team as needed. Planning is key. 

Clare: Regardless of your team size, here are some tips for making the development process more manageable.

  1. Repurpose existing assets. To cut down on production time, identify and then direct learners to content that has already been created by Learning and Development or that exists elsewhere within the organization. 
  2. Send learners offline. Another option is to direct people to complete a series of offline tasks such as interviewing someone in the organization to find out more about something. You could also point them to additional research they can complete independently—think articles or white papers. The golden rule here is to plan activities and assess external content to ensure they align with your objectives. 
  3. Chunk out the learning: My final tip is to break your content into smaller chunks and focus each chunk on a specific skill and related knowledge objective. Then, invite learners to practice the skill. If they aren’t able to perform successfully, you can direct them to the accompanying knowledge-based content before trying again. 

John: While there may be more work involved in creating adaptive learning solutions, you must look at the overall gains you achieve by having learners exposed to content that’s relevant to them without having to wade through stuff that isn’t. The gain is well worth the effort. 

Q4. How do you ensure people have mastered the skills if you give them choices of what to look at?

Clare: The short answer is that it really depends on what the skills are. For soft skills, you could create assessments to evaluate competency. Branching scenarios are one way to do this—learners are placed in a situation and have to choose the correct or most appropriate answer. Alternatively, you could take the assessment outside of the learning experience and use some form of observation in which learners demonstrate their competency to someone.

John: Keep in mind that forcing learners to look at stuff doesn’t ensure mastery. But one way that adaptive learning can help increase mastery is to give learners an assessment, and based on their performance, direct them to learnings and resources that will address their skill gaps. 

Q5. How do I create reliable pre-assessments to direct people to different learning paths? In my experience, people often overestimate their abilities. 

Clare: Pre-assessments are typically designed to get a pulse on what people already know and can do, so that we can put them on the right path. When learners overestimate their abilities, they can end up on the wrong path, and their learning experience will be affected. 

A best practice we use is to include questions about people's confidence in their knowledge or abilities. This additional metric helps to differentiate between people who know what they know and those who don't know what they don't know.

We’ve found that pairing these questions with some sort of test or assessment early in the learning experience offers an instant reality check. A person who identifies as being very confident in a task will get a surprise if they do badly in the assessment. This experience should make the learner more open to learning—and the learning itself becomes more sticky as a result.
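To make the idea concrete, here's a hypothetical sketch in Python of how a self-rated confidence question and an actual pre-assessment score might combine to choose a starting path. The path names and threshold are invented for illustration, not taken from any real assessment engine.

```python
# Illustrative routing logic: pair self-rated confidence with an
# actual pre-assessment score to catch overestimation early.

def route_learner(confidence, score, threshold=0.8):
    """Both `confidence` and `score` range from 0.0 to 1.0 (hypothetical scale)."""
    if score >= threshold:
        # The learner both claims and demonstrates mastery.
        return "accelerated"
    if confidence >= threshold:
        # High confidence but low performance: show the gap first,
        # so the reality check makes the learning stick.
        return "reality-check"
    # Accurate self-assessment of a gap: straight to the core path.
    return "core"
```

The "reality-check" branch is where the instant reality check described above happens: surfacing the mismatch between confidence and performance before the core instruction begins.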

Adrián: We see this in virtual reality-based solutions often. People grab the headset and controller and think they will pick it up straight away—they are highly confident in their abilities. When they discover that they can't control the device, their expectations—about their own competency and what they need to learn—shift.

Q6. You've shown the outcomes and benefits of adaptive learning—could you share an example of how you build a small activity with adaptive learning? Maybe a video to share later would help?

John: This is a great question! We love the idea of creating a video and will put it on our “to do” list! In the meantime, there are some simple things you can do to incorporate adaptive learning into your learning solutions: 

  • Build a core set of slides in your authoring tool of choice and then build a separate set that you can link to if learners want a deeper dive.
  • Make videos or content optional to view. Decide which pieces of content are required and which pieces can be skipped based on what you know about the learners and the desired outcomes. Most authoring tools allow you to choose whether to make the content mandatory—learners have to view or click through all of the content on-screen—or optional. 
  • Add optional links for learners who want to explore the content at a deeper level. This is a great example of adaptive learning—people will click the link if they want but know that they don’t have to.
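The required-versus-optional pattern from the list above can be sketched very simply. The content IDs here are invented for illustration; in an authoring tool the same decision is usually a per-object setting rather than code.

```python
# Illustrative sketch: tag each content item as required or optional,
# then derive the mandatory path and the opt-in deep dives from the tags.

CONTENT = [
    {"id": "intro-video", "required": True},
    {"id": "case-study-deep-dive", "required": False},
    {"id": "core-concepts", "required": True},
    {"id": "further-reading-links", "required": False},
]

def split_paths(content):
    """Return (required path, optional deep dives), preserving order."""
    required = [c["id"] for c in content if c["required"]]
    optional = [c["id"] for c in content if not c["required"]]
    return required, optional
```

However it's implemented, the design work is the same: deciding, based on your learners and outcomes, which pieces of content belong on the mandatory path and which can be left as choices.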

Q7: How might we use adaptive learning for second language training? And how could we integrate crowdsourced content?

Clare: If you’re going to crowdsource content, you’ll need to establish and communicate standards about what you’re looking for your crowd to do. You’ll also need to be explicit about the format you need the content in so that it will fit in seamlessly with your learning design. 

For the adaptive learning piece, a good place to start is to identify the types of errors that people commonly make when learning a second language—these mistakes may be specific to learning a language or to the language itself—and design remediation or additional practice activities for each one.

John: Duolingo and other language apps are already doing this. The programs notice where learners are struggling and then provide alternative practice activities to strengthen their skills in those areas of weakness. 

Q8. I work within a university on a vocational course, and we have to use the university’s LMS/CMS, which doesn’t seem to support adaptive learning. How do we develop adaptive learning experiences? Is there a way around it?

John: When you're working with a platform like WordPress, for example, my advice is to put in links—don't cram your content into one long string of HTML. Decide what content your learners might want to look at versus what they have to look at and then create links. Use a billboard approach to incentivize people to click on the links and get creative with your descriptions. Instead of the standard "Click here for more resources," tell them what they will get by clicking on the resource. For example, "Hey, want to learn how to use a lathe? Click here!" Give learners a clear idea about what to expect when they move to a different learning space.

Thanks again for your great questions! If you want to listen back to the webinar recording, you can find it here.

If you haven’t already done so, download your copy of John’s eBook Hats Off to Adaptive Learning and take a deeper dive into adaptive learning and its many benefits to learners and organizations alike. The book is also filled with adaptive learning techniques and examples to help you personalize your training programs.

Finally, if you’d like to geek out with the team and continue the adaptive learning conversation, or if you have a question you don’t see answered here, they would LOVE to hear from you. You can reach out to them here: John Cleave, Clare Dygert, and Adrián Soto.

Happy adapting!

[1] https://www.tandfonline.com/doi/abs/10.1207/S15326985EP3801_4

Use the Design Thinking Process to Create Learner-Centric Blended Learning and Deliver Value!

High-profile and high-impact training initiatives are a significant investment for your organization. Leverage the design thinking process to create learner-centric blended learning programs that deliver optimal ROI!

Every year, organizations spend billions of dollars on employee onboarding, sales training, and leadership training—all high-profile programs with a wide reach—and they’re expecting to get a significant return on their investment. No pressure, L&D!

But the truth is, pressure from leadership is no greater than the pressure we put on ourselves as learning professionals. We want our learning programs to be the right solution to address business needs and performance objectives—and to make an authentic, emotional connection with learners. 

The Trend of Blended Learning

We have more tools, technology, and techniques than ever before to accomplish these goals. And rather than thinking in silos about delivery modalities (e.g., instructor-led training versus eLearning), we’re getting more and more creative in how we combine them. 

Blended learning experiences—learning journeys—can engage learners in different ways, fire up cognition, fit into their busy workdays, and provide time for reflection. We can tailor a program to include context setting, demonstration, practice, self-reflection, social learning, coaching, and performance support. In other words, we can craft a holistic program that delivers value at every step in the learning process. 

Our goals are to speed adoption and application on the job, maximize retention, and make learners feel good about their own growth and development!

The question then becomes: With so many options available to create custom learning programs, how do you know you're designing the right solution for your organization, audience, and objectives? If you're not, you're wasting your time and money.

The Blind Leading the Blind

Typically, when a new learning solution design is needed, project teams come together, including stakeholders, L&D professionals, and SMEs. And along with their relevant expertise and positive intent, they bring something else: their bias. Everyone at the table thinks they know what the best solution is from their perspective.

Some rely on their experience being in the learner’s seat, but it may have been quite a while ago. “Back when I was a new manager in the ’90s…”

Some want to push and pioneer new technology. “Wouldn’t it be cool if we made this a mobile app?”

Everyone believes they know what learners want and need. But do they really?

[Image: Blended learning and the design thinking process]

Lauren Granahan, Director of Organizational Effectiveness and codeveloper of SweetRush CoDesign℠ (more on that soon), compares this typical solution design process to the children’s fable of the six blind men and the elephant:

Six blind men have an opportunity to meet an elephant. They’re excited because they’ve heard a lot about the creatures but have never encountered one in person. When they’re brought to the elephant, each man approaches from a different direction. The first man touches the trunk and says, “An elephant is like a giant snake.” The second man touches the body and says, “An elephant is like a wall.” The third touches the legs and says, “An elephant is like a tree.” And this continues, with each man forming a different conclusion. Each is correct and simultaneously wrong. 

How does this apply to learning solution design? Each of the project team members naturally wants to draw conclusions about what's right for the learners (and the organization), but each has only a fraction of the information.

At SweetRush we saw an opportunity to elevate our learning solution design approach—not only for blended learning solutions but all solutions. We started integrating design thinking in our process, and the gains have far exceeded our initial goal to “see the elephant.” 

Design thinking—and SweetRush’s version for learning experience design, which we call CoDesign—enables:

  • Deep understanding and empathy for the learner audience
  • Alignment on audience needs
  • Alignment on program outcomes, performance objectives, and mental model shifts
  • Alignment on program constraints
  • Alignment on program (or component) design
  • Alignment on implementation and adoption (change management)
  • Expedited design phases
  • Collaboration and camaraderie in the project team (which can help later during those inevitable “bumps in the road”)

Design thinking (CoDesign) turned out to be disruptive, eye-opening, and extraordinarily effective.

Let’s find out why.

Building Empathy for Your Audience: The First Step in Design Thinking

It’s true for product development, it’s true for advertising and marketing, and it’s true for learning: If you want to know how to reach and connect with your audience, you need to build empathy for them and understand them. That happens to be the first stage of design thinking.

[Graphic: The CoDesign design thinking process]

Each step of the process includes tailored activities that provoke deep thought and consideration as the team explores ideas and comes together to align on the best ones. 

The first step, which we call “Connect,” tends to be the most eye-opening in the design thinking process. We use a variety of activities to gain a deep understanding of learners. Some activities happen live in the design thinking session—for example, a “talk show” interview with learners. Some are a hybrid of prework and live experience. One of the activities we particularly love for redesigning programs is having learners write “love letters” and “breakup letters” to the existing program. Writing happens before CoDesign, and letters are read and discussed during the session. People get very creative with these letters!

Understanding Constraints for Blended Learning: Design Thinking’s Second Step

Our ultimate goal is to create a learning solution design that’s optimal for both the learners and the organization. So it’s important to understand program constraints, which we explore in this step, “Define.” These might include anything from budget and timeline to the learning environment to the time learners have available for training. Getting these on the wall (or on a virtual whiteboard) early means that as we move to the next steps of design and start dreaming up ideas, we have a tether to the reality of what works for the organization.

“This is also when we are exploring the mental model shifts that we want learners to make as a result of learning and the performance objectives,” Lauren says. “We ensure that we have appropriately framed the challenge (the ‘how might we’) for ideation as well.”

The Define step can also help us mitigate risk and anticipate change management needs related to the learning program. In an activity called “Kill the Program,” we brainstorm the reasons a program might fail—from adoption to implementation. We then determine what we can influence through our solution design or other methods, such as communications, manager support, systems or procedural changes, and so on.

Ideating and Prototyping a Blended Learning Program: The Next Steps of Design Thinking

We now have empathy for our learners and a better understanding of their needs, and we understand the organizational constraints. Now it’s time for the magic to happen! During these next steps, which we call “Play” and “Sketch,” we generate ideas about what our blended learning program should look and feel like. Learners are a critical part of the team generating these ideas.

It’s completely normal for everyone in the room to be experiencing different feelings as we step into this stage. Some will be excited and geared-up idea machines, while some may be a little uneasy about translating all the information gathered into something concrete. An experienced design thinking facilitator can help all parties show up and be the best contributors they can be—this takes careful preparation and the ability to “read the room” and adjust on the fly. 

The beauty of design thinking is that the process allows people to stretch their imaginations and then come back to gain consensus, and this happens multiple times. Each time, new ideas are added to the mix and then validated against the understanding and constraints gathered in the initial steps.

Testing and Validating with the Actual Learners: Wrapping Up the Design Thinking Process

The Sketch step results in a prototype. Now in our final "Align" step, that prototype gets tested. A prototype can be anything from a sketch on paper to a video storyboard to a functional mock-up of an eLearning course. The idea is to present something that the audience (learners) can react to and provide feedback on.

“It’s important to emphasize that the learners have been along for the ride throughout the CoDesign process. At this stage, they’re being leveraged as testers for the prototype we create,” Lauren says.

And in our experience, there will definitely be things to tweak. But overall, the design thinking steps set us up for success in designing a solution that learners connect with.

From Solution Design to Launching Your Blended Learning Program

The Align step validates the team’s work during CoDesign, and it solidifies buy-in and sets the team up with a clear path forward for blended learning design and development. The design thinking process overall can also yield great insights that can help with implementation and launch—for example, themes that resonate with learners that can be integrated into marketing communications. 

Greater empathy, level setting on constraints, learner-centric solutions, validation, and buy-in. Reduce blind spots and bias, and maximize your time and money! What new blended learning program or redesign do you have coming up that could benefit from design thinking?

Interested in learning more about design thinking for learning? Watch this webinar!

The Value of Conducting a Needs Analysis—Part 3: Existing Needs

Welcome to this final post in our needs analysis blog post series. In case you missed it, in part 1, I explained how a needs analysis can save lives (well, sort of). In part 2, I walked you through a strategic-level needs analysis and how to plan for the future. In this post, I am going to show how to handle existing training needs at the individual project level.

This type of analysis should be completed on every learning project, yet it often gets missed or overlooked. My theory: People confuse project-level analysis with the strategic-level analysis and assume it will be time-consuming and complex. In reality, it’s extremely easy and straightforward to do—in fact, I’ve even written a playbook on how to get it done.

Let’s take a closer look.

Project-Level Needs Analysis

This type of analysis happens on a much smaller scale—at the project level—and is usually triggered when L&D is approached by its business partners to help with specific training requests. In fact, as a vendor, this is how we begin most of our engagements with our clients. 

The objective for this analysis is simple: to design an effective learning solution that meets the needs of the business and the learner.

When to use this: Use it on every single training project. I’m serious. You should be doing this always—for real, no excuses. Trust me, you’ll thank me for it.

What happens: L&D partners with the project’s stakeholders to uncover the business needs, learner needs, and any constraints related to budget, scope, and time. 

Level of complexity: While the duration of the analysis may vary depending on the size and scope of the specific project, it remains a very low complexity activity.

What this looks like: This type of needs analysis involves talking to people, gathering data, and then analyzing and synthesizing your findings. Simple!

First, you’ll need to speak to stakeholders to uncover the business needs, identify any constraints, and define what success looks like. And then you’ll need to speak to your learner audience to find out what they already know and can do, what their work life is like, and when and how they like to learn. 

Of course, there are very specific questions that you’ll need to ask the stakeholders and learners—in my playbook, I list all of these for you—but it really is as simple as that. 

Finished output: At the end of the analysis, L&D prepares a report that lists the needs analysis findings and recommendations and may or may not also include a high-level design. 

So there you have it—my quick and easy guide to the value of conducting a needs analysis! I hope this blog series has been helpful. Wherever you are on your needs analysis journey, we have a number of resources to help.

Additional Needs Analysis Resources

For the majority of situations, you’ll need to do a project-level needs analysis. Our step-by-step guide, The Needs Analysis Playbook, will walk you through how to do this from start to finish.  

Want more information about how to talk to stakeholders about their needs? Our Needs Analysis Clinic webinar focuses on the six questions you should always ask to help uncover the business needs.

Got a general question about needs analysis? Check out our needs analysis Q&A where we answer your burning questions.

Don’t see your question listed? Connect with me and I’ll answer your question.

Finally, if you are interested in partnering with SweetRush on your next learning project, contact us and we’ll be happy to find out more about your needs!

Download Needs Analysis eBook

The Value of Conducting a Needs Analysis—Part 2: Plan for the Future

In part 1 of this blog post series, I explained why we invest time in a needs analysis—and how critical it is to the success of learning solutions. 

In the next two posts, I’m going to walk you through two different approaches to needs analysis: the first is more strategic and is focused on the future needs of the organization as a whole, and the second is more tactical and concentrates on existing needs. 

Strategic-Level Needs Analysis

This level of needs analysis is a proactive, forward-looking activity with one simple objective: to ready the workforce to meet future performance goals.

When to use this: Use it when you need to help the business identify and anticipate future training needs across an entire business group or organization.

What happens: L&D partners with senior leadership to review the company’s strategic goals—usually for the next three to five years depending on your organization’s cadence—to determine what knowledge, skills, and performance the workforce will need to meet the goals. 

Level of complexity: Depending on the size of the organization and the number and scope of the strategic goals, this type of needs analysis can range from simple (for small companies with few goals) to extremely complex (think large multinational companies with multiple goals).

What this looks like at a high level: While the specific methods for conducting the tasks listed here may vary, these are the fundamental steps that L&D will need to complete:

  • Partner with leaders and stakeholders to identify and prioritize the strategic business goals and associated desired business and performance outcomes. 
  • Identify the necessary knowledge, skills, and behaviors needed to achieve the goals. 
  • Conduct an assessment to benchmark the existing knowledge, skill, and performance levels of the organization (performance gap analysis).
  • Define the learning objectives and evaluation strategy for each business goal and desired outcome.
  • Inventory any existing training materials and resources to see what can be leveraged (content mapping and gap analysis). 
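The performance gap analysis in step three is easy to picture in code. Here’s a minimal, hypothetical sketch (the skill names and 1-5 proficiency scores are invented for illustration) that compares the proficiency each strategic goal requires against benchmark assessment results and ranks the gaps for prioritization:

```python
# Hypothetical performance gap analysis: compare required vs. assessed
# proficiency (1-5 scale) and rank the largest gaps first.

required = {"eCommerce": 4, "data analysis": 4, "negotiation": 3, "product knowledge": 5}
assessed = {"eCommerce": 2, "data analysis": 3, "negotiation": 3, "product knowledge": 4}

# Gap = required level minus benchmarked level (0 means no gap)
gaps = {skill: required[skill] - assessed.get(skill, 0) for skill in required}

# Keep only real gaps and sort them largest-first for the training roadmap
priorities = sorted(
    ((skill, gap) for skill, gap in gaps.items() if gap > 0),
    key=lambda item: item[1],
    reverse=True,
)

for skill, gap in priorities:
    print(f"{skill}: gap of {gap} level(s)")
```

The same shape scales up: swap in your real competency model and assessment data, and the ranked gaps feed directly into the solution blueprint and roadmap.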

Finished output: At the end of the analysis, L&D prepares a findings and recommendations report along with a detailed solution blueprint and roadmap to help the business visualize and prioritize the solution development.

In our recent Needs Analysis Clinic webinar, a participant asked: “Do you have any recommendations for needs analysis for a large and diverse audience? I need to start working on a company-wide strategy (400+ employees). We are looking to identify the top skills needed.”

In this instance, I recommended completing a strategic-level needs analysis. To see my detailed response, click here to view the Q&As from the session and scroll down to question number three.

While the strategic approach does take a while to do and can be complex, it only needs to be done once every three to five years or so, depending on how often your company updates its strategy. And the work you do now will pay dividends over time!

For a less complex look at needs analysis, head over to part 3 of this blog post series where I walk you through a project-level needs analysis.

For a comprehensive step-by-step guide to completing a needs analysis, download our eBook, The Needs Analysis Playbook.

The Value of Conducting a Needs Analysis—Part 1: The Why

Conducting a needs analysis is a critical step in the learning experience design process, yet it is often overlooked—or skipped over entirely. This is most likely due to a common misconception about it being a time-consuming, expensive, or overly complex undertaking. 

But designing and implementing a learning experience solution without first doing a needs analysis is a huge risk. After all, if you don’t know what the underlying problem or need is, how can you be sure that what you’re creating will solve it? 

It would be like a doctor prescribing treatment without having a clear understanding of who the patient is and what the symptoms are—not to mention details about the patient’s medical history, allergies, or other critical risk factors. Not only is it unlikely that the patient will get better, there’s also a pretty big chance that the doctor may end up doing more harm than good.

In this blog post series, I’ll help demystify the needs analysis process and demonstrate its value by examining why we do it as well as two different ways to approach it. 

Let’s begin where every good needs analysis begins—with the why.

Why do a needs analysis?

Like a doctor performing a series of diagnostic tests, we carry out a needs analysis to uncover what the underlying problem is, whom it affects, what impact it’s having on the individual—or, in L&D’s case, what impact it’s having on the individual, the team, and the business—and what the desired outcome, aka “success,” looks like. 

We then use our findings to design effective learning solutions but, perhaps more importantly, to determine if training is the right and only solution.

So, what do I mean by this? 

Let’s think about why we train people in the first place. We train to improve their knowledge, skills, and performance. We identify the gaps and fill them in. And when we get it right—when we develop effective training solutions—we should see performance improve, which, in turn, should impact business results.

But there are lots of things that can affect performance—many of which can fall outside the scope of training. These can include internal factors such as the learners’ mindsets, attitudes, and beliefs as well as external factors, such as an organization’s systems and tools, procedures and policies, culture, and even people. 

To revisit our doctor/patient analogy for a moment, there may be instances when medication alone may not be enough to ensure a full and successful recovery. There may be other factors impacting the patient’s health condition or ability to heal such as their lifestyle, diet, stress levels, exercise routine, or sleep habits. 

In fact, even the word “recovery” (success) might mean different things to different people. For some, it might mean regaining basic mobility after breaking a leg and being able to walk or drive again, whereas for others it might mean being able to compete in the Olympic Games. 

The doctor needs to take everything into consideration in order to devise and prescribe the most effective treatment plan. And the same is true for learning. 

So we do a needs analysis to find out what the problem is, whom it affects, what or who might be contributing to the performance problem, and what success looks like on individual and business levels. Once we have this information, we can determine whether training is the right and only solution before going on to design and develop an effective program. 

At SweetRush, we’ve devised a whole new needs analysis experience that sits at the intersection of learning experience design and design thinking. To find out more about our groundbreaking CoDesign service and whether it’s a good fit for you, get in touch and we’ll be happy to help.

Now we know the “why.” Let’s take a look at the “how.” 

If you’re interested in finding out more about strategic-level needs analysis and how it can help ready the workforce to meet future performance needs, go to part 2.

If you want to explore project-level needs analysis, what it entails—hint: it’s super easy!—and why you should include this level of analysis on every single training project, go to part 3.

If you’d like a deeper dive into why we do a needs analysis and how to conduct a project-level analysis, download our definitive guide, The Needs Analysis Playbook.

The Needs Analysis Clinic: Answers to your Questions

At our webinar, The Needs Analysis Clinic: Bring Your Learning Challenges and Get Expert Help, Emma Klosson, SweetRush Senior Instructional Designer/Learning Evangelist, and Tiffany Vojnovski, Instructional Designer/Learning Evangelist, discussed how to effectively partner with stakeholders and shared six questions you should always ask them to help set your learning solutions up for success. 

They also answered your learning challenges by explaining:

  • How to position L&D to the business and get buy-in on the needs analysis process 
  • How to complete a needs analysis when time and resources are stretched and stakeholders need solutions fast!
  • What to do with the data you gather during the needs analysis and how to use it to make solution design recommendations 

We want to thank the webinar participants for their excellent questions. As promised, here are Emma’s answers to the questions she didn’t have time to answer live!

Q1. How do you get your non-L&D stakeholders away from the mindset that training “completion” or 100% participation is the measurement of success? I realize it ties into a larger learning culture issue, but always have trouble explaining how and why to measure. Any tips?

This is a great question and one that I think many people will identify and struggle with. And you’re absolutely right in thinking this is a larger learning culture issue that requires a mindset shift. 

In the webinar, I talked about stepping into the shoes of a learning vendor and reflecting on how you position L&D to the business. I suggested ways you can help the business understand your services and what to expect when partnering together—for example, by documenting your processes and creating guidelines that describe how and when to engage with you. I truly believe that this is the best way to alter those mindsets and beliefs and embed the behaviors that you are looking for.

For your specific challenge, I recommend taking charge of the conversation around measurement by developing a project intake process that is focused on business and performance outcomes. Use the six stakeholder questions we shared (which you can also find in my Needs Analysis Playbook) to guide this process and the conversation that follows. 

When you can steer stakeholders toward talking about the specific behaviors and measurable or observable outcomes they expect to see, it will be much easier to lead them to a solution—and evaluation strategy—that is targeted to a specific audience’s needs. The best part: you won’t even need to have the conversation about how and why to measure, because your recommendation will address all their needs. 

Q2. Are there any suggestions for assessing whether the stakeholders you’re working with truly understand their audience? For example, I recently had a client who based their audience’s needs for a DEI training on an in-depth survey from 2019. I realized, belatedly, that the date should have been a red flag.

At SweetRush, we truly believe that there is no substitute for talking to the learners. Stakeholders can tell you who the target audience is and will have a sense of what their needs might be, but no one is better placed to speak to this than the learners themselves.

I recommend making room in your needs analysis process for a learner audience analysis. Take on board what the stakeholders tell you, and then ask to speak to a representative sample of the target audience as part of this process.

I’ve devoted an entire chapter of the Needs Analysis Playbook to just this topic. Head to Chapter 2, “The Learner Audience Analysis” (page 24), for my step-by-step guide to carrying out this critical task—find out who you should talk to and what information you should gather. Then head over to Chapter 4, “The Needs Analysis Report,” to find out how to synthesize your findings and present your recommendations to the stakeholders.

Q3. Do you have any recommendations for needs analysis for a large and diverse audience? I need to start working on a company-wide strategy (400+ employees). We are looking to identify the top skills needed.

This question is connected to the strategic-level needs analysis that we touched on during the webinar but didn’t go into great detail on.

If your goal is to identify the top skills needed, you’ll need to partner with your senior leadership team to discuss the company’s short- and long-term strategic goals. Focus on the measurable outcomes—for example, increase sales of a specific product/service by X%; or increase market share by X%; or grow a specific sales channel by X%. Note: the company may have several goals, and you’ll need to ask the leadership team to tell you what their priorities are—you can’t determine this yourself. 

Once you know what the business priorities are, you can begin partnering with the stakeholders for those goals to identify who the target audience is and what specific competencies and skills they’ll need—what will they need to know and be able to do to achieve the desired outcomes? 

Once you have your top skills identified, you can complete a benchmark assessment of the target audience’s current competency and skill levels. Identify where the gaps are and who the experts (SMEs) are—you’ll need the SMEs later on, when you design and develop the solution!

Next, conduct an inventory of your existing training materials and resources, to see what content already exists and where the gaps are—this is also known as curriculum- or content-mapping. 

Finally, use your findings to develop and then present a strategic roadmap that shows the leaders and stakeholders what needs to be done and how you recommend getting there.

We recently worked with a client who did something similar. Here’s their story: The company, a global retail brand, sells its products through its own-brand global retail stores and dot-com business. It also sells products through global distributors—both online and in brick and mortar stores. As part of its five-year strategy, the company wanted to grow its online sales business with digital retail partners. 

The stakeholder for this goal partnered with the company’s global sales L&D team to identify the target audience, the global wholesale team, and the skills and competencies they’d need to drive this growth—eCommerce.

The L&D team completed a content-mapping exercise and identified an internal thought partner. They then contacted external vendors, including SweetRush, to help develop content to fill in the gaps. 

Together with their partners, the L&D team went on to develop a comprehensive, three-tiered eCommerce training strategy for their target audience of 2,000+ global sales professionals.

Good luck! 

Q4. How much energy do I put into developing content for a temporary work system? The systems will be retired and replaced with a new business solution. In the interim, I am concerned that I may overdevelop.

The short answer to this question is: as little as possible! 

That said, it really depends on how crucial the system is to the business. I would recommend looking at critical tasks first and focusing on those. What are the areas that people struggle with most, and what impact does that have on productivity or other critical outcomes? Solve those problems first.

If you know which solution will be replacing your current one, you might want to identify what, if any, overlaps or similarities there might be between the two, to see whether content you develop now can be repurposed later on. This doesn’t even have to be system-related—think about the learner WIIFM (what’s in it for me) and any mindsets or beliefs that you might need to shift as a result of the change, and start there.

Q5. How do I do a more in-depth needs analysis for remote departments? Let’s say we need a needs analysis for the accounting department. Although I will be able to talk to the leader and learners, I won’t be able to observe them actually doing their job, to really comprehend their needs.

When it comes to assessing the needs of an entire department, you’ll need to partner with the leaders and stakeholders first, to get aligned on their performance goals and desired outcomes. Use the six stakeholder questions we shared to help uncover these. (You can also see the answer I gave to Q3, above.) Once you know what those goals are, you can turn your attention to the learners. 

While there will be some situations where there is no substitute for observation—procedural task-based training comes to mind here—you won’t always need to observe learners doing their job to fully understand their needs. You can do this by asking them targeted questions and by collaborating with subject matter experts (SMEs) who are currently performing the role or tasks. And I say this from experience! Having worked for a fully remote learning vendor for the past five years, I’ve relied heavily on collaboration with learners and SMEs to successfully uncover their needs.

If you do need to observe the learners and you can’t be with them in person, I recommend partnering with leadership to identify individuals who can help you complete a job task inventory (JTI).

Target both highly experienced and less experienced individuals to keep records of how they perform the specific task or duty that you need to develop training on. There are lots of great JTI templates available online. I like to include the following information in mine:

  • What is the name of the task? 
  • What are the specific steps—and substeps—within the task? 
  • What knowledge is required to complete the steps/substeps?
  • What tools or technology is required to complete the steps/substeps?
  • What other resources do you rely on to complete the steps/substeps? (Think people, performance support, etc.)
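Whatever template you choose, consistency across contributors is what makes the records comparable. Purely as an illustration, here’s how the fields in the list above might be captured in a simple structured record (the example task and all field values are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class JTIStep:
    """One step (or substep) recorded during a job task inventory."""
    description: str
    knowledge_required: str = ""
    tools_required: str = ""
    other_resources: str = ""  # people, performance support, etc.

@dataclass
class JTIRecord:
    """A single task, broken down into its recorded steps."""
    task_name: str
    steps: list = field(default_factory=list)

# Hypothetical entry from an accounting-department JTI
record = JTIRecord(task_name="Process a vendor invoice")
record.steps.append(JTIStep(
    description="Match the invoice against the purchase order",
    knowledge_required="Three-way matching policy",
    tools_required="ERP purchasing module",
    other_resources="AP team lead for exception handling",
))

print(f"{record.task_name}: {len(record.steps)} step(s) recorded")
```

A shared structure like this also makes it easy to spot incomplete records at a glance, since empty fields stand out.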

Regardless of the format you use, make sure that you teach the learners how to complete the JTI. Walk and talk them through the document and leave them with guidelines that illustrate what good looks like (WGLL) and what bad or not-so-good looks like (WBLL). 

Another great option is to use video. Assuming this is allowed, ask your target learners to record themselves completing their tasks. Ask them to talk through everything they are doing as they are doing it. This is a great option if the job requires them to make decisions or exercise judgment—you can ask the learner to explain where the decision points are and what logic or skill they are using to inform their choices.

Q6. How can I measure the impact in dollars of not training?

This is a tough question to answer, particularly without knowing where specifically you are hoping to add value. Assuming your focus is on productivity, you could perform a time and motion study. Track how much time it’s currently taking to perform a specific task and compare that to the time or manpower that will be saved by improving an aspect of that task. Don’t forget to factor in your estimated costs for designing, developing, and implementing the training—along with any costs associated with learner participation (time away from work) and running fees (venue hire or technology, for example)—when you are doing your calculations. 
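To make the arithmetic concrete, here’s a rough sketch of that time and motion calculation. Every figure below is invented for illustration; plug in your own data and cost categories:

```python
# Hypothetical time-and-motion estimate of the value of training one task.
employees = 50
task_minutes_now = 30      # current time per task
task_minutes_after = 20    # expected time per task after training
tasks_per_week = 10        # tasks per employee per week
hourly_cost = 40.0         # fully loaded labor cost per hour

# Benefit: time saved across the team, converted to dollars per year
minutes_saved_per_week = (task_minutes_now - task_minutes_after) * tasks_per_week * employees
annual_benefit = minutes_saved_per_week / 60 * hourly_cost * 48  # ~48 working weeks

# Costs: design/development plus learner time away from work
development_cost = 25_000.0
learner_hours = 2
participation_cost = employees * learner_hours * hourly_cost
total_cost = development_cost + participation_cost

# ROI expressed as a percentage of net benefits over costs
roi_percent = (annual_benefit - total_cost) / total_cost * 100
print(f"Annual benefit: ${annual_benefit:,.0f}, cost: ${total_cost:,.0f}, ROI: {roi_percent:.0f}%")
```

Note that this is a simplification; a full ROI study would also isolate the effects of training from other factors influencing performance.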

I recommend looking at the Phillips model of evaluation (Level 5 Evaluation: ROI) to help with this work—or to determine if your training initiative is a suitable candidate for this approach.

You might also want to consider the intangible costs of not doing training, such as the impact on employee engagement and even attrition rates. Partner with your HR team here to study exit interview data and employee engagement survey results, to discover if there is any evidence to support this impact.

Q7. What if a manager doesn’t really know what they should do because he/she is new? (For example, a small company with 25 employees.)

Since I’m not sure who asked this question, I’m going to answer it from two different points of view.

If you are an L&D practitioner or leader: My advice is to partner closely with the manager and educate them on the value you can provide. Find out about the company’s strategy and short- and long-term business goals, and work together to identify what skills the workforce will need to help meet them. Once you have a clear picture of the business’ and learners’ needs—and you have spoken to the learners and mapped out your constraints!—present your findings and recommendations in a way that will help the manager identify and prioritize the options for closing the performance gaps. 

If you are the manager: Grab your L&D partner and share your vision and strategy with them. What are your short- and long-term performance goals? Where do you see opportunities to upskill your team to help meet those goals? What are the performance gaps now? And what performance gaps might you anticipate as you look to the future? How will you know that you have achieved those goals? What does success look like to you?

If you don’t have an L&D person yet, consider partnering with a vendor or augmenting your team with temporary expert help. SweetRush can help with this. Head over to our Get in Touch page to share your needs with us.

Q8. Any recommendations on an LMS for small businesses?

Our clients typically have a preferred enterprise LMS, and our job is to make sure our courses work flawlessly within their system—a fun and sometimes challenging task! From time to time, we do consult with clients to choose an LMS as part of a larger project scope. 

There are lots of great resources you can use to compare and contrast different learning management systems. Our friends at eLearning Industry have a directory of LMS providers you can search—you can then filter the results based on your needs. 

Good luck! 

———————

Thanks again all for your great questions! It’s great to see that you are as passionate about needs analysis as I am! 

If you want to listen back to the webinar recording, you can do that here. If you haven’t already done so, download your copy of The Needs Analysis Playbook, our step-by-step guide to needs analysis. The book is packed with useful tips and practical advice for doing a stakeholder and learner audience analysis, identifying the project’s constraints, and preparing a needs analysis report.

Finally, if you’d like to stay in touch with me and to continue the needs analysis conversation, you can find me on LinkedIn.

Happy analyzing!
Emma

Press Play: 5 Tips for Writing Audio Scripts

Do you fancy yourself the Greta Gerwig or Bong Joon Ho of the eLearning world? Do you want to write blockbuster audio scripts that will make stars of your on-screen characters and have your audience reaching for popcorn and hoping for a sequel? Most importantly, do you want to be able to connect with your audience members and elicit a meaningful response from them? 

If you’re already visualizing your acceptance speech, keep reading for our top five tips for writing audio scripts.

From Blockbusters to Rotten Tomatoes: The Payoff and Pitfalls of Writing Audio Scripts 

Instructional designers are no strangers to writing. You write educational, instructive, and insightful content every day. You may even write training scripts for role-play activities, or speaker notes for facilitators. But scriptwriting requires a different skill set. You need to think more like a screenwriter. You need to set the tone. You need to win over your audience. You need to get them rooting for the main characters—better yet, your audience needs to be the hero of your story. And you need to do all of this in a way that feels authentic and relatable.

When you can do this, you’ll create rich learning experiences that enhance engagement, build empathy, elicit an emotional response, and motivate the learner to take action.

And if you miss the mark, if you create experiences that don’t feel authentic or relatable, you run the risk of distracting the members of your audience or, even worse, alienating, angering, frustrating, or offending them.

So how do you ensure that your audio script will be Certified Fresh and not deemed a Rotten Tomato?

Tip 1: Use the Right Voice

Unlike dialogue, which is specific to the actual words your characters will be speaking, voice has more to do with the general feeling you want to evoke. 

Voice is usually driven by your client, their brand, and how they talk to their customers. 

A simple way to find out more about your client and their voice is to visit their website. Make a beeline for their About Us and Our Story pages. Here, you’ll find out who your client is, where they came from, and what they’re about. More importantly, you’ll see how they like to present themselves to the world and how they talk to their customers.

As you’re reading, notice the language they use and how it makes you feel. Is the language formal or casual? Technical or simple? Does it feel inspiring, intellectual, playful? Pay attention to the voice, and try to use the same language and echo the same feeling when you’re writing audio scripts.

If in Doubt, Ask! 

Stakeholders may want to use a different voice for their training, so be sure to ask them what they’re looking for before you start writing your scripts. If you’re using the company voice, ask to see a copy of the brand or style guidelines. Most of these now include examples of the brand voice along with general branding guidelines. Review these carefully, and discuss anything you’re unclear on with the stakeholder.

Tip 2: Use the Right Tone

To home in on the tone, think about what you’re trying to do. Are you trying to educate and inspire your audience? Do you need to sell them on an idea or persuade them to do or try something new? Perhaps you need to warn them about the dangers of something?

The intention (or purpose) of your training will inform the tone you use when writing audio scripts. 

Note: It’s important to think about any disconnects between voice and tone at this stage. While it’s possible to approach more serious subjects with a lighter tone, it’s a skill that requires a practiced hand. Getting it wrong could be disastrous. Work with the stakeholders to get this part right. 

Tip 3: Create Character Personas

If your goal is to represent the learner on-screen, you’ll need to create dialogue that they recognize and can relate to. Use the words they use. Say the things they say. 

What’s the best way to find out what they say and how they say it? Talk to the learners! Find out who they are and what they’re like.

Once you have a sense of the people you’re writing for—and the people you’re writing about—you can develop character personas. The character persona is a brief statement that describes who the character is, what they do, what they find challenging, and what they might be thinking or feeling in times of calm or stress. The persona provides guidelines not only for the audio script writing but also for the voiceover actors and the design teams who are bringing the characters to life.

At SweetRush, we’re taking this one step further: our Instructional Designers and Creative Directors are spending more and more time working on characters’ backstories. They begin by sketching them out on a virtual whiteboard before introducing them to the illustrators and voice-over actors. Taking the time to complete this step allows the characters’ personalities to shine through.

We recommend taking this extra step if you’re writing characters for a long program or a series of programs—or if the characters play a large part in your story and content.

Writing Audio Scripts
Heather is one of six characters we developed for a new people manager qualification program with our partner SHRM. As the learning progressed, so, too, did the characters. Instructional Designers mapped out each character’s background and onward journey before writing the audio scripts.

 

Audio Script
The character of Malik has appeared in no fewer than 28 eLearning courses! Malik has four coworkers, and each, like him, has unique skills and experiences as well as some vulnerabilities. The SweetRush team used character personas not only to steer the scriptwriting but also to ensure attention to detail and consistency in nuances and character quirks.

Tip 4: Include Direction for the Voice-Over Actor 

The voice-over actors typically won’t see the entire eLearning script or know the entire story. They’ll see only their lines. It’s really important, therefore, to give them what they need to bring your characters to life in the way you want. 

Share the character personas and context with the actors. Provide the actors with background on your characters along with instructions on how to portray them. Here are a couple of examples:

Character: Maya
The setup: Maya is an experienced recruiter. She’s interviewing Jose for a potential promotion opportunity.
Tips for playing this character: Maya is a seasoned professional who knows exactly what to ask to gain the insights she needs from candidates. She’s very deliberate in her approach and uses her active listening skills to give candidates the time and space they need to respond. She wants candidates to feel at ease. Use a warm, open, friendly, and relaxed tone when playing Maya unless otherwise directed.

Character: Jose
The setup: Jose is a highly skilled key accounts manager. He recently applied for a promotion to team lead and will be interviewing with Maya.
Tips for playing this character: Jose is applying for an internal promotion. He’s usually quite confident and in control during work situations and is well liked by his peers, but he’s feeling anxious about the interview. He really wants this position, but he doesn’t interview often, and he’s worried he’ll say or do the wrong thing. Pay attention to the shifts in Jose’s tone as directed. He’ll start out anxious and flustered. As the interview proceeds, he’ll become more calm and relaxed.

Add direction and prompts to specify tone, inflection, and emotion. Is your character nervous, angry, or elated? Are they trying to inspire, educate, or warn the learner? Annotate your scripts with these directions so the actor can match the emotion and tone you wish to convey.

Example:

Maya [warm, friendly]: “Thanks for taking the time to meet with us today, Jose.”
Jose [confident, then flustered]: “Yeah, no problem. … I mean, thank you … for meeting with me. I’m excited about this.”

Provide instructions for the pronunciation of jargon or unusual words. Spell out jargon phonetically, or better yet, record an example for the actor to have as a reference.

Pop into the “booth” and give direction. If you have access to the actor, schedule time up front to brief them on the overall project and story, the script, and any nuances or special pronunciation that they should be aware of. 

Specify how numbers and acronyms should be pronounced. Should the number “123” be spoken as “one, two, three” or “one hundred twenty-three”? Is ACT pronounced like the word “act” or spelled out as “A, C, T”? Remove any doubt by adding this detail.

Read it aloud. Before you hand the script off to the production team or actor, read it aloud. Does the dialogue sound natural? Have you provided enough direction around tone and emotions? Hearing your script read aloud will help you catch anything you might have missed.

Tip 5: Get Inspired! 

Our final tip is all about getting inspired. Voice, tone, and authenticity are all key elements of great audio script writing. But that doesn’t mean you can’t play around with themes or draw inspiration from outside the workplace to build your stories. 

Here are some of our favorite places to draw inspiration from:

  • Television and movies: Pay attention to how the writers and actors build suspense in drama. Observe the dialogue and timing used in comedies.
  • Advertising: This is short-form persuasive writing at its best. Pay attention to how writers pack a punch while being economical with their words.
  • Podcasts and radio: Neither format relies on visual aids, so how do they gain and maintain your attention? Do they use different guests and voices? How do they make the experience more dynamic?
  • Novels and audiobooks: Get inspired by stories and narratives, and pay attention to how writers show versus tell. 
  • Articles and print media: Pay attention to the different tones used—are they informative? Instructive? Educational? How do they engage and inspire you to take action?

Remember Heather from the SHRM people manager qualification program? We took inspiration from outside the workplace to create a learning experience that centers on six friends who regularly meet up in a coffee shop to talk about life and work. 

Sound familiar? 

Wherever you find your inspiration, keep a record of the things you like. Carry a notebook or make audio recordings on your phone. Make notes or write prompts to remind you of what you liked and where you might use a similar technique in your next audio script.

Elevate Your Audio Script Writing

Well-written audio scripts can enhance learner engagement, build empathy, and elicit an emotional response from your audience. To nail the voice and tone and create authentic and realistic dialogue, engage both the stakeholders and the learner audience in the process. 

For more ideas for bringing your characters to life, check out our eBook Virtual Training—SweetRush Style: 5 Inspiring Case Studies for a Learner-Centered Approach. It’s packed with real-world examples and tips and tricks from our experts, and you’re bound to find something in it to inspire your next audio script!