Adam Avramescu  00:12

Welcome to CELab, the Customer Education Lab, where we take customer education myths and misconceptions and throw them in the laundry chute, never to be seen again. I’m Adam Avramescu. I’m Dave Derington. And hey, Adam. It’s a special time of year, isn’t it? Know what time it is? Love is in the air.

Dave Derington  00:33

Huh? No, it’s... it’s July.

Adam Avramescu  00:39

What... what time of year is it? Is it national something? National Butterbean Day? No.

Dave Derington  00:48

National Junk Food Day. That’s the greatest time of the year ever. Oh, gosh, not that I need to diet as it is. Yeah, unfortunately, I think every time of year is National Junk Food Day for me. Especially now, living in Amsterdam, where so many of the regional foods are essentially just fried. Ooh, fried carbs. Like what? Give me an example. It’s really good to get to try bitterballen. No, can’t say that I have. What is it? It’s a Dutch regional, or national, you might say, food. It’s very typical here. You’d have it with a beer. And it’s sort of like a fried ball of, like, gravy, is the best way I could put it. We just skipped everything and went right to... that’s like fried Twinkies, or fried butter? Yeah. No.

Adam Avramescu  01:41

It’s quite nice with a beer, I will say that. You know, so are some fruits with mayonnaise and peanut sauce. I’m very European now, I promise. You know, this stuff’s good. So let’s celebrate that for National Junk Food Day. Dave, this is actually not where I thought you were going with this. I thought you were going to tell me that it was report season. It is report season! Yes, 2022. And today we’re actually going to do some minis here, so we’ll kick off the first of three. We’re going to talk about three different reports: one from TSIA, one from Thought Industries and IDC, and the last one from Skilljar. So, lots of stuff to talk about. And I just want to start with maybe a few thoughts about these reports. Because, look, I think it’s great that we’re multiple years into this project. We’ve been covering these reports on CELab pretty much every year that they’ve come out, for as long as we’ve been doing this podcast. In fact, I think we’ve seen some of these reports launch in the time we’ve been doing this show. And it’s really nice to see that these reports are now coming out more continuously; they’re becoming an annual thing. And for us, I think it’s also really interesting to keep track of what’s changed and what stays the same, as well as what each year brings in terms of the zeitgeist: what are some of the things the reports are focusing on, and where are there commonalities between what all three of them are measuring?

Dave Derington  03:15

Yeah, I like that. It’s kind of important. I was thinking about this when I was walking along last night: how we need this every year. We need to go, okay, what has changed? What’s improved? What’s different? How did COVID impact things, and what’s after COVID that we have to deal with? The world is changing. But if we’re going to kick this off, one of the things I’m really interested in is definitely seeing what’s new now that COVID is kind of trending down (you know, kind of, not really). Now we’re in 2022, and we’ve changed a lot of things across these three reports. Well, let’s just get into it, because there are things that surprised me, validated me, and that I thought were very interesting. Where do we want to start, Adam? We’re going to start with Skilljar to begin with. Yeah, let’s start with Skilljar. And I think before we dive into these reports, let’s maybe just spend a moment on how to read and analyze them, because I think we’re going to see more of them coming out over time. As these reports come out, how do you really know what you’re looking at? So, for instance, when we look at the reports, there are some things we look at to try to figure out what the reports are really telling us and how they’ve collected the information. Most valid reports are going to tell you what their data collection claims are: who they surveyed, how they surveyed them. So for example, in the Skilljar report, I think you’re going to see that this one is mostly collected from Skilljar’s customer base, and in fact they bring in their own product data to supplement the claims that they make. In the other reports you’ll see similar methodologies published, but I think it’s really important; you’ve got to look at that. You’ve got to look at, like, what’s this

Adam Avramescu  05:00

sample size? What’s the relevance of the people who are sampled? So for instance, if you have a customer education report out there where they didn’t sample customer education people, then maybe that’s skewing the data somehow. I don’t know, what else, Dave? What would you look at to determine how effective the report is? Categories? I like the fact that you said, hey, who actually is the audience? I know one of the reports straddles both the education services and the customer education audiences; actually, two of them do, I think: the TSIA one, and the Thought Industries and IDC one to some degree. You want to understand that. The questions I find myself asking are: what’s the n, then? How big is that sample size? How statistically relevant is it? How deep did they go? How did they conduct the survey? All those things are really good. I mean, I was a scientist; I look for reports that have good quantifiable data, rather than just making up an impression or not settling your hypothesis with fact. Yeah, I think ultimately what this comes down to is: when you read the report, do you trust the way that the data was collected? Does the information you’re reading make sense? If it doesn’t make sense, it’s definitely worth looking at how the data was collected. And even if it does make sense, it’s probably worth still looking at how the data was collected, just to make sure you’re not feeding into your confirmation bias. And again, one way to look at that, and this is going to tie in to the Skilljar report that we’re looking at today, is that you have to look at whether the claims are appropriate.
So for example, if an LMS is surveying its own customer base, and then presenting data based on that, that’s an appropriate claim. Versus, say, if you are collecting data from a random sample of customer education professionals and then trying to claim that this represents the entire state of the industry, where you really want to make sure that the actual people who participated are representative of the industry as a whole. So in a way, you’ve got to look at whether the claim being made is appropriate, and appropriately humble. So put your BS detector on full, is what you’re saying. Now, most of these are reputable agencies. Yeah, I think all the reports we’re going to cover today are good, and they all cite their methodologies, and we’ll dive into that. But I just wanted to start with a little bit of report intelligence there, because this is an important piece whenever you interpret anything like this, and we only really get report season to comment on it. So I thought we would start there. Yeah, well, let’s dive into it. As I said, the first report we’re going to bring up is the Skilljar 2022 Benchmarks and Trends for Customer Education. That was pretty recently published, a month or two ago, so as of today this is relevant and current. Well, let’s talk about who they were looking at. Again, I think this is one of the vendors you were talking about surveying their own sample, their own community. Yeah, this is primarily Skilljar talking to Skilljar’s customers, right? They call out upfront that this is heavily weighted towards tech companies, because that’s a big part of Skilljar’s customer base. And in terms of the size of the companies they were working with, mostly the tech companies were between 200 and 1,000 employees, but an additional 40% of the respondents,
still a very significant chunk, were on the lower end, 51 to 200, or on the higher end, 1,000 to 10,000. So you have this higher band, even though the main cluster was between 200 and 1,000. Again, if you’re listening to the results here and you’re like, I don’t know if that applies to me, there’s a question of: are you a tech company, roughly in that employee range? Because if you’re not, then your mileage may vary on the claims being made here. But it hits the customer education market very squarely; this is the circumstance we’re dealing with. And then when I was reading through it, I see it focuses on those of you in this audience: directors, managers, maybe some ICs here and there, who are by far their program owners, right? Because the next point, and this is cool, is that they’re looking at the audiences of training: customers, partners, and employees. So that would put it in a spectrum that’s also enablement-related or enablement-adjacent. Well, I think you called this out: this kind of reflects a potentially emerging trend towards the consolidation of education responsibilities and actually unifying those different audiences. And now, even if you are primarily a customer education team, you probably are taking on some partner training responsibilities. You probably are also training internal employees, even if you aren’t central L&D. I’ll give you an example of what my team at Personio does: we are also responsible for training our internal customer experience team. We’re not the central L&D team at Personio, but we do take a really strong role in educating our own teams about the product and also helping them with career development. So I do think Ted Blosser, when he was on the show (sorry to call out one LMS in another LMS’s report), called out this exact phenomenon, where we’re seeing more consolidation of customer, partner, and employee training.

Dave Derington  10:15

I think that’s good, and we see that throughout. So, other things. I guess we don’t have to go through all the details: they had a pretty even distribution between companies of, what did we say here, between five and ten, ten and twenty, and twenty-plus years, but that high concentration of teams, which we know because we’re in this relatively new or nascent field, were two to five years old, and then maybe a few companies at zero to two, brand new. Absolutely. And they also look at program optimization and business impact. This is the kind of stuff that we’re really interested in; you’re starting to get into that. I’m seeing that throughout these reports: impact, impact, impact, value. What do we deliver?

Adam Avramescu  10:58

Yeah, and this is an important concept too, because I feel like we all have different names for this. In my book, I call them program metrics and value metrics, I think; I can’t even remember, I haven’t read that book in a long time, but the point is that you have these different categories of metrics, right? You’ve got local metrics that help you figure out what to do with your programs and how you actually make decisions about your content, and then you’ve got these more global metrics that actually point to what effect your education programs are having. So actually, I like program optimization and business impact as ways to label these metrics. And they split it up right there: they’re looking at program optimization metrics as essentially data about your courses and your content, about how accounts are engaging with your content, and how your programs are performing locally. Whereas with business impact, they were looking at the tie to the customer lifecycle, the overall program success, the broader impact on revenue and leads. So you’re kind of looking outside your program at this point.

Dave Derington  11:59

Right. And that’s where it really starts, right?

Adam Avramescu  12:03

Yeah. And you called this out earlier, I think: a really big focus area here was around measuring outcomes. So what did the report have to say about the way that we’re measuring outcomes?

Dave Derington  12:13

All right, the first thing. And I think what we should do here is follow the report; I like that Skilljar bubbled up these main three highlights or takeaways, and we can use them to frame this up. So, outcomes: big, big, big deal. And that’s, again, something you’re going to see throughout all the reports. The most interesting thing that I saw in here is that only 41% of the respondents out there are tracking usage data, product usage data, revenue, and CSAT. Only 41%. That means almost 60% of folks out there are not. That’s a big deal.

Adam Avramescu  12:54

That is... that means that people are looking at essentially the metrics that are provided locally, maybe within the LMS, but they haven’t yet gotten to the point where they’re actually connecting that data to their broader business data.

Dave Derington  13:07

Right. And that’s telling us, you know, we always talk maturity models: where are you at on your journey for your customer education program? It’s still showing that a lot of folks, just by merit of how complex this is, are still in the earlier phases of that continuum. That’s one thing, and let’s maybe go through a few of the highlights; these are the things that I thought were cool. Apart from that 41% not tracking: 80% definitely use surveys for feedback, which is great. 59% are not using any kind of analytics tools. Wow. And the biggest gap that they’re seeing here, and that’s huge, is that most people see this as a focus for development; they want to start tracking against revenue. Like, 70%.

Adam Avramescu  13:56

Yeah, 70% want to, but most of us are not. And like you said, they want it. You’re pointing towards a couple of things here that I think we’re going to see as recurring themes throughout all of the reports. One, I think, is the idea that all programs, or most programs, I should say, now have an eye towards how to drive increased revenue. There are still a lot of programs that aren’t generating revenue, but increasingly you’re seeing people thinking about it. And I think that has something to do with the macroeconomic climate, which we didn’t really see addressed as much in these reports, I think based on the timing of when they were constructed, but the undercurrent is there. There are some thoughts there, probably, about how this relates to education budgets, which we see increasing through a couple of these reports; we’ll get to that in a moment. But these ideas are correlated, right? One is this idea that, okay, we’re targeting more of an impact on revenue, and then we’re also seeing a broader investment in customer education because we think we’re going to get that ROI. So that’s really encouraging. That’s a great sign that more programs are headed in that direction. Maybe a little alarming that so few programs are anywhere near that yet. But the part that I think is perhaps more alarming is just that you’re seeing so many programs out here who still have not collected, connected, and visualized their data, to borrow a term that we’ve used on the show.

Dave Derington  15:19

Yeah, we’ve used that a couple of times. I really think this is important. And if we keep the pace up... this is validating: look, folks know this is a problem. We all know this is a challenge; it’s very difficult. So in the coming year, I think we should see a lot more focus, a call to action on us and everybody in the community, on how we can make that easier. How is it that we connect these systems, people? Yep, it’s integrations: Salesforce and other things. But it’s beyond that; there are all kinds of things you can integrate, not just your CRM data. Yeah.

Adam Avramescu  15:55

Yeah. Now we’ll move on: they talked about training team infrastructure. Let’s talk a little bit about what they covered

Dave Derington  16:01

there. Yeah, yeah. Let’s start first with the numbers; I love talking from the numbers standpoint. One thing that I thought was interesting, and here’s your callout, was a whopping 75% of existing programs saw budget increases over the last two years for customer education. That’s huge. Absolutely. Oh, my gosh, yeah, it is.

Adam Avramescu  16:23

Huge. And we saw that one echoed, I believe, in the TSIA report as well. Actually, all three of them might have tracked this. So that’s really good to see. We also saw, I think, the continuing trend of customer success teams being the main sponsors of customer education. So we’ve got most of our teams in the Skilljar sample living within customer success. And actually, when we get to the TSIA report, we’ll point out maybe a key difference there, because there you’ve got education services teams living within services, I believe, more primarily. My point is there’s a little bit of a difference between the types of customer education teams being built in the tech companies that a company like Skilljar serves and the types of education services teams being built within a broader services organization that are members of TSIA. So again, it’s about looking at the sampling and trying to figure out: who am I more like?

Dave Derington  17:13

You know, that’s really interesting. And I don’t want to steal our thunder and pull away from the report that we’re going to read later, but I think it’s worth juxtaposing this Skilljar one, which is really all-in on customer ed, versus TSIA, which is more straddling and riding the hump, probably with more of a leaning towards veteran organizations that have had existing training functions for years, decades maybe.

Adam Avramescu  17:41

Yeah, and in fact, even if you look at the Skilljar data, I think they do a breakdown by company size. It’s like 100% of the smaller teams sit within CS, and then you start to look at the skew for larger and larger education teams at larger and larger companies, and in fact you start to see that more often those teams are actually sitting within services. And again, I think that reflects that same skew you talked about. Yeah. The other interesting one that skews based on company size is team size. They did a really interesting correlation here, where they were looking at: if you’re a customer education team of one to five, how many learners on average are you serving? If you’re a team of 11 to 25, how many customers are you serving? Etc. So there are some interesting correlations there, where teams of between one to five people have around 1,000 learners; at 11 to 25, it’s around 3,000 learners; and then these mondo customer education teams of 25-plus, now you’re talking about 5,000-plus learners. But those two curves are not really equal to each other. So it’s almost like, once you hit a critical mass of your customer education program and what it’s doing within the organization, it’s not as though you continue adding heads on your team to support increased customer size. It’s not actually that the number of learners is completely proportional to the number of people you have on your team. It’s that, I think, at a certain point you hit this critical mass where you realize, given the education programs you’re providing, you’re making more of an investment to provide probably higher-quality or more in-depth education to those customers. That’s what I think we’re seeing in those numbers.

Dave Derington  19:20

Yeah, I like that you bring that out because, oh, gosh, as I work more with maturity models and advise companies that are at different stages, the model changes. It represents the fact that customer education is a fluidly changing function. Early on, you might have one person or no people; later, you might have more people. Even later, you might have a limited set of people, but you’re focusing on a different kind of content, right? Or, gosh, I was just listening to the localization episode that you and Courtney had done, and the sophistication of the things you’re doing at that level could be much more complicated too. That’s interesting too. Talk about scaling your learning, scaling your reach based off of who you’re serving; there’s a correlation between the size and what you’re trying to achieve. You know, in the last part of this, I think we’ve got about five-ish minutes to round out this part, so let’s talk about training content and training formats, which was the third pillar of their report. The first callout that I want to make is, for investment over the next year, who’s planning what? Video is number one.

Adam Avramescu  20:30

I mean, that makes sense. We’re seeing video continue to be at the forefront of what customer education programs are doing to grow their programs. We know that, whether it’s live-action videos with a talking head, software demos, or tutorial videos, this is generally an effective way to drive learning. I’m thinking here of Mayer’s principles for multimedia learning, which actually put a lot of research behind why video, or even narrated elearning that’s similar to video, can provide a more effective learning experience and can lead to retention. Spoiler alert: I’m doing a solo episode on Mayer’s 12 principles soon, so look for that in your feed. But what else? I mean, why else, Dave, do we think that people are investing so much in video?

Dave Derington  21:18

In a lot of ways... I’m going to bring out a trend here, or something that I’m seeing. This is a little off script, but it’s relevant. One of the companies that we work with, Videate, has a product that allows you to quickly make videos from scripting. I talked with another company just yesterday, actually, and what I found remarkable in talking with them was that they have a video tooling engine built into their learning platform. It’s all of these different vectors of folks saying, hey, I can make this really quickly. Like TechSmith Camtasia putting a heavy investment into customer ed. Because why? It’s easy to create video, and the YouTube and TikTok economy has really shown us the way.

Adam Avramescu  21:59

You’re making a really good point. One, I think in general we’re more used to learning from videos than we ever have been. But maybe even more significantly, we’re seeing the entry costs of video starting to come down. It’s more accessible; it’s easier to make a video with more accessible software. You don’t have to be a pro in Adobe Premiere or something like that to be able to produce an accessible education video. Yeah, love that.

Dave Derington  22:24

But let’s riff off of what you said: accessible. There are some other things in this report, around content and training formats, that I thought were interesting. One is, hey, accessibility. We’re starting to talk on the podcast about localization and globalization, all these different things. Accessibility is a function that 45% of programs don’t think about. Again, different from localization, this is for disability and the like. That’s interesting. What do

Adam Avramescu  22:53

you think, actually, given the push? Well, there’s, I mean, there’s a huge push towards people now recognizing why accessibility is important. So often you start with awareness, and then you eventually lead to implementation. So I think the fact that 45% of programs don’t focus on accessibility means 55% of programs do. Is my math right? Yes,

Dave Derington  23:13

that’s how you do the math. That’s...

Adam Avramescu  23:16

55% of programs, more than half of programs, are. And I think, I don’t remember if they tracked this in the past, but that is most likely an increase, right? I don’t think we’re seeing this number go down over time. I think you’re seeing more awareness of accessibility and why it’s important. A lot of this is actually coming from the instructional design world, which I think is also really starting to respond to some of the lessons we’re learning from DEI advocates. And so what was previously, I think, an invisible problem is now becoming very visible, and people are just making a lot more noise about why this is important. So you’re starting to see programs pick this up more.

Dave Derington  23:56

Yeah, I think it’s super cool. Okay, let’s round this one out. A couple more really interesting things from my notes. There was like a 5x increase in usage of VILT, you know, virtual instructor-led training. 5x! That’s amazing, but that correlates to our experience of COVID and the fact that we now know we can do training like this.

Adam Avramescu  24:17

Yeah. So we’re seeing now the hangover of all the questions about COVID from the previous years. VILT is most likely here to stay, even as we see people starting to return to the classroom cautiously. People know now that VILT is more scalable, and it’s more cost-effective in many cases. So it’ll definitely stay in the mix for those who have adopted it.

Dave Derington  24:37

Yeah. And that’s great. That’s really validating, rewarding, because we know we’re moving to those formats, and there are some other reports we’ll talk about in a little bit here that play off of this. And I guess, if I had to pick anything out in the last few seconds we have on this topic, since we’re going to break it up a little bit, the last thing I thought was interesting is that there was a big discussion about certification. Was it 40... 47?

Adam Avramescu  24:57

Exactly where I would have gone. What else? Like, is this certification? Well, 47%

Dave Derington  25:01

of respondents use certification now, and I have a question about that terminology. And 23% are planning to invest in it. So that’s a significant portion of the market. My question being, Adam: is this truly certification, or is this credentialing?

Adam Avramescu  25:16

Good question. I’m guessing that it is credentialing, or, you know, lowercase-c certification, where they’re issuing certificates or badges based on completion of a course, but they’re not necessarily doing high-stakes certification. And I can say that knowing, I think, some of the programs that are in Skilljar’s customer base. Procore Academy, I think, is a good example of a really robust lowercase-c certification program, where none of it is proctored, but it is role-based, and you are getting credentials based on that. And I think if you look at some of the other attributes of the teams that Skilljar listed in their report, again going back to the audience, a lot of these are not programs that are necessarily at the point where creating a high-stakes, proctored, capital-C certification program actually makes sense for them. Because by and large, people are not getting hired, fired, or getting work based on learning their software or going through the learning that these companies are offering. It’s really more about, yeah, I think, providing a way for them to put a check on their knowledge, and then be able to share that with the world, which is still a really effective promotion strategy that I think we’ll see in some of the other reports. Yeah, which might be a segue into: which one are we covering in the next episode, Dave?

Dave Derington  26:30

Let’s shift to... I think we both had it listed; we took our own segments, our own notes. Let’s go to the Thought Industries and IDC report next.

Adam Avramescu  26:40

So okay, well, next week on CELab, or in the next two weeks on CELab: Thought Industries and IDC. So we’ll see you next time. Leave us a five-star review. Thanks, Alan Coda, for the theme music.
