Survey Report: Trust and Safety in AI Output for Learning

Peter Enestrom
CEO & Founder, Learnexus

Executive Summary

As AI becomes increasingly prevalent in L&D, the paramount concerns are the trustworthiness and safety of its outputs.

We surveyed over 1,000 learning professionals and found that, while only 7% of respondents completely trust AI-generated content for L&D purposes, over 60% have some level of trust.

A ‘trust, but verify’ approach seems to govern the use of AI for creative output. The data reflects this skepticism: 21% of respondents barely trust AI outputs and 11% outright distrust them.

60% of Learning Professionals somewhat trust AI output for L&D purposes

Learnexus April 2024 Survey on Trust & Safety in AI Output

The ‘trust, but verify’ approach signifies a pragmatic use of AI in L&D: it treats AI as an initial step, not the final word, underscoring the need for human expertise to ensure the content’s quality and applicability.

I think of it as a ‘trust, but verify’ situation. I tend to use AI tools and content as a jumping-off point rather than an end product, at least for now. I either use a tool to improve my existing content, or I use the tool to gather info from various sources before checking it all out myself. I don’t know that I’ll ever get to a point of ‘completely trust,’ but to be fair I sometimes fact-check real SMEs’ work, too.

Learning Professional

There is no doubt that AI-driven content still has shortcomings, particularly when it comes to complex, technical material, which underscores the critical role of human experts in ensuring precision and accuracy.

AI output is ‘okay’ for soft skills but super tough to use for anything related to hard skills and mechanical/technical training. I need a high degree of accuracy with my work and visual AI tools just don’t give the accuracy I need. And content-related AI uses like creating summaries or rewording content also doesn’t meet my need for accuracy.

Learning Professional

The key takeaway? While AI offers a helping hand, it’s clear that the human touch remains irreplaceable, especially when fine-tuning the nuanced details of learning content to meet high standards.

Trust Issues with AI in L&D

The skepticism towards AI in Learning & Development isn’t just about its novelty; it’s about its depth and reliability. Over 60% of professionals hold some trust in AI, but this trust is layered with caution, especially when it comes to creating content that’s both precise and meaningful.

It’s hit and miss with AI. I’ve put in controversial prompts to see what comes out. Sometimes it’s good, sometimes it’s not.

Learning Professional

This on-the-fence sentiment captures the unpredictable nature of AI-generated content, where the output can range from insightful to irrelevant.

AI is fine for a first draft, but someone knowledgeable must take over to refine and correct errors.

Learning Professional

Concerns around technology limitations highlight the necessity for human oversight, ensuring that the content not only starts strong but also finishes with clarity and accuracy.

Using AI is like having a knowledgeable high school intern. It’s helpful but struggles with context and can jumble information.

Learning Professional

This analogy to a high school intern aptly describes the AI experience in L&D: promising yet imperfect, useful but requiring guidance.

In essence, while AI can kickstart the creative process and handle some of the heavy lifting, it’s the seasoned L&D professionals who steer these initial outputs to their polished, final form. The journey from AI draft to finished learning experience runs through human expertise, highlighting the irreplaceable value of the human element in educational content creation.

Navigating the AI Landscape in L&D

AI’s ability to generate content quickly is undeniable, yet its effectiveness and accuracy often fall under scrutiny.

AI can spark ideas and assist in drafting, but it’s up to us to refine and validate.

Learning Professional

This underscores AI’s role as a brainstorming partner rather than a sole content creator and the need for human intervention to ensure content quality and relevance.

AI helps in summarizing and chunking information, saving time. However, it needs heavy editing to align with our tone and branding.

Learning Professional

This reflection points to AI’s efficiency in processing and organizing content, yet it also highlights the necessity for detailed human review to meet specific organizational standards.

AI-generated content for L&D is okay for general topics but struggles with detailed, technical material.

Learning Professional

This perspective sheds light on AI’s current limitations, particularly in handling content that requires deep expertise and precision.

Working with AI in L&D is really about knowing when to let it help and when to take over. It’s great for starting things off, but in the end, it’s the experienced professionals who make sure everything’s spot on.

Top Concerns with AI in L&D

When it comes to using AI for Learning & Development, professionals have clear concerns. Our poll shed some light on what’s keeping L&D folks up at night about AI content.

Roughly a quarter of respondents (24%) worry about the accuracy or level of detail in AI-generated material. If you’re teaching something, it’s got to be spot on, and AI doesn’t always get the details right.

As with all content created by the support from AI, it is critical to validate the information provided and for the content to only be treated as a DRAFT.

Learning Professional

The survey also showed that 26% of respondents are wary of potential biases and inaccuracies in AI-generated content.

It can’t be taken as is and will need to be verified further especially when trying to fill in the void left by your SME.

Learning Professional

The largest concern, expressed by 35% of respondents, is the loss of human insight. As one L&D professional said, “AI can’t generate effective content from start to finish… An experienced learning designer must make necessary revisions.” This emphasizes the irreplaceable value of human expertise in creating meaningful and effective learning experiences.

I have worked with it extensively and have a high level of trust but the reason for that is I can quickly validate its authenticity because of my own extensive experience in L&D.

Learning Professional

Legal concerns were highlighted by 15% of respondents, who worry about how AI-generated content navigates the complex waters of copyright law. Not everyone sees it as a risk: “It’s here where I feel AI-generated content can greatly reduce plagiarism issues as AI can curate content from multiple sources without replicating it,” one respondent noted, pointing to the nuanced legal landscape AI must traverse.

These quotes and concerns illustrate the nuanced perspective of L&D professionals: while recognizing AI’s potential, they remain alert to its limitations and risks, advocating for a balanced approach that leverages AI’s strengths without compromising content quality or legal integrity.

Conclusion: Finding Balance in AI Integration for L&D

While AI has a place in L&D, its use must be navigated with care and human wisdom.

AI is a tool, not a replacement. It can initiate and support the learning process, but the finishing touches, the deep understanding, and the alignment with organizational goals must come from human hands.

It’s hit and miss with AI… Sometimes it’s good, sometimes it’s not.

Learning Professional

AI’s inconsistencies mean that L&D professionals must adopt a ‘trust, but verify’ approach, reviewing AI outputs to ensure they meet the high-quality standards expected of educational content.

AI can’t generate effective content from start to finish… An experienced learning designer must make necessary revisions.

Learning Professional

A recurring theme across our reports on AI is the indispensable role of human expertise in guiding, correcting, and enhancing AI-generated content to ensure its effectiveness and relevance.

By leaning into the ‘trust, but verify’ approach and utilizing AI as a supportive tool rather than a standalone solution, L&D professionals can harness the potential of AI while safeguarding the standards that define effective learning.

Apply to join LHQ, the Community for Learning Leaders

We invite you to apply to join LHQ, the number one place for L&D leaders to learn from each other through expert-led roundtables, our active forum, and data-driven resources.

You’ll have access to reports like these on a weekly basis, must-know stats & data around AI and Learning & Development, and shared knowledge from your peers. Plus, it’s free to join. Apply to join here.