
Why Communicators Need to Take the Lead on AI

Transformation | Insights
Sharn Kleiss , Employee Experience Strategy Lead
15 May, 2025 · 5-minute read

When we were developing the global State of the Sector 2023/24 survey, the viral impact of ChatGPT was still fresh. It was the shiny new tool on everyone's radar, but many communicators were still in the early stages of exploring what it meant for our work and how it could fit into our strategies. The 2023/24 survey reflected this. Opinions about AI were split—some saw it as an exciting tool, while others approached it with caution.

In 2025, things feel different.

AI tools have evolved, become more accessible, and embedded themselves more deeply into workflows across industries. So, in our 2024/25 survey we got more specific, asking communicators not how they felt about AI but how willing they were to use it for their daily tasks.

The benefits are clear, but we're still hesitant to adopt AI

The number one challenge for employee communicators this year? Time and capacity. No surprise there—it has consistently appeared among the top five barriers in our report for years. The more interesting finding was that those who rated lack of time and capacity as a high-impact barrier were less comfortable using AI to assist with content creation (ranked the third most time-consuming task) than those who rated it as low impact.

Intuitively, you might think that the people who are most overwhelmed would be the most enthusiastic about AI—anything to lighten the load. But discomfort remains, especially when it comes to applying AI to the heart of what we do: creating authentic, human-centered content.

So why are those who could most benefit from saving time on one of their most time-consuming tasks not doing so? While those with less time were more comfortable using AI for administrative tasks, the reluctance towards creative applications of AI suggests deeper concerns within organizations—perhaps a lack of understanding or trust in these technologies, or a fear of what they mean for our job security as creatives.

It's not about time—it's about trust

The data suggests it's not just about trust in the tools themselves. It may be about something more fundamental: a lack of governance—or perhaps even a lack of awareness that governance exists.

Thirty-eight percent of respondents say they don't have established AI guidance, training or governance in their organizations (or are perhaps simply unaware that it exists), and 1 in 3 communicators hadn't made any decisions about being transparent about their use of AI within their companies. This points to a lack of direction from the top (which, fittingly, was a high-impact barrier to success for 39% of communicators).

If communicators don't know the organization's position on AI, employees almost certainly don't, and that's risky in more ways than one.

  1. Unauthorized Usage: If employees are unsure whether they can use AI tools, they may start experimenting on their own—potentially introducing security, privacy, or IP risks by inadvertently sharing confidential information outside the confines of their organization's programs. With the abundance of free, user-friendly AI tools available to support all aspects of work, businesses should assume their employees are using AI. Without clearly communicated policies and guidelines, you may be putting your people and your business at risk.
  2. Lack of Return on Investment (ROI): If you are investing in proprietary AI tools but employees aren't using them—or are afraid to—you're not getting the return on your investment. Worse, you might be unintentionally encouraging shadow use of unapproved AI tools.

The bottom line is that we've entered a phase where AI use is rapidly normalizing. Failing to communicate a clear position on AI puts companies at risk of falling behind their competitors.

The culture question

Interestingly, we observed regional differences. In European markets, communicators reported greater comfort using AI tools—possibly a reflection of proactive EU regulations that offered clearer AI guidelines from the start. It seems that clearer rules create safer environments in which to experiment with and adopt new technologies. That's something organizations everywhere can learn from.

When to bring in Internal Communications

So, what's the solution? As usual, it starts with communication.

As communicators, we hold a unique position in the organization. We're often the first to frame new technologies for employees, shaping how people feel and behave toward change. It's crucial for us to address these concerns head-on by providing clarity and guidance on how best to integrate AI into daily operations without fear or hesitation.

Start with three questions:

  1. Do your people know your organization's position on AI? If not, you may be unintentionally encouraging shadow use—or creating a culture of hesitation. Internal comms teams can work closely with IT, legal, and leadership to develop and distribute clear messaging about what's allowed, what's not, and why.
  2. Do your people know how to use AI effectively? Awareness isn't the same as fluency. Communicators can lead the charge in demystifying AI tools, providing practical guidance, and celebrating smart use cases from across the business. That might include creating internal AI literacy campaigns or partnering with L&D to build targeted training resources.
  3. Do you role model AI transparency? As communication professionals, we often act as tone-setters. If we're using AI to help draft content, are we upfront about it? Are we showing others how to use it responsibly, ethically, and creatively? Modeling transparency (like the 5% of communicators who explicitly disclose their AI usage within communications messages) and responsible practices helps build employee trust and gives them permission to explore new tools with confidence.

If the answer to any of these questions is "no" or even "I'm not sure," it's time to bring comms into the conversation.

AI isn't going anywhere. And whether your organization is full speed ahead or still figuring things out, one thing is clear: with the right communication strategy, AI can become a tool for efficiency, innovation and empowerment rather than a source of risk.

Looking for support in creating or communicating your business's AI governance? We can help.
