Given the urgency around AI adoption, what approaches are you using to accelerate skills development without compromising quality or employee experience?

Global Chief Information Officer in Construction · 12 hours ago

Adoption is, to a large extent, about human behavior. It's not a technology issue; it's about how we adapt technology to do what we want. We've been at it for three years and have done a lot of the easy stuff, like building virtual assistants. That was successful and helped with adoption. With Copilot licenses, everyone was enthusiastic at first; then the enthusiasm tapered off and you were left with the serious users, so we had to change tactics.

Adoption is driven by two forces: leadership-driven programs and bottom-up initiatives from IT. We have a good-sized IT team, so we can experiment. Anything that touches our core business goes through a rigorous validation process before it reaches a client. For internal work, we have to experiment, because this technology is different from anything we've seen before. We took three old applications and had a small team rebuild them using AI, to teach our people new methods and technologies. There are a lot of teaching moments; either it works or it doesn't, and that's to be expected. We've deployed some of these rebuilds, people are using the new versions, and there are lots of ways to deal with adoption. You have to be open to new ideas and experimentation.

Director of IT in Manufacturing · 12 hours ago

For us, it's really important for the IT group to find purposeful use cases to work on with AI and not get sidetracked. They're busy and have other things to do, so if I tell them to work with AI because it's coming, they need to know why. Partnering with the business to understand what it might need and benefit from gives them something to work towards. I'm still struggling with the balance between quality and employee experience, because it depends on company culture: are you going to move fast and break things, or take a long time to validate everything? Quality will be compromised if you just run Copilot and accept the output as-is, so I haven't made up my mind on that yet.

CIO in Retail · 12 hours ago

I am encouraging people to take on stretch assignments. Only certain people are working on AI right now, but others want to jump in, so they're open to doing AI as a stretch goal. As we build this framework, I want it validated so people can build prototypes and showcase what they want. You can take your own organization and show what you can do to improve IT, or do something for the wider organization as a functional team. Beyond IT, I'm also pushing more AI literacy for the executive leadership team, because that's critical. I want them to understand what it means; sometimes the technology feels unique, but people just need to adapt so every organization can use it to be more productive. I told the legal department, "You're wasting a lot of time reading documentation. You should just use Copilot." No one said no, so I said, "You can just do it." People aren't thinking about what more they can do because of fear. Fear is healthy, but it shouldn't stop us from leapfrogging. We put in place AI training for the enterprise and a bootcamp to align on terminology, and we encourage self-learning.
