Are you worried about shadow AI? Do you have a formalized governance process around it?

Senior Director, Enterprise Architecture in IT Services · 2 months ago

Yes, shadow AI can be a real problem in regulated industries, but also in general if participants are unaware of AI-related constraints from the EU AI Act, client MSAs, etc. Formalised governance will never stop shadow AI, but it should put in place AI policies and an education framework so that people have a better awareness of the risks and consequences.

The best way to stop shadow AI is to make the official AI solution(s) the easiest to use, and to have a receptive process when changes are requested. Governance only goes so far; managed facilitation is better.

Executive Director - Computing Services in Manufacturing · 2 months ago

Yes, it's a real concern. We have had a GenAI Governance Council in place for a couple of years now, working to streamline the intake process both to ensure there is value and to ensure controls are in place. The council includes business partners, which allows us to minimize shadow AI, but we can't totally eliminate it because individuals can incorporate any AI capability into their part of the work. Part of the effort is educating employees on the do's and don'ts, which we continue to emphasize so everyone understands the risks and stays compliant.

Senior VP, Business Services and CIO in Manufacturing · 2 months ago

We don’t call it shadow AI; instead, we invite interested employees to join an AI working group, which lets us monitor and guide their activities. We give them autonomy with a set budget and encourage them to share their results. This approach helps eliminate the negative aspects of shadow AI and generates valuable use cases. However, there are still rogue users, and governance is an ongoing challenge.

Vice President, Information Technology · 2 months ago

Shadow AI is a real concern, especially with free tools and employees purchasing solutions on their own. Our industry is less regulated, which presents both opportunities and risks. We have some controls in place, such as approved large language models, but the risk is lower compared to highly regulated sectors. Governance is part of the technologist’s responsibility, and controlling shadow AI is challenging, especially on personal devices.
