How prescriptive should guidance on AI coding assistants be? Should developers have more flexibility, or should software leaders set firm guidelines?

511 views · 4 Comments
Director of IT in Software · 10 months ago

Accountability is crucial, and developers and leaders share it. Developers should follow established guidelines when using AI assistants so that productivity gains don't come at the cost of security. Leaders need to govern how these tools are used, balancing flexibility with compliance.

Data Manager in Banking · 10 months ago

The responsibility lies with both developers and the organization. Companies need to set guidelines covering coding standards and tests for AI-generated code. AI is here to assist, not replace, developers. It's everyone's responsibility to make sure AI is used properly and that the agreed guidelines are followed.
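As a rough sketch of what such a guideline could look like in practice, a team might gate every merge behind the same formatting, lint, and test checks regardless of whether the code was AI-suggested or hand-written. The script and the specific tool commands below are illustrative assumptions, not something the commenter described:

```python
#!/usr/bin/env python3
"""Hypothetical merge gate: run the same style and test checks on every change,
whether the code was written by hand or suggested by an AI assistant."""
import subprocess
import sys

# Each entry is (description, command). The tools named here are illustrative;
# substitute whatever formatter, linter, and test runner the team already uses.
CHECKS = [
    ("formatting", ["black", "--check", "."]),
    ("linting", ["ruff", "check", "."]),
    ("tests", ["pytest", "--quiet"]),
]


def main() -> int:
    failed = []
    for name, cmd in CHECKS:
        print(f"Running {name}: {' '.join(cmd)}")
        if subprocess.run(cmd).returncode != 0:
            failed.append(name)
    if failed:
        print(f"Blocked: failed checks: {', '.join(failed)}")
        return 1
    print("All checks passed.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Applying one gate to all code keeps the policy simple: the review bar doesn't depend on knowing whether an assistant produced the diff.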

VP of Engineering · 10 months ago

AI coding assistants are here to assist, not take over our jobs. The responsibility lies with developers, who must balance freedom with clear rules. In the financial industry, we don't use publicly available AI tools; instead we run in-house models on our own datasets. We encourage peer reviews to maintain code quality, but bugs can still slip through, so manual reviews and security checks remain essential to allow innovation within safe boundaries.

CTO in Media · 10 months ago

It depends on the organization and its needs. In a smaller startup like mine, where we deal with legal information but not highly regulated data, we allow more flexibility. My guidance is that developers must understand every bit of code they check in, whether it was suggested by AI or written by hand; they are responsible for their software. In MedTech or FinTech companies, however, I would be much more prescriptive. Developers should also be cautious about sharing potentially sensitive information with AI tools. In short, developers need to own responsibility for the code they ship, and organizations need to provide the guidelines that make it possible to use AI tools effectively and securely.
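That caution about sensitive information can also be backed by tooling. The sketch below is a hypothetical pre-prompt scrubber; the regex patterns and names are placeholders, and a real policy would have to define which credential and PII formats actually matter:

```python
import re

# Illustrative patterns only; a real policy would cover whatever credential
# and PII formats matter to the organization.
REDACTION_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<EMAIL>"),
    (re.compile(r"(?i)(api[_-]?key|secret|token)\s*[:=]\s*\S+"), r"\1=<REDACTED>"),
    (re.compile(r"\bAKIA[0-9A-Z]{16}\b"), "<AWS_ACCESS_KEY_ID>"),
]


def scrub(snippet: str) -> str:
    """Return a copy of the snippet with obviously sensitive values masked."""
    for pattern, replacement in REDACTION_PATTERNS:
        snippet = pattern.sub(replacement, snippet)
    return snippet


if __name__ == "__main__":
    sample = 'notify("jane.doe@example.com")  # api_key = sk-live-123456'
    print(scrub(sample))  # notify("<EMAIL>")  # api_key=<REDACTED>
```

A scrubber like this is only a backstop; the guideline that developers understand, and stay accountable for, what they share with AI tools still does the real work.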
