If your developers are using ChatGPT as a coding assistant, what are you doing to prevent devs from leaking corporate secrets to it?

Director, Global Network / Security Architecture and Automation in Finance (non-banking) · 3 years ago

End user/developer training and DLP controls. ChatGPT cannot come up with code ideas on its own, but it can write code from requirements. That means unless the entire app is pasted into a single query, a prompt isn't exposing data that could hurt intellectual property.
Like every new online tool it needs to be secured, but we shouldn't turn security into an obstacle course.

Co-Founder in Services (non-Government) · 3 years ago

First, we need to define what counts as a secret; in other words, do data classification across the company. Second, if developers are using the web interface, network DLP (SWG, CASB, SSE) should do the trick, provided it's configured correctly. If they're using the API, I'd need to research the options. Endpoint DLP could work, but that's a different problem :)
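To make the DLP point above concrete, here is a minimal sketch of the kind of check such a control applies before a prompt leaves the developer's machine: scan the outgoing text for secret-like patterns and block the request on a match. The pattern set, the hypothetical internal domain corp.example.com, and the scan_prompt/guarded_send helpers are illustrative assumptions, not any specific SWG/CASB product's policy or API.

```python
import re

# Illustrative patterns only; a real policy would come out of the
# data-classification exercise described above and be far broader.
SECRET_PATTERNS = {
    "AWS access key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Private key block": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "Hard-coded credential": re.compile(
        r"""(?i)\b(api[_-]?key|secret|token)\s*[:=]\s*['"][^'"]{12,}['"]"""
    ),
    # Hypothetical internal domain, used only for this example.
    "Internal hostname": re.compile(r"\b[\w.-]+\.corp\.example\.com\b"),
}

def scan_prompt(prompt: str) -> list[str]:
    """Return the names of any secret-like patterns found in an outgoing prompt."""
    return [name for name, pattern in SECRET_PATTERNS.items() if pattern.search(prompt)]

def guarded_send(prompt: str, send_fn):
    """Refuse to forward a prompt that trips the scan; otherwise hand it to send_fn."""
    findings = scan_prompt(prompt)
    if findings:
        raise ValueError("Prompt blocked by DLP check: " + ", ".join(findings))
    return send_fn(prompt)

if __name__ == "__main__":
    leaky = 'Refactor this for me: API_KEY = "sk-live-abc123def456ghi789"'
    print(scan_prompt(leaky))  # -> ['Hard-coded credential']
```

In practice the pattern list would be driven by the company's data classification, and enforcement would sit in the SWG/CASB or an endpoint agent rather than in application code; the sketch just shows the check itself.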
