If your organization starts using generative AI for security ops (like threat hunting or incident response), does that impact how you think about your team's roles/responsibilities? Would you expect to need fewer staff for SecOps, or even more? More or fewer high-skilled employees?



Information and Security Office & Enterprise Data Governance/AI in Finance (non-banking), 1,001 - 5,000 employees
In short, yes. As we leverage AI for security ops, the role of the first-level SOC analyst becomes redundant. The basic tasks a first-level analyst performs, such as reviewing logs and creating events/alerts, can be automated through prompts answered by an LLM or chatbot. Even if you pay extra for that capability, the human expense is reduced.
I am not saying that is the case today, but that is how we see it playing out over the next 12 to 18 months as the features mature.
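To make that claim concrete, here is a minimal sketch of how first-level log triage could be routed through an LLM. This is illustrative only; the `call_llm` stub, the prompt wording, and the JSON response shape are assumptions, not any specific product's API.

```python
# Minimal sketch of LLM-assisted first-level log triage (illustrative only).
# `call_llm` is a placeholder for whatever chat/completion API your SOC uses.

import json

TRIAGE_PROMPT = """You are a first-level SOC analyst. Review the log entry below.
Respond with JSON: {{"severity": "low|medium|high", "create_alert": true|false, "summary": "<one line>"}}

Log entry:
{log_entry}
"""

def call_llm(prompt: str) -> str:
    # Placeholder: wire this to your LLM provider's chat-completion call.
    raise NotImplementedError("Connect to your LLM provider here")

def triage(log_entry: str) -> dict:
    """Ask the model to classify one log entry and decide whether to raise an alert."""
    raw = call_llm(TRIAGE_PROMPT.format(log_entry=log_entry))
    decision = json.loads(raw)  # assumes the model returns well-formed JSON
    if decision.get("create_alert"):
        print(f"ALERT [{decision['severity']}]: {decision['summary']}")
    return decision
```

In practice a sketch like this would sit behind the SIEM/SOAR pipeline, and a human analyst would still review whatever it escalates.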
Chief Information Security Officer in Healthcare and Biotech, 1,001 - 5,000 employees
Yes. It will be concerning if employees are not trained well enough. I would limit these services until we have identified the potential risks.
CIO in Telecommunication, 1,001 - 5,000 employees
I'd argue it's relatively unchanged. As the industry continues to consolidate tools and automate security functions, threat actors are innovating and using the same tools against you. In my experience, we are simply shifting resources from older, but still necessary, security tools as they mature into newer threat defenses.
