How does the IT security role transform in the world of DevOps?


VP, Chief Security & Compliance Officer in Software, 1,001 - 5,000 employees
You have to shift left; I don't think there is such a thing as DevOps without security. We are past the point of security being a central service that comes into your area to scan it. Now you're going to scan it yourself, and you'll learn to understand the integrity of your own code.

My team is developing some threat-modeling libraries so that we can help developers by showing them: "Here are the scenarios you should be thinking about if you're going to do that with your application. Test it this way." Security isn't their primary job, so we can't really expect them to come up with these threat-modeling scenarios on their own, but we should definitely teach them how to incorporate them, because we need to shift left.
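The idea of a threat-modeling library that hands developers ready-made scenarios could look something like the sketch below. This is a hypothetical illustration, not the commenter's actual library; the feature names, STRIDE-style categories, and test hints are all invented for the example.

```python
# Hypothetical sketch of a small threat-modeling helper: given a feature
# a developer is building, surface the scenarios they should test.
# All feature names, categories, and hints below are illustrative.
from dataclasses import dataclass

@dataclass
class Scenario:
    threat: str      # threat category, e.g. "Tampering" (STRIDE-style)
    prompt: str      # what the developer should be thinking about
    test_hint: str   # how to exercise the scenario

# Map common application features to the scenarios worth walking through.
SCENARIOS = {
    "file_upload": [
        Scenario("Tampering",
                 "Can a user upload executable content disguised as data?",
                 "Upload a file whose extension and MIME type disagree."),
        Scenario("Denial of Service",
                 "What happens with a very large or deeply nested file?",
                 "Try an oversized file and a zip bomb in a test environment."),
    ],
    "login_form": [
        Scenario("Spoofing",
                 "Can an attacker guess or replay credentials?",
                 "Test rate limiting and session-token reuse."),
    ],
}

def scenarios_for(feature: str) -> list[Scenario]:
    """Return the threat scenarios a developer should walk through."""
    return SCENARIOS.get(feature, [])

for s in scenarios_for("file_upload"):
    print(f"[{s.threat}] {s.prompt} -> {s.test_hint}")
```

The point of packaging it this way is the one made above: the developer doesn't have to invent the scenarios, only run through them.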
Worldwide Strategy & Portfolio, Cross Industry (Supply Chain, ESG, Engineering, Customer Experience, Intelligence Automation, ERP) in Manufacturing, 1,001 - 5,000 employees
Security should definitely be integrated, but when everyone is scoping projects and budgeting, they're not putting a timeline in there for it. There's always pushback: "Can you get it done faster? Can you get it done cheaper?" Security tasks are what get cut, because they take a long time to figure out when you're not used to doing them and you want to reach the end result as quickly as possible. It's going to take a whole lot of shifting.
Global CIO & CISO in Manufacturing, 201 - 500 employees
Every angle has to be looked at. As practitioners we have to look at everything we can secure. If you're not shifting left and you're not putting all the controls in place, with people to review the results, forget about an automated system. You have to slow down: you have to allocate the time to make sure you have those checks.

It drives me crazy, because we hear it and say it: separation of duties, validations, extreme programming. We have all of these concepts that have been around for two decades. Why not try using them? Because that's how you get to these things. You can't just let systems automate everything. It's cute that you have Veracode, etc., but at the end of the day you need a person to validate a lot of these things, especially when you've got key, critical systems.
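The workflow being argued for here, where an automated scanner gates the obvious cases but a person still validates findings on critical systems, can be sketched as a simple triage step. This is an illustrative sketch only; the severity labels and field names are assumptions, not the output format of Veracode or any real tool.

```python
# Hypothetical triage step: automation blocks obviously bad builds,
# but on key, critical systems every finding is routed to a human
# reviewer to validate. Severity labels and fields are illustrative.

def triage(findings: list[dict], critical_system: bool) -> dict:
    """Split scanner findings into an automatic verdict and a human queue."""
    high = [f for f in findings if f["severity"] in ("high", "critical")]
    # Automation alone can fail a build on severe findings...
    verdict = "fail" if high else "pass"
    # ...but for critical systems, a person validates everything,
    # not just what the tool flagged as severe.
    human_queue = list(findings) if critical_system else high
    return {"verdict": verdict, "human_review": human_queue}

findings = [
    {"id": "F1", "severity": "low"},
    {"id": "F2", "severity": "high"},
]
result = triage(findings, critical_system=True)
print(result["verdict"])            # prints "fail": the high finding blocks the build
print(len(result["human_review"]))  # prints 2: both findings still need a reviewer
```

The design choice mirrors the comment: the tool is a gate, not a substitute for the person who attests that the findings were actually validated.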

At least for our older environments, which may not be at that scale, we should be able to do better. A developer should keep 10-20% of their mind on those things; even if it's after hours, I don't care. They have to be able to say, "My code is secure. I can attest to having put it through the right tools to make sure it isn't going to bring the whole company down, whether it's open source or paid." Right now we have a lot of very fast computers, but once quantum computing takes off, we're screwed.
