Are there any ethical concerns you have about implementing AI/ML?


CISO in Software, 51 - 200 employees
I'm in healthcare, and what we're trying to do is develop drugs. Right now it's about a 10-year cycle, so we're trying to find a way to speed that up. There's a lot of ethics involved in AI/ML around what you can access when you're swallowing sensors. What kind of data can you pull? Is it going to be PHI? Is it going to be HIPAA data? Things like that. What kind of data do you want to release to your doctor once you're all sensored up? Because pretty soon we're going to be poking sensors all over our bodies that do a great number of things. So that's where we have to be careful.
Head of Information and Data Analytics in Software, 5,001 - 10,000 employees
When you talk about security, privacy becomes important. For example, when you're doing this whole insider threat piece, you're actually profiling your employees. Now, how far do you want to go? When you profile, it's a very, very thin line. There have to be controls in place for who sees employees' behavior versus who doesn't. Do you want to get your privacy lawyers involved? Another example is Target, the case around predictive analytics and sending coupons to a girl who was pregnant. And take the self-driving car example. How are you going to optimize for hitting a wall versus a human? Who would manage that? What are the implications from an insurance perspective? How are you going to plan for that? I think it's still early. We're still getting there, but those are the areas we'll be spending more and more time on.
Sr Director of Information Security and Compliance, 1,001 - 5,000 employees

There's not a day, not even an hour, in security that goes by where we don't think about privacy. Security has become privacy. It's not only "do I have to be careful with who sees what" when it comes to employee data; I'm actually being very conscientious about what we're gathering from the employees in the first place. I'll give you a perfect example of how that came about. Recently we were looking at some MDM tools for our phones. I ended up going with an MDM tool that did more sandboxing but collected less data, because I had to find that balance. I want my employees to trust me. I know it's their phone, and I know they want to be able to do some things. So where can I draw the line so that these guys will feel comfortable using their phones while I still feel like I've got the protections in place that I need from a corporate perspective, without violating their privacy? I'm not seeing what's in their pictures. I don't need to see their GPS locations. I had to find a solution where I didn't really need that stuff to protect the data. It was more of a privacy exercise than it was a security exercise.

Sr Director of Information Security and Compliance, 1,001 - 5,000 employees
Forescout has employees all over the world. There are GDPR rules, Canadian rules, etc. People are in all sorts of different jurisdictions. Somebody in Amsterdam versus somebody in Germany has different requirements and different rules. How do you make that work for a corporation? You've got to make some tradeoffs in privacy, and sometimes you make tradeoffs in security. Finding that balance is really tough. When I launched the program that needed access to employees' phones, I said, we'd better go talk to our employees. We also talked to the lawyers in the different locations and explained what we wanted to do. The initial thought and the initial plan that we had actually got scrapped. There were too many complaints from our employees, things like: that's too much of a privacy violation, or you're getting too much data from my phone. They also lost some things. For example, they cannot copy and paste anything from inside our sandbox to outside the sandbox. In the beginning, you couldn't take something that was an address and copy it into a maps app, because I can't tell that it's a maps program. I can't tell that it's an address, because I took that tradeoff and said, "I'm not going to look at what you're looking at or copying, but I'm just not going to allow you to do it, even if it was totally legit."
Chief Information Officer in Manufacturing, 10,001+ employees
It's a fine line between invasion of privacy and securing your enterprise. You want to ensure you are capturing risk-based traffic and anomalies to protect the company, but at the same time you're poking into personal lives.
Director of IT in Transportation, 5,001 - 10,000 employees
No, not if it's implemented properly within our compliance regulations.
