What’s the right balance between privacy and transparency when it comes to insider risk management?



CISO in Software, 201 - 500 employees
It all comes down to how effective the risk management program is, and how experienced and mature the cyber risk governance team is. For the most part, insider risks can be mitigated to a large extent by preventive and detective security controls, including (but not limited to) identity and privilege management, security event log monitoring, business-specific use cases (built around the critical assets and crown jewels), and the principle of least privilege. This should address most risks. For the residual high-risk users with access to sensitive business assets, enhanced controls (MDM, DLP, etc.) could encroach on some privacy areas; in such cases it is advisable to take a consent-based approach, where the infosec team educates users about the control requirements and obtains their consent before enabling monitoring controls. If consent is an issue, then go the traditional route and issue company-hardened devices (laptops, mobiles, tablets, etc.) that give more control over, and confidence in, the data and assets being accessed.
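To make the detective-control idea above concrete, here is a minimal Python sketch of one business-specific use case: flagging off-hours access to crown-jewel assets by privileged accounts. The log format, account names, asset names, and business hours below are hypothetical placeholders; a real deployment would pull these from your SIEM and identity systems rather than hard-coding them.

```python
# Sketch of a business-specific detective control (assumed data, not a reference implementation):
# flag privileged accounts touching "crown jewel" assets outside business hours.
from datetime import datetime

CROWN_JEWELS = {"hr-database", "source-repo", "finance-share"}  # hypothetical sensitive assets
PRIVILEGED_USERS = {"alice.admin", "bob.dba"}                   # hypothetical privileged accounts
BUSINESS_HOURS = range(8, 19)                                   # 08:00-18:59 local time (assumption)

def flag_suspicious(events):
    """Yield events where a privileged user accessed a sensitive asset off-hours."""
    for event in events:
        when = datetime.fromisoformat(event["timestamp"])
        if (
            event["user"] in PRIVILEGED_USERS
            and event["asset"] in CROWN_JEWELS
            and when.hour not in BUSINESS_HOURS
        ):
            yield event

# Example with a hypothetical event record:
events = [{"timestamp": "2023-05-04T02:17:00", "user": "alice.admin", "asset": "hr-database"}]
for alert in flag_suspicious(events):
    print("Review:", alert)
```

The point of a rule like this is that it targets specific assets and accounts rather than monitoring everyone broadly, which keeps the privacy impact narrow.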
Director of IT in Software, 201 - 500 employees
Privacy is important, but in a corporate environment where you work on a company network, company laptop and smartphone, it should not be assumed. The company has the right to protect itself and its intellectual property, so anyone concerned about privacy should not be doing anything personal on a corporate computer. That said, there has to be transparency and explicit notification that traffic can be monitored by the employer, and employees need to be aware of the extent of it.
There are a lot of technologies that can be used to mitigate insider risk. 
Associate Vice President, Information Technology & CISO in Education, 1,001 - 5,000 employees
Need to distinguish between insider risk and insider threat. Insider threat management is about a malicious insider who can exploit the organization (leak data, harm systems, etc.), while insider risk is about the risk that surrounds an insider's status and access. A malicious outsider may harvest the credentials of a user who has privileged access, leading to a breach; that is insider risk.

For insider risk, there should be transparency within the organization. These would be general things like education and awareness, enforcing strong controls, letting users know about their access and risk of exposure, etc.

When it comes to insider threat, those are cases you want to keep close to your chest until you have all the evidence needed to take some kind of action (prosecution, further investigation, termination, etc.).

Now, with regard to the question of users having their privacy protected while working on corporate devices, that would likely need to be looked at through the lens of the legislation you fall under. In Canada, for example, employers now need to inform their employees if they are being monitored. So it's a no-brainer on that front.

If no legislation exists where you are, this will come down to your corporate culture and risk tolerance. I think having an acceptable use policy gives good enough transparency to your users on what they can and cannot do when using your equipment, and monitoring becomes fair game for any insider threats.

Hope that helps.
Director, Information Security Engineering and Operations in Manufacturing, 5,001 - 10,000 employees
There's no magic answer to this; it will depend heavily on your risk appetite and your desire for privacy. You will have to consider local and country laws on both topics, and come up with an approach that works and gives you the right balance and confidence.
CISO in Software, 201 - 500 employees
Short answer: build trust. Make sure everybody understands that your company reserves the right to monitor employee activity, and call out all the areas / systems / networks / equipment that are monitored. As required by the applicable laws, include additional information such as retention periods, the types of data you are collecting, etc. Communicate what acceptable use is, and put in place a policy that pardons employees who self-report accidental misuse or access of data.

The specific techniques and mechanisms for monitoring and analyzing the events do not need to be disclosed. In fact, by doing so, you would be giving away valuable intelligence to a potential internal adversary.

Last but not least, as has already been mentioned: if you build strong access controls, ideally based on zero trust principles, and segregate and protect data and systems according to their value and risk, you will implicitly decrease the risk that your monitoring will be excessive and violate the privacy of your employees.
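As an illustration of that last point, here is a minimal Python sketch of a classification-aware, zero-trust-style access decision. The labels, roles, and checks are assumptions for the sake of the example, not a reference implementation: the idea is that the stricter the data classification, the more context (managed device, MFA, role) is required before access is granted, so broad monitoring of lower-risk users can stay narrow.

```python
# Sketch of a classification-aware access decision (hypothetical labels and roles).
from dataclasses import dataclass

@dataclass
class AccessRequest:
    role: str
    classification: str   # "public", "internal", or "restricted" (assumed scheme)
    managed_device: bool
    mfa_passed: bool

def allow(req: AccessRequest) -> bool:
    """Grant access only when the context matches the data's classification."""
    if req.classification == "public":
        return True
    if req.classification == "internal":
        return req.mfa_passed
    if req.classification == "restricted":
        # Only specific roles, on managed devices, with MFA.
        return req.role in {"data-steward", "admin"} and req.managed_device and req.mfa_passed
    return False  # unknown classification: deny by default

print(allow(AccessRequest("engineer", "restricted", True, True)))      # False
print(allow(AccessRequest("data-steward", "restricted", True, True)))  # True
```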
