Data Protection & Encryption

Has anyone got experience with both CommVault and Cohesity for data protection/recovery? What are the differentiators? Or have you found an alternative you prefer?

Top Answer: I have evaluated both. Good products. Cohesity, in my opinion, is more modern. A good alternative is Rubrik. To me the biggest differentiator is the environment: if you are protecting cloud workloads, then Rubrik and Cohesity are both great choices, and I find Rubrik more cost effective. Commvault has a product for cloud workloads called Metallic. If the majority of your workloads are on-prem, then Commvault is the better option. Rubrik and Cohesity are best for ransomware protection. I would suggest you check out Veeam as well; they've come a long way.

End-User Security Training

As sophisticated as cybersecurity tools become, end users still need to be aware of cybersecurity risks. Benchmark your end-user training against your peers.

Where do you think your organization needs the most help with Cloud-related technologies and solutions?

Top Answer: These votes are interesting. While "Cloud native development" was initially leading "Cloud migration", currently (at 88 votes) "Cloud migration" is ahead and Cloud native development has lagged. Does that mean organizations have yet to migrate many of their workloads to the Cloud (at the end of 2021), and that Cloud native development takes a backseat in the overall priorities?

Is security by design the best approach to regulatory data compliance?

Top Answer: Security by design is the best approach to compliance, in my opinion. In my present position, I was brought on to make us HIPAA and GDPR compliant, because the company was going to hit a hard stop if they couldn't accomplish that. They generate a large amount of data containing protected patient health information, and it's coming in on a second-by-second basis. They were terrified that data would leak or be otherwise compromised. Both HIPAA and GDPR carry massive regulatory fines for data loss, which could kill a company. And it's a tremendous embarrassment to be "shamed" when those losses are made public.

My challenge was to figure out how we protect that data and all of the associated information, as well as our cloud infrastructure. When I joined, I said, "We have to rewrite not only our cloud app and the way we deal with the data we're saving in the cloud, but also the mobile apps." My task to our development team was to rewrite everything we own so it would be encrypted at every point in its life cycle. Because if you do that, you are compliant with almost every security standard on the planet by default. If you encrypt every single piece of data end-to-end (in use, in motion and at rest), it's never unencrypted at any point. We also had to re-examine every toolset and codebase we were using, to make sure we had no deprecated packages that contained known backdoors, flaws or other compromised components.

That has a couple of different benefits. One is that under HIPAA, and to a lesser degree GDPR, if you encrypt your data to a certain standard, you won't have to report a data loss if it does happen. You don't even have to self-report, because you've exceeded the standard they require, so that's a huge saving in every regard. Not having to pay a fine or even report it under the law makes your life a lot easier. It took about eight months of solid work to get us to that point.
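To make "encrypted at rest, authenticated on read" concrete, here is a minimal teaching sketch. It is not the implementation described above, and the hash-based stream construction is for illustration only; a production system should use a vetted primitive such as AES-GCM from an audited library. What it does show is the pattern: a random nonce per record, ciphertext that never touches disk as plaintext, and an HMAC tag so tampering is detected before decryption.

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a keystream by hashing key + nonce + counter (illustrative only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Fresh random nonce per record, so identical plaintexts encrypt differently.
    nonce = os.urandom(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    # Authenticate nonce + ciphertext so any tampering is detectable.
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    # Constant-time compare; refuse to decrypt anything that fails the check.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed: data tampered or wrong key")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
```

The point of the authenticate-then-decrypt order is exactly the "never unencrypted at any point" discipline: the only code path that produces plaintext first proves the stored bytes are intact.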


What is the current state of ransomware attacks? What level of defense and preparedness do companies get from their backup solutions?

Is security by design the best approach to ransomware prevention?

Top Answer: As part of a holistic and well thought-out strategy, security by design is the best approach to ransomware prevention. My approach to security has always been that you can try to solve each problem one by one, or you can solve the vast majority of problems in one stroke by encrypting everything end-to-end. At that point, you have far fewer data security problems left, or you've minimized them enough that a bad actor will not be able to get anywhere unless they're targeting you hard. Even then, they would have to put a lot of effort into doing so; it eventually becomes a waste of their time and money, and they will move on to a softer target.

Most people have security added onto what they do. They'll write an application and they'll try to secure it afterwards. But you can't write something and then throw money at it to fix it after the fact. Because you've probably already lost control of your data at that point; you've already been hacked or compromised in some fashion. You don't know where the holes are until somebody else finds them. Even if you test all day, every day, you don't have the mindset or the resources that bad actors on the world stage have. You're not going to find your flaws until after the horse has bolted; closing the gate after the fact is of no use to anybody.

In every company I've worked for over the last 12 years, I've always pushed the idea that this will be an ongoing and continuous problem for every organization on this planet until everything is encrypted end-to-end. And that includes the entire internet of things (IoT) as well. Otherwise, people are going to be hacking your fridge, your smart TV or your router and taking control of your home network. IoT is a disaster at the moment. It needs to be worked on seriously, because it's a global problem. I'm working on a reputation-based approach to solving some of the pressing IoT security issues we are facing.

What’s the best approach to random number generation?

Top Answer: To simplify a complex issue somewhat: when I was with Chronicle Data Corporation, I was part of a team that wrote a brand-new way of generating random numbers. Random number generation is often left to off-the-shelf Linux packages, where the number is generated in a very particular way. It's not that complex, and it's okay for general use, but it's useless for true high security.

These generators basically use an entropy pool, which is a pool of numbers that tends toward randomness. Linux distributions come with a built-in random number generator whose entropy pool is created by combining various inputs from the computer: things like mouse movements, keyboard strokes, fan speeds, CPU temperatures and a few others. But it's not a very big pool of numbers, so it's quite bounded. A CPU can only operate within a set temperature range, so fan speeds can only be within a certain range as well. That's what makes it such a bounded pool, and you end up with a theoretically predictable initialization vector for your first random number.

So we developed a new random number generator, which was then bought as a standalone product by a government agency, as well as by an Australian gaming company who used it for their one-arm bandit/fruit machines.

Does anyone have learnings/best practices to share for moving data to third parties, or granting access to third parties?

Top Answer: Like all things in life, context matters, and so the answer is often going to be "it depends". In this case, the variables at play are things like how many entities you are sharing with, access patterns, data sensitivity, volumes, latency, quality, discoverability, audit, cost, etc. There are probably other things I've missed that others can point out. If any of these factors are significant, then manually managing things via an SFTP drop zone (as so many do) is going to get old quickly. There are lots of good technologies aimed at solving these particular problems out there now, and they are always improving. For example, if you are moving hundreds of terabytes, you will want to use a physical transfer appliance of some kind unless you have some serious bandwidth.
