Future of Work: The Importance of Having an Insider Risk Program in Place | Joe Payne

ABOUT THIS EPISODE

What are insider threats, and is there something companies can do to prevent data and information breaches? Joe Payne, CEO of Code42 Software, talks about why organizations should have an insider risk program in place, the role of training in preventing breaches, and some red flags companies should look out for.

Transcript

Ceci Amador [00:00:17] Welcome to the Future of Work podcast by Allwork.Space. I’m Ceci Amador de San Jose, and today I’m looking forward to talking about the cybersecurity implications that come with having a hybrid workforce and digging into insider threats with Joe Payne, CEO of Code42 Software. Joe, welcome.

Joe Payne [00:00:37] Hi, Ceci. Good morning.

Ceci Amador [00:00:39] Good morning. And for those of you who don’t know, Joe is the president and CEO of Code42 Software, a leading data security company that focuses on reducing the risk of data leakage from insider threats. He’s a seasoned executive with more than 20 years of leadership experience. So, Joe, I want to start with a basic question. Why is cybersecurity such a big topic today, insider threats particularly? What are some of the changes that brought this about?

Joe Payne [00:01:10] Well, that’s a great question. So let’s just jump right in. Cybersecurity itself is a very broad topic, and one of the reasons it’s so important today is that almost all of our work is digitized. Every important plan, every important customer list, every important design for all kinds of firms is digitized. And if all of your intellectual property, all of your company’s best ideas, and all of its customer information is digitized, it means it can be taken through some form of cyber attack. That’s why companies weren’t so worried about cyber threats 20 years ago, but today they are: all of their best work has been put in a position where it can be easily taken. Now, on insider risk, here’s what most people don’t know, because the press loves to talk about nation-state actors stealing information: last year, two thirds of all breaches were actually caused by insiders. That means employees, contractors, and temporary workers are really responsible for a lot of the data that leaves an organization.

Ceci Amador [00:02:29] And what are some of the ways that this data leaks through insiders? Is it malicious? Is it unintentional? What are some of the most common data leakage points that you have identified?

Joe Payne [00:02:43] It’s a great question. Maybe I’ll step back and give your listeners a little more context on what else is changing in the world at the same time that all of this data is being digitized. We have great technology today that allows all of us to work together in a collaborative manner across lots of different teams. Specifically, we have things like Slack and Teams, and we have cloud services like Box, Dropbox, and OneDrive. All of this new technology that has been launched in the last, I don’t know, five years or so allows me and you and our other friends and colleagues to work together in a seamless manner across the Internet, basically, and across networks.

Joe Payne [00:03:28] And in the last six months, even the people who were laggards at learning that technology have learned how to use it, because COVID-19 has forced many of us to work from home or work remotely. So we’ve all had to learn how to use Zoom, how to use Slack, how to use Teams. Those technologies are fantastic for sharing and collaborating and working together. They also make it really easy to move information around an organization.

Joe Payne [00:04:01] And that makes it easy for employees to keep that information, and to get exposed to information that maybe isn’t part of their domain or isn’t something they should be keeping. So what we see is that when people leave an organization, 60 percent report that when they quit, they take information with them to their next job, to help them in their next job. That’s more than half of people admitting it in a survey, and our data indicates the real number is actually even higher, because there’s a set of people who are smart enough not to admit it. And how are people taking information? The same way they share it. They take it to their Dropbox account, or they store it up in their Gmail account by sending themselves an email, or they put it on a thumb drive or an external hard drive. A lot of this is not meant to be malicious. They’re not trying to hurt their previous employer; they’re really proud of their work, and they’re under the impression that maybe they should try to keep some of it. We had a situation at our company where an employee just wanted her contacts, and she took her entire hard drive, which had lots and lots of confidential information on it. So sometimes it’s not malicious like that, but sometimes it actually is malicious, where an employee is stealing what are really important trade secrets. A good example is the folks at Jawbone: they lost some employees who all went to Fitbit, and those employees took some of the most important intellectual property Jawbone had, which eventually caused Jawbone to go out of business. So it can have pretty dire consequences for the company that has its data stolen in that kind of scenario. We see both malicious and non-malicious.

Ceci Amador [00:06:06] And you’ve said before that companies should always assume positive intent. So basically what you’re saying is that when an insider threat or a data leak happens, companies should assume that it was not a malicious leak. Am I correct?

Joe Payne [00:06:23] We believe that. At Code42, we built software that helps monitor employee activity. It’s not about looking at the contents of what employees are doing; it’s about looking at the kinds of file movements that happen, at scale, with every employee. So if somebody is leaving and they’re taking a lot of files with them, maybe that requires some investigation. What we believe, though, is that everybody should be collaborating all the time in their jobs. So we don’t want to block people from collaborating with each other, and we want to assume positive intent. We want to assume that everyone is working together as they should be, doing the right things.

Joe Payne [00:07:03] But the company also needs to verify that there aren’t malicious actors doing what they shouldn’t be doing. We call that trust but verify. That approach really helps companies stay productive and efficient: trust your employees, let them use all the tools the company sanctions, but also keep an eye on everyone. That really deters people from doing things they shouldn’t do.

Ceci Amador [00:07:38] I like what you said about a company trusting its employees, and I think that’s definitely a big part of having a good employee-employer relationship. And I want to ask, what’s the role of training in insider threats and cybersecurity? Because a lot of people are really good at their jobs but might not be as knowledgeable about cybersecurity topics and insider threats. So I want to assume that a lot of the time, data breaches happen because people just don’t know what they did wrong, or that something posed a risk. What type of training should companies offer their employees, particularly now that many are working from home as a new arrangement? And what are some other steps companies can take to protect themselves and prevent insider threats?

Joe Payne [00:08:35] You’re absolutely right, Ceci. Any kind of activity like this should be built around a program, so it’s not enough to throw technology at it. The first two things we tell our customers at Code42 to put in place are transparency and training. Transparency is making sure everybody knows that, hey, what you do at work on your computer is actually monitored. You’re very clear with everybody: we do monitor this activity, so there’s no sort of secret-spyware concern. So transparency first. The second is training, and it’s super important that you train people on the technologies available in the organization to be successful. Here’s a perfect example of why training is important, and, as you point out, a lot of times the activity is not malicious. Most companies today have either Microsoft OneDrive or Google Drive installed, because they want their employees to shift to storing their data in the cloud. If an employee doesn’t know how to use OneDrive, but they used Dropbox in the past for their soccer team or their local church club or some other sort of B2C activity outside of work, they’ll find Dropbox super easy to use. We’ve seen a lot of circumstances where an employee says, hey, I want to share this document with my colleague, and I have Dropbox, and it works really well, so I send them a link and share it. There’s no malicious intent there. But that actually creates risk for the company, because now their corporate data is outside the organization in Dropbox. All of that is easily solved with a little training video from the security team or the HR team that says, here’s how you use Microsoft OneDrive, and here are the reasons it’s good and works just as well as Dropbox.
And again, that’s just an example we see time and time again with our customers, because employees just don’t know the risks involved. The other piece of training that has to happen is for younger workers who haven’t been in the workforce very long and can be confused about what they have a right to. We often see engineers who say, I’m really proud of this code that I wrote. I built a correlation engine for my company, I’m really proud of that correlation engine, and I’d like to take it with me to my next company because maybe we can use it there. But that’s really not appropriate, because the company paid that worker to build that engine, and the company owns it. Sometimes it’s just a matter of explaining to employees what they own versus what the company owns. That’s an important part of training too, because a lot of younger workers just haven’t been exposed to these issues in the past and aren’t aware of how the laws work and how companies own the work they pay for. So those first two parts are super important to any program. And then we layer on technology to help enforce and verify that. For most of our clients at Code42, 90 percent of the incidents they face are handled by just educating the end users. That education really helps, because then one end user tells another, who tells another, and everybody understands that in our organization, these are the processes and policies that we run. So, hope that’s helpful.

Ceci Amador [00:12:25] Yeah, you just hit the nail on the head on one of the things I wanted to address, and it’s not just younger workers, but anyone who is very proud of their work.

Ceci Amador [00:12:34] And I can see this in, I want to say, all fields, but particularly creative fields like writing, product design, PR, and press contacts and things like that. I can definitely see why workers would want to take that work, their creations, with them. Sometimes, but not always, who owns the rights to the work produced is included in contracts. And even then, people read their contract once, when they sign it; if they’ve been with a company for a long time, they don’t really remember everything that was in it. So I can see how that can create certain issues and potential threats. You also talked about transparency, and about different kinds of software that can monitor the activity of employees. Isn’t that a privacy breach? How do you define where the surveillance should end, or what is appropriate and what isn’t?

Joe Payne [00:13:43] It’s a great question, and you’ve got a couple of different issues there. The first is how people work with their organizations. I think being transparent goes both ways. If you’re about to leave your company, what I would tell your listeners is super important is to simply have a conversation with your manager and say: listen, when I leave, I want to take my contacts with me, and I want to take a couple of samples of the work I’ve done here as an important part of my portfolio. I would tell you that most of our clients would say that makes a lot of sense, and by having that conversation, you don’t raise any red flags. What I would advise your listeners not to do is copy a bunch of the work they’ve done and store it away without having that conversation with their manager. Because every employment contract I’ve ever seen and read says that the employer owns the work they pay you to create. One of the examples I often use with my team is this: if you hired a painter to come in and paint your portrait, and they painted it, and then they got up and left with the portrait, saying, hey, I did it, it was my creative work, most of us would say, well, that’s silly. You paid them to paint the portrait. We understand that in that context, but it’s also true of the creative work you do for a company, because they’re paying you to do that work. So I think it’s super important to recognize that. I also believe that most organizations would be happy for you to take examples of your work and your contacts, but that’s something you should talk to the organization about before you do it.
Doing it the right way, so that you don’t get yourself in trouble, is a really important part of protecting yourself. In terms of the spyware and surveillance question: if you work in the United States and you’re on a laptop bought by your company, you really shouldn’t have an expectation of privacy, in the sense that it is the company’s property, and what you do on that machine is tracked by the company. The software does know what websites you go to, whether you’re watching Netflix on your company laptop, and what you do with the data on that machine. Your company will know what you’re doing there. Every company is different. Many small companies say, use your laptop as your personal laptop and as your corporate laptop, that’s fine, but you should know that we have software on it that tracks exactly what you do. We do that for safety reasons, for the safety of the company’s information, but also to make sure the company isn’t hit by a high-profile breach. Many big companies, on the other hand, would be reluctant to have you do anything personal on the machine at all. So it’s really important that the company has an education program that explains what you’re allowed to do and what you’re not, and is very clear with its employees.
I think the Twitter breach that just happened a week ago, which was a pretty major breach because some of the most important leaders in the world had their accounts taken over and tweets sent out on their behalf, is an example of an insider threat that was not malicious on the part of the employees. It was malicious on the part of third-party bad actors who were able to take over the accounts of some Twitter employees and thereby reset passwords, bypass two-factor authentication, et cetera.


Joe Payne [00:17:59] So if you’re an employee listening to this and you’re saying, why does my employer care so much about all this cybersecurity stuff, and are they impinging on my rights? It’s important to note that for many companies, the stakes here are very high. They’re trying to keep you and your data secure, both from external forces and, a little bit, from yourself, from you accidentally doing something that could cause a lot of damage to the company.

Ceci Amador [00:18:31] And you talked about employees feeling like some of their rights are being infringed upon. I know that, at least in the United States, the law isn’t yet up to date with new remote work practices, new technologies, data security, and privacy. What are some things that you think need to be addressed, not only at a corporate level but at a national regulatory level? Say there was a malicious insider threat that resulted in the closure of a company or in major losses. What are some things that need to be taken into consideration to make it easier to handle and manage that crisis?

Joe Payne [00:19:17] Well, I think the guiding principle for employees is to make sure that they really understand the separation between the things they consider very private and personal and what they do on work machines and on work networks. I think a lot of us blend those things far too easily, and we all do it at times. That’s OK as long as everything is assumed with positive intent, and we encourage security teams to look for that intent and understand it. But not blending your personal life and personal habits with your corporate life is a super important thing. I don’t know that there’s a lot of regulation required in the B2B world that’s not already out there. I’m not an expert on B2C security, but certainly all of us are concerned about our data being sold by folks like Facebook and Google, and maybe by other smaller but more nefarious types of companies. Understanding what’s happening to our data in a B2C world is very different from understanding what your rights and obligations are as an employee to the people who are paying you to do your job. So I want to separate those two things, and we can talk about them more, Ceci, if you want to.

Joe Payne [00:21:02] But I do want to mention one thing on content, maybe to bring this to life to some of your consumers, some of your listeners. One of the things that our technology code 42 does is it looks to see if somebody has changed the type of file that something is and we call that the mime type extension. But let me explain what that means. So all of us are familiar with a JPeg file. It’s like a picture or maybe an MP three file, which is a song or a piece of music. And oftentimes people that are malicious were trying to steal information from a company, will actually rename a file and call it an MP three file as a fit. So it looks like a song or they will take a customer list and change the name and the extension of it so that it looks like it’s a picture. And the reason they would do that is to say, hey, when I’m leaving, I’m just taking my music and my photos that I put on this machine and I’m taking them with me to, you know, because they’re my personal stuff. When a security team sees that kind of behavior, then that’s a pretty good indicator that somebody is actually acting with ill intent. And so while we assume positive intent going in, we also look for not positive intent.

Joe Payne [00:22:20] And that’s that’s that’s something that many our listeners are like. Well, I never even thought of that. But we see that quite frequently when people are trying to expel trait information.

Ceci Amador [00:22:33] That’s really interesting. Are there any other red flags that security teams use to determine whether there is malicious intent?

Joe Payne [00:22:44] There are. I’ll tell you some of the tricks, and you’ll kind of laugh, because people may think to themselves, oh, you can see that? One of the other indicators that people are maybe doing something they shouldn’t is that they work differently: they do these activities during off hours. One of the things that Code42’s software does is use artificial intelligence to look at everybody’s working hours and ask, OK, when are your normal working hours, and when are your off hours? Because the data shows very clearly that when people decide to maliciously take some information, they try to do it stealthily, when they don’t normally work. Ironically, that is itself a red flag, so people who do that actually raise their profile. Another thing people often do, and you’re going to say, yeah, that does sound obvious: you and I maybe work on 100 or 200 files in any given week, which is our normal working process. If all of a sudden I touch 4,000 files next week, that’s an indicator that somebody is doing something very different, and it may be another indicator of compromise suggesting that there needs to be a second look at that person’s activity. But I’ll tell you the greatest indicator that someone might need a second look at their activity: they quit. In this country last year, about 95 percent of people changed jobs. We have a very mobile workforce, and that’s great in many ways. It’s good for employees; it’s a way to upskill and advance your career, and it’s part of what makes America what it is, which I think is fantastic. But when people switch jobs, they typically stay in the industry they’re in. They don’t leave industries.
So if I’m leaving a company in one industry and going to another company in the same industry, it’s often tempting to think, gee, wouldn’t this information be super helpful to me in my next job? That’s where most of the breaches actually happen: when people leave companies. Unfortunately, that increases multiplicatively when they are let go, because they’re angry about leaving; instead of choosing to leave the company, they’ve been laid off or furloughed, et cetera. Over the last few months, in this difficult economic environment, when people are laid off and angry, we see even greater exfiltration of data. There are a number of cases in the press recently where you can see organizations pursuing employees who took really, really important intellectual property. And remember, there are rocket ships being built, there are new types of cars being invented, self-driving things, which is really expensive intellectual property. But it’s not just that. One of our clients is a company named McDonnell Miller, which is in the HVAC business. You would think there’s not a lot of intellectual property in that business, but they invented a little capability where you could see, in virtual reality, how their systems would work in your new building, and they won a few bids against their competitors because of it. So one of their competitors hired someone away from the company and asked them to bring that virtual reality technology with them so they could compete. Fortunately, they use our software, and that person was caught before they were able to take that data to the competitor. So it’s not just the cutting-edge companies that are seeing this; companies in all areas of the economy have to watch out for this problem.
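[Editor’s note: the two behavioral red flags Joe describes, off-hours activity and a sudden spike in files touched, can be sketched with simple baseline statistics. The 5 percent “usual hours” share and the 3-sigma spike threshold below are assumed values for illustration, not how Code42 or any vendor actually models this.]

```python
# Illustrative sketch of two insider-risk red flags: activity far outside a
# person's usual working hours, and a sudden spike in files touched.
from collections import Counter
from statistics import mean, pstdev

def usual_hours(event_hours, min_share=0.05):
    """Hours of the day (0-23) in which this user historically does at
    least min_share of their activity."""
    counts = Counter(event_hours)
    total = sum(counts.values())
    return {hour for hour, n in counts.items() if n / total >= min_share}

def red_flags(history_hours, weekly_file_counts, files_this_week, event_hour):
    """Return the list of red flags raised by this week's activity."""
    flags = []
    if event_hour not in usual_hours(history_hours):
        flags.append("off-hours activity")
    mu = mean(weekly_file_counts)
    sigma = pstdev(weekly_file_counts)
    if files_this_week > mu + 3 * max(sigma, 1):  # crude 3-sigma spike test
        flags.append("file-volume spike")
    return flags
```

With a history of daytime work and 100 to 150 files a week, touching 4,000 files at 2 a.m. would raise both flags; either one alone just marks the person for a second look, in keeping with “assume positive intent, but verify.”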

Ceci Amador [00:27:01] Yeah, that’s mind-blowing. But I can definitely see how it makes sense that people who are let go can be angrier and have more malicious intent at times. Say a company identifies some of these red flags before a person leaves. What are some things they can do? What protocols should they have in place to handle the situation and prevent the information from leaving the company?

Joe Payne [00:27:29] Honestly, that’s the best scenario, and one thing that we really stress in our business is finding these breaches as they happen, before people leave or shortly thereafter, because they’re so much easier to handle before they get into the courts and law enforcement. The protocols are pretty simple. First, you ask the employee: hey, this looks suspicious, why are you taking all this information? If the security team thinks it’s an egregious breach, they might go to the manager instead of the employee, or they might even go to HR or the legal department. But solving it up front, with the employee, is so much more efficient. Probably one of the most famous insider threat cases is Anthony Levandowski, who took a whole bunch of data out of Google’s Waymo and eventually ended up at Uber. If you follow that case, Waymo didn’t discover it until a year after he had left, and Uber didn’t know he had brought the information in. Fast forward to where we are today: he’s looking at potential prison time, Uber had to pay hundreds of millions of dollars in restitution to Google, Google’s unhappy, everybody’s unhappy. For the person who took the data, it would have been really nice to catch him while it was happening, so they could have just said: you need to give that data back, you need to promise you’re not going to use it, and you’re on notice that you have it. Then he wouldn’t be facing prison time today. And this is one of the smartest engineers in the country, who made some really bad decisions. So we are big believers in catching these incidents early and course-correcting people and activities so that you don’t have to deal with massive lawsuits, law enforcement, all that kind of stuff. We just think that’s a better way.

Ceci Amador [00:29:46] And then, switching gears a little bit: the COVID-19 pandemic. Like you said, organizations were already storing their information and data digitally, but with the pandemic, a lot of people were forced into a situation where they had to find a new way to work and use new software. At the beginning of the pandemic, there was a lot of talk that this increased security threats. Have you actually seen this happen over the past six months, or has it been about the same? And what are some of the new implications or areas companies should focus on if they will continue to allow workers to work remotely in the long term?

Joe Payne [00:30:31] Well, there’s no question that the surface area for digital problems is much greater now, with everybody working on their home network and on their machine at home. So yes, there have been a lot more cybersecurity incidents. What it comes down to is that, as we’ve moved lots of software and applications to the cloud, at the end of the day, that endpoint sitting in Starbucks or in your home office really needs protection too. There was a move in security to focus on the network and on the cloud, as if you didn’t really need to worry about the endpoint anymore. I think security teams have circled back to say, wow, that endpoint is a place that can be attacked, but it’s also a place where exfiltration can occur. When an insider wants to take information, they always pull it down from the cloud and then take it from the endpoint: they either upload it to their Dropbox account or put it on a thumb drive. So having visibility into that is really important for an organization, to help, when necessary, course-correct the team so it doesn’t do things it shouldn’t.

Ceci Amador [00:31:54] Awesome. And then, just to recap a little bit of what we talked about, can you give us the highlights that you think people should take away from this conversation today?

Joe Payne [00:32:06] I think the most important thing companies should take away from this conversation is that you need to have a program around insider risk. Today, two thirds of breaches are caused by insiders, but only 10 percent of the security budget is focused on that.

Joe Payne [00:32:22] So you need a program, and that program should include transparency, training, and technology. Companies need to think about that, and that’s what Code42 does: we help companies build those programs and install that technology. If you’re an employee listening to this, I think the most important thing to remember is to keep a healthy separation between your personal information and data and your company’s information and data, because the company owns that information. And any activity you do on your company laptop or desktop is going to be monitored by most companies; you should have that expectation going in.

Joe Payne [00:33:11] And if you do happen to leave your organization, because you’re going to move on to a new job or do something different, and you want to take some of the information you worked on while you were there, make sure you ask. Don’t assume that it’s OK and get yourself in trouble. Ask. As I said earlier, we’ve found that most organizations will let you take your contacts and some samples of your work, and they’ll be more than happy to help you do that. And you’ll leave your old employer on the right foot.

Ceci Amador [00:33:40] Perfect. Thank you so much. This was a very insightful conversation, and it definitely covers topics we haven’t previously discussed with our listeners. So, Joe, thank you again for taking the time to chat with us.

Joe Payne [00:33:53] I am thrilled to have been invited. Thank you.

Ceci Amador [00:33:55] And thank you, everyone, for tuning in once again to the Future of Work podcast. Remember, you can also tune in on Allwork.Space, Apple Podcasts, Spotify, Google Play, Stitcher, TuneIn Radio and Podbean.
