On April 16, 2024, a group of people walked into the office of Google Cloud CEO Thomas Kurian and staged a sit-in. They were protesting Google’s contract with the state of Israel to build an IT cloud platform for the Israeli government and military. [1]
Ironically, Mr. Kurian knew these people. How?
Security professionals might argue that this specific incident resulted from inadequate insider threat detection. The protestors were verified company employees. They disrupted Google's ability to conduct business and behaved in ways that Google said were "completely unacceptable." [2] After an investigation, Google fired the employees.
Insider threat detection is the ability to proactively and holistically identify, prevent, and/or mitigate an incident that may harm an organization’s ability to perform its mission.
Google tells the world its mission "is to organize the world's information and make it universally accessible and useful." The protesting Google employees accepted that mission upon joining the company. But the protestors had another mission as well, a mission reflected in their statement of values: "Technology should be used to bring people together, not enable apartheid, ethnic cleansing, and settler-colonialism." [3]
When values collide, insider threats may arise
When a company's values collide with those of an employee, the costs can be high for both. The employee loses a salary, benefits, and a solid reference for career advancement. The company loses intellectual capital, productivity, and up to 125% of that employee's salary in recruiting and hiring a replacement. [4] Replacing its 50 fired employees will undoubtedly cost Google millions of dollars.
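For a rough sense of scale (the salary figure here is purely illustrative, not a reported number): if those 50 employees earned an average of $150,000, replacing them at 125% of salary works out to roughly 50 × $150,000 × 1.25, or about $9.4 million.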
Wouldn't it be better to identify, detect, and deter or mitigate an insider risk before it escalates into an actual threat to the organization? At Babel Street, we call this "being left of boom." By "left of boom" we mean the time before the incident occurs, i.e., before the bomb goes off.
Later in this post, we’ll discuss how insider threat incidents like this can be mitigated, neutralized, and even completely prevented by building a new paradigm of insider risk detection — a paradigm that emphasizes not punishment, but cooperation amongst the employee population. First, though, let’s review what security professionals mean by insider threat detection.
Insider threat detection: a professional definition
The Cybersecurity & Infrastructure Security Agency (CISA) defines insider threat as follows:
“An insider is any person who has or had authorized access to or knowledge of an organization’s resources, including personnel, facilities, information, equipment, networks, and systems… Insider threat is the potential for an insider to use their authorized access or understanding of an organization to harm that organization.” [5]
CISA goes on to classify insider threats as unintentional or intentional. An unintentional threat can be either negligent (leaving a laptop screen open in a coffee shop while grabbing a refill) or accidental (clicking the link in a phishing email).
An intentional insider threat arises when a person within an organization intends to commit a malicious act that will harm that organization. Discovering and defusing that threat, the practice of insider threat detection, has become increasingly critical for organizations because the phenomenon is growing rapidly: between 2023 and 2024, insider threats grew 23%.
Examples of such malicious acts are:
- Leaks
- Sabotage
- Espionage
- Workplace violence
As defined by CISA, workplace violence includes harassment and intimidation. [5] Under this definition, one could argue that the sit-in at Google on April 16 was an example of workplace violence.
In a moment, we'll give examples of insider threats that manifested as leaks, sabotage, and espionage. First, though, we'll discuss what motivates an insider to become an insider threat.
What motivates an employee to become a threat?
According to a study by Ekran, the top three insider threat actors are:
- Regular employees who, through negligence, open the digital door to cyber attacks.
- Third-party contractors who have access to an organization's IT systems and who either fail to follow proper cybersecurity policies or conduct malicious attacks.
- Employees (systems administrators, developers, management) who have privileged access to IT systems and misuse that privilege to conduct an attack.
In that same report [6], Ekran notes that insider threat actors who misuse privileges are motivated by the following reasons:
| Motivation | Percentage |
| --- | --- |
| Financial Gain | 89% |
| Grudge | 13% |
| Espionage | 5% |
| Convenience | 3% |
| Ideology | 2% |
Given these root motivations, how does an employee, an organization's trusted insider, transform into a threat who intends to harm that organization? What path does that person take?
An excellent report published by CISA helps explain [7]. It’s worth quoting at length:
“While virtually every person will experience stressful events in their lives, the vast majority do so without resorting to disruptive or destructive acts. For those insiders that do turn to malicious activity, researchers have determined that the acts are rarely spontaneous; instead, they are usually the result of a deliberate decision to act. Researchers of insider threats describe an evolution from trusted insider to insider threat as a critical pathway wherein the subject’s personal predispositions and background, which make them susceptible to the temptation of a malicious act, interact with their personal stressors and the organizational environment, moving them down a pathway toward a malicious incident. Often, the perpetrator harbors resentment, displaying behaviors that may be observed and reported by peers and colleagues. The journey may be rapid or slow, and the path varies from person to person. Warning signs, stressors, and behaviors may be evident along the progression to action. A deeply held grievance or humiliation, whether real or perceived, is often the first step on a journey toward intended violence.”
What we can conclude is this: an insider becomes a threat because of stress or other factors, and the insider who chooses to become a threat will leave clues. Insider threat detection is the process of proactively reading those clues, reporting them to the appropriate security personnel, and acting on them to prevent harm to the organization or its personnel.
Let’s now look at examples of what happened when organizations did not read the clues.
What happens when insider threat detection fails?
The cost of an insider threat incident is not trivial.
From 2020 to 2022, the number of insider threat incidents rose by 44%. The average annual cost to affected organizations climbed to more than $15 million. [8]
These costs are borne by investors in private companies, shareholders of public companies, customers of those companies, and taxpayers who support the government.
The costs in damage to a company’s reputation and a country’s national security are incalculable.
Here are some instances of what happens when insider threat detection fails.
Sabotage motivated by a grudge
Take, for example, the case of Christopher Dobbins.
In March of 2020, when COVID-19 was spreading across the United States, personal protective equipment (PPE) was in high demand by both the public and healthcare professionals. But Stradis Healthcare, a company that supplies PPE, was experiencing extreme delays in its shipping.
Was the demand too much for the company? Was that the cause?
In a word, no.
Earlier in 2020, Stradis had fired one of its executives, Christopher Dobbins. In retaliation, Mr. Dobbins created a secret systems account that could access the company's shipping system. Even though Stradis had revoked Mr. Dobbins' known accounts, its security team did not detect this clandestine account. [9]
After separating from Stradis, Mr. Dobbins used that account to modify and delete data in Stradis’ shipping system. [10] This malicious act disrupted the company’s operations for months, aggravating an already deepening health crisis in the United States. Eventually, Mr. Dobbins was sentenced to a year in prison for his crime.
Mr. Dobbins was motivated by a grudge and used sabotage to carry out his insider attack.
Espionage motivated by financial stress
Over $200,000 in debt and more than $12,000 behind on his mortgage, former CIA officer Kevin Mallory flew to China in April 2017. There he met with a man who worked for a Chinese think tank.
This man was not, however, what he presented himself to be on LinkedIn when he first contacted Mr. Mallory. The man’s name was Michael Yang, and he was, in fact, a Chinese intelligence officer tasked with recruiting Mr. Mallory.
A month later, when interviewed by the FBI, Mr. Mallory admitted that he had received $25,000 in exchange for “white papers” he gave Mr. Yang. Upon examining a communication device Mr. Mallory was given in China, the FBI found messages in which Mr. Mallory promised to deliver eight classified documents to Mr. Yang in June of that year.
Kevin Mallory was sentenced to 20 years in prison for his malicious insider act of espionage, motivated by financial stress. [11]
Leak of national security documents motivated by ego
Jack Teixeira was 22 years old when he pleaded guilty to posting national security documents to an online chat group. An IT specialist with the Massachusetts Air National Guard, Airman Teixeira uploaded the highly sensitive documents simply because he wanted to impress his teenage friends. [12]
What's interesting about this case is that the people who worked directly with Airman Teixeira, members of the Massachusetts Air National Guard, knew that he engaged in "questionable activities." Though not accused of knowing about Airman Teixeira's leaking of national security documents, his fellow airmen knew that Teixeira was actively seeking intelligence documents. Fifteen of those airmen were disciplined for failing to take proper action, and Airman Teixeira's commanding officer was relieved of duty.
Jack Teixeira received a federal prison sentence of 15 years for leaking extremely sensitive national security documents just because his ego craved the attention.
Let’s look at what an insider threat detection program is and what it can accomplish.
Three approaches to insider threat detection
As we have documented here, insider threats are increasing rapidly each year. The costs of these threats, in economic damage and harm to national security, are likewise accelerating.
Given these costs, every public organization and every company needs to develop an insider threat detection program. By law, U.S. government agencies are mandated to have an insider threat program; however, no resources were made available when the law was enacted, so the effectiveness of each agency's insider threat program varies.
We at Babel Street advocate a holistic and comprehensive insider threat detection program that consists of three coordinated approaches:
- Adopt a holistic plan. The decision to betray an organization is a human decision. It is imperative that organizations focus on the behaviors associated with insider acts so an effective program can be implemented. Training a workforce to recognize these behaviors is paramount to detecting damaging insider actions.
- Implement reactive and proactive prevention strategies. Prevention is the goal: mitigate or neutralize a potential attack before it occurs (left of boom).
- Develop insider trust. By increasing trust at all levels, organizations reduce the risk of insider acts.
Before we detail our plan, let’s zero in on where the attacks are coming from. This quote summarizes it succinctly:
“Attackers typically reach out to employees via email, phone, or social media, and offer them large financial incentives to support them in their attacks. Per a study conducted by Hitachi, this engagement trend has been increasing, from 48% in 2021 to 65% in 2022.” [13]
A comprehensive approach to insider threat detection must focus on employees, but not in a punitive sense. Rather, organizations should strive to increase trust and empower employees to be part of the solution to prevent the damaging effects of insider attacks.
We advocate a program that fosters trust and cooperation between management, security, and employees. As we have seen in our examples of insider threat stories, everyone in the organization is adversely affected by an insider threat attack.
In short, we advocate a new paradigm for insider threat detection, a paradigm that we believe will more likely lead to success.
Let’s take a look at our paradigm:
Adopt a holistic plan
We advocate a holistic plan that includes a detailed and thorough insider threat detection policy, strong leadership within the organization to champion the plan, and employee education that empowers the workforce to carry out the plan.
Such a plan will provide the foundation upon which a successful insider threat detection program can be built. Without this plan, the entire effort will crumble.
Implement reactive and proactive prevention strategies
Once we have developed a detailed plan and a thorough insider threat policy, we can implement strategies to carry out that policy.
Strategy: Reactive insider threat detection
A reactive insider threat detection strategy uses IT resources (email logs and filters, spam and phishing detection software, web logs, firewalls, and monitoring, alerting, and security software) to detect and prevent attacks.
Using these resources, security staff can also conduct “penetration tests” to detect weaknesses in an organization’s network that leave it vulnerable to attack. Scanning employee computers for viruses and malware is another robust practice that can thwart attacks.
A robust reactive threat detection strategy might have prevented the insider attack that occurred at Stradis Healthcare in 2020. Even though the login the perpetrator created was clandestine, the account existed somewhere in the system. A thorough security audit might have revealed it.
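To make that idea concrete, here is a minimal sketch of the kind of account-reconciliation check such an audit could include: compare every active system account against the roster of currently authorized users and flag anything that does not match. The data, names, and function below are hypothetical and illustrative only, not a description of any specific product or of Stradis' systems.

```python
# Hypothetical account-reconciliation audit: flag any active account that
# cannot be matched to a current, authorized user record.
# All usernames and data below are invented for illustration.

active_accounts = [
    {"username": "jsmith", "last_login": "2020-03-25"},
    {"username": "warehouse_admin2", "last_login": "2020-03-29"},  # not in the roster
]

# Authorized usernames, e.g., exported from the HR or identity system
authorized_users = {"jsmith", "mlee", "tnguyen"}

def find_unrecognized_accounts(accounts, roster):
    """Return accounts whose usernames do not appear in the authorized roster."""
    return [acct for acct in accounts if acct["username"] not in roster]

for acct in find_unrecognized_accounts(active_accounts, authorized_users):
    print(f"ALERT: unrecognized account '{acct['username']}' "
          f"(last login {acct['last_login']}) - review and disable")
```

Run on a schedule as part of a security audit, a check like this surfaces orphaned or clandestine accounts for human review rather than trying to judge intent automatically.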
Strategy: Proactive insider threat detection
A proactive insider threat detection strategy uses behavior analysis, training of the workforce, human interaction analysis, and Open Source Intelligence (OSINT). [13]
Behavior analysis enabled the FBI to determine that Kevin Mallory’s travel to China was alarming enough to begin questioning him. Mallory’s meeting with a Chinese agent was part of the FBI’s human interaction analysis.
OSINT can be used to develop a more comprehensive profile of a person. A rounded portrait will enable an organization to more clearly understand what an employee is thinking and feeling. When posting online, an insider may choose to explain, discuss, reason, and persuade. Or that person may decide to boast, vent, rage, and even threaten.
In all cases, a person’s social media presence leaves clues for security personnel to identify and take action to get left of boom before an insider attack occurs. OSINT can help researchers analyze and summarize those clues to fill a profile’s contour with character.
If the Massachusetts Air National Guard had, for example, adopted an effective program of proactive threat detection, perhaps they would have discovered that Jack Teixeira had “posted about ‘violence and murder’ in online forums, asked for advice on how to turn an SUV into an ‘assassination van,’ researched mass shootings, and amassed an ‘arsenal’ of weapons in his home.” [12]
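As a simplified illustration of how such clues might be surfaced for a human analyst, the sketch below screens a handful of public posts for warning-sign language of the kind CISA describes. The terms and posts are invented, and any real OSINT program would add far more context, analyst judgment, and legal and privacy review on top of a first pass like this.

```python
# Hypothetical first-pass screen of public posts for language associated with
# grievance or violent intent. Terms and posts are invented for illustration;
# real OSINT analysis is far more nuanced and must respect legal constraints.

WARNING_TERMS = {"they'll regret", "assassination", "arsenal"}

public_posts = [
    "Great hike this weekend with the team!",
    "One day they'll regret how they treated me.",
]

def flag_posts(posts, terms):
    """Return posts containing any warning term, for an analyst to review in context."""
    flagged = []
    for post in posts:
        lowered = post.lower()
        if any(term in lowered for term in terms):
            flagged.append(post)
    return flagged

for post in flag_posts(public_posts, WARNING_TERMS):
    print("Escalate for analyst review:", post)
```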
Develop insider trust
Trust is critical to the success of the insider threat detection program. As we stated in our webinar on insider risk management, by empowering employees and increasing trust, organizations can reduce risk. Empower employees to be part of the solution — they are the first line of defense.
The word “threat” has a negative connotation. You can, if you like, name an insider threat detection program the “employee safety and well-being program.” Because employees are most often the target of cyberattacks and are the first line of defense, they have the most to gain by being educated about the behaviors associated with insider attacks. This education simultaneously reduces insider risk while increasing trust in the organization.
People want to trust each other. To foster this trust, insider threat detection needs to be a complete campaign: policy (legal and policy personnel working together), technology, people, and leadership. Tools alone are not the answer and should not be relied upon as the sole means of preventing insider threats. A holistic approach to combating insider threats enables everyone in the organization to defend what they value most: a safe, productive, and prosperous workplace where everyone can be successful.
To learn more about our recommendations on insider threat detection, please view our webinar. [14]
End Notes
1. Anonymous Google and Amazon Workers, October 12, 2021, “We are Google and Amazon workers. We condemn Project Nimbus”, https://www.theguardian.com/commentisfree/2021/oct/12/google-amazon-workers-condemn-project-nimbus-israeli-military-contract
2. Clare Duffy, May 1, 2024, “Former Google workers fired for protesting Israel deal file complaint claiming protected speech”, https://www.cnn.com/2024/05/01/tech/google-workers-nlrb-complaint-israeli-palestinian-protest/index.html
3. No Tech for Apartheid, https://www.notechforapartheid.com/
4. “Calculating the Cost of Employee Turnover”, https://www.gnapartners.com/resources/articles/how-much-does-employee-turnover-really-cost-your-business
5. Cybersecurity & Infrastructure Security Agency, “Defining Insider Threats”, https://www.cisa.gov/topics/physical-security/insider-threat-mitigation/defining-insider-threats
6. Yana Storchak, February 14, 2024, “Insider Threat Statistics for 2024: Reports, Facts, Actors, and Costs”, https://www.ekransystem.com/en/blog/insider-threat-statistics-facts-and-figures
7. Cybersecurity & Infrastructure Security Agency, November 2020, “Insider Threat Mitigation Guide”, https://www.cisa.gov/sites/default/files/2022-11/Insider%20Threat%20Mitigation%20Guide_Final_508.pdf
8. Ponemon Institute, 2022, “2022 Cost of Insider Threats Global Report”, https://www.proofpoint.com/us/resources/threat-reports/cost-of-insider-threats
9. Becky Bracken, January 7, 2021, “Fired Healthcare Exec Stalls Critical PPE Shipment for Months”, https://threatpost.com/healthcare-exec-stalls-critical-ppe-shipment/162855/
10. AP News, October 20, 2020, “Man Gets Prison for Sabotage That Caused PPE Delivery Delay”, https://apnews.com/general-news-national-national-463e41083b75b83c44fc6a9d00dc952c
11. Center for Development of Security Excellence, “Case Study Espionage Kevin Patrick Mallory”, https://www.cdse.edu/Portals/124/Documents/casestudies/case-study-mallory.pdf
12. Vera Bergengruen, March 4, 2024, "Jack Teixeira Pleads Guilty to Massive Leak of Pentagon Secrets", https://time.com/6837530/teixeira-discord-leak-guilty-plea/
13. Muhammad Muneer, Chris Madge, Arjun Bhardwaj, November 16, 2023, “Insider Threat: Hunting and Detecting”, https://www.mandiant.com/resources/blog/insider-threat-hunting-detecting
14. DoDIIS 2023 Webinar Series, 2023, “Securing Your Inner Circle Webinar: Mastering Insider Risk with Public Data Insights”, https://www.babelstreet.com/landing/securing-your-inner-circle-mastering-insider-risk-management-with-public-data-insights
15. Babel Street, “Developing a Holistic Insider Risk Management Program Can Improve DoD Security”, https://www.babelstreet.com/blog/developing-a-holistic-insider-risk-management-program-can-improve-dod-security