What are the best practices for the prevention and detection of insider threats?

  

Insider Threat

Learning Objective: Describe the security practices used to control employee behavior and prevent misuse of information.

Assignment Requirements
Review CERT's Common Sense Guide to Prevention and Detection of Insider Threats. Choose one of the 16 best practices listed in the document. Write a summary paper that includes the following:
- Introduce the problem, the insider threat.
- Summarize the best practice you selected as if you are describing it to a Human Resources person in your organization.
- Conclude with a recommendation of how to implement the best practice in your organization.

CERT Document: Download Here

Submission Requirements
Format: Microsoft Word
Font: Arial, 12-point, double-spaced
Citation Style: APA
Length: 2 pages (plus a cover sheet)
Common Sense Guide to Prevention and
Detection of Insider Threats
3rd Edition Version 3.1
Dawn Cappelli
Andrew Moore
Randall Trzeciak
Timothy J. Shimeall
January 2009
This work was funded by Carnegie Mellon CyLab.
Copyright 2009 Carnegie Mellon University.
NO WARRANTY
THIS CARNEGIE MELLON UNIVERSITY MATERIAL IS FURNISHED ON AN AS-IS BASIS. CARNEGIE
MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS
TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR
MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL.
CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT
TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.
Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.
Internal use. Permission to reproduce this document and to prepare derivative works from this document for internal
use is granted, provided the copyright and No Warranty statements are included with all reproductions and derivative
works.
External use. Requests for permission to reproduce this document or prepare derivative works of this document for
external and commercial use should be directed to permission@sei.cmu.edu.
Table of Contents
INTRODUCTION
WHAT IS MEANT BY "INSIDER THREAT?"
CERT'S DEFINITION OF A MALICIOUS INSIDER
ARE INSIDERS REALLY A THREAT?
WHO SHOULD READ THIS REPORT?
CAN INSIDERS BE STOPPED?
ACKNOWLEDGEMENTS
PATTERNS AND TRENDS OBSERVED BY TYPE OF MALICIOUS INSIDER ACTIVITY
INSIDER IT SABOTAGE
THEFT OR MODIFICATION FOR FINANCIAL GAIN
THEFT OF INFORMATION FOR BUSINESS ADVANTAGE
SUMMARY
BEST PRACTICES FOR THE PREVENTION AND DETECTION OF INSIDER THREATS
SUMMARY OF PRACTICES
PRACTICE 1: CONSIDER THREATS FROM INSIDERS AND BUSINESS PARTNERS IN ENTERPRISE-WIDE RISK ASSESSMENTS. (UPDATED)
PRACTICE 2: CLEARLY DOCUMENT AND CONSISTENTLY ENFORCE POLICIES AND CONTROLS. (NEW)
PRACTICE 3: INSTITUTE PERIODIC SECURITY AWARENESS TRAINING FOR ALL EMPLOYEES. (UPDATED)
PRACTICE 4: MONITOR AND RESPOND TO SUSPICIOUS OR DISRUPTIVE BEHAVIOR, BEGINNING WITH THE HIRING PROCESS. (UPDATED)
PRACTICE 5: ANTICIPATE AND MANAGE NEGATIVE WORKPLACE ISSUES. (NEW)
PRACTICE 6: TRACK AND SECURE THE PHYSICAL ENVIRONMENT. (NEW)
PRACTICE 7: IMPLEMENT STRICT PASSWORD AND ACCOUNT MANAGEMENT POLICIES AND PRACTICES. (UPDATED)
PRACTICE 8: ENFORCE SEPARATION OF DUTIES AND LEAST PRIVILEGE. (UPDATED)
PRACTICE 9: CONSIDER INSIDER THREATS IN THE SOFTWARE DEVELOPMENT LIFE CYCLE. (NEW)
PRACTICE 10: USE EXTRA CAUTION WITH SYSTEM ADMINISTRATORS AND TECHNICAL OR PRIVILEGED USERS. (UPDATED)
PRACTICE 11: IMPLEMENT SYSTEM CHANGE CONTROLS. (UPDATED)
PRACTICE 12: LOG, MONITOR, AND AUDIT EMPLOYEE ONLINE ACTIONS. (UPDATED)
PRACTICE 13: USE LAYERED DEFENSE AGAINST REMOTE ATTACKS. (UPDATED)
PRACTICE 14: DEACTIVATE COMPUTER ACCESS FOLLOWING TERMINATION. (UPDATED)
PRACTICE 15: IMPLEMENT SECURE BACKUP AND RECOVERY PROCESSES. (UPDATED)
PRACTICE 16: DEVELOP AN INSIDER INCIDENT RESPONSE PLAN. (NEW)
REFERENCES/SOURCES OF BEST PRACTICES
INTRODUCTION
In 2005, the first version of the Common Sense Guide to Prevention and Detection of
Insider Threats was published by Carnegie Mellon University's CyLab. The document
was based on the insider threat research performed by CERT, primarily the Insider
Threat Study [1] conducted jointly with the U.S. Secret Service. It contained a description
of twelve practices that would have been effective in preventing or detecting malicious
insider activity in 150 actual cases collected as part of the study. The 150 cases occurred
in critical infrastructure sectors in the U.S. between 1996 and 2002.
A second edition of the guide was released in July of 2006. The second edition included a
new type of analysis by type of malicious insider activity. It also included a new section
that presented a high-level picture of different types of insider threats: fraud, theft of
confidential or proprietary information, and sabotage. In addition, it contained new
and updated practices based on new CERT insider threat research funded by Carnegie
Mellon CyLab [2] and the U.S. Department of Defense Personnel Security Research
Center. [3] Those projects involved a new type of analysis of the insider threat problem
focused on determining high-level patterns and trends in the cases. Specifically, those
projects examined the complex interactions, relative degree of risk, and unintended
consequences of policies, practices, technology, insider psychological issues, and
organizational culture over time.
This third edition of the Common Sense Guide once again reflects new insights from
ongoing research at CERT. CyLab has funded the CERT Insider Threat Team to collect
and analyze new insider threat cases on an ongoing basis. The purpose of this ongoing
effort is to maintain a current state of awareness of the methods being used by insiders to
commit their attacks, as well as new organizational issues influencing them to attack.
This version of the guide includes new and updated practices based on an analysis of
approximately 100 recent insider threat cases that occurred from 2003 to 2007 in the U.S.
In this edition of the guide, CERT researchers also present new findings derived from
looking at insider crimes in a new way. These findings are based on CERT's analysis of
118 theft and fraud cases, which revealed a surprising finding. The intent of the research
was to analyze cases of insider theft and insider fraud to identify patterns of insider
behavior, organizational events or conditions, and technical issues across the cases. The
patterns identified separated the crimes into two different classes than originally
expected:
Theft or modification of information for financial gain – This class includes cases
where insiders used their access to organization systems either to steal
[1] See http://www.cert.org/insider_threat/study.html for more information on the Insider Threat Study.
[2] A report describing the MERIT model of insider IT sabotage, funded by CyLab, can be downloaded at http://www.cert.org/archive/pdf/08tr009.pdf.
[3] A report describing CERT's insider threat research with the Department of Defense can be downloaded from http://www.cert.org/archive/pdf/06tr026.pdf.
information that they sold to outsiders, or to modify information for financial gain
for themselves or others.
Theft of information for business advantage – This class includes cases where
insiders used their access to organization systems to obtain information that they
used for their own personal business advantage, such as obtaining a new job or
starting their own business.
It is important that organizations recognize the differences in the types of employees who
commit each type of crime, as well as how each type of incident evolves over time: theft
or modification for financial gain, theft for business advantage, IT sabotage, and
miscellaneous (incidents that do not fall into any of the three above categories). This
version of the guide presents patterns and trends observed in each type of malicious
activity. There have been minor updates to the IT sabotage information in this guide;
however, the most significant enhancements in this edition were made to the theft and
modification sections.
Some new practices were added in this edition that did not exist in the second edition. In
addition, every practice from the second edition has been modified, some significantly,
others to a lesser degree, to reflect new insights from the past year's research at CERT.
Case examples from the second edition were retained in this edition for the benefit of
new readers. However, a Recent Findings section was included for all updated practices.
It details recent cases that highlight new issues not covered in the previous edition of this
guide.
What is Meant by “Insider Threat?”
CERT's definition of a malicious insider is:

A current or former employee, contractor, or business partner who
- has or had authorized access to an organization's network, system, or data and
- intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization's information or information systems.
Note that one type of insider threat is excluded from this guide: cases of espionage
involving classified national security information.
The scope of insider threats has been expanding beyond the traditional threat posed by a
current or former employee. Specifically, the CERT team has noted the following
important new issues in the expanding scope of insider threat.
Collusion with outsiders: Insider threat has expanded beyond the organizational
boundary. Half of the insiders who stole or modified information for financial gain were
actually recruited by outsiders, including organized crime and foreign organizations or
governments. It is important to pay close attention to the section of the guide titled "Theft
or Modification of Information for Financial Gain." It will help you understand the types
of employees who may be susceptible to recruitment.
Business partners: A recent trend noted by the CERT research team is the increase in the
number of insider crimes perpetrated not by employees, but by employees of trusted
business partners who have been given authorized access to their clients' networks,
systems, and data. Suggestions for countering this threat are presented in Practice 1.
Mergers and acquisitions: A recent concern voiced to the CERT team by industry is the
heightened risk of insider threat in organizations being acquired by another organization.
It is important that organizations recognize the increased risk of insider threat both within
the acquiring organization, and in the organization being acquired, as employees endure
stress and an uncertain organizational climate. Readers involved in an acquisition should
pay particular attention to most of the practices in this guide.
Cultural differences: Many of the patterns of behavior observed in CERT's insider threat
modeling work are reflected throughout this guide. However, it is important for readers to
understand that cultural issues could influence employee behaviors; those same
behavioral patterns might not be exhibited in the same manner by people who were raised
or spent extensive time outside of the U.S.
Issues outside the U.S.: CERT's insider threat research is based on cases that occurred
inside the United States. It is important for U.S. companies operating branches outside
the U.S. to understand that, in addition to the cultural differences influencing employee
behavior, portions of this guide might also need to be tailored to legal and policy
differences in other countries.
Are insiders really a threat?
The threat of attack from insiders is real and substantial. The 2007 E-Crime Watch
Survey™, conducted by the United States Secret Service, the CERT Coordination Center
(CERT/CC), Microsoft, and CSO Magazine, [4] found that in cases where respondents
could identify the perpetrator of an electronic crime, 31% were committed by insiders. In
[4] http://www.cert.org/archive/pdf/ecrimesummary07.pdf
addition, 49% of respondents experienced at least one malicious, deliberate insider
incident in the previous year. The impact from insider attacks can be devastating. One
employee working for a manufacturer stole blueprints containing trade secrets worth
$100 million, and sold them to a Taiwanese competitor in hopes of obtaining a new job
with them.
Over the past several years, Carnegie Mellon University has been conducting a variety of
research projects on insider threat. One of the conclusions reached is that insider attacks
have occurred across all organizational sectors, often causing significant damage to the
affected organizations. Examples of these acts include the following:
Low-tech attacks, such as modifying or stealing confidential or sensitive
information for personal gain.
Theft of trade secrets or customer information to be used for business advantage
or to give to a foreign government or organization.
Technically sophisticated crimes that sabotage the organization's data, systems, or
network.
Damages in many of these crimes are not only financial; widespread public reporting of
the event can also severely damage the organization's reputation.
Insiders have a significant advantage over others who might want to harm an
organization. Insiders can bypass physical and technical security measures designed to
prevent unauthorized access. Mechanisms such as firewalls, intrusion detection systems,
and electronic building access systems are implemented primarily to defend against
external threats. However, not only are insiders aware of the policies, procedures, and
technology used in their organizations, but they are often also aware of their
vulnerabilities, such as loosely enforced policies and procedures or exploitable technical
flaws in networks or systems.
CERT's research indicates that use of many widely accepted best practices for
information security could have prevented many of the insider attacks examined. Part of
CERT's research of insider threat cases entailed an examination of how each organization
could have prevented the attack or at the very least detected it earlier. Previous editions of
the Common Sense Guide identified existing best practices critical to the mitigation of
the risks posed by malicious insiders. This edition identifies additional best practices
based on new methods and contextual factors in recent cases, and also presents some new
suggestions for countering insider threat based on findings that could not be linked to
established best practices.
Based on our research to date, the practices outlined in this report are the most important
for mitigating insider threats.
Who should read this report?
This guide is written for a diverse audience. Decision makers across an organization can
benefit from reading it. Insider threats are influenced by a combination of technical,
behavioral, and organizational issues, and must be addressed by policies, procedures, and
technologies. Therefore, it is important that management, human resources, information
technology, software engineering, legal, security staff, and the owners of critical data
understand the overall scope of the problem and communicate it to all employees in the
organization.
The guide outlines practices that should be implemented throughout organizations to
prevent insider threats. It briefly describes each practice, explains why it should be
implemented, and provides one or more actual case examples illustrating what could
happen if it is not, as well as how the practice could have prevented an attack or
facilitated early detection.
Much has been written about the implementation of these practices (a list of references on
this topic is provided at the end of this guide). This report provides a synopsis of those
practices, and is intended to convince the reader that someone in the organization should
be given responsibility for reviewing existing organizational policies, processes, and
technical controls and for recommending necessary additions or modifications.
Can insiders be stopped?
Insiders can be stopped, but stopping them is a complex problem. Insider attacks can only
be prevented through a layered defense strategy consisting of policies, procedures, and
technical controls. Therefore, management must pay close attention to many aspects of its
organization, including its business policies and procedures, organizational culture, and
technical environment. It must look beyond information technology to the organization's
overall business processes and the interplay between those processes and the technologies
used.
Acknowledgements
In sponsoring the Insider Threat Study, the U.S. Secret Service provided more than just
funding for CERT's research. The joint study team, composed of CERT information
security experts and behavioral psychologists from the Secret Service's National Threat
Assessment Center, defined the research methodology and conducted the research that
has provided the foundation for all of CERT's subsequent insider threat research. The
community as a whole owes a debt of gratitude to the Secret Service for sponsoring and
collaborating on the original study, and for permitting CERT to continue to rely on the
valuable casefiles from that study for ongoing research. Specifically, CERT would like to
thank Dr. Marisa Reddy Randazzo, Dr. Michelle Keeney, Eileen Kowalski, and Matt
Doherty from the National Threat Assessment Center, and Cornelius Tate, David
Iacovetti, Wayne Peterson, and Tom Dover, our liaisons with the Secret Service during
the study.
The authors would also like to thank the CERT members of the Insider Threat Study
team, who reviewed and coded cases, conducted interviews, and assisted in writing the
study reports: Christopher Bateman, Casey Dunlevy, Tom Longstaff, David Mundie,
Stephanie Rogers, Timothy Shimeall, Bradford Willke, and Mark Zajicek.
Since the Insider Threat Study, the CERT team has been fortunate to work with
psychologists who have contributed their vast experience and new ideas to our work: Dr.
Eric Shaw, a Visiting Scientist on the CERT Insider Threat team who has contributed to
most of the CERT insider threat projects, Dr. Steven Band, former Chief of the FBI
Behavioral Sciences Unit, who has provided expertise on psychological issues, and Dr.
Lynn Fischer from the Department of Defense Personnel Security Research Center, who
sponsored CERT's initial insider threat research and has continued to work with the
CERT team on various insider threat projects.
The CERT team is extremely appreciative of the ongoing funding provided by CyLab.
The impact of the insider threat research sponsored by CyLab has been enormous, within
industry and government, and inside the U.S. as well as globally. CyLab has provided
key funding that has enabled the CERT team to perform research for the benefit of all:
government and industry, technical staff as well as management. Specifically, we would
like to thank Pradeep Khosla, Don McGillen, and Linda Whipkey, who have been
advocates for CERT's insider threat research since its inception, as well as Richard
Power, Gene Hambrick, Virgil Gligor, and Adrian Perig, with whom the CERT team has
had the pleasure of working over the past year.
The CERT team has had assistance from various CyLab graduate students over the past
few years. These students enthusiastically joined the team and devoted their precious
time to the CERT insider threat projects: Akash Desai, Hannah Benjamin-Joseph,
Christopher Nguyen, Adam Cummings, and Tom Carron. Special thanks to Tom, who is
a current member of the CERT/CyLab insider threat team, and who willingly dropped
everything he was doing over and over again to search the database for specific examples
we needed to make this report as compelling as possible.
The Secret Service provided the 150 original casefiles for CERT's insider threat research.
CyLab's research required identification and collection of additional case materials. The
CERT team gratefully acknowledges the hard work and long hours, including many
weekends, spent by Sheila Rosenthal, SEI's Manager of Library Services, assisting with
this effort. Sheila was instrumental in obtaining the richest source materials available for
more than 100 new cases used in the team's CyLab-sponsored research.
Finally, CERT would like to thank all of the organizations, prosecutors, investigators, and
convicted insiders who agreed to provide confidential information to the team to enhance
the research. It is essential to the community that all of the good guys band together
and share information so that together we can keep employees happy, correct problems
before they escalate, and use our technical resources and business processes to prevent
malicious insider activity or detect the precursors to a devastating attack.
Patterns and Trends Observed by Type of Malicious
Insider Activity
The CERT insider threat team has collected approximately 250 actual insider threat
cases. One hundred ninety of those cases were analyzed in detail for this report. Because
the remaining cases did not have sufficient information available or were still in the U.S.
court system at the time of this publication, they have not yet been formally analyzed.
This section of the document presents trends and patterns observed in those cases by class
of malicious insider activity:
IT sabotage: cases in which current or former employees, contractors, or business
partners intentionally exceeded or misused an authorized level of access to
networks, systems, or data with the intention of harming a specific individual, the
organization, or the organization's data, systems, and/or daily business operations.
Theft or modification for financial gain: cases in which current or former
employees, contractors, or business partners intentionally exceeded or misused an
authorized level of access to networks, systems, or data with the intention of
stealing or modifying confidential or proprietary information from the
organization for financial gain.
Theft or modification for business advantage: cases in which current or former
employees, contractors, or business partners intentionally exceeded or misused an
authorized level of access to networks, systems, or data with the intention of
stealing confidential or proprietary information from the organization with the
intent to use it for a business advantage.
Miscellaneous: cases in which current or former employees, contractors, or
business partners intentionally exceeded or misused an authorized level of access
to networks, systems, or data with the intention of stealing confidential or
proprietary information from the organization, not motivated by financial gain or
business advantage.
The breakdown of the cases into those four categories is shown in Figure 1.
    IT Sabotage: 80 cases
    Theft or Modification for Financial Gain: 77 cases
    Theft for Business Advantage: 24 cases
    Theft for Miscellaneous Reasons: 17 cases
Figure 1. Breakdown of Insider Threat Cases [5]
Some cases fell into multiple categories. For example, some insiders committed acts of
IT sabotage against their employers' systems, then attempted to extort money from them,
offering to assist them in recovery efforts only in exchange for a sum of money. A case
like that is categorized as both IT sabotage and theft or modification of information for
financial gain. Four of the 190 cases were classified as theft for financial gain and IT
sabotage. Another case involved a former vice president of sales copying a customer
database and sales brochures from the organization before deleting them and taking
another job. This case is classified as theft of information for business advantage and IT
sabotage. One case was classified as theft for business advantage and IT sabotage.
Finally, three cases were classified as IT Sabotage and Theft for Miscellaneous Reasons.
A breakdown of the cases depicting the overlap between categories is shown in Figure 2.
[5] 190 cases were analyzed for this report; however, some of the cases were classified as more than one type of crime.
    IT Sabotage only: 75 cases
    Theft/Modification for Financial Gain only: 73 cases
    Theft for Business Advantage only: 23 cases
    IT Sabotage and Theft/Modification for Financial Gain: 4 cases
    IT Sabotage and Theft for Business Advantage: 1 case
    Theft/Modification for Financial Gain and Theft for Business Advantage: 0 cases
Figure 2. Overlap among the Insider Threat Classes [6]

[6] Seventeen of the cases were classified as Miscellaneous Theft cases, in which the motive was not financial gain or business advantage. This figure does not depict those seventeen crimes.
Figure 3 shows the distribution of each type of case by critical infrastructure sector. It is
interesting to note the differences among sectors. For instance, it is not surprising that
theft of information for financial gain is most prominent in the Banking and Finance
sector. However, it might be a bit unexpected to note that theft for financial gain in the
Government sector is a close second, followed by Information Technology and
Telecommunications.
Theft of information for business advantage, on the other hand, is highly concentrated in
the IT and Telecommunications sector, with cases in the Banking and Finance sector
second. Chemical and Hazardous Materials and the Defense Industrial Base were the
only other two critical infrastructure sectors that experienced theft of information for
business advantage.
The number of cases of insider IT sabotage in the IT sector is quite striking. The
government sector was second in number of insider IT sabotage attacks. Note that the
only two sectors to have experienced no insider IT sabotage attacks were Chemical and
Hazardous Materials and Emergency Services; every other sector experienced at least one
attack.
Figure 3. Distribution of Cases by Critical Infrastructure Sector
[Bar chart, "Crimes by Critical Infrastructure Sector"; y-axis: # Cases; x-axis: Critical Infrastructure Sector. For each sector (Banking and Finance; Chemical Industry and Hazardous Materials; Defense Industrial Base; Emergency Services; Energy; Government; Food; Information Technology and Telecommunications; Postal and Shipping; Public Health; Transportation; N/A), the chart shows the number of cases of Sabotage, Theft for Financial Gain, Theft for Business Advantage, and Miscellaneous Theft.]
Insider IT Sabotage
In this report, insider IT sabotage cases are defined as follows: cases in which current or
former employees, contractors, or business partners intentionally exceeded or misused an
authorized level of access to networks, systems, or data with the intention of harming a
specific individual, the organization, or the organization's data, systems, and/or daily
business operations.
CERT researchers analyzed 80 cases of IT sabotage that occurred in the United States
between 1996 and 2007.
Who were the insiders?
The insiders who committed IT sabotage were primarily male and held highly technical
positions, the majority hired with system administrator or privileged access. However,
according to the U.S. Department of Labor Bureau of Labor Statistics, in 2007, 74% of
all employees in computer and mathematical occupations were male. [7] Therefore, while it
is useful to note that sabotage was typically committed by technically sophisticated
employees, focusing attention only on male employees is probably not a logical
conclusion. In addition, the majority of the insiders who committed IT sabotage were
former employees.
Why did they do it?
Over half of the insiders were perceived as disgruntled, and most of them acted out of
revenge for some negative precipitating event. Examples of negative events include
termination, disputes with the employer, new supervisors, transfers or demotions, and
dissatisfaction with salary increases or bonuses.
How did they attack?
The majority of the insiders who committed IT sabotage did not have authorized access at
the time of their attack. Only 30% used their own username and password; 43% of them
compromised an account. Twenty-four percent used another employee's username and
password, and 16% used an unauthorized (backdoor) account they had created
previously. They also used shared accounts, including some that had been overlooked in
the termination process; 23% used system administrator or database administrator (DBA)
accounts and 11% used other types of shared accounts, for instance testing accounts or
training accounts.
Thirty-five percent used sophisticated technical means for carrying out their attacks.
Commonly used technical methods included writing a script or program, such as a logic
bomb, or creating a backdoor account for later use. Other technical mechanisms included
planting a virus on customer computers, using password crackers, and installation of
remote system administration tools.
[7] http://www.bls.gov/cps/cpsaat9.pdf
Approximately 30% took technical preparatory actions prior to the attack, particularly in
cases where they anticipated termination. For example, they wrote, tested, and planted
logic bombs, sabotaged backups, and created backdoor accounts. Most logic bombs were
designed to delete massive amounts of data; however, at least one was designed to disrupt
business operations surreptitiously, six months following the insider's termination. Some
backdoor accounts were fairly obvious and could have been detected easily in an account
audit, while others were well concealed. Most insiders used remote access, and carried
out their attack outside of normal working hours.
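
Several of these preparatory actions, particularly backdoor accounts, could have been caught by a routine account audit. As a hedged illustration of what such an audit might look like, the Python sketch below compares local accounts against an authoritative roster exported from HR or an identity management system; the file path, roster format, and UID cutoff are assumptions made for the example, not controls specified by the CERT guide.

```python
# Hedged sketch: flag local accounts that do not appear in an authoritative
# HR/identity roster. Paths, formats, and the UID cutoff are illustrative
# assumptions, not values prescribed by the CERT guide.
import csv
import pwd

ROSTER_CSV = "/var/audit/authorized_users.csv"  # assumed export: one username per row
SERVICE_UID_CUTOFF = 1000                        # assumed: UIDs below this are system/service accounts

def load_authorized(path):
    with open(path, newline="") as fh:
        return {row[0].strip().lower() for row in csv.reader(fh) if row}

def audit_local_accounts():
    authorized = load_authorized(ROSTER_CSV)
    suspects = []
    for entry in pwd.getpwall():                 # enumerate local accounts (Unix)
        if entry.pw_uid < SERVICE_UID_CUTOFF:
            continue                             # skip system/service accounts
        if entry.pw_name.lower() not in authorized:
            suspects.append((entry.pw_name, entry.pw_uid, entry.pw_shell))
    return suspects

if __name__ == "__main__":
    for name, uid, shell in audit_local_accounts():
        print(f"REVIEW: local account '{name}' (uid={uid}, shell={shell}) "
              f"is not in the authorized roster")
```

Run on a schedule and diffed against its previous output, even a simple comparison like this surfaces accounts created outside the normal provisioning process, including shared or test accounts overlooked during termination.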
How was it detected?
Most of the attacks were detected manually due to system failure or irregularity. Non-security personnel, including customers in almost 25% of the cases, often detected the
attacks. Employees detecting the attacks included supervisors, coworkers, and security
staff.
Observable concerning behaviors were exhibited by the insiders prior to setting up and
carrying out their attack. Common behavioral precursors included conflicts with
supervisors and coworkers (which were sometimes quite angry or violent), decline in
performance, tardiness, or unexplained absenteeism. In some cases, management did not
notice or ignored the problems. In other cases, sanctions imposed by the organization
only increased the insider's concerning behaviors, rather than put an end to them.
How was the insider identified?
In most cases, system logs were used to identify the insider, including remote access logs,
file access logs, database logs, application logs, and email logs. Most of the insiders took
steps to conceal their actions; some insiders, knowing that the logs would be used for
identification, attempted to conceal their actions by modifying the logs. In some cases,
they modified the logs to implicate someone else for their actions.
What were the impacts?
In 68% of the cases, the organization suffered some type of business impact, such as
inability to conduct business due to the system or network being down, loss of customer
records, or inability to produce products due to damaged or destroyed software or
systems.
Other negative consequences resulted from
negative media attention
forwarding of management email containing private information, like strategic plans
or plans of impending layoffs, to customers, competitors, or employees
exposure of personal information, like Social Security numbers
web site defacements in which legitimate information was replaced with invalid
or embarrassing content
publication of confidential customer information on a public web site
In 28% of the cases, an individual was harmed. Examples of harm to individuals include
threats, modification of evidence to falsely implicate supervisors or coworkers, and
exposure of personal or private information.
For a more detailed description of insider IT sabotage, see The “Big Picture” of Insider IT
Sabotage Across U.S. Critical Infrastructures, which can be downloaded at
http://www.cert.org/archive/pdf/08tr009.pdf.
Theft or Modification for Financial Gain
In this report, insider theft or modification for financial gain cases are defined as follows:
cases in which current or former employees, contractors, or business partners
intentionally exceeded or misused an authorized level of access to networks, systems, or
data with the intention of stealing or modifying their employer's confidential or
proprietary information for financial gain.
CERT researchers analyzed 77 cases of theft or modification for financial gain that
occurred in the United States between 1996 and 2007. Seventy-three cases involved only
theft or modification for financial gain and four also involved IT sabotage.
Who were the insiders?
Only five of the insiders who committed crimes in this category were former employees;
all others were current employees when they committed their illicit activity. Half of the
insiders were male and half were female. The insiders committing this type of crime
tended to occupy lower-level, non-technical positions in the organization. Their job
duties included data entry and management of personally identifiable information (PII) or
customer information (CI). For example, many of these insiders held data entry positions
or were classified as clerks.
Why did they do it?
The primary motivation for all insiders in this category was financial gain. Insiders stole
information to sell it, modified data to achieve financial benefits for themselves, friends,
or family, or were paid by outsiders to modify information. Some insiders were
motivated to provide additional income for their relatives, and a few insiders had large
credit card debts or drug-related financial difficulties.
Most of these attacks were long, ongoing schemes; approximately one third of the
incidents continued for more than one year. Of the short, quick compromises, half ended
because the insider was caught quickly, and the other half ended because the crime was
committed as the employee was leaving the organization or following termination.
The prevalence of collusion between the insiders in these cases and either people external
to the organization or with other insiders is extremely high. Some cases involved
collusion with both insiders and outsiders. In cases of insider theft for financial gain, the
insider colluded with outsiders in two thirds of the cases, and one third of the cases
involved collusion between the insider and someone else inside the organization. In those
theft cases, an outsider recruited the insider to commit the crime in half of the cases. In
less than one third of the cases, the insider acted alone.
A recurring pattern in the theft of information for financial gain cases includes an
outsider recruiting an insider in a low-paying, non-technical position who has access to
PII or CI. The insider steals the information; the outsider then pays the insider and uses
the information to commit fraud or identity theft.
Some insiders were paid to modify data, for example credit histories. In some cases they
were paid by people with poor credit histories, and in others by someone (like a car
dealer) who would benefit from the beneficiaries' loan approvals. Other insiders were
paid by external people to create false driver's licenses, to enter fake health care providers,
and to generate false claims totaling significant amounts. Still others were paid to
counterfeit federal identity documents.
Finally, some insiders were able to design and carry out their own modification scheme
due to their familiarity with the organization's systems and business processes. For
example, a payroll manager defrauded her employer of more than $300,000 by adding
her husband to the payroll every week, generating a paycheck for him, then removing
him immediately from the payroll system to avoid detection. Her crime was only
discovered approximately one year after she left the company when an accountant
noticed the unauthorized checks.
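
A scheme like the payroll example above depends on no one outside the payroll system reconciling who was actually paid against who is actually employed. The sketch below is a minimal, hypothetical reconciliation of a payroll run against an independently maintained HR roster; the file names and column layout are assumptions for illustration, and the control only helps if the roster is administered separately from the payroll system (separation of duties).

```python
# Hedged sketch: reconcile each payroll run against the authoritative HR roster.
# A "ghost" payee added just before the run and removed just after still appears
# here, because the comparison uses the people actually paid in that run.
import csv

PAYROLL_CSV = "payroll_run.csv"   # assumed columns: employee_id, name, amount
HR_ROSTER_CSV = "hr_roster.csv"   # assumed columns: employee_id, name, status

def load_active_roster(path):
    with open(path, newline="") as fh:
        return {r["employee_id"] for r in csv.DictReader(fh) if r["status"] == "active"}

def find_ghost_payees(payroll_path, roster_ids):
    with open(payroll_path, newline="") as fh:
        for row in csv.DictReader(fh):
            if row["employee_id"] not in roster_ids:
                yield row["employee_id"], row["name"], row["amount"]

if __name__ == "__main__":
    roster = load_active_roster(HR_ROSTER_CSV)
    for emp_id, name, amount in find_ghost_payees(PAYROLL_CSV, roster):
        print(f"REVIEW: payment of {amount} to {name} ({emp_id}) "
              f"with no matching active HR record")
```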
In cases of insider modification of information for financial gain, insiders colluded with
an outsider in half of the cases, and almost half of the cases involved collusion between
the insider and someone else inside the organization. In modification cases, an outsider
recruited the insider to commit the crime in less than one third of the cases. In one third
of the cases, the insider acted alone.
How did they attack?
Ninety-five percent of the insiders stole or modified the information during normal
working hours, and over 75% of the insiders used authorized access. Twenty-five percent
did not have authorized access when they committed their crime; all others were
legitimate users. Five had system administrator or database administrator access and less
than 15% had privileged access. Almost all of the insiders used only legitimate user
commands to steal or modify the data. Only 16% of the crimes involved sophisticated
technical techniques, like use of a script or program, creation of a backdoor account, or
account compromise.
Eighty-five percent of the insiders used their own usernames and passwords to commit
their crimes. Slightly over 10% compromised someone else's account, two insiders used
a computer left logged in and unattended by a coworker, one insider used a customer
account, and one used a company-wide training account. In nine of the cases, the insider
was able to compromise access to an account via social engineering methods. Some
insiders used more than one account to carry out their crime.
Only two insiders took technical preparatory actions to set up their illicit activity. One
insider enabled fraudulent medical care providers to be added to the database. Another
disabled automatic notification of the security staff when a certain highly restricted
function was used in the system, then used that function to conduct his fraudulent
scheme.
How was it detected?
Only one of the insiders was detected due to network monitoring activities. Half were
detected due to data irregularities, including suspicious activities in the form of bills,
tickets, or negative indicators on individuals' credit histories. The majority of the cases
were detected by non-technical means, such as notification of a problem by a customer,
law enforcement officer, coworker, informant, auditor, or other external person who
became suspicious. In five cases, the insider was detected when the information was
offered for sale directly to a competitor via email or posted online. Most of the malicious
activity was eventually detected by multiple people. Over 50% of the cases were detected
internally by non-IT security personnel, 26% by clients or customers of the organization,
approximately 10% by customers, and 5% by competitors.
How was the insider identified?
In most cases, system logs were used to identify the insider, including database logs,
system file change logs, file access logs, and others.
What were the impacts?
The theft or modification cases analyzed for this report affected not only the insiders'
organizations, but also other innocent victims. For example, a check fraud scheme
resulted in innocent people receiving collection letters due to fraudulent checks written
against their account. Other cases involved insiders committing credit card fraud by
abusing their access to confidential customer data. Other insiders subverted the justice
system by modifying court records. Some cases could have very serious consequences:
cases in which insiders created false official identification documents or driver's licenses
for illegal aliens or others who could not obtain them legally. Similarly, one insider
accepted payment to modify a database to overturn decisions denying asylum to illegal
aliens, enabling them to remain in the U.S. illegally.
The insiders' organizations also suffered as a result of these crimes. Impacts included
negative media attention as well as financial losses. One insider committed fraud against
a state insurance fund for a total of almost $850,000, and another insider working for the
same company was tied to almost $20 million in fraudulent or suspicious transactions.
Another insider committed fraud against a federal agency for over $600,000. In a case
involving both sabotage and fraud, an insider set himself up to benefit from the abrupt
decline in his company's stock price when he deleted over 10 billion files on the
company's servers, costing the organization close to $3 million in recovery costs.
Theft of Information for Business Advantage
In this report, cases involving theft of confidential or proprietary information are defined
as follows: cases in which current or former employees, contractors, or business partners
intentionally exceeded or misused an authorized level of access to networks, systems, or
data with the intention of stealing confidential or proprietary information from the
organization with the intent to use it for a business advantage. While an argument can be
made that this type of incident may ultimately be about money, these insiders had longer-term
ambitions, such as using the information to get a new job, to use it in a new job with a
competing business, or to start a competing business.
CERT researchers analyzed twenty-four cases of theft of confidential or proprietary
information for business advantage that occurred in the United States between 1996 and
2007. Twenty-three cases involved only information theft and one also involved IT
sabotage.
Who were the insiders?
In all of the cases analyzed, the insiders who stole confidential or proprietary information
were male and 71% held technical positions. The remaining 29% occupied sales
positions. Twenty-five percent were former employees; the other 75% were current
employees when they committed their illicit activity. Interestingly, nearly 80% of the
insiders had already accepted positions with another company or had started a competing
company at the time of the theft.
Why did they do it?
By definition, all of these insiders committed the crime in order to obtain a business
advantage. Some insiders stole the information to give them an immediate advantage at a
new job. Others used the information to start a new, competing business. Almost all
(95%) of the insiders resigned before or after the theft. Most of the thefts (almost 70%) took
place within three weeks of the insider's resignation.
In 25% of the cases, the insider gave the information to a foreign company or government
organization. It is important to note that half of the theft for business advantage cases
with the highest financial impact involved foreign organizations.
How did they attack?
Eighty-eight percent of the insiders had authorized access to the information when they
committed the theft. The only insiders who did not have authorized access to the
information they stole were former employees at the time of the crime. None of the
insiders had privileged access, such as system administrator or database administrator
access, that enabled them to commit the crime, although one former employee was given
authorized access to do some additional work; he used that access to commit the theft. In
other words, the widespread fear of system administrators using their privileged access to
steal information was not evidenced in these cases.
The majority of these theft cases occurred quickly, spanning less than a one-month
period. Less than one third of the insiders continued their theft over a longer period, half
of them stealing for a side business, and half to take to a new employer. Although most of
the thefts occurred quickly, there often was significant planning by the insider. More
than one third of the insiders had already created, or were planning to start, a new
business while still working at the victim organization. Some of the insiders were
deceptive about their plans when leaving the organization, either lying about future job
plans or declining to reveal that they had already accepted another position. One insider
created a side business as a vehicle for transferring trade secrets he stole from his current
employer to a foreign-state-owned company. He concealed his connection to the side
business by removing his name from the business's articles of incorporation and only using
a post office box as the address for the company.
There was slightly less collusion in these theft cases than in the cases of theft or
modification for financial gain, but the numbers are still significant. In approximately
half of the cases, the insider colluded with at least one other insider to commit the crime.
In some cases, the employee stole the information, resigned his position, then recruited
other employees still at the original organization to steal additional information. These
crimes were usually the insider's own idea; the insider was only recruited by someone
outside the organization in less than 25% of the cases.
The majority of these crimes were committed during working hours, although a few
insiders acted outside working hours. Very few (roughly 12%) used remote access,
accessing their employers networks from their homes or from another organization.
Some insiders stole information using both remote access and access from within the
workplace, and some acted both inside and outside normal working hours.
How was it detected?
Many of these incidents were detected by non-technical means, such as
notification by a customer or informant,
detection by law enforcement investigating the reports of the theft by victims,
reporting of suspicious activity by co-workers, and
sudden emergence of new competing organizations.
In one case, the victim organization became suspicious upon seeing a product strikingly
similar to theirs at a competitor's booth at a trade show. In another, customers alerted the
victim organization to the theft when the insider attempted to sell identical products and
services to theirs on behalf of a new organization.
Twenty-five percent of the cases were detected by system administrators or IT security
personnel while monitoring download logs or email logs.
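
The monitoring that detected these cases does not need to be elaborate. As one hedged possibility, the sketch below scans a download log for days on which a user's transfer volume far exceeds that user's own average; the log format (user, date, bytes) and the thresholds are illustrative assumptions rather than values taken from the CERT cases.

```python
# Hedged sketch: flag users whose daily download volume is far above their own
# recent average. The log format and thresholds are illustrative assumptions.
import csv
from collections import defaultdict

LOG_CSV = "downloads.csv"       # assumed columns: user, date (YYYY-MM-DD), bytes
SPIKE_FACTOR = 10               # assumed: flag days more than 10x the user's average
MIN_BYTES = 500 * 1024 * 1024   # assumed: ignore spikes under 500 MB

def daily_totals(path):
    totals = defaultdict(lambda: defaultdict(int))   # user -> date -> bytes
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            totals[row["user"]][row["date"]] += int(row["bytes"])
    return totals

def find_spikes(totals):
    for user, days in totals.items():
        avg = sum(days.values()) / len(days)
        for date, volume in sorted(days.items()):
            if volume >= MIN_BYTES and volume > SPIKE_FACTOR * avg:
                yield user, date, volume, avg

if __name__ == "__main__":
    for user, date, volume, avg in find_spikes(daily_totals(LOG_CSV)):
        print(f"REVIEW: {user} downloaded {volume/1e6:.0f} MB on {date} "
              f"(personal daily average {avg/1e6:.0f} MB)")
```

Because most of these thefts took place within three weeks of the insider's resignation, pairing a check like this with HR notification of resignations narrows the review window considerably.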
How was the insider identified?
In most cases, system logs were used to identify the insider, including file access,
database, and email logs.
What were the impacts?
Impacts on organizations included financial and other losses. It is extremely difficult to
quantify the losses resulting from stolen trade secrets. In 38% of the cases, proprietary
software or source code was stolen; an equal number of cases involved business plans,
proposals, and other strategic plans; and a slightly smaller number involved trade secrets,
such as product designs or formulas.
Finally, the insiders themselves sometimes suffered unanticipated consequences. Some
insiders were surprised that their actions were criminal in nature, claiming that they
created the information once, and could do it again, and therefore it was easier to simply
take it with them when they left the organization. In one case, the insider committed
suicide before he could be brought to trial.
Summary
Forty-five percent of the 176 cases analyzed for this report involved IT sabotage, 44%
involved theft or modification of information for financial gain, and 14% involved theft
or modification of information for business advantage. [8] However, although IT sabotage
and theft or modification of information for financial gain were the most prevalent types
of crime, the potential impacts of all three types of crime are serious. Therefore,
organizations should consider whether each of these activities is a potential threat to
them, and if so, consider the information in this report regarding those types of crimes
carefully.
Furthermore, the authors of this report contend that insider IT sabotage is a threat to any
organization that relies on an IT infrastructure for its business, regardless of the size or
complexity of the configuration. Likewise, it is unlikely that many organizations can
disregard insider theft of proprietary or confidential information as an insider threat.
Therefore, it is recommended that all organizations consider the practices detailed in the
remainder of this report for prevention of sabotage and information theft.
Table 1 provides a summary of the details surrounding the three types of insider crimes.
High-Level Comparison of Insider Threat Types
Potential threat of insider sabotage is posed by disgruntled technical staff following a
negative work-related event. These insiders tend to act alone. While coworkers might
also be disgruntled immediately following the negative event, most of them come to
accept the situation. The potential for insider IT sabotage should be considered when
there are ongoing, observable behavioral precursors preceding technical actions that are
taken to set up the crime.
Data pertaining to theft or modification of information for financial gain and information
theft for business advantage, on the other hand, suggest that organizations need to
exercise some degree of caution with all employees. Current employees in practically any
position have used legitimate system access to commit those types of crimes. In theft or
modification for financial gain, there was also a high degree of collusion with both
outsiders (primarily to market the stolen information or to gain benefit from its
modification) and other insiders (primarily to facilitate the theft or modification).
Collusion was less common, but still significant, in theft for business advantage. Crimes
for financial gain were also more likely to be induced by outsiders than crimes for
business advantage.
Of special note, however, is the fact that ninety-five percent of the employees who stole
information for business advantage resigned before or after the theft. Therefore, extra
caution should be exercised once the organization becomes aware of this type of
information, either formally or via rumor. A balance of trust and caution should factor
into the organization's policies, practices, and technology.
[8] Recall that some crimes fit into multiple categories. Also, cases of Miscellaneous theft were excluded from this calculation.
Insider IT Sabotage
    Percentage of crimes in CERT's case database: 45%
    Current or former employee? Former
    Type of position: Technical (e.g., system administrators or database administrators)
    Gender: Male
    Target: Network, systems, or data
    Access used: Unauthorized access
    When: Outside normal working hours
    Where: Remote access
    Recruited by outsiders: None
    Collusion: None

Insider Theft or Modification of Information for Financial Gain
    Percentage of crimes in CERT's case database: 44%
    Current or former employee? Current
    Type of position: Non-technical, low-level positions with access to confidential or sensitive information (e.g., data entry, customer service)
    Gender: Fairly equally split between male and female
    Target: Personally Identifiable Information or Customer Information
    Access used: Authorized access
    When: During normal working hours
    Where: At work
    Recruited by outsiders: Half recruited for theft; less than one third recruited for modification
    Collusion: Almost half colluded with another insider in modification cases; two thirds colluded with outsiders in theft cases

Insider Theft of Information for Business Advantage
    Percentage of crimes in CERT's case database: 14%
    Current or former employee? Current
    Type of position: Technical (71%): scientists, programmers, engineers; Sales (29%)
    Gender: Male
    Target: Intellectual property (trade secrets) 71%; customer information 33% [9]
    Access used: Authorized access
    When: During normal working hours
    Where: At work
    Recruited by outsiders: Less than one fourth
    Collusion: Almost half colluded with at least one insider; half acted alone

Table 1. Summary Comparison by Type of Insider Incident

[9] Some insiders stole more than one type of information.
How Can They Be Stopped?
The methods of carrying out malicious insider activity varied by type of crime. The IT
sabotage cases tended to be more technically sophisticated, while the theft or
modification of information for financial gain and information theft for business
advantage tended to be technically unsophisticated in comparison.
It is important that organizations carefully consider implementing the practices outlined
in the remainder of this report to protect themselves from any of these malicious activities
that pose a risk to them. Proactive technical measures need to be instituted and
maintained at a constant level in order to prevent or detect technical preparatory actions.
Good management practices need to be instituted and maintained in order to prevent
insider threats, or recognize and react appropriately when indicators of potential insider
threats are exhibited. Legal and contractual implications in the cases examined by CERT
need to be understood and accounted for with employees, contractors, and partner
organizations.
Too often, organizations allow the quality of their practices to erode over time because
they seem to be less important than competing priorities if no malicious insider activity
has been detected. One of the vulnerabilities posed by insiders is their knowledge of
exactly this: the quality of their organization's defenses.
What if an Insider Attack Succeeds?
One pattern common to all of the cases is the importance of system logs in identifying the
insider. Regardless of type of crime, system logs provide the evidence needed to take
appropriate action. Since many technical insiders attempted to conceal their actions,
sometimes by altering system logs, it is particularly important that organizations architect
their systems to ensure the integrity of their logs.
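
One way to make logs trustworthy evidence is to make tampering evident: ship copies to a log host the insider does not administer, and keep an integrity chain over the records. The Python sketch below illustrates a simple hash chain; it is an assumed implementation approach offered for illustration, not a mechanism prescribed by this guide, and in practice it would complement, not replace, centralized log collection.

```python
# Hedged sketch: tamper-evident hash chain over log records. Each record's digest
# covers the previous digest, so altering or deleting an earlier line breaks the
# chain on verification. Purely illustrative; real deployments would also ship
# records to a separately administered log server.
import hashlib

def chain_records(records, seed=b"log-chain-seed"):
    """Yield (record, hex_digest) pairs forming a hash chain."""
    prev = hashlib.sha256(seed).hexdigest()
    for rec in records:
        digest = hashlib.sha256((prev + rec).encode()).hexdigest()
        yield rec, digest
        prev = digest

def verify_chain(chained, seed=b"log-chain-seed"):
    """Return the index of the first tampered record, or None if the chain is intact."""
    prev = hashlib.sha256(seed).hexdigest()
    for i, (rec, digest) in enumerate(chained):
        expected = hashlib.sha256((prev + rec).encode()).hexdigest()
        if digest != expected:
            return i
        prev = digest
    return None

if __name__ == "__main__":
    logs = ["2009-01-12 03:10 admin deleted table CUSTOMERS",
            "2009-01-12 03:11 admin cleared audit trail"]
    chained = list(chain_records(logs))
    # simulate an insider rewriting the first record after the fact
    chained[0] = ("2009-01-12 03:10 admin ran nightly backup", chained[0][1])
    print("first tampered record:", verify_chain(chained))   # -> 0
```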
In addition to protecting and defending against insider threats, it is also important that
organizations are prepared to respond to an insider incident should one occur.
Organizations frequently overlook insider threats when preparing incident response plans.
Insider incidents need to be investigated carefully, since it is not always apparent who
can be trusted and who cannot. In addition, organizations should make a proactive
decision regarding forensics capability: if an insider incident occurs, will forensics be
handled internally, or will an external forensics expert be hired? Some insider cases
obtained by CERT could not be prosecuted because the organization did not properly
handle system logs, and as a result they could not be used as evidence in prosecution.
The remainder of this document is structured around sixteen practices that could have
been effective in preventing the insider incidents analyzed for this report, or at the very
least, would have enabled early detection of the malicious activity.
Best Practices for the Prevention and Detection of
Insider Threats
Summary of practices
The following sixteen practices will provide an organization with defensive measures that
could prevent, or facilitate early detection of, many of the insider incidents other
organizations experienced in the hundreds of cases examined by CERT. Some of these
practices have been updated from the previous version of the Common Sense Guide
based on approximately 100 recent cases collected and examined since that version was
published. Other practices are new ones added in this version. Each practice listed below
is labeled as either Updated or New.
PRACTICE 1: Consider threats from insiders and business partners in enterprise-wide
risk assessments. (Updated).
It is difficult for organizations to balance trusting their employees, providing them access
to achieve the organization's mission, and protecting its assets from potential
compromise by those same employees. Insiders' access, combined with their knowledge
of the organization's technical vulnerabilities and of vulnerabilities introduced by gaps in
business processes, gives them the ability and opportunity to carry out malicious activity
against their employer if properly motivated. The problem is becoming even more
difficult as the scope of insider threats expands due to organizations' growing reliance on
business partners with whom they contract and collaborate. It is important for
organizations to take an enterprise-wide view of information security, first determining
their critical assets, then defining a risk management strategy for protecting those assets
from both insiders and outsiders.
NEW PRACTICE
PRACTICE 2: Clearly document and consistently enforce policies and controls.
Clear documentation and communication of technical and organizational policies and
controls could have mitigated some of the insider incidents of theft, modification, and IT sabotage in the CERT case library. Specific policies are discussed in this section of the report. In addition, consistent policy enforcement is important. Some employees in the cases examined by CERT felt they were being treated differently than other employees, and retaliated against this perceived unfairness by attacking their employer's IT systems.
Other insiders were able to steal or modify information due to inconsistent or unenforced
policies.
PRACTICE 3: Institute periodic security awareness training for all employees.
(Updated)
A culture of security awareness must be instilled in the organization so that all employees
understand the need for policies, procedures, and technical controls. All employees in an
organization must be aware that security policies and procedures exist, that there is a
good reason why they exist, that they must be enforced, and that there can be serious
consequences for infractions. They also need to be aware that individuals, either inside or
outside the organization, may try to co-opt them into activities counter to the
organization's mission. Each employee needs to understand the organization's security policies and the process for reporting policy violations. This section of the guide has been
updated with important new findings relevant to recruitment of insiders by outsiders to
commit crimes.
PRACTICE 4: Monitor and respond to suspicious or disruptive behavior, beginning
with the hiring process. (Updated)
Organizations should closely monitor suspicious or disruptive behavior by employees
before they are hired, as well as in the workplace, including repeated policy violations
that may indicate or escalate into more serious criminal activity. The effect of personal
and professional stressors should also be considered. This section has been updated
based on findings in 100 recent cases, particularly due to the high degree of internal and
external collusion observed in these cases and the high incidence of previous arrests.
NEW PRACTICE
PRACTICE 5: Anticipate and manage negative workplace issues.
This section describes suggestions for organizations beginning with pre-employment
issues and continuing through employment and with termination issues. For example,
employers need to clearly formulate employment agreements and conditions of
employment. Responsibilities and constraints of the employee and consequences for
violations need to be clearly communicated and consistently enforced. In addition,
workplace disputes or inappropriate relationships between co-workers can serve to
undermine a healthy and productive working environment. Employees should feel
encouraged to discuss work-related issues with a member of management or human
resources without fear of reprisal or negative consequences. Managers need to address
these issues when discovered or reported, before they escalate out of control. Finally,
contentious employee terminations must be handled with utmost care, as most insider IT
sabotage attacks occur following termination.
NEW PRACTICE
PRACTICE 6: Track and secure the physical environment.
While employees and contractors obviously must have access to organization facilities
and equipment, most do not need access to all areas of the workplace. Controlling
physical access for each employee is fundamental to insider threat risk management.
Access attempts should be logged and regularly audited to identify violations or
attempted violations of the physical space and equipment access policies. Of course,
terminated employees, contractors, and trusted business partners should not have physical
access to non-public areas of the organization facilities. This section details lessons
learned from cases in the CERT case library in which physical access vulnerabilities
allowed an insider to attack.
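As a rough sketch of the kind of physical access auditing described above (an illustration, not part of the CERT guide), the following flags badge events made with deactivated badges or against areas the badge holder is not authorized to enter; the event fields and badge identifiers are hypothetical.

def audit_badge_log(events, active_badges, authorizations):
    # Flag physical access attempts by deactivated or unknown badges, or
    # attempts on areas the badge holder is not authorized to enter.
    findings = []
    for event in events:
        badge = event["badge_id"]
        if badge not in active_badges:
            findings.append((event, "deactivated or unknown badge"))
        elif event["area"] not in authorizations.get(badge, set()):
            findings.append((event, "area not authorized for this badge"))
    return findings

# Hypothetical usage
events = [
    {"badge_id": "B-1027", "area": "NOC", "time": "02:13"},
    {"badge_id": "B-0440", "area": "lobby", "time": "09:02"},
]
active_badges = {"B-0440"}
authorizations = {"B-0440": {"lobby", "office"}}
for event, reason in audit_badge_log(events, active_badges, authorizations):
    print(event["badge_id"], event["area"], "->", reason)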
PRACTICE 7: Implement strict password and account management policies and
practices. (Updated)
No matter how vigilant an organization is in trying to prevent insider attacks, if its
computer accounts can be compromised, insiders have an opportunity to circumvent both
manual and automated controls. Password and account management policies and
practices should apply to employees, contractors, and business partners. They should
ensure that all activity from any account is attributable to the person who performed it.
An anonymous reporting mechanism should be available and used by employees to report
attempts at unauthorized account access, including potential attempts at social
engineering. Audits should be performed regularly to identify and disable unnecessary or
expired accounts. This section has been updated to reflect new account issues identified
in 100 recent cases added to the CERT case library, many of them involving
unauthorized access by trusted business partners.
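A minimal sketch of the periodic account audit described above, under the assumption that the organization can export its account list and a roster of currently active employees, contractors, and partners; the field names and example data are hypothetical.

from datetime import datetime, timedelta

def audit_accounts(accounts, current_personnel, max_idle_days=90):
    # Flag accounts that are orphaned (owner no longer an active employee,
    # contractor, or business partner) or idle beyond the allowed window.
    cutoff = datetime.now() - timedelta(days=max_idle_days)
    findings = []
    for acct in accounts:
        if acct["owner"] not in current_personnel:
            findings.append((acct["username"], "orphaned: owner no longer active"))
        elif acct["last_login"] < cutoff:
            findings.append((acct["username"], f"idle for over {max_idle_days} days"))
    return findings

# Hypothetical example data
accounts = [
    {"username": "jsmith", "owner": "J. Smith", "last_login": datetime(2009, 1, 2)},
    {"username": "vendor-acme", "owner": "Acme Partner", "last_login": datetime(2008, 3, 1)},
]
current_personnel = {"J. Smith"}
for username, reason in audit_accounts(accounts, current_personnel):
    print(f"Review account {username}: {reason}")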
PRACTICE 8: Enforce separation of duties and least privilege. (Updated)
If all employees are adequately trained in security awareness, and responsibility for
critical functions is divided among employees, the possibility that one individual could
commit fraud or sabotage without the cooperation of another individual within the
organization is limited. Effective separation of duties requires the implementation of least
privilege; that is, authorizing insiders only for the resources they need to do their jobs,
particularly when they take on different positions or responsibilities within the
organization. This section has been updated to reflect findings from recent cases
involving collusion among multiple insiders.
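The two-person idea can be illustrated with a small sketch (an assumption-laden example, not an excerpt from any real system): the employee who initiates a change to a customer account may not also approve it, and the approver must hold a distinct role.

ROLES = {
    "alice": {"account_editor"},     # may modify customer accounts
    "bob": {"payment_approver"},     # may approve modifications and payments
}

def approve_transaction(transaction, approver):
    # Enforce a simple two-person rule: the employee who initiated a change
    # may not also approve it, and the approver must hold the approval role.
    if approver == transaction["initiator"]:
        raise PermissionError("initiator may not approve their own transaction")
    if "payment_approver" not in ROLES.get(approver, set()):
        raise PermissionError(f"{approver} lacks the approval role")
    transaction["approved_by"] = approver
    return transaction

txn = {"id": 42, "initiator": "alice", "action": "refund", "amount": 250}
approve_transaction(txn, "bob")      # succeeds
# approve_transaction(txn, "alice")  # would raise PermissionError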
NEW PRACTICE
PRACTICE 9: Consider insider threats in the software development life cycle.
Many insider incidents can be tied either directly or indirectly to defects introduced
during the software development life cycle (SDLC). Some cases, such as those involving
malicious code inserted into source code, have an obvious tie to the SDLC. Others, like
those involving insiders who took advantage of inadequate separation of duties, have an
indirect tie. This section of the report details the types of oversights throughout the SDLC
that enabled insiders to carry out their attacks.
PRACTICE 10: Use extra caution with system administrators and technical or privileged
users. (Updated)
System administrators and privileged users like database administrators have the
technical ability and access to commit and conceal malicious activity. Technically adept
individuals are more likely to resort to technical means to exact revenge for perceived wrongs. Techniques like separation of duties or the two-man rule for critical system
administrator functions, non-repudiation of technical actions, encryption, and disabling
accounts upon termination can limit the damage and promote the detection of malicious
system administrator and privileged user actions. This section has been updated to
include recent findings regarding technical employees who stole information for business advantage: to start their own business, to take to a new job, or to give to a foreign government or organization.
PRACTICE 11: Implement system change controls. (Updated)
A wide variety of insider compromises relied on unauthorized modifications to the
organizations' systems, which argues for stronger change controls as a mitigation
strategy. System administrators or privileged users can deploy backdoor accounts,
keystroke loggers, logic bombs, or other malicious programs on the system or network.
These types of attacks are stealthy and therefore difficult to detect ahead of time, but
technical controls can be implemented for early detection. Once baseline software and hardware configurations are characterized, comparing the current configuration against that baseline can reveal discrepancies and alert managers for action. This section has been updated to
reflect recent techniques used by insiders that could have been detected via change
controls.
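A simplified sketch of the baseline-comparison idea described above; the monitored paths are placeholders, and a real deployment would more likely use a dedicated file integrity monitoring tool than this illustration.

import hashlib
import os

def snapshot(paths):
    # Record a SHA-256 hash for each monitored file; this is the baseline.
    baseline = {}
    for path in paths:
        with open(path, "rb") as f:
            baseline[path] = hashlib.sha256(f.read()).hexdigest()
    return baseline

def compare(baseline):
    # Report files that have changed or disappeared since the baseline was
    # taken; any discrepancy should be reviewed and explained.
    discrepancies = []
    for path, expected in baseline.items():
        if not os.path.exists(path):
            discrepancies.append((path, "missing"))
            continue
        with open(path, "rb") as f:
            actual = hashlib.sha256(f.read()).hexdigest()
        if actual != expected:
            discrepancies.append((path, "modified"))
    return discrepancies

# Hypothetical usage: baseline scheduled jobs and deployment scripts, store
# the baseline where the administrators being monitored cannot alter it, and
# re-run compare() on a schedule, alerting managers on any discrepancy.
# baseline = snapshot(["/etc/crontab", "/usr/local/bin/deploy.sh"])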
PRACTICE 12: Log, monitor, and audit employee online actions. (Updated)
If account and password policies and procedures are enforced, an organization can
associate online actions with the employee who performed them. Logging, periodic
monitoring, and auditing provide an organization the opportunity to discover and
investigate suspicious insider actions before more serious consequences ensue. In
addition to unauthorized changes to the system, download of confidential or sensitive
information such as intellectual property, customer or client information, and personally
identifiable information can be detected via data leakage tools. New findings detailed in
this section can assist organizations in refining their data leakage prevention strategy, for
example, in the weeks surrounding employee termination.
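As an illustration of the kind of analysis a data leakage review might perform around termination, the sketch below totals an employee's downloads of sensitive documents in the weeks before departure; the log schema, window, and threshold are assumptions made for the example.

from datetime import datetime, timedelta

def flag_pre_departure_downloads(download_log, departures,
                                 window_days=30, threshold_mb=500):
    # Total each departing employee's downloads of sensitive documents in the
    # window before their departure date and flag totals over the threshold.
    alerts = []
    for employee, departure_date in departures.items():
        window_start = departure_date - timedelta(days=window_days)
        total_mb = sum(
            event["size_mb"] for event in download_log
            if event["employee"] == employee
            and window_start <= event["timestamp"] <= departure_date
        )
        if total_mb > threshold_mb:
            alerts.append((employee, total_mb))
    return alerts

# Hypothetical usage: download_log is a list of events such as
# {"employee": "jdoe", "timestamp": datetime(2009, 1, 5), "size_mb": 120.0}
# and departures maps employee IDs to their announced departure dates.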
PRACTICE 13: Use layered defense against remote attacks. (Updated)
If employees are trained and vigilant, accounts are protected from compromise, and
employees know that their actions are being logged and monitored, then disgruntled
insiders will think twice about attacking systems or networks at work. Insiders tend to
feel more confident and less inhibited when they have little fear of scrutiny by coworkers;
therefore, remote access policies and procedures must be designed and implemented very
carefully. When remote access to critical systems is deemed necessary, organizations
should consider offsetting the added risk with requiring connections only via
organization-owned machines and closer logging and frequent auditing of remote
transactions. Disabling remote access and collection of organization equipment is
particularly important for terminated employees. This section has been updated to include
new remote attack methods employed by insiders in recent cases.
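One hedged sketch of how remote access logs could be reviewed against the suggestions above; the session record format is a hypothetical export, not a real VPN product's API.

def review_remote_sessions(vpn_sessions, terminated, approved_devices):
    # Flag remote sessions from accounts of terminated employees or from
    # devices that are not organization-owned and managed.
    findings = []
    for session in vpn_sessions:
        if session["username"] in terminated:
            findings.append((session, "account belongs to a terminated employee"))
        elif session["device_id"] not in approved_devices:
            findings.append((session, "connection from a non-organization device"))
    return findings

# Hypothetical usage: vpn_sessions is an exported list of records such as
# {"username": "jdoe", "device_id": "LT-0042", "start": "2009-01-05T22:14"}.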
PRACTICE 14: Deactivate computer access following termination. (Updated)
When an employee terminates employment, whether the circumstances were favorable or
not, it is important that the organization have in place a rigorous termination procedure
that disables all of the employee's access points to the organization's physical locations,
networks, systems, applications, and data. Fast action to disable all access points
available to a terminated employee requires ongoing and strict tracking and management
practices for all employee avenues of access including computer system accounts, shared
passwords, and card control systems.
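A sketch of how a termination checklist might be driven programmatically; the deactivation functions are placeholders for whatever directory, e-mail, VPN, application, and badge systems the organization actually operates, not real APIs.

# Placeholder checklist: the steps stand in for the organization's real
# avenues of access; the callables passed in below are illustrative stubs.
TERMINATION_CHECKLIST = [
    "directory account", "e-mail", "VPN", "business applications",
    "shared passwords rotated", "building badge", "offsite facility access",
]

def deactivate_all_access(employee_id, actions):
    # 'actions' maps a checklist step to a callable that performs it. Every
    # step is attempted and its outcome recorded, so that no avenue of access
    # is silently skipped during a termination.
    results = {step: "NO ACTION DEFINED" for step in TERMINATION_CHECKLIST
               if step not in actions}
    for step, action in actions.items():
        try:
            action(employee_id)
            results[step] = "done"
        except Exception as exc:
            results[step] = f"FAILED: {exc}"
    return results

# Hypothetical usage with stub callables:
# results = deactivate_all_access("E-1042", {
#     "directory account": lambda emp: print("disable directory for", emp),
#     "building badge": lambda emp: print("revoke badge for", emp),
# })
# Steps left as "NO ACTION DEFINED" or "FAILED" should be escalated at once.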
PRACTICE 15: Implement secure backup and recovery processes. (Updated)
No organization can completely eliminate its risk of insider attack; risk is inherent in the
operation of any profitable enterprise. However, with a goal of organizational resiliency,
risks must be acceptable to the stakeholders, and as such, impacts of potential insider
attacks must be minimized. Therefore, it is important for organizations to prepare for the
possibility of insider attack and minimize response time by implementing secure backup
and recovery processes that avoid single points of failure and are tested periodically. This
section contains descriptions of recent insider threat cases in which the organizations' lack of attention to incident response and organizational resiliency resulted in serious
disruption of service to their customers.
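A minimal sketch of one backup verification step, assuming the backup is a mirrored directory tree; periodic restore tests onto separate hardware would complement a check like this.

import hashlib
import os

def _sha256(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def verify_backup(source_dir, backup_dir):
    # Compare every file in the source tree against its backup copy by hash
    # and report files that are missing from the backup or differ from it.
    problems = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            dst = os.path.join(backup_dir, rel)
            if not os.path.exists(dst):
                problems.append((rel, "missing from backup"))
            elif _sha256(src) != _sha256(dst):
                problems.append((rel, "backup copy differs from source"))
    return problems

# Hypothetical usage:
# verify_backup("/srv/critical-data", "/mnt/backup/critical-data")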
NEW PRACTICE
PRACTICE 16: Develop an insider incident response plan.
Organizations need to develop an insider incident response plan to control the damage
due to malicious insiders. This is challenging because the same people assigned to a
response team may be among the most likely to think about using their technical skills
against the organization. Only those responsible for carrying out the plan need to
understand and be trained on its execution. Should an insider attack, it is important that
the organization have evidence in hand to identify the insider and follow up
appropriately. Lessons learned should be used to continually improve the plan.
Practice 1: Consider threats from insiders and business
partners in enterprise-wide risk assessments. (UPDATED)
Organizations need to develop a comprehensive risk-based security strategy to
protect critical assets against threats from inside and outside, as well as trusted
business partners who are given authorized insider access.
What to do?
It is not practical for most organizations to implement 100% protection against every
threat to every organizational resource. Therefore, it is important to adequately protect
critical information and other resources and not direct significant effort toward protecting
relatively unimportant data and resources. A realistic and achievable security goal is to
protect those assets deemed critical to the organization's mission from both external and
internal threats. Unfortunately, organizations often fail to recognize the increased risk
posed when they provide insider access to their networks, systems, or information to
other organizations and individuals with whom they collaborate, partner, contract, or
otherwise associate. The boundary of the organization's enterprise needs to be drawn
broadly enough to include as insiders all people who have a privileged understanding of
and access to the organization, its information, and information systems.
Risk is the combination of threat, vulnerability, and mission impact. Enterprise-wide risk
assessments help organizations identify critical assets, potential threats to those assets,
and mission impact if the assets are compromised. Organizations should use the results of
the assessment to develop or refine the overall strategy for securing their networked
systems, striking the proper balance between countering the threat and accomplishing the
organizational mission.[10]
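To make the threat, vulnerability, and impact combination concrete, here is a deliberately simplified scoring sketch for ranking assets; the 1-to-5 scales and the example assets are illustrative assumptions, not CERT's assessment methodology.

def risk_score(threat, vulnerability, impact):
    # Toy model: each factor is rated 1 (low) to 5 (high); the product gives
    # a rough ranking for deciding which assets to protect first.
    return threat * vulnerability * impact

assets = {
    "customer database":      {"threat": 4, "vulnerability": 3, "impact": 5},
    "source code repository": {"threat": 4, "vulnerability": 4, "impact": 5},
    "public web content":     {"threat": 3, "vulnerability": 2, "impact": 2},
}

for name, factors in sorted(assets.items(), key=lambda kv: -risk_score(**kv[1])):
    print(f"{name}: {risk_score(**factors)}")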
The threat environment under which the system operates needs to be understood in order
to accurately assess enterprise risk. Characterizing the threat environment can proceed in
parallel with the evaluation of vulnerability and impact. However, the sooner the threat
environment can be characterized the better. The purpose of this guide is to assist
organizations in correctly assessing the insider threat environment, organizational
vulnerabilities that enable the threat, and potential impacts that could result from insider
incidents, including financial, operational, and reputational.
Unfortunately, many organizations focus on protecting information from access or
sabotage by those external to the organization and overlook insiders. Moreover, an
information technology and security solution designed without consciously
acknowledging and accounting for potential insider threats often leaves the role of
protection in the hands of some of the potential threats: the insiders themselves. It is
imperative that organizations recognize the potential danger posed by the knowledge and
access of their employees, contractors, and business partners, and specifically address
that threat as part of an enterprise risk assessment.
[10] See http://www.cert.org/nav/index_green.html for CERT research in Enterprise Security Management.
Understanding the vulnerability of an organization to a threat is also important, but
organizations often focus too much on low-level technical vulnerabilities, for example,
by relying on automated computer and network vulnerability scanners. While such
techniques are important, our studies of insider threat have indicated that vulnerabilities
in an organization's business processes are at least as important as technical
vulnerabilities. Organizations need to manage the impact of threats rather than chase
individual technical vulnerabilities. In addition, new areas of concern have become
apparent in recent cases, including legal and contracting issues, as detailed in the Recent
Findings section below.
Insider threats impact the integrity, availability, or confidentiality of information critical to an organization's mission. Insiders have affected the integrity of their organizations' information in various ways, for example by manipulating customer financial information or defacing their employers' web sites. They have also violated the confidentiality of information by stealing trade secrets or customer information. Still others have inappropriately disseminated confidential information, including private customer information as well as sensitive email messages between members of the organization's management. Finally, insiders have affected the availability of their organizations' information by deleting data, sabotaging entire systems and networks, destroying backups, and committing other types of denial-of-service attacks.
In the types of insider incidents mentioned above, current or former employees,
contractors, or business partners were able to compromise their organizations' critical assets. It is important that protection strategies are designed with a focus on those assets: financial data, confidential or proprietary information, and other mission-critical systems and data.
Case Studies: What could happen if I don't do it?
One organization failed to protect extremely critical systems and data from internal
employees. It was responsible for running the 911 phone-number-to-address lookup
system for emergency services. An insider deleted the entire database and software from
three servers in the organization's network operations center (NOC) by gaining physical access using a contractor's badge. The NOC, which was left unattended, was solely
protected via physical security; all machines in the room were left logged in with system
administrator access.
Although the NOC system administrators were immediately notified of the system failure
via an automatic paging system, there were no automated failover mechanisms. The
organization's recovery plan relied solely on backup tapes, which were also stored in the NOC. Unfortunately, the insider, realizing that the systems could otherwise be easily recovered, took all of the backup tapes with him when he left the facility. In addition, the same contractor's badge was authorized for access to the offsite backup storage facility, from
which he next stole over fifty backup tapes.
Had an enterprise risk assessment been performed for this system prior to the incident,
the organization would have recognized the criticality of the systems, assessed the threats
and vulnerabilities, and developed a risk mitigation strategy accordingly.
Another insider was the sole system administrator for his organization. One day, he quit
with no prior notice. His organization refused to pay him for his last two days of work,
and he subsequently refused to give them the passwords for the administrator accounts
for its systems. Over a period of three days, the insider modified the systems so that they
could not be accessed by the employees, defaced the company web site, and deleted files.
It is critical that organizations consider the risk they assume when they place all system
administration power into the hands of a single employee.
Recent Findings:
Organizations are increasingly outsourcing critical business functions. As a result, people
external to the organization sometimes have full access to the organization's policies, processes, information, and systems: access and knowledge previously provided only to employees of the organization. CERT's definition of an insider, which originally encompassed current and former employees and contractors, had to be extended to
include partners, collaborators, and even students associated with the organization.
One recent case involved an employee of a company that obtained a contract to set up a
new wireless network for a major manufacturer. The insider was on the installation team
and therefore had detailed knowledge of the manufacturers systems. He was removed
from the team by his employer, apparently under negative circumstances. However, he
was able to enter the manufacturing plant and access a computer kiosk in the visitor's lobby. Based on his familiarity with the manufacturer's computer system and security, he was able to use the kiosk to delete files and passwords from wireless devices used by the manufacturer across the country. The manufacturer was forced to remove and repair the devices, causing a wide-scale shutdown of facilities and disruption of its processes.
This case highlights several new insider threat issues. First of all, an enterprise-wide risk
assessment should have identified the ability to override security and obtain privileged access to the manufacturer's network from a publicly accessible kiosk. Second, the manufacturer's contract with the insider's organization should have instituted strict
controls over employees added to or removed from the project. Specifically,
organizations should consider provisions in their contracts that require advance
notification by the contracted organization of any negative employment actions being
planned against any employees who have physical and/or electronic access to the
contracting organization's systems. The contracting organization could require a
specified amount of time before the action occurs, in order to perform its own risk
assessment for the potential threat posed to its own network, systems, or information.
Another recent incident indicates the need to have transaction verification built into
supplier agreements. A computer help desk attendant employed by a military contractor
created fake military email addresses on the military systems for which he was
responsible. He then used those email addresses to request replacement parts for military
equipment recalled by a major supplier. The supplier sent the replacement parts to the
address specified in the emails, with the expectation that the original recalled products
would be returned after the replacements had been received. The insider provided his
home address for the shipments, and never intended to return the original equipment. The
insider received almost 100 shipments with a retail value of almost five million dollars
and sold the equipment on eBay.
Another case reflects the complexity of defining the organizational perimeter and the
scope of insider threats. The outside legal counsel for a high tech company was preparing
to represent the company in civil litigation. The outside counsel was provided with
documents containing company trade secrets, which were necessary to prepare the legal
case. The legal firm had a contract with a document-imaging company for copying
documents for its cases. An employee of the document-imaging company brought in his
nephew to help him copy the trade secret documents due to the amount of work required.
The nephew, a university student not officially on the payroll, scanned the confidential documents using his uncle's work computer, then sent them to a hacker web site for posting. His goal was to help the hacker community crack the high-tech company's
premier product. Organizations need to carefully consider their enterprise information
boundaries when assessing the risk of insider compromise, and use legal means for
protecting their information once it leaves their control.
Practice 2: Clearly document and consistently enforce policies
and controls. (NEW)
A consistent, clear message on organizational policies and controls will help
reduce the chance that employees will inadvertently commit a crime or lash out
at the organization for a perceived injustice.
What to do?
Policies or controls that are misunderstood, not communicated, or inconsistently enforced
can breed resentment among employees and can potentially result in harmful insider
actions. For example, multiple insiders in cases in the CERT database took intellectual
property they had created to a new job, not realizing that they did not own it. They were
quite surprised when they were arrested for a crime they did not realize they had
committed.
Organizations should ensure the following with regard to their policies and controls:
– concise and coherent documentation, including reasoning behind the policy, where applicable
– fairness for all employees
– consistent enforcement
– periodic employee training on the policies, justification, implementation, and enforcement
Organizations should be particularly clear on policies regarding
– acceptable use of the organization's systems, information, and resources
– ownership of information created as a paid employee or contractor
– evaluation of employee performance, including requirements for promotion and financial bonuses
– processes and procedures for addressing employee grievances
As individuals join the organization, they should receive a copy of the organizational policies that clearly lays out what is expected of them, together with the consequences of violations. Evidence that each individual has read and agreed to the organization's policies should be maintained.
Employee disgruntlement was a recurring factor in insider compromises, particularly in
the insider IT sabotage cases. The disgruntlement was caused by some unmet expectation
by the insider. Examples of unmet expectations observed in cases include
– insufficient salary increase or bonus
– limitations on use of company resources
– diminished authority or responsibilities
– perception of unfair work requirements
– poor coworker relations
Clear documentation of policies and controls can help prevent employee
misunderstandings that can lead to unmet expectations. Consistent enforcement can
ensure that employees don't feel they are being treated differently from or worse than
other employees. In one case, employees had become accustomed to lax policy
enforcement over a long period of time. New management dictated immediate strict
policy enforcement, which caused one employee to become embittered and strike out
against the organization. In other words, policies should be enforced consistently across
all employees, as well as consistently enforced over time.
Of course, organizations are not static entities; change in organizational policies and
controls is inevitable. Employee constraints, privileges, and responsibilities change as
well. Organizations need to recognize times of change as particularly stressful times for
employees, recognize the increased risk that comes along with these stress points, and
mitigate it with clear communication regarding what employees can expect in the future.
Case Studies: What could happen if I don't do it?
An insider accepted a promotion, leaving a system administrator position in one
department for a position as a systems analyst in another department of the same
organization. In his new position, he was responsible for information sharing and
collaboration between his old department and the new one. The following events ensued:
– The original department terminated his system administrator account and issued him an ordinary user account to support the access required in his new position.
– Shortly thereafter, the system security manager at the original department noticed that the former employee's new account had been granted unauthorized system administration rights.
– The security manager reset the account back to ordinary access rights, but a day later found that administrative rights had been granted to it once again.
– The security manager closed the account, but over the next few weeks other accounts exhibited unauthorized access and usage patterns.
An investigation of these events led to charges against the analyst for misuse of the
organization's computing systems. These charges were eventually dropped, in part
because there was no clear policy regarding account sharing or exploitation of
vulnerabilities to elevate account privileges. This case illustrates the importance of
clearly established policies that are consistent across departments, groups, and
subsidiaries of the organization.
There are many cases in the CERT library where an employee compromised an
organization's information or system in order to address some perceived injustice:
– An insider planted a logic bomb in an organization's system because he felt that he was required to follow stricter work standards than his fellow employees.
– In reaction to a lower bonus than expected, an insider planted a logic bomb that would, he expected, cause the organization's stock value to go down, thus causing stock options he owned to increase in value.
– A network administrator who designed and controlled an organization's manufacturing support systems detonated a logic bomb to destroy his creation because of his perceived loss of status and control.
– A quality control inspector, who believed his employer insufficiently addressed the quality requirements of its product, supplied company confidential information to the media to force the company to deal with the problem.
– An insider, who was upset about his company's practice of cancelling insurance policies for policy holders who paid late, provided sensitive company information to the opposing lawyers engaged in a lawsuit against the company.
What these insiders did is wrong and against the law. Nevertheless, more clearly defined
policies and grievance procedures for perceived policy violations might have avoided the
serious insider attacks experienced by those organizations.
Practice 3: Institute periodic security awareness training for all
employees. (UPDATED)
Without broad understanding and buy-in from the organization, technical or
managerial controls will be short lived.
What to do?
All employees need to understand that insider crimes do occur, and there are severe
consequences. In addition, it is important for them to understand that malicious insiders
can be highly technical people or those with minimal technical ability. Ages of
perpetrators range from late teens to retirement. Both men and women have been
malicious insiders, including introverted loners, aggressive "get it done" people, and
extroverted star players. Positions have included low-wage data entry clerks, cashiers,
programmers, artists, system and network administrators, salespersons, managers, and
executives. They have been new hires, long-term employees, currently employed,
recently terminated, contractors, temporary employees, and employees of trusted business
partners.
Security awareness training should encourage identification of malicious insiders by
behavior, not by stereotypical characteristics. Behaviors of concern include
– threats against the organization or bragging about the damage one could do to the organization
– association with known criminals or suspicious people outside of the workplace
– large downloads close to resignation
– use of organization resources for a side business, or discussions regarding starting a competing business with coworkers
– attempts to gain other employees' passwords or to obtain access through trickery or exploitation of a trusted relationship (often called social engineering)
Managers and employees need to be trained to recognize social networking in which an
insider engages other employees to join their schemes, particularly to steal or modify
information for financial gain. Warning employees of this possibility and the
consequences may help to keep them on the watch for such manipulation and to report it
to management.
Social engineering is often associated with attempts either to gain physical access or
electronic access via accounts and passwords. Some of the CERT cases reveal social
engineering of a different type, however. In one recent case, a disgruntled employee
placed a hardware keystroke logger on a computer at work to capture confidential
company information. After being fired unexpectedly, the now former employee tried to
co-opt a non-technical employee still at the company to recover the device for him.
Although the employee had no idea the device was a keystroke logger, she was smart
enough to recognize the risk of providing it to him and notified management instead.
Forensics revealed that he had removed the device and transferred the keystrokes file to
his computer at work at least once before being fired.
Training programs should create a culture of security appropriate for the organization and
include all personnel. For effectiveness and longevity, the measures used to secure an
organization against insider threat need to be tied to the organization's mission, values,
and critical assets, as determined by an enterprise-wide risk assessment. For example, if
an organization places a high value on customer service quality, it may view customer
information as its most critical asset and focus security on protection of that data. The
organization could train its members to be vigilant against malicious employee actions,
focusing on a number of key issues, including
– detecting and reporting disruptive behavior by employees (see Practice 4)
– monitoring adherence to organizational policies and controls (see Practices 2 and 11)
– monitoring and controlling changes to organizational systems (e.g., to prevent the installation of malicious code) (see Practices 9 and 11)
– requiring separation of duties between employees who modify customer accounts and those who approve modifications or issue payments (see Practice 8)
– detecting and reporting violations of the security of the organization's facilities and physical assets (see Practice 6)
– planning proactively for potential incident response (see Practice 16)
Training on reducing risks to customer service processes would focus on
– protecting computer accounts used in these processes (see Practice 7)
– auditing access to customer records (see Practice 12)
– ensuring consistent enforcement of defined security policies and controls (see Practice 2)
– implementing proper system administration safeguards for critical servers (see Practices 10, 11, 12, and 13)
– using secure backup and recovery methods to ensure availability of customer service data (see Practice 15)
Training content should be based on documented policy, including a confidential means
of reporting security issues. Confidential reporting allows employees to report suspicious events without fear of repercussions, thereby overcoming the cultural barrier associated with whistle-blowing.
Employees need to understand that the organization has policies and procedures, and that
managers will respond to security issues in a fair and prompt manner.
Employees should be notified that system activity is monitored, especially system
administration and privileged activity. All employees should be trained in their personal
responsibility, such as protection of their own passwords and work products. Finally, the
training should communicate IT acceptable use policies.
Case Studies: What could happen if I don't do it?
The lead developer of a critical production application had extensive control over the
application source code. The only copy of the source code was on his company-provided
laptop; there were no backups performed, and very little documentation existed, even
though management had repeatedly requested it. The insider told coworkers he had no
intention of documenting the source code and any documentation he did write would be
obscure. He also stated that he thought poorly of his managers because they had not
instructed him to make backup copies of the source code.
A month after learning of a pending demotion, he erased the hard drive of his laptop,
deleting the only copy of the source code the organization possessed, and quit his job. It
took more than two months to recover the source code after it was located by law
enforcement in encrypted form at the insider's home. Another four months elapsed before
the insider provided the password to decrypt the source code. During this time the
organization had to rely on the executable version of the application, with no ability to
make any modifications. If the insider's team members had been informed that the security and survivability of the system were their responsibility, and if they had been presented with a clear procedure for reporting behavior of concern, they might have notified management of the insider's statements and actions in time to prevent the attack.
Another insider case involved a less technically sophisticated attack, but one that could
have been avoided or successfully prosecuted if proper policies and training had been in
place. Four executives left their firm to form a competing company. A few days before
they left, one of them ordered a backup copy of the hard drive on his work computer,
which contained customer lists and other sensitive information, from the external
company that backed up the data. The company also alleged that its consulting services
agreement and price list were sent by email from the insider's work computer to an
external email account registered under his name. The insiders, two of whom had signed
confidentiality agreements with the original employer, disagreed that the information
they took was proprietary, saying that it had been published previously. Clear policies
regarding definition of proprietary information and rules of use could have prevented the
attack or provided a clearer avenue for prosecution.
Recent Findings
A striking finding in recent cases is that in over two thirds of the 31 cases of theft for
financial gain, the insider was recruited to steal by someone outside the organization. In
many of these cases, the insider was taking most of the risk while receiving relatively
small financial compensation. The outsider was often a relative of the insider or an
acquaintance who realized the value of exploiting the insider's access to information. One manager of a hospital's billing records gave patients' credit card information to her brother, who used it for online purchases shipped to his home address. Another insider in the human resources department of a federal government organization gave employee personally identifiable information (PII) to her boyfriend, who used it to open and make purchases on fraudulent credit card accounts. As in CERT's previous research, outsiders
(e.g., car salesmen) continue to convince insiders to improve the credit histories of
individuals trying to obtain loans.
Organizations should educate employees on their responsibilities for protecting the
information with which they are entrusted and the possibility that unscrupulous
individuals could try to take advantage of their access to that information. Such
individuals may be inside or outside the organization. In almost half of the cases of modification of information for financial gain, the insider recruited at least one other employee in the company to participate in the scheme, possibly as a means to bypass separation-of-duty restrictions, or to ensure that coworkers wouldn't report suspicious behavior. In one recent case, several bank janitorial employees stole customer information while working, changed the customers' addresses online, opened credit cards in the customers' names, purchased expensive items using the cards, and drained the customers' bank accounts. Employees should be regularly reminded about procedures the company has in
place for anonymously reporting suspicious coworker behavior, or attempts of
recruitment by individuals inside or outside the organization.
Employees need to be educated about the confidentiality and integrity of the company's information, and that compromises will be dealt with harshly. Insiders sometimes did not understand this, viewing information as their own property rather than the company's; for example, customer information developed by a salesperson or software developed by a programmer.
There are also recent cases in which technical employees sold their organizations' intellectual property because of dissatisfaction with their pay, and others in which insiders gave the information to reporters and lawyers out of dissatisfaction with their organizations' practices. Signs of disgruntlement in cases like these often appear well before the actual
compromise. Such attacks can be prevented if managers and coworkers are educated to
recognize and report behavioral precursors indicating potential attacks.
Practice 4: Monitor and respond to suspicious or disruptive
behavior, beginning with the hiring process. (UPDATED)
One method of reducing the threat of malicious insiders is to proactively deal with
suspicious or disruptive employees.
What to do?
An organization's approach to reducing the insider threat should start in the hiring
process by performing background ch…

Introduction:

Insider threat is a growing concern for organizations worldwide. Employees can unwittingly expose sensitive information or, in worse cases, intentionally harm a company. The core difficulty is that these individuals already have legitimate access to confidential information, which makes it challenging to identify breaches before they occur. This creates the need for adequate security measures to control employee behavior and prevent the misuse of information.

Description:

The Common Sense Guide to Prevention and Detection of Insider Threats is a document developed by the CERT program at Carnegie Mellon University. It outlines best practices for preventing insider threats, with an emphasis on controlling employee behavior and identifying potential red flags. In this assignment, students are required to review the document and select one of the 16 best practices provided. The paper should include a summary of the chosen best practice and its relevance to an HR department. It must also conclude with a recommendation on how to implement the approach in their organization. This assignment aims to provide an understanding of the security practices necessary to prevent insider threats, a growing concern for all businesses.

Learning Objective: Identify the best practices for the prevention and detection of insider threats.

Headings:
– Introduction to the problem of insider threat
– Best practices for the prevention and detection of insider threats

Learning Outcome: After completing this assignment, students will be able to select and summarize a best practice for preventing and detecting insider threats and make a recommendation for implementing it in their organization.

Objectives:
– Explain the meaning of “insider threat” and why it is a problem.
– Analyze and evaluate the 16 best practices listed in CERT’s Common Sense Guide to Prevention and Detection of Insider Threats.
– Select one best practice and explain it to an HR person in the organization.
– Conclude with a recommendation on how to implement the best practice in the organization.

Solution 1:

Best Practice for Prevention and Detection of Insider Threats – Practice 2

The problem of insider threats is a significant security concern for any organization as it involves the risk of data breaches, financial loss, and reputational damage. Practice 2 from the CERT Common Sense Guide to Prevention and Detection of Insider Threats emphasizes the importance of clearly documenting and consistently enforcing policies and controls to mitigate the risk of insider threats.

As a Human Resources person in the organization, it is crucial to understand how to implement this best practice to prevent and detect insider threats. First, all organizational policies and controls should be documented, and employees must be trained on these policies during onboarding, with regular refreshers thereafter. Second, the HR department should conduct regular audits to ensure compliance and detect any potential violation of policies. Finally, any violation of the policies should be immediately reported and handled accordingly.

Therefore, implementing Practice 2 can significantly reduce the risk of insider threats in the organization.

Solution 2:

Best Practice for Prevention and Detection of Insider Threats – Practice 7

Practice 7 from the CERT Common Sense Guide to Prevention and Detection of Insider Threats focuses on password and account management policies and practices to prevent insider threats. This best practice emphasizes the importance of enforcing strict password policies and account management practices to mitigate the risk of insider threats to an organization.

As a Human Resources person in the organization, it is essential to support the implementation of Practice 7 to enhance the security of the organization’s technical environment. Start by enforcing password policies that require sufficient length and complexity, supplemented by multifactor authentication where possible, to ensure that only authorized personnel can access the organization’s systems. Comprehensive account management policies and practices, such as regularly reviewing user access privileges, disabling unused accounts, and monitoring account activity, must also be in place.
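For illustration only, here is a small sketch of the kind of complexity check such a password policy might enforce; the specific rules are assumptions for the example, not requirements from the CERT guide.

import re

def meets_policy(password, min_length=12):
    # Illustrative complexity policy: minimum length plus at least one
    # lowercase letter, uppercase letter, digit, and symbol.
    checks = [
        len(password) >= min_length,
        re.search(r"[a-z]", password),
        re.search(r"[A-Z]", password),
        re.search(r"\d", password),
        re.search(r"[^A-Za-z0-9]", password),
    ]
    return all(checks)

print(meets_policy("Tr1cky-Passphrase!"))  # True
print(meets_policy("password"))            # False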

In conclusion, implementing Practice 7 can significantly strengthen the organization’s security posture against insider threats.


Similar asked questions:
1. What are the most common types of insider threats in organizations?
2. How can organizations prevent and detect insider threats?
3. What security practices should organizations implement to control employee behavior and prevent misuse of information?
4. What are the consequences of insider threats and how can they impact an organization?
5. How can Human Resources departments play a role in preventing and detecting insider threats in an organization?
PRACTICE 5: ANTICIPATE AND MANAGE NEGATIVE WORKPLACE ISSUES (NEW) ……………………………….. 47
PRACTICE 6: TRACK AND SECURE THE PHYSICAL ENVIRONMENT (NEW) …………………………………………. 49
PRACTICE 7: IMPLEMENT STRICT PASSWORD AND ACCOUNT MANAGEMENT POLICIES AND PRACTICES.
(UPDATED) ………………………………………………………………………………………………………………………….. 52
PRACTICE 8: ENFORCE SEPARATION OF DUTIES AND LEAST PRIVILEGE. (UPDATED) ………………………… 55
PRACTICE 9: CONSIDER INSIDER THREATS IN THE SOFTWARE DEVELOPMENT LIFE CYCLE (NEW) ……….. 59
PRACTICE 10: USE EXTRA CAUTION WITH SYSTEM ADMINISTRATORS AND TECHNICAL OR PRIVILEGED
USERS. (UPDATED) ……………………………………………………………………………………………………………….. 63
PRACTICE 11: IMPLEMENT SYSTEM CHANGE CONTROLS. (UPDATED) ……………………………………………. 66
PRACTICE 12: LOG, MONITOR, AND AUDIT EMPLOYEE ONLINE ACTIONS. (UPDATED) ………………………. 70
PRACTICE 13: USE LAYERED DEFENSE AGAINST REMOTE ATTACKS. (UPDATED) …………………………….. 74
PRACTICE 14: DEACTIVATE COMPUTER ACCESS FOLLOWING TERMINATION. (UPDATED)…………………. 77
PRACTICE 15: IMPLEMENT SECURE BACKUP AND RECOVERY PROCESSES. (UPDATED) …………………….. 81
PRACTICE 16: DEVELOP AN INSIDER INCIDENT RESPONSE PLAN. (NEW) ………………………………………… 85
REFERENCES/SOURCES OF BEST PRACTICES …………………………………………………………………… 87
CERT | SOFTWARE ENGINEERING INSTITUTE | 3
INTRODUCTION
In 2005, the first version of the Common Sense Guide to Prevention and Detection of
Insider Threats was published by Carnegie Mellon Universitys CyLab. The document
was based on the insider threat research performed by CERT, primarily the Insider
Threat Study 1 conducted jointly with the U.S. Secret Service. It contained a description
of twelve practices that would have been effective in preventing or detecting malicious
insider activity in 150 actual cases collected as part of the study. The 150 cases occurred
in critical infrastructure sectors in the U.S. between 1996 and 2002.
A second edition of the guide was released in July of 2006. The second edition included a
new type of analysis by type of malicious insider activity. It also included a new section
that presented a high-level picture of different types of insider threats: fraud, theft of
confidential or proprietary information, and sabotage. also In addition, it contained new
and updated practices based on new CERT insider threat research funded by Carnegie
Mellon CyLab 2 and the U.S. Department of Defense Personnel Security Research
Center. 3 Those projects involved a new type of analysis of the insider threat problem
focused on determining high-level patterns and trends in the cases. Specifically, those
projects examined the complex interactions, relative degree of risk, and unintended
consequences of policies, practices, technology, insider psychological issues, and
organizational culture over time.
This third edition of the Common Sense Guide once again reflects new insights from
ongoing research at CERT. CyLab has funded the CERT Insider Threat Team to collect
and analyze new insider threat cases on an ongoing basis. The purpose of this ongoing
effort is to maintain a current state of awareness of the methods being used by insiders to
commit their attacks, as well as new organizational issues influencing them to attack.
This version of the guide includes new and updated practices based on an analysis of
approximately 100 recent insider threat cases that occurred from 2003 to 2007 in the U.S.
In this edition of the guide, CERT researchers also present new findings derived from
looking at insider crimes in a new way. These findings are based on CERTs analysis of
118 theft and fraud cases, which revealed a surprising finding. The intent of the research
was to analyze cases of insider theft and insider fraud to identify patterns of insider
behavior, organizational events or conditions, and technical issues across the cases. The
patterns identified separated the crimes into two different classes than originally
expected:
Theft or modification of information for financial gain This class includes cases
where insiders used their access to organization systems either to steal
1
See http://www.cert.org/insider_threat/study.html for more information on the Insider Threat Study.
A report describing the MERIT model of insider IT Sabotage, funded by CyLab, can be downloaded at
http://www.cert.org/archive/pdf/08tr009.pdf.
3
A report describing CERTs insider threat research with the Department of Defense can be downloaded
from http://www.cert.org/archive/pdf/06tr026.pdf.
2
CERT | SOFTWARE ENGINEERING INSTITUTE | 4
information that they sold to outsiders, or to modify information for financial gain
for themselves or others.
Theft of information for business advantage – This class includes cases where
insiders used their access to organization systems to obtain information that they
used for their own personal business advantage, such as obtaining a new job or
starting their own business.
It is important that organizations recognize the differences in the types of employees who
commit each type of crime, as well as how each type of incident evolves over time: theft
or modification for financial gain, theft for business advantage, IT sabotage, and
miscellaneous (incidents that do not fall into any of the three above categories). This
version of the guide presents patterns and trends observed in each type of malicious
activity. There have been minor updates to the IT sabotage information in this guide;
however, the most significant enhancements in this edition were made to the theft and
modification sections.
Some new practices were added in this edition that did not exist in the second edition. In
addition, every practice from the second edition has been modifiedsome significantly,
others to a lesser degreeto reflect new insights from the past years research at CERT.
Case examples from the second edition were retained in this edition for the benefit of
new readers. However, a Recent Findings section was included for all updated practices.
It details recent cases that highlight new issues not covered in the previous edition of this
guide.
What is Meant by “Insider Threat?”
CERT's definition of a malicious insider is
A current or former employee, contractor, or business partner who
• has or had authorized access to an organization's network, system, or data and
• intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization's information or information systems
Note that one type of insider threat is excluded from this guide: cases of espionage
involving classified national security information.
The scope of insider threats has been expanding beyond the traditional threat posed by a
current or former employee. Specifically, the CERT team has noted the following
important new issues in the expanding scope of insider threat.
Collusion with outsiders: Insider threat has expanded beyond the organizational
boundary. Half of the insiders who stole or modified information for financial gain were
actually recruited by outsiders, including organized crime and foreign organizations or
governments. It is important to pay close attention to the section of the guide titled "Theft or Modification of Information for Financial Gain." It will help you understand the types of employees who may be susceptible to recruitment.
Business partners: A recent trend noted by the CERT research team is the increase in the
number of insider crimes perpetrated not by employees, but by employees of trusted
business partners who have been given authorized access to their clients' networks,
systems, and data. Suggestions for countering this threat are presented in Practice 1.
Mergers and acquisitions: A recent concern voiced to the CERT team by industry is the
heightened risk of insider threat in organizations being acquired by another organization.
It is important that organizations recognize the increased risk of insider threat both within
the acquiring organization, and in the organization being acquired, as employees endure
stress and an uncertain organizational climate. Readers involved in an acquisition should
pay particular attention to most of the practices in this guide.
Cultural differences: Many of the patterns of behavior observed in CERT's insider threat
modeling work are reflected throughout this guide. However, it is important for readers to
understand that cultural issues could influence employee behaviors; those same
behavioral patterns might not be exhibited in the same manner by people who were raised
or spent extensive time outside of the U.S.
Issues outside the U.S.: CERT's insider threat research is based on cases that occurred
inside the United States. It is important for U.S. companies operating branches outside
the U.S. to understand that, in addition to the cultural differences influencing employee
behavior, portions of this guide might also need to be tailored to legal and policy
differences in other countries.
Are insiders really a threat?
The threat of attack from insiders is real and substantial. The 2007 E-Crime Watch Survey™, conducted by the United States Secret Service, the CERT Coordination Center (CERT/CC), Microsoft, and CSO Magazine (http://www.cert.org/archive/pdf/ecrimesummary07.pdf), found that in cases where respondents could identify the perpetrator of an electronic crime, 31% were committed by insiders. In
addition, 49% of respondents experienced at least one malicious, deliberate insider
incident in the previous year. The impact from insider attacks can be devastating. One
employee working for a manufacturer stole blueprints containing trade secrets worth
$100 million, and sold them to a Taiwanese competitor in hopes of obtaining a new job
with them.
Over the past several years, Carnegie Mellon University has been conducting a variety of
research projects on insider threat. One of the conclusions reached is that insider attacks
have occurred across all organizational sectors, often causing significant damage to the
affected organizations. Examples of these acts include the following:
• Low-tech attacks, such as modifying or stealing confidential or sensitive information for personal gain.
• Theft of trade secrets or customer information to be used for business advantage or to give to a foreign government or organization.
• Technically sophisticated crimes that sabotage the organization's data, systems, or network.
Damages in many of these crimes are not only financial; widespread public reporting of the event can also severely damage the organization's reputation.
Insiders have a significant advantage over others who might want to harm an
organization. Insiders can bypass physical and technical security measures designed to
prevent unauthorized access. Mechanisms such as firewalls, intrusion detection systems,
and electronic building access systems are implemented primarily to defend against
external threats. However, not only are insiders aware of the policies, procedures, and
technology used in their organizations, but they are often also aware of their
vulnerabilities, such as loosely enforced policies and procedures or exploitable technical
flaws in networks or systems.
CERT's research indicates that use of many widely accepted best practices for information security could have prevented many of the insider attacks examined. Part of CERT's research into insider threat cases entailed an examination of how each organization
could have prevented the attack or at the very least detected it earlier. Previous editions of
the Common Sense Guide identified existing best practices critical to the mitigation of
the risks posed by malicious insiders. This edition identifies additional best practices
based on new methods and contextual factors in recent cases, and also presents some new
suggestions for countering insider threat based on findings that could not be linked to
established best practices.
Based on our research to date, the practices outlined in this report are the most important
for mitigating insider threats.
Who should read this report?
This guide is written for a diverse audience. Decision makers across an organization can
benefit from reading it. Insider threats are influenced by a combination of technical,
behavioral, and organizational issues, and must be addressed by policies, procedures, and
technologies. Therefore, it is important that management, human resources, information
technology, software engineering, legal, security staff, and the owners of critical data
understand the overall scope of the problem and communicate it to all employees in the
organization.
The guide outlines practices that should be implemented throughout organizations to
prevent insider threats. It briefly describes each practice, explains why it should be
implemented, and provides one or more actual case examples illustrating what could
happen if it is not, as well as how the practice could have prevented an attack or
facilitated early detection.
Much has been written about the implementation of these practices (a list of references on
this topic is provided at the end of this guide). This report provides a synopsis of those
practices, and is intended to convince the reader that someone in the organization should
be given responsibility for reviewing existing organizational policies, processes, and
technical controls and for recommending necessary additions or modifications.
Can insiders be stopped?
Insiders can be stopped, but stopping them is a complex problem. Insider attacks can only
be prevented through a layered defense strategy consisting of policies, procedures, and
technical controls. Therefore, management must pay close attention to many aspects of its
organization, including its business policies and procedures, organizational culture, and
technical environment. It must look beyond information technology to the organization's
overall business processes and the interplay between those processes and the technologies
used.
Acknowledgements
In sponsoring the Insider Threat Study, the U.S. Secret Service provided more than just
funding for CERT's research. The joint study team, composed of CERT information security experts and behavioral psychologists from the Secret Service's National Threat Assessment Center, defined the research methodology and conducted the research that has provided the foundation for all of CERT's subsequent insider threat research. The
community as a whole owes a debt of gratitude to the Secret Service for sponsoring and
collaborating on the original study, and for permitting CERT to continue to rely on the
valuable casefiles from that study for ongoing research. Specifically, CERT would like to
thank Dr. Marisa Reddy Randazzo, Dr. Michelle Keeney, Eileen Kowalski, and Matt
Doherty from the National Threat Assessment Center, and Cornelius Tate, David
Iacovetti, Wayne Peterson, and Tom Dover, our liaisons with the Secret Service during
the study.
The authors would also like to thank the CERT members of the Insider Threat Study
team, who reviewed and coded cases, conducted interviews, and assisted in writing the
study reports: Christopher Bateman, Casey Dunlevy, Tom Longstaff, David Mundie,
Stephanie Rogers, Timothy Shimeall, Bradford Willke, and Mark Zajicek.
Since the Insider Threat Study, the CERT team has been fortunate to work with
psychologists who have contributed their vast experience and new ideas to our work: Dr.
Eric Shaw, a Visiting Scientist on the CERT Insider Threat team who has contributed to
most of the CERT insider threat projects, Dr. Steven Band, former Chief of the FBI
Behavioral Sciences Unit, who has provided expertise on psychological issues, and Dr.
Lynn Fischer from the Department of Defense Personnel Security Research Center, who
sponsored CERT's initial insider threat research and has continued to work with the
CERT team on various insider threat projects.
The CERT team is extremely appreciative of the ongoing funding provided by CyLab.
The impact of the insider threat research sponsored by CyLab has been enormous, within
industry and government, and inside the U.S. as well as globally. CyLab has provided
key funding that has enabled the CERT team to perform research for the benefit of all:
government and industry, technical staff as well as management. Specifically, we would
like to thank Pradeep Khosla, Don McGillen, and Linda Whipkey, who have been
advocates for CERT's insider threat research since its inception, as well as Richard Power, Gene Hambrick, Virgil Gligor, and Adrian Perig, whom the CERT team has had
the pleasure of working with over the past year.
The CERT team has had assistance from various CyLab graduate students over the past
few years. These students enthusiastically joined the team and devoted their precious
time to the CERT insider threat projects: Akash Desai, Hannah Benjamin-Joseph,
Christopher Nguyen, Adam Cummings, and Tom Carron. Special thanks to Tom, who is
a current member of the CERT/CyLab insider threat team, and who willingly dropped
everything he was doing over and over again to search the database for specific examples
we needed to make this report as compelling as possible.
The Secret Service provided the 150 original case files for CERT's insider threat research. CyLab's research required identification and collection of additional case materials. The CERT team gratefully acknowledges the hard work and long hours, including many weekends, spent by Sheila Rosenthal, SEI's Manager of Library Services, assisting with this effort. Sheila was instrumental in obtaining the richest source materials available for more than 100 new cases used in the team's CyLab-sponsored research.
Finally, CERT would like to thank all of the organizations, prosecutors, investigators, and
convicted insiders who agreed to provide confidential information to the team to enhance
the research. It is essential to the community that all of the "good guys" band together
and share information so that together we can keep employees happy, correct problems
before they escalate, and use our technical resources and business processes to prevent
malicious insider activity or detect the precursors to a devastating attack.
Patterns and Trends Observed by Type of Malicious
Insider Activity
The CERT insider threat team has collected approximately 250 actual insider threat
cases. One hundred ninety of those cases were analyzed in detail for this report. Because
the remaining cases did not have sufficient information available or were still in the U.S.
court system at the time of this publication, they have not yet been formally analyzed.
This section of the document presents trends and patterns observed in those cases by class
of malicious insider activity:
• IT sabotage: cases in which current or former employees, contractors, or business partners intentionally exceeded or misused an authorized level of access to networks, systems, or data with the intention of harming a specific individual, the organization, or the organization's data, systems, and/or daily business operations.
• Theft or modification for financial gain: cases in which current or former employees, contractors, or business partners intentionally exceeded or misused an authorized level of access to networks, systems, or data with the intention of stealing or modifying confidential or proprietary information from the organization for financial gain.
• Theft or modification for business advantage: cases in which current or former employees, contractors, or business partners intentionally exceeded or misused an authorized level of access to networks, systems, or data with the intention of stealing confidential or proprietary information from the organization with the intent to use it for a business advantage.
• Miscellaneous: cases in which current or former employees, contractors, or business partners intentionally exceeded or misused an authorized level of access to networks, systems, or data with the intention of stealing confidential or proprietary information from the organization, not motivated by financial gain or business advantage.
The breakdown of the cases into those four categories is shown in Figure 1.
Figure 1. Breakdown of Insider Threat Cases: IT sabotage (80 cases), theft or modification for financial gain (77), theft for business advantage (24), and theft for miscellaneous reasons (17). (190 cases were analyzed for this report; however, some of the cases were classified as more than one type of crime.)
Some cases fell into multiple categories. For example, some insiders committed acts of
IT sabotage against their employers' systems, then attempted to extort money from them,
offering to assist them in recovery efforts only in exchange for a sum of money. A case
like that is categorized as both IT sabotage and theft or modification of information for
financial gain. Four of the 190 cases were classified as theft for financial gain and IT
sabotage. Another case involved a former vice president of sales copying a customer
database and sales brochures from the organization before deleting them and taking
another job. This case is classified as theft of information for business advantage and IT
sabotage. One case was classified as theft for business advantage and IT sabotage.
Finally, three cases were classified as IT sabotage and theft for miscellaneous reasons.
A breakdown of the cases depicting the overlap between categories is shown in Figure 2.
Figure 2. Overlap among the Insider Threat Classes: IT sabotage only (75 cases), theft or modification for financial gain only (73), theft for business advantage only (23), IT sabotage and theft for financial gain (4), and IT sabotage and theft for business advantage (1). (Seventeen of the cases were classified as miscellaneous theft, in which the motive was not financial gain or business advantage; this figure does not depict those seventeen crimes.)
Figure 3 shows the distribution of each type of case by critical infrastructure sector. It is
interesting to note the differences among sectors. For instance, it is not surprising that
theft of information for financial gain is most prominent in the Banking and Finance
sector. However, it might be a bit unexpected to note that theft for financial gain in the
Government sector is a close second, followed by Information Technology and
Telecommunications.
Theft of information for business advantage, on the other hand, is highly concentrated in
the IT and Telecommunications sector, with cases in the Banking and Finance sector
second. Chemical and Hazardous Materials and the Defense Industrial Base were the
only other two critical infrastructure sectors that experienced theft of information for
business advantage.
The number of cases of insider IT sabotage in the IT sector is quite striking. The
government sector was second in number of insider IT sabotage attacks. Note that the
only two sectors to have experienced no insider IT sabotage attacks were Chemical and
Hazardous Materials and Emergency Services; every other sector experienced at least one
attack.
Figure 3. Distribution of Cases by Critical Infrastructure Sector (number of cases in each critical infrastructure sector, broken out by crime type: miscellaneous theft, theft for business advantage, theft for financial gain, and sabotage).
Insider IT Sabotage
In this report, insider IT sabotage cases are defined as follows: cases in which current or
former employees, contractors, or business partners intentionally exceeded or misused an
authorized level of access to networks, systems, or data with the intention of harming a
specific individual, the organization, or the organization's data, systems, and/or daily
business operations.
CERT researchers analyzed 80 cases of IT sabotage that occurred in the United States
between 1996 and 2007.
Who were the insiders?
The insiders who committed IT sabotage were primarily male and held highly technical
positions, the majority hired with system administrator or privileged access. However,
according to the U.S. Department of Labor Bureau of Labor Statistics, in 2007, 74% of all employees in computer and mathematical occupations were male (http://www.bls.gov/cps/cpsaat9.pdf). Therefore, while it
is useful to note that sabotage was typically committed by technically sophisticated
employees, focusing attention only on male employees is probably not a logical
conclusion. In addition, the majority of the insiders who committed IT sabotage were
former employees.
Why did they do it?
Over half of the insiders were perceived as disgruntled, and most of them acted out of
revenge for some negative precipitating event. Examples of negative events include
termination, disputes with the employer, new supervisors, transfers or demotions, and
dissatisfaction with salary increases or bonuses.
How did they attack?
The majority of the insiders who committed IT sabotage did not have authorized access at
the time of their attack. Only 30% used their own username and password; 43% of them
compromised an account. Twenty-four percent used another employee's username and
password, and 16% used an unauthorized (backdoor) account they had created
previously. They also used shared accounts, including some that had been overlooked in
the termination process; 23% used system administrator or database administrator (DBA)
accounts and 11% used other types of shared accounts, for instance testing accounts or
training accounts.
Thirty-five percent used sophisticated technical means for carrying out their attacks.
Commonly used technical methods included writing a script or program, such as a logic
bomb, or creating a backdoor account for later use. Other technical mechanisms included
planting a virus on customer computers, using password crackers, and installation of
remote system administration tools.
Approximately 30% took technical preparatory actions prior to the attack, particularly in
cases where they anticipated termination. For example, they wrote, tested, and planted
logic bombs, sabotaged backups, and created backdoor accounts. Most logic bombs were
designed to delete massive amounts of data; however, at least one was designed to disrupt
business operations surreptitiously, six months following the insider's termination. Some
backdoor accounts were fairly obvious and could have been detected easily in an account
audit, while others were well concealed. Most insiders used remote access, and carried
out their attack outside of normal working hours.
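The kind of account audit mentioned above can be approximated by reconciling the accounts that exist on a system against an authoritative personnel roster. The sketch below is illustrative only and is not part of the CERT guidance; the file names and column names (accounts.csv with username/owner fields, roster.csv with employee_id/status fields) are assumptions made for the example.

    import csv

    def load_roster(path="roster.csv"):
        # Hypothetical HR export: employee_id,status ("active" or "terminated").
        with open(path, newline="") as f:
            return {row["employee_id"]: row["status"] for row in csv.DictReader(f)}

    def audit_accounts(accounts_path="accounts.csv", roster_path="roster.csv"):
        # Hypothetical system export: username,owner (employee_id of the documented owner).
        roster = load_roster(roster_path)
        findings = []
        with open(accounts_path, newline="") as f:
            for acct in csv.DictReader(f):
                owner = acct.get("owner", "").strip()
                if not owner:
                    # No documented owner: candidate backdoor or forgotten shared account.
                    findings.append((acct["username"], "no documented owner"))
                elif owner not in roster:
                    findings.append((acct["username"], "owner not on personnel roster"))
                elif roster[owner] != "active":
                    # The account survived the termination process.
                    findings.append((acct["username"], "owner status is " + roster[owner]))
        return findings

    if __name__ == "__main__":
        for username, reason in audit_accounts():
            print("REVIEW:", username, "--", reason)

Run periodically, a reconciliation like this surfaces the obvious backdoor accounts and the shared or orphaned accounts overlooked during termination; well-concealed backdoors still require deeper review.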
How was it detected?
Most of the attacks were detected manually due to system failure or irregularity. Non-security personnel, including customers in almost 25% of the cases, often detected the
attacks. Employees detecting the attacks included supervisors, coworkers, and security
staff.
Observable concerning behaviors were exhibited by the insiders prior to setting up and
carrying out their attack. Common behavioral precursors included conflicts with
supervisors and coworkers (which were sometimes quite angry or violent), decline in
performance, tardiness, or unexplained absenteeism. In some cases, management did not
notice or ignored the problems. In other cases, sanctions imposed by the organization
only increased the insiders' concerning behaviors rather than putting an end to them.
How was the insider identified?
In most cases, system logs were used to identify the insider, including remote access logs,
file access logs, database logs, application logs, and email logs. Most of the insiders took
steps to conceal their actions; some insiders, knowing that the logs would be used for
identification, attempted to conceal their actions by modifying the logs. In some cases,
they modified the logs to implicate someone else for their actions.
What were the impacts?
In 68% of the cases, the organization suffered some type of business impact, such as
inability to conduct business due to the system or network being down, loss of customer
records, or inability to produce products due to damaged or destroyed software or
systems.
Other negative consequences resulted from
• negative media attention
• forwarding management email containing private information, like strategic plans or plans of impending layoffs, to customers, competitors, or employees
• exposure of personal information, like Social Security numbers
• web site defacements in which legitimate information was replaced with invalid or embarrassing content
• publication of confidential customer information on a public web site
In 28% of the cases, an individual was harmed. Examples of harm to individuals include
threats, modification of evidence to falsely implicate supervisors or coworkers, and
exposure of personal or private information.
For a more detailed description of insider IT sabotage, see The “Big Picture” of Insider IT
Sabotage Across U.S. Critical Infrastructures, which can be downloaded at
http://www.cert.org/archive/pdf/08tr009.pdf.
Theft or Modification for Financial Gain
In this report, insider theft or modification for financial gain cases are defined as follows:
cases in which current or former employees, contractors, or business partners
intentionally exceeded or misused an authorized level of access to networks, systems, or
data with the intention of stealing or modifying their employer's confidential or
proprietary information for financial gain.
CERT researchers analyzed 77 cases of theft or modification for financial gain that
occurred in the United States between 1996 and 2007. Seventy-three cases involved only
theft or modification for financial gain and four also involved IT sabotage.
Who were the insiders?
Only five of the insiders who committed crimes in this category were former employees;
all others were current employees when they committed their illicit activity. Half of the
insiders were male and half were female. The insiders committing this type of crime
tended to occupy lower-level, non-technical positions in the organization. Their job
duties included data entry and management of personally identifiable information (PII) or
customer information (CI). For example, many of these insiders held data entry positions
or were classified as clerks.
Why did they do it?
The primary motivation for all insiders in this category was financial gain. Insiders stole
information to sell it, modified data to achieve financial benefits for themselves, friends,
or family, or were paid by outsiders to modify information. Some insiders were
motivated to provide additional income for their relatives, and a few insiders had large
credit card debts or drug-related financial difficulties.
Most of these attacks were long, ongoing schemes; approximately one third of the
incidents continued for more than one year. Of the short, quick compromises, half ended
because the insider was caught quickly, and the other half ended because the crime was
committed as the employee was leaving the organization or following termination.
The prevalence of collusion between the insiders in these cases and either people external to the organization or other insiders is extremely high. Some cases involved
collusion with both insiders and outsiders. In cases of insider theft for financial gain, the
insider colluded with outsiders in two thirds of the cases, and one third of the cases
involved collusion between the insider and someone else inside the organization. In those
theft cases, an outsider recruited the insider to commit the crime in half of the cases. In
less than one third of the cases, the insider acted alone.
A recurring pattern in the theft of information for financial gain cases includes an
outsider recruiting an insider in a low-paying, non-technical position who has access to
PII or CI. The insider steals the information; the outsider then pays the insider and uses
the information to commit fraud or identity theft.
Some insiders were paid to modify data, for example credit histories. In some cases they
were paid by people with poor credit histories, and in others by someone (like a car
dealer) who would benefit from the beneficiaries' loan approvals. Other insiders were paid by external people to create false driver's licenses, to enter fake health care providers,
and to generate false claims totaling significant amounts. Still others were paid to
counterfeit federal identity documents.
Finally, some insiders were able to design and carry out their own modification scheme
due to their familiarity with the organization's systems and business processes. For
example, a payroll manager defrauded her employer of more than $300,000 by adding
her husband to the payroll every week, generating a paycheck for him, then removing
him immediately from the payroll system to avoid detection. Her crime was only
discovered approximately one year after she left the company when an accountant
noticed the unauthorized checks.
In cases of insider modification of information for financial gain, insiders colluded with
an outsider in half of the cases, and almost half of the cases involved collusion between
the insider and someone else inside the organization. In modification cases, an outsider
recruited the insider to commit the crime in less than one third of the cases. In one third
of the cases, the insider acted alone.
How did they attack?
Ninety-five percent of the insiders stole or modified the information during normal working hours, and over 75% of the insiders used authorized access. Twenty-five percent
did not have authorized access when they committed their crime; all others were
legitimate users. Five had system administrator or database administrator access and less
than 15% had privileged access. Almost all of the insiders used only legitimate user
commands to steal or modify the data. Only 16% of the crimes involved sophisticated
technical techniques, like use of a script or program, creation of a backdoor account, or
account compromise.
Eighty-five percent of the insiders used their own usernames and passwords to commit their crimes. Slightly over 10% compromised someone else's account, two insiders used
a computer left logged in and unattended by a coworker, one insider used a customer
account, and one used a company-wide training account. In nine of the cases, the insider
was able to compromise access to an account via social engineering methods. Some
insiders used more than one account to carry out their crime.
Only two insiders took technical preparatory actions to set up their illicit activity. One
insider enabled fraudulent medical care providers to be added to the database. Another
disabled automatic notification of the security staff when a certain highly restricted
function was used in the system, then used that function to conduct his fraudulent
scheme.
How was it detected?
Only one of the insiders was detected due to network monitoring activities. Half were
detected due to data irregularities, including suspicious activities in the form of bills,
tickets, or negative indicators on individuals' credit histories. The majority of the cases
were detected by non-technical means, such as notification of a problem by a customer,
law enforcement officer, coworker, informant, auditor, or other external person who
became suspicious. In five cases, the insider was detected when the information was
offered for sale directly to a competitor via email or posted online. Most of the malicious
activity was eventually detected by multiple people. Over 50% of the cases were detected
internally by non-IT security personnel, 26% by clients or customers of the organization,
approximately 10% by customers, and 5% by competitors.
How was the insider identified?
In most cases, system logs were used to identify the insider, including database logs,
system file change logs, file access logs, and others.
What were the impacts?
The theft or modification cases analyzed for this report affected not only the insiders'
organizations, but also other innocent victims. For example, a check fraud scheme
resulted in innocent people receiving collection letters due to fraudulent checks written
against their account. Other cases involved insiders committing credit card fraud by
abusing their access to confidential customer data. Other insiders subverted the justice
system by modifying court records. Some cases could have very serious consequences: cases in which insiders created false official identification documents or driver's licenses for illegal aliens or others who could not obtain them legally. Similarly, one insider
accepted payment to modify a database to overturn decisions denying asylum to illegal
aliens, enabling them to remain in the U.S. illegally.
The insiders' organizations also suffered as a result of these crimes. Impacts included
negative media attention as well as financial losses. One insider committed fraud against
a state insurance fund for a total of almost $850,000, and another insider working for the
same company was tied to almost $20 million in fraudulent or suspicious transactions.
Another insider committed fraud against a federal agency for over $600,000. In a case
involving both sabotage and fraud, an insider set himself up to benefit from the abrupt
decline in his company's stock price when he deleted over 10 billion files on the company's servers, costing the organization close to $3 million in recovery costs.
Theft of Information for Business Advantage
In this report, cases involving theft of confidential or proprietary information are defined
as follows: cases in which current or former employees, contractors, or business partners
intentionally exceeded or misused an authorized level of access to networks, systems, or
data with the intention of stealing confidential or proprietary information from the
organization with the intent to use it for a business advantage. While an argument can be
made that this type of incident may ultimately be about money, these insiders had longer-term ambitions, such as using the information to get a new job, to use it in a new job with a competing business, or to start a competing business.
CERT researchers analyzed twenty-four cases of theft of confidential or proprietary
information for business advantage that occurred in the United States between 1996 and
2007. Twenty-three cases involved only information theft and one also involved IT
sabotage.
Who were the insiders?
In all of the cases analyzed, the insiders who stole confidential or proprietary information
were male and 71% held technical positions. The remaining 29% occupied sales
positions. Twenty-five percent were former employees; the other 75% were current
employees when they committed their illicit activity. Interestingly, nearly 80% of the
insiders had already accepted positions with another company or had started a competing
company at the time of the theft.
Why did they do it?
By definition, all of these insiders committed the crime in order to obtain a business
advantage. Some insiders stole the information to give them an immediate advantage at a
new job. Others used the information to start a new, competing business. Almost all
(95%) of the insiders resigned before or after the theft. Most of the thefts (almost 70%) took place within three weeks of the insider's resignation.
In 25% of the cases, the insider gave the information to a foreign company or government
organization. It is important to note that half of the theft for business advantage cases
with the highest financial impact involved foreign organizations.
How did they attack?
Eighty-eight percent of the insiders had authorized access to the information when they
committed the theft. The only insiders who did not have authorized access to the
information they stole were former employees at the time of the crime. None of the
insiders had privileged access, such as system administrator or database administrator
access, that enabled them to commit the crime, although one former employee was given
authorized access to do some additional work; he used that access to commit the theft. In
other words, the widespread fear of system administrators using their privileged access to
steal information was not evidenced in these cases.
The majority of these theft cases occurred quickly, spanning less than a one-month
period. Less than one third of the insiders continued their theft over a longer period, half
of them stealing for a side business, and half to take to a new employer. Although most of
the thefts occurred quickly, there often was significant planning by the insider. More
than one third of the insiders had already created, or were planning to start, a new
business while still working at the victim organization. Some of the insiders were
deceptive about their plans when leaving the organization, either lying about future job
plans or declining to reveal that they had already accepted another position. One insider
created a side business as a vehicle for transferring trade secrets he stole from his current
employer to a foreign-state-owned company. He concealed his connection to the side
business by removing his name from the business's articles of incorporation and only using
a post office box as the address for the company.
There was slightly less collusion in these theft cases than in the cases of theft or
modification for financial gain, but the numbers are still significant. In approximately
half of the cases, the insider colluded with at least one other insider to commit the crime.
In some cases, the employee stole the information, resigned his position, then recruited
other employees still at the original organization to steal additional information. These
crimes were usually the insider's own idea; the insider was only recruited by someone
outside the organization in less than 25% of the cases.
The majority of these crimes were committed during working hours, although a few
insiders acted outside working hours. Very few (roughly 12%) used remote access,
accessing their employers' networks from their homes or from another organization.
Some insiders stole information using both remote access and access from within the
workplace, and some acted both inside and outside normal working hours.
How was it detected?
Many of these incidents were detected by non-technical means, such as
• notification by a customer or informant,
• detection by law enforcement investigating the reports of the theft by victims,
• reporting of suspicious activity by co-workers, and
• sudden emergence of new competing organizations.
In one case, the victim organization became suspicious upon seeing a product strikingly
similar to theirs at a competitor's booth at a trade show. In another, customers alerted the victim organization to the theft when the insider attempted to sell products and services identical to theirs on behalf of a new organization.
Twenty-five percent of the cases were detected by system administrators or IT security
personnel while monitoring download logs or email logs.
How was the insider identified?
In most cases, system logs were used to identify the insider, including file access,
database, and email logs.
What were the impacts?
Impacts on organizations included financial and other losses. It is extremely difficult to
quantify the losses resulting from stolen trade secrets. In 38% of the cases, proprietary
software or source code was stolen; an equal number of cases involved business plans,
proposals, and other strategic plans; and a slightly smaller number involved trade secrets,
such as product designs or formulas.
Finally, the insiders themselves sometimes suffered unanticipated consequences. Some
insiders were surprised that their actions were criminal in nature, claiming that they
created the information once, and could do it again, and therefore it was easier to simply
take it with them when they left the organization. In one case, the insider committed
suicide before he could be brought to trial.
Summary
Forty-five percent of the 176 cases analyzed for this report involved IT sabotage, 44%
involved theft or modification of information for financial gain, and 14% involved theft
or modification of information for business advantage (recall that some crimes fit into multiple categories; also, cases of miscellaneous theft were excluded from this calculation). However, although IT sabotage
and theft or modification of information for financial gain were the most prevalent types
of crime, the potential impacts of all three types of crime are serious. Therefore,
organizations should consider whether each of these activities is a potential threat to
them, and if so, consider the information in this report regarding those types of crimes
carefully.
Furthermore, the authors of this report contend that insider IT sabotage is a threat to any
organization that relies on an IT infrastructure for its business, regardless of the size or
complexity of the configuration. Likewise, it is unlikely that many organizations can
disregard insider theft of proprietary or confidential information as an insider threat.
Therefore, it is recommended that all organizations consider the practices detailed in the
remainder of this report for prevention of sabotage and information theft.
Table 1 provides a summary of the details surrounding the three types of insider crimes.
High-Level Comparison of Insider Threat Types
The potential threat of insider sabotage is posed by disgruntled technical staff following a
negative work-related event. These insiders tend to act alone. While coworkers might
also be disgruntled immediately following the negative event, most of them come to
accept the situation. The potential for insider IT sabotage should be considered when
there are ongoing, observable behavioral precursors preceding technical actions that are
taken to set up the crime.
Data pertaining to theft or modification of information for financial gain and information
theft for business advantage, on the other hand, suggest that organizations need to
exercise some degree of caution with all employees. Current employees in practically any
position have used legitimate system access to commit those types of crimes. In theft or
modification for financial gain, there was also a high degree of collusion with both
outsiders (primarily to market the stolen information or to gain benefit from its
modification) and other insiders (primarily to facilitate the theft or modification).
Collusion was less common, but still significant, in theft for business advantage. Crimes
for financial gain were also more likely to be induced by outsiders than crimes for
business advantage.
Of special note, however, is the fact that ninety-five percent of the employees who stole
information for business advantage resigned before or after the theft. Therefore, extra
caution should be exercised once the organization becomes aware, either formally or via rumor, of an employee's plans to resign. A balance of trust and caution should factor into the organization's policies, practices, and technology.
Insider IT Sabotage
• Percentage of crimes in CERT's case database: 45%
• Current or former employee? Former
• Type of position: Technical (e.g., system administrators or database administrators)
• Gender: Male
• Target: Network, systems, or data
• Access used: Unauthorized access
• When: Outside normal working hours
• Where: Remote access
• Recruited by outsiders: None
• Collusion: None

Insider Theft or Modification of Information for Financial Gain
• Percentage of crimes in CERT's case database: 44%
• Current or former employee? Current
• Type of position: Non-technical, low-level positions with access to confidential or sensitive information (e.g., data entry, customer service)
• Gender: Fairly equally split between male and female
• Target: Personally Identifiable Information or Customer Information
• Access used: Authorized access
• When: During normal working hours
• Where: At work
• Recruited by outsiders: Half recruited for theft; less than one third recruited for modification
• Collusion: Almost half colluded with another insider in modification cases; two thirds colluded with outsiders in theft cases

Insider Theft of Information for Business Advantage
• Percentage of crimes in CERT's case database: 14%
• Current or former employee? Current
• Type of position: Technical (71%): scientists, programmers, engineers; Sales (29%)
• Gender: Male
• Target: Intellectual property (trade secrets), 71%; customer information, 33% (some insiders stole more than one type of information)
• Access used: Authorized access
• When: During normal working hours
• Where: At work
• Recruited by outsiders: Less than one fourth
• Collusion: Almost half colluded with at least one insider; half acted alone

Table 1. Summary Comparison by Type of Insider Incident
How Can They Be Stopped?
The methods of carrying out malicious insider activity varied by type of crime. The IT
sabotage cases tended to be more technically sophisticated, while the theft or
modification of information for financial gain and information theft for business
advantage tended to be technically unsophisticated in comparison.
It is important that organizations carefully consider implementing the practices outlined
in the remainder of this report to protect themselves from any of these malicious activities
that pose a risk to them. Proactive technical measures need to be instituted and
maintained at a constant level in order to prevent or detect technical preparatory actions.
Good management practices need to be instituted and maintained in order to prevent
insider threats, or recognize and react appropriately when indicators of potential insider
threats are exhibited. Legal and contractual implications in the cases examined by CERT
need to be understood and accounted for with employees, contractors, and partner
organizations.
Too often, organizations allow the quality of their practices to erode over time because
they seem to be less important than competing priorities if no malicious insider activity
has been detected. One of the vulnerabilities posed by insiders is their knowledge of
exactly this: the quality of their organizations' defenses.
What if an Insider Attack Succeeds?
One pattern common to all of the cases is the importance of system logs in identifying the
insider. Regardless of type of crime, system logs provide the evidence needed to take
appropriate action. Since many technical insiders attempted to conceal their actions,
sometimes by altering system logs, it is particularly important that organizations architect
their systems to ensure the integrity of their logs.
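One way to architect systems so that log integrity can be relied on, as recommended above, is to make tampering detectable, for example by chaining a cryptographic hash through the log entries and keeping the chain (or the entries themselves) on a system the insider cannot write to. The fragment below is a minimal sketch of that idea, not a mechanism prescribed by this guide; real deployments would more likely combine a central log server, write-once storage, and an established logging product.

    import hashlib

    SEED = "log-chain-seed"  # assumed starting value agreed on out of band

    def chain_log(lines, seed=SEED):
        # Each digest covers the entry plus the previous digest, so editing or
        # deleting any entry breaks every digest that follows it.
        prev = hashlib.sha256(seed.encode()).hexdigest()
        chained = []
        for line in lines:
            prev = hashlib.sha256((prev + line).encode()).hexdigest()
            chained.append((line, prev))
        return chained

    def verify_chain(chained, seed=SEED):
        # Recompute the chain; return the index of the first mismatch, or None if intact.
        prev = hashlib.sha256(seed.encode()).hexdigest()
        for index, (line, digest) in enumerate(chained):
            prev = hashlib.sha256((prev + line).encode()).hexdigest()
            if prev != digest:
                return index
        return None

    if __name__ == "__main__":
        records = chain_log(["user=jdoe action=login", "user=jdoe action=drop_table"])
        records[1] = ("user=jdoe action=read_table", records[1][1])  # simulate tampering
        print("first tampered entry:", verify_chain(records))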
In addition to protecting and defending against insider threats, it is also important that
organizations are prepared to respond to an insider incident should one occur.
Organizations frequently overlook insider threats when preparing incident response plans.
Insider incidents need to be investigated carefully, since it is not always apparent who
can be trusted and who cannot. In addition, organizations should make a proactive
decision regarding forensics capability: if an insider incident occurs, will forensics be
handled internally, or will an external forensics expert be hired? Some insider cases
obtained by CERT could not be prosecuted because the organization did not properly
handle system logs, and as a result the logs could not be used as evidence in prosecution.
The remainder of this document is structured around sixteen practices that could have
been effective in preventing the insider incidents analyzed for this report, or at the very
least, would have enabled early detection of the malicious activity.
Best Practices for the Prevention and Detection of
Insider Threats
Summary of practices
The following sixteen practices will provide an organization with defensive measures that
could prevent or facilitate early detection of many of the insider incidents other
organizations experienced in the hundreds of cases examined by CERT. Some of these
practices have been updated from the previous version of the Common Sense Guide
based on approximately 100 recent cases collected and examined since that version was
published. Other practices are new ones added in this version. Each practice listed below
is labeled as either Updated or New.
PRACTICE 1: Consider threats from insiders and business partners in enterprise-wide
risk assessments. (Updated).
It is difficult for organizations to balance trusting their employees, providing them access to achieve the organization's mission, and protecting its assets from potential compromise by those same employees. Insiders' access, combined with their knowledge of the organization's technical vulnerabilities and vulnerabilities introduced by gaps in business processes, gives them the ability and opportunity to carry out malicious activity against their employer if properly motivated. The problem is becoming even more difficult as the scope of insider threats expands due to organizations' growing reliance on business partners with whom they contract and collaborate. It is important for organizations to take an enterprise-wide view of information security, first determining their critical assets, then defining a risk management strategy for protecting those assets from both insiders and outsiders.
NEW PRACTICE
PRACTICE 2: Clearly document and consistently enforce policies and controls.
Clear documentation and communication of technical and organizational policies and controls could have mitigated some of the insider incidents (theft, modification, and IT sabotage) in the CERT case library. Specific policies are discussed in this section of the report. In addition, consistent policy enforcement is important. Some employees in the cases examined by CERT felt they were being treated differently than other employees, and retaliated against this perceived unfairness by attacking their employers' IT systems.
Other insiders were able to steal or modify information due to inconsistent or unenforced
policies.
PRACTICE 3: Institute periodic security awareness training for all employees.
(Updated)
A culture of security awareness must be instilled in the organization so that all employees
understand the need for policies, procedures, and technical controls. All employees in an
organization must be aware that security policies and procedures exist, that there is a
good reason why they exist, that they must be enforced, and that there can be serious
consequences for infractions. They also need to be aware that individuals, either inside or
outside the organization, may try to co-opt them into activities counter to the organization's mission. Each employee needs to understand the organization's security
policies and the process for reporting policy violations. This section of the guide has been
updated with important new findings relevant to recruitment of insiders by outsiders to
commit crimes.
PRACTICE 4: Monitor and respond to suspicious or disruptive behavior, beginning
with the hiring process. (Updated)
Organizations should closely monitor suspicious or disruptive behavior by employees
before they are hired, as well as in the workplace, including repeated policy violations
that may indicate or escalate into more serious criminal activity. The effect of personal
and professional stressors should also be considered. This section has been updated
based on findings in 100 recent cases, particularly due to the high degree of internal and
external collusion observed in these cases and the high incidence of previous arrests.
NEW PRACTICE
PRACTICE 5: Anticipate and manage negative workplace issues.
This section describes suggestions for organizations beginning with pre-employment
issues and continuing through employment and with termination issues. For example,
employers need to clearly formulate employment agreements and conditions of
employment. Responsibilities and constraints of the employee and consequences for
violations need to be clearly communicated and consistently enforced. In addition,
workplace disputes or inappropriate relationships between co-workers can serve to
undermine a healthy and productive working environment. Employees should feel
encouraged to discuss work-related issues with a member of management or human
resources without fear of reprisal or negative consequences. Managers need to address
these issues when discovered or reported, before they escalate out of control. Finally,
contentious employee terminations must be handled with utmost care, as most insider IT
sabotage attacks occur following termination.
NEW PRACTICE
PRACTICE 6: Track and secure the physical environment.
While employees and contractors obviously must have access to organization facilities
and equipment, most do not need access to all areas of the workplace. Controlling
physical access for each employee is fundamental to insider threat risk management.
Access attempts should be logged and regularly audited to identify violations or
attempted violations of the physical space and equipment access policies. Of course,
terminated employees, contractors, and trusted business partners should not have physical
access to non-public areas of the organization facilities. This section details lessons
learned from cases in the CERT case library in which physical access vulnerabilities
allowed an insider to attack.
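As a small illustration of the logging and auditing described above, the sketch below scans a hypothetical badge-reader export for two common red flags: badge use by people who are no longer active, and after-hours entry into restricted areas. The file name, column names, area labels, and hours are assumptions for the example, not prescriptions from this guide.

    import csv
    from datetime import datetime

    RESTRICTED_AREAS = {"server_room", "records_vault"}   # assumed area labels
    TERMINATED = {"E1043", "E2077"}                       # would come from HR in practice

    def audit_badge_log(path="badge_log.csv"):
        # Expected columns: employee_id, area, timestamp (ISO 8601).
        alerts = []
        with open(path, newline="") as f:
            for event in csv.DictReader(f):
                when = datetime.fromisoformat(event["timestamp"])
                if event["employee_id"] in TERMINATED:
                    alerts.append(("badge of terminated person used", event))
                elif event["area"] in RESTRICTED_AREAS and not 8 <= when.hour < 18:
                    alerts.append(("after-hours entry to restricted area", event))
        return alerts

    if __name__ == "__main__":
        for reason, event in audit_badge_log():
            print(reason, event)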
PRACTICE 7: Implement strict password and account management policies and
practices. (Updated)
No matter how vigilant an organization is in trying to prevent insider attacks, if its computer accounts can be compromised, insiders have an opportunity to circumvent both
manual and automated controls. Password and account management policies and
practices should apply to employees, contractors, and business partners. They should
ensure that all activity from any account is attributable to the person who performed it.
An anonymous reporting mechanism should be available and used by employees to report
attempts at unauthorized account access, including potential attempts at social
engineering. Audits should be performed regularly to identify and disable unnecessary or
expired accounts. This section has been updated to reflect new account issues identified
in 100 recent cases added to the CERT case library, many of them involving
unauthorized access by trusted business partners.
PRACTICE 8: Enforce separation of duties and least privilege. (Updated)
If all employees are adequately trained in security awareness, and responsibility for
critical functions is divided among employees, the possibility that one individual could
commit fraud or sabotage without the cooperation of another individual within the
organization is limited. Effective separation of duties requires the implementation of least
privilege; that is, authorizing insiders only for the resources they need to do their jobs,
particularly when they take on different positions or responsibilities within the
organization. This section has been updated to reflect findings from recent cases
involving collusion among multiple insiders.
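Parts of separation of duties and least privilege can be checked, and even enforced, in software. The sketch below is a simplified illustration whose roles, permission names, and transaction fields are invented for the example: it reports permissions granted to a user beyond what the assigned role requires, and it refuses any critical change whose approver is also its initiator.

    # Hypothetical role-to-permission mapping used for a least-privilege review.
    ROLE_PERMISSIONS = {
        "data_entry": {"read_customer", "update_customer"},
        "payroll_clerk": {"read_payroll", "submit_payroll_change"},
        "payroll_manager": {"read_payroll", "approve_payroll_change"},
    }

    def excess_privileges(role, granted):
        # Permissions the user holds beyond what the role requires.
        return set(granted) - ROLE_PERMISSIONS.get(role, set())

    def approve_change(change):
        # Two-person control: the approver must differ from the initiator.
        if change["approved_by"] == change["submitted_by"]:
            raise PermissionError("separation of duties violated: self-approval")
        return True

    if __name__ == "__main__":
        print(excess_privileges("data_entry", {"read_customer", "approve_payroll_change"}))
        print(approve_change({"submitted_by": "clerk1", "approved_by": "manager7"}))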
NEW PRACTICE
PRACTICE 9: Consider insider threats in the software development life cycle.
Many insider incidents can be tied either directly or indirectly to defects introduced
during the software development life cycle (SDLC). Some cases, such as those involving
malicious code inserted into source code, have an obvious tie to the SDLC. Others, like
those involving insiders who took advantage of inadequate separation of duties, have an
indirect tie. This section of the report details the types of oversights throughout the SDLC
that enabled insiders to carry out their attacks.
PRACTICE 10: Use extra caution with system administrators and technical or privileged
users. (Updated)
System administrators and privileged users like database administrators have the
technical ability and access to commit and conceal malicious activity. Technically adept
individuals are more likely to resort to technical means to exact revenge for perceived wrongs. Techniques like separation of duties or a two-man rule for critical system
administrator functions, non-repudiation of technical actions, encryption, and disabling
accounts upon termination can limit the damage and promote the detection of malicious
system administrator and privileged user actions. This section has been updated to
include recent findings regarding technical employees who stole information for business
advantage: to start their own business, to take with them to a new job, or to give to a foreign
government or organization.
PRACTICE 11: Implement system change controls. (Updated)
A wide variety of insider compromises relied on unauthorized modifications to the organizations' systems, which argues for stronger change controls as a mitigation
strategy. System administrators or privileged users can deploy backdoor accounts,
keystroke loggers, logic bombs, or other malicious programs on the system or network.
These types of attacks are stealthy and therefore difficult to detect ahead of time, but
technical controls can be implemented for early detection. Once baseline software and
hardware configurations are characterized, comparison of current configuration can
detect discrepancies and alert managers for action. This section has been updated to
reflect recent techniques used by insiders that could have been detected via change
controls.
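The baseline comparison described above can be as simple as recording cryptographic hashes of approved software and configuration files and recomputing them on a schedule. The sketch below illustrates the idea; the monitored paths and the baseline file name are assumptions for the example, and production environments would typically use an established file-integrity monitoring tool rather than a homegrown script.

    import hashlib, json, os

    def hash_file(path):
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def snapshot(paths):
        # Baseline: path -> hash for every monitored file that exists.
        return {p: hash_file(p) for p in paths if os.path.isfile(p)}

    def compare(baseline, paths):
        # Report files that changed, disappeared, or newly appeared since the baseline.
        current = snapshot(paths)
        changed = [p for p in baseline if p in current and current[p] != baseline[p]]
        missing = [p for p in baseline if p not in current]
        added = [p for p in current if p not in baseline]
        return changed, missing, added

    if __name__ == "__main__":
        monitored = ["/etc/crontab", "/usr/local/bin/backup.sh"]  # assumed paths
        if not os.path.exists("baseline.json"):
            with open("baseline.json", "w") as f:
                json.dump(snapshot(monitored), f)
        else:
            with open("baseline.json") as f:
                print(compare(json.load(f), monitored))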
PRACTICE 12: Log, monitor, and audit employee online actions. (Updated)
If account and password policies and procedures are enforced, an organization can
associate online actions with the employee who performed them. Logging, periodic
monitoring, and auditing provide an organization the opportunity to discover and
investigate suspicious insider actions before more serious consequences ensue. In
addition to unauthorized changes to the system, download of confidential or sensitive
information such as intellectual property, customer or client information, and personally
identifiable information can be detected via data leakage tools. New findings detailed in
this section can assist organizations in refining their data leakage prevention strategy, for
example, in the weeks surrounding employee termination.
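As a concrete illustration of tightening data leakage monitoring around termination, the sketch below totals each departing employee's daily downloads and flags days that exceed a threshold within a window around the announced departure date. The log format, field names, threshold, and window are assumptions made for the example rather than values taken from this guide.

    import csv
    from datetime import datetime, timedelta

    THRESHOLD_MB = 200            # assumed per-day volume worth reviewing
    WINDOW = timedelta(days=30)   # scrutinize the weeks around departure

    def flag_departure_downloads(log_path="downloads.csv", departures=None):
        # departures: username -> announced departure date (datetime).
        # Log columns assumed: username, date (ISO 8601), megabytes.
        departures = departures or {}
        totals = {}
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):
                user = row["username"]
                day = datetime.fromisoformat(row["date"])
                if user in departures and abs(departures[user] - day) <= WINDOW:
                    key = (user, day.date())
                    totals[key] = totals.get(key, 0.0) + float(row["megabytes"])
        return {key: mb for key, mb in totals.items() if mb > THRESHOLD_MB}

    if __name__ == "__main__":
        print(flag_departure_downloads(departures={"jdoe": datetime(2009, 1, 15)}))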
PRACTICE 13: Use layered defense against remote attacks. (Updated)
If employees are trained and vigilant, accounts are protected from compromise, and
employees know that their actions are being logged and monitored, then disgruntled
insiders will think twice about attacking systems or networks at work. Insiders tend to
feel more confident and less inhibited when they have little fear of scrutiny by coworkers;
therefore, remote access policies and procedures must be designed and implemented very
carefully. When remote access to critical systems is deemed necessary, organizations
should consider offsetting the added risk by requiring connections only via
organization-owned machines, along with closer logging and more frequent auditing of remote
transactions. Disabling remote access and collecting organization equipment are
particularly important for terminated employees. This section has been updated to include
new remote attack methods employed by insiders in recent cases.
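For illustration, a minimal sketch of admitting remote sessions only from organization-owned machines, with every attempt logged for later audit, might look like the following; the device inventory and log destination are assumptions.

```python
# A minimal sketch of a remote-access check that admits connections only from
# organization-owned machines and logs every attempt for later audit. The
# device inventory and log file name are illustrative assumptions.
import logging

logging.basicConfig(filename="remote_access_audit.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

MANAGED_DEVICES = {"LAPTOP-0042", "LAPTOP-0117"}  # hypothetical asset inventory

def admit_remote_session(user: str, device_id: str) -> bool:
    """Permit the session only for enrolled devices; record every decision."""
    allowed = device_id in MANAGED_DEVICES
    logging.info("remote login user=%s device=%s allowed=%s", user, device_id, allowed)
    return allowed

# admit_remote_session("jdoe", "LAPTOP-0042")  -> True, logged
# admit_remote_session("jdoe", "HOME-PC")      -> False, logged for audit
```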
PRACTICE 14: Deactivate computer access following termination. (Updated)
When an employee terminates employment, whether the circumstances were favorable or
not, it is important that the organization have in place a rigorous termination procedure
that disables all of the employee's access points to the organization's physical locations,
networks, systems, applications, and data. Fast action to disable all access points
available to a terminated employee requires ongoing and strict tracking and management
practices for all employee avenues of access including computer system accounts, shared
passwords, and card control systems.
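For illustration, a minimal sketch of a checklist-driven termination procedure might look like the following; the access-point names and disable steps are placeholders.

```python
# A minimal sketch of a termination checklist runner that disables every
# tracked access path and returns a timestamped record for the personnel
# file. The access-point names are placeholders; a real procedure would call
# the directory, VPN, and badge-system interfaces at each step.
from datetime import datetime, timezone

ACCESS_POINTS = ["network_account", "vpn_access", "shared_passwords", "badge_access"]

def offboard(employee_id: str) -> dict:
    """Disable each tracked access point and record when it was done."""
    record = {"employee": employee_id}
    for access_point in ACCESS_POINTS:
        # placeholder for the real call, e.g. a directory or badge-system API
        record[access_point] = "disabled at " + datetime.now(timezone.utc).isoformat()
    return record

# offboard("E12345") -> {'employee': 'E12345', 'network_account': 'disabled at ...', ...}
```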
PRACTICE 15: Implement secure backup and recovery processes. (Updated)
No organization can completely eliminate its risk of insider attack; risk is inherent in the
operation of any profitable enterprise. However, with a goal of organizational resiliency,
risks must be acceptable to the stakeholders, and as such, impacts of potential insider
attacks must be minimized. Therefore, it is important for organizations to prepare for the
possibility of insider attack and minimize response time by implementing secure backup
and recovery processes that avoid single points of failure and are tested periodically. This
section contains descriptions of recent insider threat cases in which the organization's
lack of attention to incident response and organizational resiliency resulted in serious
disruption of service to their customers.
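For illustration, a minimal sketch of a periodic backup verification step might look like the following; the file paths are assumptions, and a real program would also rehearse full restores.

```python
# A minimal sketch of periodic backup verification: confirm that each backup
# copy still exists and that its checksum matches the value recorded when the
# backup was taken, so a single missing or altered copy is caught early.
# The file paths are illustrative.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hash of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_backup(copies: list, expected_sha256: str) -> list:
    """Return the backup copies that are missing or whose checksum has drifted."""
    problems = []
    for copy in copies:
        path = Path(copy)
        if not path.exists() or sha256_of(path) != expected_sha256:
            problems.append(copy)
    return problems

# verify_backup(["/backups/onsite/db.dump", "/backups/offsite/db.dump"], recorded_hash)
```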
NEW PRACTICE
PRACTICE 16: Develop an insider incident response plan.
Organizations need to develop an insider incident response plan to control the damage
due to malicious insiders. This is challenging because the same people assigned to a
response team may be among the most likely to think about using their technical skills
against the organization. Only those responsible for carrying out the plan need to
understand and be trained on its execution. Should an insider attack, it is important that
the organization have evidence in hand to identify the insider and follow up
appropriately. Lessons learned should be used to continually improve the plan.
Practice 1: Consider threats from insiders and business
partners in enterprise-wide risk assessments. (UPDATED)
Organizations need to develop a comprehensive risk-based security strategy to
protect critical assets against threats from inside and outside, as well as trusted
business partners who are given authorized insider access.
What to do?
It is not practical for most organizations to implement 100% protection against every
threat to every organizational resource. Therefore, it is important to adequately protect
critical information and other resources and not direct significant effort toward protecting
relatively unimportant data and resources. A realistic and achievable security goal is to
protect those assets deemed critical to the organization's mission from both external and
internal threats. Unfortunately, organizations often fail to recognize the increased risk
posed when they provide insider access to their networks, systems, or information to
other organizations and individuals with whom they collaborate, partner, contract, or
otherwise associate. The boundary of the organization's enterprise needs to be drawn
broadly enough to include as insiders all people who have a privileged understanding of
and access to the organization, its information, and information systems.
Risk is the combination of threat, vulnerability, and mission impact. Enterprise-wide risk
assessments help organizations identify critical assets, potential threats to those assets,
and mission impact if the assets are compromised. Organizations should use the results of
the assessment to develop or refine the overall strategy for securing their networked
systems, striking the proper balance between countering the threat and accomplishing the
organizational mission.[10]
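As a worked illustration only (this guide does not prescribe a particular formula), a simple multiplicative scoring of threat, vulnerability, and impact can be used to rank assets; the asset names and ratings below are hypothetical.

```python
# A minimal sketch of enterprise risk scoring, assuming a simple
# multiplicative model (risk = threat x vulnerability x impact, each rated
# 1-5). The asset names and ratings below are illustrative only.
ASSETS = {
    "customer_database": {"threat": 4, "vulnerability": 3, "impact": 5},
    "public_web_site":   {"threat": 3, "vulnerability": 2, "impact": 2},
}

def risk_score(ratings: dict) -> int:
    """Combine threat, vulnerability, and mission impact into one score."""
    return ratings["threat"] * ratings["vulnerability"] * ratings["impact"]

# Rank assets so protection effort goes to the highest-risk assets first.
ranked = sorted(ASSETS, key=lambda name: risk_score(ASSETS[name]), reverse=True)
print(ranked)  # ['customer_database', 'public_web_site']
```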
The threat environment under which the system operates needs to be understood in order
to accurately assess enterprise risk. Characterizing the threat environment can proceed in
parallel with the evaluation of vulnerability and impact. However, the sooner the threat
environment can be characterized the better. The purpose of this guide is to assist
organizations in correctly assessing the insider threat environment, organizational
vulnerabilities that enable the threat, and potential impacts that could result from insider
incidents, including financial, operational, and reputational.
Unfortunately, many organizations focus on protecting information from access or
sabotage by those external to the organization and overlook insiders. Moreover, an
information technology and security solution designed without consciously
acknowledging and accounting for potential insider threats often leaves the role of
protection in the hands of some of the potential threats: the insiders themselves. It is
imperative that organizations recognize the potential danger posed by the knowledge and
access of their employees, contractors, and business partners, and specifically address
that threat as part of an enterprise risk assessment.
[10] See http://www.cert.org/nav/index_green.html for CERT research in Enterprise Security Management.
Understanding the vulnerability of an organization to a threat is also important, but
organizations often focus too much on low-level technical vulnerabilities, for example,
by relying on automated computer and network vulnerability scanners. While such
techniques are important, our studies of insider threat have indicated that vulnerabilities
in an organization's business processes are at least as important as technical
vulnerabilities. Organizations need to manage the impact of threats rather than chase
individual technical vulnerabilities. In addition, new areas of concern have become
apparent in recent cases, including legal and contracting issues, as detailed in the Recent
Findings section below.
Insider threats impact the integrity, availability, or confidentiality of information critical
to an organization's mission. Insiders have affected the integrity of their organizations'
information in various ways, for example by manipulating customer financial information
or defacing their employers' web sites. They have also violated the confidentiality of
information by stealing trade secrets or customer information. Still others have
inappropriately disseminated confidential information, including private customer
information as well as sensitive email messages between the organization's management.
Finally, insiders have affected the availability of their organizations' information by
deleting data, sabotaging entire systems and networks, destroying backups, and
committing other types of denial-of-service attacks.
In the types of insider incidents mentioned above, current or former employees,
contractors, or business partners were able to compromise their organizations' critical
assets. It is important that protection strategies be designed with a focus on those assets:
financial data, confidential or proprietary information, and other mission critical systems
and data.
Case Studies: What could happen if I don't do it?
One organization failed to protect extremely critical systems and data from internal
employees. It was responsible for running the 911 phone-number-to-address lookup
system for emergency services. An insider deleted the entire database and software from
three servers in the organization's network operations center (NOC) by gaining physical
access using a contractors badge. The NOC, which was left unattended, was solely
protected via physical security; all machines in the room were left logged in with system
administrator access.
Although the NOC system administrators were immediately notified of the system failure
via an automatic paging system, there were no automated failover mechanisms. The
organization's recovery plan relied solely on backup tapes, which were also stored in the
NOC. Unfortunately, the insider, realizing that the systems could be easily recovered,
took all of the backup tapes with him when he left the facility. In addition, the same
contractors badge was authorized for access to the offsite backup storage facility, from
which he next stole over fifty backup tapes.
Had an enterprise risk assessment been performed for this system prior to the incident,
the organization would have recognized the criticality of the systems, assessed the threats
and vulnerabilities, and developed a risk mitigation strategy accordingly.
Another insider was the sole system administrator for his organization. One day, he quit
with no prior notice. His organization refused to pay him for his last two days of work,
and he subsequently refused to give them the passwords for the administrator accounts
for its systems. Over a period of three days, the insider modified the systems so that they
could not be accessed by the employees, defaced the company web site, and deleted files.
It is critical that organizations consider the risk they assume when they place all system
administration power into the hands of a single employee.
Recent Findings:
Organizations are increasingly outsourcing critical business functions. As a result, people
external to the organization sometimes have full access to the organization's policies,
processes, information, and systems: access and knowledge previously provided only to
employees of the organization. CERT's definition of insider, which originally
encompassed current and former employees and contractors, had to be extended to
include partners, collaborators, and even students associated with the organization.
One recent case involved an employee of a company that obtained a contract to set up a
new wireless network for a major manufacturer. The insider was on the installation team
and therefore had detailed knowledge of the manufacturers systems. He was removed
from the team by his employer, apparently under negative circumstances. However, he
was able to enter the manufacturing plant and access a computer kiosk in the visitors'
lobby. Based on his familiarity with the manufacturer's computer system and security, he
was able to use the kiosk to delete files and passwords from wireless devices used by the
manufacturer across the country. The manufacturer was forced to remove and repair the
devices, causing a wide-scale shutdown of facilities and disruption of its processes.
This case highlights several new insider threat issues. First of all, an enterprise-wide risk
assessment should have identified the ability to override security and obtain privileged
access to the manufacturer's network from a publicly accessible kiosk. Second, the
manufacturer's contract with the insider's organization should have instituted strict
controls over employees added to or removed from the project. Specifically,
organizations should consider provisions in their contracts that require advance
notification by the contracted organization of any negative employment actions being
planned against any employees who have physical and/or electronic access to the
contracting organization's systems. The contracting organization could require a
specified amount of time before the action occurs, in order to perform its own risk
assessment for the potential threat posed to its own network, systems, or information.
Another recent incident indicates the need to have transaction verification built into
supplier agreements. A computer help desk attendant employed by a military contractor
created fake military email addresses on the military systems for which he was
responsible. He then used those email addresses to request replacement parts for military
equipment recalled by a major supplier. The supplier sent the replacement parts to the
address specified in the emails, with the expectation that the original recalled products
would be returned after the replacements had been received. The insider provided his
home address for the shipments, and never intended to return the original equipment. The
insider received almost 100 shipments with a retail value of almost five million dollars
and sold the equipment on eBay.
Another case reflects the complexity of defining the organizational perimeter and the
scope of insider threats. The outside legal counsel for a high tech company was preparing
to represent the company in civil litigation. The outside counsel was provided with
documents containing company trade secrets, which were necessary to prepare the legal
case. The legal firm had a contract with a document-imaging company for copying
documents for its cases. An employee of the document-imaging company brought in his
nephew to help him copy the trade secret documents due to the amount of work required.
The nephew, a university student not officially on payroll, scanned the confidential
documents using his uncle's work computer, then sent them to a hacker web site for
posting. His goal was to help the hacker community crack the high tech company's
premier product. Organizations need to carefully consider their enterprise information
boundaries when assessing the risk of insider compromise, and use legal means for
protecting their information once it leaves their control.
Practice 2: Clearly document and consistently enforce policies
and controls. (NEW)
A consistent, clear message on organizational policies and controls will help
reduce the chance that employees will inadvertently commit a crime or lash out
at the organization for a perceived injustice.
What to do?
Policies or controls that are misunderstood, not communicated, or inconsistently enforced
can breed resentment among employees and can potentially result in harmful insider
actions. For example, multiple insiders in cases in the CERT database took intellectual
property they had created to a new job, not realizing that they did not own it. They were
quite surprised when they were arrested for a crime they did not realize they had
committed.
Organizations should ensure the following with regard to their policies and controls:
• concise and coherent documentation, including reasoning behind the policy, where applicable
• fairness for all employees
• consistent enforcement
• periodic employee training on the policies, justification, implementation, and enforcement
Organizations should be particularly clear on policies regarding
• acceptable use of the organization's systems, information, and resources
• ownership of information created as a paid employee or contractor
• evaluation of employee performance, including requirements for promotion and financial bonuses
• processes and procedures for addressing employee grievances
As individuals join the organization, they should receive a copy of the organizational policies
that clearly lays out what is expected of them, together with the consequences of
violations. Evidence that each individual has read and agreed to the organization's
policies should be maintained.
Employee disgruntlement was a recurring factor in insider compromises, particularly in
the insider IT sabotage cases. The disgruntlement was caused by some unmet expectation
by the insider. Examples of unmet expectations observed in cases include
• insufficient salary increase or bonus
• limitations on use of company resources
• diminished authority or responsibilities
• perception of unfair work requirements
• poor coworker relations
Clear documentation of policies and controls can help prevent employee
misunderstandings that can lead to unmet expectations. Consistent enforcement can
ensure that employees don't feel they are being treated differently from or worse than
other employees. In one case, employees had become accustomed to lax policy
enforcement over a long period of time. New management dictated immediate strict
policy enforcement, which caused one employee to become embittered and strike out
against the organization. In other words, policies should be enforced consistently across
all employees, as well as consistently enforced over time.
Of course, organizations are not static entities; change in organizational policies and
controls is inevitable. Employee constraints, privileges, and responsibilities change as
well. Organizations need to recognize times of change as particularly stressful times for
employees, recognize the increased risk that comes along with these stress points, and
mitigate it with clear communication regarding what employees can expect in the future.
Case Studies: What could happen if I don't do it?
An insider accepted a promotion, leaving a system administrator position in one
department for a position as a systems analyst in another department of the same
organization. In his new position, he was responsible for information sharing and
collaboration between his old department and the new one. The following events ensued:
• The original department terminated his system administrator account and issued him an ordinary user account to support the access required in his new position.
• Shortly thereafter, the system security manager at the original department noticed that the former employee's new account had been granted unauthorized system administration rights.
• The security manager reset the account back to ordinary access rights, but a day later found that administrative rights had been granted to it once again.
• The security manager closed the account, but over the next few weeks other accounts exhibited unauthorized access and usage patterns.
An investigation of these events led to charges against the analyst for misuse of the
organization's computing systems. These charges were eventually dropped, in part
because there was no clear policy regarding account sharing or exploitation of
vulnerabilities to elevate account privileges. This case illustrates the importance of
clearly established policies that are consistent across departments, groups, and
subsidiaries of the organization.
There are many cases in the CERT library where an employee compromised an
organization's information or system in order to address some perceived injustice:
• An insider planted a logic bomb in an organization's system because he felt that he was required to follow stricter work standards than his fellow employees.
• In reaction to a lower bonus than expected, an insider planted a logic bomb that would, he expected, cause the organization's stock value to go down, thus causing stock options he owned to increase in value.
• A network administrator who designed and controlled an organization's manufacturing support systems detonated a logic bomb to destroy his creation because of his perceived loss of status and control.
• A quality control inspector, who believed his employer insufficiently addressed the quality requirements of its product, supplied company confidential information to the media to force the company to deal with the problem.
• An insider, who was upset about his company's practice of cancelling insurance policies for policy holders who paid late, provided sensitive company information to the opposing lawyers engaged in a lawsuit against the company.
What these insiders did is wrong and against the law. Nevertheless, more clearly defined
policies and grievance procedures for perceived policy violations might have avoided the
serious insider attacks experienced by those organizations.
Practice 3: Institute periodic security awareness training for all
employees. (UPDATED)
Without broad understanding and buy-in from the organization, technical or
managerial controls will be short lived.
What to do?
All employees need to understand that insider crimes do occur and that the consequences
are severe. In addition, it is important for them to understand that malicious insiders
can be highly technical people or those with minimal technical ability. Ages of
perpetrators range from late teens to retirement. Both men and women have been
malicious insiders, including introverted loners, aggressive "get it done" people, and
extroverted star players. Positions have included low-wage data entry clerks, cashiers,
programmers, artists, system and network administrators, salespersons, managers, and
executives. They have been new hires, long-term employees, currently employed,
recently terminated, contractors, temporary employees, and employees of trusted business
partners.
Security awareness training should encourage identification of malicious insiders by
behavior, not by stereotypical characteristics. Behaviors of concern include
• threats against the organization or bragging about the damage one could do to the organization
• association with known criminals or suspicious people outside of the workplace
• large downloads close to resignation
• use of organization resources for a side business, or discussions regarding starting a competing business with coworkers
• attempts to gain employees' passwords or to obtain access through trickery or exploitation of a trusted relationship (often called social engineering)
Managers and employees need to be trained to recognize social networking in which an
insider engages other employees to join their schemes, particularly to steal or modify
information for financial gain. Warning employees of this possibility and its
consequences may help keep them alert to such manipulation and encourage them to
report it to management.
Social engineering is often associated with attempts either to gain physical access or
electronic access via accounts and passwords. Some of the CERT cases reveal social
engineering of a different type, however. In one recent case, a disgruntled employee
placed a hardware keystroke logger on a computer at work to capture confidential
company information. After being fired unexpectedly, the now former employee tried to
co-opt a non-technical employee still at the company to recover the device for him.
Although the employee had no idea the device was a keystroke logger, she was smart
enough to recognize the risk of providing it to him and notified management instead.
Forensics revealed that he had removed the device and transferred the keystrokes file to
his computer at work at least once before being fired.
Training programs should create a culture of security appropriate for the organization and
include all personnel. For effectiveness and longevity, the measures used to secure an
organization against insider threat need to be tied to the organization's mission, values,
and critical assets, as determined by an enterprise-wide risk assessment. For example, if
an organization places a high value on customer service quality, it may view customer
information as its most critical asset and focus security on protection of that data. The
organization could train its members to be vigilant against malicious employee actions,
focusing on a number of key issues, including
• detecting and reporting disruptive behavior by employees (see Practice 4)
• monitoring adherence to organizational policies and controls (see Practices 2 and 11)
• monitoring and controlling changes to organizational systems (e.g., to prevent the installation of malicious code) (see Practices 9 and 11)
• requiring separation of duties between employees who modify customer accounts and those who approve modifications or issue payments (see Practice 8)
• detecting and reporting violations of the security of the organization's facilities and physical assets (see Practice 6)
• planning proactively for potential incident response (see Practice 16)
Training on reducing risks to customer service processes would focus on
• protecting computer accounts used in these processes (see Practice 7)
• auditing access to customer records (see Practice 12)
• ensuring consistent enforcement of defined security policies and controls (see Practice 2)
• implementing proper system administration safeguards for critical servers (see Practices 10, 11, 12, and 13)
• using secure backup and recovery methods to ensure availability of customer service data (see Practice 15)
Training content should be based on documented policy, including a confidential means
of reporting security issues. Confidential reporting allows reporting of suspicious events
without fear of repercussions, thereby overcoming the cultural barrier of whistle blowing.
Employees need to understand that the organization has policies and procedures, and that
managers will respond to security issues in a fair and prompt manner.
Employees should be notified that system activity is monitored, especially system
administration and privileged activity. All employees should be trained in their personal
responsibility, such as protection of their own passwords and work products. Finally, the
training should communicate IT acceptable use policies.
Case Studies: What could happen if I don't do it?
The lead developer of a critical production application had extensive control over the
application source code. The only copy of the source code was on his company-provided
laptop; there were no backups performed, and very little documentation existed, even
though management had repeatedly requested it. The insider told coworkers he had no
intention of documenting the source code and any documentation he did write would be
obscure. He also stated that he thought poorly of his managers because they had not
instructed him to make backup copies of the source code.
A month after learning of a pending demotion, he erased the hard drive of his laptop,
deleting the only copy of the source code the organization possessed, and quit his job. It
took more than two months to recover the source code after it was located by law
enforcement in encrypted form at the insider's home. Another four months elapsed before
the insider provided the password to decrypt the source code. During this time the
organization had to rely on the executable version of the application, with no ability to
make any modifications. If the insider's team members had been informed that the
security and survivability of the system were their responsibility, and if they had been
presented with a clear procedure for reporting concerning behavior, they might have
notified management of the insiders statements and actions in time to prevent the attack.
Another insider case involved a less technically sophisticated attack, but one that could
have been avoided or successfully prosecuted if proper policies and training had been in
place. Four executives left their firm to form a competing company. A few days before
they left, one of them ordered a backup copy of the hard drive on his work computer,
which contained customer lists and other sensitive information, from the external
company that backed up the data. The company also alleged that its consulting services
agreement and price list were sent by email from the insider's work computer to an
external email account registered under his name. The insiders, two of whom had signed
confidentiality agreements with the original employer, disagreed that the information
they took was proprietary, saying that it had been published previously. Clear policies
regarding definition of proprietary information and rules of use could have prevented the
attack or provided a clearer avenue for prosecution.
Recent Findings
A striking finding in recent cases is that in over two thirds of the 31 cases of theft for
financial gain, the insider was recruited to steal by someone outside the organization. In
many of these cases, the insider was taking most of the risk while receiving relatively
small financial compensation. The outsider was often a relative of the insider or an
acquaintance who realized the value of exploiting the insider's access to information.
One manager of a hospital's billing records gave patients' credit card information to her
brother, who used it for online purchases shipped to his home address. Another insider in
the human resources department of a federal government organization gave employee
personally identifiable information (PII) to her boyfriend, who used it to open and make
purchases on fraudulent credit card accounts. As in CERT's previous research, outsiders
(e.g., car salesmen) continue to convince insiders to improve the credit histories of
individuals trying to obtain loans.
Organizations should educate employees on their responsibilities for protecting the
information with which they are entrusted and the possibility that unscrupulous
individuals could try to take advantage of their access to that information. Such
individuals may be inside or outside the organization. In almost half of the cases of
modification of information for financial gain, the insider recruited at least one other
employee in the company to participate in the scheme, possibly as a means to bypass
separation of duty restrictions, or to ensure that coworkers wouldn't report suspicious
behavior. In one recent case, several bank janitorial employees stole customer
information while working, changed the customers' addresses online, opened credit cards
in the customers' names, purchased expensive items using the cards, and drained the
customers' bank accounts. Employees should be regularly reminded about the procedures
the company has in place for anonymously reporting suspicious coworker behavior or
recruitment attempts by individuals inside or outside the organization.
Employees need to be educated about the confidentiality and integrity of the company's
information, and understand that compromises will be dealt with harshly. Insiders sometimes did not
understand this, viewing information as their own property rather than the
company's; for example, customer information developed by a salesperson or software
developed by a programmer.
There are also recent cases in which technical employees sold their organization's
intellectual property because of dissatisfaction with their pay, and others in which employees gave the
information to reporters and lawyers out of dissatisfaction with the organization's
practices. Signs of disgruntlement in cases like these often appear well before the actual
compromise. Such attacks can be prevented if managers and coworkers are educated to
recognize and report behavioral precursors indicating potential attacks.
Practice 4: Monitor and respond to suspicious or disruptive
behavior, beginning with the hiring process. (UPDATED)
One method of reducing the threat of malicious insiders is to proactively deal with
suspicious or disruptive employees.
What to do?
An organization's approach to reducing the insider threat should start in the hiring
process by performing background ch…
Introduction:

Insider threat is a growing concern for organizations worldwide. Employees can unwittingly expose sensitive information or, in worse cases, intentionally harm a company. The core problem is that these individuals already have authorized access to confidential information, which makes it challenging to identify breaches before they occur. This has created the need for adequate security measures to control employee behavior and prevent misuse of information.

Description:

The Common Sense Guide to Prevention and Detection of Insider Threats is a document developed by the CERT program at Carnegie Mellon University. It outlines best practices for preventing insider threats, with an emphasis on controlling employee behavior and identifying potential red flags. In this assignment, students are required to review the document and select one of the 16 best practices provided. The paper should include a summary of the chosen best practice and its relevance to an HR department. It must also conclude with a recommendation on how to implement the approach in their organization. This assignment aims to provide an understanding of the security practices necessary to prevent insider threats, a growing concern for all businesses.

Learning Objective: Identify the best practices for the prevention and detection of insider threats.

Headings:
– Introduction to the problem of insider threat
– Best practices for the prevention and detection of insider threats

Learning Outcome: After completing this assignment, students will be able to select and summarize a best practice for preventing and detecting insider threats and make a recommendation for implementing it in their organization.

Objectives:
– Explain the meaning of “insider threat” and why it is a problem.
– Analyze and evaluate the 16 best practices listed in CERT’s Common Sense Guide to Prevention and Detection of Insider Threats.
– Select one best practice and explain it to an HR person in the organization.
– Conclude with a recommendation on how to implement the best practice in the organization.

Solution 1:

Best Practice for Prevention and Detection of Insider Threats – Practice 2

The problem of insider threats is a significant security concern for any organization as it involves the risk of data breaches, financial loss, and reputational damage. Practice 2 from the CERT Common Sense Guide to Prevention and Detection of Insider Threats emphasizes the importance of clearly documenting and consistently enforcing policies and controls to mitigate the risk of insider threats.

As a Human Resources professional in the organization, you need to understand how to implement this best practice to prevent and detect insider threats. First, all organizational policies and controls should be documented, and employees must be trained on these policies during onboarding, with periodic refresher training thereafter. Second, the HR department should conduct regular audits to ensure compliance and detect any potential policy violations. Finally, any violation of the policies should be reported immediately and handled accordingly.

Therefore, implementing Practice 2 can significantly reduce the risk of insider threats in the organization.

Solution 2:

Best Practice for Prevention and Detection of Insider Threats – Practice 7

Practice 7 from the CERT Common Sense Guide to Prevention and Detection of Insider Threats focuses on password and account management policies and practices. This best practice emphasizes enforcing strict password policies and disciplined account management to mitigate the risk that insiders pose to an organization's systems.

As a Human Resources professional in the organization, you should implement Practice 7 to enhance the security of the organization's technical environment. Start by enforcing strong password policies, such as minimum length and complexity requirements combined with multifactor authentication, so that only authorized personnel can access the organization's systems. Comprehensive account management policies and practices, such as regularly reviewing user access privileges, disabling unused accounts, and monitoring account activity, must also be in place, as illustrated in the sketch that follows.
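For illustration, the unused-account review could start from something as simple as the following sketch; the account records and the 90-day threshold are assumptions.

```python
# A minimal sketch of one account-management check from Practice 7: flag
# accounts with no login activity in the last 90 days so they can be reviewed
# and disabled. The account records and threshold are illustrative.
from datetime import date, timedelta

ACCOUNTS = [
    {"user": "jsmith", "last_login": date(2008, 9, 1)},
    {"user": "svc_reports", "last_login": date(2009, 1, 10)},
]

def stale_accounts(today: date, max_idle_days: int = 90) -> list:
    """Return users whose last login is older than max_idle_days."""
    cutoff = today - timedelta(days=max_idle_days)
    return [account["user"] for account in ACCOUNTS if account["last_login"] < cutoff]

# stale_accounts(date(2009, 1, 15)) -> ['jsmith']; review, then disable.
```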

In conclusion, implementing Practice 7 can significantly strengthen the organization’s security posture against insider threats.
