Defending cyberspace is much like defending any other domain in that it involves the military. What is the military’s role in defending computer networks and carrying out offensive attacks? What issues pertain to this role? Should the military be the only entity in the US government carrying out this role? Why or why not? What can other entities add to this role?
cyber_intelligence_tradecraft_project___summary_of_key_findings.pdf

wars_of_disruption_and_resilience_cybered_conflict…_______chapter_three._challenges_in_a_new_strategy_for_cybered_threats_.pdf


cyber_strategy_summary_final.pdf

the_dod_cyber_strategy.pdf

national_cyber_strategy.pdf

Unformatted Attachment Preview

January 2013
SEI Innovation Center Report:
Cyber Intelligence Tradecraft
Project
Summary of Key Findings
Authors
Troy Townsend
Melissa Ludwick
Jay McAllister
Andrew O. Mellinger
Kate Ambrose Sereno
Copyright 2013
Carnegie Mellon University
This material is based upon work funded and supported by Office of the Director of
National Intelligence under Contract No. FA8721-05-C-0003 with Carnegie Mellon
University for the operation of the Software Engineering Institute, a federally funded
research and development center sponsored by the United States Department of
Defense.
NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE
ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS.
CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER
EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED
TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY,
OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON
UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO
FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.
This material has been approved for public release and unlimited distribution except
as restricted below.
Internal use:* Permission to reproduce this material and to prepare derivative
works from this material for internal use is granted, provided the copyright and “No
Warranty” statements are included with all reproductions and derivative works.
External use:* This material may be reproduced in its entirety, without modification,
and freely distributed in written or electronic form without requesting formal
permission. Permission is required for any other external and/or commercial use.
Requests for permission should be directed to the Software Engineering Institute at
permission@sei.cmu.edu.
* These restrictions do not apply to U.S. government entities.
Carnegie Mellon® is registered in the U.S. Patent and Trademark Office by Carnegie
Mellon University.
DM-0000194
Executive Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Participants. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Cyber Intelligence Definition and Analytic Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Baseline and Benchmarking Approach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Key Findings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
State of the Practice in Cyber Intelligence. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Challenge: Applying a strategic lens to cyber intelligence analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Challenge: Information sharing isn’t bad; it’s broken. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Best Practice #1: Aligning functional and strategic cyber intelligence resources. . . . . . . . . . . . . . . 6
Best Practice #2: Information sharing in the financial sector. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Environment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Challenge: Understanding threats to the software supply chain. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Challenge: Determining where cyber intelligence belongs organizationally. . . . . . . . . . . . . . . . . . . . 8
Best Practice #1: Scoping the cyber environment to the organization’s mission. . . . . . . . . . . . . . . . 8
Best Practice #2: Modeling threats to shape resource allocation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Data Gathering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Challenge: Data hoarding. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Challenge: Lack of standards for open source intelligence data taxes resources. . . . . . . . . . . . . . 10
Best Practice #1: Repurposing search engine referral data. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Best Practice #2: Mind the gaps. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Functional Analysis. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Challenge: Adopting a common cyber lexicon and tradecraft. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Challenge: Filtering critical cyber threats out of an abundance of data. . . . . . . . . . . . . . . . . . . . . . . 12
Best Practice #1: Comprehensive workflow to identify cyber threats and inform customers. . . . 12
Best Practice #2: Producing scripts to automate the filtration of known threat data. . . . . . . . . . . . 12
Strategic Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Challenge: No industry standard for cyber intelligence education and training . . . . . . . . . . . . . . . 13
Challenge: Adapting traditional intelligence methodologies to the cyber landscape. . . . . . . . . . . 14
Best Practice #1: Know your enemy. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Best Practice #2: Global situational awareness. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Stakeholder Reporting and Feedback. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Challenge: Communicating “cyber” to leadership . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Challenge: Difficulty capturing return on investment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Best Practice #1: Failure analysis. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Best Practice #2: Carving channels for communication. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Conclusion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
SEI Innovation Center Report: Cyber Intelligence Tradecraft Project
Summary of Key Findings
Executive Summary
The Software Engineering Institute (SEI) Innovation Center¹
at Carnegie Mellon University is studying the state of cyber
intelligence across government, industry, and academia.
This study, known as the Cyber Intelligence Tradecraft Project
(CITP), seeks to advance the capabilities of organizations
performing cyber intelligence by elaborating on best practices
and prototyping solutions to shared challenges. Starting in
June 2012, six government agencies and 20 organizations from
industry and academia provided information on their cyber
intelligence methodologies, technologies, processes, and
training. This baseline data was then benchmarked against
a cyber intelligence analytic framework consisting of five
functions: environment, data gathering, functional analysis,
strategic analysis, and stakeholder reporting and feedback.
The aggregated results of the benchmarking led to the key
findings presented in this report.
Overall, the key findings indicate that organizations use a
diverse array of approaches to perform cyber intelligence.
They do not adhere to any universal standard for establishing
and running a cyber intelligence program, gathering data,
or training analysts to interpret the data and communicate
findings and performance measures to leadership. Instead,
pockets of excellence exist where organizations succeed at
cyber intelligence by effectively balancing the need to protect
network perimeters with the need to look beyond them for
strategic insights. Organizations also continuously improve
data gathering and analysis capabilities with threat prioritization
models, information sharing, and conveying return on
investment to decision makers. This report captures the best
practices from successful cyber intelligence programs and
tailors them to address challenges organizations currently face.
¹ To learn more about the SEI Innovation Center, visit:
www.sei.cmu.edu/about/organization/innovationcenter
Introduction

Cyber intelligence grew from the halls of government into a burgeoning business providing tools and services to industry and academia. As more organizations focus on this topic, varying methodologies, technologies, processes, and training complicate the operating environment. Recognizing a need to understand and improve this situation, the SEI Innovation Center began to study the state of the practice in cyber intelligence in June 2012. This report discusses the CITP’s process and key findings.

Participants

The CITP involved 26 organizations from government, industry, and academia. They included six government agencies with dedicated cyber intelligence missions and 20 entities representing multiple economic sectors, such as academia, defense contracting, energy, financial services, healthcare, information technology, intelligence service providers, legal, and retail. These organizations range in size from one employee to global organizations with hundreds of thousands of network users. Their cyber intelligence workforces have diverse backgrounds in intelligence, information security, and the military, and hold a multitude of titles, such as chief technology officer, chief information security officer, vice president of threat management, information architect, intelligence analyst, and network analyst.

Cyber Intelligence Definition and Analytic Framework

The SEI Innovation Center developed a definition of cyber intelligence to standardize the scope of the CITP with participants. Drawing on government and industry descriptions, the SEI Innovation Center defines cyber intelligence as:

The acquisition and analysis of information to identify, track, and predict cyber capabilities, intentions, and activities that offer courses of action to enhance decision making.

An analytic framework also was created to guide the CITP’s baseline and benchmark processes, the foundation of which is based on the U.S. government’s traditional intelligence cycle.
Figure 1 – Traditional Intelligence Cycle² (a loop: Planning and Direction → Collection → Processing → Analysis and Production → Dissemination)
The CITP’s analytic framework deviates from this model because the traditional intelligence cycle’s utility is limited when applied to cyber. The cycle is depicted as a linear process and does not emphasize the inter-related nature of its five functions or their relevance to adjacent disciplines, namely cyber security.
The SEI Innovation Center captured these unique cyber intelligence
analysis characteristics by creating an approach that more accurately
shows the inter-dependencies and outside influences in the cyber
intelligence process. This approach incorporates how technology
influences the way analysis is done, and uniquely identifies the functions
that integrate technology. In particular, the CITP’s analytic framework
separates analysis into two distinct functions: specialized technical
analysis (i.e. functional) and strategic analysis.
² The Traditional Intelligence Cycle was reproduced from a paper authored by Judith Meister Johnston and Rob Johnston, hosted on the Central Intelligence Agency’s public website: https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/analytic-culture-in-the-u-s-intelligence-community/page_46.pdf. Last accessed January 2013.
This analytic framework utilizes five functions to capture interdependencies of and external influences on cyber intelligence:
Figure 2 – CITP Analytic Framework (Environment, Data Gathering, Functional Analysis, Strategic Analysis, and Stakeholder Reporting and Feedback arranged around a central Cyber Security function)
• Environment: Establishes the scope of the cyber intelligence effort
and influences what data is needed to accomplish it.
• Data Gathering: Through automated and labor-intensive means, analysts
explore data sources, collect information, and aggregate it to perform
analysis.
• Functional Analysis: Analysts use gathered data to perform technical
and tailored analysis, typically in support of a cyber security mission.
• Strategic Analysis: Analysts apply a strategic lens to functional data
and report this intelligence to a stakeholder or use it to influence the
environment. If functional analysis attempts to answer the “what”
and “how” of cyber threats, strategic analysis aims to answer
“who” and “why.”
• Stakeholder Reporting and Feedback: After intelligence is disseminated
to stakeholders, they provide feedback and/or use the intelligence to
influence the environment.
It is important to note that the analytic framework does not solely exist
to address cyber security. Cyber intelligence is a critical component of
cyber security, and the two functions are inter-related; however, the CITP
focuses on cyber intelligence. Cyber intelligence supports a variety of
missions in government, industry, and academia, including national policy,
military applications, strategic communications, international negotiations,
acquisitions, risk management, and physical security. Throughout the
analytic framework, cyber security professionals receive data and
intelligence, but the cyber intelligence process operates independently
and does not necessarily need to support a cyber security mission.
Baseline and Benchmarking Approach
The SEI Innovation Center employed an iterative process to create a
discussion guide that served as a starting point to baseline organizations.
It reduced biases and was specifically designed to capture entities’ core
cyber intelligence functions, regardless of whether they represented
government, industry, or academia. Using the discussion guide, the SEI
Innovation Center typically sent a cross-functional team of intelligence and
software engineering professionals to engage with organizations during
face-to-face interview sessions. The team interacted with representatives
from their cyber intelligence and cyber security leadership as well as
functional and strategic analysts. During the interview sessions, these
entities provided information on the methodologies, technologies,
processes, and training enabling them to perform cyber intelligence.
The data gathered during these interviews established the baseline that
the SEI Innovation Center used to benchmark against its cyber intelligence
analytic framework. For benchmarking, the SEI Innovation Center compiled
and reviewed the baseline to ensure it captured the pertinent data. The
information then was ranked against 35 assessment factors distributed
amongst the analytic framework’s five functions using an ordinal scale of
++, +, 0, -, –, with 0 representing average performance. Due to the variety
in the organizations’ backgrounds and sizes, the ordinal scale offered
the necessary flexibility for benchmarking, despite its limitations with
numerical and interval analysis. Peer and group reviews also ensured
consistency throughout the rankings.
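As a concrete illustration of that ordinal benchmarking, the following minimal Python sketch aggregates ratings for a single assessment factor; the integer mapping and the sample ratings are assumptions made for illustration, not values from the CITP.

# Minimal sketch of aggregating CITP-style ordinal ratings for one
# assessment factor. The integer mapping is an assumption made only
# so a median can be computed; the sample ratings are hypothetical.
from statistics import median

SCALE = {"--": -2, "-": -1, "0": 0, "+": 1, "++": 2}
SYMBOL = {v: k for k, v in SCALE.items()}

def median_rating(ratings):
    """Median rating symbol for one factor across all organizations."""
    m = median(SCALE[r] for r in ratings)
    # An even-sized sample can yield a halfway median; Python's round()
    # settles such cases (banker's rounding) before mapping back.
    return SYMBOL[int(round(m))]

# Hypothetical ratings from eight organizations for one factor:
print(median_rating(["+", "0", "--", "+", "0", "-", "++", "0"]))  # -> 0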
The SEI Innovation Center derived the 35 assessment factors from the
interview sessions and its cyber intelligence and software engineering
expertise:
• Environment: Top-sight on cyber footprint; cyber intelligence distinction
with cyber security; role alignment; personnel to support cyber
intelligence; organizational structure; workflow utilization; prioritization
of threats; organizational situational awareness; cyber intelligence
functional and strategic analysis; scope of past, present, and future
analysis; insider threat and cyber intelligence relationship.
• Data Gathering: Requirements and sources relationship; information
sharing; meeting analytical needs; technology facilitating data
gathering; indexing and archiving of data; validation of sources.
• Functional Analysis: Workflow exists; timeliness in producing
analysis; diversity with incorporating multiple technical disciplines;
skills, knowledge, and abilities; tools utilized.
• Strategic Analysis: Distinguished from functional analysis; workflow
exists; diversity with incorporating multiple technical disciplines;
skills, knowledge, and abilities; tools utilized.
• Stakeholder Reporting and Feedback: Report types generated;
reporting mechanism for actionable and predictive analysis;
leadership influences format and production timelines; cyber
intelligence influences decision making; feedback mechanisms
exist; feedback influences data gathering and analysis; satisfying
intelligence consumers; capturing return on investment.
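Written out as a data structure, the same taxonomy might look like the sketch below; the factor strings are condensed paraphrases of the list above, and the final check merely confirms the report’s count of 35.

# The five framework functions and their assessment factors as a
# Python dict, e.g. for keying benchmark ratings by factor. Strings
# are condensed paraphrases of the report's wording.
ASSESSMENT_FACTORS = {
    "Environment": [
        "cyber footprint top-sight", "intelligence/security distinction",
        "role alignment", "supporting personnel", "organizational structure",
        "workflow utilization", "threat prioritization",
        "organizational situational awareness",
        "functional and strategic analysis", "past/present/future scope",
        "insider threat relationship",
    ],
    "Data Gathering": [
        "requirements-sources relationship", "information sharing",
        "meeting analytical needs", "enabling technology",
        "indexing and archiving", "source validation",
    ],
    "Functional Analysis": [
        "workflow exists", "timeliness", "multi-discipline diversity",
        "skills, knowledge, and abilities", "tools utilized",
    ],
    "Strategic Analysis": [
        "distinguished from functional analysis", "workflow exists",
        "multi-discipline diversity", "skills, knowledge, and abilities",
        "tools utilized",
    ],
    "Stakeholder Reporting and Feedback": [
        "report types generated", "actionable/predictive reporting mechanism",
        "leadership influence on format and timelines",
        "influence on decision making", "feedback mechanisms exist",
        "feedback influences gathering and analysis",
        "consumer satisfaction", "capturing return on investment",
    ],
}
assert sum(len(f) for f in ASSESSMENT_FACTORS.values()) == 35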
Key Findings
The following highlights the common challenges and best practices
identified during the CITP by describing them within the context of the
analytic framework’s five functions. A stacked bar chart accompanies each
function to summarize the baseline of organizations’ ratings in these areas.
Each bar within the charts represents one of the benchmark’s 35 factors
(X-axis). The height of each color within the bars shows the percentage of
organizations (Y-axis) receiving that particular rating and the red-colored
diamond symbol displays the median. The ratings range between --, -, 0, +,
and ++, with 0 being average performance for that assessment factor.
Figure 3 – CITP Baseline
Figure 4 – CITP Baseline Variances
Figure 3 divides a stacked bar chart by the five functions of the analytic
framework to visually show the CITP’s baseline. Figure 4 removes the
median (the red-colored diamond symbol) and the yellow-colored bar
sections depicting the percentage of organizations receiving an average
rating in Figure 3 to highlight the variances among entities with ratings of
–, -, +, and ++. Figures 6, 9, and 11-13 display a stacked bar chart for the
factors within each of the five functions.
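To make the chart construction concrete, here is a small matplotlib sketch of the Figure 3 layout; the counts are randomly generated stand-ins (the CITP’s per-factor data is not reproduced in this summary), and the median diamond is omitted for brevity.

# Stacked bar chart in the style of Figure 3, using made-up data:
# one bar per assessment factor, each segment's height being the
# percentage of the 26 organizations receiving that rating.
import matplotlib.pyplot as plt
import numpy as np

ratings = ["--", "-", "0", "+", "++"]
factors = [f"factor {i}" for i in range(1, 6)]  # 5 of the 35 factors
rng = np.random.default_rng(seed=1)
# counts[i, j] = number of organizations rated ratings[j] on factors[i]
counts = rng.multinomial(26, [0.1, 0.2, 0.4, 0.2, 0.1], size=len(factors))
pct = counts / counts.sum(axis=1, keepdims=True) * 100.0

bottom = np.zeros(len(factors))
for j, rating in enumerate(ratings):
    plt.bar(factors, pct[:, j], bottom=bottom, label=rating)
    bottom += pct[:, j]
plt.ylabel("% of organizations")
plt.legend(title="rating")
plt.show()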
State of the Practice in Cyber Intelligence
Most organizations identified cyber intelligence and cyber security as
two distinct and capable work functions that interact when necessary to
best support their needs. They performed cyber intelligence by trying to
understand the internal and external environment, gathering data, and
analyzing …