Using Ontologies to Quantify Attack Surfaces

Mr. Michael Atighetchi, Dr. Borislava Simidchieva, Dr. Fusun Yaman, Raytheon BBN Technologies
Dr. Thomas Eskridge, Dr. Marco Carvalho, Florida Institute of Technology
Captain Nicholas Paltzer, Air Force Research Laboratory
Distribution Statement “A” (Approved for Public Release, Distribution Unlimited). This material is based upon work supported by the Air Force Research Laboratory under Contract No. FA8750-14-C-0104.

Context
• Objective: Provide tools enabling automated security quantification of distributed systems with a focus on architectural patterns
  – Model key concepts related to cyber defense
  – Provide algorithms to quantify and minimize attack surfaces
  – Focus on Moving Target Defense
• Problem: Defense selection and configuration is a poorly understood, non-quantifiable process
  – Add defenses that provide little value or even increase the attack surface
  – Introduce unacceptable overhead
  – Cause unintended side effects when combining multiple defenses

Systematic Quantification of Defense Postures

Attack Surface Reasoning (ASR)
• Objective: Measure attack surfaces for security quantification
  – Establish appropriate metrics for quantifying different attack surfaces
  – Incorporate mission security and cost measurements
  – Address usability issues through representative and composite measures of effectiveness
• Technical Achievements
  – Models for attack surfaces that include systems, defenses, and attack vectors to enable quantitative characterization of attack surfaces
  – Metrics for characterizing the attack surface of a dynamic, distributed system at the application, operating system, and network layers
  – Algorithms for evaluating the effectiveness of defenses and minimizing attack surfaces

Modeling Approach
• Express a configuration C as a collection of OWL models
  – C = {system, defense, attack, adversary, mission, metrics}
  – Ontology openly available at https://ds.bbn.com/projects/asr.html
• Focus on interactions between distributed components
  – Adversaries tend to take advantage of weak seams
• Make as few assumptions about adversaries as possible
  – Minimize “garbage in, garbage out” problems
• Leverage extensible knowledge representation frameworks with powerful query languages
  – Ontologies expressed in OWL
  – Models can be queried with SPARQL (see the sketch below)
• Automate model creation when possible
  – Increase consistency and minimize cost of manual model creation
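As a concrete illustration of querying such OWL models with SPARQL, the sketch below loads a configuration into rdflib and lists the data flows that cross a trust boundary. The namespace, class, and property names (asr:DataFlow, asr:crossesTrustBoundary) and the file name config.owl are illustrative assumptions, not the actual ASR ontology vocabulary.

```python
# Minimal sketch: querying an OWL configuration model with SPARQL via rdflib.
# The namespace, class, and property names are hypothetical placeholders,
# not the actual ASR ontology terms.
from rdflib import Graph

g = Graph()
g.parse("config.owl")  # hypothetical file holding the {system, defense, ...} models

query = """
PREFIX asr: <http://example.org/asr#>
SELECT ?flow ?boundary
WHERE {
  ?flow a asr:DataFlow ;
        asr:crossesTrustBoundary ?boundary .
}
"""

for flow, boundary in g.query(query):
    print(f"{flow} crosses {boundary}")
```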

Systems Model
• Capture the relevant aspects of systems
• Based on Microsoft’s STRIDE dataflow model
  – Process: DLLs, EXEs, services
  – External Entity: people, other systems
  – Data Flow: network flows, function calls
  – Data Store: files, databases
  – Trust Boundary: process boundaries, file systems
• Extensions
  – Hierarchical layering
  – Inclusion of specific concepts to make models more understandable
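A minimal sketch of how the STRIDE element types above might be represented in code; the class names and fields are illustrative assumptions rather than the ASR systems model itself.

```python
# Minimal sketch of the STRIDE dataflow element types used by the systems model.
# Names and fields are illustrative assumptions, not the actual ASR schema.
from dataclasses import dataclass, field

@dataclass
class Process:          # e.g., a DLL, EXE, or service
    name: str

@dataclass
class ExternalEntity:   # e.g., a person or another system
    name: str

@dataclass
class DataStore:        # e.g., a file or database
    name: str

@dataclass
class TrustBoundary:    # e.g., a process boundary or a file system
    name: str

@dataclass
class DataFlow:         # e.g., a network flow or a function call
    source: str
    target: str
    crosses: list[TrustBoundary] = field(default_factory=list)

# Example: a client process talking to a database across a host boundary.
flow = DataFlow(source="web_client", target="orders_db",
                crosses=[TrustBoundary("host_boundary")])
```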

Attack Model
• Microsoft STRIDE (6 attack types):
  – S = Spoofing
  – T = Tampering
  – R = Repudiation
  – I = Information Disclosure
  – D = Denial of Service
  – E = Elevation of Privilege
• MITRE CAPEC: Common Attack Pattern Enumeration and Classification (>500 entries)
• MITRE CWE: Common Weakness Enumeration (>943 entries)
• Expresses high-level attack steps

Attack Step Model Example: AttackStepDefinition
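The original slide presents an example AttackStepDefinition as a figure. As a stand-in, the sketch below shows how an attack step definition could carry STRIDE, CAPEC, and CWE references; the field names and identifiers are assumptions, not the actual ASR definition.

```python
# Minimal sketch: an attack step tagged with STRIDE, CAPEC, and CWE references.
# Field names and identifiers are placeholders, not asserted mappings.
from dataclasses import dataclass
from enum import Enum

class Stride(Enum):
    SPOOFING = "S"
    TAMPERING = "T"
    REPUDIATION = "R"
    INFORMATION_DISCLOSURE = "I"
    DENIAL_OF_SERVICE = "D"
    ELEVATION_OF_PRIVILEGE = "E"

@dataclass
class AttackStepDefinition:
    name: str
    stride: Stride
    capec_id: str  # e.g., "CAPEC-..." (placeholder, not an asserted mapping)
    cwe_id: str    # e.g., "CWE-..."   (placeholder, not an asserted mapping)

step = AttackStepDefinition(
    name="intercept_unencrypted_flow",
    stride=Stride.INFORMATION_DISCLOSURE,
    capec_id="CAPEC-XXX",
    cwe_id="CWE-XXX",
)
```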

Current Set of Modeled Attack Steps

Adversary Model
• Captures assumptions we make about adversaries
  – Starting position
  – Overall objective of the attack
• Quantification experiments assess attack surfaces across many different adversary models (see the sketch below)
• To increase efficiency of attack vector finding, knowledge of adversarial workflows can be expressed in AttackVectorTemplates
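To make the sweep over adversary models concrete, here is a minimal sketch; the field names, example values, and the quantify_attack_surface entry point are hypothetical.

```python
# Minimal sketch: sweeping quantification over a set of adversary models.
# Field names, values, and the analysis entry point are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AdversaryModel:
    starting_position: str   # e.g., "external_network", "compromised_host"
    objective: str           # e.g., "exfiltrate_data_store", "deny_mission_flow"

adversaries = [
    AdversaryModel("external_network", "exfiltrate_data_store"),
    AdversaryModel("compromised_host", "deny_mission_flow"),
]

for adv in adversaries:
    # quantify_attack_surface(configuration, adv) would be the hypothetical
    # analysis call; here we only illustrate the sweep.
    print(f"Evaluating attack surface for adversary starting at {adv.starting_position}")
```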

Defense Model
• Express the security provided and cost incurred by cyber defenses
• Defense models may add new entities to system models (new data flows, processes, etc.), as sketched below
• Current set of modeled defenses includes three types of MTDs
  – Time-bounding observables (e.g., IPHopping)
  – Masquerading (e.g., OS Masquerading)
  – Time-bounding footholds (e.g., continuous restart via Watchdogs)
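A minimal sketch of a defense model adding new entities to a system model, using IP hopping as the example; the class layout and the gateway-insertion behavior are illustrative assumptions, not the actual ASR defense model.

```python
# Minimal sketch: a defense model adding new entities to a system model.
# Class and attribute names are illustrative assumptions about the model layout.
from dataclasses import dataclass, field

@dataclass
class SystemModel:
    processes: list[str] = field(default_factory=list)
    data_flows: list[tuple[str, str]] = field(default_factory=list)

def apply_ip_hopping(model: SystemModel, protected_flow: tuple[str, str]) -> SystemModel:
    """Hypothetical IP-hopping defense: adds a hopping gateway process and
    reroutes the protected flow through it, showing how a defense model can
    introduce new processes and data flows."""
    src, dst = protected_flow
    gateway = f"ip_hopping_gateway_{dst}"
    model.processes.append(gateway)
    model.data_flows.remove(protected_flow)
    model.data_flows.extend([(src, gateway), (gateway, dst)])
    return model

system = SystemModel(processes=["client", "server"], data_flows=[("client", "server")])
apply_ip_hopping(system, ("client", "server"))
print(system.data_flows)  # flow now routed through the hypothetical gateway
```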

Mission Model
• Missions are simply modeled as a subset of data flows together with information security and cost requirements
  – Security requirements are expressed as Confidentiality, Integrity, Availability
  – Cost requirements are expressed as % change of latency and throughput
• Missions (and their individual flows) can be in three distinct modes
  – Pass, degraded, fail
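A minimal sketch of evaluating one mission flow against its security and cost requirements and mapping the result to pass/degraded/fail; the thresholds, field names, and decision rule are illustrative assumptions, not the ASR mission model.

```python
# Minimal sketch: mapping a mission flow's measured state to pass/degraded/fail.
# Thresholds, field names, and the decision rule are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    PASS = "pass"
    DEGRADED = "degraded"
    FAIL = "fail"

@dataclass
class FlowRequirements:
    confidentiality: bool
    integrity: bool
    availability: bool
    max_latency_increase_pct: float
    max_throughput_loss_pct: float

def evaluate_flow(req: FlowRequirements, security_met: bool,
                  latency_increase_pct: float, throughput_loss_pct: float) -> Mode:
    if not security_met:
        return Mode.FAIL
    if (latency_increase_pct > req.max_latency_increase_pct
            or throughput_loss_pct > req.max_throughput_loss_pct):
        return Mode.DEGRADED
    return Mode.PASS

req = FlowRequirements(True, True, True,
                       max_latency_increase_pct=10.0, max_throughput_loss_pct=5.0)
print(evaluate_flow(req, security_met=True,
                    latency_increase_pct=12.0, throughput_loss_pct=2.0))  # Mode.DEGRADED
```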

Metrics Model
• Attack surface metrics are themselves expressed through a model
• Cover the {system & mission, security & cost} dimensions

Attack Surface Indexes

Quantification Methodology
1. Wrap Defense
2. Scan System into Model (networked system in a virtual experimentation environment)
3. Characterize Defense
4. Quantify Attack Surface (ASI, ACI, and AMI indexes across the security, cost, and mission dimensions, with mission flows marked pass, degraded, or fail)
5. Validate Attack Vectors
• Experimentation: system auto-scan, defense cost characterization, attack vector validation
• Analytics: cost and security metrics, attack vector finding, attack surface minimization
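A minimal sketch of the five-step workflow as a single driver function; every function name, signature, and return value is an illustrative placeholder, not the ASR tooling's actual API.

```python
# Minimal sketch of the five-step quantification workflow as a single driver.
# All function names, signatures, and return values are illustrative placeholders.

def wrap_defense(defense):                  # 1. Wrap Defense for the experimentation environment
    return {"defense": defense}

def scan_system_into_model(system):         # 2. Scan System into Model (auto-generated system model)
    return {"system": system}

def characterize_defense(wrapped):          # 3. Characterize Defense (measure its cost/security effect)
    return {"cost": None, "security": None}

def quantify_attack_surface(model, defense_model):  # 4. Quantify Attack Surface (ASI/ACI/AMI)
    return {"ASI": None, "ACI": None, "AMI": None}

def validate_attack_vectors(system, indexes):       # 5. Validate Attack Vectors experimentally
    return True

def quantify_defense_posture(system, defense):
    wrapped = wrap_defense(defense)
    model = scan_system_into_model(system)
    defense_model = characterize_defense(wrapped)
    indexes = quantify_attack_surface(model, defense_model)
    validate_attack_vectors(system, indexes)
    return indexes
```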

Experimental Results
• Generated models of tens of hosts and a small number of defenses and attack steps
• Deployed scanning capabilities on the BBN network and a virtualized network at a customer location and automatically generated system models from live systems
• Explored runtime complexity of attack vector finding and metrics computation algorithms using a random model generator (sketched below)
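A minimal sketch of a random model generator of the kind used for such scaling experiments; the sizes, structure, and element types are illustrative assumptions.

```python
# Minimal sketch: generating random system models of increasing size to study
# how analysis time grows. Sizes and structure are illustrative assumptions.
import random

def generate_random_model(num_hosts: int, flows_per_host: int, seed: int = 0):
    rng = random.Random(seed)
    hosts = [f"host{i}" for i in range(num_hosts)]
    flows = []
    for src in hosts:
        for _ in range(flows_per_host):
            dst = rng.choice([h for h in hosts if h != src])
            flows.append((src, dst))
    return {"hosts": hosts, "data_flows": flows}

# Example: sweep model size to observe runtime behavior of the analytics.
for n in (10, 20, 40):
    model = generate_random_model(num_hosts=n, flows_per_host=3)
    print(n, len(model["data_flows"]))
```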

Conclusion and Next Steps
• We created a framework for quantifying attack surfaces using semantic models
  – Our ontologies are openly available at https://ds.bbn.com/projects/asr.html
  – We hope you will try them out and provide feedback!
• Next Steps
  – Automate defense deployment exploration within a system through a genetic search algorithm (sketched below)
  – Include metrics to capture interaction effects between multiple cyber defenses
  – Expand scenario to enterprise-scale regimes
  – Extend the set of modeled cyber defenses beyond MTDs: proxy overlay networks, deception, reactive defenses
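To make the planned genetic-search exploration concrete, here is a minimal sketch of evolving defense placements; the bit-vector encoding, fitness function, and parameters are illustrative assumptions, not the planned ASR algorithm.

```python
# Minimal sketch: genetic search over candidate defense placements.
# The bit-vector encoding, fitness function, and parameters are illustrative
# assumptions; a real fitness would call the ASR security/cost metrics.
import random

rng = random.Random(0)
NUM_PLACEMENTS = 8          # candidate (defense, location) pairs, hypothetical
POP_SIZE, GENERATIONS = 20, 30

def fitness(genome):
    security_gain = sum(genome)        # stand-in for attack surface reduction
    cost_penalty = 0.4 * sum(genome)   # stand-in for cumulative defense cost
    return security_gain - cost_penalty

def mutate(genome, rate=0.1):
    return [1 - g if rng.random() < rate else g for g in genome]

def crossover(a, b):
    point = rng.randrange(1, NUM_PLACEMENTS)
    return a[:point] + b[point:]

population = [[rng.randint(0, 1) for _ in range(NUM_PLACEMENTS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best placement vector:", best)
```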

Contacts
• Mr. Michael Atighetchi, matighet@bbn.com
• Dr. Borislava Simidchieva, simidchieva@bbn.com
• Dr. Fusun Yaman, fyaman@bbn.com
• Dr. Thomas Eskridge, teskridge@fit.edu
• Dr. Marco Carvalho, mcarvalho@cs.fit.edu
• Captain Nicholas Paltzer, nicholas.paltzer@us.af.mil