Trust Management
Chen Ding, Chen Yueguo, Cheng Weiwei


Outline
• Introduction
  - A Computational Model
• Managing Trust in a Peer-to-Peer System
  - DMRep
  - EigenRep
• Security Concerns
  - P2PRep
  - XRep
• Conclusion

Trust Management
• "a unified approach to specifying and interpreting security policies, credentials, relationships [which] allows direct authorization of security-critical actions" – Blaze, Feigenbaum & Lacy
• Trust Management is the capture, evaluation and enforcement of trusting intentions.

Reputation, Trust and Reciprocity
• Reputation: the perception that an agent creates through past actions about its intentions and norms.
• Trust: a subjective expectation an agent has about another's future behavior, based on the history of their encounters.
• Reciprocity: mutual exchange of deeds.
• Given a social network, the three reinforce one another: an increase in ai's reciprocating actions increases ai's reputation, which in turn increases aj's trust of ai.

A Computational Model
• Defines trust as a dyadic quantity between the trustor and trustee, which can be inferred from reputation data about the trustee.
• Two simplifications:
  - The embedded social networks are taken to be static.
  - The action space is restricted to: α ∈ {cooperate, defect}

Notations for the Model
• Reputation: θji(c) ∈ [0, 1]
  - Let C be the set of all contexts of interest.
  - θji(c) represents ai's reputation in an embedded social network of concern to aj, for a context c ∈ C.
• History: Dji(c) = {E*}
  - Dji(c) represents the history of encounters that aj has had with ai within the context c.
• Trust: T(c) = E[θ(c) | D(c)]
  - The higher the trust level for agent ai, the higher the expectation that ai will reciprocate agent aj's actions.

A Computational Model (cont.)
• θab: b's reputation in the eyes of a, within a context c.
• Xab(i): the i-th transaction between a and b.
• After n transactions we obtain the history data:
  - History: Dab = {Xab(1), Xab(2), …, Xab(n)}
• Let p be the number of cooperations by agent b toward a in the n previous encounters.

A Computational Model (cont.)
• Beta distribution: p(θ) = Beta(c1, c2) is the estimator for θ.
  - Prior assumption: c1 = c2 = 1 (the uniform prior).
• A simple estimator for θab is the observed cooperation ratio p/n.
• Assume each encounter's cooperation probability is independent of the other encounters between a and b.
  - Likelihood of the n encounters: L(Dab | θ) = θ^p (1 − θ)^(n−p)
• Posterior estimate for θ: p(θ | D) = Beta(c1 + p, c2 + n − p)

A Computational Model (cont.)
• Trust towards b from a is the conditional expectation of θ given D:
  Tab = p(Xab(n+1) = cooperate | D) = E[θ | D] = (c1 + p) / (c1 + c2 + n)
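The posterior-mean estimator above fits in a few lines of Python; the function and argument names are ours, not the paper's:

```python
def trust(p, n, c1=1.0, c2=1.0):
    """Trust as the posterior mean of theta under a Beta(c1, c2)
    prior, after observing p cooperations in n encounters:
    E[theta | D] = (c1 + p) / (c1 + c2 + n)."""
    return (c1 + p) / (c1 + c2 + n)

# Example: b cooperated in 8 of 10 encounters with a.
t_ab = trust(p=8, n=10)   # (1 + 8) / (1 + 1 + 10) = 0.75

# With no history at all, the uniform prior gives a neutral 0.5.
t_new = trust(p=0, n=0)
```

Note how the prior counts c1 and c2 act as pseudo-observations: with no history the estimate is 0.5, and it moves toward p/n as n grows.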

Outline
• Introduction
  - A Computational Model
• Managing Trust in a Peer-to-Peer System
  - DMRep
  - EigenRep
• Security Concerns of the communication channel
  - P2PRep
  - XRep
• Conclusion

Reputation-based Trust Management
• Two examples:
  - Amazon.com: visitors usually look for customer reviews before deciding to buy new books.
  - eBay: participants in eBay's auctions can rate each other after each transaction.
• Both examples use a completely centralized mechanism for storing and exploring reputation data.

P2P Properties
• No central coordination
• No central database
• No peer has a global view of the system
• Global behavior emerges from local interactions
• Peers are autonomous
• Peers and connections are unreliable

Design Considerations
• The system should be self-policing: the shared ethics of the user population are defined and enforced by the peers themselves, not by some central authority.
• The system should maintain anonymity: a peer's reputation should be associated with an opaque identifier rather than with an externally associated identity.
• The system should not assign any profit to newcomers.
• The system should have minimal overhead in terms of computation, infrastructure, storage, and message complexity.
• The system should be robust to malicious collectives of peers who know one another and attempt to collectively subvert the system.

DMRep [KZ 2001]
• An approach that addresses the problem of reputation-based trust management at both the data management and the semantic level.
• Behavioral data B:
  - Observations t(q, p): data a peer q ∈ P records when it interacts with a peer p ∈ P.
  - B(p) = { t(p, q) or t(q, p) | q ∈ P } ⊆ B
• In a decentralized environment:
  - How to assess trust given B(p) and B.
  - How to obtain such B(p) and B to construct trust.

DMRep
• In the decentralized environment, if a peer q has to determine the trustworthiness of a peer p:
  - It has no access to the global knowledge B and B(p).
  - Two ways to obtain data:
    • Directly, through its own interactions: Bq(p) = { t(q, p) | t(q, p) ∈ B }
    • Indirectly, through a limited number of referrals from witnesses r ∈ Wq ⊆ P: Wq(p) = { t(r, p) | r ∈ Wq, t(r, p) ∈ B }

DMRep
• Assumptions:
  - The probability of cheating within a society is comparably low.
  - It is more difficult to hide malicious behavior.
• Complaint c(p, q)
  - An agent p can, in case of malicious behavior of q, file a complaint c(p, q).

A Simple Situation
• p and q interact, and later on r wants to determine the trustworthiness of p and q.
  - Assume p is cheating and q is honest.
  - After their interaction:
    • q will file a complaint about p.
    • p will file a complaint about q in order to hide its misbehavior.
  - If p continues to cheat, r can conclude that p is the cheater by observing the other complaints about p.

Reputation Calculation
• T(p) = |{ c(p, q) | q ∈ P }| × |{ c(q, p) | q ∈ P }|
  - A high value of T(p) indicates that p is not trustworthy.
  - Problem: the reputation is determined from the global knowledge of complaints, which is very difficult to obtain.
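A toy computation of T(p), assuming (unrealistically, as the slide notes) that the full global complaint set is available; the data and names are illustrative:

```python
def reputation(complaints, p):
    """T(p) = (# complaints p filed) x (# complaints p received).
    complaints is a list of (filer, subject) pairs taken over the
    whole population P, i.e. global knowledge."""
    filed_by_p = sum(1 for filer, _ in complaints if filer == p)
    about_p = sum(1 for _, subject in complaints if subject == p)
    return filed_by_p * about_p

# p filed one complaint and received two (one of which it tried to
# mask by complaining about q first):
complaints = [("p", "q"), ("q", "p"), ("r", "p")]
t_p = reputation(complaints, "p")   # 1 * 2 = 2
```

The product form means a peer looks suspicious only when it both files and receives complaints, matching the mutual-complaint pattern of the simple situation above.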

The Storage Structure
• P-Grid
  - insert(a, k, v): a is an arbitrary agent in the network, k is the key value to be searched for, and v is the data value associated with the key.
  - query(a, k) → v: a is an arbitrary agent in the network; the query returns the data values v for the corresponding key k.
• Properties
  - There exists an efficient decentralized bootstrap algorithm which creates the access structure without central control.
  - The search algorithm consists of randomly forwarding requests from one peer to another.
  - All algorithms scale gracefully: time and space complexity are both O(log n).

Decentralized Data Management
(Figure: a P-Grid example with peers 1–6 partitioning the binary key space. Query(5, 100) and Query(6, 100) are forwarded along the peers' routing tables — e.g. entries such as "1: 3, 01: 2" — until they reach the responsible peer; each leaf peer stores the complaints about and by a subset of agents, e.g. peer 6 stores complaints about and by 1, and peer 3 stores complaints about and by 2.)

DMRep
• Access problem:
  - p still has to decide the trustworthiness of each witness r, and of r's witnesses in turn — followed to the end, this recursion amounts to an exploration of the whole network.
  - Even if r is honest, it may not be reliably reachable over the network.

Local Computation of Trust
• Assume that peers are malicious only with a certain probability pi ≤ pimax < 1.
  - With r replicas it suffices on average that pimax^r < ε, where ε is an acceptable fault-tolerance threshold.
  - If we receive the same data about a specific peer from a sufficient number of replicas, we need no further checks.
• The algorithm also limits the depth of the exploration of peers' trustworthiness, to bound the search space.

Algorithm: Check Complaints
(Figure: peer p queries witnesses a1 … aw, stored at peers s1 … sw, about peer q.)
• The query returns W = { cri(q), cfi(q), si, fi | i = 1, …, w }
  - w: number of witnesses found
  - cri(q): number of complaints q received, as reported by witness i
  - cfi(q): number of complaints q filed, as reported by witness i
  - fi: the frequency with which si is found (accounting for the non-uniformity of the P-Grid structure)
• Normalized complaint counts:
  - crinorm(q) = cri(q) · (1 − ((s − fi)/s)^s), i = 1, …, w
  - cfinorm(q) = cfi(q) · (1 − ((s − fi)/s)^s), i = 1, …, w

Algorithm (cont.)
• Function to decide trustworthiness:
  decidep(crinorm(q), cfinorm(q)) = 1 if crinorm(q) · cfinorm(q) ≤ crpavg · cfpavg, else −1
• Exploring trust:
  - S = Σ i=1…w decide(crinorm(q), cfinorm(q))
  - If S = 0, check the trustworthiness of each single witness.
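The witness-aggregation step can be sketched as follows. This is our reading of the slides, not the paper's code: we take s to be the total observation frequency over all witnesses, and the normalization formula follows the reconstruction above.

```python
def normalize(count, f_i, s):
    """Dampen reports from storage peers found with unusually low
    frequency f_i, where s is the total frequency over all
    witnesses (non-uniformity of the P-Grid)."""
    return count * (1 - ((s - f_i) / s) ** s)

def decide(cr_norm, cf_norm, cr_avg, cf_avg):
    """p's verdict on one witness report: 1 = q looks trustworthy,
    compared against p's own average complaint counts."""
    return 1 if cr_norm * cf_norm <= cr_avg * cf_avg else -1

def assess(reports, cr_avg, cf_avg):
    """reports: (cr_i, cf_i, f_i) triples about peer q.
    A tie (S = 0) means the witnesses themselves must be checked,
    as the slide prescribes."""
    s = sum(f for _, _, f in reports)
    S = sum(decide(normalize(cr, f, s), normalize(cf, f, s),
                   cr_avg, cf_avg)
            for cr, cf, f in reports)
    if S > 0:
        return "trustworthy"
    if S < 0:
        return "untrustworthy"
    return "check individual witnesses"
```

For example, two witnesses each reporting one complaint filed and one received, against averages of 2, yield a "trustworthy" verdict; witnesses reporting ten of each against averages of 1 yield "untrustworthy".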

DMRep Discussion
• Strengths
  - Addresses the problem at both the data management and the semantic level.
  - Can be implemented in a fully decentralized peer-to-peer environment and scales well for large numbers of participants.
• Limitations
  - Assumes an environment with low cheating rates.
  - Depends on a specific data management structure (P-Grid).
  - Not robust to malicious collectives of peers.

Outline
• Introduction
  - A Computational Model
• Managing Trust in a Peer-to-Peer System
  - DMRep
  - EigenRep
• Security Concerns
  - P2PRep
  - XRep
• Conclusion

How Does One Peer Evaluate Others?
• Directly (by its own experience)
  - sat(i, j): incremented when i downloads an authentic file from j.
  - unsat(i, j): incremented when i downloads an inauthentic file from j, or fails to download a file from j.
  - Local reputation value: sij = sat(i, j) − unsat(i, j)
• Indirectly (by others' experience)
  - Ask neighbors.
  - Ask friends (familiars).
  - Ask authorities (who are more reputable).
  - Ask witnesses.

Normalizing Local Reputation Values
• Normalized local reputation value: cij = max(sij, 0) / Σj max(sij, 0)
• Local reputation vector ci = (ci1, …, cin):
  - Most entries are 0.
  - All entries lie between 0 and 1, and each row sums to 1.
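The normalization can be sketched in pure Python. One simplification is ours: for a peer with no positive local values, EigenRep falls back to the pre-trusted peers, while this sketch just returns a zero vector.

```python
def normalize_row(s_row):
    """c_ij = max(s_ij, 0) / sum_j max(s_ij, 0).
    Clipping negatives to 0 keeps every c_ij in [0, 1], so a
    malicious peer cannot assign arbitrarily negative reputation."""
    clipped = [max(s, 0) for s in s_row]
    total = sum(clipped)
    if total == 0:
        # Simplification: EigenRep would use the pre-trusted vector here.
        return [0.0] * len(s_row)
    return [s / total for s in clipped]

# Peer i's raw local values s_ij for three peers: one bad experience
# with peer 1 is clipped away by the normalization.
c_row = normalize_row([3, -1, 1])   # [0.75, 0.0, 0.25]
```

The clipping is the reason normalized values carry no notion of "distrust": a peer that cheated looks the same as a peer never interacted with.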

Aggregating Local Reputation Values
• Peer i asks its friends about their opinions on peer k: tik = Σj cij cjk
• Peer i asks its friends about their opinions on all peers: ti = C^T ci
• Peer i asks its friends about other peers' opinions again: ti = (C^T)^2 ci (like asking its friends' friends)

Global Reputation Vector
• Continuing in this manner: ti = (C^T)^n ci.
• If n is large, ti will converge to the same vector for every peer i: the left principal eigenvector of C (provided C is irreducible and aperiodic).
• We call this eigenvector t the global reputation vector.
  - tj, an element of t, quantifies how much trust the system as a whole places in peer j.
• Non-distributed algorithm: repeat t(k+1) = C^T t(k) until ||t(k+1) − t(k)|| < ε.

Practical Issues
• Pre-trusted peers: P is a set of peers known to be trusted; p is the pre-trusted vector of P, where pi = 1/|P| if i ∈ P and pi = 0 otherwise.
• Assign some trust to the pre-trusted peers: t(k+1) = (1 − a) C^T t(k) + a·p, for some constant a < 1.
• For new peers, who don't know anybody else: cij = pj.
• Modified non-distributed algorithm: start from t(0) = p and iterate t(k+1) = (1 − a) C^T t(k) + a·p until convergence.
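The modified non-distributed iteration can be sketched as plain power iteration with damping; the matrix, the damping value a = 0.15, and the three-peer example are illustrative:

```python
def global_trust(C, pre, a=0.15, eps=1e-9, max_iter=1000):
    """Iterate t(k+1) = (1 - a) C^T t(k) + a p from t(0) = p.
    C is row-normalized: C[i][j] is how much peer i trusts peer j;
    pre is the pre-trusted vector p."""
    n = len(C)
    t = pre[:]
    for _ in range(max_iter):
        nxt = [(1 - a) * sum(C[i][j] * t[i] for i in range(n)) + a * pre[j]
               for j in range(n)]
        if max(abs(x - y) for x, y in zip(nxt, t)) < eps:
            return nxt
        t = nxt
    return t

# Peer 0 is pre-trusted; peers 1 and 2 trust only peer 0, while
# peer 0 splits its trust between them.
C = [[0.0, 0.5, 0.5],
     [1.0, 0.0, 0.0],
     [1.0, 0.0, 0.0]]
pre = [1.0, 0.0, 0.0]
t = global_trust(C, pre)
# t stays a probability vector, and peer 0 ends up most trusted.
```

Because every row of C sums to 1, the iterate remains a probability distribution, and the a·p term both guarantees convergence and keeps malicious cliques from fully capturing the trust mass.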

Distributed Algorithm
• All peers in the network cooperate to compute and store the global trust vector.
• Each peer stores and computes its own global trust value.
• Minimize the computation, storage, and message overhead.

Distributed Algorithm (cont.)
• Ai: the set of peers which have downloaded files from peer i.
• Bi: the set of peers from which peer i has downloaded files.

Message Traffic
• Mean number of acquaintances per peer: m.
• Mean number of iterations: k.
• Mean number of messages per peer: O(mk).

Secure Algorithm
• The trust value of one peer should be computed by more than one other peer, because:
  - Malicious peers may report false trust values of their own.
  - Malicious peers may compute false trust values for others.
• Use multiple DHTs to assign mother peers.
• The number of mother peers per peer is the same for all peers.
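One way to realize "multiple DHTs assign mother peers" is to apply several independent hash functions to the peer identifier, one per DHT. The hash construction below is illustrative, not the paper's:

```python
import hashlib

def mother_peers(peer_id, num_peers, num_hashes=3):
    """Map peer_id to one mother peer (score manager) per DHT by
    salting a SHA-1 digest with the DHT index. Every peer applies
    the same functions, so anyone can locate a peer's mothers
    without asking that peer itself."""
    mothers = []
    for d in range(num_hashes):
        digest = hashlib.sha1(f"{d}:{peer_id}".encode()).hexdigest()
        mothers.append(int(digest, 16) % num_peers)
    return mothers

# Peer 11's three mothers in a 64-peer network; a querier can
# cross-check their three independently computed trust values.
ms = mother_peers(11, num_peers=64)
```

Because the assignment is deterministic and shared, a peer cannot choose its own mothers, and disagreement among the replicas exposes a mother that reports false values.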

Secure Algorithm (cont.)
(Figure: a mother peer collects the sets Ai and Bi and the reported trust values for its daughter peer, here peer #11.)

Secure Algorithm (cont.)
(Figure: illustration of the secure algorithm, with m acquaintances per peer.)

Secure Algorithm (cont.)
(Figure: a hash function H1 maps peer identifiers — H1(0), H1(1), H1(2), H1(5), H1(9), H1(11), H1(12), H1(i) — onto positions in the DHT, assigning each peer its mother peer.)

Modified Secure Algorithm
(Figure: the algorithm listing, shown as an image in the original slide.)

Message Traffic
• Mean number of acquaintances per peer: m.
• Mean number of iterations: k.
• Number of mothers per peer: t.
• Mean number of messages per peer: O(tmk).

Using Global Reputation Values
• Isolate malicious peers: download from reputable peers.
• Incent peers to share files: reward reputation.
• Allow newcomers to build trust:
  - Give them a 10% probability of being selected.
  - Reward newcomers generously.
• Balance the load:
  - Download probabilistically based on trust values.
  - Cap the maximum reputation (e.g. sij < MAXVALUE).
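These selection rules can be combined into a single chooser; the 10% newcomer share and the trust values below are illustrative:

```python
import random

def choose_source(candidates, trust, newcomer_share=0.1, rng=random):
    """Pick a download source among responding peers: with a small
    probability pick a zero-trust newcomer uniformly (letting it
    build trust), otherwise pick in proportion to global trust
    values (spreading load instead of always hitting the top peer)."""
    newcomers = [p for p in candidates if trust.get(p, 0.0) == 0.0]
    veterans = [p for p in candidates if trust.get(p, 0.0) > 0.0]
    if newcomers and (not veterans or rng.random() < newcomer_share):
        return rng.choice(newcomers)
    weights = [trust[p] for p in veterans]
    return rng.choices(veterans, weights=weights, k=1)[0]

trust_values = {"a": 0.9, "b": 0.1, "c": 0.0}   # c is a newcomer
source = choose_source(["a", "b", "c"], trust_values)
```

Sampling in proportion to trust, rather than always taking the maximum, is what prevents the most reputable peer from becoming a bottleneck.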

Limitations of EigenRep
• Cannot distinguish between newcomers and malicious peers.
• Malicious peers can still cheat cooperatively.
  - A peer should not report on its predecessors by itself.
• Flexibility: how to calculate reputation values as peers join and leave, go online and offline.
• When to update global reputation values?
  - They depend on the new local reputation vectors of all peers.
• Anonymity? A mother peer knows its daughters.

Outline
• Introduction
  - A Computational Model
  - Trust management in P2P systems
• Managing Trust in a Peer-to-Peer System
  - DMRep
  - EigenRep
• Security Concerns
  - P2PRep
  - XRep
• Conclusion

P2PRep & XRep
• The focus is not on the computation of reputations.
• Security of the exchanged messages:
  - Queries
  - Votes
• How to prevent different security attacks.

P2PRep & XRep
• Use Gnutella as the reference system:
  - A fully decentralized P2P infrastructure.
  - Peers have low accountability and trust.
  - Security threats to Gnutella:
    • Distribution of tampered information
    • Man-in-the-middle attack

Sketch of P2PRep
• P selects a peer among those who respond to P's query.
• P polls its peers for opinions about the selected peer.
• Peers respond to the polling with votes.
• P uses the votes to make its decision.

Sketch of P2PRep (cont.)
• To ensure authenticity of offerers and voters, and confidentiality of votes:
  - Use public-key encryption to provide integrity and confidentiality of messages.
  - Require peer_id to be a digest of a public key for which the peer knows the private key.

P2PRep
• Two approaches:
  - Basic polling: voters do not provide their peer_id in votes.
  - Enhanced polling: voters declare their peer_id in votes.

P2PRep – Basic Polling (a)
1. P broadcasts Query(search_string).
2. Each Si ∈ S replies: QueryHit(IP, port, speed, Result, peer_id).
3. P selects the top list T of offerers and generates a key pair (PKpoll, SKpoll).
4. P broadcasts Poll(T, PKpoll).
5. Each Vi ∈ V replies: PollReply({(IP, port, Votes)}PKpoll).
6. P removes suspicious votes and selects a random subset V′ of the voters.
7. P contacts each Vj ∈ V′ directly: TrueVote(Votesj).
8. Vj replies: TrueVoteReply(response); if the response is negative, P discards Votesj.
9. P selects a peer s for downloading.

P2PRep – Basic Polling (b)
1. P generates a random string r.
2. P contacts s directly: Challenge(r).
3. s replies: Response([r]SKs, PKs).
4. If h(PKs) = peer_ids and {[r]SKs}PKs = r, P downloads from s and updates its experience_repository.
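The two checks in the last step can be sketched as follows. SHA-1 is an illustrative choice of digest, and `verify_sig` stands in for a real public-key signature scheme, which the Python standard library does not provide:

```python
import hashlib

def peer_id(public_key: bytes) -> str:
    """P2PRep requires peer_id to be a digest of the peer's public
    key (digest function chosen here for illustration)."""
    return hashlib.sha1(public_key).hexdigest()

def verify_offerer(claimed_id, public_key, challenge, signed, verify_sig):
    """The checks P performs before downloading from s:
    h(PKs) == peer_id_s, and the challenge r was signed with the
    private key matching PKs. verify_sig(pk, sig, msg) is a
    placeholder for a real signature verification."""
    if peer_id(public_key) != claimed_id:
        return False   # s does not own this identifier
    return verify_sig(public_key, signed, challenge)
```

With a toy `verify_sig` such as `lambda pk, sig, msg: sig == msg` the flow can be exercised end to end; a deployment would use a real scheme such as RSA or Ed25519. The digest check is what stops a man in the middle: without C's private key, it cannot exhibit a public key whose digest equals C's identifier.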

P2PRep
• Two approaches:
  - Basic polling: voters do not provide their peer_id in votes.
  - Enhanced polling: voters declare their peer_id in votes.

P2PRep – Enhanced Polling (a)
1. P broadcasts Query(search_string).
2. Each Si ∈ S replies: QueryHit(IP, port, speed, Result, peer_id).
3. P selects the top list T of offerers and generates a key pair (PKpoll, SKpoll).
4. P broadcasts Poll(T, PKpoll).
5. Each Vi ∈ V replies: PollReply({[(IP, port, Votes, peer_idi)]SKi, PKi}PKpoll).
6. P removes suspicious votes and selects a random subset V′ of the voters.
7. P contacts each Vj ∈ V′ directly: AreYou(peer_idj).
8. Vj replies: AreYouReply(response); if the response is negative, P discards Votesj.
9. P selects a servent s for downloading.

P2PRep – Enhanced Polling (b)
1. P generates a random string r.
2. P contacts s directly: Challenge(r).
3. s replies: Response([r]SKs, PKs).
4. If h(PKs) = peer_ids and {[r]SKs}PKs = r, P downloads from s and updates its experience_repository.

Comparison: Basic vs. Enhanced
• Basic polling: all votes are considered equal.
• Enhanced polling: peer_ids allow P to weight the votes based on each voter's trustworthiness.

Discussion
• In enhanced polling, voters also provide IP & port in the PollReply message.
• Discussion: the IP & port, and the AreYou message, can be omitted.
  - Explanation 1:
    • Basic polling needs IP & port to check the truthfulness of Votes.
    • In enhanced polling, the voter's private key already guarantees this.
  - Explanation 2:
    • The paper explains that the AreYou message checks the truthfulness of (IP, port).
    • The offerer's (IP, port) needs to be checked because we later download from it; for a voter, only the truthfulness of the Votes matters.

P2PRep: Security Improvements (1)
• Distribution of tampered information: B responds to A with a fake resource.
• P2PRep solution:
  - A discovers the harmful content from B.
  - A updates B's reputation, preventing further interaction with B.
  - A becomes a witness against B in pollings by others.

P2PRep: Security Improvements (2)
• Man-in-the-middle attack: data from C to A can be modified by B, who sits on the path.
  - A broadcasts a Query and C responds.
  - B intercepts the QueryHit from C and rewrites it with B's IP & port.
  - A receives B's reply and chooses B for downloading.
  - B downloads the original content from C, modifies it, and passes it to A.

P2PRep: Security Improvements (2, cont.)
• P2PRep addresses this problem by including a challenge-response phase before downloading.
• To impersonate C, B would need:
  - C's private key, or
  - to design a public key whose digest is C's identifier.
• Public-key encryption strongly enhances the integrity of the exchanged messages.
• Both versions of the protocol address this problem.

XRep
• Extended from P2PRep.
• Combines servent-based & resource-based reputations:
  - Servent-based reputation: associated with the peer identifier.
  - Resource-based reputation: coupled to the resource's content.

XRep
• Two requirements:
  - peer_id is a digest of the peer's public key.
  - resource_id is a digest of the resource's content.
• Each peer maintains two experience repositories:
  - a servent repository
  - a resource repository
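The second requirement is a one-liner; SHA-1 here is an illustrative choice of digest function:

```python
import hashlib

def resource_id(content: bytes) -> str:
    """XRep's resource identifier: a digest of the resource's
    content. Votes on the resource therefore survive peer_id
    changes, and a tampered copy gets a different identifier."""
    return hashlib.sha1(content).hexdigest()

good_id = resource_id(b"original file bytes")
# Any tampering with the bytes yields a different resource_id, so
# a downloader can detect substitution by re-hashing what it got.
```

This content-addressing is also what lets XRep decouple a resource from any single offerer: fragments fetched from different peers can all be verified against the same resource_id.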

Sketch of XRep
• The XRep protocol consists of 5 stages:
  1. Resource searching
  2. Resource selection & vote polling
  3. Vote evaluation
  4. Best-peer check
  5. Resource downloading

XRep
• Differences from P2PRep:
  - QueryHit contains resource digests in the ResultSet.
  - Vote polling: peers are asked to vote on the resource or on the peers who offer it.
  - Vote reply: each peer can respond with votes on resources or on peers.
• As in P2PRep, public-key encryption is used.

XRep: Security Considerations
• Distribution of tampered information
• Man-in-the-middle attack

XRep: Improvements (1)
• Decoupling resources from offerers permits parallel downloads: P can ask different offerers for different fragments of the same resource.

XRep: Improvements (2)
• Combining servent-based & resource-based reputations: both have shortcomings and advantages.
  - Reputation's life cycle: servent-based is shorter, due to peer_id changes; resource-based makes a good resource always recognizable.
  - Cold start: servent-based avoids the cold start for new resources; resource-based avoids the cold start for new peers.
  - Performance bottleneck: servent-based may direct all downloads to the most reputable peers; resource-based avoids a bottleneck at the most reputable peers.

Outline
• Introduction
  - A Computational Model
  - Trust management in P2P systems
• Managing Trust in a Peer-to-Peer System
  - DMRep
  - EigenRep
• Security Concerns
  - P2PRep
  - XRep
• Conclusion

Conclusion: Reputation-based Trust Management
• Reputation computation & management
  - DMRep
  - EigenRep
• Security concerns
  - P2PRep
  - XRep