
Association Rules
- Mining Association Rules between Sets of Items in Large Databases (R. Agrawal, T. Imielinski & A. Swami), 1993.
- Fast Algorithms for Mining Association Rules (R. Agrawal & R. Srikant), 1994.


Basket Data
Retail organizations, e.g. supermarkets, collect and store massive amounts of sales data, called basket data. A record consists of:
- the transaction date
- the items bought
Alternatively, basket data may consist of the items bought by a customer over a period of time.


Example Association Rule
90% of transactions that purchase bread and butter also purchase milk.
- Antecedent: bread and butter
- Consequent: milk
- Confidence factor: 90%


Example Queries
- Find all the rules that have “Uludağ Gazozu” as consequent.
- Find all rules that have “Diet Coke” in the antecedent.
- Find all rules that have “sausage” in the antecedent and “mustard” in the consequent.
- Find all the rules relating items located on shelves A and B in the store.
- Find the “best” (most confident) k rules that have “Uludağ Gazozu” in the consequent.


Formal Model
- I = {i1, i2, …, im}: set of literals (items)
- D: database of transactions
- T ∈ D: a transaction, with T ⊆ I
- TID: unique identifier, associated with each transaction T
- X: a subset of I
  - T contains X if X ⊆ T


Formal Model (cont.)
- Association rule: X ⇒ Y, where X ⊆ I, Y ⊆ I and X ∩ Y = ∅.
- Rule X ⇒ Y has confidence c in D if c% of the transactions in D that contain X also contain Y.
- Rule X ⇒ Y has support s in D if s% of the transactions in D contain X ∪ Y.


Example
- I: itemset {cucumber, parsley, onion, tomato, salt, bread, olives, cheese, butter}
- D: set of transactions
  1. {cucumber, parsley, onion, tomato, salt, bread}
  2. {tomato, cucumber, parsley}
  3. {tomato, cucumber, olives, onion, parsley}
  4. {tomato, cucumber, onion, bread}
  5. {tomato, salt, onion}
  6. {bread, cheese}
  7. {tomato, cheese, cucumber}
  8. {bread, butter}


Problem
Given a set of transactions, generate all association rules that have support and confidence greater than the user-specified minimum support (minsup) and minimum confidence (minconf).


Problem Decomposition
1. Find all itemsets that have transaction support above the minimum support (the large itemsets).
2. Use the large itemsets to generate the association rules:
   2.1. For every large itemset l, find all its subsets.
   2.2. For every subset a, output the rule a ⇒ (l − a) if support(l) / support(a) ≥ minconf.
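Step 2 can be sketched as follows. The sketch assumes the supports of every large itemset and of each of its subsets are already known from step 1 (every subset of a large itemset is large, so its support is available); `gen_rules` is an illustrative name:

```python
from itertools import combinations

def gen_rules(large_itemsets, supports, minconf):
    """For every large itemset l, output a => (l - a) for each proper,
    nonempty subset a whenever support(l) / support(a) >= minconf."""
    rules = []
    for l in large_itemsets:
        l = frozenset(l)
        for r in range(1, len(l)):
            for a in combinations(sorted(l), r):
                a = frozenset(a)
                conf = supports[l] / supports[a]
                if conf >= minconf:
                    rules.append((a, l - a, conf))
    return rules

# Toy supports: {bread, butter} in 40% of transactions, bread in 50%.
supports = {frozenset({"bread"}): 0.5,
            frozenset({"butter"}): 0.4,
            frozenset({"bread", "butter"}): 0.4}
for a, c, conf in gen_rules([{"bread", "butter"}], supports, minconf=0.7):
    print(set(a), "=>", set(c), conf)
```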


Discovering Large Itemsets
Apriori and AprioriTid algorithms.
- Basic intuition: any subset of a large itemset must be large.
- Candidate itemsets having k items can be generated by joining large itemsets having k−1 items, and deleting those that contain any subset that is not large.
- Def. k-itemset: an itemset with k items; a large k-itemset is a large itemset with k items.


Apriori Algorithm

    L1 = {large 1-itemsets}
    for (k = 2; Lk−1 ≠ ∅; k++) do begin
        Ck = apriori-gen(Lk−1)            // new candidates
        forall transactions t ∈ D do begin
            Ct = subset(Ck, t)            // candidates contained in t
            forall candidates c ∈ Ct do
                c.count++
        end
        Lk = {c ∈ Ck | c.count ≥ minsup}
    end
    Answer = ∪k Lk
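The pass structure can be rendered as runnable Python. This is a sketch, not the paper's implementation: candidates are counted by direct subset tests rather than a hash tree, and minsup is taken as an absolute transaction count:

```python
from itertools import combinations

def apriori_gen(L_prev, k):
    """Join large (k-1)-itemsets, then prune candidates that have a
    (k-1)-subset which is not large."""
    cands = set()
    for p in L_prev:
        for q in L_prev:
            u = p | q
            if len(u) == k and all(frozenset(s) in L_prev
                                   for s in combinations(u, k - 1)):
                cands.add(u)
    return cands

def apriori(D, minsup_count):
    """All large itemsets of D (a list of sets); minsup as absolute count."""
    items = {frozenset([i]) for t in D for i in t}
    L = {c for c in items if sum(c <= t for t in D) >= minsup_count}
    answer = set(L)
    k = 2
    while L:
        Ck = apriori_gen(L, k)
        counts = {c: sum(c <= t for t in D) for c in Ck}  # one pass over D
        L = {c for c, n in counts.items() if n >= minsup_count}
        answer |= L
        k += 1
    return answer

# The four-transaction database used in the worked example later on:
D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]
print(sorted(map(sorted, apriori(D, 2))))
```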


Apriori Candidate Generation
apriori-gen(Lk−1): returns a superset of the set of all large k-itemsets.
- Join: select two itemsets p, q from Lk−1 such that the first k−2 items of p and q are the same; form a new candidate k-itemset c from the common k−2 items plus the two differing items.
- Prune: delete every c such that some (k−1)-subset of c is not in Lk−1.
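With itemsets kept as sorted tuples, the join on a shared (k−2)-prefix and the prune step can be sketched like this (illustrative names; the L3 → C4 example is a standard one):

```python
from itertools import combinations

def apriori_gen(L_prev):
    """Candidate k-itemsets from large (k-1)-itemsets (sorted tuples)."""
    L_set = set(L_prev)
    k_minus_1 = len(L_prev[0])
    cands = []
    for p in L_prev:
        for q in L_prev:
            # Join: same first k-2 items, p's last item before q's.
            if p[:-1] == q[:-1] and p[-1] < q[-1]:
                c = p + (q[-1],)
                # Prune: every (k-1)-subset of c must be in L_prev.
                if all(s in L_set for s in combinations(c, k_minus_1)):
                    cands.append(c)
    return cands

L3 = [(1, 2, 3), (1, 2, 4), (1, 3, 4), (1, 3, 5), (2, 3, 4)]
# The join yields (1,2,3,4) and (1,3,4,5); the prune step drops (1,3,4,5)
# because its subset (1,4,5) is not in L3.
print(apriori_gen(L3))  # [(1, 2, 3, 4)]
```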


Apriori Algorithm (cont.)
- Go through all transactions in D, incrementing the counts of all candidate itemsets in Ck.
- Lk is the set of all large itemsets in Ck.
- For minsup s = 30%:
  L = {{bread}, {cheese}, {cucumber}, {onion}, {parsley}, {salt}, {tomato}, {cucumber, onion}, {cucumber, parsley}, {cucumber, tomato}, {onion, tomato}, {parsley, tomato}, {cucumber, parsley, tomato}}


Subset Function
subset(Ck, t): the candidate itemsets contained in t.
- Candidate itemsets in Ck are stored in a hash tree.
- Leaf node: contains a list of itemsets.
- Interior node: contains a hash table.
  - Each bucket points to another node.
  - The depth of the root is 1.
  - Buckets of a node at depth d point to nodes at depth d+1.


Subset Function (cont.)
Construction of the hash tree for Ck:
- To add an itemset c:
  - start from the root
  - go down until reaching a leaf node
  - at an interior node at depth d, choose the branch to follow by applying a hash function to the d-th item of c
- All nodes are initially created as leaves.
- A leaf is converted into an interior node when the number of itemsets it holds exceeds a threshold.


Subset Function (cont.)
After constructing the hash tree for Ck, the subset function finds the candidates contained in t as follows:
- At a leaf node, find the itemsets in the leaf that are contained in t.
- At an interior node reached by hashing on item i, hash on each item of t that comes after i, and recursively apply this procedure to the node in the corresponding bucket.
- At the root, hash on every item in t.
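A minimal hash tree along these lines might look as follows. This is a sketch under stated assumptions: itemsets and transactions are sorted tuples, buckets use a simple modulo hash, depth is counted from 0 rather than 1, and duplicates from bucket collisions are removed with a set. The class and constant names are illustrative:

```python
LEAF_THRESHOLD = 3   # max itemsets in a leaf before it is split
NUM_BUCKETS = 5

class Node:
    def __init__(self):
        self.itemsets = []   # payload while this node is a leaf
        self.buckets = None  # hash table once it becomes interior

    def insert(self, itemset, depth=0):
        if self.buckets is None:                      # leaf
            self.itemsets.append(itemset)
            if len(self.itemsets) > LEAF_THRESHOLD and depth < len(itemset):
                # Convert the leaf into an interior node, re-inserting its itemsets.
                old, self.itemsets, self.buckets = self.itemsets, [], {}
                for s in old:
                    self._child(s[depth]).insert(s, depth + 1)
        else:                                         # interior: hash the depth-th item
            self._child(itemset[depth]).insert(itemset, depth + 1)

    def _child(self, item):
        return self.buckets.setdefault(hash(item) % NUM_BUCKETS, Node())

    def subset(self, t, start=0, found=None):
        """Candidate itemsets stored in this tree that are contained in t."""
        if found is None:
            found = set()
        if self.buckets is None:
            # At a leaf: check the stored itemsets against t directly.
            found.update(s for s in self.itemsets if set(s) <= set(t))
        else:
            # At an interior node: hash on each remaining item of t and recurse.
            for i in range(start, len(t)):
                b = hash(t[i]) % NUM_BUCKETS
                if b in self.buckets:
                    self.buckets[b].subset(t, i + 1, found)
        return found

root = Node()
for c in [(1, 2), (1, 3), (2, 3), (2, 5), (3, 5), (1, 5)]:
    root.insert(c)
print(sorted(root.subset((1, 2, 3, 5))))
```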


AprioriTid Algorithm
- Uses apriori-gen to generate candidates.
- Database D is not used for counting support after the first pass; the set C̄k is used for this purpose.
- Elements of C̄k are of the form <TID, {Xk}>, where each Xk is a potentially large k-itemset present in the transaction with identifier TID.
- The member of C̄k corresponding to transaction t is <t.TID, {c ∈ Ck | c contained in t}>.


AprioriTid Algorithm (cont.)

    L1 = {large 1-itemsets}
    C̄1 = database D
    for (k = 2; Lk−1 ≠ ∅; k++) do begin
        Ck = apriori-gen(Lk−1)            // new candidates
        C̄k = ∅
        forall entries t ∈ C̄k−1 do begin
            // determine candidates in Ck contained in the transaction with identifier t.TID
            Ct = {c ∈ Ck | (c − c[k]) ∈ t.set-of-itemsets and (c − c[k−1]) ∈ t.set-of-itemsets}
            forall candidates c ∈ Ct do
                c.count++
            if (Ct ≠ ∅) then C̄k += <t.TID, Ct>
        end
        Lk = {c ∈ Ck | c.count ≥ minsup}
    end
    Answer = ∪k Lk
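A runnable sketch of the same pass structure. Assumptions: C̄k is a dict mapping TID to the set of candidate k-itemsets contained in that transaction, itemsets are sorted tuples, minsup is an absolute count, and a candidate's presence in a transaction is tested through its two (k−1)-subsets obtained by dropping its last and its second-to-last item:

```python
from itertools import combinations

def candidates(L, k):
    """apriori-gen: join large (k-1)-itemsets, prune by subset check."""
    cands = set()
    for p in L:
        for q in L:
            u = tuple(sorted(set(p) | set(q)))
            if len(u) == k and all(s in L for s in combinations(u, k - 1)):
                cands.add(u)
    return cands

def apriori_tid(D, minsup_count):
    """AprioriTid sketch: D is a dict TID -> set of items. After pass 1,
    support is counted against C_bar, never against D again."""
    counts = {}
    for t in D.values():
        for i in t:
            counts[(i,)] = counts.get((i,), 0) + 1
    L = {c for c, n in counts.items() if n >= minsup_count}
    answer = set(L)
    # C_bar_1: each transaction represented as its set of 1-itemsets.
    C_bar = {tid: {(i,) for i in t} for tid, t in D.items()}
    k = 2
    while L:
        Ck = candidates(L, k)
        counts, new_C_bar = {}, {}
        for tid, tset in C_bar.items():
            # c is contained in the transaction iff both (k-1)-subsets
            # obtained by dropping c's last / second-to-last item are in tset.
            Ct = {c for c in Ck
                  if c[:-1] in tset and c[:-2] + c[-1:] in tset}
            for c in Ct:
                counts[c] = counts.get(c, 0) + 1
            if Ct:
                new_C_bar[tid] = Ct
        C_bar = new_C_bar
        L = {c for c, n in counts.items() if n >= minsup_count}
        answer |= L
        k += 1
    return answer

D = {100: {1, 3, 4}, 200: {2, 3, 5}, 300: {1, 2, 3, 5}, 400: {2, 5}}
print(sorted(apriori_tid(D, 2)))
```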


Example
minsup = 2 transactions (s = 50%)

D:
    TID   Items
    100   1 3 4
    200   2 3 5
    300   1 2 3 5
    400   2 5

L1:
    Itemset   Support
    {1}       2
    {2}       3
    {3}       3
    {5}       3

C̄1:
    TID   Set-of-Itemsets
    100   {{1}, {3}, {4}}
    200   {{2}, {3}, {5}}
    300   {{1}, {2}, {3}, {5}}
    400   {{2}, {5}}

C̄2 = {<100, {{1,3}}>, <200, {{2,3}, {2,5}, {3,5}}>, <300, {{1,2}, {1,3}, {1,5}, {2,3}, {2,5}, {3,5}}>, <400, {{2,5}}>}
L2 = {{1,3}, {2,3}, {2,5}, {3,5}}


Performance Example
Hardware: IBM RS/6000, 33 MHz
Dataset:
- Number of items: 1000
- Average size of transactions: 10
- Average size of maximal potentially large itemsets: 4
- Number of transactions: 100K
- Data size: 4.4 MB


Apriori vs. AprioriTid
Per-pass execution times of Apriori and AprioriTid:
- Average size of transactions: 10
- Average size of maximal potentially large itemsets: 4
- Number of transactions: 100K
- minsup = 0.75%


AprioriHybrid Algorithm
Uses Apriori in the initial passes and switches to AprioriTid when it expects that the set C̄k at the end of the pass will fit in memory.


Conclusions and Future Work
- The Apriori, AprioriTid and AprioriHybrid algorithms were presented.
- Future work:
  - use is-a hierarchies (e.g., beef is-a red-meat is-a meat)
  - use quantities of items bought
- This work is in the context of the Quest project at IBM.