JUSP – The JISC Journal Usage Statistics Portal

JUSP – The JISC Journal Usage Statistics Portal
Ross MacIntyre, Mimas, The University of Manchester [ross.macintyre@manchester.ac.uk]

Timeline
• 1998 Nesli
• 2000 UKSG w/shop
• 2002 COUNTER*
• 2003 *J&Db [R1]
• 2004 Evidence Base report: ‘NESLi2 Analysis of Usage Statistics’
• 2005 *J&Db [R2]
• 2006 Key Perspectives report: ‘Usage Statistics Service Feasibility Study’
• 2007 Content Complete report: ‘JUSP Scoping Study’
• 2008 JISC ITT: ‘JUSP Scoping Study 2’ *J&Db [R3]
• 2009 JUSP Report
• 2010 April: JISC fund JUSP to service

Inset: JISC Collections Invitation to Tender – ‘Usage Statistics Portal Scoping Study: Phase II – Technical Design and Prototyping’
Summary
1. This invitation to tender invites bidders to submit proposals to undertake the technical design and prototyping for a Usage Statistics Portal.
2. The deadline for proposals is 12:00 noon on Monday 14 July 2008. The work should start no later than the end of July 2008. The work should deliver a detailed technical specification and design for the Usage Statistics Portal, and a scoping of the costs required to bring it to production; the final report should be complete by 1st March 2009.

Areas for Discussion
• Different perspectives: Publishers, Aggregators, Institutions, Commercial Organisations, Product Vendors…
• What do you want to monitor & why?
• What is “usage”?
• ‘Are you getting enough?’
• What are you supplying/gathering?
• What do you do with it?
• What is your ‘holy grail’?

Mission: to assist and support libraries in the analysis of NESLi2 usage statistics and the management of their e-journals collections.
• 20+ NESLi2 e-journal deals/publishers
• 130+ HEIs taking up NESLi2 deals
• 3 intermediaries (gateway/host)

JISC JUSP service
• Refine user requirements for usage portal
• Develop portal in line with requirements
• To be based on COUNTER usage reports:
  • JR1 = Number of Successful Full-Text Article Requests by Month and Journal (total number of full-text article requests)
  • JR1a = Number of Successful Full-Text Article Requests by Month and Journal for a Journal Archive (requests from archives or backfiles)
  • JR2 = Turnaways by Month and Journal
  • JR3 = Number of Successful Item Requests and Turnaways by Month, Journal and Page Type
  • JR4 = Total Searches Run by Month and Service
  • JR5 = Number of Successful Full-Text Article Requests by Year of Publication and Journal
  • DB1 = Total Searches and Sessions by Month and Database
  • DB2 = Turnaways by Month and Database
  • DB3 = Total Searches and Sessions by Month and Service
• Harvest usage statistics via SUSHI
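To make the harvesting step concrete, a minimal sketch of a SUSHI (NISO Z39.93) GetReport request for a JR1 report is shown below. The endpoint URL, requestor ID and customer ID are hypothetical placeholders (each publisher issues its own values); this illustrates the protocol, not JUSP's actual harvester.

```python
# Sketch: POST a SUSHI GetReport request for a COUNTER Release 3 JR1 report.
# Endpoint and IDs are invented placeholders.
import urllib.request

SUSHI_ENDPOINT = "https://example-publisher.com/sushi"  # hypothetical

REQUEST_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:sus="http://www.niso.org/schemas/sushi"
    xmlns:cou="http://www.niso.org/schemas/sushi/counter">
  <SOAP-ENV:Body>
    <cou:ReportRequest>
      <sus:Requestor><sus:ID>jusp-requestor-id</sus:ID></sus:Requestor>
      <sus:CustomerReference><sus:ID>site-customer-id</sus:ID></sus:CustomerReference>
      <sus:ReportDefinition Name="JR1" Release="3">
        <sus:Filters>
          <sus:UsageDateRange>
            <sus:Begin>2009-01-01</sus:Begin>
            <sus:End>2009-12-31</sus:End>
          </sus:UsageDateRange>
        </sus:Filters>
      </sus:ReportDefinition>
    </cou:ReportRequest>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

req = urllib.request.Request(
    SUSHI_ENDPOINT,
    data=REQUEST_TEMPLATE.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "SushiService:GetReportIn"},
)
with urllib.request.urlopen(req) as resp:
    counter_xml = resp.read()  # COUNTER report wrapped in the SOAP response
```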

Technical – conversion from .xls

Technical – conversion to .xml
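These two conversion slides were screenshots. As a rough illustration of the idea, here is a minimal sketch that reads a JR1 spreadsheet and emits simple XML. The column layout (title, publisher, ISSN, then monthly counts) is an assumption, since real publisher spreadsheets vary (which is exactly why JUSP normalises them), and the element names are illustrative rather than the COUNTER XML schema.

```python
# Sketch only: convert a publisher-supplied JR1 spreadsheet (.xls) to XML.
# Assumed layout: row 0 = header (Title, Publisher, ISSN, Jan-2009, ...),
# one journal per subsequent row.
import xlrd
from xml.etree import ElementTree as ET

def jr1_xls_to_xml(path):
    sheet = xlrd.open_workbook(path).sheet_by_index(0)
    header = sheet.row_values(0)
    root = ET.Element("JR1Report")
    for r in range(1, sheet.nrows):
        row = sheet.row_values(r)
        journal = ET.SubElement(root, "Journal",
                                title=str(row[0]), issn=str(row[2]))
        for month, count in zip(header[3:], row[3:]):
            entry = ET.SubElement(journal, "Requests", month=str(month))
            entry.text = str(int(count or 0))  # blank cells count as zero
    return ET.tostring(root, encoding="utf-8")
```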

DEMO of JUSP

1. Single point of access to all JR1 and JR1a usage statistics as currently downloaded individually from publisher websites
• User informational text
  • From this page, you can download JR1 and JR1a (archive) reports. You can select data ‘from’ & ‘to’.
• Interface shows
  • Report – drop-down list (JR1 (all), JR1a (archive only))
  • Publisher – drop-down list
  • Date Span – from Month & Year, to Month & Year

2. Addition of host/gateway JR1 statistics where relevant
• User informational text
  • To get a full picture of usage you may need to add usage statistics provided by other services such as SwetsWise. This will depend on the publisher.
  • Select publisher and date range to download JR1 reports with Ingenta, SwetsWise, EBSCO EJS etc. included where appropriate.
• Interface shows
  • Report – drop-down list (JR1 (all))
  • Publisher – drop-down list
  • Date From (m/y) & To (m/y)

3. Excluding usage of backfile collections
• User informational text
  • JR1 reports include all usage. Some publishers also produce JR1a reports, which give only usage of their archive or backfile collections. If you have access to these, you can download reports here that exclude backfile use and show only usage of current titles.
• Interface shows
  • Publisher – drop-down list
  • Date From (m/y) & To (m/y)
• Data processing notes
  • Titles in JR1 and JR1a matched by ISSN. JR1a usage subtracted from JR1.
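A minimal sketch of that data-processing note, assuming each report has already been reduced to a dict mapping ISSN to request counts (the representation is illustrative):

```python
# Match titles in JR1 and JR1a by ISSN; subtract backfile (JR1a) usage
# from total (JR1) usage to leave current-title usage only.
def exclude_backfile_use(jr1, jr1a):
    """Both arguments map ISSN -> number of full-text requests."""
    return {issn: total - jr1a.get(issn, 0) for issn, total in jr1.items()}

# Example: 120 total requests, 35 of them backfile -> 85 current-title requests.
current = exclude_backfile_use({"0000-0019": 120, "1234-5678": 40},
                               {"0000-0019": 35})
# -> {"0000-0019": 85, "1234-5678": 40}
```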

4. SCONUL Return (Society of College, National and University Libraries)
• User informational text
  • Use this data for the SCONUL return, which requires total use by publisher by academic year.
  • These tables are used to look at usage trends over time, and to compare usage of the various publisher deals.
• Interface shows
  • Publisher – drop-down list
  • Academic year

5. Summary table to show use of host/gateways
• User informational text
  • Use this table to see how much of your total usage goes through intermediaries, e.g. Ingenta and SwetsWise.
• Interface shows
  • Publisher – drop-down list
  • Calendar Year(s)
• Data processing notes
  • Separate columns for publisher, gateway, host and total. JR1 usage shown in each. Percentage use from each source calculated.
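The calculation behind this summary table is simple; a sketch, assuming per-source JR1 counts have already been extracted:

```python
# Per-source JR1 counts, their total, and each source's percentage share.
def source_breakdown(publisher, gateway, host):
    total = publisher + gateway + host
    return {
        "total": total,
        "publisher_pct": round(100 * publisher / total, 1),
        "gateway_pct": round(100 * gateway / total, 1),
        "host_pct": round(100 * host / total, 1),
    }

source_breakdown(800, 150, 50)
# -> {'total': 1000, 'publisher_pct': 80.0, 'gateway_pct': 15.0, 'host_pct': 5.0}
```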

6. Summary table to show use of backfiles
• User informational text
  • Use this table to see how much of your total usage comes from backfiles.
• Interface shows
  • Publisher – drop-down list
  • Calendar Year(s)
• Data processing notes
  • JR1 total including intermediaries. Shows percentage of total JR1 usage that comes from JR1a.

7. ‘Some more figures’ [sic]
• User informational text
  • Find the average, median, (monthly) maximum number of requests, standard deviation and variance.
• Interface shows
  • Publisher – drop-down list
  • Calendar year(s)
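These figures map directly onto Python's standard library; a sketch over a year of monthly JR1 counts (the numbers are invented example data):

```python
# Summary figures over twelve monthly JR1 totals, standard library only.
import statistics

monthly = [120, 95, 140, 88, 210, 160, 40, 35, 180, 220, 190, 150]  # example

figures = {
    "average":  statistics.mean(monthly),
    "median":   statistics.median(monthly),
    "max":      max(monthly),                 # (monthly) maximum requests
    "stdev":    statistics.stdev(monthly),    # sample standard deviation
    "variance": statistics.variance(monthly), # sample variance
}
```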

8. Which titles have the highest use?
• User informational text
  • Find the (20) titles which have the highest use.
• Interface shows
  • Publisher – drop-down list
  • Calendar year(s)
• Data processing notes
  • Display (20) titles with the highest usage, including publisher, title, ISSN, no. of requests (descending order).
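A sketch of the underlying selection, assuming each title is a record carrying publisher, title, ISSN and request count (the field names are illustrative):

```python
# Sort title records by request count, descending, and keep the top n.
def top_titles(records, n=20):
    """records: [{'publisher': ..., 'title': ..., 'issn': ..., 'requests': ...}]"""
    return sorted(records, key=lambda r: r["requests"], reverse=True)[:n]
```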

9. Tables and graphs
• User informational text
  • See your monthly or annual usage over time as a chart.
• Interface shows
  • Publisher – drop-down list
  • Calendar years
• Data processing notes
  • Show table of monthly totals for each year
  • Draw line graph
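A sketch of the chart view using matplotlib (an assumption; the slides do not say what JUSP draws with), with one line of monthly totals per calendar year (invented example data):

```python
# One line per calendar year of monthly JR1 totals.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
usage_by_year = {  # example data only
    2009: [120, 95, 140, 88, 210, 160, 40, 35, 180, 220, 190, 150],
    2010: [130, 110, 150, 95, 230, 170, 45, 38, 200, 240, 205, 160],
}

for year, totals in usage_by_year.items():
    plt.plot(months, totals, marker="o", label=str(year))
plt.ylabel("Full-text requests (JR1)")
plt.legend()
plt.show()
```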

10. Benchmarking
• User informational text
  • Compare usage with others in the same JISC band.
• Interface shows
  • Publisher – drop-down list
  • Calendar year(s)
  • JISC Band (‘A’–‘J’ & ‘All’)
• Data processing notes
  • Give total for all libraries in the JISC band, and the band average.
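The data-processing note amounts to a total and a mean over the band; a sketch, assuming anonymised per-library annual totals:

```python
# Total JR1 usage across every library in the selected JISC band,
# plus the band average to benchmark against.
def band_benchmark(band_usage):
    """band_usage maps (anonymised) library -> annual JR1 total."""
    total = sum(band_usage.values())
    return total, total / len(band_usage)

total, average = band_benchmark({"lib-1": 9000, "lib-2": 12000, "lib-3": 6000})
# -> total 27000, band average 9000.0
```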

JISC Collections Benchmarking Survey – March 2010
Usage Statistics Portal: Benchmarking functionality
76 institutions responded to our short survey in reference to the usage statistics portal (benchmarking functionality). Our findings are as detailed below.

Question 1: How useful would it be for you to benchmark your institution’s journal usage for each individual NESLi2 publisher against that of other HE institutions? (76 responses)
• 38/76 (50%) = Very useful
• 36/76 (47.4%) = Somewhat useful
• 2/76 (2.6%) = Not useful

Question 5: Regarding questions 2-4 above, please indicate which would be your preferred choice regarding benchmarking (74 responses)
• 37/74 (50%) = Named institution
• 23/74 (31.1%) = Listed anonymously (same JISC band)
• 14/74 (18.9%) = Average usage by institutions in the same JISC Band

Question 10: Regarding questions 7-9 above, which would be your preferred choice? (74 responses)
• 37/74 (50%) = Being anonymised within my JISC Band
• 30/74 (40.5%) = Other institutions being able to see my institution's name
• 7/74 (9.5%) = Being part of an average figure for the Band I am in

Question 6: Are there any other benchmarking criteria you would like to see?
• Same ‘mission group’
• Select our own particular subset of named institutions
• Similar size and structure
• Usage, spend and budget for resources
• Cost per download & cost per FTE (student and staff) at department/subject level
• SCONUL divisions (RLUK, old, new, coll. HE) and by area; Scotland/Wales would also be useful
• Trend over a period of years

Question 11: Please add any additional comments you would like to make
• If OK with the licence then comparing named institutions would be best / Happy to be named if all institutions are named
• Averages are not helpful unless accompanied by other institutional data. Anonymised usage figures would be more useful
• Institutions within the same JISC Band can vary widely (e.g. do they have a medical school, do they still have a chemistry dept?), so you really need the institution name to give any sort of useful benchmarking
• Pulling in data like FTE and RAE would save us all from having to do that ourselves
• Would be useful for NESLi2; however, the majority of our deals are outside NESLi2

Participation Agreement – Library
3. PERMITTED USES/ACTIVITIES
3.1 The Institution hereby agrees to:
3.1.1 permit the Consortium to include its COUNTER-compliant Usage Statistics in the database created for the Journal Usage Statistics Portal Service;
3.1.2 permit the Consortium to display the COUNTER-compliant Usage Statistics via the Journal Usage Statistics Portal Service;
3.1.3 permit the Consortium to show the COUNTER-compliant Usage Statistics to other participating libraries in the Journal Usage Statistics Portal Service for benchmarking purposes; and
3.1.4 be identified in the Journal Usage Statistics Portal Service by: (1) institutional name; (2) JISC Band; (3) institutional group.

Participation Agreement – Library
4. RESPONSIBILITIES OF THE CONSORTIUM
4.1 The Consortium agrees to:
4.1.1 only provide access to any COUNTER-compliant Usage Statistics collected by the Consortium to authorized users from other participating institutions in the Journal Usage Statistics Portal Service and the Consortium partners;
4.1.2 use authentication for access to the Journal Usage Statistics Portal Service; and
4.1.3 permit JISC Collections to use the COUNTER-compliant Usage Statistics in the Journal Usage Statistics Portal Service database for negotiation purposes with publishers within the framework of NESLi2.

Participation Agreement – Publisher/Intermediary
3. PERMITTED USES/ACTIVITIES
3.1 The Publisher hereby agrees to:
3.1.1 provide the Consortium with the COUNTER Usage Statistics of the Institutions, including by using the SUSHI Protocol;
3.1.2 permit the Consortium to include the collected COUNTER-compliant Usage Statistics in the database created for the JISC Journals Statistics Portal Project;
3.1.4 permit the Consortium to show all COUNTER-compliant Usage Statistics to any NESLi2-eligible Institutions for their own usage assessment and for benchmarking their own usage against that of other Institutions;
3.1.5 permit the Institutions to use the information in the JISC Journals Statistics Portal for their SCONUL returns and any other uses agreed between the Publisher and the Consortium;
3.1.6 provide the Consortium with usage statistics which are in compliance with the latest COUNTER guidelines; and
3.1.7 implement the SUSHI Protocol.

Some slight modifications…

SUSHI Processing
• OUP
  – Of the 51 sites now signed up, 24 had 2010 data from OUP but no 2009 data. SUSHI was used to collect 12 months’ worth of JR1 and JR1a data for these sites.
  – (24 sites x 12 months x 2 files per month = 576 files.)
  – Total time to collect files from OUP: 25 minutes
  – Total time to quality check them: 15 minutes
  – Total time to load them: 20 minutes
  – Total processing time for 2009 data for OUP for 24 sites: 1 hour
• Publishing Technology
  – 2009 data collected, processed and loaded for 25 institutions.
  – Total time required: 17 minutes
• AIP
  – 15 sites now have complete 2009 data loads for AIP. The collection, processing and loading of 180 SUSHI files took just under 25 minutes.

Observations
• SUSHI – rare indeed!
• ‘NIL’
• Upload of publisher price lists – lack of machine-readable sources (why not ONIX Serials – SPS?)
• Authority files to populate the Journal and Supplier tables
• Subject categorisation of journals

Authentication/Authorisation
• UK Access Management Federation
• eduPersonScopedAffiliation:
  • member@institution.ac.uk or staff@institution.ac.uk
• eduPersonEntitlement:
  • http://jisc-collections.ac.uk/entitlements/representative
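As a rough sketch of how an application behind a Shibboleth SP might apply these attribute rules. How the attributes are delivered, and the member-or-staff versus entitlement logic, are assumptions; deployments differ.

```python
# Accept users whose scoped affiliation is member@ or staff@ their
# institution, or who carry the JISC Collections representative entitlement.
ALLOWED_AFFILIATIONS = {"member", "staff"}
REQUIRED_ENTITLEMENT = "http://jisc-collections.ac.uk/entitlements/representative"

def is_authorised(scoped_affiliation, entitlements):
    # eduPersonScopedAffiliation arrives as e.g. "staff@institution.ac.uk"
    role = scoped_affiliation.split("@", 1)[0]
    return role in ALLOWED_AFFILIATIONS or REQUIRED_ENTITLEMENT in entitlements

is_authorised("staff@institution.ac.uk", [])   # True
is_authorised("alum@institution.ac.uk", [])    # False
```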

Final Observations
• Open source – available to institutions or other consortia
• Complementary to, not in competition with, licensed software offerings

Q&A
(Artwork by ADA+Neagoe, originally published in Omagiu Magazine.)

Raptor
• Athens -> Shibboleth = loss of stats
• Stats important for making budgetary decisions about eResources
• Raptor is a Java-based AuthN system log-file parser
  • Shibboleth, EZproxy and OpenAthens
  • Future release may see some integration directly in Shibboleth
• Designed for non-technical users. But will…

The end