Asking Research Questions. Sally Fincher. Building Research in Australasian Computing Education: Second Workshop, 26th-29th January 2005, Sydney. www.cs.kent.ac.uk
Questions do not spring fully-formed • They undergo iterative refinement. Questions do not stand alone • It's not enough to ask: implied in the asking is the expectation of an answer • You must want to know (evidence) • You must be able to answer (technique) 2
4 Loops: Clarity • Evidence • Technique • Significance 3
4 Loops: Clarity • Evidence • Technique • Significance 4
Loop: Clarity • Specificity – is it vague? • Language – is it understandable to your community (jargon, buzzwords)? • Recognizable as a question – is it a sentence? • Doable – can you imagine how you might study it? 5
Loop: Clarity (i) • Specificity – is it vague? “What makes a good programmer?” • Language – is it understandable to your community (jargon, buzzwords)? “using the CoP framework can we explain the learning environment” 6
Loop: Clarity (ii) • Language – is it understandable to your community (jargon, buzzwords)?
“We are planning a conference in December 2005, to explore alternative approaches to educational research. What we mean by 'alternative' is still rather unclear. Our starting point is a feeling that there is a bigger variety of practices being used by researchers and practitioners to understand learning and teaching than is apparent in the literature, and that some of these approaches could be very valuable if they were more widely known.”
“You seem to be thinking mainly about 'alternatives' in terms of empirical methodologies. But wouldn't it be worth extending to alternatives in terms of ontological and epistemological perspectives? Allied to that would be the methods of conceptual analysis.”
“I certainly think the conference idea is a good one. What I would appreciate is a translation for uninitiated people like myself. I tripped up on 'ontological and epistemological perspectives? Allied to that would be the methods of conceptual analysis.' I think the rest of Leonard's message rings some useful bells, and I would like to hear more, but in language I can understand.” 7
Loop: Clarity (iii) • Recognizable as a question – is it a sentence? “whether handset usage predicts final outcome” • Doable – can you imagine how you might study it? “When teaching students their first course on programming, should the emphasis be on problem solving or on the language?” “As programmers mature from novice to expert, do they pass through stages where they read and reason about code in different ways? If so, in what ways do the reasoning strategies differ?” 8
4 Loops: Clarity • Evidence • Technique • Significance 9
Loop: Evidence Empirical investigation • What makes you believe your claims? • What would you need to convince a colleague – in AND out of “the choir”? • What would a sufficient answer look like? • What would an unsatisfactory claim look like? • What might contradictory examples look like? 10
Loop: Evidence (i) Empirical investigation • What makes you believe your claims? • What would you need to convince a colleague – in AND out of “the choir”? “Is success in team-based work related to students’ epistemological beliefs (EB)?” “given the approach is phenomenography, it's interview transcripts: talking for about an hour with developers” 11
Loop: Evidence (ii) • What would a sufficient answer look like? • What would an unsatisfactory claim look like? • What might contradictory examples look like? “What kinds of learning and learning contexts is peer assessment suited to?” “Can explicit problem solving skill instruction be incorporated into a programming curriculum at the cost of time from practical or other explicit instruction?” 12
4 Loops: Clarity • Evidence • Technique • Significance 13
Loop: Technique • What technique(s) might produce the evidence you need? • How do you know? • What’s involved with using this technique? • Assessing “costs” – opportunity, time/resources, available knowledge • Do you know how to do it? 14
Loop: Technique (i) • What technique(s) might produce the evidence you need? • How do you know? “I am interested in the way experts and novices use 'programming tools' to represent their emerging conceptualisations of a programming task.” • What’s involved with using this technique? • Assessing “costs” – opportunity, time/resources, available knowledge • Do you know how to do it? • phenomenography, CoP analysis, etc. 15
Loop: Technique (ii) • What’s involved with using this technique? • Assessing “costs” – opportunity, time/resources, available knowledge “Do novice programmers exposed to explicit instruction in problem solving skills produce problem solutions: faster; with greater accuracy; and with more consistency (between solutions)?” 16
4 Loops: Clarity • Evidence • Technique • Significance 17
Loop: Significance (aka “so what?”) For whom (audience)? • To you, to others? • For both computing AND education? For a disciplinary community: • Relevance – why might they be interested? • Justification – what is the contribution? • Context – how does this fit in the known landscape? • “Edgy” – does it open up new avenues? 18
Questions in context (reprise) 1. Pose significant questions that can be investigated empirically 2. Link research to relevant theory 3. Use methods that permit direct investigation 4. Provide coherent and explicit chain of reasoning 5. Replicate and generalize across studies 6. Disclose research to encourage professional scrutiny and critique 19
Remember … You can always rely on the basics: • Why am I doing this? • What am I doing? • Is this doable? 20