Show the following five sentences to one group of students:

  1. A newly-wed bride had made clam chowder soup for dinner and was waiting for her husband to come home.
  2. Although she was not an experienced cook she had put everything into making the soup.
  3. Finally, her husband came home, sat down to dinner and tried some of the soup.
  4. He was totally unappreciative of her efforts and even lost his temper about how bad it tasted.
  5. The poor woman swore she would never cook for her husband again.

Then show all of those sentences except the fourth to another, identical group of students.

Which group of students will rate their passage as more interesting?

For Greg Ashman, advocate of explicit instruction, the question is either a) moot, because learning matters more than interest, or b) answered in favor of the explicit version. Greg has claimed that knowledge breeds competence and competence breeds interest.

I don’t disagree with that first claim, that disinterested learning is better than interested ignorance. (Mercifully, that’s a false choice.) But that second claim is too strong. It fails to imagine a student who is competent and disinterested simultaneously. It fails to imagine that the very process of generating competence could be the cause of disinterest. It fails to account for PISA, where some of the highest-achieving countries look forward to math the least.

That second claim is also belied by the participants in Sung-Il Kim’s 1999 study who rated the implicit passage as more interesting than the explicit one and who fared no worse in a test of recall. Kim performed two follow-up experiments to determine why the implicit version was more interesting. Kim’s determination: incongruity and causal bridging inferences.

That fifth sentence surprises you without the context of the fourth (incongruity) and your brain starts working to understand its cause and connect the third sentence to the fifth (causal bridging inference).

Kim concludes that “stories are interesting to the extent that they force challenging but resolvable inferences on the reader” (p. 67).

So consider a design principle for your math classes or math curriculum:

“Ask students to make challenging but resolvable inferences before offering them those resolutions.”

Start with estimation and invention, both of which offer cognitive benefits over and above interest.

[via Daniel Willingham’s article on the brain’s bias towards stories, which you should read]

2015 Jan 11. John Golden attempts to map Willingham’s research summary onto mathematics instruction.

34 Responses to “Study: Implicit Instruction Rated More Interesting Than Explicit Instruction”

  1. on 06 Jan 2016 at 6:19 pm Greg Ashman

    There’s quite a bit of imagining going on here. And it’s not necessarily relevant to maths. I reckon I could get kids interested in something pretty easily but I’m not sure that this would necessarily impact on future maths performance.

    Not only is this my experience, it seems to be supported by a study of elementary maths students. Achievement predicted motivation but motivation did not predict achievement:

    http://onlinelibrary.wiley.com/doi/10.1111/cdev.12458/abstract;jsessionid=93206928F589F5D3DD160667EC1AEC5D.f03t01

    This seems to be at odds with self-determination theory. I think this is probably because SDT applies more to experts (many things have a differential effect on experts and novices).

    I blogged about this study here:

    https://gregashman.wordpress.com/2015/11/12/motivating-students-about-maths/

  2. on 06 Jan 2016 at 6:41 pm Anna

    This is a thought-provoking post. Thank you. Does the age of the student matter? I am unfamiliar with the study you cite, but I would love to know how old those subjects were. My gut: age does not matter. I think from young children to adults, the implicit learning moments would be more engaging. But I would be interested to know what the data say.

  3. on 06 Jan 2016 at 6:55 pm Dan Meyer

    @Anna, regarding the age of participants:

    Twenty undergraduates from an introductory psychology course received course credit for their participation.

    @Greg, you haven’t actually engaged the details of the study here.

  4. on 06 Jan 2016 at 6:56 pm Jessie Turner

    “Kim concludes that “stories are interesting to the extent that they force challenging but resolvable inferences on the reader” (p. 67). So consider a design principle for your math classes or math curriculum: “Ask students to make challenging but resolvable inferences before offering them those resolutions.” Start with estimation and invention, both of which offer cognitive benefits over and above interest.”

    The use of estimation is quick and seems effective, so I do not disagree here. But the studies on “invention” and “productive failure” are textbook examples of poorly controlled education research. Recently, a number of studies have controlled for the weak DI conditions Kapur and Schwartz used and found that DI results in comparable or sometimes superior learning. See http://bit.ly/1RjFZkO and http://bit.ly/1Oc57nj

    Here’s an excerpt from the abstract of the first study:

    1. Problem-solving prior to instruction triggered a global awareness of knowledge gaps. While this was beneficial for learning when combined with instruction with student solutions, our results indicate that comparing student solutions during instruction to specify the gaps is the most relevant factor.

  5. on 06 Jan 2016 at 7:16 pm Greg Ashman

    Dan – You are right. I have not. I don’t think it demonstrates very much at all. I think the study that I cite is far more relevant to our disagreement. Do you have any thoughts on that study?

  6. on 06 Jan 2016 at 7:20 pm Elizabeth (@cheesemonkeysf)

    First, let me say that that is a deeply weird choice of “sentences” to use as an occasion for measuring learning versus interest. For one thing, I find the grammatical and punctuation errors distracting. For another, the inherent cultural biases in the “story” are also very distracting (where on Earth does a situation like this occur?). I can’t imagine any of my students not asking why in the name of everything holy we were wasting any time on that sequence of sentences at all.

    Even if I don’t have a good estimation or invention starter on hand, it’s well known that I can dramatically improve the effectiveness of my lecture sequences (or snippets) and my students’ learning gains by providing them with some occasion to “prime the pump” of their own memories and internalized experiences around a given math topic.

    Just yesterday I had student groups in Algebra 1 do a table reading of a “deleted scene” from a Harry Potter movie that I claimed to have “found” on the topic of “more than” and “less than.” This was our opening to a unit on inequalities.

    Sometimes just giving students a chance to disinhibit themselves — while they re-expose themselves to a topic — can improve receptivity a lot. :)

    For me, the larger point is that *mere* direct instruction is an impoverished form of teaching, especially if you are trying to teach for understanding. I’m a good lecturer and a respected guide to mathematics, but I have learned through experience that the only way to get kids to really digest the “expert model I am lending them” (as How People Learn puts it) over the longer term is to get them to prime themselves for learning by giving them a reason to *involve* themselves with the learning.

    – Elizabeth (@cheesemonkeysf)

  7. on 06 Jan 2016 at 10:38 pm David Didau

    I find the story with the fourth sentence much more interesting. I still have plenty of questions about the husband’s behaviour and the fact that this is not explicitly explained works well. But without the fourth sentence, the wife’s behaviour just seems random. It’s a non-sequitur rather than a story.

    This speaks to the difficulty of finding the sweet spot between too little to make sense and too much to provide interest. As far as stories go, they need to be more complex as readers become more knowledgeable. Very young children tend to need all the relevant information in order to make sense of what they’re reading. As they become increasingly versed in stories, we can give them increasingly complex narratives with more in the way of gaps and twists.

    Also, you might be confusing “disinterested” (impartial) with “uninterested” (bored). I think you mean the latter?

  8. on 07 Jan 2016 at 12:00 am Nick Rose

    I may disagree with David. I think skilled writers often leave the reader to make inferences in order to generate interest (indeed often playing upon those expectations to create a surprise or a twist at the end – for example, discovering that the wife was secretly aware of her husband’s potentially fatal allergy to clams all along!).

    The scenario presented was rated as more interesting when ‘step 4’ was omitted – which Dan suggests demonstrates that implicit instruction provides cognitive benefits compared to explicit instruction. However, (for what it’s worth) there are two reasons why I think Sung-Il Kim’s study and Willingham’s paper on the power of stories are not incompatible with Greg’s position on explicit instruction.

    1) University undergraduates would be able to draw upon a great deal of prior experience in order to make causal inferences in the version of the story without step 4. For example, experiences of ingratitude when one has taken effort intended to help or please another person. I suspect Greg might agree that where prior knowledge is strong, then the benefits of explicit instruction start to decline.

    2) The scenario is based upon what David Geary might call ‘folk psychology’ – easily acquired and rapidly learnt information about people which had benefit for our survival and reproduction in our evolutionary past.

    http://evolution.binghamton.edu/evos/wp-content/uploads/2008/11/Geary01.pdf

    The task facing undergraduates in making an inference in this scenario would be very different to making inferences in a ‘secondary biological knowledge’ domain like mathematics. In short – undergraduates making social inferences from a story is not like 12 year olds learning maths.

    A good example of this is the difficulty of the Wason card test:

    https://en.wikipedia.org/wiki/Wason_selection_task

    Even undergraduates find this inferential logic task very difficult and the majority get it wrong. Cosmides and Tooby changed the context of the task to one involving social cheating (catching underage drinkers) and found that most people could easily make the correct logical inferences in this form.
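    The logic of the selection task can be made concrete with a small sketch. This uses the classic vowel/even-number presentation of the task rather than Wason’s original cards, and the function names are mine, not from the thread:

```python
# The classic four-card version: cards show E, K, 4, 7; rule: "if a card has
# a vowel on one side, then it has an even number on the other side."
# A card must be turned over iff its hidden side could falsify the rule.

VOWELS = set("AEIOU")

def is_vowel(face):
    return face in VOWELS

def is_even_number(face):
    return face.isdigit() and int(face) % 2 == 0

def must_turn(visible):
    # Letter showing: the rule is only at risk if the letter is a vowel
    # (the hidden side might be an odd number).
    if visible.isalpha():
        return is_vowel(visible)
    # Number showing: the rule is only at risk if the number is odd
    # (the hidden side might be a vowel).
    return not is_even_number(visible)

cards = ["E", "K", "4", "7"]
print([c for c in cards if must_turn(c)])  # ['E', '7']
```

    Most people turn the vowel and the even number; the logically required pair is the vowel and the odd number, which is the difficulty Cosmides and Tooby’s social-cheating framing dissolves.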

    So – I think you’re both right. Where a domain involves biologically primary knowledge or where students have strong prior knowledge or experience to help them – implicit instruction can create interest and potentially lead to a great deal of ‘focused thinking’ which will likely help them learn. On the other hand, this doesn’t undermine the argument that where the domain is abstract or where students are novices, explicit instruction may be a much more effective way to learn.

  9. on 07 Jan 2016 at 2:26 am Kenneth Tilton

    It is not clear the PISA results contradict Greg re competence translating to enjoyment. Looking at Singapore vs US, we see their worst kids doing better than our best but feeling like they suck at math, because all they have to compare with is other Singaporeans. Meanwhile, those top Americans feel they are doing great because they are–in comparison with their classmates. So I am not surprised students scoring well on PISA might not like math — the relevant question is how highly their competence is regarded in their classroom.

    As for the Sung-Il Kim study, that is quite a leap from the nature of good story-telling (leave some work for the reader) and recall of a story to how best to learn a skill manipulating symbols in a strict logical system. But I’ll try.

    Suppose we just learned that x^2-y^2 can be factored into (x+y)(x-y). The first problem might be a^2-b^2, then maybe d^2-9 (“oh, that is 3^2”) and we have had to fill in a missing bit. Next comes b^2-1 (“1^2!!”) or -1+d^6. How about x^2/4 – 1/9? In each case I am forced to find a bridge from a straightforward rule to the problem at hand. If the person making the worksheet knows how to tell a good story.
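    For what it’s worth, the identity behind that worksheet sequence can be spot-checked numerically. A minimal sketch in Python; the helper name and sample points are mine, not from the thread, and each expression above is read as a difference of two squares:

```python
from fractions import Fraction

def is_difference_of_squares(x, y):
    # Checks the identity x^2 - y^2 == (x + y)(x - y) at one sample point.
    return x**2 - y**2 == (x + y) * (x - y)

# Sample points chosen to mirror the worksheet sequence:
samples = [
    (5, 2),                           # a^2 - b^2
    (7, 3),                           # d^2 - 9, i.e. d^2 - 3^2
    (4, 1),                           # b^2 - 1, i.e. b^2 - 1^2
    (2**3, 1),                        # -1 + d^6, read as (d^3)^2 - 1^2
    (Fraction(6, 2), Fraction(1, 3)), # x^2/4 - 1/9, read as (x/2)^2 - (1/3)^2
]

for x, y in samples:
    assert is_difference_of_squares(x, y)
print("identity holds at every sample point")
```

    The bridging work the worksheet asks for is exactly the rewriting in the comments: recognizing each new expression as an instance of the pattern.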

  10. on 07 Jan 2016 at 4:17 am Michael Pershan

    Maybe there are two kinds of implicit instruction under discussion here — micro-explicitness and macro-explicitness.

    OK that language is terrible and I’m sorry for jargonizing this.

    What I’m trying to get at, though, is that even the “implicit” version of that passage is still pretty darn explicit. Storytelling is pretty explicit. True, leaving an inference for the reader to make leads to more interesting storytelling, in this study. But if reading this passage is a form of instruction, we’d still have to rate it as pretty explicit instruction.

    I think this study tells me that, on the micro level, leaving resolvable inferences can make a form of instruction more interesting. But I don’t see what that can tell us about the macro structure of a learning experience.

    Like, from this study it’s clear to me that a lecture will be more interesting if it includes little bridgeable gaps for the listener. And if you’re going to have work on a problem-solving experience, it will be more interesting if there are more resolvable inferences sprinkled across it.

    But does that allow us to compare how interesting an interesting lecture is vs. an interesting problem-solving experience?

  11. on 07 Jan 2016 at 8:26 am Jeff Lisciandrello

    Great illustration of a concept that can be very hard to put a finger on. I would add that for implicit instruction to work, it requires that the quantity left out ride on the edge of student understanding. A student who is capable of inferring the reason for the woman’s refusal to make the soup again will likely have heightened interest, while those who cannot will just end up confused.

    When applying this to math class, it requires a teacher who is keenly aware of individual student strengths and able to differentiate to meet needs of a variety of students; however, I strongly agree that when done artfully, implicit instruction can be much more powerful than explicit instruction.

  12. on 07 Jan 2016 at 9:00 am Dan Meyer

    David Didau:

    This speaks to the difficulty of finding the sweet spot between too little to make sense and too much to provide interest. As far as stories go, they need to be more complex as readers become more knowledgeable. Very young children tend to need all the relevant information in order to make sense of what they’re reading.

    That last sentence overextends itself, IMO. Very young children aren’t blank slates. I read your argument up to that point as, “Pick the right level of inferential work given the background knowledge of the student.” No problem there.

    My general disagreement with advocates of explicit instruction (to the extent that I try not to mischaracterize them) is that their instructional model makes no room for prediction, invention, or other inferential tasks we know to be productive and interesting for students.

    Michael Pershan:

    What I’m trying to get at, though, is that even the “implicit” version of that passage is still pretty darn explicit. Storytelling is pretty explicit. True, leaving an inference for the reader to make leads to more interesting storytelling, in this study. But if reading this passage is a form of instruction, we’d still have to rate it as pretty explicit instruction.

    Storytelling is an explicit medium. Agreed. Start to finish, it’s narrative. But the best stories make frequent room for incongruity and inference. More to the point, they make temporal room. They show something incongruous and don’t immediately explain it. They give the viewer (often through a proxy on the screen) room to ponder the incongruity and make a challenging but resolvable inference. Let’s all imagine the alternate scenario where the incongruity is either absent or immediately explained.

    Research is clear to me that teachers should be explicit. It’s also clear to me they should be thoughtful about how and when and under what preconditions they’re explicit.

    Jessie Turner:

    Recently, a number of studies have controlled for the weak DI conditions Kapur and Schwartz used

    Can you clarify how the Schwartz & Martin 2004 had a weak DI condition?

  13. on 07 Jan 2016 at 9:24 am Michael Pershan

    Research is clear to me that teachers should be explicit. It’s also clear to me they should be thoughtful about how and when and under what preconditions they’re explicit.

    For sure. And in this reading example, there were three examples of explicit instruction that preceded the inferential jump. That makes sense in this context. For the reader to have a shot at resolving this causal gap, she needs to know about the characters, the setting, their motivations, etc.

    Does that mean that every math lesson should start with explicit instruction? Of course not. But teaching (like storytelling) is very complex and hard to generalize.

    I think what I was initially commenting on was especially the title of the post. “Study: Implicit Instruction Rated More Interesting Than Explicit Instruction” doesn’t quite capture your view, I think. Or maybe I still don’t quite get what you mean by implicit/explicit instruction.

  14. on 07 Jan 2016 at 10:11 am Dan Meyer

    Alternate title: “Bridging inferences are interesting and productive and fully explicit instruction doesn’t know what to do with them.”

  15. on 07 Jan 2016 at 11:35 am Jessie Turner

    Jessie Turner:

    Recently, a number of studies have controlled for the weak DI conditions Kapur and Schwartz used

    Can you clarify how the Schwartz & Martin 2004 had a weak DI condition?

    Basically, the PFL studies do not provide contrasting cases to students in the DI condition. The DI students do not get a chance to learn how the solution was induced from specific cases. Let me also quote the Glogger-Frey et al. study I linked above:

    Schwartz et al. (2011) found advantages of an inventing condition with contrasting cases (in the domain of physics) as compared to a tell-and-practice condition. Both conditions worked with the same contrasting cases, but in process analyses, they found that the tell-and-practice condition did not contrast the cases but rather worked through them serially. Thus, contrasting the cases is crucial for noticing and learning the deep structure. If people do not learn the deep structure, they rarely exhibit spontaneous transfer to problem isomorphs with differing surface features (Chi & VanLehn, 2012; Gick & Holyoak, 1983). So the question arises if an inventing effect can still be found if the comparison condition also includes the contrasting activities, that is, if the two conditions (an inventing and a comparison) just differ in whether or not an index or criteria is generated or directly presented. Direct instruction with cases (i.e., worked examples) can be implemented in a way that encourages contrasting the cases, too (e.g., by simply providing explanations about the contrasts or prompts in order to compare the cases; e.g., Rittle-Johnson & Star, 2009; see also Renkl, 2014). Is there a benefit in generating a solution to a contrasting-cases problem (inventing) as compared to not generating but processing a given solution to the same problem?

    The authors are basically entertaining the possibility of “vicarious discovery,” or observing the process of how someone could discover the solution (pg. 78 of the article contains a vicarious worked example). Their results are interesting and predictable. In experiment 1, the vicarious worked example group learned more. In experiment 2, middle school students had significantly higher transfer scores and higher self-efficacy when learning physics. The authors concluded that vicarious worked examples improve transfer and self-efficacy, whereas inventing activities improve knowledge-gap experience and curiosity.

  16. on 07 Jan 2016 at 4:00 pm education realist

    I’m caught in the middle on this.

    1) I work with far too many students who are not captured and engaged by math. I reject Dan’s basic premise (as I understand it) that kids will be intuitively interested in a perplexing problem. I know far too many kids who, given choice between engaging in math and just tuning out and taking the F, will take the F. They can do seat time in summer school. Other kids simply aren’t interested in big, wide, far ranging problems. To be specific, they would not be interested in how long it takes the tank to fill up, nor would they care how many postits would cover the surface. I couldn’t engage them with open-ended problems like that, at least not in the early days of a class.

    2) Likewise, I reject Greg’s basic premise that knowledge breeds competence and competence breeds interest. I read a post by Greg once in which he described a class discussion/explanation. I know he’s not talking about straight lecture. But the reality is, even a good class discussion (and I count that among my strengths) will have a good number of students just not paying attention, awaiting the moment I will come by and personally explain it to them.

    So I want to increase my students’ tolerance for both. I want them to be willing to challenge open-ended problems, and I want them to tune in when I need to explain something. This has nothing to do with whether or not to do explicit instruction, but *when*.

    A principal once told me that engaging unmotivated kids begins by ensuring they can do whatever task you set in front of them. I have found that to be true. And I’ve built on that discovery: increasing a student’s (often initially non-existent) confidence in his own ability to do the task in front of him also increases his trust in me, and his willingness to take on the unknown. So over time, I can set tasks that aren’t necessarily familiar at first, require him to use existing knowledge and build on it.

    However, anyone who wanted to examine this process critically would unhesitatingly describe it as “dumbing it down”. But over time, I’ve found that stepping back allows me to take on more challenging math over time, with a better chance of them remembering it. I describe that process here: https://educationrealist.wordpress.com/2013/12/07/the-release-and-dumbing-it-down/

    Finally, I’ve found that the more students do themselves, the more likely they are to remember the experience, when compared to my undoubtedly fabulous explanation. This does not point the way to discovery or open-ended projects, but rather tasks that allow them to use their existing math knowledge to take the next step. Then, I come in after the task to do cleanup, explaining what they found and putting it all together. They are more likely to listen because they are fully aware of the context and the math behind it. And it allows me to limit my explanations and make them more likely to tune in when they are necessary.

    Way too long an explanation, but I guess I’m saying this: To me, the important thing is not whether the story is more interesting with or without the sentence. The important thing is whether or not they are going to be interested in the first place.

  17. on 07 Jan 2016 at 4:10 pm education realist

    Oh, I meant to mention this:

    When Greg talks about research, I yawn. First, there’s almost no research on what methods work with math achievement in high school, and most of what exists involves algebra–the assumption being hey, if you increase their algebra scores, it all falls into place.

    So no one has any data to fall back on in this debate. I agree that motivation doesn’t lead to achievement. But the people who push that line are usually saying “increase their achievement, then they’ll be motivated!”

    Both arguments ignore cognitive ability’s role in achievement and, to a large extent, motivation.

    I’m largely uninterested in “achievement” in any absolute sense, because the range of abilities in my classroom is so huge that I’m going to be doing a disservice to some kids no matter what I do. My job is to minimize that disservice by creating challenges for everyone.

  18. on 07 Jan 2016 at 4:25 pm Greg Ashman

    Ed – The feeling is mutual. Whenever you go off on a long, anecdotal ramble I tend to get a little drowsy too ;-) I will let others decide whether they prefer argument from evidence or anecdote.

    I will pick up on one thing – When I hold a class discussion, pretty much everyone is paying attention because they know that they might be called upon at any time. I’ve taught in a wide range of high schools including a ‘school facing challenging circumstances’ in London and it was quite possible to have such discussions there.

  19. on 07 Jan 2016 at 4:37 pm Christian

    The effect of achievement on self-concept is stronger than vice versa. Both are important but one direction is stronger.

    The article Greg Ashman cites is quite nuanced in its implications for teaching and aware of its limitations, although the summary indeed summarises it in the ‘one does predict, other not’ way.

    I also liked Jessie’s contribution. Productive Failure and invention studies indeed have limitations, although ‘poorly’ controlled is too strong, in my opinion. One test for me is how scholars look at the studies (apart from acknowledging limitations) and I like how a researcher like Kalyuga, really is thinking how apparently contradictory results (so, yes, limitations but how can we explain things) could be synthesised.

    One recent paper on this is: “Kalyuga S; Singh AM, 2015, ‘Rethinking the Boundaries of Cognitive Load Theory in Complex Learning’, Educational Psychology Review, pp. 1 – 22, http://dx.doi.org/10.1007/s10648-015-9352-0” with the quote “One of the consequences of this reconceptualization is abandoning the rigid explicit instruction versus minimal guidance dichotomy and replacing it with a more flexible approach based on differentiating specific goals of various learner activities in complex learning.”

    Maybe this will show that PF indeed is not supported but it’s a continuous ongoing scientific investigation. Just like with, for example, Cognitive Load Theory with its numerous limitations and misinterpretations. It would be great to make steps there.

  20. on 07 Jan 2016 at 4:42 pm Christian

    Btw, maybe Greg can comment on the Kalyuga paper, as I think he is ‘close to the fire’. But I realise this might be a bit off-topic.

  21. on 07 Jan 2016 at 6:02 pm Tom Hoffman

    It seems to me that one way Dan and Greg are talking past each other is by conflating “interest” and “motivation,” which strike me as very different concepts, especially in the context of the study, where it is literally a moment to moment change that’s taking place as the story progresses.

    I tried to riff off Willingham’s piece a bit on how the Common Core ELA fails to recognize the difference in the role of inference in literary and “informational” texts, but the standards are so poorly written and inconsistent on the point that I gave up.

  22. on 07 Jan 2016 at 6:14 pm Jessie Turner

    Christian’s comment on balancing techniques like PF, PFL, and more explicit instruction deserves more discussion. I agree with him and highly recommend reading Kalyuga’s book on instructional guidance (http://amzn.to/1mGDHzm).

    I did not mean to say, in any comments above, that vicarious worked examples followed by problem solving are sufficient for instruction. I just want to raise two points that are not often brought up:

    1. Many instructional guidance studies (PF and PFL are just a few examples) lack control. Unwarranted causal claims are made and that is never OK.

    2. Evidence exists for multiple sequences of guidance. For example, metacognitive instruction is amazingly successful for promoting far transfer (read http://bit.ly/22OhNux for science instruction). But it should be taught explicitly and in small doses distributed over time.

    My working theory is that when information is simple (learning facts), the generation effect is optimal. When lessons increase in complexity, use a high-low sequence of guidance for the most difficult lessons. For moderately difficult lessons, use a low-high sequence (PF or PFL with scaffolds) to impart metacognitive and problem-solving skills.

  23. on 07 Jan 2016 at 7:20 pm Greg Ashman

    Jessie – You may be right in your hierarchy. I don’t think we have the evidence to be definitive about this. I’d wish to make three points if I may.

    1. You are right to question the PF (and related) research. It could be argued that the controls don’t represent the best possible version of DI. For instance, Kapur 2014 has students spending an hour solving a single problem in different ways after they have been given the canonical solution method. It is hard to see why you would choose to do this outside an experiment. However, I concede that designing a fair test that also tested the best version of both models would be very difficult.

    2. The PF studies tend to utilise a context that is particularly suited to PF e.g. standard deviation. By asking ‘which is the most consistent baseballer?’ we have a question that students can comprehend and attempt, even if they don’t know the canonical method. It would be hard to think of an equivalent situation in many abstract areas of maths (although I suspect that Dan might dispute this).

    3. I am also interested in the process-product research of the 50s through 70s. Much of this has been forgotten and/or disregarded and I accept that it is essentially correlational; observing differences in instructional practices and correlating these to differences in results. However, the research shows that a form of explicit instruction was used by the more successful teachers (Rosenshine’s description is good – https://www.aft.org/sites/default/files/periodicals/Rosenshine.pdf). Nothing like PF emerges. This could be either because no teachers thought of using such a model at this time, some used the model but it was less effective or some teachers used it effectively but not at a scale captured by the research. It may be that well-designed PF works in the lab but is hard to implement at scale.

  24. on 07 Jan 2016 at 7:31 pm Chester Draws

    It fails to imagine a student who is competent and disinterested simultaneously.

    But how many of those are there? Really. I’ve met a few, but they tend to be a small minority.

    In any case, they tend not to be much of a problem, precisely because they are competent. If you get a class moving along they will go with the rest, because it’s not too hard for them anyway so they might as well.

    Very much more common are those that wish to be involved, are happy to be “interested”, but don’t make much progress. They are the real problem for most teachers.

    If you focus on interesting them, my experience is that you don’t improve outcomes much. They enjoy Maths, sure, but shy from repeated application to build really solid understanding. However, if you can get them into the habit of doing quite a lot of work, then they start to pick up. Confidence follows, and interest often.

    However we teach, some students will make more relative progress. Any style is easy pickings if we concentrate on its worst possible sub-group, especially in the theoretical, without evidence that it is actually a problem.

    I also don’t care for “evidence” from PISA, because there are too many cultural unknowns. I could argue that countries that move to where interest is prioritised as a starting point, before confidence via skills, see a slide in grades. Isn’t this what happened to Canada?

  25. on 07 Jan 2016 at 7:46 pm Greg Ashman

    Chester – I once heard Geoff Masters of the Australian Council for Educational Research make a similar point. Rather than look at between-country differences on PISA which are fraught due to culture (it’s all about private tuition, values etc), it may be better to look at the trajectories of individual countries over time as their policies change. Finland and Canada make interesting case-studies for such an analysis.

  26. on 07 Jan 2016 at 8:41 pm Jessie Turner

    @ Greg Ashman

    You are correct about my tentative theory. It’s a hunch.

    1. Regarding the use of low guidance, I do not necessarily recommend PF alone. I do, however, think that PF tasks could be coupled with metacognitive instruction (read the JOEP paper above). Giving metacognitive prompts and problem-solving activities yields excellent learning outcomes. The intervention I linked above was brief (several hours) and resulted in far transfer and increased motivation. Of course, I would always like to see more empirical evidence. It is also unclear how much metacognitive training is optimal. My guess is at most 15-20% of a school year.

    3. It may be that well-designed PF works in the lab but is hard to implement at scale.

    Some work on MetaCognitive Tutors may be helpful http://bit.ly/1UB3i78

    This has been a nice chat. I’m currently working on a master’s thesis in this area. Let me know if you would like to discuss this further through email (vpletap@outlook.com).

  27. on 09 Jan 2016 at 5:07 pm Dan Meyer

    Greg:

    The PF studies tend to utilise a context that is particularly suited to PF e.g. standard deviation. By asking ‘which is the most consistent baseballer?’ we have a question that students can comprehend and attempt, even if they don’t know the canonical method. It would be hard to think of an equivalent situation in many abstract areas of maths (although I suspect that Dan might dispute this).

    Your comment here surprises me. I thought I understood your position as “all explicit, all the time when learning something new.” (Not the same as “all explicit, all the time.”) Here you seem to have carved out a genre of content where other instructional strategies are more effective than EI. That’s interesting.

    As you guessed, I think the category of “questions that students can comprehend and attempt, even if they don’t know the canonical method” is quite large, from subtraction (Saxe) through linear functions (Moschkovich) and on. My dissertation research was a variant on IPL and was purely abstract. Just coordinates, no context. Students can understand each of these concepts informally and the key in IPL is to “transfer in” that knowledge to help the students learn from new instruction that’s explicit and formal. (“When preparing students to learn,” write Schwartz & Martin, 2004, p. 132, “the instructional challenge is to help students transfer in the right knowledge.”)
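    As an aside for readers who haven’t seen the baseballer task: the canonical method it targets is the standard deviation. A short sketch makes that concrete — the hit counts below are invented for illustration, not data from any study:

    ```python
    import math

    def std_dev(xs):
        """Population standard deviation: root mean squared deviation from the mean."""
        mean = sum(xs) / len(xs)
        return math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))

    # Invented hits-per-game for two hypothetical batters with the same average.
    player_a = [2, 2, 3, 2, 3]  # steady performances
    player_b = [0, 5, 1, 4, 2]  # same mean, wild swings

    # The "most consistent" batter is the one with the smaller standard deviation.
    print(std_dev(player_a))  # ~0.49
    print(std_dev(player_b))  # ~1.85
    ```

    The point of the PF framing is that students can rank the two batters informally (“player A is steadier”) before they ever see this formula.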

    However, the process-product research shows that a form of explicit instruction was used by the more successful teachers. Nothing like PF emerges. This could be either because no teachers thought of using such a model at this time, some used the model but it was less effective or some teachers used it effectively but not at a scale captured by the research. It may be that well-designed PF works in the lab but is hard to implement at scale.

    Could be:

    • The science of learning has advanced a great deal since the 1970s.
    • Better pedagogy is harder pedagogy.

    If it’s that second case, I’m happy to start making compromises. I wouldn’t want to make the best pedagogy the enemy of good pedagogy.

    But Schwartz & Martin recreated the results they saw from their first study (in which they taught the classes) with other teachers of different pedagogical abilities and preferences. The results replicated.

  28. on 09 Jan 2016 at 5:48 pm Greg Ashman

    Dan: I don’t think that PF works better than EI for some classes of problem. I just think it is less likely to fail for some classes of problem. I would still use EI to teach standard deviation to students who had not met this before. This is for two reasons: Firstly, I don’t see the types of EI that PF is compared with in studies such as Kapur’s as optimal forms of EI, although I do admit that this would be hard to achieve whilst also keeping the test fair (i.e. it is a genuine problem and not a deliberate attempt to influence the result). Secondly, there are other sources of evidence to support EI from outside this particular line of research such as effective teacher research and strategy instruction research.

    The fact that Schwartz and Martin were able to replicate their own study with different teachers tells us something. However, it probably does not tell us as much as a replication by other researchers who may have used a different method / control.

    “The science of learning” is an interesting phrase to use. Some of those who I tend to disagree with would dispute that such a thing is even possible. They would call it “positivism”. However, I do think that we have learnt a lot. This does not necessarily mean that teachers from the 1960s are all very different from teachers today. And it does not mean that we should dismiss a whole body of experimental research just because it is a bit old. We don’t do that with any other science.

  29. on 18 Jan 2016 at 9:19 am dsm

    I’ve got to figure that it will be EVEN more interesting of a conversation if you gender neutralize all the sentences and leave it as an exercise about two spouses of unspecified genders.

    Just sayin’.

  30. on 18 Jan 2016 at 9:22 am dsm

    Chester, you’re completely correct.

  31. […] As I summarized earlier, Sung-Il Kim’s research predicts that students will find this makeover more interesting than […]

  32. […] Study: Implicit Instruction Rated More Interesting Than Explicit Instruction […]

  33. on 03 Feb 2016 at 7:21 am Eric Henry

    Isn’t the specific question important here? If I am going to ask a student to do a calculation, the most efficient way to get the student comfortable doing the calculation is to use explicit instruction (DI-style explicit instruction, not Deborah Ball’s version of explicit instruction). If I am going to ask students to explain why something is true (writing a proof) or to make a mathematical model of a situation and to argue for a particular decision given the mathematics (modeling), the instruction I deliver will be different.

    It really seems to be more about the problems than the methodology.

    As an example, I give a question like this as part of my summative assessment near the end of the year. There is no additional data provided; students are assessed with a rubric according to the competence of their explanation of their reasoning.


    The population of bacteria in a sample is estimated over a period of several hours. You have access to the population data. We would like to build a mathematical model of the bacteria population so that we can make predictions.

    We’ve worked with a variety of types of functions this year, including these eight: linear, quadratic, cubic, indirect, square root, cube root, exponential and logarithmic. How would you choose which of these functions or combination of functions to use to model the scenario? What justifications would you use?

    Some students immediately respond with exponential and give competent arguments for why. But many of the functions could be appropriate choices, depending on the data. There are many different aspects of the real world that may be affecting the real world data that a competent student will consider and discuss.
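    That “depending on the data” point can be made concrete with a short sketch. The population values below are invented for illustration (the actual assessment provides its own data), but fitting two of the eight families and comparing residuals shows how the data adjudicates between candidate models:

    ```python
    import math

    # Invented population estimates: (hours elapsed, estimated population).
    data = [(0, 100), (1, 150), (2, 220), (3, 330), (4, 500), (5, 740)]

    def linear_fit(points):
        """Least-squares line y = a*x + b via the closed-form normal equations."""
        n = len(points)
        sx = sum(x for x, _ in points)
        sy = sum(y for _, y in points)
        sxx = sum(x * x for x, _ in points)
        sxy = sum(x * y for x, y in points)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        return lambda x: a * x + b

    def exponential_fit(points):
        """Fit y = A * e^(k*x) by fitting a line to (x, ln y)."""
        log_fit = linear_fit([(x, math.log(y)) for x, y in points])
        return lambda x: math.exp(log_fit(x))

    def sse(model, points):
        """Sum of squared residuals: how badly the model misses the data."""
        return sum((model(x) - y) ** 2 for x, y in points)

    linear, expo = linear_fit(data), exponential_fit(data)
    # For this (invented) data the exponential model fits far better than the line.
    print("linear SSE:", round(sse(linear, data)))
    print("exponential SSE:", round(sse(expo, data)))
    ```

    A competent student’s written argument does much the same thing informally: propose a family, check it against the data, and justify the choice.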

    I am not looking for one particular answer. I do not think that a problem is an authentic modeling problem if there is a single correct answer. I think we get good at modeling by working on modeling-ish problems. And oftentimes, DI is an appropriate technique for teaching modeling-ish problems. But sometimes students need an opportunity to solve real modeling problems. And the process of solving any modeling problem gets outside the bounds of strict DI. It has incongruity and causal bridging inferences written directly into the problem.

    In other words, to say that a class exclusively uses explicit instruction is to say that the class doesn’t cover modeling.

    (An exclusive non-explicit instruction class is problematic for its own reasons but no one is arguing for that here.)

  34. on 03 Feb 2016 at 12:05 pm Eric Henry

    To put it more briefly:
    Before talking about how to teach, don’t we need to determine what we want students to learn?

    And I am not talking about the specific content standards. Even if we accept the Common Core as a perfect document (which the authors certainly never intended), we still need to make decisions about emphasis and about how we understand mastery. I think this conversation will be much more grounded if we look at specific learning outcomes and discuss the teaching strategies in that context. I suspect that there will be a surprising amount of common ground in approaches once we decide on a particular learning outcome.
