Teaching and Learning blog

Explore insights, trends, and research that impact teaching, learning, and leading.


  •

    #ShowTheEvidence: Building a movement around research, impact in ed tech

    By Aubrey Francisco, Bart Epstein, Gunnar Counselman, Katrina Stevens, Luyen Chou, Mahnaz Charania, Mark Grovic, Rahim Rajan, Robert Pianta, Rebecca Griffiths

    This is the first in a series of essays surrounding the EdTech Efficacy Research Symposium, a gathering of 275 researchers, teachers, entrepreneurs, professors, administrators, and philanthropists to discuss the role efficacy research should play in guiding the development and implementation of education technologies. This series was produced in partnership with Pearson, a co-sponsor of the symposium co-hosted by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator.

    To improve education in America, we must improve how we develop and use education technology.

    Teachers and students are increasingly using digital tools and platforms to support learning inside and outside the classroom every day. There are 3.6 million teachers using ed tech, and approximately one in four college students take online courses — four times as many as a decade earlier. Technology will impact the 74 million children currently under the age of 18 as they progress through the pre-K–12 education system. The key question is: What can we do to make sure that the education technology being developed and deployed today fits the needs of 21st-century learners?

    Our teachers and students deserve high-quality tools that provide evidence of student learning, and that provide the right kind of evidence — evidence that can tell us whether the tool is influencing the intended learning outcomes.

    Evidence and efficacy can no longer be someone else’s problem to be solved at some uncertain point in the future. The stakes are too high. We all have a role to play in ensuring that the money spent in ed tech (estimated at $13.2 billion in 2016 for K-12) lives up to the promise of enabling more educators, schools, and colleges to genuinely improve outcomes for students and help close persistent equity gaps.

    Still, education is complex. Regardless of the quality of a learning tool, there will be no singular, foolproof ed tech solution that will work for every student and teacher across the nation. Context matters. Implementation matters. Technology will always only be one element of an instructional intervention, which will also include instructor practices, student experiences, and multiple other contextual factors.

    Figuring out what actually works and why it works requires intentional planning, dedicated professional development, thoughtful implementation, and appropriate evaluation. This all occurs within a context of inconsistent and shifting incentives and, in the U.S., involves a particularly complex ecosystem of stakeholders. And unfortunately, despite a deep, vested interest in improving the system, the current ecosystem is often better at supporting the status quo than at introducing a potentially better-suited learning tool.

    That’s the challenge to be taken up by the EdTech Efficacy Research Symposium in Washington, D.C., this week, and the work underway as part of the initiative convened by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator. People like us rarely have the opportunity to collaborate, but this issue is too important to go it alone.

    Over the past six months, 10 working groups consisting of approximately 150 people spent valuable hours together learning about the challenges associated with improving efficacy and exploring opportunities to address these challenges. We’ve looked at issues such as how ed tech decisions are made in K-12 and higher education, what philanthropy can do to encourage more evidence-based decision-making, as well as what will be necessary to make the focus on efficacy and transparency of outcomes core to how ed tech companies operate.

    Over the next six weeks, we’ll explore these themes here, sharing findings and recommendations from the working groups. Our hope is to stimulate not just discussion but also practical action and concrete progress.

    Action and progress might look like new ways to use research in decision-making such as informational site Evidence for ESSA or tools that make it easier for education researchers to connect with teachers, districts, and ed tech companies, like the forthcoming National Education Researcher Database. Collaboration is critical to improving how we use research in ed tech, but it’s not easy. Building a common framework takes time. Acting on that framework is harder.

    So, as a starting point, here are three broader issues that we’ve learned about efficacy and evidence from our work so far.

    Everyone wants research and implementation analysis done, but nobody wants to pay more for it

    We know it’s not realistic to expect that the adoption of each ed tech product or curricular innovation will be backed up by a randomized control trial.

    Investors are reluctant to fund these studies, and schools and developers rarely want to pick up the tab for expensive research. When Richard Culatta and Katrina Stevens were still at the U.S. Department of Education’s Office of Educational Technology, they pointed out that “it wouldn’t be economically feasible for most app creators (or schools) to spend $250k (a low price tag for traditional educational research) to evaluate the effectiveness of an app that only cost a total of $50k to build.”

    We could spend more efficiently, turning the 15,000 small pilots and decisions already underway into new work and new insights without spending more money. This could look like a few well-designed initiatives to gather and share relevant information about implementations and efficacy. Critically, we’ll need to find a sustainability model for that type of rigorous evaluation to ensure it becomes a key feature in how adoption decisions are made.

    We need to recognize that evidence exists on a continuum

    Different types of evidence can support different purposes. What is important is that each decision is supported by an appropriate level of evidence. This guide by Mathematica provides a useful reference for educators on different evidence types and how they should be viewed. It is wise to weigh the scale and cost of a decision and then determine the appropriate type of evidence to seek.

    Tools like the Ed Tech Rapid Cycle Evaluation Coach, LearnPlatform, and Edustar can provide useful support in making decisions and evaluating the use of technology.

    It’s important to remember that researchers and philanthropists may use education research for different purposes than would a college, university system, or district. Academic researchers may be looking to identify causal connections, learning gains, or retention rates, while a district is often focused on a specific context and implementation (what works for schools similar to mine).

    When possible, traditional randomized control trials provide useful information, but they’re often not affordable, feasible, or even necessarily appropriate. For example, many districts, schools, or colleges are not accustomed to or well versed in undertaking this type of research themselves.

    It’s easy to blame other actors for the current lack of evidence-driven decisions in education

    Everyone we spoke to agrees that decisions about ed tech should be made on the basis of merit and fit, not marketing or spin. But nearly everyone thinks that this problem is caused by other actors in the ecosystem, and this means that progress here will require hard work and coordination.

    For example, investors often don’t screen their investments for efficacy, nor do they push their portfolio companies to undertake sufficient research. Not surprisingly, this tends to be because such research is costly and doesn’t necessarily drive market growth. It’s also because market demand is not driven by evidence. It’s simply not the case that selection choices for tools or technologies are most often driven by learning impact or efficacy research. That may be shifting slowly, but much more needs to be done.

    Entrepreneurs and organizations whose products are of the highest quality are frustrated that schools are too often swayed by their competitors’ flashy sales tactics. Researchers feel that their work is underappreciated and underutilized. Educators feel overwhelmed by volume and claims, and are frustrated by a lack of independent information and professional support. We have multiple moving pieces that must be brought together in order to improve our system.

    Ensuring that ed tech investments truly help close achievement gaps and expand student opportunity will require engagement and commitments from a disparate group of stakeholders to help invent a new normal so that our collective progress is directional and meaningful. To make progress on this, we must bring the conversation of efficacy and the use of evidence to center stage.

    That’s what we’re hoping to help continue with this symposium. We’ve learned much, but we know that the journey is just beginning. We can’t do it alone. Feel free to follow and join the conversation on Twitter with #ShowTheEvidence.


    Authors:

    • Aubrey Francisco, Chief Research Officer, Digital Promise
    • Bart Epstein, Founding CEO, Jefferson Education Accelerator
    • Gunnar Counselman, Chief Executive Officer, Fidelis Education
    • Katrina Stevens, former Deputy Director, Office of Educational Technology, U.S. Department of Education
    • Luyen Chou, Chief Product Officer, Pearson
    • Mahnaz Charania, Director, Strategic Planning and Evaluation, Fulton County Schools, Georgia
    • Mark Grovic, Co-Founder and General Partner, New Markets Venture Partners
    • Rahim Rajan, Senior Program Officer, Bill & Melinda Gates Foundation
    • Robert Pianta, Dean, University of Virginia Curry School of Education
    • Rebecca Griffiths, Senior Researcher, Center for Technology in Learning, SRI International

    This series is produced in partnership with Pearson. The 74 originally published this article on May 1, 2017, and it was re-posted here with permission.

  •

    3 simple research-based ways to ace a test

    By John Sadauskas, PhD, Learning Capabilities Design Manager, Pearson

    On top of the traditional challenges of balancing classwork, part- or full-time jobs, extracurricular activities, and social lives, today’s higher education students also face the ever-present information firehose that is the Internet. Every day, they receive a constant stream of emails, push notifications, instant messages, social media comments, and other digital content — all of which they can carry in their pockets and, more importantly, which can interrupt whatever they’re doing at a moment’s notice.

    As a result, one major challenge for today’s students is to manage the ever-growing amount of information, communication, and priorities competing for their time and attention — especially when they need to study.

    We’ve been hearing from many students that when they do make time to sit down and study, they find it difficult to manage that time efficiently — particularly when deciding what to study, when to study, how often to revisit material, and how long to keep studying in order to feel prepared for multiple upcoming exams.

    Fortunately, researchers have been investigating this problem for decades and have identified multiple methods for getting the most out of study sessions. Accordingly, here are some research-based best practices that students (or anyone else, for that matter) can use to boost their memorization skills.

    Memorization takes practice

    Every time you recall a piece of information (your mother’s birthday, a favorite meal at a restaurant, a key term’s definition for an exam) you retrieve it from the vast trove of knowledge that is your long-term memory. However, you’ve probably found that some pieces of information are easier to remember than others.

    You’re likely to recall your home address easily because you constantly need it when filling out online forms and ensuring Amazon knows where to ship your limited edition Chewbacca mask. On the other hand, it may not be as easy to recall a friend’s phone number because it’s stored in your contacts and you rarely need to actually dial the numbers.

    Unsurprisingly, researchers have found similar results to these — the more often people “practice” retrieving a certain piece of information, the easier it is for them to remember it. More importantly, scientists have demonstrated that getting yourself on a regular studying schedule can take advantage of this using what is called “spaced practice” — studying in short sessions spaced out over long periods of time. Essentially, spaced practice involves quizzing yourself and giving yourself many opportunities to practice pulling information out of your long-term memory — and doing it often over an extended period of time.

    Want to give spaced practice a try? Here are some key guidelines to ensure you’re getting the most out of it.

    Study early and daily

    One of the most important things to remember when using spaced practice is to give yourself enough lead time before an exam. Research has shown that, in general, the earlier students start studying and the more consistently they keep studying until the exam, the higher their scores.

    For example, if you have an exam in two weeks, you could begin studying for 20 minutes every day for those two weeks. That way, you’ll have many opportunities to practice retrieving the information, increasing the likelihood that you’ll remember it the day of the exam.

    In contrast, if you start studying only a few days before the exam, you’ll have fewer opportunities to practice retrieving the material, and are less likely to remember it. So while there isn’t a magic recipe to determine the exact moment to start studying based on the amount of material you need to remember, it’s clear that the earlier you start studying every day, the better.

    Short and sweet beats long and grueling

    Another key component to spaced practice is the length of the study session. While it is common for students to embark upon marathon, multi-hour study sessions, researchers have found that when using spaced practice, long study sessions are not necessarily more effective than short study sessions. In other words, committing to studying certain material every day for 30 minutes is likely just as effective as studying that same material for an hour every day.

    Now, this doesn’t mean we should all keep our study sessions as short as humanly possible and expect amazing results. Instead, it reinforces the concept of spaced practice. For instance, let’s say your goal is to memorize 15 definitions for a quiz, and you’re committed to practicing every day until that quiz. You sit down to practice each definition twice, which takes 30 minutes. (Remember, the aim of spaced practice is to retrieve a memory, and then leave a “space” of time before you retrieve it again.)

    Because your brain has already retrieved each definition twice in that sitting, you may not benefit much more from studying the same words for an additional 30 minutes and reviewing each definition a total of four times. In short, once you’ve started studying early and daily, make sure to practice each concept, definition, or item a few times per session — but more than that in a single sitting is likely overkill.
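    The 30-minute example above is simple arithmetic, and a daily plan like it can be sketched in a few lines. This is purely illustrative Python — the function name, parameters, and the one-minute-per-retrieval timing are assumptions for the sketch, not values prescribed by the research discussed here:

    ```python
    from datetime import date, timedelta

    def study_plan(exam_date, items, reps_per_session=2, minutes_per_rep=1.0,
                   start=None):
        """Sketch a daily spaced-practice plan: every day until the exam,
        retrieve each item a couple of times, then stop for the day."""
        start = start or date.today()
        days = (exam_date - start).days
        # A few retrievals per item, then a "space" until tomorrow's session.
        session_minutes = len(items) * reps_per_session * minutes_per_rep
        return [(start + timedelta(d), session_minutes) for d in range(days)]

    # 15 definitions, 2 retrievals each at ~1 minute apiece:
    # short daily sessions, matching the example above.
    plan = study_plan(date(2024, 5, 15), ["def%d" % i for i in range(15)],
                      start=date(2024, 5, 1))
    print(len(plan), plan[0][1])  # 14 daily sessions of 30.0 minutes each
    ```

    The point of the sketch is that total effort stays fixed while the retrievals are spread out; making `reps_per_session` larger mostly lengthens each sitting rather than improving the spacing.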

    Don’t break the chain

    I’ve emphasized the importance of practicing daily quite a bit here, and there is also a scientific reason behind that. A solid spaced practice routine means we’re continually retrieving certain information and keeping it fresh in our minds. However, if we stop practicing before something is committed to our long-term memories, we’ll eventually forget it. Scientists have charted out this phenomenon in what is referred to as “The Forgetting Curve.”

    The Forgetting Curve

    Source: https://www.cambridge.org/core/journals/cns-spectrums/article/play-it-again-the-master-psychopharmacology-program-as-an-example-of-interval-learning-in-bite-sized-portions/E279E18C8133549F94CDEE74C4AF9310#

    In the same way that continual practice with short spaces between each session helps us to remember information, scientists have found that our ability to remember something decreases over time if we don’t practice or use the information — which is what the steep downward slope of the Forgetting Curve is meant to illustrate. When we learn new information and are immediately asked to recall it, we’re likely to remember it (the very left side of the graph).

    However, from that moment on, the likelihood that we’ll remember decreases quickly and drastically unless we recall or use the memory again. If we do, then we can keep resetting or “recharging” that Forgetting Curve and keep remembering the information over time with daily practice.

    Hermann Ebbinghaus and the forgetting curve

    Source: http://www.wranx.com/ebbinghaus-and-the-forgetting-curve/

    For example, if you took a foreign language in high school, it’s likely that being in class five days a week, doing homework and studying for the exams kept the language’s vocabulary words fresh in your mind. However, unless you have continual opportunities to practice speaking that language after high school, it’s likely that you won’t be able to recall words, phrases, and verb conjugations over time — unless you start practicing again.

    With this all in mind, if your goal is to remember something, the Forgetting Curve suggests that daily practice is key. Essentially, it’s “use it or lose it.”
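    The decay-and-reset behavior described above can be made concrete with a toy model. This is a deliberately simplified sketch: the exponential form follows Ebbinghaus, but the stability constant and the "a review restarts the clock" assumption are illustrative only — in reality, each successful review also strengthens the memory and flattens the curve:

    ```python
    import math

    def retention(hours_since_review, stability=24.0):
        """Ebbinghaus-style exponential decay: recall probability falls
        as time since the last retrieval grows. `stability` (hours) is
        an illustrative constant, not a measured value."""
        return math.exp(-hours_since_review / stability)

    def retention_with_reviews(total_hours, review_every=24.0, stability=24.0):
        """Naive 'reset' model: each daily review restarts the decay
        clock, so you never drift far from the last retrieval."""
        hours_since_review = total_hours % review_every
        return retention(hours_since_review, stability)

    # Without practice, recall 60 hours after learning has decayed sharply...
    print(round(retention(60), 3))               # 0.082
    # ...but with a daily review, the curve keeps getting "recharged".
    print(round(retention_with_reviews(60), 3))  # 0.607
    ```

    The contrast between the two numbers is the whole argument of this section in miniature: the same elapsed time, very different recall, depending only on whether the chain of daily retrievals was broken.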

    Start early, finish quickly, practice daily

    Although memorizing material for an exam (or multiple exams) can be intimidating, research on learning has given us a few key guidelines that have consistently demonstrated results:

    1. Start early. The earlier you start studying daily for the exam, the better.
    2. Finish quickly. Cover all of the material you need to remember in your daily session, but keep it short and sweet.
    3. Practice daily. Don’t break the daily studying chain.

    While today’s students may struggle with numerous competing priorities, incorporating these habits into their routines when they do sit down to study is sure to make their sessions much more efficient.

     

    References

    Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132(3), 354–380.

    Ebbinghaus, H. (1964). Memory: A contribution to experimental psychology (H. A. Ruger, C. E. Bussenius, & E. R. Hilgard, Trans.). New York: Dover Publications. (Original work published 1885)

    Nathan, M. J., & Sawyer, R. K. (2014). Foundations of the Learning Sciences. In R. K. Sawyer (Ed.) Cambridge Handbook of The Learning Sciences. New York: Cambridge University Press.

    Pavlik, P. I., & Anderson, J. R. (2005). Practice and forgetting effects on vocabulary memory: An activation-based model of the spacing effect. Cognitive Science, 29(4), 559-586.

    Rohrer, D., Taylor, K., Pashler, H., Wixted, J. T., & Cepeda, N. J. (2005). The effect of overlearning on long-term retention. Applied Cognitive Psychology, 19(3), 361–374.

    Stahl, S. M., Davis, R. L., Kim, D. H., Lowe, N. G., Carlson, R. E., Fountain, K., & Grady, M. M. (2010). Play it Again: The Master Psychopharmacology Program as an Example of Interval Learning in Bite-Sized Portions. CNS Spectrums, 15(8), 491–504.

     

  •

    University increases student access to course materials

    By University of California, Davis

    SUCCESS STORY

    A university saves students $7 million while increasing student access to course materials

    University of California, Davis

    “New students come to campus prepared for everything,” explained Jason Lorgan, executive director of Campus Recreation, Memorial Union, and Stores at the University of California, Davis (UC Davis). “They have a bus pass and a gym pass. All their classes and their dorm room are assigned. Yet the default is that they have no access to their course materials. Something that is core to their education is not automatic.”

    So Lorgan began investigating ways to increase student access to course materials. “As more adaptive learning digital content such as MyLab™ & Mastering™ came out, we started thinking that they could be adapted to a licensing model similar to the one our design students use for Adobe® Photoshop®, versus the textbook model where the default is that you start without access to the content.”