Internet of Children's Things
A.A. Milne famously ended his classic children’s novels with an entreaty from Christopher Robin, a little boy, to his beloved stuffed animal, Winnie-the-Pooh, to “‘promise you won’t forget about me, ever. Not even when I’m a hundred.’” Today’s real toys may have powers of memory for information about their young human counterparts that far exceed those of Milne’s fictional bear, whose head was stuffed with fluff but who came to life in Christopher Robin’s imagination. According to a new white paper from the Future of Privacy Forum and the Family Online Safety Institute, “Kids and the Connected Home: Privacy in the Age of Connected Dolls, Talking Dinosaurs, and Battling Robots,” digital toys are creating new opportunities for children to play and learn. However, these Internet of Things devices marketed to kids also raise important “social and legal implications,” including whether the “Children’s Online Privacy Protection Act (COPPA) appl[ies] to connected toys” and what measures are necessary to “ensure that connected toys are sufficiently secure” to safeguard user privacy. Some of the report’s key recommendations for addressing these and related issues include that vendors “invest in developing creative and intuitive ways to alert children and parents when data is being collected or transmitted” and also invest “serious thought into the user interface and granularity of [parental] consent options” for choosing how children’s data is gathered and used. These and similar recommendations are quite timely, as consumers and consumer protection institutions in multiple countries are raising concerns about wired toys and children’s data privacy.
Happenings. The Future of Privacy Forum released its second annual parent survey on student privacy—“Beyond One Classroom: Parental Support for Technology and Data Use in Schools”—with important findings, including that “[a]lmost eighty percent of parents are now using school-related technology to keep up with their child’s educational progress, and ninety percent of children are using technology provided or recommended by their school.” Data Quality Campaign released a new analysis of the accessibility of state-level school data for each state and concluded that “while all states create annual aggregate report cards for the public with important data about how their students statewide are doing, these reports are often difficult to find and understand.” Recent findings from Common Sense Media reveal that “a significant number of [ed tech] vendors do not provide even basic support for encryption [of data].” The Harvard Law Review Forum launched a new Commentary Series on Law, Privacy, and Technology that aims to foster greater dialogue between legal academics and other stakeholders in the privacy and technology spaces; inaugural contributions include “Recoding Privacy Law: Reflections on the Future Relationship Among Law, Technology, and Privacy” by Professor Urs Gasser, Executive Director of the Berkman Klein Center for Internet & Society at Harvard University, and “Protecting One’s Own Privacy in a Big Data Economy” by Professor Anita L. Allen of University of Pennsylvania Law School. A Forbes contributing writer argues that “[f]or ethical and legal reasons, edtech entrepreneurs must understand student data privacy laws and related privacy issues.” A recent webinar sponsored by the EDUCAUSE Cybersecurity Initiative and ECAR Working Groups focused on the University of California Learning Data Privacy Principles and Recommended Practices.
The ACLU expressed concern over the Uniform Law Commission’s proposed legislation for protecting the privacy of social media accounts; the causes for concern include that the ULC bill only covers “students ‘at the post-secondary level,’” leaving many students without privacy protection.
The University of California system announced that it will protect data related to the privacy and civil rights of undocumented members of its community. In an op-ed piece for the New York Times, University of California President Janet Napolitano challenged the rhetoric around immigrants as posing a security threat to the United States and described the Deferred Action for Childhood Arrivals (DACA) program, explaining that those covered often came to the U.S. as children and demonstrate strong contributions to their communities.
Happenings. Google removed suggestions such as “are Jews evil” and “are women evil” from the auto-complete function of its search. A Google spokesperson reported: “Our search results are a reflection of the content across the web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query.” A few articles reported user confusion that these results reflected Google’s opinion, demonstrating a gap in digital literacy about how the auto-complete suggestions are generated. Cathy O’Neil, blogger and author of Weapons of Math Destruction, stated: “‘This is the end for Google pretending to be a neutral platform . . . It clearly has a terrible problem here and it has to own and acknowledge that.’”
Groups tracking hate crimes report an escalation of reported incidents since the election. A Brigham Young University undergraduate, Johnisha Demease-Williams, interviewed black students about their experiences of prejudice and misconceptions around race, creating a short film to raise awareness of the challenges and difficulties they face. Guidance from the Southern Poverty Law Center suggests that the kinds of data collected about students (e.g., disciplinary records and who is writing the referrals) can reveal patterns of inequality. In “I’m More Than An Other,” actress Meghan Markle reflects on being required to fill in an ethnicity box in a school survey in seventh grade and the impactful discussions and insights that resulted. Professor Nishant Shah describes what a pedagogy of care “can mean in the face of the hard mechanics of assessment, evaluation and enumeration within the classroom.” John Palfrey, Head of School at Phillips Academy, Andover, and a Director of the Berkman Klein Center for Internet & Society at Harvard University, explores the paradox of tolerance and its complex and challenging implications.
Tips for Promoting Digital Literacy for Children & Teenagers in the “Post-Truth” World
The conversation continues around how best to educate and parent digitally literate kids and teens in the so-called “post-truth” era. “In this post-truth universe,” Ruth Marcus writes in the Washington Post, “institutions—news media, the intelligence community—are drained of all credibility … [w]ith facts passé, the next, inexorable move is to reduce all news to the same level of distrust and disbelief. If nothing is true, then everything can be false” (emphasis added). In this environment, Sierra Filucci of Common Sense Media recommends that parents teach their children to ask themselves basic questions when they encounter a piece of media, including: “Who made this? Who is the target audience? Who paid for this? Or, who gets paid if you click on this?” Additionally, she highlights several ways to help recognize fake news stories, such as “unusual URLs … [or] signs of low quality, such as words in all caps, headlines with glaring grammatical errors, bold claims with no sources, and sensationalist images.”
However, when fake news stories are propagated by individuals with power and a visible platform for public dialogue—such as President-elect Trump’s pick for National Security Advisor, Retired General Michael Flynn—disentangling truth from fiction may become even more challenging for young people. In a recent article for the Washington Post, Valerie Strauss shares a framework from Mica Pollock—Professor of Education Studies and Director of the Center for Research on Educational Equity, Assessment, and Teaching Excellence (CREATE) at the University of California, San Diego—for promoting digital literacy and other values of digital citizenship in the classroom. Professor Pollock urges teachers “to reclaim the heart of an educator’s work and engage the facts” (bold in original). She observes that “Trump’s talk [in the campaign] was often about dividing and devaluing people, and he has often disregarded actual facts. But educators need to trade in facts and solid data.” Indeed, trading in “facts and solid data” is mission-critical for all professions, not just education; notably, this week Interior Secretary Sally Jewell urged researchers “to speak up and to talk about the importance of scientific integrity, and if they see that being undermined to say something about it.”
Happenings. NPR recently convened roughly 20 parents to discuss screen time and related issues of digital domestic life; NPR reporters observed that “[w]e don’t think there’s ever been a generation of parents that’s been so thoughtful about what they’re doing [in this area], but also sometimes nervous and whipsawed by confusion and by different sources.” Cartoon Network has appointed a slate of advisors to its newly minted Science, Technology, Engineering, Arts, and Math (STEAM) advisory board. A new course from David Sohn, Senior Counsel for the Center for Democracy & Technology, on the topic of copyright for educators is now available online, “cover[ing] the basics of copyright and fair use, and our rights and responsibilities as users and creators of creative works.” Digital Promise, a “national center created by Congress with bipartisan support to advance technologies that can transform teaching and learning,” has released an ed tech pilot framework that “provides an eight-step blueprint for district leaders to plan and lead effective pilots, along with actionable, research-based tools and resources.”