For purposes of this discussion, let’s separate the “quantified self” from the “quantified child.” The array of wearable devices associated with the quantified-self movement involves a user’s conscious, informed choice to collect and analyze data that can result in changed behaviors or habits – a wonderful and practical use of technology. The quantified child, however, gives up data without knowledge or understanding, much less consent; often the goal is to “normalize” individual behaviors based on aggregated data collected from other children. Sometimes the child is simply being tracked or monitored for the purpose of intervening with behaviors deemed risky, or to monitor health conditions – worthy goals.
Today’s quantified child offers ethical and practical dilemmas that are worth considering. Privacy issues are baked into the equation (yes, children have a right to a certain amount of privacy), while the value of pressuring kids with targets and goals derived from normalized biases is questionable. There’s no conspiracy to undermine our freedom, no Big Brother with a sinister streak; we’re doing this to ourselves – or, to be more precise, to our children.
The first generation has now been born that will be digitally monitored – quantified – from cradle to grave, with behavioral standards established through aggregated data, a crowdsourcing approach to determining how one should live his or her life. Aggregated data results in averaged data, rife with biases that don’t take the individual into account. While it may take a village to raise a child, the metaphor falls apart when the “village” is a hands-off algorithm that determines standards by crunching terabytes of data. When Marshall McLuhan coined the term “global village” thirty years before the Internet, he envisioned a beneficial and benign ability to gather information at lightning speed and share it as enhanced media for the good of all. The global village is indeed here – but it is ethically challenging to contain this marvelous phenomenon in a way that is truly beneficial and benign.
Before getting into those ethical questions, it’s worth thinking about how a child handles data, personal or otherwise – their cognitive capacity to interpret data points versus taking in sensory experiences. Ben Williamson, a widely published researcher and lecturer in education, writes that “algorithmic calculations
There are many examples of digital experiences that benefit a child’s development and self-image, such as game-based applications that encourage children to think about their health and behaviors from a very young age. For example, Fitter Critters offers a “creature-based” approach built around a collection of virtual pets; feeding and caring for these critters while comparing their routines to the child’s own habits provides sensory experiences that can translate into healthy personal habits, in a fun environment in which the child is fully aware and invested. In the offline world, Sesame Street has been doing this for decades. Gamification that involves self-improvement or self-tracking lets a child do what children do best: use their imaginations as they learn and adjust to the world. Jennifer Whitson, Assistant Professor of Management at the University of Texas at Austin, refers to such gamification strategies as “pleasurable surveillance” – a kind of self-monitoring and self-policing done voluntarily and for fun.
But data collection without a child’s conscious involvement is a different matter. The quantified child starts out as the quantified baby. Numerous websites offer scores of devices and applications that monitor babies as they sleep, play, or feed. Some are tremendously beneficial in that they monitor biological or neurological functions associated with disease and infant nutrition. Others monitor daily routines: “smart diapers” analyze urine to identify health concerns, and “smart” baby clothing uses sleep algorithms to track temperature and respiratory patterns. While some of these devices play into the fears of helicopter parents – parental insecurity is a marketing opportunity – these infant diagnostic tools don’t present the ethical questions that begin when a child enters preschool or kindergarten. That’s when quantification begins in earnest.
Our children are the most data-mined and quantified students in history. Tracking a student’s every move can begin when they step on the school bus and end when they close their laptops or retire their smartphones at night. This obsession with quantification has little to do with the child’s self-knowledge or self-improvement – at school it’s designed to “normalize” a child’s behavior based on the aggregated behavior patterns of other children, and at home it tracks their patterns for marketing purposes. In some instances, healthy living is the goal, but as we’ll see in a few paragraphs even that worthy goal can be misguided.
As data from a child’s daily routine is gathered, it can be used to either trigger rewards for certain actions or negative consequences for others – a role previously reserved for parents and teachers who “gathered data” in an entirely different way. Technology is very useful when it uses algorithms to take away the element of human variation, setting in motion a range of responses that aren’t tied to emotions. But it’s not so useful in interpreting the range of emotions connected with certain behavior patterns, especially behaviors that a child may not understand are tied to those very emotions. That requires uniquely human qualities: judgment, empathy and compassion.
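The mechanics of such a system are simple enough to sketch. The following toy rules engine is purely illustrative – the metric names, thresholds, and responses are invented for this example, not drawn from any real product – but it shows how uniformly and context-blindly a threshold-based system converts a day of tracked data into automatic “consequences”:

```python
# Toy illustration only: metric names, thresholds, and responses are invented.
# Each rule fires on a raw number, with no room for context or judgment.

RULES = [
    # (metric, threshold, comparison, automatic response)
    ("minutes_active", 60, "gte", "award_activity_badge"),
    ("off_task_alerts", 3, "gte", "notify_teacher"),
    ("vegetables_scanned", 1, "lt", "flag_lunch_account"),
]

def evaluate(day_log: dict) -> list[str]:
    """Apply every rule to one child's day of data; missing metrics count as zero."""
    responses = []
    for metric, threshold, cmp, response in RULES:
        value = day_log.get(metric, 0)
        hit = value >= threshold if cmp == "gte" else value < threshold
        if hit:
            responses.append(response)
    return responses

# A restless but active day triggers reward and punishment alike.
print(evaluate({"minutes_active": 75, "off_task_alerts": 4}))
```

Note what the sketch cannot represent: whether the “off-task” child was bored, anxious, or helping a classmate. The rule fires either way.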
The rapid growth of quantification technology over the past few years is astonishing. Gary Wolf, a leader in the quantified-self movement, says that the quantification explosion came about through the convergence of four developments: the miniaturization and automation of electronic sensors; the ubiquity and power of mobile computing devices; the normalization of sharing through social media; and the emergence of cloud connectivity. This cloud connectivity, which Wolf refers to as “global superintelligence,” is in some respects what Marshall McLuhan envisioned as the global village. But Wolf’s superintelligence is artificial intelligence, lacking the human capacity to adapt and change based on emotion and compassion.
Some technologies being perfected for the “biometric classroom” are truly mind-boggling in their scope, while others are more mechanical and already in use all over the country. On the mind-boggling side of the scale, software called EngageSense has been developed to track students’ eye movements, conversations and facial expressions in order to measure “the highs and lows of student engagement.” EngageSense monitors eye movements to tell whether a child is looking up at the teacher (engagement) or down at the desk (lack of interest); it can also detect whether a child is interacting with other students when he’s supposed to be listening to the teacher (usually referred to as “fooling around”). “Using standard computer hardware and vision algorithms, EngageSense can automatically generate reports over days, weeks, semesters and across subjects to better understand what piques student engagement,” the developer’s website says. So much for the concept that many students learn differently – and classroom note passing might quickly become a lost art.
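EngageSense’s actual algorithms are proprietary; the sketch below only illustrates, with invented gaze labels, how per-second classifications might be reduced to a single “engagement” number for those automated reports – and why that reduction is lossy. A child looking down may be taking notes, not tuning out:

```python
# Toy sketch, not EngageSense's real method: reduce per-second gaze labels
# to one "engagement" fraction. The label categories are invented.

from collections import Counter

ENGAGED = {"teacher", "board"}           # gaze targets counted as engagement
DISENGAGED = {"desk", "peer", "window"}  # counted as lack of interest

def engagement_score(gaze_samples: list[str]) -> float:
    """Fraction of samples classified as engaged; ignores WHY a child looks away."""
    counts = Counter(gaze_samples)
    engaged = sum(counts[t] for t in ENGAGED)
    total = len(gaze_samples)
    return engaged / total if total else 0.0

# 100 seconds of class: note-taking at the desk drags the "score" down.
samples = ["teacher"] * 40 + ["desk"] * 30 + ["peer"] * 10 + ["board"] * 20
print(round(engagement_score(samples), 2))  # 0.6
```

The single number looks objective, but every judgment was made upstream, when someone decided which gaze targets count as “engaged.”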
The Bill and Melinda Gates Foundation is helping fund the development of sensor bracelets that could track a student’s engagement level. The bracelet passes a low-level current across the skin and measures changes in conductance that track how the nervous system responds to stimuli. This isn’t futuristic fantasy; similar bracelet technology has already been used to gauge consumers’ responses to advertising.
Furthermore, predictive modeling has made its way into our schools; an educational movement now holds that probability is destiny when it comes to learning. More than fifty colleges are using sophisticated predictive analytics software to direct a student to those courses in which he or she would most likely succeed, and to determine a student’s chance of graduating. Education technology companies have developed software that assesses a student’s past academic performance, including high school, and analyzes how students have fared in a class over time. Algorithms produce “normalized” models aimed at higher graduation rates for the college or university. High schools, with matriculation issues of their own, are paying close attention to this technology. However, as Melissa Korn writes in The Wall Street Journal, not all educators think this quantitative approach to education is a good idea – those educators are most likely concerned with the quality of the learning experience. “Some administrators warn that predictive analytics aren’t a silver bullet and that even well-intentioned academic advisors could misuse the information and guide students toward easy options,” The Wall Street Journal reports. “This might improve a school’s graduation rates, they say, but it would also leave students with little opportunity to experiment or push themselves academically. ‘Anytime you intervene, you set up incentives… to just get people through [the system],’ says Gary Rhoades, director of the Center for the Study of Higher Education at the University of Arizona. ‘That’s a different incentive than challenging people to take on new knowledge and stretch themselves.’”
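The “easy options” incentive Rhoades describes falls directly out of the math. In the hypothetical sketch below, the features, weights, and cutoff are all invented – real systems are trained on institutional records – but the shape of the output is the same: a probability of success that an advising rule can turn into a filter, quietly steering a capable student away from the harder course:

```python
# Hypothetical sketch of the predictive-analytics approach described above.
# Weights and features are invented; prior GPA helps, course difficulty hurts.

import math

W_GPA, W_DIFFICULTY, BIAS = 1.8, -1.2, -1.0

def p_success(prior_gpa: float, course_difficulty: float) -> float:
    """Logistic model: squash a weighted score into a 0-1 probability."""
    z = W_GPA * prior_gpa + W_DIFFICULTY * course_difficulty + BIAS
    return 1 / (1 + math.exp(-z))

def advise(prior_gpa: float, courses: list, cutoff: float = 0.7) -> list:
    """Recommend only courses above the cutoff: the 'easy options' filter."""
    return [name for name, diff in courses if p_success(prior_gpa, diff) >= cutoff]

courses = [("Intro Stats", 1.0), ("Organic Chemistry", 3.0)]
# A strong student (GPA 3.0) is steered away from the challenging course.
print(advise(3.0, courses))  # ['Intro Stats']
```

Nothing in the model is malicious; the steering is simply what happens when a probability estimate is wired to a hard cutoff chosen to maximize completion rates.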
The biometric classroom is already a reality in more mundane ways. Schools all across the country are using biometric fingerprint and palm scanning for lunch payment systems. In some instances the data collected through these systems has been used to enforce eating habits that a school has deemed healthy (yes, the same schools that follow federal laws qualifying ketchup and relish as vegetables). How this data is used is up to individuals who may or may not have the credentials or common sense to put it to good use.
But the quantified child experience can begin before the child even arrives in the classroom. Last year a Florida school district implemented iris-scanning technology in a “student safety pilot program.” In a letter to parents, the school said that the technology would “identify when and where a student gets on the bus, when they arrive at their school location, when and what bus the student boards [in the morning] and disembarks in the afternoon.” Other school districts use RFID (Radio Frequency ID) technology embedded in school badges to monitor students’ comings and goings in the classroom and at school-related functions. This technology is very useful in high-stakes security operations, but rather creepy when used to track young children.
When the quantified child gets home after a day of being monitored at school, he or she can log onto a computer or smartphone and provide valuable information to any number of commercial operations. The child’s exact location is known, and their online activities are tracked, scored, and remembered. Adding to the problem are the average child’s disregard for online privacy and the compulsion to share one’s life with a very loose network of friends. School systems from North Carolina to Southern California are now monitoring students’ public postings on social media to ferret out threats of violence, drug use, bullying, and indications of suicidal behavior. These may be worthy goals on the surface, but they become questionable when you consider who does the monitoring, whether they’re qualified to do so, and what other authorities or institutions will ultimately see the information.
Increasingly, we don’t even think about the constant surveillance and quantification of our daily lives – in our homes, our cars, the workplace, and schools. When the algorithmic life begins at birth, it’s easy to envision individualism being undermined as we normalize – homogenize – behaviors and attitudes. Much of it is for our own good, or the good of society. But children conditioned to being digitally quantified over a lifetime could develop a collective mindset that goes against the grain of human nature. We need to take care in choosing the technology we’re willing to live with.