May 14, 2012

Age defined, then defied

Our culture’s focus on the elasticity of young minds unduly limits adults’ creative capacities.

Ours is a youth-obsessed culture. This is of course no secret, as obvious as pancakes for breakfast. We look constantly to the imagined past, where old man Methuselah enjoyed all 969 years of his advanced age as a revered elder of his tribe, and wonder when the respect for age passed away.

But the worst elements of contemporary attitudes about age are the ones situated in the science of cognition and learning. Article after article appears in major publications detailing how hopelessly limited we're doomed to be as adults if we weren't immersed in a trilingual home from age zero onward. We will never be as mentally acute, as gifted at multitasking or spatial cognition, as the young prodigies coming out of multicultural or well-to-do homes where child-rearing continues to be a matter of "best practices"—x inputs yield y results. Give junior the right toys, the right incentives, and the right auditory and visual stimuli, and she'll be well on the way to maximal productivity and worldly success. The strange thing, of course, is that amid all this emphasis on building the best possible minds, the science of child-rearing offers universal prescriptions while contradictorily promising unique and gifted children.

I have it on fair authority that one of the great perks of being a university academic is being constantly surrounded by younger minds. One view holds that as a person ages, the mind becomes more rigid, less plastic, increasingly a storage unit for information rather than a generator of novel connections. Therefore, it is essential that the successful academic have fresher perspectives in abundance growing at her feet, to stimulate her own thinking, so she can pluck interesting ideas and use her greater encyclopedic knowledge to really pull things together. I don't know how far this metaphor goes in describing the dynamics of the professor-student relationship, but other graduate students have confided that they feel grad-level seminars are sometimes little more than idea mills for their professors. There is of course reciprocity here, as the older mind shares loads of information and guides inquiry in useful directions, but the primary benefit probably accrues to the faculty.

But the worst consequence of contemporary attitudes about the rigidity of the adult brain is the sense of hopelessness they engender for…well, the vast majority of the world populace. Neuro- and behavioral scientific findings continue to paint an ever more deterministic view of human existence—the modern human is slave to her fairly crystallized neural pathways before she is old enough to realize her connection-generating prime has passed. Abused physically or verbally as a child? Suffered severe pre- and post-pubescent bullying? Raised sans one parent? Sorry, you’re broken for life, and any therapeutic successes can only ever mitigate or redirect, but never undo, the subsequent behavioral responses and their objects. I recognize I’m laying out very hyperbolic positions here—ones few would publicly hold when so many questions remain unanswered in these fields—but these are some of the conclusions being drawn by scientists and the reading public well in advance of a fuller picture.

For instance, the notion of being "hardwired" to do this or that is bandied about in everyday parlance and in the media. Behaviors in line with the hardwiring therefore deserve acceptance or exculpation, within convenient and constantly shifting bounds. I don't for a second claim to be able to properly gauge the extent to which we can be said to have free will or control over our behaviors, beyond what we tell ourselves our behavioral identities are (often without bothering to look at our own behaviors to determine whether they are in line with what we broadcast). However, I am constantly confronted with the sense that we cannot truly change our behaviors or perspectives past a certain range of ages—that all we are left with is a daily battle, ever-constant vigilance, and the redirecting of patterns of thought, lest we slip once more into depression, anxiety, alcoholism (name your daily tragedy).

Our culture is in the midst of a monumental struggle of redefinition at the moment (or maybe at all moments, though that’s off-topic). On the one hand, many (most, one hopes) of us are avidly embracing a normativity that accommodates all flavors of humanity; I leave that intentionally vague. On the other, we are presented with scientific arguments that most of the values we seek for our society have few, if any, biological underpinnings. And yet, we seek them all the same. President Obama just came out in favor of gay marriage, and it seems like we are that much closer to being a society that lets people define for themselves how to express their love of one another. Clearly the picture is even more complicated than all the theorists paint it, at least at the collective level.

But in the end, it seems to me that there are two emergent attitudes regarding personal expectations for the lifelong learner. You believe (or try to believe) that you are not increasingly handicapped in your acquisition of languages, knowledge, skills, and experiences, and continue apace as you always have; or, you accept that you are severely constrained, and that it will only get worse—and you either make a go of it anyway or gradually abandon all expectations as you age. There are other ways of looking at it, I'm sure, but these seem to me the most obvious. No matter what pattern of thought you fall into, though, and regardless of whether you chose that pattern, the consequences for your life are very real and will be with you daily.

Christopher Ivan is a graduate student in the MAPSS program.