I have noticed a couple of concerning trends in YA fiction. While I do not proclaim myself to be an expert, what little I have read suggests that these messages frequently appear in YA fiction, and I wonder if that’s really what we want to be telling our young adults.
Young adults are adults…just young. Very little seems to differ between what happens to adults in adult fiction and what happens to the young adults in YA fiction. They fall in love, break up, get drunk, have sex (or at least think about it as a possibility), and save their world. This makes me wonder: what are we telling our young adults about adulthood? If they can do it all now, why should they grow up? Is growing up just growing older and more experienced, or is there actually a difference between a young adult and an adult?
There used to be a difference. Young adults were trained to become adults, to take their place in an adult world. They were treated as not yet “in” society, and they looked to adulthood as their initiation into the world where they would fall in love, get married, and take their places in society, participating in the government and having a career and family. Now, YA characters do almost all these things—all while in high school.
That violence and/or revolution is the only way to change things. The YA characters typically live in a dystopia, where everything is bad and becomes worse over the course of the story. (There is almost an expectation, as a reader, that things will get worse before they get better, because that’s how things always are.) Generally, to rescue their world, the characters change the system from the outside in. What I don’t see are YA characters who get involved in the system and change it from the inside out.
But what is this saying to our young adults? That problems in society can only be fixed by drastic upheaval, by violence, by revolution and bloodshed? What happened to changing things by voting or by being active in the government that does exist, rather than throwing it out?
What do you think? Are there YA books I’m just missing? Do books like The Hunger Games, Divergent, and others have a positive message for young adults? Or are they just saying something about the current state of things, rather than offering any kind of recommendation on how the readers themselves should act?
Copyright 2014 Andrea Lundgren