Posted by aahabershaw
Just read a really interesting, potentially terrifying article in the Boston Globe. The gist of it is this: Social Sciences (Psychology, Sociology, etc.) may, at some point, become active, hard, engineerable sciences. That is to say, you could conceivably engineer a person or society, through some mechanism, to behave predictably and controllably.
Though I can hear all the civil libertarians out there freaking out collectively as they read the above, this revelation should not, ultimately, come as much of a surprise to us. Not only is it not new to science fiction (Asimov’s psychohistory is essentially a version of applicable social science, and Heinlein’s stuff traffics in the like, as well), but it also isn’t new to our own, real world. Take, by way of example, how Target figured out a teen girl was pregnant before her parents did, or any of the myriad other targeted web-ad campaigns that operate, essentially, by predicting human behavior in mechanistic ways. It’s crude at the moment, yes, but it will only get better as time goes on, no doubt.
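For the curious, here is a minimal sketch of what that kind of mechanistic prediction looks like at its crudest: a weighted score over purchase categories with a threshold. Every item name, weight, and number below is invented for illustration; this is a toy, not Target's actual model.

```python
# Toy purchase-pattern scorer (all weights and categories are hypothetical):
# each shopper gets a score from weighted "indicator" purchases, and anyone
# whose score crosses a threshold gets flagged for targeted ads.

INDICATOR_WEIGHTS = {
    "unscented_lotion": 1.5,    # hypothetical weight
    "calcium_supplement": 2.0,
    "large_tote_bag": 0.5,
    "cotton_balls": 1.0,
}

def prediction_score(purchases):
    """Sum the weights of any indicator items in a purchase history."""
    return sum(INDICATOR_WEIGHTS.get(item, 0.0) for item in purchases)

def flag_shoppers(histories, threshold=3.0):
    """Return the IDs of shoppers whose score meets the threshold."""
    return [shopper_id for shopper_id, items in histories.items()
            if prediction_score(items) >= threshold]

histories = {
    "A": ["unscented_lotion", "calcium_supplement", "cotton_balls"],  # 4.5
    "B": ["large_tote_bag"],                                          # 0.5
}
print(flag_shoppers(histories))  # ['A']
```

The real systems replace the hand-picked weights with weights learned from millions of shoppers, but the skeleton is the same: observed behavior in, predicted behavior out.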
All of us with any degree of EQ or empathy can finesse our fellow humans into behaving in predictable patterns. The reason is, essentially, that we humans aren’t all that spontaneous, on average. True spontaneity isn’t actual randomness – humans aren’t really capable of that, or at least functional ones aren’t – so much as it is the observation of an individual operating along unfamiliar personality parameters. If you can refine your equations and perfect your methods, you can account for the one girl in the office who will occasionally show up riding a unicycle or give presents made out of moose dung.
Is this difficult? Gods, yes! Is it impossible? Probably not. Is it depressing? Maybe.
It all depends on what is done with it. These mechanisms, by their very design, won’t be able to get us to behave against our nature. In that sense, we won’t be doing things we wouldn’t do otherwise in a similar situation, and so being ‘controlled’ is perhaps not the right word. Managed? Manipulated? Perhaps. I don’t know if this is good or bad, on the whole, but if we could manage to ‘manipulate’ the world into being nice and working together through feats of social engineering, is that so awful? I mean, presuming we could do it without burning/killing/oppressing anyone?
In any event, the science fiction potential of this kind of stuff is immense. I confess that theories like this are already spread throughout my writing in both the fantasy and scifi genres, and this article simply spurs me further in that direction. It terrifies as well as fascinates; it gives me hope and steals it away in the same breath. That right there, folks, is the place where great science fiction is born.
Or at least I hope so…