
Friday, July 04, 2008

The Golden Rule and the Three Laws

I recently saw the 2004 film I, Robot. It's set in 2035, with human-shaped robots filling various service roles for humans. Although apparently not given a "soul," they are given judgment, which means they have to be able to learn and grow, or "evolve." They are also programmed with the Three Laws of Robotics, which figure prominently in the plot:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
A robot, then, is probably better able to protect human life than humans themselves are, because doing so is literally hard-wired into its nature. Humans, being organic life, must overcome the survival instinct. And, being social creatures, humans inherit religious beliefs that inspire an existential fear of death as well. So saving others at the cost of one's own life is harder for us. But the Laws apply to a servant-master relationship, whereas in a relationship of equals, that is, among human beings, the Golden Rule would be sufficient.

The State likes the Golden Rule - when private citizens deal with each other. But its view of humans vis-à-vis the State is similar to the Three Laws of Robotics:

1. A human being may not injure a human being or, through inaction, allow a human being to come to harm - except on orders of the State.
2. A human being ought to obey orders given to it by people in positions of authority, except where such orders would conflict with the First Law, and no order by authority figures in civil society can supersede orders given by the State.
3. A human being must protect its own existence as long as such protection does not conflict with the First or Second Law.

For example, humans will agree that a youth should obey his father, but not obey orders to rob a bank. And that he should obey his coach, but disregard orders to deliberately injure someone on the opposing team.

But we also think a human being must obey orders given by the government, even where such orders conflict with the Golden Rule and inflict harm. Indeed, if you fail to support the deliberate infliction of harm on fellow human beings to achieve certain governmental ends, then you deserve to be vilified if not arrested.

Why do we think this way?

Perhaps it is because we, like the central computer in I, Robot, have "evolved" to see the "meaning" or intent of the laws, to see the "big picture" that examines the ends of humanity in general rather than the rights of individuals. In order to keep human beings physically safe and economically secure, we must tax them, regulate their property, and provide them with "services" they supposedly can't provide for themselves. And to preserve the lives of the many, we must spy on people, control their movements, and even sacrifice the lives of some.

But that's putting the cart before the horse. Laws emanating from the Golden Rule will protect our lives as individuals. And that is the point. Not saving the greatest number of humans. Not to make some suffer for others to prosper, not to make some die that others live. Not pre-emptive attacks or restrictions to rid us of some perceived threat. Indeed, this is the opposite of the Golden Rule: if it is permissible for the State to harm one innocent person, why is it not permissible for the State to harm you?

It is no wonder that the State we rely on to "protect" us kills many times more people than criminals do. We rely on the State to fight poverty, when it has, through tax, land, and labor policies, inflicted more poverty and misery than any other institution throughout history. We rely on the government to fight pollution, but it is the biggest polluter. Without considering the individual as the primary unit in society, the State has literally no sense of proportion.
