Sunday, October 10, 2010
Technopoly (part 2)
Chapter 6 of Postman's "Technopoly" discusses the ambiguity of everyday life. It claims that "Individual judgments, after all, are notoriously unreliable, filled with ambiguity and plagued by doubt" (93) and arrives at the conclusion that "Machines eliminate complexity, doubt and ambiguity" (93). While these statements are somewhat true, what separates humans from other animals and machines is our ability to REASON. Of course life would be less complicated without having to worry about opposing views or about whether the decision being made is the correct one, but that worry is a very important part of life. Reading this chapter and taking in its message got me wondering whether all people want from technology is to make their lives as easy as possible, even if that means sacrificing having an opinion or a voice.

This chapter deals mostly with technology in medicine and hospitals, and the fact that our country, or any country, would even consider putting such important decisions in the hands of a "cut and dried" computer scares me. Again, life is ambiguous for every single living person on the planet, and that fact cannot be ignored just because a decision comes from a computer that does not have the ability to understand ambiguity. The line "Machines cannot feel and, just as important, cannot understand" (112) perfectly represents the problem with relying solely on computers. Computers are still PROGRAMMED by somebody to perform the same task the same way every time, as long as the details look the same on paper. Postman also claims that "Technopoly wishes to solve, once and for all, the dilemma of subjectivity" (158). My problem with this statement is simple: since when has subjectivity been a problem that needs to be solved? Throughout history there are plenty of great ideas and inventions that would never have been possible without the subjectivity of society. How is the world supposed to grow and evolve without it?

Postman's point that science "cannot tell us when authority is 'legitimate' and when not, or how we must decide, or when it may be right or wrong to obey" (162) sums up the problem with becoming too dependent on technology. A perfect example would be the "twitter for dogs" we heard about in class. If you are a tech-savvy person this invention could be enjoyable, but it should in no way, shape, or form be used as a "babysitter" or a legitimate way of "keeping an eye" on a pet. Although that is not what the invention is made to do right now, I could see developers aiming to make it more reliable and gearing it toward that type of use. I am scared for the future of the world, because machines should not become superior to human reason, and it seems that most of the world is too passive to care.