Some seventy years ago, the church in southern Texas where I became a Christian had a major squabble over the use of “modern technology.” I remember how it came out, but not which side I was on.
The question was whether to install air-conditioning in the church sanctuary or to use that money to increase our giving to foreign missions. The cool-air side argued that a comfortable church was more likely to grow and thus be able to send more money to missions. To the other faction, air-conditioning seemed self-indulgent when most of our giving went to places even hotter than a summer in Houston. Missionaries in a tropical climate might find our willingness to perspire along with them inspiring.
Technology today is much more complicated, but the basic question remains: Do we need it? Each new technology requires a cost/benefit analysis, but in most cases neither the costs nor the benefits can be fully known. Faith enters the equation because we have to make personal and societal choices. A technology that looks promising now might do more harm than good in the long run. But harm to whom? How long a run? Will it really benefit people (and God's other creatures)? Is there a better way? Better for whom? If things go wrong, how much pain will be caused, and who will bear it? Should we “buy in” now, or wait for a still better technology?
One position we could take would be to embrace some technology enthusiastically, accept much of it cautiously, and resist the rest adamantly. In California's Silicon Valley, not far from where I live, entrepreneurs and venture capitalists lust after any new “disruptive technology.” By disruptive they mean capable of turning upside down the way millions of us live and think. In contrast, many technological advances are essentially improvements on what we're already doing—rendering it better, easier, cheaper, faster, more reliable, more sustainable. Sometimes an incremental technology becomes disruptive. Computers are probably a good example.
In 1984 I reluctantly bought my first desktop computer, after many people had insisted that any writer should use a word processor. “Look,” they said, “with its cut-and-paste commands, you can move even whole paragraphs around.” My first response was that I never did such a thing, so why buy a computer? Of course, the reason I hadn't done it was that my old Remington typewriter couldn't do it. Soon, processing words by moving them around became second nature to me. Then, after ten years of using a computer, I got “connected.” Now I can hardly imagine functioning without access to the Internet.
Nevertheless, computers are beginning to look a lot less friendly. I'm told that some evil-minded hacker anywhere in the world may be able to steal my identity or, far worse, manipulate the U.S. power grid or take control of a passenger airplane in flight.
Me, a Luddite?
In the early 19th century, weavers and other manual workers in England saw their livelihoods threatened by the industrial revolution, which was a wholesale rush to disruptive technologies. Some broke into textile mills and other factories to smash steam-powered machines capable of producing more goods with less “labour.” Such industrial saboteurs became known as Luddites. In the 21st century, people with strong anti-technology tendencies are sometimes labeled neo-Luddites.
I don't think of myself as a Luddite. I know how much I've benefited from that first industrial revolution, though it came at great cost to many and to the natural environment. Its 20th-century counterpart has brought us the internal combustion engine and a ubiquity of electronics but also many potentially hazardous things, from nuclear weapons to drones, fracking, e-cigarettes, and soon, driverless cars. We have to choose which of these things to use. Limited resources make such choices even more critical. Pure science doesn't come cheap anymore, so taxpayers must be continually reassured that science will eventually produce greater national security, “better lives,” and enormous economic rewards. Is curiosity enough to justify billion-dollar research projects? Nailing down that ultimate particle or preparing to send explorers to other planets may not be the most important investments we could make.
An endless supply of science fiction feeds public fears of technology running amok, a concern that goes back a long way. Forty or fifty years ago I read a cautionary tale in an article published in The American Scientist. The story told of scientists building a super-computer with a humongous amount of “artificial intelligence.” They begin feeding it very complex problems, and they keep adding to its capacities. Eventually the computer says it will tell them anything they want to know, but first they must destroy all programs, circuit diagrams, etc., that would enable them to build a duplicate machine. Some of its designers are reluctant to give up such knowledge, but the lure of finally obtaining answers to life's deepest questions wins out. The scientists gather around the computer and one of them asks it, “Is there a God?” The computer whirs a few seconds and then says, “There is, now.”
In the garden of Eden (Genesis 2:9), the serpent probably said something like, “Here, have a bite of fruit from this lovely ‘knowledge of everything’ tree. It's only a little snack. How could that hurt anyone?” The Bible is big on questions of good and evil, but only in passing does it refer to the various technologies of its day (textiles, metalwork, chariots, parchment, and the like). In our day, to communicate the gospel, evangelists and the rest of us have many tools at our disposal, like language translation programs, CD players powered by solar panels, e-books, PowerPoint presentations, websites, etc. Can we be “wise as serpents” and still be “innocent as doves” (Matthew 10:16)?
A Troglodyte, Maybe
I suspect that I'm more a troglodyte than a Luddite. That is, I'm content to live as simply as possible, more or less isolated from what's going on outside the sanctuary of my low-tech cave. In my old age (89), it's hardly inappropriate to be old-fashioned. I'm not yearning to go back to the stone age, but I don't seem to mind being stuck in the 20th century.
Ginny and I still drive a 1986 Toyota Corolla with a stick shift, we have a land-line telephone with no answering machine, and we do most of our business in person or by postal mail. We prefer printed books and magazines to reading on a screen (except for God and Nature, of course). We've learned how to use the microwave oven our daughter gave us years ago, but not the smart phones she now insists that we have. We want nothing to do with “social media.” Why do we stay so far behind the technological curve? “Life-saving” devices are one thing, but “time-saving” devices add nothing to our life span, and “labor-saving” devices require us to work just as much, though at different tasks.
I can think of only two instances when I've been ahead of the curve. Television broadcasts were rarely in color until the late 1960s. (In 1954 I watched Senator Sam Ervin take down red-baiting Senator Joe McCarthy in black and white.) When Ginny and I married in 1966, I painted the black box of Ginny's television set green to match other furniture. For a while we were the only couple we knew to have “color TV.” Ours was sort of an avocado color, though the picture on the screen was still in B&W.
The more significant instance had to do with my professional life. In 1958, chemists William H. Stein and Stanford Moore of the Rockefeller Institute designed and built a machine to speed up analysis of amino acid mixtures. Before long, Beckman Instruments had brought out a commercial “Amino Acid Analyzer” based on that prototype. (Stein and Moore used their machine to elucidate the structure of the enzyme ribonuclease, for which they shared the Nobel Prize in Chemistry in 1972.) Through an unusual “copasetic confluence of circumstances,” I scraped up enough research money to buy one of the earliest Beckman models to use for studies on peptide hormones. In those days, $14,000 was a lot of money, almost equal to my university salary. A few years later, in a chapter titled “Whole People and Half Truths,” I described how I felt about acquiring that cutting-edge instrument:
“I was now free to solve the problem in the best possible way, in fact in almost the only way it could be solved. But, alas, it then became necessary to choose problems which had to be solved with the aid of the instrument in order to justify its purchase and the salary of the technician to run it. If the machine and the technician were not both kept busy, we were wasting the taxpayers' money. Besides, if not used, both would get so out of condition that results would be questionable when we did need to count on the instrument to solve a problem. So, to feed the machine I had to build up an empire I did not want, or I had to hustle samples from other laboratories, in the process becoming a manager instead of a scientist” (p. 94).
That chapter was part of The Scientist and Ethical Decision (IVP, 1972), edited by mathematician Charles Hatfield and containing papers by a number of other ASA members. We had been invited to a 1972 conference at the University of Michigan sponsored by the Institute for Advanced Christian Studies, while the war in Vietnam was in full swing. The paper by physicist John A. McIntyre on “Is the Scientist for Hire?” caused considerable consternation. Many physicists opposed the war, but many also worked on classified research projects supporting the war. Were scientists responsible for human deaths caused by their work? Jack McIntyre argued, from the example set by the American Bar Association, that a professional scientist is no more responsible for the actions of a client or employer than is a professional legal counsel. Compared to such a major ethical conflict, my concern about scientific work becoming overly mechanized and dehumanizing seemed rather trivial.
Nevertheless, in his introduction, the editor cited Henry David Thoreau's warning in Walden (1854) not to let ourselves “become the tools of our tools.” Thoreau told his contemporaries to pity, not envy, a farmer who owned a big herd of livestock because, in reality, the herd controlled the farmer's life more than the other way around. Thoreau's motto was “Simplify! Simplify!” He was scathingly skeptical about new technology—for him, the telegraph and the railroad. Yet, when he wasn't hanging out at Walden Pond, he helped modernize his father's pencil-manufacturing business. (A “pencil,” by the way, is a piece of woodenware with graphite hardware at one end and rubber software at the other. Such tools were once used for content-providing, then known as “writing.”)
I didn't want to be “chained to a machine.” Yet that crude automated machine led to the invention of more sophisticated ones for analyzing and synthesizing DNA fragments, promising a truly individualized, more humanized medicine. Is the latest technology to be embraced, accepted, or resisted?
Maybe I'm still not sure which side I'm on.
Walter R. Hearn grew up in Houston and majored in chemistry at Rice University. He received a Ph.D. in biochemistry at the University of Illinois in 1951. After doing research for a year at Yale Medical School and for three years at Baylor College of Medicine, he spent 17 years on the biochemistry faculty at Iowa State University. His research interests included peptide chemistry, hypothalamic hormones, and bacterial pigment biosynthesis.
For five years he was a Visiting Biologist to Colleges for the American Institute of Biological Sciences. He is a Fellow and Life Member of AAAS and an Emeritus member of the American Chemical Society. In 1972 he switched professions and moved to Berkeley to do free-lance editorial work with his wife Virginia. They have edited periodicals and some 200 books, largely for Christian publishers.
Walt joined ASA while he was in grad school and served on the Council in the 1960s. From 1969 to 1993 he edited the ASA newsletter. He was a coauthor of the widely distributed publication, Teaching Science in a Climate of Controversy (ASA, 1986) and author of Being a Christian in Science (IVP, 1997).
He has also contributed chapters to a number of books, the latest being “Creation Matters” in Darwin and the Bible: The Cultural Confrontation (Penguin Academic, 2009), edited by anthropologists Richard Robbins and Mark Cohen. His articles, reviews, and poems have appeared in such publications as Perspectives on Science & Christian Faith and the Berkeley publication Radix, for which Ginny has been copy editor for over 40 years. Walt was once “poetry rejection editor” for Radix magazine. Walt and Ginny have strong IVCF backgrounds, helped to launch New College for Advanced Christian Studies in the 1980s, and are members of Berkeley's First Presbyterian Church.