Scientists challenge “scientism”: two perspectives (Pt. 1)

#1: Famed Yale computer science professor quits believing Darwin’s theories | The College Fix

The origin of species is exactly what Darwin cannot explain.

Source: Famed Yale computer science professor quits believing Darwin’s theories | The College Fix

Science is one of the great gifts God has given us: intellect, rationality, powers of observation, perception, and deduction, and the creation and use of technology to gain understanding about the world – and cosmos – in which we live. I am grateful for it; and so, I believe, should we all be!

Science, properly understood and utilized, should be a vehicle toward the greater glorification of God, as we come to understand more and more the glory and grandeur of Creation, and give praise to its Creator.

Scientism, in contrast – defined by Merriam-Webster as “an exaggerated trust in the efficacy of the methods of natural science applied to all areas of investigation (as in philosophy, the social sciences, and the humanities)” – is inward-looking, hubris-laden, and ultimately, closed-minded… however much its proponents think otherwise.

(For more on the problems and limitations of scientism, see this excellent post – “The Problem with Scientism” – by Massimo Pigliucci, on APA Online, the blog of the American Philosophical Association.)

I believe it is important to note that I am not a “young-Earth creationist,” nor do I find it necessary to my Christian faith to believe that all we see here and now, in this incredibly vast cosmos, or the incredibly complex web of life here on Earth, was brought into being over a period of an earthly week.

It could have been, of course; it would be hubristic of me to rule out the possibility. But given what we are able to discern, using our God-given powers of intellect, observation, reason, and discernment – as noted above – it seems to me more likely that this is a metaphor, or a poetic representation, of how the cosmos came into its current florescence.

But nor do I find it convincing that it all sprang into being spontaneously – initiation without an Initiator – and then somehow independently ordered itself into the grandeur and glory that we see around us, without any sort of design or guidance… without, that is to say, a Designer, whom we Christians know as the God of the Bible.

The old argument is just as valid today as it was centuries ago: if you see a watch, in all its complexity, it is both logical and rational to suppose the existence of a Watchmaker.

To assume that all of the components came into existence and joined together randomly, without design or intentionality, to form a workable watch, is counter to reason and logic – much like the idea that a hundred monkeys, pecking away at typewriters, could eventually come up with Shakespeare.
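To put some rough numbers on the monkeys-and-typewriters image, here is a back-of-envelope sketch – my own illustration, not anything from the linked article. The target phrase, alphabet size, and monkey head-count are all arbitrary assumptions chosen for simplicity:

```python
# A back-of-envelope sketch of the monkeys-and-typewriters arithmetic.
# All numbers here are illustrative assumptions, not from the linked article.

target = "to be or not to be"   # 18 characters, spaces included
alphabet_size = 27              # 26 letters plus the space bar (a generous simplification)

# Probability that one random 18-character attempt matches exactly:
p = (1 / alphabet_size) ** len(target)
print(f"P(single attempt) = {p:.2e}")   # ~1.7e-26

# Suppose 100 monkeys each type 10 characters per second for a billion years,
# and we count non-overlapping 18-character attempts:
chars_typed = 100 * 10 * 3600 * 24 * 365 * 1_000_000_000
attempts = chars_typed / len(target)
print(f"Attempts: {attempts:.2e}")               # ~1.8e+18
print(f"Expected matches: {attempts * p:.2e}")   # ~3e-8 -- effectively zero
```

Even on these deliberately generous assumptions, the expected number of matches for one short line is vanishingly small – never mind a play, let alone the complete works.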

Yet that is exactly what many – but not all – scientists today try to claim, about the origins of the Cosmos, and the rise of life on Earth. And for such extraordinary claims, extraordinary evidence is required: evidence which seems, quite frankly, to be lacking.

As David Gelernter, a noted Yale professor of computer science, observes (quoted in the linked article),

“My argument is with people who dismiss intelligent design without considering, it seems to me — it’s widely dismissed in my world of academia as some sort of theological put up job — it’s an absolutely serious scientific argument,” Gelernter said during his interview. “In fact it’s the first and most obvious and intuitive one that comes to mind. It’s got to be dealt with intellectually.”

Gelernter notes that for many (most?) scientists today, Darwinian evolution has ceased to function as an ordinary scientific theory – an explanation that makes sense given our current knowledge, and appears to account for the available evidence, but is subject to challenge as our knowledge base changes or expands – and has become an object of ideological, almost religious faith, which cannot be challenged: quite the opposite of the scientific method! He then sounds a cautionary note:

“Gelernter said an ideological bent has taken over the field of science. There are good scientists doing good work, ‘but we have a cautionary tale [that] what happened to our English departments and our history departments could happen to us, God forbid,’ he said.”

He further notes that while he likes many of his colleagues at Yale, and counts them as friends, when he looks at

“their intellectual behavior, what they have published – and much more importantly what they tell their students – Darwinism has indeed passed beyond a scientific argument as far as they are concerned. You take your life in your hands to challenge it intellectually. They will destroy you if you challenge it,”

and adds,

“what I have seen in their behavior intellectually and at colleges across the West is nothing approaching free speech on this topic. It’s a bitter, fundamental, angry, outraged rejection [of intelligent design], which comes nowhere near scientific or intellectual discussion.”

Why is this a problem? Leaving aside the issues of free speech and the free exchange of ideas, which are among the major underpinnings of both Western culture and Western academia, it renders its proponents unable to look at new data, or at differing interpretations of existing data. With respect to life on earth, for example, adaptation is well-attested. Speciation – much less origin – not so much. Here’s Gelernter again:

“There’s no reason to doubt that Darwin successfully explained the small adjustments by which an organism adapts to local circumstances: changes to fur density or wing style or beak shape,” the professor wrote. “Yet there are many reasons to doubt whether he can answer the hard questions and explain the big picture — not the fine-tuning of existing species but the emergence of new ones. The origin of species is exactly what Darwin cannot explain.”

Emphasis added. Darwin’s seminal work was entitled “On the Origin of Species,” but “the origin of species is exactly what Darwin cannot explain.” This is a problem for Darwinists, naturally, and it explains their at-times almost hysterical reaction to anything that would challenge their core doctrine.

I will not here recount Gelernter’s specific arguments, but please click through the link and read them for yourself: they are telling. They come down to the conclusion that “the idea that random chance and mutations are the driving force behind the vast complexity of life – even with billions of years of time – is not just scientifically improbable, it’s an impossibility.”

The only place in which I disagree with Gelernter is when the article notes that “he sees intelligence in Earth’s design, and has no quarrel with ID proponents, but notes the world is a mess, its suffering far outweighs its goodness.” I would say that, on balance, the goodness far outweighs the suffering, although of course, locally, suffering can be intense. But a full discussion of theodicy is far beyond the focus of this post!

In any case, Gelernter notes,

“Darwin would easily have understood that minor mutations are common but can’t create significant evolutionary change; major mutations are rare and fatal,” Gelernter wrote. “It can hardly be surprising that the revolution in biological knowledge over the last half-century should call for a new understanding of the origin of species.”

Indeed.

To be continued, in Part Two!

Yale’s David Gelernter: Darwin’s Doubt Is “One of the Most Important Books in a Generation” | Evolution News

Yale University computer scientist David Gelernter is a polymath, a brilliant writer, artist, and thinker. Famed both for his specific scientific expertise, and for his cultural, political, and historical reflections, he’s also now a confessed Darwin skeptic.

Gelernter credits reading Meyer’s book as the primary cause of his rejecting neo-Darwinian evolution, a “brilliant and beautiful scientific theory” now overtaken by science.

Source: Yale’s David Gelernter: Darwin’s Doubt Is “One of the Most Important Books in a Generation” | Evolution News

I am not a “young earth” Creationist, in the sense of one who accepts the calculations of Archbishop Ussher (an Anglican, it must be said!), who determined that the cosmos was just over six thousand years old, or who believes it essential to adopt a literalist attitude to the “six days” of the Genesis creation narrative.

Let’s remember that a) the Ancient Near East of the time used a sexagesimal (base-60) system of calculation – our 360° circle, 365-day year (360 + 5 intercalary days), 12-month year, and even 24-hour day are vestiges of this system – so six is an appropriate number of completion; and b) these literal, 24-earth-hour days make sense only for Earth itself, since every other planet in the solar system (and beyond) has a different length of day.
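As a small aside on how that base-60 bookkeeping survives to this day, here is a minimal sketch – my own illustration, not the author’s – converting a decimal angle into the minutes and seconds we still inherit from that system (the example angle is arbitrary):

```python
# A small illustration of the sexagesimal (base-60) bookkeeping we still
# inherit: angles (and hours) are subdivided in base 60 to this day.
# The example angle is arbitrary.

def to_sexagesimal(decimal_degrees: float) -> tuple[int, int, float]:
    """Convert decimal degrees to (degrees, minutes, seconds), base 60."""
    degrees = int(decimal_degrees)
    rem = (decimal_degrees - degrees) * 60
    minutes = int(rem)
    seconds = (rem - minutes) * 60
    return degrees, minutes, seconds

d, m, s = to_sexagesimal(23.4375)
print(f"23.4375° = {d}° {m}′ {s:.1f}″")   # 23° 26′ 15.0″

# Part of base 60's appeal: it divides evenly by 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30.
print([n for n in range(2, 60) if 60 % n == 0])
```

That last line hints at why the Babylonians found 60 so convenient for calculation: few numbers of its size divide so cleanly so many ways.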

Also, there are plenty of indications that God – being eternal – views time quite a bit differently than we do: “For a thousand years in thy sight are but as yesterday when it is past, and as a watch in the night” (Psalm 90:4), and “But, beloved, be not ignorant of this one thing, that one day is with the Lord as a thousand years, and a thousand years as one day” (2 Peter 3:8, likely following the Psalmist), to cite but two examples.

(Note: we should not jump to the conclusion that “a divine day equals a thousand earth-years,” either: the Scriptures frequently speak symbolically and allegorically).

And evolution, in the sense of adaptation / development in response to environmental stimuli, is an observable phenomenon, both in the lab and in the field. As an explanation of observed phenomena, evolution makes sense. Anyone who denies this is either not paying attention, or is hampered by ideological blinders.

The problem comes when science spins off into “scientism,” and people try to make of evolution something it is not, or should not be – a motive force, a mechanistic, deterministic replacement for God – and “believe in” Darwinism as a replacement for religion.

The bottom line is this: I am a Christian, and a Christian cleric. As such, I believe that we are living in a Creation, and that it has a Creator: God the Holy Trinity, Father, Son, and Holy Spirit. All the glory and grandeur we see around us, in the heavens and on earth, did not “just happen,” as a result of the intersection of random happenings and (unoriginated, self-existent) natural laws. We do not live in an accidental Cosmos.

What is interesting to me is that more and more professional scientists are beginning to adopt the same view. Intelligent Design (“ID”) is not, itself, a theological position; the Designer of ID need not necessarily, by the tenets of ID, be the God of the Bible. But it is at least a step in that direction; a concession that the Cosmos did not “just happen,” that – in the terms of the old natural theology – if you happen to come upon a watch, the logical deduction is that there is a Watchmaker.

(It must be noted that a great deal of the challenge to the dominant scientific paradigm comes from secular, naturalistic scientists. However, as an article in The Federalist points out, the

“leading critics [of Darwinian evolution] have been intelligent design supporters, who are looked down on by naturalists.” [N.B. – in the sense of those who support solely naturalistic explanations for observed phenomena, not those who explain the natural world to families in parks and nature centers!] “But as each group adds to the scientific literature, certain critiques and findings inevitably bolster or redirect the research of the other [emphasis added].”)

The originally linked essay by David Klinghoffer is an excellent introduction to the growing debate, and includes a large number of links for those who wish to follow up on it. As Klinghoffer concludes,

“Scientists, intellectuals, and ordinary thoughtful adults are giving up the old pledge of allegiance to Darwin. The evolution in thought is very gradual, admittedly, but it’s unmistakably happening.”


The Perfect Storm: Sources of the depression epidemic | Psychology Today

America’s Children: Key National Indicators of Well-being
Source: Forum on Child and Family Statistics

Source: The Perfect Storm | Psychology Today

This blog article in Psychology Today traces the top five causes of the epidemic of depression here in the U.S. (and in fact, throughout much of the world). These include:

1. The erosion of traditional social structures and communities. “A gradual disintegration of the social fabric, which has closely paralleled industrial and technological growth, has resulted in greater isolation and loneliness… we have become increasingly disconnected from family, friends, and neighbors. Urbanization and the breakup of the extended family and rural community are leading causes of this social atomization.”

2. Changes in modes of communication. “Following the physical upheaval of urbanization, the world has been swept by a tidal wave of electronic innovation… [The] alarming rise in depression among U.S. youth during the period 2004–2015… coincides with the birth and rapid growth of smartphone usage during the same period. While this does not prove a cause-effect relationship, it would seem to reinforce an urgent need to closely examine the impact of smartphone usage on the communication skills and psychological well-being of young people.”

3. Changes in Diet. “Consumption of processed foods, which mostly contain a serious imbalance of omega fats, large quantities of sugar, and a lack of fermented ingredients, are radically affecting the delicate balance of our gut flora. A landmark comparison between North Africans and North Americans revealed sharp declines in bacterial diversity among the North American group, including genera containing the psychobiotic strains… Is fast food and processed food throwing our microbiome, that is, our internal environment, into chaos in the same way that pollution is destroying the macrobiome?”

[Note: the Weston A. Price Foundation has been saying this since 1999; Dr. Price himself raised the alarm regarding processed foods vs traditional dietary patterns, back in the 1930s and 40s. This is not new information! But it’s finally beginning to be recognized by the mainstream.]

4. The intense competition surrounding education among industrialized nations. “Korea, Japan, China, and to a growing degree, Western nations, are experiencing an exponential rise in youth depression… fierce competition in the academic arena, in which academic success is equated with social and economic “success” by parents, is leading to a loss of personal autonomy and acute stress. Secondary schools are now largely focused on exam-centered curricula… marked by a lack of content related to life skills, social-emotional learning, and wellbeing in general.”

5. The familiar socio-economic suspects, including war and poverty. “Nations strongly affected by conflict and extreme poverty, with an emphasis on extreme, rank relatively high on the depression scale and low in happiness and satisfaction. Nonetheless, the relationship between GDP and depression/happiness rates is by no means linear… Personal freedom and the presence of social networks, two factors inversely correlated to depression mentioned above, are highly related to scores on the Positive Experience Index of the Global Emotions Report.”

Assuming that the above is accurate – and based on my own experience and informal research, I believe it is – what is most interesting in all this to me, aside from a certain degree of grim satisfaction of the “I hate to say I told you so, but I told you so” variety, is that four out of five of these factors are both endemic to, and so far as can be determined, unique to, our modern/postmodern age. Our ancestors had rough lives in many respects – rougher than ours in most – but they do not appear to have suffered from comparable levels of depression… which has spiked in recent years, as recounted in the linked article, and many others.

These contributing factors to the contemporary depression epidemic can therefore (despite the usual disclaimers about correlation not equalling causation) be pretty much laid at the feet of our abandonment of traditional approaches, thoughts, understandings, philosophies, and ways of living and being, in so many areas of life, from foodways to lifeways, from communication to education.

This mindless neophilia, this willingness (even eagerness) to cast aside the traditional, the tried and true, and to eternally chase after the supposedly “new and improved,” which is so characteristic of our present society, is going to kill us – is killing us – if we do not moderate it with a more sensitive and sympathetic appropriation and re-adoption of traditional norms and ways of life.

Once again, I hate to say I told you so, but……!

Old-fashioned toys, not video games, best for kids, pediatricians say | WRCBtv.com – Chattanooga


Don’t be fooled by all those “educational” electronics in stores. What’s best for your kids, pediatricians say, are old-fashioned toys that require you to actually interact with them.

Source: Old-fashioned toys, not video games, best for kids, pediatrician – WRCBtv.com | Chattanooga News, Weather & Sports

“Play is important for child development, but children learn best from adults. They get language skills, learn about how the world works, and get feedback that can reinforce learning and positive behavior, the American Academy of Pediatrics says in new guidelines for people buying toys for kids.”

The most amazing part of this is that, apparently, it comes as a surprise to some people!

The AAP cautions that

“a little common sense goes a long way, the AAP says in its reminders. Kids need to use their imaginations, they need to move both their hands and their bodies and they need to express creativity. Simple toys such as blocks, crayons and card games can fill these needs better than the flashiest video game”

and goes on to add,

“The truth is most tablets, computer games, and apps advertised as ‘educational’ aren’t. Most ‘educational’ apps target memory skills, such as ABCs and shapes,” the guidelines read.

“These skills are only one part of school readiness. The skills young children really need to learn for success in school (and life) include impulse control, managing emotions, and creative, flexible thinking. These are best learned through unstructured and social play with family and friends.”

“So-called educational games and apps on digital media may, in fact, delay social development [emphasis added], especially for young children, because [such technology] interferes with their learning about real-life facial expressions and gestures.”

When it comes to screen time, less is more:

“Parents also need to remember to limit kids’ use of video and computer games, the AAP says. ‘Total screen time, including television and computer use, should be less than one hour per day for children 2 years or older and avoided for those younger than 2 years of age,’ the guidelines point out.”

That was the rule in my growing-up years, when “screens” meant television. I may have chafed at it, at the time, but (with the perspective and, hopefully, maturity that age brings) I recognize the wisdom of the restriction, now.

Caveat emptor! “Some products may be marketed in a way that makes parents feel their kids are missing out if they don’t get them. Don’t fall for it, the AAP says.” Oh, really? Do ya think? Gee, I didn’t know that corporations ever marketed their products in ways that over-state their benefits and minimize their risks… *wry smile*

In any case:

Read the whole article – there’s a lot more information, and it’s all interesting, especially to those who care about the social and physical, as well as intellectual and psycho-emotional, development of children.

90% of plastic polluting our oceans comes from just 10 rivers | World Economic Forum

Workers clear garbage at the bank of the Yangtze River in Taicang, Jiangsu province, China, December 23, 2016. Image credit: REUTERS/Stringer.

The world has become increasingly alarmed at the amount of plastic in its oceans. But where does all this plastic waste come from?

Source: 90% of plastic polluting our oceans comes from just 10 rivers | World Economic Forum

Here’s a hint: not from us.

Not if by “us” is meant the United States, or the West in general.

Plastic in the ocean is a major problem. As this article points out, “more than 8 million tons of it ends up in the ocean every year. If we continue to pollute at this rate, there will be more plastic than fish in the ocean by 2050.”

That is not just hype, and it is not something we should take lightly, especially if we care at all about this good Earth and its future (not to mention our future, on it). But here’s the thing: plastic straws in California – or anywhere else in the U.S. – are not the problem. We are not, by and large, the problem.

That’s not to say we couldn’t be doing a better job of disposing of (or, preferably, recycling) our plastic waste than we are; but for the most part, we’re not doing badly. So where does all that plastic waste come from?

Asia, primarily, and Africa.

According to the World Economic Forum, as recounted in the linked article and elsewhere, some 90% of the plastic waste that makes it into the world’s oceans gets there via ten rivers: eight of them in Asia (including the storied Ganges and Indus in South Asia, and the Yangtze and Yellow in China), and two (the Nile and the Niger) in Africa.

Interestingly, this story came out this past summer. But how much attention has it received from the mainstream press? Little to none.

World-Famous Scientist: God Created the Universe | Intellectual Takeout

‘The final resolution could be that God is a mathematician.’

Source: World-Famous Scientist: God Created the Universe | Intellectual Takeout

Now, this does not come as any surprise to me! I have long believed that science and religion, properly understood, are not and cannot be in opposition to one another – except in the sense defined by British physicist Sir William Bragg:

“From religion comes a man’s purpose; from science, his power to achieve it. Sometimes people ask if religion and science are not opposed to one another. They are: in the sense that the thumb and fingers of my hands are opposed to one another. It is an opposition by means of which anything can be grasped.”

Sir William Bragg, in Sir Kerr Grant, The Life and Work of Sir William Bragg (1952), 43.

Indeed, it seems to be physicists who, among scientists, are most prone to adopt a theistic worldview. It may be that those whose life’s work leads them to ponder the secrets of the cosmos itself are more inclined to discover that one of those secrets is the secret of design. And design, of course, requires a Designer…

Unless, of course, one is so dead-set against the notion that one comes off sounding strident and silly in one’s opposition – as (for example) Richard Dawkins does, toward anyone who is not already among his defenders… but I digress!

Michio Kaku, who as the linked post notes, “has made a name for himself as a world-leading theoretical physicist unafraid to speak his mind,” has dropped a bit of a bombshell:

“I have concluded that we are in a world made by rules created by an intelligence,” Kaku said, as quoted by the Geophilosophical Association of Anthropological and Cultural Studies. “To me it is clear that we exist in a plan which is governed by rules that were created, shaped by a universal intelligence and not by chance…

“The final resolution could be that God is a mathematician,” says Kaku. “The mind of God, we believe, is cosmic music. The music of strings resonating through 11-dimensional hyperspace.”

I must say, I like that image!

When I consider thy heavens, even the work of thy fingers; the moon and the stars which thou hast ordained;

What is man, that thou art mindful of him? and the son of man, that thou visitest him?

O Lord our Governor, how excellent is thy Name in all the world!

— Psalm 8:3-4, 9 (Psalter, Book of Common Prayer 1928)

Amen, and amen.

Humanity 4.5 by Mark Shiffman | First Things


“From the low-tech mania for tattooing and piercing, through the medium-tech tools of abortion, hormonal birth control, and transgendering, to the high-tech visions and explorations of genetic engineering and cyborgism, [the transhumanist] rebellion seems to be gathering steam.”

Source: Humanity 4.5 by Mark Shiffman | Articles | First Things

“Transhumanism” – the idea that we can, and more importantly, that we should, “transcend” our very humanity – seems to be in the process of moving from a “fringe phenomenon of science fantasy” to an organizing principle behind the present assault on traditional Western culture and civilization.

This excellent (albeit long) article by Mark Shiffman in the superb journal First Things explores and explains both the process and its implications. As he points out, transhumanism “styles itself a philosophy, [but] is really a religious movement with a twenty-first-century marketing campaign (under the brand ‘H+’),” and has a distinctly Gnostic character to it (Gnosticism denies the goodness of Nature, including humanity, viewing us as spiritual beings trapped in material bondage):

“The options come down to rejecting God entirely or reducing God to a useful projection of human possibilities. In either case, the human is no longer an ecstatic subject who receives the gift of being and the grace that fructifies our nature, but is himself the primary source of transcendence…

“The subject of projects gives modernity its Gnostic character. This world is not an inherently good ­creation. A better alternative world remains to be made by us, in the future…

“The distinctly transhumanist horizon comes about when our project of mastery turns its attention to our own bodies. They come to be treated as raw material, resources available to satisfy our free individual preferences. Our will to transcend nature through projects of mastery mounts a rebellion against the natural constraints of the organic human body, harnessing the power of technological innovations to render it the instrument of our arbitrary will…

“From the low-tech mania for tattooing and piercing, through the medium-tech tools of abortion, hormonal birth control, and transgendering, to the high-tech visions and explorations of genetic engineering and cyborgism, this rebellion seems to be gathering steam. An aggressive assertion of bodily self-ownership is becoming the new normal, with the status of a fundamental right.”

That is on the personal level. On a more cosmic level, Gnosticism teaches that the

“world is not an order of beings manifesting God’s goodness; it is rather an order of inert matter in motion, available for the human will and intellect to master and manipulate. Ancient Gnosticism sought deliverance from evil by severing the spirit’s ties to the material world. Modern Gnosticism appears at first to take a much more optimistic view of creation.

“Its hopes, however, are not placed in nature as created, but rather in the mind’s capacity to construct models that will unlock the powers trapped within the given order of beings, so as to release their infinite possibilities and make them subservient to our needs and aspirations.

“It hopes to escape evil not by fleeing the world, but by stepping away in distrust, securing the independent power of the mind through the scientific method, and then turning against the world with a vengeance and transforming it to suit the human will.”

In other words, transhumanism is a quasi-religious belief system (cf. scientism) which denies both God and the goodness of the created order, believing in the ability of the human intellect to bring about a secular version of paradise by transforming both humanity and the natural world.

Given the amazing number and variety of ways in which we have screwed up the natural world by messing with it so far, not to mention the incredible amount of havoc, destruction, and bloodshed that has come from humans trying to bring about a “kingdom of God” on Earth by our own efforts, one has to ask – with more than a touch of irony – what could possibly go wrong with this…?

Students learn more effectively from print textbooks than screens, study says | Business Insider


Our work has revealed a significant discrepancy. Students said they preferred and performed better when reading on screens. But their actual performance tended to suffer.

Source: Students learn more effectively from print textbooks than screens, study says – Business Insider

“Teachers, parents and policymakers certainly acknowledge the growing influence of technology and have responded in kind. We’ve seen more investment in classroom technologies, with students now equipped with school-issued iPads and access to e-textbooks…

“Given this trend, teachers, students, parents and policymakers might assume that students’ familiarity and preference for technology translates into better learning outcomes. But we’ve found that’s not necessarily true.

“As researchers in learning and text comprehension, our recent work has focused on the differences between reading print and digital media. While new forms of classroom technology like digital textbooks are more accessible and portable, it would be wrong to assume that students will automatically be better served by digital reading simply because they prefer it.”

This doesn’t surprise me a bit. There is something… superficial, for lack of a better term… about pixels on a screen compared to printed words on a page. They don’t stick in the mind – never mind sink down into the heart and soul – the way actual, physical, tangible books do.

And I had to chuckle at the comment that “it would be wrong to assume that students will automatically be better served by digital reading simply because they prefer it.” Ya think? Given a choice, most school-age kids – and even many adults – would prefer ice cream or candy over solid, nourishing foods, but if health and well-being are the goal, that preference is a poor predictor. Our preferences, as humans, are not always to our own benefit, in a whole range of scenarios!

That said, the person who shifts over from a steady diet of soda-pop, fast food, and sweets to a steady diet of nutritionally beneficial foods generally will eventually come to prefer the latter, even wondering how on earth they could ever have stood to eat and drink the junk they’d given up. And a person who shifts from a relationship pattern of one-night stands and superficial hook-ups to the love and commitment of a steady relationship is usually glad they did.

I suspect a shift from screens back to books, as a general rule, might have a similar effect. This is not to say the shift should be 100%! Even the most nutritionally-aware eater enjoys an occasional sundae, or slice of birthday cake. And screens aren’t likely to go away, in our larger society, short of a major X-class solar flare zapping our technology back to the 19th century, and students need to know how to use them.

Besides, as this article points out,

“One of the most consistent findings from our research is that, for some tasks, medium doesn’t seem to matter. If all students are being asked to do is to understand and remember the big idea or gist of what they’re reading, there’s no benefit in selecting one medium over another.”

However, “when the reading assignment demands more engagement or deeper comprehension, students may be better off reading print.” This is a distinction which should be kept in mind, in my opinion, both in school and in life! I have noticed the phenomenon myself, in my own reading, although I had not attempted to articulate it prior to reading this: I read faster on-screen, but engage the text – and the ideas behind it – better when I’m reading from a physical print medium.

And I generally feel better and more satisfied after having completed the reading task, as well, which ties into another of the study’s conclusions:

“There may be economic and environmental reasons to go paperless. But there’s clearly something important that would be lost with print’s demise. In our academic lives, we have books and articles that we regularly return to. The dog-eared pages of these treasured readings contain lines of text etched with questions or reflections. It’s difficult to imagine a similar level of engagement with a digital text.”

There are both tangible and intangible benefits to directly, physically engaging with specific, individual books: their look, both the design of the book itself and the wear-and-tear it has received over the months, years, or decades; their heft, in which even the difference between a mass-market paperback, a trade paperback, or a hardback book can be significant, not only in weight but in the feeling of permanence and solidity it engenders; and even the scent: for many of us, the smell of old books is a part of their appeal, reminding us that they have been around, cherished and re-read, for in some cases a very long time. Conversely, the smell of a new book can be exciting in a different way, carrying with it the sense of beginning an adventure. Many of these benefits are substantially reduced, or lost entirely, if our reading is mostly or entirely on electronic screens.

You will have noticed that I’ve several times alluded to the permanence / impermanence issue. Pixels on a screen are fundamentally transient, impermanent. They can be changed or deleted, either individually or en masse; they can be rendered inaccessible for a myriad of reasons ranging from running out of battery, to not having the right operating system (Kindle vs Nook vs ….?), to forgetting your password, and the list could go on.

Yes, physical, printed books can have issues, too. They are vulnerable to fire (though that is rarely an issue) and water (I suppose you could drop yours in the toilet, or the lake, and you wouldn’t want to read it in the rain – but the same could likely be said of your tablet); you could forget it, or lose it… but again, the same applies to your e-reading device. There are simply not so many things that can go wrong with a physical book, as with an e-reader.

There is another concern, too: it is way too easy to get rid of electronic “books.” We humans have evolved, over the centuries, a protective attitude toward physical books, and an aversion to damaging, destroying, or discarding them. Many or most of us would prefer to give old books we don’t need anymore away, or take them to the library for a sale, or donate them, than simply throw them out. And the idea of burning books, or even banning them, carries connotations of police-state totalitarianism.

But what if those books can simply be deleted, or their text changed – quietly, unobtrusively, unnoticed – with a few strokes of a keyboard? What then for the preservation of ideas, the evolution of human thought? At this point, the practical considerations, and even the educational ones, shade over into philosophical and moral concerns. I am not sure anyone has sufficiently addressed these implications of the digitization of our written media.

Of course, the argument so often raised in favour of digital media is that you can carry a hundred (electronic) books in an e-reader the size of a paperback. A veritable library in your pocket, purse, backpack, or messenger bag! And that is an undeniable advantage – at certain times, and for certain reasons. Travel, for instance… if you’re sure you’ll have regular access to an electrical outlet, for charging. If not, you may be better off with one or a few well-chosen actual books.

Otherwise, it is at least arguable whether high capacity is a “feature,” or a “bug”! Distraction, and/or merely superficial attention, is one of the major issues with reading on-screen as opposed to in actual, physical, print media. Carrying a whole library with you in a single, compact device sounds great on the surface, but it may well serve to increase the tendency to engage the text(s) only superficially – and if, as many e-readers do, you have the ability to also go online, there is another two-edged sword.

It’s great to be able to easily look up obscure references or background information for a passage you’re reading. But it also increases the temptation to “just check my email (or Facebook, or Twitter, or whatever) while I’m online,” and before you know it, you’re down the rabbit-hole. As one comment I like (albeit in a rueful sort of way) puts it, “With the internet, we have immediate, 24-7 access to the wisdom of the ages. But most of us use it looking at pictures of cats.” Distraction is a thing.

This has gotten a bit far afield from the specific issue of using screens for reading in an educational context. But it is worth raising the question of whether encouraging students to use screens – whether computers, laptops, tablets, smart-phones, e-readers, etc. – as their primary information source is really serving them all that well, with respect to either their current educational task, or their future.

Like a lot of other forms of technology, screens are useful, but not entirely benign. They are, as the old saying goes, “useful servants, but bad masters.” The problem is that so many of us are allowing them to dictate our lives, rather than the other way ’round. Gotta check my email. Gotta check my Facebook. Gotta check my Instagram. Gotta check my messages. Gotta check, gotta check, gotta check… and respond, of course. And then look up something else. Scan articles. Scan blog-posts. And on an e-reader, scan books… or the electronic facsimiles thereof.

Now, I am aware of the slight irony of composing this objection to excessive use of online devices, online! If my goal was to bash technology entirely, I should be writing it on parchment, with a quill pen… or pressing it into damp clay with a wooden stylus. But I am not. As I said above: “useful servant, bad master.”

I am writing this online because I can reach far more people this way than by mailing it out in letters to people I think might be interested – and even if I were going to print it out and distribute it that way, I’d still type it on the computer, because I can type much faster than I can print or write longhand. Taking advantage of certain aspects of technology for its benefits does not, or should not, immunize us from also considering its problematic elements.

Nor am I limiting myself to electronic media. Before I started this essay, I was re-reading – for the nth time – J.R.R. Tolkien’s Lord of the Rings (specifically, the second volume, “The Two Towers”)… using an actual, physical book. Earlier still, I did an online broadcast of Morning Prayer – again, because I can reach more people that way – but using a decades-old copy of The Book of Common Prayer 1928, and reading a meditation from another book originally written in 1858 (the edition that I have was printed in 1890).

It’s one thing to use a variety of appropriate technologies, depending on your needs and intentions. It’s another thing to become so fixated or dependent on a particular one – particularly one with the limitations of electronic screens, as described above – that you don’t end up using anything else. As the authors of the linked essay put it,

“we realize that the march toward online reading will continue unabated. And we don’t want to downplay the many conveniences of online texts, which include breadth and speed of access. Rather, our goal is simply to remind today’s digital natives – and those who shape their educational experiences – that there are significant costs and consequences to discounting the printed word’s value for learning and academic development.”

Indeed.

Created Male and Female: An Open Letter from Religious Leaders, December 15, 2017 | Anglican Church in North America

[Image: “Male and female he created them, in the image of God”]

We come together to join our voices on a more fundamental precept of our shared existence, namely, that human beings are male or female and that the socio-cultural reality of gender cannot be separated from one’s sex as male or female.

Source: Created Male and Female: An Open Letter from Religious Leaders, December 15, 2017 | Anglican Church in North America

With all empathy, respect, and compassion toward those who may be struggling with one or another aspect of their gender or sexual identity, I must affirm my agreement with this statement; inter alia:

“We come together to join our voices on a more fundamental precept of our shared existence, namely, that human beings are male or female and that the socio-cultural reality of gender cannot be separated from one’s sex as male or female.

“We acknowledge and affirm that all human beings are created by God and thereby have an inherent dignity. We also believe that God created each person male or female; therefore, sexual difference is not an accident or a flaw—it is a gift from God that helps draw us closer to each other and to God. What God has created is good…

“A person’s discomfort with his or her sex, or the desire to be identified as the other sex, is a complicated reality that needs to be addressed with sensitivity and truth [however]…

“The movement today to enforce the false idea—that a man can be or become a woman or vice versa—is deeply troubling. It compels people to either go against reason—that is, to agree with something that is not true—or face ridicule, marginalization, and other forms of retaliation.”

While aware that there is a very small percentage – less than one-half of one percent, as I understand it – of persons who are born to some degree intersexed (usually not in a way that is detectable by observation, apart from specialized tests), the reality is that we are, as humans, biologically, genetically, either male or female.

If our self-awareness does not match that bio-genetic reality, that is indicative of an issue (gender dysphoria) that needs to be addressed – and not by hormone treatments or surgical alterations which can lead to other issues, and do not address the underlying condition. It is, as this letter points out, especially harmful when children are subjected to attempts to “reassign” their gender, at a time when confusion, uncertainty, and perhaps even a degree of exploration are natural and human, and deserve sensitive and compassionate attention – not attempted alteration.

The idea that one can “reassign” one’s sex, which is in fact encoded in one’s genetic makeup and cannot be altered, or that one can be “gender-fluid” and switch back and forth at will, is both factually incorrect and psychologically and spiritually disordered. It is, as some have termed it, “LARPing” (Live-Action Role-Playing), an attempt to deal with a deep-seated psycho-emotional issue which is neither healthy nor helpful over the long term.

The given-ness of our sexual identity – male or female – is one of those things with which, like gravity or the boiling point of water, we have to deal, whether we want to or not, whether we like it or not. If our self-perception does not match it, we need to deal with that disjuncture, not mask it or pretend to alter it.

We live in an era in which the very concept of objective truth, indeed of objective reality, is under assault – an assault which is one part, one portion, one prong of the larger attack on Western civilization: cause people to question everything, sow doubt in every realm, spread confusion and distrust, even of biological reality.

It is interesting – to say the least – that some of the same people who insist that we should “accept the science” on global warming also insist that we ignore the much more self-evident and “settled” biological reality of human sexual identity! We cannot conclusively “prove” that human activity is causing global warming; but we can count chromosomes. Yet we are told to commit to the first, but ignore the second. Other than a socio-political agenda, this dichotomy makes no sense.

But whether part of a coordinated movement or simply a synchronicity of societal forces, the effect is the same: softening us up for an attempted complete remake of societal and cultural order. It is an attempt which, denying reality, cannot be ultimately successful; but it is one which can do significant damage – to both individuals and cultures – and so it must be resisted. With sensitivity and compassion when it comes to individuals, some of whom are clearly hurting! But resisted, nonetheless.

As this essay points out,

“Gender ideology harms individuals and societies by sowing confusion and self-doubt. The state itself has a compelling interest, therefore, in maintaining policies that uphold the scientific fact of human biology and supporting the social institutions and norms that surround it…

“We desire the health and happiness of all men, women, and children. Therefore, we call for policies that uphold the truth of a person’s sexual identity as male or female, and the privacy and safety of all. We hope for renewed appreciation of the beauty of sexual difference in our culture and for authentic support of those who experience conflict with their God-given sexual identity.”

Amen, to all of the above!


Note: The picture above was not from the linked article; it is one I selected to illustrate this post.

Diminishing solar activity may bring new Ice Age by 2030 – Astronomy Now

In this 1677 painting by Abraham Hondius, “The Frozen Thames, looking Eastwards towards Old London Bridge,” people are shown enjoying themselves on the ice. In the 17th century there was a prolonged reduction in solar activity called the Maunder minimum, which lasted roughly from 1645 to 1700. During this period, there were only about 50 sunspots recorded instead of the usual 40-50 thousand. Image credit: Museum of London.

The arrival of intense cold similar to the one that raged during the “Little Ice Age”, which froze the world during the 17th century and in the beginning of the 18th century, is expected in the years 2030—2040.

Source: Diminishing solar activity may bring new Ice Age by 2030 – Astronomy Now

Note: that’s the arrival! It may last a good bit longer. Here’s a fuller excerpt:

“The arrival of intense cold similar to the one that raged during the ‘Little Ice Age,’ which froze the world during the 17th century and in the beginning of the 18th century, is expected in the years 2030—2040. These conclusions were presented by Professor V. Zharkova (Northumbria University) during the National Astronomy Meeting in Llandudno in Wales by the international group of scientists, which also includes Dr Helen Popova of the Skobeltsyn Institute of Nuclear Physics and of the Faculty of Physics of the Lomonosov Moscow State University, Professor Simon Shepherd of Bradford University and Dr Sergei Zharkov of Hull University.”

That the climate is changing is obvious. That it is also warming – at least for now, despite cold snaps such as the one we’re currently going through, and at least in some areas – is also obvious to anyone who considers, for example, the recession of glaciers that revealed “Ötzi, the Iceman” in the Alps, or the dramatic shrinking of glaciers in Glacier National Park, Montana, just in the 35 or so years since I was there with my parents in the early 1980s. And of course, we have – I believe – both an ethical and a religious obligation to care for this good Earth which God has given us, to the best of our ability.

That said, there is clearly a political agenda driving a lot of the climate change / global warming hysteria these days, which causes me to look with a somewhat jaundiced eye in that direction. And it is not, regardless of what its proponents say, “settled science”: for one thing, there is no such thing! It may be (currently) the majority opinion, but so was the Ptolemaic cosmology, for well over a millennium – to cite but one example among many. Science, if it is doing its job correctly, is always open to new information, and new interpretations.

Secondly, there are competent scientists who hold a contrary view, even if they are fewer in number than the global warming proponents, at the present time. One of those is the aforementioned Dr. Helen Popova, who writes,

“There is no strong evidence, that global warming is caused by human activity. The study of deuterium in the Antarctic showed that there were five global warmings and four Ice Ages for the past 400 thousand years. People [e.g., anatomically modern humans] first appeared on the Earth about 60 thousand years ago. However, even if human activities influence the climate, we can say, that the Sun with the new minimum gives humanity more time or a second chance to reduce their industrial emissions and to prepare, when the Sun will return to normal activity.”

This does not mean we should be careless, complacent, or inconsiderate to our fellow-creatures or the planetary home we share! Even if we are only contributing, to some extent, to what is primarily a process that’s much bigger than us, we should be cautious and considerate in our actions, and many of the proposed remedies for global warming can be defended on other grounds. But it does suggest that we should be a bit more reticent about claiming either credit or blame for the whole thing! Despite human hubris, this may be another example of a truth of which it is salutary to remind ourselves, from time to time: it’s not all about us.

Note: the “Little Ice Age” actually began around 1450, and lasted until around 1850, reaching its coldest depths around 1650. We have been warming out of it for only a bit over 150 years; it is not surprising, therefore, that temperatures have been climbing during that period! Also, the first thermometers were invented between 1593 (a rudimentary water thermometer, by Galileo Galilei) and 1714 (the mercury thermometer, by Gabriel Fahrenheit) – again, right around the depths of the “Little Ice Age,” and spanning the Maunder Minimum. So it has literally been warming for pretty much as long as the thermometer has existed! The psychological effects of this should not be underestimated.
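To make that baseline point concrete, here is a toy sketch – my own, purely illustrative, and emphatically not a climate model – showing that if you sample any slow oscillation only from somewhere after its minimum, a naive linear fit over the sampled window shows a rising trend. The cycle shape, period, and dates are all arbitrary assumptions:

```python
# A toy sketch -- purely illustrative, NOT a climate model -- of the baseline
# effect described above: sample any slow oscillation only from after its
# minimum, and a naive linear fit over the window shows a rising trend.
# Cycle shape, period, and dates are arbitrary assumptions.
import math

def cycle(year: float) -> float:
    # A slow oscillation with its minimum at the year 1650 and a 1,000-year period.
    return -math.cos(2 * math.pi * (year - 1650) / 1000)

years = list(range(1850, 2021))   # the "instrumental record" begins well after the trough
values = [cycle(y) for y in years]

# Ordinary least-squares slope over the sampled window:
n = len(years)
ybar, vbar = sum(years) / n, sum(values) / n
slope = sum((y - ybar) * (v - vbar) for y, v in zip(years, values)) / \
        sum((y - ybar) ** 2 for y in years)
print(f"Fitted trend over 1850-2020: {slope:+.5f} units/year")   # positive
```

None of which proves anything about the actual climate, of course; it merely illustrates why the starting point of a measurement record matters when interpreting a trend.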