Live long enough and you will experience many cutting-edge technologies. That certainly will be the case if you live to be 75 as I have done.
Some of the technologies that I have seen were so short-lived that you may not be familiar with them at all. Boom boxes. Cassette tape recorders. Floppy discs. Portable televisions. Reel-to-reel tape recorders. Slide projectors. TV watches. Transistor radios. VHS video format.
Others, however, remain part of the fabric of our lives. Barcodes. Birth control pills. Communication satellites. Coronary bypass surgery. DNA fingerprinting. DVDs. Fiber optics. GPS. Human Genome Project. Hybrid cars. Jet airliners. MRIs. Microwaves. Pacemakers. Polio vaccine. The list goes on.
Of all the cutting-edge technologies that I have experienced in my life–vintage and outdated as well as state-of-the-art and up-to-date–three are special to me because I was involved with them in the early stages of their development and saw in them promise rather than peril. Interestingly enough, all three relate to automation–specifically the role that computer technology has played–and continues to play–in advancing literacy and learning.
The first goes all the way back to 1969 and my first position at the Library of Congress as a MARC Editor. MARC stood for MAchine Readable Cataloging, a set of standards developed by the Library of Congress in the 1960s to enable electronic sharing of catalog records. Within the next few years, MARC became the cataloging standard, nationally and globally. A parallel RECON Project converted retrospective materials to MARC format. But imagine. Computers in those early days were referred to as machines, and one of the biggest challenges was keeping the mainframe from overheating. I dare say that I did not understand fully the magnitude and far-reaching ramifications of the work that I was doing, but I knew that I was contributing to cutting-edge technology.
Fast forward to the 1990s, when Blackboard started as an interactive learning network to help teachers and professors adapt to the new reality of being able to teach via the Internet. In the fall of 1999, I started teaching at Laurel Ridge Community College (formerly Lord Fairfax Community College). The following spring, I was one of two faculty who opted to pilot Blackboard as our online learning platform. This time, I knew the magnitude and far-reaching ramifications of teaching online.
From that point forward, I continued to teach at least two–sometimes three–courses each semester online. American Literature. Appalachian Literature. College Composition. Creative Writing. Leadership Development. Southern Literature. Technical Writing. I made a point of challenging myself. For any class that I taught in a traditional classroom environment, I wanted to figure out how to revamp it–how to redesign it–for online delivery. I wanted to be a pioneer as my college moved forward with this cutting-edge technology.
Fast forward almost two decades, and I was pioneering again. This time I was embracing Open Education Resources (OER) in my classes. In 2018, I was the keynote speaker for a statewide professional development workshop sponsored by Laurel Ridge Community College, Lumen Learning, and the Achieving the Dream Foundation. I titled my keynote “OER: Open the Future of Education Today.” I knew beyond a shadow of a doubt the magnitude and far-reaching ramifications of Open Education Resources.
Within two years, I was using OER for all of my classes. Mind you: I didn’t tap into existing OER course shells. Instead, I designed and developed my own from scratch so that I could honestly and proudly announce to my students: “FREE Open Education Resources, Personally Curated Just for You.”
It seemed to me then–and it seems to me now–that four great cutting-edge technologies had come together in a powerful way beyond anyone’s wildest imagination: the Printing Press, the Internet, Online College Education, and Open Education Resources. Working together, they revolutionized not only learning but also the way that we share and distribute information.
As with most innovations, those four met initially with varying degrees of skepticism and pushback.
Printing Press Skeptics: The monks will lose their work. Monks will become lazy. Paper isn’t as good as parchment. With only one typeface, books are really ugly.
Online Cataloging Skeptics: OMG. What will happen to the card catalog? It will die, and then what?
Internet Skeptics: Just look at all the inappropriate materials poisoning the minds of our children. Staring at the screen will kill us. Thieves will steal my personal and financial information.
Online Education Skeptics: Online courses will never produce outcomes equal to in-person courses. Faculty-student interaction won’t be meaningful.
Open Education Resources Skeptics: Online resources are much more difficult to use. Who will guarantee quality? You just can’t do better than printed textbooks.
Now we’re all facing another cutting-edge technological moment. Its magnitude is beyond comprehension. Its ramifications are beyond far-reaching.
AI. Artificial Intelligence.
We all have a general idea of what it is: a branch of computer science that simulates human intelligence–visual perception, speech recognition, decision-making, and translation between languages. AI can get better and better over time based on the information collected and processed.
AI, of course, has been around since the birth of computers in the 1950s. But major advances came in the 1990s. You may recall that IBM’s chess-playing computer Deep Blue defeated world champion Garry Kasparov in 1997. That same year, NASA deployed its first Mars rover, Sojourner, on the surface of the planet.
AI is in the news a lot. Below are some major headlines that surfaced during the last few days:
“The 5 Biggest Artificial Intelligence (AI) Trends In 2023”
“AI Infrastructure Market to Reach $309.4 Billion, Globally, by 2031”
“Artists Sue AI Company for Billions, Alleging ‘Parasite’ App Used Their Work for Free”
“‘It’s the Opposite of Art’: Why Illustrators Are Furious about AI”
“AI Tools Can Create New Images, but Who Is the Real Artist?”
“Publisher Halts AI Article Assembly Line after Probe”
“Can an AI ‘Angel’ Help Find Thousands in Mexico Who Were Forcibly Disappeared?”
“This Mind-Reading AI Can See What You’re Thinking – and Draw a Picture of It”
“ChatGPT Passed a Wharton MBA Exam and It’s Still in Its Infancy. One Professor Is Sounding the Alarm”
As an educator and a writer, I find that a narrower cutting-edge AI technology demands my attention. I’m thinking about AI apps that are influencing the future of writing and authorship. ChatGPT, for example, can be used for text generation, text translation, text summarization, and even sentiment analysis of a text to detect tone and emotion. It can also write poetry, and it can make ordinary prose sound like passages from the King James Bible.
It seems to me that all of us need to sit up and take serious notice of all that AI promises, while also remaining aware of the perils that AI skeptics perceive and toss into the conversation.
Skeptics are convinced that AI will:
● change the perception of information
● take jobs away from humans
● rob us of our ability to make decisions independently
● take away our creativity
● make us lazy
● do away with our privacy
● control our behavior
● threaten copyright, authorship, and ethics
It seems to me that the perils being voiced by AI skeptics aren’t too different from those sounded by other skeptics down through the ages about other cutting-edge technologies. The Printing Press. Online Cataloging. The Internet. Online Learning. Open Education Resources.
But here’s what we have to accept. Whether we like it or not–whether we feel threatened by it or not–AI is here. It has started. It will not stop. It is the future.
Promise or Peril? I have to decide where I stand. You have to decide where you stand. We all have to decide where we stand.
We can’t ignore AI.
Sadly, we can, but only if we want to be among the left behind.