Happy Birthday, Me! Celebrating My Journey from Machines to Artificial Intelligence

“Computers are incredibly fast, accurate, and stupid. Human beings are incredibly slow, inaccurate, and brilliant. Together they are powerful beyond imagination.” 

–Slightly Modified Quotation by Albert Einstein (1879-1955; known for his monumental contributions to physics and our understanding of the universe through his theory of relativity, E=mc², and numerous other discoveries)

Those who know me well–and even those who know me, but not well–know that I always gift myself on my birthday. I purchase my gifts in advance, I have them wrapped in over-the-top paper with ribbons and bows beyond glitzy, and, without fail, I include a note reminding myself of how special I am. Well, I am. If I don’t celebrate me, others won’t celebrate me either. Right? Right. (You’re special, too. Gift yourself when your birthday rolls around.)

I’ve written at least one essay about a gift that I gave myself for my birthday. Who amongst us does not remember last year’s “Celebrating the Gateway to Who I Am”? In that blog post, I shared with you my 75th birthday gift: my decision to not let others diminish my identity by calling me Sweetie, Dearie, or Honey, instead of calling me by my name. I shared with you what I planned to do whenever those well-intentioned terms of endearment grated on my ears and pierced my being. Simply put, I decided to rise up to the full height of my politest best and do my utmost to turn those ageist comments into learning moments.

I am pleased to report that I have done just that for the last year, and it has brought positive results, particularly in my doctor’s office and at my local pharmacy. I’m now “Brent.” My name. My God. My name. Who would have believed that one word could be so symphonic?

I thought that last year’s birthday gift might have been my best, ever.

Maybe so. But this year’s gift might be even better. Once again, it will be a blog post–today’s actually–made up of ideas lounging lazily midst glamorous and glitzy spaces.

But only the ideas will matter. Nothing else ever matters, really. Only ideas.

The idea that I want to explore as my 76th birthday gift is simply this. How can it be that I am hyped beyond hype about Artificial Intelligence (AI) and its potential? I am, and candidly, it might just be the greatest technological thrill of my entire life. I’ve written about its potential in “What If We Use Artificial Intelligence (AI) to Become Even Better than We Are?”

But here’s what I’m trying to figure out. How can it be that I am so turned on by AI? After all, I’m the guy whose entire being screams, “Humanities!” It strikes me as rather strange, so much so that I’m beginning to think of myself as an oddity, peculiar even to myself.

Doesn’t it strike you as strange, too, especially when I tell you that in all the standardized tests that I’ve taken down through the years, I have always scored substantially higher in math and science than in English?

I mean, those test results would have had me marching right on down the STEM side of life, focusing on science, technology, engineering, and mathematics.

You certainly wouldn’t have expected me to sashay down the liberal arts aisle, having endless affairs with literature, philosophy, history, languages, and everything else that focuses on human culture, creativity, and critical thinking.

But that’s just what I did! I think my mother started it all. While I was still in her womb, she was reading a novel with a protagonist named Brentford. She fell in love with the name and decided that she would pass it on to me. I don’t think my mother read novels after that, no doubt because she was preaching and shouting the Gospel’s good news in the little Pilgrim Holiness Church that she pastored until I was five or so.

During that time, I fell in love with language, listening to my mother and watching others as they were slain in the Holy Spirit while she preached. I also saw that my mother valued the beauty of diversity. Even though it was not politically correct to do so at the time, everyone in our multiethnic coal camp came into our modest home through our front door and dined at our steel-framed, Formica-topped kitchen table. I saw my mother stand up time and time again for what she thought was right. She never compromised her convictions. She believed in forgiveness and taught us to never let the sun go down with an ought in our hearts. She embraced positive thinking: if you think you can, you can. She was the epitome of steadfast cheerfulness and optimism.

In addition to my mother’s influence was the impact of living in a multiethnic community made up not only of Blacks and Whites but also of Greeks, Hispanics, Hungarians, Italians, Jews, Poles, and Puerto Ricans, many of whom were first-generation immigrants. I appreciated the rhythm of diverse languages, the symphony of cultures echoing through every corner of my little coal camp. I learned how to have conversations with passionate hand gestures and animated facial expressions. Black gospel music and spirited conversations in Italian became the backdrop of my days. The rich aroma of soulful collards and pintos, garlicky Greek beans, savory Italian pasta sauces, and Hungarian goulash wafted through our community. Our dinner tables were a melting pot of international flavors. Hands of varied textures united—Pole with Greek, Jew with Black—and danced the hard dance of shared labor and celebrated the simple things in life that forged our coal-camp community.

Such were the ordinary threads that made up the fabric of my early childhood, yet they were sufficient to help me understand how people think and feel and yearn, and they were ample enough to make me feel at home in my future educational pursuits that encompassed language and literature and philosophy and religion.

Little wonder that I’d go on to earn my bachelor’s degree in the humanities with a concentration in English and allied fields in philosophy, religion, and speech. I had the luxury of studying the parts of life that meant so much to me. Later, I would earn my Doctor of Philosophy degree with specializations in American and British literature.

But here’s what’s remarkably beautiful and equally strange. An education–especially in the humanities–prepares and empowers us for many undertakings, ironically not always related directly to what we studied in college. To my surprise, after I earned my bachelor’s degree, I was hired as an editor at the Library of Congress (LOC). For someone who grew up in a home with three books, it was staggering for me to be working in the world’s premier library, the place with all the books.

It was in that position–going all the way back to 1969–that my love of the humanities started to intersect in silent and seamless ways with my love of computer technology and my current fascination with Artificial Intelligence (AI). Looking back, it’s clear to me that my first editorial job at the LOC allowed both hemispheres of my brain to work together and complement one another even if I was not aware of the tight interconnections.

In that editorial position, I worked under the leadership of Henriette Avram, a computer programmer and systems analyst who developed the MARC format (MAchine Readable Cataloging) that revolutionized cataloging and libraries. (Computer technology had begun in the late 1930s, but when I started working at the Library of Congress, computers were still referred to as machines.) As an editor, I identified the various bibliographic data fields on conventional 3 x 5 catalog cards, and those tags–signposts, if you will–allowed Library records to be converted into online catalogs. The MARC format became the standard for most computer programs and for cataloging books worldwide.
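For readers who have never seen one, here is a simplified sketch of what a MARC-style record looks like. The three-digit field tags (100 for the author, 245 for the title, 260 for publication details, 300 for the physical description) are genuine MARC tags; the particular book, indicator values, and subfield contents are shown purely for illustration.

```text
100 1  $a Twain, Mark, $d 1835-1910.
245 10 $a Adventures of Huckleberry Finn / $c by Mark Twain.
260    $a New York : $b Harper & Brothers, $c 1912.
300    $a 405 pages ; $c 20 cm.
```

Each tag tells a machine exactly which piece of bibliographic data it is reading, which is what made it possible to convert millions of 3 x 5 catalog cards into searchable online records.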

At the same time, the LOC launched its Retrospective Conversion (RECON) project to convert older cataloging records into machine-readable form. On January 1, 1981, the LOC stopped filing cards into its main card catalog: online cataloging of its collections officially began.

As I moved into other positions at the Library of Congress, my background in online cataloging was among my key assets. I would go on to serve as an editor of the Catalog of Copyright Entries, making final determinations for automated cataloging, editing, and publishing activities constituting the bibliographic and legal record of works registered for U.S. copyright protection.

After I left the Library of Congress and crossed over into academe, fulfilling my third-grade dream of becoming an English professor, my new career path provided other notable intersections that would integrate my knowledge of computer technology and my love of the humanities. When Laurel Ridge Community College (formerly Lord Fairfax Community College) launched online learning and teaching in 2000, I was among the first faculty to embrace the initiative and to offer classes using Blackboard as the delivery platform. Years later, when the college wanted faculty to use Open Education Resources (OER) to lower textbook costs for students, I volunteered, and within a year I had designed and developed my own OER courses in American Literature, College Composition, Creative Writing, and Leadership Development.

Now, we are poised at another historic milestone as Artificial Intelligence (AI)–specifically ChatGPT–offers us new learning resources that will revolutionize classrooms and lives. As an educator and as a human being, I embrace these technological advances fully. For me, it’s perhaps the most exciting moment in my entire life.

In tracing my path from my coal-camp beginnings to the vast landscape of Artificial Intelligence, I am reminded that life’s narrative is a complex dance between the humanities and technology. As I reflect on the unexpected turns, from my early days influenced by my mother’s sermons to my involvement in pioneering work at the Library of Congress and now my fascination with the marvels of AI, I find a harmonious integration of seemingly disparate worlds.

Just as the humanities laid the foundation for my understanding of the human experience, technology provided the tools to amplify and share that understanding with others. My journey from machines to Artificial Intelligence mirrors my own evolution, from a coal-camp kid fascinated by language and diverse cultures to a lifelong learner who eagerly embraces the next chapter of technological marvels.

As I celebrate this milestone year, I am grateful for the intersections of the humanities and technology in my life. It’s a testament to the ever-expanding possibilities that come to fruition when we allow these disciplines to not only coexist but also enrich each other. With a heart brimming with gratitude and a mind ignited by curiosity, I step into the AI future, ready to explore the uncharted territories where the humanities and technology continue to move in a captivating rhythm.

I hope that sharing highlights of my journey from my birth year of 1947, marked by the invention of the transistor, to the present day of Artificial Intelligence, serves as a testament to the enduring power of embracing change, fostering innovation, and finding harmony in the symphony of human and technological progress. Today, I see a dazzling future where the humanities and technology intertwine, creating a narrative that transcends the boundaries of what we ever deemed possible.

Cutting-Edge Technologies: Promise or Peril?

When … Fulton showed off his new invention, the steamboat, skeptics were crowded on the bank yelling, “It’ll never start, it’ll never start!” But it did. It got going with a lot of cranking and groaning, and as it made its way down the river, the skeptics were quiet for a brief moment. Then they started shouting, “It’ll never stop, it’ll never stop.”

“Grab an Oar and Row,” St. Louis Business Journal, November 9, 1997.

Live long enough and you will experience many cutting-edge technologies. That certainly will be the case if you live to be 75 as I have done.

Some of the technologies that I have seen were so short-lived that you may not be familiar with them at all. Boom boxes. Cassette tape recorders. Floppy disks. Portable televisions. Reel-to-reel tape recorders. Slide projectors. TV watches. Transistor radios. VHS video format.

Others, however, remain part of the fabric of our lives. Barcodes. Birth control pills. Communication satellites. Coronary bypass surgery. DNA fingerprinting. DVDs. Fiber optics. GPS. Human Genome Project. Hybrid cars. Jet airliners. MRIs. Microwaves. Pacemakers. Polio vaccine. The list goes on.

Of all the cutting-edge technologies that I have experienced in my life–vintage and outdated as well as state-of-the-art and up-to-date–three are special to me because I was involved with them in the early stages of their development and saw in them promise rather than peril. Interestingly enough, all three relate to automation–specifically the role that computer technology has played–and continues to play–in advancing literacy and learning.

The first goes all the way back to 1969 and my first position at the Library of Congress as a MARC Editor. MARC stood for MAchine Readable Cataloging, a set of standards developed by the Library of Congress in the 1960s to enable electronic sharing of catalog records. Within the next few years, MARC became the cataloging standard, nationally and globally. A parallel RECON Project converted retrospective materials to MARC format. But imagine. Computers in those early days were referred to as machines, and one of the biggest challenges was keeping the mainframe from overheating. I dare say that I did not understand fully the magnitude and far-reaching ramifications of the work that I was doing, but I knew that I was contributing to cutting-edge technology.

Fast forward to the 1990s, when Blackboard started as an interactive learning network to help teachers and professors adapt to the new reality of being able to teach via the Internet. In the fall of 1999, I started teaching at Laurel Ridge Community College (formerly Lord Fairfax Community College). The following spring, I was one of two faculty who opted to pilot Blackboard as our online learning platform. This time, I knew the magnitude and far-reaching ramifications of teaching online.

From that point forward, I continued to teach at least two–sometimes three–courses each semester online. American Literature. Appalachian Literature. College Composition. Creative Writing. Leadership Development. Southern Literature. Technical Writing. I made a point of challenging myself. For any class that I taught in a traditional classroom environment, I wanted to figure out how to revamp it–how to redesign it–for online delivery. I wanted to be a pioneer as my college moved forward with this cutting-edge technology.

Fast forward almost two decades, and I was pioneering again. This time I was embracing Open Education Resources (OER) in my classes. In 2018, I was the keynote speaker for a statewide professional development workshop sponsored by Laurel Ridge Community College, Lumen Learning, and the Achieving the Dream Foundation. I titled my keynote “OER: Open the Future of Education Today.” I knew beyond a shadow of a doubt the magnitude and far-reaching ramifications of Open Education Resources.

Within two years, I was using OER for all of my classes. Mind you: I didn’t tap into existing OER course shells. Instead, I designed and developed my own from scratch so that I could honestly and proudly announce to my students: “FREE Open Education Resources, Personally Curated Just for You.”

It seemed to me then–and it seems to me now–that four great cutting-edge technologies had come together in a way beyond anyone’s wildest imagination: the Printing Press, the Internet, Online College Education, and Open Education Resources. Working together, they revolutionized not only learning but also the way that we share and distribute information.

As with most innovations, these technologies, along with online cataloging, met initially with varying degrees of skepticism and pushback.

Printing Press Skeptics: The monks will lose their work. Monks will become lazy. Paper isn’t as good as parchment. With only one typeface, books are really ugly.

Online Cataloging Skeptics: OMG. What will happen to the card catalog? It will die, and then what?

Internet Skeptics: Just look at all the inappropriate materials poisoning the minds of our children. Staring at the screen will kill us. Thieves will steal my personal and financial information.

Online Education Skeptics: Online courses will never produce outcomes equal to in-person courses. Faculty-student interaction won’t be meaningful.

Open Education Resources Skeptics: Online resources are much more difficult to use. Who will guarantee quality? You just can’t do better than printed textbooks.

Now we’re all facing another cutting-edge technological moment. Its magnitude is beyond comprehension. Its ramifications are beyond far-reaching.

AI. Artificial Intelligence.

We all have a general idea of what it is: a branch of computer science that simulates human intelligence–visual perception, speech recognition, decision-making, and translation between languages. AI can get better and better over time based on the information collected and processed.

AI, of course, has been around since the birth of the field in the 1950s. But major advances came in the 1990s. You may recall that in 1997 IBM’s chess-playing computer Deep Blue defeated world chess champion Garry Kasparov. That same year, NASA deployed Sojourner, its first rover, on the surface of Mars.

AI is in the news a lot. Below are some major headlines that surfaced during the last few days:

“The 5 Biggest Artificial Intelligence (AI) Trends In 2023”

“AI Infrastructure Market to Reach $309.4 Billion, Globally, by 2031”

“Artists Sue AI Company for Billions, Alleging ‘Parasite’ App Used Their Work for Free”

“‘It’s the Opposite of Art’: Why Illustrators Are Furious about AI”

“AI Tools Can Create New Images, but Who Is the Real Artist?”

“Publisher Halts AI Article Assembly Line after Probe”

“Can an AI ‘Angel’ Help Find Thousands in Mexico Who Were Forcibly Disappeared?”

“This Mind-Reading AI Can See What You’re Thinking – and Draw a Picture of It”

“ChatGPT Passed a Wharton MBA Exam and It’s Still in Its Infancy. One Professor Is Sounding the Alarm”

As an educator and a writer, I find that a narrower slice of cutting-edge AI technology demands my attention. I’m thinking about AI apps that are influencing the future of writing and authorship. ChatGPT, for example, can be used for text generation, text translation, text summarization, and even sentiment analysis of a text to detect tone and emotion. It can even write poetry, and it can make ordinary prose sound like passages from the King James Bible.

It seems to me that all of us need to sit up and take serious notice of all that AI promises, even as we remain aware of the perils that AI skeptics perceive and toss into the conversation.

Skeptics are convinced that AI will:

● change the perception of information

● take jobs away from humans

● rob us of our ability to make decisions independently

● take away our creativity

● make us lazy

● do away with our privacy

● control our behavior

● threaten copyright, authorship, and ethics

It seems to me that the perils being voiced by AI skeptics aren’t too different from those sounded by other skeptics down through the ages about other cutting-edge technologies. The Printing Press. Online Cataloging. The Internet. Online Learning. Open Education Resources.

But here’s what we have to accept. Whether we like it or not–whether we feel threatened by it or not–AI is here. It has started. It will not stop. It is the future.

Promise or Peril? I have to decide where I stand. You have to decide where you stand. We all have to decide where we stand.

We can’t ignore AI.

Sadly, we can, but only if we want to be among the left behind.