Foundational and peripheral technologies
The world is about to change. In fact, it may have already. All because of ChatGPT and similar machine learning (ML)-based technologies. It’s a promise that’s been made about various technologies in recent decades. This time it’s different. We’ve reached a tipping point. Hear me out.
Foundational and Peripheral Technologies
New technologies can be placed into one of two buckets.
- Foundational
- Peripheral
Foundational technologies change the way the world works forever. Once a foundational technology is unleashed on an unsuspecting planet, the way humans interact with each other, with information, and with the environment around them is different forever. It will never return to its former state.
- Smartphone
- Internet
- Home Computers
- Television
- Radio
- Telephones
- Printing Press
Telephones and smartphones make the list. Mobile phones don’t. Foundational doesn’t mean they’re brand-new inventions from thin air. Everything has a predecessor. It’s a question of impact. Telephones and smartphones changed the way people communicate at a fundamental level. Mobiles just made phones... well... mobile. A significant advancement, for sure, but one that didn’t change the world until it morphed into the smartphone.
If we stretch ever so slightly into the realm of engineering, we could include aqueducts, electricity, steam power, and paved roads. Each of these was a foundational step towards a much different world. One that further advances were built upon over subsequent generations. Information exchange has been a core focus only in the last century.
Peripheral technologies change the way individuals work. Once a peripheral technology is available, a segment of the population adopts it. That segment may grow, but it’s unlikely ever to be in the hands of almost everyone or integrated into virtually every future product. Some, sure. But not all. Recent examples include:
- VR
- Blockchain
- Pagers
- Torrent
Whilst impressive and potentially sector-changing, VR is unlikely to fully replace the alternatives. It may morph into another, more ubiquitous form, but its applications beyond games, films, and previewing complex 3D designs before they’re built are few and far between. The devices are powerful enough, but the use cases just haven’t cropped up. Does that mean they’ll disappear anytime soon? No. Will they improve? Yes.
Until we reach a state of dystopia where no one can (or is allowed to) leave their room, it’s doubtful we’ll ever see the same market penetration as with smartphones. 86.34% of the world’s population has a pocket rectangle [1]. Multiply that by the average of 58 times each screen is switched on per day [2], and you hit 401,360,000,000 daily uses of a smartphone worldwide. Even if we take these figures with a pinch of salt, I just can’t see the same ever happening with VR.
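If you want to see where that headline figure comes from, here’s the back-of-the-envelope sum. The ownership and unlock rates are from the sources above; the world population is my assumption.

```python
# Back-of-the-envelope check of the smartphone figures above.
world_population = 8_015_000_000   # assumed, roughly 8 billion people
smartphone_share = 0.8634          # 86.34% own a smartphone [1]
unlocks_per_day = 58               # average screen switch-ons per day [2]

daily_uses = world_population * smartphone_share * unlocks_per_day
print(f"{daily_uses:,.0f}")        # ~401,360,000,000 daily uses
```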
You may be saying to yourself that there is a flaw in this logic. Smartphones didn’t appear overnight. They are the result of years of progressive development, building on what came before. The telegraph became the telephone. Telephones morphed into the mobile phone. In turn, mobiles combined with cameras, music players, and the internet to become what we call a smartphone today, assimilating many other technologies along the way. It’s these additional technologies that make it foundational and world-changing. When did you last reach for a dedicated camera, MP3 player, or even a landline? Smartphones changed the game in a very short period of time. VR just hasn’t.
The same goes for blockchain technologies. Pushing aside the currency angle, the technology has some compelling uses. Smart contracts could reduce paper-pushing and double-checking by a vast amount... but probably not to the same extent that computers have done in the last 30 years. There are even some interesting concepts built on top of the blockchain. The Helium network is one example. But this type of effort still feels a little too pyramid-shaped for my liking.
Back to the AI
This topic arose during a conversation with my class of final-stage computing students. I’ve been actively encouraging them to play around with ChatGPT by providing pre-written scripts to help in class activities. Each time, the conversation around the technology goes a little deeper.
Most of the class were among the first million users ChatGPT accrued in record time; the rest were among the hundred million that followed. They looked at it and had a quick play around. They found it cute. Perhaps they were even impressed by the technological feat. But after five minutes, they put it down and moved on with their lives. The impression was that the results were basic, repetitive, and nothing more than a technical demonstration of something cool. Why?
They lacked perspective. They lacked a use case.
At a glance, ChatGPT just produces words.
With a second, third, and fourth glance, you realise it can do much more besides. In all cases, it picks one of the most likely words to come next given the prompt and the previously chosen words, drawing on the roughly 570GB of text the large language model was trained on. But at university, using words you didn’t write isn’t allowed, which contributed to the quick dismissal of the tech. It is plagiarism. Well. That was the case before GPT. Today that’s not so clear... but a topic we won’t dive into today.
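As an aside, here’s a toy sketch of what “picking one of the most likely next words” looks like. The words and probabilities are invented for illustration; the real model scores tens of thousands of tokens with a neural network at every step.

```python
import random

# Invented "probabilities" for the next word after "The cat sat on the".
# A real LLM computes these with a neural network over its whole vocabulary.
next_word_probs = {
    "mat": 0.42,
    "sofa": 0.21,
    "floor": 0.17,
    "roof": 0.08,
    "moon": 0.01,
}

def sample_next_word(probs: dict[str, float]) -> str:
    """Pick a likely continuation, weighted by probability."""
    words = list(probs)
    return random.choices(words, weights=list(probs.values()), k=1)[0]

prompt = "The cat sat on the"
print(prompt, sample_next_word(next_word_probs))
# Feeding each chosen word back in and repeating this step is how an
# entire reply gets generated, one word at a time.
```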
The pre-written prompt that made them realise the technology’s potential was a user story generator.
Fun. Eh!?
We’ve been working through the idea > prototype > MVP cycle, using it to prototype a plan for engaging with the software industry and becoming employable. Over three weeks, they research and plan out their module experience, with a user story map acting as a central document to refine and adjust. Purposely, I’ve held back on suggesting they create user stories until now. They don’t add much value in the context of the module, but they do help the students think through their tasks.
The script allowed them to bullet point their list of goals (user stories) in the following format:
- Build an online profile
- Create a personal portfolio
- Attend three networking events
...before adding them to the prompt and dropping it into ChatGPT.
This generated a complete set of user stories and acceptance criteria in seconds rather than hours. Sure, they need tweaking. But that’s preferable to having to write them from scratch. They were impressed. They saved a couple of hours of boring work.
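The exact script isn’t reproduced here, but a minimal sketch of that kind of prompt-builder might look like this (the instruction wording is my own reconstruction):

```python
# Hypothetical reconstruction of the user-story-generator prompt.
goals = [
    "Build an online profile",
    "Create a personal portfolio",
    "Attend three networking events",
]

prompt = (
    "For each goal below, write a user story in the form "
    "'As a final-year computing student, I want ... so that ...', "
    "followed by three acceptance criteria.\n\n"
    + "\n".join(f"- {goal}" for goal in goals)
)

print(prompt)  # paste the output into ChatGPT
```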
They had a use case for the technology.
I’ve experienced the same with other people. For my partner, it was creating an exhaustive risk assessment in two minutes. For my gran, creating a song in the style of Frank Sinatra about a tuna sandwich. Each unto their own.
The context
Growing up with the internet has fundamentally changed how those under 30 interact with information and society. It’s even shaped the way they interact with other people and how relationships are maintained. A cycle that has happened with every foundational technology released. This isn’t necessarily a bad change; it’s just change.
A change that made instant gratification the norm. Answers and information were available within a few clicks for the first time ever, rather than a trip to the library away. Communication with friends was instant yet asynchronous, rather than delayed and asynchronous. Of course, real-time was available by phone, but letters were still a thing until recently.
It was a change that altered the Dunbar number for an entire generation. Our relationships are no longer limited to the community we directly interact with; internet celebrities and influencers now take up slots in that limited number. But the same was true of my own formative foundational change.
Whilst I didn’t grow up with the internet, I was an early adopter in my teenage years. What I did grow up with was television. Television changed my generation on a similar scale to the internet’s effect on the following one. My generation internalised another nation’s social norms, language patterns, and cultural artefacts in a way that had never been possible before. We had movies on demand via video. Of course, this nudged the Dunbar number upwards too, but you only got to see ten characters an hour. You can see that many in seconds on Instagram today.
Discussing the family tree with Gran over lockdown, I asked why I’d never been told all the cool stories from my family’s history before. Television was the answer. The goggle box entertained me in the evenings, whereas my Gran’s generation was entertained by stories told by their parents. She’d sit in front of the fire and be told about 500 years of seafaring history, the countries our ancestors explored, and the famous poems written about our forefathers.
Star Trek entertained me.
My generation wasn’t the first to have TV, but the introduction of hundreds of channels, video rentals, and video recorders changed the paradigm. The airwaves needed filling. And so did our heads. The US had plenty of programming to share around.
The radio and, to a lesser extent, the TV shaped how my gran interacted with the world in a similar way, though both were considered family activities in the early days. Electricity in the home shaped the way our great-grandparents saw the world: they could see it after sundown at the flick of a switch, without risking blowing up the house with gas lighting. The printing press changed knowledge distribution for our ancestors. Paved roads altered the way empires spread.
You can’t appreciate this cycle until you’ve seen it play out before. I’m just old enough to have seen it twice. Through music and smartphones. Not empires. I’m not quite that old.
Napster changed how we consume music to an extent not seen since music first became a mass consumer item. Those using Napster quickly realised they’d never need to buy a CD again. It took a decade for the recording industry to catch up with the new reality, and for big tech to take advantage of it.
ChatGPT is today’s Napster. It will not change the world by itself, but it’s the warning shot across the bow. AI-powered tools are coming. The moment to jump on board is now. If you’re a writer, there are many platforms to choose between. If you’re a coder, Copilot has your back. Video editor? Musician? Voice-over artist? There are tools promising to make your work life easier and more efficient. Don’t want to use them? That’s fine. These tools won’t replace you.
The people who use them just might.
Am I worried?
No.
The in-class discussion quickly turned to how it will affect the software industry. I immediately noted that anything from here on was only an opinion. Mine. Not my employer’s.
“I don’t care.”
The industry isn’t my concern. It doesn’t put food on my table. I do.
The industry isn’t going to put food on my students’ tables. An employer will.
And anyway, it’s an industry that’s changed continuously since its inception. Possibly more than any other. If a developer doesn’t update their skill set in some way, they’ll very quickly fall behind the curve. This is one of the reasons we enjoy development so much. We’re always learning.
What I am concerned about is the next generation’s ability to problem-solve.
University computing assignments don’t just teach the syntax of a programming language. They instil an ability to look at a complex problem from different angles. To break it down into its constituent parts and to solve these micro-problems along the way.
ChatGPT and its ilk are still working at a syntax level. Give it some pseudocode, and suddenly the results get far better. But pseudocode is the stage in the engineering process where software developers solve problems... or at least it should be.
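To make that concrete, here’s a made-up example. The pseudocode in the comments is the developer’s problem-solving; the function below is the sort of syntax-level translation you could ask ChatGPT to produce from it. The scenario and field names are invented.

```python
from datetime import date, timedelta

# The developer's pseudocode (the actual problem-solving):
#   for each order:
#       if the order is unpaid and more than 30 days old,
#       add it to the reminder list
# Asking ChatGPT to "turn this into Python" is the syntax-level
# step it already handles well.

def overdue_orders(orders: list[dict], today: date) -> list[dict]:
    """Return unpaid orders placed more than 30 days before today."""
    cutoff = today - timedelta(days=30)
    return [o for o in orders if not o["paid"] and o["placed"] < cutoff]

# Example run with invented data:
print(overdue_orders(
    [{"paid": False, "placed": date(2023, 1, 2)},
     {"paid": True, "placed": date(2023, 1, 2)}],
    today=date(2023, 3, 1),
))  # -> the first order only
```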
This is where our in-class conversation led us. It is going to change the way developers work. It already has. It may even reduce the number of developers needed in the future. But by focusing on the problem-solving aspect of the skill set, there will be jobs to step into... for a few more years at least.
This is where I disagree with some of my peers. In universities and industry, voices are beginning to call for a foundational change in the education system. I’m yet to be convinced that such a knee-jerk reaction is called for. Let’s get through the trough of disillusionment before we tear up thousands of years of education. We need to see how this plays out for a little while yet. Adapt? Sure, I’ll make that call, and do so today. But a seismic change in the next academic year or two? It’s never going to happen.
What now?
If you’re even partially convinced that we’re beyond the tipping point, you might wonder what’s next. How can you ride the wave in the coming years?
Get used to using the early versions of this generation of tools. They’ll improve (hopefully) and become more useful (definitely) over time. They’ll even get easier to use. Getting used to them now will only help down the line.
Writing words? Have a play with ChatGPT.
Making music? Have a play with Beatbot.
Making video content? Have a play with Runway.
Making landing pages? Have a play with Sitekick.
Making a podcast? Have a play with Descript.
And have fun with them.
[1] https://www.bankmycell.com/blog/how-many-phones-are-in-the-world
[2] https://explodingtopics.com/blog/smartphone-usage-stats