These days, the word "technology" usually brings to mind something that runs on electricity, has a screen and a keyboard, and involves Microsoft, Apple or Linux. Google brings up a definition saying it is the application of scientific and engineering knowledge to industry and to the creation of machines and equipment.
I am of a transitional generation. More or less by chance, I began my doctoral program with some work in the programming language Fortran. The experience was probably a challenge for the professor as well as for his students. It was only a one-credit course, but he kept changing the requirements, and it took our class of four or five students three years to complete them to his satisfaction before he could modify them yet again.
I came to the campus to teach, but while I was settling in, I was offered a half-time position as the first academic computing director. At that time, most people had no idea what a computer was or could do. This was well before the internet. Computers were a whiz at collating, printing and calculating, but they were not networked in any way; each one stood alone, doing not much. In 1984, nearly 20 years later, we got our first home computer. It had two disk drives, one for the program disk and one for the data disk.
I taught statistics for many years. At first we used a calculator, then a spreadsheet. Built-in functions let us get means and standard deviations with great accuracy and great speed. The chancellor of the university took it upon himself to buy a desktop computer, along with a basic word-processing, spreadsheet and database software package, for each faculty member, even those who didn't want one.
It wasn't really until the emergence of the smartphone that we reached the current state of most people looking at screens most of the time, whether walking or sitting. We are just beginning to enter the era of everyone looking up everything all the time: facts, ideas, references and dates.