In my last article, I addressed the need for, and the advantages of, being a lifelong learner. Closely related to that learning is the ability to choose what to learn, when to learn it, and when to adopt the learned technology in your organization.
The often-used phrase “being on the cutting edge” means being at the forefront of technology, staying right at the edge of new inventions and concepts. One of these recent concepts is Industry 4.0. Many areas of new and emerging technology require a level of extra training and knowledge, such as virtual and augmented reality, the Internet of Things (IoT), the Industrial Internet of Things (IIoT), Artificial Intelligence (AI), Big Data, and Data Analytics. As you can see, there is a lot to know and learn should you decide to adopt these technologies. The question is, “How do I know when to jump in and apply the technology to my company?”
In 1962, Everett Rogers published a book called Diffusion of Innovations, in which he classified consumers into distinct groups based on their tendencies to buy. This concept is sometimes associated with another premise, the technology lifecycle.
Although there are similarities, these two concepts are not the same: technology adoption refers to the stage in which a technology is selected for use by an individual or an organization, while diffusion of innovation refers to the stage in which the technology spreads into general use and application. To further the distinction and reinforce understanding, diffusion requires adoption; for a product to be diffused into use, it must first be adopted and accepted.
Rogers broke the diffusion of innovation into the following stages, from the first buyers to the last: innovators, early adopters, early majority, late majority, and laggards.
You’re probably saying, “This is all well and good, but what does this have to do with technology and learning?” It depends. I am not being dismissive or flippant, but it really depends on the type of company you work for, the maturity of your company’s product offering, your budget, your workforce, and more. There are many factors to consider.
Consider what might have happened to Apple had they stopped innovating after the introduction of the original iPhone. From the user’s standpoint, what about the people who need to buy the newest iPhone minutes after its introduction? The gamble on the bugs inherent in a new phone and operating system might be minor to you, or something that can be avoided or overlooked, but what if it were your product? Or what if you are a company like Apple that innovates and introduces new technology? All of these questions need to be evaluated, and the level of risk must be determined specific to your offering or needs.
How does this all tie into learning, education, and knowing what technology to choose and when to implement it? It means that you need to learn about the technology, the product, the overarching concept, and how each best ties into your needs and your company’s needs.
Trade conferences, such as EASTEC shown in the collage, offer seminars, workshops, focus groups, direct training, and other knowledge-transfer sessions. Universities often offer concept and topic training, such as weekly seminars on diverse leadership topics. And, of course, colleges and universities offer the more formal, higher-level education needed to build the foundational knowledge for your chosen technology.
In conclusion, technology adoption is personal and unique, and it must be evaluated by the individual and the organization to determine the appropriate level of risk and timing. Are you or your company one that can afford to take the chance of buying the brand-new and often unproven device? Or is your company one that can’t afford any time loss or investment because of unproven technology, and therefore must wait until it is more mature?
This is the initial evaluation. Once that is complete, you then need to take on the level of learning commensurate with the technology being adopted.