Note from the Author: Around 1974, I was offered a consulting contract focused on modernizing an old refinery to take advantage of the then-modern automatic control technology. The first big project involved “modernizing” a refining unit originally built in the early 1950s with pneumatic instruments and communications. At the end of the project, many of us were asked to write down our feelings on nine topics that management felt were signposts for the future of the industry. These topics were Technologists, Maintenance, Unions, Engineers, Manufacturers, Programmers and Systems Analysts, Operators, Women in Engineering, and Computers. The fellow who gave me the assignment was a wonderful man I worked with for many years, and one of the world’s best people. As I read through my notes and writings from that time, I now wonder if he wanted to know my thoughts or perhaps wanted me to understand the significance of what we had all just done. His company, though, took it all very seriously, and many changes developed across a broad range of issues.
What follows is my original writing from 1978, which shows the industry as seen at the time by a young, 30-year-old engineer. Alongside the original article, I have added my present-day thoughts in response to my younger perspectives.
I suppose this story begins around 1976, but there is always the path leading up to any beginning and a period of evolution or convolution that follows. My 30th birthday occurred in January of that year. By then, I had completed a stint of military service during which I commanded a platoon, deployed to Europe, and worked in Staff Intelligence for the US Army Europe headquarters.
I returned to college and completed a BSEE and a master’s program in physics. I’d done a few years (5 counting work during college) with a major electric utility company, three years with a cutting-edge systems integrator, and a couple of years of private practice. I was a registered engineer in Electrical and Control Systems engineering. Among my engineering services clients were several oil refineries. I was doing a lot of process automation and data communications work with some of them. They all saw automation opportunities and desired to position themselves accordingly.
One of the most extensive projects involved a massive updating of the original refining processes. Our portion involved a lot of process measurement and control systems work, mostly directed at equipment that had been installed in the early 1950s. One of the items proudly displayed in my office for decades is a Foxboro E3A pneumatic pressure transmitter, which had entered service when the plant was built and was replaced by our project in 1978. Over its 26-year service life, that transmitter had been there during the growth and development of the refinery, which now faced a new spectrum of process improvement possibilities and necessities. This and similar events brought the owner to ask many of us involved with mapping out and implementing this sea of change to write down our thoughts about where things were going. This article was drafted by me at that time.
A common problem in those days was dead-time-intensive situations, where the impact of a change at one place might take a long time to see at a distant “other.” My company’s products included a dead-time-aware controller that integrated conventional PID control with some integrated-circuit logic, producing a sophisticated unit that still felt familiar to contemporary users.
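The internals of that product are long gone from my files, so the following is only a minimal sketch of the general idea, in the spirit of a Smith predictor, written in Python. The class name, the PI tuning, and the first-order-plus-dead-time process model are illustrative assumptions, not the actual 1970s design.

    from collections import deque

    class DeadTimeController:
        """Hypothetical sketch of dead-time compensation around a PI loop.
        The process is modeled as a first-order lag (gain K, time constant
        tau) plus a dead time of delay_steps sample periods (>= 1)."""

        def __init__(self, kp, ki, K, tau, delay_steps, dt):
            self.kp, self.ki = kp, ki               # PI tuning
            self.K, self.tau, self.dt = K, tau, dt  # process model, sample time
            self.integral = 0.0
            self.model_y = 0.0                      # model output without dead time
            self.history = deque([0.0] * delay_steps, maxlen=delay_steps)

        def update(self, setpoint, measurement):
            # Oldest model value: what the real, dead-time-laden process
            # should be showing us right now.
            delayed_model_y = self.history[0]
            # Feed back the measurement plus the response the dead time is
            # still hiding, so the PI terms can act as if no delay existed.
            error = setpoint - (measurement + self.model_y - delayed_model_y)
            self.integral += error * self.dt
            u = self.kp * error + self.ki * self.integral
            # Advance the internal first-order model one step (Euler).
            self.model_y += (self.K * u - self.model_y) * self.dt / self.tau
            self.history.append(self.model_y)
            return u

    # Hypothetical tuning for ~10 samples of dead time at a 1-second period:
    # ctl = DeadTimeController(kp=0.8, ki=0.1, K=2.0, tau=15.0, delay_steps=10, dt=1.0)

The essential trick is the one those old units exploited: let an internal model supply the feedback the dead time is hiding, so the loop can be tuned almost as if the delay were not there.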
This was only the beginning, but so much more was ahead! As I read and remember the thoughts below, I clearly remember standing at the beginning of the computer revolution in process control. Today, when I close my eyes and rock back in my desk chair, I see a similar story evolving with contemporary networking.
The beginning might be as far back as 1980, when then-new Ethernet began facilitating the burgeoning connectivity of multiple computers. The marketing tagline most easily remembered from what followed was “the network is the computer,” often attributed to John Gage at Sun Microsystems. The concept, however, spawned a broad effort. In essence, bigger and faster problems required more computing capability (more MIPS) than the computers of the day could provide. DEC had pioneered the concept of multi-computer networks, pointing out that synergistic use of a collection of appropriately networked computers could solve much bigger problems than even a collection of individually applied ones. A bunch of computers networked together was “on it,” probably with more capability (more MIPS, for example) than the fastest supercomputer of the day. In those days, it was hard to make it through a week without at least one discussion about, or related to, “Shannon’s Theorem,” which established a profound connection between bandwidth, noise, and achievable data rates. All that pointed toward where we ultimately needed to go, and we have made it a long way down that path.
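For readers who never sat through one of those discussions, the Shannon-Hartley form of the result bounds the error-free capacity C of a channel, in bits per second, by its bandwidth B in hertz and its signal-to-noise power ratio S/N:

    C = B · log2(1 + S/N)

A telephone-grade channel of roughly 3 kHz at a 30 dB signal-to-noise ratio, for example, tops out near 3000 × log2(1001), or about 30 kbit/s, which is roughly where analog dial-up modems eventually stalled.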
In Shannon’s world, information meant things you didn’t already know. Data meant records of the things you were measuring. Often, it took a lot of data to allow synthesis of the critical information. Then it became necessary to do something with these new things we knew. As the things we wanted to do became more important, and the critical information harder to discern in deep seas of data, computer technology struggled to keep pace. This feels a lot like the integration of what has become huge amounts of data into process control, what has become known in other endeavors as “big data.”
A few years later, I was in a client company’s national pipeline control center watching at least five major pipeline systems in various parts of the United States running along, fully controllable from the fifth floor of a downtown office building in Texas.
A few more years and I was in Saudi Arabia making some points about networking and resource consolidation. I had arranged to show displays of a process control system in Alaska responding to operator input from a control room in California. I did this by logging into the server in California. The inevitable question was, “How far can this all go?” I then pulled an iPhone out of my pocket and connected it to the laptop I was using for the server connection. As the possibility of this previously impossible situation sank in, someone summed it up: “From Alaska to California to Saudi Arabia to an iPhone in our third-floor conference room? Wow!”
Without much imagination, it was easy to feel returned to the “old days” (a few years back), when there might be five related control houses in a refinery processing crude oil into a few consumer products. With all the current issues of operation, manpower, and economics, are we standing at another door like the one we passed through back in ‘76? Looking at this image from the past, can we see how much change is coming to the industry, its leaders, its economics, its objectives, and where it is going?
The ‘80s will be the era of the technologist. With good engineers in short supply, the stage is set for the technologist to carry the bulk of the workload we now define as engineering. Just as the law clerk does the attorney’s research, and as the physician’s assistant relieves the doctor of involvement with routine cases, so will the technologist augment the engineer. Technologists will largely replace the project engineer on small projects. They will take over the engineering function on most routine projects.
Technologists are only beginning to understand their importance in industry. Within a few years, there will be a thriving professional society for technologists. It will be pressuring management for the recognition, working conditions, and salary that technologists deserve. It is very likely that technologists will loyally support a push toward organized professionalism, and their economic and political power could easily outstrip that of the lackadaisical engineering community.
To be effective in control engineering, both education and experience are necessary. Experience with limited education is more useful than a lot of formal education without experience. Even if the colleges provided a glut of well-educated engineers this year, the deficiency in real control engineers would last for five years, while this group gains enough experience to learn how to apply a good education pragmatically to the real problems of industry. A much more realistic approach to the manpower deficiency is to augment the education of technologists and enable them to expand their experience to a more diverse set of problems. Not only would technologists then become more productive, but the scarce supply of good engineers in the field would then be freed to work on problems and opportunities that really require an engineering education.
The ‘80s will see an increasing number of short courses, books, seminars, and events dedicated to the enhancement of the role of technologists.
In the “old days,” companies trained their most outstanding maintenance people beyond the basic “take it apart, fix it, and put it back together” role that was essential for keeping things running. In refineries, much of what constituted advanced process control seemed to come from the “valve shop,” where problems could be discovered during maintenance and valves modified (e.g., by changing the characteristics of the flow path through them) to improve performance in specific situations. Many excellent technologists had made their way up through the “valve shop.”
When I started college in 1963, students interested in technical training had a building with lots of labs and courses focused on the technology of the time. Engineers spent their first couple of years in math and physics with a taste of engineering. Practical work came early for those in technology majors, while a long path of experience and additional training awaited the new engineer. On the other hand, automation was a new thing. The technologists understood how and why engineers had a more intensive background in the underlying science and in the methodology for applying it, even to unusual or “never seen before” problems, and the insight of young engineers coupled with the deep experience of technologists made for amazing project teams.
For decades an engineering education was in demand. Eventually, all the common problems had been seen and solved at least once, and doing the required things trickled down to the technologists. This balance is unstable and swings with what’s new and what is being adapted to new issues. When new technology is invented, the balance swings toward engineers. When the industry is stable and investment is low, the balance swings toward technologists. In this current unsettled world, technologists will do well keeping things running properly and optimizing them. When there is forward vision and investment, engineers will become more important; industry will have to sort out where it is going and why for the balance to shift back toward real engineering. In the meantime, technologists with appropriate training and some experience will be in intense demand.
The technological revolution of the ‘70s will find itself bogged down in the trenches of plant maintenance in the first half of the ‘80s. Manufacturers are now having a difficult time attracting software and hardware engineers for the new technology products. Plant maintenance organizations, in most cases, haven’t even recognized the problems their engineering departments have designed for them.
While supervisors of maintenance are trying to figure out why the “instrument mechanic” capable of doing software and hardware work on a microprocessor system should be treated differently than a boilermaker, manufacturers, sales organizations, and more progressive companies are hiring away the very people he can’t afford to lose.
I have yet to meet a plant maintenance technician capable of working on microprocessor systems who is not actively looking for a job offering better working conditions, more money, and a larger measure of recognition. Engineers are designing, procuring, and installing systems they don’t understand. All too often it is left to plant maintenance to make these systems work. Maintenance usually is expected to do so with no new people, no or minimal training, and inappropriate test equipment.
There is but one reason the maintenance problem hasn’t yet been recognized as the crisis it is. That reason is a lack of adequate post-project performance auditing. Good operators can adapt to almost anything—even inoperative and inappropriate automatic control equipment. Management assumes that the continuing uneventful operation of a unit indicates a successful project. If an audit were performed to determine if the justification objectives were met, a different picture would, in many cases, emerge.
There is no easy source of qualified maintenance people. Extremely good ones are needed now in large numbers. Eventually, training programs will adjust, and the problems will diminish. This, however, is not on the horizon.
What it is has everything to do with how to maintain it. The maintenance function is driven by the design, and the design is driven by the underlying science. The construction is influenced by the design, but also by the methodology involved in implementing it. Economics creep in to help optimize value, which is a function of what it costs to buy a thing and then what it costs to keep it working properly. Then there’s how long it will last, and the impact maintenance can have on that. Maintaining mechanical things made from metal has its issues. Maintaining things that rely on moving parts, energy, and mechanical power-conversion devices requires a different skill set. Then there are things that rely on precise adjustment and sensitive measurements feeding, via a virtual mathematical realm, actuators and controllers using technology stretching from quantum-minuscule to impossibly large and powerful. That’s a lot of variability to cover in specifying what maintenance and its needs may be.
Back in ’76, it was becoming frighteningly clear to maintenance people and those who needed them that the best pneumatic controller maintainer on earth was about to find it much harder to eat regularly, unless he found a way to change. Mathematics not even mentioned in trade schools and industry courses was essential to understanding almost everything. It was all-around challenging. I have always taken comfort in the quote from Marcus Aurelius that has brought me through so many situations: “Never let the future disturb you. You will meet it, if you have to, with the same weapons of reason which today arm you against the present.” Applying all that logic and reason, though, requires an acceptance of change and a willingness, even a desire, to meet it.
The union movement is a dinosaur. It has no choice: evolve or die with the swamp. American industry used to be so productive that few countries in the world could compete with our manufactured products. That is simply not true anymore. As we slide from number one to number two as a car producer, we are forced to realize that we are losing market share and our ability to recover it in the style to which we were accustomed.
Industry has not insisted on sufficient profits to ensure its future. The profits of many of our leading industries over the last few years, when adjusted for inflation, are really losses. The largest tragedy has been the failure to make sufficient profits to allow adequate investment in the future. Our heavy industries are capital intensive. The capital equipment is also old. The rebuilt industries of Germany and Japan are much more productive than our older plants. Their profits are higher, and their investment in new plants and equipment is higher. Our decaying plants can barely make enough profit to continue operation, let alone set aside investment for the future.
Union agreements have priced labor beyond its value in American industry. We don’t have the relative productivity in the world’s markets to support them anymore. We are on the way down. While the Chinese have performed engineering miracles with no industry and large amounts of inexpensive labor, we have traditionally performed similar miracles with a highly industrialized economy, use of massive amounts of energy, and a very expensive but highly productive workforce.
Relative to Germany and Japan, we are becoming the China of the world industrial community. The price of doing this is going to be a dramatic decrease in the value of labor.
The union movement must sell management on its value. It is going to have to take the lead in training its members in skills and attitude, so they will be worth what they are insisting on being paid. In negotiating contracts, they are going to have to allow management a profit adequate, after adjusting for inflation, to support investment in the future. In fact, they should insist on it!
A huge driving force for industrial automation occurred with the Oil, Chemical and Atomic Workers strike in 1980. By then, much of industry had experience with automation and had found exciting new opportunities. The strike provided an opportunity to shift focus from experienced people toward the benefits of new technology and new operational ideas. In some cases, production records were exceeded and product quality increased. One of my services clients asked us to look at automating the arrangement of boxes of product onto shipping pallets, something that took quite a few people to get right. In the end, a machine with a bunch of pneumatic positioning arms run by a programmable controller replaced a large crew and improved pallet uniformity, thus facilitating shipping. I think we were the first to do that, but far from the last. Now, implementing any new process begins with looking at the labor vs. automation costs and benefits. This tends to reduce the importance of highly skilled labor at costs that are quickly amortized.
The instrumentation and control systems engineer of the future is going to have to be quite a fellow. An understanding of his industry’s processes is going to be more important than before, since equipment is now available that eliminates or pushes back old obstacles to improving processes. Process equipment capable of surviving more hostile environments, higher temperatures, and higher pressures is being used in new designs. Control devices capable of control beyond last year’s state of the art are increasingly available. New possibilities in control equipment mandate a search for opportunities to use them profitably. The gap between process engineering and control engineering will widen in the ‘80s because of the growth of both fields. Meeting process engineers halfway is going to become a taller order for control engineers. Achieving the mutual understanding of the process and the technology required to control it will be increasingly difficult and will require a lot of additional effort.
Keeping pace with technology is going to be increasingly difficult. The lack of standardization in advanced control equipment is going to force, at least in the short term, a heavier dependence on a single large vendor for a company or plant’s control equipment. In the past, installing a Taylor controller in a plant instrumented with Foxboro equipment was a minor problem. In the future, as the console-based systems take hold, it will become impossible to do so. No one can be familiar enough with all of the current vendor offerings to design with several of them. No maintenance department I know of can adequately handle one of these systems, let alone several different kinds.
Control theory is going to become a more important part of the engineer’s work. This will take place because technologists will take over much of the more routine work and engineers will be expected to do more difficult control jobs. Also, equipment is becoming available to eliminate many of the technological barriers to applying techniques suggested by control theory. As energy becomes more expensive, good process control becomes more important and the additional benefits to be derived from small improvements become more significant, and therefore, worth the sometimes considerable trouble required to attain them.
Computer control is going to be an important part of the world of process control, at least until the industry determines what the new generation of control equipment should do. When functions are better defined, microprocessor products will replace a lot of the work we now do with computers. Until then, and probably even thereafter, control engineers are going to have to have considerable facility with computers both in software and hardware. They are going to have to become increasingly good at the algorithmic approach to solving process control problems.
A generation ago, my grandfather, one of my uncles, a fellow I worked for at one of my college jobs, and a supervisor I worked for after graduation built a system of hydroelectric power houses that are still in operation today. They were all part of my life at various times and in various ways. When I was about to graduate from college, the one who supervised my college work asked me over to his house one evening and after some discussion gave me a bunch of his old engineering books. He had graduated in electrical engineering just like I was about to do, and he wanted me to have some of his remembrances from those times.
There were some expected things, like an IEEE handbook. The course books were published by the same company as some of mine and were concentrated on things such as how to make wire. This was probably the single most important moment in my career, an epiphany about context—the past, the present, and what all that can say about the future. These showed me a beginning in which making wire for electric power distribution was a very important subject, followed by their life’s work of building powerhouses that changed how energy was made, transported, and used throughout the country. There emerged a spectrum of development, inspiration, innovation, and success reaching into and showing the way toward a new future. Here we were, passing on the engineering ambition, thought, innovation, and progression that has been with humans since the advent of civilization on Earth 6000 years ago.
Their concerns about wire weren’t trivial back then, but as I walked into the profession, such wire was a simple “check the box” catalog item, at least in most cases. Electric machines were covered in just one of my many classes, yet they were the driving force of the future across the profession. During this time, Neil Armstrong was taking “...one giant leap for mankind,” showing us a completely new horizon that we were coming to vividly imagine but could still barely see. Some of us read Asimov’s Foundation trilogy and experienced finding the path and understanding change.
My work was mostly in process automation and control. The most important textbook was the second edition of Benjamin Kuo’s Automatic Control Systems. As I sit here in mid-2021, I see ads for the eighth edition of that book. Do you suppose there’s anything in that edition that a person who studied from mine would have difficulty with on a mid-term exam? It reaches into design for space vehicle payload control, something that Neil had just taken that first step toward back when I was about to complete college. A lot of time has gone by, making that “next horizon” easier and easier to see.
Engineering began with practice; then it was backed up by science, and then by the organization of design process and evaluation, drawing on the impact of experience and the perspectives that come with science and its productive application. As we learn more and experience more, that horizon out there becomes clearer and clearer. Is it the end of the yellow brick road, or is it just a resting spot on our way toward the fuzzy horizon beyond this progressively clearer one we’ve watched so closely? It becomes easier and easier to see where we’re going, but is that it?
In the short term, the control engineering manufacturing world will continue to be dominated by Foxboro and Taylor. Honeywell will make advances for a brief period with TDC-2000, but eventually the poor human interface of TDC-2000 and Honeywell’s wormy corporate structure will slow or eliminate the advance. There will soon be a clamoring for industry standards among the new small companies with highly specialized control products.
Foxboro will not be successful with Spectrum unless prices drop dramatically. Many segments of industry are still waiting for an H-line replacement, but they are not seeing an incentive to spend the kind of money to convert to the Spectrum equipment. The appearance of Spec 200 single-station controllers and recorders represents a recognition that giving away the small-project market is not a good idea. The sales of multiple small systems frequently set the stage for the eventual purchase of a large system. If Foxboro maintains its dependence on Spectrum and its “son of Spec 200” system philosophy, Foxboro will lose its position in the market in the ‘80s.
Taylor will probably be the most successful of the existing producers of console-based control systems. However, specific deficiencies in interconnection capability and data communications security will have to be overcome. If the additional work is done, the fine human interface of MOD III will result in Taylor gaining market share.
The dark horse in the vendor stable, however, is none of the above. The electronics industry has always been a garden of small, highly specialized, and technologically advanced companies. The life cycle of these companies typically begins with key people “spinning off” of a larger company to develop some idea or innovation. Growth is slow, as is cash flow. When development is complete, a market must be penetrated, and the need for capital is intense. At this point, some companies refinance and grow into medium and occasionally large companies. Others become divisions of larger companies. Many die. Historically, these entrepreneurial operations have done expensive development work and cracked open difficult markets that have then been more effectively exploited by large companies.
In the controls field, the stage is set for quite the opposite event. The big companies have created an expanding market and pointed the way the technology needs to be developed. The stage is set for the electronics industry to exploit it. All that is needed is some industry-standard mainframe system that all the diverse products of these many companies could be “compatible” with. The computer industry is full of products sold as “IBM compatible” or “Multibus compatible.” The electronics industry is good at designing products with interfaces compatible with anything. Exactly what that standard will be in the control equipment industry is the only question. The day a small, under-financed company in the Santa Clara Valley announces something like a user-programmable multivariable controller and states “Foxboro Spectrum compatible,” Foxboro stock should double by closing time. Whoever invents (or is accepted as) the industry-standard mainframe system will become the dominant vendor in the ‘80s.
The opportunities for building better control equipment have never been more plentiful. The odds the standard fare of any one company can dominate the market are small. The smart money is carefully watching the electronics industry. The industry leader of 1989 may have yet to build its first control product.
Struggling to maintain a leading position in the environment of rapidly changing technology is very difficult and extremely expensive. It is almost inconceivable that any of the current vendors will sell their current console-based systems in sufficient quantity to make price reductions possible in the near term. Selling these systems is going to depend on better applications engineering support for the industrial users. The capabilities of these systems are not obvious to practicing engineers. In fact, most practicing engineers and many vendor sales organizations still think about these systems as another way to do a bunch of loops in a way similar to what we’ve always done. This approach is not going to be sufficient to penetrate major sections of the available market.
Back in 1976, all manufacturers were struggling to find, develop, and incorporate the technology that was becoming available into their products. This began with ideas, which motivated the development of the underlying technology, and finally the production of the specific features valuable in the market. When some new idea came along, manufacturers had to consider its value and impact in their markets. In the ‘80s, a “color display” was one in which the printing was orange instead of white. As time went on, advanced equipment could display messages on monitors in several colors. Surprisingly soon, incredible graphic displays provided the pictures that are worth 1,000 words.
Nothing has changed all that much. A bright kid, probably an engineer, will dream tonight about some amazing way of showing the relationships among process benchmarks. He will tell his department supervisor, who will go to a meeting, then another meeting, and quickly it will be running in a lab and shortly thereafter will appear in a product, moving the requirements for being the “industry leader” farther along the path. One management function should be the motivation of creative thinking and work that, when beneficial, can be placed into product designs with the potential of keeping stockholders happy. This has gone from being a company-making opportunity to being a necessity for maintaining a competitive position. The decades since have illustrated how the process works and encouraged the developments required for enduring strength and maintaining leadership.
Computer people succeeded in establishing a priesthood in the data processing field in the ‘60s. It has continued and flowered into the ‘70s. The process control field has been more wary and dominated by more pragmatic engineer-managers. Certainly, every manager of computer control projects has personal experience with at least one flaky displaced flower child who, through a cruel trick of fate, became a programmer or analyst on an industrial control project.
This situation has been improved by engineers moving, frequently in total disgust and desperation, into the computer field. The term “software engineering” was coined at least partly to define a difference between the keyboard cowboys and the engineers who could get things done.
No trend toward professionalism is evident in the original programmer group. Engineers will supplant the flakes completely in the ‘80s. Software engineering will become a separate discipline and will develop a structure and a defined educational curriculum. Professionally, software engineers will probably continue to support the IEEE.
The first computer program I wrote was in engineering school. It was written in Fortran, typed as punched holes into what we knew as IBM cards, which were assembled in statement order into decks and placed into the hopper of a “card reader.” The computer read the cards, translated the instructions into “machine language,” and submitted the result for processing. Results were normally printed on “computer paper,” big white sheets with holes for the pin drives on the sides. There was a lot to know to get the computer to tell us what the product of “two times two” was. Usually, a homework problem might involve a few hundred punched cards. The deck would be turned in at the computer center and processed in turn. On a good day, there might be some computer paper with printing wrapped around your card deck, waiting in a pigeonhole for you by the end of the day.
In the “early days,” the focus was on writing cohesive but simple programs. As time moved along, the science of programming evolved far beyond the grammar of the programming language and on to strategies and program organization (construction) intended to optimize the process. All of that, along with some new programming languages, made it into the technical press. Coursework began to evolve around such issues as Dijkstra’s “top-down structured programming” and the IBM-spawned “input-process-output” organization. Program organization also evolved around operations-research concepts that tried to avoid doing computer-resource-intensive processes more often than essential. Synergistic network concepts took advantage of integrated circuits and multiple processors that could be used collectively to achieve high processing speed. Using all this productively transcended simple code-writing, and the systems analyst job emerged: more of a computer systems person than merely a code-writer, a “novelist” in a sea of note-writers. Engineer involvement was common. Knowing all the mathematical and operational processing issues enabled a whole different development concept than typing code until something worked.
The product that made most of my money was done in software, starting with a proof of concept in BASIC running on an Apple IIe. From there, a bunch of program elements were written and finally integrated into a single-computer system (an Intel 386 processor). The code executed cyclically each time the data changed and was arranged to optimize processing speed as a method for keeping the analysis system running in “real time”; a sketch of that organization follows. The program was tightly organized and highly structured, which facilitated troubleshooting and development. After that early period in the late ‘80s, a lot of programming was returning to an attitude of writing logically correct but poorly implemented code, which required more computer time than older code had needed on slower processors. Structuring programs and systems to stay away from simplistic code and less-than-optimal execution strategies became difficult; we drifted back toward less analysis and more code writing. That became an industry-wide struggle that now seems to be moving back toward the importance of analysts and less respect for code-writers.
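That original code is long gone, so here is only a minimal sketch, in Python rather than the BASIC and 386-era tools actually involved, of the cyclic, change-driven organization described above; all of the function and parameter names are illustrative.

    import time

    def run_cyclic(read_inputs, analyze, write_outputs, period_s=0.1):
        """Hypothetical cyclic executive: recompute only when the input
        data actually changes, and hold a fixed cycle period so the
        analysis stays effectively real-time."""
        last_inputs = None
        while True:
            start = time.monotonic()
            inputs = read_inputs()              # snapshot the process data
            if inputs != last_inputs:           # recompute only on change
                write_outputs(analyze(inputs))  # one tightly bounded pass
                last_inputs = inputs
            # Sleep out the remainder of the cycle to keep the period fixed.
            time.sleep(max(0.0, period_s - (time.monotonic() - start)))

The discipline is in keeping each pass through analyze() cheap and predictable; on a 386-class machine, that organization was the difference between “real time” and “eventually.”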
Operators are going to be expected to control more loops. Console-based control systems make it practical to organize operator responsibilities around inter-related features of plant operation rather than geographic distributions of the field instruments. The number of loops that can be presented to a single operator is now limited by the capability of the operator more than any technological factor. The benefits are improved plant operation through better co-ordination than is generally possible with many operators distributed over several control houses. Operators are going to have to understand more processes. They will control not one but several units. This will motivate operators toward a more progressive and professional attitude about their work.
In the beginning, a guy with a shovel and crowbar wonders what this big metal thing is supposed to do. A way further down the road, a guy looks at the big metal thing which he now understands is a “valve” and runs into the control room anxiously explaining that the “valve thing” is moving on its own. Later still, a technologist observes that product quality is up beyond his expectations because everything adjusts toward the same goal. Then, a guy working in an office in Texas preparing a financial report regarding process throughput looks out the window toward a clear blue sky and wonders how the company’s progress over this last year was even possible considering the improvements that have already been obtained.
Focus changes with who notices the changes and wonders about the new conditions and opportunities. In the beginning, keeping dirt out of a trench might have been a huge improvement. Having a pipe and valve instead of a trench and weir made a difference, in performance and in the skillset required to attain, maintain, and operate it. What needs to be on the operator’s mind changes, as does the thought process involved in evaluating it. When goals, objectives, and equipment run into obstacles, or spawn ingenuity, change is coming.
In the ‘70s, operator training was enhanced. More science was included at one end and more insightful practicality at the other. All the stuff in between required at least a less personal but more broadly aware understanding. The job had evolved—there were more operating tools, and there was more to know to use them optimally. There were opportunities for broadening and deepening, but with less touching and feeling. Virtualization had supplanted hard reality and adjustments were required.
Women are entering the process control and instrumentation field as engineers, operators, and mechanics. While overcoming the obstacles that the male establishment has presented them with has discouraged many, many others have been forged into very capable and determined individuals. The extra effort that these successful women have had to expend to stay in the game has accelerated their training and professional development.
Smart management is now looking to this group of determined, assertive women for middle managers. If what many have done on their own with considerable opposition is any measure, they can be counted on for outstanding achievements with management’s encouragement and backing.
While it remains unsubstantiated, it seems that the average ability of the successful female operator is considerably higher than her male counterpart. This is probably due to the relatively large number of educationally unprepared, but otherwise capable women seeking entry-level jobs and the attrition of the less motivated or capable ones due to sociological factors. Combined with the changes we see in operator responsibility, it is likely the role of women in the control room will expand considerably in the ‘80s.
Women in engineering meet with less open resistance, but still are frequent victims of intra-corporate warfare. These more educated and sophisticated individuals seem to be capable of dealing with the problems more readily than those entering the job market as unskilled employees. The similar difficulties in engineering school have probably prepared them somewhat better for industry.
Business’s emphasis on recruiting women, even if just to satisfy government mandates, and the untapped talent that exists in this half of the population suggests that the next few years should be very good for women in industry.
When I began college, there were no women in the engineering, advanced math, or physics programs, although one of my calculus courses was taught by a woman, and there was a female student in a required chemistry class. When I returned from the Army in 1968, nothing had changed very much. I graduated and left the college in 1971, and as I recall, there was one woman in the engineering program at the time. There was, however, gender acceptance of anyone capable of the work. I haven’t seen a class recently but have been told that about half the students are women. Operating from computers in a control room mitigates gender issues; what counts is being aware of how things are supposed to work and how the control equipment gets it all done. I can’t recall gender being any issue for at least the last 25 years.
Interestingly, I spent a lot of time as a California National Guard officer. I was introduced to and trained in mobile subscriber communications by a veteran of Desert Storm who was badly wounded there when a rocket-propelled grenade hit the operating unit she was using. She was among the best I ever met. I took command of a battalion from a female officer who needed time off to prepare for promotion. Her work, and the people she developed, were superb in all respects. For a time, the battalion executive officer was a full-time officer who always demonstrated that she had complete technical understanding of a complex system and would leave nothing undone on her way to accomplishing a mission. I could go on, but the point is that the industry found the same kind of thing in its workers. Like most things in life, success begins with some native ability, implemented and augmented by desire. That, now, seems to be well-established.
There are three stages to the life of a technological item in a society or industry. In the first stage, the item is known to exist but is seldom, if ever, used in a practical way. In the second stage, the item achieves widespread use, but it is not relied upon to the extent that its overnight disappearance would be much more than a serious inconvenience. In the third stage, use of the item is so widespread and the dependence on it so high that the industry or society could not exist without it.
In the ‘70s, process control computers passed from Stage 1 to Stage 2. In the ‘80s, they will pass from Stage 2 to Stage 3. Some plants could even now operate without their computers only if a large number of well-trained operators were suddenly available. While plants so dependent on computer control are still a small percentage of the industry, their number is growing. The incentives toward the higher productivity possible through the proper use of process control computers are certainly present in the marketplace. In some industries, it is becoming difficult to the point of impossible to stay in the markets without them.
A second indication of this expected transition to Stage 3 is the quality and diversity of the equipment that is available. It is not difficult now to purchase equipment that can provide a control engineer with options so far beyond the limitations of previous hardware that human knowledge becomes the first constraint on what can be done.
The exact role computers will play in future industrial controls is not yet well-defined. This is largely because what can be done has stretched our imaginations so far that the exact features that should be in the box are hard to pin down. As the common denominators of computer usage become evident, products to perform the essential tasks will become available at attractive prices. Those vendors who guess well on the role of the computer will do very well in the next few years. Due to the very high cost of developing computer equipment and systems, those who guess wrong will carry a very heavy burden.
Computers can do a lot of things, but at the most native level, it all comes down to the quantity of instructions they can execute per unit time. The typical units for that, which we will use here, are MIPS, meaning “millions of instructions per second.” Another measure is FLOPS, meaning “floating-point operations per second,” which is pertinent in assessing the ability to solve mathematical problems. A lot of work in process automation, though, is merely “logical,” and because of decades of practice in optimizing solution times in the software used for calculations, MIPS is probably a better indicator of relevant computing power.
Doing it by hand in 1892 might demonstrate a computing power of 1.19 × 10^-8 MIPS. That works out to roughly 84 seconds per instruction, or a little more than one completed instruction every 100 seconds, about 1.7 minutes. I’ve always felt that having a number for this sort of thing was entertaining, but even at the forefront of our technological development today, a human can often see situations developing at a tiny fraction of the speed it takes for the automation to do it, and often with more broadly applicable insight than can be instantly drawn from the automation system.
The MIPS compilation I’m looking at as I write this suggests that in 1911, adding a “Monroe Calculator” to our human nearly doubles his performance. That could be looked at as increasing by over 5% a year over the intervening 19 years—but it doesn’t work that way. To achieve this doubling of production it was necessary to identify numerical calculations and issues. Then, it became necessary to envision and invent a machine that could be built to do that.
During WWII, there was a difficult situation in Europe with discerning the content of encrypted, radio-intercepted messages. A British group at a place called Bletchley Park, home to a true genius named Alan Turing, created the computing power required to decode and present the contents of military intercepts faster than the intended recipient could do it (using his human-focused manual procedures). In 1943, that effort produced a programmable computer called Colossus, built by a team under engineer Tommy Flowers, that was capable of 0.000224 MIPS. That’s an increase of nearly 10,000 times over the Monroe-assisted pace, roughly 33% more capability compounded every year over the 32 years since the advent of the Monroe calculator. Of course, there were other advantages. The Monroe calculator wasn’t imaginative, while Colossus was “programmable” and could provide a computationally accurate result for the questions formulated for it. Colossus, of course, was motivated by need and spawned from existing technology, advanced exponentially by human insight and imagination. Could that ever happen again?
It would be fun to add another dozen or so examples to this. But to fast-forward to the bottom line, the iPhone I’ve been carrying for the last few years is capable of 18,200 MIPS. That’s a lot faster than the 0.000224 MIPS that the Bletchley Park team achieved with Colossus: a little over 81 million times as fast.
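For the curious, the arithmetic behind these comparisons is easy to check. A few lines of Python, using the figures quoted above:

    # Figures as quoted in the text, all in MIPS.
    human_1892 = 1.19e-8            # pencil-and-paper computation
    monroe_1911 = 2 * human_1892    # the Monroe roughly doubled throughput
    colossus_1943 = 0.000224
    iphone = 18_200

    print(1 / (human_1892 * 1e6))   # ~84 seconds per hand instruction
    print(colossus_1943 / monroe_1911)                # ~9,400-fold jump
    print((colossus_1943 / monroe_1911) ** (1 / 32))  # ~1.33, i.e., ~33%/yr
    print(iphone / colossus_1943)   # ~8.1e7: a little over 81 million times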
As Moore’s Law has pointed out in several ways, it is hard to see how this ends. I suppose it does as the technology involved reaches its quantum mechanical limits, but the technology of the Monroe adding machine, or the fellow with his pencil, didn’t see their future coming either.
It’s interesting to think about the path from basic pneumatic single-loop process control and the progression from it to all that we can imagine. Is there an end to this? It’s not clear to me that there is.
When I studied modern physics, most of the erudite members of the field believed there was no finite velocity involved in the propagation of gravity, that its effects were instantaneous. Einstein, of course, disagreed with them, and eventually experiments were done suggesting that Einstein was right: gravitational changes propagate at the speed of light. That seemed like a boundary defining ultimate limitations.
Well, that settles the “how fast will it go for all time,” right? As it turns out, there is a feature in quantum mechanics called “entanglement.” It appears to reproduce a change initiated at one point at another, specifically determinable point, instantly. If I were thinking about higher-than-contemporary communications speeds, it would cross my mind. This isn’t a suggestion, merely an observation that all there is to know may not yet be available to us.
In a different context and with different goals, a late friend and scientist, Henry Taylor Howard, managed to establish radio communications between California and China using line-of-sight radio equipment. His twist was integrating the moon into the radio signal’s path. There are things we know and can determine with specificity, and there are things we can imagine but cannot yet substantiate. The history we know, or can extrapolate from what we know, suggests there may still be worlds and lifetimes of surprises. We can be so insightful, and so very clever. I have to admit, science and engineering have provided a very exciting and satisfying life.