Automation, Supercomputers, & AI
By Frank Rovella
For the past two years, IBM has been involved in a lobbying effort to convince Congress to allow its Watson supercomputer technology to be used as a medical diagnostic tool. You may remember Watson from its now-famous victory on the game show “Jeopardy!”. IBM wants Congress to classify Watson as a diagnostic tool rather than a medical device, which would allow IBM to avoid the long and costly clinical trials that medical devices are subject to. If approved, and all indications are that it will be, this would be truly groundbreaking, and a first for the medical industry, which until now has relied on the expertise and experience of people who have dedicated their lives to medicine.
Proponents such as Eric Topol, a genomics professor at the Scripps Research Institute, have used the technology and are excited about the prospect of its widespread use. Topol states, “No human being can read five billion pages of medical literature in two seconds.” Of course, if adopted, a physician will make the final decision. However, many worry that this powerful technology will make its way beyond the control of a physician. This could undermine one of the key aspects of the doctor-patient relationship: trust.
The widespread propagation of technology such as Watson has far-reaching implications, both economically and ethically. In manufacturing and industrial applications, deployment of new technology is standard practice and is in large part the reason that the US has become so competitive in the world market. The demands for higher precision, faster throughput, and lower cost keep raising the bar, and the new technology this demand is driving is revolutionizing the workplace in ways not even thought of just a few years ago. As Kurzweil’s predictions of the exponential growth of computing power have come to fruition, so follows the automation of manufacturing processes. The transformation this has precipitated has been very subtle, so much so that most people hardly notice; an upgrade here, a new function there. These are small, incremental advancements that have occurred over a long period of time and can require a perspective spanning years to comprehend fully.
Automation: The use of automation is everywhere; we’ve all seen it and work with it every day, from AutoCAD, CAD/CAM, and CNC controls to telemetry on an N2 tank, monitoring software connected to hundreds of thermocouples and pressure transducers, a robotic pick and place, or any one of a thousand custom-designed automated systems. Automation, simply put, is the conversion of a manual task into a hands-free process. In one of my recent articles, “The Next Industrial Revolution,” I wrote about the changes brought about by automation. For example, because of automation, jobs for machinists have decreased by 20% over the past 20 years. A recent study conducted by Oxford University concluded that, due to the widespread deployment of automation and computerization, 45% of jobs in the United States will disappear over the next 20 years.
Supercomputers: IBM calls the Watson system cognitive, and many people identify it and others like it as artificial intelligence (AI); in reality, it is neither. Supercomputers like Watson are just very large, very powerful mainframe computers using millions of processors. Although supercomputers may be out of the reach of most shops, high-power computing using increasingly sophisticated software is not. What this means to design and engineering is what automation means to manufacturing. MRP, design, modeling, and FEA software are becoming less expensive and more accessible. Mechanical stress and vibration, the effects of motion and fatigue, heat transfer characteristics, and even electrostatics can all be determined long before the chips of a prototype fly.
This is not even touching on the developments in 3D printing. For industries such as injection molding, these tools have provided significant advantages: considerations such as machining tool steels and getting the correct mold flow can now be worked out in simulation before any steel is cut.
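To give a flavor of the kind of calculation this class of software automates at full-model scale, here is a toy hand check (my own illustration, not any particular FEA package): the classic maximum bending stress for a center-loaded, simply supported rectangular beam.

```python
def max_bending_stress(load_n, span_m, width_m, height_m):
    """Max bending stress (Pa) for a center point load on a
    simply supported beam of rectangular cross-section."""
    moment = load_n * span_m / 4           # peak bending moment, M = PL/4
    inertia = width_m * height_m**3 / 12   # second moment of area, I = bh^3/12
    c = height_m / 2                       # distance from neutral axis to outer fiber
    return moment * c / inertia            # sigma = M*c/I

# 5 kN load at the center of a 2 m span, 50 mm x 100 mm cross-section
stress = max_bending_stress(load_n=5000, span_m=2.0, width_m=0.05, height_m=0.1)
print(f"{stress / 1e6:.1f} MPa")  # prints "30.0 MPa"
```

An FEA package does this for millions of elements at once, on arbitrary geometry, which is exactly why the software matters once parts stop looking like textbook beams.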
Artificial Intelligence (AI): Once this is developed, and it’s just a matter of time, everything changes for the engineer and, in time, for everyone else in the process stream. The arrival of true AI has been called “The Singularity,” meaning the point where computing power surpasses human ability and/or becomes self-aware; it has also been called man’s last invention. There are many views, pro and con, and some say that a truly self-aware machine will never be achieved; there is even argument over how self-awareness would be determined. Much of public perception is media-driven; Hollywood has made a lot of money portraying self-aware machines as monsters bent on ridding the world of us. I think a more accurate depiction of what we can expect was exhibited in the recent movie “Her” (my wife made me watch it).
There are a lot of people taking the development of AI very seriously; over the past four years, funding for AI research projects has soared. In 2014, a total of 16 AI startups received over $309 million in funding. This wasn’t a Kickstarter campaign, either; companies like Google, Amazon, and Facebook are betting heavily on AI and are putting up the money to prove it.
Automation, computing, and eventually AI are all intertwined
and will spell the end of work as we know it. What AI will ultimately mean to
jobs and the industries that are our livelihoods isn’t hard to predict. A
system that can learn can bring incredible speed and accuracy to almost every
industry. The wild card is a system that can learn and self-improve.
The widespread use of automation, computing, and eventually AI will be driven, as always, by economies of scale. As each new advancement is adopted, it gets less expensive and can be deployed by smaller organizations. For manufacturing, it’s already happening: the ability to run “lights out” is a typical example of a fully automated system. Every year at the trade shows, systems require less operator intervention. CAD and MRP software is getting more sophisticated, and 3D printing and materials technology is going through the roof. I’ve even read about an algorithm that can generate technical writing.
There are a lot of people who would like to think that all
of this is far in the future, but it’s not. We are on the doorstep of profound
changes in our society, and as with so much in the world today, it’s all
uncharted territory. This is a story of vicissitudes that is part and parcel of
every revolution, whether it is industrial, social, or political.