It’s hard to say exactly when the first electronic computer was invented, but historians agree that ENIAC (Electronic Numerical Integrator and Computer) was one of the earliest general-purpose computing systems. This system was formally dedicated at the University of Pennsylvania in 1946 – just 70 years ago.

Originally designed to calculate artillery firing tables for the military, ENIAC could be programmed to perform complex sequences of calculations, including loops and subroutines, speeding these operations roughly 1,000-fold – three orders of magnitude. Back then, the concept of software had not yet taken shape, because ENIAC did not store programs as we know them today.

Instead, the machine itself was the program: a large array of arithmetic units into which sequences were set by hand, using banks of 10-way switches that “told” ENIAC what to do for a given calculation. Programming a new operation took days or weeks of rerouting cables and setting switches, followed by still more time for debugging. In effect, the hardware assumed the role that software plays today.

Alan Turing laid the conceptual groundwork for software, though the term itself is generally credited to statistician John Tukey. It wasn’t until 1957 that software was installed onto a computer hardware system as a tool to be called on as needed. Software is all the information computer hardware needs to perform a required task – the programs, libraries and related data necessary to carry it out. Computer architecture is such that hardware and software need each other to function; neither is of any use without the other.

Now let’s fast-forward through the maelstrom of seven decades of computer research, innovation, digitization and miniaturization, and we arrive at a level of technological sophistication no one could have foreseen. As I type this on my laptop, I have before me more computing power than the banks of processors and tape drives that filled the environmentally controlled rooms of a few decades ago.

We now live in a world in which photographic film barely exists; in which the largest taxi company in the world, Uber, owns no vehicles; in which the largest hotelier in the world, Airbnb, owns no properties; in which computers help doctors make difficult diagnoses; in which printing occurs in three dimensions; and in which cars and trucks can drive themselves. And this is just scratching the surface. The pace of change itself continues to accelerate. It’s frightening in a way, but it’s also very exciting.

Closer to home, sophisticated computers and software packages with the ability to communicate with each other have made the trial-and-error approach to manufacturing practically obsolete. Who can help but marvel at our industry’s ability to design a part and predict its microstructure and in-service performance from a laptop, achieving in just a few hours or days what might have taken months using traditional methods?

And it’s not just the part itself. An entire process or production line can be designed on a laptop, incorporating raw-material data, deformation technologies, thermal treatments, production rates and system self-diagnostics.

Welcome to the advanced manufacturing sector we call forging. The best is yet to come.