A LITTLE BIT OF BACKGROUND
First, let me say that I am not an expert in CAD/CAM history, nor am I a CAD/CAM trivia buff. What I am is an avid fan of CNC machining and CNC programming who has been involved in the trade for nearly 35 years. At times, I will speak of “the good old days,” when NC controllers were adorned with red LED numerical readouts, and making them work entailed the fine and frustrating art of line editing APT (Automatically Programmed Tool) program files on a Digital Equipment DECwriter dot matrix terminal loaded with green bar paper.
I could say that “those were the days”, but, throughout the years, I have developed a keen appreciation and a never-ending interest in the ever-changing technology that has fueled the science of computerized machining. As if it were not enough to integrate the design tasks and manufacturing tasks into a single application to provide an associative design-to-manufacturing solution, we have seen huge advances in product life cycle management, product data management, integrated simulation and verification, and post processor development.
When I first started programming CNC machine tools, I worked at an aerospace company in a plant that kept several hundred manufacturing workers employed on three shifts. The factory housed a few hundred machines, and, at that time, there were more manual machines than CNC. The production machining process for most of our products required complex holding fixtures to accommodate multiple tooling packages, multiple machining steps, and multiple setups. Product and tool design was presented in two dimensions on paper drawings created by engineering people who could see a 2D cross-section through a part or assembly at any position or angle with nothing more than a vision of the finished design in their head. I’m still amazed by that skill set.
At that time, there were several CNC programmers who interpreted and defined part geometry in APT from the 2D paper drawings, and then directed the tool centerline reference point around the part geometry. The input was a text-based programming language in 80-column format, and the output was the same G-code that our machine tool controls still understand today.
It was a pretty inspiring process when all was said and done, but the atmosphere was about to change.
THE ADDITION OF 3D MODELING
With the creation of 3D graphics, 3D surface creation, and 3D solid modeling, a more natural approach to the design and manufacturing process was realized. The solid model became the focal point for all of the derivative processes. The engineering drawing and machine tool cutter path data fell into their rightful place as tools that could be seen as by-products of the design.
At the same time, the desire to build and maintain an associative bond between design and manufacturing tools has become a key factor in the success of integrated CAD/CAM software design. It has been, and probably will continue to be, the biggest factor in the CAD/CAM solution selection process for most companies. Eliminating the need to translate and rebuild design data at every step of the manufacturing process is not only more cost-efficient, it’s also safer and easier to change when the time comes (and we all know that time will come, probably sooner rather than later).
As a CAM Programmer, being able to see the tool and its relationship to the part was a huge change for me. When we talk about CAM programming today, we talk about multi-axis cutter paths that remove a substantial amount of material. I thought that I was REALLY ahead of the curve when I learned to write looping procedures with embedded callable subroutines that would use stock allowances as variables.
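For readers who never saw that style of programming, here is a minimal, hypothetical sketch of the idea in modern Python: a callable subroutine emits the G-code for one pass, and a loop steps down through the material using a stock allowance as a variable, the way those old looping APT procedures did. The routine names, feeds, and coordinates are all invented for illustration, not taken from any real post or machine.

```python
def face_pass(z, feed):
    """Hypothetical subroutine: emit the G-code lines for one facing pass at depth z."""
    return [
        "G0 Z5.000",                    # rapid to clearance plane
        f"G1 Z{z:.3f} F{feed}",         # feed down to this pass depth
        "G1 X100.000",                  # cut across the part
        "G0 X0.000",                    # rapid back to the start side
    ]

def roughing_program(final_z, stock_allowance, step_down, feed=250):
    """Loop from the top of stock down toward final_z, leaving stock_allowance
    on the floor for a later finish pass."""
    program = ["G90", "G21"]            # absolute positioning, metric units
    target = final_z + stock_allowance  # roughing stops short of the finish surface
    z = 0.0
    while z - step_down > target:
        z -= step_down
        program += face_pass(z, feed)   # one call per pass, depth as a variable
    program += face_pass(target, feed)  # last pass lands exactly on the stock line
    program.append("M30")
    return program

prog = roughing_program(final_z=-10.0, stock_allowance=0.5, step_down=2.0)
```

With those inputs, the loop generates passes at Z-2, Z-4, Z-6, and Z-8, then a final pass at Z-9.5, leaving the half-millimeter allowance for finishing. Change the allowance and every pass recalculates, which is exactly why variable-driven looping felt so far ahead of its time.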
The graphical display and cutter path verification capabilities that are prevalent in our CAM packages today have proven invaluable in reducing the trial-and-error programming techniques that were employed in the past. Being able to step through a cutter path and measure the distance from a tool surface to a part surface is, in itself, a pretty cool feature. But now, even that is easily taken for granted when you witness a full-blown 5-axis machine tool simulation being driven by a G-code program that might have been created by an integrated post processor, or by a well-versed G-coder in a text editor. If you haven’t seen it already, being able to visualize what the tool and holder look like in the spindle, and to see where all of the axes are going during machining, is amazing (in my opinion).
During the course of my APT programming days, post processor development moved from Fortran to C++ while the APT processor itself moved from IBM mainframe to personal computer architecture. We had a group of post processor developers that supported the company’s five-plant collection of posts. During that time, a few of the larger manufacturers were still processing centerline data from NX/UG through the APT compiler, and we were one of them. The method provided a lot of flexibility for those of us who understood APT, but it soon became a maintenance nightmare for those who did not. Sometimes, it just takes a while for old habits to go away.
Eventually, the NX/UG Post Builder GUI found its way onto my desktop, and it has completely changed the way post processor development is done today. Some companies keep dedicated post builders, others let the CNC programmer manage their own posts, and some choose to outsource the task. The end result is a better post in a shorter amount of time, and, if you’re using NX CAM, it is a tool that can provide direct input to the Integrated Simulation and Verification environment.
THE NEXT PHASE ... DATA MANAGEMENT
I have worked with more than my fair share of generic database management systems that were designed to manage data storage, data integrity, and revision control. Unfortunately, most of them didn’t fully support the complexity of aerospace configuration management in a very effective manner. I believe that an integrated data management system that complements our configuration management efforts is the link that completes the design-to-manufacturing cycle. I am pretty much sold on the idea that it is better to check data into and out of a repository in a seamless fashion than it is to leave that process for the end user to manage on a local drive.
When we have engineering and manufacturing personnel who are focused on their key area of expertise without having to concern themselves with the day-to-day details regarding where things are stored and how they are managed, they naturally become more efficient, and you no longer have to worry about misplacing data or violating your own rules of configuration management.
Yes, we have come a long way. I still see many changes occurring in manufacturing technologies, and we are nowhere near the end of this era of innovation. I’m excited to see what’s next.
Thanks for taking a trip down memory lane with me.
PS: I’d love to hear about your experiences and your vision for the future; feel free to comment below!