Jack, H., ElMaraghy, W.H., "Design as a Focal Point for Concurrent Engineering", University of Western Ontario, 1992.
The old saying, ‘a stitch in time saves nine’, embodies the spirit of concurrent engineering. The fundamental concept of concurrent engineering is that during each step of the design phase, the global effects of decisions are considered. Indeed, this seems like an obvious approach to product development. However, developing a product requires many people with diverse areas of expertise and creativity, working within the constraints of manufacturing and marketing. A popular method for simplifying the development process is to split it into a number of stages; typical departments include Marketing/Customer Service, Engineering Design, Process Planning, Materials, Industrial Engineering, and Production. These departments pass product designs and plans towards production one step at a time, and pass them back when a problem is found. In practice, this approach results in repeated mistakes, higher product costs, and longer development times. The concurrent engineering approach eliminates the ‘passing’ of product responsibility. All departments involved with the product life cycle take an active role in development and ensure that bad design decisions are avoided, thus reducing the time and cost of fixing problems which arise from departmental isolation. The result is a better product which reaches market faster at lower cost.
To perform engineering functions concurrently, some of the design paradigms must be rethought. Various tools and methods have been developed recently to support alternative design methodologies. Most of these are intended to support better communication, group decision making, and analysis. Communication tools are required to record and track reasoning during the development stages. When a group makes a decision, there must be some method for having it accepted and recorded. Finally, analysis tools (such as simulations and FEM) aid in estimating the effects of decisions. While both manual and computerized systems have been developed to support these functions, computers appear to be a key factor in successful implementations.
Companies like Digital have taken a serious interest in concurrent design [Wheeler, 1988]. They have developed software for the management of communications and specifications during design. Their software was tested using 2 by 2, 4 by 4, and 6 by 1 teams (teams with shifting leadership, consisting of representatives from the departments involved in product development and production). By using computer network technology, they were able to bring teams together in an interactive environment without relocating people. They identified one problem with their approach: the lack of Decision Support Tools led to confusion. In the approach described later, this problem is overcome by the immediate inclusion of design decisions in a common design model.
Manufacturing in general is a broad and complex subject which is hard to mold into a single generic model. This makes tools for developing products hard to develop. Thus, most product development tools are based on design, even though the process of design is not fully understood. We propose that developing new methods from proven tools and technologies will improve both the chance of success and the usefulness to industry. It follows naturally that any tool to support concurrent engineering should expand upon existing design tools, which are the most stable component of computerized manufacturing. We have taken this approach, and produced a system in which design is the centre of all product development activities. Before discussing the system in detail, we will discuss some of the other issues involved in the Computer Integrated Manufacturing (CIM) approach in general.
A Solid Model contains more information than any other geometric representation. This allows very powerful software support, but may also create difficulties: Solid Modelers are often slow and large, and create problems for data transfer. Despite these problems, the approach is often used for the benefits of the representation. When solids are used as the basis for representation, more information may be associated with them. For example, a solid model may have attributes such as material, volume, geometry, positional and geometrical tolerances, and manufacturing operations. These attributes may be difficult, or impossible, to include with a surface or wireframe model.
In order to associate attributes with solid features, the standard graphics-only interface must be augmented with a product structure interface. Some commercial packages do this with interactive command and text windows, but this may be awkward and confusing. We have provided a visual display of a tree of CSG operations, in which the nodes are represented as parts instead of operations, which is more intuitive to the parts designer (Figures 3 and 4 show examples of this). This approach resulted in an easy-to-use system for the visual construction of a product. The system was later modified to include tolerance form data for various primitives, based upon the ANSI Y14.5M Geometric Tolerancing Standard. This type of modeler is also discussed by Shah et al., whose treatment of tolerance information is much more rigorous and includes datum information.
Most Solid Modeler front ends attempt to mask the Solid Modeling with features. This has been well explored by Hoda ElMaraghy in her Intelligent Product Design and Manufacture (IPDM) system. The system uses a complete set of features for modeling geometry and tolerances. After modeling, the geometry may be recorded in a Product Description Language (PDL) file for storage and transmission. The PDL files are used by other modules which parse them and attach manufacturing and analysis information to the Product.
The Knowledge-Based Engineering Systems Research Laboratory at the University of Illinois has been developing a concurrent engineering system [KBESRL, 1991]. At the front end, the solid model is described with a LISP-based approach which allows parts to be defined with primary (structural) and secondary (detailed) features [KBESRL, 1991, pp. 33-40]. The approach ties a generative CAPP system closely to the CAD system. Unlike our approach, their CAPP system must choose operations (their CAPP system is discussed in the next section). This loses the intent of the designer, which was available when the design was enacted. A related project [KBESRL, 1991, pp. 65-72] discusses using a network of CAD systems which share design knowledge, which seems to be largely geometrical at this point. They have an intricate set of problems to overcome because their system is distributed (whereas the approach in this paper allows centralized client/server relationships). Some of their discussion involves the conflicts which may arise when multiple agents access the same design simultaneously. It is interesting to note that their approach uses a roll-back mechanism.
Another project within their system focuses on communication issues [KBESRL, 1991, pp. 25-32]. Concurrent design is enabled by keeping common records of specifications and communications. If a conflict arises because of design changes, then users are informed. This work is important to mention because it will eventually be incorporated into their Solid Modeler, to provide complete concurrent engineering support.
Process Planning is the vital link between Design and Production. Many interesting efforts have been made to develop tools which aid the Process Planner, and computerized tools in particular have been appearing in great numbers. At present, variant technologies, which allow planning using variations on existing process plans, are commercially available. Generative planners, which develop process plans from the ground up, are still at the research stage.
This representation is obviously for metal removal, but the concepts may be extended to assembly. The first item on the list is the interpretation of part design data. Some designers attempt to simplify this stage by using solid models. Another KBESRL project eliminates the first phase by carrying over more design information from the solids modeler [KBESRL, 1991, pp. 57-64]. By carrying over the designer's intent, the CAPP system no longer has to recognize or interpret the Product Design. The system presented here allows both the first and second phases of process planning to be off-loaded to the designer at the design stage. Clearly this reduces the complexity of the CAPP system.
It is clear that simulation plays an important role in new product design when attempting to determine producibility and manufacturability. For example, a number of commercial systems will simulate NC programs so that the tool path may be verified manually (for example, CIMStation by Silma). There is still a shortage of methods for automatically determining the validity of processes. One such algorithm has been published by Su et al., who discuss an approach for verifying design work by checking whether a part is machinable on a 3-axis milling machine. This work is in its early phases, but would provide powerful feedback to designers.
Production simulation is a popular industrial tool for verifying production changes. As a result there are a number of vendors which supply software capable of performing various scheduling, breakdown, and analysis functions (e.g. Hocus by P-E Inbucon, and ProModelPC by Production Modeling Corporation). The main shortfall of these systems is the lack of a good application interface for patching production simulation directly into the software of the CIM system.
A CIM Architecture has been developed which will support concurrent engineering from design to production. To do this, manufacturing knowledge has been introduced at the design stage so that the design will incorporate standard items like tolerances, operations, costs, and times. Using this data, a simple yet fully generative CAPP system can convert the designs into process plans. The resultant process plans can then be verified and checked using process and production simulators. The use of simulation allows fast feedback to the designers, so that a design may be altered in the early stages to overcome unforeseen problems with the choice of operations, or with the design itself.
A novel feature of this system is its support of a simultaneous design environment. In particular, designers will have simultaneous access to one design. Instead of a single designer who considers all design stages concurrently, a team of designers may cooperate on a multi-seat CAD package, applying their own specialties. This is made possible by having a main Design Engine (DE) which deals with storage and update of geometry, tolerance data, and manufacturing data. The users’ stations can request updates to the design, and display the results. There is only one copy of the design, but each user will have his own view.
The Product Design data serves as a framework for enabling analysis and simulation. At present the Product Design will be passed through the CAPP module to produce a Manufacturing Bill of Materials. The operations may then be simulated using the UWO Workcell Controller. This controller is equally capable of applying this to control the NC Milling machine and Robot, or appropriate simulations. A link is being established to a production simulation package.
This paper is organized to first discuss the work which has been done, followed by a description of previous research and how it relates. The first section discusses the Design Engine based Computer Aided Design for Manufacturing (CADM) system. The next section discusses how process plans are generated from Product Designs, and how process planning faults are handled. Following this is a discussion of process simulation for process plan verification, including the workcell controller. The final application section describes a production simulation module used to statistically determine the effects of one process plan interacting with all other process plans in a factory. The application sections are followed by a discussion of related work by other researchers, and finally by the conclusions.
The Computer Aided Design for Manufacturing (CADM) module has two objectives. It allows multiple designers to work on a Product Design concurrently. To do this the Design Engine (DE) was developed. The DE is an individual process (that may be running on a multi-process computer) which is the only holder of Product Design information. It stores all geometry, tolerance, and manufacturing information. Various views of this Design Model may be received by any number of remote designers, each using their own graphics programs (i.e., CADM). Figure 1 shows the hardware connections of these processes.
The diagram in Figure 5 may be deceptive: the DE is not responsible for producing graphics; it simply acts as a common data server for the Product Design and makes updates to the Product Data. Figure 2 below shows how a CADM station updates a design after a user request.
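The round trip described above can be sketched in a few lines. This is an illustrative sketch only, with assumed class and method names (DesignEngine, CADMStation, request_update); the actual system used custom networking utilities [Jack et al., 1992] rather than direct method calls.

```python
# Sketch of the CADM/DE update cycle: the Design Engine holds the
# only copy of the Product Design; stations request changes, and the
# DE pushes the result back to every station's view.

class DesignEngine:
    """Sole holder of the Product Design data (illustrative)."""
    def __init__(self):
        self.model = {}        # node name -> attribute dictionary
        self.stations = []     # connected CADM stations

    def attach(self, station):
        self.stations.append(station)

    def request_update(self, node, attributes):
        # Apply the change to the single master copy...
        self.model.setdefault(node, {}).update(attributes)
        # ...then broadcast it, so every designer's view stays
        # consistent with the one true design.
        for s in self.stations:
            s.redraw(node, self.model[node])

class CADMStation:
    """A designer's graphics station; holds only a view."""
    def __init__(self, name):
        self.name = name
        self.view = {}

    def redraw(self, node, attributes):
        self.view[node] = dict(attributes)

de = DesignEngine()
a, b = CADMStation("A"), CADMStation("B")
de.attach(a)
de.attach(b)
de.request_update("keytag", {"material": "ABS plastic"})
# Both stations now display the same updated attribute.
```

Because there is only one copy of the design, a change requested by station A is immediately visible in station B's view.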
If the network were removed from Figure 2, this would be analogous to many CAD packages. The advantage of the present approach is that multiple CADM stations may change a design concurrently. The only unresolved issue with this approach is the impasse which may occur when multiple CADM stations make conflicting changes. There are four basic methods which may be applied to this problem.
Each of these methods has merits, but at present there is no mechanism implemented (i.e., the first item on the list). This problem will be discussed more after the Product Design data structures have been presented.
The Product Design data structure is hierarchical and is based on a CSG approach. The data structure is not “parcelled-out”, so the structure remains one source of knowledge when reasoning and designing. The basic structure is seen in Figure 3 for a keytag with customized initials.
The product structure shown above represents geometries and associations. Each node in the tree represents a physical feature, part, or assembly, and each node has associated attributes. The leaves of the tree are geometric primitives. On the diagram the ‘+’ and ‘-’ signs indicate CSG operations, and the ‘A’ indicates an assembly. The reader should note that the CSG operations on the tree may indicate manufacturing operations such as the addition, joining, or removal of parts, assemblies and features. Thus, the tree structure may be equated with the Product Design Topology, and the Attributes may be equated with the Details of Design. A list of the various attributes that have been implemented is given below.
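The structure just described can be sketched as a small tree class. This is a minimal illustration, not the original TAGS data structure; the class name, fields, and the keytag attributes are all assumed for the example.

```python
# Sketch of the hierarchical Product Design structure: interior
# nodes carry a CSG/assembly operation ('+', '-', or 'A'), leaves
# are geometric primitives, and every node may carry attributes
# (material, tolerances, manufacturing operations, and so on).

class DesignNode:
    def __init__(self, name, op=None, children=(), **attributes):
        self.name = name
        self.op = op                   # '+', '-', 'A', or None for a leaf
        self.children = list(children)
        self.attributes = attributes   # details of design

    def is_leaf(self):
        # Leaves of the tree are the geometric primitives.
        return not self.children

# The keytag of Figure 3, roughly: a hole and engraved initials
# are removed ('-') from a blank.
blank = DesignNode("blank", material="ABS plastic")
hole = DesignNode("hole", operation="drill")
initials = DesignNode("initials", operation="engrave")
keytag = DesignNode("keytag", op="-", children=[blank, hole, initials])
```

The operation stored at each interior node is what later lets the CAPP module read a partial process plan directly out of the tree.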
One of the benefits of this Product Design Structure becomes obvious when its impact on CAPP is considered. A process plan is already partially described by the Product Structure and the associated attributes. In this case the process planner simply converts the Product Design to a Bill of Materials, assigns an order to the operations, and produces a Manufacturing Bill of Materials (or Routing Sheets). Note that the operations must be ordered because they are stored only in the form of a precedence graph.
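Since the operations are stored only as a precedence graph, the ordering step amounts to a topological sort. The sketch below uses Kahn's algorithm; the operation names and precedence edges are illustrative, not taken from the implemented system.

```python
# Sketch: impose a total order on operations stored as a precedence
# graph, as the CAPP module must do before emitting routing sheets.

from collections import deque

def order_operations(precedes):
    """precedes: dict mapping each operation to the list of
    operations that must come after it."""
    indegree = {op: 0 for op in precedes}
    for followers in precedes.values():
        for op in followers:
            indegree[op] = indegree.get(op, 0) + 1
    # Operations with no outstanding predecessors may start now.
    ready = deque(op for op, d in indegree.items() if d == 0)
    ordered = []
    while ready:
        op = ready.popleft()
        ordered.append(op)
        for nxt in precedes.get(op, []):
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return ordered

# Cutting the blank must precede drilling and engraving.
plan = order_operations({
    "cut blank": ["drill hole", "engrave initials"],
    "drill hole": [],
    "engrave initials": [],
})
```

Any total order consistent with the precedence graph is acceptable here; a more sophisticated planner would break the remaining ties using machine loading or setup costs.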
A second benefit of this structure is that the DE may resolve conflicting changes from multiple CADM users by exploiting the structure of the product design tree. If the tree structure is locked at some node, then all subnodes down to the leaves cannot be added or deleted. This ensures that the structure cannot become invalid. Even though a branch has been locked, the attributes of nodes and operations on the branch may still be changed, and easily reversed if a conflict occurs.
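The locking rule can be sketched as follows: a structural edit at a node is permitted only if no node on the path from the root to that node is locked. The class and function names are assumed for illustration.

```python
# Sketch of subtree locking: locking a node freezes the structure of
# everything beneath it, while attribute edits remain allowed.

class Node:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self.locked = False
        self.attributes = {}

def structure_editable(root, path):
    """True if the node named by `path` (a list of child names from
    the root) may have children added or deleted, i.e. neither it nor
    any ancestor is locked."""
    node = root
    if node.locked:
        return False
    for name in path:
        node = next(c for c in node.children if c.name == name)
        if node.locked:
            return False
    return True

hole = Node("hole")
keytag = Node("keytag", [hole])
keytag.locked = True                 # a designer locks the branch
# Structural edits below "keytag" are now refused, but attribute
# changes such as the following are still permitted and reversible:
hole.attributes["diameter"] = 5.0
```

This gives the DE a cheap, purely structural test for deciding which concurrent requests are safe to apply.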
The design system described in this paper has been developed at UWO using a CSG modeler and interface called TAGS, and some networking utilities for coarse-grain concurrent processes [Jack et al., 1992]. A typical CADM window is shown in Figure 4, including the model, the tree structure, and some of the attributes which can be modelled. The package also includes tolerance analysis software which is not shown in Figure 4. Eventually, this software will be adapted to use ACIS (by Spatial Technology), an object-oriented solid modeler capable of interpreting CSG and B-Rep definitions, as well as distributed ray tracing which permits photorealistic visualization.
The tasks performed by the CAPP system are very routine, because most of the reasoned information is entered directly during the Product Design stage. The CAPP system therefore need only perform the following functions.
This CAPP structure is very easy to construct; see Ruf et al. for an example. A crude implementation of this module has been done in the present work, which verifies the connection between the DE and a BOM. A sample of the output may be seen in Figure 6; this was done with only one milling machine. A more complex product and manufacturing facility are required for a proper analysis of this system.
The Knowledge Base in Figure 5 will play a powerful role in the future CAPP system, by recognizing past failures and forming rules to avoid similar situations in the future. The Knowledge Base also holds information about sources of NC code for a design. When a design requires NC code generation, the Knowledge Base may suggest a program or subroutine which will generate the code. Existing software makes it possible to perform the feedback seen in the loops above [ElMaraghy, 1992]. The next section of interest is process simulation.
While it is possible to select operations to produce a process plan, there is no guarantee that the designer has chosen operations which will succeed. It is not possible to verify this at the Design stage with only the operation (as specified in the DE); the simulation must occur after CAPP has assigned a particular machine to each operation.
An NC simulator, and robot simulator have been developed for an existing workcell at The University of Western Ontario. Screen dumps of the simulators may be seen in Figures 7 and 8. Each simulator runs directly in place of the actual NC Milling machine driver, and the robot driver, under the control software shown in Figure 9. This allows the complete generation of NC and robot programs, and then testing in a realistic environment, including manufacturing control software.
The NC Milling machine simulator will load the NC code destined for the Dyna 2400 Milling machine, and use the program to generate a graphical representation of the tool path. In Figure 7 the user will note that there are polygons and lines. The lines indicate tool paths above the cutting surface, while the polygons represent the tool paths beneath it. Here we can see the features described earlier in Figure 3: a hole, a symbol, and initials (in this case SME). This is not a true tool path analysis, but for graphical purposes it allows the user to quickly determine whether the results are satisfactory.
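The line/polygon display rule described above can be sketched as a simple classification over successive tool positions. The coordinate list, function name, and the assumption that the cutting surface lies at Z = 0 are all illustrative; the real simulator parses Dyna 2400 NC code.

```python
# Sketch: classify tool moves for display. Moves at or above the
# cutting surface are drawn as lines (rapid/positioning moves);
# moves that dip below it are drawn as swept polygons (cuts).

def classify_moves(moves, surface_z=0.0):
    """moves: list of (x, y, z) tool positions taken from the NC
    program, in order of execution."""
    segments = []
    for start, end in zip(moves, moves[1:]):
        cutting = start[2] < surface_z or end[2] < surface_z
        kind = "polygon" if cutting else "line"
        segments.append((kind, start, end))
    return segments

# A rapid move, a plunge, a cut, and a retract.
path = [(0, 0, 5), (10, 0, 5), (10, 0, -2), (20, 0, -2), (20, 0, 5)]
segments = classify_moves(path)
```

A true verification would sweep the tool geometry against the part model; this display-level test is only meant to let the user spot gross errors quickly, as the paragraph above notes.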
This program may be controlled through its user interface or via commands passed over the network [Jack et al., 1992], whereas the milling machine driver is only available over the network. The program which generates the NC code communicates only through the network, and is specific to these keytags. All of these applications were developed as stand-alone processes that run in the background on a heterogeneous network.
The robot simulator is similar in setup to the NC simulator. The CRS Plus robot driver runs from the network only, whereas the robot simulator will accept commands from the network or from its user interface. The simulator will track the motion commands which would normally be sent to the real robot. This tool is excellent for predicting collisions and operation sequences, although the obviously coarse resolution makes it impossible to determine the exact positioning of the end effector. In a better simulation the complete manipulator would be modelled, along with the other objects in the workspace.
Figure 9: Workcell Control Interface (This interface directly controls the robot and NC milling machine, but the workcell control architecture allows the real devices to be replaced by simulators, or run concurrently for verification).
The workcell controller in Figure 9 uses its user interface to accept commands from a user (left window) or a program (right window), and sends them over the network to the other programs. The networking scheme used [Jack et al., 1992] eliminates the need for the Controller to know which other processes are on the network. As a result, replacing the device drivers with simulations is trivial, and makes the simulation more realistic. It is also possible to run a simulation on a separate machine without affecting production.
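The device-independent dispatch just described can be sketched as follows. The class, the logical device names, and the command strings are illustrative assumptions; the actual system routed commands over a network rather than through in-process handlers.

```python
# Sketch: the controller addresses commands to logical device names
# and never knows whether a real driver or a simulator answers, so
# swapping drivers for simulations is trivial.

class WorkcellController:
    def __init__(self):
        self.devices = {}   # logical name -> handler (driver or sim)

    def register(self, name, handler):
        self.devices[name] = handler

    def send(self, name, command):
        # Dispatch by name only; the handler behind the name is
        # interchangeable.
        return self.devices[name](command)

log = []
controller = WorkcellController()
# Here simulators stand in for the real mill and robot drivers.
controller.register("mill", lambda cmd: log.append(("mill-sim", cmd)))
controller.register("robot", lambda cmd: log.append(("robot-sim", cmd)))
controller.send("mill", "RUN keytag.nc")
controller.send("robot", "PICK keytag")
```

Re-registering "mill" with the real driver would redirect the same command stream to the physical machine, which is exactly the substitution the workcell architecture permits.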
Thus, although these simulations are crude and require visual confirmation, they do allow a thorough verification of Process Plans. Eventually these simulators will be replaced by a commercial simulator, CIMStation by SILMA. The need for a powerful simulator will become even more important for complex examples with many more devices and products.
It is possible to develop process plans which are producible on available resources, only to find that those resources are not available when needed. A manufacturer may waste valuable time if a bottleneck is discovered after process planning. By simulating the interaction of one process plan with all others, the loads may be balanced more evenly between available resources during the process planning stages. Long-term bottlenecks may also be identified early enough to make capital investments. Even more importantly, since the simulations are closely linked to CAPP, which is closely linked to the DE, it is possible for a designer to try a few “what-if” scenarios with ease.
There are many excellent simulation packages for various manufacturing problems. In general these are discrete event simulators, capable of modelling the production of various types of widgets. Their main disadvantage is that they have all been developed as stand-alone packages with their own user interfaces; for this application it would be best if the simulator could run as a background process. Also important is the lack of source code for these packages: without source code, the only way to obtain results is to parse text report files, which makes the system awkward and bulky. In response, an initiative has been made to develop a simple simulator which can run process plans, determine problems, and feed them back to the CAPP system.
A discrete event production simulator, called FactorySIM, has been developed using colored Petri net subroutines developed at UWO. The simulator takes factory layout files, routing files, order files, process plan files, and inventory files, and then simulates manufacturing in a random manner. This is not efficient, and not the way that a proper scheduling system would work, but it gives a reasonable estimate of machine overloading. The main window of the simulator is seen in Figure 10 below. The various object shapes represent resources in the Resource file, and a small moving robot represents a workpart between operations.
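The random-firing behaviour described above can be sketched with a plain (uncolored) Petri net. This is a deliberately minimal illustration, not the FactorySIM subroutines: the net, the place names, and the single-machine example are all assumed.

```python
# Sketch: fire one randomly chosen enabled transition per step.
# Random firing is a poor scheduler, but it exposes resource
# contention, which is the estimate the paragraph above describes.

import random

def simulate(marking, transitions, steps, seed=0):
    """marking: dict mapping place -> token count.
    transitions: list of (input_places, output_places) pairs."""
    rng = random.Random(seed)
    fired = []
    for _ in range(steps):
        # A transition is enabled when all input places hold tokens.
        enabled = [t for t in transitions
                   if all(marking[p] > 0 for p in t[0])]
        if not enabled:
            break
        inputs, outputs = rng.choice(enabled)
        for p in inputs:
            marking[p] -= 1
        for p in outputs:
            marking[p] += 1
        fired.append((tuple(inputs), tuple(outputs)))
    return fired

# Two parts passing through two operations on a single machine.
marking = {"raw": 2, "milled": 0, "done": 0, "mill free": 1}
transitions = [
    (["raw", "mill free"], ["milled", "mill free"]),  # mill a part
    (["milled"], ["done"]),                           # unload it
]
simulate(marking, transitions, steps=10)
```

A colored Petri net would additionally attach data (part type, routing) to each token, which is how FactorySIM distinguishes workparts following different process plans.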
The files which drive this simulation are shown in Figures 11 to 15. These files contain cross-referenced information, and may be reloaded during the simulation. This is critical for simulating the effects of introducing a new product during steady-state production.
This simulator is being developed at The University of Western Ontario, and will be used as a feedback mechanism for the CAPP system. It also uses the network [Jack et al., 1992] so that it fits into a heterogeneous computing environment; at present the package will run on either Silicon Graphics or Sun computers with no changes to the source code.
This paper discusses the successful implementation of a Concurrent CAD system which is capable of supporting multiple users working simultaneously on various stages of a Product Design. The use of manufacturing data in the CADM modules allows the CAPP system to be simplified. While the current implementation of CAPP is fairly crude, it will be fully developed in the near future. Ready access to simulation modules helps reduce the guess work which the design engineer must do, and opens up many opportunities for ‘trial and error’ sessions with new Product Designs. There are two types of simulation in this report: the first is a process simulator, which is fully functional; the second is a production simulator based on colored Petri nets, which is partially functional.
Future work in this area will involve further testing and evaluation of the concurrent engineering paradigm. As the reader has seen, there is a test case using keytags. While this test case is simple, and some data was transferred manually, the system shows excellent promise. Work will also be performed on system development, such as model transfer to other systems using formats such as the PDES/STEP standard.
Concurrent Engineering principles make it possible to develop strong Design Engineering systems which will boost the performance of the modern manufacturer. Thus, this research will continue with the hope of developing better support for the manufacturer of the future.