A Virtual Automated Manufacturing Lab
M. Karlesky, H. Jack
Padnos School of Engineering, Grand Valley State University, 301 West Fulton, Suite 618, Grand Rapids, MI 49504

 

A Virtual Automated Manufacturing Lab

 

 

Abstract

 

The development of an integrated manufacturing laboratory is discussed in detail. This laboratory uses Virtual Reality (VR) to simulate programs in the workcell. After programs have been verified, the user may execute them on real devices and observe the results with video feedback. Common Internet-based tools have been used: Web browsers can access all of the functions in the laboratory. The laboratory is composed of multiple computers running various operating systems, as well as ‘C’ and Java servers. Within the laboratory there are two robots, a lathe, a mill, and other computer-controlled devices.

 

 

Introduction

 

Virtual Reality (VR) is inherently a computer-generated environment. Ideally, VR would completely immerse the user; however, current technological and cost constraints make that delivery method impractical. VR can nonetheless be experienced effectively with freely available software and inexpensive hardware. The Internet provides a unique and somewhat ideal application of VR concepts, allowing a user to interact with a remote or non-existent scene. That is, a user may directly manipulate the presented environment and, in some form, directly experience the results of his or her interactions.

 

Internet Technologies: Backbone of Virtual Environment Development and Delivery

 

The Internet explosion of the last decade has provided a unique opportunity to replicate remote environments literally at arm’s length. Initially, any remote environment brought across the Internet was one of information in its most rudimentary electronic form, i.e., text. In effect, the first virtual environment provided by the Internet was that of a library.

Text and images were soon made easily accessible with the advent of the World Wide Web (WWW). With the addition of images and URLs, the Internet provided the user an assortment of easily viewed, two-dimensional virtual environments. The Common Gateway Interface (CGI) added a greater degree of interaction and allowed the user to truly interact with and change his or her virtual environment. That is to say, the user crossed the “threshold of virtual exploration”1. Of course, this interaction was limited, but for the first time a user was able to interact with a virtual environment. In this sense, a user was transformed from an observer into a participant in the environment presented.

Recent developments in streaming technology have made real-time information delivery feasible. The development of video and audio delivery systems along these lines has dramatically increased the capability of the information-based virtual environment presented to a user. The introduction of Java and the Virtual Reality Modeling Language (VRML) has brought true client-side interaction to virtual environments. Java’s client-side program execution paradigm allows complex user involvement, information delivery, and networking capabilities. VRML adds realistic information delivery in a low-cost package. The latest version of VRML, 2.0, allows environment objects and conditions to be manipulated externally through Java applets using the External Authoring Interface (EAI)2. These latest technologies, when combined, form the backbone of a usable virtual environment delivery system.

 

 

Automated Manufacturing Lab: Virtual Needs

 

The Padnos School of Engineering at Grand Valley State University faced a problem that is perhaps best solved through the use of a virtual environment. As part of newly introduced manufacturing engineering courses, two robots and two CNC machines (a lathe and a mill) were to be used for hands-on lab experience. The difficulty arises in the logistics of large numbers of students attempting to program, debug, and verify their programs. These machines allow at most a single user at a time. For a class of even twenty, this makes a lab period of three hours insufficient. The problem is compounded by the facts that students must be physically present at the machines to operate them and that the lab closes at the end of an 8-to-5 weekday.

The desired solution was to allow multiple students to work on their assignments simultaneously, from anywhere, at any time. This necessitates creating a virtual representation of the lab itself and the components of the work cell within it3. Furthermore, the goal was to create an augmented reality setting allowing remote control of devices in addition to a VR setting4. In effect, the programming interface and verification process are removed from the physical location of the lab itself and transferred as a virtual environment displayed within the students’ web browsers, as discussed by Rohrer and Swing5. The Internet and the technologies mentioned previously provide the means for this work in progress.

 

Virtual Manufacturing Lab Environment: User Interface

 

At the WWW entrance to the Automated Manufacturing Lab6, a student is presented with a number of options to interact with the lab environment. A direct streaming video feed (Fig. 1) allows live viewing of lab conditions and use.

A VRML model of the lab itself allows exploration. Through subsequent links within this model, the capabilities and characteristics of each piece of equipment can be directly explored through further VRML models (Fig. 2).

 

Users are also given the ability to directly control some pieces of equipment. A set of controls presented to the student allows direct control (one user at a time) of a piece of equipment. The streaming video allows the student to witness the results of his or her actions.

The student is also allowed to control various environment variables of the actual lab witnessed through the streaming video. Lighting conditions and camera angles can be directly controlled via controls presented on a web page.

The final user interface is the most complex. A user first selects a piece of equipment to program. A programming window then provides the programming interface to that piece of equipment (Fig. 3). After entering code, the user submits the program for error checking. Only once the code is syntactically and functionally correct is the user allowed to continue. Upon a successful code check, a VRML model of the piece of equipment is presented. Here, the user may begin a simulation of the code previously submitted. Once the code and simulation satisfy the student, he or she may submit the code to the actual piece of equipment. Here, users are allowed only one-at-a-time access. The video feed then gives the students direct feedback on their work.

Eventually, all four pieces of equipment will be able to work together in concert via student programming projects.

 

 

 

 

 

Virtual Manufacturing Lab Environment: Implementation

 

The virtual environments provided by the Automated Manufacturing Lab are composed of several devices and technologies working together.

Lab Hardware

The computing power for the lab is provided by personal computers (PCs) and various supporting components. We utilize three Pentium 200 MHz machines as our primary development and computing resources. Each uses a different operating system to provide students with a varied environment in which to work in the lab itself. A Windows NT machine serves the web pages as well as the VRML environments. A Windows 95 machine is used internally within the lab for testing the virtual environments created and can host Java server applications if necessary. A Linux machine is used for student projects, hosting the video feed, running the most demanding simulation server processes, and acting as a Network File System (NFS) server.

Each device within the lab is controlled by a small single-board 486 running as a diskless Linux workstation with an NFS connection to the primary Linux server. Using several machines in this way has allowed us to distribute the potentially heavy load on any single server; no one machine has to host all the server processes or needs expansion cards for extra ports. Each of the machines runs a web server in order to serve the various Java applets. This is a necessary step, as the Java networking security model only allows an applet to connect to the host from which it was served. The number of machines in use has allowed us to ensure reasonable response times even at peak lab usage and to avoid large numbers of lengthy parallel and serial cables throughout the lab. Each diskless workstation is very small and has its own Ethernet connection, which allows us to run short cables from the workstations to the devices and to maintain the lab and equipment more easily than with a central server with extra ports and lengthy cables.

The four devices that compose the workcell, two robotic arms and two CNC machines, are run by these small Linux machines. In addition, the controls for the camera and lighting are run by one of these small Linux workstations (Fig. 4).
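As noted above, the Java networking security model only allows an applet to connect back to the host that served it, so each device host serves its own control applet. The following is a minimal sketch, not the lab's actual code, of how such an applet might open its connection; the class name and port number are assumptions for illustration.

import java.applet.Applet;
import java.io.*;
import java.net.Socket;

public class DeviceControlApplet extends Applet {
    private Socket link;
    private PrintWriter out;

    public void start() {
        try {
            // getCodeBase().getHost() is the machine that served this applet;
            // the browser's security manager rejects connections to any other host.
            String deviceHost = getCodeBase().getHost();
            link = new Socket(deviceHost, 6000);   // port 6000 is an assumed value
            out = new PrintWriter(link.getOutputStream(), true);
        } catch (IOException e) {
            showStatus("Could not reach device server: " + e.getMessage());
        }
    }

    public void stop() {
        try { if (link != null) link.close(); } catch (IOException e) { /* ignore */ }
    }
}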

 

VR Environment Creation

The virtual environments provided to the users are created with a mix of live streaming video, VRML models, Java applets, servers, and device drivers.

The virtual environments which allow direct manipulation of equipment within the lab prove to be much simpler constructs than the more involved simulation environments. All direct manipulation environments within the lab have the same construction. On the student’s web browser, a live video stream is displayed along with a Java applet of controls and status information. The applet comes directly from the machine running the device driver. The machine which hosts the device runs both a server written in Java and a ‘C’ device driver which directly controls the work cell device via one of the host’s ports (Fig. 5). The server is written in Java for the simple reason that roughly fifty lines of code produce a very stable and easy-to-use TCP/IP server (a minimal sketch of such a server appears after the list below). Just In Time (JIT) compilers allow us to minimize the overhead required to run these server processes. The device drivers are written in the C programming language because of the port access requirements. At the time of writing, the user’s interaction with the virtual controls follows this sequence:

 

The user requests the web page

The video stream begins and continually updates from the video source

The Java applet is loaded from the device host

The Java server on the host allows or disallows the connection on a one-at-a-time basis

If the user is granted access, the Java applet controls become active

Any action through the virtual controls is communicated to the server

The server communicates with the device driver

The device driver communicates through the appropriate port on the host to the device

Any change in the lab is continually displayed on the user’s screen
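As mentioned above, a compact Java TCP/IP server sits between the applet and the ‘C’ device driver. The following is a minimal sketch of such a server; the class name, port number, and command protocol are assumptions, and the hand-off to the device driver is left as a placeholder rather than the lab’s actual code.

import java.io.*;
import java.net.*;

public class DeviceServer {
    public static void main(String[] args) throws IOException {
        ServerSocket listener = new ServerSocket(6000);   // assumed port
        while (true) {
            // Serving one connection at a time enforces the
            // one-user-at-a-time policy on the physical device.
            Socket client = listener.accept();
            try {
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                String command;
                while ((command = in.readLine()) != null) {
                    // Hand the command to the 'C' device driver here; in the lab
                    // that driver talks to the device through a port on this host.
                    out.println("OK " + command);
                }
            } finally {
                client.close();
            }
        }
    }
}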

 

 

The more involved virtual lab environments, which include simulation, require a greater degree of programming and Internet traffic.

Unlike the previous virtual environments, any number of students can use the first two steps of the process (programming and simulation); the third step, verification on the real device, is again a one-at-a-time procedure.

The process begins with the user’s selection of the device with which to work. This can be accomplished through a set of links within the VRML model of the lab or through standard URLs presented on static web pages. Once the device is selected, a Java applet is brought up on the first of three pages. This applet presents a programming environment for the particular device. A student enters code for checking. When submitted, the code is sent to a Java server where it is parsed, lexically checked, and analyzed for any obvious programming errors. Two tools, JavaCC and JJTree, were used to create the lexer/parser in Java. (Writing the parser/lexer, the servers, and the simulators in Java allows us to package a smaller version of the virtual environment for home use and development without a network connection.) The errors are returned to the student’s applet in an error window. The student refines the code until no errors are returned. Each connection and its programming interface are tracked by the server based on the user’s IP address.
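To illustrate the per-client tracking described above, the following minimal sketch (with assumed names, not the lab’s actual code) keeps one program per client, keyed by the client’s IP address.

import java.net.Socket;
import java.util.Hashtable;

public class SessionTable {
    // Maps a client IP address (String) to that client's current program text.
    private final Hashtable sessions = new Hashtable();

    public void storeProgram(Socket client, String programText) {
        sessions.put(client.getInetAddress().getHostAddress(), programText);
    }

    public String lookupProgram(Socket client) {
        return (String) sessions.get(client.getInetAddress().getHostAddress());
    }
}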

 

 

Once the server communicates to the applet that the code is error free, an option on the user’s screen to bring up the next page becomes active. When the next page is loaded, a VRML model of the device is displayed along with a Java applet which communicates with the original server. This applet receives a set of motion vectors with which to manipulate the VRML model. Using VRML 2.0 and the External Authoring Interface (EAI), the Java applet is able to step the VRML model through the program the student entered moments before. The Java applet also handles any user input called for in the device program. Finally, once the simulation is satisfactory, the student may submit the entered code to the real device. This is accomplished within the server by simply transferring the original user-entered code to the server and device driver running on the host connected to the actual device. Again, a live video feed is displayed to the student (Fig. 6). This last step of verification allows only one student at a time.
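The fragment below is a minimal sketch of driving a VRML 2.0 node from a Java applet through the EAI, assuming the vrml.external packages described in the informative annex (ref. 2). The node name, eventIn, and rotation values are placeholders for illustration, not the lab’s actual models or motion vectors.

import java.applet.Applet;
import vrml.external.Browser;
import vrml.external.Node;
import vrml.external.field.EventInSFRotation;

public class JointStepper extends Applet {
    public void start() {
        try {
            Browser browser = Browser.getBrowser(this);   // VRML browser hosting the scene
            Node joint = browser.getNode("BASE_JOINT");    // DEF name in the .wrl file (assumed)
            EventInSFRotation setRotation =
                    (EventInSFRotation) joint.getEventIn("set_rotation");

            // Apply one motion step received from the server:
            // rotate about the z axis by 1.57 radians.
            float[] rotation = { 0.0f, 0.0f, 1.0f, 1.57f };
            setRotation.setValue(rotation);
        } catch (Exception e) {
            showStatus("EAI error: " + e.getMessage());
        }
    }
}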

 

 

The user(s) selects the device

The Java applet containing the programming interface is brought up on the screen

The user enters his or her program

The user checks the code by submitting it to a server

If the code is correct, a new web page is brought up with a VRML model of the device

A new Java applet communicates with the server to receive a set of motion vectors for the VRML model

The user starts the simulation and provides any input called for by the program

The Java applet feeds motion instruction to the VRML model

Once completed, the user submits the code to the actual device (only one user at a time)

The server sends the original code to the device host which runs the device through its program

A new web page is brought up with the video stream displaying the actions of the programmed device

 

 

 

 

 

Conclusions

 

The hardware and software architecture for an automated manufacturing laboratory has been described. At the time of writing, the laboratory was well under development and will be fully in use by the summer of 1998. Students will use common Web browsers to access VR simulations of the laboratory equipment and, eventually, the real equipment itself. To do this, extensive use has been made of VRML, Java, TCP/IP servers, and other networking technologies. We are planning to allow the students semi-formal access to the virtual laboratory before their scheduled lab sessions. When they finally arrive in the laboratory they will already be familiar with the technology and devices, and we can then focus on particular integration problems. An attractive benefit of this approach is that users may interact with the equipment 24 hours a day, from home or computer rooms, without safety concerns. Eventually, students will be able to write simple programs to control the entire workcell.

 

 

 

 

References

1. S. Fisher. “Viewpoint Dependent Imaging: An Interactive Stereoscopic Display” Processing and Display of Three-Dimensional Data, S. Benton, Editor, Proc. SPIE 367, 1982.

2. C. Marrin. “Proposal for a VRML 2.0 Informative Annex: External Authoring Interface Reference” http://reality.sgi.com/cmarrin/vrml/externalAPI.html. Aug. 2, 1996.

3. C. Potter, R. Brady, P. Moran, C. Gregory, B. Carragher, N. Kisseberth, J. Lyding, and J. Lindquist “EVAC: A Virtual Environment for Control of Remote Imaging Instrumentation” IEEE Computer Graphics and Applications, Vol. 16, No. 4, July 1996

4. R. T. Azuma. “A Survey of Augmented Reality” Presence: Teleoperators & Virtual Environments, MIT Press Journals, Vol. 6, No. 4, Aug 1997, pp. 355-385.

5. R. Rohrer and E. Swing. “Web-Based Information Visualization” IEEE Computer Graphics and Applications, Vol. 17, No. 4, Jul/Aug 1997, pp. 52-59.

6. M. Karlesky. “Automated Manufacturing Laboratory” http://www.aml.gvsu.edu/. Aug. 8, 1997.
