The hierarchy used in CAD/CAM networks today is typically client/server. Each workstation, or client, has its own central processing unit (CPU) and local disk storage. The network may also include a file server with its own CPU and hard-disk storage, as well as magnetic-tape storage.
A key trait of client/server systems is that the file-server software presents a file abstraction to the client: the workstation user doesn't need to know where files physically reside. This transparency comes from what is called a virtual-file system, which routes each file-system request either to the local workstation file system or to a remote server across the network.
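The routing idea can be sketched in a few lines. This is an illustrative toy, not any real VFS implementation; the class names, mount points, and file contents are all assumptions made up for the sketch.

```python
# Toy virtual-file-system dispatch: the user reads a path without knowing
# whether it lives on local disk or on a remote file server.

class LocalFS:
    """Stand-in for the workstation's local disk."""
    def __init__(self, files):
        self.files = files              # path -> contents

    def read(self, path):
        return self.files[path]

class RemoteFS:
    """Stand-in for a file-server client stub; pretend these bytes
    arrive over the network."""
    def __init__(self, files):
        self.files = files

    def read(self, path):
        return self.files[path]

class VirtualFS:
    """Routes each request to whichever file system owns the mount point."""
    def __init__(self):
        self.mounts = {}                # mount prefix -> backing file system

    def mount(self, prefix, fs):
        self.mounts[prefix] = fs

    def read(self, path):
        # Longest-prefix match picks the backing store transparently.
        for prefix in sorted(self.mounts, key=len, reverse=True):
            if path.startswith(prefix):
                return self.mounts[prefix].read(path)
        raise FileNotFoundError(path)

vfs = VirtualFS()
vfs.mount("/home", LocalFS({"/home/part.dwg": b"local drawing"}))
vfs.mount("/server", RemoteFS({"/server/lib.dwg": b"remote drawing"}))
print(vfs.read("/server/lib.dwg"))      # the caller never sees which store answered
```

The workstation user calls `read` the same way in both cases, which is exactly the file abstraction described above.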
Some advanced networks are beginning to add another layer of hierarchy above the file server. The idea is to connect the file server over a wide-area network to massive storage facilities consisting of optical-disk jukeboxes and magnetic-tape or optical-disk libraries. This additional layer is essentially a near-line storage system that keeps information available with access times on the order of seconds or longer.
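The resulting three-tier hierarchy can be sketched as a simple lookup that reports which tier holds a file and roughly how long access takes. The tier names and latency figures below are made-up assumptions for illustration, not measurements.

```python
# Illustrative three-tier storage hierarchy: workstation disk, file server,
# and a near-line optical jukebox. Latencies are invented round numbers.

TIERS = [
    ("workstation disk", 0.01),   # seconds; online, local
    ("file server",      0.05),   # online, across the LAN
    ("optical jukebox",  10.0),   # near-line: a robot must load the platter
]

def locate(path, catalogs):
    """Return (tier_name, access_time) for the first tier holding the file.
    `catalogs` is one set of paths per tier, in TIERS order."""
    for (tier, latency), catalog in zip(TIERS, catalogs):
        if path in catalog:
            return tier, latency
    raise FileNotFoundError(path)

# An old drawing revision has migrated down to near-line storage.
tier, latency = locate("/old/rev3.dwg", [set(), set(), {"/old/rev3.dwg"}])
print(tier, latency)
```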
The term virtual reality has come to mean the ability to "walk around" 3D mathematical models and view them as though they physically existed in space. Some CAD models now can be viewed in virtual reality through use of special viewing helmets or glasses, and software that formats video images to create the impression of depth.
The impression of depth results from seeing two perspective views of a three-dimensional object that correspond to the views seen by our left and right eyes. There are two general ways to create these views. The approach employed in virtual reality helmets is to use two small video screens, one for each eye, and send each view to the respective screen. A second method, more typically used in CAD applications, multiplexes the two images onto the sequential video fields of an ordinary monitor. The viewer wears an inexpensive set of glasses containing an infrared receiver and polarized left and right lenses. An infrared transmitter synchronized to the alternating video fields signals the eyewear to shutter each lens on and off at the appropriate time, so that each eye sees only its corresponding left or right view.
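The left- and right-eye views themselves come from projecting the same model from two camera positions offset by the interocular distance. The sketch below uses a simplified pinhole projection and invented numbers; it shows the horizontal disparity between the two projections that the eyewear turns into a depth cue.

```python
# Build a stereo pair by projecting one 3D point from two eye positions
# offset by half the interocular distance. Simplified pinhole model;
# all numbers are illustrative assumptions.

def project(point, eye_x, focal=1.0):
    """Pinhole projection of (x, y, z) onto an image plane at distance
    `focal`, viewed from a camera at (eye_x, 0, 0) looking down +z."""
    x, y, z = point
    return (focal * (x - eye_x) / z, focal * y / z)

interocular = 0.065                  # roughly 65 mm between human eyes
vertex = (0.1, 0.05, 2.0)            # a model vertex 2 m in front of the viewer

left  = project(vertex, -interocular / 2)
right = project(vertex, +interocular / 2)
print(left, right)                   # identical y, slightly different x
```

The two projections differ only in their horizontal coordinate; that small disparity, delivered to the correct eye by the shuttering lenses, is what creates the impression of depth.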
Tracking the eyewear's position in space requires a special view controller consisting of three ultrasound speakers that sit on top of the monitor and emit signals to microphones built into the eyewear. Signals from both the eyewear and the speaker array go to a control unit, which detects phase differences between the transmitted and received ultrasound signals and uses that information to track the user's head position. Software then calculates new perspective views from the head position.
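Once the controller has converted phase differences into speaker-to-microphone distances, recovering the head position is a trilateration problem. The sketch below shows that last step under assumed geometry: the speaker layout, distances, and head position are invented test values, and the article does not say the real controller solves it this way.

```python
import math

# Trilateration sketch: three speakers on top of the monitor, one
# microphone on the eyewear. Given the three distances (from ultrasound
# time of flight), recover the microphone position.

# Assumed speaker layout: one at the origin, one on the x-axis, one in
# the xy-plane -- the arrangement the closed-form solution requires.
SPEAKERS = [(0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.15, 0.2, 0.0)]

def trilaterate(r1, r2, r3):
    """Standard closed-form trilateration for the speaker layout above."""
    (_, _, _), (d, _, _), (i, j, _) = SPEAKERS
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return x, y, -z        # the head sits in front of the speaker plane (z < 0)

# Simulate a known head position and check that it is recovered.
head = (0.1, 0.05, -0.5)
dists = [math.dist(head, s) for s in SPEAKERS]
print(trilaterate(*dists))
```

The sign choice on `z` resolves the two-solution ambiguity of intersecting spheres, since the viewer is always in front of the monitor rather than behind it.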
Third-party software vendors have begun releasing packages designed to work with 3D viewing equipment. One, called Sudden Depth from Chasm Graphics, provides a software tool kit that lets PC users create stereo images under Windows using computer graphics or 35-mm slides. It works in conjunction with the Autodesk 3D Studio program. Another, a workstation-based interactive package called 3D Interaction Accelerator from IBM, lets users "walk through" mechanical models far more complex than graphics hardware can otherwise handle at interactive rates. It handles models composed of thousands of parts and millions of faces.