Authored by Brett Burger, Product Engineer for Data-Acquisition Systems. Edited by Leslie Gordon.
A familiar engineering tenet says design is a trade-off — strength versus weight, weight versus speed, speed versus torque, and the like. Also to be considered, of course, is cost. Now add another trade-off: the environment. Green engineering throws an additional monkey wrench into the constant tug-of-war that is design.
The general goal of green engineering is the optimization of energy, resources, or environmental impact. And, as with many other optimization tasks, computers and modern measurement technology play a pivotal role.
Consider applications such as monitoring power-plant emissions, collecting rain-forest data, and measuring electrical signals. On the surface these seem to have little in common, but all are forms of green engineering. Some of these processes were previously optimized for cost, so “reoptimizing” them for environmental efficiency would seemingly cost more. It turns out this is not necessarily true.
For example, when coal-fired power plants needed a better way to monitor the chemicals coming from their stacks to meet EPA requirements, engineers at Data Science Automation in McMurray, Pa. (dsautomation.com), developed a monitor that continuously measures mercury in the outgas flow. The system compares readings against alarm limits and stores them for historical evaluation.
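At its core, that kind of continuous emissions monitor is a read-compare-log loop. The sketch below illustrates the concept only; the alarm limit, the read_mercury_ugm3 stub, the one-minute sample rate, and the CSV log are hypothetical stand-ins, not details of Data Science Automation’s system.

    import csv
    import time
    from datetime import datetime

    ALARM_LIMIT_UG_M3 = 5.0  # hypothetical alarm limit, not an EPA figure

    def read_mercury_ugm3():
        """Stand-in for the real continuous mercury measurement."""
        return 3.2  # placeholder reading

    def monitor(samples, log_path="mercury_history.csv"):
        """Read, compare against the alarm limit, and log for historical evaluation."""
        with open(log_path, "a", newline="") as log:
            writer = csv.writer(log)
            for _ in range(samples):
                value = read_mercury_ugm3()
                alarm = value > ALARM_LIMIT_UG_M3
                writer.writerow([datetime.now().isoformat(), value, alarm])
                if alarm:
                    print(f"ALARM: mercury at {value} ug/m3 exceeds limit")
                time.sleep(60)  # assumed one-minute sample interval

    # monitor(samples=60)  # log an hour of once-a-minute readings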
Many of the actuators and sensors had to sit in remote locations that were difficult to access directly, so the mercury monitor includes a wireless uplink to a PDA that gives plant workers easy access while on site. In maintenance mode, the PDA lets engineers and technicians operate all pumps and valves. Calibration data, from simple table comparisons to detailed flow checks of dry-gas meters, are stored locally on the programmable automation controller (PAC). The ability to perform these operations wirelessly has significantly reduced the plant’s maintenance costs.
Carbon flux in the Costa Rican rain forest
Another application involves data loggers in the Costa Rican rain forest. Researchers at the Univ. of Calif. at Los Angeles integrated wireless measurements, robotics, remote operation, and Web-hosted results to study the effects of carbon flux there. The specific area of study, La Selva Biological Station, has been one of the top sites for rain-forest and canopy research for decades. The 3,900-acre area gets 13 ft of rainfall yearly.
Among other tests at this site, researchers are studying what is known as the “Gap Theory,” a hypothesis that explains local carbon exchange in the rain forest. The idea: Small openings in the forest canopy, created by natural events such as a tree fall, generate a sort of carbon “chimney” that releases CO2, produced by soil respiration, into the atmosphere. The CO2 gradient between the forest floor and the canopy has been notoriously difficult to measure because of the canopy’s height and the site’s remote location.
The researchers developed what is known as a networked infomechanical system (NIMS). It is based on a ruggedized, programmable data logger that comprises a PowerPC processor running a real-time OS, a field-programmable gate array, and I/O measurement modules that interface with a variety of sensors. To capture data between the canopy and the ground, the NIMS hangs from a cable strung at different heights above the forest floor. The system, known as “SensorKit,” moves along the cable, stopping at 1-m intervals to quickly calibrate and take local measurements. The process repeats over several layers of the forest to build a full 3D image of the carbon flux. Specific measurements include temperature, CO2, humidity, 3D wind movement, heat flux, solar radiation, and photosynthetically active radiation.
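The sweep itself boils down to a stop-measure-advance loop repeated at each cable height. The following sketch is only a conceptual illustration; the 40-m span, the stub functions, and the channel list are assumptions and do not represent the actual NIMS “SensorKit” software.

    import random

    TRANSECT_LENGTH_M = 40  # assumed cable span
    CHANNELS = ["temperature", "co2", "humidity", "wind_uvw",
                "heat_flux", "solar_radiation", "par"]

    def move_to(position_m):
        """Stand-in for the drive that moves the node along the cable."""
        pass

    def calibrate():
        """Stand-in for the quick on-the-spot calibration."""
        pass

    def read_channel(channel):
        """Stand-in returning a dummy reading for one sensor channel."""
        return random.random()

    def sweep(height_m):
        """Stop at 1-m intervals, calibrate, and record all channels."""
        profile = []
        for position_m in range(TRANSECT_LENGTH_M + 1):
            move_to(position_m)
            calibrate()
            sample = {ch: read_channel(ch) for ch in CHANNELS}
            sample.update(position_m=position_m, height_m=height_m)
            profile.append(sample)
        return profile

    # Repeating the sweep with the cable strung at several heights builds a
    # 3D picture of CO2 and microclimate between the floor and the canopy.
    volume = [sweep(h) for h in (2, 10, 20, 30)]  # assumed heights, in meters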
The research findings could potentially help curtail the impact businesses and factories have on the environment. For example, should a widespread carbon cap and trade policy be adopted, businesses and regulatory bodies will need a way to determine whether specific land is a carbon sink or carbon source. In this scenario, an area’s carbon level would translate to carbon “credits” that could be purchased and traded by companies and land owners to help limit total global emissions.
Testing RF components
In another example, Summitek Instruments Inc. in Englewood, Colo. (summitekinstruments.com), developed software called Spartan that helps support lean manufacturing by automating the testing and analysis of RF and microwave components. Previously, many companies kept hard copies of such data in storage boxes and saved test data on individual computers, which made retrieving the information difficult. The software eliminates this problem by letting users design test sequences that can then be run with network analyzers from Agilent and Rohde & Schwarz, as well as with passive-intermodulation (PIM) analyzers from Summitek.
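Conceptually, such a test sequence is an ordered list of measurement steps, each naming an instrument, a parameter, and acceptance limits. The sketch below is an illustration of that idea; the field names, limit values, and run_sequence helper are assumptions and do not reflect Spartan’s actual sequence format.

    # Hypothetical test-sequence structure; not Spartan's actual format.
    test_sequence = [
        {"instrument": "vna", "parameter": "S11 return loss", "units": "dB",  "min": 18.0},
        {"instrument": "vna", "parameter": "S21 gain",        "units": "dB",  "min": -1.5},
        {"instrument": "pim", "parameter": "IM3 level",       "units": "dBm", "max": -110.0},
    ]

    def run_sequence(sequence, read_measurement):
        """Run each step and compare the reading against its acceptance limits."""
        results = []
        for step in sequence:
            value = read_measurement(step["instrument"], step["parameter"])
            passed = step.get("min", float("-inf")) <= value <= step.get("max", float("inf"))
            results.append({**step, "value": value, "passed": passed})
        return results

    # Example with a dummy reader in place of real instrument drivers:
    # results = run_sequence(test_sequence, lambda inst, param: 20.0)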
Recall that network analyzers are instruments that examine the properties of electrical networks, especially the reflection and transmission of electrical signals described by scattering parameters (S-parameters). S-parameters, and figures of merit derived from them such as gain, return loss, and voltage standing-wave ratio (VSWR), describe the electrical behavior of linear networks under steady-state, small-signal stimuli.
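As a concrete illustration of two of those figures of merit, independent of any particular analyzer, the standard conversions from a measured |S11| to return loss and VSWR look like this:

    import math

    def return_loss_db(s11_mag):
        """Return loss in dB from the magnitude of S11 (0 < |S11| < 1)."""
        return -20.0 * math.log10(s11_mag)

    def vswr(s11_mag):
        """Voltage standing-wave ratio from the magnitude of S11."""
        return (1.0 + s11_mag) / (1.0 - s11_mag)

    # Example: |S11| = 0.1 gives 20 dB return loss and a VSWR of about 1.22.
    print(return_loss_db(0.1), vswr(0.1))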
PIM analyzers examine unwanted interference signals, which can come from worn connectors, ferrous materials, thermal heating, or even a particular surface roughness. Summitek says that in a recent case, a customer installed the software, defined the measurement of a four-port antenna on a four-port VNA (10 different S-parameters, each with its acceptance limits), and began testing within an hour.
Specifically, the company set up its own NI LabVIEW Web server and developed common gateway interface (CGI) programs, using examples in LabVIEW’s Internet Toolkit as a guide. These programs on the main server make remote VI Server calls to the test stations to launch the test software and run tests. The main server maintains configuration control of the tests. The NI DIAdem DataFinder indexes test-data properties, generating a searchable database for data mining, so users can perform statistical analysis, run reports, and print data sheets.
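The indexing step is what makes old test data searchable. The sketch below illustrates the general idea with a small SQLite table; it is not the DIAdem DataFinder’s implementation or API, and the column names are assumptions.

    import sqlite3

    conn = sqlite3.connect("test_index.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS results
                    (part_number TEXT, station TEXT, property TEXT, value REAL)""")

    def index_result(part_number, station, property_name, value):
        """Record one test-data property so it can be searched later."""
        conn.execute("INSERT INTO results VALUES (?, ?, ?, ?)",
                     (part_number, station, property_name, value))
        conn.commit()

    def find(property_name, station):
        """Data-mining query: all values of one property measured at one station."""
        cur = conn.execute(
            "SELECT value FROM results WHERE property = ? AND station = ?",
            (property_name, station))
        return [row[0] for row in cur]

    index_result("ANT-100", "station-A", "S11 return loss (dB)", 21.3)
    print(find("S11 return loss (dB)", "station-A"))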
Intuitively, it would seem that when a product is always manufactured with the same parts and processes, there should be no difference in test-data statistics. But in the real world, things aren’t that easy. In fact, problems crop up that must be identified and corrected as soon as possible. So a Spartan analysis routine might, for example, summarize statistics by property value. Organizing these by test station could then pinpoint damaged test fixtures. The software can thus find “hidden” differences in a company’s manufacturing process, helping it eliminate costly errors.
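An analysis of that sort can be mocked up in a few lines. The example below, with made-up data, column names, and a simple one-standard-deviation rule, is only in the spirit of the article, not Spartan’s actual routine: it summarizes a property by station and flags the station whose mean drifts from the rest.

    import pandas as pd

    # Made-up return-loss readings from three test stations.
    df = pd.DataFrame({
        "station": ["A", "A", "B", "B", "C", "C"],
        "return_loss_db": [21.0, 20.5, 20.8, 21.2, 15.1, 14.8],
    })

    stats = df.groupby("station")["return_loss_db"].agg(["mean", "std", "count"])
    overall_mean = df["return_loss_db"].mean()
    overall_std = df["return_loss_db"].std()

    # Flag stations whose mean sits more than one overall standard deviation
    # from the overall mean (an assumed rule of thumb, not Spartan's criterion).
    stats["suspect"] = (stats["mean"] - overall_mean).abs() > overall_std
    print(stats)  # only station C is flagged, hinting at a damaged fixture there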