News | June 29, 2001

Thin client computing: Looking back to the future

Source: Rockwell Automation

By Chris Rodie
Rockwell Automation, Duluth

At first glance, the emergence of the "thin client" might appear to be a return to the past, involving a sort of "dumbing down" of the operator interface. But the thin client of today is far from dumb; it retains all the characteristics that once made the PC so attractive, while simplifying maintenance and administration of large systems. The PC revolution may be over, but its gains will be preserved in convincing fashion.

Over the past three decades the operator interface has evolved in two major phases, in conjunction with the evolution of control system architectures. From the early 1970s to the mid-1980s, industry's conversion to digital electronic control was in full swing. Plant-floor consoles crammed with discrete gauges, meters, dials and buttons were being replaced by CRT-based text terminals similar to those that had already become commonplace on front-office desktops in large enterprises. Text terminals, which had originally been developed to provide "interactive" access to data stored on mainframe computers, were adapted to plant control applications by adding color displays and special character sets. These features allowed them to deliver the needed graphical representations of real-time plant data, while enhanced ruggedness and durability allowed them to withstand the factory-floor environment.

In the simplest control system of that period, a single operator interface terminal would be connected directly to a machine controller via a serial line; in a larger system, several such terminals would be connected to a supervisory-level controller (typically a minicomputer), each via its own serial line. This arrangement provided barely enough bandwidth to support updates of crude (low-resolution, eight-color) graphics at marginally acceptable rates. While these operator interfaces were simpler and more flexible than their predecessors, they were nevertheless slow and expensive, and they lacked both interoperability (openness) and true graphical display capability.

When the PC arrived on the plant floor in the mid-1980s, operator interfaces underwent another major transformation. The PC revolution brought a tremendous improvement in the graphic capabilities of the hardware (resolution, color rendition), along with a mechanism (the window and pointer system) for the presentation and manipulation of the graphics. These developments allowed more information to be presented to an operator in a more ergonomic form.

With the proliferation of high-speed networks such as Ethernet, the PC revolution also brought a tremendous improvement in communications bandwidth; PC hardware and software "standards" made it possible -- both technically and economically -- to build "bridges" between islands of automation, allowing for higher levels of process control. Networking brought with it a new architectural model, the client-server model, and on this basis a new generation of operator interfaces emerged. The PC-based operator interface -- with its sophisticated graphical display -- was elevated from the status of a mere peripheral device to that of a self-contained peer device that functioned as a "client" (data consumer) of a machine controller.
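The client-server relationship can be made concrete with a minimal sketch in Python. This example is purely illustrative and not drawn from any vendor's product: the tag names, port number and line protocol are all invented here. A machine controller is modeled as a server that publishes tag values, and the operator interface acts as a pure data consumer that fetches and displays them:

    import json
    import socket
    import threading
    import time

    # Hypothetical "machine controller": publishes current tag values over
    # TCP. Real controllers use proprietary or standard industrial protocols;
    # this sketch only illustrates the client-server division of labor.
    TAGS = {"line_speed": 42.0, "tank_level": 73.5, "motor_on": True}

    def controller(port=9500):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        while True:
            conn, _ = srv.accept()
            with conn:
                # One request per connection: send all tags as JSON.
                conn.sendall(json.dumps(TAGS).encode())

    def operator_interface(port=9500):
        # The operator interface is a peer device and a "client": it consumes
        # data from the controller and owns only the presentation of it.
        with socket.create_connection(("127.0.0.1", port)) as conn:
            tags = json.loads(conn.recv(4096).decode())
        for name, value in tags.items():
            print(f"{name:>12}: {value}")

    if __name__ == "__main__":
        threading.Thread(target=controller, daemon=True).start()
        time.sleep(0.2)  # give the controller thread a moment to start listening
        operator_interface()

The point of the sketch is the separation of roles: the controller owns the data, while the operator interface merely renders it.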

The PC-based operator interface had its drawbacks, however. First, while PC hardware proved itself generally reliable, it incorporated a delicate mechanical component, namely the hard drive, which did not stand up well in a harsh plant-floor environment without extensive ruggedization. Second, the PC proved to be susceptible to security breaches, such as virus attacks and general unauthorized misuse.

Third, studies revealed that the costs of administering a PC-based system were much higher than originally projected. PCs, with their own operating systems and applications, require extensive configuration whenever a peripheral device is added or replaced, or whenever an application is changed or upgraded. As an operator interface, the PC might function as the "client" of another PC or a machine controller. However, it was still a "fat client" -- fat in the sense that it had a hard drive that stored an operating system, applications and even system-critical data.

Under the new paradigm of the "thin client," applications that would otherwise be installed directly on the "fat client" are pushed back onto a server. The client accesses them as needed, by means of special "open" network protocols designed to take maximum advantage of the bandwidth available on a modern high-speed network. Thus, the client can continue to provide a sophisticated graphical display without suffering significant performance loss and without all of the hardware baggage of a full-fledged PC. In one sense, the "thin" operator interface has again become a sort of peripheral device (unable to stand completely on its own), yet it is much simpler to operate, purchase, maintain and secure.
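This server-side shift can also be sketched in a few lines of Python. The example is again hypothetical: the port, URL and screen format are assumptions for illustration, not a description of any actual thin-client protocol. Here the server owns both the application -- a screen definition -- and the live data behind it, while the thin client merely fetches what it is given and renders it:

    import json
    import threading
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical application server: the screen layout and the data
    # behind it both live here, not on the client.
    SCREEN = {
        "title": "Boiler Overview",
        "fields": [
            {"label": "Steam pressure", "value": 101.3, "units": "kPa"},
            {"label": "Feed pump", "value": "RUNNING", "units": ""},
        ],
    }

    class ScreenHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(SCREEN).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):
            pass  # keep the demo output quiet

    def thin_client(url):
        # The client installs and stores nothing locally: it fetches the
        # screen on demand and renders it. Upgrading the application means
        # changing the server only -- the essence of the thin-client model.
        with urllib.request.urlopen(url) as resp:
            screen = json.load(resp)
        print(screen["title"])
        for field in screen["fields"]:
            print(f"  {field['label']:<16} {field['value']} {field['units']}")

    if __name__ == "__main__":
        server = HTTPServer(("127.0.0.1", 9600), ScreenHandler)
        threading.Thread(target=server.serve_forever, daemon=True).start()
        thin_client("http://127.0.0.1:9600/screen")

Note that replacing the screen, or the application behind it, touches only the server; every client picks up the change on its next request, which is exactly the administrative saving described above.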

Companies are always seeking ways to maximize the efficiency, security and profitability of their operations. Thin-client computing will continue to grow in popularity as organizations see how thin-client technology can maximize return on investment while providing stable control of the information-processing environment. The transition to thin-client solutions will be a selective process, as many businesses will continue to need the flexibility and expandability of a fully powered PC. With their ease of scalability, thin clients are positioned to provide the best of both worlds.

For more information on industrial computers and monitors from Rockwell Automation, contact the Rockwell Automation Response Center, Dept. (1338), 10701 Hampshire Avenue South, Bloomington, Minn., 55438, (800) 223-5354, Ext. 1338, fax: (800) 500-0329.
