That’s where we are today with the cloud: Everyone is in the cloud — looking at data, sending data to it, building cloud infrastructure, or (like me) writing about it. This is disconcerting, because the number one rule of investing is that when everyone gets in, it’s time to get out. In other words, if your barber, auto mechanic, and brother-in-law are invested in something, run like crazy to sell.
Consider the tulip mania of the late 16th and early 17th centuries. Botanist Carolus Clusius brought tulip bulbs to Holland to research the then-unknown plant. Unfortunately, unscrupulous neighbors snuck into Clusius’ garden and stole some of the more beautiful bulbs to sell at a very nice price. By all accounts, that kindled a desire among the richer Dutch to own tulip bulbs. Once the rich started acquiring bulbs, the not-so-rich, the middle class, and even the poor decided that they, too, should have tulips. Over the years, every increase in demand was followed by an increase in price that fed the next increase in demand. Tulips became the single most important investment for rich and poor alike. A single bulb could command the equivalent of $2,500 in today’s dollars and, at the height of the mania, people were selling homes to buy tulips.
Just like our time’s dot-com and stock-market bubbles, the tulip mania ended with a market crash. Eventually, when everyone owned tulips, prices collapsed in 1637 and Tulip Mania was over.
Now I don’t think that Cloud Mania directly compares to Tulip Mania, but there are similarities. The cloud is a hot conference topic. Amazon, Google, and Microsoft are warring to corner the Cloud-infrastructure market. Myriad industrial vendors are rushing out cloud-based versions of key products.
Cloud services are hot because they offer advantages to both the provider of the data and the consumer — including anywhere-access. That’s why cloud-based customer relationship management (CRM) software from companies like Salesforce is so successful. Salespeople have complete access to all their customer records without carrying them around.
Security is another benefit. Someone steals your laptop and you don’t lose all your process data, customer records, or financial data to some scoundrel. There aren’t any management headaches or maintenance issues. No worries about physical site access, IT expertise, redundancy, security, hardware platforms, software releases, and everything else you have to manage. Best of all, it’s affordable. A few days ago, Google dropped its prices for cloud services by about 40% to boost its market share.
In fact, I think I’m actually a cloud pioneer. In the early 1980s I worked on a remote terminal that accessed a remote processor with remote storage — a system I never saw, in a building I couldn’t find, in a locked room that I couldn’t access. But the cloud in this context is nothing more than a remote hard drive. For those of us in factory automation, that hard drive might be located on a server machine in the control room. It might be located in plant operations on the other side of our factory-floor firewall. Or it might be somewhere out on the Internet.
Before end users can employ data from an embedded device, that data has to make its way to the cloud. That can be tricky, depending on the embedded platform’s capabilities and its OS or TCP/IP stack. Here are five common methods for moving embedded data to the cloud. Some are fast, cheap, and simple, while others are complex, expensive, and difficult to implement.
1. SERVE UP XML (eXtensible Markup Language)
One of the fastest, least-complex ways to send data from an embedded device is to serve up embedded data as an XML file. Once the device can send XML files, the designer can easily load the data into many standard applications, including Internet Explorer, SQL Server, MS Word, Excel, and more. Excel is a particularly easy way to capture data: set up Excel to automatically grab data and insert it into rows of a table at a rate the Excel user chooses. Once the data is in Excel, the end user can apply all of Excel’s visual power to present it in interesting ways.
First, define a schema for your data. (That’s nothing more than a template for the XML the device will send.) Then, any request from a client triggers an action to send data, encoded in the XML format the schema specifies, out over TCP. Applications request the data by referencing a Web-page URL like 192.168.0.100/current.xml.
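As a minimal sketch of the idea, here’s how a device-side routine might encode current sensor readings as an XML document matching a simple assumed schema (the element names `device` and `reading` are illustrative, not part of any standard):

```python
import xml.etree.ElementTree as ET

def build_current_xml(readings):
    """Encode a dict of sensor readings as the XML document the
    device would serve at a URL such as /current.xml."""
    root = ET.Element("device")
    for name, value in readings.items():
        elem = ET.SubElement(root, "reading", attrib={"name": name})
        elem.text = str(value)
    return ET.tostring(root, encoding="unicode")

doc = build_current_xml({"temperature": 21.5, "pressure": 101.3})
print(doc)
```

On a real embedded platform this string would be written to the TCP socket in response to an HTTP GET; any XML-aware client (Excel included) can then pull and parse it.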
2. GOOGLE’S UNIVERSAL PROTOCOL
Google offers a pretty easy way to capture data. Using its Universal Protocol, send data to Google’s servers and then use Google’s tools to view it. In fact, this service works a lot like the Google Analytics tools that analyze Web-site traffic. Google has simply generalized the service and made it available for anyone to use with any data. It’s a simple protocol and a quick, easy way to store data in the cloud. The disadvantage is that users are at Google’s mercy: there’s no guarantee the company won’t discontinue or change the service.
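The general pattern behind such collection services is to encode each data point as query parameters on a plain HTTP request. A sketch of the device side, assuming a hypothetical endpoint and parameter names (check Google’s current documentation for the real URL and required fields):

```python
from urllib.parse import urlencode

# Hypothetical collection endpoint -- placeholder only.
ENDPOINT = "https://example-collector.googleapis.com/collect"

def build_report_url(device_id, metric, value):
    """Encode one data point as URL query parameters, the style of
    request a measurement-protocol-type service expects."""
    params = {"id": device_id, "metric": metric, "value": value}
    return ENDPOINT + "?" + urlencode(params)

url = build_report_url("pump-7", "flow_rate", 12.8)
print(url)
```

A device with even a minimal TCP/IP stack can issue a request like this; the heavy lifting of storage and visualization stays on the provider’s side.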
3. SOAP WEB SERVICES
When a designer needs more-complex interactions between two devices than plain XML can provide, the Simple Object Access Protocol (SOAP) may be best. SOAP is nothing more than an enhanced XML file that adds the ability to request services from the receiver. It’s the basis for Web services and the way applications talk to each other on the Internet. In fact, many platform vendors are incorporating network connectivity into their platforms. NetBurner and Digi International, through its Etherios brand, are two with easy-to-use offerings in this area. NetBurner builds the ability to send JSON commands (a less-verbose alternative to XML) into all of its platforms. Etherios has a much more extensive set of offerings, from the simplest possible solutions to complete done-for-you applications.
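To make the “enhanced XML” point concrete, here’s a sketch that wraps a method call in a minimal SOAP 1.1 envelope. The envelope namespace is the standard one; the service namespace and the `SetAlarmLimit` method are made-up placeholders:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(method, params,
                       service_ns="http://example.com/device"):
    """Wrap a method name and its parameters in a minimal SOAP 1.1
    envelope. The service namespace here is a placeholder."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(env, "{%s}Body" % SOAP_NS)
    call = ET.SubElement(body, "{%s}%s" % (service_ns, method))
    for name, value in params.items():
        arg = ET.SubElement(call, "{%s}%s" % (service_ns, name))
        arg.text = str(value)
    return ET.tostring(env, encoding="unicode")

req = build_soap_request("SetAlarmLimit", {"channel": 2, "limit": 75.0})
print(req)
```

The receiver parses the Body, dispatches on the method element, and returns a response in the same envelope format — that request/response pairing is what plain served-up XML lacks.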
4. ACTIVE SERVER PAGES
Ambitious designers who want the functionality of a scripting language should consider Active Server Pages (ASP), a server-side facility within Windows that provides a scripting language for building complex applications. With ASP, a device forms commands consisting of a URL with data as command parameters. An ASP script can process those parameters to summarize them, add them to a database, retrieve other pieces of data, or send them on to some other application.
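ASP itself scripts in VBScript or JScript, but the server-side logic is simple enough to sketch in a few lines of Python: pull the data out of the command URL’s parameters and store it. The URL, page name, and parameter names below are all hypothetical:

```python
from urllib.parse import urlparse, parse_qs

log = []  # stands in for a database table

def handle_device_command(url):
    """The kind of work an ASP script performs: extract the data
    carried as URL parameters and record it."""
    query = parse_qs(urlparse(url).query)
    record = {name: values[0] for name, values in query.items()}
    log.append(record)
    return record

rec = handle_device_command(
    "http://plantserver/store.asp?device=mixer3&temp=88.2&rpm=1450")
print(rec)
```

From the device’s point of view, issuing the command is just an HTTP GET — no heavier protocol machinery required.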
5. OPC UA
Open Process Control with Unified Architecture (OPC UA) is the next generation of OPC technology, offering secure, open, reliable information transfer within the process or enterprise environment. UA is flexible and adaptable for moving data between all types of controls, letting machines (for example) monitor devices and sensors that interact with real-world data. It also supports sophisticated data modeling, event notifications, and standard Internet-type transports.
A final note on proprietary “solutions” and the like: Over the past few years I’ve seen myriad vendors pushing turnkey cloud-based data collection. These vendors provide a hardware interface, a proprietary communication protocol to their private cloud, and a custom data-access system, all for a monthly fee. In my opinion, contracting for turnkey services is a huge mistake. Avoid like the plague “solutions” that use proprietary hardware, communication links, or private clouds with proprietary interfaces.
Instead, look for open standards such as HTTPS for secure communications. Or use standard data encoding like XML. If XML is too verbose, use JSON or a standard binary (as in OPC UA). Standard platforms, encodings, protocols, and transports give designers maximum flexibility for using the cloud now and in the future.
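The verbosity difference is easy to demonstrate: encode the same record both ways and compare sizes. The record below is an arbitrary example:

```python
import json
import xml.etree.ElementTree as ET

record = {"device": "mixer3", "temp": 88.2, "rpm": 1450}

# JSON encoding of the record
as_json = json.dumps(record)

# Equivalent XML encoding of the same fields
root = ET.Element("record")
for name, value in record.items():
    ET.SubElement(root, name).text = str(value)
as_xml = ET.tostring(root, encoding="unicode")

print(len(as_json), as_json)
print(len(as_xml), as_xml)
```

For small records the XML version carries noticeably more overhead, since every field name appears twice as open and close tags; on a bandwidth-limited embedded link, that overhead is one reason to reach for JSON or a standard binary encoding.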
And if you have any additional questions about the cloud, just make an appointment with my Acupuncturist. He’ll be happy to chat with you.