You know about the Cloud. How about the Fog?

A bottleneck could be ahead for the Cloud, with billions of objects generating overwhelming amounts of unfiltered data. According to Supply Chain Digest, some experts think the answer lies closer to the ground – in the Fog.

From SCDigest’s OnTarget e-Magazine

– May 20, 2014 –

RFID and AIDC News: Perhaps the Internet of Things Should be a Little Less Chatty

Do We Really Want All that Data Pushed Into the Cloud? Cisco Pushes a Fog Alternative; How Often Should Pallets Really Communicate?

SCDigest Editorial Staff

The Internet of Things (IoT) seems to be becoming a reality. Earlier this year, for example, the analysts at Gartner predicted that there would be some 26 billion things connected to the Internet by 2020. (See As Internet of Things becomes Real, New Opportunities for Supply Chain.)

What kind of things? Pallets of inventory, machines on factory floors, automobiles, you name it, many certainly with supply chain implications. Gartner, in fact, believes operations staff and the IT team need to start conversations right now to begin planning for the IoT future that will soon be here, looking at what information might be available, and how it might be leveraged.

A key point relative to the Internet of Things is that to date, nearly all of the information on the Internet has been put there by human beings, whether it’s a blog post, the results of some scientific study, or a government report. While the amount of information available today continues to explode, to an extent this human factor has limited the amount of information that does get posted. Just consider the number of companies and organizations that have trouble keeping their web sites up to date.

While at least for now human beings will still have to develop applications to take in and display the information, with the IoT, once the “things” start sending their data, the Internet as we know it may be overwhelmed, unshackled from the dependence on humans to get data to the web. There will be billions of inanimate objects sending streams of data 24 x 7 to the Cloud.

Bandwidth, it seems, especially in the US (where Internet speeds lag well behind many countries in Europe and Asia), could become a real IoT bottleneck given this huge increase in traffic. So while the idea is that all the connected things will send data to the “Cloud,” readily accessible by all the relevant parties, that may just not be practical or even make sense in many applications.

Consider, for example, a truckload full of pallets in a temperature controlled trailer. Each pallet has an RFID chip to uniquely identify it, connected to a sensor that monitors temperature and perhaps other environmental conditions, such as humidity.

As that trailer is moving down the highway, how much of this real-time data do companies really want communicated – and how much data traffic can the communications systems really handle?

Will companies really want updates from the sensors every few minutes? Or will “event management” thinking need to be added to the IoT, so that, for example, data from our moving pallets only is sent to the Cloud if temperature or other conditions change, or start to approach tolerances?

The answer to that last question will often be Yes, meaning operations managers and IT will have yet another set of variables to consider when designing IoT applications.
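To make the “event management” idea concrete, here is a minimal sketch of how a pallet sensor might decide which readings are worth transmitting. The thresholds, deadband, and function names are purely illustrative assumptions, not any vendor’s actual implementation:

```python
# Hypothetical sketch of event-management filtering for a pallet's
# temperature sensor: a reading is reported only when it approaches a
# tolerance limit or drifts meaningfully since the last transmission.
# All names and values are illustrative.

TOLERANCE_LOW = 2.0    # degrees C - assumed lower spoilage limit
TOLERANCE_HIGH = 8.0   # degrees C - assumed upper spoilage limit
DEADBAND = 0.5         # ignore drift smaller than this
WARNING_MARGIN = 1.0   # report when within this margin of a tolerance

def should_report(reading, last_reported):
    """Decide whether a temperature reading is worth sending to the Cloud."""
    # Always report a breach or near-breach of tolerance.
    if reading <= TOLERANCE_LOW + WARNING_MARGIN:
        return True
    if reading >= TOLERANCE_HIGH - WARNING_MARGIN:
        return True
    # Otherwise report only a meaningful change since the last transmission.
    if last_reported is None:
        return True
    return abs(reading - last_reported) > DEADBAND

def filter_stream(readings):
    """Return the subset of readings that would actually be transmitted."""
    sent = []
    last = None
    for r in readings:
        if should_report(r, last):
            sent.append(r)
            last = r
    return sent

# A stable trailer sends almost nothing; a warming one keeps reporting.
print(filter_stream([5.0, 5.1, 5.2, 5.1, 5.0]))  # -> [5.0]
print(filter_stream([5.0, 5.9, 6.6, 7.1, 7.4]))  # -> [5.0, 5.9, 6.6, 7.1, 7.4]
```

The point of the sketch is how dramatically the transmitted volume shrinks when conditions are stable, while a trend toward tolerance still gets through promptly.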

Or consider that new aircraft engines from GE have sensors and Internet communications for nearly every part the engine contains, and can generate as much as half a terabyte of data for a single flight.

And this jet engine provides a great example of how tricky all this will be. Certainly, there would be some cases where the data shows the engine needs some kind of maintenance when it lands, and only that information needs to be sent, once readings reach certain tolerance levels.

But in other cases, won’t a maintenance engineer need to see all the data, because to understand the big picture, readings from several areas or parts need to be viewed together?

And this then begs the question: if only a fraction of the data generated by the IoT is actually communicated upwards, is all the rest simply discarded, or is it stored locally?

Some of the IT industry’s biggest names, in fact, are pushing the idea of doing much of the data storage locally, with rules about what portions of it get sent to the Cloud mothership.

Networking systems giant Cisco, in fact, is pitching a concept it calls the “Fog” as a complement to the Cloud. In Cisco’s vision, a new generation of routers – Cisco’s bread and butter offering – would get even smarter, have big storage capacity, and make decisions about what data goes where.

It’s called the Fog because much information would stay “close to the ground” where it is being generated, rather than all being pushed into the Cloud.
As part of this vision, these routers would not send information to the Cloud unless they need to, based on defined business rules. IBM is pushing a similar concept.

So we might be entering an age of what could be called “information logistics,” with complex questions about what data needs to be captured, how and when and how much of it needs to be moved, how much stored, and how anyone is going to make sense of it.

The Internet of Things is likely to have a profound effect on many supply chains, but we are in the very early innings of figuring out how to optimally harness its power.
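The Fog idea of storing everything locally while forwarding only rule-matching data upward can be sketched in a few lines. This is an illustrative toy, not Cisco’s or IBM’s actual design; the rule names and message fields are assumptions:

```python
# Hypothetical sketch of a Fog-style edge node: every message is stored
# locally ("close to the ground"), but only messages matching defined
# business rules are forwarded to the Cloud. All names are illustrative.

local_store = []   # data retained at the edge
cloud_queue = []   # data forwarded upward

def breach_rule(msg):
    """Forward temperature readings that are out of an assumed tolerance."""
    return msg.get("type") == "temperature" and not (2.0 <= msg["value"] <= 8.0)

def maintenance_rule(msg):
    """Forward any message flagged as requiring maintenance."""
    return msg.get("maintenance_needed", False)

FORWARD_RULES = [breach_rule, maintenance_rule]

def handle(msg):
    local_store.append(msg)                       # everything kept locally
    if any(rule(msg) for rule in FORWARD_RULES):  # forward only on a rule hit
        cloud_queue.append(msg)

for m in [
    {"type": "temperature", "value": 5.2},
    {"type": "temperature", "value": 9.4},
    {"type": "vibration", "value": 0.3, "maintenance_needed": True},
]:
    handle(m)

print(len(local_store))  # 3 - everything retained at the edge
print(len(cloud_queue))  # 2 - only the breach and the maintenance flag
```

The design choice mirrored here is exactly the article’s point: the full stream survives for later forensic analysis, while the Cloud sees only the events the business rules deem worth the bandwidth.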