
How close to the edge should you get?



Excitement over the possibilities of the Internet of Things (IoT) has been building to a crescendo. Chip and device manufacturers have long anticipated the day when IoT would finally mean something. That day has come: the sought-after confluence of technology innovation and demand for new ways to produce business value, the moment when organizations can finally do something with IoT data. Now that this moment has arrived, business leaders are being asked to make multimillion-dollar bets on where they think the market is going. Whatever vertical they’re in, the answer will have to incorporate edge delivery for IoT, or their data will never deliver its full potential. So the question is, how close to the edge should you get?

IoT Isn’t Spelled C-l-o-u-d

Though IoT and cloud computing are inextricably linked, the two ideas are not synonymous. The cloud is effectively an extension of a company’s data center: it drives greater efficiency across the business environment, from time to market to cost advantages. It’s also a key enabler in expanding the IoT ecosystem, because it extends the effective reach of a company’s core assets.
For example, a weather company’s sensors generate massive amounts of semi-structured and unstructured data. Those sensors are IoT devices that may be geographically distributed across the world. To derive value from the sensor data, the company must store, access, and analyze it in real time (or near real time). Cloud infrastructure is the most effective way to perform those tasks, because IoT devices usually possess very little, if any, computing power of their own. A weather sensor can read temperature, humidity, pressure, dew point, visibility, wind direction, and more, but all it does is spit out numbers; it does not store or analyze that data.
Now multiply that data output across any number of sensors, and you have potentially trillions of data points being collected daily. One company reported creating 20TB of data a day from just one such source. For perspective, the text of the entire Library of Congress print collection is commonly estimated at around 10TB: half the size of one day’s worth of weather data.
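The arithmetic behind those figures is simple to sketch. A quick back-of-envelope calculation (the per-point size below is an illustrative assumption, not a measured value) shows how trillions of readings become terabytes:

```python
# Back-of-envelope: how "trillions of data points" becomes terabytes.
# Both figures are illustrative assumptions, not vendor measurements.
data_points_per_day = 1_000_000_000_000  # one trillion readings per day
bytes_per_point = 20                     # assumed: compact binary encoding

total_bytes = data_points_per_day * bytes_per_point
print(f"{total_bytes / 1e12:.0f} TB/day")  # prints "20 TB/day"
```

Even a very compact encoding, at trillion-point scale, lands in the tens of terabytes per day, which is why moving everything to a central store quickly becomes untenable.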
The emergence of edge delivery will power IoT

Edge Is the Answer. What’s the Question Again?

IoT places distributed demands on data similar to those of cloud computing, but it exacerbates the challenges by increasing the potential volume, velocity, and variety of the data collected. Different companies collect their IoT data from vastly different locations, as with the weather company’s sensors. The weather sensor example represents data collection and processing at the furthest logical reach of a company’s data system: the edge. Other examples include devices sitting in remote locations or mobile devices that are constantly on the move.
Edge computing looks to push the collection and processing of sensor data out of centralized data stores and into edge locations, allowing companies to unlock more value from their data. This is sometimes referred to as a “reverse CDN (Content Delivery Network).” CDNs push popular content out to the edge of a network, caching a subset of the aggregate available data. Similarly, edge computing processes sensor data as close to the sensor as is optimal, allowing companies to avoid both network latency and overwhelmed centralized infrastructure. Ever try moving a petabyte of data across a network? The answer is: don’t. Physics works the same no matter where you are; it doesn’t compromise.
Edge computing reduces the volume of data a company must move, the distance the data must travel, and the costs associated with distributing data across disparate locations and storing it centrally. It relieves pressure on centralized data stores and enables more effective, localized real-time analysis of data. Companies that successfully implement an edge-computing architecture can reduce latency, optimize infrastructure, and increase the value of their data.
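A minimal sketch of that idea, aggregating at the edge and forwarding only summaries upstream, might look like the following (the window size, field names, and summary shape are all hypothetical, not from any particular platform):

```python
from statistics import mean

def summarize_window(readings):
    """Collapse a window of raw sensor readings into one summary record.

    Instead of shipping every raw reading to the central store, a
    hypothetical edge node forwards only count/min/mean/max per window,
    a small fraction of the raw volume.
    """
    temps = [r["temp_c"] for r in readings]
    return {
        "count": len(readings),
        "min": min(temps),
        "mean": round(mean(temps), 2),
        "max": max(temps),
    }

# Hypothetical: one minute of readings from a single weather sensor.
window = [{"temp_c": t} for t in (21.0, 21.4, 20.8, 21.2)]
summary = summarize_window(window)
print(summary)  # one summary record forwarded instead of four raw readings
```

The design trade-off is the same one CDNs make in reverse: you give up access to every raw reading at the center in exchange for moving far less data over the network.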

Edgy acquisitions — the case for new approaches

I talk to executives from some of the largest and most forward-thinking companies in the world, and I am amazed when they believe the IoT data challenge can be tackled from a traditional enterprise architecture point of view. Recent acquisitions suggest otherwise: Jasper, ParStream, and The Weather Company will allow Cisco and IBM to reach IoT service providers and organizations in verticals they might not otherwise have access to.
In October 2015, Cisco acquired ParStream, a big data analytics company. ParStream bolsters Cisco’s IoT offerings by giving customers the ability to analyze vast quantities of data at the edge in real time. That same month, IBM announced its intent to acquire The Weather Company for over $2 billion. Hint: it wanted more than just increased access to weather data. IBM was looking to leverage The Weather Company’s IoT platform and its ability to process billions of sensor data points. In February of this year, Cisco acquired Jasper Technologies, a provider of IoT lifecycle-management solutions. Jasper’s technology will help Cisco customers securely connect millions of devices and collect and analyze the data those devices produce. These acquisitions show how important a non-enterprise architectural approach will be to extending market reach into IoT and the edge.

Edge delivery…eerily familiar

The prevailing thought about edge computing seems to be that either you have centralized data stores or you move completely to an edge delivery model. I feel like this conversation is a rinse-and-repeat from five years ago when we talked about cloud computing and whether or not you should migrate everything to the public cloud. Just as then, the answer is not mutually exclusive.
The challenge is that much of today’s edge computing is done in an ad hoc fashion. Nodes like IoT sensors and mobile devices generate data at the network edge, but they don’t necessarily belong to a thought-out approach to edge computing. Organizations that want to derive long-term value from their data should implement a strategy that merges existing infrastructure, both centralized and cloud, with new tools purpose-built for IoT and edge analytics.
Edge computing is not a cure-all for every data question, but it is the next step for organizations that want to take advantage of IoT. First, organizations must select technology purpose-built for edge analytics use cases: tools that can absorb the continued increase in data volume, velocity, and variety, compounded by increasing geographical distribution. Second, edge analytics initiatives must make data actionable in real time where it would otherwise take too long to derive value. IoT is here, whether companies are ready or not, and big decisions loom. The question is: will you go to the edge, or not?
