


IoT has already enjoyed next-big-thing status for several years now, and all signs point to continued growth as homes, cars, and nearly every piece of hardware in our lives gets connected. But that doesn’t mean the industry isn’t facing challenges. In fact, its rapid growth and status as the new kid on the block make IoT perfectly poised to face two particularly big challenges in the years to come.

Given the high demand for connected products, we need to solve these problems as quickly as possible. Late last year, Gartner analysts estimated that 2016 would see about 5.5 million new IoT devices connected every single day. But that’s nothing compared to what’s coming. Back in 2013, Cisco made headlines by predicting IoT will be a $14.4 trillion industry by 2022. So what does the future of IoT look like? And what can we do now to better prepare for the connected future?

Developers aren’t ready for IoT

The first and perhaps easiest challenge to overcome is the talent supply problem. Most developers were not taught to write software for connected hardware, and until very recently, developers were set in a desktop mindset. According to our CTO at Mokriya, Pranil Kanderi, there are a few exceptions to this rule: “A small market share of embedded devices has been around for a while now,” Kanderi says. “But they were such a small segment that, for the most part, software developers kept the focus on desktops.”

Now, as consumers eagerly buy up new IoT products as quickly as they hit the market, the demand for developers who can write for connected hardware is growing fast. Moving from this desktop-first mentality to different kinds of hardware has been a slow process, Kanderi says. Too slow for an industry releasing new products every single week.

Here at Mokriya, our IoT business is a majority of the work we do. “If I have to guess, more than 70-80% of our projects so far have some kind of a connected device or hardware,” says Kanderi. “And that’s in addition to the mobile device itself.”

It wasn’t until IoT exploded in popularity that developers began to realize how much demand there was in the space, and how few developers were ready to fulfill it. Now that we’re seeing this demand balloon, how much trouble is the industry in? How likely is it that developers can learn the new skills and approaches that hardware-to-software engineering challenges will require?

Kanderi says it’s probably not going to be as dramatic as it looks right now. “Developers now have physical hardware — Raspberry Pi, IoT kits, etc. — on their desks that they can write software for,” he says. “The real shift will be learning to think in terms of smaller distributed networked components.”

But the majority of software developers still rely on just a few platforms, such as Java, .NET, and ERP systems. “As the connected devices grow, there will come a time when a lot of these developers will need to transition to write software for connected devices,” Kanderi points out. And that time is approaching. So developers should start learning the most important IoT software development skills immediately.

Thankfully, for good software developers who understand the basics of programming, this isn’t likely to be a massive issue, Kanderi says. “Some of the same programming languages, like Python, C, Java, and JavaScript, can be used on hardware platforms like Arduino, Raspberry Pi, and Intel Edison.” Still, the relative silence on this issue within the industry is a bit troubling.
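The carry-over is easy to see in practice. Here is a minimal, hypothetical Python polling loop of the kind that runs essentially unchanged on a desktop or a Raspberry Pi; `read_sensor` is a stand-in for a real hardware driver call (RPi.GPIO, smbus, or a vendor SDK), so everything below is ordinary Python:

```python
import random
import time

def read_sensor():
    # Stand-in for a real hardware driver call (e.g., RPi.GPIO, smbus, a vendor SDK).
    return random.uniform(15.0, 35.0)

def poll(threshold=30.0, samples=10, interval=0.0):
    """Poll the sensor and collect readings that cross the threshold."""
    alerts = []
    for _ in range(samples):
        value = read_sensor()
        if value > threshold:
            alerts.append(round(value, 1))
        time.sleep(interval)
    return alerts

print("alerts:", poll())
```

Swapping the stand-in for a real driver is often the only change needed when the same code moves onto a board.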

Fragmentation in connected hardware

What’s more worrisome in the long term is the unsolved issue of fragmentation within the industry. There are thousands of devices — with new products launching every week — and very few of them can communicate effectively. In order to really take advantage of the possibilities in connected hardware, we’re going to have to create a way for these products to come together.

The problem isn’t that this challenge has gone unnoticed; quite the opposite, actually. At this point, the larger issue is that too many tech enterprises have spotted the opportunity to become IoT’s universal hub and are currently fighting to build the budding industry around their own ecosystems. Google is expanding Google Home while Amazon is finding new ways to connect IoT products using Echo/Alexa. It comes as no surprise that these two giants have begun a very public battle to rule the smart home. And then there are organizations, like Chronicled, that are building open databases to connect IoT products. Earlier this month, even the giant consortium AllSeen Alliance announced a partnership with rival Open Connectivity Foundation.

There’s no way to tell how this will all turn out, partially because connected devices are still in such a nascent stage, but also due to increasing instances of tech brands partnering up with multiple IoT alliances at once. The fact that brands are hedging their bets is not a huge surprise, given the newness of the mainstream IoT industry, yet it does point to a greater uncertainty as we look toward the future of IoT. And while it’s in everyone’s interest to solve that problem together, the temptation of becoming the hub for all connected hardware in the future is too good for many brands to pass up altogether.

The lack of a common standard for communicating across various devices could become a major problem if we continue on this path. But either way, if you are a developer, it’s high time you got your hands on an IoT kit and started connecting with the future.

Read about how Mokriya develops solutions for IoT problems

Read more…

As if the Internet of Things (IoT) was not complicated enough, the Marketing team at Cisco introduced its Fog Computing vision in January 2014, also known as Edge Computing to other, more purist vendors.

Given Cisco's frantic activity in its Internet of Everything (IoE) marketing campaigns, it is not surprising that many bloggers have indulged in shocking headlines around the subject, taking advantage of the IoT hype.

I hope this post helps you better understand the role of Fog Computing in the IoT Reference Model, and how companies are using intelligent IoT gateways in the Fog to connect the "Things" to the Cloud, through some application areas and examples of Fog Computing.

The problem with the cloud

As the Internet of Things proliferates, businesses face a growing need to analyze data from sources at the edge of a network, whether mobile phones, gateways, or IoT sensors. Cloud computing has a disadvantage: It can’t process data quickly enough for modern business applications.

The IoT owes its explosive growth to the connection of physical things and operational technology (OT) to analytics and machine learning applications, which can help glean insights from device-generated data and enable devices to make “smart” decisions without human intervention. Currently, such resources are mostly provided by cloud service providers, where the computation and storage capacity exists.

However, despite its power, the cloud model is not applicable to environments where operations are time-critical or internet connectivity is poor. This is especially true in scenarios such as telemedicine and patient care, where milliseconds can have fatal consequences. The same can be said about vehicle to vehicle communications, where the prevention of collisions and accidents can’t afford the latency caused by the roundtrip to the cloud server.

“The cloud paradigm is like having your brain command your limbs from miles away — it won’t help you where you need quick reflexes.”

Moreover, having every device connected to the cloud and sending raw data over the internet can have privacy, security and legal implications, especially when dealing with sensitive data that is subject to separate regulations in different countries.

IoT nodes are closer to the action, but for the moment, they do not have the computing and storage resources to perform analytics and machine learning tasks. Cloud servers, on the other hand, have the horsepower, but are too far away to process data and respond in time.

The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.

The variety of IoT systems and the need for flexible solutions that respond to real-time events quickly make Fog Computing a compelling option.

Fog Computing: oh my God, another layer in IoT!

A study by IDC estimates that by 2020, 10 percent of the world’s data will be produced by edge devices. This will further drive the need for more efficient fog computing solutions that provide low latency and holistic intelligence simultaneously.

“Computing at the edge of the network is, of course, not new -- we've been doing it for years to solve the same issue with other kinds of computing.”

Fog Computing, or Edge Computing, is a paradigm championed by some of the biggest IoT technology players, including Cisco, IBM, and Dell. It represents a shift in architecture in which intelligence is pushed from the cloud to the edge, localizing certain kinds of analysis and decision-making.

Fog Computing enables quicker response times, unencumbered by network latency, as well as reduced traffic, selectively relaying the appropriate data to the cloud.

The concept of Fog Computing attempts to transcend some of these physical limitations. With Fog Computing, processing happens on nodes physically closer to where the data is originally collected, instead of sending vast amounts of IoT data to the cloud.
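A minimal sketch of that idea, with hypothetical names and thresholds: a fog node acts on every reading locally and forwards only the anomalies, keeping the cloud round trip off the time-critical path.

```python
def process_at_edge(readings, low=10.0, high=80.0):
    """Act on each reading locally; forward only anomalies to the cloud.

    Returns (local_actions, forwarded). The vast majority of readings are
    handled on the node itself and never cross the network.
    """
    local_actions, forwarded = [], []
    for value in readings:
        if value < low or value > high:
            # The time-critical response happens here, with no cloud round trip.
            local_actions.append(("alarm", value))
            forwarded.append(value)   # the cloud sees only the exceptions
        else:
            local_actions.append(("ok", value))
    return local_actions, forwarded

actions, to_cloud = process_at_edge([42.0, 95.5, 37.1, 5.2])
print(to_cloud)  # -> [95.5, 5.2]
```

Out of four readings, only the two out-of-range values ever leave the node.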

Photo Source: http://electronicdesign.com/site-files/electronicdesign.com/files/uploads/2014/06/113191_fig4sm-cisco-fog-computing.jpg

The OpenFog Consortium

The OpenFog Consortium was founded on the premise that open architectures and standards are essential for the success of a ubiquitous Fog Computing ecosystem.

The collaboration among tech giants such as ARM, Cisco, Dell, GE, Intel, Microsoft, and Schneider Electric to define an open, interoperable Fog Computing architecture is without any doubt good news for a vibrant supplier ecosystem.

The OpenFog Reference Architecture is an architectural evolution from traditional closed systems and the burgeoning cloud-only models to an approach that emphasizes computation nearest the edge of the network when dictated by business concerns or the critical functional requirements of the system.

The OpenFog Reference Architecture consists of putting micro data centers or even small, purpose-built high-performance data analytics machines in remote offices and locations in order to gain real-time insights from the data collected, or to promote data thinning at the edge, by dramatically reducing the amount of data that needs to be transmitted to a central data center. Without having to move unnecessary data to a central data center, analytics at the edge can simplify and drastically speed analysis while also cutting costs.

Benefits of Fog Computing

  • Frees up network capacity - Fog computing uses much less bandwidth, which means it doesn't cause bottlenecks and similar congestion. Less data movement on the network frees up capacity that can then be used for other things.
  • It is truly real-time - Fog computing responds much faster than any cloud computing architecture we know today. Since data analysis is done on the spot, it is a true real-time concept, which makes it a perfect match for the needs of the Internet of Things.
  • Boosts data security - Collected data is more secure when it doesn't travel. It also makes data storage much simpler, because the data stays in its country of origin; sending data abroad might violate certain laws.
  • Analytics is done locally - Fog computing enables developers to access the most important IoT data from other locations, while keeping piles of less important information in local storage.

Disadvantages of Fog Computing

  • Some companies don't like their data being outside their premises - with Fog Computing, a lot of data is stored on the devices themselves (which are often located outside company offices), and part of the developer community perceives this as a risk.
  • The whole system sounds a little confusing - a concept that includes a huge number of devices located all around the world, each storing, analyzing, and sending its own data, can sound utterly confusing.

Read more: http://bigdata.sys-con.com/node/3809885

Examples of Fog Computing

The applications of fog computing are many, and it is powering crucial parts of IoT ecosystems, especially in industrial environments. See below some use cases and examples.

  • Thanks to the power of fog computing, New York-based renewable energy company Envision has been able to obtain a 15 percent productivity improvement from the vast network of wind turbines it operates. The company is processing as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut down data analysis time from 10 minutes to mere seconds, providing them with actionable insights and significant business benefits.
  • Plat One is another firm using fog computing to improve data processing for the more than 1 million sensors it manages. The company uses the Cisco-ParStream platform to publish real-time sensor measurements for hundreds of thousands of devices, including smart lighting and parking, port and transportation management and a network of 50,000 coffee machines.
  • In Palo Alto, California, a $3 million project will enable traffic lights to integrate with connected vehicles, hopefully creating a future in which people won’t be waiting in their cars at empty intersections for no reason.
  • In transportation, it’s helping semi-autonomous cars assist drivers in avoiding distraction and veering off the road by providing real-time analytics and decisions on driving patterns.
  • It also can help reduce the transfer of gigantic volumes of audio and video recordings generated by police dashboard and video cameras. Cameras equipped with edge computing capabilities could analyze video feeds in real time and only send relevant data to the cloud when necessary.

See more at: Why Edge Computing Is Here to Stay: Five Use Cases By Patrick McGarry  

What is the future of fog computing?

The current trend shows that fog computing will continue to grow in usage and importance as the Internet of Things expands and conquers new grounds. With inexpensive, low-power processing and storage becoming more available, we can expect computation to move even closer to the edge and become ingrained in the same devices that are generating the data, creating even greater possibilities for inter-device intelligence and interactions. Sensors that only log data might one day become a thing of the past.

Janakiram MSV wondered whether Fog Computing will be the next big thing in the Internet of Things. It seems obvious that while the cloud is a perfect match for the Internet of Things, there are other scenarios and IoT solutions that demand low-latency ingestion and immediate processing of data, and there Fog Computing is the answer.

Does the fog eliminate the cloud?

Fog computing improves efficiency and reduces the amount of data that needs to be sent to the cloud for processing. But it’s here to complement the cloud, not replace it.

The cloud will continue to have a pertinent role in the IoT cycle. In fact, with fog computing shouldering the burden of short-term analytics at the edge, cloud resources will be freed to take on the heavier tasks, especially where the analysis of historical data and large datasets is concerned. Insights obtained by the cloud can help update and tweak policies and functionality at the fog layer.

And there are still many cases where the centralized, highly efficient computing infrastructure of the cloud will outperform decentralized systems in performance, scalability and costs. This includes environments where data needs to be analyzed from largely dispersed sources.

“It is the combination of fog and cloud computing that will accelerate the adoption of IoT, especially for the enterprise.”

In essence, Fog Computing allows for big data to be processed locally, or at least in closer proximity to the systems that rely on it. Newer machines could incorporate more powerful microprocessors, and interact more fluidly with other machines on the edge of the network. While fog isn’t a replacement for cloud architecture, it is a necessary step forward that will facilitate the advancement of IoT, as more industries and businesses adopt emerging technologies.

'The Cloud' is not Over

Fog computing is far from a panacea. One of the immediate costs associated with this method pertains to equipping end devices with the necessary hardware to perform calculations remotely and independent of centralized data centers. Some vendors, however, are in the process of perfecting technologies for that purpose. The tradeoff is that by investing in such solutions immediately, organizations will avoid frequently updating their infrastructure and networks to deal with ever increasing data amounts as the IoT expands.

There are certain data types and use cases that actually benefit from centralized models. Data that carries the utmost security concerns, for example, will require the secure advantages of a centralized approach or one that continues to rely solely on physical infrastructure.

Though the benefits of Fog Computing are undeniable, the Cloud has a secure future in IoT for most companies with less time-sensitive computing needs and for analysing all the data gathered by IoT sensors.

 

Thanks in advance for your Likes and Shares

Thoughts ? Comments ?

Read more…

Earlier this month I attended the 2016 ANT Wireless Symposium in Banff, Canada (Listen up conference organizers: more meetings in Banff please!). Put on by ANT Wireless, stewards of a protocol and silicon solution for ultra-low power (ULP) practical wireless networking applications which are now integrated into many popular products and devices, the conference looked at how wireless advances define how we live, do business and use products.  

I moderated a panel on IoT and fragmentation. The panelists should make any conference organizer drool. Joining me on stage were IBM’s Doug Barton, Google’s Doug Daniels, Rick Gibbs of North Pole Engineering and sport technology guru Ray Maker.  

Each of these gentlemen has first-hand experience in creating, developing, or heavily using connected devices. Ray Maker’s blog is the first place to visit if you’re looking to try out sports equipment. Rick’s company is a vertically integrated electrical engineering company specializing in embedded microprocessor-based hardware and software design. Doug Daniels is the head of cloud platform at Google and helped create Mi Pulse, high-tech stylish activewear with integrated heart rate monitoring technology. And finally, Doug Barton partnered with ultra-cyclist Dave Haase to put IBM’s IoT and Big Data capabilities to the test.

If you’re looking at connecting low-powered devices, or are considering a communications standard, be it ANT+ vs. BTLE, then you’ll want to watch this discussion. We cover interoperability, explore the large-scale initiatives that will define the standards and frameworks for the IoT globally, what will help increase adoption for end-users, and ask if diversification of protocols is actually good for the market.

Update: Ray Maker also includes information about his talk and our discussion here. Be sure to read Ray’s additional thoughts in the comments section.

Read more…

IoT: Platform Transformation - O'Reilly Report

While the full impact of the Internet of Things has yet to arrive, many companies already offer IoT platforms with the services, software, and hardware necessary to make your company IoT-ready. How do you determine which platform will meet the needs of your industry and, more importantly, your business? This O’Reilly report will help you sort through the criteria you need to think about when planning your IoT strategy.

Author Matthew Perry explores key considerations for smoothing the transition to IoT components. You’ll not only examine the things and people you want to connect, but also the network upgrades necessary to connect them—including data flow, APIs and end user applications, and robust security.

Chances are your IoT platform won’t come out of a box; you may do part of the work internally. With the guidelines in this report, you’ll learn how to prepare for an IoT platform that specifically meets the needs of your business and its customers.

Brought to you by PTC


>> Download Now

Read more…

Smarter Cities and The Internet of Things


Parking meters, information signs, CCTV, traffic signals – almost everywhere that you look in a modern city, there’s a microchip embedded device, connecting to what has now become known as the all-encompassing Internet of Things. Although we often overlook the fact, cities are, in essence, huge and complex businesses. Cities compete for residents, investors, tourists, and even funding from central government. For cities to remain relevant, they have to become smarter, leaner, and more connected. The IoT is helping the world’s largest cities to do this, and it’s all happening on a grand scale, and at a phenomenal rate.

According to Gartner Research, in this year alone, 5.5 million new ‘things’ are expected to become connected every day. From consumer devices like smartphones and fitness devices, to interactive flat panel displays and information kiosks, IoT is seeing huge adoption rates and staggering investment. Just over a year ago, an IDC FutureScape report predicted that local government bodies would represent up to a quarter of all government spending, specifically because of investment into the research and implementation of connected technologies.

Simple Ideas are Changing How Cities are Run

Looking at just a few of the innovative technologies from the last five years, it is possible to start developing a picture of what smart cities will look like within the next decade. Bitlock is an innovative technology that uses proximity keys to automatically activate or deactivate bike locks. At the same time, the system uses an owner’s smartphone to record the GPS location of the lock and bike. Such a system could be utilized on a large scale, such as in a bike sharing program in heavily congested cities. Private and government organizations could track bikes for better management, and they could even use the uploaded data to provide real time updates for bike availability, while also recording patterns of utilization.

Streetline is another smart city technology that shows great promise. Using networked parking sensors, Streetline can record parking availability in real time, and report to city officials and publicly available smartphone apps, simultaneously. The technology is in widespread use around Los Angeles, and as of May this year, over 490 million individual parking events had been recorded and reported using Streetline sensors. Studies have shown that smart parking systems can reduce peak parking congestion by up to 22%, and can reduce total traffic volume by 8%. With other technologies like IBM’s Intelligent Transportation Solutions, local governments could utilize devices to gather real time aggregated data which can be used to measure traffic volume, speed, and other metrics, which could be used to design better policy and city planning.

Opportunities for IoT Skilled Professionals

Innovative technologies like these are just the beginning of what is possible in a smart city. Emerging technologies have the potential to make major cities more functional and convenient for residents and visitors, and more manageable for government bodies. Even so, there are still challenges to overcome. Infrastructure is a major challenge, and cities will need to plan and implement high speed networks, as well as the servers that are necessary to support their sensors and other systems. Storage and processing needs will increase as IoT becomes more widespread, and security will need to become a major area of focus. Security is not just necessary to safeguard systems, but also to protect end user privacy and data.

It’s clear that smart technologies and IoT are the future of the world’s major cities. Which in turn means that experienced developers, operations professionals, engineers, and IT security specialists will be in high demand, with growing opportunities in the immediate future, and in the coming years.

For more information please check out our new website www.internetofthingsrecruiting.com

Read more…

Guest blog post by Ajit Jaokar

Introduction

At the Data Science for Internet of Things course, we have been working with the Predix APIs. We emphasise the Industrial IoT, and Predix from GE is one of the best examples of a mature IIoT platform. Predix is currently available on a free trial, and I asked GE about the process after the free trial ends: after the six-month trial, developers have to pay the price listed under each service (for example, HERE). I have used the documentation from the Predix site for this article.

What is Predix Platform?

The Predix platform is a cloud-based Platform-as-a-Service (PaaS) for the Industrial Internet. Predix provides a complete environment for creating solutions that run industrial-scale analytics for Industrial IoT. The Predix platform uses a Cloud Foundry-based microservice architecture, which enables it to deliver functionality as a set of very small, granular, independent, collaborating services.

 

(Figure: the high-level architecture of Predix.)

 

 

The Predix platform provides the following industrial microservices:

 

Asset Service: to support asset modeling. Application developers use the Asset service to create, update, and store asset model data that defines asset properties as well as relationships between assets and other modeling elements. The Asset service consists of an API layer, a query engine, and a graph database. For example, an application developer can create an asset model describing the logical component structure of all pumps in an organization, then create instances of the model to represent each pump in an organization.
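As a rough illustration of that pump example (the field names here are hypothetical, not the actual Asset service schema), a model defines the structure and each instance points back to it:

```python
# Hypothetical sketch of an asset model and two instances; the real Asset
# service stores this kind of data in a graph database behind a REST API.
pump_model = {
    "uri": "/asset-models/pump",
    "properties": ["flow_rate", "pressure", "rpm"],
    "parts": ["impeller", "motor", "seal"],
}

pumps = [
    {"uri": "/assets/pump-001", "model": pump_model["uri"], "site": "plant-a"},
    {"uri": "/assets/pump-002", "model": pump_model["uri"], "site": "plant-b"},
]

# Every instance references the model that defines its logical structure.
print(all(p["model"] == "/asset-models/pump" for p in pumps))  # -> True
```

Adding a pump to a new site then means creating one more instance record, not redefining the model.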

 

Analytics Services:  can be used to manage the execution of analytics through configuration, abstraction, and extensible modules.

 

Data Services: include the Time Series Service and the BLOB Store Service (Binary Large Object Storage), as well as a SQL database (the PostgreSQL object-relational database management system), a Message Queue, and a Key-Value Store.

 

Security Services: consists of User Account and Authentication, Access Control and Tenant Management

 

Machine Services: a device-independent software stack that allows you to develop solutions to connect machines to the Industrial Internet. 

 

Predix Mobile: used for mobility applications

 

App/UI Services: include support for standards like HTML5, Cascading Style Sheets (CSS), and JavaScript, as well as several JavaScript MVC frameworks (such as AngularJS, Backbone.js, and Ember.js)

 

Event Hub: Predix also includes the Event Hub beta pub/sub service, built to be secure, massively scalable, fault tolerant, and language-agnostic. It supports textual (JSON over WebSocket) and binary (protobuf over gRPC) formats for publishing, and binary (protobuf over gRPC) for subscribing. Devices communicate with the Event Hub service by publishing messages to topics. Event Hub provides the ability to ingest streaming data from anywhere into the cloud for processing.
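The underlying topic-based pattern can be sketched in a few lines of Python; this illustrates the general pub/sub idea only, not the Event Hub API itself.

```python
from collections import defaultdict

class TopicBus:
    """Minimal in-memory sketch of topic-based publish/subscribe."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber on this topic receives the message; others do not.
        for callback in self.subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("turbine/temperature", received.append)
bus.publish("turbine/temperature", {"celsius": 71.5})
bus.publish("turbine/vibration", {"mm_s": 2.1})   # no subscriber on this topic
print(received)  # -> [{'celsius': 71.5}]
```

A real service adds durability, security, and a wire protocol on top, but the topic routing is the same idea.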

 

We are more interested in the Analytics service, given our emphasis on Data Science for IoT. An ‘Analytic’ is roughly equivalent to a model in traditional data science parlance. The Analytics Catalog service provides a software catalog for sharing reusable analytics across development teams, and supports analytics written in Java, MATLAB, and Python. The Analytics Runtime service runs analytics from the Analytics Catalog service.

 

The developer workflow for analytics is as follows:

Stage 1: Develop the Analytic

Stage 2: Publish the Analytic to the Analytics Catalog

Stage 3: Validate the Analytic

(3a) Test the analytic with a sample data set and verify the results using the Analytics Catalog service.

(3b) If necessary, modify the analytic and upload new artifact to the catalog using the Analytics Catalog service.

Stage 4: Release the Analytic

Stage 5: Execute the Analytic
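To make Stage 1 concrete: an analytic is essentially a function that takes input data and returns results, typically exchanged as JSON. A deliberately simple sketch follows; the I/O contract shown is illustrative, not the actual Predix analytic template.

```python
import json

def run_analytic(request_json):
    """A toy 'analytic': flag time-series points that exceed a threshold.

    Input and output are JSON strings, mirroring how a catalog analytic
    exchanges data with the runtime (the exact contract is illustrative).
    """
    request = json.loads(request_json)
    threshold = request["threshold"]
    flagged = [p for p in request["points"] if p["value"] > threshold]
    return json.dumps({"flagged": flagged})

request = json.dumps({
    "threshold": 100.0,
    "points": [
        {"ts": 1, "value": 98.6},
        {"ts": 2, "value": 104.2},
    ],
})
print(run_analytic(request))  # -> {"flagged": [{"ts": 2, "value": 104.2}]}
```

Stage 3a amounts to running such a function against a sample data set and checking the returned JSON.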

 

Conclusion

Predix is interesting for our work in the Data Science for Internet of Things course because of its emphasis on the Industrial Internet. In subsequent articles, I will cover Predix Edge Processing capabilities and the Predix APIs. I am interested in finding an equivalent for Industrie 4.0. If you know of such a service for Industrie 4.0, or you wish to join the course, please email me at info at futuretext.com


Predix is also covered in our forthcoming book on Data Science for Internet of Things by Ajit Jaokar and Jean Jacques Bernard

Follow us @IoTCtrl | Join our Community

Read more…

IoT Central Digest, September 16, 2016

In this issue we interview Autodesk's Bryan Kester. Hey wait a minute, you say, Autodesk is CAD, not IoT. Well read our interview to learn how Autodesk is more IoT than you think. Also, Bill McCabe looks at the skills for IoT and we revisit one of our most popular posts: Internet of Things Landscape 2016 - In One Diagram.  If you're interested in being featured, we always welcome your contributions on all things IoT Infrastructure, IoT Application Development, IoT Data and IoT Security, and more. All members can post on IoT Central. Consider contributing today. Our guidelines are here.

Autodesk's Bryan Kester - Skills for the IoT pro, disagreement with Gartner, and what's next for IoT

By David Oro 

In our latest installment of interviews with IoT practitioners, we interview Bryan Kester, Director of IoT, Autodesk, Inc. Bryan leads the Internet of Things (IoT) Product Group at Autodesk. We asked him questions about Gartner's prediction of IoT maturation, his take on the IoT platform wars, the skills sets needed in this rapidly emerging and changing field, and what's next for IoT. Bryan predicts, "There will be some continued hype and then a subtle, but significant shakeout among both pure play and "me too" vendors. Those that help simplify the systems integration nature of IoT will have a future."

Internet of Things Landscape 2016 - In One Diagram

By David Oro

Matt Turck, a venture capitalist at FirstMark, has mapped out the Internet of Things Landscape for 2016.

Matt notes "The IoT today is largely at this inflection point where “the future is already here but it is not evenly distributed”. From ingestibles, wearables, AR/VR headsets to connected homes and factories, drones, autonomous cars and smart cities, a whole new world (and computing paradigm) is emerging in front of us. But as of right now, it just feels a little patchy, and it doesn’t always look good, or work great – yet."

Deep Learning Applications for Smart cities

By Ajit Jaokar

This blog is based on my talk in London at the Re.work Connected City Summit on Deep Learning Applications for Smart cities. The talk is based on a forthcoming paper created with the help of my students at UPM/citysciences on the same theme.

Here are some notes on our approach:

  • When we speak of machines, the media tends to dramatize the issue. Yet city officials and planners plan for ten to twenty years in the future. They will have to consider many of these issues in a pragmatic way.
  • Deep Learning / Artificial Intelligence will impact many aspects of Smart cities. We decided to approach the subject in a pragmatic manner and to explore the impact of Deep Learning/AI technology on the lives of future citizens.

How could self-learning machines affect humanity in cities?

The Great IOT Recruiting Rush

Posted by Bill McCabe 

With many IT professionals with business experience in hot industries like healthcare, telecom and wearables looking to make the switch from systems software and other terrestrial IT-based positions to M2M or IoT strategy, leadership and sales, what are the skills you need to work in IoT?

Additional Links

Follow us on Twitter | Join our LinkedIn group | Members Only | For Bloggers | Subscribe

Read more…

The power of big data, analytics and machine learning has created unique opportunities in the e-commerce industry. Thanks to data-driven enhancements to ads, upselling and cross-selling, online shoppers are able to get “what they want, when they want it.”

This transformation has had a direct and positive impact on business efficiency, driving more sales and improving customer satisfaction. But it has also had the adverse effect of widening the gap between online and brick-and-mortar businesses, and has confronted the retail industry with higher shopper expectations and unprecedented challenges.

However, the advent and development of the Internet of Things (IoT) and the widespread use of mobile devices and mobile apps can help overcome these challenges. Thanks to microprocessors and ubiquitous internet connectivity, smart devices can be deployed everywhere and on everything, from point of sales systems to dressing rooms.

This enables retailers to gather and analyze data like never before, and to interact with each shopper in a unique and personalized way. Here’s how every aspect of a retail business can benefit from IoT technology and mobile apps, effectively improving sales, cutting costs and drawing customers back to the store.

Supply chain and inventory management

Inventory management problems account for some of the biggest expenditures and losses in retail stores. According to a report by McKinsey, inventory distortion, including overstock, stockouts, and shrinkage, costs retailers $1.1 trillion worldwide every year. In the U.S., shrinkage alone hits retailers with $42 billion in losses annually, or 1.5 percent of total retail sales.

Thanks to IoT, retailers will be able to not only improve inventory control within the store but also expand it to the supply chain. Tracking of goods no longer starts at the store’s receiving dock – it begins at the point of manufacturing.

Better handling of the supply chain

With RFID tags placed on goods and environmental sensors in transportation vehicles, retailers will be able to trace the goods they purchase, and their treatment and condition, throughout the supply chain. Information gathered from devices will be analyzed in the cloud, and rule-based notifications and alerts can be sent to desktop and mobile apps to inform staff of events that must be acted upon.
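The rule-based alert flow described above can be sketched in a few lines of Python. Everything here (the metric names, thresholds, and message format) is a hypothetical illustration, not the API of any particular IoT platform:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReading:
    shipment_id: str
    metric: str       # e.g. "temperature_c", "humidity_pct"
    value: float

# Hypothetical rule table: metric -> acceptable (min, max) range
RULES = {
    "temperature_c": (2.0, 8.0),    # assumed cold-chain range for perishables
    "humidity_pct": (30.0, 60.0),
}

def check_reading(reading: SensorReading) -> Optional[str]:
    """Return an alert message if the reading violates its rule, else None."""
    bounds = RULES.get(reading.metric)
    if bounds is None:
        return None                  # no rule defined for this metric
    low, high = bounds
    if not (low <= reading.value <= high):
        return (f"ALERT shipment {reading.shipment_id}: "
                f"{reading.metric}={reading.value} outside [{low}, {high}]")
    return None
```

In practice the rule table would live in the cloud analytics layer, with violations pushed to staff through a notification service.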

This enhanced control will enable suppliers to reduce product damage on the journey to retail outlets. It will prove especially useful for shipping perishable and temperature-sensitive inventory.

Retailers can also leverage IoT technologies such as RFID to track products through the extended supply chain, i.e. after the product has been sold. Having data and improved visibility will streamline otherwise-difficult tasks such as critical product recalls.

Improving in-store inventory tracking

One of the perennial problems retailers face is the lack of accurate inventory tracking. Store shelves aren’t replenished on time; items are misplaced on shelves; sales associates can’t locate items customers are looking for; order management is abysmal, leading to excessive purchase orders to avoid stock-outs. The results are higher inventory costs, lost worker productivity, mishandled stocking, potentially empty shelves and missed sales opportunities.

IoT technology can tackle these problems by bringing more visibility into the location of inventory items and offering more control. By deploying an inventory management system based on RFID chips, sensors and beacons, physical assets can be directly synced with database servers. Additional technologies such as store-shelf sensors, digital price tags, smart displays and high-resolution cameras combined with image analysis can further enhance retailers’ control over goods on store shelves and in back storage.
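As a toy sketch of how RFID location reports might keep a stock database in sync, here is one possible shape; the class, field names, and thresholds are assumptions for illustration only:

```python
class InventoryTracker:
    """Minimal in-store inventory model driven by RFID location reports."""

    def __init__(self):
        self.location_of = {}   # tag_id -> "shelf_A", "backroom", ...
        self.sku_of = {}        # tag_id -> SKU code

    def register(self, tag_id, sku, location):
        """Record a newly tagged item and where it was first seen."""
        self.sku_of[tag_id] = sku
        self.location_of[tag_id] = location

    def move(self, tag_id, location):
        """Update an item's location when a reader sees its tag elsewhere."""
        self.location_of[tag_id] = location

    def on_shelf_count(self, sku, shelf):
        """How many units of a SKU are currently on a given shelf."""
        return sum(1 for tag, loc in self.location_of.items()
                   if loc == shelf and self.sku_of[tag] == sku)

    def restock_needed(self, sku, shelf, threshold):
        """Flag a shelf for replenishment when stock dips below threshold."""
        return self.on_shelf_count(sku, shelf) < threshold
```

A real system would feed `move()` from reader events and drive reorder suggestions from POS analytics, as the text notes.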

Subsequently, retailers can better ensure inventory is adequately stocked, and when stock levels run low, reorder quantities can be suggested based on analytics of POS data. According to the McKinsey report, reducing stock-outs and overstocks can lower inventory costs by as much as 10 percent.

The use of IoT can also reduce missed sales opportunities attributed to poorly stocked shelves. When customers are unable to find what they’re looking for, they’ll take their business elsewhere. This can happen even when the desired item is actually available in the backroom or displaced to another shelf. Sales associates can quickly locate items by their RFID tags using their mobile devices and beacons installed across the store. They can also receive timely alerts for misplaced items and emptied shelves in order to minimize such mishaps. Improved on-shelf availability can improve sales by as much as 11 percent, the McKinsey report states.

Reducing shrinkage and fraud

Shrinkage and fraud are ever-present challenges in retail stores, whether from customers or employees. IoT can help curb theft by adding a layer of visibility and traceability to inventory items. RFID tags, smart shelves and camera feeds combined with sophisticated machine learning can paint a clearer picture of what takes place in-store, detect suspicious movement and determine whether items have been obtained legitimately.

Also, knowing that items are being tracked will discourage patrons and employees from pilfering goods. This is a huge improvement over traditional systems, which rely on human monitoring, point-of-sale data and receipts to validate the sale of goods.

Customer experience

One of the benefits of online shopping is being able to push products and offers to customers instead of waiting for them to find those products on their own. This helps catch customers’ attention at the right moment and can improve sales dramatically.

IoT will help bring the brick-and-mortar experience up to this level by helping retailers gather data, perform analysis and make the best decisions for their stores.

Optimizing product placement

How customers navigate store aisles is valuable information. Retailers always try to lay out their stores to maximize exposure and improve sales. In the pre-IoT days, this was done through human observation, educated guesses, random experimentation and manual sales correlation.

But now, thanks to data gathered from RFID chips, IoT motion detection sensors, beacons and video analytics, retailers can gather precise data on customer movement patterns and identify premium traffic areas. IoT makes it possible to learn how customers interact with specific items and discover which items are abandoned. Changes to store layouts can be automatically correlated with changes in customer behavior and sales figures in order to perform precise A/B testing on tweaks and modifications.
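At its core, the layout A/B testing described here is a comparison of sales under a control layout versus a variant. A minimal sketch (a production version would also test statistical significance before acting on the lift):

```python
def ab_lift(control_sales, variant_sales):
    """Percent lift of a layout variant over the control layout.

    Each argument is a list of per-period sales counts (e.g. daily units
    sold) observed while that layout was in place.
    """
    def mean(xs):
        return sum(xs) / len(xs)
    return (mean(variant_sales) - mean(control_sales)) / mean(control_sales) * 100
```

For example, a variant averaging 110 daily units against a control averaging 100 shows a 10 percent lift.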

Optimized use of in-store staff

Being able to identify customers who need help, and tending to their needs in time, is an important factor in closing sales and improving conversion rates. But in-store staff can only watch so many customers at once, and in many cases the presence of a salesperson can be misinterpreted and come across as intrusive.

IoT helps deal with this problem without disrupting the customer experience. Motion detection sensors, cameras and facial expression recognition algorithms can help identify customers who have been standing in one location too long and are showing confusion or ambivalence. The IoT ecosystem can then notify a nearby sales associate through a mobile or smartwatch app. This way, shoppers get a better experience because they aren’t kept waiting, and retailers make optimal use of their in-store staff.
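The dwell-time trigger behind this could be as simple as the following sketch; the two-minute threshold and the data shape are invented for illustration:

```python
DWELL_THRESHOLD_S = 120  # assumed: flag shoppers stationary for over 2 minutes

def shoppers_needing_help(sightings, now):
    """Return (shopper_id, zone) pairs whose dwell time exceeds the threshold.

    `sightings` maps an anonymous shopper id to (zone, first_seen_timestamp),
    as reported by motion sensors or camera analytics.
    """
    return [(shopper_id, zone)
            for shopper_id, (zone, first_seen) in sightings.items()
            if now - first_seen > DWELL_THRESHOLD_S]
```

Each returned pair would then be pushed to the nearest associate's mobile or smartwatch app.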

Personalized offers and promotions

Banner ads and product suggestions customized to browsing and purchasing history are among the features that give online shopping channels the edge over brick-and-mortar retail. Cross-selling and upselling have become an important source of revenue for online sellers.

IoT can help retailers collect data and make offers to customers that will put them on par with their online counterparts. RFID chips, sensors and beacons can gather data about customer interactions with store items. The data can be analyzed by machine learning solutions and used to push extra information, customer reviews, recommendations and special offers on smart displays that are installed in stores.

Mobile apps can take the experience to the next level. While customers interact with in-store items, the IoT ecosystem can merge the collected insights with their online product browsing history to provide useful information, loyalty offers and smarter upsell suggestions.

Mobile apps in retail

IoT devices and sensors help collect data and glean insights from virtually every physical object and event that takes place in retail stores. But it is with mobile apps that IoT becomes a hands-on experience, especially in retail, where most tasks are performed in the field rather than behind a desk.

With a fully featured mobile app (or a suite of apps for mobile devices and wearables), retailers can make sure that everyone in the retail chain, from salespersons to inventory managers to suppliers, has access to the data they need anytime, anywhere, in order to become more efficient at their jobs.

Mobile apps will also improve the customer experience, driving loyalty and letting customers engage in a more personalized way with retail stores and the smart gadgets installed in them.

Conclusion

With the actionable insights offered by IoT-powered solutions, retailers will be able to offer customers what they actually want through a digital, connected and personalized experience. The gamut of data-driven, cloud-powered technology available to the retail sector can help merge the benefits of the online and brick-and-mortar shopping experiences. Eventually IoT will become the de facto standard and reinvent retail as we know it today.

Read about how Mokriya develops solutions for IoT problems

Read more…

5G Internet of Things

By Anand R. Prasad. This article originally appeared here.

This is the first part of an article based on several talks I have given on 5G security over the past year. In this part I present my views on 5G. I have deliberately avoided discussing the various 5G activities around the globe (3GPP, ITU, etc.).

5G is expected to bring several changes to mobile communications systems, but note that these changes are not a couple of years down the road; they are already happening. The first flavor of 5G is expected to be available in 2018, with the complete solution on the market by 2020. With digitization on its way to touching every part of life, the Internet of Things (IoT) will be an integral part of 5G from the very beginning, unlike 4G, where IoT came later.

In the following we will go through some of the key changes 5G is expected to bring. In the next article I will discuss security aspects.

(Core) Network

As we move ahead, network function virtualization (NFV) and software-defined networking (SDN) for mobile core networks will mature, bringing down costs while meeting quality requirements. The core network will increasingly be built from off-the-shelf hardware and open source software, virtualization will become common, and cloud-based mobile networks will become available. The network is also expected to accommodate multiple radio technologies.

The changes mentioned above (virtualization and cloud) will allow networks to be launched for a specific service, a concept now termed network slicing, or just slicing. These are what I call vertical networks that fulfill specific requirements, in contrast to today's horizontal networks that cater to all services. Virtualization and cloud also mean that the network will become more open and accessible, pushing the network border deeper into the network: instead of a network element being the end-point, a software module in a server farm will be the end-point, and this end-point can migrate to different locations.

Radio Access Technology

Radio access technology will see several improvements, with data rates ranging from a few bits up to several gigabits and delays going down to micro- if not nanoseconds (compare this with the millisecond range in today's systems). The radio access network will also become partially virtualized and cloud-based.

Spectrum

The spectrum used for 5G will be different; there have been discussions of higher GHz bands, so the radio characteristics will be very different as well. The spectrum will have implications for coverage and for the behavior of the radio access technology.

With the arrival of 5G, we should also expect wider usage of cognitive radio in mobile networks, aggregation with unlicensed bands, and the use of unlicensed-band technologies.

Security Credentials

Given the variety of scenarios and technologies expected to come into use with 5G, it is worth asking whether security credentials should stay the same as today and whether the technology for storing them will change. A change in security credentials could have implications for authentication and other security mechanisms.

Storage of security credentials can be considered from both the network and the device side. On the network side, storage is a question of location: whether the credentials are stored in the mobile network domain or in a partner domain. The implication is a change in the authentication end-point, and in the transfer of session-related security credentials to the appropriate network functions once authentication completes.

With regard to credential storage in devices, one can consider three forms: (1) a secure hardware environment as we have today in the form of the UICC, commonly known as the SIM card; (2) an embedded secure hardware environment, e.g. a UICC or similar implemented inside a modem, which brings us to something like the embedded SIM; and (3) some form of software.

End Devices

End devices will see a huge transformation alongside the technology enhancements we are seeing around us. Already with the arrival of 4G we have seen increased usage of smartphones and over-the-top (OTT) services. As we move toward 5G we will see an increasing number of smart “devices” as well as a whole variety of IoT devices associated with a plethora of services; wearables will be in common use, and virtual and augmented reality (VR and AR) devices will be commonly available. Open source devices have been available for a while now; we should expect increased usage of such devices as we move toward 5G.

With 5G we should expect mobile devices of all types (smart devices, IoT, VR, AR) to be reachable over Internet Protocol (IP) addresses, i.e. directly connected to the Internet. On the other end of the spectrum there will be devices requiring long battery life (say 10 years) that are expected to work at very low data rates.

Services

Services for VR, AR, IoT, smart devices and many more will appear, as 5G will provide a platform that can fulfill a variety of requirements. These services will be provisioned by the mobile operator or by a third party, with or without a business relationship with the mobile operator.

Over-the-top (OTT) services are already here, and they can open the door to cyber-attacks through malware, phishing, etc. Sponsored data should be a source of revenue for mobile operators, but misuse can leave the operator with a financial loss.

Business

We are already seeing changes in mobile operators’ business models. One such change takes the form of APIs being made available for third parties to launch services over the mobile network.

With 5G in the picture we will also see operators entering partnerships with other companies to provision services. This would mean that the partners own the subscribers while the operator remains responsible for correct usage of the licensed spectrum.

User Space

5G will penetrate society much more deeply than any technology to date. It will be used both by savvy users like millennials and by Information and Communication Technology (ICT) novices who will leapfrog directly to the new technology. Thus the technology will reach the deepest parts of life; not just human beings but animals (e.g. for vital information) and plants (e.g. for watering) will also get connected through IoT.

Photo Credit sayasatria

Read more…

IoT Central Digest, August 15, 2016

Articles on wireless standards, finance, and medical devices are just some of the stories highlighted in this issue of the IoT Central Digest. If you're interested in being featured, we always welcome your contributions on all things IoT Infrastructure, IoT Application Development, IoT Data and IoT Security, and more. All members can post on IoT Central. Consider contributing today. Our guidelines are here.

How IoT Will Transform The Automotive Industry

Posted by Luke Ryan

Here’s a glimpse of how IoT connectivity, smart sensors and gadgets, edge computing, mobile apps and cloud services will revolutionize how you interact with and use your car.

Behold the great possibilities of the Internet of Medical Things (IoMT)

By Rick Blaisdell

Unlike other industries, healthcare has been relatively conservative and slow in embracing innovations like cloud computing and the IoT, but that is starting to change, especially over the past few years. Innovative tech products and services are increasingly part of our daily lives, making it harder for healthcare providers to ignore the potential advantages of connected medical devices.

Realizing the Elusive Value promised by the Internet of Things – An Economic Perspective

By Anirban Kundu

Much has been said about the value at stake and the new growth opportunities presented by the Internet of Things. A Cisco estimate puts this at a $14.4 trillion opportunity, whereas a new McKinsey survey values it at around $6.2 trillion by 2025. One undisputed point across analyst reports is the significant addition to global GDP and trade volumes, and the new opportunities that would be created across sectors and industries. Most reports claim in unison the benefits of the Internet of Things and the far-reaching consequences it would have, from the cities we live in and the buildings we work and live in to the vehicles we drive. Every aspect of our experience with the physical world would be re-imagined: the way we work, our shopping experience, our medical services, even the purchase of insurance and banking services.

Global IoT Market Grows Again Says Machina Research

Posted by David Oro

UK-based Machina Research is adding to the mix of IoT predictions with a new Global IoT Market research report. Their headline today: the global Internet of Things market will grow to 27 billion devices, generating USD 3 trillion in revenue in 2025.

Does IoT Need Wireless?

By Wade Sarver

Hell yeah! Don’t get me wrong, you could use CAT 5 to connect most of this stuff, but the idea is to have the equipment everywhere and talking all the time, or at least when we need it to. These devices need to be wirelessly controlled to work properly and be autonomous. What fun would a drone be if you needed a copper line connected to it? The FCC laid out their plan to sunset copper lines. I did a lot of work on them, but I won’t miss them because wireless is so cool! If you like copper so much, then put that smartphone down and use a landline, if you can find one.

Thoughts on IoT and Finance

By Javier Saade

IoT, smart devices, wearables, mobile technology and nanotech - yes, nanotech - are forcing financial services incumbents and challengers to rethink every aspect of their value chains.  Those value chains are getting to be exponentially more distributed and automated.   Increased digitization means more data being generated, from all kinds of places at an accelerating rate.   IoT, regardless of your perspective, promises to enable the development of new value-added services to improve and automate user engagement, customer acquisition and service delivery - everywhere at all times.  

Data Analysis for Running the Business the Intelligent Way

Posted by Marcus Jensen 

Just a decade ago, we could not even have comprehended the amount of information we are exposed to on a daily basis. Everything from planners to weather information is nowadays absorbed through technology. The amount of data circulating through our daily lives can be overwhelming; used intelligently, however, it can be a world of help in running a business.

Additional Links

Follow us on Twitter | Join our LinkedIn group | Members Only | For Bloggers | Subscribe

Read more…

How IoT can benefit from fog computing

By Ben Dickson. This article originally appeared here.

What I’m mentioning a lot these days (and hearing about as well) is the chaotic propagation and growth of the Internet of Things. With billions of devices slated to connect to the internet every year, we’re going to be facing some serious challenges. I’ve already discussed how blockchain technology might address connectivity issues for huge IoT ecosystems.

But connectivity accounts for a small part of the problems we’ll be facing. Another challenge will be processing and making sense of the huge reams of data that IoT devices are generating. Close on its heels will be the issue of latency or how fast an IoT system can react to events. And as always, security and privacy issues will remain one of the top items in the IoT challenge list.

Fog computing (aka edge computing) can help mitigate – if not overcome – these challenges. As opposed to the cloud, where all the computation takes place in a central location, fog computing pushes the computation of tasks toward the edge of the network and distributes it among smart routers and gateways. The term and concept were coined by networking giant Cisco even before IoT became a buzzword, but it was the advent of the Internet of Things that provided it with true, legitimate use cases.

Here are some of the domains where fog computing can deal with the challenges of IoT.

Computation and data processing

Naturally, computation problems will be one of the main reasons we’ll descend from the cloud and wade into the fog. A problem lying ahead of us is the sheer amount of computation and data processing that IoT ecosystems will require.

With machine-to-machine (M2M) communications accounting for most exchanges in IoT ecosystems, the amount of traffic generated will be incomparable to what we’re used to dealing with in human-machine settings. Pushing all of these tasks to the cloud will overburden centralized computation nodes and require bigger and stronger cloud servers.

The cloud is best known for its huge storage and analytics capacities. Meanwhile, many of the tasks and events that take place in IoT ecosystems do not require such capabilities and sending them to the cloud will be a waste of precious resources and will only bog down servers and prevent them from performing their more critical duties.

Fog computing can address this issue. Small computational tasks can be performed at the edge (IoT gateways and routers), while valuable data continues to be pushed to the cloud. This way, precious cloud resources can be saved for more suitable tasks such as big data analysis and pattern recognition. Reciprocally, the functionality and policies of edge devices can be altered and updated based on insights gained from cloud analytics.
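The split this paragraph describes amounts to a routing decision per task. A minimal dispatcher sketch, with an invented latency budget and cost metric (real systems would use richer scheduling policies):

```python
EDGE_LATENCY_BUDGET_MS = 50   # assumed cutoff for "time-critical" work
EDGE_CPU_LIMIT = 10           # assumed max task cost an edge gateway handles

def route_task(task):
    """Send lightweight or time-critical tasks to the edge, the rest to the cloud.

    `task` is a dict with a 'deadline_ms' and a 'cpu_cost' (arbitrary units).
    """
    if task["deadline_ms"] <= EDGE_LATENCY_BUDGET_MS or task["cpu_cost"] < EDGE_CPU_LIMIT:
        return "edge"
    return "cloud"
```

So a tight-deadline actuator command stays on the gateway, while an overnight pattern-recognition batch goes to the cloud.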

This model will also help address response time and latency issues, which are discussed next.

Response times and latency

Rather than requiring huge computational resources, many of the transactions and decisions being made in IoT systems are time-critical. Imagine a telemedicine scenario, or an IoT-powered hospital, where seconds and milliseconds can make a difference for patients’ health or life. The same can be said in industrial settings and work areas, where quick response can prevent or mitigate damage and safety issues. A simpler example would be parking lights that would have to respond to passage of cars and pedestrians, but must do so in a timely fashion.

Other settings that require large bandwidth, such as IoT ecosystems involving many CCTV cameras, would also be hard to deploy in environments that have limited connectivity if they rely on cloud computation.

In many cases, it’s funny (and outright ridiculous) that two devices that stand a few feet apart have to go through the internet and the cloud to exchange simple messages. It’s even more ridiculous having to cope with the fact that your fridge and toaster don’t work because they’re disconnected from the internet.

A roundtrip to the cloud can sometimes take seconds – or even minutes, in poorly connected areas – which is more than many of these scenarios can afford. Meanwhile, at the edge, IoT ecosystems can make decisions at lightning speed, making sure that everything gets responded to in time.

A study by IDC FutureScape shows that by 2018, some 40 percent of IoT-created data will be stored, analyzed and processed at the edge.

Security and privacy

As Phantom CEO Ken Tola mentioned in a previous post, encryption isn’t a panacea for IoT security problems. And as a study by LGS Innovations showed earlier, hackers don’t necessarily need to crack your encrypted communications to carry out their evil deeds. In fact, merely eavesdropping on your IoT internet traffic – encrypted or not – will provide malicious actors with plenty of useful information, such as giving away your living habits.

Moreover, some forms of attacks, such as replay attacks, don’t require the attacker to have access to encryption keys. All they need to do is to replicate packets that are being exchanged on the network. For instance, with a good bit of network monitoring, an attacker might figure out which sequence of packets unlocks your home’s smart-lock.
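A standard mitigation for replay attacks is a strictly increasing per-message counter that is covered by the message's authentication tag, so a copied packet is rejected as stale. A minimal sketch of the receiver side:

```python
class ReplayGuard:
    """Reject packets whose per-device counter does not strictly increase."""

    def __init__(self):
        self.last_counter = {}   # device_id -> highest counter accepted so far

    def accept(self, device_id, counter):
        """Return True for a fresh packet, False for a replayed/stale one."""
        if counter <= self.last_counter.get(device_id, -1):
            return False
        self.last_counter[device_id] = counter
        return True
```

On its own this only helps if the counter is bound into the authenticated payload; otherwise an attacker could simply rewrite it before replaying the packet.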

Of course, there are ways to mitigate each of these threats, but robust security practices aren’t the greatest strength of IoT device manufacturers, which is why we’re seeing all these spooky IoT hacks surface every week.

Fog computing will reduce many of these risks by considerably decreasing dependency on internet connections. Moving data and command exchanges into the local area network will make it much harder for hackers to gain remote access to your data and devices. Moreover, with device-cloud exchanges no longer happening in real time, it will be much harder to discern life and usage patterns by eavesdropping on your network.

Overcoming the challenges

Despite all these advantages, fog computing has its own set of caveats and difficulties. For one thing, edge devices can’t match the power of the cloud in computing and analytics. This issue can be addressed by distributing the workload between the cloud and the fog. Edge devices such as smart routers and gateways can mimic cloud capabilities at the edge, making optimal use of their resources to respond to time-critical and lightweight tasks, while heavier, analytics-intensive requests that don’t necessarily need to be carried out in real time are sent to the cloud.

Meanwhile, edge software should be designed and developed with flexibility in mind. For instance, IoT gateway software that controls industrial equipment should be able to receive policy and function updates produced by machine learning solutions analyzing big data in the cloud.

Read more…

Originally Posted and Written by: Michelle Canaan, John Lucker, & Bram Spector

Connectivity is changing the way people engage with their cars, homes, and bodies—and insurers are looking to keep pace. Even at an early stage, IoT technology may reshape the way insurance companies assess, price, and limit risks, with a wide range of potential implications for the industry.

Insurers’ path to growth: Embrace the future

In 1997, Progressive Insurance pioneered the use of the Internet to purchase auto insurance online, in real time.1 In a conservative industry, Progressive’s innovative approach broke several long-established trade-offs, shaking up traditional distribution channels and empowering consumers with price transparency.

This experiment in distribution ended up transforming the industry as a whole. Online sales quickly forced insurers to evolve their customer segmentation capabilities and, eventually, to refine pricing. These modifications propelled growth by allowing insurers to serve previously uninsurable market segments. And as segmentation became table stakes for carriers, a new cottage industry of tools, such as online rate comparison capabilities, emerged to capture customer attention. Insurers fought to maintain their competitive edge through innovation, but widespread transparency in product pricing over time created greater price competition and ultimately led to product commoditization. The tools and techniques that put the insurer in the driver’s seat slowly tipped the balance of power to the customer.

This case study of insurance innovation and its unintended consequences may be a precursor to the next generation of digital connectivity in the industry. Today, the availability of unlimited new sources of data that can be exploited in real time is radically altering how consumers and businesses interact. And the suite of technologies known as the Internet of Things (IoT) is accelerating the experimentation of Progressive and other financial services companies. With the IoT’s exponential growth, the ways in which citizens engage with their cars, homes, and bodies are getting smarter each day, and they expect the businesses they patronize to keep up with this evolution. Insurance, an industry generally recognized for its conservatism, is no exception.

IoT technology may still be in its infancy, but its potential to reshape the way insurers assess, price, and limit risks is already quite promising. Nevertheless, since innovation inevitably generates unintended possibilities and consequences, insurers will need to examine strategies from all angles in the earliest planning stages.

To better understand potential IoT applications in insurance, the Deloitte Center for Financial Services (DCFS), in conjunction with Wikistrat, performed a crowdsourcing simulation to explore the technology’s implications for the future of the financial services industry. Researchers probed participants (13 doctorate holders, 24 cyber and tech experts, 20 finance experts, and 6 entrepreneurs) from 20 countries and asked them to imagine how IoT technology might be applied in a financial services context. The results (figure 1) are not an exhaustive compilation of scenarios already in play or forthcoming but, rather, an illustration of several examples of how these analysts believe the IoT may reshape the industry.2

[Figure 1]

CONNECTIVITY AND OPPORTUNITY

Even this small sample of possible IoT applications shows how increased connectivity can generate tremendous new opportunities for insurers, beyond personalizing premium rates. Indeed, if harnessed effectively, IoT technology could potentially boost the industry’s traditionally low organic growth rates by creating new types of coverage opportunities. It offers carriers a chance to break free from the product commoditization trend that has left many personal and commercial lines to compete primarily on price rather than coverage differentiation or customer service.

For example, an insurer might use IoT technology to directly augment profitability by transforming the income statement’s loss component. IoT-based data, carefully gathered and analyzed, might help insurers evolve from a defensive posture—spreading risk among policyholders and compensating them for losses—to an offensive posture: helping policyholders prevent losses and insurers avoid claims in the first place. And by avoiding claims, insurers could not only reap the rewards of increased profitability, but also reduce premiums and aim to improve customer retention rates. Several examples, both speculative and real-life, include:

  • Sensors embedded in commercial infrastructure can monitor safety breaches such as smoke, mold, or toxic fumes, allowing for adjustments to the environment to head off or at least mitigate a potentially hazardous event.
  • Wearable sensors could monitor employee movements in high-risk areas and transmit data to employers in real time to warn the wearer of potential danger as well as decrease fraud related to workplace accidents.
  • Smart home sensors could detect moisture in a wall from pipe leakage and alert a homeowner to the issue prior to the pipe bursting. This might save the insurer from a large claim and the homeowner from both considerable inconvenience and losing irreplaceable valuables. The same can be said for placing IoT sensors in business properties and commercial machinery, mitigating property damage and injuries to workers and customers, as well as business interruption losses.
  • Socks and shoes that can alert diabetics early on to potential foot ulcers, odd joint angles, excessive pressure, and how well blood is pumping through capillaries are now entering the market, helping to avoid costly medical and disability claims as well as potentially life-altering amputations.3

Beyond minimizing losses, IoT applications could also potentially help insurers resolve the dilemma with which many have long wrestled: how to improve the customer experience, and therefore loyalty and retention, while still satisfying the unrelenting market demand for lower pricing. Until now, insurers have generally struggled to cultivate strong client relationships, both personal and commercial, given the infrequency of interactions throughout the insurance life cycle from policy sale to renewal—and the fact that most of those interactions entail unpleasant circumstances: either deductible payments or, worse, claims. This dynamic is even more pronounced in the independent agency model, in which the intermediary, not the carrier, usually dominates the relationship with the client.

The emerging technology intrinsic to the IoT, which can potentially monitor and measure each insured’s behavioral and property footprint across an array of activities, could turn out to be an insurer’s holy grail: IoT applications can offer tangible benefits for value-conscious consumers while allowing carriers to remain connected to their policyholders’ everyday lives. While people currently want as few interactions with their insurers as possible, the IoT can potentially make insurers a desirable point of contact. The IoT’s true staying power will be manifested in the technology’s ability to create value for both the insurer and the policyholder, thereby strengthening their bond. And even as the frequency of engagement shifts to the carrier, the independent agency channel will likely remain relevant through traditional client touchpoints.

By harnessing continuously streaming “quantified self” data, using advanced sensor connectivity devices, insurers could theoretically capture a vast variety of personal data and use it to analyze a policyholder’s movement, environment, location, health, and psychological and physical state. This could provide innovative opportunities for insurers to better understand, serve, and connect with policyholders—as well as insulate companies against client attrition to lower-priced competitors. Indeed, if an insurer can demonstrate how repurposing data collected for insurance considerations might help a carrier offer valuable ancillary non-insurance services, customers may be more likely to opt in to share further data, more closely binding insurer and customer.

Leveraging IoT technologies may also have the peripheral advantage of resuscitating the industry’s brand, making insurance more enticing to the relatively small pool of skilled professionals needed to put these strategies in play. And such a shift would be welcome, considering that Deloitte’s Talent in Insurance Survey revealed that the tech-savvy Millennial generation generally considers a career in the insurance industry “boring.”4 Such a reputational challenge clearly creates a daunting obstacle for insurance executives and HR professionals, particularly given the dearth of employees with necessary skill sets to successfully enable and systematize IoT strategies, set against a backdrop of intense competition from many other industries. Implementing cutting-edge IoT strategies could boost the “hip factor” that the industry currently lacks.

With change come challenges

While most stakeholders might see attractive possibilities in the opportunity for behavior monitoring across the insurance ecosystem, inevitable hurdles stand in the way of wholesale adoption. How insurers surmount each potential barrier is central to successful evolution.

For instance, the industry’s historically conservative approach to innovation may impede the speed and flexibility required for carriers to implement enhanced consumer strategies based on IoT technology. Execution may require more nimble data management and data warehousing than currently in place, as engineers will need to design ways to quickly aggregate, analyze, and act upon disparate data streams. To achieve this speed, executives may need to spearhead adjustments to corporate culture grounded in more centralized control of data. Capabilities to discern which data are truly predictive, versus just noise in the system, are also critical. Therefore, along with standardized formats for IoT technology,5 insurers may see an increasing need for data scientists to mine, organize, and make sense of mountains of raw information.

Perhaps most importantly, insurers would need to overcome the privacy concerns that could hinder consumers’ willingness to make available the data on which the IoT runs. Further, increased volume, velocity, and variety of data propagate a heightened need for appropriate security oversight and controls.

For insurers, efforts to capitalize on IoT technology may also require patience and long-term investments. Indeed, while bolstering market share, such efforts could put a short-term squeeze on revenues and profitability. To convince wary customers to opt in to monitoring programs, insurers may need to offer discounted pricing, at least at the start, on top of investments to finance infrastructure and staff supporting the new strategic initiative. This has essentially been the entry strategy for auto carriers in the usage-based insurance market, with discounts provided to convince drivers to allow their performance behind the wheel to be monitored, whether by a device installed in their vehicles or an application on their mobile device.

Results from the Wikistrat crowdsourcing simulation reveal several other IoT-related challenges that respondents put forward. (See figure 2.)6

[Figure 2]

Each scenario implies some measure of material impact to the insurance industry. In fact, together they suggest that the same technology that could potentially help improve loss ratios and strengthen policyholder bonds over the long haul may also make some of the most traditionally lucrative insurance lines obsolete.

For example, if embedding sensors in cars and homes to prevent hazardous incidents increasingly becomes the norm, and these sensors are perfected to the point where accidents are drastically reduced, this development may minimize or eliminate the need for personal auto and home liability coverage, given the lower frequency and severity of losses that result from such monitoring. Insurers need to stay ahead of this, perhaps even eventually shifting books of business from personal to product liability as claims evolve from human error to product failure.

Examining the IoT through an insurance lens

Analyzing the intrinsic value of adopting an IoT strategy is fundamental in the development of a business plan, as executives must carefully consider each of the various dimensions to assess the potential value and imminent challenges associated with every stage of operationalization. Using Deloitte’s Information Value Loop can help capture the stages (create, communicate, aggregate, analyze, act) through which information passes in order to create value.7

The value loop framework is designed to evaluate the components of IoT implementation as well as potential bottlenecks in the process, by capturing the series and sequence of activities by which organizations create value from information (figure 3).

[Figure 3]

To complete the loop and create value, information passes through the value loop’s stages, each enabled by specific technologies. An act is monitored by a sensor that creates information. That information passes through a network so that it can be communicated, and standards—be they technical, legal, regulatory, or social—allow that information to be aggregated across time and space. Augmented intelligence is a generic term meant to capture all manner of analytical support, collectively used to analyze information. The loop is completed via augmented behavior technologies that either enable automated, autonomous action or shape human decisions in a manner leading to improved action.8
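The five stages above can be sketched as a simple pipeline. This is an illustrative toy model of the Information Value Loop, not Deloitte's implementation; the sensor reading, the 120 km/h threshold, and the alert action are invented for demonstration.

```python
# Toy model of the Information Value Loop's five stages.
# Stage names follow the text; all values and thresholds are invented.

def create(sensor_reading):
    # "Create": a sensor turns a physical act into information.
    return {"speed_kmh": sensor_reading}

def communicate(record):
    # "Communicate": the record travels over a network (a no-op here).
    return record

def aggregate(records):
    # "Aggregate": standards let records be combined across time and space.
    return {"avg_speed_kmh": sum(r["speed_kmh"] for r in records) / len(records)}

def analyze(summary):
    # "Analyze": augmented intelligence flags risky behavior.
    return summary["avg_speed_kmh"] > 120

def act(is_risky):
    # "Act": augmented behavior shapes the next decision.
    return "alert driver" if is_risky else "no action"

readings = [110, 125, 140]
records = [communicate(create(r)) for r in readings]
print(act(analyze(aggregate(records))))  # average 125 km/h -> "alert driver"
```

The point of the loop is that value is only created when information makes it all the way back to an improved action; a bottleneck at any stage (as discussed below for telematics) breaks the chain.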

For a look at the value loop through an insurance lens, we will examine an IoT capability already at play in the industry: automobile telematics. By walking through the stages of the framework, we can scrutinize how monitoring driving behavior is poised to eventually transform the auto insurance market with a vast infusion of value to both consumers and insurers.

Auto insurance and the value loop

Telematic sensors in the vehicle monitor an individual’s driving to create personalized data collection. The connected car, via in-vehicle telecommunication sensors, has been available in some form for over a decade.9 The key value for insurers is that sensors can closely monitor individual driving behavior, which directly corresponds to risk, for more accuracy in underwriting and pricing.

Originally, sensor manufacturers made devices available to install on vehicles; today, some carmakers are already integrating sensors into showroom models, available to drivers—and, potentially, their insurers—via smartphone apps. The sensors collect data (figure 4) which, if properly analyzed, might more accurately predict the unique level of risk associated with a specific individual’s driving and behavior. Once the data is created, an IoT-based system could quantify and transform it into “personalized” pricing.

[Figure 4]

Sensors’ increasing availability, affordability, and ease of use break what could potentially be a bottleneck at this stage of the Information Value Loop for other IoT capabilities in their early stages.

IoT technology aggregates and communicates information to the carrier to be evaluated. To identify potential correlations and create predictive models that produce reliable underwriting and pricing decisions, auto insurers need massive volumes of statistically and actuarially credible telematics data.

In the hierarchy of auto telematics monitoring, large insurers currently lead the pack when it comes to usage-based insurance market share, given the amount of data they have already accumulated or might potentially amass through their substantial client bases. In contrast, small and midsized insurers—with less comprehensive proprietary sources—will likely need more time to collect sufficient data on their own.

To break this bottleneck, smaller players could pool their telematics data with peers either independently or through a third-party vendor to create and share the broad insights necessary to allow a more level playing field throughout the industry.

Insurers analyze data and use it to encourage drivers to act by improving driver behavior/loss costs. By analyzing the collected data, insurers can now replace or augment proxy variables (age, car type, driving violations, education, gender, and credit score) correlated with the likelihood of having a loss with those factors directly contributing to the probability of loss for an individual driver (braking, acceleration, cornering, and average speed, as figure 4 shows). This is an inherently more equitable method to structure premiums: Rather than paying for something that might be true about a risk, a customer pays for what is true based on his own driving performance.
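The shift from proxy variables to direct behavioral factors can be sketched in a few lines. This is a hypothetical illustration, not an actuarial model: the factor weights, the base premium, and the discount/surcharge bounds are all invented for demonstration.

```python
# Hypothetical behavior-based pricing sketch. Weights, base premium, and
# the 20%-discount / 40%-surcharge bounds are invented for illustration.

BASE_PREMIUM = 1000.0  # assumed annual base rate

# Each factor is a normalized score in [0, 1]; higher means riskier driving.
WEIGHTS = {
    "hard_braking": 0.35,
    "rapid_acceleration": 0.25,
    "sharp_cornering": 0.15,
    "avg_speed_over_limit": 0.25,
}

def behavior_premium(driver_scores):
    """Scale the base premium by a weighted driving-behavior risk score."""
    risk = sum(WEIGHTS[f] * driver_scores[f] for f in WEIGHTS)
    # risk = 0 -> 20% discount; risk = 1 -> 40% surcharge (assumed bounds)
    return BASE_PREMIUM * (0.8 + 0.6 * risk)

cautious = {"hard_braking": 0.1, "rapid_acceleration": 0.1,
            "sharp_cornering": 0.0, "avg_speed_over_limit": 0.1}
print(round(behavior_premium(cautious), 2))
```

Unlike an age or credit-score proxy, every input here is something the driver can actually change, which is what makes the pricing feel equitable to the customer.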

But even armed with all the data necessary to improve underwriting for “personalized” pricing, insurers need a way to convince millions of reluctant customers to opt in. To date, insurers have used the incentive of potential premium discounts to engage consumers in auto telematics monitoring.10 However, this model is not necessarily attractive enough to convince the majority of drivers to relinquish a measure of privacy and agree to usage-based insurance. It is also unsustainable for insurers that will eventually have to charge rates actually based on risk assessment rather than marketing initiatives.

Substantiating the point about consumer adoption is a recent survey by the Deloitte Center for Financial Services of 2,193 respondents representing a wide variety of demographic groups, aiming to understand consumer interest in mobile technology in financial services delivery, including the use of auto telematics monitoring. The survey identified three distinct groups among respondents when asked whether they would agree to allow an insurer to track their driving experience, if it meant they would be eligible for premium discounts based on their performance (figure 5).11 While one-quarter of respondents were amenable to being monitored, just as many said they would require a substantial discount to make it worth their while, and nearly half would not consent.

[Figure 5]

While the Deloitte survey was prospective (asking how many respondents would be willing to have their driving monitored telematically), actual recruits have proven difficult to bring on board. Indeed, a 2015 LexisNexis study on the consumer market for telematics showed that usage-based insurance enrollment remained at only 5 percent of households from 2014 to 2015 (figure 6).12

[Figure 6]

Both of these survey results suggest that premium discounts alone have not and likely will not induce many consumers to opt in to telematics monitoring going forward, and would likely be an unsustainable model for insurers to pursue. The good news: Research suggests that, while protective of their personal information, most consumers are willing to trade access to that data for valuable services from a reputable brand.13 Therefore, insurers will likely have to differentiate their telematics-based product offerings beyond any initial early-adopter premium savings by offering value-added services to encourage uptake, as well as to protect market share from other players moving into the telematics space.

In other words, insurers—by offering mutually beneficial, ongoing value-added services—can use IoT-based data to become an integral daily influence for connected policyholders. Companies can incentivize consumers to opt in by offering real-time, behavior-related services, such as individualized marketing and advertising, travel recommendations based on location, alerts about potentially hazardous road conditions or traffic, and even diagnostics and alerts about a vehicle’s potential issues (figure 7).14 More broadly, insurers could aim to serve as trusted advisers to help drivers realize the benefits of tomorrow’s connected car.15

Many IoT applications offer real value to both insurers and policyholders: Consider GPS-enabled geo-fencing, which can monitor and send alerts about driving behavior of teens or elderly parents. For example, Ford’s MyKey technology includes tools such as letting parents limit top speeds, mute the radio until seat belts are buckled, and keep the radio at a certain volume while the vehicle is moving.16 Other customers may be attracted to “green” monitoring, in which they receive feedback on how environmentally friendly their driving behavior is.

  • Insurers can also look to offer IoT-related services exclusive of risk transfer—for example, co-marketing location-based services with other providers, such as roadside assistance, auto repairs, and car washes, may strengthen loyalty to a carrier. They can also include various nonvehicle-related service options, such as alerts about nearby restaurants and shopping, perhaps in conjunction with points earned by good driving behavior in loyalty programs or through gamification, which could be redeemed at participating vendors. Indeed, consumers may be reluctant to switch carriers based solely on pricing, knowing they would be abandoning accumulated loyalty points as well as a host of personalized apps and settings.

For all types of insurance—not just auto—the objective is for insurers to identify the expectations that different types of policyholders may have, and then adapt those insights into practical applications through customized telematic monitoring to elevate the customer experience.

Telematics monitoring has demonstrated benefits even beyond better customer experience for policyholders. Insurers can use telematics tools to expose an individual’s risky driving behavior and encourage adjustments. Indeed, people being monitored by behavior sensors will likely improve their driving habits and reduce crash rates—a result to everyone’s benefit. This “nudge effect” indicates that the motivation to change driving behavior is likely linked to the actual surveillance facilitated by IoT technology.

The power of peer pressure is another galvanizing influence that can provoke beneficial consumer behavior. Take fitness wearables, which incentivize individuals to do as much or more exercise than the peers with whom they compete.17 In fact, research done in several industries points to an individual’s tendency to be influenced by peer behavior above most other factors. For example, researchers asked four separate groups of utility consumers to cut energy consumption: one for the good of the planet, a second for the well-being of future generations, a third for financial savings, and a fourth because their neighbors were doing it. The only group that elicited any drop in consumption (at 10 percent) was the fourth—the peer comparison group.18

Insurers equipped with not only specific policyholder information but aggregated data that puts a user’s experience in a community context have a real opportunity to influence customer behavior. Since people generally resist violating social norms, if a trusted adviser offers data that compares customer behavior to “the ideal driver”—or, better, to a group of friends, family, colleagues, or peers—they will, one hopes, adapt to safer habits.
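A peer-comparison nudge of this kind is easy to prototype. The metric (hard-braking events per 100 km), the sample values, and the message wording below are invented for illustration and not drawn from any insurer's product.

```python
# Illustrative peer-comparison "nudge": compare one driver's hard-braking
# rate to a peer group and phrase the result as feedback. All values invented.

def peer_nudge(driver_braking_per_100km, peer_values):
    """Return a feedback message comparing a driver to a peer group."""
    peer_avg = sum(peer_values) / len(peer_values)
    if driver_braking_per_100km <= peer_avg:
        return "You brake more smoothly than your peer group -- keep it up!"
    pct = (driver_braking_per_100km - peer_avg) / peer_avg * 100
    return f"You hard-brake {pct:.0f}% more often than similar drivers."

print(peer_nudge(6.0, [3.0, 4.0, 5.0]))  # peer average 4.0 -> "50% more often"
```

The framing matters: as the utility study above suggests, "compared to your neighbors" tends to move behavior where "save money" does not.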

[Figure 7]

The future ain’t what it used to be—what should insurers do?

After decades of adherence to traditional business models, the insurance industry, pushed and guided by connected technology, is taking a road less traveled. Analysts expect some 38.5 billion IoT devices to be deployed globally by 2020, nearly three times as many as today,19 and insurers will no doubt install their fair share of sensors, data banks, and apps. In an otherwise static operating environment, IoT applications present insurers with an opportunity to benefit from technology that aims to improve profits, enable growth, strengthen the consumer experience, build new market relevance, and avoid disruption from more forward-looking traditional and nontraditional competitors.

Incorporating IoT technology into insurer business models will entail transformation to elicit the benefits offered by each strategy.

  • Carriers must confront the barriers associated with conflicting standards—data must be harvested and harnessed in a way that makes the information valid and able to generate valuable insights. This could include modernizing in-house legacy systems and making them more flexible, building or buying new systems, or collaborating with third-party sources to develop more standardized technology for harmonious connectivity.
  • Corporate culture will need a facelift—or, likely, something more dramatic—to overcome longstanding conventions on how information is managed and consumed across the organization. In line with industry practices around broader data management initiatives,20 successfully implementing IoT technology will require supportive “tone at the top,” change management initiatives, and enterprisewide training.
  • With premium savings already proving insufficient to entice most customers to allow insurers access to their personal usage data, companies will need to strategize how to convince or incentivize customers to opt in—after all, without that data, IoT applications are of limited use. To promote IoT-aided connectivity, insurers should look to market value-added services, loyalty points, and rewards for reducing risk. Insurers need to design these services in conjunction with their insurance offerings, to ensure that both make best use of the data being collected.
  • Insurers will need to carefully consider how an interconnected world might shift products from focusing on cleaning up after disruptions to forestalling those disruptions before they happen. IoT technology will likely upend certain lines of businesses, potentially even making some obsolete. Therefore, companies must consider how to heighten flexibility in their models, systems, and culture to counterbalance changing insurance needs related to greater connectivity.
  • IoT connectivity may also potentially level the playing field among insurers. Since a number of the broad capabilities that technology is introducing do not necessarily require large data sets to participate (such as measuring whether containers in a refrigerated truck are at optimal temperatures to prevent spoilage21 or whether soil has the right mix of nutrients for a particular crop22), small to midsized players or even new entrants may be able to seize competitive advantages from currently dominant players.
  • And finally, to test the efficacy of each IoT-related strategy prior to implementation, a framework such as the Information Value Loop may become an invaluable tool, helping forge a path forward and identify potential bottlenecks or barriers that may need to be resolved to get the greatest value out of investments in connectivity.

The bottom line: IoT is here to stay, and insurers need to look beyond business as usual to remain competitive.

The IoT is here to stay, the rate of change is unlikely to slow anytime soon, and the conservative insurance industry is hardly impervious to connectivity-fueled disruption—both positive and negative. The bottom line: Insurers need to look beyond business as usual. In the long term, no company can afford to engage in premium price wars over commoditized products. A business model informed by IoT applications might emphasize differentiating offerings, strengthening customer bonds, energizing the industry brand, and curtailing risk either at or prior to its initiation.

IoT-related disruptors should also be considered through a long-term lens, and responses will likely need to be forward-looking and flexible to incorporate the increasingly connected, constantly evolving environment. With global connectivity reaching a fever pitch amid increasing rates of consumer uptake, embedding these neoteric schemes into the insurance industry’s DNA is no longer a matter of if but, rather, of when and how.

You can view the original post in its entirety here.


Guest post by Jules Oudmans. This article first appeared here.

A crucial aspect of IIoT applications is the communication layer. This layer is responsible for relaying sensor values from the field northbound for processing and for carrying control commands southbound.

In my previous blog I concluded that IIoT is a reality, but that headaches lie ahead in choosing the right protocol and communications provider, especially when your IIoT solution requires long-range support, will be deployed in multiple countries and needs cross-border coverage.

Protocols, protocols, protocols …

The myriad of protocols in IoT land is still growing and can truly be overwhelming. In the table below I have limited the protocols listed to those that support long range. The table also shows data rates and bi-directional capability; these are important qualifiers for typical IIoT solutions and may help you choose the right protocol … but continue reading!

What Protocol … No, Sorry, What Provider … or Protocol!?

The vast majority of today’s M2M communication relies on 2G and 3G networks. These networks reliably provide relatively low-cost, long-range, border-crossing connectivity for IIoT applications. They are offered by a wide variety of providers and support roaming. A truly excellent choice for your IIoT solution … were it not that 2G/3G networks are expected to disappear in the next 5 years. This is where the headache starts ... as today no cross-border roaming IIoT provider supports a protocol with an expected useful lifetime equal to that of a smartphone.

You could of course develop your IIoT solution – sensors, middleware, processing nodes, et cetera – to support multiple protocols, but this is impractical and costly.

When we limit ourselves to an IIoT deployment that requires long-range connectivity, is low-cost, will be deployed in multiple countries and will be available after 2020, then, today, we can choose between:

  • SigFox: A protocol with country coverage in certain EU countries and countries under rollout such as Brazil, Germany, UK and the US; and
  • LoRa: A protocol offered by multiple providers. Some country covering networks are available today and some network providers cover regions or cities.

SigFox

SigFox is available in countries where rollout is requested (chicken and egg). Today roaming is not supported and, besides this, all data goes over SigFox-managed servers(!) – the latter being something that certain companies will not want.

LoRaWAN

LoRaWAN is growing in popularity. LoRa networks are predominantly rolled out by the NTCs – National Telephone Companies – but you also see new communication providers popping up, such as DIGIMONDO in Germany, Wireless Things in Belgium, et cetera. This is because there are no frequency auctions for LoRa. So far so good, but LoRa also has a small caveat: roaming is under development – expected this year.

Conclusion

With new communication providers popping up out of nowhere and NTCs pushing rollouts of LoRa like there is no tomorrow, there is a lot of turbulence in the IIoT communications space.

Today no cross-border roaming IIoT provider is out there supporting a protocol with a ‘long’ lifespan. Today LoRa is, in Europe, one of the best alternatives to focus on.

Closing Notes

In this post I have not taken LTE-M into consideration as it is becoming available this year(1):

LTE-M rollout will likely be fast as it can utilize existing mobile phone infrastructures. I recommend you read the Ericsson white paper Cellular networks for massive IoT and keep an eye on this space. But also don’t lose track of Ingenu (promising(2)), NWAVE and NB-IoT. Some expect that NB-IoT will ‘crush’ LoRa and SigFox(3) .. just furthering the headache.

Weightless was left out of consideration in this article as it is only available in a few European cities and is more mid than long range .. but hey, it may well suit your IIoT needs!

Due to the turbulence and changes in communications land this article very likely needs to be revisited in 3-to-6 months from now.

Finally: If you are looking to set up a private LoRaWAN network, or want to play around with LoRa possibilities, there is not much stopping you. For approximately $280 you can have your own LoRa network – have a look at TheThingsNetwork.
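Once uplinks start arriving from your own network, most of the work is decoding the tiny byte payloads that LoRa's low data rates force on you. Here is a toy decoder of the sort you might write while experimenting; the 4-byte layout (big-endian int16 temperature in 0.01 °C, uint16 humidity in 0.01 %RH) is an assumption made up for this example, not any LoRaWAN standard.

```python
# Toy LoRaWAN uplink-payload decoder. The 4-byte layout is an invented
# example format: >h = big-endian int16 (temp * 100), >H = uint16 (hum * 100).
import struct

def decode_uplink(payload: bytes):
    """Decode a 4-byte sensor payload into engineering units."""
    temp_raw, hum_raw = struct.unpack(">hH", payload)
    return {"temperature_c": temp_raw / 100, "humidity_pct": hum_raw / 100}

# 0x0907 = 2311 -> 23.11 degC; 0x13D6 = 5078 -> 50.78 %RH
print(decode_uplink(bytes([0x09, 0x07, 0x13, 0xD6])))
```

Packing readings as scaled integers rather than text is the norm on LoRa-class links, where every byte of airtime counts against duty-cycle limits.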

UREASON

UREASON has been at the forefront of IoT/IoE, reasoning over real-time streaming data and events in manufacturing and telecom. We apply an ensemble of techniques – best fitting the requirements – and a wealth of knowledge focused on providing a tailored response to the environment of our customers.

Our capabilities in the Industrial Internet of Things field include:

  • Feasibility studies and Proof of Concepts including hardware prototyping and field tests;
  • Support and roll-out of IIoT solutions in Operational Safety and Predictive Maintenance;
  • Recommendations for human-cyber physical systems, augmented reality and Internet of Things technologies; and
  • Support in Machine Learning and Big-Data initiatives supporting IIoT applications.

References

(1): Cellular IoT alphabet soup, Ericsson Research Blog, 2016

(2): RPMA Technology for the Internet of Things, Ingenu

(3): Vodafone to 'Crush' LoRa, Sigfox With NB-IoT, 2016


The Next Frontier for Developers: IoT

Development for the Internet of Things has grown substantially over the past 12 months according to the newly released Global Developer Population and Demographics Study from Evans Data Corp.  

The number of developers currently working on IoT applications has increased 34% since last year to just over 6.2 million today. In addition, the increase of development for mobile devices, up 14% since last year, has led to smartphones being the most commonly connected IoT platform.  

The study, which combines the industry’s most exhaustive developer population model with the results of Evans Data’s biannual Global Development Survey, also provides fresh population data for the four major regions—North America, APAC, EMEA, and Latin America—and for more than 40 countries. Population numbers for adoption of the hottest tech areas are also included.  

“We’re seeing how, in the space of just a year, the possibilities introduced by the Internet of Things have attracted many developers,” said Michael Rasalan, Director of Research for Evans Data Corp. “This transition to IoT, while not without barriers, is rapid, because developers are able to leverage existing knowledge and expertise in complementary technologies like cloud and mobile, to create entirely new use cases. We’re also seeing developers branch out from concepts centered on wearables to applications for more complex tasks, seen in the industrial space.”  

For the general developer population, estimates and projections for growth to 2021 show APAC leading the pack with nine hundred thousand more developers than EMEA. Growth in India and China is predicted to keep APAC’s population the highest globally for the next several years.

The full report can be found here.

Read more…

Originally Posted by: Mimi Spier

The Internet of Things (IoT) is here to stay—and rapidly evolving. As we try to make sense of IoT’s impact on our lives and businesses, we also continue grappling with the security challenges.

As the IoT security landscape evolves, here are five key insights for designing and implementing IoT deployments for your enterprise.


1. Protect Your People

IoT has opened up a world of possibilities in business, but it has also opened up a host of ways to potentially harm employees and customers. A security breach is no longer limited to stealing credit card data: anyone with the right access could breach firewalls or steal health records. A key challenge of the IoT world is providing the right access to the right people at the right time.
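That “right access, right people, right time” requirement can be made concrete with a small access check. The sketch below is illustrative only: the policy table, role names, and resources are hypothetical, not drawn from any particular product. A request is allowed only when the role, the resource, and the time of day all match the policy:

```python
from datetime import datetime, time

# Hypothetical access policy: role -> (allowed resources, allowed time window).
POLICY = {
    "nurse": ({"health_records"}, (time(7, 0), time(19, 0))),
    "admin": ({"health_records", "firewall_config"}, (time(0, 0), time(23, 59))),
}

def allowed(role: str, resource: str, when: datetime) -> bool:
    """Right access, right person, right time: all three must match."""
    entry = POLICY.get(role)
    if entry is None:
        return False  # unknown roles get nothing
    resources, (start, end) = entry
    return resource in resources and start <= when.time() <= end

print(allowed("nurse", "health_records", datetime(2017, 1, 1, 9, 30)))   # → True
print(allowed("nurse", "firewall_config", datetime(2017, 1, 1, 9, 30)))  # → False
```

A real deployment would pull the policy from a directory service rather than a hard-coded table, but the deny-by-default shape is the same.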

[Related: 5 Real Ways to Enable IoT Success in Your Enterprise]

2. Watch Your Things

As millions of “things” join the enterprise network, the surface area for hackers to breach your system expands with them. These devices will be leveraging public Wi-Fi, cloud, and Bluetooth networks, among others, creating multiple points of vulnerability. Your system needs to be designed for security from the bottom up to account for:

A) Device level: better quality devices

B) Data level: encryption and cryptography

C) Network level: certificates and firewalls

D) Application level: login/authorized access
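As a minimal sketch of one of these layers, the data level (B), here is message authentication with an HMAC tag using only Python’s standard library. This is an assumption-laden illustration, not a full scheme: the shared key, its provisioning, and the payload format are all hypothetical, and a real deployment would add confidentiality (for example, TLS at the network level) on top of integrity.

```python
import hashlib
import hmac
from typing import Optional

# Illustrative shared secret; in practice each device would get its own
# key via a provisioning step not shown here.
DEVICE_KEY = b"per-device-secret"

def sign(payload: bytes) -> bytes:
    """Prefix the payload with an HMAC-SHA256 tag so tampering is detectable."""
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    return tag + payload

def verify(message: bytes) -> Optional[bytes]:
    """Return the payload if the 32-byte tag checks out, else None."""
    tag, payload = message[:32], message[32:]
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    # compare_digest avoids leaking information through timing differences
    return payload if hmac.compare_digest(tag, expected) else None

msg = sign(b'{"temp": 21.5}')
print(verify(msg))                     # → b'{"temp": 21.5}'
print(verify(b"\x00" * 32 + b"junk"))  # → None
```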

3. Poor Quality of Things

The standards for IoT hardware and software are still evolving, which means until we have any established guidelines, we need to account for a vast range in the quality of “things.” Some of these may be very sophisticated and hardy, while others may be of the cheap disposable variety. Which devices you pick may depend upon factors like cost, usage and the use case itself. However, be warned that lower-quality devices have been used to gain entry to a secure network.

“By 2020, more than 25% of identified attacks in enterprises will involve the Internet of Things (IoT), although the IoT will account for less than 10% of the IT security budget.” (Gartner)

4. Is Your Network Ready?

One of the biggest challenges for any IT department implementing company-wide IoT projects will be assessing and managing bandwidth. As millions of devices join your network at increasing rates, scaling your network’s bandwidth will be an ongoing struggle. Your bandwidth needs must remain elastic so you can support your enterprise while minimizing costs. It is also critical to minimize the exposure of your networks by using, for example, micro-segmentation.

5. Data Is Your Friend

As with protecting any system, predictive maintenance is the way to stay a step ahead of breaches. The usual ways of pushing out timely security patches and software upgrades will continue to be helpful. However, one big advantage of IoT is the sheer amount of data it generates. You can track operational data to create alerts based on anomalies in the system. For example, if someone logs into the system from Atlanta and then, 30 minutes later, logs in again from Palo Alto, the system should raise a red flag.
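The Atlanta-to-Palo-Alto example above is a classic “impossible travel” check, which can be sketched in a few lines: if the great-circle distance between two logins implies a travel speed no airliner could achieve, raise the flag. The login-record fields and the 900 km/h threshold below are illustrative assumptions, not part of any specific product:

```python
import math
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical login-event record; field names are illustrative.
@dataclass
class Login:
    user: str
    lat: float
    lon: float
    time: datetime

def distance_km(a: Login, b: Login) -> float:
    """Great-circle distance between two login locations (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a.lat, a.lon, b.lat, b.lon))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))  # Earth radius ≈ 6371 km

MAX_SPEED_KMH = 900  # roughly airliner cruising speed; faster is suspect

def impossible_travel(prev: Login, cur: Login) -> bool:
    """Flag a pair of logins whose implied travel speed is unrealistic."""
    hours = (cur.time - prev.time).total_seconds() / 3600
    if hours <= 0:
        return True  # simultaneous logins from two places are also suspect
    return distance_km(prev, cur) / hours > MAX_SPEED_KMH

# Atlanta login followed 30 minutes later by a Palo Alto login:
t0 = datetime(2017, 1, 1, 12, 0)
atlanta = Login("alice", 33.749, -84.388, t0)
palo_alto = Login("alice", 37.442, -122.143, t0 + timedelta(minutes=30))
print(impossible_travel(atlanta, palo_alto))  # → True
```

In production this rule would be one of many anomaly signals feeding an alerting pipeline, but it shows how operational data turns directly into a red flag.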

You can view the original post by clicking Here.

Read more…