
Just a few years ago, Big Data and the Internet of Things (IoT) were terms generally unheard of. In 2016 they are set to revolutionise technology and the ways in which we acquire and process data, but what do they mean for the healthcare industry?

Xenon Health describe IoT as ‘a phenomenon through which the operational aspects of the physical world become increasingly integrated with digital platforms, enabling information to move seamlessly toward the computational resources that are able to make sense of it.’

Essentially, IoT goes hand-in-hand with the ‘mobile age’ and the diversity of data that is currently being retrieved from agile and mobile locations.

Big Data is a related concept – it addresses the ever-increasing amounts of data that are created every second of every day and recognises that these figures will only continue to grow. Domo provide a useful infographic that explains the ‘social media minute’: every single minute, for example, 277,000 tweets are sent, WhatsApp users share 347,222 photos and Google receives over 4,000,000 search queries!

These figures are remarkable even for those of us caught up in the social media hype, and most shocking of all is the realisation that the global internet population now represents 2.4 billion people. That’s a lot of people creating a lot of data – the question now is how we can utilise this data in a meaningful way.

IoT has revolutionised many industries and will continue to do so in the foreseeable future, but what about healthcare? Organisations within this industry tend to adopt new technologies slowly, relying upon solid evidence and demonstrable impact and efficiency before committing to any such change. The shift towards IoT is, however, beginning to take place, and increasing amounts of available patient data are beginning to inform decision making processes within this sector.

What will this mean for the future of our health?

1.  Prevent (Rather than Cure) Illness


This change in attitude towards the way patients are managed has been making national headlines for the past couple of years, particularly with the implementation of Simon Stevens’ Five Year Forward View for the NHS in England, which aims to catch preventable illnesses such as diabetes and cancer before they become incurable.

According to Forbes, the power to access and analyse enormous data sets ‘can improve our ability to anticipate and treat illnesses. This data can help recognise individuals who are at risk for serious health problems. The ability to use big data to identify waste in the healthcare system can also lower the cost of healthcare across the board.’

This is relevant not just in the US, where the costs of healthcare are a constant source of contention, but also in the UK where the NHS is under strict financial pressures. If we can reduce the number of patients admitted to hospital, the costs spent on these admissions will be directly reduced and the money can be spent where it’s most urgently needed.

On top of these immediate, local benefits, the rise of Big Data in healthcare organisations can be used to ‘predict epidemics, cure disease, improve quality of life and avoid preventable deaths. With the world’s population increasing and everyone living longer, models of treatment delivery are rapidly changing, and many of the decisions behind those changes are being driven by data.’

The big push for healthcare organisations will be to understand as much about their patients as possible, in order to be warned of any impending symptoms or hereditary factors that could result in serious illness. If clinicians are armed with enough data, as will be possible in the near future, the warning signs of serious illness can be spotted early enough that treatment is simpler, if not avoidable altogether.

2.  Wearable Devices

So we’ve established that prevention is better than cure and that Big Data will allow health professionals to make more informed decisions regarding our well-being, but what about IoT?

For years there have been thousands of apps that allow us to measure the steps we take and count the calories we consume. For health-conscious, tech-savvy individuals, the recent explosion of ‘wearable devices’ on the market means that there are even more ways to track our health and diet.

Devices such as the Fitbit are particularly useful as they utilise the IoT, allowing you to track your individual progress as well as upload your data to be measured alongside others’.

According to Forbes, this collection and collation of data ‘enables sophisticated predictive modelling to take place – a doctor will be able to assess the likely result of whichever treatment he or she is considering prescribing, backed up by the data from other patients with the same condition, genetic factors and lifestyle.’

As most of us make decisions about our health outside of a medical setting, however, these wearable devices are particularly useful. The data they record can inform us about the healthiness of our lifestyle choices and potentially help us to avoid preventable medical conditions: we can see whether we are doing enough exercise, measure our heart rate and check whether we are eating too much sugar – valuable information to help us become the healthiest possible versions of ourselves.

 

3.  Telemedicine and Digital GP Appointments

The final way in which these concepts can improve our health is through ‘telemedicine,’ a buzzword emerging alongside Big Data and IoT. The term refers to receiving medical help and treatment outside of a medical environment, usually within a patient’s own home and with the assistance of a device connected to the internet. The phrase even encompasses self-diagnosis through websites such as NHS Choices and WebMD, but is set to increasingly involve one-on-one services with qualified professionals.

On this topic we must also consider IBM’s artificially intelligent supercomputer, ‘Watson’. After earning the title of world Jeopardy champion, Watson is now being positioned as a medical genius, with some suggesting it could become the ‘best doctor in the world.’ The supercomputer is already capable of storing far more information than any doctor and of making emotionless, unbiased decisions based on fact and evidence alone.

According to Business Insider, Watson is capable of understanding natural language, generating hypotheses, and of learning – that is, not just storing data but finding meaning from it. And that’s really the whole point of Big Data: not just collecting it on a mass scale but using it to derive meaning.

The quality and immediacy of care provided to every patient could be revolutionised if everyone was given access to their own personal ‘Watson.’

 

Are there any potential risks?

As you’d expect, a certain amount of caution is needed before we can sing the praises of these revolutionary concepts. If all of our personal health data is interconnected and more easily accessible, what does that mean for our privacy? As systems become more connected, there is potentially an increased risk of vulnerabilities within the storage vessels holding our data – how do we know that it’s safe?

The staged Fitbit hack of last year serves as a perfect, if disconcerting, example of the potential exposure of our sensitive data. The hack made use of the device’s open Bluetooth connection – according to researchers, an attacker within Bluetooth range (worryingly close!) could send malware to a nearby Fitbit, which could then be transferred to any computer the device subsequently connected to. Fitbit denied that the reported security faults were an issue, but we are still left questioning the safety of our personal data.

It has been recognised that, if not convincingly addressed, ‘these privacy and security risks may undermine consumer and business confidence in IoT in health care, slowing patient and provider adoption of the technology.’ A potential solution is to define national standards for interoperability: techUK have already begun such an initiative with their Interoperability Charter, which companies handling healthcare data can sign, ensuring clear responsibilities for every player in the ecosystem. Adhering to a single standard will help ensure that devices within a network can communicate and work together both safely and effectively.

The amount of data produced globally is ten times greater than it was in 2009 and is predicted to increase by 4300% by 2020. Demographics are gradually changing – an ageing Baby Boomer generation will drive market growth for technologies such as motion sensors that allow them to manage their conditions and age within their own homes, and the Millennials will follow in their wake with their own demands on technology. Big Data and the Internet of Things are set to increase in importance but will have to respond quickly and efficiently to the changing needs of their users.

 

Originally posted on Data Science Central


Guest blog post by Bernard Marr

The term ‘Big Data’ is a massive buzzword at the moment and many say big data is all talk and no action. This couldn’t be further from the truth. With this post, I want to show how big data is used today to add real value.

Eventually, every aspect of our lives will be affected by big data. However, there are some areas where big data is already making a real difference today. I have categorized the application of big data into 10 areas where I see the most widespread use as well as the highest benefits [For those of you who would like to take a step back here and understand, in simple terms, what big data is, check out the posts in my Big Data Guru column].

Detection of Earth-like planets uses big data

1. Understanding and Targeting Customers

This is one of the biggest and most publicized areas of big data use today. Here, big data is used to better understand customers and their behaviors and preferences. Companies are keen to expand their traditional data sets with social media data, browser logs, text analytics and sensor data to get a more complete picture of their customers. The big objective, in many cases, is to create predictive models. You might remember the example of U.S. retailer Target, which is now able to predict very accurately when one of its customers is expecting a baby. Using big data, telecom companies can now better predict customer churn; Wal-Mart can predict what products will sell; and car insurance companies understand how well their customers actually drive. Even government election campaigns can be optimized using big data analytics: some believe Obama’s win in the 2012 presidential election was due to his team’s superior ability to use big data analytics.
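
As an illustration of the kind of predictive model described here, the sketch below trains a toy churn classifier with Python and scikit-learn. The features, data and weights are synthetic stand-ins invented for the example, not any retailer's or telecom's actual method.

```python
# Illustrative sketch only: a minimal churn model of the kind described above,
# trained on synthetic data with hypothetical behavioural features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1_000
# Hypothetical features: monthly usage, support calls, tenure in months.
X = np.column_stack([
    rng.normal(300, 80, n),   # minutes used per month
    rng.poisson(1.5, n),      # support calls per month
    rng.integers(1, 60, n),   # months as a customer
])
# Synthetic label: churn is more likely with many support calls, short tenure.
logits = 0.8 * X[:, 1] - 0.05 * X[:, 2] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```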

2. Understanding and Optimizing Business Processes

Big data is also increasingly used to optimize business processes. Retailers are able to optimize their stock based on predictions generated from social media data, web search trends and weather forecasts. One particular business process that is seeing a lot of big data analytics is supply chain or delivery route optimization. Here, geographic positioning and radio frequency identification sensors are used to track goods or delivery vehicles and optimize routes by integrating live traffic data, etc. HR business processes are also being improved using big data analytics. This includes the optimization of talent acquisition – Moneyball style, as well as the measurement of company culture and staff engagement using big data tools.

3. Personal Quantification and Performance Optimization

Big data is not just for companies and governments but also for all of us individually. We can now benefit from the data generated by wearable devices such as smart watches or smart bracelets. Take the Up band from Jawbone as an example: the armband collects data on our calorie consumption, activity levels, and sleep patterns. While it gives individuals rich insights, the real value is in analyzing the collective data. In Jawbone’s case, the company now collects 60 years’ worth of sleep data every night. Analyzing such volumes of data will bring entirely new insights that it can feed back to individual users. The other area where we benefit from big data analytics is finding love – online, that is. Most online dating sites apply big data tools and algorithms to find us the most appropriate matches.

4. Improving Healthcare and Public Health

The computing power of big data analytics enables us to decode entire DNA strings in minutes and will allow us to find new cures and better understand and predict disease patterns. Just think of what happens when all the individual data from smart watches and wearable devices is applied to millions of people and their various diseases. The clinical trials of the future won’t be limited by small sample sizes but could potentially include everyone! Big data techniques are already being used to monitor babies in a specialist premature and sick baby unit. By recording and analyzing every heartbeat and breathing pattern of every baby, the unit was able to develop algorithms that can now predict infections 24 hours before any physical symptoms appear. That way, the team can intervene early and save fragile babies in an environment where every hour counts. What’s more, big data analytics allow us to monitor and predict the development of epidemics and disease outbreaks. Integrating data from medical records with social media analytics enables us to monitor flu outbreaks in real time, simply by listening to what people are saying, e.g. “Feeling rubbish today – in bed with a cold”.
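
The unit's actual algorithms are not public, so as a purely illustrative stand-in, here is a toy Python sketch of the general idea: flag vital-sign readings that drift far from a rolling baseline. The window size, threshold and readings are all invented.

```python
# Toy sketch of vital-sign anomaly flagging, NOT the actual neonatal algorithm:
# flag readings more than `threshold` standard deviations from a rolling baseline.
from collections import deque
from statistics import mean, stdev

def flag_anomalies(heart_rates, window=20, threshold=3.0):
    """Yield (index, value) for readings far outside the recent baseline."""
    baseline = deque(maxlen=window)
    for i, hr in enumerate(heart_rates):
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(hr - mu) > threshold * sigma:
                yield i, hr
        baseline.append(hr)

readings = [130, 128, 132, 131, 129] * 5 + [170]  # a sudden spike at the end
print(list(flag_anomalies(readings, window=10)))  # flags the final reading
```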

5. Improving Sports Performance

Most elite sports have now embraced big data analytics. We have the IBM SlamTracker tool for tennis tournaments; we use video analytics that track the performance of every player in a football or baseball game; and sensor technology in sports equipment such as basketballs or golf clubs allows us to get feedback (via smartphones and cloud servers) on our game and how to improve it. Many elite sports teams also track athletes outside of the sporting environment – using smart technology to track nutrition and sleep, as well as social media conversations to monitor emotional wellbeing.

6. Improving Science and Research

Science and research are currently being transformed by the new possibilities big data brings. Take, for example, CERN, the particle physics laboratory with its Large Hadron Collider, the world’s largest and most powerful particle accelerator. Experiments to unlock the secrets of our universe – how it started and how it works – generate huge amounts of data. The CERN data center has 65,000 processors to analyze its 30 petabytes of data, yet it also uses the computing power of thousands of computers distributed across 150 data centers worldwide to analyze the data. Such computing power can be leveraged to transform many other areas of science and research.

7. Optimizing Machine and Device Performance

Big data analytics help machines and devices become smarter and more autonomous. For example, big data tools are used to operate Google’s self-driving car: a Toyota Prius fitted with cameras, GPS, and powerful computers and sensors that let it drive safely on the road without human intervention. Big data tools are also used to optimize energy grids using data from smart meters. We can even use big data tools to optimize the performance of computers and data warehouses.

8. Improving Security and Law Enforcement

Big data is applied heavily in improving security and enabling law enforcement. I am sure you are aware of the revelations that the National Security Agency (NSA) in the U.S. uses big data analytics to foil terrorist plots (and maybe spy on us). Others use big data techniques to detect and prevent cyber attacks. Police forces use big data tools to catch criminals and even predict criminal activity, and credit card companies use it to detect fraudulent transactions.

9. Improving and Optimizing Cities and Countries

Big data is used to improve many aspects of our cities and countries. For example, it allows cities to optimize traffic flows based on real-time traffic information as well as social media and weather data. A number of cities are currently piloting big data analytics with the aim of turning themselves into Smart Cities, where the transport infrastructure and utility processes are all joined up – where a bus waits for a delayed train, and where traffic signals predict traffic volumes and operate to minimize jams.

10. Financial Trading

My final category of big data application comes from financial trading. High-Frequency Trading (HFT) is an area where big data finds a lot of use today. Here, big data algorithms are used to make trading decisions: the majority of equity trading now takes place via data algorithms that increasingly take into account signals from social media networks and news websites to make buy and sell decisions in split seconds.
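
As a deliberately simplified illustration of how such signals might be combined (real HFT systems are vastly more complex and operate at microsecond scale), consider this toy Python sketch; the signal names, values and weights are all invented.

```python
# A deliberately simplified illustration of signal-driven trade decisions,
# nothing like a production HFT system: combine weighted signals into one score.
def trade_decision(signals, weights, buy_at=0.5, sell_at=-0.5):
    """signals map name -> value in [-1, 1]; weights map name -> weight."""
    score = sum(signals[name] * weights.get(name, 0.0) for name in signals)
    if score >= buy_at:
        return "BUY"
    if score <= sell_at:
        return "SELL"
    return "HOLD"

# Hypothetical inputs: price momentum plus sentiment from news and social media.
signals = {"momentum": 0.6, "news_sentiment": 0.7, "social_sentiment": 0.2}
weights = {"momentum": 0.5, "news_sentiment": 0.3, "social_sentiment": 0.2}
print(trade_decision(signals, weights))  # score 0.55 -> "BUY"
```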

For me, the 10 categories I have outlined here represent the areas in which big data is applied the most. Of course, there are many other applications of big data, and there will be many new categories as the tools become more widespread.

What do you think? Do you agree or disagree with this data revolution? Are you excited or apprehensive? Can you think of other areas where big data is used? Please share your views and comments.

Bernard Marr is a bestselling business author and is globally recognized as an expert in strategy, performance management, analytics, KPIs and big data. His latest book is 'Big Data - Using Smart Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance'.

You can read a free sample chapter here.


Cloud computing vulnerabilities

When contemplating a migration to cloud computing, you have to consider the following security issues in order to enhance your data safety.

Session Riding

Session riding occurs when an attacker steals a user’s session cookie and then uses the application as that user. Attackers might also mount cross-site request forgery (CSRF) attacks, tricking a logged-in user’s browser into submitting authenticated requests the user never intended, to accomplish various missions.
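
A standard defence, sketched below in framework-agnostic Python, is to require a per-session anti-CSRF token with every state-changing request and compare it in constant time. The helper names and the dict-based session are illustrative assumptions, not from any particular framework.

```python
# Minimal anti-CSRF sketch: a random per-session token must accompany every
# state-changing request; a forged cross-site request cannot supply it.
import hmac
import secrets

def issue_csrf_token(session):
    """Store a fresh random token in the server-side session and return it."""
    session["csrf_token"] = secrets.token_urlsafe(32)
    return session["csrf_token"]

def is_valid_request(session, submitted_token):
    """Accept only if the submitted token matches; compare in constant time."""
    expected = session.get("csrf_token", "")
    return bool(expected) and hmac.compare_digest(expected, submitted_token)

session = {}
token = issue_csrf_token(session)          # embedded in the genuine HTML form
print(is_valid_request(session, token))    # True: genuine form submission
print(is_valid_request(session, "forged")) # False: cross-site forgery attempt
```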

Virtual Machine Escape

Within virtualized settings, physical servers run multiple virtual machines on top of a hypervisor. An attacker can remotely exploit a hypervisor through a vulnerability in that particular hypervisor; such vulnerabilities are rare, but they are real. A virtual machine can also escape its virtualized sandbox and gain access to the hypervisor. Consequently, the attacker gains access to every virtual machine running on that host.

Unsafe Cryptography

Cryptographic algorithms normally depend on random number generators, which need unpredictable information sources to fill a large entropy pool. When a random number generator draws on only a limited entropy pool, its output can be brute-forced. On a client’s computer, the major sources of randomness are user mouse movements and key presses; servers, however, normally operate without user interaction, which means fewer sources of randomization. Virtual machines must therefore rely on whatever sources are available to them, which can lead to easily guessable numbers that provide little of the uncertainty cryptographic algorithms require.
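
In Python terms, the advice boils down to drawing key material from the operating system's entropy pool rather than from a seedable, general-purpose generator. A minimal sketch:

```python
# Never feed a general-purpose PRNG into cryptography; use the OS CSPRNG.
import random
import secrets

# BAD: the Mersenne Twister is deterministic; an attacker who can guess or
# observe the seed can reproduce every "random" key this produces.
random.seed(42)
weak_key = random.getrandbits(128).to_bytes(16, "big")

# GOOD: secrets (os.urandom underneath) draws from the kernel's entropy pool,
# which mixes multiple hardware and system sources.
strong_key = secrets.token_bytes(16)

print(weak_key.hex())    # identical on every run seeded with 42
print(strong_key.hex())  # unpredictable on every run
```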

CSP Lock-in

You have to choose a provider whose cloud security guarantees include letting you shift easily to another provider when necessary. You do not want a CSP that forces you to keep using its services, because sometimes you will prefer one CSP for one thing and a different CSP for something else.

Cloud computing threats

Before you decide to shift to cloud computing, you have to take the platform’s security vulnerabilities into consideration. You also need to assess the possible threats to determine whether the platform’s numerous advantages are worth the risk. The following are the major security threats concerning cloud security.

Ease of Use

It is a reality that cloud computing services can easily be exploited by malicious attackers, since the registration process is pretty simple: you only need a valid credit card to get started. In some cases, you can even pay the cloud computing charges through PayPal, Payza, Bitcoin, Western Union or Litecoin – payment methods that let you stay completely anonymous. The platform can then be abused for various ill purposes such as malware distribution, botnet C&C servers, spamming, DDoS, hash cracking and password cracking.

Secure Data Transmission

When sending data from clients to a cloud computing platform, the data should be transferred over a secure, encrypted communication channel such as SSL/TLS. That prevents attacks like the dreaded man-in-the-middle (MITM), in which your data is stolen by an attacker intercepting your communication.
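
A minimal sketch of this advice, assuming Python with the third-party requests library; the endpoint URL and payload are placeholders, not a real service.

```python
# Keep certificate verification ON so a man-in-the-middle cannot silently
# impersonate the cloud endpoint; requests verifies certificates by default.
import requests

def upload(payload: dict) -> int:
    response = requests.post(
        "https://cloud.example.com/api/v1/data",  # hypothetical endpoint
        json=payload,
        timeout=10,
        verify=True,  # the default; never set verify=False in production
    )
    response.raise_for_status()  # surface HTTP errors; TLS failures raise earlier
    return response.status_code

# upload({"sensor": "thermostat-1", "reading": 21.5})
```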

Insecure APIs

Most cloud services are exposed through their application programming interfaces. Since APIs are accessible from any location on the Internet, malicious attackers can exploit them to compromise the confidentiality and integrity of users’ data. If an attacker obtains the token a legitimate user presents to the service, the attacker can replay that same token against the API and interfere with the customer’s data. Hence, it is imperative that all cloud services provide a secure API to prevent such attacks.
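
One common mitigation, sketched below with Python and the third-party PyJWT library, is to make API tokens signed and short-lived so a captured token cannot be replayed indefinitely. The secret, claim names and user id here are illustrative assumptions only.

```python
# Sketch: signed, short-lived API tokens limit the damage of token theft.
import datetime as dt
import jwt  # pip install PyJWT

SECRET = "server-side-secret"  # in practice, from a key management service

def issue_token(user_id: str) -> str:
    claims = {
        "sub": user_id,
        "exp": dt.datetime.now(dt.timezone.utc) + dt.timedelta(minutes=15),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def validate_token(token: str) -> str:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens,
    # so a stolen-but-expired token is rejected rather than trusted.
    return jwt.decode(token, SECRET, algorithms=["HS256"])["sub"]

token = issue_token("alice")
print(validate_token(token))  # "alice", while the token is still fresh
```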

Malicious Insiders

It is possible for a staff member at a cloud service provider to have complete access to your confidential resources, so providers must put proper security measures in place to track their employees’ actions. Too often, cloud service providers fail to follow security best practices or to implement security policies, and their employees can then collect confidential information from customers without being detected.


Originally posted on Data Science Central


With the cloud changing the way big data is managed, service providers and security organizations have to work harder to ensure the security of Big Data for their consumers. A key reason for increased security breaches is that traditional security technologies lack the capacity to detect and protect against such attacks. In view of this rising issue, let's look at what companies in Silicon Valley are doing to make big data more secure.


IBM and security of Big Data


IBM has launched its Security Intelligence with Big Data platform to detect threats and risks. The platform can help businesses address APTs (advanced persistent threats), fraud and insider threats, and it lets IBM's clients answer questions that could never have been answered before. For instance, it helps clients analyze emails, transactions, social media data, documents and full packet data across years of activity. With these kinds of analytic capabilities, organizations can find malicious activity hidden in huge masses of data.


HP's Big Data Security strategy


HP uses knowledge management apps and Autonomy enterprise search, integrating them with security information and event management (SIEM) to analyze massive data sets. According to Varun Kohli, director of product marketing for enterprise security products, this makes it possible to reveal rogue-employee behavior related to data leaks and to learn in advance of plots against the organization by cyber criminals. HP believes Autonomy gives meaning to the data, ensuring analysts are able to find out what people are saying, both negative and positive.


Platform services


The Blue Coat (bluecoat.com) security platform is uniquely positioned to secure its clients' data across five advanced solution areas: Advanced Threat Protection, Advanced Web and Cloud Security, Encrypted Traffic Management, Incident Response & Network Forensics, and Network Performance & Optimization. The platform aims to deliver cohesive visibility, protection and integration, including:


- Management – providing an environment in which operational teams can manage and enforce policies using a single platform, whether in the cloud, on premise or across both.


- Intelligence – protecting data in real time, an effort that requires integrated intelligence so an organization can adapt to rapidly advancing threats.


Microsoft Big Data Security


To help all its clients move to the cloud and feel more secure, Microsoft has launched new security features for its well-known Azure SQL Database. The new security steps include:


- Encryption – referred to as "always encrypted", it helps businesses protect sensitive data without "having to relinquish the encryption keys to Azure SQL Database". This means that data remains encrypted on disk, in memory, in transit and during processing (see the client-side sketch after this list).


- Transparent data encryption – helps businesses comply with requirements by encrypting databases, associated backups and transaction log files, without requiring changes to applications.


- Azure Active Directory authentication – to centralize identity management for the database.


- Threat detection – alerts users to suspicious database activity on the logical server or the database itself.
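
To make "always encrypted" concrete, here is a hedged client-side sketch, assuming Python with the third-party pyodbc library and Microsoft's ODBC Driver 17 for SQL Server; the server, database, credentials, table and column names are placeholders, not a real deployment.

```python
# Sketch: with ColumnEncryption enabled, the ODBC driver encrypts bound
# parameters and decrypts results client-side, so the cloud never sees
# the plaintext values of protected columns.
import pyodbc  # pip install pyodbc; needs the MS ODBC Driver for SQL Server

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:example.database.windows.net,1433;"  # placeholder server
    "Database=exampledb;Uid=appuser;Pwd=...;"        # placeholder credentials
    "Encrypt=yes;"                # TLS for data in transit
    "ColumnEncryption=Enabled;"   # Always Encrypted for data in use
)
cursor = conn.cursor()
# Values bound as parameters are encrypted before they leave the client.
cursor.execute("SELECT PatientName FROM Patients WHERE SSN = ?", "123-45-6789")
```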


Companies may fail to keep their Big Data secure because:


- They fail to view data security in all its dimensions.


- They lack a cutting-edge, comprehensive information security plan.


- They see data security as an "IT problem" rather than a business problem.


- They fail to classify their data and trade secrets.


The importance of securing Big Data for the business includes:


- Ensuring accuracy – when data in the cloud is secure, every employee can have confidence in its values and information.


- Protecting confidential information – an organization has to ensure trade secrets and employees' personal information are protected, among other data, so that the business does not run into a crisis.


- Availability – when data is secure, it is accessible as long as an internet connection is available; secured Big Data means information can be accessed at any time, regardless of location.


- Preventing opportunistic hacking – when Big Data is not secured, hackers may try to breach the low level of security with the aim of destroying systems and stealing confidential information.

Originally posted on Data Science Central


IoT and M2M Communication Market Fact Sheet

Internet of Things (IoT) and Machine-to-Machine (M2M) communication technologies offer remote access to most, if not all, household appliances. This is the core common aim of both M2M and IoT solutions; however, each technology achieves remote device access differently. Conventional M2M solutions rely on point-to-point communications, using embedded hardware modules over either wired or cellular networks. IoT, on the other hand, relies on IP-based networks to deliver device data to a middleware platform or cloud. Those are the similarities and differences between the IoT and M2M markets; here are some important facts about the industry.
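
To make the contrast concrete, here is a minimal sketch of the IoT side of that pattern: rather than a dedicated point-to-point link, a device publishes readings over an IP network to a middleware broker that any back-end service can subscribe to. This assumes Python with the third-party paho-mqtt (1.x) library; the broker host, client id and topic are placeholders.

```python
# Sketch: an IoT device publishing sensor readings to an MQTT middleware
# broker over an ordinary IP network (contrast with point-to-point M2M links).
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt

client = mqtt.Client(client_id="thermostat-42")   # hypothetical device id
client.connect("broker.example.com", 1883)        # hypothetical broker host
client.loop_start()

reading = {"device": "thermostat-42", "temp_c": 21.5, "ts": time.time()}
client.publish("home/livingroom/temperature", json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```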

Fact #1 – IoT and M2M will Demonstrate Substantial Growth 
The global IoT and M2M market is worth billions and, since the rise of cloud apps, sensors, and wireless networks, has gained strong momentum. The global markets for Internet of Things and Machine-to-Machine communications will demonstrate a high rate of growth in the near future.

Fact #2 – Platforms and Technologies in IoT and M2M
The global Internet of Things and Machine-to-Machine market is segmented by four criteria: technology, platforms, modules, and connections. By technology and platform, the market is segmented into sensor nodes, radio frequency identification (RFID), cloud management, gateways, near field communication, Zigbee, supervisory control and data acquisition (SCADA), and information and discovery services (IDS).

Fact #3 – Types of M2M and IoT Connections and Components
By M2M connections, the market is segmented into network connections, module types, and SIM cards. On the basis of components, the IoT market is divided into network communications, RFID, and security and support technology.

Fact #4 – Applications of M2M and IoT
The IoT and M2M communications market finds its applications in several sectors. These include: Public safety and urban security, healthcare, retail, energy and power, telecom and IT, transportation, industrial and commercial buildings, consumer and residential, and manufacturing.

Fact #5 – IoT and M2M Communication Industry Drivers
Support from governments is a key factor driving the IoT and M2M industries. It is crucial for giving queries a single point of contact and for making work smarter by integrating the Internet with machines, which will generate more revenue and profitability.

Fact #6 – Developed Countries are Leading the Market
Currently, the U.S. and the U.K. are the largest markets for technologies such as Machine-to-Machine communication. This can be attributed to the high technical awareness and technological advancements in these countries. Europe is following a similar trend.

Fact #7 – Top Key Companies in the Industry
The global industry of the IoT and M2M market is dominated by key players such as Alcatel-Lucent S.A. (France), Huawei Technologies Co. Ltd. (China), Cisco Systems, Inc. (U.S.), Gemalto NV (Netherlands), Google Inc. (U.S.), IBM (U.S.), and Intel Corporation.

Fact #8 – Growth Opportunities for Existing Players and New Entrants
The established companies and new entrants in the IoT and M2M communication industry will benefit from the potential opportunities presented by new technological ideas and deployment of innovative devices.

Fact #9 – Factors Suppressing IoT and M2M Communication Industry
The global market for IoT and M2M communication will be suppressed by factors such as unfavorable governance and the phasing out of 2G technology. In addition, the need to customize technology and concerns about the security of data made available and accessible through the Internet will further inhibit the growth of the market.

To overcome these challenges, the Internet of Things and Machine-to-Machine market needs to develop and deploy cost-effective, robust devices that integrate well across multiple platforms.


Originally posted on Data Science Central



2016 Trends in Big Data & Network Security

Guest blog post by Srividya Kannan Ramachandran

I attended the Carrier Network Security Strategies conference (#CNSS2015), held by Light Reading in NYC on Dec 2, and the New Jersey Tech Council’s Data Summit (#NJTechCouncil) on Dec 9. The main topics of discussion were securing the perimeter of networks and protecting customer, carrier and network data. Here is a brief summary of what I learned at these two conferences about managing and operating networks securely and protecting data.

1. The perimeter of a network as we know it does not exist anymore.

The traditional network security paradigm – securing the perimeter of the network so that malicious users cannot enter it – has now changed. The explosion of devices that connect to networks, and the mobility they bring with them, makes securing the perimeter extremely challenging.

2. Sharing threat intelligence allows for collaboration in developing strategies to identify and combat threats that occur on the internet.

There are many consortiums of companies today that pledge to share threat intelligence so as to make the information world safer.  The Cybersecurity Working Group of the CTIA - The Wireless Association in the US, and the European Telecommunications Standards Institute’s Network Function ... are doing a lot of advocacy and development work in the security area.

Christer Swartz from Palo Alto Networks gave a keynote address at CNSS where he talked about a futuristic model where we have the next generation firewall and advanced endpoint protection software talking with not only each other but also with a cloud based service that hosts threat intelligence – call it the threat intelligence cloud.

Chris Richter from Level 3 Communications also delivered a keynote highlighting the benefits of collaboration amongst carriers in a landscape where cyberattacks are increasing in number. He also mentioned a Wall Street Journal news article that describes how Level 3 thwarted a serious global hacking attack.

Chris Bream from Facebook reinforced the same idea of how openness is key to increased security. Lack of collaboration hurts companies that try to protect themselves and their customers. He talked about Facebook’s ThreatExchange – an API-based platform where companies can share threat data. Companies like Netflix and AT&T are already using this platform. This type of information-exchange platform really helps smaller businesses to thrive, because it gives them access to knowledge that was otherwise unavailable and very hard to acquire.

3. The players that are thinking about security are:

  • Telcos & internet service providers
  • IoT device manufacturers
  • OEMs for networking gear
  • Enterprise customers
  • End-user customers

Merely having antivirus software running on equipment connected to your network isn’t going to meet security needs. The Internet of Things is going to bring 8 billion connected devices online by the end of this decade. Experts from around the industry unequivocally agree that about 70% of these IoT devices are not being secured correctly. And if the perimeter of a network is now the perimeter of the internet, then all the players listed above have to think about security.

4. Big Data : Monetize and Protect

There are primarily two things to do with big data: monetize it and protect it. And both are equally important. No matter what else we do with big data, security and monetization are almost always in the mix. Even when we talk about the platforms and algorithms we use to analyze big data, we are still talking about security when using cloud computing applications, or monetization when describing the purpose of the analysis. Even when we talk about storing data in the cloud, we are actually talking about being able to store and retrieve that data securely, and being able to perform access control and audits on it.

In the Data Summit at NJTC, there was a panel called Monetizing While Securing Big Data and in CNSS, there was a panel called Security: The Future of Monetization Opportunities for Service Providers.

In the former panel, Paul Zikopoulos from IBM shared an interesting quote: “If you are not paying for it, you are the product being sold.” There was a discussion of the massive governance challenges around the ownership of metadata in the big data revolution. Tom Mullen from Level 3 elaborated on how, owing to – or actually despite – the inflection point in computing, the world is quickly moving from data collection to data analysis.

In the latter panel, the discussion pivoted around monetizing security as a service (SecAAS). The panelists observed that small and medium businesses will require a lot more handholding when implementing SecAAS products on their networks, while large enterprises typically already have some security infrastructure of their own and hence could work with a less customized SecAAS product.

I hope this provides you with a summary of all the current and important topics of discussion amongst practitioners in the field.


Is 2016 the Year of AI?

Guest blog post by Bill Vorhies

Summary:  Can AI take its victory lap in 2016?  A lot depends on what you call AI and whether the consumer can perceive it.

Image source:  skymind.io

If 2016 is to be “the year of AI”, as some folks are speculating, then we ought to take a look at what that might actually mean.  For starters, is AI sufficiently mature, and will it matter in the everyday world of consumers?  I’ll stipulate that AI is already relevant to a sizable minority of data scientists, especially those directly involved in AI projects.  But as with the balance of data science, real economic importance comes when our knowledge is put to use in the broad economy.  Hence the emphasis on whether consumers will give a hoot.

Like a lot of other DS disciplines, this doesn’t mean that Jane and Joe consumer even need to know that DS is at work.  It does mean that Jane and Joe would recognize that their lives were less convenient or efficient if the DS were suddenly removed.

What the consumer sees

Since this is CES season (Consumer Electronics Show for those of you not near any sort of video screen for the last week) this might be a good place to look to see how and if AI is making its way into the consumer world.  Here’s a more or less random sampling of 2016 CES new product rollouts:

  1. Samsung SUHD televisions
  2. Robot bartender
  3. Smart shower head (changes color if you use too much water)
  4. Google/Lenovo Project Tango depth sensing for Android phones
  5. iLi wearable translator
  6. Advanced 3D printers
  7. Lyve photo organizer (organizes your digital pictures)
  8. Virtual reality headsets

Yes there are thousands of products at CES, but here’s the test.  Of these eight new products, which rely on artificial intelligence?  In my opinion there are only two, the iLi wearable translator, and the Lyve photo organizer.  A little explanation about these two.

The iLi wearable translator is a sleek gadget about the size of a small TV remote control with speakers and mics on each side.  Speak English into one side and it immediately broadcasts the translation into Mandarin, Japanese, French, Thai, or Korean out the other.  Yes, it works in reverse (Mandarin in, English out), and no WiFi is required – it runs entirely on on-board memory.  Shades of Star Trek.

The Lyve photo organizer is an app in your PC or tablet that recognizes, finds, and organizes your digital photos.  Find all the pictures of Aunt Sally.  Show me pictures of Joe when he was a boy.  Display the pictures from the Grand Canyon vacation two years ago.

These two devices show two of the three primary applications of AI best known today: voice processing, image processing, and (unrepresented in this example) text processing.

From the data science side you should immediately recognize these as capabilities of Deep Learning, perhaps best described as unsupervised pattern recognition utilizing neural net architecture.

What exactly do we mean by AI?

If you’re a data science practitioner following the literature, then you’ve probably noticed that 9 out of 10 articles on AI tie directly to deep learning.  But is this the full breadth of AI from the consumer’s perspective?

Not to turn this into a definitional food fight, the original definitions of AI specified creating machines that could perform tasks that when performed by humans were perceived to require intelligence.  Note that no one said that the AI machines had to use the same logic as humans to achieve the task. 

For example, IBM’s chess playing phenom Deep Blue played superlative chess but was widely acknowledged not to play the way humans do, instead utilizing its ability to project tens of thousands of potential move combinations and evaluate the statistical value at each step.

Interesting chess factoid:

  • Feb 10, 1996 first win by a computer against a top human.
  • Nov 21, 2005 last win by a human against a top computer.

Strong versus Weak AI

This opens the door to two divisions in the field of AI: Strong AI and Weak AI.  The Strong AI camp works on solutions genuinely simulating human reasoning (very limited success here so far).  The Weak AI camp wants to build systems that behave like humans, pragmatically just getting the system to work.  Deep Blue is an example of Weak AI.

There is a middle ground between these two and the Jeopardy-playing computer, IBM’s Watson is an example.  These are systems inspired by human reasoning but not trying to exactly model it.  Watson looks at thousands of pieces of text that give it confidence in its conclusion.  Like people, Watson is able to notice patterns in text each of which represents an increment of evidence, then add up that evidence to draw the most likely conclusion.  Some of the strongest work in AI today is taking place in this middle ground.

Narrow versus Broad AI

Another division in AI development is Narrow AI versus Broad AI.  Given that the requirement is only that the machine perform the same task as a human, Narrow AI allows for lots of examples, especially outside of deep learning.  For example, systems that recommend options (what to watch, who to date, what to buy) can be built on association math or graph analysis, much simpler than and completely unrelated to deep learning.  Your Roomba can find its way back to its charging station, but that system doesn’t generalize beyond that narrow application.
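
To make that point concrete, here is a toy Python sketch of a co-occurrence recommender: no deep learning, just counting which items are consumed together. The watch histories are invented.

```python
# Toy Narrow-AI recommender: rank items by how often they co-occur with the
# item just watched, using nothing more than pairwise counts.
from collections import Counter
from itertools import combinations

histories = [  # hypothetical per-user watch histories
    {"alien", "blade runner", "dune"},
    {"alien", "dune", "arrival"},
    {"blade runner", "arrival", "dune"},
]

co_counts = Counter()
for items in histories:
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1

def recommend(seen_item, k=2):
    """Return the k items most often watched alongside seen_item."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a == seen_item:
            scores[b] += n
        elif b == seen_item:
            scores[a] += n
    return [item for item, _ in scores.most_common(k)]

print(recommend("dune"))  # items most often watched alongside "dune"
```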

So it’s apparent that consumers today are already experiencing lots of examples of Narrow AI.  In fact, while there are still opportunities in Narrow AI, much of its data science foundation has been fully explored and exploited.  That may be of interest to consumers in these very narrow applications, but I’m not sure that even Joe and Jane consumer would say this is the AI they were promised along with their jet packs and flying cars.  What we’re aiming for is the opposite: Broad AI systems that can be generalized to many applications.

Deep Learning and the Goal of Broad and Not-Wimpy AI

For data scientists and consumers, it appears the way forward is Deep Learning.  Other competing technologies may enter the field; for example, it’s been some time since we’ve heard from Genetic Programs, which compete directly with Neural Nets.  But as in Big Science, with all the money pouring into a single alternative, Deep Learning is where we’re most likely to see breakthroughs.

We’re not satisfied with narrow apps.  Give us a system that generalizes.  And we don’t care on the strong/weak axis how we get there (hence ‘not-wimpy’) so long as it works.

In this category, true consumer applications are not common and belong mostly to the innovation giants like Google (Android voice recognition and self-driving cars), Facebook (image recognition), and Microsoft Skype (instant translation).

Outside of these three there are a handful of interesting examples.  One of the best developed is the ability to perform text analysis on legal documents in the legal discovery process.  This has come on so fast that it has displaced a large percentage of $35/hour paralegal assistants who did this job more slowly, less accurately, and at greater cost. 

In fact, there’s been some speculation that the entire legal profession might be automated in this way.  However a recent study concludes that only about 13% of legal work can be automated despite startups like Legal Zoom and Rocket Lawyer.  The fact is that the activities performed by lawyers are extremely unstructured and therefore resistant to automation.  This is likely also to be a general rule or restriction on how far AI can go.  The outcome for the legal profession is probably more low cost basic services which will bring legal access to under-served portions of the population.

However, if you want a hopeful conclusion to this conversation and the impact of AI on consumers look no further than Baidu, the Chinese tech giant and their prototype device called DuLight that literally allows the blind to see.  The device wraps around the ear and connects by cable to a smart phone. 

The device contains a tiny camera that captures whatever is in front of you—a person’s face, a street sign, a package of food—and sends the images to an app on your smartphone. The app analyzes the images, determines what they depict, and generates an audio description that’s heard through the earpiece. If you can’t see, you can at least get an idea of what’s in front of you.

This is a fully generalized perceptual system based on deep learning that points to a time, not far beyond 2016, when AI will allow users to perceive and understand their surroundings perhaps even better than their native senses allow.

So 2016 may be a little early to declare victory but there are strong examples in the works.  Self-driving cars, content recognition and summarization, and systems to enhance our perception of our surroundings.  Can jet packs and flying cars be far behind?

 

January 6, 2016

About the author:  Bill Vorhies is Editorial Director for Data Science Central and has practiced as a data scientist and commercial predictive modeler since 2001.  He can be reached at:

Bill@DataScienceCentral.com


The first prediction is that data and analytics will continue to grow at an astounding pace and with increased velocity

This is no big surprise, as all the past reports have pointed towards this growth and expansion.

VentureBeat notes that “Although the big data market will be nearly $50B by 2019 according to analysts, what’s most exciting is that the disruptive power of machine data analytics is only in its infancy. Machine analytics will be the fastest growing area of big data, which will have CAGR greater than 1000%.”

The move towards cloud-based solutions opens up opportunities, and it is not going to reverse. Following the trend of recent years, more and more companies are increasing their use of cloud-based solutions, and with this comes the chance to extract and collect data, providing the potential to glean information and knowledge from it.

Suhale Kapoor, Co-Founder and Executive Vice President, Absolutdata Analytics, highlights “The fast shift to the cloud: The cloud has become a preferred information storage place. Its rapid adoption is likely to continue even in 2016. According to Technology Business Research, big data will lead to tremendous cloud growth; revenues for the top 50 public cloud providers shot up 47% in the last quarter of 2013, to $6.2 billion.”

It is not difficult to predict that in 2016 the cloud, and the opportunities it opens up for data, analytics and machine learning, will become huge drivers for business.


Applications will learn how to make themselves better

Applications will be designed to discover self-improvement strategies as a new breed of log and machine data analytics at the cloud layer, using predictive algorithms, enables continuous improvement, continuous integration and continuous deployment. The application will learn from its users; in this sense the users become the system architects, teaching the system what they want and how it should deliver it to them.

Gartner views advanced machine learning as among the top trends to emerge in 2016, “where deep neural nets (DNNs) move beyond classic computing and information management to create systems that can autonomously learn to perceive the world, on their own … (being particularly applicable to large, complex datasets) this is what makes smart machines appear ‘intelligent.’ DNNs enable hardware- or software-based machines to learn for themselves all the features in their environment, from the finest details to broad sweeping abstract classes of content. This area is evolving quickly, and organisations must assess how they can apply these technologies to gain competitive advantage.” The capability of systems to use advanced machine learning need not be confined to the information they find outside; it will also be introspective, applied to the system itself and to how it interfaces with human users.

A system performing data analytics needs to learn what questions it is being asked, how the questions are framed, and the vocabulary and syntax the user chooses to ask them. No longer will the user be required to struggle with the structure of queries and programming languages aimed at eliciting insight from data. The system will understand the user’s natural-language requests, such as “get me all the results that are relevant to my understanding of ‘x, y and z’ ”. The system will be able to do this because of the experience it has of users asking these questions many times in structured programming languages (a corpus of language that the machine has long understood) and matching them to a new vocabulary that is more native to the non-specialised user.

2016 will be the year these self-learning applications emerge, due to changes in the technology landscape. As Himanshu Sareen, CEO at Icreon Tech, points out, this move to machine learning is being fuelled by newly available technology: “Just as all of the major cloud companies (Amazon Web Services, Google, IBM, etc.) provide analytics as a service, so do these companies provide machine learning APIs in the cloud. These APIs allow everyday developers to ‘build smart, data-driven applications’ ”. It would be foolish for those developers not to consider making their systems self-learning.
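
As a hedged sketch of that self-learning loop, the following Python example uses scikit-learn locally as a stand-in for a cloud machine-learning API: the application logs user feedback, periodically retrains, and serves the refreshed model. The query log, labels and intents are invented for illustration.

```python
# Sketch of a "self-learning" application loop: log user feedback on how
# queries were understood, then periodically retrain the intent classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import SGDClassifier
from sklearn.pipeline import make_pipeline

feedback_log = [  # (user query, label the user ultimately confirmed)
    ("show me last month's sales", "sales_report"),
    ("total revenue for march", "sales_report"),
    ("which servers are down", "ops_status"),
    ("any alerts on the cluster", "ops_status"),
]

def retrain(log):
    texts, labels = zip(*log)
    model = make_pipeline(TfidfVectorizer(), SGDClassifier(loss="log_loss"))
    return model.fit(texts, labels)

model = retrain(feedback_log)
print(model.predict(["revenue numbers for april"])[0])  # likely 'sales_report'

# As users confirm or correct answers, append to the log and retrain:
feedback_log.append(("is the api healthy", "ops_status"))
model = retrain(feedback_log)
```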

Our prediction is that through 2016 many more applications will become self-learning, thanks to developments in deep learning technology.


Working with data will become easier

While the highly specialised roles of the programmer, the data scientist and the data analyst are not going to disappear, the exclusivity of the insights they have been party to is set to dissipate. Knowledge gleaned from data will not remain in the hands of the specialist; technology will once again democratise information. The need for easy-to-use applications providing self-serve reports and self-serve analysis is already recognised by business. According to Hortonworks Chief Technology Officer Scott Gnau, “There is a market need to simplify big data technologies, and opportunities for this exist at all levels: technical, consumption, etc.” … “Next year there will be significant progress towards simplification.”

Data will become democratised – first from programmers, then from data scientists and finally from analysts. As Suhale Kapoor remarks, “Even those not specially trained in the field will begin to crave a more mindful engagement with analytics. This explains why companies are increasingly adopting platforms that allow end users to apply statistics, seek solutions and be on top of numbers.” … “Humans can’t possibly know all the right questions and, by our very nature, those questions are loaded with bias, influenced by our presumptions, selections and what we intuitively expect to see. In 2016, we’ll see a strong shift from presumptive analytics — where we rely on human analysts to ask the right, bias-free questions — toward automated machine learning and smart pattern discovery techniques that objectively ask every question, eliminating bias and overcoming limitations.”

“Historically, self-service data discovery and big data analyses were two separate capabilities of business intelligence. Companies, however, will soon see an increased shift in the blending of these two worlds. There will be an expansion of big data analytics with tools to make it possible for managers and executives to perform comprehensive self-service exploration with big data when they need it, without major handholding from information technology (IT),” predicts a December study by business intelligence (BI) and analytics firm Targit Inc. “Self-service BI allows IT to empower business users to create and discover insights with data, without sacrificing the greater big data analytics structures that help shape a data-driven organisation,” Ulrik Pedersen, chief technology officer of Targit, said in the report.

We are able to confidently predict that in 2016 more and more applications for analysing data will require less technical expertise.


Data integration will become the key to gaining useful information

The maturity of big data processing engines enables agile exploration of data and agile analytics, making huge volumes of disparate and complex data fathomable. Connecting and combining datasets unlocks the insights held across data silos, and this will be done automatically in the background by SaaS applications rather than by manually manipulating spreadsheets.
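
As a small illustration of such blending, this Python sketch joins two hypothetical silos (CRM accounts and web analytics) on a shared key with pandas, the kind of step a SaaS application might run automatically; the column names and values are invented.

```python
# Sketch: blending two hypothetical data silos on a shared key with pandas.
import pandas as pd

crm = pd.DataFrame({
    "account_id": [101, 102, 103],
    "segment": ["enterprise", "smb", "smb"],
})
web = pd.DataFrame({
    "account_id": [101, 102, 104],
    "visits_last_30d": [42, 7, 19],
})

# An outer join keeps accounts that appear in only one silo, exposing the
# gaps as well as the overlaps between the two data sets.
blended = crm.merge(web, on="account_id", how="outer")
print(blended)
```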

David Cearley, vice president and Gartner Fellow, postulates “The Device Mesh”, which “refers to an expanding set of endpoints people use to access applications and information or interact with people, social communities, governments and businesses”. He notes that “In the postmobile world the focus shifts to the mobile user who is surrounded by a mesh of devices extending well beyond traditional mobile devices” that are “increasingly connected to back-end systems through various networks”, and that “As the device mesh evolves, we expect connection models to expand and greater cooperative interaction between devices to emerge”.

In the same report Cearley says that “Information has always existed everywhere but has often been isolated, incomplete, unavailable or unintelligible. Advances in semantic tools such as graph databases as well as other emerging data classification and information analysis techniques will bring meaning to the often chaotic deluge of information.”

It is an easy prediction, but more and more data sets will be blended from different sources, allowing more insights; this will be a noticeable trend during 2016.


Seeing becomes all-important: visualisations are the key to unlocking the path from data to information to knowledge

Having the ability to collect and explore complex data leads to an inevitable need for a toolset to understand it. Tools that present the information in these complex data sets as visual representations have been maturing and gaining wider adoption. As Suhale Kapoor puts it: “Visuals will come to rule: The power of pictures over words is not a new phenomenon – the human brain has been hardwired to favour charts and graphs over reading a pile of staid spreadsheets. This fact has hit data engineers, who are readily welcoming visualisation software that enables them to see analytical conclusions in a pictorial format.”

The fact that visualisations do leverage knowledge from data will lead to more adaptive and dynamic visualisation tools: “Graphs and charts are very compelling, but also static, sometimes giving business users a false sense of security about the significance — or lack of it — in the data they see represented. … data visualisation tools will need to become more than pretty graphs — they’ll need to give us the right answers, dynamically, as trends change … leading to dynamic dashboards … automatically populating with entirely new charts and graphs depicting up-to-the-minute changes as they emerge, revealing hidden insights that would otherwise be ignored.”

We predict that in 2016 a new data-centric semiotic – a visual language for communicating data-derived information – will become stronger, grow in importance and be the engine of informatics.

Originally posted on Data Science Central


Guest blog post by Bhavin Shah

From the moment you walk out of the front door, it gets locked behind you – even if you forgot to lock it yourself. From that moment, robotic intelligence takes charge and keeps you informed about whatever is going on in your home while you are away: is your pooch scratching at the newly bought sofa? Is your kid doing his homework or whiling away the evening with his favourite cartoon? Did you forget to switch off the TV before you left for work? Has your garden been sprinkled adequately?

Such... and a lot of other things.

Magical! Isn’t it?

The common notion about smart homes

The smart appliance market was projected to attract global investment of 15.2 billion USD by the end of 2015.

This figure, in itself, sums up the popularity that smart homes are enjoying. Experts and analysts are of the opinion that this figure will grow radically in the next few years and that the next generation will be living in a smart wireless era.

Who would not be intrigued by the idea of having everything done at the tap of a button on one’s mobile phone?

The celebrated hits

When we talk about a smart home, the first thought that comes to mind is relief from the constant noisy thought that says, ‘What’s happening back at home?’ Remote automation also serves the purpose of having your house chores done when you are not personally present to do them!

A smart home, laden with smart appliances, typically serves three purposes for its owner:

- Keeping the owner informed about what is happening when he is not home.

- Performing tasks that have been ordered remotely.

- Performing tasks on the basis of voice commands when the owner is home.

Switching the washing machine on or off, switching off lights, pulling down shades during the day, switching on the dishwasher, setting the television to a kids’ channel when your 4-year-old is coming back home, warming up food… the list is seemingly endless!

This whole idea is fascinating, convenient and, of late, has become an affordable reality too!

But hold on.

Does this coin not have another side, or are we simply staying blind to it?

 

Are they really hits?

While a smart home may assure you that your house no longer needs your personal presence to function, at least for basic chores, it also exposes you to a daunting question – if you are able to control your home remotely, would someone else not be able to do the same?

You would say, ‘We have automated security solutions, integrated with surveillance cameras and all! Moreover, I control my home because I have access to control it. How can someone else?’

Ever heard about your emails and accounts being hacked? It was just you who had access to them as well, wasn’t it?

The world of hackers is eyeing an elaborately fantastic time in the near future. With smart homes being implemented in every modern city with an affluent lifestyle, the possibility of breaking into houses and manipulating owners becomes all the simpler! More frighteningly, it is a remote crime with no physical trace!

An example

I found an example worth sharing in an article at Forbes. Click here for the complete article.

When we talk about smart homes, we conveniently forget that we are exposing our home, and every deliberate detail in it, to the internet. That means my home, along with every inanimate object and every breathing being in it, is subject to the knowledge of millions of people.

When I know where the safe is, and what the code to open it is, there would be thousands of hackers constantly prying to possess as much knowledge about it as I have!

With smart security systems in place, no one would have to take the pains to break into my house! They could just decode the lock code and enter my home without the slightest difficulty.

While the physical mess of burglary would be minimised to a huge extent, thanks to the complicated and robust interconnectivity within a smart home, any equipped hacker would now be able to silence every alert, remotely switch off every surveillance camera, turn off lights to make visibility difficult, track down my valuables and leave the house without the slightest doubt occurring to my neighbours!

A worse scenario:

Considering that the affluent class will form the first buyer segment for smart homes, the security of life could be a graver problem than the security of property.

If I have an 8-year-old returning to an empty home at 4 in the afternoon, he is still coming back to a house without humans, susceptible to the clutches of hackers, no matter how tight-looped the implemented security solution is.

Any ambitious hacker could easily gather every single detail about my son’s commute to and from the house. This leaves me ridiculously open to an easy kidnapping pursuit.

Security is the foremost concern about smart homes, even amongst their creators. Although the market is flooded with robust security solutions, we must not forget that every security system has some loophole or other.

As such, the idea of exposing one’s home to cloud servers and machines talking to each other seems to be a gamble with privacy and security.

The most applauded solutions

Just as with every other virtual asset, the best defence against having a smart home hacked is a unique password – as unique and as difficult to guess as can be.

Lock your router down, use quality devices and remember to update them. Go with a cloud-service provider who swears by their cloud security facilities. ThingWorx and Freescale are awesome examples. 
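
For readers who want to act on the password advice, here is a minimal sketch using Python's standard secrets module; the 20-character length is an arbitrary choice, and any comparable cryptographically secure generator will do.

```python
# A minimal sketch of generating the kind of unique, hard-to-guess
# password recommended above, using Python's standard secrets module.
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One distinct password per device or account -- never reused.
print(generate_password())
```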

Are they fool-proof?

No! Absolutely not! Yet these small steps help keep you out of harm’s way as far as possible.

Let’s understand one thing –

The more powerful technology grows, the nastier the hacks that try to compromise it will become. Simultaneously, the more robust the security solutions in place will become.

It is definitely not in our power to contain hacking completely. Not using a technology when an era is being defined by it is also nothing less than plain folly. But staying a step ahead in safety is completely in our power.

Visit Volansys Technologies

Follow us @IoTCtrl | Join our Community

Read more…

Notable IoT Announcements at CES 2016

CES 2016 has just concluded: 170,000 attendees from across the globe and 3,600 vendors gathered amongst 2.4 million net square feet of exhibit space debuting the latest products and services across the entire consumer tech ecosystem.

It’s come a long way since spinning out of the Chicago Music Show in 1967. Products that have debuted at CES include the videocassette recorder, the compact disc player, HDTV, the Microsoft Xbox and smart appliances.

Each year there seems to be a new category of consumer electronics added to the mix. In 2015 the big buzzword was the Internet of Things, and its weight carried over to 2016 with more than 1,000 exhibitors unveiling IoT technologies. For a community like ours, focused on the industrial side of the IoT, what does a consumer electronics show have to do with our world?

A lot actually.  

Here are the notable announcements from CES 2016:

 

WiFi HaLow

For industrial IoT heads this is probably the most notable announcement to come out of the show. The Wi-Fi Alliance® introduced a low-power, long-range standard dubbed Wi-Fi HaLow™.

In an IoT space with billions of sensors to be placed everywhere, the industry needs a low-power Wi-Fi solution. Wi-Fi HaLow will be a designation for products incorporating IEEE 802.11ah technology. It operates in frequency bands below one gigahertz, offering longer-range, lower-power connectivity for Wi-Fi certified products.

Edgar Figueroa, President and CEO of Wi-Fi Alliance said, “Wi-Fi HaLow is well suited to meet the unique needs of the Smart Home, Smart City, and industrial markets because of its ability to operate using very low power, penetrate through walls, and operate at significantly longer ranges than Wi-Fi today. Wi-Fi HaLow expands the unmatched versatility of Wi-Fi to enable applications from small, battery-operated wearable devices to large-scale industrial facility deployments – and everything in between.”

Many devices that support Wi-Fi HaLow are expected to operate in 2.4 and 5 GHz as well as 900 MHz, allowing devices to connect with Wi-Fi’s ecosystem of more than 6.8 billion installed devices. Like all Wi-Fi devices, HaLow devices will support IP-based connectivity to natively connect to the cloud, which will become increasingly important in reaching the full potential of the Internet of Things. Dense device deployments will also benefit from Wi-Fi HaLow’s ability to connect thousands of devices to a single access point.

The bad news? The Wi-Fi Alliance isn’t planning to roll out HaLow certifications until sometime in 2018, and even when it arrives, it might not become the de facto standard. There are others vying for the crown.

 

AT&T

AT&T held a developer summit at the Palms Resort which was all about emerging technologies, products and services. A year ago, AT&T launched the M2X Data Service, a cloud-based data storage service for enterprise IoT developers. At CES they announced the commercial launch of Flow Designer, a cloud-based tool developed at the AT&T Foundry that lets IoT developers quickly build new applications. They also said that they are on track to have 50% of their software built on open source. They are working with OpenDaylight, OPNFV, ON.Lab, the Linux Foundation, OpenStack and others. Rachel King of ZDNet has an interview with AT&T President and CEO Ralph de la Vega here.

 

Ericsson

Ericsson and Verizon announced joint activities to further the development and deployment of cellular low-power wide-area (LPWA) networking for a diverse range of IoT applications. Ericsson introduced three IoT solutions for smart homes and cities:

  • Smart Metering as a Service puts consumers in control and enables utility companies to offer "smart" services to consumers in the future.

  • User & IoT Data Analytics enables controlled access and exposure of data from cellular and non-cellular devices and creates value through cross-industry offerings.

  • Networks Software 17A diversifies cellular for massive IoT, supporting millions of IoT devices in one cell site, 90 percent lower module cost, 10+ years of battery life and a sevenfold improvement in cell coverage.

 

IBM Watson

Last year, IBM announced a USD 3 billion investment in the Internet of Things, and in October they announced plans to acquire The Weather Company, accelerating IBM's efforts in an IoT market that is expected to reach USD 1.7 trillion by 2020.

They furthered their commitment with five related IoT announcements at CES: Softbank, Whirlpool, Under Armour, Pathway Genomics and Ford. What IBM does with Watson in the consumer space will carry over to the industrial space and vice versa. With the tremendous volumes of data from IoT, Watson’s advanced cognitive computing power will be one way to exploit this new resource. Fortune’s Stacey Higginbotham has more here.

 

Intel

Lady Gaga aside, Intel made one announcement at CES which I think got through a lot clearer than Qualcomm’s 14 announcements! Rather than focus on technical aspects, Intel announced innovative technologies and collaborations aimed at delivering amazing experiences throughout daily life – something we often forget to do as we get enamored by the 1s and 0s. From unmanned aerial vehicles and wearables to new PCs and tablets, Intel made sure its chips were in them. On the industrial front was the DAQRI Smart Helmet, an augmented-reality helmet for the industrial worker, powered by an Intel® Core™ M processor.

 

Qualcomm

Qualcomm made a mind-boggling 14 announcements in the CES time frame. Probably the most interesting was the Qualcomm® Snapdragon™ X5 LTE modem (9x07). Qualcomm said the chip has multimode capability and supports LTE Category 4 download speeds up to 150 Mbps. It’s designed to be used in a range of mobile broadband applications and in IoT use cases that demand higher data rates.

 

Samsung

The President and CEO of Samsung Electronics, BK Yoon, delivered the opening keynote speech at CES, calling for greater openness and collaboration across industries to unlock the infinite possibilities of the Internet of Things. Mr. Yoon announced a timetable for making Samsung technology IoT-enabled. By 2017, all Samsung televisions will be IoT devices, and in five years all Samsung hardware will be IoT-ready. He also emphasized the importance of developers in building IoT and announced that Samsung would invest more than USD 100 million in its developer community in 2015.

 

ZigBee Alliance

The ZigBee Alliance, a non-profit association of companies creating open, global standards that define the Internet of Things for use in consumer, commercial and industrial applications, announced that it is working with the Thread Group on an end-to-end solution for IP-based IoT networks. The solution will become part of the ZigBee Alliance’s comprehensive set of product development specifications, technologies, and branding and certification programs.

 

I’m sure there were many more industrial Internet of Things announcements. Let me know what I missed in the comments section below.




Read more…

5 Ways SMS Messaging Will Play Out in IoT

Technophiles and dreamers unite in their joint vision of a future where our lives are connected via a network of devices that electronically talk both to each other and to us. This intelligent design, often – and fondly – referred to as the 'Internet of Things', is a way to semi-automate everything from our homes to our workplaces to all kinds of fun and functional activities in between. While there are already glimpses of new ways to push the boundaries of cutting-edge communication, SMS messaging is currently slated to play a major role in how we live our day-to-day lives in a smarter way.

Housekeeping Reimagined

How many times have you forgotten to transfer wet clothes from the washer to the dryer and found yourself needing to rewash a stinky pile of long-sitting laundry? Perhaps you're notorious for not watering your plants, or maybe you're nagged by the idea that you've driven off and left the garage door up. As programming develops and device interconnectivity grows, you might get a text message from your smart flower pot or pre-programmed appliances alerting you to the error and giving you options for corrective action. Even better, the appliances' connection to the internet will allow them to guide you towards the proper amount of water for your plant or the temperature your fridge ideally should be at, so you can make an educated decision about what steps to take next. If it sounds too good to be true, think again; by 2019, there will be 1.9 billion home devices connected across the IoT, to the tune of some $490 billion in revenue.
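
As a rough illustration of how an appliance might deliver such a text, here is a hedged sketch using Twilio's Python client as one possible SMS gateway; the article does not name a provider, and the credentials, phone numbers and trigger event below are placeholders.

```python
# A hedged sketch of how an appliance's controller might text its owner
# when a cycle finishes. Twilio is used purely as one example SMS gateway;
# the credentials and numbers below are placeholders.
from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # placeholder credentials

def alert_owner(message: str) -> None:
    client.messages.create(
        body=message,
        from_="+15005550006",   # placeholder sender number
        to="+15551234567",      # owner's phone, placeholder
    )

# e.g. fired by the washer's "cycle complete" event
alert_owner("Wash cycle finished - move the laundry to the dryer.")
```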

Medical Milestones

The Internet of Things holds great promise for the medical industry. Geoff Zawolkow, CEO of Lab Sensor Solutions, says, "Sensors are changing the face of medicine. Mobile sensors are used to automatically diagnosis disease and suggest treatment, bringing us closer to having a Star Trek type Tricorder. Also mobile sensors will ensure the quality of our drugs, diagnostic samples and other biologically sensitive materials through remote monitoring, tracking and condition correction." SMS may connect medical professionals in their quest for quicker and more accurate diagnoses, but there are also real-world applications for everyday use as well; there are already pill boxes that will text you a reminder if you forget to take your daily dose, and a clever wearable gadget could send an alert to your phone if your heart rate or blood pressure read abnormally.

Security

We all seek to protect our home and loved ones, and the Internet of Things is making that easier and easier. Smart locks with electronic sensors can be activated – or deactivated, should your child arrive home to an empty house and find themselves unable to remember the entry code – by text, and should a break-in occur, emergency services and other chosen parties will get an SMS update as well.

Promoting Independent Living

The Internet of Things can help elderly relatives live alone longer by providing a constant connection between them and their caregivers. A network of wireless sensors placed around the home and even worn on the person can track, log, and transmit a person's daily activities and alert the proper authorities if there's a heat flare (fire), lack of regular activity (sudden illness or a fall), or even a fever or elevated heart rate. The alert threshold can be adjusted to maintain privacy and allow for discretion in all but certain circumstances deemed text-worthy by those involved. The result is greater independence for our parents and grandparents and peace of mind for those who love them the most.

Streamlining Inventory and Ordering

Running out of milk or eggs at home is inconvenient enough, but in the restaurant industry inventory mistakes can be practically catastrophic. Connected coolers, freezers, pantries, and storage containers send an automated SMS message when a product drops below a set level, with embedded data that can be plugged into an ordering system or forwarded right on to the distributor to maximize efficiency.
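
The reorder logic described here can be sketched in a few lines; the item names, thresholds and restocking rule below are invented for illustration, and the JSON payload stands in for the "embedded data" an ordering system or distributor would consume.

```python
# A minimal sketch of the threshold logic described above: when stock
# drops below a set level, emit a structured reorder payload. All
# thresholds and item names are illustrative.
import json

REORDER_LEVELS = {"milk_l": 20, "eggs_dozen": 10}

def check_inventory(stock: dict) -> list[str]:
    """Return one JSON reorder message per item below its threshold."""
    messages = []
    for item, level in REORDER_LEVELS.items():
        if stock.get(item, 0) < level:
            messages.append(json.dumps({
                "event": "reorder",
                "item": item,
                "on_hand": stock.get(item, 0),
                "reorder_to": level * 2,   # simple restocking rule
            }))
    return messages

print(check_inventory({"milk_l": 5, "eggs_dozen": 12}))
```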

Experts say that there may be as many as 30 billion devices in the collective Internet of Things by the year 2020 – a huge web of connected devices that work in concert to make our lives bigger, better, and more efficient. Read more information about the impending electronic evolution and prepare yourself for a brave new world.

Read more…

2016 Predictions: IoT

Internet of Things (IoT) has garnered massive attention across the tech industry and portends major productivity advances for businesses of all types. The coming year holds significant promise as, potentially, the year in which actual (i.e., not simply existing products and services rechristened as IoT) business and industrial IoT deployments hit the mainstream.

Here are a few predictions from Bsquare.

1.  The fragmentation of IoT will gain speed

The IoT market is so broad that research data analyzing its size and growth becomes almost meaningless. In fact, with such a broadly defined market, saying the IoT market will generate x trillions of dollars of economic impact is analogous to saying the same about the “technology” market. Interesting, maybe, but hardly actionable for buyers, investors, suppliers or other market participants.

Going forward, the industry will begin to break the IoT market apart into subsets that do actually fit together. Many of these discrete markets may even dispense with the IoT acronym.

At the highest level this segmentation is already occurring. Consumer and business IoT, for example, are discrete markets having very little in common. Going further, and just looking at business IoT, segments are emerging for devices and device software, cloud services, machine learning, predictive analytics, and others that allow trends to be more accurately identified and tracked.

2.  Business IoT will see the fastest pace of innovation

While interesting developments are occurring in the consumer space, and these will continue to garner disproportionate attention, the most meaningful innovation will occur on the business side of the market. This is primarily due to the fact that while consumers may purchase IoT products for purely personal reasons, businesses embark on IoT initiatives with specific business objectives in mind, most of which translate directly or indirectly into improved financial outcomes. As a result, business IoT systems are considerably more complex and multifaceted, in many cases touching many core operational systems, yet afford more opportunities to innovate.

Advances in on-board device logic, machine learning, complex rule processing, predictive analytics, as well as IoT platforms in general, will raise the bar in terms of business outcome improvements that can be derived with IoT.

3.  A movement will be launched to capitalize the ‘o’ in IoT

Acronyms have long been smiled upon by technology people. They contribute to verbal economy and improve communications efficiency. For example, saying “TCP” requires five fewer syllables than “transmission control protocol.” And while there is no convention for the structure of acronyms, they are typically all caps in order to distinguish them from abbreviations.

However, starting with voice over IP (VoIP), technology acronyms started to get playful with capitalization. This was followed by fill-in-the-blank-as-a-service (SaaS, IaaS, PaaS, etc.) and, among others, IoT. The rationale for this was undoubtedly that words like “over,” “as,” “a,” and “of” are not important enough to warrant capitalization. This wouldn’t be a problem, and might even be slightly amusing, if it weren’t for the fact that the acronym most frequently appearing near IoT is ROI (return on investment). How do we account for the fact that “on” warrants a capital “O” while “of” has to get by with a small “o”?

4.  Analyst estimates will start to decline… but become more realistic

Stupendously ridiculous numbers have been bandied about regarding the potential size of the IoT market, the number of things participating, and total economic impact. The zeal to outdo one another quickly led to numbers in the trillions (one large company forecast economic impact at $82 trillion; by way of reference, nominal gross product for the planet earth is roughly $74 trillion (to be fair, the author didn’t specify a planet)). It seemed only a matter of time before someone would finally break out the “Q” word. E.g., “global economic value attributable to IoT is expected to eclipse two quadrillion dollars by the year… .”

As we get closer to reality many of these forecasts have been ratcheted down, in some cases by an order of magnitude. We expect these refinements will continue but at the same time become more realistic. In some ways, the progression of market forecasts follows the shape of Gartner’s well-known hype curve—progressively more outlandish estimates followed by a crashing back down to earth and finally settling into more realistic and sustainable ranges.

5.  2016 will be the year actual business IoT deployments accelerate

Not unlike any new technology, there is a propensity among suppliers to rechristen products and/or services they already offer using terminology associated with that new technology. Hence it might appear that the business-oriented IoT market is already going gangbusters when in fact it’s still in its infancy. This tendency is understandable and, in some cases, not completely without merit, but what is truly interesting for businesses are complete systems where intelligent devices generate data that is captured by enterprise systems in order to automatically drive desired business outcomes. This, more than anything else, is why IoT is not even remotely the same as M2M.

For possibly the first time, 2016 will mark the beginning of complete, large scale IoT systems that directly and automatically link devices with business outcomes.

Read more…

Decision Scientist vs. Data Scientist

Someone asked a question in a LinkedIn forum on the distinction between Decision Science and Data Science.

My 2 cents. Others may disagree, given their particular orientation. Given a choice, we are in the Decision Science domain; given a firehose, we are in the Data Science domain. I would suggest that Decision Science ends where Data Science begins. Decision Scientists don't necessarily work with Big Data. Data Scientists are specialized to work with Big Data (or to recognize when something isn't truly Big Data) and all the problems associated with that domain.

Decision Scientists build decision support tools to enable decision makers to make decisions, or take action, under uncertainty with a data-centric bias. Traditional analytics falls under this domain. Often decision makers like linear solutions that provide simple, explainable, socializable decision making frameworks. That is, they are looking for a rationale. Data Scientists build machines to make decisions about large-scale complex dynamical processes that are typically too fast (velocity, veracity, volume, etc.) for a human operator/manager. They typically don't concern themselves with whether the algorithm is explainable or socializable, but are more concerned with whether it is functional, reliable, accurate, and robust.

When the decision making has to be done at scale, iteratively, repetitively, in a non-convex domain, in real time, you need a Data Scientist and lots of compute power, bandwidth, storage capacity, scalability, etc. The dynamic nature of the generating process, which leads to high-volume, high-velocity data, is in my mind the differentiating factor between the two domains.

The transition from manually input data sources to sensor-driven real time data sources is the underlying theme of the "Internet of Things" and this is especially true with the "Industrial Internet of Things" with complex machines interconnected with hundreds of sensors relaying data continually. The underlying math may be somewhat sharable, but doing it at scale, at velocity, etc. requires an end-to-end approach, and a greater emphasis on high speed algorithms. This is particularly true when you are talking about IoT, and especially IIoT, where you do not want humans to be making decisions "on the fly".

A Decision Science problem might be something like a marketing analytics problem where you are segmenting a customer base, identifying a high margin target market, qualifying leads, etc. Here the cost of a bad 'decision' is relatively low. I view Decision Science from the perspective of "decision making under uncertainty" (see Decision theory) which suggests a statistical perspective to begin with. A Data Science problem might be more like "How do I dynamically tweak a Jet engine's performance in flight to ensure efficiency/uptime during flight, to achieve stable and reliable output, to optimize fuel consumption, to reduce maintenance costs, and to extend the useful service life of the engine?" The cost of a failure in flight can be pretty high. Here, you would want to have a reliable machine making those computations and decisions in real time, relying on both ground and in-flight streaming data for real time condition monitoring.
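
To illustrate the distinction, here is a deliberately toy sketch of a machine applying a decision rule to streaming sensor snapshots with no human in the loop; the thresholds, sensor names and actions are invented, and real engine control is vastly more complex.

```python
# A hedged sketch of the distinction drawn above: a machine, not a human,
# applying a decision rule to streaming sensor readings in real time.
# The thresholds and adjustment logic are invented for illustration only.
def adjust_engine(readings: dict) -> str:
    """Return an action for the current sensor snapshot."""
    if readings["turbine_temp_c"] > 950:
        return "reduce_fuel_flow"        # protect the engine
    if readings["vibration_mm_s"] > 7.0:
        return "flag_for_maintenance"    # predictive-maintenance signal
    return "hold"

# In flight this would run continuously against the sensor stream.
for snapshot in [{"turbine_temp_c": 905, "vibration_mm_s": 3.2},
                 {"turbine_temp_c": 962, "vibration_mm_s": 4.1}]:
    print(adjust_engine(snapshot))
```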

In reality, practitioners tend to be "Agnostic" Scientists that are transferring skills from one domain to another, with varying degrees of comfort with the different tools out there. However, a Data Scientist is more likely to have a diverse background that spans multiple disciplines including statistics, computing, engineering, etc. In these examples, the decision maker is a key differentiating factor in my view. A Decision Scientist typically provides a framework for decision making to a human being; a Data Scientist provides a framework for decision making to a machine.

When machines become more human, this distinction may be called into question, but for now, I think it works.

What do you think? Did I just give you a choice? Or, did I just deluge you with data?  Your call.

---

Note: In the above marketing analytics example, I am not referring to clickstream data; rather, I refer to historical records stored on an RDBMS somewhere.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community

Read more…

Data Security Trends for 2016

Data Security Professionals: What You Need to Know NOW: Trends for 2016

There are some scary things happening in data security. Along with the rise of the Internet of Things there has been a corresponding push by hackers to wrest the cloud from us law-abiding folks.

“Gartner is predicting that 6.4 billion connected “things" will be in use globally by the end of 2016 - up 30 percent from 2015 - and that number is expected to reach 20.8 billion by the year 2020. As more Internet connected devices hit the market, so too do the vulnerabilities that come with them, as evidenced by highly-publicized incidents of 2015 where researchers exploited vulnerabilities in planes, guns, medical devices and automobiles.

As the Internet of Things market expands and innovates, researchers will continue to find and uncover exploitable vulnerabilities in these newly connected “things,” which will in turn continue to fan the flames of responsible disclosure.”- Information Management

Companies are having a difficult time finding data security pros who know how to conquer this new frontier of data security in this “every business is an IT business age.”

Information Management Magazine had some cool ideas on this front:

Consolidation of IT Security

Big companies are buying out medium companies, and then these really big companies are eating all of the “little fish” in sight. Dell buys EMC. Cisco buys Lancope. They all begin to buy companies like Adallom, Aorato and Secure Islands. It’s not going to stop next year; in fact, it will accelerate.

“It’s worth noting that offering up a “one stop shop” experience is completely different than being able to integrate technologies together to offer a seamless user experience.” Will that seamless user experience include seamless security?

Responsible Disclosure

You’ve got a Certified Hacker on staff who has uncovered some issues that overlap into the public domain. How much are you legally (never mind morally) required to divulge to regulators and/or competitors? According to IM, this issue will only get thornier as 2016 progresses: 

“‘White hat’ hackers, hired to scope out flaws in systems, are already facilitating company / researcher relationships within the technology industry via bug bounty programs. However, it seems that many segments of the manufacturing industry would rather utilize lawyers to block research altogether than address the vulnerabilities that are uncovered. Another option for security researchers to consider is self-regulation, where they accept the risks and responsibilities associated with their findings.”

Smaller Businesses Up Security Spending

Remember the famous hacks of 2015? They were publicized more than ever before. Companies like LastPass, Securus Technologies, VTech and TalkTalk "are being targeted by cybercriminals because they’re seen as less secure, while oftentimes owning valuable customer data.” These cyberattacks will grow in 2016.

People in the Cloud Share Responsibility

If you deploy in the cloud you share security responsibilities. Small to medium companies are hiring internally or taking advantage of Cloud Services’ security add-ons in contracts. To get a quick primer, check out Amazon’s shared responsibility model.

The other items in Information Management’s list include improved incident response protocols including communications and crisis management to calm investors and consumers; and enhanced collaboration among our communities as “security professionals are utilizing tools and platforms in order to better share and collaborate on security research and uncovering and responding to threats.” The folks at IM “expect this to increase and become more formalized amongst organizations, industry verticals and individual practitioners over the next year.”

What trends would you like us to keep an eye on for you as a cutting-edge data security specialist or leader? Let us know! We’d love to include your favorite topics right here. Email me. Until then, stay safe!

Read more…

Guest blog post by Bernard Marr

US agricultural manufacturer John Deere has always been a pioneering company. Its eponymous founder personally designed, built and sold some of the first commercial steel ploughs. These made the lives of settlers moving into the Midwest during the middle of the 19th century much easier and established the company as an American legend.

Often at the forefront of innovation, it is no surprise that it has embraced Big Data enthusiastically – assisting pioneers with the taming of the virtual wild frontier just as it did with the real one.

In recent years, it has focused efforts on providing Big Data and Internet of Things solutions that let farmers (and, in the case of its industrial division with the black and yellow logo, builders) make informed decisions based on real-time analysis of captured data.

So in this post I want to take a look at some of John Deere’s innovations in the virtual realm, and how they are leading to change which is said to be “revolutionizing” the world of farming.

Smart farms

The world’s population is growing rapidly, which means there is always going to be an increasing demand for more food. With the idea of genetically modified food still not appealing to public appetites, increasing the efficiency of production of standard crops is key to this. To this end, John Deere has launched several Big Data-enabled services which let farmers benefit from crowdsourced, real-time monitoring of data collected from its thousands of users.

They are designed by the company’s Intelligent Solutions Group, and the vision is that one day even large farms will be manageable by a small team of humans working alongside a fleet of robotic tools, all connected and communicating with each other.

To this end, they are working on a suite of services to allow everything from land preparation to seeding, fertilizing and harvesting to be controlled from a central hub.

The total land available can be split into sections and “Prescriptions” issued with precise instructions for seed density, depth and fertilization. These decisions are informed by Big Data – aggregated data from thousands of users feeding their own data back to the service for analysis.

Crowdsourced agriculture

Myjohndeere.com is an online portal which allows farmers to access data gathered from sensors attached to their own machinery as they work the fields, as well as aggregated data from other users around the world. It is also connected to external datasets including weather and financial data.

These services allow farmers to make better-informed decisions about how to use their equipment, where they will get the best results, and what return on investment their equipment is providing.

For example, fuel usage of different combines can be monitored and correlated with their productivity levels. By analyzing the data from thousands of farms, working with many different crops in many different conditions, it is possible to fine-tune operations for optimum levels of production.
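
As a hedged illustration of this kind of fleet-wide analysis (not John Deere's actual schema or method), a few lines of pandas can correlate fuel usage with productivity across machines:

```python
# A minimal sketch of correlating combines' fuel usage with productivity.
# The column names and figures are illustrative, invented for this example.
import pandas as pd

telemetry = pd.DataFrame({
    "combine_id":    ["A", "B", "C", "D"],
    "fuel_l_per_ha": [14.2, 16.8, 13.5, 18.1],
    "tonnes_per_hr": [21.0, 19.4, 22.3, 17.8],
})

# A negative correlation here would suggest heavier fuel use is not
# buying extra throughput -- a candidate for operational fine-tuning.
print(telemetry["fuel_l_per_ha"].corr(telemetry["tonnes_per_hr"]))
```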

The system also helps to minimize downtime by predicting, based on crowdsourced data, when and where equipment is likely to fail. This data can be shared with engineers who will stand ready to supply new parts and service machinery as and when it is needed – cutting down on waste caused by expensive machinery sitting idle.

Another service is Farmsight, launched in 2011. It allows farmers to make proactive decisions about what crops to plant where, based on information gathered in their own fields and those of other users. This is where the “prescriptions” can be assigned to individual fields, or sections of fields, and machinery remotely reprogrammed to alter its behavior according to the “best practice” suggested by the analytics.

As well as increasing farmers’ profits and hopefully creating cheaper, more abundant food for the world, there are potential environmental gains, too.

Pesticides and fertilizer can often cause pollution of air and waterways, so having more information on the precise levels needed for optimum production means that no more than is necessary will be used.

Who owns your agricultural data?

Of course, with all of this data being generated and shared – there is one question which needs answering – who owns it?

Deere offers what it calls its Deere Open Data Platform, which lets farmers share data with each other (or choose not to, if they wish) and also with third-party application developers, who can use the APIs to connect equipment by other manufacturers, or to offer their own data analysis services.

But this has not stopped many farmers asking why they should effectively pay for their own data, and asking why John Deere and other companies providing similar services shouldn’t pay them – according to American Farm Bureau Federation director Mary Kay Thatcher.

Talks are currently ongoing between the AFBF and companies including John Deere, Monsanto and DuPont over how these concerns should be addressed. As well as privacy worries, there are concerns that having too much information could allow traders in financial markets to manipulate prices.

Farming is one of the fundamental activities which makes us human and distinguishes us from animals. Once we developed farms, we no longer needed to constantly be on the move in the pursuit of food and fertile foraging spots, leading to the development of towns, cities and civilization.

The future of farming?

With the development of automation and Big Data, we are starting to delegate those responsibilities to robots – not because farmers are lazy (they really aren’t, as anyone who lives in an area where agricultural activity goes on will tell you!) but because they can often do it better.

Sure, John Deere’s vision of vast areas of farmland managed by a man sitting at a computer terminal with a small team of helpers will lead to fewer employment opportunities for humans working the land, but that has been the trend for at least the last century, regardless.

And the potential for huge positive change – in a world facing overpopulation and insufficient production of food, particularly in the developing nations – is something that has the potential to benefit everyone on the planet.

I hope you found this post interesting. I am always keen to hear your views on the topic and invite you to comment with any thoughts you might have.

About : Bernard Marr is a globally recognized expert in analytics and big data. He helps companies manage, measure, analyze and improve performance using data.

His new book is Big Data: Using Smart Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance. You can read a free sample chapter here.

 

Follow us @IoTCtrl | Join our Community

Read more…

IT Ops Challenge

Each layer of technology in the data centre is becoming progressively more complex to control and manage. The average server environment now has thousands of configuration parameters (e.g. Windows OS: 1,500+; IBM WebSphere Application Server: 16,000+; Oracle WebLogic: 60,000+). The growing interdependence and complexity of interaction between applications also makes it increasingly difficult to manage and control business services.

IT change is very much a fact of life and it takes place at every level of the application and infrastructure stack. It also impacts pretty much every part of the business! To meet these development challenges, businesses have adopted agile development processes to accelerate application release schedules. By employing practices such as continuous integration and continuous build, they are able to generate hundreds of production changes each day. For example, eBay is estimated to make around 35,000 changes per year!

Industry analyst firm Forrester has stated: “If you can’t manage today’s complexity, you stand no chance managing tomorrow’s. With each passing day, the problem of complexity gets worse. More complex systems present more elements to manage and more data, so growing complexity exacerbates an already difficult problem. Time is now the enemy because complexity is growing exponentially and inexorably.”

The tools we use to manage IT infrastructure have been around for many years but are only capable of measuring what has already happened. They are also not designed to deal with the complexity and dynamics of modern IT technologies. IT operations teams need to be able to automate the collection and analysis of vast quantities of data down to the finest resolution, and to highlight any changes in order to unify the various operations silos. None of the traditional tools are up to this ‘big data’ problem!

Big data for operations is still a relatively new paradigm. Gartner has defined the sector as “IT Operations Analytics,” one that can enable smarter and faster decision-making in a dynamic IT environment, with the objective of delivering better services to your customers. Forrester Research defines IT analytics as “The use of mathematical algorithms and other innovations to extract meaningful information from the sea of raw data collected by management and monitoring technologies.”

Despite the field’s relative youth, a lot has already moved on; here are a few interesting findings:

  • Customer analytics (48%), operational analytics (21%), and fraud & compliance (21%) are now the top three uses for Big Data.
  • 15% of enterprises will use IT operations analytics technologies to deliver intelligence for both business execution and IT operations.
  • The market is expected to be mainstream in 2018, making up 10% of the $20+ billion IT Operations Management software category.
  • 89% of business leaders believe big data will revolutionize business operations in the same way the Internet did.
  • 79% agree that ‘companies that do not embrace Big Data will lose their competitive position and may even face extinction.’

Where to use ITOA?

IT Operations Analytics (ITOA, also known as Advanced Operational Analytics or IT Data Analytics) encapsulates technologies that are primarily used to discover complex patterns in high volumes of ‘noisy’ IT system availability and performance data. Gartner has outlined the core applications for ITOA:

  • Root Cause Analysis: The models, structures and pattern descriptions of IT infrastructure or application stack being monitored can help users pinpoint fine-grained and previously unknown root causes of overall system behavior pathologies.
  • Proactive Control of Service Performance and Availability: Predicts future system states and the impact of those states on performance.
  • Problem Assignment: Determines how problems may be resolved or, at least, direct the results of inferences to the most appropriate individuals or communities in the enterprise for problem resolution.
  • Service Impact Analysis: When multiple root causes are known, the analytics system’s output is used to determine and rank the relative impact, so that resources can be devoted to correcting the fault in the most timely and cost-effective way possible.
  • Complement Best-of-breed Technology: The models, structures and pattern descriptions of IT infrastructure or application stack being monitored are used to correct or extend the outputs of other discovery-oriented tools to improve the fidelity of information used in operational tasks (e.g., service dependency maps, application runtime architecture topologies, network topologies).
  • Real-time Application Behaviour Learning: Learns and correlates the behaviour of applications based on user patterns and the underlying infrastructure, creates metrics from these correlated patterns and stores them for further analysis.
  • Dynamic Baselining of Thresholds: Learns the behaviour of the infrastructure under various application and user patterns, determines the optimal behaviour of infrastructure and technology components, benchmarks and baselines the low and high water marks for specific environments, and dynamically adjusts those baselines as infrastructure and user patterns change, without any manual intervention (a minimal sketch of this idea follows below).
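
As a minimal sketch of the dynamic baselining idea in the last item above, the snippet below recomputes an alert threshold from recent history instead of fixing it by hand; the window size and the 3-sigma band are illustrative assumptions.

```python
# A minimal sketch of dynamic baselining: the threshold is recomputed
# from recent history rather than fixed by hand.
from collections import deque
from statistics import mean, stdev

class DynamicBaseline:
    def __init__(self, window: int = 100):
        self.history = deque(maxlen=window)

    def is_anomalous(self, value: float) -> bool:
        if len(self.history) >= 10:            # need some history first
            mu, sigma = mean(self.history), stdev(self.history)
            out_of_band = abs(value - mu) > 3 * sigma
        else:
            out_of_band = False
        self.history.append(value)             # baseline keeps adapting
        return out_of_band

baseline = DynamicBaseline()
for latency_ms in [20, 22, 19, 21, 20, 23, 21, 20, 22, 21, 95]:
    if baseline.is_anomalous(latency_ms):
        print(f"alert: {latency_ms} ms outside dynamic baseline")
```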

By employing advanced analytics to harness vast volumes of highly diverse data from applications and endpoints across an organisation’s IT infrastructure, ITOA solutions provide IT service desks with instant awareness of issues as they occur – often before the person at the other end is even aware of them. Along with awareness, they deliver an understanding of how these issues could in turn affect both the IT infrastructure and the wider business.

Conclusion

IT operations teams are being challenged to run larger, more complex, hybrid and geographically dispersed IT systems that are constantly in a state of change without growing the number of people or resources. Everything from system successes to system failures and all points in between are logged and saved as IT operations data. IT services, applications, and technology infrastructure generate data every second of every day. All that raw, unstructured, or polystructured data is critical in managing IT operations successfully. The problem is that doing more with less requires a level of efficiency that can only come from complete visibility and intelligent control based on the detailed information coming out of IT systems.

ITOA provides a set of powerful tools that can generate the insight needed to help IT operations teams proactively determine the risks, impacts, or potential outages that may arise from various events in the environment, allowing operations to proactively manage IT system performance, availability, and security in complex and dynamic environments with fewer resources and greater speed. ITOA contributes to both the top and bottom line of any organization by cutting IT operations costs and increasing business value through greater user experience and reliability of business transactions.

ITOA technologies are still relatively immature, and Gartner has stated that it will take another 2-5 years for them to reach maturity. However, smart MSPs are moving fast to incorporate these technologies into their portfolios, and IT consumers are beginning to demand them from their partners. In the next few years it is forecast that the vast majority of Global 2000 companies will have deployed IT Operations Analytics platforms as a central component of their architecture for monitoring critical applications and IT services. The key message: if you have not already started to look at ITOA, it is time to start planning…

Originally posted on Data Science Central
Follow us @IoTCtrl | Join our Community
Read more…

50 Predictions for the Internet of Things in 2016

Earlier this year I wrote a piece asking “Do you believe the hype?” It called out an unlikely source of hype: the McKinsey Global Institute. The predictions for IoT in the years to come are massive. Gartner believes IoT is a central tenet of top strategic technology trends in 2016. Major technology players are also taking Big Swings. Louis Columbus, writing for Forbes, gathered all the 2015 market forecasts and estimates here.

So what better way to end the year and look into the future than by asking the industry for their predictions for the IoT in 2016. We asked for predictions aimed at the industrial side of the IoT. What new technologies will appear? Which companies will succeed or fail? What platforms will take off? What security challenges will the industry face? Will enterprises finally realize the benefits of IoT? We heard from dozens of startups, big players and industry soothsayers. In no particular order, here are the Internet of Things Predictions for 2016.

Photo Credit: Sean Creamer via Flickr

Nathaniel Borenstein, inventor of the MIME email protocol and chief scientist at Mimecast

“The maturation of the IoT will cause entirely new business models to emerge, just as the Internet did. We will see people turning to connected devices to sell things, including items that are currently "too small" to sell, thus creating a renewed interest in micropayments and alternate currencies. Street performers, for example, might find they are more successful if a passerby had the convenience of waving a key fob at their "donate here" sign. The IoT will complicate all aspects of security and privacy, causing even more organizations to outsource those functions to professional providers of security and privacy services.”

Adam Wray, CEO, Basho

"The deluge of Internet of Things data represents an opportunity, but also a burden for organizations that must find ways to generate actionable information from (mostly) unstructured data. Organizations will be seeking database solutions that are optimized for the different types of IoT data and multi-model approaches that make managing the mix of data types less operationally complex.”

Geoff Zawolkow, CEO, Lab Sensor Solutions

“Sensors are changing the face of medicine. Mobile sensors are used to automatically diagnosis disease and suggest treatment, bringing us closer to having a Star Trek type Tricorder. Also mobile sensors will ensure the quality of our drugs, diagnostic samples and other biologically sensitive materials through remote monitoring, tracking and condition correction.”

Zach Supalla, CEO, Particle

“2016 isn't the Year of IoT (yet) – it's a bump in the road. The industry has been claiming it’s the year of IoT for the last five years - let’s stop calling it the year of the IoT and start calling it the year of experimentation. 2016 will be the year that we recognize the need for investment, but we’re still deeply in the experimental phase. 2016 will be the bump-in-the-road year - but at the end of it, we’ll have a much better idea of how experiments should be run, and how organizations can “play nicely” within their own walls to make IoT a reality for the business.”

Borys Pratsiuk, Ph.D, Head of R&D Engineering, Ciklum

"The IoT in medicine in 2016 will be reflected in deeper consumption of the biomedical features for non-invasive human body diagnostics. Key medical IoT words for next year are the following: image processing, ultrasound, blood analysis, gesture detection, integration with smart devices. Bluetooth and WiFi will be the most used protocols in the integration with mobile."

Brian T. Patterson, President, EMerge Alliance US Representative, International Electrotechnical Council

“IoT to Enable an Enernet: 2016 will see the IoT starting to play a major role in the evolution of a new, more resilient, efficient, flexible and sustainable 21st Century electric energy platform. IoT connected sensors and microcontrollers will enable the effective and efficient management of a true mesh network of building and community level microgrids, which in turn will enable the greater use of distributed renewable energy sources like solar, wind, bio fuel micro-turbines and fuel cells. The convergence of data networks and physical energy grids will give rise to what will become the Enernet, a data driven transactional energy network.”

Chris Rommel, Executive VP, IoT & Embedded Technology, VDC Research

“PaaS Solution Evolution to Cannibalize IoT Platform Opportunity: The landscape of Platform-as-a-Service (PaaS) solutions is changing rapidly. In 2015, leading PaaS providers IBM, Oracle, and SAP threw their hats into the “IoT platform” ring. As quickly as the value of PaaS solutions had been placed on the consumerization and user experiences of development platform offerings, premiums have now been placed on the ease of back-end integrations. However, the value associated with time to market in the Internet of Things marketplace is too high. IoT solution development and engineering organizations still want the flexible benefits offered by PaaS development, but they also require a breadth of out-of-the-box integrations to mitigate the downstream engineering and deployment hassles caused by heterogeneous IoT systems and networks topologies. The desire and need for enterprise organizations to tightly integrate deployed systems' operations with enterprise business functions are reshaping PaaS selection. The need for tight, out-of-the-box integrations extends beyond the back-end, however. Bi-directional integration is critical. The heterogeneous nature of the IoT and wide range of device form factors, components and functions is too complex and costly to rely on bespoke integrations. As such, we expect the aforementioned PaaS leaders to accelerate their ecosystem development efforts in 2016. Although we likely won’t see any real winners yet emerge in the IoT PaaS space, I do expect that the investments made by the aforementioned large players to threaten the market opportunity available to smaller IoT-focused platform vendors like Arrayent and Carriots.”

Laurent Philonenko, CTO, Avaya

“Surge in connected devices will flood the network – the increasing volume of data and need for bandwidth for a growing number of IoT connected devices such as healthcare devices, security systems and appliances will drive traditional networks to the breaking point. Mesh topologies and Fabric-based technologies will quickly become adopted as cost-effective solutions that can accommodate the need for constant changes in network traffic.”


Lila Kee, Chief Product Officer and Vice President, Business Development, GlobalSign

“Prediction: PKI becomes ubiquitous security technology within the Internet of Things (IoT) market. It's hard to think of a consumer device that isn't connected to the Internet these days - from our baby monitors to our refrigerators to our fitness devices. With the increase of connected devices of course comes risk of exposing privacy and consumer data. But, what happens when industrial devices and critical infrastructure connect to the Internet and get hacked? The results can be catastrophic. Security and safety are real concerns for the Internet of Things (IoT) and especially in the Industrial Internet of Things (IIoT). Regarding security, the industrial world has been a bit of a laggard, but now equipment manufacturers are looking to build security in right at the design and development stages. Unless the security challenges of IIoT can be managed, the exciting progress that has been made in this area of connected devices will slow down dramatically. PKI has been identified as a key security technology in the IIoT space by the analyst community and organizations supporting the IIoT security standards. In 2016, we expect that PKI will become ubiquitous security technology within the IoT market. There will be an increased interest in PKI, how it plays in the IoT market and how it needs to advance and scale to meet the demands of billions of devices managed in the field.”

IoT Central members can see all the predictions here. Become a member today here.

Read more…

Industrial IoT: Extreme Growth, Extreme Opportunities

Guest blog post by Humera Malik

Getting ready for the Industrial Internet of Things

The IoT has been influencing the creation of a connected world where unbelievable amounts of data are generated from just about every imaginable thing. These connected, and previously dormant, things are now going to be able to communicate – my dog’s collar, the milk carton, my car, my thermostat, my watch, my washer and dryer (not that I would particularly enjoy that conversation), you name it. This connectivity and mass data production is exciting because it is defining the future by creating an ecosystem where a huge variety of technologies have to work together, thus breaking down silos. Coming from the telco world, I find that refreshing.

So what are we supposed to do with this influx of data and connected devices? We’ve been sitting down with IoT decision makers in the industry, and we’ve realized a common theme –at the end of the day, they all want to generate value for their customers from this technology.

Transition before Transformation

The realization that data is an asset is a big leap. In certain industries, less than 1% of the data generated is leveraged to optimize and predict in the working environment. The number of connected devices, and the data they will generate, is a daily moving target (a recent forecast says 28 billion devices in the next 5 years).

Be it 20 billion or more, we must take measures to accommodate this growth. Both the challenges and the opportunities lie in roughly two realms: the consumer world and the industrial world. In the consumer world it is easier to measure value with the advent of some of these wearables, although I am on the fence about the privacy logistics (another topic for another day).

But, in my opinion, the industrial world is where we will see the next wave of technological evolution, through revolutionary smart machinery – machines that can learn and adapt. Big giants like GE and others are already developing connected equipment and devices that will shape the future of the industrial world. In the meantime, before this industrial transformation, small steps need to be taken to help many industries adapt and prepare for the connected future.

Learn and Adapt

We've seen manufacturing, energy, and healthcare take the lead by implementing the beginning steps – sensors and connected devices. The front-runners are working to decipher a flood of new data sets, as well as trying to leverage the wealth of historical data that is readily available and, in my opinion, something we cannot afford to lose track of. Combined, this old and new data will allow us to predict the future – and isn't that the dream of modern business?

But before your machine learning systems can take over, optimizing efficiency for cost control while you go and figure out new revenue streams, my advice is to be realistic. The most success has been gained by focusing and placing priority on building an IoT strategy and a roadmap – such as building efficient production environments with the KPIs that will optimize for you, rather than a fully connected factory floor. Build a roadmap, and focus on the end KPIs rather than the technology and a platform. Take small steps towards adopting an IoT strategy, as we have seen too much investment go into platforms too early, yielding little ROI.

The worlds of IoT and IoE continue to change, adapt and surprise daily, making it very easy to get overwhelmed. The key is to understand how it can help your business and then to develop a path to adopt this technology. The value of IoT lies in business outcomes, not in underlying technology. 

Dat-uh helps create business value from your IoT investments. View original blog post here

Follow us @IoTCtrl | Join our Community

Read more…
