


What will this market bring us in the next few years? Are there reasons for optimism?

During the last three years, I have had the opportunity to discover, know and analyse more than 50 Spanish companies in the exciting sector of the Internet of Things (IoT).

Some of these companies are globally recognized as pioneers of IoT. Others are less well known but very innovative, with great talent in their ranks. All of them have been weathering the storm, and although the reality has been tougher than all the hype announced by analysts, far from being discouraged they are more excited than ever about their future prospects.

As I wrote in my post “5 PROVERBS TO SAVE MY STARTUP”, nobody is a prophet in their own land, but even so, I cannot resist providing a few tips that I believe can help us use IoT as an enabler that drives the ICT sector. Wouldn't it be fantastic if we finally fulfilled our desire to have a strong, dynamic, competitive and innovative ICT sector in our society?

Accept reality

And the stark reality is: "Spain is not a technological country, it is a service country". I think that Miguel de Unamuno's lapidary expression, “let them invent”, also applies to the IoT. But it is one thing not to invent, and quite another to become mere sellers of the products, solutions or services of well-known multinationals.

We must use our ingenuity, talent, creativity, and customer orientation to design and develop quality, easy-to-use global IoT solutions.

If we are good sellers of foreign products, language should not be the problem. Our target market should not be our city, our region or our country; our market must be the world.

Focus, Focus and Focus

I have insisted in many forums that in Spain we cannot do everything in IoT. For example, we can be leaders in Smart Cities, but we will have little chance of success in Connected Cars; we must fight to find a niche in Industry 4.0 (also known as the Industrial Internet or IIoT), but I fear we will not be number 1 in Wearables, although we could be innovative in Health services.

We must analyse our strengths and weaknesses to recognize where our opportunities are and what our threats are. Let us be references in our focus areas.

Trusted Ecosystems

We know that there is not a single company in the world that can do everything in IoT, much less lead it, so it is obvious that our companies and startups have no choice but to create, or be part of, trusted ecosystems and collaborative projects in the focus areas to meet the challenges posed by IoT projects.

We must design new sustainable business models with our local partners; it is time to trust one another if we want to survive in this competitive and fragmented sector until the magic year of 2020.

It is time for real collaboration; putting a logo on our presentations and our website is absurd if there is nothing real behind it.

Specialization

Given the size of Spanish IoT companies, it is not possible to do everything and get it all right.

We must specialize, whether manufacturing specific hardware, developing software or offering services in our focus areas.

Scalability

To succeed in IoT, Spanish companies must be able to offer global and scalable solutions. We will need startup talent to come together in companies of a larger size, without giving up innovation and agility, so that they are able to cope with large national and international IoT projects.

Expecting to be subcontracted by other subcontractors of a company that works for the end customer is not acceptable if we really want things to change. It is a pending issue of our business model, not only in technology; it is a deep-seated problem of corporate culture.

We should be able to have at least one unicorn in IoT. And I'm not talking about Telefonica, Banco Santander, BBVA, Iberdrola, Inditex, ACS, Ferrovial or Indra, but about a company that provides a new IoT-as-a-Service (IoTaaS) model based on our strengths (which all or almost all of us know): services and hardware/software IoT products from Spanish manufacturers. That is, we must think about having our own Uber, our own Airbnb or, why not, our own Spanish Tesla.

We must seek the consolidation of companies in the focus areas to achieve the size that allows the scalability the IoT business needs.

Invest in Education and Training

The IoT is complex, although many try to make it simple. We will need many types of profiles and not just theoretical knowledge.

It is vital, at both the private and public levels, that public administrations and companies dedicate funds to continuously educate students and train employees in IoT technologies.

 “Investing now in IoT training will be key to ensuring a sustainable future for our companies, our country and our professionals.”

 Start Now

This advice goes to both Enterprises and Public Administrations.

In the case of Enterprises, it would be highly desirable to lose, once and for all, the fear of being the first to implement technology solutions. You must consider IoT a key element in the digitization process of your company.

Public Administrations, stop spending your budgets as you always have, and think about investing in more sustainable, intelligent and connected citizens.

To conclude, drawing on a proverb, I think:

"We have the wicker, so we must have confidence that we can make a great basket in IoT".

You can read the Spanish version here.

Thanks in advance for your Likes and Shares

Thoughts? Comments?

Read more…
Today, with the digitization of everything, 80 percent of the data being created is unstructured.
Audio, video, our social footprints, the data generated from conversations between customer service reps, and the vast amounts of legal document text processed in the financial sector are all examples of unstructured data stored in Big Data systems.
Organizations are turning to natural language processing (NLP) technology to derive understanding from the mass of unstructured data available online and in call logs.
Natural language processing (NLP) is the ability of computers to understand human speech as it is spoken. NLP is a branch of  artificial intelligence that has many important implications on the ways that computers and humans interact.  Machine Learning has helped computers parse the ambiguity of human language.
Apache OpenNLP, the Natural Language Toolkit (NLTK) and Stanford NLP are some of the open-source NLP libraries used in the real-world applications below.
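To make this concrete, here is a minimal, illustrative sketch (my own example, not from the original post) using NLTK, one of the open-source toolkits named above, to tokenize a sentence and tag parts of speech:

```python
# Minimal NLTK sketch: tokenize a sentence and tag parts of speech.
# Assumes NLTK is installed (pip install nltk); the downloads fetch the
# tokenizer and tagger models on first run.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

text = "Natural language processing helps computers understand human speech."
tokens = nltk.word_tokenize(text)   # split the sentence into words
tags = nltk.pos_tag(tokens)         # tag each word with its part of speech

print(tokens)
print(tags)
```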
Here are multiple ways NLP is used today:
The most basic and well known application of NLP is Microsoft Word spell checking.
Text analysis, also known as sentiment analytics, is a key use of NLP. Businesses are most concerned with comprehending how their customers feel emotionally and using that data to improve their service.
Email filters are another important application of NLP. By analyzing the emails that flow through their servers, email providers can calculate the likelihood that an email is spam based on its content, using Bayesian or Naive Bayes spam filtering.
Call center representatives engage with customers and hear lists of specific complaints and problems. Mining this data for sentiment can lead to incredibly actionable intelligence that can be applied to product placement, messaging, design, or a range of other use cases.
Google and Bing and other search systems use NLP to extract terms from text to populate their indexes and to parse search queries.
Google Translate applies machine translation technologies in not only translating words, but in understanding the meaning of sentences to provide a true translation.
Many important decisions in financial markets use NLP by taking plain text announcements, and extracting the relevant info in a format that can be factored into algorithmic trading decisions. E.g. news of a merger between companies can have a big impact on trading decisions, and the speed at which the particulars of the merger, players, prices, who acquires who, can be incorporated into a trading algorithm can have profit implications in the millions of dollars.
Since the invention of the typewriter, the keyboard has been the king of the human-computer interface. But today, voice recognition via virtual assistants like Amazon's Alexa, Google Now, Apple's Siri and Microsoft's Cortana responds to vocal prompts and does everything from finding a coffee shop to getting directions to the office, as well as tasks like turning on the lights at home or switching on the heat, depending on how digitized and wired-up our lives are.
Question Answering - IBM Watson is the most prominent example of question answering via information retrieval that helps guide in various areas like healthcare, weather, insurance etc.
It is therefore clear that natural language processing plays a very important role in new human-machine interfaces. It is an essential tool for leading-edge analytics and is the near future.
Read more…

Guest blog post by Olga Kolesnichenko

What is Big Data: data, a process of analysis, or a concept? There are many definitions that describe Big Data as a large amount of data or as methods for analyzing large amounts of data. A more applicable approach is that Big Data is a concept that includes: data with specific characteristics (the 3 Vs of volume, velocity and variety, or 5 Vs adding value and veracity), methods of analytics (the number of different software tools keeps growing), devices and infrastructure, and, most important, the ideas for how to configure all of this into the needed solution.

Another concept is the Internet of Things, which is based on Big Data Analytics. There are some established configurations of IoT: Smart Home, Smart Health, Smart Manufacturing, Smart City, Smart Mobility, Smart Energy, Smart Farming, Smart Earth & Ocean, Smart Circular Economy.

Smart Health, or the Internet of Health (like any IoT configuration), has the human at the core of the concept. I should point out that it is easier to adopt medical approaches in the different configurations of IoT than to adopt IoT approaches in Health Care. Why can I insist on this statement? My statement leans on the long and complex history of accepting biorhythmology and gravitational biology in medicine.

But biorhythmology and gravitational biology have a direct application to the Internet of Health, or IoT. A person monitors his or her own medical data day by day during everyday life, and this new situation should be viewed as medical data being collected while gravitational forces and natural biorhythms influence that person.

Three sections of multifactorial regulation of the human body should be mentioned: environmental, behavioral and homeostatic. The environmental section includes the circannual (annual) rhythm and the circadian (daily) rhythm. The behavioral section includes body orientation with respect to gravitational forces (lying down, standing, sitting); active movement (walking, jogging, exercise); passive movement (lifts and transport) with the influence of acceleration forces; as well as sleep, emotional reactions and eating. The homeostatic section includes the processes of neurohumoral regulation of the body; it consists of the functional systems of the body described by the Russian scientist K.V. Sudakov and his followers.

Thus, when creating an Internet of Health configuration and implementing Big Data Analytics, the medical data should be considered in terms of these three sections of multifactorial regulation of the human body.

Read more…
Analytics and Big Data have disrupted many industries, and now they are on the edge of scoring major points in sports. Over the past few years, the world of sports has experienced an explosion in the use of analytics.
Until a few years ago, experience, gut feeling and superstition traditionally shaped the decision-making process in sports.
It first started with Oakland Athletics' general manager Billy Beane, who applied analytics to selecting the right players. This was the first known use of statistics and data to make decisions in professional sports.
Today, every major professional sports team either has an analytics department or an analytics expert on staff.  From coaches and players to front offices and businesses, analytics can make a difference in scoring touchdowns, signing contracts or preventing injuries.
Big-name organizations such as the Chicago Cubs and Golden State Warriors are realizing that this is the future of sports and that it is in their best interest to ride the wave while everyone else is still learning how to surf.
The Golden State Warriors, for example, have used big data sets to help owners and coaches recruit players and execute game plans.
SportVu has six cameras installed in NBA arenas to track the movements of every player on the court and the basketball 25 times per second. The data collected provides a wealth of innovative statistics based on speed, distance, player separation and ball possession that teams use to improve in upcoming games.
Adidas miCoach app works by having players attach a  wearable device to their jerseys. Data from the device shows the coach who the top performers are and who needs rest. It also provides real-time stats on each player, such as speed, heart rate and acceleration.
The Patriots developed a mobile app called Patriots Game Day Live, available to anyone attending a game at Gillette Stadium. With this app, they try to anticipate the wants and needs of fans and to offer special content, in-seat concession ordering and bathroom wait times.
FiveThirtyEight.com provides more than just baseball coverage. It has over 20 journalists crunching numbers so fans can gain a better understanding of an upcoming game, series or season.
Motus' new sleeves track a pitcher's throwing motion, measuring arm stress, speed and shoulder rotation. The advanced data generated from this helps improve a player's health, performance and career. Experts can now predict with greater confidence if and when a pitcher with a certain throwing style will get injured.

In the recent Cricket World Cup, every team had its own team of data analysts. They used various technologies like cloud platforms and visualizations to predict scores, player performance, player profiles and more. Around 40 years' worth of Cricket World Cup data is being mined to produce insights that enhance the viewer's experience.
Analytics can advance the sports fans' experience as teams and ticket vendors compete with the at-home experience -- the better they know their fans, the better they can cater to them.
This collection of data is also used for internet ads, which can help with the expansion and growth of your organization through social media platforms or websites. Analytics can also help answer questions such as:
  • What would be the most profitable food served at the concession stand?
  • What would be the best prices to sell game day tickets?
  • Which player on the team is the most productive?
  • Which players in the draft will become all-stars, and which ones will be considered role players?
  • How can we understand fans' behavior at the stadium via the team's app and push relevant information accordingly?
In this  Digital age, Analytics are the present and future of professional sports. Any team that does not apply them to the fullest is at a competitive disadvantage.
Read more…

The Untapped Potential of Data Analytics

The potential of big data just keeps growing. To take full advantage of it, companies need to incorporate analytics into their strategic objectives.

A research report from McKinsey Global Institute (MGI), suggests that the opportunity and applications continue to expand in the data-driven world.

With rapid technological transformation, the question for businesses arises on how to position themselves uniquely in the world leveraging analytics. Over 2.5 quintillion bytes of data is generated every day. As information pours in via various digital platforms, VR application, and mobile phones the need for data storage capacity has increased.

The transformational potential

The recent progress shows the potential of big data and analytics in more than five distinct domains. However, transforming to a data-driven decision-making organisation is not always simple.

The first challenge is to incorporate data and analytics, along with business objectives, into a core strategic vision. The second is the lack of talent for adopting analytics; recent reports note that, despite training programs, the supply of talent does not match the demand. The next step is to develop the right business processes and framework, which includes data infrastructure.

Simply combining technology systems along with the existing business operations isn't enough. For ensuring a successful transformation, all aspects of business activity need to be evaluated and combined to realize the full potential of data analytics.

Incorporating data analytics

The next generation of analytic tools will unleash even bigger opportunities. With new machine-learning, deep-learning and artificial-intelligence capabilities, an enormous variety of applications can be enabled which provide customer service, manage logistics and analyze data.

Technology and productivity gains seem an advantage, but they also carry the risk of people losing jobs. One case of automation is the AI software developed by Bridgewater Associates, the world's largest hedge fund, to improve efficiency.

With Data and analytics shaking up every industry, the effects will only become more noticeable as adoption reaches the masses.

As machines gain unprecedented capabilities to solve complex problems, organizations can harness these capabilities to create their unique value proposition and solve problems.

 

Read more…
Today’s organizations feel the fear of becoming a dinosaur every day. New disrupters are coming into your industry and turning everything upside down.
Customers are more demanding than ever and will abandon a service that is too slow to respond. Everything is needed yesterday to keep customers happy.
Now there is no time for organizations to implement huge enterprise applications that take months or years.
What they need are smaller, more agile, hyper-focused teams working together to innovate and deliver customer value.
This is where microservices have gained momentum and are fast becoming the go-to solution for enterprises. They take SOA a step further by breaking every component into an effectively single-purpose application.
Microservices offer a strategy for decomposing a large project, along functional lines, into smaller, more manageable pieces. While a monolithic app is one big program with many responsibilities, microservice-based apps are composed of several small programs, each with a single responsibility.
Microservices are independently developed and deployable, small, modular services. Each component is developed separately, and the application is then simply the sum of its constituent components. Each service runs as its own process and communicates with other components via very lightweight methods like HTTP/REST with JSON.
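As a purely illustrative sketch (not taken from any of the companies mentioned below), here is what a tiny single-purpose service can look like using Flask: one REST endpoint, JSON in and out, running as its own process. The endpoint and data are hypothetical.

```python
# A minimal single-purpose microservice sketch using Flask.
# Assumes Flask is installed (pip install flask); the endpoint and data are
# hypothetical examples.
from flask import Flask, jsonify

app = Flask(__name__)

# In-memory "database" for this service's single responsibility: product prices.
PRICES = {"sku-001": 19.99, "sku-002": 5.49}

@app.route("/prices/<sku>", methods=["GET"])
def get_price(sku):
    if sku not in PRICES:
        return jsonify({"error": "unknown sku"}), 404
    return jsonify({"sku": sku, "price": PRICES[sku]})

if __name__ == "__main__":
    # Each microservice runs as its own process; other services would call
    # this endpoint over HTTP, e.g. GET http://localhost:5000/prices/sku-001
    app.run(port=5000)
```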
Unlike the old single huge enterprise applications, which require heavy maintenance, microservices are easy to manage.
Here are few characteristics and advantages of Microservices:
  • Very small, targeted in scope and functionality
  • Gives developers the freedom to independently develop and deploy services
  • Loosely coupled & can communicate with other services on industry wide standards like HTTP and JSON
  • API based connectivity
  • Every service can be coded in different programming language
  • Easily deployable and disposable, making releases possible even multiple times a day
  • New digital technology can be easily adopted for a service
  • Allow services to be changed as the business requires, without massive cost
  • Testing and releases are easier for individual components
  • Better fault tolerance and scale-up
There are some challenges as well, while using Microservices:
  • Incur additional testing cost at the system-integration level
  • Need to configure monitoring, alerting and similar services for each microservice
  • Services call one another, so tracing the path and debugging can be difficult
  • Each service communicates through API/remote calls, which carry more overhead
  • Each service generates its own log, so there is no central log monitoring by default
Netflix has a great microservices architecture that receives more than one billion calls every day, from more than 800 different types of devices, to its streaming-video API.
Nike, the athletic clothing and shoe giant, and now a digital brand, uses microservices in its apps to deliver an extraordinary customer experience.
Amazon, eBay are other great examples of Microservices architecture.
GE’s Predix - the  industrial Internet platform is based on Microservices architecture.
So, if your IT organization is implementing a microservices architecture, here are some examples of the building blocks involved: an operating system (Linux, Ubuntu, CoreOS), container technology (Docker), a scheduler (Swarm, Kubernetes) and a monitoring tool (Prometheus).
The technical demands of digital transformation, with front- and back-office systems seamlessly coordinating customer experiences in a digital world, are best met by microservices as the preferred architecture.
Microservices help close the gap between business and IT; they are a fundamental shift in how IT approaches software development and are absolutely essential in Digital Transformation.
Read more…

Many industry experts and consumers are pointing to the Internet of Things (IoT) as the next Industrial Revolution, or the next Internet.

Why? Simple: IoT will be the future form of interaction between businesses, governments, consumers and the physical world.

The most recent studies indicate that in 2020 more than 34 billion devices will be connected to the internet, in many sectors (Industrial, Agriculture, Transportation, Wearable Devices, Smart Cities, Smart Houses, etc).

Of these 34 billion, IoT will account for 23 billion devices; the other 11 billion will be conventional devices such as smartphones, tablets and smartwatches.

[Graph: IoT device growth forecast. Source: BI Intelligence]

The business sector will account for the largest share of these devices, since IoT can reduce operational costs, increase production and expand businesses into new market niches.

Government will take the second biggest share of connected devices, in smart cities, speeding up public processes and improving citizens' quality of life.

Last but not least, home users will have plenty of IoT devices: smart houses, wearable devices and more.

So the future can really be summed up in a few words: "The future is Data".

Read more…

Do you know how powerful real-time analytics is?

In today's Digital age, the world has become smaller and faster.
Global audio and video calls, once available only in corporate offices, are now available to the common man on a smartphone.
Consumers have more product information and comparisons than the manufacturers, at any time, in any place and on any device.
Gone are the days when organizations used to load data into their data warehouse overnight and make decisions based on BI the next day. Today organizations need actionable insights faster than ever before to stay competitive, reduce risks, meet customer expectations, and capitalize on time-sensitive opportunities – real-time or near real-time.
Real-time is often defined in microseconds, milliseconds, or seconds, while near real-time in seconds, minutes.
With real-time analytics, the main goal is to solve problems quickly as they happen, or even better, before they happen. Real-time recommendations create a hyper-personal shopping experience for each and every customer.
The  Internet of Things (IoT) is revolutionizing real-time analytics. Now, with sensor devices and the data streams they generate, companies have more insight into their assets than ever before.
Several industries are using this streaming data and putting real-time analytics to work:
·         Churn prediction in Telecom
·        Intelligent traffic management in  smart cities
·        Real-time surveillance analytics to reduce crime
·        Impact of weather and other external factors on stock markets to take trading decisions
·        Real-time staff optimization in hospitals based on patient load
·        Energy generation and distribution based on smart grids
·         Credit scoring and  fraud detection in financial & medical sector
Here are some real world examples of real-time analytics:
·        City of Chicago collects data from 911 calls, bus & train locations, 311 complaint calls & tweets to create a real-time geospatial map to cut crimes and respond to emergencies
·        The New York Times pays attention to their reader behavior using real-time analytics so they know what’s being read at any time. This helps them decide which position a story is placed and for how long it’s placed there
·        Telefonica the largest telecommunications company in Spain can now make split-second recommendations to television viewers and can create audience segments for new campaigns in real-time
·        Invoca, the call intelligence company, is embedding IBM Watson cognitive computing technology into its Voice Marketing Cloud to help marketers analyze and act on voice data in real-time.
·        Verizon now enables artificial intelligence and machine learning, predicting the customer intent by mining unstructured data and correlations
·        Ferrari, Honda & Red Bull use data generated by over 100 sensors in their Formula One cars and apply real-time analytics, giving drivers and their crews the information they need to make better decisions about pit stops, tire pressures, speed adjustments and fuel efficiency.
Real-time analytics helps get the right products in front of the people looking for them, or offer the right promotions to the people most likely to buy. For gaming companies, it helps in understanding which types of individuals are playing which game, and crafting an individualized approach to reach them.
As the pace of data generation and the value of analytics accelerate, real-time analytics is the top most choice to ride on this tsunami of information.
More and more tools, such as Cloudera Impala, AWS, Spark and Storm, offer the possibility of real-time processing of Big Data and provide analytics.
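As a rough sketch of what such stream processing can look like, the snippet below uses Spark Structured Streaming (one of the tools named above) to compute a one-minute average over a stream of sensor readings; the host, port and field names are my own assumptions, not anything from the original post.

```python
# Sketch: one-minute average over streaming sensor readings with
# Spark Structured Streaming. Host, port and the JSON schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col, window, avg
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("realtime-sensors").getOrCreate()

schema = (StructType()
          .add("device_id", StringType())
          .add("temperature", DoubleType())
          .add("event_time", TimestampType()))

# Read newline-delimited JSON records from a socket (Kafka would be typical in production).
raw = (spark.readStream.format("socket")
       .option("host", "localhost").option("port", 9999).load())

readings = raw.select(from_json(col("value"), schema).alias("r")).select("r.*")

# Average temperature per device over 1-minute windows.
rolling = (readings
           .groupBy(window(col("event_time"), "1 minute"), col("device_id"))
           .agg(avg("temperature").alias("avg_temp")))

query = rolling.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```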

Now is the time to move beyond just collecting, storing & managing the data to take rapid actions on the continuous streaming data – Real-Time!! 

Read more…

Originally posted by Vincent Granville

It's time again to share your predictions for 2017. I did my homework and came with these 10 predictions. I invite you to post your predictions in the comment section, or write a blog about it. Ramon Chen's predictions are posted here, while you can read Tableau's prediction here. Top programming languages for 2017 can be found here. Gil Press' top 10 hot data science technologies is also worth reading. For those interested, here were the predictions for 2016. Finally, MariaDB discusses the future of analytics and data warehousing in their Dec 20 webinar.

My Predictions

  1. Data science and machine learning will become more mainstream, especially in the following industries: energy, finance (banking, insurance), agriculture (precision farming), transportation, urban planning, healthcare (customized treatments), even government.
  2. Some, with no familiarity with data science, will want to create a legal framework about how data can be analyzed, how the algorithms should behave, and to force public disclosure of algorithm secrets. I believe that they will fail, though Obamacare is an example where predictive algorithms were required to ignore metrics such as gender or age, to compute premiums, resulting in more expensive premiums for everyone.
  3. The rise of sensor data - that is, IoT - will create data inflation. Data quality, data relevancy, and security will continue to be of critical importance.
  4. With the rise of IoT, more processes will be automated (piloting, medical diagnosis and treatment) using machine-to-machine or device-to-device communications powered by algorithms relying on artificial intelligence (AI), deep learning, and automated data science. I am currently writing an article that describes the differences between machine learning, IoT, AI, deep learning and data science. You can sign-up on DSC to make sure that you won't miss it. 
  5. The frontier between AI, IoT, data science, machine learning, deep learning and operations research will become more fuzzy. Statistical engineering will be present in more and more applications, be it machine learning, AI or data science. 
  6. Many systems will continue to not work properly. The solution will have to be found not in algorithms, but in people. Read my article Why so many Machine Learning Implementations Fail. An example is Google analytics, which fails to catch huge amounts of robotic traffic that is so rudimentary and so obvious, you don't need any statistical or data science knowledge to filter it or block it. People publish elementary solutions to address these issues, yet it continues unabated. Fake reviews, fake news, undetected hate speech on Twitter, undetected plagiarism by Google search, are in the same category. Eventually it leaves room for new players to jump in and build a system that will actually work. 
  7. Reliance on public data and public news will come with bigger scrutiny. Some say that the failure to predict the elections is a data science failure. In my opinion, it is a different type of failure: it is the failure to recognize that the media are biased (they publish whatever predictions that fit with their agenda) and maybe even those doing the surveys are biased or incompetent (there are lies, damn lies, and statistics as the saying goes). It is also a failure to recognize the very high volatility in these elections, and the fact that day-to-day variations were huge. Anyone able to compute sound confidence intervals that incorporates historical data,  would have said that the results were not reliably predictable. Finally, I always thought that the winner would be the one best able at manipulation and playing tricks, be it hacking or paying the media.
  8. More and more data cleaning, pre-processing, and exploratory data analysis will be automated. We will also face more unstructured data, with powerful ways to structure them.  Multiple algorithms and models will be more and more blended together to provide the best pattern recognition and predictive systems, and boost accuracy. 
  9. Data science education will evolve, with perhaps a come back of strong university curricula run by leading practitioners, and fewer people finding a job through data science camps only, as many of these camps do not train you to become a data scientist, but instead a Python / R / SQL coder with classic, elementary, even outdated and dangerous statistical knowledge. Or data camps will have to evolve, or otherwise risk becoming another kind of Phoenix university.
  10. Attacks against data-dependent infrastructure will switch from stealing or erasing data, to modifying data. Some will be launched from IoT devices if security holes are not fixed.


Read more…

Fail fast approach to Digital Transformation

Digital Transformation is changing the way customers think & demand new products or services.
Today bank accounts are opened online, insurance claims are filed online and patients' health is monitored online, while buying things online is already old news. Everything is here and now, in real time.
Until a few years ago, any failure of decision making in business was scary and unacceptable; it cost companies their place on the Fortune 100 list. Blockbuster, Nokia, Kodak and BlackBerry are well-known examples of companies that did not try new experiments quickly.
But with the digital era, failure is accepted & it is seen as part and parcel of a successful digital business. Failure must be fast, and the lessons of failure learned, should be even faster. It allows businesses to take a shotgun approach to digital transformation.
Fail fast is all about deploying quick pilots and checking the outcome. If something does not work, drop the concept or idea and move on to a new one. Be prepared to change pace or direction as necessary.
No business will undergo digital transformation without making any mistakes. Even if an organization has the best possible culture and strategy in place, there will be stumbling blocks on the road to success. With digital technologies like Cloud, Big Data, Analytics, Mobility and the Internet of Things at their disposal, organizations can test innovative ideas quickly, even before reaching out to customers for feedback.
Speed is of the essence here: testing ideas without making huge investments, then delivering applications in weeks, not months or years, to remain competitive. This change has helped organizations reduce the time to market of customer-experience enhancements.
Apple is an example of a company which failed but didn’t give up. It moved on, refined its approach, improved its R&D and eventually launched the product its customers deserved.
Domino's bounced back from customer comments like “your pizza tastes like cardboard”. With a reboot of its menu in 2009 and digital technology, it experimented with online ordering and created a tracker that allowed customers to follow their pizza from the oven to their doorstep.
Air New Zealand went from posting the largest corporate loss in its country's history to being one of the world's most consistently profitable airlines by using Big Data Analytics to enhance the customer experience in many ways, including biometric baggage check-in and an electronic “air band” for unaccompanied minors.
There are several individual examples of failures and success over time:
·        Steve Jobs was fired from Apple but came back as CEO and made history
·        Thomas Edison failed over 10,000 times before the success of the light bulb
·        J.K. Rowling faced many failures before Harry Potter succeeded
·        Michael Jordan succeeded after repeated failures to win
But organizations do not have that kind of time on their hands. They can learn a lot from these individual failures, but they must move on quickly and achieve success in Digital Transformation.
In Digital Transformation, fail fast is not an option but it is a requirement!!
Read more…
Machine learning, which learns from the data provided to its algorithms, is the foundation for today's insights on customers, products, costs and revenues.
Some of the most common examples of machine learning are Netflix's algorithms, which suggest movies based on ones you have watched in the past, or Amazon's algorithms, which recommend products based on what other customers have bought before.
Typical algorithm or model selection can be decided broadly on the following questions:
·        How much data do you have, and is it continuous?
·        Is it a classification or a regression problem?
·        Are the variables predefined (labeled), unlabeled or a mix?
·        Are the data classes skewed?
·        What is the goal: to predict or to rank?
·        Is result interpretation easy or hard?
Here are the most used algorithms for various business problems:
 
Decision Trees: Decision tree output is very easy to understand, even for people from a non-analytical background. It does not require any statistical knowledge to read and interpret. Decision trees are one of the fastest ways to identify the most significant variables and the relationships between two or more variables, and they are excellent tools for helping you choose between several courses of action. The most popular decision tree algorithms are CART, CHAID and C4.5.
In general, decision trees can be used in real-world applications such as:
·        Investment decisions
·         Customer churn
·        Banks loan defaulters
·        Build vs Buy decisions
·        Company mergers decisions
·        Sales lead qualifications
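As a hypothetical illustration (the post does not prescribe a specific library), here is a tiny decision tree for a loan-default style decision using scikit-learn, on made-up data:

```python
# Sketch: decision tree classifier on made-up loan data with scikit-learn.
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [income_k, debt_ratio, years_employed]; label: 1 = defaulted.
X = [[35, 0.60, 1], [80, 0.20, 8], [50, 0.45, 3], [95, 0.15, 12], [28, 0.70, 0]]
y = [1, 0, 1, 0, 1]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["income_k", "debt_ratio", "years_employed"]))
print(tree.predict([[60, 0.30, 5]]))   # classify a new applicant
```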
 
Logistic Regression: Logistic regression is a powerful statistical way of modeling a binomial outcome with one or more explanatory variables. It measures the relationship between the categorical dependent variable and one or more independent variables by estimating probabilities using a logistic function, which is the cumulative logistic distribution.
In general, regressions can be used in real-world applications such as:
·        Predicting the Customer Churn
·         Credit Scoring &  Fraud Detection
·        Measuring the effectiveness of marketing campaigns
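A minimal, illustrative churn-prediction sketch with logistic regression in scikit-learn, using made-up features (my own example, not from the post):

```python
# Sketch: logistic regression for churn prediction on toy data (scikit-learn).
from sklearn.linear_model import LogisticRegression

# Features: [monthly_charges, support_calls, tenure_months]; label: 1 = churned.
X = [[70, 4, 3], [30, 0, 40], [90, 6, 2], [40, 1, 28], [85, 5, 5], [25, 0, 60]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)
# Estimated probability of churn for a new customer.
print(model.predict_proba([[75, 3, 6]])[0][1])
```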
 
Support Vector Machines: Support Vector Machine (SVM) is a supervised machine learning technique that is widely used in pattern recognition and classification problems - when your data has exactly two classes.
In general, SVM can be used in real-world applications such as:
·        detecting persons with common diseases such as diabetes
·        hand-written character recognition
·        text categorization – news articles by topics
·        stock market price prediction
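For example, a small, hypothetical text-categorization sketch with a linear SVM in scikit-learn (documents and labels are invented):

```python
# Sketch: SVM classifier for a two-class text task (scikit-learn), toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

docs = ["stocks rally on earnings", "team wins championship game",
        "market falls as rates rise", "coach praises star player"]
labels = ["finance", "sports", "finance", "sports"]

clf = make_pipeline(TfidfVectorizer(), LinearSVC())  # TF-IDF features + linear SVM
clf.fit(docs, labels)
print(clf.predict(["shares drop after weak earnings"]))
```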
 
Naive Bayes: This is a classification technique based on Bayes' theorem that is very easy to build and particularly useful for very large data sets. Along with its simplicity, Naive Bayes is known to outperform even highly sophisticated classification methods. It is also a good choice when CPU and memory resources are a limiting factor.
In general, Naive Bayes can be used in real-world applications such as:
·         Sentiment analysis and text classification
·        Recommendation systems like Netflix, Amazon
·        To mark an email as spam or not spam
·        Facebook like face recognition
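A minimal spam-filtering sketch with Naive Bayes in scikit-learn, on made-up messages (illustrative only):

```python
# Sketch: Naive Bayes spam filter on a handful of made-up messages (scikit-learn).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = ["win a free prize now", "meeting moved to 3pm",
            "free money claim now", "lunch tomorrow?"]
labels = ["spam", "ham", "spam", "ham"]

spam_filter = make_pipeline(CountVectorizer(), MultinomialNB())  # word counts + NB
spam_filter.fit(messages, labels)
print(spam_filter.predict(["claim your free prize"]))
```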
 
Apriori: This algorithm generates association rules from a given data set. Association rule implies that if an item A occurs, then item B also occurs with a certain probability.
In general, Apriori can be used in real-world applications such as:
·        Market basket analysis, like Amazon's "products purchased together"
·        Autocomplete functionality, like Google suggesting words that often appear together
·        Identifying drugs and their effects on patients
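As an illustration, here is a toy market-basket sketch; it assumes the mlxtend library for the Apriori implementation, which is my choice rather than anything named in the post, and the basket data is invented:

```python
# Sketch: association rules from toy basket data, using the mlxtend library
# (pip install mlxtend pandas); library choice and thresholds are assumptions.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

baskets = [["bread", "milk"], ["bread", "diapers", "beer"],
           ["milk", "diapers", "beer"], ["bread", "milk", "diapers", "beer"]]

# One-hot encode the transactions, then mine frequent itemsets and rules.
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit_transform(baskets), columns=te.columns_)

frequent = apriori(onehot, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```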
 
Random Forest: A random forest is an ensemble of decision trees. It can solve both regression and classification problems with large data sets. It also helps identify the most significant variables among thousands of input variables.
In general, Random Forest can be used in real-world applications such as:
·        Predict patients for  high risks
·        Predict  parts failures in manufacturing
·        Predict loan defaulters
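A small, hypothetical sketch of a random forest flagging high-risk patients with scikit-learn (features and labels are invented):

```python
# Sketch: random forest for flagging high-risk patients on toy data (scikit-learn).
from sklearn.ensemble import RandomForestClassifier

# Features: [age, num_conditions, prior_admissions]; label: 1 = high risk.
X = [[72, 4, 3], [45, 1, 0], [80, 5, 4], [50, 2, 1], [66, 3, 2], [38, 0, 0]]
y = [1, 0, 1, 0, 1, 0]

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(forest.predict([[70, 3, 2]]))     # classify a new patient
print(forest.feature_importances_)      # which variables matter most
```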
The most powerful form of machine learning being used today is called “Deep Learning”.
In today’s Digital Transformation age, most businesses will tap into machine learning algorithms for their operational and customer-facing functions.
Read more…

Guest blog post by Sandeep Raut



Digital Transformation is helping all the corners of life and healthcare is no exception.
Patients discharged from the hospital are given verbal and written instructions regarding their post-discharge care, but many of them are readmitted within 30 days for various reasons.
Over the last five years, this 30-day readmission rate has been almost 19%, with over 25 billion dollars spent per year.

In October 2012 the Centers for Medicaid and Medicare Services (CMS) began penalizing hospitals with the highest readmission rates for health conditions like acute myocardial infarction (AMI), heart failure (HF), pneumonia (PN), chronic obstructive pulmonary disease (COPD) and total hip arthroplasty/total knee arthroplasty (THA/TKA).


Various steps to reduce the readmission:

  • Send the patient home with 30-day medication supply, wrapped in packaging that clearly explains timing, dosage, frequency, etc
  • Have hospital staff make follow-up appointments with patient's physician and don't discharge patient until this schedule is set up
  • Use Digital technologies like Big Data & IoT to collect vitals and keep up visual as well as verbal communication with patients, especially those that are high risk for readmission. 
  • Kaiser Permanente & Novartis are using Telemedicine technologies like video cameras for remote monitoring to determine what's happening to the patient after discharge
  • Piedmont Hospital in Atlanta provides home care on wheels like case management, housekeeping services, transportation to the pharmacy and physician's office          
  • Use of Data Science algorithms to predict patients with high risk of readmission
  • Walgreens launched WellTransitions program where patients receive a medication review upon admission and discharge from hospital, bedside medication delivery, medication education and counseling, and regularly scheduled follow-up support by phone and online.
  • HealthLoop is a cloud based platform that automates follow-up care keeping doctors, patients and care-givers connected between visits with clinical information that is insightful, actionable, and engaging.
  • Propeller Health, a startup company in Madison, has developed an app and sensors that track medication usage and then send time and location data to a smartphone
  • Mango Health for iPhone and wearables like the Apple Watch makes managing your medications fun, easy and rewarding. App features include dose reminders, drug interaction info, a health history and, best of all, points and rewards just for taking your medicines.
These emerging digital tools enable health care organizations to assess and better manage who is at risk for readmission and determine the optimal course of action for the patients.

Such tools also enable patients to live at home, in greater comfort and at lower cost, lifting the burden on themselves and their families.
Digital is helping mankind in all ways !!
Read more…

Digital Transformation in Utilities sector

It is easy to take for granted the technology we have at our disposal. We flick a switch and the lights go on, we turn on the tap and clean water comes out. We don’t have to worry about gas for cooking. 
But today the Utilities industry is under pressure to simultaneously reduce costs and improve operational performance.
The Utilities sector has been a bit later to digital innovation than Retail, Banking or Insurance. With energy getting on the digital bandwagon through online customer engagement, smart sensors and better use of analytics, utilities are now adopting it too.
Digital technology gives utility companies the opportunity to collect much richer, customer level data, analyze it for service improvements, and add new services to change the way customers buy their products.
Smart technology will be used to monitor home energy usage, to trigger alerts when previously established maximum limits are being reached, and to offer ‘time of use’ tariffs that reward consumers for shifting demand from peak times. 
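As a purely illustrative sketch of the alerting idea (not from the original post), checking each day's smart-meter reading against a customer-defined limit can be as simple as this; the readings and the limit are hypothetical:

```python
# Sketch: flag home energy usage that exceeds a customer-defined limit.
MAX_DAILY_KWH = 30.0   # limit previously established by the customer

daily_readings_kwh = {"2017-01-01": 21.4, "2017-01-02": 33.9, "2017-01-03": 28.7}

for day, kwh in daily_readings_kwh.items():
    if kwh > MAX_DAILY_KWH:
        # In a real deployment this would push a mobile alert to the customer.
        print(f"ALERT {day}: usage {kwh} kWh exceeded the {MAX_DAILY_KWH} kWh limit")
```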
Electricity is the most versatile and widely used form of energy and global demand is growing continuously.  Smart grids manage the electricity demand in sustainable, reliable and economic manner.
Advantages of Digital Transformation:
  • Digital makes customer self-service easy.
  • Digitally engaged customers trust their utilities.
  • Customer care, provided through digital technology, offers utilities both cost-to-serve efficiencies and improved customer intimacy.
  • Digital technology brings the capability to provide more accurate billing and payment processing, as well as faster response times for changing addresses and bills, removing and adding services, and many other functions
  • Using Mobile as a primary customer engagement channel for tips and alerts
  • Predictive maintenance with outage maps and real time alerts to service engineer helps reduce the downtime and costs
  • Smart meters allow utility organizations to inform their customers about their energy consumption and tailor products and services to them, while achieving significant operational efficiencies at the same time

Meridian, a New Zealand energy company, launched PowerShop, an online energy retail market place that gives customers choice and control over how much power they buy and use. This helped Meridian attract online consumers and extend its reach of core retail offering.
Google’s Nest, an IoT-enabled energy-efficiency management product, gives details about consumption patterns and better control.
Thames Water, UK’s largest provider of water uses digital for remote asset monitoring to anticipate equipment failures and respond in near real time.
Big Data analytics and actionable intelligence give a competitive advantage through gained efficiency.
IBM Watson, with its cognitive computing power, has helped utilities with trend and pattern analysis, predicting which assets or pieces of equipment are most likely to cause points of failure.
Today more than ever, utilities companies are asking: “How can we be competitive in this digital world?” People, whether they are customers, citizens or employees, increasingly expect a simple, fast and seamless experience. 
Read more…

Guest blog post by Shay Pal

This article has been contributed by Alain Louchez (Georgia Tech Research Institute)

The Internet of Things already integrates a new phase beyond prescriptive analytics. 

There is no shortage of attention lately on the “Internet of Things”. As a case in point, see the “Developing Innovation and Growing the Internet of Things Act” or “DIGIT Act”, i.e., S. 2607, a  bill introduced in the Senate on March 1, 2016 and amended on September 28, 2016,  “to ensure appropriate spectrum planning and inter-agency coordination to support the Internet of Things” – A companion bill, H.R. 5117, was introduced in the House of Representatives on April 28, 2016.

However, since there is no “internet” dedicated to “things”, it is fair to state that the Internet of Things does not exist as such.  We are left with a definitional vacuum, but it is hammering the obvious to acknowledge that there is no dearth of attempts around the world to fill the gap. Perhaps as a helpful shortcut, we could view the expression as a metaphor that captures the arrival of almost anything and everything, until now out of scope, into the communications space.


While the proposed DIGIT Act sees the Internet of Things as referring to the “growing number of connected and interconnected devices”, we could argue that the smartness of those devices is what makes IoT truly unique; hence the “interconnection of intelligent things” may be a more accurate descriptor.

In sum, the term “Internet of Things” heralds the advent of what we could call a true “pulsating world” arising from sending data from and to smart devices.

Extracting information, and therefore value, from the data captured at the edge of the Internet of Things network is increasingly becoming a core focus of IoT solution providers.

As the Internet of Things is bound to become a gargantuan reservoir of data, its close interaction with data analytics has been well documented in academia and business.

Charles McLellan writing for a special feature of ZDNet in March 2015 on the Power of IoT and Big Data: Unlocking the Power, agrees that “the IoT will massively increase the amount of data available for analysis by all manner of organizations,” but cautions that “the volume, velocity and variety (not to mention variable veracity) of IoT-generated data makes it no easy task to select or build an analytics solution that can generate useful business insight.”

Business Analytics frameworks such as IBM’s and Gartner’s are widely used to chart the course of the business insight extraction. The former identifies three analytic capabilities, i.e., descriptive, predictive and prescriptive while the latter inserts a diagnostic stage between descriptive and predictive. Table 1 summarizes both frameworks:

Table 1: Existing Business Analytics Categories

Business Analytics Category | IBM | Gartner
Descriptive Analytics | "What has happened?" | "What happened?"
Diagnostic Analytics | N.A. | "Why did it happen?"
Predictive Analytics | "What could happen?" | "What will happen?"
Prescriptive Analytics | "What should we do?" | "How can we make it happen?"

 

Extending the data analytics discussion to the Internet of Things (a.k.a. “the analytics of things”), Professor Thomas Davenport, in a March 2016 Data Informed article, wonders When Will the Analytics of Things Grow Up? He expresses the concern that “most of the ‘analytics of things’ thus far have been descriptive analytics – bar (and Heaven forbid, pie) charts, means and medians, and alerts for out-of-bounds data,” and highlights areas where business analytics can make a difference in IoT beyond descriptive (dashboard-type report on performance) such as diagnostic (alerts that need attention), predictive (e.g., breakdown potential) and prescriptive (recommendations based on predictions, experiments, or optimizations).

Prescriptive has often been described as the “final frontier” of data analytics. Yet, or perhaps because of this, a recent academic paper observes that “there are very limited examples of good prescriptive analytics in the real world” (1). The same paper proposes an overall definition “in general, prescriptive solutions assist business analysts in decision-making by determining actions and assessing their impact regarding business objectives, requirements, and constraints. For example, what if simulators have helped provide insights regarding the plausible options that a business could choose to implement in order to maintain or strengthen its current position in the market.” Let’s note in passing the “assisting” role of said solutions. Indeed advising outcomes and providing recommendations are presently associated with this final phase of business analytics. This seems to somewhat contradict the notion of “prescription” since its very meaning implies imposition of one single direction or rule. However, the current generally-accepted use gives room for interpretation or non-compliance.

Professor Davenport’s conclusion in the above-mentioned article anticipates the need for another type of prescription in the IoT world: “In some IoT environments such as smart cities analytics will need to provide automated prescriptive action /…/ In such settings, the amount of data and the need for rapid decision making will swamp human abilities to make decisions on it.”

Davenport’s prescient clarity is strengthened when we realize that, in the IoT space, the future, with “automated prescriptive action”, is actually already here (e.g., IoT devices with actuators, industrial robots, etc.). Consequently, we might have to recognize a possible fifth phase (a new final frontier?), i.e., normative analytics (or automated prescription), whose conceptual tenets lean on system engineering. For all intents and purposes, normative analytics is a self-adaptive system, which adjusts dynamically to changes in external and internal conditions (a point of comparison could be rocket launch technologies).

While not yet mainstream, the following two examples with life and death consequences, i.e., driverless cars and autonomous robots for surgery, show the interactive analytics continuum.

Data analysis for driverless cars (or more generally, unmanned vehicles) has to go through at least five stages, i.e., 1) descriptive (e.g., wear and tear of a multitude of components); 2) diagnostic (e.g., what’s wrong?); 3) predictive (e.g., impact of traffic jams, weather, detours); 4) prescriptive (based on previous analytics layers, what acceptable options does the vehicle have?) and finally 5) a normative stage (what the vehicle must do). All these phases must happen concurrently and seamlessly; there is no time to delay the final decision, i.e., the car must stop or move. The consequences of a delay, however minute, could be catastrophic.
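To make that five-stage continuum concrete, here is a toy sketch (my own illustration, not the author's) in which each stage is a function and the normative stage executes the action automatically; all field names, thresholds and signals are invented:

```python
# Toy sketch of the five analytics stages for an autonomous vehicle.
# All field names, thresholds and signals are invented for illustration.

def descriptive(telemetry):
    # "What has happened?" - summarize raw sensor data
    return {"brake_wear_pct": telemetry["brake_wear_pct"],
            "obstacle_distance_m": telemetry["obstacle_distance_m"]}

def diagnostic(state):
    # "Why did it happen / what is wrong?" - flag conditions needing attention
    return {"worn_brakes": state["brake_wear_pct"] > 80,
            "obstacle_close": state["obstacle_distance_m"] < 25}

def predictive(issues):
    # "What could happen?" - crude collision-risk estimate
    risk = 0.9 if issues["obstacle_close"] else 0.1
    return risk + (0.05 if issues["worn_brakes"] else 0.0)

def prescriptive(risk):
    # "What should we do?" - rank acceptable options
    return ["emergency_brake", "swerve"] if risk > 0.5 else ["maintain_speed"]

def normative(options):
    # "What must happen?" - execute the top option with no human in the loop
    action = options[0]
    print("Executing:", action)
    return action

telemetry = {"brake_wear_pct": 85, "obstacle_distance_m": 18}
state = descriptive(telemetry)
issues = diagnostic(state)
options = prescriptive(predictive(issues))
normative(options)  # in a real vehicle all five stages run concurrently and continuously
```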

The same can be said of surgical robots. For instance, a team of researchers at the Sheikh Zayed Institute for Pediatric Innovation within the Children’s National Health System, and Johns Hopkins University recently announced that they had developed a “Smart Tissue Autonomous Robot” (STAR) for soft tissue surgery, which is “a difficult task for a robot given tissue deformity and mobility” (2). Quite remarkably, they claim that they have demonstrated that “supervised autonomy with STAR not only is feasible but also, by some metrics, surpasses the performance of accepted surgical procedures including RAS (Robot-Assisted Surgery), LAP (Laparoscopy), and manual surgery” and that “autonomous surgery can bring better efficacy, safety, and access to the best surgical techniques regardless of human factors, including surgeon experience.” Is there a better example of automated and integrated analytics intimately connected with action?

The above two IoT-related use cases exemplify the smooth transition from the virtual to the physical world through analytics and may be used as a template/model for business analytics (see Table 2).

Table 2: Analytics Categories with a New “Final Frontier”

Business Analytics Category | Output | Category Differentiator | Human Factor
Descriptive Analytics | "What has happened?" (Inform) | "It is what it is" | Except for the selection of statistical techniques and areas of interest, does not shape the output
Diagnostic Analytics | "Why did it happen?" (Understand) | "This is what needs attention" | Except for the selection of statistical techniques and areas of interest, does not shape the output
Predictive Analytics | "What could happen?" (Forecast) | "This is what the future may look like" | Except for the selection of statistical techniques and areas of interest, does not shape the output
Prescriptive Analytics | "What should we do?" (Advise) | "These are the optimal options we have" - Possible delay between analysis and decision | Shapes the output, but post-analysis - Human decides on execution of potential actions
Normative Analytics | "This is what must happen." (Execute) | Based on predictive analytics, the next step is automatically triggered and instrumented - No delay between analysis and decision/action | Shapes the output, with action criteria embedded in this phase - Except for monitoring, human is not involved in execution

 

 

Conclusion

Occam’s razor notwithstanding, there is a case to be made for specifying a new phase in data analytics, i.e., normative analytics, which, whether identified or not, already exists, especially in the Internet of Things universe. This is the step where, fed by the other analytics phases, action is automatically triggered and executed.

It would be hard to merge prescriptive analytics (as currently defined in the analytics space) with the proposed normative analytics. There will always be situations where the need for injecting human input into the final decision preempts process automation. In any event, prescriptive analytics remains the necessary step that must be gone through and successfully passed before normative analytics kicks in.

Note that this model can be applied to any Cyber-Physical System (CPS), whatever its size. The National Institute of Standards and Technology (NIST) defines Cyber-Physical Systems as “smart systems that include engineered interacting networks of physical and computational components,” and underscores that “CPS and related systems (including the Internet of Things (IoT) and the Industrial Internet) are widely recognized as having great potential to enable innovative applications and impact multiple economic sectors in the worldwide economy”. It also foresees that CPS’ new capabilities will fuse with other important evolutions of technologies such as big data analytics, which “are expected to bring transformational changes to economies, societies, our knowledge of the world, and ultimately the way people live” (3).

Dr. Frey and Professor Osborne’s abundantly-quoted study on computerization serves as a reminder that the broad diffusion of automated prescription might just be around the corner. According to their estimate, “47 percent of total US employment is in the high risk category, meaning that associated occupations are potentially automatable over some unspecified number of years, perhaps a decade or two” (4).

This prospect cannot materialize without normative analytics resting on optimization techniques. The irony is that as high technology increasingly pervades all economic functions, non-technology considerations become part and parcel of the whole framework. Policy, regulation, ethical and other critical perspectives need to be factored in the optimization calculation.

Among its many applications, normative analytics could open the door to a new regulatory domain, which has been called “governance by things” (5) or “algorithmic regulation” (6), i.e., a new way of implementing rules and norms through analytics not, however, without misgivings (7).

Regardless of the attractiveness of this new concept, the introduction of normative analytics as an integral phase of the data analytics spectrum must be further validated, and prudence advocated the same way William Vorhies not too long ago cautioned against the possible groundless addition of prescriptive analytics in the data analytics nomenclature, i.e., “with about 1/3rd of companies having yet adopted predictive analytics the last thing we need is introducing ‘the next big thing’ into that conversation unless there is some real value to be had” (8).

-------------------------------------

(1)    Uthayasankar Sivarajah, Muhammad Mustafa Kamal, Zahir Irani, and Vishanth Weerakkody, “Critical analysis of Big Data challenges and analytical methods”, Journal of Business Research, Elsevier, August 10, 2016, available at http://www.sciencedirect.com/science/article/pii/S014829631630488X

 

(2)    Azad Shademan, Ryan S. Decker, Justin D. Opfermann, Simon Leonard, Axel Krieger, and Peter C. W. Kim, May 4, 2016, “Supervised autonomous robotic soft tissue surgery”, Science Translational Medicine 8 (337), available at http://stm.sciencemag.org/content/8/337/337ra64.full

 

(3)    See Framework for Cyber-Physical Systems – Release 1.0 – May 2016, document prepared by the Cyber-Physical Systems Public Working Group (CPS PWG), an open public forum established by the National Institute of Standards and Technology (NIST), available at https://s3.amazonaws.com/nist-sgcps/cpspwg/files/pwgglobal/CPS_PWG_Framework_for_Cyber_Physical_Systems_Release_1_0Final.pdf

 

(4)    Carl Benedikt Frey and Michael A. Osborne, “The Future of Employment: How Susceptible are Jobs to Computerization?”, September 17, 2013, Oxford Martin School, University of Oxford, available at http://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf

 

(5)    Wolfgang Schulz and Kevin Dankert, “‘Governance by Things’ as a challenge to regulation by law”, Internet Policy Review, Volume 5, Issue 2, June 30, 2016, available at http://policyreview.info/articles/analysis/governance-things-challenge-regulation-law

 

(6)    Brett Goldstein, “When Government Joins the Internet of Things”, The New York Times, September 8, 2013, available at http://www.nytimes.com/roomfordebate/2013/09/08/privacy-and-the-internet-of-things/when-government-joins-the-internet-of-things

 

(7)    Evgeny Morozov, “The Rise of Data and the Death of Politics”, The Guardian (UK), July 19, 2014, available at https://www.theguardian.com/technology/2014/jul/20/rise-of-data-death-of-politics-evgeny-morozov-agorithmic-regulation

 

(8)    William Vorhies, “Prescriptive versus Predictive Analytics - A Distinction without a Difference?”, October 23, 2014, Data Science Central, available at http://www.datasciencecentral.com/profiles/blogs/prescriptive-versus-predictive-analytics-a-distinction-without-a

 

 

The views expressed in this article are solely the author’s and do not necessarily represent those of the Georgia Institute of Technology (“Georgia Tech”), the Georgia Tech CDAIT members, the University System of (U.S. State of) Georgia or the (U.S.) State of Georgia.

Alain Louchez is the Managing Director of the Center for the Development and Application of Internet of Things Technologies (CDAIT pronounced “sedate”) at the Georgia Institute of Technology (“Georgia Tech”), Atlanta, Georgia. CDAIT’s purpose is to expand and promote the Internet of Things (IoT)’s huge potential and transformational capabilities through research, education and industry outreach. The Center is sponsored by global companies headquartered in North America, Europe, Asia and Australia.  Alain was recently selected by Instituto Tecnológico y de Estudios Superiores de Monterrey (Guadalajara) as an international advisor to “Centro de Innovación, Desarrollo Tecnológico y Aplicaciones de Internet de las Cosas” (a.k.a. “Center of Innovación in Internet of Things or CIIoT”) after bid approval by the government of Mexico in June 2016. Prior to joining Georgia Tech, Alain held various executive positions including member of the board of directors of leading companies in the high tech industry, in Europe and the United States.

Follow us @IoTCtrl | Join our Community

Read more…

Product recommendations in Digital Age

By 1994 the web had arrived, bringing the power of the online world to our doorsteps. Suddenly there was a way to buy things directly and efficiently online.
Then came eBay and Amazon in 1995: Amazon started as a bookstore and eBay as a marketplace for the sale of goods.
Since then, as the digital tsunami has flooded in, countless websites have sprung up selling everything on the web, but these two are still going strong, in large part because of their product recommendations.
We as customers love that personal touch and feeling special, whether it's being greeted by name when we walk into a store, a shop owner remembering our birthday and walking us to the aisle where a product is kept, or being able to customize a website to our needs. It can make us feel like the single most important customer. But in the online world there is no Bob or Sandra to guide you to the product you may like. This is where recommendation engines do a fantastic job.
With personalized product recommendations, you can suggest highly relevant products to your customers at multiple touch points of the shopping process. Intuitive recommendations make every customer feel as if your shop was created just for them.
Product recommendation engines can be implemented with collaborative filtering, content-based filtering, or hybrid recommender systems.
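As a rough illustration of the collaborative-filtering approach, the short Python sketch below computes item-to-item cosine similarity from a tiny, made-up purchase matrix and recommends the items most similar to what a shopper has already bought. The data, function names and scoring rule are invented for the example; production engines work on far larger matrices and layer business rules on top.

    # Minimal item-based collaborative filtering sketch (toy data, illustrative only).
    import math

    # Keys = shoppers, values = items bought (1 means the shopper bought the item).
    purchases = {
        "alice": {"laptop": 1, "mouse": 1, "keyboard": 1},
        "bob":   {"laptop": 1, "mouse": 1},
        "carol": {"keyboard": 1, "monitor": 1},
    }
    items = sorted({i for basket in purchases.values() for i in basket})

    def item_vector(item):
        return [purchases[u].get(item, 0) for u in sorted(purchases)]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    def recommend(bought, top_n=2):
        scores = {}
        for candidate in items:
            if candidate not in bought:
                scores[candidate] = max(cosine(item_vector(candidate), item_vector(b)) for b in bought)
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    print(recommend({"mouse"}))  # -> ['laptop', 'keyboard'] for this toy data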
There are various types of product recommendations:
  • Customers who bought this also bought – like Amazon
  • Best sellers in store – like Home Depot
  • Latest products or arriving soon – like GAP
  • Items usually bought together – like Amazon
  • Recently viewed, based on browsing history – like ASOS
  • Also-buy suggestions at checkout – like Lego
A product recommendation engine brings many benefits to digital marketing and can go a long way toward making customers love your website and turning it into their favorite eCommerce site to shop at.
Advantages of product recommendations:
  • Increased conversion rate
  • Increased order value due to cross-sell
  • Better customer loyalty
  • Increased customer retention rates
  • Improved customer experience
Data science is applied to analyze customer behavior and predict what future customers will like. Big data, together with machine learning and artificial intelligence, is the key to product recommendations.
Understanding the shopper's behavior across different channels is also a must for personalizing the experience. Physical retail, mobile, desktop and e-mail are the main sources of information for personalization engines.
Amazon was the first player in eCommerce to invest heavily in product recommendations. Its recommendation system is based on a number of simple elements: what a user has bought in the past, which items they have in their virtual shopping cart, items they have rated and liked, and what other customers have viewed and purchased. Amazon has used this algorithm to customize the browsing experience and pull returning customers back, and it is credited with increasing sales by over 30%.
Netflix, Yahoo, YouTube, TripAdvisor and Spotify are other well-known sites taking advantage of recommender systems. Netflix ran a famous $1 million competition from 2006 to 2009 to improve its recommendation engine.
Many commercial product recommendation engines are available today, such as Monetate, SoftCube, Barilliance and Strands.
Ultimately, the most important goal for any eCommerce platform is to convert visitors into paying customers. The era of broad customer segmentation is gone; today it is all about hyper-personalization.
Product recommendations are extremely important in the digital age!!
Read more…

How to Learn Big Data

With big data technologies now reaching into almost every line of business, a great many big data job opportunities are expected in government and across other sectors. That is why so many people are keen to learn big data: there are plenty of unfilled openings and a lucrative career ahead. The question then becomes: how do you learn big data analytics? Obviously you cannot go back to school for a new degree and years of experience, but there are options, including the many big data courses now available. Given the size of this industry, numerous colleges and universities have started putting their big data courses online for the convenience of applicants. These courses are not entirely free; you pay a certain amount depending on the course, and each course has a few prerequisites you should know about before digging into its details. First, though, let us clarify what big data actually is.

What is Big Data?

We hear the term big data frequently, yet there is still confusion and a huge gap in people's understanding of what it really means. Let us clear that up.

The common definition is that big data is data so large and complex that it cannot be handled by conventional data processing systems and architectures. There are plenty of definitions across the web that say essentially the same thing. But what does that really mean? Let us look at an example.

Say you run a small business and your company database handles all of your transactions, about 30,000 per month, on a system designed for up to 40,000 per month. What happens if you receive more than 40,000 transactions in a month? Will your system still cope? No. If you suddenly receive 3,000,000 transactions in a month, that data is big data for you, because your system cannot deal with it, but it is not big data for a company whose system handles 10,000,000 transactions a month. Big data is therefore a relative term, defined against your own architecture, and what looks big today may not be big in the future.

How to learn Big Data Analytics?

Books are a learner's best friend, so good big data analytics books are a sensible place to start. That said, analyzing big data, i.e., extremely large data sets, to reveal hidden patterns and other valuable information that supports better decisions is not something you can learn overnight. Below are a few online big data training options you can choose from to study the subject properly.

With the recent growth of big data, organizations are searching for data scientists. The field demands many skills that are hard to acquire through a routine curriculum. Familiarize yourself with the fundamentals of big data and come away equipped with practical experience in extracting value from it.

1. Udacity

2. SimpliLearn

3. EMC

4. Coursera

5. CalTech's Learning from Data

6. MIT Open Courseware

7. Jigsaw Academy

8. Stanford's OpenClassroom

9. Code School

These are a few of the courses you can take to upgrade your big data analytics skills or simply to get a basic grasp of the subject. Big data is here to stay, the field keeps growing, and organizations are desperately searching for qualified, skilled people. Pick any course from the options above to get started and build a solid understanding of big data; each has a few prerequisites you should review in order to get the most out of it.

Why Should You Learn Big Data?

Here are a few facts that illustrate how big data is growing and why it may be worth learning:

In 2015, there were around 1.9 million big data jobs in the U.S.

In the coming years, India is expected to need at least 100,000 data scientists to fill big data roles.

In the coming years, India is also expected to be the most favored destination for analytics outsourcing, ahead of the Philippines and China.

The annual compensation of a data analytics professional can be as high as Rs 6-9 lakhs, or more, depending on experience and skills.

Jobs After Learning Big Data

There are plenty of openings across different positions, and organizations are ready to pay serious money for qualified, skilled people. As noted above, in 2015 there were around 1.9 million big data jobs in the U.S., and that number is not going to come down any time soon; the scope of big data keeps expanding, and the field is considered a rewarding career choice. Here are some of the positions you can pursue after a big data course:

  • Chief Data Officer
  • Big Data Scientist
  • Big Data Analyst
  • Big Data Solutions Architect
  • Big Data Engineer
  • Big Data Researcher

Big data is undoubtedly a major opportunity for job seekers around the globe. So if you are ready to make an excellent career move, pick the field of big data.


Read more…

What is Deep Learning ?

Remember how, from childhood, you learned to recognize fruits, animals, cars and virtually any other object simply by looking at them?
Our brain gets trained over the years to recognize these images and then further classify them as apple, orange, banana, cat, dog, horse, Toyota, Honda, BMW and so on.
Inspired by these biological processes of the human brain, artificial neural networks (ANNs) were developed. Deep learning refers to artificial neural networks composed of many layers, and it is the fastest-growing field in machine learning. It uses many-layered deep neural networks (DNNs) to learn levels of representation and abstraction that make sense of data such as images, sound and text.
Why is deep learning called "deep"? Because of the structure of the ANNs. Forty years ago, neural networks were only two layers deep, as it was not computationally feasible to build larger networks. Today networks with 10+ layers are common, and even 100+ layer ANNs are being experimented with.
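To make "layers" concrete, here is a minimal, illustrative Python/NumPy sketch of the forward pass of a small deep network: each layer is just a matrix multiplication followed by a non-linearity, and "deep" simply means many such layers stacked. The layer sizes and random weights are arbitrary placeholders, not a trained model.

    # Forward pass through a stack of layers (untrained, random weights; illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    layer_sizes = [784, 128, 64, 64, 32, 10]  # five weight layers; a deeper net just adds more

    # One (weight, bias) pair per layer; in a real network these are learned from data.
    params = [(rng.standard_normal((m, n)) * 0.01, np.zeros(n))
              for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

    def forward(x):
        for i, (w, b) in enumerate(params):
            x = x @ w + b
            if i < len(params) - 1:
                x = np.maximum(x, 0.0)  # ReLU non-linearity between layers
        return x                         # raw scores for, e.g., 10 classes

    image = rng.standard_normal(784)     # stand-in for a flattened 28x28 image
    print(forward(image).shape)          # -> (10,)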
Using multiple levels of neural networks, deep learning now gives computers the capacity to see, learn and react to complex situations as well as or better than humans.
Normally, data scientists spend a lot of time on data preparation: feature extraction, or selecting the variables that are actually useful for predictive analytics. Deep learning does much of this job automatically and makes life easier.
Many technology companies have open-sourced their deep learning libraries:
  • Google’s Tensorflow
  • Facebook open source modules for Torch
  • Amazon released DSSTNE on GitHub
  • Microsoft released CNTK, its open source deep learning toolkit, on GitHub

Today we see lots of examples of deep learning around us:

  • Google Translate is using deep learning and image recognition to translate not only voice but written languages as well. 
  • With CamFind app, simply take a picture of any object and it uses mobile visual search technology to tell you what it is. It provides fast, accurate results with no typing necessary. Snap a picture, learn more. That’s it.
  • All digital assistants like Siri, Cortana, Alexa & Google Now are using deep learning for natural language processing and speech recognition
  • Amazon, Netflix & Spotify are using recommendation engines using deep learning for next best offer, movies and music
  • Google PlaNet can look at the photo and tell where it was taken
  • DCGAN is used for enhancing and completing human faces
  • DeepStereo: Turns images from Street View into a 3D space that shows unseen views from different angles by figuring out the depth and color of each pixel
  • DeepMind’s WaveNet is able to generate speech which mimics any human voice that sounds more natural than the best existing Text-to-Speech systems
  • PayPal is using H2O-based deep learning to prevent fraud in payments
So far, deep learning has advanced image classification, language translation and speech recognition, and it can be applied to almost any pattern recognition problem, all without human intervention.
Deep learning is a disruptive  Digital technology that is being used by more and more companies to create new business models.
Read more…
Remember the typical office environment of the 1990s:
  • We had our family photos pinned on the board,
  • Our contacts were written by hand and arranged in alphabetical order for easy retrieval,
  • For calls there was one black dial phone at the end of the hall,
  • Outside dialing was usually allowed only for a select few privileged seniors,
  • We used yellow Post-it notes to put our thoughts on the bulletin board,
  • Any software delivery to a customer was copied onto an 8-inch floppy disk and shipped across continents to be hand delivered.

Now fast forward to 2016: we have Twitter and blogs to post our thoughts, Pinterest and Instagram to post our photos, Facebook to talk to friends and smartphones to store our contacts; there is no more waiting to call anyone, we can make not only audio but video calls via Skype or FaceTime, and software deliveries are instant via email.
Today we live in a world of instant gratification, and digital transformation is making it happen.
Our smartphones have become more important than our spouses; we can't live without them. They do the jobs of alarm clock, camera, radio, torch, music system, map, book, news channel, credit card and language translator, and they let us play games. We can do anything and everything, from anywhere, at any time. They are no longer just communication devices; they have become our life's remote control.
Here are some examples of Instant gratification – here and now!!
UberRUSH – a delivery service from Uber that lets you talk to or chat with couriers directly and track your package in real time, instead of relying on notifications or SMS alerts.
Click-and-collect merchandise, easy multi-channel returns, free WiFi while shopping, the ability to check stock online, customer updates via beacon technology… all of these enhance the high street experience and bring it closer to real time for customers.
A customer experience experiment started at LaGuardia Airport, where food and beverage company OTG set up 300 tablet kiosks in the terminal. As a traveler, you can use a tablet to check flight status, order food, play games or shop at airport stores, and whatever you order can be delivered to you at your gate. While improving the travel experience, this also creates more revenue for the restaurants and shops, and the approach has been so successful that it is being rolled out at other airports. This is instant happiness for customers.
Digital transformation is helping to reduce customer information gaps, wait times and frustrations.
"We will revert immediately" is not fast enough. Customer wants the service NOW!!
Read more…
Gone are the days when companies would set a strategy and then execute it for the next five years as planned.
Today a company's average life on the Fortune 500 or S&P 500 is just 15 years. Digital businesses like Uber and Airbnb did not exist before 2008, yet they are now multi-billion-dollar poster children for digital disruption.
Because of digital, every business has to change how it operates and interacts with its customers every day. Long-term strategies are no longer valid or sustainable, and change is a constant.
Culture is a key determinant of successful digital transformation. We can change our technologies, our infrastructure and our processes, but without addressing the human element, lasting change will not happen. Culture is the operating system of the organization; like air, it is there even though you can't see it.
It is important for leaders to understand the business's current culture in order to map the right solution and timeline for that business. No two organizational cultures are the same. Executives underestimate the importance of culture in the digital era, and most cultures are risk-averse at a time when taking risks is the most direct path to innovation.
But we have to remember that without the involvement, cooperation and feedback of the workforce, any digital transformation will struggle to maintain momentum.
Building an organizational culture for successful adoption of digital technologies like IoT, big data analytics and mobility requires everyone in the organization, from leaders to front-line employees, to be prepared to work in an open and transparent way. It is hard for an organization to undergo digital transformation if its culture is built around silos; in cases like these, the culture needs to be addressed before the transformation process can begin.
Culture leads the adoption of technology. The ability to innovate depends on the impatience of the organizational culture. Organizations have to build the culture and community, making the time for people to share experiences, test and learn what works, brainstorm and collaborate.
It takes time to develop a digital culture; the sooner a company acts, the more quickly it will be in a position to compete in this fast-paced, digitized,  multichannel world.
Southwest Airlines, in operation for more than 40 years, brought in culture change and empowered employees to go Digital and help customers.
Imagine how GE, which is more than 130 years old and operates in more than 175 countries, is now on a quest for cultural change to become a leader in digital and the Industrial Internet of Things.
Coca Cola has reinvented itself with culture change by focusing on digital natives while offering more than 100 flavored drinks.

For digital transformation, culture is the topmost enabler. Without people, tools won't make any difference!!
Read more…