


If you follow news about the Internet of Things, you will have read quite a few articles that attempt to predict the number of connected devices by the year 2020.

The chart displayed above, from iot-analytics.com, gives a nice comparison of the predictions that major IoT players like Cisco and Ericsson, as well as IT research companies like Gartner and IDC, have made recently. The range for 2020 is between 18 billion and 50 billion.

The first thing we notice in the chart is that they don’t all start in the same place! The starting point for 2014 ranges from 6 billion to 14 billion. That’s like having 10 stock analysts predict the future price of a stock in 5 years, with all 10 showing a different current price on their Bloomberg terminals. Certainly a cause for worry. The second key point is that the rate of growth varies tremendously - from 14% to 23%. If this were a discounted cash flow model, and these ranges were being used to predict sales growth, the model wouldn’t have much validity, would it?
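Those growth rates are compound annual growth rates (CAGR) implied by each start/end pair. A quick sketch of the formula - note the pairings below just bracket the chart's extremes; the exact 14%-23% range depends on which analyst's 2014 start is paired with which 2020 end:

```python
# Implied compound annual growth rate between a 2014 starting estimate
# and a 2020 prediction: (end / start) ** (1 / years) - 1.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Chart extremes, in billions of devices (illustrative pairings).
low = cagr(6, 18, 6)    # 6 B in 2014 growing to 18 B in 2020
high = cagr(14, 50, 6)  # 14 B in 2014 growing to 50 B in 2020
print(f"{low:.1%} to {high:.1%} per year")
```

Even the tamest pairing implies roughly 20% compounded growth for six straight years, which is exactly why the spread between analysts matters so much.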

So why don’t we, for the fun of it, try to do the math on this one? Let's start by getting one thing straight - we will try to calculate the maximum number of connected devices by 2020. Not the mean or median or most likely - the maximum. This will make our choice of numbers way easier - we will choose the highest plausible number for each part of our calculation. Okay, let's get started.

Part I - The Humans and their Toys

There are currently 7.4 billion people on Earth (http://www.worldometers.info/world-population/), and 3.36 billion of them are connected to the Internet (http://www.internetworldstats.com/stats.htm). In addition, there are around 2 billion smartphones in the wild - give or take a few million (http://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/). So - using the “maximum” adage we discussed earlier - let’s assume the following:

1) By 2020, each person who currently owns a smartphone will also own a laptop or some other kind of personal computer, including tablets. (This is NOT true, especially in Sub-Saharan Africa and Latin America, where smartphones tend to be the only device people own, but it’s safe to say this represents a reasonable maximum.)

2) The whole population of the Earth will have access to the internet by 2020. Again - who the hell knows - but it’s a good maximum. The world population will be around 7.7 billion by then (http://www.worldometers.info/world-population/) - at least that is the highest number I could find from a reputable-looking source.

And now for the real wild assumptions. Let’s say that the ratio of A) people who own a smartphone to B) people who have internet access will stay the same. Now that requires an intellectual leap of faith. We know internet access will not go down, but smartphone access might hit a peak at some point before 2020. Here we are assuming it doesn’t: more people get internet, and thus more people get smartphones. Now - when we combine B) with 1) and 2), we are saying that (2/3.36)*7.7 = 4.58 billion people will have a smartphone and a personal computer by 2020. So this brings us to 9.16 billion connected devices that people use to access the internet by 2020.
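The Part I arithmetic, spelled out (the inputs are the article's own figures, not independent estimates; the variable names are mine):

```python
# Part I: scale today's smartphone-to-internet-user ratio up to the
# projected 2020 population, then double it (one phone + one PC each).
internet_users_now = 3.36e9   # people online today
smartphones_now = 2.0e9       # smartphones in use today
population_2020 = 7.7e9       # projected world population in 2020

smartphone_owners_2020 = smartphones_now / internet_users_now * population_2020
human_devices_2020 = 2 * smartphone_owners_2020
print(f"{smartphone_owners_2020/1e9:.2f} B owners, "
      f"{human_devices_2020/1e9:.2f} B devices")
```

(The article's 9.16 billion comes from rounding 4.58 billion before doubling; carrying full precision gives 9.17 billion - the difference is noise at this level of assumption.)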

Part II - The Industrial IoT

But what about the Industrial Internet of Things? Arduino, Raspberry Pi, sensors made by Ericsson, routers made by Cisco, drones, cars, and planes? Well, we can’t calculate that one based on the population of Earth. But what about silicon chip manufacturing? Let’s make some more assumptions:

3) This one is massive. Let’s say 35% of all silicon chip shipments in 2020 will go into some kind of IoT device (not including those used by people directly). The Raspberry Pi 2 has a 900 MHz quad-core ARM Cortex-A7 processor. Now suspend disbelief and imagine that every single factory floor in the world that makes silicon chips and processors will be rolling out this processor’s descendants in 2020. According to SEMI, in Q3 2015 there were 2,591 Millions of Square Inches (MSI) of silicon materials shipped. In June 2015, Freescale Semiconductor revealed the i.MX 6Dual SCM processor, which measures 17 mm × 14 mm (http://www.zdnet.com/article/freescale-launches-smallest-ever-dime-sized-iot-processor/). That is 0.66929 inches × 0.551 inches = 0.3688 square inches. At that size, you can make about 7.025 billion (2,591 million ÷ 0.3688) of these processors from one quarter’s worth of silicon shipments.

4) Great, so let’s say this becomes the norm in 2020. If these IoT chips represented 35% of all the silicon chips produced in the world, that would be 0.35 × 7.025 billion ≈ 2.46 billion in one quarter, and 0.35 × 28.1 billion = 9.835 billion in one year.
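The Part II arithmetic as a sketch - quarterly silicon shipments divided by one small IoT package's footprint, annualized, then scaled by the 35% assumption (figures from the article; the small rounding drift versus the article's 7.025 and 9.835 comes from carrying the package area at full precision):

```python
# Part II: how many 17 mm x 14 mm IoT processors fit into a year's
# silicon shipments, and what a 35% industrial-IoT share implies.
MM_PER_INCH = 25.4
msi_per_quarter = 2591  # million sq in of silicon shipped (SEMI, Q3 2015)
chip_area = (17 / MM_PER_INCH) * (14 / MM_PER_INCH)  # ~0.3688 sq in

chips_per_quarter = msi_per_quarter / chip_area  # in millions
chips_per_year = 4 * chips_per_quarter
industrial_iot = 0.35 * chips_per_year
print(f"{chips_per_quarter/1e3:.2f} B chips per quarter, "
      f"{industrial_iot/1e3:.2f} B industrial IoT chips per year")
```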

5) For the sake of keeping this article at less than a million pages, let's say that the number in 4) will be the number of Industrial IoT chips we have in the wild in 2020.

Great. We are done! Adding the results from Parts I and II, we get: 9.16 billion + 9.835 billion = 18.995 billion connected devices by 2020.

Hey that’s just barely above the lowest number on the chart! Either our calculations are too conservative or everyone else is too optimistic.

Of course, you could make the argument that using 35% in assumption 3) is a bit arbitrary. Granted. But given that WSTS said smartphones and computers alone accounted for 65% of all semiconductors in 2014 (blog.semiconductors.org), it doesn’t seem like such a crazy assumption. Going with the mantra of reaching the absolute maximum number given somewhat reasonable assumptions, we could say 50% (gasp!) of all semiconductor production will go towards industrial IoT in 2020, which would lead to 14.05 + 9.16 = 23.21 billion devices. That’s still about 5 billion less than the second-lowest estimate (from IT research group IDC) on the chart.
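The sensitivity of the headline number to that single share assumption is easy to check - Part I's 9.16 billion human devices are held fixed, and 28.1 billion is the annual chip count from Part II:

```python
# How the 2020 total moves as the industrial-silicon share assumption
# changes, holding the Part I human-device count fixed.
human_devices = 9.16   # billions, from Part I
chips_per_year = 28.1  # billions of chips per year, from Part II

for share in (0.35, 0.50):
    total = human_devices + share * chips_per_year
    print(f"{share:.0%} industrial share -> {total:.3f} B devices")
```

Every 10 points of industrial share adds only about 2.8 billion devices, so even aggressive assumptions can't bridge the gap to the 50 billion headline.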

Conclusion: If someone tells you there will be 50 Billion connected devices by 2020, tell them to read this article.

Originally posted on Data Science Central

Follow us @IoTCtrl | Join our Community


It’s still early days for the IoT, but every day a little part of its burgeoning ecosystem becomes a factor in our lives, whether we know it or not. From industrial tools to farming to cities to grocery aisles and everything in between, the IoT is there. Here are 10 IoT case studies that show just where some of these technologies and applications are being applied.

1) Bytes and Bushels - Farming on an Industrial Scale

Farming and IoT seem to be the leading implementations on an industrial scale. I wrote on this last year, but the two New York Times pieces on Tom Farms, a multi-generation, family-owned farm in northern Indiana, are still among the most comprehensive, and personal, IoT case studies I’ve seen to date. And it’s not just words - be sure to watch the multimedia video. Stories are here and here.

2) The Tesla IoT Car: Case Study


MITCNC, the MIT Club of Northern California, is the regional alumni club of the Massachusetts Institute of Technology. They have a blog at https://blogmitcnc.org/ where they post on emerging trends and discoveries in science and technology. Channeling their best Car & Driver reviewer, while keeping their propeller hats on to look at IoT, data, privacy, and security, this is a unique look at the most talked-about car this century. Story here.

3) GE’s Big Bet on Data and Analytics

Here’s a timely new case study from MIT Sloan Management Review that looks at how GE is seeking opportunities in the Internet of Things with industrial analytics. GE is leading the development of a new breed of operational technology (OT) that literally sits on top of industrial machinery. Long known as the technology that controls and monitors machines, OT now goes beyond these functions by connecting machines via the cloud and using data analytics to help predict breakdowns and assess the machines’ overall health. I’m really glad to see someone dive into this, as I think GE’s big swing is not yet fully appreciated. It soon will be. Case study here.

4) Can a Cow be an IoT Platform?

One of my favorite stories on the IoT is penned by Bill Vorhies, President & Chief Data Scientist at Data-Magnum. It’s been on IoT Central for a while now, but I thought it important to include in this collection. Bill’s report recaps a talk by Microsoft’s Joseph Sirosh - a surprising conversation about a farmer’s dilemma, a professor’s ingenuity, and how cloud, data, and devices came together to fundamentally re-imagine an age-old way of doing business. You can read Bill’s post here or watch the entertaining video below.

5) Global Smart Cities

In 2013, the UK government’s Department for Business, Innovation and Skills commissioned a study that looked at six global cities that are paving the way in smart city investment. It looked at how Chicago, Rio De Janeiro, Stockholm, Boston, Barcelona and Hong Kong tackled particular challenges when responding to the opportunities that a ‘smart city’ and private sector innovators might bring. Worth a read. Case study is here.


Photo courtesy of TVILIGHT BV

6) PTC Thingworx - All Traffic Solutions

ThingWorx, a PTC company, has an IoT platform designed to build and run IoT applications, enabling customers to transform their products and services, innovate, and unlock new business models. They have a plethora of case studies, but one that caught my eye was on All Traffic Solutions. The company has been at the forefront of the wireless market for over a decade and now sells its traffic safety products throughout the United States and in 20 countries globally. That reach has provided a good deal of field-based insight which, over the last five years, All Traffic Solutions has channeled into developing innovative new web-based and IoT-connected signs that are incredibly smart yet simple to use, adding significant value to the company’s hardware for its customers. Case study here.

7) Stanley Black and Decker

Managing a complex manufacturing facility is a challenge and this case study from Cisco showcases how Stanley Black & Decker operates one of its largest tool manufacturing plants in Reynosa, Mexico, which serves the North American market. Opened in 2005, the Reynosa plant primarily manufactures dozens of products, such as jigsaws, planers, cordless drills, floodlights, and screwdrivers for the DeWALT brand and lawnmowers for the Black & Decker brand. With 40 multiproduct manufacturing lines and thousands of employees, the plant produces millions of power tools each year. This case study shows how IoT technologies help with production visibility and flexibility. Case study here. Great video below.

8) SLAC National Accelerator Laboratory


Since its opening in 1962, SLAC National Accelerator Laboratory has been helping create the future. Six scientists have been awarded Nobel prizes for work done at SLAC, and more than 1,000 scientific papers are published each year based on research at the Palo Alto-based lab. The team is now working on a future plan to take data from all intelligent sensors that monitor the vast systems at SLAC and feed the data into the cloud where it can be processed, analyzed, and delivered back to control engineers. Case study here.

9) The Supermarket of the Future: Designing for humans

It’s not just about technology, but applying technology to improve the human experience. This case study on Italy’s biggest grocery cooperative shows how it might be done...and I like it. Coop Italia’s “supermarket of the future,” designed by Carlo Ratti, has won rave reviews, thanks to a digital design that created a more human shopping experience using a range of off-the-shelf technology. Read more about it here.

10) IoT for Electronic Medical Records


The need to cut costs, improve medical care, and adopt electronic medical records (EMR) is driving hospitals to implement information technology solutions that streamline procedures such as billing, medical imaging, and electronic medical records processing. This case study from Intel shows how its partner NEXCOM developed a medical informatics solution based on Internet of Things technologies to help overcome communication barriers between medical devices and IT networks. The solution turns medical device data into electronic medical records and sends them to the hospital’s private cloud, where data analytics can be performed to better evaluate a patient’s condition. Read more about it here.


Matt Turck, a venture capitalist at FirstMark, has mapped out the Internet of Things Landscape for 2016.

Matt notes "The IoT today is largely at this inflection point where “the future is already here but it is not evenly distributed”. From ingestibles, wearables, AR/VR headsets to connected homes and factories, drones, autonomous cars and smart cities, a whole new world (and computing paradigm) is emerging in front of us. But as of right now, it just feels a little patchy, and it doesn’t always look good, or work great – yet."

The chart above is great, but it's his thoughtful and detailed blog post that's definitely worth your time. He covers the booming investment, the seemingly glacial pace for the end user, jockeying by large corporations, and what it all means for start-ups. 



10 Big Data Use Cases Everyone Must Read

Guest blog post by Bernard Marr

What do you think of when you think of "big data"?

For many, it's a nebulous term that conjures images of huge server farms humming away. Or perhaps you think of receiving some kind of personalized advertisement from a retailer.

But big data is so much deeper and broader than that. I believe there are 10 major areas in which big data is currently being used to excellent advantage in practice - but within those arenas, data can be put to almost any purpose.

1. Understanding and Targeting Customers

This is one of the biggest and most publicized areas of big data use today. Here, big data is used to better understand customers and their behaviors and preferences. Companies are keen to expand their traditional data sets with social media data and browser logs, as well as text analytics and sensor data, to get a more complete picture of their customers. The big objective, in many cases, is to create predictive models.

You might remember the example of U.S. retailer Target, which is now able to very accurately predict when one of its customers is expecting a baby. Using big data, telecom companies can now better predict customer churn; Wal-Mart can predict what products will sell; and car insurance companies understand how well their customers actually drive.

Ski resorts are even using data to understand and target their patrons. RFID tags inserted into lift tickets can cut back on fraud and wait times at the lifts, as well as help ski resorts understand traffic patterns, which lifts and runs are most popular at which times of day, and even help track the movements of an individual skier if he were to become lost.

Imagine being an avid skier and receiving customized invitations from your favorite resort when there's fresh powder on your favorite run, or text alerts letting you know when the lift lines are shortest. They've also taken the data to the people, providing websites and apps that will display your day's stats, from how many runs you slalomed to how many vertical feet you traversed, which you can then share on social media or use to compete with family and friends.

Even government election campaigns can be optimized using big data analytics. Some believe Obama's win in the 2012 presidential election was due to his team's superior ability to use big data analytics.

2. Understanding and Optimizing Business Processes

Big data is also increasingly used to optimize business processes. Retailers are able to optimize their stock based on predictions generated from social media data, web search trends and weather forecasts.

One particular business process that is seeing a lot of big data analytics is supply chain or delivery route optimization. Here, geographic positioning and radio frequency identification sensors are used to track goods or delivery vehicles and optimize routes by integrating live traffic data, etc. HR business processes are also being improved using big data analytics.

This includes the optimization of talent acquisition - Moneyball style - as well as the measurement of company culture and staff engagement using big data tools. For example, one company, Sociometric Solutions, puts sensors into employee name badges that can detect social dynamics in the workplace. The sensors report on how employees move around the workplace, with whom they speak, and even the tone of voice they use when communicating.

One of the company's clients, Bank of America, noticed that its top performing employees at call centers were those who took breaks together. They instituted group break policies and performance improved 23 percent.

You may have seen the RFID tags you can attach to things like your phone, your keys, or your glasses, which can then help you locate those things when they inevitably get lost. But suppose you could take that technology to the next level and create smart labels that could stick on practically anything. Plus, they can tell you a lot more than just where a thing is; they can tell you its temperature, the moisture level, whether or not it's moving, and more.

Suddenly, this unlocks a whole new realm of "small data": if big data is looking at vast quantities of information and analyzing it for patterns, then small data is about looking at the data for an individual product - say, a container of yogurt in a shipment - and being able to know if it's likely to go off before it reaches the store.

This part of the Internet of Things holds incredible promise for improving everything from logistics to health care, and I believe we're still just on the cusp of understanding what this incredible technology can do - as when electricity was only used to power light bulbs.

3. Personal Quantification and Performance Optimization 

Big data is not just for companies and governments but also for all of us individually. We can now benefit from the data generated from wearable devices such as smart watches or smart bracelets. Take the Up band from Jawbone as an example: the armband collects data on our calorie consumption, activity levels, and our sleep patterns. While it gives individuals rich insights, the real value is in analyzing the collective data.

In Jawbone's case, the company now collects 60 years' worth of sleep data every night. Analyzing such volumes of data will bring entirely new insights that it can feed back to individual users.

The other area where we benefit from big data analytics is finding love - online, that is. Most online dating sites apply big data tools and algorithms to find us the most appropriate matches.

4. Improving Healthcare and Public Health

The computing power of big data analytics enables us to decode entire DNA strings in minutes and will allow us to find new cures and better understand and predict disease patterns. Just think of what happens when all the individual data from smart watches and wearable devices can be aggregated and applied to millions of people and their various diseases. The clinical trials of the future won't be limited by small sample sizes but could potentially include everyone!

Apple's new ResearchKit framework has effectively just turned your phone into a biomedical research device. Researchers can now create studies through which they collect data and input from users' phones to compile data for health studies. Your phone might track how many steps you take in a day, or prompt you to answer questions about how you feel after your chemo, or how your Parkinson's disease is progressing. It's hoped that making the process easier and more automatic will dramatically increase the number of participants a study can attract, as well as the fidelity of the data.

Big data techniques are already being used to monitor babies in a specialist premature and sick baby unit. By recording and analyzing every heartbeat and breathing pattern of every baby, the unit was able to develop algorithms that can now predict infections 24 hours before any physical symptoms appear. That way, the team can intervene early and save fragile babies in an environment where every hour counts.

What's more, big data analytics allow us to monitor and predict the developments of epidemics and disease outbreaks. Integrating data from medical records with social media analytics enables us to monitor flu outbreaks in real-time, simply by listening to what people are saying, i.e. "Feeling rubbish today - in bed with a cold".

Of course, while much has been made in the past of Google's ability to predict flu outbreaks based on search traffic, their model didn't work in 2014. Google itself admits that just because you search for "flu symptoms," it doesn't mean you're sick.

5. Improving Sports Performance

Most elite sports have now embraced big data analytics. We have the IBM SlamTracker tool for tennis tournaments; we use video analytics that track the performance of every player in a football or baseball game; and sensor technology in sports equipment such as basketballs or golf clubs allows us to get feedback (via smartphones and cloud servers) on our game and how to improve it. Many elite sports teams also track athletes outside of the sporting environment - using smart technology to track nutrition and sleep, as well as social media conversations to monitor emotional wellbeing.

The NFL has developed its own platform of applications to assist all 32 teams in making the best decisions based on everything from the condition of the grass on the field, to the weather, to statistics about an individual player's performance while in university. It is all in the name of strategy as well as reducing player injuries.

One of the really cool new things I have come across is a smart yoga mat: sensors embedded in the mat will be able to provide feedback on your postures, score your practice, and even guide you through an at-home practice.

6. Improving Science and Research

Science and research is currently being transformed by the new possibilities big data brings. Take, for example, CERN, the nuclear physics lab with its Large Hadron Collider, the world's largest and most powerful particle accelerator. Experiments to unlock the secrets of our universe - how it started and works - generate huge amounts of data.

The CERN data center has 65,000 processors to analyze its 30 petabytes of data. However, it also uses the computing power of thousands of computers distributed across 150 data centers worldwide to analyze the data. Such computing power can be leveraged to transform so many other areas of science and research.

The computing power of big data could also be applied to any set of data, opening up new sources to scientists. Census data and other government collected data can more easily be accessed and analyzed by researchers to create bigger and better pictures of our health and social sciences.

7. Optimizing Machine and Device Performance

Big data analytics help machines and devices become smarter and more autonomous. For example, big data tools are used to operate Google's self-driving car: a modified Toyota Prius fitted with cameras, GPS, powerful computers, and sensors that let it drive safely on the road without human intervention. We can even use big data tools to optimize the performance of computers and data warehouses.

Xcel Energy initiated one of the first ever tests of a "smart grid" in Boulder, Colorado, installing smart meters on customers' homes that would allow them to log into a website and see their energy usage in real time. The smart grid would also theoretically allow power companies to predict usage in order to plan for future infrastructure needs and prevent brown-out scenarios.

In Ireland, grocery chain Tesco has its warehouse employees wear armbands that track the goods they take from the shelves, distribute tasks, and even forecast completion times for jobs.

8. Improving Security and Law Enforcement

Big data is applied heavily in improving security and enabling law enforcement. I am sure you are aware of the revelations that the National Security Agency (NSA) in the U.S. uses big data analytics to foil terrorist plots (and maybe spy on us). Others use big data techniques to detect and prevent cyber attacks. Police forces use big data tools to catch criminals and even predict criminal activity, and credit card companies use big data to detect fraudulent transactions.

In February 2014, the Chicago Police Department sent uniformed officers to make "custom notification" visits to individuals they had identified as likely to commit a crime through a computer generated list. The idea was to prevent crime by providing certain individuals with information about job training programs, or let them know about increased penalties for people with certain backgrounds. But many community groups cried foul and called the practice profiling.

9. Improving and Optimizing Cities and Countries

Big data is used to improve many aspects of our cities and countries. For example, it allows cities to optimize traffic flows based on real-time traffic information as well as social media and weather data. A number of cities are currently piloting big data analytics with the aim of turning themselves into Smart Cities, where the transport infrastructure and utility processes are all joined up - where a bus waits for a delayed train, and traffic signals predict traffic volumes and operate to minimize jams.

The city of Long Beach, California is using smart water meters to detect illegal watering in real time; the meters have helped some homeowners cut their water usage by as much as 80 percent. That's vital when the state is going through its worst drought in recorded history and the governor has enacted the first-ever state-wide water restrictions.

Los Angeles uses data from magnetic road sensors and traffic cameras to control traffic lights and thus the flow (or congestion) of traffic around the city. The computerized system controls 4,500 traffic signals around the city and has reduced traffic congestion by an estimated 16 percent.  

A tech startup called Veniam is testing a new way to create mobile wi-fi hotspots all over the city in Porto, Portugal. More than 600 city buses and taxis have been equipped with wifi transmitters, creating the largest free wi-fi hotspot in the world. Veniam sells the routers and service to the city, which in turn provides the wi-fi free to citizens, like a public utility. In exchange, the city gets an enormous amount of data - with the idea being that the data can be used to offset the cost of the wi-fi in other areas. For example, in Porto, sensors tell the city's waste management department when dumpsters are full, so they don't waste time, man hours, or fuel emptying containers that are only partly full.

10. Financial Trading

My final category of big data application comes from financial trading. High-Frequency Trading (HFT) is an area where big data finds a lot of use today. Here, big data algorithms are used to make trading decisions. Today, the majority of equity trading takes place via data algorithms that increasingly take into account signals from social media networks and news websites to make buy and sell decisions in split seconds.

Computers are programmed with complex algorithms that scan markets for a set of customizable conditions and search for trading opportunities. The programs can be designed to work with no human interaction or with human interaction, depending on the needs and desires of the client.  

The most sophisticated of these programs are now also designed to change as markets change, rather than being hardcoded.

For me, the 10 categories I have outlined here represent the areas in which big data is applied the most. Of course there are so many other applications of big data and there will be many new categories as the tools become more widespread.



New IoT Trends in Manufacturing

Trends That Will Shape the Internet of Things in 2016

In a relatively short time, the Internet of Things (IoT) has grown from a niche technology into a widely embraced global phenomenon. Rapid advancements in IP technologies, as well as in IoT devices and the industries they’re used in, mean that devices can now be integrated in more ways than ever before. One sector that has strongly embraced IoT adoption is the manufacturing industry.

Offering a range of benefits, IoT will be a major force in shaping manufacturing throughout 2016 and beyond.

Manufacturers Will Become Increasingly Software Centric

Manufacturing hardware, production processes, and even operational workflows will become more reliant on software. Whether referring to the embedded apps and software within devices, or the server-side software that controls machines and automations, manufacturers that adopt IoT as part of their strategy will need to focus investment and knowledge building around software. Not only will this affect the depth and complexity of their IoT integration, but it will also mean that these manufacturers will need to procure new talent or upskill existing staff with specific IoT skillsets in IT.

Costs will Decrease, Increasing Adoption

Cost has been a significant factor for manufacturers hesitant to adopt widespread IoT systems. As IoT technologies continue to mature, implementation costs will decrease. Because IoT provides significant benefits in operational efficiency, falling prices will sway manufacturers who were previously undecided on the financial benefits of IoT.

RFID Will Be a Major Technology in Manufacturing

Research firm MarketsandMarkets has projected that RFID will be widely adopted in the manufacturing sector. There are a number of factors contributing to this, including the ability to use passive RFID chips in manufacturing at little additional cost. Among RFID technologies, NFC is expected to experience the highest growth. Manufacturers will be able to benefit from RFID tracking not only on the production floor, but also in packaging and distribution.

In case studies, such as the use of RFID to track luggage at Hong Kong International Airport, RFID tags have been shown to provide read rates of up to 97%, compared to 80% for optically read barcode tags.

North America will Lead IoT use in Manufacturing

Although China and the United States have often swapped positions at the top spot of total manufacturing output, it is the U.S. that will lead IoT implementation in manufacturing for 2016. This is mostly due to high automation, frequent technological advancements, and a history of early-adoption of new technologies. This contrasts greatly with China, where output is high, but production methods differ, favoring low-cost labor in place of high levels of automation.

This increased trend in IoT adoption is expected to benefit other areas of North American industry, such as the R&D and software sectors. Cisco Systems, Microsoft, Intel, IBM, and General Electric are all U.S. based multinationals that lead in IoT sensor and software development. German companies SAP SE, Siemens, and Bosch, are also IoT leaders that will benefit from increased demand for IoT solutions in manufacturing.


Bottom Line – IoT Shows no Signs of Slowing Down

Regardless of initial reluctance to adopt, and despite growing security concerns surrounding IoT devices, the industry as a whole shows no signs of slowing down. Research firms like Gartner have predicted that there will be almost 7 billion sensors in use by the end of 2016, and that enterprise-level software spend will total over $860bn globally.

Manufacturers will realize more efficient operations stretching from administration to production floors and even distribution. The internet of things doesn’t represent a flawless group of technologies, but it is set to be a significant part of the future of high-tech manufacturing, no matter which way you look at it.

For more information on IoT recruiting, please check out our new website www.internetofthingsrecruiting.com


How does a company break into the huge potential marketplace that is the Industrial Internet? Spending on the Industrial Internet, the convergence of industrial machines, data, and the Internet (one aspect of the Internet of Things), was $20 billion in 2012. Analysts forecast that number will reach $514 billion by 2020, creating nearly $1.3 trillion in value. MIT Sloan Management Review recently published a case study of one company that is succeeding in its effort to be a major player in this market: GE. Here are five key strategies GE is using to build its position.

Get the talent you need

Developing new digital capabilities means building a new base of talent. GE turned to consumer advertising to recruit millennials as Industrial Internet developers and to publicize GE’s massive digital transformation effort.

You may be familiar with the ads; they feature a recent college graduate, Owen, excitedly breaking the news to his parents and friends that he had just landed a computer programming job—with GE. Owen tries to tell them that he will be writing code to help machines communicate but they are puzzled; after all, GE isn’t exactly known for its software. The ad campaign is just one component of GE’s human resource strategy, which is itself part of a larger corporate strategy to transform GE into a top ten software company.

Make a platform, not a product

GE doesn’t intend to become merely a software product company. It is building a cloud-based, open platform, Predix, that will enable customers, third-party developers and GE itself to create customized software solutions across several industry sectors. The platform has open standards and protocols that allow customers to connect their machines to the Industrial Internet more easily and quickly. The platform can accommodate the size and scale of industrial data for every customer at current levels of use, but it has also been designed to scale up as demand grows. The number and variety of Predix-related apps are not limited to what GE offers: customers may develop their own custom applications for use on the Predix platform, and GE executives are working to build a developer community and create a new market for apps hosted on Predix.

Take a different sales approach

For GE, a pilot program is often an essential step of the adoption process for new customers, especially in the oil and gas sector (a focus of the case study). In early 2015, GE executed a four-week engagement with one of the largest global energy companies, which wanted to reinvent how it manages its “static” equipment. GE positioned Predix to help the customer rethink how it manages these assets, and at the end of the four weeks they developed a software solution that allowed the customer to “walk through” how their reliability engineers could use Predix to better manage their static assets.

GE executives see the pilots as a way to bring customers onto the selling team. “To get anywhere in the oil and gas industry, we need help selling. We need customer voices out in the industry with success stories, or we’re just not going to come to the table with the credibility that we need. So we need to inspire our customers to want to do that,” says one executive.

Change the pricing model

Moving forward, GE’s pricing model is changing because of Predix. In the past, GE customers would buy bundles of equipment, services and software and treat these expenses as a capital expenditure. But with Predix, GE customers are beginning to treat these purchases as an operational expense. “We’ve got some customers that don’t ever want to actually buy the equipment. They just want us to come out and get paid based on production. It’s a service contract that wraps up equipment, services, software and all the analytics,” says Ron Holsey, Digital Commercial Leader, Surface, GE Oil & Gas. Holsey thinks this new model shows that GE is putting its money where its mouth is. “If we improve, say, their power consumption by X, we get $1; by Y, we get $1.50,” he says, noting that customers are becoming more open to this type of arrangement and to sharing the data necessary to establish the baseline measurements that make the model work. “In order for us to leverage the analytics, customers are asking us to put a little bit more risk on the table ourselves, and that’s the difference that we’ve seen in the market,” says Holsey.

Make an Investment

GE has bet big on the Industrial Internet. The company is committing $1 billion to put sensors on gas turbines, jet engines, and other machines; connect them to the cloud; and analyze the resulting flow of data to identify ways to improve machine productivity and reliability. “One billion dollars represents a big swing for GE,” says Matthias Heilmann, Chief Digital Officer of GE Oil & Gas Digital Solutions. “It signals this is real, this is our future.” Indeed, GE has created an entirely new business division, GE Digital, to focus on selling Industrial Internet applications across the company’s many business divisions.

This post is adapted from the MIT SMR case study, “GE’s Big Bet on Data & Analytics,” published February 2016.

Originally posted on Data Science Central



IoT Without the Internet?

This is a guest post from Dana Blouin who writes about IoT at www.danablouin.com. You can follow him on Twitter @DanaBlouin

I recently took a trip to Northern Thailand to support a volunteer effort at a school that serves a rural hill tribe. The project was backed by three Thai startups: Knit by Jib, Drvr and Bangkok Bike Finder. All are outstanding startups that value doing social good and believe that education is a fundamental necessity.

Village 1

The school itself isn’t easy to get to. After taking an overnight train from Bangkok to Chiang Mai, we still had two days of travel ahead, including a final leg of three hours in a 4×4 truck over mountain roads and paths.

While visiting the village I wondered how I could help the school with my particular knowledge of the Internet of Things. There was just one issue: the village the school serves is completely off the grid. No power, no Internet, and to get a mobile signal you have to travel about 15 km. How do you get the benefit of IoT without actually having Internet, or power for that matter?

Well, the power issue is fairly easy to address; this is Thailand, after all, and sun is one thing we have in abundance. There are a number of low-power devices that can run on solar power and be housed in a weatherproof case of some kind.

Of course, there are solutions for this lack of connectivity: a device can cache data and send it up to the cloud later for analysis, or a mobile device can query sensors once it is in range to get real-time data.
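The cache-and-upload-later idea can be sketched in a few lines of Python. This is a minimal illustration, not code from the project; the class and field names are invented for the example.

```python
import time

class StoreAndForwardSensor:
    """Buffer sensor readings locally and upload them in a batch
    once a connection becomes available (e.g. a phone in range)."""

    def __init__(self, upload_fn):
        self.upload_fn = upload_fn  # called with a list of readings when online
        self.buffer = []

    def record(self, value):
        # Cache every reading with a timestamp, regardless of connectivity.
        self.buffer.append({"t": time.time(), "value": value})

    def flush(self, online):
        # When a link is available, push the cached batch and clear the buffer.
        if online and self.buffer:
            self.upload_fn(self.buffer)
            self.buffer = []
```

In use, `record()` runs on every sample and `flush()` is attempted periodically; nothing is lost while offline, it simply accumulates until the next successful flush.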

So the tech solutions are there; the key was finding out how IoT could help this school. I spent a lot of time exploring the area around the school and talking with the teachers to assess what the school needs to provide a better learning environment for the community it serves.

Water is really the primary concern for the school at this point. They can only get fresh water about five months out of the year. Water, being sort of essential for life and all, clearly moves to the top of the list. Knit by Jib is working on a project now that will extend how far up the stream the school’s water is sourced from, which should allow them to get clean, fresh water year round.

Just because IoT can’t physically bring the water to the school doesn’t mean it won’t have a role to play. I can envision sensors on the school’s water tanks to measure level, letting the teachers know when to open the valves to fill the tanks, and possibly sensors to check water quality. I still have more research to do on this front.

Another issue the teachers face is nutrition, as the diet of the locals is very limited. It is often the case that the only balanced meal the students get each day is prepared by the teachers at school. To this end, the teachers run a school garden where they grow food used in some of the students’ meals. An automated watering system linked to a soil moisture sensor seems like a simple project that could help here. Of course, because there is no electricity in the village, the system would have to be solar powered to operate consistently, and the whole system would need to be maintained. All are interesting challenges.
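The watering logic itself is simple enough to sketch. This assumes a moisture reading normalized to 0–1; the threshold and valve duration below are invented placeholder values that a real garden system would have to tune.

```python
MOISTURE_THRESHOLD = 0.30  # assumed cutoff; would be tuned for the garden's soil
WATERING_SECONDS = 60      # assumed valve-open duration per cycle

def watering_cycle(moisture, threshold=MOISTURE_THRESHOLD):
    """Decide how many seconds to open the valve for one reading.

    Water only when the soil is drier than the threshold; otherwise
    leave the valve closed and conserve the solar-charged battery.
    """
    return WATERING_SECONDS if moisture < threshold else 0
```

A controller would call this on a timer, read the probe, and drive the valve relay accordingly; everything else (solar charging, weatherproofing) is the hard part.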

Ultimately it comes down to how much benefit this school can get from technology projects like this. Just installing them could help a little, but I am unsure how much benefit it would really bring. I have been thinking that the biggest help these projects could offer would be as a learning experience for the children, and maybe some inspiration along the way. The point of these projects and outreach is to benefit the school in its mission to provide quality education to this remote, rural area. I have a lot more thought to put into this before I can decide what, if any, benefit technology can offer. I will be sure to post more here as I work through ideas.


The smartphone on your belt is dramatically different from the flip phones of a decade ago. Technology continues to move at incredible speed, and we are truly living in a golden age. But where we are headed is unlike where we’ve been.

In the future, the Internet of Things will be a reality in every sector. Smart systems will ship with sensors and robotics that simplify and automate manufacturing. These systems will operate over wired and wireless networks, and the supporting infrastructure will help us accomplish more during the course of a day.

This begins with physical objects built with sensors and actuators placed in them. These individual parts will send and receive information in order to complete specific tasks. They will depend on real-time data, and this information will affect the big picture. In fact, each device on the assembly line will connect to a central system that will orchestrate and synchronize the entire operation to ensure things run smoothly and as effectively as possible.
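The central-coordination pattern described above can be illustrated with a toy publish/subscribe hub. A real production line would use an industrial protocol such as OPC UA or MQTT rather than in-process callbacks; this sketch, with invented names, only shows the routing idea.

```python
from collections import defaultdict

class AssemblyLineHub:
    """Toy central coordinator: devices publish readings, and the hub
    routes each reading to subscribers interested in that device."""

    def __init__(self):
        self.subscribers = defaultdict(list)
        self.latest = {}  # last known reading per device

    def subscribe(self, device_id, callback):
        # Register interest in one device's data stream.
        self.subscribers[device_id].append(callback)

    def publish(self, device_id, reading):
        # Record the reading and fan it out to every subscriber.
        self.latest[device_id] = reading
        for cb in self.subscribers[device_id]:
            cb(device_id, reading)
```

The `latest` map is what lets an orchestrator reason about the whole line at once, while subscribers react to individual machines in real time.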


In order for smart manufacturing to work, there need to be systems in place that work with the smart manufacturing vision. Sensors must be placed in technology and a host system installed. This will help with logistics, order placement, procurement and other essential functions that impact the overall system.

So who does this? While your IT department could technically handle the task, it would be time consuming and cost hundreds of man-hours to develop. A better choice is a vendor who can help with the effort. These vendors will help create a functional system that is tightly integrated and lets you effectively manage your manufacturing operations. With new industry standards for manufacturing being released all the time, it is certain the internet of things will play a pivotal role in the future of manufacturing automation.

An example is already seen in the food and beverage industry. Machines currently communicate sensitive information like temperature, humidity and the condition of containers. Companies can also track shipments with identifying codes and determine where items originated within the company and where they were shipped around the world. In a case of contamination, they can quickly contact the locations that received potentially tainted items.

When the internet of things becomes dominant on these manufacturing lines, they will gain more power. A central master computer will run the entire operation, with an intelligent way to analyze data, address concerns and remain autonomous, all while continuing to meet the demands of production.

There is no denying the internet of things will play an important role in the future of production. Goods will be released faster and profits will rise. That makes it important to embrace it today and incorporate it into the current structure of your business. Doing so will help you be part of the future and remain a visionary in the industry.

Are you hiring ahead of the coming shift in how workers work?


A Quick History of the Internet of Things

How Did We Create Such a Rich Market?

Want to know how the "Internet of Things" became a thing at all? To do so, you must look back to the start: the birth of networking and the explosion of consumer technology.

The internet isn’t that old, at least as far as the world wide web goes. In 1974, the structure we know and love today was born. Just ten years later, the first domain name system was introduced, allowing for easier networking. The first website came online in 1991. The “internet,” as a network of connected devices in consumer homes, had been proposed only a scant two years before that, yet it came crashing into our mainstream world.

In no time the internet took over. By 1995, multiple websites and systems had come online. I remember watching crude bulletin board systems arise, then quickly be replaced by GeoCities pages and early websites. The first business webpages actually came in the form of reproduced fliers, essentially scanned and put online to promote companies. All of these new ideas came from imaginings that had taken place decades earlier.

The term “internet of things,” or “IoT,” is also not a new one. You can find references to it as far back as the idea of the Internet itself, but if you survey an IoT team, it is more than likely that few know this. The history, or at least the ideology, goes back a great deal further than most people realize. This, of course, has ramifications for the marketplace, both in how older technology companies approach the space and how traditional product introduction processes operate.

Thinkers across history could be credited with the idea, depending on the story you read. Some point to Tesla and Edison as the first to envision connected objects. Others look at the literal applications by Tim Berners-Lee and Mark Weiser, the latter of whom famously created a water fountain synced to the activity of the NYSE. The founders of Nest could also make the list; their thermostat was one of the first truly non-computer connected objects.

Even the idealism and futurism of the 1950s and 1960s anticipated Internet of Things thinking. Imagine a classic ’60s technology ad displaying the “home of the future”: everything is connected and communicating, and people are never out of reach of their day-to-day technology.

Then, of course, there is Kevin Ashton, the man who comes up when you Google “who came up with the Internet of Things.” Ashton is a frequent thinker in the space who is correctly credited with a verifiable coining of the term “Internet of Things.” Like most corporate lingo, the origin is likely impossible to pin down completely, but the idea that the term was born in a boardroom is not surprising. The leaders who would go on to actually take these objects to market in the ’90s included “traditional” players like IBM and Sony.

The point is that, no matter what route you pick to decipher the past, the rise of Internet of Things thinking was ubiquitous. From the moment “networking” arrived in everyday life, people were thinking about how it would impact our world.

1998 was a turning point in many ways. Apple returned to the market with the iMac, and the team that designed this platform would go on to design the iPhone and, most critical to IoT research, the iPod. Big-name manufacturers that had focused for most of their development on the PC were now investing in everyday objects with connectivity and technological features. The seeds of the smartphone era were planted, and with them would come the first real consumer-level IoT objects based on existing computers.

The history of IoT is extraordinarily dense, and the reading of it depends on who you ask. Question a designer at IBM in the late 1980s and you would find ideas similar to what we now call IoT in constant use. Ask an emerging startup from the early 2000s and you would find a wave of thinkers taking credit for the idea. The reality is somewhere in between: those who thought ahead about computers expected what we have today, billions of devices.

IoT has continued to grow and evolve, and projections are bright for this new methodology for using the internet. The future of IoT is now, with devices coming online every day. The world relies on connected cars, connected medical devices and even connected homes.

Companies today are scrambling to get their own IoT systems online and moving, and new recruits are being brought in every day to head up IoT systems in companies from small to large. How well do they know the history of the space and exactly how broad it can be?



When you read or hear about the Internet of Things (IoT), do you imagine that we’re not quite in an age where such a concept is able to be fully realized? Have you ever pointed towards the fragmentation in the market regarding devices and services, or even the complexity of IoT, and questioned how concepts like the connected home could be adopted on a widespread scale?

If you’re still questioning IoT at this point, then it’s possible that you’re simply not looking closely enough. Many of the products and services that you’re using are already a part of IoT.

Microsoft’s Office suite is a connected service on IoT, Apple’s ecosystem is IoT to the core, and even your late model vehicle is likely connected to IoT in some way. In the consumer world, IoT is simply the reality of all your devices being connected; from your game console, to your cellular phone, the computer in your office and on your coffee table, and even your automated home lighting, air conditioning, and garage door.

IoT as a concept was first described over 20 years ago by researchers at MIT. They spoke of a future where devices and sensors would collect and share data. There’s a reason why it is a buzzword today. Data capabilities, the decreasing cost of hardware, and the widespread adoption of the internet have made IoT possible for consumers, businesses, and large organizations across the world.

As a consumer, you’re probably already using IoT today. Your smartphone can connect to your home PC and control it remotely. You can set schedules for your cable PVR and arrive home to your favorite programs already recorded and ready to play. You can even strap a smart device to your wrist while you jog, collecting data on your heart rate and the calories you’ve burned, and even mapping a GLONASS- or GPS-tracked route of where you went.

You can then upload that data to the cloud and retrieve it later. You can share it with other people. You could even send the information to your personal trainer who can observe and advise around your exercise regime. This is what the Internet of Things is all about. For consumers, it’s all about the power of information.

IoT makes life easier. Progression has been gradual, and in many ways low key, which may be why many haven’t noticed it happening. When you used to collect your mail, there was one place to do it: your mailbox. Today, your mailbox is anywhere you go, as long as you have a connected device. We used to bank inside buildings. ATMs came later and increased the convenience. Today you can bank from a smartwatch, make payments with an NFC chip without swiping plastic, and transfer money from account to account from a smartphone or PC.

The Internet of Things has provided countless advantages to society. From smarter automated manufacturing, to biometric implants in critical care patients, IoT does more than the average person knows. Perhaps the fact that we already use IoT without even knowing it, is testament to how important, influential, and firmly embedded IoT is in our lives today.



The topic of IoT and farming keeps coming up.

Last month Steve Lohr of the New York Times wrote a fantastic piece on The Internet of Things and the Future of Farming. His colleague Quentin Hardy wrote a similar piece, albeit with a big data slant, in November 2014. If you have not yet read either article, I suggest you take the time to do so and also watch the video of IoT at work at a modern farm. It’s one of the better IoT case studies I’ve come across and shows real and practical applications and results.

Both stories highlight Tom Farms, a multi-generation, family-owned farm in northern Indiana. The Toms won’t be setting up a stand at your local farmers market to hawk their goods. With over 19,000 acres, they feed a nice portion of America and farm on an industrial scale, producing shipments of more than 30 million pounds of seed corn, 100 million pounds of corn, and 13 million pounds of soybeans each year.

As the video points out, technology, data and connectivity have gotten them to this scale. After the farm crisis of the 1980s, they doubled down and bought more land from other struggling farmers. Along the way they were proactive in researching and developing new production technologies: everything from sensors on the combine to GPS data, self-driving tractors, and iPhone apps for irrigation.

Farmers and Tablet PC

Photo Credit: Gary McKenzie on Flickr

All this technology is taking farming to a new level, in what is known as precision agriculture. The holy grail of precision agriculture is to optimize returns on inputs while preserving resources. Its most common application today is guiding tractors with GPS. But what other technologies are out there?

For that, the Wall Street Journal yesterday explored startups that put data in farmers’ hands. Startups like Farmobile LLC, Granular Inc. and Grower Information Services Cooperative are challenging data-analysis tools from big agricultural companies such as Monsanto Co., DuPont Co., Deere & Co. and Cargill Inc.

The new crop from all of these technologies is data.

This changes the economics for farmers, making them traders not just in crops but in data, potentially giving smaller operations an edge against larger competitors, like Tom Farms, that benefit from economies of scale.

With venture investment in so-called agtech startups reaching $2.06 billion in the first half of this year, there will be plenty of bytes in every bushel.

For a deep dive into Precision Agriculture, the history and the technologies behind it, I suggest registering for and reading the Foreign Affairs article, “The Precision Agriculture Revolution, Making the Modern Farmer.”


Caltrain Quantified: An Exploration in IoT

Guest blog post by Cameron Turner

Executive Summary

Though often the focus of the urban noise debate, Caltrain is one of many contributors to overall sound levels along the Bay Area’s peninsula corridor. In this investigation, Cameron Turner of Palo Alto’s The Data Guild takes a look at this topic using a custom-built Internet of Things (IoT) sensor atop the Helium networking platform.

Introduction

If you live in (or visit) the Bay Area, chances are you have experience with Caltrain, a commuter line that travels 77.4 miles between San Francisco and San Jose, carrying over 50 thousand passengers on over 70 trains daily.[1]

I’m lucky to live two blocks from the Caltrain line, and enjoy the convenience of the train. My office, The Data Guild, is just one block away. The Caltrain and its rhythms, bells and horns are part of our daily life, and connect us to the City and, via connections to BART, Amtrak, SFO and SJC, the rest of the world.

Over the holidays, my 4-year-old daughter and I undertook a project to quantify the Caltrain through a custom-built sensor and reporting framework, to get some first-hand experience in the so-called Internet of Things (IoT). This project also aligns with The Data Guild’s broader ambition to build out custom sensor systems atop network technologies to address global issues. (More on this here.)

Let me note here that this project was an exploration, and was not conducted in a manner (in goals or methodology) to provide fodder for either side of the many ongoing Caltrain debates: the electrification project, the quiet zone, or the tragic recent deaths on the tracks.

Background

My interest in such a project began with an article published in the Palo Alto Daily in October 2014. The article addressed the call for a quiet zone in downtown Palo Alto, following complaints from residents of the buildings closest to the tracks. Residents voiced many subjective frustrations based on personal experience.

According to the Federal Railroad Administration (FRA), which sets the rules by which Caltrain operates, train engineers “must begin to sound train horns at least 15 seconds, and no more than 20 seconds, in advance of all public grade crossings.”

Additionally: “Train horns must be sounded in a standardized pattern of 2 long, 1 short and 1 long blasts,” and “The maximum volume level for the train horn is 110 decibels, which is a new requirement. The minimum sound level remains 96 decibels.”

Questions

Given the numeric nature of the rules, and the subjective nature of the current analysis and discussion, this seemed an ideal problem to address with data. Some of the questions we hoped to address, on this issue and beyond:

  • Timing: Are train horns sounded at the appropriate time?
  • Schedule: Are Caltrains coming and going on time?
  • Volume: Are the Caltrain horns sounding at the appropriate level?
  • Relativity: How do Caltrain horns contribute to overall urban noise levels?

Methodology

Our methodology to address these topics included several steps:

  1. Build a custom sensor to capture ambient noise levels
  2. Leverage an uplink capability to receive data from the sensor in near real time
  3. Deploy the sensor, then monitor its output and test/modify as needed
  4. Develop a crude statistical model to convert sensor levels (voltage) to sound levels (dB)
  5. Analyze the data and report the results

Apparatus

We developed a simple sensor based on the Arduino platform. A baseline Uno board, equipped with a local ATmega328 processor, was wired to an Adafruit electret microphone/amplifier (MAX4466) with adjustable gain.

We were lucky to be introduced, through the O’Reilly Strata NY event, to a local company: Helium. Backed by Khosla Ventures et al., Helium is building an Internet of Things platform for smart machines, combining a wireless protocol optimized for device and sensor data with cloud-based tooling for working with the data and building applications.

We received a beta kit, which included an Arduino shield for uplink to their bridge device, which in turn connects via GSM to the Internet. Here is our sensor (left) with the Helium bridge device (right).

Deployment

With our instrument ready for deployment, we sought to find a safe location to deploy. By good fortune, a family friend (and member of the staff of the Stanford Statistics department, where I am completing my degree) owns a home immediately adjacent to a Caltrain crossing, where Caltrain operators are required to sound their horn.

Conductors might also be particularly sensitive to this crossing, Churchill St., due to its proximity to Palo Alto High School and the tragic train-related death of a teen, recently.

From a data standpoint, this location was ideal as it sits approximately half-way between the Palo Alto and California Avenue stations.

We deployed our sensor outdoors facing the track in a waterproof enclosure and watched the first data arrive.

Monitoring

Through a connector to Helium’s fusion platform, we were able to see data in near real-time. (note the “debug” window on the right, where microphone output level arrives each second).

We used another great service provided by Librato (now part of SolarWinds), a San Francisco-based monitoring and metrics company. Using Librato, we enabled visualization of the sound levels as they were generated and could view each reading relative to its history. This was a powerful capability as we worked to fine-tune the power and amplifier.

Note the spike in the middle of the image above, which we could map to a train horn heard ourselves during the training period.

Data Preparation

Next, we took a weekday (January 7, 2015) that appeared typical of a non-holiday weekday relative to the entire month of data collected. For this period, we constructed a 24-hour data set at 1-second sample intervals for our analysis.

Data was accessed through the Librato API, downloaded as JSON, converted to CSV and cleansed.

Analysis

First, to gain intuition, we took a sample recording gathered at the sensor site of a typical train horn.

Click HERE to hear the sample sound.

Using matplotlib within an IPython notebook, we are able to “see” this sound, both in its raw audio form and as a spectrogram showing frequency:
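For readers curious what producing such a spectrogram involves, here is a rough sketch of the short-time Fourier transform underlying it, using NumPy. The window and hop sizes are arbitrary illustration choices, not the notebook’s actual settings.

```python
import numpy as np

def spectrogram(signal, sample_rate, window=256, hop=128):
    """Crude short-time Fourier transform: slide a Hann-windowed frame
    along the signal and take the magnitude of each frame's FFT.
    Returns (spec, freqs): spec rows are frequency bins, columns are time."""
    frames = []
    for start in range(0, len(signal) - window + 1, hop):
        frame = signal[start:start + window] * np.hanning(window)
        frames.append(np.abs(np.fft.rfft(frame)))
    freqs = np.fft.rfftfreq(window, d=1.0 / sample_rate)
    return np.array(frames).T, freqs
```

Plotting `spec` on a log color scale against `freqs` gives the familiar horn-blast image; matplotlib’s `specgram` does essentially this in one call.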

Next, we look at our entire 24 hours of data, beginning on the evening of January 6 and concluding 24 hours later on the evening of January 7. Note the quiet overnight period, about a quarter of the way across the x axis.

To put this into context, we overlay the Caltrain schedule. Given the sensor sits between the Palo Alto and California Avenue stations, and given the variance in stop times, we mark northbound trains using the scheduled stop at Palo Alto (red), and southbound trains using the scheduled stop at California Ave (green).

Initially, we can make two contrasting observations: many peak sound events lie quite close to these stop times, as expected. However, many of the sound events (including the maximum recorded value, the nightly ~11pm freight train service) occur independent of the scheduled Caltrains.

Conversion to Decibels

On the y axis above, the sound level is reported as the raw voltage output from the microphone. To address the questions above, we needed a way to convert these values to decibel units (dB).

To do so, a low-cost sound meter was obtained from Fry’s. An on-site calibration was then performed to map decibel readings from the meter to the voltage output uploaded from our microphone.

Within RStudio, these values were plotted and a crude estimation function was derived to create a linear mapping between voltage and dB:

The goal of a straight-line estimate rather than a log-linear one was to compensate for differences in apparatus (dB meter vs. microphone within casing) and to keep the approximations conservative overall. Most of the events in question during the observation period fell between 2.0 and 2.5 volts, where we collected several training points (above).
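A straight-line fit like the one described can be derived with ordinary least squares. The calibration pairs below are hypothetical stand-ins; the real on-site (voltage, dB) readings were not published.

```python
def fit_linear(points):
    """Least-squares straight-line fit y = a*x + b through (voltage, dB) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical calibration pairs (voltage, dB); the actual values differ.
calibration = [(1.5, 45.0), (2.0, 62.0), (2.5, 82.0)]
slope, intercept = fit_linear(calibration)

def volts_to_db(v):
    # Apply the crude linear estimator to a raw microphone voltage.
    return slope * v + intercept
```

With `volts_to_db` in hand, the whole 24-hour voltage series can be mapped point-by-point into decibels.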

A challenge in this process was the slight lag, of unknown variance, between readings and data collection. As such, only “peak” and “trough” measurements could be used reliably to build the model.

With this crude conversion estimator in hand, we can now replot the data above with decibels on the y axis.

Clearly the “peaks” above are of interest as outliers from the baseline noise level at this site. In fact, 69 peaks (>82 dB) were observed (at a 1-second sample rate), against 71 scheduled trains for the same period. Though this location was only about 100 yards from the tracks, the horns registered well below the 96dB-115dB range recommended by the FRA. (With the caveat above regarding our crude approximator.)
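Counting "peaks" in a per-second dB series can be sketched as below. The threshold matches the 82 dB cutoff used above; the series itself is synthetic, and a multi-second blast is grouped into a single event, which is one plausible reading of how the 69 peaks were counted:

```python
# Count horn "peaks": contiguous runs of samples above a dB threshold,
# so a multi-second blast counts as one event. The series is synthetic.
import numpy as np

def count_peaks(db_series, threshold=82.0):
    """Count contiguous runs of samples above threshold (one run = one event)."""
    above = db_series > threshold
    # A new event starts wherever `above` flips from False to True.
    starts = above & ~np.roll(above, 1)
    starts[0] = above[0]            # first sample can't inherit from the last
    return int(starts.sum())

# Quiet ~55 dB baseline with three injected horn events
db = np.full(600, 55.0)
db[100:103] = 90.0                  # 3-second blast
db[300:301] = 85.0                  # 1-second blast
db[450:455] = 95.0                  # 5-second blast
```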

It is also interesting that we are not observing the “two long, two short, one long” horn pattern. Though some events are lost to the sampling rate, this does not appear to be a practice the engineers follow consistently, something Palo Alto residents can confirm qualitatively.

Also worth noting is the high variance of ambient noise, the central horizontal blue “cloud” above, ranging from ~45 dB to ~75 dB. We sought to understand the nature of this variance and whether it contained structure.

Looking more closely at just a few minutes of data during the Jan 7 morning commute, we can see that indeed there is a periodic structure to the variance.

Comparing against on-site observations, we determined that this period was set by the traffic signal on Alma St., which sits between the sensor and the train tracks. We also often observed an “M” structure (a bimodal peak): southbound traffic accelerating from the stop line when the light turned green, followed seconds later by the passing northbound traffic.
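One simple way to check a noise series for this kind of periodic structure is autocorrelation of the per-second levels. The data below is synthetic, with an assumed 90-second signal cycle (the actual cycle length of the Alma St. light was not published):

```python
# Detect periodic structure in ambient noise via autocorrelation.
# Synthetic data: a 90-second "traffic light" cycle plus noise;
# the real cycle length is an assumption for illustration.
import numpy as np

period = 90                                        # assumed cycle (seconds)
t = np.arange(3600)                                # one hour at 1 Hz
rng = np.random.default_rng(2)
db = 55 + 8 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, t.size)

# Normalized autocorrelation for non-negative lags
x = db - db.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf /= acf[0]

# The first strong peak after lag 0 estimates the cycle length.
lag = int(np.argmax(acf[30:200])) + 30
```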

Looking at a few minutes of the same morning commute, we can clearly see when the train passed and sounded its horn. Here again, green indicates a southbound train and red indicates a northbound train.

In this case, the southbound train passed slightly before its scheduled arrival at the California Avenue station, and the northbound train passed within its scheduled arrival minute; both were on time. Note also the peak unassociated with either train, which we discuss next.

Perhaps a more useful summary of the data collected is shown as a histogram, where the decibels are shown on the X axis and the frequency (count) is shown on the Y axis.

We can clearly see a bimodal distribution: ambient sound is roughly normally distributed, with a second, smaller distribution at the higher end. The question remained: why did several of the peak observed values fall nowhere near a scheduled train time?

The answer here requires no sensors: airplanes, sirens and freight trains are frequent noise sources in Palo Alto. These factors, coupled with a nearby residential construction project, accounted for the irregular noise events we observed.

Click HERE to hear a sample sound.

Finally, we subsetted the data into three groups: non-train minutes, northbound-train minutes and southbound-train minutes. The mean dB levels were 52.13, 52.18 and 52.32 respectively. While the ordering makes sense, these per-minute means dilute the effect, since a horn blast may account for only one second of a train-minute. The difference between northbound and southbound is consistent with on-site observation: given that the sensor lies on the northeast corner of the crossing, horn blasts from southbound trains were more pronounced.
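The three-way subset can be sketched with a simple group-by. The labels and readings below are fabricated for illustration; only the grouping logic reflects the analysis described above:

```python
# Per-group mean dB for non-train, northbound-train and southbound-train
# minutes. The labels and readings are FABRICATED for illustration;
# the post reports means of 52.13, 52.18 and 52.32 dB respectively.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "db": rng.normal(52, 3, 1440),                      # one day of minutes
    "label": rng.choice(["none", "north", "south"],     # assumed labeling
                        size=1440, p=[0.90, 0.05, 0.05]),
})

# Mean sound level per minute-type
means = df.groupby("label")["db"].mean()
```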

Conclusion

Before drawing any conclusions it should be noted again that these are not scientific findings, but rather an attempt to add some rigor to the discussion around Caltrain and noise pollution. A longer period of analysis and replicated data collection would be required to state these conclusions with statistical confidence.

That said, we can readdress the topics in question:

Timing: Are train horns sounded at the appropriate time?

The FRA recommends engineers sound the horn 15 to 20 seconds before a crossing. Given the tight urban nature of this crossing, that recommendation seems a poor fit. Caltrain engineers are sounding their horns within 2-3 seconds of the crossing, which seems more appropriate.

Schedule: Are Caltrains coming and going on time?

Though not explored in depth here, generally we can observe that trains are passing our sensor prior to their scheduled arrival at the upcoming station.

Volume: Are the Caltrain horns sounding at the appropriate level?

As discussed above, the apparent dB level at a location very close to the track was well below the FRA recommended levels.

Relativity: How do Caltrain horns contribute to overall urban noise levels?

The Caltrain horns add roughly 10 dB to peak baseline noise levels, including the periodic traffic events at the intersection observed.

Opinions

Due to their regular frequency and physical presence, trains are an easy target when it comes to urban sound attenuation efforts. However, the regular oscillations of traffic, sirens, airplanes and construction create a very high, if not predictable baseline above which trains must be heard.

Considering the importance of safety to this system, which operates just inches from bikers, drivers and pedestrians, there is a tradeoff to be made between supporting quiet zone initiatives and the capability of speeding trains to be heard.

In Palo Alto, as we move into an era of electric cars, improved bike systems and increased pedestrian access, the oscillations of noise created by non-train activities may indeed subside over time. And this in turn, might provide an opportunity to lower the “alert sounds” such as sirens and train horns required to deliver these services safely. Someday much of our everyday activity might be accomplished quietly.

Until then, we can only appreciate these sounds which must rise above our noisy baseline, as a reminder of our connectedness to the greater bay area through our shared focus on safety and convenient public transportation.

Acknowledgements:

Sincere thanks to Helen T. and Nick Parlante of Stanford University, Mark Phillips of Helium and Nik Wekwerth/Jason Derrett/Peter Haggerty of Librato for their help and technical support.

Thanks also to my peers at The Data Guild, Aman, Chris, Dave and Sandy and the Palo Alto Police IT department for their feedback.

And thanks to my daughter Tallulah for her help soldering and moral support.


Originally posted on LinkedIn. 

