


When it comes to wireless communication technologies for the Internet of Things, most people are familiar with LoRa: it uses spread-spectrum modulation together with a unique forward error correction mechanism to achieve ultra-long-range wireless transmission.
The focus of this article, however, is not LoRa's general characteristics, but several key core parameters of LoRa modulation.

1. Spreading Factor (SF)
LoRa spread spectrum represents each bit of payload information with multiple chips. The rate at which spread symbols are sent is the symbol rate (Rs), and the ratio between the chip rate and the nominal symbol rate is 2^SF, where SF is the spreading factor. Put simply, a single data bit is represented by multiple chips.
To simplify with an example in the digital domain: suppose we agree that the chip sequence 101110 represents a data bit of 1. If the application needs to transmit a byte such as 0xFF, whose binary representation is 1111 1111, then the chip stream actually transmitted is:

101110 101110 101110 101110 101110 101110 101110 101110

This method lowers the bit error rate of the transmission and thereby extends the effective communication distance. However, for the same number of transmitted chips, the amount of effective data carried is reduced. Therefore, with all other parameters unchanged, the larger the SF setting, the lower the actual data rate.
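As a rough illustration only (real LoRa uses chirp spread spectrum, not simple bit repetition), the bit-to-chip expansion in the 0xFF example can be sketched in a few lines of Python. The 010001 pattern for a 0 bit is a made-up complement, since the text only defines the pattern for a 1:

```python
CHIPS_FOR_1 = "101110"   # agreed chip pattern for a data bit of 1
CHIPS_FOR_0 = "010001"   # hypothetical complementary pattern for a 0

def spread_byte(byte: int) -> str:
    """Expand each bit of `byte` (MSB first) into its chip pattern."""
    bits = f"{byte:08b}"
    return " ".join(CHIPS_FOR_1 if b == "1" else CHIPS_FOR_0 for b in bits)

print(spread_byte(0xFF))
# eight repetitions of 101110, one per data bit
```

Sending six chips per data bit is exactly why the effective data rate drops as the spreading grows, while the redundancy is what buys the lower bit error rate.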

LoRa spreading factor value range:

SF = 6 → 64 chips/symbol
SF = 7 → 128 chips/symbol
SF = 8 → 256 chips/symbol
SF = 9 → 512 chips/symbol
SF = 10 → 1024 chips/symbol
SF = 11 → 2048 chips/symbol
SF = 12 → 4096 chips/symbol

Note:
① The table above is taken from the SX127x data sheet.
② SF = 6 can only be used in implicit header mode.
③ The SX126x series additionally supports SF = 5.
2. Modulation bandwidth BandWidth(BW)
Channel bandwidth limits the range of frequencies allowed to pass through the current channel; it can be understood as a frequency passband.
For example, a channel that passes 433.125 MHz to 433.250 MHz corresponds to BW = 125 kHz.
According to Shannon's theorem, increasing the channel bandwidth raises the achievable data rate and therefore shortens the time on air.

Shannon's theorem: C = B × log2(1 + S/N)
However, it can be seen from the digital sensitivity calculation formula that increasing the channel bandwidth will reduce the system sensitivity, thus shortening the wireless communication distance.
Receive sensitivity: S = 10·log10(kTB) + NF + SNR, where k is Boltzmann's constant, T the temperature, B the channel bandwidth, NF the receiver noise figure, and SNR the demodulation SNR threshold.
In LoRa modulation the channel bandwidth is the double-sided bandwidth (the full channel bandwidth), whereas BW in traditional FSK modulation refers to the single-sided or receiver bandwidth.
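Since 10·log10(kT) is about -174 dBm/Hz at room temperature, the sensitivity formula can be evaluated numerically. A small sketch, where the noise figure (6 dB) and demodulation SNR (-20 dB, roughly the SF12 limit) are example values, not fixed properties of any particular radio:

```python
import math

def sensitivity_dbm(bw_hz: float, nf_db: float, snr_db: float) -> float:
    """Receiver sensitivity S = 10*log10(kTB) + NF + SNR in dBm.
    10*log10(kT) is about -174 dBm/Hz at room temperature."""
    return -174 + 10 * math.log10(bw_hz) + nf_db + snr_db

# Doubling BW from 125 kHz to 250 kHz costs about 3 dB of sensitivity:
print(sensitivity_dbm(125e3, 6, -20))   # ~ -137 dBm
print(sensitivity_dbm(250e3, 6, -20))   # ~ -134 dBm
```

This makes the trade-off in the text concrete: every doubling of the bandwidth shortens the time on air but gives up roughly 3 dB of link budget.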

3. Coding Rate(CR)
LoRa communication uses cyclic forward error correction internally: part of the data packet transmitted over the air is devoted to error-correction coding, and the ratio of the effective data length to the actual length of the transmitted packet is called the coding rate.
LoRa encoding rate value range and corresponding overhead ratio:

CodingRate = 1 → coding rate 4/5, overhead ratio 1.25
CodingRate = 2 → coding rate 4/6, overhead ratio 1.5
CodingRate = 3 → coding rate 4/7, overhead ratio 1.75
CodingRate = 4 → coding rate 4/8, overhead ratio 2

Note: The table above is taken from the SX127x data sheet.
From the above it can be seen that the error correction algorithm adds link overhead and reduces the effective data rate; in return, the error correction code gives the transmission strong resistance to interference and higher reliability.
Having come this far, it is worth digging a little deeper.
Relationship between LoRa signal bandwidth (BW), symbol rate (Rs), and data rate (DR)

Chip rate Rc:
As mentioned earlier, bandwidth is closely related to the signal's transmission rate. In LoRa, the chip rate equals the bandwidth value (in Hz), that is:
Rc = BW chips/s

Symbol rate Rs:
Each symbol has 2^SF chips, and the transmission rate of the chips is Rc, so the symbol transmission rate Rs is:
Rs = Rc / 2^SF = BW / 2^SF

Data transmission rate DR (or bit rate Rb):
DR = Rb (bits/s) = SF × Rs × CR = SF × (BW / 2^SF) × CR
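A minimal sketch of this formula in Python, where the coding rate 4/5 … 4/8 is passed by its denominator:

```python
def lora_data_rate(sf: int, bw_hz: float, cr_denom: int) -> float:
    """DR = SF * (BW / 2**SF) * CR, with CR = 4 / cr_denom
    (cr_denom is 5..8, i.e. coding rates 4/5 .. 4/8)."""
    rs = bw_hz / 2 ** sf      # symbol rate, symbols/s
    cr = 4 / cr_denom         # effective coding rate
    return sf * rs * cr       # bits/s

# SF7, BW = 125 kHz, CR = 4/5 gives about 5469 bit/s:
print(round(lora_data_rate(7, 125e3, 5)))
```

Plugging in other values shows the trends discussed above: each step up in SF roughly halves the data rate, while doubling BW doubles it.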

4. Low Data Rate Optimization
In many people's minds, LoRa's core parameters are only SF, BW, and CR, and the Low Data Rate Optimization setting is easily overlooked. In practice this parameter matters, especially when transmitting large packets at low data rates: long continuous transmissions can suffer from frequency drift, reducing the communication success rate. Enabling the Low Data Rate Optimization option improves LoRa's robustness under low-rate conditions.
The specific rule is that when the transmission time of a single symbol exceeds 16 milliseconds, the LowDataRateOptimize bit must be enabled, and the transmitter and the receiver must use the same LowDataRateOptimize setting.
Take BW = 500 kHz, SF = 9 as an example:

At this time Rs = 500 kHz / 2^9 ≈ 976.6 symbols/s, so Ts = 1/Rs = 512/500 kHz ≈ 1.02 ms.
In this case it is not necessary to enable Low Data Rate Optimization.
Take BW = 25 kHz, SF = 10 as an example:
At this time Rs = 25 kHz / 2^10 ≈ 24.4 symbols/s, so Ts = 1/Rs = 1024/25 kHz = 40.96 ms.
In this case Low Data Rate Optimization must be turned on.
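The 16 ms rule can be captured in a small helper that reproduces both worked examples:

```python
def symbol_time_ms(sf: int, bw_hz: float) -> float:
    """Symbol duration Ts = 2**SF / BW, in milliseconds."""
    return 2 ** sf / bw_hz * 1000

def needs_ldro(sf: int, bw_hz: float, threshold_ms: float = 16.0) -> bool:
    """LowDataRateOptimize must be enabled when a single symbol
    takes longer than about 16 ms on the air."""
    return symbol_time_ms(sf, bw_hz) > threshold_ms

print(symbol_time_ms(9, 500e3), needs_ldro(9, 500e3))   # ~1.02 ms, not needed
print(symbol_time_ms(10, 25e3), needs_ldro(10, 25e3))   # 40.96 ms, needed
```

Both transmitter and receiver configuration code should call the same check, since the two sides must agree on the setting.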

Read more…

The Core Costs of Data in IoT

Data is a critical resource in IoT that enables organizations to gain insights into their operations, optimize processes, and improve customer experience. It is important to understand the cost of managing and processing data, as it can be significant. Too often, organizations have more data than they know how to effectively use. Here are some of the major areas of costs:

First, data storage is a major cost. IoT devices generate large amounts of data, and this data needs to be stored in a secure and reliable way. Storing data in the cloud or on remote servers can be expensive, as it requires a robust and scalable infrastructure to support the large amounts of data generated by IoT devices. Additionally, data must be backed up to ensure data integrity and security, which adds to the cost.

Second, data processing and analysis require significant computational resources. Processing large amounts of data generated by IoT devices requires high-performance hardware and software, which can be expensive to acquire and maintain. Additionally, hiring data scientists and other experts to interpret and analyze the data adds to the cost.

Third, data transmission over networks can be costly. IoT devices generate data that needs to be transmitted over networks to be stored and processed. Depending on the location of IoT devices and the network infrastructure, the cost of network connectivity can vary widely.

Finally, data security is a major concern in IoT, and implementing robust security measures can add to the cost. This includes implementing encryption protocols to ensure data confidentiality, as well as implementing measures to prevent unauthorized access to IoT devices and data.

Managing and processing data requires significant resources, including storage, processing and analysis, network connectivity, and security. While data is a valuable resource that can provide significant value, the cost of managing and processing data must be carefully evaluated to ensure that the benefits outweigh the expenses.

Read more…

When I think about the things that held the planet together in 2020, it was digital experiences delivered over wireless connectivity that made remote things local.

While heroes like doctors, nurses, first responders, teachers, and other essential personnel bore the brunt of the COVID-19 response, billions of people around the world found themselves cut off from society. In order to keep people safe, we were physically isolated from each other. Far beyond the six feet of social distancing, most of humanity weathered the storm from their homes.

And then little by little, old things we took for granted, combined with new things many had never heard of, pulled the world together. Let’s take a look at the technologies and trends that made the biggest impact in 2020 and where they’re headed in 2021:

The Internet

The global Internet infrastructure from which everything else is built is an undeniable hero of the pandemic. This highly-distributed network designed to withstand a nuclear attack performed admirably as usage by people, machines, critical infrastructure, hospitals, and businesses skyrocketed. Like the air we breathe, this primary facilitator of connected, digital experiences is indispensable to our modern society. Unfortunately, the Internet is also home to a growing cyberwar and security will be the biggest concern as we move into 2021 and beyond. It goes without saying that the Internet is one of the world’s most critical utilities along with water, electricity, and the farm-to-table supply chain of food.

Wireless Connectivity

People are mobile and they stay connected through their smartphones, tablets, in cars and airplanes, on laptops, and other devices. Just like the Internet, the cellular infrastructure has remained exceptionally resilient to enable communications and digital experiences delivered via native apps and the web. Indoor wireless connectivity continues to be dominated by WiFi at home and all those empty offices. Moving into 2021, the continued rollout of 5G around the world will give cellular endpoints dramatic increases in data capacity and WiFi-like speeds. Additionally, private 5G networks will challenge WiFi as a formidable indoor option, but WiFi 6E with increased capacity and speed won’t give up without a fight. All of these developments are good for consumers who need to stay connected from anywhere like never before.

Web Conferencing

With many people stuck at home in 2020, web conferencing technology took the place of traveling to other locations to meet people or receive education. This technology isn’t new and includes familiar players like GoToMeeting, Skype, WebEx, Google Hangouts/Meet, BlueJeans, FaceTime, and others. Before COVID, these platforms enjoyed success, but most people preferred to fly on airplanes to meet customers and attend conferences while students hopped on the bus to go to school. In 2020, “necessity is the mother of invention” took hold and the use of Zoom and Teams skyrocketed as airplanes sat on the ground while business offices and schools remained empty. These two platforms further increased their stickiness by increasing the number of visible people and adding features like breakout rooms to meet the demands of businesses, virtual conference organizers, and school teachers. Despite the rollout of the vaccine, COVID won’t be extinguished overnight and these platforms will remain strong through the first half of 2021 as organizations rethink where and when people work and learn. There are far too many players in this space, so look for some consolidation.

E-Commerce

“Stay at home” orders and closed businesses gave e-commerce platforms a dramatic boost in 2020 as they took the place of shopping at stores or going to malls. Amazon soared to even higher heights, Walmart upped their game, Etsy brought the artsy, and thousands of Shopify sites delivered the goods. Speaking of delivery, the empty city streets became home to fleets of FedEx, Amazon, UPS, and DHL trucks bringing packages to your front doorstep. Many retail employees traded working at customer-facing stores for working in distribution centers, as long as they could outperform robots. Even though people are looking forward to hanging out at malls in 2021, the e-commerce, distribution center, delivery truck trinity is here to stay. This ball was already in motion and got a rocket boost from COVID. This market will stay hot in the first half of 2021 and then cool a bit in the second half.

Ghost Kitchens

The COVID pandemic really took a toll on restaurants in 2020, with many of them going out of business permanently. Those that survived had to pivot to digital and other ways of doing business. High-end steakhouses started making burgers on grills in the parking lot, while takeout pizzerias discovered they finally had the best business model. Having a drive-thru lane was definitely one of the keys to success in a world without waiters, busboys, and hosts. “Front of house” was shut down, but the “back of house” still had a pulse. Adding mobile web and native apps that allowed customers to easily order from operating “ghost kitchens” and pay with credit cards or Apple/Google/Samsung Pay enabled many restaurants to survive. A combination of curbside pickup and delivery from the likes of DoorDash, Uber Eats, Postmates, Instacart and Grubhub made this business model work. A surge in digital marketing also took place where many restaurants learned the importance of maintaining a relationship with their loyal customers via connected mobile devices. For the most part, 2021 has restaurateurs hoping for 100% in-person dining, but a new business model that looks a lot like catering + digital + physical delivery is something that has legs.

The Internet of Things

At its very essence, IoT is all about remotely knowing the state of a device or environmental system along with being able to remotely control some of those machines. COVID forced people to work, learn, and meet remotely and this same trend applied to the industrial world. The need to remotely operate industrial equipment or an entire “lights out” factory became an urgent imperative in order to keep workers safe. This is yet another case where the pandemic dramatically accelerated digital transformation. Connecting everything via APIs, modeling entities as digital twins, and having software bots bring everything to life with analytics has become an ROI game-changer for companies trying to survive in a free-falling economy. Despite massive employee layoffs and furloughs, jobs and tasks still have to be accomplished, and business leaders will look to IoT-fueled automation to keep their companies running and drive economic gains in 2021.

Streaming Entertainment

Closed movie theaters, football stadiums, bowling alleys, and other sources of entertainment left most people sitting at home watching TV in 2020. This turned into a dream come true for streaming entertainment companies like Netflix, Apple TV+, Disney+, HBO Max, Hulu, Amazon Prime Video, Youtube TV, and others. That said, Quibi and Facebook Watch didn’t make it. The idea of binge-watching shows during the weekend turned into binge-watching every season of every show almost every day. Delivering all these streams over the Internet via apps has made it easy to get hooked. Multiplayer video games fall in this category as well and represent an even larger market than the film industry. Gamers socially distanced as they played each other from their locked-down homes. The rise of cloud gaming combined with the rollout of low-latency 5G and Edge computing will give gamers true mobility in 2021. On the other hand, the video streaming market has too many players and looks ripe for consolidation in 2021 as people escape the living room once the vaccine is broadly deployed.

Healthcare

With doctors and nurses working around the clock as hospitals and clinics were stretched to the limit, it became increasingly difficult for non-COVID patients to receive the healthcare they needed. This unfortunate situation gave tele-medicine the shot in the arm (no pun intended) it needed. The combination of healthcare professionals delivering healthcare digitally over widespread connectivity helped those in need. This was especially important in rural areas that lacked the healthcare capacity of cities. Concurrently, the Internet of Things is making deeper inroads into delivering the health of a person to healthcare professionals via wearable technology. Connected healthcare has a bright future that will accelerate in 2021 as high-bandwidth 5G provides coverage to more of the population to facilitate virtual visits to the doctor from anywhere.

Working and Living

As companies and governments told their employees to work from home, it gave people time to rethink their living and working situation. Lots of people living in previously hip, urban, high-rise buildings found themselves residing in not-so-cool, hollowed-out ghost towns comprised of boarded-up windows and closed bars and cafés. Others began to question why they were living in areas with expensive real estate and high taxes when they no longer had to be close to the office. This led to a 2020 COVID exodus out of pricey apartments/condos downtown to cheaper homes in distant suburbs as well as the move from pricey areas like Silicon Valley to cheaper destinations like Texas. Since you were stuck in your home, having a larger house with a home office, fast broadband, and a backyard became the most important thing. Looking ahead to 2021, a hybrid model of work-from-home plus occasionally going into the office is here to stay as employees will no longer tolerate sitting in traffic two hours a day just to sit in a cubicle in a skyscraper. The digital transformation of how and where we work has truly accelerated.

Data and Advanced Analytics

Data has shown itself to be one of the world’s most important assets during the time of COVID. Petabytes of data have continuously streamed in from all over the world letting us know the number of cases, the growth or decline of infections, hospitalizations, contact-tracing, free ICU beds, temperature checks, deaths, and hotspots of infection. Some of this data has been reported manually while lots of other sources are fully automated from machines. Capturing, storing, organizing, modeling and analyzing this big data has elevated the importance of cloud and edge computing, global-scale databases, advanced analytics software, and the growing importance of machine learning. This is a trend that was already taking place in business and now has a giant spotlight on it due to its global importance. There’s no stopping the data + advanced analytics juggernaut in 2021 and beyond.

Conclusion

2020 was one of the worst years in human history and the loss of life was just heartbreaking. People, businesses, and our education system had to become resourceful to survive. This resourcefulness amplified the importance of delivering connected, digital experiences to make previously remote things into local ones. Cheers to 2021 and the hope for a brighter day for all of humanity.

Read more…

IoT and Data- Quite Unstoppable


According to Cisco, currently there are 10 billion things – phones, PCs, things – connected to the Internet. That is less than one percent of the actual devices and things that exist right now. There are over one trillion devices out there right this very minute that are not talking to the Internet – but soon enough they will be.

Kevin Ashton, cofounder and executive director of the Auto-ID Center at MIT, first mentioned the Internet of Things in a presentation he made to Procter & Gamble in 1999. Here’s how Ashton explains the potential of the Internet of Things:

“Today computers -- and, therefore, the Internet -- are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet were first captured and created by human beings by typing, pressing a record button, taking a digital picture or scanning a bar code.

The problem is, people have limited time, attention and accuracy -- all of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things -- using data they gathered without any help from us -- we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.”

Convergence

The broadband divide could prove to be a real hampering force to the Internet of Things movement that is gaining speed today. Cloud, mobility, and big data are all converging into a seamless network, but the success of this convergence depends heavily on the ability to actually move and access the data. And considering that millions of additional devices (some of which are just sensors) will enter the equation, it's time for further investment, and quickly. According to the CIO Survey, organizations are in a prime position to innovate and make significant changes.

CONNECT ANY THING OVER ANY NETWORK

The Internet of Things (IoT) is a computing concept that describes a future where everyday physical objects will be connected to the Internet and be able to identify themselves to other devices. It is significant because an object that can represent itself digitally becomes something greater than the object by itself. No longer does the object relate just to you, but is now connected to surrounding objects and database data. When many objects act in unison, they are known as having "ambient intelligence."

Business Model focusing more on Data

In other words, as the physical and digital worlds integrate more closely with each other, and the number of connected devices is predicted to reach 25 billion by 2018, the IoT will enhance and evolve our ability to manage and process information. It’s a more context-oriented world, because there is better data. When a new technology first appears, people do all the obvious things that look like the old market, but more efficiently. In the Internet, GNN had web ads like old newspaper ads. Later there was Google search, which was a different way of doing advertising, by focusing more on data. Now we’ve got social search, social networks. The business model moves to something that is more native to the technology. Uber is an early IoT company. Other businesses will pop up that do more native things. Much of what is available are components that require highly specialized knowledge and skills to make use of. The Internet of Things and its partner in crime, big data, can also impact society at a much higher level. By effecting better decision making through a better understanding of data, we can tackle socioeconomic issues like poverty and disease, education, and quality of life around the world. You know that soccer ball that generates electricity (an awesome invention, btw)? The IoT is the next exponent up.

IoT: focus on what matters most to you

The Internet of Things is not a futuristic, aspirational technology trend. It’s here today in the devices, sensors, cloud infrastructure, and data and business intelligence tools you are already using. Rather than thinking about the Internet of Things in terms of everything–such as billions of devices and sensors–focus on what matters most to you. Instead of thinking about the massive amount of data being produced, think about how one piece of data can provide value to your business. The DIY Maker community has its Arduino and Raspberry Pi boards to create toy educational experiments, but even those require a bit of study to make sense of. The only project that I know of that seems to be pointing in a direction of making IoT available as a platform for anyone to create with is the TOI, thingsoninternet.biz, and their VIPER platform. It is a set of components that are open, and so available from many sources, and they have made Python available as the programming language. Python was created to be an easy programming language to learn, but until VIPER it was not suitable for embedded devices. Look for this interesting product on Kickstarter and use it to point to a direction for the rest of the industry.

That said, the notion of “The Internet of things” is something unstoppable. More and more devices will become Internet enabled, not less. What needs to be addressed is rock-solid security (logical and physical) combined with privacy laws and policies. At the same time, a comprehensive set of government acts, laws, and regulatory frameworks and technical standards needs to be developed to harness the potential of new models of interactions among the machines and people.

Best Regards,

Raj Kosaraju

Read more…

Every week, thousands of new apps hit the mobile market. Unfortunately, the number of hackers working assiduously to tap into these apps to implant malware or phish for user information is also on the increase. By implication, the security of mobile users must be taken very seriously, particularly when it comes to app development.


Apart from being highly vigilant about security, app developers need to be able to identify these security issues and know how to avoid them, so as to be able to provide users with the security they need to keep their information and other data safe. Security issues can be experienced in various forms during any mobile application development process; some of which are explained below.

Failure to implement secure communications to servers

Most apps are designed to connect back to a server, particularly those applications that handle sensitive user information. As a critical area of concern, mobile app developers must therefore ensure safe transit between the app and the server: nothing should be exposed to interception, even over an insecure WiFi connection. Basically, this type of security is achieved through SSL certificates and encryption. User information can be compromised if developers fail to employ the right SSL libraries.
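As a sketch of the idea using Python's standard library (a real mobile app would use its platform's TLS stack, and the endpoint in the comment is a placeholder):

```python
import ssl

# Secure: the default context verifies the server certificate chain
# and the hostname before any application data is exchanged.
secure_ctx = ssl.create_default_context()

# Insecure anti-pattern: turning verification off invites
# man-in-the-middle attacks on untrusted WiFi. Never ship this.
insecure_ctx = ssl.create_default_context()
insecure_ctx.check_hostname = False
insecure_ctx.verify_mode = ssl.CERT_NONE

# Hypothetical endpoint, shown only to illustrate where the context goes:
# urllib.request.urlopen("https://api.example.com/login", context=secure_ctx)
```

The point is that the secure behavior is the library default; the dangerous configuration only appears when a developer explicitly disables verification, often to silence a certificate error during development.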

Inability to plan for physical security breaches

Nothing much can be done to prevent theft or loss of mobile devices. In fact, mobile app developers have a very little role to play in this. However, they can greatly help to minimize the problem by executing a local session timeout code. Usually, users are obligated to enter a password from time to time to access an app. Rather than making this a daily occurrence, password requirement from devices can be observed once a week or at the fifth time the app is used. Local session timeout can also prevent the use of software that helps users remember passwords.
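A minimal sketch of such a local session timeout, with a hypothetical one-week re-authentication policy:

```python
import time

SESSION_TIMEOUT_S = 7 * 24 * 3600   # hypothetical policy: re-login weekly

class Session:
    """Record the last successful authentication and force a fresh
    login once the timeout window has passed."""

    def __init__(self, now=None):
        self.last_auth = time.time() if now is None else now

    def is_expired(self, now=None):
        now = time.time() if now is None else now
        return now - self.last_auth > SESSION_TIMEOUT_S

s = Session(now=0.0)
print(s.is_expired(now=3600.0))            # one hour in: still valid
print(s.is_expired(now=8 * 24 * 3600.0))   # past a week: re-enter password
```

On expiry the app would clear any cached credentials and show the login screen, which is what limits the damage when a device is lost or stolen.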

The use of weak encryption or an entire lack of encryption

Technology improves constantly, which makes older algorithms obsolete and easy to crack. Failing to use encryption, or using weak encryption, in an app can put sensitive user information at risk of exposure. In the course of using certain apps, users must input sensitive data like personal identification information or credit card numbers. It is sad to know that this information can be hacked, particularly in the absence of good encryption. An app is more likely to be attacked as it becomes more popular, so if you are looking to push your app to the top, there is every reason to invest in good encryption.
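One hedged illustration from Python's standard library: rather than protecting a password with a fast hash that is easy to brute-force, derive a key with a deliberately slow KDF such as PBKDF2. The iteration count below is illustrative, not a vetted security policy:

```python
import hashlib
import os

def derive_key(password: str, salt: bytes = None):
    """Derive a 32-byte key with PBKDF2-HMAC-SHA256.
    A fresh random salt is generated when none is supplied."""
    if salt is None:
        salt = os.urandom(16)          # unique random salt per user
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                              salt, 200_000)
    return salt, key

salt, key = derive_key("correct horse battery staple")
# Same password and salt always reproduce the same key:
assert derive_key("correct horse battery staple", salt)[1] == key
```

The per-user salt defeats precomputed rainbow tables, and the high iteration count is what keeps the derivation expensive for an attacker even as hardware improves.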

Bypassing systematic security testing

Most importantly, app developers need to consider themselves the last line of defense. You stand to put your app users at risk when you fail to ensure a secure app. In every development process, testing is very important, and as such, there is no need to rush in releasing an app. Ensure to test every common inlet for security issues, such as sensors, GPS, camera, and even the development platform. Viruses and malware are no respecters of apps – every app is vulnerable to an attack from them.

Developers should try as much as possible to avoid leaving crash and debug logs behind after testing. These are common places hackers take advantage of to find app vulnerabilities. Apart from increasing the speed of an app, disabling NSLog statements on iOS during iPhone app development helps avoid vulnerabilities. Likewise, an Android app remains vulnerable until the Android debug log is cleared.

Lack of proper planning for data caching vulnerabilities

Unlike standard laptops and desktops, mobile devices are well-known for their ability to store short-term information for longer periods. This caching generally helps to increase speed. However, since hackers can easily access cached information, it is entirely possible for mobile devices to be susceptible to security breaches. A major way of avoiding the problem is requiring a password to use the app. However, this can affect the popularity of your app, as most users find passwords quite inconvenient. Alternatively, you can program the cache to be erased automatically every time users reboot their mobile device. This is another meaningful solution to data caching vulnerabilities.

Adopting other developers’ code

Developing an app from scratch can be very time-consuming, but the wide availability of free code has greatly simplified the process. Unfortunately, some hackers create and publish anonymous code in the hope that unsuspecting developers will pick it up. Through this, they gain easy, free access to any information of their choice once the app has been designed and released.

Although it is never a bad thing to build upon other people’s ideas, it is essential to carry out relevant research before doing so. To avoid security issues, make use of code from reliable sources: if you’re looking to build upon the ideas of a third party, use verified and trusted sources, and be on the lookout for phishing scams by reading the code line by line.

Slow patching of app

Just because your app has been launched does not mean that you are done with the development process. Hackers are always on the move, they do not relent in their efforts to break through an app and so, they always work very fast. Most times, they search for apps with irregular security updates. Then they exploit these security breaches to bring down the app. Just to let you know, it is good to perform regular security updates by revisiting the app often.

However, users on their own part may be unable to get these patches on time. This is because they have to accept and download them. Additionally, the approval process of a patch on an iOS platform can typically take up to a week. Obviously, patches can take a while to reach users. To this end, you can put user information at risk if you fail to stay right on top of new security updates.

When it comes to creating apps that deal with confidential matters such as personal information and customer credit cards, there is no room for error. To any app developer, the repercussions of even the smallest security breach can be catastrophic. As a matter of fact, it is your duty to protect both your app and its users. So, take all necessary precautions so as not to get caught unawares.


Read more…

Antarctica occupies a unique place in the human exploration mythos. The vast expanse of uninhabitable land twice the size of Australia has birthed legendary stories of human perseverance and cautionary tales about the indomitable force of nature. However, since those early years, Antarctica has become a rich research center for many kinds of data collection – from climate change to biology to seismology and more. And although today there are many organizations with field stations running this data collection, the nature of its, well, nature still presents daily challenges that technology has had a hand in helping address.

Can You Send Data Through Snow?

British Antarctic Survey (BAS) – of recent Boaty McBoatface fame – has been entrenched in this brutal region for over 60 years. The BAS endeavors to gather data on the polar environment and search for indicators of global change. Its studies of sediments, ice cores, meteorites, the polar atmosphere and ever-changing ice shelves are vitally important and help predict the global climate of the future. Indeed, the BAS is one of the most essential research institutions in the world.

In addition to two research ships, five aircraft and five research stations, the BAS relies on state of the art data gathering equipment to complete its mission. From GPS equipment to motion and atmospheric sensors, the BAS deploys only the most precise and reliable equipment available to generate data. Reliable equipment is vital because of the exceedingly high cost of shipping and repair in such a remote place.

To collect this data, BAS required a network that could reliably transmit it across what is arguably one of the harshest environments on the planet. That means deploying GPS equipment, motion and atmospheric sensors, radios and more that can stand up to daily punishment.

To collect and transport the data in this harsh environment, BAS needed a ruggedized solution that could handle the freezing temperatures (-58 °F in winter), strong winds and snow accumulation. Additionally, the solution needed to be low power due to the region’s lack of power infrastructure.

The Application

Halley VI Research Station is a highly advanced platform for global earth, atmospheric and space weather observation. Built on a floating ice shelf in the Weddell Sea, Halley VI is the world’s first re-locatable research facility. It provides scientists with state-of-the-art laboratories and living accommodation, enabling them to study pressing global problems from climate change and sea-level rise to space weather and the ozone hole (Source: BAS website).

The BAS monitors the movement of Brunt Ice Shelf around Halley VI using highly accurate remote field site GPS installations. It employs FreeWave radios at these locations to transmit data from the field sites back to a collection point on the base.

Once there, the data undergoes postprocessing and is sent back to Cambridge, England for analysis. Below are a Google Maps representation of the location of the Halley VI Research Station and a satellite image (from 2011) showing the first nine of the remote GPS systems in relation to Halley VI.

The Problem

Data transport and collection at Halley VI require highly ruggedized, yet precise and reliable wireless communication systems to be successful. Antarctica is the highest, driest, windiest and coldest region on earth, and environmental conditions are extremely harsh year round. Temperatures can drop below -50 °C (-58 °F) during the winter months.

Winds are predominantly from the east. Strong winds usually pick up the dusty surface snow, reducing visibility to a few meters. Approximately 1.2 meters of snow accumulates each year on the Brunt Ice Shelf and buildings on the surface become covered and eventually crushed by snow.

This part of the ice shelf is also moving westward by approximately 700 meters per year. There is 24-hour darkness for 105 days per year when Halley VI is completely isolated from the outside world by the surrounding sea ice (Source: BAS Website).

Additionally, the components of the wireless ecosystem need to be low power due to the region’s obvious lack of power infrastructure. These field site systems have been designed from ‘off the shelf’ available parts that have been integrated and ‘winterized’ by BAS for Antarctic deployment.

The Solution

The BAS turned to wireless data radios from FreeWave that ensure uptime and that can transport data over ice – typically a hindrance to RF communications. Currently, the network consists of 19 FreeWave 900 MHz radios, each connected to a remote GPS station containing sensors that track the movement of the Brunt Ice Shelf near the Halley VI Research Station.

The highly advanced GPS sensors accurately determine the Shelf’s position and dynamics, before reporting this back to a base station at Halley VI. Throughput consists of a 200 kilobit file over 12 minutes, and the longest range between a field site and the research station is approximately 30 kilometers.
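As a rough sanity check on these figures, the average data rate implied by a 200-kilobit file transferred over 12 minutes can be worked out directly (this back-of-the-envelope calculation is ours, not from BAS or FreeWave):

```python
# Average data rate implied by the reported figures:
# a 200-kilobit file transferred over 12 minutes.
file_bits = 200 * 1000      # 200 kilobits
duration_s = 12 * 60        # 12 minutes in seconds
rate_bps = file_bits / duration_s
print(f"{rate_bps:.1f} bit/s")  # roughly 278 bit/s on average
```

Such a modest average throughput is consistent with a low-power, long-range link of roughly 30 km in difficult RF conditions.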

Deployment of the GPS field site is done by teams of 3-4 staff using a combination of sledges and skidoo, or Twin Otter aircraft, depending on the distance and the abundance of ice features such as crevassing. As such, wireless equipment needed to be lightweight and easy to install and configure because of obvious human and material resource constraints.

In addition, the solution had to revolve around low power consumption. FreeWave radios have more than two decades of military application, and many of the technical advancements made in collaboration with its military partners have led to innovations around low power consumption and improved field performance. The image below shows an example of a BAS remote GPS site, powered by a combination of batteries, a solar panel and a wind turbine (penguin not included).

FreeWave Technologies has been a supplier to the BAS for nearly a decade and has provided a reliable wireless IoT network in spite of nearly year-round brutal weather conditions. To learn more, visit: http://www.freewave.com/technology/.

Read more…

Data Analytics Alone Cannot Deliver Effective Automation Solutions for Industrial IoT

By Akeel Al-Attar. This article first appeared here

Automated analytics (which can also be referred to as machine learning, deep learning etc.) are currently attracting the lion’s share of interest from investors, consultants, journalists and executives looking at technologies that can deliver the business opportunities being afforded by the Internet of Things. The reason for this surge in interest is that the IOT generates huge volumes of data from which analytics can discover patterns, anomalies and insights which can then be used to automate, improve and control business operations.


One of the main attractions of automated analytics appears to be the perception that it is an automated process able to learn from data without any programming of rules. Furthermore, it is perceived that the IOT will allow organisations to apply analytics to data generated by any physical asset or business process, and thereafter to use automated analytics to monitor asset performance, detect anomalies and generate problem resolution / trouble-shooting advice; all without any programming of rules!

In reality, automated analytics is a powerful technology for turning data into actionable insight / knowledge and thereby represents a key enabling technology for automation in Industrial IOT. However, automated analytics alone cannot deliver complete solutions for the following reasons:

i- In order for analytics to learn effectively it needs data that spans the spectrum of normal, sub-normal and anomalous asset/process behaviour. Such data can become available relatively quickly in a scenario where there are tens or hundreds of thousands of similar assets (central heating boilers, mobile phones etc.). However, this is not the case for more complex equipment / plants / processes, where the volume of available fault or anomalous-behaviour data is simply not large enough to facilitate effective analytics learning/modelling. As a result, any generated analytics model will be very restricted in scope and will flag a large number of false anomalies for operating conditions that are absent from the data.

ii- By focussing on data analytics alone we ignore the most important asset of any organisation: the expertise of its people in how to operate plants / processes. This expertise covers condition / risk assessment, planning, configuration, diagnostics, trouble-shooting and other skills that involve decision-making tasks. Automating decision making and applying it to streaming real-time IOT data offers huge business benefits, and it is very complementary to automated analytics in that it addresses the very areas in point i above where data coverage is incomplete but human expertise exists.

Capturing expertise in an automated decision-making system does require the programming of rules and decisions, but that need not be lengthy or cumbersome in a modern rules/decision automation technology such as Xpertrule. Decision-making tasks can be represented graphically so that a subject matter expert can easily author and maintain them without the involvement of a programmer, using easy-to-edit decision flows, decision trees, decision tables and rules. From my experience with this approach, a substantial decision-making task of tens of decision trees can be captured and deployed within a few weeks.
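The kind of expert decision logic described above, evaluated against streaming sensor readings, can be sketched in plain code. The parameter names and thresholds here are entirely hypothetical illustrations, not taken from Xpertrule or from the mill application discussed later:

```python
def diagnose(reading):
    """Toy expert decision tree for a powder mill.

    All thresholds and sensor names are hypothetical, for illustration only.
    `reading` is a dict of the latest sensor values.
    """
    if reading["motor_temp_c"] > 90:
        # Overheating: distinguish mechanical from cooling causes.
        if reading["vibration_mm_s"] > 7.0:
            return "bearing wear suspected"
        return "check cooling system"
    if reading["output_particle_um"] > 150:
        return "adjust classifier speed"
    return "normal"

print(diagnose({"motor_temp_c": 95, "vibration_mm_s": 8.2,
                "output_particle_um": 120}))
```

In a graphical decision-authoring tool the same logic would be drawn as a tree by the subject matter expert rather than written as code.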

Given the complementary nature of automated analytics and automated decisions, I would recommend the use of symbolic learning data analytics techniques. Symbolic analytics generates rule/tree structures from data which are interpretable and understandable to the domain experts. Whilst rule/tree analytics models are marginally less accurate than deep learning or other ‘black box’ models, the transparency of symbolic data models offers a number of advantages:

i- The analytics models can be validated by the domain experts
ii- The domain experts can add additional decision knowledge to the analytics models
iii- The transparency of the data models gives the experts insights into the root causes of problems and highlights opportunities for performance improvement.

Combining automated knowledge from data analytics with automated decisions from domain experts can deliver a paradigm shift in the way organisations use IOT to manage their assets / processes. It allows organisations to deploy their best practice expertise 24/7 real time throughout the organisation and rapidly turn newly acquired data into new and improved knowledge.

Below are example decision and analytics knowledge from an industrial IOT solution that we developed for a major manufacturer of powder processing mills. The solution monitors the performance of the mills to diagnose problems and to detect anomalous behaviour:

The Fault diagnosis tree below is part of the knowledge captured from the subject matter experts within the company

[Image: fault diagnosis tree]



The tree below is generated by automated data analytics and relates the output particle size to other process parameters and environmental variables. The tree is one of many analytics models used to monitor anomalous behaviour of the process.

[Image: automated data analytics tree]



The above example demonstrates both the complementary nature of rules and analytics automation and the interpretability of symbolic analytics. In my next posting I will cover the subject of the rapid capture of decision making expertise using decision structuring and the induction of decision trees from decision examples provided by subject matter experts.

Read more…
