



Today, forward-thinking companies across industries are implementing digital twin technology in increasingly fascinating and ground-breaking ways. With Internet of Things (IoT) technology improving every day and more and more compute power readily available to organizations of all sizes, the possibilities of what you can do with digital twin technology are only as limited as your imagination.

What Is a Digital Twin?

A digital twin is a virtual representation of a physical asset that is practically indistinguishable from its physical counterpart. It is made possible thanks to IoT sensors that gather data from the physical world and send it to be virtually reconstructed. This data includes design and engineering details that describe the asset’s geometry, materials, components, and behavior or performance.

When combined with analytics, digital twin data can unlock hidden value for an organization and provide insights about how to improve operations, increase efficiency or discover and resolve problems before the real-world asset is affected.
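As a minimal sketch of the two paragraphs above, the hypothetical `PumpTwin` below mirrors one physical pump from IoT sensor readings and runs a trivial analytics check on the twin's state; the asset type, field names, and vibration threshold are all invented for illustration, not part of any real platform:

```python
from dataclasses import dataclass, field

VIBRATION_LIMIT = 7.1  # mm/s; an assumed alert threshold for this sketch

@dataclass
class PumpTwin:
    """Virtual state of one physical pump, updated from sensor readings."""
    asset_id: str
    vibration_mm_s: float = 0.0
    temperature_c: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        # A reading is the payload an edge gateway would forward to the cloud.
        self.vibration_mm_s = reading["vibration_mm_s"]
        self.temperature_c = reading["temperature_c"]
        self.history.append(reading)

    def needs_inspection(self) -> bool:
        # Analytics on twin state: flag the asset before the physical pump fails.
        return self.vibration_mm_s > VIBRATION_LIMIT

twin = PumpTwin("pump-17")
twin.ingest({"vibration_mm_s": 4.2, "temperature_c": 61.0})
assert not twin.needs_inspection()
twin.ingest({"vibration_mm_s": 9.8, "temperature_c": 64.5})
assert twin.needs_inspection()
```

The point of the sketch is the flow, not the model: sensor data updates the virtual state, and analytics over that state surfaces a problem before the real-world asset is affected.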

These 4 Steps Are Critical for Digital Twin Success:

Involve the Entire Product Value Chain

It’s critical to involve stakeholders across the product value chain in your design and implementation. Each department faces diverse business challenges in their day-to-day operations, and a digital twin provides ready solutions to problems such as the inability to coordinate across end-to-end supply chain processes, minimal or no cross-functional collaboration, the inability to make data-driven decisions, or clouded visibility across the supply chain. Decision-makers at each level of the value chain have extensive knowledge on critical and practical challenges. Including their inputs will ensure a better and more efficient design of the digital twin and ensure more valuable and relevant insights.

Establish Well-Documented Practices

Standardized and well-documented design practices help organizations communicate ideas across departments, or across the globe, and make it easier for multiple users of the digital twin to build or alter the model without destroying existing components or repeating work. Best-in-class modelling practices increase transparency while simplifying and streamlining collaborative work.

Include Data From Multiple Sources

Data from multiple sources—both internal and external—is an essential part of creating realistic and helpful simulations. 3D models and geometry are sufficient to show how parts fit together and how a product works, but more input is required to model how various faults or errors might occur somewhere in the product’s lifecycle. Because many errors and problems are nearly impossible for humans alone to predict accurately, a digital twin needs a vast amount of data and a robust analytics program to run algorithms that make accurate forecasts and prevent downtime.

Ensure Long Access Lifecycles 

Digital twins implemented using proprietary design software have a risk of locking owners into a single vendor, which ties the long-term viability of the digital twin to the longevity of the supplier’s product. This risk is especially significant for assets with long lifecycles such as buildings, industrial machinery, airplanes, etc., since the lifecycles of these assets are usually much longer than software lifecycles. This proprietary dependency only becomes riskier and less sustainable over time. To overcome these risks, IT architects and digital twin owners need to carefully set terms with software vendors to ensure data compatibility is maintained and vendor lock-in can be avoided.

Common Pitfalls to Digital Twin Implementation

Digital twin implementation requires an extraordinary investment of time, capital, and engineering might, and as with any project of this scale, there are several common pitfalls to implementation success.

Pitfall 1: Using the Same Platform for Different Applications

Although it’s tempting to try to repurpose a digital twin platform, doing so can lead to incorrect data at best and catastrophic mistakes at worst. Each digital twin is completely unique to a part or machine; assets with unique operating conditions and configurations therefore cannot share digital twin platforms.

Pitfall 2: Going Too Big, Too Fast

In the long run, a digital twin replica of your entire production line or building is possible and could provide incredible insights, but it is a mistake to try to deploy digital twins for every piece of equipment and program at once. Doing too much too fast is not only costly; it might also cause you to rush and miss critical data and configurations along the way. Rather than rushing to do it all at once, perfect a few critical pieces of machinery first and work your way up from there.

Pitfall 3: Inability to Source Quality Data

Data collected in the field is subject to quality errors due to human mistakes or duplicate entries. The insights your digital twin provides are only as valuable as the data it runs on. Therefore, it is imperative to standardize data collection practices across your organization and to regularly cleanse your data to remove duplicate and erroneous entries.
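One way to standardize that cleansing is a small ingestion filter that drops duplicates and implausible values before they reach the twin. The field names and the plausible value range below are assumptions for illustration only:

```python
def cleanse(readings):
    """Drop duplicate and obviously erroneous sensor entries.

    A reading is a dict with 'device', 'ts' and 'value'; duplicates are
    repeated (device, ts) pairs, and values outside an assumed plausible
    range are treated as sensor or data-entry errors.
    """
    seen = set()
    clean = []
    for r in readings:
        key = (r["device"], r["ts"])
        if key in seen:
            continue  # duplicate entry
        if not (-40.0 <= r["value"] <= 125.0):  # assumed plausible range
            continue  # erroneous entry
        seen.add(key)
        clean.append(r)
    return clean
```

Running the same filter at every collection point is what makes the practice "standardized": every team's data passes the same duplicate and sanity checks.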

Pitfall 4: Lack of Device Communication Standards

If your IoT devices do not speak a common language, miscommunications can muddy your processes and compromise your digital twin initiative. Build an IT framework that allows your IoT devices to communicate with one another seamlessly to ensure success.
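A lightweight way to give devices that common language is to validate every incoming message against one shared schema at the point of ingestion. The schema fields here are hypothetical, a sketch rather than any standard:

```python
import json

# Assumed shared schema: every device message must carry these fields.
REQUIRED_FIELDS = {
    "device_id": str,
    "ts": (int, float),
    "metric": str,
    "value": (int, float),
}

def normalize(raw: bytes) -> dict:
    """Validate an incoming device message against the shared schema.

    Rejecting malformed messages at ingestion keeps devices speaking a
    common language even when their firmware and vendors differ.
    """
    msg = json.loads(raw)
    for name, types in REQUIRED_FIELDS.items():
        if name not in msg or not isinstance(msg[name], types):
            raise ValueError(f"message missing or mistyped field: {name}")
    return msg
```

Devices that cannot emit the schema natively would get a small translation shim at the gateway, so the rest of the pipeline only ever sees one message shape.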

Pitfall 5: Failing to Get User Buy-In

As mentioned earlier in this eBook, a successful digital twin strategy includes users from across your product value chain. It is critical that your users understand and appreciate the value your digital twin brings to them individually and to your organization as a whole. Lack of buy-in due to skepticism, lack of confidence, or resistance can lead to a lack of user participation, which can undermine all of your efforts.

The Challenge of Measuring Digital Twin Success

Each digital twin is unique and completely separate in its function and end-goal from others on the market, which can make measuring success challenging. Depending on the level of the twin implemented, businesses need to create KPIs for each individual digital twin as it relates to larger organizational goals.

The configuration of a digital twin is determined by the type of input data, the number of data sources and the defined metrics, and that configuration determines the value an organization can extract from the twin. A twin with a richer configuration can therefore yield better predictions than a twin with a sparser one. The reality is that success is relative, and the effectiveness of two different digital twins can rarely be compared side by side.

Conclusion

It’s possible — probable even — that in the future all people, enterprises, and even cities will have a digital twin. With the enormous growth predicted in the digital twin market in the coming years, it’s evident that the technology is here to stay. The possible applications of digital twins are truly limitless, and as IoT technology becomes more advanced and widely accessible, we’re likely to see many more innovative and disruptive use cases.

However, a technology with this much potential must be carefully and thoughtfully implemented in order to ensure its business value and long-term viability. Before embracing a digital twin, an organization must first audit its maturity, standardize processes, and prepare its culture and staff for this radical change in operations. Is your organization ready?

Originally posted here.


Five IoT retail trends for 2021

In 2020 we saw retailers hard hit by the economic effects of the COVID-19 pandemic, with dozens of retailers—Neiman Marcus, J.C. Penney, and Brooks Brothers to name a few—declaring bankruptcy. During the unprecedented chaos of lockdowns and social distancing, consumers accelerated their shift to online shopping. Retailers like Target and Best Buy saw online sales double while Amazon’s e-commerce sales grew 39 percent. Retailers navigated supply chain disruptions due to COVID-19, climate change events, trade tensions, and cybersecurity events.

After the last twelve tumultuous months, what will 2021 bring for the retail industry? I spoke with Microsoft Azure IoT partners to understand how they are planning for 2021 and compiled insights about five retail trends. One theme we’re seeing is a focus on efficiency. Retailers will look to pre-configured digital platforms that leverage cloud-based technologies including the Internet of Things (IoT), artificial intelligence (AI), and edge computing to meet their business goals. 


Empowering frontline workers with real-time data

In 2021, retailers will increase efficiency by empowering frontline workers with real-time data. Retail employees will be able to respond more quickly to customers and expand their roles to manage curbside pickups, returns, and frictionless kiosks.  

In H&M Mitte Garten in Berlin, H&M empowered employee ambassadors with fashionable bracelets connected to the Azure cloud. Ambassadors were able to receive real-time requests via their bracelets when customers needed help in fitting rooms or at a cash desk. The ambassadors also received visual merchandising instructions and promotional updates. 

Through the app built on Microsoft partner Turnpike’s wearable SaaS platform leveraging Azure IoT Hub, these frontline workers could also communicate with their peers or their management team during or after store hours. With the real-time data from the connected bracelets, H&M ambassadors were empowered to deliver best-in-class service.

Carl Norberg, founder of Turnpike, explained, “We realized that by connecting store IoT sensors, POS systems, and AI cameras, store staff can be empowered to interact at the right place at the right time.”

Leveraging live stream video to innovate omnichannel

Livestreaming has been exploding in China as influencers sell through their social media channels. Forbes recently projected that nearly 40 percent of China’s population will have viewed livestreams during 2020. Retailers in the West are starting to leverage live stream technology to create innovative omnichannel solutions.

For example, Kjell & Company, one of Scandinavia’s leading consumer electronics retailers, is using a solution from Bambuser and Ombori called Omni-queue built on top of the Ombori Grid. Omni-queue enables store employees to handle a seamless combination of physical and online visitors within the same queue using one-to-one live stream video for online visitors.  

Kjell & Company ensures e-commerce customers receive the same level of technical expertise and personalized service they would receive in one of its physical locations. Omni-queue also makes highly efficient use of store employees through advanced routing and knowledge matching.

Maryam Ghahremani, CEO of Bambuser, explains, “Live video shopping is the future, and we are so excited to see how Kjell & Company has found a use for our one-to-one solution.” Martin Knutson, CTO of Kjell & Company, added, “With physical store locations heavily affected due to the pandemic, offering a new and innovative way for customers to ask questions—especially about electronics—will be key to Kjell’s continued success in moving customers online.”


Augmenting omnichannel with dark stores and micro-fulfillment centers  

In 2021, retailers will continue experimenting with dark stores—traditional retail stores that have been converted to local fulfillment centers—and micro-fulfillment centers. These supply chain innovations will increase efficiency by bringing products closer to customers. 

Microsoft partner Attabotics, a 3D robotics supply chain company, works with an American luxury department store retailer to reduce costs and delivery time using a micro-fulfillment center. Attabotics’ unique use of both horizontal and vertical space reduces warehouse needs by 85 percent. Attabotics’ structure and robotic shuttles leverage Microsoft Azure Edge Zones, Azure IoT Central, and Azure Sphere.

The luxury retailer leverages the micro-fulfillment center to package and ship multiple beauty products together. As a result, customers experience faster delivery times. The retailer also reduces costs related to packaging, delivery, and warehouse space.  

Scott Gravelle, Founder, CEO, and CTO of Attabotics explained, “Commerce is at a crossroads, and for retailers and brands to thrive, they need to adapt and take advantage of new technologies to effectively meet consumers’ growing demands. Supply chains have not traditionally been set up for e-commerce. We will see supply chain innovations in automation and modulation take off in 2021 as they bring a wider variety of products closer to the consumer and streamline the picking and shipping to support e-commerce.” 


Helping keep warehouse workers safe

What will this look like? Cognizant’s recent work with an athletic apparel retailer offers a blueprint. During the peak holiday season, the retailer needed to protect its expanding warehouse workforce while minimizing absenteeism. To implement physical distancing and other safety measures, the retailer leveraged Cognizant’s Safe Buildings solution built with Azure IoT Edge and IoT Hub services.

With this solution, employees maintained physical distancing using smart wristbands. When two wristbands stayed within a pre-defined distance of each other for more than a pre-defined time, the workers’ bands buzzed to reinforce safe behaviors. The result was nearly 98 percent distancing compliance in the initial pilot. As the retailer plans to scale up its workforce at other locations, it is considering additional safety modules:

  • Touchless temperature checks.
  • Occupancy sensors that communicate capacity information to the management team for compliance records.
  • Air quality sensors that provide environmental data so the facility team can help ensure optimal conditions for workers’ health.
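The wristband behavior described above (buzz when two bands stay within a pre-defined distance for more than a pre-defined time) can be sketched as follows; the thresholds are invented for illustration and are not Cognizant's actual values:

```python
DISTANCE_M = 2.0  # assumed pre-defined separation distance, in meters
DWELL_S = 10      # assumed pre-defined dwell time, in seconds

def should_buzz(samples, distance_m=DISTANCE_M, dwell_s=DWELL_S):
    """Return True when two wristbands stay too close for too long.

    `samples` is a time-ordered list of (timestamp_s, separation_m)
    pairs for one pair of wristbands.
    """
    close_since = None
    for ts, sep in samples:
        if sep < distance_m:
            if close_since is None:
                close_since = ts  # proximity episode begins
            if ts - close_since >= dwell_s:
                return True       # sustained proximity: reinforce distancing
        else:
            close_since = None    # pair separated; reset the timer
    return False
```

The dwell-time reset is the key design choice: a brief pass in an aisle never triggers the buzz, only sustained proximity does.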

“For organizations to thrive during and post-pandemic, enterprise-grade workplace safety cannot be compromised. Real-time visibility of threats is providing essential businesses an edge in minimizing risks proactively while building employee trust and empowering productivity in a safer workplace,” said Rajiv Mukherjee, Cognizant’s IoT Practice Director for Retail and Consumer Goods.

Optimizing inventory management with real-time edge data

In 2021, retailers will ramp up the adoption of intelligent edge solutions to optimize inventory management with real-time data. Most retailers have complex inventory management systems. However, no matter how good the systems are, there can still be data gaps due to grocery pick-up services, theft, and sweethearting. The key to addressing these gaps is to combine real-time data from applications running on edge cameras and other edge devices in the physical store with backend enterprise resource planning (ERP) data.  
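Combining real-time edge observations with backend ERP counts can be sketched as a simple reconciliation. The data shapes below are assumptions for illustration, not any vendor's API:

```python
def inventory_gaps(erp_counts, edge_events):
    """Compare ERP on-hand counts with activity observed at the edge.

    `erp_counts` maps SKU -> quantity the ERP believes is on hand.
    `edge_events` is a list of (sku, delta) tuples from edge cameras and
    sensors (e.g. -1 when an item leaves a shelf unscanned).
    Returns {sku: (erp_quantity, observed_quantity)} where they disagree.
    """
    observed = dict(erp_counts)
    for sku, delta in edge_events:
        observed[sku] = observed.get(sku, 0) + delta
    return {
        sku: (erp_counts.get(sku, 0), qty)
        for sku, qty in observed.items()
        if qty != erp_counts.get(sku, 0)
    }
```

The gaps this surfaces are exactly the blind spots named above, such as grocery pick-up, theft, and sweethearting, where goods move without a matching ERP transaction.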

Seattle Goodwill worked with Avanade to implement a new Microsoft-based Dynamics platform across its 24 stores. The new system provided almost real-time visibility into the movement of goods from the warehouses to the stores. 

Rasmus Hyltegård, Director of Advanced Analytics at Avanade explained, “To ensure inventory moves quickly off the shelves, retailers can combine real-time inventory insights from Avanade’s smart inventory accelerator with other solutions across the customer journey to meet customer expectations.” Hyltegård continued, “Customers can check online to find the products they want, find the stores with product in stock, and gain insight into which stores have the shortest queues, which is important during the pandemic and beyond. Once a customer is in the store, digital signage allows for endless aisle support.” 


Summary

The new year 2021 holds a wealth of opportunities for retailers. We foresee retail leaders reimagining their businesses by investing in platforms that integrate IoT, AI, and edge computing technologies. Retailers will focus on increasing efficiencies to reduce costs. Modular platforms supported by an ecosystem of strong partner solutions will empower frontline workers with data, augment omnichannel fulfillment with dark stores and micro-fulfillment, leverage livestream video to enhance omnichannel, prioritize warehouse worker safety, and optimize inventory management with real-time data. 

Originally posted here.



Technological fragmentation is not just one of the biggest barriers to IoT adoption, but it also complicates the goal of securing connected devices and related services. With IoT-related cyberattacks on the rise, organizations must become more adept at managing cyber-risk or face potential reputational and legal consequences. This article summarizes best practices for enterprise and industrial IoT projects.

Key takeaways from this article include the following:

  • Data security remains a central technology hurdle related to IoT deployments.
  • IoT security best practices also can help organizations curb the risk of broader digital transformation initiatives.
  • Securing IoT projects requires a comprehensive view that encompasses the entire life cycle of connected devices and relevant supply chains.

Fragmentation and security have long been two of the most significant barriers to Internet of Things adoption. The two challenges are also closely related.

Despite the Internet of Things (IoT) moniker, which implies a synthesis of connected devices, IoT technologies vary considerably based on their intended use. Organizations deploying IoT thus rely on an array of connectivity types, standards and hardware. As a result, even a simple IoT device can pose many security vulnerabilities, including weak authentication, insecure cloud integration, and outdated firmware and software.

For many organizations with active or planned IoT deployments, security concerns have hampered digital ambitions. An IoT World Today August 2020 survey revealed data security as the top technology hurdle for IoT deployments, selected by 46% of respondents.

Fortunately, IoT security best practices can help organizations reduce the risks facing their deployments and broader digital transformation initiatives. These same best practices can also reduce legal liability and protect an organization’s reputation.

But to be effective, an IoT-focused security strategy requires a broad view that encompasses the entire life cycle of an organization’s connected devices and projects in addition to relevant supply chains.

Know What You Have and What You Need

Asset management is a cornerstone of effective cyber defence. Organizations should identify which processes and systems need protection. They should also strive to assess the risk cyber attacks pose to assets and their broader operations.

In terms of enterprise and industrial IoT deployments, asset awareness is frequently spotty. It can be challenging given the array of industry verticals and the lack of comprehensive tools to track assets across those verticals. But asset awareness also demands a contextual understanding of the computing environment, including the interplay among devices, personnel, data and systems, as the National Institute of Standards and Technology (NIST) has observed.

There are two fundamental questions when creating an asset inventory: What is on my network? And what are these assets doing on my network?

Answering the latter requires tracking endpoints’ behaviours and their intended purpose from a business or operational perspective. From a networking perspective, asset management should involve more than counting networking nodes; it should focus on data protection and building intrinsic security into business processes.
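An asset inventory that answers both questions (what is on my network, and what are these assets doing on it) might look like the hypothetical structure below; every name and field is illustrative, not from NIST or any product:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Asset:
    """One inventory entry: what the device is and what it should do."""
    mac: str
    device_type: str           # e.g. "camera", "plc", "badge-reader"
    owner: str                 # team accountable for the device
    expected_peers: frozenset  # hosts the device legitimately talks to

def unexpected_flows(inventory, flows):
    """Flag traffic that contradicts an asset's intended purpose.

    `flows` is a list of (src_mac, dst_host) pairs observed on the
    network; anything from an unknown device, or to a destination the
    inventory does not expect, is worth investigating.
    """
    by_mac = {a.mac: a for a in inventory}
    alerts = []
    for src, dst in flows:
        asset = by_mac.get(src)
        if asset is None:
            alerts.append((src, dst, "unknown device"))
        elif dst not in asset.expected_peers:
            alerts.append((src, dst, "unexpected destination"))
    return alerts
```

Recording each device's intended purpose alongside its identity is what turns a node count into the contextual awareness described above.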

Relevant considerations include the following:

  • Compliance with relevant security and privacy laws and standards.
  • Interval of security assessments.
  • Optimal access of personnel to facilities, information and technology, whether remote or in-person.
  • Data protection for sensitive information, including strong encryption for data at rest and data in transit.
  • Degree of security automation versus manual controls, as well as physical security controls to ensure worker safety.

IoT device makers and application developers should also implement a vulnerability disclosure program that includes public contact information for security researchers and a plan for responding to disclosed vulnerabilities; bug bounty programs are another option.

Organizations that have accurately assessed current cybersecurity readiness need to set relevant goals and create a comprehensive governance program to manage and enforce operational and regulatory policies and requirements. Governance programs also ensure that appropriate security controls are in place. Organizations need to have a plan to implement controls and determine accountability for that enforcement. Another consideration is determining when security policies need to be revised.

An effective governance plan is vital for engineering security into architecture and processes, as well as for safeguarding legacy devices with relatively weak security controls. Devising an effective risk management strategy for enterprise and industrial IoT devices is a complex endeavour, potentially involving a series of stakeholders and entities. Organizations that find it difficult to assess the cybersecurity of their IoT project should consider third-party assessments.

Many tools are available to help organizations evaluate cyber-risk and defences. These include the National Vulnerability Database and the Security and Privacy Controls for Information Systems and Organizations document from the National Institute of Standards and Technology. Another resource is the list of 20 Critical Security Controls for Effective Cyber Defense. In terms of studying the threat landscape, MITRE ATT&CK is one of the most popular frameworks for adversary tactics and techniques.

At this stage of the process, another vital consideration is the degree of cybersecurity savviness and support within your business. Three out of ten organizations deploying IoT cite lack of support for cybersecurity as a hurdle, according to August 2020 research from IoT World Today. Security awareness is also frequently a challenge. Many cyberattacks against organizations — including those with an IoT element — involve phishing, like the 2015 attack against Ukraine’s electric grid.

IoT Security Best Practices

Internet of Things projects demand a secure foundation. That starts with asset awareness and extends to responding to real and simulated cyberattacks.

Step 1: Know what you have.

Building an IoT security program starts with achieving a comprehensive understanding of which systems need to be protected.

Step 2: Deploy safeguards.

Shielding devices from cyber-risk requires a thorough approach. This step involves cyber-hygiene, effective asset control and the use of other security controls.

Step 3: Identify threats.

Spotting anomalies can help mitigate attacks. Defenders should hone their skills through wargaming.

Step 4: Respond effectively.

Cyberattacks are inevitable, but each incident should generate feedback that flows back into step 1.

Exploiting human gullibility is one of the most common cybercriminal strategies. While cybersecurity training can help individuals recognize suspected malicious activities, such programs tend not to be entirely effective. “It only takes one user and one click to introduce an exploit into a network,” wrote Forrester analyst Chase Cunningham in the book “Cyber Warfare.” Recent studies have found that, even after receiving cybersecurity training, employees continue to click on phishing links about 3% of the time.

Security teams should work to earn the support of colleagues, while also factoring in the human element, according to David Coher, former head of reliability and cybersecurity for a major electric utility. “You can do what you can in terms of educating folks, whether it’s as a company IT department or as a consumer product manufacturer,” he said. But it is essential to put controls in place that can withstand user error and occasionally sloppy cybersecurity hygiene.

At the same time, organizations should also look to pool cybersecurity expertise inside and outside the business. “Designing the controls that are necessary to withstand user error requires understanding what users do and why they do it,” Coher said. “That means pulling together users from throughout your organization’s user chain — internal and external, vendors and customers, and counterparts.”

Those counterparts are easier to engage in some industries than others. Utilities, for example, have a strong track record in this regard, because of the limited market competition between them. Collaboration “can be more challenging in other industries, but no less necessary,” Coher added.

Deploy Appropriate Safeguards

Protecting an organization from cyberattacks demands a clear framework that is sensitive to business needs. While regulated industries are obligated to comply with specific cybersecurity-related requirements, consumer-facing organizations tend to have more generic requirements for privacy protections, data breach notifications and so forth. That said, all types of organizations deploying IoT have leeway in selecting a guiding philosophy for their cybersecurity efforts.

A basic security principle is to minimize networked or vulnerable systems’ attack surface — for instance, closing unused network ports and eliminating IoT device communication over the open internet. Generally speaking, building security into the architecture of IoT deployments and reducing attackers’ options to sabotage a system is more reliable than adding layers of defence to an unsecured architecture. Organizations deploying IoT projects should consider intrinsic security functionality such as embedded processors with cryptographic support.

But it is not practical to remove all risk from an IT system. For that reason, one of the most popular options is defence-in-depth, a military-rooted concept espousing the use of multiple layers of security. The basic idea is that if one countermeasure fails, additional security layers are available.

While the core principle of implementing multiple layers of security remains popular, defence in depth is also tied to the concept of perimeter-based defence, which is increasingly falling out of favour. “The defence-in-depth approach to cyber defence was formulated on the basis that everything outside of an organization’s perimeter should be considered ‘untrusted’ while everything internal should be inherently ‘trusted,’” said Andrew Rafla, a Deloitte Risk & Financial Advisory principal. “Organizations would layer a set of boundary security controls such that anyone trying to access the trusted side from the untrusted side had to traverse a set of detection and prevention controls to gain access to the internal network.”

Several trends have chipped away at the perimeter-based model. As a result, “modern enterprises no longer have defined perimeters,” Rafla said. “Gone are the days of inherently trusting any connection based on where the source originates.” Trends ranging from the proliferation of IoT devices and mobile applications to the popularity of cloud computing have fueled interest in cybersecurity models such as zero trust. “At its core, zero trust commits to ‘never trusting, always verifying’ as it relates to access control,” Rafla said. “Within the context of zero trust, security boundaries are created at a lower level in the stack, and risk-based access control decisions are made based on contextual information of the user, device, workload or network attempting to gain access.”
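A risk-based access decision of the kind Rafla describes can be sketched as a set of contextual checks evaluated on every request. Every attribute name here is illustrative rather than taken from any product:

```python
def allow_access(user, device, resource):
    """Zero-trust style decision: verify every request, trust nothing.

    `user` and `device` are dicts of contextual attributes gathered at
    request time; access requires every check to pass, regardless of
    which network the request came from.
    """
    checks = [
        user.get("mfa_passed") is True,            # identity verified this session
        device.get("patched") is True,             # device security posture
        device.get("managed") is True,             # enrolled, known device
        resource in user.get("entitlements", ()),  # least privilege
    ]
    return all(checks)
```

Note what is absent: no check asks whether the request originated "inside" the perimeter, which is exactly the shift away from the trusted-internal-network assumption.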

Zero trust’s roots stretch back to the 1970s when a handful of computer scientists theorized on the most effective access control methods for networks. “Every program and every privileged user of the system should operate using the least amount of privilege necessary to complete the job,” one of those researchers, Jerome Saltzer, concluded in 1974.

While the concept of least privilege sought to limit trust among internal computing network users, zero trust extends the principle to devices, networks, workloads and external users. The recent surge in remote working has accelerated interest in the zero-trust model. “Many businesses have changed their paradigm for security as a result of COVID-19,” said Jason Haward-Grau, a leader in KPMG’s cybersecurity practice. “Many organizations are experiencing a surge to the cloud because businesses have concluded they cannot rely on a physically domiciled system in a set location.”

Based on data from Deloitte, 37.4% of businesses accelerated their zero-trust adoption plans in response to the pandemic. In contrast, more than one-third, or 35.2%, of those embracing zero trust stated that the pandemic had not changed the speed of their organization’s zero-trust adoption.

“I suspect that many of the respondents that said their organization’s zero-trust adoption efforts were unchanged by the pandemic were already embracing zero trust and were continuing with efforts as planned,” Rafla said. “In many cases, the need to support a completely remote workforce in a secure and scalable way has provided a tangible use case to start pursuing zero-trust adoption.”

A growing number of organizations are beginning to blend aspects of zero trust and traditional perimeter-based controls through a model known as secure access service edge (SASE), according to Rafla. “In this model, traditional perimeter-based controls of the defence-in-depth approach are converged and delivered through a cloud-based subscription service,” he said. “This provides a more consistent, resilient, scalable and seamless user experience regardless of where the target application a user is trying to access may be hosted. User access can be tightly controlled, and all traffic passes through multiple layers of cloud-based detection and prevention controls.”

Regardless of the framework, organizations should have policies in place for access control and identity management, especially for passwords. As Forrester’s Cunningham noted in “Cyber Warfare,” the password is “the single most prolific means of authentication for enterprises, users, and almost any system on the planet” and the lynchpin of failed security in cyberspace: “Almost everything uses a password at some stage.” Numerous password repositories have been breached, and passwords are frequently recycled, making the password a common security weakness for user accounts as well as IoT devices.

A significant number of consumer-grade IoT devices have also had their default passwords posted online. Weak passwords used in IoT devices also fueled the growth of the Mirai botnet, which led to widespread internet outages in 2016. More recently, unsecured passwords on IoT devices in enterprise settings have reportedly attracted state-sponsored actors’ attention.

IoT devices and related systems also need an effective mechanism for device management, including tasks such as patching, connectivity management, device logging, device configuration, software and firmware updates and device provisioning. Device management capabilities also extend to access control modifications and include remediation of compromised devices. It is vital to ensure that device management processes themselves are secure and that a system is in place for verifying the integrity of software updates, which should be regular and not interfere with device functionality.
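One piece of that update-integrity check can be sketched in a few lines. Assuming the vendor publishes a SHA-256 digest alongside each firmware image (an assumption for illustration; real deployments should verify a cryptographic signature rather than a bare hash), a device can validate an update before applying it:

```python
import hashlib
import hmac

def verify_update(payload: bytes, expected_sha256: str) -> bool:
    """Compare the downloaded update's SHA-256 digest against the published hash.
    compare_digest avoids timing side channels when comparing the two values."""
    digest = hashlib.sha256(payload).hexdigest()
    return hmac.compare_digest(digest, expected_sha256)

firmware = b"firmware-image-v2.4"
good_hash = hashlib.sha256(firmware).hexdigest()
print(verify_update(firmware, good_hash))           # True
print(verify_update(b"tampered-image", good_hash))  # False
```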

Organizations must additionally address the life span of devices and the cadence of software updates. Many environments allow IT pros to identify a specific end-of-life period and remove or replace expired hardware. In such cases, there should be a plan for device disposal or transfer of ownership. In other contexts, such as in industrial environments, legacy workstations don’t have a defined expiration date and run out-of-date software. These systems should be segmented on the network. Often, such industrial systems cannot be easily patched like IT systems are, requiring security professionals to perform a comprehensive security audit on the system before taking additional steps.

Identify Threats and Anomalies

In recent years, attacks have become so common that the cybersecurity community has shifted its approach from preventing breaches to assuming a breach has already happened. The threat landscape has evolved to the point that cyberattacks against most organizations are inevitable.

“You hear it everywhere: It’s a matter of when, not if, something happens,” said Dan Frank, a principal at Deloitte specializing in privacy and data protection. Matters have only become more precarious in 2020. The FBI has reported a three- to four-fold increase in cybersecurity complaints after the advent of COVID-19.

Advanced defenders have taken a more aggressive stance known as threat hunting, which focuses on proactively identifying breaches. Another popular strategy is to study adversary behaviour and tactics to classify attack types. Models such as the MITRE ATT&CK framework and the Common Vulnerability Scoring System (CVSS) are popular for assessing adversary tactics and vulnerabilities.
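As a small concrete anchor for the CVSS side, the v3.x specification maps its 0.0–10.0 base score onto qualitative severity bands, which is straightforward to reproduce in code:

```python
def cvss_severity(score: float) -> str:
    """Map a CVSS v3.x base score to its qualitative severity rating,
    per the bands defined in the CVSS v3.x specification."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

print(cvss_severity(9.8))  # Critical
print(cvss_severity(5.3))  # Medium
```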

While approaches to analyzing vulnerabilities and potential attacks vary according to an organization’s maturity, situational awareness is a prerequisite at any stage. The U.S. Army Field Manual defines the term like this: “Knowledge and understanding of the current situation which promotes timely, relevant and accurate assessment of friendly, enemy and other operations within the battlespace to facilitate decision making.”

In cybersecurity as in warfare, situational awareness requires a clear perception of the elements in an environment and their potential to cause future events. In some cases, the possibility of a future cyber attack can be averted by merely patching software with known vulnerabilities.

Intrusion detection systems can automate some degree of monitoring of networks and operating systems. Systems based on detecting malware signatures can also identify common attacks. They are, however, not effective at recognizing so-called zero-day malware, which has not yet been catalogued by security researchers. Signature-based intrusion detection is also ineffective at detecting custom attacks (e.g., a disgruntled employee who knows just enough Python or PowerShell to be dangerous). Sophisticated threat actors who slip through defences to gain network access can become insiders, with permission to view sensitive networks and files. In such cases, situational awareness is a prerequisite to mitigating damage.

Another strategy for intrusion detection systems is to focus on context and anomalies rather than malware signatures. Such systems could use machine learning to learn legitimate commands, use of messaging protocols and so forth. While this strategy overcomes the reliance on malware signatures, it can potentially trigger false alarms. Such a system can also detect so-called slow-rate attacks, a type of denial of service attack that gradually robs networking bandwidth but is more difficult to detect than volumetric attacks.
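A toy version of such a context-learning detector illustrates the idea. The command strings and threshold below are invented for illustration; production systems model far richer features than simple frequency counts:

```python
from collections import Counter

class CommandAnomalyDetector:
    """Toy anomaly detector: learns which commands are normal during a
    training window, then flags commands rarely or never seen before."""

    def __init__(self, min_count: int = 2):
        self.baseline = Counter()
        self.min_count = min_count

    def train(self, commands):
        """Record legitimate traffic observed during the learning phase."""
        self.baseline.update(commands)

    def is_anomalous(self, command: str) -> bool:
        """Flag any command seen fewer than min_count times in the baseline."""
        return self.baseline[command] < self.min_count

det = CommandAnomalyDetector()
det.train(["GET /status", "GET /status", "SET temp=21", "GET /status"])
print(det.is_anomalous("GET /status"))        # False: frequent in baseline
print(det.is_anomalous("DUMP /credentials"))  # True: never seen before
```

Note the trade-off the paragraph mentions: lowering `min_count` reduces false alarms but lets more unusual-yet-malicious commands through.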

Respond Effectively to Cyber-Incidents

The foundation for successful cyber-incident response lies in having concrete security policies, architecture and processes. “Once you have a breach, it’s kind of too late,” said Deloitte’s Frank. “It’s what you do before that matters.”

That said, the goal of warding off all cyber-incidents, which range from violations of security policies and laws to data breaches, is not realistic. It is thus essential to implement short- and long-term plans for managing cybersecurity emergencies. Organizations should have contingency plans for addressing possible attacks, practise responding to them through wargaming exercises, and develop effective, coordinated escalation measures for breaches that succeed.

There are several aspects of the zero trust model that enhance organizations’ ability to respond and recover from cyber events. “Network and micro-segmentation, for example, is a concept by which trust zones are created by organizations around certain classes or types of assets, restricting the blast radius of potentially destructive cyberattacks and limiting the ability for an attacker to move laterally within the environment,” Rafla said. Also, efforts to automate and orchestrate zero trust principles can enhance the efficiency of security operations, speeding efforts to mitigate attacks. “Repetitive and manual tasks can now be automated and proactive actions to isolate and remediate security threats can be orchestrated through integrated controls,” Rafla added.

Response to cyber-incidents involves coordinating multiple stakeholders beyond the security team. “Every business function could be impacted — marketing, customer relations, legal compliance, information technology, etc.,” Frank said.

A six-step model for cyber-incident response from the SANS Institute contains the following steps:

  • Preparation: Preparing the team to react to events ranging from cyberattacks to hardware failure and power outages.
  • Identification: Determining if an operational anomaly should be classified as a cybersecurity incident, and how to respond to it.
  • Containment: Segmenting compromised devices on the network long enough to limit damage in the event of a confirmed cybersecurity incident. Longer term, containment measures involve hardening affected systems so they can return to normal operations.
  • Eradication: Removing or restoring compromised systems. If a security team detects malware on an IoT device, for instance, this phase could involve reimaging its hardware to prevent reinfection.
  • Recovery: Integrating previously compromised systems back into production and ensuring they operate normally after that. In addition to addressing the security event directly, recovery can involve crisis communications with external stakeholders such as customers or regulators.
  • Lessons Learned: Documenting and reviewing the factors that led to the cyber-incident and taking steps to avoid future problems. Feedback from this step should create a feedback loop providing insights that support future preparation, identification, etc.
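The cycle above, including the feedback loop from Lessons Learned back into Preparation, can be sketched as a simple state machine:

```python
from enum import Enum

class Phase(Enum):
    """The six SANS incident-response phases, in order."""
    PREPARATION = 1
    IDENTIFICATION = 2
    CONTAINMENT = 3
    ERADICATION = 4
    RECOVERY = 5
    LESSONS_LEARNED = 6

def next_phase(current: Phase) -> Phase:
    """Advance one step; Lessons Learned feeds back into Preparation."""
    if current is Phase.LESSONS_LEARNED:
        return Phase.PREPARATION  # the feedback loop described above
    return Phase(current.value + 1)

print(next_phase(Phase.CONTAINMENT).name)      # ERADICATION
print(next_phase(Phase.LESSONS_LEARNED).name)  # PREPARATION
```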

While the bulk of the SANS model focuses on cybersecurity operations, the last step should be a multidisciplinary process. Investing in cybersecurity liability insurance to offset risks identified after ongoing cyber-incident response requires support from upper management and the legal team. Ensuring compliance with the evolving regulatory landscape also demands feedback from the legal department.

A central practice that can prove helpful is documentation — not just for security incidents, but as part of ongoing cybersecurity assessment and strategy. Organizations with mature security documentation tend to be better positioned to deal with breaches.

“If you fully document your program — your policies, procedures, standards and training — that might put you in a more favourable position after a breach,” Frank explained. “If you have all that information summarized and ready, in the event of an investigation by a regulatory authority after an incident, it shows the organization has robust programs in place.”

Documenting security events and controls can help organizations become more proactive and more capable of embracing automation and machine learning tools. As they collect data, they should repeatedly ask how to make the most of it. KPMG’s Haward-Grau said cybersecurity teams should consider the following questions:

  • What data should we focus on?
  • What can we do to improve our operational decision making?
  • How do we reduce our time and costs efficiently and effectively, given the nature of the reality in which we’re operating?

Ultimately, answering those questions may involve using machine learning or artificial intelligence technology, Haward-Grau said. “If your business is using machine learning or AI, you have to digitally enable them so that they can do what they want to do,” he said.

Finally, documenting security events and practices as they relate to IoT devices and beyond can be useful in evaluating the effectiveness of cybersecurity spending and provide valuable feedback for digital transformation programs. “Security is a foundational requirement that needs to be ingrained holistically in architecture and processes and governed by policies,” said Chander Damodaran, chief architect at Brillio, a digital consultancy firm. ”Security should be a common denominator.”

IoT Security

Recent legislation requires businesses to assume responsibility for protecting Internet of Things (IoT) devices. “Security by Design” approaches are essential, since successful applications deploy millions of units and analysts predict billions of devices will be deployed in the next five to ten years. The cost of fixing compromised devices later could overwhelm a business.

Security risks can never be eliminated: there is no single solution for all concerns, and the cost to counter every possible threat vector is prohibitively expensive. The best we can do is minimize the risk, and design devices and processes to be easily updatable.

It is best to assess damage potential and implement security methods accordingly. For example, for temperature and humidity sensors used in environmental monitoring, data protection needs are not as stringent as devices transmitting credit card information. The first may require anonymization for privacy, and the second may require encryption to prevent unauthorized access.
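A minimal sketch of the anonymization case, assuming a salted hash is an acceptable pseudonymization scheme for the deployment (payloads such as card data would instead need authenticated encryption via a vetted library, which is beyond this snippet):

```python
import hashlib

def pseudonymize(device_id: str, salt: bytes) -> str:
    """Replace a real device or user identifier with a salted hash so records
    can still be correlated over time without exposing the identity."""
    return hashlib.sha256(salt + device_id.encode()).hexdigest()[:16]

salt = b"rotate-me-regularly"  # illustrative secret; keep out of source control
alias = pseudonymize("sensor-berlin-042", salt)
print(alias)
# The mapping is stable, so the same sensor always maps to the same alias:
print(alias == pseudonymize("sensor-berlin-042", salt))  # True
```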

Overall Objectives

Senders and receivers must authenticate. IoT devices must transmit to the correct servers and ensure they receive messages from the correct servers.
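One simple way to sketch that mutual message authentication is an HMAC over each message with a per-device pre-shared key. This is illustrative only; production deployments typically rely on TLS with mutual certificate authentication rather than hand-rolled schemes:

```python
import hashlib
import hmac

SHARED_KEY = b"provisioned-per-device-secret"  # illustrative key material

def sign(message: bytes) -> bytes:
    """Tag a message so the receiving side can prove who sent it."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Constant-time check that the tag matches the message and key."""
    return hmac.compare_digest(sign(message), tag)

msg = b'{"device":"t-17","temp":21.4}'
tag = sign(msg)
print(verify(msg, tag))               # True: genuine sender, intact message
print(verify(b'{"temp":99.9}', tag))  # False: altered or spoofed message
```

The same check works in both directions, so the device can also reject commands that do not carry a valid tag from the server.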

Mission-critical applications, such as vehicle crash notification or medical alerts, may fail if the connection is not reliable. Lack of communication itself is a lack of security.

Connectivity errors can make good data unreliable, and actions on the content may be erroneous. It is best to select connectivity providers with strong security practices—e.g., whitelisting access and traffic segregation to prevent unauthorized communication.
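Whitelisting can be sketched in a few lines with Python's standard `ipaddress` module; the address ranges below are illustrative placeholders for whatever your connectivity provider assigns:

```python
import ipaddress

# Only traffic from these (illustrative) networks may reach the device servers.
ALLOWED_NETS = [
    ipaddress.ip_network("10.20.0.0/16"),
    ipaddress.ip_network("192.168.50.0/24"),
]

def is_allowed(addr: str) -> bool:
    """Accept a connection only if its source address is in an allowed network."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in ALLOWED_NETS)

print(is_allowed("10.20.3.7"))     # True
print(is_allowed("203.0.113.99"))  # False
```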


IoT Security: 360-Degree Approach

Finally, only authorized recipients should access the information. In particular, privacy laws require extra care in accessing the information on individuals.

Data Chain

Developers should implement security best practices at all points in the chain. While traditional IT security protects servers with access controls, intrusion detection and the like, the farther away from the servers that best practices are implemented, the less impact a breach of a remote IoT device has on the overall application.

For example, compromised sensors might send bad data, and servers might take incorrect actions despite data filtering. Gateways thus offer an ideal location for security: they have the compute capacity for encryption and can implement over-the-air (OTA) updates for security fixes.

Servers often automate responses on data content. Simplistic and automated responses to bad data could cascade into much greater difficulty. If devices transmit excessively, servers could overload and fail to provide timely responses to transmissions—retry algorithms resulting from network unavailability often create data storms.
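A standard mitigation for such retry-induced data storms is exponential backoff with jitter: each device waits a random fraction of an exponentially growing window before retrying, so a fleet recovering from an outage does not retransmit in lockstep. A sketch, with illustrative base and cap values:

```python
import random

def backoff_delays(attempts: int, base: float = 1.0, cap: float = 300.0):
    """Full-jitter exponential backoff: retry n waits a random time in
    [0, min(cap, base * 2**n)] seconds, spreading retries across the fleet."""
    return [random.uniform(0, min(cap, base * 2 ** n)) for n in range(attempts)]

random.seed(7)  # deterministic output for the example only
for i, delay in enumerate(backoff_delays(5)):
    print(f"retry {i}: wait {delay:.1f}s")
```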

IoT devices often use electrical power rather than batteries, and compromised units could continue to operate for years. Implementing over-the-air (OTA) functions for remotely disabling devices could be critical.

When a breach requires device firmware updates, OTA support is vital when devices are inaccessible or large numbers of units must be modified rapidly. All devices should support OTA, even if it increases costs—for example, adding memory for managing multiple “images” of firmware for updates.
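A common way to manage those multiple firmware images is dual ("A/B") slots: write the new image to the inactive slot, switch to it, and fall back to the known-good slot if the new image fails to boot. A toy sketch, where the version strings and boot-success flag stand in for real image handling:

```python
class FirmwareSlots:
    """Sketch of dual-image ('A/B') OTA updating with automatic fallback."""

    def __init__(self):
        self.slots = {"A": "v1.0", "B": None}  # factory image in slot A
        self.active = "A"

    def install(self, image: str):
        """Write the new image to whichever slot is not currently running."""
        inactive = "B" if self.active == "A" else "A"
        self.slots[inactive] = image
        self.pending = inactive

    def boot(self, ok: bool) -> str:
        """Switch to the pending slot on success; keep the old slot on failure."""
        if ok:
            self.active = self.pending
        return self.slots[self.active]

fw = FirmwareSlots()
fw.install("v2.0")
print(fw.boot(ok=False))  # v1.0: bad image, device stays on the old slot
fw.install("v2.1")
print(fw.boot(ok=True))   # v2.1: update accepted
```

The extra memory for a second image is exactly the cost trade-off mentioned above; the payoff is that a failed update never bricks the device.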

In summary, the IoT security best practices of authentication, encryption, remote device disable and OTA security fixes, combined with traditional IT server protection, offer the best chance of minimizing the risk of attacks on IoT applications.


The benefits of IoT data are widely touted. Enhanced operational visibility, reduced costs, improved efficiencies and increased productivity have driven organizations to take major strides towards digital transformation. With countless promising business opportunities, it’s no surprise that IoT is expanding rapidly and relentlessly. It is estimated that there will be 75.4 billion IoT devices by 2025. As IoT grows, so do the volumes of IoT data that need to be collected, analyzed and stored. Unfortunately, significant barriers exist that can limit or block access to this data altogether.

Successful IoT data acquisition starts and ends with reliable and scalable IoT connectivity. Selecting the right communications technology is paramount to the long-term success of your IoT project and various factors must be considered from the beginning to build a functional wireless infrastructure that can support and manage the influx of IoT data today and in the future.

Here are five IoT architecture must-haves for unlocking IoT data at scale.

1. Network Ownership

For many businesses, IoT data is one of their greatest assets, if not the most valuable. This intensifies the demand to protect the flow of data at all costs. With maximum data authority and architecture control, the adoption of privately managed networks is becoming prevalent across industrial verticals.

Beyond the undeniable benefits of data security and privacy, private networks give users more control over their deployment with the flexibility to tailor their coverage to the specific needs of their campus style network. On a public network, users risk not having the reliable connectivity needed for indoor, underground and remote critical IoT applications. And since this network is privately owned and operated, users also avoid the costly monthly access, data plans and subscription costs imposed by public operators, lowering the overall total-cost-of-ownership. Private networks also provide full control over network availability and uptime to ensure users have reliable access to their data at all times.

2. Minimal Infrastructure Requirements

Since the number of end devices is largely determined by your IoT use cases, choosing a wireless technology that requires minimal supporting infrastructure, such as base stations and repeaters, and minimal configuration and optimization is crucial to cost-effectively scaling your IoT network.

Wireless solutions with long range and excellent penetration capability, such as next-gen low-power wide area networks, require fewer base stations to cover vast, structurally dense industrial or commercial campuses. Likewise, a robust radio link and large network capacity allow an individual base station to effectively support massive numbers of sensors without compromising performance, ensuring a continuous flow of IoT data today and in the future.

3. Network and Device Management

As IoT initiatives move beyond proofs-of-concept, businesses need an effective and secure approach to operate, control and expand their IoT network with minimal costs and complexity.

As IoT deployments scale to hundreds or even thousands of geographically dispersed nodes, a manual approach to connecting, configuring and troubleshooting devices is inefficient and expensive. Likewise, by leaving devices completely unattended, users risk losing business-critical IoT data when it’s needed the most. A network and device management platform provides a single-pane, top-down view of all network traffic, registered nodes and their status for streamlined network monitoring and troubleshooting. Likewise, it acts as the bridge between the edge network and users’ downstream data servers and enterprise applications so users can streamline management of their entire IoT project from device to dashboard.

4. Legacy System Integration

Most traditional assets, machines, and facilities were not designed for IoT connectivity, creating huge data silos. This leaves companies with two choices: building entirely new, greenfield plants with native IoT technologies or updating brownfield facilities for IoT connectivity. Highly integrable, plug-and-play IoT connectivity is key to streamlining the costs and complexity of an IoT deployment. Businesses need a solution that can bridge the gap between legacy OT and IT systems to unlock new layers of data that were previously inaccessible. Wireless IoT connectivity must be able to easily retrofit existing assets and equipment without complex hardware modifications and production downtime. Likewise, it must enable straightforward data transfer to the existing IT infrastructure and business applications for data management, visualization and machine learning.

5. Interoperability

Each IoT system is a mashup of diverse components and technologies. This makes interoperability a prerequisite for IoT scalability, to avoid being saddled with an obsolete system that fails to keep pace with new innovation later on. By designing an interoperable architecture from the beginning, you can avoid fragmentation and reduce the integration costs of your IoT project in the long run. 

Today, technology standards exist to foster horizontal interoperability by fueling global cross-vendor support through robust, transparent and consistent technology specifications. For example, a standard-based wireless protocol allows you to benefit from a growing portfolio of off-the-shelf hardware across industry domains. When it comes to vertical interoperability, versatile APIs and open messaging protocols act as the glue to connect the edge network with a multitude of value-deriving backend applications. Leveraging these open interfaces, you can also scale your deployment across locations and seamlessly aggregate IoT data across premises.  

IoT data is the lifeblood of business intelligence and competitive differentiation and IoT connectivity is the crux to ensuring reliable and secure access to this data. When it comes to building a future-proof wireless architecture, it’s important to consider not only existing requirements, but also those that might pop up down the road. A wireless solution that offers data ownership, minimal infrastructure requirements, built-in network management and integration and interoperability will not only ensure access to IoT data today, but provide cost-effective support for the influx of data and devices in the future.


by Philipp Richert

New digital and IoT use cases are becoming more and more important. When it comes to the adoption of these new technologies, there are several different maturity levels, depending on the domain. Within the retail industry, and specifically food retail, we are currently seeing the emergence of a host of IoT use cases.

Two forces are driving this: a technology push, in which suppliers in the retail domain have technologies available to build retail IoT use cases within a connected store; and a market pull by their customers, who are boosting the demand for such use cases.

[Figure: Retail IoT use cases, technology push and market pull]

However, we also need to ask the following questions: What are IoT use cases good for? And what are they aiming at? We currently see three different fields of application:

  • Increasing efficiency and optimizing processes
  • Increasing customer satisfaction
  • Increasing revenues with new business models

No matter what is most important for your organization or whatever your focus, it is crucial to set up a process that provides guidance for identifying the right use cases. In the following section, we share some insights on how retailers can best design this process. We collated these insights together with the team from the Food Tech Campus.

How to identify the right retail IoT use cases

When identifying the right use cases for their stores, retailers should make sure to look into all phases within the entire innovation process: from problem description and idea collation to solution concept and implementation. Within this process, it is also essential to consider the so-called innovator’s trilemma and ensure that use cases are:

  • Desirable ones that your customer really needs
  • Technically feasible
  • Profitable for your sustainable business development

Before we can actually start identifying retail IoT use cases, we need to define search fields so that we can work on one topic with greater dedication and focus. We must then open up the problem space in order to extract the most relevant problems and pain points. Starting with prioritized and selected pain points, we then open up the solution space in order to define several solution concepts. Once these have been validated, the result should be a well-defined problem statement that concisely describes one singular pain point.

In the following, we want to take a deep dive into the different phases of the process while giving concrete examples, tips and our top-rated tools. Enjoy!

Search fields

Retailers possess expertise and face challenges at various stages along their complex process chains. It helps here to focus on a specific target group in order to avoid distraction. Target groups are typically users or customers in a defined environment. A good example would be to focus your search on processes that happen inside a store location and are relevant to the customer (e.g., the food shopper).

Understand and observe problems

User research, observation and listening are keys to a well-defined problem statement that allows for further ideation. Embedding yourself in various situations and conducting interviews with all the stakeholders visiting or operating a store should be the first steps. Join employees around the store for a day or two and support them during their everyday tasks. Empathize, look for any friction and ask questions. Take your key findings into workshops and spend some time isolating specific causes. Use personas based on your user research and make use of frameworks and canvas templates in order to structure your findings. Use working titles to name the specific problem statements. One example might be: Long queueing as a major nuisance for customers.

Synthesize findings

Are your findings somehow connected? Single-purpose processes and their owners within a store environment are prone to isolated views. Creating a common problem space increases the chances of adoption of any solution later. So it is worth taking the time to map out all findings and take a look at projects in the past and their outcome. In our example, queueing is linked to staff planning, lack of communication and unpredictable customer behavior.

Prioritize problems and pain points

Ask users or stakeholders to give their view on defined problem statements and let them vote. Challenge their view and make them empathize and broaden their view towards a more holistic benefit. Once the quality of a problem statement has been assessed, evaluate the economic implications. In our example, this could mean that queueing affects most employees in the store, directly or indirectly. This problem might be solved through technology and should be further explored.

The result of a well-structured problem statement list should consist of a few new insights that might result in quick gains; one or two major known pain points, where the solution might be viable and feasible; and a list with additional topics that exist but are not too pressing at the moment.

Define opportunity areas

Map technologies and problems together. Are there any strategic goals that these problem statements might be assigned to? Have things changed in terms of technical feasibility (e.g., has the cost of a technology dropped over the past three years)? Can problems be validated within a larger setup easily, or are we talking about singular use cases? All these considerations should lead towards the most attractive problem to solve. Again, in our example, this might be: queueing is a major problem in most locations; satisfying our customers should be our main goal; existing solutions are too expensive or inflexible.

[Figure: Retail IoT use cases, problem and solution space]

When identifying the right use cases for their stores, retailers should make sure to look into all phases within the entire innovation process: from problem description and idea collation to solution concept and implementation.

Ideate and explore use cases

When conducting an ideation session, it is very helpful to bring in trends that are relevant to the defined problem areas so as to help boost creativity. In our example, for instance, this might be technology trends such as frictionless checkout for retail, hybrid checkout concepts, bring your own device (BYOD) and sensor approaches. It is always important to keep the following in mind: What do these trends mean for the customer journey in-store and how can they be integrated in (legacy) environments?

Define solution concepts

In the process of further defining the solution concepts, it is essential to evaluate the market potential and to consider customer and user feedback. Depending on the solution, it might be necessary to ask the various stakeholders – from store managers to personnel to customers – in order to get a clearer picture. When talking to customers or users, it is also helpful to bring along scribbles, pictures or prototypes in order to increase immersion. The insights gathered in this way help to validate assumptions and to pilot the concept accordingly.

Set metrics and KPIs to prove success

Defining data-based metrics and KPIs is essential for a successful solution. When setting up metrics and KPIs, you need to consider two aspects:

  • Use existing data – e.g., checkout frequency – in order to demonstrate the impact of the new solution. This offers a very inexpensive way of validating the business potential of the solution early on.
  • Use new data – e.g., waiting time measurements – from the solution and evaluate it on a regular basis. This helps to get a better understanding of whether you are collecting the right data and to derive measures that help to improve your solution.
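As a minimal sketch of the second point, a waiting-time KPI can be computed directly from queue-entry and checkout timestamps. The event format here is invented for illustration; a real store would feed this from its checkout system:

```python
def average_wait_seconds(events):
    """Each event is a (queue_entry_ts, checkout_start_ts) pair in epoch
    seconds; the KPI is the mean customer waiting time."""
    waits = [start - entry for entry, start in events]
    return sum(waits) / len(waits)

# Three customers: they waited 90s, 120s and 75s respectively.
events = [(0, 90), (10, 130), (20, 95)]
print(average_wait_seconds(events))  # 95.0
```

Tracking this number before and after a pilot (say, a new hybrid checkout) is the inexpensive validation the bullet points describe.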

Prototype for quick insights

In terms of technology, practically everything is feasible today. However, the value proposition of a use case (in terms of business and users) can remain unclear and requires testing. Instead of building a technical prototype, it can be helpful to evaluate the value proposition of the solution with humans (empathy prototyping). This could be a person triggering an alarm based on the information at hand instead of an automatic action. Insights and lessons learnt from this phase can be used alongside the technical realization (proof-of-concept) in order to tweak specific features of the solution.

Initiate a PoC for technical feasibility

When it comes to technical feasibility, a clear picture of the objectives and key results (OKRs) for the PoC is essential. This helps to set the boundaries for a lean process with respect to the installation of hardware, an efficient timeline and minimum costs. Furthermore, a well-defined test setup fosters short testing timespans that often yield all needed results.

How IoT platforms can help build retail IoT use cases

The strong trend towards digitization within the retail industry opens up new use cases for the (food) retail industry. In order to make the most of this trend and to build on IoT, it is crucial first of all to determine which use cases to start with. Every retailer has a different focus and needs for their stores.

In the course of our retail projects, we have identified some of the recurring use cases that food retailers are currently implementing. We have also learnt a lot about how they can best leverage IoT in order to build a connected store. We share these insights in our white paper “The connected retail store.”


When I think about the things that held the planet together in 2020, it was digital experiences delivered over wireless connectivity that made remote things local.

While heroes like doctors, nurses, first responders, teachers, and other essential personnel bore the brunt of the COVID-19 response, billions of people around the world found themselves cut off from society. In order to keep people safe, we were physically isolated from each other. Far beyond the six feet of social distancing, most of humanity weathered the storm from their homes.

And then little by little, old things we took for granted, combined with new things many had never heard of, pulled the world together. Let’s take a look at the technologies and trends that made the biggest impact in 2020 and where they’re headed in 2021:

The Internet

The global Internet infrastructure from which everything else is built is an undeniable hero of the pandemic. This highly-distributed network designed to withstand a nuclear attack performed admirably as usage by people, machines, critical infrastructure, hospitals, and businesses skyrocketed. Like the air we breathe, this primary facilitator of connected, digital experiences is indispensable to our modern society. Unfortunately, the Internet is also home to a growing cyberwar and security will be the biggest concern as we move into 2021 and beyond. It goes without saying that the Internet is one of the world’s most critical utilities along with water, electricity, and the farm-to-table supply chain of food.

Wireless Connectivity

People are mobile and they stay connected through their smartphones, tablets, in cars and airplanes, on laptops, and other devices. Just like the Internet, the cellular infrastructure has remained exceptionally resilient to enable communications and digital experiences delivered via native apps and the web. Indoor wireless connectivity continues to be dominated by WiFi at home and all those empty offices. Moving into 2021, the continued rollout of 5G around the world will give cellular endpoints dramatic increases in data capacity and WiFi-like speeds. Additionally, private 5G networks will challenge WiFi as a formidable indoor option, but WiFi 6E with increased capacity and speed won’t give up without a fight. All of these developments are good for consumers who need to stay connected from anywhere like never before.

Web Conferencing

With many people stuck at home in 2020, web conferencing technology took the place of traveling to other locations to meet people or receive education. This technology isn’t new and includes familiar players like GoToMeeting, Skype, WebEx, Google Hangouts/Meet, BlueJeans, FaceTime, and others. Before COVID, these platforms enjoyed success, but most people preferred to fly on airplanes to meet customers and attend conferences while students hopped on the bus to go to school. In 2020, “necessity is the mother of invention” took hold and the use of Zoom and Teams skyrocketed as airplanes sat on the ground while business offices and schools remained empty. These two platforms further increased their stickiness by increasing the number of visible people and adding features like breakout rooms to meet the demands of businesses, virtual conference organizers, and school teachers. Despite the rollout of the vaccine, COVID won’t be extinguished overnight and these platforms will remain strong through the first half of 2021 as organizations rethink where and when people work and learn. There are way too many players in this space, so look for some consolidation.

E-Commerce

“Stay at home” orders and closed businesses gave e-commerce platforms a dramatic boost in 2020 as they took the place of shopping at stores or going to malls. Amazon soared to even higher heights, Walmart upped their game, Etsy brought the artsy, and thousands of Shopify sites delivered the goods. Speaking of delivery, the empty city streets became home to fleets of FedEx, Amazon, UPS, and DHL trucks bringing packages to your front doorstep. Many retail employees traded in working at customer-facing stores for working in distribution centers as long as they could outperform robots. Even though people are looking forward to hanging out at malls in 2021, the e-commerce, distribution center, delivery truck trinity is here to stay. This ball was already in motion and got a rocket boost from COVID. This market will stay hot in the first half of 2021 and then cool a bit in the second half.

Ghost Kitchens

The COVID pandemic really took a toll on restaurants in 2020, with many of them going out of business permanently. Those that survived had to pivot to digital and other ways of doing business. High-end steakhouses started making burgers on grills in the parking lot, while takeout pizzerias discovered they finally had the best business model. Having a drive-thru lane was definitely one of the keys to success in a world without waiters, busboys, and hosts. “Front of house” was shut down, but the “back of house” still had a pulse. Adding mobile web and native apps that allowed customers to easily order from operating “ghost kitchens” and pay with credit cards or Apple/Google/Samsung Pay enabled many restaurants to survive. A combination of curbside pickup and delivery from the likes of DoorDash, Uber Eats, Postmates, Instacart and Grubhub made this business model work. A surge in digital marketing also took place where many restaurants learned the importance of maintaining a relationship with their loyal customers via connected mobile devices. For the most part, 2021 has restaurateurs hoping for 100% in-person dining, but a new business model that looks a lot like catering + digital + physical delivery is something that has legs.

The Internet of Things

At its very essence, IoT is all about remotely knowing the state of a device or environmental system along with being able to remotely control some of those machines. COVID forced people to work, learn, and meet remotely and this same trend applied to the industrial world. The need to remotely operate industrial equipment or an entire “lights out” factory became an urgent imperative in order to keep workers safe. This is yet another case where the pandemic dramatically accelerated digital transformation. Connecting everything via APIs, modeling entities as digital twins, and having software bots bring everything to life with analytics has become an ROI game-changer for companies trying to survive in a free-falling economy. Despite massive employee layoffs and furloughs, jobs and tasks still have to be accomplished, and business leaders will look to IoT-fueled automation to keep their companies running and drive economic gains in 2021.
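To make the "modeling entities as digital twins" idea above concrete, here is a minimal, illustrative-only sketch: a software object that mirrors the last reported state of a remote machine and records desired changes an operator wants pushed back down. All names and fields are hypothetical, not any particular vendor's twin API.

```python
# Minimal digital-twin sketch: "reported" mirrors device telemetry,
# "desired" holds pending remote commands. Hypothetical names/fields.
from dataclasses import dataclass, field


@dataclass
class MachineTwin:
    device_id: str
    reported: dict = field(default_factory=dict)   # last known device state
    desired: dict = field(default_factory=dict)    # commands awaiting delivery

    def update_from_telemetry(self, payload: dict) -> None:
        """Merge a telemetry message into the mirrored state."""
        self.reported.update(payload)

    def set_desired(self, key: str, value) -> None:
        """Queue a remote control change for the physical machine."""
        self.desired[key] = value


twin = MachineTwin("press-7")
twin.update_from_telemetry({"temp_c": 61, "state": "running"})
twin.set_desired("state", "stopped")   # remote "lights out" control
print(twin.reported["temp_c"])   # 61
```

Analytics and software bots would then operate on the `reported` state of many such twins, which is what turns remote visibility into the ROI game-changer the paragraph describes.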

Streaming Entertainment

Closed movie theaters, football stadiums, bowling alleys, and other sources of entertainment left most people sitting at home watching TV in 2020. This turned into a dream come true for streaming entertainment companies like Netflix, Apple TV+, Disney+, HBO Max, Hulu, Amazon Prime Video, YouTube TV, and others. That said, Quibi and Facebook Watch didn’t make it. The idea of binge-watching shows during the weekend turned into binge-watching every season of every show almost every day. Delivering all these streams over the Internet via apps has made it easy to get hooked. Multiplayer video games fall in this category as well and represent an even larger market than the film industry. Gamers socially distanced as they played each other from their locked-down homes. The rise of cloud gaming combined with the rollout of low-latency 5G and Edge computing will give gamers true mobility in 2021. On the other hand, the video streaming market has too many players and looks ripe for consolidation in 2021 as people escape the living room once the vaccine is broadly deployed.

Healthcare

With doctors and nurses working around the clock as hospitals and clinics were stretched to the limit, it became increasingly difficult for non-COVID patients to receive the healthcare they needed. This unfortunate situation gave tele-medicine the shot in the arm (no pun intended) it needed. The combination of healthcare professionals delivering healthcare digitally over widespread connectivity helped those in need. This was especially important in rural areas that lacked the healthcare capacity of cities. Concurrently, the Internet of Things is making deeper inroads into delivering the health of a person to healthcare professionals via wearable technology. Connected healthcare has a bright future that will accelerate in 2021 as high-bandwidth 5G provides coverage to more of the population to facilitate virtual visits to the doctor from anywhere.

Working and Living

As companies and governments told their employees to work from home, it gave people time to rethink their living and working situation. Lots of people living in previously hip, urban, high-rise buildings found themselves residing in not-so-cool, hollowed-out ghost towns comprised of boarded-up windows and closed bars and cafés. Others began to question why they were living in areas with expensive real estate and high taxes when they no longer had to be close to the office. This led to a 2020 COVID exodus out of pricey apartments/condos downtown to cheaper homes in distant suburbs as well as the move from pricey areas like Silicon Valley to cheaper destinations like Texas. Since you were stuck in your home, having a larger house with a home office, fast broadband, and a back yard became the most important thing. Looking ahead to 2021, a hybrid model of work-from-home plus occasionally going into the office is here to stay as employees will no longer tolerate sitting in traffic two hours a day just to sit in a cubicle in a skyscraper. The digital transformation of how and where we work has truly accelerated.

Data and Advanced Analytics

Data has shown itself to be one of the world’s most important assets during the time of COVID. Petabytes of data have continuously streamed in from all over the world, letting us know the number of cases, the growth or decline of infections, hospitalizations, contact-tracing, free ICU beds, temperature checks, deaths, and hotspots of infection. Some of this data has been reported manually while many other sources are fully automated from machines. Capturing, storing, organizing, modeling and analyzing this big data has elevated the importance of cloud and edge computing, global-scale databases, advanced analytics software, and the growing importance of machine learning. This is a trend that was already taking place in business and now has a giant spotlight on it due to its global importance. There’s no stopping the data + advanced analytics juggernaut in 2021 and beyond.

Conclusion

2020 was one of the worst years in human history and the loss of life was just heartbreaking. People, businesses, and our education system had to become resourceful to survive. This resourcefulness amplified the importance of delivering connected, digital experiences to make previously remote things into local ones. Cheers to 2021 and the hope for a brighter day for all of humanity.

Read more…

By Michele Pelino

The COVID-19 pandemic drove businesses and employees to become more reliant on technology for both professional and personal purposes. In 2021, demand for new internet-of-things (IoT) applications, technologies, and solutions will be driven by connected healthcare, smart offices, remote asset monitoring, and location services, all powered by a growing diversity of networking technologies.

In 2021, we predict that:

  • Network connectivity chaos will reign. Technology leaders will be inundated by an array of wireless connectivity options. Forrester expects that implementation of 5G and Wi-Fi technologies will decline from 2020 levels as organizations sort through market chaos. For long-distance connectivity, low-earth-orbit satellites now provide a complementary option, with more than 400 Starlink satellites delivering satellite connectivity today. We expect interest in satellite and other lower-power networking technologies to increase by 20% in the coming year.
  • Connected device makers will double down on healthcare use cases. Many people stayed at home in 2020, leaving chronic conditions unmanaged, cancers undetected, and preventable conditions unnoticed. In 2021, proactive engagement using wearables and sensors to detect patients’ health at home will surge. Consumer interest in digital health devices will accelerate as individuals appreciate the convenience of at-home monitoring, insight into their health, and the reduced cost of connected health devices.
  • Smart office initiatives will drive employee-experience transformation. In 2021, some firms will ditch expensive corporate real estate driven by the COVID-19 crisis. However, we expect at least 80% of firms to develop comprehensive on-premises return-to-work office strategies that include IoT applications to enhance employee safety and improve resource efficiency such as smart lighting, energy and environmental monitoring, or sensor-enabled space utilization and activity monitoring in high traffic areas.*
  • The near ubiquity of connected machines will finally disrupt traditional business. Manufacturers, distributors, utilities, and pharma firms switched to remote operations in 2020 and began connecting previously disconnected assets. This connected-asset approach increased reliance on remote experts to address repairs without protracted downtime and expensive travel. In 2021, field service firms and industrial OEMs will rush to keep up with customer demand for more connected assets and machines.
  • Consumer and employee location data will be core to convenience. The COVID-19 pandemic elevated the importance location plays in delivering convenient customer and employee experiences. In 2021, brands must utilize location to generate convenience for consumers or employees with virtual queues, curbside pickup, and checking in for reservations. They will depend on technology partners to help use location data, as well as a third-party source of location trusted and controlled by consumers.

* Proactive firms, including Atea, have extended IoT investments to enhance employee experience and productivity by enabling employees to access a mobile app that uses data collected from light-fixture sensors to locate open desks and conference rooms. Employees can modify light and temperature settings according to personal preferences, and the system adjusts light color and intensity to better align with employees’ circadian rhythms to aid in concentration and energy levels. See the Forrester report “Rethink Your Smart Office Strategy.”

Originally posted HERE.

Read more…

By Patty Medberry

After 2020’s twists and turns, here’s hoping that 2021 ushers in a restored sense of “normal.” In thinking about what the upcoming year might bring for industrial IoT, three key trends emerge.

Trend #1: Securing operational technology (OT)

IT will take a bolder posture to secure OT environments.

Cyber risks in industrial environments will continue to grow, causing IT to take bolder steps to secure the OT network in 2021. The CISO and IT teams have accountability for cybersecurity across the enterprise. But often they do not have visibility into the OT network. Many OT networks use traditional measures like air gapping or an industrial demilitarized zone to protect against attacks. But these solutions are rife with backdoors. For example, third-party technicians and other vendors often have remote access to update systems, machines and devices. With increasing pressure from board members and government regulators to manage IoT/OT security risks, and to protect the business itself, the CISO and IT will need to do more.

Success requires OT’s help. IT cybersecurity practices that work in the enterprise are not always appropriate for industrial environments. What’s more, IT doesn’t have the expertise or insight into operational and process control technology. A simple patch could bring down production (and revenues).

Bottom line? Organizations will need solutions that strengthen cybersecurity while meeting IT and OT needs. For IT, that means visibility and control extending from their own environment into the OT network. For OT, it means security solutions that allow them to respond to anomalies while keeping production humming.

Trend #2: Remote and autonomous operations

The need for operational resiliency will accelerate the deployment of remote and autonomous operations – driving a new class of networking.

The impact of changes brought on in 2020 is driving organizations to increasingly use IoT technologies for operational resiliency. After all, IoT helps keep a business up and running when people cannot be on the ground. It also helps improve safety and efficiencies by preventing unnecessary site visits and reducing employee movement throughout facilities.

In 2021, we will see more deployments aimed at sophisticated remote operations. These will go well beyond remote monitoring. They will include autonomous operational controls for select parts of a process and will be remotely enabled for other parts. Also, deployments will increasingly move toward full autonomy, eliminating the need for humans to be present locally or remotely. And more and more, AI will be used for dynamic optimization and self-healing, in use cases such as:

  • autonomous guided vehicles for picking and packing, material handling, and autonomous container applications across manufacturing, warehouses and ports
  • increased automation of the distribution grid
  • autonomous haul trucks for mining applications
  • Computer-based train control for rail and mass transit

All these use cases require data instantly and en masse, demanding a network that can support that data plus deliver the speed required for analysis. This new class of industrial networking must provide the ability to handle more network bandwidth, offer near-zero-latency data and support edge compute. It also needs security and scale to adapt quickly, ensuring the business is up and running – no matter what.

Trend #3: Managing multiple access technologies

Organizations will operate multiple-access technologies to achieve operational agility and flexibility.

While Ethernet has always been the foundation for connectivity in industrial IoT spaces, that connectivity is quickly expanding to wireless. Wireless helps reduce the pain of physical cabling and provides the flexibility and agility to upgrade, deploy and reconfigure the network with less operational downtime. Newer wireless technologies like Wi-Fi 6 and 5G also power use cases not possible in the past (or possible only with wired connectivity).

As organizations expand their IoT deployments, the need to manage multiple access technologies will grow. Successful deployments will require the right connectivity for the use case, otherwise, costs, complexity and security risks increase. With wireless choices including Wi-Fi, LoRaWAN, Wi-SUN, public or private cellular, Bluetooth and more, organizations will need to determine the best technology for each use case.  

Cisco’s recommendation: Build an access strategy to optimize costs and resources while ensuring security. Interactions between access technologies should deliver a secured and automated end-to-end IP infrastructure – and must avoid a “mishmash” leading to complexity and failed objectives.

As the end of 2020 fast approaches, I wish everyone a safe and healthy New Year. As you continue building and refining your plans for 2021, please consider how you can unleash these IoT network trends to reduce your cybersecurity risks and increase your operational resiliency. 

Originally posted HERE.

Read more…

New solar performance monitoring system has potential to become IoT of photovoltaics. Credit: Pexels

A new system for measuring solar performance over the long term in scalable photovoltaic systems, developed by Arizona State University researchers, represents a breakthrough in the cost and longevity of interconnected power delivery.

When solar cells are developed, they are "current-voltage" tested in the lab before they are deployed in panels and systems outdoors. Once installed outdoors, they aren't usually tested again unless the system undergoes major issues. The new test system, Suns-Voc, measures the system's voltage as a function of light intensity in the outdoor setting, enabling real-time measurements of performance and detailed diagnostics.

"Inside the lab, however, everything is controlled," explained Alexander Killam, an ASU electrical engineering doctoral student and graduate research associate. "Our research has developed a way to use Suns-Voc to measure solar panels' degradation once they are outdoors in the real world and affected by weather, temperature and humidity," he said.

Current photovoltaic modules are rated to last 25 years at 80 percent efficiency. The goal is to expand that time frame to 50 years or longer.

"This system of monitoring will give photovoltaic manufacturers and big utility installations the kind of data necessary to adjust designs to increase efficiency and lifespans," said Killam, the lead author of "Monitoring of Photovoltaic System Performance Using Outdoor Suns-Voc," for Joule.

For example, most techniques used to measure outdoor solar efficiency require you to disconnect from the power delivery mechanism. The new approach can automatically measure daily during sunrise and sunset without interfering with power delivery.
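One thing those daily, non-intrusive measurements make possible is tracking degradation over time. As a rough sketch only, with made-up numbers: given periodic outdoor open-circuit-voltage (Voc) readings for a panel, a simple linear fit estimates the degradation rate. The real Suns-Voc method additionally normalizes for light intensity and temperature, which is omitted here.

```python
# Hypothetical illustration: estimate a panel's Voc degradation rate
# from yearly measurements using an ordinary least-squares slope.
def linear_slope(xs, ys):
    """Least-squares slope of ys vs. xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den


years = [0, 1, 2, 3, 4]
voc = [0.700, 0.698, 0.696, 0.694, 0.692]   # volts, illustrative data

rate = linear_slope(years, voc)   # volts per year, ~ -0.002 here
print(round(rate, 4))
```

A fleet operator could run this per panel and flag any unit degrading faster than its rating implies, which is exactly the kind of data the article says manufacturers and utilities need.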

"When we were developing photovoltaics 20 years ago, panels were expensive," said Stuart Bowden, an associate research professor who heads the silicon section of ASU's Solar Power Laboratory. "Now they are cheap enough that we don't have to worry about the cost of the panels. We are more interested in how they maintain their performance in different environments.

"A banker in Miami underwriting a photovoltaic system wants to know in dollars and cents how the system will perform in Miami and not in Phoenix, Arizona."

"The weather effects on photovoltaic systems in Arizona will be vastly different than those in Wisconsin or Louisiana," said Joseph Karas, co-author and materials science doctoral graduate now at the National Renewable Energy Lab. "The ability to collect data from a variety of climates and locations will support the development of universally effective solar cells and systems."

The research team was able to test its approach at ASU's Research Park facility, where the Solar Lab is primarily solar powered. For its next step, the lab is negotiating with a power plant in California that is looking to add a megawatt of silicon photovoltaics to its power profile.

The system, which can monitor reliability and lifespan remotely for larger, interconnected systems, will be a major breakthrough for the power industry.

"Most residential solar rooftop systems aren't owned by the homeowner, they are owned by a utility company or broker with a vested interest in monitoring photovoltaic efficiency," said Andre' Augusto, head of Silicon Heterojunction Research at ASU's Solar Power Laboratory and a co-author of the paper.

"Likewise, as developers of malls or even planned residential communities begin to incorporate solar power into their construction projects, the interest in monitoring at scale will increase," Augusto said.

According to Bowden, it's all about the data, especially when it can be monitored automatically and remotely—data for the bankers, data for developers, and data for the utility providers.

If Bill Gates' smart city, planned about 30 miles from Phoenix in Buckeye, Ariz., uses the team's measurement technology, "It could become the IoT of Photovoltaics," said Bowden.

Originally posted HERE.

Read more…

Written by: Mirko Grabel

Edge computing brings a number of benefits to the Internet of Things. Reduced latency, improved resiliency and availability, lower costs, and local data storage (to assist with regulatory compliance) to name a few. In my last blog post I examined some of these benefits as a means of defining exactly where is the edge. Now let’s take a closer look at how edge computing benefits play out in real-world IoT use cases.

Benefit No. 1: Reduced latency

Many applications have strict latency requirements, but when it comes to safety and security applications, latency can be a matter of life or death. Consider, for example, an autonomous vehicle applying brakes or roadside signs warning drivers of upcoming hazards. By the time data is sent to the cloud and analyzed, and a response is returned to the car or sign, lives can be endangered. But let’s crunch some numbers just for fun.

Say a Department of Transportation in Florida is considering a cloud service to host the apps for its roadside signs. One of the vendors on the DoT’s shortlist is a cloud in California. The DoT’s latency requirement is less than 15ms. Light travels through fiber at about 5 μs/km. The distance from the U.S. east coast to the west coast is about 5,000 km. Do the math and the resulting round-trip latency is 50ms. It’s pure physics. If the DoT requires a real-time response, it must move the compute closer to the devices.
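The back-of-the-envelope math above can be written out as a few lines of Python, using the same figures from the example (5 μs/km propagation delay, 5,000 km coast to coast):

```python
# Propagation delay over fiber: one-way and round-trip.
FIBER_DELAY_US_PER_KM = 5   # ~200,000 km/s in glass, as in the example
DISTANCE_KM = 5_000         # U.S. east coast to west coast

one_way_ms = DISTANCE_KM * FIBER_DELAY_US_PER_KM / 1000   # 25.0 ms
round_trip_ms = 2 * one_way_ms                            # 50.0 ms


def meets_requirement(latency_budget_ms: float) -> bool:
    """Can a cloud at this distance satisfy the latency budget?"""
    return round_trip_ms <= latency_budget_ms


print(round_trip_ms)           # 50.0
print(meets_requirement(15))   # False: compute must move closer
```

Note this counts only propagation delay; serialization, queuing, and processing time would push the real number even higher, strengthening the case for edge compute.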

Benefit No. 2: Improved resiliency/availability

Critical infrastructure requires the highest level of availability and resiliency to ensure safety and continuity of services. Consider a refinery gas leakage detection system. It must be able to operate without Internet access. If the system goes offline and there’s a leakage, that’s an issue. Compute must be done at the edge. In this case, the edge may be on the system itself.

While it’s not a life-threatening use case, retail operations can also benefit from the availability provided by edge compute. Retailers want their Point of Sale (PoS) systems to be available 100% of the time to service customers. But some retail stores are in remote locations with unreliable WAN connections. Moving the PoS systems onto their edge compute enables retailers to maintain high availability.

Benefit No. 3: Reduced costs

Bandwidth is almost infinite, but it comes at a cost. Edge computing allows organizations to reduce bandwidth costs by processing data before it crosses the WAN. This benefit applies to any use case, but here are two example use cases where it is very evident: video surveillance and preventive maintenance. For example, a single city-deployed HD video camera may generate 1,296 GB a month. Streaming that data over LTE easily becomes cost prohibitive. Adding edge compute to pre-aggregate the data significantly reduces those costs.
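As a sanity check on the 1,296 GB figure: a continuously running HD stream at roughly 4 Mbit/s (an assumed bitrate, not stated in the article) over a 30-day month works out to exactly that volume:

```python
# Monthly data volume of one continuously streaming camera.
BITRATE_MBITS = 4                     # assumed HD stream bitrate
SECONDS_PER_MONTH = 30 * 24 * 3600    # 2,592,000 s in a 30-day month

mb_per_second = BITRATE_MBITS / 8               # 0.5 MB/s
gb_per_month = mb_per_second * SECONDS_PER_MONTH / 1000

print(gb_per_month)   # 1296.0
```

Multiply that by dozens of cameras and per-GB cellular pricing, and the value of aggregating at the edge before crossing the WAN is obvious.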

Manufacturers use edge computing for preventive maintenance of remote machinery. Sensors are used to monitor temperatures and vibrations. The freshness of this data is critical, as the slightest variation can indicate a problem. To ensure that issues are caught as early as possible, the application requires high-resolution data (for example, 1,000 samples per second). Rather than sending all of this data over the Internet to be analyzed, edge compute is used to filter the data and only averages, anomalies and threshold violations are sent to the cloud.
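The filter-at-the-edge pattern can be sketched in a few lines. The threshold and field names below are hypothetical; the point is that a window of raw samples collapses into a small summary, with only anomalies forwarded in full:

```python
# Edge-side reduction: keep the raw 1,000-samples/s stream local,
# forward only per-window statistics plus threshold violations.
from statistics import mean

VIBRATION_LIMIT = 0.8   # hypothetical alarm threshold


def summarize_window(samples):
    """Reduce one window of raw sensor readings to what the cloud needs."""
    violations = [s for s in samples if s > VIBRATION_LIMIT]
    return {
        "avg": mean(samples),
        "max": max(samples),
        "violations": violations,   # anomalies are sent in full
    }


window = [0.42, 0.45, 0.44, 0.91, 0.43]   # 5 of the 1,000 samples/s
summary = summarize_window(window)
print(summary["violations"])   # [0.91]
```

A one-second window of 1,000 readings thus becomes a handful of numbers on the WAN, which is where the bandwidth savings come from.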

Benefit No. 4: Comply with government regulations

Countries are increasingly instituting privacy and data retention laws. The European Union’s General Data Protection Regulation (GDPR) is a prime example. Any organization that has data belonging to an EU citizen is required to meet the GDPR’s requirements, which includes an obligation to report leaks of personal data. Edge computing can help these organizations comply with GDPR. For example, instead of storing and backhauling surveillance video, a smart city can evaluate the footage at the edge and only backhaul the meta data.

Canada’s Water Act: National Hydrometric Program is another edge computing use case that delivers regulatory compliance benefits. As part of the program, about 3,000 measurement stations have been implemented nationwide. Any missing data requires justification, but storing data at the edge ensures retention even when connectivity to a station drops.

Bonus Benefit: “Because I want to…”

Finally, some users simply prefer to have full control. By implementing compute at the edge rather than the cloud, users have greater flexibility. We have seen this in manufacturing. Technicians want to have full control over the machinery. Edge computing gives them this control as well as independence from IT. The technicians know the machinery best and security and availability remain top of mind.

Summary

By reducing latency and costs, improving resiliency and availability, and keeping data local, edge computing opens up a new world of IoT use cases. Those described here are just the beginning. It will be exciting to see where we see edge computing turn up next. 

Originally posted here.

Read more…

Arm DevSummit 2020 debuted this week (October 6 – 8) as an online virtual conference focused on engineers and providing them with insights into the Arm ecosystem. The summit lasted three days over which Arm painted an interesting technology story about the current and future state of computing and where developers fit within that story. I’ve been attending Arm Techcon for more than half a decade now (which has become Arm DevSummit) and as I perused content, there were several take-a-ways I noticed for developers working on microcontroller-based embedded systems. In this post, we will examine these key take-a-ways and I’ll point you to some of the sessions that I also think may pique your interest.

(For those of you that aren’t yet aware, you can register up until October 21st (for free) and still watch the conference materials until November 28th. Click here to register.)

Take-A-Way #1 – Expect Big Things from NVIDIA’s Acquisition of Arm

As many readers probably already know, NVIDIA is in the process of acquiring Arm. This acquisition has the potential to be one of the focal points that I think will lead to a technological revolution in computing technologies, particularly around artificial intelligence but that will also impact nearly every embedded system at the edge and beyond. While many of us have probably wondered what plans NVIDIA CEO Jensen Huang may have for Arm, the Keynotes for October 6th include a fireside chat between Jensen Huang and Arm CEO Simon Segars. Listening to this conversation is well worth the time and will help give developers some insights into the future but also assurances that the Arm business model will not be dramatically upended.

Take-A-Way #2 – Machine Learning for MCU’s is Accelerating

It is sometimes difficult at a conference to get a feel for what is real and what is a little more smoke and mirrors. Sometimes, announcements are real, but they just take several years to filter their way into the market and affect how developers build systems. Machine learning is one of those technologies that I find there is a lot of interest around but that developers also aren’t quite sure what to do with yet, at least in the microcontroller space. When we hear machine learning, we think artificial intelligence, big datasets and more processing power than will fit on an MCU.

There were several interesting talks at DevSummit around machine learning such as:

Some of these were foundational, providing embedded developers with the fundamentals to get started while others provided hands-on explorations of machine learning with development boards. The take-a-way that I gather here is that the effort to bring machine learning capabilities to microcontrollers so that they can be leveraged in industry use cases is accelerating. Lots of effort is being placed in ML algorithms, tools, frameworks and even the hardware. There were several talks that mentioned Arm’s Cortex-M55 architecture that will include Helium technology to help accelerate machine learning and DSP processing capabilities.

Take-A-Way #3 – The Constant Need for Reinvention

In my last take-a-way, I alluded to the fact that things are accelerating. Acceleration is not just happening in the technologies that we use to build systems. The range of application domains to which we can apply these technologies is dramatically expanding. Not only can we start to deploy security and ML technologies at the edge but in domains such as space and medical systems. There were several interesting talks about how technologies are being used around the world to solve interesting and unique problems such as protecting vulnerable ecosystems, mapping the sea floor, fighting against diseases and so much more.

By carefully watching and listening, you’ll notice that many speakers have been involved in many different types of products over their careers and that they are constantly having to reinvent their skill sets, capabilities and even their interests! This is what makes working in embedded systems so interesting! It is constantly changing and evolving and as engineers we don’t get to sit idly behind a desk. Just as Arm, NVIDIA and many of the other ecosystem partners and speakers show us, technology is rapidly changing but so are the problem domains that we can apply these technologies to.

Take-A-Way #4 – Mbed and Keil are Evolving

There are also interesting changes coming to the Arm toolchains and tools like Mbed and Keil MDK. In Reinhard Keil’s talk, “Introduction to an Open Approach for Low-Power IoT Development“, developers got an insight into the changes that are coming to Mbed and Keil with the core focus being on IoT development. The talk focused on the endpoint and discussed how Mbed and Keil MDK are being moved to an online platform designed to help developers move through product development faster from prototyping to production. Keil Studio Online is currently in early access and will be released early next year.

(If you are interested in endpoints and AI, you might also want to check-out this article on “How Do We Accelerate Endpoint AI Innovation? Put Developers First“)

Conclusions

Arm DevSummit had a lot to offer developers this year and without the need to travel to California to participate. (Although I greatly missed catching up with friends and colleagues in person). If you haven’t already, I would recommend checking out the DevSummit and watching a few of the talks I mentioned. There certainly were a lot more talks and I’m still in the process of sifting through everything. Hopefully there will be a few sessions that will inspire you and give you a feel for where the industry is headed and how you will need to pivot your own skills in the coming years.

Originally posted here.

Read more…

SSE Airtricity employees Derek Conty, left, Francie Byrne, middle, and Ryan Doran, right, install solar panels on the roof of Kinsale Community School in Kinsale, Ireland. The installation is part of a project with Microsoft to demonstrate the feasibility of distributed power purchase agreements. Credit: Naoise Culhane

by John Roach

Solar panels being installed on the roofs of dozens of schools throughout Dublin, Ireland, reflect a novel front in the fight against global climate change, according to a senior software engineer and a sustainability lead at Microsoft.

The technology company partnered with SSE Airtricity, Ireland's largest provider of 100% green energy and part of the FTSE-listed SSE Group, to install and manage the internet-connected solar panels, which are connected via Azure IoT to Microsoft Azure, a cloud computing platform.

The software tools aggregate and analyze real-time data on energy generated by the solar panels, demonstrating a mechanism for Microsoft and other corporations to achieve sustainability goals and reduce the carbon footprint of the electric power grid.

"We need to decarbonize the global economy to avoid catastrophic climate change," said Conor Kelly, the software engineer who is leading the distributed solar energy project for Microsoft Azure IoT. "The first thing we can do, and the easiest thing we can do, is focus on electricity."

Microsoft's $1.1 million contribution to the project builds on the company's ongoing investment in renewable energy technologies to offset carbon emissions from the operation of its datacenters.

A typical approach to powering datacenters with renewable energy is for companies such as Microsoft to sign so-called power purchase agreements with energy companies. The agreements provide financial guarantees needed to build industrial-scale wind and solar farms and connections to the power grid.

The new project demonstrates the feasibility of agreements to install solar panels on rooftops distributed across towns with existing grid connections and use internet of things, or IoT, technologies to aggregate the accumulated energy production for carbon offset accounting.

"It utilizes existing assets that are sitting there unmonetized, which are roofs of buildings that absorb sunlight all day," Kelly said.

New Business Model

The project is also a proof-of-concept, or blueprint, for how energy providers can adapt as the falling price of solar panels enables distributed electric power generation throughout the existing electric power grid.

Traditionally, suppliers purchase power from central power plants and industrial-scale wind and solar farms and sell it to consumers on the distribution grid. Now, energy providers like SSE Airtricity provide renewable energy solutions that allow end consumers to generate power, from sustainable sources, using the existing grid connection on their premises.

"The more forward-thinking energy providers that we are working with, like SSE Airtricity, identify this as an opportunity and an industry-changing shift in how energy will be generated and consumed," Kelly noted.

The opportunity comes in the ability to finance the installation of solar panels and batteries at homes, schools, businesses and other buildings throughout a community and leverage IoT technology to efficiently perform a range of services from energy trading to carbon offset accounting.

Kelly and his team with Azure IoT are working with SSE Airtricity to develop the tools and machine learning models necessary to unlock this opportunity.

"Instead of having utility scale solar farms located outside of cities, you could have a solar farm at the distribution level, spread across a number of locations," said Fergal Ahern, a business energy solutions manager and renewable energy expert with SSE Airtricity.

For the distributed power purchase agreement, SSE Airtricity uses Azure IoT to aggregate the generation of all the solar panels installed across 27 schools around the provinces of Leinster, Munster and Connacht and run it through a machine learning model to determine the carbon emissions that the solar panels avoid.
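As a rough illustration of the aggregation and carbon-offset accounting described above (not SSE Airtricity's actual pipeline), per-school generation readings could be summed and converted to avoided emissions using an assumed grid carbon intensity; a real deployment would use a trained machine learning model and live grid data:

```python
# Sketch only: aggregate per-school solar generation readings and estimate
# avoided CO2 with an assumed, fixed grid carbon intensity. School names
# and the intensity figure are illustrative, not real project values.

GRID_CARBON_INTENSITY_KG_PER_KWH = 0.35  # assumed average, for illustration

def aggregate_generation(readings):
    """Sum generation (kWh) per site from (site, kwh) reading tuples."""
    totals = {}
    for site, kwh in readings:
        totals[site] = totals.get(site, 0.0) + kwh
    return totals

def avoided_co2_kg(readings):
    """Estimate CO2 (kg) avoided by the aggregated generation."""
    return sum(kwh for _, kwh in readings) * GRID_CARBON_INTENSITY_KG_PER_KWH

readings = [("school_a", 12.5), ("school_b", 8.0), ("school_a", 4.5)]
print(aggregate_generation(readings))  # per-school totals
print(avoided_co2_kg(readings))        # 25 kWh at 0.35 kg/kWh ≈ 8.75 kg
```

In the real project, the fixed intensity constant would be replaced by the machine learning model's estimate of what grid generation the solar output displaced.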

The schools use the electricity generated by the solar panels, which reduces their utility bills; Microsoft receives the renewable energy credits for the generated electricity, which the company applies to its carbon neutrality commitments.

The panels are expected to produce enough energy annually to power the equivalent of 68 Irish homes for a year and abate more than 2.1 million kilograms, which is equivalent to 4.6 million pounds, of carbon dioxide emissions over the 15 years of the agreement, according to Kelly.

"This is additional renewable energy that wouldn't have otherwise happened," he said. "Every little bit counts when it comes to meeting our sustainability targets and combatting climate change."

Every little bit counts

Victory Luke, a 16-year-old student at Collinstown Park Community College in Dublin, has lived by the "every little bit counts" mantra since she participated in a "Generation Green" sustainability workshop in 2019 organized by the Sustainable Energy Authority of Ireland, SSE Airtricity and Microsoft.

The workshop was part of an education program surrounding the installation of solar panels and batteries at her school along with a retrofit of the lighting system with LEDs. Digital screens show the school's energy use in real time, allowing students to see the impact of the energy efficiency upgrades.

Luke said the workshop captured her interest on climate change issues. She started reading more about sustainability and environmental conservation and agreed to share her newfound knowledge with the younger students at her school.

"I was going around and talking to them about energy efficiency, sharing tips and tricks like if you are going to boil a kettle, only boil as much water as you need, not too much," she explained.

That June, the Sustainable Energy Authority of Ireland invited her to give a speech at the Global Conference on Energy Efficiency in Dublin, which was organized by the International Energy Agency, an organization that works with governments and industry to shape sustainable energy policy.

"It kind of felt surreal because I honestly felt like I wasn't adequate enough to be speaking about these things," she said, noting that the conference attendees included government ministers, CEOs and energy experts from around the world.

At the time, she added, the global climate strike movement and its youth leaders were making international headlines, which made her advocacy at school feel even smaller. "Then I kind of realized that it is those smaller things that make the big difference," she said.

SSE Airtricity and Microsoft plan to replicate the educational program that inspired Luke and her classmates at dozens of the schools around Ireland that are participating in the project.

"When you've got solar at a school and you can physically point at the installation and a screen that monitors the power being generated, it brings sustainability into daily school life," Ahern said.

Proof of concept for policymakers

The project's education campaign extends to renewable energy policymakers, Kelly noted. He explained that renewable energy credits—a market incentive for corporations to support renewable energy projects—are currently unavailable for distributed power purchase agreements.

For this project, Microsoft will receive genuine renewable energy credits from a wind farm that SSE Airtricity also operates, he added.

"And," he said, "we are hoping to use this project as an example of what regulation should look like, to say, 'You need to award renewable energy credits to distributed generation because they would allow corporates to scale-up this type of project.'"

For her part, Luke supports steps by multinational corporations such as Microsoft to invest in renewable energy projects that address global climate change.

"It is a good thing to see," she said. "Once one person does something, other people are going to follow."

Originally posted HERE

Read more…

An edge device is the network component that is responsible for connecting a local area network to an external or wide area network, which can be accessed from anywhere. Edge devices offer several new services and improved outcomes for IoT deployments across all markets. Smart services that rely on high volumes of data and local analysis can be deployed in a wide range of environments.

An edge device provides local data to an external network. If the local and external networks use different protocols, it also translates between them, bridging both network boundaries. Edge devices can analyze diagnostics and populate data automatically; however, they must maintain a secure connection between the field network and the cloud. If the internet connection is lost or the cloud service fails, the edge device stores data until the connection is re-established, so no process information is lost. Local data storage is optional and not all edge devices offer it; whether it is needed depends on the application and the service being implemented at the plant.

How does an edge device work?

An edge device has a straightforward working principle: it communicates between two different networks and translates one protocol into another. Furthermore, it creates a secure connection with the cloud.

An edge device can be configured via local access or over the internet or cloud. In general, an edge device is plug-and-play: its setup is simple and does not require much time.
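The store-and-forward behavior described above, where readings are buffered locally while the cloud is unreachable and flushed once the connection returns, can be sketched as follows; the class and callback names are hypothetical:

```python
# Minimal sketch of edge store-and-forward buffering. `send_to_cloud` is any
# callable that transmits one reading and raises ConnectionError on failure.

from collections import deque

class EdgeBuffer:
    def __init__(self, send_to_cloud):
        self.send_to_cloud = send_to_cloud
        self.backlog = deque()  # readings not yet confirmed delivered

    def publish(self, reading):
        """Queue a new reading and attempt to drain the backlog."""
        self.backlog.append(reading)
        return self.flush()

    def flush(self):
        """Send oldest readings first; on failure keep the rest buffered."""
        while self.backlog:
            try:
                self.send_to_cloud(self.backlog[0])
            except ConnectionError:
                return False  # connection still down; data is retained
            self.backlog.popleft()  # drop only after successful send
        return True
```

Because a reading is removed from the backlog only after the send succeeds, no process information is lost across an outage, matching the behavior described above.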

Why should I use an edge device?

Depending on the service required at the plant, an edge device can be the crucial collection point for the information needed to automatically create a digital twin of your equipment in the cloud.

Edge devices are an essential part of IoT solutions, since they connect the information from a network to a cloud solution. They do not affect the network; they only collect data from it and never interfere with communication between the control system and the field devices. By using an edge device to collect information, the user never needs to touch the control system. Edge communication is one-way: nothing is written back to the network, and data is acquired with the highest possible security.

Edge device requirements

Edge devices must meet certain requirements under all conditions in order to perform in different situations. These include storage, networking, and latency, among others.

Low latency

Sensor data is collected in near real time by an edge server. For services like image recognition and visual monitoring, edge servers are located in very close proximity to the device, meeting low-latency requirements. Edge deployments need to ensure that these services are not lost through poor development practice or inadequate processing resources at the edge. Maintaining data quality and security at the edge whilst enabling low latency is a challenge that needs to be addressed.

Network independence

IoT services should be independent of the data communication topology. The user requires data through the most effective means possible, which in many cases will be mobile networks; in some scenarios, however, Wi-Fi or local mesh networking may be the most effective way of collecting data while still meeting latency requirements.

Data security

Users require data at the edge to be kept as secure as when it is stored and used elsewhere. These challenges must be met despite the larger attack vector and scope at the edge. Data authentication and user access control are as important at the edge as they are on the device or at the core. Additionally, the physical security of edge infrastructure needs to be considered, as it is likely to be housed in less secure environments than dedicated data centers.

Data Quality

Data quality at the edge is a key requirement for guaranteed operation in demanding environments. To maintain it, applications must ensure that data is authenticated, replicated as required, and assigned to the correct classes and categories.

Flexibility in future enhancements

Additional sensors can be added and managed at the edge as requirements change. Sensors such as accelerometers, cameras, and GPS, can be added to equipment, with seamless integration and control at the edge.

Local storage

Local storage is essential when the internet connection is lost or the cloud service fails: the edge device stores data until the connection is re-established, so no process information is lost. Local data storage is optional and not all edge devices offer it; whether it is needed depends on the application and the service being implemented at the plant.

Originally posted here

Read more…

Impact of IoT in Inventory

The Internet of Things (IoT) has revolutionized many industries, including inventory management. IoT is a concept in which devices are interconnected via the internet. It is expected that by 2020 there will be 26 billion connected devices worldwide. These connections matter because they allow data sharing, which can then drive actions that make life and business more efficient. Since inventory is a significant portion of a company's assets, inventory data is vital for the accounting department's asset management and the company's annual report.

In inventory solutions based on IoT and RFID, each individual inventory item receives an RFID tag. Each tag has a unique identification number (ID) that contains information about the inventory item, e.g. a model, a batch number, etc. These tags are scanned by an RF reader. Upon scanning, the reader extracts the IDs and transmits them to the cloud for processing. Along with the tag's ID, the cloud receives the location and the time of the reading. This data is used to update the status of inventory items, allowing users to monitor the inventory from anywhere, in real time.
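A minimal sketch of what a single read event might look like on its way to the cloud; the field names and ID format here are illustrative, not any specific vendor's schema:

```python
# Hypothetical shape of one RFID read event: the tag's ID, the reader's
# location, and the time of the read, serialized as JSON for transmission.

import json
from datetime import datetime, timezone

def make_read_event(tag_id, reader_location):
    """Build the payload the reader would send to the cloud for one scan."""
    return {
        "tag_id": tag_id,              # unique ID encoded on the tag
        "location": reader_location,   # where this reader is installed
        "read_at": datetime.now(timezone.utc).isoformat(),
    }

event = make_read_event("EPC-00042-BATCH-7", "warehouse-dock-3")
payload = json.dumps(event)  # ready to transmit to the cloud backend
print(payload)
```

The cloud side then only has to index these events by `tag_id` and `read_at` to reconstruct each item's movement history.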

Industrial IoT

The role of IoT in inventory management is to receive data and turn it into meaningful insights about inventory items' location and status, and to give users a corresponding output. For example, based on the data and the inventory management solution's architecture, we can forecast the number of raw materials needed for the upcoming production cycle. The system can also send an alert if any individual inventory item is lost.

Moreover, IoT based inventory management solutions can be integrated with other systems, i.e. ERP and share data with other departments.

RFID in Industrial IoT

An RFID system consists of three main components: a tag, an antenna, and a reader.

Tags: An RFID tag carries information about a specific object. It can be attached to any surface, including raw materials, finished goods, packages, etc.

RFID antennas: An RFID antenna broadcasts the reader's signal, supplying power and data for the tags' operation

RFID readers: An RFID reader uses radio signals to read from and write to the tags. The reader receives the data stored in the tag and transmits it to the cloud.

Benefits of IoT in inventory management

The benefits of IoT in the supply chain are among the most exciting physical manifestations we can observe. IoT in the supply chain creates unparalleled transparency that increases efficiency.

Inventory tracking

The major benefit of IoT in inventory management is asset tracking: instead of using barcodes to scan and record data, items carry RFID tags that can be read wirelessly. It is possible to accurately obtain data and track items from any point in the supply chain.

With RFID and IoT, managers don’t have to spend time on manual tracking and reporting on spreadsheets. Each item is tracked and the data about it is recorded automatically. Automated asset tracking and reporting save time and reduce the probability of human error.

Inventory optimization

With real-time data about the quantity and location of inventory, manufacturers can reduce the amount of inventory on hand while still meeting the needs of customers at the end of the supply chain.

Combining data about available inventory with machine learning makes it possible to forecast required inventory, which allows manufacturers to reduce lead time.
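As a stand-in for the machine learning mentioned above, a simple moving average illustrates the idea of forecasting next-period demand from consumption history; a production system would train a real model on far richer data. All numbers and names here are hypothetical:

```python
# Toy demand forecast: mean of the last `window` periods of consumption,
# plus a reorder calculation with an assumed safety-stock buffer.

def forecast_demand(history, window=3):
    """Forecast next-period demand from recent per-period consumption."""
    if not history:
        return 0.0
    recent = history[-window:]
    return sum(recent) / len(recent)

def reorder_quantity(history, on_hand, safety_stock=10):
    """Units to order so stock covers forecast demand plus the buffer."""
    needed = forecast_demand(history) + safety_stock
    return max(0, round(needed - on_hand))

usage = [120, 130, 125, 140, 135]           # units consumed per period
print(forecast_demand(usage))                # (125 + 140 + 135) / 3 ≈ 133.3
print(reorder_quantity(usage, on_hand=90))   # order enough to close the gap
```

The RFID-driven real-time counts feed `history` and `on_hand` automatically, which is what makes this kind of forecasting practical without manual stocktakes.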

Remote tracking

Remote product tracking makes it easy to keep an eye on production and business. Knowing production and transit times allows you to better tweak orders to suit lead times and respond to fluctuating demand. It shows which suppliers are meeting production and shipping criteria and which need monitoring to achieve the required outcome.

It gives visibility into the flow of raw materials, work-in-progress and finished goods by providing updates about the status and location of the items so that inventory managers see when an individual item enters or leaves a specific location.

Bottlenecks in the operations

With real-time data about location and quantity, manufacturers can reveal bottlenecks in the process and pinpoint machines with lower utilization rates. For instance, if part of the inventory tends to pile up in front of a machine, a manufacturer can assume that the machine is underutilized and needs attention.
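The bottleneck heuristic described above can be sketched like this; the station names and queue threshold are assumptions for illustration:

```python
# Sketch: count tagged items whose last scan places them queued in front of
# each machine, and flag machines whose queue exceeds a threshold.

def find_bottlenecks(item_locations, queue_threshold=20):
    """item_locations maps item ID -> station where it was last scanned."""
    queues = {}
    for station in item_locations.values():
        queues[station] = queues.get(station, 0) + 1
    # Stations with unusually deep queues are candidate bottlenecks.
    return sorted(s for s, n in queues.items() if n > queue_threshold)

# 25 items stacked at one press, 5 parts at a lathe: the press is flagged.
locations = {f"item-{i}": "press-2" for i in range(25)}
locations.update({f"part-{i}": "lathe-1" for i in range(5)})
print(find_bottlenecks(locations))  # ['press-2']
```

Real systems would normalize the threshold per machine and look at queue growth over time rather than a single snapshot, but the location data comes straight from the RFID scans described earlier.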

The Outcomes

The data collected through IoT-based inventory management is more accurate and up to date. By reducing time delays, the manufacturing process can enhance accuracy and reduce wastage. An IoT-based inventory management solution offers complete visibility of inventory by providing real-time information fetched by RFID tags. It helps to track the exact location of raw materials, work-in-progress and finished goods. As a result, manufacturers can balance the amount of on-hand inventory, increase the utilization of machines, reduce lead time, and thus avoid the costs of less effective methods. This is all about optimizing inventory and ensuring anything ordered can be sold through whatever channel necessary.

Originally posted here

Read more…

Can AI Replace Firmware?

Scott Rosenthal and I go back about a thousand years; we've worked together, helped midwife the embedded field into being, had some amazing sailing adventures, and recently took a jaunt to the Azores just for the heck of it. Our sons are both big data people; their physics PhDs were perfect entrees into that field, and both now work in the field of artificial intelligence.

At lunch recently we were talking about embedded systems and AI, and Scott posed a thought that has been rattling around in my head since. Could AI replace firmware?

Firmware is a huge problem for our industry. It's hideously expensive. Only highly-skilled people can create it, and there are too few of us.

What if an AI engine of some sort could be dumped into a microcontroller and the "software" then created by training that AI? If that were possible - and that's a big "if" - then it might be possible to achieve what was hoped for when COBOL was invented: programmers would no longer be needed as domain experts could do the work. That didn't pan out for COBOL; the industry learned that accountants couldn't code. Though the language was much more friendly than the assembly it replaced, it still required serious development skills.

But with AI, could a domain expert train an inference engine?

Consider a robot: a "home economics" major could create scenarios of stacking dishes from a dishwasher. Maybe these would be in the form of videos, which were then fed to the AI engine as it tuned the weighting coefficients to achieve what the home ec expert deems worthy goals.

My first objection to this idea was that these sorts of systems have physical constraints. With firmware I'd write code to sample limit switches so the motors would turn off if at an end-of-motion extreme. During training an AI-based system would try and drive the motors into all kinds of crazy positions, banging destructively into stops. But think how a child learns: a parent encourages experimentation but prevents the youngster from self-harm. Maybe that's the role of the future developer training an AI. Or perhaps the training will be done on a simulator of some sort where nothing can go horribly wrong.
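The limit-switch safeguard described above, reduced to its bare logic (sketched here in Python rather than firmware C, purely for illustration), is a hard gate between whatever the controller or trained policy commands and what the motor is actually allowed to do:

```python
# Toy version of a firmware limit-switch check: whatever command an AI
# policy (or any controller) produces, the sampled end-stop switches gate
# the motor output so it can never drive further into a stop.

def gate_motor_command(command, at_lower_limit, at_upper_limit):
    """Clamp a motor command in -1.0..1.0 based on sampled limit switches."""
    if at_lower_limit and command < 0:
        return 0.0  # refuse to drive deeper into the lower stop
    if at_upper_limit and command > 0:
        return 0.0  # refuse to drive deeper into the upper stop
    return command  # otherwise pass the command through unchanged

print(gate_motor_command(-0.8, at_lower_limit=True, at_upper_limit=False))  # 0.0
print(gate_motor_command(0.5, at_lower_limit=True, at_upper_limit=False))   # 0.5
```

This is exactly the "parent preventing self-harm" role: the trained policy can experiment freely, but a few lines of conventional code keep the exploration inside safe physical bounds.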

Taking this further, a domain expert could define the desired inputs and outputs, and then a poorly-paid person do the actual training. CEOs will love that. With that model a strange parallel emerges to computation a century ago: before the computer age "computers" were people doing simple math to create tables of logs, trig, ballistics, etc. A room full all labored at a problem. They weren't particularly skilled, didn't make much, but did the rote work under the direction of one master. Maybe AI trainers will be somewhat like that.

Like we outsource clothing manufacturing to Bangladesh, I could see training, basically grunt work, being sent overseas as well.

I'm not wild about this idea as it means we'd have an IoT of idiots: billions of AI-powered machines where no one really knows how they work. They've been well-trained but what happens when there's a corner case?

And most of the AI literature I read suggests that inference successes of 97% or so are the norm. That might be fine for classifying faces, but a 3% failure rate of a safety-critical system is a disaster. And the same rate for less-critical systems like factory controllers would also be completely unacceptable.

But the idea is intriguing.

Original post can be viewed here

Feel free to email me with comments.

Back to Jack's blog index page.

Read more…

CLICK HERE TO DOWNLOAD

This complete guide is a 212-page eBook and is a must read for business leaders, product managers and engineers who want to implement, scale and optimize their business with IoT communications.

Whether you want to attempt initial entry into the IoT-sphere, or expand existing deployments, this book can help with your goals, providing deep understanding into all aspects of IoT.

CLICK HERE TO DOWNLOAD

Read more…

Edge Products Are Now Managed At The Cloud

Now more than ever, there are billions of edge products in the world. But without proper cloud computing, making the most of electronic devices that run on Linux or any other OS would not be possible.

And so, a question most people keep asking is: which Software-as-a-Service platform can most effectively manage edge devices through cloud computing? While edge device management may not be a new idea, the fact that the cloud computing space is not yet fully exploited means there is still a lot to do in the cloud space.

Remote product management is especially necessary in the 21st century and beyond. Because of the increasing number of devices connected to the Internet of Things (IoT), a reliable SaaS platform should help with fixing software glitches from anywhere in the world. From smart homes, stereo speakers and cars to personal computers, any product that is connected to the internet needs real-time protection from hacking threats such as unlawful access to business or personal data.

Data, being the most vital asset, is constantly at risk, especially if individuals using edge products do not connect to trusted, reliable, and secure edge device management platforms.

Bridges the Gap Between Complicated Software And End Users

Cloud computing is the new frontier through which SaaS platforms help manage edge devices in real-time. But something even more noteworthy is the increasing number of complicated software that now run edge devices at homes and in workplaces.

Edge device management, therefore, ensures everything runs smoothly. From fixing bugs, running debugging commands to real-time software patch deployment, cloud management of edge products bridges a gap between end-users and complicated software that is becoming the norm these days.

Even more importantly, going beyond physical firewall barriers is a major necessity in the remote management of edge devices. A reliable Software-as-a-Service platform therefore ensures that data encryption for edge devices is not only hackproof but also accessible only to the right people. Moreover, deployment of secure routers and access tools is especially critical in cloud computing when managing edge devices. And so, the developers behind successful SaaS platforms conduct regular security checks over the cloud and design and implement solutions for edge products.

Reliable IT Infrastructure Is Necessary

Software-as-a-service platforms that manage edge devices focus on having a reliable IT infrastructure and centralized systems through which they can conduct cloud computing. It is all about remotely managing edge devices with the help of an IT infrastructure that eliminates challenges such as connectivity latency.

Originally posted here

Read more…

In the era of digitalization, IoT is fostering the upcoming revolution in mobile apps. The ways companies used to provide mobile app development are changing because of IoT. After helping thousands of corporates to deliver extraordinary user experiences, IoT is all set with some new and advanced mobile app development trends. 

The tech world is one that is continuously evolving. Every year and each day, innovations come to light, each revolutionizing our lives in one way or another. From the first wheel to smart cities, humans have come a long way.

The evolution and foundation of smart cities is the result of IoT or the Internet of Things. IoT has definitely stirred quite an uproar in the digital world with the mass potential it has. It can bring everything and everyone online. 

As per the latest mobile app stats, IoT will become a more significant player in the mobile app development industry. The IoT market is expected to more than double, reaching a staggering 520 billion USD in 2021, up from 235 billion USD in 2017.

Soon the IoT mobile app development will face new trends in the coming year and beyond.

Let us take a look at the top IoT mobile app development trends.

IoT App Trend #1: Cybersecurity for IoT

With an increase in the number of devices online, cybersecurity is the top priority for all businesses as IoT gains popularity. The network is expected to expand in the coming years, and so the data volume will also increase. All this draws attention to more information to protect.

IoT security will see an exponential rise as more users will store their data over the cloud. From banking details to home security, everything is easily breached if the security firewall is weak in IoT applications. 

Therefore, mobile app development companies need to work on upgrading the security of their IoT-enabled mobile apps.

IoT App Trend #2: Roaring Popularity of Smart Home Devices

When smart home devices were launched, many mocked them by calling them unrealistic toys for lazy youngsters. Now, the same people are finding it increasingly difficult to resist the charm of IoT devices. 

IoT devices are expected to be very popular in 2021 and the years to come. The reason behind their growing popularity is that the IoT devices are becoming highly intuitive and innovative. They are extended not only to the comfort of home automation but also to home security and the safety of your family.

Another great advantage of smart IoT adoption is the need to save energy. Intelligent lights and intelligent thermostats help conserve energy, reducing bills. These reasons will lead more and more people to adopt smart home devices.

IoT App Trend #3: Backed by AI and ML

Artificial Intelligence and Machine Learning both are thriving technologies. Both of these are the facilitators of automation. We all know how Artificial Intelligence has touched millions of lives around the globe. 

Together with IoT, AI and ML are unique data-driven technologies shaping the future of human-machine interactions. Developers combine IoT with Artificial Intelligence to automate routine tasks, simplify work, and obtain the most accurate information.

IoT App Trend #4: IoT and Healthcare

With the revolution in the health-tech industry, healthcare companies are turning towards mobile platforms. IoT-enabled apps open up new opportunities to improve the medical sector.

IoT has immense applications that are already running in the healthcare field, and this is expected to increase by 26.2%.

Healthcare apps featuring IoT technology are expected to reform the world of medical sciences. These IoT mobile apps can even help doctors and medical professionals treat their patients from a distance.

Smart wearables and implants will be able to record diverse parameters to keep the patient’s health in check. By integrating sensors, portable devices, and all kinds of medical equipment, real-time updates of a patient’s health can be recorded and sent to the concerned person. 

IoT App Trend #5: Edge Computing to Overtake Cloud Computing

This is a change where we have to be careful. For the past several years, IoT devices have stored their data in cloud storage. However, IoT developers, development services, and manufacturers have started thinking about the utility of storing, processing, and analyzing data closer to where it is generated.

So basically, this means that instead of sending the entire data stream from IoT devices to the cloud, the data is first transmitted to a local storage device located close to the IoT device, at the edge of the network.

This local device then analyzes, sorts, filters, and processes the data, and sends all or only a part of it to the cloud, reducing traffic on the network and avoiding bottlenecks.
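The filter-then-forward step can be sketched as follows; the thresholds and summary shape are assumptions for illustration:

```python
# Illustrative edge-computing filter: instead of forwarding every raw
# sensor sample, the edge node reduces a batch to a summary and forwards
# only that summary plus any out-of-range samples, cutting network traffic.

def summarize_batch(samples, low=0.0, high=80.0):
    """Reduce raw samples to a compact summary plus anomalous readings."""
    anomalies = [s for s in samples if not (low <= s <= high)]
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
        "anomalies": anomalies,  # only these raw values go to the cloud
    }

samples = [21.0, 22.5, 95.2, 20.8]  # e.g. temperature readings
print(summarize_batch(samples))
# Four raw samples collapse into one summary; only 95.2 is forwarded raw.
```

This is the bandwidth-and-latency trade the section describes: the cloud still sees every anomaly immediately, while routine readings arrive as aggregates.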

Known as “edge computing”, this approach has several advantages if used correctly. Firstly, it helps in the better management of the large amount of data that each device sends. Second, the reduced dependency on cloud storage allows devices and applications to perform faster and also reduce latency.

Being able to collect and process data locally, an IoT application is expected to consume less bandwidth and keep working even when connectivity to the cloud is affected. Given these advantages, edge computing is poised for further innovation and broad adoption in IoT, both consumer and industrial.

Reduced dependence on the cloud will also result in lower security costs and facilitate better security practices. 2021 will see more state-of-the-art edge computing in IoT.

IoT App Trend #6: Are You Excited About Smart Cities?

Well, all of us are super excited to witness smart cities. Smart cities are one of the significant accomplishments of IoT and modernization. Integrated with IoT-powered devices, smart cities promise improved efficiency and security for the common folk on the streets and inside their homes.

With superfast data transfer supported by 5G, public transportation will also see a massive change in the way they work. 

By now, we know that IoT will focus on developing smart parking lots, street lights, and traffic controls. To add up to this, with IoT and fast internet, we will live inside a world where our refrigerators will be aware of what food we have inside.

IoT will impact traffic congestion and security. It will also help in the development of sustainable cities leading us to a green future.

IoT App Trend #7: Blockchain for IoT Security

Many financial and governmental institutions, entrepreneurs, consumers, and industrialists will become decentralized, self-governing, and quite smart. Many new companies are building on IOTA's Tangle to develop modules and other components for firms without the costs of SaaS and cloud.

IOTA is a distributed ledger especially designed to record and execute transactions between devices in the IoT ecosystem.

If you are in this industry, then you should prepare to see centralized and monolithic computing models broken up into jobs and microservices, all distributed across decentralized machines and devices.

In the coming future, IoT will penetrate the disciplines of health, government, transactions, and others that we cannot think of right now. Such types of IoT technology trends will create significant effective differences.

IoT App Trend #8: IoT for Retail Apps

The eCommerce industry will also benefit from IoT integration. The retail supply chain will be more efficient after the incorporation of IoT mobile apps, and the online shopping experience is expected to improve for individuals across the globe.

Also, IoT will make the retail experience more personalized for each customer, with in-app advertisements based on the user's shopping history. We already get notifications once we purchase a product from a particular eStore. With IoT-enabled mobile apps, the app will guide us to our favorite store using in-store maps.

IoT App Trend #9: Will IoT Boost Predictive Maintenance?

Yes, it will. In 2021 and beyond, smart home systems will notify owners about plumbing leaks, appliance failures, and other problems so that homeowners can avoid disasters. These intelligent sensors will soon enter our houses.

In response to these predictive capabilities of IoT, we can expect to see home care offered as a contractor service. If emergency action is needed, your presence in the house will not be necessary.
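To make the predictive-maintenance idea concrete, here is a minimal sketch (in Python, with hypothetical sensor values and thresholds) of how a smart home might flag a slow plumbing leak: a healthy idle pipe reads near-zero flow, so a sustained trickle overnight is suspicious:

```python
from statistics import mean

def leak_suspected(flow_readings, window=5, threshold=0.05):
    """Flag a possible plumbing leak: sustained non-zero flow
    (litres/min) while the home reports no fixtures in use."""
    if len(flow_readings) < window:
        return False  # not enough data to judge
    recent = flow_readings[-window:]
    # A steady trickle (every sample above zero, average above
    # the noise threshold) suggests a leak rather than sensor noise.
    return mean(recent) > threshold and min(recent) > 0

# Simulated overnight readings from a hypothetical smart water meter
idle = [0.0, 0.0, 0.01, 0.0, 0.0]
leaking = [0.12, 0.09, 0.11, 0.10, 0.13]
print(leak_suspected(idle))     # False
print(leak_suspected(leaking))  # True
```

Real products would use richer models (usage schedules, pressure sensors, learned baselines), but the core pattern is the same: compare recent readings against an expected idle profile and alert before the damage shows.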

IoT App Trend #10: Easy and Better Commuting

IoT mobile applications are expected to make commuting easier for students, the elderly, business people, and many more. Today, heavy traffic makes commuting a significant issue for most of us. With major innovations in technology and the integration of IoT, mobile applications will make traveling a breeze for everyone.

Here are some of the ways commuting will change:

  • Smart street lights will make walking safer for pedestrians.
  • Finding parking spaces will be easier and more seamless with data-driven parking apps.
  • In-app navigation will make public transit more reliable.
  • IoT-powered mobile apps will also improve routing between different modes of transport.

With so many innovative ideas and benefits for iOS- and Android-based IoT mobile apps, the mobile app development market will see an influx of transportation apps in the years to come.

IoT App Trend #11: Software-as-a-Service Becomes the Norm

Among IoT trends, SaaS, or Software-as-a-Service, is considered one of the hot topics for the projected market. Because of its low cost of entry, SaaS is quickly becoming a favorite delivery model in the IT sector.

Of these emerging IoT trends, Software-as-a-Service will make people's lives better than ever.

IoT App Trend #12: Energy and Resource Management

Do you know what affects energy management the most? Largely, it depends on acquiring a better understanding of how resources are consumed. IoT mobile app-based electronics are expected to play a significant role in energy conservation.

All of these IoT trends can be integrated into resource management, making our lives easier, more comfortable, and more responsible.

Automatic notifications can also be added to the mobile app to inform the owner when power consumption exceeds a set threshold. Other convenient features can be added to these IoT mobile apps as well, such as sprinkler control and in-house temperature management.
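The threshold-notification idea above can be sketched in a few lines of Python. This is a hypothetical example (sensor values, threshold, and message format are all illustrative); one design point worth noting is alerting once per excursion rather than on every sample, so the owner is not spammed while the load stays high:

```python
def check_power(readings_w, threshold_w=3000):
    """Return alert messages whenever instantaneous power draw
    crosses the configured threshold (watts)."""
    alerts = []
    above = False
    for t, watts in enumerate(readings_w):
        if watts > threshold_w and not above:
            alerts.append(f"t={t}: power {watts} W exceeded {threshold_w} W")
            above = True  # alert once per excursion, not once per sample
        elif watts <= threshold_w:
            above = False  # dropping back below re-arms the alert
    return alerts

# Hypothetical readings sampled once per minute
samples = [1200, 2900, 3400, 3600, 2800, 3100]
for msg in check_power(samples):
    print(msg)
```

With the sample data there are two excursions above 3000 W (one starting at the third reading, one at the last), so exactly two notifications would be sent.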

Conclusion

We all know that IoT has great potential to revolutionize current trends in the mobile app development industry. It is expected to open up immense possibilities for every business and individual in this field. Directly or indirectly, IoT will drive the future of almost every industry.

The trends mentioned above are among those that will dominate the IoT app development ecosystem in the years to come. Amid all these predictions, the future is promising and worth the wait.
