


In the vast terrain of digital transformation, the Internet of Things (IoT) has emerged as a leading beacon. As businesses grapple with evolving demands, IoT serves as a cornerstone for innovation, operational efficiency, and superior customer engagement. This article delves into the intricacies of IoT and elucidates how businesses can embrace it to spur growth.

Understanding the IoT Landscape

The Internet of Things comprises a vast network of interconnected physical devices, all embedded with sensors, software, and other technologies to collect and exchange data. From household items like smart thermostats to complex systems such as industrial machinery, IoT technology is versatile and can be applied in various sectors. It even plays a role in enhancing the robustness and capabilities of virtual networks, including VPS hosting services. By integrating IoT technology, businesses can create a more efficient, data-driven, and automated operation.

The Value Proposition for Businesses

The impact of IoT on business is profound. When appropriately harnessed, IoT can provide actionable data that serves as the foundation for informed decision-making. In a world that is shifting toward data-driven strategies, the real-time analytics that IoT offers can be transformative.

The technology also has the potential to elevate productivity. IoT can automate various mundane and repetitive tasks, freeing human resources for more creative and complex responsibilities. When it comes to customer experiences, IoT brings an unprecedented level of personalization and convenience, thereby boosting customer satisfaction and loyalty.

But the benefits don't stop there. Implementing IoT can lead to a more cost-efficient operation. One way it achieves this is by enabling predictive maintenance. This ensures that machinery and equipment are serviced before they break down, thus reducing downtime and extending the lifespan of the asset.

Implementing IoT in Your Business Strategy

As with any significant business undertaking, the effective implementation of IoT starts with the identification of specific business needs and objectives. Are you seeking operational efficiency or striving for superior customer engagement? Knowing what you aim to achieve helps you choose the right devices and platforms tailored to meet those objectives.

Choosing the right IoT devices is vital to the success of your venture. IoT has a broad spectrum of applications, and the range of devices available is equally diverse. Whether it's a smart camera to enhance security or a temperature sensor in a manufacturing line, selecting devices that suit your specific needs is critical.

The next stage involves integrating IoT technology into your existing infrastructure. Seamless integration is crucial to achieving a streamlined operation. Whether your business is purely online and reliant on VPS hosting, or you operate from a brick-and-mortar establishment, the IoT architecture should be compatible with your existing systems.

Security is another critical consideration. The interconnected nature of IoT increases the potential risk of cyber threats. As such, robust security measures are required to safeguard against unauthorized access and data breaches. You'll need to deploy strong encryption techniques and continually monitor the network to protect against vulnerabilities.

However, implementing IoT is not a "set it and forget it" deal. Continuous monitoring and data analysis are key to maximizing the benefits. IoT generates large volumes of data, and you need a comprehensive analytics strategy to sift through this data and extract actionable insights.

Finally, the system should undergo periodic evaluations for performance and security. These reviews help in iterating and optimizing your IoT setup, ensuring it evolves with changing business requirements and technological advancements.

Taking the Plunge

IoT technology offers an unmatched opportunity for businesses to elevate operational efficiency, enrich customer experiences, and drive growth. With a clear strategy in place, one that identifies your business needs, incorporates the right devices, and follows a secure, data-driven approach, the benefits can be substantial.

Conclusion

In a rapidly digitizing world, IoT is not merely a fad but a transformative force that can help businesses stay competitive and reach new heights. Therefore, it's not a question of whether to adopt IoT but how best to do so for sustainable business growth.

Read more…

The manufacturing industry is on the brink of a transformative journey with the integration of 5G technology. The powerful combination of 5G and the Internet of Things (IoT) is revolutionizing the manufacturing landscape, promising unparalleled levels of efficiency, innovation, and success. The potential for growth is immense, as indicated by the projected expansion of the global 5G-in-manufacturing market. According to a report by Global Market Estimates, the market is expected to grow at a remarkable compound annual growth rate (CAGR) of around 27.5% during the forecast period from 2021 to 2026. The momentum continues to build: another study, conducted by Allied Market Research, found that the global industrial 5G market was valued at $12.47 billion in 2020 and is projected to surge to an astounding $140.88 billion by 2030, growing at the same impressive CAGR of 27.5%. This surge in demand for and implementation of 5G technology is set to redefine manufacturing operations, ushering in a new era of connectivity, data-driven decision-making, and technological advancement in the industry.

Let's explore the game-changing areas where 5G is shaping manufacturing:

  • Enhanced Automation and Robotics: Brace yourself for a world where machines, robots, and control systems communicate seamlessly in real time. With the lightning-fast speed and ultra-low latency of 5G, automation reaches new heights. Human operators collaborate harmoniously with their mechanical counterparts, driving productivity to soaring levels and creating a manufacturing ecosystem buzzing with flawless precision.
  • IoT Expansion: Prepare to be captivated by the power of connectivity as 5G and IoT converge. An interconnected web of devices and sensors revolutionizes manufacturing environments. Real-time data flows effortlessly, empowering manufacturers with unparalleled insights into production processes, equipment performance, and inventory levels. Welcome to the era of smart factories and a thriving industrial IoT ecosystem where innovation knows no boundaries.
  • Real-time Analytics and Predictive Maintenance: Unlock the door to real-time analytics with 5G's extraordinary bandwidth and lightning-fast transmission. Advanced algorithms analyze production data on the spot, equipping manufacturers with invaluable insights. Witness the magic of predictive maintenance strategies that detect and address potential equipment failures before they disrupt operations. Say goodbye to downtime, watch maintenance costs plummet, and witness equipment performance reach peak efficiency.
  • Remote Operations and Monitoring: Embrace a paradigm shift as 5G propels us into the era of real-time remote control and monitoring. Manufacturers gain the power to oversee and manage operations from anywhere on the globe. Multiple production sites become effortlessly manageable, critical information is at your fingertips, and operational streamlining becomes second nature. Flexibility reigns supreme, decisions are lightning-quick, and the need for on-site personnel diminishes, optimizing resources and reducing costs.
  • Augmented Reality (AR) and Virtual Reality (VR) Integration: Immerse yourself in a new manufacturing era as 5G unleashes the full potential of AR and VR. With lightning-fast speeds and ultra-low latency, AR glasses guide workers through assembly processes, help them troubleshoot with ease, and support quality control in real time. Witness accuracy soaring, errors vanishing, and worker productivity reaching unparalleled heights.
  • Supply Chain Optimization: Let 5G permeate your supply chain, igniting a transformative revolution. Real-time connectivity, data sharing, and analytics illuminate supply chain visibility like never before. Bid farewell to stockouts and delays as inventory tracking becomes a breeze, logistics management reaches new heights of efficiency, and distribution networks optimize with precision. Elevate inventory management and delight customers with enhanced satisfaction.


Here is a more detailed explanation of using 5G for real-time quality monitoring in medical device manufacturing:

Real-Time Quality Monitoring with 5G

Maintaining rigorous quality control is crucial in medical device manufacturing to ensure patient safety. However, traditional testing and inspection processes can be time-consuming, costly, and unable to catch all defects. 5G enables real-time quality monitoring by connecting production equipment with smart sensors and analytics.

With 5G, sensors can stream massive amounts of real-time data on product specifications, equipment performance, environmental conditions, and more. 5G's high bandwidth and low latency allow huge volumes of sensor data to be transferred continuously without lag.

This data feeds into edge devices running advanced analytics algorithms. The algorithms identify anomalies, detect deviations from quality parameters, and predict potential defects. Operators are notified of issues for immediate corrective action.

For example, vibration sensors may reveal out-of-tolerance operation of a cutting tool, indicating impending tool failure. Temperature probes may show unacceptable fluctuations in a curing oven that degrade material properties. Such insights prompt remedial measures before defective products are ever made.
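The edge-analytics step described above can be sketched as a simple rolling-statistics anomaly detector. This is a minimal illustration rather than a production algorithm: the window size, baseline length, 3-sigma threshold, and the sample vibration values are all assumptions invented for the example.

```python
from collections import deque
import math

def make_anomaly_detector(window=50, baseline=10, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away
    from the rolling mean of recent readings."""
    history = deque(maxlen=window)

    def check(reading):
        anomaly = False
        if len(history) >= baseline:  # wait for a minimal baseline first
            mean = sum(history) / len(history)
            std = math.sqrt(sum((x - mean) ** 2 for x in history) / len(history))
            anomaly = std > 0 and abs(reading - mean) > threshold * std
        history.append(reading)
        return anomaly

    return check

detector = make_anomaly_detector()
# A stable vibration signal, then a spike like a failing cutting tool.
readings = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 1.02, 6.0]
flags = [detector(r) for r in readings]  # only the final spike is flagged
```

In a real line, a raised flag would trigger the operator notification described above rather than simply being recorded.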

5G-enabled real-time monitoring provides a holistic view of production quality. It shifts quality control from periodic testing to proactive prevention by enabling predictive capabilities. This allows medical device manufacturers to achieve significant improvements in product quality, output, and compliance with regulatory standards.

The manufacturing industry is reaching new pinnacles as it moves into the future with 5G at the helm. Leverage the power of 5G technology to gain a competitive advantage, respond quickly to changing market demands, and deliver products with unmatched efficiency. Join the manufacturing revolution, and you will watch innovation grow, efficiency climb, and success expand without bound.

 

References:

https://www.globalmarketestimates.com/market-report/5g-in-manufacturing-market-3566

https://www.alliedmarketresearch.com/industrial-5g-market-A11659

Read more…

The Internet of Things (IoT) continues to revolutionize industries, and Microsoft Azure IoT is at the forefront of this transformation. With its robust suite of services and features, Azure IoT enables organizations to connect, monitor, and manage their IoT devices and data effectively. In this blog post, we will explore the latest trends and use cases of Azure IoT in 2023, showcasing how it empowers businesses across various sectors.

Edge Computing and AI at the Edge:

As the volume of IoT devices and the need for real-time analytics increases, edge computing has gained significant momentum. Azure IoT enables edge computing by seamlessly extending its capabilities to the edge devices. In 2023, we can expect Azure IoT to further enhance its edge computing offerings, allowing organizations to process and analyze data closer to the source. With AI at the edge, businesses can leverage machine learning algorithms to gain valuable insights and take immediate actions based on real-time data.

Edge Computing and Real-time Analytics:

As IoT deployments scale, the demand for real-time data processing and analytics at the edge has grown. Azure IoT Edge allows organizations to deploy and run cloud workloads directly on IoT devices, enabling quick data analysis and insights at the edge of the network. With edge computing, businesses can reduce latency, enhance security, and make faster, data-driven decisions.

Industrial IoT (IIoT) for Smart Manufacturing:

Azure IoT is poised to play a crucial role in the digital transformation of manufacturing processes. IIoT solutions built on Azure enable manufacturers to connect their machines, collect data, and optimize operations. In 2023, we anticipate Azure IoT to continue empowering smart manufacturing by offering advanced analytics, predictive maintenance, and intelligent supply chain management. By harnessing the power of Azure IoT, manufacturers can reduce downtime, enhance productivity, and achieve greater operational efficiency.

Connected Healthcare:

In the healthcare industry, Azure IoT is revolutionizing patient care and operational efficiency. In 2023, we expect Azure IoT to drive the connected healthcare ecosystem further. IoT-enabled medical devices, remote patient monitoring systems, and real-time data analytics can help healthcare providers deliver personalized care, improve patient outcomes, and optimize resource allocation. Azure IoT's robust security and compliance features ensure that sensitive patient data remains protected throughout the healthcare continuum.

Smart Cities and Sustainable Infrastructure:

As cities strive to become more sustainable and efficient, Azure IoT offers a powerful platform for smart city initiatives. In 2023, Azure IoT is likely to facilitate the deployment of smart sensors, intelligent transportation systems, and efficient energy management solutions. By leveraging Azure IoT, cities can enhance traffic management, reduce carbon emissions, and improve the overall quality of life for their residents.

Retail and Customer Experience:

Azure IoT is transforming the retail landscape by enabling personalized customer experiences, inventory optimization, and real-time supply chain visibility. In 2023, we can expect Azure IoT to continue enhancing the retail industry with innovations such as cashier-less stores, smart shelves, and automated inventory management. By leveraging Azure IoT's capabilities, retailers can gain valuable insights into customer behavior, streamline operations, and deliver superior shopping experiences.

AI and Machine Learning Integration:

Azure IoT integrates seamlessly with Microsoft's powerful artificial intelligence (AI) and machine learning (ML) capabilities. By leveraging Azure IoT and Azure AI services, organizations can gain actionable insights from their IoT data. For example, predictive maintenance algorithms can analyze sensor data to detect equipment failures before they occur, minimizing downtime and optimizing operational efficiency.

Enhanced Security and Device Management:

In an increasingly interconnected world, security is a top priority for IoT deployments. Azure IoT provides robust security features to protect devices, data, and communications. With features like Azure Sphere, organizations can build secure and trustworthy IoT devices, while Azure IoT Hub ensures secure and reliable device-to-cloud and cloud-to-device communication. Additionally, Azure IoT Central simplifies device management, enabling organizations to monitor and manage their IoT devices at scale.

Industry-specific Solutions:

Azure IoT offers industry-specific solutions tailored to the unique needs of various sectors. Whether it's manufacturing, healthcare, retail, or transportation, Azure IoT provides pre-built solutions and accelerators to jumpstart IoT deployments. For example, in manufacturing, Azure IoT helps optimize production processes, monitor equipment performance, and enable predictive maintenance. In healthcare, it enables remote patient monitoring, asset tracking, and patient safety solutions.

Integration with Azure Services:

Azure IoT seamlessly integrates with a wide range of Azure services, creating a comprehensive ecosystem for IoT deployments. Organizations can leverage services like Azure Functions for serverless computing, Azure Stream Analytics for real-time data processing, Azure Cosmos DB for scalable and globally distributed databases, and Azure Logic Apps for workflow automation. This integration enables organizations to build end-to-end IoT solutions with ease.

Conclusion:

In 2023, Azure IoT is set to drive innovation across various sectors, including manufacturing, healthcare, smart cities, and retail. With its robust suite of services, edge computing capabilities, and AI integration, Azure IoT empowers organizations to harness the full potential of IoT and achieve digital transformation. As businesses embrace the latest trends and leverage the diverse use cases of Azure IoT, they can gain a competitive edge, improve operational efficiency, and unlock new opportunities in the connected world.

 

About Infysion

We work closely with our clients to help them successfully build and execute their most critical strategies. We partner behind the scenes with machine manufacturers and industrial SaaS providers to build intelligent solutions around condition-based machine monitoring, analytics-driven asset management, accurate failure prediction, and end-to-end operations visibility. Since our founding three years ago, Infysion has successfully productionised more than 20 industry implementations supporting energy production, water and electricity supply monitoring, wind and solar farm management, asset monitoring, and healthcare equipment monitoring.

We strive to provide our clients with exceptional software and services that will create a meaningful impact on their bottom line.

Visit our website to learn more about our success stories, how we work, our latest blogs, and the services we offer!

Read more…

Cloud-based motor monitoring as a service is revolutionizing the way industries manage and maintain their critical assets. By leveraging the power of the cloud, organizations can remotely monitor motors, analyze performance data, and predict potential failures. However, as this technology continues to evolve, several challenges emerge that need to be addressed for successful implementation and operation. In this blog post, we will explore the top challenges faced in cloud-based motor monitoring as a service in 2023. 

Data Security and Privacy:

One of the primary concerns in cloud-based motor monitoring is ensuring the security and privacy of sensitive data. As motor data is transmitted and stored in the cloud, there is a need for robust encryption, authentication, and access control mechanisms. In 2023, organizations will face the challenge of implementing comprehensive data security measures to protect against unauthorized access, data breaches, and potential cyber threats. Compliance with data privacy regulations, such as GDPR or CCPA, adds an additional layer of complexity to this challenge.
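As one illustrative building block among the measures listed above, telemetry payloads can be integrity-protected with an HMAC so the cloud side can detect tampering in transit. This is a minimal sketch using only the Python standard library; the key, field names, and values are hypothetical, and a real deployment would also encrypt the channel (for example with TLS) and manage keys in a secure store rather than in code.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"device-shared-secret"  # hypothetical per-device key

def sign_payload(payload):
    """Attach an HMAC-SHA256 signature so the receiving side can
    verify the telemetry was not altered in transit."""
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def verify_payload(message):
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels
    return hmac.compare_digest(expected, message["signature"])

msg = sign_payload({"motor_id": "M-42", "rpm": 1480, "temp_c": 71.5})
assert verify_payload(msg)      # intact message verifies

msg["payload"]["rpm"] = 9999    # tampering breaks verification
assert not verify_payload(msg)
```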

Connectivity and Network Reliability:

For effective motor monitoring, a reliable and secure network connection is crucial. In remote or industrial environments, ensuring continuous connectivity can be challenging. Factors such as signal strength, network coverage, and bandwidth limitations need to be addressed to enable real-time data transmission and analysis. Organizations in 2023 will need to deploy robust networking infrastructure, explore alternative connectivity options like satellite or cellular networks, and implement redundancy measures to mitigate the risk of network disruptions.

Scalability and Data Management:

Cloud-based motor monitoring generates vast amounts of data that need to be efficiently processed, stored, and analyzed. In 2023, as the number of monitored motors increases, organizations will face challenges in scaling their data management infrastructure. They will need to ensure that their cloud-based systems can handle the growing volume of data, implement efficient data storage and retrieval mechanisms, and utilize advanced analytics and machine learning techniques to extract meaningful insights from the data.

Integration with Existing Systems:

Integrating cloud-based motor monitoring systems with existing infrastructure and software can pose significant challenges. In 2023, organizations will need to ensure seamless integration with their existing enterprise resource planning (ERP), maintenance management, and asset management systems. This includes establishing data pipelines, defining standardized protocols, and implementing interoperability between different systems. Compatibility with various motor types, brands, and communication protocols also adds complexity to the integration process.

Cost and Return on Investment:

While cloud-based motor monitoring offers numerous benefits, organizations must carefully evaluate the cost implications and expected return on investment (ROI). Implementing and maintaining the necessary hardware, software, and cloud infrastructure can incur significant expenses. Organizations in 2023 will face the challenge of assessing the financial viability of cloud-based motor monitoring, considering factors such as deployment costs, ongoing operational expenses, and the potential savings achieved through improved motor performance, reduced downtime, and optimized maintenance schedules.

Connectivity and Reliability:

Cloud-based motor monitoring relies heavily on stable and reliable internet connectivity. However, in certain remote locations or industrial settings, maintaining a consistent connection can be challenging. The availability of high-speed internet, network outages, or intermittent connections may impact real-time monitoring and timely data transmission. Service providers will need to address connectivity issues to ensure uninterrupted monitoring and minimize potential disruptions.

Scalability and Performance:

As the number of monitored motors increases, scalability and performance become critical challenges. Service providers must design their cloud infrastructure to handle the growing volume of data generated by motor sensors. Ensuring real-time data processing, analytics, and insights at scale will be vital to meet the demands of large-scale motor monitoring deployments. Continuous optimization and proactive capacity planning will be necessary to maintain optimal performance levels.

Integration with Legacy Systems:

Integrating cloud-based motor monitoring with existing legacy systems can be a complex undertaking. Many organizations have legacy equipment or infrastructure that may not be inherently compatible with cloud-based solutions. The challenge lies in seamlessly integrating these disparate systems to enable data exchange and unified monitoring. Service providers need to offer flexible integration options, standardized protocols, and compatibility with a wide range of motor types and manufacturers.

 

Data Analytics and Actionable Insights:

Collecting data from motor sensors is only the first step. The real value lies in extracting actionable insights from this data to enable predictive maintenance, identify performance trends, and optimize motor operations. Service providers must develop advanced analytics capabilities that can process large volumes of motor data and provide meaningful insights in a user-friendly format. The challenge is to offer intuitive dashboards, anomaly detection, and predictive analytics that empower users to make data-driven decisions effectively.
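One of the simplest forms of actionable insight mentioned above is extrapolating a degradation trend to estimate when a monitored value will cross an alarm threshold. The sketch below fits a least-squares line to (runtime-hour, temperature) samples; the function name, samples, and 90 °C threshold are invented for illustration, and real predictive analytics would use far richer models.

```python
def hours_until_threshold(samples, threshold):
    """Fit a least-squares line to (hour, value) samples and return how
    many hours remain until the trend crosses `threshold` (None if the
    trend is flat or improving)."""
    n = len(samples)
    mean_x = sum(t for t, _ in samples) / n
    mean_y = sum(v for _, v in samples) / n
    slope = (sum((t - mean_x) * (v - mean_y) for t, v in samples)
             / sum((t - mean_x) ** 2 for t, _ in samples))
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    return (threshold - intercept) / slope - samples[-1][0]

# Bearing temperature drifting upward about 0.5 °C per runtime hour.
samples = [(0, 60.0), (10, 65.0), (20, 70.0), (30, 75.0)]
remaining = hours_until_threshold(samples, threshold=90.0)  # 30.0 hours left
```

An estimate like this is what turns raw sensor data into a maintenance work order scheduled before the failure occurs.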

Conclusion:

Cloud-based motor monitoring as a service offers tremendous potential for organizations seeking to optimize motor performance and maintenance. However, in 2023, several challenges need to be addressed to ensure its successful implementation. From data security and connectivity issues to scalability, integration, and advanced analytics, service providers must actively tackle these challenges to unlock the full benefits of cloud-based motor monitoring. By doing so, organizations can enhance operational efficiency, extend motor lifespan, and reduce costly downtime in the ever-evolving landscape of motor-driven industries.

Read more…


Wearable devices, such as smartwatches, fitness trackers, and health monitors, have become increasingly popular in recent years. These devices are designed to be worn on the body and can measure various physiological parameters, such as heart rate, blood pressure, and body temperature. Wearable devices can also track physical activity, sleep patterns, and even detect falls and accidents.

Body sensor networks (BSNs) take the concept of wearables to the next level. BSNs consist of a network of wearable sensors that can communicate with each other and with other devices. BSNs can provide real-time monitoring of multiple physiological parameters, making them useful for a range of applications, including medical monitoring, sports performance monitoring, and military applications.

Smart portable devices, such as smartphones and tablets, are also an essential component of the IoT ecosystem. These devices are not worn on the body, but they are portable and connected to the internet, allowing for seamless communication and data transfer. Smart portable devices can be used for a wide range of applications, such as mobile health, mobile banking, and mobile commerce.

The development of wearables, BSNs, and smart portable devices requires a unique set of skills and expertise, including embedded engineering. Embedded engineers are responsible for designing and implementing the hardware and software components that make these devices possible. Embedded engineers must have a deep understanding of electronics, sensors, microcontrollers, and wireless communication protocols.

One of the significant challenges of developing wearables, BSNs, and smart portable devices is power consumption. These devices are designed to be small, lightweight, and portable, which means that they have limited battery capacity. Therefore, embedded engineers must design devices that can operate efficiently with minimal power consumption. This requires careful consideration of power management strategies, such as sleep modes and low-power communication protocols.
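The payoff of the sleep modes mentioned above can be estimated with simple arithmetic: average the active and sleep currents weighted by the fraction of time spent in each state. The currents, wake interval, and cell capacity below are plausible but hypothetical figures for a wearable sensor.

```python
def battery_life_days(capacity_mah, active_ma, sleep_ma, active_s, period_s):
    """Estimate battery life for a device that wakes for `active_s`
    seconds out of every `period_s` seconds and sleeps in between."""
    duty = active_s / period_s
    avg_ma = active_ma * duty + sleep_ma * (1 - duty)
    return capacity_mah / avg_ma / 24

# A sensor that samples and transmits for 2 s every 5 minutes, drawing
# 80 mA awake and 10 µA asleep, powered by a 1000 mAh cell.
days = battery_life_days(1000, 80, 0.01, 2, 300)  # roughly 77 days
```

Under these assumptions the device lasts about 77 days, versus roughly half a day if it never slept, which is why aggressive duty cycling dominates wearable firmware design.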

Another challenge of developing wearables, BSNs, and smart portable devices is data management. These devices generate large volumes of data that need to be collected, processed, and stored. The data generated by these devices can be highly sensitive and may need to be protected from unauthorized access. Therefore, embedded engineers must design devices that can perform efficient data processing and storage while providing robust security features.

The communication protocols used by wearables, BSNs, and smart portable devices also present a significant challenge for embedded engineers. These devices use wireless communication protocols, such as Bluetooth and Wi-Fi, to communicate with other devices and the internet. However, the communication range of these protocols is limited, which can make it challenging to establish and maintain reliable connections. Embedded engineers must design devices that can operate efficiently in environments with limited communication range and intermittent connectivity.
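A common firmware pattern for coping with that intermittent connectivity is store-and-forward: readings are queued locally while the link is down and flushed in order once it returns. The sketch below is illustrative only; the class name, buffer size, and the toy send function are assumptions, not a real transport.

```python
from collections import deque

class StoreAndForward:
    """Queue readings while the link is down; flush them in order once
    it comes back. Oldest readings drop first if the buffer overflows."""

    def __init__(self, send, max_buffered=1000):
        self.send = send              # callable returning True on success
        self.buffer = deque(maxlen=max_buffered)

    def submit(self, reading):
        self.buffer.append(reading)
        self.flush()

    def flush(self):
        while self.buffer:
            if not self.send(self.buffer[0]):
                break                 # link still down; keep buffering
            self.buffer.popleft()

# Toy transport: "delivers" only while link_up is True.
sent, link_up = [], False
sf = StoreAndForward(lambda r: link_up and (sent.append(r) or True))
sf.submit({"t": 1, "hr": 62})
sf.submit({"t": 2, "hr": 64})   # both buffered while offline
link_up = True
sf.submit({"t": 3, "hr": 63})   # reconnect: all three delivered in order
```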

Finally, the user interface and user experience of wearables, BSNs, and smart portable devices are critical for their success. These devices must be easy to use and intuitive, with a user interface that is designed for small screens and limited input methods. Embedded engineers must work closely with user experience designers to ensure that the devices are user-friendly and provide a seamless user experience.

Read more…

Wireless Sensor Networks and IoT

We all know how IoT has revolutionized the way we interact with the world. IoT devices are now ubiquitous, from smart homes to industrial applications. A significant portion of these deployments are built on Wireless Sensor Networks (WSNs), a key component of IoT systems. However, designing and implementing WSNs presents several challenges for embedded engineers. In this article, we discuss some of the significant challenges that embedded engineers face when working with WSNs.

WSNs are a network of small, low-cost, low-power, and wirelessly connected sensor nodes that can sense, process, and transmit data. These networks can be used in a wide range of applications such as environmental monitoring, healthcare, industrial automation, and smart cities. WSNs are typically composed of a large number of nodes, which communicate with each other to gather and exchange data. The nodes are equipped with sensors, microprocessors, transceivers, and power sources. The nodes can also be stationary or mobile, depending on the application.

One of the significant challenges of designing WSNs is the limited resources of the nodes. WSNs are designed to be low-cost, low-power, and small, which means that the nodes have limited processing power, memory, and energy. This constraint limits the functionality and performance of the nodes. Embedded engineers must design WSNs that can operate efficiently with limited resources. The nodes should be able to perform their tasks while consuming minimal power to maximize their lifetime.

Another challenge of WSNs is the limited communication range. The nodes communicate with each other using wireless radio signals. However, the range of the radio signals is limited, especially in indoor environments where the signals are attenuated by walls and other obstacles. The communication range also depends on the transmission power of the nodes, which is limited to conserve energy. Therefore, embedded engineers must design WSNs that can operate reliably in environments with limited communication range.
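The range limitation can be quantified with the free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 20·log10(4π/c). This idealized model ignores the walls and obstacles that make indoor losses worse, but it already shows that every doubling of distance costs about 6 dB of link budget; the 2.4 GHz frequency and distances below are illustrative.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB for an ideal, obstacle-free link."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

loss_10m = fspl_db(10, 2.4e9)   # ~60 dB at 10 m on the 2.4 GHz band
loss_20m = fspl_db(20, 2.4e9)   # doubling the distance adds ~6 dB
```

Because transmit power is capped to conserve energy, each extra 6 dB of loss directly shrinks the usable range of a node.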

WSNs also present a significant challenge for embedded engineers in terms of data management. WSNs generate large volumes of data that need to be collected, processed, and stored. However, the nodes have limited storage capacity, and transferring data to a centralized location may not be practical due to the limited communication range. Therefore, embedded engineers must design WSNs that can perform distributed data processing and storage. The nodes should be able to process and store data locally and transmit only the relevant information to a centralized location.
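The distributed-processing idea above is often implemented as report-by-exception: each node aggregates a window of raw readings locally and transmits the summary only when it differs meaningfully from the last value reported. This is a minimal sketch; the function name, window, and delta are illustrative assumptions.

```python
def summarize_window(readings, last_reported, delta=0.5):
    """Aggregate a window of raw readings and decide whether the summary
    differs enough from the last report to be worth transmitting."""
    summary = {
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }
    changed = (last_reported is None
               or abs(summary["mean"] - last_reported["mean"]) > delta)
    return summary, changed

window = [21.1, 21.3, 21.2, 21.4]          # raw temperature samples
summary, should_send = summarize_window(window, last_reported={"mean": 21.25})
# The mean barely moved, so the node stays silent and saves radio energy.
```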

Security is another significant challenge for WSNs. The nodes in WSNs are typically deployed in open and unprotected environments, making them vulnerable to physical and cyber-attacks. The nodes may also contain sensitive data, making them an attractive target for attackers. Embedded engineers must design WSNs with robust security features that can protect the nodes and the data they contain from unauthorized access.

The deployment and maintenance of WSNs present challenges for embedded engineers. WSNs are often deployed in harsh and remote environments, making it difficult to access and maintain the nodes. The nodes may also need to be replaced periodically due to the limited lifetime of the power sources. Therefore, embedded engineers must design WSNs that are easy to deploy, maintain, and replace. The nodes should be designed for easy installation and removal, and the network should be self-healing to recover from node failures automatically.

Final thought: WSNs present significant challenges for embedded engineers, including limited resources, communication range, data management, security, and deployment and maintenance. Addressing these challenges requires innovative design approaches that maximize the performance and efficiency of WSNs while minimizing their cost and complexity. Embedded engineers must design WSNs that operate efficiently with limited resources, perform distributed data processing and storage, provide robust security features, and are easy to deploy and maintain.

Read more…

What is GNSS positioning technology?

GNSS is the general term for all navigation and positioning satellite systems, that is, Global Navigation Satellite Systems: GPS in the United States, BDS (BeiDou) in China, GLONASS in Russia, GALILEO in Europe, and so on. We can also simply understand it as a positioning system based on artificial Earth satellites, which can provide accurate position, velocity, and time information anywhere in the world and in near-Earth space.

The principle of GNSS positioning is based on the constant propagation speed of radio waves and the straight-line nature of their propagation path: by measuring the propagation time of the radio waves through space, the receiver determines the distance between each satellite and the user's receiver antenna. Using these distances as radii, the user's position lies where the three spheres meet, and it is solved from the simultaneous equations.

First of all, any location on the earth's surface has its three-dimensional coordinates, that is, longitude, latitude, and elevation. The GNSS satellite overhead likewise has its own three-dimensional coordinates. We can regard the entire space as one coordinate system and picture a box whose two opposite corners are the user and the satellite.

Secondly, based on solid geometry, we can write the distance △L between the satellite and the user (this distance is also called the "pseudorange") as:

△L = √((x − x′)² + (y − y′)² + (z − z′)²)

The coordinates of the satellite (x′, y′, z′) are known, while the coordinates of the user (x, y, z) are unknown. The satellite sends a signal to the user terminal at a transmission speed essentially equal to the speed of light c, and because the satellite carries a highly accurate atomic clock, it knows precisely the time t at which the signal was sent.
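The "three spheres" solution can be sketched numerically as a Newton iteration on the pseudorange equations. This is an illustrative Python sketch: coordinates are in kilometers, the satellite positions and starting guess are invented for the example, and a real receiver additionally solves for its clock bias using a fourth satellite:

```python
import math

def dist(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    d = det3(A)
    out = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        out.append(det3(Ai) / d)
    return out

def trilaterate(sats, ranges, guess, iters=50):
    """Newton iteration on |x - sat_i| = range_i for three satellites."""
    x = list(guess)
    for _ in range(iters):
        J, r = [], []
        for s, rho in zip(sats, ranges):
            d = dist(x, s)
            J.append([(x[k] - s[k]) / d for k in range(3)])  # unit vector row
            r.append(rho - d)                                # range residual
        dx = solve3(J, r)
        x = [x[k] + dx[k] for k in range(3)]
    return x
```

With satellite positions and measured pseudoranges, `trilaterate` refines an initial position guess until the three sphere equations are satisfied simultaneously.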

Read more…

Leveraging Salesforce IoT Cloud

What is so unique about IoT technology? With IoT, there is neither a need for human-to-computer nor human-to-human interaction. In other words, everything is connected yet independent. 

If we bring Salesforce to the picture, we have an intelligent solution, i.e., Salesforce IoT Cloud, that can get all your data in a single place. Sounds interesting, right? So, what is Salesforce IoT cloud, and how does it benefit businesses? It makes sense to partner with a reliable Salesforce consultant to know more about this robust solution. 

What is Salesforce IoT cloud?

Salesforce IoT Cloud allows organizations to create and enrich customer profiles and enter data irrespective of location, channel, time, or device. The robust solution helps implement IoT interaction rules, personalize outcomes, and integrate simply with the Salesforce mobile app. The platform helps organizations handle the enormous volumes of data gathered from different processes, network devices, and locations.

Its practical approach helps develop excellent customer relationships, improving client retention and engagement. By helping organizations monetize their IoT investment, Salesforce IoT Cloud provides a competitive edge from any connected device. The Salesforce IoT Cloud platform can be integrated with other Salesforce services such as Sales Cloud, Service Cloud, Community Cloud, and Marketing Cloud.

Top Benefits of Salesforce IoT Cloud

Data Analysis with Einstein Analytics: Salesforce IoT cloud utilizes the Einstein Analytics platform to process data gathered from different sources and analyze it. This helps users better understand consumer behavior and preferences while undertaking the necessary steps to attract and retain them.

Augments Customer Experience: All the historical data regarding customer interaction gets stored on the Salesforce IoT cloud platform, which considers their location, service background, and more to provide organizations with a complete view of customer behavior. In the long run, organizations will become more proactive by anticipating customer demands, thereby providing a superior customer experience.

Low or No Code Approach: Due to its low code approach, the Salesforce IoT Cloud allows business executives to carry out their IoT processes without bothering the IT department. The orchestration rules can be set up in a way like that of a Salesforce marketing cloud feature, i.e., 'customer journey.' IoT will automate your business operations by placing triggers and responses in place.

Customer Context: The machine learning aspect of Salesforce IoT Cloud records and analyzes customers' past actions and behaviors to make real-time decisions. The feature combines customer history, location, and service history with IoT device data to give you a complete picture of what's occurring.

More Visibility: The process involved in implementing a Salesforce IoT system is a primary consideration in choosing it. Businesses also get a bird's-eye view of processes in progress from the traffic view, a visual representation of an organization's ROI across different aspects. The visual panel allows organizations to see how IoT products perform in a constantly evolving consumer landscape.

Increases Client Referrals: Clients expect hassle-free connectivity with businesses. Implementing the Salesforce IoT cloud platform makes it possible to build strong customer relationships by being accessible to them all the while.

Enhances R&D Activities: Businesses can make the necessary changes to suit customer needs and requirements by tracking consumer behavior and preferences. Besides this, it is also possible to anticipate the tastes of future customers. Businesses can make quick decisions or fix problems by understanding what's working and what isn't. This improves service delivery.

Increases Lead Generation: Beyond B2B transactions, the Salesforce IoT Cloud can help the sales department gain information about connected products. With this information, the sales team can identify items whose warranty has expired or is approaching expiration, and more. They can leverage this data to upgrade their sales processes and reach targeted customers. Businesses can create personalized deals or pitch a new product to customers if their existing product isn't working well. Much of the gathered data can be used to forecast customer needs in several ways.

Provides a Complete Perspective of Customers: By leveraging personalized reports, Salesforce IoT provides businesses with a 360-degree view of their customers; these reports can be tailored to an organization's requirements.

Seamless Integration with Other Systems: Besides empowering organizations to provide services independently by gathering and processing data, Salesforce IoT cloud is capable of expanding its functionality, permitting third-party integration. Consequently, businesses get to access data from multiple sources, enabling them to explore other aspects of a business.

Final Words:

The Salesforce IoT Cloud provides a complete and exact picture of how customers utilize goods and services by integrating data gathered by their apps, irrespective of location. Businesses can leverage the collected data to create personalized sales and marketing strategies while influencing customer opinion. Implementing Salesforce IoT Cloud will take every company's client management to a new level. The innovative cloud solution offers infinite potential, which businesses can leverage to make informed business decisions. However, if you wish to integrate the Salesforce IoT Cloud within your business ecosystem, ensure you get in touch with a certified Salesforce Consulting Partner.

Read more…

What is edge acquisition?

1. Introduction to edge acquisition

Edge acquisition abstracts data on external devices (such as ModBus registers on a ModBus device) into internal data points of the acquisition device (the E870-D1). Configuring and reading an internal data point is equivalent to configuring and reading the corresponding external data (such as a ModBus register). External devices can be ModBus devices or devices using other communication protocols, such as I2C, CAN, and other bus protocols. Currently, the Ebyte E870-D1 supports only ModBus devices. (This article takes the E870-D1 as its example.)

2. The principle of edge acquisition

Implementation principle of edge acquisition (taking a ModBus device as an example): as shown in the edge acquisition application topology diagram, when edge acquisition data points are configured, the edge acquisition device (E870-D1) polls and reads the registers in the external device through the ModBus protocol and transmits the data to the cloud platform over 4G. When the cloud platform configures a data point, the edge acquisition device writes the corresponding register of the external device through the ModBus protocol, thus realizing transparent transmission and control between the cloud platform and the external device.

The data structure of each data point includes the following attributes: "Enable", "Keyword", "Slave Address", "Register Type", "Register Address", "Data Type", "Report Mode", "Report Time", "Variation Range", "Number of Decimal Places", "Read-Write Property", "Upstream Formula", and "Downstream Formula".

Among them, "Keyword", "Slave Address", "Register Type", "Register Address", and "Read-Write Property" are used to realize the association between data points and ModBus registers.

The "Keyword" is the name of the data point; the name cannot be repeated within one edge acquisition device (E870-D1). When the cloud platform reads or configures the external device through the data point name ("Keyword"), the edge acquisition device (E870-D1) automatically translates the data point into the corresponding ModBus register according to its "Slave Address", "Register Type", "Register Address", and "Read-Write Property" information.

"Enable" controls whether the data point is valid. Only valid data points can be read, configured and polled.

The other attributes, "Report Mode", "Report Time", "Variation Range", "Number of Decimal Places", "Upstream Formula", and "Downstream Formula", implement simple edge computing, which is described in detail below.
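Putting these attributes together, a data point might be modeled as below. This is an illustrative sketch that borrows the attribute names from the article; the values and the reporting helper are invented, not the E870-D1's actual configuration format:

```python
# Hypothetical data point using the attribute names from the article.
data_point = {
    "Enable": True,
    "Keyword": "boiler_temp",      # unique name within the device
    "Slave Address": 1,
    "Register Type": "holding",
    "Register Address": 0x0002,
    "Data Type": "int16",
    "Report Mode": "on_change",
    "Report Time": 60,             # seconds, for periodic reporting
    "Variation Range": 5,          # report only if the value moved this much
    "Number of Decimal Places": 1,
    "Read-Write Property": "read",
    "Upstream Formula": "x * 0.1",
    "Downstream Formula": "x * 10",
}

def should_report(point, last_reported, new_value):
    """Decide whether a freshly polled value is worth uploading,
    using the "Enable" and "Variation Range" attributes."""
    if not point["Enable"]:
        return False               # disabled points are never reported
    if last_reported is None:
        return True                # always report the first sample
    return abs(new_value - last_reported) >= point["Variation Range"]
```

The "Keyword" plus the ModBus addressing fields tie the point to a register, while the reporting fields implement the data-filtering behavior discussed below.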
3. The difference between edge acquisition and edge computing: Huawei's definition of edge computing

Huawei's definition of edge computing is: Edge computing is a distributed and open platform that integrates network, computing, storage, and application core capabilities at the edge of the network near the source of things or data to provide edge intelligent services nearby. To put it simply, edge computing is to analyze the data collected from the terminal directly in the local device or network where the data is generated, without the need to transmit the data to the cloud data processing center.

Comparing Huawei's definition of edge computing with the description of the nature and principle of edge acquisition earlier in this article, it is not difficult to see that edge acquisition is one part of edge computing, namely the part where the terminal collects data. In addition, the Ebyte E870-D1 can transmit the collected data to the platform.
4. Problems solved by Ebyte E870-D1 edge acquisition:

1. Communication between ModBus and the platform: traditional ModBus devices communicate only over short distances, making it difficult to reach an Internet platform. Using the Ebyte E870-D1's edge acquisition function, local ModBus devices can be seamlessly connected to a remote cloud platform, so the cloud platform can monitor and manage many ModBus devices in a unified way.

2. Reduce the pressure on the platform server: the Ebyte E870-D1 can easily add and delete data points through the "Enable" attribute, reducing unnecessary data point uploads. At the same time, "Report Mode", "Report Time", and "Variation Range" control when data points are reported, reducing unnecessary reports. Together these achieve a data-filtering effect.

3. Simple edge computing can be realized: through the "upstream formula" and "downstream formula", custom addition, subtraction, multiplication and division calculations can be performed inside the Ebyte E870-D1.
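A minimal sketch of such a formula evaluator, assuming the formula is a simple arithmetic expression in the variable x (the E870-D1's real formula syntax is not specified here, so this is an illustration of the idea only):

```python
def apply_formula(formula, x):
    """Evaluate a simple arithmetic expression in the variable x.
    Illustrative only; a device would use a fixed, safe expression parser."""
    return eval(formula, {"__builtins__": {}}, {"x": x})

raw = 2473                                # hypothetical raw register value
scaled = apply_formula("x * 0.1", raw)    # "Upstream Formula": device -> platform
back = apply_formula("x * 10", scaled)    # "Downstream Formula": platform -> device
```

A register holding tenths of a degree, for instance, would be scaled on the way up and unscaled on the way down, so the platform only ever sees engineering units.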
5. The advantages of Ebyte E870-D1 edge acquisition over ModBus_TCP:

1. Another way to achieve remote control of local ModBus devices is ModBus_TCP mode, but ModBus_TCP is not suitable for frequent exchanges of small amounts of data, because it consumes a lot of network resources and platform resources.

Today's rapidly growing population of edge devices is small, sophisticated, and highly specialized, such as the wide variety of sensors, and not every device can be equipped with networking capability. If a huge number of edge devices connect directly to the platform through ModBus_TCP, the pressure on the platform can easily be imagined. The Ebyte E870-D1 edge acquisition function solves this problem very well.

2. Can connect to multiple external devices at the same time: the Ebyte E870-D1 can connect to multiple external devices simultaneously, while ModBus_TCP generally communicates only one-to-one.

3. Built-in IO can also be abstracted into data points: we explained above how external device data is abstracted into edge acquisition data points. In the same way, the DI, DO, AI, and AO channels built into the Ebyte E870-D1 can be abstracted into data points with the same structure, eliminating the difference between built-in IO and external device data. Reading or configuring built-in IO uses the same data structure, which is very helpful for platform development.

6. Usage scenarios

1. Connect multiple industrial DI, DO, AI, AO equipment to realize the network upgrade and transformation of traditional equipment.

2. Connect multiple ModBus sensors to realize environmental monitoring of an area.

Read more…
An AI-based approach increases accuracy and can even make the impossible possible.
 
What is an Outlier?
 
Put simply, an outlier is a piece of data or observation that differs drastically from a given norm.
 
In the image above, the red fish is an outlier: it clearly differs by color, but also by size, shape, and, most obviously, direction. As such, outlier detection falls into two categories: univariate and multivariate.
  • Univariate: considering a single variable
  • Multivariate: considering multiple variables
 
Outlier Detection in Industrial IoT
 
In Industrial IoT use cases, outlier detection can be instrumental in specific use cases such as understanding the health of your machine. Instead of looking at characteristics of a fish like above, we are looking at characteristics of a machine via data such as sensor readings.
 
The goal is to learn what normal operation looks like where outliers are abnormal activity indicative of a future problem.
 
Statistical Approach to Outlier Detection
Statistics - Normal Distribution 
Statistical and probability-based approaches date back centuries. You may recall the bell curve: the values of your dataset plot to a distribution. In simplest terms, you calculate the mean and standard deviation of that distribution. You can then mark the location x standard deviations from the mean, and anything that falls beyond it is an outlier.
 
A simple example to explore using this approach is outside air temperature. Looking at the low temperature in Boston for the month of January from 2008-2018 we find an average temperature of ~23 degrees F with a standard deviation of ~9.62 degrees. Plotting out 2 standard deviations results in the following.
 
 
 
 
Interpreting the chart above, any temperature above the gray line or below the yellow can be considered outside the range of normal...or an outlier.
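The two-standard-deviation rule is easy to express in code. The sketch below uses hypothetical January low temperatures, not the actual Boston dataset, and Python's standard library only:

```python
import statistics

# Hypothetical January low temperatures (deg F), invented for illustration.
temps = [18, 25, 23, 30, 14, 22, 21, 27, 19, 45]

mu = statistics.mean(temps)        # sample mean
sigma = statistics.stdev(temps)    # sample standard deviation
low, high = mu - 2 * sigma, mu + 2 * sigma

# Anything outside [mu - 2*sigma, mu + 2*sigma] is flagged as an outlier.
outliers = [t for t in temps if not (low <= t <= high)]
```

Here the unseasonable 45-degree reading falls beyond the 2-sigma band and gets flagged, while ordinary cold snaps do not.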
 
Why do we need AI?
If we just showed that you can determine outliers using simple statistics, then why do we need AI at all? The answer depends on the type of outlier analysis.
 
Why AI for Univariate Analysis?
In the example above, we successfully analyzed outliers in weather looking at a single variable: temperature.
 
So, why should we complicate things by introducing AI to the equation? The answer has to do with the distribution of your data. You can run univariate analysis using statistical measures, but in order for the results to be accurate, it is assumed that the distribution of your data is "normal". In other words, it needs to fit to the shape of a bell curve (like the left image below).
 
However, in the real world, and specifically in industrial use cases, the resulting sensor data is not perfectly normal (like the right image below).
As a result, statistical analysis on a non-normal dataset would result in more false positives and false negatives.
 
The Need for AI
AI-based methods, on the other hand, do not require a normal distribution and find patterns in the data that result in much higher accuracy. In the case of the weather in Boston, getting the forecast slightly wrong does not have a huge impact. However, in industries such as rail, oil and gas, and industrial equipment, trust in the accuracy of your results has a long-lasting impact, an impact that can only be achieved with AI.
 
Why AI for Multivariate Analysis?
The case for AI in multivariate analysis is more straightforward. When we are looking at a single variable, we can easily plot the results on a plane, as with the temperature chart or the normal and non-normal distribution charts above.
 
However, if we are analyzing multiple points, such as the current, voltage, and wattage of a motor, vibration over three axes, or the return and discharge temperatures of an HVAC system, plotting and analyzing with statistics has its limitations. Just visualizing the plot becomes impossible for a human as we go from a single plane to hyperplanes.
 
 
The Need for AI
For multivariate analysis, visual inspection starts to go beyond human capabilities while technical analysis goes beyond statistical capabilities. Instead, AI can be utilized to find patterns in the underlying data in order to learn normal operation and adequately monitor for outliers. In other words, for multivariate analysis AI starts to make the impossible possible.
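For context, the classical multivariate counterpart of the z-score is the Mahalanobis distance, which accounts for correlation between variables; AI methods such as isolation forests and autoencoders go beyond it, but it illustrates what "distance from normal" means in more than one dimension. The sketch below handles two variables (say, current and a correlated second channel) in pure Python; the readings are invented for illustration:

```python
import statistics

def mahalanobis_2d(point, data):
    """Mahalanobis distance of a 2-D point from a dataset: like a z-score,
    but it accounts for the covariance between the two variables."""
    xs = [p[0] for p in data]
    ys = [p[1] for p in data]
    mx, my = statistics.mean(xs), statistics.mean(ys)
    n = len(data)
    # Sample covariance matrix entries.
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    det = sxx * syy - sxy * sxy            # covariance determinant
    dx, dy = point[0] - mx, point[1] - my
    # (dx, dy) times the inverse covariance times (dx, dy), expanded for 2x2.
    d2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return d2 ** 0.5

# Invented motor readings where the two channels normally move together.
readings = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2), (5.0, 9.9), (6.0, 12.1)]
d_ok = mahalanobis_2d((3.5, 7.0), readings)    # follows the trend: small distance
d_bad = mahalanobis_2d((3.5, 11.0), readings)  # breaks the correlation: large distance
```

The anomalous point looks unremarkable on either axis alone; only the joint view exposes it, which is exactly the regime where univariate statistics fail and learned models earn their keep.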
 
Summary
Statistics and probability have been around far longer than anyone reading this post. However, not all data is created equal, and in the world of industrial IoT, statistical techniques have crucial limitations.
 
AI-based techniques go beyond these limitations, helping to reduce false positives and false negatives and often making robust analysis possible for the first time.
 
At Elipsa, we build simple, fast and flexible AI for IoT. Get free access to our Community Edition to start integrating machine learning into your applications.
 
Read more…

For object detection projects, labeling your images with their corresponding bounding boxes and names is a tedious and time-consuming task, often requiring a human to label each image by hand. The Edge Impulse Studio has already dramatically decreased the amount of time it takes to get from raw images to a fully labeled dataset with the Data Acquisition Labeling Queue feature directly in your web browser. To make this process even faster, the Edge Impulse Studio is getting a new feature: AI-Assisted Labeling.

Automatically label common objects with YOLOv5.

 

To get started, create a “Classify multiple objects” images project via the Edge Impulse Studio new project wizard or open your existing object detection project. Upload your object detection images to your Edge Impulse project’s training and testing sets. Then, from the Data Acquisition tab, select “Labeling queue.” 

 

1. Using YOLOv5

By utilizing an existing library of pre-trained object detection models from YOLOv5 (trained with the COCO dataset), common objects in your images can quickly be identified and labeled in seconds without needing to write any code!

To label your objects with YOLOv5 classification, click the Label suggestions dropdown and select “Classify using YOLOv5.” If your object is more specific than what is auto-labeled by YOLOv5, e.g. “coffee” instead of the generic “cup” class, you can modify the auto-labels to the left of your image. These modifications will automatically apply to future images in your labeling queue.


 

Click Save labels to move on to your next raw image, and see your fully labeled dataset ready for training in minutes!

 

2. Using your own model

You can also use your own trained model to predict and label your new images. From an existing (trained) Edge Impulse object detection project, upload new unlabeled images from the Data Acquisition tab. Then, from the “Labeling queue”, click the Label suggestions dropdown and select “Classify using <your project name>”:


 

You can also upload a few samples to a new object detection project, train a model, then upload more samples to the Data Acquisition tab and use the AI-Assisted Labeling feature for the rest of your dataset. Classifying using your own trained model is especially useful for objects that are not in YOLOv5, such as industrial objects, etc.

Click Save labels to move on to your next raw image, and see your fully labeled dataset ready for training in minutes using your own pre-trained model!

 

3. Using object tracking

If you have objects that are a similar size or common between images, you can also track your objects between frames within the Edge Impulse Labeling Queue, reducing the amount of time needed to re-label and re-draw bounding boxes over your entire dataset.

Draw your bounding boxes and label your images, then, after clicking Save labels, the objects will be tracked from frame to frame:

Track and auto-label your objects between frames.
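Edge Impulse's tracking implementation isn't described here, but the idea can be illustrated with a simple intersection-over-union (IoU) matcher that carries labels from the previous frame to overlapping boxes in the current frame. All names and the threshold below are invented for the sketch:

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))   # overlap width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))   # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def propagate_labels(prev_boxes, curr_boxes, threshold=0.3):
    """Carry labels from (label, box) pairs in the previous frame to
    overlapping boxes in the current frame; unmatched boxes get None."""
    labeled = []
    for box in curr_boxes:
        best = max(prev_boxes, key=lambda p: iou(p[1], box), default=None)
        if best and iou(best[1], box) >= threshold:
            labeled.append((best[0], box))
        else:
            labeled.append((None, box))   # needs manual labeling
    return labeled
```

Because consecutive frames change little, most boxes inherit their labels automatically and only genuinely new objects need a human touch.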

 

Now that your object detection project contains a fully labeled dataset, learn how to train and deploy your model to your edge device: check out our tutorial!

 

Originally posted on the Edge Impulse blog by Jenny Plunkett - Senior Developer Relations Engineer.

Read more…

The head is surely the most complex group of organs in the human body, but also the most delicate. The assessment and prevention of risks in the workplace remains the first priority approach to avoid accidents or reduce the number of serious injuries to the head. This is why wearing a hard hat in an industrial working environment is often required by law and helps to avoid serious accidents.

This article will give you an overview of how to verify that all workers are wearing their hard hats, using a machine learning object detection model.

For this project, we have been using:

  • Edge Impulse Studio to acquire custom data, visualize the data, train the machine learning model, and validate the inference results.
  • Part of this public dataset from Roboflow, where the images containing the smallest bounding boxes have been removed.
  • Part of the Flickr-Faces-HQ (FFHQ) dataset (under Creative Commons BY 2.0 license) to rebalance the classes in our dataset.
  • Google Colab to convert the YOLOv5 PyTorch format from the public dataset to the Edge Impulse Ingestion format.
  • A Raspberry Pi, NVIDIA Jetson Nano, or any Intel-based MacBook to deploy the inference model.

Before we get started, here are some insights of the benefits / drawbacks of using a public dataset versus collecting your own. 

Using a public dataset is a nice way to start developing your application quickly, validate your idea, and check the first results. But we often get disappointed with the results when testing on our own data and in real conditions. As such, for very specific applications, you might spend much more time trying to tweak an open dataset than collecting your own. Also, always make sure the license suits your needs when using a dataset you found online.

On the other hand, collecting your own dataset can take a lot of time; it is a repetitive and often tedious task. But it lets you collect data as close as possible to your real-life application, with the same lighting conditions, the same camera, or the same angle, for example. Therefore, your accuracy in real conditions will be much higher.

Using only custom data can indeed work well in your environment but it might not give the same accuracy in another environment, thus generalization is harder.

The dataset which has been used for this project is a mix of open data, supplemented by custom data.

First iteration, using only the public datasets

At first, we tried to train our model using only a small portion of this public dataset: 176 items in the training set and 57 items in the test set, where we kept only the images containing bounding boxes bigger than 130 pixels; we will see why later.
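That filtering step can be sketched as a small helper. The function and its interpretation of the threshold are assumptions for illustration; the project's actual import script is not shown here:

```python
MIN_SIDE = 130  # pixels, the threshold used in this project

def keep_image(boxes):
    """boxes: list of (x, y, width, height) annotations for one image.
    Keep the image only if every box's smaller side is at least MIN_SIDE,
    since small objects are hard for the 320x320 detector (see below)."""
    return all(min(w, h) >= MIN_SIDE for (_, _, w, h) in boxes)
```

Run over the raw annotation list, this keeps images like a close-up hard hat and discards images where heads occupy only a handful of pixels.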


If you go through the public dataset, you can see that the entire dataset is strongly missing some “head” data samples. The dataset is therefore considered as imbalanced.

Several techniques exist to rebalance a dataset, here, we will add new images from Flicker-Faces-HQ (FFHQ). These images do not have bounding boxes but drawing them can be done easily in the Edge Impulse Studio. You can directly import them using the uploader portal. Once your data has been uploaded, just draw boxes around the heads and give it a label as below: 


Now that the dataset is more balanced, with both images and bounding boxes of hard hats and heads, we can create an impulse, which is a mix of digital signal processing (DSP) blocks and training blocks:


In this particular object detection use case, the DSP block will resize an image to fit the 320x320 pixels needed for the training block and extract meaningful features for the Neural Network. Although the extracted features don’t show a clear separation between the classes, we can start distinguishing some clusters:


To train the model, we selected the Object Detection training block, which fine tunes a pre-trained object detection model on your data. It gives a good performance even with relatively small image datasets. This object detection learning block relies on MobileNetV2 SSD FPN-Lite 320x320.    

According to Daniel Situnayake, co-author of the TinyML book and founding TinyML engineer at Edge Impulse, this model "works much better for larger objects—if the object takes up more space in the frame it's more likely to be correctly classified." This has been one of the reasons why we got rid of the images containing the smallest bounding boxes in our import script.

After training the model, we obtained 61.6% accuracy on the training set and 57% accuracy on the testing set. You might also note a huge accuracy difference between the quantized version and the float32 version. However, during Linux deployment, the default model uses the unoptimized version, so we will focus only on the float32 version in this article.


This accuracy is not satisfying, and the model tends to have trouble detecting the right objects in real conditions.


Second iteration, adding custom data

On the second iteration of this project, we have gone through the process of collecting some of our own data. A very useful and handy way to collect some custom data is using our mobile phone. You can also perform this step with the same camera you will be using in your factory or your construction site, this will be even closer to the real condition and therefore work best with your use case. In our case, we have been using a white hard hat when collecting data. For example, if your company uses yellow ones, consider collecting your data with the same hard hats. 

Once the data has been acquired, go through the labeling process again and retrain your model. 


We obtain a model that is slightly more accurate when looking at the training performances. However, in real conditions, the model works far better than the previous one.


Finally, to deploy your model on your Raspberry Pi, NVIDIA Jetson Nano, or Intel-based MacBook, just follow the instructions provided in the links. The command line interface `edge-impulse-linux-runner` will create a lightweight web interface where you can see the results.


Note that the inference runs locally and you do not need any internet connection to detect your objects. Last but not least, the trained models and the inference SDK are open source. You can use them, modify them, and integrate them into a broader application matched specifically to your needs, such as stopping a machine when a head is detected for more than 10 seconds.

This project has been publicly released; feel free to have a look at it in the Edge Impulse Studio, clone the project, and go through every step to get a better understanding: https://studio.edgeimpulse.com/public/34898/latest

The essence of this use case is that Edge Impulse allows you, with very little effort, to develop industry-grade solutions in the health and safety context. These can be embedded in bigger industrial control and automation systems, with a consistent and stringent focus on machine operations linked to H&S compliance measures. Pre-trained models, which can later easily be retrained in the final industrial context as a "calibration" step, make this a customizable solution for your next project.

Originally posted on the Edge Impulse blog by Louis Moreau - User Success Engineer at Edge Impulse & Mihajlo Raljic - Sales EMEA at Edge Impulse

Read more…

Today the world is obsessed with the IoT, as if this is a new concept. We've been building the IoT for decades, but it was only recently some marketing "genius" came up with the new buzz-acronym.

Before there was an IoT, before there was an Internet, many of us were busy networking. For the Internet itself was a (brilliant) extension of what was already going on in the industry.

My first experience with networking was in 1971 at the University of Maryland. The school had a new computer, a $10 million Univac 1108 mainframe. This was a massive beast that occupied most of the first floor of a building. A dual-processor machine, it was transistorized, though the control console did have some ICs. Rows of big tape drives mirrored the layman's idea of computers in those days. Many dishwasher-sized disk drives were placed around the floor, and printers, card readers and other equipment were crammed into every corner. Two Fastrand drum memories, each consisting of a pair of six-foot-long counterrotating drums, stored a whopping 90 MB each. Through a window you could watch the heads bounce around.

The machine was networked. It had a 300 baud modem with which it could contact computers at other universities. A primitive email system let users create mail which was queued till nightfall. Then, when demands on the machine were small, it would call the appropriate remote computer and forward mail. The system operated somewhat like today's "hot potato" packets, where the message might get delivered to the easiest machine available, which would then attempt further forwarding. It could take a week to get an email, but at least one saved the $0.08 stamp that the USPS charged.

The system was too slow to be useful. After college I lost my email account but didn't miss it at all.

By the late 70s many of us had our own computers. Mine was a home-made CP/M machine with a Z80 processor and a small TV set as a low-res monitor. Around this time Compuserve came along and I, like so many others, got an account with them. Among other features, users had email addresses. Pretty soon it was common to dial into their machines over a 300 baud modem and exchange email and files. Eventually Compuserve became so ubiquitous that millions were connected, and at my tools business during the 1980s it was common to provide support via this email. The CP/M machine gave way to a succession of PCs, and modems ramped up to 57K baud.

My tools business expanded rapidly and soon we had a number of employees. Sneakernet was getting less efficient so we installed an Arcnet network using Windows 3.11. That morphed into Ethernet connections, though the cursing from networking problems multiplied about as fast as the data transfers. Windows was just terrible at maintaining reliable connectivity.

In 1992 Mike Lee, a friend from my Boys Night Out beer/politics/sailing/great friends group, which still meets weekly (though lately virtually) came by the office with his laptop. "You have GOT to see this" he intoned, and he showed me the world-wide web. There wasn't much to see as there were few sites. But the promise was shockingly clear. I was stunned.

The tools business had been doing well. Within a month we spent $100k on computers, modems and the like and had a new business: Softaid Internet Services. SIS was one of Maryland's first ISPs and grew quickly to several thousand customers. We had a T1 connection to MAE-EAST in the DC area which gave us a 1.5 Mb/s link… for $5000/month. Though a few customers had ISDN connections to us, most were dialup, and our modem shelf grew to over 100 units with many big fans keeping the things cool.

The computers all ran BSD Unix, which was my first intro to that OS.

I was only a few months back from a failed attempt to singlehand my sailboat across the Atlantic and had written a book-length account of that trip. I hastily created a web page of that book to learn about using the web. It is still online and has been read several million times in the intervening years. We put up a site for the tools business which eventually became our prime marketing arm.

The SIS customers were sometimes, well, "interesting." There was the one who claimed to be a computer expert, but who tried to use the mouse by waving it around over the desk. Many had no idea how to connect a modem. Others complained about our service because it dropped out when mom would pick up the phone to make a call over the modem's beeping. A lot of handholding and training was required.

The logs showed a shocking (to me at the time) amount of porn consumption. Over lunch an industry pundit explained how porn drove all media, from the earliest introduction of printing hundreds of years earlier.

The woman who ran the ISP was from India. She was delightful and had a wonderful marriage. She later told me it had been arranged; they met on their wedding day. She came from a remote and poor village and had had no exposure to computers, or electricity, till emigrating to the USA.

Meanwhile many of our tools customers were building networking equipment. We worked closely with many of them and often had big routers, switches and the like onsite that our engineers were working on. We worked on a lot of what we'd now call IoT gear: sensors et al connected to the net via a profusion of interfaces.

I sold both the tools and Internet businesses in 1997, but by then the web and Internet were old stories.

Today, like so many of us, I have a fast (250 Mb/s) and cheap connection into the house with four wireless links and multiple computers chattering to each other. Where in 1992 the web was incredibly novel and truly lacking in useful functionality, now I can't imagine being deprived of it. Remember travel agents? Ordering things over the phone (a phone that had a physical wire connecting it to Ma Bell)? Using 15 volumes of an encyclopedia? Physically mailing stuff to each other?

As one gets older the years spin by like microseconds, but it is amazing to stop and consider just how much this world has changed. My great grandfather lived on a farm in a world that changed slowly; he finally got electricity in his last year of life. His daughter didn't have access to a telephone till later in life, and my dad designed spacecraft on vellum and starched linen using a slide rule. My son once saw a typewriter and asked me what it was; I mumbled that it was a predecessor of Microsoft Word.

That he understood. I didn't have the heart to try and explain carbon paper.

Originally posted HERE.

Read more…

4 key questions to ask tech vendors

Posted by Terri Hiskey

Without mindful and strategic investments, a company’s supply chain could become wedged in its own proverbial Suez Canal, ground to a halt by outside forces and its inflexible, complex systems.

It’s a dramatic image, but one that became reality for many companies in the last year. Supply chain failures aren’t typically such high-profile events as the Suez Canal blockage, but rather death by a thousand inefficiencies, each slowing business operations and affecting the customer experience.

Delay by delay and spreadsheet by spreadsheet, companies are at risk of falling behind more nimble, cloud-enabled competitors. And as we emerge from the pandemic with a new understanding of how important adaptable, integrated supply chains are, company leaders have critical choices to make.

The Hannover Messe conference (held online from April 12-16) gives manufacturing and supply chain executives around the world a chance to hear perspectives from industry leaders and explore the latest manufacturing and supply chain technologies available.

Technology holds great promise. But if executives don’t ask key strategic questions to supply chain software vendors, they could unknowingly introduce a range of operational and strategic obstacles into their company’s future.

If you’re attending Hannover Messe, here are a few critical questions to ask:

Are advanced technologies like machine learning, IoT, and blockchain integrated into your supply chain applications and business processes, or are they addressed separately?

It’s important to go beyond the marketing. Is the vendor actually promoting pilots of advanced technologies that are simply customized use cases for small parts of an overall business process hosted on a separate platform? If so, it may be up to your company to figure out how to integrate it with the rest of that vendor’s applications and to maintain those integrations.

To avoid this situation, seek solutions that have been purpose-built to leverage advanced technologies across use cases that address the problems you hope to solve. It’s also critical that these solutions come with built-in connections to ensure easy integration across your enterprise and to third party applications.

Are your applications or solutions written specifically for the cloud?

If a vendor’s solution for a key process (like integrated business planning or plan to produce, for example) includes applications developed over time by a range of internal development teams, partners, and acquired companies, what you’re likely to end up with is a range of disjointed applications and processes with varying user interfaces and no common data model. Look for a cloud solution that helps connect and streamline your business processes seamlessly.

Update schedules for the various applications could also be disjointed and complicated, so customers can be tempted to skip updates. But some upgrades may be forced, causing disruption in key areas of your business at various times.

And if some of the applications in the solution were written for the on-premises world, business processes will likely need customization, making them hard-wired and inflexible. The convenience of cloud solutions is that they can take frequent updates more easily, resulting in greater value driven by the latest innovations.

Are your supply chain applications fully integrated—and can they be integrated with other key applications like ERP or CX?

A lack of integration between and among applications within the supply chain and beyond means that end users don’t have visibility into the company’s operations—and that directly affects the quality and speed of business decisions. When market disruptions or new opportunities occur, unintegrated systems make it harder to shift operations—or even come to an agreement on what shift should happen.

And because many key business processes span multiple areas—like manufacturing forecast to plan, order to cash, and procure to pay—integration also increases efficiency. If applications are not integrated across these entire processes, business users resort to pulling data from the various systems and then often spend time debating whose data is right.

Of course, all of these issues increase operational costs and make it harder for a company to adapt to change. They also keep the IT department busy with maintenance tasks rather than focusing on more strategic projects.

Do you rely heavily on partners to deliver functionality in your supply chain solutions?

Ask for clarity on which products within the solution belong to the vendor and which were developed by partners. Is there a single SLA for the entire solution? Will the two organizations’ development teams work together on a roadmap that aligns the technologies? Will their priority be on making a better solution together or on enhancements to their own technology? Will they focus on enabling data to flow easily across the supply chain solution, as well as to other systems like ERP? Will they be able to overcome technical issues that arise and streamline customer support?

It’s critical for supply chain decision-makers to gain insight into these crucial questions. If the vendor is unable to meet these foundational needs, the customer will face constant obstacles in their supply chain operations.

Originally posted here.

Read more…

By Tony Pisani

For midstream oil and gas operators, data flow can be as important as product flow. The operator’s job is to safely move oil and natural gas from its extraction point (upstream), to where it’s converted to fuels (midstream), to customer delivery locations (downstream). During this process, pump stations, meter stations, storage sites, interconnection points, and block valves generate a substantial volume and variety of data that can lead to increased efficiency and safety.

“Just one pipeline pump station might have 6 Programmable Logic Controllers (PLCs), 12 flow computers, and 30 field instruments, and each one is a source of valuable operational information,” said Mike Walden, IT and SCADA Director for New Frontier Technologies, a Cisco IoT Design-In Partner that implements OT and IT systems for industrial applications. Until recently, data collection from pipelines was so expensive that most operators only collected the bare minimum data required to comply with industry regulations. That data included pump discharge pressure, for instance, but not pump bearing temperature, which helps predict future equipment failures.

A turnkey solution to modernize midstream operations

Now midstream operators are modernizing their pipelines with Industrial Internet of Things (IIoT) solutions. Cisco and New Frontier Technologies have teamed up to offer a solution combining the Cisco 1100 Series Industrial Integrated Services Router, Cisco Edge Intelligence, and New Frontier’s know-how. Deployed at edge locations like pump stations, the solution extracts data sent by pipeline equipment over legacy protocols and transforms it at the edge into a format that analytics and other enterprise applications understand. The transformation also minimizes bandwidth usage.
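
As a rough sketch of the kind of edge transformation described above (the register map, scale factors, and field names here are invented for illustration; in the actual product these transforms are configured in Edge Intelligence rather than hand-coded), raw register values polled from a flow computer can be decoded and reshaped into a compact JSON payload before anything crosses the WAN link:

```python
import json

# Hypothetical register map for a pump station flow computer:
# field name -> (register index, scale factor). Illustrative only.
REGISTER_MAP = {
    "discharge_pressure_psi": (0, 0.1),
    "bearing_temp_f":         (1, 0.1),
    "flow_rate_bpd":          (2, 1.0),
}

def transform(registers, station_id):
    """Decode raw 16-bit register values into a compact JSON payload.

    Doing this at the edge means only the small JSON document is sent
    upstream, not the raw legacy-protocol polling traffic.
    """
    payload = {"station": station_id}
    for field, (idx, scale) in REGISTER_MAP.items():
        payload[field] = round(registers[idx] * scale, 2)
    return json.dumps(payload, separators=(",", ":"))

msg = transform([8734, 1612, 45210], "PS-07")
print(msg)
```

The same decoded payload could then be published over MQTT or delivered to whatever enterprise application consumes it.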

Mike Walden views the Cisco IR1101 as a game-changer for midstream operators. He shared with me that “Before the Cisco IR1101, our customers needed four separate devices to transmit edge data to a cloud server—a router at the pump station, an edge device to do protocol conversion from the old to the new, a network switch, and maybe a firewall to encrypt messages…With the Cisco IR1101, we can meet all of those requirements with one physical device.”

Collect more data, at almost no extra cost

Using this IIoT solution, midstream operators can for the first time:

  • Collect all available field data instead of just the data on a polling list. If the maintenance team requests a new type of data, the operations team can meet the request using the built-in protocol translators in Edge Intelligence. “Collecting a new type of data takes almost no extra work,” Mike said. “It makes the operations team look like heroes.”
  • Collect data more frequently, helping to spot anomalies. Recording pump discharge pressure more frequently, for example, makes it easier to detect leaks. Interest in predicting (rather than responding to) equipment failure is also growing. The life of pump seals, for example, depends on both the pressure that seals experience over their lifetime and the peak pressures. “If you only collect pump pressure every 30 minutes, you probably missed the spike,” Mike explained. “If you do see the spike and replace the seal before it fails, you can prevent a very costly unexpected outage – saving far more than the cost of a new seal.”
  • Protect sensitive data with end-to-end security. Security is built into the IR1101, with secure boot, VPN, certificate-based authentication, and TLS encryption.
  • Give IT and OT their own interfaces so they don’t have to rely on the other team. The IT team has an interface to set up network templates to make sure device configuration is secure and consistent. Field engineers have their own interface to extract, transform, and deliver industrial data from Modbus, OPC-UA, EIP/CIP, or MQTT devices.
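
The pressure-spike point above can be illustrated numerically (the pressure trace, seal rating, and sampling intervals below are made up for the sketch): a short transient that a per-minute poll catches is invisible to a 30-minute poll.

```python
# Illustrative pump discharge pressure trace, one sample per minute (psi).
# A brief spike occurs at minute 17; the seal rating is a made-up threshold.
SEAL_RATING_PSI = 900.0

pressure = [850.0] * 60
pressure[17] = 965.0  # short transient spike

def spikes(samples, every_n, limit):
    """Indices of over-limit readings when polling every n-th sample."""
    return [i for i in range(0, len(samples), every_n) if samples[i] > limit]

print(spikes(pressure, 30, SEAL_RATING_PSI))  # polled at minutes 0 and 30: []
print(spikes(pressure, 1, SEAL_RATING_PSI))   # polled every minute: [17]
```

Catching the spike is what lets maintenance replace the seal before it fails rather than after.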

As Mike summed it up, “It’s finally simple to deploy a secure industrial network that makes all field data available to enterprise applications—in less time and using less bandwidth.”

Originally posted here.

Read more…

By GE Digital

“The End of Cloud Computing.” “The Edge Will Eat The cloud.” “Edge Computing—The End of Cloud Computing as We Know It.”  

Such headlines grab attention, but don’t necessarily reflect reality—especially in Industrial Internet of Things (IoT) deployments. To be sure, edge computing is rapidly emerging as a powerful force in turning industrial machines into intelligent machines, but to paraphrase Mark Twain: “The reports of the death of cloud are greatly exaggerated.” 

The Tipping Point: Edge Computing Hits Mainstream

We’ve all heard the stats—billions and billions of IoT devices, generating inconceivable amounts of big data volumes, with trillions and trillions of U.S. dollars to be invested in IoT over the next several years. Why? Because industrials have squeezed every ounce of productivity and efficiency out of operations over the past couple of decades, and are now looking to digital strategies to improve production, performance, and profit. 

The Industrial Internet of Things (IIoT) represents a world where human intelligence and machine intelligence—what GE Digital calls minds and machines—connect to deliver new value for industrial companies. 

In this new landscape, organizations use data, advanced analytics, and machine learning to drive digital industrial transformation. This can lead to reduced maintenance costs, improved asset utilization, and new business model innovations that further monetize industrial machines and the data they create. 

Despite the “cloud is dead” headlines, GE believes the cloud is still very important in delivering on the promise of IIoT, powering compute-intense workloads to manage massive amounts of data generated by machines. However, there’s no question that edge computing is quickly becoming a critical factor in the total IIoT equation.

What is edge computing? 

The “edge” of a network generally refers to technology located adjacent to the machine which you are analyzing or actuating, such as a gas turbine, a jet engine, or magnetic resonance (MR) scanner. 

Until recently, edge computing has been limited to collecting, aggregating, and forwarding data to the cloud. But what if instead of collecting data for transmission to the cloud, industrial companies could turn massive amounts of data into actionable intelligence, available right at the edge? Now they can. 

This is not just valuable to industrial organizations, but absolutely essential.

Edge computing vs. Cloud computing 

Cloud and edge are not at war … it’s not an either/or scenario. Think of your two hands. You go about your day using one or the other or both depending on the task. The same is true in Industrial Internet workloads. If the left hand is edge computing and the right hand is cloud computing, there will be times when the left hand is dominant for a given task, instances where the right hand is dominant, and some cases where both hands are needed together. 

Scenarios in which edge computing will take a leading position include things such as low latency, bandwidth, real-time/near real-time actuation, intermittent or no connectivity, etc. Scenarios where cloud will play a more prominent role include compute-heavy tasks, machine learning, digital twins, cross-plant control, etc. 

The point is you need both options working in tandem to provide design choices across edge to cloud that best meet business and operational goals.
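
One way to make the "two hands" split concrete is a simple placement rule over workload attributes. The attribute names and thresholds below are illustrative, not GE's; a real platform would weigh many more factors:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float     # deadline for acting on the data
    needs_connectivity: bool  # can it tolerate a WAN outage?
    compute_heavy: bool       # e.g. model training, digital twins

def place(w):
    """Toy placement rule: latency-critical or connectivity-sensitive
    work runs at the edge; compute-heavy work goes to the cloud."""
    if w.max_latency_ms < 100 or not w.needs_connectivity:
        return "edge"
    if w.compute_heavy:
        return "cloud"
    return "either"

print(place(Workload("turbine-trip-protection", 10, False, False)))  # edge
print(place(Workload("fleet-model-training", 60000, True, True)))    # cloud
```

The point of the sketch is only that the decision is per-workload, not per-company: the same deployment will route some tasks left-handed and some right-handed.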

Edge Computing and Cloud Computing: Balance in Action 

Let’s look at a couple of illustrations. In an industrial context, examples of intelligent edge machines abound—pumps, motors, sensors, blowout preventers and more benefit from the growing capabilities of edge computing for real-time analytics and actuation. 

Take locomotives. These modern 200 ton digital machines carry more than 200 sensors that can pump one billion instructions per second. Today, applications can not only collect data locally and respond to changes on that data, but they can also perform meaningful localized analytics. GE Transportation’s Evolution Series Tier 4 Locomotive uses on-board edge computing to analyze data and apply algorithms for running smarter and more efficiently. This improves operational costs, safety, and uptime. 

Sending all that data created by the locomotive to the cloud for processing, analyzing, and actuation isn’t useful, practical, or cost-effective. 

Now let’s switch gears (pun intended) and talk about another mode of transportation—trucking. Here’s an example where edge plays an important yet minor role, while cloud assumes a more dominant position. In this example, the company has 1,000 trucks under management. There are sensors on each truck tracking performance of the vehicle such as engine, transmission, electrical, battery, and more. 

But in this case, instead of real-time analytics and actuation on the machine (like our locomotive example), the data is being ingested, then stored and forwarded to the cloud where time series data and analytics are used to track performance of vehicle components. The fleet operator then leverages a fleet management solution for scheduled maintenance and cost analysis. This gives him or her insights such as the cost over time per part type, or the median costs over time, etc. The company can use this data to improve uptime of its vehicles, lower repair costs, and improve the safe operation of the vehicle.
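
The cloud-side analysis described above, such as median cost over time per part type, reduces to straightforward aggregation over the forwarded maintenance records. The record fields and figures here are invented for illustration:

```python
from collections import defaultdict
from statistics import median

# Hypothetical maintenance records forwarded from the truck fleet.
records = [
    {"truck": "T-001", "part": "battery",      "cost": 210.0},
    {"truck": "T-002", "part": "battery",      "cost": 190.0},
    {"truck": "T-003", "part": "transmission", "cost": 2400.0},
    {"truck": "T-001", "part": "transmission", "cost": 2800.0},
    {"truck": "T-004", "part": "battery",      "cost": 205.0},
]

def median_cost_by_part(rows):
    """Group repair costs by part type and take the median of each group."""
    by_part = defaultdict(list)
    for r in rows:
        by_part[r["part"]].append(r["cost"])
    return {part: median(costs) for part, costs in by_part.items()}

print(median_cost_by_part(records))
# {'battery': 205.0, 'transmission': 2600.0}
```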

What’s next in edge computing 

While edge computing isn’t a new concept, innovation is now beginning to deliver on the promise—unlocking untapped value from the data being created by machines. 

GE has been at the forefront of bridging minds and machines. Predix Platform supports a consistent execution environment across cloud and edge devices, helping industrials achieve new levels of performance, production, and profit.

Originally posted here.

Read more…

By Sachin Kotasthane

In his book, 21 Lessons for the 21st Century, the historian Yuval Noah Harari highlights the complex challenges mankind will face on account of technological challenges intertwined with issues such as nationalism, religion, culture, and calamities. In the current industrial world hit by a worldwide pandemic, we see this complexity translate in technology, systems, organizations, and at the workplace.

While in my previous article, Humane IIoT, I discussed the people-centric strategies that enterprises need to adopt while onboarding industrial IoT initiatives in the workforce, in this article I will share thoughts on how new-age technologies such as AI, ML, big data, and of course industrial IoT can be used for effective management of complex workforce problems in a factory, thereby changing the way people work and interact, especially in this COVID-stricken world.

Workforce related problems in production can be categorized into:

  1. Time complexity
  2. Effort complexity
  3. Behavioral complexity

Problems categorized in either of the above have a significant impact on the workforce, resulting in a detrimental effect on the outcome—of the product or the organization. The complexity of these problems can be attributed to the fact that workforce solutions to such issues cannot be found using just engineering or technology fixes: there is no single root cause, but rather a combination of factors and scenarios. Let us, therefore, explore a few and seek probable workforce solutions.

Figure 1: Workforce Challenges and Proposed Strategies in Production

  1. Addressing Time Complexity

    Any workforce-related issue that has a detrimental effect on the operational time, due to contributing factors from different factory systems and processes, can be classified as a time complex problem.

    Though classical paper-based schedules, lists, and punch sheets have largely been replaced with IT-systems such as MES, APS, and SRM, the increasing demands for flexibility in manufacturing operations and trends such as batch-size-one, warrant the need for new methodologies to solve these complex problems.

    • Worker attendance

      Anyone who has experienced, at close quarters, a typical day in the life of a factory supervisor, will be conversant with the anxiety that comes just before the start of a production shift. Not knowing who will report absent, until just before the shift starts, is one complex issue every line manager would want to get addressed. While planned absenteeism can be handled to some degree, it is the last-minute sick or emergency-pager text messages, or the transport delays, that make the planning of daily production complex.

      What if there were a solution that could provide a near-accurate count of the confirmed hands for the shift at least half an hour or an hour in advance? It turns out that organizations are experimenting with a combination of GPS, RFID, and employee tracking that interacts with resource planning systems to automate the shift planning activity.

      While some legal and privacy issues still need to be addressed, it would not be long before we see people being assigned to workplaces, even before they enter the factory floor.

      While making sure every line manager has accurate information about the confirmed hands for the shift, it is equally important that the health and well-being of employees is monitored during this pandemic. The use of technologies such as radar and millimeter-wave sensors enables live tracking of workers around the shop-floor and helps ensure that social distancing norms are well-observed.
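
As a toy illustration of how such a pre-shift headcount might be derived (the employee IDs, distances, and assumed commute speed below are invented, and a real deployment would first have to resolve the legal and privacy issues mentioned above):

```python
from datetime import datetime, timedelta

SHIFT_START = datetime(2021, 5, 3, 6, 0)
AVG_SPEED_KMH = 40.0  # assumed average commute speed

# Hypothetical last-known pings: (employee, distance to plant in km, ping time)
pings = [
    ("W-101", 2.0,  datetime(2021, 5, 3, 5, 30)),
    ("W-102", 35.0, datetime(2021, 5, 3, 5, 45)),  # too far to make it
    ("W-103", 0.0,  datetime(2021, 5, 3, 5, 50)),  # already on site
]

def likely_present(pings, shift_start, speed_kmh):
    """Employees whose estimated arrival time falls before shift start."""
    present = []
    for emp, dist_km, ts in pings:
        eta = ts + timedelta(hours=dist_km / speed_kmh)
        if eta <= shift_start:
            present.append(emp)
    return present

print(likely_present(pings, SHIFT_START, AVG_SPEED_KMH))  # ['W-101', 'W-103']
```

A count like this, fed into the resource planning system, is what would let a line manager re-plan the shift before the gate opens rather than after.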

    • Resource mapping

      While resource skill-mapping and certification are mostly HR function prerogatives, not having the right resource at the workstation during exigencies such as absenteeism or extra workload is a complex problem. Precious time is lost in locating such resources, or worst still, millions spent in overtime.

      What if there were a tool that analyzed the current workload for a resource with the identified skillset code(s) and gave an accurate estimate of the resource’s availability? This could further be used by shop managers to plan manpower for a shift, keeping them as lean as possible.

      Today, IT teams of OEMs are seen working with software vendors to build such analytical tools that consume data from disparate systems—such as production work orders from MES and swiping details from time systems—to create real-time job profiles. These results are fed to the HR systems to give managers the insights needed to make resource decisions within minutes.
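
A minimal sketch of the kind of cross-system join such a tool performs (resource IDs, skill codes, and the 15-minute availability threshold are all invented for illustration): work-order load from the MES, on-site status from the time system, and skill codes from HR are combined into a single availability query.

```python
# Hypothetical snapshots from three systems, keyed by resource ID.
work_orders = {"R-11": 90, "R-12": 0, "R-14": 30}   # MES: remaining minutes on job
swiped_in = {"R-11", "R-12", "R-13"}                 # time system: on site now
skills = {                                           # HR: skillset codes held
    "R-11": {"WELD-2"},
    "R-12": {"WELD-2", "PAINT-1"},
    "R-13": {"PAINT-1"},
}

def available(skill_code, max_busy_min=15):
    """Resources on site, holding the skill, and nearly free on their job."""
    return sorted(
        r for r in swiped_in
        if skill_code in skills.get(r, set())
        and work_orders.get(r, 0) <= max_busy_min
    )

print(available("WELD-2"))   # ['R-12']
print(available("PAINT-1"))  # ['R-12', 'R-13']
```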

  2. Addressing Effort Complexity

    Just as time complexities result in increased production time, problems in this category increase the effort the workforce must expend to complete the same quantity of work. As the effort required is proportional to fatigue and the long-term well-being of the workforce, solutions that reduce effort are especially valuable. Complexity arises when organizations try to create method out of madness from a variety of factors such as changing workforce profiles, production sequences, logistical and process constraints, and demand fluctuations.

    Thankfully, solutions for this category of problems can be found in new technologies that augment existing systems with insights and predictions, the results of which can reduce effort and channel it more productively. Add to this the demand fluctuations of the current pandemic: real-time operational visibility, coupled with advanced analytics, will help ensure that shift production targets are met.

    • Intelligent exoskeletons

      Exoskeletons, as we know, are powered bodysuits designed to safeguard and support the user in performing tasks, while increasing overall human efficiency to do the respective tasks. These are deployed in strain-inducing postures or to lift objects that would otherwise be tiring after a few repetitions. Exoskeletons are the new-age answer to reducing user fatigue in areas requiring human skill and dexterity, which otherwise would require a complex robot and cost a bomb.

      However, the complexity that mars exoskeleton users is making the same suit adaptable for a variety of postures, user body types, and jobs at the same workstation. It would help if the exoskeleton could sense the user, set the posture, and adapt itself to the next operation automatically.

      Taking a leaf out of Marvel’s Iron Man, who uses a suit that complements his posture that is controlled by JARVIS, manufacturers can now hope to create intelligent exoskeletons that are always connected to factory systems and user profiles. These suits will adapt and respond to assistive needs, without the need for any intervention, thereby freeing its user to work and focus completely on the main job at hand.

      Given the ongoing COVID situation, it would make the life of workers and the management safe if these suits are equipped with sensors and technologies such as radar/millimeter wave to help observe social distancing, body-temperature measuring, etc.

    • Highlighting likely deviations

      The world over, quality teams on factory floors work with checklists that the quality inspector verifies for every product that comes at the inspection station. While this repetitive task is best suited for robots, when humans execute such repetitive tasks, especially those that involve using visual, audio, touch, and olfactory senses, mistakes and misses are bound to occur. This results in costly reworks and recalls.

      Manufacturers have tried to address this complexity by carrying out rotation of manpower. But this, too, has met with limited success, given the available manpower and ever-increasing workloads.

      Fortunately, predictive quality integrated with feed-forwards techniques and some smart tracking with visuals can be used to highlight the area or zone on the product that is prone to quality slips based on data captured from previous operations. The inspector can then be guided to pay more attention to these areas in the checklist.
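
One very simple form of the guidance described above is ranking inspection zones by their historical slip frequency and flagging the top offenders on the inspector's checklist. The zone IDs and history below are invented; a real predictive-quality system would also weigh data fed forward from upstream operations.

```python
from collections import Counter

# Hypothetical quality history: zone ID of each recorded slip on this line.
slip_history = ["Z3", "Z1", "Z3", "Z7", "Z3", "Z1", "Z3"]

def priority_zones(history, top_n=2):
    """Zones with the most recorded slips, to be flagged on the checklist."""
    return [zone for zone, _ in Counter(history).most_common(top_n)]

print(priority_zones(slip_history))  # ['Z3', 'Z1']
```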

  3. Addressing Behavioral Complexity

    Problems of this category usually manifest as a quality issue, but the root cause can often be traced to the workforce behavior or profile. Traditionally, organizations have addressed such problems through experienced supervisors, who as people managers were expected to read these signs, anticipate and align the manpower.

    However, with constantly changing manpower and product variants, these are now complex new-age problems requiring new-age solutions.

    • Heat-mapping workload

      Time and motion studies at the workplace map the user movements around the machine with the time each activity takes for completion, matching the available cycle-time, either by work distribution or by increasing the manpower at that station. Time-consuming and cumbersome as it is, the complexity increases when workload balancing is to be done for teams working on a single product at the workstation. Movements of multiple resources during different sequences are difficult to track, and the different users cannot be expected to follow the same footsteps every time.

      Solving this issue needs a solution that monitors human motion unobtrusively, links it to the product work content at the workstation, and generates recommendations to balance the workload and even out the ‘congestion.’ New industrial applications such as short-range radar and visual feeds can be used to create heat maps of the workforce as they work on the product. These can be superimposed on the digital twin of the process to identify the zones where there is ‘congestion,’ and fed to the line-planning function to implement corrective measures such as work distribution or partial outsourcing of the operation.
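
      As a rough illustration of the heat-mapping idea (a minimal sketch under assumed inputs, not an actual product implementation), position samples from a radar or visual feed can be binned into grid cells, and over-occupied cells flagged as ‘congestion’:

```python
from collections import Counter

def congestion_zones(samples, cell_size=1.0, threshold=10):
    """Bin (x, y) position samples into grid cells; flag over-occupied cells.

    samples: iterable of (x, y) worker positions captured over a work cycle,
    e.g. from a short-range radar or a visual feed.
    """
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in samples
    )
    return {cell for cell, n in counts.items() if n > threshold}

# Toy data: most activity clustered near one corner of the workstation
samples = [(0.2, 0.3)] * 15 + [(3.4, 2.1)] * 4
print(congestion_zones(samples))  # {(0, 0)}
```

      The flagged cells, mapped back onto the digital twin, indicate where work redistribution is worth considering.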

    • Aging workforce (loss of tribal knowledge)

      With new technology coming to the shop-floor, the skills of the current workforce quickly become outdated. Also, with any new hire comes the critical task of training and knowledge sharing from experienced hands. As organizations already face a shortage of manpower, releasing more hands to impart training to a larger workforce audience, possibly at different locations, becomes an even more daunting task.

      Fully aware of the difficulties of, and reluctance toward, documentation, organizations are increasingly adopting AR-based workforce training that maps to relevant learning and memory needs. These AR solutions capture the minutest of actions executed by the expert on the shop-floor and can be played back by the novice in-situ as a step-by-step guide. Such tools simplify the knowledge transfer process and also increase worker productivity while reducing costs.

      Further, in extraordinary situations such as the one we face at present, technologies such as AR offer effective and personalized support to field personnel, without the need to fly specialists in to multiple sites. This keeps them safe while remaining accessible.

Key Takeaways and Actionable Insights

The shape of the future workforce will be the result of complex, changing, and competing forces. Technology, globalization, demographics, social values, and the changing personal expectations of the workforce will continue to transform and disrupt the way businesses operate, increasing complexity and radically changing where and when the future workforce operates, and how work is done. While the need to constantly reskill and upskill the workforce will be humongous, using new-age techniques and technologies to enhance the effectiveness and efficiency of the existing workforce will come into the spotlight.


Figure 2: The Future IIoT Workforce

Organizations will increasingly be required to:

  1. Deploy data farming to dive deep and extract vast amounts of information and process insights embedded in production systems. Tapping into large reservoirs of ‘tribal knowledge’ and digitizing it for ingestion to data lakes is another task that organizations will have to consider.
  2. Augment existing operations systems such as SCADA, DCS, MES, and CMMS with new digital platforms, AI, AR/VR, big data, and machine learning to underpin and grow the world of work. While there will be no dearth of resources in one or more of the new technologies, organizations will need to ‘acqui-hire’ talent and intellectual property, using specialists to integrate with existing systems and gain meaningful, actionable insights.
  3. Address privacy and data security concerns of the workforce, through the smart use of technologies such as radar and video feeds.

Nonetheless, digital enablement will need to be optimally used to tackle the new normal that the COVID pandemic has set forth in manufacturing—fluctuating demands, modular and flexible assembly lines, reduced workforce, etc.

Originally posted here.


By Adam Dunkels

When you have to install thousands of IoT devices, you need to make device installation impressively fast. Here is how to do it.

Every single IoT device out there has been installed by someone.

Installation is the activity that requires the most attention during that device’s lifetime.

This is particularly true for large scale IoT deployments.

We at Thingsquare have been involved in many IoT products and projects. Many of these have involved large scale IoT deployments with hundreds or thousands of devices per deployment site.

In this article, we look at why installation is so important for large IoT deployments – and list six installation tactics to make installation impressively fast while being highly useful:

  1. Take photos
  2. Make it easy to identify devices
  3. Record the location of every device
  4. Keep a log of who did what
  5. Develop an installation checklist, and turn it into an app
  6. Measure everything

And these tactics are useful even if you only have a handful of devices per site, but thousands or tens of thousands of devices in total.

Why Installation Tactics are Important in Large IoT Deployments

Installation is a necessary step of an IoT device’s life.

Someone – maybe your customers, your users, or a team of technicians working for you – will be responsible for the installation. The installer turns your device from a piece of hardware into a living thing: a valuable producer of information for your business.

But most of all, installation is an inevitable part of the IoT device life cycle.

The life cycle of an IoT device can be divided into four stages:

  1. Produce the device, at the factory (usually with a device programming tool).
  2. Install the device.
  3. Use the device. This is where the device generates the value that we created it for. The device may then be either re-installed at a new location, or we:
  4. Retire the device.

Two stages in the list involve installation: Install itself, and Use, where a device may be re-installed at a new location.

So installation is inevitable – and important. We need to plan to deal with it.

Installation is the Most Time-Consuming Activity

Most devices should spend most of their lifetime in the Use stage of their life cycle.

But a device’s lifetime is different from the attention time that we need to spend on them.

Devices usually don’t need much attention in their Use stage. At this stage, they should mostly sit there and generate valuable information.

By contrast, for the people who work with the devices, most of their attention and time will be spent in the Install stage. Since those are people whose salaries you are paying, you want them to be as efficient as possible.

How To Make Installation Impressively Fast - and Useful

At Thingsquare, we have deployed thousands of devices together with our customers, and our customers have deployed many hundreds of thousands of devices with their customers.

These are our top six tactics to make installation fast – and useful:

1. Take Photos

After installation, you will need to maintain and troubleshoot the system. This is a normal part of the Use stage.

Photos are a goldmine of information. Particularly if it is difficult to get to the location afterward.

Make sure you take plenty of photos of each device as it is installed. In fact, you should include taking multiple photos as steps in your installation checklist – more about this below.

We have been involved in several deployments where we have needed to remotely troubleshoot installations after the fact. Having a bunch of photos of how and where the devices were installed helps tremendously.

The photos don’t need to be great. Having a low-quality photo beats having no photo, every time.

 

2. Make it Easy to Identify Devices

When dealing with hundreds of devices, you need to make sure that you know exactly which devices you installed, and where.

You therefore need to make it easy to identify each device. Device identification can be done in several ways, and we recommend using more than one method to identify the devices. This will reduce the risk of manual errors.

The two ways we typically use are:

  • A printed unique ID number on the device, which you can take a photo of
  • Automatic secure device identification via Bluetooth – this is something the Thingsquare IoT platform supports out of the box

Being certain about where devices were installed will make maintenance and troubleshooting much easier – particularly if it is difficult to visit the installation site.

3. Record the Location of Every Device

When devices are installed, make sure to record their location.

The easiest way to do this is to take the GPS coordinates of each device as it is being deployed. Preferably with the installation app, which can do this automatically – see below.

For indoor installations, exact GPS locations may be unreliable. But even for those devices, having a coarse-grained GPS location is useful.

The location is useful both when analyzing the data that the devices produce, and when troubleshooting problems in the network.

 

4. Keep a Log of Who Did What

In large deployments, there will be many people involved.

Being able to trace the installation actions, as well as who took what action, is enormously useful. Sometimes just knowing the steps that were taken when installing each device is important. And sometimes you need to talk to the person who did the installation.

5. Develop an Installation Checklist - and Turn it into an App

Determine what steps are needed to install each device, and develop a step-by-step checklist covering them.

Then turn this checklist into an app that installation personnel can run on their own phones.

Each step of each checklist should be really easy to understand, to avoid mistakes along the way. And it should be easy to go back and forth between the steps, if needed.

Ideally, the app should run on both Android and iOS, because you would like everyone to be able to use it on their own phones.

Here is an example checklist that we developed for a sensor device in a retail IoT deployment:

  • Check that sensor has battery installed
  • Attach sensor to appliance
  • Make sure that the sensor is online
  • Check that the sensor has a strong signal
  • Check that the GPS location is correct
  • Move hand in front of sensor, to make sure sensor correctly detects movement
  • Be still, to make sure sensor correctly detects no movement
  • Enter description of sensor placement (e.g. “on top of the appliance”)
  • Enter description of appliance
  • Take a photo of the sensor
  • Take a photo of the appliance
  • Take a photo of the appliance and the two beside it
  • Take a photo of the appliance and the four beside it
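
A checklist like this maps naturally onto a simple data model inside the app. The sketch below (hypothetical names, Python for brevity) also records who completed each step and when, which covers tactic 4 (keep a log) and tactic 6 (measure everything) almost for free:

```python
import time

class InstallChecklist:
    """Minimal sketch of an installation checklist with built-in logging."""

    def __init__(self, device_id, installer, steps):
        self.device_id = device_id
        self.installer = installer
        self.steps = steps
        self.log = []  # (step, installer, started_at, finished_at)

    def complete_step(self, step, started_at):
        # Tactic 4: record who did what; tactic 6: record when, for timing.
        self.log.append((step, self.installer, started_at, time.time()))

    def step_durations(self):
        return {step: done - start for step, _, start, done in self.log}

checklist = InstallChecklist(
    device_id="A1B2C3",             # hypothetical printed device ID
    installer="alice@example.com",  # hypothetical installer account
    steps=["Check battery", "Attach sensor", "Verify online"],
)
for step in checklist.steps:
    started = time.time()
    # ... the installer performs the step and taps "done" in the app ...
    checklist.complete_step(step, started)

print(len(checklist.log))  # 3
```

The per-step durations collected this way feed directly into the process improvements described under tactic 6 below.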
 

6. Measure Everything

Since installation costs money, we want it to be efficient.

And the best way to make a process more efficient is to measure it, and then improve it.

Since we have an installation checklist app, measuring installation time is easy – just build it into the app.

Once we know how much time each step in the installation process needs, we are ready to revise the process and improve it. We should focus on the most time-consuming step first and measure the successive improvements to make sure we get the most bang for the buck.

Conclusions

Every IoT device needs to be installed, and making the installation process efficient saves attention time for everyone involved – and ultimately money.

At Thingsquare, we have deployed thousands of devices together with our customers, and our customers have deployed many hundreds of thousands of devices with their customers.

We use our experience to solve hard problems in the IoT space, such as how to best install large IoT systems – get in touch with us to learn more!

Originally posted here.


by Stephanie Overby

What's next for edge computing, and how should it shape your strategy? Experts weigh in on edge trends and talk workloads, cloud partnerships, security, and related issues


All year, industry analysts have been predicting that edge computing – and complementary 5G network offerings – will see significant growth, as major cloud vendors deploy more edge servers in local markets and telecom providers push ahead with 5G deployments.

The global pandemic has not significantly altered these predictions. In fact, according to IDC’s worldwide IT predictions for 2021, COVID-19’s impact on workforce and operational practices will be the dominant accelerator for 80 percent of edge-driven investments and business model change across most industries over the next few years.

First, what exactly do we mean by edge? Here’s how Rosa Guntrip, senior principal marketing manager, cloud platforms at Red Hat, defines it: “Edge computing refers to the concept of bringing computing services closer to service consumers or data sources. Fueled by emerging use cases like IoT, AR/VR, robotics, machine learning, and telco network functions that require service provisioning closer to users, edge computing helps solve the key challenges of bandwidth, latency, resiliency, and data sovereignty. It complements the hybrid computing model where centralized computing can be used for compute-intensive workloads while edge computing helps address the requirements of workloads that require processing in near real time.”

Moving data infrastructure, applications, and data resources to the edge can enable faster response to business needs, increased flexibility, greater business scaling, and more effective long-term resilience.

“Edge computing is more important than ever and is becoming a primary consideration for organizations defining new cloud-based products or services that exploit local processing, storage, and security capabilities at the edge of the network through the billions of smart objects known as edge devices,” says Craig Wright, managing director with business transformation and outsourcing advisory firm Pace Harmon.

“In 2021 this will be an increasing consideration as autonomous vehicles become more common, as new post-COVID-19 ways of working require more distributed compute and data processing power without incurring debilitating latency, and as 5G adoption stimulates a whole new generation of augmented reality, real-time application solutions, and gaming experiences on mobile devices,” Wright adds.

8 key edge computing trends in 2021


Noting the steady maturation of edge computing capabilities, Forrester analysts said, “It’s time to step up investment in edge computing,” in their recent Predictions 2020: Edge Computing report. As edge computing emerges as ever more important to business strategy and operations, here are eight trends IT leaders will want to keep an eye on in the year ahead.

1. Edge meets more AI/ML


Until recently, pre-processing of data via near-edge technologies or gateways had its share of challenges due to the increased complexity of data solutions, especially in use cases with a high volume of events or limited connectivity, explains David Williams, managing principal of advisory at digital business consultancy AHEAD. “Now, AI/ML-optimized hardware, container-packaged analytics applications, frameworks such as TensorFlow Lite and tinyML, and open standards such as the Open Neural Network Exchange (ONNX) are encouraging machine learning interoperability and making on-device machine learning and data analytics at the edge a reality.” 

Machine learning at the edge will enable faster decision-making. “Moreover, the amalgamation of edge and AI will further drive real-time personalization,” predicts Mukesh Ranjan, practice director with management consultancy and research firm Everest Group.

“But without proper thresholds in place, anomalies can slowly become standards,” notes Greg Jones, CTO of IoT solutions provider Kajeet. “Advanced policy controls will enable greater confidence in the actions made as a result of the data collected and interpreted from the edge.” 
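
Jones's point can be made concrete with a small sketch (purely illustrative, not from any particular product): an edge model that adapts its "normal" baseline from recent readings will slowly absorb a drifting sensor, while a fixed policy limit still catches the anomaly.

```python
def drifting_baseline(readings, alpha=0.1):
    """Naive adaptive baseline: each new reading nudges the 'normal' level."""
    baseline = readings[0]
    for r in readings[1:]:
        baseline += alpha * (r - baseline)
    return baseline

def is_anomalous(reading, baseline, policy_limit):
    # Policy control: a fixed limit catches the reading even when the
    # adaptive baseline has silently drifted up toward it.
    return reading > policy_limit or reading > baseline * 1.5

# A sensor that creeps from 10.0 to 20.0; the baseline follows the drift
readings = [10.0 + i * 0.5 for i in range(21)]
baseline = drifting_baseline(readings)

# Relative to the drifted baseline alone, 20.0 looks "normal" -
# but the fixed policy limit of 15.0 still flags it.
print(is_anomalous(20.0, baseline, policy_limit=15.0))  # True
```

Without the policy limit, the same reading would pass unflagged – the "anomaly becomes the standard" failure mode Jones describes.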

 

2. Cloud and edge providers explore partnerships


IDC predicts a quarter of organizations will improve business agility by integrating edge data with applications built on cloud platforms by 2024. That will require partnerships across cloud and communications service providers, with some pairings already forming between wireless carriers and the major public cloud providers.

According to IDC research, the systems that organizations can leverage to enable real-time analytics are already starting to expand beyond traditional data centers and deployment locations. Devices and computing platforms closer to end customers and/or co-located with real-world assets will become an increasingly critical component of this IT portfolio. This edge computing strategy will be part of a larger computing fabric that also includes public cloud services and on-premises locations.

In this scenario, edge provides immediacy and cloud supports big data computing.

 

3. Edge management takes center stage


“As edge computing becomes as ubiquitous as cloud computing, there will be increased demand for scalability and centralized management,” says Wright of Pace Harmon. IT leaders deploying applications at scale will need to invest in tools to “harness step change in their capabilities so that edge computing solutions and data can be custom-developed right from the processor level and deployed consistently and easily just like any other mainstream compute or storage platform,” Wright says.

The traditional approach to data center or cloud monitoring won’t work at the edge, notes Williams of AHEAD. “Because of the rather volatile nature of edge technologies, organizations should shift from monitoring the health of devices or the applications they run to instead monitor the digital experience of their users,” Williams says. “This user-centric approach to monitoring takes into consideration all of the components that can impact user or customer experience while avoiding the blind spots that often lie between infrastructure and the user.”

As Stu Miniman, director of market insights on the Red Hat cloud platforms team, recently noted, “If there is any remaining argument that hybrid or multi-cloud is a reality, the growth of edge solidifies this truth: When we think about where data and applications live, they will be in many places.”

“The discussion of edge is very different if you are talking to a telco company, one of the public cloud providers, or a typical enterprise,” Miniman adds. “When it comes to Kubernetes and the cloud-native ecosystem, there are many technology-driven solutions competing for mindshare and customer interest. While telecom giants are already extending their NFV solutions into the edge discussion, there are many options for enterprises. Edge becomes part of the overall distributed nature of hybrid environments, so users should work closely with their vendors to make sure the edge does not become an island of technology with a specialized skill set.”

 

4. IT and operational technology begin to converge


Resiliency is perhaps the business term of the year, thanks to a pandemic that revealed most organizations’ weaknesses in this area. IoT-enabled devices (and other connected equipment) drive the adoption of edge solutions where infrastructure and applications are being placed within operations facilities. This approach will be “critical for real-time inference using AI models and digital twins, which can detect changes in operating conditions and automate remediation,” IDC’s research says.

IDC predicts that the number of new operational processes deployed on edge infrastructure will grow from less than 20 percent today to more than 90 percent in 2024 as IT and operational technology converge. Organizations will begin to prioritize not just extracting insight from their new sources of data, but integrating that intelligence into processes and workflows using edge capabilities.

Mobile edge computing (MEC) will be a key enabler of supply chain resilience in 2021, according to Pace Harmon’s Wright. “Through MEC, the ecosystem of supply chain enablers has the ability to deploy artificial intelligence and machine learning to access near real-time insights into consumption data and predictive analytics as well as visibility into the most granular elements of highly complex demand and supply chains,” Wright says. “For organizations to compete and prosper, IT leaders will need to deliver MEC-based solutions that enable an end-to-end view across the supply chain available 24/7 – from the point of manufacture or service throughout its distribution.”

 

5. Edge eases connected ecosystem adoption


Edge not only enables and enhances the use of IoT, but it also makes it easier for organizations to participate in the connected ecosystem with minimized network latency and bandwidth issues, says Manali Bhaumik, lead analyst at technology research and advisory firm ISG. “Enterprises can leverage edge computing’s scalability to quickly expand to other profitable businesses without incurring huge infrastructure costs,” Bhaumik says. “Enterprises can now move into profitable and fast-streaming markets with the power of edge and easy data processing.”

 

6. COVID-19 drives innovation at the edge


“There’s nothing like a pandemic to take the hype out of technology effectiveness,” says Jason Mann, vice president of IoT at SAS. Take IoT technologies such as computer vision enabled by edge computing: “From social distancing to thermal imaging, safety device assurance and operational changes such as daily cleaning and sanitation activities, computer vision is an essential technology to accelerate solutions that turn raw IoT data (from video/cameras) into actionable insights,” Mann says. Retailers, for example, can use computer vision solutions to identify when people are violating the store’s social distance policy.

 

7. Private 5G adoption increases


“Use cases such as factory floor automation, augmented and virtual reality within field service management, and autonomous vehicles will drive the adoption of private 5G networks,” says Ranjan of Everest Group. Expect more maturity in this area in the year ahead, Ranjan says.

 

8. Edge improves data security


“Data efficiency is improved at the edge compared with the cloud, reducing internet and data costs,” says ISG’s Bhaumik. “The additional layer of security at the edge enhances the user experience.” Edge computing is also not dependent on a single point of application or storage, Bhaumik says. “Rather, it distributes processes across a vast range of devices.”

As organizations adopt DevSecOps and take a “design for security” approach, edge is becoming a major consideration for the CSO to enable secure cloud-based solutions, says Pace Harmon’s Wright. “This is particularly important where cloud architectures alone may not deliver enough resiliency or inherent security to assure the continuity of services required by autonomous solutions, by virtual or augmented reality experiences, or big data transaction processing,” Wright says. “However, IT leaders should be aware of the rate of change and relative lack of maturity of edge management and monitoring systems; consequently, an edge-based security component or solution for today will likely need to be revisited in 18 to 24 months’ time.”

Originally posted here.


IoT Sustainability, Data At The Edge.

Recently I've written quite a bit about IoT, and one thing you may have picked up on is that the Internet of Things is made up of a lot of very large numbers.

For starters, the number of connected things is measured in the tens of billions, approaching hundreds of billions. Behind that very large number is an even bigger one: the amount of data these billions of devices are predicted to generate.

As FutureIoT pointed out, IDC forecasts that the amount of data generated by IoT devices will exceed 79.4 zettabytes (ZB) by 2025.

How Much Is a Zettabyte?

A zettabyte is a very large number indeed, but how big? How can you get your head around it? Does this help...?

A zettabyte is 1,000,000,000,000,000,000,000 bytes. Hmm, that's still not very easy to visualise.

So let's think of it in terms of London buses. Let's imagine each byte is a person on a bus; a London bus can take 80 people, so you'd need 993 quintillion buses to accommodate 79.4 zettahumans.

I tried to calculate how long a line of 993 quintillion buses would be. Relating it to the distance to the Moon, Mars, or the Sun wasn't doing it justice; the only comparable scale is the size of the Milky Way. Even then, our 79.4 zettahumans lined up in London buses would stretch across the entire Milky Way ... and a fair bit further!

Sustainability Of Cloud Storage For 993 Quintillion Buses Of Data

Everything we do has an impact on the planet. Just by reading this article, you're generating 0.2 grams of Carbon Dioxide (CO2) emissions per second ... so I'll try to keep this short.

Using data from Stanford Magazine suggesting that every 100 gigabytes of data stored in the Cloud could generate 0.2 tons of CO2 per year, storing 79.4 zettabytes of data in the Cloud could be responsible for producing 158.8 billion tons of greenhouse gases.
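
These back-of-envelope figures are easy to verify; here is a quick Python sketch (the bus capacity of 80 and the 0.2-ton-per-100-GB estimate come from the text above):

```python
ZB = 10**21            # bytes in a zettabyte (decimal definition)
data_bytes = 79.4 * ZB

# One London bus carries 80 "byte-passengers"
buses = data_bytes / 80
print(f"{buses / 10**18:.1f} quintillion buses")   # ~993 quintillion

# Stanford Magazine estimate: 0.2 tons of CO2 per 100 GB stored per year
GB = 10**9
co2_tons = data_bytes / (100 * GB) * 0.2
print(f"{co2_tons / 10**9:.1f} billion tons of CO2")  # ~158.8 billion
```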


 

Putting that number into context: using USA Today figures, the combined emissions of China, the USA, India, Russia, Japan, and Germany came to a little over 21 billion tons in 2019.

So if we simply let all the IoT devices stream data to the Cloud, those billions of little gadgets would indirectly generate more than seven times the emissions of the six most industrialized countries combined.

Save The Planet, Store Data At The Edge

As mentioned in a previous article, not all data generated by IoT devices needs to be stored in the Cloud.

Speaking with an expert in data storage, ObjectBox, we learned that their users cut their Cloud data storage by 60% on average. So how does that work?

First, what does The Edge mean?

The term "Edge" refers to the edge of the network, in other words the last piece of equipment or thing connected to the network closest to the point of usage.

Let me illustrate with a rather over-simplified diagram.


 

How Can Edge Data Storage Improve Sustainability?

In an article about computer vision and AI on the edge, I talked about how vast amounts of network data could be saved if the cameras themselves could detect important events and send only those events over the network, not the entire video stream.

In that example, only the key events and metadata, like the identification marks of a vehicle crossing a stop light, needed to be transmitted across the network. However, it is important to keep the raw content at the edge, so it can be used for post-processing, for further training of the AI, or even to be retrieved at a later date, e.g. by law enforcement.

Another example could be sensors used to detect gas leaks, seismic activity, fires, or broken glass. These sensors capture volumes of data each second, but they need to alert someone only when something happens: abnormal gas levels, a tremor, a fire, or a smashed window.

Those alerts are the primary purpose of those devices, but the data between those events can also hold significant value. In this instance, keeping it locally at the edge, but having it available as and when needed, is an ideal way to reduce network traffic, reduce Cloud storage, and save the planet (well, at least a little bit).
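
The pattern is simple enough to sketch in a few lines of Python (an illustrative toy, not ObjectBox's API): raw readings stay in a local store at the edge, only threshold-crossing alerts are queued for transmission, and the raw history remains retrievable on demand.

```python
class EdgeSensor:
    """Sketch of edge-side filtering: raw data stays local, only alerts go out."""

    def __init__(self, alert_threshold):
        self.alert_threshold = alert_threshold
        self.local_store = []   # raw readings kept at the edge
        self.outbox = []        # only alert events cross the network

    def ingest(self, reading):
        self.local_store.append(reading)
        if reading > self.alert_threshold:
            self.outbox.append(("ALERT", reading))

    def fetch_raw(self, start, end):
        # Raw history remains retrievable on demand, e.g. for later analysis
        return self.local_store[start:end]

sensor = EdgeSensor(alert_threshold=50.0)
for value in [12.1, 13.0, 87.5, 12.8]:  # one abnormal gas-level spike
    sensor.ingest(value)

print(len(sensor.local_store), len(sensor.outbox))  # 4 1
```

Here four readings were captured, but only the single spike would ever leave the device.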

Accessible Data At The Edge

Keeping your data at the edge is a great way to save costs and increase performance, but you still want to be able to access it when you need it.

ObjectBox have created not just one of the most efficient ways to store data at the edge, but they've also built a sophisticated and powerful method to synchronise data between edge devices, the Cloud and other edge devices.

Synchronise Data At The Edge - Fog Computing.

Fog Computing (which is computing that happens between the Cloud and the Edge) requires data to be exchanged with devices connected to the edge, but without going all the way to/from the servers in the Cloud. 

In the article on making smarter, safer cities, I talked about how by having AI-equipped cameras share data between themselves they could become smarter, more efficient. 

A solution like that could use ObjectBox's synchronisation capabilities to efficiently discover and collect relevant video footage from various cameras, to help either identify objects or train the artificial intelligence algorithms running on the AI-equipped cameras at the edge.

Storing Data At The Edge Can Save A Busload Of CO2

Edge computing has a lot of benefits to offer; in this article I've looked at one that is often overlooked - the cost of transferring data. I've also not really delved into the broader benefits of ObjectBox's technology: for example, from their open-source benchmarks, ObjectBox seems to offer a tenfold performance benefit compared to other solutions out there, and it is being used by more than 300,000 developers.

The team behind ObjectBox also built technologies currently used by internet heavyweights like Twitter, Viber, and Snapchat, so they seem to be doing something right, and if they can really cut down network traffic by 60%, they could be one of the sustainable technology companies to watch.

Originally posted here.
