
IoT forensic science applies technical methods to the investigation of incidents involving IoT devices. Some of the technical ways it solves problems include:

  1. Data Extraction and Analysis: IoT forensic science uses advanced software tools to extract data from IoT devices, such as logs, sensor readings, and network traffic. The data is then analyzed to identify relevant information, such as timestamps, geolocation, and device identifiers, which can be used to reconstruct events leading up to an incident.

  2. Reverse Engineering: IoT forensic science uses reverse engineering techniques to understand the underlying functionality of IoT devices. This involves analyzing the hardware and software components of the device to identify vulnerabilities, backdoors, and other features that may be relevant to an investigation.

  3. Forensic Imaging: IoT forensic science uses forensic imaging techniques to preserve the state of IoT devices and ensure that the data collected is admissible in court. This involves creating a complete copy of the device's storage and memory, which can then be analyzed without altering the original data (see the sketch after this list).

  4. Cryptography and Data Security: IoT forensic science uses cryptography and data security techniques to ensure the integrity and confidentiality of data collected from IoT devices. This includes the use of encryption, digital signatures, and other security measures to protect data during storage, analysis, and transmission.

  5. Machine Learning: IoT forensic science uses machine learning algorithms to automate the analysis of large amounts of data generated by IoT devices. This can help investigators identify patterns and anomalies that may be relevant to an investigation.
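
To illustrate points 3 and 4, here is a minimal Python sketch that computes a cryptographic digest of an acquired device image so an investigator can later prove the working copy has not been altered. The file name is hypothetical, and the hashing scheme shown is one common choice rather than a prescribed standard.

```python
import hashlib

def hash_image(path: str, algo: str = "sha256", chunk_size: int = 1 << 20) -> str:
    """Digest a device image in chunks so large flash dumps need not fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Record the digest at acquisition time; re-hashing the copy later and
# comparing values demonstrates the evidence has not changed.
print(hash_image("device_flash.img"))  # hypothetical image file
```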

IoT forensic science uses many more (and more advanced) technical methods to solve problems related to the investigation of incidents involving IoT devices. By leveraging these techniques, investigators can collect, analyze, and present digital evidence from IoT devices that can be used to reconstruct events and support legal proceedings.

Read more…

By Bee Hayes-Thakore

The Android Ready SE Alliance, announced by Google on March 25th, paves the path for tamper-resistant, hardware-backed security services. Kigen is bringing the first secure iSIM OS, along with our GSMA-certified eSIM OS and personalization services, to support fast adoption of emerging security services across smartphones, tablets, WearOS, Android Auto Embedded and Android TV.

Google has been advancing their investment in how tamper-resistant secure hardware modules can protect not only Android and its functionality, but also protect third-party apps and secure sensitive transactions. The latest Android smartphone features enable tamper-resistant key storage for Android apps using StrongBox. StrongBox is an implementation of the hardware-backed Keystore that resides in a hardware security module.

To accelerate adoption of new Android use cases with stronger security, Google announced the formation of the Android Ready SE Alliance. Secure Element (SE) vendors are joining hands with Google to create a set of open-source, validated, and ready-to-use SE Applets. On March 25th, Google launched the General Availability (GA) version of StrongBox for SE.


Hardware based security modules are becoming a mainstay of the mobile world. Juniper Research’s latest eSIM research, eSIMs: Sector Analysis, Emerging Opportunities & Market Forecasts 2021-2025, independently assessed eSIM adoption and demand in the consumer sector, industrial sector, and public sector, and predicts that the consumer sector will account for 94% of global eSIM installations by 2025. It anticipates that established adoption of eSIM frameworks from consumer device vendors such as Google, will accelerate the growth of eSIMs in consumer devices ahead of the industrial and public sectors.


Consumer sector will account for 94% of global eSIM installations by 2025

Juniper Research, 2021.

Expanding the secure architecture of trust to consumer wearables, smart TVs and smart cars

What’s more, a major development is that this is now not just for smartphones and tablets, but also applicable to WearOS, Android Auto Embedded and Android TV. These less traditional form factors have huge potential beyond being purely companion devices to smartphones or tablets. With the power, size and performance benefits offered by Kigen’s iSIM OS, OEMs and chipset vendors can consider the full scope of the vast Android ecosystem to deliver new services.

This means new secure services and innovations around:

🔐 Digital keys (car, home, office)

🛂 Mobile Driver’s License (mDL), National ID, ePassports

🏧 eMoney solutions (for example, Wallet)

How is Kigen supporting Google’s Android Ready SE Alliance?

The alliance was created to make discrete tamper-resistant, hardware-backed security the lowest common denominator for the Android ecosystem. A major goal of this alliance is to enable consistent, interoperable, and demonstrably secure applets across the Android ecosystem.

Kigen believes that enabling the broadest choice and interoperability is fundamental to the architecture of digital trust. Our secure, standards-compliant eSIM and iSIM OS, and secure personalization services are available to all chipset or device partners in the Android Ready SE Alliance to leverage the benefits of iSIM for customer-centric innovations for billions of Android users quickly.

Vincent Korstanje, CEO of Kigen

Kigen’s support for the Android Ready SE Alliance will allow our industry partners to easily leapfrog to the enhanced security and power efficiency benefits of iSIM technology or choose a seamless transition from embedded SIM so they can focus on their innovation.

We are delighted to partner with Kigen to further strengthen the security of Android through StrongBox via Secure Element (SE). We look forward to widespread adoption by our OEM partners and developers and the entire Android ecosystem.

Sudhi Herle, Director of Android Platform Security 

In the near term, the Google team is prioritizing and delivering the following Applets in conjunction with corresponding Android feature releases:

  • Mobile driver’s license and Identity Credentials
  • Digital car keys

Kigen brings the ability to bridge the physical embedded security hardware to a fully integrated form factor. Our standards-compliant Kigen eSIM OS (version 2.2 eUICC OS) is available to support chipsets and device makers now. This announcement is the start of what will become a whole host of new and exciting trusted services offering a better experience for users on Android.

Kigen’s eSIM (eUICC) OS brings:


The smallest operating system, allowing OEMs to select compact, cost-effective hardware to run it on.

Kigen OS offers the highest level of logical security when employed on any SIM form factor, including a secure enclave.

On top of Kigen OS, we have a broad portfolio of Java Card™ Applets to support your needs for the Android SE Ready Alliance.

Kigen’s Integrated SIM or iSIM (iUICC) OS furthers this advantage


Integrated at the heart of the device and securely personalized, iSIM brings significant size and battery life benefits to cellular IoT devices. iSIM can act as a root of trust for payment, identity, and critical infrastructure applications.

Kigen’s iSIM is flexible enough to support dual SIM capability through a single profile or remote SIM provisioning mechanisms, with the latter enabling out-of-the-box connectivity and secure, remote profile management.

For smartphones, set-top boxes, Android Auto applications, in-car displays, Chromecast or Google Assistant-enabled devices, iSIM can offer significant benefits for incorporating artificial intelligence at the edge.

Kigen’s secure personalization services to support fast adoption

SIM vendors have in-house capabilities for data generation, but the eSIM and iSIM value chains redistribute many roles and responsibilities among new stakeholders for the personalization of operator credentials, whether at different stages of production or over the air once devices are deployed.

Kigen can offer data generation as a service to vendors new to the ecosystem.

Partner with us to provide cellular chipset and module makers with the strongest security and performance for integrated SIM, accelerating these new use cases.

Security considerations for eSIM and iSIM enabled secure connected services

Designing a secure connected product requires considerable thought and planning and there really is no ‘one-size-fits-all’ solution. How security should be implemented draws upon a multitude of factors, including:

  • What data is being stored or transmitted between the device and other connected apps?
  • Are there regulatory requirements for the device? (e.g., PCI DSS, HIPAA, FDA)
  • What are the hardware or design limitations that will affect security implementation?
  • Will the devices be manufactured in a site accredited by all of the necessary industry bodies?
  • What is the expected lifespan of the device?

End-to-end ecosystem and services thinking needs to be a design consideration from the very early stages, especially given the strain on battery consumption in devices such as wearables, smartwatches and fitness devices, as well as portable devices that are part of connected consumer vehicles.

Originally posted here.

Read more…

In my last post, I explored how OTA updates are typically performed using Amazon Web Services and FreeRTOS. OTA updates are critically important to developers with connected devices. In today’s post, we are going to explore several best practices developers should keep in mind when implementing their OTA solution. Most of these are generic, although I will point out a few AWS-specific best practices.

Best Practice #1 – Name your S3 bucket with afr-ota

There is a little trick with creating S3 buckets that I was completely oblivious to for a long time. Thankfully, when I checked in with some colleagues about it, they had not been aware of it either, so I’m not sure how long this has been supported, but it can save an embedded developer from having to wade through too many AWS policies and simplify the process a little bit.

Anyone who has attempted to create an OTA Update with AWS and FreeRTOS knows that you have to set up several permissions to allow an OTA Update Job to access the S3 bucket. Well, if you name your S3 bucket so that it begins with “afr-ota”, then the S3 bucket will automatically have the AWS managed policy AmazonFreeRTOSOTAUpdate attached to it. (See Create an OTA Update service role for more details). It’s a small help, but a good best practice worth knowing.
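
For what it’s worth, the trick is easy to try from a script. The sketch below uses boto3 to create such a bucket; the bucket name and region are hypothetical, and the versioning step reflects the general requirement that OTA update jobs pull firmware from a versioned bucket.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# A bucket whose name begins with "afr-ota" picks up the managed
# AmazonFreeRTOSOTAUpdate policy automatically, saving manual IAM work.
bucket = "afr-ota-my-device-firmware"  # hypothetical name
s3.create_bucket(Bucket=bucket)

# OTA update jobs reference a specific object version, so enable versioning.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)
```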

Best Practice #2 – Encrypt your firmware updates

Embedded software must be one of the most expensive things to develop that mankind has ever invented! It’s time-consuming to create and test and can consume a large percentage of the development budget. Software, though, also drives most features in a product and can dramatically differentiate it. That software is intellectual property that is worth protecting through encryption.

Encrypting a firmware image provides several benefits. First, it can convert your firmware binary into a form that seems random or meaningless. This is desirable because a developer shouldn’t want their binary image to be easily studied, investigated or reverse engineered. This makes it harder for someone to steal intellectual property and more difficult to understand for someone who may be interested in attacking the system. Second, encrypting the image means that the sender must have a key or credential of some sort that matches the device that will decrypt the image. This can be looked at as a simple way of helping to authenticate the source, although more than encryption should be done to fully authenticate and verify integrity, such as signing the image.
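
As a minimal sketch of the idea, the snippet below encrypts an image with AES-GCM using the cryptography package; the file name is hypothetical, key management is glossed over, and as noted above you would still sign the image separately.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_firmware(image: bytes, key: bytes) -> bytes:
    """AES-GCM provides confidentiality plus an integrity tag in one pass."""
    nonce = os.urandom(12)               # must be unique per image
    ciphertext = AESGCM(key).encrypt(nonce, image, None)
    return nonce + ciphertext            # receiver splits nonce and ciphertext

key = AESGCM.generate_key(bit_length=256)   # in practice, provisioned per device
with open("firmware.bin", "rb") as f:       # hypothetical image file
    blob = encrypt_firmware(f.read(), key)
```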

Best Practice #3 – Do not support firmware rollbacks

There is often a debate as to whether firmware rollbacks should be supported in a system or not. My recommendation for a best practice is that firmware rollbacks be disabled. The argument for rollbacks is often that if something goes wrong with a firmware update then the user can roll back to an older version that was working. This seems like a good idea at first, but it can be a vulnerability source in a system. For example, let’s say that version 1.7 had a bug in the system that allowed remote attackers to access the system. A new firmware version, 1.8, fixes this flaw. A customer updates their firmware to version 1.8, but an attacker knows that if they can force the system back to 1.7, they can own the system. Firmware rollbacks seem like a convenient and good idea, in fact I’m sure in the past I used to recommend them as a best practice. However, in today’s connected world where we perform OTA updates, firmware rollbacks are a vulnerability so disable them to protect your users.
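
The enforcement itself can be as simple as a monotonic version comparison, sketched below. In a real device the current version would come from an anti-rollback counter in tamper-resistant storage (eFuses or similar), not an ordinary variable; this is an illustration of the policy, not a hardened implementation.

```python
def accept_update(current_version: int, offered_version: int) -> bool:
    """Refuse any image that is not strictly newer than what is running."""
    return offered_version > current_version

assert accept_update(17, 18)        # upgrading 1.7 -> 1.8: accepted
assert not accept_update(18, 17)    # forced rollback to vulnerable 1.7: refused
assert not accept_update(18, 18)    # replaying the same image: refused
```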

Best Practice #4 – Secure your bootloader

Updating firmware Over-the-Air requires several components to ensure that it is done securely and successfully. Often the focus is on getting the new image to the device and getting it decrypted. However, just like in traditional firmware updates, the bootloader is still a critical piece to the update process and in OTA updates, the bootloader can’t just be your traditional flavor but must be secure.

There are quite a few methods that can be used with the onboard bootloader, but no matter the method used, the bootloader must be secure. Secure bootloaders need to be capable of verifying the authenticity and integrity of the firmware before it is ever loaded. Some systems will use the application code to verify and install the firmware into a new application slot while others fully rely on the bootloader. In either case, the secure bootloader needs to be able to verify the authenticity and integrity of the firmware prior to accepting the new firmware image.
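
The heart of that check is refusing to hand over control until a signature over the image verifies against a trusted public key. The Python sketch below illustrates the logic with an Ed25519 signature; a real bootloader would do this in C against keys burned into the device, and the hand-off function here is a stand-in.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def jump_to_application(image: bytes) -> None:
    print(f"booting verified image ({len(image)} bytes)")  # stand-in hand-off

def secure_boot(image: bytes, signature: bytes, pubkey_bytes: bytes) -> None:
    """Verify authenticity and integrity before loading the firmware."""
    pubkey = Ed25519PublicKey.from_public_bytes(pubkey_bytes)
    try:
        pubkey.verify(signature, image)  # raises if image or signature is bad
    except InvalidSignature:
        raise SystemExit("image rejected: signature verification failed")
    jump_to_application(image)
```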

It’s also a good idea to ensure that the bootloader is built into a chain of trust and cannot be easily modified or updated. The secure bootloader is a critical component in a chain-of-trust that is necessary to keep a system secure.

Best Practice #5 – Build a Chain-of-Trust

A chain-of-trust is a sequence of events that occur while booting the device that ensures each link in the chain is trusted software. For example, I’ve been working with the Cypress PSoC 64 secure MCUs recently and these parts come shipped from the factory with a hardware-based root-of-trust to authenticate that the MCU came from a secure source. That Root-of-Trust (RoT) is then transferred to a developer, who programs a secure bootloader and security policies onto the device. During the boot sequence, the RoT verifies the integrity and authenticity of the bootloader, which then verifies the integrity and authenticity of any second-stage bootloader or software, which then verifies the authenticity and integrity of the application. The application then verifies the authenticity and integrity of its data, keys, operational parameters and so on.

This sequence creates a Chain-Of-Trust which is needed and used by firmware OTA updates. When the new firmware request is made, the application must decrypt the image and verify that the authenticity and integrity of the new firmware are intact. That new firmware can then only be used if the Chain-Of-Trust can successfully make its way through each link in the chain. The bottom line: a developer and the end user know that when the system boots successfully, the new firmware is legitimate.
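
Conceptually, each stage carries the expected digest of the stage that follows it. The toy sketch below walks such a chain; the manifest format (a hex digest in the first 64 bytes of each stage) is invented purely for illustration.

```python
import hashlib

def verify_chain(stages: list[bytes], trusted_first_digest: str) -> bool:
    """Walk RoT -> bootloader -> application, checking each link in turn."""
    expected = trusted_first_digest           # anchored in hardware in real systems
    for stage in stages:
        if hashlib.sha256(stage).hexdigest() != expected:
            return False                      # chain broken: refuse to boot
        # toy manifest: first 64 bytes hold the hex digest of the next stage
        expected = stage[:64].decode("ascii", errors="ignore")
    return True
```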

Conclusions

OTA updates are a critical infrastructure component to nearly every embedded IoT device. Sure, there are systems out there that once deployed will never update; however, those are probably a small percentage of systems. OTA updates are the go-to mechanism to update firmware in the field. We’ve examined several best practices that developers and companies should consider when they start to design their connected systems. In fact, the bonus best practice for today is that if you are building a connected device, make sure you explore your OTA update solution sooner rather than later. Otherwise, you may find that building the Chain-Of-Trust necessary in today’s deployments will be far more expensive and time-consuming to implement.

Originally posted here.

Read more…

TinyML focuses on optimizing machine learning (ML) workloads so that they can be processed on microcontrollers no bigger than a grain of rice and consuming only milliwatts of power.

By Arm Blueprint staff

TinyML gives tiny devices intelligence. We mean tiny in every sense of the word: as tiny as a grain of rice and consuming tiny amounts of power. Supported by Arm, Google, Qualcomm and others, tinyML has the potential to transform the Internet of Things (IoT), where billions of tiny devices, based on Arm chips, are already being used to provide greater insight and efficiency in sectors including consumer, medical, automotive and industrial.

Why target microcontrollers with tinyML?

Microcontrollers such as the Arm Cortex-M family are an ideal platform for ML because they’re already used everywhere. They perform real-time calculations quickly and efficiently, so they’re reliable and responsive, and because they use very little power, can be deployed in places where replacing the battery is difficult or inconvenient. Perhaps even more importantly, they’re cheap enough to be used just about anywhere. The market analyst IDC reports that 28.1 billion microcontrollers were sold in 2018, and forecasts that annual shipment volume will grow to 38.2 billion by 2023.

TinyML on microcontrollers gives us new techniques for analyzing and making sense of the massive amount of data generated by the IoT. In particular, deep learning methods can be used to process information and make sense of the data from sensors that do things like detect sounds, capture images, and track motion.

Advanced pattern recognition in a very compact format

Looking at the math involved in machine learning, data scientists found they could reduce complexity by making certain changes, such as replacing floating-point calculations with simple 8-bit operations. These changes created machine learning models that work much more efficiently and require far fewer processing and memory resources.
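
A minimal example of that idea: symmetric, per-tensor quantization maps each float weight onto one of 256 integer levels. The NumPy sketch below is a toy version of what production converters do with far more sophistication.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights to int8 with a single per-tensor scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize_int8(w)
print("max reconstruction error:", np.abs(w - q.astype(np.float32) * scale).max())
```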

TinyML technology is evolving rapidly thanks to new technology and an engaged base of committed developers. Only a few years ago, we were celebrating our ability to run, on a constrained Arm Cortex-M3 microcontroller, a speech-recognition model capable of waking the system when it detects certain words, using just 15 kilobytes (KB) of code and 22KB of data.

Since then, Arm has launched new machine learning (ML) processors, called the Ethos-U55 and Ethos-U65: microNPUs specifically designed to accelerate ML inference in embedded and IoT devices.

The Ethos-U55, combined with the AI-capable Cortex-M55 processor, will provide a significant uplift in ML performance and improvement in energy efficiency over the already impressive examples we are seeing today.

TinyML takes endpoint devices to the next level

The potential use cases of tinyML are almost unlimited. Developers are already working with tinyML to explore all sorts of new ideas: responsive traffic lights that change signaling to reduce congestion, industrial machines that can predict when they’ll need service, sensors that can monitor crops for the presence of damaging insects, in-store shelves that can request restocking when inventory gets low, healthcare monitors that track vitals while maintaining privacy. The list goes on.

TinyML can make endpoint devices more consistent and reliable, since there’s less need to rely on busy, crowded internet connections to send data back and forth to the cloud. Reducing or even eliminating interactions with the cloud has major benefits, including reduced energy use, significantly reduced latency in processing data, and improved security, since data that doesn’t travel is far less exposed to attack.

It’s worth noting that these tinyML models, which perform inference on the microcontroller, aren’t intended to replace the more sophisticated inference that currently happens in the cloud. What they do instead is bring specific capabilities down from the cloud to the endpoint device. That way, developers can save cloud interactions for if and when they’re needed.

TinyML also gives developers a powerful new set of tools for solving problems. ML makes it possible to detect complex events that rule-based systems struggle to identify, so endpoint AI devices can start contributing in new ways. Also, since ML makes it possible to control devices with words or gestures, instead of buttons or a smartphone, endpoint devices can be built more rugged and deployable in more challenging operating environments. 

TinyML gaining momentum with an expanding ecosystem

Industry players have been quick to recognize the value of tinyML and have moved rapidly to create a supportive ecosystem. Developers at every level, from enthusiastic hobbyists to experienced professionals, can now access tools that make it easy to get started. All that’s needed is a laptop, an open-source software library and a USB cable to connect the laptop to one of several inexpensive development boards priced as low as a few dollars.

In fact, at the start of 2021, Raspberry Pi released its very first microcontroller board, one of the most affordable development boards available on the market at just $4. Named Raspberry Pi Pico, it’s powered by the RP2040 SoC, a surprisingly powerful dual-core Arm Cortex-M0+ processor. The RP2040 MCU is able to run TensorFlow Lite Micro and we’re expecting to see a wide range of ML use cases for this board over the coming months.

Arm is a strong proponent of tinyML because our microcontroller architectures are so central to the IoT, and because we see the potential of on-device inference. Arm’s collaboration with Google is making it even easier for developers to deploy endpoint machine learning in power-conscious environments.

The combination of Arm CMSIS-NN libraries with Google’s TensorFlow Lite Micro (TFLu) framework allows data scientists and software developers to take advantage of Arm’s hardware optimizations without needing to become experts in embedded programming.
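
The workflow typically starts on a workstation: train a model, then convert it to a fully int8-quantized TensorFlow Lite flatbuffer that the TFLu interpreter can run on a Cortex-M device. The sketch below shows one common conversion recipe; the model path and input shape are hypothetical and depend entirely on your model.

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("keyword_model")  # hypothetical path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

def representative_data():
    # A few hundred real samples let the converter calibrate quantization ranges.
    for _ in range(100):
        yield [tf.random.normal([1, 49, 40, 1])]  # hypothetical input shape

converter.representative_dataset = representative_data

with open("model.tflite", "wb") as f:
    f.write(converter.convert())
# The flatbuffer is then embedded in firmware, e.g. via `xxd -i model.tflite`.
```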

On top of this, Arm is investing in new tools derived from Keil MDK to help developers get from prototype to production when deploying ML applications.

TinyML would not be possible without a number of early influencers: Pete Warden, a “founding father” of tinyML and a technical lead of TensorFlow Lite Micro at Google; Arm Innovator Kwabena Agyeman, who developed OpenMV, a project dedicated to low-cost, extensible, Python-powered machine-vision modules that support machine learning algorithms; and Arm Innovator Daniel Situnayake, a founding tinyML engineer and developer from Edge Impulse, a company that offers a full tinyML pipeline that covers data collection, model training and model optimization. Also, Arm partners such as Cartesiam.ai, a company that offers NanoEdge AI, a tool that creates software models on the endpoint based on the sensor behavior observed in real conditions, have been pushing the possibilities of tinyML to another level.

Arm is also a partner of the tinyML Foundation, an open community that coordinates meet-ups to help people connect, share ideas, and get involved. There are many localised tinyML meet-ups covering the UK, Israel and Seattle, to name a few, as well as a global series of tinyML Summits. For more information, visit the tinyML Foundation website.

Originally posted here.

Read more…

Once again, I’m jumping up and down in excitement because I’m going to be hosting a panel discussion as part of a webinar series — Fast and Fearless: The Future of IoT Software Development — being held under the august auspices of IoTCentral.io.

At this event, the second of a four-part series, we will be focusing on “AI and IoT Innovation” (see also What the FAQ are AI, ANNs, ML, DL, and DNNs? and What the FAQ are the IoT, IIoT, IoHT, and AIoT?).


Panel members Karl Fezer (upper left), Wei Xiao (upper right), Nikhil Bhaskaran (lower left), and Tina Shyuan (bottom right)

As we all know, the IoT is transforming the software landscape. What used to be a relatively straightforward embedded software stack has been revolutionized by the IoT, with developers now having to juggle specialized workloads, security, artificial intelligence (AI) and machine learning (ML), real-time connectivity, managing devices that have been deployed into the field… the list goes on.

In this webinar — which will be held on Tuesday 29 June 2021 from 10:00 a.m. to 11:00 a.m. CDT — I will be joined by four industry luminaries to discuss how to juggle the additional complexities that machine learning adds to IoT development, why on-device machine learning is more important now than ever, and what the combination of AI and IoT looks like for developers in the future.

The luminaries in question (and whom I will be questioning) are Karl Fezer (AI Ecosystem Evangelist at Arm), Wei Xiao (Principal Engineer, Sr. Strategic Alliances Manager at Nvidia), Nikhil Bhaskaran (Founder of Shunya OS), and Tina Shyuan (Director of Product Marketing at Qeexo).

So, what say you? Dare I hope that we will have the pleasure of your company and that you will be able to join us to (a) tease your auditory input systems with our discussions and (b) join our question-and-answer free-for-all frenzy at the end? If so, may I suggest that you Register Now before all of the good virtual seats are taken, metaphorically speaking, of course.

>> Click here to register

Read more…

Happy Friday (or whatever day it is when you find yourself reading this). I’m currently bouncing off the walls in excitement because I’ve been invited to host a panel discussion as part of a webinar series — Fast and Fearless: The Future of IoT Software Development — being held under the august auspices of IoTCentral.io


Panel members Joe Alderson (upper left), Pamela Cortez (upper right), Katherine Scott (lower left), and Ihor Dvoretskyi (bottom right)

At this event, the first of a 4-part series, we will be focusing on “The IoT Software Developer Experience.”

As we all know, the IoT is transforming the software landscape. What used to be a relatively straightforward embedded software stack has been revolutionized by the IoT, with developers now having to juggle specialized workloads, security, machine learning, real-time connectivity, managing devices that have been deployed into the field… the list goes on.

In this webinar — which will be held on Tuesday 11 May 2021 from 10:00 a.m. to 11:00 a.m. CDT — I will be joined by four industry luminaries to discuss the development challenges engineers are facing today, how the industry is helping to make IoT development easier, an overview of development processes (including cloud-based continuous integration (CI) workflows and low-code development), and what the future looks like for developers who are building for the IoT. 

The luminaries in question (and whom I will be questioning) are Joe Alderson (Director of Embedded Tools and User Experience at Arm), Pamela Cortez (IoT Developer Advocate and Sr. Program Manager at Microsoft Azure IoT), Katherine Scott (Developer Advocate at Open Robotics), and Ihor Dvoretskyi (Developer Advocate at Cloud Native Computing Foundation).

So, what say you? Dare I hope that we will have the pleasure of your company and that you will be able to join us to (a) tease your auditory input systems with our discussions and (b) join our question-and-answer free-for-all at the end?

Recording available:

Read more…

Well, this isn’t something I expected to be talking about today, but my chum Ben Cook just introduced me to something that looks rather cool.

Ben is the Founder and Director at Airspeed Electronics Ltd., which is an electronic design consultancy that’s based in the UK specializing in high-performance acoustic detection and tracking technology for counter-unmanned aircraft system (UAS) applications. The folks at Airspeed Electronics are currently developing a drone detection and tracking system called MANTIS, where this work is being funded through a research grant provided by the UK Ministry of Defence (which — before you make a nasty comment — is how they spell “Defense” in the UK).

MANTIS, which stands for “MAchine learNing acousTIc Surveillance,” is a system of distributed, intelligent acoustic sensors that use artificial intelligence (AI) for the detection, classification, and location estimation of UAS — such as drones — based on their acoustic signatures.

But that’s not what I wanted to talk to you about…

In his email to me, Ben spake as follows: “Have you heard of an embedded operating system called ‘Luos’ before? It’s a microservices software architecture, like docker but for use with microcontrollers. I have no affiliation, I just stumbled across this today and I’m thinking this could be very useful for some future projects. It looks really good for anything ‘modular-y,’ if you know what I mean…”

I do know what Ben means. I just meandered my way around the luos.io website, perused and pondered the documentation at docs.luos.io, and watched this video on YouTube (later today, I’m going to get the tattoo, buy the T-shirt, and see the stage play).

In a nutshell, Luos is a simple and lightweight open-source distributed operating system dedicated to embedded systems. It uses the concept of modularity to simplify the linking of components and chunks of application code together to form a single system image.

Consider a system like a robot that uses multiple microcontrollers to manage its various sensors, actuators, and motors. If each of these microcontrollers employs Luos technology, all of them can use any feature of any microcontroller in the system as if all of the features were located in the same component.
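
If you want a feel for the idea without any hardware, here is a toy Python sketch of location-transparent services: nodes register what they offer, and callers invoke a service by name without knowing which microcontroller hosts it. To be clear, this is a conceptual illustration only, not the actual Luos API (which is C-based).

```python
class ServiceBus:
    """Toy stand-in for a distributed service registry."""

    def __init__(self):
        self.services = {}

    def register(self, name, handler):
        self.services[name] = handler          # any node can publish a service

    def call(self, name, *args):
        return self.services[name](*args)      # location-transparent dispatch

bus = ServiceBus()
bus.register("motor.set_speed", lambda rpm: f"motor now at {rpm} rpm")
bus.register("imu.read", lambda: {"ax": 0.0, "ay": 0.0, "az": 9.8})
print(bus.call("motor.set_speed", 1200))       # caller never knows which MCU ran it
```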

Now, I’m a hardware design engineer by trade, so the software side is a bit outside my bailiwick, but — even so — looking at the video above and scanning the documentation makes me sit up and say, “Wow, this looks really, really cool.”

I asked around a few of my embedded systems software developer friends, and no one had heard of Luos, but I have a feeling that this may be a tool that’s poised to make a big splash. All sorts of ideas are currently bouncing around my head, like the fact that the Tracealyzer tool from Percepio would make an ideal companion for the Luos OS (see also The 2021 Embedded Online Conference Approacheth).

How about you? Have you heard of Luos? If so, what are your thoughts? If not, and if you lean toward the software side of things, it would be great if you could take a look with your highly trained eye, see what you think, and report back to the rest of us in the comments below.

Originally posted HERE

Read more…

By Sachin Kotasthane

In his book, 21 Lessons for the 21st Century, the historian Yuval Noah Harari highlights the complex challenges mankind will face on account of technological challenges intertwined with issues such as nationalism, religion, culture, and calamities. In the current industrial world hit by a worldwide pandemic, we see this complexity translate in technology, systems, organizations, and at the workplace.

While in my previous article, Humane IIoT, I discussed the people-centric strategies that enterprises need to adopt while onboarding industrial IoT initiatives in the workforce, in this article, I will share thoughts on how new-age technologies such as AI, ML, and big data, and of course, industrial IoT, can be used for effective management of complex workforce problems in a factory, thereby changing the way people work and interact, especially in this COVID-stricken world.

Workforce related problems in production can be categorized into:

  1. Time complexity
  2. Effort complexity
  3. Behavioral complexity

Problems categorized in either of the above have a significant impact on the workforce, resulting in a detrimental effect on the outcome—of the product or the organization. The complexity of these problems can be attributed to the fact that solutions to such issues cannot be found using engineering or technology fixes alone, as there is no single root cause but rather a combination of factors and scenarios. Let us, therefore, explore a few and seek probable workforce solutions.

Figure 1: Workforce Challenges and Proposed Strategies in Production

  1. Addressing Time Complexity

    Any workforce-related issue that has a detrimental effect on the operational time, due to contributing factors from different factory systems and processes, can be classified as a time complex problem.

    Though classical paper-based schedules, lists, and punch sheets have largely been replaced with IT systems such as MES, APS, and SRM, the increasing demands for flexibility in manufacturing operations and trends such as batch-size-one warrant the need for new methodologies to solve these complex problems.

    • Worker attendance

      Anyone who has experienced, at close quarters, a typical day in the life of a factory supervisor, will be conversant with the anxiety that comes just before the start of a production shift. Not knowing who will report absent, until just before the shift starts, is one complex issue every line manager would want to get addressed. While planned absenteeism can be handled to some degree, it is the last-minute sick or emergency-pager text messages, or the transport delays, that make the planning of daily production complex.

      What if there were a solution to get the count that is almost close to the confirmed hands for the shift, an hour or half, at the least, in advance? It turns out that organizations are experimenting with a combination of GPS, RFID, and employee tracking that interacts with resource planning systems, trying to automate the shift planning activity.

      While some legal and privacy issues still need to be addressed, it would not be long before we see people being assigned to workplaces, even before they enter the factory floor.

      While making sure every line manager has accurate information about the confirmed hands for the shift, it is equally important that the health and well-being of employees are monitored during this pandemic. The use of technologies such as radar and millimeter-wave sensors would enable live tracking of workers around the shop-floor and help ensure that social distancing norms are well-observed.

    • Resource mapping

      While resource skill-mapping and certification are mostly HR function prerogatives, not having the right resource at the workstation during exigencies such as absenteeism or extra workload is a complex problem. Precious time is lost in locating such resources, or worse still, millions are spent in overtime.

      What if there were a tool that analyzed the current workload for a resource with the identified skillset code(s) and gave an accurate estimate of the resource’s availability? This could further be used by shop managers to plan manpower for a shift, keeping them as lean as possible.

      Today, IT teams of OEMs are seen working with software vendors to build such analytical tools that consume data from disparate systems—such as production work orders from MES and swiping details from time systems—to create real-time job profiles. These results are fed to the HR systems to give managers the insights needed to make resource decisions within minutes.

  2. Addressing Effort Complexity

    Just as time complexities result in increased production time, problems in this category result in an increase in effort by the workforce to complete the same quantity of work. As the effort required is proportionate to the fatigue and long-term well-being of the workforce, seeking workforce solutions to reduce effort would be appreciated. Complexity arises when organizations try to create a method out-of-madness from a variety of factors such as changing workforce profiles, production sequences, logistical and process constraints, and demand fluctuations.

    Thankfully, solutions for this category of problems can be found in new technologies that augment existing systems to get insights and predictions, the results of which can reduce effort, thereby channeling it more productively. Add to this the demand fluctuations of the current pandemic, and having real-time operational visibility, coupled with advanced analytics, will help ensure shift production targets are met.

    • Intelligent exoskeletons

      Exoskeletons, as we know, are powered bodysuits designed to safeguard and support the user in performing tasks, while increasing overall human efficiency to do the respective tasks. These are deployed in strain-inducing postures or to lift objects that would otherwise be tiring after a few repetitions. Exoskeletons are the new-age answer to reducing user fatigue in areas requiring human skill and dexterity, which otherwise would require a complex robot and cost a bomb.

      However, the complexity that mars exoskeleton users is making the same suit adaptable for a variety of postures, user body types, and jobs at the same workstation. It would help if the exoskeleton could sense the user, set the posture, and adapt itself to the next operation automatically.

      Taking a leaf out of Marvel’s Iron Man, who uses a suit that complements his posture and is controlled by JARVIS, manufacturers can now hope to create intelligent exoskeletons that are always connected to factory systems and user profiles. These suits will adapt and respond to assistive needs, without the need for any intervention, thereby freeing the user to work and focus completely on the main job at hand.

      Given the ongoing COVID situation, it would make the life of workers and the management safe if these suits are equipped with sensors and technologies such as radar/millimeter wave to help observe social distancing, body-temperature measuring, etc.

    • Highlighting likely deviations

      The world over, quality teams on factory floors work with checklists that the quality inspector verifies for every product that comes at the inspection station. While this repetitive task is best suited for robots, when humans execute such repetitive tasks, especially those that involve using visual, audio, touch, and olfactory senses, mistakes and misses are bound to occur. This results in costly reworks and recalls.

      Manufacturers have tried to address this complexity by carrying out rotation of manpower. But this, too, has met with limited success, given the available manpower and ever-increasing workloads.

      Fortunately, predictive quality integrated with feed-forwards techniques and some smart tracking with visuals can be used to highlight the area or zone on the product that is prone to quality slips based on data captured from previous operations. The inspector can then be guided to pay more attention to these areas in the checklist.

  3. Addressing Behavioral Complexity

    Problems of this category usually manifest as a quality issue, but the root cause can often be traced to the workforce behavior or profile. Traditionally, organizations have addressed such problems through experienced supervisors, who as people managers were expected to read these signs, anticipate and align the manpower.

    However, with constantly changing manpower and product variants, these are now complex new-age problems requiring new-age solutions.

    • Heat-mapping workload

      Time and motion studies at the workplace map the user movements around the machine with the time each activity takes for completion, matching the available cycle-time, either by work distribution or by increasing the manpower at that station. Time-consuming and cumbersome as it is, the complexity increases when workload balancing is to be done for teams working on a single product at the workstation. Movements of multiple resources during different sequences are difficult to track, and the different users cannot be expected to follow the same footsteps every time.

      Solving this issue needs a solution that will monitor human motion unobtrusively, link those to the product work content at the workstation, generate recommendations to balance the workload and even out the ‘congestion.’ New industrial applications such as short-range radar and visual feeds can be used to create heat maps of the workforce as they work on the product. This can be superimposed on the digital twin of the process to identify the zone where there is ‘congestion.’ This can be fed to the line-planning function to implement corrective measures such as work distribution or partial outsourcing of the operation.

    • Aging workforce (loss of tribal knowledge)

      With new technology coming to the shop-floor, skills of the current workforce get outdated quickly. Also, with any new hire comes the critical task of training and knowledge sharing from experienced hands. As organizations already face a shortage of manpower, releasing more hands to impart training to a larger workforce audience, possibly at different locations, becomes an even more daunting task.

      Fully realizing the difficulties and reluctance to document, organizations are increasingly adopting AR-based workforce trainings that map to relevant learning and memory needs. These AR solutions capture the minutest of the actions executed by the expert on the shop-floor and can be played back by the novice in-situ as a step-by-step guide. Such tools simplify the knowledge transfer process and also increase worker productivity while reducing costs.

      Further, in extraordinary situations such as the one we face at present, technologies such as AR offer solutions for effective and personalized support to field personnel, without the need to fly in specialists at multiple sites. This helps keep them safe, and accessible, still.

Key takeaways and Actionable Insights

The shape of the future workforce will be the result of complex, changing, and competing forces. Technology, globalization, demographics, social values, and the changing personal expectations of the workforce will continue to transform and disrupt the way businesses operate, increasing complexity and radically changing the where, when, and how of the future workforce and its work. While the need to constantly reskill and upskill the workforce will be humongous, using new-age techniques and technologies to enhance the effectiveness and efficiency of the existing workforce will come into the spotlight.


Figure 2: The Future IIoT Workforce

Organizations will increasingly be required to:

  1. Deploy data farming to dive deep and extract vast amounts of information and process insights embedded in production systems. Tapping into large reservoirs of ‘tribal knowledge’ and digitizing it for ingestion to data lakes is another task that organizations will have to consider.
  2. Augment existing operations systems such as SCADA, DCS, MES, CMMS with new technology digital platforms, AI, AR/VR, big data, and machine learning to underpin and grow the world of work. While there will be no dearth of resources in one or more of the new technologies, organizations will need to ‘acqui-hire’ talent and intellectual property using a specialist, to integrate with existing systems and gain meaningful actionable insights.
  3. Address privacy and data security concerns of the workforce, through the smart use of technologies such as radar and video feeds.

Nonetheless, digital enablement will need to be optimally used to tackle the new normal that the COVID pandemic has set forth in manufacturing—fluctuating demands, modular and flexible assembly lines, reduced workforce, etc.

Originally posted here.

Read more…

Image Source: SEGGER.com

Nearly every embedded software developer working in the IoT space is now building secure devices. Developers have been mostly focused on how to handle secure applications and the basic microcontroller technologies, such as how to use Arm’s TrustZone or leverage multicore processors. A looming problem that many companies and teams are overlooking is that figuring out how to develop secure applications is just the first step. There are three stages to secure product lifecycle management, and in today’s post, we will review what is involved in each stage.

As a quick overview, the stages, which can be seen in the diagram below, are:

  • Development
  • Test and Production Deployment
  • Maintenance and In-field Servicing

Let us look at each of these stages in a little more detail. 

Stage #1 – Development

Development is probably the area that developers are most familiar with, but at the same time, the area they are having to adapt to the most. Many developers have designed and built systems without ever having to take security into account. Development involves a lot more than just deciding which components to isolate and how to separate the software into secure and non-secure regions.

For example, during the development phase, developers now need to learn how to develop in an environment where a secure bootloader is in place. They need to consider whether firmware fallbacks are allowed and, if so, under what conditions. Firmware images may need to be compressed on top of the need for authentication.

While the development stage has become more complicated, developers should not struggle too much to extrapolate their past experiences to developing secure firmware successfully.

Stage #2 – Test and Production Deployment

The area that developers will probably struggle with the most is the test and production deployment stage. Testing secure software requires additional steps to be taken that authenticate debug hardware so that the developer can access secure memory regions to test their code and successfully debug it. Even more importantly, care must be taken to install that secure software onto a product during production.

There are several ways this can be done, but one method is to use a secure flashing device like SEGGER’s Flasher Secure. These devices can follow a multistep process that involves validating a user ID, which then allows the secure firmware to be installed on the device. The devices themselves limit how many units, and which devices, the firmware can be installed on, which helps protect a team’s intellectual property and prevents unauthorized production of a product.


Stage #3 – Maintenance and In-field Servicing

Finally, there is the maintenance and in-field servicing stage which is a partial continuation of the development phase. Once a product has been deployed into the field, it needs to be securely updated. Updates can be done manually in-field, or they can be done using an over-the-air update process. This involves a device being able to contact a secure firmware server that can compress and encrypt the image and transport it to the device. Once the device has received the image, it must decrypt, decompress and validate the contents of the image. If everything looks good, the image can then be loaded as the primary firmware for the device.
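
The device side of that pipeline reduces to a few clear steps: split out the nonce, decrypt (which also authenticates when an AEAD cipher is used), decompress, and sanity-check before handing the image to the installer. Here is a hedged Python sketch of that flow; the blob layout and the final check are invented for illustration.

```python
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def apply_update(blob: bytes, key: bytes) -> bytes:
    """Decrypt, decompress, and validate a received OTA image."""
    nonce, ciphertext = blob[:12], blob[12:]       # assumed wire layout
    compressed = AESGCM(key).decrypt(nonce, ciphertext, None)  # raises if tampered
    image = zlib.decompress(compressed)
    if len(image) < 1024:                          # hypothetical sanity check
        raise ValueError("image implausibly small")
    return image                                   # hand off to the installer
```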

Conclusions

There is much more to designing and deploying a secure device than simply developing a secure application. The entire process is broken up into three main stages that we have looked at in greater detail today. Unfortunately, we have only just scratched the surface!

Originally posted here.

Read more…

When I think about the things that held the planet together in 2020, it was digital experiences delivered over wireless connectivity that made remote things local.

While heroes like doctors, nurses, first responders, teachers, and other essential personnel bore the brunt of the COVID-19 response, billions of people around the world found themselves cut off from society. In order to keep people safe, we were physically isolated from each other. Far beyond the six feet of social distancing, most of humanity weathered the storm from their homes.

And then little by little, old things we took for granted, combined with new things many had never heard of, pulled the world together. Let’s take a look at the technologies and trends that made the biggest impact in 2020 and where they’re headed in 2021:

The Internet

The global Internet infrastructure from which everything else is built is an undeniable hero of the pandemic. This highly-distributed network designed to withstand a nuclear attack performed admirably as usage by people, machines, critical infrastructure, hospitals, and businesses skyrocketed. Like the air we breathe, this primary facilitator of connected, digital experiences is indispensable to our modern society. Unfortunately, the Internet is also home to a growing cyberwar and security will be the biggest concern as we move into 2021 and beyond. It goes without saying that the Internet is one of the world’s most critical utilities along with water, electricity, and the farm-to-table supply chain of food.

Wireless Connectivity

People are mobile and they stay connected through their smartphones, tablets, in cars and airplanes, on laptops, and other devices. Just like the Internet, the cellular infrastructure has remained exceptionally resilient to enable communications and digital experiences delivered via native apps and the web. Indoor wireless connectivity continues to be dominated by WiFi at home and all those empty offices. Moving into 2021, the continued rollout of 5G around the world will give cellular endpoints dramatic increases in data capacity and WiFi-like speeds. Additionally, private 5G networks will challenge WiFi as a formidable indoor option, but WiFi 6E with increased capacity and speed won’t give up without a fight. All of these developments are good for consumers who need to stay connected from anywhere like never before.

Web Conferencing

With many people stuck at home in 2020, web conferencing technology took the place of traveling to other locations to meet people or receive education. This technology isn’t new and includes familiar players like GoToMeeting, Skype, WebEx, Google Hangouts/Meet, BlueJeans, FaceTime, and others. Before COVID, these platforms enjoyed success, but most people preferred to fly on airplanes to meet customers and attend conferences while students hopped on the bus to go to school. In 2020, “necessity is the mother of invention” took hold and the use of Zoom and Teams skyrocketed as airplanes sat on the ground while business offices and schools remained empty. These two platforms further increased their stickiness by increasing the number of visible people and adding features like breakout rooms to meet the demands of businesses, virtual conference organizers, and school teachers. Despite the rollout of the vaccine, COVID won’t be extinguished overnight and these platforms will remain strong through the first half of 2021 as organizations rethink where and when people work and learn. There are way too many players in this space, so look for some consolidation.

E-Commerce

“Stay at home” orders and closed businesses gave e-commerce platforms a dramatic boost in 2020 as they took the place of shopping at stores or going to malls. Amazon soared to even higher heights, Walmart upped their game, Etsy brought the artsy, and thousands of Shopify sites delivered the goods. Speaking of delivery, the empty city streets became home to fleets of FedEx, Amazon, UPS, and DHL trucks bringing packages to your front doorstep. Many retail employees traded-in working at customer-facing stores for working in distribution centers as long as they could outperform robots. Even though people are looking forward to hanging out at malls in 2021, the e-commerce, distribution center, delivery truck trinity is here to stay. This ball was already in motion and got a rocket boost from COVID. This market will stay hot in the first half of 2021 and then cool a bit in the second half.

Ghost Kitchens

The COVID pandemic really took a toll on restaurants in 2020, with many of them going out of business permanently. Those that survived had to pivot to digital and other ways of doing business. High-end steakhouses started making burgers on grills in the parking lot, while takeout pizzerias discovered they finally had the best business model. Having a drive-thru lane was definitely one of the keys to success in a world without waiters, busboys, and hosts. “Front of house” was shut down, but the “back of house” still had a pulse. Adding mobile web and native apps that allowed customers to easily order from operating “ghost kitchens” and pay with credit cards or Apple/Google/Samsung Pay enabled many restaurants to survive. A combination of curbside pickup and delivery from the likes of DoorDash, Uber Eats, Postmates, Instacart and Grubhub made this business model work. A surge in digital marketing also took place where many restaurants learned the importance of maintaining a relationship with their loyal customers via connected mobile devices. For the most part, 2021 has restaurateurs hoping for 100% in-person dining, but a new business model that looks a lot like catering + digital + physical delivery is something that has legs.

The Internet of Things

At its very essence, IoT is all about remotely knowing the state of a device or environmental system along with being able to remotely control some of those machines. COVID forced people to work, learn, and meet remotely and this same trend applied to the industrial world. The need to remotely operate industrial equipment or an entire “lights out” factory became an urgent imperative in order to keep workers safe. This is yet another case where the pandemic dramatically accelerated digital transformation. Connecting everything via APIs, modeling entities as digital twins, and having software bots bring everything to life with analytics has become an ROI game-changer for companies trying to survive in a free-falling economy. Despite massive employee layoffs and furloughs, jobs and tasks still have to be accomplished, and business leaders will look to IoT-fueled automation to keep their companies running and drive economic gains in 2021.

Streaming Entertainment

Closed movie theaters, football stadiums, bowling alleys, and other sources of entertainment left most people sitting at home watching TV in 2020. This turned into a dream come true for streaming entertainment companies like Netflix, Apple TV+, Disney+, HBO Max, Hulu, Amazon Prime Video, Youtube TV, and others. That said, Quibi and Facebook Watch didn’t make it. The idea of binge-watching shows during the weekend turned into binge-watching every season of every show almost every day. Delivering all these streams over the Internet via apps has made it easy to get hooked. Multiplayer video games fall in this category as well and represent an even larger market than the film industry. Gamers socially distanced as they played each other from their locked-down homes. The rise of cloud gaming combined with the rollout of low-latency 5G and Edge computing will give gamers true mobility in 2021. On the other hand, the video streaming market has too many players and looks ripe for consolidation in 2021 as people escape the living room once the vaccine is broadly deployed.

Healthcare

With doctors and nurses working around the clock as hospitals and clinics were stretched to the limit, it became increasingly difficult for non-COVID patients to receive the healthcare they needed. This unfortunate situation gave tele-medicine the shot in the arm (no pun intended) it needed. The combination of healthcare professionals delivering healthcare digitally over widespread connectivity helped those in need. This was especially important in rural areas that lacked the healthcare capacity of cities. Concurrently, the Internet of Things is making deeper inroads into delivering the health of a person to healthcare professionals via wearable technology. Connected healthcare has a bright future that will accelerate in 2021 as high-bandwidth 5G provides coverage to more of the population to facilitate virtual visits to the doctor from anywhere.

Working and Living

As companies and governments told their employees to work from home, it gave people time to rethink their living and working situations. Lots of people living in previously hip, urban, high-rise buildings found themselves residing in not-so-cool, hollowed-out ghost towns of boarded-up windows and closed bars and cafés. Others began to question why they were living in areas with expensive real estate and high taxes when they no longer had to be close to the office. This led to a 2020 COVID exodus out of pricey apartments/condos downtown to cheaper homes in distant suburbs, as well as the move from pricey areas like Silicon Valley to cheaper destinations like Texas. Since you were stuck in your home, having a larger house with a home office, fast broadband, and a backyard became the most important thing. Looking ahead to 2021, a hybrid model of work-from-home plus occasionally going into the office is here to stay, as employees will no longer tolerate sitting in traffic two hours a day just to sit in a cubicle in a skyscraper. The digital transformation of how and where we work has truly accelerated.

Data and Advanced Analytics

Data has shown itself to be one of the world’s most important assets during the time of COVID. Petabytes of data have continuously streamed in from all over the world, letting us know the number of cases, the growth or decline of infections, hospitalizations, contact-tracing, free ICU beds, temperature checks, deaths, and hotspots of infection. Some of this data has been reported manually while lots of other sources are fully automated from machines. Capturing, storing, organizing, modeling and analyzing this big data has elevated the importance of cloud and edge computing, global-scale databases, advanced analytics software, and machine learning. This is a trend that was already taking place in business and now has a giant spotlight on it due to its global importance. There’s no stopping the data + advanced analytics juggernaut in 2021 and beyond.

Conclusion

2020 was one of the worst years in human history and the loss of life was just heartbreaking. People, businesses, and our education system had to become resourceful to survive. This resourcefulness amplified the importance of delivering connected, digital experiences to make previously remote things into local ones. Cheers to 2021 and the hope for a brighter day for all of humanity.

Read more…

Arm DevSummit 2020 debuted this week (October 6 – 8) as an online virtual conference focused on engineers and providing them with insights into the Arm ecosystem. The summit lasted three days, over which Arm painted an interesting technology story about the current and future state of computing and where developers fit within that story. I’ve been attending Arm TechCon (which has become Arm DevSummit) for more than half a decade now, and as I perused the content, there were several take-a-ways I noticed for developers working on microcontroller-based embedded systems. In this post, we will examine these key take-a-ways and I’ll point you to some of the sessions that I think may pique your interest.

(For those of you that aren’t yet aware, you can register up until October 21st (for free) and still watch the conference materials until November 28th. Click here to register.)

Take-A-Way #1 – Expect Big Things from NVIDIA’s Acquisition of Arm

As many readers probably already know, NVIDIA is in the process of acquiring Arm. This acquisition has the potential to be a focal point of a technological revolution in computing, particularly around artificial intelligence, but it will also impact nearly every embedded system at the edge and beyond. While many of us have probably wondered what plans NVIDIA CEO Jensen Huang may have for Arm, the keynotes for October 6th include a fireside chat between Jensen Huang and Arm CEO Simon Segars. Listening to this conversation is well worth the time and will give developers some insights into the future, along with assurances that the Arm business model will not be dramatically upended.

Take-A-Way #2 – Machine Learning for MCUs is Accelerating

It is sometimes difficult at a conference to get a feel for what is real and what is a little more smoke and mirrors. Sometimes announcements are real, but they take several years to filter their way into the market and affect how developers build systems. Machine learning is one of those technologies that attracts a lot of interest but that developers aren’t quite sure what to do with yet, at least in the microcontroller space. When we hear machine learning, we think artificial intelligence, big datasets, and more processing power than will fit on an MCU.

There were several interesting talks at DevSummit around machine learning.

Some of these were foundational, providing embedded developers with the fundamentals to get started, while others provided hands-on explorations of machine learning with development boards. The take-a-way that I gather here is that the effort to bring machine learning capabilities to microcontrollers, so that they can be leveraged in industry use cases, is accelerating. Lots of effort is being put into ML algorithms, tools, frameworks and even the hardware. There were several talks that mentioned Arm’s Cortex-M55 processor, which will include Helium technology to help accelerate machine learning and DSP processing capabilities.
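
As a concrete taste of that tooling, here is a sketch (a toy model, not code from any session) of the common path for getting ML onto a microcontroller: quantizing a small Keras network to 8-bit with the TensorFlow Lite converter, after which the resulting byte array can be compiled into flash and run with an interpreter such as TensorFlow Lite for Microcontrollers.

    import numpy as np
    import tensorflow as tf

    # A toy network standing in for a real sensor-classification model.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])

    def representative_data():
        # Calibration samples let the converter pick int8 quantization ranges.
        for _ in range(100):
            yield [np.random.rand(1, 64).astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    tflite_model = converter.convert()

    # A few kilobytes -- small enough to link into MCU flash as a C array.
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)
    print(len(tflite_model), "bytes")

Vector extensions like Helium exist to make the int8 inner loops of exactly this kind of quantized model run fast on a Cortex-M.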

Take-A-Way #3 – The Constant Need for Reinvention

In my last take-a-way, I alluded to the fact that things are accelerating. Acceleration is not just happening in the technologies that we use to build systems, though. The range of application domains to which we can apply these technologies is also dramatically expanding. Not only can we start to deploy security and ML technologies at the edge, but also in domains such as space and medical systems. There were several interesting talks about how technologies are being used around the world to solve interesting and unique problems such as protecting vulnerable ecosystems, mapping the sea floor, fighting against diseases and so much more.

By carefully watching and listening, you’ll notice that many speakers have been involved in many different types of products over their careers and that they are constantly having to reinvent their skill sets, capabilities and even their interests! This is what makes working in embedded systems so interesting! It is constantly changing and evolving and as engineers we don’t get to sit idly behind a desk. Just as Arm, NVIDIA and many of the other ecosystem partners and speakers show us, technology is rapidly changing but so are the problem domains that we can apply these technologies to.

Take-A-Way #4 – Mbed and Keil are Evolving

There are also interesting changes coming to the Arm toolchains and tools like Mbed and Keil MDK. In Reinhard Keil’s talk, “Introduction to an Open Approach for Low-Power IoT Development”, developers got an insight into the changes that are coming to Mbed and Keil, with the core focus being on IoT development. The talk focused on the endpoint and discussed how Mbed and Keil MDK are being moved to an online platform designed to help developers move through product development faster, from prototyping to production. Keil Studio Online is currently in early access and will be released early next year.

(If you are interested in endpoints and AI, you might also want to check out this article on “How Do We Accelerate Endpoint AI Innovation? Put Developers First”)

Conclusions

Arm DevSummit had a lot to offer developers this year, and without the need to travel to California to participate (although I greatly missed catching up with friends and colleagues in person). If you haven’t already, I would recommend checking out the DevSummit and watching a few of the talks I mentioned. There certainly were a lot more talks, and I’m still in the process of sifting through everything. Hopefully there will be a few sessions that inspire you and give you a feel for where the industry is headed and how you will need to pivot your own skills in the coming years.

Originally posted here

Read more…

Will We Ever Get Quantum Computers?

In a recent issue of IEEE Spectrum, Mikhail Dyakonov makes a pretty compelling argument that quantum computing (QC) isn't going to fly anytime soon. Now, I'm no expert on QC, and there sure is a lot of money being thrown at the problem by some very smart people, but having watched from the sidelines, QC seems a lot like fusion research. Every year more claims are made, more venture capital gets burned, but we don't seem to get closer to useful systems.

Consider D-Wave Systems. They've been trying to build a QC for twenty years, and indeed do have products more or less on the market, including, it's claimed, one of 1024 q-bits. But there's a lot of controversy about whether their machines are quantum computers at all, and whether they offer any speedup over classical machines. One would think that if a 1K q-bit machine really did work, the press would be all abuzz and we'd be hearing constantly of new incredible results. Instead, the machines seem to disappear into research labs.

Mr. Dyakonov notes that optimistic people expect useful QCs in the next 5-10 years; those less sanguine expect 20-30 years, a prediction that hasn't changed in two decades. He thinks a window of many decades to never is more realistic. Experts think that a useful machine, one that can do the sort of calculations your laptop is capable of, will require between 1000 and 100,000 q-bits. To me, this level of uncertainty suggests that there is a profound lack of knowledge about how these machines will work and what they will be able to do.

According to the author, a 1000 q-bit machine can be in 2^1000 states (a classical machine with N transistors can be in only 2^N states), which is about 10^300, or more than the number of sub-atomic particles in the universe. At 100,000 q-bits we're talking 10^30,000, a mind-boggling number.

Because of noise, expect errors. Some theorize that those errors can be eliminated by adding q-bits, on the order of 1000 to 100,000 additional per q-bit. So a useful machine will need at least millions, or perhaps many orders of magnitude more, of these squirrelly microdots that are tamed only by keeping them at 10 millikelvin.
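
The arithmetic behind those numbers is easy to check. A quick back-of-the-envelope script (the figures are the article's; the rounding is mine):

    import math

    # n q-bits span 2**n basis states; how many decimal digits is that?
    for n in (1_000, 100_000):
        print(f"2^{n} is about 10^{n * math.log10(2):.0f}")
    # -> 2^1000 is about 10^301 (the article rounds to 10^300)
    # -> 2^100000 is about 10^30103 (roughly 10^30,000)

    # Error correction: 1,000 to 100,000 extra physical q-bits per logical one.
    logical = 1_000
    for overhead in (1_000, 100_000):
        print(f"{logical:,} logical x {overhead:,} = {logical * overhead:,} physical")
    # -> 1,000,000 to 100,000,000 physical q-bits: "at least millions"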

A related article in Spectrum mentions that a committee of prestigious researchers, tasked with assessing the probability of success with QC, concluded that:

"[I]t is highly unexpected" that anyone will be able to build a quantum computer that could compromise public-key cryptosystems (a task that quantum computers are, in theory, especially suitable for tackling) in the coming decade. And while less-capable "noisy intermediate-scale quantum computers" will be built within that time frame, "there are at present no known algorithms/applications that could make effective use of this class of machine," the committee says."

I don't have a dog in this fight, but am relieved that useful QC seems to be no closer than The Distant Shore (to quote Jan de Hartog, one of my favorite writers). If it were feasible to easily break encryption schemes, banking and other systems could collapse. I imagine blockchain would fail as hash algorithms became reversible. The resulting disruption would not be healthy for our society.

On the other hand, Bruce Schneier's article in the March issue of IEEE Computing Edge suggests that QC won't break all forms of encryption, though he does think a lot of our current infrastructure will be vulnerable. The moral: if and when QC becomes practical, expect chaos.

I was once afraid of quantum computing, as it involves mechanisms that I'll never understand. But then I realized those machines will have an API. Just as one doesn't need to know how a computer works to program in Python, we'll be insulated from the quantum horrors by layers of abstraction.

Originally posted here

Read more…


This complete guide is a 212-page eBook and is a must-read for business leaders, product managers and engineers who want to implement, scale and optimize their business with IoT communications.

Whether you want to attempt initial entry into the IoT sphere or expand existing deployments, this book can help with your goals, providing a deep understanding of all aspects of IoT.

CLICK HERE TO DOWNLOAD

Read more…

Edge Products Are Now Managed In The Cloud

Now more than ever, there are billions of edge products in the world. But without proper cloud computing, making the most of devices that run on Linux or any other OS would not be possible.

And so, a question most people keep asking is: which Software-as-a-Service platform can most effectively manage edge devices through cloud computing? While edge device management may not be a new idea, the fact that the cloud computing space is not yet fully exploited means there is still a lot to do.

Remote product management is especially necessary for the 21st century and beyond. Because of the increasing number of devices connected to the Internet of Things (IoT), a reliable SaaS platform should help with fixing software glitches from anywhere in the world. From smart homes and stereo speakers to cars and personal computers, any product that is connected to the internet needs real-time protection from hacking threats such as unlawful access to business or personal data.

Data, being the most vital asset, is constantly at risk, especially if individuals using edge products do not connect to trusted, reliable, and secure edge device management platforms.

Bridges the Gap Between Complicated Software And End Users

Cloud computing is the new frontier through which SaaS platforms help manage edge devices in real time. But something even more noteworthy is the increasing amount of complicated software that now runs edge devices in homes and workplaces.

Edge device management, therefore, ensures everything runs smoothly. From fixing bugs and running debugging commands to deploying software patches in real time, cloud management of edge products bridges the gap between end users and the complicated software that is becoming the norm these days.

Even more importantly, going beyond physical firewall barriers is a major necessity in the remote management of edge devices. A reliable Software-as-a-Service platform therefore ensures that data encryption for edge devices is not only hackproof but also that data is accessed only by the right people. Moreover, the deployment of secure routers and access tools is especially critical in cloud computing when managing edge devices. And so, the developers behind successful SaaS platforms conduct regular security checks over the cloud and design and implement solutions for edge products.
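
To illustrate the shape of such a platform, here is a minimal device-side agent that polls the management cloud over TLS for pending work. Every name, endpoint, and field below is hypothetical; real SaaS platforms each ship their own agents and APIs.

    import json
    import ssl
    import time
    import urllib.request

    # Hypothetical endpoint and token -- placeholders, not a real service.
    COMMANDS_URL = "https://manage.example.com/api/v1/devices/dev-001/commands"
    TOKEN = "device-auth-token"

    def poll_for_commands():
        """Ask the management cloud for pending work for this device."""
        req = urllib.request.Request(
            COMMANDS_URL, headers={"Authorization": f"Bearer {TOKEN}"}
        )
        # Encrypted transport by default: the "beyond the firewall" part.
        ctx = ssl.create_default_context()
        with urllib.request.urlopen(req, context=ctx) as resp:
            return json.load(resp)

    while True:
        for cmd in poll_for_commands():
            if cmd["type"] == "apply_patch":
                pass  # verify the patch signature, install, report status
            elif cmd["type"] == "run_debug":
                pass  # run the requested diagnostic and upload the output
        time.sleep(60)  # a modest poll interval keeps traffic predictable

Outbound polling like this is one common design choice precisely because it works through firewalls and NAT without opening inbound ports on the device.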

Reliable IT Infrastructure Is Necessary

Software-as-a-service platforms that manage edge devices focus on having a reliable IT infrastructure and centralized systems through which they can conduct cloud computing. It is all about remotely managing edge devices with the help of an IT infrastructure that eliminates challenges such as connectivity latency.

Originally posted here

Read more…

Industrial IoT Revolution

Why the Nvidia Jetson Nano is responsible for the biggest industrial IoT revolution these days

 
c1f0a2_ecaa338269684f82b2661b550075f528~mv2.webp
 
 

It feels like yesterday when the Raspberry Pi Foundation released its first Single Board Computer (SBC) to the market. Back in 2012, Raspberry Pi wasn't alone in the growing SBC market; however, it was the first to make a community-based product that brought the hardware and the software ecosystem into beautiful harmony on the internet. Before those days, embedded Linux-based SBCs and SOMs were a domain for Linux kernel and embedded hardware experts, with no easy-to-use tools, no ready-made Linux distros, and, most importantly, without the enormous number of questions and answers across the internet on anything related.

Today, 8 years later, the "2012 revolution" is happening again

This time, it took a year to understand the impact of the new 'kid' in the market, but now there are a few indications that definitely mark the route to a revolution.

The Raspberry Pi was the first to make embedded Linux easy while keeping the advantages of reliability and flexibility in fitting different kinds of industry applications. It's almost impossible to ignore the variety of industries where the Raspberry Pi sits at the heart of products, saving time-to-market and costs. The power of this magical board lies on the software side: the Raspberry Pi Foundation and their community worked hard across the years to improve and share their knowledge, and at the same time, without notice or targeting, they brought Pi development to an extremely "serverless" level.

The Nvidia Jetson Nano

Let's stop talking about the Raspberry Pi and focus on today's industry needs, to better understand why the new kid in town is here to change the market of IoT and smart products forever.

 
Why do we need to thank Nvidia and the Jetson Nano?
 

The market is moving forward. AI, robotics, amazing-looking app GUIs, image processing, and long data calculations have all become the new standard for smart edge products.

If a few years ago it was enough to connect your product to the cloud and get something valuable out of it, today product managers and developers compete in a much tougher era of the industry. This time, the Raspberry Pi can't be the technology hero again: its resources are limited, and the ecosystem has started looking toward a better-fit solution.

 

NVIDIA Jetson devices in Upswift.io device management platform

The Jetson Nano is the first SBC to understand the combination that will drive new products to use it. It's the first SBC designed with powerful industrial use cases in mind, while not forgetting the prototyping stage and the harmony that gave the Raspberry Pi its success. It's the first solution to bring the whole package to developers and hardware engineers with a "SaaS" feel: the OS is already polished thanks to Ubuntu, there are plenty of software instructions from Nvidia and open-source, ready-to-use tools custom-made for the Jetson family, and hardware engineers are free to go with the System On Module (SOM), which connects to a carrier board that includes all the necessary inputs and outputs to make the development stage even faster.

The Jetson Nano combination basically provides the first real infrastructure for producing a "2020" product with complex software on a minimal budget and time-to-market. The Jetson Nano enables developers and product managers to imagine further without compromises, easily bringing tough software missions to the edge.
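
As a small illustration of that out-of-the-box feel (assuming a Nano flashed with NVIDIA's JetPack image and NVIDIA's Jetson build of PyTorch installed, which varies by JetPack release), confirming the board is ready for an AI workload takes only a few lines:

    import torch  # assumes NVIDIA's Jetson-compatible PyTorch build

    # On a JetPack-flashed Nano this should report the board's 128-core
    # Maxwell GPU, confirming it is ready for edge AI workloads.
    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))
        x = torch.rand(1024, 1024, device="cuda")
        print("GPU matmul OK:", (x @ x).shape)
    else:
        print("CUDA not available -- check the JetPack / PyTorch install")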

Originally posted here

Read more…

Industrial Prototyping for IoT

I-Pi SMARC.jpg

ADLINK is a global leader in edge computing, driving data-to-decision applications across industries. The company recently introduced the I-Pi SMARC for Industrial IoT prototyping.

-       ADLINK I-Pi SMARC consists of a simple carrier board paired with a SMARC Computer on Module.

-       SMARC modules are available from the entry-level Rockchip PX30 to the top-of-the-line Intel Apollo Lake.

-       SMARC modules are specifically designed for typical industrial embedded applications that require long life, high MTBF and strict revision control.

-       Use popular off-the-shelf sensors and create prototypes or proofs of concept on short notice.

Additional information can be found here

 

Read more…

By: Kelly McNelis

We have faced unprecedented disruption from the many challenges of COVID-19, and PTC’s LiveWorx was no exception. The definitive digital transformation event went virtual this year, and despite the transition from physical to digital, LiveWorx delivered.

Of the many insightful virtual keynotes, one that caught everyone’s attention was ‘Digital Transformation: The Technology & Support You Need to Succeed,’ presented by PTC’s Executive Vice President (EVP) of Products, Kevin Wrenn, and PTC’s EVP and Chief Customer Officer, Eduarda Camacho.

Their keynote focused on how companies should be prioritizing the use of best-in-class technology that will meet their changing needs during times of disruption and accelerated digital transformation. Wrenn and Camacho highlighted five of our customers through interactive case studies on how they are using PTC technology to capitalize on digital transformation to thrive in an era of disruption.


Below is a summary of the five customers and their stories that were highlighted during the keynote.

1. Royal Enfield (Mass Customization)

Royal Enfield is an Indian motorcycle company that has been manufacturing motorbikes since 1901. The company has British roots, and its main customer base is located in India and Europe. Riders of Royal Enfield want their bikes to be particular to their brand, so the company worked to better manage the complexities of mass customization and respond to market demands.

Royal Enfield is a long-time PTC customer, but they were on old versions of PTC technology. They first upgraded Creo and Windchill to the latest releases so they could leverage the new capabilities. They then transformed their processes for platform and variant designs, introduced simulation much earlier by using Creo Simulation Live, and leveraged generative design by bringing AI into engineering and applying it to complex custom-forged engine and chassis components. Finally, they retrained and retooled their engineering staff to fully leverage the power of the new processes and technologies.

The entire Royal Enfield team now has digital capabilities that accelerate new product designs, variants, and accessories for personalization; as a result, they are able to deliver a much-shortened design cycle. Royal Enfield is continuing their digital transformation trend, and will invest in new ways to create value while leveraging augmented reality with PTC's Vuforia suite.

2. VCST (Manufacturing Efficiency, Quality, and Innovation)

VCST is part of the BMT Group and is a world-class automotive supplier of precision-machined powertrain and brake components. Their problem was the high cost of their production facility in Belgium: they either needed to improve cost efficiency in the plant or face the potential of shutting down the facility and relocating it to another region. VCST decided to implement ThingWorx so that anyone can have instant visibility into asset status and performance. VCST is also digitizing maintenance requests and the ability to inquire about spare parts, to improve overall efficiency in support of their cost-reduction goals.

Additionally, VCST has a goal of zero customer complaints; if any quality problems reach their customers, they can be required to do 100% inspection until the problem is solved. Moreover, as cars have gotten quieter with electrification, the noise from the gears has become an issue, which puts pressure on VCST to innovate and reduce gear noise.

VCST has again relied on ThingWorx and Windchill to collect and share data for joint collaborative analysis to innovate and reduce gear noise. VCST also plans to use Vuforia Expert Capture and Vuforia Chalk to train maintenance workers to further improve their efficiency and cost effectiveness. The company is not done with their digital transformation, and they have plans to implement Creo and Windchill to enable end-to-end digital thread connectivity to the factory.

3. BID Group Holdings (Connected Product)

BID Group Holdings operates in the wood processing industry. It is one of the largest integrated suppliers and the North American leader in the field. The purpose of BID Group is to deliver a complete range of innovative equipment, digital technologies, turnkey installations, and aftermarket services to their customers. BID Group decided to focus on their areas of expertise, and rely on PTC, Microsoft, and Rockwell Automation’s combined capabilities and scale to deliver SaaS-type solutions to their own industry.

Leveraging this combined power, the BID Group developed a digital strategy for service to improve mill efficiency and profitability. The solution is named OPER8 and was built on the ThingWorx platform. This allowed BID Group to provide their customers an out-of-the-box solution with efficient time-to-value and low cost of ownership. BID Group is continuing to work with PTC and Rockwell Automation to develop additional solutions that will reduce downtime by adding a predictive analytics module to OPER8, using ThingWorx Analytics and LogixAI.

4. Hitachi (Service Optimization)

Hitachi operates an extensive service division that ensures its customers’ data systems remain up and running. Their challenge was not only to meet their customers’ uptime Service Level Agreements, but to do it without killing their cost structure. Hitachi decided to implement PTC’s Servigistics Service Parts Management software to ensure the right parts are available when and where they are needed for service. With Servigistics, Hitachi was able to accomplish their needs while staying cost effective and delighting their customers.

Hitachi runs on the cloud, which allows them to upgrade to current releases more often, take advantage of new functionality, and avoid unexpected costs.

PTC has driven engagement and support for Hitachi through the PTC Community, and encourages all customers to utilize this platform. This network of collaborative spaces is a gathering place for PTC customers and partners to showcase their work, inspire each other, and share ideas or best practices in order to expand the value of their PTC solutions and services.

5. COVID-19 Response 

COVID-19 has put significant strain on the world’s hospitals and healthcare infrastructure, and COVID hospitalization rates brought into question hospitals’ capacity to handle cases. Many countries began considering the value field hospitals could bring in safely caring for patients and easing the admissions numbers of ‘regular’ hospitals. However, the complication is that field hospitals have essentially none of the isolation or air filtration capability required for treating COVID patients or protecting healthcare workers.

As a result, the US Army Corps of Engineers put out specifications to create self-contained isolation units (SCIUs), which are fully functioning hospital rooms that can be transported or built onsite. But the assembly needed to happen fast, and a group of companies (including PTC) led by The Innovation Machine rallied to help design and define the SCIUs.

With buy-in from numerous companies, a common platform was needed for collaboration. PTC felt compelled to react, and many PTC customers and partners joined in to help create a collaboration platform, with cloud-based Windchill as the foundation. But PTC didn’t just provide software to this collaboration; PTC also contributed digital thread and design advice to help the group solve some of the major challenges. The design is the result of many companies coming together, with deployments across various US state governments, agencies, and FEMA.

Final Thoughts

All of the above customers approached digital transformation as a business imperative. They all had sizeable challenges that needed to be solved and took leadership positions to implement plans that leveraged digital transformation technologies combined with new processes.

PTC will continue to innovate across the digital transformation portfolio and is committed to ensuring that customer success offerings capture value faster and provide the best outcomes.

Original Post Link: https://www.ptc.com/en/product-lifecycle-report/liveworx-digital-transformation–technology-and-support-you-need-to-succeed

Author Bio: Kelly is a corporate communications specialist at PTC. Her responsibilities include drafting and approving content for PTC’s external and social media presence and supporting communications for the Chief Strategy Officer. Kelly has previous experience as a communications specialist working to create and implement materials for the Executive Vice President of the Products Organization and senior management team members.

 

Read more…

Helium Expands to Europe

Helium, the company behind one of the world’s first peer-to-peer wireless networks, is announcing the introduction of Helium Tabs, its first branded IoT tracking device that runs on The People’s Network. In addition, after launching its network in 1,000 cities in North America within one year, the company is expanding to Europe to address growing market demand with Helium Hotspots shipping to the region starting July 2020. 

Since its launch in June 2019, Helium quickly grew its footprint, with Hotspots covering more than 700,000 square miles across North America. Helium is now expanding to Europe to allow for seamless use of connected devices across borders. Powered by entrepreneurs looking to own a piece of the people-powered network, Helium’s open-source blockchain technology incentivizes individuals to deploy Hotspots and earn Helium (HNT), a new cryptocurrency, for simultaneously building the network and enabling IoT devices to send data to the Internet. When connected with other nearby Hotspots, these Hotspots act as the backbone of the network. 

“We’re excited to launch Helium Tabs at a time where we’ve seen incredible growth of The People’s Network across North America,” said Amir Haleem, Helium’s CEO and co-founder. “We could not have accomplished what we have done, in such a short amount of time, without the support of our partners and our incredible community. We look forward to launching The People’s Network in Europe and eventually bringing Helium Tabs and other third-party IoT devices to consumers there.”  

Introducing Helium Tabs that Run on The People’s Network
Unlike other tracking devices, Tabs uses LongFi technology, which combines the LoRaWAN wireless protocol with the Helium blockchain and provides network coverage up to 10 miles away from a single Hotspot. This is a game-changer compared to WiFi- and Bluetooth-enabled tracking devices, which only work up to 100 feet from a network source. What’s more, due to Helium’s unique blockchain-based rewards system, Hotspot owners will be rewarded with Helium (HNT) each time a Tab connects to the network. 

In addition to its increased growth with partners and customers, Helium has also seen accelerated expansion of its Helium Patrons program, introduced in late 2019. Together, these have helped to strengthen the network. 

Patrons are entrepreneurial customers who purchase 15 or more Hotspots to help blanket their cities with coverage and enable customers who use the network. In return, they receive discounts, priority shipping, network tools, and Helium support. Currently, the program has more than 70 Patrons throughout North America and is expanding to Europe. 

Key brands that use the Helium Network include: 

  • Nestlé ReadyRefresh, a beverage delivery service company
  • Agulus, an agricultural tech company
  • Conserv, a collections-focused environmental monitoring platform

Helium Tabs will initially be available to existing Hotspot owners for $49. The Helium Hotspot is now available for purchase online in Europe for €450.

Read more…