


Then it seemed that overnight, millions of workers worldwide were told to isolate and work from home as best as they could. Businesses were suddenly forced to enable remote access for hundreds or thousands of users, all at once, from anywhere across the globe. Many companies that already offered VPN services to a small group of remote workers scurried to extend those capabilities to the much larger workforce sequestering at home. It was a decision made in haste out of necessity, but now it’s time to consider, is VPN the best remote access technology for the enterprise, or can other technologies provide a better long-term solution?

Long-term Remote Access Could Be the Norm for Some Time

Some knowledge workers are trickling back to their actual offices, but many more are still at home and will be for some time. Global Workplace Analytics estimates that 25-30% of the workforce will still be working from home multiple days a week by the end of 2021. Others may never return to an official office, opting to remain a work-from-home (WFH) employee for good.

Consequently, enterprises need to find a remote access solution that gives home-based workers a similar experience as they would have in the office, including ease of use, good performance, and a fully secure network access experience. What’s more, the solution must be cost effective and easy to administer without the need to add more technical staff members.

VPNs are certainly one option, but not the only one. Other choices include appliance-based SD-WAN and SASE. Let’s have a look at each approach.

VPNs Weren’t Designed to Support an Entire Workforce

While VPNs are a useful remote access solution for a small portion of the workforce, they are an inefficient technology for giving remote access to a very large number of workers. VPNs are designed for point-to-point connectivity, so each secure connection between two points – typically a remote worker and a network access server (NAS) in a datacenter – requires its own VPN link. Each NAS has a finite capacity for simultaneous users, so for a large remote user base, some serious infrastructure may be needed in the datacenter.

Performance can be an issue. With a VPN, all communication between the user and the VPN gateway is encrypted. The encryption process takes time, and depending on the type of encryption used, this may add noticeable latency to Internet communications. More important, however, is the latency added when a remote user needs access to IaaS and SaaS applications and services. The traffic path is convoluted because it must travel from the end user to the NAS before going out to the cloud, and vice versa on the way back.
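The cost of that hairpin is easy to quantify. A toy calculation, with made-up latency figures (the 30 ms, 25 ms, 40 ms, and 5 ms values below are illustrative assumptions, not measurements), shows how backhauling cloud-bound traffic through a datacenter NAS compounds round-trip time:

```python
# Illustrative round-trip latency for a remote user reaching a SaaS app,
# direct vs. hairpinned through a datacenter VPN concentrator.
# All figures are hypothetical example values, not measurements.

def round_trip_ms(*one_way_legs_ms):
    """Total round-trip time: each one-way leg is traversed twice."""
    return 2 * sum(one_way_legs_ms)

# Direct path: user -> SaaS provider
direct = round_trip_ms(30)            # 30 ms one way

# VPN path: user -> datacenter NAS -> out to the SaaS provider
vpn = round_trip_ms(25, 40)           # hairpin through the datacenter
vpn += 5                              # assumed encryption/decryption overhead

print(f"direct: {direct} ms, via VPN: {vpn} ms")
```

Even with generous assumptions, the detour more than doubles the round trip, which is exactly the "convoluted path" problem described above.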

An important issue with VPNs is that they provide overly broad access to the entire network without the option of controlling granular user access to specific resources. Stolen VPN credentials have been implicated in several high-profile data breaches. By using legitimate credentials and connecting through a VPN, attackers were able to infiltrate and move freely through targeted company networks. What’s more, there is no scrutiny of the security posture of the connecting device, which could allow malware to enter the network via insecure user devices.

SD-WAN Brings Intelligence into Routing Remote Users’ Traffic

Another option for providing remote access for home-based workers is appliance-based SD-WAN. It brings a level of intelligence to the connectivity that VPNs don’t have. Lee Doyle, principal analyst with Doyle Research, outlines the benefits of using SD-WAN to connect home office users to their enterprise network:

  • Prioritization for mission-critical and latency-sensitive applications
  • Accelerated access to cloud-based services
  • Enhanced security via encryption, VPNs, firewalls and integration with cloud-based security
  • Centralized management tools for IT administrators
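The first of those benefits, prioritization of mission-critical and latency-sensitive applications, comes down to classifying traffic by application and serving the important classes first. A minimal sketch, with an invented application-to-priority mapping (real SD-WAN appliances classify far more granularly):

```python
# Minimal sketch of SD-WAN-style application prioritization.
# The app-to-class mapping and the sample queue are invented for illustration.

PRIORITY = {
    "voip": 0,        # latency-sensitive: highest priority
    "erp": 1,         # mission-critical business app
    "saas": 2,        # ordinary cloud traffic
    "backup": 3,      # bulk transfer: lowest priority
}

def classify(packets):
    """Order packets so higher-priority application traffic goes first."""
    return sorted(packets, key=lambda p: PRIORITY.get(p["app"], 99))

queue = [{"app": "backup"}, {"app": "voip"}, {"app": "saas"}]
print([p["app"] for p in classify(queue)])
```

A VPN, by contrast, treats all tunneled traffic alike; there is no comparable hook for telling business-critical flows apart from bulk transfers.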

One thing to consider about appliance-based SD-WAN is that it’s primarily designed for branch office connectivity—though it can accommodate individual users at home as well. However, if a company isn’t already using SD-WAN, this isn’t a technology that is easy to implement and set up for hundreds or thousands of home-based users. What’s more, a significant investment must be made in the various communication and security appliances.

SASE Provides a Simpler, More Secure, Easily Scalable Solution

Cato’s Secure Access Service Edge (or SASE) platform provides a great alternative to VPN for remote access by many simultaneous workers. The platform offers scalable access, optimized connectivity, and integrated threat prevention that are needed to support continuous large-scale remote access.

Companies that enable WFH using Cato’s platform can scale quickly to any number of remote users with ease. There is no need to set up regional hubs or VPN concentrators. The SASE service is built on top of dozens of globally distributed Points of Presence (PoPs) maintained by Cato to deliver a wide range of security and networking services close to all locations and users. The complexity of scaling is all hidden in the Cato-provided PoPs, so there is no infrastructure for the organization to purchase, configure or deploy. Giving end users remote access is as simple as installing a client agent on the user’s device, or by providing clientless access to specific applications via a secure browser.

Cato’s SASE platform employs Zero Trust Network Access in granting users access to the specific resources and applications they need to use. This granular-level security is part of the identity-driven approach to network access that SASE demands. Since all traffic passes through a full network security stack built into the SASE service, multi-factor authentication, full access control, and threat prevention are applied to traffic from remote users. All processing is done within the PoP closest to the users while enforcing all corporate network and security policies. This eliminates the “trombone effect” associated with forcing traffic to specific security choke points on a network. Further, admins have consistent visibility and control of all traffic throughout the enterprise WAN.
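The contrast with a VPN's all-or-nothing network access can be pictured as a deny-by-default, per-resource policy check. The sketch below is a simplification with hypothetical users, resources, and rules, not Cato's actual policy engine:

```python
# Hypothetical zero-trust access check: access is granted per user, per
# resource, rather than to the whole network. All names and rules invented.

POLICY = {
    ("alice", "crm"): True,
    ("alice", "payroll"): False,
    ("bob", "payroll"): True,
}

def authorize(user, resource, mfa_passed):
    """Deny by default; require MFA plus an explicit per-resource grant."""
    return mfa_passed and POLICY.get((user, resource), False)

assert authorize("alice", "crm", mfa_passed=True)
assert not authorize("alice", "payroll", mfa_passed=True)   # no grant
assert not authorize("bob", "payroll", mfa_passed=False)    # MFA failed
```

With stolen credentials alone, an attacker in this model reaches at most the handful of resources granted to that identity, rather than the entire network a VPN would expose.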

SASE Supports WFH in the Short-term and Long-term

While some workers are venturing back to their offices, many more are still working from home—and may work from home permanently. The Cato SASE platform is the ideal way to give them access to their usual network environment without forcing them to go through insecure and inconvenient VPNs.

Originally posted here


Today the world is obsessed with the IoT, as if this is a new concept. We've been building the IoT for decades, but it was only recently some marketing "genius" came up with the new buzz-acronym.

Before there was an IoT, before there was an Internet, many of us were busy networking. For the Internet itself was a (brilliant) extension of what was already going on in the industry.

My first experience with networking was in 1971 at the University of Maryland. The school had a new computer, a $10 million Univac 1108 mainframe. This was a massive beast that occupied most of the first floor of a building. A dual-processor machine, it was transistorized, though the control console did have some ICs. Rows of big tape drives mirrored the layman's idea of computers in those days. Many dishwasher-sized disk drives were placed around the floor and printers, card readers and other equipment were crammed into every corner. Two Fastrand drum memories, each consisting of a pair of six-foot-long counterrotating drums, stored a whopping 90 MB each. Through a window you could watch the heads bounce around.

The machine was networked. It had a 300 baud modem with which it could contact computers at other universities. A primitive email system let users create mail which was queued till nightfall. Then, when demands on the machine were small, it would call the appropriate remote computer and forward mail. The system operated somewhat like today's "hot potato" packets, where the message might get delivered to the easiest machine available, which would then attempt further forwarding. It could take a week to get an email, but at least one saved the $0.08 stamp that the USPS charged.

The system was too slow to be useful. After college I lost my email account but didn't miss it at all.

By the late 70s many of us had our own computers. Mine was a home-made CP/M machine with a Z80 processor and a small TV set as a low-res monitor. Around this time Compuserve came along and I, like so many others, got an account with them. Among other features, users had email addresses. Pretty soon it was common to dial into their machines over a 300 baud modem and exchange email and files. Eventually Compuserve became so ubiquitous that millions were connected, and at my tools business during the 1980s it was common to provide support via this email. The CP/M machine gave way to a succession of PCs, and modems ramped up to 56K baud.

My tools business expanded rapidly and soon we had a number of employees. Sneakernet was getting less efficient so we installed an Arcnet network using Windows 3.11. That morphed into Ethernet connections, though the cursing from networking problems multiplied about as fast as the data transfers. Windows was just terrible at maintaining reliable connectivity.

In 1992 Mike Lee, a friend from my Boys Night Out beer/politics/sailing/great friends group, which still meets weekly (though lately virtually), came by the office with his laptop. "You have GOT to see this" he intoned, and he showed me the world-wide web. There wasn't much to see as there were few sites. But the promise was shockingly clear. I was stunned.

The tools business had been doing well. Within a month we spent $100k on computers, modems and the like and had a new business: Softaid Internet Services. SIS was one of Maryland's first ISPs and grew quickly to several thousand customers. We had a T1 connection to MAE-EAST in the DC area which gave us a 1.5 Mb/s link… for $5000/month. Though a few customers had ISDN connections to us, most were dialup, and our modem shelf grew to over 100 units with many big fans keeping the things cool.

The computers all ran BSD Unix, which was my first intro to that OS.

I was only a few months back from a failed attempt to singlehand my sailboat across the Atlantic and had written a book-length account of that trip. I hastily created a web page of that book to learn about using the web. It is still online and has been read several million times in the intervening years. We put up a site for the tools business which eventually became our prime marketing arm.

The SIS customers were sometimes, well, "interesting." There was the one who claimed to be a computer expert, but who tried to use the mouse by waving it around over the desk. Many had no idea how to connect a modem. Others complained that our service dropped out whenever mom picked up the phone to make a call, right over the modem's beeping. A lot of handholding and training was required.

The logs showed a shocking (to me at the time) amount of porn consumption. Over lunch an industry pundit explained how porn drove all media, from the earliest introduction of printing hundreds of years earlier.

The woman who ran the ISP was from India. She was delightful and had a wonderful marriage. She later told me it had been arranged; they met on their wedding day. She came from a remote and poor village and had had no exposure to computers, or electricity, till emigrating to the USA.

Meanwhile many of our tools customers were building networking equipment. We worked closely with many of them and often had big routers, switches and the like onsite that our engineers were working on. We worked on a lot of what we'd now call IoT gear: sensors et al connected to the net via a profusion of interfaces.

I sold both the tools and Internet businesses in 1997, but by then the web and Internet were old stories.

Today, like so many of us, I have a fast (250 Mb/s) and cheap connection into the house with four wireless links and multiple computers chattering to each other. Where in 1992 the web was incredibly novel and truly lacking in useful functionality, now I can't imagine being deprived of it. Remember travel agents? Ordering things over the phone (a phone that had a physical wire connecting it to Ma Bell)? Using 15 volumes of an encyclopedia? Physically mailing stuff to each other?

As one gets older the years spin by like microseconds, but it is amazing to stop and consider just how much this world has changed. My great grandfather lived on a farm in a world that changed slowly; he finally got electricity in his last year of life. His daughter didn't have access to a telephone till later in life, and my dad designed spacecraft on vellum and starched linen using a slide rule. My son once saw a typewriter and asked me what it was; I mumbled that it was a predecessor of Microsoft Word.

That he understood. I didn't have the heart to try and explain carbon paper.

Originally posted HERE.


When I think about the things that held the planet together in 2020, it was digital experiences delivered over wireless connectivity that made remote things local.

While heroes like doctors, nurses, first responders, teachers, and other essential personnel bore the brunt of the COVID-19 response, billions of people around the world found themselves cut off from society. In order to keep people safe, we were physically isolated from each other. Far beyond the six feet of social distancing, most of humanity weathered the storm from their homes.

And then little by little, old things we took for granted, combined with new things many had never heard of, pulled the world together. Let’s take a look at the technologies and trends that made the biggest impact in 2020 and where they’re headed in 2021:

The Internet

The global Internet infrastructure from which everything else is built is an undeniable hero of the pandemic. This highly distributed network designed to withstand a nuclear attack performed admirably as usage by people, machines, critical infrastructure, hospitals, and businesses skyrocketed. Like the air we breathe, this primary facilitator of connected, digital experiences is indispensable to our modern society. Unfortunately, the Internet is also home to a growing cyberwar and security will be the biggest concern as we move into 2021 and beyond. It goes without saying that the Internet is one of the world’s most critical utilities along with water, electricity, and the farm-to-table supply chain of food.

Wireless Connectivity

People are mobile and they stay connected through their smartphones, tablets, in cars and airplanes, on laptops, and other devices. Just like the Internet, the cellular infrastructure has remained exceptionally resilient to enable communications and digital experiences delivered via native apps and the web. Indoor wireless connectivity continues to be dominated by WiFi at home and all those empty offices. Moving into 2021, the continued rollout of 5G around the world will give cellular endpoints dramatic increases in data capacity and WiFi-like speeds. Additionally, private 5G networks will challenge WiFi as a formidable indoor option, but WiFi 6E with increased capacity and speed won’t give up without a fight. All of these developments are good for consumers who need to stay connected from anywhere like never before.

Web Conferencing

With many people stuck at home in 2020, web conferencing technology took the place of traveling to other locations to meet people or receive education. This technology isn’t new and includes familiar players like GoToMeeting, Skype, WebEx, Google Hangouts/Meet, BlueJeans, FaceTime, and others. Before COVID, these platforms enjoyed success, but most people preferred to fly on airplanes to meet customers and attend conferences while students hopped on the bus to go to school. In 2020, “necessity is the mother of invention” took hold and the use of Zoom and Teams skyrocketed as airplanes sat on the ground while business offices and schools remained empty. These two platforms further increased their stickiness by increasing the number of visible people and adding features like breakout rooms to meet the demands of businesses, virtual conference organizers, and school teachers. Despite the rollout of the vaccine, COVID won’t be extinguished overnight and these platforms will remain strong through the first half of 2021 as organizations rethink where and when people work and learn. There are far too many players in this space, so look for some consolidation.

E-Commerce

“Stay at home” orders and closed businesses gave e-commerce platforms a dramatic boost in 2020 as they took the place of shopping at stores or going to malls. Amazon soared to even higher heights, Walmart upped their game, Etsy brought the artsy, and thousands of Shopify sites delivered the goods. Speaking of delivery, the empty city streets became home to fleets of FedEx, Amazon, UPS, and DHL trucks bringing packages to your front doorstep. Many retail employees traded in working at customer-facing stores for working in distribution centers as long as they could outperform robots. Even though people are looking forward to hanging out at malls in 2021, the e-commerce, distribution center, delivery truck trinity is here to stay. This ball was already in motion and got a rocket boost from COVID. This market will stay hot in the first half of 2021 and then cool a bit in the second half.

Ghost Kitchens

The COVID pandemic really took a toll on restaurants in 2020, with many of them going out of business permanently. Those that survived had to pivot to digital and other ways of doing business. High-end steakhouses started making burgers on grills in the parking lot, while takeout pizzerias discovered they finally had the best business model. Having a drive-thru lane was definitely one of the keys to success in a world without waiters, busboys, and hosts. “Front of house” was shut down, but the “back of house” still had a pulse. Adding mobile web and native apps that allowed customers to easily order from operating “ghost kitchens” and pay with credit cards or Apple/Google/Samsung Pay enabled many restaurants to survive. A combination of curbside pickup and delivery from the likes of DoorDash, Uber Eats, Postmates, Instacart and Grubhub made this business model work. A surge in digital marketing also took place where many restaurants learned the importance of maintaining a relationship with their loyal customers via connected mobile devices. For the most part, 2021 has restaurateurs hoping for 100% in-person dining, but a new business model that looks a lot like catering + digital + physical delivery is something that has legs.

The Internet of Things

At its very essence, IoT is all about remotely knowing the state of a device or environmental system along with being able to remotely control some of those machines. COVID forced people to work, learn, and meet remotely and this same trend applied to the industrial world. The need to remotely operate industrial equipment or an entire “lights out” factory became an urgent imperative in order to keep workers safe. This is yet another case where the pandemic dramatically accelerated digital transformation. Connecting everything via APIs, modeling entities as digital twins, and having software bots bring everything to life with analytics has become an ROI game-changer for companies trying to survive in a free-falling economy. Despite massive employee layoffs and furloughs, jobs and tasks still have to be accomplished, and business leaders will look to IoT-fueled automation to keep their companies running and drive economic gains in 2021.
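The digital-twin idea mentioned above can be sketched in a few lines: a software object mirrors the last reported state of a physical machine, and a simple rule stands in for the analytics bot. The device name, telemetry fields, and threshold below are all invented for illustration:

```python
# Toy digital twin: a software object mirroring the last reported state of
# a physical machine so operators can inspect it remotely. The device id,
# field names, and the 80 C threshold are hypothetical.

class PumpTwin:
    def __init__(self, device_id):
        self.device_id = device_id
        self.state = {}                    # last known telemetry

    def ingest(self, telemetry):
        """Update the twin from a device report (e.g. an API or MQTT feed)."""
        self.state.update(telemetry)

    def needs_attention(self):
        """A simple analytics rule a monitoring bot might run."""
        return self.state.get("temp_c", 0) > 80 or not self.state.get("running", True)

twin = PumpTwin("pump-42")
twin.ingest({"temp_c": 85, "running": True})
print(twin.needs_attention())
```

Scale this pattern across thousands of assets and you get the remote, "lights out" operation the passage describes: the twins answer questions so that people don't have to walk the factory floor.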

Streaming Entertainment

Closed movie theaters, football stadiums, bowling alleys, and other sources of entertainment left most people sitting at home watching TV in 2020. This turned into a dream come true for streaming entertainment companies like Netflix, Apple TV+, Disney+, HBO Max, Hulu, Amazon Prime Video, YouTube TV, and others. That said, Quibi and Facebook Watch didn’t make it. The idea of binge-watching shows during the weekend turned into binge-watching every season of every show almost every day. Delivering all these streams over the Internet via apps has made it easy to get hooked. Multiplayer video games fall in this category as well and represent an even larger market than the film industry. Gamers socially distanced as they played each other from their locked-down homes. The rise of cloud gaming combined with the rollout of low-latency 5G and Edge computing will give gamers true mobility in 2021. On the other hand, the video streaming market has too many players and looks ripe for consolidation in 2021 as people escape the living room once the vaccine is broadly deployed.

Healthcare

With doctors and nurses working around the clock as hospitals and clinics were stretched to the limit, it became increasingly difficult for non-COVID patients to receive the healthcare they needed. This unfortunate situation gave tele-medicine the shot in the arm (no pun intended) it needed. The combination of healthcare professionals delivering healthcare digitally over widespread connectivity helped those in need. This was especially important in rural areas that lacked the healthcare capacity of cities. Concurrently, the Internet of Things is making deeper inroads into delivering the health of a person to healthcare professionals via wearable technology. Connected healthcare has a bright future that will accelerate in 2021 as high-bandwidth 5G provides coverage to more of the population to facilitate virtual visits to the doctor from anywhere.

Working and Living

As companies and governments told their employees to work from home, it gave people time to rethink their living and working situation. Lots of people living in previously hip, urban, high-rise buildings found themselves residing in not-so-cool, hollowed-out ghost towns comprised of boarded-up windows and closed bars and cafés. Others began to question why they were living in areas with expensive real estate and high taxes when they no longer had to be close to the office. This led to a 2020 COVID exodus out of pricey apartments/condos downtown to cheaper homes in distant suburbs as well as the move from pricey areas like Silicon Valley to cheaper destinations like Texas. Since you were stuck in your home, having a larger house with a home office, fast broadband, and a backyard became the most important thing. Looking ahead to 2021, a hybrid model of work-from-home plus occasionally going into the office is here to stay as employees will no longer tolerate sitting in traffic two hours a day just to sit in a cubicle in a skyscraper. The digital transformation of how and where we work has truly accelerated.

Data and Advanced Analytics

Data has shown itself to be one of the world’s most important assets during the time of COVID. Petabytes of data have continuously streamed in from all over the world letting us know the number of cases, the growth or decline of infections, hospitalizations, contact-tracing, free ICU beds, temperature checks, deaths, and hotspots of infection. Some of this data has been reported manually while lots of other sources are fully automated from machines. Capturing, storing, organizing, modeling and analyzing this big data has elevated the importance of cloud and edge computing, global-scale databases, advanced analytics software, and the growing importance of machine learning. This is a trend that was already taking place in business and now has a giant spotlight on it due to its global importance. There’s no stopping the data + advanced analytics juggernaut in 2021 and beyond.
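The continuous rollup this paragraph describes is, at its simplest, streaming aggregation: fold each incoming report into running totals as it arrives, rather than batch-processing everything later. A toy sketch with invented case figures:

```python
# Toy streaming aggregation of case reports, the kind of continuous rollup
# the passage describes. Regions and counts are invented sample data.

from collections import defaultdict

totals = defaultdict(int)

def ingest(report):
    """Fold one incoming report into the running per-region totals."""
    totals[report["region"]] += report["new_cases"]

stream = [
    {"region": "EU", "new_cases": 120},
    {"region": "US", "new_cases": 340},
    {"region": "EU", "new_cases": 80},
]
for r in stream:
    ingest(r)

print(dict(totals))
```

Production pipelines add persistence, windowing, and fault tolerance, but the core fold-as-you-go shape is the same.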

Conclusion

2020 was one of the worst years in human history and the loss of life was just heartbreaking. People, businesses, and our education system had to become resourceful to survive. This resourcefulness amplified the importance of delivering connected, digital experiences to make previously remote things into local ones. Cheers to 2021 and the hope for a brighter day for all of humanity.


By Larry LeBlanc

Well, it has been quite a year, hasn’t it? On the cybersecurity front, everyone is worried about malicious actors tampering with election data – but it seems they were more focused (or at least successful) in conducting ransomware attacks on hospitals.

On the IoT front we saw the disclosure of significant vulnerabilities, such as Ripple20 in June and Amnesia-33 in December, that expose the TCP/IP stacks used in millions of IoT devices. With TCP/IP serving as the arterial system of the IoT, carrying the data which is its lifeblood, these vulnerabilities again demonstrate why all organizations need a plan to rapidly perform firmware updates on their IoT devices if they want to “stop the bleeding” from these types of vulnerabilities.

So, what might the new year bring in terms of IoT security? I can’t say my crystal ball is crystal clear, but here are three predictions I am willing to make about IoT security developments in 2021.

Security as a Service Rapidly Expands into the IoT Market 

Companies are increasingly seeking to outsource their on-premises, cloud, and other cybersecurity needs, as the recent success of FireEye and other Security as a Service (SECaaS) providers demonstrates. These SECaaS providers combine deep cybersecurity expertise, easy-to-deploy SaaS security solutions, and economies of scale to deliver companies robust security at a lower cost than in-house alternatives.

Now, SECaaS providers are expanding into the IoT market – and I expect this expansion to pick up steam in 2021. The millions of IoT devices that companies have deployed around the world represent a massive target for cybercriminals – and companies’ lack of IoT security expertise often make these devices easy for criminals to hack. Rather than become experts in IoT security themselves, I expect companies to increasingly partner with SECaaS providers who can help them protect their IoT data from malicious actors.

However, one question regarding this emerging IoT SECaaS market is, who will dominate it? Will it be established SECaaS providers extending their existing applications to the IoT, providing companies with a comprehensive solution to their security needs? Or will start-ups with SECaaS applications specifically designed to protect IoT data be the leaders in this market? On this question, only time will tell.

Companies Will Demand That IoT Solution Providers Establish Their Security Bona Fides 

While more companies outsource IoT security to SECaaS providers, they will also now start demanding (if they have not already) that their IoT device, connectivity, cloud and other providers not just talk about being committed to making their solutions secure, but prove it. 

Specifically, they will only work with IoT solution providers that have a deep understanding of IoT security issues, have integrated robust security capabilities into products, and are working to constantly update these products’ security capabilities.

By only partnering with IoT solution providers committed to security, these companies will position themselves to deploy an IoT security plan that provides them with defense in depth, enabling them to avoid having a chink in their IoT security armor result in a data breach or loss.

Expect more companies to demand that IoT solution vendors back up their security commitment promises with clear answers to questions like:

  • Can you show how you are committed to transparency and responsiveness when dealing with security vulnerability reports?
  • Have you taken responsibility for vulnerability disclosure by becoming a CVE Numbering Authority (CNA) for your products?
  • What plans do you have to provide timely IoT device security updates, and how will you help me deploy these updates?

IoT hacks have taught companies that they must approach security as a critical requirement for their IoT deployments. After all, if they partner with IoT solution providers who are not committed to security, they risk leaving not just their IoT applications, but their entire IT environment, open to attacks from sophisticated cybercriminals.

A Return to IoT Security Basics 

Last year, many predicted that companies would deploy new AI-enabled threat intelligence and other leading-edge security technologies to protect themselves from attacks. 

However, while these new technologies do hold promise, most of the IoT hacks that took place over the past year resulted from companies simply not following basic IoT security best practices. Back in 2018 the Open Web Application Security Project (OWASP) identified the top ten most impactful IoT security vulnerabilities, and the one that led the list was weak, guessable or hardcoded passwords. Following close behind in fourth was a lack of secure update mechanisms, which can lead to devices running on old, vulnerable firmware even when new, secure updates are available. 

I wish I could say that things have changed over the past two years, but I expect when the OWASP next updates this list, you will see the same security vulnerabilities on it. When it comes to protecting your IoT data, you don’t need AI to create strong passwords, update your device firmware, or activate and use your IoT devices’ built-in firewalls – you just need to make sure you are following basic IoT security best practices. 

Not only does implementing these basic best practices make IoT applications more secure, but they can reveal opportunities to reduce operational costs. For example, a company that ensures it is updating its devices’ firmware is likely to quickly discover that over-the-air firmware updates make it easy for them to protect their IoT devices from new attacks, allowing them to eliminate expensive trips by technicians to manually update IoT devices with security and other upgrades.
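At the heart of such an over-the-air update flow is a version comparison between the firmware a device reports and the latest release the vendor has published. A minimal sketch, with a hypothetical device model and version catalog:

```python
# Sketch of the version check behind an over-the-air (OTA) firmware update.
# The model name, catalog, and version strings are hypothetical examples.

LATEST = {"sensor-a": (2, 1, 0)}   # vendor's published firmware versions

def parse(version):
    """Turn '2.0.3' into (2, 0, 3) so versions compare lexicographically."""
    return tuple(int(x) for x in version.split("."))

def needs_update(model, installed):
    """True if the device is running firmware older than the latest release."""
    return parse(installed) < LATEST.get(model, (0, 0, 0))

print(needs_update("sensor-a", "2.0.3"))   # older firmware: update needed
print(needs_update("sensor-a", "2.1.0"))   # already up to date
```

A real OTA system adds signed images, staged rollouts, and rollback on failure, but it is this check, run automatically across the fleet, that replaces the technician's truck roll.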

Perhaps I am being too optimistic, but I believe that, having seen over the past year that basic IoT security best practices can address high-profile vulnerabilities like Ripple20 and Amnesia-33, in 2021 companies will make sure they have fully implemented these best practices – and only then look for other technologies to address rarer, more sophisticated attacks. 

Originally posted HERE.


By Michele Pelino

The COVID-19 pandemic drove businesses and employees to become more reliant on technology for both professional and personal purposes. In 2021, demand for new internet-of-things (IoT) applications, technologies, and solutions will be driven by connected healthcare, smart offices, remote asset monitoring, and location services, all powered by a growing diversity of networking technologies.

In 2021, we predict that:

  • Network connectivity chaos will reign. Technology leaders will be inundated by an array of wireless connectivity options. Forrester expects that implementation of 5G and Wi-Fi technologies will decline from 2020 levels as organizations sort through market chaos. For long-distance connectivity, low-earth-orbit satellites now provide a complementary option, with more than 400 Starlink satellites delivering satellite connectivity today. We expect interest in satellite and other lower-power networking technologies to increase by 20% in the coming year.
  • Connected device makers will double down on healthcare use cases. Many people stayed at home in 2020, leaving chronic conditions unmanaged, cancers undetected, and preventable conditions unnoticed. In 2021, proactive engagement using wearables and sensors to detect patients’ health at home will surge. Consumer interest in digital health devices will accelerate as individuals appreciate the convenience of at-home monitoring, insight into their health, and the reduced cost of connected health devices.
  • Smart office initiatives will drive employee-experience transformation. In 2021, driven by the COVID-19 crisis, some firms will ditch expensive corporate real estate. However, we expect at least 80% of firms to develop comprehensive on-premises return-to-work office strategies that include IoT applications to enhance employee safety and improve resource efficiency, such as smart lighting, energy and environmental monitoring, or sensor-enabled space utilization and activity monitoring in high-traffic areas.*
  • The near ubiquity of connected machines will finally disrupt traditional business. Manufacturers, distributors, utilities, and pharma firms switched to remote operations in 2020 and began connecting previously disconnected assets. This connected-asset approach increased reliance on remote experts to address repairs without protracted downtime and expensive travel. In 2021, field service firms and industrial OEMs will rush to keep up with customer demand for more connected assets and machines.
  • Consumer and employee location data will be core to convenience. The COVID-19 pandemic elevated the importance location plays in delivering convenient customer and employee experiences. In 2021, brands must utilize location to generate convenience for consumers or employees with virtual queues, curbside pickup, and checking in for reservations. They will depend on technology partners to help use location data, as well as a third-party source of location trusted and controlled by consumers.

* Proactive firms, including Atea, have extended IoT investments to enhance employee experience and productivity by enabling employees to access a mobile app that uses data collected from light-fixture sensors to locate open desks and conference rooms. Employees can modify light and temperature settings according to personal preferences, and the system adjusts light color and intensity to better align with employees’ circadian rhythms to aid in concentration and energy levels. See the Forrester report “Rethink Your Smart Office Strategy.”

Originally posted HERE.

Read more…

By Patty Medberry

After 2020’s twists and turns, here’s hoping that 2021 ushers in a restored sense of “normal.” In thinking about what the upcoming year might bring for industrial IoT, three key trends emerge.

Trend #1: Securing operational technology (OT)

 IT will take a bolder posture to secure OT environments.

Cyber risks in industrial environments will continue to grow, causing IT to take bolder steps to secure the OT network in 2021. The CISO and IT teams have accountability for cybersecurity across the enterprise. But often they do not have visibility into the OT network. Many OT networks use traditional measures like air gapping or an industrial demilitarized zone to protect against attacks. But these solutions are rife with backdoors. For example, third-party technicians and other vendors often have remote access to update systems, machines and devices. With increasing pressure from board members and government regulators to manage IoT/OT security risks, and to protect the business itself, the CISO and IT will need to do more.

Success requires OT’s help. IT cybersecurity practices that work in the enterprise are not always appropriate for industrial environments. What’s more, IT doesn’t have the expertise or insight into operational and process control technology. A simple patch could bring down production (and revenues).

Bottom line? Organizations will need solutions that strengthen cybersecurity while meeting both IT and OT needs. For IT, that means visibility and control extending from its own environment into the OT network. For OT, it means security solutions that allow them to respond to anomalies while keeping production humming.

Trend #2: Remote and autonomous operations

The need for operational resiliency will accelerate the deployment of remote and autonomous operations – driving a new class of networking.

The impact of changes brought on in 2020 is driving organizations to increasingly use IoT technologies for operational resiliency. After all, IoT helps keep a business up and running when people cannot be on the ground. It also helps improve safety and efficiencies by preventing unnecessary site visits and reducing employee movement throughout facilities.

In 2021, we will see more deployments aimed at sophisticated remote operations. These will go well beyond remote monitoring. They will include autonomous operational controls for select parts of a process and will be remotely enabled for other parts. Also, deployments will increasingly move toward full autonomy, eliminating the need for humans to be present locally or remotely. And more and more, AI will be used for dynamic optimization and self-healing, in use cases such as:

  • autonomous guided vehicles for picking and packing, material handling, and autonomous container applications across manufacturing, warehouses and ports
  • increased automation of the distribution grid
  • autonomous haul trucks for mining applications
  • computer-based train control for rail and mass transit

All these use cases require data instantly and en masse, demanding a network that can support that data and deliver the speed required for analysis. This new class of industrial networking must provide greater network bandwidth, near-zero-latency data, and support for edge compute. It also needs the security and scale to adapt quickly, ensuring the business stays up and running – no matter what.

Trend #3: Managing multiple access technologies

Organizations will operate multiple-access technologies to achieve operational agility and flexibility.

While Ethernet has always been the foundation for connectivity in industrial IoT spaces, that connectivity is quickly expanding to wireless. Wireless helps reduce the pain of physical cabling and provides the flexibility and agility to upgrade, deploy and reconfigure the network with less operational downtime. Newer wireless technologies like Wi-Fi 6 and 5G also power use cases not possible in the past (or possible only with wired connectivity).

As organizations expand their IoT deployments, the need to manage multiple access technologies will grow. Successful deployments will require the right connectivity for the use case, otherwise, costs, complexity and security risks increase. With wireless choices including Wi-Fi, LoRaWAN, Wi-SUN, public or private cellular, Bluetooth and more, organizations will need to determine the best technology for each use case.  

Cisco’s recommendation: Build an access strategy to optimize costs and resources while ensuring security. Interactions between access technologies should deliver a secured and automated end-to-end IP infrastructure – and must avoid a “mishmash” leading to complexity and failed objectives.

As the end of 2020 fast approaches, I wish everyone a safe and healthy New Year. As you continue building and refining your plans for 2021, please consider how you can unleash these IoT network trends to reduce your cybersecurity risks and increase your operational resiliency. 

Originally posted HERE.

Read more…

New solar performance monitoring system has potential to become IoT of photovoltaics. Credit: Pexels

A new system for measuring solar performance over the long term in scalable photovoltaic systems, developed by Arizona State University researchers, represents a breakthrough in the cost and longevity of interconnected power delivery.

When solar cells are developed, they are "current-voltage" tested in the lab before they are deployed in panels and systems outdoors. Once installed outdoors, they aren't usually tested again unless the system undergoes major issues. The new test system, Suns-Voc, measures the system's voltage as a function of light intensity in the outdoor setting, enabling real-time measurements of performance and detailed diagnostics.

"Inside the lab, however, everything is controlled," explained Alexander Killam, an ASU electrical engineering doctoral student and graduate research associate. "Our research has developed a way to use Suns-Voc to measure solar panels' degradation once they are outdoors in the real world and affected by weather, temperature and humidity," he said.

Current photovoltaic modules are rated to last 25 years at 80 percent efficiency. The goal is to expand that time frame to 50 years or longer.

"This system of monitoring will give photovoltaic manufacturers and big utility installations the kind of data necessary to adjust designs to increase efficiency and lifespans," said Killam, the lead author of "Monitoring of Photovoltaic System Performance Using Outdoor Suns-Voc," for Joule.

For example, most techniques used to measure outdoor solar efficiency require the system to be disconnected from the power delivery mechanism. The new approach can automatically take measurements daily during sunrise and sunset without interfering with power delivery.
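The relationship behind a Suns-Voc measurement is that open-circuit voltage varies roughly linearly with the logarithm of light intensity. The sketch below fits that line from readings; it is purely illustrative, with made-up numbers, and is not the researchers' actual method or code.

```python
import math

# Illustrative sketch: Suns-Voc relies on Voc ~ V0 + (n*kT/q) * ln(suns).
# Fitting this line from outdoor readings yields parameters whose drift
# over time is one way panel degradation could show up in the data.
def fit_suns_voc(suns, voc):
    """Least-squares fit of voc = a + b*ln(suns); returns (a, b)."""
    x = [math.log(s) for s in suns]
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(voc) / n
    b = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, voc)) \
        / sum((xi - mean_x) ** 2 for xi in x)
    a = mean_y - b * mean_x
    return a, b

# Synthetic readings: intensity in "suns", open-circuit voltage in volts,
# generated with a thermal-voltage slope of 26 mV per e-fold of intensity
suns = [0.1, 0.25, 0.5, 1.0]
voc = [0.54 + 0.026 * math.log(s) for s in suns]

a, b = fit_suns_voc(suns, voc)
print(round(a, 3), round(b, 3))
```

The fitted intercept approximates the one-sun Voc and the slope is proportional to the diode ideality factor, so tracking both across many sunrise/sunset sweeps gives a degradation signal without disconnecting the array.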

"When we were developing photovoltaics 20 years ago, panels were expensive," said Stuart Bowden, an associate research professor who heads the silicon section of ASU's Solar Power Laboratory. "Now they are cheap enough that we don't have to worry about the cost of the panels. We are more interested in how they maintain their performance in different environments.

"A banker in Miami underwriting a photovoltaic system wants to know in dollars and cents how the system will perform in Miami and not in Phoenix, Arizona."

"The weather effects on photovoltaic systems in Arizona will be vastly different than those in Wisconsin or Louisiana," said Joseph Karas, co-author and materials science doctoral graduate now at the National Renewable Energy Lab. "The ability to collect data from a variety of climates and locations will support the development of universally effective solar cells and systems."

The research team was able to test its approach at ASU's Research Park facility, where the Solar Lab is primarily solar powered. For its next step, the lab is negotiating with a power plant in California that is looking to add a megawatt of silicon photovoltaics to its power profile.

The system, which can monitor reliability and lifespan remotely for larger, interconnected systems, will be a major breakthrough for the power industry.

"Most residential solar rooftop systems aren't owned by the homeowner, they are owned by a utility company or broker with a vested interest in monitoring photovoltaic efficiency," said Andre' Augusto, head of Silicon Heterojunction Research at ASU's Solar Power Laboratory and a co-author of the paper.

"Likewise, as developers of malls or even planned residential communities begin to incorporate solar power into their construction projects, the interest in monitoring at scale will increase," Augusto said.

According to Bowden, it's all about the data, especially when it can be monitored automatically and remotely—data for the bankers, data for developers, and data for the utility providers.

If Bill Gates' smart city, planned about 30 miles from Phoenix in Buckeye, Ariz., uses the team's measurement technology, "It could become the IoT of Photovoltaics," said Bowden.

Originally posted HERE.

Read more…

by Evelyn Münster

IoT systems are complex data products: they consist of digital and physical components, networks, communications, processes, data, and artificial intelligence (AI). User interfaces (UIs) are meant to make this level of complexity understandable for the user. However, building a data product that can explain data and models to users in a way that they can understand is an unexpectedly difficult challenge. That is because data products are not your run-of-the-mill software product.

In fact, 85% of all big data and AI projects fail. Why? I can say from experience that it is not the technology but rather the design that is to blame.

So how do you create a valuable data product? The answer lies in a new type of user experience (UX) design. With data products, UX designers are confronted with several additional layers that are not usually found in conventional software products: it’s a relatively complex system, unfamiliar to most users, and comprises data and data visualization as well as AI in some cases. Last but not least, it presents an entirely different set of user problems and tasks than customary software products.

Let’s take things one step at a time. My many years in data product design have taught me that it is possible to create great data products, as long as you keep a few things in mind before you begin.

As a prelude to the UX design process, make sure you and your team answer the following nine questions:

1. Which problem does my product solve for the user?

The user must be able to understand the purpose of your data product in a matter of minutes. It helps to assign the product to one of the five typical task categories of data products: actionable insights, performance feedback loop, root cause analysis, knowledge creation, and trust building.

2. What does the system look like?

Do not expect users to already know how to interpret the data properly. They need to be able to construct a fairly accurate mental model of the system behind the data.

3. What is the level of data quality?

The UI must reflect the quality of the data. A good UI leads the user to trust the product.

4. What is the user’s proficiency level in graphicacy and numeracy?

Conduct user testing to make sure that your audience will be able to read and interpret the data and visuals correctly.

5. What level of detail do I need?

Aggregated data is often too abstract to explain the system or to build user trust. A good way to counter this challenge is to use details that explain things. Then again, too much detail can be overwhelming.

6. Are we dealing with probabilities?

Probabilities are tricky and require explanations. The common practice of cutting out all uncertainties makes the UI deceptively simple – and dangerous.

7. Do we have a data visualization expert on the design team?

UX design applied to data visualization requires a special skillset that covers the entire process, from data analysis to data storytelling. It is always a good idea to have an expert on the team or, alternatively, have someone to reach out to when required.

8. How do we get user feedback?

As soon as the first prototype is ready, you should collect feedback through user testing. The prototype should present content in the most realistic and consistent way possible, especially when it comes to data and figures.

9. Can the user interface boost our marketing and sales?

If the user interface clearly communicates what the data product does and what the process is like, then it could take on a new function: selling your product.

To sum up: we must acknowledge that data products are an unexplored territory. They are not just another software product or dashboard, which is why, in order to create a valuable data product, we will need a specific strategy, new workflows, and a particular set of skills: Data UX Design.

Originally posted HERE 

Read more…

Many Internet of Things (IoT) predictions have failed in previous years, and the year 2020 has been no exception, this time justified by the virus outbreak.

In my article "2020 IoT Trends and Predictions: Be Prepared for the IoT Tsunami," I wrote that we should be prepared for the Internet of Things (IoT) tsunami, "but it won't be in 2020." I didn't imagine the special circumstances of the year "MMXX". Today, I see clearly that Covid-19’s impact is difficult to ignore looking forward into 2021 and beyond. This pandemic is going to accelerate adoption in many industries that have been affected and will have to make changes to how they operate.

The year 2020 has been significant in terms of the emergence of technologies that create a much better space for IoT to flourish and grow.

I'm not going to make my own predictions this year. Instead, I have taken on the responsibility, for my followers, of collecting and publishing the predictions of other recognized or enthusiastic voices of the IoT.

Here I summarize some of them. My advice is to keep relying on optimistic predictions as there are many Reasons to Believe in Internet of Things.

  • Forrester - Predictions 2021: Technology Diversity Drives IoT Growth
    • Network connectivity chaos will reign. We expect interest in satellite and other lower-power networking technologies to increase by 20% in the coming year.
    • Connected device makers will double down on healthcare use cases. In 2021, proactive engagement using wearables and sensors to detect patients’ health at home will surge.
    • Smart office initiatives will drive employee-experience transformation. We expect at least 80% of firms to develop comprehensive on-premises return-to-work office strategies that include IoT applications to enhance employee safety and improve resource efficiency.
    • The near ubiquity of connected machines will finally disrupt traditional business. In 2021, field service firms and industrial OEMs will rush to keep up with customer demand for more connected assets and machines.
    • Consumer and employee location data will be core to convenience. In 2021, brands must utilize location to generate convenience for consumers or employees with virtual queues, curbside pickup, and checking in for reservations.
  • CRN - 5 Hot IoT Trends To Watch in 2021 And Beyond
    • Changes In Real Estate Trends Will Push Smart Office Initiatives
    • The Internet Of Behavior Is Coming To Your Workplace
    • Location Data Will Become More Prominent
    • This Year’s Pivot To Remote Operations Will Expand Connected Assets
    • Connected Health Care Will Ramp Up In 2021
  • The future of IoT: 5 major predictions for 2021, based on Forrester
  • Techopedia - 6 IoT Predictions for 2021: What's Next?
    1. An Increase in IoT Remote Workforce Management Products
    2. More IoT-Enabled Options for Smart Cities
    3. Improving Driving and Autonomous Vehicles
    4. The IoT Will Boost Predictive Maintenance
    5. The Connected Home over Internet Protocol (CHIP) Standard Will Become a Reality
    6. Market Enticements With Multipurpose Products
  • Forbes - 5 IoT Trends To Watch In 2021
    1. Can You Turn Off Your Alexa? We'll likely see an increase in the security surrounding smart devices, including AI-driven, automated ability to scan networks for IoT devices.
    2. More Use Cases in More Industries - the IoT has the ability to mean big money for almost any industry.
    3. IoT Helping to Build Digital Twins - the IoT may be the perfect partner for the development of digital twins, for almost any application. Especially for things like construction, engineering, and architecture, that could mean huge cost and time savings.
    4. IoT and Data Analytics - the IoT is no longer just about monitoring behavior and spitting out data. It's about processing data quickly and making recommendations (or taking actions) based on those findings.
    5. Improving Data Processing at the Edge - With the confluence of 5G networks, an increase in IoT and IIoT devices, and a dramatic increase in the amount of data we are collecting, I don't see this trend going anywhere but up
  • Security Today - By 2021, 36 billion IoT devices will be installed around the world.
  • IoT Agenda - Mitch Maiman, Intelligent Product Solutions (IPS) - IoT predictions for 2021
    • Medical IoT
    • Radio frequency services
    • AI and augmented reality
    • Electric vehicles
    • Remote work
  • IoT Agenda - Carmen Fontana, Institute of Electrical and Electronics Engineers - Top 5 IoT predictions for growing use cases in 2021
    • Wearables will blur the line between consumer gadgets and medical devices
    • Consumers will be more concerned about data privacy
    • AI IoT products will be more accessible
    • Digital twin adoption will explode due to increased remote work
    • Edge computing will benefit from green energy investment
  • IoT World Today - IoT Trends 2021: A Focus on Fundamentals, Not Nice-to-Haves. IoT trends in 2021 will focus on core needs such as health-and-safety efforts and equipment monitoring, but IoT in customer experience will also develop.
  • Rockwell Automation Predictions for 2021
    • IT/OT Integration is critical for answering the $77 billion need for IIoT
    • Edge is the new cloud
    • Digital twins save $1 trillion in manufacturing costs
    • Pandemic promotes AR training as the new standard for a distributed workforce
    • Automation accelerates employee advancement through human-machine interface
  • Top 5 IoT Predictions For 2021; What Future Holds?
    • Private IoT networks
    • Digital health
    • Cities would turn smarter
    • Remote offices
    • Improved location services
  • online - 10 IoT Trends for 2020/2021: Latest Predictions According to Experts 
  • J2 Innovations - Smart Building, Equipment and IoT Trends for 2021
    • Remote work and management
    • Changing the way we work
    • Flexible spaces
    • Digital processes- 2021 will see ever more processes becoming digital.
    • The convergence of IT and OT - The industry will continue to see a concerted push to integrate and leverage the vast amounts of valuable data derived from Operational Technologies (OT) into the Information Technology (IT) side of the enterprise.
    • A new kind of interoperability - A good example of that is the Web of Things (WoT), an open source standard being pioneered by Siemens.
  • Krakul - IoT trends to expect in 2021 - Cloud service providers are the most prominent vendors within the IoT space. 2021 will also see the rise of IoT development partnerships. Brands will not only require cloud transformation but will also need a hardware partner to ensure IoT devices perform to both consumer and business needs. Whether those IoT device applications will be used by consumers, businesses or industry, the common concerns shaping IoT solutions for 2021 include:
    • integration
    • usability
    • security
    • interoperability
    • user safety
    • return on investment (ROI) for the business case
  • Analysis Mason - predictions for business connectivity, communications, IoT and security in 2021 –
    • A major mobile operator will buy one of the IoT market disruptors.
    • A new deployment model for private LTE/5G networks will emerge – the public industrial network
    • Private networks will become a topic for financial sponsors.
  • TBR (Ezra Gottheil) - 2021 Devices & Commercial IoT Predictions
    • AI in IoT will increasingly be encapsulated in specific functions like recognition and detection
    • Conversational user interfaces, based on voice or typed communication, will play an increasing role in business Solutions
    • The emergence of the chief data officer role will increase organizational clarity, accelerating IoT adoption
  • Predictions for Embedded Machine Learning for IoT in 2021 
    • From increasingly capable hardware to TinyML, embedded machine learning will make strides in 2021.
    • More capable microcontrollers combined with on-device machine learning at the edge are poised to develop further in 2021. These developments will bring further advances in video surveillance, manufacturing and more.
    • The impact of COVID-19 on the global supply chain, however, may stunt innovation and growth of embedded machine learning.
  • Frost & Sullivan -Top 4 Growth Opportunities in the Internet of Things Industry for 2021 
    • Exponential growth of edge computing in public and private networks
    • Convergence between IT and OT to drive end-user concerns on IIoT security, privacy, and data protection
    • Emerging techs: convergence of IoT, AI, and blockchain
    • The future of retail post COVID-19

Key Takeaways               

Although the global pandemic has influenced product introduction timelines, fast-tracking some things while others lose priority, enterprises, consumers, and other stakeholders will continue to drive demand for new and improved Internet of Things applications, technologies, and solutions in 2021 across verticals and geographies.

IoT will continue to gain footholds, as people and enterprises become comfortable and familiar with the technology and it is incorporated into daily life in seamless ways.

You must not forget that by the year 2025, an estimated 75.44 billion IoT devices will be installed worldwide. That is a whopping number that will relentlessly soar further and have a positive impact on our lives and businesses alike.

I expect an exciting year for IoT advancements in 2021. And you?

Read more…

Figure 1: Solution architecture with AWS IoT core

Critical and high-value assets are always on the move, and this holds across practically every industry vertical relying on supply chain and logistics operations. Naturally, enterprises seek ways to track their assets with the shipment carrier in ways that are most optimal to their requirements. The end goal is often to have greater visibility and control of assets while in transit with the shipment carrier while opening up opportunities to optimize business operations based on insights-driven decisions.

For assets in transit, proactive shipment monitoring results in greater reliability of the shipment's integrity by way of real-time updates about the shipment's location, transit status, and conditions like temperature and humidity (for perishable shipments). All this information enables quick issue identification and remediation by the respective stakeholders, which helps minimize losses and reduce insurance claims, resulting in further cost optimization for the enterprise while delivering a delightful purchase experience to its customers.

A solution to address such requirements would need to be an IoT (Internet of Things) solution combining tracker devices (hardware), cloud apps (software platform), enterprise systems integrations (with SCM, ERP, and similar systems), and professional services and support for field installation and continuous data insights. For most enterprises, implementing the Internet of Things is complex and non-core. Such an IoT solution requires an investment of capital, time, and expertise to build and deploy, especially one that's secure and scalable.

In this post, we'll discuss such an IoT solution that is built using AWS IoT Core and can be delivered in an affordable manner such that it's ready to use within a matter of days or weeks. The solution leverages GPS-enabled tracker hardware comprising condition monitoring sensors like temperature, humidity, shock impact, and ambient light. This device can be used to track entire containers or pallets having multiple cartons or even individual item boxes depending on the requirements. The shock impact sensor on the device indicates asset mishandling based on threshold limits, and the light sensor can indicate potentially unauthorized use/asset theft. Such a device requires a cellular connectivity service to communicate sensor data to the cloud per pre-configured rules.

By way of API integrations using AWS SDKs for the Internet of Things, the tracker devices are first connected and authenticated. The data they generate is published to a cloud app powered by AWS IoT Core in real time or at preset time intervals. The data sent to the cloud app consists of JSON-formatted message payloads sent via the MQTT protocol supported by AWS IoT Core, and it is presented on the front-end dashboard UI in a rich, interactive manner on a map interface, with sensor-specific information available within a couple of taps/clicks.
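As a rough illustration of what such a JSON payload might look like, here is a minimal sketch. The topic layout, device ID, and field names are hypothetical assumptions, not the solution's actual schema:

```python
import json
import time

# Hypothetical sketch of the JSON telemetry payload a tracker device might
# publish over MQTT to AWS IoT Core. Topic and field names are illustrative.
def build_payload(device_id, lat, lon, temp_c, humidity_pct, shock_g, light_lux):
    return {
        "deviceId": device_id,
        "timestamp": int(time.time()),         # epoch seconds at capture time
        "location": {"lat": lat, "lon": lon},  # from the GPS module
        "sensors": {
            "temperatureC": temp_c,
            "humidityPct": humidity_pct,
            "shockG": shock_g,                 # impact sensor reading
            "lightLux": light_lux,             # ambient light (tamper hint)
        },
    }

device_id = "trk-0001"
topic = f"shipments/{device_id}/telemetry"     # per-device MQTT topic
message = json.dumps(build_payload(device_id, 37.77, -122.42, 4.5, 61.0, 0.2, 3))
print(topic)
print(message)
```

With the AWS IoT Device SDK, this string would then be published on `topic` over an MQTT connection authenticated with the device's X.509 certificate.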

These sensor data messages are further forwarded to other back-end systems like AWS IoT Analytics. The data is usually saved in a time-series data store for analysis and insights later in time. Additionally, API integrations can also be easily built for the cloud app to work with enterprise apps like transport management systems and warehouse management systems to realize autonomous supply chain operations. Such data movement and operations-specific logic is defined by business rules and handled via AWS IoT Core's Rules Engine, which can also be used to transform device data before forwarding it to a different application.
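Rules in the AWS IoT rules engine are expressed as SQL-like statements over MQTT topics. The fragment below is a hedged sketch only; the topic filter and payload fields are hypothetical:

```sql
-- Illustrative AWS IoT rule: select out-of-range temperature readings from
-- a (hypothetical) telemetry topic so a rule action can forward them to a
-- back-end system such as AWS IoT Analytics or an alerting queue.
SELECT deviceId, timestamp, sensors.temperatureC AS temperatureC
FROM 'shipments/+/telemetry'
WHERE sensors.temperatureC > 8 OR sensors.temperatureC < 2
```

The `+` wildcard matches any single device segment in the topic, so one rule can cover every shipment tracker publishing under the same topic hierarchy.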

However, not every data point a sensor picks up needs to be sent to the cloud app unless such a mandate exists, often due to regulatory compliance requirements in verticals like healthcare and pharmaceuticals. The dashboard UI on the cloud provides a simple interface to set the acceptable minimum and maximum ranges for sensor readings. Any breach of this range immediately triggers an alert to the team responsible for monitoring the shipment, which can then contact the shipment carrier to take corrective action. Such ranges can also be separately configured within mere seconds for each shipment per its monitoring requirements.
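The min/max range check described here can be sketched as a small function. The threshold values and alert wording below are illustrative assumptions, not the product's actual configuration:

```python
# Minimal sketch of a per-shipment min/max range check. Thresholds and
# alert text are made up for illustration.
def check_thresholds(reading, ranges):
    """Return a list of alert strings for sensors outside their configured range."""
    alerts = []
    for sensor, value in reading.items():
        if sensor not in ranges:
            continue  # sensors without a configured range are not alerted on
        lo, hi = ranges[sensor]
        if value < lo or value > hi:
            alerts.append(f"{sensor}={value} outside [{lo}, {hi}]")
    return alerts

# Cold-chain style configuration: 2-8 C and 35-75% relative humidity
ranges = {"temperatureC": (2.0, 8.0), "humidityPct": (35.0, 75.0)}
alerts = check_thresholds({"temperatureC": 9.4, "humidityPct": 60.0}, ranges)
print(alerts)  # one breach reported, for temperature
```

Each alert string would then be routed to the monitoring team, for example via an SNS notification triggered by a rule action.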

The secure bidirectional messaging between the tracker device and the cloud app is enabled via AWS IoT Core's Device Gateway, which scales automatically to process millions of messages in either direction while ensuring low latency for mission-critical applications.

This makes the purpose-built shipment monitoring solution completely configurable and hence scalable while still being quickly deployable without the hassles of capital expenses and significant resource time spent in custom building such solutions from scratch.

Summary

The intelligent shipment monitoring solution enables enterprises to have greater control over the movement of their assets while having enough data and insights over time to optimize business operations as required.

With AWS IoT Core and AWS IoT Analytics, such a data-driven outcome approach to handle supply chain operations delivers transformational benefits such as reduced losses, greater cost control, and improved customer satisfaction rates that can result in sustainable competitive advantage in the marketplace.

Originally posted HERE.

Read more…

As industrial organizations connect more devices, enable more remote access, and build new applications, the airgap approach to protecting industrial networks against cyber threats is no longer sufficient. As industries are becoming more digital, cyberattacks are getting more sophisticated, and yet many organizations are lagging in the adoption of updated and reliable industrial cybersecurity postures. And when these organizations' security leaders start building a strategy to secure operations beyond the industrial demilitarized zone (IDMZ), they realize it might not be as simple as they thought.

Industrial assets (as well as industrial networks, in many cases) are managed by the operations team, which is typically focused on production integrity, continuity, and physical safety, rather than cyber safety. The IT teams often have the required cybersecurity skills and experience but generally lack the operations context and the knowledge of the industrial processes that are required to take security measures without disrupting production.

Building a secure industrial network requires strong collaboration between IT and operations teams. Only together can they appreciate what needs to be protected and how best to protect it. Only together can they implement security best practices to build secure industrial operations.

Enhancing the security of industrial networks will not happen overnight: IT and operations teams have to build their relationship; new security tools might have to be deployed; networks might need to be upgraded and segmented; new correlation policies will have to be developed.

Security is a journey. Only a phased and pragmatic approach can lay the ground for a converged IT/OT security architecture. Each phase must be an opportunity to build the foundation for the next. This will ensure your industrial security project addresses crucial security needs at minimal costs. It will also help you raise skills and maturity levels throughout the organization to gain wide acceptance and ensure effective collaboration.

As a leader in both the cybersecurity and industrial networking markets, Cisco has been involved in many successful industrial security projects. Looking at these projects led us to recommend the three-step journey outlined in Cisco's Industrial Security Validated Design.

What is a Cisco Validated Design (CVD)? CVDs provide the foundation for systems design based on common use cases or current engineering system priorities. They incorporate a broad set of technologies, features, and applications to address customer needs. Each one has been comprehensively tested and documented by Cisco engineers to ensure faster, more reliable, and fully predictable deployment.

Our approach to industrial security is focused on crucial needs, while creating a framework for IT and operations to build an effective and collaborative workflow. It enables protection against the most common devastating cybersecurity threats at optimized cost, and provides a practical approach that simplifies adoption.

To learn more, read our solution brief or watch the replay of the webinar I just presented. A detailed design and implementation guide will be available soon to help accelerate proof-of-concept and deployment efforts.

Originally posted HERE.

Read more…
Fig.1 Arrow Shield 96 Trusted Platform

Introduction

IoT product development crosses several domains of expertise, from embedded design to communication protocols and cloud computing. Because of this complexity, "end-to-end" or "edge-to-cloud" IoT security is becoming a challenging concept in the industry. "Edge" in many cases refers to the device as a single element in the edge-to-cloud chain, but the device must not be regarded as a monolith when security requirements are defined. Trust must first be established within the processing unit and propagated through several layers of the software stack before the device becomes a trusted end node. Securing the processor requires properly integrating multiple layers of security and using security features implemented in hardware, tasks that call for embedded security expertise and experience. It is very easy to put a lot of effort into implementing security for an IoT product while still failing to cover key use cases. A simpler way to narrow down the definition of end-to-end security is to start by identifying the minimum set of business requirements.

Brand image, how a company's customers perceive and value it, is one of the most valuable assets of any corporation. Two of the most important characteristics of an IoT device that can promote a positive brand image are resiliency and privacy. For resiliency, this might mean adding features that increase the device's ability to self-recover from malfunctions or cyber-attacks. For privacy, this means protecting not only user information and data but also the intellectual property (IP) invested in the product, which makes preventing exploitation through vectors such as product/device cloning and overproduction important. Another business driver is the overall cost of ownership for the product: are there security-related features that can drive the cost down? We include here not just operational cost but also liabilities.

In this blog, we dive deeper into solutions that support these business requirements. We will also discuss a demo we created in collaboration with our partners Sequitur Labs and Arrow to demonstrate a commercially available approach to solving several security use cases for IoT.

Security in depth – a methodical approach for securing connected products

IoT security must start with securing the device, so that data, data collection, and information processing can be trusted. Security must be applied in layers and facilitate trust propagation from the silicon hardware root of trust (HWRoT) to the public/private cloud or the application provider back-end. Furthermore, the connected paradigm provides the opportunity to delegate access control and security monitoring to the cloud, outside of the device. Narrowing down further, device security must be rooted in the fundamental capabilities of the processor or system on chip, and must consider all three stages of the device lifecycle: inception (manufacturing, first boot), operation, and decommissioning.

In a nutshell we should consider the following layers for securing any IoT product:

  • Set a hardware root of trust – secure programming and provisioning (firmware, key material, fuses)
  • Implement hardware enforced isolation – system partitioning secure / non-secure
  • Design secure boot – authenticated boot chain all the way to an authenticated kernel
  • Build for resiliency – fail-safe to an alternative firmware image and restore from off-board location
  • Enable Trusted Execution – establish a logical secure enclave
  • Abstract hardware security – streamline application development
  • Enable security monitoring – cloud-based, actionable security monitoring for fleets of devices

These capabilities provide a foundation sufficient to fulfill the most common security requirements of any IoT product.
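The chain-of-trust idea behind these layers can be illustrated with a minimal sketch: each boot stage verifies the next before handing over control. The stage names are hypothetical, and a plain hash stands in for the asymmetric signature checks (rooted in fused keys) that real secure boot implementations use:

```python
import hashlib

# Hypothetical boot stages: (name, image bytes).
stages = [
    ("bootloader", b"bootloader-image"),
    ("coretee",    b"coretee-image"),
    ("kernel",     b"linux-kernel-image"),
]

# Reference digests recorded at provisioning time; in hardware this
# reference data lives in immutable storage (fuses/OTP), not a dict.
expected = {name: hashlib.sha256(img).hexdigest() for name, img in stages}

def secure_boot(chain):
    """Verify every stage before 'executing' it; halt on any mismatch."""
    for name, image in chain:
        if hashlib.sha256(image).hexdigest() != expected[name]:
            return f"halt: {name} failed verification"
    return "boot ok"
```

Booting the untouched chain returns `"boot ok"`; swapping in a tampered kernel image makes the chain halt at that stage instead of executing it.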

Embedded security features needed to build the security layers described above are available today from many silicon providers. However, software is needed to turn these into a usable framework for application developers to easily implement higher layer security use cases without the need for advanced silicon expertise.

Such software products must be architected to be easily ported to diverse silicon designs. They must also work with established IoT manufacturing processes: "turning on" embedded security features changes existing manufacturing flows, requiring hardware testing before the final firmware image can be programmed, burning fuses in the silicon in a specific order, and careful handling of sensitive cryptographic key material. This fragmentation, complexity, and required expertise are the reasons why embedded security is challenging to implement at scale in IoT today.

A closer look – commercially available secure platform with Arrow Shield96

AWS partnered with Sequitur Labs and Arrow to provide a commercial solution that follows the approach described in the previous paragraph. The solution follows the NIST SP 800-193 Platform Firmware Resiliency Guidelines and goes beyond them to create a secure platform fit for embedded and IoT products. At the same time, it abstracts the complexity of understanding and utilizing embedded security IP such as hardware crypto, random number generators, fuse controllers, tampers, hardware integrity checkers, TrustZone, and on-the-fly memory encryption.

For this blog, we created a demo using the Arrow Shield96 Trusted Platform (Fig 1) single board computer running Sequitur Labs' custom firmware image based on the EmSPARK Security Suite. The Arrow Shield96 board is based on the Microchip SAMA5D27, an entry-level Cortex-A5 MPU that embeds a set of security IP capable of fulfilling the most stringent security requirements.

Let's dive deeper into the technical implementation first, then into the demo scenarios that fulfill typical customer business needs.

Security inception and propagation of trust

Secure boot and firmware provisioning

Introducing secure boot requires initial programming of the CPU: burning keys into the processor's fuses, setting up the boot configuration, establishing the hardware root of trust, and ensuring the processor boots only authenticated, trusted firmware. Secure boot implementation is tightly correlated with processor programming and device firmware provisioning. The following section provides details on how secure boot and firmware provisioning can be done properly to establish a trusted security foundation for any application.

Firmware provisioning

The EmSPARK Security Suite methodology for provisioning and programming the Shield96 board minimizes complexity and the need for embedded security expertise. It provides a tool and software building blocks that guide device makers through creating an encrypted manufacturing firmware image. That image packages the final components: encrypted blobs of the final device firmware, a provisioning application, and customer-specific key material such as the private key and X.509 certificate for cloud connectivity and the certificate authorities used to authenticate firmware components and application updates.
The actual firmware provisioning and CPU programming is performed automatically during the very first boot of a device flashed with the manufacturing image. With the CPU running in secure mode, the provisioning application burns the necessary CPU fuses and generates keys using the embedded TRNG (true random number generator) to uniquely encrypt the software components that together form the final firmware. Such components are the Trusted Execution Environment (CoreTEE), the Linux kernel, customer applications, Trusted Applications, and key material (such as the material needed to authenticate with AWS IoT Core).
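The per-device encryption step can be sketched as follows. This is a toy illustration, not the EmSPARK implementation: the HMAC-based key derivation stands in for a HWRoT-derived key, and the XOR keystream stands in for the hardware AES engine.

```python
import hmac, hashlib, os

def derive_device_key(fused_secret: bytes, device_unique: bytes) -> bytes:
    # Stand-in for a key derived from the HWRoT: an HMAC "extract" of a
    # fused secret plus per-device randomness from the TRNG.
    return hmac.new(fused_secret, device_unique, hashlib.sha256).digest()

def stream_crypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream cipher (HMAC counter keystream) standing in for the
    # hardware AES engine; symmetric, so it both encrypts and decrypts.
    out = bytearray()
    for off in range(0, len(data), 32):
        ks = hmac.new(key, off.to_bytes(4, "big"), hashlib.sha256).digest()
        out.extend(b ^ k for b, k in zip(data[off:off + 32], ks))
    return bytes(out)

# First boot: the provisioning app derives a key unique to this device
# and re-encrypts every firmware component with it.
device_key = derive_device_key(b"fused-hwrot-secret", os.urandom(16))
components = {"coretee": b"tee blob", "kernel": b"kernel blob"}
encrypted = {n: stream_crypt(device_key, blob) for n, blob in components.items()}
```

Because the key is unique per device, firmware encrypted for one board cannot be decrypted on another, which is the "device diversification" property discussed below.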

The output – establishing a trusted foundation

The result is firmware uniquely encrypted, for each device, with a key derived from the HWRoT, in a process that leaves no room for mismanagement of device secrets or human error. Device diversification achieved this way drastically reduces manufacturing cost by eliminating the need for HSMs and secure facilities, while protecting against class-break attacks (break one, break all).
The provisioning process also creates and securely stores a unique device certificate during the very first boot, built from a preloaded CSR (Certificate Signing Request) template and a key pair generated with the hardware TRNG, then signed with a customer-provided private key that is only usable during the device's first boot. The device certificate serves as the immutable device identity for cloud authentication.

Secure boot

The secure boot implementation partitions the system into secure and non-secure domains, making sure every peripheral is assigned to the desired domain. Arm TrustZone and Microchip security IP are at the core of the implementation. CoreTEE, the operating system for the secure domain, runs in on-the-fly AES-encrypted DDR memory, protecting a critical software component (the TEE) from memory-probing attacks. Secure boot is also designed so that, at the end of the boot process, before handing control of the processor from the secure domain to the non-secure domain (Linux), it closes access to the fuse controller, secure JTAG, and other peripherals that could be leveraged to breach security.

Building for resilience

Secure boot implements two features that boost device resilience: a failover boot from a secondary image (B) when the primary boot (A) fails, and the ability to restore a known-good image (A) from an off-board location. The solution includes a hardware watchdog and a boot-loop counter (set by the device maker) that Linux resets to its maximum after each successful boot. If Linux repeatedly fails to boot and the counter reaches zero, the B partition is selected for the next boot. Once the failover image B is loaded after such a failure, the device connects to an off-board location (in our demo, a repository on AWS), retrieves the latest firmware image, and re-installs it as the primary (A). These two features reduce operational cost by allowing devices in the field to self-heal. In addition, AWS IoT Device Defender checks device behaviors for ongoing analysis and triggers alerts when behaviors deviate from expected ranges.
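The watchdog/boot-loop-counter interplay can be sketched in a few lines. The counter value and return strings here are illustrative, not the actual firmware's:

```python
MAX_ATTEMPTS = 3  # boot-loop counter value chosen by the device maker

class BootLogic:
    def __init__(self):
        self.counter = MAX_ATTEMPTS
        self.active = "A"

    def boot(self, linux_ok: bool) -> str:
        if linux_ok:
            self.counter = MAX_ATTEMPTS   # Linux resets the counter on success
            return f"booted {self.active}"
        self.counter -= 1                 # hardware watchdog fired
        if self.counter == 0:
            self.active = "B"             # fail over to the secondary image
            self.counter = MAX_ATTEMPTS
            return "failover to B; restore A from off-board repository"
        return "retry"
```

Three consecutive failed boots exhaust the counter, switch the active image to B, and trigger the off-board restore of A; a later successful boot simply resets the counter.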

In our demo when the alternative firmware image (B) is loaded, an event is triggered in the AWS IoT Device Defender agent. The AWS IoT Device Defender agent running as a TA in the secure domain sends these events to the AWS IoT Device Defender Detect service for evaluation. The TA, running in the secure domain, also signs AWS IoT Device Defender messages to facilitate integrity validation for each reported event.

Another key component of the EmSPARK Suite is the secure update process. Since secure boot is the only process that can decrypt firmware components during device start, it also performs the firmware update. The firmware update feature is exposed in Linux as an API call that takes a manifest and the signed and/or encrypted new firmware image. The API call verifies the image signature, sets the update flag for the next boot, and restarts the board. During the next boot, the secure boot process decrypts the new image using a pre-provisioned key and re-encrypts it with the board-specific key. The manifest indicates which components need to be updated: the Linux kernel, TEE, TAs, and/or bootloader.
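The manifest-verification half of that flow might look like the sketch below. The key name, manifest fields, and return strings are hypothetical, and HMAC stands in for whatever signature scheme the suite actually uses:

```python
import hashlib, hmac, json

UPDATE_AUTH_KEY = b"pre-provisioned-update-key"  # hypothetical shared key

def sign_manifest(manifest: dict) -> str:
    body = json.dumps(manifest, sort_keys=True).encode()
    return hmac.new(UPDATE_AUTH_KEY, body, hashlib.sha256).hexdigest()

def request_update(manifest: dict, signature: str) -> str:
    # Linux-side API: verify the manifest signature, then flag the update
    # and reboot; decryption/re-encryption happens inside secure boot.
    body = json.dumps(manifest, sort_keys=True).encode()
    good = hmac.new(UPDATE_AUTH_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(good, signature):
        return "rejected: bad signature"
    return "update flagged for: " + ", ".join(manifest["components"])

manifest = {"version": "1.2.0", "components": ["kernel", "tee"]}
```

A correctly signed manifest is accepted and names the components to swap; any forged signature is rejected before the reboot is ever scheduled.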

Enabling easy development through security abstraction

Arrow Shield, through the EmSPARK Suite product, preloads a number of TAs (Trusted Applications) with the Shield96 firmware. The figure below shows the dual-domain implementation and the software components provided with the Shield96 Trusted product in our demo.


Fig 2. Software architecture enabling TrustZone/TEE with EmSPARK Suite

These TAs expose a set of secure functions to Linux via a C SDK (called the CoreLocker APIs). The Arrow board and Sequitur's security suite preload the following TAs for our demo:

  • Cryptographic engine – providing symmetric and asymmetric crypto operations and key generation, integrating silicon-specific hardware crypto
  • Key-store and CA-store managed (add, delete) via signed commands
  • Secure firmware update
  • Secure storage for files and stream data
  • TLS and MQTT stacks
  • AWS IoT Device Defender secure agent

In addition, a tamper detection and remediation TA has been added for demo purposes (as detailed in "The Demo" section below). These TAs provide a preloaded framework for implementing a comprehensive set of security use cases, ensuring that security operations execute in isolation from the application OS in an authenticated and resilient environment. Such use cases include confidentiality, authentication and authorization, access control, attestation, privacy, integrity protection, device health monitoring, secure communication with the cloud or other devices, and secure lifecycle management.

All TA functions are made available to application development through a set of C APIs via an SDK. Developers do not need to understand the complexity of creating TAs or using HW security provided by the chipset.

Translating TAs to security use cases

Through a securely managed CA-store (Certificate Authority store), the device can authenticate payloads against a set of CAs loaded at manufacturing or later in the device lifecycle. Because the CAs can be updated securely, the device or product owner can transfer ownership of certain functions, such as firmware update or application update, to other entities. For example, the customer may own the applications while firmware update and security management are delegated to a third-party Managed Service Provider, all while maintaining privacy requirements.
The cryptographic engine is core to anything related to security. It implements a set of symmetric and asymmetric cryptographic functions and key generation, allowing applications in the non-secure domain to execute crypto operations in isolation. Hardware crypto is used when implemented by the chipset.

The Microchip SAMA5D2 implements in hardware the ability to monitor regions of memory in real time. In the Shield96 firmware this feature, ICM (Integrity Check Monitoring), is used to monitor the integrity of the Linux kernel. Any modification of the Linux kernel triggers an interrupt in the secure domain; the hardware isolation provided by TrustZone prevents Linux from even being aware of such interrupts. The interrupt triggers a remediation function implemented in a TA which, together with the Device Defender Secure Agent TA, performs three operations:

  • records the tampering event and restarts Linux from the verified, authenticated, encrypted image provided through secure boot
  • after restart, packages the tampering event into JSON format, signs it for integrity assurance, and stores it
  • publishes the JSON package to the AWS IoT Device Defender monitoring service
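The sign-then-publish step can be sketched as below. The key name, event fields, and HMAC scheme are illustrative assumptions; the point is that signing happens with a key Linux never sees, so the cloud side can detect tampering with the report itself:

```python
import hashlib, hmac, json, time

TA_SIGNING_KEY = b"secure-domain-key"  # held by the TA, invisible to Linux

def package_tamper_event(event_type: str) -> dict:
    event = {
        "event": event_type,
        "timestamp": int(time.time()),
        "remediation": "kernel restarted from authenticated image",
    }
    body = json.dumps(event, sort_keys=True).encode()
    # Signing in the secure domain lets the cloud side verify the report
    # was not altered by a compromised Linux application.
    event["signature"] = hmac.new(TA_SIGNING_KEY, body, hashlib.sha256).hexdigest()
    return event

def verify_event(event: dict) -> bool:
    body = {k: v for k, v in event.items() if k != "signature"}
    good = hmac.new(TA_SIGNING_KEY,
                    json.dumps(body, sort_keys=True).encode(),
                    hashlib.sha256).hexdigest()
    return hmac.compare_digest(good, event["signature"])
```

A report altered after signing, say a changed remediation field, fails verification on the receiving side.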

Complementing the edge-to-cloud security strategy with AWS IoT Device Defender

AWS IoT Device Defender audits device cloud configurations against security best practices and continuously monitors devices for anomalies and threats based on expected cloud-side and device-side behaviors. In this demo, complementing the defense mechanisms implemented at the device level, AWS IoT Device Defender performs its monitoring role and alerts customers when it determines that anomalous or threat events have occurred on an end node. The demo required installing AWS IoT Device Defender agents on both the non-secure and secure domains of the Shield96 board. The secure domain provides the cryptographic signature (using a private key securely) for device health reports and also isolates the detection and reporting processes from interception by malicious applications. The AWS IoT Device Defender agent collects monitored behaviors as metrics from both domains; then, from the secure domain, it sends the metrics to the AWS Cloud for evaluation.

The Demo

For a full demo tutorial, please watch this video.


Fig. 3 Edge-to-cloud IoT security demo at Arrow Embedded to Go 2020

The demo covers the following scenarios:

  • Out of the box experience
  • Firmware personalization – secure firmware rotation to provide a logistical separation between manufacturing and production firmware
  • Device registration to AWS IoT Core
  • Device decommissioning (de-registration) from AWS IoT Core
  • Secure firmware update
  • Resilience demonstration – tamper event simulation and remediation
  • Event reporting to AWS IoT Device Defender

Demonstrating resilience and tamper violation reporting with AWS IoT Device Defender

The boot logic for the demo includes a safety check for tamper events. In this case, we connected a button to an environmental tamper pin. A tamper violation generated by a button press is detected in the next boot sequence, so the initial boot code switches to the secondary boot stack and boots the "fail-safe" image. Once booted, the system publishes the tamper event to AWS IoT Device Defender for logging and analysis. In the demo, the primary and secondary images are identical, so each tamper event simply switches to the other image, allowing the scenario to be repeated with each tamper event switching the system from A to B or B to A.

Streamlining personalized firmware to commercial boards

The commercial solution introduced by Arrow with the Shield96 board includes cloud-based secure firmware rotation from the generic manufacturing firmware using AWS, streamlining device personalization and providing a production-ready device to a multitude of customers.

Out of manufacturing, the Shield96 Trusted board comes preloaded with a minimal, generic version of Linux. The out-of-the-box experience to reach personalized, up-to-date firmware is as simple as inserting an SD card and connecting the board to the Internet. The device boots securely, partitions the SD card, then registers itself to AWS IoT Core using Just-in-Time Registration (JITR) of device certificates, and is provisioned to Sequitur's AWS IoT Core endpoint and the Sandbox application. Next, the device automatically downloads the most recent generic or customer-specific file system, installs it, and restarts. The Sandbox thus provides lifecycle device management and firmware updates.

The two-stage firmware deployment, starting with generic firmware preloaded at the Arrow Programming Center followed by a cloud-based final firmware rotation, gives customers valuable flexibility. For instance, an Original Equipment Manufacturer (OEM)/Original Device Manufacturer (ODM) may need to produce devices with firmware variations for different geographical regions or different customers. Alternatively, the OEM/ODM may want to optimize logistics by manufacturing in volume while the firmware is still in development and loading the final firmware at a distribution facility before shipping to customers. It also eliminates the opportunity for IP theft in manufacturing, since the final firmware is never present at the manufacturer.

Conclusion

The solution introduced in this blog demonstrates that manufacturers can produce devices at scale while implementing security properly, taking full advantage of the silicon's embedded security IP. The implementation distills niche expertise and years of experience into a framework accessible to any developer.
Why is this important? Advanced security, implemented right, massively reduces time to market and cost, and the solution is highly portable to other silicon. Sequitur Labs' EmSPARK Security Suite is already available for NXP microprocessors (i.MX and QorIQ Layerscape families) and NVIDIA Xavier, bringing the same level of abstraction to IoT and embedded developers.
In this relationship, Arrow offers a fully provisioned secure single-board computer and adds further value through the ability to customize the hardware and firmware. Customers can choose to add or remove hardware components, customize the Linux kernel, and subscribe to firmware management and security monitoring.
APN Partners complement existing AWS services to enable customers to deploy a comprehensive security architecture with a seamless experience. In this case, Sequitur Labs and Arrow bring to market a game-changing product that complements existing AWS edge and cloud services, enabling projects of any size to use advanced security without qualified embedded security experts.
Moreover, the product builds on the hardware security features of existing processors while providing the software tools and processes to work with existing manufacturing flows, without requiring secure manufacturing.
For a deeper dive into this solution, the Getting Started Guide on the AWS Partner Device Catalog provides board bring-up steps and example code for many of the supported use cases.

Originally posted HERE.

Read more…

By: Kiva Allgood, Head of IoT for Ericsson

Recently, I had the pleasure of participating in PTC’s LiveWorx conference as it went virtual, adding further credence to its reputation as the definitive event for digital transformation. I joined PTC’s Chief Technology Officer Steve Dertien for a presentation on how to unleash the power of industrial IoT (IIoT) and cellular connectivity.

A lot has changed in business over the past few months. With a massive remote migration the foremost priority, many business initiatives were put on the back burner. IIoT wasn’t one of them. The realm has remained a key strategic objective; in fact, considering how it can close distances and extend what industrial enterprises are able to monitor, control and accomplish, it’s more important than ever.

Ericsson and PTC formed a partnership specifically to help industrial enterprises accelerate digital transformation. Ericsson unlocks the full value of global cellular IoT connectivity and provides on-premise solutions. PTC offers an industrial IoT platform, ready to configure and deploy, with flexible connectivity and capabilities to build IoT solutions without manual coding.

This can enable enterprises to speed up cellular IoT deployments, realize the advantages of Industry 4.0 and better compete. Further, they can create a foundation for 5G, introducing such future benefits as network slicing, edge computing and high reliability, low-latency communications.

It all sounds great, I know, but if you're like most folks, you probably have a few basic questions on your mind. Here are a few of the ones I typically receive and appreciate the most.

Why cellular?

You’re connected already, via wire or Wi-Fi, so why is cellular necessary? You need reliable, global and dedicated connectivity that’s flexible to deploy. If you think about a product and its lifecycle, it may be manufactured in one location, land in another, then ultimately move again. If you can gather secure insight from it – regardless of where it was manufactured, bought or sold – you can improve operational efficiency, product capabilities, identify new business opportunities and much more.

What cellular can do especially well is effectively capture all that value by combining global connectivity with a private network. Then, through software like PTC’s, you can glean an array of information that’ll leave you wondering how else you can use the technology, regardless of whether the data is on or off the manufacturing floor. For instance, by applying virtual or augmented reality (VR/AR), you can find product defects before they leave the factory or end up in other products.

That alone can eliminate waste, save money from production to shipping, protect your reputation and much more.

According to analysts at ABI Research, we’ll see 4.3 billion wireless connections in smart factories by 2030, leading to a $1 trillion smart manufacturing market. For those that embrace Industry 4.0, private cellular has the potential to improve gross margins by 5-13% for factory and warehouse operations. What’s more, manufacturers can expect a 10x return on their investment.

You just need to be able to turn data into actionable intelligence throughout the product's lifecycle and across your global enterprise, securely and reliably – and that's what cellular delivers.

Where do I start?

People don’t often ask for cellular or a dedicated private network specifically. They come to us with questions about things like how they can improve production cycle times or reduce costs by a certain percentage. That’s exactly where you should begin, too.

I come from the manufacturing space where for years I lived quality control, throughput and output. When someone would introduce a new idea, we’d vet it with a powerful but simple question: How will this make or save us money? If it couldn’t do either, we weren’t interested.

Look at your products and processes the same way when it comes to venturing into IIoT and digital transformation. Find the pain points. Identify defects, bottlenecks and possible improvements. Seek out how to further connect your business and the opportunities that could present. Data is indeed the new oil; it’s the intelligence that’ll help you understand where you need to go and what you need to do to move forward or create a new business.

What should I look for?

To get off on the right foot, be sure to engage the right partners. Realize this is a very complex area; no single provider can offer a solution that’ll address every need in one. You need partners with an ecosystem of their own best-of-breed partners; that’s why we work with companies like PTC. We have expertise in specific areas, focus on what we do best and work closely together to ensure we approach IIoT right.

We are building on an established foundation we created together. Both organizations have invested a lot of time, money, R&D cycles, and process work in developing our individual and collective offerings. That means not only will we be working together into the future, but customers are also assured they'll remain at the forefront of innovation.

That future proofing is what you need to look for as well. You need wireless connectivity for applications involving asset tracking, predictive maintenance, digital twins, human-robot workflow integration and more. While Industry 4.0 is a priority, you want to lay a foundation for fast adoption of 5G, too.

There are other considerations to keep in mind down the road, such as your workforce. Employees may not want to be "machines" themselves, but many will want to become robotics engineers or use AR or VR for artificial intelligence analysis. The future of work is changing, too, and IIoT offers a way to keep employees engaged.

Originally posted HERE

CLICK HERE to view Kiva Allgood's LiveWorx presentation, "Unleashing the Power of Industrial IoT and Cellular Connectivity."

Read more…

Written by: Mirko Grabel

Edge computing brings a number of benefits to the Internet of Things. Reduced latency, improved resiliency and availability, lower costs, and local data storage (to assist with regulatory compliance) to name a few. In my last blog post I examined some of these benefits as a means of defining exactly where is the edge. Now let’s take a closer look at how edge computing benefits play out in real-world IoT use cases.

Benefit No. 1: Reduced latency

Many applications have strict latency requirements, but when it comes to safety and security applications, latency can be a matter of life or death. Consider, for example, an autonomous vehicle applying brakes or roadside signs warning drivers of upcoming hazards. By the time data is sent to the cloud and analyzed, and a response is returned to the car or sign, lives can be endangered. But let’s crunch some numbers just for fun.

Say a Department of Transportation (DoT) in Florida is considering a cloud service to host the apps for its roadside signs. One of the vendors on the DoT's shortlist hosts its cloud in California. The DoT's latency requirement is less than 15 ms. Light in fiber incurs about 5 μs of delay per km, and the distance from the U.S. east coast to the west coast is about 5,000 km. Do the math and the resulting round-trip latency is 50 ms. It's pure physics. If the DoT requires a real-time response, it must move the compute closer to the devices.
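The arithmetic behind that figure is a one-liner:

```python
# Back-of-the-envelope check of the coast-to-coast round trip.
delay_us_per_km = 5      # light in fiber: roughly 5 microseconds per km
distance_km = 5_000      # US east coast to west coast

round_trip_ms = 2 * distance_km * delay_us_per_km / 1000
print(round_trip_ms)     # 50.0, far above the 15 ms requirement
```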

Benefit No. 2: Improved resiliency/availability

Critical infrastructure requires the highest level of availability and resiliency to ensure safety and continuity of services. Consider a refinery gas leakage detection system. It must be able to operate without Internet access. If the system goes offline and there’s a leakage, that’s an issue. Compute must be done at the edge. In this case, the edge may be on the system itself.

While it’s not a life-threatening use case, retail operations can also benefit from the availability provided by edge compute. Retailers want their Point of Sale (PoS) systems to be available 100% of the time to service customers. But some retail stores are in remote locations with unreliable WAN connections. Moving the PoS systems onto their edge compute enables retailers to maintain high availability.

Benefit No. 3: Reduced costs

Bandwidth is almost infinite, but it comes at a cost. Edge computing allows organizations to reduce bandwidth costs by processing data before it crosses the WAN. This benefit applies to almost any use case, but two examples where it is especially evident are video surveillance and preventive maintenance. A single city-deployed HD video camera may generate 1,296 GB a month; streaming that data over LTE quickly becomes cost-prohibitive. Adding edge compute to pre-aggregate the data significantly reduces those costs.
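The monthly figure is consistent with a roughly 4 Mbit/s HD stream (an assumed bitrate, not stated above):

```python
# Monthly volume of one HD camera at an assumed 4 Mbit/s stream.
bitrate_mbit_s = 4
seconds_per_month = 30 * 24 * 3600       # 2,592,000 s

gb_per_month = bitrate_mbit_s / 8 * seconds_per_month / 1000  # MB -> GB
print(gb_per_month)                      # 1296.0
```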

Manufacturers use edge computing for preventive maintenance of remote machinery. Sensors monitor temperatures and vibrations, and the currency of this data is critical, as the slightest variation can indicate a problem. To catch issues as early as possible, the application requires high-resolution data (for example, 1,000 samples per second). Rather than sending all of this data over the Internet to be analyzed, edge compute filters it so that only averages, anomalies, and threshold violations are sent to the cloud.
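That filtering step can be sketched as below. The threshold value and z-score cutoff are illustrative assumptions, not values from any particular deployment:

```python
import statistics

VIBRATION_LIMIT = 80.0   # hypothetical limit set by the manufacturer

def summarize_window(samples, z_cutoff=3.0):
    """Reduce one second of high-rate readings to what is worth sending
    upstream: the average plus anomalies and threshold violations."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples) or 1e-9
    return {
        "avg": round(mean, 3),
        "anomalies": [s for s in samples if abs(s - mean) / stdev > z_cutoff],
        "violations": [s for s in samples if s > VIBRATION_LIMIT],
    }

window = [50.0] * 998 + [51.0, 95.0]     # one spike in a second of data
summary = summarize_window(window)        # only this summary crosses the WAN
```

A thousand raw samples collapse into one average and the single spike worth reporting, which is where the bandwidth saving comes from.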

Benefit No. 4: Comply with government regulations

Countries are increasingly instituting privacy and data retention laws. The European Union’s General Data Protection Regulation (GDPR) is a prime example. Any organization that has data belonging to an EU citizen is required to meet the GDPR’s requirements, which includes an obligation to report leaks of personal data. Edge computing can help these organizations comply with GDPR. For example, instead of storing and backhauling surveillance video, a smart city can evaluate the footage at the edge and only backhaul the meta data.

Canada’s Water Act: National Hydrometric Program is another edge computing use case that delivers regulatory compliance benefits. As part of the program, about 3,000 measurement stations have been implemented nationwide, and any missing data requires justification. Storing data at the edge ensures retention even when a station loses connectivity.

Bonus Benefit: “Because I want to…”

Finally, some users simply prefer to have full control. By implementing compute at the edge rather than in the cloud, users gain flexibility. We have seen this in manufacturing: technicians, who know the machinery best, want full control over it. Edge computing gives them that control, as well as independence from IT, while security and availability remain top of mind.

Summary

By reducing latency and costs, improving resiliency and availability, and keeping data local, edge computing opens up a new world of IoT use cases. Those described here are just the beginning. It will be exciting to see where edge computing turns up next.

Originally posted: here

Read more…

 

Question 1 : So, let’s start with the obvious question. What is DevOps and why is it inevitable for today’s businesses to adopt?

Answer : DevOps, at the end of the day, if you look at it from a higher level, is really the automation of agile, a better way to perform application design, development and deployment. This is far superior to the waterfall method, which I actually started working with, and was even teaching in college, back in the 80s and 90s. So, the idea is that we’re going to, in essence, continuously improve the software through an agile methodology, where we get together to deal with events versus some sort of a schedule or sequence for how software is delivered.

So, DevOps really is the ability to automate that. It’s the idea that we can code an application, say an application that runs on Linux, hit a button, and it automatically goes through testing, including penetration testing, security testing, stability testing and performance testing. It then moves into a continuous integration process, then into a continuous deployment process, is pushed out to a particular staging area, and then from the staging area to a production server.
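That stage-by-stage hand-off is the essence of a pipeline. As a toy illustration only (real pipelines are defined in CI tools such as Jenkins or GitLab CI, not in application code; the stage names below simply mirror the ones mentioned above):

```python
# Toy model of the pipeline described above: each stage must pass
# before the artifact moves on toward production.
def run_pipeline(artifact, stages):
    log = []
    for name, check in stages:
        if not check(artifact):
            log.append(f"{name}: FAILED - pipeline stops")
            return log
        log.append(f"{name}: passed")
    log.append("deployed to production")
    return log

stages = [
    ("unit & security tests", lambda a: a["tests_pass"]),
    ("continuous integration", lambda a: a["builds"]),
    ("staging deployment",     lambda a: a["staging_ok"]),
]

build = {"tests_pass": True, "builds": True, "staging_ok": True}
for line in run_pipeline(build, stages):
    print(line)
```

A failing stage short-circuits the rest, which is exactly why automated gates can keep humans out of the loop for routine releases.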

The goal of DevOps is really to remove the humans from that process, even though we haven’t done that completely yet. It is, in essence, to create a repeatable process, leveraged with a number of toolsets working together, to streamline the modification and delivery of software in a way that improves quality each time the software is delivered. There are some cultural issues around DevOps as well, by the way, that are just as important: the ability, in essence, to understand that things are going to be integrated and iterative, the ability to deal with feedback directly from the testers and the operators, and the ability to flatten the organization and have a very open and interactive organization moving forward. And that’s the other side of the coin.

So people have a tendency to look at DevOps as just a toolchain with lots of cool tools (continuous integration, continuous testing, those sorts of things working together), but it’s really a combination of a toolchain and a process, plus a cultural change that’s probably more important than any of the technological changes.

Question 2 : One interesting point you mentioned was about agile. As we all know, agile is a very commonly adopted methodology in the software industry, and a lot of companies are implementing it successfully. As we talk about DevOps, I know it’s an extension, but how does it complement agile from a practical implementation standpoint?

Answer : Again, DevOps is really very much the automation of agile. Agile takes a cultural change, an organizational change, to make it effective, and ultimately we’re leveraging a toolchain within DevOps as a way to automate everything that occurs in an agile environment. So, if we’re getting together on a daily basis to form a scrum and we’re talking about what needs to be changed, then typically the DevOps toolchain is where those changes are going to occur.

Read more…

 

As small as a postage stamp, the Seeeduino XIAO boasts a 32-bit Arm Cortex-M0+ processor running at 48 MHz with 256 KB of flash memory and 32 KB of SRAM.

A couple of months ago, I penned a column, The Worm Turns, in which I revealed that — although I’d been bravely fighting my urges — my will had crumbled and I had decided to create a display comprising a 12 x 12 = 144 array of ping pong balls, each illuminated with a tricolor WS2812 LED (a.k.a. a NeoPixel).

8221240097?profile=RESIZE_400x

The author proudly presenting his 12 x 12 ping pong ball array (Click image to see a larger version — Image source: Max Maxfield)

First, I found a pack of 144 ping pong balls on Amazon for only $11. I ordered two cartons because I knew I would need some spares. Of course, this immediately tempted me to increase the size of my array to 15 x 15 = 225 ping pong balls, but I’d already ordered 150 NeoPixels in the form of five meters of 30 pixels/meter strips from Adafruit, so I decided to stick with the original plan, which we will call “Plan A” so no one gets confused.

Thank goodness I restrained myself, because the 12 x 12 array is proving to be a lot more work than I expected — a 15 x 15 array would have brought me to my knees.

The next step was to build a 2-ball prototype because I wanted to see whether it was best to attach the NeoPixel to the outside of the ball (the fast-and-easy option) or inside the ball (the slow-and-painful alternative). Although you can’t see it from the picture or from this video, there is a slight but noticeable difference in the real world, and one method is indeed better than the other — can you guess which one?

8221246481?profile=RESIZE_400x

A prototype using two ping pong balls (Click image to see a larger version — Image source: Max Maxfield)

Have you ever tried to drill 3/8” holes into 144 ping pong balls? Me neither. Over the years, I’ve learned a thing or two, and one of the things I’ve learned is that drilling holes in ping pong balls always ends in tears. Thus, I ended up cutting these holes using a small pair of curved nail scissors (there’s one long evening I’ll never see again).

The reason for using the strips is that this is the cheapest way to purchase NeoPixels with associated capacitors in the easiest-to-use form. Unfortunately, the ball-to-ball spacing (43 mm) on the board is greater than the pixel-to-pixel spacing (33 mm) on the strip. This means chopping the strip into segments, attaching each segment to its associated ping pong ball, and then connecting adjacent segments together using three wires. So, 144 x 3 = 432 short wires to strip and solder. Do you have any idea how long this takes? I do!

8221247276?profile=RESIZE_400x

The Seeeduino XIAO is the size of a small postage stamp (Click image to see a larger version — Image source: Seeed Studio)

Now, you may have noticed that I was driving my 2-ball prototype with an Arduino Uno, but this is too large to be used in my array. In the past, I would have been tempted to use an Arduino Nano, which is reasonably small and not-too-expensive. On the other hand, the fact that this is an 8-bit processor running at only 16 MHz with only 32 KB of flash memory and only 2 KB of SRAM would limit the effects I could achieve.

Sometimes (rarely) the fates decide to roll the dice in one’s favor. In this case, while I was pondering which processor to employ, the folks from Seeed Studio contacted me to tell me about their Seeeduino XIAO.

OMG! This little rapscallion — which is only the size of a small postage stamp and costs only $5 — is awesome! In addition to a 32-bit Arm Cortex-M0+ processor running at 48 MHz, this bodacious beauty boasts 256 KB of flash memory and 32 KB of SRAM.

As an aside, it’s important to note that the Seeeduino XIAO’s programming connector is USB Type-C, which means you’re going to need a USB-A to USB Type-C cable.

8221253899?profile=RESIZE_584x

The Seeeduino XIAO’s 11 input/output pins pack a punch (Click image to see a larger version — Image source: Seeed Studio)

In addition to its power and ground pins, the Seeeduino XIAO has 11 data pins, each of which can act as an analog input or a digital input/output (I/O). Furthermore, one of these pins can be driven by an internal digital-to-analog converter (DAC) and act as a true analog output, while other pins can be used to provide I2C, SPI, and UART interfaces.

Sad to relate, there is one small fly in the soup or a large elephant in the room (I’m feeling generous today, so I’ll let you pick the metaphor you prefer). The problem is that, although it can be powered with the same 5 V supply as the NeoPixels, the Seeeduino XIAO’s I/O pins use a 3.3 V interface, but the NeoPixels require 5 V data signals, so we need some way to convert between the two.

In the past, I would probably have used a full-up bidirectional logic level converter, like the 4-channel BOB (breakout board) from SparkFun, but I only need a single unidirectional signal, so that seemed like overkill.

Happily, I recently ran across an awesome hack on Hackaday.com that provides a simple solution requiring only a single general-purpose 1N4001 diode.

8221260061?profile=RESIZE_584x

A cheap-and-cheerful voltage level converter hack (Click image to see a larger version — Image source: Max Maxfield)

The way this works is rather clever. From the NeoPixel’s data sheet we learn that a logic 1 is considered to be 0.7 * Vcc. Since we are powering our NeoPixels with 5 V, this means a logic 1 will be 0.7 * 5 = 3.5 V, which is higher than the XIAO’s 3.3 V digital output. Bummer!

Actually, if the truth be told, there is some “wriggle room” here, and the 3.3 V signal from the XIAO might work, but are we the sort of people for whom “might” is good enough? Of course we aren’t!

The solution is to add a “sacrificial NeoPixel” at the beginning of the chain, and to power this pixel via our 1N4001 diode. Since the 1N4001 has a forward voltage drop of 0.7 V, the first NeoPixel will see a Vcc of 5 – 0.7 = 4.3 V. Remember that the NeoPixel considers a logic 1 to be anything above 0.7 * Vcc, so this first NeoPixel will accept anything above 0.7 * 4.3 = 3.01 V as a logic 1. Meanwhile, the next NeoPixel in the chain will see the 4.3 V data signal coming out of the first NeoPixel as a valid logic 1. Pretty clever, eh?
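The arithmetic is easy to sanity-check. The sketch below simply re-derives the numbers above (the 0.7 V forward drop is the nominal datasheet value; a real 1N4001’s drop varies a little with current):

```python
# Sanity-check the "sacrificial NeoPixel" level-shift trick.
VCC = 5.0
DIODE_DROP = 0.7          # nominal forward drop of a 1N4001
XIAO_OUTPUT = 3.3         # the XIAO drives 3.3 V logic

first_pixel_vcc = VCC - DIODE_DROP          # 4.3 V
logic1_threshold = 0.7 * first_pixel_vcc    # 3.01 V

# The XIAO's 3.3 V output now clears the first pixel's threshold...
assert XIAO_OUTPUT > logic1_threshold
# ...and the first pixel's ~4.3 V data output clears the 3.5 V threshold
# of the remaining pixels, which run from the full 5 V rail.
assert first_pixel_vcc > 0.7 * VCC
print(f"threshold seen by first pixel: {logic1_threshold:.2f} V")
```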

I’m currently about half of the way through wiring everything up, and I cannot wait to see my array light up for the first time. Once everything is up and running, I will return to regale you with more details. Until that frabjous day, I will be delighted to hear your comments, questions, and suggestions.

Originally posted HERE.

Read more…

Ever wanted the power of the all-new Raspberry Pi 4 single-board computer, but in a smaller form factor, with more options to expand the I/Os and their functions? The Raspberry Pi Compute Module 4 (a.k.a. CM4) has you covered! In this article, we’ll take a deep dive into the all-new CM4 to see what’s new and how the latest iteration differs from its predecessor, the CM3.

Introduction - The System on Module Architecture

The CM4 can be described as a ‘stripped-down’ version of the Raspberry Pi 4 Model B that contains the same processor, memory, eMMC flash memory and power regulation circuitry. The CM4 looks almost like a breakout board with two connectors underneath, hence the name “System on Module (SoM)”. What differentiates the CM4 (and all compute modules, for that matter) from the regular Raspberry Pi 4 is that the CM4 does not come equipped with any hardware I/O ports such as USB, Ethernet and HDMI; instead, it offers access to all the useful I/O pins of the CPU, allowing designers to connect whatever external peripherals their circuit designs require. This gives designers and developers the freedom to use the computing power of the Raspberry Pi 4 while reducing the overall cost of their designs by including only what’s necessary.

 

What’s New In The CM4?

The key difference with the CM4, at first glance, is the form factor of the module. Previous versions, including the CM3, used the DDR2-SODIMM (mechanically compatible) form factor, which looked like a laptop RAM stick. The CM4 comes in a smaller form factor with 2x 100-pin high-density connectors that can be ‘popped on’ to the receiving board.

8221226476?profile=RESIZE_710x

Key Features

The CM4 comes in 32 different variants with varying flash and RAM options and optional wireless connectivity. As with its predecessors, there is also a CM4Lite version, which does not include built-in eMMC memory, reducing the cost of the module to a minimum of $25. However, all variants of the CM4 are equipped with the following key features:

 
  • Broadcom BCM2711, Quad Core Cortex-A72 (Arm V8) 64-bit System on Chip, running at 1.5 GHz

  • 1/2/4/8GB LPDDR4 RAM options

  • 0(CM4Lite)/8/16/32GB of eMMC storage options (up to 100 MB/s bandwidth)

  • Smaller footprint of 55mm x 40mm x 4.7mm (w x l x h)

  • Supports H.265 (4Kp60 decode) and H.264 (1080p60 decode, 1080p30 encode); OpenGL ES 3.0 graphics

  • Radio Module

  • 2.4/5GHz IEEE 802.11 b/g/n/ac Wireless (optional)

  • Bluetooth 5.0 BLE

  • On-board selector to switch between PCB trace antenna and external antenna

  • On-board Gigabit Ethernet PHY supporting IEEE 1588 standard

  • 1x PCI Express Gen2.0 lane (5Gbps)

  • 2x HDMI 2.0 ports (up to 4Kp60)

  • 1x USB 2.0 port (480MBps)

  • 28x GPIO pins, with support for either 1.8V or 3.3V logic levels, along with the following peripheral options:

  • 2x PWM channels

  • 3x GPCLK

  • 6x UART (Serial)

  • 6x I2C

  • 5x SPI

  • 1x SDIO interface

  • 1x DPI

  • 1x PCM

  • MIPI DSI (Serial Display)

  • 1x 2-lane MIPI DSI display port

  • 1x 4-lane MIPI DSI display port

  • MIPI CSI-2 (Serial Camera)

  • 1x 2-lane MIPI CSI camera port

  • 1x 4-lane MIPI CSI camera port

  • 1x +5V Power Supply Input (on-board regulator circuitry available)

 

The Applications - DIY? Industrial?

The CM4 can be integrated into end products that are designed and prototyped using the full-size Raspberry Pi 4 SBC. This allows the removal of unused ports, peripherals and components, which reduces overall cost and complexity. Application ideas are therefore virtually limitless, ranging from DIY projects such as the PiBoy to industrial IoT designs such as integrated home automation systems, small-scale hosting servers, data exchange hubs and portable electronics that require the processing power offered by the CM4, all while maintaining a small form factor and low power consumption. Compute module clusters such as the Turing Pi 2, which harnesses the power of multiple compute modules, are also an option with this powerful yet small System on Module, the Raspberry Pi CM4.

 

How Can I Use Upswift Solutions On My Compute Module 4 Based Design?

Upswift offers hassle-free management solutions for all Linux-based embedded systems (CM4 included), by providing you a one-click solution to monitor, control and manage all your connected devices, from one place.

Originally posted HERE.

Read more…

It’s been a long time since I performed Karnaugh map minimizations by hand. As a result, on my first pass, I missed a couple of obvious optimizations.

I’m sorry about the title of this blog, but I’m feeling a little wackadoodle at the moment. I think the problem is that I’m giddy with excitement at the thought of the forthcoming Thanksgiving holiday.

So, here’s the deal. Starting sometime in 2021, I’m going to be writing a series of columns for Practical Electronics magazine in the UK teaching digital logic fundamentals to absolute beginners.

This will have a hands-on component with an accompanying circuit board. We’re going to start by constructing some simple logic gates at the transistor level, then use primitive logic gates in 7400-series ICs to construct more sophisticated functions, and work our way up to… but I fear I can say no more at the moment.

After we’ve created some really simple combinatorial functions — like a 2:1 multiplexer — by hand, we’re going to introduce things like Boolean algebra, DeMorgan transforms, and Karnaugh maps, and then we are going to use what we’ve learned to implement more complex combinatorial functions, culminating in a BCD to 7-segment decoder, before we progress to sequential circuits.

I was sketching out some notes this past weekend. Prior to the BCD to 7-segment decoder, we’ll already have tackled a BCD to decimal decoder, so a lot of the groundwork will have been laid. We’ll start by explaining how the segments in the 7-segment display are identified using the letters ‘a’ through ‘g’ and showing the combinations of segments we use to create the decimal digits 0 through 9.

8217684257?profile=RESIZE_710x

Using a 7-segment display to represent the decimal digits 0 through 9 (Click image to see a larger version — Image source: Max Maxfield)

Next, we will create the truth table. We’ll be using a common cathode 7-segment display, which means active-high outputs from our decoder because this is easier for newbies to wrap their brains around.

8217685658?profile=RESIZE_710x

Truth table for BCD to 7-segment decoder with active-high outputs (Click image to see a larger version — Image source: Max Maxfield)

Observe the input combinations shown in red in the truth table. We’ll point out that, in our case, we aren’t planning on using these input combinations, which means we don’t care what the corresponding outputs are because we will never actually see them (we’re using ‘X’ characters to represent the “don’t care” values). In turn, this means we can use these don’t care values in our Karnaugh maps to aid us in our logic minimization and optimization.

The funny thing is that it’s been a long time since I performed Karnaugh map minimizations by hand. As a result, on my first pass, I missed a couple of obvious optimizations. Just for giggles and grins, I’ve shown the populated maps below. Before you look at my solutions, why don’t you take a couple of minutes to perform your own minimizations to see how much you remember?

 8217691254?profile=RESIZE_710x

Use these populated maps to perform your own minimizations and optimizations (Click image to see a larger version — Image source: Max Maxfield)

I should point out that I’m a bit rusty at this sort of thing, so you might want to check that I’ve correctly captured the truth table and accurately populated these maps before you leap into the fray with gusto and abandon.
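If you do try your hand at the minimization, a few lines of code make it easy to check a candidate equation exhaustively against the truth table. The equation below is one candidate minimization for segment ‘a’ (not necessarily the one shown in the maps; with the don’t-care terms in play, several equally valid solutions exist), assuming the BCD inputs are labeled A (MSB) through D (LSB):

```python
# Verify a candidate segment-'a' equation against the truth table.
# Segment 'a' is lit for digits 0, 2, 3, 5, 6, 7, 8 and 9.
SEG_A_ON = {0, 2, 3, 5, 6, 7, 8, 9}

def seg_a(A, B, C, D):
    # One candidate minimization: a = A + C + B.D + /B./D
    return bool(A or C or (B and D) or (not B and not D))

for digit in range(10):                  # inputs 10-15 are don't cares
    A, B, C, D = ((digit >> 3) & 1, (digit >> 2) & 1,
                  (digit >> 1) & 1, digit & 1)
    assert seg_a(A, B, C, D) == (digit in SEG_A_ON)
print("segment 'a' equation verified for digits 0-9")
```

Because the don’t-care combinations are excluded from the loop, any grouping that exploits them still passes the check.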

Remember that we’re dealing with absolute beginners here, so, even though I will have only recently introduced them to Karnaugh map techniques, I think it would be a good idea to commence this portion of the discussion by walking them through the process for segment ‘a’ step-by-step, as illustrated below.

8217692064?profile=RESIZE_710x

Karnaugh map minimizations for 7-segment display (Click image to see a larger version — Image source: Max Maxfield)

Next, I extracted the Boolean equations corresponding to the Karnaugh map minimizations. As shown below, I’ve color-coded any product terms that appear multiple times. I don’t recall seeing this done before, but I think it could be a useful aid for beginners. Once again, I’d be interested to hear your thoughts about this.

8217692289?profile=RESIZE_710x

Boolean equations for 7-segment display (Click image to see a larger version — Image source: Max Maxfield)

Actually, I’d love to hear your thoughts on anything I’ve shown here. Do you think the way I’ve drawn the diagrams is conducive to beginners understanding what’s going on? Can you spot anything I’ve missed or could do better? I can’t wait for you to see what we have planned with regards to the circuit board and the “hands-on” part of this forthcoming series (I will, of course, be reporting back further in the future). Until then, as always, I welcome your comments, questions, and suggestions.

Originally posted HERE.

Read more…

 

by Sam Kingsley

The "internet of things" will provide cyber criminals with new ways to exploit faults in personal security systems.

As the number of online devices surges and superfast 5G connections roll out, record numbers of companies are offering handsome rewards to ethical hackers who successfully attack their cybersecurity systems.

The fast-expanding field of internet-connected devices, known as the "internet of things" (IoT) and including smart televisions and home appliances, is set to become more widespread once 5G becomes more available—posing one of the most serious threats to digital security in the future.

At a conference hosted by Nokia last week, "friendly hacker" Keren Elazari said that co-opting hackers—many of whom are amateurs—to hunt for vulnerabilities "was looked at as a trendy Silicon Valley thing six to eight years ago".

But "bug bounty programmes" are now offered by organisations ranging from the Pentagon and banks such as Goldman Sachs to airlines, tech giants and thousands of smaller businesses.

The largest bug-bounty platform, HackerOne, has 800,000 hackers on its books and said the organisations it works with paid out a record $44 million (38.2 million euros) in cash rewards this year, up 87 percent on the previous 12 months.

"Employing just one full-time security engineer in London might cost a company 80,000 pounds (89,000 euros, $106,000) a year, whereas we open companies up to this global community of hundreds of thousands of hackers with a huge diversity in skills," Prash Somaiya, security solutions architect at HackerOne, told AFP.

8212233260?profile=RESIZE_710x

"We already know from what has happened in the past five years that the criminals find very clever ways to utilise digital devices," a friendly hacker told AFP

"We're starting to see an uptick in IoT providers taking hacking power seriously," Somaiya said, adding that HackerOne now regularly ships internet-connected toys, thermostats, scooters and cars out to its hackers for them to try to breach.

"We already know from what has happened in the past five years that the criminals find very clever ways to utilise digital devices," Elazari told AFP.

A sobering example was the 2016 "Mirai" cyberattack, during which attackers took control of 300,000 unsecured devices, including printers, webcams and TV recorders, and directed them to flood and disable websites of media, companies and governments around the world.

"In the future of 5G we're talking about every possible device having high-bandwidth connections, it's not just your computer or your phone," Elazari warned.

In October Nokia announced it had detected a 100 percent increase in malware infections on IoT devices in the previous year, noting in its threat report that each new application of 5G offers criminals "more opportunities for inflicting damage and extracting ransom".

8212234673?profile=RESIZE_710x

"Bug bounty programmes" are now offered by organisations ranging from the Pentagon and banks such as Goldman Sachs to airlines, tech giants and thousands of smaller businesses.
 

Breaker mindset

The rewards for hackers can be high: 200 of HackerOne's bug-hunters have now claimed more than $100,000 in prizes, while nine have breached the million-dollar earnings mark.

Apple, which advertises its own bug bounty programme, increased its maximum reward to more than $1 million at the end of last year, for a hacker able to demonstrate "zero click" weaknesses that would allow someone to access a device without any action by the user.

"A big driver is of course the financial incentive, but there's this element of a breaker mindset, to figure out how something is built so you can break it and tear it apart," Somaiya said.

"Being one individual who's able to hack multibillion-dollar companies is a real thrill, there's a buzz to it."

The rush of companies shifting to remote working during the pandemic has also led to "a surge in hacktivity", HackerOne said, with a 59 percent increase in hackers signing up and a one-third increase in rewards paid out.

The French and UK governments are among those to have opened up coronavirus tracing apps to friendly hackers, Somaiya added.

8212235461?profile=RESIZE_710x

"I see a lot of risk for misconfiguration and improper access control, these glitches are one of the main risks," Silke Holtmanns, head of 5G security research for cybersecurity firm AdaptiveMobile, told AFP
 

Incentive to act

While 5G internet systems will have new security features built into the network infrastructure—something absent before—the new technology is vastly more complex than its predecessors, leaving more potential for human error.

"I see a lot of risk for misconfiguration and improper access control, these glitches are one of the main risks," Silke Holtmanns, head of 5G security research for cybersecurity firm AdaptiveMobile, told AFP.

But companies are being motivated to act as security moves up the agenda, Holtmanns believes.

The EU, along with governments around the world, has begun tightening cybersecurity demands on organisations, and fines for data breaches have been increasing.

"Before now it's been hard for companies to justify higher investment in security," Holtmanns, who sits on the EU cybersecurity advisory group Enisa, said.

But she added, "If they can say: 'With that security level we can attract a higher level of customer, or lower insurance premiums,' people start thinking in this direction, which is a good thing."

Originally posted HERE.

Read more…

Everybody Needs a ShieldBuddy

 

Arduino Mega footprint; three 32-bit cores all running at 200 MHz; 4 Mbytes of Flash and 500 Kbytes of RAM; works with the Arduino IDE; what’s not to love?

I tend to have a lot of hobby projects on the go at any particular time. Occasionally, I even manage to finish one. More rarely, one actually works.

maxncb-0102-01-awesome-audio-reactive-artifact-300x218.jpg?profile=RESIZE_400x

 Awesome Audio Reactive Artifact (Click image to see a larger version — Image source: Max Maxfield)

I also have a soft spot for 8-bit microprocessors and microcontrollers. Thus, many of my hobby projects are based on the Arduino Nano, Uno, or Mega platforms.

Take my Awesome Audio Reactive Artifact, for example. This little rascal is currently powered using an Arduino Uno, which is driving 145 tricolor NeoPixels. In turn, these NeoPixels are mounted under 31 defunct vacuum tubes (see also Awesome Audio-Reactive Artifact meets BirmingHAMfest).

The Awesome Audio Reactive Artifact also includes an ADMP401-based MEMS Microphone breakout board (BOB), which costs $10.95 from the guys and gals at SparkFun. In turn, this feeds a cheap-and-cheerful MSGEQ7 audio spectrum analyzer chip, which relieves the Arduino of a lot of processing pain (see also Using MSGEQ7s In Audio-Reactive Projects).

 

maxncb-0102-02-countdown-timer-300x200.jpg?profile=RESIZE_400x Countdown Timer (Click image to see a larger version — Image source: Max Maxfield) 

Sad to relate, 8-bit Arduinos sometimes run out of steam. Consider my Countdown Timer, for example, whose task it is to display the years (YY), months (MM), days (DD), hours (HH), minutes (MM), and seconds (SS) to my 100th birthday (see also Yes! My Countdown Timer is Alive!).

This little scamp employs 12 Lixie displays, each of which contains 20 NeoPixels, which gives us 240 NeoPixels in all. As the sophistication of the effects I was trying to implement increased, so did my processing requirements. Thus, I decided to use a Teensy 3.6, which features a 32-bit 180 MHz ARM Cortex-M4 processor with a floating-point unit. Furthermore, the Teensy 3.6 boasts 1 Mbyte of Flash memory for code, along with 256 Kbytes of RAM for dynamic data and variables.

maxncb-0102-03-inamorata-prognostication-engine-152x300.jpg?profile=RESIZE_180x180 Prognostication Engine (Click image to see a larger version — Image source: Max Maxfield)

All of which brings us to the pièce de résistance in the form of my Pedagogical and Phantasmagorical Prognostication Engine (see also The Color of Prognostication). This bodacious beauty sports two knife switches, eight toggle switches, ten pushbutton switches, five motorized potentiometers, six analog meters, and a variety of sensors (temperature, barometric pressure, humidity, proximity). All of this requires a bunch of analog and digital general-purpose input/output (GPIO) pins.

Furthermore, in addition to a wealth of weird, wonderful, and wide-ranging sound effects, the engine is equipped with 354 NeoPixels. These could potentially be daisy-chained from a single pin, although I ended up partitioning them into five strands. More importantly, the various effects require a lot of processing and memory.

When things finally started to come together on this project, I was initially thinking of using an Arduino Mega to power the beast, mainly because it has 54 digital pins and 16 analog inputs. On the downside, we have to remember that this is only an 8-bit processor gamely running at 16 MHz with a scant 256 Kbytes of Flash memory and 8 Kbytes of RAM. Furthermore, the Mega doesn’t have a floating-point unit (FPU), which means that if you need to use floating-point operations, this will really impact the performance of your programs.

maxncb-0102-04-hitex-shieldbuddy-300x155.jpg?profile=RESIZE_400x The tri-core ShieldBuddy (Click image to see a larger version — Image source: Hitex)

But turn that frown upside down into a smile, because the boffins at Hitex (hitex.com) have taken the Arduino Mega form factor and slapped an awesome Infineon Aurix TC275 processor down on it.

These processors are typically found only in state-of-the-art embedded systems; they rarely make it into the maker world (like the somewhat disheveled scientist who is desperately in need of a haircut says in the movie Independence Day: “They don’t let us out very often”).

The result is called the ShieldBuddy. As you can see in this video, I just took delivery of my first ShieldBuddy, and I’m really rather excited (I say “first” because I have no doubt this is going to be one of many).

So, what makes the ShieldBuddy so special? Well, how about the fact that the TC275 boasts three independent 32-bit cores, all running at 200 MHz, each with its own FPU, and all sharing 4 Mbytes of Flash and 500 Kbytes of RAM (actually, this is a bit of a simplification, but it will suffice for now). 

There’s no need for you to be embarrassed — I’m squealing in excitement alongside you. Now, if you are a professional programmer, you’ll be delighted to hear that the main ShieldBuddy toolchain is the Eclipse-based “FreeEntryToolchain” from HighTec/PLS/Infineon. This is a full-on C/C++ development environment with a source-level debugger and suchlike.

But how about if — like me — you aren’t used to awesomely powerful (and correspondingly complicated) Eclipse-based toolchains? Well, there’s no need to worry, because the guys and gals at Hitex also have a solution for the Arduino’s integrated development environment (IDE). 

Sit up straight and pay attention, because this is where things start to get really clever. In addition to any functions you create yourself, an Arduino sketch (program) always contains two functions: setup(), which runs only one time, and loop(), which runs over and over again. 

Now, remember that the ShieldBuddy has three processor cores, which we might call Core 0, Core 1, and Core 2. Well, you can take your existing sketches and compile/upload them for the ShieldBuddy, and — by default — they will run on Core 0. 

You could achieve the same effect by renaming your setup() function to be setup0(), and renaming your loop() function to be loop0(), which explicitly tells the compiler to target these functions at Core 0. 

The point is that you can also create setup1() and loop1() functions, which will automatically be compiled to run on Core 1, and setup2() and loop2() functions, which will automatically be compiled to run on Core 2. Any remaining functions will be compiled to run on whichever core needs to use them.
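The convention is easy to picture if you simulate it. The sketch below is not ShieldBuddy code — it just mimics the setupN()/loopN() naming scheme with one Python thread standing in for each core (on the real board, the Arduino build dispatches these functions to the three hardware cores for you):

```python
# Simulate the ShieldBuddy's per-core setupN()/loopN() convention,
# with one thread standing in for each of the three cores.
import threading

results = {}

def setup0(): results[0] = "core 0 ready"
def loop0():  results[0] += " / looping"

def setup1(): results[1] = "core 1 ready"
def loop1():  results[1] += " / looping"

def setup2(): results[2] = "core 2 ready"
def loop2():  results[2] += " / looping"

def run_core(setup, loop, iterations=3):
    # Each "core" runs its setup once, then its loop repeatedly,
    # mirroring the Arduino model.
    setup()
    for _ in range(iterations):
        loop()

threads = [threading.Thread(target=run_core, args=(s, l))
           for s, l in [(setup0, loop0), (setup1, loop1), (setup2, loop2)]]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In the simulation the three workers share the `results` dictionary, which loosely parallels the shared-memory communication between the real cores mentioned below.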

Although each core runs independently, they can communicate between themselves using techniques like shared memory. Also, you can use interrupts to coordinate and communicate between cores. 

And things just keep on getting better and better, because it turns out that a NeoPixel library is already available for the ShieldBuddy. 

I’m just about to start experimenting with this little beauty. All I can say is that everybody needs a ShieldBuddy and my ShieldBuddy is my new best friend (sorry Little Steve). How about you? Could any of your projects benefit from the awesome processing power provided by the ShieldBuddy? 

Originally posted HERE.

Read more…
