
Arm DevSummit 2020 debuted this week (October 6 – 8) as an online virtual conference focused on providing engineers with insights into the Arm ecosystem. Over its three days, Arm painted an interesting technology story about the current and future state of computing and where developers fit within that story. I’ve been attending Arm TechCon (which has become Arm DevSummit) for more than half a decade now, and as I perused the content, I noticed several take-a-ways for developers working on microcontroller-based embedded systems. In this post, we will examine these key take-a-ways, and I’ll point you to some of the sessions that I think may pique your interest.

(For those of you who aren’t yet aware, you can register for free up until October 21st and still watch the conference materials up until November 28th. Click here to register.)

Take-A-Way #1 – Expect Big Things from NVIDIA’s Acquisition of Arm

As many readers probably already know, NVIDIA is in the process of acquiring Arm. This acquisition has the potential to spark a technological revolution in computing, particularly around artificial intelligence, one that will impact nearly every embedded system at the edge and beyond. While many of us have probably wondered what plans NVIDIA CEO Jensen Huang may have for Arm, the keynotes for October 6th include a fireside chat between Jensen Huang and Arm CEO Simon Segars. Listening to this conversation is well worth the time; it will give developers some insight into the future, along with assurances that the Arm business model will not be dramatically upended.

Take-A-Way #2 – Machine Learning for MCUs is Accelerating

It is sometimes difficult at a conference to get a feel for what is real and what is a little more smoke and mirrors. Sometimes announcements are real, but they take several years to filter their way into the market and affect how developers build systems. Machine learning is one of those technologies that attracts a lot of interest but that developers aren’t quite sure what to do with yet, at least in the microcontroller space. When we hear machine learning, we think artificial intelligence, big datasets and more processing power than will fit on an MCU.

There were several interesting talks at DevSummit around machine learning.

Some of these talks were foundational, providing embedded developers with the fundamentals to get started, while others offered hands-on explorations of machine learning with development boards. The take-a-way I gather here is that the effort to bring machine learning capabilities to microcontrollers, so that they can be leveraged in industry use cases, is accelerating. A lot of effort is being put into ML algorithms, tools, frameworks and even the hardware. Several talks mentioned Arm’s Cortex-M55 processor, which will include Helium technology to help accelerate machine learning and DSP processing capabilities.
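To give a flavor of the kind of kernel Helium is meant to accelerate, here is a minimal sketch using the CMSIS-DSP library's dot-product routine, one of the core operations in neural-network inference. (The buffer names and sizes are illustrative assumptions, not taken from any DevSummit session.)

```c
#include "arm_math.h"   /* CMSIS-DSP */

#define VEC_LEN 32

static float32_t weights[VEC_LEN];     /* assumed: trained layer weights */
static float32_t activations[VEC_LEN]; /* assumed: input activations     */

/* A dot product is the heart of a fully connected neural-network
   layer. On a Cortex-M55 built with vector support, CMSIS-DSP maps
   this call onto Helium (MVE) instructions; on older Cortex-M cores
   it falls back to a scalar loop. */
float32_t fully_connected_neuron(void)
{
    float32_t acc;
    arm_dot_prod_f32(weights, activations, VEC_LEN, &acc);
    return acc;
}
```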

Take-A-Way #3 – The Constant Need for Reinvention

In my last take-a-way, I alluded to the fact that things are accelerating. Acceleration is not just happening in the technologies that we use to build systems, though. The very domains to which we can apply these technologies are also dramatically expanding. Not only can we start to deploy security and ML technologies at the edge, but also in domains such as space and medical systems. There were several interesting talks about how technologies are being used around the world to solve interesting and unique problems, such as protecting vulnerable ecosystems, mapping the sea floor, fighting diseases and so much more.

By carefully watching and listening, you’ll notice that many speakers have been involved in many different types of products over their careers and that they are constantly having to reinvent their skill sets, capabilities and even their interests! This is what makes working in embedded systems so interesting! It is constantly changing and evolving and as engineers we don’t get to sit idly behind a desk. Just as Arm, NVIDIA and many of the other ecosystem partners and speakers show us, technology is rapidly changing but so are the problem domains that we can apply these technologies to.

Take-A-Way #4 – Mbed and Keil are Evolving

There are also interesting changes coming to the Arm toolchains and tools like Mbed and Keil MDK. In Reinhard Keil’s talk, “Introduction to an Open Approach for Low-Power IoT Development”, developers got insight into the changes coming to Mbed and Keil, with the core focus being on IoT development. The talk focused on the endpoint and discussed how Mbed and Keil MDK are being moved to an online platform designed to help developers move through product development faster, from prototyping to production. Keil Studio Online is currently in early access and will be released early next year.

(If you are interested in endpoints and AI, you might also want to check out this article on “How Do We Accelerate Endpoint AI Innovation? Put Developers First”.)

Conclusions

Arm DevSummit had a lot to offer developers this year, and without the need to travel to California to participate (although I greatly missed catching up with friends and colleagues in person). If you haven’t already, I recommend checking out the DevSummit and watching a few of the talks I mentioned. There certainly were a lot more talks, and I’m still in the process of sifting through everything. Hopefully a few sessions will inspire you and give you a feel for where the industry is headed and how you will need to pivot your own skills in the coming years.

Originally posted here

Read more…

Will We Ever Get Quantum Computers?

In a recent issue of IEEE Spectrum, Mikhail Dyakonov makes a pretty compelling argument that quantum computing (QC) isn't going to fly anytime soon. Now, I'm no expert on QC, and there sure is a lot of money being thrown at the problem by some very smart people, but having watched from the sidelines, QC seems a lot like fusion research. Every year more claims are made, more venture capital gets burned, but we don't seem to get closer to useful systems.

Consider D-Wave Systems. They've been trying to build a QC for twenty years, and indeed do have products more or less on the market, including, it's claimed, one with 1024 q-bits. But there's a lot of controversy about whether their machines are quantum computers at all, or whether they offer any speedup over classical machines. One would think that if a 1K q-bit machine really did work, the press would be all abuzz and we'd be hearing constantly of new incredible results. Instead, the machines seem to disappear into research labs.

Mr. Dyakonov notes that optimistic people expect useful QCs in the next 5-10 years; those less sanguine expect 20-30 years, a prediction that hasn't changed in two decades. He thinks a window of many decades to never is more realistic. Experts think that a useful machine, one that can do the sort of calculations your laptop is capable of, will require between 1000 and 100,000 q-bits. To me, this level of uncertainty suggests that there is a profound lack of knowledge about how these machines will work and what they will be able to do.

According to the author, a 1000 q-bit machine can be in 2^1000 states (a classical machine with N transistors can be in only 2^N states), which is about 10^300, or more than the number of sub-atomic particles in the universe. At 100,000 q-bits we're talking 10^30,000, a mind-boggling number.

Because of noise, expect errors. Some theorize that those errors can be eliminated by adding q-bits, on the order of 1000 to 100,000 additional per q-bit. So a useful machine will need at least millions, or perhaps many orders of magnitude more, of these squirrelly microdots that are tamed only by keeping them at 10 millikelvin.
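As a quick sanity check of those numbers (my own sketch, not from the article), a few lines of C reproduce both the exponents and the error-correction overhead:

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* The number of decimal digits of 2^n is floor(n * log10(2)) + 1,
       so 2^1000 ~ 10^301 and 2^100000 ~ 10^30103, matching the
       article's "about 10^300" and "10^30,000". */
    printf("2^1000   ~ 10^%.0f\n", 1000.0   * log10(2.0));
    printf("2^100000 ~ 10^%.0f\n", 100000.0 * log10(2.0));

    /* Even at the low end of 1,000 error-correction q-bits per
       logical q-bit, a 1,000 q-bit machine needs about a million
       physical q-bits. */
    printf("physical q-bits: %.0f\n", 1000.0 * 1000.0);
    return 0;
}
```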

A related article in Spectrum mentions that a committee of prestigious researchers, tasked with assessing the probability of success with QC, concluded that:

"[I]t is highly unexpected" that anyone will be able to build a quantum computer that could compromise public-key cryptosystems (a task that quantum computers are, in theory, especially suitable for tackling) in the coming decade. And while less-capable "noisy intermediate-scale quantum computers" will be built within that time frame, "there are at present no known algorithms/applications that could make effective use of this class of machine," the committee says."

I don't have a dog in this fight, but am relieved that useful QC seems to be no closer than The Distant Shore (to quote Jan de Hartog, one of my favorite writers). If it were feasible to easily break encryption schemes, banking and other systems could collapse. I imagine blockchain would fail as hash algorithms became reversible. The resulting disruption would not be healthy for our society.

On the other hand, Bruce Schneier's article in the March issue of IEEE Computing Edge suggests that QC won't break all forms of encryption, though he does think a lot of our current infrastructure will be vulnerable. The moral: if and when QC becomes practical, expect chaos.

I was once afraid of quantum computing, as it involves mechanisms that I'll never understand. But then I realized those machines will have an API. Just as one doesn't need to know how a computer works to program in Python, we'll be insulated from the quantum horrors by layers of abstraction.

Originally posted here

Read more…

This complete guide is a 212-page eBook and is a must read for business leaders, product managers and engineers who want to implement, scale and optimize their business with IoT communications.

Whether you want to attempt initial entry into the IoT sphere or expand existing deployments, this book can help with your goals, providing a deep understanding of all aspects of IoT.

CLICK HERE TO DOWNLOAD

Read more…

Edge Products Are Now Managed At The Cloud

Now more than ever, there are billions of edge products in the world. But without proper cloud computing, making the most of electronic devices that run on Linux or any other OS would not be possible.

And so, a question most people keep asking is: which is the best Software-as-a-Service platform for effectively managing edge devices through cloud computing? While edge device management may not be something new, the fact that the cloud computing space is not yet fully exploited means there is a lot left to do in the cloud space.

Remote product management is especially necessary in the 21st century and beyond. Because of the increasing number of devices connected to the Internet of Things (IoT), a reliable SaaS platform should help with fixing software glitches from anywhere in the world. From smart homes, stereo speakers and cars to personal computers, any product that is connected to the internet needs real-time protection from hacking threats such as unlawful access to business or personal data.

Data, being the most vital asset, is constantly at risk, especially if individuals using edge products do not connect to trusted, reliable, and secure edge device management platforms.

Bridges the Gap Between Complicated Software And End Users

Cloud computing is the new frontier through which SaaS platforms help manage edge devices in real time. But something even more noteworthy is the increasing amount of complicated software that now runs edge devices at home and in workplaces.

Edge device management, therefore, ensures everything runs smoothly. From fixing bugs and running debugging commands to real-time software patch deployment, cloud management of edge products bridges the gap between end users and the complicated software that is becoming the norm these days.

Even more importantly, going beyond physical firewall barriers is a major necessity in remote management of edge devices. A reliable Software-as-a-Service platform therefore ensures that data encryption for edge devices is not only hackproof but also accessible only to the right people. Moreover, deployment of secure routers and access tools is especially critical in cloud computing when managing edge devices. And so, the developers behind successful SaaS platforms conduct regular security checks over the cloud and design and implement solutions for edge products.

Reliable IT Infrastructure Is Necessary

Software-as-a-service platforms that manage edge devices focus on having a reliable IT infrastructure and centralized systems through which they can conduct cloud computing. It is all about remotely managing edge devices with the help of an IT infrastructure that eliminates challenges such as connectivity latency.

Originally posted here

Read more…

Industrial IoT Revolution

Why the Nvidia Jetson Nano is responsible for the biggest industrial IoT revolution these days

 

It feels like yesterday when the Raspberry Pi Foundation released its first-in-line Single Board Computer (SBC) to the market. Back in 2012, Raspberry Pi wasn't alone in the growing SBC market; however, it was the first to make a community-based product that brought the hardware and the software ecosystem into a beautiful harmony on the internet. Before those days, embedded Linux-based SBCs and SOMs were a place for Linux kernel and embedded hardware experts: no easy-to-use tools, no ready Linux-based distros, and, most importantly, none of the enormous number of questions and answers across the internet on anything related.

Today, 8 years later, the "2012 revolution" is happening again

This time, it took a year to understand the impact of the new 'kid' in the market, but now there are a few indications that definitely mark the route to a revolution.

The Raspberry Pi was the first to make embedded Linux easy while keeping the advantages of reliability and flexibility in fitting different kinds of industry applications. It's almost impossible to ignore the variety of industries where the Raspberry Pi is at the heart of products, saving time-to-market and costs. The power of this magical board leans on the software side: the Raspberry Pi Foundation and their community worked hard across the years to improve and share their knowledge, and at the same time, without notice or targeting, they brought Pi development to an extremely "serverless" level.

The Nvidia Jetson Nano

Let's stop talking about the Raspberry Pi and focus on today's industry needs, to better understand why the new kid in town is here to change the market of IoT and smart products forever.

 
Why do we need to thank Nvidia and the Jetson Nano?

The market is moving forward. AI, robotics, amazing-looking screen app GUIs, image processing, and long data calculations have all become the new standard of smart edge products.

If a few years ago you would only want to connect your product to the cloud and receive anything valuable, today product managers and developers compete in a much tougher industry era. This time, the Raspberry Pi can't be the technology hero again: its resources are limited, and the ecosystem has started looking for a better-fit solution.

 

NVIDIA Jetson devices in Upswift.io device management platform

The Jetson Nano is the first SBC to understand the necessary combination that will drive new products to use it. It's the first SBC designed with powerful industrial use cases in mind, while not forgetting the prototyping stage and the harmony that gave the Raspberry Pi its success. It's the first solution to bring the whole package for developers and hardware engineers with a "SaaS" feel: the OS is already polished thanks to Ubuntu, there are plenty of software instructions from Nvidia and open-source, ready-to-use tools custom-made for the Jetson family, and the hardware engineers are free to go with the System On Module (SOM), which connects to a carrier board that includes all the necessary inputs and outputs to make the development stage even faster.

The Jetson Nano combination basically provides the first complete infrastructure for producing a "2020" product with complex software while working on a minimal budget and time-to-market. The Jetson Nano enables developers and product managers to imagine further without compromises, bringing tough software missions to the edge easily.

Originally posted here

Read more…

Industrial Prototyping for IoT


ADLINK is a global leader in edge computing driving data-to-decision applications across industries. The company recently introduced I-Pi SMARC for Industrial IoT prototyping.

-       ADLINK I-Pi SMARC consists of a simple carrier board paired with a SMARC Computer on Module.

-       SMARC modules are available from the entry-level Rockchip PX30 to the top-of-the-line Intel Apollo Lake.

-       SMARC modules are specifically designed for typical industrial embedded applications that require long life, high MTBF and strict revision control.

-       Use popular off-the-shelf sensors to create prototypes or proofs of concept on short notice.

Additional information can be found here

 

Read more…

By: Kelly McNelis

We have faced unprecedented disruption from the many challenges of COVID-19, and PTC’s LiveWorx was no exception. The definitive digital transformation event went virtual this year, and despite the transition from physical to digital, LiveWorx delivered.

Of the many insightful virtual keynotes, one that caught everyone’s attention was ‘Digital Transformation: The Technology & Support You Need to Succeed,’ presented by PTC’s Executive Vice President (EVP) of Products, Kevin Wrenn, and PTC’s EVP and Chief Customer Officer, Eduarda Camacho.

Their keynote focused on how companies should be prioritizing the use of best-in-class technology that will meet their changing needs during times of disruption and accelerated digital transformation. Wrenn and Camacho highlighted five of our customers through interactive case studies on how they are using PTC technology to capitalize on digital transformation to thrive in an era of disruption.


Below is a summary of the five customers and their stories that were highlighted during the keynote.

1. Royal Enfield (Mass Customization)

Royal Enfield is an Indian motorcycle company that has been manufacturing motorbikes since 1901. They have British roots, and their main customer base is located in India and Europe. Riders of Royal Enfield want their bikes to be particular to their brand, so the company worked to better manage the complexities of mass customization and respond to market demands.

Royal Enfield is a long-time PTC customer, but they were on old versions of PTC technology. They first upgraded Creo and Windchill to the latest releases so they could leverage the new capabilities. They then moved on to transform their processes for platform and variant designs, introduced simulation much earlier by using Creo Simulation Live, and leveraged generative design by bringing AI into engineering, applying it to complex custom-forged engine and chassis components. Finally, they retrained and retooled their engineering staff to fully leverage the power of the new processes and technologies.

The entire Royal Enfield team now has digital capabilities that accelerate new product designs, variants, and accessories for personalization; as a result, they are able to deliver a much-shortened design cycle. Royal Enfield is continuing their digital transformation trend, and will invest in new ways to create value while leveraging augmented reality with PTC's Vuforia suite.

2. VCST (Manufacturing Efficiency, Quality, and Innovation)

VCST is part of the BMT Group and is a world-class automotive supplier of precision-machined powertrain and brake components. Their problem was the high cost of their production facility in Belgium: they either needed to improve the cost efficiency of the plant or face the potential of shutting down the facility and relocating it to another region. VCST decided to implement ThingWorx so that anyone can have instant visibility into asset status and performance. VCST is also digitizing maintenance requests and inquiries about spare parts to improve overall efficiency in support of their cost-reduction goals.

Additionally, VCST has a goal of zero customer complaints; if any quality problems reach their customers, they can be required to do 100% inspection until the problem is solved. Moreover, as cars have gotten quieter with electrification, the noise from the gears has become an issue, putting pressure on VCST to innovate and reduce gear noise.

VCST has again relied on ThingWorx and Windchill to collect and share data for joint collaborative analysis to innovate and reduce gear noise. VCST also plans to use Vuforia Expert Capture and Vuforia Chalk to train maintenance workers to further improve their efficiency and cost effectiveness. The company is not done with their digital transformation, and they have plans to implement Creo and Windchill to enable end-to-end digital thread connectivity to the factory.

3. BID Group Holdings (Connected Product)

BID Group Holdings operates in the wood processing industry and is one of the largest integrated suppliers and the North American leader in the field. The purpose of BID Group is to deliver a complete range of innovative equipment, digital technologies, turnkey installations, and aftermarket services to their customers. BID Group decided to focus on their areas of expertise and rely on PTC, Microsoft, and Rockwell Automation’s combined capabilities and scale to deliver SaaS-type solutions to their own industry.

Leveraging this combined power, BID Group developed a digital strategy for service to improve mill efficiency and profitability. The solution, named OPER8, was built on the ThingWorx platform. This allowed BID Group to provide their customers an out-of-the-box solution with efficient time-to-value and low cost of ownership. BID Group is continuing to work with PTC and Rockwell Automation to develop additional solutions that will reduce downtime, extending OPER8 with a predictive analytics module built on ThingWorx Analytics and LogixAI.

4. Hitachi (Service Optimization)

Hitachi operates an extensive service division that ensures its customers’ data systems remain up and running. Their challenge was not only to meet their customers’ uptime Service Level Agreements, but to do so without killing their cost structure. Hitachi decided to implement PTC’s Servigistics Service Parts Management software to ensure the right parts are available when and where they are needed for service. With Servigistics, Hitachi was able to meet their needs while staying cost effective and delighting their customers.

Hitachi runs on the cloud, which allows them to upgrade to current releases more often, take advantage of new functionality, and avoid unexpected costs.

PTC has driven engagement and support for Hitachi through the PTC Community, and encourages all customers to utilize this platform. This network of collaborative spaces is a gathering place for PTC customers and partners to showcase their work, inspire each other, and share ideas and best practices in order to expand the value of their PTC solutions and services.

5. COVID-19 Response 

COVID-19 has put significant strain on the world’s hospitals and healthcare infrastructure, and hospitalization rates for COVID brought into question the capacity to handle cases. Many countries began considering the value field hospitals could bring in safely caring for patients and easing the admissions numbers of ‘regular’ hospitals. The complication, however, is that field hospitals have essentially none of the isolation or air filtration capability required for treating COVID patients or protecting healthcare workers.

As a result, the US Army Corps of Engineers put out specifications to create self-contained isolation units (SCIUs), fully functioning hospital rooms that can be transported or built onsite. But the assembly needed to happen fast, and a group of companies (including PTC), led by The Innovation Machine, rallied to help design and define the SCIUs.

With buy-in from numerous companies, a common platform was needed for collaboration. PTC felt compelled to react, and many PTC customers and partners joined in to help create a collaboration platform with cloud-based Windchill as the foundation. But PTC didn’t just provide software; it also contributed digital thread and design advice to help the group solve some of the major challenges. The resulting design is the work of many companies coming together, with deployments across various US state governments, agencies, and FEMA.

Final Thoughts

All of the above customers approached digital transformation as a business imperative. They all had sizeable challenges that needed to be solved and took leadership positions to implement plans that leveraged digital transformation technologies combined with new processes.

PTC will continue to innovate across the digital transformation portfolio and is committed to ensuring that customer success offerings capture value faster and provide the best outcomes.

Original Post Link: https://www.ptc.com/en/product-lifecycle-report/liveworx-digital-transformation–technology-and-support-you-need-to-succeed

Author Bio: Kelly is a corporate communications specialist at PTC. Her responsibilities include drafting and approving content for PTC’s external and social media presence and supporting communications for the Chief Strategy Officer. Kelly has previous experience as a communications specialist working to create and implement materials for the Executive Vice President of the Products Organization and senior management team members.

 

Read more…

Helium Expands to Europe

Helium, the company behind one of the world’s first peer-to-peer wireless networks, is announcing the introduction of Helium Tabs, its first branded IoT tracking device that runs on The People’s Network. In addition, after launching its network in 1,000 cities in North America within one year, the company is expanding to Europe to address growing market demand with Helium Hotspots shipping to the region starting July 2020. 

Since its launch in June 2019, Helium quickly grew its footprint with Hotspots covering more than 700,000 square miles across North America. Helium is now expanding to Europe to allow for seamless use of connected devices across borders. Powered by entrepreneurs looking to own a piece of the people-powered network, Helium’s open-source blockchain technology incentivizes individuals to deploy Hotspots and earn Helium (HNT), a new cryptocurrency, for simultaneously building the network and enabling IoT devices to send data to the Internet. When connected with other nearby Hotspots, these devices form the backbone of the network.

“We’re excited to launch Helium Tabs at a time where we’ve seen incredible growth of The People’s Network across North America,” said Amir Haleem, Helium’s CEO and co-founder. “We could not have accomplished what we have done, in such a short amount of time, without the support of our partners and our incredible community. We look forward to launching The People’s Network in Europe and eventually bringing Helium Tabs and other third-party IoT devices to consumers there.”  

Introducing Helium Tabs that Run on The People’s Network
Unlike other tracking devices, Tabs uses LongFi technology, which combines the LoRaWAN wireless protocol with the Helium blockchain and provides network coverage up to 10 miles away from a single Hotspot. This is a game-changer compared to WiFi- and Bluetooth-enabled tracking devices, which only work up to 100 feet from a network source. What’s more, due to Helium’s unique blockchain-based rewards system, Hotspot owners will be rewarded with Helium (HNT) each time a Tab connects to their network.

In addition to its increased growth with partners and customers, Helium has also seen accelerated expansion of its Helium Patrons program, which was introduced in late 2019. All three combined have helped to strengthen its network. 

Patrons are entrepreneurial customers who purchase 15 or more Hotspots to help blanket their cities with coverage and enable the customers who use the network. In return, they receive discounts, priority shipping, network tools, and Helium support. Currently, the program has more than 70 Patrons throughout North America and is expanding to Europe.

Key brands that use the Helium Network include: 

  • Nestlé ReadyRefresh, a beverage delivery service company
  • Agulus, an agricultural tech company
  • Conserv, a collections-focused environmental monitoring platform

Helium Tabs will initially be available to existing Hotspot owners for $49. The Helium Hotspot is now available for purchase online in Europe for €450.

Read more…

Recovering from a system failure or a software glitch can be no easy task. The longer a fault persists, the harder it can be to identify and recover from. The use of an external watchdog is an important and critical tool in the embedded-systems engineer's toolbox. There are five tips that should be taken into account when designing a watchdog system.

Tip #1 – Monitor a heartbeat

The simplest function that an external watchdog can have is to monitor a heartbeat that is produced by the primary application processor. Monitoring the heartbeat serves two distinct purposes. First, the microcontroller should only generate the heartbeat after functional checks have been performed on the software to ensure that it is functioning. Second, the heartbeat should be able to reveal whether the real-time response of the system has been jeopardized.

Monitoring the heartbeat for software functionality and real-time response can be done using a simple, "dumb" external watchdog. The external watchdog should have the capability to assign a heartbeat period along with a window within which the heartbeat must appear. The purpose of the heartbeat window is to allow the watchdog to detect that the real-time response of the system has been compromised. In the event that either the functional or real-time checks fail, the watchdog attempts to recover the system through a reset of the application processor.
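The application-processor side of this scheme can be sketched in a few lines of C. (The check and GPIO functions below are placeholders for platform-specific code, not part of any particular product.)

```c
#include <stdbool.h>

extern bool system_functional_checks_pass(void);  /* assumed: CRC, stack, task checks */
extern void heartbeat_gpio_toggle(void);          /* assumed: drives the watchdog pin */

/* Called periodically, e.g. every 100 ms. The heartbeat is generated
   only after the functional checks pass, so a hung or corrupted
   application stops producing edges. If an edge falls outside the
   watchdog's window (say, 80 to 120 ms), the real-time response has
   been compromised and the watchdog resets the application processor. */
void heartbeat_task(void)
{
    if (system_functional_checks_pass()) {
        heartbeat_gpio_toggle();
    }
}
```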

Tip #2 – Use a low capability MCU

External watchdogs that can monitor a heartbeat are relatively low cost, but they severely limit the capabilities and recovery possibilities of the watchdog system. A low-capability microcontroller can cost nearly the same as an external watchdog timer, so why not add some intelligence to the watchdog and use a microcontroller? The microcontroller firmware can be developed to fulfill the windowed heartbeat monitoring and so much more. A "smart" watchdog like this is sometimes referred to as a supervisor or safety watchdog, and it has been used for many years in industries such as automotive. Generally, a microcontroller watchdog has been reserved for safety-critical applications, but given today's development tools and hardware costs, it can be cost effective in other applications as well.

Tip #3 – Supervise critical system functions

The decision to use a small microcontroller as a watchdog opens nearly endless possibilities for how the watchdog can be used. One of the first roles of a smart watchdog is usually to supervise critical system functions such as a system current or a sensor state. One example of how a watchdog could supervise a current would be to take an independent measurement and then provide that value to the application processor. The application processor could then compare its own reading with that of the watchdog. If the two disagreed, the system would execute whatever fault tree was deemed appropriate for the application.
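A sketch of that cross-check is shown below; the tolerance and function names are illustrative assumptions, and the fault tree is whatever recovery behavior the application calls for.

```c
#include <math.h>

#define CURRENT_TOLERANCE_MA  50.0f   /* assumed acceptable disagreement */

extern float watchdog_reported_current_ma(void);  /* assumed: from the watchdog MCU */
extern float app_measured_current_ma(void);       /* assumed: local ADC reading     */
extern void  execute_fault_tree(void);            /* application-defined recovery   */

/* Compare the processor's own current measurement against the
   watchdog's independent one; disagreement means a sensor, ADC, or
   processor may be faulty. */
void supervise_system_current(void)
{
    float delta = fabsf(app_measured_current_ma() -
                        watchdog_reported_current_ma());
    if (delta > CURRENT_TOLERANCE_MA) {
        execute_fault_tree();
    }
}
```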

Tip #4 – Observe a communication channel

Sometimes an embedded system can appear to the watchdog and the application processor to be operating as expected, yet to an external observer be in a non-responsive state. In such cases it can be useful to tie the smart watchdog to a communication channel such as a UART. When the watchdog is connected to a communication channel, it can not only monitor channel traffic but also act on commands that are specific to the watchdog. A great example of this is a watchdog designed for a small satellite that monitors radio communications between the flight computer and the ground station. If the flight computer becomes non-responsive to the radio, a command can be sent to the watchdog, which then executes it and resets the flight computer.
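On the watchdog's side, the command handling can be as simple as the following sketch (the command string and reset function are hypothetical, for illustration only):

```c
#include <string.h>

#define WDG_RESET_CMD "WDG:RESET"   /* assumed command format */

extern void assert_flight_computer_reset(void);  /* assumed: pulses the reset line */

/* Called for each frame received on the shared UART. Traffic that
   isn't addressed to the watchdog is simply observed; seeing any
   valid frame doubles as evidence the radio link is alive. */
void watchdog_uart_rx(const char *frame)
{
    if (strncmp(frame, WDG_RESET_CMD, strlen(WDG_RESET_CMD)) == 0) {
        assert_flight_computer_reset();
    }
}
```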

Tip #5 – Consider an externally timed reset function

The question of who is watching the watchdog is undoubtedly on the minds of many engineers when using a microcontroller as a watchdog. Using a microcontroller to implement extra features adds some complexity and a new software element to the system. In the event that the watchdog itself goes off into the weeds, how is it going to recover? One option is to use the external watchdog timer discussed earlier; the smart watchdog would generate a heartbeat to keep itself from being reset by the timer. Another option is to have the application processor act as the watchdog for the watchdog. Careful thought needs to be given to the best way to ensure that both processors remain functioning as intended.

Conclusion

The purpose of the smart watchdog is to monitor the system and the primary microcontroller to ensure that they operate as expected. During the design of a system watchdog it can be very tempting to let the number of supported features creep. Developers need to keep in mind that as the complexity of the smart watchdog increases, so does the probability that the watchdog itself will contain failure modes and bugs. Keeping the watchdog simple, with the minimum necessary feature set, will ensure that it can be exhaustively tested and proven to work.

Originally Posted here

 

Read more…

It's Not All Linux

In the comments section of my 2020 embedded salary survey, quite a few respondents felt that much of the embedded world is being subsumed by canned solutions. Will OSes like Linux and cheap, powerful boards like the Raspberry Pi and Arduino replace traditional engineering? Has that already happened?

A number of people complained their colleagues no longer understand low-level embedded things like DMA, chip selects, diddling I/O registers, and the like. They feel these platforms isolate the engineer from those details.

Part of me says yeah! That's sort of what we want. Reuse and abstraction means the developer can focus on the application rather than bringing up a proprietary board. Customers want solutions and don't care about implementation details. We see these abstractions working brilliantly when we buy a TCP/IP stack, often the better part of 100K lines of complex code. Who wants to craft those drivers?

Another part of me says "save me from these sorts of products." It is fun to design a board. To write the BSP and toss bits at peripheral registers. Many of us got a rush the first time we made an LED blink or a motor spin. I still find that fulfilling.

So what's the truth? Is the future all Linux and Pis?

The answer is a resounding "no." A search for "MCU" on Digi-Key gets 89,149 part numbers. Sure, many of these are dups with varying packages and the like, but that's still a ton of controllers.

Limiting that search to 8-bitters nets 30,574 parts. I've yet to see Linux run on a PIC or other tiny device.

Or filter to Cortex-M devices only. You still get 16,265 chips. None of those run Linux, Windows, BSD, or any other general-purpose OS. These are all designed into proprietary boards. Those engineers are working on the bare metal... and having a ton of fun.

The bigger the embedded world gets the more applications are found. Consider machine learning. That's for big iron, for Amazon Web Services, right? Well, partly. Eta Compute and other companies are moving ML to the edge with smallish MCUs running at low clock rates with limited memory. Power consumption rules, and 2 GB of RAM at 1 GHz just doesn't cut it when harvesting tiny amounts of energy.

Then there's cost. If you can reduce the cost of a product made in the millions by just a buck the business prospers. Who wants a ten dollar CPU when a $0.50 microcontroller will do?

Though I relish low-level engineering, our job is to get products to market as efficiently as possible. Writing drivers for a timer is sort of silly when you realize that thousands of engineers using the same part are doing the same thing. Sure, semi vendors often deliver code to handle all of this, but in my experience most of that is either crap or uses the peripherals in the most limited ways. A few exceptions exist, such as Renesas's Synergy. They go so far as to guarantee that code. My fiddling with it leaves me impressed, though the learning curve is steep. But that sort of abstraction surely must be a part of this industry going forward. Just as we don't write protocol stacks and RTOSes any more, canned code will become more common.

Linux and canned boards have important roles in this business. But an awful lot of us will still work on proprietary systems.

View original post here

For novel ideas about building embedded systems (both hardware and firmware), join the 35,000 engineers who subscribe to The Embedded Muse, a free biweekly newsletter. The Muse has no hype and no vendor PR. Click here to subscribe

Read more…

E-Commerce and the Benefits Derived from IoT

E-commerce has been growing for decades, becoming a trend among retailers and popular with consumers. Owing to its quick, easy, and reliable service, e-commerce's popularity is known to all. Now with the advent of IoT (the Internet of Things), where devices transfer data among themselves without any human interaction, e-commerce has benefited in myriad ways.

Faster, More Reliable

The first and foremost advantage that IoT provides to e-commerce is that it has added to the reliability of transactions, as there is far less chance of human error. Owing to its automated systems, transaction data is more reliable and quicker to obtain.

Enhancing the business of retailers

Sell more, make more money. IoT has created the possibility of knowing the customer's needs and desires exactly, using the technology to collect data about trends on social media. This collected data is then applied to sell the desired products accordingly, leading to more and more growth in business via e-commerce. This way it is advantageous not only to retailers but also to customers, as IoT allows a great deal of customer care.

It also enhances marketing and promotions: products can be promoted through IoT, and customer care improves along with it.

Securing Items in Warehouses

IoT technology has made it possible to ensure that items do not get overstocked in warehouses and that they do not expire or get bruised, by remotely sensing the products in the warehouse. This has ensured the optimization of productivity. IoT can keep things in check even in situations with lots of chances of human error. So, items are more secure when the surveillance is done through IoT.

Easy Tracking of Theft and Losses

The products are always under surveillance; their location and temperature are monitored through multiple devices that keep track of a tracking ID. A GPS-enabled e-commerce business makes it possible to keep track of products at every instant, making them less prone to theft and other losses. The product is never out of sight, and the whole travel history is constantly recorded. Automated e-mails and texts regarding the product's departure and arrival help ensure it is delivered to the right place in a safe way.

E-Commerce Web Development and Design

When it comes to selling and buying online, e-commerce websites need attractive web designs to captivate customers, which is one reason Shopify developers are well in demand. Sites must be not only captivating but also fast. Web development is now largely inclined towards using IoT technologies to make the work faster and more reliable. IoT devices are meant to communicate more safely; hence, they are more admired and desired. Web-based user interfaces also prefer IoT devices for reliability and for making things faster. IoT-enabled websites also make things easier for consumers with low-speed internet connections by minimizing the response time between the web server and IoT-enabled sites.

There's more to come yet from IoT, with its ever-increasing number of devices. This will help e-commerce grow even more in the future.

Author Bio: Abdullah Ali is a co-founder and Shopify developer in Los Angeles.

Read more…
I have spent many years working with IoT projects. Most of them were typical; there was nothing unusual behind them, and they were trying to copy the success of their competitors. However, the deeper I dove into the IoT startup environment, the more I encountered innovators in this niche who had found how to adapt IoT technologies to their enterprises' specifics. In this article, I will not mention my partners' names, for obvious reasons; however, I will try to push you toward the idea of how to implement IoT in your own company.
Read more…

The Anti-Quality Movement

by Jack Ganssle

[email protected]

Recently our electric toothbrush started acting oddly – differently from before. I complained to Marybeth who said, “I think it’s in the wrong mode.”

Really? A toothbrush has modes?

We in the embedded industry have created a world that was unimaginable prior to the invention of the microprocessor. Firmware today controls practically everything, from avionics to medical equipment to cars to, well everything.

And toothbrushes.

But we’re working too hard at it. Too many of us use archaic development strategies that aren’t efficient. Too many of us ship code with too many errors. That's something that can, and must, change.

Long ago the teachings of Deming and Juran revolutionized manufacturing. One of Deming's essential insights was that fixing defects will never lead to quality. Quality comes from correct design rather than patches applied on the production line. And focusing on quality lowers costs.

The software industry never got that memo.

The average embedded software project devotes 50% of the schedule to debugging and testing the code. It's stunning that half of the team’s time is spent finding and fixing mistakes.

Test is hugely important. But, as Dijkstra observed, testing can only prove the presence of errors, not the absence of bugs.

Unsurprisingly, and mirroring Deming's tenets, it has repeatedly been shown that a focus on fixing bugs will never lead to a quality product - all that will do is extend the schedule and ensure defective code goes out the door.

Focusing on quality has another benefit: the project gets done faster. Why? That 50% of the schedule used to deal with bugs gets dramatically shortened. We shorten the schedule by not putting the bugs in in the first place.

High quality code requires a disciplined approach to software engineering - the methodical use of techniques and approaches long known to work. These include inspection of work products, using standardized ways to create the software, seeding code with constructs that automatically catch errors, and using various tools that scan the code for defects. Nothing that is novel or unexpected, nothing that a little Googling won't reveal. All have a long pedigree of studies proving their efficacy.

Yet only one team out of 50 makes disciplined use of these techniques.

What about metrics? Walk a production line and you'll see the walls covered with charts showing efficiency, defect rates, inventory levels and more. Though a creative discipline like engineering can't be made as routine as manufacturing, there are a lot of measurements that can and must be used to understand the team's progress and the product's quality, and to drive the continuous improvement we need.

Errors are inevitable. We will ship bugs. But we need a laser-like focus on getting the code right. How right? We have metrics; we know how many bugs the best and mediocre teams ship. Defect Removal Efficiency is a well-known metric used to evaluate the quality of shipped code; it's the percentage of the entire universe of bugs found in a product that were removed prior to shipping (it's measured until 90 days after release). The very best teams, representing just 0.4% of the industry, eliminate over 99% of bugs pre-shipment. Most embedded groups remove only 95%.
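The metric itself is simple arithmetic; here is a worked example (my sketch, with hypothetical bug counts):

```c
#include <stdio.h>

/* Defect Removal Efficiency: the percentage of all known defects
   removed before shipping, where "all known" counts pre-release
   defects plus those reported up to 90 days after release. */
static double dre_percent(unsigned removed_pre_release,
                          unsigned found_post_release)
{
    unsigned total = removed_pre_release + found_post_release;
    return total ? 100.0 * removed_pre_release / total : 100.0;
}

int main(void)
{
    /* A hypothetical team: 950 bugs fixed before release, 50 found
       in the first 90 days after -> DRE = 95%, typical of embedded
       groups; the best teams exceed 99%. */
    printf("DRE = %.1f%%\n", dre_percent(950, 50));
    return 0;
}
```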

Where does your team stand on this scale? Can one control quality if it isn’t measured?

We have metrics about defect injection rates, about where in the lifecycle they are removed, about productivity vs. any number of parameters and much more. Yet few teams collect any numbers.

Engineering without numbers isn’t engineering. It’s art.

Want to know more about metrics and quality in software engineering? Read any of Capers Jones’ books. They are dense, packed with tables of numbers, and sometimes difficult as the narrative is not engaging, but they paint a picture of what we can measure and how differing development activities affect errors and productivity.

Want to understand where the sometimes-overhyped agile methods make sense? Read Agile! by Bertrand Meyer and Balancing Agility and Discipline by Barry Boehm and Richard Turner.

Want to learn better ways to schedule a project and manage requirements? Read any of Karl Wiegers’ books and articles.

The truth is that we know of better ways to get great software done more efficiently and with drastically reduced bug rates.

When will we start?

Jack Ganssle has written over 1000 articles and six books about embedded systems, as well as one about his sailing fiascos. He has started and sold three electronics companies. He welcomes dialog at [email protected] or at www.ganssle.com.

 

Read more…

Over the next 25 years, global energy consumption is estimated to soar by 40%, and unless we stumble upon a vast coal mine, there is no way we can fulfil this requirement.

You may be thinking: “Renewable energy will do the job for us.”

Yes, renewable energy sources are paving the path to a more sustainable future; however, their mass adoption is still limited by several barriers. They are highly dependent on natural occurrences, require high capital investment, and are inflexible with traditional energy grids. It can’t be said with certainty when these limitless sources can be used to their full extent for our ever-growing electricity needs.

Until we overcome these barriers and adopt renewables entirely, optimal and efficient utilization of non-renewable energy resources is the only prominent option for stretching out our days of existence with a power supply.

Utilities are continually looking for a technological enabler that can help them optimize their processes and manage assets remotely. IoT is one such technology that is helping power generation companies to do so.

With IoT’s advanced telemetry and cognition capabilities, energy utilities can boost their energy transmission and distribution processes to facilitate the efficient flow of electricity and reduce energy wastage.

Let us see the ways with which IoT can help utilities for adequate electricity supply:

1)  Remote Asset Monitoring and Management:

This is primarily the most talked-about application of an IoT system. It allows energy companies to remotely manage their equipment, ranging from power generators to transmission lines. Sensors embedded in assets can measure parameters like temperature, vibration, pressure, and wear that utilities can read to identify probable breakdown points.

This enables utilities to monitor the depreciation of their assets and govern the overall health of their power transmission and distribution architecture. By compiling the continuously stored data, insights and trends can also be generated to estimate failure times. Subsequently, inspection and maintenance procedures can be scheduled to eliminate failures and reduce downtime.

Furthermore, by planning their repair and maintenance tasks around an IoT-based real-time alert system, power generation companies can even reduce energy losses and enhance their operational efficiency. This helps them optimize the consumption of their resources and facilitate the efficient flow of electricity from power plants to consumers’ facilities.
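The pattern behind such an alert system is straightforward. Below is a minimal sketch of a monitoring routine; the parameter, thresholds, and function names are illustrative assumptions rather than any specific utility platform's API.

```c
#include <stdio.h>

/* Assumed healthy operating band for one monitored parameter,
   e.g. transformer winding temperature in degrees C. */
#define TEMP_LOW_C   10.0f
#define TEMP_HIGH_C  95.0f

extern float read_temperature_c(void);            /* assumed sensor driver   */
extern void  publish_alert(const char *message);  /* assumed uplink to cloud */

/* Called on each telemetry cycle: a reading outside the band flags a
   probable breakdown point so maintenance can be scheduled before the
   asset fails. */
void asset_monitor_tick(void)
{
    float t = read_temperature_c();
    if (t < TEMP_LOW_C || t > TEMP_HIGH_C) {
        char msg[64];
        snprintf(msg, sizeof msg, "TEMP_OUT_OF_RANGE %.1f", (double)t);
        publish_alert(msg);
    }
}
```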

2)  Grid Balancing and Supply Rerouting/Restoring:

Smart grids are a modern IoT variant of the existing power grids that include several energy measures and smart appliances like smart meters. Their implementation in the power distribution and transmission infrastructure makes the flow of electricity from power plants to end consumers more efficient and reliable.

However, their benefits far exceed simply supplying electricity from one place to another. They allow utilities to lower operational costs and help customers manage electricity consumption at home.

In terms of making the flow of electricity more efficient and promoting optimal resource utilization, smart grids are the natural choice. As they work on two-way interaction, they can effectively manage congestion on the power lines. They are even capable of meeting the connection requirements of generating stations, such as frequency and voltage control, to prevent instability.

Furthermore, smart grids offer astounding electricity rerouting and restoring advantages. In case a transmission line breaks due to adverse weather conditions or voltage fluctuations, smart grids can find another route to transmit electricity to a locality. Besides facilitating a constant power supply, utilities can also use this feature of smart grids to identify the causes of energy fluctuations. This empowers energy companies to reduce the fluctuations that result in wire failure and hence improve their asset utilization.

Smart grids also enable the inclusion of renewable energy resources. Let us see how:

3)  Renewable energy management:

As we discussed earlier, IoT infuses a two-way dialogue into an existing electricity distribution channel. This means that instead of merely supplying power to users and billing them once a month, the IoT system enables both utilities and consumers to share information.

“In layman terms, even the end-users can now contribute to a smart grid.”

The most significant advantage of smart grids is their ability to include renewable energy resources in an existing power grid. Thus, end consumers using solar panels to power their home appliances can discharge excess electricity into the power grid in exchange for money. Many countries, like the USA and India, provide subsidies to end consumers for creating small-scale solar power stations on their rooftops.

This is assisting the mass adoption of renewable energy resources, which, besides alleviating energy scarcity, can also help reduce the emission of greenhouse gases. Since smart grids act as a stable infrastructure for managing voltage fluctuations, they can be used to manage the electricity flow from both renewable and conventional sources.

My Personal Opinion:

“As per MarketsandMarkets, IoT in the energy market is expected to grow at a CAGR of 24.1% to reach a value of USD 22.34 billion by the end of 2020.”

This showcases the potential that IoT offers to the vertical of electricity transmission and distribution. Its implementation has the power to facilitate our energy needs and reduce our consumption to a great extent. As per a study by the environmental impact assessment (EIA), by effectively utilizing existing IoT technologies, industries in the US have reduced their coal energy consumption for producing electricity by 14 to 22%.

Similarly, as smart consumers, we can also implement smart meters and IoT-based energy monitoring solutions in our homes and facilities to monitor overall power consumption and manage our needs accordingly.

This can be the initial step to enjoy limitless power. Let us contribute by joining the IoT revolution and make our homes & industries smart for a sustainable future.

Read more…

The real estate segment has been acknowledged as the most "behind in development" in multiple Forbes and TechCrunch reports, mainly because real estate investments, management, and everything in between aren't very appealing to younger generations. A lot of companies and brokers still rely on a very traditional approach made of direct marketing, face-to-face meetings and the like. With this in mind, though, several players have refused to remain in an "old" era and have embraced new pieces of technology. Let's analyse why IoT should be the pivotal one.

The Usage Of Data

Let's state the obvious: every real estate-related process is bulky and slow. From finance to paperwork, every small individual process can take ages to be finalised, and this, for many, was the opening for technology to proliferate within this sector. The usage of data in the real estate sector applies to IoT because it revolves around risk management for long-term financial procedures: if someone is looking to invest £2 million in a property, for example, a long credit score check will be required, as well as cross-references to evaluate whether the request is masked as what's known in the segment as a "ghost contractor". Big data, especially if coming from bank accounts and credit card companies, could speed up these processes massively, effectively connecting an extremely slow business segment to the IoT. The UK is very far ahead in this matter, especially within the commercial property auctions segment, where companies are heavily relying on data to speed up the buying process.


Architecture-Ready Applications

Has any of you ever used dedicated software for a real estate task? From development to buying and investments, every piece of software has been designed and planned for professionals who know the technical terms, effectively blocking "casual" users or junior professionals from approaching them. Having a data-driven, IoT-oriented architecture could greatly help in setting up budgeting for real estate management (owning 12 properties and calculating individual fees isn't an easy job) or simply in dealing with clients, by letting the software focus on the taxes/admin part whilst you do the "physical" management part. Once again, the usage of big data and data points is quintessential for this very purpose, given that it could simplify the way many different tasks are executed, effectively reducing their timescales.

To Conclude

Should the real estate sector approach IoT, data and data science? Definitely. Not only for the above-mentioned reasons, but also because data-driven strategies are ahead of their time: from 2020 onwards, we will definitely witness massive growth of data and its applications in business (not just IoT).

Read more…