

Olha Bohun's Posts (8)

The massive data streams produced by IoT devices can be efficiently processed using flow-based programming. This approach allows for the interconnection of multiple IoT devices, creating a flow between them, and thus the ability to exchange data across predefined connections and networks.

Thanks to the capabilities of the cloud, this approach can facilitate the processing of deep real-time telemetry data pools. Such a solution can be especially useful in asset-intensive industries such as logistics and transportation, automotive, agriculture, utilities, oil, gas and manufacturing.

Learn how to build IoT telemetry simulators for the Azure and Amazon cloud platforms using the Node-RED tool.

Read more…

The internet of things (IoT) has the power to influence so many of our services and utilities. So many, in fact, that even power itself is included in the concept’s clean sweep of the world’s commodities. Here’s what you need to know about the Internet of Energy.

 

The three Ds – decarbonisation, decentralisation and digitisation – are transforming the energy sector, as the quest for a carbon-free world continues. This falls on a backdrop of the IoT’s driving of efficiency around wind turbines and solar systems, which look set to represent the future of global energy consumption. The new connected energy business model is already here, it is called the internet of energy.

The internet of energy explained

The year is 2018 and the demand for clean energy has never been higher. Governments presiding over developed markets face pressure to cut emissions in the face of global warming, while projections from the International Energy Agency outline a 55% rise in global energy demand between 2005 and 2030.

Access to energy has already increased in recent years, but with the UN estimating global population growth of a further two billion by 2040, the coming years will lead to huge pressure on solar and wind power to meet this growing demand.

Another factor is the decentralisation of energy grids as a result of old, centralised systems failing to integrate newer units, like solar panels. Operators claim that billions are being spent on stabilising faulty power grids every year, with some of this cost passed on to the consumer.

A solution comes in the form of an internet of energy, whose technology can provide the infrastructure for decentralised, smarter energy grids and a stable supply of power.

The future of power

The internet of energy is based on a foundation of data, collected by a network of sensors with varying applications.

General Electric is one of the groups that use sensors within its turbines to monitor things like output and productivity. This data is funnelled into a computer, combined with information on external factors like the weather or fuel costs, which churns out recommendations for peak performance.

Artificial intelligence is ideal in this situation because of its ability to analyse data much faster and more effectively than humans. In 2017, the United States Department of Energy praised the concept after using it to examine past fluctuations in power and determine how to build a more stable and efficient grid.

Quantum computing is another area of much interest for energy players. Its ability to process and store data at a faster pace than a classical computer makes it perfect for oil rigs, where tens of thousands of sensors are used to collect information on the performance of equipment.

At the core of further decentralisation has been blockchain; touted by China’s State Grid Corporation as a way of securing information on things like use of power and market prices. Data can then be shared with government bodies and private firms to develop a deeper understanding of the country’s energy consumption.

Conclusion

In near enough every use case of the internet of energy, there is an underlying theme: connectivity. By collecting, analysing and trading data via a secure, decentralised network, the energy industry can start to find a route towards providing renewables for the world’s population.

The truth is, solar and wind turbines will not solve our problems alone. With revelations that the annual waste of renewable energy from China is enough to power Beijing for an entire year, there is a clear need for a network to make better use of this equipment.

Given the rising demand for energy, and the fact that its production model dates back over 100 years, our companies must embrace the innovations that can accelerate its production. Through an internet of energy, we might have found the answer.

Get in touch to see how your own organisation can benefit from decentralised solutions and IoT.

 

Originally published at eleks.com

Read more…

 

Although the industry has long since absorbed the revolution brought about by machinery, a new batch of technologies already defines the state of technological innovation in agriculture.

 

The agricultural sector is in the middle of a data-driven transformation. Farmers and commodity traders are heading towards technological innovation in agriculture, adopting data analytics and smart farming technologies. Facing a crucial period in their history, agricultural businesses are tasked with combating the issues that will change not only their working methods but the world as we know it.

The agribusiness issues at hand

One of the greatest pain points in agriculture is the difficulty of predicting which events will lead to a given result.

Conditions are even less favourable for farms positioned within markets that face rising production costs. According to forecasts from the United Nations, the global population will reach 9.6 billion people by 2050, up from around 7 billion at present; combined with the spread of economic prosperity, this is adding great pressure to the market. The UN suggests doubling crop production by 2050 as a countermeasure to this growth.

Some farmers simply cannot increase their land in order to grow more crops. As a result, there is a case for technology to make better use of the space available.

 

How IoT and predictive analytics can solve agriculture’s pressing problems

To become more efficient, agricultural businesses need data and plenty of it. This opens the door for technological innovation, as the size of these businesses and their plots of land prevent any kind of manual surveying.

Already we are seeing an active use of IoT devices to analyse the status of crops, capturing real-time data with sensors. For instance, with soil sensors, farmers can detect any irregular conditions such as high acidity and efficiently tackle these issues to improve their yield.

The data gathered from sensors allows businesses to apply advanced analytics and gain insights that aid decisions around harvesting, while machine learning can turn the figures into solid predictions. Using advanced analytics, agricultural businesses can forecast yields, foresee unexpected weather conditions, predict market demand and mitigate risks, as well as better plan their capacity.

Agricultural drones are also among the key components of smart farming today. Tasked with surveying crop and livestock conditions from up high, their time-lapse footage from onboard cameras is helping farmers identify problems in areas like irrigation that would otherwise go undetected.

Other members of the drone family allow for the spraying of crops with greater accuracy than a tractor. As an added benefit, this also reduces the risk of human exposure to harmful chemicals. Back at ground level, there is potential for other robots to help out with manual duties like planting, ploughing and meat production.

The end goal in this case? A more efficient, more effective farm.

 

Conclusion

To spell things out: population growth could mean that every agricultural business will have to increase their levels of productivity over the next 30 years. That said, a review of the tech on today’s market suggests even the most specific problems can be matched with smart agribusiness solutions.

In the era of smart agriculture, IoT and predictive analytics are powering more efficient operations around the world. Combining IoT with analytics, agribusinesses get accurate predictions for crops and market conditions, allowing them to increase their yields and profits. Smart application of these technologies can facilitate warehouse and inventory management and help plan and execute seasonal work with an automated flow of data from the fields and agro-research labs.

Get in touch to discuss where the IoT can help futureproof your own agricultural business.

 

Originally published at eleks.com

Read more…

Chances are you’ve come across the word Bitcoin several times, at the very least in general finance-related news. It’s also likely you’ve been dismissing the cryptocurrency as another gimmick, unless you followed the spike in its value at some point when, say, PayPal malfunctioned, and counted exactly how much a geek who kept his savings in bitcoin earned that way.

In which case you’d know that Bitcoin has risen in value since its introduction in 2008. Like, really risen: in its infancy, while the possible value of transactions was still being negotiated on the bitcointalk forums, one notable transaction of 10,000 BTC was used to purchase two pizzas. Today, 10,000 BTC is worth over 35 million dollars. This growth was achieved in less than 10 years.

Now before we get carried away, one could point out that the brave new currency was designed as a “Peer-to-Peer” system relying on “cryptography to control its creation and management, rather than on central authorities”. And its design borrows ideas from the cypherpunk community. So sounds like the opposite of what businesses should be interested in, right? A skeptic would bring up Bitcoin’s motley history of being oppressed by regulatory authorities or even barely legal.

And yet, if you’re still confusing cyber-punk fantasy and cypherpunk community, here is a puzzling list of companies that deal with bitcoins, and it includes such likely familiar names as Microsoft, IBM, Reddit, Subway, Lionsgate Films, Bloomberg.com, WordPress.com, Wikipedia, Steam, Richard Branson’s Virgin Galactic, and Tesla. Oh, and Bitcoin is now accepted as a legal payment method in Japan.

Now would be a good time to wonder why businesses are starting to pay attention and why governments are beginning to take digital currency into account, making standards for it rather than suppressing it. The answer lies in the principle behind it and, ironically, in the security it provides to the user. The Bitcoin network is fully decentralised and is meant to exclude trust-reliance and intermediaries from financial operations: a direct transaction between users is recorded in a public ledger. No single party has the power to issue new bitcoins or approve Bitcoin transactions. The shared transaction register is called the blockchain. And it’s this technology that is of particular interest to businesses.

While bitcoin as an asset is still too difficult to project with any certainty to view as an investment, the key to its success, blockchain technology, can and should be used for business process improvement. The technology is basically a game changer, a principle of organising human interactions in a secure and yet completely decentralised way, which could have applications in far more than just the financial sector. Blockchain, as put by the Harvard Business Review, is a foundational technology that “has the potential to create new foundations for our economic and social systems.” They call it foundational because the changes it is expected to bring about are not rapid, but very tenacious and large-scale. For comparison, they offer the 30-year path that took TCP/IP technology (yes, the Internet) to success: “it took more than 30 years for TCP/IP to move through all the phases — single use, localised use, substitution, and transformation — and reshape the economy. Today, more than half the world’s most valuable public companies have internet-driven, platform-based business models”.

So how does it work and what are the benefits?

Much like TCP/IP drastically lowered the cost of connections, blockchain has the capacity to reduce the cost of transactions, which makes it a very efficient system of record. Tracking and recording continuous transactions, analysing performance rates based on those records and making plans for the future — all of these processes are integral to a business. Most businesses have no single compound record of all their activities; instead, data is distributed across internal units and then reconciled across ledgers, which takes time and leaves room for human error.

So, what is blockchain and how can it be a solution here? In simple words, blockchain is just a way of handling databases: the ultimate ledger, only a highly efficient, secure and verifiable one. In a regular business operations system, there are numerous dissimilar databases for a company’s various activities: changes are made to them separately, and then you spend resources compiling the whole thing together. In terms of resources, this is the opposite of optimal and efficient.

In a blockchain system, the ledger is replicated in identical databases, hosted and maintained separately. When changes are entered in one copy, all the other copies are simultaneously updated. This smart business approach would keep records of exchanged assets in all ledgers. “If a stock transaction took place on a blockchain-based system, it would be settled within seconds, securely and verifiably”, the Harvard Business Review explains.
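The core idea — records chained together by hashes so that every replica can verify it has not been tampered with — can be sketched in a few lines of Scala. This is our own toy illustration (names like Block and MiniLedger are invented), not Bitcoin’s actual block format or consensus mechanism:

```scala
import java.security.MessageDigest

// A minimal, illustrative hash-chained ledger.
case class Block(index: Int, data: String, prevHash: String) {
  // Each block's hash covers its contents and the previous block's hash,
  // so altering any earlier record invalidates every block after it.
  val hash: String = MessageDigest.getInstance("SHA-256")
    .digest(s"$index|$data|$prevHash".getBytes("UTF-8"))
    .map("%02x".format(_)).mkString
}

object MiniLedger {
  // Append a new record, linking it to the current tip of the chain.
  def append(chain: List[Block], data: String): List[Block] = {
    val prev = chain.lastOption.map(_.hash).getOrElse("0" * 64)
    chain :+ Block(chain.length, data, prev)
  }

  // A replica is valid only if every block still points at its predecessor.
  def isValid(chain: List[Block]): Boolean =
    chain.zip(chain.drop(1)).forall { case (a, b) => b.prevHash == a.hash }
}
```

Because every copy of the ledger can run the same validity check, tampering with one replica is immediately detectable by the others, which is what removes the need for a trusted intermediary.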

Secure, decentralized, shared publicly, trusted and automated. This is exactly the kind of solution that modern professionals would want and software developers would be looking to develop. The most important fact about blockchain is that its applications are not limited to the banking and financial industry — its principle can be used for other business improvement purposes, including smart contracts or establishing a secure document transfer system, network infrastructure or marketing forecasts and many others.

For example, blockchain allows creating a custom system for smart documents management: secure storing and transfer of various kinds of assets.

In establishing secure document transfer networks, blockchain provides an invaluable advantage because of efficient cryptography and a decentralised structure. This sets the foundation for broadening the scope of typical smart contracts from just the financial sector to the legal realm, real estate, intellectual property and much more.

Whether you need to improve identification and authentication solutions or introduce a supply chain verification system, or shared economy solutions for, say, ride-sharing services — there’s a whole new frontier in business systems organisation.

Distributed ledger technology can be tailored by professional software developers for various business optimisations. Let’s discuss how your organization can take full advantage of blockchain’s benefits.


Originally published at eleks.com 

Read more…
Businesses are getting ready for this year’s Black Friday and Christmas shopping season. Leading retailers attempt to predict the hottest trends, define reasonable prices and foresee how long an average consumer will spend on a particular product. With big data analytics and powerful machine learning tools, their predictions will be more accurate than ever.

The term “big data” has become a buzzword for sales teams across nearly every industry over the past few years. Companies have collected vast amounts of data from leads and transactions which no single person would ever be able to process. According to an MIT Sloan Management Review survey of companies earning $500M+ in sales, at least 40% of companies are using machine learning tools to increase performance. From providing insights on leads to recommending new products to current customers, machine learning can revolutionize the sales industry in several ways.

1. Added Customer Support

According to Salesforce’s Adam Lawson, customer experience is the most important variable separating successful and unsuccessful sales teams. ML will allow for significant improvements to customer experience: with the ability to proactively follow up with leads, customize the user’s experience and answer questions via chatbots, every customer will have an experience tailored to their preferences and needs.

2. Improved Forecasting

Within the last few years, advanced lead scoring has become an extremely popular tool for sales teams. Lead scoring, which uses ML, looks at collected data on prospects, such as their budget, size, past sales and interaction with marketing emails, and then formulates a score that projects interest and the likelihood of a sale. This process reduces the number of dead leads and focuses a sales team on converting strong leads to clients.
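As a rough sketch of what lead scoring does under the hood, consider a weighted sum of normalized prospect features. The Lead fields and weights below are invented for illustration; a production system would learn its coefficients from historical conversion data rather than hand-pick them:

```scala
// A toy lead-scoring model: a weighted sum of normalized prospect features.
case class Lead(budget: Double, companySize: Double,
                emailOpens: Double, pastPurchases: Double)

object LeadScoring {
  // Hand-picked weights standing in for learned model coefficients.
  val weights: Seq[Double] = Seq(0.4, 0.2, 0.15, 0.25)

  // Score a lead: higher means more likely to convert.
  def score(l: Lead): Double = {
    val features = Seq(l.budget, l.companySize, l.emailOpens, l.pastPurchases)
    features.zip(weights).map { case (f, w) => f * w }.sum
  }

  // Rank prospects so the sales team works the strongest leads first.
  def rank(leads: Seq[Lead]): Seq[Lead] = leads.sortBy(l => -score(l))
}
```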

3. Personalized Suggestions

In the retail industry, you may have noticed the text “other customers purchased” or “you may also be interested in,” followed by a list of similar or complementary products. These suggestions are thanks to ML, which supports a consumer-centric approach through the analysis of sales patterns, purchase histories and consumption data. Companies like Amazon, Spotify, and Netflix are already employing this solution to suggest additional content for customers. As this technology becomes more readily available, smaller retailers and SaaS companies will begin to follow suit.
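The simplest form of such a suggestion engine counts how often items appear together in past orders and proposes the most frequent companions. The sketch below is a deliberately naive stand-in for the collaborative-filtering models used at scale:

```scala
// "Customers also bought": recommend up to n items most frequently
// purchased together with `item` across historical orders.
object AlsoBought {
  def recommend(orders: Seq[Set[String]], item: String, n: Int = 3): Seq[String] =
    orders.filter(_.contains(item))                      // orders containing the item
      .flatMap(_ - item)                                 // its companions in those orders
      .groupBy(identity)
      .map { case (other, occs) => (other, occs.size) }  // co-occurrence counts
      .toSeq
      .sortBy { case (other, count) => (-count, other) } // most frequent first
      .take(n)
      .map(_._1)
}
```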

In the sales industry, ML can streamline the entire consumer relationship from the first point of contact to customer support. As machine learning continues to improve both sales teams’ and customers’ experiences, its influence over the sales industry will only increase over the next few years.

Want to keep your sales high even after the holiday buying boom is over? Contact us at ELEKS. We’ll make sure your team is equipped with the machine learning tools your company needs to get ahead.

Originally published at eleks.com on November 9, 2017.

Read more…

How Can You Cope With The Rise Of Dark Data

At this point, everyone has heard about what big data analytics can do for marketing, research, and internal productivity. However, only about 20% of all data created is collected and analyzed. The other 80% is known as dark data: data that is collected but never analyzed or made searchable. So, what is the purpose of this data, and why is it taking up terabytes worth of storage space on servers around the world?

Examples of Dark Data

  • Media: Audio, video and image files oftentimes will not be indexed, making them difficult to gain insights from. Contents of these media files, such as the people in the recording or dialogue within a video, will remain locked within the file itself.

  • Social Data: Social media analytics have improved drastically over the last few years. However, data can only be gathered from a user’s point of entry to their exit point. If a potential customer follows a link on Facebook and then sends the visited website to five friends in a group chat, the firm will not realize its advertisement had six touchpoints rather than just one.

  • Search Histories: For many companies, especially in the financial service, healthcare, and energy industries, regulations are a constant concern. As legal compliance standards change, firms worry that they will end up deleting something valuable.

As analytics and automation improve, more dark data is being dragged out into the light. AI, for example, is getting far better at speech recognition. This allows media files to be automatically tagged with metadata and audio files to be transcribed in real time. Social data is also starting to be tracked with far better accuracy. In doing so, companies will be able to better understand their customers, their interests, and their buying habits. This will allow marketers to create limited, targeted ads based on a customer’s location that bring in more revenue while reducing cost.

The explosion of data we are currently seeing is only the tip of the big data iceberg. As IoT and wearable devices continue their integration into our daily lives, the amount of data we produce will only grow. Companies are looking to get ahead of the curve and ensure they can gain as much insight from this data as possible. If these firms do not have a plan to create actionable insights from this currently dark data, they ultimately could fall behind and lose out to competitors with a bigger focus on analytics.

The original story was published on ELEKS Trends Blog, visit to get more insights. 

Read more…

At ELEKS, we run several R&D projects related to IoT, drones and other smart devices. Over the course of these projects, we discovered that there are some tasks that are common for different devices and platforms. Our team has substantial experience with complex enterprise solutions, so when our partners from Cisco Systems suggested that we apply our expertise to IoT and create a framework for IoT solutions, we gladly joined the project.

Currently, there are a lot of IoT tools and solutions (as many as 35 are mentioned in a single article), so why should we create something new? The majority of existing frameworks focus on programming and controlling separate devices. We, however, aimed to develop a solution that would allow programming complex systems consisting of multiple devices that interact with each other.

Consequently, the framework had to provide the following set of features:

  • An interface for interaction with hardware;
  • Communication between devices;
  • Automatic search of available devices;
  • Dynamic scaling;
  • Integration with other systems;
  • Fault tolerance.

We saw that the majority of requirements could be implemented with Akka.

Ideation

To understand the problems we wanted to solve with Pragmukko, let’s look at the hypothetical business case that can become a commonplace occurrence in the very near future. Assume we have several drones controlled by Raspberry Pi, a regular PC as a control station and a smart fridge with ice cream controlled by Intel Edison. All the devices are physically located in one room and are connected to one and the same network via Wi-Fi. A drone has to pick up and deliver some ice cream from the fridge when a user orders it to do so. Therefore, the framework’s main goal is to enable easy programming of this function.

Since the project team decided to use Akka, our first idea was to launch an Akka node on every component of our system — the drones, the control station and the fridge — and then build the logic of every component as an actor in the node. This way, our system would look like a regular Akka cluster, with some nodes capable of flying and others being fridges with ice cream. Akka enables the drones, the fridge and the control station to communicate with each other, monitor the state of the cluster, receive the list of cluster members, etc. Under certain conditions, our system is decentralized and resistant to the failure of individual components. Moreover, we can easily add new components to the cluster – they immediately get the information about other members and join the operation. This means that we can effortlessly install one more fridge or, if any drone takes too much ice cream and falls down, employ additional drones.

In general, we had to write a hardware interaction layer and bring the whole IoT and drones thing together. That’s what we did, along with some other things.

Implementation

To bring the system to life, the team took the following steps. First of all, we created a mechanism to automatically group the nodes in a cluster. Although Akka has this function, we ignored it because it required preliminary configuration of seed nodes. We also needed to know the network addresses of all cluster members, and these were the kind of complications we wanted to avoid. Instead, the project team developed the node grouping mechanism based on UDP broadcasting.
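To illustrate the mechanism, here is a stripped-down sketch of UDP-broadcast discovery in the spirit of the Broadcast Actor / UDP-Responder pair described below. Plain java.net sockets are used instead of Akka I/O, and the port number and JOIN message format are invented for the example, not Pragmukko’s actual wire protocol:

```scala
import java.net.{DatagramPacket, DatagramSocket, InetAddress}

// A minimal sketch of UDP-broadcast node discovery.
object Discovery {
  val Port = 8765
  val Prefix = "JOIN:"

  // A device announces itself to everyone on the local network.
  def announce(nodeId: String): Unit = {
    val socket = new DatagramSocket()
    try {
      socket.setBroadcast(true)
      val payload = (Prefix + nodeId).getBytes("UTF-8")
      val packet = new DatagramPacket(payload, payload.length,
        InetAddress.getByName("255.255.255.255"), Port)
      socket.send(packet)
    } finally socket.close()
  }

  // The manage node listens for one announcement and returns the sender.
  def listenOnce(): (String, InetAddress) = {
    val socket = new DatagramSocket(Port)
    try {
      val buf = new Array[Byte](1024)
      val packet = new DatagramPacket(buf, buf.length)
      socket.receive(packet) // blocks until an announcement arrives
      val msg = new String(packet.getData, 0, packet.getLength, "UTF-8")
      (msg.stripPrefix(Prefix), packet.getAddress)
    } finally socket.close()
  }
}
```

The advantage over Akka’s built-in seed nodes is that no member needs to know any other member’s address in advance; a new device only needs to be on the same network segment.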

Then, we programmed the device nodes to interact with hardware by transforming the incoming Akka messages into commands. We named such nodes Embedded Nodes. Schematically, they can be presented in the following way:

Drones and IoT ELEKSLabs

Broadcast Actor is an actor that sends UDP-broadcast messages for the node to be identified and added to the cluster.
Embedded Actor is an actor that processes the messages from other nodes of the cluster and interacts with hardware. This actor can be extended to implement user logic.
Hardware Layer is the layer for hardware interaction. It is an actor that receives binary commands and writes them to the serial port.

Also, we introduced one more type of node, the Manage Node, to work on a server, command the devices and provide interaction with other systems. Pragmukko allows extending the functions of a Manage Node with plugins that can also integrate with other systems. The plugins we have integrate with MongoDB and HDFS, as well as implementing a REST interface based on Akka-http. Below is an illustration of this node:

Drones and IoT ELEKSLabs-2

UDP-Responder is an actor that adds new nodes to the cluster.

Sample code

Unfortunately, the project team was unable to implement the business case described above since there’s no fridge with ice cream in our office. Therefore, we tested a simplified case. We had several quadcopters randomly flying over a certain territory. They were controlled by the control station that kept them inside the perimeter.

Drones and IoT ELEKSLabs-3

Here’s the example of the code that works on a quadcopter:

object EmbeddedMain extends App with DroneCommands {

  // Initialize the drone's initial speed
  var (vx, vy) = (Random.nextFloat(), Random.nextFloat())

  // Special helper to create an Embedded Node
  EmbeddedPragma {
    ctx => {

      // The Start message comes right after the node joins the cluster
      case Start =>
        // Notify the hardware that we want to receive telemetry
        ctx.subscribeHardwareEvents()

        // Notify the hardware layer about the desired initial position and speed:
        // we tell the drone to rise 10 meters up and start moving
        // in directions vx and vy with the specified speed
        ctx.self ! moveTo(0, 0, -10)
        ctx.self ! direction(vx, vy, 0)

      // Here, we process the autopilot data.
      // The data is received in binary format; the tools for processing it
      // are available in the DroneCommands trait.
      case TelemetryBatch(batch) => // a message containing drone telemetry
        // Check whether the received data contains the drone's position in space
        val position = batch.collect { case DronePositionLocal(p) => p }.lastOption
        // If the position is located, send it to all listeners
        ctx.listeners foreach ( _ ! position )

      // Processing commands from the control station

      case "turn x" =>
        vx = -vx
        ctx.self ! direction(vx, vy, 0)

      case "turn y" =>
        vy = -vy
        ctx.self ! direction(vx, vy, 0)
    }
  }
}

And the code of the control station:

// plug-in listing
class DroneControlExt extends GCExtentions with DroneCommands {

  override def process(manager: ActorRef): Receive = {

    // Processing messages from a drone:
    // if the drone crosses the 20x20 m perimeter,
    // we send the command to turn along the required axis
    case DronePositionLocal(position) =>
      if (position.x > 10 || position.x < -10) sender() ! "turn x"
      if (position.y > 10 || position.y < -10) sender() ! "turn y"
  }

}

Basically, that’s all there is to the code. We compiled it and launched the Embedded Node on the drone. As for the control station, it can be launched anywhere, for instance on a laptop, provided that it is on the same network as the drones.

So, what’s happening behind the scenes? A drone, when launched, starts identifying itself with broadcast messages. As soon as a control station gets the message, it adds the drone to a cluster. Finding itself in the cluster, the drone adds the control station to the context.listeners list.

The control station and the drone’s Embedded Node both implement the DroneCommands trait. This trait contains utilities for the Pixhawk MAVLink protocol. For example, the direction(dx: Float, dy: Float, dz: Float) method forms a binary command that sets the drone’s speed along the given axes. All incoming binary commands are automatically transmitted to the hardware interaction layer that communicates with the drone’s autopilot. Since we implemented the hardware interaction layer as a simple interface to the serial port, other autopilots would work too if you change the implementation of DroneCommands.
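To give a feel for what such a helper produces, here is an illustrative sketch of packing a velocity setpoint into a little-endian byte payload, which is the byte order MAVLink uses. The VelocityCommand object is our own simplification; the real DroneCommands trait emits complete MAVLink frames with headers, message ids and checksums:

```scala
import java.nio.{ByteBuffer, ByteOrder}

// A simplified sketch of forming a binary velocity command:
// three 32-bit floats packed little-endian, as in MAVLink payloads.
object VelocityCommand {
  def pack(vx: Float, vy: Float, vz: Float): Array[Byte] = {
    val buf = ByteBuffer.allocate(12).order(ByteOrder.LITTLE_ENDIAN)
    buf.putFloat(vx).putFloat(vy).putFloat(vz)
    buf.array()
  }

  // The inverse, as the hardware layer would decode it from the serial port.
  def unpack(bytes: Array[Byte]): (Float, Float, Float) = {
    val buf = ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN)
    (buf.getFloat(), buf.getFloat(), buf.getFloat())
  }
}
```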

Here are a few more words about deployment. As the number of drones and IoT devices we experimented with grew, the time needed to deploy them also increased. We had to go through the same routine for each device: install Java, upload the app, register it as a daemon and reboot the device. Only after all these steps could we deploy the Manage Node with all the components it interacted with. The process was so tedious that the team immediately wanted to automate it. On the advice of our partners from Cisco, we used Mantl.io, a platform for rapidly deploying globally distributed services that provides all the necessary components to start fast and improve often. As Cisco’s CTO Zorawar Biri Singh outlines in his interview with InfoWorld, in the foreseeable future this lightweight, high-level container PaaS can be used as an orchestrator for building tightly coupled systems.

Drones and IoT ELEKSLabs-4

With just a few Ansible scripts, we were capable of deploying the necessary infrastructure, regardless of the number of devices in it. The team really loved this method and, as a result, we highly recommend it.

Performance

Traditionally, Akka is used to build distributed server solutions, so we faced a question: are Akka-based products effective enough on devices with limited computing capabilities? We found the answer through an experiment. In our case, we used the Raspberry Pi 2 Model B as the drone’s onboard computer. The device had the following characteristics:

  • A 900MHz quad-core ARM Cortex-A7 CPU
  • 1GB RAM

The characteristics might look quite impressive, but alongside our process, this mini computer also had to run two important processes with nearly real-time priorities. To prevent our software from starving the neighbouring processes, we limited its CPU usage to a single core. Additionally, we set one more limitation: the Java process’s memory use was restricted with the -Xmx32m parameter.

Before the test flight, the team decided to check how our Embedded Node would behave on the Raspberry Pi under these limitations. Once launched, the process immediately utilised 25% of the CPU (100% according to top), i.e. just one core out of four, and within a couple of seconds CPU use dropped to an acceptable 10-15%. Memory showed even better results: the process did not use all of the allocated 32 MB, and the garbage collector only kicked in after a time-out.

During the flight, the performance parameters declined: CPU use reached 20% and the garbage collector worked intensely, because the autopilot generated a lot of telemetry data. If necessary, there is room for optimisation by limiting the telemetry stream to autopilot configuration data only. We did not introduce such limitations because the parameters stayed within the normal range and the software did its job well: the control station directed the drone and kept it within the designated perimeter.

The project also revealed a number of challenges we still have to overcome. In particular, the higher the drone’s speed, the harder it is to control. This is a complex problem, and our framework cannot take all the blame for it. The control scheme we used in the experiment is hardly suitable for real tasks; we could instead feed the drone destination coordinates rather than velocity vectors. Such an approach would improve the accuracy of the drone’s positioning and offset the dependence on the data link between the drone and the control station.

The second problem is our software’s slow startup. Akka is very slow to start: once the actor system is launched, joining the cluster takes from 5 to 20 seconds, depending primarily on the quality of the network connection. We are still working to improve this aspect.

Emulation

Another interesting feature of Pragmukko is the ability to emulate hardware cluster members. This can come in handy when testing the business logic implemented on the devices. To enable hardware emulation, we extended the framework with the ability to switch the Hardware Interaction Layer to a mock. For the tests, we saved real telemetry in a file and then replayed it on the Embedded Node. I can already see how we will run integration tests for our drones on Jenkins. Then, the expression “the tests have crashed” will not be nearly as dramatic as it is now.
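The replay idea can be sketched roughly as follows. The HardwareLayer trait and line-per-reading file format here are our own simplification of Pragmukko’s actual mock mechanism, shown only to illustrate swapping the serial port for recorded telemetry:

```scala
import scala.io.Source

// The seam between business logic and hardware: in production this reads
// the serial port; in tests it replays a recorded telemetry file.
trait HardwareLayer {
  def telemetry(): Iterator[String]
}

// Feeds previously recorded telemetry lines to the node under test,
// so business logic can be exercised without a physical drone.
class MockHardware(recordedFile: String) extends HardwareLayer {
  def telemetry(): Iterator[String] = Source.fromFile(recordedFile).getLines()
}
```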

Follow the link to learn more about Pragmukko, its features and the project roadmap.

Conclusions

Pragmukko makes the system easy to program, configure and test. In spite of certain complications, we are satisfied with the result, and we believe our framework has the potential for widespread use.

I thank all of you who have read this far. No drone was harmed in the making of this experiment.

Read more…
What if you could browse an online grocery store, add products to your shopping cart and complete your order right on your smart fridge’s screen using your voice instead of touch interactions? Does speaking to your fridge sound weird? Not at all. At least when you have a friendly, voice-powered shopping assistant integrated with your refrigerator.
Read more…