Project Vita: Chapter 4 – Retrieving Data with a Scalable Sensor Solution

Instead, following MuleSoft’s three-tier architecture, we implemented a ‘Process API’ to act as our master node and deployed it to infrastructure within each ‘site’ (i.e. on the same subnet as an array of nodes, such as an employee’s home or an office). This Process API searches all nodes (System APIs) in parallel for the required sensor and retrieves the data before returning it to the client. The master node also enabled us to retrieve readings from all available sensors within the site simultaneously and aggregate the results into a single response. From a security point of view, it also meant we could restrict access to the System APIs and regulate inbound traffic to the site via a single point of entry.

The next step was aggregating results between the different sites as we had multiple employees running nodes at home and planned to add additional nodes to each of our client's offices. We were able to reuse our master node concept and add an additional tier within the process layer which would be deployed to CloudHub and act as our ‘super node’. This API could simultaneously call each of our master nodes and aggregate the results, or it could act as a single endpoint which would search and retrieve the data for any specific sensor within any site (globally).
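To make the fan-out concrete, here is a minimal Python sketch of how a super node might call each master node in parallel and aggregate the results. The endpoint URLs and payload shape are assumptions for illustration only, not our actual API.

```python
import concurrent.futures
import json
import urllib.request

# Hypothetical master-node endpoints, one per site. The URLs and the
# response shape are placeholders, not Project Vita's real contract.
MASTER_NODES = [
    "http://home-site-1.example/api/sensors",
    "http://office-site.example/api/sensors",
]

def fetch_site(url, timeout=5):
    """Call one master node and return its readings, or an error stub."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return {"site": url, "readings": json.load(resp)}
    except OSError as exc:  # URLError/timeouts are OSError subclasses
        return {"site": url, "error": str(exc)}

def aggregate(urls):
    """Fan out to every master node in parallel and merge the responses."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        return list(pool.map(fetch_site, urls))
```

Because unreachable sites come back as error stubs rather than exceptions, one slow or offline site does not prevent the super node from returning the rest of the results.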

Containerization of the Mule Runtime

As we were developing our solution remotely, with multiple team members configuring nodes across various sites, it became imperative that we had a consistent development environment, particularly when it came to troubleshooting or debugging.

We were using the on-premises Mule Runtime, but hardware varied considerably, with a combination of laptops, MacBooks, Raspberry Pis, and an HP ProLiant MicroServer, each set up and configured by a different team member. This meant that when anything went wrong during development or deployment, it proved difficult to isolate the issue, despite it often being a simple developer oversight.

We decided to leverage Docker to containerise the Mule Runtime, which enabled us to run one quick command and automatically access the same environment regardless of the hardware in use. This provided a consistent backbone for all of our nodes and master nodes, and also reduced our initial node setup time from a couple of hours to minutes.
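As an illustration, a containerised runtime of this kind might be defined with a Dockerfile along the following lines. The base image, archive name, and paths are assumptions for the sketch, not our actual image definition.

```dockerfile
# Hypothetical sketch of a Mule Runtime container image
FROM openjdk:8-jre
# ADD auto-extracts a local tar.gz; assumes the standalone runtime
# distribution sits next to the Dockerfile
ADD mule-standalone-4.2.2.tar.gz /opt/
WORKDIR /opt/mule-standalone-4.2.2
# Applications dropped into the apps directory are deployed on start
VOLUME ["/opt/mule-standalone-4.2.2/apps"]
CMD ["./bin/mule"]
```

With an image like this, every node starts from an identical environment, and mounting the apps directory lets each site deploy its own applications without rebuilding the image.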

We had the beginnings of our Application Network with the master nodes and super node now deployed, and the data was being brought to life using Einstein Analytics, so we set about finding the next integration challenge while the Salesforce team started looking at embedding Lightning Web Components and leveraging Field Service.


Project Vita: Chapter 3 – Choosing Scalable Salesforce Tools

Processing Platform Events

ThirdEye prides itself on being at the cutting edge of Salesforce’s new features and platform capabilities, and as such, has revised how the platform events are processed as recommended changes to the platform occurred:

1. Salesforce IoT (Retiring October 2020)

  • ‘IoT MiFlora Context’ linked the Platform Event to the Asset object using the unique ‘MAC Address’ identifier
  • ‘IoT Moisture Orchestration’ was created against the Context and served to set and retain the state of the asset’s moisture as each platform event was received. It also processed logic against the message, and when the Moisture State changed, a case was automatically created.
 

2. Process Builder + Flow Builder

Foreseeing the retirement of ‘Salesforce IoT’, we migrated the logic to Process Builder and Flow. The only structural addition we needed to make was to add and maintain the ‘Current Moisture State’ on the Asset Object:
  • Process Builder listens for Plant_Event__e Platform Events, gathers related object information about the specific Plant, and passes this information to Flow for assessing the Moisture State.
  • The Flow determines if the state has changed since the last event and creates a case if it has changed to ‘High’ or ‘Low’ Moisture. We also notify the plant’s owner to advise them of what action to take.
  • Once the Moisture State has changed back to ‘Sufficient’, the Cases for the Asset are automatically closed.
We also have a scheduled Flow that determines whether the last sensor reading was received more than a certain amount of time ago. If so, the Asset is marked as ‘Offline’ and the plant’s owner is notified.
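The state-change logic the automation implements can be summarised in a few lines. This is an illustrative Python sketch of the decision table, not the Flow itself; only the state names come from the description above, and the function and action names are assumptions.

```python
# Illustrative sketch of the moisture-state decision logic described above.
# 'create_case', 'notify_owner', and 'close_cases' are placeholder action
# names, not real Salesforce API calls.

def assess_moisture(previous_state, new_state):
    """Return the actions the automation should take for a state change."""
    if new_state == previous_state:
        return []                       # no change, nothing to do
    actions = []
    if new_state in ("High", "Low"):
        actions.append("create_case")   # raise a Case for the Asset
        actions.append("notify_owner")  # tell the plant's owner what to do
    elif new_state == "Sufficient":
        actions.append("close_cases")   # auto-close the Asset's open Cases
    return actions
```

Keeping the ‘Current Moisture State’ on the Asset is what makes the first comparison possible: without it, every event would look like a change.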

3. Flow Builder (Summer ‘20 Release)

Instead of using Process Builder as the listener for the platform event, the ‘Salesforce Summer ‘20’ release allows you to do all of this from one automation tool, Flow Builder!

Einstein Analytics & Historic Sensor Readings

It was all well and good getting notifications that our plants were thirsty, but we also wanted to visualize the data and see it for ourselves. So, for each of the processing automations above, we also stored the readings as records in a custom object called ‘Asset Sensor History’. After setting up an hourly incremental sync for the sObjects and a Dataflow in ‘Einstein Analytics Data Manager’ for this historical data, we could use the resulting dataset to display a Timeline chart. However, the question of scalability loomed over us again when we calculated how quickly sObject storage would be depleted. It was time for us to use a Big Object. To make it configurable from the Setup UI, we created a Schedulable Batchable Apex Job that:
  1. Replicates the sObject data into the ‘Asset Sensor History Store’ Big Object
  2. Checks for successful replication
  3. Deletes successfully replicated sObject data
  4. Triggers 'Einstein Analytics Data Manager' to sync the SFDC_LOCAL Connector Big Object data
Now that we had both of these in Analytics, we created a scheduled ‘Recipe’ to merge, transform, and denormalize our data sources into the dataset that we now use for dashboards and analysis.
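The job itself is Apex, but the replicate-verify-delete pattern behind the four steps is language-neutral. Purely as an illustration, here is a hedged Python sketch of that pattern; the store and row shapes are assumptions, not our actual schema.

```python
# Illustrative replicate-verify-delete pattern, sketched in Python.
# 'big_object_store' stands in for the Big Object; rows are plain dicts.

def archive_batch(source_rows, big_object_store):
    """Replicate rows, verify each copy, then drop the verified originals.

    Returns the rows that failed verification and so must be retried.
    """
    for row in source_rows:
        big_object_store[row["id"]] = dict(row)           # 1. replicate
    verified = [r for r in source_rows
                if big_object_store.get(r["id"]) == r]    # 2. check success
    # 3. "delete" verified originals by keeping only the failures
    return [r for r in source_rows if r not in verified]
```

The key property, mirrored from the Apex job, is that a source row is only ever deleted after its copy has been confirmed, so a failed replication can never lose data.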

High-Level ERD & Analytics Dataflow

Project Vita: Chapter 2 - The Quest for Data

Tools Included

Hardware:
  • Plant Sensor
  • Bluetooth Enabled Controller

Software:
  • Mule 4.2.2 Runtime
  • Salesforce
  • Python 3

Different Approaches

The first challenge was identifying a feasible means of capturing plant data and narrowing down sensors that are compatible with a wider technological landscape (i.e. not locked into a proprietary solution or product).

At our disposal we had a Raspberry Pi (a compact and powerful single-board computer) with multiple GPIO headers to which all kinds of sensors could be connected. On the market you can typically find two types of soil sensor: resistive and capacitive. The capacitive option was preferred since it has no exposed metal by design and, unlike a resistive sensor, is at risk of neither oxidation nor corrosion.

One of the first sensors we experimented with was a capacitive moisture sensor from Adafruit called the STEMMA Soil Sensor, which, once connected to a Pi, we were able to use to get readings for soil moisture and temperature. We successfully leveraged this solution for two plants located side by side, but it quickly became evident that this was not scalable to the extent we required.

It was back to the drawing board to explore what other sensors were available on the market, where we identified a possible candidate created by Xiaomi: a wireless BLE (Bluetooth Low Energy) device which would provide us with readings for both moisture and temperature, as well as light exposure and soil fertility.

Solution Architecture (v1)

After exploring a few different approaches and iterating through the most promising one, the solution uses IoT sensors, Raspberry Pis, MuleSoft, Python, and Salesforce.

A sensor is placed in each plant pot that monitors the ambient temperature, light exposure, moisture and soil fertility (measured as conductivity). Each Raspberry Pi is placed in a cluster of plants with the ability to scan for nearby sensors and retrieve individual plant data. This is made possible by leveraging Python code running on each Pi and deployed applications which send data to Salesforce at a set interval. These applications were built using MuleSoft - an integration and API platform for connecting applications, data and services.

Taking a deeper dive, we hosted our Mule applications on the Mule Runtime (4.2.2) to serve as a lightweight and scalable runtime engine. Following best practices such as modularization, concise documentation in READMEs, and promoting consumption, we developed two applications to take care of our integration requirements:

  • System API - leverages the Python application on the backend and transforms the payload in accordance with our API contract (API RAML Specification).
  • IoT Batch Scheduler – calls various endpoints of the Mule System API, for example scanning for local devices and retrieving data. This means that in a cluster of plants or in a set location, one Raspberry Pi can be used to retrieve readings from any number of devices within range. The batch scheduler application then sends the data to Salesforce at a user-defined frequency.
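As a rough idea of the System API's transformation step, the sketch below maps a raw reading onto a contract-style response. Every field name here is an assumption for illustration, not our actual RAML specification, and the real transform is written in DataWeave rather than Python.

```python
# Hypothetical transform from a raw sensor payload to an API-contract
# shape. All field names are illustrative placeholders.

def to_contract(raw):
    """Map a raw reading onto the (assumed) System API response shape."""
    return {
        "macAddress": raw["mac"].upper(),       # normalize the identifier
        "readings": {
            "temperatureC": raw["temp"],
            "moisturePct": raw["moisture"],
            "lightLux": raw["light"],
            "conductivity": raw["fertility"],   # soil fertility as conductivity
        },
    }
```

Centralising this mapping in the System API means the batch scheduler and any future consumers only ever see the contract shape, never the sensor's raw output.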

In summary, each Pi runs three applications: one Python application and two Mule applications. The sequence diagram illustrates how these three applications interact and how data is sent to Salesforce as a Plant Event.

As a final step, we added automation and in-built resiliency where possible to ensure our solution is easily consumable, robust, and scalable. For example, each Pi has a cron job that pulls the latest source code from our Git repository, compiles the applications, and automatically redeploys on reboot in the case of a power outage or if applications need to be updated. Now that we have our plant data successfully feeding into Salesforce, stay tuned next week to see how we are leveraging the power of Salesforce IoT Orchestration to bring the data to life!
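For illustration, reboot automation of this kind can be expressed as a single crontab entry using cron's `@reboot` special string; the repository path and script name below are placeholders, not our actual setup.

```
# Hypothetical crontab entry: pull, rebuild, and redeploy after every boot
@reboot cd /home/pi/project-vita && git pull && ./build-and-deploy.sh
```

Running the whole pipeline at boot means a power cycle is all a site needs to pick up the latest applications.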

Project Vita: Chapter 1 - Will Your Desk Plant Survive Covid-19?

Amid the business chaos of adjusting to working during a global pandemic, our dozens of plants were suddenly deprived of their normal caretakers, who are in tune with their watering and care needs. Instead, they were left in the hapless black-thumbed hands of yours truly. While I tried my best for the first couple of weeks to ensure that each plant was watered regularly, I soon learned that a one-size-fits-all approach to plant watering, combined with less regulated office temperatures, was not faring well for our flora friends. With nobody knowing when we might be able to return to our offices as normal, the team was very concerned for the wellbeing of their plants, and as we're reading in the news, we were not alone with this concern.

We had our business problem. We needed a way to better monitor the state of our potted pals and figure out how to enable someone who is clueless about plant care (me) to give these plants the love and care they require. Now, after brainstorming with the talented tech team at ThirdEye, we have our solution: Project Vita.

With Project Vita, we will solve the plant problem the way we know best: by putting market-leading technologies to use. Each plant will be equipped with a wireless sensor that feeds data to a Raspberry Pi, at which point MuleSoft will take regularly scheduled and instant reads of plant data and feed it into Salesforce IoT Orchestration. From there, we will be using Salesforce Service Cloud to raise cases when our plants fall below the moisture threshold for the plant species, as well as alert us if sunlight or temperature conditions are not optimal. Einstein Analytics will give us insights from the history of sensor readings - and that is just the start.
While we realize this is a somewhat frivolous application of technology, Project Vita will serve as a fully functional demonstration of how IoT Orchestration can be combined with the already powerful features of the Salesforce technology stack to help solve problems preventively and with greater accuracy. As we go on this journey, we will be sharing regular project updates, success stories, and horror stories. Welcome to Project Vita - we hope you will follow it with us, and we would love to hear your ideas!