We all know the phrase "knowledge is power." We live in a digital age, and today knowledge is data. With the value of data in some settings touted to have surpassed that of gold, managing data should be of increasing concern to every organization.
When experts refer to data, one important point is often masked as hype overtakes reason: data is only valuable if it is the right data at the right time, delivered to the right person through the right device, using the right display, so that person can digest it in time to use it.
So, just like everything in this technologically advanced world, data management starts with understanding the intended purpose and possible uses of the data, or, simply put, good old "business requirements" as we know them.
Identifying data requirements first is a core belief at Cypernex.
However, trying to prescribe the ultimate benefits of your data before you start a project is a recipe for "paralysis by analysis." Besides, the project will uncover new opportunities that were not visible at the outset. So, for requirements, a happy middle ground is the best place to start.
Fortunately, Cypernex alleviates the need for your team to constantly jump between "what is technically possible" and "what we want." It's best to start from a basic set of sensors and build up from there.
To develop your data strategy, you must recognize that valuable data has three ingredients:
Accuracy + relevance + experience
Without your experience of your industry, customers, and/or product, the data won't be fit for purpose.
Cypernex includes a range of features to improve data as it is ingested. Think of it like a production line: a machine that accepts raw materials (sensor data) and produces a finished product at the end.
Cypernex data production
Your sensor data comes into Cypernex (on the left) through various pipes and channels (e.g. LoRa, Modbus, MQTT, Sigfox, WiFi, ERP).
You use Cypernex to perform actions on that data, such as:
Cleanse (fill gaps or remove null values for periods when a sensor was disconnected)
Normalize (convert values to a new base unit)
Calendarize (create summaries by day, month, or year)
Embellish (combine with other data to create new information and meaning)
Supplement (if a customer name or location is known, or can be extracted from the ERP, add it into the data stream)
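To make the production-line idea concrete, here is a minimal sketch of the cleanse, normalize, and calendarize steps in Python. The function names and record layout are illustrative assumptions, not the actual Cypernex API.

```python
# Illustrative sketch only -- these helpers are NOT the Cypernex API.
# Each reading is a dict: {"ts": ISO timestamp, "value": number or None}.

def cleanse(readings):
    """Remove null values recorded while a sensor was disconnected."""
    return [r for r in readings if r["value"] is not None]

def normalize(readings, divisor):
    """Convert raw values to a new base unit (e.g. mV -> V)."""
    return [{**r, "value": r["value"] / divisor} for r in readings]

def calendarize(readings):
    """Summarize readings as a daily average."""
    daily = {}
    for r in readings:
        day = r["ts"][:10]  # "YYYY-MM-DD" prefix of the ISO timestamp
        daily.setdefault(day, []).append(r["value"])
    return {day: sum(vals) / len(vals) for day, vals in daily.items()}

raw = [
    {"ts": "2024-05-01T08:00:00", "value": 1200},
    {"ts": "2024-05-01T09:00:00", "value": None},  # sensor disconnected
    {"ts": "2024-05-02T08:00:00", "value": 1400},
]
clean = cleanse(raw)
volts = normalize(clean, 1000)  # mV -> V
print(calendarize(volts))       # {'2024-05-01': 1.2, '2024-05-02': 1.4}
```

Each step consumes the previous step's output, mirroring the production line described above: raw material in, refined product out.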
New data packages emerge, ready to be shipped off to other data production processes using functions, dashboards, AI, ML, and VR, to name a few.
Your data is then sent to similar machines producing new information that is "fit for purpose" and is delivered to the right person, at the right time, in the right place, to help you make informed decisions with confidence.
Status engines
Cypernex streaming data production utilizes status engines.
The engine examines data streams looking for specific patterns; when these appear, the state of the machine, sensor, or entire application changes, initiating a set of tasks, alerts, dynamic functions, and/or automations. All states are conditional, with a "no-code" setup that can draw on other datasets, forecasts, predictions, and more.
The pattern may exist as a combination of data features spread across datasets, and to avoid making setup overly complex, we found a way to make it really simple.
When something happens, press the record button and Cypernex remembers it. Give that event a name and you can reuse it again, combining your experience with your data.
Once a status condition is recorded, there's nothing more to do: the condition is activated automatically whenever the incoming data stream is processed.
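The record-then-replay behaviour can be sketched as a small event loop: record a named condition once, attach tasks or alerts to it, and every subsequent reading is checked against it automatically. This is an illustrative Python sketch under assumed names; it is not the Cypernex status-engine API.

```python
# Hypothetical sketch of a "record and replay" status engine.
# Class and method names are assumptions, not the Cypernex API.

class StatusEngine:
    def __init__(self):
        self.conditions = {}  # name -> predicate over a reading
        self.handlers = {}    # name -> list of tasks/alerts to run

    def record(self, name, predicate):
        """'Press the record button': remember a named pattern for reuse."""
        self.conditions[name] = predicate

    def on(self, name, handler):
        """Attach a task or alert that fires when the named state occurs."""
        self.handlers.setdefault(name, []).append(handler)

    def ingest(self, reading):
        """Every recorded condition is checked automatically per reading."""
        fired = []
        for name, predicate in self.conditions.items():
            if predicate(reading):
                fired.append(name)
                for handler in self.handlers.get(name, []):
                    handler(reading)
        return fired

engine = StatusEngine()
engine.record("overheating", lambda r: r["temp_c"] > 80)
engine.on("overheating",
          lambda r: print(f"ALERT: {r['sensor']} at {r['temp_c']} C"))

engine.ingest({"sensor": "pump-1", "temp_c": 65})  # no state change
engine.ingest({"sensor": "pump-1", "temp_c": 85})  # fires the alert
```

Once `record` has been called, no further wiring is needed: `ingest` evaluates every stored condition on each new reading, which matches the "nothing more to do" behaviour described above.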