
Enabling Data to Flow between Processes in a Specific Order

NI InsightCM™ SDK Help

Edition Date: July 2017

Part Number: 375191C-01


Parent Topic: Device Software Fundamentals

Use the Dataflow Script, an infrastructure feature of the SDK, to implement a loosely coupled order of operations on data sets.

Sometimes it is necessary for processes to operate on data sets as part of a sequence of related steps. Consider the following simplified description of how the NI InsightCM SDK reference code operates on data sets in a specific order:

  1. A process acquires waveforms. After each read operation, the process bundles the data from each channel in a data group into a data set.
  2. A separate process calculates features and spectral bands from the waveforms in each data set and adds the values to the data set.
  3. A separate process evaluates alarm rules for each data set to see if an alarm occurred or if an active alarm cleared. The second process must operate on data sets before this process because feature and spectral band values are common sources of alarms.
  4. If the third process finds that an alarm rule evaluates to TRUE, a separate process writes the waveforms and calculated values to a data file.

Integrate the Dataflow Script feature in your device software using its two components:

  • The Dataflow Script is a single list of processes ordered by the position at which you want their operations to occur. You can add, remove, or change the processes in the Dataflow Script depending on your unique programming goals.
  • The Dataflow Script API, when called to transfer a data set, looks up the next process in the Dataflow Script and sends that process a Qbus message containing the data set. In other words, processes do not need to hard-code the names of the recipients of their data sets. The sketch after this list models both components.
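
For illustration only, the following Python sketch models both components. The actual Dataflow Script API is a set of LabVIEW VIs, so every identifier in the sketch (DATAFLOW_SCRIPT, get_next_process, send_qbus_message, transfer_data_set) is a hypothetical stand-in, not an SDK name; only the ProcessDataFlowBlock message type comes from the SDK, and the process names mirror the four steps described above.

    # Conceptual model only; the real feature is implemented as LabVIEW VIs.
    # All identifiers here are illustrative stand-ins, not SDK names.
    from typing import Optional

    DATAFLOW_SCRIPT = [
        "Acquisition",  # bundles channel data into data sets
        "Features",     # adds feature and spectral band values
        "Alarms",       # evaluates alarm rules on those values
        "FileWriter",   # writes a data file when an alarm rule is TRUE
    ]

    def send_qbus_message(recipient: str, message: str, payload: dict) -> None:
        # Stand-in for posting a Qbus message to another process's queue.
        print(f"-> {recipient}: {message} with keys {sorted(payload)}")

    def get_next_process(current: str) -> Optional[str]:
        # Look up the process that follows `current` in the Dataflow Script.
        index = DATAFLOW_SCRIPT.index(current)
        return DATAFLOW_SCRIPT[index + 1] if index + 1 < len(DATAFLOW_SCRIPT) else None

    def transfer_data_set(current: str, data_set: dict) -> None:
        # The sender never names its recipient; the lookup supplies it.
        recipient = get_next_process(current)
        if recipient is not None:
            send_qbus_message(recipient, "ProcessDataFlowBlock", data_set)

    transfer_data_set("Features", {"waveforms": [], "features": {}})
    # -> Alarms: ProcessDataFlowBlock with keys ['features', 'waveforms']

Because every transfer goes through the same lookup, reordering the script changes the order of operations without modifying any process.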

The SDK reference code includes several of its NI-developed processes in the Dataflow Script. Additional NI-developed processes send data sets to the first process in the Dataflow Script. You can add processes you develop to the Dataflow Script and remove NI-developed processes. The following sections explain how the Dataflow Script and its API work so you can properly integrate your own processes.

Comparison of Dataflow Script and Custom Qbus Messages

Dataflow Script and custom Qbus messages are similar communication methods, and you can develop a process to implement both types of communication. However, the use case for Dataflow Script is distinct from that of custom Qbus messages:

  • Use the Dataflow Script feature to maintain the ability to change the order in which processes operate on a data set. This is possible because all processes in the Dataflow Script transfer data sets through the same message type. Therefore, the message provides a standard interface that is not customized to the recipient.
  • Use custom messages when you can tightly couple the process that sends a message to its recipient. For example, if a process always sends a particular message to one specific process, a custom message is appropriate. The sketch after this list contrasts the two approaches.
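
To make the distinction concrete, the following lines reuse the illustrative helpers from the sketch above. The Trigger process appears later in this topic, but the ForceTrigger message name and its payload are hypothetical.

    # Dataflow Script transfer: a standard message whose recipient is
    # resolved by the API, so the script order can change at any time.
    transfer_data_set("Features", {"waveforms": []})

    # Custom Qbus message: sender and recipient are tightly coupled, so
    # the recipient's name is hard-coded by the sender.
    send_qbus_message("Trigger", "ForceTrigger", {"source": "server"})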

Requirements for Integrating a Process with the Dataflow Script

When you integrate a process with the Dataflow Script, you must ensure the following behaviors are true; the sketch after this list shows a minimal skeleton that satisfies them:

  • The process contains code to handle ProcessDataFlowBlock Qbus messages, which the Dataflow Script API sends to transfer a data set to the next process. Processes handle these messages when they are the oldest messages in the message queue, just as they handle other Qbus messages.
  • The process can receive and evaluate any data set, regardless of which process sent it. This allows you to insert a process into the Dataflow Script without rewriting the adjacent processes to account for the change.
  • The process can transfer a data set to the next process without operating on the data set if the data set does not contain desired elements or properties. This behavior is required because a data set must pass through every process in the Dataflow Script.
  • The process does not need to know the recipient for its data sets. In other words, rely on the Dataflow Script API to look up the name of the next process in the Dataflow Script. This allows you to insert a process into the Dataflow Script without rewriting the adjacent processes to account for the change.
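
A minimal skeleton that satisfies all four requirements might look like the following Python sketch, which reuses the illustrative helpers from the earlier sketch. The check for a "waveforms" element is a placeholder for whatever elements or properties your process requires.

    def handle_message(process_name: str, message: str, data_set: dict) -> None:
        # Requirement 1: handle ProcessDataFlowBlock messages like any other
        # Qbus message dequeued from the process's message queue.
        if message != "ProcessDataFlowBlock":
            return  # messages unrelated to the Dataflow Script go elsewhere
        # Requirements 2 and 3: accept any data set, but operate only when
        # the desired attribute is present; otherwise leave it unchanged.
        if "waveforms" in data_set:
            data_set["features"] = compute_features(data_set["waveforms"])
        # Requirement 4: always transfer the data set onward and let the
        # API resolve the recipient, whether or not it was changed here.
        transfer_data_set(process_name, data_set)

    def compute_features(waveforms: list) -> dict:
        # Placeholder for real feature and spectral band calculations.
        return {"count": len(waveforms)}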

The following illustration shows these characteristics of the Dataflow Script implementation, where each blue square represents a process within the Dataflow Script:

Dataflow Script Executes Start to Finish

Sometimes a process in the Dataflow Script is designed to operate only on some types of data sets. For example, a process might operate only on data sets that contain values from a particular type of sensor. In this situation, the process can avoid operating on a data set by evaluating it for a desired attribute, and then sending the unchanged data set to the next process if that attribute is not found. Consider the following diagram, where the second and third processes examine the data set prior to operating on it.

This behavior means that you cannot branch the Dataflow Script by transferring data sets through a subset of its processes. Therefore, even if a particular process is designed to operate on only certain types of data, it must be part of the Dataflow Script and all data sets must pass through it.

Additional Behaviors of the Dataflow Script

Processes Can Operate on Messages Unrelated to Dataflow Script

Processes can operate outside their role in the Dataflow Script by handling other message types. For example, the Trigger process can respond to force-trigger requests from NI InsightCM Server at any time, including while data sets are advancing through the Dataflow Script.

Multiple Data Sets Exist Simultaneously

Any process can operate on a data set as soon as it dequeues the Qbus message that contains the data set. Therefore, multiple data sets can remain in memory while they pass between processes in the Dataflow Script. For more information about the contents and behavior of data sets, refer to Data Sets.

Dataflow Script API Automatically Cleans Up Data

When a process calls the Get Next Process Module VI and the Dataflow Script API determines that the process is the final one in the Dataflow Script, the API automatically closes references to the current data set, which frees the memory the device used to store the data. Therefore, all processes in the Dataflow Script must call the Get Next Process Module VI, even if you know a process is currently the last process in the Dataflow Script.
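
In terms of the earlier Python sketch, this cleanup rule would extend the transfer step roughly as follows; release_data_set is an illustrative stand-in for the reference-closing work the Dataflow Script API performs, and the lookup plays the role of the Get Next Process Module VI.

    def release_data_set(data_set: dict) -> None:
        # Stand-in for closing the references that keep a data set in memory.
        data_set.clear()

    def transfer_or_release(current: str, data_set: dict) -> None:
        recipient = get_next_process(current)
        if recipient is None:
            release_data_set(data_set)  # final process: the API frees the data
        else:
            send_qbus_message(recipient, "ProcessDataFlowBlock", data_set)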

Related Information

Setting the Processes in the Dataflow Script

Handling a Data Set and Transferring Data to the Next Process

Sending a Data Set to the First Process in the Dataflow Script


 
