
Tables and Databases

Data fields, tables, and databases

For the following functions, the required tables are passed to the function as a resource. The table is read and processed internally.

For a data field, interpolation is performed between the individual data elements depending on the input value.

In a table with field names, a matching entry can be looked up via selected key fields, and various fields of that entry can then be read out.

The implementation of an SQLite interface for accessing a local database instance with definable search queries is planned.

Characteristic curve / characteristic map "lookup"

With the lookup() function, data for one to three input variables can be linearly interpolated from characteristic curves or characteristic maps.

L1 = lookup(x, {...});
L2 = lookup(x, y, {...});
L3 = lookup(x, y, z, {...});
// The configuration object is mandatory and must
// refer to a resource:
Lx = lookup(..., {$ref:'myLookupData'});

The configuration object follows its own syntax and must therefore be obtained from the resource area of the math module using {$ref:'myLookupData'}. Two formats are supported: the classic text format and a description in simplified JSON format¹.

Description of inputs / axes

Each input signal is assigned an axis. This is defined by its unit <unit>, a start value <from>, the increment <step> and the number of support points <count>. The axes do not necessarily have to be sorted in ascending numerical order.

Configuration in text format (classic)

The following description must be added at the beginning of the text block for each axis, i.e., at least once and at most three times:

IN <unit> <from> <step> <count>

Configuration in JSON format:

In JSON format, it is also possible to specify a descriptive, informal name <name> and, as an alternative to one of the previously mentioned value parameters, the end value <to>. Exactly three of the four properties <from>, <step>, <count>, and <to> must be defined; the fourth is calculated.

{
  axes: [
    { name: <str>,   // optional description
      unit: <str>,   // recommended
      // exactly 3 of the following 4 are mandatory:
      from: <dbl>,   // left-side value of input axis
      step: <dbl>,   // increment between interpolation points
      count: <uint>, // number of interpolation points
      to: <dbl>      // right-side value of input axis
    }, ...
  ], ...
}
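Since exactly three of the four axis parameters determine the fourth, the missing value can be derived. A minimal sketch in Python (an illustrative helper, not part of the module; it assumes <count> includes both end points, i.e. to = from + step*(count-1)):

```python
def complete_axis(axis):
    """Derive the missing one of from/step/count/to for a uniform axis."""
    a = dict(axis)
    if "to" not in a:
        a["to"] = a["from"] + a["step"] * (a["count"] - 1)
    elif "from" not in a:
        a["from"] = a["to"] - a["step"] * (a["count"] - 1)
    elif "step" not in a:
        a["step"] = (a["to"] - a["from"]) / (a["count"] - 1)
    else:  # count is missing
        a["count"] = round((a["to"] - a["from"]) / a["step"]) + 1
    return a

# the pressure axis used in the summary example below:
axis = complete_axis({"from": 0.0, "step": 0.167751, "count": 161})
print(axis["to"])  # right-side value of the axis
```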

Description of outputs

The output signal is described by its unit <unit>.

Configuration in text format (classic)

The following definition must be listed once after the definition of the inputs:

OUT <unit>

Configuration in JSON format:

In JSON format, it is also possible to specify a descriptive, informal name <name>. Instead of a JSON object per output (recommended), the physical unit alone can be specified as a string.

{
  outputs: [
    { name: <str>, // optional description
      unit: <str>  // recommended
    }, ...         // prepared to support vector output in future
    <str>, ...     // alternative: define the output by its unit only
  ], ...
}

Data field

The data field consists of <dbl> values. The order of the support points is determined by the following pseudo code:

foreach(z : axis3)
  foreach(y : axis2)
    foreach(x : axis1)
      Value[x,y,z];

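This ordering corresponds to a flattened array in which x varies fastest. A short Python sketch of the index computation (names are illustrative):

```python
def flat_index(ix, iy, iz, count_x, count_y):
    """Index into the flat data list; x runs fastest, then y, then z."""
    return ix + count_x * (iy + count_y * iz)

# a 2 x 3 x 2 field laid out exactly in the order of the pseudo code:
data = [f"v{x}{y}{z}" for z in range(2) for y in range(3) for x in range(2)]
print(data[flat_index(1, 2, 0, 2, 3)])  # → v120
```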
Configuration in text format (classic)

The data values are simply separated from each other by spaces, tabs, or line feeds and read in sequence into the support points of the characteristic curve or characteristic field.

0 42.942 87.924 134.946
184.008 235.11 288.252 ...
(continued up to 161 data points)

Configuration in JSON format:

In JSON format, all data values are specified in an array of floating point numbers.

{
  data: [
    <dbl>, ...
  ], ...
}

Summary

Configuration in text format (classic)

Characteristic curve:

IN    kPa    0.0    0.167751    161
OUT l

0
42.942
87.924
134.946
184.008
235.11
288.252
343.434
400.656
459.918
521.22...
continued up to 161 data points
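Evaluating such a curve amounts to linear interpolation on a uniform grid. A simplified Python sketch (the module's behavior outside the axis range is not specified here; this sketch simply clamps to the edge values):

```python
def interp1(x, from_, step, data):
    """Linear interpolation on a uniform axis; clamps outside the range."""
    pos = (x - from_) / step
    if pos <= 0:
        return data[0]
    i = int(pos)
    if i >= len(data) - 1:
        return data[-1]
    frac = pos - i
    return data[i] * (1 - frac) + data[i + 1] * frac

curve = [0, 42.942, 87.924, 134.946]  # first support points of the curve above
print(interp1(0.167751 / 2, 0.0, 0.167751, curve))  # midpoint of the first interval
```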

Characteristic map:

IN    g/h    0    10000    22
IN 1/min 900 100 10
OUT kW
0.0 40.8 81.6 132.9 185.2 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0 200.0
0.0 40.0 80.0 131.3 184.3 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0 230.0
0.0 38.5 76.9 125.8 182.4 233.5 286.9 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0 310.0
... 7 more rows with 22 data points each

Configuration in simplified JSON format

The configuration object can also be described in simplified JSON format¹.

{
  axes: [
    { name: <str>,   // optional description
      unit: <str>,   // recommended
      from: <dbl>,   // left-side value of input axis
      step: <dbl>,   // increment between interpolation points
      count: <uint>, // number of interpolation points
      to: <dbl>      // right-side value of input axis
    }, ...
  ],
  outputs: [
    { name: <str>, // optional description
      unit: <str>  // recommended
    }, ...         // prepared to support vector output in future
    <str>, ...     // alternative: define the output by its unit only
  ],
  data: [
    <dbl>, ...
  ]
}

Provision as a resource

There are many ways to include these characteristic curve or characteristic map descriptions in the resources of the math module.

For small characteristic curves or maps, formatting as a text resource is suitable:

    "resources": {
      "myLookupData": [
        "IN kPa 0.0 0.167751 161"
        , "OUT l"
        , "0 42.942 87.924 134.946 184.008 235.11 288.252 343.434"
        , "400.656 459.918 521.22 584.562 649.944 717.366 785.90745"
        , "854.45145 ... and more"
      ], ...

Alternatively, the same map in simplified JSON format:

    "resources": {
      "myLookupData": {
        axes: [{ name: "HydroStatPressure", unit: "kPa"
               , from: 0.0, step: 0.167751, count: 161 }],
        outputs: [{ name: "Volume", unit: "l" }],
        data: [
          0, 42.942, 87.924, 134.946, 184.008, 235.11, 288.252, 343.434
          , 400.656, 459.918, 521.22, 584.562, 649.944, 717.366, 785.90745
          , 854.45145, // ... and more
        ]
      }, ...

For larger lookup tables, or for tables that should not appear verbatim in the JSON configuration of the smartCORE or are not meant to be human-readable, an encoded character string is useful.

First, with a map file myLookupData.tab as the data source, which follows the syntax described above (text/JSON):

    "resources": {
      "myLookupData": {
        "decoder": ["file"],
        "value": "myLookupData.tab"
      }, ...

or, if the lookup table is in compressed form:

    "resources": {
      "myLookupData": {
        "decoder": ["file", "unzip"],
        "value": "myLookupData.gz"
      }, ...

Alternatively, the data can also be included in the resource as a base64 string:

    "resources": {
      "myLookupData": {
        "decoder": ["base64"],
        "value": "Q09MCWcvaAkwCTEwMDAwCTIyDQpST1cJMS9taW4JOTAwCTEw..."
      }, ...

... or the compressed file as a base64 string:

    "resources": {
      "myLookupData": {
        "decoder": ["base64", "unzip"],
        "value": "UEsDBBQAAgAIACt3cUoHOG6qmgEAAD4FAAAcAAAAUF9lZmZf..."
      }
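The decoder chain is applied left to right: the value is first base64-decoded, then decompressed. A Python sketch of the idea (assuming gzip-style compression for "unzip"; the module's exact codec is not specified here):

```python
import base64, gzip

def decode_resource(value, decoders):
    """Apply a decoder chain such as ["base64", "unzip"] to a resource value."""
    data = value.encode() if isinstance(value, str) else value
    for dec in decoders:
        if dec == "base64":
            data = base64.b64decode(data)
        elif dec == "unzip":
            data = gzip.decompress(data)  # assumption: gzip; a zip archive would need zipfile
        else:
            raise ValueError(f"unknown decoder: {dec}")
    return data

packed = base64.b64encode(gzip.compress(b"IN kPa 0.0 0.167751 161\nOUT l\n")).decode()
print(decode_resource(packed, ["base64", "unzip"]).decode())
```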

Search query "selectRow"

The selectRow() function searches sequentially through all rows in a simple table for a match with the specified keys and returns the row index found. If no match is found, -1 is returned. A search is only performed if at least one of the requested key values changes.

This row index can be used in getField() to read any values from the data fields of the table.

m1 = selectRow(key1, ..., { table: <str>    // mandatory
                          , colSep: <regex>
                          , anyKey: <var>
                          , key: <str>      // mandatory (*)
                          , keys: [<str>]   // mandatory (*)
                          });
// (*) exactly one of key or keys must be defined
  • table (<str>): Mandatory. Name of a text resource that defines the content of the table. The data elements are interpreted as variants and thus receive the best-fitting data type. Floating-point numbers must be written with a period as the decimal separator.

  • colSep (<regex>): Regular expression that defines the column separators. Default: "[,;|\t]", which matches each individual character listed inside the brackets.

  • anyKey (<var>): A table entry with this value yields a positive match for any key value. Default: "*".

  • key (<str>): Mandatory (*, **). A data field of the table in which the first and only key value key1 is searched for.

  • keys ([<str>]): Mandatory (*, **). A vector of several data fields that are applied to the key values key1, key2, key3, ... in the specified order.

(*) Exactly one of the properties key or keys must be used to select the key column(s).

(**) By default, an exact match of the key value with the entries in the table column is always searched for. The following comparison operators can optionally be prefixed to the field identifier to modify the query; spaces may be inserted between the operator and the identifier:

In the examples, KEY stands for a field identifier from keys:[..] and keyN for the key value passed as a parameter.

  • == : Exact match (default). Example: 'KEY' or '==KEY'. Executed as: keyN == Table[KEY]

  • *= : For <str>: contains. Example: '*=KEY'. Executed as: keyN contains Table[KEY]

  • &= : For <uint>|<int>: all bits of the mask set. Example: '&=KEY'. Executed as: bAnd(keyN, Table[KEY]) == Table[KEY]

  • &> : For <uint>|<int>: at least one bit of the mask set. Example: '&>KEY'. Executed as: bAnd(keyN, Table[KEY]) > 0

  • &0 : For <uint>|<int>: no bit of the mask set. Example: '&0 KEY'. Executed as: bAnd(keyN, Table[KEY]) == 0

  • <= : Less than or equal to. Example: '<=KEY'. Executed as: keyN <= Table[KEY]

  • < : Less than. Example: '<KEY'. Executed as: keyN < Table[KEY]

  • >= : Greater than or equal to. Example: '>=KEY'. Executed as: keyN >= Table[KEY]

  • > : Greater than. Example: '>KEY'. Executed as: keyN > Table[KEY]

For the data type <str>, comparisons refer to alphabetical order.
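The comparison semantics can be illustrated with a small matcher in Python (a hypothetical helper, not the module's implementation; bAnd corresponds to a bitwise AND):

```python
def match(op, key, cell):
    """Compare a passed key value (keyN) with a table entry (Table[KEY])."""
    if op in ("==", ""): return key == cell
    if op == "*=": return cell in key         # <str>: keyN contains Table[KEY]
    if op == "&=": return key & cell == cell  # all bits of the mask set
    if op == "&>": return key & cell > 0      # at least one bit of the mask set
    if op == "&0": return key & cell == 0     # no bit of the mask set
    if op == "<=": return key <= cell
    if op == "<":  return key < cell
    if op == ">=": return key >= cell
    if op == ">":  return key > cell
    raise ValueError(op)

print(match("&=", 0b1011, 0b0011))   # → True (both mask bits are set)
print(match("*=", "abcdef", "cde"))  # → True (string contains)
```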

Formatting the table

  • The table is loaded from a text resource (ASCII, UTF-8)

  • Column separators are defined via the colSep property.

  • The same table can be used in multiple selectRow() functions. The colSep specification must then be identical in all of them, because it is not defined which of the functions interprets the table first (and thereby for all other instances).

  • The first row contains the field names, which are then used to define the key columns and result fields.

  • Blank lines are ignored.

  • Lines that begin with 3 identical characters from the group _ - = ~ * are interpreted as separators and ignored.

  • Each data field is read and processed individually as a variant. There are therefore no fixed data types for a column.

  • The search is always performed sequentially through all lines.
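The rules above can be condensed into a small parser sketch (illustrative only; the real module additionally converts every field into a variant):

```python
import re

def parse_table(text, col_sep=r"[,;|\t]"):
    """Split a text table into a header row and data rows."""
    rows = []
    for line in text.splitlines():
        s = line.strip()
        if not s:
            continue  # blank lines are ignored
        if len(s) >= 3 and s[0] in "_-=~*" and s[0] == s[1] == s[2]:
            continue  # separator lines such as '---' or '===' are ignored
        rows.append(re.split(col_sep, line))
    return rows[0], rows[1:]  # field names, then the data rows

header, rows = parse_table("name;limit\n---------\npump;3.5\nvalve;7")
print(header, rows)  # ['name', 'limit'] [['pump', '3.5'], ['valve', '7']]
```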

Read table entry "getField"

For a row index rowIdx found with selectRow(), the getField() function reads the corresponding data element from the field field of the table table. The data type is determined by the data element as a variant.

If rowIdx is < 0 or outside the valid range, the variant defined with notFound is returned.

v1 = getField(rowIdx, { table: <str>    // mandatory
                      , field: <str>    // mandatory
                      , notFound: <var>
                      });
  • table (<str>): Mandatory. Name of a text resource that defines the content of the table. The table must already have been interpreted and loaded by a selectRow() function.

  • field (<str>): Mandatory. Name of the data field from which the result value is read.

  • notFound (<var>): This variant is returned as the result if rowIdx does not refer to a valid search result. Default: false.
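The interplay of the two functions can be mimicked in a few lines of Python (hypothetical stand-ins that only cover exact and anyKey matching):

```python
def select_row(rows, key_cols, keys, header, any_key="*"):
    """Return the index of the first row matching all keys, else -1."""
    idx = [header.index(c) for c in key_cols]
    for i, row in enumerate(rows):
        if all(row[j] == any_key or row[j] == str(k) for j, k in zip(idx, keys)):
            return i
    return -1

def get_field(rows, row_idx, field, header, not_found=False):
    """Read one field of the selected row; not_found outside the valid range."""
    if row_idx < 0 or row_idx >= len(rows):
        return not_found
    return rows[row_idx][header.index(field)]

header = ["part", "size", "limit"]
rows = [["pump", "S", "3.5"], ["pump", "*", "7"], ["valve", "L", "9"]]
i = select_row(rows, ["part", "size"], ["pump", "L"], header)
print(get_field(rows, i, "limit", header))  # prints 7: the anyKey row matches
```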

Playback of recorded data "playback"²

Under construction: this function is currently being implemented and is not yet intended for productive use.

The playback() function opens a "Tape" file and extracts "Tracks" from it, which are time sequences of selected data channels.

warning

Depending on the structure and length of the data sources used, memory violations and crashes of the smartCORE may still occur.

The tracks can be published as variables in the Math module with an additional prefix and then also transferred to the smartCORE as output variables.

Similar to a classic tape recorder, markers can be set for the playback of a specific time period and control functions (stop, play, pause, rewind) can also be triggered via extended logic.

[Image: Studer A800 tape recorder]

Currently, the uncompressed OSF file format is supported. Other formats such as WAV, CSV, etc. may follow in future versions.

tip

Runs imported into the system from tape can be used as a reference for measurement signals to track and monitor their dynamic behavior after a start event.

tip

To test new functions in the Math Module, measurement data from productive use can be imported back into a stationary laboratory system in real time. optiCONTROL offers various options for merging, cutting, or channel selection for data preparation.

tip

To protect the OSF files on the /sde drive from automatic compression, you can simply use a different file extension, e.g., *.osf_pb or *.osf_tape.

t1 = playback(ctrl, {...});
t2 = playback(ctrl, tInfo, {...});
t3 = playback(ctrl, tInfo, source, {...});
// Configuration is mandatory for paths and keys
tx = playback(..., { directory: <str>  // mandatory!
                   , source: <str>     // mandatory! (or by parameter)
                   , varPrefix: <str>
                   , keys: [<str>]     // mandatory!
                   , tBegin: <date>
                   , tRewind: <date>
                   , tEnd: <date>
                   , result: <enum>
                   , loop: <bool>
                   , tZoom: <dbl>
                   , endVar: <str>
                   });

The tInfo parameter can contain a scalar value or a vector with up to three components; its variants are also described in the following table.

  • directory (<str>): Path to the directory containing the tape files.

  • source (<str>): Initial tape; can be overwritten by parameters. Each change loads the channels selected by keys from the designated tape.

  • varPrefix (<str>): This prefix is prepended to the selected channels when they are published as variables, e.g., "Tape."

  • keys ([<str>]): List of channel names from the tape file that are loaded for playback.

  • tBegin (<date>): Start time on the selected tape.

  • tRewind (<date>): Target point for rewinding or fast-forwarding.

  • tEnd (<date>): End time for playback or automatic return.

  • result (<enum>): Output of the tape position as
    abs: absolute timestamp (default)
    relOrigin: timestamp relative to the start of the tape
    relBegin: timestamp relative to the tBegin mark
    relInterval: relative position [0..1] in the interval [tBegin, tEnd]

  • loop (<bool>): When the tEnd mark is reached, jump back to tRewind and continue playback. Default: false.

  • tZoom (<dbl>): Zoom factor for the time axis during playback (default: 1.0);
    >1.0: fast forward
    <1.0: slow motion

  • endVar (<str>): Name of a <bool> output variable that is set to true when the tEnd marker is reached.

  • tInfo (<dbl>): := tRewind

  • tInfo ([<dbl>, <dbl>]): := [tRewind, tEnd]

  • tInfo ([<dbl>, <dbl>, <dbl>]): := [tBegin, tRewind, tEnd]
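The result variants differ only in their reference point. A short Python sketch of the conversion (illustrative names, scalar seconds instead of <date> values):

```python
def tape_position(t, t_origin, t_begin, t_end, result="abs"):
    """Convert an absolute tape time t into the selected output format."""
    if result == "abs":
        return t
    if result == "relOrigin":
        return t - t_origin
    if result == "relBegin":
        return t - t_begin
    if result == "relInterval":
        return (t - t_begin) / (t_end - t_begin)  # 0..1 within [tBegin, tEnd]
    raise ValueError(result)

print(tape_position(75.0, 0.0, 50.0, 150.0, "relInterval"))  # → 0.25
```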

Footnotes

  1. With the exception of comments.²

  2. Available from catalog version 11 onwards.