Introduction to Apis Foundation

Thank you for using Apis Foundation from Prediktor AS. Apis is Prediktor's real-time industrial software platform. Apis is component-based, with each component containing a set of functions, and has been used in over 600 installations, mostly within mission-critical areas in the maritime, manufacturing, and oil & gas drilling and production industries.

We hope you find Apis Foundation useful. Any questions and suggestions are welcome via email: support@prediktor.no

What is Apis Foundation?

Apis Foundation is a real-time industrial software platform that incorporates many industry standards such as OPC and OPC UA. Apis Foundation is fully component-based and can therefore be integrated in a myriad of ways with our partners’ software. You can choose which components are required and avoid paying for unneeded functionality.

Components include: tools to collect data from external systems, sensors and equipment; a powerful time-series database that serves as a real-time Historian; and tools to expose data via OPC or OPC UA. Using these components, a number of different applications can be built including OPC UA-based Historians, OPC DA/HDA servers, OPC Hubs, OPC UA wrappers for transferring real-time data over the internet, and much, much more.

Apis Foundation includes the following tools and services:

Apis Services:

  • Apis Hive is a multipurpose real-time data communication hub and container for Apis Modules. Hive is an executable that hosts data access, processing, and logging components in one efficient real-time domain.
  • Apis HoneyStore is Prediktor's high-performance time-series database.
  • Apis Chronical is Prediktor's high-performance event server and historian.
  • Apis OpcUa Namespace Replication Service is a service for replicating namespaces from OPC UA servers to Apis Hive.
  • Apis Backup Agent is a service responsible for executing backup and restore jobs.

Apis Tools:

  • Apis Management Studio is the main engineering interface for configuring Apis services.
  • Apis Bare is a tool for manually backing up and restoring configuration and data for your Apis applications.

Where to start?

If you're new to Apis, we suggest you read the How To Guides. They cover the most common tasks and concepts when using the software.

Overview of Apis Foundation

The functionality of Apis can be organized into the following groups:

• Connect – robust, high-frequency, and high-capacity connection to real-time data sources.

• Store – high-frequency, high-capacity, highly distributable, and scalable storage of real-time data.

• Process – real-time, deterministic processing of collected values to create aggregated information.

• Visualize – provide useful information to several groups of stakeholders simultaneously.

The core component of our connect functionality is Apis Hive. Hive is the executable that hosts data access, processing, and logging components in one efficient, real-time computational domain.

Data access components source data based on both open standards and proprietary interfaces. These are combined inside Hive into one homogeneous way of understanding the data. Here they are refined by different processing components, logged to data storage, or exposed to external clients through industry-standard interfaces. The outcome of data processing can also be written back to the access components.

The system is component-based and most of the components are optional. Customers can therefore choose which components are required and avoid paying for unneeded functionality. The figure below illustrates the design of the Apis platform.

How To Guides

The How To Guides will give you an introduction to the most common tasks and concepts when using Apis.

Getting Started

This section explains how to install a run-time license and will get you familiar with using Apis Management Studio and the concepts of Apis Hive, Modules and Items. Please pick a topic from the menu.

Install a Runtime License

The Apis software will run in demo mode for one hour without a license key. To obtain a valid license, please contact Prediktor at either:

Email: support@prediktor.no

Phone: +47 95 40 80 00

To request a license

  • Fill in ProjectRef and Invoice Details in the popup window.

  • Click "Copy to Clipboard" and paste the results into an e-mail. Then send this to Prediktor.

Email: support@prediktor.no

To install a license key

When you have received the license key (which is locked to the MAC address of the computer and the hard disk ID):

  • Open Apis Management Studio
  • Expand "apis://localhost" -> "Licensing" -> "Install SW License Code". A dialog box opens, where you can browse to select, then open the license file. This completes the license key installation.

Restart Apis Hive and Apis HoneyStore

When the license key is installed, Apis Hive and Apis Honeystore have to be restarted (or started if they're not running).

  • Click "ApisHive" -> "Stop", and then "ApisHive" -> "Start", to restart:

  • Click "ApisHoneyStore" -> "Stop", and then "ApisHoneyStore" -> "Start", to restart:

Start Apis Hive

  • Open Apis Management Studio from the APIS menu under the Windows Start Menu.
  • From the Hive Instances node, go to the desired ApisHive instance, right click and select Start.

Open Apis Management Studio

  • Open Apis Management Studio from the APIS menu under the Windows Start Menu.
    Then you can browse and connect to Apis services.
  • Apis Management Studio can only connect to running Apis services, not stopped ones.

Adding a Module

In this example, we'll use Apis Management Studio to add a module of type ApisWorkerBee to the Apis Hive environment. The procedure is similar for any type of module you want to add.

Open Apis Management Studio and right-click on the "hive" instance in Solution Explorer, then select "Add Module" from the menu.

The following window appears:

By selecting a category on the left side, modules of that category will appear on the right side.

Click the Process category, and select the Worker module type.

Give the module a name. Click Add, and a window with the properties of the module appears. Change the desired properties and click OK.

Advanced

By clicking on the Advanced button, three more rows appear where it is possible to set the number of modules to add and the start index used when naming the modules.

Additionally, it is possible to set a Properties template file, which makes the properties of the module load from a previously saved template (this is optional). This is handy if modules need to have similar properties. You can save templates in the Solution Explorer by selecting Export properties in the context menu of the module and giving it a name. The module's properties upon creation will then be set to these saved properties.

You could write RTM {0} Worker in the Name element, and 4 in the Count element. This will produce 4 modules named RTM 1 Worker, RTM 2 Worker, RTM 3 Worker, and RTM 4 Worker, as illustrated by the sketch below.
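As a rough illustration only (plain Python, not Apis code), the expansion corresponds to substituting a running index into the {0} placeholder; the start index of 1 used here is an assumption:

# Illustration of how a name template with a "{0}" placeholder, a Count of 4
# and a start index of 1 expands into the generated module names.
template = "RTM {0} Worker"   # value entered in the Name field
count = 4                     # value entered in the Count field
start_index = 1               # assumed default start index

names = [template.format(i) for i in range(start_index, start_index + count)]
print(names)  # ['RTM 1 Worker', 'RTM 2 Worker', 'RTM 3 Worker', 'RTM 4 Worker']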

Deleting and Renaming Modules

Deleting modules

If you want to delete a module from your configuration, select the Module Node in the Solution Explorer. Open the context menu and select Remove.

If any items of this module have been enabled for logging into an Apis HoneyStore database, the corresponding items in the database may be deleted from the database as well. Deletion depends on the module property ItemDeletion of the corresponding ApisLoggerBee module.

If you are deleting an ApisLoggerBee module, the database managed by this module may be deleted, depending on the module property AutoDeleteDB of the ApisLoggerBee module you're deleting. Therefore, if you want to delete the database, make sure this property is set to true. Set it to false if you want to keep the database for future use.

Renaming modules

If you want to rename a module in your configuration, select the Module Node in the Solution Explorer. Open the context menu and select Rename module... Then, enter a new, unique name for the module and click OK.

When a module is renamed, the following configuration changes are applied / maintained automatically:

  • External item connections.
  • Global attributes will be re-registered and renamed according to the new module name, when applicable.
  • Event broker connections.
  • The Logger Bee automatically renames items currently stored in its HoneyStore database (note that the database itself is not renamed, since it can be shared amongst several modules and there is no one-to-one connection).
  • The alarm Area name of the module in the Apis Event Server / Chronical.
  • If Security / Config Audit trails are enabled, a ModuleRenamed entry will be logged.

It is strongly advised to take a backup before renaming modules, in case of unwanted effects as a result of the rename operation. Also, if possible, it can be wise to restart the Hive instance after a module has been renamed and inspect the trace logs for any related issues. Even if you cannot restart the Hive instance, inspecting the trace logs is a good idea.

Adding Items

In this example, we'll use Apis Management Studio to add items of type Signal to an ApisWorkerBee module. The procedure is similar for any type of item you want to add.

Right-click on the "Worker" module instance in Solution Explorer and select "Add Items " and "Signal" from the menu.

The following window shows up:

Give the item a name and click Add items. You can repeat this to create multiple items. Click OK when finished, and the items will be created in the module.

Explanations of the fields in the form

The fields of the form have to be filled out to create items.

1. Server - The hive server to add items to.

2. Module - The module to add items to.

3. Item type - The item type of the item to create. The item types available depend on the module selected.

4. Properties Template - The properties of the item can be loaded from a previously saved template (this is optional). This is handy if items shall have similar properties. You can save templates in the Solution Explorer by selecting Export properties in the context menu of the item and giving it a name. The item's properties upon creation will then be set to these saved properties.

5. Name - Name of the item. If {0} is part of the name, it's used as a placeholder for the index.

6. Count - It's possible to add several items at once by setting the count to the desired number of items. If the count is larger than 1, the names of the items will be the name plus a number which is incremented for each item. {0} can be used as a placeholder for the increment number.

7. StartIndex - The starting index of the name if the count is > 1.

You could write Well {0} Pressure in the Name element, and 4 in the Count element. This will produce 4 items named Well 1 Pressure, Well 2 Pressure, Well 3 Pressure, and Well 4 Pressure.

When you click "Add item(s)", the items are created as templates. This means that the items have not been actually added to the module(s) yet. The items will be created by clicking the "Submit" button. The reason they are temporarily created as templates is so you can change the properties of the items before they are actually created. This is done by selecting the items in the list in the upper left corner. The property editor on the right-hand side is used to change the properties.

Browsing the namespace

By clicking the browse button after selecting the item type, it is possible to browse for the items the module offers. Not all modules support browsing.

Delete an Item

Items can be deleted by selecting one or more item nodes in the Solution Explorer and selecting "Remove" in the context menu.

In addition, it's possible to delete items from the list views (except for the custom items view). This is done by selecting the items to delete and clicking the "Delete" button.

In both cases, a confirmation message will be displayed.


Renaming Items

An item can get a new name by changing the name in the Property Editor.

If the item you are renaming has also been enabled for logging into an Apis HoneyStore database, the item name will also change in the database.

Connecting Items

The item connection dialog can be displayed in several ways:

The connection dialog consists of two trees. The tree to the left contains available sources (i.e. all items for the instance), the tree on the right contains destination items. Only inputs can be connected in this dialog. You can select items in the trees on the left and right side, then click "Connect" to connect. To disconnect, select the input in the tree on the right side that you wish to disconnect, and click "Disconnect".

It is also possible to drag items from the source to a destination item.

Vectors and matrices are connected in the same manner, but for each element of the vector/matrix there will be a field in which the connected item is displayed. If no item has been connected, "n/a" appears.

The changes to the connections will be performed when you click "Ok".


Export Item Properties

It is possible to export the properties of an item. This can then be used as a template when creating new items. The new item will get the same property values as the item for which properties were exported. Select Export properties in the context menu in the item node, which brings up a dialog box asking the user to input a name.

You can then enter an easily recognizable name for the Properties Template.

When adding items, all the saved properties templates for the item type are listed in the "Properties Template" combo box. The templates are stored as files under the folder defaults in the program folder of Apis Management Studio, grouped by module type and then, under the folder Items, by item type. The files can be copied to other installations of AMS.

Export Module Properties

It's possible to export the properties of a module. These can then be used as templates when creating a new module, and the new module will get the same property values as the exported module. You can select "Export properties" in the context menu of the module, which will bring up a dialog box where the user is asked to give the template a name.

You can then enter an easily recognizable name for the Properties Template.

When adding modules, all the saved properties templates for the module type are listed in the "Properties Template" combo box. The templates are stored as files under the folder defaults in the program folder of Apis Management Studio. They are grouped by module type, and the saved templates are stored in the folder "ModuleProperties". The files can be copied to other installations of AMS.

Commands And Events

The configuration of Commands and Events is done by finding the Connect Events menu item in either the context menu of the Instance Node or the Module Node of the Solution Explorer.

This will bring up the Connect Events window:

The left hand side of the window displays the available commands in the modules. In this case there are two modules with a set of commands. Different module types will typically have different commands and events.

On the right-hand side, the events of the modules are listed. Below the events (on level 4), the commands which will be executed when the event occurs are listed. It is possible to add and remove commands by dragging them from the left-hand side to the right-hand side.

The order of the commands can also be changed by dragging the command up or down the tree.

In addition to that it is possible to mark a command on the left hand side and an event on the right hand side, and click Connect in order to add a command to an event.

By marking a command on the right hand side and clicking disconnect, the command will be removed from the event.

There will be no changes in the server configuration until you click "Ok". By clicking Cancel all changes will be discarded.

Install and use floating Runtime License

What is Network Licensing?

Network licensing is based on client/server architecture, where licenses are placed on a centralized system in the subnet. On the License Server computer, the APIS License Server must be running to serve license requests from clients.

The main difference between activating software that uses a network license rather than a standalone license is that the license code must reside on the system where the License Manager runs. This is not necessarily the system where the client application will be used.

Prepare the license server

TODO: Describe how to set up the APIS/Cryptlex Licensing server...

Email: support@prediktor.no

Phone: +47 95 40 80 00

Request a license

TODO: Describe how to request an APIS license...

Email: support@prediktor.no

Phone: +47 95 40 80 00

Install a floating network license key

TODO: Describe how to deploy an APIS license...

Activate a floating license

TODO: Describe how to activate an APIS license...

Restart Apis Hive and Apis HoneyStore

When the license key is activated, Apis Hive and Apis Honeystore have to be restarted (or started if they're not running).

  • Click "ApisHoneyStore" -> "Stop", and then "ApisHoneyStore" -> "Start", to restart:

Connect

This section covers how to acquire data using the most common data protocols. Please pick a topic from the menu.

Connect to an OPC DA server

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisOPC from the Module type dropdown list.

  • After adding the module, select the new module named "ApisOPC1" from the Solution Explorer.
  • In the Properties Editor, enter values for:
    • Computer: The IP address or DNS name of your OPC server machine. If the OPC server is running on your local computer, you should leave this property blank.
    • Server: Select the name of the OPC server from the dropdown list.

TIP: If you cannot see the name of your OPC server, take a look at the guide OPC DCOM Setup. The log viewer in Apis Management Studio can also be useful when troubleshooting DCOM setup.

  • Press "Apply" when done.

Follow the guide Add Items to a Module, but this time add items of type "OPC Item".

  • Click the "Browse" button.
  • A dialog opens that lets you select items from your OPC Server. Click "Ok" when done.

  • The item list will get new entries showing the added items.

Connect To OPC AE Server

This module can both replicate the received Alarms & Events in a local AE server and display the information on items in a namespace. In this example we'll do both.

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisAEClient from the "Module type" dropdown list.

  • After adding the module, select the new module named "ApisAEClient1" from the Solution Explorer.

  • In the Properties Editor, enter values for:

    • Computer: The IP address or DNS name of your OPC server machine. If the OPC server is running on your local computer, you should leave this property blank;
    • Server: Select the name of the OPC server from the dropdown list;
    • Operation Mode: Both (auto). Items will be automatically added to the namespace and the alarms will be registered in the Apis Alarm server in the same Apis instance.

TIP: If you cannot see the name of your OPC AE server, take a look at the guide OPC DCOM Setup. The log viewer in Apis Management Studio can also be useful when troubleshooting DCOM setup.

  • Press "Apply" when done.
  • The items will be automatically created when alarms & events are received. You can change the default alarm information to display on items of type OPC AE Item. Select the item and change the ValueAssigment property.

  • Press "Apply" when done.

Connect To OPC UA Server

Follow the guide Add Module to Apis Hive, but this time select a module of type Apis OpcUa from the Module type dropdown list.

  • After adding the module, select the new module named "ApisOPCUA1" from the Solution Explorer.
  • In the Properties Editor, enter values for:
    • ServerEndPoint: Endpoint URL of the OPC UA server. Syntax: opc.tcp://{HostName or IP address}:{port number}. A quick way to verify the endpoint outside Apis is sketched after this list.
  • Press "Apply" when done.

Follow the guide Add Items to a Module, but this time add item of type "OPC Item".

  • Click the "Browse" button.

  • A dialog opens that lets you select items from the OPC UA Server.

  • When the dialog appears, it is not populated with any items. In order to find the items you are looking for, you may enter your search criteria in the input box at the top of the dialog. By default the filter applies to the names of the leaf nodes, but you can click the "Filter type" button to set other filter types.

  • Click "Browse" to perform the search for items. When no search criteria is entered, the entire available namespace will be displayed.

  • Select the items you want to add and click "Ok" when done.

  • The item list will get new entries showing the added items.

Connect To Modbus Slave

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisModbus from the "Module type" dropdown list.

  • After adding the module, select the new module named "ApisModbus1" from the Solution Explorer.

Basic setup, communication interface

The module supports both serial (RTU) and TCP/IP (Modbus TCP) interface, depending on your server. In the Properties Editor, enter values for:

  1. TCP/IP based server:

    • Comm. type: TCP/IP
    • IP address: IP address of your Modbus slave.
    • Port: TCP port of your Modbus slave.

  2. Serial communication based server:

    • Comm. type: Serial
    • Com port: Com port connected to the slave.
    • BaudRate: Baud rate of your slave serial setup.
    • DataBits: Data bits of your slave serial setup.
    • FlowControl: Handshake of your slave serial setup.
    • Parity: Parity of your slave serial setup.
    • StopBits: Stop bits of your slave serial setup.

  • Further in the Properties Editor:
    • Poll interval: Enter the value for how often this master should poll the slave for new values (in seconds).
    • Default Slave address: Note: this is the initial value used when new items are created.

Add Items (registers)

Now follow the guide Add Items to a Module, but this time select the Modbus module and add items of one of the register types:

  • Coil
  • DiscretesInput
  • InputRegister
  • HoldingRegister

Example Holding register:

Give the item a descriptive name, such as "Temp_Man_6". Make sure SrcItemID points to a valid register address, such as "40002", check the SlaveAddress, and set the correct ValueType for the value in the slave's register. Click OK.

If all settings are correct, the "Temp_Man_6" tag should display the value of holding register 40002 of the slave.
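As a cross-check outside Apis, you can read the same register with a short script. The sketch below uses the third-party pymodbus package (3.x-style API; details vary between versions), and the IP address, port and slave id are placeholders. Note that the conventional register number 40002 corresponds to the zero-based protocol address 1 (40001 is address 0).

# Read holding register 40002 directly from a Modbus TCP slave
# (assumes the third-party "pymodbus" package, 3.x-style API;
# IP address, port and slave id are examples).
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.168.1.20", port=502)
client.connect()

# Register number 40002 -> zero-based holding-register address 1
result = client.read_holding_registers(address=1, count=1, slave=1)
if result.isError():
    print("Modbus error:", result)
else:
    print("Holding register 40002 =", result.registers[0])

client.close()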

Troubleshooting

If there's no connection or data received:

  • Use a third-party tool such as Wireshark to check whether the slave is sending telegrams.
  • For a TCP/IP based server:
    • Check the firewall settings for the receiving port.
    • Check the network connection to the server (e.g. with ping).

Connect to a WITS-0 server

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisWITS from the "Module type" dropdown list.

  • After adding the module, select the new module named "ApisWITS1" from the Solution Explorer.

The module supports both serial and TCP/IP interfaces, depending on your server. In the Properties Editor, enter values for:

  1. TCP/IP based server:
    • Comm. type receive: Winsock
    • IP address receive: IP address of your WITS-0 server.
    • Port receive: TCP port of your WITS-0 server.
    • Protocol: The protocol of your WITS-0 server, TCP or UDP.
  2. Serial communication based server:
    • Comm. type receive: Serial
    • Com port receive: Com port connected to the server.
    • Baud rate receive: Baud rate of your server serial setup.
    • Data bits receive: Data bits of your server serial setup.
    • Flow Control receive: Handshake of your server serial setup.
    • Parity receive: Parity of your server serial setup.
    • StopBit receive: Stop bits of your server serial setup.
  • Further in the Properties Editor:
    • Poll interval: Enter the value for how often this client should poll the server for new values (in seconds).
    • Autogenerate: Decide whether the client should generate items automatically based on the telegrams from the server. The items will be generated according to the W.I.T.S. (Wellsite Information Transfer Specification).

  • Press "Apply" when done.

If you selected Autogenerate in the property setup there should be no need to add items manually.

Otherwise:

Follow the guide Add Items to a Module, but this time add items of type "WITSItem".

  • The item list will get new entries showing the added items.

  • Alternatively, in the name field, write a custom name and click the "Add item(s)" button.

  • Select the new item and fill in the properties manually: Record, Field, Type, etc.

Troubleshooting

If there's no connection or data received:

  • Use a third-party terminal application such as PuTTY to check whether the server is sending telegrams (a scripted alternative is sketched after this list).
  • For a TCP/IP based server:
    • Check the firewall settings for the receiving port.
    • Check the network connection to the server (e.g. with ping).
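The standard-library sketch below connects to the WITS-0 server over TCP and prints whatever raw telegram bytes arrive, which tells you whether the server is transmitting at all. The host and port are placeholders; use the same values as in the module properties.

# Print raw bytes arriving from a WITS-0 TCP server (Python standard library only;
# host and port are examples).
import socket

HOST, PORT = "192.168.1.30", 12345

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.settimeout(10)
    try:
        while True:
            data = sock.recv(4096)
            if not data:            # connection closed by the server
                break
            print(data.decode("ascii", errors="replace"), end="")
    except socket.timeout:
        print("No telegram received within 10 seconds")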

Connect to an SQL Server

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisSQL from the "Module type" dropdown list.

  • After adding the module, select the new module named "ApisSQL1" from the Solution Explorer.

The module supports communication with SQL Server, either on a fixed interval (based on the Timer interval property) or triggered by using a trigger item.

This Quick Start Guide will show you how to connect an SQL Bee to your SQL Server instance, and send and receive values from SQL Server, using simple query and more advanced stored procedures.

To start with, you'll need to configure your new SQL Bee to connect to SQL Server. You'll need access to an SQL Server from the computer on which you're configuring the SQL Bee. This includes a database username and password, along with network access between the two.

In your SQL Module, you'll need to set the following properties:

The "Database login" property should contain the name of the user you want to connect as. The "Database login password" property should contain the password for that user. The "Database name" should contain the name of the SQL database you want to use. The "Database server" property should contain the name or IP address of the SQL server, along with the name of the instance, if the database isn't on the default instance.

Assume the database name is "test", we use the "sa" login, and the database server is local.

It can also be useful to change the SQL statement property to "Select 1" and the Timer interval property to 1000. Press Apply when you're happy with the field values. This will allow you to see if the connection was successful by using the "Connection state" property. If the connection is successful, it should display "Connection state: open".

The next step is to change the "SQL Statement" property to reflect the SQL command you want run. Writing "Select 1" allowed us to check the connection worked, but we're going to want to do something more useful.

First of all, you will need a table named "Items"; you can use the following query to create it.


USE [test]
GO

/****** Object: Table [dbo].[Items] Script Date: 30.06.2017 10.24.40 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[Items](
    [ItemID] [nvarchar](50) NOT NULL,
    [ItemValue] [float] NULL,
    [ItemTimestamp] [datetime] NULL,
    [ItemQuality] [int] NULL,
    CONSTRAINT [PK_Items] PRIMARY KEY CLUSTERED
    (
        [ItemID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The table should at least have one row with ItemId and ItemValue set.

Execute as text

The simplest way to acquire data is by sending a simple SQL statement to the server.

Set the "SQL statement type" property to "Execute as text"

Set "SQL statement" to "Select * FROM [test].[dbo].[Items]"

To add a Read Item, right-click on the SQL Bee module again and press Read Item. In the dialog box, instead of pressing Add Item, press Browse.

This will give you a list of available items, one of which should be ReadItem1 :

Press the checkbox beside ReadItem1 and then OK. Press OK again to add the item.

The Read Item will now reflect the values of "ItemValue", "ItemTimestamp" and "ItemQuality" of the row "ReadItem1" in the table "Items".

When the foreign system updates these fields in the table, the values will be reflected in the namespace of ApisHive.

Execute as stored procedure

For more advanced queries, execution of stored procedures in the SQL server might be required.

In this example, we'll be using a stored procedure called TestBeeParams:


USE [test]
GO

/****** Object: StoredProcedure [dbo].[TestBeeParams] Script Date: 28.06.2017 12.50.45 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[TestBeeParams] @Number int, @SPName nvarchar(max), @Source nvarchar(max), @XML nvarchar(max)
as
begin
    set nocount on

    DECLARE @idoc int;
    EXEC sp_xml_preparedocument @idoc OUTPUT, @XML;

    DECLARE @ItemID nvarchar(max);
    DECLARE @ItemValue float;
    DECLARE @ItemTimestamp DateTime;
    DECLARE @ItemQuality int;

    SELECT @ItemID = ItemID, @ItemValue = ItemValue, @ItemTimestamp = ItemTimestamp, @ItemQuality = ItemQuality
    FROM OPENXML(@idoc, 'ROOT/ItemSample')
    WITH
    (
        ItemID nvarchar(50) '@ItemID',
        ItemValue float '@ItemValue',
        ItemTimestamp DateTime '@ItemTimestamp',
        ItemQuality int '@ItemQuality'
    )

    if @ItemID is not null
    begin
        -- Update write items
        if exists(select ItemID from Items where ItemID = @ItemID)
        begin
            -- Item exists: update its data
            update Items set ItemValue = @ItemValue, ItemTimestamp = @ItemTimestamp, ItemQuality = @ItemQuality where ItemID = @ItemID
        end
        else
        begin
            -- Item does not exist: insert it into the table
            insert into dbo.Items(ItemID, ItemValue, ItemTimestamp, ItemQuality)
            SELECT @ItemID, @ItemValue, @ItemTimestamp, @ItemQuality
        end
    end

    -- Return all items, whether or not they have changed
    select * from dbo.Items
end

This stored procedure takes in write item(s) and returns all items as read item(s). Add it to your SQL database.

You will also need the table named "Items"; if you have not created it already, use the same CREATE TABLE script shown under "Execute as text" above.

As you can see from the stored procedure, the write items are passed in using XML. For multiple items, you can add a WHERE clause to the initial select, allowing you to distinguish between different items. However, for this example, we'll just be sending in a single value.
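To get a feel for the XML the stored procedure expects (a ROOT element with an ItemSample child carrying ItemID, ItemValue, ItemTimestamp and ItemQuality attributes, as read by the OPENXML call above), you can call the procedure manually. The sketch below uses the third-party pyodbc package; the connection string and parameter values are examples, and the actual values passed by the SQL Bee are generated by the module itself.

# Call dbo.TestBeeParams manually with a hand-built XML payload
# (assumes the third-party "pyodbc" package; connection details are examples).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=test;UID=sa;PWD=your_password"
)
cursor = conn.cursor()

xml = ('<ROOT><ItemSample ItemID="WriteItem1" ItemValue="42.5" '
       'ItemTimestamp="2017-06-30T10:06:08" ItemQuality="192"/></ROOT>')

# @Number, @SPName, @Source, @XML - parameter names taken from the procedure above
cursor.execute("EXEC dbo.TestBeeParams ?, ?, ?, ?", 1, "TestBeeParams", "ApisHive", xml)
for row in cursor.fetchall():      # the procedure returns all rows of dbo.Items
    print(row.ItemID, row.ItemValue, row.ItemTimestamp, row.ItemQuality)

conn.commit()
conn.close()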

Once the stored procedure is on your database, change the following SQL Bee parameters:

The "SQL statement" should contain the name of the statement without any parameters or the EXEC keyword. This is because the "SQL statement type" is set to "Execute as stored procedure" so the parameters and EXEC will be added automatically. The "Table schema" property is set to "By rows", since we're sending back a row per value. The Timer interval is set to 1000 milliseconds, so we can see a response fairly rapidly.

After we apply the values, we need to add a single write item to the module. Do this by right-clicking on the SQL Bee module, going to "Add Item" and pressing on Write Item:

The name of the item doesn't matter in this case; the item name will be inserted into the table, so you can just press the "Add Item" button at the bottom of the dialog box:

Then press "Ok" to finish adding the Write Item. You can then alter the item to set it to any type you like.

Once the write item is set up, you can add a Read Item. Assume the Items table contains an item named ReadItem1, or add it with the following query:


USE [test]
GO

INSERT INTO [dbo].[Items]
    ([ItemID]
    ,[ItemValue]
    ,[ItemTimestamp]
    ,[ItemQuality])
SELECT 'ReadItem1', 123.5, '2017-06-30 10:06:08.000', 192
GO

To add a Read Item, right-click on the SQL Bee module again and press Read Item. In the dialog box, instead of pressing Add Item, press Browse.

This will give you a list of available items, one of which should be ReadItem1 (and Write Item1 added previously):

Press the checkbox beside ReadItem1 and then OK. Press OK again to add the item.

The Read Item will now reflect any value you write in the Write Item. The Write Item value is sent into the stored procedure, read from the XML, and sent back through the SELECT statement at the bottom of the stored procedure.

Troubleshooting

An important point to note is that stored procedures called from the SQL Bee module may not contain temporary tables. Any stored procedure containing a temporary table will fail to run.

Configure Connection Manager

The connection manager module (ApisCnxMgr) is used to configure connections to remote OpcUa servers.

To create a module of this type, follow the guide Add Module to Apis Hive and select the module type "ApisCnxMgr":

Click 'Add' followed by 'Ok' in the module properties dialog to create the module.

Next, items of type "OpcUa connection" must be added to the module. Right-click the new module in Solution explorer, select "Add items" and then "OpcUa connection":

In the "Add items" dialog, click 'Add item(s)' to create an OpcUa connection item with the name "OpcUa connection1". Select this item in the item-list, and specify the "Endpoint URL" property for the OpcUa server:

Repeat these steps for each OpcUa server that ApisHive should connect to. The item list in Apis Management Studio will show the status of each connection:

The connection manager module also supports the item type "OpcUa cluster", which is used when redundant OpcUa servers are available and you want ApisHive to automatically fail-over between these servers. After creating an OpcUa cluster item, select the connection items that are part of this cluster and assign their "Cluster" property:

The OpcUa and OpcUaProxy modules can now connect to the cluster, and will automatically choose the best server in the cluster by observing the connection status and ServiceLevel of each server.

Stream Data to Broker

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisUaPublisherBee from the Module type dropdown list.

  • After adding the module, select the new module named "UaPublisherBee" from the Solution Explorer.
  • First select the type of communication:
    • File (used for debugging purposes; information is written to a file so the messages can be inspected)
    • MQTT (when using the MQTT protocol towards the broker)
    • AMQP (when using the AMQP protocol towards the broker)
  • Press Apply

When the protocol is selected, several parameters need to be set, and these will vary depending on the selected broker type.

Brokertype: File

  • Properties
    • Filename: The name of the file to use.

This is mainly used for debugging purposes, to see the content of the messages themselves. The only property to set is the filename where the messages will be stored. There will be one line for each message. The plugin uses 10 files to avoid unlimited file size and disk usage; when a file reaches a certain size limit, a new file is created, named XX_0 up to XX_9.

Brokertype: AMQP

  • Properties
    • AMQP Type: Select the communication to use: either WebSocket or HTTPS, synchronous or not.
    • AMQPMain Address: The Microsoft endpoint of the Event Hub where all real-time data are sent.
    • AMQPBackFill Address: The endpoint of the backfill channel.
    • AMQP Connectiontype: Multiple, Single, or Transient. Default is Multiple.

In addition to these settings, the Main EntityPath/Topic and BackFill EntityPath/Topic properties have to be filled out. It is possible to use the same parameters for both the Main and Backfill properties, but then all data will be sent to the same Event Hub.

Brokertype: MQTT

  • Properties
    • MQTTMain Address: Address of the broker, e.g. test.mosquitto.org
    • MQTTMain Port: Port to use, e.g. 1883
    • MQTTMain ClientId: A unique string (e.g. a GUID)
    • MQTTMain User: A user defined by the broker.
    • MQTTMain Password: Password for the broker.
    • MQTTMain CleanSession: (enabled or not)
    • MQTTMain Version: (V3.1.1 or V5.0)
    • MQTTMain Transport: (TcpServer without security, TcpServerTLS with security/encryption)
    • MQTTMain Client certificate: Full name of a certificate, if the broker needs this to verify the client.
    • MQTTBackFill Address: Address of the broker, e.g. test.mosquitto.org
    • MQTTBackFill Port: Port to use, e.g. 1883
    • MQTTBackFill ClientId: A unique string (e.g. a GUID)
    • MQTTBackFill User: A user defined by the broker.
    • MQTTBackFill Password: Password for the broker.
    • MQTTBackFill CleanSession: (enabled or not)
    • MQTTBackFill Version: (V3.1.1 or V5.0)
    • MQTTBackFill Transport: (TcpServer without security, TcpServerTLS with security/encryption)
    • MQTTBackFill Client certificate: Full name of a certificate, if the broker needs this to verify the client.

These properties define the communication with both the primary and the secondary broker. The primary broker always gets the real-time messages, while the secondary gets messages that are old, either when catching up or when resending old messages that the primary broker did not accept. In addition to these properties, the Main EntityPath/Topic and BackFill EntityPath/Topic properties have to be set. Also check the documentation for the common properties that have to be defined.
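Before pointing the publisher at a broker, it can be useful to verify the address, port and credentials with a small standalone client. The sketch below uses the third-party paho-mqtt package (1.x-style constructor) against the public test.mosquitto.org broker mentioned above; the topic and client id are just examples.

# Quick MQTT sanity check of broker address/port/credentials
# (assumes the third-party "paho-mqtt" package, 1.x-style constructor;
# topic and client id are examples).
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="apis-publisher-test")
# client.username_pw_set("user", "password")   # uncomment if the broker requires login
# client.tls_set()                             # uncomment for a TLS (TcpServerTLS-style) transport

client.connect("test.mosquitto.org", 1883)     # MQTTMain Address / MQTTMain Port
client.loop_start()
result = client.publish("apis/test", payload="hello from Apis test", qos=1)
result.wait_for_publish()                      # blocks until the broker has acknowledged
print("published, rc =", result.rc)
client.loop_stop()
client.disconnect()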

When using MQTT towards Microsoft IoT Hub, you get a connection string from Microsoft. This has to be decoded into the individual parameters. See https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-mqtt-support#using-the-mqtt-protocol-directly-as-a-device for details.

The other parameters can be changed later.

To start publishing data we need to:

  • Add an Apis OpcUa module. This represents the data to be published.

  • Follow the guide Add Items to a Module (UaPublisher) and add an item of type "Writer Group".

    • The property PublishInterval defines how often a message is published.
    • MaxNetworkMessageSize defines the maximum size a message can be.
  • Follow the guide Add Items to a Module (UaPublisher) and add an item of type "Variable Dataset Writer". This is used to connect the dataset to a "Writer Group".

    • Select the "Writer Group" in the WriterGroupItem property.
    • Select the ApisOpcUa module in the property "DataSetName".

There should now be a system that, every PublishInterval, collects all data that arrived in the current interval, creates a message according to the configuration, and sends it to the Event Hub. To see what the transmitted messages look like, set BrokerType to File and press Apply, then enter a filename for the property "FileName" (id 2300). Published messages are then written to that file. This is a convenient way to verify that the messages look as expected. When satisfied, go back and send the messages to the Event Hub.

One Publisher can have many "Writer Group" items and "Variable Dataset Writer" items. A "Variable Dataset Writer" can only be connected to one Dataset (Apis OpcUa module) and one "Writer Group". A Dataset can only be connected to one "Variable Dataset Writer".

To get an overview of the data flow in a PublisherBee, see the figure in ApisUaPublisherBee.

Connect Interpreter module to source

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisInterpreter from the "Module type" dropdown list.

  • After adding the module, select the new module, named "Interpreter" (or whatever name you chose), from the Solution Explorer.

Basic setup, communication interface

The module supports both serial and TCP/IP interfaces, depending on your server. In the Properties Editor, enter values for:

  1. TCP/IP based server/source:
    • Comm. type receive: Winsock
    • IP address: IP address of your server.
    • IP Port receive: TCP port of your server.
    • IP Protocol: The protocol of your server, TCP or UDP.
  2. Serial communication based server:
    • Comm. type receive: Serial
    • Serial Com port: Com port connected to the server.
    • Serial Baud rate: Baud rate of your server serial setup.
    • Serial Data bits: Data bits of your server serial setup.
    • Serial Flow Control: Handshake of your server serial setup.
    • Serial Parity: Parity of your server serial setup.
    • Serial StopBit: Stop bits of your server serial setup.
  • Further in the Properties Editor:

    • Timer: Enter the value for how often this client should poll (send) for new values on the server (in seconds).
  • Press "Apply" when done.

Mode of operation

Read

When the "Mode of operation" property is set to Read the module will attempt to Read a string from the selected communication port.

Depending on the format of the telegram, various properties must be set in addition to the communication properties.

Assume a device sends a telegram with following format:

$GPGLL,5300.97914,N,00259.98174,E,26.375698,A*F<CR><LF>

Now follow the guide Add Items to a Module, but this time add items of type "InterpreterItem".

Buffer size:

First of all, we need to allocate enough buffer space for the telegram; this telegram seems to contain nearly 50 characters, including <CR><LF>.

Setting the property Buffer size to 100 should be adequate.

Terminating character:

The telegram seems to be terminated with <LF> (0xA, line feed), which means that the Terminating character property should be set to LF. If you don't find your terminating character, just type in the ASCII value of the character (in decimal).

These property settings should be enough to receive the "raw" telegram. The module reads the interface until the terminating character is hit or the buffer is full; then the InterpreterItem is updated.

If communication is OK and the server/device is sending, the InterpreterItem tag should be updated.

If we want the module to extract / interpret part(s) of the telegram to different item(s):

Interpret:

Enable the Interpret property.

Field separator:

In this case the different values seem to be separated by a comma ',', so set the Field separator property to Comma.

InterpreterItem property

In the property window for the "InterpreterItem", set Address to the field you want to interpret:

$GPGLL,5300.97914,N,00259.98174,E,26.375698,A*F<CR><LF>

The comma-separated fields are numbered from 0: field 0 is "$GPGLL", field 1 is "5300.97914", field 2 is "N", field 3 is "00259.98174", field 4 is "E", field 5 is "26.375698" and field 6 is "A*F".

In this case, set Address to 5.

 

The InterpreterItem tag should be updated with the value from field 5 in the telegram.

If other fields need to be extracted, just add more InterpreterItems and set the appropriate Address (field). A hand-parsed version of the same interpretation is sketched below.
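As a rough illustration (plain Python, not Apis code), this is the same field addressing done by hand on the example telegram:

# Hand-parse the example telegram the same way the Interpreter addresses fields:
# split on the Field separator (comma) and pick the zero-based field index (Address).
telegram = "$GPGLL,5300.97914,N,00259.98174,E,26.375698,A*F"

fields = telegram.split(",")     # field 0 = "$GPGLL", field 1 = "5300.97914", ...
address = 5                      # the Address set on the InterpreterItem
print(fields[address])           # -> "26.375698"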

Write

Follow the guide Add Items to a Module, but this time add items of type "InterpreterSendItem".

In Write mode, the value of the "InterpreterSendItem" is sent to a listening server when:

  • The value changes (depending on the "Update only on change" attribute).
  • A Command Item using this "InterpreterSendItem" as its Parent Item is triggered.

Write->Read

Follow the guide Add Items to a Module, but this time add one item of type "InterpreterSendItem" and one item of type "InterpreterItem"

The value of the "InterpreterSendItem" is sent to a listening server, in same way as in Write mode, it is assumed that the server will send a response, the response will end up in the "InterpreterItem" in same way as in Read mode.

This mode is applicable to trig some kind of synchronous response (reading) from a casual server.

TCP-Server

Follow the guide Add Items to a Module, but this time add items of type "InterpreterSendItem".

In this mode:

If no "InterpreterItem" is defined, the "InterpreterSendItem" is sent to a connected client as a stream, with no handshake.

If an "InterpreterItem" is defined, the "InterpreterItem" acts as a command identifier: when the server receives data matching the value of the "InterpreterItem", it responds with the value of the "InterpreterSendItem". The Address is used as the identifier.

For instance, the Worker StringFormatter can be used to format the command and response for your needs.

The example below shows a server where two commands are defined: "print" and "get_status". The Address property decides which response belongs to which command. When the TCP-Server receives "print" from an external client, it will respond with "Print job finished Id 177": "print" matches the value of the "PrintCmd" item, which has Address 10, so the server responds with the value of the SendItem having Address 10.

In this case, a PuTTY terminal is connected to the server to demonstrate this feature.

If there's no connection or data received:

  • Use a third-party terminal application such as PuTTY to check whether the server is sending telegrams.
  • For a TCP/IP based server:
    • Check the firewall settings for the receiving port.
    • Check the network connection to the server (e.g. with ping).
  • In many cases, devices such as hand-held scanners send data only occasionally; it might then be difficult to figure out whether the device is simply idle or there is a configuration/communication problem.

Store

This section gives an introduction to storing time series data with Apis Honeystore. Please pick a topic from the menu.

Store Time Series Data

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisLogger from the "Module type" dropdown list.

  • After adding the module, select the items in the item list that you want to store

  • In the Properties Editor, click on the button "Add Property"
  • In the "Add property" dialog, type "*apislogger*" in the filter field.

  • Click "Apply".

View Time Series Data

  • In the Historical Items view, select the preferred start time, end time, and aggregate, then click "Read History"

Adding a Database

A database can be added to Honeystore in AMS by selecting "Create database" in the context menu of the hs://<computer> node.

By setting: 

  • Database name - the name of the database
  • Path - where the database is stored on disk
  • Maximum items - the maximum number of items that can be stored in the database
  • Cache size - read this to find out how to calculate a reasonable cache size.

And clicking OK, a new database will be created.

Delete Databases

A database can be deleted in AMS by selecting "Delete" in the context menu of the database node.

When deleting a database, all historical data contained within the database will also be deleted.

Typically, the application responsible for storing the data in the database also handles the deletion of the database, e.g. the ApisLoggerBee module of Apis Hive.

It's only possible to delete a database in the "Admin" running mode.

Export Data

The data in a database can be exported to a file in AMS.

It's only possible to export a database in the "Online" and "OnlineNoCache" running modes.

This is done by selecting "Tasks->Export Data" from the context menu of the database node. This will display a dialog box for selecting the items to export.

By clicking "Select All", all items will be selected. At the top, there's a filter to find items easily.

When the items which you want to export have been selected, press the "Next" button. This brings up fields for inputting start and end times, and the name of the exported file.

After pressing the "Next" button again, you can start exporting by clicking the "Start Export" button. A progress bar indicates the progress of the export. When the data has been exported, you push the "Close" button to close the dialog box

Import Data

Data can be imported into existing databases in AMS.

It's only possible to import into a database in the "OnlineNoCache" running mode.

Assure the database mode is "OnlineNoCache", then select "Tasks -> Import Data" in the context menu of the database node.

This brings up a dialog box where you can select a file and file type. If the file ends with .ahx, the file is assumed to be binary. The file type can, however, be overridden.

There are two possible file types: Binary and Tab Delimited text files.

Tree Filter

It is possible to filter the content of the tree view for the Honeystore nodes in AMS. In the context menu of the Honeystore node there is a Filter menu item. Clicking it opens a dialog in which it is possible to:

  • Hide disabled databases.
  • Hide databases which are in the OnlineNoCache running mode.
  • Show only databases which match the naming filter (* is allowed as a wildcard).
  • Set the maximum number of databases displayed.

Replay Data

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisReplay from the Module type drop down list.

Three ways of running the Replay module are described here:

  1. Playing back data from a specific start time to an end time at a certain speed.
  2. Calculating an aggregate using relative start and end times.
  3. Playing back data as input to another module and synchronizing the two.
  • After adding the module, select the new module named "Replay" from the Solution Explorer.

1. Playing back data from a specific start time to an end time at a certain speed.

In the Properties Editor, enter values for:

"SynchronousReadDelay": 1000 ms.This property is the delay in ms. between read from the database. Setting this property to 0 might cause system overload.

"Resolution":1 s.

"StartTime" and "EndTime": Choose values where you are sure that database contains data.

"Exchange rate": Exchange rate in milliseconds for updating items from external items.

Press "Apply" when done.

Follow the guide Add Items to a Module, but this time add item of type "SynchrounusItem".

  • Click the "Browse" button.

  • A dialog opens that lets you select Items in the database(s). Click "Ok" when done.

  • The item list will get new entries showing the added items.

In addition, it shows a number of status and control items, amongst them "Start and End-Time", "Resolution" and "Sync delay". The newly added items now show bad quality; they have no data yet.

Toggle the "CmdStart" tag to true

The replay starts, CurrentTime shows the timestamps of played values, the timestamps of replayed values depends of the "TimeMode" property.

Controlling the playback:

The "speed" of the playback can be adjusted with the "Syncdelay" item and the Resolution depending of application.

Playback can be paused/unpaused by toggle the "CmdPause" item, in pause you can step playback by toggle the "CmdStep" item.

If the "Step on NextTime" property is set, you can in Pause step to a desired time by setting the "NextTime" item to a value between "Start" and "End-time"

2. Calculating an aggregate using relative start and end times.

In the Properties Editor, Advanced section, enter values for:

UseRelativeTime: true

SynchronousReadAggregate: "Average", or any other aggregate suitable for your needs.

We use the same items as in the previous example. In the item list, set the values of "RelativeStart", "EndTime" and "Resolution". In this case we want to calculate the average of the last hour.

Toggle "CmdStart".

The replayed items will now show the average for the last hour.

3. Playing back data as input to another module and synchronizing the two.

This implies another module consuming data from the Replay module, which is beyond the scope of this quick start guide; however, here is a simple example.

A simulation module is required, in this case a Java module connected to a simple ModFrame application. The ModFrame application has two input signals, Scalar1 and Scalar2, and one output signal, Scalar3. The playback items ApisReplay.Logger.Worker.Signal3 and Signal4 are used as input to Java.Scalar1 and Scalar2.

To be able to synchronize these two modules, Java and ApisReplay, we have to use the internal Commands and Events mechanism in Apis. The image below shows the default configuration for these two modules.

In the Replay module we have OnSyncReadDone, an event that notifies that a synchronous read operation has finished and new data is ready on the replayed items. In the default configuration this event is connected to the ReadSync command, which immediately initiates a new synchronized read operation (a loop).

In the Java module there is an OnTimer event, fired at a rate given by the module property Timerperiod. This is connected to the OneStep command calling the OnOneStep() Java method; when OnOneStep() returns, it fires the OneStepDone event, indicating that the OnOneStep() Java method is finished. With this configuration these two modules live their own lives.

In the next image we have changed the configuration: when the Replay module has new data (OnSyncReadDone), it notifies the Java module first to read its input (HandleExternalItems) and then to run a step (OneStep). When Java has finished its calculation (this could take some time if the application were an advanced simulator), it fires OnStepDone, which triggers ReadSync in the Replay module. We now have a "loop" where the two modules are synchronized.

Troubleshooting

No data: Check the start and end times; is there any data in the time period?

Hangs: Check the Commands and Events loop.

Process

This section gives an introduction to adding alarms, calculations, and email notifications. Please pick a topic from the menu.

Hive Worker Module

How to configure worker

This section gives an introduction to configuring the Worker module. Please pick a topic from the menu.

How to configure Signal item

This example shows you how to add a Signal item, which is used to auto-generate different types of signals. Seven different signal types are supported:

  1. Sine
  2. Triangle
  3. Sawtooth
  4. Square
  5. Random
  6. PeriodicRandom
  7. Counter

The variable type which supports this functionality is the item type Signal on the ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive to add a module of type ApisWorker to an Apis Hive Instance.

  • After adding the module, select the new module named "Worker" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.

  • Click on Apply

Add item type Signal

  • Click 'Add item(s)'
  • Click Ok

Configure item Signal1 as signal type Sine

This will show you how to configure a signal that generates a sine curve with an amplitude of 50, an offset (or bias) of 100, and a period of 20 seconds.

  • Select the item Signal1
  • In the attribute window select the attribute Waveform and select the value Sine
  • In the attribute window select the attribute Amplitude and set the value to 50
  • In the attribute window select the attribute Bias and set the value to 100
  • In the attribute window select the attribute Period and set the value to 20

  • Click Apply
  • You should get a sine signal with an offset of 100, an amplitude of 50, and a period of 20 seconds, as shown below.
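For reference, the expected curve corresponds to an ordinary sine around the bias. A minimal sketch of the same signal follows; the phase and exact start time used by Apis are assumptions here, only the shape matters:

# Sketch of the configured Sine signal: bias 100, amplitude 50, period 20 s.
import math

bias, amplitude, period = 100.0, 50.0, 20.0

def signal_value(t_seconds):
    """Value of the sine signal at time t (in seconds)."""
    return bias + amplitude * math.sin(2.0 * math.pi * t_seconds / period)

for t in range(0, 21, 5):
    print(t, round(signal_value(t), 2))   # swings between 50 and 150 around the bias of 100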

Configure item Signal2 as signal type sawtooth

  • Follow the same step as for Signal1 but change the waveform to Sawtooth
  • The signal should look like the picture below, with an amplitude of 50 and a bias of 100. The period is 20 seconds.

Configure item Signal2 as signal type square

  • Follow the same steps as for Signal2, but change the waveform to Square.
  • The picture below shows just the data points, with a line drawn between them.

  • The picture below shows a graph where the current value is kept until a new value is received. The actual values (raw data) are shown as circles on the curve.

How to configure Stringformatter item

This example explains how to add a StringFormatter item, which is used to format a string based on external item(s) and a format-control string.
The variable type which supports this functionality is the item type StringFormatter on the ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive to add a module of type ApisWorker to an Apis Hive Instance.

  • After adding the module, select the new module named "Worker" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.
  • Click on Apply

Add items

  • Click "Add item(s)"
  • Click Ok

Connect String Formatter item to source item

The StringFormatter will format the value on the input item according to the value in the FormatControlString property.

Connect external item

  • Right Click on the StringFormatter item and select Connect
  • Select the item which should be used as source item. In this case I will use an item called Worker.InputStringFormatter

  • Click Connect and then Ok

Configure FormatControlString property

  • Select the StringFormatter item to get the property view of the item
  • Set the value of FormatControlString so that the item is formatted based on the input value.

  • Click Apply
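
The exact syntax of FormatControlString is defined by the ApisWorker module; as an illustration only, the sketch below assumes a printf-style control string (both the control string and the sample value are hypothetical):

# Hypothetical illustration of the formatting step, not the module's implementation.
input_value = 23.456                               # value of the connected external item
format_control_string = "Temperature: %.1f degC"   # assumed printf-style control string

formatted = format_control_string % input_value
print(formatted)   # -> "Temperature: 23.5 degC"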

How to configure Variable item with reset value

This example explains how to automatically reset an item to a user-defined value after a given period. The variable type that supports this functionality is the item type Variable on the ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisWorker from the Module type drop down list.

  • After adding the module, select the new module named "ApisWorker1" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.

  • Click on Apply

Add items

  • Click Add item(s)
  • Select the items (in the item list) and change the Valuetype to 8 byte float (in the property view)

  • Click Ok

Configure Value reset functionality

  • In the items list view (All Items), select item ApisWorker1.Variable1 to get the item property
  • Set the AutoResetTimeout to '5000' milliseconds

  • Click Apply
  • Do the same operation for ApisWorker1.Variable2 but set the AutoResetValue to -1

  • Click Apply

Change the value of the item.

  • Select item 'ApisWorker1.Variable1' and set the value to 60. After 5 seconds the value should be reset to 0
  • Select item 'ApisWorker1.Variable2' and set the value to 30. After 5 seconds the value should be reset to -1 (the sketch below illustrates this auto-reset behaviour)
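
A minimal Python sketch of the assumed auto-reset behaviour (AutoResetTimeout in milliseconds, AutoResetValue as the value to fall back to; an illustration, not the module's implementation):

import threading

class AutoResetVariable:
    # After a write, the value reverts to 'reset_value' once 'timeout_ms' has elapsed.
    def __init__(self, timeout_ms=5000, reset_value=0.0):
        self.timeout_s = timeout_ms / 1000.0
        self.reset_value = reset_value
        self.value = reset_value

    def write(self, new_value):
        self.value = new_value
        threading.Timer(self.timeout_s, self._reset).start()

    def _reset(self):
        self.value = self.reset_value

var1 = AutoResetVariable(5000, 0.0)   # behaves like ApisWorker1.Variable1
var1.write(60)                        # value is 60 now; back to 0 after 5 seconds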

How to configure Time item

This example explains how to add a Time item, which displays a "clock" as its value in either UTC or local time. The variable type that supports this functionality is the item type Time on the ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive to add a module of type ApisWorker to an Apis Hive Instance.

  • After adding the module, select the new module named "Worker" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.

  • Click on Apply

Add item type Time

  • Add item named Time1 of type Time

  • Click Add item(s)
  • Click Ok

Configure item Type

  • To display the value in local time, set the attribute 'Local time' to true. If you want to display the value in UTC, set the attribute 'Local time' to false (see the sketch after these steps)

  • Click Apply
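
The difference between the two settings is simply which clock representation is reported, as in this small Python illustration:

from datetime import datetime, timezone

print(datetime.now(timezone.utc).isoformat())    # 'Local time' = false: UTC
print(datetime.now().astimezone().isoformat())   # 'Local time' = true: local time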

How to configure Multiplexer item

This example explains how to use a multiplexer to select between a set of input items. The variable type that supports this functionality is the item type Multiplexer on the ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive to add a module of type ApisWorker to an Apis Hive Instance.

  • After adding the module, select the new module named "Worker" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.
  • Click on Apply

Add item type Multiplexer

  • Click "Add item(s)"
  • Click Ok

Connect Multiplexer item to source item

The number of inputs to select between is given by the external items connected. The first external item (ExternalItem1) is the selector of which port to use as the value.

Connect selector item

  • Make sure to connect the first item to the input selector item as ExternalItem1
  • Right Click on the Multiplexer item and select connect
  • Select the item which should be used as selector item. In this case I will use an item called Worker.InputSelector

  • The attribute overview of item "Multiplexer1" shows that the input selector item "Worker.InputSelector" is connected to ExternalItem1

Select which items to multiplex between

The different items to multiplex between are given by the external items: ExternalItem2 = input1, ..., ExternalItemN = input(N-1). See the sketch after these steps.

  • Connect Multiplexer item to input1, input2, input3 and input4.

  • You should see the selector item as ExternalItem1, and ExternalItem2 to ExternalItem5 should be the different input items
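
A minimal Python sketch of the assumed multiplexer behaviour (the selector indexing is an assumption; here selector value 1 picks input1):

def multiplex(selector, inputs):
    # ExternalItem1 is the selector; ExternalItem2..N are the inputs.
    return inputs[int(selector) - 1]

inputs = [10.0, 20.0, 30.0, 40.0]   # like input1..input4
print(multiplex(2, inputs))         # -> 20.0 (input2 selected)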

How to configure Item Attribute item

In some cases you might want to make item attribute(s) available as separate items in the Apis namespace. This example explains how to do just that. The variable type that supports this functionality is the item type Item Attribute Items on the ApisWorker, ApisOpc and ApisOpcUa modules. This example will show you how to configure item attribute items on an ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive to add a module of type ApisWorker to an Apis Hive Instance.

  • After adding the module, select the new module named "Worker" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.
  • Click on Apply

Add items

  • Make sure that you have the items with attributes available in the Apis namespace. In this example I will use a Worker Signal item.
  • Follow the guide Add Items to a Module, but this time select the item type "Item Attribute Items".
  • Click on Browse button to get an overview of which attribute to expose as item.

  • Select Sine.Amplitude by clicking the check box on the left side of the item, and click the Ok button twice

  • The access rights of the item Worker.Sine.Amplitude are given by the access rights of the attribute. Since Amplitude has read/write access, you should be able to change the amplitude both from the Apis namespace and from the item attribute property window. Updating either one (item or attribute) is automatically reflected in the other.

How to configure Bit Selector item

This example explains how to select a bit in a value and check whether the bit is set or not. The variable type that supports this functionality is the item type BitSelect on the ApisWorker module.

Add worker module

Follow the guide Add Module to Apis Hive to add a module of type ApisWorker to an Apis Hive Instance.

  • After adding the module, select the new module named "Worker" from the Solution Explorer.
  • Set the "ExchangeRate" property to e.g. 1000 ms. This is the update rate when this module exchanges data with other modules.
  • Click on Apply

Add items

  • Click Ok

Connect Bit select item to source item

  • Right click on the item and select Connect

  • Select the source of the bit select item to connect

  • Click Connect and then Ok

Configure which bit index to check of source item

  • Select the bit select item to get the property view of the item
  • Set the value of BitPosition to the bit index you want to check

  • Click Apply

Result

  • The bit select item should become true when the bit in position 1 is set, and false if not, as the sketch below illustrates.
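
A minimal Python sketch of the assumed bit-select rule (bit position 0 taken as the least significant bit, which is an assumption):

def bit_is_set(value, bit_position):
    # True when the bit at 'bit_position' in the integer value is set.
    return (int(value) >> bit_position) & 1 == 1

print(bit_is_set(2, 1))   # True  (binary 10, bit 1 is set)
print(bit_is_set(4, 1))   # False (binary 100, bit 1 is not set)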

Add Alarms

Setup Level Alarms

This example explains how to configure Apis Hive as an OPC AE service and add level alarms on items.

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisAlarmArea from the "Module type" dropdown list.

  • After adding the module, select the items in the item list that you want to setup alarms for.

  • In the Properties Editor, click on the button "Add Property"
  • In the "Add property" dialog, type "*alarm*" in the filter field, and select the new global property "ApisAlarmArea1EvtCategory"

  • Click "Ok"
  • In the Properties Editor, click on the property "ApisAlarmArea1EvtCategory". From the dropdown menu, select the alarm category you want. In this example we'll use "level".

  • Click "Apply".
  • The OPC DA alarm attributes AlmH, AlmHH, AlmL and AlmLL (Alarm High Limit, Alarm High-High Limit, Alarm Low Limit, Alarm Low-Low Limit) will be added to the item. Set the limits you want. In this example we set the AlmH limit to 100; see the sketch after these steps.

  • Click "Apply".

Setup Discrete Alarms

This example explains how to configure Apis Hive as an OPC AE service and add discrete alarms on items.

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisAlarmArea from the "Module type" dropdown list.

  • After adding the module, select the items in the item list that you want to setup alarms for.

  • In the Properties Editor, click on the button "Add Property"
  • In the "Add property" dialog, type "*alarm*" in the filter field, and select the new global property "ApisAlarmArea1EvtCategory"

  • Click "Ok"
  • In the Properties Editor, click on the property "ApisAlarmArea1EvtCategory". From the dropdown menu, select the alarm category you want. In this example we'll use "discrete".

  • Click "Apply".
  • The predefined Apis alarm attribute AlmNormalState will be added to the item. Set the normal-state value you want. In this example we'll set AlmNormalState to 50; see the sketch after these steps.

  • Click "Apply".

Setup Watchdog Alarms

This example explains how to configure Apis Hive as an OPC AE service and add watchdog alarms on items.

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisAlarmArea from the "Module type" dropdown list.

  • After adding the module, select the items in the item list that you want to setup alarms for.

  • In the Properties Editor, click on the button "Add Property"
  • In the Add property dialog, type "*alarm*" in the filter field, and select the new global property "ApisAlarmArea1EvtCategory"

  • Click "Ok"
  • In the Properties Editor, click on the property "ApisAlarmArea1EvtCategory". From the dropdown menu, select the alarm category you want. In this example we'll use "watchdog".

  • Click "Apply".
  • The predefined Apis alarm attribute AlmWatchdogPeriods will be added to the item. Set the watchdog period. In this example we set AlmWatchdogPeriods to 6.

Note: The watchdog period is measured in multiples of the ScanPeriod module attribute on the ApisAlarmAreaBee. With a scan period of 500 ms and AlmWatchdogPeriods set to 6, the watchdog alarm is triggered if the item's value has not changed within 3 seconds, as the sketch after these steps illustrates.

  • Click "Apply".

Setup Watch Quality Alarms

This example explains how to configure Apis Hive as an OPC AE service and add watch quality alarms on items.

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisAlarmArea from the "Module type" dropdown list.

  • After adding the module, select the items in the item list that you want to setup alarms for.

  • In the Properties Editor, click on the button "Add Property"
  • In the "Add property" dialog, type "*alarm*" in the filter field, and select the new global property "ApisAlarmArea1EvtCategory"

  • Click "Ok"
  • In the Properties Editor, click on the property "ApisAlarmArea1EvtCategory". From the dropdown menu, select the alarm category you want. In this example we'll use "watch quality".

  • Click "Apply".
  • The predefined Apis alarm attribute AlmWatchdogPeriods will be added to the item. Set the watchdog period. In this example we set AlmWatchdogPeriods to 6.

Note: The watchdog period is measured in multiples of the ScanPeriod module attribute on the ApisAlarmAreaBee. In this example, with a scan period of 500 ms on the ApisAlarmAreaBee, an alarm is triggered if no change in the item's value has occurred within 3 seconds.

  • Click "Apply".

Add Dynamic Calculations

The ApisCalculate module has been deprecated.
Instead, use Function items in any module supporting this item type, e.g. the ApisWorker module.

Process Events with ApisEventBus

Overview

The EventBus bee is used for event processing. It uses item types to define four basic processing components:

  • Sources connect to something that can produce events, e.g. Chronical, Kafka.
  • Sinks connect to something that can consume events, e.g. Chronical, RDBMS, Kafka.
  • Channels are places where Sources can publish events and Sinks can subscribe to events.
  • Routers move events between Channels, with optional filtering and transformations.

Channel item type

This item type defines a channel/queue with the same name as the item. By default, all events published to a channel are immediately forwarded to all subscribers. If the attribute BatchTime is set, events are grouped together for this number of seconds into a single event. If the attribute BatchSize is also specified, it overrides BatchTime once the specified number of events has been batched.
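
A minimal Python sketch of this batching rule (an illustration only; parameter names and defaults are assumptions, not the module's implementation):

import time

def batch_events(incoming, batch_time_s=5.0, batch_size=10):
    # Group events until either BatchTime seconds have passed or BatchSize
    # events have been collected, then forward the group as a single batch.
    batch, started = [], time.monotonic()
    for event in incoming:
        batch.append(event)
        if len(batch) >= batch_size or time.monotonic() - started >= batch_time_s:
            yield batch
            batch, started = [], time.monotonic()
    if batch:
        yield batch

for group in batch_events(range(25)):
    print(len(group), "events forwarded as one batch")   # 10, 10, 5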

Router item type

This item type has attributes for Input channel, Output channel and Script. All events published to the input channel are filtered and/or transformed by the script, and the result is published on the output channel.

Source.Chronical item type

This item type creates a subscription for Chronical events in the local Hive instance. It has attributes used to specify an Event source, an Event type, and an Output channel.

The Source.Chronical item produces events like this:

<event>
	<Timestamp value="132313410105734253" filetime="2020-04-14T12:30:10.5734253Z"/>
	<Generation value="1"/>
	<Sequence value="19708435"/>
	<Source value="2166" name="Worker.Variable1">
		<Area name="AlarmArea"/>
	</Source>
	<Type value="20" name="LevelAlarm">
		<Attr>
			<UA_NODEID value="0:0:9482"/>
			<UA_SUBSTATES value="Low,LowLow,High,HighHigh"/>
			<AE_CONDITION value="Level alarm"/>
		</Attr>
	</Type>
	<State value="16777227" Enabled="1" Active="1" Acked="1" Low="1"/>
	<Severity value="780"/>
	<Message value="The value is lower than 5, last was 0"/>
	<Received value="132313410105734253" filetime="2020-04-14T12:30:10.5734253Z"/>
	<SourceName value="Worker.Variable1"/>
	<UserName value="PREDIKTOR\larsh"/>
	<Category value="2" name="Level"/>
	<ActiveTime value="132313408772290610" filetime="2020-04-14T12:27:57.2290610Z"/>
	<CurrentValue value="0"/>
	<CurrentQuality value="192"/>
	<CurrentTimestamp value="132313408772290610" filetime="2020-04-14T12:27:57.2290610Z"/>
	<LastState value="16777223" Enabled="1" Active="1" AckRequired="1" Low="1"/>
	<Comment value="Acked from AMS"/>
</event>

Sink.Db item type

This item type connects to an ADO database specified by the attribute Connectionstring, and executes stored procedures when the events received on the Input channel have the following layout (a sketch of the mapping follows the example):

<db>
	<exec name='stored-proc-name'>
		<arg name='param-name-1' value='param-value'/>
		<arg name='param-name-2' value='param-value'/>
	</exec>
</db>
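
To make the layout concrete, here is a small Python sketch (an illustration only, not the module's implementation) of how such a payload maps to a stored-procedure invocation:

import xml.etree.ElementTree as ET

payload = """<db>
  <exec name='ac_new_event'>
    <arg name='SourceName' value='Worker.Variable1'/>
    <arg name='Severity' value='780'/>
  </exec>
</db>"""

for exec_el in ET.fromstring(payload).findall("exec"):
    proc = exec_el.get("name")
    args = {a.get("name"): a.get("value") for a in exec_el.findall("arg")}
    print("EXEC", proc, args)   # handed to the ADO connection as a stored-procedure call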

Sink.Smtp item type

This item type is used to send emails. The attributes Server (and optionally Username and Password) define the SMTP connection.

The attributes From, To, Cc, Bcc and Importance define message properties.

The Sink.Smtp item expects incoming events to have the following layout:

<mail subject="New alarms from ApisHive">
  <part type='text/html'>
	<html>
	  <head>.....</head>
	  <body>....</body>
	</html>
  </part>
</mail>

Sink.Tracelog item type

This item type saves each received event to a tracelog. It is used for debugging/inspecting the events on a channel specified by the attribute Input channel. The attribute TraceFile specifies the path to the tracefile, and the attribute Enabled is used to enable/disable tracing.

Example: exporting events from Chronical to PDS DB

  • Create two Channel items: "ChronicalChannel" and "PdsChannel"
  • Create one Source.Chronical item, specify the Eventtype and Eventsource which should be exported, and set the Output channel to "ChronicalChannel".
  • Create one Sink.Db item, specify the ADO Connectionstring for the PDS database and set the Input channel to "PdsChannel"
  • Create one Router item, set Input channel to "ChronicalChannel", Output channel to "PdsChannel"

Now events can be transformed into stored procedure invocations by the following XSLT:

<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">

<xsl:template match="/event">
	<db>
		<exec name='ac_new_event'>
			<arg name='Timestamp' value='{Timestamp/@filetime}'/>
			<arg name='SourceName' value='{Source/@name}'/>
			<arg name='AlarmAreaName' value='{Source/Area/@name}'/>
			<arg name='TypeName' value='{Type/@name}'/>
			<xsl:apply-templates select='Type/Attr/AE_CONDITION'/>
			<xsl:apply-templates select='State/@Active'/>
			<xsl:apply-templates select='State/@Enabled'/>
			<xsl:apply-templates select='State/@Acked'/>
			<xsl:apply-templates select='State/@Low'/>
			<xsl:apply-templates select='State/@LowLow'/>
			<xsl:apply-templates select='State/@High'/>
			<xsl:apply-templates select='State/@HighHigh'/>
			<arg name='Severity' value='{Severity/@value}'/>
			<arg name='Message' value='{Message/@value}'/>
			<arg name='CategoryName' value='{Category/@name}'/>
			<arg name='ActiveTime' value='{ActiveTime/@filetime}'/>
			<arg name='Quality' value='{CurrentQuality/@value}'/>
			<xsl:apply-templates select='Comment'/>
			<xsl:apply-templates select='UserName'/>
		</exec>
	</db>
</xsl:template>

<xsl:template match='@Active'>
	<arg name="StateActive" value="1"/>
</xsl:template>

<xsl:template match='@Enabled'>
	<arg name="StateEnabled" value="1"/>
</xsl:template>

<xsl:template match='@Acked'>
	<arg name="StateAck" value="1"/>
</xsl:template>

<xsl:template match='@Low'>
	<arg name="SubconditionName" value="Lo"/>
</xsl:template>

<xsl:template match='@LowLow'>
	<arg name="SubconditionName" value="LoLo"/>
</xsl:template>

<xsl:template match='@High'>
	<arg name="SubconditionName" value="Hi"/>
</xsl:template>

<xsl:template match='@HighHigh'>
	<arg name="SubconditionName" value="HiHi"/>
</xsl:template>

<xsl:template match='Comment'>
	<arg name="Comment" value="{@value}"/>
</xsl:template>

<xsl:template match='UserName'>
	<arg name="UserName" value="{@value}"/>
</xsl:template>

<xsl:template match='AE_CONDITION'>
	<arg name="ConditionName" value="{@value}"/>
</xsl:template>

</xsl:stylesheet>

Example: SMTP formatting of batched events

<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">

<xsl:output method="xml" indent='yes'/>

<xsl:template match="/batch">
  <mail subject="You've got {count(event)} new alarms!">
	<part type='text/html'>
	  <html>
		<head>
		  <style>
			body {font-family: Helvetica, sans-serif;}
			th, td {text-align: left;}
			th {background: #ccc; padding: 0.2em; padding-right: 2em;}
			td {padding: 0.1em 0.2em; padding-right: 2em;}
		  </style>
		</head>
		<body>
		  <h1>Alarm list</h1>
		  <p>There was <xsl:value-of select="count(event)"/> alarm-related events since the previous report.</p>
		  <table>
			<tr>
			  <th>Date</th>
			  <th>Time</th>
			  <th>Severity</th>
			  <th>Active</th>
			  <th>Acked</th>
			  <th>Condition</th>
			  <th>Source</th>
			  <th>Message</th>
			  <th>Operator</th>
			  <th>Comment</th>
			</tr>
			<xsl:apply-templates select='event'/>
		  </table>
		</body>
	  </html>
	</part>
  </mail>
</xsl:template>

<xsl:template match='event'>
  <tr>
	<xsl:if test='position() mod 2 = 0'>
	  <xsl:attribute name='style'>background-color: #f0f0f5;</xsl:attribute>
	</xsl:if>
	<td><xsl:value-of select='substring(Timestamp/@filetime, 0, 11)'/></td>
	<td><xsl:value-of select='substring(Timestamp/@filetime, 12, 8)'/></td>
	<td><xsl:value-of select='Severity/@value'/></td>
	<td style='color:red;'><xsl:if test='State/@Active'>&#x2757;</xsl:if></td>
	<td style='color:green;'><xsl:if test='State/@Acked'>&#x2714;</xsl:if></td>
	<td><xsl:value-of select='Type/Attr/AE_CONDITION/@value'/></td>
	<td><xsl:value-of select='Source/@name'/></td>
	<td><xsl:value-of select='Message/@value'/></td>
	<td><xsl:value-of select='UserName/@value'/></td>
	<td><xsl:value-of select='Comment/@value'/></td>
  </tr>
</xsl:template>

</xsl:stylesheet>

Example: filtering of Chronical events

<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">

<xsl:output method="xml" omit-xml-declaration="yes"/>

<!-- 
Template matching the top element of the input document.

If this element matches the filtering criteria (State/@Active=1),
the element is applied to the 'copy-template' which recursively
copies an element and all its attributes and child elements.

If the top element does not match the filtering, nothing will be
copied to the output document.
-->
<xsl:template match="/event">
  <xsl:if test="State[@Active=1]">
	<xsl:apply-templates mode="copy" select="."/>
  </xsl:if>
</xsl:template>

<!--
Template used to copy nodes and attributes, recursively
-->  
<xsl:template mode="copy" match='@*|node()'>
  <xsl:copy>
	<xsl:apply-templates mode="copy" select="@*|node()"/>
  </xsl:copy>
</xsl:template>

</xsl:stylesheet>

Item Expression Editor

The Item Expression Editor is used to define and edit the calculation expression that computes an item's value from external item source(s).

Contextualization

Introduction

Industrial sensor data, in its native form, is often available from sensors as numerical or binary data elements. The simplest industrial communication protocols provide only these basic elements, while more advanced protocols may introduce meta data such as time-stamps, quality indicators and descriptive information which enhances the value of the data. However, such simple meta data are most often not enough for applications to act upon the data the right way.

The journey from sensor data to actionable information requires that the data is set in context. Context, or contextual information, is any information about an entity that can be used to effectively reduce the amount of reasoning required to understand what the data means. The process of contextualization is about bringing the right context to structure-less or badly structured data. The overall objective of Apis as an information gateway is to provide a standardized and unified way of accessing operational data from different assets, standardized not only on protocol but also on data semantics (context). The uniqueness of Apis lies in its ability to bring context into unstructured data, to change the context of contextualized data, and to expose the same data in different contexts. Contextualization is beneficial and necessary in many scenarios, but there are two areas where its benefit stands out: diverse and distributed asset environments, and OT (Operational Technology) Big Data scenarios.

Target Scenarios

In large organizations with distributed assets delivered by different vendors over a long period, the assets tend to expose their data in different formats and structures. A pump from vendor A does not necessarily expose the same information as a pump from vendor B, and pumps installed 10 years ago display their information differently from pumps installed today. The ability to harmonize the information structures so that similar assets appear in a unified way is essential for working efficiently with the data and for assuring the quality of the results.

In OT Big Data applications, the contextualization stage is extremely important, since you do not always know the potential usage of the data at the time of collection, and you need to bring in context (or different contexts) at a later stage. Flexibility and efficiency in the process of data contextualization is therefore a considerable advantage when handling big data in OT applications. (In IT Big Data applications, a similar concept called Schema on Read is adopted in data lake architectures: data is applied to a schema as it is pulled out of a stored location, rather than as it goes in.) Organizations and applications characterized by a combination of diverse and distributed asset environments and OT Big Data will benefit considerably from utilizing the Apis gateway.

Information models in Apis

Contextualization in Apis is built around the OPC UA standard and the information model capability of this standard. OPC UA information models are graph models structured as a fully meshed network of nodes, allowing information to be connected in various ways and expressing the semantics of the domain of interest. A key benefit of Apis as an information gateway is the capability to build and host generic OPC UA based information models. This means that OPC UA model structures can be represented in Apis, with the data variable nodes of the model hosted by Apis Hive items. This way it is possible to provide context (and multiple different contexts) to the data hosted in Apis Hive and Apis Honeystore. Information models can be constructed manually in Apis Management Studio, imported from external tools such as UA Modeler from Unified Automation, or built automatically by integrating existing structure sources such as engineering databases and other master data repositories.

Asset Registry

A frequently used information model structure in industrial systems is asset hierarchies. Apis introduces the Asset Registry concept, which is based on the OPC UA ISA95 companion standard. The Asset Registry can be used to organize any system, simplify engineering and enhance navigation. Utilizing the semantic modelling capabilities, the Asset Registry enables system engineers to organize all enterprise and equipment data into reusable equipment (asset) types and equipment classes that can easily be instantiated for all similar assets to construct a complete system.

Namespace Replication

Namespace Replication is the ability to extract models for one or more namespaces on an external (source) UA server and expose those namespaces through a local Hive UA server. Namespaces from several external UA servers can be aggregated into the same UA server, called an aggregating server.

Namespace replication requires a running instance of the Apis OPC UA Namespace Server (UANSS) service.

A prerequisite to namespace replication is to connect to the external server with an OPC UA Client, the ApisOpcUa module.

For how to install UANSS, see Apis OPC UA Namespace Server Installation

To start, stop and connect to UANSS, see Connect to Apis OPC UA Namespace Server.

To set up which namespaces to replicate for an external server, and how, see Configure Orchestrator.

To set up variable-item generation for an external server, see Configure Replicated Namespaces.

The process of performing the replication of namespaces is called Crawling, described in Address Space Crawling.

The process of loading replicated namespaces into the aggregating server is described in Namespace Proxy Models.

Connect to Apis OPC UA Namespace Server

The Apis OPC UA Namespace Server (UANSS) runs as a Windows service. The service name is 'APIS OPC UA Namespace server'. The service can be started, stopped and configured from the Services view in Windows.

Once the ApisOpcUa module (OPC UA Client) is connected to the source server, use Apis Management Studio (AMS) to connect to the UANSS service:

The UANSS service address can be entered directly or found in the dropdown box. Amend the server name and/or port if necessary.

Once connected, the content of the UANSS server can be viewed in the Solution Explorer:

The first level is the Hive (ApisHive in the picture). The second level is a representation of the ApisOpcUa module (OpcUa); this representation is called an Orchestrator in the UANSS.

Configure Orchestrator

A representation of a source server is called an Orchestrator; it will have the same name as the ApisOpcUa module that is connected to the server.

To configure the Orchestrator, select the Orchestrator in the Solution Explorer in Apis Management Studio (AMS). The properties of the Orchestrator can be viewed and changed in the Property Editor in AMS.

To configure the properties of a specific replicated namespace, see Configure Replicated Namespace.

The properties are the following:

Crawler:

  • Database root store: Location of the root store on disk. Crawler results will be stored here.
  • Address space crawling mode: How address space crawling is initiated, Manual or OnSchedule. Manual: the crawl is initiated manually. OnSchedule: the crawl is run automatically on a schedule.
  • Address space crawling schedule: How often crawling is done when the mode is OnSchedule. Choose between Hourly, Daily, Weekly and Monthly.
  • Method call redirection: Whether or not method calls are redirected to the native server.
  • Config path: The path to the config directory.
  • Application uri: Application URI for the UA client.
  • Use discovery service: If true, the UA discovery service is used to find endpoints.
  • Max response count: The maximum number of items in a single response.
  • Include Apis namespaces: States whether to include Apis-specific namespaces when crawling an ApisHive UA Server.
  • Crawler results store size: Number of crawler results to keep.
  • Low cache water mark: The cache is purged down to this size. If both watermarks are 0, the cache never purges and grows to the number of nodes crawled (i.e. uses a lot of memory).
  • High cache water mark: Maximum size of the cache (in number of nodes). If both watermarks are 0, the cache never purges and grows to the number of nodes crawled (i.e. uses a lot of memory). Good high watermarks are probably around 200,000-1,000,000, with the low watermark somewhere around 5,000-10,000 below the high watermark (meaning 5,000-10,000 nodes are flushed from the cache when the high mark is reached). If you are not running out of memory, use watermarks of 0, as this has a tremendous effect on performance.
  • Use in memory database: If true, the intermediate node database is stored in memory instead of in a file.
  • Ua View: The UA View to be used during crawling.

UA Client: Most properties are fetched from the ApisOpcUa module.

  • Reverse connection: Specifies whether the client (false) or the server (true) should initiate the connection. This property is set by the Hive UA client.
  • Server endpoint: Address of the UA server.
  • Reverse endpoint: Endpoint used for the reverse connection. The format is opc.tcp://0.0.0.0:{port number}. This can be the same as or different from the reverse endpoint used by the Hive OPC UA client. If different, remember to add it to ReverseConnections on the server.
  • User name: User name for the connection.
  • Password: The password for the OPC UA connection (same as for the ApisOpcUa module).
  • Secure connection: If true, the most secure connection method the server provides is used. This property is set by the Hive UA client.
  • Authentication enabled: If true, a username and a password must be defined. This property is set by the Hive UA client.
  • Use discovery service: If true, the UA discovery service is used to find endpoints.
  • Session timeout: Requested maximum number of milliseconds that a session should remain open without activity. If the client fails to issue a service request within this interval, the server will automatically terminate the client session. Default is 60000 milliseconds.
  • Discovery timeout: Timeout for contacting the discovery service, in milliseconds. Default is 15000.

Namespaces to replicate

Select the namespaces to replicate from the remote server. In order to replicate namespaces on a remote server, the namespaces of interest must be selected and added.

Locate the 'APIS OPC UA Namespace server' in the Solution Explorer and browse down to the Orchestrator, then to Replicated Namespaces. Right-click on Replicated Namespaces to open a popup and select Add Replicated Namespaces:

This opens a dialog where namespaces can be selected for replication:

Press the Add namespace button to get a list of available namespaces, select the ones you want.

To avoid uri collisions on the aggregating server, you can change the exposed uri of the replicated namespace, as is done for the last namespace in the example above.

Press the Add custom namespace button to add non-existing namespaces (namespaces that are expected to appear at a later point in time).

To configure a replicated namespace, locate the namespace beneath the Orchestrator in the Solution Explorer:

Configure Replicated Namespace

Configuring a replicated namespace consists mainly of determining how the generation of items will be done. Variables (that extend BaseDataVariableType) in the namespace are nodes that hold some kind of data that varies, e.g. the value of a temperature sensor. To keep the aggregating server updated with the latest values, the values are updated from the remote server by subscription. Internally in the aggregating server, the variable values are stored in Apis Hive OPC items. By using items, the subscribed values will also be available from an OPC DA client.

For a replicated namespace, one needs to determine which variables to create items for, and how to generate the item names.

The configuration properties for a replicated namespace are the following:

Misc:

  • Exposed URI: The exposed namespace URI which the remote URI is mapped to. This is the namespace URI that clients connecting to the aggregating server will see. It may or may not be the same as the Source URI.
  • Item generation: Specifies when items used to hold variable values should be generated for the replicated namespace:
      • If no types in namespace: Generate items if there are no types in the namespace (only instances)
      • Always: Generate items for all variables (this will include instance declarations)
      • Never: Do not create items
      • Datavariables that are not instance declarations: Generate items for all variables other than instance declarations
  • Namespace store: Specifies the database file used to hold the local replicated namespace representing the remote URI.
  • Source URI: The remote source namespace to replicate.

Item naming:

  • Path element: Specifies which attributes to use when generating names for items to hold datavariable values:
      • Default
      • NodeId: use the node id only
      • Browsename: use the browsename only
      • Browsename and NodeId: use the browsename and the node id
      • Displayname: use the displayname only
      • Displayname and NodeId: use the displayname and the node id
  • Path rule: Specifies how far up the hierarchy to go when generating names for items to hold datavariable values.
  • Name hierarchy cache depth: How many levels, counted from the Objects folder, to cache names during item name generation. Specify 0 to cache everything. May use a lot of memory on large systems.
  • Path separator: Specifies the separator character to use between parent-child relationships when generating names for items to hold datavariable values.

Naming root:

  • Fail policy: Specifies how to handle the situation when a naming root is not detected along the parent-child hierarchy:
      • Fail: Abort the crawl with an error message
      • Stop at first object node: Include the path elements up to, and including, the first parent that is an object
      • Stop at objects folder: Include the path elements up to the Objects folder
  • Include subtypes: Specifies whether instances of subtypes of the naming root type are accepted as naming root.
  • Node Id: Specifies the node id of the naming root type node.

Address Space Crawling

Address space crawling is the process of replicating foreign namespaces from one or more remote UA servers. Crawling is configured and initiated by right-clicking on an Orchestrator in the Solution Explorer in AMS and choosing Address Space Crawling in the popup.

This will open the Crawling configuration view:

The current status of the Crawler is shown in the Crawler status field.

Start the Crawler manually by pressing the Start crawler button. An ongoing crawl can be stopped by pressing the Stop crawler button.

Each crawl will result in a model set. The number of sets is limited to the 'Crawler results store size' property of the Orchestrator. All Crawler model sets are visible in the Crawler model sets table:

  • Available model sets: Name of the model set.
  • Started: Date and time when the crawl started.
  • Finished: Date and time when the crawl finished.
  • Initiated: How the crawl was initiated, manually or on schedule.
  • Status: Status of the crawler result. If errors exist, select the model set to see the errors in the Crawler set content table.

To delete a model set, select the model set in the table and press the Delete button.

Press the Refresh button to update the content of the table.

The Crawler set content table will show the selected namespaces for the crawl and any error messages:

  • Source URI: Namespaces in this crawl.
  • Validation error: If the crawl result has errors, an error summary is shown here.
  • Error/Warning log file: If the crawl result has errors, a detailed log file can be found at this location.

See Namespace Proxy Models for how to load the crawled model into the aggregating server.

Namespace Proxy Models

On the Namespace Proxy Models configuration page you configure which crawler result models to load into the aggregating server. The page is opened by navigating to the Orchestrator, right-clicking, and choosing Namespace Proxy Models:

This will open the Namespace Proxy Models configuration view:

The Replicated namespaces table shows the foreign namespaces that are loaded into the aggregating server.

  • Exposed URI: The namespace URI that is exposed from the aggregating server.
  • Source URI: The namespace URI on the source server.
  • Crawl time: The time when the crawl that produced the model was started.
  • Loading mode: By default, the namespace is loaded automatically when a crawl has finished. Note that this happens only if there are changes from the last crawl. If you do not want automatic loading, change the mode to Manual directly in the table.
  • Last loaded: The time when the model was loaded into the aggregating server.
  • Source: Location of the file containing the namespace that is loaded.
  • Obsolete: If true, there exists a newer (different) version ready to be loaded. In this case, the background of the row is pink.

Manually load models

To manually load models, you need to select a namespace source. This is done in the 'Namespace source' area: select a crawler result db-file or a NodeSet2 file. After the source is selected, select the namespaces to replace in the 'Replicated namespaces' table. Use CTRL+click to select multiple namespaces. Press the Load model(s) button to load the namespace(s) into the aggregating server.

Compare models

To compare models from a crawler set namespace source to the currently loaded models, select a Namespace source and in the 'Replicated namespaces' table, select the namespaces of interest. Press the Compare button to produce the compare results:

  • Namespace URI: The namespace URI that is being compared.
  • Nodes only in current: The number of nodes that exist in the loaded namespace, but not in the crawled namespace.
  • Nodes only in crawler set: The number of nodes that exist in the crawled namespace, but not in the loaded namespace.
  • References only in current: The number of references that exist in the loaded namespace, but not in the crawled namespace.
  • References only in crawler set: The number of references that exist in the crawled namespace, but not in the loaded namespace.
  • Current database: Location of the file containing the namespace that is currently loaded.
  • Old database: Location of the file containing the crawled namespace.

Troubleshooting Guide

Question: Why are no function items generated?

Answers:

  • Are all referenced namespaces available?
  • Are any BaseDataVariable-derived variables defined in the namespace? For function items to be generated, variables in the semantic model must inherit from BaseDataVariableType.

Create new namespace

In Apis Hive, namespaces are hosted in a module called ApisSemantics. In order to create a new (empty) namespace in the UA server, an instance of the ApisSemantics module must be created in Apis Hive. See "Adding a Module" on how to add a module to Hive.

In the Add a Module dialog, ApisSemantics is located in the Others group. Give the module a name and press Add.

A new dialog appears, giving you the opportunity to manipulate properties for the module. Setting the Uri is the most important step at this point. For a description of all properties, see ApisSemantics properties.

Press Apply, and the new namespace is added to the UA server.

Namespace management

All namespace management described here is done from one menu in AMS. To open this menu, go to the Hive instance you want to work with, open "Information Modelling" and right-click on "Perspectives" to get the context menu. In the context menu, select Namespace Management. The following choices can be made:

Check Namespaces

Import Namespaces

Export Namespaces

Regenerate Items

Delete Non-Mapped Optional Variables

Check Namespaces

Select Check Namespaces in the Namespace Management menu.

The following window appears.

It is possible to select one or more namespaces. The search field makes it possible to filter the namespaces.

At least one validation method must be selected before the Check namespaces button is enabled.

Press the Check namespaces button to start the validity check.

The check may take some time, and when it's finished a result window will appear like the one below:

The first list contains the namespaces selected to check. The first column indicates if there were any errors, the second column is the uri of the namespace and the last column states how many nodes and references were checked.

By clicking on the rows in the first list, results are listed in the error results list, and by clicking in the error list, the detailed error appears in the Details section.

Import Namespaces

Right-click on Information Modelling -> Perspectives and select the Namespace Management menu and select Import Namespaces.

The following window appears:

In the File type entry select either:

  • Import NodeSet - will import from a node set file, and completely replace anything in that namespace with the content of the file.
  • Import NodeSetChanges - will import from a node set changes file, and completely replace anything in that namespace with the content of the file.
  • Update From NodeSetChanges - will import from a node set changes file, but will leave anything currently in that namespace untouched unless it has a duplicate in the node set changes file.

To import from a file, select Nodeset File to Import and browse to the file.

To Import from a Configuration Repository, select Backup Set to Import and browse the service for the desired namespace backup.

By default, all check methods will be selected. After the import is done, the new namespace will be checked.

Start the import and check by pressing the Start button.

If the namespace exists and nodes are defined, the user gets a warning and can cancel the import.

If the selected file is not a correct NodeSet XML file, an error message will be shown and the import stops.

If errors are detected, the error result form will pop up and the user can select to either Import anyway, or Discard this import.

Semantics Module:

A namespace is hosted by a Semantics Module in Hive. If one already exists for your namespace, you are good to go.

If a Semantics Module does not exist, one will be created for you. The name of the module is based on the Namespace URI by the following algorithm:

  • The uri is split into tokens by '/'.
  • If the last token contains at least 4 letters, this will be used.
  • Tokens will be added to the name, separated by '_', till there are at least 4 letters in total.

Examples:

http://opcfoundation.org/UA/Dictionary/IRDI will give the name 'IRDI'

http://opcfoundation.org/UA/DI will give the name 'UA_DI'

http://opcfoundation.org/UA/DI/2012/02/1 will give the name 'UA_DI_2012_02_1'
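
A minimal Python sketch of this naming algorithm (it reproduces the examples above; skipping the empty token produced by '//' is an assumption):

def semantics_module_name(uri):
    # Split the URI on '/', then prepend tokens (joined by '_') until the
    # name contains at least four letters in total.
    tokens = [t for t in uri.split('/') if t]
    name_tokens = []
    for token in reversed(tokens):
        name_tokens.insert(0, token)
        if sum(c.isalpha() for c in '_'.join(name_tokens)) >= 4:
            break
    return '_'.join(name_tokens)

print(semantics_module_name("http://opcfoundation.org/UA/Dictionary/IRDI"))   # IRDI
print(semantics_module_name("http://opcfoundation.org/UA/DI"))                # UA_DI
print(semantics_module_name("http://opcfoundation.org/UA/DI/2012/02/1"))      # UA_DI_2012_02_1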

If you want to decide the name yourself, create the Semantics Module before importing the nodeset file, or rename the auto-generated module.

Export Namespaces

Select Export Namespaces in the Namespace Management menu.

The following window appears:

It is possible to select multiple namespaces for export. The topmost text box filters the displayed namespaces.

The export format can be either NodeSet or NodeSetChanges.

Each exported namespace will be exported to its own file.

The folder in which these files will be stored can be selected by the user.

Click the Export namespaces button to export.

The following window will appear displaying the files the namespaces will be stored in.

It is possible to alter this by clicking on the file name.

To perform the actual export click Ok.

Delete Non-Mapped Optional Variable

By "deleting non-mapped optional variable" we mean deleting variables which inherit from BaseDataVariableType and do not have any external items.

Open the Namespace Management menu and select Delete Non-Mapped Optional Variable.

The following window appears:

Select one or more namespaces and click Delete Optional Non-Mapped Items

Regenerate Items

The term "Regenerate Items" means to create the function items which hold the real-time values of variables. For instance if you have changed the name-generation options, you will need to regenerate the items according to the new rules.

Open the Namespace Management menu and select Regenerate Items.

The following window appears:

Select one or more namespaces and click the Regenerate Items button.

Import a namespace

Namespace import is done by importing a nodeset2.xml file. Before you can import the file, an ApisSemantics module for your namespace must exist. See Create new namespace on how to create a module for your namespace. It is of crucial importance that the namespace Uri in the ApisSemantics module is exactly the same as the one in your nodeset2.xml file.

Follow the instructions in Import new/replace namespace from nodeset to complete the namespace import.

Model construction

An information model is a representation of concepts and the relationships, constraints, rules, and operations to specify data semantics for a domain. In Apis, it specifies relations between assets and their data.

The ability to construct information models is built into the Apis framework and it is an implementation of the OPC UA information modelling specification.

Before you can start constructing models, you need to create a namespace to host your model.

There are several ways to construct models in Apis:

Construction Perspectives

Information models can be manually built using Apis Management Studio (AMS). OPC UA information modelling is a standardized way to organize your data, described in detail on the page Unified Architecture from OPCFoundation.

All information models in the Apis UA Server adhere to this standard.

The OPC UA specification is extendable; there are specifications built on top of OPC UA called Companion Specifications. A list of companion specifications can be found here.

In Apis, there are tools for building models for the following specifications:

  • OPC UA
  • ISA95 (The Asset Registry part is implemented in Apis)

To construct models in AMS, connect to an Apis Hive instance and navigate to the model perspective you want to work with.

Navigate to 'Information Modelling' then 'Perspectives' to find the different construction perspectives that Apis supports:

OPC UA:

Navigate the OPC UA node to work with models in the OPC UA specification perspective:

Right-click on a node and choose Edit Object to start constructing a model. See Build models for OPC UA for a detailed explanation of how to do this.

Asset Registry:

Navigate the Asset Registry node to work with models in the ISA95 companion specification perspective.

Only instances that conform to the ISA95 companion specification will be visible in this perspective.

Right-click on a node and choose Edit Equipment to start constructing a model. See Build models for ISA95 (Asset Registry) for a detailed explanation of how to do this.

Note! The topmost Equipment in an ISA95 model must be added from the OPC UA specification perspective, on the Objects node. The top node must be of type EquipmentType, or an inheritance, to conform to the ISA95 companion specification.

Model Construction OPC UA

The OPC UA Model Construction Editor is where it is possible to add, edit and delete instances in an OPC UA information model. See how to open this editor here.

Before you can start constructing models, you need to create a namespace to host your model.

The editor view consists of several parts: navigation, properties, actions, and an overview of children and references; see the image below. Next, the different parts are described in detail.

The first row is a breadcrumb showing the path of parents from the Objects node to the node you are editing. Press the names to navigate to a parent.

The arrow buttons let you navigate back and forth in your navigation history.

Adaptive means the editor will automatically load the content of nodes selected in the browsable tree on the left. If not selected, the content will stay on this instance till you change it.

The Reload button will reload the content from the server.

Properties

  • NodeId: Uniquely defines the instance in the UA server.
  • NodeClass: An enumeration identifying the NodeClass of the instance, such as Object or Variable.
  • DisplayName: The name of the node when displayed in a user interface.
  • BrowseName: Identifies the instance when browsing the UA server.
  • Description: Description of the instance.
  • Namespace: The namespace URI that this instance belongs to.
  • EventNotifier: Indicates if the semantic object is an event notifier.
  • Parents: Names and references of parents.
  • WriteMask: Specifies which attributes of the instance are writable (editable).
  • UserWriteMask: Specifies which attributes of the instance are writable (editable) by the currently connected user.
  • Type: The type of this instance.
  • Properties (group): Contains all properties of type PropertyType of this instance.

Actions

To build a model, extend instances by utilizing the following actions:

  • Add optional child: Add optional children that are defined in the type of the instance.
  • Add placeholder child: Add children defined by a placeholder modelling rule.
  • Add custom child: Add children of custom type and custom reference.
  • Add reference: Add a reference to another instance or type.

Children

The children of this instance of NodeClass Object with a hierarchical reference from this instance.

  • DisplayName: The DisplayName of the child.
  • TypeDefinition: The type of the child.
  • ReferenceType: The reference type to the child.
  • ModellingRule: The ModellingRule defined for this child.
  • NodeClass: The NodeClass of this child.
  • Description: Description of the child.

Child objects can be deleted by pressing the waste bin icon to the right. Note that this will delete the whole child object, not just the reference. If the waste bin is disabled, it's probably because the modelling rule does not allow deletion.

The children of this instance of NodeClass Variable with a hierarchical reference from this instance.

  • DisplayName: The DisplayName of the child.
  • TypeDefinition: The type of the child.
  • ReferenceType: The reference type to the child.
  • ModellingRule: The ModellingRule defined for this child.
  • NodeClass: The NodeClass of this child.
  • Value: The value of the variable.
  • DataType: The data type of the variable.
  • Description: Description of the child.
  • Expression: Calculation expression used to compute the value from external item source(s). The expression can be changed by clicking the button to open the Item Expression Editor.

Child variables can be deleted by pressing the waste bin icon to the right. Note that this will delete the whole child variable, not just the reference. If the waste bin is disabled, it's probably because the modelling rule does not allow deletion.

References

The Reference table contains references to and from this instance. There are filters for reference Direction (Forward, Inverse and Both) and reference Hierarchical type (Hierarchical, NonHierarchical and Both).

  • ReferenceType: The reference type to the target.
  • Reference id: The NodeId of the reference type.
  • Target name: The DisplayName of the target of the reference.
  • Target id: The NodeId of the target.
  • Direction: Forward: the reference points from this instance to the target. Inverse: the reference points from the target to this instance.
  • Hierarchy type: Hierarchical: references used to model a hierarchy. NonHierarchical: references used for other than hierarchical purposes.
  • Description: Description of the reference.

References can be deleted by pressing the waste bin icon to the right. Note that this will delete the reference only, not the target of the reference. If the waste bin is disabled, it's probably because this is the only hierarchical reference to the target.

Model Construction ISA95 (Asset Registry)

In the Asset Registry Model Construction Editor, it is possible to add, edit and delete equipments, equipment properties and equipment classes in an ISA95 information model. See how to open this editor here.

Before you can start constructing models, you need to create a namespace to host your model.

The editor view consists of several parts: navigation, properties, actions, and an overview of children and references; see the image below. Next, the different parts are described in detail.

The first row is a breadcrumb showing the path of parents from the top node to the node you are editing. The top node is actually the Objects node. The Objects node is not visible in this view since it's not a part of ISA95.

Press the names to navigate to a parent.

The arrow buttons let you navigate back and forth in your navigation history.

Adaptive means the editor will automatically load the content of nodes selected in the browsable tree on the left. If not selected, the content will stay on this instance till you change it.

The Reload button will reload the content from the server.

Properties

  • NodeId: Uniquely defines the instance in the UA server.
  • NodeClass: An enumeration identifying the NodeClass of the instance, such as Object or Variable.
  • DisplayName: The name of the node when displayed in a user interface.
  • BrowseName: Identifies the instance when browsing the UA server.
  • Description: Description of the instance.
  • Namespace: The namespace URI that this instance belongs to.
  • EventNotifier: Indicates if the semantic object is an event notifier.
  • Parents: Names and references of parents.
  • WriteMask: Specifies which attributes of the instance are writable (editable).
  • UserWriteMask: Specifies which attributes of the instance are writable (editable) by the currently connected user.
  • Type: The type of this instance.
  • Properties (group): Contains all properties of type PropertyType of this instance.

Actions

To build a model, extend instances by utilizing the following actions:

  • Add optional child: Add optional children that are defined in the type of the instance.
  • Add equipment: Add children defined by a placeholder modelling rule (children that inherit EquipmentType).
  • Add property: Add variable children (variables that inherit EquipmentPropertyType).
  • Add equipment class: Add an equipment class (ClassType).

Sub Equipments

The children (objects) of this instance of type EquipmentType (or an inheritance), with a MadeUpOfEquipment reference from this instance.

  • DisplayName: The DisplayName of the child.
  • TypeDefinition: The type of the child.
  • ReferenceType: The reference type to the child.
  • ModellingRule: The ModellingRule defined for this child.
  • NodeClass: The NodeClass of this child.
  • Description: Description of the child.
  • EquipmentLevel: The equipment element level.

Sub equipments can be deleted by pressing the waste bin icon to the right. Note that this will delete the whole equipment, not just the reference. If the waste bin is disabled, it's probably because the modelling rule does not allow deletion.

Equipment Properties

The children (variables) of this instance of type EquipmentPropertyType (or an inheritance) with a HasISA95Property (or HasISA95ClassProperty) reference from this instance.

  • DisplayName: The DisplayName of the child.
  • TypeDefinition: The type of the child.
  • Id: The NodeId of the child.
  • ReferenceType: The reference type to the child.
  • Reference Id: The NodeId of the reference.
  • ModellingRule: The ModellingRule defined for this child.
  • Value: The value of the variable.
  • DataType: The data type of the variable.
  • Description: Description of the child.
  • Expression: Calculation expression used to compute the value from external item source(s). The expression can be changed by clicking the button to open the Item Expression Editor.

Equipment properties can be deleted by pressing the waste bin icon to the right. Note that this will delete the property, not just the reference. If the waste bin is disabled, it's probably because the modelling rule does not allow deletion.

Equipment Classes

Equipment Classes (types that inherit EquipmentClassType) defined for this instance, each with a DefinedByEquipmentClass reference from this instance.

Column Name | Explanation
Name | The name of the equipment class.
Id | The NodeId of the equipment class.
Description | The description of the equipment class.

Equipment Classes can be removed by pressing the waste bin icon to the right. All equipment properties defined for the equipment class will be removed from the equipment.

Add Reference

Add a reference from an instance to a target by selecting the reference type and target.

Enter data for your new reference in the dialog. Press the Add button when done. You can add multiple references in this dialog. The new references are listed in the table below the Add button. New references are not created until you press the Ok button.

Property Name | Description
Namespace | The target namespace where new references will be stored.
Source | The name of the source instance.
ReferenceType | The type of the reference. Use the browse button to locate the type.
Target(s) | The target(s) of the reference(s). Multiple targets can be selected in the target browse dialog. Use the browse button to open a target browser.

Add Equipment Class

Equipment Classes (types that inherit EquipmentClassType) can be added to an equipment in this dialog.

The Namespace is where the equipment properties of the equipment class and the reference to this equipment class will be stored. This can be different from the namespace of the equipment it is added to.

Use the tree on the left-hand side to browse to the desired equipment class. The equipment class is not added until you press the Ok button.

An overview of the equipment class' type is visible on the right-hand side of the dialog. It is also possible to select optional children, change types and create custom ids in the type overview (custom ids are only available when the target namespace requires them). If the equipment class contains children with abstract types (red symbol), you must select a non-abstract type for the child. This is done by clicking on the red symbol and selecting a type from the browsable tree that appears.

Add Equipment

Add child equipment that is defined in the parent's type.

Enter data for your new equipment on the left side of the dialog. The reference to the equipment is MadeUpOfEquipment and is defined in the ISA95 companion specification. Press the Add button when done. You can add multiple child equipments in this dialog. The new equipments are listed in the table below the Add button. New equipments are not created until you press the Ok button.

An overview of the child's type is visible on the right-hand side of the dialog. It is also possible to select optional children, change types and create custom ids in the type overview (custom ids are only available when the target namespace requires them).

Property Name | Description
Namespace | The target namespace where the new equipment will be stored.
Browse name URI | The BrowseName may be stored in another namespace.
EquipmentType | Type of the equipment (is or inherits EquipmentType). Use the browse button to locate the type.
Displayname | The text part of the DisplayName.
Browsename | The text part of the BrowseName.
Description | The description of the child.
EquipmentLevel | The equipment element level.

Abstract Data Types:

If the type definition contains children with abstract types (red symbol), you must select a non-abstract type for the child. This is done by clicking on the red symbol and selecting a type from the browsable tree that appears.

Add Property

Add one or more equipment properties (variables) that are defined in the parent's type.

Enter data for your new equipment property on the left side of the dialog. The reference to the equipment property is HasISA95Property and is defined in the ISA95 companion specification. Press the Add button when done. You can add multiple child equipment properties in this dialog. The new equipment properties are listed in the table below the Add button. New equipment properties are not created until you press the Ok button.

An overview of the child's type is visible on the right-hand side of the dialog. It is also possible to select optional children, change types and create custom ids in the type overview (custom ids are only available when the target namespace requires them).

Property Name | Description
Namespace | The target namespace where the new equipment property will be stored.
BrowseName URI | The BrowseName may be stored in another namespace.
EquipmentPropertyType | Type of the equipment property (is or inherits EquipmentPropertyType). Use the browse button to locate the type.
DataType | The data type of this equipment property (the type of the value). Use the browse button to locate the type.
Displayname | The text part of the DisplayName.
Browsename | The text part of the BrowseName.
Description | The description of the child.

Abstract Data Types:

If the type definition contains children with abstract types (red symbol), you must select a non-abstract type for the child. This is done by clicking on the red symbol and selecting a type from the browsable tree that appears.

Add Optional Children

Types may define children (objects and variables) that have the Optional modelling rule. If the modelling rule is Optional (or OptionalPlaceholder), instances of the type may, or may not, have instances of these children. The Add Optional Children dialog offers an easy way to add optional children to an instance.

The Namespace is the target namespace that the children will be added to (may be different from the parent instance).

If the target namespace requires the user to supply NodeIds for the children, an additional button, "Generate Ids", will appear. See Generate Custom Ids for how to generate custom NodeIds.

In the tree, nodes that are empty and not grayed out are optional children that can be added by selecting them. Note: if an optional node already exists, it will not be deleted if you deselect it in the tree. Deleting nodes is only possible in the model construction editors.

Nodes that are grayed out are mandatory children; they will always exist on the instance.

If a node is "half" selected (the white '/' symbol), it means that there are sub nodes that are optional (that can be selected).

If a node is fully selected (the white 'v' symbol), it means that the node either exists or will be created and that any optional children are also selected.

Press the Ok button to apply your changes, or Cancel to exit without any changes.

Abstract Data Types:

If the type definition contains children with abstract types (red symbol), you must select a non-abstract type for the child. This is done by clicking on the red symbol and selecting a type from the browsable tree that appears.

Add Custom Child

Children of any type of object or variable can be added in this dialog. If a child is not a part of the parent's type, the child will not have a modelling rule.

Enter data for your new child on the left side of the dialog. Press the Add button when done. You can add multiple children in this dialog. The new children are listed in the table below the Add button. New children are not created until you press the Ok button.

An overview of the child's type is visible on the right-hand side of the dialog. It is also possible to select optional children, change types and create custom ids in the type overview (custom ids are only available when the target namespace requires them).

Property Name | Description
Namespace | The target namespace where new children will be stored.
Browse name URI | The BrowseName may be stored in another namespace.
TypeDefinition | The type of the child instance. Visualized in the right side of the dialog. Use the browse button to locate the type.
ReferenceType | The type of the reference from the parent to this child. Use the browse button to locate the type.
Displayname | The text part of the DisplayName.
Browsename | The text part of the BrowseName.
Description | The description of the child.

Abstract Data Types:

If the type definition contains children with abstract types (red symbol), you must select a non-abstract type for the child. This is done by clicking on the red symbol and selecting a type from the browsable tree that appears.

Add Placeholder Children

Types may define children (objects and variables) that have the Mandatory Placeholder or the Optional Placeholder modelling rule. If the modelling rule is Mandatory Placeholder, instances of this type will have at least one child of the type specified for the child, and can have many. If the modelling rule is Optional Placeholder, instances of this type may have zero or many children of the type specified for the child.

Enter data for your new child on the left side of the dialog. The reference to the child is hierarchical and the type is defined in the parent's type. Press the Add button when done. You can add multiple children in this dialog. The new children are listed in the table below the Add button. New children are not created until you press the Ok button.

An overview of the child's type is visible on the right-hand side of the dialog. It is also possible to select optional children, change types and create custom ids in the type overview (custom ids are only available when the target namespace requires them).

Property Name | Description
Namespace | The target namespace where new children will be stored.
Browse name URI | The BrowseName may be stored in another namespace.
Placeholder | Type of the child defined by the placeholder modelling rules.
TypeDefinition | The type of the child instance. Visualized in the right side of the dialog. Use the browse button to locate the type.
Displayname | The text part of the DisplayName.
Browsename | The text part of the BrowseName.
Description | The description of the child.

Abstract Data Types:

If the type definition contains children with abstract types (red symbol), you must select a non-abstract type for the child. This is done by clicking on the red symbol and selecting a type from the browsable tree that appears.

Generate Custom Ids

Semantic namespaces in Apis can be set up to let the user supply the NodeIds. See the Assign Ids property on the ApisSemantics module.

When constructing models using Apis Management Studio (AMS), the user must supply the NodeIds in the various Add-dialogs (e.g. Add Custom Child).

Note that NodeIds must be unique within the namespace. If a NodeId you enter already exists, the creation of your instance will fail.

In the Add-dialogs, there is an overview of the type to be created on the right-hand side.

You can press the fingerprint icon to the right to set the NodeId of each member of the type.

Column Name | Explanation
Id Type | The type of the id; choose from Numeric, String, Guid or ByteArray.
Id | Enter your NodeId.

Press the Generate Ids button to set the NodeIds in the whole instance.

Column Name | Explanation
Id Type | Hierarchical or Unique. Hierarchical: uses what you enter in Root Id and adds ids corresponding to the type, appending the BrowseNames hierarchically. Unique: uses Guids.
Id | The root name of the hierarchy.
Overwrite existing | Will overwrite any previously entered NodeIds.
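As an illustration of the two id strategies, here is a minimal Python sketch (this is not the AMS implementation; the root id, the browse path and the '.' separator are assumptions) that builds either a hierarchical string id from a root id and the BrowseNames down the hierarchy, or a unique Guid-based id:

import uuid

def hierarchical_id(root_id, browse_names):
    # Build a string id by appending BrowseNames hierarchically to the root id,
    # e.g. "Plant1" + ["Pump_01", "OutPressure"] -> "Plant1.Pump_01.OutPressure".
    return ".".join([root_id] + list(browse_names))

def unique_id():
    # Build a Guid-based id.
    return str(uuid.uuid4())

print(hierarchical_id("Plant1", ["Pump_01", "OutPressure"]))
print(unique_id())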

Target Model Identification Form

Target Model Identification (TMI) is the process of identifying and declaring the entities (instances/variables/properties) that make up a semantic target model, based on knowledge of the real world that is the target of modelling, combined with knowledge of the metamodel upon which the model shall be based. Typically, the process involves identification of real-world entities that can be typed according to types defined in the selected target metamodel. The TMI form can be used to bulk-specify the entities that constitute the target model, create copies of models, update models, add references, connect external items and more.

The TMI Form is a document that defines the instances of a semantic model in a human readable form. The TMI form is defined in an Excel workbook (xlsx) file.

When building models in a TMI form, the recommended workflow is to first build a small part of the model in the model builder in AMS. This small part should span as much of the model as possible, i.e. contain the types used and one or more paths down to the leaves of the model. When exporting this model to a TMI form, the layout of the TMI will be correct and make a good starting point for manually adding entities to the model. After the entities are added to the TMI form, the form can be imported to the UA server.

The TMI Form format differs between UA companion specifications and the base UA specification itself. Be sure to use the version corresponding to the specification used for the semantic model.

Apis Management Studio supports the following UA specifications: ISA95 (see Target Model Identification Form ISA95) and OPC UA (see Target Model Identification Form OPC UA).

Target Model Identification Form ISA95

This version of the TMI Form supports Asset Registry in the ISA95 UA companion specification.

Apis Management Studio (AMS) is used when exporting/importing TMI forms. The TMI form itself is an Excel spreadsheet (xlsx), which can be manipulated by humans.

When building models in a TMI form, the recommended workflow is to first build a small part of the model in the model builder in AMS using the Asset Registry perspective. This small part should span as much of the model as possible, i.e. contain the types used and one or more paths down to the leaves of the model. When exporting this model to a TMI form, the layout of the TMI will be correct and make a good starting point for manually adding entities to the model. After the entities are added/updated in the TMI form, the form can be imported to the UA server to update the semantic model.

Read this page for an overview of the TMI Form, then further documentation can be found here:

To export to TMI form, see Target Model Identification Export.

To import from TMI form, see Target Model Identification Import.

To import in bulk, see Import in Bulk.

TMI form explained

In the following, a simple model of a room with a pump is used to explain the workings of the TMI Form. The pump has two variables, pressure and rpm, and has a ClassType associated with it.

Exporting the model to a TMI Form will result in an Excel file with a number of sheets. The sheets contain all the information needed to create or update this model. Objects with the same UA Type and ClassType(s) are defined in the same sheet.

Objects may have EquipmentProperties (variables of type EquipmentProperty) and/or Properties (Attributes, of type PropertyType) associated with them. The EquipmentProperties of the objects have their own sheet with the same name as the associated object sheet, postfixed with '_EP'. The Properties of objects and their EquipmentProperties also have their own sheet with the same name as the associated object sheet, postfixed with '_PT'.

The content of the sheets will now be explained:

The Master sheet

The Master sheet contains information about UA Types and ClassTypes of the objects in the model, and in which sheet to find them. All objects of the same type and same combination of ClassTypes are listed in the same sheet.

Column Name | Explanation
SequenceNo | The order in which to create instances from the sheets. It is good practice to order the creation sequence so that the base instances are defined in the earliest sheets (this is not a requirement, but the import will be faster).
EquipmentType | The UA type, with namespace index, of the objects in this sheet. Since the TMI follows the ISA95 UA companion specification, all object types inherit from EquipmentType. The number before the type name is the namespace index. The index is defined in the Namespaces sheet.
Sheet | Name of the sheet where the objects are defined. This name must be unique and correspond exactly with the name of the sheet. The name is limited to 29 characters (by Excel). The names are auto-generated on export.
ClassType1 | If the objects have ClassTypes associated with them, the names of the ClassTypes are defined in columns named ClassType1 ... ClassTypeN.

The Namespaces sheet

The Namespaces sheet contains the namespaces used in the model. It also contains the namespace index for each namespace. The index is used elsewhere in the TMI as part of the UA types. Namespace indexes may vary from server to server and after restart of servers. This table is used as a map when importing to a server, and it is never necessary to change the index in the file.

Column Name | Explanation
Namespace | The namespace URI.
Index | The namespace index associated with a namespace. The index may vary between servers and server restarts. Never change the Index in a TMI.
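Since the TMI form is an ordinary xlsx workbook, the Master and Namespaces sheets can also be read programmatically. Below is a minimal, illustrative Python sketch using the openpyxl package; the file name is hypothetical, and the column layout is assumed to match the descriptions above:

from openpyxl import load_workbook

wb = load_workbook("PumpRoom_TMI.xlsx", data_only=True)  # hypothetical TMI export file

# Build the namespace index -> URI map from the Namespaces sheet.
ns_map = {}
for uri, index in wb["Namespaces"].iter_rows(min_row=2, max_col=2, values_only=True):
    if uri is not None:
        ns_map[index] = uri

# List the object sheets declared in the Master sheet
# (columns: SequenceNo, EquipmentType, Sheet, ClassType1 ... ClassTypeN).
for row in wb["Master"].iter_rows(min_row=2, values_only=True):
    if row[0] is None:
        continue
    sequence_no, equipment_type, sheet_name = row[0], row[1], row[2]
    class_types = [c for c in row[3:] if c]
    print(sequence_no, equipment_type, sheet_name, class_types)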

The Object sheet

There may be many object sheets, with different names. This example shows the sheet called Equipment from our pump room example. All instances of type EquipmentType will be in this sheet, as defined in the Master sheet. To add more objects of EquipmentType, simply add rows in this sheet and type in the data for the objects.

Column Name | Explanation
RelativePath | The relative path from the top node (the node that was selected for export). The RelativePath is a series of DisplayNames separated by '/'. See later examples of this. You may reference instances outside of your model; the RelativePath will then start from the Objects node, e.g. Objects/SomeObject... An instance may have multiple parents; in this case, all relative paths are entered, separated by ';'. The Reference column must then also contain all references, separated by ';', in the same order as in RelativePath.
Reference | The reference to this instance from its parent. Note that no namespace index is needed; it is assumed that references are not extended and therefore are unique by name.
BrowseName | The BrowseName of the object.
DisplayName | The DisplayName of the object. If left blank, the object will not be created unless the ModellingRule is Mandatory.
NodeId | The NodeId of the object. If left blank, it will be generated on import.
Description | The Description attribute of the object.
EquipmentLevel | The EquipmentLevel (ISA95) of the object. This property could have been defined in the _PT sheet, but is included here for convenience, since it is an integral part of ISA95.
EquipmentLevelId | The NodeId of the EquipmentLevel property. If left blank, it will be generated on import.
ModellingRule | The ModellingRule of the object. This column is for information only, and will not be set on import. The ModellingRule is defined in the type this object is a part of.

The instance sheet called SimplePump_Compressor contains a pump:

We see from the RelativePath column that this pump is a child of PumpRoom. The ClassType column is just for information and will be ignored in an import operation. To add more pumps, simply add more rows, and import.

The EquipmentProperty sheet

The EquipmentProperty sheet defines EquipmentProperties (variables) for objects in the Object sheet. Each Object sheet has an associated EquipmentProperty sheet. The EquipmentProperty sheet has the same name as the object sheet, postfixed with '_EP'.

All types in this sheet are, or inherit from, EquipmentPropertyType (ISA95). In addition to standard properties (of type PropertyType) of EquipmentPropertyType, properties from the UA DataAccess definition are added in this sheet, for convenience.

The UA DataAccess properties are:

  • InstrumentRange
  • EURange
  • EngineeringUnits
  • Definition
  • ValuePrecision
  • TrueState
  • FalseState

Any properties outside this scope are located in the _PT sheet.

The EquipmentProperty sheet for instances in SimplePump_Compressor is:

Column Name | Explanation
RelativePath | The relative path from the top node (the node that was selected for export). The RelativePath is a series of DisplayNames separated by '/'.
Reference | The reference to this instance from its parent. Note that no namespace index is needed; it is assumed that references are not extended and therefore are unique by name.
EquipmentPropertyType | UA type of this EquipmentProperty. A type that inherits from EquipmentPropertyType (an ISA95 type).
BrowseName | The BrowseName of the instance.
DisplayName | The DisplayName of the instance. If left blank, the instance will not be created unless the ModellingRule is Mandatory. In our example, AssetAssignment and Running will not be created in an import operation.
NodeId | The NodeId of the instance. If left blank, it will be generated on import.
Description | The Description attribute of the instance.
ModellingRule | The ModellingRule of the instance.
InitialValue | The initial value of the variable. The value the variable will have after an Apis restart. This can be useful for variables that are not mapped.
DataType | The data type of the variable value.
EngineeringUnits | Unit of the variable. See the Unit sheet for all available units.
EngineeringUnitsId | The NodeId of the EngineeringUnits property. If left blank, it will be generated on import.
EURange | Normal operating range for this value.
EURangeId | The NodeId of the EURange property. If left blank, it will be generated on import.
InstrumentRange | The value range that can be returned by the instrument.
InstrumentRangeId | The NodeId of the InstrumentRange property. If left blank, it will be generated on import.
ValuePrecision | The maximum precision that the server can maintain for the value.
ValuePrecisionId | The NodeId of the ValuePrecision property. If left blank, it will be generated on import.
Definition | A vendor-specific, human readable string that specifies how the value is calculated.
DefinitionId | The NodeId of the Definition property. If left blank, it will be generated on import.
StoreToIMS | Store the value to a historian.
StoreToIMSId | The NodeId of the StoreToIMS property. If left blank, it will be generated on import.
FalseState | String to be associated with this value when it is FALSE.
FalseStateId | The NodeId of the FalseState property. If left blank, it will be generated on import.
TrueState | String to be associated with this value when it is TRUE.
TrueStateId | The NodeId of the TrueState property. If left blank, it will be generated on import.
Expression | An expression used to calculate the value.
ExternalItem1 | An external item that the value depends on. There may be more than one ExternalItem. Make a new column for each and name them ExternalItem2 ... ExternalItemN.

The Property sheet

The Property sheet defines properties of the UA type PropertyType. Each Instance sheet has an associated Property sheet. The Property sheet has the same name as the object sheet, postfixed with '_PT'. Some properties are, for convenience, in the Object and EquipmentProperty sheets; the rest of the properties are in this sheet. Properties of objects and properties of their EquipmentProperties are listed in this sheet. We can see from RelativePath in the sheet below that the property 'MyProperty' is a property of object 'Pump_01', and the property 'SomeProperty' is a property belonging to the EquipmentProperty 'OutPressure'.

Column Name | Explanation
RelativePath | The relative path from the top node (the node that was selected for export). The RelativePath is a series of DisplayNames separated by '/'.
Reference | The reference to this property from its parent.
PropertyType | The UA type of the property.
BrowseName | The BrowseName of the property.
DisplayName | The DisplayName of the property. If left blank, the property will not be created unless the ModellingRule is Mandatory.
NodeId | The NodeId of the property. If left blank, it will be generated on import.
Description | The Description attribute of the property.
ModellingRule | The ModellingRule of the property.
Value | The value attribute of the property.
DataType | The data type of the value.

The Unit sheet

The Unit sheet contains all units available on the UA server when the model was exported. Values defined for the EngineeringUnits property are units from this list. In the TMI form, a drop-down that contains all units is available for all EngineeringUnits. You can type only the DisplayName of the unit if you want; the first unit that matches the DisplayName will then be selected (the same DisplayName may appear in several namespaces). The format of the unit code is [eu namespace index]:[displayname][[id]].

Column Name | Explanation
DisplayName | The DisplayName of the unit.
Description | The Description of the unit.
Id | A unique unit id.
EU Namespace | The namespace of the unit.
Code | A code used to define a unit uniquely in the spreadsheet.

TMI Export

Semantic models in the Asset Registry can be exported to a TMI Form. A TMI Form is a human readable representation of a semantic model in an Excel file. The model can be modified and extended in the TMI Form.

In Apis Management Studio (AMS), connect to the Hive instance and browse Information Modelling / Perspectives to find models under Asset Registry:

Browse the Equipment node to locate the model you want to export. Any node beneath the Equipments node can be exported.

Right-click on the node of interest and choose 'Export to Excel':

The following dialog opens:

Press the Browse button to choose a target file, then press the Export button to start the export. While the export is ongoing, the number of handled instances will be updated and various information messages will be visible in the log view.

TMI Import

Semantic models defined in a TMI form can be imported to the Asset Registry. Instances defined in the TMI form will be created if they do not exist, or updated if they do.

The NodeIds in the TMI form will be used for new instances if the namespace requires custom NodeIds. It is not possible to change the NodeId of instances that already exist. If you need to change the NodeId of an existing instance, the instance must be deleted from the Asset Registry (using AMS), then import the TMI form with the new NodeId.

If the NodeId of an instance is empty in the TMI form, a NodeId will automatically be created on import. When supplying custom NodeIds, make sure the NodeIds are unique throughout the entire UA server, or the import will fail.

Use Apis Management Studio (AMS) to connect to Apis Hive, then browse the Asset Registry to where you want to import the model. Right-click the parent node where you want to import your model. The model will be imported beneath the node you select.

A dialog opens:

Click the Browse button to locate the TMI Form.

The model is imported to target namespaces of your choosing. Only instances that belong to these namespaces will be created/updated on import. If your TMI form contains instances in other namespaces, outside the model you are importing, references to these instances will be created.

Press the Import button to start the import process. The progress bar will visualize the import progress and various information messages will be visible in the log view.

TMI Import in Bulk

The power of the TMI Form is that models can be extended rapidly, in bulk, by humans, using the possibilities in Microsoft Excel. Extending the model usually consists of copy-paste and renaming operations. The starting point for building a model in a TMI Form is to first build a small part of the model in the model builder in Apis Management Studio (AMS). This small part should span as much of the model as possible, i.e. contain the types used and one or more paths down to the leaves of the model.

In the following we will extend the PumpRoom model presented in Target Model Identification Form to show how models are extended.

The starting point is the PumpRoom containing one pump, Pump_01. Say we want to extend the model with two more pumps, Pump_02 and Pump_03. We open the sheet for the pumps, 'SimplePump_Compressor'. Now, copy the row for Pump_01, and paste it into the next two rows. Amend the BrowseName and DisplayName, and maybe some other attributes such as the Description. New entities in green frame:

We have added two more pumps; now we need to add the EquipmentProperties and Properties of the new pumps. Following the same approach, we copy/paste the existing entities. This time we must remember to modify the RelativePath to the correct parent:
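The same copy-and-rename pattern can also be scripted. The sketch below is an illustrative Python example using openpyxl; the file name, the row holding Pump_01 and the column positions (BrowseName, DisplayName and NodeId as columns 3, 4 and 5 of the object sheet) are assumptions based on the sheet layout described above. The corresponding rows in the '_EP' and '_PT' sheets can be extended in the same way, remembering to adjust RelativePath to the new parents.

from openpyxl import load_workbook

wb = load_workbook("PumpRoom_TMI.xlsx")       # hypothetical TMI export file
ws = wb["SimplePump_Compressor"]              # the object sheet holding the pumps

template = [cell.value for cell in ws[2]]     # assume row 2 holds Pump_01

for name in ("Pump_02", "Pump_03"):
    row = list(template)
    row[2] = name      # BrowseName
    row[3] = name      # DisplayName
    row[4] = None      # leave NodeId blank so it is generated on import
    ws.append(row)

wb.save("PumpRoom_TMI_extended.xlsx")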

The TMI Form is now ready for import.

Target Model Identification Form OPC UA

This version of the TMI Form supports the standard OPC UA specification.

Apis Management Studio (AMS) is used when exporting/importing TMI forms. The TMI form itself is an Excel spreadsheet (xlsx), which can be manipulated by humans.

When building models in a TMI form, the recommended workflow is to first build a small part of the model in the model builder in AMS using the OPC UA perspective. This small part should span as much of the model as possible, i.e. contain the types used and one or more paths down to the leaves of the model. When exporting this model to a TMI form, the layout of the TMI will be correct and make a good starting point for manually adding entities to the model. After entities are added/changed in the TMI form, the form can be imported to the UA server to update the semantic model.

Read this page for an overview of the TMI Form, then further documentation can be found here:

To export to TMI form, see Target Model Identification Export.

To import from TMI form, see Target Model Identification Import.

To import in bulk, see Import in Bulk.

TMI form explained

In the following, a simple model of a room with a pump is used to explain the workings of the TMI Form. The pump has two variables, pressure and rpm.

Exporting the model to a TMI Form will result in an Excel file with a number of sheets. The sheets contain all the information needed to create or update this model. Objects of the same type are defined in the same sheet. There may be several sheets for the same type, depending on the hierarchy of the model.

Objects may have variables (variables of type BaseDataVariableType or descendants) and/or Properties (Attributes, of type PropertyType) associated with them. The variables of the objects have their own sheet with the same name as the associated object sheet, postfixed with '_VT'. The Properties of objects also have their own sheet with the same name as the associated object sheet, postfixed with '_PT'.

The content of the sheets will now be explained:

The Master sheet

The Master sheet contains information about the UA Types of the objects in the model, how the types are used in the model hierarchy, and in which sheet to find them.

Column Name | Explanation
Type | The UA type, with namespace index, of the objects in this sheet. The number before the type name is the namespace index. The index is defined in the Namespaces sheet. The hierarchy of the model is suggested by spaces between ':' and the type name. If you enter types manually, you need not add spaces; the spaces are for visualization only.
Sheet | Name of the sheet where the objects are defined. This name must be unique and correspond exactly with the name of the sheet. The name is limited to 29 characters (by Excel). The names are auto-generated on export. The prefix of the sheet name reflects the hierarchy of the model ('1-1>' above). In the example above, '1-1>' says that instances in this sheet have parents in sheet 1 (the first 1). The second 1 indicates the instance number (of this type). The sign '>' indicates that the modelling rule is a placeholder (Optional-/MandatoryPlaceholder). If the last sign is '.', it is not a placeholder.
Include in import | Setting this to false will make the importer ignore instances in this sheet. This can be useful if the model is large and you have made changes to only a small part of the model.

The Namespaces sheet

The Namespaces sheet contains the namespaces used in the model. It also contains the namespace index for each namespace. The index is used elsewhere in the TMI as part of the UA types. Namespace indexes may vary from server to server and after restart of servers. This table is used as a map when importing to a server, and it is never necessary to change the index in the file.

Column Name | Explanation
Namespace | The namespace URI.
Index | The namespace index associated with a namespace. The index may vary between servers and server restarts. Never change the Index in a TMI.
EU Namespace | The namespace URI of engineering units.
Index | The namespace index associated with an EU namespace. The index may vary between servers and server restarts. Never change the Index in a TMI.

The Object sheet

There may be many object sheets, with different names. This example shows the sheet called 1>BaseObjectType from our pump room example. Instances of type BaseObjectType will be in this sheet, as defined in the Master sheet. To add more objects of BaseObjectType, simply add rows in this sheet and type in the data for the objects.

Column Name | Explanation
RelativePath | The relative path from the top node (the node that was selected for export). The RelativePath is a series of DisplayNames separated by '/'. See later examples of this. You may reference instances outside of your model; the RelativePath will then start from the Objects node, e.g. Objects/SomeObject... An instance may have multiple parents; in this case, all relative paths are entered, separated by ';'. The Reference column must then also contain all references, separated by ';', in the same order as in RelativePath.
Reference | The reference to this instance from its parent. Note that no namespace index is needed; it is assumed that references are not extended and therefore are unique by name.
BrowseName | The BrowseName of the object.
DisplayName | The DisplayName of the object. If left blank, the object will not be created unless the ModellingRule is Mandatory.
NodeId | The NodeId of the object. If left blank, it will be generated on import.
Description | The Description attribute of the object.
ModellingRule | The ModellingRule of the object. This column is for information only, and will not be set on import. The ModellingRule is defined in the type this object is a part of.

Let's look at another object sheet, 1-1>BaseObjectType:

We see from the RelativePath column that this pump is a child of PumpHouse.

The variable sheet

The variable sheet defines variables for objects in the Object sheet. Each Object sheet has an associated variable sheet. The variable sheet has the same name as the object sheet, postfixed with '_VT'.

All types in this sheet are, or inherit from, BaseVariableType. In addition to standard properties (of type PropertyType) of BaseVariableType, properties from the UA DataAccess definition are added in this sheet, for convenience.

The UA DataAccess properties are:

  • InstrumentRange
  • EURange
  • EngineeringUnits
  • Definition
  • ValuePrecision
  • TrueState
  • FalseState

Any properties outside this scope are located in the _PT sheet (PropertyType sheet).

The variable sheet for instances in 1-1>BaseObjectType is:

Column Name | Explanation
RelativePath | The relative path from the top node (the node that was selected for export). The RelativePath is a series of DisplayNames separated by '/'.
Reference | The reference to this instance from its parent. Note that no namespace index is needed; it is assumed that references are not extended and therefore are unique by name.
Type | UA type of this variable. A type that inherits from BaseVariableType.
BrowseName | The BrowseName of the instance.
DisplayName | The DisplayName of the instance. If left blank, the instance will not be created unless the ModellingRule is Mandatory.
NodeId | The NodeId of the instance. If left blank, it will be generated on import.
Description | The Description attribute of the instance.
ModellingRule | The ModellingRule of the instance.
InitialValue | The initial value of the variable. The value the variable will have after an Apis restart. This can be useful for variables that are not mapped.
DataType | The data type of the variable value.
EngineeringUnits | Unit of the variable. See the Unit sheet for all available units.
EngineeringUnitsId | The NodeId of the EngineeringUnits property. If left blank, it will be generated on import.
EURange | Normal operating range for this value.
EURangeId | The NodeId of the EURange property. If left blank, it will be generated on import.
InstrumentRange | The value range that can be returned by the instrument.
InstrumentRangeId | The NodeId of the InstrumentRange property. If left blank, it will be generated on import.
ValuePrecision | The maximum precision that the server can maintain for the value.
ValuePrecisionId | The NodeId of the ValuePrecision property. If left blank, it will be generated on import.
Definition | A vendor-specific, human readable string that specifies how the value is calculated.
DefinitionId | The NodeId of the Definition property. If left blank, it will be generated on import.
StoreToIMS | Store the value to a historian.
StoreToIMSId | The NodeId of the StoreToIMS property. If left blank, it will be generated on import.
FalseState | String to be associated with this value when it is FALSE.
FalseStateId | The NodeId of the FalseState property. If left blank, it will be generated on import.
TrueState | String to be associated with this value when it is TRUE.
TrueStateId | The NodeId of the TrueState property. If left blank, it will be generated on import.
Expression | An expression used to calculate the value.
ExternalItem1 | An external item that the value depends on. There may be more than one ExternalItem. Make a new column for each and name them ExternalItem2 ... ExternalItemN.

The Property sheet

The Property sheet defines properties of the UA type PropertyType. Each Instance sheet has an associated Property sheet. The Property sheet has the same name as the object sheet, postfixed with '_PT'. Some properties are, for convenience, in the Object and Variable sheets; the rest of the properties are in this sheet. Properties of objects and properties of their variables are listed in this sheet. We can see from RelativePath in the sheet below that the property 'SerialNo' is a property of object 'Pump1'.

Column Name | Explanation
RelativePath | The relative path from the top node (the node that was selected for export). The RelativePath is a series of DisplayNames separated by '/'.
Reference | The reference to this property from its parent.
PropertyType | The UA type of the property.
BrowseName | The BrowseName of the property.
DisplayName | The DisplayName of the property. If left blank, the property will not be created unless the ModellingRule is Mandatory.
NodeId | The NodeId of the property. If left blank, it will be generated on import.
Description | The Description attribute of the property.
ModellingRule | The ModellingRule of the property.
Value | The value attribute of the property.
DataType | The data type of the value.

The Unit sheet

The Unit sheet contains all units available on the UA server when the model was exported. Values defined for the EngineeringUnits property are units from this list. In the TMI form, a drop-down that contains all units is available for all EngineeringUnits. You can type only the DisplayName of the unit if you want; the first unit that matches the DisplayName will then be selected (the same DisplayName may appear in several namespaces). The format of the unit code is [eu namespace index]:[displayname][[id]].

Column Name | Explanation
DisplayName | The DisplayName of the unit.
Description | The Description of the unit.
Id | A unique unit id.
EU Namespace | The namespace of the unit.
Code | A code used to uniquely define a unit in the spreadsheet.

TMI Export

OPC UA semantic models can be exported to a TMI Form. A TMI Form is a human-readable representation of a semantic model in an Excel file. The model can be modified and extended in the TMI Form.

In Apis Management Studio (AMS), connect to the Hive instance and browse Information Modelling / Perspectives to find models under OPC UA:

Browse the Objects node to locate the model you want to export. Any node beneath the Objects node can be exported.

Right-click on the node of interest and choose 'Export to Excel':

The following dialog opens:

Press the Browse button to choose a target file, then press the Export button to start the export. While the export is ongoing, the number of handled instances will be updated and various information messages will be visible in the log view.

TMI Import

Semantic models defined in a TMI form can be imported to the OPC UA information modelling perspective. Instances defined in the TMI form will be created if they do not exist, or updated if they do.

The NodeIds in the TMI form will be used for new instances if the namespace requires custom Nodeids. It is not possible to change the NodeId of Instances that already exist. If you need to change the NodeId of an existing instance, the instance must be deleted (using AMS), then import the TMI form with the new NodeId.

If the NodeId of an instance is empty in the TMI form, a NodeId will automatically be created on import. When supplying custom NodeIds, make sure the NodeIds are unique throughout the entire UA server, or the import will fail.

Use Apis Management Studio (AMS) to connect to Apis Hive, then browse the OPC UA under Information Models to where you want to import the model. Right-click the parent node where you want to import your model. The model will be imported beneath the node you select.

A dialog opens:

Click the Browse button to locate the TMI Form.

The model is imported to target namespaces of your choosing. Only instances that belong to these namespaces will be created/updated on import. If your TMI form contains instances in other namespaces, outside the model you are importing, references to these instances will be created.

Press the Import button to start the import process. The progress bar will visualize the import progress and various information messages will be visible in the log view.

TMI Import in Bulk

The power of the TMI Form is that models can be extended rapidly, in bulk, by humans, using the possibilities in Microsoft Excel. Extending the model usually consists of copy-paste and renaming operations. The starting point for building a model in a TMI Form is to first build a small part of the model in the model builder in Apis Management Studio (AMS). This small part should span as much of the model as possible, i.e. contain the types used and one or more paths down to the leaves of the model.

In the following we will extend the PumpHouse model presented in Target Model Identification Form to show how models are extended.

The starting point is the PumpHouse containing two pumps, Pump1 and Pump2. Say we want to extend the model with two more pumps, Pump3 and Pump4. We open the sheet for the pumps, '1-1>BaseObjectType'. Now, copy the row for Pump2, and paste it into the next two rows. Amend the BrowseName and DisplayName, and maybe some other attributes such as the Description. New entities in green frame:

We have added two more pumps; now we need to add the Variables and Properties of the new pumps. Following the same approach, we copy/paste the existing entities. This time we must remember to modify the RelativePath to the correct parent:

The TMI Form is now ready for import.

Variable Mapping

Variable mapping is the process of setting up or moving mappings of variables, with transformations, for large systems. In the Apis UA Server, variables (that inherit from the UA type BaseDataVariableType) can be mapped to external sources. This means that variables can get their values from other systems, such as other UA servers.

The source values may be represented in another form than the target values. For example, the source value may have a range expressed as a fraction (from 0 to 1) while the target value has a range in % (from 0 to 100), or the target value may be the sum of two source values. This requires a transformation, an Expression, to get the value mapped correctly.
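To make the idea of a transformation concrete, the following Python snippet illustrates the two examples above. This is only a conceptual illustration; the actual transformation is written as an Apis Expression in the Item Expression Editor, not in Python:

def fraction_to_percent(source):
    # Map a source value in the range 0..1 to a target value in the range 0..100.
    return source * 100.0

def sum_of_sources(source_a, source_b):
    # Map two source values to one target value by summing them.
    return source_a + source_b

print(fraction_to_percent(0.42))   # 42.0
print(sum_of_sources(12.5, 7.5))   # 20.0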

Variable mapping allows such operations to be done in bulk. Bulk mapping can be performed by exporting the result of a manually mapped model, then mass-producing new mappings following the exported pattern in external tools such as Excel. The new mapping can then be imported to the same Apis UA server, or to another Apis UA server.

In the following, we describe the processes for exporting transformation expressions, importing transformation expressions, and bulk mapping.

Export Transformation Expressions

Export of Transformation Expressions is initiated by using Apis Management Studio (AMS). Each UA namespace has its own module (ApisSemantics) in Apis Hive. The values for the variables are located on Items on this module. The source item(s) (ExternalItems) and the expression are attributes on these items.

To export source items and expressions, start AMS and connect to the Apis Hive instance. Locate the ApisSemantics module for the namespace you want to export from.

Right-click on the module and choose "Transformation Expressions" -> "Export" in the popup.

You will be asked for a file name to save the export into. The exported file is a tab-delimited text file. This file can be manipulated in a standard text editor or Excel. When importing into Excel, follow the Text Import Wizard to import it as a tab-delimited file.

When imported into Excel, the content looks something like this:

The export file content has columns for the target item names (ItemID), the Expressions and a number of ExternalItems (source items). The numbers on the second row are Apis attribute ids, and must be left untouched if you want to import this file into Apis again.

Import Transformation Expressions

Import of Transformation Expressions is initiated by using Apis Management Studio (AMS). Each UA namespace has its own module (ApisSemantics) in Apis Hive. The values for the variables are located on Items on this module. The source item(s) (ExternalItems) and the expression are attributes on these items.

To import expressions and external items, start AMS and connect to the Apis Hive instance. Locate the ApisSemantics module for the namespace you want to import into.

Right-click on the module and choose "Transformation Expressions" -> "Import" in the popup.

Locate your file in the Open File dialog that appears, and press the Open button.

Bulk Mapping

The power of Variable Mapping is to configure a large number of target variables in bulk. The workflow is to first set up mappings and transformations from a small number of source items to target variables. Then, export the transformation expressions as described in "Export Transformation Expressions".

The exported file now contains a good starting point for mapping variables in bulk. Open the file in Excel or a text editor and use capabilities such as sorting and copy/paste to fill in the missing transformation expressions.

In the following example, an equipment, FC101, is set up with transformation expressions, and we want to set up a second equipment, FC102, based on the work done on FC101. The export file looks like this when imported to Excel:

We can now use Copy/Paste and manually edit the text to extend the content of the file with transformation expressions for equipment FC102:

Manual edit marked in yellow.
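The same extension can be scripted if the number of equipments is large. The sketch below is an illustrative Python example that duplicates the FC101 rows of the exported tab-delimited file as FC102 rows by plain text substitution; the file names are hypothetical, and the first two rows (column headers and Apis attribute ids) are copied unchanged:

import csv

with open("transformation_export.txt", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f, delimiter="\t"))

header, attribute_ids, data = rows[0], rows[1], rows[2:]

# Duplicate each FC101 row as an FC102 row by renaming item ids,
# expressions and external items.
fc102_rows = [[cell.replace("FC101", "FC102") for cell in row] for row in data]

with open("transformation_import.txt", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerows([header, attribute_ids] + data + fc102_rows)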

The file can now be imported as described in "Import Transformation Expressions", and the variables of FC102 will be mapped correctly.

Compare NodeSet or NodeSetChanges Files

The purpose of this function is to compare NodeSet or NodeSetChanges XML files to find the differences. The function only compares nodes and references. To start this function in AMS, select an instance, expand "Information Modelling", click on "Models", then right-click and select "Compare Nodeset files". A form like the one below will be shown.

To use this tool, select the "Nodeset Source File" and "Nodeset Target File" by pressing the Browse buttons. Then select the "Nodeset Changes File" where the result will be written.

The source and target can either be a NodeSet file or a NodeSetChanges file. The result will always be a NodeSetChanges file and consists of the nodes/references one has to add/delete to the source namespace to get the target namespace. If a node exists in both namespaces but has different attribute content, it will appear as a node in the nodestoadd section of the result file. In the Status area of the form, the number of nodes/references that have been added/deleted is displayed.
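For a rough comparison outside AMS, the NodeId sets of two NodeSet2 XML files can also be diffed with a short Python script. The sketch below only compares NodeId attributes; unlike the AMS function, it ignores references and attribute content, and the file names are hypothetical:

import xml.etree.ElementTree as ET

def node_ids(path):
    # Collect the NodeId attribute of every UA node element in a NodeSet2 XML file.
    root = ET.parse(path).getroot()
    return {element.get("NodeId") for element in root.iter() if element.get("NodeId")}

source = node_ids("source.nodeset2.xml")
target = node_ids("target.nodeset2.xml")

print("Nodes to add:   ", sorted(target - source))
print("Nodes to delete:", sorted(source - target))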

Highlight Namespaces

Navigating the information model looking for types and instances in a specific namespace may be hard on large systems. In Apis Management Studio (AMS) there is a feature called "Highlight Namespaces" that makes this easier. After connecting to an instance of Apis Hive, locate the "Information Modelling" node, then expand it to find the "Models" node. Right-click this node and choose "Highlight Namespaces" in the popup.

In the dialog that appears, select one or more namespaces of interest:

Types and Instances belonging to the selected namespaces will now appear in bold under the Models node. See the "ProcessCell1" in the image below:

Namespace Versioning

An OpcUa namespace defined by the Apis Semantics module has parameters describing its version:

  • ModelVersion (id 1028): a human-readable text describing the version.
  • PublicationDate (id 1029): a UTC timestamp; this is the parameter other namespaces will use when checking dependencies.
  • LastModified (id 1030): a UTC timestamp of the last namespace change.

When creating an ApisSemanticsBee, you can select whether the version shall be updated manually or automatically by setting the parameter "Update of uri.metainfo" (id 1027) to either auto or manual.

In manual mode, the ModelVersion and PublicationDate values have to be changed manually. The LastModified value will be updated automatically.

In automatic mode, the ModelVersion cannot be changed. The PublicationDate will be updated automatically every time there is a change in the namespace, such as adding/deleting/changing a node or a reference. LastModified will be updated to the present time (UTC).

A namespace can also have NamespaceMetadata information, defined by the NamespaceMetadataType in OPC UA Part 5 version 1.04, table 21, under Objects/Namespaces/***. If these nodes are defined, this content will also be updated according to the settings.
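If the NamespaceMetadata nodes are defined, the version information can be inspected from any OPC UA client. The sketch below uses the asyncua Python package to list NamespaceUri, NamespaceVersion and NamespacePublicationDate under Server/Namespaces; the endpoint URL is hypothetical, and the metadata properties may not be present for every namespace:

import asyncio
from asyncua import Client, ua

async def main():
    # Hypothetical endpoint; replace with the address of your Apis Hive OpcUa server.
    async with Client("opc.tcp://localhost:4840") as client:
        namespaces = client.get_node(ua.ObjectIds.Server_Namespaces)
        for metadata in await namespaces.get_children():
            try:
                uri = await (await metadata.get_child("0:NamespaceUri")).read_value()
                version = await (await metadata.get_child("0:NamespaceVersion")).read_value()
                published = await (await metadata.get_child("0:NamespacePublicationDate")).read_value()
                print(uri, version, published)
            except ua.UaError:
                # Not every namespace exposes full NamespaceMetadata.
                pass

asyncio.run(main())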

Aggregating Server

Concept

Aggregating OPC UA servers act as proxies for one or more OPC UA servers. This means clients can access the Aggregating server and expect to get the same information as if they accessed the Aggregated server.

Apis supports 3 aggregating server patterns:

Replicating proxy

Selected namespaces of the aggregated OPC UA server(s) are replicated in Apis Hive, which means Apis Hive hosts a copy of the address space of the aggregated server. The data variables of the aggregated server are mirrored as Apis Hive items using OPC UA subscription, so that client subscriptions and read requests to replicated data variables are handled solely by Apis, without involving the aggregated server. History read requests to data variables are handled by Apis or the Aggregated server depending on configuration. Replicating proxy is implemented in Apis in a concept called Namespace Replication.

Federating proxy

The aggregating server maps multiple autonomous UA servers (or UA server namespaces) into a single federated UA server. There is no actual data replication between the aggregated UA servers and the proxy. All client calls are routed and forwarded to the underlying servers, and results / subscriptions are delivered back to the clients via the proxy.

Basic proxy

The aggregating server replicates data variables from the aggregated server(s) using OPC UA subscription. In this pattern, only data variables and their values are replicated. Metadata structures such as objects, object hierarchies and properties are not replicated in the aggregating server, and are hence not accessible to the clients.

Timeseries Caching

In a typical Apis configuration, the collected real-time data will be stored locally in an Apis HoneyStore timeseries database, using an Apis Logger module.

Timeseries on a generic item in APIS

When reading timeseries data on a regular item in the Hive namespace, it is required that the item is stored in a HoneyStore timeseries database, typically by using an Apis Logger module.

Then, the horizon of the timeseries data is decided by the HistoryLength property of the corresponding item in the HoneyStore database. Typically, this HistoryLength property is specified indirectly by the Historylength_X and Historylength_Unit properties of the Apis Logger module responsible for storing the real-time item data.

Timeseries on an OpcUa client item in APIS

When reading timeseries data on an Apis OpcUa client item in the Hive namespace, the read operation may be extended to query the underlying OpcUa server of the Apis OpcUa module.

First, the read operation is handled as if it were a generic item, i.e. reading timeseries data from the corresponding item in the HoneyStore database. If the item is not stored in any local HoneyStore database, or if the local storage did not return data for the complete requested time interval, the read request will be extended to the underlying OpcUa server of the Apis OpcUa module.

Hence, you can keep a shorter cache of timeseries data locally, and keep a longer timeseries storage on the remote OpcUa server location. In order for this to work, the underlying OpcUa server must implement the HistoryRead service. Of course, the Apis Hive OpcUa server implements the HistoryRead service.
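To illustrate what such a history read looks like from a client, the sketch below issues a raw HistoryRead request with the asyncua Python package. The endpoint URL and NodeId are hypothetical; any OPC UA client supporting the HistoryRead service can be used:

import asyncio
from datetime import datetime, timedelta
from asyncua import Client

async def main():
    # Hypothetical endpoint and item; replace with your Apis Hive OpcUa server and node.
    async with Client("opc.tcp://localhost:4840") as client:
        node = client.get_node("ns=2;s=Module1.Item1")
        end = datetime.utcnow()
        start = end - timedelta(hours=1)
        # Raw history for the last hour; the server answers from the local
        # HoneyStore cache and/or the underlying OpcUa server, as described above.
        for value in await node.read_raw_history(start, end):
            print(value.SourceTimestamp, value.Value.Value)

asyncio.run(main())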

Disable Timeseries Caching feature

The feature described above is enabled by default. If you want, you can disable this feature across all items in your Hive instance.

To disable, please create the following registry key/value:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<InstanceName>\TimeSeriesAccess]
"EnableExtendedReads"=dword:00000000

If you create/modify this value, you will have to restart your Hive instance to make it take effect.
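If you prefer to script the change, a minimal Python sketch using the standard winreg module could look like the following. Run it with administrative privileges and replace MyInstance with the name of your Hive instance:

import winreg

instance_name = "MyInstance"  # replace with your Hive instance name
key_path = rf"SOFTWARE\Prediktor\Apis\{instance_name}\TimeSeriesAccess"

# Create the key if it does not exist and set EnableExtendedReads to 0 (disabled).
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "EnableExtendedReads", 0, winreg.REG_DWORD, 0)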

Federating Proxy

To configure ApisHive as a federating OpcUa proxy, the following steps must be performed:

1. Add a "Connection manager" module.

2. Add one "OpcUa connection" item for each aggregated OpcUa server. See the topic Configure Connection Manager for further details.

2. Add one "OpcUaProxy" module for each aggregated OpcUa server, and select the Connection item to use for each proxy in the "OpcUa server" property dropdown list. If any "OpcUa cluster" items are added to the Connection Manager, they will also be selectable from this list.

3. For each OpcUaProxy module, add "Namespace" items for each namespace that should be federated. If any of the federated namespaces should be exposed with a different uri in the aggregating server, the propery "Exposed uri" must be specified on the corresponding Namespace item.

The federated namespaces are now visible in the aggregating OpcUa server, and all OpcUa messages received by ApisHive will be forwarded to the relevant aggregated servers.

When Namespace items are deleted, or their "exposed uri" property is modified, ApisHive must be restarted before the old uris are removed from the namespace array of the OpcUa server.

Configuration Management

Please pick a topic from the menu.

Export Configuration

You can export parts of a configuration to text files. The names of the files are auto-generated. Export the configuration by selecting "Export configuration" in the context menu of the Instance Node. Then select a folder where the exported configuration files will be saved.

Import Configuration

You can import parts of a configuration using tab-separated text files. The format of the files can be found here. Select "Import Configuration" from the context menu of the Instance Node. A dialog appears and you can select multiple files at once. By clicking "Open", the configuration files are read, and the configuration is updated accordingly.

During the import the following window appears, showing the progress and allowing you to abort the import.

Aborting the import stops the import, but WON'T undo the files already imported.

Adding Items From a File

It's possible to add items from a tab-separated file. This is done by opening the "Add items" dialog. This dialog can be opened from the context menu of the Instance Node or the context menu of the Module Node.

By selecting the desired item type, and clicking "File add", a dialog appears for selecting the file to use to create items.

The format of the file is specified here.

See Import Configuration for how to add items from multiple files at once.

Item File Format

When importing items from a text file, the format of the file is of great importance. It is also possible to simultaneously add attributes to the items. To separate item names from their attributes, and to distinguish between different attributes, we use the following file format:

The first line of the file must contain attribute identifiers, as defined in the tables of predefined Apis and OPC attributes. Each attribute ID must be separated with a tabulator. Furthermore, the first attribute ID must always be '0', since the first column in the text file must always contain the item names. The other recognized attributes are predefined OPC attributes in the range 1 - 4999 and predefined Apis attributes in the range 5000 - 5999. As an alternative to the numeric attribute ID, the attribute name is also accepted if it is spelled (case-sensitive) exactly as it appears in a property page for a similar item.

From the second line and down, the actual items with their attributes follows. Each line must start with the item name, then each of the desired attributes follows, separated by tabulators. Each line must contain exactly the same number of values separated by tabulators in order to successfully import item from the file. I.e., the lines in the file must have an entry for each of the attribute IDs included in the first line in the file. Each attribute entry must be separated with a tabulator.

See OPC DA Item attributes and Predefined Apis attributes for a list of valid attributes and their IDs / names.

Some Apis attributes have enumerated values, i.e. you may select from a predefined list of attribute values. Consult the list of enumerated values and matching integers in order to configure these attributes from a file.

An example file adding three items with the OPC attributes description (100) and engineering unit (101) and the Apis attribute Location (5012) will then be as follows, each column separated by tabulators:

0	100	101	5012
BoilerTemp	The temperature of main boiler	C	Boiler room 1
BoilerFlow	The flow through main boiler	m^3/s	Boiler room 1
BoilerVolume	The volume of main boiler	m^3	Boiler room 1

If some of the items in the file don't have a specific value for one or more of the attributes defined in the first line, simply enter two consecutive tab characters to skip that attribute.
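
As a convenience, the sketch below (which is not part of Apis) shows how such a tab-separated item file could be generated programmatically in Python; the output file name is arbitrary and the rows are taken from the example above:

# A minimal sketch that writes an item import file in the tab-separated format
# described above. Attribute IDs: 100 = description, 101 = engineering unit,
# 5012 = Location. The file name "boiler_items.txt" is arbitrary.
header = ["0", "100", "101", "5012"]
items = [
    ("BoilerTemp",   "The temperature of main boiler", "C",     "Boiler room 1"),
    ("BoilerFlow",   "The flow through main boiler",   "m^3/s", "Boiler room 1"),
    ("BoilerVolume", "The volume of main boiler",      "m^3",   "Boiler room 1"),
]

with open("boiler_items.txt", "w", encoding="utf-8") as f:
    f.write("\t".join(header) + "\n")
    for row in items:
        # An empty string in a row produces two consecutive tabs, i.e. that attribute is skipped.
        f.write("\t".join(row) + "\n")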

Server cloning

Server cloning is the process of establishing new Apis server environments based on existing Apis servers. The server cloning functionality may be used to establish new Apis servers in new environments as well as migrating existing Apis servers to new infrastructure.

Server cloning is performed using Apis Backup Agent and Apis Management Studio.

Follow the steps below to clone an Apis server:

Configuration Migration

Configuration migration is the process of moving different parts of an Apis configuration between different servers.

Configuration migration is performed using Apis Backup Agent and Apis Management Studio.

Follow the steps below to migrate parts of an Apis configuration:

How to Partial Restore

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Recovery" node.

Partial Restore

Navigate to a Backup Set found under the "Backup set" node.

Navigate to a Hive instance configuration contained in the Backup set.

Right click on the Hive instance configuration and select "Partial Restore". From the Partial Restore view, follow the steps below:

Select the Modules you wish to restore. Only Modules that are included in the backup set can be selected.

Click on the "Run Partial Restore" button to start the partial restore job.

You are only allowed to run a partial restore job if the option combination is valid. The option combination is validated against the Backup set content.

Any warnings or errors returned from the restore job will show up in the Information part.

Be aware of the order when restoring Modules. Some Module configurations can be dependent on other Module configurations.

  • Modules that create Global Attributes must be restored prior to Modules whose Items use them.
  • Modules that host items which are used as External items must be restored prior to the consuming Modules.
  • Semantic Modules that host Namespace types must be included when restoring, or restored prior to, consuming Semantic Modules.

The selected target Hive instance will be started during partial restore, if it is not running.

Matching target Modules and their items will be replaced with the selected Modules from the Backup set.

It might be necessary to restart data acquisition Modules after restore, in order to re-establish connections to remote servers.

Audit Trail

Configuration Audit Trail will log all configuration operations that can be associated with a user identity, to persistent storage. Such operations are:

  • Changing Apis Hive properties
  • Adding and removing Apis modules
  • Changing Apis Hive and Apis Hive Module properties
  • Adding, removing and renaming Apis Hive items
  • Adding, removing and changing Apis item attributes
  • Apis external item configuration, adding and removing external items and changing expressions.
  • Apis Event Broker configuration, connecting and disconnecting events and commands
  • Semantic model configuration

To enable audit trails in your APIS configuration, you must first enable security. To enable security, please see here: Security

Enable Configuration Audit Trails in Apis Hive instance

To enable configuration audit trails, open the Windows registry editor on the machine where Apis Foundation is installed, and navigate to:


HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<Your Hive Instance>\Security\ConfigAudit

Set the "Enabled" registry value to 1.

Set the "WriteToFile" registry value to 1, enable writing of configuration audit trail events to plain text files.

If you leave the registry value "WriteToFile_Path" empty/blank, the files will be written to the folder:

  • <Install Directory>\Config\<Your Hive Instance>\AuditTrail

Alternatively, override this default behavior by entering a fully qualified path in the "WriteToFile_Path" value.
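
As an illustration only, the registry values above could also be set with a small Python script (run as administrator on the Hive machine); the instance name is a placeholder, and the value types (DWORD for the flags, string for the path) are assumptions:

# A minimal sketch, assuming "Enabled" and "WriteToFile" are DWORD values and
# "WriteToFile_Path" is a string value; adjust if your installation differs.
import winreg

HIVE_INSTANCE = "ApisHive"  # hypothetical instance name; replace with your own
KEY_PATH = rf"SOFTWARE\Prediktor\Apis\{HIVE_INSTANCE}\Security\ConfigAudit"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "Enabled", 0, winreg.REG_DWORD, 1)      # enable configuration audit trail
    winreg.SetValueEx(key, "WriteToFile", 0, winreg.REG_DWORD, 1)  # write events to .audlog files
    # Leave "WriteToFile_Path" empty to use the default folder, or set a fully qualified path:
    winreg.SetValueEx(key, "WriteToFile_Path", 0, winreg.REG_SZ, "")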

File format

The configuration audit trail text files get the file extension ".audlog" and have a maximum size of 25 MB before a new file is created. File names are generated from the current system date and time, expressed in Coordinated Universal Time (UTC).

The file format is a TAB separated file containing lines like this:


<time of event (utc)><user identity><config operation><variable length config operation metadata>...

The files can easily be opened in any editor that can read text files.
Tip: if you open the files in Excel as TAB separated text files, you can use filtering on columns to more easily search for user identities and/or operations.

Example:


2018-09-27 06:28:33,7631785 <ApisConfigAuditFileWriter> Audit trail starting  
2018-09-27 06:46:59,3619059 PREDIKTOR\username ModuleAdded WorkerDEMO  
2018-09-27 06:47:13,8699130 PREDIKTOR\username ItemAdded WorkerDEMO.Sine  
2018-09-27 06:47:33,3106825 PREDIKTOR\username ItemAttributeChanged WorkerDEMO.Sine Amplitude 100  
...  
2019-09-27 06:49:38,0063460 <ApisConfigAuditFileWriter> Audit trail stopping
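
For illustration only, the sketch below (not part of Apis) reads .audlog files from a folder and filters the tab-separated events by user identity; the folder path and user identity are hypothetical, and UTF-8 encoding is an assumption:

# A minimal sketch that parses .audlog files: each event line is tab separated as
# time, user identity, operation, followed by operation metadata.
import glob
import os

def read_audit_events(folder, user_filter=None):
    events = []
    for path in glob.glob(os.path.join(folder, "*.audlog")):
        with open(path, encoding="utf-8") as f:
            for line in f:
                fields = line.rstrip("\n").split("\t")
                if len(fields) < 3:
                    continue  # skip lines that do not carry a full event (e.g. start/stop markers)
                timestamp, user, operation, *metadata = fields
                if user_filter is None or user == user_filter:
                    events.append((timestamp, user, operation, metadata))
    return events

# Hypothetical folder and user identity:
for event in read_audit_events(r"C:\Program Files\APIS\Config\MyHive\AuditTrail",
                               user_filter=r"PREDIKTOR\username"):
    print(event)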

Import Engineering units

This section covers how to import third party Engineering units to the Apis Foundation server.

There are two methods for importing third party Engineering units:

Import Engineering units from csv file

Add Engineering unit namespace

Import Engineering units from csv file

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Engineering Units" node.

Import Engineering units from csv file

  • Right-click on the "Engineering units" node and select "Import Engineering Units".
  • Select a comma separated file with Engineering units.
  • Type a unique Engineering unit namespace uri.
  • Click "Run import".
  • An Engineering unit namespace node will appear below the "Engineering Units" node.
  • Map Engineering units when done.

csv file format

A comma separated file with the following fields:

Field          Type      Optional   Description
UnitId         Int 32    No         An id that is unique within the Engineering unit namespace.
DisplayName    string    No         A short name for the unit, typically the abbreviation.
Description    string    No         The full name for the unit.
Quantity       string    Yes        The name of the unit quantity.

Example:

The first line must contain the header.


UnitId, DisplayName, Description, Quantity
1,g,gram,mass
2,kg,kilogram,mass
24,bar,bar,pressure
25,Pa,Pascal,pressure
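
As an illustration, the sketch below (not part of Apis) writes a csv file in the format described above; the file name is arbitrary and the rows are taken from the example:

# A minimal sketch that writes an Engineering unit csv file with the required header.
import csv

units = [
    (1,  "g",   "gram",     "mass"),
    (2,  "kg",  "kilogram", "mass"),
    (24, "bar", "bar",      "pressure"),
    (25, "Pa",  "Pascal",   "pressure"),
]

with open("my_units.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["UnitId", "DisplayName", "Description", "Quantity"])  # header line
    writer.writerows(units)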

Add Engineering unit namespace

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Engineering Units" node.

Add Engineering unit namespace

  • Right-click on the "Engineering units" node and select "Add Engineering unit namespace".
  • Type the Engineering unit namespace uri, click ok.
  • An Engineering unit namespace node will appear below the "Engineering Units" node.
  • Go to Map Engineering units and add third party Engineering units.

Map Engineering units

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Engineering Units" node.

Map third party Engineering units to Apis Engineering units

  • You may add, edit and remove third party Engineering units.
  • Right-click on an Engineering unit namespace node and select "Engineering Unit Mappings".
  • In the Engineering Unit Mappings view, right-click on an Engineering unit and click "Select Apis Engineering Unit". Select an Apis Engineering unit from the dialog.
  • You may Add Custom Apis Engineering units if needed.

Activate Engineering unit namespace

  • In the Engineering Unit Mappings view, click the "Activate" button. The Engineering unit namespace will be activated only if all mappings are valid.

Trigger reload Engineering units

  • Go to the "Hive" instance in Apis Management Studio, right-click, and select "Reload Units".

Third party Engineering units must be mapped to Apis Engineering units in order to support unit conversion.

Only activated Engineering unit namespaces will be loaded by Apis Hive.

Apis Engineering unit mappings are persisted to the file <INSTALLDIR>/CustomEngUnit/CustomEngUnitMap.xml.

Add Custom Apis Engineering units

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Engineering Units" node.

Add a custom Apis Engineering unit

  • Right-click on the "Engineering units" node and select "Apis Engineering Units".
  • Click the "Add" button.
  • Type the required values in the dialog and click "Add".

Custom Apis Engineering units will have the Built-in property value set to "False".

Custom Apis Engineering units are persisted to the file <INSTALLDIR>/CustomEngUnit/CustomEngUnit.xml.

Edge Management services

Configuration Repository

The Configuration Repository (CR) is a self-contained service for managing backups of:

  • Ua Namespace nodeset files
  • Apis Variable mapping files
  • Apis Hive configuration files

CR can compare different versions of backups to each other.

The CR API is implemented in gRPC.

Installation

CR is installed as a part of Apis Foundation. Remember to tick 'APIS ConfigRepository Server' in the install wizard. The service will typically be installed in the directory: 'C:\Program Files\APIS\ConfigRepositoryServer'

CR runs as a Windows service. The default port is 8237. The port can be changed in the configuration file, appsettings.json (Ioc.xml on older versions), see below. The configuration file can be found in the installation directory.

CR depends on PostgreSQL as its datastore (install it if not present). Connection details for the PostgreSQL database must be entered in the configuration file, see below. CR needs PostgreSQL credentials in the appsettings.json file to work correctly.

Backup files are placed on the disk. The location for these files can be set in the configuration file, see below.

When doing a namespace diff, the namespace may have dependencies on other namespaces. For the diff to succeed, CR needs access to these namespaces. Nodeset files for these other namespaces can be placed in the folder defined in the configuration file.

Locate the configuration file in the install directory and set up the PostgreSQL settings.

appsettings.json:

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "AllowedHosts": "*",
  "Kestrel": {
    "EndpointDefaults": {
      "Protocols": "Http2"
    },
    "Endpoints": {
      "Grpc": {
        "Protocols": "Http2",
        "Url": "http://0.0.0.0:8237" // CR Service endpoint
      }
    }
  },
  "ServiceConfig": {
    "RootStore": "C:\\Data\\ConfigServiceStore",	 // Place on disk to store backup files
    "TypelibStore": "C:\\Data\\TypelibStore",	// Location for additional namespaces needed to perform namespace diff.
    "DBUser": "postgres",		// PostgreSQL username
    "DBHost": "localhost",		// PostgreSQL host
    "DBPort": 5432,				// PostgreSQL port
    "DBName": "cfgdb",			// PostgreSQL database
    "DBPwd": "tokamap"			// PostgreSQL password
  }
}

Internal structure

The structure inside CR is organized as a hierarchy

  • Level 1: Folders
  • Level 2: Stores
  • Level 3: Revisions
  • Level 4: Content

Folders

Folders can be nested to the user's choosing. Only folders can be at the root of the repository. Users are free to organize the folder hierarchy as they see fit.

Stores

Stores are containers for Revisions (backups). A store is of one specific type, Namespace, Variable Mapping or Hive Configuration.

  • A Store of type Namespace can store nodeset2.xml files only.
  • A Store of type Variable Mapping can store Apis hive config files only (text).
  • A Store of type Hive Config can store Hive configuration backups only (zip).

Revisions

Revisions can be added to Stores only and must be of the same type as the Store. A Revision is a placeholder for an actual backup (Content). The revision contains metadata for the backup, like name, description, version, date and more.

A Revision's type must correspond to the type of the Store it belongs to. I.e., a Revision for a namespace can only belong to a Store for namespaces.

Content

Content is an actual backup file. Content is always associated with a Revision.

A Content file might be:

  • nodeset2.xml file for an Ua Namespace
  • Hive config text file for Variable Mapping
  • Hive backup zip file for Hive Configuration

Compare Revisions

Revisions in the same Store can be compared to one another. One revision is selected as the base, the other as the "compare to" revision. The result will be sets of Added/Deleted/Modified from the viewpoint of the base. The diff results for the three different types are quite different; see the API description below for the details.

Remote connection

To receive connections from remote computers, you need to open the port for CR in the firewall, see: Configuration Repository - Open Firewall

Best practices

Take some time to lay out a good folder naming scheme and hierarchy for your organization. A good layout makes it easy to find your backups later.

Make one store for each Namespace; don't put different namespaces in the same store.

Make one store for each namespace's Variable Mappings.

Make one store for each Hive for your Hive Configurations.

API reference

Configuration Repository API

Common proto messages

gRPC API Configuration Repository

The API is defined in the proto file ConfigRepository.proto, which depends on PrediktorCommon.proto.

Configuration Repository endpoints are:

  // Convenience method to test for server connection
  rpc ping (google.protobuf.Empty) returns (google.protobuf.StringValue);
  // Get info about the API, version, vendor etc.
  rpc info(google.protobuf.Empty) returns (ConfigRepositoryDetails);

  // Create a new node beneath another node. 
  rpc createNode (NodeRequest) returns (NodeResponse);
  // Update a node. 
  rpc updateNode (NodeRequest) returns (PrediktorCommon.Result);
  // Get subnodes for a node
  rpc getNodes (GetNodesRequest) returns (GetNodesResponse);
  // Get details for a node
  rpc getNode (NodeId) returns (NodeDetail);
  // Delete a node
  rpc deleteNode (NodeId) returns (PrediktorCommon.Result);

  // Upload a binary content file to a Revision
  rpc uploadBinaryContent (stream UploadContentRequest) returns (PrediktorCommon.Result);
  // Upload a text content file to a Revision
  rpc uploadTextContent (stream UploadContentRequest) returns (PrediktorCommon.Result);

  // Download a binary content file from a Revision
  rpc downloadBinaryContent (NodeId) returns (stream PrediktorCommon.ByteStream);
  // Download a text content file from a Revision
  rpc downloadTextContent (NodeId) returns (stream PrediktorCommon.StringArray);

  // Compare two Namespace revisions
  rpc compareNamespaces (NamespaceCompareRequest) returns (stream NamespaceCompareResult);
  // Compare two Hive Config revisions
  rpc compareHiveConfig (HiveConfigCompareRequest) returns (stream HiveConfigCompareResult);
  // Compare two Variable Mapping revisions
  rpc compareVariableMappings (RevisionCompareRequest) returns (stream CompareVariableMappingsResult);

  // Subscribe to get events on Add/Delete/Update
  rpc subscribe (google.protobuf.Empty) returns (stream CrEventInfo);
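
As an illustration only, the sketch below shows how a client might call some of these endpoints from Python, assuming stubs have been generated from the proto files with grpcio-tools; the generated module names, the stub class name and the use of an empty parent id for the repository root are assumptions, not part of the documented API:

# A minimal sketch, assuming Python stubs generated from the proto files, e.g. with:
#   python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. ConfigRepository.proto PrediktorCommon.proto
import grpc
from google.protobuf import empty_pb2
import ConfigRepository_pb2 as cr            # assumed module name
import ConfigRepository_pb2_grpc as cr_grpc  # assumed module name

channel = grpc.insecure_channel("localhost:8237")    # default CR port
stub = cr_grpc.ConfigRepositoryStub(channel)         # assumed stub/service name

print(stub.ping(empty_pb2.Empty()))                  # convenience connection test
details = stub.info(empty_pb2.Empty())
print(details.version, details.vendor)

# List the nodes directly beneath the repository root; an empty parent id for the root is an assumption.
response = stub.getNodes(cr.GetNodesRequest(parentNodeId=cr.NodeId(id=""), pageNo=0, pageSize=100))
for node in response.subNodes:
    print(node.id.id, node.name)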

Configuration Repository messages are:


// Id of a node
message NodeId {
	string id = 1;
}

// Array of nodeIds
message NodeIds {
	repeated NodeId ids = 1;
	PrediktorCommon.Result result = 2;
}

message NodeRequest { 
	NodeId parentId = 1;	// Id of parent (required by createNode)
	NodeId id = 2;			// Id of node (required by updateNode)
	string name = 3;		// Name of node
	string description = 4;	// Description of node
	oneof data {			// Define either NodeData for Folder/Store or RevisionData for a revision
		NodeData nodeData = 7;			// When creating/updating a Folder or Store
		RevisionData revisionData = 8;	// When creating/updating a Revision
	}
}

// createNode response
message NodeResponse{
	NodeId nodeId = 1;					// Id of newly created node
	PrediktorCommon.Result result = 2;	// Result of create operation. Check result.success to see if the operation was successful.
}

message GetNodesRequest {
	NodeId parentNodeId = 1;	// Subnodes for this node will be returned
	int32 pageNo = 2;			// Pagination page number
	int32 pageSize = 3;			// Pagination page size
}

// getNodes response
message GetNodesResponse {
	NodeId parentId = 1;				// Id of parent node
	repeated NodeDetail subNodes = 2;	// Array of detailed info of subnodes
	int32 prevPage = 3;					// Pagination, prevPage number
	int32 nextPage = 4;					// Pagination, nextPage number
	int32 pageSize = 5;					// Pagination, page size
	int32 totalCount = 6;				// Pagination, total number of pages
	PrediktorCommon.Result result = 7;	// Result of operation
}

// Details about a node
message NodeDetail {
	NodeId parentId = 1;				// Parent node id. Will be empty if root.
	NodeId id = 2;						// Id of node
	string name = 3;					// Name of node
	string description = 4;				// Description of node
	google.protobuf.Timestamp date = 5;	// Date when node was created
	PrediktorCommon.Result result = 6;	// Result of operation
	oneof data {						// nodeData or revisionData will be returned, depending on the nature of the node
		NodeData nodeData = 7;			// Node is Folder or Store.
		RevisionData revisionData = 8;	// Node is Revision
	}
}

// Data about a Folder or a Store
message NodeData {
	NodeTypeEnum type = 1;
}

// Data about a Revision
message RevisionData {
	RevisionTypeEnum type = 1;
	string version = 2;			// Version of revision
	string modelVersion = 3;	// ModelVersion, applicable for Namespace Nodeset
	google.protobuf.Timestamp publicationDate = 4; // Namespace Nodeset PublicationDate
	string storeId = 5;			// Id of the Store this Revision belongs to
}

//Node types Folders and Stores
enum NodeTypeEnum {
	undefined = 0;
	folder = 1;					// Folder
	namespaceNodesetStore = 2;	// Namespace Nodeset Store
	hiveConfigurationStore = 3;	// Hive Configuration store
	namespaceDatabaseStore = 4;	// Namespace database (deprecated)
	variableMappingsStore = 5;	// Variable Mapping Store
}

//Revision types
enum RevisionTypeEnum {
	undefinedRevision = 0;
	namespaceNodesetRevision = 1;	// Namespace Nodeset
	hiveConfigurationRevision = 2;	// Hive Configuration
	namespaceDatabaseRevision = 3;	// Namespace database (deprecated)
	variableMappingsRevision = 4;	// Variable Mapping
}

// Information about the Config Repository service
message ConfigRepositoryDetails {
	string version = 1;		// Current version
	string minVersion = 2;	// Minimum version supported by this service
	string vendor = 3;		// Name of the vendor of this service
	string url = 4;			// Where to find information about the vendor
}

// Upload file content message
message UploadContentRequest {
	oneof request {
		NodeId revisionId = 1;						// Id of revision to upload to
		PrediktorCommon.ByteStream binaryChunk = 2;	// A chunk of the file for a binary file
		PrediktorCommon.StringArray textChunk = 3;	// A chunk of the file for a text file
	}
}

// A request to compare two Namespace revisions
message NamespaceCompareRequest {
	NodeId baseRevisionId = 1;				// Base revision, compareRevision is compared to this one
	NodeId compareRevisionId = 2;			// This revision is compared to the base revision
	RevisionTypeEnum revisionType = 3;		// Revision type
	bool excludeValueSourceAttributes = 4;	// Whether to exclude attributes describing where a data variable gets its value from (recommended value = true)
}

// A request to compare two Revisions
message RevisionCompareRequest {
	NodeId baseRevisionId = 1;
	NodeId compareRevisionId = 2;
}

// A request to compare two Hive Configuration Revisions
message HiveConfigCompareRequest {
	NodeId baseRevisionId = 1;		// Base revision, compareRevision is compared to this one
	NodeId compareRevisionId = 2;	// This revision is compared to the base revision
	string InstanceName = 3;		// Name of Hive instance
}

// Event message describing an Create/Delete/Update change in the repository.
message CrEventInfo {
	PrediktorCommon.EventType eventType = 1;	// Type of event, Create/Delete/Update
	repeated NodeId pathToNode = 2;				// Path of ids from root, down to the node this event is for.
	oneof sourceNodeType {
		NodeTypeEnum nodeType = 3;				// Type if affected node is Folder or Store
		RevisionTypeEnum revisionType = 4;		// Type if affected node is Revision
	}
}

// Ua NodeId and Browsename
message IdAndBrowsename {
	string id = 1;
	string browsename = 2;
}

// Where a value is present
enum ConfigValuePresence {
	sourceAndDestination = 0;	// Present in both
	onlySource = 1;				// Present only in source
	onlyDestination = 2;		// Present only in destination
}

// Representation of a value diff
message ConfigValueDiff {
	string						name = 1;				// Name of value
	ConfigValuePresence			valuePresence = 2;		// Where it is present
	string						sourceValue = 100;		// Value it has in source
	string						destinationValue = 101;	// Value it has in destination
}

// Representation of a change in Windows Registry
message RegistryConfigValueDiff {
	string						path = 1;	// Path in Registry
	ConfigValueDiff				diff = 10;	// The diff
}

// Array of Windows Registry diffs
message RegistryConfigValueDiffs {
	repeated RegistryConfigValueDiff registryValues = 1;
}

// Representation of a diff in a Hive attribute
message HiveAttributeValueDiff {
	int32						attributeId = 1;	// Id of attribute
	ConfigValueDiff				diff = 10;			// The diff
}

// Array of diffs in Hive attributes
message HiveAttributeValueDiffs {
	repeated HiveAttributeValueDiff attributes = 1;
}

// Representation of diff for a Hive module item
message HiveModuleItemConfigDiff {
	string						name = 1;			// Name of item
	ConfigValuePresence			itemPresence = 2;	// Where it is present
	HiveAttributeValueDiffs		attributeDiffs = 10;// Attribute diffs
}

// Array of diffs for Hive module items
message HiveModuleItemConfigDiffs {
	repeated HiveModuleItemConfigDiff Items = 10;
}

// Representation of diffs for a Hive Module
message HiveModuleConfigDiff {
	oneof diff {
		string						name = 1;			// Name of Module
		ConfigValuePresence			valuePresence = 2;	// Where it is present
		RegistryConfigValueDiffs	registryDiffs = 10;	// Diffs in Windows Registry
		HiveAttributeValueDiffs		moduleDiffs = 20;	// Diffs in the modules attributes
		HiveModuleItemConfigDiffs	itemDiffs = 30;		// Diffs for the modules items
	}
}

// Array of diffs for Hive Modules
message HiveModuleConfigDiffs {
	repeated HiveModuleConfigDiff modules = 1;
}

// Message for errors
message ErrorInfo {
	string error = 1;
	string errorDetails = 2;
}

// Representation of a diff for a Hive
message HiveConfigCompareResult {
	oneof result {
		string						name = 1;			// Name of Hive
		bool						hasDifferences = 2; // True if there are differences present
		HiveModuleConfigDiffs		moduleDiffs = 10;	// Diffs for module
		RegistryConfigValueDiffs	registryDiffs = 20; // Diffs for Windows Registry
		ErrorInfo					errorInfo = 30;		// Error presentation
	}
}

// Representation of a diff for an attribute
message ModifiedAttribute {
	uint32 attributeId = 1;	// Id of attribute
	string oldValue = 2;	// Old value (base)
	string newValue = 3;	// New value (compared)
}


// Representation of a diff for some value
message ModifiedValue {
	string valueName = 1;	// Name
	string oldValue = 2;	// Old value (base)	
	string newValue = 3;	// New value (compared)
}

// Representation of a namespace
message ModifiedNamespace {
	int32 idx = 1;				
	bool autoUpdate = 2;
	string modelVersion = 3;
	google.protobuf.Timestamp publicationDate = 4;
	google.protobuf.Timestamp lastModified = 5;
}

// Where a namespace index is present in a compare result
message NsIndexPresence {
	uint32 nsIndex = 1;		// Namespace index
	int32 presentIn = 2;	// 0 = Both, 1 = Compare, 2 = Base
}

// Ua QualifiedName
message QualifiedName {
	uint32 namespaceIndex = 1;
	string name = 2;
}

// Ua LocalizedText
message LocalizedText {
     SemanticsLocale locale = 1;
     string text = 2;
}

// Ua SemanticsLocale
message SemanticsLocale {
	string language = 1;
    string region = 2;
}

// Representation of an Ua Reference 
message ReferenceDescription {
	string sourceId = 1;
	QualifiedName sourceBrowseName = 2;
	string referenceId = 3;
	QualifiedName referenceBrowseName = 4;
	string targetId = 5;
	QualifiedName targetBrowseName = 6;
	bool isHierarchical = 7;
}

// Map of namespaces and where they are present in a compare result
message NamespacesMap {
	map<string, NsIndexPresence> namespacesMap = 1;
}

// Modified nodes in a Namespace compare result
message ModifiedNode {
	IdAndBrowsename idAndBrowsename = 1;
	ModifiedAttributes modifiedAttribute = 2;
}

// Array of ModifiedNodes
message ModifiedNodes {
	repeated ModifiedNode modifiedNodes = 1;
}

// Array of modifiedAttributes
message ModifiedAttributes {
	repeated ModifiedAttribute modifiedAttrs = 1;
}

// Array of ModifiedValues
message ModifiedValues {
	repeated ModifiedValue modifiedVals = 1;
}

// Array of ReferenceDescriptions
message ReferenceDescriptions {
	repeated ReferenceDescription referenceDescrs = 1;
}

// Array of SemTypeMembers
message SemTypeMembers {
	repeated SemTypeMember members = 1;
}

// Ua semantic type member
message SemTypeMember {
	int64 parentIndex = 1;
	bool isRoot = 2;
	string id = 3;
	SemProperties properties = 4;
}

// Ua properties (a subset)
message SemProperties {
	SemPropertyLocalizedText displyNameProp = 1;
	SemPropertyQualifiedName qualifiedNameProp = 2;
	SemPropertyLocalizedText descriptionProp = 3;
}

// Ua QualifiedName property
message SemPropertyQualifiedName {
	int32 id = 1;
	QualifiedName browsename = 2;
	bool isReadOnly = 3;
}

// Ua LocalizedText property
message SemPropertyLocalizedText {
	int32 id = 1;
	LocalizedText value = 2;
	bool isReadOnly = 3;
}

// Map of modified namespaces, key is namespace uri
message ModifiedNameses {
	map<string, ModifiedNamespace> modNamespaces = 1;
}

// Map of modified namespaces and the modified values
message ModifiedNamespaceValues {
	map<string, ModifiedValues> modNamespacesValues = 1;
}

// The result of a namespace compare
message NamespaceCompareResult {
	oneof result {
		NamespacesMap namespaces = 1;					// Map of namespaces and which compare sets (base, compare) are present
		ModifiedNodes modifiedNodes  = 2;				// Nodes (instances, types) that are modified
		ReferenceDescriptions newReferences = 3;		// References that are added
		ReferenceDescriptions deletedReferences = 4;	// References that are deleted
		ModifiedNameses newNamespaces = 5;				// Namespaces that are added
		ModifiedNameses deletedNamespaces = 6;			// Namespaces that are deleted
		ModifiedNamespaceValues modifiedNamespaces = 7;	// Namespaces that are modified
		SemTypeMembers newNodes = 8;					// Nodes (instances, types) that are added
		SemTypeMembers deletedNodes = 9;				// Nodes (instances, types) that are deleted
		bool hasDifferences = 10;						// True if there are differences between the two namespace versions
		bool success = 11;								// True if operation was successful
		string error = 12;								// Error message if operation was unsuccessful
		string errorDetails = 13;						// Detailed error message (may be empty) if operation was unsuccessful
	}
}


// The result of a Variable Mapping compare
message CompareVariableMappingsResult {
	oneof result {
		DiffTextModel old = 1;		// Diffs for the base version of the file
		DiffTextModel new = 2;		// Diffs for the compare version of the file
		bool hasDifferences = 3;	// True if there are differences between the two mapping files
		bool success = 10;			// True if operation was successful
		string error = 11;			// Error message if operation was unsuccessful
	}
}

// Diff result for a text file
message DiffTextModel {
	repeated DiffTextPiece lines = 1;
	bool hasDifferences = 2;
}

// Diff result for a line in a text file 
message DiffTextPiece {
	ChangeTextType type = 1;
	int32 position = 2;
	string text = 3;
	repeated DiffTextPiece subPieces = 4;
}

// Change type for a line in a text file
enum ChangeTextType {
	unchanged = 0;
	deleted = 1;
	inserted = 2;
	imaginary = 3;	// Not present
	modified = 4;
}
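
Building on the same assumed stubs, the sketch below illustrates how the streamed oneof results of compareNamespaces could be consumed; the revision ids are hypothetical placeholders:

# A minimal sketch, assuming the same generated Python stubs as in the earlier example.
import grpc
import ConfigRepository_pb2 as cr
import ConfigRepository_pb2_grpc as cr_grpc

stub = cr_grpc.ConfigRepositoryStub(grpc.insecure_channel("localhost:8237"))

request = cr.NamespaceCompareRequest(
    baseRevisionId=cr.NodeId(id="rev-base"),    # hypothetical revision ids
    compareRevisionId=cr.NodeId(id="rev-new"),
    revisionType=cr.namespaceNodesetRevision,
    excludeValueSourceAttributes=True,          # recommended value per the comment above
)

# compareNamespaces streams NamespaceCompareResult messages; each message carries
# one member of the oneof, so check which one is set.
for msg in stub.compareNamespaces(request):
    kind = msg.WhichOneof("result")
    if kind == "hasDifferences":
        print("Differences found:", msg.hasDifferences)
    elif kind == "newNodes":
        for member in msg.newNodes.members:
            print("Added node:", member.id)
    elif kind == "deletedNodes":
        for member in msg.deletedNodes.members:
            print("Deleted node:", member.id)
    elif kind == "error":
        print("Compare failed:", msg.error)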

Configuration Repository - Open Firewall

To allow remote clients to connect to CR, an inbound rule must be added to the firewall.

This tutorial is for the Windows Defender Firewall.

In the Windows taskbar search field, type 'Windows Defender Firewall' to open the config tool.

Choose Advanced settings to open a new dialog.

Select Inbound Rules in the tree, then press New Rule on the right.

Select Port

Choose TCP and specify the port CR is using, default is 8237.

Allow the connection

Apply the rule

Give the rule a name and press Finish.

The rule is now created, but applies to all applications on the machine. To make it apply to CR only, go to the main window. Locate the new rule and right-click and select Properties:

Select Programs and Services, then This program. Browse to the CR exe-file: %ProgramFiles%\APIS\ConfigRepositoryServer\Apis.ConfigRepository.Service.Host.exe

You should now be able to connect to this CR from a remote machine.

If the connection fails, you can use Windows PowerShell and run the following command to check whether you can reach the server and whether the port is open:

tnc <ip-address> -Port 8237

gRPC Common proto messages

Shared messages for Prediktor gRPC services. The proto file is PrediktorCommon.proto.


// Message of bytes, typically used for streaming binary content
message ByteStream {
	bytes bytes = 1;
}

// Array of strings
message StringArray {
	repeated string arr = 1;
}

// Array of results
message ResultArray {
	repeated Result results = 1;
}

// The result of an operation
message Result {
	bool success = 1;			// True if operation was successful
	string error = 2;			// Error message if operation was unsuccessful
	string errorDetails = 3;	// Detailed error message (may be empty) if operation was unsuccessful
	int32 errorCode = 4;		// Error code if operation was unsuccessful
}

// Wrapper for a boolean value
message BooleanReply {
	bool value = 1;
}

// Message for a string result
message StringResult {
	string value = 1;
	Result result = 2;
}

message BrowseFilter {
	string value = 1;
}

message InstanceIds {
	bool success = 1;
	string error = 2;
	repeated InstanceId ids = 3;
}

message InstanceId {
	string parentId = 1;
	string id = 2;
}
message InstanceInfos {
	repeated bool success = 1;
	repeated string error = 2;
	repeated InstanceInfo infos = 3;
}

message InstanceInfo {
	InstanceId id = 1;
	string name = 2;
	string fullName = 3;
	string description = 4;
    bool isRemovable = 5;
	bool canHaveChildren = 6;
}

message PropertyCollection {
	repeated Properties PropertiesArray = 1;
	bool Success = 2;
	string Error = 3;
}

message Properties {
	repeated Property PropArray = 1;
	bool Success = 2;
	string Error = 3;
}

message Property {
	string Name = 1;
	google.protobuf.Any Value = 2;
	bool Readonly = 3;
	string Description = 4;
	uint32 Id = 5;
	bool Success = 6;
	string Error = 7;
}

message PropertiesWriteRequest {
	string Id = 1;
	Properties Properties = 2;
}

// Types of events
enum EventType {
	undefined = 0;
	created = 1;	// Node created
	updated = 2;	// Node updated
	deleted = 3;	// Node deleted
}

Comprehensive Overview of AMASH CLI Tool

AMASH is a command-line interface (CLI) tool designed to support client-server-related services. It connects to a server remotely using the gRPC framework, so that various actions can be executed via commands within the CLI, eliminating the need for users to connect to the server side remotely, such as through a remote desktop connection.

Usage:

amash.exe -h -s [server address] -p [port] [command [args...]]

To view available commands, use the following command in the command line:

amash.exe -h

Apis Management Services Agent Shell 1.0
--------
Usage: amash.exe -hsp [command [args...]]

  -h         Show this help
  -s <uri>   Connect to URI
  -p <port>  Specify port number (default is 7823)

Commands:
  GetApisVersion
  HsDefrag
  HsStart
  HsStatus
  HsCancel
  HsListDb
  HsListProcess
  HsListLogs
  HsDownloadLogs

The help output provides guidance on setting up the URI and specifying the port number, followed by the list of nine commands that can be used.

Command Descriptions

GetApisVersion: Prints out the current Apis version from the server side.

Example Usage GetApisVersion:

amash.exe -s localhost GetApisVersion

9.15.7.312

HsDefrag: Initiates the Apis repair process for each discovered .dat file on the server, processing one .dat file at a time using the settings configured in ../APIS/ManagementServer/appsettings.json. If valid database path(s) are identified, the command returns true; otherwise, it returns false.

Example Usage HsDefrag:

amash.exe -s localhost -p 7823 HsDefrag

Returns true/false

HsStart: Initiates a repair process for a selected Honeystore database via the ApisHSTrendRepair tool and returns the Process ID (PID). By entering "HsStart -h", the user can display additional options that can be enabled:

amash.exe -s localhost HsStart -h

Run ApisHSTrendRepair on remote host.

  -inspect              Inspect Only! No files will be modified or repaired!

  -auto                 Automatically repair trendfiles.
                        May delete overlapping trend data!

  -serial               Repair trendfiles in serial,
                        instead of parallel in multiple treads.

  -force                Repair even though Repaired.log
                        file exists and directory has been repaired previously.

  -verbose              Verbose output.

  -dir <Directory>      Folder to be repaired.
                        Includes subfolders

  -numthreads:N         The number of threads (N) to run in parallell, if not -serial is specified.
                        N must be in the range[0, 256] (0 means same as default => sets N equal to
                        number of logical threads on given hardware.

Example Usage HsStart:

amash.exe -s localhost -p 7823 HsStart -inspect -auto -serial -force -verbose -dir "X:\exampleFolder.dat" -numthreads:N

Process started with PID 25324

HsStatus: Returns the repair status of a Honeystore database that is currently undergoing repair.

Example Usage HsStatus:

amash.exe -s localhost -p 7823 HsStatus 25324

25324 current status:   xx.x% done

HsCancel: Triggers a graceful shutdown of a running process on the host if the given PID is found.

Sets the cancellation flag of the process to true.

amash.exe -s localhost HsCancel 24040  

HsListDb: Retrieves a list of Honeystore databases and their corresponding folder paths on the server side.

Example Usage HsListDb:

amash.exe -s localhost -p 7823 HsListDb

Name                    Filepath
---------------------------------
DatabaseExample1        C:\Something1\DatabaseExample1.dat
DatabaseExample2        C:\Something2\DatabaseExample2.dat
DatabaseExample3        C:\Something3\DatabaseExample3.dat

HsListProcess: Retrieves a list of all processes currently undergoing repair, along with those that have been repaired successfully or unsuccessfully.

Example Usage HsListProcess:

amash.exe -s localhost -p 7823 HsListProcess

CURRENT
PID       Start                LastUpdate           Runtime          Output                Filepath
---------------------------------------------------------------------------------------------------
24040     2024-02-16 10:33     2024-02-16 10:33     0hrs 0min        17.4% done            C:\Something3\DatabaseExample3.dat

COMPLETE
PID       Start                Stop                 Runtime          Output                Filepath
---------------------------------------------------------------------------------------------------
36384     2024-02-16 10:18     2024-02-16 10:19     0hrs 1min        100.0% done           "C:\exampleFiles\example.dat"

CRASHED
No current processes running

HsListLogs: Lists all Repair/Inspect logs from each database.

amash.exe -s localhost HsListLogs

List of log files:
C:\Something\RepairResults_2024-02-07T0930.log
C:\Something2\RepairResults_2024-02-07T0930.log
C:\Something3\RepairResults_2024-02-15T1236.log

HsDownloadLogs: Facilitates the downloading of {"RepairResults*.log", "RepairErrors*.log", "InspectResults*.log", "InspectErrors*.log"} files located on the server side. After executing the command, the user specifies the path on the client side where the files are to be downloaded.

Example Usage HsDownloadLogs:

amash.exe -s localhost -p 7823 HsDownloadLogs "C:\exampleDownloadFolder.zip"

File downloaded successfully!

Apis High Availability

Introduction

The Apis High Availability concept is designed on the basis of the principles described in OPC UA Part 4 (Services). Apis supports non-transparent redundancy with hot failover mode, meaning that the nodes of the cluster do not exchange information or state, but operate on a standalone basis, continuously connected to the underlying systems to update internal state and history. In addition to the general principles of OPC UA Client/Server redundancy described in OPC UA Part 4, Apis features concepts for config synchronization and history synchronization based on proprietary Apis technology and definitions. The high availability concept can be used to achieve both redundancy and load balancing.

OPC UA non-transparent redundancy

Non-transparent redundancy means that clients themselves identify what servers are available in the redundant server set. Servers expose information which tells the clients what modes of failover the server supports, together with the current state (service level) of the server and endpoint information of the other servers in the cluster. This information allows the clients to determine what actions they may need to take to accomplish failover.

OPC UA hot failover mode

All Servers in the redundant server set are powered on and are up and running. In scenarios where Servers acquire data from a downstream device, such as a PLC, all servers are actively connected to the downstream device(s) in parallel. The Servers have minimal knowledge of the other servers in their group and function independently. When a server fails or encounters a serious problem, its service level drops, which allows the clients to select another server in the server set. On recovery, the server returns to the redundant server set with an appropriate service level to indicate that it is available.

Apis Clustering

An Apis High Availability Node consists of an Apis Hive Instance with associated event databases and trend databases (Honeystore databases and Honeystore service). An Apis High Availability Cluster is defined by ApisHAGovernor (singleton Apis module) modules configured in the Apis Hive nodes (instances) constituting the cluster. Each node of the cluster is, in normal situations, connected to underlying sources (redundant or non-redundant), exposes real-time data and history from the underlying sources, and logs and stores time series and events from the underlying sources to the databases associated with the node. The ApisHAGovernors are aware of their peers in the other nodes of the Apis Cluster and govern the synchronization of historical data between the nodes at node startup.

Inbound interfaces

For inbound interfaces, failover is handled by the OPC UA Client Modules and the Connection Manager module, which monitor the service levels of connected redundant sources. The Apis OPC UA client module relies on the Connection Manager bee for connection information. The Connection Manager bee maintains ServerURI arrays from the underlying servers and selects the proper server connection based on the service level exposed by the different servers in the cluster.

Outbound interface

On the outbound OPC UA Server interface, the service level of the Apis Hive instance is exposed, together with connection information to the peers of the cluster. The Apis OPC UA server exposes OPC UA redundancy related concepts such as RedundancySupport, ServiceLevel and ServerURI array. This allows connected clients to failover to other nodes in the Apis High Availability Cluster. It is up to the deployment project to define the algorithm for computing service level.
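
The algorithm for computing the service level is deployment-specific; the sketch below is purely illustrative (not Apis functionality) of how a deployment project might derive an OPC UA ServiceLevel byte from source connectivity and history synchronization state, with all inputs, weights and thresholds being assumptions:

# Illustrative only: one possible service-level computation for a cluster node.
def compute_service_level(sources_connected: int, sources_total: int,
                          history_synchronized: bool) -> int:
    """Return a ServiceLevel in the range 0-255 (higher means healthier)."""
    if sources_total == 0 or sources_connected == 0:
        return 1                      # barely usable; clients should prefer another node
    ratio = sources_connected / sources_total
    level = int(200 * ratio)          # degrade with lost source connections
    if history_synchronized:
        level += 55                   # a fully healthy node approaches 255
    return max(1, min(level, 255))

# Example: 3 of 4 sources connected, history gap not yet closed -> 150
print(compute_service_level(3, 4, False))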

Synchronization of configuration

The configuration of the cluster nodes is synchronized as a manually triggered action, where the configuration is distributed from the config master node to the other nodes in the cluster. When configuring or applying changes to the cluster, one of the nodes is promoted to config master. The config master node is then configured (either directly using AMS or import mechanisms, or by migrating config from an engineering server). When the config master is configured/reconfigured, the other nodes are cloned from the config master node. Cloning of cluster nodes is based on the server cloning pattern using the Backup-Restore features of Apis.

Synchronization of historical data

Trend log and Event log agents are responsible, on behalf of the governor (ApisHAGovernor module), for keeping the trend log and event log databases of the cluster nodes similar. The trend logs and event logs of the cluster nodes are not expected to be bit-equal, but the goal is to be able to extract similar logs from both servers and avoid major log gaps due to cluster node downtime. The log agents keep track of a LastKnownGoodHistoryTime at all times, and update this whenever the history is considered good. On startup of a cluster node after a certain downtime, the cluster node will immediately start to collect real-time events from the sources and contact other servers in the cluster to close the trend log and event log gap between LastKnownGoodHistoryTime and the time of startup.

The data synchronization is limited to synchronizing history databases on startup of cluster nodes, to fill in the gap in history since the last time the node was fully operative. Other types of gap synchronization, e.g. as a result of downtime of separate inbound interfaces or synchronization of manually entered or altered data, are expected to be supported in later product versions.

Synchronize configuration

This section shows how to synchronize configurations for High Availability clusters.

Configuration synchronization is performed using Apis Backup Agent and Apis Management Studio.

Follow the steps below to synchronize nodes in an Apis High Availability cluster:

  • How to Backup the configuration master cluster node.
  • How to Import a Backup Set to the configuration slave cluster node.
  • How to Restore the Backup set on the configuration slave cluster node.
  • Change Overridable Values during the restore process. The following Overridable values should be changed for the HAGovernor module:
    • Property ID 1800 - ServerPort: Set to the port that should be used for this cluster node.
    • Property ID 2000 - HiveInstance: Set to a value that points to the other Hive instances in the cluster.

There is no need to back up or restore history data during configuration synchronization, as history data will be synchronized by the HAGovernor on startup of the cluster node.

Disaster Recovery

Disaster Recovery is configured using Apis Backup Agent and Apis Management Studio.

Follow the steps below to perform Disaster Recovery:

How to Backup

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Recovery" node.

Backup

Right click on the "Backup sets" node and select "Backup" from the menu. From the Backup view, follow the steps below:

Select a backup root folder or optionally specify a full path for a Backup Set. If you select a backup root folder, the full path to the backup set will be generated.

Select the services you wish to back up. The following services are supported:

Apis Hive, Apis Chronical, Apis Honeystore and Apis OPC UA Namespace Server.

You can specify if configuration and/or history data should be included in the backup.

You can specify which Hive instances should be included in the backup. The option "*" means all Hive instances found on the machine.

Click on the "Run backup" button to start a backup job.

You are only allowed to run a backup job if the option combination is valid.

Any warnings or errors returned from the backup job will show up in the Information part.

A backup set will appear under the "backup set" node when the backup job is finished executing.

Schedule Backup

Right click on the "Scheduled Backup" node and select "Scheduled Backup" from the menu. From the Scheduled Backups view, follow the steps below:

Select the services you wish to back up. The following services are supported:

Apis Hive, Apis Chronical, Apis Honeystore and Apis OPC UA Namespace Server.

You can specify if configuration and/or history data should be included in the Backup set.

You can specify which Hive instances should be included in the backup. The option "*" means all Hive instances found on the machine.

The following schedules are supported:

One time, Hourly, Daily, Weekly, Monthly and custom Cron expression.

Specify date time settings according to the schedule.

Click on the "Add Schedule Backup" button to add the backup job.

The Apis Backup Agent will back up configurations for both running and stopped Apis services.

How to Import a Backup Set

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Recovery" node.

Import a Backup set

If restoring on a new environment, you need to import the Backup Set.

Backup Sets can be moved between machines via the operating system file explorer. Place the backup set in a designated folder on the target machine.

A Backup set is stored in its own root folder and contains a root backup file and several sub folders with backup data. Be sure to select the root folder with all its content when moving the backup set with the operating system file explorer.

Right click on the "Backup sets" node and select "Add Backup Set" from the menu. Browse to the Backup set root file, and click "Add". The Backup set will show up under the "Backup sets" node in Apis Management Studio.

You may explore the Backup set configuration content by browsing in Apis Management Studio.

How to Restore

Access the service "apis://localhost" from Apis Management Studio and navigate to the "Recovery" node.

Restore

Right click on a Backup Set found under the "Backup set" node and select "Restore". From the Restore view, follow the steps below:

Select the services you wish to restore. Only services that are included in the backup set can be selected.

You can specify if configuration and/or history data should be restored.

If you have selected the Hive service option, you must specify which Hive instance should be restored. Only one Hive instance can be selected per restore job.

Click on the "Run Restore" button to start the restore job.

You are only allowed to run a restore job if the option combination is valid. The option combination is validated against the Backup set content.

Any warnings or errors returned from the restore job will show up in the Information part.

If restoring several Hive instance configurations and Honeystore data, it is best practice to restore Honeystore data first, then the Hive instance configurations.

The selected services will be restarted when executing the restore job.

When restoring data, the Apis Honeystore and Apis Chronical EventServer databases on the target machine will be replaced with content from the Backup set.

How to set a Custom Service Port

There are situations where altering the default service port becomes necessary (the default port is set to 50051). This adjustment can be made in the configuration file located at "\..\APIS\appsettings.json". The modification is demonstrated in the snippet below.

appsettings.json:

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "AllowedHosts": "*",
  "Kestrel": {
    "EndpointDefaults": {
      "Protocols": "Http2"
    },
    "Endpoints": {
      "Grpc2": {
        "Protocols": "Http2",
        "Url": "http://localhost:50051"		//Bare service port to change
      }
    }
  }
}

Additionally, it's necessary to update the corresponding port parameter in Apis Management Studio, in an XML file called "ioc.xml" within the AMS configuration folder.

ioc.xml:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  .
  .
  .
    <component id="apis">
      <parameters>
        <forceRemote>false</forceRemote>
		<backupAddress>127.0.0.1:50051</backupAddress>	//Corresponding service port to change
      </parameters>
    </component>
    .
	.
	.
</configuration>

Security

Please refer to the Apis Security Server documentation for guidance on how to configure the Security Server.

Enable security in Apis Foundation

To enable security, open the Windows registry editor on the machine where Apis Foundation is installed, and navigate to:


HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<Your Hive Instance>\Security

Set the "Enabled" registry key to 1.

Ensure that the "ServerAddress" registry key targets the Apis Security Server address.

Please note that security is enabled per Apis service instance.

Troubleshooting

This section covers various troubleshooting guides, post-configuration tasks, and how-tos. Please pick a topic from the menu.

View Log files

When troubleshooting Apis Services it is useful to inspect trace log files. Each Apis Service has its own log file, which is typically found in:


[INSTALLDIR]/APIS/Logs

or

[INSTALLDIR]/APIS/<ServiceName>/Logs

Use Apis Management Studio and go to:

  • "File" -> "Import Log Files"
  • Browse to the log file and click "Open"

This will open the Log View.

Run Apis on an account that does not have interactive log-in rights

Introduction

It is fully possible to run Apis Foundation services on user accounts that don't have interactive login rights.

However, the requirements for the user account running Apis Foundation are:

  • The user has local administrative rights.
  • Log on as a batch job is allowed in local policy security settings.
  • Log on as a service is allowed in local policy security settings.

Deny interactive logon for Apis Service Accounts

There are slight differences in how interactive logon is restricted, depending on whether the computer is a member of a domain or not; this is described in detail below.

Deny interactive logon when the computer is a member of a domain

  1. In Active Directory, create an OU such as “Deny Interactive Logon” for storing your Service Account Users.
  2. Create a Security Group which will hold all the Service Account users. Call it something meaningful such as “Denied Interactive Logon Users”.
  3. Create the User to be used as a Service Account and give them the required rights - try and avoid giving Domain Admin where possible. Add information about what this service account is used for in the description field.
  4. Move this user to the “Deny Interactive Logon” OU and add it to the “Denied Interactive Logon Users” group. Open Group Policy Management. Create a new GPO and link it at the Domain level. Again, call it something meaningful such as “Lab Service Accounts Deny Interactive Logon”.
  5. Edit the Group Policy, under Computer Configuration/Windows Settings/Security Settings/Local Policies/User Rights Assignment.
  6. Add the “Denied Interactive Logon Users” Security Group to “Deny log on locally” and “Deny log on through Terminal Services”. If the “Log on as a batch job” and “Log on as a service” policies are defined, add the “Denied Interactive Logon Users” group to these policies.

Deny interactive logon when the computer is standalone, not a member of a domain

  1. Create a Group which will hold all the Service Account users. Call it something meaningful such as “Denied Interactive Logon Users”.
  2. Create the User to be used as a Service Account and give them the required rights. Add information about what this service account is used for in the description field.
  3. Edit the Local Security Policy, under Security Settings/Local Policies/User Rights Assignment.
  4. Add the “Denied Interactive Logon Users” Group to “Deny log on locally” and “Deny log on through Terminal Services”.
  5. Add the “Denied Interactive Logon Users” group to the “Log on as a batch job” and “Log on as a service” policies.

Troubleshooting OPC Communication DCOM and Firewall issues

When experiencing disruption in communication, first check the Log View in Apis Management Studio for any messages related to your problem, for example messages containing:

Message contains                                 Symptom
Access is denied. (0x80070005)                   Access denied, usually indicates DCOM security misconfiguration
The RPC server is unavailable. (0x800706BA)      RPC errors can indicate Windows firewall security misconfiguration, or network obstacles in general
The remote procedure call failed. (0x800706BE)

OPC enumerator problem

When the security configuration of the remote computer is incomplete, the OPC server list will be empty when browsing for OPC servers on the remote computer, and you might get error message(s) in the Log View in Apis Management Studio.

DCOM security

A message like this in the Log View in Apis Management Studio indicates that the problem is likely DCOM security related rather than firewall related; the remote server says “Access denied”:


Failed to create OPC Server Lister object on 10.100.86.125.

As a result, OPC servers might not be available in the list of servers to choose from. Make sure OPCENUM.EXE is properly registered and configured on the server machine; consider both DCOM security and opening the Firewall for OPCENUM.exe.

Or, you can enter the CLSID of your OPC server directly into the server property.

Error return: Access is denied. (0x80070005)

Let’s assume in this case that the local client is running on the “System account”, meaning that Anonymous logon must have access rights to the remote computer and to the OpcEnum process on the remote computer.

Solution:

Check the computer wide limits for Anonymous logon on the remote computer, as well as the access rights on the OpcEnum process.

Computer wide limits

On the OPC server computer, start Component Services, browse to My Computer, right-click and select Properties. On the COM Security tab, in the Access Permissions section, press Edit Limits and assure that Anonymous logon has Remote Access. If ANONYMOUS LOGON does not exist in the list, it must be added.

Repeat for Launch and activation permissions.

OPC enumerator access rights

Still in Component Services, browse to OpcEnum, right-click and select Properties. On the Security tab, press the Edit button in the Access permissions section and assure that Anonymous logon has Remote access. If ANONYMOUS LOGON does not exist in the list, it must be added.

Repeat for Launch and activation permissions.

If you changed any of the settings, the OpcEnum service must be restarted for the changes to take effect.

Firewall

A message like this in the Log View in Apis Management Studio indicates that the problem is likely firewall or network related; there is no answer from the remote server.


Failed to create OPC Server Lister object on 10.100.86.125.

As a result, OPC servers might not be available in the list of servers to choose from. Make sure OPCENUM.EXE is properly registered and configured on the server machine; consider both DCOM security and opening the Firewall for OPCENUM.exe.

Or, you can enter the CLSID of your OPC server directly into the server property.

Error return: The RPC server is unavailable. (0x800706BA)

Solution:

The firewall must be opened for the OpcEnum process.

There are two alternatives for configuring this: a script or the firewall control panel.

Script

From an elevated command prompt, run the following commands:

netsh advfirewall firewall add rule name="Allow OpcEnum" dir=in program="C:\Windows\SysWOW64\opcenum.exe" action=allow

netsh advfirewall firewall add rule name="Allow OpcEnum" dir=out program="C:\Windows\SysWOW64\opcenum.exe" action=allow

Note that the OpcEnum installation path may differ on your system; adjust the path in the commands accordingly.

Firewall control panel

On the OPC server computer, open Control Panel -> Windows Firewall -> Advanced settings -> New Rule. Select Program and press Next, then enter the path to the OpcEnum executable, e.g. “C:\Windows\SysWOW64\OpcEnum.exe”, and press Next.

Select Allow the connection, then Next.

Apply the rule to all networks, then Next.

Give the rule a proper name, like “Allow OpcEnum”, and press Finish.

The Windows firewall will now allow connections to the OpcEnum process.

OPC DA/HDA access problems

When the security settings on the remote computer are incompletely configured, you will not be able to connect to the remote OPC server. Item browsing is then unavailable, and you might get error messages in the Log View in Apis Management Studio.

DCOM security on the remote server


»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed to create OPC server, Prediktor.ApisOPCServer.1, on 10.100.86.125.

Error return: Access is denied. (0x80070005).

This message indicates that the problem is DCOM security related: the remote server answers “Access denied”.

Let’s assume in this case that the local client is running under the “System account”, meaning that Anonymous Logon must have access rights to the remote computer and to the Prediktor.ApisOPCServer.1 process on that computer.

Solution:

Check the computer-wide limits for Anonymous Logon on the remote computer, as well as the access rights on Prediktor.ApisOPCServer.1.

Computer wide limits

See how to set computer-wide limits in the previous section.

OPC server access rights

Still in Component Services, in this case browse to ApisHive (the OPC server), right-click and select Properties, then select the Security tab.

In this case the OPC server (ApisHive) is using default permissions, so we have two choices:

• Change to customized permissions, and follow the same procedure as in the OPC enumerator access rights section.

• Keep the default. The advantage of keeping the default is that, if several OPC server instances run on the same computer, the access rights can be set in one place.

In this example we choose to keep the default. Close the ApisHive Properties dialog, browse to My Computer, right-click and select Properties, select the COM Security tab, and in the Access Permissions section press Edit Default. Assure that Anonymous Logon has Remote Access.

Repeat for Launch and Activation Permissions; assure that Anonymous Logon has remote launch and activation permissions.

If you changed any of the settings, the OPC server (ApisHive) service must be restarted for the changes to take effect.

Windows Firewall


ALARM from OPC

»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed to create OPC server, Prediktor.ApisOPCServer.1, on 10.100.86.125.

Error return: The RPC server is unavailable. (0x800706BA).

As in the OpcEnum case, this message indicates that the problem is likely firewall related: there is no answer from the remote server.

Solution:

The firewall must be opened for the ApisHive process. Follow the procedure in the OpcEnum firewall configuration above, but in this case open the firewall for ApisHive ("<install dir>\Bin\ApisHive.exe").
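For reference, the firewall can also be opened from an elevated command prompt, mirroring the OpcEnum script above; a minimal sketch, assuming Apis is installed in C:\Program Files\APIS (adjust the path to your <install dir>):

netsh advfirewall firewall add rule name="Allow ApisHive" dir=in program="C:\Program Files\APIS\Bin\ApisHive.exe" action=allow
netsh advfirewall firewall add rule name="Allow ApisHive" dir=out program="C:\Program Files\APIS\Bin\ApisHive.exe" action=allow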

OPC server callback Firewall


ALARM from OPC/opcda://10.100.86.125/Prediktor.ApisOPCServer.1 [Primary]

»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed calling IOPCDataCallback::Advise - IOPCDataCallback! Error return: The RPC server is unavailable. (0x800706BA).

This message indicates that the problem is likely firewall related: there is no answer from the remote server, because the server tries to call back to the client but hits the firewall.

Solution:

The firewall on the local client computer must be opened for the ApisHive process. Follow the procedure in the OpcEnum firewall configuration above, but in this case open the firewall for ApisHive ("<install dir>\Bin\ApisHive.exe").

OPC server callback access rights


ALARM from OPC/opcda://10.100.86.125/Prediktor.ApisOPCServer.1 [Primary]

»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed calling IOPCDataCallback::Advise - IOPCDataCallback! Error return: Access is denied. (0x80070005).

This message indicates that the problem is related to DCOM security for callbacks: the remote server tries to call back to the client but gets “Access denied”.

In this case the server is running under the “OPCServerUser” account, meaning that when calling back to the client it must have access rights to the local computer and to the process running the client as well (Prediktor.ApisOPCServer.1).

On the local computer:

Assure that OPCServerUser exists, with the same password as the corresponding user on the remote server.

Assure that OPCServerUser has remote access rights in the computer-wide limits.

Assure that OPCServerUser has remote access rights to the client process, in this case ApisHive, through the default access permissions.

If you changed any of the computer-wide settings, the OPC server (ApisHive) service must be restarted for the changes to take effect.

How to set DCOM security Computer wide limits for a specific user

Start Component Services, browse to My Computer, right-click, select Properties, and select the COM Security tab. In the Access Permissions section, press the Edit Limits button and assure that the specific user has Local and Remote Access.

Repeat for Launch and activation permissions.

Run Apis Management Studio as a user with limited rights

Limitations:

  • Apis Foundation must be installed by a user with local administrator rights.
  • Apis Foundation (Windows) services cannot be started or stopped by users with limited rights; this is a restriction of the operating system.

The following tasks must be fulfilled:

  • Give the user appropriate DCOM rights.
  • Give the user appropriate registry rights.
  • Restart Honeystore and ApisHive services.

The examples below show how to set up the operating system to allow a “standard” user to run Apis Management Studio and connect to ApisHive and ApisHoneystore instances.

In this example:

  • The “standard” user with limited rights is named AMSUser (Apis Management Studio User)
  • The operating system is Windows Server 2016 (the procedure is the same on other systems; the dialogs look slightly different)
  • The computer is not a member of a domain

Give the user appropriate DCOM rights

In principle, the AMSUser must have DCOM access rights to the ApisHive instance(s) and to Apis Honeystore. This can be done in several ways in the DCOM configuration, through default settings, groups, etc. Here is one of several possible procedures:

Start DCOM configuration: in Component Services, open the properties for ApisHive, and in the Security tab edit the Access Permissions.

Add the AMSUser user and give it Local Access permission.

Still in Component Services, select ApisHoneystore Properties / Security / Access Permissions.

Add the AMSUser user and give it Local Access permission.

Still in Component Services, select My Computer / COM Security / Access Permissions / Edit Default.

Add the AMSUser user and give it Local Access permission.

Give the user appropriate registry rights

Give the user appropriate rights to the configuration part of the registry

Open registry editor, navigate to the following keys in order:

  • HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor
  • HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Eventlog\Application

Right-click and select Permissions.

Add the AMSUser user and give it Full Control rights.

Setting registry permissions for the COM part of the registry

The user needs read/query access to the COM part of the registry, where information about COM classes is stored.

Add special permissions for HKEY_CLASSES_ROOT\CLSID

Open registry editor, navigate to: HKEY_CLASSES_ROOT\CLSID

Right-click and select Permissions.

Press Add, and specify the AMSUser user.

Select the added user, and press the Advanced button

In the next dialog, select the AMSUser user again, and click the Edit button

Click “Show advanced permissions”

Make sure at least the permissions shown above are granted, and do NOT check “Only apply these permissions to objects and/or containers within this container”.

Press OK on the 3 open dialogs.

Restart Honeystore and ApisHive Services

Restart the Honeystore and ApisHive services to assure that the DCOM security setting changes take effect.
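The restart can also be done from an elevated command prompt; a minimal sketch, assuming the Windows service names are ApisHive and ApisHoneyStore (verify the actual names in the Services console):

net stop ApisHoneyStore
net start ApisHoneyStore
net stop ApisHive
net start ApisHive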

Run Apis as a user with limited rights

This procedure is for information only and is neither supported nor recommended.

Running Apis Foundation services under a user without local administrative rights should only be done in extraordinary circumstances.

Install Apis from a user with administrator rights.

When finished, fulfill the following tasks:

  1. Change the service Log On As account
  2. Change Identity in DCOM
  3. Give the user appropriate DCOM rights
  4. Give the user appropriate registry rights
  5. Give the user appropriate file system rights

Change the service Log On As account

Start the Services console, and on the Log On tab of the ApisHive service, select This account and type in the user (in this case, user) and the password for the user.
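The same change can be scripted; a minimal sketch using sc from an elevated command prompt, assuming the Windows service name is ApisHive and using placeholder credentials (the space after obj= and password= is required by sc):

sc config ApisHive obj= ".\user" password= "YourPassword"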

Change Identity in DCOM

Start DCOM configuration, and in the Identity tab of the Apis Hive property window, select This user and type in the user (in this case, user) and the password for the user.

Press Apply.

Give the user appropriate DCOM rights

Still in Component Services, for ApisHive, in the Security tab edit the Launch and Activation Permissions.

Add the user and give it Local Launch and Local Activation permission.

Repeat for the Access and Configuration Permissions.

Give the user appropriate registry rights

Open the registry editor, navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor, right-click and select Permissions.

Add the user and give it Full Control rights.

Give the user appropriate file system rights

In Windows Explorer, navigate to the installation directory of ApisHive, for instance C:\Program Files\APIS.

Add the user and give it Full Control.
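The same grant can be scripted; a minimal sketch using icacls from an elevated command prompt, assuming the example install directory above and the example account name "user":

icacls "C:\Program Files\APIS" /grant user:(OI)(CI)F /T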

Run Apis as a domain user with limited rights

Install Apis from a user with administrator rights, and perform this procedure as a user with administrator rights and access to Active Directory (AD).

Fulfill the following tasks:

  1. Change the service Log On As account
  2. Change Identity in DCOM
  3. Give the user appropriate DCOM rights
  4. Give the user appropriate registry rights
  5. Give the user appropriate file system rights
  6. Check domain group policy for user and computer running Apis
  7. Restart Honeystore and ApisHive Services

The examples below show how to set up ApisHive to run as the standard domain user Apis1 in the domain prediktor.

Change the service Log On As account

Start the Services console, and on the Log On tab of the ApisHive service, select This account and type in the user (in this case, prediktor\Apis1) and the password for the user.

Change Identity in DCOM

Start DCOM configuration, and in the Identity tab of the Apis Hive property window, select This user and type in the user (in this case, prediktor\Apis1) and the password for the user.

Press Apply.

Give the user appropriate DCOM rights

Still in Component Services, for ApisHive, in the Security tab edit the Launch and Activation Permissions.

Add the prediktor\Apis1 user and give it Local Launch and Local Activation permission.

Repeat for the Access and Configuration Permissions.

Still in Component Services, select ApisHoneystore Properties / Security / Access Permissions.

Add the prediktor\Apis1 user and give it Local Access permission.

Still in Component Services, select My Computer / COM Security / Launch and Activation Permissions / Edit Default.

Add the prediktor\Apis1 user and give it Local Launch and Local Activation permission.

Give the user appropriate rights to the configuration part of the registry

Open registry editor, navigate to:

  • HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor

Right-click and select Permissions.

Add the prediktor\Apis1 user and give it Full Control rights

Repeat for:

  • HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Eventlog\Application

Setting registry permissions for the COM part of the registry

Hive needs read/query access to the COM part of the registry, where information about COM classes is stored.

Add special permissions for HKCR\CLSID

Press Add, and specify the service user (in this case, prediktor\Apis1).

Select the added user, and press the Advanced button

In the next dialog, select the service user again, and click the Edit button

Make sure at least the permissions shown above are granted, and do NOT check “Apply these permissions to objects and/or containers within this container only”.

Press OK on the 3 open dialogs. Now the Hive will be able to run as a regular user.

Give the user appropriate file system rights

In Windows Explorer, navigate to the installation directory of ApisHive, for instance:

  • C:\Program Files\APIS

Add the prediktor\Apis1 user and give it Full Control.

Check domain group policy for user and computer running Apis

In the domain group policy, check registry access for the service user in the policy group to which the computer belongs.

The following is not fully verified: on the x64 version of Apis, it seems that the user must have full access to HKEY_CLASSES_ROOT\CLSID.

Assure that you have a reliable backup of the system before starting this procedure.

Manually Backup and Restore Apis configuration

Manually copying Apis Hive configurations

A secure method to copy an Apis Hive configuration from one computer to another, or simply to keep a backup of the configuration, is to copy the configuration files and registry settings. By using this method, copying and restoring configurations can be automated by scripting.

This method also describes various upgrade scenarios.

This method requires basic knowledge of Windows registry settings, and of how to export and import Windows registry keys.

The configuration location

The current version only supports 64-bit, but the 32-bit information is included to support upgrades to 64-bit.

The procedure varies slightly depending on the bitness (32/64) of the operating system and of Apis Foundation. There are mainly three configuration storage types: registry settings, binary files, and XML configuration files.

The location of registry Apis configuration

The registry holds information regarding basic functionality of ApisHive, HoneyStore and configuration file location of module configuration.

  • Default location of the Apis registry configuration:
    • HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor
  • For a 32-bit application on a 64-bit operating system:
    • HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Prediktor

Apis module configuration files

These files hold information regarding module properties and items. By default, the file name is the same as the module/database name:

  • <ModuleName>.acd - module binary configuration file (old format)
  • <ModuleName>.ans - module binary configuration file
  • <DatabaseName>.acdb - database binary configuration file (old format)
  • <DatabaseName>.ansdb - database binary configuration file

The default location of Apis module configuration:

  • <Install Directory>\Config\<INSTANCENAME> - module configuration files
  • <Install Directory>\Config\ApisHoneyStore - database configuration files

Event Historian

If the Event Historian (Chronical) is enabled, the configuration is stored by default in:

  • <Install Directory>\Chronical\<INSTANCENAME>

Apis servers xml configuration files

These files hold information regarding advanced functionality of ApisHive and HoneyStore not found in the registry.

64-bit Apis Foundation:

  • <Install Directory>\Bin
  • ApisHiveX64.exe.config
  • ApisHiveX64.AppSettings.config
  • ApisHoneystoreX64.exe.config
  • ApisHoneystoreX64.AppSettings.config

32-bit Apis Foundation:

  • <Install Directory>\Bin
  • ApisHive.exe.config
  • ApisHive.AppSettings.config
  • ApisHoneystore.exe.config
  • ApisHoneystore.AppSettings.config

Upgrade paths

From operating system bitness | To operating system bitness | From Apis Foundation bitness | To Apis Foundation bitness | Copy procedure | Restore procedure
32 | 64 | 32 | 64 | 1 | 4
64 | 64 | 32 | 64 | 2 | 5
64 | 64 | 64 | 64 | 3 | 6

Copy Apis Hive configuration

1. Copy 32-bit Apis Hive configuration on 32-bit operating system

a. Copy all files from <Install Directory>\Config and if Event Historian (Chronical) is enabled copy all files from <Install Directory>\Chronical\<INSTANCENAME>

b. Copy ApisHive.AppSettings.config and ApisHive.exe.config from <Install Directory>\Bin

c. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<INSTANCENAME> to a file.

2. Copy 32-bit Apis Hive configuration on a 64-bit operating system

a. Copy all files from <Install Directory>\Config and if Event Historian (Chronical) is enabled copy all files from <Install Directory>\Chronical\<INSTANCENAME>

b. Copy ApisHive.AppSettings.config and ApisHive.exe.config from <Install Directory>\Bin

c. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\<INSTANCENAME> to a file.

3. Copy 64-bit Apis Hive configuration

a. Copy all files from <Install Directory>\Config and if Event Historian (Chronical) is enabled copy all files from <Install Directory>\Chronical\<INSTANCENAME>

b. Copy ApisHiveX64.AppSettings.config and ApisHiveX64.exe.config from <Install Directory>\Bin

c. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<INSTANCENAME> to a file.
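The copy steps can be scripted. The following is a minimal sketch of copy procedure 3 from an elevated command prompt, assuming the default instance name ApisHive, the install directory C:\Program Files\APIS, and D:\ApisBackup as the backup destination (all three are examples and must be adjusted to your system):

rem Copy module and database configuration files (add the Chronical folder if the Event Historian is enabled)
xcopy "C:\Program Files\APIS\Config" "D:\ApisBackup\Config\" /E /I /Y
rem Copy the Hive xml configuration files
copy "C:\Program Files\APIS\Bin\ApisHiveX64.exe.config" "D:\ApisBackup\"
copy "C:\Program Files\APIS\Bin\ApisHiveX64.AppSettings.config" "D:\ApisBackup\"
rem Export the registry part of the configuration
reg export "HKLM\SOFTWARE\Prediktor\Apis\ApisHive" "D:\ApisBackup\ApisHive.reg" /y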

Restore Apis Hive configuration

If the <Install Directory> directory on the destination computer is different from the source computer, the registry settings export file must be changed:

In the exported registry script file, locate the "ApisStorageSource" string values, which point to where the Apis Hive configuration was initially installed. For instance, if the system was initially installed in C:\Program Files (x86)\APIS and the configuration is copied/moved to C:\Program Files\APIS:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\ApisHive\Modules\ApisWorker]

@="{983B4AE2-ABB9-11D2-9424-00608CF4C421}"

"ProgIDOfModule"="Prediktor.ApisWorker.1"

"ApisStorageClass"="{4C854C93-C667-11D2-944B-00608CF4C421}"

"ApisStorageSource"=" C:\\Program Files (x86)\APIS\ Config\ApisHive\Worker.ans "

Replace all occurrences of the original location with the new location, for instance:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\ApisHive\Modules\ApisWorker]

@="{983B4AE2-ABB9-11D2-9424-00608CF4C421}"

"ProgIDOfModule"="Prediktor.ApisWorker.1"

"ApisStorageClass"="{4C854C93-C667-11D2-944B-00608CF4C421}"

"ApisStorageSource"="C:\\Program Files\APIS\Config\ApisHive\Worker.ans "

4. Restore 32-bit Apis Hive configuration from a 32-bit operating system to a 64-bit operating system as a 64-bit Apis Hive configuration.

a. Install Apis Foundation 64

b. Create new instance if not using default.

c. Copy (restore) all files (1.a.) to <Install Directory>\Config and possibly <Install Directory>\Chronical\<INSTANCENAME>

d. Compare the settings of “ApisHive.AppSettings.config” and “ApisHive.exe.config” from the 32-bit source system (1.b) with the 64-bit “ApisHiveX64.exe.config” and “ApisHiveX64.AppSettings.config” on the destination system in <Install Directory>\Bin64. If the settings are different, make the necessary changes in the 64-bit files.

e. Run the registry script. (1.c.)

5. Restore 32-bit Apis Hive configuration from a 64-bit operating system to a 64-bit operating system as a 64-bit Apis Hive configuration.

a. Install Apis Foundation 64

b. Create new instance if not using default.

c. Copy all files (restore) (2.a.) to <Install Directory>\Config and possibly <Install Directory>\Chronical\<INSTANCENAME>

d. Compare the settings of “ApisHive.AppSettings.config” and “ApisHive.exe.config” from the 32-bit source system (2.b) with the 64-bit “ApisHiveX64.exe.config” and “ApisHiveX64.AppSettings.config” on the destination system in <Install Directory>\Bin64. If the settings are different, make the necessary changes in the 64-bit files.

e. Edit the registry script (2.c.)

i. Replace all “SOFTWARE\Wow6432Node\Prediktor” with “SOFTWARE\Prediktor”

ii. Save

f. Run the modified registry script.

6. Restore 64-bit Apis Hive configuration from 64-bit operating system to 64-bit operating system as 64-bit Apis Hive configuration.

a. Install Apis Foundation 64

b. Create new instance if not using default.

c. Copy (restore) all files (3a.) to <Install Directory>\Config, (possibly <Install Directory>\Chronical\<INSTANCENAME>) and the “.config” files (3.b) to <Install Directory>\Bin64

d. Run registry script. (3.c.)
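Correspondingly, the restore steps can be scripted. A minimal sketch of restore procedure 6, assuming the same example paths as in the copy sketch above, and that the registry script has already been edited for any path changes:

rem Restore the configuration files
xcopy "D:\ApisBackup\Config" "C:\Program Files\APIS\Config\" /E /I /Y
rem Restore the Hive xml configuration files to the Bin64 folder as described in step 6.c (use Bin if that is where they reside on your installation)
copy "D:\ApisBackup\ApisHiveX64.exe.config" "C:\Program Files\APIS\Bin64\"
copy "D:\ApisBackup\ApisHiveX64.AppSettings.config" "C:\Program Files\APIS\Bin64\"
rem Import the registry part of the configuration
reg import "D:\ApisBackup\ApisHive.reg"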

Manually copying Apis Honey Store configuration

Upgrade paths

From operating system bitness | To operating system bitness | From Apis Foundation bitness | To Apis Foundation bitness | Copy procedure | Restore procedure
32 | 32 | 32 | 32 | 1 | 3
32 | 64 | 32 | 64 | 1 | 4
64 | 64 | 64 | 64 | 2 | 5

Copy configuration

1. Copy Apis Honey Store configuration on 32-bit operating system

a. Copy all files from <Install Directory>\Config\ApisHoneyStore.

b. Copy ApisHoneyStore.AppSettings.config, ApisHoneystore.exe.config and ApisOPCHDA.exe.config from <Install Directory>\Bin

c. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore to a file.

2. Copy Apis Honey Store configuration on 64-bit operating system

a. Copy all files from <Install Directory>\Config\ApisHoneyStore.

b. Copy ApisHoneyStoreX64.AppSettings.config, ApisHoneystorex64.exe.config and ApisOPCHDAx64.exe.config from <Install Directory>\Bin

c. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore to a file.

Restore Apis Honey Store configuration

3. Restore 32-bit Apis Honey Store configuration from 32-bit operating system to 32-bit operating system

a. Install Apis Foundation

b. Copy (restore) all files (1.a.) to <Install Directory>\Config\ApisHoneyStore and the “.config” files (1.b.) to <Install Directory>\Bin

c. Run registry script. (1.c.)

4. Restore 32-bit Apis Honey Store configuration from 32-bit operating system to 64-bit operating system

a. Install Apis Foundation 64

b. Copy (restore) all files (1.a.) to <Install Directory>\Config\ApisHoneyStore

c. Compare the settings of “ApisHoneyStore.AppSettings.config, ApisHoneystore.exe.config and ApisOPCHDA.exe.config” from the 32-bit source system (1.b) with the 64-bit “ApisHoneyStoreX64.AppSettings.config, ApisHoneystorex64.exe.config and ApisOPCHDAx64.exe.config” on the destination system in <Install Directory>\Bin. If the settings are different, make the necessary changes in the 64-bit files.

d. Run registry script. (1.c.)

5. Restore 64-bit Apis Honey Store configuration from 64-bit operating system to 64-bit operating system

a. Install Apis Foundation 64

b. Copy (restore) all files (2.a.) to <Install Directory>\Config\ApisHoneyStore and the “.config” files (2.b.) to <Install Directory>\Bin

c. Run registry script. (2c.)

Manually copy/move Apis Honey Store database

Example use case:

Migration of Apis Foundation to new hardware, from Server1 to Server2.

Assume we have a database named “RedLogger”, located in C:\APIS\DBs on Server1.

Assume Apis Foundation is installed in the same location, C:\APIS, on both computers.

During the migration to Server2 we want to move the location of the “RedLogger” database to E:\DBs. (A command-line sketch of these steps is shown after this procedure.)

  • Stop the ApisHoneyStore service on Server1 and Server2
  • Copy C:\APIS\Config\ApisHoneyStore\RedLogger.ansb from Server1 to C:\APIS\Config\ApisHoneyStore on Server2
  • Copy the RedLogger.dat and RedLogger.cache files from C:\APIS\DBs on Server1 to E:\DBs on Server2
  • On Server1, export the registry key ”HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases\RedLogger” to a file
  • Copy the registry script file to Server2 and run it. (If Apis Foundation is installed in a different location than on Server1, the “ConfigFile” string value in the registry script must be altered.)
  • Assure you have a backup of the file C:\APIS\Config\ApisHoneyStore\RedLogger.ansb
  • Edit the location of the database in the configuration file. There are two options (tools): the offline ApisMetaStorageViewer, and the online MMC snap-in:

1. Offline: Start ApisMetaStorageViewer.exe (part of APIS_x_x_x-Tools)

  • Open C:\APIS\Config\ApisHoneyStore\RedLogger.ansb
  • Change Attrib ID 20 from C:\APIS\DBs\ to E:\DBs
  • The migration of the database is now finished, and ApisHoneyStore can be started on Server2

2. Online: Start the ApisHoneyStore service

  • From Microsoft Management Console, open/add the ApisHoneyStore MMC snap-in
  • Navigate to the RedLogger database, right-click, and select Administer database…
  • In the RedLogger properties dialog, change DataDirPath from C:\APIS\DBs\RedLogger.dat\ to E:\DBs\RedLogger.dat\ and CachePath from C:\APIS\DBs\ to E:\DBs

  • The migration of the database is now finished
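For reference, the Server1 side of this migration can be scripted; a minimal sketch from an elevated command prompt, assuming the paths from the example above, that the Windows service name is ApisHoneyStore, and that the administrative shares C$ and E$ on Server2 are reachable (all assumptions that must be verified on your system):

rem Stop the Honeystore service before touching the database files
net stop ApisHoneyStore
rem Export the database registry key
reg export "HKLM\SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases\RedLogger" "C:\Temp\RedLogger.reg" /y
rem Copy the database configuration file and the database files to Server2
copy "C:\APIS\Config\ApisHoneyStore\RedLogger.ansb" "\\Server2\C$\APIS\Config\ApisHoneyStore\"
xcopy "C:\APIS\DBs" "\\Server2\E$\DBs\" /E /I /Y
copy "C:\Temp\RedLogger.reg" "\\Server2\C$\Temp\"
rem On Server2: run "reg import C:\Temp\RedLogger.reg" and change the database location as described above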

Surveillance

Extract how much data is received from a source

How to identify the number of VQTs Apis Hive receives from a source

Each Apis OpcUa module will have a #Connected# item. This item identifies whether the connection to the server specified in the property window is established.

In order to see more detailed information regarding communication with sources, we can add an “ApisPerformanceMonitor” module to Apis Hive.

Right-click the PerformanceMonitor module and select “Add items” --> “Performance Counter”.

Select “Browse”

Type the name of the connection you want to monitor. In this example an ApisOpcUa module named “UaConnection” is used, so that name is used as the search term.

Multiple performance counters might be of interest. The counter that shows how many samples are received from the source is “@Apis Hive Bee/Receive samples/sec(ApisHive.<OpcUa module name>)”.

Click "Ok" to add the item to Apis Hive.

Identify that the item is added to the PerformanceMonitor module:

Apis Tools and Services

This section contains technical documentation for the Apis tools and services.

Apis Services:

  • Apis Hive is a multipurpose real-time data communication hub and container for Apis Modules. Hive is an executable that hosts data access, processing, and logging components into one efficient real-time domain.
  • Apis HoneyStore is Prediktor's high-performance, time-series database.
  • Apis Chronical is Prediktor's high-performance, event-server and historian.
  • Apis OpcUa Namespace Replication Service is a service for replicating namespaces from OPC UA servers to Apis Hive.
  • Apis Model Index Service is a service for querying the information models hosted in different Apis instances.
  • Apis Backup Agent is a service responsible for executing backup and restore jobs.

Apis Tools:

  • Apis Management Studio is the main engineering interface for configuring Apis services.
  • Apis Bare is a tool for manually backing up and restoring configuration and data for your Apis applications.

Common Apis settings:

Apis Management Studio

Apis Management Studio (AMS) is the configuration tool for the Apis product range.

It's able to connect to supported servers, and configure or view the status of these servers. The supported functionality of AMS will change depending on which servers are connected. For instance, for many servers that support it, there could be historical values displayed. Other servers may support viewing of real-time data. The main idea, however, is that all servers support a hierarchical data structure that is displayed in the Solution Explorer, and the items in the hierarchy have properties which can be viewed and manipulated in the Property Editor.

The Apis Management Studio (AMS) consists of different parts that interact with each other. The image below shows the AMS before any services have been connected.

The left-most part is the Solution Explorer. This is where you connect to the servers which are to be configured. The Solution Explorer will also display the currently connected servers in a tree view. The tree view consists of items that can be clicked, and the properties of the last clicked item will be displayed in the Property Editor. The Solution Explorer is searchable from the search box at the top.

The middle part of the application consists of different kinds of views. The views available depend on which servers are connected. For instance if a Hive instance is connected, data views can be created which show the current status of Hive items.

The right-most part is the Property Editor, where properties of objects can be changed. The Property Editor will display the properties of the objects most recently selected in the Solution Explorer and in the data views.

Apis Management Studio is installed by the Apis Foundation Setup Kit. Once installed, it's available through the All Programs menu in Windows.

Menus

The user interface includes a menu bar at the top of the program window, containing several menu options. The menu options may change, depending on which servers are connected.

FILE

New... | Creates a new solution. The existing views and connected items will be removed.
Open... | Opens a file dialog where you can select a saved AMS configuration.
Save | Saves the current solution to a file. If the solution has not been saved previously, you'll be prompted for a file name.
Save As... | Opens a file dialog and lets you save the solution to a file.
Import Log Files | Opens a file dialog which lets you choose one or more log files. This results in a log view opening if the log files are valid.
Exit | Exits the program.

VIEW

Themes | Opens a submenu, which lets you select between different themes.
High Precision Time | Decides whether the time shall be presented in high precision time. If false, the display accuracy is in milliseconds. If true, the accuracy is increased to a tenth of a microsecond.
Real Time | Opens a submenu, which lets you select between different real-time views:
  • All Items - displays all items of the connected Apis Hive instances.
  • Adaptive Items List.
  • Selected Items List.
  • Trend
Historical | Opens a submenu, which lets you select between different historical data views:
  • Table - displays historical data in a table
  • Trend - displays historical data in a graph
Search | Opens a search view.
Properties | Opens a properties view.
System | Opens a submenu, which lets you select between different views:
  • Solution Explorer - hides or shows the Solution Explorer
  • Connection View - displays the Connection View.
  • Debug View - displays the Debug View
  • Connection Limit - decides the maximum number of connections which can be shown in the Connection View. If this limit is exceeded, a warning is issued, and the connections will not be displayed.
  • Status bar - toggles visibility of the status bar.
  • Display User Name - toggles the visibility of the name of the user running AMS.

SETTINGS

High Precision Time | Toggles between showing time with microseconds or milliseconds.
Themes | Opens a submenu, which lets you select between different themes:
  • Dark Theme
  • Light Theme
  • Blue Theme
Import Settings (Hive) | When importing configuration files into Hive, this selection decides if the current culture or the invariant culture is used.
Import Log Setting | Decides if log files older than the ones selected when importing log files are automatically loaded. If this is true, load times will be longer.
Docking | Opens a submenu:
  • Save Layout - saves the current layout, which will be used after restart.
  • Reset Layout - resets the layout to the original layout.
Load last config file on startup | If true, the last config file which was saved by the user will be automatically loaded the next time AMS is started.
Text Filter | Decides how the text filter works in AMS. There are three alternatives:
  • Contains Text - if the text filter is contained in the searched text it is a match (default)
  • Like Operator -
  • Regular Expression - the text filter is a regular expression which is matched against the searched text

HELP

Help... | Displays the documentation.
About... | Displays a dialog which shows the version and copyright.

Solution Explorer

The Solution Explorer contains all the connected servers.

It consists of a tree view displaying the content of the connected server. The content depends on what type of server is connected and on the current configuration. By clicking the nodes, the Property Editor will display the properties of the different elements. It's possible to select more than one element by holding the "Ctrl" or "Shift" keys while selecting the elements. The elements can be right-clicked, which brings up a context menu for some of the elements. The contents of the context menu depend on the element clicked.

Connecting to a server

The upper part of the Solution Explorer consists of the connection bar:

The first part is a combo box which contains the local servers that have been found.

There may be other connectable local servers available, beyond what appears in the list. The combo box is editable so it's possible to write the URL of the server and connect directly without searching for it.

There are three buttons in the connection bar:

Button | Action
Refresh | Refreshes the list of local servers.
Connect | Connects to the server currently selected in the combo box.
Browse | Browses for servers on other computers.

By clicking the browse button the following dialog box appears:

Here you can either enter the name or IP address of the computer in which to search, or click the browse button to browse for computers.

Searching for computers might take a substantial amount of time.

A search of the computer will be performed once you hit enter or have browsed for a computer.

To connect, select a server and click "Connect".

Property Editor

The Property Editor displays properties for different kinds of elements in the solution. By clicking in the Solution Explorer, or in other views, the Property Editor will display properties for the elements last clicked. The Property Editor can display properties for several items at once, which will only show common property types. If the common property types have the same value the value will be displayed. If the value is different, a blank field is shown for text properties.

The edited property values will only be set when you click the "Apply" button. The changes can be cancelled by clicking the "Cancel" button.

By clicking the "Add Property" button, new properties can be added to the objects currently selected. Not all objects can have properties added to them, and that case the "Add Property" button will not be displayed.

By selecting the properties to add and clicking "Ok", the properties are added. By entering in the top text field, the properties will be filtered.

By clicking the "Remove" button, properties can be removed from the objects currently selected. Not all objects can have properties removed, and not all properties can be removed on the objects.

Some objects (for instance, Hive items) can be connected to other objects. This can be achieved by clicking the "Connect" button in the Property Editor, which will make the connection dialog appear.

Supported Servers

The table below shows the supported servers.

Servers | Url | Comment
Apis Hive | hive://<computer>/<instance> | If the hive instance is the default instance for a remote computer, the url is: hive://<computer>/. If the hive instance is the default instance on the local machine, the url hive:// is sufficient.
Apis Honeystore | hs://<computer> | There can only be one Honeystore instance on a computer.
Apis Management | apis://localhost | This is where instances can be created, started and stopped. In addition, some configuration which is done offline is performed in this server. It is only possible to connect to this on the local machine.
Apis Security Server | secu.tcp://<computer>:<port> |
Apis Namespace Server | nss://<computer>:<port> |
OPC UA Server | opc.tcp://<computer/address>:<port> | There is a limited set of functionality available for OPC UA.
OPC HDA Server | opchda://<computer>/<progid> | For example, to connect to the local Apis OPC HDA server: opchda://localhost/Prediktor.ApisOPCHDAServer
OPC DA Server | opcda://<computer>/<progid> | For example, to connect to the local Apis OPC DA server: opcda://localhost/Prediktor.ApisOPCServer

List Views

There are several different list views for displaying real-time data in AMS. The list views will update automatically when the data changes on the server. They all look and act the same, but the selection of real-time values displayed differs.

The list view consists of three main parts:

  1. Additional filters;
  2. Name filter;
  3. A list of real-time data.

Additional filters

There can be multiple filters, and they're added by clicking the "Add Filter" button.

The additional filters are usually collapsed, and you must click the uncollapse button to display the filters.

When a filter has been added, the first combo box selects by which property to filter. The next combo box selects what kind of filter is used. The last element is the filter value, a combo box or text box depending on the value type. When the filter value has been entered, press "Apply" for the filter to take effect.

The "Refresh" button refreshes the combo boxes.

Name Filter

The next part of the view is the name filter. This is probably the most used filter, and it filters items based on names. It supports wild card filters (*). You must either press enter or click the "Name search" button for the filter to take effect.

List of real-time data

The list of real time data shows the properties of items in tabular form. There are a configurable number of columns, and each row represents an item. The properties of the items are displayed in the table cells. If an item doesn't have the property, an empty field is displayed.

Edit columns

It's possible to add or remove columns by bringing up the context menu and selecting "Select columns".

You must select the columns to be displayed by clicking the check box next to the property name.

Historical Data View

The Historical Data View fetches and displays historical data in a list.

The view consists of two parts. The first part is the Time selector, where you select Start time and End time.

Time Selector

Both of these times can be either "Relative Time" or "Absolute time". If you want "Absolute time", write the time directly in the text box, or click the calendar symbol on the right side of the text box to select the date and time.

If either of the times are absolute, the Use Local Time checkbox decides if the input is in Local Time or UTC.

The "Aggregate" field decides which processing will be performed on the data when it's fetched. Depending on the aggregate used, either the "Max Values" field (for raw data) or "Resample" field (for all other aggregates) will be displayed.

The "Resample" field indicates the new time span/interval between two data points after the processing has been performed. This is only applicable when raw data is not selected.

The "Max values" field is only applicable when raw data is selected, and it specifies the maximum number of values which can be fetched from the server.

The "Read History" button must be pushed to fetch data from the database.

List view

The list view is where the data is actually displayed. It's possible to drag and drop historical items into this view, which means several items can be displayed in the same view. For each item, three columns will be displayed. The first column is the time, the second is the quality, and the third is the actual value.

If raw data for several items is fetched, the number of values for each item might be different. This means for some items there will be empty rows.

Adding items to a view

By dragging historical items from the Solution Explorer to the view, the items will be added to the view, and three columns will be added for each item.

Remove items

By right-clicking the view and selecting "Delete columns", a dialog box appears where you can select which items to delete.

History Explorer View

The History Explorer View fetches and displays historical data in a list, event list, and a trend, and can export historical data.

The view consists of two parts. The first part is the "Time Selector", where you can select "Start time", "End time", and other parameters for each item.

Time Selector

You can use the Time Selector to configure the queries for each item. It's possible to drag and drop items into this view to add a new query for items.

You can select "Relative Time" or "Absolute time". If you want absolute time, write the time directly in the text box, or click the calendar symbol on the right side of the text box to select the date and time.

The "Aggregate" field decides which processing will be performed on the data when it is fetched. Depending on the aggregate used, either the "Max Values" field (for raw data) or "Resample" field (for all other aggregates) will be displayed.

The "Resample" field indicates the new time span/interval between two data points after the processing has been performed. This is only applicable when raw data is not selected.

The "Max values" field is only applicable when raw data is selected, and it specifies the maximum number of values which can be fetched from the server.

The "Read History" button must be pushed to fetch data from the database.

You can use the "Add a new query from this item" button to add a new query to the item.

Table view

The table view is where the data is displayed. Several items can be displayed in the same view. For each item, three columns will be displayed. The first column is the time, the second is the quality, and the third is the actual value. You can use the check boxes at the top to hide the time and quality column, and display the local time.

If the raw data for several items is fetched, the number of values for each item might be different. This means that for some items there will be empty rows.

By default, if the start time > the end time, the data is sorted descending; if the end time > the start time, the data is sorted ascending.

Combined time table view

The combined time table view is where the data is displayed as a table with one time column.

The first column is the time; the following columns are the item ID and the quality of each item.

By default, if for all queries the start time > the end time, the data is sorted descending; if for all queries the end time > the start time, the data is sorted ascending; otherwise, the data is sorted ascending.

Event list view

The event list view is where the data is displayed as an event list. This means that if more than one item is displayed, the data is organised as an event list ordered by time.

The first column is the time, the second column is the item ID, the third column is the quality, the fourth column is the value.

By default, if for all queries the start time > the end time, the data is sorted descending; if for all queries the end time > the start time, the data is sorted ascending; otherwise, the data is sorted ascending.

Graphical view

The graphical view is where the data is displayed as a trend.

Export data to file

You can click the "Export to file" button to start a dialog to export the data to file.

Adding items to a view

By dragging historical items from the Solution Explorer to the time selector view, the items will be added to the view, and three columns will be added for each item.

Remove items

By clicking the "Delete" button for each query (the button is only displayed when the mouse is hovering) you can delete one query.

History Event View

The History Event View fetches and displays historical events in a list, shows historical event details, and can export historical events.

The view consists of four parts: the object list, the filter, the history event list, and the event details.

Object list

The object list shows which objects the history events come from. It's possible to drag and drop objects from the Solution Explorer into this list to add a new query for objects.

Filter part

In the filter part the user can add more filters for the events, such as severity, source name.

History event list

The history event list shows the query result and contains all events of the query. The user can use the context menu to specify which columns to show.

The user can click the 'Export to file' button to export the events to a text file.

In the export dialog, the user can check the 'Export all fields' checkbox at the bottom to export all fields of the events; if this checkbox is unchecked, only the fields in the lists are exported.

Event details

The event details part shows the detailed information of one event item: when the user clicks an event in the event list, the event details view shows the detailed information of that event.

Real time Event View

The Real time Event View fetches and displays real time events in a list, shows real time event details, and can export the events.

The view consists of four parts: the object list, the filter, the real time event list, and the event details.

Object list

The object list shows which objects the real time events come from. It's possible to drag and drop objects from the Solution Explorer into this list to add a new query for objects.

Filter part

In the filter part the user can add more filters for the events, such as severity, source name.

Real time event part

The real time event part is a tab view with two tabs: one for events and one for alarms. The events tab shows all events, and the alarms tab shows the alarms.

In the alarms tab, the user can use the 'Acknowledge' context menu item to acknowledge an alarm.

For both the events tab and the alarms tab, the user can use the context menu to specify which columns to show.

For both the events tab and the alarms tab, the user can click the 'Export to file' button to export the events to a text file.

In the export dialog, the user can check the 'Export all fields' checkbox at the bottom to export all fields of the events; if this checkbox is unchecked, only the fields in the lists are exported.

Event details

The event details part shows the detailed information of one event item: when the user clicks an event in the event list, the event details view shows the detailed information of that event.

Log View

The log view displays logfiles of specific formats, and lets the user search and filter the contents.

The log view can be opened from the File menu, or by right-clicking root nodes for Apis Hive and Honeystore and selecting Show Log.

By clicking Reload, the log file will be reloaded. The view will not detect changes in the log file unless Reload is clicked.

The list view can be searched by entering the search criteria in the Search text box and clicking the Search button. The matching rows will be highlighted with color set in the Highlight combobox.

Multiple searches can be done by assigning different colors to each search. The Previous and Next buttons will make the list jump to the previous/next Highlighted row.

By clicking a row the details of the log event will be shown in the detailed message view.

Filters

The filters are located in the header of the list. There are filters for Time, Level, Source Thread, Message and File, and the filters are ANDed.

Except for the Time filter, the filters can be either equal or not equal by clicking the ==/!= button next to the filter.

The time filter has a start time and an end time, and all the messages displayed will be between those times.

Apis Hive

Apis Hive is a real-time data hub and container for plug-in modules. Apis Hive is a Windows application that can run either as a Windows service or as a normal program. It's an OPC DA and AE Server, as well as an OPC UA Server. It is possible to register multiple instances of Apis Hive, each running a separate configuration.

Apis Management Studio is used to configure the Apis Hive instances, and the configuration for each instance is stored in the Windows Registry.

Apis Hive Modules

An Apis Hive Module is a component living inside of Apis Hive. Several different types of modules exist, and among the tasks they perform are: communication with external systems; calculation; analysing data; logging to a database.

New module types are made all the time, adding new functionality to the Apis ecosystem.

Shared Module Properties

General Module Properties

The properties of a module depend on the module type. There are, however, some properties common to all modules, which may or may not be exposed by a specific module type.

Common module properties

The following properties may apply to any modules in Apis Hive.

Standard properties

Name | Description | Type | ID
ExchangeRate | The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. | 4 byte integer | 100

Advanced properties

Name | Description | Type | ID
ExtItem full refresh | When true, the external items manager will force a full refresh initially on start/reset when reading items. I.e. items not yet initialized in their source will also trigger an external item update. Default is true. | bool | 150
TimeReferenceItem | An item whose value will be used as a time reference for this module instead of the system time. | string | 200
ExtItemCalculationSequence | Decides whether data validation or data transfer will be performed first in the external item manager. | 4 byte unsigned integer | 300
ExtItem pass-through quality | Specifies the quality of external item values that will pass through external item transfers. Default is 'Good and Uncertain qualities'. | enum, 2 byte signed integer | 400

Information properties

Name | Description | Type | ID
ExternalItem report | A status-report for the External Item Manager of this module. | string | 110

Performance properties

Nearly all modules have some common performance properties. These properties relate to the performance of reading and writing items from and to the modules. Exceptions: in rare cases, a module may not have its own items. These modules won't have any common performance properties at all. For example: the ApisLoggerBee module.

Read performance

Item value read considerations:

Name | Description
Justified | The number of item reads that were justified on this module, i.e. the item had changed since the last time the same item was read by the same reader.
Needless | The number of item reads that were needless on this module, i.e. the item had not changed since the last time the same item was read by the same reader.
Invalid | The number of item reads that requested items not belonging to this module.
Failed | The number of item reads in this module that failed for some reason.

Read-call considerations:

Name | Description
Mean time | The average time in milliseconds it takes to read the specified number of items.
Peak time | The maximum time in milliseconds it has taken to read the specified number of items.

Write performance

Item value write considerations:

Name | Description
Justified | The number of item writes that were justified on this module, i.e. the value or timestamp of the item had changed since the last time the item was written.
Needless | The number of item writes that were needless on this module, i.e. the item value or timestamp written was the same as the value or timestamp the item already had.
Invalid | The number of item writes that requested items not belonging to this module.
Failed | The number of item writes to this module that failed for some reason.

Write-call considerations:

Name | Description
Mean time | The average time in milliseconds it takes to write the specified number of items.
Peak time | The maximum time in milliseconds it has taken to write the specified number of items.

Additionally, modules may have several custom performance properties listed. To get help on these properties you must look in the help for that specific module.

Advanced module configuration

Changing default number of items per module

By default, the namespace in a single ApisHive instance can have a maximum of 4096 modules, with a maximum of 1048575 items in each module.

If you for some reason want to change this, i.e. to allow for more than 4096 modules in an instance or more than 1048575 items in a module, you must add/change an entry in the Windows registry.

E.g. to have a maximum of 65536 modules, with a maximum of 65536 items in each module, add/change the following registry entry:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules]
"MaxItemsInModule"=dword:0000ffff

The upper and lower limits for the MaxItemsInModule value, are:

  • Minimum (256 items/16777215 modules): "MaxItemsInModule"=dword:000000ff
  • Maximum (16777215 items/256 modules): "MaxItemsInModule"=dword:00ffffff

Please note that when changing this value for a named instance, modify the path of the registry key to reflect the name of the instance, e.g.:
[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\MyInstanceName\Modules]

Please note that the registry entry is not present by default, and must be added if it has not been used before. Also, if changing this value, the Hive instance must be restarted for the change to take effect.
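For scripted deployments, the same registry entry can be added from an elevated command prompt; a minimal sketch for the default ApisHive instance (adjust the key path for named instances):

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\Modules" /v MaxItemsInModule /t REG_DWORD /d 0x0000ffff /f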

Disable automatic resolution of external items when using ExternalItem filter attributes

When adding, deleting or renaming items or modules in an ApisHive instance that uses ExternalItem filters, external items are by default automatically resolved at runtime. On huge configurations this might be time consuming, as all item connections potentially need to be resolved again.
So, if you perform the add/delete/rename operation(s) at a time when you can restart your Hive configuration, you can add/change the following registry entry:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules] "AutoResolveExternalItemFilters"=dword:0

Then restart your Hive instance, perform your desired add/delete/rename operation(s), stop your Hive instance, and revert the following registry entry:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules]
"AutoResolveExternalItemFilters"=dword:1

and finally restart your Hive instance. All external item connections caused by using ExternalItem filters have now been updated according to the new names.

See also: ExternalItem filters

Temporarily Disabling Modules

You can disable a module from being loaded into the Apis Hive configuration.

Disabling a module

Locate the registry for your Apis configuration:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules

Here, each of your configured modules has its own registry key. Open the key of the module to disable. Underneath this key, e.g. for a module named OPC:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules\OPC

create a DWORD value named Disabled. To disable the module, set the value to 1; to enable it, set the value to 0.
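The same value can be set from an elevated command prompt; a minimal sketch for the OPC module example above, on the default ApisHive instance (use /d 0 to enable the module again):

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\Modules\OPC" /v Disabled /t REG_DWORD /d 1 /f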

Note! You have to restart Apis for this to take effect!

Apis AEClient

This module can subscribe to OPC Alarms & Events from an external OPC AE server according to the OPC AE specification. The alarms received can be displayed as items in the namespace and/or registered in the local Apis Alarms & Events server.

Provider: Prediktor

Properties

Commands And Events

The AEClient module has the following item types:

  • OPC AEItem
  • HiddenItem
  • StatusItem

Properties

The AEClient module has the following properties:

NameDescriptionIDFlags
ActiveFALSE, if the Event Subscription is to be created inactive.1050Persisted
BufferMax SizeThe requested maximum number of events that will be sent in a single callback. A value of 0 means that there is no limit to the number of events that will be sent in a single callback.1031Persisted, ExpertPage
BufferTimeThe buffer time (in milliseconds) tells the server how often to send event notifications. A value of 0 means that the server should send event notifications as soon as it gets them.1030Persisted, ExpertPage
CmnItemIDPrefixCommon source ItemId prefix. This string will prefix each item's 'ItemID' when communicating with the OPC AE server. Items in this module will have names without this prefix in the Apis namespace.1011Persisted, ExpertPage
ComputerThe computer hosting the OPC server. Leaving it blank, the localhost will be used.1001Persisted, Computer
FilterAreaEvents and conditions available in the server are organized within one or more process areas. An Area is a grouping of the plant.1044Persisted, ExpertPage
FilterCategoryThe categories supported by a given event type1041Persisted, ExpertPage
FilterEventTypeThe filter criteria supported by the connected event server.1040Persisted, Enumerated
FilterSeverityHighThe severity high, indicates the highest severity of interest. This is also commonly called 'priority', especially in relation to process alarms.1043Persisted, ExpertPage
FilterSeverityLowThe severity is low, indicating the lowest severity of interest. This is also commonly called 'priority', especially in relation to process alarms.1042Persisted, ExpertPage
FilterSourceThe source of the event in the OPC AE Server1045Persisted, ExpertPage
Operation ModeThe type of mode of the Bee.1010Persisted, Enumerated
PDS AreaThe PDS area which an event subscription should be linked to.1032Persisted, ExpertPage
PDS ProcessUnitThe PDS Process Unit associated with this subscription.1033Persisted, ExpertPage
ReconnectSrvShutdownReconnect after an intended OPC server shutdown. This might cause problems when administering the OPC server.1008Persisted, ExpertPage
ReconnectTimeThe time to wait before attempting to reconnect the server after an RPC failure.1007Persisted, ExpertPage
ServerThe ProgID of the OPC server.1002Persisted, Enumerated, ProgID
SrvApisEventProcessingIntervalThe time in seconds Apis uses for processing an event.1092PerformancePage
SrvCLSIDThe CLSID of the OPC server.1006InfoPage
SrvCurrentTimeThe current time (UTC) as known by the OPC server.1071InfoPage
SrvEventSubscribtionStatus of the event subscription.1093InfoPage
SrvLCIDLocale ID of values coming from the server. You might need to specify this property if the OPC server provides string values that are converted to another type in your client (e.g. DDE bridges).1009Persisted, ExpertPage
SrvStartTimeThe time (UTC) the OPC server was started.1070InfoPage
SrvStateThe current status of the OPC server.1073Enumerated, InfoPage
SrvSupportBrowsingIndicates whether the server supports browsing.1077InfoPage
SrvUpdateCallsThe number of times the OPC server has called back to this client with updated item values.1094PerformancePage
SrvUpdateCallsFailureThe number of times the module has failed to process an event.1096PerformancePage
SrvUpdateCallsNotHandledNumber of update calls not handled due to manual creation mode.1097PerformancePage
SrvUpdateCallsOKThe number of times the module has successfully processed an event.1095PerformancePage
SrvUpdateTimeThe time the OPC server sent the last data value update to this client, as known by the server.1072PerformancePage
SrvUpdateTimeClientThe time when this client received the last update, as known by the client.1090PerformancePage
SrvUpdateTimeIntervalThe time in seconds between the last two updates.1091PerformancePage
SrvVendorInfoVendor specific information about the OPC server.1075InfoPage
SrvVersionThe version number of the OPC server (major-minor-build).1074InfoPage
TimeReferenceItemAn item whose value will be used as the time reference for this module, instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TraceEventMessagesIf 'true', the module writes the information received on IOPCEventSink::OnEvent to the log files shown in the log view in Apis Management Studio.1061Persisted, ExpertPage
TraceOPCServerReports specific OPC calls, the returns from those calls, and callbacks, to the log files shown in the log view in Apis Management Studio.1060Persisted, Enumerated, ExpertPage

See also Module Properties

Commands And Events

The AEClient module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.Timer

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

Properties

External source item from an OPC AE Server

The OPCAEItem item type has the following properties:

NameDescriptionIDFlags
AlarmAreaThe default alarm area for the item.10100Persisted
ProcessUnitThe PDS process unit associated with this item. A process unit can be a piece of equipment or location, among other things.5716Persisted, ReadOnly
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
SrcItemIDThe item ID in the source. This is the item ID this item uses to fetch data from the source.5030Persisted
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2ReadOnly
ValueAssignmentAssign the item to one of the OPC Event data.10200Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

HiddenItem

The HiddenItem item type has the following properties:

NameDescriptionIDFlags
AlarmAreaThe default alarm area for the item.10100Persisted
ProcessUnitThe PDS process unit associated with this item. A process unit can be a piece of equipment or location, among other things.5716Persisted, ReadOnly
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
SrcItemIDThe item ID in the source. This is the item ID this item uses to fetch data from the source.5030Persisted
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2ReadOnly
ValueAssignmentAssign the item to one of the OPC Event data.10200Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

StatusItem

The StatusItem item type has the following properties:

NameDescriptionIDFlags
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
SrcItemIDThe item ID in the source. This is the item ID this item uses to fetch data from the source.5030Persisted
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Alarm Area

The AlarmArea module monitors items in Apis Hive and generates alarms and events based on configurable criteria.

When an AlarmArea module is added to Hive, it will create a new event-category global attribute named after the module (e.g. MyAlarmAreaEvtCategory if the module is named "MyAlarmArea"). This attribute must then be added to all items that should be monitored, and set to one of the following values to specify the monitoring behavior on each item:

  • Discrete: The item value is either normal or not normal.
  • Level: The item value is expected to be within a certain range.
  • Watchdog: The item value is expected to be regularly updated.
  • WatchQuality: The item quality is expected to be good.

After setting the EvtCategory attribute, each item must be configured depending on the category selected for the item:

  • For the "Discrete" category, set the AlmNormalState attribute.
  • For the "Level" category, set the AlmH, AlmHH, AlmL and AlmLL attributes.
  • For the "Watchdog" category, set the AlmWatchdogPeriods attribute. A period is the time given in the ScanPeriod property of the AlarmArea module (see the example below this list).
  • For the "WatchQuality" category, no additional attributes are necessary.
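
For example (illustrative values): if the ScanPeriod of the AlarmArea module is 1000 ms and an item's AlmWatchdogPeriods is 5, the item is expected to be updated at least once every 5 x 1000 ms = 5 seconds; if it is not, a watchdog alarm is raised.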

Another way to initiate alarm monitoring is to specify the AlmPrimaryArea attribute. Type the name of the AlarmAreaBee to be used to monitor the selected items. The default name is AlarmArea.

Mapping to Apis Chronical

By default, the AlarmArea module will create one Chronical eventsource for each item it monitors. The eventsource will be named after the item, and it will be linked below the sources "ApisHive/Areas/$AREANAME" and "ApisHive/Modules/$MODULENAME". When the state of an alarm is modified, the AlarmArea module will generate an event in Chronical to reflect the change.

The Chronical eventtypes used for these events are:

  • OffNormalAlarm for 'Discrete' monitoring.
  • LevelAlarm for 'Level' monitoring.
  • WatchdogAlarm for 'Watchdog' monitoring.
  • QualityAlarm for 'WatchQuality' monitoring.

The events will be reported on the eventsource created for the item, and they will use the item name for the "Sourcename" field defined in Chronical.

Some Chronical attributes can be added to each item to modify the default behaviour:

  • ChronicalEventType specifies the eventtype to use for events on the item.
  • ChronicalParent specifies a custom parent for the item instead of the default "ApisHive/Modules/$MODULENAME".
  • ChronicalSourceName specifies a custom value for the "Sourcename" eventfield.

Provider: Prediktor

Properties

Commands And Events

See also:

Properties

The AlarmArea module has the following properties:

NameDescriptionIDFlags
AckRequiredIf this property is disabled, alarms generated by this module are sent with acknowledgement not required.3400Persisted, ExpertPage
AlmSeverityOfInactiveThe severity of events when going from active to inactive.1670Persisted, Enumerated, ExpertPage
AlmSeverityStepThe step in severity when moving from one subcondition to another e.g. from HI to HIHI.1650Persisted, ExpertPage
AreaNameThe name of the process area this module represents. Several areas can be listed using ';' (semicolon).1010Persisted
AreaPathThe path to the area this module represents. A hierarchy is specified using '.' (dot).1020Persisted, ExpertPage
DefaultAlmSeverityThe default value of the AlmSeverity attribute for new alarm items.1600Persisted, ExpertPage
DefaultWatchdogPeriodsThe default value of the AlmWatchdogPeriods attribute for new watchdog category items.1700Persisted, ExpertPage
EnableAlarmEvaluationThis is used to disable alarm evaluation.3300Persisted, ExpertPage
EnableEvaluationItem If set, this item's value will enable (true) or disable (false) alarm evaluation. 3310 Persisted, ApisItem, ExpertPage
EnableInitialNormalReport By default, the first evaluation of an alarm condition is not reported if the alarm is in normal state. This property, when set, makes the AlarmAreaBee always report the result of the initial alarm evaluation. 3330 Persisted, ExpertPage
FreshestValue When TRUE, all items will be read directly from their physical source to receive the freshest possible values. 2010 Persisted, ExpertPage
InhibitAllSuppress all alarm generation from this module.3200Persisted
InhibitQualitySuppress alarms when the quality is equal to or worse than this.3100Persisted, Enumerated, ExpertPage
InvertInhibitSignalsInvert the value read from the inhibit signals, so that a 'false' inhibit value will prevent the alarm.1710Persisted, ExpertPage
PriorityLevelSpecifies the priority level for the working thread of this ApisAlarmAreaBee instance.3000Persisted, Enumerated, ExpertPage
ResetAlarmsBy setting this property to true, all alarms will be reset if either EnableAlarmEvaluation or EnableEvaluationItem is FALSE.3320Persisted, ExpertPage
ScanPeriodThe scan period in milliseconds for testing the event conditions.1300Persisted
TimeReferenceItemAn item whose value will be used as the time reference for this module, instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
UseSourceTimestampsIf set to 'true', the timestamps of the monitored items will be used as event timestamps when they occur. If set to 'false', the time the event was detected by this AlarmArea module is used.1400Persisted, ExpertPage
WatchdogQualityEvaluationSet this property to true to include evaluation of qualities other than OPC_QUALITY_GOOD.3550Persisted, Enumerated, ExpertPage

Informational properties:

NameDescriptionIDFlags
AlmAttributeIDThe global alarm attribute ID assigned from Apis Hive.301200InfoPage
ApisConditionEventSinkCookieThe value of the cookie assigned to this module from the Apis Event Server.301800InfoPage
AreaNameThe name of the process area this module represents. Several areas can be listed using ';' (semicolon).1010Persisted
ConfigObserverCookieThe cookie identifying this module amongst the configuration aware clients of Apis Hive.301400InfoPage
Server compatibilityStatus of which interface version is supported.301600InfoPage
TimeSrvEventIDThe time client event ID assigned from the Apis time server.301100InfoPage

Performance properties:

NameDescriptionIDFlags
AlarmEvalTimeThe time in milliseconds used to evaluate alarms.101007PerformancePage
HiveLastUpdCountThe number of items reported as updated at the previous request.101006PerformancePage
NumAlmItemsNumber of items this ApisAlarmArea instance stores to its database.101000PerformancePage
UpdSinceQueryTimeThe time used when requesting updated items from the Hive.101005PerformancePage

See also Module Properties

Commands And Events

The AlarmArea module has the following Commands and Events:

Commands

NameDescriptionCommand Type
ResetEventsReset any active or unacked/ackrequired event in the areaAsynchronous
ResetEventsNotifyReset any active or unacked/ackrequired event in the area and notify AE clientsAsynchronous
ScanA command initiating a Scan cycle of the items monitored by this AlarmAreaBee instance.Synchronous
Scan_DataPushThis command ensures that all items and samples in the data push package are used for alarm evaluation where applicable.
See also:  APIS data transfer mechanism; Data Push
Synchronous

See also Commands And Events

BytePopulator

This module imports and exports content as byte arrays, via files or a REST API.

Provider: Prediktor

Properties

Commands And Events

The BytePopulator module has the following item types

RESTItem

FileItem

Properties

The BytePopulator module has the following properties:

NameDescriptionIDFlags
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ExternalItem reportA status-report for the External Item manager of this module110InfoPage
ExtItem full refreshWhen true, the external items manager will force a full refresh initially on start/reset when reading items. I.e. items not yet initialized in their source, will also trigger an external item update. Default is true.150Persisted, ExpertPage
ExtItem pass-through qualitySpecifies the quality of external item values that will pass through external item transfers. If an external item's quality is worse than this mask, the external item transfer is blocked. Default is 'Any quality'.400Persisted, Enumerated, ExpertPage
LogLevelSpecifies the loglevel for diagnostic messages from this module.500Persisted, Enumerated

See also Module Properties

Commands And Events

The BytePopulator module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified by user. The timer resolution is specified by the 'ExchangeRate' property.Timer

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
ReadSourcesTrigger items with attribute 'OnlyReadWhenTriggered' enabled to read sourcesSynchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

Item type: FileItem

Item which makes it possible to interpret a file as a byte array, or write a byte array to a file.

The FileItem item type has the following properties:

NameDescriptionIDFlags
EnabledThis attribute indicates whether the item's operations are enabled or not.10009Persisted
ExtItemOverrideMethodThis attribute decides what method to use when assigning a value from an external item.5999Persisted, Enumerated, BitMask
File PathThis attribute contains the path to the file that should be read or written.10003Persisted
FileReader statusThis attribute indicates whether the file read operation succeeded or not.10007Persisted, ReadOnly
FileWrite statusThis attribute indicates whether the file write operation succeeded or not.10006Persisted, ReadOnly
NewTimestampEveryReadCycleThis attribute indicates whether the item should get a new timestamp even if the content is the same.10008Persisted
OnlyReadWhenTriggeredThis attribute indicates whether the item should only read its source when triggered, and not when the item itself is read.10010Persisted
OperatingModeThis attribute selects whether the content is read or written.10001Persisted, Enumerated
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage

See also Predefined Item Properties and OPC DA Properties

Item type: RESTItem

Item which makes it possible to GET content from a REST server as a byte array, or POST a byte array to a REST server.
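
The HTTP operations involved are conceptually similar to the client-side sketch below. It is only an illustration of what GET and POST of a byte array means; the URL is a placeholder, and the Python requests library is used purely for the illustration and is not part of Apis.

import requests

URL = "http://example.com/api/payload"   # placeholder; in Apis this comes from the item's URL attribute

# GET: fetch the resource and treat the response body as a raw byte array
response = requests.get(URL, timeout=10)
payload = response.content               # corresponds to the item value (byte array)

# POST: send a byte array back to the REST server
requests.post(URL, data=payload, timeout=10)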

The RESTItem item type has the following properties:

NameDescriptionIDFlags
EnabledThis attribute indicates whether the item's operations are enabled or not.10009Persisted
ExtItemOverrideMethodThis attribute decides what method to use when assigning a value from an external item.5999Persisted, Enumerated, BitMask
GET statusThis attribute indicates whether the GET succeeded or not.10005Persisted, ReadOnly
NewTimestampEveryReadCycleThis attribute indicates whether the item should get a new timestamp even if the content is the same.10008Persisted
OnlyReadWhenTriggeredThis attribute indicates whether the item should only read its source when triggered, and not when the item itself is read.10010Persisted
OperatingModeThis attribute selects whether the content is read or written.10001Persisted, Enumerated
POST statusThis attribute indicates whether the POST succeeded or not.10004Persisted, ReadOnly
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
URLSpecifies which URL the item uses for communication.10002Persisted, URL
ValueItem value2NormalPage

See also Predefined Item Properties and OPC DA Properties

Apis Connection Manager

This module is used to configure connections to external services, and clusters of such configurations.

Provider: Prediktor

Item types:

OpcUa Connection Item

OpcUa Cluster Item

OpcUa Replication Item

Item Types

OpcUa Connection Item

An item used to configure the connection to one OpcUa server. This connection setup can then be used by the OpcUa and OpcUaProxy modules, and/or be part of an OpcUa cluster.

The OpcUa connection item has the following standard properties:

NameDescriptionIDFlags
NameThe name selected for the OpcUa connection configuration.
Endpoint URLThe server endpoint, e.g. 'opc.tcp://servername:4880'10010Persisted
Reverse connectionIf true, the endpoint URL is used for reverse connections10016Persisted
ServiceLevel NodeIdThe nodeid to read the servicelevel from.10180Persisted
Security modeThe level of security used when communicating with this server (one of 'None', 'Sign', 'Sign and encrypt')10020Persisted, Enumerated
Security policyThe security policy used when communicating with this server.10021Persisted, Enumerated
AuthenticationThe type of authentication to use when connecting to this server (one of 'Anonymous', 'Username/password')10030Persisted, Enumerated
UsernameThe username to use when authenticating with the server.10031Persisted
PasswordThe password to use when authenticating with the server. 10032Persisted, Password
Pki providerThe Pki (certificate) provider to use when 'Security mode' is different from 'None', or 'Authentication' is different from 'Anonymous'.10040Persisted, Enumerated
Application certificate (SSL)The name of the file containing the application certificate that will be sent to the server. If not specified, the default SSL application certificate for the Apis instance will be used.10041Persisted
Application private key (SSL)The name of the file containing the private key for the application certificate. If not specified, the default SSL private key for the Apis instance will be used.10042Persisted
Server certificate (SSL)The name of the file containing a copy of the server's application certificate.10043Persisted
Storage path (SSL)The path to the root PKI directory. If not specified, the default pki directory for this Apis instance will be used (<INSTALL_DIR>\Config\<INSTANCE_NAME>\pki).10044Persisted
Application certificate (W32)The subject-name of the application certificate that will be sent to the server. If not specified, the default Windows certificate for the Apis instance will be used.10050Persisted
Server certificate (W32)The subject-name of the server's application certificate.10051Persisted
Storage path (W32)Name of the folder in Windows Certificate Store where the application- and server-certificates are stored.10052Persisted

The OpcUa connection item has the following advanced properties:

NameDescriptionIDFlags
Watchdog intervalThe number of seconds between each read of the server's status variable. When one such read operation fails, a reconnect sequence will be started.10060Persisted
Reconnect intervalThe number of seconds between each reconnect attempt when the server is not responding.10061Persisted
Rpc timeoutThe number of seconds to wait for RPC replies.10070Persisted
Session timeoutThe number of seconds to keep the session alive in the server when there are no RPC requests from the client.10071Persisted
Token timeoutThe number of seconds between security token renewals.10072Persisted
ClusterWhen specified, this connection item becomes part of the selected cluster.10090Persisted, Enumerated

The OpcUa connection item has the following informational properties:

NameDescriptionIDFlags
ServicelevelThe last servicelevel received from the server. This is used for automatic failover between servers in a cluster.10060Persisted

OpcUa Cluster Item

An item used to configure a cluster of two or more redundant OpcUa servers. This item defines the cluster and its failover behavior, while the connection information for each server in the cluster is defined by OpcUa connection items.

Each OpcUa cluster can be selected as a server configuration option in the OpcUa and OpcUaProxy modules, thereby giving these module types transparent failover support.

A failover between servers can occur when the ServiceLevel reported by each server changes. The ServiceLevel is a number from 0 to 255 with the following semantics:

  • Levels 0-1: The server is not able to provide any data from its data sources
  • Levels 2-199: The server has lost connection to some, but not all, of its data sources
  • Levels 200-255: The server is connected to all of its data sources

There are different failover rules for each ServiceLevel group:

  1. The active server has ServiceLevel 0-1: a failover occurs if another server has a ServiceLevel greater than 1.
  2. The active server has ServiceLevel 2-199: a failover occurs if another server has a ServiceLevel greater than 199, or if another server has a ServiceLevel greater than the ServiceLevel of the active server plus the value of the 'Failover deadband' property.
  3. The active server has ServiceLevel 200-255: failovers do not occur

Additionally, failovers never occur more frequently than specified by the 'Failover interval' property.
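
The decision rules above can be summarised in the following sketch. This is only an illustration of the rules as stated, not the actual Apis implementation; the function and parameter names are made up for the example.

def should_failover(active_level, best_other_level, deadband):
    """ServiceLevel-based failover decision for an OpcUa cluster (rules 1-3 above)."""
    if active_level <= 1:            # rule 1: active server cannot provide any data
        return best_other_level > 1
    if active_level <= 199:          # rule 2: active server has lost some data sources
        return best_other_level > 199 or best_other_level > active_level + deadband
    return False                     # rule 3: ServiceLevel 200-255, no failover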

The OpcUa cluster item has the following standard properties:

NameDescriptionIDFlags
NameThe name selected for the cluster. This name will become available as a Server configuration option in the OpcUa and OpcUaProxy modules.
ValueThe server endpoint currently having the highest ServiceLevel for this cluster (read-only). The active OpcUa connection of the cluster does not necessarily change even when this value changes.2 

The OpcUa cluster item has the following advanced properties:

NameDescriptionIDFlags
Failover timeoutNumber of seconds to wait for additional servers to come online (after the first server response), before selecting a server as the current endpoint for the cluster.10100Persisted
Failover intervalMinimum number of seconds between failovers.10101Persisted
Failover deadbandMinimum ServiceLevel difference required to trigger a failover (only applies to servicelevels below 200).10102Persisted

OpcUa Replication Item

This itemtype is used to trigger a replication of another ApisHive instance. Each module with items in the replicated server will get an equally named OpcUa module in the replicating client, and one OpcItem for each item in the server. Global attributes for Logger and AlarmArea modules will also be replicated to the client.

The OpcUa Replication item has the following properties:

NameDescriptionIDFlags
ExtraPropertiesArray of additional item properties to replicate for each item10130Persisted
ModulenameFilterOptional filter used to restrict which modules to replicate. Supports wildcards such as '*', '?' and '#'10150Persisted
ModuletypeFilterSemicolon-separated list of moduletypes to replicate. If empty, all moduletypes are replicated.10160Persisted
OpcUa serverThe name of the OpcUa connection or cluster which should be replicated10120Persisted
PublishingIntervalThe publishing interval to use on all OpcUa modules10140Persisted
ReplicationModeSpecifies how the replication should work10125Persisted, Enumerated
SamplingIntervalThe sampling interval in milliseconds set on each replicated item10145Persisted
StandardPropSyncWhen to synchronize selected OpcUA Standard Properties from server nodes to local apis item attributes10128Persisted, EnumeratedFlags
SyncGlobalAttrsIf enabled, the replication item will synchronize global attributes10135Persisted
ValueBoolean value indicating if a replication is currently running. Set this to TRUE to trigger the replication.2NormalPage

The ExtraProperties setting specifies an array of Apis Attributes that should be replicated when the SyncStdProperties command is executed on each module.

When ReplicationMode is "Copy", all items found on the server gets a matching OpcItem in the client. When ReplicationMode is "Mirror", a normal "Copy" replication is performed but any OpcItem in the client that is not found on the server gets deleted from the client.
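
As a rough illustration of the difference between the two modes (item names treated as plain sets; this is not the actual implementation):

def plan_replication(server_items, client_items, mode):
    """Return (items to create on the client, items to delete from the client)."""
    to_create = server_items - client_items      # both 'Copy' and 'Mirror' add missing items
    to_delete = client_items - server_items if mode == "Mirror" else set()
    return to_create, to_delete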

StandardPropSync is a multi-select property with the following options:

  • First time adding items
  • Each session
  • After replication

If "After replication" is activated, the "SyncStdProperties" command is executed on each module after synchronizing the module properties and items.

ApisEventBus

Apis module used to process events

Provider: Prediktor

Properties

Commands And Events

The EventBus module has the following item types

Channel

Router

Source.Chronical

Sink.Db

Sink.Smtp

Sink.Tracelog

Properties

The EventBus module has the following properties:

NameDescriptionIDFlags
DescriptionA user defined description of this module instance.900Persisted, InfoPage
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
LogLevelSpecifies the loglevel for diagnostic messages from this module.500Persisted, Enumerated
Text1A generic user defined text related to this module instance.910Persisted, InfoPage
Text2A generic user defined text related to this module instance.911Persisted, InfoPage
Text3A generic user defined text related to this module instance.912Persisted, InfoPage
Text4A generic user defined text related to this module instance.913Persisted, InfoPage

See also Module Properties

Commands And Events

The EventBus module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified by user. The timer resolution is specified by the 'ExchangeRate' property.Timer

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that has changed.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

Properties

Item used to define an event channel/stream

The Channel item type has the following properties:

NameDescriptionIDFlags
BatchTimeTime used to batch events, in seconds10060Persisted
BatchSizeMaximum number of events per batch. Overrides BatchTime if both attributes are specified.10070Persisted
IsConsumerThis attribute is TRUE on items that can consume events10010ReadOnly, InfoPage
IsProducerThis attribute is TRUE on items that can produce events10020ReadOnly, InfoPage

See also Predefined Item Properties and OPC DA Properties

Properties

Item used to route and/or transform an event stream

The Router item type has the following properties:

NameDescriptionIDFlags
InputChannelThe input channel for events consumed by this item10030Persisted, DynamicEnumeration
IsConsumerThis attribute is TRUE on items that can consume events10010ReadOnly, InfoPage
IsProducerThis attribute is TRUE on items that can produce events10020ReadOnly, InfoPage
OutputChannelThe output channel for events produced by this item10040Persisted, DynamicEnumeration
ScriptThe XSLT script that will be executed on incoming events to produce outgoing events10100Persisted, Multiline

See also Predefined Item Properties and OPC DA Properties

Properties

Item used to subscribe to events from Apis Chronical

The Source.Chronical item type has the following properties:

NameDescriptionIDFlags
EventsourcePath to the root eventsource to monitor10200Persisted, ApisEventSourcePath
EventtypeName of the root eventtype to monitor10210Persisted, DynamicEnumeration
IsConsumerThis attribute is TRUE on items that can consume events10010ReadOnly, InfoPage
IsProducerThis attribute is TRUE on items that can produce events10020ReadOnly, InfoPage
OutputChannelThe output channel for events produced by this item10040Persisted, DynamicEnumeration

See also Predefined Item Properties and OPC DA Properties

Properties

Item used to write events to a SQL database

The Sink.Db item type has the following properties:

NameDescriptionIDFlags
ConnectionStringADO Connectionstring for the database10400Persisted
InputChannelThe input channel for events consumed by this item10030Persisted, DynamicEnumeration
IsConsumerThis attribute is TRUE on items that can consume events10010ReadOnly, InfoPage
IsProducerThis attribute is TRUE on items that can produce events10020ReadOnly, InfoPage

See also Predefined Item Properties and OPC DA Properties

Properties

Item used to send emails. The Sink.Smtp item will automatically use TLS if the server supports the STARTTLS command.
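
For reference, the delivery performed by the sink is conceptually similar to the sketch below, written with Python's standard smtplib. The addresses, credentials, subject and message content are placeholders; the real values come from the item properties listed in the table that follows.

import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "apis@example.com"          # 'From' property
msg["To"] = "operator@example.com"        # 'To' property
msg["Subject"] = "Apis event"             # illustrative subject
msg.set_content("Event details go here")  # illustrative body

with smtplib.SMTP("smtp.office365.com", 587) as smtp:  # 'Server' property (host:port)
    smtp.starttls()                        # upgrade to TLS when the server offers STARTTLS
    smtp.login("username", "password")     # 'Username' / 'Password' properties
    smtp.send_message(msg)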

The Sink.Smtp item type has the following properties:

NameDescriptionIDFlags
ServerAddress of smtp server, e.g. 'smtp.office365.com:587'10500Persisted
UsernameUsername used to authenticate with the smtp server10510Persisted
PasswordPassword used to authenticate with the smtp server10520Persisted
FromEmail address used in the 'From:' field10530Persisted
ToEmail address(es) used in the 'To:' field10540Persisted
CcEmail address(es) used in the 'Cc:' field10550Persisted
BccEmail address(es) used in the 'Bcc:' field10560Persisted
ImportanceValue of 'Importance:' field (0=Normal, 1=High, 2=Low)10560Persisted, DynamicEnumeration
IsConsumerThis attribute is TRUE on items that can consume events10010ReadOnly, InfoPage
IsProducerThis attribute is TRUE on items that can produce events10020ReadOnly, InfoPage

See also Predefined Item Properties and OPC DA Properties

Properties

Item used to log events to a tracefile

The Sink.Tracelog item type has the following properties:

NameDescriptionIDFlags
EnabledEnable/disable tracing of events10050Persisted
InputChannelThe input channel for events consumed by this item10030Persisted, DynamicEnumeration
IsConsumerThis attribute is TRUE on items that can consume events10010ReadOnly, InfoPage
IsProducerThis attribute is TRUE on items that can produce events10020ReadOnly, InfoPage
TraceFileFull path to the trace-file10300Persisted

See also Predefined Item Properties and OPC DA Properties

Apis FileReader

This is a module called ApisFileReader, which reads data from files and inserts it into the namespace of Apis Hive.

Provider: Prediktor

Properties

Commands And Events

The FileReader module has the following item types

FileItem

AnalogueVectorItem

Properties

The FileReader module has the following properties:

NameDescriptionIDFlags
Alarm commandThe command executed when there hasn't been any file update in 'TimeToAlarm' milliseconds in the specified directory. This typically is the full path to an executable or similar.1610Persisted, ExpertPage
Alarm command parametersThe parameters sent to the command in 'Alarm command'.1620Persisted, ExpertPage
Completion port handleThe handle of the I/O completion port in use.50120InfoPage
Directory handleThe handle of the monitored directory.50110InfoPage
FileChangeTimeoutThe maximum amount of milliseconds to wait for a file change notification to occur before the directory is manually scanned for filechange(s).1350Persisted, Enumerated, ExpertPage
FileChangeToParseDelayA delay in milliseconds indicating how long to wait from when a file change is detected until the file is actually parsed by the FileParserObject.1400Persisted, ExpertPage
Filename PatternA filename pattern including the path for files that will be sent to the parser1150Persisted, File
FileParserThe ProgID identifying the parser component used by this module when parsing the file(s).1300Persisted, Enumerated, ProgID
FileReader trace file [1]If set, internal file update handling of the ApisFileReaderBee will be traced to this file. Should only be used for short time periods and for verbose troubleshooting of file update dynamics.2000Persisted, File, ExpertPage
LastFileChangeTimeThe time of the last notification of a matching file-change the module has registered.100000PerformancePage
LastFileParserFileNameThe name of file last parsed.103000PerformancePage
LastFileParserFileTimeUsageThe time used by the file-parser to handle last file update.102000PerformancePage
LastUpdatedItemCountThe number of items updated by the last file-change notification.101000PerformancePage
ScanDirectoryOnStartIf true, the SourceDirectory will be scanned once upon startup to look for matching files; if false, the directory will not be scanned on startup, but will depend on directory changes and/or FileChangeTimeout for the first scan.1340Persisted, ExpertPage
SourceDirectoryThe directory which the module shall monitor for file changes.1100Persisted, Hidden, Folder
TimeToAlarmThe time in milliseconds the module waits, when file updates are missing, before the alarm action specified in the 'Alarm command' property is executed.1600Persisted, ExpertPage
UseApisTimeUse Apis timestamps instead of the timestamps given by the parser component.1500Persisted, ExpertPage

Additional FileReader properties, shared with the specific file parser:

NameDescriptionIDFlags
FavorArrayItemsWhen 'true', the module tells its file parser object to use vectors and matrices as item types when applicable.10000Persisted, ExpertPage
Database loginSame as User ID in connection string. If blank, integrated security will be used.10005Persisted, User, ExpertPage
Database login passwordUsed when Database login is other than blank.10006Persisted, Password, ExpertPage
Database nameThe name of the database on given server, used in connection string.10004Persisted, ExpertPage
Database serverThe server machine where the database exists, used in connection string.10003Persisted, ExpertPage
FileParser debug trace fileIf specified, the complete filename of a tracefile used for debugging from parsers. Whether or not a file is generated, is parser specific behavior. If empty, no tracefile will be used.10009Persisted, File, ExpertPage
Honeystore databaseIf applicable, the name of the Honeystore database to use for parser component.10008Persisted, ExpertPage
Locale IDThe locale ID to be used when parsing files.10001Persisted, ExpertPage
Offset from UTCThe offset from UTC, in minutes, which can be used by the parser component when handling files. The Norwegian time zone is +60.10002Persisted, ExpertPage
Parser specific enumThis property may have meaning for some types of parsers; others may not use it.10007Persisted, DynamicEnumeration, ExpertPage

See also Module Properties

[1] The FileReader trace file has, by default, a maximum count of 10 files with a maximum size of 64 MB. Other values can be specified in the module-specific registry key, using the DWORD values TraceToFileMaxCount and TraceToFileMaxSize.
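
For example, for a FileReader module named FileReader1 (a hypothetical name), and assuming the same module-specific key layout as shown under Temporarily Disabling Modules, the limits could be overridden like this (registry DWORD values are hexadecimal, so dword:20 means 32; the numbers are purely illustrative):

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules\FileReader1]
"TraceToFileMaxCount"=dword:5
"TraceToFileMaxSize"=dword:20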

Commands And Events

The FileReader module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.Timer

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

FileItem

A generic item parsed from a file

The FileItem item type has the following properties:

NameDescriptionIDFlags
MapKeyThe key of item in m_Handles map (parser handle)10100Hidden
QualityItem quality3ReadOnly
ReqVartypeThe type requested to be delivered for the value of this item from the server. Only change this one for special situations.10200Persisted, ReadOnly, Hidden, Enumerated
RightsItem access rights5ReadOnly
TimeItem timestamp4ReadOnly
TypeItem canonical datatype1ReadOnly
ValueItem value2ReadOnly

See also Predefined Item Properties and OPC DA Properties

AnalogueVectorItem

An analogue vector item of 4 byte floats parsed from a file

The AnalogueVectorItem item type has the following properties:

NameDescriptionIDFlags
DimensionThe dimension of a vector item (number of elements)5007Persisted, ReadOnly
MapKeyThe key of item in m_Handles map (parser handle)10100Hidden
QualityItem quality3ReadOnly
RightsItem access rights5ReadOnly
TimeItem timestamp4ReadOnly
TypeItem canonical datatype1ReadOnly
ValueItem value2ReadOnly

See also Predefined Item Properties and OPC DA Properties

Apis HAGovernor

This is an Apis module that governs High Availability tasks for an Apis Hive instance, i.e. synchronizing configuration, time-series and event data between redundant instances.

Provider: Prediktor

Properties

Commands And Events

The HAGovernor module has the following item types

Command Item

State Item

Function item

Tutorial

To use the Apis HA Governor module, please refer to this topic: Getting Started

Properties

The HAGovernor module has the following properties:

NameDescriptionIDFlags
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ExpertModeSet this to true, to make other properties writable and put the module into expert mode.1500ExpertPage
HA DeputiesDifferent High Availability deputies performing different tasks.1900Persisted, EnumeratedFlags
Hive InstancesAn array of one or more Hive instance(s) that take part in the same HA Cluster, as: <computername or ip>:<port>. E.g: localhost:5555.2000Persisted, TcpIpAddr
LogLevelSpecifies the loglevel for diagnostic messages from this module.500Persisted, Enumerated
PingIntervalThe pinging interval, in seconds, of the HA Governor and deputies (where applicable).1850Persisted
RedundancySupportRedundancySupport indicates what redundancy is supported by the Server.2110Persisted, DynamicEnumeration
ServerPortThe port number of the HA Governor Communication server for this instance.1800Persisted
ServerUriArrayServerUriArray is an array with the URI of all redundant Servers of the OPC UA Server.2120Persisted, TcpIpAddr
TrendHistory_LastUpToDateThe last time (in UTC) the HA Governor has determined that the Trend History of this instance was up to date compared with all Hive Instances.3010Persisted, ReadOnly
EventHistory_LastUpToDateThe last time (in UTC) the HA Governor has determined that the Event History of this instance was up to date compared with all Hive Instances.3020Persisted, ReadOnly

See also Module Properties

Commands And Events

The HAGovernor module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified by user. The timer resolution is specified by the 'ExchangeRate' property.Timer

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

Command Item

An item that represents a command to apply on this module.

The Command Item item type has the following properties:

NameDescriptionIDFlags
Command typeThis attribute defines the command type of the command item.10130Persisted, ReadOnly, Hidden
DescriptionItem description101Persisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage

See also Predefined Item Properties and OPC DA Properties

State Item

An item that represents a well defined state of this module.

The State Item item type has the following properties:

NameDescriptionIDFlags
DescriptionItem description101Persisted
QualityItem quality3ReadOnly
RightsItem access rights5ReadOnly
TimeItem timestamp4ReadOnly
TypeItem canonical datatype1ReadOnly
ValueItem value2ReadOnly

See also Predefined Item Properties and OPC DA Properties

High Availability Data Concept

By High Availability (HA), we do not mean full redundancy, but something that is close and in many situations serves the same purpose adequately.
In short, the HA provided by this module works as described below. There are two Hive instances involved: one local instance and one remote instance.

Configuration Data

For now, synchronizing configuration data is not supported; it must be synchronized manually.

TimeSeries data

TimeSeries data will be synchronized like this: When your local Hive instance starts, the HAGovernor will request all items that are trended locally from the local instance. For each of these items, it will try to retrieve timeseries data from the remote instance for the time period when the local instance was not running. All timeseries data received from the remote instance will be inserted into the corresponding timeseries data of the local instance. If (for some reason) timeseries data already exists locally for the time period when your local instance was not running, it will be overwritten.

Event data

Event data will be synchronized like this: When your local Hive instance starts, the HAGovernor will request all event data from the remote instance for the time period when the local instance was not running. All event data received from the remote instance will be inserted into the event data of the local instance. If (for some reason) event data already exists locally for the time period when your local instance was not running, it will not be overwritten.
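
The start-up synchronization described above can be sketched as follows. The helper names are invented for the illustration and do not correspond to an Apis API:

def synchronize_on_startup(local, remote, offline_from, offline_to):
    """Illustrative flow only: fill the local history for the period the local instance was down."""
    # Time-series: fetch the gap from the remote instance; local data in that period is overwritten
    for item in local.trended_items():
        samples = remote.read_timeseries(item, offline_from, offline_to)
        local.write_timeseries(item, samples, overwrite=True)
    # Events: fetch the gap; events that already exist locally are kept (no overwrite)
    events = remote.read_events(offline_from, offline_to)
    local.insert_events(events, overwrite=False)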

Also, please refer to this section: Apis High Availability.

Getting Started

This module uses a proprietary TCP protocol to exchange timeseries and event data between this Hive instance and another redundant Hive instance.
For now, synchronizing configuration data is not supported; it must be synchronized manually.

What is meant by High Availability (HA), is described here: High Availability Data Concept

Basic configuration

Assume you have two Hive instances with redundant configurations, one local and one remote. We will now describe how to configure an ApisHAGovernorBee on each of these instances in an HA setup.

  1. In your local Hive instance, add new Apis module of type ApisHAGovernor.
  2. In your remote Hive instance, add new Apis module of type ApisHAGovernor.
  3. Specify a free and valid ServerPort property on both added modules.
  4. On the Hive Instances property of your local Hive instance, specify the endpoint to the remote Hive instance. Eg: <remote computername or IP>:<port from step 3.>
  5. On the Hive Instances property of your remote Hive instance, specify the endpoint to the local Hive instance. Eg: <local computername or IP>:<port from step 3.>
  6. Specify the HA Deputies property you want to use on both added modules. (Note: the Config Sync deputy is not implemented and will have no effect).
  7. Restart the modules to apply the changes.

When restarting the modules or their Hive instances, the HA Governor will synchronize TimeSeries/Event data from the timestamps of the properties TrendHistory_LastUpToDate and EventHistory_LastUpToDate, depending on which deputies were activated in the HA Deputies property.

Advanced operation

If you want to manually control from what time a Timeseries/Event synchronization shall start, you can set the ExpertMode property to true. Then, the properties TrendHistory_LastUpToDate and EventHistory_LastUpToDate will become writable and you may specify a custom start time. Synchronization from that time will take place the next time the module or the Hive instance is restarted.

The property RedundancySupport will reflect and control the RedundancySupport exposed by the OpcUa server of this Hive instance (if enabled).

The property ServerUriArray will reflect and control the ServerUriArray exposed by the OpcUa server of this Hive instance (if enabled).

Apis HSMirror

This Apis module connects to an Apis HoneyStore real-time Historian, and exposes the trend items of its databases as items in this Hive instance.

Provider: Prediktor

Properties

Commands And Events

The HSMirror module has the following item types

HoneyStoreItem

Variable

Module state items

Item attribute items

Module events items

Function item

Properties

The HSMirror module has the following properties:

NameDescriptionIDFlags
AutoConfig

Select a strategy for automatically configuring the items of this module, based on changes in the databases of the Apis HoneyStore historian. Typically used in combination with property Database-ItemFilters.
Enum options are:

  • Manual: No automatic configuration, items must be added and deleted manually.
  • AutoAdd: When trend-items matching the Database-ItemFilters are added to HoneyStore, items are automatically added to this module. Also, when the module is started, HoneyStore is investigated for matching trend-items and, if found, they are automatically added to this module.
  • AutoDelete: When trend-items matching the Database-ItemFilters are deleted from HoneyStore, the corresponding items are automatically deleted from this module. Note, when the module is started, HoneyStore is not investigated for missing trend items to delete. The reason for this is that we don't want items to disappear when HoneyStore databases are temporarily put into Disabled or Admin modes and this Hive instance is restarted.
  • AutoAddAndDelete: Same as selecting both AutoAdd and AutoDelete.
2000Persisted, Enumerated
Database-ItemFiltersWhen specified; a vector of "DataBase.ItemName" filters to use for automagic item configuration, wildcard syntax is supported. If an empty array, items must be added/deleted manually to the module. When specified, this also works as filter(s) for browsing the namespace of HoneyStore when adding HoneyStoreItems.2010Persisted
DescriptionA user defined description of this module instance.900Persisted, InfoPage
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
HS-ComputerThe computer hosting the Apis HoneyStore Historian. Leave empty for localhost, or enter a valid hostname/IP-address.1100Persisted, TcpIpAddr
LogLevelSpecifies the loglevel for diagnostic messages from this module.500Persisted, Enumerated
PersistValToInitValChoose strategy for copying and persisting current value to the InitValue.1500Persisted, Enumerated
Text1A generic user defined text related to this module instance.910Persisted, InfoPage
Text2A generic user defined text related to this module instance.911Persisted, InfoPage
Text3A generic user defined text related to this module instance.912Persisted, InfoPage
Text4A generic user defined text related to this module instance.913Persisted, InfoPage

See also Module Properties

Commands And Events

The HSMirror module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified by user. The timer resolution is specified by the 'ExchangeRate' property.Timer

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

Item type: HoneyStoreItem

An item mirrored from a HoneyStore database trend item.

The HoneyStoreItem item type has the following properties:

NameDescriptionIDFlags
QualityItem quality3ReadOnly
RightsItem access rights5ReadOnly
SrcItemIDItem ID in source5030Persisted
TimeItem timestamp4ReadOnly
TypeItem canonical datatype1ReadOnly
ValueItem value2ReadOnly

See also Predefined Item Properties and OPC DA Properties

Item type: Variable

User defined item, which can be written and read.

The Variable item type has the following properties:

NameDescriptionIDFlags
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2NormalPage
ValuetypeItem canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.10010Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Function item

This Item is a calculated value based on existing items in Hive. The calculation is formula-based, with inputs from external items.

This item has two different calculators (algorithm syntaxes): C# and Legacy (proprietary).

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value changes.
QualityValueTimestamp: Report as updated if either the Quality, the Value or the Timestamp changes (default).19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Apis IEC104

A module for support of the IEC 60870-5 standard.

General information about the protocol.

Implemented using the MZ Automation IEC 60870-5-104 library.

Limitations:

  • Only client is supported
  • Only one connection/server

Provider: Prediktor

Properties

Commands And Events

The Apis IEC 104 module has the following item types:

Properties

The module contains the following standard properties:

NameDescriptionIDFlags
ModeSpecifies the run mode of the module. When in Offline mode, the module does not communicate (read/write) with any external system. When in Online mode, the module communicates normally (read/write) with the external system.501Persisted
IP AddressThe IP address to the server1601Persisted,TCP/IP address
Port NumberThe port number the IEC 104 server is running on (plain default=2404, TLS default=19998)1602Persisted
TLSSet this to enable TLS connection1620Persisted,Enumerated
TLS Public KeyFull path to .pem file that contains the public key1630Persisted,File
TLS PasswordPassword of the public key file if it has one1631Persisted
TLS CertificateFull path to own .cer file that contains own certificate (x.509)1632Persisted,File
TLS CA CertificatePath to .cer file from certificate authority (x.509)1633Persisted,File
TLS Server CertificatePath to server .cer file (x.509)1634Persisted,File
TLS StrictIf set to true only known certificates are accepted. Connections with unknown certificates are rejected even if they are signed by a trusted authority1635Persisted
TLS TimeoutTLS renegotiation timeout in ms (-1 (disabled) by default)1636Persisted
TLS MinMin TLS version required for slave (Not selected,SSL 3.0,TLS 1.0,TLS 1.1,TLS 1.2,TLS 1.3)1637Persisted
TLS MaxMax TLS version required for slave (Not selected,SSL 3.0,TLS 1.0,TLS 1.1,TLS 1.2,TLS 1.3)1638Persisted
TLS ValidationEnables the validation of the certificate trust chain (enabled by default) [1]1639Persisted
KMMaximum number of unconfirmed I-frames [1] that can be sent before waiting for an S-frame (acknowledgement) from the receiver (server).1655Persisted
WMaximum number of I-frames that can be received before sending an S-frame (acknowledgment) to the sender (server).1656Persisted
T0 [s]Timeout for connection establishment [seconds] [1]1657Persisted
T1 [s]Timeout for receiving S-frame (ack) for sent I-frames. Resends the I-frames data after this timeout. [seconds]1658Persisted
T2 [s]Timeout for sending S-frame (ack) for received I-frames. Maximum time before sending an S-frame (ack). [seconds]1659Persisted
T3 [s]Timeout for data exchange. If no frames are exchanged for the T3 period, an S-frame (test) is sent to verify the connection is alive. [seconds]1660Persisted

The module contains the following advanced properties:

NameDescriptionIDFlags
Watchdog Mode'Watchdog Item' will monitor the item specified in the Watchdog Item property for activity, while 'Any Item' will register any change in any item as activity1640Persisted,Enumerated
Watchdog Timeout [ms]Max time since last registered activity before connection considered lost [milliseconds]1642Persisted
Watchdog ItemThe item name to use when watchdog mode is set to monitor a specific item1643Persisted
PollPeriodically send commands to get latest values from server1644Persisted,Enumerated
Poll Period [ms]Period between each read command request when auto read items is activated. [milliseconds]1645Persisted

See also Module Properties

[1] See the basic overview of IEC 104 Communication

Commands And Events

The module has the following events:

Name | Description | Event Type
ServerConnected | Event triggered after an established connection to the server | Normal
ServerDataChanged | Event triggered after receiving data from the server | Normal
ServerDisconnected | Event triggered after disconnection from the server | Normal

The module has the following commands:

Name | Description | Command Type
Reconnect | Triggers a reconnect to the server | Asynchronous

Item Types

Information Item

Receives process information in monitor direction (server->client).

The item type has the following properties:

Name | Description | ID | Flags
CA | Common address | 18700 | Persisted
IOA | Information object address | 18701 | Persisted
Scale | Scales the value according to y = a*x + b, where a = scale (see the worked example below the table) | 21389 | Persisted
Offset | Scales the value according to y = a*x + b, where b = offset | 21390 | Persisted
COT | Cause of transmission of the last received value | 18702 | ReadOnly
Type id | Type id of the last received value | 18703 | ReadOnly
Quality | The IEC 104 specific quality of the last received value | 18707 | ReadOnly
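
As a worked example of the linear scaling above (the Scale and Offset values are hypothetical, chosen only for illustration), a raw integer received from the server is converted to an engineering value like this:

```python
# Linear scaling as described by the Scale and Offset properties: y = a*x + b
scale = 0.1      # a, hypothetical example value
offset = -40.0   # b, hypothetical example value

raw_value = 653                     # value as received from the IEC 104 server
value = scale * raw_value + offset  # 0.1 * 653 - 40.0 = 25.3
print(value)
```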

See also

Basic Item Properties

Interrogation Command Item

Sends an interrogation command in control direction (client->server).

The item type has the following properties:

Name | Description | ID | Flags
CA | Common address | 18700 | Persisted
COT | Cause of transmission | 18702 | Persisted
QOI | Specifies the behavior of the server when processing the command | 18706 | Persisted

On success, the corresponding Information Items on the same CA should be updated with the latest values from the server.

To trigger an interrogation command:

  1. Configure the interrogation command with the properties listed above;
  2. Set a new value on the item - the value is not sent to the server, it only triggers the interrogation command;
  3. On success, the value on the item is set;
  4. On failure, the value on the item is not set - check the log for more information.

See also

Basic Item Properties

Process Command Item

Sends process information in control direction (client->server) when it is updated with new values.

The item type has the following properties:

Name | Description | ID | Flags
CA | Common address | 18700 | Persisted
IOA | Information object address | 18701 | Persisted
COT | Cause of transmission | 18702 | Persisted
Command | Type of command to send | 18704 | Persisted
Qualifier | Specifies the behavior of the server when processing the command | 18705 | Persisted

Sends the value the item is set to, with the given attributes, to the server.

To trigger a process command:

  1. Configure the process command with the properties listed above;
  2. Set a new value on the item - this value is sent as part of the command to the server (see the client sketch following this list);
  3. On success, the value on the item is set, and the values of the information items with the same CA should be updated;
  4. On failure, the value on the item is not set - check the log for more information.
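
Since Apis Hive exposes its items over OPC UA, one way to trigger a process command from an external application is simply to write a value to the corresponding item. The sketch below uses the Python asyncua package; the endpoint URL and the node id string are assumptions for illustration, not actual Apis defaults.

```python
import asyncio
from asyncua import Client, ua

# Hypothetical endpoint and item node id - replace with the values exposed by
# your Apis Hive OPC UA server and your IEC 104 module configuration.
ENDPOINT = "opc.tcp://localhost:4880/Apis/ServerUA"
COMMAND_NODE = "ns=2;s=MyIEC104Module.MyProcessCommand"

async def trigger_process_command(value: float) -> None:
    async with Client(url=ENDPOINT) as client:
        node = client.get_node(COMMAND_NODE)
        # Writing a new value makes the module send the process command;
        # the variant type must match the item's canonical data type.
        await node.write_value(ua.Variant(value, ua.VariantType.Double))

asyncio.run(trigger_process_command(1.0))
```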

See also

Basic Item Properties

Read Command Item

Sends a read command in control direction (client->server).

The item type has the following properties:

Name | Description | ID | Flags
CA | Common address | 18700 | Persisted
IOA | Information object address | 18701 | Persisted

On success, the current value of the information object is read from the server and reflected on the corresponding Information Item.

To trigger a read command:

  1. Configure the read command with the properties listed above;
  2. Set a new value on the item - the value is not sent to the server, it only triggers the read command;
  3. On success, the value on the item is set and the value of the corresponding information item should be updated;
  4. On failure, the value on the item is not set - check the log for more information.

See also

Basic Item Properties

Test Command Item

Sends a test command in control direction (client->server).

The item type has the following properties:

Name | Description | ID | Flags
CA | Common address | 18700 | Persisted

Can be used, for example, to keep the connection alive.

To trigger a test command:

  1. Configure the test command with the properties listed above;
  2. Set a new value on the item - the value is not sent to the server, it only triggers the test command;
  3. On success, the value on the item is set;
  4. On failure, the value on the item is not set - check the log for more information.

See also

Basic Item Properties

Item type: Function item

This item holds a calculated value based on existing items in Hive. The calculation is formula-based, with inputs from external items.

This item supports two different calculators (expression syntaxes): C# and Legacy (proprietary).

The Function item item type has the following properties:

Name | Description | ID | Flags
Expression | An expression used to calculate the value of this item | 5110 | Persisted
Expressions | Definitions of array inputs to the calculator (applicable for C# only) | 5111 | Persisted
Calculator | Specifies which calculator to use, C# or Legacy | 19101 | Persisted
DataChangeTrigger | An enumeration that specifies the conditions under which the Function item is reported as updated inside Apis after a calculation (see the sketch below the table). Quality: report as updated only if the quality associated with the value changes. QualityValue: report as updated if either the quality or the value changes. QualityValueTimestamp: report as updated if either the quality, the value or the timestamp changes (default). | 19102 | Persisted, Enumerated
External Items | The external items that are inputs to the formula given by the Expression property | 20000 ... 20000+N | Persisted
Quality | Item quality | 3 | NormalPage
Rights | Item access rights | 5 | ReadOnly
Time | Item timestamp | 4 | NormalPage
Type | Item canonical datatype | 1 | ReadOnly
Value | Item value | 2 | NormalPage
Valuetype | Item canonical datatype | 19100 | Persisted, Enumerated
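
The DataChangeTrigger semantics can be summarized in the following sketch. Only the reporting rules are taken from the description above; the numeric enumeration values are assumptions used for illustration.

```python
from enum import Enum

class DataChangeTrigger(Enum):
    # Numeric values are illustrative assumptions, not the actual Apis codes.
    QUALITY = 0
    QUALITY_VALUE = 1
    QUALITY_VALUE_TIMESTAMP = 2  # default

def should_report(trigger: DataChangeTrigger,
                  quality_changed: bool,
                  value_changed: bool,
                  timestamp_changed: bool) -> bool:
    """Decide whether a Function item is reported as updated after a calculation."""
    if trigger is DataChangeTrigger.QUALITY:
        return quality_changed
    if trigger is DataChangeTrigger.QUALITY_VALUE:
        return quality_changed or value_changed
    return quality_changed or value_changed or timestamp_changed
```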

See also Predefined Item Properties and OPC DA Properties

Control Parameters

IEC 104 data communication is bidirectional, i.e. both the client and the server can send and receive messages. The messages are divided into three sets of frames, each serving a different purpose:

  • I-frames (Information Frames): payload frames carrying data (measurements, commands, etc.) and an incremental sequence number (sq);
  • S-frames (Supervisory Frames): used for flow control of I-frames (Receive Ready: acknowledges receipt and indicates that the receiver is ready to accept more I-frames);
  • U-frames (Unnumbered Frames): used to test network connectivity (TESTFR_ACT, TESTFR_CON).

The transmission is regulated by a set of link control parameters: K, W and T0-T3; see Properties above.

Connection Establishment

Parameters: T0
Used by: client
Usage: Ensures that the client does not wait indefinitely for a response.
Value: The T0 timer is typically set to about 30 seconds, providing a reasonable window to accommodate network delays and server processing times.

Example:

  1. The client connects to the server and waits up to T0 for the server to accept the connection request;
  2. The client aborts the attempt after the T0 timeout.

Data Transmission

Parameters: K, W, T1, T2
Used by: client and server
Usage: Ensures that all I-frames are received in order and processed by the receiver.
Values: The sender stops sending I-frames and waits for an S-frame either when the number of I-frames sent since the last received S-frame reaches K, or when T1 times out. The receiver sends an S-frame when the number of received I-frames reaches W, or when T2 times out. To prevent the sender from stalling while waiting for S-frames, the receiver's W and T2 should be smaller than the sender's K and T1, typically half (see the sketch after the examples below).

Example:

  1. The server sends up to K I-frames before stopping and waiting for an acknowledgment (S-frame);
  2. The server resends the I-frames sent since the last received acknowledgment if no acknowledgment is received before the T1 timeout;
  3. The client sends the acknowledgment after it has received W I-frames or T2 has timed out;
  4. The server resets the K counter and the T1 timer when receiving the acknowledgment, and continues sending I-frames.

Example 2:

  1. The client has X command items, where X is greater than the server's value of W;
  2. The client tries to send a command for all items at once;
  3. The client will only be able to send X - W_S (the server's value of W) commands before the server has processed them and sent an acknowledgment (S-frame);
  4. In Apis, the client will see good quality for the first W_S command items that were sent successfully, and bad quality for the rest.
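
The interplay between the sender's K and the receiver's W can be illustrated with a toy model of the flow-control rule described above (timers omitted, frame counts chosen arbitrarily). It is a sketch for intuition, not the actual protocol implementation.

```python
def simulate_flow_control(k_sender=12, w_receiver=8, frames_to_send=20):
    """Toy model: the sender may have at most k unconfirmed I-frames outstanding;
    the receiver returns an S-frame after every w received I-frames."""
    sent = acked = 0
    while acked < frames_to_send:
        if sent < frames_to_send and sent - acked < k_sender:
            sent += 1
            print(f"I-frame {sent} sent ({sent - acked} unconfirmed)")
            if sent % w_receiver == 0 or sent == frames_to_send:
                acked = sent
                print(f"S-frame received: frames up to {acked} acknowledged")
        else:
            # Sender window full (happens when w >= k): in the real protocol the
            # sender now waits for an S-frame, or resends after the T1 timeout.
            acked = sent
            print(f"S-frame after receiver T2 timeout: acknowledged up to {acked}")

simulate_flow_control()
```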

Connection Integrity

Parameters: T3
Used by: client and server
Usage: Ensures that the connection is alive.
Values: To avoid unnecessary sending of U-frames, T3 should be greater than the other timers; it is typically set to 20 seconds.

Example:

  1. The client receives no I-frames within the T3 timeout period;
  2. The client sends a U-frame (TESTFR_ACT) to test the connection;
  3. The server sends a U-frame (TESTFR_CON) back to the client to confirm the connection.

Choosing Client Values Based On Server Values

Calculating client parameters from server parameters

Formulas and example values for calculating the client-side parameters (a helper sketch follows the table):

Parameter | Server Value | Client Formula | Server Example | Client Example
K | K_S | W_S * 2 | 12 | 16
W | W_S | K_S / 2 | 8 | 6
T0 | - | - | 30s | 30s
T1 | T1_S | T2_S * 2 | 16s | 20s
T2 | T2_S | T1_S / 2 | 10s | 8s
T3 | - | - | 20s | 20s
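
The formulas in the table can be captured in a small helper. This is a sketch, not an Apis API; the defaults for T0 and T3 follow the typical values shown above.

```python
def client_parameters(k_s, w_s, t1_s, t2_s, t0_s=30, t3_s=20):
    """Suggest client-side link parameters from known server-side values,
    following the formulas in the table above."""
    return {
        "K": w_s * 2,     # client K  = server W * 2
        "W": k_s // 2,    # client W  = server K / 2
        "T0": t0_s,       # connection timeout, typically 30 s on both sides
        "T1": t2_s * 2,   # client T1 = server T2 * 2
        "T2": t1_s // 2,  # client T2 = server T1 / 2
        "T3": t3_s,       # test-frame timeout, typically 20 s on both sides
    }

print(client_parameters(k_s=12, w_s=8, t1_s=16, t2_s=10))
# {'K': 16, 'W': 6, 'T0': 30, 'T1': 20, 'T2': 8, 'T3': 20}
```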

Choosing Server Values or Client Values With Unknown Server Values

Considerations for choosing K

Setting the "k" parameter has significant implications for the performance and efficiency of the network communication. The choice of a low vs. a high "k" value affects several aspects of the protocol's operation:

Low k Value (e.g., k=1):

  1. Increased Acknowledgments: A low "k" value means that each I-frame must be acknowledged before the next one can be sent. This can significantly increase the number of acknowledgment messages (S-frames), adding to the overhead in the communication.
  2. Lower Throughput: The need for frequent acknowledgments limits the number of I-frames that can be sent in a given period, potentially reducing the overall throughput of application data.
  3. Higher Sensitivity to Latency: The communication becomes more sensitive to network latency. Each I-frame's transmission depends on the round-trip time (RTT) for the acknowledgment of the previous frame, making the data transfer rate more susceptible to delays.
  4. Improved Error Recovery: On the positive side, a low "k" value can lead to faster error detection and recovery since any lost or unacknowledged frame will halt the transmission more quickly, allowing for prompt retransmission.

High k Value (e.g., k=1000):

  1. Reduced Acknowledgments: A higher "k" value allows more I-frames to be sent before an acknowledgment is required, reducing the overhead caused by acknowledgment messages and potentially making the communication more efficient.
  2. Higher Throughput: More I-frames can be "in-flight" before an acknowledgment is needed, which can significantly increase the data transmission rate, especially in networks with high latency.
  3. Buffer and Memory Considerations: Both the client and server must have sufficient buffer space to handle the higher number of unacknowledged I-frames. This requires careful resource management, especially in systems with limited memory.
  4. Delayed Error Recovery: With more unacknowledged frames allowed, it may take longer to detect and recover from errors. Lost or corrupted frames might only be identified after more frames have been sent, potentially complicating error recovery processes.

The choice between a low and high "k" value depends on the specific requirements of the application and the characteristics of the network. A low "k" may be preferred in networks where minimizing latency and quick error recovery are critical, albeit at the cost of throughput. A high "k" might be advantageous in stable networks with high latency or where maximizing throughput is desired, provided that the system can handle the increased memory and processing requirements for managing a larger number of unacknowledged frames. Balancing these factors is key to optimizing the performance and reliability of IEC 104 communications.

A commonly used range for "k" in many implementations is between 12 and 128. This range is considered to provide a good balance between throughput and the ability to quickly recover from errors in a variety of network conditions. However, the optimal value within this range, or even outside of it, should be determined based on testing and evaluation in the specific network environment and application context where it will be used.

Considerations for choosing W

A common recommendation is to set W to approximately half of K. This guideline helps to ensure that acknowledgments are received and processed before reaching the maximum number of unacknowledged I-frames, thus reducing the risk of transmission stops and enhancing data integrity.

For example, if `K = 32`, then `W` might effectively be set to around 16.

This ratio helps in managing network traffic more smoothly and prevents the scenario where the sender has to stop sending frames and wait for acknowledgments too frequently (which would happen if W is too low) or too infrequently (which risks data overflow and errors if W is too high).

Considerations for choosing T0

Recommended Setting: Typically, T0 is set to 30 seconds. This provides sufficient time for connection attempts even over slower or less reliable networks.

Considerations for choosing T1

Recommended Setting: T1 is usually set to around 15 seconds. This duration is enough to cover most transmission delays that might occur in typical network conditions.

Considerations for choosing T2

Recommended Setting: T2 is generally set shorter than T1, around 10 seconds. It should be less than T1 so that acknowledgments (S-frames) for received I-frames are sent before the sender's T1 timeout for data frames expires.

Considerations for choosing T3

Recommended Setting: T3 is commonly set to 20 seconds. This setting should be longer than T1 and T2 to allow sufficient time for the test frames to be acknowledged under normal conditions.

Additional Considerations

Network Characteristics: These recommended settings can serve as a good starting point, but optimal values might vary based on specific network performance characteristics and requirements. Considerations include network latency, reliability, bandwidth, and the criticality of the application.
Adjustments and Testing: It is advisable to adjust and test these timers in the specific deployment environment to find the best settings that ensure reliable communication while minimizing unnecessary traffic.
Consistency with Standards: If there are industry-specific standards or regulations that dictate timer settings, those should take precedence to ensure compliance and interoperability.

These settings help balance the performance and reliability of the IEC 60870-5-104 communication by efficiently managing the connection lifecycle and error recovery processes. Adjustments based on actual network conditions and system performance are crucial for maintaining optimal operations.

Apis IEC 61850

A module for support of the IEC 61850 standard.

The standard is mapped to many protocols, including the MMS protocol used in this module.

The module will act as a client connecting to an IEC 61850 server over the MMS protocol.

The module uses the MZ Automation IEC 61850 library.

Provider: Prediktor

Properties

Commands And Events

The IEC 61850 module has the following item types:

DataObject

DataSet

BufferedReport

DataReport

Properties

The module contains the following standard properties:

Name | Description | ID | Flags
Mode | Specifies the run mode of the module. When in Offline mode, the module does not communicate (read/write) with any external system. When in Online mode, the module has normal communication (read/write) with the external system. | 501 | Persisted
Hostname | The IP address of the server | 1601 | Persisted, TCP/IP address
Port | The IP port number of the server | 1602 | Persisted
Authentication mode | Authentication mode for connecting to the server | 1603 | Persisted, Enumerated
Connection state | The connection state to the server | 1604 | ReadOnly, Enumerated
Password | Authentication password used to connect to the server | 1610 | Persisted
Watchdog mode | Monitor the connection to the server with the watchdog, and automatically reconnect if the connection is assumed lost | 1605 | Persisted, Enumerated
Watchdog timeout | Maximum time (ms) between the last received message and the time when the watchdog activates | 1606 | Persisted
Watchdog item | The item name to use when the watchdog mode is set to monitor a specific item | 1607 | Persisted
Poll mode | Poll mode of the module | 1608 | Persisted
Poll period | Time (ms) between each poll cycle | 1609 | Persisted

Notes regarding Poll mode:

  • Polling DataObjects is the most expensive option network-wise, as it makes one network request per DataObject; e.g. 1000 DataObject items -> 1000 network requests.
  • Polling DataSets is less expensive, as it makes one network request per DataSet item, and a DataSet can contain many, if not all, DataObject items.
  • The preferred approach is to set Poll Mode to Disabled, group the DataObject items into logical DataSet objects, and have a report for each DataSet that subscribes to updates from the server. A similar polling-like behaviour can be configured in the report with Integrity as TrgOps and the Integrity period set to the desired poll period. This requires the least network traffic, since the client does not send any explicit network requests; only the server sends data to the client at a regular interval.

Commands And Events

The module has the following events:

Name | Description | Event Type
ServerConnected | Event triggered after an established connection to the server | Normal
ServerDataChanged | Event triggered after receiving data from the server | Normal
ServerDisconnected | Event triggered after disconnection from the server | Normal

The module has the following commands:

Name | Description | Command Type
Reconnect | Triggers a reconnect to the server | Asynchronous

Item Types

DataObject

DataObject items map to the IEC 61850 concept of DataObjects on the server, where each DataObject has a set of functionally constrained (FC) data attributes.

The item type has the following properties:

Name | Description | ID | Flags
Scale | Scales the value according to y = a*x + b, where a = scale | 5005 | Persisted
Offset | Scales the value according to y = a*x + b, where b = offset | 5006 | Persisted
SrcItmID | Data reference to the DataObject, usually on the form LDName/LNodeName.Object [string] | 5030 | Persisted
Functional constraint | Functional constraint of the data object | 17001 | Persisted
V | Data attribute (DA) used for the item value [string] | 17003 | Persisted
Q | Data attribute (DA) used for the item quality [string] | 17004 | Persisted
T | Data attribute (DA) used for the item time [string] | 17005 | Persisted
Attributes | Available data attributes on the data object | 17002 | ReadOnly
IEC Quality | The IEC 61850 specific quality of the last received value | 17006 | ReadOnly

For scalar DataObjects, the VQT attributes can be flat, e.g. v, or nested structures, e.g. level1.level2.v.

In the case where the DataObject is an array, the items in the array can be accessed with

  1. SrcItmID = LDName/LNodeName.arrayObject(0) and VQT attributes = myStruct.myVal
  2. SrcItmID = LDName/LNodeName.arrayObject and VQT attributes = (0).myStruct.myVal.

Method (1) is preferred if the VQT values are located at the same index; otherwise method (2) can be used, but it is less efficient, as it reads the whole array.

All available VQT attributes are listed in the Attributes attribute of the item.

Note regarding VQT attributes:

  1. V is mandatory and must match an item in the available data attributes.
  2. Q and T are optional, but if set they must match an item in the available data attributes. They must also have the expected type, which for Q is a bit string and for T is a UTC time. If they are not set, the default values of good quality and the current time are used. Failing to satisfy these requirements results in a config error quality on the item.

Notes regarding Quality:

  1. Apis will set the item qualities to NOT_CONNECTED if the client loses the connection to the server.
  2. Apis will set the item qualities to CONFIG_ERROR if SrcItmID or the VQT attributes are invalid.
  3. Apis will set the item qualities to COMM_ERROR when reading invalid/error data from the server.

See also

Basic Item Properties

DataSet

Item representing the IEC 61850 concept of a DataSet on the server.

The item type has the following properties:

Name | Description | ID | Flags
SrcItemID | DataSet reference, must be on the form LDName/LNodeName.dataSetName [string] | 5030 | Persisted
Nodes | Array of DataObject/DataAttribute items, must be on the form LDName/LNodeName.dataObject[FC] or LDNAME/LNodeName.dataObject.dataAttribute(arrayIndex)[FC] [string[]] | 17002 |

Note:

  1. A DataSet cannot be modified or deleted while it is referenced by any reports.
  2. Always keep the DataSet items that are referred to by any reports present in the module.

Change SrcItemID: If the DataSet exists on the server, the DataSet item will point to the existing DataSet on the server. If it does not exist, the module will

  1. try to delete the old DataSet on the server,
  2. try to create a new DataSet with the Nodes array on the server. This is effectively either a rename of the DataSet or a copy of the previous DataSet. Example: changing SrcItemID from Device1/Node1.Dataset1 to Device1/Node1.Dataset2 results in both Dataset1 and Dataset2 on the server if Dataset1 cannot be deleted (e.g. it is used in a report), with Dataset2 containing the same nodes as Dataset1. The new Dataset2 can then be modified/deleted until it is used in a report.

Change Nodes: If the DataSet exists on the server, the DataSet will be deleted and recreated with the new set of nodes. If it cannot be deleted, the quality changes to CONFIG_ERROR and the reason/error is printed in the log. If it can be deleted, or the DataSet is not already present on the server, a new DataSet is created with the new set of nodes. If there are any configuration issues with the nodes, the item quality changes to CONFIG_ERROR and the reason/error is printed in the log. Example: if the reason the DataSet cannot be changed is that it cannot be modified or deleted, a possible workaround is to change the SrcItemID to create a copy of the DataSet, and then change the nodes on the new DataSet.

Delete Item: If the DataSet is not in use, i.e. not referenced by any reports, the DataSet on the server will be deleted. If it is not possible to delete the DataSet on the server, only the item is deleted and the error/reason is printed to the log. Resolve the issue, add the DataSet item again, and delete it again to permanently delete it from the server.


See also

Basic Item Properties

BufferedReport

Reporting allows a server to send data based on events, without an explicit request from the client. What data is sent, and which events cause the reports to be sent, are configured through the attributes of the item.

Main characteristics of Buffered Reports:

  • Buffered Reports involve an intermediary buffering of report data within the IED before it is sent to the client. This buffer stores a sequence of reports.
  • In case of a communication interruption or if the client is temporarily unable to receive reports, the buffered reports are stored in the IED and can be retrieved once the communication is restored. This ensures that no data is lost during the period of communication failure.
  • Buffered reporting is more complex and requires additional memory and processing resources within the IED. It is essential for applications where data continuity and retrieval of missed reports are critical, such as in disturbance recording or where detailed post-event analysis is required.

The item type has the following properties:

Name | Description | ID | Flags
SrcItmID | Data reference to the buffered report, on the form LDName/LNodeName.BR.reportName [string] | 5030 | Persisted
DataSet | The DataSet reference, must be on the form LDName/LNodeName.dataSetName [string] | 17007 |
Integrity period [ms] | The integrity period specifies the interval for the periodic sending of integrity messages in the report, if Integrity is specified as a trigger option in the report. [0-4,294,967,295] | 17008 |
Trigger options | Specifies which events will trigger reports | 17009 |
GI | Setting this parameter to true will initiate a report to be sent to the client. This report will contain all the data of the associated data set. The GI parameter will be reset to false automatically by the server once the report has been sent | 17011 |
Enabled | By setting this variable to true, reporting will be enabled. Note that changing/writing any configuration fields of the RCB will fail as long as reporting is enabled | 17012 |
ResvTms [ms] | Value of -1 = reserved by configuration for a certain set of clients, value 0 = report is not reserved, value > 0 = the time that the reservation shall be maintained after the association was closed or interrupted. [-32,768 to 32,767] | 17013 |
Report ID | The report ID identifies the RCB that has caused the generation of the report. [string] | 17014 |
Optional fields | A bit string where each bit indicates whether an optional field is included in the reports. The following optional fields exist: sequence-number, report-timestamp, reason-for-inclusion, data-set-name, data-reference, buffer-overflow, entryID, segmentation, and conf-revision. For URCBs the values of buffer-overflow and entryID are ignored | 1705 |
Buffer time [ms] | For buffered reports, this is the interval between each report sent from the server. [0-4,294,967,295] | 17016 |
Purge buffer | If the rate of report generation exceeds the transmission or processing capacity, the buffer may become full. If enabled, older reports in the buffer may be purged (deleted) to make room for new reports. This ensures that the most recent and relevant data is available for transmission, at the cost of losing older, untransmitted reports. | 17018 |
Sequence number | Number of received events from the server, incremented each time a new update is received from the server. | 17017 | ReadOnly

Notes:

  1. Make sure the DataSet item that corresponds to the DataSet reference used in the report is present.

See also

Basic Item Properties

DataReport

Reporting allows a server to send data based on events, without an explicit request from the client. What data is sent, and which events cause the reports to be sent, are configured through the attributes of the item.

Main characteristics of Data Reports (Unbuffered Reports):

  • Data Reports, or Unbuffered Reports, are sent directly from the IED (Intelligent Electronic Device) to the client without any intermediary storage of the report data within the IED.
  • If there is a communication interruption or if the client is unable to receive the report (for example, due to network congestion or client unavailability), the reports that occur during this time are lost. There is no mechanism to retrieve missed reports once the communication is restored.
  • This type of reporting is simpler and requires less memory and processing power within the IED, making it suitable for applications where real-time monitoring is essential, and data loss during communication failures is acceptable.

The item type has the following properties:

Name | Description | ID | Flags
SrcItmID | Data reference to the report, on the form LDName/LNodeName.RP.reportName [string] | 5030 | Persisted
DataSet | The DataSet reference, must be on the form LDName/LNodeName.dataSetName [string] | 17007 |
Integrity period [ms] | The integrity period specifies the interval for the periodic sending of integrity messages in the report, if Integrity is specified as a trigger option in the report. [0-4,294,967,295] | 17008 |
Trigger options | Specifies which events will trigger reports | 17009 |
Reserved | If the setting was successful, the RCB will be reserved exclusively for this client | 17010 |
GI | Setting this parameter to true will initiate a report to be sent to the client. This report will contain all the data of the associated data set. The GI parameter will be reset to false automatically by the server once the report has been sent | 17011 |
Enabled | By setting this variable to true, reporting will be enabled. Note that changing/writing any configuration fields of the RCB will fail as long as reporting is enabled | 17012 |
Report ID | The report ID identifies the RCB that has caused the generation of the report. It equals the RptID field of the RCB. | 17014 |
Optional fields | A bit string where each bit indicates whether an optional field is included in the reports caused by this RCB. The following optional fields exist: sequence-number, report-timestamp, reason-for-inclusion, data-set-name, data-reference, buffer-overflow, entryID, segmentation, and conf-revision. For URCBs the values of buffer-overflow and entryID are ignored | 1705 |
Sequence number | The current sequence number of the RCB. After sending a report, the RCB will increment the sequence number by one | 17017 | ReadOnly

See also

Basic Item Properties

Apis IEC 62056Bee

A module for support of the IEC 62056 standard for reading data from (energy) meters.

Provider: Prediktor

Properties

Commands And Events

The IEC62056Bee module has the following item types

Register Readout

Register Set/Reset Readout

Register Data Interpreter

StatusItem

Properties

The IEC62056Bee module has the following properties:

Name | Description | ID | Flags
Com port | Serial IO COM port id. Must start with COM | 1055 | Persisted, ExpertPage
Communication type | Choose the communication type for the connection to the meter | 1020 | Persisted, Enumerated
Communication timeout | The communication timeout period in milliseconds | 1050 | Persisted, ExpertPage
Connected | Status showing whether Apis is connected to the device. | 1200 | InfoPage
ConnectedTime | Last time the host connected to the device. | 1215 | InfoPage
Device Address | RS485 address of the device. Will be omitted if not set. 32 characters max | 1017 | Persisted
Enable | If true, the module will try to connect to the equipment with the IP address and port number specified (default: false). | 1010 | Persisted
ExchangeRate | The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. | 100 | Persisted
ItemUpdateInterval | The interval in seconds at which the module will try to update the read items. Minimum value is 1. | 1013 | Persisted
LastMsgRcv | Last message received from the device. | 1210 | InfoPage
LastMsgSend | Last message sent to the device. | 1205 | InfoPage
Log file | The file name for the serial IO log. | 1090 | Persisted, ExpertPage
Log file Maxsize | The max size of the log file before it gets recycled (default value = 5,000,000 [bytes]). | 1085 | Persisted, ExpertPage
MaxParseTime | The maximum time used to parse an SF message [ms]. | 1300 | PerformancePage
Meter manufacturer | Choose the right configuration for the meter. This selection affects the use of protocol mode and item creation | 1015 | Persisted, Enumerated
Password Level 1 | Password used for level 1 data read. | 1035 | Persisted
Password Level 2 | Password used for level 2 data read. | 1040 | Persisted
Password Level 3 | Password used for level 3 data read. | 1045 | Persisted
ReconnectInterval | The interval in seconds at which the module will try to reconnect to a device after a lost connection. Minimum value is 1. | 1012 | Persisted
Serial IO Data bits | The serial IO data bits | 1065 | Persisted, ExpertPage
Serial IO Stop bits | The serial IO stop bits | 1070 | Persisted, Enumerated, ExpertPage
Serial IO Stop bits | The serial IO stop bits | 1080 | Persisted, Enumerated, ExpertPage
TCP/IP address | The IP address of the meter. Used for TCP communication | 1025 | Persisted, URL
TCP/IP port | The IP port of the meter. Used for TCP communication | 1030 | Persisted

See also Module Properties

Commands And Events

The IEC62056Bee module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

A Register Readout item's main purpose is to read out specific register data.

The Register Readout item type has the following properties:

NameDescriptionIDFlags
Address

Register address.

11000Persisted
Enable

Set to true to enable the item in the configuration.

10005Persisted
Index

The index of the data field.

11010Persisted
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001ExpertPage
Length

The length of the data field.

11020Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

A Register Set/Reset Readout item's main purpose is to read out specific register data after the Set/Reset register has changed.

The Register Set/Reset Readout item type has the following properties:

NameDescriptionIDFlags
Address

Register address.

11000Persisted
Enable

Set to true to enable the item in the configuration.

10005Persisted
Index

The index of the data field.

11010Persisted
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001ExpertPage
Length

The length of the data field.

11020Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Reset Value

Value the Transaction Manager will check against.

12040Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Set Value

Value to set in the Set/Reset register.

12030Persisted
Set/Reset Address

Set/Reset Register address.

12000Persisted
Set/Reset Length

Set/Reset Length of the data field in the register.

12020Persisted
Set/Reset Offset

Set/Reset Offset of the data field in the register.

12010Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

A Register Data Interpreter item's main purpose is to interpret data read out by Register Readout items (a sketch of the interpretation steps follows this item's property list).

The Register Data Interpreter item type has the following properties:

NameDescriptionIDFlags
Enable

Set to true to enable the item in the configuration.

10005Persisted
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

11120Persisted
Second Value Length

The length of the second data field in the register.

13040Persisted
Second value start index

Zero-based index indicating where to start interpreting the read second data.

13030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Value Arithmetic

The arithmetic used on values from first and second register data. On subtraction, the second value will be subtracted from the first.

13001Persisted, Enumerated
Value Cut Off Type

The cut-off type is used to cut off positive or negative values.

13005Persisted, Enumerated
Value interpretation type

This property determines how to interpret the data field value:

String: reads the value as a plain string;

BCD (Binary Coded Decimal): reads the value as an integer and scales it according to the scale property.

11100Persisted, Enumerated
Value Length

The length of the data field in the register.

13020Persisted
Value start index

A zero-based index indicating where to start interpreting read data.

13010Persisted

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties
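
The interpretation steps described by the properties above can be summarized in the following sketch. It is an illustration of how the fields could combine, not the module's actual implementation; the example read-out string, parameter values and the optional "add" arithmetic are hypothetical.

```python
def interpret_register(readout, start, length,
                       second_start=None, second_length=0,
                       subtract_second=False, cut_off="none", scale=1.0):
    """Slice one or two data fields out of a register read-out, then apply the
    arithmetic, the cut-off and the linear scaling."""
    value = float(readout[start:start + length])
    if second_start is not None:
        second = float(readout[second_start:second_start + second_length])
        if subtract_second:                      # 'Value Arithmetic' = subtraction
            value -= second
    if cut_off == "negative" and value < 0:      # 'Value Cut Off Type'
        value = 0.0
    elif cut_off == "positive" and value > 0:
        value = 0.0
    return value * scale                         # 'Scale'

# Hypothetical read-out with two 6-character fields:
print(interpret_register("001234000034", 0, 6, 6, 6,
                         subtract_second=True, scale=0.01))
# (1234 - 34) * 0.01 = 12.0
```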

Properties

This item is used to expose internal status information

The StatusItem item type has the following properties:

NameDescriptionIDFlags
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Interpreter

This module interprets single-line ASCII strings, terminated by a terminating character such as <CR><LF>, received from a serial port or a TCP socket.

Provider: Prediktor

Properties

Commands And Events

The Interpreter module has the following item types

InterpreterItem

InterpreterSendItem

CommandItem

State Item

More information

Quick Start Guide

Characteristics of the Interpreter module:

  • Reads an ASCII string from a device and interprets it into numeric value(s).
  • Supports Read and Write.
  • Supports query, i.e. Write->Read (synchronous).
  • Can act as a simple TCP server (a minimal test sender is sketched below the feature list).
  • Supports serial and TCP/IP (TCP and UDP) communication.

Further, as an integrated module in Apis Hive, the following optional features are available:

  • High-performance data logging to the Apis HoneyStore historian, with an OPC Historical Data Access server interface
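
As a quick way to exercise the module in TCP mode, a small test sender can push terminated telegrams to it. The sketch below is an assumed test setup (the host, port, field separator and terminating character must match your module configuration); it is not part of Apis.

```python
import socket
import time

# Assumed configuration: the Interpreter module is reachable over TCP with
# Field separator ';' and Terminating character <CR><LF>.
HOST, PORT = "127.0.0.1", 5020   # hypothetical IP address / IP port settings

with socket.create_connection((HOST, PORT)) as sock:
    for i in range(10):
        # Telegram with two numeric fields, separated by ';' and CR/LF terminated.
        telegram = f"{20.0 + 0.5 * i:.2f};{50.0 + i:.1f}\r\n"
        sock.sendall(telegram.encode("ascii"))
        time.sleep(1.0)
```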

Properties

The Interpreter module has the following properties:

Name | Description | ID | Flags
ACK | If greater than or equal to "0", this value is sent as ACK on successful receive | 1273 | Persisted, ExpertPage
Serial BaudRate | Baud rate for serial communication. Valid only when Comm. type is Serial | 1210 | Persisted, Enumerated, ExpertPage
Buffer size | Size of the buffer (min. 10). | 1270 | Persisted, ExpertPage
Serial COM port | The COM port to use. Valid only when Comm. type is Serial | 1200 | Persisted, Enumerated
Comm. type | Communication method, Serial or Winsock | 1110 | Persisted, Enumerated
Client command timeout | Time to wait for a client command (in seconds) before giving up and resetting the connection | 1320 | Persisted
Serial DataBits | Number of bits in the bytes transmitted and received. Valid only when Comm. type is Serial | 1230 | Persisted, Enumerated, ExpertPage
Mode of operation | Read, Write, Write->Read or TCP-Server. Write->Read writes and then reads the response; TCP-Server acts as a stream server | 1120 | Persisted, Enumerated
ExchangeRate | The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. | 100 | Persisted
Field separator | Separation character for values when there are multiple values in one string. | 1075 | Persisted, Enumerated, ExpertPage
Serial FlowControl | Flow control for serial communication. Valid only when Comm. type is Serial | 1250 | Persisted, Enumerated, ExpertPage
IP address | IP address of the Interpreter server. Valid only when Comm. type is Winsock | 1310 | Persisted, Computer
Interpret | Interpret the string to numeric value(s) | 1072 | Persisted, ExpertPage
Serial Parity | Parity scheme for the serial communication. Valid only when Comm. type is Serial | 1220 | Persisted, Enumerated, ExpertPage
IP port | The TCP/UDP port of the Interpreter server. Valid only when Comm. type is Winsock | 1311 | Persisted
RawValueReceive | Last telegram raw value received | 1500 | ReadOnly, PerformancePage
RawValueSend | Last telegram raw value sent | 1501 | ReadOnly, PerformancePage
RawValueSend(hex) | Last telegram raw value sent (hex) | 1502 | ReadOnly, PerformancePage
ResetOnTimeout | Reset communication when a timeout error occurs. | 1262 | Persisted, ExpertPage
SkipAfterTermChar | Bytes rejected after the termination character. Typically a CRC. | 1272 | Persisted, ExpertPage
SkipLeft | Number of characters to skip on the left side of the telegram. | 1073 | Persisted, ExpertPage
SkipRight | Number of characters to skip on the right side of the telegram. | 1074 | Persisted, ExpertPage
Start character | If greater than or equal to "0", this value is used as the start character for synchronization | 1269 | Persisted, ExpertPage
Serial StopBits | Number of stop bits to be used. Valid only when Comm. type is Serial | 1240 | Persisted, Enumerated, ExpertPage
Terminating character | The character terminating the string. | 1271 | Persisted, Enumerated, ExpertPage
Timeout | Time-out interval, in milliseconds. | 1261 | Persisted, ExpertPage
Timer | Polling interval / TCP-Send mode transmit interval (seconds) | 1055 | Persisted
TracefileMaxSize | The max size of the trace file, in bytes, before the file is truncated. Default is 50 MB => 50 * 1024 * 1024 | 15000 | Persisted, ExpertPage
TraceToFile | Used to trace detailed information about the incoming data over the link | 15010 | Persisted, File, ExpertPage
Trim control chars | Remove control characters like <CR> <LF> <STX> etc. from the telegram prior to updating the item. Useful when you need to receive "control" characters | 1280 | Persisted, ExpertPage
IP protocol | Protocol type, TCP/UDP. Valid only when Comm. type is Winsock | 1312 | Persisted, Enumerated

See also Module Properties

Commands And Events

The Interpreter module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
DataReady

Event indicating that new data is ready, i.e. the item is updated with a new value and timestamp. Use case: connect HandleExternalItems on consuming modules to this event to instantly transfer the value.

Normal

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items and update/notify the ones that have changed.

Synchronous
StartSend

Triggers send of SendItems when Direction is Write or Write->Read.

Asynchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

An Interpreter item, built from raw data or measurement elements.

The InterpreterItem item type has the following properties:

NameDescriptionIDFlags
Address

Position of field to interpret when multiple items in one telegram.

5020Persisted
CRC calculation

CRC calculation. On an InterpreterItem, checks the CRC of the telegram. On an InterpreterSendItem, generates the CRC (see the CRC sketch after this item's property list).

10010Persisted, Enumerated
CRC position

The position of the CRC, counted from the right.

CRC order

Byte order of multibyte CRC. MSB/LSB

10022Persisted
CRC Skip Left

The number of bytes to exclude from CRC calculation left.

10030Persisted
CRC Skip Right

The number of bytes to exclude from CRC calculation right.

10040Persisted
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Update only on change

Update timestamp only when value changes.

10005Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties
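
To illustrate how the CRC-related properties above might combine, the sketch below uses CRC-16/MODBUS purely as a stand-in algorithm (the actual algorithm is selected by the 'CRC calculation' property); the skip and byte-order handling mirrors the 'CRC Skip Left/Right' and 'CRC order' descriptions.

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS, used here only as a stand-in algorithm."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def telegram_crc(telegram: bytes, skip_left: int = 0, skip_right: int = 0,
                 msb_first: bool = True) -> bytes:
    """Exclude bytes from either end of the telegram before calculating the CRC,
    then emit it MSB-first or LSB-first."""
    end = len(telegram) - skip_right
    crc = crc16_modbus(telegram[skip_left:end])
    high, low = crc >> 8, crc & 0xFF
    return bytes([high, low]) if msb_first else bytes([low, high])

# Hypothetical telegram: skip a leading <STX> byte before the CRC calculation.
print(telegram_crc(b"\x0212.34;56.78", skip_left=1).hex())
```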

Properties

An InterpreterSendItem, built from raw data or measurement elements.

The InterpreterSendItem item type has the following properties:

NameDescriptionIDFlags
Address

Position of field to interpret when multiple items in one telegram.

5020Persisted
CRC calculation

CRC calculation. On InterpreterItem check CRC on telegram. On InterpreterSendItem generate CRC

10010Persisted, Enumerated
CRC position

The position of the CRC, counted from the right.

CRC order

Byte order of multibyte CRC. MSB/LSB

10022Persisted
CRC Skip Left

The number of bytes to exclude from CRC calculation left.

10030Persisted
CRC Skip Right

The number of bytes to exclude from CRC calculation right.

10040Persisted
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Update only on change

Send value only when value has changed.

10005Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Toggle this item's value to force a write when the module is in Write mode.

The CommandItem item type has the following properties:

NameDescriptionIDFlags
Description

Toggle this item to connect to or disconnect from the Interpreter server; true: connect, false: disconnect.

01Persisted
ExtItemOverrideMethod

This attribute decides what method to use when assigning a value from an external item.

5999Persisted, Enumerated, BitMask
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
ResetCommand

Reset when value has been sent (Write Mode).

10110Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
ParentThe Item handle of the parent item to this item, an item in the same module as this item.5500 

The Parent attribute must be added manually and must point to the InterpreterSendItem that is to be sent.

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Item indicating whether the module is connected to the Interpreter server; true: connected, false: disconnected.

The State Item item type has the following properties:

NameDescriptionIDFlags
Description

Item indicating whether the module is connected to the Interpreter server; true: connected, false: disconnected.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Java

This module connects to a Java application using JNI, exchanging data with Apis. The module is called ApisJava.

Provider: Prediktor

Properties

Commands And Events

The Java module has the following item types

Scalar

Vector

Matrix

TimeItem

PersistenceItem

StateFileItem

StateTriggerItem

CommandItem

StatusItem

Properties

The Java module has the following properties:

Name | Description | ID | Flags
AddItembufferSnaphostAvg | The average duration in milliseconds of adding item snapshots | 1530 | PerformancePage
ApplicationFilePath | The file path in which the application files should reside | 1152 | Persisted, Folder
AutoDeletePersistenceItems | If PersistedStateDestination is Items (or Both files and items), items will be created behind the scenes in which the states will be stored. An item of this type will be removed if the last persistence action did not include the state which corresponds to this item and this property is set to true. The reason one might turn off this option is to reduce the time spent on persisting the states | 1590 | Persisted, ExpertPage
Dbg-ExceptionHandlerMode | This value determines how an exception is handled. ONLY USE FOR DEBUGGING PURPOSES! | 3000 | Persisted, Enumerated, ExpertPage
EventReportLevel | The level of the reported events from the Java application. (E.g. if Warning is chosen, Warning and Alarm events will be reported, but Information events will not.) | 1197 | Persisted, Enumerated
ExchangeRate | The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. | 100 | Persisted
ExternalItem report | A status report for the External Item manager of this module | 110 | InfoPage
ItemWriteMasking | Set the quality level of items that will be written to the Java application from this module | 1200 | Persisted, Enumerated, ExpertPage
JavaAppStartfile | Name of the file which the Java application will open when it is first started by the ApisJavaBee. The path of this file is set in the ApplicationFilePath property | 1150 | Persisted
JavaAppVisible | If false, all of the windows of this Java application shall be closed. | 1140 | Persisted, ExpertPage
LibraryFilePath | The file path in which library files should reside | 1154 | Persisted, Folder
MaxItemBufferSize | The maximum number of time steps item values will be buffered | 1195 | Persisted
NativeJavaClass | The Java class which contains the definition of the native C functions | 1090 | Persisted, ReadOnly, ExpertPage
OneStepTimeLapse | The duration in milliseconds of the previous call to the 'OneStep' function | 1500 | PerformancePage
OneStepTimeLapseAvg | The average duration in milliseconds of all calls to the 'OneStep' function | 1505 | PerformancePage
OneStepTimeLapseJavaAvg | The average duration in milliseconds of all calls to the 'OneStep' function, inside the Java application | 1525 | PerformancePage
OneStepTimeLapseMax | The maximum duration in milliseconds of any call to the 'OneStep' function | 1510 | PerformancePage
PersistedStateDestination | Decides the destination of the persisted states. They may be persisted to File(s), Item(s), or Both. State persistence is turned off by selecting None | 1570 | Persisted, Enumerated, ExpertPage
PersistValToInitVal | Choose the strategy for copying and persisting the current value to the InitValue. Tip: consider using an InitVQTFromHoneystore attribute instead, for better performance. | 1220 | Persisted, Enumerated, ExpertPage
RMIServerInfo | The complete RMI server info, if applicable, of the Java application | 1550 | InfoPage
Run | States whether the Java application is simulating (i.e. whether the OnOneStep() Java method should be called on a regular basis) | 1095 | Persisted
SaveAction | Determines when the Java application will be saved | 1148 | Persisted, Enumerated, ExpertPage
StartupJavaClass | The Java class which contains the startup method (StartupJavaMethod) | 1040 | Persisted, ReadOnly, ExpertPage
StartupJavaMethod | The static method which will be called after the JVM and the StartupJavaClass have been loaded | 1050 | Persisted, ReadOnly, ExpertPage
StateFileFolder | The folder in which the state file(s) reside (working folder) | 1600 | Persisted, ExpertPage
StateLoggers | The loggers which will log the state if PersistedStateDestination is Items | 1580 | Persisted, ExpertPage
SupportsAttributes | If true, the Java application supports item-attribute reflection | 1610 | ReadOnly, Hidden
SupportsItemsPersistence | True if state can be persisted to items | 1620 | ReadOnly, Hidden
SupportsPrimitiveArrays | Indicates whether primitive array transfer is supported (this is a more optimal way of transferring vectors and matrices) | 1560 | ReadOnly, Hidden
Timer | A timer which generates events. If set to less than 50, the timer is not active. | 1160 | Persisted
TimeReferenceItem | An item whose value will be used as the time reference for this module instead of the system time, when timestamping items. | 200 | Persisted, ApisItem, ExpertPage
TimeStep | The time step in the Java application (ms) | 1159 | Persisted

See also Module Properties

Commands And Events

The Java module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
OneStepDone

An event indicating that the OnOneStep() java method is finished.

Normal
OnTimer

An event fired at a rate given by the ModelRefreshInterval or Timerperiod property

Normal

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items and update/notify the ones that have changed.

Synchronous
OneStep

A command calling the OnOneStep() java method.

Asynchronous
RefreshTimestamps

Will force refresh of the time stamps of items (Scalar, Vector and Matrix).

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

Scalar value of arbitrary type.

The Scalar item type has the following properties:

NameDescriptionIDFlags
InputValue

If true, this item is an input value into the Java-application (Write). If it's set to false, then this value is an output from the Java-application (Read).

10030Persisted
Itemtype

Decides which type (e.g. Double, String, Bool) the item is to be. (Bool=10, Double=20, String=30)

10020Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Vector value of arbitrary type

The Vector item type has the following properties:

NameDescriptionIDFlags
Dimension

The dimension of a vector item (number of elements).

5007Persisted
InputValue

If true, this item is an input value into the Java-application (Write). If it's set to false, then this value is an output from the Java-application (Read).

10030Persisted
Itemtype

Decides which type (e.g. Double, String, Bool) the item is to be. (Bool=10, Double=20, String=30)

10020Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Matrix value of arbitrary type

The Matrix item type has the following properties:

NameDescriptionIDFlags
Columns

The number of columns in a matrix item.

5009Persisted
InputValue

If true, this item is an input value into the Java-application (Write). If it's set to false, then this value is an output from the Java-application (Read).

10030Persisted
Itemtype

Decides which type (e.g. Double, String, Bool) the item is to be. (Bool=10, Double=20, String=30)

10020Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Rows

The number of rows in a matrix item.

5008Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Item that holds a time value

The TimeItem item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Item that contains the value of a state

The PersistenceItem item type has the following properties:

NameDescriptionIDFlags
PersistenceHierarchy

A string array containing the persistence hierarchy.

10200Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Valuetype

Item canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.

10100Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Contains the file name of a file in which states will be stored

The StateFileItem item type has the following properties:

NameDescriptionIDFlags
FileAccess

Decides whether the file specified in the item is in read, write, or read/write mode (Read=0, Write=1, Read/Write=2).

10300Persisted, Enumerated
ForceStopBeforeRead

This property will force the application to go to a "STOPPED" state before loading the state file. If this attribute is false, the application must be put in the "STOPPED" state manually.

10330Persisted
IncludeStateFileFolder

If this property is true, the state file will be put in the "StateFileFolder", otherwise it will be stored in the current directory.

10320Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
TriggerOnChange

If true, the state file will be read/written to when the item changes.

10310Persisted
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Will trigger read/write of states

The StateTriggerItem item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
ReadStates

If this property is set to "true", it triggers the reading of states, if "false" it triggers writing.

10400Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Performs a specific command when set to true. The command is decided by attribute COMMAND (10500)

The CommandItem item type has the following properties:

NameDescriptionIDFlags
CommandType

This attribute allows you to select the type of command to use.

10500Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Reset

Setting this property to true will perform a reset in the Java application on the next iteration.

10700Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Contains the current status of the module represented as an integer. UNDEFINED = 0, STOPPED = 1, RUNNING = 2, WAITINGFORSTOP = 3, LOADING = 4, LOADED = 5
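
As an illustration, a client reading the StatusItem value as an integer could map the codes listed above to readable names like this (a sketch; only the codes themselves come from the description above):

    // Sketch: mapping the integer status codes of the StatusItem to names.
    public enum JavaModuleStatus {
        UNDEFINED(0), STOPPED(1), RUNNING(2), WAITINGFORSTOP(3), LOADING(4), LOADED(5);

        private final int code;

        JavaModuleStatus(int code) { this.code = code; }

        // Returns the status matching the given code, or UNDEFINED if unknown.
        public static JavaModuleStatus fromCode(int code) {
            for (JavaModuleStatus status : values()) {
                if (status.code == code) return status;
            }
            return UNDEFINED;
        }
    }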

The StatusItem item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
StatusAsString

Set this property to display the current status as a string instead of an integer.

10600Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Logger

This module stores item data into the Apis Honeystore real-time trend historian. Data is retrieved through the Apis OPC HDA server and the Apis Hive UA server; any OPC HDA or UA client can then access the data.

Global attributes

The Apis Logger registers 2 global attributes to the Hive:

  • The regular global boolean attribute that turns trending on/off for the item in question, given the same name as the Apis Logger module itself, i.e. LoggerName
  • A global expression attribute that, when used, holds an expression to be applied to the value before it is stored to the database. The attribute is given the name LoggerName_Expr.

Logging of raw values (VQTs)

To store the raw VQTs of an item into a timeseries, you simply add the regular logger attribute (named LoggerName) to that item, and set its value to true.

Logging of calculated values (VQTs)

To store a derived, calculated value instead of the raw value, you also add the global expression attribute (named LoggerName_Expr) to the item and specify the desired expression. You must also add the regular logger attribute to the item and set its value to true.

When storing a calculated value instead of the raw value, i.e. when using the LoggerName_Expr attribute, the value type of the samples will typically be 8 byte float (unless the "calculation" result is a raw value), and the HDA quality part will be Calculated. Further, the name of the item in the HoneyStore database will be the same as in the case of plain logging: the item ID in the Hive configuration.

Valid expressions are the same as for the Calculations Operator of the Legacy function item, with one limitation: you cannot use more than a single external input in the expression, meaning just a single ex1 is allowed (for instance, an expression such as ex1 * 1.8 + 32 would store the raw value converted from °C to °F).

Hint: If you would like to trend both the raw value and a calculated value of an item, you will need to use two different Apis Logger modules, targeting different HoneyStore databases. In that case you should also specify the TimeseriesAccessOrder of the logger modules.
Further, if you want to access timeseries for both stored items inside your Hive configuration, you can use an ApisHSMirrorBee to expose the items having the higher TimeseriesAccessOrder.

Provider: Prediktor

Properties

Commands And Events

Properties

The Logger module has the following configuration properties:

NameDescriptionIDFlags
AutoDeleteDBSet to TRUE if the database of this module shall be automatically deleted when the module itself is deleted.1500Persisted, ExpertPage
AutoDeleteTrendSpecifies whether trends will automatically be deleted from the database when they are deleted from the namespace of Apis Hive and/or when their global log attribute is set to 'false'.1475Persisted, Enumerated, ExpertPage
AutoLoggerItemIDFiltersA set of item ID wildcard filters that items must satisfy to be automatically logged when their log attribute goes to true. Only applies when DefaultLogAttributeValue is true!1610Persisted, ExpertPage
Cache sizeThe size of each cache in bytes, default value is 10040 bytes1420Persisted, ReadOnly, ExpertPage
DatabaseThe public name of the database into which this instance logs data.1010Persisted, ReadOnly
Database pool recycle strategy

Tells how to recycle the databases in the databasename pool, when Log-switch item triggers a change of active database.

  • None (no database pooling) - reuse the existing database name.
    The current database is renamed by appending its creation date and time to its name, and a new database is created having the same name as before, or named according to the value of the Log-switch new database name item.
    When the strategy is None, the Log-switch databasename pool is ignored.
  • Sliding (first always active) - the first database in the pool is always the active one; a switch simply shifts the history of the first to the second, the second to the third, and the last to nothing.
    When the strategy is Sliding, the Log-switch databasename pool must contain at least one database name.
  • Circular (next in pool active) - the next database in the pool will be set as the active one when a switch is triggered.
    When the strategy is Circular, the Log-switch databasename pool must contain at least one database name.
1718Persisted, Enumerated, ExpertPage
DefaultLogAttributeValueThe default value of the global log attribute when added to items. Note! If you change this property, you need to restart the Hive instance to make it take effect.1600Persisted, ExpertPage
FlushingStrategyDetermines the strategy to use when temporarily flushing the whole cache file to disk.3010Persisted, Enumerated, ExpertPage
FreshestValueWhen TRUE, all items will be read directly from their physical source to receive the freshest possible values.2010Persisted, ExpertPage
Historylength_UnitTotal length of historical data in database is given by multiplying this property with the Historylength_X property.1410Persisted, Enumerated
Historylength_XTotal length of historical data in database is given by multiplying this property with the Historylength_Unit property.1400Persisted
Log timestamp-filter itemA reference to an item giving a maximum timestamp filter for item value timestamps. Items having timestamps newer than this filter value will not be logged. Only applies when Recordtype is Eventbased.1750Persisted, ApisItem, ExpertPage
Log-enable itemWhen the value of this item is 'true', the module is logging timeseries.
When value is 'false', logging is disabled.
1740Persisted, ApisItem, ExpertPage
LogOnStartWhen 'true' and the property RecordType is 'Eventbased...', a sample will be written to the database every time the mode of the database changes to Online, every time a 'LogEnable' item goes true, and every time the module enters the 'Started' running state.1650Persisted, ExpertPage
LogStateResolutionIf using any of 'Log-switch item', 'Log-enable item', 'Log-trigger item' or 'Log-switch new database name item' properties, this is the internal polling period in Apis when checking for updated item states.1690Persisted, ExpertPage
Log-switch databasename poolA pool of database names that will be re-used when the Log-switch item is toggled. If this pool is empty, the current database name (according to the Database property) will be re-used, and the previous database will be set to read-only and renamed by appending the creation date to its name.1720Persisted, ExpertPage
Log-switch item

Set to an item that will control when active log/database is switched.

I.e. the current database will be renamed, set to read-only mode, and a new database will be started. If set to a boolean item, the log switches when the value toggles from 'false' to 'true'. If set to a non-boolean item, the log switches when the value changes.

1700Persisted, ApisItem, ExpertPage
Log-switch new database name itemWhen the database is switched using Log-switch item, and the Database pool recycle strategy is None, the value of this item holds the name of new database to switch to. If the Database pool recycle strategy is other than None, this property is ignored.1710Persisted, ApisItem, ExpertPage
Log-trigger itemSet to an item that will control when data is sampled and stored to the database. If set to a boolean item; data is stored when value toggles 'false' to 'true'. If set to a non-boolean item; data is stored when value changes.1730Persisted, ApisItem, ExpertPage
MaxItemsMaximum number of items (trends) in this database.1050Persisted, ReadOnly
PathThe directory path of the database files.1020Persisted, ReadOnly, Folder
PriorityLevelSpecifies the priority level for the working thread of this ApisLoggerBee instance.3000Persisted, Enumerated, ExpertPage
Recordtype

This property determines how data logged by this module, is stored to the database:

  • Sampled
    Stores data, without quality info, at a regular time interval given by Resolution
  • Sampled with quality
    Stores data, with quality info, at a regular time interval given by Resolution
  • Eventbased with quality
    Stores data, with quality and timestamp info, whenever data (Value, quality or timestamp) changes. The data is scanned for changes at a regular time interval, given by Resolution
1210Persisted, Enumerated
ResolutionWhen 'Recordtype' is one of the sampled options, this is the sampling period in seconds. If 'Recordtype' is 'Eventbased with quality', this is the internal polling period in Apis when checking for updated item values.1300Persisted
TimeseriesAccessOrderAn unsigned integer, determining the order of this ApisLoggerBee module for handling timeseries requests, when an item is logged by more than one ApisLoggerBee module.301202Persisted

Informational properties:

NameDescriptionIDFlags
CachePathThe path to the location of the cache-file of the database.301015InfoPage
ConfigObserverCookieThe cookie identifying this module amongst the configuration aware clients of Apis Hive.301400InfoPage
Database HandleThe internal handle of this database in Apis Honeystore.301010InfoPage
Disk Usage maxThe maximum usage of disk space, given in gigabytes, on the drive that the database of this module is configured to reside on.301052InfoPage
EnabledWhen 'true', the module is logging timeseries; when 'false', logging is disabled301000InfoPage
EnabledReasonAn explanatory text for the 'Enabled' property.301001InfoPage
LogAttributeIDThe global log attribute ID assigned from Apis Hive.301200InfoPage
LogExprAttributeIDThe global log expression attribute ID assigned from Apis Hive.301201InfoPage
RAM UsageThe usage of physical memory (RAM) in megabytes for the database of this module.301050InfoPage
TimeSrvEventID - LogStateThe time-server EventID assigned from the Apis time server for the LogState timer.301150InfoPage
TimeSrvEventID - SamplerThe time-server EventID assigned from the Apis time server for data-sampling timer.301100InfoPage

Performance properties:

NameDescriptionIDFlags
AvgReadTimeThe average time in milliseconds spent reading items from Apis Hive.101010PerformancePage
AvgStoreTimeThe average time in milliseconds spent storing items to Apis Database.101020PerformancePage
AvgTotalTimeThe average time in milliseconds spent in iterating loop.101030PerformancePage
HiveLastUpdCountThe number of items reported as updated at the previous request.101006PerformancePage
NumLogItemsNumber of items this ApisLogger instance stores to its database.101000PerformancePage
NumPendingItemsNumber of pending items this ApisLogger instance is waiting to get correct meta info, to be able to create and store to its database.101001PerformancePage
PeakReadTimeThe maximum time in milliseconds spent reading items from Apis Hive.101040PerformancePage
PeakStoreTimeThe maximum time in milliseconds spent storing items to Apis Database.101050PerformancePage
PeakTotalTimeThe maximum time in milliseconds spent in iterating loop.101060PerformancePage
UpdSinceQueryTimeThe time used when requesting updated items from the Hive.101005PerformancePage

See also Module Properties

Commands And Events

The Logger module has the following Commands and Events:

Events

NameDescriptionEvent Type
LogDoneAn event notifying that a Log-cycle issued by the log-command of this LoggerBee instance has completed.Normal

Commands

NameDescriptionCommand Type
LogA command initiating a Log cycle of the items logged by this LoggerBee instance.Synchronous
Log_DataPushThis command ensures that all items and samples in the data push package are stored to the HoneyStore database where applicable.
See also:  APIS data transfer mechanism; Data Push
Synchronous

See also Commands And Events

Apis Modbus Master

This module communicates with a Modbus slave as a Modbus master, using Modbus RTU or TCP.

Provider: Prediktor

Properties

Commands And Events

The Modbus module has the following item types

DiscretesInput

Coil

InputRegister

HoldingRegister

Command Item

State Item

BitSelect

MaskSelect

Item attribute items

Function item

More information

Quick Start Guide

Modbus Specifications and Implementation Guides

Characteristics of the Apis Modbus Master module:

  • Implements Modbus Master

  • Supports: Modbus TCP, RTU over TCP/IP and Modbus RTU (serial)

  • Supports redundant slaves in TCP mode

  • Uses the following Modbus function codes:

    • 1 Read Coil status

    • 2 Read Input status

    • 3 Read Holding registers

    • 4 Read Input registers

    • 5 Force single Coil

    • 6 Preset single Register

    • 16 Preset multiple Registers

  • Supports the following data types

    • 2 byte signed int

    • 4 byte signed int

    • 2 byte unsigned int

    • 4 byte unsigned int

    • 8 byte unsigned int

    • 4 byte float

    • 8 byte float

    • Unicode String (UTF-16)

    • ASCII String (UTF-8)

    • 2 byte signed int (Sign and magnitude)

    • 4 byte signed int (Sign and magnitude)

    • 2 byte float (Half-precision floating-point)

  • Endianness

    • Big-endian

    • Little-endian

  • Bit numbering

    • MSB..LSB

    • LSB..MSB

Further, as an integrated module in the Apis Hive, the following optional features are available:

  • High performance data logging to the Apis Honeystore historian, with OPC Historical Data Access server interface

Properties

The Modbus module has the following properties:

NameDescriptionIDFlags
AdressMappingTrue: the register-type base address is subtracted from the address given in SrcItemID, e.g. daddr = SrcItemID - 40000 for Holding registers1125Persisted
AdressOffsetTrue: registers are addressed starting at zero, i.e. register 1 is addressed as 01127Persisted
BaudRateBaud rate for serial communication. Only valid when Comm. type is Serial1210Persisted, Enumerated, ExpertPage
ByteorderByte transfer order1121Persisted, Enumerated
COM portThe COM port to use. Only valid when Comm. type is Serial1200Enumerated, ExpertPage
Comm. typeCommunication method; Serial, TCP/IP or None. Set to None if not connected to simulate signals.1110Persisted, Enumerated
DataBitsNumber of bits in the bytes transmitted and received. Only valid when Comm. type is Serial1230Persisted, Enumerated, ExpertPage
Default SlaveaddressThe Slaveaddress (Unit ID) of the Modbus slave; this property can also be set individually at item level.1260Persisted, ExpertPage
EndianModbus is a "big-endian" protocol: that is, the more significant byte of a 16-bit value is sent before the less significant byte. It seems obvious that 32-bit and 64-bit values should also be transferred using big-endian order. However, some manufacturers have chosen to treat 32-bit and 64-bit values as being composed of 16-bit words, and transfer the words in little-endian order. For example, the 32-bit value 0x12345678 would be transferred as 0x56 0x78 0x12 0x34. Select LittleEndian to use this mixed ordering (see the decoding sketch following this table).1120Persisted, Enumerated
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
FlowControlFlow control for serial communication. Only valid when Comm. type is Serial1250Persisted, Enumerated, ExpertPage
Function DelayThe delay, in milliseconds, between multiple function calls when the requested data is not organized contiguously in Modbus registers and multiple calls are required on every read cycle.1060Persisted
IP address or host nameThe IP address or host name of the Modbus server. Only valid when Comm. type is TCP/IP1310Persisted, Computer
Handle no responseHow no or invalid response from a particular slave is handled:

Quality com_failure, keep last value and timestamp

Quality uncertain, keep last value and timestamp

Quality good, keep last value and timestamp

1310Persisted, Computer
LastQueryThe last command sent to the Modbus slave1500ReadOnly, PerformancePage
LastResponseThe last Modbus slave response1501ReadOnly, PerformancePage
Number Of Redundant SlavesThe maximum number of redundant slaves for this instance, valid range is 0-2.2000Persisted, ExpertPage
ParityParity scheme for the serial communication. Only valid when Comm. type is Serial1220Persisted, Enumerated, ExpertPage
Port or serviceThe TCP port or the name of the service socket of Modbus server. Only valid when Comm. type is TCP/IP1311Persisted
Poll intervalSet this to a value other than 0 for timer-based polling (seconds)1055Persisted
Read optimizationValid only for function codes 03 Read Holding Registers and 04 Read Input Registers. The read strategy when data is organized in series with no gaps in the registers. Continuous Datatype (default): multiple values of the same datatype are read. Continuous Address: multiple values are read regardless of datatype.
RedundantComputer_1The computer hosting the redundant Modbus slave, if configured.2010Persisted, ExpertPage
RedundantPort_1The TCP port of the redundant Modbus slave, if configured.2011Persisted, ExpertPage
RedundantComputer_2The computer hosting the redundant Modbus slave, if configured.2020Persisted, ExpertPage
RedundantPort_2The TCP port of the redundant Modbus slave, if configured.2021Persisted, ExpertPage
Reset socket on no responseResets the socket immediately on no response. Set to false if some requests might not get a response, in order to maintain communication with working registers2021Persisted, ExpertPage
Socket timeoutThe timeout, in seconds, for blocking send and receive calls. Only valid when Comm. type is TCP/IP1315Persisted
StopBitsNumber of stop bits to be used. Only valid when Comm. type is Serial1240Persisted, Enumerated, ExpertPage
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TracefileMaxSizeThe max size, in bytes, of the trace file before the file is truncated. Default is 50 MB => 50 * 1024 * 102415000Persisted, ExpertPage
TraceToFileThis is used to trace detailed information about the incoming data over the link15010Persisted, File, ExpertPage
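
To illustrate the word ordering described for the Endian property above, the following sketch shows how a 32-bit value is reassembled from two 16-bit registers in each of the two orderings (the register values are simply the example from the property description):

    // Sketch: reassembling the 32-bit value 0x12345678 from two 16-bit registers.
    public class WordOrderDemo {

        // Big-endian word order: the high word (0x1234) is transferred first.
        static long bigEndianWords(int firstWord, int secondWord) {
            return ((long) (firstWord & 0xFFFF) << 16) | (secondWord & 0xFFFF);
        }

        // "LittleEndian" mixed order: the low word (0x5678) is transferred first,
        // i.e. the bytes appear on the wire as 0x56 0x78 0x12 0x34.
        static long littleEndianWords(int firstWord, int secondWord) {
            return ((long) (secondWord & 0xFFFF) << 16) | (firstWord & 0xFFFF);
        }

        public static void main(String[] args) {
            System.out.printf("%08X%n", bigEndianWords(0x1234, 0x5678));    // 12345678
            System.out.printf("%08X%n", littleEndianWords(0x5678, 0x1234)); // 12345678
        }
    }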

See also Module Properties

Commands And Events

The Modbus module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
ReadDone

An event signalled when a read of the Modbus slave has finished.

Normal
WriteDone

An event signalled when a write to the Modbus slave has finished.

Normal
PollDone

An event signalled when a complete poll operation of the slave has finished.

Use case: connect this event to the "Start Polling" command of another Modbus module accessing the same slave without a polling rate, to avoid two modules accessing the same slave simultaneously and thereby reduce the risk of a corrupt response.

Normal

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
Start Polling

Initiates a poll of the slave.

Asynchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

Single bit Read-Only. Starting address 0x10001 to 0x1FFFF when address mapping is active, or 0 to FFFF when address mapping is deactivated

The DiscretesInput item type has the following properties:

Name DescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
External quality tag

External tag overriding quality.

100040Persisted
External quality mask

Bitmap representing quality bits field, when External quality tag is used.

100041Persisted
External pattern bad quality

Bit pattern determining bad quality, when External quality tag is used.

100042Persisted
External mask uncertain quality

Bit pattern determining uncertain quality, when External quality tag is used.

100043Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Read single

Force master to read this item only in a single request. Used to optimize error handling. WARNING: Enabling this feature on numerous items will affect overall performance

10032ReadOnly
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Single bit Read-Write. Starting address 0x00001 to 0x0FFFF when address mapping is active, or 0 to FFFF when address mapping is deactivated

The Coil item type has the following properties:

Name DescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Direction

Direction of data. Input (false) or Output (true).

5040Persisted
External quality tag

External tag overriding quality.

100040Persisted
External quality mask

Bitmap representing quality bits field, when External quality tag is used.

100041Persisted
External pattern bad quality

Bit pattern determining bad quality, when External quality tag is used.

100042Persisted
External mask uncertain quality

Bit pattern determining uncertain quality, when External quality tag is used.

100043Persisted
EvaluationOrder

 

What order to use to evaluate the values:

Apis: The value is written to the slave on every cycle;

Slave: The value is read from the slave before the write. The value is only written to the slave when it is changed in Apis.

10020Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Read single

Force master to read this item only in a single request. Used to optimize error handling. WARNING: Enabling this feature on numerous items will affect overall performance

10032ReadOnly
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2NormalPage
Write immediately

Send the value immediately to the slave when the value is set, independent of the polling rate (valid only when Direction = true).

10125NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

16 bit word Read-Only. Starting address 0x30001 to 0x3FFFF when address mapping is active, or 0 to FFFF when address mapping is deactivated

The InputRegister item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Byteorder

Only valid when Valuetype is Unicode String or ASCII; overrides the module Byteorder. Byte transfer order.

10013Persisted, Enumerated
Endian

Only valid when the Valuetype property is set to "Unicode String" or "ASCII". This property overrides the module's endian. Modbus is a "big-endian" protocol: that means the more significant byte of a 16-bit value is sent before the less significant byte. Be careful with 32-bit and 64-bit values, which may not be transferred using big-endian order. Some manufacturers treat 32-bit and 64-bit values as being composed of 16-bit words, and transfer the words in little-endian order. For example, the 32-bit value 0x12345678 would be transferred as 0x56 0x78 0x12 0x34. Select LittleEndian to use this mixed ordering.

10012Persisted, Enumerated
External quality tag

External tag overriding quality.

100040Persisted
External quality mask

Bitmap representing quality bits field, when External quality tag is used.

100041Persisted
External pattern bad quality

Bit pattern determining bad quality, when External quality tag is used.

100042Persisted
External mask uncertain quality

Bit pattern determining uncertain quality, when External quality tag is used.

100043Persisted
ModulusValue = registerhigh x Modulus + registerlow. Valid only for 4 byte signed and unsigned int10015Persisted, Enumerated
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Read single

Force master to read this item only in a single request. Used to optimize error handling. WARNING: Enabling this feature on numerous items will affect overall performance

10032ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
StringSize

The number of buffers to read. Only valid when "Valuetype" is "Unicode String" or "ASCII".

10011Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2ReadOnly
Valuetype

Item canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.

10010Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

16 bit word Read-Write. Starting address 0x40001 to 0x4FFFF when address mapping is active, or 0 to FFFF when address mapping is deactivated

The HoldingRegister item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Direction

Direction of data. Input (false) or Output (true).

5040Persisted
Byteorder

Only valid when Valuetype is Unicode String or ASCII; overrides the module Byteorder. Byte transfer order.

10013Persisted, Enumerated
Endian

Only valid when the Valuetype property is set to "Unicode String" or "ASCII". This property overrides the module's endian. Modbus is a "big-endian" protocol: that means the more significant byte of a 16-bit value is sent before the less significant byte. Be careful with 32-bit and 64-bit values, which may not be transferred using big-endian order. Some manufacturers treat 32-bit and 64-bit values as being composed of 16-bit words, and transfer the words in little-endian order. For example, the 32-bit value 0x12345678 would be transferred as 0x56 0x78 0x12 0x34. Select LittleEndian to use this mixed ordering.

10012Persisted, Enumerated
External quality tag

External tag overriding quality.

100040Persisted
External quality mask

Bitmap representing quality bits field, when External quality tag is used.

100041Persisted
External pattern bad quality

Bit pattern determining bad quality, when External quality tag is used.

100042Persisted
External mask uncertain quality

Bit pattern determining uncertain quality, when External quality tag is used.

100043Persisted
EvaluationOrder

 

What order to use to evaluate the values:

Apis: The value is written to the slave on every cycle;

Slave: The value is read from the slave before the write. The value is only written to the slave when it is changed in Apis.

10020Persisted, Enumerated
ModulusValue = registerhigh x Modulus + registerlow. Valid only for 4 byte signed and unsigned int10015Persisted, Enumerated
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
StringSize

The number of buffers to read. Only valid when "Valuetype" is "Unicode String" or "ASCII".

10011Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2NormalPage
Valuetype

Item canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.

10010Persisted, Enumerated
Write immediately

Send the value immediately to the slave when the value is set, independent of the polling rate (valid only when Direction = true).

10125NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Command for controlling the module. #Connect#: toggle this item to connect to the Modbus slave/server; true: connect, false: disconnect

The Command Item item type has the following properties:

NameDescriptionIDFlags
Description

Toggle this item to connect to the Modbus server; true: connect, false: disconnect.

Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

The item #Connected# tells whether the module is connected to the Modbus slave/server; true: connected, false: disconnected.

The item #Response_Ok# tells whether the module is receiving valid responses from the Modbus slave/server; true: OK, false: no or invalid response.

The State Item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Function item

This item is a calculated value based on existing items in Hive. The calculation is a formula based on inputs from external items.

This item supports two different calculators (algorithm syntaxes): C# and Legacy (proprietary).

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value changes.
QualityValueTimestamp: Report as updated if either the Quality, the Value or the Timestamp changes (default).
19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Apis ModbusSlave

This module acts as a Modbus slave.

Provider: Unspecified

Properties

Commands And Events

The ModbusSlave module has the following item types

DiscretesInput

Coil

InputRegister

HoldingRegister

More information

Quick Start Guide

Characteristics of the Apis Modbus Slave module:

  • Implements Modbus Slave
  • Supports Read and Write.
  • Supports Modbus TCP and Modbus RTU.
  • Supports function codes:
    • 1 Read Coil status

    • 2 Read Input status

    • 3 Read Holding registers

    • 4 Read Input registers

    • 5 Force single Coil

    • 6 Preset single Register

    • 15 Force multiple Coils

    • 16 Preset multiple Registers

Further, as an integrated module in the Apis Hive, the following optional features are available:

  • High performance data logging to the Apis Honeystore historian, with OPC Historical Data Access server interface

Item Types

Properties

Single bit Read-Only. Starting address 0x10001 to 0x19999 when address mapping is active, or 0 to FFFF when address mapping is deactivated

The DiscretesInput item type has the following properties:

NameDescriptionIDFlags
Inverted

If this attribute is true, the incoming value is inverted before writing to the Modbus register.

10070Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly

See also Predefined Item Properties and OPC DA Properties

Properties

Single bit Read-Write Starting address 0x00001 to 0x09999

The Coil item type has the following properties:

NameDescriptionIDFlags
Inverted

If this attribute is true, the incoming value is inverted before writing to the Modbus register.

10070Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly

See also Predefined Item Properties and OPC DA Properties

Properties

16 bit word Read-Only Starting address 0x30001 to 0x39999

The InputRegister item type has the following properties:

NameDescriptionIDFlags
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rawvalue

The raw, untranslated value of the item.

10075Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
ValuetypeItem canonical datatype in Modbus register.10045Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Properties

16 bit word Read-Write Starting address 0x40001 to 0x49999

The HoldingRegister item type has the following properties:

NameDescriptionIDFlags
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rawvalue

The raw, untranslated value of the item.

10075Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Slaveaddress

The slave address (Unit ID) of the Modbus slave (overrides the module property).

10030Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Valuetype

Item canonical datatype in Modbus register. This determines what kind of data this field holds. For example: string, integer, etc.

10045Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Commands And Events

The ModbusSlave module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Properties

The ModbusSlave module has the following properties:

NameDescriptionIDFlags
BaudRateBaud rate for serial communication. Only valid when Comm. type is Serial1040Persisted, Enumerated
ByteorderByte transfer order1038Persisted, Enumerated
COM portThe COM port to use. Only valid when Comm. type is Serial1060Persisted, Enumerated
Comm. typeCommunication method, Serial or TCP/IP.1025Persisted, Enumerated
DataBitsNumber of bits in the bytes transmitted and received for serial communication.1072Persisted, Enumerated
Default SlaveaddressThe Slaveaddress (Unit ID) of the Modbus slave; this property can also be set individually at item level.1035Persisted
EndianModbus is a "big-endian" protocol: that is, the more significant byte of a 16-bit value is sent before the less significant byte. It seems obvious that 32-bit and 64-bit values should also be transferred using big-endian order. However, some manufacturers have chosen to treat 32-bit and 64-bit values as being composed of 16-bit words, and transfer the words in little-endian order. For example, the 32-bit value 0x12345678 would be transferred as 0x56 0x78 0x12 0x34. Select LittleEndian to use this mixed ordering.1037Persisted, Enumerated
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
Exception CodeThe exception code to send for testing purposes, valid only when the Send Exception property is greater than zero16010
HandshakingHandshaking option for serial transmission1070Persisted, Enumerated
IP addressThe endpoint IP address of the Modbus server. Only valid when Comm. type is TCP/IP1301Persisted
Master command timeoutTime to wait (in seconds) for a master command before giving up and resetting the connection1302Persisted
Message timeoutThe time interval at which the active message is reported (in seconds).1090Persisted
ParityParity scheme for the serial communication. Only valid when Comm. type is Serial1050Persisted, Enumerated
PortThe endpoint TCP/UDP port of the Modbus server. Only valid when Comm. type is TCP/IP1300Persisted
Send ExceptionThis is used to simulate an exception to the master for test purposes. Set the value to greater than 0 to enable exception responses: 1 - exception on every response, 2 - exception on every second response, and so on.
StopBitsNumber of stop bits to be used for serial communication.1074Persisted, Enumerated
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TracefileMaxSizeThe max size, in bytes, of the trace file before the file is truncated. Default is 50 MB => 50 * 1024 * 102415000Persisted, ExpertPage
TraceToFileThis is used to trace detailed information about the incoming data over the link15010Persisted, File, ExpertPage
WritetimeoutTime to wait (milliseconds) before giving up an attempt to write to master1080Persisted

See also Module Properties

Set up a Modbus Slave

Follow the guide Add Module to Apis Hive, but this time select a module of type ApisModbusSlave from the Module type drop-down list, and name it "ModbusSlave" in this case.

  • After adding the module, select the new module named "ModbusSlave" from the Solution Explorer.

Basic setup, communication interface

The module supports both serial (RTU) and TCP/IP (Modbus TCP) interfaces, depending on your needs. In the Properties Editor, enter values for:

  1. TCP/IP based server:

    • Comm. type: TCP/IP
    • IP address: The endpoint IP address of your ModbusSlave server.
    • Port: The endpoint TCP port of your ModbusSlave server.

  2. Serial communication based server :

    • Comm. type: Serial
    • Com port: Com port connected to the slave.
    • BaudRate: Baud rate of your slave serial setup.
    • DataBits: Data bits of your slave serial setup.
    • FlowControl: Handshake of your slave serial setup.
    • Parity: Parity of your slave serial setup.
    • StopBits: Stop bits of your slave serial setup.

  • Further in the Properties Editor:

    • Default Slaveaddress: Note! This is the initial value used when new items are created.
    • Ensure the byte order and endianness are according to your needs
  • Press "Apply" when done.

Add Items (registers)

Now follow the guide Add Items to a Module, but this time select the Modbus module and add items of one of the register types:

  • Coil
  • DiscretesInput
  • InputRegister
  • HoldingRegister

Example Holding register:

Give the item a proper name, like "Temp_Man_6". Ensure SrcItemID points to a valid register address, like "40002", check the Slaveaddress, and set the correct Valuetype for the value in the register of the slave. Press OK.

Expose data

To expose data, ModbusSlave Items must be connected to a source item . See Connecting Items

Worker.Signal1 is now exposed to Modbus masters (clients) in holding register 40002 as an 8-byte float.
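
To verify what the slave exposes, a Modbus master read can be issued against the module. The following is a minimal sketch, assuming Python with only its standard library, a slave reachable on 127.0.0.1 port 502 with slave address 1, and big-endian byte/word order; adjust these assumptions to match your module configuration.

# Minimal sketch: read holding registers 40002-40005 from the ModbusSlave
# module over Modbus TCP and decode them as an 8-byte float. Register "40002"
# maps to zero-based protocol address 1; an 8-byte float spans 4 registers.
import socket
import struct

HOST, PORT = "127.0.0.1", 502   # assumed IP address / Port of the module
UNIT_ID = 1                     # assumed Slave address
START_ADDRESS = 1               # register 40002 -> zero-based offset 1
QUANTITY = 4                    # 4 x 16-bit registers = 8 bytes

# MBAP header: transaction id, protocol id (0), remaining length (6), unit id,
# followed by the PDU: function 0x03 (Read Holding Registers), address, count.
request = struct.pack(">HHHB", 1, 0, 6, UNIT_ID)
request += struct.pack(">BHH", 0x03, START_ADDRESS, QUANTITY)

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    sock.sendall(request)
    response = sock.recv(1024)

# Response layout: 7-byte MBAP header, function code, byte count, register data.
byte_count = response[8]
register_data = response[9:9 + byte_count]

# Assumes big-endian byte and word order; adjust the decoding if the module is
# configured with a different byte order / endianness.
value = struct.unpack(">d", register_data)[0]
print("Holding register 40002 as 8-byte float:", value)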

Troubleshooting

If no connection is established or no data is received:

  • Use a third-party terminal application, such as PuTTY, to check whether the server is sending telegrams.
  • Check the firewall settings for the port.
  • Check the network connection to the server (ping).

ApisMQTTClientBee

This module can be used to subscribe to messages on topics from an MQTT broker. The messages can be decoded and stored in Value Items. Currently, simple JSON formatted messages are supported. See JSON messages for details. To configure this module, set the needed properties: MQTTAddress, MQTTPort, MQTTClient (any text string), MQTTServicelevel, and MQTTTransport (TcpServer, or TcpServerTls if security is needed).

When using secure communication (TcpServerTls), a certificate from the server will be added under the default certificate path (..instancename/pki/rejected/xxxx.crt). This certificate should be checked and moved to the trusted directory to allow secure communication with this broker.

When these parameters are correct, the status item will show the value "Connected". Add a TopicItem by right-clicking the module on the left side. Configure the topic by setting the correct topic to listen to. It is possible to listen to several topics. The value of the topic item increases by one for each message received from the broker.

Then add a ValueItem. Connect the item to a topic via the "TopicItem" property, and then select the correct text values for the "Key", "TimeId", "QualityId", and "ValueId" properties (see JSON messages). When these properties are set up correctly, the value item will show quality "Good".

"Auto create items" property in TopicItem can also be turned on to automatically decode the message from the broker and create the value items instead of having to manually set up the value items.

Provider: Prediktor

Properties

Commands And Events

Json messages

The ApisMQTTClientBee module has the following item types

Topic

Value Item

Status

ApisMQTTClientBee Item Types

Topics Covered:

Item type: Topic

Topic item class description

The Topic item type has the following attributes:

NameDescriptionIDFlags
TopicThe topics to read messages from11000Persisted
Auto create itemsIf selected then the client will auto create items received in messages11113Persisted

See also Basic Properties and OPC DA Properties

Item type: ValueItem

ValueItem item class description

The ValueItem item type has the following attributes:

NameDescriptionIDFlags
Topic itemThe topic this item is getting the message from11101DynamicEnumeration,APISLocalItem,Persisted
KeyThe key this item is getting values from11103Persisted
Time idThe id of the time value in the message11110Persisted
Quality idThe id of the quality value in the message11111Persisted
Value idThe id of the value in the message11112Persisted

See also Basic Properties and OPC DA Properties

Item type: Status

This item is used to expose internal status information

The Status item type has the following properties:

NameDescriptionIDFlags
QualityItem quality3ReadOnly
RightsItem access rights5ReadOnly
TimeItem timestamp4ReadOnly
TypeItem canonical datatype1ReadOnly
ValueItem value2ReadOnly

See also Predefined Item Properties and OPC DA Properties

Commands And Events

The MQTTClientBee module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified by the user. The timer resolution is specified by the 'ExchangeRate' property.Timer
HandleExternalItems_DataPushEvent fired when an external items update iteration has been handled. I.e. one per command 'HandleExternalItems', one or more per command 'HandleExternalItems_DataPush'.Normal
DataChangedVQTs are updated (only for VQT 2 messages).Normal
OneMessageDecodedA message (containing one or more VQTs) has been decoded.Normal

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
HandleExternalItems_DataPushCommand for updating external items from the item data pushed as input parameter to the command, i.e. fired by the ApisOpcUaBee event 'ServerDataChanged_DataPush'. When fired, the module will update any external items that are part of the data, and update/notify the ones that have changed.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Properties

The MQTTClientBee module has the following properties:

Name Description ID Flags
Client certificate The filename of the client's certificate, used when the broker expects a client certificate to accept the connection. 1015 Persisted, File
Exchangerate The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. 100 Persisted
External item report A status-report for the External Item manager of this module 110 InfoPage
ExtItem full refresh When true, the external items manager will force a full refresh initially on start/reset when reading items. I.e. items not yet initialized in their source, will also trigger an external item update. Default is true. 150 Persisted, ExpertPage
ExtItem pass-through quality Specifies the quality of external item values that will pass through external item transfers. If an external item has worse quality than this mask, the external item transfer is blocked. Default is 'Any quality'. 400 Persisted, Enumerated, ExpertPage
Loglevel Specifies the loglevel for diagnostic messages from this module. 500 Persisted, Enumerated
MQTT address Address to MQTT broker 1002 Persisted
MQTT clean session If selected then the client will not get history messages the broker has stored for this clientid since last disconnection. 1009 Persisted
MQTT clientid A string that defines/identifies the client's session at the broker. 1004 Persisted
MQTT password The client's password. 1006 Persisted, Password
MQTT port MQTT broker port 1003 Persisted
MQTT service level Quality of service 1008 Persisted, Enumerated
MQTT transport Transmission protocol: TcpServer uses unencrypted communication; TcpServerTls uses encrypted communication. 1011 Persisted, Enumerated
MQTT user The client's username. 1005 Persisted
MQTTVersion Version to use when connecting to server 1010 Persisted, Enumerated

See also Module Properties

JSON messages

The messages that are published to the broker should be of JSON type. There are four types of messages which can be decoded by APIS:

  • VQT messages : This type of message can be used to update or create value items.
  • Meta messages : This type of message can be used to update the attributes of an existing value item.
  • Event messages : This type of message can be used to write any events or alarms to any existing tag in APIS.
  • Historical messages : This type of message can be used to write historical data for a value item directly to honeystore.

VQT messages

Currently there are two versions of the MQTT message formats that are supported in APIS.

VQT Version 1

Version 1 for MQTT messages is a simple JSON object that contains the following keys:

  • Message type
  • Message version
  • Value
  • Quality
  • Timestamp

All the values defined in a VQT1 message will have the same timestamp. If timestamp is not defined, the values will be associated with the current time on the system.

VQT1 Example


	{

		"messagetype": "vqt", 
		"version": 1, 
		"t": "2022-09-22T06:06:28.891+00:00", 
		"Tag1": 43, 
		"Tag2": 58,
		"Tag3": 80,
		"q": 0

	}

The user can define the text strings for the time, value and quality of an item. The text strings for Time id, Value id and Quality id in a message must match the Time id, Value id and Quality id for the value items in the property editor. Note: Timezone information needs to be provided in the timestamps.

For the above example message, the properties for the value items in APIS should be set as follows:

Value item    Value id    Time id    Quality id
Tag 1         Tag1        t          q
Tag 2         Tag2        t          q
Tag 3         Tag3        t          q

Mandatory Fields in message:

  • messagetype
  • version
  • value id string

Quality Codes

The quality codes for the value items in VQT 1 messages are as follows:

Quality    Code
Good       0
Bad        Anything other than 0

VQT Version 2

Version 2 for MQTT messages is a simple JSON object that contains the following keys:

  • Message type
  • Message version
  • Quality
  • Timestamp
  • Payload

Payload is the content of the message. It is an array/list of JSON objects for value items and has the following attributes:

  • k (Key)
  • v (Value id)
  • t (Time id)
  • q (Quality id)

All these attributes for value items shall be represented as key:value pairs in the payload.

Timestamp can be defined for the entire message or inside the payload object for every item separately.
With VQT2, it is also possible to have multiple timestamped values associated with a value item in a message, i.e. there can be multiple JSON objects for a single value item by including multiple timestamps inside the payload.
If timestamp is not defined, the values will be associated with the current time on the system.

VQT2 Example


	{
		"messagetype": "vqt",
		"version": 2,
		"payload": 
			[
				{
					"k": "Tag1",
					"v": 10,
					"t": "2022-09-22T06:06:32.913554+00:00",
					"q": 192
				},

				{
					"k": "Tag1",
					"v": 3,
					"t": "2022-09-23T05:56:36.930429+00:00",
					"q": 192
				},

				{
					"k": "Tag2",
					"v": 70,
					"q": 192
				},

				{
					"k": "Tag3",
					"v": 98,
				}
			]
	} 

The text strings for value id, time id and quality id are fixed in this version to "v", "t" and "q" respectively, and must be defined accordingly in the message.

For the above example message, the properties for the value items in APIS should be set as follows:

Value item    Key
Tag 1         Tag1
Tag 2         Tag2
Tag 3         Tag3

Mandatory Fields in message:

  • messagetype
  • version
  • payload
  • k (key)
  • v (value)
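
A VQT version 2 message like the one above can also be built programmatically. The following is a minimal sketch using only the Python standard library; the sample list is a placeholder, and several timestamped samples may share the same key, as described earlier.

import json

# (key, value, quality, ISO 8601 timestamp with timezone) -- placeholder data
samples = [
    ("Tag1", 10, 192, "2022-09-22T06:06:32.913554+00:00"),
    ("Tag1", 3, 192, "2022-09-23T05:56:36.930429+00:00"),
    ("Tag2", 70, 192, "2022-09-23T05:56:36.930429+00:00"),
]

message = {
    "messagetype": "vqt",
    "version": 2,
    "payload": [
        {"k": key, "v": value, "q": quality, "t": timestamp}
        for key, value, quality, timestamp in samples
    ],
}

print(json.dumps(message, indent=2))  # the string to publish to the broker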

Quality Codes

The quality codes for the value items in VQT 2 messages are as follows:

Quality                     Code
Good                        192
Local override              216
Bad                         0
Config error                4
Not connected               8
Device failure              12
Sensor failure              16
Last known                  20
Comm failure                24
Out of service              28
Waiting for initial data    32
Uncertain                   64
Last usable                 68
Sensor cal                  80
EGU exceeded                84
Sub normal                  88

Meta messages

Meta messages can be used to write or edit the attributes/properties of MQTT value items existing in the APIS environment.

A meta message is a JSON object that contains the following keys:

  • Message type
  • Payload

Payload is the content of the message. It is an array of JSON objects for value items containing the properties which need to be written/edited as the keys in its key:value pairs.

While trying to edit any attribute of an item, make sure that it is writable and that the key of the value item is provided. Use the index or id as the value for properties that have a dropdown menu or list to select from.

Meta Example


	{
		"messagetype": "meta",
		"version": 1,
		"payload":
			[
				{
					"k": "Tag1",
					"eu": 105,
					"description": "Description for tag 1"
				},

				{
					"k": "Tag2",
					"Offset": 100
				},

				{
					"k": "Tag3",
					"AlmPrimaryArea": "AlarmArea1",
					"AlmLL": 0,
					"AlmL": 100,
					"AlmH": 200,
					"AlmHH": 350,
					"AlmHelp": "Help for alarm",
					"Logger": true
				}
			]
	}

Mandatory Fields in message:

  • Message type
  • Payload
  • k (key)

Event messages

Event messages can be used to write events or alarms to a tag/variable/item in the APIS environment.

An event message is a JSON object that contains the following keys:

  • Message type
  • Payload

Payload is the content of the message. It is an array of JSON objects for events which need to be written to the items and can have the following attributes per event:

  • source
  • type
  • time
  • severity
  • state
  • currentvalue

Source is the location of the item (that you want to write an event to) inside the Apis event server.

States of Alarms/Events:

Possible states of alarms can be:

  • None
  • Enabled
  • Active
  • AckRequired
  • Acked
  • Confirmed
  • Suppressed
  • Shelved
  • Substates

Event Example

	{
		"messagetype": "event",
		"version": 1,
		"payload":
			[
				{
					"type": "LevelAlarm",
					"source": "ApisHive/Areas/TestArea/TestWorker.Variable1",
					"time": "2022-09-22T11:52:02.438+00:00",
					"severity": 780,
					"state": "Enabled,Acked",
					"currentvalue": 300
				},

				{
					"type": "OffNormalAlarm",
					"source": "ApisHive/Areas/TestArea/TestWorker.Variable2",
					"time": "2022-09-22T11:52:02.438+00:00",
					"activetime": "2022-09-22T11:50:00.000+00:00",
					"severity": 780,
					"state": "Enabled,Active,AckRequired",
					"currentvalue": 0,
					"message": "Alarm on"
				},

				{
					"type": "WatchdogAlarm",
					"time": "2022-09-22T11:52:02.438+00:00",
					"activetime": "2022-09-22T11:50:00.000+00:00",
					"source": "ApisHive/Areas/TestArea/TestWorker.Variable3",
					"severity": 780,
					"state": "Enabled,Active,Suppressed",
					"currentvalue": 150,
					"message": "Value frozen"
				},

				{
					"type": "WatchdogAlarm",
					"time": "2022-09-22T11:52:02.438+00:00",
					"activetime": "2022-09-22T11:50:00.000+00:00",
					"source": "ApisHive/Areas/TestArea/TestWorker.Variable4",
					"severity": 835,
					"state": "Enabled,Confirmed",
					"currentvalue": 100,
					"message": "Value frozen for the last 10 seconds"
				},

				{
					"type": "WatchdogAlarm",
					"time": "2022-09-22T11:52:02.438+00:00",
					"source": "ApisHive/Areas/TestArea/TestWorker.Variable5",
					"severity": 780,
					"state": "Enabled,Active,Shelved",
					"currentvalue": 100,
					"activetime": "2022-09-22T11:50:00.000+00:00",
					"message": "Bad quality"
				}
			]
	}


Mandatory Fields in message:

  • message type
  • payload
  • type
  • source
  • time
  • severity
  • state
  • currentvalue

Historical messages

Historical messages can be used to add data to the honeystore for value items. This type of message adds data directly to the database that's responsible for logging the value items.

Historical message type is a JSON object that contains the following keys:

  • Message type
  • Message version
  • Payload

The payload is an array of JSON objects and can contain historical data for one or multiple value items with each JSON object inside the array dedicated to a single value item.

The payload array contains JSON objects with the following elements/keys:

  • k
  • v
  • overwrite

where,
'k' defines the tag/value item for which the history needs to be written,
'v' is an array of VQTs containing the actual historical values along with their timestamps and qualities,
'overwrite' is a true/false value that defines whether the existing history should be overwritten or not.

Note: The VQTs need to be ordered from the oldest to the newest timestamp inside the 'v' array.

Historical message Example


{
    "messagetype":"historical",
    "version":1,
    "payload":
    [
        {
            "k":"tag1",
            "v":
            [
                {
                    "v":25,
                    "q":192,
                    "t":"2022-09-25T20:00:00.000+00:00",                    
                },
                {
                    "v":23,
                    "q":192,
                    "t":"2022-09-25T21:00:00.000+00:00"
                }
            ],
            "overwrite":true
        },

        {
            "k":"tag2",
            "v":
            [
                {
                    "v":40,
                    "q":192,
                    "t":"2022-09-25T10:02:00.000+00:00",                    
                },
                {
                    "v":34,
                    "q":192,
                    "t":"2022-09-25T12:03:00.000+00:00"
                }
            ],
            "overwrite":true
        }
    ]
}

Mandatory Fields in message:

  • messagetype
  • payload
  • v (array field)
  • k (key)
  • v (value)
  • t (timestamp)
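
Since the VQTs must be ordered oldest to newest (see the note above), a sender can sort them before building the message. The following is a minimal sketch using only the Python standard library; the tag name and sample values are placeholders.

import json
from datetime import datetime

# Placeholder VQTs, deliberately out of order.
vqts = [
    {"v": 23, "q": 192, "t": "2022-09-25T21:00:00.000+00:00"},
    {"v": 25, "q": 192, "t": "2022-09-25T20:00:00.000+00:00"},
]

# Sort oldest to newest; fromisoformat understands the "+00:00" offset.
vqts.sort(key=lambda vqt: datetime.fromisoformat(vqt["t"]))

message = {
    "messagetype": "historical",
    "version": 1,
    "payload": [
        {"k": "tag1", "v": vqts, "overwrite": True},
    ],
}

print(json.dumps(message, indent=2))  # the string to publish to the broker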

Apis OPC

The Apis OPC Client module connects the Apis system to an OPC Data Access Server. An OPC Data Access Server is a software driver that enables a standard way of accessing real-time data from a system. For further information about OPC technology, see: http://www.opcfoundation.org/.

Provider: Prediktor

Properties

Commands And Events

The OPC module has the following item types:

More information

Quick Start Guide

Characteristics of the ApisOPCBee Client module:

  • Implements an OPC Data Access 1.01a and 2.05a client.
  • Supports both polling and subscriptions when reading data.
  • Supports both asynchronous and synchronous writing.
  • Automatically re-connects if the connection to the OPC Server breaks down (such as temporary loss of network connections, etc.)

Further, as an integrated module in the Apis Hive, the following optional features are available:

  • High performance data logging to the Apis Honeystore historian, with OPC Historical Data Access server interface
  • Enhances OPC DA 1.01a servers to OPC DA 2.05a servers
  • Cross connection between OPC servers, or cross connection between an OPC server and another Apis Access module
  • Conventional and Advanced Control with Apis Soft PLC or Apis Advanced Soft Controller
  • Access to OPC data through the Apis Process Explorer web application, as well as other Apis report generation tools.

Properties

The OPC module has the following properties:

Standard properties

NameDescriptionIDFlags
ComputerThe computer hosting the OPC server.1000Persisted, Computer
DataSourceThis property tells the OPC server how to retrieve its data. 'Cache' means that the OPC server should use any internal cached data, 'Device' means that the OPC server should refresh the data from any underlying device (ie. PLC)1019Persisted, Enumerated
DeadbandPercent deadband for item updates, when ReadMode is subscribe.1022Persisted
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ReadModeHow items are being read from the server.1020Persisted, Enumerated
ServerThe ProgID of the OPC server. Can also use a class ID directly, eg. if the class id is known and server browsing is unavailable.1001Persisted, Enumerated, ProgID
UpdaterateUpdate rate in milliseconds for the items, when ReadMode is Subscribe x.0. When ReadMode is Asynch I/O, the interval between each issued Refresh call. When ReadMode is Synch I/O, the interval between each Read call.1021Persisted
WriteModeHow items are being written to the server.1030Persisted, Enumerated

Advanced properties

NameDescriptionIDFlags
CmnItemIDPrefixCommon source ItemId prefix. This string will prefix each item's 'ItemID' when communicating with the OPC server. Items in this module will have names without this prefix in the Apis namespace.1012Persisted, ExpertPage
DelayServerConnectDelay server connect on startup, connection will be performed when all modules are fully initialized.1045Persisted, ExpertPage
ForceOutOfProcPrevents the OPC server from being loaded into the memory space of Apis.1010Persisted, ExpertPage
ForceRefreshOnConnectBy enabling this with subscription (1.0 or 2.0), the client will request a distinct refresh of all items from the server after a connect. Note that the OPC server should refresh all items after an Advise; this is a feature for working with OPC servers that do not meet the specification in this regard.1007Persisted, ExpertPage
GroupActiveActive flag for the group. Automatic item activation will cause this flag to change dynamically when needed.1033ExpertPage
ItemActivationItem activation strategy.1034Persisted, Enumerated, ExpertPage
ItemActiveTimeoutThe interval in seconds to wait before an item is deactivated when there has been no reads from the item. (Only for 'ItemActivation' = 'automatic with timeout')1036Persisted, ExpertPage
ItemPropSyncWhen to synchronize OPC DA properties with Apis item attributes1038Persisted, Enumerated, ExpertPage
NumberOfRedundantServersThe maximum number of redundant servers for this instance, valid range is 0-5.2000Persisted, ExpertPage
ReconnectSrvShutdownReconnect after an intended OPC server shutdown. This might cause problems when administering the OPC server.1008Persisted, ExpertPage
ReconnectTimeThe time to wait before attempting to reconnect the server after an RPC failure, or time interval for watchdog evaluation if watchdog is configured.1006Persisted, Enumerated, ExpertPage
RedundantCmnItemIDPrefix_xCommon source ItemId prefix for the redundant server 'x'. This string will prefix each item's 'ItemID' when communicating with the OPC server. Items in this module will have names without this prefix in the Apis namespace.20x2Persisted, ExpertPage
RedundantComputer_xThe computer hosting the redundant OPC server 'x', if configured.20x0Persisted, Computer, ExpertPage
RedundantServer_xThe ProgID of the redundant OPC server 'x', if configured.20x1Persisted, Enumerated, ProgID, ExpertPage
SerializeCallsToOPCServerSet this property to <true> if the OPC Server is not handling simultaneous calls properly.1042Persisted, ExpertPage
ServerWatchdog

A strategy for server communication monitoring. This can be useful when dealing with erroneous servers that stop operating correctly. Values are:

  • none=0
  • Restart if update interval > ReconnectTime
  • Restart if watchdog item(s) update interval > ReconnectTime
  • Restart if watchdog item(s) update or bad quality interval > ReconnectTime
1039Persisted, Enumerated, ExpertPage
ServerWatchdogItemsSet to one or more local items, when using a ServerWatchdog utilizing watchdog item(s)1043Persisted, ApisLocalItem, ExpertPage
SrvConfigFileThe configuration file of the server. This file is used with the optional IPersistFile implementation of the OPC server.1004DSN, ExpertPage
SrvLCIDLocale ID of values coming from the server. You might need to specify this property if the OPC server provides string values that are converted to another type in your client (e.g. DDE bridges)1018Persisted, ExpertPage
SuppressVendorQualityWhen true, any vendor specific item quality will be suppressed; only the standard OPC Foundation bits of the quality flag will be set for item qualities1025Persisted, ExpertPage
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TraceOPCServerReport specific OPC calls, return from those calls, as well as callbacks to the log files for log view in Apis Management Studio.1040Persisted, Enumerated, ExpertPage
TraceToFile1If set, all OPC calls and callbacks will be traced to this file. Should only be used for short time periods and for verbose troubleshooting of communication.1041Persisted, File, ExpertPage
UseApisTimeUse Apis timestamps instead of the timestamps given by the OPC server.1023Persisted, ExpertPage

Information properties

NameDescriptionIDFlags
DebugInfoDebug information for advanced troubleshooting1058ReadOnly, InfoPage
ExternalItem reportA status-report for the External Item manager of this module110InfoPage
RedundantSrvCLSID_xThe CLSID of the redundant OPC server 'x', if configured.20x3Persisted, ReadOnly, InfoPage
SrvActiveCurrently active server; when using redundant servers, this tells which of the two is currently set as the active server (Primary / Redundant).1046ReadOnly, InfoPage
SrvBandWidthThe approximate percentage of bandwidth in use by the OPC server1055ReadOnly, InfoPage
SrvCapabilitiesThe capabilities of the OPC server.1048Persisted, ReadOnly, InfoPage
SrvCLSIDThe CLSID of the OPC server.1005Persisted, ReadOnly, InfoPage
SrvCurrentTimeThe current time (UTC) as known by the OPC server.1051ReadOnly, InfoPage
SrvGroupCountThe total number of groups (all public and private) being managed by the OPC server1054ReadOnly, InfoPage
SrvStartTimeThe time (UTC) the OPC server was started.1050ReadOnly, InfoPage
SrvStateThe current status of the OPC server1053ReadOnly, Enumerated, InfoPage
SrvVendorInfoVendor specific information about the OPC server1057Persisted, ReadOnly, InfoPage
SrvVersionThe version number of the OPC server (major-minor-build)1056Persisted, ReadOnly, InfoPage

Performance

NameDescriptionIDFlags
SrvAvgItemsPerCallThe average number of items in each update call.1064ReadOnly, PerformancePage
SrvLastPendWCTIDThe last pending write complete transaction ID.1092ReadOnly, PerformancePage
SrvLastUpdateIntervalWhen ReadMode is Subscribe or Asynch I/O, the time in seconds between the last two updates. When ReadMode is Sync I/O, the time interval between the last successful Reads.1061ReadOnly, PerformancePage
SrvLastUpdateTIDThe last update or read transaction ID that was received.1072ReadOnly, PerformancePage
SrvLastWCTIDThe last received write complete transaction ID.1094ReadOnly, PerformancePage
SrvPercentUpdateInvalidThe percentage of all updated items that were invalid (e.g. unknown item handles).1070ReadOnly, PerformancePage
SrvPercentUpdateSuccessThe percentage of all updated items that have been updated with success.1066ReadOnly, PerformancePage
SrvPercentUpdateUnneccessaryThe percentage of all updated items that were unnecessary (neither value nor quality had changed).1068ReadOnly, PerformancePage
SrvPercentWriteCmplFailThe percentage of all write complete items that have failed.1088ReadOnly, PerformancePage
SrvPercentWriteCmplInvalidThe percentage of all write complete items that were invalid (e.g. unknown item handles).1090ReadOnly, PerformancePage
SrvPercentWriteCmplSuccessThe percentage of all write complete items that have succeeded.1086ReadOnly, PerformancePage
SrvUpdateCallsThe number of times the OPC server has called back to this client with updated item values.1062PerformancePage
SrvUpdateTimeThe time the OPC server sent the last data value update to this client, as known by the server.1052ReadOnly, PerformancePage
SrvUpdateTimeClientWhen ReadMode is Subscribe or Asynch I/O, the time when this client received the last update, as known by the client. When ReadMode is Synch I/O, the time of the last successful Read.1060ReadOnly, PerformancePage
SrvWriteCmplCallsThe number of times the OPC server has called back to this client with updated item values.1082PerformancePage
SrvWriteCmplTimeClientThe time when this client received the last write complete call, as known by the client.1080ReadOnly, PerformancePage

See also

Module Properties

1

TraceToFile has by default a maximum count of 10 files with a maximum size of 64 MB; other values can be specified in the module-specific registry key using the DWORD values TraceToFileMaxCount and TraceToFileMaxSize

Commands And Events

The OPC module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.Timer
ServerConnectedThe OPC server is connected.Normal
ServerDataChangeThe OPC server has sent a subscription based data change notification.Normal
ServerFirstTimeDataChangedThe OPC server has sent a subscription based data change notification first time after reconnect.Normal
ServerShutdownThe OPC server has sent a shutdown notification.Normal

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
ResetExternalItemsResets the external item manager, by forcing a full refresh of all external items when the 'HandleExternalItems' is fired next time.Synchronous
ReconnectReconnects to the OPC server.Asynchronous
ResetResets the OPC items in the client by performing a synchronous read from server.Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also

Commands And Events

Item Types

OPC Item

An item in an OPC server

The OPC Item item type has the following properties:

Standard

NameDescriptionIDFlags
SrcItemIDItem ID in source5030Persisted

Advanced

NameDescriptionIDFlags
ActiveInServerTrue when this item is active in the OPC server.10110ReadOnly, ExpertPage
LocalOverrideToBadIf true, a good:local override item quality from the server will be remapped to a bad quality.10130Persisted, ExpertPage
ReqVartypeThe type requested to be delivered for the value of this item from the server. Only change this one for special situations.10100Persisted, Enumerated, ExpertPage

Information

NameDescriptionIDFlags
Client handleThe client handle of the item, as registered with the OPC server. When negative, the item has not been accepted by the OPC server.10160ReadOnly, InfoPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Command Item

An item that represents a command to apply on this module.

The Command Item item type has the following properties:

NameDescriptionIDFlags
Command argument 1

This attribute may be used by an item to specify additional arguments.

10150Persisted
Command type

This attribute allows you to select the type of command to use.

10140Persisted, ReadOnly, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read" or "read write", meaning the item may only be read from, or may also be written to. For OPC items, the written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Trans OPC Item

An OPC item with automatic calculation of a linear transformation.

The Trans OPC Item item type has the following properties:

NameDescriptionIDFlags
ActiveInServer

True when this item is active in the OPC server.

10110ReadOnly, ExpertPage
Client handle

The client handle of the item, as registered with the OPC server. When negative, the item has not been accepted by the OPC server.

10160ReadOnly, InfoPage
LocalOverrideToBad

If true, a "good:local" override item quality from the server will be remapped to a "bad" quality.

10130Persisted, ExpertPage
Offset

The linear transformation addend to use when calculating an item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
ReqVartype

The type requested from the server for the value of this item. Only change this property in special circumstances.

10100Persisted, Enumerated, ExpertPage
Rights

Item access rights - this property may be "read" or "read write", meaning the item may only be read from, or may also be written to. For OPC items, the written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

State Item

An item that represents a well defined state of this module.

The State Item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read" or "read write", meaning the item may only be read from, or may also be written to. For OPC items, the written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Module State Items

An item which retrieves status information from the module

The Module state items item type has the following properties:

NameDescriptionIDFlags
Item type

The item type this item aggregates statistics on, when applicable. Use the number inside the parentheses in "FileAdd" configuration files.

19001Persisted, ReadOnly, DynamicEnumeration
Module state

The kind of module state information represented by this item. This can be a number of items having a given quality, a total number of items, the time the newest/oldest item was updated. Use a number inside the parentheses in "FileAdd" configuration files.

19000Persisted, ReadOnly, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read" or "read write", meaning the item may only be read from, or may also be written to. For OPC items, the written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Item Attribute Items

An item which exposes an attribute of another item in the module

The Item attribute items item type has the following properties:

NameDescriptionIDFlags
Attribute ID

The ID of the attribute this item exposes from an item.

19002Persisted, ReadOnly, DynamicEnumeration
ParentItem

The parent item name of this item.

5502Persisted, ReadOnly, ApisLocalItem
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read" or "read write", meaning the item may only be read from, or may also be written to. For OPC items, the written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Function item

This item is a calculated value based on existing items in Hive. The calculation is a formula based on inputs from external items.

This item has two different calculators (algorithm syntax), C# and Legacy (proprietary), see:

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value changes.
QualityValueTimestamp: Report as updated if either Quality, Value or the Timestamp change (default).
19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Apis OpcUa

This module connects to an OPC UA server and provides access to its items.

Provider: Prediktor

Properties

Commands And Events

The OpcUa module has the following item types

OPC Item

Command Item

Trans OPC Item

State Item

OPC Method Item

Event Monitor Item

Module State Items

Item Attribute Items

Namespace Items

Function item

OPC UA catch-up

OPC UA Catch-up

Properties

The OpcUa module has the following properties:

NameDescriptionIDFlags
BrowseTypeFilterThe type definition base class for nodes returned from browsing.1025Persisted, Enumerated, ExpertPage
BatchSizeThe batch size for the number of operations per RPC message towards the UA server. Default is 10000.1054Persisted, ExpertPage
CatchupChunkSizeHow many values to read per node during catch-up read operations. Maps to 'numValuesPerNode' parameter of 'ReadRawModifiedDetails'. Default is 1000.1261Persisted, ExpertPage
CatchupMode

Enables data catch-up functionality of the communication.

  • NoCatchup: No data catch-up, just pure real-time communication.
  • SerializedFull: Historical data is streamed through Hive sample-by-sample, until we have caught up real-time. Note that this requires the UA server to implement the OPC UA HA profile!
  • SerializedFull_PauseAfterInitial: Same as 'SerializedFull', but with one stop in playback sequence after first StepNext, use CommandItem #ContinueCatchup# to continue playback sequence.
  • SerializedPartial: Only data missing locally, per item, is read from remote server and streamed through Hive, sample-by-sample, until we have caught up real-time.
  • SerializedPartial_PauseAfterInitial: Same as 'SerializedPartial', but with one stop in playback sequence after first StepNext, use CommandItem #ContinueCatchup# to continue playback sequence.
  • Direct: Data periods missing locally are read from the UA server and written directly to any configured Honeystore timeseries database for the item(s).

Note that to fully benefit from using catch-up, the UA server is required to implement the OPC UA HA profile!

More details on how to use catch-up can be found here.

1250Persisted, Enumerated, ExpertPage
CatchupChunkCountHow many chunks of data to cache during catch-up history read operations. Must be between 0-255. Default is 5.1260Persisted, ExpertPage
CatchupPeriodThe maximum period of time to look back for historical data, when initiating a catch-up operation. Select a preset, or enter a custom value in seconds.1262Persisted, Enumerated, ExpertPage
EnableModelChangeEventBy enabling this property, any model change events issued from the OPC server as a result of items being added will automatically add items to this module.1230Persisted, ExpertPage
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ItemNameFormat

How to generate item names on items from the UA NodeID, DisplayName and/or BrowseName. Set of predefined choices, or a custom string, containing the following specifiers:

  • {dn} = DisplayName
  • {bn} = BrowseName
  • {ni} = NodeId

Example: {bn}mystring{dn}

1020Persisted, Enumerated, ExpertPage
ItemNameSeparatorA character or string used when concatenating item names. Applies only when ItemNameFormat is ConcatenateDisplayName or ConcatenateBrowseName.1022Persisted, ExpertPage
LifetimeCountRequested lifetime count. The lifetime count shall be a minimum of three times the keep-alive count. When the publishing timer has expired this number of times without a Publish request being available to send a NotificationMessage, then the Subscription shall be deleted by the Server.1052Persisted, ExpertPage
MaxKeepAliveCountRequested maximum keep-alive count. When the publishing timer has expired this number of times without requiring any NotificationMessage to be sent, the Subscription sends a keep-alive Message to the Client. The negotiated value for this parameter is returned in the response. If the requested value is 0, the server shall revise with the smallest supported keep-alive count.1051Persisted, ExpertPage
NotificationsPerPublishThe maximum number of notifications that the Client wishes to receive in a single Publish response. A value of zero indicates that there is no limit. The number of notifications per Publish is the sum of monitoredItems in the DataChangeNotification and events in the EventNotificationList.1053Persisted, ExpertPage
PublishingIntervalThis interval defines the cyclic rate that the Subscription is being requested to return Notifications to the Client. This interval is expressed in milliseconds. The negotiated value for this parameter returned in the response is used as the default sampling interval for MonitoredItems assigned to this Subscription. If the requested value is 0 or negative, the server shall revise with the fastest supported publishing interval.1050Persisted
QualityOverridingThis property controls whether the module itself is allowed to set item qualities in some situations. If 'when connection lost', qualities will be set to 'Bad: not connected' when the connection to the UA server is lost. If 'when write fails', qualities will be set to whatever the write class returned when a failed write happens. If set to 'never', no qualities other than what is received from the UA server will be set on item qualities. Note: this property only applies when CatchupMode = NoCatchup.1340Persisted, Enumerated, ExpertPage
ServerThe OpcUa connection or OpcUa cluster item to use for the server connection.1000Persisted, Enumerated
ServerWatchdogA strategy for server communication monitoring. This can be useful when dealing with erroneous servers that stop operating correctly.1360Persisted, Enumerated, ExpertPage
ServerWatchdogItemsSet to one or more local items, when using ServerWatchdog = 'Restart if watchdog item(s) update interval > ReconnectTime'1365Persisted, ApisLocalItem, ExpertPage
SrvLCIDLocale ID of values coming from the server. You might need to specify this property if the OPC server provides string values that are converted to another type in your client (e.g. DDE bridges)1218Persisted, ExpertPage
StandardPropSync

When to synchronize selected OpcUA Standard Properties from server nodes to local apis item attributes (EngineeringUnit, EURange, InstrumentRange).

  • First time adding item
  • Each session
  • After replication
1310Persisted, EnumeratedFlags, ExpertPage
SubscriptionActiveActive flag for the subscription.1300Persisted, ExpertPage
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TraceLogLevelThe level of trace messages to log.1420Persisted, Enumerated, ExpertPage
TraceOPCServerReport specific OPC calls, return from those calls, as well as callbacks to the log files for log view in Apis Management Studio.1400Persisted, Enumerated, ExpertPage
TraceToFile1If set, all OPC calls and callbacks will be traced to this file. Should only be used for short time periods and for verbose troubleshooting of communication.1410Persisted, File, ExpertPage
TraceToFileVerbositySpecifies how much information is written to the trace file, if property 'TraceToFile' is set to a valid file. (Property ID: 1411)1411Persisted, File, Enumerated, ExpertPage
UseApisTimeUse Apis timestamps instead of the timestamps given by the OPC server.1223Persisted, ExpertPage
WriteModeHow items are being written to the server. Synchronous: items are written through a blocking call; Asynchronous: items are written through a non-blocking call. When the module is writing data, it might be desirable to do so without blocking any executing thread/operation, like any other eventbroker commands configured on the same eventbroker event, i.e. like External items transfer.1201Persisted, Enumerated
WriteTimeStampSelect the timestamp to include, if any, when writing a data value to the OPCUA server.1500Persisted, ExpertPage

Informational properties:

NameDescriptionIDFlags
DebugInfoDebug information for advanced troubleshooting2200ReadOnly, InfoPage
ExternalItem reportA status-report for the External Item manager of this module110InfoPage
RevisedKeepAliveCountThe actual maximum keep-alive count. The Server should attempt to honor the Client request for this parameter, but may negotiate this value up or down to meet its own constraints.2111InfoPage
RevisedLifetimeCountThe lifetime of the Subscription shall be a minimum of three times the keep-alive interval negotiated by the Server.2112InfoPage
RevisedPublishingIntervalThe actual publishing interval that the Server will use, expressed in milliseconds. The Server should attempt to honor the Client request for this parameter, but may negotiate this value up or down to meet its own constraints.2110InfoPage
ServerNamespaceArrayThe NamespaceArray of the UA server.2060Persisted, ReadOnly, InfoPage
SrvActiveCurrently active server; when using redundant servers, this tells which of the two is currently set as the active server (Primary / Redundant).2050ReadOnly, InfoPage
SrvCurrentTimeThe current time (UTC) as known by the OPC server.2121ReadOnly, InfoPage
SrvStartTimeTime (UTC) the server was started. This is constant for the server instance and is not reset when the server changes state. Each instance of a server should keep the time when the process started.2120ReadOnly, InfoPage
SrvStateThe current status of the OPC server2105ReadOnly, InfoPage
UaLibSessionStatusThe last session status change, as reported by the UA client library.2100ReadOnly, InfoPage
UaLibSubscriptionStatusThe last subscription status change, as reported by the UA client library.2101ReadOnly, InfoPage

Performance properties:

NameDescriptionIDFlags
SrvLastWriteCallTimeThe last time we have written (one or more items) to the OPC server.3080ReadOnly, PerformancePage
SrvAvgItemsPerCallThe average number of items in each update call.3064ReadOnly, PerformancePage
SrvLastUpdateIntervalThe time in seconds between the last two updates.3061ReadOnly, PerformancePage
SrvPercentUpdateInvalidThe percentage of all updated items that were invalid (e.g. unknown item handles).3070ReadOnly, PerformancePage
SrvPercentUpdateSuccessThe percentage of all updated items that have been updated with success.3066ReadOnly, PerformancePage
SrvPercentUpdateUnneccessaryThe percentage of all updated items that was unnecessary (neither value nor quality had changed).3068ReadOnly, PerformancePage
SrvPercentWriteCmplFailThe percentage of all write complete items that have failed.3088ReadOnly, PerformancePage
SrvPercentWriteCmplInvalidThe percentage of all write complete items that were invalid (e.g. unknown item handles).3090ReadOnly, PerformancePage
SrvPercentWriteCmplSuccessThe percentage of all write complete items that have succeeded.3086ReadOnly, PerformancePage
SrvUpdateCallsThe number of times the OPC server has called back to this client with updated item values.3062PerformancePage
SrvUpdateTimeClientThe time when this client received the last update, as known by the client.3060ReadOnly, PerformancePage
SrvWriteCallsThe number of times the OPC server has called back to this client with updated item values.3082PerformancePage

Connection properties:

The following properties have been deprecated by the introduction of the Server property.
When the Server property is set to "<Custom configuration>", the properties in this list behave as in prior versions of Apis Hive. When the Server property is set to an OpcUa connection or OpcUa cluster, the properties in this list get the additional flags ReadOnly and Hidden (except the ServerEndpoint property, which becomes ReadOnly but not Hidden; this property can then be used to check which server endpoint is active when connecting to an OpcUa cluster).

NameDescriptionIDFlags
AuthenticationThe authentication method used when connecting to server. Can be:
  • Anonymous
  • Username/password

If set to 'Username/password', the properties 'Username' and 'Password' must also be specified.

1061Persisted, Enumerated (ReadOnly, Hidden)
CertificatePath

Use this property to specify which certificate the OpcUaBee should use when connecting to the server.

When property 'PkiType' is 'Win32', this property specifies the subject-name of a certificate in the Windows Certificate Store. Typically this will be the name of the ApisHive instance.

  • Example using Win32: "ApisHive"

When property 'PkiType' is 'OpenSSL', this property specifies the name of a certificate file in the '<certificatestore>\trusted' folder.

  • Example using OpenSSL: "ApisHive.der"
1042

Persisted, File

(ReadOnly, Hidden)

CertificateStorePath

This property specifies where the OpcUaBee should look for certificates. Default value is empty.

If PkiType is "Win32", this property specifies a folder within the Windows Certificate Store. If the property is empty, the OpcUaBee will use the value "UA Applications".

If PkiType is "OpenSSL", this property specifies a file system path relative to the ApisHive instance configuration folder. If the property is empty, the OpcUaBee will use the value "pki", which resolves to the folder "<APISDir>\Config&lt;InstanceName>\pki".

1041

Persisted, Folder

(ReadOnly, Hidden)

MessageSecurity

This property sets the required security level for OpcUa messages. When setting this property, you may also have to set others.

If set to:

  • Unsecure: You must set 'SecurityPolicy' to 'None'.
  • 'Signed' or 'Signed and encrypted': You must specify a 'SecurityPolicy' other than 'None'
1045

Persisted, Enumerated

(ReadOnly, Hidden)

NetworkTimeout

This property specifies the timeout period for RPC messages sent to the OpcUa server, measured in milliseconds. Default is 5000.

1111

Persisted

(ReadOnly, Hidden)

PkiType

This property selects which certificate handling implementation to use (PKI=Public Key Infrastructure). The options are:

  • OpenSSL: using the filesystem to access certificates
  • Win32: using the Windows Certificate Store.
1040

Persisted, Enumerated

(ReadOnly, Hidden)

Password

Password used to authenticate with server. Only applies when the property 'Authentication' is 'Username/password'.

1003

Persisted

(ReadOnly, Hidden)

PrivateKeyPath

This property is used to select the private key for the active certificate.

When PkiType is 'OpenSSL', the property specifies the name of a key-file in the "<certificatestore>\private" folder, encoded in PEM format. Typically, if CertificatePath is "ApisHive.der", PrivateKeyPath would be "ApisHive.pem".

When PkiType is 'Win32', this property is ignored (the Windows Certificate Store handles the link between certificates and their private key).

1043

Persisted, File

(ReadOnly, Hidden)

ReconnectTime

The time to wait before attempting to reconnect the server after an RPC failure, or time interval for watchdog evaluation if watchdog is configured.

1355Persisted, Enumerated, ExpertPage
RedundantCertificatePath

This property is identical to the CertificatePath property, but for the redundant server.

1542Persisted, File, ExpertPage
RedundantCertificateStorePath

This property is identical to the CertificateStorePath property, but for the redundant server.

1541Persisted, Folder, ExpertPage
RedundantMessageSecurity

This property is identical to the MessageSecurity property, but for the redundant server.

1545

Persisted, Enumerated, ExpertPage

(ReadOnly, Hidden)

RedundantPkiType

This property is identical to the PkiType property, but for the redundant server.

1540

Persisted, Enumerated, ExpertPage

(ReadOnly, Hidden)

RedundantPrivateKeyPath

This property is identical to the PrivateKeyPath property, but for the redundant server.

1543

Persisted, File

(ReadOnly, Hidden)

RedundantSecurityPolicy

The OpcUa securitypolicy URI to use when connecting to the redundant server. Default value is "http://opcfoundation.org/UA/SecurityPolicy#None"

1546

Persisted, Enumerated

(ReadOnly, Hidden)

RedundantServerCertificatePath

Path/name of redundant server certificate. This property must refer to a valid certificate when RedundantMessageSecurity is either "Signed" or "Signed and encrypted".

1544

Persisted, File, ExpertPage

(ReadOnly, Hidden)

RedundantServerEndpoint

The Endpoint URL of any redundant OPC UA server. The use of a redundant server when the primary server is unavailable is triggered by setting this property.

1501

Persisted, ExpertPage

(ReadOnly, Hidden)

SecurityPolicy

The OpcUa securitypolicy URI to use when connecting to the server. Default value is "http://opcfoundation.org/UA/SecurityPolicy#None"

1046

Persisted, Enumerated

(ReadOnly, Hidden)

ServerCertificatePath

Path/name of server certificate. This property must refer to a valid certificate when MessageSecurity is either "Signed" or "Signed and encrypted". If MessageSecurity is 'None', this property is ignored.

When property 'PkiType' is 'Win32', this property specifies the subject-name of a certificate in the Windows Certificate Store.

  • Example using Win32: "ApisHiveUaServerInstance"

When property 'PkiType' is 'OpenSSL', this property specifies the name of a certificate file in the '<certificatestore>\trusted' folder.

  • Example using OpenSSL: "ApisHiveUaServerInstance.der"
1044

Persisted, File

(ReadOnly, Hidden)

EndpointUrl

The endpoint URL used to connect with an OPC UA server. Syntax: opc.tcp://{HostName or IP address}:{port number}

Example: opc.tcp://localhost:4850

1001

Persisted

(ReadOnly)

ReverseConnection

Specifies whether the client or the server should initiate the connection.

1005

Persisted

(ReadOnly)

SessionTimeout

Requested maximum number of milliseconds that a Session should remain open without activity. If the Client fails to issue a Service request within this interval, then the Server shall automatically terminate the Client Session. Default is 60000 milliseconds.

1112

Persisted

(ReadOnly, Hidden)

TokenTimeout

The timeout, in seconds, of the security token when 'MessageSecurity' is other than 'None'. Default is 60000 seconds.

1113

Persisted

(ReadOnly, Hidden)

Username

Username used to authenticate with server. Only applies when the property 'Authentication' is 'Username/password'.

1002

Persisted

(ReadOnly, Hidden)

See also Module Properties

1

By default, TraceToFile keeps a maximum of 10 files with a maximum size of 64 MB each. Other values can be specified in the module-specific registry key using the DWORD values TraceToFileMaxCount and TraceToFileMaxSize.

Commands And Events

The OpcUa module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.Timer
ExternalItemsHandled_DataPushThis event is fired after the module has executed a HandleExternalItems_DataPush command. The data push package part of this ExternalItemsHandled_DataPush event contains all the resulting VQTs (Function items, ordinary external item transfers, etc.) from the HandleExternalItems_DataPush command.
On this ExternalItemsHandled_DataPush event, one can hook any _DataPush command(s) (Log; Scan; UaServerUpdateMonitorItems; HandleExternalItems) to chain a path of execution with data transferred alongside.
See also: APIS data transfer mechanism; Data Push
Timer
ServerConnectedThe OPC server is connected.Normal
ServerDataChangedThe OPC server has sent a subscription based data change notification.Normal
ServerDataChanged_DataPushThe OPC server has sent a subscription based data change notification. This event is fired when the UA client receives data from the UA server, and the items and samples in the data push package are the ones that are received from the server.
See also: APIS data transfer mechanism; Data Push
Normal
ServerFirstTimeDataChangedThe OPC server has sent a subscription based data change notification first time after reconnect.Normal
Catchup-StepNextDoneEvent telling that one data set from the catch-up manager has been applied to the items of the module.Normal
Catchup-DoneEvent telling that the catch-up operation has completed and that the module is running in normal real-time mode.Normal

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
HandleExternalItems_DataPushThis command ensures that all samples for all items in the data push package are applied/used in the external item manager, including in services such as Data Validation, Ext Items Transfer Control, etc.
See also: APIS data transfer mechanism; Data Push
Synchronous
ResetExternalItemsResets the external item manager, by forcing a full refresh of all external items when the 'HandleExternalItems' is fired next time.Synchronous
ReconnectReconnects to the OPC server.Asynchronous
ResetResets the OPC items in the client by performing a synchronous read from server.Synchronous
SyncStdPropertiesStart synchronization of standard OPC UA properties for all items.Asynchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous
Catchup-StepNextStep the catch-up data 'playback' one step ahead, updating the item values according to next step. The event Catchup-StepNextDone, is fired when all items have their updated values set, to hook other commands onto.Synchronous

See also Commands And Events

Item Types

OPC Item

An item in an OPC UA Server

The OPC Item item type has the following properties:

NameDescriptionIDFlags
DeadbandType

The monitoring deadband. Default = None.

10550Persisted, Enumerated
DeadbandValue

The value of the monitoring deadband. Ignored when 'Deadband' is 'None'.

10560Persisted
DiscardOldest

A boolean parameter that specifies the discard policy when the queue is full and a new notification is to be queued. It has the following values:

True - the oldest (first) notification in the queue is discarded. The new notification is added to the end of the queue;

False - the last notification added to the queue gets replaced by the new notification.

Default = true.

10530Persisted
ExcludeFromCatchupIf true, this item is excluded from catch-up, i.e. no historical data is read from the server for this item during the catch-up process.10600Persisted
MonitoringMode

The monitoring mode parameter is used to enable and disable the sampling of a MonitoredItem, and also to provide for independently enabling and disabling the reporting of notifications. This capability allows a MonitoredItem to be configured to sample, sample and report, or neither. Default = Reporting.

10500Persisted, Enumerated
MonitorStateThe actual state of the monitored item.11030ReadOnly
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
QueueSize

The requested size of the "MonitoredItem" queue. Default = 1.

10520Persisted
RevisedQueueSizeThe actual queue size that the Server will use.11020ReadOnly
RevisedSamplingIntervalThe actual sampling interval that the Server will use. This value is based on a number of factors, including capabilities of the underlying system. The Server shall always return a revisedSamplingInterval that is equal or higher than the requested samplingInterval. If the requested samplingInterval is higher than the maximum sampling interval supported by the Server, the maximum sampling interval is returned.11010ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SamplingInterval

The interval that defines the fastest rate at which the MonitoredItem(s) should be accessed and evaluated. This interval is defined in milliseconds. The value 0 indicates that the Server should use the fastest practical rate. The value -1 indicates that the default sampling interval defined by the publishing interval of the subscription is requested. A different sampling interval is used if the publishing interval is not a supported sampling interval. Any negative number is interpreted as -1. Default = -1.

10510Persisted
SrcUaNodeId

UA Node ID in source.

5031Persisted
SrcUaNodeId-HA

UA Node ID in source, used for reading history, if the UA server has different node IDs for reading real-time data versus historical data. If empty, history reads default to using the 'SrcUaNodeId'.

10505Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Trigger

This property specifies the conditions under which a data change notification should be reported:

Status: Report a notification ONLY if the StatusCode associated with the value changes;

StatusOrValue: Report a notification if either the StatusCode or the value change. The Deadband filter can be used in addition for filtering value changes;

StatusOrValueOrTimestamp: Report a notification if either StatusCode, value, or the SourceTimestamp changes. If a Deadband filter is specified, this trigger has the same behaviour as StatusOrValue.

Default = StatusOrValueOrTimestamp.

10540Persisted, Enumerated
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Write as server datatype

If false, the current value datatype is used when writing this item to the server. If true, the value is converted to the datatype initially reported by the server for this item when writing to the server.

10610Persisted, Expert

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Command Item

An item that represents a command to apply on this module.

The Command Item item type has the following properties:

NameDescriptionIDFlags
Command argument 1This attribute may be used by an item to specify additional arguments.10150Persisted
Command typeThis attribute allows you to select the type of command to use.10140Persisted, ReadOnly, Enumerated
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Trans OPC Item

An item in an OPC UA Server, with automatic calculation of a linear transformation.

The Trans OPC Item item type has the following properties:

NameDescriptionIDFlags
DeadbandType

The monitoring deadband. Default = None.

10550Persisted, Enumerated
DeadbandValue

The value of the monitoring deadband. Ignored when 'Deadband' is 'None'.

10560Persisted
DiscardOldest

A boolean parameter that specifies the discard policy when the queue is full and a new notification is to be queued. It has the following values:

True - the oldest (first) notification in the queue is discarded. The new notification is added to the end of the queue;

False - the last notification added to the queue gets replaced by the new notification.

Default = true.

10530Persisted
ExcludeFromCatchupIf true, this item is excluded from catch-up, i.e. no historical data is read from the server for this item during the catch-up process.10600Persisted
MonitoringMode

The monitoring mode parameter is used to enable and disable the sampling of a MonitoredItem, and also to provide for independently enabling and disabling the reporting of notifications. This capability allows a MonitoredItem to be configured to sample, sample and report, or neither. Default = Reporting.

10500Persisted, Enumerated
MonitorStateThe actual state of the monitored item.11030ReadOnly
Offset

The linear transformation addend to use when calculating the item value. (Value = RawValue * Scale + Offset)

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
QueueSize

The requested size of the "MonitoredItem" queue. Default = 1.

10520Persisted
RevisedQueueSizeThe actual queue size that the Server will use.11020ReadOnly
RevisedSamplingIntervalThe actual sampling interval that the Server will use. This value is based on a number of factors, including capabilities of the underlying system. The Server shall always return a revisedSamplingInterval that is equal or higher than the requested samplingInterval. If the requested samplingInterval is higher than the maximum sampling interval supported by the Server, the maximum sampling interval is returned.11010ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SamplingInterval

The interval that defines the fastest rate at which the MonitoredItem(s) should be accessed and evaluated. This interval is defined in milliseconds. The value 0 indicates that the Server should use the fastest practical rate. The value -1 indicates that the default sampling interval defined by the publishing interval of the subscription is requested. A different sampling interval is used if the publishing interval is not a supported sampling interval. Any negative number is interpreted as -1. Default = -1.

10510Persisted
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
SrcUaNodeId

UA Node ID in source.

5031Persisted
SrcUaNodeId-HA

UA Node ID in source, used for reading history, if the UA server has different node IDs for reading real-time data versus historical data. If empty, history reads default to using the 'SrcUaNodeId'.

10505Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Trigger

This property specifies the conditions under which a data change notification should be reported:

Status: Report a notification ONLY if the StatusCode associated with the value changes;

StatusOrValue: Report a notification if either the StatusCode or the value change. The Deadband filter can be used in addition for filtering value changes;

StatusOrValueOrTimestamp: Report a notification if either StatusCode, value, or the SourceTimestamp changes. If a Deadband filter is specified, this trigger has the same behaviour as StatusOrValue.

Default = StatusOrValueOrTimestamp.

10540Persisted, Enumerated
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Write as server datatype

If false, the current value datatype is used when writing this item to the server. If true, the value is converted to the datatype initially reported by the server for this item when writing to the server.

10610Persisted, Expert

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

State Item

An item that represents a well defined state of this module.

The State Item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

OPC Method Item

A method item in an OPC UA Server, letting you invoke a method on an object.

The OPC Method Item item type has the following properties:

NameDescriptionIDFlags
SrcUaNodeId

UA Node ID in source.

5031Persisted
MethodId

The NodeId of the method to invoke on the UA server object.

12000Persisted

The method item also automatically creates the necessary OPC Method Input Arguments Item(s) and OPC Method Output Arguments Item(s), according to the definition of the method in the UA server, to represent any required inputs and outputs for the method.

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Event Monitor Item

An item used to subscribe to events from the OPC UA server.

The Event Monitor Item has the following properties:

NameDescriptionIDFlags
SrcUaNodeIdThe UA Node ID in the source server which should be monitored for events.5031Persisted
EventType NodeIdThe UA Node ID of the base EventType this monitor should subscribe to.13000Persisted
UnmappedEventsourceActionThe action to perform when a received event has an unknown eventsource13001Persisted
LookupSourcePathTemplateTemplate string used to construct an eventsource path from the received event fields13002Persisted
UnmappedEventtypeActionThe action to perform when a received event has an unknown eventtype13003Persisted
CatchupPeriodMaximum number of seconds with event history to catchup after being disconnected13110Persisted

The property UnmappedEventsourceAction can have one of the following values, which decide the action to take when an event with unknown eventsource nodeid is received:

  • DropEvent: the event is dropped
  • UseModuleAsEventsource: the module containing the Event Monitor Item is used as the eventsource
  • LookupSourcename: the sourcename field of the event is used to lookup a fully qualified itemname in Apis Hive. If such an item is found, it will be used as the event source, automatically creating the needed source in Apis Event Server if required.
  • LookupSourcepath: A sourcepath is created from the property LookupSourcePathTemplate, and if exactly one event source in Apis Event Server matches the path, this is used as the event source.

The property LookupSourcePathTemplate is a string with injected event field values. Any event field can be referenced in the template with the syntax $(BrowseName), e.g. "Server/Plant/*/ABC-$(SourceName)". The search is case-insensitive and supports the following operators:

  • *: matches zero or more characters
  • ?: matches one character
  • #: matches one digit
  • []: match any character between the '[' and ']', unless the first character after '[' is '^', in which case the logic is inverted.
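
As an illustration of the matching rules above, the following is a minimal sketch in Python (not Prediktor's implementation) that translates an already-substituted template into an equivalent regular expression; the template string and candidate paths are made up for the example:

import re

def template_to_regex(pattern):
    # Translate the wildcard rules described above into a regular expression:
    # '*' -> zero or more characters, '?' -> one character, '#' -> one digit,
    # '[...]' -> character set (negated when it starts with '^'). Case-insensitive.
    out = []
    i = 0
    while i < len(pattern):
        c = pattern[i]
        if c == "*":
            out.append(".*")
        elif c == "?":
            out.append(".")
        elif c == "#":
            out.append(r"\d")
        elif c == "[":
            j = pattern.index("]", i)   # copy the bracket expression verbatim (simplification)
            out.append(pattern[i:j + 1])
            i = j
        else:
            out.append(re.escape(c))
        i += 1
    return re.compile("^" + "".join(out) + "$", re.IGNORECASE)

# Hypothetical source path template after field substitution:
rx = template_to_regex("Server/Plant/*/ABC-TI##")
print(bool(rx.match("Server/Plant/Area1/ABC-TI42")))   # True: '*' and '#' match
print(bool(rx.match("Server/Plant/Area1/ABC-TI4x")))   # False: '#' requires a digit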

The property UnmappedEventtypeAction can have one of the following values, which decide the action to take when an event with unknown eventtype is received:

  • DropEvent: the event is dropped
  • UseParentType: traverse the OPC UA eventtype hierarchy and use the first known parent-type

The property CatchupPeriod, with a value greater than 0, causes the Event Monitor Item to perform history-read requests for periods where the OpcUa-module has been disconnected. This happens simultaneously with subscribing for realtime events.

Namespace Item

An item that represents a semantics caching namespace.

The namespace item type has the following properties:

NameDescriptionIDFlags
Name

The name of the item is the uri for the namespace it represents on the remote server that the UA client module is connected to.

  
Exposed URI / Value

The uri representing the namespace on the Hive UA server. This may or may not be the same uri as the source uri on the remote server.

15020Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Namespace store

Path to the database file containing the cached model for the namespace

15030Persisted
Separator char

Character used to separate parts of function item names. Normally '.' is used to indicate parent-child like separation.

15040Persisted
Name elements

What should make up the parts of function item names. Either browsename or displayname, possibly with the nodeid appended.

15050Persisted, Enumerated
Name termination

When to stop following parent nodes for inclusion in function item names. Stop at naming root: Stop at the first parent object node having a property of the naming root property type. Stop at first object node: Stop at the first parent that is an object node.

15060Persisted, Enumerated
Name generation

Determines when to generate function items for data variables in the namespace. If no types in namespace: Only generate function items if there are no types defined by the namespace. Always, Never, Data variables that are not instance declarations: Unless a data variable is part of a type definition (instance declaration), it will have a function item generated for it.

15070Persisted, Enumerated
URI for naming root

The uri-part of the node id for the naming root property type.

15080Persisted
Naming root id type

The identifier-type-part of the node id for the naming root type.

15090Persisted
Naming root identifier

The identifier-part of the node id for the naming root type

15100Persisted
Naming root subtypes

Whether subtypes of the naming root type should be treated as a naming root

15110Persisted
Naming root fail policy

What to do if the objects folder is reached without identifying an object having a naming-root property along the way. Fail: Generate an error, Stop at first object node: Regenerate the name, but stop at the first object node. Stop at objects folder: Use the full path to the objects folder.

15120Persisted, Enumerated
Update Apis eventtypes

If set, custom eventtypes defined in the namespace will be created in Apis Chronical.

15160Persisted

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Module State Items

An item which retrieves status information from the module

The Module state items item type has the following properties:

NameDescriptionIDFlags
Item type

The item type this item aggregates statistics on, when applicable. Use the number inside the parentheses in "FileAdd" configuration files.

19001Persisted, ReadOnly, DynamicEnumeration
Module state

The kind of module state information represented by this item. This can be a number of items having a given quality, a total number of items, the time the newest/oldest item was updated. Use a number inside the parentheses in "FileAdd" configuration files.

19000Persisted, ReadOnly, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Item Attribute Items

An item which exposes an attribute of another item in the module

The Item attribute items item type has the following properties:

NameDescriptionIDFlags
Attribute ID

The ID of the attribute this item exposes from an item.

19002Persisted, ReadOnly, DynamicEnumeration
ParentItem

The parent item name of this item.

5502Persisted, ReadOnly, ApisLocalItem
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Function item

This item is a calculated value based on existing items in Hive. The calculation is a formula based on inputs from external items.

This item has two different calculators (algorithm syntax): C# and Legacy (proprietary).

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value change.
QualityValueTimestamp: Report as updated if either the Quality, Value or the Timestamp change (default).
19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

OPC UA Catch-Up

Overview

This document describes the Catch-Up functionality implemented in the Apis Hive OPC UA client module. The Apis Hive OPC UA client module provides, by default, a standard OPC UA client with node subscription and method call functionality.

In case of communication loss between an OPC UA server and an OPC UA client, most OPC UA servers provide short-term buffering of subscription values. When a client reconnects after a relatively short communication loss, the server starts sending the buffered values. This standard OPC UA functionality only works for short-term communication losses and depends on client/server buffer settings and server capabilities.

To accommodate problems with lost data after a long-term communication failure, Prediktor has implemented the Catch-Up functionality in the OPC UA client. Unlike classic OPC, OPC UA provides both Data Access and History Access through the same interface. The Apis Hive OPC UA client uses the History Access capability in OPC UA to read the logged values on the subscription nodes provided by the server for the duration of the communication failure. When all available history values on the subscription nodes have been read up to the time of the reconnect, the client continues to read the subscribed node values through OPC UA Data Access as normal.

For Catch-Up in the Apis Hive UA client bee to work as intended, the server needs to provide history data for all the items that the client subscribes to through Data Access.

The resolution of the logged data is not necessarily the same as the negotiated sample interval of the subscription nodes on the source system. Differences between the logging interval and the sample interval will result in a different resolution for data collected during catch-up periods than through normal Data Access. If consistent data resolution through both catch-up (History Access) and Data Access is important for the data-consuming system, careful configuration of both the source server and the client is necessary.

Dependencies

The Apis Hive OPC UA bee has no Apis Hive dependencies.

Status Items

The Apis Hive OPC UA bee has a set of status items:

ItemVisibilityDescription
#Connected#AlwaysItem telling if module is connected to the OPC server; true: is connected, false: is disconnected.
#Session-State#AlwaysItem telling the current status of the OpcUA session.
#Subscription-State#AlwaysItem telling the current status of the OpcUA subscription.
#Endpoint#AlwaysThe URL to the OpcUA server.
#Catchup-State#CatchupItem telling the current state of the data catch-up process. Realtime means no catch-up is running; otherwise it shows a catch-up related state.
#Catchup-PointInTime#SerializedCatchupItem telling the current point in time for the catchup process, when catch-up process is active.
#Catchup-RealTimeBufferCount#SerializedCatchupItem telling the current number of buffered real-time data callbacks during the catchup process, when catch-up process is active.
#Catchup-ReadChunksCount#SerializedCatchupItem telling the total number of data chunks read from the UA server during the catchup process, when catch-up process is active.
#Catchup-ReadChunksErrors#CatchupItem telling the total number of failed data chunks reads from the ua server during the catchup process, when catch-up process is active.
#Catchup-ReadSamplesCount#SerializedCatchupItem telling the total number of data samples read from the UA server during the catchup process, when catch-up process is active.
#Catchup-Speed#SerializedCatchupItem telling how many times faster than real-time the catchup process is running (note that a low figure may indicate both bad performance as well as high data density).
#Catchup-Progress#DirectCatchupItem telling the progress of direct catchup, in how many items have finished [#finished-read/#finished-writes/#total-items].
#Catchup-ReadSpeedAvg#Item telling the average read speed of the UA server, in samples per millisecond.
#Catchup-StepNextTimeAvg#SerializedCatchupItem telling the average duration of a StepNext iteration, in milliseconds.
#Catchup-WrittenSamplesCount#DirectCatchupItem telling the total number of data samples written during the direct catchup process, when catch-up is active.
#Catchup-WrittenChunksCount#DirectCatchupItem telling the total number of data chunks written during the direct catchup process, when catch-up is active.
#Catchup-WriteSpeedAvg#DirectCatchupItem telling the average history write speed, in samples per second.

Command Item

The Apis Hive OPC UA bee has a catch-up relevant command item:

ItemDescription
$Cmd_Catchup-Continue$Trigger continuation of the catch-up process, when catch-up type is SerializedFull_PauseAfterInitial or SerializedPartial_PauseAfterInitial

The command item is added through the add item method. The command type id: 10140.

Catch-Up relevant module properties

  • CatchUpMode: The data catch-up functionality of the communication. Set this to enable catch-up. Property ID: 1250

    • NoCatchup: No data catch-up, just pure real-time communication.

    • SerializedFull: Historical data is streamed through Hive sample-by-sample, until we have caught up real-time. Note that this requires the UA server to implement the OPC UA HA profile!

    • SerializedFull_PauseAfterInitial: Same as 'SerializedFull', but will pause the playback sequence after the first StepNext; use the command item $Cmd_Catchup-Continue$ to continue the playback sequence.

    • SerializedPartial: Only data missing locally, per item, is read from remote server and streamed through Hive, sample-by-sample, until we have caught up real-time.

    • SerializedPartial_PauseAfterInitial: Same as 'SerializedPartial', but with one stop in the playback sequence after the first StepNext; use the command item #ContinueCatchup# to continue the playback sequence.

    • Direct: Historical data is written directly into an APIS HoneyStore time-series database (when the item is logged), and the real-time data starts updating the item immediately, as when no catch-up is in use.
      Note! When using Direct catch-up, it is important that the related HoneyStore database(s) have their Capabilities property set to accept out-of-sequence data.

  • CatchUpPeriod: The maximum period to look back for historical data when initiating a catch-up operation. Select a pre-set value, or enter a value in seconds. Property ID: 1262

  • CatchUpChunkSize: How many values to read per node during catch-up read operations. Maps to 'numValuesPerNode' parameter of 'ReadRawModifiedDetails'. Default is 5000. Property ID: 1261

  • CatchupChunkCount: How many chunks to cache during catch-up read operations. Must be between 0 and 255. Default is 5. Property ID: 1260

CONFIGURATION OF SERIALIZED CATCH-UP

Catch-Up operation, step by step

When the OPC UA Client module reconnects after a period of communication loss, and it is configured to do automatic catch-up, the module will start to read the history data for the subscription items, starting from the time when the connection was lost, possibly limited by the CatchUpPeriod module property. During this period of history data reading, the module will buffer all Data Access subscription values until all history data has been read from the source system. After all the available item data has been read up to the time of the reconnect, the module continues with normal OPC UA Data Access subscription operation. During the Catch-Up, the OPC UA Client bee updates the configured data items in steps, ensuring that all history data values for the items are written to the tags. Only single values are updated in each step.

Implications on memory usage

The OPC UA Client module reads the data in chunks. The number and size of the chunks for each item are configurable through the module properties CatchUpChunkSize and CatchupChunkCount. The values set for these properties, together with the number of items and item types, have implications on memory usage. The buffering of the OPC UA Data Access subscription items also adds to the memory consumption, so the catch-up period must be chosen carefully to avoid out-of-memory situations. It is recommended to use the 64-bit version of Apis Hive to ensure enough memory in high item volume solutions.
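
As a rough, back-of-the-envelope illustration only (the per-sample size below is an assumption, not a Prediktor figure), the worst-case size of the chunk cache can be estimated as number of items x CatchUpChunkSize x CatchupChunkCount x bytes per sample:

items = 10_000
chunk_size = 5_000       # CatchUpChunkSize (default)
chunk_count = 5          # CatchupChunkCount (default)
bytes_per_sample = 64    # assumed average size of one value/quality/timestamp

chunk_cache_bytes = items * chunk_size * chunk_count * bytes_per_sample
print(chunk_cache_bytes / 2**30)   # roughly 14.9 GiB in this example

Estimates like this, plus the buffered Data Access subscription values, indicate whether the 64-bit version of Apis Hive and/or a shorter CatchUpPeriod is needed.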

Timestamps

Item values read through OPC UA History Access (and OPC UA Data Access) will keep the server data timestamp through all the External Item transfers in Apis Hive, unless the TimeReferenceItem module property is set explicitly. We recommend not using the TimeReferenceItem property in a catch-up solution.

EventBroker

Each Apis module normally has a set of events and a set of commands. An event in one module can cause a command to be fired in another module. The mapping of events to commands is done in the Event Broker. The OPC UA client module has a specific event, CatchUp-StepNextDone, which fires when a catch-up step is done, and a specific command, CatchUp-StepNext, which triggers each step in the catch-up process. The module also fires a ServerDataChanged event after a new set of item values is received, both during catch-up and normal operation. Only a single value for each item is updated for each triggering of the ServerDataChanged event.

Trigger the CatchUp-StepNext command from the CatchUp-StepNextDone event on the OPC UA Client module, to ensure proper operation of events during normal operation.

This will be covered in more detail below.

Figure 1, Event broker, CatchUp-StepNext on CatchUp-StepNextDone.

Apis Hive Inter-module data exchange

When using the Catch-Up functionality in the Apis Hive OPC UA Client Bee, it is important to configure the Apis Hive Data Exchange accordingly.

It is important to use the synchronized event/command functionality in the Apis Hive Event Broker instead of asynchronous, timer-based data exchange.

ExternalItem transfer in Apis

Apis Hive is a module-based system environment that gives you the opportunity to collect, process and store real-time data. Different modules in Apis Hive provide different functionality, but common to all is the inter-module data exchange provided by the ExternalItem functionality.

Example of data exchange between modules

In this example system, we use a simple setup consisting of three modules in an Apis Hive default instance:

  • A connection module, the OpcUa Client bee, named OpcUaBee.

  • A process module, the Calculate Bee, named Calculate.

  • A store module, the Logger Bee, named ClientLogger.

A simple view of the modules:

Figure 2, Apis Hive with Calculate, Logger and OpcUa client module.

The OpcUaBee module has one subscription item that reads a sine signal from a source system named CustomerServer. The item is named CustomerServer.Worker.Sine. Further, the Calculate module has one item, named SineTimes2, which has CustomerServer.Worker.Sine as input through the external item connection.

Figure 3, Item connection.

Normally, data exchange (ExternalItem) is configured on the destination side by adding one or more external items to an item via the item attributes. Second, the destination module needs to be set to use a specific ExchangeRate, which defines the period between each read of the connected items, normally in the millisecond range. In this way, the items are inter-connected and read at a specific interval.

Data exchange, by event broker:

In the figure below, we see that the OpcUa module triggers the ClientLogger module to log, the Calculate module to handle its external items, and the ClientLogger module to log again when the ServerDataChanged event fires. The reason for logging twice is to ensure that history values that may be needed by the calculation operations are properly logged before the calculations are triggered, and that the results from the calculations are logged. Values of items that have not changed will not be logged.

We also see that CatchUp-StepNext is triggered as before by CatchUp-StepNextDone.

Figure 4, Event broker.

Catch-Up steps summary

The steps of the Catch-Up summed up:

  • The module re-connects after a period of lost communication

  • The buffering of the OPC UA Data Access subscription data on the items starts immediately

  • The module decides from what timestamp to start reading item history data, based on the time of communication loss or the reconnect time, and the CatchUpPeriod setting.

  • The module starts to read history data in chunks and triggers the event in the Event Broker for each change in item value.

  • If configured, the external item values are synchronously updated through the Event Broker event ServerDataChanged for each value update.

  • After all history data is read, the OPC UA Client module continues with normal OPC UA Data Access subscription operation.

  • If configured, the external item values are synchronously updated through the Event Broker for each value update.

Recommendations

  • Use the Event Broker to handle Apis Hive data exchange; triggering the CatchUp-StepNext command from the CatchUp-StepNextDone event is a must.

  • Ensure proper configuration of both the client and the source system server, to achieve the same data sampling rate for both History Access item data and item subscription (OPC UA Data Access) data.

CONFIGURATION OF DIRECT CATCH-UP

Direct catch-up is more straightforward to use, and is the correct choice if:

  • you want the real-time values to be applied to the UA client items right away.
  • you just need to fetch historical data for periods where the UA client was not running.
  • you don't need to perform calculations and/or alarming on the items while catching up.

When using direct catch-up, the missing item data will be retrieved item by item. This means that during the catch-up process, some items will have had all their missing data fetched, while other items will not.

ADVANCED CATCH-UP CONFIGURATION

When running catch-up against misbehaving UA servers, i.e. servers not returning an initial VQT for all subscribed items after the subscription is created, the catch-up process will be stuck waiting forever to get started. This is because the catch-up process needs an initial VQT for all its items to be able to determine the end times to use for the history read operations.

To allow the catchup process to run against such UA servers, two strategies have been implemented as a workaround.
If the start is forced, the later of the current local computer time and the server time (as reported by the UA server) will be used as the end time for the stale items.

These strategies are:

  1. MaxEqualMissingEndtimesCatchupContinue: After 3 (default) publish responses during startup, with the UA server failing to update any new VQTs, the catchup process will be forced to start.
    The default value of 3 can be changed through registry key value MaxEqualMissingEndtimesCatchupContinue of the module.
  2. MaxWaitSecondsMissingEndtimesCatchupContinue: After 600 seconds (default) since startup, with the UA server failing to update all VQTs, the catchup will be forced to start.

To modify either of these two settings, create DWORD values named MaxEqualMissingEndtimesCatchupContinue and/or MaxWaitSecondsMissingEndtimesCatchupContinue under the registry key of the actual OPC UA client module, and enter other values.
E.g., for a Hive instance named AI_Catchup, an OPC UA client module named OpcUa, and new values 60 (=0x3c) and 120 (=0x78):

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\AI_Catchup\Modules\OpcUa]  
"MaxEqualMissingEndtimesCatchupContinue"=dword:0000003c  
"MaxWaitSecondsMissingEndtimesCatchupContinue"=dword:00000078

Apis OpcUa Proxy

This module is used to configure federation of selected namespaces from other OpcUa servers.

Provider: Prediktor

Properties

Item types:

Namespace item

Item Types

NamespaceItem

The Namespace item is used to configure one remote OpcUa namespace for federation through the OpcUa server in this Apis instance.

It has the following standard properties:

NameDescriptionIDFlags
NameThe namespace uri on the remote server. Persisted

Exposed uri

If specified, a custom uri used to represent the remote uri when exposing the federated namespace in the OpcUa server in this Apis instance.10010Persisted
ValueThe uri exposed by the OpcUa server in this Apis instance (read-only). This value is equal to the "Exposed uri" property when that property has been specified. When not, "Value" equals "Name", i.e. the actual namespace uri on the remote server.2 

Properties

The OpcUaProxy module has the following standard properties:

NameDescriptionIDFlags
OpcUa serverThe server configuration to use for this proxy module. The list of available servers is populated with OpcUa connection and OpcUa cluster items defined by ApisCnxMgr modules.1001Persisted, Enumerated

Impersonate

If checked, the authentication information provided by each OpcUa client connecting to this Apis instance, will be used when federating OpcUa messages for the client to the selected server. If not checked, all clients will reuse the authentication information configured by the selected OpcUa connection item.1002Persisted

Optional

If checked, browsing of the local OpcUa server and other federated OpcUa servers will work even if this module does not have a connection with its OpcUa server1003Persisted

The OpcUaProxy module has the following informational properties:

NameDescriptionIDFlags
Endpoint urlThe current endpoint url for the selected server.1100 

Apis OpcUa Method

A module for querying and iterating over collections using OPC UA methods.

Currently, the supported integration is SQL Server: the module can connect to an SQL server and query it using stored procedures available on the server, as well as general select statements.

The module exposes a set of objects and methods in OPC UA that can be used for querying and iterating over the collections.

Provider: Prediktor

Properties

The OPC UA Method module has the following item types:

Item Types

Stored Procedure Item

This item type is an SQL Item for querying the SQL database using stored procedures.

The item type has the following properties:

NameDescriptionIDFlags
Stored procedureThe name of the stored procedure on the SQL server18653Persisted

The item type exposes stored procedures from an SQL server into an OPC UA namespace. It is configured by adding items in APIS that specify the name of the stored procedure to be exposed. The parameters from the stored procedure are inferred onto the First method exposed in the OPC UA namespace.

The available stored procedures can be browsed when adding new Stored procedure items in APIS.

Select Item

This item type is an SQL Item for querying the SQL database using select statements.

Parameters of the item type:

NameDescriptionIDFlags
Select statementSelect statement for querying the SQL server18653Persisted, Multiline

The select statement is a parameterized string containing the query, with the parameters specified in the following format:

@[myparam]:[param datatype]

An example of this could be

select * from my_table
where my_int_column = @a:int

In OPC UA this would expose an object with a First method that takes the max number of rows and a parameter @a that must be of type int.

Supported datatypes:

  • int
  • float
  • datetime
  • nvarchar

SQL Items

SQL item types consist of Stored Procedure Items and Select Items.

The items are used to collect data from an SQL server, either by select queries or by invoking stored procedures. The items expose objects in the OPC UA namespace that contain methods for iterating over the data returned from the queries.

The items expose two methods in the OPC UA namespace:

  • First
  • Next

The parameters of the First method are the maximum number of rows to return in each query, plus the parameters defined in either the stored procedure or the select query. The supported SQL data types and the corresponding mapping to the OPC UA data types are listed in the table below.

SQL datatypeUA built-in type ID
int6
datetime13
nvarchar12
float10
bigint8

OPC UA Datatypes

First will return the data as a matrix containing the result of the query. If the results exceed the max number of rows argument, First will also return a continuation point id. The continuation point id can then be used as an argument when calling Next, which will return the next batch of results.

Notes

  1. When there is a continuation point, the user should always resolve the rest of the data by calling the Next method until there are no more continuation points. This ensures the connection is closed.
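
To make the calling pattern concrete, the following is a minimal client-side sketch assuming the open-source python-opcua package; the endpoint URL, node ids, argument order and output layout are illustrative assumptions, not part of the module's documented interface:

from opcua import Client

client = Client("opc.tcp://localhost:4850")      # assumed endpoint
client.connect()
try:
    obj = client.get_node("ns=2;s=MySelectItem")              # hypothetical object node
    first = client.get_node("ns=2;s=MySelectItem.First")      # hypothetical method node
    next_ = client.get_node("ns=2;s=MySelectItem.Next")       # hypothetical method node

    max_rows = 1000
    # First takes the max row count plus the query parameters (here one int for '@a').
    out = obj.call_method(first, max_rows, 42)
    rows, continuation = out[0], out[1]   # matrix of rows + continuation point id (assumed layout)

    # Always drain the continuation points with Next, so the connection is closed.
    while continuation:
        out = obj.call_method(next_, continuation)
        rows += out[0]
        continuation = out[1]
finally:
    client.disconnect()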

Properties

The module contains the following standard properties:

NameDescriptionIDFlags
Connection stringStandard SQL connection string2563Persisted

For a connection to a Microsoft SQL server:

Provider=MSOLEDBSQL;Data Source=SERVER_URI;Initial Catalog=DB_NAME;User ID=USERNAME;Password=PASSWORD;

Note:

  1. The Provider for a Microsoft SQL server must be MSOLEDBSQL.

PostgreSQL

To use the module with a PostgreSQL server, do the following:

  1. Install psqlODBC using Stack Builder application
  2. Configure ODBC data source using ODBC Data Source Administrator:
    • System DSN: Add
    • Select PostgreSQL ODBC Driver(UNICODE)
    • Specify parameters, e.g. database=postgres, server=localhost, username=postgres, port=5432, password=xxx
    • Test the connection
  3. Specify connection string for the module, e.g.
    • Provider=Microsoft OLE DB Provider for ODBC Drivers;Data Source=PostgreSQL35;location=postgres;User ID=postgres;password=xxx;timeout=1000;
    • Data Source is the name of the data source as specified in ODBC Data Source Administrator

Make sure the user has read/write permissions to the database and tables used in the module:

  1. Right-click on the table and click Properties
  2. Go to Security tab and grant permissions

Apis PerformanceMonitor

Module for collecting performance data from a local or remote machine.

Provider: Prediktor

Properties

Commands And Events

The PerformanceMonitor module has the following item types

Performance Counter

Perfmon Command

Properties

The PerformanceMonitor module has the following properties:

NameDescriptionIDFlags
ComputerThe computer to connect to for getting Performance Monitor data. Left blank for local computer.1002Persisted
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
IntervalThe update interval for the Performance Counters, in ms. Determines how often the module will poll the system for updates.1001Persisted

See also Module Properties

Commands And Events

The PerformanceMonitor module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Add new Performance Counter items

You can add new Performance Counter items by right-clicking on PerformanceMonitor --> Add Items --> Performance Counter:

A dialog then pops up to allow you to configure the items to be created:

You MUST click on the Browse button to choose from all system predefined Performance Counters to add.

The Browse button brings up another dialog:

In this dialog, choose one or more Performance Counter and click Ok. You can also search for keywords in the Search bar on the top.

For example, if you want to monitor the incoming and outgoing internet connection speed, you can choose "@IPv4/Datagrams/sec":

Click Ok, and you will see that a new Performance Counter has been added:

Now that you have added a new Performance Counter item, you can use it the same way as other OPC items.

Identify Processes By Instance Name

When you use the built-in system utility Perfmon.exe to monitor multiple instances of the same process, and you have more than one instance of the same executable file running at the same time, you have probably noticed that Perfmon.exe differentiates the processes by giving them arbitrary numbered names, for example ApisHive, ApisHive#1 and ApisHive#2. This behavior is most likely not what you would expect, since it is not very clear which process corresponds to which ApisHive instance you started, and the numbers change every time you restart your ApisHive instances.

PerformanceBee improves this by adding the instance name to the process name, such as ApisHive_apishiveinstance, ApisHive_anotherinstance, where ApisHive is the process name (executable file name) and apishiveinstance is the instance name. This works even after you restart your ApisHive instances.

Currently, only ApisHive instances are identified in this way.

To support this, some configuration needs to be done:

On the target machine (the machine being monitored):

  • HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\PerfProc\Performance\ProcessNameFormat set to 2 (DWORD); see the sketch after this list.

  • Service named RemoteRegistry is running.

  • ApisHive instances are running using different names.

  • The monitoring account (usually a domain account, which is also used to launch the PerformanceBee on the monitoring machine) is added into user group “Performance Log Users” and “Performance Monitor Users”

  • The monitoring account is allowed in WMI\Root\CIMV2\Securities (use WMI Control snapin, wmimgmt.msc)

On monitoring machine:

  • The monitoring account (the account that launched PerformanceBee) can be added to the related user groups on the target machine.
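
For reference, the ProcessNameFormat value listed above can also be set programmatically; a minimal sketch using Python's standard winreg module (an alternative to editing the registry with regedit or a .reg file), to be run on the target machine with administrative rights:

import winreg

key_path = r"SYSTEM\CurrentControlSet\Services\PerfProc\Performance"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
    # Per the requirement above: ProcessNameFormat must be set to 2 (DWORD).
    winreg.SetValueEx(key, "ProcessNameFormat", 0, winreg.REG_DWORD, 2)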

Item Types

Properties

A Performance Counter describes a measurable property of a machine, remote or local, such as CPU load.

The Performance Counter item type has the following properties:

NameDescriptionIDFlags
Performance Counter CategoryThe category that the Performance Counter belongs in.10001Persisted
Performance Counter InstanceInstance name for the Performance Counter, when it is necessary to distinguish between them. Left blank for single-instance counters.10002Persisted
Performance Counter NameThe name given to the Performance Counter within its category.10003Persisted
Performance Counter Sample ExposureThe aspect of the Performance Counter to expose. Default means to expose the appropriate value depending on the counter type. RawValue exposes the CounterSample.RawValue Int64 Property of the counter. Calculated exposes the 4 byte floating point value resulting from the CounterSample.Calculate Method of the counter.10004Persisted
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

This item controls whether the Performance Monitor module is running or not.

The Perfmon Command item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Replay

This module will playback values recorded to an OPC HDA server.

Provider: Prediktor

Properties

Commands And Events

The Replay module has the following item types

SynchronousItem

Time

RelativeTime

Control item

State item

More information

Quick Start Guide

Properties

The Replay module has the following properties:

NameDescriptionIDFlags
ActiveBufferThe buffer which data is replayed13000ReadOnly, PerformancePage
ActiveBuffer1Status if the buffer is active11900ReadOnly, InfoPage
ActiveBuffer2Status if the buffer is active12400ReadOnly, InfoPage
CmnItemIDPrefixCommon source ItemId prefix. This string will prefix each item's 'ItemID' when communicating with the OPC HDA server, and used as a filter when browsing this server's items.1030Persisted
CmnItemIDPrefix itemCommon source ItemId prefix. This string will prefix each item's 'ItemID' when communicating with the OPC HDA server, and used as a filter when browsing this server's items.1031Persisted, ApisItem, ExpertPage
ComputerThe computer hosting the OPC HDA server.1010Persisted, Computer
CurrentPlayIndexThe index of the active buffer12900ReadOnly, PerformancePage
EndTimeThe latest time (UTC) of the history to be read.1065Persisted
EndTimeBuffer1The end time of the buffer12300ReadOnly, InfoPage
EndTimeBuffer2The end time of the buffer12800ReadOnly, InfoPage
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ItemCountBuffer1The count of values in the buffer12000ReadOnly, InfoPage
ItemCountBuffer2The count of values in the buffer12500ReadOnly, InfoPage
ItemPropSyncHow to synchronize item attributes from historian HDA attributes1095Persisted, Enumerated, ExpertPage
PrefetchStrategyPrefetchStrategy11700Persisted, Enumerated
RealtimeFetches the freshest HDA value, ignoring start/end time1082Persisted, ExpertPage
ReportBufferStatusReports buffer status to the log files for log view in Apis Management Studio11800Hidden, InfoPage
ResolutionThe time between updates [seconds]1075Persisted
ServerThe ProgID of the OPC HDA server.1020Persisted, Enumerated, ProgID
ServerCLSIDThe CLSID of the specified OPC HDA server10000Persisted, ReadOnly, InfoPage
SrvCurrentTimeThe current time (UTC) as known by the OPC server.11100ReadOnly, InfoPage
SrvLCIDLocale ID of values coming from the server. You might need to specify this property if the OPC server provides string values that are converted to another type in your client (e.g. DDE bridges)1040Persisted, ExpertPage
SrvMaxRetValsThe maximum number of return values supported by the server.11200ReadOnly, InfoPage
SrvStartTimeThe time (UTC) the OPC server was started.11000ReadOnly, InfoPage
SrvStateThe current status of the OPC HDA server11300ReadOnly, Enumerated, InfoPage
SrvStatusStringStatus string in the OPC HDA server.11400ReadOnly, InfoPage
SrvVendorInfoVendor specific information about the OPC HDA server11600ReadOnly, InfoPage
SrvVersionThe version number of the OPC HDA server (major-minor-build)11500ReadOnly, InfoPage
Start on loadIf true, replay starts when the module starts1088Persisted, ExpertPage
StartTimeThe earliest time (UTC) of the history to be read.1060Persisted
StartTimeBuffer1The start time of the buffer12200ReadOnly, InfoPage
StartTimeBuffer2The start time of the buffer12700ReadOnly, InfoPage
Step on NextTimePerform step when NextTime is set1083Persisted, ExpertPage
SynchronousReadAggregateThe aggregate to read from the OPC HDA server when using synchronous replay. Applies to data returned for SynchronousItems only.1087Persisted, DynamicEnumeration
SynchronousReadDelayA delay in milliseconds inserted before the ReadSynch command, 0 to disable delay.1085Persisted
TimeModeThe time mode used to timestamp items.1050Persisted, Enumerated, ExpertPage
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TracefileMaxSizeThe maximum size, in bytes, of the trace file before it is truncated. Default is 50 MB (50 * 1024 * 1024).15000Persisted, ExpertPage
TraceServerStateThis is used to trace detailed information about the internal steps of a replay sequence. Note: be careful about enabling 'PlayBackValues' and 'All Calls', since this can result in much disk activity. Disabled - trace not active; TraceCall - the different function calls made during replay; WaitForData - the request for data; StatusChange - on status change; ReadServer - when data is read from the server; CurrentIndex - the current index being replayed; FireBrokerEvent - when a broker event is sent; PlayBackValues - the values being replayed; Commands - commands being sent; StartupSequence - the startup sequence of the Replay bee; RequestMoreData - not used at the moment; ActivatedItems - when items are set active for replay; All Calls - trace active for all types.15020Persisted, Enumerated, ExpertPage
TraceToFileThis is used to trace detailed information about the internal status of the replay session. The type of status to trace is given by the property 'TraceServerState'.15010Persisted, File, ExpertPage
UseHDATimestampUse the HDA timestamp for each interval; if false, the item timestamp is used. Applies to data returned for SynchronousItems only when the aggregate is Interpolative zero-order.2000Persisted, ExpertPage
UseRelativeTimeUse relative Start and End time, valid only for aggregate data.1072Persisted, ExpertPage
ValueCountBuffer1The count of values in the buffer12100ReadOnly, InfoPage
ValueCountBuffer2The count of values in the buffer12600ReadOnly, InfoPage

See also Module Properties

Commands And Events

The Replay module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
OnPlaybackDone

An event that notifies that a playback step has finished. Necessary if another module is to be synchronized with the playback mechanism.

Normal
OnSynchReadDone

An event that notifies that a synchronous read operation has finished for all the SynchronousItems of this module.

Normal
OnSynchReadEnd

An event that notifies that the synchronous start time is greater than the end time.

Normal

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
ReadSynch

A command initiating a synchronized read operation against the OPC HDA server for all SynchronousItems.

Synchronous
StartSyncRead

Initialize sync read.

Synchronous
StepSynch

A command initiating a synchronized step operation against the OPC HDA server for all SynchronousItems.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

An item available for synchronous replay from the OPC HDA server.

The SynchronousItem item type has the following properties:

NameDescriptionIDFlags
AlwaysActive

This property forces the item to always be in the active item set.

10030Persisted
ItemActive

Indicates whether or not the item is active.

10025ReadOnly
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Time (UTC) of the history to be read.

The Time item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

RelativeTime (UTC) of the history to be read.

The RelativeTime item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Integrated control item

The Control item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Replay state

The State item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis SegScaleBee

The SegScaleBee module communicates with a Seg scale over a serial (RS-485) connection, exposing weight and status items.

Provider: Prediktor

Properties

Commands And Events

The SegScaleBee module has the following item types

Int Status Item

Bool Status Item

String Status Item

Weight

Double Status Item

Int Input

Double Input Item

Bool Input Item

String Input Item

Batch weight of the last batch

Int VectorVariable

String VectorVariable

Double VectorVariable

Int Trigger In

Properties

The SegScaleBee module has the following properties:

NameDescriptionIDFlags
Com portSerial IO COM port id; must start with 'COM'.1002Persisted
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
Log fileThe file name for Seg scale serial IO log.1010Persisted, ExpertPage
Log file MaxsizeThe max size of the log file before it gets recycled (default value: 5,000,000 bytes).1009Persisted, ExpertPage
Log levelDefines the log level1011Persisted, Enumerated, ExpertPage
Scale addressThe RS 485 bus address of the Seg scale1003Persisted
ScaleRoleDefines the SegScaleBee role type. The items will be generated based on the role of the module.1001Persisted, ReadOnly, Enumerated, ExtraInfo
Serial IO Baud RateThe serial IO Baud Rate1004Persisted, ExpertPage
Serial IO Data bitsThe serial IO Data bits1006Persisted, ExpertPage
Serial IO Stop bitsThe serial IO Stop bits1007Persisted, Enumerated, ExpertPage
Serial IO timeoutThe serial IO communication timeout period in milliseconds1005Persisted, ExpertPage
Weight marginThe margin, in kilograms, within which the weight must remain to indicate a stand-still state1012Persisted, ExpertPage
Weight stand still periodThe period, in seconds, the weight must stand still to indicate a stand-still state1008Persisted, ExpertPage

See also Module Properties

Commands And Events

The SegScaleBee module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

Int Status Item

The Int Status Item item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Bool Status Item

The Bool Status Item item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

Decides which type (e.g. Double, String, Bool) the item is to be. (Bool=10, Double=20, String=30)

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

String Status Item

The String Status Item item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

The weight

The Weight item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Double Status Item

The Double Status Item item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Int Input

The Int Input item type has the following properties:

NameDescriptionIDFlags
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Double Input Item

The Double Input Item item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Bool Input Item

The Bool Input Item item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

String Input Item

The String Input Item item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

The batch level trigger changes value when the weight reaches the batch size

The Batch weight of the last batch item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Int VectorVariable

The Int VectorVariable item type has the following properties:

NameDescriptionIDFlags
Dimension

The dimension of a vector item (number of elements).

5007Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

String VectorVariable

The String VectorVariable item type has the following properties:

NameDescriptionIDFlags
Dimension

The dimension of a vector item (number of elements).

5007Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Double VectorVariable

The Double VectorVariable item type has the following properties:

NameDescriptionIDFlags
Dimension

The dimension of a vector item (number of elements).

5007Persisted
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

Decides which type (e.g. Double, String, Bool) the item is to be. (Bool=10, Double=20, String=30)

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Int Trigger In

The Int Trigger In item type has the following properties:

NameDescriptionIDFlags
Inputs

The item handle (unique identifier) of the inputs.

5510Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Semantics

This module defines an OPC UA namespace and allows mapping data variables in the model to Hive function items. If a Semantics module is removed from a Hive instance, its unique namespace URI cannot be re-used by a new Semantics module instance until the Hive instance has been restarted. If the namespace model for the Semantics module needs to be replaced (i.e. by importing an updated model from a NodeSet2 XML file), importing the new model without first deleting the existing one replaces it without requiring a restart of the Hive instance.

Provider: Prediktor

Properties

The semantics module has the following item types

Module State Items

Item Attribute Items

Function item

Properties

The Semantics module has the following properties:

NameDescriptionIDFlags
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ExtItem pass-through qualitySpecifies the quality of external item values that will pass through external item transfers. Default is 'Good and Uncertain qualities'400Persisted, Enumerated, ExpertPage
ExtItemCalculationSequenceDecides whether data validation or data transfer will be performed first in the external item manager.300Persisted, Enumerated, ExpertPage
PersistValToInitValChoose strategy for copying and persisting current value to the InitValue.
Tip: Consider using an InitVQTFromHoneystore attribute instead, for better performance.
1650Persisted, Enumerated
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
ObjectstorePath to the (sqlite) database containing the ua model for the namespace. The path may be expressed relative to the hive instance's configuration folder.1020Persisted
UriThe unique uri identifying the namespace1025Persisted
Update of uri-metainfoSpecifies whether the PublicationDate, ModelVersion and LastChanged properties for the namespace should be maintained automatically or manually.1027Persisted, Enumerated
ModelVersionThe current model version1028Persisted
PublicationDateThe publication date for the namespace model1029Persisted
LastModifiedThe last modified date for the namespace model1030Persisted
Nameoptions

What should make up the parts of function item names. Either browsename or displayname, possibly with the nodeid appended.

1050Persisted, Enumerated
Valuename separator

Character used to separate parts of function item names. Normally '.' is used to indicate parent-child like separation.

1065Persisted
Name termination

When to stop following parent nodes for inclusion in function item names. Stop at naming root: Stop at the first parent object node having a property of the naming root property type. Stop at first object node: Stop at the first parent that is an object node.

1095Persisted, Enumerated
Naming root type URI

The uri-part of the node id for the naming root property type.

1100Persisted
Naming root type ID type

The identifier-type-part of the node id for the naming root type.

1101Persisted, Enumerated
Naming root type ID

The identifier-part of the node id for the naming root type

1102Persisted
Allow naming root subtypes

Whether subtypes of the naming root type should be treated as a naming root

1103Persisted
Missing nameroot policy

What to do if the objects folder is reached without identifying an object having a naming-root property along the way. Fail: Generate an error. Stop at first object node: Regenerate the name, but stop at the first object node. Stop at objects folder: Use the full path to the objects folder.

1105Persisted, Enumerated
Function item generation

Determines when to generate function items for data variables in the namespace. If no types in namespace: Only generate function items if there are no types defined by the namespace. Always, Never, Data variables that are not instance declarations: Unless a data variable is part of a type definition (instance declaration), it will have a function item generated for it.

1115Persisted, Enumerated
Assign IdsIf set, unique ids must be assigned by the client when creating new objects. If not set, unique (numerical) ids will be automatically generated when creating new objects.1080Persisted
Database busy timeoutTimeout in ms specifying max time to wait before returning BUSY error when the database is locked. Hive must be restarted in order for changes to this parameter to take effect.1150Persisted, ExpertPage
Retries when busyThe maximum number of times to retry a db transaction when the operation fails with a BUSY error. Hive must be restarted in order for changes to this parameter to take effect.1155Persisted, ExpertPage
Update Apis EventtypesIf set, custom eventtypes defined in the namespace will be created in Apis Chronical.1160Persisted, ExpertPage

See also Module Properties

Item Types

Item type: Variable

User defined item, which can be written and read.

The Variable item type has the following properties:

NameDescriptionIDFlags
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2NormalPage
ValuetypeItem canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.10010Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Function item

This item is a calculated value based on existing items in Hive. The calculation is a formula based on inputs from external items.

This item has two different calculators (algorithm syntaxes): C# and Legacy (proprietary).

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value changes.
QualityValueTimestamp: Report as updated if either the Quality, Value or Timestamp changes (default).
19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Apis SQL

The ApisSQL module reads and/or writes data to/from any SQL relational database.

Provider: Prediktor

Properties

Commands And Events

The SQL module has the following item types

Read item

Write item

Write vector item

Write matrix item

Triggeritem

More information

Quick Start Guide

SQL Procedure Examples

Get SQLBee Values

Put and Get SQLBee Values

Write Data Target

Properties

The SQL module has the following properties:

NameDescriptionIDFlags
ADO CommandTimeoutIndicates how long to wait (in seconds) while executing an SQL command before terminating the attempt and generating an error. The default is 30 seconds. If you enter the value 0, SQL commands will never time out.1090Persisted
AvgSQLQTimeThe average query time in milliseconds for SQL queries performed. This is averaged from the last time the statistics were reset or Apis was restarted.100000PerformancePage
Connection stateThe current state of the connection to the database. This field allows you to see if the bee is connected to the database, if it has the appropriate security rights and if the stored procedure ran successfully.10000InfoPage
Custom ConnectionStringWhen not using MS SQL Server, or if you want to have full control over the connection string, you can specify a custom connection string here. NOTE! When this property is in use, the database login, database login password, database name, and database server are ignored.1040Persisted, ExpertPage
Database loginThe same as User ID in a standard connection string. If this field is blank, integrated security will be used.1020Persisted, User
Database login passwordThe password to use when accessing the SQL database.1030Persisted, Password
Database nameThe name of the database on given server, used in the connection string.1015Persisted
Database serverThe server machine where the database exists, used in the connection string.1010Persisted
DescriptionA description of this module, useful for future maintenance to describe what this module should do.101Persisted
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
Last executed SQLThe last executed SQL statement, the text last sent to the SQL server for execution. This field can be useful for debugging, as it allows you to see exactly what SQL is being sent to the SQL server.10100InfoPage
LastRecordsetParseTimeThe last time a recordset was received and parsed from the SQL Server.100100PerformancePage
Locale IDThe locale ID to be used when parsing files. This is a unique id for the country and language, used to determine multiple factors including date formats and decimal points.1057Persisted
MaxSQLQTimeThe maximum query time in seconds for the SQL queries performed since the last time the statistics were reset or Apis was restarted.100001PerformancePage
PDS snapshot SourceDatabaseThe name of the snapshot source database. Must be specified when running queries on a PDS snapshot, use this field together with the property 'PDS snapshot usage'1046Persisted, ExpertPage
PDS snapshot usageWhether to use PDS Snapshots or not. When snapshots are to be used, also specify the name of the source database in the property 'PDS snapshot SourceDatabase', and the SQL query will be executed on a snapshot acquired from the PDS Snapshot Manager.1045Persisted, Enumerated, ExpertPage

Provider
The provider of the module. Persisted
SQL statementThe SQL statement to execute. Can be an 'execute statement' or a fully qualified file name containing the statements to execute. This, in combination with the SQL statement type property, decides what text to send to SQL server.1080Persisted
SQL statement typeThis dropdown property has two options: Execute as text - the text will be sent to the SQL database as it's written and executed. For example, SELECT * FROM AnyTableName; Execute as stored procedure - A name of a stored procedure with predefined signature that will be executed with arguments (mandatory when using Write items). The write items are sent as XML, which is covered in further detail in the Quick Start Guide.1079Persisted, Enumerated
Table schema

How the data will be returned from the executed SQL. 'By rows' means item name, value, timestamp and quality are returned in one row, in that order. 'By columns' means item names are column headers with values in the first row; timestamps and quality are not supported. See the sketch following this table.

1075Persisted, Enumerated
Timer intervalTimer interval in milliseconds, must be >= 50 ms. Ensure that this is not smaller than the runtime of the SQL executed, or the server may struggle to keep up with demand from the module.1110Persisted
TimeReferenceItemAn item, the value of which will be used as the time reference for this module instead of the system time when setting the timestamp on items.200Persisted, ApisItem, ExpertPage
VersionThe version of SQL Bee this module is running. Persisted

See also Module Properties
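
To make the interplay between the 'SQL statement type' and 'Table schema' properties concrete, the sketch below shows what an 'Execute as text' statement could return under each of the two table schemas. The tag names and values are hypothetical, and the column aliases simply mirror the GetSQLBeeValues example further down; they are illustrations, not names mandated by the module.

-- 'By rows': one row per item, in the order item name, value, timestamp, quality (192 = OPC 'Good')
SELECT 'Tank1.Level' AS ItemName, '42.5' AS ItemValue, GETUTCDATE() AS ItemTimestamp, 192 AS ItemQuality
UNION ALL
SELECT 'Tank1.Pressure', '1.013', GETUTCDATE(), 192

-- 'By columns': item names as column headers, values in the first row
-- (timestamps and quality are not supported in this mode)
SELECT 42.5 AS [Tank1.Level], 1.013 AS [Tank1.Pressure]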

Commands And Events

The SQL module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
OnTimer

An event fired at a rate given by the ModelRefreshInterval or Timerperiod property

Normal
UpdateDone

An event for when the updating of the recordset is done.

Normal

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
SynchUpdate

Updates the recordset synchronously.

Synchronous
Update

Update the recordset.

Asynchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Read Item Properties

Item type for reading scalar, vector and matrix values from the database.

The Read item item type has the following properties:

NameDescriptionIDFlags
InitValueEnabled

Whether this item has an initial value on start up of Apis Hive. If enabled, this allows for the setting of a value so the initial value isn't blank or null on each restart.

 Persisted
Item type

A field indicating what type of item this is. This is always set to "Read item" for Read Items.

 ReadOnly
Name

The name of this item. This is independent of the name of the item from SQL server (see the SrcItemID property).

 Persisted
Quality

Item quality. This indicates the current quality of the signal from SQL Server. The quality may be "Good" meaning the field is being read correctly, "Bad:Connect error" meaning that there's a problem with the database location or security properties on the module, or "Bad:Config error" meaning there's another problem with the connection to the database, or the field is not found in the database results.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly
Valuetype

Item canonical datatype. The type of value this field contains. For example, string, int, datetime, etc.

10031Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Write Item Properties

Item type for writing scalar values to the database.

The Write item item type has the following properties:

NameDescriptionIDFlags
InitValueEnabled

Whether this item has an initial value on start up of Apis Hive. If enabled, this allows for the setting of a value so the initial value isn't blank or null on each restart.

 Persisted
Item type

A field indicating what type of item this is. This is always set to "Write item" for Write Items.

5801ReadOnly
Name

The name of this item. This is independent of the name of the item from SQL server (see the SrcItemID property).

100Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly
Valuetype

Item canonical datatype. The type of value this field contains. For example, string, int, datetime, etc.

10031Persisted, Enumerated
WriteUpdatedOnly

This field is set to true by default, and means that values will only be sent to SQL Server if the value has changed since the last time the command was run. If the value has not changed, it will be omitted from the list of parameters in the XML to SQL Server.

11000Persisted

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Write Vector Properties

Item type for writing vector values to the database.

The Write vector item item type has the following properties:

NameDescriptionIDFlags
Arraytype

The data type of the values to be held in the vector. For example, string, int, datetime, etc.

10041Persisted
Dimension

The dimension of a vector item (number of elements).

5007Persisted
InitValueEnabled

Whether this item has an initial value on start up of Apis Hive. If enabled, this allows for the setting of a value so the initial value isn't blank or null on each restart.

 Persisted
Item type

A field indicating what type of item this is. This is always set to "Write vector item" for Write Vector Items.

5801ReadOnly
Name

The name of this item. This is independent of the name of the item from SQL server (see the SrcItemID property).

100Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly
WriteUpdatedOnly

This field is set to true by default, and means that values will only be sent to SQL Server if the value has changed since the last time the command was run. If the value has not changed, it will be omitted from the list of parameters in the XML to SQL Server.

11000Persisted

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Write Matrix Properties

Item type for writing matrix values to the database.

The Write matrix item item type has the following properties:

NameDescriptionIDFlags
Arraytype

The data type of the values to be held in the matrix. For example, string, int, datetime, etc.

10041Persisted
Columns

The number of columns in a matrix item.

5009Persisted
InitValueEnabled

Whether this item has an initial value on start up of Apis Hive. If enabled, this allows for the setting of a value so the initial value isn't blank or null on each restart.

 Persisted
Item type

A field indicating what type of item this is. This is always set to "Write matrix item" for Write Matrix Items.

5801ReadOnly
Name

The name of this item. This is independent of the name of the item from SQL server (see the SrcItemID property).

100Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Rows

The number of rows in a matrix item.

5008Persisted
SrcItemID

The item ID in the source. This is the item ID this item uses to fetch data from the source.

5030Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly
WriteUpdatedOnly

This field is set to true by default, and means that values will only be sent to SQL Server if the value has changed since the last time the command was run. If the value has not changed, it will be omitted from the list of parameters in the XML to SQL Server.

11000Persisted

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Trigger Item Properties

Item which can trigger an SQL update.

The Triggeritem item type has the following properties:

NameDescriptionIDFlags
InitValueEnabled

Whether this item has an initial value on start up of Apis Hive. If enabled, this allows for the setting of a value so the initial value isn't blank or null on each restart.

 Persisted
Item type

A field indicating what type of item this is. This is always set to "Triggeritem" for Trigger Items.

5801ReadOnly
Name

The name of this item. This is independent of the name of the item from SQL server (see the SrcItemID property).

100Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

SQL Examples

WriteDataTarget

This script creates the WriteDataTarget table with all the columns needed to read/write data from the Apis SQL module.


USE [TestDB]

GO

/****** Object: Table [dbo].[WriteDataTarget] Script Date: 06/08/2009 15:53:05 ******/

SET ANSI_NULLS ON

GO

SET QUOTED_IDENTIFIER ON

GO

CREATE TABLE [dbo].[WriteDataTarget](

[ID] [int] IDENTITY(1,1) NOT NULL,

[ItemID] [nvarchar](max) NOT NULL,

[ItemValue] [nvarchar](max) NOT NULL,

[ItemTimestamp] [datetime] NOT NULL,

[ItemQuality] [smallint] NOT NULL,

[Row] [smallint] NULL,

[Col] [smallint] NULL,

CONSTRAINT [PK_WriteDataTarget] PRIMARY KEY CLUSTERED

(

[ID] ASC

)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

) ON [PRIMARY]

GO
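
As a quick sanity check that the table accepts rows in the shape the module's Write items produce, a single row can be inserted by hand. This is only an illustrative sketch: the values mirror the XML sample in the PutAndGetSQLBeeValues procedure below, the column list is needed because ID is an identity column, and Row/Col are left NULL as they would be for a scalar Write item.

INSERT INTO [dbo].[WriteDataTarget] (ItemID, ItemValue, ItemTimestamp, ItemQuality, [Row], [Col])
VALUES ('SQLBee1.MyWriteItem1', '1.1', '2009-05-29 12:00:00.000', 192, NULL, NULL)

GO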

PutAndGetSQLBeeValues


--USE [TestDB]

--GO

/****** Object: StoredProcedure [dbo].[PutAndGetSQLBeeValues] Script Date: 12/09/2005 08:50:44 ******/

SET ANSI_NULLS ON

GO

SET QUOTED_IDENTIFIER ON

GO

if (not exists(

select SS.Name, SO.Name from Sys.objects SO

INNER JOIN Sys.schemas SS ON SS.schema_id = SO.schema_id

where SS.Name = 'dbo' and SO.Name = 'PutAndGetSQLBeeValues' and SO.type = 'P'))

BEGIN

exec ('CREATE PROCEDURE dbo.PutAndGetSQLBeeValues AS')

END

GO

-- =============================================

-- Create date: 30/5-2009

-- Description: Sample SP for reading and writing data to/from SQL server using Apis SQL Bee

-- =============================================

ALTER PROCEDURE [dbo].[PutAndGetSQLBeeValues]

-- Add the parameters for the function here

@ByRows bit = 1,

@ModuleName nvarchar(max) = '',

@TriggerName nvarchar(max) = '',

@xmlWriteData xml = null

AS

BEGIN

-- SET NOCOUNT ON added to prevent extra result sets from interfering with SELECT statements.

SET NOCOUNT ON;

------------------------------------------------------------------

-- First handle WRITE operations:

-- GENERIC CODE:

-- The Format of the XML Document give as input param @xmlWriteData:

-- <?xml version="1.0" encoding="utf-16"?>

-- <ROOT>

-- <ItemSample ItemID="SQLBee1.MyWriteItem1" ItemValue="1.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" />

-- <ItemSample ItemID="SQLBee1.MyWriteItem2" ItemValue="3.14" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" />

-- <ItemSample ItemID="SQLBee1.MyWriteItem3" ItemValue="6.28" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" />

-- :

-- <ItemSample ItemID="SQLBee1.MyWriteVector1" ItemValue="1.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="0"/>

-- <ItemSample ItemID="SQLBee1.MyWriteVector1" ItemValue="2.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="1"/>

-- <ItemSample ItemID="SQLBee1.MyWriteVector1" ItemValue="3.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="2"/>

-- <ItemSample ItemID="SQLBee1.MyWriteVector1" ItemValue="4.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="3"/>

-- :

-- <ItemSample ItemID="SQLBee1.MyWriteMatrix1" ItemValue="1.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="0" Col="0"/>

-- <ItemSample ItemID="SQLBee1.MyWriteMatrix1" ItemValue="2.1" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="1" Col="0"/>

-- <ItemSample ItemID="SQLBee1.MyWriteMatrix1" ItemValue="1.2" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="0" Col="1"/>

-- <ItemSample ItemID="SQLBee1.MyWriteMatrix1" ItemValue="2.2" ItemTimestamp="2009-05-29 12:00:00.000" ItemQuality="192" Row="1" Col="1"/>

-- </ROOT>

--print convert(nvarchar(max), @xmlWriteData)

Declare @WriteDataTable TABLE

(

ItemID nvarchar(max),

ItemValue nvarchar(max),

--ItemType smallint,

ItemTimestamp datetime,

ItemQuality smallint,

Row smallint,

Col smallint

)

DECLARE @hXmlDoc int

Exec sp_xml_preparedocument @hXmlDoc Output, @xmlWriteData

if (@hXmlDoc is not null)

begin

INSERT INTO @WriteDataTable(ItemID, ItemValue, ItemTimestamp, ItemQuality, Row, Col)

SELECT ItemID, ItemValue, ItemTimestamp, ItemQuality, Row, Col

FROM

OPENXML(@hXmlDoc, '/ROOT/ItemSample',1)

WITH (ItemID nvarchar(max), ItemValue nvarchar(max), ItemTimestamp datetime, ItemQuality smallint, Row smallint, Col smallint)

end

Exec sp_xml_removedocument @hXmlDoc

-- Then use contents of @WriteDataTable to insert into target system

-- END OF GENERIC CODE:

-- CUSTOMER SPECIFIC CODE!

-- Sample 1: Insert into target table:

--insert into WriteDataTarget (ItemID, ItemValue, ItemTimestamp, ItemQuality, Row, Col)

--select ItemID, ItemValue, ItemTimestamp, ItemQuality, Row, Col from @WriteDataTable

-- Sample 2: Call a customer specific stored procedure

--Exec CustomerStoredProc

-- convert( real, select ItemValue from @WriteDataTable where ItemID = 'Tagnavn 1'),

-- convert( real, select ItemValue from @WriteDataTable where ItemID = 'Tagnavn 2'),

-- convert( real, select ItemValue from @WriteDataTable where ItemID = 'Tagnavn 3'),

-- convert( real, select ItemValue from @WriteDataTable where ItemID = 'Tagnavn 4')

------------------------------------------------------------------

-- Then handle READ operations:

-- Insert statements for procedure here

-- Below is sample code for generating a dummy namespace and dummy values:

Declare @TS datetime

Set @TS = GetUtcDate()

if @ByRows = 1

begin

Declare @RandomTable TABLE

(

ItemName varchar(64),

ItemValue varchar(128),

ItemTimestamp datetime

)

insert into @RandomTable select 'BoolValue', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue', 'StringValue X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue2', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue2', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue2', 'StringValue2 X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue2', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue3', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue3', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue3', 'StringValue3 X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue3', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue4', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue4', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue4', 'StringValue4 X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue4', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue5', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue5', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue5', 'StringValue5 X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue5', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'StringArray', '[2] elem1-' +cast(round(100*RAND(), 0) as varchar(128)) + '; elem2-' + cast(round(100*RAND(), 0) as varchar(128)) + ';', @TS

insert into @RandomTable select 'DoubleArray', '[2] ' + cast(100*RAND() as varchar(128)) + '; ' + cast(100*RAND() as varchar(128)) + ';', @TS

select * from @RandomTable

end

else

begin

Select

cast(round(Rand(), 0) as bit) as 'ColBoolValue',

10*RAND() as 'ColFloatValue',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue',

getDate() as 'ColDateValue',

cast(round(Rand(), 0) as bit) as 'ColBoolValue2',

10*RAND() as 'ColFloatValue2',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue2',

getDate() as 'ColDateValue2',

cast(round(Rand(), 0) as bit) as 'ColBoolValue3',

10*RAND() as 'ColFloatValue3',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue3',

getDate() as 'ColDateValue3',

cast(round(Rand(), 0) as bit) as 'ColBoolValue4',

10*RAND() as 'ColFloatValue4',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue4',

getDate() as 'ColDateValue4',

cast(round(Rand(), 0) as bit) as 'ColBoolValue5',

10*RAND() as 'ColFloatValue5',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue5',

getDate() as 'ColDateValue5',

'[2] tkst-' +cast(round(100*RAND(), 0) as varchar(128)) + '; tkst-' + cast(round(100*RAND(), 0) as varchar(128)) + ';' as 'StringArray',

'[2] ' + cast(100*RAND() as varchar(128)) + '; ' + cast(100*RAND() as varchar(128)) + ';' as 'DoubleArray'

end

END

GetSQLBeeValues


USE [TestDB]

GO

/****** Object: StoredProcedure [dbo].[GetSQLBeeValues] Script Date: 12/09/2005 08:50:44 ******/

SET ANSI_NULLS ON

GO

SET QUOTED_IDENTIFIER ON

GO

-- =============================================

-- Author: <Author,,Name>

-- Create date: <Create Date,,>

-- Description: <Description,,>

-- =============================================

CREATE PROCEDURE [dbo].[GetSQLBeeValues]

-- Add the parameters for the function here

@ByRows bit = 1

--<@Param2, sysname, @p2> <Datatype_For_Param2, , int> = <Default_Value_For_Param2, , 0>

AS

BEGIN

-- SET NOCOUNT ON added to prevent extra result sets from

-- interfering with SELECT statements.

SET NOCOUNT ON;

waitfor delay '00:00:01'

-- Insert statements for procedure here

--SELECT @p1, @p

Declare @TS datetime

Set @TS = GetUtcDate()

if @ByRows = 1

begin

Declare @RandomTable TABLE

(ItemName varchar(64), ItemValue varchar(128), ItemTimestamp datetime)

insert into @RandomTable select 'BoolValue', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue', 'row X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue2', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue2', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue2', 'row X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue2', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue3', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue3', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue3', 'row X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue3', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue4', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue4', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue4', 'row X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue4', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'BoolValue5', cast(round(Rand(), 0) as varchar(128)), @TS

insert into @RandomTable select 'FloatValue5', cast(10*RAND() as varchar(128)), @TS

insert into @RandomTable select 'StringValue5', 'row X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV', @TS

insert into @RandomTable select 'DateValue5', cast(getDate() as varchar(128)), @TS

insert into @RandomTable select 'StringArray', '[2] tkst-' +cast(round(100*RAND(), 0) as varchar(128)) + '; tkst-' + cast(round(100*RAND(), 0) as varchar(128)) + ';', @TS

insert into @RandomTable select 'DoubleArray', '[2] ' + cast(100*RAND() as varchar(128)) + '; ' + cast(100*RAND() as varchar(128)) + ';', @TS

select * from @RandomTable

end

else

begin

Select

cast(round(Rand(), 0) as bit) as 'ColBoolValue',

10*RAND() as 'ColFloatValue',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue',

getDate() as 'ColDateValue',

cast(round(Rand(), 0) as bit) as 'ColBoolValue2',

10*RAND() as 'ColFloatValue2',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue2',

getDate() as 'ColDateValue2',

cast(round(Rand(), 0) as bit) as 'ColBoolValue3',

10*RAND() as 'ColFloatValue3',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue3',

getDate() as 'ColDateValue3',

cast(round(Rand(), 0) as bit) as 'ColBoolValue4',

10*RAND() as 'ColFloatValue4',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue4',

getDate() as 'ColDateValue4',

cast(round(Rand(), 0) as bit) as 'ColBoolValue5',

10*RAND() as 'ColFloatValue5',

'column X' +cast(round(100*RAND(), 0) as varchar(128)) + 'XEV' as 'ColStringValue5',

getDate() as 'ColDateValue5',

'[2] tkst-' +cast(round(100*RAND(), 0) as varchar(128)) + '; tkst-' + cast(round(100*RAND(), 0) as varchar(128)) + ';' as 'StringArray',

'[2] ' + cast(100*RAND() as varchar(128)) + '; ' + cast(100*RAND() as varchar(128)) + ';' as 'DoubleArray'

end

END
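To sanity-check the procedure from a SQL client before pointing a SQLBee at it, it can be executed directly; both result layouts can then be inspected:

EXEC [dbo].[GetSQLBeeValues] @ByRows = 1  -- one row per item: ItemName, ItemValue, ItemTimestamp

EXEC [dbo].[GetSQLBeeValues] @ByRows = 0  -- one wide row, with one column per item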

Apis StateSave

This module saves and loads values of items from the Apis Hive namespace to an ASCII file.

Provider: Prediktor

Properties

Commands And Events

The StateSave module has the following item types

StateFileName

StateFileFolder

StateSaveLoad

StateSaveDelete

Properties

The StateSave module has the following properties:

Name Description ID Flags
AwaysReportStateChange Set this value to false when you only want to fire the 'StateSave' or 'StateLoad' event on a successful save or load of one or more state files. 1800 Persisted, ExpertPage
ExchangeRate The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. 100 Persisted
SkipModuleName When true, the module part of item names is omitted when saving/loading statesave files. This property applies only when NOT using the global Key attribute to look up statesave items. 1700 Persisted
StateFileFolder The folder in which the file(s) resides (working folder). 1600 Persisted, Folder
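As an illustration (the item name is hypothetical): with SkipModuleName set to true, an item named StateSave1.Level would be written to the state file as just Level; with it set to false, the full name StateSave1.Level is used.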

See also Module Properties

Commands And Events

The StateSave module has the following Commands and Events:

Events

Name Description Event Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
StatesDeleted

A state-delete operation has been performed.

Normal
StatesLoaded

A state-load operation has been performed.

Normal
StatesSaved

A state-save operation has been performed.

Normal

Commands

Name Description Command Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

A StateSave file item

The StateFileName item type has the following properties:

Name Description ID Flags
CreateDirectory

Whether to attempt to create file directory if it doesn't exist.

10306Persisted
DeleteDir

Deletes the snapfile directory after a delete of snapfile, if the directory is empty.

10320Persisted
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
FileAccess

Decides whether the file specified in the item is in read, write, or read/write mode (Read=0, Write=1, Read/Write=2).

10300Persisted, Enumerated
FilePrefix

A fixed value to include before the file name.

10302Persisted
FileSuffix

A fixed value to include after the file name.

10304Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
TriggerOnChange

If true, the state file will be read/written to when the item changes.

10310Persisted
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

The folder in which the file(s) resides (working folder).

The StateFileFolder item type has the following properties:

Name Description ID Flags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

A StateSave item

The StateSaveLoad item type has the following properties:

Name Description ID Flags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

A StateSave delete item

The StateSaveDelete item type has the following properties:

Name Description ID Flags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis TaskScheduler

This module schedules non-Apis tasks for execution.

Provider: Prediktor

Properties

Commands And Events

The TaskScheduler module has the following item types

Task

Properties

The TaskScheduler module has the following properties:

Name Description ID Flags
DefaultFolder Default browsing folder for files to use when adding tasks. 1010 Persisted, Folder
ExchangeRate The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. 100 Persisted
ShowWebBrowser Shows the window which navigates to and prints 'HTMLprint' tasks. 1020 Persisted
TimeReferenceItem An item whose value will be used as the time reference for this module instead of the system time, when timestamping items. 200 Persisted, ApisItem, ExpertPage

See also Module Properties

Commands And Events

The TaskScheduler module has the following Commands and Events:

Events

Name Description Event Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

Name Description Command Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

An executable command in the OS or a file to apply a verb to.

The Task item type has the following properties:

Name Description ID Flags
CurrentState

The CURRENT_STATE of a task.

11000Persisted, Hidden
Directory

The directory in which to execute the task.

10050Persisted, Folder
File

The file to execute, print, etc.

10005Persisted, File
NextTime

The next execution time of the task in the local timezone.

10010Persisted
Parameter

The parameters to send to the task (if needed).

10040Persisted
Period

The time period between each repetition of the task.

10020Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Show

How to display the task when executed.

10060Persisted, Enumerated
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Verb

The verb to apply to this task (normally 'print' or 'open').

10030Persisted, Enumerated
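As an illustration only (the file path and schedule are made-up values): a Task item with File set to C:\Reports\daily.html, Verb set to 'print', NextTime set to 06:00 the next morning, and Period set to a daily repetition would print that report every morning, with the Directory and Show properties controlling where and how it runs.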

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis UaAcClientBee

Used to communicate with OPC UA A&C (Alarms & Conditions) servers.

Provider: Prediktor

Properties

Commands And Events

The UaAcClientBee module has the following item types

ApisStatusItem

ApisAlarmConditionItem

ApisAlarmMonitorItem

ApisSimpleAttributeFilterItem

Properties

The UaAcClientBee module has the following properties:

Name Description ID Flags
Area The default area to link the source item in Apis Hive (default: Area). 1140 Persisted
Certificate Specifies the certificate used when connecting to an OPC UA server (default: Apis Hive instance certificate). 1490 Persisted, Enumerated, ExpertPage
CertificateSubject The server certificate subject. 3050 InfoPage
CertificateThumbprint The server certificate thumbprint. 3060 InfoPage
CurrentKeepAliveCount 3010 InfoPage
CurrentLifetimeCount 3020 InfoPage
CurrentPriority 3040 InfoPage
CurrentPublishingEnabled 3030 InfoPage
CurrentPublishingInterval 3001 InfoPage
Enabled The parameter is used to enable or disable the module (default: false). 1050 Persisted
ExchangeRate The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. 100 Persisted
KeepAliveCount When this counter reaches the value calculated for the lifetime of a Subscription based on the MaxKeepAliveCount parameter, the Subscription is closed. Minimum value is 5 (default: 10). 1200 Persisted
LifetimeCount The number of consecutive publishing timer expirations without Client activity before the Subscription is terminated (Subscription LifetimeCount). Minimum value is 5 (default: 10). 1250 Persisted
MaxNotificationsPerPublish The maximum number of notifications that the Client wishes to receive in a single Publish response. A value of zero indicates that there is no limit (default: 1000). 1300 Persisted
Operation Mode The operation mode of the Bee (default: Both_auto). 1112 Persisted, Enumerated
OperationTimeout The OperationTimeout should be twice the minimum value of PublishingInterval*KeepAliveCount (default: 20000). 1150 Persisted
Password 1450 Persisted, Hidden, Password, ExpertPage
Priority Indicates the relative priority of the Subscription. When more than one Subscription needs to send Notifications, the Server should dequeue a Publish request to the Subscription with the highest priority number. A Client that does not require special priority settings should set this value to zero (Subscription Priority, default: 0). 1350 Persisted
PublishingInterval The cyclic rate, in milliseconds, at which the Subscription is requested to return Notifications to the Client (Subscription PublishingInterval). The value 0 is invalid (default: 1000). 1170 Persisted
ReconnectTimout The timeout, in seconds, for reconnecting to the server. Values greater than 5 are valid, and 0 disables it (default: 30). 1125 Persisted
RefreshSupported Whether the server supports the refresh function for alarms (default: true). 1145 Persisted
StorePath The path of the certificate store (default: 'LocalMachine\UA Applications'). 1550 Persisted, ExpertPage
StoreType The type of the certificate store (default: Windows). 1500 Persisted, Enumerated, ExpertPage
SubjectName The subject name of the certificate. Required if Certificate=Custom. 1600 Persisted, ExpertPage
TraceLogMaxSize The maximum size of the trace log file, in KBytes. A value < 0 means no limit, and 0 turns the log off. When a size is specified, it must be greater than 100 KByte. The file is located in the configuration folder (default: 0). 1800 Persisted, ExpertPage
URL OPC UA server URL (http OR opc.tcp). 1100 Persisted
Username 1400 Persisted, Hidden, ExpertPage
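As a worked example based on the defaults above: with PublishingInterval = 1000 ms and KeepAliveCount = 10, twice the product is 2 x 1000 x 10 = 20000 ms, which matches the default OperationTimeout.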

See also Module Properties

Commands And Events

The UaAcClientBee module has the following Commands and Events:

Events

Name Description Event Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

Name Description Command Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

Internal items that are used to inform you about the status.

The ApisStatusItem item type has the following properties:

Name Description ID Flags
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001Persisted, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

This item is used to link an OPC UA Alarm&Condition event to an item source in Apis.

The ApisAlarmConditionItem item type has the following properties:

Name Description ID Flags
Area

This is the source item reference.

10025Persisted, ExpertPage
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001Persisted, ExpertPage
OpcEventType

The source item reference.

10150Persisted, Enumerated, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SourceName

The name of the source event object.

10050Persisted, ExpertPage
SourceUri

The source URI of the item reference (Default:nsu=http://opcfoundation.org/UA/;i=2253).

10075Persisted, ExpertPage
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

This item is used to configure monitoring of alarms in UA.

The ApisAlarmMonitorItem item type has the following properties:

Name Description ID Flags
AreaUri

The source URI of the item reference (Default:nsu=http://opcfoundation.org/UA/;i=2253).

10065Persisted, ExpertPage
Enabled

Whether this item is enabled or not.

10012Persisted
EventType

The type of event to subscribe to.

10175Persisted, Enumerated
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001Persisted, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
SimpleAttributeFilters

The SimpleAttributeOperand structure is used in the select clause to select the value to return. It returns it if an event meets the criteria specified by the where clause.

10200Persisted, ApisLocalItem
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

This item is used to create filters which are used when setting up monitoring on items.

The ApisSimpleAttributeFilterItem item type has the following properties:

Name Description ID Flags
AttributeType

The type of attribute used in the simple operand expression. For example, string, integer, etc.

10400Persisted, Enumerated
BrowsePath

The browse path used in the simple operand expression.

10375Persisted, Enumerated
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
ItemHandle

The handle of the item - this is the unique ID for the item. This property is often hidden.

10001Persisted, ExpertPage
LiteralOperand

The operand used in the filter expression.

10300Persisted, Enumerated
LiteralValueType

The value type of the literal value. The value of the item is used to set the "LiteralValue".

10275Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
TypeDefinition

The type definition used in the "Simple operand" expression.

10350Persisted, Enumerated
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis UaPublisherBee

This module can be used to publish information to Microsoft Azure Event Hub. We create the messages according to the OPC UA specification defined in "OpcUa Part 14 PubSub Release 1.04 Specification".

Also see: Stream data to broker

We only support the JSON messages defined in 7.3.2 JSON Message Mapping, and the Json_Simplified format.

We use a standard communication protocol (AMQP, MQTT, or Kafka) to send the information to the broker. It is also possible to store the created messages in files, but this is mainly for checking message layout and content for debugging purposes.

The "BrokerType" property defines the communication to the broker. Below is a description of how to configure the different broker types:

File:

Select this and define a filename under an existing directory, e.g. "c:/pubsub/data.json", in the "FileName" property. The file will be created, and for every message sent, a new line is added to the file. When the file reaches its maximum size, a new file is created; up to 10 files are used, after which the files are reused.

AMQP:

AMQP communication is used. Properties starting with "AMQP" have to be set. See Stream data to broker for more information.

MQTT:

The properties starting with "MQTT" define this communication. Check the broker in use to get the details. The MQTT transport property defines either plain or secure communication; the value TcpServerTls represents secure communication. In that case, the certificate from the broker will be stored in the rejected certificate directory (e.g. ...Config/InstanceName/pki/rejected). If you trust this broker and the certificate, move it to the "trusted" directory.
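As an illustration only (the address, port, client id and topic below are made-up example values, not defaults): a minimal secure MQTT setup could use Broker type = MQTT, MQTT main address = broker.example.com, MQTT main port = 8883, MQTT main transport = TcpServerTls, MQTT main clientid = ApisPublisher01, and, on the Writer Group item, MQTT main topic = plant/telemetry.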

Provider: Prediktor

Properties

Commands And Events

JSON messages

UaPublisher data flow

The UaPublisherBee module has the following item types

Writer Group

Variable DataSetWriter

Status

Properties

The UaPublisherBee module has the following properties:

Name Description ID Flags
_DataSetWriterId_ Hidden property to book-keep the DataSetWriterId attribute put on DataSetWriter items. 5010 Persisted, Hidden
_WriterGroupIdProperty_ Hidden property to book-keep the WriterGroupId attribute put on WriterGroup items. 5011 Persisted, Hidden
Backfill databasename The name of the database where messages are stored when they could not be transmitted to the broker. 6010 Persisted, File
Backfil databasesize The maximum size of the backfill database in GB. If 0, no limit on the database size is set. 6011 Persisted
Broker type File: messages are written to a file. AMQP: messages are sent to Microsoft Azure Event Hub. MQTT: messages are sent to an MQTT broker. 1100 Persisted, Enumerated
DeltaTime Datasets whose newest timestamp is older than now minus deltatime (in seconds) will be sent to the backfill address. If 0, no time evaluation is done. 6007 Persisted
ExchangeRate The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. 100 Persisted
ExternalItem report A status report for the external item manager of this module. 110 InfoPage
ExtItem full refresh When true, the external items manager will force a full refresh initially on start/reset when reading items, i.e. items not yet initialized in their source will also trigger an external item update. Default is true. 150 Persisted, ExpertPage
ExtItem pass-through quality Specifies the quality of external item values that will pass through external item transfers. If an external item's quality is worse than this mask, the external item transfer is blocked. Default is 'Any quality'. 400 Persisted, Enumerated, ExpertPage
LogLevel Specifies the log level for diagnostic messages from this module. 500 Persisted, Enumerated
Message encoding The encoding to use. Json: the message is created according to the OPC UA JSON PubSub specification (see the OPC UA documentation). UADP: not in use. Json_Simplified: a simplified JSON format (see documentation). Json_Simplified_2: a simplified JSON format with common timestamp and quality (see documentation). 5000 Persisted, Enumerated
Metadata directory The directory where metadata files are created. If empty or invalid, no metadata will be created. 6000 Persisted, Folder
PublisherId The PublisherId is a unique identifier for a Publisher within a Message Oriented Middleware. 1010 ReadOnly, ReadOnlyAfterCreate
Reset counters Resets all counters for the publisher. 6005 Persisted, Folder
Tracecverbosity The trace verbosity when the LogLevel property is set to Debug or higher. (If LogLevel < Debug, this property has no effect.) 1005 Persisted, Enumerated
Transport protocol The TransportProtocol to use (Datagram or Broker). 1050 Persisted, Hidden, Enumerated
AMQP main address The connection string to Azure Event Hub. In addition, the EntityPath has to be added to the WriterGroup item property 'AMQP main EntityPath'. 2100 Persisted
AMQP backfill address The connection string to Azure Event Hub. In addition, the EntityPath has to be added to the WriterGroup item property 'AMQP backfill EntityPath'. 2120 Persisted
AMQP communication The communication to use. 1101 Persisted
AMQP connectiontype Whether there is one connection per group (Multiple), one connection for all groups (Single), or a new connection for each transaction (Transient). 2122 Persisted
Filename The filename when BrokerType is File. 2300 Persisted, Hidden, File
MQTT main address Address of the MQTT broker. 2402 Persisted
MQTT main port MQTT broker port. 2403 Persisted
MQTT main clientid A string that defines/identifies the client's session at the broker. 2404 Persisted
MQTT main user The client's username. 2405 Persisted
MQTT main password The client's password. 2406 Persisted
MQTT main clean session Not relevant when sending data. 2409 Persisted
MQTT main version Select between V3.1.1 and V5.0. 2410 Persisted
MQTT main transport Transmission protocol: TcpServer uses unencrypted communication, TcpServerTls uses encrypted communication. 2411 Persisted
MQTT main client certificate The filename of the client's certificate, used when the broker expects a client certificate to accept the connection. 2415 Persisted
MQTT backfill address Address of the MQTT broker. 2502 Persisted
MQTT backfill port MQTT broker port. 2503 Persisted
MQTT backfill clientid A string that identifies the client at the broker. 2504 Persisted
MQTT backfill user The username this client uses when connecting to the MQTT broker. 2505 Persisted
MQTT backfill password The password this client uses when connecting to the MQTT broker. 2506 Persisted
MQTT backfill clean session Not relevant when sending data. 2509 Persisted
MQTT backfill version Select between V3.1.1 and V5.0. 2510 Persisted
MQTT backfill transport Transmission protocol: "TcpServer" uses unencrypted communication, "TcpServerTls" uses encrypted communication. 2511 Persisted
MQTT backfill client certificate The filename of the client's certificate, used when the broker expects a client certificate to accept the connection. 2515 Persisted

See also Module Properties

Commands And Events

The UaPublisherBee module has the following Commands and Events:

Events

Name Description Event Type
ExternalItems Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified by the user. The timer resolution is specified by the 'ExchangeRate' property. Timer

Commands

Name Description Command Type
HandleExternalItems Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed. Synchronous
UpdateItemTimestamp Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured. Synchronous

See also Commands And Events

JSON messages

We support two main types of messages: Json, defined by OPC UA, and Json_Simplified, defined by Prediktor. At present we do not support the binary UADP message.

Json_Simplified Message

Json_Simplified is a simple, fixed JSON format consisting of one object with an element named "data" that holds an array of VQT objects. In addition, the object can be expanded by the user with a fixed string. The italicized part in the example below is a user-defined extension; it can be added in the UserExtra parameter (id: 11045) of a Writer Group item, and it has to expand the object with valid JSON.

When using this format, the UaPublisherBee will automatically split or concatenate DataSets to maximize the message size without it becoming too large. Because of this, the user does not need to tune the number of values to control the message size.

Json_Simplified Example


{

"data": [

_"schema": "apis/schemas/vqt.json","dataType":"raw","plant":"myPlant",_

{"tag": "test2.Worker.mySignal1","time": "2019-02-09T11:44:34.8905431Z","value": 145.126114,"status": 0},

{tag": "test2.Worker.mySignal2","time": "2019-02-09T11:44:34.8895786Z","value": 126.807716,"status": 0},

. . .

]

}

Json Message defined by OPC Unified Architecture.

This format is defined in the "OPC Unified Architecture Specification Part 14: PubSub Release 1.04"; see chapter 7.2.3, JSON Message Mapping, where it is described in detail. Below is an example where a WriterGroup has the property JsonNetworkMessageContentMask (id: 11100) set to NetworkMessageHeader=true, DataSetMessageHeader=true, and PublisherId=true, and the Variable DataSetWriter property DataSetFieldContentMask (id: 12060) set to StatusCode=true and SourceTimestamp=true. To see how the JSON changes, look in the documentation, or try it out by changing the parameters and writing the messages to a file.

Json Message Example


{

"MessageId": "5aaf34fe-51d8-4ddc-a294-3e8d3db3b5ab",

"MessageType": "ua-data",

"PublisherId": "UaPublisherBee1",

"Message": [

{

"DataSetWriterId": "2",

"PayLoad": {

"test2.Worker.Signal1": {

"Value": 0.911874,

"SourceTimestamp": "2019-02-13T15:03:33.1637587Z"

},

"test2.Worker.Signal2": {

"Value": 426.613861,

"SourceTimestamp": "2019-02-13T15:03:33.1637587Z"

},

"test2.Worker.Signal3": {

"Value": 113.692924,

"SourceTimestamp": "2019-02-13T15:03:33.1637587Z"

}

}

},

{

"DataSetWriterId": "2",

"PayLoad": {

"test2.Worker.Signal1": {

"Value": 0.889142,

"SourceTimestamp": "2019-02-13T15:03:34.163777Z"

},

"test2.Worker.Signal2": {

"Value": 423.00354,

"SourceTimestamp": "2019-02-13T15:03:34.163777Z"

},

"test2.Worker.Signal3": {

"Value": 113.268066,

"SourceTimestamp": "2019-02-13T15:03:34.163777Z"

}

}

}

]

}

UaPublisher Data Flow

The figure below illustrates the internal data flow in the ApisUaPublisherBee.

The OpcUa bee creates a new DataSet every time new values arrive from the OPC UA server. The DataSet is transferred to the UaPublisher and added to the DataQue. Each writer in the UaPublisher has its own queue. The queue size is defined by the 'DataSet message buffer size' property (id: 12110) and should be 2-3 times the expected number of DataSets in a PublishInterval (id: 11020). By setting the diagnostics level for the Variable DataSetWriter to Debug, you get a status item DataSetQueSize showing the current number of DataSets in the DataQue.

On each PublishInterval (id: 11020), the WriterGroup empties all DataQues, converts the DataSets according to the specification, and stores the messages in the MessageQue. The maximum size of the MessageQue is defined by the 'Max NetworkMessage queuesize' parameter (id: 11025). There is one queue for each WriterGroup. If the MessageQue is full, the message is written to the database (DB).

There is one thread for each MessageQue that sends the messages to the specified destination. If a send fails, the message will be resent once, and if that also fails, the message is stored in the DB. When, for example, the network to an Event Hub is down, all messages will end up in the DB.

Messages in the DB are sent to the backfill Event Hub. Messages are deleted from the DB when they are successfully written to the Event Hub.

If the OpcUaBee that provides the DataSets is configured with catchup, all DataSets are converted to JSON messages and stored directly in the DB. If size limits are applied to the DB, the UaPublisher will wait until the DB is ready for more. In this case no DataSets will be lost.
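As a rough worked example (the rates are illustrative, not defaults): if the OPC UA source delivers one DataSet every 250 ms and the PublishInterval is 1000 ms, about 4 DataSets arrive per interval, so a 'DataSet message buffer size' of roughly 8-12 gives the recommended 2-3 times headroom.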

APIS UaPublisherBee Item Types

Topics Covered:

Writer Group Properties

Items representing a Publisher WriterGroup.

The Writer Group item type has the following properties:

Name Description ID Flags
DataSetOrdering The DataSetOrdering defines the ordering of the DataSetMessages in the NetworkMessages. The default value is Undefined. 11210 Persisted, Hidden, Enumerated
Diagnostics level Define the level of diagnostics to be created. 10150 Persisted, Enumerated
Enabled Whether the item is enabled or not. 10100 Persisted
GroupVersion The parameter GroupVersion with DataType VersionTime defines the expected value in the field GroupVersion in the header of the NetworkMessage. The default value 0 is defined as null value, and means this parameter shall be ignored. 11200 Persisted, Hidden
JsonNetworkMessageContentMask The parameter NetworkMessageContentMask defines the optional header fields to be included in the NetworkMessages produced by the WriterGroup. The DataType for the JSON NetworkMessage mapping is JsonNetworkMessageContentMask. (See doc OpcUa Part 14 1.0.4 chapter 6.3.2.2.1) 11100 Persisted, EnumeratedFlags
KeepAliveTime The KeepAliveTime with DataType Duration defines the time in milliseconds until the Publisher sends a keep alive DataSetMessage in the case where no DataSetMessage was sent in this period by a DataSetWriter. The minimum value shall equal the PublishingInterval. 11030 Persisted
Max NetworkMessage queuesize Maximum number of NetworkMessages in the internal queue. 11025 Persisted
MaxNetworkMessageSize The maximum size in bytes for NetworkMessages created by the WriterGroup. It refers to the size of the complete message including padding and signature, without any additional headers added by the transport protocol mapping. If the size of a message exceeds the "Max message size", the behaviour depends on the message mapping. 11010 Persisted
MessageRepeatCount The MessageRepeatCount with DataType Byte defines how many times every NetworkMessage is repeated. The default value is 0 and disables the repeating. 11060 Persisted, Hidden
MessageRepeatDelay The MessageRepeatDelay with DataType Duration defines the time between NetworkMessage repeats in milliseconds. The parameter shall be ignored if the parameter MessageRepeatCount is set to 0. 11065 Persisted, Hidden
AMQP partitionkey The key used to generate a hash code which decides the partition the message will be sent to when using AMQP. 11015 Persisted
Priority The Priority with DataType Byte defines the relative priority of the WriterGroup to all other WriterGroups across all PubSubConnections of the Publisher. If more than one WriterGroup needs to be processed, the priority number defines the order of processing. The highest priority is processed first. The lowest priority is 0 and the highest is 255. 11040 Persisted, Hidden
PublishingOffset The SamplingOffset with the DataType Duration defines the time in milliseconds for the offset of creating the NetworkMessage in the PublishingInterval cycle. Any negative value indicates that the optional parameter is not configured. In this case the Publisher shall calculate the time before the PublishingOffset that is necessary to create the NetworkMessage in time for sending at the PublishingOffset. 11240 Persisted, Hidden
PublishInterval The PublishingInterval defines the interval in milliseconds for publishing NetworkMessages and the embedded DataSetMessages created by the related DataSetWriters. 11020 Persisted
Quality Item quality 3 ReadOnly
AMQP main EntityPath A string parameter specifies the EntityPath in the broker. 11053 Persisted
AMQP backfill EntityPath A string parameter specifies the EntityPath in the broker. 11054 Persisted
MQTT main topic The topic string parameter specifies the queue this WriterGroup sends messages to. 11050 Persisted
MQTT backfill topic The topic string parameter specifies the queue this WriterGroup sends messages to. 11051 Persisted
RequestedDeliveryGuarantee The RequestedDeliveryGuarantee parameter with DataType BrokerTransportQualityOfService specifies the delivery guarantees that shall apply to all NetworkMessages published by the WriterGroup unless otherwise specified on the DataSetWriter transport settings. 11055 Persisted, Enumerated
Rights Item access rights 5 ReadOnly
SamplingOffset The SamplingOffset with the DataType Duration defines the time in milliseconds for the offset of creating the NetworkMessage in the PublishingInterval cycle. Any negative value indicates that the optional parameter is not configured. In this case the Publisher shall calculate the time before the PublishingOffset that is necessary to create the NetworkMessage in time for sending at the PublishingOffset. 11230 Persisted, Hidden
Time Item timestamp 4 ReadOnly
Type Item canonical datatype 1 ReadOnly
UadpNetworkMessageContentMask The parameter NetworkMessageContentMask defines the optional header fields to be included in the NetworkMessages produced by the WriterGroup. The DataType for the UADP NetworkMessage mapping is UadpNetworkMessageContentMask. 11220 Persisted, Hidden, EnumeratedFlags
UserExtra UserExtra is a static JSON part that is added to the Json_Simplified message. 11045 Persisted
Value Item value 2 ReadOnly
WriterGroupId The WriterGroupId with DataType UInt16 is an identifier for the WriterGroup and shall be unique across all WriterGroups for a PublisherId. All values, except for 0, are valid. The value 0 is defined as null value. 11000 Persisted, ReadOnly

See also Predefined Item Properties and OPC DA Properties

Variable DataSet Writer Properties

Items representing a variable DataSet writer.

The Variable DataSetWriter item type has the following properties:

Name Description ID Flags
ConfiguredSize The parameter ConfiguredSize with the DataType UInt16 defines the fixed size in bytes a DataSetMessage uses inside a NetworkMessage. The default value is 0 and it indicates a dynamic length. If a DataSetMessage would be smaller in size (e.g. because of the current values that are encoded), the DataSetMessage is padded with bytes with value zero. In case it would be larger, the Publisher shall set bit 0 of the DataSetFlags1 to false to indicate that the DataSetMessage is not valid. 12510 Persisted, Hidden
DataSetFieldContentMask A DataSet field consists of a value and related metadata. In most cases the value comes with status and timestamp information. Set flags to decide what to include in addition to the value in the DataSetMessage. (See OpcUa Part 14 1.04 chapter 7.2.3.3) 12060 Persisted, EnumeratedFlags
DataSet connected data The name of the corresponding PublishedDataSet, i.e. the name of an Apis module in this instance that implements the IApisPublishedDataSet interface and that will provide the dataset to publish. 12100 Persisted, DynamicEnumeration
DataSetWriterId The DataSetWriterId defines the unique ID of the DataSetWriter for a PublishedDataSet. It is used to select DataSetMessages for a PublishedDataSet on the Subscriber side. It shall be unique across all DataSetWriters for a PublisherId. All values, except for 0, are valid DataSetWriterIds. The value 0 is defined as null value. 12050 Persisted, ReadOnly
Diagnostics level Defines the level of diagnostics to be created. 10150 Persisted, Enumerated
Enabled Whether the item is enabled or not. 10100 Persisted
JsonDataSetMessageContentMask The DataSetMessageContentMask defines the flags for the content of the DataSetMessage header. 12400 Persisted, EnumeratedFlags
KeyFrameCount The KeyFrameCount with DataType UInt32 is the multiplier of the PublishingInterval that defines the maximum number of times the PublishingInterval expires before a key frame message with values for all published Variables is sent. The delta frame DataSetMessages contain just the changed values. If no changes exist, the delta frame DataSetMessage shall not be sent. If the KeyFrameCount is set to 1, every message contains a key frame. 12070 Persisted
MajorVersion The MajorVersion reflects the time of the last major change of the DataSet content. 18050 ReadOnly, InfoPage
DataSet message buffer size The maximum number of DataSets that can be buffered internally. 12110 Persisted
MinorVersion The MinorVersion reflects the time of the last change of the DataSet content. 18051 ReadOnly, InfoPage
Quality Item quality 3 ReadOnly
Rights Item access rights 5 ReadOnly
Time Item timestamp 4 ReadOnly
Type Item canonical datatype 1 ReadOnly
UadpDataSetMessageContentMask The UadpDataSetMessageContentMask defines the flags for the content of the DataSetMessage header, when using UADP message mapping. 12500 Persisted, Hidden, EnumeratedFlags
Value Item value 2 ReadOnly
WriterGroup itemname The WriterGroup item for this dataset writer. 10200 Persisted, APISLocalItem, DynamicEnumeration
Value identifier The names to use in messages to identify values. SourceNodeId: use the OPC UA NodeId of the value. ItemName: use the APIS item name. SourceNodeId_Regex: use the OPC UA NodeId, but additionally manipulate it with regex. ItemName_Regex: use the APIS item name and manipulate it with regex. 12205 Persisted
Regex source If the name shall use regex: this parameter holds the string to search for. 12210 Persisted
Regex replace If the name shall use regex: this parameter holds the replace string. 12220 Persisted

See also Predefined Item Properties and OPC DA Properties

Status Properties

This item is used to expose internal status information.

The Status item type has the following properties:
Name Description ID Flags
DiagnosticsClassification 14020 Persisted, ReadOnly, Enumerated
DiagnosticsLevel 14010 Persisted, ReadOnly, Enumerated
Quality Item quality 3 ReadOnly
Reset Resets this status item 14000 Persisted
Rights Item access rights 5 ReadOnly
Time Item timestamp 4 ReadOnly
Type Item canonical datatype 1 ReadOnly
Value Item value 2 ReadOnly
The following status items are exposed, depending on the configured diagnostics level:

Module Status Description DiagnosticsLevel DiagnosticClassification
DBMessageCount The number of messages currently stored in the database Basic Information
MainPublisher.SentMessages Number of sent messages Basic Information
BackfillPublisher.SentMessages Number of messages sent to the backfill Event Hub Basic Information
LostMessages Number of lost messages Basic Error
MainPublisher.ResolvedAddress Status of resolving the address to the Event Hub Basic Information
BackfillPublisher.ResolvedAddress Status of resolving the address to the backfill Event Hub Basic Information

Writer Group Status Description DiagnosticsLevel DiagnosticClassification
ConversionError Number of errors when converting data to JSON Basic Error
PublisherExceedsTimer Time used in the sending process Advanced Information
BackfillPublisherExeedsTimer Time used in the sending process Advanced Information
SentNetworkMessages Number of sent messages Info Information
FailedTransmissions Number of failed transmissions Info Information
RemovedNetworkMessages Removed messages Info Information
SendMessageQueSize Number of messages currently in the message queue Debug Information
SendThread Current status of the message thread Debug Information

Variable DataSet Writer Status Description DiagnosticsLevel DiagnosticClassification
RemovedDataSetMessages Number of removed dataset messages Info Information
KeyFrameCount Number of key frames sent Info Information
DataSetsReceived Number of datasets received Info Information
MessageSequenceNumber The current MessageSequenceNumber Info Information
DataSetQueSize The current size of the dataset queue Debug Information

See also Predefined Item Properties and OPC DA Properties

Apis VectorFunctionBee

General vector function bee.

Provider: Prediktor

Properties

Commands And Events

The VectorFunctionBee module has the following item types

Child item - BooleanItem

Child item - TriggerItem

Child item - IntegerItem

Child item - DoubleItem

Child item - StringItem

Child item - DoubleVectorItem

FunctionItem

AggregatedItem

Properties

The VectorFunctionBee module has the following properties:

Name Description ID Flags
ExchangeRate The exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values. 100 Persisted

See also Module Properties

Commands And Events

The VectorFunctionBee module has the following Commands and Events:

Events

Name Description Event Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer

Commands

Name Description Command Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

Boolean child item type.

The Child item - BooleanItem item type has the following properties:

Name Description ID Flags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
MandatoryAttribute

An attribute that determines if the child item is mandatory or not.

10003Persisted, ReadOnly
ParentItem

The parent item name of this item.

5502Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Trigger child item type.

The Child item - TriggerItem item type has the following properties:

Name Description ID Flags
MandatoryAttribute

An attribute that determines if the child item is mandatory or not.

10003Persisted, ReadOnly
ParentItem

The parent item name of this item.

5502Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Integer (Int32) child item type.

The Child item - IntegerItem item type has the following properties:

Name Description ID Flags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
MandatoryAttribute

An attribute that determines if the child item is mandatory or not.

10003Persisted, ReadOnly
ParentItem

The parent item name of this item.

5502Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Double child item type.

The Child item - DoubleItem item type has the following properties:

Name Description ID Flags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
MandatoryAttribute

An attribute that determines if the child item is mandatory or not.

10003Persisted, ReadOnly
ParentItem

The parent item name of this item.

5502Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

String child item type.

The Child item - StringItem item type has the following properties:

NameDescriptionIDFlags
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
MandatoryAttribute

An attribute that determines if the child item is mandatory or not.

10003Persisted, ReadOnly
ParentItem

The parent item name of this item.

5502Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Double array child item type.

The Child item - DoubleVectorItem item type has the following properties:

NameDescriptionIDFlags
ExtItemOverrideMethod

This attribute decides what method to use when assigning a value from an external item.

5999Persisted, Enumerated, BitMask
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
MandatoryAttribute

An attribute that determines if the child item is mandatory or not.

10003Persisted, ReadOnly
ParentItem

The parent item name of this item.

5502Persisted, ApisLocalItem, ExpertPage
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Function item.

The FunctionItem item type has the following properties:

NameDescriptionIDFlags
FunctionDefinitionAttribute

The function definition implemented by a function item.

10001Persisted, Enumerated, ExtraInfo
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

An aggregated item to perform functions on, using external input.

The AggregatedItem item type has the following properties:

NameDescriptionIDFlags
FunctionListAttribute

A semi-colon ';' separated list of function instances to use in calculations.

10002Persisted
InitValue

The initial value, set during initialisation of the Apis Module. In other words, when Apis Hive is restarted, this is the value the item will be set to.

5002Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
TriggerAwareAttribute

If set to true, this item's value will be recalculated when one of its function item's trigger items has been triggered.

10004Persisted
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis WITS client

This module sends / receives data on WITS level 0. For further information about WITS 0, see: W.I.T.S. Wellsite Information Transfer Specification

Provider: Prediktor

Properties

Commands And Events

The WITS module has the following item types

WITSItem

Command Item

State Item

More information

Quick Start Guide

Characteristics of the ApisWITS Client module:

  • Implements a WITS level 0 client.
  • Supports both active and passive communication.
  • Supports read and write.
  • Supports serial and TCP/IP communication (TCP and UDP).
  • Supports automatic tag generation of predefined WITS records. For more information on predefined WITS records, refer to W.I.T.S. Wellsite Information Transfer Specification (an illustrative telegram is sketched below).
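
For orientation, a WITS level 0 telegram is a plain ASCII block delimited by '&&' and '!!' lines; each data line starts with a four-digit code (a two-digit record number followed by a two-digit item number), and the rest of the line holds the value. The telegram below is purely illustrative; refer to the WITS specification for the authoritative format:

&&
01083650.59
011023.5
!!

Here, record 01, item 08 carries the value 3650.59, and record 01, item 10 carries 23.5. The Record and Field properties of a WITSItem (described below) identify which data line an item maps to.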

Further, as an integrated module in the Apis Hive, the following optional features are available:

  • High performance data logging to the Apis Honeystore historian, with OPC Historical Data Access server interface

Properties

The WITS module has the following properties:

NameDescriptionIDFlags
ActiveServer mode; some WITS servers (Pason, among others) only send data when they receive a request message. Set this property to true to activate the request message; see the Request Message property.1050Persisted, ExpertPage
AutogenerateAutomatic Tag Database Generation of predefined WITS records, based on telegrams from the server. For more information on predefined WITS records, refer to W.I.T.S. Wellsite Information Transfer Specification1057Persisted, Enumerated, ExpertPage
BaudRate receiveBaud rate for serial communication, valid only when Comm. type Serial is selected1210Persisted, Enumerated, ExpertPage
COM port receiveThe COM port to use, valid only when Comm. type Serial is selected1200Persisted, Enumerated
Comm. type receiveCommunication method, Serial or Winsock1110Persisted, Enumerated
DataBits receiveNumber of bits in the bytes transmitted and received, valid only when Comm. type Serial is selected1230Persisted, Enumerated, ExpertPage
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
FlowControl receiveFlow control for serial communication, valid only when Comm. type Serial is selected1250Persisted, Enumerated, ExpertPage
ForceUpdateUpdate the time stamp even if the value has not changed (Default=true).1000Persisted, ExpertPage
IP address receiveIP address of the WITS server, valid only when Comm. type Winsock is selected1310Persisted, Computer
Last telegramLast raw telegram received from the WITS server30000ReadOnly, PerformancePage
Parity receiveParity scheme for the serial communication, valid only when Comm. type Serial is selected1220Persisted, Enumerated, ExpertPage
Poll IntervalSet this to a value greater than 0.01 for timer-based polling (seconds)1055Persisted
Port receiveThe TCP/UDP port of the WITS server. Valid only when Comm. type Winsock is selected1311Persisted
Port sendThe TCP/UDP port of the WITS server1513Persisted, Hidden
ProtocolProtocol, TCP or UDP. Valid only when Comm. type Winsock is selected1312Persisted, Enumerated
Request MessageThis parameter specifies the message portion of the frame that the driver will send to request solicited data. Valid only when server mode is Active1051Persisted, Enumerated, ExpertPage
StopBits receiveNumber of stop bits to be used, valid only when Comm. type Serial is selected1240Persisted, Enumerated, ExpertPage
TimeoutTime-out interval, in seconds, when timer-based.1120Persisted
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
TracefileMaxSizeThe maximum size, in bytes, of the trace file before it is truncated. Default is 50 MB (50 * 1024 * 1024).15000Persisted, ExpertPage
TraceToFileThis is used to trace detailed information about the incoming data over the link15010Persisted, File, ExpertPage

See also Module Properties

Commands And Events

The WITS module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItems

Timer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.

Timer
ReceiveDone

An event signalled when a receive from the WITS server is successful.

Normal

Commands

NameDescriptionCommand Type
HandleExternalItems

Command for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.

Synchronous
StartReceive

Initiate a receive of data from the WITS server.

Asynchronous
UpdateItemTimestamp

Command used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.

Synchronous

See also Commands And Events

Item Types

Properties

A WITS item, built from raw data or measurement elements

The WITSItem item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Direction

Direction of data. Input (false) or Output (true).

5040Persisted
Field

The field of the item.

5037Persisted
Offset

The linear transformation addend to use when calculating the item value (Value = RawValue * Scale + Offset; see the worked example after this table).

5006Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Record

The record of the item.

5036Persisted
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Scale

The linear transformation multiplier to use when calculating the item value. (Value = RawValue * Scale + Offset)

5005Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

100Persisted
Value

The current value of the item.

2ReadOnly
Valuetype

Item canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.

10010Persisted, Enumerated
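
Worked example (the numbers are illustrative only): with RawValue = 1250, Scale = 0.1 and Offset = -20, the item value becomes 1250 * 0.1 + (-20) = 105.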

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

A command for controlling the module

The Command Item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Properties

Item telling whether the module is connected to the WITS server; true: connected, false: disconnected

The State Item item type has the following properties:

NameDescriptionIDFlags
Description

A description of what this item does. This is free text, so you can write anything you like here.

101Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Apis Worker

This module generates signals of different types.

Provider: Prediktor

Properties

Commands And Events

The Worker module has the following item types

Signal

Time

Variable

VariableVector

BitSelect

String formatter

Multiplexer

TrigEvtBrokerCmd

VariableMatrix

Expression

Module State Items

Item Attribute Items

Module events items

Function item

Properties

The Worker module has the following properties:

NameDescriptionIDFlags
ExchangeRateThe exchange rate in milliseconds for the 'ExternalItems' timer event. By default, used for updating external items values.100Persisted
ExtItem pass-through qualitySpecifies the quality of external item values that will pass through external item transfers. Default is 'Good and Uncertain qualities'400Persisted, Enumerated, ExpertPage
ExtItemCalculationSequenceDecides whether data validation or data transfer will be performed first in the external item manager.300Persisted, Enumerated, ExpertPage
PersistValToInitValChoose a strategy for copying and persisting the current value to the InitValue.
Tip: Consider using an InitVQTFromHoneystore attribute instead, for better performance.
1650Persisted, Enumerated
RandomizeItemAttribsGive item attributes such as signal amplitude and period random initial values when items are added.1500Persisted
TimeReferenceItemAn item whose value will be used as the time reference for this module instead of the system time, when timestamping items.200Persisted, ApisItem, ExpertPage
UpdateInitvalsOnSaveIf True, items having initial values will have them updated from their current values when the configuration is saved.1600Persisted, ReadOnly, Hidden, ExpertPage

Informational properties:

NameDescriptionIDFlags
ExternalItem reportA status-report for the External Item manager of this module110InfoPage

See also Module Properties

Commands And Events

The Worker module has the following Commands and Events:

Events

NameDescriptionEvent Type
ExternalItemsTimer event for handling of external items. By default, the command 'HandleExternalItems' is connected to this event, but that can be modified. The timer resolution is specified by the 'ExchangeRate' property.Timer
ExternalItemsHandled_DataPushThis event is fired after the module has executed a HandleExternalItems_DataPush command. The data push package carried by this event contains all the resulting VQTs (function items, ordinary external item transfers, etc.) from the HandleExternalItems_DataPush command.
On this event, one can hook any _DataPush command (Log, Scan, UaServerUpdateMonitorItems, HandleExternalItems) to chain a path of execution with the data transferred alongside.
See also: APIS data transfer mechanism; Data Push
Timer
TrigEvt1Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt1'Normal
TrigEvt2Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt2'Normal
TrigEvt3Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt3'Normal
TrigEvt4Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt4'Normal
TrigEvt5Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt5'Normal
TrigEvt6Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt6'Normal
TrigEvt7Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt7'Normal
TrigEvt8Event triggered by a TrigEvtBrokerCmd item with 'EventBrokerEntry=TrigEvt8'Normal

Commands

NameDescriptionCommand Type
HandleExternalItemsCommand for reading external items. When fired, the module will read all its external items, and update/notify the ones that have changed.Synchronous
HandleExternalItems_DataPushThis command ensures that all samples for all items in the data push package are applied/used in the external item manager, including services such as Data Validation, Ext Items Transfer Control, etc.
See also: APIS data transfer mechanism; Data Push
Synchronous
UpdateItemTimestampCommand used for updating the value of the item specified in the 'TimeReferenceItem' property, when configured.Synchronous

See also Commands And Events

Item Types

Item type: Signal

A function, periodic in time

The Signal item type has the following properties:

NameDescriptionIDFlags
Amplitude

This is the amplitude of the signal.

10001Persisted
Bias

A constant value added to the signal.

10005Persisted
Overridden quality

The actual quality the item value should have, used for simulating qualities.

10100Persisted, Enumerated
Period

The time period between each repetition of the task.

10004Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly
Waveform

The waveform of the signal.

10002Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Time

The current time as a DATE type

The Time item type has the following properties:

NameDescriptionIDFlags
Local time

When true, the time is converted to a local time, when false the time is UTC time.

10022Persisted
MinTimeBeforeUpdate

The minimum time that can elapse (in seconds), before the item value will be updated.

10009Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Variable

User defined item, which can be written and read.

The Variable item type has the following properties:

NameDescriptionIDFlags
AutoResetTimeoutThe timeout for automatically resetting the value. The timeout is in milliseconds. To disable the autoreset, type 0.10020Persisted, ExpertPage
AutoResetValueThe value to set when the AutoResetTimeout occurs.10019Persisted, ExpertPage
QualityItem quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.3ReadOnly
RightsItem access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.5ReadOnly
TimeThe date and time when this item was last updated.4ReadOnly
TypeThe item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.1ReadOnly
ValueThe current value of the item.2NormalPage
ValuetypeItem canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.10010Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: VariableVector

Vector of variables, which can be written and read.

The VariableVector item type has the following properties:

NameDescriptionIDFlags
Dimension

The dimension of a vector item (number of elements).

5007Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Valuetype

Item canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.

10025Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: BitSelect

A boolean item whose value is set from a bit-position of an external item

The BitSelect item type has the following properties:

NameDescriptionIDFlags
BitPosition

The start bit position of the object. This property is only applicable for BOOL and BITSTRING objects.

10011Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: String formatter

An item which formats a string based on external item(s) and a format-control string

The String formatter item type has the following properties:

NameDescriptionIDFlags
DecimalPoint

What character to use as a decimal point. For example, in English-speaking countries it is the period '.', but in many European countries it is the comma ','.

10017Persisted, Enumerated
Format as UTC

When true, time formatted strings are treated as in UTC. If this value is false, they are treated as being in the local timezone.

10018Persisted
FormatAsVector

Format the external item as a vector, using the prefix, suffix, and separator.

10013Persisted
FormatControlString

The format-control string used when formatting the string value of this item, based on its external item(s).

10012Persisted
FormatVector separator

The separator when formatting the string as a vector.

10016Persisted
FormatVectorPrefix

The prefix when formatting the string as a vector.

10014Persisted
FormatVectorSuffix

The suffix when formatting the string as a vector, or the character used to trim the result string.

10015Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
TrimCharacter

This property is the value used to trim the string.

10027Persisted
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Format Syntax

Format Control String

The format-control string used when formatting the string value of this item, based on its external item(s)

A format specification, which consists of optional and required fields, has the following form:

%[flags] [width] [.precision] [{h | l | ll | I | I32 | I64}]type

Each field of the format specification is a single character or a number signifying a particular format option. The simplest format specification contains only the percent sign and a type character (for example, %s). If a percent sign is followed by a character that has no meaning as a format field, the character is copied to stdout. For example, to print a percent-sign character, use %%.
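
For illustration, here are a few complete format specifications and the output they produce for sample values (standard semantics as detailed below; the values are examples only):

%d      with 42       ->  42
%8.3f   with 3.14159  ->  "   3.142"  (minimum width 8, 3 digits after the decimal point)
%08.3f  with 3.14159  ->  "0003.142"  (zero-padded to width 8)
%-6d    with 42       ->  "42    "    (left-aligned in a field of width 6)
%#X     with 255      ->  "0XFF"      (# adds the 0X prefix; X uses upper-case hex digits)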

The optional fields, which appear before the type character, control other aspects of the formatting, as follows:

type

Required character that determines whether the associated argument is interpreted as a character, a string, or a number (see the "Type Field Characters" table in Type Field Characters).

flags

Optional character or characters that control justification of output and printing of signs, blanks, decimal points, and octal and hexadecimal prefixes (see the "Flag Characters" table in Flag Directives). More than one flag can appear in a format specification.

width

Optional number that specifies the minimum number of characters output (see Width Specification).

precision

Optional number that specifies the maximum number of characters printed for all or part of the output field, or the minimum number of digits printed for integer values (see the "How Precision Values Affect Type" table in Precision Specification).

h | l | ll | I | I32 | I64

Optional prefixes to type that specify the size of the argument (see the "Size Prefixes" table in Size and Distance Specification).

Flag Characters

FlagMeaningDefault
-Left align the result within the given field width.Right align.
+Prefix the output value with a sign (+ or -) if the output value is of a signed type.Sign appears only for negative signed values (-).
0If width is prefixed with 0, zeros are added until the minimum width is reached. If 0 and - appear, the 0 is ignored. If 0 is specified with an integer format (i, u, x, X, o, d) and a precision specification is also present (for example, %04.d), the 0 is ignored.No padding.
blank (' ')Prefix the output value with a blank if the output value is signed and positive; the blank is ignored if both the blank and + flags appear.No blank appears.
#When used with the o, x, or X format, the # flag prefixes any nonzero output value with 0, 0x, or 0X, respectively.No blank appears.
 When used with the e, E, f, a or A format, the # flag forces the output value to contain a decimal point in all cases.Decimal point appears only if digits follow it.
 When used with the g or G format, the # flag forces the output value to contain a decimal point in all cases and prevents the truncation of trailing zeros. Ignored when used with c, d, i, u, or s.Decimal point appears only if digits follow it. Trailing zeros are truncated.

Type Field Characters

CharacterTypeOutput format
cint or wint_tWhen used with printf functions, specifies a single-byte character; when used with wprintf functions, specifies a wide character
Cint or wint_tWhen used with printf functions, specifies a wide character; when used with wprintf functions, specifies a single-byte character.
dintSigned decimal integer.
iintSigned decimal integer.
ointSigned octal integer.
uintUnsigned decimal integer.
xintUnsigned hexadecimal integer, using "abcdef."
XintUnsigned hexadecimal integer, using "ABCDEF."
edoubleSigned value having the form [-]d.dddd e [sign]dd[d] where d is a single decimal digit, dddd is one or more decimal digits, dd[d] is two or three decimal digits depending on the output format and size of the exponent, and sign is + or -.
EdoubleIdentical to the e format except that E rather than e introduces the exponent.
fdoubleSigned value having the form [-]dddd.dddd, where dddd is one or more decimal digits. The number of digits before the decimal point depends on the magnitude of the number, and the number of digits after the decimal point depends on the requested precision.
gdoubleSigned value printed in f or e format, whichever is more compact for the given value and precision. The e format is used only when the exponent of the value is less than -4 or greater than or equal to the precision argument. Trailing zeros are truncated, and the decimal point appears only if one or more digits follow it.
GdoubleIdentical to the g format, except that E, rather than e, introduces the exponent (where appropriate).
adoubleSigned hexadecimal double precision floating point value having the form [-]0xh.hhhh p±dd, where h.hhhh are the hex digits (using lower case letters) of the mantissa, and dd are one or more digits for the exponent. The precision specifies the number of digits after the point.
AdoubleSigned hexadecimal double precision floating point value having the form [-]0Xh.hhhh P±dd, where h.hhhh are the hex digits (using capital letters) of the mantissa, and dd are one or more digits for the exponent. The precision specifies the number of digits after the point.
nPointer to integerNumber of characters successfully written so far to the stream or buffer; this value is stored in the integer whose address is given as the argument. See Security Note below.
pPointer to voidPrints the address of the argument in hexadecimal digits.
sStringWhen used with printf functions, specifies a single-byte-character string; when used with wprintf functions, specifies a wide-character string. Characters are printed up to the first null character or until the precision value is reached.
SStringWhen used with printf functions, specifies a wide-character string; when used with wprintf functions, specifies a single-byte-character string. Characters are printed up to the first null character or until the precision value is reached.
t TTimeUsed to format a Time object; terminate with &t or &T. See Time format arguments below. Example: %T%Y.%#m.%d %H:%M:%S&T outputs 2006.9.06 18:02:21

How Precision Values Affect Type

TypeMeaningDefault
a, AThe precision specifies the number of digits after the point.Default precision is 6. If precision is 0, no point is printed unless the # flag is used.
c, CThe precision has no effect.Character is printed.
d, i, u, o, x, XThe precision specifies the minimum number of digits to be printed. If the number of digits in the argument is less than precision, the output value is padded on the left with zeros. The value is not truncated when the number of digits exceeds precision.Default precision is 1.
e, EThe precision specifies the number of digits to be printed after the decimal point. The last printed digit is rounded.Default precision is 6; if precision is 0 or the period (.) appears without a number following it, no decimal point is printed.
fThe precision value specifies the number of digits after the decimal point. If a decimal point appears, at least one digit appears before it. The value is rounded to the appropriate number of digits.Default precision is 6; if precision is 0, or if the period (.) appears without a number following it, no decimal point is printed.
g, GThe precision specifies the maximum number of significant digits printed.Six significant digits are printed, with any trailing zeros truncated.
s, SThe precision specifies the maximum number of characters to be printed. Characters in excess of precision are not printed.Characters are printed until a null character is encountered.

Width Specification

The second optional field of the format specification is the width specification. The width argument is a nonnegative decimal integer controlling the minimum number of characters printed. If the number of characters in the output value is less than the specified width, blanks are added to the left or the right of the values depending on whether the - flag (for left alignment) is specified until the minimum width is reached. If width is prefixed with 0, zeros are added until the minimum width is reached (not useful for left-aligned numbers).

The width specification never causes a value to be truncated. If the number of characters in the output value is greater than the specified width, or if width is not given, all characters of the value are printed (subject to the precision specification).

If the width specification is an asterisk (*), an int argument from the argument list supplies the value. The width argument must precede the value being formatted in the argument list. A nonexistent or small field width does not cause the truncation of a field; if the result of a conversion is wider than the field width, the field expands to contain the conversion result.

Size Prefixes for printf and wprintf Format-Type Specifiers

To specifyUse prefixWith type specifier
long intl (lowercase L)d, i, o, x, or X
long unsigned intlo, u, x, or X
long longlld, i, o, x, or X
short inthd, i, o, x, or X
short unsigned intho, u, x, or X
__int32I32d, i, o, x, or X
unsigned __int32I32o, u, x, or X
__int64I64d, i, o, x, or X
unsigned __int64I64o, u, x, or X
ptrdiff_t (that is, __int32 on 32-bit platforms, __int64 on 64-bit platforms)Id, i, o, x, or X
size_t (that is, unsigned __int32 on 32-bit platforms, unsigned __int64 on 64-bit platforms)Io, u, x, or X
long doublel or Lf
Single-byte character with printf functionshc or C
Single-byte character with wprintf functionshc or C
Wide character with printf functionslc or C
Wide character with wprintf functionslc or C
Single-byte character string with printf functionshs or S
Single-byte character string with wprintf functionshs or S
Wide-character string with printf functionsls or S
Wide-character string with wprintf functionsls or S
Wide characterwc
Wide-character stringws

Thus, to print single-byte or wide characters, use format specifiers as follows.

To print character asUse functionWith format specifier
single byteprintfc, hc, or hC
single bytewprintfC, hc, or hC
widewprintfc, lc, lC, or wc
wideprintfC, lc, lC, or wc

Time format arguments

The format argument consists of one or more codes; as in printf, the formatting codes are preceded by a percent sign (%). Characters that do not begin with % are copied unchanged.

The formatting codes are listed below:

Format codeMeaning
%aAbbreviated weekday name
%A

Full weekday name

%b

Abbreviated month name

%B

Full month name

%c

Date and time representation appropriate for locale

%d

Day of month as decimal number (01 - 31)

%H

Hour in 24-hour format (00 - 23)

%I

Hour in 12-hour format (01 - 12)

%j

Day of year as decimal number (001 - 366)

%m

Month as decimal number (01 - 12)

%M

Minute as decimal number (00 - 59)

%p

Current locale's A.M./P.M. indicator for 12-hour clock

%S

Second as decimal number (00 - 59)

%U

Week of year as decimal number, with Sunday as first day of week (00 - 53)

%w

Weekday as decimal number (0 - 6; Sunday is 0)

%W

Week of year as decimal number, with Monday as first day of week (00 - 53)

%x

Date representation for current locale

%X

Time representation for current locale

%y

Year without century, as decimal number (00 - 99)

%Y

Year with century, as decimal number

%z, %Z

Either the time-zone name or time zone abbreviation, depending on registry settings; no characters if time zone is unknown

%%

Percent sign

As in the printf function, the # flag may prefix any formatting code. In that case, the meaning of the format code is changed as follows.

Format codeMeaning
%#a, %#A, %#b, %#B, %#p, %#X, %#z, %#Z, %#%# flag is ignored.
%#cLong date and time representation, appropriate for current locale. For example: "Tuesday, March 14, 1995, 12:41:29".
%#xLong date representation, appropriate to current locale. For example: "Tuesday, March 14, 1995".
%#d, %#H, %#I, %#j, %#m, %#M, %#S, %#U, %#w, %#W, %#y, %#YRemove leading zeros (if any).
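
As an illustration (the date and time are sample values only, and %p output depends on the current locale), applying a few of these codes to 6 September 2006, 18:02:21 gives:

%Y-%m-%d %H:%M:%S  ->  2006-09-06 18:02:21
%#d %B %Y          ->  6 September 2006
%I:%M %p           ->  06:02 PM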

Item type: Multiplexer

An item which multiplexes external items. ExternalItem1 is selector, ExternalItem2 = input1, ..., ExternalItemN = input(N-1)

The Multiplexer item type has the following properties:

NameDescriptionIDFlags
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: TrigEvtBrokerCmd

An item which, when set to true, fires an event in the Apis Event Broker. Use it to control the firing of commands.

The TrigEvtBrokerCmd item type has the following properties:

NameDescriptionIDFlags
EventBrokerEntry

Select the event in the event broker to trigger when this item is set.

10021Persisted, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: VariableMatrix

Matrix of variables, which can be written and read.

The VariableMatrix item type has the following properties:

NameDescriptionIDFlags
Columns

The number of columns in a matrix item.

5009Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Rows

The number of rows in a matrix item.

5008Persisted
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage
Valuetype

Item canonical datatype. This is the type of data the field holds. For example: integer, string, datetime, etc.

10025Persisted, Enumerated

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Expression

An item which runs a set of criteria on its external input. The output can be used as a selector in a multiplexer item.

The Expression item type has the following properties:

NameDescriptionIDFlags
Expressions

An array of expressions used to calculate the value of this item.

5111Persisted
No-match quality

The output quality used if there's no expression match.

10033Persisted, Enumerated
No-match value

The output value used if there's no expression match.

10032Persisted
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2NormalPage

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Module State Items

An item which retrieves status information from the module

The Module state items item type has the following properties:

NameDescriptionIDFlags
Item type

The item type this item aggregates statistics on, when applicable. Use the number inside the parentheses in "FileAdd" configuration files.

19001Persisted, ReadOnly, DynamicEnumeration
Module state

The kind of module state information represented by this item. This can be a number of items having a given quality, a total number of items, the time the newest/oldest item was updated. Use a number inside the parentheses in "FileAdd" configuration files.

19000Persisted, ReadOnly, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Item Attribute Items

An item which exposes an attribute of another item in the module

The Item attribute items item type has the following properties:

NameDescriptionIDFlags
Attribute ID

The ID of the attribute this item exposes from an item.

19002Persisted, ReadOnly, DynamicEnumeration
ParentItem

The parent item name of this item.

5502Persisted, ReadOnly, ApisLocalItem
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Function item

This Item is a calculated value based on existing items in Hive. The calculation is formula-based, using inputs from external items.

This item has two different calculators (algorithm syntaxes): C# and Legacy (proprietary).

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value change.
QualityValueTimestamp: Report as updated if either the Quality, Value or Timestamp change (default).
19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Items

An item, also known as a tag or signal, represents real-time values either originating from an external system (PLC, DCS, etc.), or values calculated or derived from such values within the Apis Hive environment (calculated control value, logical signal, etc.).

In most cases, a module will bring items into the Apis Hive environment, but there are also cases when a module doesn't provide items by itself, e.g. the ApisLoggerBee module.

Item attributes

Items are typically real-time process data or calculated data, and they have several attributes associated with them. All items have the standard OPC attributes: value; quality; timestamp; rights.

Additionally, items will typically have more attributes, depending on the item's purpose, and these will vary for different types of items. Such attributes can be divided into two categories: custom attributes and predefined Apis/OPC attributes (for example, descriptions and engineering units).

Custom attributes

Attributes specific to the purpose of the item. E.g. the amplitude of a sine signal.

Predefined Apis / OPC attributes

Attribute of a more generic nature. See OPC DA Item attributes and "Predefined Apis attributes".

Item types

There are three main item types: scalar values, vector values and matrices.

  • A scalar value is typically a flow signal, temperature, etc. from an external system.
  • A vector value is typically a spectrum from an NIR instrument, or a control vector in a Model-based Predictive Control (MPC) system.
  • A matrix value typically is a system matrix in an MPC system.

See also the Datatypes Overview.

Shared Item Types

Item type: Module State Items

An item which retrieves status information from the module

The Module state items item type has the following properties:

NameDescriptionIDFlags
Item type

The item type this item aggregates statistics on, when applicable. Use the number inside the parentheses in "FileAdd" configuration files.

19001Persisted, ReadOnly, DynamicEnumeration
Module state

The kind of module state information represented by this item. This can be a number of items having a given quality, a total number of items, the time the newest/oldest item was updated. Use a number inside the parentheses in "FileAdd" configuration files.

19000Persisted, ReadOnly, Enumerated
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Item Attribute Items

An item which exposes an attribute of another item in the module

The Item attribute items item type has the following properties:

NameDescriptionIDFlags
Attribute ID

The ID of the attribute this item exposes from an item.

19002Persisted, ReadOnly, DynamicEnumeration
ParentItem

The parent item name of this item.

5502Persisted, ReadOnly, ApisLocalItem
Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

3ReadOnly
Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

5ReadOnly
Time

The date and time when this item was last updated.

4ReadOnly
Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

1ReadOnly
Value

The current value of the item.

2ReadOnly

See also

Basic Item Properties

Predefined Item Properties

OPC DA Properties

Item type: Module events items

An item holding a string-array of the most recent simple event(s) generated by the module.

The Module events items item type has the following properties:

NameDescriptionIDFlags
EventCountThe number of most recent simple event(s) to keep in item string-array value.19003Persisted
EventSeverityFilterA severity filter for simple event(s) to keep in item string-array value.19004Persisted, Enumerated
QualityItem quality3ReadOnly
RightsItem access rights5ReadOnly
TimeItem timestamp4ReadOnly
TypeItem canonical datatype1ReadOnly
ValueItem value2ReadOnly

See also Predefined Item Properties and OPC DA Properties

Item type: Function item

This Item is a calculated value based on existing items in Hive. The calculation is formula-based, using inputs from external items.

This item has two different calculators (algorithm syntaxes): C# and Legacy (proprietary).

The Function item item type has the following properties:

NameDescriptionIDFlags
ExpressionAn expression used to calculate the value of this item5110Persisted
ExpressionsDefinitions of array inputs to the calculator (applicable for C# only)5111Persisted
CalculatorSpecifies which calculator to use, C# or legacy19101Persisted
DataChangeTriggerThe DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
Quality: Report as updated only if the Quality associated with the value changes.
QualityValue: Report as updated if either the Quality or the Value change.
QualityValueTimestamp: Report as updated if either the Quality, Value or Timestamp change (default).
19102Persisted, Enumerated
External ItemsThe external items that are inputs to the formula given by the Expression property20000 ... 20000+NPersisted
QualityItem quality3NormalPage
RightsItem access rights5ReadOnly
TimeItem timestamp4NormalPage
TypeItem canonical datatype1ReadOnly
ValueItem value2NormalPage
ValuetypeItem canonical datatype.19100Persisted, Enumerated

See also Predefined Item Properties and OPC DA Properties

Item type: Function item - C# calculator

This Item is a calculated value based on existing items in Hive. The calculation uses an expression with C# syntax, based on inputs from external items.

Make sure to select C# for the function item's Calculator attribute.

Looking for the Advanced content?

Calculation

The Expression is the body of a method with the following definition:

VQT Compute(VQT[] ex, VQT[][] arr)
{
	//Your expression will be placed here by the framework
}

ex is an array of the external items. The order is the same as the order of the external items in Hive.

arr is an array of arrays, holding the external items that are part of arrays. These arrays are defined on the Expressions attribute (see more on arrays below).

The elements of ex and arr are variants of type VQT with Value, Quality and Timestamp:

public readonly struct VQT
{
  public object Value // The value of the item
  //One of the primitive types: float, double, int, short, byte, sbyte, decimal, uint, ulong, ushort, long, bool 

  public OpcDaQuality Quality // OPC DA quality. enum : ushort. OPC DA qualities are 2 bytes

  public ulong Timestamp // FILETIME

  public double DblVal //Value as double, can be used if you know the value is of type double or it's ok to convert to double

  public int IntVal // Value as int, can be used if you know the value is of type int

  public bool BoolVal // Value as bool, can be used if you know the value is of type bool
}

The data type of the variant's Value is the same as in Hive.

The C# function item has built-in operators for these variants, so they can be used as if they were normal primitive values. For example, the following expression adds the first two items in the external items array:

return ex[0] + ex[1];

You can even omit the 'return' and the ';' if you have a one-liner and just type:

ex[0] + ex[1]

The expression can include various operators, functions, loops and logical operators. The item type of the returned value is usually the same as that of the inputs. Many of the built-in methods return an 8-byte float.

In most cases, when using expressions of the All* kind, e.g. AllGoodCount, you will most likely need the module property ExtItem pass-through quality set to 'Any quality'.
Otherwise, one or more external items having a bad quality will prevent the Function item from being fully evaluated.
See also: ExtItem pass-through quality.
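
Building on the examples above, the following is a minimal sketch of a slightly larger expression: it computes the absolute deviation between two redundant sensor inputs. It assumes, in line with the examples in the reference below, that the result of the built-in '-' operator can be passed to the Abs function:

return Abs(ex[0] - ex[1]);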

OperatorDescriptionExample
+The sum of two valuesex[0] + ex[1]
-The subtraction of one value from anotherex[0] - ex[1]
*The multiplication of two valuesex[0] * ex[1]
/The division of one value by anotherex[0] / ex[1]
%The modulus of one value by anotherex[0] % ex[1]
Function Description Example Item Type
Abs The absolute value Abs(ex[0]) 8 byte float 
Sin The sine of a value, in radians Sin(ex[0]) 8 byte float
Asin

Returns the angle, θ, measured in radians, such that -π/2 ≤θ≤π/2 .

The argument must be a number greater than or equal to -1, but less than or equal to 1

 

Remarks:

A positive return value represents a counterclockwise angle from the x-axis; a negative return value represents a clockwise angle.

Multiply the return value by 180/Math.PI to convert from radians to degrees

Asin(ex[0]) 8 byte float
Sinh

The hyperbolic sine of value. If value is equal to NegativeInfinity, PositiveInfinity or NaN, this method returns a Double equal to value. The parameter is an angle, measured in radians.

 

Remarks:

The angle, value, must be in radians. Multiply by Math.PI/180 to convert degrees to radians.

Sinh(ex[0]) 8 byte float
Asinh Returns the angle whose hyperbolic sine is the specified number Asinh(ex[0]) 8 byte float
Cos The cosine of a value, in radians Cos(ex[0]) 8 byte float
Acos

Returns the angle, θ, measured in radians, such that 0 ≤θ≤π.

The argument must be a number greater than or equal to -1, but less than or equal to 1

 

Remarks:

Multiply the return value by 180/Math.PI to convert from radians to degrees.

Acos(ex[0]) 8 byte float
Cosh

The hyperbolic cosine of value. If value is equal to NegativeInfinity or PositiveInfinity, PositiveInfinity is returned. If value is equal to NaN, NaN is returned. The parameter is an angle, measured in radians.

Remarks:

The angle, value, must be in radians. Multiply by Math.PI/180 to convert degrees to radians.

Cosh(ex[0]) 8 byte float
Acosh Returns the angle whose hyperbolic cosine is the specified number Acosh(ex[0]) 8 byte float
Tan The tangent of a value, in radians Tan(ex[0]) 8 byte float
Atan

Returns the angle, θ, measured in radians, such that -π/2 ≤θ≤π/2.

-or-

NaN if d equals NaN, -π/2 rounded to double precision (-1.5707963267949) if d equals NegativeInfinity, or π/2 rounded to double precision (1.5707963267949) if d equals PositiveInfinity.

 

Remarks:

A positive return value represents a counterclockwise angle from the x-axis; a negative return value represents a clockwise angle.

Multiply the return value by 180/Math.PI to convert from radians to degrees

Atan(ex[0]) 8 byte float
Tanh

The hyperbolic tangent of value. If value is equal to NegativeInfinity, this method returns -1. If value is equal to PositiveInfinity, this method returns 1. If value is equal to NaN, this method returns NaN. The parameter is an angle, measured in radians.

Remarks:

The angle, value, must be in radians. Multiply by Math.PI/180 to convert degrees to radians.

Tanh(ex[0]) 8 byte float
Atanh Returns the angle whose hyperbolic tangent is the specified number Atanh(ex[0]) 8 byte float
Exp Exponent of an expression Exp(ex[0]) 8 byte float
Floor Rounds the value to nearest integer towards minus infinity Floor(ex[0]) 8 byte float
Ceiling Returns the smallest integral value that is greater than or equal to the specified Ceiling(ex[0]) 8 byte float
Log The natural (base e) logarithm of a specified number Log(ex[0]) 8 byte float
Neg Invert sign Neg(ex[0]) 8 byte float
Bit The bit value at specified index

Bit(ex[0], 3)

(returns 0 or 1)

int32
Pow The first value raised to the power of the second value Pow(ex[0] , 2) 8 byte float
Sqrt The square root of a value Sqrt(ex[0]) 8 byte float
Max

Return max value of an array of values

Max(ex) 8 byte float
Max (scalars)

Return max value of two values

Max(ex[0], ex[1]) 8 byte float
Min

Return min value of an array of values

Min(ex) 8 byte float
Min (scalars)

Return min value of two values.

Min(ex[0],ex[1]) 8 byte float
AllGoodMin

Return the minimum of an array that have good quality.

If no good value exists, then double.MaxValue and bad quality is returned.

AllGoodMin(ex) 8 byte float
Avg

Return the average of an array of values

Quality will be the worst quality.

Avg(ex) 8 byte float
AllGoodAvg

Return the average of an array that have good quality.

If no good value exists, then 0.0 and bad quality is returned.

AllGoodAvg(ex) 8 byte float
AllGoodMax

Return the maximum of an array that have good quality.

If no good value exists, then double.MinValue and bad quality is returned.

AllGoodMax(ex) 8 byte float
Median

Return the median of an array

Median(ex) 8 byte float
AllGoodMed

Return the median of an array that have good quality.

If no good value exists, then NAN and bad quality is returned.

AllGoodMed(ex) 8 byte float
Var

Return the variance of an array. Normalized with N, this provides the square root of the second moment around the mean

Var(ex) 8 byte float
AllGoodVar

Return the variance of the elements in an array that have good quality. Normalized with N, this provides the square root of the second moment around the mean

If no good value exists, then NAN and bad quality is returned.

AllGoodVar(ex) 8 byte float
Std

Return the standard deviation of an array. Normalized with N, this provides the square root of the second moment around the mean

If no good value exists, then NAN and bad quality is returned.

Std(ex) 8 byte float
AllGoodStd

Return the standard diviation of all external items that have good quality.

If no good value exists, then NAN and bad quality is returned.

AllGoodStd(ex) 8 byte float
AllCount

Return the size of an array.

AllCount(ex) int
AllGoodCount

Return the number of all external items that have good quality.

AllGoodCount(ex) int
Sum

Return the sum of an array.

Quality will be the worst quality.

Sum(ex) 8 byte float
AllGoodSum

Return the sum of all external items that have good quality.

If no good value exists, 0.0 and bad quality is returned.

AllGoodSum(ex) 8 byte float
AllGoodGreatherThanAvg

Return the the average of all good values greather than x.

If no good value exists, then NAN and bad quality is returned.

AllGoodGreatherThanAvg(ex, x) 8 byte float
AllGoodLessThanAvg

Return the the average of all good values less than x.

If no good value exists, then NAN and bad quality is returned.

AllGoodLessThanAvg(ex, limit) 8 byte float
Delay

Delay a signal a number of steps (number of external item transfers). Maximum number of delays is 3600.

Delay(5, ex[0])

This will delay the first external item 5 steps.

Same as input
LowPassFilter

Will lowpass filter a signal with the specified time constant. Only inputs with quality Good will be used in the filter. If input is not Good, output will have Uncertain quality

LowPassFilter(60.0, ex[0])

The time constant here is 60 seconds. dT in the filter is the time difference between current and previous external item timestamp.

8 byte float
MovingAvg

Moving average for a specified window size. Computed on every external item transfer. Maximum window size is 3600. Only inputs with quality Good will be used in the algoritm. If some inputs are not Good, output will have Uncertain quality. If all inputs are not Good, output will be Bad

MovingAvg(5, ex[0])

The window size here is 5.

8 byte float
MovingTimeAvg

Moving average for a period of time (window). Computed on every external item transfer. Maximum window size is 3600 seconds. Only inputs with quality Good will be used in the algoritm. If some inputs are not Good, output will have Uncertain quality. If all inputs are not Good, output will be Bad. Tip: If your intension is to low pass filter a signal, use the LowPassFilter method instead, it is much more efficient.

MovingTimeAvg(5.0, ex[0])

The window size here is 5 seconds.

8 byte float
Hysteresis

Hysteresis for a specified deadband. The output value will not change before the input value moves outside of the deadband. The timestamp will change even if the value does not

Hysteresis(2, ex[0])

The deadband here is 2. Example of inputs/outputs:

Input 1 output 1

Input 2 output 1

Input 3 output 3

Input 4 output 4

Input 5 output 5

Input 4 output 5

Input 3 output 5

Input 2 output 2

Input 1 output 1

Input 0 output 0

Same as input
PosFlankDelay

For a boolean input signal, any positive flank (from false to true) will have to stay positive (true) for certain amount of time before the output goes positive (true). NB! This method requires at least one extra external item that changes continuously. The first element in the external item array must be the signal to filter. This is a requirement do to the fact that the external item update mechanism only happens on external item change.

PosFlankDelay(5.0, ex)

The delay period here is 5 seconds. Note that the whole external item array, ex, is used as an argument.

Boolean
NegFlankDelay

For a boolean input signal, any negative flank (from true to false) will have to stay negative (false) for certain amount of time before the output goes negative (false). NB! This method requires at least one extra external item that changes continuously. The first element in the external item array must be the signal to filter. This is a requirement do to the fact that the external item update mechanism only happens on external item change.

NegFlankDelay(5.0, ex)

The delay period here is 5 seconds. Note that the whole external item array, ex, is used as an argument.

Boolean
GetDateTime

The value of this function item is the timestamp, in OLE Automation date format, of the external item (UTC). The datatype of this function item must be Date.

GetDateTime(ex[0]) DATE
GetUnixTime

The value of this function item is the timestamp, in milliseconds since 1.1.1970, of the external item (UTC). The value is a double including milliseconds as decimals.

GetUnixTime(ex[0]) 8 byte float (default)
Logic | Description | Example
> | Greater than | ex[0] > ex[1]
>= | Greater than or equal | ex[0] >= 2.5
< | Smaller than | ex[0] < 3.4
<= | Smaller than or equal | ex[0] <= ex[1]
== | Equal | ex[0] == ex[1]
<> | Not equal | ex[0] != ex[1]
!= | Not equal | ex[0] != ex[1]
! | Not | !( ex[0] > ex[1] )

Loops

Standard C# loops can be used in an Expression:

For loop:

for (int i = 0; i < ex.Length; i++)
{
	if (ex[i] > 10.0)
		return true;
}

While loop:

bool hit = false;
int index = 0;
while(index < ex.Length && !hit)
{
	if (ex[index] > 10.0)
		hit = true;
	index++;
}
return hit;

Array External Items

The external items can be organized into arrays (arr) that enter the Compute method:

VQT Compute(VQT[] ex, VQT[][] arr)

Sometimes it is natural to organize some of the external items into arrays. Example: if we have one status item and three temperature measurements as inputs, it might be natural to place the temperatures in a separate array, e.g. to compute the average temperature.

By using the Expressions attribute (note that it ends with an s, different attribute from Expression) on the function item, you can organize the external items into arrays. In the example we have 4 external items, say the first three are temperatures and the last is Status. The Expressions attribute then becomes:

[1] ex1, ex2, ex3;

'[1]' means 1 array, consisting of external items ex1, ex2 and ex3. The array content is separated by ';', so two arrays would be:

[2] ex1, ex2; ex3, ex4;

The external items that are in arrays will not be part of the ex parameter. In our example, ex will have length 1.

if (ex[0] == 0) // Status check; note the index is 0 even though Status is the fourth external item
	return 0; // Status indicated error

var tempMean = Avg(arr[0]); // Temp average, the first array (and only, in this case)
if (tempMean > 30.0)        // Temp too high
	return 3;

return 1; // Ok

Example 1

Expression:

ex[0] + ex[1] + 5.14

Example 2

Expression:

if(ex[1] > 0.0 && ex[2] > 0.0)
	return 100.0*1000.0*ex[0]/(ex[1]*ex[2]);
return 0.0;

Example 3

Expression:

if (ex[0] == true) return 3; //Waiting

if (arr[0].Length > 0)
{
	if (arr[0].Any(a => a == 4)) return 4; //Leak detected

	var tankCount = 0.0;
	var tankOkCount = 0.0;

	for (int i = 0; i < arr[0].Length; i++)
	{
		tankCount++;
		if (arr[0][i] == 0)
			tankOkCount++;
	}

	if (tankCount == 0.0)
		return 0; //Ignore

	var tanksProducingFrac = tankCount > 0.0 ? tankOkCount / tankCount : 0.0;

	if (tanksProducingFrac < 0.9) return 2; //Error
	if (tanksProducingFrac < 1.0) return 1; //Warning
}

return 0;

Specifying a non-existing External Item

If you want to specify a dummy/non-existing external item in the list of external items, you can specify external item(s) named ##DummyExternalItem.
Items having this name will be given special handling inside Apis and will not be used in the calculation. This is useful when you want to use the same expression for multiple items, but some of the items have a different number of external items.
Note that to achieve this, you must apply the configuration by importing it from a text file. Using the Add items dialog in Apis Management Studio together with File add will also work.


See the Advanced content for more information.

Item type: Function item - C# calculator - Advanced

Here we will go deeper into how to use C# function items.

Calculation

The standard documentation does not show the full signature of the method the expression is executed in the context of. The full definition is:

VQT Compute(VQT[] ex, VQT[][] arr, IItemContext itemContext, Func<IState> state, ICsScriptLog log)
{
	//Your expression will be placed here by the framework
}

itemContext is defined as follows:

public interface IItemContext
{
    /// <summary>
    /// Handle of Hive item
    /// </summary>
    int ItemHandle { get; }

    /// <summary>
    /// Func item expression
    /// </summary>
    string Expression { get; }

    /// <summary>
    /// Func item expression arrays
    /// </summary>
    string[] ExpressionArrays { get; }
}
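As an illustration only (a minimal sketch using the members listed above), itemContext can be handy when writing log entries that identify which function item produced them:

// Sketch: log which item and expression is being evaluated, then pass the input through.
log.InfoFormat("Evaluating item {0} with expression: {1}", itemContext.ItemHandle, itemContext.Expression);
return ex[0];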

state is actually a method (Func) that will give you an instance of IState. You can store values between computations on the state instance:

public interface IState
{
    bool Contains(StateKey key);
    byte GetOrAdd(StateKey key, byte defaultValue);
    sbyte GetOrAdd(StateKey key, sbyte defaultValue);
    short GetOrAdd(StateKey key, short defaultValue);
    ushort GetOrAdd(StateKey key, ushort defaultValue);
    int GetOrAdd(StateKey key, int defaultValue);
    uint GetOrAdd(StateKey key, uint defaultValue);
    long GetOrAdd(StateKey key, long defaultValue);
    ulong GetOrAdd(StateKey key, ulong defaultValue);
    float GetOrAdd(StateKey key, float defaultValue);
    double GetOrAdd(StateKey key, double defaultValue);
    decimal GetOrAdd(StateKey key, decimal defaultValue);
    string GetOrAdd(StateKey key, string defaultValue);
    IList<VQT> GetOrAdd(StateKey key, IList<VQT> defaultValue);
    VQT GetOrAdd(StateKey key, VQT defaultValue);

    void AddOrUpdate(StateKey key, byte defaultValue);
    void AddOrUpdate(StateKey key, sbyte defaultValue);
    void AddOrUpdate(StateKey key, short defaultValue);
    void AddOrUpdate(StateKey key, ushort defaultValue);
    void AddOrUpdate(StateKey key, int defaultValue);
    void AddOrUpdate(StateKey key, uint defaultValue);
    void AddOrUpdate(StateKey key, long defaultValue);
    void AddOrUpdate(StateKey key, ulong defaultValue);
    void AddOrUpdate(StateKey key, string defaultValue);
    void AddOrUpdate(StateKey key, float defaultValue);
    void AddOrUpdate(StateKey key, double defaultValue);
    void AddOrUpdate(StateKey key, decimal defaultValue);
    void AddOrUpdate(StateKey key, IList<VQT> defaultValue);
    void AddOrUpdate(StateKey key, VQT defaultValue);
    void Remove(StateKey key);
    void Clear();
}

The StateKey is a unique key for entries in the state:

public struct StateKey
{
    /// <summary>
    /// Key for storing values in an instance of IState
    /// </summary>
    /// <param name="paramId">The id of the parameter to store</param>
    /// <param name="algId">The id of the algorithm (method)</param>
    /// <param name="extItemHandles">The external item handles</param>
    public StateKey(int paramId, int algId, int[] extItemHandles)
    {
        _paramId = paramId;
        _algId = algId;
        _extItemHandles = extItemHandles.OrderBy(e => e).ToArray();
    }
    ...
}
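Before the full low-pass example below, here is a minimal sketch of using the state: counting how many times the expression has been computed. The id string passed to GetAlgorithmId is an arbitrary example name, not a predefined value.

// Minimal state usage sketch: count computations for this function item.
int algId = GetAlgorithmId("Acme_ComputeCounter");                 // assumed example id string
var countKey = new StateKey(1, algId, new[] { ex[0].ItemHandle }); // paramId 1 = the counter

var myState = state();                     // get the state instance
var count = myState.GetOrAdd(countKey, 0); // previous count, or 0 the first time
myState.AddOrUpdate(countKey, count + 1);  // store the new count

return count + 1; // the item value is the number of computations so far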

log is a logger provided by Hive. Log entries will end up in the Hive log:

public interface ICsScriptLog
{
    bool IsDebugEnabled { get; }

    bool IsInfoEnabled { get; }

    bool IsWarnEnabled { get; }

    bool IsErrorEnabled { get; }

    bool IsFatalEnabled { get; }

    void Info(object logEntry);

    void Info(object logEntry, Exception e);

    void InfoFormat(string formatString, params object[] args);

    void Warn(object logEntry);

    void Warn(object logEntry, Exception e);

    void WarnFormat(string formatString, params object[] args);

    void Error(object logEntry);

    void Error(object logEntry, Exception e);

    void ErrorFormat(string formatString, params object[] args);

    void Fatal(object logEntry);

    void Fatal(object logEntry, Exception e);

    void FatalFormat(string formatString, params object[] args);

    void Debug(object logEntry);

    void Debug(object logEntry, Exception e);

    void DebugFormat(string formatString, params object[] args);
}
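A minimal sketch of using the log in an expression; the Is*Enabled properties above can be used to avoid formatting log entries that will never be written:

// Only build the debug message if debug logging is enabled in the Hive log.
if (log.IsDebugEnabled)
    log.DebugFormat("ex[0]={0}, quality={1}", ex[0].DblVal, ex[0].Quality);
return ex[0];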

Example expression

Next, we will show how to use the state and the log in an expression. In the example we are creating a low-pass filter:


//VQT Compute(VQT[] ex, VQT[][] arr, IItemContext itemContext, Func<IState> state, ICsScriptLog log) <- context
//{
    //Expression start
    var timeConstant = 60.0;

    int algId = GetAlgorithmId("Acme_LowPass"); // Generate an id for the method. Use a string that is globally unique and means something in your context.
                                                // Here we are making a lowpass filter for the company 'Acme'

    var myState = state();     // Get the state from the environment.
    var filteredId = 1;        // Id of filter value (previously filtered value)
    var timestampId = 2;       // Id of the timestamp of previous external item

    var filteredValueKey = new StateKey(filteredId, algId, new[] { ex[0].ItemHandle });    // Generate unique key for filtered value
    var timestampKey = new StateKey(timestampId, algId, new[] { ex[0].ItemHandle });       // Generate unique key for previous timestamp

    if (!myState.Contains(filteredValueKey)) // If first time, add values to the state and return
    {
        myState.AddOrUpdate(filteredValueKey, ex[0].DblVal);
        myState.AddOrUpdate(timestampKey, ex[0].Timestamp);
        return ex[0];
    }

    var prevFiltered = myState.GetOrAdd(filteredValueKey, 0.0);   // Get previous filtered value
    var prevTimestamp = myState.GetOrAdd(timestampKey, 0UL);      // Get previous timestamp

    var dT = (ex[0].Timestamp - prevTimestamp) / 1e7; // -> sec

    if (dT <= 0.0)
    {
        log.Error($"dT is '{dT}', not a legal value");
        return ex[0];
    }

    double filtered;
    if (timeConstant > dT) // Prevent unstable filter
    {
        filtered = prevFiltered + dT * (ex[0].DblVal - prevFiltered) / timeConstant; // Compute new filtered value
    }
    else
    {
        log.Warn($"timeConstant > dT, unstable. Consider increasing your time constant");
        filtered = ex[0].DblVal;
    }

    myState.AddOrUpdate(filteredValueKey, filtered); // Update state with new filtered value
    myState.AddOrUpdate(timestampKey, ex[0].Timestamp); // Update state with new timestamp

    return new VQT(filtered, ex[0].Quality, ex[0].Timestamp); // Return filtered value
    //Expression end
//}

Pitfall!

If the state key is not unique, the state will be shared and the results will be a mess, e.g. if one calls the same method twice in an expression for the same external item:

var lp1 = LowPassFilter(timeConstant, ex[0], state);
// Do something with lp1
var lp2 = LowPassFilter(timeConstant, ex[0], state); // <- same as lp1
// Do something with lp2

In the case above, do not low-pass filter ex[0] twice.
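For your own state-based code, the easy way to avoid this pitfall is to give each logical use its own key, e.g. by using different paramId values or different algorithm id strings (a sketch with assumed example names, following the pattern of the example above):

// Two independent states for the same external item: distinct algorithm id strings
// (or distinct paramId values) give distinct StateKeys, so the stored values never collide.
int algIdStage1 = GetAlgorithmId("Acme_LowPass_Stage1"); // assumed example names
int algIdStage2 = GetAlgorithmId("Acme_LowPass_Stage2");
var key1 = new StateKey(1, algIdStage1, new[] { ex[0].ItemHandle });
var key2 = new StateKey(1, algIdStage2, new[] { ex[0].ItemHandle });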

Item type: Function item - Legacy calculator

This item is a calculated value based on existing items in Hive. The calculation is a formula based on inputs from external items.

Calculation

The formula can be an expression using the names ex1, ex2, ex3, ... exN. The formula can include the following operators, functions, and logic.
In most cases when using expressions of the all* kind, e.g. allgoodcount, you will most likely need to have the module property ExtItem pass-through quality set to Any quality.
Otherwise, one or more external items having bad quality will prevent the Function item from being evaluated completely.
See also: ExtItem pass-through quality.

Operator | Description | Example | Item Type
+ | The sum of two values | ex1 + ex2 | 8 byte float (default)
- | The subtraction of one value from another | ex1 - ex2 | 8 byte float (default)
* | The multiplication of two values | ex1 * ex2 | 8 byte float (default)
/ | The division of one value by another | ex1 / ex2 | 8 byte float (default)
% | The modulus of one value by another | ex1 % ex2 | 8 byte float (default)

Function | Description | Example | Item Type
( | Left bracket | 1.0 / ( ex1 + ex2 ) | -
) | Right bracket | 1.0 / ( ex1 + ex2 ) | -
/**/ | Comment. In a function item everything between /* and */ will be omitted. | /* a comment */ | -
abs | The absolute value | abs(ex1) | 8 byte float (default) or custom scalar number
asinh | Represents the inverse of the hyperbolic sine function | asinh(ex1) | 8 byte float (default) or custom scalar number
acosh | Represents the inverse of the hyperbolic cosine function | acosh(ex1) | 8 byte float (default) or custom scalar number
atanh | Represents the inverse of the hyperbolic tangent function | atanh(ex1) | 8 byte float (default) or custom scalar number
arcsin | Returns the angle, θ, measured in radians, such that -π/2 ≤ θ ≤ π/2. The argument must be greater than or equal to -1 and less than or equal to 1. A positive return value represents a counterclockwise angle from the x-axis; a negative return value represents a clockwise angle. Multiply the return value by 180/Math.PI to convert from radians to degrees. | arcsin(ex1) | 8 byte float (default) or custom scalar number
arccos | Returns the angle, θ, measured in radians, such that 0 ≤ θ ≤ π. The argument must be greater than or equal to -1 and less than or equal to 1. Multiply the return value by 180/Math.PI to convert from radians to degrees. | arccos(ex1) | 8 byte float (default) or custom scalar number
arctan | Returns the angle, θ, measured in radians, such that -π/2 ≤ θ ≤ π/2; or NaN if the argument equals NaN, -π/2 rounded to double precision (-1.5707963267949) if it equals NegativeInfinity, or π/2 rounded to double precision (1.5707963267949) if it equals PositiveInfinity. A positive return value represents a counterclockwise angle from the x-axis; a negative return value represents a clockwise angle. Multiply the return value by 180/Math.PI to convert from radians to degrees. | arctan(ex1) | 8 byte float (default) or custom scalar number
cos | The cosine of a value, in radians | cos(ex1) | 8 byte float (default) or custom scalar number
cosh | The hyperbolic cosine of a value. If the value is equal to NegativeInfinity or PositiveInfinity, PositiveInfinity is returned. If the value is equal to NaN, NaN is returned. The parameter is an angle, measured in radians; multiply by Math.PI/180 to convert degrees to radians. | cosh(ex1) | 8 byte float (default) or custom scalar number
exp | Exponent of an expression | exp( ex1 ) | 8 byte float (default)
floor | Rounds the value to the nearest integer towards minus infinity | floor(ex1) | 16 byte fixed point (default) or custom scalar number
ln | Logarithm of a value | ln(ex1 + ex2) | 8 byte float (default)
neg | Invert sign | neg(ex1) | 8 byte float (default) or custom scalar number
pow | The first value raised to the power of the second value | pow(ex1 , ex2) | 8 byte float (default) or custom scalar number
sin | The sine of a value, in radians | sin(ex1) | 8 byte float (default) or custom scalar number
sinh | The hyperbolic sine of a value. If the value is equal to NegativeInfinity, PositiveInfinity or NaN, that value is returned. The parameter is an angle, measured in radians; multiply by Math.PI/180 to convert degrees to radians. | sinh(ex1) | 8 byte float (default) or custom scalar number
sqrt | The square root of a value | sqrt(ex1) | 8 byte float (default) or custom scalar number
tan | The tangent of a value, in radians | tan(ex1) | 8 byte float (default) or custom scalar number
tanh | The hyperbolic tangent of a value. If the value is equal to NegativeInfinity, this function returns -1. If the value is equal to PositiveInfinity, it returns 1. If the value is equal to NaN, it returns NaN. The parameter is an angle, measured in radians; multiply by Math.PI/180 to convert degrees to radians. | tanh(ex1) | 8 byte float (default) or custom scalar number
max | Return the max value of two values | max(ex1,ex2) | 8 byte float (default) or custom scalar number
min | Return the min value of two values | min(ex1,ex2) | 8 byte float (default) or custom scalar number
allgoodavg | Return the average of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodavg() | 8 byte float (default)
allgoodmin | Return the minimum of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodmin() | 8 byte float (default)
allgoodmax | Return the maximum of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodmax() | 8 byte float (default)
allgoodmed | Return the median of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodmed() | 8 byte float (default)
allgoodstd | Return the standard deviation of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodstd() | 8 byte float (default)
allgoodvarians | Return the variance of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodvarians() | 8 byte float (default)
allgoodcount | Return the number of external items that have good quality. | allgoodcount() | int
allgoodsum | Return the sum of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodsum() | int (default)
allgoodcountvalue | Return the number of external items that have good quality and a value equal to the given value. | allgoodcountvalue(5.1) | int
allgoodlastdatetime | Return the newest timestamp of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodlastdatetime() | DATE
allgoodlastunixtime | Return the newest timestamp of all external items that have good quality. The value is a double with milliseconds as decimals. If no good external value exists, NaN and bad quality are returned. | allgoodlastunixtime() | 8 byte float (default)
allgoodfirstdatetime | Return the oldest timestamp of all external items that have good quality. If no good external value exists, NaN and bad quality are returned. | allgoodfirstdatetime() | DATE
allgoodfirstunixtime | Return the oldest timestamp of all external items that have good quality. The value is a double with milliseconds as decimals. If no good external value exists, NaN and bad quality are returned. | allgoodfirstunixtime() | 8 byte float (default)
allgoodgreatherthanavg | Return the average of all good values greater than X. If no good external value exists, NaN and bad quality are returned. | allgoodgreatherthanavg(XX) | 8 byte float (default)
allgoodlessthanavg | Return the average of all good values less than X. If no good external value exists, NaN and bad quality are returned. | allgoodlessthanavg(XX) | 8 byte float (default)
allavg | Return the average of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allavg() | 8 byte float (default)
allmin | Return the minimum of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allmin() | 8 byte float (default)
allmax | Return the maximum of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allmax() | 8 byte float (default)
allmed | Return the median of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allmed() | 8 byte float (default)
allstd | Return the standard deviation of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allstd() | 8 byte float (default)
allvarians | Return the variance of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allvarians() | 8 byte float (default)
allcount | Return the number of external items. | allcount() | int
allsum | Return the sum of all external items. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allsum() | int (default)
allcountvalue | Return the number of external items that have a value equal to the given value. Quality will be the worst quality. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allcountvalue(5.1) | int
allgreatherthanavg | Return the average of all values greater than X. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | allgreatherthanavg(XX) | 8 byte float (default)
alllessthanavg | Return the average of all values less than X. If an error occurs in the conversion to double, the value will be omitted and quality will be set to BAD. | alllessthanavg(XX) | 8 byte float (default)
movingavg | Calculate the moving average of the last 5 values of the calculation defined in the brackets. | movingavg(ex1) | 8 byte float (default) or custom scalar number
movingavgXX | Create a moving average of the last XX values. | movingavg10(ex1) | 8 byte float (default) or custom scalar number
deltaavg | Create a moving average of the last 5 values, but the value is only updated when the calculation is based on 5 new values. | deltaavg(ex1) | 8 byte float (default) or custom scalar number
deltaavgXX | Create a moving average of the last XX values, but the value is only updated when the calculation is based on XX new values. | deltaavg10(ex1) | 8 byte float (default) or custom scalar number
movingtimeavgXX | Calculate the moving average of the last XX seconds of the calculation defined in the brackets. If XX is 60, the function returns the moving average of the last 60 seconds. If XX is omitted, a default value of 5 is used. | movingtimeavg100(ex1) | 8 byte float (default)
deltatimeavgXX | Calculate the moving time average of the last XX seconds of the calculation defined in the brackets. If XX is 60, the function returns the moving average of the last 60 seconds every 60 seconds. The calculation is also synchronized to XX: if XX is 60, a new value will be calculated every minute (60 seconds), e.g. at 10:01:00, 10:02:00, etc.; if XX is 600, there will be a new value at 10:00:00, 10:10:00, 10:20:00, etc. If XX is omitted, a default value of 5 is used. | deltatimeavg60(ex1) | 8 byte float (default)
deltatimeendXX | Calculate/return the last sample of the last XX seconds of the calculation defined in the brackets. If XX is 60, the function returns the last sample of the last 60 seconds every 60 seconds. The calculation is also synchronized to XX: if XX is 60, a new value will be calculated every minute (60 seconds), e.g. at 10:01:00, 10:02:00, etc.; if XX is 600, there will be a new value at 10:00:00, 10:10:00, 10:20:00, etc. If XX is omitted, a default value of 5 is used. | deltatimeend60(ex1) | 8 byte float (default)
delay | Delay the value 1 step. | delay(ex1/ex2) | 8 byte float (default)
delayXX | Delay the value XX steps. | delay5(ex1/ex2) | 8 byte float (default)
pulsedetect | Detect a pulse, defined as the previous value being below 0.5 and the current value above 0.5. If a pulse is detected the return value is 1, else the value is 0. | pulsedetect(ex1) | 8 byte float (default)
pulsecount | Count the number of detected pulses. A pulse is defined by the function pulsedetect. The counting starts from 0 when Apis is started. | pulsecount(ex1) | 8 byte float (default)
filetime2oadate | Converts a FILETIME value into the equivalent OLE Automation date. A Windows file time is a 64-bit value that represents the number of 100-nanosecond intervals that have elapsed since 12:00 midnight, January 1, 1601 A.D. (C.E.) Coordinated Universal Time (UTC). Windows uses a file time to record when an application creates, accesses, or writes to a file. An OLE Automation date is an 8 byte floating-point number whose integral component is the number of days before or after midnight, 30 December 1899, and whose fractional component represents the time on that day divided by 24. The base OLE Automation date is midnight, 30 December 1899. | filetime2oadate(ex1) | DATE
unixtime2oadate | Converts a Unix timestamp into the equivalent OLE Automation date. A Unix timestamp is the number of seconds that have elapsed since UTC 1970-01-01T00:00:00Z. The OLE Automation date type is described above. | unixtime2oadate(ex1) | DATE
unixtimems2oadate | Converts a Unix timestamp with milliseconds (Unix time multiplied by 1000) into the equivalent OLE Automation date. Unix time and the OLE Automation date type are described above. | unixtimems2oadate(ex1) | DATE
getdatetime | Take the timestamp of the value and use this as the value itself. | getdatetime(ex1) | DATE
getunixtime | Take the timestamp of the value and use this as the value itself. The value is a double including milliseconds as decimals. | getunixtime(ex1) | 8 byte float (default)
if-elseif-else | Adds the possibility to have an IF - ELSEIF - ELSE construction in function items. The ELSE part is mandatory, but the ELSEIF can be omitted or added several times. | If (ex1 > 2) {ex1*ex2} ElseIf (ex1 > 1) {(ex3+1)*ex1} Else {5.0} | 8 byte float (default)

Logic | Description | Example | Item Type
> | Greater than | (ex1 > ex2) | 8 byte float (0.0 or 1.0)
>= | Greater than or equal | ( ex1 >= 2.5 ) | 8 byte float (0.0 or 1.0)
< | Smaller than | ( ex1 < 3.4 ) | 8 byte float (0.0 or 1.0)
<= | Smaller than or equal | ( ex1 <= ex2 ) | 8 byte float (0.0 or 1.0)
== | Equal | ( ex1 == ex2 ) | 8 byte float (0.0 or 1.0)
<> | Not equal | ( ex1 != ex2 ) | 8 byte float (0.0 or 1.0)
!= | Not equal | ( ex1 != ex2 ) | 8 byte float (0.0 or 1.0)
! | Not | !( ex1 > ex2 ) | 8 byte float (0.0 or 1.0)

Logical expressions are written like (ex1 > 4.5); the result, however, will be a double value, either 0.0 (false) or 1.0 (true). These expressions can be part of a formula.

The result of a calculation will always be a double value.

External Items

The values ex1, ex2 are the values from the external items defined in the external items attribute.

Example 1

Expression:
ex1+ex2 + ex1+ 5.14

External Items:
ex1 : Worker.signal1
ex2: Worker.Signal2

Example 2

Expression:
(ex1+3.2) / ex2+5.3 / ( ex2-ex1 )-ex3 + Ln( ex1+ex2 ) * ( ex1 > 5.2 )

External Items:
ex1 : Worker.signal1
ex2: Worker.Signal2
ex3: Worker.Signal4 

Example 3

Expression:
Ln( ex1+ex2+ Abs(ex1) )

External Items:
ex1 : Worker.signal1
ex2: Worker.Signal2

Global Attributes

By global attributes, we mean attributes that can be added dynamically to any item in the Apis Hive environment at any time. There are three main categories of global attributes: global attributes defined by an Apis module, predefined Apis attributes, and predefined OPC attributes.

Defined by modules

Some modules register global attributes in the Apis Hive environment. These global attributes can be added dynamically to any item in the Apis Hive namespace at any time. The module owning a global attribute defines its meaning.

An example using ApisLoggerBee

The ApisLoggerBee module defines a global attribute called the Log attribute. This attribute controls whether or not an item is stored to the Apis Honeystore historian as a time series. If this attribute is added to an item and its value is set to true, the item is stored.

Predefined Apis attributes

The predefined Apis attributes are attributes that have been defined by Apis Hive. These attributes are described here, Predefined Apis attributes.

Predefined OPC attributes

The predefined OPC attributes are attributes that have been defined by the OPC Foundation according to the OPC DA specification. These attributes are described here, OPC DA Item Attributes.

Item Attributes

Items have attributes (sometimes also referred to as properties). They can be changed by selecting one or more items in the Solution Explorer or an item view. The attributes are changed in the Property Editor.

Adding item attributes

Attributes can be added to items by selecting one or more items in the Solution Explorer or a view, and then clicking the "Add Property button" in the Property Editor. A dialog box will appear listing attributes/properties that can be added. Select one or more attributes and click "Ok" to add the attributes.

Removing item attributes

Attributes can be removed from items by selecting one or more items in the Solution Explorer or a view, and then clicking the "Remove Property" button in the Property Editor. A dialog box will appear listing the attributes that can be removed. Select one or more attributes and click "Ok" to remove the attributes.

Note: Not all the attributes can be removed. Only the ones which have been added previously.

Basic Item Properties

All items have a set of basic properties, these are:

Standard

Name | Description | ID | Flags
Value | The current value of the item. | 2 | NormalPage
Quality | Item quality - tells you whether an item currently has good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type. | 3 | ReadOnly
Time | The date and time when this item was last updated. | 4 | ReadOnly
Rights | Item access rights - this property may be "read" or "read write", meaning the item may only be read from, or can also be written to. For OPC items, a written value is passed on to the external items this item comes from. | 5 | ReadOnly

See also

Predefined Item Properties and OPC DA Properties

Enumerated Item Properties

Some Apis Hive properties/attributes have enumerated values, i.e. you may select from a predefined list of attribute values. The enumerated values and their matching integers are listed here in order to make configuration with text files easier.

Monitoring - Pulse Triggering - MonMethod

None => 0,
ValueChanged => 1,
ValueChangedAutoReset => 2

Data transfer control - ExtTransferCtrlMethod

Always => 0,
When CtrlItem changes from zero => 1,
When CtrlItem is zero => 2,
When CtrlItem is non zero => 3,
When CtrlItem becomes > argument => 10,
When CtrlItem becomes < argument => 11,
When CtrlItem becomes = argument => 12,
When CtrlItem becomes <> argument => 13,
When CtrlItem > argument => 20,
When CtrlItem < argument => 21,
When CtrlItem = argument => 22,
When CtrlItem <> argument => 23

External item override - ExtItemOverrideMethod

Default => 0,
Assign a complete value from external item => 1

Data validation - DVMethod

The method that decides which data validation tests to use.
None => 0x0,
BQ => 0x1,
R => 0x2,
ROC => 0x4,
WD => 0x8,
DF => 0x10,
ADF => 0x20,
SPC => 0x40,
____ 2 rules ____
BQ_R => 0x3,
BQ_ROC => 0x5,
R_ROC => 0x6,
BQ_WD => 0x9,
R_WD => 0xa,
ROC_WD => 0xc,
BQ_DF => 0x11,
R_DF => 0x12,
ROC_DF => 0x14,
WD_DF => 0x18,
BQ_ADF => 0x21,
R_ADF => 0x22,
ROC_ADF => 0x24,
WD_ADF => 0x28,
DF_ADF => 0x30,
BQ_SPC => 0x41,
R_SPC => 0x42,
ROC_SPC => 0x44,
WD_SPC => 0x48,
DF_SPC => 0x50,
ADF_SPC => 0x60,
____ 3 rules ____
BQ_R_ROC => 0x7,
BQ_R_WD => 0xb,
BQ_ROC_WD => 0xd,
R_ROC_WD => 0xe,
BQ_R_DF => 0x13,
BQ_ROC_DF => 0x15,
R_ROC_DF => 0x16,
BQ_WD_DF => 0x19,
R_WD_DF => 0x1a,
ROC_WD_DF => 0x1c,
BQ_R_ADF => 0x23,
BQ_ROC_ADF => 0x25,
R_ROC_ADF => 0x26,
BQ_WD_ADF => 0x29,
R_WD_ADF => 0x2a,
ROC_WD_ADF => 0x2c,
BQ_DF_ADF => 0x31,
R_DF_ADF => 0x32,
ROC_DF_ADF => 0x34,
WD_DF_ADF => 0x38,
BQ_R_SPC => 0x43,
BQ_ROC_SPC => 0x45,
R_ROC_SPC => 0x46,
BQ_WD_SPC => 0x49,
R_WD_SPC => 0x4a,
ROC_WD_SPC => 0x4c,
BQ_DF_SPC => 0x51,
R_DF_SPC => 0x52,
ROC_DF_SPC => 0x54,
WD_DF_SPC => 0x58,
BQ_ADF_SPC => 0x61,
R_ADF_SPC => 0x62,
ROC_ADF_SPC => 0x64,
WD_ADF_SPC => 0x68,
DF_ADF_SPC => 0x70,
____ 4 rules ____
BQ_R_ROC_WD => 0xf,
BQ_R_ROC_DF => 0x17,
BQ_R_WD_DF => 0x1b,
BQ_ROC_WD_DF => 0x1d,
R_ROC_WD_DF => 0x1e,
BQ_R_ROC_ADF => 0x27,
BQ_R_WD_ADF => 0x2b,
BQ_ROC_WD_ADF => 0x2d,
R_ROC_WD_ADF => 0x2e,
BQ_R_DF_ADF => 0x33,
BQ_ROC_DF_ADF => 0x35,
R_ROC_DF_ADF => 0x36,
BQ_WD_DF_ADF => 0x39,
R_WD_DF_ADF => 0x3a,
ROC_WD_DF_ADF => 0x3c,
BQ_R_ROC_SPC => 0x47,
BQ_R_WD_SPC => 0x4b,
BQ_ROC_WD_SPC => 0x4d,
R_ROC_WD_SPC => 0x4e,
BQ_R_DF_SPC => 0x53,
BQ_ROC_DF_SPC => 0x55,
R_ROC_DF_SPC => 0x56,
BQ_WD_DF_SPC => 0x59,
R_WD_DF_SPC => 0x5a,
ROC_WD_DF_SPC => 0x5c,
BQ_R_ADF_SPC => 0x63,
BQ_ROC_ADF_SPC => 0x65,
R_ROC_ADF_SPC => 0x66,
BQ_WD_ADF_SPC => 0x69,
R_WD_ADF_SPC => 0x6a,
ROC_WD_ADF_SPC => 0x6c,
BQ_DF_ADF_SPC => 0x71,
R_DF_ADF_SPC => 0x72,
ROC_DF_ADF_SPC => 0x74,
WD_DF_ADF_SPC => 0x78,
____ 5 rules ____
BQ_R_ROC_WD_DF => 0x1f,
BQ_R_ROC_WD_ADF => 0x2f,
BQ_R_ROC_DF_ADF => 0x37,
BQ_R_WD_DF_ADF => 0x3b,
BQ_ROC_WD_DF_ADF => 0x3d,
R_ROC_WD_DF_ADF => 0x3e,
BQ_R_ROC_WD_SPC => 0x4f,
BQ_R_ROC_DF_SPC => 0x57,
BQ_R_WD_DF_SPC => 0x5b,
BQ_ROC_WD_DF_SPC => 0x5d,
R_ROC_WD_DF_SPC => 0x5e,
BQ_R_ROC_ADF_SPC => 0x67,
BQ_R_WD_ADF_SPC => 0x6b,
BQ_ROC_WD_ADF_SPC => 0x6d,
R_ROC_WD_ADF_SPC => 0x6e,
BQ_R_DF_ADF_SPC => 0x73,
BQ_ROC_DF_ADF_SPC => 0x75,
R_ROC_DF_ADF_SPC => 0x76,
BQ_WD_DF_ADF_SPC => 0x79,
R_WD_DF_ADF_SPC => 0x7a,
ROC_WD_DF_ADF_SPC => 0x7c,
____ 6 rules ____
BQ_R_ROC_WD_DF_ADF => 0x3f,
BQ_R_ROC_WD_DF_SPC => 0x5f,
BQ_R_ROC_WD_ADF_SPC => 0x6f,
BQ_R_ROC_DF_ADF_SPC => 0x77,
BQ_R_WD_DF_ADF_SPC => 0x7b,
BQ_ROC_WD_DF_ADF_SPC => 0x7d,
R_ROC_WD_DF_ADF_SPC => 0x7e,
____ 7 rules ____
BQ_R_ROC_WD_DF_ADF_SPC => 0x7f
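The combined codes above are simply bitwise ORs of the seven single-rule values. A short illustration (plain C#, not an Apis API) of how a combination maps to its hexadecimal code:

// Illustrative only: the DvMethod codes are bit flags, one bit per validation rule.
int bq = 0x1, r = 0x2, roc = 0x4;
int bq_r_roc = bq | r | roc; // = 0x7, matching BQ_R_ROC in the list above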

Data validation - DvSubstMethod

last good => 1,
InitVal => 2,
AltExtItem => 3,
PassThrough => 4,
Follow ROC => 5,
Default => 255

Engineering unit - EU

See ApisDIR/Bin/EngUnit.xml

Item Qualities

When using External Item Data Validation, Apis Hive will mark the item quality with Prediktor-specific qualities when validation fails. In compliance with the OPC DA specification, vendor specific item qualities are supported inside the upper 8 bits of the 16-bit quality word.

Apis uses the following item qualities:

Quality name | Quality value (16 bit word) | Description
bad quality | 256 | Item value had bad quality
range | 257 | Item value failed range check
roc | 258 | Item value failed rate of change check
watchdog | 259 | Item value failed watchdog check
difference | 260 | Item value failed difference check
abs difference | 261 | Item value failed absolute value of difference check
spc unable | 262 | Item value failed SPC check because it was unable to perform the check (i.e. data conversion failure)
spc 1beyza | 263 | Item value failed SPC check because one point is beyond zone A
spc 9beycent | 264 | Item value failed SPC check because nine points in a row are in zone C or beyond on one side of CL
spc 6incdec | 265 | Item value failed SPC check because six points in a row are steadily increasing or decreasing
spc 14altupdo | 266 | Item value failed SPC check because fourteen points in a row alternate up and down
spc 2of3inza | 267 | Item value failed SPC check because two out of three points in a row are in zone A or beyond
spc 4of5inzb | 268 | Item value failed SPC check because four out of five points in a row are in zone B or beyond
spc 15inzc | 269 | Item value failed SPC check because fifteen points in a row are in zone C above or below CL
spc 8outzc | 270 | Item value failed SPC check because eight points in a row are on both sides of CL with none in zone C
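As an illustration (plain C#, not an Apis API) of the statement above that vendor-specific qualities live in the upper 8 bits of the 16-bit quality word: 256 is 0x0100, i.e. the lowest bit of the upper byte.

// Illustrative only: check whether any vendor-specific (upper byte) quality bits are set.
ushort quality = 257;                          // "range" - item value failed range check
bool vendorSpecific = (quality & 0xFF00) != 0; // true, since 257 = 0x0101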

Optional Alarm Configuration

Specify severity by setting the AlmSeverityList attribute.

To specify the alarm text, change the AlmHelp attribute. Different texts can be specified for each sub-condition by adding new texts to the AlmHelp attribute, separated by a semicolon. If no text is given, automatically generated text is sent to clients. If only one set of text is given, it'll be used for all sub-conditions. If more than one text is given they must be positioned in the AlmHelp string as follows:

Event category | Sub condition | AlmHelp text position (1 based)
Discrete | Not normal | 2
Discrete | Normal | 1
Level | LoLo | 5
Level | Lo | 4
Level | Hi | 3
Level | HiHi | 2
Level | Normal | 1
Watchdog | Frozen | 2
Watchdog | Updating | 1
WatchQuality | Good | 3
WatchQuality | Uncertain | 2
WatchQuality | Bad | 1
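As an illustration (the texts below are made up), a Level alarm with texts for all five sub-conditions could use an AlmHelp string like:

Level is back to normal;Level is very high;Level is high;Level is low;Level is very low

where, according to the table above, position 1 is Normal, position 2 is HiHi, position 3 is Hi, position 4 is Lo and position 5 is LoLo.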

Events can be delayed for a certain number of ScanPeriods by setting the AlmActiveDelayPeriods attribute. You might need to add this attribute to your items the same way you added the AlarmAreaEvtCategory attribute. If the condition becomes inactive within the delay, no event will be issued.

Predefined Item Properties

The tables below are a summary of the attributes defined by Apis Hive, which may be applied to any item in Apis Hive.

Some Apis Hive attributes have enumerated values, e.g. you may select between a predefined list of attribute values. Check the list of enumerated values and matching integers to get an overview of these attributes.

Internal attributes

Note: These attributes should not be added

ID | Name | Description | Type
5000 | Handle | Handle of an item - used when persisting the runtime state. | 4 byte integer
5001 | TypeID | The type ID of an item. | 4 byte integer
5003 | ItemID | The name of the item. | string
5002 | InitValue | The initial value of an item. | -
5045 | InitValueQuality | Initial value quality of an item (only valid when having an InitValue). | -
5046 | InitValueTimestamp | Initial value timestamp of an item (only valid when having an InitValue). | -
5047 | InitVQTFromHoneystore | If true, the initial value, quality and timestamp will be fetched from the last logged sample in Apis Honeystore. Note! To successfully use this attribute, the item in question must be logged to Honeystore by an ApisLoggerBee module. | Boolean

Alternative value presentations

ID | Name | Description | Type
5004 | RawValue | The raw 32-bit untranslated value. | 4 byte integer
5005 | Scale | Linear transformation multiplier, a floating-point value. | 8 byte real
5006 | Offset | Linear transformation addend, a floating-point value. | 8 byte real
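Based on the attribute names (multiplier and addend), the presented value is presumably derived from the raw value as a linear transformation, for example:

// Assumed relationship, shown for illustration only:
double rawValue = 650.0;  // example RawValue
double scale = 0.1;       // example Scale
double offset = -40.0;    // example Offset
double presented = rawValue * scale + offset; // = 25.0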

Vector and matrix attributes

ID | Name | Description | Type
5007 | Dimension | The number of elements for a vector item (vector only) | 2 byte unsigned integer
5008 | Rows | The number of rows for a matrix item (matrix item only) | 2 byte unsigned integer
5009 | Columns | The number of columns for a matrix item (matrix item only) | 2 byte unsigned integer

Item location attributes

ID | Name | Description | Type
5010 | ProcStationID | Process station ID, a 32 bit integer | 4 byte integer
5011 | ProcArea | Process area | string
5012 | Location | General location attribute string | string
5013 | Station | Station name | string

Object attributes

ID | Name | Description | Type
5015 | ObjectClass | The object class this item belongs to | string
5016 | ObjectInstance | The object instance this item belongs to | string
5017 | ObjectProperty | The object property this item represents | string

Item bus/network location attributes

ID | Name | Description | Type
5020 | Address | Bus/network address 1 | 4 byte integer
5021 | Address2 | Bus/network address 2 | 4 byte integer
5022 | Address3 | Bus/network address 3 | 4 byte integer
5023 | Address4 | Bus/network address 4 | 4 byte integer
5026 | Segment | Bus/network segment number | 2 byte integer
5028 | SegmentPos | Position within segment | 2 byte integer

ID | Name | Description | Type
5030 | SrcItemID | Item ID in source (often defaults to Item Name in Apis) | string
5031 | SrcUaNodeId | UA NodeID in source, mostly relevant for OPC UA communication modules. | string
5032 | InstrumentName | Instrument name | string

ID | Name | Description | Type
5034 | Source | The source of the item | string
5035 | Table | The table of the item | string
5036 | Record | The record of the item | string
5037 | Field | The field of the item | string

Directional attributes

ID | Name | Description | Type
5040 | Direction | Direction flag: Input when false, Output when true | Boolean

Display attributes

ID | Name | Description | Type
5050 | Display | Operator display associated with the item | string
5054 | Xpos | X-pos, graphical x-position of the item | 4 byte integer
5055 | Ypos | Y-pos, graphical y-position of the item | 4 byte integer
5056 | Width | Width, width of the item | 4 byte integer
5057 | Height | Height, height of the item | 4 byte integer

Device attributes

ID | Name | Description | Type
5060 | DevType | Device type | string
5062 | DevVendor | Device vendor | string
5064 | DevSerNum | Device serial number | string

ID | Name | Description | Type
5070 | MaxStringLength | An indication of the maximum length of the value of a string item. Must be used when logging string items to the Apis Honeystore historian, to control how many characters will be stored for each sample. | 4 byte integer
5071 | MaxCacheDuration | When the item is logged to Apis Honeystore, the maximum duration in seconds for the item cache when the item is logged by ApisLoggerBee. | 4 byte unsigned integer
5080 | LowerBound | LBound, the lower bound of items in a vector | 4 byte real
5081 | UpperBound | UBound, the upper bound of items in a vector | 4 byte real
5090 | RowNames | The names of the rows in a vector or matrix, separated by ';' | array of strings
5091 | ColNames | The names of the columns in a matrix, separated by ';' | array of strings
5100 | ArgumentItem | An item whose value is used to calculate this item | string
5105 | FormatString | A general purpose format string, used when applying string formatting to calculate the value of this item. | string
5110 | Expression | An expression used to calculate the value of this item | string
5111 | Expressions | An array of expressions used to calculate the value of this item | array of string
5130 | Rate | An attribute which keeps the value rate | 8 byte unsigned integer
5135 | Horizon | An attribute which keeps the value horizon | 8 byte unsigned integer
5220 | Coefficients | Vector of coefficients used for calculation purposes | array of 8 byte real

Composition attributes

ID | Name | Description | Type
5500 | Parent | The item handle of the parent item to this item, an item in the same module as this item. | 4 byte integer
5502 | ParentItem | The item name of the parent item to this item, an item in the same module as this item. | string
5505 | Aggregate | The item handle of the aggregate this item is part of | 4 byte integer
5510 | InputItemsLocal | Array of item handles regarded as input-items to this item, only for items in the same module. | array of 4 byte integers
5511 | InputItems | Array of item handles regarded as input-items to this item, for items in any module. | array of 4 byte integers
5514 | InputFlags | Flags associated with the inputs | array of 4 byte integers
5520 | OutputItemsLocal | Array of item handles regarded as output-items from this item, only for items in the same module. | array of 4 byte integers
5521 | OutputItems | Array of item handles regarded as output-items from this item, for items in any module. | array of 4 byte integers
5524 | OutputFlags | Flags associated with the outputs | array of 4 byte integers

Alarm attributes

ID | Name | Description | Type
5600 | AlmSeverity | Severity of the alarm associated with this item (according to OPC AE). 1 is the lowest, while 1000 is the highest. | 4 byte integer
5610 | AlmSeverityList | Severity of the alarm associated with this item (according to OPC AE). 1 to 1000 is the allowable range. | string
5630 | AlmNormalState | The normal state of a discrete alarm/event (according to OPC AE) associated with this item. A deviation from this value will generate a discrete event. | Variant
5635 | AlmActiveDelayPeriods | The number of periods an item is active before an alarm is raised. | 2 byte integer
5640 | AlmWatchdogPeriods | The max number of periods this item can have the same value before a watchdog alarm is raised. | 2 byte integer
5650 | AlmInhibitItem | The Item ID of another item whose value will be used to inhibit this alarm if set to true. Changed from a string to a 4 byte unsigned integer at start-up, i.e. 4 byte unsigned integer at runtime. | string
5652 | AlmInhibitOffDelay | The time in milliseconds the alarm inhibit state is delayed before going off | 4 byte integer
5653 | AlmResetItem | The Item ID of another item whose value (when true) will reset the alarm of the source when it is inhibited | string

Chronical attributes

ID | Name | Description | Type
5250 | ChronicalEventType | Name of the event type to use for new events. This will override the default event type generated by the AlarmArea module. | string
5252 | ChronicalParent | Path to an event source to use as the parent source for the item. This will override the default location of the item's event source. | string
5255 | ChronicalSourceName | Value of the SourceName field on new events. This will override the default source name used by the AlarmArea module. | string

General purpose attributes

ID | Name | Description | Type
5800 | Text1 | General purpose attribute | string
5801 | Text2 | General purpose attribute | string
5802 | Text3 | General purpose attribute | string
5803 | Text4 | General purpose attribute | string
5810 | ActivationCount | The number of times this item has been activated | 4 byte unsigned integer
5811 | ExecutionCount | The number of times this item has been executed | 4 byte unsigned integer
5900 | UserLev | The user level needed to access this item, 0-255 (BYTE) | 2 byte unsigned integer
5940 | ExtendedRights | Extended access rights, used to restrict access to items beyond the OPC Item attributes. | 1 byte unsigned integer
5950 | TypeFlags | This item is protected from users without the required level | 4 byte integer

Data validation attributes

ID | Name | Description | Type
5670 | DvMethod | The method that decides which data validation tests to use. See "Data Validation Method". | 2 byte unsigned integer
5672 | DvHiRange | The high limit for a DV range check | 8 byte real
5673 | DvLoRange | The low limit for a DV range check | 8 byte real
5674 | DvMaxRoc | The maximum rate of change for a DV rate of change check | 8 byte real
5676 | DvWdTime | The time limit in seconds for a DV watchdog check | 8 byte real
5677 | DvCompareItem | The comparand item to use when using any of the difference data-validation methods, i.e. the difference is calculated by subtracting the value of the referenced item from the value of the item owning this attribute. | string
5678 | DvCompareValue | The compare-value to use when using any of the difference data-validation methods, i.e. if the difference is bigger than the value of this attribute, the data is invalid. | 8 byte real
5680 | DvSPCUCL | The upper control limit in SPC | 8 byte real
5681 | DvSPCLCL | The lower control limit in SPC | 8 byte real
5682 | DvSPCTest | The value assigned causes tests to be performed (multiple selections possible) | string
5683 | DvSPCUSL | The upper SPC specification limit, specified as an item reference or a value directly. | string
5684 | DvSPCLSL | The lower SPC specification limit, specified as an item reference or a value directly. | string
5685 | DvSPCCL | The central line in SPC, specified as an item reference or a value directly. If CL is not specified, UCL and LCL will be used to calculate it. | string
5690 | DvSubstMethod | This method decides which data validation substitute value to use, see Data Validation Substitution Method. | 2 byte unsigned integer
5692 | DvAltExtItem | An alternative external item to fetch a DV substitute value from | string
5695 | DvSuppressAlarm | If true (default behaviour), successive DV alarms of the same subcondition are suppressed from being notified to the Alarm server of this Hive instance, i.e. suppressed from any AE clients and the Alarm database. If false, no DV alarms are suppressed. | Boolean
ID | Name | Description | Type
5960 | ExtItemMetaTransfer | An enum deciding what kind of metadata to transfer from source to destination external item(s): EngineeringUnit - the EU and Unit attributes are transferred; Description - the Description attribute is transferred; EURange - the High/Low EU attributes are transferred; InstrumentRange - the High/Low Instrument Range attributes are transferred. | 4 byte unsigned integer
5970 | MonMethod | The method that decides how external items are monitored. See Pulse Monitoring. | 2 byte unsigned integer
5971 | MonTrueValue | This value is set on the monitoring item if the monitoring method is true | Variant
5972 | MonFalseValue | This value is set on the monitoring item if the monitoring method is false | Variant
5980 | ExtTransferCtrlMethod | The control method that decides when external items are transferred. See "External Item Data Transfer Control Methods". | 2 byte unsigned integer
5982 | ExtTransferCtrlItem | The Item ID of another item whose value will control the transfer of external item values to this item | string
5984 | ExtTransferCtrlArgument | The argument to use with the method | string
5986 | ExtTransferCtrlInhibitQuality | The quality to set when external item value transfer is inhibited. One of: none=0; quality: bad=65536; quality: uncertain=65600; quality: good=65728; quality: bad:config error=65540; quality: bad:not connected=65544; quality: bad:device failure=65548; quality: bad:sensor failure=65552; quality: bad:last known value=65556; quality: bad:comm failure=65560; quality: bad:out of service=65564; quality: bad:waiting for initial data=65568; quality: uncertain:last usable=65604; quality: uncertain:sensor not accurate=65616; quality: uncertain:engineering units exceeded=65620; quality: uncertain:sub normal=65624; quality: good:local override=65752 | 4 byte unsigned integer
5990 | ExtRowSelect | The vector/matrix row in an external vector/matrix item to copy to this scalar item | 2 byte integer
5991 | ExtColSelect | The vector/matrix column in an external vector/matrix item to copy to this scalar item | 2 byte integer
5994 | ExternalItem Filters | One or more item name template(s) to use for connecting many external items. When set to a non-empty array, this attribute takes control over the external item configuration for its item. This means that upon startup, and when changing this attribute at run-time, an operation is triggered that resolves all matching items and connects them as external items. Any existing connections no longer matching the filter(s) are removed. An empty array is ignored, and external items can be configured as if this attribute were not present on the item. To make the ExternalItem Filters perform better, avoid having a wildcard '*' in the first part of the filter, before the first '.' separating the item names from the module name part. | array of string
5998 | ExtItemPaddingValue | A padding value to be used when assigning missing external item values to an array item, i.e. when assigning a source array item of dim 3 to a target item of dim 5 | -
5999 | ExtItemOverrideMethod | This attribute decides what method to use when assigning a value from an external item, see External Item Override Methods. | 2 byte unsigned integer

External item attributes

IDNameDescriptionType
20000ExternalItem1External Item (Apis item name of the external item which will be automatically copied to this item)string
85535ExternalItemMaxMax external item IDstring

PDS Connection attributes

IDNameDescriptionType
5714ProcessCellThe PDS Process Cell associated with this item4 byte integer
5715ProcessUnitClassThe PDS Process Unit Class associated with this item4 byte integer
5716ProcessUnitThe PDS Process Unit associated with this item4 byte integer
5718EquipmentThe PDS Equipment associated with this item4 byte integer
5730IdentifierThe PDS Identifier associated with this item4 byte integer
5732MaterialDefinitionThe PDS Material Definition associated with this item4 byte integer
5734MaterialClassThe PDS Material Class associated with this item4 byte integer
5736MaterialQualityThe PDS Material Quality associated with this item4 byte integer
5742CarrierClassThe PDS Carrier Class associated with this item4 byte integer
5744CarrierThe PDS Carrier associated with this item4 byte integer
5752DataClassThe PDS Data Class associated with this item4 byte integer
5754DataDefinitionThe PDS Data Definition associated with this item4 byte integer
5764EventSourceThe PDS Event Source associated with this item4 byte integer
5774EUThe PDS Engineering unit (engineering unit) associated with this item4 byte integer
5775GuiEUThe default GUI PDS Engineering Unit associated with this item.4 byte integer
IDNameDescriptionType
5300DEPRECATED_MessageTypeIDDEPRECATED! The MessageTypeID of an ApisMessage4 byte unsigned integer
5301DEPRECATED_MessageCategoryDEPRECATED! The MessageCategory of an ApisMessage4 byte unsigned integer
IDNameDescriptionType
6000DEPRECATED_GemVIDDEPRECATED! SECS/GEM Variable ID4 byte unsigned integer
6001DEPRECATED_GemVIDClassDEPRECATED! SECS/GEM Variable Class ID1 byte unsigned integer
6010DEPRECATED_GemCEIDTrigDEPRECATED! SECS/GEM Collection Event ID for item used as trigger. Values below 1000 are reserved for internal usage, and will be ignored on item attributes.4 byte unsigned integer
6020DEPRECATED_GemTIDSndDEPRECATED! SECS/GEM Terminal ID for equipment generated text4 byte unsigned integer
6021DEPRECATED_GemTIDRcvDEPRECATED! SECS/GEM Terminal ID for host generated text4 byte unsigned integer
6030DEPRECATED_GemALIDDEPRECATED! SECS/GEM Alarm ID of an alarm. Values below 1000 are reserved for internal usage, and will be ignored on item attributes.4 byte unsigned integer
6031DEPRECATED_GemALCEIDOnDEPRECATED! SECS/GEM Alarm CEID for an alarm going ON.4 byte unsigned integer
6032DEPRECATED_GemALCEIDOffDEPRECATED! SECS/GEM Alarm CEID for an alarm going OFF.4 byte unsigned integer
IDNameDescriptionType
6100DEPRECATED_IEC104TypeIDDEPRECATED! IEC104 TypeID, used by IEC104 server to expose Apis items, enumeration of supported IEC104 TypeIDs:Not in use=0Single point information[Bool]=1Bitstring 32 bits[Int32,UInt32]=7Measured value[Float32]=131 byte unsigned integer
IDNameDescriptionType
5150DEPRECATED_HSDimensionIDDEPRECATED! This attribute maps this item value to instances of a Dimension definition in the Apis HoneyStore Indexing databasestring
5151DEPRECATED_HSMeasurementIDDEPRECATED! This attribute maps this item value to instances of a Measurement definition in the Apis HoneyStore Indexing databasestring
5152DEPRECATED_HSLocationIDDEPRECATED! This attribute maps this item value to instances of a Location definition in the Apis HoneyStore Indexing databasestring

DataTypes

Although data types are visualized as strings in most places in Apis, it is sometimes necessary to know the code for a data type. The following table is a brief overview, mapping the types to their code value.

Description | Value
Character | 16
Unsigned character | 17
2 byte integer | 2
4 byte integer | 3
8 byte integer | 20
2 byte unsigned integer | 18
4 byte unsigned integer | 19
8 byte unsigned integer | 21
4 byte real | 4
8 byte real | 5
Boolean | 11
String | 8
Date | 7
Variant | 12
Decimal | 14
Error | 10
Currency | 6
Filetime | 64
Array | 8192
Vector | 4096
Null terminated string | 30
Wide null terminated string | 31

FileFormats

It is possible to export parts of a configuration to text files.

The following can be exported:

Event - Command configuration

The event-command configuration will be exported to a tab-separated text file called ##EventBrokerConfig_<time>.txt. An example of such a file is:

// File generated by Apis Snapins Configuration Export

// ##version:2## (do not modify this line)

##EventBrokerConfiguration## (EventModuleName.EventID CommandModuleName.CommandID)

ApisCalculate 1.8000 ApisCalculate 1.8000

ApisJava1.110 ApisJava1.100

ApisJava1.8000 ApisJava1.8000

ApisJava1.8000 ApisJava1.120

ApisOPC1.8000 ApisOPC1.8000

Java.110 Java.100

Java.8000 Java.8000

The third line indicates that this is an event-command configuration, and the following lines represent connections between events and commands.

The separators between the events and the commands must be tabs.
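For illustration, such a file can be read with a few lines of code. The sketch below assumes the tab-separated layout described above; the helper function is only an example and not part of Apis:

def read_event_command_config(path):
    # Skip comments, the version line and the ##EventBrokerConfiguration## marker;
    # each remaining line is "<EventModuleName.EventID>\t<CommandModuleName.CommandID>".
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line or line.startswith("//") or line.startswith("##"):
                continue
            event_id, command_id = line.split("\t")
            pairs.append((event_id, command_id))
    return pairs
# Example: read_event_command_config("##EventBrokerConfig_<time>.txt")
# -> [("ApisJava1.110", "ApisJava1.100"), ...]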

Modules

There will be one file for each module in the configuration. The files will be named <NameOfModule>_##ModuleProperties_<time>.txt.

An example of such a file is:

// File generated by Apis Configuration Export

// ##version:2## (do not modify this line)

ModuleCLSID {d8e2d2c7-7bd5-4afa-8811-85e863c1d3a0}

ModuleName ApisCalculate 1

ModuleStorageCLSID {4c854c93-c667-11d2-944b-00608cf4c421}

##Module properties##

ExchangeRate TimeReferenceItem ExtItemCalculationSequence ExtItem pass-through quality UpdateInitvalsOnSave PersistValToInitVal Calculation TraceToFile

100 200 300 400 1600 1650 1700 1900

0 0 64 False 0 True

The module properties are registered immediately below the ##Module properties## line.

The first line contains the names of the properties and is not actually used when importing.

The second line contains the IDs of the properties.

The third line contains the values of the properties.

Items

There will be one file for each item type in each module. The files will be named <NameOfModule>_<NameOfItemType>_<time>.txt.

An example of such a file is:

// File generated by Apis Configuration Export

// ##version:2## (do not modify this line)

ModuleCLSID {983b4ae2-abb9-11d2-9424-00608cf4c421}

ModuleName ApisWorker1

ModuleStorageCLSID {4c854c93-c667-11d2-944b-00608cf4c421}

ItemType Signal [ID:1]

//##Module properties## (uncomment this and the next three lines to import module properties):

//ExchangeRate TimeReferenceItem ExtItemCalculationSequence ExtItem pass-through quality RandomizeItemAttribs UpdateInitvalsOnSave PersistValToInitVal

//100 200 300 400 1500 1600 1650

//2000 0 64 True False 0

//ItemID Amplitude Waveform Address Period Overidden quality Address2 Bias UALogger2 UALogger Type

0 10001 10002 5020 10004 10100 5021 10005 UALogger2 UALogger

Signal1 600 0 0 60 192 0 10 True False 4

Signal2 600 0 0 60 192 0 20 True False 4

Signal3 700 0 0 60 192 0 30 True False 4

Signal4 700 0 0 60 192 0 20 True False 4

Signal5 800 0 0 60 192 0 10 True False 4

Signal6 800 0 0 60 192 0 13 True False 4

Signal7 900 0 0 60 192 0 14 True False 4

Signal8 900 0 0 60 192 0 15 True False 4

The first part of the file is the same as for the modules, except it also contains a line for the item type, and that module properties are commented out.

The last part is a list of items. The columns are the properties of the item. The first column should always have id 0, and is the name of the item.

The separators between the columns must be tabs.
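As an illustration of the layout, a minimal sketch that reads the item rows of such a file into dictionaries keyed by attribute ID (example code only; the real import is done by the Apis configuration tools):

def read_item_export(path):
    ids, items = None, []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line or line.startswith("//"):
                continue
            cells = line.split("\t")
            if cells[0] == "0":        # the attribute-ID row; column 0 is the item name
                ids = cells
            elif ids is not None:      # the item rows follow the attribute-ID row
                items.append(dict(zip(ids, cells)))
    return items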


External items

In order for Apis modules to exchange values, the items defined in Apis may have an attribute called ExternalItem. Exchanging values in the Apis context simply means copying the value of an item in one module to another item (typically in another module).

Each ExternalItem attribute is a string containing the name of the item whose value is to be retrieved. It is always the item that receives the value that defines the external item.

The external item feature in Apis may be utilised in many ways. In addition to a raw copy of the value from an external item, functionality such as data validation and pulse triggering is available. Refer to the linked topics below for a detailed description.

Apis Online SPC

Apis offers online SPC monitoring of signals as part of the data validation feature, i.e. one of the data validation methods available is SPC.

How to set up a monitoring:

Create an item ("SPC Tag"). Add the following attributes: DvMethod = SPC,  DvSPCUCL, DvSPCLCL, DvSPCTest and DvSubstMethod (optional) and set the appropriate values. Make the item you want to monitor ("Signal Tag") an external item to the "SPC Tag".

An alarm is triggered if one or more of the SPC tests fail. The alarm will be logged in the system log, but it is also possible to send it out as an e-mail notification.

SPC Test input codes

The set of SPC tests to apply when performing online SPC on a signal may be configured via the Apis attribute named "DvSPCTest". The table below lists the SPC tests and the code to use as input to "DvSPCTest". It is possible to select multiple tests; in that case the input codes for the desired tests are entered as a semicolon-separated list, e.g. 1;3;4.

Input code | Short description | Description
1 | 1BEYZA | One point beyond zone A
2 | 9BEYCENT | Nine points in a row in zone C or beyond on one side of CL
3 | 6INCDEC | Six points in a row steadily increasing or decreasing
4 | 14ALTUPDO | Fourteen points in a row alternating up and down
5 | 2OF3INZA | Two out of three points in a row in zone A or beyond
6 | 4OF5INZB | Four out of five points in a row in zone B or beyond
7 | 15INZC | Fifteen points in a row in zone C above or below CL
8 | 8OUTZC | Eight points in a row on both sides of CL with none in zone C

Example zones and control limits

External Item Data Transfer Control

This feature makes it possible to control when an external item's value is transferred to another item. You may not always want the value to be transferred unconditionally; here you can select between different rules for when a transfer should be carried out. In addition, the system can be set up so that the value is transferred from different external items depending on the value of a third item (MUX).

The following item attributes must be specified:

Attribute ID

Name

Description

Type

5980

ExtTransferCtrlMethod

The control method that decides when external items are transferred to the item of the attribute. One of the following may be selected:

Name

Meaning

Value

always

Always transfer external item values

0

when CtrlItem changes from zero

Transfer once when control item (ExtTransferCtrlItem) value changes from 0

1

when CtrlItem is zero

Transfer when control item (ExtTransferCtrlItem) value = 0

2

when CtrlItem is non zero

Transfer once when control item (ExtTransferCtrlItem) value  is != 0

3

when CtrlItem becomes > argument

Transfer once when control item (ExtTransferCtrlItem) value becomes greater than argument value (ExtTransferCtrlArgument)

10

when CtrlItem becomes < argument

Transfer once when control item (ExtTransferCtrlItem) value becomes less than argument value (ExtTransferCtrlArgument)

11

when CtrlItem becomes = argument

Transfer once when control item (ExtTransferCtrlItem) value becomes equal to the argument value (ExtTransferCtrlArgument)

12

when CtrlItem becomes <> argument

Transfer once when control item (ExtTransferCtrlItem) value becomes different from the argument value (ExtTransferCtrlArgument)

13

when CtrlItem > argument

Transfer once when control item (ExtTransferCtrlItem) value is greater than argument value (ExtTransferCtrlArgument)

20

when CtrlItem < argument

Transfer once when control item (ExtTransferCtrlItem) value is less than argument value (ExtTransferCtrlArgument)

21

when CtrlItem = argument

Transfer once when control item (ExtTransferCtrlItem) value is equal to argument value (ExtTransferCtrlArgument)

22

when CtrlItem <> argument

Transfer once when control item (ExtTransferCtrlItem) value is different from argument value (ExtTransferCtrlArgument)

23

2 byte unsigned integer

5982

ExtTransferCtrlItem

The Item ID of another item, whose value will control the transfer of external item values to this item

string

5984

ExtTransferCtrlArgument

The argument to use with the ExtTransferCtrlMethod

string

5986

ExtTransferCtrlInhibitQuality

(Note that not all item-types
will have this functionality
implemented)

The quality to set when external item value transfer is inhibited as a result of the chosen ExtTransferCtrlMethod. When set to a value other than none, data is still transferred during control method failure, but with the chosen quality. When set to none, data transfer stops during control method failure.
The attribute is enumerated, with the following options:

For the option none (value 0), the quality is not set/altered upon control method failure, but data transfer is inhibited. For all other options, the quality is set to the listed value upon control method failure, but data is still transferred:

Name | Value
none | 0
quality: bad | 65536
quality: uncertain | 65600
quality: good | 65728
quality: bad:config error | 65540
quality: bad:not connected | 65544
quality: bad:device failure | 65548
quality: bad:sensor failure | 65552
quality: bad:last known value | 65556
quality: bad:comm failure | 65560
quality: bad:out of service | 65564
quality: bad:waiting for initial data | 65568
quality: uncertain:last usable | 65604
quality: uncertain:sensor not accurate | 65616
quality: uncertain:engineering units exceeded | 65620
quality: uncertain:sub normal | 65624
quality: good:local override | 65752

4 byte unsigned integer

Note:  The module property ExtItemCalculationSequence is important in data validation and transfer. It decides whether data validation or data transfer will be performed first in the external item manager. Available values: TransferBeforeValidation, ValidationBeforeTransfer.

Alternatively, the ExtItemOverrideMethod attribute decides what method to use when assigning a value from an external item. One of the following may be selected: "Default" or "Assign complete value from external item".
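To make the transfer control methods above more concrete, here is a minimal, illustrative sketch of the decision logic. The function and variable names are invented for the example; the "once" methods are modelled as edge-triggered on the control item's previous and current value, and methods 20-23 are modelled as transferring whenever the condition holds. Consult the method table above for the authoritative semantics.

def should_transfer(method, prev, curr, arg=None):
    # prev/curr: previous and current value of ExtTransferCtrlItem,
    # arg: ExtTransferCtrlArgument. Returns True if the external item
    # value should be transferred on this evaluation.
    if method == 0:                                   # always
        return True
    if method == 1:                                   # when CtrlItem changes from zero
        return prev == 0 and curr != 0
    if method == 2:                                   # when CtrlItem is zero
        return curr == 0
    if method == 3:                                   # when CtrlItem is non zero
        return curr != 0
    edge = {10: lambda p, c: not (p > arg) and c > arg,    # becomes > argument
            11: lambda p, c: not (p < arg) and c < arg,    # becomes < argument
            12: lambda p, c: p != arg and c == arg,        # becomes = argument
            13: lambda p, c: p == arg and c != arg}        # becomes <> argument
    level = {20: lambda c: c > arg, 21: lambda c: c < arg, # condition holds
             22: lambda c: c == arg, 23: lambda c: c != arg}
    if method in edge:
        return edge[method](prev, curr)
    if method in level:
        return level[method](curr)
    raise ValueError(f"unknown ExtTransferCtrlMethod {method}")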

External Item Data Validation

This feature makes it possible to perform various data validations of data values on Apis external items before making use of the data. An alarm is triggered when a data validation fails. The alarm will be logged in the system log, but it is also possible to send it out as an e-mail notification. When a validation fails, the invalid data may be substituted according to different rules.

The data-validation is performed when the value of the item connected as the external item changes. When an item fails a data-validation check, the quality of the item will include some Apis specific quality information, see Apis Vendor Specific Item Qualities for details.

Data-validation methods

Apis supports the following data validation methods:

Data validation method

Abbreviation

Value*

Description

Related item attribute(s)

None

None / 0

0

 

 

Bad quality

BQ / 1

1

Validation fails when the quality of the item becomes bad

 

Range

R /

2

Validation fails when the value of the item violates a range check

DvHiRange, DvLoRange

Rate of change

ROC

4

Validation fails when the value of the item changes too fast

DvMaxRoc

Watchdog

WD

8

Validation fails when the value of the item stops updating

DvWdTime

Difference

DF

16

Validation fails when the difference between the value of the item and the value of another item is too big

DvCompareValue, DvCompareItem

Absolute difference

ADF

32

Validation fails when the absolute value of the difference between the value of the item and the value of another item is too big

DvCompareValue, DvCompareItem

SPC

SPC

64

Any of the supported SPC tests, see Apis Online SPC.

DvSPCTest, DvSPCLCL, DvSPCUCL

*) When using File-add, the values of the desired DV methods must be OR-ed together, e.g. R | ROC = 6.
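As a small illustration of the OR-ing rule above, using a hypothetical IntFlag enumeration (the class is example code, not part of Apis; the bit values follow the table above):

from enum import IntFlag

class DvMethod(IntFlag):
    NONE = 0
    BQ   = 1    # Bad quality
    R    = 2    # Range
    ROC  = 4    # Rate of change
    WD   = 8    # Watchdog
    DF   = 16   # Difference
    ADF  = 32   # Absolute difference
    SPC  = 64   # SPC

combined = DvMethod.R | DvMethod.ROC
print(int(combined))   # 6, matching the example R | ROC = 6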

The following item attributes are used in data validation:

ID

Name

Description

Type

5670

DvMethod

The method that decides which data validation tests to use. Any combination of the following is possible: None, BQ, R, ROC, WD, DF, ADF, SPC.

2 byte unsigned integer

5672

DvHiRange

The high limit for DV range check.

8 byte real

5673

DvLoRange

The low limit for DV range check.

8 byte real

5674

DvMaxRoc

The maximum rate of change for DV rate of change check.

8 byte real

5676

DvWdTime

The time limit in seconds for DV watch dog check.

8 byte real

5677

DvCompareItem

An external item used by data-validation methods DF and/or ADF, when comparing values.

 

5678

DvCompareValue

The value to compare the difference with, when using data-validation methods DF and/or ADF

8 byte real

5680

DvSPCUCL

The upper control limit in SPC.

8 byte real

5681

DvSPCLCL

The lower control limit in SPC.

8 byte real

5682

DvSPCTest

The assignable causes tests to perform (multiple selects possible, e.g. 1;3;4). 1 - 1BEYZA, 2 - 9BEYCENT, 3 - 6INCDEC, 4 - 14ALTUPDO, 5 - 2OF3INZA, 6 - 4OF5INZB, 7 - 15INZC, 8 - 8OUTZC. Test codes explained.

string

5690

DvSubstMethod

The method that decides which data validation substitute value to use. One of the following may be selected:

Name

Meaning

Value

last good

Substitute with last good value when data validation fails

1

InitVal

Substitute with InitVal attribute when data validation fails. If no InitVal is associated with the item, it uses the last good value

2

AltExtItem

Substitute with an alternative external item (DvAltExtItem). If no alternative external item can be obtained, it uses the last good value

3

PassThrough

No substitution occurs, external item is passed through as if no validation failed

4

Follow ROC

Substitute with a value that increases/decreases with a rate of change limitation according to the maximum rate of change (DvMaxRoc).

5

Default

First, the value is substituted following the Follow ROC strategy, if DvMethod specifies ROC.
Then, if DvMethod specifies Range, a range check validation is performed. If the range check fails, an AltExtItem substitution is performed if a DvAltExtItem is specified.

If none of the above substitutions occurs, the default substitution is last good.

255

2 byte unsigned integer

5691 DvSubstQuality

The quality to use when data validation substitution takes place. This attribute lets the user control the quality of item values when Data Validation substitution takes place; the quality can be overridden to bad, uncertain or good by specifying the desired quality in the DvSubstQuality attribute.

The APIS Vendor specific quality bits, telling which validation is violated, will still be added to the quality. I.e. when using Range validation, the quality of the item during a violated state is decided according to the table below, but with the Apis Vendor quality range added to the quality.

Name

Meaning

Value

Default

The quality follows the quality of the source item, with any Apis Vendor quality violation state added to it.

255

Bad The quality is always set to bad, with any Apis Vendor quality violation state added to it. 0
Uncertain The quality is always set to uncertain, with any Apis Vendor quality violation state added to it. 64
Good The quality is always set to good, with any Apis Vendor quality violation state added to it. 192
unsigned character

5692

DvAltExtItem

An alternative external item to fetch a DV substitute value from. Used when 'DvSubstMethod'='AltExtItem'.

string

Note:  The module property ExtItemCalculationSequence is important in data validation and transfer. It decides whether data validation or data transfer will be performed first in the external item manager. Available values: TransferBeforeValidation, ValidationBeforeTransfer.
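For illustration only, here are simplified versions of three of the checks described above (range, rate of change and watchdog). The function names and the per-second interpretation of DvMaxRoc are assumptions made for the example; Apis performs these checks internally.

import time

def range_ok(value, dv_lo_range, dv_hi_range):
    # Range check: the value must stay within [DvLoRange, DvHiRange].
    return dv_lo_range <= value <= dv_hi_range

def roc_ok(value, prev_value, dt_seconds, dv_max_roc):
    # Rate-of-change check between two consecutive samples
    # (DvMaxRoc assumed to be expressed per second for this sketch).
    return abs(value - prev_value) / dt_seconds <= dv_max_roc

def watchdog_ok(last_update_time, dv_wd_time):
    # Watchdog check: the item must have been updated within DvWdTime seconds.
    return (time.time() - last_update_time) <= dv_wd_time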

Data-validation alarms

Data validation will generate alarms into the Alarms & Event server of Apis, and OPC clients can retrieve these alarms over OPC UA and OPC AE. Further, data-validation alarms & events can be written to an Apis event database, configuration of this is described in Advanced Event Configuration.

External Item Monitoring - Pulse Trigger

This feature makes it possible to monitor an external item and create a pulse value in an Apis item depending on the value in the external item and a set of predefined rules for monitoring.

The following item attributes must be specified:

IDNameDescriptionType
5970MonMethodThe method that decides how external items are monitored. Available methods: None, ValueChanged, ValueChangedAutoReset.2 byte unsigned integer
5971MonTrueValueThe value to be set on the destination item if the external (source) item behaves in a manner that should produce a pulse value.Variant
5972MonFalseValueThe value to be set on the destination item if the external (source) item behaves in a manner that should not produce a pulse value.Variant

How to set up a monitoring:

Create an item (DestinationItem), add the three attributes (Add attributes) described above and set the appropriate values. Make the item you want to monitor (SourceItem) an external item to the DestinationItem (Set external item).  

Note: Monitoring is always carried out after potential data transfer and data validation operations.

At the time of writing, ValueChanged and ValueChangedAutoReset are the only monitoring methods that are implemented. The pulse production for these methods is described in the figure below.
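As a rough illustration of the idea only: the authoritative pulse behaviour is defined by the figure above. The sketch below simply assumes that ValueChanged writes MonTrueValue to the destination item whenever the source value changes, and that ValueChangedAutoReset additionally writes MonFalseValue back afterwards; the item object and its write method are invented for the example.

def on_source_changed(dest, method, mon_true, mon_false, old_value, new_value):
    if new_value == old_value:
        return
    if method == "ValueChanged":
        dest.write(mon_true)                 # pulse stays until something resets it
    elif method == "ValueChangedAutoReset":
        dest.write(mon_true)                 # pulse ...
        dest.write(mon_false)                # ... followed by an automatic reset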

See also

External Items, Data transfer control, Data validation

Commands And Events

To specify the processing order of specific tasks in Apis modules, you can use "Events and Commands". An "Event" is an outgoing notification from an Apis module. For example, a notification could be sent when a calculation has finished. A "Command" is an incoming notification to an Apis module, telling the module to perform a specific task. For example, this could be to calculate new values based on the present data values.

The Apis Event Broker controls the relationship between events and commands. An "Event" can be configured to trigger a series of "Commands". The commands will then be executed in the order specified when an event notification is sent from an Apis module to the event broker.

The events and commands are specific to each type of Apis module. For the modules handling external items, the event "ExternalItems", and the command "HandleExternalItems", are common. The "ExternalItems" event is a timer event handling exchange of external items. The period of the timer is specified in the "ExchangeRate" attribute of the module. The timer is only running if the module has configured external items. The "HandleExternalItems" command handles the updating of the items connected to external items.

Event broking

In order to specify the processing order of specific tasks in the Apis modules, the concept of Events and Commands has been introduced together with a so-called Event Broker. An Event is an outgoing notification from an Apis module. The notification could e.g. be sent when a calculation has finished. A Command is an in-coming notification to an Apis module, telling the module to perform a specific task. This could be e.g. to calculate new values based on the present data values.

The Apis Event Broker controls the relationship between events and commands. An Event can be configured to trigger a series of Commands. The Commands will then be executed in the order specified when an event notification is sent from an Apis module to the event broker.

The Events and Commands are specific for each type of Apis module. Common for the modules handling external items, are the event named ExternalItems, and the command named HandleExternalItems. The ExternalItems event is a timer event handling exchange of external items. The period of the timer is specified in the ExchangeRate attribute of the module. The timer is only running if the module has configured external items. The HandleExternalItems command handles the updating of the items connected to external items.

APIS data transfer mechanism; Data Push

A new mechanism for transferring data internally in APIS has been implemented. This mechanism is hereafter called Data Push.

Background

In brief, when connecting to an OPC UA server, one can have different rates for data sampling and publishing, i.e. you can sample items every second, but choose to only send (publish) data to the client every 30 seconds. The client will then receive up to 30 samples per item every 30 seconds, i.e. a small queue/series of samples instead of one sample at a time, which is the old Apis way, and also the way of classic OPC DA. Even more complex, a UA client may have a mix of various sampling rates in one subscription (publishing).

Traditionally, Apis Hive has relied on sampling data when transferring data between different modules and internal services, i.e. the External Item manager and Logger/AlarmArea modules consuming data. To consume such data in the old Apis way, the UA client module Apis OpcUaBee must apply one timestep at a time from the queue/series of samples. After each timestep, an Eventbroker event, ServerDataChanged, is fired, to notify all consumers that they should check for any updated data on the items they consume, by sampling all of the items they are monitoring. When a Logger, or an OPC UA server client-sampler in the Hive UA server, receives such an event, it might sample/read hundreds of thousands of items, even if just a few items were actually updated.

I.e. when just a few hundred out of a million items are updated with a queued series of 30 samples, all internal consumers (Logger, External items, UA server client-sampler, AlarmArea, …) will run 30 sample/read loops over a million items, in a closed loop, to check for updated data samples. This is unnecessary and has a negative impact on the Apis Hive dataflow performance. To remedy this, a new mechanism has been developed: Data Push.

The new Data Push data transfer

The new way of transferring data internally sends the updated data samples as a package, attached as a parameter to an APIS Hive Event broker event.

For now, there is one natural source of such data packages: data received by the ApisOpcUaBee UA client from any OPC UA server. The ApisOpcUaBee has therefore been given a new Event broker event, ServerDataChanged_DataPush. To consume this new event with its data parameter, new Event broker commands are also needed where appropriate.

The following new events have been implemented:

Event | Description
ServerDataChanged_DataPush (in the ApisOpcUaBee client) | This event is fired when the UA client receives data from the UA server; the items and samples in the data push package are the ones received from the server.
ExternalItemsHandled_DataPush (in any Apis module having External Items) | This event is fired after the module has executed a HandleExternalItems_DataPush command. The data push package of this event contains all the resulting VQTs (Function items, ordinary external item transfers, etc.) from that command. On this event, any _DataPush command (Log, Scan, UaServerUpdateMonitorItems, HandleExternalItems) can be hooked, to chain a path of execution with the data transferred alongside.

The following new commands have been implemented:

Command | Description
UaServerUpdateMonitorItems_DataPush (in the Hive UA server) | This command ensures that any UA clients that subscribe to/monitor any of the items in the data push package have these samples transferred to them.
HandleExternalItems_DataPush (in any Apis module having External Items) | This command ensures that all samples for all items in the data push package are applied/used in the external item manager, including in services such as Data Validation, Ext Items Transfer Control, etc.
Log_DataPush (in the ApisLoggerBee) | This command ensures that all items and samples in the data push package are stored to the HoneyStore database where applicable.
Scan_DataPush (in the ApisAlarmAreaBee) | This command ensures that all items and samples in the data push package are used for alarm evaluation where applicable.

Finally, it is worth mentioning that the new data push event broker event and commands can be used together with the old-fashioned way; i.e. the OPC UA ServerDataChanged event is still fired the appropriate number of times, alongside a single ServerDataChanged_DataPush event.
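Conceptually, a data push package is just a batch of item/sample series carried on the event, so a consumer only touches the items that actually changed. A rough, non-Apis sketch of the idea (the class and the historian object are invented for illustration):

from dataclasses import dataclass
from datetime import datetime
from typing import Any, Dict, List

@dataclass
class VQT:
    value: Any
    quality: int
    timestamp: datetime

@dataclass
class DataPushPackage:
    samples: Dict[str, List[VQT]]   # item id -> queued series of samples

def on_server_data_changed_datapush(package, historian):
    # The consumer iterates only the updated items carried in the package,
    # instead of re-sampling every monitored item on each event.
    for item_id, series in package.samples.items():
        for vqt in series:
            historian.store(item_id, vqt)   # 'historian' is a stand-in consumer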

Example configuration

Consider the two Hive instances below; AI_Middle and AI_Top.

AI_Middle

  • OpcUa; a UA client connected to an(y) OPC UA server, here using a Sampling interval of 1 second and a Publish interval of 5 seconds to force data push packages of 5 samples per item per package.
  • AI_Middle_DB; a logger bee storing data to HoneyStore as event-based trend types.
  • AlarmArea; an alarm module monitoring some of the items from the UA server.
  • Func; a Worker bee running Function items on some of the items from the UA server.

AI_Top

  • OpcUa; a UA client connected to the OPC UA server of Hive instance AI_Middle.
  • AI_Top_DB; a logger bee storing data to HoneyStore as event-based trend types.
  • AlarmArea; an alarm module monitoring some of the items from the UA server.
  • Func; a Worker bee running Function items on some of the items from the UA server.

Below, the Event broker configuration for both instances is shown.
AI_Middle event broker configuration:

AI_Top event broker configuration:

Here, a snapshot of the same items in the source, Middle and Top instances is shown in a real-time list in AMS.
Note that the timestamps upstream will update in bursts, at the arrival of every package, and will typically be delayed by up to the same number of seconds as the publishing interval.

Setting Event Broker Priorities

To ensure the Event Broker executes as expected, you can increase its priority on systems running heavy tasks. Do this by setting the following registry entries.

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\ApisEventBroker]

"MainThreadPriority"="2"  
"ExecutionThreadPriority"="2"

The supported values of these entries, are:

Priority | Meaning
1 | ABOVE_NORMAL: Priority 1 point above the priority class
-1 | BELOW_NORMAL: Priority 1 point below the priority class
2 | HIGHEST: Priority 2 points above the priority class
-15 | IDLE: Base priority of 1 for IDLE_PRIORITY_CLASS, BELOW_NORMAL_PRIORITY_CLASS, NORMAL_PRIORITY_CLASS, ABOVE_NORMAL_PRIORITY_CLASS, or HIGH_PRIORITY_CLASS processes, and a base priority of 16 for REALTIME_PRIORITY_CLASS processes
-2 | LOWEST: Priority 2 points below the priority class
0 | NORMAL: Normal priority for the priority class
15 | TIME_CRITICAL: Base priority of 15 for IDLE_PRIORITY_CLASS, BELOW_NORMAL_PRIORITY_CLASS, NORMAL_PRIORITY_CLASS, ABOVE_NORMAL_PRIORITY_CLASS, or HIGH_PRIORITY_CLASS processes, and a base priority of 31 for REALTIME_PRIORITY_CLASS processes

The MainThreadPriority is the priority of the main-loop or event delegator thread of the Event Broker.

The ExecutionThreadPriority is the priority of the worker or execution threads of the Event Broker.
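Besides importing a .reg file, the same values can be written with any registry tool. As an example, a small Python sketch using the standard winreg module (run with administrative privileges; the registry path is taken from the snippet above, and the values are written as strings to mirror the quoted values in that snippet):

import winreg

KEY_PATH = r"SOFTWARE\Prediktor\Apis\ApisHive\ApisEventBroker"

# Create or open the ApisEventBroker key and write the two priority values.
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "MainThreadPriority", 0, winreg.REG_SZ, "2")
    winreg.SetValueEx(key, "ExecutionThreadPriority", 0, winreg.REG_SZ, "2")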

OPC UA Communication

APIS Hive offers OPC UA communication, both as an OPC UA Server and as an OPC UA Client.

When enabling the APIS Hive UA server, the following OPC UA profiles are implemented:

OPC UA communication can be initiated in two ways:

  1. A UA Client initiates a connection towards a UA Server, i.e. the Server offers an Endpoint listening for connection requests from client(s). Let us denote this the normal way to initiate a connection, or forward connectivity.

  2. A UA Server initiates connection(s) towards one or more clients, in which case the clients offer Endpoints listening for connection requests from the server. This is also called Reverse Connectivity.

A server can use any of these strategies when communicating with clients, i.e. offer one or both of the two ways to connect described above.

Enabling the Hive UA server

To enable the OPC UA Server for a Hive instance, you will need to enable one or more Endpoints for the instance. See here for how to add/enable an Endpoint for a Hive instance.

Once you have enabled one or more endpoints, the UA server of your instance will be available for OPC UA communication (forward/reverse) according to the Endpoint configuration.

Also, you will need to decide which Public Key Infrastructure to use, and which certificates to trust and/or reject.

Connecting to a UA server

In order to connect to a UA server, you will have to use a UA client. If you want to connect to a UA server (another APIS Hive instance or any other 3rd party server), you have to use an Apis OpcUa client module, and optionally an Apis OpcUa Proxy client module.

For further reading about OPC UA, please take a look at:

Adding/modifying an Endpoint

Please see here APIS Hive UA Server Endpoints for more details.

Advanced UA server configuration settings

The UA server has a set of advanced configuration settings that should normally be kept at their default values.
These settings are described in more detail here: APIS Hive UA Server Advanced Settings

OPC UA Server Endpoints

In many circumstances, you will need to define one or more Endpoints for your APIS Hive configuration. As a typical example, when enabling the APIS Hive UA Server, you will need to enable at least one Endpoint for the UA communication to run on. By default, there will be one Endpoint listed under the available Endpoints for your instance. If not, or if you want to add another Endpoint, see Adding an Endpoint. If you want to use or modify an existing Endpoint, see Modifying an Endpoint.

Adding an Endpoint

From the tree view in APIS Management Studio, locate the Endpoints folder under your instance, then right click and choose Add a new Endpoint.

Then, a new Endpoint will appear underneath the Endpoints folder.

Modifying an Endpoint

To modify an existing Endpoint, locate the Endpoint in the Endpoints folder and select it. The default property editor will then show the properties of the endpoint:

The properties of the endpoint are explained below:

Name | Description | Id
Name | The name of the endpoint, free for the user to specify. | 10
Description | A description of the endpoint, free for the user to specify. | 20
Url | The url of the endpoint, e.g.: opc.tcp://0.0.0.0:4850. You will need to specify a Url even if you don't want your UA server to be available for remote, inbound connections, as the Url is used to specify the kind of serialization to use. To specify a Url that is not reachable for normal, forward connection, enter e.g.: opc.tcp://localhost:4850 | 30
MessageSecurity | Message security modes allowed for this endpoint. The MessageSecurityMode is an enumeration that specifies what security should be applied to message exchanges during a Session. Also see here. | 40
SecurityPolicies | Transport security policies allowed for this endpoint. Options: SecurityPolicy – None, SecurityPolicy – Basic128Rsa15, SecurityPolicy – Basic256, SecurityPolicy – Aes128-Sha256-RsaOaep, SecurityPolicy – Basic256Sha256, SecurityPolicy – Aes256-Sha256-RsaPss. | 50
AuthenticationMethods | User authentication methods allowed for this endpoint. | 60
AuthenticationPolicy | Authentication security policy to use on unsecure channels. | 70
ReverseConnections | Array of client endpoint urls for reverse connections. If you do not want to use reverse connectivity, leave this array empty. If you want to use reverse connectivity on this Endpoint, specify an array of client Urls for the listening clients. | 120
ConnectInterval | When using reverse connectivity, this is the number of seconds between each reverse connection attempt. | 130
Enabled | Whether the endpoint is enabled or not. If Enabled is False, the endpoint (forward and/or reverse) will not be enabled in the Hive UA server. | 110

OPC UA Certificate Management


Managing OPC UA Certificates

From the tree view in APIS Management Studio, locate the Endpoints folder under your instance, then right click and choose Manage UA Certificates.

Then, a new dialog will appear, like shown below. Use this to specify the Public Key Infrastructure you want to use. We recommend using the default PKI Type: SSL.

As UA clients connect to the Hive UA server, their certificates will be rejected until you open the Certificate Manager, explicitly select a rejected certificate and click the Trust button.

OPC UA Redundancy

The OPC UA communication protocol offers standardized ways of achieving redundancy, as described in full detail here: OPC UA Online Reference - Server Redundancy

We support redundancy on both server and client side.

Server Redundancy

To configure redundancy on the server side, we use Apis High Availability to set up an OpcUa "Redundant Server Set".

This will be a Non-transparent Redundant Server Set.

By definition all servers shall have identical AddressSpace including:

  • identical NodeIds
  • identical browse paths and structure of the AddressSpace
  • identical logic for setting the ServiceLevel (defined by Apis HAGovernor)

OPC UA Namespace 2 (NS2) in Hive reflects the configuration and all connected variables/signals. By default, Hive instances automatically create a URI for NS2 based on the computer name and instance name. When configuring a Redundant Server Set, the NS2 URI has to be equal on all servers. This can be set manually by changing the registry key NS2 under UAServer for all servers in the Redundant Server Set.

Apis supports Cold and Hot server failover; see the definitions of Server Failover modes.

Connect to a Redundant Server Set

To create a connection to a Redundant Server Set, use the Apis Connection Manager together with the Apis OpcUa module. The ClusterItem in ApisCnxMgr defines the connection point, and shall be used as the Server property in the Apis OpcUa module. The client will connect to the server with the highest ServiceLevel. The ApisCnxMgr will continue to monitor the ServiceLevel of all servers in the set. If the ServiceLevel of the currently used server indicates a degraded server, the connection manager will switch to the server with the highest ServiceLevel, and the ApisOpcUa module will connect to this server. Some data can be lost due to the time involved in reconnecting to a new server.

This is defined as Warm failover. See Redundancy Failover mode.

OPC UA Server Advanced Settings

In most circumstances, you will not need to modify any of the advanced UA server settings. However, if you do need to change or tweak some of these, they can be modified directly in the UAServer sub-key of the Windows registry key of the Hive instance.

The set of advanced UA server settings are described in the following table:

Name | Description | Default value | Require restart to apply
Enabled | Not in use, will be removed in a future Apis version | 0 (false) | Yes
Enable_DownSampling | If enabled, VQTs with a timestamp delta less than the sampling interval are dropped | 0 (false) | Yes
EngineeringUnits_UnitAttributeFallback | If enabled, use proprietary EUInformation when proper EUInfo is unavailable | 0 (false) | Yes
ForceAllowHistoryWrite | If enabled, allow HistoryWrite on any node (1) | 0 (false) | No
Limits_JobWorkers | Not in use, will be removed in a future Apis version | 0 (false) | Yes
Limits_MaxArrayLen | Specifies max items per Publish response | 0x1000000 | Yes
Limits_MaxEventQueue | Specifies max queue size on event subscriptions | 0x100000 | Yes
Limits_MaxHistoryRead_Events | Specifies max number of events per HistoryRead | 0x2000 | Yes
Limits_MaxHistoryRead_VQTS | Specifies max number of VQTs per HistoryRead | 0x10000 | Yes
Limits_MaxJobs | Not in use, will be removed in a future Apis version | 0x100 | Yes
Limits_MaxSamplingFuzz | Fuzzy delta-time limit when downsampling is enabled, in milliseconds | 0x10 | Yes
Limits_MaxSubscriptionsPerSession | Max number of subscriptions per session (2) | 0 | Yes
Limits_MinSamplingInterval | Min sampling interval accepted by the server, in milliseconds | 10 | Yes
MaxThreadsPerCore | Max number of threads per core for parallelized HistoryRead processing | 8 | Yes
NS2 | Static URI to use for namespace 2 (Apis Hive canonical namespace) | | Yes
NsTable | Static initial set of semicolon-separated URIs in the server's namespace table | | Yes
Parallel_ClientSamplers_Threshold | Controls if/when to parallelize Event broker commands (3) | 1000 | No
Persist_NsTable | If enabled, the active namespace table is saved into NsTable at shutdown | 0 | Yes
PKI_* | Settings related to security/use of certificates (4) | |
RedundancySupport | The type of redundancy reported by the OPCUA server (5) | 0 | Yes
Root | Not in use, will be removed in a future Apis version | | Yes

(1) When set to 1, the HistoryWrite AccessLevelType on the target node is ignored

(2) When set to 0, use the default limit in the Apis OPCUA Server SDK, which is 1024 as of Apis 9.16

(3) The Event Broker commands UaServerSampleMonitoredItems and UaServerUpdateMonitoredItems_DataPush will be executed in parallel when the number of active OPCUA subscriptions passes this limit

(4) Use Certificate Management in Apis Management Studio to configure these settings

(5) Use the Property Editor in Apis Management Studio on your Apis Hive instance to configure these settings

Hierarchical OPC Namespace

The Apis Hive OPC server can expose a flat or hierarchical OPC namespace to third party OPC DA clients connecting to Apis Hive.

By default, a flat namespace is exposed. A hierarchical namespace can be configured using the registry settings for your Apis Hive configuration. The following is an example of such a configuration, with the registry keys inside the brackets and their values below:


[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace]

"EnableHierarchicalNS"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\OPC]

"Filter1"="5003==OPC.*"

"ShowItems"=dword:00000000

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\OPC\Random]

"Filter1"="5003==OPC.Random.*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker]

"Filter1"="5003==Worker.*"

"ShowItems"=dword:00000000

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker\Simulation Signals]

"Filter1"="5003==Worker.Signal*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker\Various Signals]

"Filter1"="5003==Worker.Variable*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker\Timestamp]

"Filter1"="5003==Worker.Time*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Alltimestamps]

"Filter1"="1==7"

Concept

To enable/disable a hierarchical namespace, set the registry value EnableHierarchicalNS to 0 or 1.

Each registry key represents a branch or tree node. Underneath each key, you can specify zero or more filters that apply to the items at this level in the namespace. All filters are inherited from the parent keys/branches as well. A filter value has an arbitrary name, and its data is <AttributeID><Operator><FilterValue>:

  • AttributeID - The attribute ID, as specified in "Predefined Apis Hive attributes" or OPC DA Item attributes.
  • Operator - One of: ==, <>, <, <=, >, >= (equal, not equal, less, less or equal, greater, greater or equal).
  • FilterValue - The filter value the specified attribute must match. For suitable types, wildcards are accepted (see "Using wildcards").

Additionally, you can specify a value named "ShowItems" underneath any key in the namespace hierarchy, which specifies whether the items at this level should be visible. If this value is missing, the items are shown by default.
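To illustrate the filter syntax, a small sketch that parses a filter value and tests an attribute value against it. The wildcard matching via fnmatch is an assumption made for the example; see "Using wildcards" for the rules Apis actually applies.

import fnmatch
import re

# Comparison operators supported by the filter syntax.
OPS = {"==": lambda a, b: a == b, "<>": lambda a, b: a != b,
       "<=": lambda a, b: a <= b, ">=": lambda a, b: a >= b,
       "<": lambda a, b: a < b, ">": lambda a, b: a > b}

def parse_filter(text):
    # Split e.g. "5003==OPC.Random.*" into (5003, "==", "OPC.Random.*").
    attribute_id, op, value = re.match(
        r"^(\d+)(==|<>|<=|>=|<|>)(.*)$", text).groups()
    return int(attribute_id), op, value

def matches(filter_text, attribute_value):
    _, op, value = parse_filter(filter_text)
    if op == "==" and isinstance(attribute_value, str):
        return fnmatch.fnmatch(attribute_value, value)   # wildcard match on strings
    return OPS[op](str(attribute_value), value)

print(matches("5003==Worker.Signal*", "Worker.Signal3"))  # True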

OPC DA item attributes

The following OPC DA (OLE for Process Control Data Access specification) item attributes have been defined by the OPC Foundation, and may occur on items seen in the Apis Hive environment.

OPC DA attributes

ID

Name

Description

Type

1

Type

The item canonical data type - the specific type of this item. Whether it's an integer, string, date, etc.

2 byte integer

2

Value

The current value of the item.

-

3

Quality

Item quality - this tells you whether an item currently has a good or bad quality. It might be bad, for example, because it isn't connected or is set to the wrong type.

2 byte integer

4

Time

The date and time when this item was last updated.

date

5

Rights

Item access rights - this property may be "read", or "read write". This means that the property may only be read from, or also can be written to. For OPC items, this written value will be passed on to the external items this item comes from.

4 byte integer

6

Scanrate

Server scan rate

4 byte real

100

Unit

EU units - this is the unit this item value uses. For example: centimetres, kilograms, kilowatts, etc.

string

101

Description

A description of what this item does. This is free text, so you can write anything you like here.

string

102

HiEU

High EU

8 byte real

103

LoEU

Low EU

8 byte real

104

HiRange

High instrument range

8 byte real

105

LoRange

Low instrument range

8 byte real

106

Close

Contact close label

string

107

Open

Contact open label

string

108

TimeZone

Item time zone

4 byte integer

200

FGC

Current foreground color

4 byte integer

201

BGC

Current background color

4 byte integer

202

Blink

Current blink

string

203

BMP

BMP file

string

204

SND

Sound file

string

205

HTML

HTML file

string

206

AVI

AVI file

string

300

AlmStat

Condition status

string

301

AlmHelp

Alarm quick help

string

302

AlmAreas

Alarm area list

array of strings

303

AlmPrimaryArea

Primary alarm area

string

304

AlmCondition

Condition logic

string

305

AlmLimit

Limit exceeded

string

306

AlmDB

Deadband

8 byte real

307

AlmHH

HiHi limit

8 byte real

308

AlmH

Hi limit

8 byte real

309

AlmL

Lo limit

8 byte real

310

AlmLL

LoLo limit

8 byte real

311

AlmROC

Rate of change limit

8 byte real

312

AlmDEV

Deviation limit

8 byte real

See also

Predefined Apis attributes, Data types overview

OPC DA Qualities

Quality namesDescriptionStatusValue
OPC_QUALITY_GOODQuality mask. The Quality of the value is Good.Good0xC0
OPC_QUALITY_LOCALOVERRIDEThe value has been Overridden. Typically this means the input has been disconnected and a manually entered value has been ‘forced’.Good0xD8
OPC_QUALITY_BADQuality mask. The value is bad but no specific reason is known.Bad0x00
OPC_QUALITY_CONFIG_ERRORThere is some server specific problem with the configuration. For example the item in question has been deleted from the configuration.Bad0x04
OPC_QUALITY_NOT_CONNECTEDThe input is required to be logically connected to something but is not. This quality may reflect that no value is available at this time, for reasons like the value may have not been provided by the data source.Bad0x08
OPC_QUALITY_DEVICE_FAILUREA device failure has been detected.Bad0x0C
OPC_QUALITY_SENSOR_FAILUREA sensor failure had been detected (the ’Limits’ field can provide additional diagnostic information in some situations).Bad0x10
OPC_QUALITY_LAST_KNOWNCommunications have failed. However, the last known value is available.Bad0x14
OPC_QUALITY_COMM_FAILURECommunications have failed. There is no last known value available.Bad0x18
OPC_QUALITY_OUT_OF_SERVICEThe block is off scan or otherwise locked. This quality is also used when the active state of the item or the group containing the item is InActive.Bad0x1C
OPC_QUALITY_WAITING_FOR_INITIAL_DATAAfter Items are added to a group, it may take some time for the server to actually obtain values for these items. In such cases the client might perform a read (from cache), or establish a ConnectionPoint based subscription and/or execute a Refresh on such a subscription before the values are available. This substatus is only available from OPC DA 3.0 or newer servers.Bad0x20
OPC_QUALITY_UNCERTAINQuality mask. There is no specific reason why the value is uncertain.Uncertain0x40
OPC_QUALITY_LAST_USABLEWhatever was writing this value has stopped doing so. The returned value should be regarded as ‘stale’. Note that this differs from a BAD value with Substatus 5 (Last Known Value). That status is associated specifically with a detectable communications error on a ‘fetched’ value. This error is associated with the failure of some external source to ‘put’ something into the value within an acceptable period of time.Uncertain0x44
OPC_QUALITY_SENSOR_CALEither the value has ‘pegged’ at one of the sensor limits (in which case the limit field should be set to 1 or 2) or the sensor is otherwise known to be out of calibration via some form of internal diagnostics (in which case the limit field should be 0).Uncertain0x50
OPC_QUALITY_EGU_EXCEEDEDThe returned value is outside the limits defined for this parameter. Note that in this case (per the Fieldbus Specification) the ‘Limits’ field indicates which limit has been exceeded but does NOT necessarily imply that the value cannot move farther out of range.Uncertain0x54
OPC_QUALITY_SUB_NORMALThe value is derived from multiple sources and has less than the required number of Good sources.Uncertain0x58

Limits

  • OPC_LIMIT_OK 0x00
  • OPC_LIMIT_LOW 0x01
  • OPC_LIMIT_HIGH 0x02
  • OPC_LIMIT_CONST 0x03

OPC HDA Qualities

OPC DA Qualities

The OPC DA (OLE for Process Control Data Access) specification defines quality flags that represent the quality state of an item's data value. The low 8 bits of the quality flags are currently defined in the form of three bit fields: Quality, Sub-status and Limit status. The 8 quality bits are arranged as follows:

QQSSSSLL
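As a small illustration of this bit layout, an example decoder (not an Apis API) that splits a DA quality byte into its three fields:

QUALITY_NAMES = {0b00: "Bad", 0b01: "Uncertain", 0b11: "Good"}

def decode_da_quality(quality):
    # Follows the QQSSSSLL layout described above.
    q = (quality >> 6) & 0b11        # QQ   - quality
    s = (quality >> 2) & 0b1111      # SSSS - sub-status
    l = quality & 0b11               # LL   - limit status
    return QUALITY_NAMES.get(q, "reserved"), s, l

print(decode_da_quality(0xD8))   # ('Good', 6, 0) -> Good / Local Override / Not Limited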

Quality bit fields

Good

The Quality of the value is Good

Bad

Value is not useful for reasons indicated by the sub-status

Uncertain

The quality of the value is uncertain for reasons indicated by the sub-status

Sub-status bit fields

When the quality is "Good", the following sub-status may apply:

Non-specific

The value is good. There are no special conditions

Local Override

The value has been Overridden. Typically this means the input has been disconnected and a manually entered value has been ‘forced’

When the quality is "Bad", the following sub-status may apply:

Non-specific

The value is bad but no specific reason is known

Configuration Error

There is some server specific problem with the configuration. For example, the item in question has been deleted from the configuration

Not Connected

The input is required to be logically connected to something but is not.  This quality may reflect that no value is available at this time, for reasons like the value may have not been provided by the data source

Device Failure

A device failure has been detected

Sensor Failure

A sensor failure had been detected (the ’Limits’ field can provide additional diagnostic information in some situations.)

Last Known Value

Communications have failed. However, the last known value is available

Comm Failure

Communications have failed. There is no last known value available

Out of Service

The block is off scan or otherwise locked. This quality is also used when the active state of the item or the group containing the item is InActive

When the quality is Uncertain, the following sub-status may apply:

Non-specific

There is no specific reason why the value is uncertain

Last Usable Value

Whatever was writing this value has stopped doing so

Sensor Not Accurate

Either the value has ‘pegged’ at one of the sensor limits  or the sensor is otherwise known to be out of calibration via some form of internal diagnostics

Engineering Units Exceeded

The returned value is outside the limits defined for this parameter. Note that in this case (per the Fieldbus Specification) the ‘Limits’ field indicates which limit has been exceeded but does NOT necessarily imply that the value cannot move farther out of range

Sub-Normal

The value is derived from multiple sources and has less than the required number of Good sources

Limit bit field

The Limit Field is valid regardless of the Quality and sub-status. In some cases such as Sensor Failure it can provide useful diagnostic information. The following limits may occur:

Not Limited

The value is free to move up or down

Low Limited

The value has ‘pegged’ at some lower limit

High Limited

The value has ‘pegged’ at some high limit

Constant

The value is a constant and cannot move

See also

Apis Vendor Specific Item Qualities

OPC HDA Qualities

Quality name | Description | Value | Associated DA Quality
OPCHDA_EXTRADATA | More than one piece of data that may be hidden exists at same timestamp | 0x00010000 | Good, Bad, Quest
OPCHDA_INTERPOLATED | Interpolated data value | 0x00020000 | Good, Bad, Quest
OPCHDA_RAW | Raw data value | 0x00040000 | Good, Bad, Quest
OPCHDA_CALCULATED | Calculated data value, as would be returned from a ReadProcessed call | 0x00080000 | Good, Bad, Quest
OPCHDA_NOBOUND | No data found to provide upper or lower bound value | 0x00100000 | Bad
OPCHDA_NODATA | No data collected. Archiving not active (for item or all items) | 0x00200000 | Bad
OPCHDA_DATALOST | Collection started / stopped / lost | 0x00400000 | Bad
OPCHDA_CONVERSION | Scaling / conversion error | 0x00800000 | Bad
OPCHDA_PARTIAL | Aggregate value is for an incomplete interval | 0x01000000 | Bad, Quest

OPC DA Qualities

OPC HDA Aggregates

Aggregate name | Description | Implemented by APIS
Interpolative | 1st-order interpolated values. | x
Total | The totalized value (time integral) of the data over the resample interval. | x
Average | The average data over the resample interval. | x
Time Average | The time weighted average data over the resample interval. | x
Count | The number of raw values over the resample interval. | x
Standard Deviation | The standard deviation over the resample interval. | x
Minimum Actual Time | The minimum value in the resample interval and the timestamp of the minimum value. | x
Minimum | The minimum value in the resample interval. | x
Maximum Actual Time | The maximum value in the resample interval and the timestamp of the maximum value. | x
Maximum | The maximum value in the resample interval. | x
Start | The value at the beginning of the resample interval. The timestamp is the timestamp of the beginning of the interval. | x
End | The value at the end of the resample interval. The timestamp is the timestamp of the end of the interval. | x
Delta | The difference between the first and last value in the resample interval. |
Regression Line Slope | The slope of the regression line over the resample interval. |
Regression Line Constant | The intercept of the regression line over the resample interval. This is the value of the regression line at the start of the interval. |
Regression Line Error | The standard deviation of the regression line over the resample interval. |
Variance | The variance over the sample interval. | x
Range | The difference between the minimum and maximum value over the sample interval. | x
Duration Good | The duration (in seconds) of time in the interval during which the data is good. | x
Duration Bad | The duration (in seconds) of time in the interval during which the data is bad. | x
Percent Good | The percent of data (1 equals 100 percent) in the interval which has good quality. | x
Percent Bad | The percent of data (1 equals 100 percent) in the interval which has bad quality. | x
Worst Quality | The worst quality of data in the interval. | x
Annotations | The number of annotations in the interval. |

In addition, APIS Honeystore also implements the following vendor specific aggregates.

Aggregate name | Description
Sum | The sum of all raw values over the resample interval.
Interpolative zero-order | 0. order interpolated values, aka sample-and-hold.
Median | The median is the numeric value separating the higher half of a set of values from the lower half. If a < b < c, then the median of the list {a, b, c} is b, and if a < b < c < d, then the median of the list {a, b, c, d} is the mean of b and c, i.e. it is (b + c)/2.
MinimumActualTime2 | UA - The minimum value in the resample interval and its timestamp, including bounding values.
MaximumActualTime2 | UA - The maximum value in the resample interval and its timestamp, including bounding values.
Range2 | UA - The Range2 Aggregate finds the difference between the maximum and minimum values in the interval as returned by the Minimum2 and Maximum2 Aggregates. Note that the range is always zero or positive.
PercentGood (UA) | UA - Retrieve the percent of data (0 to 100) in the interval which has a good StatusCode.
PercentBad (UA) | UA - Retrieve the percent of data (0 to 100) in the interval which has a bad StatusCode.
VectorElementSum | Calculates the sum of vector elements individually (Not OPC HDA compliant)
VectorElementAverage | Calculates the average of vector elements individually (Not OPC HDA compliant)
VectorElementMin | Finds the minimum of vector elements individually (Not OPC HDA compliant)
VectorElementMax | Finds the maximum of vector elements individually (Not OPC HDA compliant)
DisplayValues | Values that give the best trend curve representation for a specific number of pixels. The desired start/end time and the number of x-pixels (nx) available are used to divide the period into nx resample intervals (1 resample interval per x-pixel). For each resample interval, anything from 0 to 4 data points may be returned. At most, the following data points are returned for each resample interval: the first data point, the maximum data point, the minimum data point and the last data point.
LowpassFilter | Lowpass filtering of the RAW data for the given period. The number of datapoints returned is the same as the number of raw datapoints. The resample interval is interpreted as the time-constant (T). The algorithm is: v(n+1) = v(n) + dT*(v(n+1) - v(n))/T, where v is the value, n is the datapoint index and dT = timestamp(n+1) - timestamp(n). If T <= dT, no filtering is applied to avoid instability.
MovingAverageByCount | Moving average of the RAW data for the given period, ignoring the time between datapoints. The number of datapoints returned is the same as the number of raw datapoints. The resample interval is interpreted as the number of values in the window. For the first window, a cumulative average is applied.
MovingAverageByTime | Moving average of the RAW data for the given period, taking the time between datapoints into account. The number of datapoints returned is the same as the number of raw datapoints. The resample interval is interpreted as the window size. For the first window, a cumulative average is applied.

The OPC Server Enumerator

The OPC Server Enumerator is an application provided by the OPC Foundation to make connecting an OPC client to an OPC server on a remote computer easier. This topic describes how to install the OPC Server Enumerator on a remote computer if it isn't already installed. The ApisOPCBee OPC client also uses the OPC Server Enumerator to retrieve necessary server information when connecting to a remote computer.

All OPC servers are obliged to register in the OPC servers component category on their local computer. This provides a practical way for OPC clients to retrieve all servers that are installed on that computer: they simply query the appropriate component category for all available servers. The component categories are maintained within the registry on the computer, and when the OPC server and client run on the same computer, there are no obstacles for the client to query the component categories on that computer. When the server is running on a remote computer, on the other hand, the client needs help to query available servers on that computer. Therefore, the OPC Foundation provides the OPC Server Enumerator.

Installing the OPC Server Enumerator

In the Apis binaries directory, e.g. C:\Apis\Bin, there is a file called OPCENUM.EXE. Copy this file to a suitable directory on the remote computer. Then, from a Command Prompt on the remote computer, go to the directory and execute the following command:

OpcEnum /Service

Then, open DCOMCNFG.EXE located in the Windows system directory, and grant the "Everyone", "SYSTEM", and "NETWORK" accounts "Access and Launch" rights to the OpcEnum application. Also, ensure that the "Enable Distributed COM on this computer" check box is checked.

OPC Server Enumerator is now installed, and you can test it by launching an application named ENUMTEST.EXE located in the Apis binaries directory. Using ENUMTEST, specify a remote server type by entering R, then enter the remote server name or IP-address to try to retrieve the OPC servers on that remote computer.

OPC DCOM setup

Introduction

This document provides additional help when configuring DCOM security settings in Windows.

Acronyms and Abbreviations

OPC: OLE for Process Control

DCOM: Distributed Component Object Model

OPCENUM: A computer service (program) provided by the OPC Foundation. It is used to find and list the installed OPC servers on a computer.

Prerequisites for OPC communication

OPC communication always involves at least one OPC client and one OPC server. In order for OPC communication to work, both computers must be connected via a TCP/IP based network. Also, the computers must be active, that is, turned on, and not hibernating. The OPC server software must be installed on the server, and the proper DCOM settings must be set. Setting the proper DCOM settings involves:

1. To allow a client to browse the server computer for installed OPC servers, the "OpcEnum" service should be installed and properly configured. The presence and use of OPCENUM is not required in order to run OPC communication, but configuration work is easier when OPCENUM is available.

2. User and access rights must be configured in order to establish an OPC client – server communication.

Sources of confusion, obstacles and pitfalls

  • Firewalls
  • Computer-wide and process-specific security.
  • A user account can acquire rights through group membership, and through Default and Customized permissions.
  • The Local System account is treated as Anonymous logon on a remote computer or domain.
  • The group Everyone is limited to the users known on the local computer.
  • Callbacks may arrive with an unknown identity, different from the caller's.
  • Microsoft changes its security mechanisms in service packs and new versions.
  • There is no universal recipe; all cases are different.

Main tasks/challenges

  • Collect info

    • Client
      • Running account with password.
      • Path to executable.
    • Server
      • Host name or IP address
      • Running account with password
      • Path to executable.
  • Firewalls

  • Windows “security” UAC

  • DCOM security

    • OPC enumerator

    • Specific OPC server DCOM settings

OPC setup, step-by-step

This is a step by step configuration of security settings for OPC communication between a server and a client.

Requirements:

  • Firewall open for OPC traffic
  • Opc server is installed
  • OpcEnum.exe is installed

Windows Firewall

Before starting any configuration, we must check Windows Firewall settings on all of the computers participating in the communication.

Following programs and ports must be open:

  • DCOM (rpc): TCP port 135, bidirectional
  • All OPC servers and clients (program)
  • OPC enum

Do this from the firewall control panel configuration tool or, most efficiently, from a script.

Example:

DCOM (rpc):

netsh advfirewall firewall add rule name="OPC DCOM (RPC)" protocol=TCP dir=in localport=135 action=allow profile=any

netsh advfirewall firewall add rule name="OPC DCOM (RPC)" protocol=TCP dir=out localport=135 action=allow profile=any

OPC Servers and client

In this case ApisHive:


netsh advfirewall firewall add rule name="AllowApis" dir=in program="C:\Program Files\APIS\Bin\apishive.exe" action=allow

netsh advfirewall firewall add rule name="AllowApis" dir=out program="C:\Program Files\APIS\Bin\apishive.exe" action=allow

OPC enum:

netsh advfirewall firewall add rule name="Allow OpcEnum" dir=in program="C:\Windows\SysWOW64\opcenum.exe" action=allow

netsh advfirewall firewall add rule name="Allow OpcEnum" dir=out program="C:\Windows\SysWOW64\opcenum.exe" action=allow

User Account Control UAC

If possible, turn off UAC on all computers participating in the communication.

This saves us from painful popups, and it makes it more likely that programs and configurations install correctly.

The most efficient way to do this is to run the following command on a Windows 7/Vista/2008 computer:


reg.exe ADD HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System /v EnableLUA /t REG_DWORD /d 0 /f

NOTE! Reboot is required

DCOM security

This can be a bit confusing and over-complex from time to time, and can cause major trouble during commissioning.

There is no universal recipe; all cases are different. Basic knowledge of how the security model works is therefore essential, along with a thorough overview of the user accounts, computers and clients/servers participating in the communication; guessing at any of these parameters will lead to failure. Spend some time before starting any configuration to identify all OPC clients and servers with their associated user accounts.

When all parameters are known, the (D)COM principles we are dealing with are basically simple:

User Access is the key!

OPC DA clients can principally operate in two main read modes:

  • Polling: the client polls for data at a certain interval.
  • Subscribe: the client initiates a subscription on the server, and the server sends data to the client when the value, timestamp or quality changes.

DCOM Security mechanism

Example

The basic mechanism is comparable to a postal worker, carrying goods from a storage room in one building, to a storage room in another building. Each building has two locked doors: the main entrance door; and an internal door to the storage room. The postal worker must be known and have access to the two doors.

Terms:
Example Term | Computer Term
Postal worker | A user account
The storage room | An OPC server
The buildings | Two computers
The mail | Data
Polling

The postal worker leaves the client building, enters the main door in the server building, then the storage room. He gets the goods he wants and returns home to the client building with his goods.

In this case, the postal worker from the client building must be well known and have access to the two doors in the server building.

Subscription

The postal worker leaves the client building, enters the main door in the server building, then enters the storage room, but now he asks the caretaker:

“When you have any new goods I’m interested in; can you bring it to me at the storage room in the client building?”

“No problem!” the caretaker says, then the postal worker leaves home without any data.

Now, when new goods are produced in the server building and placed in storage, the caretaker collects them immediately and leaves to deliver them to the storage room in the client building. The caretaker from the server building has to enter the two doors in the client building to deliver the goods.

As in the polling case, the postal worker from the client building must be well known and have access to the two doors in the server building to deliver his request. In addition, the caretaker from the server building must be well known and have access to the two doors in the client building to deliver the goods.

Mutual User Account recognition

It is necessary to ensure that the relevant User Accounts are recognized on both the OPC Client and Server computers. This includes all the User Accounts that will require OPC access. The account running the local client might be different from the account running the remote OPC server.

In other words, we must know all the User Accounts for the various processes participating in the communication.

As mentioned, OPC clients can operate in two read modes: Polling and Subscribe.

  • Polling: the client polls for data at a certain interval.
  • Subscribe: the server sends data to the client when the value, timestamp or quality changes.

Issue:

Polling:

  • Client must have read access on server.
  • The user asking for data must be known to the server
  • The user must have “read” access on the OPC server

Subscribing:

  • To be able to set up a subscription, the user asking for data must be known to the server
  • The user must have "read" access on the OPC server
  • In addition, the OPC server must have "write" access on the client; thus the user account running the server must be known to the client, as the server writes back to the client (callback)

The user(s) participating in the communication must exist on both server and client.

Users must have appropriate rights on both server and client.

Recommendations:

  • In a production environment it is recommended, when possible, to install the server as a service running under a specific account, preferably the System account.
  • Avoid running the server interactively, i.e. starting it directly from the logged-on user. This will lead to confusion on the client side, since the account running the server then changes depending on who is logged on to the server.
  • The password policy for the server and client running accounts should be set to "never expire"; otherwise the configuration must be updated on every password change.

“Local system account” is treated as “Anonymous user” on remote computer.

“Some unknown user” is NOT treated as anonymous user on remote computer.

Terms:

Assumptions used in this document (as mentioned, all cases are different; this is just an example):

Application | User | Password
OPC server (ApisHive) | OPCServerUser | <some password>
OPC client (ApisHive) | Local System | <some password>

Commissioning:

Find out which user account(s) are running the OPC server, the OPC client, and the local configuration tool. This overview is essential and is the key to further configuration; it cannot be repeated too often.

  1. Find out which user is running the client and server application(s).
  2. Determine whether the application is running as a service or not.
  3. Use Task Manager, or the Services console if the server/client is running as a service.
  4. If the application is not running as a service, keep in mind that the running user might change depending on which user is currently logged on.

Computer settings

1. Select "Local Security Policy" under "Administrative Tools".

2. Select "Network access: Sharing and security model for local accounts" under Security Options.

3. Right click, select Properties and choose "Classic – local users authenticate as themselves".

4. Start dcomcnfg by clicking Start – Run, typing dcomcnfg and pressing Enter. The Component Services window opens.

5. Right click Component Services – Computers – My Computer and select Properties.

6. Select the Default Properties tab.

7. Check "Enable Distributed COM on this computer".

Check and set DCOM security on OPC server

System-Wide DCOM settings

Classic OPC depends on Microsoft's DCOM for data transportation. Consequently, you must configure the DCOM settings properly. System-wide changes affect all Windows applications that use DCOM, including OPC applications. In addition, since some OPC client applications do not have their own DCOM settings, they are affected by changes to the default DCOM configuration.

Task:

Allow OPC client to access OPC server host, check settings on remote computer.

Action:

Use Component Services to set the limits (the main entrance door) and the default access configuration; see the Computer wide limits section in the troubleshooting guide, and assure that the user running the local client has access.

Specific OPC server DCOM settings

When system-wide access is granted (access through the main entrance door), it's time to assure access to the specific OPC server (the storage room).

On the remote OPC server computer, start Component Services and browse to My Computer; see the OPC server access rights example in the troubleshooting guide, and assure that the user running the local client has access.

Check and set DCOM security on OPC client

OPC server callback access rights

As mentioned earlier, the OPC server must have "write" access on the client; thus the user account running the server must be known to the client, as the server writes back to the client (callback).

Figure out which user account is running the remote server, take a look at the OPC server callback access rights section in the troubleshooting guide, and assure that the user running the server has access to your local computer as well as to the local OPC client process.

OpcEnum DCOM Settings

To be able to browse available OPC servers OpcEnum must be installed on the server computer.

Usually this is performed by the OPC server installer.

However, this should be part of the configuration checklist.

If Opcenum is not running as service, locate it and register it:

  • Opcenum.exe -service
  • Run it under the local system account.
  • Give Anonymous and all users running clients access to it in DCOM; follow the procedure under OPC enumerator access rights in the troubleshooting section, and assure that all relevant users have access.

Troubleshooting

In case of OPC communication problems:

1. At first, check the list of common pitfalls;

2. Ping the client from the server and vice-versa to verify that network communication exists;

3. Check the documentation of both server and client to see whether error messages can be logged. If so, enable the logs and check them in the log viewer in Apis Management Studio;

a. Use ”Control panel-> Local security policy -> Audit policy” to enable security auditing. Success or failure of user authentication then can be logged to the Security log, and this may reveal the cause of communication failure (Remember to restart the server to activate the new settings);

4. If communication is established, but tags are ”bad:not connected”, and read mode is “Subscribe”, change the read mode to ”Poll”. If this solves the problem, the server has insufficient access rights (lacks the ”write” right) to the client;

5. If communication is established, but tags are ”bad:configuration error”, the source item id (OPC address) in the server is usually incorrectly configured on the client.

6. For further reading, see Troubleshooting OPC Communication DCOM and Firewall issues.

Troubleshooting OPC Communication DCOM and Firewall issues

When experiencing disruption in communication, first of all check the Log View in Apis Management Studio for any messages related to your problem. Look for messages containing the following:

Message contains | Symptom
Access is denied. (0x80070005) | Access denied; usually indicates DCOM security misconfiguration
The RPC server is unavailable. (0x800706BA) | RPC errors can indicate Windows Firewall misconfiguration, or network obstacles in general
The remote procedure call failed. (0x800706BE) |

OPC enumerator problem

When the security configuration of the remote computer is incomplete, the OPC server list will be empty when browsing for OPC servers on the remote computer, and you might get error message(s) in the Log View in Apis Management Studio.

DCOM security

A message like the one below in the Log View in Apis Management Studio indicates that the problem is more likely related to DCOM security than to the firewall: the remote server says "Access denied".


Failed to create OPC Server Lister object on 10.100.86.125.

As a result, OPC servers might not be available from the list of servers to choose from. Make sure OPCENUM.EXE is properly registered and configured on the server machine, consider both DCOM security and open the Firewall for OPCENUM.exe.

Or, you can enter the CLSID of your OPC server directly into the server property.

Error return: Access is denied. (0x80070005)

Let's assume in this case that the local client is running under the "System account", meaning that Anonymous logon must have access rights to the remote computer and to the OpcEnum process on the remote computer.

Solution:

Check the computer-wide limits for Anonymous logon on the remote computer, as well as the access rights on the OpcEnum process.

Computer wide limits

On the OPC server computer, start Component Services, browse to My Computer, right click and select Properties. Select the COM Security tab and, in the Access Permissions section, press Edit Limits; assure that Anonymous logon has Remote Access. If ANONYMOUS LOGON does not exist in the list, it must be added.

Repeat for Launch and activation permissions.

OPC enumerator access rights

Still in Component Services, browse to OpcEnum, right click and select Properties. Select the Security tab, press the Edit button in the Access Permissions section, and assure that Anonymous logon has Remote Access. If ANONYMOUS LOGON does not exist in the list, it must be added.

Repeat for Launch and activation permissions.

If you changed any of the settings, the OpcEnum service must be restarted for the changes to take effect.

Firewall

A message like the one below in the Log View in Apis Management Studio indicates that the problem is likely firewall or network related: there is no answer from the remote server.


Failed to create OPC Server Lister object on 10.100.86.125.

As a result, OPC servers might not be available from the list of servers to choose from. Make sure OPCENUM.EXE is properly registered and configured on the server machine, consider both DCOM security and open the Firewall for OPCENUM.exe.

Or, you can enter the CLSID of your OPC server directly into the server property.

Error return: The RPC server is unavailable. (0x800706BA)

Solution:

The firewall must be opened for the OpcEnum process.

There are two ways to configure this: by script, or via the firewall control panel.

Script

From elevated command prompt run the following commands:


netsh advfirewall firewall add rule name="Allow OpcEnum" dir=in program="C:\\Windows\SysWOW64\opcenum.exe" action=allow

netsh advfirewall firewall add rule name="Allow OpcEnum" dir=out program="C:\\Windows\SysWOW64\opcenum.exe" action=allow

Note that the OpcEnum installation path may differ on your system.

Firewall control panel

On the OPC server computer, start Control Panel -> Windows Firewall -> Advanced settings -> New Rule. Select Program and press Next, then enter the program path to the OpcEnum executable, e.g. "C:\Windows\SysWOW64\OpcEnum.exe", and press Next.

Select Allow the connection and press Next.

Apply the rule to all networks and press Next.

Give the rule a proper name, like "Allow OpcEnum", and press Finish.

The Windows Firewall will now allow connections to the OpcEnum process.

OPC DA/HDA access problems

When the security configuration of the remote computer is incomplete, you are not able to connect to the remote OPC server; thus item browsing is unavailable, and you might get error message(s) in the Log View in Apis Management Studio.

DCOM security on remote server.


»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed to create OPC server, Prediktor.ApisOPCServer.1, on 10.100.86.125.

Error return: Access is denied. (0x80070005).

This message indicates that the problem is DCOM security related: the remote server says "Access denied".

Let's assume in this case that the local client is running under the "System account", meaning that Anonymous logon must have access rights to the remote computer and to the Prediktor.ApisOPCServer.1 process on the remote computer.

Solution:

Check the computer-wide limits for Anonymous logon on the remote computer, as well as the access rights on Prediktor.ApisOPCServer.1.

Computer wide limits

See how to set Computer wide limits in previous section

OPC server access rights

Still in Component Services, in this case browse to ApisHive (the OPC server), right click and select Properties, then select the Security tab.

In this case the OPC server (ApisHive) is using default properties, and we have two choices:

• Change it to Customized permissions and follow the same procedure as in the OPC enumerator access rights section.

• Keep the default. The advantage of using the default is that if we are running several OPC server instances on the same computer, the access rights can be set in one place, if desirable.

In this example we choose to keep the default. Close the ApisHive Properties dialog, browse to My Computer, right click and select Properties. Select the COM Security tab and, in the Access Permissions section, press Edit Default; assure that Anonymous logon has Remote Access.

Repeat for Launch and Activation Permissions, assuring that the Anonymous user has remote launch and activation permissions.

If you changed any of the settings, the OPC server (ApisHive) service must be restarted for the changes to take effect.

Windows Firewall


ALARM from OPC

»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed to create OPC server, Prediktor.ApisOPCServer.1, on 10.100.86.125.

Error return: The RPC server is unavailable. (0x800706BA).

As in the OPC enum case, this message indicates that the problem is likely firewall related: there is no answer from the remote server.

Solution:

The firewall must be opened for the ApisHive process. Follow the procedure in the Firewall configuration of OPC enum, but in this case open for ApisHive ("<install dir>\Bin\ApisHive.exe").

OPC server callback Firewall


ALARM from OPC/opcda://10.100.86.125/Prediktor.ApisOPCServer.1 [Primary]

»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed calling IOPCDataCallback::Advise - IOPCDataCallback! Error return: The RPC server is unavailable. (0x800706BA).

This message indicates that the problem is likely firewall related: there is no answer from the remote server. The server tries to write back to the client but hits the firewall.

Solution:

The firewall on the local client computer must be opened for the ApisHive process. Follow the procedure in the Firewall configuration of OPC enum, but in this case open for ApisHive ("<install dir>\Bin\ApisHive.exe").

OPC server callback access rights


ALARM from OPC/opcda://10.100.86.125/Prediktor.ApisOPCServer.1 [Primary]

»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»

Failed calling IOPCDataCallback::Advise - IOPCDataCallback! Error return: Access is denied. (0x80070005).

This message indicates that the problem is related to DCOM security for callbacks: the remote server tries to write back to the client but gets "Access denied".

In this case the server is running under the "OPCServerUser" account, meaning that when trying to write back to the client, this account must have access rights to the local computer as well as to the process running the client (Prediktor.ApisOPCServer.1).

On local computer:

Assure that OPCServerUser exists with the same password as the corresponding user on the remote server.

Assure that OPCServerUser has remote access rights in the computer-wide limits.

Assure that OPCServerUser has remote access rights to the client process, in this case ApisHive, through the default access permissions.

If you changed any of the computer wide settings, the OPC server (ApisHive) service must be restarted for the changes to take effect.

How to set DCOM security Computer wide limits for a specific user

Start the Component Services system configuration and browse to My Computer, right click, select Properties and select the COM Security tab. In the Access Permissions section, press the Edit Limits button and assure that the specific user has Local and Remote Access.

Repeat for Launch and activation permissions.

Run Time States

Apis Hive has several runtime-states. A runtime-state is an internal condition of Apis Hive and its modules. The different states are:

Name | Description
Initialized | Internal Apis Hive initializing.
Modules created | The included modules are created.
Configuration loaded | The modules have loaded their persisted configuration.
Resources acquired | The modules have acquired any resources they need.
Started | The configured system is running.
Paused | The configured system is idle.
Stopped | The configured system is stopped.
Configuration saved | The configuration is saved.
Resources released | The modules have released any acquired resources.

By default, Apis Hive will automatically enter the "Started" runtime-state when launched, passing through all the other states. Apis Hive will initialize itself, create the modules, load their configuration, and acquire their resources before entering the "Started" state. The automatically entered runtime-state is configurable, see "Configuring the Apis Hive properties".

It's also possible to manually pass through the different states. This might be desirable, for example, when debugging a malfunctioning configuration. To do this, set the auto start level to Initialize and restart the Apis Hive. To manually set the runtime-state of an Apis Hive Instance node, change the "Running state" property.

Delaying Runtime State Transitions

It's possible to set a delay between some of the different runtime-states when Apis Hive is starting. This might be desirable when some external application (e.g. OPC server) started indirectly by Apis Hive needs time to load before it can accept or handle any queries from another program.

At present, these settings must be configured through the Windows registry, using the RegEdit.exe application located in the Windows directory. In this program, locate the registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive

Under this key, create a DWORD value for each of the delay types listed in the table below that you want to take advantage of.

Name of DWORD value | Meaning
CreateToConfigureDelay | Specifies a delay in milliseconds between the runtime-states "Modules created" and "Configuration loaded".
ConfigureToAcquireResourcesDelay | Specifies a delay in milliseconds between the runtime-states "Configuration loaded" and "Resources acquired".
AcquireResourcesToStartDelay | Specifies a delay in milliseconds between the runtime-states "Resources acquired" and "Started".

The actual value of the respective DWORD values gives the delay in milliseconds.
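For example, a delay can also be created from an elevated Command Prompt instead of RegEdit. This is a minimal sketch assuming the default ApisHive instance and an example delay of 5000 milliseconds; adjust the value name and data to your needs:

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive" /v AcquireResourcesToStartDelay /t REG_DWORD /d 5000 /f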

Rename And Delete Multiple Items Using Files


Renaming items in one or more modules using a text file

There is an option to batch rename items from a text file using Apis Management Studio.

The input file text structure

  • A line within a text file is defined as a sequence of characters followed by a line break.
  • Each line in the text file corresponds to an item subjected to renaming.
  • The semi-colon character acts as the separator between the old and the new name of the item.
  • Blank or duplicate lines are not allowed.
  • Whitespace characters are not allowed.
  • All lines must adhere to the following structure (spaces and brackets omitted):
    [module_name]   .   [item_name]   ;   [new_item_name]

Renaming items in different modules from a single file

Multiple modules can have their individual items renamed from within a single file, as shown below.

Line by line text format

  Module_A.Item_1;Renamed_Item_1
  Module_A.Item_2;Renamed_Item_2
  Module_B.Item_3;Renamed_Item_3

The Apis Management Studio rename items from file feature

In the example below, we are using the text file renameitems.txt, with the following content:

Worker.Signal1;RenamedSignal1
Worker.Signal2;RenamedSignal2
Worker.Signal3;RenamedSignal3
Worker.Signal4;RenamedSignal4
Worker.Signal5;RenamedSignal5
Worker.Signal6;RenamedSignal6
Worker.Signal7;RenamedSignal7
Worker.Signal8;RenamedSignal8
Worker.Signal9;RenamedSignal9
Worker.Signal10;RenamedSignal10

The process of batch renaming items

  1. In Apis Management Studio:
    Right click the Modules node and select Rename Items From File

    The following dialog appears:

  2. Click the button marked Select File and select the text file to upload.

  3. The dialog displays the current and new item names.
    To prevent an item from being renamed, in the Selected column, de-select the row corresponding to the item.

  4. Click the button marked Rename Items to start the renaming process.


Deleting items from one or more modules using a text file

In Apis Management Studio, the process of batch deleting items is similar to renaming items, however, the file format is somewhat simpler.

The input file text structure

  • A line within a text file is defined as a sequence of characters followed by a line break.
  • Each line in the text file corresponds to an item subjected to removal.
  • Blank or duplicate lines are not allowed.
  • Whitespace characters are not allowed.
  • All lines must adhere to the following structure (spaces and brackets omitted):
    [module_name]   .   [item_name]

Deleting items in different modules from a single file

Multiple modules can have their individual items removed from within a single file, as shown below.

Line by line text format

  Module_A.Item_to_remove_1
  Module_A.Item_to_remove_2
  Module_B.Item_to_remove_3

The Apis Management Studio delete items from file feature

In the example below, we are using the text file deleteitems.txt, with the following content:

Worker.RenamedSignal1
Worker.RenamedSignal2
Worker.RenamedSignal3
Worker.RenamedSignal4
Worker.RenamedSignal5
Worker.RenamedSignal6
Worker.RenamedSignal7
Worker.RenamedSignal8
Worker.RenamedSignal9
Worker.RenamedSignal10

The process of batch deleting items

  1. In Apis Management Studio:
    Right click the Modules node and select Remove Items From File

    The following dialog appears:

  2. Click the button marked Select File and select the text file to upload.

  3. The dialog displays the items marked for deletion.
    To prevent an item from being deleted, in the Selected column, de-select the row corresponding to the item.

  4. Click the button marked Delete Items to start the item removal process.

Registering Multiple Instances

You can install multiple instances of Apis Hive. This is convenient when running unrelated configurations on the same server, when you want to increase stability when interacting with unstable software components, when you want different DCOM security configurations on different instances to restrict access, and more.

You can use the Apis Management Studio or the Command Line to register new instances.


DCOM security

The named instance needs to have its DCOM security setting set, the same as any default Apis Hive instance. The name of the instance will appear in the DCOM Configuration utility, just like any other DCOM server. See DCOM Configuration.


Remove a Named Apis Hive Instance

Use the Apis Management Studio or the Command line to remove a named Apis Hive Instance.

Migrate 32-bit Apis Hive configuration to 64-bit

Apis Hive has been available in both 32-bit and 64-bit versions. The current version only supports 64-bit. The 64-bit version contains fewer modules than the 32-bit version, but modules are continuously being converted to 64-bit.

A configuration created by a 32-bit version of Apis Hive may, at YOUR OWN RISK, be manually migrated to a 64-bit version.

WARNING: Check that all 32-bit modules in your configuration are supported by the 64-bit version before attempting to migrate!

  • Back up your configuration.

  • Manually copy the 32-bit Apis Hive instance registry configuration content (see the example commands after this list) from:

    HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\<INSTANCENAME>

    to:

    HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<INSTANCENAME>

  • Remove 32-bit Apis Hive instances with use of Apis Management Studio or command line

  • Uninstall the 32-bit Apis Hive version.

  • Remove 32-bit Apis Hive instance registry configuration content from:

    HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\<INSTANCENAME>

  • Install the 64-bit Apis Hive version.

  • Add 64-bit Apis Hive instances with use of Apis Management Studio or command line

  • Set the desired DCOM and Windows service settings for the 64-bit Apis Hive instances.
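As a minimal sketch of the manual copy step above, assuming a named instance called MyInstanceName and an elevated Command Prompt, the 32-bit key can be backed up to a file (C:\Temp is just an example location) and then copied with reg.exe; verify the resulting key in RegEdit before continuing:

reg export "HKLM\SOFTWARE\Wow6432Node\Prediktor\Apis\MyInstanceName" C:\Temp\MyInstanceName-32bit.reg /y

reg copy "HKLM\SOFTWARE\Wow6432Node\Prediktor\Apis\MyInstanceName" "HKLM\SOFTWARE\Prediktor\Apis\MyInstanceName" /s /f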

Apis Hive DCOM Configuration

Apis utilises DCOM technology, and is subject to DCOM security considerations in general. Using DCOM security, you can customise who is allowed to access Apis for configuration and data retrieval.

Default settings

When installing Apis for the first time on a computer, the following DCOM settings are applied:

  • The Apis Hive application is registered to run as a Windows NT service, running with the security context of the built-in "System" account.
  • The access rights for the Apis Hive application are set to allow everyone access, i.e. very poor access control.

If you're not comfortable with these settings, you should read the rest of this topic to change them.

NOTE: If you're upgrading or re-installing Apis on a computer, the previously used settings are preserved.

Setting the security context for Apis Hive

The Apis Hive must run in a security context with Administrator privileges. Depending on whether Apis Hive is configured to run as a service or as a DCOM server, there are several options. Launch the Windows Distributed COM Configuration Properties application, DCOMCNFG.EXE, located in the Windows system directory.

Selecting the user account to use to run Apis Hive

From the applications tab in the DCOMCNFG window, locate and select Apis Hive from the list of applications. Click the "Properties" button, then, select the "Identity" tab.

  • If Apis Hive is registered to run as a DCOM server, select a user account with administrator privileges on the local machine.
  • If Apis Hive is registered to run as a service, the lower checkbox named "The system account" will be enabled as well, and should be selected. Note: In some cases, when Apis Hive or its modules communicate with other applications using network services, Apis Hive may need to run under a dedicated user account instead, due to restrictions on network usage for services.

Important! The user account Apis Hive is configured to run under must have the user rights "Log on as batch job" and "Log on as service" enabled.

Enabling remote access and configuration of Apis Hive

To enable remote configuration or access to Apis, select the "Default Properties" tab in the DCOMCNFG window. Make sure the "Enable Distributed COM on this computer" and "Enable COM internet services on this computer" are checked. Also, select the "Default Authentication Level" to "Connect" and "Default Impersonation Level" to "Identify".

Granting access and launch permissions

The last step is to grant access and launch permissions to different users of Apis Hive. Select the "Security" tab from Apis Hive's properties. To customise the access and launch permissions of Apis Hive, select "Use custom access permissions" and "Use custom launch permissions", and press the "Edit" buttons to modify them.

You should always grant access and launch rights to the "Administrators" group and the "System" account. If Apis Hive is accessed through Internet Information Server, for instance, you must enable access permission for the user configured in Internet Information Server, particularly when used together with Apis Process Explorer, Apis Jar, or the Apis OLEDB Providers. This user is by default IUSR_ComputerName, where ComputerName is the name of the computer running Internet Information Server. Further, if remote access and configuration are to be allowed, you must also grant the "Network" account access to Apis Hive.

Other Apis executables, such as the Apis historian services, Apis Honeystore and Apis OPCHDA, also need the same security configuration. So, instead of configuring each application individually, you can set each of them to use the default access, launch and configuration permissions, and instead configure the defaults under Security in the "Distributed COM Configuration Properties" page. Make sure the Apis account and the web server account have full access for all three types of permissions.

Note that in systems with other DCOM servers, this might compromise security policies for those servers.

Apis Environment Variables

There are some system environment variables that can be set and shared between Apis products.

You can use these two variables to conveniently separate configuration and data files from the default location. The default location is under the installation directory, e.g. %PROGRAMFILES%\APIS.

APIS_CONFIG_PATH

This system environment variable specifies a base directory for the configuration files for all ApisHive instances and ApisHoneyStore. The ApisHive instances and ApisHoneyStore will create subfolders as needed, typically to store instance-specific files.

APIS_DATA_PATH

This system environment variable specifies a base directory for the data files. Typically, the Apis HoneyStore and Chronical databases use this variable as a default/suggested location for their data files.
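As a minimal sketch, assuming the example locations D:\ApisConfig and D:\ApisData, the variables can be set system-wide from an elevated Command Prompt (restart the Apis services afterwards so they pick up the new values):

setx APIS_CONFIG_PATH "D:\ApisConfig" /M

setx APIS_DATA_PATH "D:\ApisData" /M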

The Apis Hive Command Line

The Apis Hive binary file is ApisHive.exe. Change to the directory "<ApisDIR>\Bin" in a Command Prompt to execute the commands.

How to run Apis Hive

You can register the Apis Hive application to run in one of two possible configurations: as a DCOM server or as a Windows service.

Running Apis Hive as a DCOM server

To run Apis Hive default instance as DCOM server, execute the following command:


ApisHive -Regserver

Running Apis Hive as a Windows service

To run Apis Hive default instance as service, execute the following command:


ApisHive -Service

Re-register the Apis Hive application

If you've had Apis Hive installed on your computer earlier, you can register Apis Hive to run as the same application type as before. To do so, execute the following command:


ApisHive -Install

Registering a new named Apis Hive instance

To register a named Apis Hive instance, execute the following command from a command prompt:


C:\Apis\Bin>ApisHive.exe -MyInstanceName -AddInstance

Now you have an instance called MyInstanceName. The named instance is, by default, registered to run as a Windows service and can now be started from the Services Control Panel Applet, just like any other Windows service.

Registering a named instance as a COM server

To register a named Apis Hive instance to run as a COM server, execute the following command from a command prompt:


C:\Apis\Bin>ApisHive.exe -MyInstanceName -Regserver

The named instance is now registered to run as a COM server, and can be started from the command prompt:


C:\Apis\Bin>ApisHive.exe -MyInstanceName

Registering a named instance as a Windows service

To register a named instance to run as a Windows service, execute the following command at the command prompt:


C:\Apis\Bin>ApisHive.exe -MyInstanceName -Service

The named instance can now be started from the Services Control Panel Applet, just like any other Windows service.

Remove a named Apis Hive instance

Execute the following command at the command prompt:


C:\Apis\Bin>ApisHive.exe -MyInstanceName -RemoveInstance

List all instances

To list all Apis Hive instances registered on a computer, execute the following command on the command prompt:


C:\Apis\Bin>ApisHive.exe -ListInstances

When executing this command, registry files will also be exported to the respective config directories. These include all CLSIDs necessary to allow remote launching/activation/access from other Apis clients and OPC servers on a remote computer. This can also be useful if you want to assure that all named Apis Hive instances in a computer farm have the same CLSIDs on the same computers.

Windows Registry

Apis Hive instances are configured with Apis Management Studio, but some settings must be edited manually in the Windows Registry. These settings are stored under the registry key

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<INSTANCENAME>

The following topics have further details on the available settings:

Setting Event Broker Priorities

To ensure the Event Broker executes as expected, you can increase its priority on systems running heavy tasks. Do this by setting the following registry entries.

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\ApisEventBroker]

"MainThreadPriority"="2"  
"ExecutionThreadPriority"="2"

The supported values of these entries are:

Priority | Meaning
1 | ABOVE_NORMAL: Priority 1 point above the priority class
-1 | BELOW_NORMAL: Priority 1 point below the priority class
2 | HIGHEST: Priority 2 points above the priority class
-15 | IDLE: Base priority of 1 for IDLE_PRIORITY_CLASS, BELOW_NORMAL_PRIORITY_CLASS, NORMAL_PRIORITY_CLASS, ABOVE_NORMAL_PRIORITY_CLASS, or HIGH_PRIORITY_CLASS processes, and a base priority of 16 for REALTIME_PRIORITY_CLASS processes
-2 | LOWEST: Priority 2 points below the priority class
0 | NORMAL: Normal priority for the priority class
15 | TIME_CRITICAL: Base priority of 15 for IDLE_PRIORITY_CLASS, BELOW_NORMAL_PRIORITY_CLASS, NORMAL_PRIORITY_CLASS, ABOVE_NORMAL_PRIORITY_CLASS, or HIGH_PRIORITY_CLASS processes, and a base priority of 31 for REALTIME_PRIORITY_CLASS processes

The MainThreadPriority is the priority of the main-loop or event delegator thread of the Event Broker.

The ExecutionThreadPriority is the priority of the worker or execution threads of the Event Broker.
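The entries above can also be created from an elevated Command Prompt. This is a minimal sketch assuming the default ApisHive instance; it uses string values, matching the format shown above:

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\ApisEventBroker" /v MainThreadPriority /t REG_SZ /d 2 /f

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\ApisEventBroker" /v ExecutionThreadPriority /t REG_SZ /d 2 /f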

Tracking OPC Server Write Operations

Apis can track all write operations done by Apis clients. To enable this, set the following registry value to 1:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\RuntimeSettings\TrackOPCWrites
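A minimal sketch for the default ApisHive instance, assuming the value is a DWORD (it is created if it does not already exist):

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\RuntimeSettings" /v TrackOPCWrites /t REG_DWORD /d 1 /f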

Delaying Runtime State Transitions

It's possible to set a delay between some of the different runtime-states when Apis Hive is starting. This might be desirable when some external application (e.g. OPC server) started indirectly by Apis Hive needs time to load before it can accept or handle any queries from another program.

At present, these settings must be configured through the Windows registry, using the RegEdit.exe application located in the Windows directory. In this program, locate the registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive

Under this key, create a DWORD value for each of the delay types listed in the table below that you want to take advantage of.

Name of DWORD value | Meaning
CreateToConfigureDelay | Specifies a delay in milliseconds between the runtime-states "Modules created" and "Configuration loaded".
ConfigureToAcquireResourcesDelay | Specifies a delay in milliseconds between the runtime-states "Configuration loaded" and "Resources acquired".
AcquireResourcesToStartDelay | Specifies a delay in milliseconds between the runtime-states "Resources acquired" and "Started".

The actual value of the respective DWORD values gives the delay in milliseconds.

Hierarchical OPC Namespace

The Apis Hive OPC server can expose a flat or hierarchical OPC namespace to third party OPC DA clients connecting to Apis Hive.

By default, a flat namespace is exposed. A hierarchical namespace can be configured using the registry settings for your Apis Hive configuration. The following is an example of such a configuration, with the registry keys inside the brackets and their values below:


[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace]

"EnableHierarchicalNS"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\OPC]

"Filter1"="5003==OPC.*"

"ShowItems"=dword:00000000

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\OPC\Random]

"Filter1"="5003==OPC.Random.*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker]

"Filter1"="5003==Worker.*"

"ShowItems"=dword:00000000

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker\Simulation Signals]

"Filter1"="5003==Worker.Signal*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker\Various Signals]

"Filter1"="5003==Worker.Variable*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Worker\Timestamp]

"Filter1"="5003==Worker.Time*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Namespace\Alltimestamps]

"Filter1"="1==7"

Concept

To enable/disable a hierarchical namespace, set the registry value EnableHierarchicalNS to 0 or 1.

Each registry key represents a branch or tree node. Underneath each key, you can specify zero or more filters that apply to the items at this level in the namespace. All filters are also inherited from the parent keys/branches. A filter value has an arbitrary name, but the value must be of the form <AttributeID><Operator><FilterValue>:

  • AttributeID - The attribute ID, as specified in "Predefined Apis Hive attributes" or OPC DA Item attributes.
  • Operator - One of: ==, <>, <, <=, >, >= (equal, not equal, less than, less than or equal, greater than, greater than or equal).
  • FilterValue - The filter value the specified attribute must match. For suitable types, wildcards are accepted. (See "Using wildcards")

Additionally, we can specify a value named "ShowItems" underneath any key in the namespace hierarchy, which specifies whether or not the items at this level should be visible. If this value is missing, the items are shown by default.

Advanced module configuration

Changing default number of items per module

By default, the namespace in a single ApisHive instance can have a maximum of 4096 modules, with a maximum of 1048575 items in each module.

If you for some reason want to change this, i.e. to allow for more than 4096 modules in an instance or more than 1048575 items in a module, you must add or change an entry in the Windows registry.

Eg. to have a maximum of 65536 modules, with a maximum of 65536 items in each module, add/change the following registry entry:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules]
"MaxItemsInModule"=dword:0000ffff

The upper and lower limits for the MaxItemsInModule value, are:

  • Minimum (256 items/16777215 modules): "MaxItemsInModule"=dword:000000ff
  • Maximum (16777215 items/256 modules): "MaxItemsInModule"=dword:00ffffff

Please note that when changing this value for a named instance, modify the path of the registry key to reflect the name of the instance, e.g.:
[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\MyInstanceName\Modules]

Please note that the registry entry is not present by default, and must be added if it has not been used before. Also, after changing this value, the Hive instance must be restarted for the change to take effect.

Disable automatic resolution of external items when using ExternalItem filter attributes

When adding, deleting or renaming items or modules in an ApisHive instance using ExternalItem filters, external items are by default automatically resolved at runtime. On huge configurations this might be time consuming, as all item connections potentially need to be resolved again.
So, if you perform the add/delete/rename operation(s) at a time when you can restart your Hive configuration, you can add or change the following registry entry:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules] "AutoResolveExternalItemFilters"=dword:0

Then restart your Hive instance, perform your desired add/delete/rename operation(s), stop your Hive instance, and revert the following registry entry:

[HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules] "AutoResolveExternalItemFilters"=dword:1

and finally restart your Hive instance. All external item connections caused by using ExternalItem filters have now been updated according to the new names.

See also: ExternalItem filters

Temporarily Disabling Modules

You can disable a module from being loaded into the Apis Hive configuration.

Disabling a module

Locate the registry for your Apis configuration:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules

Here, each of your configured modules has its own registry key. Open the key of the module to disable. Underneath this key, e.g. for a module named OPC:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHive\Modules\OPC

create a DWORD value named Disabled. To disable the module, set the value to 1; to enable the module, set the value to 0.

Note! You have to restart Apis for this to take effect!
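A minimal command-line sketch for the example module OPC in the default ApisHive instance; run it from an elevated Command Prompt and then restart Apis:

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\Modules\OPC" /v Disabled /t REG_DWORD /d 1 /f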

Disable federated timeseries-access

By default, ApisHive will federate timeseries-requests to an upstream datasource if the requested timeperiod is not available locally and the source of the requested timeseries supports historical access.

This feature can be disabled by setting the registry value DisableFederation under HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\[INSTANCE]\TimeSeriesAccess to 1.
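A minimal sketch for the default ApisHive instance, assuming the value is a DWORD; replace ApisHive with your instance name if needed:

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive\TimeSeriesAccess" /v DisableFederation /t REG_DWORD /d 1 /f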

Customizing dump-file generation

If a serious error occurs in Apis Hive or one of its modules, a dump file will be generated. The dump file contains detailed information about the state of the process at the time of the error, and can be used to debug the error.

Using APIS services to generate dump-files

By default, the APIS services (ApisHive, ApisHoneystore, ApisOPCHDA) are set up to generate a dump file with full memory, and to store the dump file(s) in the folder <APIS-installdir>\Bin.
Since the dump-files are generated with full memory (MiniDumpWithFullMemory), they can be quite big.
To customize what to include in the dump-files, please modify the MiniDumpType registry value of the APIS service in question. These registry values are located at:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\[INSTANCE]\MiniDumpType
HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore
HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore\OPCHDAConfig

By default, this value is set to 2, which means that a dump-file with full memory will be generated (MiniDumpWithFullMemory).
Typically, to reduce the size of the dump-files, enter a MiniDumpType value of 0 (= MiniDumpNormal).
For a full set of valid flags, please refer to the MINIDUMP_TYPE enumeration.

When letting the APIS services generate dump-files, the MiniDumpType registry value is the only setting you can modify. For more advanced, custom options, please refer to the next section.

Using Windows Error Reporting (WER) to generate dump-files

In some situations, it might be desirable to use Windows Error Reporting (WER) to generate dump-files instead of the APIS services. For instance, if an APIS service is eating memory or experiencing a stack overflow, the service itself is not able to generate a dump-file. For these situations, to set up WER to generate dump-files, you must modify the registry value MiniDumpType of the APIS service in question and set its value to 0xFFFFFFFF (or 4294967295 in decimal).
Then, typically, you want to further specify how and where WER generates dump-files. These options are controlled by registry values under:
HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\Windows Error Reporting\LocalDumps\ApisHive.exe
HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\Windows Error Reporting\LocalDumps\ApisHoneystore.exe
HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\Windows Error Reporting\LocalDumps\ApisOPCHDA.exe
and values such as the following:

To tell WER what kind of dump type to generate, enter a valid MINIDUMP_TYPE enumeration in value DumpType.
To override the default location of the dump-file, enter a valid path in value DumpFolder.
To limit the number of dump-files to keep, enter a valid count in value DumpCount.

For a complete set of options, please refer to WER Settings
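As a minimal sketch for the ApisHive service, assuming a full user-mode dump (DumpType 2), an example dump folder D:\Dumps and at most 5 dumps kept; the first command switches the APIS service over to WER as described above:

reg add "HKLM\SOFTWARE\Prediktor\Apis\ApisHive" /v MiniDumpType /t REG_DWORD /d 4294967295 /f

reg add "HKLM\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps\ApisHive.exe" /v DumpType /t REG_DWORD /d 2 /f

reg add "HKLM\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps\ApisHive.exe" /v DumpFolder /t REG_EXPAND_SZ /d "D:\Dumps" /f

reg add "HKLM\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps\ApisHive.exe" /v DumpCount /t REG_DWORD /d 5 /f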

Apis Honeystore

Apis Honeystore is a time-series database, designed to provide the best possible performance on standard PC Hardware. The time-series may be regularly sampled over time, or based on events/changes if appropriate.

Compared to competing SQL-based solutions, it has higher performance, both when storing and retrieving data. Instead of providing an SQL engine for data retrieval, it provides a standard OPC Historical Data Access interface, and optionally OLE DB/ADO and Internet Information Server ISAPI.

The combination of high performance and standard client interfaces makes Apis Honeystore an ideal choice for an industrial database solution. You can sample all the data you want, and access it from almost any PC-based business or automation software platform.

Capacity examples

The limitations lie in the hardware. When sampled mode is used, and compression is on, each sample will typically consume 0.5-1 bytes of data. As an example, on a low cost 4x80 GB IDE RAID solution, the system can be configured as follows:

  • Storing 7 500 signals per second for a year.
  • Storing 15 000 signals every 10 seconds for 5 years.
  • Storing 1 500 signals every 100 milliseconds for 6 months.

You may set up a mix of storage scenarios on a single system, each with different horizon and sample interval. If you add more hard disk capacity to your system, the capability will scale accordingly. When event-based mode is used, you will in most cases be able to store even more. The horizon is in this case not deterministic.
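The examples above can be sanity-checked with simple arithmetic. A small sketch, assuming roughly 1 byte per compressed sample (the upper end of the 0.5-1 byte range quoted above):

    # Rough storage estimates for the capacity examples above.
    def storage_gb(signals, samples_per_second, days, bytes_per_sample=1.0):
        samples = signals * samples_per_second * days * 86400
        return samples * bytes_per_sample / 1e9

    print(storage_gb(7500, 1.0, 365))        # ~237 GB: 7 500 signals per second for a year
    print(storage_gb(15000, 0.1, 5 * 365))   # ~237 GB: 15 000 signals every 10 s for 5 years
    print(storage_gb(1500, 10.0, 182))       # ~236 GB: 1 500 signals every 100 ms for 6 months

Each of these lands around 235-240 GB, which fits on the 320 GB (4x80 GB) array in the example.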

Apis Honeystore features

  • OPC Historical Data Access interface, supporting the following optional interfaces/methods:
    • IOPCHDA_SyncRead::ReadProcessed: Retrieving the following aggregated values over the resample intervals:
      • Maximum/Minimum;
      • First order interpolated data;
      • Standard deviation and variance;
      • Average;
      • Sum;
      • Sample count.
    • IOPCHDA_Playback, Playback of both raw and aggregated historical data.
  • Non-lossy data compression.
  • Namespace browsing.
  • Storage capacity and performance scales with the hardware provided.
  • Both time-fixed resolution and event-based storage.
  • Different sampling rates can be configured for different sets of tags.
  • Time-series of vectors (e.g. NIR-spectra) and matrices as single items.
  • Sampled data contains:
    • Data value only;
    • Data value and associated OPC DA quality;
    • Data value, associated OPC DA quality and a custom timestamp.
  • Seamless integration of Apis Hive with the ApisLoggerBee module, enabling sampling of data available from third party OPC servers and/or drivers, towards other proprietary interfaces and protocols.
  • Easy and remote configuration through the user interface of Apis Hive and Honeystore using Microsoft Management Console or Apis Management Studio.
  • If errors occur, a tray icon will appear with a red stop or warning sign, flashing periodically, notifying you.
  • All information, warning and error messages are written to log files, enabling remote surveillance and troubleshooting of Apis Honeystore through the log viewer in Apis Management Studio.
  • Visual Basic support through the OPC HDA automation wrapper.
  • A custom DCOM interface available for 3rd party interfacing

Database Properties

Honeystore consists of databases. A database is a collection of historical values for items, and has a set of properties. Databases can be created and deleted while Honeystore is running.

Database Properties

The Honeystore database has the following properties:

Name | Description | Changeability
FileVersion | The file version of the meta-storage | Read-only
Name | Name of the database | Read-write
CachePath | The path of the location where the cache-file will be placed | Read-write
DataDirPath | The path of the location on disk where all the historical files are stored | Read-write
ConfigFile | The full path and name of the configuration file of the database | Read-write
RunningMode | The running mode of the database; see Running Modes for a description of the various modes | Read-write
EventSuppression | A property deciding which events will be suppressed from the event-log reporting mechanism | Read-write
MaxItems | The current maximum item count of the database | Read-write
UsedItems | The currently used item count in the database | Read-only
CacheSize | The size (in bytes) of the internal caches of the database. See also the Cache Size section. | Read-write
MaxTrendFileSize | The maximum size (in bytes) an active trend-file may have. No retroactive effect if changed. | Read-write
MaxDataBlockSize | The maximum size (in bytes) a single contiguous data-block may have. Used when importing and logging data. No retroactive effect if changed. | Read-write
CreationDate | The date when the database was created | Read-only
ModifiedDate | The date any of the properties of the database were last modified | Read-only
DBHandle | The current handle of the database | Read-only
CompressionState | The compression state of the database | Read-write
TimestampPrecision | The precision of the time-stamps; applies only to "Eventbased with quality" data samples | Read-write
Capabilities | The capabilities of a Honeystore database; to allow or deny inserts of out-of-sequence (OOS) data in Online mode. Note! When allowing inserts of out-of-sequence data in Online mode, the inserts are handled asynchronously. That means that if you read back history immediately after an Insert call, you are not guaranteed to read back the OOS data you just inserted. OOS data inserts are put into a queue and will be inserted into history without interfering with the real-time capabilities of the database. | Read-write
AvgCacheFlushTime | The average time in milliseconds used when flushing a cache into its corresponding trend file. Only applies when RunningMode is "Online no write-cache". | Read-only
MaxCacheFlushTime | The maximum time in milliseconds used when flushing a cache into its corresponding trend file. Only applies when RunningMode is "Online no write-cache". | Read-only
NumLostCaches | The number of lost caches due to performance issues. A value other than 0 indicates data loss caused by performance issues. Only applies when RunningMode is "Online no write-cache". | Read-only
NumCacheFlushes | The number of cache flush iterations, i.e. the number of individual item caches that have been flushed since the database was loaded. Only applies when RunningMode is "Online no write-cache". | Read-only

Most of the properties, marked as read-write, can be changed directly. The properties marked as read-only are either changed indirectly (like the number of used items, which varies as items are added to or removed from the database), or simply display information and cannot be changed (like the cache flush time properties).

Compression State

The data of a HoneyStore database is typically compressed, and several compression types are offered.

The various compression states of a database are:

Name | Description
Uncompressed | No compression, raw data stored as is.
Legacy_Fast | Lossless, old maximum compression. Fast.
Legacy_Max | Lossless, old maximum compression. Very slow, but offers the highest compression ratio.
LZ4_2 | Lossless LZ4 compression, acceleration level 2 (default). Best compromise of speed/compression ratio.
LZ4_8 | Lossless LZ4 compression, acceleration level 8. Slightly better speed, lower compression ratio, than LZ4_2.
LZ4_56 | Lossless LZ4 compression, acceleration level 56. Even better speed, and lower compression ratio, than LZ4_8.

LZ4 compression disclaimer

LZ4 Library
Copyright (c) 2011-2016, Yann Collet
All rights reserved.
www.lz4.org

Database meta data storage

HoneyStore may store its meta data in several ways:

Windows registry and storage file.

In the Windows registry, there is a key for each database, telling where the actual configuration file of the database is located. Using MYDB as an example, this registry key would be:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases\MYDB

Underneath this registry key, there will be one string value named ConfigFile, whose value will be the full path and filename of the configuration file of that database.
For our example database, when using the ApisNativeStorage file format:

ConfigFile = ApisDIR\Config\ApisHoneystore\MYDB.ansb

or (if using older ApisStructuredStorage file format):

ConfigFile = ApisDIR\Config\ApisHoneystore\MYDB.acdb

where ApisDIR typically is C:\Program Files\APIS.

Historical data files

The historical data files of the database are located at:

DATADIR\MYDB.dat*

where DATADIR is the DataDirPath property of the database.

Item Properties

An item, also known as a tag or signal, represents a data value of some sort, either originating from an external system (PLC, DCS, etc.) or calculated/derived from such values within the Apis environment.

Item types

There are three main item types: scalar values, vector values and matrix values.

A scalar value may typically be a flow signal, temperature, etc. from an external system. A vector value typically is a spectrum from a NIR instrument or a control vector in a model based predictive control (MPC) system, and basically is a one-dimensional array. A matrix value typically is a system matrix in an MPC system, and basically is a two-dimensional array.

Supported data values

Apis Honeystore supports all common data types, i.e. 32/64-bit floating point values, 8/16/32/64-bit signed/unsigned integer values, boolean values, date values and string values, as well as arrays (vectors/matrices) of all these types.

Apis Honeystore items have several properties (also called attributes) associated with them. Some of these properties are mandatory for all items. Other properties may apply or exist only for some items; this applies to properties like description, engineering units, normal minimum/maximum values, etc. The properties are added to items and maintained by the application writing to the database, typically an ApisLoggerBee module within the Apis Hive framework.

Mandatory properties

Name | Description
Handle | An internal handle (placeholder) of the item.
ItemID | The name of the item. This name is unique within one database.
Datatype | The data type of the item, e.g. floating point, boolean, string, etc.
Recordtype | The record type of the item. This tells what and how the item is sampled. Three types are defined: Sampled (only data values are stored at a regular sample interval); Sampled with quality (data values and quality information are stored at a regular sample interval); Eventbased with quality (data values and quality information are stored each time the value or quality changes).
Resolution | For the two Sampled record types: the sampling interval of the item. For the Eventbased with quality record type: the minimum period between two subsequent samples.
HistoryLength | The minimum time horizon of the trend of an item, meaning that trend data will not be deleted/overwritten until at least the time period specified by this attribute has elapsed.
SampleSize | The uncompressed size of one data sample in bytes.

Optional properties that may occur

Name | Description
Description | A description of the item
Engineering unit | Specifies the label to use in displays to define the units for the item (e.g., kg/sec)
Normal maximum | The upper limit of the normal value range for the item. Used for trend display default scaling etc.
Normal minimum | The lower limit of the normal value range for the item. Used for trend display default scaling etc.
PDS Engineering Unit | The PDS Engineering unit (engineering unit) associated with this item. This attribute only has meaning inside an Apis Click & Trace™ configuration
Display | Operator display associated with the item
HiLimit | The Hi alarm limit of this item
HiHiLimit | The HiHi alarm limit of this item
LoLimit | The Lo alarm limit of this item
LoLoLimit | The LoLo alarm limit of this item
UpperBound | The upper bound of a vector item
LowerBound | The lower bound of a vector item
ArgumentItem | The argument item of this item; the meaning is client specific
MaxCacheDuration | Determines the maximum duration (in seconds) of the item cache before it is flushed to disk.

Estimating memory requirements of a database

A database requires physical memory (RAM) and disk space. The following gives more information about these requirements.

RAM usage

Generally, the database uses approximately (2 × cache size + 325) bytes of RAM per item, i.e. [(2*(cache size) + 325)*(number of items)] bytes in total. Although the database can function using virtual memory, the speed is considerably improved when physical memory is used. E.g. the default cache size of 10040 bytes and 10000 sampled items gives a RAM usage of approximately 195 MB. However, since the operating system and other applications also use memory, this figure is only a guideline when configuring the hardware. As a rule of thumb, when it comes to RAM, it's better to have too much rather than too little.
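As a quick check of the rule of thumb above (values taken from the example in the text, not measured):

    # RAM estimate: (2 * cache_size + 325) bytes per item.
    def ram_bytes(cache_size, num_items):
        return (2 * cache_size + 325) * num_items

    print(ram_bytes(10040, 10000) / 2**20)   # ~194.6 MB for the default cache size and 10 000 items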

Disk usage

Disk usage is generally linear with the trend horizon and item size. E.g. an item with data type "32 bit floating point", sampled with quality, at a resolution of 60 seconds, and a history length of one year, will occupy approximately [(4+4)*60*24*365] bytes, i.e. about 4 MB of disk space. If data compression is used, the space required can be considerably reduced. Typically, Apis Honeystore compresses data 10-20 times, depending on the variation of the data stored. Worst-case compression is approximately 2 times. For a database of 10000 such items, the total disk capacity requirement will be about 40 GB uncompressed, and typically 4 GB compressed. When specifying the size of your disk drives, remember that disks close to their capacity slow down and show poor performance. Typically, you always want a minimum of 10% free disk space.
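The same example, worked as a small sketch (8 bytes per sample: a 4-byte float plus a 4-byte quality, one sample per minute):

    # Disk estimate for the example above, uncompressed.
    bytes_per_sample = 4 + 4
    samples_per_year = 60 * 24 * 365              # one sample every 60 seconds
    per_item = bytes_per_sample * samples_per_year
    print(per_item / 1e6)                         # ~4.2 MB per item, uncompressed
    print(10000 * per_item / 1e9)                 # ~42 GB for 10 000 items, uncompressed
    print(10000 * per_item / 10 / 1e9)            # ~4.2 GB with the typical 10x compression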

Cache Size

The Apis Honeystore uses internal caching to achieve high performance. The cached data is also mirrored to disk, to gain high data persistence. The size of this cache is determined by the property "CacheSize". In situations where you are running a high-performance configuration, i.e. rapid sampling of many items into a database, it might be desirable to specify a different value than the default.

As an absolute minimum, the cache must be big enough to contain at least one data sample, meaning that the size of the cache must be bigger than the "SampleSize" attribute of the biggest item in the database. Further, the cache should be big enough to hold more than one sample for an item. How many samples it should be able to hold depends on how fast you are sampling your data (the Resolution attribute). The cache size should not be too small, so as not to stress your system unnecessarily; as a rule of thumb, the cache duration should not go below 10 minutes. A cache duration of several hours (or days) is a typical choice.

Calculating cache size based on desired cache duration

Cache duration in hours (formula):

    Cache duration (hrs) = (Cache size * Sampling rate in sec.) / (3600 * Sample size)

Example: a cache size of 10040, a sample size of 8 and a sampling rate of 10 sec. would yield a cache duration of approximately 3.5 hours.
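The same calculation as a small sketch:

    # Cache duration (in hours) from the formula above.
    def cache_duration_hours(cache_size_bytes, sampling_rate_seconds, sample_size_bytes):
        return (cache_size_bytes * sampling_rate_seconds) / (3600.0 * sample_size_bytes)

    print(cache_duration_hours(10040, 10, 8))    # ~3.49 hours, matching the example above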

Running Modes

The Apis Honeystore databases have a property named "Running mode". The property denotes an internal state or mode of the database, and can take one of the following values:

Online no write-cache | The database is online, but without use of the write cache. Client applications can still retrieve data from the database, but only bulk inserts can be used when writing data to the database.
Admin | The database is partially loaded, but not capable of storing or delivering data to client applications. This mode is used internally by Apis Honeystore when changing properties, etc.
Disabled | The database has not been loaded and does not require any system resources. The database cannot handle any client requests in this state.

Text File Format

This section describes the text file format used for data import.

Sample import files are provided, located in directory: [Apis-Dir] sample files.

The text files are divided into two logical parts, a header and a body.

The header declares all items into which you want to import data. It must contain enough information to create the items, if they don't exist. The header has 7 rows.

Line name | Description | Legal values
ColumnHeader | First line in the file. Defines how to interpret each column. The 4 column types are: None - the column is neglected during parsing; Time - the column contains time stamps; Quality - the column contains quality information; Item - the column defines the item to import. | None, Time, Quality, Item
ItemID: | 2nd line in the file, contains item IDs in Item columns, otherwise ignored. | Legal item IDs
Vartype: | 3rd line in the file, contains the data type in Item columns, otherwise ignored. | VARTYPE, either as number or text
Recordtype: | 4th line in the file, contains the item "RecordType" in Item columns, otherwise ignored. Sampled = 1; Sampled with quality = 2; Event based = 3. | 1, 2, 3
Resolution: | 5th line in the file, contains the item "Resolution" in Item columns, otherwise ignored. Specify as an integer in milliseconds. | Integer
HistoryLength: | 6th line in the file, contains the history length for the item in Item columns, otherwise ignored. Specify as an integer in seconds. | Integer
Dimension: | 7th line in the file. This line is optional and should only be included when one or more of the items in the file is a vector/matrix; for import of only scalar items, omit this line. Specifies the dimension of a vector (N), or the size of a matrix (MxN). | N or MxN

The leftmost column of line names can be omitted from the file; the semantics of the header are still the same. Omitting the line names makes editing the text file easier in applications like Excel, since the header and body data for an item are then placed in the same column. Otherwise, the header data is displaced one column to the right of its corresponding body data.

Body

The body of the text file defines the actual data to import. The contents of the columns in the body are interpreted differently, depending on the column type:

Column type | Description | Legal values
None | Tells the file interpreter to ignore the contents of this column. This is a nice feature when importing data from files generated by other software, which may contain columns of no interest/meaning. | Anything
Time | In this column, the timestamps of the data samples are specified. The first line must specify the timestamp of the first sample to import. The following rows for the item may contain either an exact timestamp, or a time difference in milliseconds from the previous timestamp. When specifying a difference, no further differences need to be specified unless the time difference between two samples changes. If a time column is left out for an item, the timestamps are taken from the last valid timestamp to the left of the item that was successfully interpreted. This means that if you import data having equal timestamps, you only have to specify one time column. The timestamps are parsed using the locale provided during the import wizard, and the format of the timestamps must match this. | timestamp or difference
Quality | In this column, the qualities of the samples are specified, when wanted. If no quality is specified, it defaults to good quality. When a quality is specified, it is valid for all consecutive samples for that item, until a new quality is specified. Specify a quality as an integer value (either decimal or hexadecimal), according to the OPC HDA and DA specifications. Bits 31-16 are the OPC HDA quality, bits 15-0 are the OPC DA quality; all valid qualities are provided below. | integer
Item | In this column, the actual data values for the items are specified. Each line must contain a string value that can be converted to the Vartype specified in the header of the item, using the locale provided during the import wizard. Typically, the decimal separator of the text data to import must match. | value string
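For illustration, a small sketch of how such a quality integer can be composed and decomposed according to the bit layout described above (0x00C0 is the standard OPC DA "good" quality code):

    # Bits 31-16: OPC HDA quality, bits 15-0: OPC DA quality.
    def pack_quality(hda_quality, da_quality):
        return ((hda_quality & 0xFFFF) << 16) | (da_quality & 0xFFFF)

    def unpack_quality(quality):
        return (quality >> 16) & 0xFFFF, quality & 0xFFFF

    q = pack_quality(0x0000, 0x00C0)     # plain OPC DA "good" quality
    print(hex(q), unpack_quality(q))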

OPC HDA Qualities

OPC DA Quality

Relative Time

Relative time is used to specify times and time spans when fetching data.

The format for the relative time is: keyword+/-offset+/-offset… where keyword and offset are as specified in the tables below.

Keyword | Description
SECOND | The start of the current second
MINUTE | The start of the current minute
HOUR | The start of the current hour
DAY | The start of the current day
WEEK | The start of the current week
MONTH | The start of the current month
YEAR | The start of the current year
NOW | The current UTC time as calculated on the server

Offset | Description
S | Offset from time in seconds
M | Offset from time in minutes
H | Offset from time in hours
D | Offset from time in days
W | Offset from time in weeks
MO | Offset from time in months
Y | Offset from time in years

White space is ignored.

All keywords and offsets are specified in uppercase.

Examples

  • "NOW - 1H" The UTC time on the server when reading the history and 1 hour back.
  • "NOW -1D + 1M + 45S" The UTC Time minus one day plus one minute and 45 seconds.

“DAY-1D+7H30M” could represent the start time of a data request for a daily report beginning at 7:30 in the morning: DAY is the first timestamp of today, -1D makes it the first timestamp of yesterday, +7H takes it to 7 a.m. yesterday, and +30M makes it 7:30 a.m. yesterday (the sign of the last term is carried over from the preceding term).

Similarly, “MONTH-1D+5H” would be 5 a.m. on the last day of the previous month, “NOW-1H15M” would be an hour and fifteen minutes ago, and “YEAR+3MO” would be the first timestamp of April 1 this year.

Resolving relative timestamps is based upon what Microsoft has done with Excel, thus for various questionable time strings, we have these results:

  • 10-Jan-2001 + 1 MO = 10-Feb-2001
  • 29-Jan-1999 + 1 MO = 28-Feb-1999
  • 31-Mar-2002 + 2 MO = 30-May-2002
  • 29-Feb-2000 + 1 Y = 28-Feb-2001

Month: if the answer falls in the gap, it is backed up to the same time of day on the last day of the month.

Year: if the answer falls in the gap (February 29), it is backed up to the same time of day on February 28.

Note that the above does not hold for cases where one is adding or subtracting weeks or days, but only when adding or subtracting months or years, which may have different numbers of days in them.
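One way to reproduce the month-offset results above is to add months one at a time and back the date up to the last day of the month whenever the original day does not exist; a sketch (standard library only, positive offsets only):

    import calendar
    import datetime

    # Add months one at a time, clamping to the last day of the month when needed.
    def add_months(ts, months):
        for _ in range(months):
            year = ts.year + (1 if ts.month == 12 else 0)
            month = 1 if ts.month == 12 else ts.month + 1
            day = min(ts.day, calendar.monthrange(year, month)[1])
            ts = ts.replace(year=year, month=month, day=day)
        return ts

    print(add_months(datetime.datetime(2001, 1, 10), 1))   # 2001-02-10
    print(add_months(datetime.datetime(1999, 1, 29), 1))   # 1999-02-28 (clamped)
    print(add_months(datetime.datetime(2002, 3, 31), 2))   # 2002-05-30 (clamped in April)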

OPC HDA Aggregates

Aggregate name | Description | Implemented by APIS
Interpolative | 1st-order interpolated values. | x
Total | The totalized value (time integral) of the data over the resample interval. | x
Average | The average data over the resample interval. | x
Time Average | The time-weighted average data over the resample interval. | x
Count | The number of raw values over the resample interval. | x
Standard Deviation | The standard deviation over the resample interval. | x
Minimum Actual Time | The minimum value in the resample interval and the timestamp of the minimum value. | x
Minimum | The minimum value in the resample interval. | x
Maximum Actual Time | The maximum value in the resample interval and the timestamp of the maximum value. | x
Maximum | The maximum value in the resample interval. | x
Start | The value at the beginning of the resample interval. The time stamp is the time stamp of the beginning of the interval. | x
End | The value at the end of the resample interval. The time stamp is the time stamp of the end of the interval. | x
Delta | The difference between the first and last value in the resample interval. |
Regression Line Slope | The slope of the regression line over the resample interval. |
Regression Line Constant | The intercept of the regression line over the resample interval. This is the value of the regression line at the start of the interval. |
Regression Line Error | The standard deviation of the regression line over the resample interval. |
Variance | The variance over the sample interval. | x
Range | The difference between the minimum and maximum value over the sample interval. | x
Duration Good | The duration (in seconds) of time in the interval during which the data is good. | x
Duration Bad | The duration (in seconds) of time in the interval during which the data is bad. | x
Percent Good | The percent of data (1 equals 100 percent) in the interval which has good quality. | x
Percent Bad | The percent of data (1 equals 100 percent) in the interval which has bad quality. | x
Worst Quality | The worst quality of data in the interval. | x
Annotations | The number of annotations in the interval. |

In addition, APIS Honeystore also implements the following vendor specific aggregates.

Aggregate name | Description
Sum | The sum of all raw values over the resample interval.
Interpolative zero-order | 0th-order interpolated values, aka sample-and-hold.
Median | The median is the numeric value separating the higher half of a set of values from the lower half. If a < b < c, then the median of the list {a, b, c} is b, and if a < b < c < d, then the median of the list {a, b, c, d} is the mean of b and c, i.e. (b + c)/2.
MinimumActualTime2 | UA - The minimum value in the resample interval and its timestamp, including bounding values.
MaximumActualTime2 | UA - The maximum value in the resample interval and its timestamp, including bounding values.
Range2 | UA - The Range2 aggregate finds the difference between the maximum and minimum values in the interval as returned by the Minimum2 and Maximum2 aggregates. Note that the range is always zero or positive.
PercentGood (UA) | UA - Retrieves the percent of data (0 to 100) in the interval which has a good StatusCode.
PercentBad (UA) | UA - Retrieves the percent of data (0 to 100) in the interval which has a bad StatusCode.
VectorElementSum | Calculates the sum of vector elements individually (not OPC HDA compliant).
VectorElementAverage | Calculates the average of vector elements individually (not OPC HDA compliant).
VectorElementMin | Finds the minimum of vector elements individually (not OPC HDA compliant).
VectorElementMax | Finds the maximum of vector elements individually (not OPC HDA compliant).
DisplayValues | Values that give the best trend curve representation for a specific number of pixels. The desired start/end time and the number of x-pixels (nx) available are used to divide the period into nx resample intervals (1 resample interval per x-pixel). For each resample interval, anything from 0 to 4 data points may be returned. At most, the following data points are returned for each resample interval: the first data point, the maximum data point, the minimum data point and the last data point.
LowpassFilter | Lowpass filtering of the RAW data for the given period. The number of datapoints returned is the same as the number of raw datapoints. The resample interval is interpreted as the time constant (T). The algorithm is: v(n+1) = v(n) + dT*(v(n+1) - v(n))/T, where v = value, n = datapoint index, and dT = timestamp(n+1) - timestamp(n). If T <= dT, no filtering is applied to avoid instability.
MovingAverageByCount | Moving average of the RAW data for the given period, ignoring the time between datapoints. The number of datapoints returned is the same as the number of raw datapoints. The resample interval is interpreted as the number of values in the window. For the first window, a cumulative average is applied.
MovingAverageByTime | Moving average of the RAW data for the given period, taking the time between datapoints into account. The number of datapoints returned is the same as the number of raw datapoints. The resample interval is interpreted as the window size. For the first window, a cumulative average is applied.
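A sketch of the LowpassFilter recursion above, reading v(n) on the right-hand side as the previously filtered value and v(n+1) as the new raw value (the sample data is hypothetical):

    # v_filt(n+1) = v_filt(n) + dT * (v_raw(n+1) - v_filt(n)) / T
    # If T <= dT, no filtering is applied for that step (to avoid instability).
    def lowpass(timestamps, values, T):
        out = [values[0]]
        for i in range(1, len(values)):
            dT = timestamps[i] - timestamps[i - 1]
            if T <= dT:
                out.append(values[i])
            else:
                out.append(out[-1] + dT * (values[i] - out[-1]) / T)
        return out

    ts = [0, 10, 20, 30, 40, 50]                  # hypothetical timestamps, seconds
    vs = [0.0, 0.0, 0.0, 10.0, 10.0, 10.0]        # hypothetical raw values
    print(lowpass(ts, vs, T=60))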


Setup And Configuration

Launching Apis Honeystore

Apis Honeystore consists of two Windows Services, named Apis Honeystore and Apis OPCHDA. To start and stop these, we use the Services control panel application.

From your Start menu, open Programs->Control Panel->Administrative Tools->Services. Here, you can start and stop the Apis Honeystore services.

Note! Stopping the Apis Honeystore services means that all applications storing or reading data will be prevented from doing so!

Optimizations

Performance issues

Disk-intensive applications such as the Windows Indexing Service and disk defragmenting and analysis tools may affect the performance of Apis Honeystore. When running configurations demanding high performance, disabling such applications is recommended.

Security configuration

As part of the initial configuration of the Apis Honeystore applications, we must decide who should have access to the historical data, both for writing and reading. Apis Honeystore uses DCOM technology for inter-process communication, and by using the Distributed COM Configuration Properties application, we can control who has access to the historical data of Apis Honeystore. The Distributed COM Configuration Properties application is located in the Windows system directory, in the executable named dcomcnfg.exe.

When using DCOMCNFG, it is important to remember that Apis Honeystore consists of two DCOM applications, named Apis Honeystore and Apis OPCHDA.

Grant access to users

All users that shall be able to store or retrieve data from Apis Honeystore must have been granted access rights to the two Apis Honeystore applications. We will now go through the steps necessary to grant adequate access rights. Remember that Apis Honeystore consists of two such DCOM applications, Apis Honeystore and Apis OPCHDA, meaning that the following steps must be carried out for each of these applications.

Step 1. Launch DCOMCNFG

First, we have to launch the DCOM Configuration application.

Step 2. Locate the Apis Honeystore application

In the list of applications shown, locate and select the Apis Honeystore or the Apis OPCHDA application, and click the Properties button.

Step 3. Specify custom access permissions

From the Apis Honeystore properties dialog, select the Security tab.

Then, click the Use custom access permissions choice and push the Edit button.

Step 4. Grant access to users

The last step is to add the users that you want to have access permissions, to the list shown above. Click the Add button, and a list of available users and groups shows up.

Here, locate the users and groups that you want to grant access, click the Add button, and specify the Type of access that you want. Repeat this until all desired users and groups have been added to the list. Click the OK buttons on all the dialogs that you have opened, to apply the new properties. You must always include the SYSTEM account in this list, as well as the users that shall have access to the application. If you're a novice user, adding Everyone might be convenient as a start.

Note! The new settings will be applied when the application is restarted, meaning that you will have to restart the service(s) to activate the changes.

Remember to repeat the steps for both of the Apis Honeystore and Apis OPCHDA applications.

Backup and Restore

All databases holding critical data should be backed up at regular intervals to minimize the risk of losing data in case of a severe system breakdown or disk errors.

Apis Honeystore supports on-line backup, meaning that you do not have to stop any Apis applications or services while the backup software is running.

NEW backup / restore routine

The preferred backup and restore utility of APIS, is the ApisBare - Backup and Restore Tool

Please refer to this section for backup and restore routines.

Backup

NEW backup / restore routine

The preferred backup and restore utility of APIS, is the ApisBare - Backup and Restore Tool

Please refer to this section for backup and restore routines.

OLD and Deprecated backup / restore routine

The old backup program used by Apis is the default Microsoft Windows Backup program, which is included when installing Windows. To start this program, locate the shortcut Programs->Administrative Tools->Backup in the Start menu. The backup program can be scheduled to run periodically, typically after regular office hours.

The backup program is documented in the Windows help and support files. Therefore, the remainder of this section will focus on the steps necessary to obtain a valid backup of a database.

Files and directories to backup

To obtain a complete backup of a database, you need to include the configuration file of the database as well as the directory with all sub-directories where the historical data is stored.

Configuration file (required)

The location of the configuration file of a database, is decided by the ConfigFile property of the database. This file must be included in your backup.

Historical data files (required)

The location of the historical data files of a database is decided by the Path and Name properties of the database. In this directory, you will find a sub-directory named Name.dat. This directory, including all sub-directories and files, must be included in your backup. For a database named MainDB with a Path property equal to C:\Databases, the directory (with sub-directories) to back up will be: C:\Databases\MainDB.dat.

Cache file (optional)

The Apis Honeystore uses internal caching to achieve high performance. The cached data is also mirrored to disk, to gain high data persistence. As an alternative to backing up the cache file, you can specify a cache size, using the CacheSize property of the database, that ensures that the amount of data kept in the caches is insignificant compared to the period between backups. That is, if your backup runs every 24 hours and the cache holds data for e.g. 6 hours, a system crash 6 hours or more after the last backup will result in loss of data regardless of whether the cache file was backed up. For how to calculate a reasonable cache size, see Calculating a reasonable cache size.

The location of the cache file is determined by the Path and Name properties of the database. In this directory, you will find a file named Name.cache, which should be included in your backup if you decide to include the cache file. For a database named MainDB with a Path property equal to C:\Databases, the file to back up will be: C:\Databases\MainDB.cache.

This file is always held open by Apis Honeystore, meaning that your backup software must be able to back up open files. To configure the default Microsoft Windows Backup program to include open files in its backup, you must alter a Windows registry setting. Using a registry editor, open the key:

HKEY_CURRENT_USER\Software\Microsoft\Ntbackup\Backup Engine

Add a string value (if it doesn't already exist) named Backup files inuse, and set its value to 1.

Error handling and messages

If a critical error occurs, a tray icon will appear at the lower right of the computer desktop, flashing a database icon and alarm icon periodically. Right-click the flashing icon to bring forward a pop-up menu, where you can choose to View Events or to Acknowledge the alarm.

Error situations

If Apis Honeystore encounters errors or problems during operation, it starts flashing the tray icon mentioned above. In such cases, you should analyze the events written to the log viewer in Apis Management Studio to determine the cause of the error. It might be as simple as the database being full, i.e. unable to hold any more items, or that you tried to trend data for an undefined item of unknown data type. In such cases, you can simply acknowledge the alarm by right-clicking the flashing tray icon and selecting Acknowledge alarm from the pop-up menu.

Bad/corrupt files

In case of a more serious error, such as corrupt database files, you should stop all applications accessing the databases, including any running logger system such as Apis Hive. Then, you should stop the two Apis Honeystore services, called Apis Honeystore and Apis OPCHDA. When the Apis Honeystore service stops, the flashing tray icon will also disappear.

As a first step, try to restart the Apis Honeystore service. If this service starts without any error messages or flashing tray icons, you should also try to start the logger system. If everything starts up without errors, and the historical data in the database is present, it might have been a “false alarm” and it is not necessary to restore the database from backup. If it was not a false alarm, error symptoms will occur again, and database recovery from backup may be necessary to fix the problem.

Recreating a database from files

This section describes how to recreate a database from the database files of an Apis Honeystore installation. The database files are typically obtained from another computer running Apis Honeystore, and you must have all the files associated with the database available. Also, you must have Apis Honeystore installed on the computer.

Restoring files from backup or alternate source media

Make sure that you have all the necessary files present on your system. These include the configuration file and the historical data files, as described in section Backup.

Now, with the files present on your computer, we have two choices: either recreate the database in the exact same location and with the same name, or recreate the database in a different location and/or with a different name.

A. Recreate database at exact same location and with same name

When you have recreated the database files at exactly the same locations and with the same names as they had on the source computer, all you have to do is create a registry entry for the database.

Using the regedit.exe application located in your Windows System folder, locate the registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases

Beneath this key, assuming the database is named MYDB, create a new registry key named MYDB. Then, create a new string value named ConfigFile beneath this key. The value of the string must be the full path and name of the configuration file of the database, e.g. C:\Apis\Config\ApisHoneystore\MYDB.acdb.
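For illustration, the same registry entry can be created with a small Python sketch (winreg, administrative privileges assumed; the configuration-file path is the example path from the text and must be replaced with the actual location of your restored file):

    import winreg

    # Create ...\ApisHoneyStore\Databases\MYDB with a ConfigFile string value.
    key_path = r"SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases\MYDB"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "ConfigFile", 0, winreg.REG_SZ,
                          r"C:\Apis\Config\ApisHoneystore\MYDB.acdb")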

As for the historical data files, their location is stored inside the configuration file, meaning that as long as you haven't changed their path, they will be found automatically. Restart the Apis Honeystore services, and the new database should be loaded by Apis Honeystore.

B. Recreate database to a different disk drive and/or with a different name

We might want to restore/recreate the database on a different disk drive, and/or we may need or want to rename the database, e.g. because a database with the same name already exists. In that case, we have to change the relevant configuration parameters in the meta storage for the database before loading it into Honeystore.

B1. Using Prediktor Backup Restore utility (BARE API)

todo...

B2. Metastorage in configuration files

If the meta data of the database is stored in separate configuration files, we have to open the configuration file offline and modify the desired values, i.e. the Name, CachePath and/or DataDirPath properties. We use a utility called ApisStructuredStorageViewer.exe for this. This application is part of the APIS Expert tools package and should be used with care. Always take a backup of the configuration file before opening and modifying it in this utility.

The name property is identified with the Attrib ID 10. The CachePath property is identified with AttribID 20. The DataDirPath property is identified with AttribID 20. Note, if the CachePath and the DataDirPath properties are equal, the DataDirPath is omitted from the config file.

B3. Metastorage in separate SQL database (HoneyStoreConfigDB)

If the meta data of the database is stored in a separate SQL database, the HoneyStoreConfigDB, locate the database you want to recreate in the HSDatabase table. Change the desired values of the Name, PATH and DATADIRPATH columns to reflect your new name and/or file locations. Note that the column PATH maps to the database property CachePath.

Restart the Apis Honeystore services, and the new database should be loaded by Apis Honeystore.

Restore from backup media

NEW backup / restore routine

The preferred backup and restore utility of APIS, is the ApisBare - Backup and Restore Tool

Please refer to this section for backup and restore routines.

OLD and Deprecated backup / restore routine

If a critical disk crash or failure occurs, it may be necessary to restore an Apis Honeystore database from backup. Such an incident is indicated by Apis Honeystore, when it cannot open or access the database files as it needs, through an error message in the log viewer in Apis Management Studio and a flashing tray icon.

This section describes how to restore a database from backup media onto the same Windows installation as the backup was created on. If you need to restore onto another or a fresh Windows installation, please also refer to the section Recreating a database from files.

Stop running Apis applications

Before you start restoring, you must stop the applications accessing and using your databases. When used together with Apis Hive, you must stop this application. Then, stop the two Apis Honeystore services.

Note! If the computer is available on a network for remote database users, you should temporarily set the startup mode of these two services (Apis Honeystore/Apis OPCHDA) to Disabled, to prevent remote users from starting the services while you are in the middle of a restore operation.

When restoring databases, you can either restore all the files from your backup (complete restore), or study the log viewer in Apis Management Studio to determine which files are corrupt and only restore those files (selective restore). The first approach is by far the easier and less time-consuming, but if the database contains data of great importance, or only a few files have been damaged, it may be desirable to restore as few files as possible from the backup tape.

Complete restore

Using your preferred restore application, restore all the files that you included in your backup, as described in section Backup.

Selective restore – file by file

To decide which files need to be restored, you must examine the log viewer in Apis Management Studio to locate all the files that cannot be opened or read. Then, from your restore application, restore only the necessary files. This can be a very time-consuming procedure and demands greater computer skills than the complete restore alternative.

Restart Apis applications

When you have finished restoring the database files, you can start the Apis Honeystore services again. After the services have been started, you should examine the log viewer in Apis Management Studio for error messages, to ensure that your restore operation was successful.

Finally, other applications writing data to or reading data from Apis Honeystore can be started.

Apis OPC UA Namespace Server

The Apis OPC UA Namespace Server (UANSS) is a tool for extracting the models of one or more namespaces on an external UA server and exposing those namespaces through a local Hive UA server.

The Hive server will replicate the values of data variables from the original server through subscription; all other attributes and references are stored locally.

The UANSS runs as a separate service, named 'APIS OPC UA Namespace server'.

The UANSS runs separately from Apis Hive, and any issues with the service will not affect production data.

How to install UANSS can be found here.

For how to use and configure the UANSS, see Namespace Replication in the How To Guides.

A detailed description of how namespace crawling works, can be found here.

See the Troubleshooting Guide if any issues occur.

Apis OPC UA Namespace Server Installation

To install the Apis OPC UA Namespace Server, check the box for 'APIS OPC UA Namespace Server' when running the Apis Foundation install kit:

The Apis OPC UA Namespace Server will run as a service named 'APIS OPC UA Namespace server'. The service can be started, stopped and configured from the Services view in Windows.

Namespace crawling

The process of extracting OPC UA namespace models from an external server is referred to as crawling the external server.

The crawler works in conjunction with an OPC UA client module configured on the Hive instance receiving the local copy of the chosen namespaces on the external server. The crawler will use the connection properties of the UA client, and create namespace items in the module representing the namespaces crawled on the remote server. The namespace items contain the configuration for how each namespace should be treated, and have configuration properties similar to the module properties of a semantics module.

The crawler process consists of several phases:

  • Node discovery. This phase starts at a predefined node (root), and follows all references found by browsing the root node. The phase then continues by browsing all targets found in the previous step, until no new nodes are found. This phase results in knowing all node ids that are reachable from the root node, along with their corresponding node class, for all namespaces hosted on the server.
  • Namespace extraction. This phase will extract one or more specific namespaces to individual namespace databases. Since all nodes with nodeclasses are found during the first phase, only attributes for the nodes in the namespaces to be extracted will be read.
  • Namespace deployment. This phase will load the resulting namespace databases into the hive-instance. The semantics runtime in hive will then look for variables that should be subscribed and generate hive items for these values, according to naming rules defined in the namespace service configuration. Finally, the model will be scanned for event sources, and an event-hierarchy will be built for the discovered event sources.

Memory Footprint

For speed, all discovered nodes are cached in order to quickly determine whether a target for a browse-reference has already been processed. This means that the crawler may use a lot of memory while the remote server is being crawled. It will, however, return that memory as soon as the crawl is completed. It is possible to reduce the memory footprint, but that will necessarily make the process take longer to complete. The cache utilizes a high- and low-watermark configuration setting, where the high watermark defines the maximum number of nodes held by the cache. When the cache reaches the high watermark, it flushes the least referenced (highwatermark - lowwatermark) nodes, so that the cache only holds lowwatermark entries. If memory is not an issue, it is also possible to use an in-memory database (IMDB) for holding the complete address space of the crawled server. Disabling cache limits and enabling IMDB may speed up the crawl by as much as 50-70%.
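As a conceptual illustration only (not the actual crawler code), the high/low-watermark behaviour described above can be pictured like this:

    from collections import Counter

    # Conceptual high/low-watermark cache: when the number of cached nodes
    # reaches the high watermark, keep only the `low` most referenced nodes.
    class NodeCache:
        def __init__(self, high, low):
            self.high, self.low = high, low
            self.refs = Counter()                  # node id -> reference count

        def touch(self, node_id):
            self.refs[node_id] += 1
            if len(self.refs) >= self.high:
                self.refs = Counter(dict(self.refs.most_common(self.low)))

    cache = NodeCache(high=100_000, low=60_000)    # hypothetical watermark values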

Namespace extraction

Namespace databases will by default be deployed to the subfolder "semantics\proxies" in the Hive instance's configuration folder. When extracting the namespace databases, some references may be included in more than one namespace database. This occurs because references do not belong to any particular namespace. What the crawler tries to do is to include all references that have a source or target node in the namespace being extracted, with one exception: if the reference type is "HasType", the reference will only be included in the namespace defining the instance (if the type is stored in a different namespace).

Item name generation

Values that should be subscribed on the source server need Hive function items in order to transfer the values to the local UA variable. The crawler will assign such function items to all variable nodes that are of BaseDataVariableType, or any of its subtypes. In order to determine which nodes are base data variables, all namespaces defining the types that make up the BaseDataVariableTypes must be available on the local Hive UA server. That means they must either be pre-defined BaseDataVariableTypes from namespace 0, part of a namespace that is also crawled from the remote server, or imported on the local UA server, such as ISA95. If (part of) the type for a variable is not known when the namespace is loaded, the system will not be able to determine that the variable is actually a base data variable, and no function item will be generated for that variable.

Datatype support

All primitive scalar datatypes are supported, but only a small subset of struct datatypes are supported.

The following are the supported struct datatypes:

  • Range
  • EUInformation
  • TimeZoneDataType

Variables containing arrays or matrices of a primitive datatype (with the exception of the special ByteArray datatype) cannot be stored in the namespace database. That means properties with such values will not be able to read their values from the database. All other aspects of such variables will be available, but the data value cannot be read. However, base data variables connected to a Hive function item will work for arrays and matrices, as long as the underlying datatype is supported by Hive.

Error recovery

If for any reason the crawler loses connectivity with the source server during the first phase of crawling (node discovery), the crawl will be aborted.

Once the crawler gets past this stage and has started to populate the attributes for nodes to be exported, it has some recovery built in. If the connection is lost, or the OPC UA session for some other reason becomes invalid, the crawler will attempt to reconnect to the source server. It will try 5 times to re-establish server connectivity before giving up completely.

Apis Chronical

Apis Chronical is an event-server and historian designed to efficiently monitor, store, and retrieve large numbers of events and alarms.

Features:

  • Customizable EventType hierarchy based on the OPC UA event model
  • Multiple independent EventSource hierarchies
  • User interface for configuration, monitoring and querying
  • Online incremental backups
  • Standard OPC UA AC interface for event subscriptions and historical access

To enable logging of events and alarms, the "Database Horizon" property must be non-zero.

Concepts

An event is a record with the following primary fields:

  • Timestamp: the date and time of the event
  • Type: the type of the event
  • Source: the source of the event

The primary fields are treated as the unique key for event records. If a new event is created with the same Timestamp, Type, and Source as an existing event, the new event replaces the old event (1).

The Type field defines which additional fields the event may contain. Since event types are organized in a hierarchy, each event type inherits the fields defined by its parent type.

The Source field defines the "location" where the event occurred. This field is used when real-time subscriptions and historical queries are filtered to only include event sources in specific sub-hierarchies.

(1) If the Type is marked as Immutable, both the old and the new events are kept in the historian.

Event Types

Event types are used by Apis Chronical for two purposes. First, each event type defines the fields that can be included in events of that type. Secondly, they define a hierarchy of increasingly more specialized types. For example, the DiscreteAlarm type inherits from/is a specialization of the generic Alarm type. This hierarchy then defines the semantics of each event type, i.e. all subtypes of the Alarm type are also some kind of alarms.

Predefined Types

The following event type hierarchy is predefined in Apis Hive/Chronical:

Event
  ConditionEvent
    Alarm
      DiscreteAlarm
        DataValidationAlarm
        OffNormalAlarm
        QualityAlarm
        WatchdogAlarm
      LimitAlarm
        LevelAlarm
      OpcAlarm
  SystemEvent
    BatchEvent
    DeviceEvent
    ProcessEvent
    TimeSeriesEvent
    TraceEvent
  TrackingEvent
    AdvancedControlEvent
    OperatorChangeEvent
    SecurityEvent
      SessionEvent
        LogonEvent
    SystemConfigEvent
      EventSourceModified

The following event fields are defined by these event types:

Field name | Defined by | Datatype | Description
Timestamp | Event | time64 | Date and time of event, 100ns resolution
Generation | Event | uint32 | Internal usage
Sequence | Event | uint32 | Internal usage
Source | Event | uint32 | ID of event source
Type | Event | uint32 | ID of event type
State | Event | uint32 | Bitmap of event states
Severity | Event | uint16 | The severity of the event in range 0-1000
Message | Event | string | Unstructured information intended for humans
Received | Event | time64 | Date and time when event was received
Sourcename | Event | string | Extra source description
UserName | Event | string | Name of related user
Category | Event | uint32 | ID of classical OPC AE event category
ActiveTime | ConditionEvent | time64 | Date and time when the condition became active
CurrentValue | ConditionEvent | variant | Current observed value of the monitored signal
CurrentQuality | ConditionEvent | uint16 | Current observed OPC quality of the monitored signal
CurrentTimestamp | ConditionEvent | time64 | Current observed date and time of the monitored signal
LastState | ConditionEvent | variant | Previous event state
LastValue | ConditionEvent | variant | Previous observed value of the monitored signal
LastQuality | ConditionEvent | uint16 | Previous observed OPC quality of the monitored signal
LastTimestamp | ConditionEvent | time64 | Previous observed date and time of the monitored signal
LastSeverity | ConditionEvent | variant | Previous event severity
UnshelveTime | ConditionEvent | time64 | Date and time when event should be unshelved
Comment | ConditionEvent | string | User comment given by e.g. Acknowledge
AckTime | ConditionEvent | time64 | Date and time when alarm was Acknowledged
ConditionName | OpcEvent | string | OPC AE condition name
SubconditionName | OpcEvent | string | OPC AE subcondition name
StartTime | TimeSeriesEvent | time64 | Internal usage
EndTime | TimeSeriesEvent | time64 | Internal usage
Samples | TimeSeriesEvent | uint32 | Internal usage
Status | TrackingEvent | bool | Indicates if the requested action was successful
ClientAuditId | TrackingEvent | string | Audit ID specified by client
OldName | EventSourceModified | string | The old name of the event source
OldFlags | EventSourceModified | string | The old flags value of the event source
NewName | EventSourceModified | string | The new name of the event source
NewFlags | EventSourceModified | string | The new flags value of the event source

Supported Datatypes

The event field datatype is either a scalar or an array of one of the following base types:

  • bool
  • uint8
  • uint16
  • uint32
  • uint64
  • int8
  • int16
  • int32
  • int64
  • time64
  • float32
  • float64
  • string
  • variant

Flags

The following flags can be specified on each event type:

Name | Description
Deleted | Events of this type should not be included in query results
Immutable | Events of this type are never replaced/updated by newer events

The "Immutable" flag is used if Chronical receives events from another server, and that server generates multiple events with the same Timestamp, Type, and Source, and these events should all be considered valid. The default behaviour in Chronical is to consider such event series as updated/replaced events, meaning that only the last event received with a specific Timestamp, Type, and Source would be retreived on historial reads.

The following flags can be specified on each event field:

Name | Description
Deleted | This field should not be included in query results
Sticky | Events without this field inherit the field from the previous event

The "Sticky" flag can be used to attach third-party information on an event that will be automatically preserved on newer events with identical Type and Source where the "Sticky" field is not present.

Event Sources

Event sources are used by Apis Chronical to organize possible event emitters into various hierarchies. Each event source can have multiple parent sources and multiple child sources.

Flags

The following flags can be specified on each event source:

Name | Description
Deleted | Events from this source should not be included in query results
Event area | This event source is a logical or physical event area
Proxy source | This event source is a proxy for an event source in another server
Internal leaf-node | Internal usage
OpcAE source condition | This source is used to represent an OPC AE condition on its parent source

The following flags can be specified on each link between parent and child sources:

Name | Description
Deleted | The child sources of this link should not be included in query results
Internal leaf-node | Internal usage

Properties

Each ApisHive instance contains a dedicated Apis Chronical database. The configuration properties for each Apis Chronical database are available on the "Apis Event Server" node for the ApisHive instance in Apis Management Studio.

The following properties are available:

Name | Description | Default value | Access
Loglevel | Specifies the verbosity of log messages produced by the event server. | Warnings | RW
Runmode | Either "Online", "Admin", or "Offline". In "Online" mode, the database is running normally. In "Admin" mode, database metadata can be modified, but events will not be processed or stored. In "Offline" mode, the database is closed and only the runmode property can be modified. | Online | RW
Database horizon | Number of days to keep event history. The value -1 means "Forever", and the value 0 means that no history will be stored. A value of e.g. 365 means that one year of event history will be stored in the database. | 0 | RW
Datafile period | The time period covered by each datafile, in steps from 7.2 minutes to 0.9 years. This setting affects the number of datafiles and the size of each datafile in the database. | 1.5 weeks | R
Archive block size | Each datafile consists of one or more archives, and this setting controls the size of the archive block header. | 16KB | RW
Event block size | Each datafile archive consists of one or more event blocks, and this setting controls the size of new, uncompressed event blocks. | 256KB | RW
Event block count | Number of events per event block. | 1024 | RW
Event block compression | LZ4 acceleration level for datafile event blocks. | 4 | RW
Cache queue size | Size of the event cache queue (number of events). When modified, the new setting takes effect after a restart of Apis Hive. | 16384 | RW
Cache block count | Number of blocks in the cache file. | 16 | R
Cache block size | Size of each block in the cache file. | 4MB | R
Cache flush interval | Number of milliseconds between each flush operation on the cache file. | 10000 | RW
Monitor queue size | Size of the event monitor queue (number of events). When modified, the new setting takes effect for new event subscriptions. | 4096 | RW
Persist state | Enables/disables persisting of the current event state at shutdown. | Disabled | RW
Persist state interval | Number of milliseconds between each state snapshot. | 10000 | RW

Querying History

When performing historical reads over OPC UA, the client can specify a ContentFilter.

In Apis Management Studio, similar filtering can be performed in the "Event History" view using a text expression comparable with a SQL WHERE-clause.

Simple filter expressions:

1) Timestamp>NOW-1m
2) Type=DiscreteAlarm
3) Type is DiscreteAlarm
4) Source="ApisHive/Areas/AlarmArea/Equipment-01"
5) Source is "ApisHive/Areas/AlarmArea/Equipment-01"
6) Message like "Level is [12]00*"
7) UnshelveTime=ActiveTime+1h
8) State&2=2
No | Explanation
1 | Return events with a timestamp from the last minute.
2 | Return events of type DiscreteAlarm, excluding subtypes.
3 | Return events of type DiscreteAlarm, including subtypes.
4 | Return events from the source 'ApisHive/Areas/AlarmArea/Equipment-01', excluding child sources.
5 | Return events from the source 'ApisHive/Areas/AlarmArea/Equipment-01', including child sources.
6 | Return events where the Message field starts with the string "Level is 100" or "Level is 200".
7 | Return events where the UnshelveTime is exactly 1 hour after the ActiveTime.
8 | Return events where the 'Active' state bit is high.

All such simple expressions can be combined with "and" and "or", and grouped into subexpressions with parentheses, as shown in the example below.

NB: If multiple event types define fields with the same name, such field names must be prefixed by their event type, e.g. 'TrackingEvent.Status'.
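
For example, a combined filter using subexpressions and a type-prefixed field name (the type, source, and field names below are hypothetical) could look like this:

(Timestamp>NOW-1h and Type is DiscreteAlarm) or (Source="ApisHive/Areas/AlarmArea/Equipment-01" and TrackingEvent.Status!=0)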

Syntax

This is the formal definition [1] of the syntax used in textual query filters:

expr    ::= andExpr ('or' expr)?
andExpr ::= relExpr ('and' andExpr)?
relExpr ::= bitExpr (relOp bitExpr)?
bitExpr ::= addExpr (bitOp bitExpr)?
addExpr ::= mulExpr (addOp addExpr)?
mulExpr ::= term (mulOp mulExpr)?
term    ::= unaryOp? symbol | string | number | list | subExpr
unaryOp ::= [!+-~]
symbol  ::= [a-zA-Z_][a-zA-Z_0-9.]*
string  ::= ["] [^"]* ["]
number  ::= [0-9]+ (("." [0-9]+)|([dhms]))?
subExpr ::= "(" expr ")"
list    ::= "[" expr ("," expr)* "]"
relOp   ::= [<>=] | "<=" | ">=" | "!=" | "is" | "in" | "like"
bitOp   ::= [&|^] | "<<" | ">>"
mulOp   ::= [*%/]
addOp   ::= [+-]

A symbol is either "NOW", an event type, or an event field optionally prefixed by its event type.

NB: list expressions and the keyword "in" are not yet supported.

[1] The notation used is a custom BNF extension.

Apis History Explorer

Apis History Explorer is a tool to fetch and display historical data.

The Apis History Explorer supports the OPC HDA and OPC UA protocols.

Main Window

The main window of the application has three parts:

  • The left part is an explorer tree that displays the items in the database.
  • The middle part is a view that displays the detailed data of the items.
  • The right part is a property window that displays the properties of objects.

Explorer Tree

You can use the left explorer tree to see the items on the server.

Search text box

You can use the top search text box to filter items by name.

Context menus

There are three context menu items for an item:

Delete: You can use this menu item to delete one item from the database.

History explorer view: You can use this menu item to open a history explorer view to show the data of this item.

Add items to current history explorer view: You can use this menu item to add the item to the active history explorer view. This menu item is only available when at least one history explorer view is open.

Property Window

You can see the properties of objects in the property window.

Connection Dialog

You can use the connection dialog to connect to a server.

This dialog is displayed when the application starts, and you can also open it from the menu SERVERS - Connect.

The connection dialog can connect to OPC HDA and OPC UA servers.

OPC UA

Select OPCUA as the service type to connect to a UA server. Check the Use Discovery check box and enter a URL for an OPC UA server. (The URL should be of the form opc.tcp://<computer>:<port>/<path>.)

!> If a secure connection is used, a certificate must reside in specific folders.

If running in the context of Apis Management Studio the folders are:

Public key (common filename: AMS.der)

%LOCALAPPDATA%\APIS\AMS\Config\pki\certs

Private key (common filename: AMS.pfx):

%LOCALAPPDATA%\APIS\AMS\Config\pki\private

The subject name of the certificate MUST be AMS.

If running in the context of History Explorer the folders are:

Public key (common filename: Explorer.der)

%LOCALAPPDATA%\APIS\HistoryExplorer\Config\pki\certs

Private key (common filename: Explorer.pfx):

%LOCALAPPDATA%\APIS\HistoryExplorer\Config\pki\private

The subject name MUST be Explorer.

ApisHive as UA server

ApisHive must trust a client by trusting its certificate. This is done using Apis Management Studio, which means that the first time a client connects to Hive, the connection will fail. After the first connection attempt, open Apis Management Studio, connect and navigate to your Apis Hive instance, right-click the Endpoints node and select Manage UA Certificates, and finally trust the certificate under Manage Certificates. Then make the client reconnect.

Running History explorer with command-line options

Running History Explorer from the command line allows you to run automated file exports from a scheduled task or launch them from another program. Running from the command line only makes sense when using a History Explorer configuration file (an .acf file) for data export.

Usage, command-line options:

Prediktor.HistoryExplorer.exe -config="C:\a.acf" -export="C:\autoexport.txt" [optional parameters]

The export file name is the file name specified by the -export option with a timestamp appended; in this example the export file name would be something like autoexport_2015-12-29 12-26-20.txt.

Parameters:

● -hidden

Runs the automated file-export hidden

● -starttime:"YYYY-MM-DD hh:mm:ss"

Specify the start-time for an automated file-export

● -endtime:"YYYY-MM-DD hh:mm:ss"

Specify the end-time for an automated file-export

Specifying the start time and end time allows the user to adjust the start/end times in the configuration files to reflect more recent time periods for which to export timeseries data.

● -exportviewtype

Specify the view type for the export file; it can be table, combinedtimetable, or eventlist. The default is table. Example: -exportviewtype=combinedtimetable

● -rowbyrow

Organize the data row by row.

● -honeystoreformat

Export a honeystore import format file.

● -notime

Don't show time in the export file.

● -noquality

Don't show quality in the export file.

● -localtime

Use local time in the export file.

A complex example:

Prediktor.HistoryExplorer.exe -config="C:\a.acf" -export="C:\autoexport.txt" -localtime -starttime="2015-12-1" -endtime="2015-12-29" -hidden
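
To run such an export on a schedule, the command line can be registered as a Windows scheduled task. The sketch below is only an example: the installation path, task name, schedule, and file paths are assumptions that must be adapted to your system.

schtasks /Create /TN "ApisHistoryExport" /SC DAILY /ST 02:00 /TR "C:\APIS\Bin64\Prediktor.HistoryExplorer.exe -config=C:\Export\a.acf -export=C:\Export\autoexport.txt -hidden -localtime"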

Apis Backup and Restore Tool (ApisBare)

Introduction

ApisBare is a tool for backing up and restoring configuration and data for your Apis applications. ApisBare is available as a command line tool, and as a Windows service named "Apis Backup Agent".

Apis Backup Agent

ApisBare Command line - Backing up your configuration

Manually Backup and Restore Apis Foundation configuration

The API is able to extract all configuration properties in a standardized model, regardless of whether the property is actually located in the registry, a custom storage file, a database, or elsewhere. Furthermore, it lets you add meta-information to certain properties, making it easier to find those properties that will most likely have to change if you carry a configuration from one machine to another. Typical values that are tied to the computer, like IP addresses, database names, connection strings etc., may need to be modified when the configuration is moved.

Apis Backup Agent

Apis Backup Agent is a Windows service that is installed by Apis Foundation setup. The service is responsible for executing backup and restore jobs.

Backup and restore jobs are configured using Apis Management Studio.

The Apis backup agent service is installed to:


[INSTALLDIR]/APIS/BareAgent

Troubleshooting

Debug trace logs can be imported with Apis Management Studio. The logs are found in the directory:


[INSTALLDIR]/APIS/BareAgent/Logs

Backup Set

A Backup set contains configuration and/or history data for the Apis Hive, Apis Honeystore, Apis Chronical and Apis OPC UA Namespace Server services.

The Backup set configuration content can be explored using Apis Management Studio.

Backup sets can be moved between machines with the operating system file explorer.

A Backup set is stored in its own root folder and contains a root backup file and several sub folders with backup data. Be sure to select the root folder with all its content when moving the backup set with the operating system file explorer.

Overridable Values

Overridable values are properties and settings, extracted from the Apis Hive configuration content in a Backup Set, that the user is allowed to change before running a restore job. Overridable values are loaded per selected Hive instance.

Examples of Overridable values are:

  • References to external Servers
  • References to local file, directory or database resources
  • Internal Server URL address settings

Some Overridable values will be changed automatically during the restore process if restoring in a new environment. The automatically changed Overridable values will show up as changed values in the restore view in Apis Management Studio.

Examples of automatically changed values are:

  • References to Apis install directory
  • Internal hostname or IP address settings

Please refer to Apis Hive and Apis Hive Modules for further documentation.

ApisBare Command line - Backing up your configuration

Backing up your configuration

If run with the command Backup [Filename], it will make a backup of the configuration into the specified file. Note that the full path to the backupfile must be specified. If other files are detected in the instances' configurations, they will be placed in a subdirectory with the name of the corresponding instance. These instance subdirectories will be created in the same directory as the backupfile.

If a Hive instance contains a chronical event database, this database will be backed up into the file [InstanceName].chr.

Honeystore data-backup will be placed in one subdirectory per database. Each database subdirectory will contain the files:

  • [DatabaseName].bin - Metainfo for the items backed up
  • [DatabaseName].bin.blockdata - Raw blocks of data

When appending to an incremental backup, the specified filename should be that of the original (root) entry. Each time an incremental backup is made, a new backup file is written with a sequence number appended. I.e., if backup.cfg is the root backup, the first incremental backup configuration will be stored in backup.cfg.1. The same concept is used for the subfolders containing Honeystore datafiles. When appending an incremental backup, Chronical incremental data is appended to the existing backupfile; i.e., [InstanceName].chr.1 will not be created.
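
As a hypothetical illustration, after one root backup and two incremental backups of a single instance named Hive1 logging to a Honeystore database named RedDB (both names invented), the backup directory could contain:

backup.cfg            (root backup)
backup.cfg.1          (first incremental backup)
backup.cfg.2          (second incremental backup)
Hive1.chr             (Chronical events; incremental data is appended to this file)
Hive1\                (other files from the Hive1 instance configuration)
RedDB\RedDB.bin and RedDB\RedDB.bin.blockdata   (Honeystore data for RedDB; incremental Honeystore data goes to correspondingly numbered subfolders)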

ApisBare Command Line - Backup Option

The backup command may be accompanied by the following options (combined examples follow the list):

  • /DataOnly - only honeystore database data will be backed up
  • /ConfigOnly - only hive/honeystore configuration will be backed up - no honeystore or chronical data
  • /Hive - only hive configurations will be backed up - no honeystore configuration or data
  • /Honeystore - only honeystore configuration/data will be backed up - no hive instances
  • /Incremental - Make or append to an incremental backup
  • /Wait - Wait for a keypress before exiting when the backup is completed
  • /FromDate:DateTime - Backup data starting at this time (Not fully implemented)
  • /ToDate:DateTime - Backup data up until this time (Not fully implemented)
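
For example, assuming the command-line tool is invoked as ApisBare.exe and using placeholder paths:

ApisBare.exe Backup C:\Backups\backup.cfg /ConfigOnly /Wait
ApisBare.exe Backup C:\Backups\backup.cfg /Incremental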

ApisBare Command line - Restoring your configuration

Restoring your configuration somewhere else

If the program is run with the command Restore [Filename], it will read the configuration from the specified file and ask if you want to restore each of the instances/databases found in the backupfile. Note that the full path to the backupfile must be specified. If the backup is detected to contain incremental entries, the /Incremental option will be required. During the restore phase, all paths in the configuration starting with the standard hive-configuration path will be remapped from the original machine's config path to the target machine's config path if they differ. Likewise, connection strings, database names, IP addresses and server names containing the original machine's name/address will be replaced with the target machine's name/address. A list of an instance's configuration properties that may have to be manually changed is also printed before you are asked if you want to continue restoring the instance, unless the /Yes option is specified.

Restore options

The restore command may be accompanied by the following options (an example follows the list):

  • /Yes - Restore everything, will not ask for confirmation before restoring
  • /Hive - Only restore hive instances
  • /Honeystore - Only restore honeystore database configurations + data
  • /DataOnly - Only restore honeystore data, leaves the honeystore configuration unchanged
  • /ConfigOnly - Only restore configuration (hive + honeystore), do not restore honeystore or chronical data
  • /Wait - Wait for a keypress before exiting when the restore is completed
  • /FromDate:DateTime - Restore data starting at this time (Not fully implemented)
  • /ToDate:DateTime - Restore data up until this time (Not fully implemented)
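
For example, assuming the same ApisBare.exe tool name and placeholder path as in the backup examples above:

ApisBare.exe Restore C:\Backups\backup.cfg /Hive /Yes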

Note that hive instances will be stopped prior to configuration being restored. Likewise, honeystore will be stopped before restoring a database's configuration, and then started in order to import the data. Honeystore might even be stopped/started several times in case of an incremental restore. If more than one database is restored, honeystore will be stopped and started for each instance being restored.

ApisBare Command line - Automatically Modified Config Values

Automatically Modified Config Values

Some configuration values that may need to be modified when restoring a configuration to a different machine are automatically adjusted during the restore operation (illustrated after the list):

  • Path/file references originating from the default apis configuration directory (or subdirectory thereof) will be modified to be rooted from the new machine's default apis configuration directory if this is different.
  • Tcp/Ip addresses referencing (one of) the original machine's tcp/ip address will be modified to reference the new machine's (default) tcp/ip address.
  • URLs referencing the tcp/ip address or hostname of the original machine will be modified to refer to the new machine's hostname or tcp/ip address.
  • Hostname references addressing the original machine's hostname will be modified to reference the new machine's hostname.
  • Databasename / Connectionstring references will get the tcp/ip address and/or hostname replaced.
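
As a hypothetical illustration, when restoring a backup taken on a machine named NODE-A (192.168.0.10) onto a machine named NODE-B (192.168.0.20), values such as these would be adjusted automatically (all names, addresses, and ports are invented):

opc.tcp://NODE-A:4840/SomePath         ->  opc.tcp://NODE-B:4840/SomePath
Server=192.168.0.10;Database=ApisData  ->  Server=192.168.0.20;Database=ApisData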

Manually Backup and Restore Apis Foundation configuration

The current version is 64-bit only, but this section is included to show how to upgrade from legacy 32-bit to 64-bit.

A secure method to copy an Apis Hive configuration from one computer to another, or simply to keep a backup of the configuration, is to copy the configuration files and registry settings. With this method, copying and restoring of configurations can be automated by scripting.

This method requires basic knowledge of Windows registry settings and of how to export and import Windows registry keys.

The configuration location

The procedure varies slightly depending on the bitness (32/64) of the operating system and Apis Foundation. There are mainly three configuration storage types: registry, binary files and XML configuration files.

The location of registry Apis configuration

The registry holds information regarding the basic functionality of ApisHive and HoneyStore, and the configuration file locations of module configurations.

  • The location of the registry Apis configuration:
    • HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor
  • For a 32-bit application on a 64-bit operating system:
    • HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Prediktor

Apis module configuration files

These files hold information regarding module properties and items. By default, the file name is the same as the module/database name:

  • <ModuleName>.acd Module binary configuration file (old format)
  • <ModuleName>.ans Module binary configuration file
  • <DatabaseName>.acdb Database binary configuration file (old format)
  • <DatabaseName>.ansdb Database binary configuration file

The default location of Apis module configuration:

  • <Install Directory>\Config\<INSTANCENAME> Module configuration files
  • <Install Directory>\Config\ApisHoneyStore Database configuration files

Event Historian

If Event Historian (Chronical) is enabled, the configuration is stored by default in:

<Install Directory>\Chronical\<INSTANCENAME>

Apis servers xml configuration files

These files hold information regarding advanced functionality of ApisHive and HoneyStore not found in the registry.

64-bit Apis Foundation

  • <Install Directory>\Bin
    • ApisHoneystore.exe.config
    • ApisHoneystore.AppSettings.config

32-bit Apis Foundation:

  • <Install Directory>\Bin
    • ApisHive.exe.config (deprecated)
    • ApisHive.AppSettings.config (deprecated)
    • ApisHoneystore.exe.config
    • ApisHoneystore.AppSettings.config

Upgrade paths

From operating system bitness | To operating system bitness | From Apis Foundation bitness | To Apis Foundation bitness | Copy procedure | Restore procedure
32 | 64 | 32 | 64 | C1 | R6
64 | 64 | 32 | 64 | C2 | R8
64 | 64 | 64 | 64 | C3 | R9

Copy Apis Hive configuration

  1. Copy 32-bit Apis Hive configuration on 32-bit operating system

    1. Copy all files from <Install Directory>\Config and if Event Historian (Chronical) is enabled copy all files from <Install Directory>\Chronical\<INSTANCENAME>
    2. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<INSTANCENAME> to a file.
  2. Copy 32-bit Apis Hive configuration on a 64-bit operating system

    1. Copy all files from <Install Directory>\Config and, if Event Historian (Chronical) is enabled, copy all files from <Install Directory>\Chronical\<INSTANCENAME>
    2. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\<INSTANCENAME> to a file.
  3. Copy 64-bit Apis Hive configuration

    1. Copy all files from <Install Directory>\Config and if Event Historian (Chronical) is enabled copy all files from <Install Directory>\Chronical\<INSTANCENAME>
    2. Export the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<INSTANCENAME> to a file (see the example commands below).
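
For example, the registry export step can be performed from an elevated command prompt (the instance name ApisHive and the output path are placeholders):

reg export "HKLM\SOFTWARE\Prediktor\Apis\ApisHive" C:\Backup\ApisHive-registry.reg

rem For a 32-bit Apis Foundation on a 64-bit operating system:
reg export "HKLM\SOFTWARE\Wow6432Node\Prediktor\Apis\ApisHive" C:\Backup\ApisHive-registry.reg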

Restore Apis Hive configuration

If the <Install Directory> on the destination computer is different from that on the source computer, the exported registry settings file must be changed:

In the exported registry script file, locate the "ApisStorageSource" string values, which point to where the Apis Hive configuration was initially installed. For instance, if the system was initially installed in C:\Program Files (x86)\APIS and is copied/moved to C:\Program Files\APIS:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\ApisHive\Modules\ApisWorker]

@="{983B4AE2-ABB9-11D2-9424-00608CF4C421}"

"ProgIDOfModule"="Prediktor.ApisWorker.1"

"ApisStorageClass"="{4C854C93-C667-11D2-944B-00608CF4C421}"

"ApisStorageSource"=" C:\Program Files (x86)\APIS\ Config\ApisHive\Worker.ans "

Replace all occurrences of the original location with the new location, for instance:

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Prediktor\Apis\ApisHive\Modules\ApisWorker]

@="{983B4AE2-ABB9-11D2-9424-00608CF4C421}"

"ProgIDOfModule"="Prediktor.ApisWorker.1"

"ApisStorageClass"="{4C854C93-C667-11D2-944B-00608CF4C421}"

""ApisStorageSource"="C:\Program Files\APIS\Config\ApisHive\Worker.ans "

  1. Restore 32-bit Apis Hive configuration from a 32-bit operating system to a 64-bit operating system as 64-bit Apis Hive configuration.

    1. Install Apis Foundation 64
    2. Create new instance if not using default.
    3. Copy all files (restore) (1.a) to <Install Directory>\Config and possibly <Install Directory>\Chronical\<INSTANCENAME>
    4. Run registry script. (1.c)

    Alternatively, when the “.config” (1.b) files are also to be restored:

    1. Install Apis Foundation
    2. Create new instance if not using default.
    3. Copy all files (restore) (1.a) to <Install Directory>\Config (possibly <Install Directory>\Chronical\<INSTANCENAME>) and the “.config” (1.b) files to <Install Directory>\Bin
    4. Run registry script. (1.c)
  2. Restore 32-bit Apis Hive configuration from a 64-bit operating system to a 64-bit operating system as 64-bit Apis Hive configuration.

    1. Install Apis Foundation 64
    2. Create new instance if not using default.
    3. Copy all files (restore) (2.a) to <Install Directory>\Config and possibly <Install Directory>\Chronical\<INSTANCENAME>
    4. Edit the registry script (2.c)
      1. Replace all “SOFTWARE\Wow6432Node\Prediktor” with “SOFTWARE\Prediktor”
      2. Save
    5. Run the modified registry script.
  3. Restore 64-bit Apis Hive configuration from 64-bit operating system to 64-bit operating system as 64-bit Apis Hive configuration.

    1. Install Apis Foundation 64
    2. Create new instance if not using default.
    3. Copy (restore) all files (3.a) to <Install Directory>\Config, (possibly <Install Directory>\Chronical\<INSTANCENAME>) and the “.config” files (3.b) to <Install Directory>\Bin64
    4. Run registry script. (3.c)

Manually copy / move Apis Honey Store database

A secure, bulletproof method to copy Apis Honeystore data from one computer to another, or simply to take a backup/copy of the configuration, is to copy the database and configuration files manually. The disadvantage is that the database must be taken offline.

Example; common use case:

Migration of ApisFoundation to new hardware from Server1 to Server2

  • Assume we have a database named “RedLogger”, located in C:\APIS\DBs on Server1
  • Assume ApisFoundation is installed in the same location on both computers, C:\APIS

During the migration to Server2 we want to move the location of the “RedLogger” database to E:\DBs (a command-line sketch of the offline variant follows this procedure):

  • Ensure the ApisHoneyStore service is stopped on both Server1 and Server2
  • Copy C:\APIS\Config\ApisHoneyStore\RedLogger.ansb from Server1 to C:\APIS\Config\ApisHoneyStore on Server2
  • Copy the RedLogger.dat and RedLogger.cache files from C:\APIS\DBs on Server1 to E:\DBs on Server2
  • On Server1, export the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases\RedLogger to a file
  • Copy the registry script file to Server2 and run it. (If ApisFoundation is installed in a different location than on Server1, the “ConfigFile” string value in the registry script must be altered.)
  • Ensure you have a backup of the file C:\APIS\Config\ApisHoneyStore\RedLogger.ansb
  • Edit the location of the database in the configuration file. There are two options (tools): the offline ApisMetaStorageViewer and the online MMC snap-in:
  1. Offline:
    • Start ApisMetaStorageViewer.exe (part of APIS_x_x_x-Tools)
    • Open C:\APIS\Config\ApisHoneyStore\RedLogger.ansb
    • Change the Attrib ID 20 from C:\APIS\DBs\ to E:\DBs
    • The migration of the database is now finished and ApisHoneyStore can be started on Server2
  2. Online:
    • Start the ApisHoneyStore service
    • Open the ApisHoneyStore MMC snap-in
    • Navigate to the RedLogger database, right-click and select Administer database…
    • In the RedLogger properties dialog, change DataDirPath from C:\APIS\DBs\RedLogger.dat\ to E:\DBs\RedLogger.dat\ and CachePath from C:\APIS\DBs\ to E:\DBs
    • The migration of the database is now finished
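
A minimal command-line sketch of the offline variant above, run on Server1 (the service name ApisHoneyStore and the temporary output path are assumptions; the file copy to Server2 can be done with any tool):

net stop ApisHoneyStore
reg export "HKLM\SOFTWARE\Prediktor\Apis\ApisHoneyStore\Databases\RedLogger" C:\Temp\RedLogger.reg
rem Copy RedLogger.ansb, RedLogger.dat and RedLogger.cache to Server2, run C:\Temp\RedLogger.reg there, and adjust the database location as described above.
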
Release 9

Apis Foundation 9.16.3 release notes

Apis Hive NET API

  • Bugfix: Fixed bug causing Flags information on item config callbacks to be missing, causing some GUI applications (like Apis Management Studio) to show some attribute values incorrectly (i.e. showing an item handle instead of an item name).
  • Bugfix: Fixed mismatch of double-quoting string attributes when converting to text and parsing back from text, causing errors/issues for configuration export/import in applications like Apis Management Studio.

ApisCnxMgrBee

  • Feature: Use polling to get the servicelevel for each OpcUa Connection item instead of subscribing to the servicelevel node. This is a workaround so that non-conformant servers can be part of an OpcUa Cluster.
  • Feature: the OpcUa replication item has a new attribute SyncGlobalAttributes that can be used to enable/disable the synchronization of global attributes.

ApisIEC104Bee

  • Feature: added shared module property Mode that controls the operational mode of the module.
  • Bugfix: fixed issues wrt TLS connection
  • Bugfix: fixed possible deadlock during connection
  • Bugfix: fixed issue where connecting state item could have wrong value after restart
  • Bugfix: fixed issue with watchdog triggering delay

ApisIEC61850Bee

  • Feature: added shared module property Mode that controls the operational mode of the module.
  • Bugfix: fixed issues wrt TLS connection
  • Bugfix: fixed possible deadlock during connection

Apis Management Studio

  • Bugfix: When showing trace logs in AMS, either by using Show Log or File->Import Log Files, every 100th entry in the underlying log file(s) was missing from the Log View.

Apis Backup and Restore

  • Feature: Added new /localOnly restore option to api and cmdline utility for specifying that all external urls, connection strings, hostnames and ip addresses should be replaced by localhost

Apis Semantics

  • Bugfix: When limiting the number of namespace database versions to retain, the result databases may be copied straight into the proxy database folder after a crawl. Thus, when determining the number of files to delete, it must be accounted for that the new database file may already be present in the proxy-database directory.

Apis OPC UA Server Library

  • Bugfix: a potential application crash during subscription teardown has been fixed.
  • Bugfix: a memoryleak could occur when a client is disconnected, this is now fixed.

Apis Foundation 9.16.2 release notes

ApisEventBusBee

  • Bugfix: Fixed a memory leak when processing events.

Apis Management Studio

  • Bugfix: In certain conditions AMS would crash when loading an acf-file with a History Explorer view containing items from a disconnected Hive.

Apis Semantics

  • Feature: Added config and support for setting the max number of databases that should be retained after updating the namespace database (nodeset import or crawling).
  • Bugfix: When a namespace is updated from a crawled apis semantics namespace already containing itemnames, only create items for hive's value attributes (not eu, eurange and definition)

Apis Security Server

  • Bugfix: Fixed error saying insufficient parameters supplied to the command, caused by upgrading to System.Data.SQLite.Core version 1.0.118.

Apis Foundation Installer

  • Bugfix: The installer will now remove leftover WAL and SHM files from the previous installation.

Apis Foundation 9.16.1 release notes

Apis Hive

  • Feature: added new Apis attribute "ChronicalParent" used to specify an eventsource parent for any item.
  • Feature: added new eventfields for tracking the external eventid and provider on events received from OPCUA servers, and use the external eventid, if available, as the OPCUA eventid.
  • Feature: the OPCUA source nodeid for events where the chronical eventsource does not have an OPCUA source nodeid specified, will now contain the path to the chronical eventsource.

ApisAlarmAreaBee

  • Feature: use the new attribute ChronicalParent to override the default eventsource parent for an item.

ApisIEC104Bee

  • Feature: Added the 'Any item' watchdog mode that checks for changes in any of the items, and the 'Any data' watchdog mode that checks for any data frames, incl. test frames, received.

ApisOpcUaBee

  • Feature: store external event id and provider on all events received by EventMonitorItems
  • Feature: The eventmonitor item now can identify eventsources when the source nodeid is unknown, but has the new eventsource-nodeid format generated by ApisHive.

Apis Semantics

  • feature: Return "Access denied" when writing attributes that are modifying the address-space and is not passed through from a proxy to its' connected server (ie. only allow datavalue and the semantics-specific itemname attribute for proxy namespaces).

Configuration Repository

  • Bugfix: Fixed logging issue in the configuration repository (Linux version)

Apis Foundation 9.15.15 release notes

Apis Hive NET API

  • Bugfix: Fixed mismatch of double-quoting string attributes when converting to text and parsing back from text, causing errors/issues for configuration export/import in applications like Apis Management Studio.

Apis Management Studio

  • Bugfix: When showing trace logs in AMS, either by using Show Log or File->Import Log Files, every 100th entry in the underlying log file(s) was missing from the Log View.

Apis Semantics

  • Bugfix: When limiting the number of namespace database versions to retain, the result databases may be copied straight into the proxy database folder after a crawl. Thus, when determining the number of files to delete, it must be accounted for that the new database file may already be present in the proxy-database directory.

Apis Foundation 9.15.14 release notes

Apis Management Studio

  • Bugfix: In certain conditions AMS would crash when loading an acf-file with a History Explorer view containing items from a disconnected Hive.

Apis Semantics

  • Feature: Added config and support for setting the max number of databases that should be retained after updating the namespace database (nodeset import or crawling).
  • Bugfix: When a namespace is updated from a crawled apis semantics namespace already containing itemnames, only create items for hive's value attributes (not eu, eurange and definition)

Apis Security Server

  • Bugfix: Fixed error saying insufficient parameters supplied to the command, caused by upgrading to System.Data.SQLite.Core version 1.0.118.

Apis Foundation Installer

  • Bugfix: The installer will now remove leftover WAL and SHM files from the previous installation.

Apis Foundation 9.15.13 release notes

ApisEventBusBee

  • Bugfix: Fixed a memory leak when processing events.

Apis Foundation 9.15.12 release notes

ApisHive Modules

  • Bugfix: Fixed memory leaks when doing arithmetic operations on APIS Variants that cannot be converted to floating point numbers.
  • Bugfix: Fixed memory leaks when parsing Expression attributes containing 'if-elseif-...-else' expressions in Function items.

Apis Backup and Restore

  • BugFix: Will now backup registry values in the "Modules" subkey within a hive instance.

Apis Foundation Installer

  • List of prerequisites included by the installer:
    • Microsoft .NET Framework 4.8
    • Microsoft .NET Core 6.0.29 Server
    • Microsoft .NET Core 6.0.29 Desktop
    • Microsoft Visual C++ 2015-2022 Redistributable 14.38.33135.0
    • OPC Core Components 3.0.108

Apis Foundation 9.15.11 release notes

Apis HoneyStore

  • Feature: ConfigSinkTimeoutSeconds added to the RuntimeSettings configuration section, to allow for a max timeout period when notifying configuration observers (clients) when shutting down databases/Honeystore service. Default is 300 seconds.
  • Feature: Improved handling of configuration change callbacks: do not call out when in a locked state, and improved trace logging for observer callbacks.
  • Feature: Introduced interface IApisHSConfigRequest2, to let clients identify themselves when requesting configuration changes. This is to allow for better logging and debugging of configuration observers.

Apis Hive

  • Feature: Added [Show running states of instances] option in ApisBuddy file-menu to enable/disable balloon tip notifications for the running state of instances in ApisBuddy. Default state is set to disabled.

ApisCnxMgrBee

  • Bugfix: Avoid possible crash when failing to update a global attribute on one or more items during replication.
  • Feature: a new attribute 'ServiceLevel NodeId' has been added to the 'OpcUa Connection' item. This property controls/overrides the default nodeid to read the servicelevel from.

ApisEventBusBee

  • Bugfix: the Source.Chronical item now correctly handles nested alarmareas and eventsources with slashes in their names

ApisHSMirrorBee

  • Bugfix: When Honeystore was shutting down, the MirrorBee module was not properly disconnecting from Honeystore, now it does!

ApisLoggerBee

  • Bugfix: Fixed possibility for a crash caused by using a null pointer interface when not connected to HS and trying to look up an HS item from a Hive item.

Apis Management Studio

  • Feature: Port for Bare services can now be set in AMS’ ioc.xml file.

Apis OPC UA Server Library

  • Bugfix: avoid a potential deadlock when checking for dead sessions

Apis OPC UA Client Library

  • Bugfix: avoid use-after-free when federating an event historyread request with continuationpoints

CsScript

  • Feature: AllGood* methods now treat NaN values as Bad quality.

Apis Foundation Installer

  • List of prerequisites included by the installer:
    • Microsoft .NET Framework 4.8
    • Microsoft .NET Core 6.0.28 Server
    • Microsoft .NET Core 6.0.28 Desktop
    • Microsoft Visual C++ 2015-2022 Redistributable 14.38.33135.0
    • OPC Core Components 3.0.108

Apis Foundation 9.15.10 release notes

ApisCnxMgrBee

  • Bugfix: A bug was introduced in 9.15.9 that caused every other newly created item during replication to get wrong values on the SrcUaNodeId and SamplingInterval attributes.

ApisOpcUaBee

  • Bugfix: changes re. batchsize in 9.15.9 introduced a bug where items in the OpcUaBee might get assigned invalid datatype/accessrights when connecting to the OpcUa Server.

Apis Foundation 9.15.9 release notes

Apis Hive

  • Bugfix: Fixed bug causing annoying error messages saying "Unexpected error when validating properties" when adding an attribute to an item in Apis Management Studio.
  • Bugfix: If an eventsource already existed for a semantics-item, regenerating items for the semantics namespace would fail to update the required flags on the eventsource due to a bug in the eventserver. For OPCUA clients subscribing on events from such eventsources, the consequence would be that the events would contain an invalid SourceNode ID. This has now been fixed.

Apis Hive NET API

  • Bugfix: Fixed bug causing an exception when importing external items on items, where the same external items are used more than once (bug introduced during performance improvements in 9.15.6).

ApisCnxMgrBee

  • Feature: a new attribute 'ReplicationMode' has been added to control how the replication item should behave. The default value is 'Copy' which means that each item found on the server gets a matching OpcItem in the client. If changed to 'Mirror', the replication item will also remove local OpcItems that are not found on the server.
  • Feature: The replication item will now trigger the 'SyncStandardProperties' command on each replicated module after adding and/or removing items on the module.
  • Feature: The replication item now has an attribute 'SamplingInterval' that is used to override the default sampling interval on replicated items.
  • Bugfix: The replication item will no longer touch module properties that already have correct values, to avoid unnecessary reconnects to the opcua server.

ApisOpcUaBee

  • Feature: StandardPropSync can now be triggered from the EventBroker, and will always run asynchronously, i.e. not block the OpcUaBee from getting new data from the server. There is also a new CommandItem that can be used to trigger a full sync of all items in the OpcUaBee. To complement this new feature, StandardPropSync is now a multi-select property with the options "First time adding item", "Each session" and "After replication", where the latter option will trigger a full sync of all items in the OpcUaBee when the module is updated from a Replication item on the CnxMgrBee.

Apis OPC UA Server Library

  • Bugfix: sending an empty nodeid when creating monitored items could cause the server to crash, this is now fixed.
  • Bugfix: when receiving publish-requests with an invalid session authentication token, the server would never send a response. Now, the server will send a response with serviceresult BadIdentityTokenInvalid.
  • Bugfix: when receiving too many publish-requests for a session, the server would never send a proper publish-response. Now, the server will send a response with serviceresult BadTooManyPublishRequests.

Apis OPC UA Client Library

  • Bugfix: when requesting blocking RPC messages, the library user could get a bogus errorcode if the server did not respond in time. A bogus error message (Invalid RPC object, response missing) could also be logged; these issues are now fixed.
  • Bugfix: honor batchsize when encoding Modify/DeleteMonitoredItems requests

APIS HoneyStore Replication

  • Feature: Improved error-logging on missing, required configuration parameters.
  • Prerequisite: This release installs .NET Framework 4.8 on the target machine, if not already installed.

Apis Foundation 9.15.8 release notes

Apis HoneyStore

  • Bugfix: fixed a bug causing checksum calculations to hit a buffer overrun when working on corrupt data, which also crashed the ApisHoneyStore service.

ApisChronical

  • Bugfix: when searching for eventsources, the search could fail when using the "Contains" mode combined with the "Case insensitive" option, this is now fixed.

ApisEventBusBee

  • Bugfix: If the Sink.Db-item loses its connection to the database, it will now try to reconnect every time a new event is processed.
  • Bugfix: improve performance when locating the alarmareas for an eventsource

Apis OPC UA Server Library

  • Bugfix: in certain cases, the OPC UA server would fail to release a session and its subscriptions, this is now fixed.

Apis Foundation Installer

  • Bugfix: Install .NET core 6.0.26 even if .NET core 7 or newer is installed
  • List of prerequisites included by the installer:
    • Microsoft .NET Framework 4.8
    • Microsoft .NET Core 6.0.26 Server
    • Microsoft .NET Core 6.0.26 Desktop
    • Microsoft Visual C++ 2015-2022 Redistributable 14.38.33130
    • OPC Core Components 3.0.108

Apis Foundation 9.15.7 release notes

Apis HoneyStore

  • Bugfix: Fixed bug causing the metered licensed items count of Honeystore to leak licensed items.
    When attempting to add more items to a full database (i.e. Used items == Max items), or more items than there are free items,
    the count of items attempted added would erroneously be counted as metered licensed items, eventually leading to all licensed items being consumed.

Apis Hive

  • Bugfix: A refcount error could occur if an OpcUa client tried to create an invalid event subscription, leading to a memory leak of session objects in the OpcUa server.
  • Bugfix: In certain cases, when running as a service, Hive would report its ServiceStatus as stopped during startup, causing the ServiceMain function to exit normally. Hive would then also exit normally even though it had not yet finished starting.
  • Feature: Added support for new engineering units: VAr/min, kVAr/min
  • Feature: default value for the registry setting UAServer/Limits_MaxSubscriptionsPerSession has been changed from 16 to 0, effectively using the default limit defined by the OPCUA server library in new installations. Note: existing hive instances will not be affected by this change, since the existing registry value will override the new default behavior.

Apis Hive NET API

  • Bugfix: Fixed bug in ImportAndCheckNamespace for nodesets containing errors

ApisChronical

  • Feature: improved performance when registering/looking up eventsources
  • Bugfix: eventfields declared with datatype "Variant" containing a vector was not serialized to the database correctly, this is now fixed.

ApisCnxMgrBee

  • Bugfix: when connection to an opcua server is lost, the servicelevel for that server is now set to 0 so that a cluster failover can occur.

ApisIEC104Bee

  • Bugfix: Fixed a bug related to the Enabled property and the watchdog functionality. After a module restart, an internally cached enabled flag was set to false, and thereby prevented the watchdog functionality from detecting whether the module was in a disconnected state. Also, watchdog evaluation did not kick in if connection state was good, despite not receiving data from equipment.

ApisOpcUaProxyBee

  • Bugfix: when multiple namespaces were federated, a bug in the handling of namespace mappings could cause history-reads to fail with a BadInternalError statuscode.
  • Bugfix: a deadlock could occur if an OpcUa client issued RPC messages that should be federated before the mapping between local and remote namespace tables had been created.

Apis Management Studio

  • Bugfix: Import mappings via 'Transformation Expressions' sometimes failed with 'Unable to access temporary directory' error. Fixed.

Apis Semantics

  • Bugfix: When recreating item names for a namespace, items could be created and/or renamed while the namespace database was left unchanged. The root of the problem was caused by having variables in the namespace that lack a hierarchical path to the root node (objects folder). This would lead to missing subscriptions and possibly duplicate items if attempting to recreate item names yet again with a different naming configuration.
  • Bugfix: When importing XML-nodeset files, sometimes the LocalizedText was not imported correctly. This is now more robust.
  • Bugfix: When importing XML-nodeset files for an existing namespace where names were changed on nodes leading up to a data variable, the importer was unable to detect that the connected data source (hive item) should be renamed. Leading to an orphaned hive item for the old name.

Apis OPC UA Server Library

  • Bugfix: improve performance when one session has many subscriptions. Also, set default limit for number of subscriptions per session to 1024.

Apis OPC UA Client Library

  • Bugfix: in certain cases, rpc response handling could hide a missing response object.

CsScript

  • Bugfix: Changed the '==' and '!=' operators for VQTs for Double and Float from '==' to Equals to handle NaN values, which fixes never ending loops in some cases when using the State cache.

Apis Foundation 9.15.6 release notes

Apis HoneyStore

  • Bugfix: Some robustifications related to configuration observer locking, to prevent a deadlock-like situation in case an ill-behaving observer threw an exception.
  • Performance: Some improvements on starting Honeystore from a "dirty" shutdown, e.g. when the Honeystore service is terminated prematurely when computer is rebooted.

Apis Hive

  • Performance: Reducing memory usage and improving speed for inter module communication during startup, shutdown and when changing the configuration at runtime.
  • Feature: The default RunStateChangeTimeout for loading and starting modules, has been increased from 120 seconds to 300 seconds per module.

ApisHive Modules

  • Performance: Added a feature/tweak for boosting performance, when importing (huge) configuration files containing external items. Enable it by adding a DWORD value in registry, named PreventExternalItemsReconnectWhenStarted, under key:
    HKEY_LOCAL_MACHINE\SOFTWARE\Prediktor\Apis\<InstanceName>\Modules
    Value > 0 turns tweak On, = 0 turns tweak off.
    Note! When using this feature, you must restart the Hive instance after importing the configuration file, to make the external items reconnect to their source items.
    Note2! Enabling this tweak will also inhibit ExtItemMetaTransfer from detecting and transferring meta data at runtime (will work after a restart though).

ApisAlarmAreaBee

  • Feature: support generating eventtypes that do not inherit from ConditionType, e.g. SystemEventType.

ApisCnxMgrBee

  • Feature: support the 'Logger_Expr' global attribute during replication
  • Bugfix: Make OpcUaItem.ServiceLevel work correctly with ItemAttributeItems
  • Bugfix: when an OpcUa Connection item is modified, the changes are now propagated to connection users (such as ApisOpcUaBee) even if the user got the connection info through an OpcUa Cluster item.

ApisLoggerBee

  • Bugfix: Fixed bug causing an ambiguous deprecation state of the global attributes of the Logger module when the module is deleted (or aborted during adding), often seen as the attribute description being equal to 'dummy'.

ApisModbusBee

  • Bugfix: #Connect# control item now obeys any initial value configured on it.
  • Performance: Improved performance when building internal register tables, noticeable for large configurations.

ApisOpcUaPublisherBee

  • Bugfix: Fixed timeformat in common timestamp in OPCUA JSON format.

Apis Management Studio

  • Bugfix: Import mappings via 'Transformation Expressions' sometimes failed with 'Index out of bounds' error. Fixed.
  • Feature: Default FilterType changed from 'Contains' to 'Like'
  • Performance: Lots of performance improvements when importing configuration files.

Apis Semantics

  • Feature: Support importing nodeset2.xml files for a namespace that has been deleted, without requiring a restart of the opc ua server (i.e. the hive instance)
  • Feature: Added opc ua method in namespace 1 (Hive) for listing semantics namespace configurations.
  • Bugfix: Fixed a bug in nodeset export introduced in version 9.14 that caused certain variables not to be exported.

Apis OPC UA Server Library

  • Bugfix: make sure that all monitored items respect changes to MonitoringMode.

Apis UANSService

  • Feature: Use custom opc ua method for copying namespace databases from the server if the opc ua server is hive >= 9.15.6
  • Feature: Read namespace configuration (naming rules etc.) from the server if the opc ua server is hive >= 9.15.6
  • Feature: Add namespace config property for specifying that (item) names should be taken from a copied database rather than regenerated.

Configuration Repository

  • Bugfix: Fixed upload/download of large textfiles

Apis Foundation 9.15.5 release notes

Apis Hive

  • Bugfix: wait until all hive modules have been configured before opening opcua endpoints

ApisAlarmAreaBee

  • Feature: add property EnableAlarmEvaluationDelayPeriods
  • Bugfix: Improve errormessage when failing to save an event

ApisOpcUaBee

  • Bugfix: do not try to save ConditionRefreshStart and ConditionRefreshEnd events

Apis Foundation 9.15.4 release notes

Apis Hive

  • Bugfix: Conversions from OPCUA ContentFilter to chronical filter expressions now maps OPCUA namespace indexes to the chronical namespace table, enabling e.g. filtering on custom eventtypes.

ApisChronical

  • Bugfix: The option "Persist state" was wrongfully set to "Enabled", and could not be changed. This is now fixed, meaning that the default value now is "Disabled".

Apis Management Studio

  • Bugfix: improve performance when working with Hive modules containing lots of items

Apis Foundation 9.15.3 release notes

Apis Hive

  • Bugfix: Certain chronical eventfields containing datetime values would be mapped to OpcUa UInt64 values, now all such fields are mapped to OpcUa DateTime values.

Apis Foundation 9.15.2 release notes

Apis Hive

  • Bugfix: do not allow clients to change the Runmode property of Apis Event Server
  • Bugfix: do not spam the logfiles with warning-messages about unsupported event fields for each event returned in OpcUa subscriptions and historyread requests.
  • Bugfix: fixed a bug in ApisNativeStorage crashing ApisHive when the storage file is corrupt.
  • Bugfix: make sure that eventsource links are updated when registering an existing eventsource
  • Bugfix: translate non-error returncodes from Chronical into non-error HRESULTs

ApisChronical

  • Bugfix: honor the deleted-flag on both eventsources and eventsource-links during historyread, subscriptions and eventsource search.

ApisAlarmAreaBee

  • Bugfix: Unlink the alarmarea and the tag eventsources when the EvtCategory of the tag is changed to "Not monitored"
  • Bugfix: Do not generate a "final,disabled" event on the eventsource when a tag is deleted or its evtcategory is set to "Not monitored"

ApisCnxMgrBee

  • Feature: Replication item now supports filtering based on modulename and -type
  • Bugfix: Replication item no longer replaces local state-items when replicating items with identical names as existing state-items on the local OpcUaBee.

ApisOpcUaBee

  • Bugfix: A bug was introduced in 9.14.7 which causes ApisHive to crash when the OpcUaBee reads historical data from an OpcUa server and the data received contains string values.

Apis Management Studio

  • Feature: the "Regenerate items" menuitem (under "Perspectives") now also regenerates missing eventsources

Apis Semantics

  • Feature: Return "Access denied" when trying to delete nodes that are mandatory for its' parent object when the parent is not deleted.

Apis Foundation 9.15.1 release notes

Apis Hive

  • Feature: a new event field, AckTime, is now supported in the EventServer
  • Feature: Hive modules exposed in the OpcUa server now references custom typedefinitions which defines the types of the modules.
  • Bugfix: Fixing the APIS Hive UA Server to obey the ExtendedRights attribute on browse/read/write on item-nodes and their value.

ApisChronical

  • Feature: a new flag "Immutable" can be specified per event type. This flag disables filtering of replaced events on historical reads.

ApisAlarmAreaBee

  • Feature: the new attribute ChronicalEventType can be used to specify a custom EventType for new events
  • Feature: the new attribute ChronicalSourceName can be used to specify a custom SourceName for new events

ApisCnxMgrBee

  • Feature: Replication item now supports setting a non-default publishing interval on the replicated modules.

Apis Management Studio

  • Bugfix: Timestamps are sorted as time instead of as string for Event Status View and Event History View.

Apis Semantics

  • Feature: namespaces that define custom eventtypes can now optionally create matching eventtypes in Apis Chronical. This is controlled by the new property Update Apis Eventtypes on the ApisSemanticsBee and by the new attribute Update Apis Eventtypes on Namespace items on the ApisOpcUaBee.

Apis OPC UA Server Library

  • Feature: browsing of OPCUA namespaces has been optimized for better performance

Apis Foundation 9.14.11 release notes

Apis Hive

  • Bugfix: Certain chronical eventfields containing datetime values would be mapped to OpcUa UInt64 values, now all such fields are mapped to OpcUa DateTime values.

Apis Foundation 9.14.10 release notes

Apis Hive

  • Bugfix: do not allow clients to change the Runmode property of Apis Event Server
  • Bugfix: do not spam the logfiles with warning-messages about unsupported event fields for each event returned in OpcUa subscriptions and historyread requests.

ApisChronical

  • Bugfix: honor the deleted-flag on both eventsources and eventsource-links during historyread, subscriptions and eventsource search.

ApisAlarmAreaBee

  • Bugfix: Unlink the alarmarea and the tag eventsources when the EvtCategory of the tag is changed to "Not monitored"
  • Bugfix: Do not generate a "final,disabled" event on the eventsource when a tag is deleted or its evtcategory is set to "Not monitored"

ApisCnxMgrBee

  • Bugfix: Replication item no longer replaces local state-items when replicating items with identical names as existing state-items on the local OpcUaBee.

ApisOpcUaBee

  • Bugfix: A bug was introduced in 9.14.7 which causes ApisHive to crash when the OpcUaBee reads historical data from an OpcUa server and the data received contains string values.

Apis Foundation 9.14.9 release notes

Apis Hive

  • Feature: Added support for new engineering units; VAh, kVAh, MVAh, GVAh.
    Note: only kVAh has a matching unit in the OPCUA UNECE unit namespace.
  • Bugfix: properly escape eventsource-names containing slashes

ApisChronical

  • Bugfix: a performance improvement added in 9.14.2 could cause duplicate eventsources to be created when an eventsource name contained a forward-slash. This release fixes the bug, and adds a 'rebuild' option to the commandline-tool shac which can be used to repair chronical databases with such duplicates. To rebuild a database, run the following command as administrator after stopping the relevant ApisHive instance:

    shac -ve rebuild "PATH_TO_DATABASE"

  • Bugfix: duplicate eventsources could be created when renaming a tag in ApisHive only to change the case of some characters in the tag-name. This will also be fixed by the 'rebuild' command.

ApisOpcUaMethodBee

  • Bugfix: Fix database connection leak
  • Bugfix: Fetch available stored procedures and parameters by using the information_schema, making it compatible with postgresql, mssql and possibly others.
  • Feature: Will now use functions returning records(et) in postgresql, and stored procedures in mssql
  • Bugfix: Made the component a lot more fault-tolerant, and support a few more primitive datatypes.
  • Feature: Added basic support for schema names for stored procedures

ApisOpcUaPublisherBee

  • Feature: Added the optional value tptxt to Prediktor defined JSON message. This value gives the datatype as text e.g. "Float", "Double", Int32,...

Apis Backup and Restore

  • Feature: When not restoring on the machine that was the original backup source, registry keys or values configured to be restored only when restoring on the same machine as where the backup was taken will no longer be deleted.

Apis OPC UA Server Library

  • Feature: improved throughput when lots of requests are received in parallel
  • Bugfix: closed a memoryleak caused by invalid refcounting of secure channels
  • Bugfix: correctly convert empty/irregular matrices to and from the UaStack

Apis Foundation 9.14.8 release notes

Apis Semantics

  • Feature: Modified item name generation algorithm to try to be more deterministic. When there are multiple parents, prioritize nodes with the same name as any existing itemname. Thereafter, prioritize variables over objects. When there are multiple parent variables or objects and no (matching) existing itemname, sort by name and use the first entry.

APIS HoneyStore Replication

  • Feature: big performance improvement for replications using a large GetDataChunkSize setting.

Apis Foundation 9.14.7 release notes

Apis HoneyStore

  • Bugfix: Fixed bug causing heavy trend file fragmentation, with only a few samples inside each datablock, resulting in lots of trend files containing small amounts of data.
    This would happen for any of the sampled Recordtypes, when using the MaxCacheDuration attribute.
  • Bugfix: Fixed bug preventing termination and repair of trendfiles having an empty datablock at the end.
  • Bugfix: Robustification and performance tweaks, when running the Honeystore repair tools in parallel and when cancelling tasks.
  • Feature: Added the /numthreads:N command line switch to the Honeystore repair tools. Use this to control/limit the number of threads running in parallel when repairing and the option /serial is not specified.
    N must be in the range [0,256] (0 means same as default => sets N equal to number of logical threads on the given hardware).
  • Feature: Removed the /expert , /expertmode command line switch from the Honeystore repair tools, as it is no longer needed. Expertmode functionality is now the default.

Apis Hive

  • Bugfix: only include interesting (i.e. active or ack-required) alarms when OPCUA clients call ConditionRefresh

ApisIEC104Bee

  • Bugfix: Fixed bug preventing the user from setting the module properties Watchdog timeout and Auto read period to values larger than 65535 ms.

ApisLoggerBee

  • Bugfix: When using a Resolution equal to 0 and none of the eventbroker Log* commands are connected, the Logger did not reestablish its connection to the Apis Honeystore service if Apis Honeystore was restarted while Apis Hive was running.
    This is now fixed; under these conditions the Logger reconnects to the Apis Honeystore service and handles read/write requests properly after Apis Honeystore is restarted.

Apis Backup and Restore

  • Feature: Added support for specifying registry keys or values that should only be restored when restoring on the same machine where a backup was taken.

Apis Semantics

  • Bugfix: Fix database lock issue that could occur when importing a nodeset2.xml namespace where the resulting namespace database had an invalid (or missing) namespace table.

Apis Foundation 9.14.6 release notes

ApisIEC104Bee

  • Bugfix: Fixed a locking issue where the module called out to the IEC device while in a write-locked state, effectively blocking all other reading or writing threads until the device had responded.
  • Bugfix: Changed the module properties Watchdog timeout and Auto read period from 16-bit to 32-bit unsigned integers,
    allowing values larger than 65000 ms.
  • Feature: Added the common module properties ExtItem pass-through quality and ExternalItem report.

ApisModbusBee

  • Feature: Added new item type: Variable items, similar to the type in the Worker module.

ApisOpcUaMethodBee

  • Bugfix: fix invalid sql parameter parsing

Apis Management Studio

  • Bugfix: Fixed issue with missing deploy button after security server config change.

Apis Foundation 9.14.5 release notes

Apis Hive

  • Feature: The OpcUa.EventId field generated for ApisChronical events now includes the fields generation and sequence to make sure that the EventId is unique for each event. These fields must now be included when an event is acknowledged. The function IApisEventserver2.Acknowledge is therefore deprecated, and the new function IApisEventserver9.Acknowledge2 must be used instead.
  • Feature: add registry option to disable federated timeseries access

Apis Hive NET API

  • Feature: Expose the IEventServer.Acknowledge2 method, which requires the fields generation and sequence. If connected to an older version of ApisHive, the Acknowledge2 implementation will fall back to the old Acknowledge method.
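
  A minimal, hypothetical C# sketch of the migration is shown below. The interface is an illustrative stand-in only; the actual signature of IEventServer.Acknowledge2 in the Apis Hive NET API may differ.

    // Illustrative stand-in for the event server interface; the real
    // IEventServer.Acknowledge2 signature in the Apis Hive NET API may differ.
    public interface IEventServerSketch
    {
        // Legacy acknowledge (used as a fallback against older ApisHive versions).
        void Acknowledge(string eventId, string comment);

        // New acknowledge: generation and sequence, taken from the event's EventId,
        // identify the exact event instance being acknowledged.
        void Acknowledge2(string eventId, long generation, long sequence, string comment);
    }

    public static class AckMigrationExample
    {
        public static void AcknowledgeEvent(IEventServerSketch server, string eventId, long generation, long sequence)
        {
            // Carry the generation and sequence fields through from the EventId.
            server.Acknowledge2(eventId, generation, sequence, "Acknowledged by operator");
        }
    }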

Apis Management Studio

  • Feature: Use the new IApisEventserver.Acknowledge2 method to acknowledge events.

CsScript

  • Feature: Added built in functions PosFlankDelay and NegFlankDelay

Apis Foundation 9.14.4 release notes

ApisOpcUaPublisherBee

  • Feature: Using an async sending function. Updated the NuGet package Confluent.Kafka to version 2.1.1.
  • Feature: Added the properties SslCertificateLocation and SslKeyLocation to configure the Kafka module.
  • Feature: Added Apis vqt Version 2.0 message.
  • Updated the NuGet package Confluent.Kafka to version 2.02.

Apis Management Studio

  • Bugfix: Instance information (run state) for Hive/Honeystore is now also shown when they are stopped (including start times).
  • Bugfix: Create semantic object now sets EventNotifier = 0, and EventNotifier = null from the server is now handled.
  • Bugfix: Property filter on items with the "or" operator now works as intended.

Apis Semantics

  • Feature: HI/LOW EU + HI/LOW Instrument Range OPC attributes are now available by default for new function items created by the Apis Semantics bee.

Apis OPC UA Server Library

  • Bugfix: avoid possible crash during shutdown of OPCUA server
  • Bugfix: handle event historyread calls with multiple nodeids
  • Bugfix: avoid deadlock in the UaStack's error handling when sending response messages

Apis Foundation 9.14.3 release notes

Apis HoneyStore

  • Bugfix: Fixed a bug when inserting overlapping data that could lead to wrong error code(s) being returned for the inserted samples when one or more samples failed to be inserted.

Apis Management Studio

  • Bugfix: "Value cannot be null, parameter: value" issue while connecting to security server from AMS.
  • Bugfix: "Edit Object" caused NullReferenceException due to empty enum sets.
  • Bugfix: "Edit Object", modelling rules for children are now correct.

Apis Foundation 9.14.2 release notes

Apis HoneyStore

  • Feature: From this version, the Demo license never expires. This means that when running without a valid license, all functionality will be intact with no timeout.
    Note that this does not relax the requirement for having a valid license to run the APIS software!
  • Bugfix: Improved robustness when opening corrupt trendfiles, and also possibly rescuing more data from corrupt files when running repair.

Apis Hive

  • Bugfix: Fixed a bug in the OPC UA server regarding reading historical aggregates. The OPCUA HA aggregates PercentGood and PercentBad were returned as fractions (as for classic OPCHDA) and not as percent values (0-100). This has now been fixed.
  • Feature: From this version, the Demo license never expires. This means that when running without a valid license, all functionality will be intact with no timeout.
    Note that this does not relax the requirement for having a valid license to run the APIS software!

ApisChronical

  • Bugfix: some invalid optimizations of convoluted filtering in the query analyzer have been fixed
  • Bugfix: a new function added in 9.14.1 and used by EventMonitor items with UnmappedEventSourceAction=LookupSourcePath could return an incorrect result when it failed to locate the specified source.
  • Bugfix: improved performance when traversing/verifying eventsource hierarchies
  • Bugfix: removed a possibility for heap corruption with event monitors

ApisHive Modules

  • Bugfix: Optimized usage of internal ItemMetaCache for performance and to avoid rare crashes.
  • Bugfix: Fixed a bug where adding Function items from the Add items dialog in Apis Management Studio sometimes left the added Function items inactive until a restart or until the attribute(s) of the item(s) were changed.
  • Feature: When using Function items with Calculator = C#, you can now specify external item(s) named ##DummyExternalItem to let the calculator know that the external item(s) at the given positions are to be ignored.
    This is useful when you want to use the same expression for multiple items, but some of the items have a different number of external items.
    Note that to achieve this, you must apply the configuration by import from a text file. Using the Add items dialog in Apis Management Studio together with File add will also work.

ApisLoggerBee

  • Feature: Reduced the number of tracelog error messages when calculations fail, to avoid filling tracelogs with redundant messages.
    To get detailed error(s) for all failed calculations in a single Debug tracelog entry, set the LogLevel property to Debug.

ApisModbusBee

  • Fixed a bug in the Performance report function that caused ApisHive to crash when writing intensively to Holding and Coil registers

ApisOPCBee

  • Feature: Deprecated properties SrvUser and SrvPassword, as a consequence of the DCOM hardening process of Microsoft.
    KB5004442 - Manage changes for Windows DCOM Server Security Feature Bypass (CVE-2021-26414)
  • Feature: The TraceToFile functionality is now limited by default to a maximum of 10 files with a maximum size of 64 MB each; other values can be specified in the module-specific registry key using the DWORD values TraceToFileMaxCount and TraceToFileMaxSize (see the example after this list).
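
  For example, assuming the module-specific registry key lives under HKLM\SOFTWARE\Prediktor\Apis\[InstanceName]\[ModuleName] (the path is shown for illustration only; use the actual key for your module instance), the file count limit could be raised with reg.exe:

    reg add "HKLM\SOFTWARE\Prediktor\Apis\[InstanceName]\[ModuleName]" /v TraceToFileMaxCount /t REG_DWORD /d 20

  TraceToFileMaxSize is set the same way; the unit expected for its value is not stated here.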

ApisOpcUaBee

  • Feature: The SubscriptionActive property is now persisted and hence remembered between APIS Hive runs.

ApisReplayBee

  • Feature: Deprecated properties SrvUser and SrvPassword, as a consequence of the DCOM hardening process of Microsoft.
    KB5004442 - Manage changes for Windows DCOM Server Security Feature Bypass (CVE-2021-26414)

Apis Management Studio

  • Bugfix: Removed negative number block on UnitId for Custom Engineering Units

Apis Semantics

  • Bugfix: re-establish all OpcUa monitored items on properties stored in the semantics database after namespace import.
  • Bugfix: Fix bug introduced in 9.13.3 by ensuring that variables having a defined itemname in the database will read their value from the host, even if the valsrc column is undefined (DBNULL)

OPC HDA

  • Feature: Added support for OPCUA Aggregates PercentGood and PercentBad, as custom OPCHDA aggregates PercentGood (UA) and PercentBad (UA).

Apis OPC UA Server Library

  • Bugfix: use monotonic clock instead of utc clock for time measurements
  • Feature: OPC UA namespace updated to version 1.05.02
  • Feature: Removed an annoying tracelog message that filled tracelogs with the Info message "historyread finished".

APIS HoneyStore Replication

  • Bugfix: Setup: Fixed bug in selection of install folder, introduced in release 9.0
  • Feature: Setup: Added a command line option to select whether the Publisher, the Subscriber, or both should be installed; usage: REPLICATIONSERVICETYPE=PUBLISHER_AND_SUBSCRIBER, PUBLISHER or SUBSCRIBER (see the example after this list).
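
  For example, assuming the replication setup is distributed as an MSI package (the filename below is illustrative only), the Publisher could be selected from the command line like this:

    msiexec /i ApisHoneyStoreReplication.msi REPLICATIONSERVICETYPE=PUBLISHER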

Apis Foundation 9.14.1 release notes

Apis Hive

  • Deprecation: The following legacy APIS Hive modules have now been deprecated: ApisSpektronD20Bee, ApisLVEstimator2Bee, ApisScenarioBee, ApisWITSServerBee

Apis Hive NET API

  • Feature: Ignore External Items on config import if a non-empty External Item Filter is present
  • Bugfix: File import of multiple files sometimes ended with an error message saying 'itemname' is already present in the dictionary.

ApisChronical

  • Bugfix: do not include old versions of modified events in query results
  • Feature: a new event-field flag, "Sticky", has been implemented. When specified for a field, the content of the field is copied from the current event to the new event if the field is unspecified on the new event.

ApisCnxMgrBee

  • Feature: Replication item now supports toggling from an externalitem connection

ApisIEC104Bee

  • Feature: Added support for Function items on the ApisIEC104Bee module. (#5726)

ApisInterpreterBee

  • Bugfix: Interpreter module: Fixed bug in send queue.

ApisLoggerBee

  • Feature: Added support for calculations using the ModuleName_Expr attribute, also on Sampled loggers.
  • Feature: Added the property TimeseriesAccessOrder, determining the order of this logger module for handling timeseries requests when an item is logged by more than one logger module.
    Lower order values are selected before higher values when determining which logger module to use.

ApisOpcUaBee

  • Feature: Added support for catchup of events, using Event Monitor items
  • Bugfix: certain cases using non-ASCII unicode characters for e.g. OpcUa NodeIds were not handled correctly (#5724)

ApisOpcUaMethodBee

  • Bugfix: certain cases using non-ASCII unicode characters were not handled correctly (#5724)

ApisOpcUaProxyBee

  • Feature: The OpcUaProxyBee now supports the property "Optional", which prevents Browse requests received from OpcUa clients from failing even if the federated OpcUa server is not connected (#5712).
  • Feature: SetPublishingMode RPC is now federated to connected servers (#5704)

Apis Management Studio

  • Feature: Improved trace window outputs with more details and timing info, when importing configuration files.

Apis OPC UA Server Library

  • Bugfix: make sure that the UA_EVENT_MODULE_CREATED event is raised for all registered namespaces, even if the namespace driver fails to initialize the namespace. This fixes a bug where semantics namespaces with a bad initial configuration caused the namespace-mapping in chronical to get out of sync with the namespace-array in the OPC UA server (#5695).
  • Bugfix: a possible memoryleak when a client browses nodes with custom typedefinitions has been closed.

Apis OPC UA Client Library

  • Bugfix: a possible memoryleak when browsing nodes with custom typedefinitions has been closed.

Apis Foundation 9.13.4 release notes

ApisChronical

  • Bugfix: use monotonic clock instead of utc clock for time measurements

ApisCnxMgrBee

  • Bugfix: use monotonic clock instead of utc clock for time measurements

ApisOpcUaProxyBee

  • Bugfix: use monotonic clock instead of utc clock for time measurements

Apis Semantics

  • Feature: Support specifying guid or numeric ids when autocreating ids for new nodes.

Apis OPC UA Server Library

  • Bugfix: use monotonic clock instead of utc clock for time measurements

Apis OPC UA Client Library

  • Bugfix: use monotonic clock instead of utc clock for time measurements

Apis UANSService

  • Feature: Optimization - avoid reading attributes for namespaces on the server that will not be loaded as proxies.

Apis Foundation 9.13.3 release notes

ApisHive Modules

  • Bugfix: Fixed a bug when importing configuration for non-existing modules/items using ExternalItem Filters, where the filters did not resolve their External Items automatically.
  • Feature: Added support for attributes High Instrument Range and Low Instrument Range (InstrumentRange), as part of the ExtItemMetaTransfer attribute functionality.

ApisOpcUaBee

  • Bugfix: Fixed a bug that cleared the EU attribute when synching standard properties on OPC items and the UA server failed to return EngineeringUnit on one or more nodes.
  • Feature: Added support for InstrumentRange when synching standard properties on OPC items. Applied on APIS attributes High Instrument Range and Low Instrument Range, and can also be part of the ExtItemMetaTransfer attribute functionality.

Apis Semantics

  • Feature: Allow storing the value of EURange and InstrumentRange properties in attributes of the function item associated with the properties' parent.

Apis OPC UA Server Library

  • Bugfix: a deadlock could occur in the OpcUa stack when trying to flush a chunk of a response message to a closed socket when the chunk was not the final part of the message (#5722).
  • Feature: run maintenance tasks with regular intervals on all namespace databases.

Apis Foundation 9.13.2 release notes

Apis Hive NET API

  • Bugfix: File import of InitValue set the datatype to string. The InitValue datatype will now be set to DataValue (19100), if present in the import file. If DataValue is not present, Type (1) will be used. If the item exists and neither DataValue nor Type is present, the existing datatype is used.

ApisEventBusBee

  • Bugfix: closed two memory leaks (xslt processing and smtp sinks)

ApisOpcUaBee

  • Bugfix: Fixed a bug, allowing Direct catchup to possibly request catchup data older than the maximum period, as configured by the module property CatchupPeriod, when such older periods are present in any catchupstate file.

ApisOpcUaMethodBee

  • Bugfix: avoid crashing when dbconnection fails

Apis OPC UA Server Library

  • Bugfix: memoryleaks related to MonitoredItems have been fixed
  • Bugfix: a timing-dependent access-violation when handling SetPublishingMode RPC has been fixed
  • Bugfix: The "Browse" and "TranslateBrowsepathsToNodeids" services required a valid reference type to be specified. According to the standard, a missing (null or empty) reference type can be specified to return all references; this is now implemented correctly by the OpcUa server and namespace drivers.
  • Bugfix: a deadlock could happen on certain callbacks from the uastack, this is now fixed (#5714).

Apis UANSService

  • Bugfix: Sometimes, when crawling multiple namespaces containing types where a supertype is defined in one crawled namespace and a subtype is defined in another crawled namespace, the HasSubType reference would not be added to any of the namespaces. This would lead to the subtype not being available in the server hosting the namespace proxies.

Apis Foundation 9.13.1 release notes

Apis Semantics

  • Bugfix: Variables having an unsigned datatype will no longer be written as a negative number when the most significant bit is set.

Apis OPC UA Server Library

  • Bugfix: an out-of-bounds array access introduced in 9.12.1 has been fixed

Apis Foundation 9.12.1 release notes

Apis HoneyStore

  • Bugfix: HSTrendRepair tool: fixed auto repair

ApisHive Modules

  • Bugfix: Fixed a 100% CPU hang when using the Connection browser (ItemConnections) with more than 1000 External items in the configuration.
  • Feature: Added support for attributes Description, High EU and Low EU (EURange), as part of the ExtItemMetaTransfer attribute functionality.
  • Performance: Improved performance when importing configuration containing use of ExternalItem Filters and Valuetype attributes.

ApisOpcUaBee

  • Bugfix: Using a thread-safe set when bookkeeping nodes missing initial VQTs when starting Catchup.

ApisOpcUaMethodBee

  • Bugfix: Added module to installer

ApisOpcUaPublisherBee

  • Bugfix: Removed {empty value} from OPCUA JSON format when the signal has no value.

Apis Management Studio

  • Bugfix: Fixed an issue preventing attributes of the category 'Shared Module Item attributes' from being selectable as columns in views such as the Adaptive list.
    Attributes like Function item.DataChangeTrigger can now be viewed as column(s).
  • Feature: Removed the limitation of 100 items per bulk when importing configuration. All items of the same type are now imported in a single call per module.

Apis Backup and Restore

  • Feature: Added support for pre-flight check algorithms for configuration in backup sets through new CHECK command and IConfigValidator interface
  • Feature: Added external item connection validation algorithm

Apis Semantics

  • Bugfix: closed some memoryleaks.

Apis OPC UA Server Library

  • Bugfix: when handling the ModifyMonitoredItems request, the server mixed up the parameters QueueSize and DiscardOldest, this has now been fixed.
  • Bugfix: the performance counter "Number of active notifications" was not correctly updated when DiscardOldest was false and the monitored items queue was full.
  • Bugfix: avoid use-after-free on publish-response data waiting to be ack'd and/or republished.
  • Bugfix: closed a memory leak when releasing eventmonitors with filtering
  • Bugfix: improved memory usage when an OPCUA client requests monitoring with invalid parameters

Apis Foundation 9.11.1 release notes

Apis HoneyStore

  • Feature: By design, Honeystore databases accepting out-of-sequence (oos) data required at least 2 oos data samples per item to be written, otherwise the oos-data write would be rejected. As a result of the newly implemented Direct catchup feature, writing a single oos data sample to a Honeystore database is now allowed, which is necessary for the catchup to work as expected.

Apis Hive

  • Bugfix: a memoryleak related to OPC AE client disconnects has been fixed
  • Bugfix: format of parent nodeid in export of namespace has been fixed

ApisHive Modules

  • Bugfix: When setting the EU attribute on an item which doesn't have the UNIT attribute, the UNIT attribute will now be added to the item.

ApisOpcUaBee

  • Bugfix: The changes introduced on the Apis configuration storage provider, ApisNativeStorage, in version 9.8.1, to handle multi-threaded reads/writes, were insufficient to handle all scenarios. More robustness has now been added to prevent crashes when running in CatchupMode, Direct.
  • Feature: When running catchup towards misbehaving UA servers that do not return an initial VQT for all subscribed items when the subscription is created, the catchup process would wait forever to get started. Now, after 3 publish responses during startup (default) or after 600 seconds (default) without all subscribed items being updated, the catchup is forced to start. When forced, the highest of the current local and server times is used as the end time for such stale items.

Apis Management Studio

  • Bugfix: Fixed an ill-formatted ApisPaths.EngUnitQuantityDir when running Apis Management Studio in a non-development environment, which prevented the Apis Engineering Units editor from loading the engineering units correctly.
  • Feature: Performance tuning on Item lists.
  • Bugfix: Removed the connection view until a license or replacement is available.

Apis Foundation 9.10.1 release notes

Apis HoneyStore

  • Bugfix: Snapin: removed DCOM security bypass (call to CoInitializeSecurity) to meet DCOM hardening requirements

Apis Hive

  • Bugfix: Snapin: removed DCOM security bypass (call to CoInitializeSecurity) to meet DCOM hardening requirements

ApisHive Modules

  • Bugfix: Fixed a missing update for Function items when the DataChangeTrigger attribute is QualityValue and only the quality had changed.

ApisModbusBee

  • Feature: Simulate signals when Comm. Type is None (#5656)
  • Feature: Added a new write strategy via the item property EvaluationOrder: Slave no writeback. The value is only written to the slave when changed in Apis (#5657)

ApisOpcUaBee

  • Bugfix: Modified logic used when starting Catchup, when waiting for having received (at least) one VQT for each subscribed item included in Catchup.
  • Feature: Added the state-item #Catchup-WriteChunksErrors#, telling how many (if any) data chunks failed to be written to the local Honeystore database for its item.

ApisOpcUaProxyBee

  • Bugfix: When multiple namespaces were federated from the same server, a refcount bug could cause the federation to fail.
  • Bugfix: Closed a memory leak

Apis Management Studio

  • Bugfix: Handling of Custom Engineering Units now also works in Apis Management Studio. Previously they were not loaded, and custom engineering unit mappings were therefore not visible.

Apis Backup and Restore

  • BugFix: BareAgent no longer treats wiping a non-existing hive-instance as an error. Slightly improved error messages from apicide.
  • BugFix: BareAgent now stores apicide logfiles in correct log-directory ($APIS_INSTALL\Logs\Apicide) instead of root-directory.

ApisAscii

  • Bugfix: Removed DCOM security bypass (call to CoInitializeSecurity) to meet DCOM hardening requirements

Apis OPC UA Server Library

  • Bugfix: the UaStack would sometimes deadlock due to network connection issues, this is now fixed.
  • Bugfix: The OPC UA server now handles enabling/disabling publishing on a subscription correctly (#5659)

Apis Foundation 9.9.1 release notes

Apis Hive

  • Feature: a new registry setting, UAServer/Limits_MaxSubscriptionsPerSession, is used to control the maximum number of OpcUa subscriptions per session, with a default value of 16 (see the example after this list).
  • Bugfix: a stack overflow has been fixed in the ION actor subsystem
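
  For example, assuming the setting is a DWORD value named Limits_MaxSubscriptionsPerSession in a UAServer subkey under the instance's registry key (the path below is illustrative only), the limit could be raised with reg.exe:

    reg add "HKLM\SOFTWARE\Prediktor\Apis\[InstanceName]\UAServer" /v Limits_MaxSubscriptionsPerSession /t REG_DWORD /d 64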

ApisHive Modules

  • Bugfix: Added a max depth for recursive calculation and a maximum number of tokens in the formula, to avoid stack overflows and prevent users from adding function item expressions that are too large.

Apis Foundation 9.8.1 release notes

Apis Hive

  • Bugfix: In the Apis Hive IApisItemConnections service, fixed a never-ending recursion when items have a circular external item configuration. (#5600)
  • Bugfix: The Apis configuration storage provider, ApisNativeStorage, did not handle multiple threads reading/writing at the same time. This has now been fixed. (#5598)
  • Performance: Optimized loading and saving Apis configurations. In some configurations with lots of (thousands of) External items on some items, typically as a result of using the ExternalItem Filters attribute, loading and saving configurations took a long time. This has been addressed and optimized. (#5597)
  • Feature: LexFloatServer licensing software updated to version 4.8.6. The EXE is now signed, so that virus scanners do not flag it as malware.

ApisHive Modules

  • Bugfix: Specifying huge and complex Function item Expressions (length > 25000 characters with lots of operators) would crash Apis as a result of an unhandled stack overflow exception. This has now been fixed. (#5599)

ApisAlarmAreaBee

  • Optimize the attribute config observer to detect uninteresting cases caused by externalitem filters.

ApisIEC104Bee

  • Feature: The value on information items is now only updated when it differs from the current value in Apis

ApisOpcUaBee

  • Bugfix: The Apis configuration storage provider, ApisNativeStorage, did not handle multiple threads reading/writing at the same time, which could cause a crash on some systems using the newly implemented CatchupMode, Direct. This mode uses a catchupstate helper file read/written to by multiple threads at the same time. (#5598)

ApisOpcUaProxyBee

  • Bugfix: Create federated sessions early, and return errorcodes to uaclients when unable to federate browse-requests.

Apis Management Studio

  • Bugfix: The Apis Engineering Units handling in Apis Management Studio is working again; it had been broken since version 9.0 due to a different installed folder layout.
  • Bugfix: Add filter / Select Columns sometimes caused a Catastrophic failure on attributes missing an enumeration string resource in the Hive module; such situations are now handled. (#5602)

Apis Backup and Restore

  • BugFix: Handle restore when the same directory is referenced in more than one config value for backup/restore (i.e. both module and item config)

Apis Semantics

  • Bugfix: Handle nodeset export when namespace being exported is missing itemnames for engineering units hosted outside the namespace database.

Apis Module NET API

  • Feature: Added capability for ApisHive modules implemented in .NET, to fire a synchronous event into the Apis Hive event broker. Use when all connected commands should be executed before the thread of execution continues. New method is: ApisModule.FireBrokerEventSync (#5595)

Apis OPC UA Server Library

  • Bugfix: the UaStack would occasionally generate invalid certificates, this is now fixed.

Apis Foundation 9.7.1 release notes

ApisHSMirrorBee

  • Performance: When starting a huge configuration where Honeystore mirror items are used as external item inputs to, typically, Function Items, meta data (attributes) from source Honeystore items was fetched and applied on the target Hive items unconditionally, i.e. even if the attribute values had not changed. This resulted in lots of unnecessary configuration observer notifications, and hence a very slow startup. This has been modified to only apply attribute values that actually have changed! (#5594)

ApisOpcUaProxyBee

  • Bugfix: Use correct lifetime and keepalive parameters when creating a federated subscription

Apis OPC UA Client Library

  • Feature: change default batch size for OpcUa sessions from 1000 to 10000

Apis Foundation 9.6.1 release notes

Apis Hive

  • Bugfix: when writing to read-only attributes on a module/item node using OPCUA, an incorrect success code was returned instead of the correct access denied result code. (#5571)

ApisHive Modules

  • Bugfix: When using the Valuetype attribute to forcefully change the value type of a Function Item, the DataChangeTrigger attribute evaluation always detected the value as changed. (#5574)

ApisIEC104Bee

  • Bugfix: Fixed bug crashing APIS when doing a Config import from Apis Management Studio. (#5573)

Apis Foundation Installer

  • Bugfix: The local Cryptlex license service is now stopped during uninstall if it is named ApisEdgeLicSvc (the default service name), to prevent installation failure on upgrade.
  • Feature: Cryptlex license service installed at local machine as default, with service name: ApisEdgeLicSvc, and installed license configuration and exe files moved to [InstallFolder]\Lex.

Apis Foundation 9.5.1 release notes

Apis Hive

  • Bugfix: after semantics namespace updates, not all deleted eventsource links were updated in chronical.
  • Feature: Eventfields mapped to OPC UA fields containing arrays of extensionobjects are now supported.

ApisHive Modules

  • Feature: added DataChangeTrigger attribute on Function Items. The DataChangeTrigger is an enumeration that specifies the conditions for when the Function item should be reported as updated inside APIS after a calculation.
    Typical use is to prevent reporting the same calculated value as updated when the actual value is not changed.

Apis Management Studio

  • Feature: improved and simplified eventsource search dialogs

Apis Foundation 9.4.1 release notes

Apis HoneyStore

  • Bugfix: Fixed an unhandled exception in HoneystoreTestApp when trying to add a new database.

ApisChronical

  • Bugfix: handle invalid syntax when searching for event sources

ApisModbusBee

  • Bugfix: Fixed inconsistent item quality handling when writing to slaves, which caused timestamps to be updated incorrectly.
  • Feature: Optimized shutdown sequence

ApisOpcUaBee

  • Feature: a new CatchupMode has been implemented, Direct. This catch-up mode writes data directly to the APIS Honeystore timeseries database, as a more efficient way of filling data gaps than the serialized catch-up kinds.

ApisOpcUaMethodBee

  • Bugfix: Deleting modules of this type could cause APIS Hive to crash (#5449)

ApisOpcUaPublisherBee

  • Bugfix: Fixed an error in OPCUA JSON messages introduced by the autosplitting of OPCUA JSON messages.

Apis Semantics

  • Feature: When updating a namespace by importing or crawling an external server, replace the namespace database file instead of trying to replace the entire content of the database file. The previous method for updating a namespace can be restored by setting the DWORD registry value HKLM\SOFTWARE\Prediktor\Apis\[InstanceName]\Semantics\ReplaceDbInline to 1 (see the example after this list).
  • Feature: Add retry functionality to namespace database flushing.
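
  For example, the previous update method can be restored with reg.exe, treating ReplaceDbInline as a DWORD value under the Semantics key (replace [InstanceName] with the actual Hive instance name):

    reg add "HKLM\SOFTWARE\Prediktor\Apis\[InstanceName]\Semantics" /v ReplaceDbInline /t REG_DWORD /d 1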

Apis OPC UA Server Library

  • OPC UA namespace updated to version 1.05.01 (#5156)

Apis OPC UA Client Library

  • Bugfix: avoid crashing when retrying to create eventmonitors with contentfilters

Apis UANSService

  • Bugfix: The last batch of discovered nodes would most likely not be persisted, which could result in incomplete namespaces.

Apis Foundation 9.3.1 release notes

Apis HoneyStore

  • Bugfix: fixed a bug when writing out-of-sequence data that could cause more than one active trend file to be present for an item
    after the out-of-sequence data write operation. This could in turn make trend history seem incomplete until a trendfile repair was performed.

Apis Hive

  • Feature: Added support for import/export of array and Matrix in nodeset2.xml files. (#5284)
  • Bugfix: Fixed some OPC UA JSON texts to align with the specification. (#5451)
  • Feature: support OPCUA ACL checks for subscriptions and historical access of events (#5435)
  • Feature: OPCUA stack updated to v1.7.7
  • Bugfix: A potential use-after-free bug on OPCUA subscription handling has been fixed (#5462)
  • Feature: convert arrays of extensionobjects to array of JSON values when assigning such object to Hive items (#5488)

ApisChronical

  • Bugfix: closed some memory leaks (#5427)

ApisModbusBee

  • Feature: Improved error handling; quality during reconnect and invalid response.
  • Feature: Added a PollDone event, making it possible to interconnect modules using one poll rate to prevent multiple modules from accessing one slave simultaneously.

ApisOpcUaPublisherBee

  • Feature: Added autosplitting of the OPCUA JSON messages when the size gets above the defined max message size. (#5452)
  • Bugfix: Fixed an error where the backfill-message sending thread was not started correctly when changing properties. (#5430 and #5328)

Apis Semantics

  • Bugfix: When adding HasEventSource references between nodes, sometimes not all the references would be replicated to APIS EventServer (#5465)

Apis OPC UA Server Library

  • Fix a crash-situation where clients issue RPC requests and immediately disconnect (#5472)

Apis OPC UA Client Library

  • Bugfix: improved validation of RPC responses (#5473)
  • Bugfix: removed a deadlock possibility (#5469)

Apis UANSService

  • Bugfix: References no longer get exported in every namespace when a parent object has children in a different namespace.
  • Feature: Use the timeout hint for browsing as well as attribute reading

Apis Foundation 9.2.1 release notes

Apis Hive

  • Bugfix: Ensure that correct versions of .NET assemblies are loaded by preloading the .NET UA client assembly during startup. This behaviour may be disabled by adding the DWORD registry entry HKLM\Software\Prediktor\Apis\[InstanceName]\InhibitUaClientPreload and specifying a value different from 0 (see the example below).
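
  For example, the preload can be disabled with reg.exe, treating InhibitUaClientPreload as a DWORD value under the instance key (replace [InstanceName] with the actual Hive instance name; any non-zero value disables the preload):

    reg add "HKLM\Software\Prediktor\Apis\[InstanceName]" /v InhibitUaClientPreload /t REG_DWORD /d 1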

Apis Management Studio

  • Bugfix: Import Namespaces: Pressing 'Backup set to import' caused crash, now fixed.
  • Feature: Possible to create a new Apis Hive instance from a Configuration Repository backup.

Apis UANSService

  • Bugfix: Allow huge collections for caching during crawling, in order to not receive OutOfMemoryExceptions that could lead to deadlocks.
  • Bugfix: Optimized the memory footprint and sped up the crawling process somewhat.
  • Bugfix: Re-prioritized some log entries, and made sure some often-duplicated log messages are not repeated as often as before.

Apis Foundation 9.1.1 release notes

Apis HoneyStore

  • Bugfix: If adding item(s) to database running in Online no write-cache mode, with non-default values for the MaxCacheDuration attribute, the mechanism for checking/validating the max cache duration in a database would start. This mechanism is only meaningful when a database is in Online mode, and subsequently would crash the ApisHoneyStore service in some situations. This issue has been fixed.
  • Feature: Updated LexFloatServer to latest version, version 4.8.4.
    To use this new version on existing installations, you must deploy this new version manually according to the ReadMe.txt file in the %ProgramFiles%\APIS\Bin\Lex folder.
    Note: You are not required to update to this latest version; any previously deployed version of LexFloatServer will run satisfactorily.
  • Feature: Increased the demo timeout period to 10 days, to give users more time to get an updated license in case an existing license becomes invalid or fails.

Apis Hive

  • Feature: Updated LexFloatServer to latest version, version 4.8.4.
    To use this new version on existing installations, you may have to deploy this new version manually according to the ReadMe.txt file in the %ProgramFiles%\APIS\Bin\Lex folder.
    Note: You are not required to update to this latest version; any previously deployed version of LexFloatServer will run satisfactorily.
  • Feature: Increased the demo timeout period to 10 days, to give users more time to get an updated license in case an existing license becomes invalid or fails.

ApisOpcUaMethodBee

  • First release of this module

Apis OPC UA Server Library

  • Fix a memory-leak due to unreleased securechannels (#5427)
  • Improve performance when modifying an OPCUA endpoint (#5363)
  • Fix uastack file access that could fail on certain certificates

Apis OPC UA Client Library

  • Bugfix: improve subscription-logic to detect missing publish responses/stale subscriptions (#5425)
  • Bugfix: avoid infinite wait for initial server response when opening a new securechannel (#5426)
  • Bugfix: a possible deadlock during disconnect has been fixed

Apis UANSService

  • Feature: Add support for storing datavalues containing arrays and 2-dimensional matrices in the cached namespace.

Apis Foundation Installer

  • Bugfix: The setups now check for the NetCore version prerequisite, preventing setup from failing when NetCore is already installed.

Apis Foundation 9.0.1 release notes

Apis HoneyStore

  • Feature: A new licensing provider has been implemented, using the software licensing API from Cryptlex. APIS Honeystore now requires a new license using our new framework; any older licenses from Sentinel are now deprecated and will no longer work! Hence, you will need to get a new license from Prediktor using our new license provider.
  • Deprecation: As a result of migrating APIS from .NET Framework to .NET (formerly named .NET Core), the APIS HoneyStore Datacenter Edition features are unavailable until further notice.

Apis Hive

  • Feature: A new licensing provider has been implemented, using the software licensing API from Cryptlex. APIS Hive now requires a new license using our new framework; any older licenses from Sentinel are now deprecated and will no longer work! Hence, you will need to get a new license from Prediktor using our new license provider.
  • Deprecation: The following APIS modules have been deprecated:
    • ApisAISXmlBee
    • ApisAsysBee
    • ApisBatchOptimizerBee
    • ApisCalculateBee
    • ApisGPSolarBee
    • ApisIMAWarehouseBee
    • ApisInnolasBee
    • ApisIntegoBee
    • ApisMessageBuilderBee
    • ApisSecsGemHostBee
    • ApisTunnelBee
    • ApisPrinterBee
    • ApisWitsmlBee
  • Deprecation: The following components have been deprecated:
    • ApisMessageBroker
    • ApisSecsGemBroker
    • ApisSMTP (use the Sink.Smtp item type of the ApisEventBus instead).
    • ApisSMS
    • ApisHSIndexing
    • IEC60870_5_104 plugin
  • Deprecation: The legacy possibility to configure the UA Server through a dedicated file, specified by setting UAServer-ConfigFile, has now been deprecated.
  • Deprecation: The following customer specific APIS modules have been deprecated:
    • CapulaFileReader
    • HFApisModules
  • Feature: The HasTypedefinition reference for Hive items now reflects the datatype of the item for numeric (BaseAnalogType) and boolean (TwoStateDiscreteType) types (#4926)
  • Feature: .NET modules are now running on .NET 6.0

ApisIEC104Bee

  • Feature: Initial release of the IEC 60870-5-104 module for APIS Hive

Apis Management Studio

  • Feature: Apis Management Studio is now running on .NET 6.0.

Apis Semantics

  • Bugfix: Handle numeric node ids with values larger than MaxInt.
  • Feature: Added TypeMemberSelection structure for specifying browsepaths to properties in (named) instances of a type that should have function items and subscriptions for cached proxies.
  • Bugfix: No longer allow namespace database backup to try forever making the copy on a busy system, in order to avoid database lockups.
  • Bugfix: Item name generation would initialize the naming root with wrong naming root type id when using blobs as node-id.
  • Feature: Optimized engineering unit postprocessing upon loading a namespace (nodeset import, and crawling).
  • Feature: Extended semantics service with new method for getting a typedef with a fully inherited instance declaration hierarchy. NB: new COM interface, ISemanticsService2
  • Feature: Added support for parent node id property having a custom semantics property id
  • Feature: Added support for datatypes inherited from standard scalar datatypes in namespace 0