#6 How to implement “Google Protobufs” serialization/deserialization?

Milestone: 1.0
Labels: protobufs
Status: accepted
Owner: None
Updated: 2 days ago
Created: 5 days ago
Creator: aliazzz
Private: No

The core component of the SparkplugB Library will be the IEC “Google Protobufs” encoder/parser.
How do we solve this issue?

  • Should we include an external C library?
    • Personally I have no experience with that but I am willing to learn it ;-)
  • Should we try to translate an existing version into a pure IEC solution?
    • Go, C, C#, Python and many more are available, but no IEC 61131-3.
  • Can we think of another way of implementing this? Ideas are welcome!

This ticket is therefore a placeholder for Milestone 1.0.

It is necessary for the following:


Sparkplug™ B MQTT Payload Definition

The goal of Sparkplug™ is to provide a specification that both OEM device manufacturers and application
developers can use to create rich and interoperable SCADA/IIoT solutions using MQTT as the base messaging
technology. For the Sparkplug™ B message payload definition, the goal was to create a simple and straightforward
binary message encoding that could be used primarily for legacy register-based process variables (Modbus
register values, for example).

The Sparkplug™ B MQTT payload specification grew out of feedback from many system
integrators and end-user customers who wanted to natively support a much richer data model within
the MQTT infrastructures they were designing and deploying. Based on that feedback from the user community,
Sparkplug™ B provides support for:
• Complex data types using templates.
• Datasets.
• Richer metrics with the ability to add property metadata for each metric.
• Metric alias support to maintain rich metric naming while keeping bandwidth usage to a minimum.
• Historical data.
• File data.

The Sparkplug™ B definition creates a bandwidth-efficient data transport for real-time device data. For WAN-based
SCADA/IIoT infrastructures this equates to lower-latency data updates while minimizing the amount of traffic and
therefore the cellular and/or VSAT bandwidth required. In situations where bandwidth savings are not the primary
concern, the efficient encoding enables higher throughput of richer data, recovering sensor data that would
otherwise be left stranded in the field. It is also ideal for LAN-based SCADA infrastructures, equating to higher
throughput of real-time data to consumer applications without requiring extreme networking topologies and/or
equipment.
There are many data encoding technologies available that can all be used in conjunction with MQTT. Sparkplug™
B selected an existing, open, and highly available encoding scheme that efficiently encodes register based process
variables. The encoding technology selected for Sparkplug™ B is Google Protocol Buffers also referred to as
Google Protobufs.

“Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.”
Google Protocol Buffers, sometimes referred to as “Google Protobufs”, provide the efficiency of packed binary
data encoding while providing the structure required to make it easy to create, transmit, and parse register based
process variables using a standard set of tools while enabling emerging IIoT requirements around richer metadata.
Google Protocol Buffers development tools are available for:
• C
• C++
• C#
• Java
• Python
• GO
• JavaScript
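
For orientation, the wire format these tools emit is simple enough to sketch by hand. Below is a minimal, illustrative Python sketch of the two protobuf building blocks (base-128 varints and field keys); it is not tied to any of the listed toolchains, and the field number and value are made up for the example.

```python
# Minimal sketch of the Protocol Buffers wire format, in plain Python
# (no protobuf library needed). Illustrative only.

def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_tag(field_number: int, wire_type: int) -> bytes:
    """A field's key is (field_number << 3) | wire_type, itself a varint."""
    return encode_varint((field_number << 3) | wire_type)

# Wire type 0 = varint. Encoding field 1 with value 150 yields the classic
# three-byte example from the protobuf documentation: 08 96 01.
blob = encode_tag(1, 0) + encode_varint(150)
print(blob.hex())  # -> 089601
```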

Additional information on Google Protocol Buffers can be found at:
https://developers.google.com/protocol-buffers/

https://developers.google.com/protocol-buffers/docs/

Discussion

  • i-campbell

    i-campbell - 5 days ago

    Hmmm... So Google Protobufs is some code: you give it a .proto file and it writes some code, which you then include in an application. There is a lot going on there!
    Of course, we (the users of the library) want only IEC code on the device.

    So I think we don't need to implement "taking a .proto file and converting it to some code" in IEC (actually we don't need it at all); really there is only one .proto file we are interested in. Maybe a future wish, but it adds no value.

    Really the way I see the solution is this:

    • SparkplugB spec says "here is a .proto file. It completely describes a binary encoding"
    • So all we need to do is write some IEC code to:
      • setup a particular memory layout(s). (one time on PLC startup)
      • Update the data of that layout(s).
      • Move this payload to the MQTT Publisher before sending.
      • Distribute that data when the MQTT Subscriber gets its payload.
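
    The four steps could be sketched roughly like this (a Python stand-in; all names are invented for illustration, and the "encoding" here is a placeholder rather than real protobuf):

```python
# Hedged sketch of the four steps: fixed layout at startup, data updates,
# a blob handed to the publisher, and distribution on the subscriber side.
import struct

class Layout:
    """One fixed memory layout, set up once at 'PLC startup' (step 1)."""
    def __init__(self, metric_names):
        self.metrics = {name: 0.0 for name in metric_names}

    def update(self, name, value):          # step 2: update the data
        self.metrics[name] = value

    def to_blob(self) -> bytes:             # step 3: blob for the MQTT Publisher
        return b"".join(struct.pack("<d", v) for v in self.metrics.values())

    def from_blob(self, blob: bytes):       # step 4: distribute received data
        values = struct.iter_unpack("<d", blob)
        for name, (v,) in zip(self.metrics, values):
            self.metrics[name] = v

layout = Layout(["temperature", "pressure"])   # once, at startup
layout.update("temperature", 21.5)
blob = layout.to_blob()                        # would go to the publisher

receiver = Layout(["temperature", "pressure"])
receiver.from_blob(blob)                       # on the subscriber side
print(receiver.metrics["temperature"])         # -> 21.5
```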

    For unit testing I would:
    • Create test data using, for example, the Python protobufs, covering all (most) of the use cases. So a table of "If data structure is this AND data value is this THEN binary blob looks like this".
    • Write the same data structures and data value combos in IEC. Do the transform. Does the IEC blob = the expected blob?
    And for the subscriber:
    • Write the same data structures and store the blob in IEC. Do the inverse transform. Do all data values = expected data values?
    There may already be some Sparkplug B transform test cases lying around on GitHub somewhere.
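
    Such a table of test vectors might look like this in Python (the encoder here is a hand-written varint stand-in for illustration; in practice the expected blobs would be generated with e.g. the Python protobuf package):

```python
# Sketch of the "table of expected blobs" testing idea. The expected bytes
# follow the protobuf wire format for a single varint field (field 1).

def encode_field1_varint(value: int) -> bytes:
    out = bytearray([0x08])  # tag: field 1, wire type 0 (varint)
    while True:
        byte = value & 0x7F
        value >>= 7
        out.append(byte | 0x80 if value else byte)
        if not value:
            return bytes(out)

# "If data value is this THEN binary blob looks like this"
TEST_VECTORS = [
    (1,   bytes([0x08, 0x01])),
    (150, bytes([0x08, 0x96, 0x01])),   # classic example from the protobuf docs
    (300, bytes([0x08, 0xAC, 0x02])),
]

for value, expected in TEST_VECTORS:
    assert encode_field1_varint(value) == expected, value
print("all vectors pass")
```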

     
  • aliazzz

    aliazzz - 5 days ago
    • labels: Encode, IEC, Google, Protobufs, Parser -->
    • status: open --> closed
     
  • aliazzz

    aliazzz - 5 days ago
    • status: closed --> open
     
  • aliazzz

    aliazzz - 5 days ago
    • summary: implement an IEC “Google Protobufs” encoder/parser --> How to implement “Google Protobufs” serialization/deserialization ?
    • Description has changed:

    Diff:

    --- old
    +++ new
    @@ -1,7 +1,18 @@
     The core component of the SparkplugB Library will be the IEC “Google Protobufs” encoder/parser.
    +How do we solve this issue?
    +
    +* Should we include an external C library? 
    +    * Personally I have no experience with that but I am willing to learn it ;-)
    +* Can we think of another way of implementing this? Ideas are welcome!
    +
    +
     This ticket is therefore a placeholder for Milestone 1.0. 
    
     It is neccesary for the following;
    +
    +
    +-----
    +
    
     Sparkplug™ B MQTT Payload Definition
    
     
  • aliazzz

    aliazzz - 5 days ago
    • Description has changed:

    Diff:

    --- old
    +++ new
    @@ -3,6 +3,8 @@
    
     * Should we include an external C library? 
         * Personally I have no experience with that but I am willing to learn it ;-)
    +* Should we try to translate an existing version into A pure IEC solution? 
    +    * GO, C, C#, Python and many more are available, but no IEC61131-3..
     * Can we think of another way of implementing this? Ideas are welcome!
    
     
  • aliazzz

    aliazzz - 5 days ago

    I've seen such a blob in the git examples. The steps you suggest actually make this problem a bit easier. I already started building a skeleton for the library. I'll upload it ASAP in the SVN corner.

    Would you be so kind as to work your ideas out in some ticket(s) so that we can distribute the workload to tackle this 'protobuf' beast? I have this feeling that the sooner we tackle the 'protobuf' dragon, the sooner we are at the finish. We must slay it before we can continue our quest :-)

     

    Last edit: aliazzz 4 days ago
    • i-campbell

      i-campbell - 4 days ago

      The dragon is but a bearded dragon.

       
  • i-campbell

    i-campbell - 4 days ago

    I did some reading today:
    0. The actual Google Protobufs compiler is quite deep. The generated code, I would say, is not so readable.
    1. https://developers.google.com/protocol-buffers/docs/encoding describes the protobuf encoding in its entirety.
    a. The order of fields is freely configurable.
    2. https://github.com/eclipse/tahu/commits/master/sparkplug_b/sparkplug_b.proto describes the sparkplug_b protobuf in its entirety.
    3. So I think we need an IEC function block called Payload which has .blob, all the fields of sparkplug_b.proto, and .xEncode and .xDecode. What do you think?
    4. I see a possibility of providing a generic protobufs library in the future, but the first implementation should cover only Sparkplug.
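
    A rough Python analogue of that Payload idea, restricted to varint fields for brevity (everything here is illustrative, not the real sparkplug_b schema):

```python
# Hedged sketch of the proposed Payload block: it holds the decoded fields
# plus the raw blob, with encode/decode as the two transforms.

class Payload:
    def __init__(self):
        self.fields = {}   # field_number -> int (varint fields only, for brevity)
        self.blob = b""

    def encode(self):      # the proposed .xEncode
        out = bytearray()
        for number, value in sorted(self.fields.items()):
            out += self._varint((number << 3) | 0)   # key: wire type 0
            out += self._varint(value)
        self.blob = bytes(out)

    def decode(self):      # the proposed .xDecode (inverse transform)
        self.fields, pos = {}, 0
        while pos < len(self.blob):
            key, pos = self._read_varint(self.blob, pos)
            value, pos = self._read_varint(self.blob, pos)
            self.fields[key >> 3] = value

    @staticmethod
    def _varint(value):
        out = bytearray()
        while True:
            byte = value & 0x7F
            value >>= 7
            out.append(byte | 0x80 if value else byte)
            if not value:
                return bytes(out)

    @staticmethod
    def _read_varint(data, pos):
        result = shift = 0
        while True:
            byte = data[pos]; pos += 1
            result |= (byte & 0x7F) << shift
            shift += 7
            if not byte & 0x80:
                return result, pos

# Round trip: encode on the producer, decode on the consumer.
p = Payload()
p.fields = {1: 150, 3: 7}
p.encode()
q = Payload()
q.blob = p.blob
q.decode()
print(q.fields)  # -> {1: 150, 3: 7}
```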

     
  • aliazzz

    aliazzz - 4 days ago

    3. So I think we need an IEC function block called Payload which has .blob, all the fields of sparkplug_b.proto, and .xEncode and .xDecode. What do you think?

    Thinking out loud and expanding on your idea:

    Yes, that sounds like a good way to handle the Payload itself:
    an FB_Payload which exposes its functionality via an interface ITF_Payload, which has the methods
    Encode and Decode.

    Every producer would like to send (Encode) its payload data.
    Every consumer would like to receive (Decode) its payload data.

    So the number of FB_Payload instances will mirror the number of devices/instruments/whatever; each payload acts as a bridge/gateway between the device code and the outside world.

    Also, from what I have understood of the hierarchy, we need to couple a single EoN instance to multiple FB_Payload instances. They share a 1..n relation with n >= 1.

    We should dynamically couple (register) the Payload instances to the EoN edge device via an interface (ITF_Payload?),
    perhaps at startup via FB_Init, using dependency injection or some other form of dynamic registration.
    I have seen code do this with the alarm handler, so we can do it too. This way the user has a minimum of boilerplate code, which I personally think is very important!
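
    As a sketch, the registration idea might look like this in Python (all class and method names here are invented analogues of ITF_Payload / FB_Init, not real library API):

```python
# Sketch of the dynamic-registration idea: payload instances register with
# the EoN node via a small interface, analogous to ITF_Payload plus
# FB_Init-style dependency injection.
from abc import ABC, abstractmethod

class ITFPayload(ABC):
    @abstractmethod
    def encode(self) -> bytes: ...
    @abstractmethod
    def decode(self, blob: bytes) -> None: ...

class EoNNode:
    """The Edge of Network node holds 1..n registered payloads."""
    def __init__(self):
        self._payloads = []

    def register(self, payload: ITFPayload):   # the FB_Init-style injection
        self._payloads.append(payload)

    def publish_all(self):
        return [p.encode() for p in self._payloads]

class DevicePayload(ITFPayload):
    def __init__(self, name):                  # injected at construction
        self.name = name
    def encode(self) -> bytes:
        return self.name.encode()
    def decode(self, blob: bytes) -> None:
        self.name = blob.decode()

node = EoNNode()
node.register(DevicePayload("pump_01"))        # 1..n devices, n >= 1
node.register(DevicePayload("valve_02"))
print(node.publish_all())  # -> [b'pump_01', b'valve_02']
```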

    4. I see a possibility of providing a generic protobufs library in the future, but the first implementation should cover only Sparkplug.

    Yes, I totally agree on this. Ease of use and debugging are key. We can always opt to separate it out at a later stage.

    Feel free to shoot at these ideas!

     
  • aliazzz

    aliazzz - 2 days ago
    • status: open --> accepted
    • assigned_to: i-campbell
     
