Statecharts are everywhere! #6 – Verification example

As we mentioned in a former post, our colleague Bence has also developed a model transformation plugin for YAKINDU. The plugin transforms a chosen SCT model into timed automata and automatically generates CTL expressions for formal verification.

Let’s have a look at its theoretical background and a practical example.

Theoretical approach

The workflow of the transformation is depicted above. First, the YAKINDU Statecharts to UPPAAL Transformation Engine parses the statechart design model (the SCT file) and looks for matches of structural graph patterns in it.

The patterns were composed in the query language of EMF-IncQuery. They are written against the metamodel of YAKINDU Statecharts, because the statecharts in the SCT model are instances of that metamodel.

After that, the Transformation Engine transforms the given statecharts into timed automata that UPPAAL can process. To achieve this, we used the metamodel of UPPAAL, which is publicly available and was designed at the University of Paderborn. The transformation code is written in Xtend and makes extensive use of the EMF Reflective API.

Finally, the Transformation Engine creates UPPAAL automata that correspond to the statecharts and generates CTL expressions to automate model checking. You can find details on model checking in a former blog post.

Practical approach

Now let us show you, through an example, what our YAKINDU Statecharts to UPPAAL Transformation Engine is capable of. First of all, we need a Yakindu model that we want to transform. We have made a simple timer state machine just for this demonstration.

As you can see, the model consists of two states. In the Idle state the system does nothing; in the Running state it measures time. There are also two events that can be used to control the system, and an integer variable that represents the elapsed time.
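To make the intended behaviour concrete, here is a minimal, hand-written Python sketch of the same state machine. This is only an illustration, not generated code: the event names start/stop and the variable name elapsed_time are our own, and the actual statechart may use different identifiers.

    # Illustrative sketch of the timer statechart's behaviour (names assumed).
    class Timer:
        def __init__(self):
            self.state = "Idle"
            self.elapsed_time = 0      # integer variable: measured seconds

        def start(self):               # event: Idle -> Running
            if self.state == "Idle":
                self.state = "Running"

        def stop(self):                # event: Running -> Idle
            if self.state == "Running":
                self.state = "Idle"

        def tick(self):                # "after 1 s" time event while Running
            if self.state == "Running":
                self.elapsed_time += 1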

Let us transform this state machine to UPPAAL. This is what we get:

Control template:

As you can see, the transformer creates two templates. The first one is the UPPAAL equivalent of the YAKINDU statechart. The second one is a control template that is used for firing events.

Let us see what transformation rules the transformer uses (a small sketch of this mapping follows the list):

  • Events are transformed to synchronization channels
  • Integer variables are transformed to int variables
  • Vertices are transformed to locations
  • Transitions are transformed to edges
  • Updates of transitions are transformed to updates on edges
  • Time events (after .. s) are transformed to clock variables, location invariants and guards on outgoing edges
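The real transformation is implemented in Xtend over the EMF models; purely as an illustration, the element-level correspondence above can be restated as a small lookup table. The kind names below are ours, not the plugin's.

    # Illustration only: the real transformation is written in Xtend over the
    # EMF metamodels. This toy mapping just restates the rules above as data.
    YAKINDU_TO_UPPAAL = {
        "Event":            "synchronization channel",
        "IntegerVariable":  "int variable",
        "Vertex":           "location",
        "Transition":       "edge",
        "TransitionUpdate": "update on edge",
        "TimeEvent":        "clock variable + location invariant + guard",
    }

    def describe(element_kind: str) -> str:
        """Return the UPPAAL construct a Yakindu element kind is mapped to."""
        return YAKINDU_TO_UPPAAL.get(element_kind, "unsupported element")

    print(describe("TimeEvent"))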

So what can we use this UPPAAL model for? First, you can run simulations on it. With the control template you can fire events, so transitions and actions can take place in the primary process. You can see the currently active locations of the templates, follow the changes in variable values, and so on. Here is a picture of a simulation on the timer model:

Second, queries can be created for checking certain properties of our models. Let us say you want to know whether your state machine can reach a certain state, or whether a variable can take a negative value. You just have to create a query that describes the condition, and the Verifier will tell you whether it is possible in your model. If it is possible, a simulation trace is also generated that shows the transitions taking your model into that state. Let us show you some queries:

The first query checks whether there can be a deadlock in the system. Fortunately, the model has been designed well, so deadlocks cannot take place.

The second and third queries check whether the Running and Idle states can be reached. States are created to be reached: if a state is unreachable, something is wrong with the model.

The fourth query checks whether time can be measured for longer than 9 seconds. We created this model to measure time, and according to the Verifier we can do so for at least 10 seconds.

The fifth query checks whether the elapsed time can take a negative value. If it were possible, something would be very wrong with the model. Fortunately, that is not the case here.
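To give a feel for what such queries look like, the five properties above could also be collected into an UPPAAL query file and checked in batch with the command-line verifier. This is only a sketch: the process name Timer and the variable name elapsedTime are assumptions, and the generated model may use different identifiers.

    # Illustration: the five properties above written as UPPAAL queries.
    # The process name (Timer) and variable name (elapsedTime, assumed to be
    # declared globally) are placeholders for the generated identifiers.
    from pathlib import Path
    import subprocess

    QUERIES = [
        "A[] not deadlock",       # 1: no deadlock is ever possible
        "E<> Timer.Running",      # 2: the Running state is reachable
        "E<> Timer.Idle",         # 3: the Idle state is reachable
        "E<> elapsedTime > 9",    # 4: time can be measured beyond 9 seconds
        "E<> elapsedTime < 0",    # 5: the elapsed time can become negative
    ]

    Path("timer.q").write_text("\n".join(QUERIES) + "\n")

    # If UPPAAL's command-line verifier is on the PATH, the queries can be
    # checked in batch (assuming the transformed model was saved as timer.xml):
    subprocess.run(["verifyta", "timer.xml", "timer.q"])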

Model-driven development of the LEGO robot – An overview

Introduction

In the middle of the summer we found an EV3 robot arm, so we decided to develop a model-based, safety-critical control program for it.

(Photo: the EV3 robot arm)

This robot had three motors and two sensors: a touch sensor and a colour sensor. It was built according to the step-by-step guide that you can find on this website. Later we enhanced it by adding a new motor; with this new structure, shown in the picture below, we are able to rotate the gripper head.

(Photo: the enhanced robot arm with the added gripper-rotation motor)

We have implemented the control for the LEGO crane (robot arm) on top of multiple communicating components. The basic architecture is depicted below:

(Figure: the basic architecture of the system)

As can be seen, there are multiple functions in the system: Yakindu is used to develop the control logic with the help of statecharts, the components communicate over MQTT, and a slave program runs on the robot's embedded processor. Simulation, provided by OpenModelica, is used to predict movements. A computer vision component based on OpenCV gathers information from the environment and sends alerts to the controller; in addition, it provides information about the object to be moved.

LEGO Robot architecture

The first component is the EV3 itself. We have implemented a Python script that directly controls the robot's motors and is able to access sensor information or detect if a motor is overdriven. The script generally receives commands from the controller component and executes them, but in order to prevent motor overdriving it also has some local reflexes. For example, if either of the touch sensors is pressed, it automatically stops all movements. As you can see in the pictures below, there are two touch sensors: one indicates that the robot arm has moved too high, the other that it has turned too far to the right. Some more details are provided in this blog post.

(Photos: the two touch sensors)
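The post does not say which library the slave script uses; assuming the python-ev3dev bindings, the local reflex could look roughly like this. The motor and sensor ports, the polling rate and the omitted command handling are all placeholders.

    # Rough sketch of the slave-side reflex, assuming the python-ev3dev
    # bindings (the actual script and port assignments may differ).
    import time
    from ev3dev.ev3 import LargeMotor, TouchSensor

    lift_motor   = LargeMotor('outA')   # raises/lowers the arm (assumed port)
    rotate_motor = LargeMotor('outB')   # turns the arm (assumed port)
    top_limit    = TouchSensor('in1')   # pressed when the arm is too high
    right_limit  = TouchSensor('in2')   # pressed when the arm turned too far right

    def stop_all():
        """Local reflex: stop every movement immediately."""
        for motor in (lift_motor, rotate_motor):
            motor.stop(stop_action='brake')

    while True:
        # Command handling (received over MQTT) is omitted here; the point is
        # that the limit switches are polled locally and override any command.
        if top_limit.is_pressed or right_limit.is_pressed:
            stop_all()
        time.sleep(0.05)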

Communication

As mentioned above, the controller component sends the control commands. But how does it do that? Almost all of the units communicate through MQTT. This protocol provides high maintainability and reconfigurability, which made it relatively easy to add new features. Every unit has its own topic that it publishes to, and it subscribes to the topics of the others; we use a hierarchical topic structure to organize the messages. The robot component only communicates with the controller component, which in turn has connections to the 'user', 'observation' and 'prediction' units.
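The MQTT client library is not named in the post; with the widely used paho-mqtt package, the controller-side wiring might look roughly like this. The broker address and topic names merely illustrate the hierarchical scheme, they are not the actual ones.

    # Sketch of the controller-side MQTT wiring, assuming the paho-mqtt client
    # and a local broker; the topic names only illustrate the hierarchy.
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        # Every unit publishes on its own topic; the controller subscribes to
        # all of them and reacts according to the statechart logic.
        print(f"{msg.topic}: {msg.payload.decode()}")

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)

    for topic in ("robot/status", "user/command", "observation/alert", "prediction/limit"):
        client.subscribe(topic)

    # The controller publishes commands on its own topic, e.g. for the robot:
    client.publish("controller/robot", "rotate_left")

    client.loop_forever()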

User Interface

So let's talk about the other units! Naturally, we wanted to have control over this little robot, so we developed a graphical user interface to control its movements.

(Screenshot: the graphical user interface)

As you can see, there is no button for moving the arm vertically or horizontally; these functions are controlled with the arrow keys. The 'user' unit sends the commands to the 'controller' unit, which processes them and then sends a message to the robot. The GUI also shows the messages we have sent and received.

It is also possible to switch to automatic operation. This raises the obvious question: what should this robot do automatically? To be honest, we also found a model railway next to this robot that summer, so we thought the EV3 could move the trains' cargo automatically. Obviously this needs more help, namely more information about the environment, which is why you can also see a monitoring component in the architectural view.

We observe the robot and its environment with a camera, so we can analyse the situation in real time using OpenCV 3. This computer vision application sends messages to the controller: it warns it if an obstacle gets in the way, and it notifies it when a train has arrived at the desired place, a message that also contains information about the orientation of the cargo. We can also query it for other information, for example the orientation of the gripper head, which we use when initializing the robot's motors.
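As a rough sketch of how such an observation unit can work with OpenCV, the snippet below finds a coloured marker in the camera image and estimates its orientation. The HSV range, the camera index and the colour-marker assumption are placeholders, not the actual detection logic of our application.

    # Sketch of an observation step, assuming OpenCV (cv2) and a colour marker
    # on the cargo; the HSV range and camera index are placeholders.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([20, 100, 100]), np.array([35, 255, 255]))

        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
        if contours:
            largest = max(contours, key=cv2.contourArea)
            (cx, cy), (w, h), angle = cv2.minAreaRect(largest)
            # In the real system this result would be published to the
            # controller over MQTT; here we just print it.
            print(f"cargo at ({cx:.0f}, {cy:.0f}), orientation {angle:.1f} deg")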

For more details, see the dedicated blog post.

(Screenshot: OpenCV-based observation of the robot's environment)

Controller

It is time to see what the controller component does. This unit is responsible for the logic of the operation. We used the open-source Yakindu Statechart Tools toolkit to model the control scenarios. The safety-critical parts are first implemented as state machines, and these design models are then used for code generation: the produced source code can be deployed and integrated with MQTT so that it can communicate with the controlled objects. The controller also receives information from the computer vision component and reacts to its events.

Yakindu Statechart Tools has some useful features: simulating the state machine models allows their dynamic semantics to be checked, and the active states are highlighted directly in the statechart editor. In addition, Yakindu has many built-in features, for example validation rules, which turned out to be very useful during development. We also used the custom validation rules introduced in a former post to further increase the quality of the models, and various analysis runs were conducted to check the design models!

Two kinds of composition were applied during the development of the control models. The Yakindu tool provides built-in hierarchy, which we exploited in the design. We also decomposed the problem into two parts: one statechart provides the abstraction of the physical world, and another statechart is responsible for the actual control. This decomposition significantly increases the reusability of the control logic: for other robots with similar missions it would be enough to implement a new abstraction layer, and the control logic could be reused!
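Outside Yakindu, the idea of this decomposition can be sketched in a few lines of plain Python; the class and method names are ours, for illustration only. The control logic talks only to the abstraction, so porting the mission to another robot means rewriting only the lower layer.

    # Conceptual sketch of the two-layer decomposition (not generated code):
    # the abstraction hides the physical robot, keeping the control reusable.
    class CraneAbstraction:
        """Translates mission-level commands into messages for this robot."""
        def __init__(self, publish):
            self._publish = publish          # e.g. an MQTT publish function

        def move_over_cargo(self):
            self._publish("controller/robot", "rotate_to_cargo_position")

        def grab(self):
            self._publish("controller/robot", "close_gripper")

    class ControlLogic:
        """Robot-independent mission logic; only talks to the abstraction."""
        def __init__(self, robot: CraneAbstraction):
            self._robot = robot

        def on_train_arrived(self):
            self._robot.move_over_cargo()
            self._robot.grab()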


Prediction

To compare the desired and the actual behavior of the motors, we used OpenModelica, an open-source simulation environment. The purpose of using such a complex modelling tool was twofold: 1) design-time analysis helped us set the parameters of the controllers, as the estimations made it easier to choose proper values; 2) runtime prediction can be built on top of the Modelica models. Querying the limits of the robot in certain situations turned out to be helpful, especially when the mission reaches a limit in a direction where no sensor information is available; in those cases the simulation can predict the rotation value that is still safe for the system.
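How the Modelica models are driven is described in the linked post; just to illustrate the runtime use, a prediction query through the OMPython interface could look roughly like this. The model name, variable name and times below are invented.

    # Sketch of a runtime prediction query via OMPython; the model name
    # (CraneArm.mo), variable name (armAngle) and times are all invented.
    from OMPython import OMCSessionZMQ

    omc = OMCSessionZMQ()
    omc.sendExpression('loadFile("CraneArm.mo")')

    # Simulate the planned movement for two seconds...
    omc.sendExpression("simulate(CraneArm, stopTime=2.0)")

    # ...and read back the predicted rotation at the end of the movement, so
    # the controller can decide whether the command is still safe to execute.
    predicted_angle = omc.sendExpression("val(armAngle, 2.0)")
    print("predicted rotation:", predicted_angle)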

For more information read this post!

Validation & Verification

The controller of the robot arm was developed with the help of Yakindu statecharts. Validation rules provided by the tool, as well as our own custom rules, helped find problems early in the design. Verification was also done by transforming the design model into a formal representation and analysing simple reachability queries on it.

Summary

We have developed the control of a LEGO robot. Computer vision is used to observe the environment and to enable the robot's autonomous behaviour. Simulation is used at design time to estimate the necessary control parameters, and at runtime to check dangerous movements of the robot arm.


Various IoT and model-based techniques were combined in the project, and the synergies of these technologies led to a complex IoT application!