YSoft Labs: Robotics for verification and validation testing

The technology behind Digital Transformation takes many forms, and one of them is robotics. While many companies use robots to complete tasks in manufacturing or customer service (known as Robotic Process Automation), YSoft Labs is exploring how robotics can advance the capabilities of quality assurance.
 
As IoT applications multiply, human interactions with devices are on the rise. These devices need to be tested, and traditional test scripts are no longer adequate. So let's look at one area where robotics can help advance traditional quality assurance.

Most of us think of robots as doing repetitive tasks: repeatedly taking an instruction and performing the action. Robots are good at this sort of work, whereas humans need breaks and make mistakes out of boredom. That is why robots are often used in quality assurance to verify features.

In verification testing, a robot simply tests whether a feature works according to its specification. It is binary: the feature either did what it was specified to do, or it didn't. For example: if I press the power-on button, does the power come on or not? If I press the power-on button twice, does the machine power on and then power off, or does it do something else? What if I hold the button down for a long time? And so on.
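As a rough sketch, such verification scripts reduce to a series of yes/no assertions. The `robot` and `device` interfaces below are hypothetical placeholders for whatever hardware-abstraction layer a real test rig would expose, not an actual YSoft API:

```python
# A minimal verification sketch: every check is a binary pass/fail
# against the specification. The robot and device APIs are hypothetical.

def test_power_button_single_press(robot, device):
    """A single press must power the device on."""
    robot.press(device.power_button)
    assert device.is_powered_on()          # it happened, or it didn't

def test_power_button_double_press(robot, device):
    """Two presses must power the device on, then off - nothing else."""
    robot.press(device.power_button)
    assert device.is_powered_on()
    robot.press(device.power_button)
    assert not device.is_powered_on()

def test_power_button_long_press(robot, device):
    """Holding the button must trigger the specified long-press behavior."""
    robot.press(device.power_button, hold_seconds=5)
    assert device.is_in_standby()
```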

However, with recent advances in computer vision and the declining cost of sensors, robots have also become very good at validation testing. Validation testing asks not whether something happened, but whether what happened was expected: do the results meet users' needs? In some ways this is akin to using robots for usability testing, but robotics has also evolved to do even more.

Let's start with usability testing. With human participants, users may be asked to perform a task on a computer or device while being timed, to see how quickly the user interface lets them complete the task. Another example would be checking whether the user understood the result of an action and could take the next step in a process. To some extent, ordinary robots can do this kind of usability testing very well, whereas human testers can typically track only one variable at a time with any accuracy.

When paired with computer vision, sensors and artificial intelligence, robots can perform many kinds of validation and verification tests simultaneously. And when a test fails, the robot can recognize the 'fail state' and either continue with the proper action, or capture the situation, reboot and continue testing another area, or stay in the fail state and call for assistance.
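A simplified control loop for that recovery logic might look like the sketch below. Everything here (`recognize_state`, `reboot_device`, `notify_operator`) is a hypothetical stand-in for a real test rig's API:

```python
# Hypothetical recovery loop: on a recognized fail state, the robot either
# recovers (capture evidence, reboot, move on) or escalates to a human.

def run_test_suite(robot, tests):
    for test in tests:
        result = robot.execute(test)
        if result.passed:
            continue
        state = robot.recognize_state()          # e.g. computer vision classifies the screen
        robot.capture_screenshot(test.name)      # preserve evidence of the failure
        if state.is_recoverable:
            robot.reboot_device()                # reboot and continue testing another area
            continue
        robot.notify_operator(test.name, state)  # stay in the fail state, call for assistance
        break
```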

For example, a robot may be scripted to recognize a screen icon in order to test the launch of an application on a touch screen. Typically, the script would include the exact position on the screen where the icon should be found, and if the icon is in a different location, the robot cannot perform the test. In this case, computer vision allows the robot to determine whether the correct icon is present and where it is. Similarly, if the user interface changes or looks different on another device, the robot must still be able to find the icon to continue the test.
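One common way to implement this kind of position-independent icon search is template matching, for example with OpenCV. The sketch below assumes a saved screenshot and a reference image of the icon; the file names and confidence threshold are illustrative, not part of any YSoft toolchain:

```python
import cv2

def find_icon(screenshot_path, icon_path, threshold=0.8):
    """Locate an icon anywhere on the screen; return its center or None."""
    screen = cv2.imread(screenshot_path, cv2.IMREAD_GRAYSCALE)
    icon = cv2.imread(icon_path, cv2.IMREAD_GRAYSCALE)
    result = cv2.matchTemplate(screen, icon, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:                      # the icon is not on screen at all
        return None
    h, w = icon.shape
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)   # tap-target center

# The robot can then tap the icon wherever it appears:
# position = find_icon("screen.png", "app_icon.png")
# if position:
#     robot.tap(*position)
```

Plain template matching is not scale- or theme-invariant, which is one reason feature-based or learned detectors come into play when the interface looks different from one device to the next.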

The real industry challenge is combining verification and validation simultaneously. This is typically impossible to do manually, and as SW and HW systems grow more complex, it becomes ever more important. In today's dynamic world of extremely fast progress, there is no room to rely on a functionality-only approach (verification) or to split functionality and SW & HW qualities into separate areas. When a robotic system can do validation and verification at the same moment, it provides a faster feedback loop for developers while also supporting greater quality.
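In code terms, a single scripted step can then assert a functional outcome and a quality attribute at once. Another hedged sketch, reusing the hypothetical `robot` API from above; the three-second budget is purely illustrative:

```python
import time

def test_app_launch(robot, max_load_seconds=3.0):
    """Verification and validation in one step: did the app launch at all,
    and did it launch fast enough to meet the user's expectations?"""
    position = robot.find_icon("app_icon.png")
    assert position is not None                 # verification: the icon exists
    started = time.monotonic()
    robot.tap(*position)
    robot.wait_for_screen("app_home.png")       # blocks until the screen appears
    elapsed = time.monotonic() - started
    assert elapsed <= max_load_seconds          # validation: responsiveness budget
```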

Using the same scenario: when a human user encounters an unexpected response, perhaps a screen that takes too long to load, the user knows what to do to continue. With computer vision and artificial intelligence, a robot can likewise learn to take the necessary steps to continue.

Sensors help robots with validation testing in other ways, too. Temperature sensors enable robots to test behavior under various hot and cold conditions. Proximity, motion, weight and level sensors, as well as simple counters, can also aid validation testing.
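As a final sketch, a temperature sensor and a climate chamber (both assumed here purely for illustration) could wrap an ordinary functional scenario so the same behavior is validated under cold, ambient and hot conditions:

```python
# Hypothetical environmental wrapper around a functional test; the chamber,
# sensor and thresholds are assumptions, not a description of a real rig.

def test_scenario_across_temperatures(robot, chamber, sensor):
    for target_celsius in (5, 23, 40):                     # cold, ambient, hot
        chamber.set_temperature(target_celsius)
        chamber.wait_until_stable()
        assert abs(sensor.read_celsius() - target_celsius) < 1.0
        result = robot.run_scenario("submit_print_job")
        assert result.passed, f"failed at {target_celsius} °C"
```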

As YSoft SAFEQ, which is embedded in 2D and 3D printers, continues to evolve, what it can measure becomes more sophisticated, with applications in robustness, security and analytics. Our robotic testing must evolve alongside it to ensure that YSoft SAFEQ, and any other innovative services we deliver, make use of the latest technology.

 
Jakub Pavlák
Jakub focuses on automating QA processes for large-scale HW/SW solutions, tackles HW-specific challenges, and looks ahead to Industry 4.0. He also leads the infrastructure team, which provides R&D engineers with the infrastructure for developing, building, verifying and delivering our SW and HW products.