Testing GUIs is hard: unit tests require a tremendous amount of mocking and often fail to capture real user input anyway, while integration tests are difficult to set up because they typically require dedicated infrastructure with virtual machines.
libuitest should become something like the image-recognition part of openQA: it receives a video or image stream from some source and searches it for matches (e.g. a button found, a sound played). It should also be able to send clicks, key presses, strings, etc. back to a backend via an abstract API.
It should be compatible with openQA's needles, but the main way to write tests should be a more generic API:

```python
help_menu = find_menu("Help", search_area_percent=((0, 0), (100, 0)))
help_menu.click()
about_my_program_button = find_menu("About Foo", search_area_percent=(help_menu.location, (100, 0)))
about_my_program_button.click()
window = find_window("About Foo")
about_foo_text = get_text_from_area(window.area)
assert about_foo_text == f"This is Foo version {env.FOO_VERSION}"
```
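The abstract backend mentioned above could be a small interface that every driver (VNC, serial console, local display, …) implements. A minimal sketch of what that might look like; the class and method names here are hypothetical, not an existing libuitest API:

```python
from abc import ABC, abstractmethod

class Backend(ABC):
    """Hypothetical abstract backend: grabs frames from the system
    under test and sends input events back to it."""

    @abstractmethod
    def grab_frame(self):
        """Return the current screen as a 2D grayscale pixel array."""

    @abstractmethod
    def click(self, x, y):
        """Send a mouse click at absolute screen coordinates."""

    @abstractmethod
    def type_string(self, text):
        """Send a string as a sequence of key presses."""

class RecordingBackend(Backend):
    """Toy implementation that records the events it would send,
    useful for testing the test library itself."""

    def __init__(self, frame):
        self.frame = frame
        self.events = []

    def grab_frame(self):
        return self.frame

    def click(self, x, y):
        self.events.append(("click", x, y))

    def type_string(self, text):
        self.events.append(("type", text))

backend = RecordingBackend(frame=[[0]])
backend.click(10, 20)
backend.type_string("hello")
print(backend.events)  # [('click', 10, 20), ('type', 'hello')]
```

With such an interface, high-level calls like `find_menu(...).click()` only ever talk to `Backend`, so the same test can run against a VM, a remote display, or a recorded video stream.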
The image-recognition part should be handled by OpenCV and the OCR by Tesseract. It would also be nice to use some simple machine learning so that libuitest could be trained to find buttons, menus, windows, etc. independently of the GUI toolkit and theme in use (or at least robustly, if you tell it which theme and toolkit you are using).
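The core primitive OpenCV provides for needle matching is template matching (`cv2.matchTemplate`): slide a small needle image over the frame and score each position. A dependency-free sketch of the underlying squared-difference search, on toy pixel arrays rather than real screenshots:

```python
def match_template(frame, template):
    """Return the (row, col) where `template` best matches `frame`,
    minimizing the sum of squared pixel differences -- the same idea
    as cv2.matchTemplate with the TM_SQDIFF method."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best, best_score = None, float("inf")
    # Slide the template over every valid position in the frame.
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            score = sum(
                (frame[r + i][c + j] - template[i][j]) ** 2
                for i in range(th)
                for j in range(tw)
            )
            if score < best_score:
                best_score, best = score, (r, c)
    return best, best_score

# Toy 4x4 "screenshot" containing the 2x2 "needle" at (1, 1).
frame = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 6, 0],
    [0, 0, 0, 0],
]
template = [[9, 8], [7, 6]]
print(match_template(frame, template))  # ((1, 1), 0)
```

A real implementation would use `cv2.matchTemplate` directly (it is vectorized and supports normalized methods that tolerate brightness changes), then crop the matched region and hand it to Tesseract for the `get_text_from_area` step.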
This project is part of Hack Week 19.