GStreamer on TI DaVinci and OMAP
Plug-in testing is broken down into many different areas, as shown below. In many cases, a media file is used during a test. Each of these media files needs to be licensed such that it can be made readily available to all who are interested. Currently, the test suites use media that cannot be distributed. We are working to resolve this limitation.
Ideally, there would be an automated test suite that is run on each hardware device each time a check-in to the SVN trunk occurs. We are a long way from being able to do this type of automated testing. As new test cases are developed, the goal of being able to automate the tests should be kept in mind. Using tools that can detect latency and dropped frames (which don't exist yet, but may be possible with gst-tracelib extensions), it should be possible to automate most tests, except those requiring human perception, such as detecting garbled audio or video.
The anticipated flow for automated testing includes:
- Automated build (e.g. using BuildBot). Ideally, separate builds for each target device on each supported Linux distribution. Each build would likely be based on OpenEmbedded for OMAP / DaVinci. Ideally, this would happen each time code is checked into the SVN trunk.
- Automated target hardware update. Since all the devices use u-boot as the bootloader, this could be as simple as using tftpboot and a root NFS mounted target file system.
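For illustration only, the u-boot side of such an update might look like the sketch below. The server IP, load address, kernel image name, and NFS export path are invented placeholders, not values from any actual board setup.

```
# Hypothetical u-boot environment for a network-booted target.
# All addresses, IPs, and paths are placeholders.
setenv serverip 192.168.1.10
setenv bootargs 'console=ttyS0,115200n8 root=/dev/nfs rw nfsroot=192.168.1.10:/exports/target-rootfs ip=dhcp'
setenv bootcmd 'tftpboot 0x80700000 uImage; bootm 0x80700000'
saveenv
boot
```

With an environment like this, pushing a freshly built kernel and root file system to the TFTP and NFS server directories is enough to update the target on its next reset.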
- Automated test suite execution. At the core, something like Perl's Expect module could be used to set up a test, invoke a test case, and check for an expected response. There might also be a test suite execution framework that can be run directly on the target hardware.
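The "invoke a test case and check for an expected response" step could be sketched as a small shell wrapper like the one below. The function name and the success-regex convention are illustrative assumptions, not part of any existing framework; a real harness (e.g. Perl Expect) would also handle timeouts and interactive prompts.

```shell
#!/bin/sh
# run_case REGEX CMD [ARGS...] -- invoke a test command, capture its
# output, and report PASS when the command exits 0 and its output
# matches the expected success regex.  (Hypothetical helper; not part
# of any shipped test framework.)
run_case() {
    regex=$1
    shift
    out=$("$@" 2>&1)
    status=$?
    if [ "$status" -eq 0 ] && printf '%s\n' "$out" | grep -qE -- "$regex"; then
        echo "PASS: $*"
        return 0
    fi
    echo "FAIL: $*"
    return 1
}
```

A test case might then be invoked as, for example, `run_case 'Got EOS' gst-launch videotestsrc num-buffers=100 ! fakesink` (pipeline shown only as an illustration).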
Clearly, automated testing is a dream at the moment. Community members interested in making dreams come true are encouraged to do so.
In an effort to simplify automating existing tests, all new tests should be developed as follows when possible:
- Cut-and-paste Linux commands to set up the test.
- Cut-and-paste Linux commands to initiate the test.
- A description of the expected response. In the simplest case, it might just be the command running to completion without error. In the ideal case, provide a regular expression that indicates success, or conversely one that indicates failure.
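A test written in this format might look like the sketch below. The clip name and the TIViddec2 pipeline are placeholders, and the `Got EOS from element` / `ERROR` patterns are assumptions about gst-launch's console output; only the log-checking step is generic.

```shell
#!/bin/sh
# Hypothetical decode test in the prescribed three-part format.
#
# 1. Setup (cut-and-paste): stage the test clip on the target, e.g.
#        cp /mnt/nfs/media/sample.m4v /tmp/test.m4v
# 2. Initiate (cut-and-paste): run the pipeline, capturing output, e.g.
#        gst-launch filesrc location=/tmp/test.m4v ! TIViddec2 ! \
#            fakesink > /tmp/test.log 2>&1
# 3. Expected response: PASS if end-of-stream was reported and no
#    ERROR line appears in the captured output.
check_log() {
    grep -qE 'Got EOS from element' "$1" && ! grep -q 'ERROR' "$1" \
        && echo PASS || echo FAIL
}
```

The regular expressions in step 3 are exactly what an automated harness would later "expect", which is why capturing them with the test matters.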
GStreamer includes the GStreamer Unit Test Framework. Tests are stored in the check directory; the base GStreamer plug-ins' tests/check directory provides an example directory layout and a good set of example tests.
Currently the TI GST DMAI plug-in doesn't include any unit tests. Unit tests will likely be required before the general GStreamer community will accept the plug-in.
SVN Trunk Check-in Testing
Although the intent is for the SVN trunk to be stable and defect free (or at least to have a known list of documented defects), there is no formal testing process for changes to the SVN trunk. Engineers checking code into the SVN trunk should perform whatever testing they feel is necessary to minimize the risk of introducing defects.
Defect Resolution Regression Testing
As defects are found and fixed, a set of tests that verify the defect is fixed needs to be included with the patch that fixes the defect. These defect regression tests ensure future changes don't cause a previously solved problem to reappear.
Plug-in Release Testing
As the TI GST DMAI plug-in matures, the release testing will improve as well. Release testing consists of running several test suites on each of the supported devices. The test suites include:
|Test Suite||Description|
|Basic functionality||Encode, decode, resize, etc.|
|GST state compliance||Play, pause, etc.|
|GST event compliance||Seek, flush, new-segment, etc.|
|Element parameter setting||Min value, max value, typical value, parameter interaction, etc.|
|Regression||Set of tests that verify fixed defects remain fixed.|
|Performance||Each release should have better performance characteristics than the previous release. See PerformanceAnalysis for details.|
|Customer functionality||As users provide important example pipelines, such as network streaming, these pipelines should be validated with each release. It is especially important to verify that performance is at least as good as in the previous release.|
|Endurance||When possible, idle target hardware should run endurance tests to verify there are no defects that only show up after many iterations.|
|Stress||As the load on the target hardware resources increases, we need to verify the pipeline does not crash.|
|Host distribution||Each version of each supported Linux host distribution (Ubuntu, Red Hat, etc.) needs to be used to verify that runnable target hardware images can be built without error.|
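The endurance suite above amounts to repeating a known-good pipeline many times and stopping at the first failure. A minimal sketch, assuming the pipeline command is supplied by the caller (the function name is hypothetical):

```shell
#!/bin/sh
# run_endurance N CMD [ARGS...] -- repeat a pipeline command N times
# and stop at the first failure, reporting which iteration broke.
# The pipeline itself (e.g. a gst-launch command line) is supplied by
# the caller; this wrapper only handles the looping and reporting.
run_endurance() {
    n=$1
    shift
    i=1
    while [ "$i" -le "$n" ]; do
        if ! "$@" > /dev/null 2>&1; then
            echo "FAIL at iteration $i"
            return 1
        fi
        i=$((i + 1))
    done
    echo "PASS: $n iterations"
    return 0
}
```

An idle board could then run something like `run_endurance 1000 gst-launch videotestsrc num-buffers=100 ! fakesink` overnight (pipeline shown only as an illustration).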
The current plug-in release testing only covers the basic functionality, with the individual test cases documented in the GStreamer Release Testing Matrix.
GStreamer Release Testing
As new versions of GStreamer are released, the TI GST DMAI plug-in needs to be validated to ensure it works properly with them. Example tests include interoperability with typefind, etc.
GStreamer and Gnome Style Compliance
The goal is for the TI GStreamer DMAI plug-in to be accepted by the general GStreamer community. To facilitate an eventual merge with the mainline GStreamer code, the GStreamer coding style and GNOME programming guidelines need to be followed. An initial technical review of each source code file with an eye to style compliance needs to be performed. Once in compliance, all patches need to be checked against the coding standards as well.