Patching libwebrtc almost automatically

I. Introduction

Most products will want to modify libwebrtc to add some features. Some, like TokBox, will want to change the libvpx compilation flags to enable VP8 SVC (only temporal scalability) to use between their mobile SDKs. Others, like Voxeet, might want to add additional audio codecs. Many, including pristine.io, will want to add H.264 support.

Whatever the goal, everybody will need to be able to patch the source code in a consistent way against a fast-moving library, and keep the number of patches, and the way to apply them, manageable.

Moreover, since this is public code, there are some patches I will contribute for all, but users might want to have their own private patches and keep them to themselves. The architecture should allow for that.

Finally, just for the fun of it, I wanted to have a way to quickly get patches from the Google review process, to have features in the lib before they even appear in Chrome (or to test those patches with this system).

In this post, we will propose one easy but efficient way to do just that.

II. Implementation

1. Specific CMake command used here

CMake has this nice add_subdirectory() command that makes a lot of things easy. Basically, what the command does is iterate into the corresponding folder and act on any CMakeLists.txt file present there.

  add_subdirectory(Patches)

By making it conditional you can design a nice layout to keep your patches managed, for example by platform:

  if( APPLE )
    add_subdirectory( mac )
  elseif( WIN32 )
    add_subdirectory( win )
  endif()

2. Patch creation and specific libwebrtc concerns

The libwebrtc source code is a patchwork of separate libraries that are fetched according to the DEPS file. While gclient has an option to generate a global patch, we preferred simply using git. Each patch you generate is therefore made against a specific git tree, and you have to remember where to apply it.

We created a CMake macro to help automate that:

  set_webrtc_patch_target(
    GIT_APPLY_CMD
    APPLY_DIR
    PATCH
    DEPENDS_ON_TARGET
  )

The DEPENDS_ON_TARGET argument allows us to make sure the patches are applied only after the code has been downloaded.

The GIT_APPLY_CMD argument allows for flexibility in which git command you use. Some prefer the “git diff” / “git apply” approach, while others prefer “git format-patch” / “git am”. In our case, we keep it simple:

  set(
    GIT_APPLY_CMD
    git apply --ignore-space-change --ignore-whitespace
  )
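
Putting these pieces together, here is a minimal sketch of what the body of such a macro could look like. It is an assumption, not the actual implementation (which is not reproduced in this post): it relies on add_custom_target() and add_dependencies(), and the patch_ target naming is made up for the example.

  macro( set_webrtc_patch_target GIT_APPLY_CMD APPLY_DIR PATCH DEPENDS_ON_TARGET )
    get_filename_component( _name ${PATCH} NAME_WE )
    # Apply the patch in the right part of the source tree.
    add_custom_target(
      patch_${_name}
      COMMAND ${GIT_APPLY_CMD} ${PATCH}
      WORKING_DIRECTORY ${WebRTC_SOURCE_DIR}/${APPLY_DIR}
      COMMENT "Applying ${PATCH} in ${APPLY_DIR}."
    )
    # Only patch once the code has actually been downloaded.
    add_dependencies( patch_${_name} ${DEPENDS_ON_TARGET} )
    # Remember where the patch was applied, for the unpatch step described below.
    set( PATCHED_DIRS ${PATCHED_DIRS} ${APPLY_DIR} CACHE INTERNAL "Internal variable." )
  endmacro()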

3. How to make a clean/undo command

The problem with patches is that they leave the source tree “dirty”, and a good rule for development or even build bots is that the source code should stay clean (unmodified) as much as possible.

When using git, one way to get back to a clean state is to do a reset: “git reset --hard -q”. This brings all the tracked files back to a clean state, but can leave untracked files behind, e.g. if the patch adds new files or deletes others. “git clean -qfdx” is then needed to make sure the source tree is back to where you want it. The code looks like this:

  set( GIT_RESET_CMD git reset --hard -q )
  set( GIT_CLEAN_CMD git clean -qfdx )

Additionally, you need to know where to apply those commands, so you need to keep track of all the directories where a patch has been applied. For each patch, we add the directory it is applied to to a list. When the time comes to “unpatch”, we use that list:

  list( REMOVE_DUPLICATES PATCHED_DIRS ) # remove duplicates
  add_custom_target(
    UNPATCH_ALL
    ${CMAKE_COMMAND} -E touch dummy.phony
  )
  foreach( dir ${PATCHED_DIRS} )
    add_custom_command(
      TARGET UNPATCH_ALL POST_BUILD
      COMMAND ${GIT_RESET_CMD}
      COMMAND ${GIT_CLEAN_CMD}
      WORKING_DIRECTORY ${WebRTC_SOURCE_DIR}/${dir}
      COMMENT "Unpatching ${dir}."
    )
  endforeach()

4. How to integrate my private/proprietary patches?

With the add_subdirectory() command, things are quite simple. The code below checks whether there is a “PvtPatches” subdirectory and, if there is, walks into it. You can use git subtrees, or the method of your choice, to have such a directory in your local copy, with a CMakeLists.txt copied (hum… largely inspired) from the one in Patches, and everything will be good.

  if( EXISTS ${CMAKE_CURRENT_SOURCE_DIR}/PvtPatches )
    add_subdirectory( PvtPatches )
  endif()
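
The CMakeLists.txt inside PvtPatches can then reuse the same macro as the public Patches directory. Here is a minimal sketch, where the sub-tree, the patch file and the download target are all made-up names for the sake of the example:

  set_webrtc_patch_target(
    "${GIT_APPLY_CMD}"                                       # the git apply command defined earlier
    "webrtc"                                                 # hypothetical sub-tree, relative to the webrtc source dir
    "${CMAKE_CURRENT_SOURCE_DIR}/add_my_audio_codec.patch"   # hypothetical private patch
    libwebrtc_sources                                        # hypothetical target that fetches the code
  )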

Note that extra care has been taken with the variable that contains the list of directories to apply the reset and clean commands to, so that you can modify it from within a subdirectory and it remains valid. The “CACHE” option makes sure it is consistent across the entire project, whatever the current source directory is. The “INTERNAL” option is there to make sure this variable does not appear in the graphical user interface that goes along with cmake.

  set( PATCHED_DIRS "" CACHE INTERNAL "Internal variable." )

Of course, you’re likely to re-run the build generation tool after applying the patches:

  python /src/webrtc/build/gyp_webrtc.py
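
If you want that last step to be automated too, one option is to wrap it in its own target and make it depend on the patch targets. This is only a sketch: the target names are made up, and the exact path to gyp_webrtc.py depends on your checkout layout.

  add_custom_target(
    regenerate_build_files
    COMMAND python ${WebRTC_SOURCE_DIR}/webrtc/build/gyp_webrtc.py
    WORKING_DIRECTORY ${WebRTC_SOURCE_DIR}
    COMMENT "Re-running gyp after patching."
  )
  # hypothetical patch target created by set_webrtc_patch_target()
  add_dependencies( regenerate_build_files patch_add_my_audio_codec )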

III. Conclusion

You have seen in this post how to set up a simple patch system for libwebrtc. Of course, it would not be complete without a few examples, so in following posts I will show how to integrate Google patches:

  • support for OpenH264 (patch)
  • support for the AVFoundation renderer for Mac (patch)

The most difficult part is not the patching, but the testing. I need to push some examples and standalone tests first.

Cheers.

Alex.

 

Creative Commons License
This work by Dr. Alexandre Gouaillard is licensed under a Creative Commons Attribution 4.0 International License.

This blog is not about any commercial product or company, even if some might be mentioned or be the object of a post in the context of their usage of the technology. Most of the opinions expressed here are those of the author, and not of any corporate or organizational affiliation.

Dashboard “Greenness”, one bug at a time

I. Introduction

Precompiled libraries for the stable version of webrtc (the one used in Chrome) have been requested many times on the mailing list, but so far nobody has taken it upon themselves to make them. One of the goals of this blog is to provide those, to lower the barrier of entry for people who want to build on top of webrtc.

As I was preparing the libraries on Linux, I bumped again into the failing tests I mentioned in a previous post:

The first error seems to be related to a bad allocation. That’s where you realize that running this on the smallest possible Linux instance in AWS was possibly a bad idea. It should disappear when I host the Linux build bot on a bigger instance. The second error is more elusive, and I can’t figure it out just from the logs. Once I have set up a more powerful Linux build host, I will debug there directly.

As far as I am concerned, having even a single test failing is a no-no. So I dug deeper. Here is the build before the changes.

II. Investigating

In the meantime, I moved the build bot to a stronger (c3.2x) instance. Indeed, the first error was a memory allocation problem triggered by an undersized instance, and it went away without any special attention.

The second error was related to the screen sharing tests, which is not a surprise given that we are running on a virtual machine without a display.

The original tests are run through a test driver written in python. The code is separate from libwebrtc and can be found there. The main file is here. Here again, it is code coming from Chrome, which contains a lot of things not needed to test the standalone version of webrtc (chrome sandbox, …).

It also does a lot of nice things in terms of checking that no leftovers from previous, possibly failing, tests are in the way. There are a lot of extra steps that improve the stability and robustness of the tests, so it’s not all bad.

To make things simple, you just need to install Xvfb and openbox, 

  sudo apt-get install xvfb
  sudo apt-get install openbox

then define a display, create it, and run the window manager before you run your tests (the code below is written to stay as close as possible to Google’s test conditions).

  export DISPLAY=:9
  Xvfb :9 -screen 0 1024x768x24 -ac -dpi 96 &
  openbox &

Now, all tests pass!
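
If some of the tests end up being driven directly by CTest rather than by the python driver, a possible complement is to attach the display to those tests through the ENVIRONMENT test property. This is only a sketch, and the test name below is hypothetical:

  # assumes the Xvfb display created above (:9)
  set_tests_properties(
    screen_capturer_unittests
    PROPERTIES ENVIRONMENT "DISPLAY=:9"
  )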

III. Conclusion

The greenness of the dashboard is of utmost importance. If the dashboard is not green, you are developing blindfolded. Making it green is an everyday challenge. It can be seen as too much to bother about, but it is actually a developer safety net, and it allows you to focus on developing only.

The advantages of cmake here are twofold: a lower barrier of entry, and a collaborative dashboard.

Once again, one can see that the chrome build tools, however good and advanced, are overkill in the case of the standalone libwebrtc. I do believe this is slowing down adoption of, and contribution to, webrtc, as one needs to become a chromium developer first, and the learning curve is steep.

In any case, you can now download tested, precompiled libraries and headers for Linux, Mac or Windows on the Tool page. If what you want is just to develop something against libwebrtc that works with the latest stable Chrome, you now have all you need.

Some people request features that are not, or not yet, in webrtc. In a following post, I will explain how to patch libwebrtc effortlessly as part of the process described before.

Enjoy.

 


How to set up build bots for libwebrtc

I. Introduction

Following my previous posts, I got a lot of e-mails about setting up the build bots. I have to admit that my previous post did not address that in detail, and that the documentation about it is sparse and confusing, as the recommended way to do it in the cmake community has changed over the 15 years of the (VTK, ITK) projects. So here is a post describing, step by step, how to set up your own bots in a matter of hours, and manage them remotely through git, without ever having to connect to them again (in theory; in practice, s#$%^ happens, and you might also want to connect from time to time to debug problems directly).

It is good policy to keep the build bot scripts separate from the main code, as they might contain sensitive information about your infrastructure. For example, you might put access keys there to upload the results of the build (packaged libraries), and those had better stay private. Moreover, with the current setting, anybody who manages to get access to your build scripts ends up being able to run anything on your build bots, which is also something you don’t want 🙂 In our case, it’s more of a tutorial, and all the scripts are accessible here.

II. Scripting CTest

So far, I touched on two ways of using ctest:

  • as an extension of CMake, to handle the test suite directly from within the CMake files.
  • as a CDash client, to run CMake and automatically send the results of the update, configure, build and test steps to a CDash server.

There is a third way to use ctest: through scripting. You can write a file, using CMake syntax, that prepopulates CTEST_<> variables to run ctest in a controlled way. You then call “ctest -S” with your file as an argument to run ctest in script mode.

Some very useful variables are defined for you to use, to set the CTest cache, environment variables, or to hardcode compilers and any given program or cmake variable beforehand. That allows, for example, running 32-bit and 64-bit builds, debug and release, on a single machine. Another example is having multiple versions of compilers on a given machine, and using ctest scripts to select a specific one at a time. One of the best scripts I saw, written by Gaethan Lehman, was handling different versions of MSVC and Java on Windows. Hats off.
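
As an illustration, and only as a sketch (the compiler paths below are assumptions, not taken from any of the actual bot scripts), a ctest script could pin a given toolchain and configuration before handing over to the rest of the logic:

  # force a specific toolchain for this particular build
  set( ENV{CC}  /usr/bin/gcc-4.8 )
  set( ENV{CXX} /usr/bin/g++-4.8 )
  # and build the Debug flavor from this script
  set( CTEST_BUILD_CONFIGURATION Debug )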

One can get more information about CTest’s capabilities on the old pages written when people were still using Purify (here, here and here), and in a more recent version here.

III. What about libwebrtc?

For this example, I used the latest method, developed for ITK v4. A very fast overview is available here. This version is focused on git, and implements some nice tricks to handle different branches, which makes setting up bots for development branches easier.

1. A generic script that does all the heavy lifting

The idea is to have a very generic script that handles most of the problems for you, and to leave only a few variables to be defined by the user. I ported the generic script to be usable for libwebrtc: libwebrtc_common.cmake. Unless you’re a purist, I do not recommend modifying it, or even looking at it. It allows you to define a set of parameters, some of them the usual CMAKE or CTEST variables, but some new dashboard_ variables as well, to control your build.

  • dashboard_model = Nightly | Experimental | Continuous
  • dashboard_track = Optional track to submit dashboard to
  • dashboard_loop = Repeat until N seconds have elapsed
  • dashboard_root_name = Change name of “My Tests” directory
  • dashboard_source_name = Name of source directory (libwebrtc)
  • dashboard_binary_name = Name of binary directory (libwebrtc-build)
  • dashboard_data_name = Name of ExternalData store (ExternalData)
  • dashboard_cache = Initial CMakeCache.txt file content
  • dashboard_do_cache = Always write CMakeCache.txt
  • dashboard_do_coverage = True to enable coverage (ex: gcov)
  • dashboard_do_memcheck = True to enable memcheck (ex: valgrind)
  • dashboard_no_clean = True to skip build tree wipeout
  • dashboard_no_update = True to skip source tree update
  • CTEST_UPDATE_COMMAND = path to git command-line client
  • CTEST_BUILD_FLAGS = build tool arguments (ex: -j2)
  • CTEST_BUILD_TARGET = A specific target to be built (instead of all)
  • CTEST_DASHBOARD_ROOT = Where to put source and build trees
  • CTEST_TEST_CTEST = Whether to run long CTestTest* tests
  • CTEST_TEST_TIMEOUT = Per-test timeout length
  • CTEST_COVERAGE_ARGS = ctest_coverage command args
  • CTEST_TEST_ARGS = ctest_test args (ex: PARALLEL_LEVEL 4)
  • CTEST_MEMCHECK_ARGS = ctest_memcheck args (defaults to CTEST_TEST_ARGS)
  • CMAKE_MAKE_PROGRAM = Path to “make” tool to use
  • Options to configure builds from experimental git repository:
  • dashboard_git_url = Custom git clone url
  • dashboard_git_branch = Custom remote branch to track
  • dashboard_git_crlf = Value of core.autocrlf for repository

If you want to extend the capabilities of this core script, some hooks are also provided to keep things clean and compartmentalized; see the sketch after the list below.

  • dashboard_hook_init = End of initialization, before loop
  • dashboard_hook_start = Start of loop body, before ctest_start
  • dashboard_hook_started = After ctest_start
  • dashboard_hook_build = Before ctest_build
  • dashboard_hook_test = Before ctest_test
  • dashboard_hook_coverage = Before ctest_coverage
  • dashboard_hook_memcheck = Before ctest_memcheck
  • dashboard_hook_submit = Before ctest_submit
  • dashboard_hook_end = End of loop body, after ctest_submit
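
In ITK-style scripts, a hook is typically just a CMake macro that you define in your own script before including the common one; if it is defined, the common script calls it at the corresponding point. A minimal sketch (the gyp invocation is an assumption about your setup, not part of the actual scripts):

  # Hypothetical hook: re-generate the build files right before ctest_build().
  macro( dashboard_hook_build )
    execute_process(
      COMMAND python webrtc/build/gyp_webrtc.py
      WORKING_DIRECTORY ${CTEST_SOURCE_DIRECTORY}
    )
  endmacro()
  # then include the common script, as in the example of the next section
  include( libwebrtc_common.cmake )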

2. A very simple file to define a bot

Eventually, that makes writing a build script very easy indeed:

  set( CTEST_SITE "Bill_._Our_fearless_leader" )
  set( CTEST_BUILD_NAME "Ubuntu-12.04-32-Deb" )
  set( CTEST_BUILD_FLAGS -j8 )
  set( CTEST_DASHBOARD_ROOT "/home/ubuntu/Dashboards" )
  set( CTEST_TEST_TIMEOUT 1500 )
  set( CTEST_BUILD_CONFIGURATION Debug )
  set( CTEST_CMAKE_GENERATOR "Unix Makefiles" )
  set( dashboard_model Experimental )
  include( libwebrtc_common.cmake )

And… voilà! You have a Linux build bot all set up! Replace ‘Debug’ with ‘Release’, and you have your release build ready. To change from 32 to 64 bits, since we use ninja, you have to set the right environment variable, but it’s not difficult either:

  set( CTEST_ENVIRONMENT
    "GYP_DEFINES='target_arch=x64'" # or ia32 for 32 bits
  )

The corresponding file is here.

A word of warning though: installing the dev environment for libwebrtc is hard. First, it will almost only work under Ubuntu; second, the environment install scripts provided do not seem to work, so you will end up having to manually install quite a few libs yourself before being able to compile. The good news is, you will only have to do that once.

3. How to automate it all?

Now you are armed with several files, one for each build you want to run. You might very well run many builds on the same machine, e.g. 32/64 bits, Debug/Release. For Linux machines, you might want to cross-compile the android binaries as well (more on the mobile targets in another post).

However, you still need to have access to the machine, and manually launch

  ctest -S My_Build_script.cmake

for it to work.

One way around this is to define a (shell) script that runs those commands for you. However, whenever you make a modification to the script, you have to connect to the machine again and manually update the local script to the new version. Grrrrr.

That’s where cron (on Linux or Mac) and Scheduled Tasks (on Windows) come in handy, as they can run a command at a given time for a given user. Here you have two schools: the original cmake members designed everything so that people could configure their always-on desktops to be used during sleeping hours. More recent developers will want to set up either a dedicated build bot or a hosted build bot, and might want to reduce the cost by switching the machine off when the job is done. I will illustrate the latter for Linux (and Mac), and the files for Windows will be provided in the GitHub account for those interested. Note that for a Windows build bot specifically, it has been shown that you’d better reboot the machine once a day in any case if you want it to work….

Starting devices remotely is easy: all the cloud providers offer command-line APIs, and you can maintain a build master (a very tiny instance) whose sole job is to wake up the bots once a day for them to fetch the latest code, configure, build, test, and submit to the dashboard. In AWS EC2, that means playing with IAM, but nothing too hard there, and it’s very well documented. On Linux, the cron daemon accepts the ‘@reboot’ keyword, and will run the corresponding task whenever the device is started.

  @reboot /home/ubuntu/Dashboards/Scripts/Bill-runall.sh

Finally, you can just use the shutdown command in your script to stop the instances when they’re done.

  shutdown -h now

We’re only left with automated updating of the scripts. 

The trick used by the ITK community is to keep the scripts (and the cron table) in a git repository, and to update this repository first. In this example, you can see from the shell script that we expect a ~/Dashboards/Scripts directory to contain the build scripts and a ~/Dashboards/Logs directory to be present, and that the crontable also gets updated on the fly. Now I can just commit to my git repository, and the build bot will auto-update itself. Sweet.

  # Go to the working directory
  cd /home/ubuntu/Dashboards/Scripts
  # get the latest scripts
  git pull --rebase -q
  # update the crontable
  crontab ./Bill-crontab
  # Run the builds
  ctest -S ./Bill-32-Debug.cmake
  ctest -S ./Bill-32-Release.cmake
  # done, let's shutdown the instance to avoid paying too much
  sudo shutdown -h now

The full script with some additional features is here.

IV. Conclusion

It should now be pretty clear that setting up a build bot for libwebrtc is actually quite easy. The code provided on GitHub should make it even easier. Feel free to set up your own build bot, hopefully with settings that are not yet present in any of the bots contributing to the dashboard today, and contribute to the fun. I should update to a bigger CDash server that will allow for more than 10 builds a day very soon. I would love to see people contributing builds for ARM, Android, iOS, …

If you find this useful, let others know; nice comments are also appreciated. 😉