
When we all started changing the way we work and communicate, most of us had to rely on online communication tools such as Zoom, Skype, Microsoft Teams, Jitsi, or BigBlueButton. Attending a video conference from home is not the same as attending from the office: the camera may capture things we would rather keep private, such as family photos and other personal belongings, or simply an untidy room.
Introducing virtual backgrounds
Most of the video conferencing platforms mentioned above offer a “virtual background” or “blur” feature. It allows users to change their background without a green screen: a machine-learning segmentation model distinguishes the person from the background in each frame, and everything outside the person is replaced or blurred.
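To illustrate the idea, here is a minimal sketch of how a single frame could be composited once such a segmentation mask is available. The function and parameter names are illustrative only and are not taken from any particular implementation.

```typescript
// Sketch of the compositing idea, assuming a segmentation mask is already
// available for the current frame: the mask canvas is opaque where a person
// was detected and transparent everywhere else (names are illustrative).
function composeFrame(
  ctx: CanvasRenderingContext2D,
  video: HTMLVideoElement,
  personMask: HTMLCanvasElement,
  background: HTMLImageElement
): void {
  const { width, height } = ctx.canvas;
  ctx.save();
  ctx.clearRect(0, 0, width, height);

  // 1. Draw the mask, then keep only the camera pixels it covers.
  ctx.drawImage(personMask, 0, 0, width, height);
  ctx.globalCompositeOperation = 'source-in';
  ctx.drawImage(video, 0, 0, width, height);

  // 2. Fill everything behind the person with the virtual background.
  ctx.globalCompositeOperation = 'destination-over';
  ctx.drawImage(background, 0, 0, width, height);
  ctx.restore();
}
```

For a blur effect, the same compositing works with a blurred copy of the camera frame drawn in place of the background image.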
At LionGate, we use BigBlueButton for our online teaching platform “vicole”. Unfortunately, BigBlueButton lacked a virtual background feature, even though it was highly requested by our customers and other users. We therefore offered to support the project and develop the feature ourselves, contributing an open-source addition that anyone can use once it is merged into the project.
Initial steps and understanding the technology
After offering to implement the feature, we first needed to understand the necessary steps. Since Jitsi is also open source and web based, we looked at its implementation first. Its source code showed that it was inspired by another open-source project on GitHub by “Volcomix”, which documents the steps well and provides a working demo.
We followed the approach of Volcomix and Jitsi and used the Google Meet segmentation model, which was released under the Apache License 2.0 in early January. The model was only available as a TFLite file, so we embedded it in a WebAssembly module. The code was then adapted to BigBlueButton’s conventions.
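As a rough sketch of what this looks like in the browser, the snippet below fetches the WebAssembly build and the TFLite model and returns an inference function. The interface, createTFLiteModule, loadModel, runInference, and the file paths are hypothetical placeholders for whatever the actual WebAssembly build exposes.

```typescript
// Hypothetical interface for a TFLite interpreter compiled to WebAssembly;
// the real exports depend on how the WebAssembly build is generated.
interface TFLiteModule {
  loadModel(model: Uint8Array): void;
  runInference(frame: ImageData): Float32Array; // per-pixel person probability
}

// Placeholder for the loader generated by the WebAssembly toolchain.
declare function createTFLiteModule(opts: { wasmUrl: string }): Promise<TFLiteModule>;

// Fetches the segmentation model and returns a function that maps a camera
// frame to a person-probability mask (file paths are assumptions).
async function loadSegmenter(): Promise<(frame: ImageData) => Float32Array> {
  const tflite = await createTFLiteModule({ wasmUrl: '/wasm/tflite.wasm' });

  const modelBuffer = await fetch('/models/segm_lite.tflite')
    .then((res) => res.arrayBuffer());
  tflite.loadModel(new Uint8Array(modelBuffer));

  return (frame: ImageData) => tflite.runInference(frame);
}
```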
Adapting the feature to BigBlueButton
The BigBlueButton web client uses React and Meteor. A Meteor web handler was created to load the WebAssembly files upon user request. Once loaded, each camera frame is composited with the chosen background image or a blur effect and drawn onto an HTML canvas element. A MediaStream captured from that canvas then replaces the original stream in the WebRTC RTCPeerConnection, so all participants see the virtual background without any additional processing load on their devices.
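In browser terms, the last step can look roughly like the following. This is a simplified sketch; the 30 fps capture rate and the variable names are assumptions rather than BigBlueButton’s actual code.

```typescript
// Sketch of swapping the outgoing WebRTC track for the processed canvas
// stream; the canvas and peer connection are assumed to exist already.
function publishVirtualBackground(
  canvas: HTMLCanvasElement,
  peerConnection: RTCPeerConnection
): void {
  // Capture the composited frames as a MediaStream (30 fps here).
  const processedStream: MediaStream = canvas.captureStream(30);
  const [processedTrack] = processedStream.getVideoTracks();

  // Replace the camera track on the existing sender; no renegotiation is
  // required, so the other participants simply receive the new frames.
  const videoSender = peerConnection
    .getSenders()
    .find((sender) => sender.track?.kind === 'video');
  void videoSender?.replaceTrack(processedTrack);
}
```

Because replaceTrack swaps the source on the sender side, all of the segmentation and compositing work stays on the presenter’s machine.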
Conclusion
Although the feature is functional, further refinements are ongoing. A key aspect of open-source projects is that anyone can identify issues and contribute. This was a great opportunity for us to give back to a project we enjoy using ourselves. You can try a live demonstration at www.vicole.de.