Increasing the Fun in Video Calls

Working from home and spending a lot of time on video calls got me wondering whether I could set up some sort of videoboard: being able to play reaction GIFs or small clips on my video feed.

This is my quick documentation on how I set everything up (targeted to macOS, but should work similarly on other platforms).

What you’ll need

  • OBS with the obs-mac-virtualcam plugin
  • The Advanced Scene Switcher plugin for OBS
  • BlackHole for audio routing
  • youtube-dl for downloading clips

OBS as Video Source

OBS is mostly known in the context of streaming your gaming or coding. But only recently has the obs-mac-virtualcam plugin become stable enough that you can use the output of OBS as a webcam in other programs 1.

So with OBS and the virtualcam plugin installed, you can start OBS and add your webcam (Video Capture Device). Then select Tools > Start Virtual Camera from the menu bar, and you can already use it in your next video call.

Adding Clips as Scenes

So go ahead and download your favourite short video clips from YouTube (easiest with youtube-dl) and drop them into new scenes in OBS (you might have to resize them to fill the whole screen).
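For example, grabbing a clip as an MP4 might look like this (the URL and output path are placeholders):

```shell
# Download a clip as MP4 so OBS can play it without extra codecs;
# -o sets an output filename template based on the video title.
youtube-dl -f mp4 -o 'clips/%(title)s.%(ext)s' 'https://www.youtube.com/watch?v=VIDEO_ID'
```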

At this point you can already cut between the different clips and they are shown on your video feed. However, you always have to switch back to the camera feed manually. Advanced Scene Switcher to the rescue: under Media you can create automations to switch back to the camera whenever a media file has ended.


Adding Audio with BlackHole

So the video part is ready to go, but how do you rick-roll your colleagues without sound? BlackHole allows us to pipe sound from one program to another: we want OBS to output all necessary audio signals to BlackHole and then use BlackHole as the audio input in your video conferencing program.

We’re going to use the Monitoring functionality and set BlackHole 16ch as the Monitoring Device under Settings > Audio > Advanced. Additionally, we must also output the microphone to the monitoring device.

All that’s left to do: Use the BlackHole audio device as microphone input in your video tool. 🎉


  • Sometimes I had only static noise on the BlackHole audio device and had to restart my computer. So if you plan to use this setup, be sure to do a quick test run before you annoy everyone on the call.
  • You can also set hotkeys for switching to specific scenes, which makes jumping between them much more comfortable.
  1. Support is still limited, but Zoom or running the call in the browser works fine in most cases. 

macOS Wifi Woes

I spent a large part of the past year being annoyed at the Wifi connection of my MacBook: every once in a while it would drop all traffic for a few seconds while reporting that everything was fine. This is particularly annoying on audio or video calls. It occurred once or twice a week, but I couldn’t reproduce it at will. So I set out on a pretty long troubleshooting journey that brought me to the brink of reinstalling macOS from a clean slate a few times. In the end the fix turned out to be pretty straightforward, and by writing this post I hope to remember it the next time macOS has such hiccups.

The Solution

  1. Open Network Preferences
  2. Create a new Network Location and switch to it
  3. Reboot
  4. Connect to your network
  5. Profit!
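The same steps can also be scripted with macOS’s networksetup tool; this is a sketch, and the location name "Fresh" is arbitrary:

```shell
# Create a new network location, pre-populated with the default services
sudo networksetup -createlocation "Fresh" populate

# Switch to it; reboot afterwards and reconnect to your network
sudo networksetup -switchtolocation "Fresh"
```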

Having done this a few months ago, I haven’t run into the Wifi drops ever since. 🤞

Other failed attempts

As mentioned, I endured this situation for quite a long time, but it wasn’t for a lack of trying to fix it. Here’s a list of things that didn’t work for me:

  • NVRAM / PRAM reset: An all-time classic when dealing with issues on a Mac; sadly it didn’t bring any salvation this time.
  • Changing Wifi routers: Both at work and at home I was connected to Ubiquiti access points, so I assumed that maybe something was afoul with this combination. But nobody else at work was affected, nor did switching back to an older Wifi router at home help. (And yes, I even fiddled with the MTU sizes, which brought back memories from LAN parties in the early 2000s.)
  • Removing the plist files: Nope, no improvements either.
  • Running Wifi Diagnostics: Even when I was lucky enough to catch a dropout when running the wireless diagnostics, it wouldn’t list any issues.

Working from Home: Audio Edition

Spending a good amount of my days in calls nowadays, audio quality is quite important. I have a plethora of headphones so I can comfortably listen to others on calls, but figuring out how to sound my best is a bit more tricky and pretty much an ongoing process. Currently I’ve settled on the microphone of my external webcam together with krisp.ai.

I have been content with the quality of either the Bose QC35 or the AirPods via Bluetooth for calls so far. But when we as a team at work decided to gather in a permanently open Mumble room1, things got a bit more tricky: with the microphone channel open, the headset switches to the low-quality, low-latency SCO codec. This makes listening to music and any other media rather unpleasant.

So I started fiddling with other options to split microphone input and audio output. The microphone of the MacBook quickly showed that it’s too dependent on my relative position to it, and even worse, you could hear the fans spinning up. So I was left with the external webcam I use, a Logitech C920. While I would say the sound itself is okayish, it had a pretty bad echo; a colleague described it as preaching in a church. Having read about krisp.ai2 earlier, I gave it a shot, and lo and behold, I’m impressed: it completely removes the echo and also filters out much of the ambient noise from traffic, most typing, and similar things. So I don’t have to do the mute-unmute dance after every sentence anymore.

Here are two audio samples:

  • Without krisp.ai
  • With krisp.ai active

For now I’m quite happy that I can listen to music and chime in on Mumble (and Zooms) without having to reconfigure everything all the time. Nonetheless I’m already eyeing a dedicated microphone, as it looks like the work-from-home situation will continue for months to come.

  1. Most of the time everybody is muted. But it is nice to have a direct chat when something comes up. 

  2. Referral link to get a month for free. 

Finicky - Always open the right browser


Thanks to the ongoing pandemic, I currently work from home and spend much more time in video calls. We use a variety of systems to do that: Slack, Zoom, Google Meet / Hangouts and Jitsi. The last two in particular run exclusively in the browser. And while I am a big fan of Firefox and use its Developer Edition as my main browser, video calls appear to run smoother in Chrome.

Until today I would often follow links to Hangouts or Jitsi, have them open in Firefox, and then copy the URL over to Chrome. I wondered whether there is a better way to do this, and that’s how I stumbled upon Finicky: a small macOS utility that you set as your default browser and that can be scripted to decide which URLs open in which browser.

So no matter where I click a Jitsi or Hangouts link, it will open in Chrome, while everything else opens in Firefox.

My current configuration:

module.exports = {
  defaultBrowser: "Firefox Developer Edition",
  handlers: [
    {
      // hostnames elided
      match: finicky.matchHostnames(["", "", ""]),
      browser: "Google Chrome",
    },
    {
      match: /\/j\//,
      browser: "us.zoom.xos",
    },
  ],
};
  • Update 2020-03-28: Added a config to open Zoom links directly in the app.
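For reference, a complete handler of this shape might look like the following sketch. The hostnames here (meet.google.com, hangouts.google.com, meet.jit.si) are illustrative examples, not necessarily the ones from my setup:

```javascript
// Hypothetical ~/.finicky.js: video-call hosts go to Chrome,
// Zoom meeting links open in the Zoom app, everything else in Firefox.
module.exports = {
  defaultBrowser: "Firefox Developer Edition",
  handlers: [
    {
      match: finicky.matchHostnames([
        "meet.google.com",
        "hangouts.google.com",
        "meet.jit.si",
      ]),
      browser: "Google Chrome",
    },
    {
      match: /\/j\//, // Zoom meeting URLs contain /j/<meeting-id>
      browser: "us.zoom.xos", // bundle identifier of the Zoom app
    },
  ],
};
```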

EOY 2019

I’m somewhat surprised my last year-in-review post is about 2014. I tend to write those for myself yearly but apparently didn’t get around to creating a public version in the last few years. So here is an unsorted list of things I did in 2019, topics I found interesting and stuff that changed.

Most memorable were the four weeks I spent road-tripping through Australia in October and November, driving from Sydney to Cairns in a campervan. Likely the best vacation of my life. I still have a ton of photos and videos to sort through. As we left just when the fires started, I am really sad about the current situation over there.

Some media consumption stats: Read 17 books. Listened to some 36,000 minutes on Spotify. Went to 7 concerts (fav: Muse in Berlin).

Upgraded to the iPhone 11 for the trip to Australia — worth it for the camera alone. Also decided to give the Apple Watch a try — not yet convinced.

Bought a Synology DS218+ as the centerpiece of our home network. It currently acts as VPN, Time Machine backup destination, PiHole, Unifi Controller and Plex server (and as data store for everything else).

At work I got to do two AWS certifications: Developer Associate and DevOps Professional. Worked on a lot of interesting projects at work, mostly centred around the services for Home Connect.

Also got to attend the We Are Developers Congress in Berlin. Favourite talk: Rasmus Lerdorf: 25 Years of PHP.

New favorite coding tools: TypeScript; Dank Mono as coding font; Color scheme toggles between Monokai Pro and Tomorrow Night.

Went to the gym more often than in the years before, but regularity only set in around November / December.

Onto a great 2020.


alias dapp='docker-compose exec app'

Going through things I changed in my workflow this year, I saw this alias. I use it very frequently to run commands in the Docker container named app.

Most projects I work on have their main component in such a container named app, so this makes a nice shortcut to do anything development related in the container. (And it replaced my dssh script for the most part.)
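If you work with differently named services, the alias generalizes to a small shell function. This is a sketch; dsvc is a hypothetical name, not something from my dotfiles:

```shell
# Run a command in an arbitrary docker-compose service:
# usage: dsvc <service> <cmd...>
dsvc() {
  local svc="$1"
  shift
  docker-compose exec "$svc" "$@"
}
```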

Some examples:

# Install dependencies in a node project
dapp npm install

# Update dependencies in a PHP project
dapp composer update

5 Lessons Learned From Developing on the Fitbit Platform


Earlier this year our customer BSH tasked us with an exciting project: Let’s create a Home Connect app for Fitbit smartwatches. The goal: Home Connect users can monitor and control their home appliances like coffee machines, washers or ovens from their wrists. Six weeks later we were proud to see it launch.

A few days ago I published a piece on developing applications for Fitbit watches on the Scandio blog.

Passing the AWS Certified Developer Exam

Two weeks ago I passed the AWS Certified Developer Exam. As it was a huge help for me to read about other people’s experiences, I also want to share my story. So this is a short retro on the what, how and why.

The AWS Developer Certification

Amazon Web Services is one of those clouds everybody keeps talking about: a multitude of different services for all kinds of use cases. I’ve only been using the small fraction that fit our projects at work.

AWS offers different tracks of certifications:

  • Solutions Architect — Focusing on the infrastructure to run your application on AWS
  • SysOps — Concentrating on the automation and the operations part
  • Developer — Using AWS from a software developer perspective

The chance to do the certification came from work. My employer, Scandio, is an AWS partner, and in this role we’re required to have certified staff. As one of the more experienced AWS users in our company, I volunteered to give the Developer path a try. Why the Developer and not the Solutions Architect (which is the most common one)? The Developer path overlapped the most with what I’ve been doing with AWS so far.

How I Tackled the Exam

This certification was the first exam I took since leaving university and I actually enjoyed “studying” more than I anticipated.

I used the mornings and afternoons of my workdays to put in the time for studying. In total I put in a bit over 30 hours over the course of two weeks.

As a starting point I went through the Cloud Guru course in about a week. My personal take on the topics covered in this course ranged from “I know this already, let’s quickly skim through it” (ElasticBeanstalk, EC2, S3) to “used once but let’s see how it’s actually meant to be used” (SNS, SQS, Lambda, API Gateway) to “never used, let’s see what’s behind this” (Kinesis, Cognito, CodeBuild, CodePipeline).

The second half was more focused on working with the services I didn’t have much experience with so far. Reading up on the FAQs and actually setting up examples. I also read a lot of other people’s posts about their experiences with the exam. This helped me quite a bit deciding what to look into.

As a last preparation step I took the test exam provided by AWS and the exam simulator offered by Cloud Guru. This allowed me to get a feeling for the type of questions.

The whitepapers, which are also recommended for preparation, I mostly skimmed during my daily commute. Personally, there wasn’t much new information in them that I hadn’t already picked up during the last few years as a software developer.

The Exam Itself

The questions in the exam were mostly scenario based — like “You are asked to set up an automated deployment for X. How can you achieve this while always having a capacity of Y%?”. Sometimes all answers would solve the problem at hand, but only one or two would actually fulfill the specific criteria asked for. So I took the time to read every question twice.

Per the NDA I’m not allowed to share any specific questions that were asked in the exam. But I want to share at least the topics that were part of my exam, as I also benefited from others doing so while preparing.

Deeply covered

  • CI / CD with AWS: CodeBuild, CodeDeploy, CodePipeline — How can you override configurations; what options are offered by the different services; which service is the right one for specific scenarios
  • SAM, Lambda, API Gateway: How are they used effectively together; Some more specific questions on the services themselves
  • Cognito: When to use which feature
  • Elastic Container Service / Docker: How to set it up properly and use effectively with other services
  • CloudWatch: Mostly in relation to other services how CloudWatch could help in specific scenarios

Superficially covered

  • EC2 / VPC / Security Groups
  • RDS
  • SNS
  • X-Ray
  • CloudFormation

Not covered

  • Kinesis
  • Details from the AWS Whitepapers unrelated to the services

Closing Thoughts

Having gone through the process I definitely got a better understanding of many AWS services. In some cases I was already able to use some of my learnings at work. The result of the exam (954 / 1000) was better than I expected before starting the certification. So, would I do it again? Yes.

If I were to do this again (or as a personal learning for other certifications), I would put more time into actually using the services I’m not familiar with. In this case that would have been SAM, API Gateway and the CodePipeline-related services.

But I would again try to fit it into a few weeks at most, because this allowed me to keep my concentration and not get carried away by everyday business.

Prettier Code

If you care about code formatting, you might want to take a look at Prettier. It changed the way I think about coding styles quite a bit.

So I used to spend a lot of time fiddling with code styles: from debating spaces vs. tabs to comparing Symfony’s Coding Standards and Google’s Styleguides. With JavaScript becoming the language of choice for most new projects, I settled on the Airbnb JS Style Guide, and with the matching linter rules the topic was settled for quite some time.

But half a year ago, we decided at work to use Prettier for a new project. And this has changed how I think about code styleguides in a pretty fundamental way: I just don’t care anymore.

What Prettier does: instead of looking at the code style as it was written and applying rules to it, Prettier parses the code and prints it in its own format. So the leeway classic styleguides give every developer isn’t a topic to ponder anymore.

Like many linters, it automatically reformats files on saving. At first it felt like a heavy intrusion into my work. After all, I put – or at least pretended to put – some effort and pride into the styling of the code I wrote. But a few days later I had almost completely stopped thinking about code formatting. Months later I’m on the other end: I write code and am always heavily confused if it doesn’t automatically get reformatted into the now familiar Prettier style.

So if you happen to start a fresh project, just give Prettier a try for a couple of days.

DNS Resolution in Docker Containers

Networks in Docker are powerful, and only recently did I learn about the embedded DNS server. So if you maintain containers for which DNS resolution is important but which might not have the most reliable connection to the common DNS servers (e.g. Google’s 8.8.8.8 and Cloudflare’s 1.1.1.1), this might be a feature to look into.

In my situation an OpenResty / nginx container runs in multiple regions (EU, US, China), and its main purpose is to distribute requests to other upstream services. To do so it’s necessary to set the resolver directive to tell nginx which DNS server to use. First decision: 8.8.8.8 and 1.1.1.1. This worked fine until the container in China started to get timeouts while attempting to connect to those DNS servers, essentially bringing down the whole service.

To get around this I toyed with different approaches:

  • Switch from hostnames to IP addresses for routing — didn’t work directly because of SSL certificates.
  • Adding a local DNS service in the container (dnsmasq) — didn’t really want to add any more complexity to the container itself.
  • Adding a separate container to handle DNS resolution.

Only then did I stumble across the embedded DNS server. If the container runs in a custom network, it’s always available at 127.0.0.11 and will adhere to the host’s DNS resolution configuration. While all other host machines already had a robust enough DNS config, I manually added the most crucial IP addresses to the /etc/hosts file on the Chinese host. Bingo, no more DNS issues ever since.
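A minimal sketch of what this looks like in the nginx config; the upstream hostname is a placeholder:

```nginx
# Use Docker's embedded DNS server for name resolution
resolver 127.0.0.11 valid=30s;

server {
    listen 80;

    location / {
        # Using a variable forces nginx to re-resolve the hostname
        # at request time instead of caching it at startup.
        set $upstream "https://backend.example.com";
        proxy_pass $upstream;
    }
}
```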

I guess the lesson for me here is to dig a bit deeper into the tools already at hand before going down the rabbit hole and constructing overly complex systems.