Technical Tips for video meetings

It seems a lot of people have already written plenty on the etiquette of video meetings, so in this little post I’ll try to contribute some technical tips, which don’t seem to be covered as much.

Network connection

While wifi mostly seems to work fine, it can cause issues. If you have the option to use a wired connection for the device you use for video meetings, do so. It will have lower latency than any wifi connection and improve the experience.
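
If you want to check the difference on your own network, a quick sketch (the gateway address 192.168.1.1 is an assumption; substitute your own router’s address) is to ping the router over each connection and compare the round-trip times in the summary:

# run once on wifi and once on the wired connection, then compare
# the avg and stddev round-trip times in the summary line
ping -c 20 192.168.1.1

A wired connection will typically show both a lower average and far less variation, and that jitter matters more for real-time audio and video than raw bandwidth does.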

Automatic macOS shutdown

From time to time my Mac is doing stuff which takes quite a while: converting images, converting video files between formats, or other jobs which may take a long time (but are reasonably predictable).

In those cases I run a little command in the terminal to automatically shut down the Mac upon completion:

sudo shutdown -h +120

This command sets a timer which shuts down the machine after two hours (the 120 parameter being the number of minutes to wait).
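
If plans change, the pending shutdown can be cancelled. On macOS the timed shutdown is simply a process sleeping until the deadline, so killing it cancels the timer:

# cancel a pending timed shutdown
sudo killall shutdown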

No access to *.dev sites

I’ve been having an odd issue for a couple of months. When accessing sites on a .dev domain (most recently go.dev), my browsers have given me warnings, and as many of these sites send HSTS headers, have refused to let me visit them at all.

It seemed like a strange error, and I tried to remember whether I had set up some proxy or VPN connection that could cause the issue. A few times I asked others on the net if they had issues too - which was not the case - and when I tried using a web proxy, everything worked. Yet no matter which browser I used locally, it didn’t work.
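
If you are debugging something similar, one quick first check (a sketch, assuming the dig tool is installed) is to compare how the domain resolves through your system’s resolver with how a public resolver answers - a mismatch points at a local override, proxy, or stale resolver rather than at the site itself:

# resolve with the system resolver, then directly via a public resolver
dig +short go.dev
dig +short go.dev @1.1.1.1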

Ubuntu 16.04 to 18.04 TLS...

The site went offline for a few hours today. Sorry.

It turns out Ubuntu once again changed a major component, and the upgrade path didn’t work as it should to keep the lights on after the upgrade.

I’ve been updating the security settings on the server all around, and one of the things I wanted to do was add TLSv1.3 support (and drop everything before TLSv1.2). The best option for that seemed to be pushing the Ubuntu server forward to the newer LTS version (18.04), and as part of this getting a newer NGINX with TLSv1.3 support. That part worked sort of great.
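
For reference, the protocol restriction itself is a single directive in the NGINX configuration. This is a minimal sketch; it assumes an NGINX built against OpenSSL 1.1.1 (which TLSv1.3 requires) and that the rest of the server block is already in place:

ssl_protocols TLSv1.2 TLSv1.3;

After reloading NGINX, a handshake test such as openssl s_client -connect yourhost:443 -tls1_3 (with OpenSSL 1.1.1 or newer on the client side as well) should succeed, while the same command with -tls1_1 should be refused.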

Crawl and save a website as PDF files

The web is constantly changing, and sometimes sites are deleted as the business or people behind them move on. Recently we removed a few sites while doing maintenance and updates on the many sites we run at work. Some of them had interesting content - for personal or professional reasons - and we wanted to make a static copy before deleting the sites completely.

I have not found any easy, simple, well-working software which can produce an all-inclusive downloaded copy of a website (including all resources sourced from CDNs and third-party sites) to actually make it browsable offline. As I needed to make the copy reasonably fast, I chose to try capturing the contents of the site (a text/article-heavy site) as PDFs.
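
For completeness, here is a rough sketch of the kind of approach I mean: spider the site with wget to collect its URLs, then print each page to a PDF with headless Chrome. Everything in it (the example.com domain, the file names, and the chrome binary name, which varies by platform) is a placeholder rather than the exact script I used:

# 1) spider the site (nothing is saved to disk) and pull the visited URLs
#    from the log; the log format varies a little between wget versions,
#    so adjust the parsing if needed
wget --spider --recursive https://example.com/ 2>&1 | grep '^--' | awk '{print $3}' | sort -u > urls.txt

# 2) print each URL to its own PDF with headless Chrome
#    ("chrome" stands for your Chrome/Chromium binary)
while read -r url; do
  name=$(echo "$url" | sed 's/[^A-Za-z0-9]/_/g')
  chrome --headless --disable-gpu --print-to-pdf="${name}.pdf" "$url"
done < urls.txt

This loses interactivity, of course, but for a text-heavy site it preserves the articles in a stable, self-contained format.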