GitHub is a fantastic service for source code management and hosts a suite of wonderful tools which help you manage development, track issues and much more.
One of the fun things is the GitHub activity dashboard on your personal front page, which shows how much you've done.
I'm sure it's mostly a fun thing, and it allows you to get a rough idea of whether an account is dead or alive.
I recently switched this site from WordPress to Hugo - and the site with Danish content too. It was mostly easy and straightforward, but initially a few features were missing (by the very nature of it being a static site) and some things needed looking into once the switch had happened. One of these was a gallery function to present images (mostly photos).
Gallery options… There are various ways to have a gallery on a site, and one of the pains I had with WordPress was changing strategies over time, which left me with several plugins needed just to handle the historic gallery choices. A small sketch of how this can look in Hugo follows below.
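With Hugo the basics can be covered by a single shortcode over page resources. This is only a minimal sketch, assuming the images live in a page bundle next to the content file; the shortcode name `gallery` and the 300px thumbnail width are my own choices, not necessarily what this site ended up using:

```go-html-template
{{/* layouts/shortcodes/gallery.html */}}
{{/* Loop over all images bundled with the current page. */}}
{{ range .Page.Resources.ByType "image" }}
  {{/* Render a 300px-wide thumbnail linking to the full-size image. */}}
  {{ $thumb := .Resize "300x" }}
  <a href="{{ .Permalink }}">
    <img src="{{ $thumb.Permalink }}" alt="{{ .Title }}">
  </a>
{{ end }}
```

A page bundle can then drop `{{< gallery >}}` into its Markdown, and every image in the bundle is picked up automatically - no per-post plugin configuration to carry around.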
As the web moves more and more to HTTPS and enhanced security (such as HSTS), keeping your certificates updated and valid becomes more and more important.
I’m toying with the idea of building a small webapp to monitor my small portfolio of certificates and warn me if a certificate is due to expire. As part of this, I’m slowly patching pieces together in Go, and one of the small useful outcomes is a short script (compilable to a binary) which prints the basic certificate details of a given domain.
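A minimal sketch of such a script, using only the standard library - the fixed `:443` port and the selection of fields to print are my choices here:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"os"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: certinfo <domain>")
		os.Exit(1)
	}
	host := os.Args[1]

	// A nil config is fine: tls.Dial derives ServerName from the address
	// and performs normal certificate verification.
	conn, err := tls.Dial("tcp", host+":443", nil)
	if err != nil {
		fmt.Fprintln(os.Stderr, "connect:", err)
		os.Exit(1)
	}
	defer conn.Close()

	// The first peer certificate is the leaf, i.e. the site's own cert.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("Subject:   ", cert.Subject)
	fmt.Println("Issuer:    ", cert.Issuer)
	fmt.Println("DNS names: ", cert.DNSNames)
	fmt.Println("Not before:", cert.NotBefore)
	fmt.Println("Not after: ", cert.NotAfter)
}
```

Build it with `go build` and run it as `./certinfo example.com`; the `NotAfter` field is the one a monitoring webapp would compare against the current time.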
I’ve been having an odd issue for a couple of months. When accessing sites on a .dev domain (most recently go.dev), my browsers have given me warnings and, as many of the sites send HSTS headers, refused to let me visit them at all.
It seemed like a strange error, and I tried to remember whether I had set up some proxy or VPN connection that could cause it. A few times I asked others on the net if they saw the same problem - they did not - and when I tried using a web proxy, everything worked.
The web is constantly changing, and sometimes sites are deleted as the business or people behind them move on. Recently we removed a few sites while doing maintenance and updates on the many sites we run at work. Some of them had interesting content, for personal or professional reasons, and we wanted to make a static copy of the sites before deleting them completely.
I have not found any easy, simple and well-working software which can produce an all-inclusive downloaded copy of a website (including all resources sourced from CDNs and third-party sites) to actually make it browsable offline.
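To illustrate why this is harder than it sounds, here is a rough Go sketch of only the first step such a tool needs: fetching a page and enumerating the resources it pulls in. It assumes the golang.org/x/net/html parser; resolving relative URLs against the page, downloading every asset and rewriting the references to local paths is the hard part this sketch deliberately leaves out:

```go
package main

import (
	"fmt"
	"net/http"
	"os"

	"golang.org/x/net/html"
)

// collectURLs walks the parsed HTML tree and gathers the URLs of
// embedded resources: images, scripts and stylesheets.
func collectURLs(n *html.Node, urls *[]string) {
	if n.Type == html.ElementNode {
		// Which attribute holds the resource URL depends on the tag.
		attrFor := map[string]string{"img": "src", "script": "src", "link": "href"}
		if key, ok := attrFor[n.Data]; ok {
			for _, a := range n.Attr {
				if a.Key == key && a.Val != "" {
					*urls = append(*urls, a.Val)
				}
			}
		}
	}
	for c := n.FirstChild; c != nil; c = c.NextSibling {
		collectURLs(c, urls)
	}
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: listres <url>")
		os.Exit(1)
	}

	resp, err := http.Get(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	doc, err := html.Parse(resp.Body)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	var urls []string
	collectURLs(doc, &urls)
	for _, u := range urls {
		fmt.Println(u)
	}
}
```

Even this naive version makes the problem visible: many of the printed URLs point at CDNs and third-party hosts, and each of those has to be fetched and rewritten before the copy works offline.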