



The Missing Workhorse Mac


Processing-intensive workflows are in higher demand than ever, but the Mac lineup is missing a crucial piece for Apple's pro customers.

AVM Fritz WiFi Mesh


AVM Fritz WiFi Mesh arrived unexpectedly, but it is a blessing for the German market

Turning iOS Extensibility to 11


iOS is amazing but lacks productivity basics

Setting Up Automated E-Mail GoAccess Reports via systemd

Despite the drawbacks of client-side website analytics libraries like Google Analytics, which seem obvious to me, people insist on including them on their websites because they “need the data” or whatever, and think that there is no other way to get it. I disagree.
GoAccess is a great command line tool that generates web server reports directly in the terminal, letting you quickly interact with the data to answer questions you may have, but it also happens to output an HTML report contained in a single static file. The out-of-the-box styling of the static HTML file is great and enables you to quickly glance at a report, or even send it to colleagues to support their efforts. I really recommend giving it a try, and I strongly recommend tearing privacy-invading client-side libraries out of everything you have control over. While the early 2000s internet was a really cool place to be, Google Analytics is one of those relics that should have died a decade ago.
Most web server logs hold tons of interesting information sent by the visitor's web browser, which can then be analysed asynchronously by a tool like GoAccess in a privacy-respecting way. You will miss out on a few things, but nobody is sitting in front of the Google Analytics dashboard frantically optimising their website for every possible viewport height & width, pixel by pixel.

I have been using this setup for over a year now, but recently ran into some trouble with my cron automation, which a Raspberry Pi executed based on a file in /etc/cron.daily/. If you don't know, placing files into those directories is a little tricky because various requirements need to be met, sometimes things just stop working for inexplicable reasons, and, oh, there are no logs about anything.
I was finally over it and simply ignored the reports for a while, thinking that it would either fix itself or that I would fix it myself at some point. Recently though I ran into a great blog post about setting up systemd timers to run automations. I have since converted various things in my life to these systemd timers and have not looked back. Some of y'all don't quite understand the powers of systemd or dislike it fundamentally, but it is quite compatible with my brain, and getting worry-free logs which are instantly available, query-able and sort-able via journalctl makes it all even better in my eyes.

The last remaining puzzle piece in this entire chain is getting the GoAccess report into my E-Mail inbox. I looked at a few of the common recommendations on the internet but ended up using a great OSS project fittingly called eMail. It does a few fancy things, but it tries to get out of your way and simply be a good *NIX tool. There isn't too much to say about it, and that is a good thing. I can wholeheartedly recommend it!

After installing GoAccess, you would usually get started by running something like this to generate a static HTML report (drop the -o static-report.html to check out the analysis in the terminal first):

zcat -f /var/log/nginx/access.* -f /var/log/nginx/error.* | goaccess -o static-report.html
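As a side note on that command: zcat -f decompresses rotated .gz logs and passes plain text files through untouched, which is why a single invocation can cover both the current and the rotated Nginx logs. A quick throwaway demonstration (the /tmp file paths here are just examples, not part of my setup):

```shell
# zcat -f handles both plain and gzip-compressed input transparently.
# Create one plain file and one gzipped file to stand in for current
# and rotated logs.
echo "current log line" > /tmp/demo.log
echo "rotated log line" | gzip -f > /tmp/demo.log.1.gz

# Both lines come out as plain text, ready to pipe into grep or goaccess.
zcat -f /tmp/demo.log /tmp/demo.log.1.gz
```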

You can even exclude paths that you do not care about from the GoAccess report by piping the logs through grep first, for example:

zcat -f /var/log/nginx/access.* -f /var/log/nginx/error.* | grep -Ev '/admin|/wp/login/' | goaccess -o static-report.html

This however will not work with systemd, nor will it work if you throw it all into a shell script which systemd then executes, for whatever reason. GoAccess will give you scary-sounding, nonsensical errors, and gzip will complain about broken pipes.

Aug 05 09:15:56 raspberrypi systemd[1]: Started GoAccess Report generation script.
Aug 05 09:15:56 raspberrypi bash[22837]: GoAccess - version 1.5.1 - Jul 19 2021 17:11:41
Aug 05 09:15:56 raspberrypi bash[22837]: Config file: /usr/local/etc/goaccess/goaccess.conf
Aug 05 09:15:56 raspberrypi bash[22837]: Fatal error has occurred
Aug 05 09:15:56 raspberrypi bash[22837]: Error occurred at: src/goaccess.c - initializer - 1459
Aug 05 09:15:56 raspberrypi bash[22837]: No input data was provided nor there's data to restore.
Aug 05 09:15:56 raspberrypi bash[22837]: grep: write error: Broken pipe
Aug 05 09:15:56 raspberrypi bash[22837]: gzip: stdout: Broken pipe

I ended up fixing this by creating a working directory for my GoAccess script and defining a set of files which I create, use while the logs are being analysed, and finally delete once the report has been sent out.

# mkdir -p /usr/local/goaccess-reports
# touch /usr/local/goaccess-reports/excluded-paths.txt
# touch /usr/local/goaccess-reports/

Once that is set up you can populate the files accordingly and start receiving daily reports about the traffic on your web server, straight into your inbox.
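For reference, excluded-paths.txt simply holds one grep pattern per line; grep -v -f then drops every log line matching any of them. The patterns below are purely illustrative, not the ones from my actual setup:

```shell
# Hypothetical example patterns -- one per line, anything grep understands.
# Using /tmp here so this demo doesn't touch the real working directory.
cat > /tmp/excluded-paths.txt <<'EOF'
/admin
/wp-login
/favicon.ico
EOF

# Sanity check against a few fake log lines; only the first one survives.
printf '%s\n' \
  'GET /index.html HTTP/1.1 200' \
  'GET /admin/users HTTP/1.1 401' \
  'GET /favicon.ico HTTP/1.1 404' \
  | grep -v -f /tmp/excluded-paths.txt
```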


#! /bin/bash

# Working directory and temp/report file names. The names below match
# the files referenced in the echo statements further down.
WORKDIR="/usr/local/goaccess-reports"
TMPLOGFILEFULL="$WORKDIR/tmp-full.txt"
TMPLOGFILECLEANED="$WORKDIR/tmp-cleaned.txt"
OUTPUTFILELOCATION="$WORKDIR/"
OUTPUTFILE="today-report.html"

echo "Assembling report from Nginx logs"
zcat -f /var/log/nginx/access.* -f /var/log/nginx/error.* > "$TMPLOGFILEFULL"

echo "Cleaning the log"
grep -v -f /usr/local/goaccess-reports/excluded-paths.txt "$TMPLOGFILEFULL" > "$TMPLOGFILECLEANED"

echo "Analyzing the log"
goaccess -f "$TMPLOGFILECLEANED" --log-format=COMBINED --http-protocol=no --http-method=no --ignore-crawlers -a -o "$OUTPUTFILELOCATION$OUTPUTFILE"

echo "Sending report with eMail"
email -b -s "Website Report For $(date +"%Y-%m-%d")" -attach "$OUTPUTFILELOCATION$OUTPUTFILE"

echo "Deleting today-report.html file"
rm "$OUTPUTFILELOCATION$OUTPUTFILE"

echo "Deleting tmp-full.txt"
rm "$TMPLOGFILEFULL"

echo "Deleting tmp-cleaned.txt"
rm "$TMPLOGFILECLEANED"

echo "Done for today"


[Unit]
Description=GoAccess Report generation script

[Service]
Type=oneshot
ExecStart=/bin/bash /usr/local/goaccess-reports/


[Unit]
Description=Timer for Daily GoAccess report

[Timer]
OnCalendar=daily

[Install]
WantedBy=timers.target

Finally, remember to start and enable the systemd timer:

# systemctl start goaccess-report.timer 
# systemctl enable goaccess-report.timer 

You can check on the status of your timer and when it’ll be executed again via

# systemctl status goaccess-report.timer


A Developer's POV: App Store: The Schiller Cut

I’ve been deeply involved with the Apple developer community since the 1990s. There has always been conflict between developers and Apple. Over the balance of fixing bugs versus adding features to the platforms, over the quality of documentation, over the tools, over everything. But the relationship has clearly turned for the worse during the App Store era, and the reason, I think, is money.

This excellent paragraph misses one key thing, in my opinion, that nobody seems able to put into words and that I am still struggling a little bit with myself.

John Gruber’s excellent piece on the current sentiment in the developer community towards Apple, and his circling around Phil Schiller’s E-Mail, sent many years ago, about what App Store policies could have been, is spot on. I recommend reading the entire piece first, if you haven’t yet, before continuing with my (poor) attempt at articulating my thoughts on the topic.
The quoted paragraph above is what led me to write this blog post, and while I have disagreed with many of John’s opinions over the last two-ish years specifically, I believe that disagreeing is a good thing.

> ..., and the reason, I think, is money.

As much as this conflict is about money, it isn’t about money at all. It also isn’t about any of the stated friction points in particular. For example, Xcode is both the best and the worst development environment that I have used so far. It handles many delicate and complicated tasks in a very effortless-feeling, Apple-y fashion, but as much as it sometimes feels like magic, it also fails in a very Apple-y fashion, and you know exactly what I mean by that.
I still get signing credentials for Guardian and my other apps manually, because I was burnt so badly by the automatic tooling in the past that I am not yet willing to trust it again.

The biggest problem I personally have is the general indifference towards developers. The position that we had better be grateful for the opportunities they have blessed us with, or else…, is simply disappointing. To break the problem down further: there is, in my opinion, only one correct answer to the question “who needs who?” in this relationship, and it’s “both need each other”, but Apple refuses to acknowledge that. Apple wouldn’t sell even half as many of their devices without software from third party developers that elevates the devices into an essential turning point in people’s lives. The builtin software from Apple lays the foundation and sets the tone, but the third party apps are what make the platform which sells the hardware. Remember the “There Is An App For That” marketing campaign? They sure as hell weren’t referring to their own apps. Or are our apps really their apps? The refusal to acknowledge this symbiosis is why it then boils down to being about the money. If they took 2% plus whatever the credit card processing fees cost them, I bet many would be willing to accept the current terms and call it the “I hope you choke on it” tax.

I’ve often said that Apple’s priorities are consistent: Apple’s own needs first, users’ second, developers’ third.

This is correct but overly simplified, which was the right tradeoff for John to make to avoid getting lost in a tangent in his post. Developers aren’t third and users aren’t second.
Apple is first, Apple is second, Apple is third, Apple is fourth and Apple is fifth. Users might be sixth depending on which day of the week it is and developers aren’t even in the top 20.
It has a lot of Chappelle’s Show energy.

Cook said that lawsuits were in the back of his head, but what triggered the program was worry over small businesses during Covid.

I believe the problem that many, including me, have run into recently is that the words spoken by Apple’s leadership in interviews or keynotes could, in the past, generally be trusted and taken at face value. I do believe that they actually care about user privacy and try their best as they run into political limitations around the world. The blatant lies in broad daylight though, both under oath in court and in a few recent interviews touching on these and similar topics, are reputational damage that will last a generation.

Just as a brief reminder: in the beginning of 2020, as everybody was running around losing their minds while trying to adjust to huge shifts in how they were going to live their lives, Apple decided that it was a good time to force as many apps as possible into adopting the in-app purchase system, which would ensure that they got their 30% cut from the many things being forced to shift to online transactions practically overnight. Many people were unsure if and how they would be able to feed their families, and while they tried to improvise as much as possible, as we all did, Apple either held app updates hostage over in-app purchase adoption or threatened removal from the App Store. Every other day a new story was shared by a new developer, big and small. I am convinced that Apple’s leadership team was well aware of what was going on at the time and had every chance to stop it, but they chose not to.

Here is the full quote of the exchange:

Rogers also expressed doubt that Apple’s Small Business Program, which cut App Store fees in half for small developers, was made out of concern for small businesses during the Covid pandemic, as Cook testified on Friday. “That seemed to be the result of the pressure accrued because of investigations, of lawsuits,” Rogers said.

Cook said that lawsuits were in the back of his head, but what triggered the program was worry over small businesses during Covid.

Rogers remarked that she had seen a survey that 39% of Apple developers are dissatisfied with the App Store. “It doesn’t seem to me that you feel any pressure or competition to actually change the manner in which you act to address the concerns of developers,” Rogers said.

Cook disagreed and said that Apple “turns the place upside down for developers.”

Even their attempt to provide developers with the smallest bit of saving grace under legal pressure completely failed, thanks to the program being so complicated for no obvious reason that apparently even Apple struggles to work with it, based on this thread started yesterday by Russell Ivanovic.
Whether it is complicated because they are greedy or for some other reason, it is not a good look and has not helped them in any way, in my opinion. Not with developers, and not to alleviate legal pressures. It failed so transparently that they were directly called out about it during the court hearing, not by the opposing legal team, but by the judge.

Image of Russell’s original tweet expressing frustrations about Apple’s Small Business Program

From the replies in the thread it appears that many run into this problem, as they have no idea whether or not they are actually enrolled (some simply never get enrolled, it seems). Many small developers, who already don’t have the time to deal with this sort of nonsense, now have to spend their time reading the tea leaves to figure out whether the first, second, third or fourth time they applied actually got them into the program.

Image of a reply to Russell’s tweet expressing frustrations about Apple’s Small Business Program

How many more devices could Apple sell if they were to allow developers to provide the best possible experience to their customers on their platforms? By simply making it nice in every way, and having not just their software, but also the software written by third party developers, solve all of your problems effortlessly and get out of your way. Remember back in the iOS 5 days when developers and designers on Dribbble were trying to trump each other’s login screens with the coolest looking, best working login screen possible on iOS? I want that, but for every problem on the platform, from payments to logins to modal alerts to interface idioms. Imagine what would be possible with some constraints lifted, like Phil Schiller tried to express in his E-Mail, while guaranteeing all the security upsides that currently exist for users.


iOS' Share Sheet Is Broken For E-Mail

The share sheet that we use daily in iOS was first introduced almost ten years ago. It appears, though, that enabling users to share basic things like formatted plain text, the one thing E-Mail is used for most, via their preferred E-Mail client is still an afterthought. Either nobody is able to implement these APIs correctly or nobody is interested in doing so. Variables need to be set via undocumented KVC calls, or by prepending your text with whitespace characters. Obviously these workarounds can be found in Apple’s great developer documentation, in which they clearly communicate these shortcomings… ah no wait, I am just joking. You can actually find them in comments under answers to StackOverflow questions.

How is anybody willing to tolerate this shit?

At Guardian we try to make it as easy as possible for our customers to send a support inquiry from our app via their E-Mail account so that they have full control over what information is shared with our support staff. Opt-in instead of opt-out! No hidden data mining!

Guardian Contact Technical Support UI Guardian Contact Technical Support Formatted E-Mail

Every once in a while we get requests from users asking us to support E-Mail clients other than the built-in iOS Mail app. Some prefer GMail, Fastmail or Outlook, and we would like to make it easy for all customers to help us help them. We would like to conform to a system protocol (like UIActivityItemSource, but useful) which allows us to set the recipient’s E-Mail address, the subject line and the body. Setting the E-Mail’s body is the only thing that actually works reliably. Instead we end up with a mess: pre iOS 13, iOS supports an undocumented KVC call on the UIActivityViewController instance that you created to set the subject line. Since it is undocumented, GMail does not support it, but if you add lots of return characters \n to your E-Mail body string, GMail will at least not place your body string into the subject line. I am convinced that other apps have other problems as well.

I do not understand why the three required fields in an E-Mail (yeah, technically two, but who doesn’t set a subject) cannot be set programmatically by an app. With iOS 13 Apple added activityViewController:subjectForActivityType: which allows you to set a subject line, but in my testing neither GMail nor Fastmail actually supports it. Maybe them not supporting it isn’t even intentional; maybe they can’t, because Apple also didn’t bother to document how E-Mail clients are supposed to handle this new value. Who knows…

So instead of using system functionality that is always presented to the user in the same way and would reduce a lot of friction on both ends, we now have the choice of supporting no other E-Mail client like we currently do or add a bunch of URL schemes to deep link into various other apps like it’s 2010. What a time to be an iOS developer.


Fix Apple's GeoTrust APNS Cert Problem

Apple’s APNs (Apple Push Notification service) servers started to act up last weekend and there was a lot of confusion about it at first. This is a rare occurrence, since APNs and iMessage appear to be Apple’s only rock solid server side services, while everything else appears to be regularly operated with a staff count of minus one. By started to act up I specifically mean that the certificate which the service has been using could no longer be verified by many servers after a ca-certificates package update went out removing the root CA (little bit of context here). Lots of servers have probably started to show an error message similar to this:

Feb 10 15:53:55 guardian-example-server service.elf[31376]: sendPushNotification(): APNs request failed: Post "<token>": x509: certificate signed by unknown authority

This happened to us at Guardian as well and I only caught it by accident. It led to none of our Pro subscribers getting real time push notifications, a feature I had poured a lot of work into last summer to get right, and had to reimplement a couple of times. All of that work was instantly disabled when the certificate was kicked out of the trust store on all of our VPN nodes. I did not want to re-install the certificate system wide again, since GeoTrust appears to be untrustworthy and it would have required me to run commands through an SSH session on way too many servers.

So I resorted to being lazy and disabled TLS certificate verification for that one HTTP request. Every other outbound network connection would still fail if it tried to connect to a host which also served a TLS certificate signed by the same GeoTrust certificate Apple will continue to use until March 29th 2021. This was my lazy initial solution to the problem, which never made it into production because @chronic instantly kicked me in the butt about it, and rightfully so. Disabling verification is bad practice and should not be used in a production environment in 2021!
He suggested dropping the certificate in the typical .pem format onto the filesystem of all hosts and adding it to a temporary trust store for that one request instead (the code below would basically only need an additional ioutil.ReadFile() if you wanted to do that). This would mean that come March 30th we’d have a file lingering on all hosts that I did not want to be there. So: manual upload now, use it for a couple of weeks, then manual removal. Too much potential for human failure if you ask me.
I ended up with a modified approach of what Will had suggested, but instead of reading a file off the filesystem I decided to embed the GeoTrust certificate into our binary, since it really wasn’t a lot of data.

This, in my opinion, is the right way to solve this problem until March 29th 2021 in Go, but I am sure all server side languages offer a similar API. It allows you to establish a verifiable TLS connection now, while not jeopardising the integrity of your entire system’s trust store and enables easy removal once Apple starts using the new certificate.

In order to solve this problem the right way I first downloaded the GeoTrust certificate from their website, to which I was still able to establish a trusted connection since macOS still trusts the certificate.

Link to the GeoTrust certificate
I am not including the entire certificate here for a good reason. Verify it for yourself, you shouldn’t trust me!
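One way to do that verification yourself: inspect the downloaded PEM's subject, expiry date and SHA-256 fingerprint with openssl, and compare the fingerprint against a source you trust. The snippet below generates a throwaway self-signed certificate purely as a stand-in so the inspection command has something to chew on; in practice you would point openssl x509 at the GeoTrust PEM you downloaded instead.

```shell
# Stand-in only: create a throwaway self-signed cert so the inspection
# command below can be demonstrated. Replace /tmp/standin.pem with the
# path to your downloaded GeoTrust PEM in real use.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=stand-in" \
  -keyout /tmp/standin.key -out /tmp/standin.pem 2>/dev/null

# The actual inspection: print subject, expiry and SHA-256 fingerprint.
openssl x509 -in /tmp/standin.pem -noout -subject -enddate -fingerprint -sha256
```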

var (
	geoTrustRootCA = []byte(`-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----`)
)

Add the contents of the GeoTrust certificate as a variable any way you prefer; it just needs to be accessible to x509.AppendCertsFromPEM() as a byte slice.
In this case, I created a global variable by converting a raw string literal to a byte slice and assigning it to geoTrustRootCA.

	certpool := x509.NewCertPool()
	certpool.AppendCertsFromPEM(geoTrustRootCA)

	transport := http.DefaultTransport.(*http.Transport).Clone()
	transport.TLSClientConfig = &tls.Config{
		RootCAs: certpool,
	}

	httpClient := &http.Client{
		Timeout:   15 * time.Second,
		Transport: transport,
	}

	resp, reqErr := httpClient.Do(...)

Here I create a new x509 certificate pool, add only the GeoTrust certificate to it, and then include it in the HTTP client’s transport by first creating a copy of the default HTTP transport setup. This HTTP client will allow you to safely make a successful API call to the APNs endpoint for now. It may or may not break the second Apple rolls out the new certificate on March 29th, so be ready and set up an alarm or something…


Recovering APFS Data

One of my usual holiday duties is to fill the role of family tech support. This year I was assigned the difficult case of recovering data off of a seemingly broken-beyond-recovery SSD. The MacBook Pro in question was close to 10 years old and I had swapped a Samsung 850 Evo SSD into it a few years prior. The owner mostly lives on his iPhone and does not rely on the MacBook for intensive daily work tasks. The MacBook appears to have had some kind of internal hardware fault which led to the corruption of the filesystem. After removing the SSD from the MacBook I plugged it into my Mac via a SATA-to-USB adapter in order to run through a basic data recovery strategy, but I quickly noticed that it was behaving in all kinds of unexpected ways. My Mac instantly recognised the drive itself, but was never able to activate or mount any partition or filesystem.

It was there and appeared to be within reach, but greyed out. I tried running First Aid on the drive and partition via Disk Utility, but it kept spewing errors that didn’t give me any hope. After briefly googling I found that not much is available and most existing tools, like Disk Warrior, are still in the process of fully adopting APFS.

After a bit more searching I found this great blog post describing a similar problem. It mentioned a commercial but incredibly shady app as well as the OSS, but experimental, libfsapfs project on GitHub. I first downloaded the commercial app, scanned the drive and saw that it was able to read the data on the drive and reconstruct entire directories of the filesystem. This gave me hope right away. I stopped the scan, downloaded the GitHub project and started compiling it. The difference between the two approaches: the commercial data recovery app basically tries to read every block on the drive and reconstruct whatever makes sense to it, while libfsapfs tries to work around the formatting problems to actually mount the partition as usual, so that you could open it via Finder and pull the data off the drive like you usually would. After going through the fairly complicated compilation steps to set up libfsapfs, it ended up not being able to read the drive, which meant that this thing was properly scrambled.

% sudo fsapfsinfo /dev/disk3     
fsapfsinfo 20201107

Unable to open: /dev/disk3.
libfsapfs_container_superblock_read_data: invalid object type: 0x00000000.
libfsapfs_container_superblock_read_file_io_handle: unable to read container superblock data.
libfsapfs_internal_container_open_read: unable to read container superblock at offset: 0 (0x00000000).
libfsapfs_container_open_file_io_handle: unable to read from file IO handle.
info_handle_open_input: unable to open input container.

A few years ago I had to run through the same process after HFS+ on my then brand new 5k iMac decided to mess itself up to the point of no return. All data recovery apps appear incredibly shady and back then I had settled on buying Disk Drill which allowed me to recover all my data as well as not do anything else that I didn’t approve of.
I think out of the few Mac data recovery companies that I have looked at the makers of Disk Drill appear to be one of the least shady ones. Maybe I am entirely wrong about this, who knows. The app works really well and starts out with a quick scan before it really does go through the entire drive in order to reconstruct everything.

Disk Drill Quick Scan

Disk Drill was also able to reconstruct the entire macOS filesystem structure and I was able to walk through the user folders with the owner to recover all the valuable data. It was mostly a few photos and documents like CVs, etc., which totaled around 10-20GB. He kept insisting that it was fine if he lost the data, but I had seen family pictures during the first scan which may not be valuable to him now but may become very valuable to him in the future. My goal was to recover absolutely everything.

I do believe that this method is not possible if Full Disk Encryption is enabled, so I would recommend against enabling it if your valuable data only exists on that very Mac. Companies like Backblaze will happily back your data up and store it encrypted for you.
Another option would be to have another copy on an external drive or on a local NAS which automatically backs up your Mac like a Synology. Synology is probably the correct solution for most people as it gives you commercial support if you need it. I personally opted to build my own thing and run FreeNAS which has recently been renamed to TrueNAS Core, and I really enjoy it.

I hope that this little summary of what I attempted in order to solve this problem is going to help somebody out there, even if it’s just an endorsement to buy Disk Drill to recover your data.


Boring Tech: NOCO Boost HD GB70

NOCO GB70 Image copied from

I am just going to make a bold claim here and say that 2020 was not good for the collective health of car batteries around the world. Unless you have a really fancy car or too much money, the average car battery weighs around 20 kg and is some kind of VRLA, or Valve Regulated Lead Acid, battery. It’s heavy, it’s ancient tech, but it’s really reliable and cheap to manufacture. Because it is ancient tech and discharges itself quite rapidly, compared to Lithium-Ion based chemistries for example, not driving your car leads to the battery discharging itself to a point where it may damage itself while trying to provide power output. The other very obvious problem is that your car won’t start. In order to get out of a potentially sticky situation I bought a battery jump pack in early February and it has come in handy various times already.

There are various brands available that all seem to build quality products, but I opted for the NOCO Boost HD GB70. It claims to jump start an 8 litre petrol engine or a 6 litre diesel engine 40 times. If you know a little bit about internal combustion engines, the 6 litre diesel figure is the more impressive one. It runs at 12V internally and claims to deliver up to 2000A. Based on the cables attached and the various times I have used it so far, I am inclined to believe those numbers. So far I have jump started a 1.4L petrol engine various times as well as a 3L V6 diesel once without any problems. The cars cranked over and started immediately after seeing the full 12V, though NOCO gives the disclaimer that some cars may wait 30 seconds before doing anything.

The main reason why I am recommending this to anybody, though, are the seemingly just-smart-enough-to-be-useful internals. The chips connected to the jump leads run through various little checks once connected and make sure that there is no short and that the leads are connected properly to + and - respectively. This means that you can hand this little jump pack to anybody and they would not be able to use it to damage the car’s electronics or vital components like the starter motor. There is a way to disable all the protective features, but you really have to want to turn them off before you can get yourself into trouble.

It has a regular ol' 5V ~2.1A USB-A port which can be used to charge your phone or anything else that connects via USB, as well as a 12V out connector. You may not be as familiar with the latter, but it is a commonly used plug in the automotive industry as well as in workshops. I have a soldering iron, for example, that runs off a 12V plug like this.
The only downside of this product, and why I want to explicitly mention it here, is the Micro-USB port which is used to charge the jump pack if you don’t go for a 12V fast charger. I never liked Micro-USB, but this model appears to have been on sale as-is for a couple of years, and even though I’d much prefer a USB-C port I can see why it’s on there.
The jump pack also features a very bright flashlight at the front which can run in various modes, from continuous light to a few flashing modes like SOS or police-siren-like flashing.

I really like that the clamps are made from what feels like high quality plastics and have just-sharp-enough jaws on the inside to grab on reliably without damaging anything.
I chose this jump pack in particular because it has the leads permanently attached, but it may be overkill for your needs; there are smaller and cheaper versions available for reasonable prices. Maybe this one is perfect for you, or another version may suit your needs better. I’d recommend having a look around on the NOCO website. I am usually not the “prepper” type of person, but I believe that owning something like this is very cheap insurance, especially now that the colder climate has clearly arrived in the northern hemisphere.