To verify that the user, or someone with physical access to the device (border checks, etc.), hasn't tampered with the firmware. If I were a bank, I wouldn't want to be on the hook for someone getting their account drained by a custom ROM they downloaded from XDA.
Then there's the DRM thing, where copyright owners make companies like Netflix sign agreements along the lines of "if you don't enforce strong DRM, you can't serve our media". Their choice is to either use DRM (which in turn relies on integrity checking) or not serve the content at all. As a user, you once again have the choice of "buy the box set" or "use a smartphone with a trusted OS".
There's also the corporate use case: companies have remote wipe capabilities for data integrity purposes and don't want their employees rooting phones.
Pokemon Go used it to check whether players were spoofing their location and ruining the game for others. They were especially assholish about it, but that's hardly a surprise when Nintendo is involved anywhere in the chain.
Any game with in-app purchases wants to verify that nobody messed with the APK to get paid content for free. It's almost a basic business requirement. Combining limited-lifetime remote attestation tokens with data fetch URLs means superweatherapp-patched-luckypatcher.apk on LineageOS will not be able to pretend to be the real app (GPlay on stock Android already offers app verification APIs).
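To make the token part concrete, here's a minimal sketch of how a backend might gate content behind a short-lived attestation token. This is an illustration, not any real API: the secret, the token format, and the `MEETS_DEVICE_INTEGRITY` verdict string (borrowed from Play Integrity terminology) are all assumptions; a real deployment would verify Google's signed verdict server-side first, then mint something like this.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical server-side signing key; in reality this would live in a KMS.
SECRET = b"server-side-secret"

def issue_token(verdict: str, ttl_s: int = 300) -> str:
    """Mint a short-lived token after the attestation verdict has been checked.

    The client attaches this token to its data fetch URLs; once it expires,
    the client must re-attest to get a new one.
    """
    payload = base64.urlsafe_b64encode(
        json.dumps({"verdict": verdict, "exp": time.time() + ttl_s}).encode()
    )
    sig = base64.urlsafe_b64encode(
        hmac.new(SECRET, payload, hashlib.sha256).digest()
    )
    return (payload + b"." + sig).decode()

def verify_token(token: str) -> bool:
    """Check signature, verdict, and expiry before serving paid content."""
    try:
        payload_b64, sig_b64 = token.encode().split(b".")
    except ValueError:
        return False
    expected = base64.urlsafe_b64encode(
        hmac.new(SECRET, payload_b64, hashlib.sha256).digest()
    )
    if not hmac.compare_digest(sig_b64, expected):
        return False  # payload was tampered with, or forged without the key
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["verdict"] == "MEETS_DEVICE_INTEGRITY" and claims["exp"] > time.time()
```

The point of the short TTL is that a token extracted from a legitimate device only helps a patched APK for a few minutes; it can't be baked into superweatherapp-patched-luckypatcher.apk and reused indefinitely.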
In Google's case, "this is a physical device and not an emulator" is a strong signal that the user is not a bot pretending to be a human. In an age where CAPTCHAs are easier to solve for AI than they are for humans, that kind of verification is worth a lot.
I'm sure I'm missing a lot of use cases here, but the technology is useful. It's often used in apps and games I would never want to run on my phone anyway, except for banking apps perhaps.