If Google wanted to add developer verification without being evil, it could use SSL certificates connected to domain names. I think the whole concept is ill-conceived, though I’ll admit to a modest bias against protecting people from themselves.
They couldn’t. Domains and SSL certificates can be obtained anonymously very easily, and thus wouldn’t let Google identify the developers of malicious apps, which is the stated goal of this verification scheme.
The trouble is Google’s definition of malicious apps. Are adblockers malicious? How about alternative apps for YouTube? Based on recent history, I don’t think you will be able to install those apps on the phone you purchased.
Yes, I agree. Google will use this to control the Android app ecosystem beyond the Play Store, and I don’t like it either.
You can sure as shit know that NewPipe and Smart Tube Next won’t be getting a licence. Fuck Google so fucking hard.
It provides a way to open an investigation into a malicious developer without giving Google the ability to ban anyone it doesn’t like.
Yeah, I mean some form of asymmetric signing/validation would work, but it defeats the real reason Google wants to implement this.
The problem with that is that certificates expire before someone would want to keep using the app.
Code signing certificates work a little differently than SSL certificates. A timestamp is included in the signature, so the certificate only needs to be valid at the time of signing. The executable will remain valid forever, even if the certificate later expires. (This is how it works on Windows.)
Doesn’t work; the reason they can expire is to make certificate rotation possible. If an expired SSL certificate is cracked it doesn’t matter, because no browser will accept the expired certificate. With your idea, the expired certificate just signs an app with a date of 1984 and it works.
Certificates in SSL can’t change the date because that date is signed by a certificate higher in the hierarchy.
This isn’t “my idea”, this is how the industry already does code signing. You can’t sign something with a date of 1984, because the timestamp comes from a trusted timestamping authority, and your certificate has a start and end date, usually only valid for 1 year.
You can read more about how this works here: https://knowledge.digicert.com/general-information/rfc3161-compliant-time-stamp-authority-server
https://en.wikipedia.org/wiki/Trusted_timestamping
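A rough sketch of the difference between the two validation policies (plain dataclasses with hypothetical field names, not a real X.509 implementation; the RFC 3161 details are elided):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Certificate:
    not_before: datetime
    not_after: datetime

def ssl_style_valid(cert: Certificate, now: datetime) -> bool:
    # TLS policy: the certificate must be valid *right now*, at connection time.
    return cert.not_before <= now <= cert.not_after

def code_signing_valid(cert: Certificate, signing_time: datetime) -> bool:
    # Code-signing policy: the certificate only needs to have been valid at the
    # moment of signing. The signing_time comes from a trusted timestamping
    # authority's countersignature, so the signer cannot backdate it.
    return cert.not_before <= signing_time <= cert.not_after

cert = Certificate(
    not_before=datetime(2023, 1, 1, tzinfo=timezone.utc),
    not_after=datetime(2024, 1, 1, tzinfo=timezone.utc),
)
signed_at = datetime(2023, 6, 1, tzinfo=timezone.utc)  # TSA-attested signing time
today = datetime(2026, 1, 1, tzinfo=timezone.utc)      # cert has since expired

print(ssl_style_valid(cert, today))         # False: expired for TLS purposes
print(code_signing_valid(cert, signed_at))  # True: the signature outlives the cert
```

The trusted timestamp is what makes the second policy safe: without it, a signer holding an expired (or stolen-and-revoked) certificate could claim any signing date they liked.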
Then you need a Trusted Third Party, right? Still requires some thought on how to prevent that third party from blocking applications they don’t like, but I can see how a group of trusted authorities could work.
The trusted 3rd party in this case is actually multiple 3rd parties. There are several options for trusted timestamping, just as there are multiple trusted root CAs for SSL. Since the timestamping service is free and public, anyone can use it to sign anything, even self-signed certificates. There’s no mechanism to deny access, at least for this portion.
There’s always a risk the root CAs all collude and refuse to give out certificates to people they don’t like, but at least so far this hasn’t been a problem. I don’t have a better solution, unfortunately. If we could have a 100% decentralized signing scheme that would be ideal, but I have no idea how you would build such a thing without identity verification and some inherent trust in the system.
It need only check at install time.
Correction: SSL certificates can expire before someone would want to continue being able to install any given app.
Sure, the developer needs to keep the certificate up to date and re-sign the APK on occasion.
So any APK I download will just expire at some point? That’s really annoying. Then I have to dig through the internet again just so I can reinstall the app?
Another option is to allow otherwise-valid signatures after expiration. It’s generally still possible to check them.
That completely nullifies the entire point of signature validations.
How? Expiration doesn’t grant an unauthorized party access to the private key.
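To illustrate that point: cryptographic validity and expiry are separate layers. Here’s a toy demonstration using textbook RSA with tiny primes (illustration only; real code signing uses proper RSA/ECDSA via a crypto library). Verification needs only the public key and the math; the certificate’s expiry date is policy metadata layered on top.

```python
# Toy textbook RSA with tiny primes -- NOT secure, purely illustrative.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (requires Python 3.8+)

def sign(message_hash: int) -> int:
    # "Signing" is exponentiation with the private key.
    return pow(message_hash, d, n)

def verify(message_hash: int, signature: int) -> bool:
    # Verification uses only the public key (e, n). No date is involved:
    # the math either checks out or it doesn't.
    return pow(signature, e, n) == message_hash

h = 1234 % n
sig = sign(h)
print(verify(h, sig))      # True, regardless of any expiry date on the cert
print(verify(h + 1, sig))  # False: tampering with the message breaks it
```

An expired certificate changes none of this arithmetic; what expiry protects against is continued *issuance* of new signatures with a key that may have leaked, which is exactly why the trusted signing-time timestamp matters.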
If it’s anything like how Windows does it, you would still be able to override it. It just gives you a scary warning and hides the option unless you click “more info” or something.
These two are identical for software.