We did some similar work back in 2020: https://www.kryptoslogic.com/blog/2020/12/automated-string-d... I've always wanted to revisit it and add support for garble but I guess that's no longer necessary :)
We're also introducing GoStringUngarbler, a command-line tool written in Python that automatically decrypts strings found in garble-obfuscated Go binaries.
I wish Google did not have 47 separate domains it uses for blogging.
You want Google Search, Google Docs, Chrome, Android, and Google Cloud to all share the same blog? Not to mention lesser-known areas like Google Education and so forth...?
Even their security research group has at least six different outlets. They are "blogging the org chart".
googleprojectzero.blogspot.com, security.googleblog.com, cloud.google.com/blog/topics/threat-intelligence, bughunters.google.com/blog, blog.google/technology/safety-security
That tells me there isn't a single security research group, but at least six of them. Which doesn't surprise me.
Shouldn't they be blogging the org chart? When I want to follow updates, it's generally from a particular part of the org. Each group has its own separate mission and its own audience.
They should have a single blog, with tags for you to filter...
garble actually sounds like an excellent utility to add some protection around things like keys/secrets in a binary. Is there anything like this for Swift binaries?
How can you read an article on automatic DEobfuscation and think "hey I can store secrets in that"?
Obfuscation tools like these only slow down attackers; they can never stop them. Even the best in the game, where there are strong financial incentives on the line, typically fall to attackers in a matter of months.
As such, you should never use them to protect data that needs to stay secret indefinitely (or for a long time), such as keys.
That was my reasoning as well. I used to work for a company that really wanted its code obfuscated because it was terrified of corporate espionage, even though the product I worked on was just a configuration interface, the configuration was plain text files, and the application didn't do anything special, just complicated things (mobile network routing / protocols, lots of domain-specific knowledge, but as far as I know nothing secret or difficult to reproduce with enough resources).
People are somehow really convinced their thing is uniquely special and worth stealing.
I think they're more often convinced that their thing is just worth stealing. There are many such things.
Exactly this.
Apps and websites get copied all the time. Somebody throws up a duplicate with ads and steals your traffic and search rankings and customers and whatever.
Adding code to prevent your product from working when it's not on the right app/domain, and obfuscating your code to hide those checks, can sadly be necessary. It doesn't need to defeat a determined attacker, just be hard enough that they'll spend their time cloning something else instead.
I speak from experience...
There are occasions where you just want to make it a little harder to impersonate an official client, and there it can be useful to store a secret in the binary. It's still vulnerable, but extracting it requires intention and actual effort.
Might have the opposite effect, like a Streisand effect... a hacker sees that the app is mysteriously hiding a secret? That makes you want to hack it just for the challenge, even if you had no intention before.
Probably a much better solution would be to store those as environment variables. I can't think of any sane way where adding secrets to a binary would be useful unless you want to do something malicious with it.
Unless you're launching the binary via C&C infrastructure that sends remote commands to start it, I don't see how you would obtain the values to inject into environment variables.
I assume they are shipping iOS apps
But even this case doesn't make much sense. I'd expect that instead of embedding the secrets inside the binary, you'd go the more traditional route: ensure the client is logged in and keep the secrets on the server.
Unless you want your app to be used anonymously, but then why have secrets?
The use case I have encountered was for anonymous users, where the company wanted to prevent unauthorized clients (copies of the app) from relying on the same server-side HTTP API used by the official app. The point wasn't to make it impossible for an unofficial client to be used, but to make it harder than "trivial".
So the app used a digital signature / request signing with a key that was obfuscated and embedded in the binary. With anonymous users I don't know how else you could avoid use of the private API.
I am not saying that it can't be done, but I still find it a flawed solution. It probably works if your product is not really popular, but once you have anything remotely interesting and popular, you can be sure that people will be analyzing your binaries and leaking your secrets faster than you can replace them.
Sure, and those occasions are when you should realize that what you want is a bad idea, and then not do that.
Please, anyone reading this: don't do it.