Literally like 99% of social engineering attacks would be prevented this way. Seriously, make a little "hang up, look up, call back" jingle for your company. Test it frequently with phishing tests. It is possible in my opinion to make this an ingrained part of your corporate culture.
Agree that things like security keys should be in use (and given Retool's business I'm pretty shocked that they weren't), but there are other places that the "hang up, look up, call back" mantra is important, e.g. in other cases where finance people have been tricked into sending wires to fraudsters.
The ineffectiveness of "security training" is precisely why TOTP is on its way out - you couldn't even train Google employees to avoid getting compromised.
IMO most of this is because most security training I've seen is abysmal. It's usually a "check the box" exercise for some sort of compliance acronym. And, because whatever compliance frameworks usually mandate hitting lots of different areas, it basically becomes too much information that people don't really process.
That's why I really like the "Hang up, look up, call back" mantra: it's so simple. It shouldn't be a part of "security training". If corporations care about security, it should be a mantra that corporate leaders begin all company-wide meetings with. It's basically teaching people to be suspicious of any inbound requests, because in this day and age those are difficult to authenticate.
In other words, skip all the rest of "security training". Only focus on "hang up, look up, call back". Essentially all the rest of security training (things like keeping machines up to date, etc.) should be handled by automated policies anyway. And while I agree TOTP is and should be on its way out, the "hang up, look up, call back" mantra is important for requests beyond just things like securing credentials.
It's not just because it's abysmal, it's because it was found, empirically, not to work, no matter how good you make it. The mitigation you're describing is also susceptible to lapses and social engineering, just like what got them into trouble in the first place.
The simpler mitigation of 'the target employee with the Google account full of auth secrets should have had it U2F protected' would have worked even if the phone person had just read out the target's Google password to anyone who called and asked for it.
They could have enforced that with a checkbox in their GSuite admin console.
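Beyond flipping the checkbox, you can also audit who it actually covers. Here's a minimal sketch, assuming user records shaped like the Admin SDK Directory API's User resource (which exposes `isEnrolledIn2Sv` and `isEnforcedIn2Sv` fields); the function name and sample addresses are my own, and a real audit would pull records via `users.list` rather than a hardcoded list:

```python
def users_missing_2sv_enforcement(users):
    """Return email addresses of users whose 2-Step Verification is not enforced."""
    return [u["primaryEmail"] for u in users if not u.get("isEnforcedIn2Sv", False)]

# Example records, shaped like a Directory API users.list response:
sample = [
    {"primaryEmail": "alice@example.com", "isEnrolledIn2Sv": True, "isEnforcedIn2Sv": True},
    {"primaryEmail": "bob@example.com", "isEnrolledIn2Sv": False, "isEnforcedIn2Sv": False},
]
print(users_missing_2sv_enforcement(sample))  # → ['bob@example.com']
```

Note that enrolled-but-not-enforced accounts are the dangerous case: the user set up 2SV voluntarily, but nothing stops them (or an attacker resetting it) from dropping back to password-only.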
But aside from beating employees over the head with it, how many companies actually operate in a way that encourages and reinforces such an approach? I'd bet it's not many, and honestly if it's a non-zero number I'd be at least a bit surprised.
You can have all the security training in the world, but every time IT or HR or whoever legitimately reaches out to an employee, especially when it's not based on something initiated by the employee, the company is training exactly the opposite behavior Krebs is suggesting. Hanging up and calling back will likely at minimum annoy the caller and inconvenience the employee. Is the company culture accepting of that, or even better are company policies and systems designed to avoid such a scenario? If a C-suite person calls you asking for some information and you hang up and call them back, are they going to congratulate you on how diligently you are following your security training?
You're not wrong that the Krebs advice would help prevent most phishing, but I'd argue it has to be an idea you design your company around, not just a matter of security training. Otherwise you're putting the burden on employees to compensate for an insecure company, often at their own cost.
Reminds me of a situation early in my career where I was talking with the CTO about some security concern and I said "well it's all on the internal company network" and he immediately said "why on earth do you think you can trust our internal network?"
So I take it you are employed by someone that allows you to connect to nothing and change nothing? Because if you can do any of those things, your employer is clearly Doing It Wrong, based on your interpretation.
(If you happen to be local-king, flip the trust direction, it ends up in the same place.)
I’ve done 6 different versions of “security training” as well as “GDPR training” over the past few years. I think they are mostly tools for draining company money and wasting time. About the only thing I remember from any of it is when I got some GDPR answer wrong because I didn’t realize your shoe size was personal information, and it made me laugh that I had failed whatever quiz right after I had been GDPR certified by some other training tool.
If we look at the actual data, we have seen a reduction in employees who fall for phishing emails. Unfortunately we can’t really tell if it’s the training or the company story about all those millions that got transferred out of the company when someone fell for a CEO phishing scam. I’m inclined to think it’s the latter, considering how many people you can witness letting the training videos run without sound (or anyone paying attention) when you walk around on the days of a new video.
The only way to really combat this isn’t with training and awareness, it’s with better security tools. People are going to do stupid things when they’re stressed out and it’s Thursday afternoon, so it’s better to make sure they at least need an MFA factor that can’t be defeated as easily as SMS codes, MFA push spamming and so on.
To emphasize, I 100% agree with you. I'm not arguing for more security training, I'm arguing for less.
"Hang up, look up, call back". That's it. Get rid of pretty much all other "security training", which is just a box ticking exercise for most people anyway.
I also agree with the comment about better security tools, but that's why I think "hang up, look up, call back" is still important, because it teaches people to be fundamentally suspicious of inbound requests even in ways where security tools wouldn't apply.
Then I guess we agree! My mantra is to just never click on any links. Even when I know they aren’t phishing I don’t click on them.
Of course that’s probably easier for a programmer than for most other employees. I’m notoriously hard to reach unless it’s through a user story on our board that’s first been vetted by a PO. Something you probably can’t get away with if you’re not privileged enough to be in an in-demand role where they can’t just replace you with someone more compliant. I do try not to be an asshole about it, but we have soooooo many fake phishing mails and calls from our cyber awareness training that it’s gotten to the point where it’s almost impossible to get trapped by one unless you ignore things until someone shows up in person. Luckily one of my privacy add-ons in Firefox prevented me from getting caught when I actually did click one of the training links on one of those famous Thursday afternoons. So I still don’t have the “you’ve clicked a phishing link” achievement… which I’m still not sure why it exists, because now that it’s there I sort of want it, and eventually that urge is going to win.
No. If you think people at your company would fall for this, then IMO you have bad security training. The simple mantra of "Hang up, look up, call back" (https://krebsonsecurity.com/2020/04/when-in-doubt-hang-up-lo...) would have prevented this.