The answer is obviously complex and difficult, but I think it's fair to say the current approach isn't working well.
So you need something new to address the problem. I think that the answer is government legislation.
To be clear, I think it's a terrible answer; in fact, everything I'm about to write has glaring problems. I just can't come up with a better alternative.
So software suppliers have to be held responsible for the security of the products they supply. It would obviously be a long and arduous process, and would likely have serious repercussions on the industry as it stands today.
The flip side is that you try to regulate the market for vulns to reduce their use in malware etc. You could do this by requiring companies to buy discovered vulns and requiring researchers to sell to the developers.
One of the many major issues with this approach is: what do you do about open source software? There's no money, so liability has no meaning. Here, if it's constituted into a commercial product (think of all the lovely "appliances" out there), the liability passes to the guy making the money. Where an end-user company directly uses open source, they get the liability to go with it.
Alongside this, you start slowly ramping up the security compliance requirements for companies and organisations processing transactions or personal data on the Internet.
I'd compare this to security becoming like "health & safety". I don't think most companies really want to spend money on it, and I think a huge amount of money is wasted in unneeded process, but compared to the alternative, it has been seen to be the best approach.
What does the policy that requires me to sell to vendors actually look like? I keep asking because nobody is really answering.
O.K., so if I find an RCE in Windows 10, I have to sell it to Microsoft. But if I don't like the price I'm going to get from Microsoft, why wouldn't I just not independently find RCEs in Windows 10, and instead just sell engineering services to non-Microsoft companies who want to find and exploit those vulnerabilities themselves?
Obviously, you'd want to ban that kind of service. But how would you do that? What does that ban look like? A ban on reverse engineering? How do you differentiate the kind of reversing that Tridge did to build Samba from the kind of reversing HD Moore's team does to build Metasploit? Also: you want to ban Metasploit?
Well, I'm not a legislator, and if you're looking to pick holes, trust me, you'd be able to :)
The question is: is that the best approach, and if not, what is? Once an approach is agreed, the inevitable x years of wrangling over details could occur.
You are crazy if you think legislators are going to do a better job of designing a policy to regulate vulnerability researchers than technologists will. Whatever the policy ends up being, it will be at best as effective and reasonable as whatever we come up with here.
If a bunch of technologists (including vuln researchers) can't define a reasonable policy, I think it's a pretty safe bet that there's no reasonable policy to be had.
Well, I did say at the top of this thread that the proposal had glaring problems :) I was interested to hear whether other people had alternative suggestions which would address the issue.