Responsible Disclosures and their consequences have been a disaster for the human race. Companies need to feel a lot more pain a lot more often in order for them to take the security of their customers a lot more seriously. If you just give them a month to fix an issue and spoon-feed them the solution, it's just another ticket in their backlog. But if every other security issue becomes enough news online that their CEOs are involved and a solution must be found in hours, not months, they will become a lot more proactive. Of course it's the end users that would suffer most from this. But then again, they buy ASUS so they suffer already...
By IlikeKitties a day ago
I think ASUS' turnaround time on this was quite good, I don't see the problem here. ASUS didn't deny the bug, didn't threaten to prosecute anyone for reverse engineering their software, and quickly patched their software. I have no doubt that before the days of responsible disclosure, this process would've taken months and might have involved the police.
Normal people don't care about vulnerabilities. They use phones that haven't received updates in three years to do their finances. If you spam the news with CVEs, people will just get tired of hearing about how every company sucks and become apathetic once there's a real threat.
The EU is working on a different solution. Stores are not permitted to sell products with known vulnerabilities under new cybersecurity regulations. That means if ASUS keeps fucking up, their motherboards become dead stock and stores won't want to sell their hardware anymore. That's not just computer hardware, but also smart fridges and smart washing machines. Discover a vulnerability in your dishwasher and you may end up costing the dishwasher industry millions in unusable stock if their vendors haven't bothered to add a way to update the firmware.
By jeroenhd a day ago
>They say “This issue is limited to motherboards and does not affect laptops, desktop computers”, however this affects any computer including desktops/laptops that have DriverHub installed
>instead of them saying it allows for arbitrary/remote code execution they say it “may allow untrusted sources to affect system behaviour”.
Sounds like Asus did in fact deny the bug.
By ycombinatrix 12 hours ago
> Stores are not permitted to sell products with known vulnerabilities under new cybersecurity regulations.
What are the specifics on that? Like, does the vulnerability need to be public, or is it enough if just the vendor knows about it? Does everyone need to stop selling it right away if a new vulnerability is discovered, or do they get some time to patch it? I'm pretty sure software like Windows almost definitely has some unfixed vulnerabilities that Microsoft knows about and is in the process of fixing, every single day of the year. Currently, even if they do have a fix, they would end up postponing it until the next Patch Tuesday.
And what even is "vulnerability" in this context? Remote RCE? DRM bypass?
"Stores are not permitted to sell products with known vulnerabilities under new cybersecurity regulations."
Do stores have to patch known vulnerabilities before releasing the product to customers or can customers install the patch?
By Polizeiposaune 10 hours ago
Stores don’t have the capability to do this. These aren’t car dealerships we’re talking about here, more like Walmart or Best Buy. It would take a recall/RMA or online firmware updates, both of which already exist and are widely used.
By aspenmayer 3 hours ago
"Responsible" disclosure is paradoxically named because actually it is completely irresponsible. The vast majority of corporations handle disclosures badly in that they do not fix in time (i.e. a week), do not attribute properly, do not inform their users and do not learn from their mistakes. Irresponsibly delayed limited disclosure reinforces those behaviors.
The actually responsible thing to do is to disclose immediately, fully and publicly (and maybe anonymously, to protect yourself). Only after the affected company has repeatedly demonstrated that it reacts properly might it earn the right to a very time-limited heads-up of, say, 5 work days or something.
That irresponsibly delayed limited disclosure is even called "responsible disclosure" is an instance of newspeak.
By holowoodman a day ago
I make software. If you discover a vulnerability, why would you put my tens of thousands of users at risk, instead of emailing me and having the vulnerability fixed in an hour before disclosing?
I get that companies sit on vulnerabilities, but isn't fair warning... fair?
By stavros a day ago
> why would you put my tens of thousands of users at risk, instead of emailing me and having the vulnerability fixed in an hour before disclosing
You've got it backwards.
The vuln exists, so the users are already at risk; you don't know who else knows about the vuln, besides the people who reported it.
Disclosing as soon as known means your customers can decide for themselves what action they want to take. Maybe they wait for you, maybe they kill the service temporarily, maybe they kill it permanently. That's their choice to make.
Denying your customers information until you've had time to fix the vuln is really just about taking away their agency in order to protect your company's bottom line, by not letting them know they're at risk until you can say, "but we fixed it already, so you don't need to stop using us to secure yourself, just update!"
By ang_cire a day ago
You're making an assumption that doesn't match reality - vulnerability discovery doesn't work like some efficient market. Yes, intelligence agencies and sophisticated criminal groups might find 0-days, but they typically target selectively, not deploying exploits universally.
The real threat comes from the vast number of opportunistic attackers who lack the skills to discover vulnerabilities themselves but are perfectly capable of weaponizing public disclosures and proof-of-concepts. These bottom-feeders represent a much larger attack surface that only materializes after public disclosure.
Responsible disclosure gives vendors time to patch before this larger wave of attackers gets access to the vulnerability information. It's not about protecting company reputation - it's about minimizing the window of mass exploitation.
Timing the disclosure to match the fix release is actually the most practical approach for everyone involved. It eliminates the difficult choice customers would otherwise face - either disrupt their service entirely or knowingly remain vulnerable.
Most organizations simply can't afford the downtime from abruptly cutting off a service, nor can they accept the risk of continuing with a known vulnerability. Providing the fix simultaneously with disclosure allows for orderly patch deployment without service interruption.
This coordinated approach minimizes disruption while still addressing the security issue - a balanced solution that protects both the security and continuity needs of end users.
By renmillar a day ago
I understand the arguments for the current system, I just don't agree that disruption is worse than loss of agency. Your position inevitably ends up arguing for a paternalistic approach, as you are when you say
> It eliminates the difficult choice customers would otherwise face - either disrupt their service entirely or knowingly remain vulnerable.
You decided they are better off not having to make that choice, so you make it for them whether they like it or not.
In fact, you made the worst choice for them, because you chose that they'd remain unknowingly vulnerable, so they can't even put in temporary mitigations or extra monitoring, or know to be on the lookout for anything strange.
> Most organizations simply can't afford the downtime from abruptly cutting off a service, nor can they accept the risk of continuing with a known vulnerability.
Now this is an interesting part, because the first half is true depending on the service, but bad (that's a BCDR or internet outage issue waiting to happen), and the second half is just wrong (show me a company that doesn't know and accept that they have past-SLA vulns unpatched, criticals included, and I'll show you a company that's lying either to themselves or their customers).
> This coordinated approach minimizes disruption while still addressing the security issue - a balanced solution that protects both the security and continuity needs of end users.
This is not a balanced approach, this is a lowest-common-denominator approach that favors service providers over service users. You don't know if it protects someone's security needs, because people have different security needs: a journalist being targeted by a state actor can have the same iPhone as someone's retired grandma, or infotainment system, or home assistant, etc.
I've managed bug bounty and unpaid disclosure programs, professionally, and I know firsthand that it's the company's interests that responsible disclosure serves, first and foremost.
By ang_cire a day ago
Let's imagine you found out how to steal funds from a bank. The best thing is to let them know that you are concerned (as a customer) for the safety of your own funds.
If they do nothing after a reasonable amount of time, escalate to regulators or change banks. Then once they release information that some processes have been changed: "thanks to XXX working at YYY for helping us during it". You win, they win, clients win, everybody wins.
Unwanted public disclosure directly leads to public exploitation; there is nothing good at all about it.
For example, there is an RCE in Discord (statistically all but certain due to the rendering engine, just not public yet), and it is going to be exploited only if someone shares the technical details.
If you don’t disclose it, it’s not like someone else will discover it tomorrow. It’s possible, but not more likely than it was yesterday. If you disclose it, you make sure that everybody with malicious intent knows about it.
By rvnx a day ago
Increasing the chance of a bad actor actually doing something with a vulnerability seems bad, actually. You're effectively shifting responsibility to consumers, who are probably not going to see a CVE for one of the dozens of pieces of software they use every day.
By fastball a day ago
> You're effectively shifting responsibility to consumers, who are probably not going to see a CVE for one of the dozens of pieces of software they use every day.
Which is again, a problem created by the companies themselves. The way this should work is that the researcher discloses to the company, and the company reaches out to and informs their customers immediately. Then they fix it.
But instead companies refuse to tell their customers when they're at risk, and make it out to be the researchers that are endangering people, when those researchers don't wait on an arbitrary, open-ended future date.
> Increasing the chance of a bad actor actually doing something with a vulnerability seems bad, actually.
Unless you know who knows what already, this is unprovable supposition (it could already be being exploited in the wild), and the arguments about whether PoC code is good or bad are well trodden, and cover this question.
You are just making the argument that obscurity is security, and it's not.
By ang_cire a day ago
I disagree. The vast majority of script kiddies don't know about the zero day.
Instead of just one bad actor using that vulnerability on a few select targets, your proposal will have tens of thousands of bots performing drive-by attacks on millions of victims.
By lelanthran 19 hours ago
I think one point being made is that (in this example) you would've been much less careless about shipping the vulnerability, if you knew you'd be held accountable for it.
With current practice, you can be as sloppy and reckless as you want, and when you create vulnerabilities because of that, you somehow almost push the "responsibility" onto the person who discovers it, and you aren't discouraged from recklessness.
Personally, I think we need to keep the good part of responsible disclosure, but also phase in real penalties for the parties responsible for creating vulnerabilities that are exploited.
(A separate matter is the responsibility of parties that exploit the vulnerabilities. Some of those may warrant stronger criminal-judicial or military responses than they appear to receive.)
Ideal is a societal culture of responsibility, but in the US in some ways we've been conditioning people to be antisocial for decades, including by elevating some of the most greedy and arrogant to role models.
By neilv a day ago
> you would've been much less careless about shipping the vulnerability, if you knew you'd be held accountable for it
I have a problem with this framing. Sure, some vulnerabilities are the result of recklessness, and there’s clearly a problem to be solved when it comes to companies shipping obviously shoddy code.
But many vulnerabilities happen despite great care being taken to ship quality code. It is unfortunately the nature of the beast. A sufficiently complex system will result in vulnerabilities even a careful person could not have predicted.
To me, the issue is that software now runs the world, despite these inherent limitations of human developers and the process of software development. It’s deployed in ever more critical situations, despite the industry not having well defined and enforceable standards like you’d find in some engineering disciplines.
What you’re describing is a scenario that would force developers to just stop making software, on top of putting significantly more people at risk.
I still believe the industry has a problem that needs to be solved, and it needs a broad culture shift in the dev community, but disagree that shining a bright light on every hole such that it causes massive harm to “make devs accountable” is a good or even reasonable solution.
By haswell 21 hours ago
>What you’re describing is a scenario that would force developers to just stop making software, on top of putting significantly more people at risk.
Good. I work in code security/SBOM, the amount of shit software from entities that should otherwise be creating secure software should worry you.
Businesses care very little about security and far more about pushing the new feature fast. And why not, there is no real penalty for it.
By pixl97 19 hours ago
I think that culture shift will have to come from the top in business -- the CEO and the board.
At this point, the software development field is about operating within the system decided by those others, with the goal of personally getting money.
After you've made the CEO and board accountable, I think dev culture will adapt almost immediately.
Beware of attempts to push engineering licensing or certifications, etc. as a solution here. Based on everything we've seen in the field in recent decades, that will just be used at the corporate level as a compliance letter-but-not-spirit tool to evade responsibility (as well as a moat to upstart competitors), and a vendor market opportunity for incompetent leeches.
First you make CEO and board accountable, and then let the dev culture change, and then, once you have a culture of people taking responsibility, then you'll have the foundation to add in licensing (designed in good faith) as an extra check on that, if that looks worthwhile.
By neilv 20 hours ago
> A sufficiently complex system will result in vulnerabilities even a careful person could not have predicted.
I think as a field we're actually reasonably good at quantifying most of these risks and applying practices to reduce the risk. Once in a blue moon you do have "didn't see that coming" cases but those cause a very minor part of the damage that people suffer because of sw vulnerabilities. Most harm is caused by classes of vulnerabilities that are boringly pedestrian.
By fulafel 18 hours ago
The problem with a fair warning is that once I email you such a warning, I'll never be able to anonymously publish it, no matter how much you ignore the report. Then the fair thing becomes that I never go public, because I'm confident you'll call lawyers.
By technion a day ago
So can't you disclose it anonymously? I'm pretty sure most people who are savvy enough to find zero-days know how to get an email address anonymously.
By SahAssar a day ago
All I'll say is: try it in practice. You'll quickly find it dismissed as "not professional" and people will quickly claim it's "irresponsible" for that reason.
By technion a day ago
Can't you just send it from an anonymous email?
By frainfreeze a day ago
Because there is an information disparity I could profit from instead of doing free work for you. Even if that disparity is just "posting the vuln to my blog" to get e-famous.
By beeflet 9 hours ago
You made the software, you have your paid customers, you are responsible for security of your customers. If you have an RCE that's your problem and you gotta fix it.
By v3ss0n 2 hours ago
According to the post above, if you've earned enough reputation then you might be given that one-hour window for fixing before disclosure. The issue isn't so much whether or not there should be a "private" window but how long it lasts, especially when the vendor is a multi-billion-dollar company.
By rakoo 21 hours ago
Let’s not forget the end users in this scenario, who will not be able to react to this as quickly as a billion dollar company regardless of how well they notify their customers.
By haswell 20 hours ago
Absolutely, which is yet another reason why this abstraction from the conditions of creation of anything tech-related is something that should be eliminated
By rakoo 19 hours ago
An hour, sure. Frequently companies sit on it for months.
By JonChesterfield a day ago
Yes but responsible disclosure should be "you have a week (or whatever) from my first email, then I go public".
By stavros a day ago
What if the vulnerability cannot be easily fixed within the week, even if the company stops all work and focuses completely on the problem?
If the reason for responsible disclosure is to ensure that no members of the public are harmed as a result of said disclosure, should it not be a conversation between the security researcher and the company?
The security researcher should have an approximate idea of how or what to do to fix it, and give a reasonable amount of time for a fix. If the fix ought to be easy, then a short time should suffice, and vice versa.
By chii a day ago
Many types of vulnerabilities cannot be resolved in one hour. Some require complex thought to resolve.
One hour is absurd for another reason, what timezone are you in? And they? What country, and therefore, is it a holiday?
You may say "but vulnerability", and yes. 100% no heel dragging.
But not all companies are staffed with 100k devs, and a few days or a week is a balance between letting every script kiddie know and the potential that it may already be being exploited in the wild.
If one is going to counter unreasonable stupidity, use reasonable sensibility. One hour is the same as no warning.
By bbarnett a day ago
Strange wording. You are the one that put tens of thousands of your users at risk. Not the one who discovers the problem.
By efdee 21 hours ago
If you forget your shop's door open after hours, and someone starts shouting "HEY GUYS! THIS DOOR IS OPEN! LOOK!", I have a hard time putting 100% of the blame on you.
By stavros 20 hours ago
If I point out the bridge is cracking and you get angry about it, I'm blaming the idiots that engineered a crap bridge and didn't maintain it.
Maybe it's time we get professional standards if this is how we are going to behave?
By pixl97 19 hours ago
Fair warning through "responsible" disclosure was abused again and again and again. Why should I trust company no. 1000 after 999 have misled bug reporters, the public, their customers and the rest of the world about their own "just an hour"?
By holowoodman 18 hours ago
You already put your tens of thousands of users at risk. It's the people putting bugs in the software, not the ones discovering them.
By cenamus 20 hours ago
Please enlighten me on how you've managed to never write any bugs.
By stavros 19 hours ago
Didn't say that. But I can't blame the ones publicising the bugs we put in there.
By cenamus 3 hours ago
Well, not sure DJB posts here, but he has kept it to a minimum.
And this is mostly BS too. People don't write bug free software, they write features.
Other industries had to license professional engineers to keep this kind of crap from being a regular issue.
By pixl97 19 hours ago
That's because nobody actually cares about security, nor do they want to pay for it. I'm a security champion at my company, and security-related work gets pushed off as much as possible to focus on feature work. If we actually wanted security to be a priority, they would employ security champions whose only job was to work on the security aspects of the system, instead of trying to balance security and feature work, because feature work will always prevail.
By giantg2 a day ago
It's such a loaded term that I refuse to use it. "vendor-coordinated disclosure" is a much better term, imho
(and in the world of FOSS you might have "maintainer-coordinated" too)
By Retr0id 21 hours ago
What about damage control? I would argue your "anonymous, immediate disclosure" to the public (filled with bad actors) would be rubbing salt in the wound (allowing more people to exploit the vulnerability before it's fixed). That's why nobody publishes writeups before the vuln is fixed. Even if corporations don't fix vulns in time, I can only see harm being done from not privately reporting them.
By rfl890 19 hours ago
>I can only see harm being done from not privately reporting them
Because you need to take a look at the fuller picture. If every vuln was published immediately the entire industry would need to be designed differently. We wouldn't push features at a hundred miles per hour but instead have pipelines more optimized for security and correctness.
There is almost no downside currently for me to write insecure shit; someone else will debug it for me and I'll have months to fix it.
By pixl97 18 hours ago
I mean, to be a bit more reasonable, there's a middle ground here. Maybe disclosing a massive RCE vulnerability in software used by a lot of companies on the 25th of December is not a good idea. And perhaps an open source dev with a security@project mail deserves a tad more help and patience than a megacorp with a record of shitty security management. And if you are a company that takes security seriously and is responsive to security researchers' inquiries, you deserve at least the chance to fix it fast and before it becomes public.
It's just that there are some companies EVERYONE knows are shitty. ASUS is one of them.
By IlikeKitties a day ago
You are right about open source developers who do this on the side, as a hobby, and who, even if they don't, are usually underpaid and understaffed. They do deserve more time and a different approach.
But corporations making big bucks from their software need to be able to fix things quickly. They took money for their software, so it is their responsibility. If they cannot react on a public holiday, tough luck. Just look at their payment terms. Do they want their money within 30 days or 25 work days? Usually it is the former, they don't care about your holidays, so why should anyone care about theirs? Also, the bad guys don't care about their victims' holidays. You are just giving them extra time to exploit. The only valid argument would be that the victims might not be reading the news about your disclosure on a holiday. But since you are again arguing about software used by a lot of companies (as opposed to private users), I don't see a problem there. They also have their guards on duty and their maintenance staff on call for a broken pipe or something.
What's most important is that I'm saying we should revert the "benefit of the doubt". A vast majority of corporations have shitty security handling. Even the likes of Google talk big with their 90-day window from private irresponsible disclosure to public disclosure. And even Google regularly fails to fix things within those 90 days. So the default must be immediate, public and full disclosure. Only when companies have proven their worth by correctly reacting to a number of those can they be given the "benefit of the doubt" and a heads-up.
Because otherwise, when the default is irresponsible private disclosure, they will never have any incentive to get better. Their users will always be in danger unknowingly. The market will not have information to decide whether to continue buying from them. The situation will only get worse.
By holowoodman a day ago
> But corporations making big bucks from their software need to be able to fix things quickly. They took money for their software, so it is their responsibility. If they cannot react on a public holiday, tough luck.
Because it is not corporations who are reacting on public holidays, but developers, who are human beings.
It is not corporations that are reacting to install patches on a Friday, but we sysadmins, who are human beings.
By throw0101d a day ago
Companies will act out of greed and use their customers and developers as "human shields" to get out of their responsibility. Your on-call duty should be paid by the hour just as any duty, doubling the pay on weekends, holidays and nights. "But the poor developers" is just the "we will hurt this poor innocent puppy"-defense. The evil ones are the ones inflicting the hurt, the greedy companies. Not the reporters.
By holowoodman 18 hours ago
Overall, I share your reasoning and would mostly concur, but there are some rather important caveats, especially regarding this one:
> The only valid argument would be that the victims might not be reading the news about your disclosure on a holiday. But since you are again arguing about software used by a lot of companies (as opposed to private users), I don't see a problem there.
Let's say MegacorpA is a big software vendor that makes some kind of software other companies use to manage some really sensitive user data. Even if MegacorpA fixes their stuff on the 25th, 2 hours after they got an e-mail from you, all their clients might not react that fast, and thus a public disclosure could cause massive harm to end users, even if MegacorpA did everything right.
Ultimately, I guess my argument is that there's not a one size fits all solution. But "responsible disclosure" should be reserved for companies acting responsibly.
By IlikeKitties a day ago
> "Responsible" disclosure is paradoxically named because actually it is completely irresponsible.
It's only paradoxical if you've never considered the inherent conflicts present in everything before.
The "responsible" in "responsible disclosure" relates to the researchers responsibility to the producer, not the companies responsibility to their customers. The philosophical implication is that the product does what it was designed to do, now you (the security researcher) is making it do something you don't think it should do, and so you should be responsible for how you get that out there. Otherwise you are damaging me, the corporation, and that's just irresponsible.
As software guys we probably consider security issues a design problem. The software has a defect, and it should be fixed. A breakdown in the responsibility of the corporation to their customer. "Responsible disclosure" considers it external to the software. My customers are perfectly happy, you have decided to tell them that they shouldn't be. You've made a product that destroys my product, you need to make sure you don't destroy my product before you release it.
The security researcher is not primarily responsible to the public, they are responsible to the corporation.
It's not a paradox, it's just a simple inversion of responsibility.
By delusional a day ago
> The security researcher is not primarily responsible to the public, they are responsible to the corporation.
Unless the researcher works for the corporation on an in-house security team, what’s your reasoning for this?
Why are they more responsible to the corporation they don't work for than to the people they're protecting (depending on the personal motivations of the individual security researcher, I guess)?
By einsteinx2 a day ago
With "simple reversion of responsibility" do you mean your twisted logic of "everyone should think first and foremost about my profits"?
By drowsspa a day ago
The problem is just one of legislating liability. Car manufacturers are ordered to recall and fix their cars, but software/hardware companies face far too little pressure. I think customers should be able to get a full refund for broken devices (with an unfixed CVE, for example).
By oezi a day ago
The devices and core functionality (including security updates, which are fixes to broken core functionality) must survive the manufacturer and should not require ongoing payments of any type* (*new updates being created? Maybe. Access to corrections of basic behavior? Bug/security fixes should remain free.)
By mjevans 20 hours ago
Yes. I would envision at least 5 years of such update fixes, and another 5 years available for purchase, capped at 20% of the device price.
All manufacturers must pay an annual fee to an insurance scheme which covers the case of insolvency of manufacturers.
By oezi 18 hours ago
Citing CGPGrey: Solutions that are the first thing you can think of are terrible and ineffective.
Good safety/security culture encourages players to not hide their problems. Corporations are greedy bastards. They'll do everything to hide their security mistakes.
You are also making legitimate, fixable-in-a-month issues available to everyone, which greatly increases their chances of being exploited.
By okanat a day ago
> You are also making legitimate, fixable-in-a-month issues available to everyone, which greatly increases their chances of being exploited.
I don't think you can fathom the number of people who have phones with roughly 3 years of no Android updates as their primary device, with which they use all the digital services they use: banking, texting, doomscrolling, porn, ...
Users, especially the most likely to be exploited are already vulnerable to so much shit and even when there's a literal finished fix available, these vendors do shit about it. Only when their bottomline is threatened because even my mom knows "Don't buy anything with ASUS on it, your bank account gets broken into if you do" will we see change.
By IlikeKitties a day ago
> I don't think you can fathom the number of people who have phones with roughly 3 years of no Android updates as their primary device, with which they use all the digital services they use: banking, texting, doomscrolling, porn, ...
I do. I'm an embedded software developer in a team that cares about having our software up-to-date a lot.
> Users, especially the most likely to be exploited are already vulnerable to so much shit and even when there's a literal finished fix available, these vendors do shit about it. Only when their bottomline is threatened because even my mom knows "Don't buy anything with ASUS on it, your bank account gets broken into if you do" will we see change.
Yes, individuals are quite exploitable. That's why I really like the EU's new regulations, the Cyber Resilience Act and the new Radio Equipment Directive. When governments enforce reasonable disclosure and fixing timelines, and threaten your company's ability to sell things in a market altogether if you don't comply, it works wonders. Companies hate not being able to make money. So all the extra security policies and vulnerability tracking we have been experimenting with, and secure-by-default languages, are now the highest priority for us.
EU regulation makes sure that you're not going to be sold a router that's instantly hackable within a year. It will also force chip manufacturers to have meaningful maintenance windows, like 5-10 years, due to pressure from ODMs. That's why you're seeing all the smartphone manufacturers extend their support timelines; it is not pure market pressure. They didn't give a fuck about it for more than 10 years. When the EU came with a big stick, though...
Spreading word-of-mouth knowledge works up to a point. Having your entire product line banned from entering a market works almost every time.
By okanat a day ago
The fact about people running outdated OS versions is totally true, but it also indicates that the risk of being vitally harmed by those vulnerabilities is quite low in reality, if you’re not an individually targeted person. And that’s why not a lot of people care about them.
By layer8 20 hours ago
In this day and age, you're just as likely to be targeted by a large-scale ransomware operation that just happens to find your vulnerable device by network scanning, for example.
By int_19h 7 hours ago
I'm not sure that's a great example, as they would be vulnerable to many responsibly disclosed and previously fixed issues anyway, since they never update.
In fact they would be just as vulnerable to any new responsibly disclosed issues as they would if they were immediately “irresponsibly” disclosed because again, they never update anyway.
By einsteinx2 a day ago
> Good safety/security culture encourages players to not hide their problems. Corporations are greedy bastards. They'll do everything to hide their security mistakes.
This is why I despise the Linux CNA for working against the single system that tries to hold vendors accountable. Their behavior is infantile.
By Avamander a day ago
Business idea. Maybe this already exists. A disclosure aggregator/middle man which:
- protects the privacy of folks submitting
- vets security vulns. Everything they disclose is exploitable.
- publishes disclosures publicly at a fixed cadence.
- allows companies to pay to subscribe to an "early feed" of disclosures which impact them. This money is used to reward those submitting disclosures, pay the bills, and take some profit.
A bug bounty marketplace, if you will. That is slightly hostile to corporations. Would that be legal, or extortion?
By hamandcheese a day ago
I've thought of something along these lines before, too.
I think there is serious potential for this.
By hashstring 17 hours ago
It does indeed already exist in many sectors as trade publications and journalism.
By ajcp 21 hours ago
Isn't that basically HackerOne?
By darkwater a day ago
No, HackerOne gets paid by the companies, so they're heavily incentivized to work for their benefit.
I've had three such bad experiences with unskilled H1 triagers that the next vuln I find in a company that uses H1 will go instantly public. I'm never going to spend that much effort again just to get a triager to actually bother to triage.
By Avamander a day ago
Except there you spend several months walking an underpaid person in India who can barely use a shell through the reproduction steps, get a confirmation after all that work, and the vendor still ignores you.
By asmor a day ago
HackerOne, BugCrowd, et al don't appear to make any serious effort to vet reports themselves.
By xmodem a day ago
Is that true? I thought you could pay for an H1 service that basically had professionals triaging the vulnerabilities and only passing on the correct ones?
By hashstring 17 hours ago
Our company pays for one of these third party triage services for H1.
The quality is seriously lacking. They have dismissed many valid findings.
By ycombinatrix 12 hours ago
As I keep saying, liability like in any other industry.
Most folks don't put up with faulty products unless by choice, like those 1 euro/dollar shops, so why should software get a pass?
By pjmlp a day ago
Or we could just have regulation or at least the same product liability for software as everything else.
By fulafel 20 hours ago
> a disaster for the human race.
This is a prime example where a hyperbole completely obliterates the point one is trying to make.
By curiousgal a day ago
> This is a prime example where a hyperbole completely obliterates the point one is trying to make.
Just post it the next day, when found. That will be the proper incentive, and losing face also contributes to better security next time.
By lofaszvanitt 16 hours ago
> I asked ASUS if they offered bug bounties. They responded saying they do not, but they would instead put my name in their “hall of fame”. This is understandable since ASUS is just a small startup and likely does not have the capital to pay a bounty.
:(
By Gys a day ago
It's understandable for such small companies, like Cisco, which does the same for the myriad of online offerings they've acquired over the years.
Cisco have gone even further, by forgetting about their security announcements page, so any recognition is now long lost into the void.
When I reported something, and this was probably around 8 years ago, they only had bounties for their equipment, not for "online properties".
I reported a vulnerability in some HR software they owned, but alas I can't even find where it used to live on the internet now.
By eterm 20 hours ago
The 2 that are live there definitely cover software (one doesn't deal in hardware at all).
By ang_cire 15 hours ago
No bug bounty? Onto the black market the exploit goes.
That, or full public disclosure.
By Xelbair a day ago
Maybe something for Gamers Nexus to light a fire under.
By hypercube33 21 hours ago
I wonder how worried they would get if more people actually started selling exploits on the black market, instead of reporting and not getting a bug bounty. If you don’t offer a bug bounty program in the first place, my gut feeling is that they probably wouldn’t care in that case either. Either way, this is a super good reason to not do business with such a company.
By LadyCailin 21 hours ago
I wonder if centralized "sell program vulnerabilities here" government shops could be set up.
While intelligence agencies are an obvious beneficiary, this would also give the government leverage over capital.
By NooneAtAll3 7 hours ago
If the fire is lit under them, after their software leads to a widespread hack, they will care.
That's the point: to put pressure on them to CARE.
By Xelbair 21 hours ago
> Asus is just a small startup
I'm not sure where they got that from; Asus have been making motherboards and other PC parts since at least the 90s...
This makes me never want to buy another ASUS product again.
By throaway920181 a day ago
For me it's them lying about providing a way to unlock the bootloader of my soon-to-be 1000€ paperweight (2 Android updates only) called an Asus Zenfone 10.
By pohuing a day ago
If they actually lied about it, that kind of money could be worth it to take them to (whatever your local equivalent of) small claims court over.
By jeroenhd a day ago
I'm in Germany, which makes it a bit harder. Someone in the UK went through the trouble, and all they got was an offer for a refund or an insanely overpriced option to downgrade the OS, IIRC.
About the lie: up to a year ago, they repeated multiple times that this would be an option...
Out of curiosity, what got you to spend 1000 euros on a Zenfone 10 when the Samsung S23 was net superior and cheaper and provides like 5 years of updates? It's not like previous phones from Asus had a better track record. I kept warning people to stay away from the Zenfone, yet the online community kept overhyping it for some reason as the second coming of Christ or something.
By FirmwareBurner a day ago
What campl3r said. I tried the dongle approach when the jack in my Pixel 4a was failing, but found I didn't like it. Having the cable go out the bottom in the center is a terrible place for me, as I rest my phone on my outstretched pinky. The Zenfone ticked all the boxes on paper and in reviews.
Great chipset, solid build, a form factor fitting my tiny hands (though in retrospect it's so heavy that my pinky hurts after a couple hours of reading).
And a headphone jack which I use to plug my phone in my stereo and my Sennheiser headphones. Really the jack is the primary reason I got this phone.
Coupled with the fact that until now all Zenfones had a hassle-free bootloader unlock and a decent ROM community, it really was the best choice on paper.
God damn it Asus, I wasn't aware they're that dodgy :/
By pohuing 15 hours ago
Zenfone is smaller and has a headphone jack. It's the superior phone
By campl3r a day ago
It is virtually the same size[1] as the era-equivalent S23.
I don't think a headphone jack, which you can get via a super cheap USB-C adaptor, justifies a 1000 euro paperweight.
>so I could see if anyone else had a domain with driverhub.asus.com.* registered. From looking at other websites certificate transparency logs, I could see that domains and subdomains would appear in the logs usually within a month. After a month of waiting I am happy to say that my test domain is the only website that fits the regex, meaning it is unlikely that this was being actively exploited prior to my reporting of it.
This only remains true insofar as no one directly registered a driverhub subdomain. Anyone with a wildcard could have exploited this, silent to certificate transparency?
By antmldr a day ago
A wildcard certificate is only for a single label level: '*.example.com.' would not allow 'test.test.example.com.', but would allow 'test.example.com.'. If someone issued a wildcard for '*.asus.com.example.com.', they could present a webserver under 'driverhub.asus.com.example.com.' and be seen as valid.
By ZoneZealot a day ago
Yes... I believe you've successfully reworded what your comment's parent said.
By throaway920181 a day ago
Parent comment is making a point that it might have been possible for an attacker to avoid discovery via certificate transparency logs, because anyone 'with a wildcard' could pull off the attack, which is not correct.
I'm pointing out that a wildcard at the apex of your domain (which is what basically everyone means when saying 'a wildcard'), would not work for this attack. Instead if you were to perform the attack using a wildcard certificate, it would need to be issued for '*.asus.com.example.com.' - which would certainly be obvious in certificate transparency logs.
By ZoneZealot a day ago
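To make that single-label rule concrete, here is a minimal Python sketch; the wildcard_matches helper is hypothetical, written only to illustrate the matching behaviour described above:

    # Single-label wildcard matching as described above:
    # '*' stands in for exactly one DNS label, never several.
    def wildcard_matches(pattern: str, hostname: str) -> bool:
        p_labels = pattern.lower().rstrip(".").split(".")
        h_labels = hostname.lower().rstrip(".").split(".")
        if len(p_labels) != len(h_labels):
            return False  # a wildcard never spans multiple labels
        return all(p == "*" or p == h for p, h in zip(p_labels, h_labels))

    assert wildcard_matches("*.example.com", "test.example.com")
    assert not wildcard_matches("*.example.com", "test.test.example.com")
    assert wildcard_matches("*.asus.com.example.com", "driverhub.asus.com.example.com")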
Can you still publicly apply for a “*.*.mydomain.com” certificate? IIRC a wildcard cert starting with “*.*.” allows you to chain 2+ names with that cert, I think? (E.g.: “*.*.example.com” cert would match “hello.world.and.hi.com.example.com”)
By smileybarry 17 hours ago
With public CAs, you can only apply for a wildcard at a single label. You can't have nested wildcards.
I don't know of any CA that allows for wildcard characters within the label, other than when the whole label is a wildcard, but it is possible under that RFC.
> Wildcard Certificate: A Certificate containing at least one Wildcard Domain Name in the Subject Alternative Names in the Certificate.
> Wildcard Domain Name: A string starting with “*.” (U+002A ASTERISK, U+002E FULL STOP) immediately followed by a Fully‐Qualified Domain Name.
Now of course with your own internal CA, you have completely free rein to issue certificates, as long as they comply with the technical requirements of your software (i.e. webserver and client).
Also note that a cert issued as '*.*.example.com.' would only match 'hi.com.example.com.', not an additional three labels.
By ZoneZealot 14 hours ago
I think the point is that it wouldn't be silent to certificate transparency, because having a certificate for *.asus.com.example.com would be a clear indication of something suspicious
By a2128 a day ago
Nice idea, just checked it now and can confirm there was nothing suspicious in the wildcard records.
By MrBruh a day ago
Furthermore:
- Would a self-signed cert work? Those aren’t in transparency logs.
- Does it have to be HTTPS?
By kstrauser 7 hours ago
You're right about the wildcard certificate blind spot. An attacker with a wildcard cert for *.example.com could have exploited this without appearing in CT logs specifically for driverhub.asus.com.* domains. This is why CT log monitoring alone isn't sufficient for detecting these types of subdomain takeover vulnerabilities.
By ethan_smith a day ago
It's 'driverhub.asus.com.example.com.' not 'driverhub.example.com.', therefore entirely discoverable in CT logs by searching for (regex): (driverhub|\*)\.asus\.com\.
By ZoneZealot a day ago
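For illustration, a small Python sketch of running that regex over certificate names as they might appear in a CT log dump; all names except the researcher's own test domain are made up:

    import re

    # The regex from the comment above: a name is suspicious if it is
    # 'driverhub.asus.com.<attacker-domain>' or a wildcard covering it.
    pattern = re.compile(r"(driverhub|\*)\.asus\.com\.", re.IGNORECASE)

    names = [
        "driverhub.asus.com.mrbruh.com",  # the researcher's own test domain
        "*.asus.com.evil.example",        # hypothetical: a wildcard that would also match
        "*.evil.example",                 # apex wildcard: does NOT cover the target
        "driverhub.example.com",
    ]
    for name in names:
        print(name, "->", "suspicious" if pattern.match(name) else "ok")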
> MY ONBOARD WIFI STILL DOESN’T WORK, I had to buy an external USB WiFi adapter. Thanks for nothing DriverHub.
All this, for literally nought
By satyanash a day ago
It's a nice blogpost though.
By Avamander a day ago
The latest wifi drivers don't work, you have to use an older version.
By ThrowawayTestr a day ago
I asked ASUS if they offered bug bounties. They responded saying they do not, but they would instead put my name in their “hall of fame”. This is understandable since ASUS is just a small startup[1] and likely does not have the capital to pay a bounty.
I'm surprised to find that this is just a random person's blog. I was fully prepared for an ad page, a scalped domain, or some corporation trying to make money out of it. On the sadder side, it doesn't seem like this person makes any use of the domain's name at all; they could have had firstlast.cctld for their blog and given this one to someone who wants to put a sarcastic joke on it. But better this than ad farms, so I don't blame them for keeping it!
By lucb1e 9 hours ago
> When submitting the vulnerability report through ASUS’s Security Advisory form, Amazon CloudFront flagged the attached PoC as a malicious request and blocked the submission.
> This is understandable since ASUS is just a small startup.
A small startup with a market cap of only 15B. What would be more than understandable is to give a shit, not only about your crappy products but also about the researcher who did HUGE work for your customers.
I truly feel bad for researchers doing this kind of work only to get dismissed/trashed like this. So unfair.
The only thing that ought to be done is to not purchase ASUS products.
By liendolucas a day ago
MY ONBOARD WIFI STILL DOESN’T WORK, I had to buy an external USB WiFi adapter. Thanks for nothing DriverHub.
I feel sorry for this guy, having been derailed from the original issue. Though it would only have taken a couple of seconds to note the WLAN chipset from the specs or OEM packaging and then head to station-drivers.
This is also the very reason I dislike Asus: I don't want a BIOS flag/switch that natively interacts with a component in the OS layer.
By cobalt60 20 hours ago
Wow, no bug bounty is insane. No more ASUS products for me...
By IshKebab a day ago
they are a "small startup"
By _pdp_ a day ago
They have over 14500 employees. I wouldn't call that small.
By charcircuit a day ago
You missed a small amount of sarcasm there
By Pesthuf a day ago
"Small" as in "small startup"
By 7bit a day ago
I think that was exactly the joke.
By stavros a day ago
Both Asus software and customer support are atrocious and always have been.
"ASUS emailed us last week (...) and asked if they could fly out to our office this week to meet with us about the issues and speak "openly." We told them we'd be down for it but that we'd have to record the conversation. They did say they wanted to speak openly, after all. They haven't replied to us for 5 days. So... ASUS had a chance to correct this. We were holding the video to afford that opportunity. But as soon as we said "sure, but we're filming it because we want a record of what's promised," we get silence."
Edit: formatting
By sigmaisaletter a day ago
So are there any "basically respectable" motherboard manufacturers? Or is there a similar story about each of the big players?
Asking for a friend who is thinking about building a new PC soon.
By jeffparsons a day ago
Asrock (sub-brand of Asus but seemingly independent in the product and dev side) has been fine for me over the ~10 years I've bought their mobos. There was the thing a few months ago with X870 mobos that were apparently frying CPUs, but I think that was not sufficiently proven to be their fault?
That said, in their X670 / B650 they have the same setting as what this article is about, and it could be equally as broken on the software side as Asus's is, but I wouldn't know because I don't use Windows so I disabled it.
By Arnavion 17 hours ago
Asus and AsRock have been separate since 2010.
By oynqr 5 hours ago
Its new owner since 2010 is still part of the Asus group, but sure it's technically a different company from Asus proper.
By Arnavion 5 hours ago
All the consumer brands are pozzed. My last build (i7-14700K) used an MSI board. Their Secure Boot is still broken. The BIOS setup is a complete mess, and all the settings are reset after a BIOS update. I have to unplug and replug my USB keyboard after a poweroff, or it doesn't work. But I insisted on a board without RGB lights, and that limited the selection. Computers are over.
By encom a day ago
There really needs to be an open source project for a PC motherboard.
By ribcage a day ago
Just a few days ago people were talking about this on the KiCad Discord. A Chinese team made an open hardware x86_64 motherboard and published it not too long ago. Then they were essentially wiped off the face of the planet.
That was the day I learned you literally cannot develop a computer motherboard without Intel's permission. Turns out Intel is no different than the likes of Nintendo.
By matheusmoreira 21 hours ago
I doubt that.
Chinese "tinker" has been making countless "x99" motherboard that reuse consumer chipset like h81 or b85.
I don't think Intel approve that
By mrheosuper 8 hours ago
Yes, if you want to go that route, you'll be better off going with RISC-V.
By Arnavion 17 hours ago
This makes me angry, so can anyone think of a legitimate steelman of their position?
I expect my view is consistent with reality, though: they're chasing profits and getting away with it, so why go on the record and look bad when they can just ignore it and spend that time on marketing?
By Barbing a day ago
ASUS doesn’t want to deal with the social media horde, who can and will cherry pick words and take things out of context.
If a person comes to talk business with a camera attached to his head, I know he does not come in good faith.
By vachina a day ago
It's a journalist coming, because you said you want to talk to the journalist, because of the bad press you had before, because you fucked up.
Seems fair to take a camera.
By sigmaisaletter 13 hours ago
This is a really well-written blog post.
The practice of injecting pre-installed software through the BIOS is such a deal-breaker. Unfortunately this seems to be widely adopted by the major players in the motherboard market.
By ritcgab 18 hours ago
I like ASUS products but I disable the UEFI-installed support app every single time. IIRC it used to be a full ROG Armory Crate installation, which is really annoying to uninstall.
When ASUS acquired the NUC business from Intel, they kept BIOS updates going, but at some point a "MyASUS" setup app got added to the UEFI, like with their other motherboards. Thankfully, it also has an option to disable it, and IIRC it defaults to disabled, at least if you updated the BIOS from an Intel NUC version.
By smileybarry 17 hours ago
I have a similar motherboard from ASUS in the desktop I had custom built a few years ago, and I've mostly just been annoyed that I have to have Windows installed to be able to update the BIOS at all, given that the previous one I had (which I think was also from them?) would just let me do it over ethernet if I booted directly into the BIOS setup menu. Now I have much larger concerns, in addition to the risk of not updating as frequently seeming much larger...
By saghm 10 hours ago
Any mobo will let you download the firmware file to a FAT32-formatted USB drive etc, and then use that to update the UEFI within the UEFI UI.
Yes some mobos have the feature in their UEFI to connect to the internet and download the update, but it's best to not rely on that since you have no idea how securely that is implemented. Considering how the submitted article is about a shitty implementation in a regular Windows program, you can be sure the implementation in UEFI is even shittier (may not check certs, may not even use HTTPS, etc). Asrock used to have an "Internet Flash" feature in their UEFI and then suddenly removed it, probably because it was too insecure to fix.
By Arnavion 6 hours ago
A few of the drivers they install (or want to install) are also on Microsoft's vulnerable, actively exploited driver blacklist. So that's fun; they have no intention of fixing it because they do not support "third party software". I'm also pretty sure their installer doesn't work without unencrypted HTTP traffic being let through. Plus they keep offering bloatware as "updates" to you.
On top of it all, the software they offer is slow and buggy on brand-new hardware.
But most of those issues also exist with AMD's or Gigabyte's drivers; most hardware vendors seem trashy like that. Like, if you install Samsung Magician (for their SSDs), it even asks you whether you're in the EEA (because of the privacy laws, I suspect). It's absolutely crazy.
Microsoft should make it *significantly* harder to ship drivers outside of Windows Update and they should forbid any telemetry/analytics without consent.
I find Linux's hardware support model significantly nicer, although some rarer things do not work OOB, there's none of this bullshit.
By Avamander a day ago
I thought it was revealing that the driver blacklist (sipolicy.p7b) was not updated for several years until Will Dormann happened to notice it in 2022.
Hardware manufacturers consistently ship the worst software in existence. It's just a cost center to them. They've already sold the thing, so it doesn't matter anymore.
My laptop has a fan and keyboard LED application that requires kernel access and takes over a minute to display a window on screen. Not to mention being Windows only.
Words can barely describe just how aggravating that thing was. One of the best things I've ever done is reverse engineer that piece of crap and create a Linux free software replacement. Mine works instantly, I just feed it a configuration file. I intend to do this for every piece of hardware I buy from now on.
By matheusmoreira 21 hours ago
I really wish someone made such software for ASUS and Gigabyte both, without dangerous kernel drivers.
In that sense fwupd has been an amazing development, as there's now a chance that you can update the firmware of your hardware on Linux and don't have to boot Windows.
By Avamander 20 hours ago
Actually having hardware lying around to reverse engineer is the limiting factor for me. I suppose I could give it a shot if people who own the devices sent me the required data. I'd need their help with testing.
USB stuff was really nice to work with. Wireshark made it really easy to intercept the control commands. For example, to configure my keyboard's RGB LEDs I need to send 0xCC01llrrggbb7f over the USB control channel; the ll identifies the LED and rrggbb sets the color. Given this sort of data it's a simple matter to make a program to send it.
Reverse engineering ACPI stuff seems to be more involved. I wasn't able to intercept communications on the Windows side. On Linux I managed to dump DSDT tables and decompile WMI methods but that just gave me stub code. If there's anything in there it must be somehow hidden. I'm hoping someone more experienced will provide some pointers in this thread.
By matheusmoreira 18 hours ago
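A hedged sketch of what replaying such a control command can look like with pyusb: only the payload layout (CC 01 <led> <rr> <gg> <bb> 7F) comes from the comment above, while the vendor/product IDs and the HID SET_REPORT parameters are placeholder assumptions that differ per device:

    import usb.core

    VENDOR_ID, PRODUCT_ID = 0x1234, 0x5678  # hypothetical IDs; check lsusb for yours

    def set_led(dev, led: int, r: int, g: int, b: int) -> None:
        # Payload layout from the comment: CC 01 <led> <rr> <gg> <bb> 7F
        payload = bytes([0xCC, 0x01, led, r, g, b, 0x7F])
        # bmRequestType 0x21 / bRequest 0x09 is the standard HID SET_REPORT
        # (host-to-device, class, interface); wValue/wIndex are assumptions here.
        dev.ctrl_transfer(0x21, 0x09, 0x0300, 0, payload)

    dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
    if dev is None:
        raise RuntimeError("keyboard not found")
    # On Linux you may first need dev.detach_kernel_driver(interface).
    set_led(dev, led=0, r=255, g=0, b=0)  # first LED to red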
> Microsoft should make it significantly harder to ship drivers outside of Windows Update
No. No no no no no no no NO! That just centralises even more control to MS.
What we really need is for more people to develop open-source Windows drivers for existing hardware, or encourage the use of Linux.
By userbinator 8 hours ago
I am assuming the timeline posted in this article is a year off, and the author means 2024 instead of 2025.
By cebert a day ago
Why would you assume that instead of the more likely scenario of them using DD/MM/YYYY format? The CVE linked has a date in 2025. Not everyone uses the insane American date formatting.
By psolidgold 17 hours ago
> When submitting the vulnerability report through ASUS’s Security Advisory form, Amazon CloudFront flagged the attached PoC as a malicious request and blocked the submission.
Reminds me of the time I reported a SQL disclosure vuln to Vivaldi and their WAF banned my account for - wait for it - an 'SQL injection attempt', so hard that their admin was unable to unlock it :)
By rasz 12 hours ago
It is not just a mainboard issue. I had an Asus mechanical keyboard. After I started using it, Windows kept installing software and background services on the system, one of which was a listening port. I kept deleting it manually, and no matter what I did, Windows kept reinstalling it without my consent. It was really annoying.
By serguzest 15 hours ago
I read Acer for some reason, and was surprised and disappointed that it is actually Asus.
By nexoft a day ago
All our motherboards, the root of trust, are made in Taiwan. All props to their industriousness and agility, but shouldn't there be a western alternative that can be purchased?
By ikekkdcjkfke a day ago