Ask HN: Is it unethical to write code for a framework with security risks?

8 points by luxuryballs a year ago

Let’s say I know that .NET 5 has security flaws, and the right thing to do is upgrade to .NET 6. If I am asked to just write the feature for .NET 5 because “we aren’t targeting .NET 6 yet”, knowing I’ll have to change it later, and that .NET 5 is insecure, is it unethical for me to follow this order and do the work?
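(As a concrete aside: this kind of situation can at least be surfaced mechanically. The sketch below is hypothetical — the `OUT_OF_SUPPORT` set and the example project file are my own illustration, not anything from a real codebase — but it shows how a CI step could flag out-of-support target frameworks like net5.0 in a .csproj.)

```python
import xml.etree.ElementTree as ET

# Target framework monikers past their end-of-support date
# (.NET 5 left support in May 2022). Illustrative, not exhaustive.
OUT_OF_SUPPORT = {"netcoreapp2.1", "netcoreapp3.1", "net5.0"}

def unsupported_tfms(csproj_xml: str) -> list[str]:
    """Return any TargetFramework(s) in a .csproj that are out of support."""
    root = ET.fromstring(csproj_xml)
    tfms = []
    for el in root.iter():
        # SDK-style .csproj files use either a single <TargetFramework>
        # or a semicolon-separated <TargetFrameworks> list.
        if el.tag == "TargetFramework" and el.text:
            tfms.append(el.text.strip())
        elif el.tag == "TargetFrameworks" and el.text:
            tfms.extend(t.strip() for t in el.text.split(";") if t.strip())
    return [t for t in tfms if t in OUT_OF_SUPPORT]

example = """<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
</Project>"""

print(unsupported_tfms(example))  # ['net5.0']
```

A check like this doesn't settle the ethical question, but it turns "I warned you" into a visible, repeatable build warning rather than a one-off conversation.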

What if the boss says “it’s OK don’t worry”? Does the boss knowing or not knowing the full scope of the security implications change whether or not it’s ethical for me to actually write such software?

Is it really correct if I just say “ok well I warned you” but then do it anyway?

maxbond a year ago

In a frictionless ethical vacuum, I think it's pretty clear that knowingly contributing to shipping any severe defect in any product people rely on (or can be reasonably anticipated to rely on) is wrong. It's an engineer's responsibility to society to ensure that the products they work on function and do not cause harm. Software with vulnerabilities is defective and affords the opportunity for harm, in a not too dissimilar way to a faulty airbag (if you ignore the magnitude of the harm being done, obviously a malfunctioning airbag is generally more dangerous).

Your boss's knowledge factors into it in that it is relevant to your forming a professional opinion about whether those conditions are met.

Outside of frictionless ethical vacuums there can be many confounding factors, eg, if making an issue of it would put you at risk of being fired and not being able to fulfill responsibilities to your household, then I think it is okay to take that into consideration, especially if you've made your views known & thus given your boss the opportunity to course correct.

ETA: Reminds me of this conversation a week ago about coming up with a professional code of ethics for software engineers. https://news.ycombinator.com/item?id=33805884

  • dysarray a year ago

    I think the engineer's responsibility to society should always come first. If your boss is knowingly shipping a severe defect and your efforts to communicate your views on the matter to them have been met with ignorance or refusal to take action, then it is your ethical duty to take whatever steps are necessary in order to prevent harm being caused to society from the defect. This might include, for example, going over their head and raising the issue with upper management, or taking the issue to external organisations.

    • maxbond a year ago

      I don't really disagree with anything you said, but I do think ethical principles have priority and that higher priority principles can override lower priority principles. Some people are simply not in a position to uphold their principles, and I don't think there's shame in that; if anyone is at fault, it's the people who put them in a position to make that choice.

      Our first duty is always to the survival of ourselves and those around us. Of course, if you're working on a system that can put lives in jeopardy - which may not be an artificial heart or something obviously critical to life like that, it may just mean a social networking app used in countries suffering from authoritarianism - then it's possible to come into a very sticky conflict.

anenefan a year ago

It's good to have principles, but they must be weighed against what actual harm might arise and whether it's actually your responsibility. If you had been aware of half the holes in M$ which have been "fixed" over the years (from what I'm told there are still a few that have been carried through for near 20 years, despite M$ being informed), would you have programmed for anyone who wanted a product to run on M$? Obviously most people continue to write programs that run on M$, since it is quite easy to see that a great number of people use nothing else.

Since .NET is not proprietary any more, there's a good chance ALL of the issues are being addressed for the next version, unlike M$ who held onto a few of them knowing that eventually they might need an ace to "push."

Out of interest, as an end user myself, as well as for what I might install for the average user, I have actively avoided .NET products ... too many on offer were written poorly, came with nasty surprise requirements, and it's too hard to vet every single one. This may have changed in the last 8 years or so. On the other hand, a coder who could ship the same thing in C++ I could trust, along with most products that employed a good coding team.

HeyLaughingBoy a year ago

Is there any framework that doesn't have security holes?

killingtime74 a year ago

It highly depends. Are you writing something with customer data? If I were writing some internal tool that is devoid of customer data, just internal configuration, I would have no problem with that.

  • maxbond a year ago

    I don't think it's quite so clean as that, though. All data is customer data from the perspective of the service; it may in fact be employee data, but employees are also people whose privacy should be respected. And even if it doesn't contain data about any people, breaches often involve attacking multiple services to leverage increasing access. So you're putting a vulnerable app into an ecosystem of software, which doubtless includes sensitive data, and increasing the vulnerability of that ecosystem.