
> would immediately be the highest possible level of breach

Obviously not true; in fact, at none of the companies I've worked for was that the case. Now, if we're talking about somebody like a network admin, that may be a juicier target, but in a large company a single person rarely has the "highest possible level".

> If you never worked at a company that took it seriously it is hard to imagine that there are people who do take it seriously

This sounds quite insufferably smug, and unwarrantedly so. Yes, we get it, at Google they have access levels. Newsflash: this concept wasn't invented by Google. A lot of companies use it. Running code on an engineer's workstation does not give you the keys to the whole castle. But it does get you into a first perimeter, after which you can do some recon not available outside the big firewall, launch other things against corporate sites not available to mere mortals, look for "experimental" servers which are poorly secured because they're not production yet and sit behind the corporate firewall (that happened at pretty much every company I've worked with), get your hands on some juicy browser cookies containing authorizations to internal services, steal some code, finally... There are a lot of interesting things a person can do with an engineer's access, and some of them may serve as a stepping stone to the next level of access. If you don't understand why such a thing may be important even without giving somebody the keys to the castle, maybe you don't know enough about how secure systems are built to smugly dismiss other people.



> Obviously not true, in fact none of the companies I worked in that was the case.

Selection bias. In some companies, access to a single engineer's workstation allows access to potentially billions of dollars worth of intellectual property. Imagine China stealing the design for the latest aircraft engine from GE, or accessing the PC of the most senior accounts payable person in a company and getting access to actual money. Just because YOU weren't the most valuable target in a company doesn't mean no one in your role is. My account at my company is constantly under attack because I'm the VP of IT in healthcare. The reality is "my" account doesn't even have any powers; it's just email. My accounts with sensitive access are separate.


The parent commenter said this:

> People [..] freak out about this because [..] in 99.9% of companies, running code on an engineer's workstation would immediately be the highest possible level of breach.

So it's not selection bias, it's a counterargument. The poster also said "engineer", not "VP-level".

So, your comment is not really relevant.


It's 100% relevant, because there are people even MORE valuable than me in my company. I was pointing out that the importance of people to an attacker is directly proportional to their access, not their rank. If they got into one of our RCM people, we'd be royally screwed, and those people make $30/hr.


> Obviously not true, in fact none of the companies I worked in that was the case

I once offered a bet to the large security team at a well-known decacorn tech company I worked at: a personal, reasonably sized cash bet with any member of the security team, which I would win if I could deploy malicious, unreviewed code to any service or machine of their choice without it being prevented or proactively noticed by them.

The members of the security team all declined my bet. We're talking about a team of probably at least a dozen people, many of whom had been working at the company far longer than I had and who had been shaping and reviewing the company's security design for years.

They knew perfectly well that I would be able to win the bet. Not because their security was unusually bad, but because it was bad in the common, usual ways. Securing the supply chain is hard, and real security is almost impossibly expensive to add to a system late in the game if you didn't design it in from the beginning.


Or maybe they simply didn't want to risk personal money on some bet about the state of security at their job. I wouldn't take the bet even if I thought the security was good.


If you're not even willing to make a bet for a single signed dollar, that doesn't speak highly to your confidence in your work.

It's fine to not be confident, but when professional security teams at large companies are afraid to express confidence that their systems are non-trivial for a random engineer to hack in their free time, that seems at odds with the claim that it's "obvious" that privilege escalation is hard.


Making such a bet is not really a professional thing to do, regardless of the actual risk it introduces. If I were a manager at that company and two of my employees made such a bet, I'd be tempted to fire both or, at the very least, have a very serious conversation. I think that's borderline malpractice.


When I worked at Google back in the day, we used to make dollar bets all the time. You'd tape the signed dollars you won to your monitor.

A willingness to take pride in your work and to not take it too seriously when smart, well-intentioned people make mistakes (e.g. blameless postmortems) is part of the culture difference that led to Google's engineering becoming so exceptional and innovative vs the more corporate, don't-rock-the-boat, fear-driven culture that the traditional businesses had at the time.


The second paragraph seems at odds with the first. I'd say a culture where people make bets on whether or not you can find a bug in someone else's work is the opposite of blameless. I'd consider it quite hostile, to be honest. Especially if it's something that management is actually OK with.

I'm assuming you were at Google in the late 90s/early 2000s?


> If you're not even willing to make a bet for a single signed dollar, that doesn't speak highly to your confidence in your work.

I've long thought that one should have the attitude (and act to make it so) that one should be willing to bet their job on the quality of their work, but not necessarily actually do so.

And betting anyone (co-worker or not) that they can't compromise the systems you're tasked with keeping from compromise (especially, but not limited to, production systems) is a bad bet -- even if you win.

I'd class that sort of behavior as having serious potential to be a "Career Limiting Move" (CLM).


Yah, so they have to pay out on a bet and then they become unemployed. That seems really smart. Never gamble on anything that is 100% correlated with your primary source of income.


You seem to fundamentally misunderstand how this works, and are smug about it in the process.

There is no big firewall to bypass here. That's the whole point of zero trust.



