
I've had the 'chance' to work with some deeply manipulative people in the past, the kind who go to your desk and say 'Hey, I noticed you started to speak to X again, and your performance seems to suffer as a result', where X is a friendly colleague who opposed some plan of that person. It is incredibly difficult to keep those people in check as all that behaviour is off the record and impossible to prove. When people complain it's a 'you said, he said' situation where the manipulator inevitably wins. Whether those people are positive or negative for the company is not all that clear, but they create an incredibly unpleasant work environment.


I find it hard to believe this was the kind of environment he was cultivating at OpenAI if 95% of the staff were ready to follow him out the door.

I've worked for the type of people you mention and no one followed them when they left. 95% threatening to leave in this case is hard to ignore.


> I find it hard to believe this was the kind of environment he was cultivating at OpenAI if 95% of the staff were ready to follow him out the door.

I work for a startup that's on the cusp of having an exit event valued at 70 billion dollars. Drama within the board, who I have no connection with, has reduced the probability of that happening to 0. There's a chance another company will hire me and my co-workers and match our total compensation in liquid stocks we can actually sell.

It's really hard to imagine why I or anyone else would sign a letter that turns back the decision impacting the exit event, or join the company that'll actually let me cash out the equity portion of my compensation. It definitely reflects my feelings for the CEO and not my own self-interest.


I can believe the staff likes or even loves him, but the following him part was mostly because of money/shares and because they know he's influential and well connected to people with money. And peer pressure may have had a part in that letter signing. You don't want to be on the side of the losers if Altman gets his way.


> mostly because of money/shares

How do you know?


I don't, obviously. But since those people were ready to jump ship to Microsoft, I am pretty sure they care more about their own careers than about 'creating AGI that benefits humanity as a whole' in the first place.


Didn't most sign the letter before they knew they had any offer to join Microsoft in any capacity?

Also maybe I'm just too risk averse but if I were concerned about money I wouldn't be putting my name on such a list. Although at some point past 50% it would feel pretty safe because what are they going to do, fire everyone?


"We, the undersigned, may choose to resign from OpenAI and join the newly announced Microsoft subsidiary..." so no, they knew.


> Didn't most sign the letter before they knew they had any offer to join Microsoft in any capacity?

I very much doubt it.


Presumably they're jumping ship with Sam, and I'd assume that they'd assume that Sam would uphold the same perceived integrity at MS


Sam's integrity would be at home at Microsoft, for sure.


keyword "perceived"


Occam's razor.

Maybe the simplest explanation isn't the right one for 100% of the people that followed Sam (or were ready to), but it's the right one for 90% of them, which is what matters for practical purposes.

Follow the money.


Of course. I mean, come on, you may love the guy but your primary reason for following him will still be money. Why would you want your years of work to go down to $0?


These people are at the top of the AI industry, they’d make bank in a ton of jobs if they left tomorrow. They weren’t getting equity at Microsoft yet they still chose that opportunity as an alternative.

Clearly they care about working on the most interesting AI around instead of continuing to work under a CEO and board whose whole plan is to cripple AI development. Both the interim CEO Shear and likely coup leader Toner made it clear they are anti-AI and want to slow progress. Toner specifically said she’d be okay with the company collapsing as that was in line with the charter.

Occam's Razor is that people working on the most interesting stuff in the tech industry want to keep working on it rather than follow some radical EA doomer plan to kill it off well before we get near AGI.


> These people are at the top of the AI industry, they’d make bank in a ton of jobs if they left tomorrow.

I know a signatory of the letter and I can assure you that they were nowhere near the top of the AI industry six months ago.


But being at OpenAI, they now probably have the reputation of belonging to the top.


> They weren’t getting equity at Microsoft

This is wrong.


Memetic thinking aside, Ilya signing that letter might have sealed it for them. Though working for someone as formidable as sama is, in itself, a great pull.


> working for someone as formidable as sama

His name is Sam Altman. And why is he so formidable?


> And why is he so formidable?

Commenting on an article that portrayed him as such?

> His name is Sam Altman.

Unsure what your point is; sama is his hn username.


I don't know the usernames of people discussed in articles and prefer not referring to people colloquially.

And I had assumed that you meant formidable in a positive sense. To me, he seems like a manipulative grifter. We even see that in his response to being fired. Instead of discussing facts, he was trying personal power plays, manipulating the media and employees, and trying to simultaneously start a new company, get a new job at Microsoft, and weasel back in as CEO of OpenAI. That seems to track as someone only concerned with himself.

Through all of this, it has remained confusing and disturbing just why he is considered so important to any of this. He seems completely replaceable. I haven't ever read or heard anything from him that didn't seem to come from some startup 101 playbook, almost like a cosplayer.


> almost like a cosplayer

If only growing startups were as easy as cosplay.

> And I had assumed that you meant formidable in a positive sense

Yes, I did. See also: https://twitter.com/karaswisher/status/1727386273936199893

> prefer not referring to people colloquially

Even if not everyone knows, you'd think hackernews participants with 12k+ karma would know very well who runs hackernews, or used to.


The money vs. mission question was what I was trying to answer with this hypothetical polling: https://news.ycombinator.com/item?id=38357485

(It seems obvious that hitching your wagon to Mr. Altman probably has a much better chance of making you rich, than does playing harps on a cloud at an altruistic non-profit. The question is what you actually want.)


> but the following him part was mostly because of money/shares and because they know he's influential and well connected to people with money.

In other words, they believed in his leadership, direction, and ability to serve their interests more than they believed in the board's.

I don't understand why so many people are performing mental gymnastics attempting to turn the unanimous support behind him into somehow being evidence that he's the antichrist. Why wouldn't the employees act in their own self-interest? What's wrong with them acting in their own self-interest? I would assume all employees everywhere, more or less, act in their own self-interest and I don't think that makes them or their preferred leadership evil incarnate.


I want to know other people's opinion on this.

Because if it was me working at OpenAI, I would've signed it just out of peer pressure even if I disliked him. As the CEO, Altman undoubtedly shaped senior management that would've one way or another put pressure on everyone else under them.

When I was salaried, my main concern would've been to just get my paycheck and keep things going as smoothly as possible in my day-to-day with the least amount of drama. And I feel like a lot of people are like this.


I suspect the signers were a combination of wanting to follow their comp out the door and a bit of Tom Wambsgans from Succession: "Because I've seen you get fucked a lot, and I've never seen Logan [in this case Sam] get fucked once."

There's very little risk in signing if everything falls apart, but there's a lot of risk to not signing if Sam comes back on as lead.

> I find it hard to believe

I also find it hard to believe that anyone on HN interested in this space doesn't at least have a "friend of a friend" who works at OpenAI. Based on what I've heard (which is nothing particularly quotable), it certainly gives off the vibe of being exactly that "kind of environment"


It’s not exactly a secret. The company structure was a setup that allowed a high degree of internal alignment (at the level of a cult, it seems). And at some point there was a need to realign around making a lot of cash. This resulted in alignment on that goal, and of course everyone who is in on it is supporting Sam Altman’s moves.

https://www.technologyreview.com/2020/02/17/844721/ai-openai...


OpenAI is more religion than company. Sam could be a deeply flawed leader and still have extreme loyalty due to what OpenAI has achieved under his leadership. The people at OpenAI are believers in a mission and that means they’re far more likely to allow personal failings to slide. He’s more a Musk figure than a whoever-the-ceo-of-McDonald’s-is figure.


do you have evidence to back this up?


I don't think it's necessary to prove anything he says; the keyword is 'could'. We don't know, and the people who actually do don't spill it on HN just because we would like them to.

These are generic statements about cult-like leaders; Musk is a prime example. It's hard-won affection, not just smooth BS, and we here all know that.

That being said, people generally don't change, just situations do (barring some catastrophic accidents or similar). Whatever actions a given person did in the past describe them well enough in the present. Again, generic, but IMHO always valid so far.


"OpenAI is more religion than company" sounds like a factual statement to me.


> 95% of the staff were ready to follow him out the door.

I'd rephrase that to:

- "95% of the staff were ready to follow him and join Microsoft"

Amid so much confusion and uncertainty, the prospect of joining Microsoft through an acquihire would appear quite appealing and like the safest choice. This sentiment is strengthened considering the team's approval of Sam's leadership.


I don't work there, but can guarantee that 100% of the staff wanted to be paid. They're going to follow the person that is going to make them generational wealth.


Working at Microsoft doesn’t give you generational wealth like working at an AI startup does, with a few exceptions. These AI researchers are in huge demand at plenty of companies and with investors. It’s equally plausible they just want to keep working with this collection of very smart people on the cutting edge of AI rather than have to start over from scratch somewhere else, as OpenAI was basically DOA under the new coup leadership.


> 95% threatening to leave

Have you never had that employee or colleague who threatens to leave once a year? Curiously around pay negotiations?

Nobody joined Microsoft. Nobody left. Two people were fired. Lots of threats were made, every one magically leaked within minutes to Twitter.

Nobody followed anyone anywhere. Instead we saw $81bn vaporise, and the people who stood to gain from it panic and throw their weight around.


Isn't this how you gain power? You influence as many people as you can through suggestion that you can give them what they desire? Then grow that group to be large enough so that you're cemented within the org?

Manipulation doesn't even necessarily feel bad. Just promising something, or offering a place inside the "in-group" could do the trick for most. It's when you're up against someone whose job it is to safeguard something (like someone on the board dedicated to a mission) where you start needing to get a bit more gangster with your tactics.


Dunno, you have to be able to deliver on some of those promises of desires fulfilled. And as you get older, your ability to see through it should only increase. At that point, the only real question becomes: is it to my benefit?

FWIW, while I follow this saga, I am kinda waiting to see the full retrospective. I think we don't know everything relevant yet.


If there _was_ a good reason to fire Sam, and the board had appropriately and clearly communicated their decision, I think fewer of the staff would have signed the petition to walk. From the public's perspective, and probably that of most rank-and-file employees, this decision came from left field and had no logic behind it. The waffling and backpedaling that followed certainly didn't help perception.


If you give me the choice between making a lot of $$$ by working for a for-profit company or staying at a nonprofit with limited upside I'd also choose the former, even if I don't like the CEO much. Don't know where this myth of "people followed him" comes from. There is no evidence for it.


"95% of the staff" -- this is Kim Jong Un approval rate territory. caution advised.


Vladimir Putin had 77% of the vote in Russia’s 2018 presidential election. If Putin can’t fake a 95% approval rating, surely the OpenAI numbers must be real.


This is reminding me of the Ewok defense.


Looks like I need to work on my sarcasm phrasing.


Once Ilya signed the letter, most of the researchers were bound to follow suit.

As for the rest, the non-researching roles, most of them were hired during Altman's expansion of commercial operations. The existence and future prosperity of their jobs rely on having someone like Altman pushing a profitability/go-to-market vision.


You don't need to manipulate all employees. Just key ones ;)


I feel the same way, however.

The 95% will lose a huge chunk of money if Sam leaves; at the very least, their fortunes are all in serious jeopardy. So money might have played a bigger role here.


I don't know anything about the specific situation, but in general this is totally possible with a tyrannical leader.

If he does come back and you didn't sign, he'll make your life hell; if he comes back and you did sign, you will be rewarded for your loyalty.


Well maybe they were not as good at manipulating as others can be.


A cult of personality, and connection to the 1% of the 1% in our tech-fueled economy, skew worker motives.

If you had such a chance to sit around while everyone else grew your potatoes, you would.


This endorsement of Sam from 2011 is actually pretty damning, though it is so long ago that, if it were the only thing, it wouldn't be a huge red flag:

>I just saw Sam Altman speak at YCNYC and I was impressed. I have never actually met him or heard him speak before Monday, but one of his stories really stuck out and went something like this:

> "We were trying to get a big client for weeks, and they said no and went with a competitor. The competitor already had a terms sheet from the company we were trying to sign up. It was real serious.

> We were devastated, but we decided to fly down and sit in their lobby until they would meet with us. So they finally let us talk to them after most of the day.

> We then had a few more meetings, and the company wanted to come visit our offices so they could make sure we were a 'real' company. At that time, we were only 5 guys. So we hired a bunch of our college friends to 'work' for us for the day so we could look larger than we actually were. It worked, and we got the contract."

> I think the reason why PG respects Sam so much is he is charismatic, resourceful, and just overall seems like a genuine person.

https://news.ycombinator.com/item?id=3048944

I think the article mentions what may be this same incident, without saying how it was done:

> Rabois noted that Altman, as a Stanford dropout, persuaded a major telecommunications company to do business with his start-up Loopt — the same quality, he said, that enabled Altman to persuade Microsoft to invest in OpenAI.

From the earlier comment, it seems he persuaded the telecom essentially through fraud though maybe not legally so.


This is sort of par for the course in the world of early stage startups. No one wants to be your first customer as it is risky, but you need that first customer. So you "fake it until you make it."

It is similar to dressing the part you want - at least when that mattered. You buy more expensive clothes than you should be able to afford so that people think you are more successful than you are, and then they are more willing to bet on you, and then you become more successful.

There is nothing that is a red flag for me in the above story.

I also had a prospective first client want to visit our offices, so I quickly rented an office and asked my part-time contractors to all come into the office that day to fill it out. It worked! And then I could afford an office and to hire those part-time contractors as full-time employees. So it was sort of self-fulfilling.


I think OP's point was that this was sama finding the most egregious thing that is acceptable to admit in public, which is almost certainly not the most egregious thing he's done, and that could be a large part of the explanation for why the opinion of people who know certain private actions diverges so much from everyone else's.


Interesting viewpoint. A lie is a lie and immoral is immoral. We can wrap it in a nice package or act like 'it had to be done because others are doing it', and that may be a correct statement. But it's still a plain in-your-face lie.

If that telco had known the truth they would most probably have cut them out, not due to their size but due to their lies. This is not how trust is built; this is how you lose it very quickly and for good.

Maybe we need to accept that this is expected from all startup owners/CEOs. Fine with me too, but it's still immoral. We define our own legacy, if we ever care (and these mega egos do care a lot).


I believe such behaviour is harmful and that we shouldn't be rewarding those that engage in it.


The bad behaviour is predicating the purchase on seeing the office.

Having an office doesn't make a company real, nor any more or less likely to execute on the project


> Having an office doesn't make a company real or less likely to execute on the project

Having real employees vs sham ones does imply a lot about a business, even if it isn't a perfect signal.


Both can be bad. Even more so when you don't know which party established the idea as bad in the first place.

A purchaser who insists they only see white employees in the office is bad. Anyone that forces their non-white employees out of sight to secure that purchase is just as bad, if not worse.

To play along is to accept the notion, to contribute to its perceived validity, and to harm anyone who happens to be honest. The result is that people we'd be better off without are pushed upwards in society.


> There is nothing that is a red flag for me in the above story.

Elizabeth? Is that you?


If Elizabeth Holmes had been able to pull off some successful product that made a lot of money, no doubt all the "fake it 'til you make it" she did at the beginning (showing demos that didn't work, sending tests to outside labs and saying they were run on their equipment) would have been forgiven.

Just another nostalgic Silicon Valley "hustler" story.


She really only got into trouble because her lies became obvious and she risked people's lives. If it was some CRUD app and she didn't get enough customers or whatever, more than likely she'd have gotten money for another company.


> she risked people's lives

She was found not guilty of that bit. The conviction and jail time is only for defrauding the investors.


Only that the Theranos product was technically impossible. Which makes the whole thing even crazier: nobody did even the slightest due diligence there. Seems to be par for the course though; other exhibits are FTX and WeWork.


The core FTX crypto exchange business was very profitable. But Alameda wasn't. Also everyone at FTX was committing fraud.


Based on what financial statements do we know FTX was profitable? Also, they stole customer funds.


> Based on what financial statements do we know FTX was profitable?

The crypto-exchange part, I have read many times, was profitable. Running an exchange is a profitable endeavour as you just take a cut of all transactions. As long as you control your costs it is a money printer.

The rest of FTX was full of fraud, and Alameda was a money sink via unprofitable speculation. It was also likely helping launder money via poor KYC.

Running an exchange is a great business though if you have the volume, doesn't matter if it is crypto or futures or stocks.


No, crypto exchanges are only profitable as a result of massive wash trading and scamming. If they had to actually compete the margins would be hilariously low. Probably even lower than a typical bank because the product is just worse.


Theranos wasn't a Silicon Valley company - nobody in SV invested in it except Draper (who's a nutcase) because they all thought it was a scam.

It was an affinity fraud on non-SV rich people founded on acting like a tech company.


> If Elizabeth Holmes had been able to pull off some successful product

Like Loopt?


> This is sort of par for the course in the world of early stage startups

It’s so par for the course that I’m willing to bet it didn’t happen.


I think that level of honesty isn't unusual in Silicon Valley.

Personally, if I were the prospective customer, I'd be angry at being lied to, and my message to my team would probably be that we'd be foolish to depend on this startup after they've shown from the start that they're dishonest.

If I were an established company, I think I'd also have our lawyers look at the situation, to make sure the institutional knowledge was captured, and to see whether there's anything else we needed to do.

(For an example of something else to do: though I'd treat things as confidential by default, in some future n-ary relationship/deal, is there a situation in which I'm obligated to mention to a third company that we previously had negative vetting info on the other company?)

But in the context of current startup culture, I don't think "fake it (fraud) till you make it" is that unusual. And it's been normalized.

But I still don't want to do business with dishonest startup founders -- whether it's because they're naturally lying liars, or because they're surrounded by frequent dishonesty and they're not smart enough to cut through that.


I put that in approximately the same place as the founders of Reddit making alts and posting things on early Reddit or Porsche labeling its first-ever car design as Type 7.

There's a deceptive "fake it 'til you make it" aspect to both, and both play towards inflating the current appearance of scale/traction/experience, but I don't find them particularly damning.


VC capital optimises for revolutionaries thus they get revolutionaries.

Please note any positive connotations for the word 'revolution' should be abandoned at this point. Revolutions are short-term 100% bad and long term coin-toss bad, or worse. VCs love those odds.


What about the industrial revolutions?


What am I missing? The worst sin is trying to look bigger than they are?

You should listen to How I Built This. Tricks like this when starting out are pretty common, be it unicorn startups or personal businesses. So common that founders are openly willing to admit to it on public radio. In almost all cases, both parties came out better. It's not as if the client is at all upset at this "fraudulent" behavior.


oh sht, this guy can persuade clients and close deals? Better keep him away from the company!


Remember, BillG sold IBM an OS for the Intel 8086 that was not even owned or written by Microsoft at the time.


And somewhat ringing of these current events, his mom was on a charity board with the head of IBM.


I'm also neither a Sam Altman booster nor detractor, but the types of activities described here (and honestly, sometimes much much worse) are very common at startups.


Every good CEO is also a Confidence Man/Woman.


No, not really. Not even remotely. Business is ruthless, that's fine. But it has to stay clear of fraud and deception. And funny enough, most old-school companies do, most of the time.


I have had similar experiences.

The best career decision I ever made was to prioritize working with Good People and one of my few regrets was putting up with smart jerks for so long.


This whole saga whiffs of Machiavellianism


> as Machiavelli said:

>> Make mistakes of ambition and not mistakes of sloth. Develop the strength to do bold things, not the strength to suffer.

https://blog.samaltman.com/value-is-created-by-doing


I meant the dark triad personality traits, more than borrowing from The Prince.


On just Sam’s part or all around? Seems like there might be quite a lot of it.

Sam gives me a manipulative vibe but the way he was booted with knives out was also pretty gross. No clue what else was going on behind the scenes.

Edit: if the people who booted him were really doing it in the name of safety paranoia, that doesn’t mean it wasn’t Machiavellian. The motive can be whatever but conspiring to boot someone like that is still a knife in the back.


I have interacted with him a few times and when he decides to help, he will help you all the way with an almost maniacal focus and drive. For what it's worth I have never heard bad things about him from individual interactions.


I would not be surprised if this is the beginning of the end for the company.


Nah. Microsoft still exists and is thriving. Altman is the new Bill Gates, except he is better at retaining ~~cul~~ employees. Many at HN love him for those qualities.


Can you clarify the meaning of 4 tildes surrounded by the letters 'cul', for those of us who are new around here? Thank you


I believe they meant to use the tildes to indicate a strikethrough text format, as with markdown. The "cul", I would guess is an unfinished "cultists", even though you'd typically strikethrough a completed word. When trying to indicate a "change of mind" it would be better to use a dash: "Better at retaining cul- uh, employees."


He does not hire "mercenaries", only trOO believers...


>When people complain it's a 'you said, he said' situation where the manipulator inevitably wins

There’s no such thing as a free lunch. These types must have weaknesses of their own. I’m growing the cynicism necessary to tolerate them, but I’d like to know more robust strategies to manage them and keep them in check.

I find it hard to truly hate people, but with this type I can muster some pretty flowery invective on the spot.


Unfortunately, it's a time disparity issue.

Someone who politics for more time (with some aptitude) will generally beat out someone who doesn't.

One of the marks in favor of being cutthroat about pre-registering KPIs and expected outcomes, and then evaluating solely based on them.

In the end, I think it comes down to organizational culture.

The companies I've seen with healthier executive ranks all had a very strong culture/tradition of "brook no bullshit" and shunned/discouraged up-and-coming colleagues from engaging in it. As well as a focus on a central, objective mission (e.g. "Does this help us X?").

You still got bad apples, but their behavior wasn't nearly as pervasive as I've seen other places.


> One of the marks in favor of being cutthroat about pre-registering KPIs and expected outcomes, and then evaluating solely based on them.

That's the only thing off in your comment. Those KPIs are always set by politics, always have surprisingly subjective measurements, and always have unpredictable consequences that are cleared out by politics.

An environment with all formal strictly set objective metrics is one of the easiest ones to manipulate.


The worst option, except for all the other ones.

What's the better alternative?


Have the KPIs, but stay aware that they aren't objective and complete measurements of individual performance, thus keeping the entire revision and verification of reported data.


> "thus keeping the entire revision and verification of reported data."

What does that mean?

The entire point of KPIs is to better solve the "subjective measures allow political employees to dominate less-so ones" problem, by converting it into a "defining KPIs such that gaming them produces outcomes beneficial to the company" problem.

KPIs aren't objective, but they're certainly more objective than manager opinions.


Yea I’m fortunate to have worked in more good companies than pathological ones, so maybe whatever my strategy is has worked so far.


909 people followed Jim Jones to Jonestown and died, so?

[edited]: sorry, meant to reply to a comment replying to this one


This is basic bullying. I would ask for specific examples of the performance decline. That will also be a "he said, you said" situation.

However, sunlight is the best disinfectant. A bully cannot stand in isolation unless he is enabled. But if left too long they can amass too much power as the bully can manipulate enough people to vote for him (see Trump) or manufacture the vote.

In those cases it takes a far larger force to bring about change.


> But if left too long they can amass too much power as the bully can manipulate enough people to vote for him

That feels exactly like why the board did what they did. Reading between the lines of everything that has been published, the actual sin that led to Altman's firing seems obvious:

(1) Altman went to a board member and proposed something that would decrease the board's power over him (probably kicking someone off the board)

(2) That board member tells other board members about the conversation

(3) Board asks Altman if he had that conversation. Altman denies it

(4) Board fires him for lack of candid communication with board

(5) Board doesn't explicitly say what happened publicly, because it's inside baseball. But they absolutely know it happened, because they were first parties to it

This feels less about safety vs commercialization (in the immediate future) and more about not having faith in a CEO caught in a lie while trying to remove oversight.


Absolutely. Also, reporting these out-of-the-ordinary behaviors before they become problematic is a way to keep these guys in line. Once they see that you have a systematic way to report (replace "report" with "ask if this is normal practice within the company"), they'll avoid you.


Worldcoin

You must be a sociopath to think that's a good idea.

> “Sam lives on the edge of what other people will accept,” said one of the people who had worked with him closely. “Sometimes he goes too far.”

Silicon Valley has a profound problem with (a lack of) morals and ethics.


(I'll hijack your comment a bit, just want to share my experience working in something related to it)

I've had a chance to work with some HR people who genuinely wanted to improve the work environment at their respective companies (I know! Please believe me, lol).

One of the bigger issues was corruption in general, which this sort of behavior could fall under. The line of reasoning is that people usually resort to these behaviors in order to immorally/unlawfully attain some material benefit (it is very rare to find a pure-blooded sociopath who just does it for the sake of it). When people artificially distort any system that is set up (for acquisitions, promotions, terminations, you name it) so that it no longer serves the company's interest but that of a group of rogue employees, well ... that's corruption. This framing is nice as it makes company execs look at it from a business gain/loss perspective instead of "meh, it's just employees' gossip".

Anyway, the proposed solution was a sort of ombudsman for companies (it's actually a tech thing, not an actual person): a private channel where people could raise these issues without fear of retaliation. There cannot be a clear-cut criterion by which one could define whether a particular employee is being corrupt or not, but we observed something like a bimodal distribution where problematic individuals truly stand out! Quoting Warren Buffett, "there's never just one cockroach in the kitchen": you usually observe a lot of employees with no comments on them, a few getting one or two remarks per month (and you can just ignore those, shit happens every day), and then you have the one guy who is getting 10+ comments per week, and that's who you really need to sit down with and ask what's going on.
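The flagging heuristic described above (most people accumulate few or no remarks, while a problematic individual racks up many) can be sketched in a few lines of Python. This is purely a hypothetical illustration, not the actual app: the `flag_outliers` name, the threshold values, and the data shape are all my assumptions.

```python
from collections import Counter
from statistics import median

def flag_outliers(reports, factor=5, min_reports=3):
    """Flag people whose report count stands far outside the rest.

    `reports` is a list of (week, person) tuples from the anonymous
    channel. Names and thresholds are illustrative guesses, not taken
    from any real ombudsman product.
    """
    counts = Counter(person for _, person in reports)
    # Use the median as a robust baseline: one heavily-reported
    # individual barely shifts it, unlike the mean.
    baseline = median(counts.values()) or 1
    return sorted(
        person
        for person, n in counts.items()
        if n >= min_reports and n > factor * baseline
    )

# Toy data: most people get a stray remark; one person piles them up.
reports = [
    ("w1", "ann"), ("w1", "bob"), ("w2", "carl"), ("w2", "carl"),
    ("w2", "carl"), ("w3", "carl"), ("w3", "carl"), ("w3", "carl"),
    ("w4", "carl"), ("w4", "carl"), ("w4", "carl"), ("w4", "carl"),
]
print(flag_outliers(reports))  # → ['carl']
```

The point of the median baseline is exactly the bimodal shape described: the occasional one-off remark never trips the filter, while the standout individual does.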

Obviously this relies on the HR person being fair and honest, not part of the plot, and that comes with its own set of caveats; but at least, it's much easier to control that for one person than for 100s. Overall, the whole thing felt like an improvement.

But, to conclude, the app didn't go much further than being used at a couple of companies; then we realized it would be very hard to monetize, the team disbanded, and we all moved on to other things :P.



