Even before Tesla reported the first known death of a driver using its autopilot feature, some employees worried the car company wasn't taking every possible precaution.
Those building autopilot were acutely aware that any shortcoming or unforeseen flaw -- blind spots in the car's sensors, drivers misusing the technology -- could lead to injury or death.
But Tesla CEO Elon Musk believes that autopilot has the potential to save lives by reducing human error -- and has pushed hard to get the feature to market.
The team's motto is "not to let the perfect be the enemy of the better," according to a source close to Tesla. For Musk specifically, the source says his driving force is "don't let concerns slow progress."
Musk brushed aside certain concerns as negligible compared to autopilot's overall lifesaving potential.
Some Tesla (TSLA) employees struggled with this balance, according to interviews CNNMoney conducted with five current and former Tesla employees, including several from the autopilot division, most of whom spoke on condition of anonymity.
Eric Meadows, a former autopilot engineer at Tesla, recalls testing the feature on a Los Angeles highway in mid-2015, a few months before its release, when he was pulled over on suspicion of driving drunk. The car, he says, had struggled to handle the sharp turns while in autopilot mode.
Meadows knew he was "pushing the limits" of autopilot, but he assumed customers would push those limits too. That's why the incident worried him.
"I came in with this mentality that Elon had: I want to go from on-ramp to off-ramp and the driver doesn't have to do anything," says Meadows, who was fired from Tesla two months later for performance reasons. "The last two months I was scared someone was going to die."
In another anecdote recounted by two sources, Musk was told that the sensors used for Tesla's self-parking feature might have difficulty recognizing something as small as a cat. Musk is said to have responded that given how slow the car moves in this parking mode, it would only be dangerous to "a comatose cat."
Musk pushed back against employees whose concerns about autopilot he viewed as "overly cautious," according to one former Tesla executive who worked closely with the CEO.
"Safety is a top priority at Tesla," a Tesla spokesperson said in a statement provided to CNNMoney. "We constantly build updates to our software and specifically autopilot features to continue to improve the feature and driver experience."
Employees we spoke with described Tesla's staff as a mix of younger, data-driven engineers and people with more experience in the auto industry. The latter group is said to be more prone to nagging doubts about safety and liability.
Raj Rajkumar, an autonomous car pioneer at Carnegie Mellon, frequently meets with employees from auto companies at conferences and research events. According to Rajkumar, Tesla employees he has met "say it's an understatement to say [Tesla] is hyperaggressive."
When Rajkumar has raised concerns with those Tesla employees about autopilot's technical limitations, he has been told they have to "wash their hands of it" because "it's a business decision."
Multiple employees told CNNMoney that numbers and data matter above all else for winning arguments with Musk and other top execs.
Sometimes those numbers can be used to justify not immediately addressing a potential safety issue -- like the cat scenario -- that might keep certain engineers awake at night, simply because the likelihood of it happening is so small and the data shows more benefit in getting autopilot out into the world.
As an example, one source close to Tesla pointed to the development of the self-parking feature. The sensors might not work as intended when the car is parked on the edge of a precipice, but that uncommon risk was weighed internally against the benefit of preventing "thousands of deaths" from drivers backing out of their garages.
"It's hard to believe a Toyota or a Mercedes would make that same tradeoff," says David Keith, an assistant professor of system dynamics at MIT Sloan School of Management who studies new technologies in the automotive industry. "But the whole ethos around Tesla is completely different: they believe in the minimum viable product you get out there that's safe."
Musk likes to push the envelope. In one instance, he wanted to allow videos to play on the car's center console, according to the former executive. He was eventually talked out of it after staff pushed back over liability concerns.
"We didn't think having videos on the center console was safe, so that is why we didn't do it," says a Tesla spokesperson.
Last month, Tesla disclosed that a driver in Florida was killed in a crash with a tractor-trailer while his car was in autopilot mode. The autopilot system failed to recognize the white side of the trailer against the bright sky. The driver may also have been watching a video on a portable DVD player in the car. On Tuesday, the National Transportation Safety Board released a preliminary report suggesting that the driver was speeding.
With the news, Tesla employees were forced to confront the repercussions of the company's product decisions head on. Some, including Musk, focused on data showing Tesla's autopilot is still safer than the alternative. Others internalized the tragedy more.
The autopilot team was particularly hard hit. The source close to Tesla called it "very, very difficult" for those employees. They held a team meeting to talk about the crash.
Then they went back to work, brainstorming new radar functionality for the cars with the hope of preventing a similar accident.