Their unreleased LaMDA[1] famously caused one of their own engineers to have a public crashout in 2022, before ChatGPT dropped. Pre-ChatGPT they also showed it off on their research blog[2], demonstrating it doing very ChatGPT-like things, and they alluded to 'risks,' but those were primarily about it using naughty language or spreading misinformation.
I think they were worried that releasing a product like ChatGPT had only downside risk for them, because it might mess up their money-printing operation over in advertising by doing slurs and swears. Those sweet summer children: little did they know you could run an operation with a sieg-heiling CEO who uses LLMs to manufacture and distribute CSAM worldwide, and it wouldn't make above-the-fold news.
The front-runner is not always the winner. If they were able to keep pace with OpenAI while letting them take all the hits and missteps, it could pay off.
Time will tell whether LLM training becomes a race to the bottom or the release of the "open source" models proves to be a spoiler. From the outside looking in: while ChatGPT has brand recognition with the average person, who couldn't tell the difference between any two LLMs, Google offering Gemini on Android phones could perhaps supplant them.
Indeed, none of the current AI boom would’ve happened without Google Brain and their failure to execute on their huge early lead. It’s basically a Xerox PARC do-over with ads instead of printers.
If the car that did a hit-and-run was operated autonomously the insurance of the maker of that car should pay. Otherwise it's a human and the situation falls into the bucket of what we already have today.
> If the car that did a hit-and-run was operated autonomously the insurance of the maker of that car should pay
Why? That's not their fault. If a car hits and runs my uninsured bicycle, the manufacturer isn't liable. (My personal umbrella or other insurance, on the other hand, may cover it.)
They're describing a situation of liability, not mere damage. If your bicycle is hit, you didn't do anything wrong.
If you run into someone on your bike and are at fault then you generally would be liable.
They're talking about the hypothetical where you're on your bike, which was sold as an autonomous bike whose manufacturer's software fully drives it, and it runs into someone and is at fault.
Perhaps require monitoring of the arm muscle electrical signals, build a profile, match the readings to the game actions and check that the profile matches the advertised player
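A minimal sketch of that matching step, assuming the monitoring hardware reports per-session EMG feature vectors (the feature names, values, and the 0.9 threshold here are all illustrative assumptions, not any real anti-cheat API):

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def matches_profile(stored, live, threshold=0.9):
    """True if the live EMG feature vector resembles the enrolled profile."""
    return cosine_similarity(stored, live) >= threshold

# Toy feature vectors: e.g. mean burst amplitude, burst-to-click latency (ms),
# inter-burst variance -- whatever the hypothetical sensor pipeline extracts.
enrolled = [0.82, 41.0, 3.1]
session = [0.80, 43.5, 3.0]
print(matches_profile(enrolled, session))
```

In practice the profile would need many more features and a proper statistical model, but the idea is the same: enroll once, then continuously compare live readings against the advertised player's signature.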
Sounds like it could be fixed by making it configurable to hide all issues without a certain tag (or auto-apply a hiding tag) for the issues "landing page".
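A sketch of that filter, assuming issues are simple records with a tag list (field names and the "triaged" tag are made-up placeholders):

```python
# Landing page shows only issues carrying a designated tag; everything
# else is hidden until someone tags it.
issues = [
    {"title": "Crash on startup", "tags": ["triaged", "bug"]},
    {"title": "Typo in docs", "tags": []},
    {"title": "Feature request", "tags": ["triaged"]},
]

def landing_page_issues(issues, required_tag="triaged"):
    """Keep only issues that carry the required tag."""
    return [i for i in issues if required_tag in i["tags"]]

for issue in landing_page_issues(issues):
    print(issue["title"])
```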
I always thought they deliberately tried to contain the genie in the bottle as long as they could