Only because of the broader context of the legal environment. If there were no prosecution for breaking and entering, they would be effectively worthless. For the analogy to hold, we need laws that impose coercive measures on those who try to bypass guard rails. In theory this already exists in the Computer Fraud and Abuse Act in the US, but the courts haven't quite adopted that interpretation yet.
Goalpost movement alert. The claim was that "AI can be told not to output something". It cannot. It can sometimes be told not to output something, and that might stick, sometimes. That much is true. The original statement is not.
After learning that guaranteed delivery was impossible, the once-promising "Transmission Control Protocol" is now just an obscure relic of the 1970s, and the dream of inter-connected computer systems was abandoned as a delusional, impossible fantasy.
> The third and fourth arms are extreme compression construction arms "ecca", where a programming language interpreter is created and individual incoming letters are interpreted as instructions specifying which phase (mod 2) and line of glider to emit.
As the wiki page states, the period is 133,076,755,768, and it moves by two cells in that time. Spaceships in GoL by definition don’t leave anything behind; they produce the exact same configuration, just shifted across the grid.
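If it helps to see that definition operationally, here’s a minimal sketch in Python (my own illustration, not anything from the wiki): evolve a pattern and check whether some later generation is an exact translate of the starting configuration. You’d only ever run this on tiny ships like the glider, obviously, not on anything with a period of 10^11.

```python
from collections import Counter

def step(cells):
    """One generation of Life on a sparse set of live (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for x, y in cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Born with exactly 3 neighbours; survives with 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

def corner(cells):
    return (min(x for x, _ in cells), min(y for _, y in cells))

def normalize(cells):
    """Translate to the origin so shifted copies compare equal."""
    cx, cy = corner(cells)
    return frozenset((x - cx, y - cy) for x, y in cells)

def spaceship_period(cells, max_gens=1000):
    """First generation at which the pattern is an exact, displaced
    copy of itself, or None. Still lifes and oscillators don't count:
    the bounding box must actually have moved."""
    start, start_corner = normalize(cells), corner(cells)
    cur = cells
    for gen in range(1, max_gens + 1):
        cur = step(cur)
        if cur and normalize(cur) == start and corner(cur) != start_corner:
            return gen
    return None

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
print(spaceship_period(glider))  # 4 -- reappears shifted by (1, 1)
```

That check is exactly what “moves by two cells in 133,076,755,768 generations” means, just at a scale nobody would brute-force.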
Given that it starts as a single line, it is symmetric about the axis defined by that line, and since the GoL rules preserve that symmetry, every later generation is symmetric about the same axis. It therefore can’t possibly move diagonally or perpendicular to the line; it has to move along the direction of the line.
Yeah, “orthogonal” here just means “not diagonal”. Since GoL configurations don’t have a distinguished orientation (you can rotate and/or mirror them however you like), it wouldn’t make sense to specify up/down/left/right, at least not without first fixing an (arbitrary) orientation.
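To make the “no distinguished orientation” point concrete, here’s a quick sanity check (again just my own sketch, using the same sparse-set stepper as above): Life’s rule commutes with the grid’s rotations and reflections, so “up” versus “left” is purely a choice of coordinates.

```python
from collections import Counter

def step(cells):
    counts = Counter((x + dx, y + dy)
                     for x, y in cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in cells)}

def rotate(cells):
    # 90-degree rotation about the origin
    return {(-y, x) for x, y in cells}

def mirror(cells):
    # reflection across the y-axis
    return {(-x, y) for x, y in cells}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for f in (rotate, mirror):
    # Evolving a transformed pattern equals transforming the evolved one.
    assert step(f(glider)) == f(step(glider))
```

Because the rule commutes with every rotation and reflection of the grid, any direction label is only meaningful relative to an orientation you’ve fixed by convention.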
I’m not sure where our guidelines/norms are on this kind of thing, but I get the sense that most of us feel very capable of pasting articles into LLMs ourselves.
What we’re less capable of—and the reason we look to each other here instead—is distinguishing where the LLM’s errors or misinterpretations lie. The gross mistakes are often easy enough to spot, but the subtle misstatements get masked by its overconfidence.
Luckily for us, a lot of the same people actually doing the work on the stuff we care about tend to hang out around here. And often, they’re kind enough to duck in and share.
Thank you in any case for being upfront about it. It’s just that it’d be a shame and a real loss if the slop noise came to drown out the signal here.
I think you're confusing the funding and starting of companies with the execution of their vision through software engineering -- the entire point of the article, and of the OP.
You are correct. I've operated under many protective orders that required me to redact portions of reports clients had paid for, because under the order they were not authorized to see those specific parts.
If you do anything in America that results in a stored record, it's possible it will be released in discovery and a lawyer will read it. This happens all the time, and has happened for hundreds of years.
It's not like the NYT will be publishing this shit in the news. Their lawyers and experts will have access to make a legal case, under a protective order. I'm not going to lose my law license because I'm doing doc review and you asked it something naughty and I think it's funny.
Courts and lawyers deal with this stuff all the time. What's very very weird to me is how upset OpenAI is about it.