This thread is better than the article. Though this quote:
> just another blatant example of how AI needs to be regulated in so many ways immediately as an emergency of sorts. This is just going to get worse and its happening fast.
Rubs me wrong. We already have laws against false advertising and failing to deliver what's promised that apply here. I'm not saying AI doesn't need to be regulated, but AI isn't to blame here. Bad planning and false advertising are to blame. Yes, maybe AI makes it easier, but you could fake pictures or use a different event's pictures to lie about your event before AI.
Stolen, or deliberately scrapped by the owners? It’s entirely possible they legitimately thought that the FCC wouldn’t care about the AM requirement in their license.
The owners would almost certainly know that the thing went offline. They would have a maintenance schedule, and power bills to pay.
Is it at all possible that they outsourced the physical maintenance - including paying the power bills - of the tower to someone else, who in turn screwed them over?
Start tracking all food, from crops to harvest; reduce family farms. Charge people to use carbon at a metered rate, slowly reducing the allowed amount each year. Keep everyone restricted with scarcity.
For what it's worth, I went to a coding bootcamp in 2012. From there, I found people who were willing to pay me money to do software development as a freelancer.
I did countless events, I listed my services in local blogs.
This led up to becoming CTO of a startup. 8 years later, here we are.
This was in Chicago.
It was one of the best decisions I ever made.
I also did an iOS bootcamp in 2014, mostly because clients kept asking for mobile and I didn't like the hybrid mobile platforms.
I've noticed, searching the recent questions, that the answer counts and views are really low. There just don't seem to be many people on the site answering questions anymore
There is a big message telling users not to use AI generated content for answers
I'm starting to wonder if the no AI answers rule is a good idea
I think that is precisely the problem. A kid is teaching himself to program. He has an error which a million people before him have faced. He types it into google and gets SEO bullshit. He asks stackoverflow and they close his question immediately as a duplicate of a question which, to him, looks very different. He asks ChatGPT and he gets an answer immediately.
> SO used to have interesting questions, questions that I needed answered or that I found interesting to spend time on to research.
IMO a lot of the interesting questions got moderated away. Almost all of my favourite SO questions are closed as being off topic or subjective. I do think that the community struggled to scale with influx of a huge volume of users. But I also think that the strategy that they chose to deal with it was ultimately overzealous.
By interesting questions I mean ones whose answers I did not know and that were not a simple read-the-manual. They would be ones I would have found fun to research.
Your definition of interesting is a discussion, which is not what SO is for - and even if it were, they do not have the tools to support it, e.g. threading, being able to vote on interim comments, etc.
No wonder LLMs ate their lunch. If questions about contents of the manual are the only ones allowed, you are literally better off just asking Bing Chat.
Everything should be in the manual - but nowadays it often is not. Many projects just say "read the code", which is difficult: first you have to find the correct code, then understand it, and the code might be more complex than the user's knowledge allows or be in a different language.
Even if it is, it is often difficult to put the individual facts together, and that is what an SO answer will do.
I think because the effort bar becomes too low when providing answers.
The gamification of the Stack Exchange sites is a double-edged sword, and I think it relies on an assumption about the effort threshold required to provide an answer.
Typing out an answer, even a wrong one, takes time and effort that the system then rewards through points and badges.
If you can just copy a question to Copilot and paste the answer back, the effort threshold is likely low enough that too many (or at least lots of) people will do only that in order to engage with the gamification and collect points, rather than to help the asker.
Maybe it's not a good idea for the company. However, avoiding the echo chamber feedback loop in AI training data is absolutely a good idea for society.
The issue is that current AI answers invent 'facts' or produce wrong code.
AI might help improve how people phrase questions and possibly help them find an answer, e.g. by improving search. But to answer, you must be correct and give correct references - not false ones, as has been found in legal cases and elsewhere.
The views being lower is partially due to StackOverflow implementing a legal cookie confirmation dialog in which it is as easy to reject cookies as to accept them
When politics becomes less about swaying a vulnerable middle and more about growing your base/shrinking your opponent's base, you eventually realize the best tactics are to take from those who will never vote for you and give to those who will.
The subheadline and four of the first five paragraphs address this:
"In 2011, Spielberg had already explained that the guns would be returning for the 30th anniversary release, explaining that he was 'disappointed' in himself."
https://twitter.com/AlsikkanTV/status/1762235022851948668