Agreed. There was a period roughly 10-15 years ago when symbolic operators in Scala were very much in vogue. That fell out of style in a big way, and I haven't encountered symbol soup in a very long time.
Most of the conversations I have with folks about Scala issues these days center around implicits, tooling, and its decline/lack of popularity.
Not confident it's quite that straightforward. Here's a presentation from Meta showing a 6-12% increase in diff throughput for above-median users of agentic coding: https://www.youtube.com/watch?v=1OzxYK2-qsI
From my perspective the two biggest challenges of the Scala 3 migration were macros and poor tooling support.
Macros were an experimental Scala 2 feature, but they were used all over the Scala ecosystem. Because they were considered experimental, a good migration story for them was never developed. That lack of support stopped migration efforts dead in their tracks at our company for a long while. It just wasn't worth contributing simultaneous Scala 2 and Scala 3 macro support to every third-party dependency that used Scala 2 macros. That said, we did it for some and lived on a fork for others.
IDE support for Scala 3 was really rough when it first released. We checked in on it with every IntelliJ release for roughly 3 years before we decided it was far enough along. Prior to that it was rough enough that we froze migration efforts in order to keep the tooling usable enough for engineers to be productive.
> IDE support for Scala 3 was really rough when it first released. We checked in on it with every IntelliJ release for roughly 3 years before we decided it was far enough along. Prior to that it was rough enough that we froze migration efforts in order to keep the tooling usable enough for engineers to be productive.
Same story for us - about 3 years before IntelliJ was usable, and even then it wasn't up to what it had been on Scala 2. We still only have 2 Scala 3 repos, out of about 30, for my team, and we're actually MORE adventurous than most other teams at the company!
Well, the IntelliJ experience for Scala 3 is still noticeably worse than for Scala 2. We're still cross-compiling our work projects, waiting until Scala 3 support in IntelliJ is good enough to switch.
Compiler error messages improved significantly with Scala 3. IIRC there was a dedicated effort with Scala 3 to improve error messages and make them more actionable. Scala 2 error messages improved somewhat over time, but can still be obtuse.
Based on my reading of the table, the PurpleAir II sensors seemed to be the best of the sub-$300 options for PM1.0 (field R^2 of 0.96 to 0.98) and PM2.5 (field R^2 of 0.93 to 0.97). The PM10 readings were not as good (field R^2 of 0.66 to 0.70).
After skimming the rest of the table, it looks like the PurpleAir II sensors might have some of the best field R^2 for PM2.5 and PM1.0.
That sensor is also available from Adafruit: https://www.adafruit.com/product/3686
I have it hooked up to a Raspberry Pi that exposes the sensor readings in Prometheus format.
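For reference, a minimal sketch of what that looks like, assuming the standard 32-byte Plantower frame layout from the PMS5003 datasheet. The metric names are made up, and the serving side (e.g. an HTTP handler on the Pi) is left out:

```python
import struct

FRAME_LEN = 32  # PMS5003 frames: 0x42 0x4D header + 15 big-endian 16-bit words


def parse_pms5003_frame(buf: bytes) -> dict:
    """Parse one 32-byte PMS5003 frame (layout per the Plantower datasheet)."""
    if len(buf) != FRAME_LEN or buf[0:2] != b"\x42\x4d":
        raise ValueError("not a PMS5003 frame")
    # The last word is a checksum: the 16-bit sum of the first 30 bytes.
    if sum(buf[:30]) != struct.unpack(">H", buf[30:32])[0]:
        raise ValueError("checksum mismatch")
    words = struct.unpack(">13H", buf[4:30])
    # Words 3-5 are the "atmospheric environment" PM readings, in ug/m3.
    return {"pm1_0": words[3], "pm2_5": words[4], "pm10": words[5]}


def to_prometheus_text(readings: dict) -> str:
    """Render readings in the Prometheus text exposition format."""
    lines = [f"air_quality_{key}_ug_per_m3 {value}"
             for key, value in sorted(readings.items())]
    return "\n".join(lines) + "\n"
```

In practice you'd read frames off the Pi's serial port (e.g. with pyserial) and serve the rendered text on an HTTP endpoint for Prometheus to scrape.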
Do you have a PurpleAir II to compare against? I suspect there will be some extra calibration or signal processing to make it more accurate that will be missing from the raw sensor.
EDIT: The link we're discussing says this explicitly: "[...] These particle counts are processed by the sensor using a complex algorithm to calculate the PM1.0, PM2.5 & PM10 mass in ug/m3. [...] PurpleAir PA-II uses two identical PMS5003 sensor units attached to each other and placed in the same shelter. [...]"
I don't think you can recommend the PMS5003 as a substitute for the PurpleAir II.
The sensor available at Adafruit uses the same algorithm for calculating PM levels; it's literally the same chip. Maybe using two of them yields more accurate results, though.
The Adafruit sensor is just the sensor, the PurpleAir II is two sensors plus extra logic that processes the readings to give extra accuracy.
You can't expect one sensor to give the same performance as two sensors plus correction logic. If it was that easy, PurpleAir could just put a case on a PMS sensor and be done with it.
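For illustration, here's a naive sketch of what "two sensors plus correction logic" could look like. The actual PurpleAir processing is proprietary, and the `max_rel_diff` threshold here is an arbitrary made-up value:

```python
def fuse_channels(a: float, b: float, max_rel_diff: float = 0.2):
    """Average readings from channels A and B if they agree.

    Returns (value, ok): the mean of the two channels, and a flag that is
    False when the channels disagree by more than max_rel_diff relative to
    their mean (suggesting one sensor may be faulty or dirty).
    """
    mean = (a + b) / 2
    if mean == 0:
        return 0.0, True
    agree = abs(a - b) / mean <= max_rel_diff
    return mean, agree
```

A single PMS5003 gives you neither the redundancy nor the disagreement signal, which is part of why the raw module isn't a drop-in substitute.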
Yet another example of the DMCA being completely, insanely imbalanced in favor of those issuing takedowns. Call me when anybody at ESL goes to jail for perjury.
It wasn't a great experience, but I personally felt like the outage was a loud overreaction from the community. I guess that the circumstances were perfect for outrage: Facebook is not likeable, it's a big change from twitch, and Facebook paid for exclusivity. I was fine with watching the games on Facebook, and part of me wishes they succeeded so that we wouldn't have a twitch monopoly.
It was just five or six years ago that my friends and I found the Twitch website unbearable. We'd always watch Twitch in VLC using Livestreamer: http://docs.livestreamer.io/
Today, we have YouTube Live and, to a much smaller extent, Mixer. More importantly, I'm not afraid to open Twitch in a desktop web browser. What changed? The main change is reliable 60fps streaming.
If Facebook can't do 60fps on day one, it might as well not try.
I remember asking Justin.tv engineers whether they thought they could break even. They said it was more than enough to show one thirty-second ad every hour (as far as I remember) to keep the lights on. But this was before the Twitch partner programs. Also, we were streaming from potato-quality laptop webcams. Watching 240p video with 20 seconds of latency was an ordinary miracle.
I imagine the costs are likely much higher today. But I'm curious. Did Facebook spend a billion dollars on content deals?
The community definitely overreacted, but the experience really was pretty poor. I watched it on fb (because I like the ESL production and I don't hate Facebook), but I had serious connectivity and quality problems even though I have fibre. On top of that, the platform is just not that great; I can't think of a single feature that was better than Twitch.