
Agreed. There was a period roughly 10-15 years ago when symbolic operators in Scala were very much in vogue. That fell out of style in a big way, and I haven't encountered symbol soup in a very long time.

Most of the conversations I have with folks about Scala issues these days center around implicits, tooling, and its decline/lack of popularity.


Not confident it's quite that straightforward. Here's a presentation from Meta showing a 6-12% increase in diff throughput for above-median users of agentic coding: https://www.youtube.com/watch?v=1OzxYK2-qsI


It's Jujutsu-based, but I imagine East River Source Control https://ersc.io/ may be building a GitHub competitor.


From my perspective the two biggest challenges of the Scala 3 migration were macros and poor tooling support.

Macros were an experimental Scala 2 feature, but they were used all over the Scala ecosystem. Because they were considered experimental, a good migration story for them was never developed. That lack of support stopped migration efforts dead in their tracks at our company for a long while. It just wasn't worth contributing simultaneous Scala 2 and Scala 3 macro support to every third-party dependency that used Scala 2 macros. That said, we did it for some and lived on a fork for others.
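
For anyone who hasn't been through it: Scala 2 def macros and Scala 3 inline/quoted macros are completely different APIs, so supporting both means writing the macro twice and keeping the versions in version-specific source directories (recent sbt picks up src/main/scala-2 and src/main/scala-3 when cross-building). A rough sketch of what that looks like; the Debug/show names here are made up, not from our codebase:

    // src/main/scala-2/Debug.scala -- Scala 2 "def macro"
    import scala.language.experimental.macros
    import scala.reflect.macros.blackbox

    object Debug {
      def show(expr: Any): String = macro showImpl
      def showImpl(c: blackbox.Context)(expr: c.Tree): c.Tree = {
        import c.universe._
        // Splice the source text of the expression plus its runtime value.
        q"""${showCode(expr)} + " = " + $expr.toString"""
      }
    }

    // src/main/scala-3/Debug.scala -- Scala 3 inline def + quotes/splices
    import scala.quoted.*

    object Debug {
      inline def show(inline expr: Any): String = ${ showImpl('expr) }
      def showImpl(expr: Expr[Any])(using Quotes): Expr[String] =
        '{ ${ Expr(expr.show) } + " = " + $expr.toString }
    }

Multiply that duplication by every macro in every dependency and it's easy to see why migrations stalled.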

IDE support for Scala 3 was really rough when it was first released. We checked in on it with every IntelliJ release for roughly three years before we decided it was far enough along. Prior to that, it was bad enough that we froze migration efforts just to keep the tooling usable and engineers productive.


> IDE support for Scala 3 was really rough when it was first released. We checked in on it with every IntelliJ release for roughly three years before we decided it was far enough along. Prior to that, it was bad enough that we froze migration efforts just to keep the tooling usable and engineers productive.

Same story for us: about three years before IntelliJ was usable, and even then it wasn't up to what it had been on Scala 2. My team still has only 2 Scala 3 repos out of about 30, and we're actually MORE adventurous than most other teams at the company!


Well, the IntelliJ experience for Scala 3 is still noticeably worse than for Scala 2. We're still cross-compiling our work projects, waiting until Scala 3 support in IntelliJ is good enough to switch.


Compiler error messages improved significantly with Scala 3. IIRC there was a dedicated effort to make them clearer and more actionable. Scala 2's error messages improved somewhat over time, but they can still be obtuse.
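
For a flavor of the difference, here's roughly what the two compilers print for the same trivial mistake (paraphrased from memory, so don't hold me to the exact wording):

    val x: String = 42

    // Scala 2, roughly:
    //   error: type mismatch;
    //    found   : Int(42)
    //    required: String

    // Scala 3, roughly:
    //   -- [E007] Type Mismatch Error: ------------------------
    //   |  Found:    (42 : Int)
    //   |  Required: String
    //
    // and Scala 3 can expand that into a longer tutorial-style
    // explanation with the -explain flag.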


Isn't the table sorted alphabetically?


True

My bad


Based on my reading of the table, the PurpleAir II sensors seemed to be the best of the sub-$300 options for PM1.0 (field R^2 of 0.96 to 0.98) and PM2.5 (field R^2 of 0.93 to 0.97). The PM10 readings were not as good (field R^2 of 0.66 to 0.70).

After skimming the rest of the table, it looks like the PurpleAir II sensors might have some of the best field R^2 for PM2.5 and PM1.0 overall.
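
For anyone not steeped in the jargon, my understanding is that "field R^2" is the coefficient of determination from regressing the sensor's readings against a co-located reference monitor under real outdoor conditions:

    R^2 = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2}

where y_i are the observed values and \hat{y}_i the regression's fitted values, so 0.96-0.98 means the two track each other almost perfectly, while 0.66-0.70 for PM10 leaves a lot of scatter.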


That sensor is also available from Adafruit (https://www.adafruit.com/product/3686). I got it hooked up to a Raspberry Pi that exposes the sensor readings in Prometheus format.
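
For anyone curious what that takes: the PMS5003 streams fixed 32-byte frames over UART, and the Prometheus side can be as simple as serving "name value" lines of text. Here's a rough Scala sketch of the core logic, not my actual setup; the byte offsets follow the Plantower datasheet as I remember it, and the metric names are invented:

    object Pms5003 {
      final case class Reading(pm1_0: Int, pm2_5: Int, pm10: Int)

      // Big-endian 16-bit word at byte offset i.
      private def word(b: Array[Byte], i: Int): Int =
        ((b(i) & 0xff) << 8) | (b(i + 1) & 0xff)

      def parse(frame: Array[Byte]): Either[String, Reading] =
        if (frame.length != 32) Left(s"expected 32 bytes, got ${frame.length}")
        else if ((frame(0) & 0xff) != 0x42 || (frame(1) & 0xff) != 0x4d)
          Left("bad start bytes")
        else {
          // The last word is a checksum: the 16-bit sum of the first 30 bytes.
          val sum = frame.take(30).map(_ & 0xff).sum & 0xffff
          if (sum != word(frame, 30)) Left("checksum mismatch")
          else
            // Words at byte offsets 10/12/14 are the "atmospheric
            // environment" PM1.0 / PM2.5 / PM10 values in ug/m3.
            Right(Reading(word(frame, 10), word(frame, 12), word(frame, 14)))
        }

      // Prometheus text exposition format is just "metric_name value" lines.
      def toPrometheus(r: Reading): String =
        s"pms5003_pm1_0_ugm3 ${r.pm1_0}\n" +
        s"pms5003_pm2_5_ugm3 ${r.pm2_5}\n" +
        s"pms5003_pm10_ugm3 ${r.pm10}\n"
    }

Parsing hard-coded captured frames like this makes the code easy to test before touching the serial port.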


Do you have a PurpleAir II to compare against? I suspect there is some extra calibration or signal processing that makes it more accurate, which will be missing from the raw sensor.

EDIT: The link we're discussing says this explicitly: "[...] These particle counts are processed by the sensor using a complex algorithm to calculate the PM1.0, PM2.5 & PM10 mass in ug/m3. [...] PurpleAir PA-II uses two identical PMS5003 sensor units attached to each other and placed in the same shelter. [...]"

I don't think you can recommend the PMS5003 as a substitute for the PurpleAir II.


The sensor available from Adafruit uses the same algorithm for calculating PM levels; it's literally the same chip. Maybe using two of them yields more accurate results, though.

The specs for particulate matter are identical:

https://www2.purpleair.com/collections/air-quality-sensors/p...

https://cdn-shop.adafruit.com/product-files/3686/plantower-p...


The Adafruit sensor is just the bare sensor; the PurpleAir II is two sensors plus extra logic that processes the readings for better accuracy.

You can't expect one sensor to give the same performance as two sensors plus correction logic. If it were that easy, PurpleAir could just put a case on a PMS sensor and be done with it.
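
To be concrete about what correction logic buys you (this is my guess at the idea, not PurpleAir's actual algorithm): with two channels you can average when they agree and throw out the reading when they diverge, which a single bare sensor can't do. A toy Scala sketch:

    // Toy sketch, not PurpleAir's algorithm: average the two channels
    // when they agree, reject the reading when they drift apart.
    def combine(a: Double, b: Double, maxRelDiff: Double = 0.2): Option[Double] = {
      val mean = (a + b) / 2.0
      if (mean == 0.0 || math.abs(a - b) / mean <= maxRelDiff) Some(mean)
      else None // channels disagree: likely debris or a failing laser
    }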


If that were true, then why are the specs for the PurpleAir II identical to the Adafruit sensor's?

You provide absolutely no data to back up your claims that there is extra logic in the PA-II or that two sensors are better than one.


I agree. Looking at lab-tested devices around the sub-$300 price point, the PA-II has good performance and seems to be readily available.


PurpleAir also makes an indoor air quality sensor that measures particulate matter: https://www2.purpleair.com/products/purpleair-pa-i-indoor

Not sure how it compares with the Awair or the Laser Egg.

I have no affiliation with PurpleAir, but they are pretty popular in Utah.


One thing I'm not totally clear on after reading the article is which AWS account the KMS key resides in. Is it Slack's AWS account or the customer's?


https://slack.engineering/engineering-dive-into-slack-enterp...

This shows the KMS key is in the customer's AWS account.


Similar experience with Dota 2 and ESL on Facebook. It was horrible, viewership was bad, and it doesn't seem like anyone was happy about it: https://twitter.com/Slasher/status/1071863217243197440

If I recall correctly, people on Twitch streamed games from the Dota Majors/Minors and got more viewers than the Facebook stream. Then ESL started DMCA'ing streams: https://www.reddit.com/r/DotA2/comments/7skt8e/did_mlpdota_j... Then Valve got involved and made a statement about the whole kerfuffle: http://blog.dota2.com/2018/01/dotatv-streaming/


Yet another example of the DMCA being completely, insanely imbalanced in favor of those issuing takedowns. Call me when anybody at ESL goes to jail for perjury.


It wasn't a great experience, but I personally felt like the outrage was a loud overreaction from the community. I guess the circumstances were perfect for outrage: Facebook is not likeable, it was a big change from Twitch, and Facebook paid for exclusivity. I was fine with watching the games on Facebook, and part of me wishes they had succeeded so that we wouldn't have a Twitch monopoly.


It was just five or six years ago that my friends and I found the Twitch website unbearable. We'd always watch Twitch in VLC using Livestreamer: http://docs.livestreamer.io/

Today, we have YouTube Live and, to a much smaller extent, Mixer. More importantly, I'm not afraid to open Twitch in a desktop web browser. What changed? The main change is reliable 60fps streaming.

If Facebook can't do 60fps on day one, it might as well not try.

I remember asking Justin.tv engineers whether they thought they could break even. They said that showing one thirty-second ad every hour (as far as I remember) was more than enough to keep the lights on. But that was before the Twitch partner program, and we were streaming from potato-quality laptop webcams. Watching 240p video with 20-second latency was an ordinary miracle.

I imagine the costs are much higher today, but I'm curious: did Facebook spend a billion dollars on content deals?


The community definitely overreacted, but the experience really was pretty poor. I watched it on Facebook (because I like the ESL production and I don't hate Facebook), but I had serious connectivity and quality problems even though I have fibre. On top of that, the platform just isn't that great; I can't think of a single feature that was better than Twitch's.

