Well, Spark does let you run distributed workloads for certain forms of computation, but it's limited to those forms (streaming, map-reduce), and it has a large operational footprint. It's also lamentable that distributed code written with Spark looks nothing like its non-distributed counterpart.
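
To make that last point concrete, here's a rough sketch in Scala (Spark's native API). The word-count example and the names in it are hypothetical, not taken from the thread or Spark's docs, but the shape is the usual one: the non-distributed version is a plain collection expression, while the Spark version pulls in a SparkSession, explicit parallelization, and a shuffle-aware reduceByKey.

    import org.apache.spark.sql.SparkSession

    object WordCountComparison {

      // Non-distributed version: plain Scala collections, no runtime machinery.
      def localWordCount(lines: Seq[String]): Map[String, Int] =
        lines
          .flatMap(_.split("\\s+"))
          .groupBy(identity)
          .map { case (word, occurrences) => word -> occurrences.size }

      // Distributed version: the same logic, expressed against Spark's RDD API.
      def sparkWordCount(lines: Seq[String]): Map[String, Int] = {
        val spark = SparkSession.builder()
          .appName("word-count")
          .master("local[*]") // in production this points at a cluster instead
          .getOrCreate()
        try {
          spark.sparkContext
            .parallelize(lines)      // ship the data out to partitions
            .flatMap(_.split("\\s+"))
            .map(word => (word, 1))
            .reduceByKey(_ + _)      // shuffle + per-key reduction
            .collect()               // pull results back to the driver
            .toMap
        } finally {
          spark.stop()
        }
      }
    }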

Something very much like Spark map-reduce can be implemented in ~100 lines of Unison code:

https://www.unison-lang.org/articles/distributed-datasets/
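
For a feel of why that can be so small, here is a rough, language-agnostic sketch in Scala rather than Unison (it is not the article's actual API; Location, Dataset, and at are made-up names for illustration): a dataset is just a list of partitions pinned to locations, and map-reduce does a partial reduction where each partition lives before combining the results.

    import scala.concurrent.{Await, ExecutionContext, Future}
    import scala.concurrent.duration.Duration

    final case class Location(name: String)

    // A dataset is a list of partitions, each living at some location.
    final case class Dataset[A](partitions: List[(Location, Seq[A])])

    object Dataset {

      // Stand-in for "run this computation where the data lives"; a real system
      // would ship the closure to the remote node rather than run it locally.
      def at[B](loc: Location)(thunk: => B)(implicit ec: ExecutionContext): Future[B] =
        Future(thunk)

      // Assumes every partition is non-empty.
      def mapReduce[A, B](ds: Dataset[A])(map: A => B)(reduce: (B, B) => B)
                         (implicit ec: ExecutionContext): Future[B] = {
        // Map and partially reduce each partition where it lives...
        val partials: List[Future[B]] = ds.partitions.map { case (loc, part) =>
          at(loc)(part.map(map).reduce(reduce))
        }
        // ...then combine the per-partition results on the caller's side.
        Future.sequence(partials).map(_.reduce(reduce))
      }
    }

    // Usage (single-machine stand-in for two "locations"):
    //   implicit val ec: ExecutionContext = ExecutionContext.global
    //   val ds = Dataset(List(Location("a") -> Seq(1, 2, 3), Location("b") -> Seq(4, 5)))
    //   Await.result(Dataset.mapReduce(ds)(_ * 2)(_ + _), Duration.Inf)  // 30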

Some videos on Unison's capabilities over and above Spark:

Distributed programming overview: https://www.youtube.com/watch?v=ZhoxQGzFhV8

Collaborative data structures (CRDTs): https://www.youtube.com/watch?v=xc4V2WhGMy4

Distributed data types: https://www.youtube.com/watch?v=rOO2gtkoZ3M

Distributed global optimization with genetic algorithms: https://www.youtube.com/watch?v=qNShVqSbQJM


