
100% this, I've been finding bare metal is getting very compelling against AWS. For example, Latitude has 4 real cores and 32 GB of RAM for $92/month.

https://www.latitude.sh/pricing/c2-small-x86?gen=gen-2

Hetzner doesn't even have specs this low, from what I can tell!

https://www.hetzner.com/dedicated-rootserver/#cores_threads_...


It has a VM with 32 GB RAM and 4x the cores for 1/10th of the price: €25/mo. Effectively even lower, because it comes with 20 TB of included traffic, and the overage cost is roughly 1/10th of the AWS egress cost.

Or, for €184/mo you can get one of their bare-metal GPU offerings with 64 GB of RAM, an i5-13500, and an RTX 4000.


I have been thinking along these lines myself. Most of the time, if we need to calculate things, we'd use a calculator or some code. We wouldn't do it in our head, unless it's rough or small enough. But that's what we ask LLMs to do!

I believe we juggle 7 (plus or minus 2) things in our short-term memory. Maybe short-term memory could be a tool!

We also don't have the knowledge of the entire internet in our heads, yet we can still be more effective at strategy/reasoning/planning. Maybe a much smaller model could be used if the only thing it had to do was use tools and have a basic grasp of a language.
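
Roughly what I have in mind, as a toy sketch (the tool names and the dispatch function here are invented, and a real system would use a safe expression parser instead of eval):

    # A small model hands arithmetic and "short-term memory" off to tools
    # instead of doing everything in its head. Everything here is a toy
    # stand-in, purely for illustration.
    scratchpad = []  # short-term memory as a tool

    def calculator(expression):
        # A real system would use a proper, safe expression parser.
        return str(eval(expression, {"__builtins__": {}}, {}))

    def remember(item):
        scratchpad.append(item)
        if len(scratchpad) > 7:  # hold roughly 7 (plus or minus 2) items
            scratchpad.pop(0)
        return "stored (%d items held)" % len(scratchpad)

    def recall(_=None):
        return "; ".join(scratchpad) or "(empty)"

    TOOLS = {"calculator": calculator, "remember": remember, "recall": recall}

    def handle_tool_call(name, argument=None):
        # The model only decides which tool to call and with what argument;
        # the actual work happens outside the model.
        return TOOLS[name](argument)

    print(handle_tool_call("calculator", "123456 * 789"))  # -> 97406784
    print(handle_tool_call("remember", "user prefers metric units"))
    print(handle_tool_call("recall"))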


I was once told that we can only hold 7 things in our heads at once, and that especially smart people might manage 9; this was by a psychologist I respect. Whether it's true or not, I am not certain. He was using it as an argument to either condense the array of things I was thinking about into smaller decisions, or to make decisions and move on instead of letting them rot my brain.

It was good advice for me.


I would love it if tantivy had a single-file format, e.g. a .tantivy extension, so you could drag it into a notebook like you can with .sqlite files.


My demo ran out of time before it did anything :(


> The cost of computing decreases over time. No one will pay $100 Billion to train GPT-6. That is absurd. The current top supercomputer in the world (Frontier) cost $600M.

Just wanted to ask the question: do you think Frontier has provided more or less value to the world than GPT-4?


GPT-4 is not a piece of hardware. The comparison makes no sense. GPT-4 was enabled by hardware.

The idea is that you use something like a Frontier to create something like a GPT-4.

My point was about cost of computing. What kind of GPT can you train with a $600M computer?


> The real issue is running out of the input window.

Isn't this what abstractions are for? You summarise the key concepts into a new input window?
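
Roughly like this (summarise() is just a placeholder for whatever model call you'd actually use, and the chunk sizes are arbitrary):

    # Compress a long text into a fresh input window by summarising chunks,
    # then summarising the joined summaries again until it fits.
    def summarise(text, max_chars):
        # Placeholder: a real implementation would call an LLM here.
        return text[:max_chars]

    def compress_to_window(text, window, chunk=4000):
        while len(text) > window:
            chunks = [text[i:i + chunk] for i in range(0, len(text), chunk)]
            text = "\n".join(summarise(c, chunk // 4) for c in chunks)
        return text

    book = "..." * 100_000                       # ~300k characters
    print(len(compress_to_window(book, 8_000)))  # fits an 8k-character window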


Sure, but if we're talking about editing an entire book, eventually the fine details do matter. That, and presumably human authors' abstractions/memories of their books are stored in some more compact form than language tokens. Though we can't be sure about that.


I wonder if AMD would adopt this too?


> Gross. Python is a million times better than ruby to read and write

Why do you think that?


I wouldn't be so harsh as to call it gross, but I also much prefer Python, because Ruby reminds me of Perl. It feels clever, but not in a way that I expect to shorten its BNF. It still bugs me a bit that Ruby has Pascal's 'end' while Python uses whitespace, but it worked out in the real world.

Given just the syntax, I would always recommend Python as a first language to scientists in a lab, rather than Ruby. The code just reads and writes itself better, without special characters.

But I think it's not fair to call Ruby gross, given that some people love C++, PHP, Bash, JavaScript… I'd take Ruby over many languages, given the choice.


Syntax is such a minor detail; I don't know why people care about it so much (unless it's APL or something similarly exotic).

The much bigger elephant in the room is the semantics. My personal pet peeve is that Ruby, just like Perl or C, doesn't have any sort of file-based isolation. While importing something in Python normally doesn't mess up any namespace except for the stuff you've just imported, Ruby basically leaves this to programmers, and they create monstrosities where one require statement can do way too much magic for my liking. And while there are some libraries and frameworks that are closer to Python in spirit ("explicit is better than implicit"), Rails is something that really throws me off, as most things just magically happen to work with some incantation that seemingly comes out of thin air.
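
To make the Python side of that concrete (the modules picked here are just for illustration):

    # In Python, an import only binds the names you ask for; it doesn't
    # silently patch or reopen anything already in scope.
    import json                      # binds only the name "json"
    from collections import Counter  # binds only "Counter"

    # So you always know where a name came from at the call site:
    data = json.loads('{"a": 1}')
    counts = Counter("abracadabra")

    # By contrast (as described above), a Ruby `require` loads the file into
    # one shared global namespace, so a library is free to define top-level
    # methods or reopen core classes, and nothing at the call site tells you
    # which require made a given name appear.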

This new autocomplete thing may be a real breakthrough for people who learn by getting their hands dirty and trying to write something, probing around the available methods and functions to find the appropriate one. If it can suggest what's possible/available, a lot of the magic may fade away and become proper, explainable hard science.

Just a personal opinion, of course.


What "special characters" are you talking about in Ruby? My impression is that Ruby and Python are roughly equivalent in terms of non-alphanumeric characters used in syntax.


Some symbols I can think of that Ruby uses that Python doesn't:

- $ for global vars
- @ for instance vars
- :: for namespace stuff
- => for map key/value separator
- { || } for blocks
- .. and ... for ranges
- %w for special array construction
- : for symbols
- ? and ! in method names
- ?: for ternary expressions
- #{} for string interpolation

And some Python uses that Ruby doesn't have:

- : and :: for slices
- @ for decorators

And for the symbols they both share, subjectively, a lot of them are used more often in Ruby.


Ruby does have the block-related stuff, like the & argument and single-line blocks with { }. But other than that, I also think it's relatively similar to Python (which doesn't even support blocks anyway).


From my perspective (DevOps/SRE), Ruby is a horrible platform. It is heavy on resources, and it is difficult to run (Unicorn is a pain), maintain, monitor, and debug. Many Ruby projects have silently failed (Chef? Puppet?), and the biggest Ruby tool in the DevOps world, GitLab, is incredibly difficult to run on-premise and struggles with a ton of issues that I believe are caused by the platform. If I had a choice, I would never work on a Ruby project.


As both a Ruby programmer and a DevOps/infrastructure engineer in the past, I have found Ruby to be a superlative programming environment, but just as easy to program in badly as any other. I have never found it harder to run, monitor, or debug than any other platform, and certainly not harder than building and tuning a Java server platform. Really, I think Ruby has been a bigger influence on brilliant tools and practices in modern development than anything else, even if it doesn't come close to the performance of a static language.


You're spot on. You can see its influence in a lot of tooling that came after, such as several package managers (from yarn to cargo), Java collections syntax, Go structural typing, Python's gunicorn, and the JVM's invokedynamic (introduced originally for JRuby), among others I can't remember off the top of my head. Several new languages were created by, or benefited from the collaboration of, ex-Rubyists, from Elixir to Rust to Node, which greatly influenced their approach to developer ergonomics. Even if the world were to stop using Ruby forever (which will never happen, why would it...), its influence would last for a long time, after which it'd be rediscovered again after the mandatory forgetfulness cycle.


Some great examples there, and look at how many frameworks are “$LANG on Rails”!

Rubyists introduced me to automated provisioning and deployment - Puppet, Chef, Capistrano - as well as to concepts like test-driven development and the genius of metaprogramming, but it's common to hear JavaScripters waxing lyrical about TDD while slagging off Ruby.

Ruby is like 12-Bar Blues: people who love rock music don’t always like hearing blues. My favourite story is about seeing Earl Slick and Bernard Fowler performing Bowie, and they struck up a long bluesy intro which caused one of the two older fellas standing next to me at the bar to turn to me and say, “I really don’t like all this blues crap!” only for the “blues crap” to become The Jean Genie two seconds later.

Careful what threads you try and unravel as they may weave your own narrative…


From my DevOps perspective, I've built large hybrid cloud deployments in Ruby (from scratch, including an orchestrator written in Ruby) and would again, given the right requirements. As for tooling, the ease of writing Ruby outweighs anything else, and as for running it, it is no different or more complicated than any other container workload. It's a bizarre objection.


> As for tooling, the ease of writing Ruby outweighs anything else

Your teammates probably object to that ease of writing, as they did for Perl 20 years ago.


Whether you write unreadable code or not is not a function of language. This is a lazy attempt at criticism. My preference for Ruby is in part because reading and understanding well written Ruby is a joy compared to every other of the dozens of languages I've used.


And since (un)readable code is not a function of language, you cannot state in your next sentence that reading well-written Ruby is a joy in comparison with other languages.


Of course I can; there is no contradiction there. You can write readable code in any language, even assembly, but that does not mean well-written code in a language that is also particularly readable won't be more of a joy to read than well-written code in a verbose or hard-to-read language.


The first statement is provably false, empirically, as huge systems are overwhelmingly not written in dynamically typed languages.

Not all languages are created equal, and some were designed to be more conducive to maintainable code for large teams of developers.

Ruby is not one of these languages.


> The first statement is provably false, empirically, as huge systems are overwhelmingly not written in dynamically typed languages.

Your conclusion is not supported by the claim you try to support it with.

> Ruby is not one of these languages.

You're free to think so, but you've not provided anything but unsupported conjecture and logically invalid reasoning to support your belief, so rather than convincing me, you've provided an additional reason to question your judgement.


What do you mean by huge systems? Many of the largest web platforms were indeed built with dynamically typed languages. And these days JavaScript somehow ends up being used for almost everything you can think of, except an OS kernel.


Chef's problems "with Ruby" were largely design problems. The whole structure of the run collection and the "two-pass parsing" design made it so that users almost immediately had to fully understand how the Ruby parser saw Ruby code. That wasn't really Ruby's fault, and there could have been other ways to structure recipes and avoid smacking new users with Ruby syntax quite so hard.


Which one is Elixir LiveView?


The one you use and don't tell anyone about, so you have an advantage.


I agree 100%, and I'd add that static typing helps when you inherit a poorly written codebase. This, unfortunately, is probably 99% of codebases in the wild. It has happened to me so many times: I've seen a lack of test coverage and a lack of understanding of the codebase, combined with the business requirement to make changes anyway.

Having a type system in place makes minor refactoring possible in this nightmare scenario.

I have been in the situation of poorly written Ruby codebases, and I can tell you 100% that I would prefer to have a poorly written Java codebase with its static types. I prefer Ruby as a language, but man, when it's bad, it's terrible. Just trying to work out the intent of a function when multiple types are passed in as the same argument across the codebase is pure hell.
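
To illustrate that last point with a toy example (the function and types here are invented, and Python's gradual hints aren't the same as Java's static types, but the effect on readability is similar):

    from typing import Union

    class User:
        def __init__(self, email):
            self.email = email

    # The painful case: callers pass an id, a raw dict, or a full object
    # through the same parameter, and only a codebase-wide search tells you
    # which shapes actually occur.
    def notify(user):
        pass

    # Even a coarse annotation pins the intent down before you refactor, and
    # a type checker (mypy, pyright, ...) flags callers you'd otherwise only
    # discover at runtime.
    def notify_typed(user: Union[int, dict, User]) -> None:
        if isinstance(user, User):
            print("emailing", user.email)
        else:
            print("need to look the user up from", repr(user), "first")

    notify_typed(User("a@example.com"))
    notify_typed(42)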

