Hacker News | truth_seeker's comments

> On the topic of memory, with millions of particles the server barely breaks over 100mb

Although it's still experimental, the arena package is a natural fit here.


The arena experiment is on indefinite hold:

> Note, 2023-01-17. This proposal is on hold indefinitely due to serious API concerns.

https://github.com/golang/go/issues/51317

Potential successor: https://github.com/golang/go/discussions/70257


>Note that the best-case scenario is the elimination of the overheads above to 0, which is at most ~10% in these particular benchmarks. Thus, it's helpful to consider the proportion of GC overhead eliminated relative to that 10% (so, 7% reduction means 70% GC overhead reduction).

Wow, amazing to see that off-heap allocation can be that good.

https://go.googlesource.com/proposal/+/refs/heads/master/des...


Meanwhile Java and .NET have had off-heap and arenas for a while now.

Which goes to show how much better Go could be, had it been designed with the lessons of other languages taken into account.

The adoption of runtime.KeepAlive() [0], and the related runtime.AddCleanup() as a replacement for finalizers, are also lessons from other languages [1].

[0] - https://learn.microsoft.com/en-us/dotnet/api/system.gc.keepa...

[1] - https://openjdk.org/jeps/421


What a coincidence! :)

Recently used MemorySegment in Java, it is extremely good. Just yesterday I implemented the Map and List interfaces using MemorySegment as a backing store for batch operations instead of using OpenHFT stuff.

Tried -XX:TLABSize before but wasn't getting the desired performance.

Not sure about .NET though, haven't used it since last decade.
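For anyone curious what that looks like, here is a minimal off-heap "list" sketch using MemorySegment from the FFM API (finalized in Java 22); the class and variable names are illustrative, not the parent's actual code:

```java
import java.lang.foreign.Arena;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;

// A tiny off-heap array of longs backed by a MemorySegment.
public class OffHeapLongs {
    public static void main(String[] args) {
        // Confined arena: memory is freed deterministically on close,
        // invisible to the GC in the meantime.
        try (Arena arena = Arena.ofConfined()) {
            int n = 1_000;
            MemorySegment seg = arena.allocate(ValueLayout.JAVA_LONG, n);
            for (int i = 0; i < n; i++) {
                seg.setAtIndex(ValueLayout.JAVA_LONG, i, 2L * i);
            }
            long sum = 0;
            for (int i = 0; i < n; i++) {
                sum += seg.getAtIndex(ValueLayout.JAVA_LONG, i);
            }
            System.out.println(sum); // sum of 2*i for i in [0, 1000) = 999000
        }
    }
}
```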


JSON serialization, seriously???

Why not an "application/octet-stream" content type and sending an ArrayBuffer over the network?


Because JSON serialization is built into the browser.

I'm obviously a huge fan of binary serialization; I wrote Cap'n Proto and Protobuf v2 after all.

But when you're working with pure JS, it's hard to be much faster than the built-in JSON implementation, and even if you can beat it, you're only going to get there with a lot of code, and in a browser code footprint often matters more than runtime speed.
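A quick sketch of that trade-off: round-tripping the same data via built-in JSON versus a hand-rolled ArrayBuffer layout (the particle shape here is made up for illustration):

```javascript
const particles = [{ x: 1.5, y: -2.25 }, { x: 0.5, y: 3.0 }];

// 1. JSON: one built-in call each way, zero extra code shipped to the browser.
const json = JSON.stringify(particles);
const fromJson = JSON.parse(json);

// 2. Binary: 2 doubles per particle = 32 raw bytes, smaller than the JSON
//    text -- but the encode/decode logic is yours to write and maintain.
const buf = new ArrayBuffer(particles.length * 16);
const view = new Float64Array(buf);
particles.forEach((p, i) => { view[2 * i] = p.x; view[2 * i + 1] = p.y; });

const fromBuf = [];
for (let i = 0; i < view.length; i += 2) {
  fromBuf.push({ x: view[i], y: view[i + 1] });
}

console.log(json.length, buf.byteLength); // JSON text size vs raw byte size
```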


Any reasons to use Java over TypeScript, Go or Rust for server-side programming?


Spring is very capable and has lots of things built-in


Magic-RegExp aims to create a compiled-away, type-safe, readable RegExp alternative that makes the process a lot easier. https://blog.logrocket.com/understanding-magic-regexp-regexp...

example from blog:

  import { createRegExp, exactly, wordChar, oneOrMore, anyOf } from "magic-regexp";

  const regExp = createRegExp(
    exactly("http")
      .and(exactly("s").optionally())
      .and("://")
      .optionally()
      .and(exactly("www.").optionally())
      .and(oneOrMore(wordChar))
      .and(exactly("."))
      .and(anyOf("com", "org", "io")),
    ["g", "m", "i"]
  );

  console.log(regExp);
  // /(https?:\/\/)?(www\.)?\w+\.(com|org|io)/gmi


> TLS servers now prefer the highest supported protocol version, even if it isn’t the client’s most preferred protocol version.

>Both TLS clients and servers are now stricter in following the specifications and in rejecting off-spec behavior. Connections with compliant peers should be unaffected.

This is nice.


Nah! I am not convinced that context engineering is better (in the long term) than prompt engineering. Context engineering is still complex and needs maintenance. It's much lower level than human-level language.

Given domain expertise of the problem statement, we can apply the same tactics from context engineering at a higher level in prompt engineering.


Going to disagree here.

Early in the game when context windows were very small (8k, 16k, and then 32k), the team I was working with achieved fantastic results with very low incidence of hallucinations through deep "context engineering" (we didn't call it that but rather "indexing and retrieval").

We did a project for Alibaba and generated tens of thousands of pieces of output. They actually had human analysts review and grade each one for the first thousand. The errors they found? Always in the source material.


Are we on the same page?

What's really stopping you from parsing and prioritising CUSTOM CONTEXT if it's given as a text instruction in prompt engineering?


That's why indexing and retrieval is perhaps the better term. Custom context doesn't exist unless a team makes it so.


This whole industry is complex and needs constant maintenance. APIs break all the time -- and that's assuming they were even correct to begin with. New models are constantly released, each with their own new quirks. People are still figuring out how to build this tech -- and as quickly as they figure one thing out, the goal posts move again.

This entire field is basically being built on quicksand. And it will stay like this until the bubble bursts.


Agreed, but making ENGLISH or any human-speakable language the main interface should be given the highest priority IMHO!


You mean something like this:

https://www.npmjs.com/package/pkg

or perhaps this one:

https://www.npmjs.com/package/nexe


Compared to other programming languages, Rust's compiler and linters go a long way to implement best practices at build time.


It's a really great effort, but I would stick with PostgreSQL with PL/pgSQL procedures and tons of extensions.

Lowering the TRANSACTION ISOLATION LEVEL, asynchronous commits on disk-backed tables, and UNLOGGED tables can help you go a long way.

Also, in the next major version (v18) of PG, they are trying to implement io_uring support, which will further drastically improve random read/write performance.
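For reference, the knobs mentioned above look roughly like this (table and column names are made up):

```sql
-- Async commit: COMMIT returns without waiting for the WAL flush.
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
SET synchronous_commit = off;

-- UNLOGGED skips WAL entirely: much faster writes, but the table is
-- truncated after a crash, so only use it for reconstructible data.
CREATE UNLOGGED TABLE particle_state (
    id  bigint PRIMARY KEY,
    pos double precision[]
);
```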


What about improving the compilation time?

