Hacker News | JimmyAustin's comments

You might have to mess around with the software to figure out the details, but from a hardware perspective a hall effect keyboard should at least be able to infer the speed a key is travelling at when it bottoms out.


There are a great many of them, you just can't see them in the dark forest. https://www.paradigm.xyz/2020/08/ethereum-is-a-dark-forest


Hey, Replit employee here. I'm pretty sure this isn't us (definitely isn't our marketing team's MO AFAIK). Can you email me some examples at james @ replit dot com so I can look into this?


It reduces how regressive the tax is, but as long as lower-income people spend any money at all on nonexempt goods it will never be neutral.


GST and VAT are the same thing, different countries just call them different things.

As an example, GST in Australia is included in the price on the shelf. You can also claim back the GST you paid on your input goods, which means you effectively pay the difference between the GST your customers pay you and the GST you paid your suppliers (which is the value added!).
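The input-credit mechanics can be sketched in a few lines. This is a simplified illustration, not tax advice; it assumes Australia's 10% GST rate and ex-GST prices, and the function name is mine.

```python
# Sketch of GST/VAT input-credit mechanics, assuming Australia's 10% rate.
GST_RATE = 0.10

def gst_remitted(sales_ex_gst, inputs_ex_gst, rate=GST_RATE):
    """Net GST a business sends to the tax office:
    GST collected from customers minus GST claimed back on inputs."""
    collected = sales_ex_gst * rate
    paid = inputs_ex_gst * rate
    return collected - paid

# Buy materials for $100, sell the finished good for $150 (both ex-GST):
# you remit 10% of the $50 of value you added.
print(gst_remitted(150, 100))  # 5.0
```

The tax collected at each stage of the supply chain nets out to 10% of that stage's value added, which is where the name comes from.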


FWIW, I had the same issue with my 3090 (though I believe that uses a slightly different port?). I was using a custom cable like this guy. Nvidia replaced it under warranty, and I went back to using the (ugly) provided adapter.


> they are assistants. They aren’t inventors or prognosticators. They cannot predict how the market will react to a product or idea.

Can you?


Humans can obviously invent, assist, prognosticate, and predict. Just because not every human may be able to do every one of those things doesn't invalidate the point that LLMs cannot do those things.


TLDR: If he buys a car a year, and drives the same number of miles as the average American, his total emissions are probably about equal to those of someone who drives 28k miles a year.

Carbon emissions for a Model 3 vs a Toyota Corolla even out after 13,500 miles according to Argonne National Laboratory [1], which is slightly less than the average an American drives per year (14,263 miles [2]). Assuming that he drives as much as the average driver, that his cars generate as much CO2 as a Model 3 (definitely not true for the Cybertruck, but he probably has low-build-emission ICE cars in the other 8 to lower the average), and that he buys a car a year, he has roughly the equivalent emissions of someone who drives twice the average number of miles each year. For reference, a long haul driver (of which there are 300k-500k in the US [3]) drives 100-110k miles [4] a year (7-8x the average).
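The back-of-envelope arithmetic above works out like this (all figures are the comment's own assumptions, not measurements):

```python
# Manufacturing emissions of one car ~= the CO2 of one break-even's worth
# of driving (ANL figure via Reuters), so buying a car every year roughly
# doubles the effective annual footprint.
BREAKEVEN_MILES = 13_500       # Model 3 vs Corolla CO2 break-even point
AVG_MILES_PER_YEAR = 14_263    # average US driver

effective_miles = AVG_MILES_PER_YEAR + BREAKEVEN_MILES
print(effective_miles)  # 27763 -- roughly the "28k miles a year" in the TLDR
```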

[1] https://www.reuters.com/business/autos-transportation/when-d...

[2] https://www.thezebra.com/resources/driving/average-miles-dri....

[3] https://www.npr.org/sections/money/2021/05/25/999784202/is-t....

[4] https://www.caltrux.org/driver-faqs/#:~:text=Begin%20a%20Car....


Truck drivers are doing a commercial activity where it is reasonable to largely attribute the emissions to the customer.


Yes, those Americans driving on average 77 miles a day. Every day.


A lot of analysis is done using CSVs being pushed in and out of Excel. Doing so strips the formatting. Please understand the workflows before crying "skill issue".


You can write CSVs in such a way that forces Excel to treat fields as specific data types:

"=""Data Here"""

will always be treated as a string. This is also supported by Sheets, apparently.
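Here's a hedged sketch of generating that pattern with Python's csv module. The `="…"` wrapper makes Excel evaluate the cell as a string formula instead of coercing it (the gene name `MARCH1`, famously mangled into a date, is used as the example; the helper name is mine):

```python
import csv
import io

def excel_string(value):
    """Wrap a field as an Excel string formula, e.g. MARCH1 -> ="MARCH1",
    so Excel won't coerce it into a date or number on import."""
    return f'="{value}"'

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(["gene", "count"])
writer.writerow([excel_string("MARCH1"), 42])

print(buf.getvalue())
# The data row is written as: "=""MARCH1""","42"
```

The csv module's quoting rules double the embedded quotes, producing exactly the `"=""Data Here"""` form shown above.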


TIL, neat! Does Excel automatically export like that if you format the columns as strings?


ok but will that work with the other tools that work with those csvs? I imagine they export / import between excel and csv for a reason.


Yes, but you can create a small macro to change the formatting and assign that to your table. Your CSV should then import deterministically. That's not impossible to do.

I saw a complete analysis engine written as an Excel file, which accepts and exports CSVs cleanly. It can be done.


You don’t even need to write a macro. Excel’s Import Text Wizard will allow you to assign data type to each column at the time of import.


Same for Power Query and Text to Columns.


"It can be done" is a far cry from "it's reasonable to expect scientists to do this".


As a person who does research and supports researchers, I can't see the gap, sorry.

I understand some people don't know it's possible, and some don't care, but any competent researcher is expected to master the tools they use. This is esp. true for career researchers.


I'm not disputing that a competent computer user can do it. From the perspective of "this is what I would do if I was a scientist", you're totally correct.

But when you're writing guidelines for an entire field - as the article describes HGNC doing - you're catering to all researchers in that field: good, bad, and ugly. Plus technicians, editors, admins and anyone else that might handle the files. Given how hidden and unintuitive Excel's behaviour is here, I think what they're doing makes sense.


I agree.

As a researcher you may have to learn how to carefully dig up skulls, raise rats, handle lasers, remember not to accidentally syringe yourself with viruses etc.

Getting cut by Excel seems like part of the job and at least is hopefully less life threatening than possibly blowing yourself up or giving yourself silicosis.

That said, the problem with computers is that they're pervasive, they're a moving target, and it's often a case of the blind leading the blind when it comes to research. Probably more and more research groups need dedicated computer technician resources who can centralize the computing knowledge required to keep the group running.


Sorry, can't understand, and that excuse is unacceptable. It's totally a skill issue, a failure of competence, and laziness on the part of these 'scientists'.

You'd expect scientists - people working to understand the nature of reality - to have some base competency in how they measure reality. They could at least have used a database for things like this; moreover, any decent database can import from and export to CSV. Excel is not at fault here; the 'scientists' are.


Haha, you think scientists have control over what software their employer buys for them to work on.

There are probably wonderful places where that is the case, and probably several of those places use something like libreoffice which doesn't do the idiotic data conversions excel does, but they are definitely not the norm.


Coming up next: people should still use C because writing insecure code is a “skill issue”.

Why can’t we as a society make ANYTHING easier without the usual blathering on from the peanut gallery turning it into a question of one’s intelligence?


The model they use (BAAI/bge-small-en-v1.5) produces embeddings 384 wide; at float32 that's 1,536 bytes each, so all the vectors for 100 items total about 150 KB. Calculating the dot product of that against a query will take well under a millisecond, even with a naive implementation.
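A minimal sketch of that brute-force search, assuming the embeddings are unit-normalized (as bge-style models typically are) so a dot product equals cosine similarity. Random vectors stand in for real model output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for 100 precomputed 384-dim float32 embeddings
# (100 * 384 * 4 bytes = 153,600 bytes, the ~150 KB mentioned above).
docs = rng.standard_normal((100, 384)).astype(np.float32)
docs /= np.linalg.norm(docs, axis=1, keepdims=True)  # unit-normalize rows

query = rng.standard_normal(384).astype(np.float32)
query /= np.linalg.norm(query)

# One 100x384 matrix-vector product (~38k multiply-adds) scores everything.
scores = docs @ query
top5 = np.argsort(scores)[::-1][:5]  # indices of the 5 best matches
print(top5, scores[top5])
```

At this scale there is nothing a vector database would buy you; the whole index fits comfortably in L2 cache.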


This is actually what llama-index does if you don't use a vectordb integration

