
My ex-coworkers at SpaceX are pretty damn motivated to get to Mars, many of them despite all the Musking happening. I wouldn't bet against them.


Can they please get him to Mars?


I wouldn't bet against them either, but the materials science aspect of it all just isn't there and won't be for a while. How many tons would we need to get into Earth orbit alone, let alone transfer to Mars?

I rarely hear anyone speak of the radiation outside our atmosphere when it comes to 'moonshot' ideas like this, and how we would be incapable of preventing it or surviving it once we arrive, in our current biological form.

How much would it cost to get all the lead or H2O you would need to generate a barrier against it into orbit? Do we need a moon base extracting materials before we can even think about transferring orbits?

It's all pie in the sky, and that is great because the sky is a pie that we should long to eat, but let's not fool ourselves that in our lifetimes, or our children's children's lifetimes, we will have a human on a planet that is not Earth.


> I rarely hear anyone speak of the radiation outside our atmosphere when it comes to 'moonshot' ideas like this, and how we would be incapable of preventing it or surviving it once we arrive, in our current biological form.

Eh, it's not that far outside the realm of the possible. It's about twice what ISS astronauts experience, without any mitigation efforts like shielding. https://www.jpl.nasa.gov/images/pia04258-comparison-of-marti...

> How much would it cost to get all the lead or H2O you would need to generate a barrier against it into orbit? Do we need a moon base extracting materials before we can even think about transferring orbits?

The astronauts will need water either way. Might as well have it be useful in transit.


It will never happen. There is zero evidence that any organism as complex as a human can move between planets. It's not an emergent event common enough for us to have observed so far; it's too unlikely.

Some dried goo in a meteor, or some poetic notion of consciousness being the innate physical interaction of electromagnetism and matter such that consciousness is everywhere and imagining the potential, is as good as humans will ever get.


We also didn't evolve to do math or science; a lot of our intelligence is incidental, born of the communication and strategy that we did evolve for.

Given time and will, it is a guarantee that humans could colonize Mars. Heck, even terraforming is possible on large enough timescales with an enormous concerted effort.

But there's nothing for us there. The cost in resources, opportunity, and human lives would be enormous, and there's almost no payback at the end beyond saying "neat".

We're struggling politically to keep our own planet from boiling us alive, and even to give food, water, shelter, and healthcare to the citizens within our country - let alone the planet - let alone another one. THAT is why it will never happen, not because we didn't evolve for space travel.


IMO the things you mention as what really prevents us are themselves emergent evidence that we are not evolved enough for space travel, though. Biologically or socially.


They can be as excited as they want, but we are decades away from there even being a sliver of a chance of it happening. None of the science or groundwork is there to actually make it happen.

Any manned trip to Mars is a guaranteed suicide mission, if they even get there at all.

And that's without even discussing the politics of a system like that. We can't even agree that children should have food in this country. Our population is getting poorer, getting less healthcare, losing hope, and gaining debt. And somehow we're ready to create a colony that is entirely dependent on us, that can never be abandoned, and that requires resupplies so expensive they could feed every child in America multiple times over, for no known benefit?


We will be on Mars by the end of the year.


How many years have they been pushing that line now?


That's the trick. You don't say which year.


You ever see that video of the flat Earther who disregarded what his own eyes saw for his memorized semantics?

Your ex-coworkers go to work whistling the Star Trek TNG theme song?

Come on… the shit people believe? Some people just "believe" to make money, but so many more believe in American civil religion, or whatever stream of consciousness they simmered in early on.

I'm not hating. I'm saying direct experience is truth, not our visual syntax. Still waiting for my nuclear-powered… everything. Where's my mini commuter helo and… etc etc etc

It’s a government job with extra steps. No hate. Good grift if you can get it.


Yeah, I've wanted to build this for ages (and have tried a couple of times). The use case is festivals/sporting events and other places where permanent infrastructure doesn't really exist. The hard part is keeping messages small if you want to include any of the token tech you're talking about - probably, a system where your payment for usage is that you act as an active relay node is more effective. Something something trust models, à la existing cert-signing models.


It's best for places like football games or festivals, where the traditional network gets overrun.


If you go to a football game or a festival just to frantically keep messaging, you'd be better off staying at home.


Spoken like someone who's never lost their pal at a festival


And doesn't see the potential sexy use-cases...


You are not getting laid by showing off your Bluetooth mesh chat app. Go to horny jail bonk


Wild. I was in my 4th/5th year of college during the pandemic and had an internship writing mission planning tooling for Lucy. It felt like either two weeks or ten years, coding on the couch while we were pretty sure the world was ending outside. Bizarre times.


Don’t do this, it makes it a huge pain to test bar().

When you write the initial code to figure out bar, just throw that code in a unit test so you can run it any time. Stop throwing tests away!


Yes, perhaps I wasn’t clear — I’m advocating for just the inlined version, not the nested function in the general case.

The nested function is, I think, fine when it’s so tiny it’s not worth unit testing (like, it-could-have-been-a-lambda small) or when bar is returned by foo (ie foo is higher order), in which case you can test the return value. Apart from that… restraint!
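
A minimal TypeScript sketch of the trade-off, with made-up bodies for foo and bar (only the names come from this thread): the extracted helper can be unit tested and the exploratory code kept as a test, while the nested version hides bar from tests unless foo returns it.

    // Extracted: bar is importable and testable on its own.
    // (Bodies are hypothetical; only the names foo/bar come from the discussion.)
    export function bar(xs: number[]): number {
      return xs.reduce((acc, x) => acc + x, 0);
    }

    export function foo(xs: number[]): string {
      return `total: ${bar(xs)}`;
    }

    // "Stop throwing tests away": keep the exploratory code as a check.
    // A real project would put this in its test runner instead of a bare assert.
    console.assert(bar([1, 2, 3]) === 6, "bar should sum its inputs");

    // The nested variant the thread cautions against once bar is non-trivial:
    // barNested is invisible to tests unless fooNested happens to return it.
    export function fooNested(xs: number[]): string {
      function barNested(ys: number[]): number {
        return ys.reduce((acc, y) => acc + y, 0);
      }
      return `total: ${barNested(xs)}`;
    }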


The target market is the same as for normal mining; you'd probably mine platinum. The cost probably only closes with either Starship or mining fuel in space.


100% - this is a margin play on the platinum group metals; we're just going to better ore sources.


In what languages is n % 2 equal to -1 for negative odd numbers?

Edit: apparently JS, Java, and C all do this. That's horrifying.


Horrifying? It’s mathematically correct.


It's a semantics problem, not a maths problem - modulus and remainder are not the same operation. This easily trips people up, since `%` is often called "modulo" yet is implemented as a remainder operation in many languages.

https://stackoverflow.com/questions/13683563/whats-the-diffe...


It's actually really awkward. Math usually considers (-7 mod 5) === (2 mod 5). But in C, (-7 % 5 != 2 % 5).


No. Math considers -7 = 3 modulo 5. It's a ring that repeats every 5 units: -7 + 5 + 5 = 3.

Think of a clock which is a ring of size 12. In a clock, going backwards 15 hours (-15) is the same as going backwards 3 hours (-3) which is the same as going forwards 9 hours.

-15 = -3 = 9 modulo 12


You are correct. I thinko'd and missed the edit window. I meant to say:

Math usually considers (-7 mod 5) === (3 mod 5). But in C, (-7 % 5 != 3 % 5).

The issue is that -7 and 3 are congruent, but the % operator keeps the sign. So -7 % 5 yields -2, not +3. Those are congruent, but not equal. I've never had a use for this behaviour, but I've definitely had to work around it. The lazy way is ((x % n) + n) % n which is safe (assuming n > 0).
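
For the record, a tiny TypeScript/JS sketch of the behaviour and of that workaround; JS's % keeps the dividend's sign just like C's and Java's:

    // % in JS/TS (as in C and Java) is a remainder that keeps the dividend's sign.
    console.log(-7 % 5);            // -2, not 3
    console.log(-7 % 5 === 3 % 5);  // false, even though -7 and 3 are congruent mod 5

    // The "lazy way" from the comment: a true non-negative modulo, assuming n > 0.
    function mod(x: number, n: number): number {
      return ((x % n) + n) % n;
    }

    console.log(mod(-7, 5));   // 3
    console.log(mod(-15, 12)); // 9 - the clock example: back 15 hours = forward 9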


+1. I wholeheartedly agree. That "lazy way" looks all too familiar haha.


Wrong. It's not any more correct than 1; the key part of an "equivalence" class is that the elements are "equivalent".


An algorithm is “a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer”

How does a giant pile of linear algebra not meet that definition?


It's not made of "steps"; it's an almost continuous function of its inputs. And a function is not an algorithm: it is not an object made of conditions, jumps, terminations, ... Obviously it has computational capabilities and is Turing-complete, but it is the opposite of an algorithm.


If it weren't made of steps, then Turing machines wouldn't be able to execute it.

Further, this is probably running an algorithm on top of an NN. Some kind of tree search.

I get what you’re saying though. You’re trying to draw a distinction between statistical methods and symbolic methods. Someday we will have an algorithm which uses statistical methods that can match human performance on most cognitive tasks, and it won’t look or act like a brain. In some sense that’s disappointing. We can build supersonic jets without fully understanding how birds fly.


Let's say, rather, that Turing machines can approximate the execution of an NN :) That's why there are issues related to numerical precision. But the converse is also true: NNs can discover and use techniques similar to those used by traditional algorithms. However, the two remain different ways of doing computation, and it's probably not just by chance that many things we can't do algorithmically, we can do with NNs. What I mean is that this is not just because NNs discover complex algorithms via gradient descent, but also because the computational model of NNs is better suited to solving certain tasks. So the inference algorithm of NNs (doing multiplications and other batch transformations) is just what standard computers need in order to approximate the NN computational model. You could do this analogically, and (maybe?) nobody would claim it's running an algorithm. Or that brains themselves are algorithms.


Computers can execute precise computations; it's just not efficient (and it's very slow).

NNs are exactly what "computers" are good for, and what we've been using them for since their inception: doing lots of computations quickly.

"Analog neural networks" (brains) work much differently from what are "neural networks" in computing, and we have no understanding of their operation to claim they are or aren't algorithmic. But computing NNs are simply implementations of an algorithm.

Edit: upon further rereading, it seems you equate "neural networks" with brain-like operation. But the brain was an inspiration for NNs; they are not an "approximation" of it.


But the inference itself is orthogonal to the computation the NN is doing. Obviously the inference (and training) are algorithms.


NN inference is an algorithm for computing an approximation of a function with a huge number of parameters. The NN itself is of course just a data structure. But there is nothing whatsoever about the NN process that is non-algorithmic.

It's the exact same thing as using a binary tree to discover the lowest number in some set of numbers, conceptually: you have a data structure that you evaluate using a particular algorithm. The combination of the algorithm and the construction of the data structure arrive at the desired outcome.
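
To make that concrete, here is a hedged TypeScript toy (the shapes, weights, and activation are invented for illustration, not any real model): the network is just a data structure, and inference is an ordinary loop over it.

    // The "data structure": a list of dense layers with made-up weights.
    type Layer = { weights: number[][]; biases: number[] };

    function relu(x: number): number {
      return Math.max(0, x);
    }

    // The "algorithm": evaluate layer by layer - matrix-vector multiply, add bias, activate.
    function infer(layers: Layer[], input: number[]): number[] {
      let activation = input;
      for (const layer of layers) {
        activation = layer.weights.map((row, i) =>
          relu(row.reduce((sum, w, j) => sum + w * activation[j], layer.biases[i]))
        );
      }
      return activation;
    }

    // A tiny 2-input, 2-hidden, 1-output network with arbitrary numbers.
    const net: Layer[] = [
      { weights: [[0.5, -0.2], [0.1, 0.8]], biases: [0.0, 0.1] },
      { weights: [[1.0, -1.0]], biases: [0.05] },
    ];
    console.log(infer(net, [1.0, 2.0]));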


That's not the point, I think: you can implement the brain in BASIC, in theory, but this does not mean that the brain is per se a BASIC program. I'll provide a more theoretical framework for reasoning about this: if the way an NN solves certain problems (the learned weights) can't be translated into some normal program that DOES NOT resemble the activation of an NN, then NNs are not algorithms but a different computational model.


This may be what they were getting at, but it is still wrong. An NN is a computable function. So NN inference is an algorithm for computing the function the NN represents. If we have an NN that represents a function f, with f(text) = the most likely next character a human would write, then running inference for that NN is an algorithm for finding out which character a human would most likely write next.

It's true that this is not an "enlightening" algorithm, it doesn't help us understand why or how that is the most likely next character. But this doesn't mean it's not an algorithm.


We don’t have evidence that a TM can simulate a brain. But we know for a fact that it can execute a NN.


> It's not made of "steps", it's an almost continuous function to its inputs.

Can you define "almost continuous function"? Or explain what you mean by this, and how it is used in the A.I. stuff?


Well, it's a bunch of steps, but they're smaller. /s


Each layer of the network is like a step, and each token prediction is a repeat of those layers with the previous output fed back into it. So you have steps and a memory.
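
A rough TypeScript sketch of that loop; predictNext here is a hypothetical stand-in for a real model's forward pass, not an actual API:

    // Stand-in for one pass through all the layers (the "steps").
    // A real model would score the context; this placeholder just returns a dummy token.
    function predictNext(context: number[]): number {
      return context.length % 100;
    }

    // Each iteration re-runs the layers and feeds the chosen token back in (the "memory").
    function generate(prompt: number[], maxTokens: number): number[] {
      const tokens = [...prompt];
      for (let i = 0; i < maxTokens; i++) {
        const next = predictNext(tokens);
        tokens.push(next);
      }
      return tokens;
    }

    console.log(generate([7, 42], 5));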


I would say you are right that a function is not an algorithm, but it is an implementation of an algorithm.

Is that your point?

If so, I've long learned to accept imprecise language as long as the message can be reasonably extracted from it.


> continuous

So, steps?


"Continuous" would imply infinitely small steps, and as such, would certainly be used as a differentiator (differential? ;) between larger discrete stepped approach.

In essence, infinite calculus provides a link between "steps" and continuous, but those are different things indeed.


If you had SpaceX's budget, you could make all of these things happen.


Didn't they already burn through their Starship development budget?


They're a private company funded personally by the richest man in the world. What makes you think there is such a thing as a "starship development budget" and that it's limited?


Musk's interplanetary delirium aside, HLS is the customer for Super Heavy/Starship. They've reportedly spent all of NASA's 3 billion and delivered very little of the contracted capability so far.

SpaceX filings show $3.8B of funding from various, mostly undisclosed investors; Musk's own stake can't exceed that, can it?

The richest man wouldn't be so rich if he didn't watch his money.


> They've reportedly spent all of NASA's 3 billion and delivered very little of the contracted capability so far.

This is not 'reported'. This is just something somebody who doesn't like Musk made up and is successfully spreading on social media. I have read this claim hundreds of times in every forum; I don't know where it came from.

The facts are literally out there, but nobody cares. You can literally look up the contract on the official government website.

The HLS payment is split across roughly 40-50 milestones, each releasing a portion of the money. SpaceX has completed around 25 of them and has received $2.x billion so far. If they finish everything, they will get around $3.4 billion.

And it's a fixed-price contract; how much money they have already spent is completely irrelevant to the government, unless SpaceX might go bankrupt, and they clearly aren't going to.

SpaceX has received exactly as much money as the contract specifies for how far the development has come. This is a far better way for the government to pay contractors than what NASA usually does.


The audiobooks are pretty well narrated too - great for anyone who needs 160 hours of audio.


Very true. Then you get to Cryptonomicon - which is set in the same universe. I think it's the best audiobook I've ever listened to.


I found that Fall; or, Dodge in Hell wraps up the whole thing very nicely!


That novel really expands the scope of the whole alternate universe dramatically. He's an amazing writer.

EDIT: nice username! Is this where you got it?


Yes - I started using the username on Slashdot back in the day, after reading Cryptonomicon, and have used it in various places over the years... :-)

