My naive take is that we discovered it as a math tool first and only later rediscovered it in nature, when we discovered the electromagnetic field.
The electromagnetic field is naturally a single complex-valued object (the Riemann-Silberstein vector F = E + icB), and of course Maxwell's equations collapse into a single equation for this complex field. The symmetry group of electromagnetism, and more specifically of the duality rotation between E and B, is U(1), which is also the unit circle in the complex plane.
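To sketch how the collapse works (vacuum case, SI units; sign conventions vary):

```latex
% Riemann-Silberstein vector: F = E + icB
% The four vacuum Maxwell equations collapse to a divergence
% constraint plus one complex evolution equation:
\nabla \cdot \mathbf{F} = 0, \qquad
i\,\frac{\partial \mathbf{F}}{\partial t} = c\,\nabla \times \mathbf{F}

% A duality rotation is multiplication by a unit complex number,
% F -> e^{i\theta} F, which mixes E and B:
\mathbf{E} \mapsto \mathbf{E}\cos\theta - c\,\mathbf{B}\sin\theta, \qquad
c\,\mathbf{B} \mapsto \mathbf{E}\sin\theta + c\,\mathbf{B}\cos\theta
```

The divergence equation packs both Gauss's laws together, the curl equation packs Faraday's and Ampere's laws, and multiplying F by e^{iθ} leaves both invariant, which is exactly the U(1) duality symmetry.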
Maybe I missed it but I don't see them defining what they mean by ethics. Ethics/morals are subjective and change dynamically over time. Companies have no business trying to define what is ethical and what isn't, due to conflict of interest. The elephant in the room is not being addressed here.
Especially as most AI safety concerns are essentially political, and uncensored LLMs exist anyway for people who want to do crazy stuff like having a go at building their own nuclear submarine or rewriting their git history with emoji-only commit messages.
For corporate safety it makes sense that models resist saying silly things, but it's okay for that to be a superficial layer that power users can prompt their way around.
Ethics are all well and good but I would prefer to have quantified limits for water quality with strict enforcement and heavy penalties for violations.
Of course. But while the lawmakers hash out the details it's good to have companies that err on the safe side rather than the "get rich quick" side.
Formal restraints and regulations are obviously the correct mechanism, but no world is perfect, so whether we like it or not, we and the companies we work for are ultimately responsible for the decisions we make and the harms we cause.
De-emphasizing ethics does little more than give large companies cover to do bad things (often with already great impunity and power) while the law struggles to catch up. I honestly don't see the point in suggesting ethics is somehow not important. It doesn't make any sense to me (more directed at gp than parent here)
Is it ethical for a water company to shut off water to a poor immigrant family because of non-payment? Depending on the AI's political and DEI bent, you're going to get totally different answers. Having people judge an AI's responses is also going to be influenced by each evaluator's personal bias.
I note in the UK that it is illegal for water companies to cut off anyone for non-payment, even if they're an Undesirable. This is because humans require water.
How useful/effective would a business AI be if it always plays by that view?
Humans require food, I can't pay, DoorDash AI should provide a steak and lobster dinner for me regardless of payment.
Take it even further: the so-called Right to Compute Act in Montana supports "the notion of a fundamental right to own and make use of technological tools, including computational resources". Is Amazon's customer service AI ethically (and even legally) bound to give Montana residents unlimited EC2 compute?
A system of ethics has to draw a line somewhere when it comes to making a decision that "hurts" someone, because nothing is infinite.
As an aside, what recourse do water companies in the UK have for non-payment? Is it just a convoluted civil lawsuit/debt process? That seems so ripe for abuse.
Civil recovery, yes. It's not like you don't know where the customer lives.
Doesn't seem to be a problem for the water companies, which are weird regulated monopolies that really ought to be taken back under taxpayer control. Scottish Water is nationalized and paid through the council tax bill.
Or whatever's cheapest for your local food supply. Every time I've done this game with supermarket produce, it comes in under £1/day to meet someone's nutritional requirements; the currency tells you where I played that game.
McD is pretty expensive these days, I've seen cheaper even in the category of fast food.
I'd love to see a return to the idea of government cheese, or at least align food stamps with WIC (WIC in US is a specific food aid program restricted to ostensibly healthier foods), instead of allowing the ridiculous moral hazard and waste posed by regular foodstamps.
I understand the point you’re making but I think there’s a real danger of that logic enabling the shrugging of shoulders in the face of immoral behavior.
It’s notable that, no matter exactly where you draw the line on morality, different AI agents perform very differently.
I think the parent comment was pointing to the lack of an established causal link. The finding in the abstract is extrapolated by statistical inference; for example, smokers tend to drink more, etc. The paper does take such factors into account, but personally I wouldn't jump to such a strong conclusion from statistical inference, because it closes the door on other factors that might be even stronger in combination. The paper reads more like motivated reasoning than a discovery. That said, smoking is of course a major health risk; I'm just pointing at the research approach.
Your question leaks your intentions and drives the LLM to confirm your cognitive bias; it treats your intentions as the conclusion. Try to phrase your questions so the LLM can arrive at the word/concept of "suppression" in a more neutral, probabilistic manner when the context hints at it, instead of handing it the words you want to hear. Otherwise you're just falling into confirmation bias.
> The behaviour of brk() and sbrk() is unspecified if an application also uses any other memory functions (such as malloc(), mmap(), free()). Other functions may use these other memory functions silently.
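A minimal sketch of why that bites (illustrative only; behavior is platform-dependent, and sbrk() was removed from POSIX.1-2008):

```c
#define _DEFAULT_SOURCE  /* glibc: expose sbrk() */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void) {
    void *before = sbrk(0);     /* observe the current program break */
    void *p = malloc(1 << 20);  /* the allocator may move the break
                                   itself (via brk()) or use mmap() */
    void *after = sbrk(0);
    printf("break before malloc: %p\n", before);
    printf("break after  malloc: %p (%s)\n", after,
           after == before ? "unchanged" : "moved");
    /* Any bookkeeping done against 'before' is now unreliable; this
       is the "unspecified" interaction the spec is warning about. */
    free(p);
    return 0;
}
```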
That barely scratches the surface when it comes to reproducible C and C++ builds. In fact, the topic of reproducible builds assumes your sources are the same; identical sources really aren't the problem here.
You need to control every single library header version you use outside your own source (stdlib, OS headers, third-party libraries), and you need a strategy for dealing with random/datetime values that can end up baked into the binary.
As well as the toolchain used to compile your toolchain, through multiple levels, and all compiler flags along the path, and so on, down to some "seed" from which everything is built.
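The classic self-inflicted datetime case is the compiler baking the build time into the binary via the predefined __DATE__/__TIME__ macros. A sketch (recent GCC and Clang can pin these with the SOURCE_DATE_EPOCH environment variable):

```c
#include <stdio.h>

int main(void) {
    /* __DATE__ and __TIME__ expand at compile time, so every rebuild
       yields a different binary even from identical sources. Setting
       SOURCE_DATE_EPOCH in the build environment makes GCC/Clang
       expand them to a fixed instant, restoring bit-for-bit output. */
    printf("built %s %s\n", __DATE__, __TIME__);
    return 0;
}
```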
A good package manager, e.g. GNU Guix, lets you define a reproducible environment of all of your dependencies. This accounts for all of those external headers and shared libraries, which will be made available in an isolated build environment that contains only them and nothing else.
Eliminating nondeterminism from your builds might require some thought; there are a number of places it can creep in (timestamps, random numbers, nondeterministic execution, ...). A good package manager can at least give you tooling to validate that you have eliminated nondeterminism (e.g. `guix build --check ...`).
Once you control the entire environment and your build is reproducible in principle, you might still encounter some fun issues, like "time traps". Guix has a great blog post about some of these issues and how they mitigate them: https://guix.gnu.org/en/blog/2024/adventures-on-the-quest-fo...
Virtualization, imho. Every build gets its own virtual machine, and once the build is released to the public, the VM gets cloned for continued development and the released VM gets archived.
I do this git tags thing with my projects - it helps immensely if the end user can hover over the company logo and get a tooltip with the current version, git tag and hash, and any other information relevant to the build.
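For what it's worth, one way to wire that up (a sketch; GIT_TAG and GIT_HASH are placeholder macro names, not any standard):

```c
/* Build with something like:
     cc -DGIT_TAG='"v2.3.1"' \
        -DGIT_HASH="\"$(git rev-parse --short HEAD)\"" version.c */
#include <stdio.h>

#ifndef GIT_TAG
#define GIT_TAG "unknown"    /* fallback for local, untagged builds */
#endif
#ifndef GIT_HASH
#define GIT_HASH "unknown"
#endif

/* The string the UI would surface in the tooltip. */
const char *build_version(void) {
    return "version " GIT_TAG " (" GIT_HASH ")";
}

int main(void) {
    puts(build_version());
    return 0;
}
```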
Then, if I need to triage something specific, I un-archive the virtualized build environment, and everything that was there in the original build is still there.
This is a very handy method for keeping large code bases under control, and has been very effective over the years in going back to triage new bugs found, fixing them, and so on.
Back in the PS2 era of game development, we didn't have much in the way of virtual machines to work with. And making a shippable build involved wacky custom hardware that wouldn't work in a VM anyway. So instead we had The Build Machine.
The Build Machine would be used to make The Gold Master Disc: a physical DVD that would be shipped to the publisher to be reproduced, hopefully, millions of times. Getting The Gold Master Disc to a shippable state would usually take weeks, because it involved burning a custom disc format for each build and there was often no way to debug other than watching what happened on the game screen.
When The Gold Master Disc was finally finalized, The Build Machine would be powered down, unplugged, labeled "This is the machine that made The Gold Master Disc for Game XYZ. DO NOT DISCARD. Do not power on without express permission from the CTO." and archived in the basement forever. Or, until the company shut down. Then, who knows what happens to it.
But, there was always a chance that the publisher or Sony would come back and request to make a change for 1.0.1 version because of some subtle issue that was found later. You don't want to take any chances starting the build process over on a different machine. You make the minimal changes possible on The Build Machine and you get The Gold Master Disc 1.0.1 out ASAP.
Yes I've seen this technique used effectively a number of times in various forms over the years, including in game companies I've worked at.
The nicest variant was the inclusion of a "build laptop" in the budget for the projects, so that there was a dedicated, project-specific laptop which could be archived easily enough, serving as the master build machine. In one company, the 'Archive Room' was filled with shelves of these laptops, one for each project, and they could be checked out by the devs, like a library, if ever needed. That was very nice.
For many types of projects, this is very effective - but it does get tripped up when you have to have special developer tooling (or .. grr .. license dongles ..) attached before the compiler will fire up.
That said, we must surely not overlook the number of times someone has found a "Gold Master Disc" with a .zip file full of sources on it, too. I forget some of the more famous examples, but it is very fun to see accidentally shipped full sources for projects, on occasion, because a dev wanted to be sure the project was future-proof, lol.
Incidentally, the hassle around this issue is one of the key factors in my personal belief that almost all software should be written in scripting languages running in a custom engine, reducing the loss surface when, 10 years later, someone decides the bug needs to be fixed.