
>When the electromagnetic wave hits a substance, it splits into separate electric and magnetic waves

Not in any sense I'm familiar with


I don't like the notion of doing speed control by putting a digipot in series with a motor. It worked because the fan happened to be low enough power, but it doesn't seem like the author gave thought to the power-handling capability of the digipot. If the fan happened to be beefier, he'd be letting the smoke out with this design.

Plus, this is more complicated than just doing PWM.


Based on the description of the wiring to the motor (24V, GND, POT1, POT2, NC), it doesn't sound like the original setup would have been drawing much power through the pot either -- there's probably something else on the other end of that wire that is doing modulation based on the sense resistance, and the motor is itself drawing power from the 24V line. So while it's true that there should be a check for the allowable limits on the digipot, I don't think it's actually being used to sink much power.


Hey, author here. That's correct. The potentiometer has 5V across it, with a current range of 30-164μA, which fell within the limits of the digipot. I opted to use the digipot instead of my own PWM because something else must be doing PWM closer to the motor, which I didn't want to go modifying.
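
For reference, a quick worst-case check of the dissipation in the pot circuit, just multiplying the figures above (a rough sketch; the 5V and 164μA numbers are the measured ones, everything else is plain arithmetic):

    # Worst-case power through the pot/digipot sense line,
    # using the measured 5 V and 30-164 uA range quoted above.
    V_SENSE = 5.0      # volts across the pot circuit
    I_MAX = 164e-6     # amps, upper end of the measured range

    p_max_mw = V_SENSE * I_MAX * 1e3
    print(f"worst-case dissipation: {p_max_mw:.2f} mW")   # ~0.82 mW

Well under a milliwatt, so comfortably inside a typical digipot's rating.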


You're right, I didn't read thoroughly enough.

>Everything joined up via a 2-pin and 5-pin connector on the PCB. From there, it was a straightforward matter of measuring voltages and continuity to work out what connected to what: the 2-pin connector was offering 24V DC. The 5-pin connector was what went off to the motor itself. Two of its pins were passing through the 24V DC and ground directly. Two more pins were connected to the potentiometer. The fifth pin was not connected.


> there's probably something else on the other end of that wire that is doing modulation based on the sense resistance

And it would have been great if that arbitrary assumption had been tested by the OP and the results documented in the article, so that they wouldn't come off as somewhat clueless about the limitations of their design... oh well.


From the HN guidelines:

> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

> Don't be curmudgeonly. Thoughtful criticism is fine, but please don't be rigidly or generically negative.


These passive-aggressive posts are, almost all of the time, a far worse violation of the HN Guidelines than what they're replying to - and this is no exception.


No, no they're not. I would much rather people be warned about the guidelines and adhere to them going forward than have us say nothing and let violations run rampant.


They're not "warnings". They're passive aggressive internet dick waving virtue signalling. The flag button exists.


I, for one, appreciate knowing why people have flagged my comments. The "flag and move on" strategy is for use against bad actors.


There are a lot of people who read or watch stuff from the Internet and then play with mains voltages without giving a thought to how dangerous that is.

See: any craze which uses the high voltage transformers from microwaves


I don't get what the point of the article is. Is the takeaway that I should lower the channel width in my home? How many WAPs would I need to be running for that to matter? I'd argue it's more important to get everyone to turn down TX power in cases where your neighbors in an apartment building are conflicting. And that's never going to happen, so just conform to the legal limit and your SNR should be fine. Anything that needs to be high performance shouldn't be on wifi anyway.

If you want to spend a really long time optimizing your wifi, this is the resource: https://www.wiisfi.com/


This sort of thing is definitely in the class of "are you experiencing problems? if not don't worry about it".

If you are experiencing problems, this might give you an angle to think about that you hadn't otherwise, if you just naively assume Wifi is as good as a dedicated wire. Modern Wifi has an awful lot of resources, though. I only notice degradation of any kind when I have one computer doing a full-speed transfer to another for quite a while, but that's a pretty exceptional case, and not one I'm going to run more wires around for when it happens less than once a month.


The takeaway is that you'll probably experience more reliable wifi if you turn your 5GHz channel width down to 40MHz and especially make sure your 2.4GHz width is 20MHz, not 40MHz. As noted, you can't do anything about the neighbors, but making these changes can improve your reliability. And I think the larger takeaway is that if manufacturers just defaulted to 40MHz 5GHz width, like enterprise equipment does, wifi would be better for everyone. But if your wifi works great, then no need.

Also that's an amazing resource, thanks for linking.


2.4GHz wifi at 40MHz squats on literally half of the usable channels for your speed improvement; very likely you now get 100Mbps. If you just disabled 2.4GHz and forced 5GHz, you would get the exact same improvement and wouldn't be polluting half of the available frequencies.

Add another idiot sitting on channel 8 or 9 and the other half of the band is also polluted. Now even your mediocre IoT devices that cannot be on 5GHz are going to struggle for signal, and instead of the theoretical 70/70Mbps you could get off a well-placed 20MHz channel, you're lucky to get 30.

Add another 4 people and you cannot make a FaceTime call without disabling wifi or forcing 5GHz
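
To put a number on the "half the usable channels" point: the 2.4GHz channel centers follow 2407 + 5n MHz for channels 1-13, and a rough overlap check shows a single 40MHz channel clobbering most of them (a sketch; treating each 20MHz channel as a clean ±10MHz block is a simplification):

    # Which 20 MHz channels does a 40 MHz bond of channels 1+5 overlap?
    # Channel centers: 2407 + 5*n MHz for n = 1..13; the bonded pair is
    # centered midway between 2412 and 2432, i.e. 2422 MHz.
    def span(center, width):
        return (center - width / 2, center + width / 2)

    centers = {n: 2407 + 5 * n for n in range(1, 14)}
    lo, hi = span(2422, 40)

    overlapped = [n for n, c in centers.items()
                  if span(c, 20)[1] > lo and span(c, 20)[0] < hi]
    print(overlapped)   # [1, 2, 3, 4, 5, 6, 7, 8] -- more than half the band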


I lose wifi signal consistently in my bedroom on my 80MHz-wide 5GHz wifi.

I just now reduced it to 20MHz, and though there is a (slight) perceptible drop in latency, those 5 extra dB of signal-to-noise I gained have given me wifi in the bedroom again


Every doubling of the channel width costs roughly 3dB. Shannon's law strikes again!


Every doubling of the channel width doubles the Shannon limit*

* In a Gaussian white-noise environment, which WiFi usually isn't in.


*If the bandwidth of the Analog Front End (AFE) and Analog to Digital Converter (ADC) / Digital to Analog Converter (DAC) doubles as well. In the real world the AFE of any wifi radio has a fixed bandwidth, with the ADC sampling rate and accuracy being fixed as well. The end result is that doubling the channel width in a wireless network requires a received signal strength that is roughly 3 dB more in real world devices. This constraint is quite visible in data sheets for most wifi cards like here: https://compex.com.sg/wp-content/uploads/2024/01/wle7002e25-...
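
The noise-floor side of this is easy to sketch: thermal noise is roughly k·T·B, so doubling B adds ~3dB of noise, and with the same received power that's the ~3dB sensitivity penalty the data sheets show (a sketch assuming an ideal receiver at the usual 290K reference; real front ends add a noise figure on top):

    import math

    # Ideal thermal noise floor k*T*B, expressed in dBm.
    k = 1.380649e-23   # Boltzmann constant, J/K
    T = 290.0          # reference temperature, K

    def noise_floor_dbm(bandwidth_hz):
        return 10 * math.log10(k * T * bandwidth_hz * 1e3)  # W -> mW -> dBm

    for bw in (20e6, 40e6, 80e6, 160e6):
        print(f"{bw/1e6:>4.0f} MHz: ~{noise_floor_dbm(bw):.1f} dBm")
    # 20 MHz ~ -101 dBm, 40 MHz ~ -98 dBm, 80 MHz ~ -95 dBm, 160 MHz ~ -92 dBm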


Wow! There are certain areas of my house that I get such bad wifi signal that I often switch to cellular data since it's more reliable. I didn't even know you could change a setting like this to reduce speeds but improve reliability - it worked like a charm, thanks!


Every time I have questions about Wi-Fi I search for this distinctive site, wiisfi.com … I should bookmark this.

The best resource out there. Period.


Wow, that is an awesome resource and something I wish I knew about earlier!


That's Blood Meridian


Try qalculate. It's great with units, and I think it will work for base conversions, though I haven't tried that


>Unlike nature, which utilizes passive structures to shape sound, most artificial sound control systems require active devices or resonance-based systems.

What's wrong with resonance-based systems? I have to wonder if their side lobes and frequency range would be better if they used resonance


We're already at NISQ


I meant practically usable NISQ


Why, because Russia can grind out a village a week? Ukraine is inflicting disproportionate losses and is supplied to the hilt by Europe, while Russia's moving closer every day to a Potemkin economy.


Ukraine is inflicting massively disproportionate losses. Meanwhile, Ukraine does very aggressive conscription while Russia mostly deploys volunteers and only resorted to reservists in 2022 in an emergency. It doesn't really add up, does it.

And the collapse of the Russian economy will happen any day now for the past 3 years.

After 20 years of being told the military leadership of the western world had COIN all figured out, you're going to have to give people something more than a prayer that the enemy's economy will collapse all of a sudden. Proud ignorance of the basic facts of the field or of the enemy won't procure much public support any more.


Of course Ukraine conscripts, they're in a war for their survival. They aren't drafting anyone under 25, by the way, so it's not as dire as you seem to think. And Russia's beating people and throwing them in pits if they won't sign contracts to go to Ukraine, so it's not all roses over there.

It's not at all unreasonable to think that Ukraine can continue ceding ground and shredding Ladas full of mobiks until Putin kicks the bucket, or the Russian economy collapses. A healthy economy doesn't have a 20% key interest rate for 8 months straight, you know. We've already seen one large-scale mutiny in the Russian armed forces, too, so who knows what else might happen?

You haven't proposed any sort of alternative to continuing to arm and fund Ukraine. What's your idea, cut them off and say "good luck?" How does that benefit anyone besides Russia and the minority of Ukrainians who don't want to fight?

edit: if you're thinking that I care about the financial cost of arming Ukraine, I don't. This is the best money we've ever spent and the only time I've respected our MIC, and I wish we were sending more weapons and more financial support. Every time Ukraine spends $100,000 of aid destroying a piece of Russian armor, that's saving us god knows how much in money spent on deterrence.


You shouldn't need to prevent gaps entirely. You only need to make sure there are no holes larger than roughly the wavelength of the radiation you're trying to block. Which, for 2.4GHz wifi, is about 125mm. I think what you saw is that a single layer of foil isn't enough skin depths thick to block radiation sufficiently at that frequency.
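
Rough numbers for that rule of thumb, just free-space wavelength c/f (the suggestion to keep holes a good fraction smaller than this is my own hedge, not a hard limit):

    # Free-space wavelength at common wifi frequencies.
    C = 299_792_458  # speed of light, m/s

    for f_ghz in (2.4, 5.0, 6.0):
        print(f"{f_ghz} GHz -> {C / (f_ghz * 1e9) * 1e3:.0f} mm")
    # 2.4 GHz -> 125 mm, 5 GHz -> 60 mm, 6 GHz -> 50 mm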


It takes well-calibrated electronics detonating conventional explosives with precise timing to set off a nuclear warhead. The warhead might fizzle, but it wouldn't detonate just because you intercepted it. And anyway, it's much better to have it detonate anywhere besides where it was targeted


> It takes well-calibrated electronics detonating conventional explosives with precise timing to set off a nuclear warhead.

The need for precise control and timing is true for plutonium implosion-style devices but not true for uranium gun-style ones. Gun-style detonators just need to smash two lumps of uranium together. You better hope the interceptor completely demolishes the aforementioned lumps of uranium instead of ramming one into the other.


We're not that concerned about uranium gun devices, because they aren't really worthwhile to make. Their yields aren't high enough to justify the cost relative to conventional weapons. And explosive disassembly of the device is still likely to cause the nuclear element to fail. There's a reason the world basically gave up on them.

