u8080's comments

I've seen Chinese hw companies use "HDTV" or "HD" to avoid the HDMI trademark.

Yep, and "HDML" on one device that would obey its user and strip HDCP from the stream when asked.

A 1x PCIe 3.0 link has 8 Gbps of raw speed - for 2.5 Gbps full-duplex Ethernet you'll need roughly 6~7 Gbps of raw link to the CPU.

For 5 Gbps and higher, you'll need another PCIe lane - and SOHO motherboards are usually already pretty tight on PCIe lanes.

10GbE will require 4x3.0 lanes
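
Back-of-envelope version of that math, as a sketch - the 1.3x overhead factor here is just my guess to land in the "6~7 Gbps" range, not a measured number:

    import math

    GEN3_RAW_GBPS_PER_LANE = 8.0   # PCIe 3.0 raw signalling rate per lane
    OVERHEAD = 1.3                 # assumed encoding + TLP/DMA overhead (a guess)

    def lanes_needed(eth_gbps):
        # count both directions of the Ethernet link, then pad for overhead
        raw_needed = 2 * eth_gbps * OVERHEAD
        return math.ceil(raw_needed / GEN3_RAW_GBPS_PER_LANE)

    for speed in (2.5, 5.0, 10.0):
        print(f"{speed:>4} GbE -> {lanes_needed(speed)}x PCIe 3.0 lane(s)")

With those assumptions 2.5GbE still fits on one lane (6.5 of 8 Gbps), while 5GbE and 10GbE round up to 2 and 4 lanes.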


> 10GbE will require 4x3.0 lanes

PCIe 3.0 is irrelevant today when it comes to devices you want on 10G. I'm pretty sure the real reason is that 2.5G can comfortably run on the cable you used for 1G[1], while 10G gets silly hot or requires transceivers and user understanding of a hundred 2-3 letter acronyms.

Combine that with ISP speeds lagging behind. 2.5G, while it feels odd to some, makes total sense in the consumer market.

[1]: at short distances; I had to replace one run with shielded cable to get 2.5G, but it had PoE, so that might have contributed to noise.


Thanks to the fresh Realtek RTL8127, 10GbE is now cool and cheap. It is also okayish on short Cat 5e cables.

PCIe is full duplex. And there's no requirement for Ethernet ports to be able to run at full tilt. Even on a 1x PCIe 3.0 link, a 10G port will be much, much better than a 2.5G one.

(But PCIe 3.0 is of course from 2010 and isn't too relevant today - 4.0, 5.0, 6.0 and 7.0 run at 16/32/64/128 GT/s per lane respectively.)
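
A quick sketch of why a single lane still helps - throughput just gets capped at the link rate, so a 10G port behind one 3.0 lane still beats a 2.5G port by a wide margin. Raw per-lane transfer rates below; encoding and protocol overhead are ignored:

    # raw per-lane rate in Gb/s per PCIe generation (overhead ignored)
    PCIE_LANE_GBPS = {"3.0": 8, "4.0": 16, "5.0": 32, "6.0": 64, "7.0": 128}

    def usable_port_gbps(port_gbps, gen, lanes):
        # PCIe is full duplex, so one direction of Ethernet traffic
        # only has to fit into the link's per-direction bandwidth once
        link = PCIE_LANE_GBPS[gen] * lanes
        return min(port_gbps, link)

    print(usable_port_gbps(10.0, "3.0", 1))  # ~8  - still >3x a 2.5G port
    print(usable_port_gbps(2.5,  "3.0", 1))  # 2.5
    print(usable_port_gbps(10.0, "4.0", 1))  # 10  - full line rate on one 4.0 lane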


It is relevant; SOHO boards still use PCIe 3.0 to attach NICs, e.g. https://tech4gamers.com/wp-content/uploads/2022/10/B650-AORU...

Agreed that 10G would still be better, but OEMs couldn't write 10G on their hw while it's capped to ~7 Gbps.


Are motherboards commonly using PCIe 3.0 for onboard peripherals these days? I wouldn’t expect it to save them much money, but my PCIe knowledge is constrained to the application layer - I know next to nothing about the PHY or associated costs.


This has got to be it!

> Bluetooth and WiFi isn't running if you turned them off.

BT and WiFi keep running even when turned off, at least on Android, unless you explicitly opt out.


As in avoiding participation and being worried about the future, speaking as a Russian Kagi user from Belgrade.

The most popular places to move, I guess, are Georgia (365 days visa-free, easy to reach), Serbia, the UAE, Cyprus, and Poland.


Could have been easily solved by granting it by default, but I doubt that was the original intent.


Well, the original intent was to ask the user for permission at installation time, which turned out to be a poor idea after a while. Perhaps you mean that it would have been simple to change the API in some particular way, while retaining compatibility with existing apps? If I remember the timeline correctly, which is far from certain, this happened around the same time as Android passed 100k apps, so there was a fairly strong compatibility requirement.


I mean, just make it "Granted" by default and give the user the ability to control it. The permissions API was already broken a few times (e.g. Location for Bluetooth and granular Files permissions).


This is not directly comparable with display resolution, since what you actually want for judging clarity is pixels per degree of vision (PPD), not raw PPI.
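
A rough sketch of that metric, in case anyone wants to play with it - the pixels-per-degree conversion is standard, but the example panels and distances are just made-up illustrations:

    import math

    def pixels_per_degree(ppi, distance_in):
        # inches of screen covered by one degree of vision at this distance
        span_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return ppi * span_per_degree

    print(pixels_per_degree(163, 24))  # ~27" 4K panel at 24 in -> ~68 PPD
    print(pixels_per_degree(92, 24))   # ~24" 1080p panel at 24 in -> ~39 PPD

The commonly quoted "retina" threshold is around 60 PPD, so the same resolution can look plenty sharp or visibly pixelated depending on panel size and viewing distance.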


No, Bluetooth has enough bandwidth for 990 kbps LDAC, so it should be possible to do 128 kbps stereo + a 64 kbps Opus mono mic.


The WH-1000XM6 should support GMAP according to Reddit; however, MediaTek PCIe Wi-Fi/BT combos seem to have crap drivers and I was not able to make it work. And the Intel ones don't work with AMD CPUs (sounds like bullshit, but it requires some Intel proprietary DSP driver to supposedly "decode LC3").


For a real quality improvement, which is 48 kHz stereo + mic, you'll also need GMAP (Gaming Audio Profile) support on both the BLE adapter and the headset.

I've tried multiple combinations with my WH-1000XM6 and WF-1000XM5, but nothing works stably on Windows. Linux requires hand-patching BlueZ and friends, which also failed for me. Android does not support GMAP, and even when just using LE, a lot of messengers are unable to detect it properly (Google Meet works; Telegram and Viber do not).

I finally gave up on the idea. Just thinking about the fact that we can't use full-duplex wireless audio in 2025 pisses me off so much, tbh.


Worse yet, I got a new Bose headset with USB-C audio support - and the microphone doesn't work at all over either USB or Bluetooth while USB-C is playing audio!


My WH-1000XM5 set broke, and it was going to cost more to repair than simply buying a new pair. So I decided to check out the cheaper end of the market and bought a pair of Edifier W830NB.

They are pretty decent (a notable downgrade in most aspects, you do get what you pay for, but good enough for my daily needs). And I was very happy to discover that when plugged in via USB-C, the microphone works over USB at full quality; that's one thing my WH-1000XM5s couldn't do, nor the newer XM6s.

So both Bose and Sony need to step up their game.


A very similar approach is used in MCST Elbrus CPUs: https://en.wikipedia.org/wiki/Elbrus-8S#Supported_operating_...

