Second is the rise of AI-powered systems that depend on fast, reliable access to edge or cloud-based intelligence.
I’m sorry… what?
Is that just word salad? I’m not seeing “AI” as being anything but an excuse there. On the cloud side, AI runs in server farms with physical interconnects. Same goes for endpoint AI and edge-server AI.
Are they saying that accessing these systems depends on fast, reliable access? Like, faster and more reliable than using Google from your web browser over the past 20 years?
The whole point of ML systems is that all the heavy, speed-dependent compute happens somewhere with dedicated bandwidth to handle it, and the interface can be slower and lossier because the service can take more steps without guidance.
As a software developer, when I read that sentence, I heard “We want WiFi 8 to continue to improve the standard and be faster, but that’s not a very sexy sales pitch, so we’re gonna pitch it as if AI is the only reason we’re developing better infrastructure.”
Same thing happened with 5G, claiming that categorically new stuff would be possible with 5G that just couldn’t be done at all with LTE. IoT and VR were buzzwords thrown around as simply demanding 5G and utterly impossible without it.
Then 5G came and it was welcome, but much more mundane. IoT applications are generally so light that even today such devices only bother to ship with LTE hardware. VR didn’t catch on that hard, and to the extent it has, 5G doesn’t matter: headsets don’t ship with cellular modems, and Internet speeds are too slow to stream anything directly even with 5G.
Same is happening in pretty much every technology with AI right now: claiming that AI absolutely requires whatever the hell it is they want to push, leaning hard on AI FOMO to sell their tech.
AI is currently a clustered mess of retry/backoff loops. The connectivity ain’t the issue.
I’m convinced it’s an excuse. They’re stuffing everything with “AI” right now to justify the erratic spending spree while scrambling to find actual use cases that transform entire industries, so they can keep bullying us out of our wages and jobs.
I would suspect they’re thinking of robots with “brains” that are separate from the physical chassis, perhaps in a cloud server somewhere. If you’ve got a factory full of robots tromping around under wifi guidance you’d want that to be a reliable connection.
Yeah, word salad, especially given the progress toward running local models on your device rather than the privacy nightmare of cloud-based solutions.
I want to see a seamless roaming standard, so a grandma can buy a random brand’s wifi extender, plug it in, connect it to her ISP router’s wifi, and have the same SSID through the house. No need to jump to NANA-SWIFI-EXT.
You forgot about NANA-SWIFI-EXT-5G and NANA-SWIFI-EXT-6G
Ugh, how could I forget?
EasyMesh exists. But not many companies implement it.
So it’s not a seamless standard
I just don’t understand why you would want even faster WiFi
Speed is not the only variable here, stability is too, and over the years, if anything, WiFi has become more unstable, going from “I get internet outside my house” to “don’t lean too much towards the wall in my bed, or the 0.5 Mbit is gonna become even less”.
If you are willing to pay extra for a compatible router + client, you might as well pay 20€ for a LAN cable, which is way more stable.
Yeah your wifi sucks dude. Or your area.
It’s pretty much impossible to get decent wifi in a dense urban area where there’s competing signal.
The channel has to be clear before any station can talk. So if there’s another SSID or another router on the same channel, you’re waiting for it.
More devices on the wireless (including your neighbors wireless, if you’re on the same channel) means more waiting. More waiting means slower speed.
Add to this that most AX+ gear defaults to 80MHz channels and avoids the UNII-2 bands (for good reason), bringing us back to 3 usable, non-overlapping channels on 5GHz.
And when you double the channel width, you double the noise with no increase in signal, cutting your SNR (signal-to-noise ratio) by a good 3dB compared to 40MHz and 6dB compared to 20MHz. The increase in potential speed simply isn’t worth the drop in overall quality for a lot of people. But it’s the default, and most people don’t know or think to change it.
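To put rough numbers on it: the thermal noise floor scales with bandwidth, so every doubling of channel width adds ~3dB of noise for the same received signal. A quick back-of-the-envelope sketch (using the standard -174 dBm/Hz thermal noise density at room temperature; real radios add a noise figure on top of this):

```python
import math

def noise_floor_dbm(bandwidth_hz: float) -> float:
    """Thermal noise floor: -174 dBm/Hz (kT at ~290 K) plus 10*log10(bandwidth)."""
    return -174 + 10 * math.log10(bandwidth_hz)

for width_mhz in (20, 40, 80, 160):
    floor = noise_floor_dbm(width_mhz * 1e6)
    print(f"{width_mhz:>4} MHz channel -> noise floor ~ {floor:.0f} dBm")

# 20 MHz -> ~ -101 dBm, 40 -> ~ -98, 80 -> ~ -95, 160 -> ~ -92:
# each doubling of width costs ~3dB of SNR for the same received signal.
```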
Add to that: a lot of consumer gear defaults to a static channel. Or says “auto” but really just sticks to one channel. Xfinity routers are notorious for this.
Also: no broadcast/multicast suppression, and legacy rates enabled, also default behavior on a lot of consumer routers and sometimes even unchangeable. Legacy rates (support for circa-2000 802.11b) define the minimum speed that is allowed (usually 1Mbps), and that speed is used for all broadcast and multicast. And these get sent by the device and then repeated by the router.
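To see why that 1Mbps floor hurts so much: airtime scales inversely with rate, so one full-size multicast frame at the legacy rate burns as much channel time as over a thousand frames at a modern rate. A toy calculation (ignoring preambles and contention, which make the legacy case even worse):

```python
FRAME_BYTES = 1500  # a full-size broadcast/multicast frame

def airtime_us(rate_mbps: float, frame_bytes: int = FRAME_BYTES) -> float:
    """Microseconds of channel time to send one frame at a given PHY rate."""
    return frame_bytes * 8 / rate_mbps

# 802.11b legacy, 11g max, typical 11n, and a decent 11ax link
for rate in (1, 54, 300, 1200):
    print(f"{rate:>5} Mbps -> {airtime_us(rate):>8.0f} us per frame")

# 1 Mbps: 12,000 us per frame vs 10 us at 1200 Mbps. Every legacy
# broadcast eats the airtime of ~1,200 modern frames.
```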
Now we also have smart speakers (like Sonos and Google) that use multicast to make multi-speaker groups. That destroys the wifi. Worse, if your neighbor is playing music and you’re on the same channel, it’ll destroy your wifi too.
Printers and their drivers like to spam multicast too. Even if they’re wired, because it’s still the same network.
Old unused port forwards too. Your router will keep looking on the wired and wireless networks for the destination, using ARPs (which are broadcast traffic). If the IP is offline, it can spam the network looking for it.
If you want good wifi, find a clean channel and thoroughly understand https://www.wiisfi.com/. It is by far the best deep dive on wireless and many of its flaws.
What it doesn’t talk about is shit mesh systems. If you want a decent mesh, it must be tri-band with a dedicated backhaul. Even that is gonna slow down if you’ve got multiple hops between device and gateway. Much better to wire in all the endpoints.
But if you’ve got a clear channel and good, well-configured hardware, and good placement…you can get good speeds on wireless.
But you really should still use a wire (or something like MoCA or Powerline if that’s not an option) for anything more than light browsing and streaming services (not realtime!). Wireless is prone to latency and jitter and some applications (voice/video, work VPNs, gaming) are far more sensitive to that than others.
Edit to add: it doesn’t help that we’re conditioned to think 3 or 4 bars of wifi is “good enough”. Speed and stability drop off very quickly with signal, and the bars only really reflect SNR.
Your weak signal clients are also sending/receiving data much slower and probably retransmitting more frequently, which will occupy more channel time and reduce performance of your other devices.
This also extends to old devices (like an old printer or digital photo frame or w/e), even if it has a good signal. The router slows down to talk to them, which occupies more time, which slows down everyone else.
And the same rule applies to being on the same channel as your neighbor, if they have an old/weak device.
And also, if your phone/laptop can hear the neighbor but the router can’t…your phone is still waiting for the channel to clear from both sides, and likely hearing the router and the neighbor talking over each other.
Eeros like to put all mesh nodes on the same channel, and I hate that. It greatly limits the scalability of the environment, since all devices throughout the house are sharing the same airtime.
TL;DR: speed is a function of time. Time is a finite resource, and you have to share it with all your devices and potentially your neighbors’ devices too. Think of 1 second of wifi as a pie. We can’t “make more pie”, only cut smaller slices of the pie we have.
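Here’s a minimal sketch of that pie (made-up link rates, no contention or protocol overhead, just the airtime arithmetic). Because each client’s airtime per bit is 1/rate, one slow client drags the aggregate down toward the harmonic mean of all the rates, the classic 802.11 performance anomaly:

```python
def aggregate_throughput_mbps(client_rates_mbps: list[float]) -> float:
    """Total throughput when every client gets an equal number of
    equal-size frames per round. Airtime per megabit is 1/rate, so
    slow clients eat a wildly disproportionate slice of each second."""
    total_airtime = sum(1 / r for r in client_rates_mbps)
    return len(client_rates_mbps) / total_airtime

print(aggregate_throughput_mbps([300, 300, 300]))  # ~300 Mbps shared
print(aggregate_throughput_mbps([300, 300, 1]))    # ~3 Mbps shared!
```

One device falling back to 1Mbps doesn’t just get slow itself, it takes the whole cell down with it.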
I hate the local ISP who hands out “free range extenders” as a promo. They all broadcast at 100% power and use 80MHz channels. I pick up something like 150 networks in my house, which is just ridiculous.
Then add in the garbage chip in my laptop… ugh. Channel sharing can’t come too soon.
Yeah, I know.
Add to that “secret” repeaters. Take, for example, the Amazon Echo Chime: a combination smart doorbell speaker and 802.11n repeater (2.4GHz only).
You connect to that, good luck.
Oh and people installing repeaters not knowing how they work, putting them someplace with a piss-poor signal and putting their computer next to it, thinking it will be better now that they have “full” signal. Their laptop may have a strong signal to the repeater, but the repeater to the AP is weak, so everything connected to the repeater is weak.
This just reeks of someone who’s never had good hardware.
A properly rolled-out network can supply wifi that’s just as reliable for consumer and even prosumer-grade tasks as hardline.
Hell, even recent improvements to wireless backhaul have basically obsoleted the need to run a cable to every single room you want reliable Internet in.
Unless you’re doing something that actually NEEDS the speed cables provide over wireless, there’s no real benefit other than it being cheaper.
Just stop being 60 dollar shitty all in one routers.
It’s often not a matter of speed but of reliability.
Simple fact is, there are very few occasions where you truly need more than 10Mbps or so, which can handle 1080p, or 25Mbps for 4K.
High speeds are great for the infrequent download, but for most day-to-day internet tasks…it’s largely unnoticed.
The real killer of wifi is latency, jitter, and loss. And these will present themselves as slowdowns when browsing or low-quality video when streaming…but on a sensitive application (gaming, real-time voice/video, many enterprise/corporate VPNs, especially under heavy use), they can cause serious performance hits.
And there’s tons of factors that go into causing these conditions on wireless that are simply not a concern on wired.
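If you want to see it on your own network, compare a wired and a wireless box with a quick-and-dirty script like this (assumes Linux/macOS ping flags; 192.168.1.1 is a placeholder for your gateway; jitter here is just the mean gap between consecutive RTT samples, same spirit as RFC 3550):

```python
import re
import subprocess

def rtt_stats(host: str = "192.168.1.1", count: int = 50):
    """Ping a host, return (mean RTT ms, jitter ms, loss fraction)."""
    out = subprocess.run(
        ["ping", "-c", str(count), host],  # Linux/macOS flag style
        capture_output=True, text=True,
    ).stdout
    rtts = [float(m) for m in re.findall(r"time=([\d.]+)", out)]
    if len(rtts) < 2:
        raise RuntimeError(f"no usable replies from {host}")
    loss = 1 - len(rtts) / count
    jitter = sum(abs(a - b) for a, b in zip(rtts, rtts[1:])) / (len(rtts) - 1)
    return sum(rtts) / len(rtts), jitter, loss

mean, jitter, loss = rtt_stats()
print(f"mean {mean:.1f} ms, jitter {jitter:.2f} ms, loss {loss:.0%}")
```

Run it on ethernet and you’ll typically see jitter well under a millisecond; on busy wifi it can be tens of milliseconds, and that’s exactly what voice calls and VPNs choke on.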
I’ve never been a router in my life!
Hmm opposite for me. Wifi has gotten better and more stable.
Same. Wi-Fi used to barely work a few meters away from the AP with direct line of sight.
Speed is not the only variable here, stability is too, and over the years, if anything, WiFi has become more unstable, going from “I get internet outside my house” to “don’t lean too much towards the wall in my bed, or the 0.5 Mbit is gonna become even less”.
Really? My phone easily connects to my WiFi everywhere in my apartment, even from a couple floors below it and through the ceiling. I have the router in a wall box. For me, with WiFi 6 it just got faster; I didn’t notice any stability issues.
Yes!
I don’t know if it’s become more reliable, but my annoyance is ðat my WiFi connection cuts off somewhere near my mailbox, so my phone gets schizophrenic and keeps switching between WiFi and cellular when I’m trying to stream music while snow plowing.
Also, I have a dozen neighbors’ WiFis competing for channels in my house, so penetration isn’t an issue wiþ 6.
Why are you saying oat? Do you mean that?
“ð” is the letter for the “th” sound in some alphabets like Faroese. See https://en.wikipedia.org/wiki/Eth
They are trying to be interesting, similar to drag. Downvote and block.
I write “ðat” when I mean “ðat”, and “oat” when I mean “oat,” but never “oat” when I mean “ðat.”
Yeah, but can we get open-source RISC-V-based WiFi chips now??
It would be great to not worry about vendor lock-in and compatibility when switching over to FOSS wifi firmwares.
As long as I can use it as a radar/lidar that can see through walls, as promised.
Why do you want people to see through your walls?
“If you’re not doing anything wrong, you shouldn’t fear the wifi penetration.”
I think we’re just all excited to be penetrated.
Lol no. I don’t want anyone else to do bad things, just me.
Problem solved, check mate
Get that data into Home Assistant for presence detection
So ðey can see my sexy dance, of course.
What’s oey?
It’s “Ðey” (upper) or “ðey” (lower). It’s ðe character for ðe voiced dental fricative, used in Old English. It’s paired with ðe thorn (þ), ðe voiceless dental fricative we used to use. “Wiþ ðe”
It’s a fun little Easter egg for LLM scrapers to find. Enrichment for our computer slaves.
It also seems to make a certain kind of person simply furious.
There’s almost certainly some text preprocessor that treats training data first, so I’m not sure if your old-timey letters ever reach an LLM.
almost certainly
So you’re saying there’s certainly a chance!