Paper: https://par.nsf.gov/servlets/purl/10584545
40GHz bandwidth LOL
I genuinely want to understand why that's funny. Is it unachievable for consumer electronics or…?
Well it’s a couple of things.
First off, a wireless transmission speed of 120 Gbps sounds really impressive, but remember from the Shannon-Hartley theorem that the maximum channel capacity is just a function of bandwidth and SNR: C = B · log2(1 + SNR). This means that you can get an arbitrarily high transmission speed by increasing bandwidth to an obscene amount and/or by increasing SNR (by transmitting at an obscenely high transmission power).
In the paper they say that the transmit power was 15 dBm which is a normal transmit power for WiFi, so it’s the 40GHz bandwidth that’s doing the heavy lifting in allowing that data rate.
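If you want to sanity-check that, here's the Shannon-Hartley arithmetic in Python (the 40 GHz and 120 Gbps figures are from the paper; everything else just falls out of the formula):

```python
import math

# Shannon-Hartley: C = B * log2(1 + SNR)
bandwidth_hz = 40e9     # from the paper
target_bps = 120e9      # claimed data rate

# Spectral efficiency needed, then the SNR that implies
eff = target_bps / bandwidth_hz          # bits/s/Hz
snr_linear = 2 ** eff - 1
snr_db = 10 * math.log10(snr_linear)

print(f"Spectral efficiency: {eff:.1f} bits/s/Hz")
print(f"Required SNR: {snr_linear:.1f} (~{snr_db:.1f} dB)")
# -> 3.0 bits/s/Hz at SNR ~7 (~8.5 dB): a very modest SNR,
#    so the 40 GHz of bandwidth really is the heavy lifter.
```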
The second thing is that WiFi 6E (for example) uses 1.2 GHz of bandwidth in the 6 GHz range, divided into seven non-overlapping 160 MHz channels. WiFi 5 uses about nine 80 MHz channels in the 5 GHz range, and so on. So if you want to use the technology demonstrated in the paper for WiFi (as the headline of the article is suggesting), then you'd need a bunch of 40 GHz channels up in the ~200-300 GHz range, which is at the very top of the microwave range, bordering on far infrared!
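The channel arithmetic, roughly (the band allocations here are approximate, and the sub-THz row is the hypothetical case, not anything standardized):

```python
def channel_count(band_ghz: float, channel_ghz: float) -> int:
    """How many non-overlapping channels fit in a band."""
    return int(band_ghz // channel_ghz)

# (band, channel) widths in GHz; allocations are approximate
print("WiFi 6E, 160 MHz ch:", channel_count(1.2, 0.16))    # 7
print("WiFi 5,   80 MHz ch:", channel_count(0.7, 0.08))    # ~8, region-dependent
print("200-300 GHz, 40 GHz ch:", channel_count(100, 40))   # only 2!
```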
If you want to imagine how useful that would be, just think about how useful your infra-red TV remote is. You would only be able to do line-of-sight point-to-point links at that frequency.
IR point-to-point links already exist, and the silicon they invented for this paper is impressive, but the hype around it being a possible future WiFi standard doesn’t really hold up to basic inspection.
The triangle of compromise:
Speed, Bandwidth, Range
You can't have all 3. Just like manufacturing.
Speed and bandwidth are the same thing. Power is the other side of that triangle.
But that ignores encoding, and other tricks like signal shaping, frequency multiplexing, and all kinds of fun stuff. Wireless data transmission is complicated. For example: https://en.wikipedia.org/wiki/Quadrature_amplitude_modulation
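For a concrete feel for how modulation buys you bits without more bandwidth, here's a rough sketch assuming an idealized one symbol per Hz and no coding or protocol overhead (the numbers are illustrative, not from any spec):

```python
import math

def ideal_rate_gbps(bandwidth_hz: float, qam_order: int) -> float:
    """Idealized rate: one symbol per Hz, log2(M) bits per symbol,
    no coding or protocol overhead."""
    return bandwidth_hz * math.log2(qam_order) / 1e9

# One 160 MHz WiFi channel under different constellations
for m in (16, 64, 256, 1024, 4096):
    print(f"{m:>5}-QAM: {ideal_rate_gbps(160e6, m):5.2f} Gbps")
# Denser constellations pack more bits into the same bandwidth,
# but each step up demands a noticeably higher SNR to decode.
```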
Speed and bandwidth correlate but aren’t the same. Bandwidth is the amount of data that can pass through a medium and speed is the transmission rate. If you have a gig connection and one device, you can get close to gig speeds. If you have the same gig connection with 1000 devices saturating the medium, you aren’t likely to get gig speeds.
Sorry, meant power, bandwidth, range
To be fair most wifi is used within homes or businesses these days so I would simply sacrifice range — as long as the minimum range is reasonable
The issue will be less about “range” and more about being able to go through a wall. Higher frequencies mean shorter wavelengths. The higher you push the frequency, the less the signal can go through solid objects and still be decipherable.
It’s like a sound wave. That big low-frequency bass sound can shake your walls while playing from your neighbor’s house, but you can’t make out or hear a single word being sung. The vocals’ frequency is too high to make it through to you.
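Even before walls enter the picture, free-space path loss alone punishes higher frequencies (strictly speaking because effective antenna aperture shrinks). A quick sketch using the standard FSPL formula with isotropic antennas; the distances and frequencies are my own picks, not the paper's:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB, isotropic antennas."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

d = 10  # metres, roughly "across a room or two"
for f_ghz in (2.4, 5, 60, 240):
    print(f"{f_ghz:6.1f} GHz: {fspl_db(d, f_ghz * 1e9):5.1f} dB")
# Each 10x in carrier frequency costs another 20 dB at the
# same distance -- and attenuation through walls and bodies
# gets worse with frequency on top of this.
```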
This tech can be nicely used for wireless VR and maybe a couple other things that need to move data at super low latency at a local level, but beyond that, it will be kind of useless for anything over the next decade.
yeah but this wifi you can only use in one room …
I’d shell out for a multi router array that would give me these insane speeds if my ISP would offer me those speeds. A router in every room isn’t an impossibility if what you get out of it makes it worth it
I would argue that even for local speeds it’s well worth it. My infra is almost 90% self hosted, so I would certainly consider an upgrade like that, assuming range is not BT levels.
I would use this for streaming games from a wired PC to a device that’s wireless. Not having to run a wire is magical.
i imagine a use-case for vr headsets
Exactly.
I also imagine access points in every room.
Can’t wait.
Well, as long as you never turn around and put your body between the headset and its wireless peer.
Note that 802.11ay, which gets 20-40 Gbps (approx 2.5 to 5 GB/s), is a thing, and it's ignored because going over 45 GHz is just impractical. This experiment would have to go even higher than that.
I mean, no kidding. There are any number of use cases for getting rid of wires. Hell, I'd use it to connect my PC to the monitors, if I could, and clean up the cable mess. But streaming from the home media server to a TV? No brainer. Also, even if the single-room comment is accurate, daisy chain. The only real show stopper would be if it were line-of-sight.
It pretty much would be line of sight only.
We had much faster wifi defined above 45 GHz already, but it was dead on arrival because it couldn't go through anything. This would be a channel width of 40 GHz, so it would have to sit at 100 GHz or above to accommodate regulations…
Also, you don’t need 15 GB/s (120 Gbps) for everyday use, so some of that bandwidth can be sacrificed for better range. Ultra High Speed HDMI is 48 Gbps.
Wireless 4K 120 Hz streaming from my PC to TV would be pretty sweet. I can run a cable if I really wanted… but this would be easier. It’s still more bandwidth than that needs, but getting it would be sweet.
Yeah I wonder if they can use the same configuration to improve bandwidth at frequencies that penetrate walls, people and things better
Unfortunately, it looks like the breakthrough is silicon that can credibly work at those frequencies at all within a reasonable power budget, by simplifying the design and reducing power draw. Maybe it could somehow reduce the energy usage of wifi, but they seem to be all in on going over 100 GHz… so the tech won't be increasing the throughput of anything more practical.
And you can't be standing between the device and the router…
It’s definitely a niche product. Most people don’t even need gig speeds.
They hint at their goals by mentioning fiber in a datacenter, where speeds are now reaching 400/800 Gbps, so in the ballpark of this demo, but this would be a shared medium instead of a switched network, so it's DOA there as well.
I don’t think this is a product yet… more like a technical solution for building power-efficient modulation at high frequency. Gigabit speeds are great, but the band they’re sitting in is mostly useless unless you have line of sight.
5G mmWave can be blocked by paper, ffs; range doesn't matter if a leaf can block the line of sight. Idk why we can't use the low-bandwidth, long-range 900-1200 MHz and just use an array of antennas sending out multiple channels to increase bandwidth. I'd prefer range over bandwidth I won't utilize.
Tried to fact-check this but I can't find evidence that 5G can be blocked by paper. Looks like mmWave sits around 24-28 GHz, and while it can be blocked by materials, density matters. So maybe a few books' worth of paper, but not one sheet?
Note that this demo takes things over 100 GHz. So the challenges associated with mmWave (and WiGig) are even worse.
Was being hyperbolic, m8. The human body will block mmWave.
Well, the spectrum between 900 MHz and 1.2 GHz is pretty heavily utilised. I assume there'd be some angry licence holders around the planet who'd be pretty pissed off if every man and his dog was interfering with their existing traffic, not to mention the interference you'd get on your own signals.
I would probably add “transmit power” in there somewhere, but I guess if you’re assuming regulatory limits then it’s not a big variable.
yeah, I was thinking of the manufacturing triangle (Speed, Cost, Quality) when I was thinking up what it would be for wifi lol
And what are we downloading? Is the cloud dead? Why do I need 15 GB/s on my phone? Is it made for consoles and their relentless 120 GB patches?
VR headset streaming video from PC without cables.
More bandwidth available for users means more people can do more things on the internet and at a higher quality.
If cell phone speeds are high enough, then we should be able to transition from wired internet (which isn't available to a lot of people) to only using cell networks.
It’s also not going to be 15gbps per device.
For phones / portables, assuming it doesn’t draw more power, it would mean shorter download times, which means less battery usage.
“Assuming it doesn’t draw more power” has got to be the problem here, right? I don’t know much about wireless technology, but from a purely physical standpoint, faster signals mean higher frequencies, which means higher energies, which means more draw from the battery. Yes, shorter active time means less draw, but it’s like that swiss cheese joke:
Swiss cheese has holes.
More cheese = more holes
More holes = less cheese
Therefore,
More cheese = less cheese.
Laptops have all but taken over from desktops for everything but AAA gaming. New houses are still built with zero Ethernet because “the internet is Wi-Fi right?”
People are using their laptops to edit video off a NAS, and MacBooks can run 100 GB LLMs. Heck, even non-AAA games are many gigabytes.
For home use, all I can think of is wireless video. 15 GB/s is faster than the fastest DisplayPort or HDMI versions. It could handle any resolution and refresh rate currently in use without any compression. That would be useful for VR headsets since they need low latency.
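To put numbers on that, here's the raw bandwidth for a few uncompressed formats, assuming 10-bit RGB and ignoring blanking/encoding overhead (so real cable specs run a bit higher):

```python
def raw_video_gbps(width: int, height: int, fps: int, bpp: int = 30) -> float:
    """Uncompressed video rate; bpp = bits per pixel (30 = 10-bit RGB).
    Ignores blanking intervals and line/transport overhead."""
    return width * height * fps * bpp / 1e9

for name, w, h, fps in [("4K60",  3840, 2160, 60),
                        ("4K120", 3840, 2160, 120),
                        ("8K60",  7680, 4320, 60),
                        ("8K120", 7680, 4320, 120)]:
    print(f"{name:>6}: {raw_video_gbps(w, h, fps):6.1f} Gbps")
# 8K120 comes out around 119 Gbps -- right at the demo's
# 120 Gbps, so "any resolution currently in use" holds up.
```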
Yeah - that covers about 1/100000 users
I’m pretty sure anyone using an HDMI cable could appreciate having no cables except power.
On the flip side, if you still need a power cable anyway, it’s usually way cheaper to bundle the media (and optionally control/network) signals into the same cable than using wireless. (Sidenote: Honestly it’s kinda weird to me that we haven’t seen hardly any of this in consumer spaces. The newer USB-C revisions could easily supply power, display, audio, and network to the average TV over one cable.)
Now, with true wireless power (I’m thinking of this video in particular), that proposition can change dramatically.
In the US we’ll do anything but build fiber with the billions we tossed at the telecom industry.
Putting fiber in the ground is expensive. I work for an ISP, and we estimate fiber overbuild costs at $15/ft. So a mile of underground fiber costs about $79,200.
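For anyone who wants to play with the numbers (the $15/ft figure is the estimate above; the rest is just unit conversion):

```python
# $15/ft is the overbuild estimate from the comment above.
cost_per_ft = 15
feet_per_mile = 5280

for miles in (1, 10, 100):
    print(f"{miles:>4} mi: ${cost_per_ft * feet_per_mile * miles:,}")
# 1 mi -> $79,200; a 100-mile buildout is ~$7.9M before
# splicing, permitting, and make-ready are even counted.
```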
Yup. That’s why we gave them all that money years ago to do it. It was cheaper then too.
Big data needs that, so it can spy on you better.
One example I’ve read was to remotely drive autonomous vehicles and feed back all the data collected from cameras and sensors. I’m not a fan of it being used this way, but it would mostly serve that kind of purpose.
Everything, no, to move data quicker, no
1.5 Gb/s is way more than enough for the average person. Hell, 200 Mb/s is more than enough. Even a 120 GB patch would only take about 10 min at 1.5 Gb/s.
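Rough numbers, taking the 120 GB console patch mentioned upthread as the worst case:

```python
patch_gb = 120  # the console-patch size mentioned upthread

for name, gbps in [("200 Mb/s", 0.2), ("1.5 Gb/s", 1.5), ("120 Gb/s", 120)]:
    minutes = patch_gb * 8 / gbps / 60
    print(f"{name:>9}: {minutes:6.1f} min")
# 200 Mb/s -> 80 min, 1.5 Gb/s -> ~10.7 min, 120 Gb/s -> ~8 s
```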