GeForce Now With RTX 3080 Tested

Nvidia just upgraded its GeForce Now subscription game streaming service with the RTX 3080 tier, which promises gamers the performance of a GeForce RTX 3080 in the cloud. Well, sort of. You see, it's actually an Nvidia A10G GPU in the cloud, which is both better and worse than an RTX 3080 from a performance and features standpoint — more on that below. The RTX 3080 is still our top pick for the best graphics cards, contingent upon actually finding one for close to the recommended price. Also, it's in the cloud, which inherently limits what you can do with the GPU. This can be good and bad.

For example, there's no cryptocurrency mining using GeForce Now. Yay! Users are also allocated a GPU solely for their use, while the CPU gets shared between two users. Unfortunately, you're limited to games that are supported on the GeForce Now platform, and while there are many Steam, Epic, GOG, and Ubisoft games that will run on GeForce Now, there are plenty of missing titles as well: Borderlands 3 (and the rest of the series), Dirt 5, Horizon Zero Dawn, Metro Exodus Enhanced Edition, and Red Dead Redemption 2, to name a few. Also, anything that requires the Microsoft Store is out.

But it's not just about the games and hardware. One of the biggest barriers to PC gaming right now is the lack of graphics cards. Unless you want to pay eBay GPU prices, which still tend to be 50–100 percent higher than the official MSRPs, it's virtually impossible to buy a graphics card. That goes double for the RTX 3080. During the past couple of weeks, the average price on RTX 3080 cards purchased on eBay still sits at nearly $1,700, more than double the $699 MSRP.

What if you could forget about the difficulty of buying a graphics card and simply rent the performance in the cloud? That's the theory behind the latest upgrades to GeForce Now. The RTX 3080 tier supports up to 2560x1440 resolution gaming, and even streams at 120 fps. Nvidia also claims improved latencies — supposedly better than playing locally on a latest generation Xbox Series X. The cost for this tier is $100 every six months.

That might seem steep, but considering an RTX 3080 might cost $1,500 or more, that's seven and a half years of GeForce Now streaming. Or, you know, however long it takes for GPU prices and availability to return to normal. Theoretically, you could pay for the service for six months, or even two years, and then upgrade your home PC once the current shortages and extreme retail pricing fade away.
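For the skeptical, the break-even math is simple enough to sketch in a few lines of Python (the $1,500 figure is the rough street price cited above, not an official number):

```python
# Back-of-the-envelope: years of RTX 3080 tier streaming per GPU purchase.
gpu_street_price = 1500           # rough RTX 3080 street price, not MSRP
subscription_per_year = 100 * 2   # $100 per six-month block

print(gpu_street_price / subscription_per_year)  # 7.5 years
```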

It's not the worst idea we've ever heard, but how does the new RTX 3080 tier of GeForce Now actually perform? And how does it compare to running games locally on an RTX 3080? We set out to do some testing to see for ourselves. Let's quickly start with the basics.

GeForce Now Superpod Hardware

The new GeForce Now Superpods come packed with a lot of hardware. Nvidia didn't want to get into all the specifics (unfortunately for us, since we love that kind of detail), but we do know that each Superpod houses a bunch of rack-mounted servers equipped with Threadripper Pro 3955WX CPUs and Nvidia A10G graphics cards.

The Nvidia A10G isn't the same as an RTX 3080, and we assume it's basically the same core hardware as the datacenter Nvidia A10. It comes equipped with 24GB of GDDR6 memory — the same amount you'd get with an RTX 3090! Except the VRAM runs at 12.5Gbps (probably to save power), giving you 600GB/s of bandwidth, compared to 760GB/s on the RTX 3080. On the other hand, it has 9,216 CUDA cores where the RTX 3080 has 8,704, so in theory the two should be roughly comparable: more memory, more compute, less memory bandwidth.
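If you want to check those bandwidth figures yourself, here's a minimal sketch. The 384-bit bus width for the A10G is our inference from the 24GB capacity and the 600GB/s figure, not something Nvidia confirmed:

```python
# Peak memory bandwidth: per-pin data rate (Gbps) times bus width (bits), over 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(12.5, 384))  # A10G (assumed 384-bit bus): 600.0 GB/s
print(bandwidth_gb_s(19.0, 320))  # RTX 3080 (19Gbps GDDR6X, 320-bit): 760.0 GB/s
```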

The Threadripper Pro 3955WX is a Zen 2 processor with two 8-core compute chiplets. Nvidia allocates all of the cores and threads from one chiplet to each user, which should improve overall performance by keeping commonly accessed data local to a single chiplet. Clock speeds are 3.9–4.3GHz on the 3955WX, but because it's the older Zen 2 architecture, it's probably a bit slower for gaming purposes than many of Intel's latest chips. Still, it shouldn't be a serious bottleneck for gaming at 1440p, especially when doing so from the cloud.

Besides the CPU and GPU, each RTX 3080 GeForce Now instance gets 28GB of DDR4-3200 memory and access to 30TB of fast PCIe Gen4 SSD storage. When you put everything together, the total cost per GeForce Now instance has to run at least $2,000. But Nvidia doesn't expect people to use these instances 24/7, so even though users only pay $16.67 per month for unlimited streaming (limited to eight hour sessions at a time), Nvidia presumably allocates dozens of users per installed set of hardware.

We asked a bunch of other questions about the GeForce Now Superpods, but unfortunately Nvidia didn't want to provide answers or photos. These look similar to the A100 Superpods, but the individual servers are obviously quite different. Our back-of-the-napkin math (based on Nvidia's statement that each Superpod houses 8,960 CPU cores and 11,477,760 CUDA cores) is that there are 20 racks per Superpod, with each rack housing 28 servers. Each server would come equipped with a single Threadripper Pro 3955WX and two Nvidia A10G GPUs.

Except, 1,120 GPUs at 9,216 CUDA cores each works out to only 10,321,920 CUDA cores, so the A10G may actually look more like an RTX 3080 Ti with double the VRAM. At 10,240 CUDA cores per A10G, we'd get 11,468,800 CUDA cores spread out over 1,120 GPUs, and the 'missing' 8,960 CUDA cores could be located in Nvidia BlueField network devices. Or perhaps there's some other explanation, but that's all we've got for now. We do know that Nvidia says each Superpod supports "over 1,000 concurrent users," which jibes with most of the other numbers.
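Here's that napkin math in runnable form, with our guessed rack layout called out as the assumption it is:

```python
# Checking our Superpod guesses against Nvidia's stated totals.
# The 20-rack, 28-server, 2-GPU layout is our assumption, not a confirmed spec.
gpus = 20 * 28 * 2                  # 1,120 GPUs across the Superpod

print(gpus * 9_216)                 # 10,321,920: falls short of 11,477,760
print(gpus * 10_240)                # 11,468,800: an RTX 3080 Ti-like count fits better
print(11_477_760 - gpus * 10_240)   # 8,960 'missing' cores, possibly in BlueField DPUs
```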

GeForce Now Bandwidth Requirements

Before we get into the comparisons, let's get this out of the way: GeForce Now, like any game streaming service, requires a decent internet connection. It's not too onerous, though, with a 30Mbps download speed as the baseline, and 50Mbps as the maximum configurable stream quality. I did some testing and found that using the default Balanced connection settings (30Mbps), actual data usage was typically in the 20–30 Mbps range, averaging around 25Mbps. That works out to about 11GB of data per hour. Using the Custom option and selecting the maximum quality (50Mbps, 2560x1440, 120 fps, and no automatic network speed adjustments), data use wasn't actually that much higher: about 30Mbps, or 13.5 GB per hour.
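Converting a bitrate into hourly data use is straightforward; here's a quick sketch, using decimal gigabytes since that's how ISPs meter things:

```python
# Stream bitrate (Mbps) to data used per hour (decimal GB).
def gb_per_hour(mbps: float) -> float:
    return mbps * 3600 / 8 / 1000  # Mb/s -> seconds/hour -> bytes -> GB

print(gb_per_hour(25))  # ~11.25 GB/hour, the Balanced preset's observed average
print(gb_per_hour(30))  # ~13.5 GB/hour at maximum quality
```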

How you feel about that amount of data usage will depend very much on your internet provider. I used to have Comcast Xfinity, which cost around $110 per month for the xFi Complete package with unlimited data, or $100 a month with a 1.2TB data cap (technically 1228GB), plus $10 for each additional 50GB up to a maximum of $100 per month in overages. The data cap sucked, mostly because multiple game downloads and updates every month could easily consume hundreds of GB of data, and my household routinely came close to that limit. Paying an extra $10 for unlimited data was an easy fix, though you'd need to check whether your provider offers unlimited data and how much it charges.

With a data cap of 1TB, which seems relatively common these days, that's still enough for about 75 hours of GeForce Now gaming each month — assuming no one does anything else with the internet. It's basically equivalent to the amount of data used by streaming 4K movies and television. But seriously, if you have a data cap and play games a lot, or stream TV and play games, forget about game streaming services or look into upgrading to an unlimited plan first. (For the record, I moved and now have faster Internet via TDS for about $97 a month, with no data cap. Hallelujah!)
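Continuing the sketch above, the cap math works out like this:

```python
# Hours of max-quality streaming that fit under a 1TB monthly cap.
print(1000 / 13.5)  # ~74 hours, assuming nothing else touches the connection
```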

If you think 30 or 50 Mbps seems like a lot, it's good to put things into perspective. The source data for 2560x1440 at 120 fps, at 32 bits per pixel, would require 14.16 Gbps without compression. Yeah, we're not getting that level of bandwidth into our homes any time soon. We're basically looking at about a 300:1 compression ratio, using lossy video compression algorithms, because that's the only way to make this work.
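To verify the uncompressed figure and the resulting ratio (32 bits per pixel is the assumption the 14.16 Gbps number implies):

```python
# Uncompressed source bitrate for a 2560x1440, 120 fps, 32bpp stream.
raw_gbps = 2560 * 1440 * 120 * 32 / 1e9

print(raw_gbps)              # ~14.16 Gbps
print(raw_gbps * 1000 / 50)  # ~283:1 compression ratio at the 50Mbps stream cap
```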

Wired or Wireless Networking?

The next thing you need to consider is whether you'll be using wired or wireless networking — or possibly both. Nvidia recommends wired connections, which makes sense. Wireless connections are far less predictable. Despite having a strong signal (>400Mbps down and 20Mbps up), when I tried GeForce Now on a laptop I ended up with a lot of intermittent lag. Playing off a wired connection fixed the problem.

Your particular network setup and devices will play a role, however. Maybe my router isn't the best, or perhaps the wireless adapter in my laptop was to blame. Or maybe there was some other interference causing issues. If you can use a wired gigabit (or faster) Ethernet connection, that will undoubtedly work best. If not, try to stay close to your router and perhaps downgrade the resolution and framerate to 1080p at 60 fps.

GeForce Now works on Android smartphones and tablets, Chromebooks, MacBooks, and of course Windows. Some people claim to have gotten it to work on Linux as well, but that's not officially supported — and neither are iOS devices. Many of those devices lack wired connections, in which case the RTX 3080 tier's latency advantages may go to waste and you'd arguably be better off with the $50 per six months tier of GeForce Now.

GeForce Now RTX 3080 Availability

The GeForce Now RTX 3080 tier has officially rolled out, but it's not available at all GeForce Now installations. Specifically, you'll want to check this page to see if you have a reasonably close node that will work. I'm in Colorado (US Mountain region), but the US Midwest region provided a good result while the US West region basically failed the latency test. The GeForce Now client software automatically checks for the optimal region, and you can start with the free tier of service to see what it selects.

Nvidia will continue upgrading its GeForce Now data centers with the new Superpods, but there's no indication of how quickly that will happen. Note also that, like the original GeForce Now Founders Edition, the RTX 3080 tier has a limited number of slots available. Presumably, more slots will be added over time, but Nvidia doesn't want too many people signing up for something when it doesn't have the infrastructure in place to support those users. Nvidia wouldn't tell us the ratio of users to hardware that it's using.

Bottom line: If you want the benefits of the RTX 3080 tier, you'll probably want to jump on it quickly and subscribe.

GeForce Now Test Setup

For these tests, I used a PC equipped with a Core i5-11400F CPU, 16GB of DDR4-3200 memory, and an RTX 3080 Founders Edition for the local GPU testing. It also has SSD storage for the games, and it's connected to an Acer Predator X27 monitor (for 144Hz support). I also tested on an Ice Lake i7-1065G7 laptop with a 4K display (set to 1440p) for wireless testing, but that wasn't a pleasant experience.

My main goal here was to compare actual performance using in-game benchmarks (you can't capture frametimes for games running on GeForce Now), and also to look at image quality. I also played some games running off the service, which you can read about in the experiential gaming section below.

Note that the GeForce Now RTX 3080 tier also supports 4K and 60 fps streaming, but only if you have a Shield TV. I didn't test this, and in general I'd prefer 1440p and 120 fps, just because that should improve both latency and performance — 4K gaming tends to be a bit too demanding in a lot of games, even with an RTX 3080 equivalent. Nvidia hasn't enabled 4K support on PCs due to the wide disparity in video decoding hardware, but don't be surprised if that shows up somewhere down the road (with a minimum requirement of a 6th Gen Intel CPU or better).

Due to the limited selection of games available on GeForce Now (as I noted above, Borderlands 3, Dirt 5, Horizon Zero Dawn, and Red Dead Redemption 2 are all unavailable, which is sad as they all have useful built-in benchmarks), I've tested with Assassin's Creed Valhalla, Far Cry 6, Shadow of the Tomb Raider, and Watch Dogs Legion. Three of those games come from Ubisoft and ran off Ubisoft Connect, while Shadow of the Tomb Raider used Steam.

Watch Dogs Legion and Shadow of the Tomb Raider both support ray tracing effects as well, so we tested that. Unfortunately, while Far Cry 6 supports ray tracing, it uses DirectX 12 Ultimate, and that's not currently supported on Windows Server 2016. Nvidia says it's working on upgrading to Windows Server 2022, which will allow DX12 Ultimate games to work with ray tracing enabled. We manually configured GeForce Now for 2560x1440 and 120 fps, set Windows to 2560x1440 and 144Hz with G-Sync enabled, and proceeded to run some tests.