| Spec | Value |
|---|---|
| Interface | PCI Express 4.0 |
| GPU Series | NVIDIA GeForce RTX 40 Series |
| GPU | GeForce RTX 4070 Ti |
| Core Clock | 2310 MHz |
| Boost Clock | 2760 MHz |
| HDMI | 2 x HDMI 2.1a |
| DisplayPort | 3 x DisplayPort 1.4a |
| Max Resolution | 7680 x 4320 |
| Thermal Design Power | 285W |
| Recommended PSU Wattage | 750W |
| Power Connector | 1 x 16-Pin |
| Max GPU Length | 305 mm |
| Slot Width | 3.25 Slot |
| Date First Available | January 05, 2023 |
Pros: -12 GB of VRAM appears to be enough for most games, even at 4K with ray tracing -Solid rasterization performance -Ray tracing uplift over Ampere/Turing is considerable -Easy 100+ fps in most games at 1440p ultrawide; difficult to drop below 60 fps at 4K -I tried DLSS 3 Frame Generation with Hitman 3 & Cyberpunk 2077; worked great for me -Card looks good, and runs too quiet for me to hear over my (somewhat noisy) case fans
Cons: I took a star off for the RGB logo that doesn't want to stay off, the power adapter that could be designed better, and the fact that even though MSRP is $800, there is no Founders Edition card and thus no card that actually sells at MSRP. I have an ASUS motherboard, so I didn't have to install any additional software to turn off the RGB logo on the side. However, every time I lock/log out of my PC, the RGB comes back on even though it's still turned off in the software. I have to manually turn it back on and off for it to go off again. Weird and slightly annoying. The new 16-pin power connector itself didn't bother me or give me any headaches, but the adapter would be much better if they just made the cables longer. Then you could tuck your 8-pin power cables out of the way. But no, it's so short that your 8-pin cables are just hanging there under your GPU, looking all janky. I have since bought a Corsair 2x 8-pin to 16-pin 12VHPWR cable that cleans up the look considerably. Expensive, but a better deal than the 4080 (which is worse $/fps) and the 4090 (which is always marked way above MSRP). Also, it's irritating that with no shortage it's still impossible to buy a 4070 Ti at MSRP.
Overall Review: I had been hoping to buy an Ampere card since the 3080 rolled out, and after waiting so long for the prices to come down, it didn't sit right with me to get my hands on a now two-year-old GPU at MSRP now that the miners and scalpers are finally done having their fun. Nope, not doing that. This is the first 40-series GPU to cost under $1,000, and it has similar, if not better, performance than the last-gen 3090. Good enough for me. It'd be nice if it were cheaper, but NVIDIA is going to do NVIDIA, so whatever. Happy with my purchase.
Pros: The fans are quiet. The fan shroud is metal. Custom PCB with good component selection.
Cons: As with all GPUs, the price is too high. There is no way to turn off the RGB logo without installing software. Overclocking only nets around 5% better performance.
Overall Review: At this point, do you really want to buy an Ampere card? They are over two years old now.
Pros: Low wattage, compact profile for a 40 series, all-metal shroud, DLSS3
Overall Review: Performance of a top-tier previous-generation card in a 70-series card. You know the specs by now; it is a great-performing card with the ability to handle 4K gaming. Silent, low heat, and does what it's designed to do. Get one.
Pros: -Ray tracing and max settings rarely ever have to be sacrificed -Handles overclocking exceptionally well -All metal shroud and solid build quality -Temps run cool for the performance it delivers -Stand included to combat GPU sag -I like the look of it
Cons: -A prime example of manufacturers price gouging their customers -Annoying coil whine both at idle and under load
Overall Review: This card has run everything I've thrown at it with maxed-out settings. I won't have to worry about balancing fidelity and frame rate for a while, which is awesome! I was able to overclock this card quite a bit in MSI Afterburner, but honestly, it's not even necessary. But you can really push this card, which I'm sure is appealing to a lot of people. This is a HUGE upgrade over the 3060 I had in my rig, but I feel it's important to mention two glaring issues with this card. I was pretty happy with how quiet my PC ran before throwing this card in; I have Noctua fans, which are virtually silent. That means I can hear the coil whine quite a bit, which can be annoying. And it seems that ASUS will not fix coil whine in their cards, so be aware of that if you are planning on buying this card. Maybe this is just a drawback of owning a higher-power card, I don't know. The biggest complaint that I have is the pricing. This card was going to be sold for about $900 as an "RTX 4080 12GB" model. Backlash made NVIDIA (allegedly) rebrand these cards as the RTX 4070 Ti. I'm glad I'm able to afford this, but a lot of people won't be able to. NVIDIA pricing one of their mid-range GPUs within striking distance of $1,000 is absolutely disgusting, and it foreshadows further gouging with their lower-tier cards. I don't even want to think about what an RTX 4060 will cost. With that being said, I had decided that I wanted a card that can handle pretty much anything I throw at it with no compromises, and I wanted to spend less than $1,000, so this was pretty much my best option.
Pros: "Compact" design that doesn't require a new case, fans are quiet, stays cool, looks great.
Overall Review: I'm really happy with this card, it looks great, effortlessly went into my Meshify C case, comfortably runs games at 4K, and I got pretty lucky with a good chip that overclocks well.
Pros: Literally destroys any game I throw at it
Cons: Not cheap
Overall Review: I bought this to replace my aging GTX 1080. I could never find a 30-series card at MSRP. In retrospect, I'm glad I waited for the 4000 series. There's a lot of gripes from the big tech tubers on YouTube about the 4070 Ti, but ignore them if you're coming from the 1000-2000 series.
Overall Review: This card absolutely melts frames. I paired mine with a 5700X and an 850W PSU. It's ridiculous how triggered some reviewers and enthusiasts got over this card. Yeah, it has a slimmed-down memory/bus config, but it still slays at 1440p and even 4K in lots of titles. At its price, it is the current king of 1440p gaming. Ignore the noise. The card also idles at 22-23 degrees C. I was playing Far Cry 6 and Hogwarts on ultra settings in 1440p and it didn't break 60 degrees C. The power draw on this thing is great at 285W.
Pros: - Very powerful (I get insane fps and smoothness in any game with my 2500/1080 panel) - Silent - Beautiful design - 5 outputs! 3 DP, 2 HDMI
Cons: - Pricey! (I would have liked this card to be $750, not $1,150)
Overall Review: - It is an impressive card (4 times the PS5's raw power!) - Nice packaging - Extremely powerful; plays Cyberpunk all maxed out at 130 FPS!
Pros: No coil whine in my unit. Temps are great even in quiet mode. It can easily hit 3050 MHz with the power limit at +10%.
Cons: The price from NVIDIA is too high, but it's worth the extra $50.
Overall Review: Better to get this card than the others; I tried the MSI Ventus and it was a bad experience overall.
Pros: -Runs ice cold with low fan speeds, uses barely more power than my 3070 did -It's the exact same length as my son's ASUS TUF 3060ti, but obviously wider -Price/performance is actually very good at MSRP (see HW Unboxed review) -12GB VRAM is more than enough for what this card is made for (1440p)
Cons: -None so far, other than it wasn't half the price?
Overall Review: I wouldn't have upgraded from my 3070, but my youngest son wanted a PC. This is NOT a 4K card, although it does very well at that resolution. NOBODY has a 4K monitor anyway, and by the time 4K is a thing, this card will be obsolete anyway (i.e. the 1080 Ti, 2080 Ti, 3090...reviews all went on about how much VRAM these had). If you are one of the MINUSCULE FEW who actually owns a 4K monitor, get a 4090...you've spent way too much already, and that's the only true 4K card. Better yet, get a console, as you're probably only playing console games anyway. People spewing this 'not enough VRAM' nonsense are ill-informed and are being brainwashed.