After months of leaks and rumors, the time has finally come. As expected, NVIDIA presented the new GeForce RTX 4090 and two versions of the RTX 4080 on the live stream called “GeForce Beyond”. In the following article we bring you all the essential information about the new RTX 4000 series cards, their expected release dates, and the prices NVIDIA will charge for the Ada Lovelace GPUs.
NVIDIA GeForce RTX 4090: Confirmed specs and more
As expected, NVIDIA has presented its new GeForce RTX 4090 flagship graphics card. With 76.3 billion transistors built on a 4 nm process node, Ada Lovelace has both the highest transistor density and the highest absolute transistor count of any GPU released to date. The numerous leaks about memory also proved to be correct – NVIDIA is launching its new top model with 24 gigabytes of GDDR6X video memory.
In addition, the RTX 4090, like its direct predecessor, will use a 384-bit memory interface, as can be seen on the NVIDIA product page for the new graphics cards. With 16,384 CUDA cores, a base clock of 2,230 MHz, and a boost clock of 2,520 MHz, the new flagship promises a massive performance boost. The technical data also confirms that the RTX 40 series continues to rely on PCIe Gen 4.
Further information on the RTX 4090 is based on details shown by board partner PNY. The new GPU will feature up to 576 Tensor cores (4th gen) and 144 RT cores (3rd gen) – not only a generational leap in the cores themselves, but also a sizeable increase in their number compared to the RTX 3000 series.
The power consumption of the RTX 4090 is expected to increase as well. Leaks speak of a standard TDP of 450 watts, which could increase up to 600 watts. However, it remains to be seen whether cards designed for gaming will feature the same power consumption rating.
NVIDIA GeForce RTX 4080: One card, two models
In addition to the RTX 4090, two variants of the RTX 4080 were also shown. The first features 16 gigabytes of GDDR6X video memory, a base clock of 2,210 MHz, and a boost clock of 2,505 MHz – on paper, clocks only marginally lower than the RTX 4090’s. However, this RTX 4080 variant uses a narrower 256-bit memory interface, with its memory running at 23 Gbps.
The number of CUDA cores is also reduced here, to 9,728. In return, the expected power draw will be significantly lower compared to the RTX 4090. NVIDIA has provided a figure of 320 watts; however, rumors suggest a 340-watt TDP that could rise as high as 516 watts.
The second RTX 4080 model NVIDIA presented is weaker in almost every respect, and many tech journalists think it was originally supposed to be the RTX 4070 (Ti). This card will feature 12 GB of VRAM running at 21 Gbps on a 192-bit memory bus, and far fewer CUDA cores – 7,680 in total.
The weaker card will, however, run at higher clocks: the RTX 4080 12GB features a base clock of 2,310 MHz and a boost clock of 2,610 MHz. It will also probably consume less power. According to leakers, the TDP of this card could be around 285 watts, potentially rising to 366 watts.
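The memory figures above translate directly into peak bandwidth: multiply the per-pin data rate (Gbps) by the bus width (bits) and divide by 8 to get bytes per second. A minimal sketch, using only the rates and bus widths quoted for the two RTX 4080 variants (nominal peak values, not measured throughput):

```python
def peak_bandwidth_gbs(rate_gbps: float, bus_width_bits: int) -> float:
    """Nominal peak memory bandwidth in GB/s: per-pin rate x bus width, in bytes."""
    return rate_gbps * bus_width_bits / 8

# Figures from the article: (per-pin data rate in Gbps, memory bus width in bits)
cards = {
    "RTX 4080 16GB": (23, 256),
    "RTX 4080 12GB": (21, 192),
}
for name, (rate, bus) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(rate, bus):.0f} GB/s")
# RTX 4080 16GB: 736 GB/s
# RTX 4080 12GB: 504 GB/s
```

The narrower bus and slower memory of the 12 GB card thus cost it roughly a third of the 16 GB model’s peak bandwidth, which matters more than the small clock-speed advantage suggests.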
NVIDIA RTX 4000: DLSS 3.0 and new ray tracing technologies
As NVIDIA promises, the GeForce RTX 4090 should deliver up to 4x the performance of the previous top GPU, the RTX 3090 Ti, in ray tracing games. A similar leap is claimed for the 16 GB version of the RTX 4080 compared to the RTX 3080 Ti, while the RTX 4080 12GB should be able to surpass the RTX 3090 Ti with the help of the new DLSS 3.0.
DLSS 3.0 is set to become one of the main features of the Ada Lovelace chips. The new artificial intelligence upscaler set to be released on October 12 is only compatible with the RTX 4000 series graphics cards and is said to be able to quadruple the frame rates in supported games.
Previously, DLSS (Deep Learning Super Sampling) worked by rendering images at a lower resolution and then upscaling them with AI. DLSS 3.0 goes a step further: it can generate entire additional frames and insert them between conventionally rendered ones, skipping the costly rendering step for those frames and thus delivering significantly higher frame rates. More than 35 games and applications should already support the new technology when it is released on October 12 – including Cyberpunk 2077, Microsoft Flight Simulator, and the new Unreal Engine 5.
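The frame-generation idea above can be illustrated schematically. The sketch below is purely conceptual – it is not NVIDIA’s implementation, and the `generate` callback stands in for the neural network that synthesizes an intermediate image from two rendered frames:

```python
# Conceptual sketch of frame generation (not NVIDIA's actual pipeline):
# an AI-predicted frame is inserted between every pair of rendered frames,
# roughly doubling the number of frames presented to the display.
def present_with_frame_generation(rendered_frames, generate):
    out = []
    for prev, nxt in zip(rendered_frames, rendered_frames[1:]):
        out.append(prev)
        out.append(generate(prev, nxt))  # synthesized intermediate frame
    out.append(rendered_frames[-1])
    return out

# Toy stand-in for the generator: a midpoint blend of two "frames" (numbers here).
frames = [0.0, 1.0, 2.0]
print(present_with_frame_generation(frames, lambda a, b: (a + b) / 2))
# → [0.0, 0.5, 1.0, 1.5, 2.0]
```

Three rendered frames become five presented frames – which is why generated frames can raise frame rates even in scenarios where the GPU’s raw rendering throughput is the bottleneck.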
The ray tracing and Tensor cores are also being updated, to the third and fourth generations respectively. The new RT cores should achieve a computing power of up to 200 RT teraflops and, according to NVIDIA, are equipped with two new engines. The “Opacity Micromap” (OMM) engine is responsible for faster ray tracing of alpha-tested textures – such as foliage or fences – while the new “Displaced Micro-Mesh” (DMM) engine is designed to speed up real-time ray tracing of geometrically complex scenes.
The fourth generation of Tensor cores, meanwhile, offers a theoretical computing power of 1,400 Tensor teraflops and is primarily intended to accelerate the aforementioned DLSS 3.0. Alongside Cyberpunk 2077 and Microsoft Flight Simulator, Portal was also shown as an example; the game is to be reissued as an RTX version.
NVIDIA GeForce RTX 4090 and 4080: Price and release date
The official release date for the GeForce RTX 4090 is October 12th. The Founders Edition should be available from that date, and board partners such as Gigabyte and MSI have already announced similar release dates. The two RTX 4080 versions should follow in November, with only the RTX 4080 16GB receiving a Founders Edition.
NVIDIA has also already provided MSRPs for the new cards. For the GeForce RTX 4090, the company set an MSRP of $1,599. The 16GB variant of the RTX 4080 will carry a price tag of $1,199, while the 12GB version is going to cost $899. The cards can already be pre-ordered at Newegg.com.