
NVIDIA GeForce GTX 675M SLI vs NVIDIA GeForce GTX 680MX

NVIDIA GeForce GTX 675M SLI


The NVIDIA GeForce GTX 675M SLI is a high-end graphics solution for laptops based on two GTX 675M graphics cards running in SLI mode. With SLI, each card usually renders a separate frame in turn (AFR mode). As a result, the combination may suffer from micro stuttering at low frame rates around 30 fps, caused by irregular delays between sequential frames.
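To illustrate why micro stuttering does not show up in an average fps figure, the following Python sketch (with invented frame times, not measured data) compares a steady 30 fps sequence with an AFR-style sequence whose frames arrive in irregular pairs:

    # Illustrative only: same average fps, very different frame pacing.
    smooth = [33.3] * 8                        # frame times in ms, steady 30 fps
    afr    = [12, 55, 12, 55, 12, 55, 12, 55]  # same average, alternating short/long gaps

    def stats(frame_times_ms):
        avg = sum(frame_times_ms) / len(frame_times_ms)
        return 1000.0 / avg, max(frame_times_ms)

    for name, seq in (("smooth", smooth), ("AFR-like", afr)):
        fps, worst = stats(seq)
        print(f"{name:9s} avg {fps:4.1f} fps, longest gap {worst:4.1f} ms")

    # Both sequences report roughly 30 fps on average, but the AFR-like one
    # contains 55 ms gaps, which is what is perceived as micro stuttering.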

The GeForce GTX 675M SLI supports the same features as a single GTX 675M card. Therefore, it supports DirectX 11 and is produced in a 40nm fabrication process at TSMC. Technically, the GTX 675M (SLI) is completely identical to the GTX 580M (SLI).

GF114 architecture

The GF114 core is a power-optimized version of the GF104 (used in the GTX 485M), but with no architectural changes. Therefore, the performance per MHz stays the same. Nvidia, however, was able to clock the shaders higher while still remaining in the same power envelope. The GTX 675M offers all 384 shader cores in the GF114. More information on the similar GF104 architecture can be found on the GeForce GTX 485M page.

Performance

Due to the higher clock speeds, the GeForce GTX 675M SLI is faster than the GTX 485M SLI and as fast as the GTX 580M SLI. Demanding games like Crysis 2, The Witcher 2, Dirt 3, or even Metro 2033 can be played at 1920x1080 with maximum details and most anti-aliasing features enabled.

Features

Similar to the older GF104 chip, the GF114 also supports Bitstream HD Audio (Blu-Ray) output via HDMI. That means it can transfer, for example, Dolby True HD or DTS-HD bitstream without quality loss to a HiFi receiver.

The GTX 675M offers the PureVideo HD technology for video decoding. The included Video Processor 4 (VP4) supports feature set C and is therefore able to fully decode MPEG-1, MPEG-2, MPEG-4 Part 2 (MPEG-4 ASP - e.g., DivX or Xvid), VC-1/WMV9, and H.264 (VLD, IDCT, Motion Compensation, and Deblocking). 

Furthermore, the GPU is able to decode two 1080p streams simultaneously (e.g. for Blu-Ray Picture-in-Picture).

Through CUDA, OpenCL, and DirectCompute 2.1 support, the GeForce GTX 675M can assist in general-purpose calculations. For example, the stream processors can encode videos considerably faster than a fast CPU. Furthermore, physics calculations can be offloaded to the GPU using PhysX (supported, e.g., by Mafia 2 or Metro 2033). As two GTX 675M cards are used in the SLI combination, one card can be dedicated to PhysX while the other renders the game graphics.
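As a rough illustration of the kind of general-purpose work these APIs enable, the sketch below offloads a simple element-wise addition to the GPU via OpenCL. It assumes the Python package pyopencl and a working OpenCL driver are installed and is not specific to this card:

    import numpy as np
    import pyopencl as cl  # assumes pyopencl and an OpenCL runtime are available

    ctx = cl.create_some_context()   # selects an available OpenCL device, e.g. the GPU
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    program = cl.Program(ctx, """
    __kernel void add(__global const float *a, __global const float *b, __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    program.add(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work item per element
    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)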

3D Vision enables the laptop to send 3D content (3D games, 3D Web Streaming, 3D photos, 3D Blu-Rays) to a built-in 3D enabled screen or an external 3D TV (only if supported by the laptop manufacturer).

The power consumption of the GeForce GTX 675M is in the same region as that of the old GTX 580M, supposedly about 100 Watts (TDP including the MXM board and memory). Therefore, the SLI system needs up to 200 Watts with both cards under load. Without load, the cards are clocked down to 50/100 MHz (chip/shader) in 2D mode and 200/400 MHz in 3D mode to save power. Furthermore, a single card can be deactivated to save additional power.

The similarly named desktop GeForce GTX 580 is based on the larger GF110 chip and is much faster in comparison. The 675M is therefore more closely related to the desktop GeForce GTX 560 Ti, which is still faster due to its higher clock speeds.

NVIDIA GeForce GTX 680MX


The NVIDIA GeForce GTX 680MX is a high-end DirectX 11-compatible graphics card commonly found in Apple iMac products. It is based on the 28nm GK104 Kepler architecture similar to the GTX 680M, but features more CUDA cores (1536 vs. 1344) and a higher memory clock rate (720/2500 MHz core/memory vs. 720/1800 MHz).

Architecture

The Kepler architecture is the successor to the Fermi architecture that first appeared in laptops with the GeForce 400M series. The GK104 Kepler core offers eight shader blocks, called SMX, that are clocked at the same speed as the central core. In the GTX 680MX, all eight blocks are active for a total of 1536 CUDA cores. The Kepler architecture packs far more shader cores onto a chip than Fermi, and its shaders can be up to twice as power efficient. However, due to the missing hot clock of the shader domain, two Kepler shaders deliver roughly the speed of one Fermi shader (as the latter is clocked twice as fast).
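A back-of-the-envelope calculation with the clock rates from the specification table below illustrates the point. Counting two floating-point operations (one fused multiply-add) per shader and clock, the doubled Fermi hot clock partly offsets Kepler's much larger shader count:

    def gflops(shaders, shader_clock_ghz, ops_per_clock=2):  # 2 ops = one fused multiply-add
        return shaders * ops_per_clock * shader_clock_ghz

    fermi_675m   = gflops(384, 1.24)   # Fermi hot clock: twice the 620 MHz core clock
    kepler_680mx = gflops(1536, 0.72)  # Kepler shaders run at the 720 MHz core clock

    print(f"GTX 675M (single card): ~{fermi_675m:.0f} GFLOPS")   # ~952 GFLOPS
    print(f"GTX 680MX             : ~{kepler_680mx:.0f} GFLOPS") # ~2212 GFLOPS

    # Per shader and per clock both architectures execute the same two operations,
    # so 1536 Kepler shaders at 720 MHz correspond to roughly 768 Fermi shaders at 1440 MHz.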

 

PCIe 3.0 is now supported by the mobile Kepler series and an optional Turbo mode can automatically overclock the Nvidia card by a theoretical 15 percent if the laptop cooling system allows it. The implementation of this boost mode is done in the BIOS, but it is ultimately dependent upon the manufacturer of the laptop.

Performance

Thanks to the additional shader cores and the faster memory, the graphics performance of the GeForce GTX 680MX should be 15 to 25 percent above the GTX 680M and similar to the desktop GTX 580. The GPU has enough power to run demanding games of 2012 fluently at Full HD resolution and maxed-out graphical settings. Battlefield 3, Skyrim, and Crysis 2, for example, are playable at the highest detail settings.
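The 15 to 25 percent estimate can be sanity-checked against the raw specifications: the shader count ratio gives a rough lower bound and the memory clock ratio an upper bound for the expected gain over the GTX 680M.

    shader_gain = 1536 / 1344 - 1   # ~14% more CUDA cores at the same 720 MHz core clock
    memory_gain = 2500 / 1800 - 1   # ~39% higher memory clock on the same 256-bit bus

    print(f"compute:   +{shader_gain:.0%}")   # +14%
    print(f"bandwidth: +{memory_gain:.0%}")   # +39%

    # Depending on whether a game is shader- or bandwidth-limited, real-world gains
    # fall between these bounds, which is consistent with the 15-25 percent estimate.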

As an example, the GTX 680MX can play Battlefield 3 on ultra settings at 30 fps at the native resolution of 2560x1440 during our benchmark sequence. For fluent multiplayer gameplay, the resolution and/or anti-aliasing should be reduced (e.g., 1920x1080 Ultra at 45 fps).

Features

The improved feature set now includes support for up to 4 active displays. Furthermore, high resolution monitors of up to 3840x2160 pixels can now be connected using DisplayPort 1.2 or HDMI 1.4a if available. HD-Audio codecs, such as Dolby TrueHD and DTS-HD, can be transmitted via bitstream mode through the HDMI port. However, as most laptops will feature Optimus, the integrated GPU will likely have direct control over the display ports and may limit the feature set available by the Nvidia Kepler cards.

The 5th generation PureVideo HD video processor (VP5) is also integrated in the GK104 core and offers hardware decoding of HD videos. Common codecs such as MPEG-1/2, MPEG-4 ASP, VC-1/WMV9, and H.264 are supported in hardware; H.264 can be decoded at resolutions of up to 4K, while VC-1 and MPEG-4 are supported up to 1080p. Two streams can be decoded in parallel for features such as Picture-in-Picture. Another novelty is the inclusion of a dedicated video encoding engine similar to Intel QuickSync that can be accessed via the NVENC API.

The power consumption of the GeForce GTX 680MX should be somewhat higher than the GTX 680M, making it difficult to cool for laptops. The Apple iMac is currently the most readily available product to utilize this high-end card.

NVIDIA GeForce GTX 675M SLI vs. NVIDIA GeForce GTX 680MX

GeForce GTX 600M Series (shader cores @ core clock | memory bus @ memory clock)
GeForce GTX 680M SLI    2688 @ 0.72 GHz | 2x 256 Bit @ 3600 MHz
GeForce GTX 680MX       1536 @ 0.72 GHz | 256 Bit @ 5000 MHz
GeForce GTX 675M SLI     768 @ 0.62 GHz | 256 Bit @ 3000 MHz
GeForce GTX 680M        1344 @ 0.72 GHz | 256 Bit @ 3600 MHz
GeForce GTX 670MX SLI   1920 @ 0.6 GHz  | 2x 192 Bit @ 2800 MHz
GeForce GTX 675MX        960 @ 0.6 GHz  | 256 Bit @ 3600 MHz
GeForce GTX 675M         384 @ 0.62 GHz | 256 Bit @ 3000 MHz
GeForce GTX 670MX        960 @ 0.6 GHz  | 192 Bit @ 2800 MHz
GeForce GTX 670M         336 @ 0.6 GHz  | 192 Bit @ 3000 MHz
GeForce GTX 660M         384 @ 0.84 GHz | 128 Bit @ 4000 MHz
Specification              | GeForce GTX 675M SLI                                           | GeForce GTX 680MX
Codename                   | N12E-GTX2                                                      | -
Architecture               | Fermi                                                          | Kepler
Pipelines                  | 768 - unified                                                  | 1536 - unified
Core Speed                 | 620 MHz                                                        | 720 MHz
Shader Speed               | 1240 MHz                                                       | -
Memory Speed               | 3000 MHz                                                       | 5000 MHz
Memory Bus Width           | 256 Bit                                                        | 256 Bit
Memory Type                | GDDR5                                                          | GDDR5
Shared Memory              | no                                                             | no
API                        | DirectX 11, Shader 5.0                                         | DirectX 11, Shader 5.0, OpenGL 4.3
Power Consumption          | 2x 100 Watt                                                    | 122 Watt
Technology                 | 40 nm                                                          | 28 nm
Features                   | Optimus, SLI, PhysX, Verde Drivers, CUDA, 3D Vision, 3DTV Play | Optimus, SLI, PhysX, Verde Drivers, CUDA, 3D Vision, 3DTV Play
Notebook Size              | large                                                          | large
Date of Announcement       | 06.01.2011                                                     | 23.10.2012
Link to Manufacturer Page  | www.nvidia.com                                                 | www.geforce.com
Max. Amount of Memory      | -                                                              | 2048 MB
Transistors                | -                                                              | 3.5 Billion

Benchmarks

Benchmark                                    | GeForce GTX 675M SLI | GeForce GTX 680MX
3DMark 11 - Performance Score                | 6389 Points (9%)     | 6883 Points (10%)
3DMark 11 - Performance GPU                  | 6407 Points (6%)     | 6736 Points (6%)
3DMark Vantage - Performance total           | 18405 Points (5%)    | 25501 Points (7%)
3DMark Vantage - Performance GPU (no PhysX)  | 17631 Points (10%)   | 25270 Points (14%)

Average Benchmarks NVIDIA GeForce GTX 675M SLI → 100% n=4

Average Benchmarks NVIDIA GeForce GTX 680MX → 124% n=4
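For reference, the relative average follows from the four synthetic scores listed above, assuming a simple arithmetic mean of the per-benchmark ratios (the exact weighting used for the site average may differ):

    # (GTX 675M SLI, GTX 680MX) scores of the four benchmarks above
    scores = [(6389, 6883), (6407, 6736), (18405, 25501), (17631, 25270)]

    ratios = [mx / sli for sli, mx in scores]
    average = sum(ratios) / len(ratios)
    print(f"GTX 680MX relative to GTX 675M SLI: {average:.0%}")   # ~124%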


Game Benchmarks

The following benchmarks stem from our reviews of laptops equipped with these GPUs. The performance depends on the used graphics memory, clock rate, processor, system settings, drivers, and operating system, so the results are not necessarily representative of all laptops with this GPU. For detailed information on the benchmark results, click on the fps number.

Game benchmark results for the NVIDIA GeForce GTX 680MX (average fps; low preset at 1024x768, med. at 1366x768, high at 1366x768 except The Witcher 3 at 1920x1080, ultra at 1920x1080):

Game               | low | med. | high | ultra
The Witcher 3      | 72  | 43.6 | 26.4 | 14.2
Dead Space 3       | -   | -    | -    | 133
Hitman: Absolution | -   | -    | 55.4 | 34.4
F1 2012            | -   | -    | -    | 97.5
Sleeping Dogs      | -   | -    | 99   | 31.1
Battlefield 3      | 126 | 104  | 92   | 45.6


For more games that might be playable and a list of all games and graphics cards, visit our Gaming List.



Redaktion, 2017-09-08 (Update: 2023-07-01)