What is adaptive sync?
Here is one indisputable fact: over the last few years, more laptops than desktop computers have been sold on the PC market. This is an inevitable side effect of the mobile revolution: we now live in a world in which computer users no longer want to be «anchored» to one place. Instead, they want to use their machines on the move, anytime and anywhere they please.
For most tasks this is not a problem – a modern notebook configuration provides blazing speed in usage scenarios such as web surfing, office applications, watching movies and listening to music.
With games, however, the situation is a little different. Traditionally, they are among the applications most «greedy» for hardware resources, placing exceptionally high demands on the system. And since modern mobile technology still demands a certain compromise between performance and efficiency, even the most powerful (and expensive) gaming-class laptop cannot match the performance of a desktop PC equipped with components of the latest generation.
But that may soon be a thing of the past thanks to a very interesting technology called «adaptive sync». For it we have to thank NVIDIA which, although it cannot be called the technology's creator, has made the greatest contribution to its commercialization.
With its help, gaming notebooks will very soon be able to offer a gaming experience comparable in quality to that of their desktop counterparts, and without drastic compromises in energy efficiency.
What is frame synchronization, and why is it a problem for gamers?
If you read reviews discussing the gaming performance of a particular configuration, you have probably noticed that their authors often refer to one specific figure – 60 frames per second. For most gamers the coveted 60 fps is a kind of «Holy Grail» of high-quality computer gaming, but the reason for this is connected not with computer hardware but with the characteristics of modern LCD displays.
Today most displays, especially those in mobile computers, use a fixed refresh rate. This is one of the key characteristics of any monitor: it indicates how many times per second the image displayed on it is updated.
Another parameter, which applies directly to the graphics card, is known as the frame rate (frames per second, or simply fps), and its name speaks for itself: this is the number of game frames the given configuration can produce to create a convincing illusion of motion.
To get the best gaming experience, these two indicators must be in sync – i.e., the computer hardware must deliver 60 frames per second, which the monitor then displays at a refresh rate of 60 Hz. In this ideal case you will see a picture on the screen that is smooth, with no flicker, tearing or stutter.
Alas, in practice this almost never happens. The reason is that any modern display has a fixed refresh rate – typically 60 Hz, although some monitors run at 120–144 Hz. The number of frames per second, however, inevitably varies; it simply cannot be constant. For example, in an open-world game, one and the same system can produce 50+ frames per second in enclosed spaces (i.e., performance close to the coveted 60 fps), but in open locations the figure can quickly sink to 30–40 fps. On slower configurations, the gap between the display's fixed refresh rate and the frame rate will be even greater.
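The mismatch is easy to see in a toy simulation. The sketch below (with hypothetical frame times in the 30–50 fps range) walks the refresh ticks of a fixed 60 Hz display and counts how often no new frame is ready, so the old one has to be shown again – the stutter the article describes:

```python
# Minimal sketch with made-up numbers: a fixed 60 Hz display refreshing
# every ~16.7 ms versus a GPU whose frame times vary between ~20 and ~33 ms
# (i.e. roughly 30-50 fps). We count refresh cycles with no new frame ready.

REFRESH_INTERVAL = 1 / 60  # fixed display refresh period, seconds

# Simulated per-frame render times in seconds (hypothetical values)
frame_times = [0.021, 0.025, 0.033, 0.029, 0.022, 0.031, 0.027, 0.024]

# Timestamps at which each frame finishes rendering
finish_times = []
t = 0.0
for ft in frame_times:
    t += ft
    finish_times.append(t)

# Walk the display's refresh ticks and check whether a freshly
# rendered frame is available at each tick.
stale_refreshes = 0
total_refreshes = 0
last_shown = -1
tick = REFRESH_INTERVAL
while tick <= finish_times[-1]:
    total_refreshes += 1
    # index of the newest frame finished before this tick
    ready = sum(1 for f in finish_times if f <= tick) - 1
    if ready == last_shown:
        stale_refreshes += 1  # nothing new: the old frame is repeated (stutter)
    last_shown = ready
    tick += REFRESH_INTERVAL

print(f"{stale_refreshes} of {total_refreshes} refresh cycles repeated an old frame")
```

Even in this short run, a large share of the refresh cycles have to reuse a stale frame, because the GPU's cadence never lines up with the panel's fixed 60 Hz tick.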
The problem is further complicated by the fact that the frame rate largely depends on the resolution and graphics settings in the game. For example, a particular configuration can guarantee 50–60 fps and a relatively comfortable game at a resolution of 1280 x 720 pixels and medium detail. But increase the resolution to Full HD (1920 x 1080 pixels) and raise the visual settings to Ultra, and performance can «collapse» to 10–20 frames per second, which in practice makes the game unplayable.
The solution to the problem
Until recently, the traditional answer to the synchronization problem between the monitor and the computer (graphics card) was V-Sync, short for «vertical synchronization». This is a crude but relatively effective method that forces the game engine to synchronize with the refresh rate of the display.
Alas, this solution has one serious drawback: it works correctly only if each frame is rendered in less than 1/60 of a second. If preparing a frame takes longer, it is simply not ready by the next refresh cycle of the display, and it has to wait for the one after that – effectively halving the frame rate from 60 to 30 fps. Unfortunately, this happens even with the fastest modern graphics cards, and the visible result is annoying lag and stutter.
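This waiting effect means V-Sync quantizes the frame rate to whole divisors of the refresh rate. A minimal sketch of the arithmetic, assuming a 60 Hz panel:

```python
import math

# Minimal sketch: with V-Sync on a 60 Hz display a finished frame must wait
# for the next refresh tick, so the effective frame rate is quantized to
# 60, 30, 20, 15 ... fps depending on how many refresh cycles a frame takes.

REFRESH_HZ = 60

def vsync_fps(frame_time_ms: float) -> float:
    """Effective fps when each frame waits for the next 60 Hz refresh tick."""
    cycle_ms = 1000 / REFRESH_HZ                      # ~16.7 ms per refresh
    cycles_needed = math.ceil(frame_time_ms / cycle_ms)
    return REFRESH_HZ / cycles_needed

for ms in (15, 17, 25, 34):
    print(f"{ms} ms/frame -> {vsync_fps(ms):.0f} fps with V-Sync")
```

Note the cliff: a frame that takes 17 ms instead of 15 ms misses the tick by a fraction of a millisecond, yet the displayed frame rate drops all the way from 60 to 30 fps.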
It is here that NVIDIA stepped in with the idea of so-called «adaptive sync», which in their case became known as G-Sync. This is the opposite of V-Sync: it forces the monitor to synchronize with the game, not vice versa. Thus, even if the hardware (the graphics card) is only able to deliver, say, 30 frames per second, this is not a particular problem, because the display will synchronize with it and run at a refresh rate of 30 Hz.
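In other words, under adaptive sync the panel's refresh rate simply tracks the frame rate, within whatever range the panel supports. A small sketch of that relationship (the 30–60 Hz range is an assumed example, not a G-Sync specification):

```python
# Minimal sketch: with adaptive sync the display refreshes the moment a frame
# is ready, so the instantaneous refresh rate simply follows the frame rate,
# clamped to the panel's supported range (30-60 Hz is an assumed example).

PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 60

def adaptive_refresh_hz(frame_time_ms: float) -> float:
    """Refresh rate the panel adopts for a frame taking frame_time_ms."""
    hz = 1000 / frame_time_ms
    return max(PANEL_MIN_HZ, min(PANEL_MAX_HZ, hz))

for ms in (16.7, 22.0, 33.3):
    print(f"{ms} ms/frame -> panel runs at {adaptive_refresh_hz(ms):.0f} Hz")
```

Unlike the V-Sync case, there is no cliff here: a 22 ms frame is shown at ~45 Hz rather than being forced down to 30 fps.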
This technology could prove to be manna from heaven for all gamers, and especially for those who play on laptops, which traditionally offer lower gaming performance than desktop PCs.
Not only the pros
On paper G-Sync sounds great and has the potential to provide a high-quality gaming experience even on slower laptops that otherwise can only dream of the coveted 60 frames per second. In practice, however, things are not so simple – especially when it comes to implementing G-Sync in mobile computers.
The problem is that adaptive sync requires an extra module in the monitor, one that dynamically adjusts the refresh rate in accordance with the frame rate. This module is costly and, worse, consumes a lot of energy, making it an impractical addition, at least in laptops, for which questions of energy consumption are particularly painful.
This is where G-Sync stood in 2013, when NVIDIA first announced the technology. The company, however, continued to actively develop the concept of adaptive synchronization, and as a result the world recently saw Mobile G-Sync – a variant of the original idea designed specifically for use in portable computers.
The main advantage of the new modification is that it needs no separate hardware synchronization module. Instead, Mobile G-Sync uses one of the most modern interfaces – embedded DisplayPort (eDP), with which most laptops of the new generation are equipped.
Mobile G-Sync is thus a software rather than a hardware method of adaptive synchronization. It is based on a complex mathematical algorithm that tries to predict, with high accuracy, how long the graphics card will need to prepare the next frame, and adjusts the refresh rate of the display accordingly.
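NVIDIA has not published the actual prediction algorithm, so the sketch below is purely illustrative: a simple stand-in predictor that averages the last few frame times and uses the result to decide when the panel's next refresh should be scheduled.

```python
from collections import deque

# Illustrative stand-in only: NVIDIA's real Mobile G-Sync predictor is not
# public. Here we predict the next frame time as a moving average of recent
# frame times and use it to schedule the panel's next refresh.

class FrameTimePredictor:
    def __init__(self, window: int = 4):
        self.history = deque(maxlen=window)  # last few frame times, in ms

    def observe(self, frame_time_ms: float) -> None:
        """Record how long the most recent frame actually took."""
        self.history.append(frame_time_ms)

    def predict_next_ms(self) -> float:
        """Predicted render time of the next frame (moving average)."""
        if not self.history:
            return 1000 / 60  # assume a 60 fps cadence before any data
        return sum(self.history) / len(self.history)

predictor = FrameTimePredictor()
for ft in (20.0, 22.0, 26.0, 24.0):
    predictor.observe(ft)
print(f"schedule next refresh in ~{predictor.predict_next_ms():.1f} ms")
```

When the prediction is wrong – say, the next frame takes much longer than the average suggests – the panel refreshes before the frame is ready and the old one is repeated, which is exactly the residual imperfection the next paragraph describes.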
Of course, achieving 100% accuracy here is impossible, but even an approximate result has a serious positive effect on the quality of the gaming experience.
The advantages of Mobile G-Sync are obvious: smoother playback even on low-end hardware, without increased energy consumption. Unfortunately, the technology also has its weaknesses. As already mentioned, absolute accuracy in frame prediction is unattainable, and it is partly for this reason that the algorithm sacrifices some precision, trading the certainty of a fixed refresh rate for smoother playback.
A more unpleasant side effect of the practical implementation of this technology is that Mobile G-Sync and NVIDIA Optimus are mutually exclusive. As you may know, the latter is a popular feature that allows the laptop to dynamically switch between the graphics core built into the CPU and the discrete (GeForce) graphics card. When working on light tasks such as, say, Internet browsing and document editing, the laptop can use the integrated video, which consumes much less energy than the discrete graphics adapter.
With Mobile G-Sync, however, the laptop's display must be connected directly to the discrete graphics card (an NVIDIA one, of course). This effectively cuts the built-in graphics core out of the chain and makes Mobile G-Sync and Optimus mutually exclusive.
According to NVIDIA, this is not a significant problem, especially for laptops with new-generation Maxwell graphics processors, which are extremely energy efficient. Nevertheless, it is an important compromise that many of the company's OEM partners will have to accept if they decide to offer Mobile G-Sync as an option in next-generation gaming notebooks.
Of course, in the absence of independent tests it is still not very clear how big this compromise will be, and to what extent using Mobile G-Sync at the expense of Optimus will affect battery life.
Another point is that even the most advanced gaming laptops do not offer particularly impressive battery life to begin with – especially compared with ultra-efficient mobile systems such as the latest generation of ultrabooks.
But given that we are talking about highly specialized portable configurations aimed at a specific audience (gamers) who prefer peak performance, such a sacrifice in battery life is unlikely to be fatal – provided that Mobile G-Sync fulfills its promise to deliver a truly significant increase in the quality of the gaming experience.