AMD unveils its own CPU recommendations for Oculus VR
http://www.extremetech.com/wp-conten...00-640x353.jpg
Earlier this week, Oculus opened pre-orders for systems and configurations that it believes will deliver an acceptable VR experience. Overall, it’s probably best if consumers hold off on pre-ordering VR equipment — but since we’ve spent most of our time discussing the GPU side of the equation, the CPU deserves some love as well.
Jason Evangelho of Forbes sat down with AMD to talk about its processor support for VR and whether the company’s FX and APU families can drive headsets like the Oculus Rift. The good news is, they absolutely can, even if the current version of the Oculus hardware tester claims otherwise.
http://www.extremetech.com/wp-conten...2/Capture3.png
Originally, AMD told Forbes that there were a variety of AMD FX processors that could handle VR, as well as some of the highest-clocked APU processors. The company has since walked back its APU claims, however, and is now saying that only the FX chips have been validated. If you have an eight-core or six-core AMD CPU with a base clock no lower than 3.9GHz, you should be good to go with VR.
The fact that AMD is still validating VR on its APUs doesn’t mean those chips can’t handle the technology, but it may be difficult for them to do so. Even AMD’s upcoming A10-7890K, with a supposed 4.1GHz base clock and 4.3GHz boost clock, isn’t all that powerful compared to Intel’s Core family. AMD CPUs can’t match the single-threaded performance of Intel chips, which is why AMD positions its multi-core offerings against Intel processors with lower core counts. According to Oculus’ recommendations, the minimum Intel chip you should use is a Core i5-4590. That’s a 3.3GHz quad-core with a 3.7GHz Turbo clock, 6MB of L3 cache, and no Hyper-Threading.
more...
Vulkan API reaches 1.0, claims broad API support, cross-OS compatibility
http://www.extremetech.com/wp-conten...-3-640x353.jpg
It’s been some 18 months since Khronos announced the next generation of OpenGL, and the final version of the spec, Vulkan, is finally ready for deployment. As of today, everything related to Vulkan — drivers, SDKs, and early software support — is ready for launch. This is a change from Khronos’ usual practice, which is to announce a new version of the OpenGL API with vendor support following at a later time.
http://www.extremetech.com/wp-conten...02/Vulkan2.jpg
Vulkan is a direct descendant of AMD’s Mantle, and it shares the same underlying philosophy as both its predecessor and DirectX 12. What sets Vulkan apart from DirectX 12 is that it supports a much wider range of operating systems, from Windows 7 and 8.1 up through Windows 10, Linux, SteamOS, Android, and Tizen. That’s a significant theoretical advantage for the new API, though Microsoft is doing everything it can to push Windows 7 and 8.1 gamers to adopt Windows 10.
Like DX12, Vulkan is designed to minimize driver overhead, scale across multiple CPUs, and offer developers greater control over how workloads are executed across the GPU. Mobile GPU companies like Qualcomm and Imagination Technologies have both pledged support for the API and intend to support it in their own development tools. In theory, this broad industry base could make it more likely that PC games will cross over to Android devices, Steam Machines, and Linux as a whole. Even Vivante has pledged to support the API, though its products are typically used in lower-end mobile hardware.
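The multi-CPU scaling mentioned above works because Vulkan lets each worker thread record its own command buffers (each thread owning its own command pool), while the main thread controls submission order. This toy Python sketch — not real Vulkan code, just the threading pattern it enables — illustrates the idea:

```python
from concurrent.futures import ThreadPoolExecutor

# Each worker records into its own command list, mirroring Vulkan's
# rule that a command pool (and buffers allocated from it) is used by
# one thread at a time. The names below are illustrative, not API calls.
def record_commands(chunk_id, draw_calls):
    return [f"draw(object_{chunk_id}_{i})" for i in range(draw_calls)]

# Split the scene into four chunks and record them in parallel.
chunks = [(0, 3), (1, 3), (2, 3), (3, 3)]
with ThreadPoolExecutor(max_workers=4) as pool:
    buffers = list(pool.map(lambda c: record_commands(*c), chunks))

# Recording parallelizes, but submission order stays explicit and
# under the main thread's control (as with a queue-submit call).
submission = [cmd for buf in buffers for cmd in buf]
print(len(submission), "commands submitted")
```

The point is that the expensive part — building the command streams — scales across cores, while the serial part shrinks to a single ordered submit.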
more...
Fallout 4 is a great game but a terrible RPG
http://www.extremetech.com/wp-conten...e1-640x353.jpg
Fallout 4 is a great game. It’s got better gunplay and action than any previous modern Fallout, a decent crafting system, streamlined skills and talents, and an evocative setting. The landscapes and environments are gorgeous, even if the character models are lacking. There’s a lot to like about this game — but it’s a terrible RPG.
What’s an RPG?
Strip away the tropes and conventions of the genre, and there are two linked characteristics common to all role-playing games (RPGs) — a (hopefully) strong story, and the opportunity to make meaningful choices that influence how the game’s narrative evolves. Some games play out the same way but allow the gamer to choose different play styles, while others allow the player to directly shape the narrative.
Because game assets and development time are both scarce resources, the best RPG developers are skilled illusionists. Quest lines, conversation threads, and plot-specific developments are often woven in ways that give players enough freedom to explore alternative narratives while minimizing the amount of overhead required to do so. In Fallout: New Vegas, you can choose to stand with Mr. House, New Vegas, the NCR, or Caesar’s Legion. What you can’t do is decide that the Mojave is really boring, and you’d really prefer to see what the Baja Peninsula is like 200 years after the bombs fell.
http://www.extremetech.com/wp-conten...wn-640x361.png
Fallout New Vegas: A study in brown
RPGs are the only popular game type whose label tells you nothing about how you play the title. Every other genre tag — FPS, RTS, turn-based, third-person shooter — describes how the player experiences the game. All of these game types are potentially compatible with the label “RPG.”
more...
HTC Vive VR: $800, shipping in April, and another risky bet for gamers
http://www.extremetech.com/wp-conten...e1-640x353.jpg
Today, just before Mobile World Congress, HTC announced both Vive VR’s price tag and some additional features. The headset will cost $800 at launch, pre-orders begin on February 29, and the device will include some novel functionality, including the ability to integrate with smartphones.
Vive’s official announcement states: “Enabling you to stay connected to the real world, without exiting the virtual world, Vive Phone Services demonstrates the ability to combine both realities without losing touch of either. By allowing you to receive and respond to both incoming and missed calls, get text messages and send quick replies and check upcoming calendar invites directly through the headset, it opens up a whole new world of possibilities for both consumers and businesses.”
HTC is talking about full SteamVR integration into the Vive platform, and the $800 price tag buys you two hand-tracking, wand-like controllers. I experimented with these in Sonoma at RTG’s Polaris architecture event, though I didn’t come away with a high opinion of them — they weren’t working all that well when I tested them, and one hand kept dropping out during gameplay. Assuming that issue got solved, the final controllers were comfortable enough and easy to hold for the short time I used them.
The Vive ships with two titles, at least for now — Job Simulator and Fantastic Contraption, neither of which I’ve played personally.
Don’t pre-order a Vive, either
Earlier this month, after Oculus unveiled its recommended hardware, I stated that I didn’t think people should pre-order the Rift, even though I believe VR has tremendous long-term potential and want to see it succeed in the gaming market.
All of the reasons I recommended people take a wait-and-see approach to the Rift apply to the Vive as well, at least as strongly. However gobsmackingly amazing VR may prove in the future, right now, today, the technology is new, and the price of entry can be considerably more than $600 to $800 depending on what kind of PC hardware you currently own. I’ve built gaming PCs for people for less money than you’ll pay for just the HTC Vive.
There’s a lot we don’t know yet about how VR solutions will compare between companies, which games will be cross-compatible on which platforms, and which will technically be cross-compatible but in reality perform vastly better on one solution than another. Oculus has gathered most of the headlines to date, but that doesn’t mean it’s automatically going to field the best solution.
The fact that many PC gamers who have been perfectly happy with 30-60 FPS at 1080p will need to upgrade to substantially more powerful hardware in order to experience VR makes it all the more important to wait and see what real-world performance looks like. Do you need a GTX 970, a 980, or a 980 Ti? Is an R9 390 enough, or should you buy a Fury X, or wait for Pascal / Polaris on 14/16nm later this year? These aren’t just hypothetical questions; they all carry price tags. It’s going to take some time post-launch to see how the hardware compares, how game support shapes up, and which solutions ultimately deserve buy-in and which don’t.
When I recommended people hold off on pre-ordering the Oculus Rift in favor of waiting for reviews and performance data, some readers commented that it was strange to see a website called ExtremeTech being so conservative with its recommendations. I’m perfectly happy to extoll the virtues of VR or other cutting-edge technology when we’re talking about them as experiences or potential game-changers. When it comes to telling you where I think you should spend your hard-earned cash, I’m a lot less willing to play fast-and-loose with my recommendations.
If you already have a tricked-out high-end gaming rig or a six-figure take-home pay, you’ve got enough cash on hand that you don’t have to worry if your VR bet doesn’t pan out. If you were one of the first Oculus backers on Kickstarter and you’ve stayed engaged with the company since it first went live, you’ve already made your ecosystem and purchasing choices.
more...
AMD clobbers Nvidia in updated Ashes of the Singularity DirectX 12 benchmark
http://www.extremetech.com/wp-conten...re-640x354.png
Nvidia reached out to us this evening to confirm that while the GTX 9xx series does support asynchronous compute, it does not currently have the feature enabled in-driver. Given that Oxide has pledged to ship the game with defaults that maximize performance, Nvidia fans should treat the asynchronous-compute-disabled benchmarks as representative at this time. We’ll revisit the Red-versus-Green comparison if Nvidia releases new drivers that substantially change performance between now and launch day.
Roughly six months ago, we covered the debut of Ashes of the Singularity, the first DirectX 12 title to launch in any form. With just a month to go before the game launches, the developer, Oxide, has released a major new build with a heavily updated benchmark that’s designed to mimic final gameplay, with updated assets, new sequences, and all of the enhancements to the Nitrous Engine Oxide has baked in since last summer.
Ashes of the Singularity is a spiritual successor to games like Total Annihilation, and the first DirectX 12 title to showcase AMD and Nvidia GPUs working side-by-side in a multi-GPU configuration.
The new build of the game released to press now allows for multi-GPU configuration testing, but time constraints limited us to evaluating general performance on single-GPU configurations. With Ashes launching in just under a month, the data we see today should be fairly representative of final gameplay.
AMD, Nvidia, and asynchronous compute
Ashes of the Singularity isn’t just the first DirectX 12 game — it’s also the first PC title to make extensive use of asynchronous computing. Support for this capability is a major difference between AMD and Nvidia hardware, and it has a significant impact on game performance.
A GPU that supports asynchronous compute can maintain multiple command queues and execute them simultaneously, rather than switching between graphics and compute workloads. AMD supports this functionality via its Asynchronous Compute Engines (ACEs) and, on Fiji, its HWS blocks.
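The win from executing queues simultaneously rather than back-to-back can be illustrated with a toy CPU-side simulation — this is emphatically not GPU code, just the scheduling arithmetic, with the 50 ms workloads being made-up numbers:

```python
import threading
import time

def graphics_pass():
    time.sleep(0.05)  # stand-in for 50 ms of rendering work

def compute_pass():
    time.sleep(0.05)  # stand-in for 50 ms of compute work (e.g., post-processing)

# Serialized: the GPU switches between queues, so the times add up.
start = time.perf_counter()
graphics_pass()
compute_pass()
serial = time.perf_counter() - start

# Asynchronous: both queues drain at once, so the times overlap.
start = time.perf_counter()
worker = threading.Thread(target=compute_pass)
worker.start()
graphics_pass()
worker.join()
overlapped = time.perf_counter() - start

print(f"serialized: {serial*1000:.0f} ms, overlapped: {overlapped*1000:.0f} ms")
```

On real hardware the overlap is never this perfect — the two workloads contend for shader resources and bandwidth — but the principle is the same: idle execution units during a graphics pass can be filled with compute work instead of waiting their turn.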
https://www.extremetech.com/wp-conte...WS-640x360.jpg
Fiji’s architecture. The HWS blocks are visible at the top.
Asynchronous computing is, in a very real sense, GCN’s secret weapon. While every GCN-class GPU since the original HD 7970 can use it, AMD quadrupled the number of ACEs per GPU when it built Hawaii, then modified the design again with Fiji. Where the R9 290 and 290X use eight ACEs, Fiji has four ACEs and two HWS units. Each HWS can perform the work of two ACEs, and they appear to be capable of additional (but as-yet unknown) work as well.
more...
Nintendo NX console coming this Christmas with new Zelda game: report
http://www.extremetech.com/wp-conten...a1-640x360.jpg
It’s shaping up to be a major year for Nintendo if the rumor mill is even partially accurate. The company has already announced that it will showcase its new NX console at E3 this summer, as our sister site IGN reported, but fresh rumors suggest we’ll see the NX launch by Christmas of this year. That’s less lead time than Sony and Microsoft gave themselves with the PS4 and Xbox One, which were announced in February and May 2013, respectively.
Exact launch dates are still being nailed down, but the console will reportedly debut alongside the new Zelda title. Nintendo has previously promised that its next Legend of Zelda title would debut on the Wii U, and it plans to keep that promise by offering the game as both an NX launch title and a Wii U game. This would seem to suggest that the game will be scaled up to match the NX’s hardware capabilities rather than designed for the NX and scaled down to fit the Wii U’s less powerful hardware.
This is supposedly a major rethink of the classic Legend of Zelda concepts and gameplay, with a new, Elder Scrolls — Zelder Scrolls? — style of open-world exploration.
https://www.extremetech.com/wp-conte...ls-640x353.jpg
Supposedly the new Zelda will receive a mammoth marketing push, with a $10 million earmark out of Nintendo’s $34.5 million budget for the Wii U. Nintendo will spend an estimated $56 million on marketing the 3DS, and may be planning a further price cut for the platform this fall. That would fit speculation that the NX platform is meant to at least partly replace the 3DS as Nintendo’s handheld of choice, but even if that rumor proves true, it raises others. If the NX is both console and handheld, will users be able to play 3DS games they’ve already purchased?
The Nintendo NX
We’ve previously rounded up the Nintendo NX’s capabilities, controllers, and positioning, so if you’re looking for background information, that’s where to start. Our understanding of the hardware is still limited. Nintendo has said that the NX is a clean break from its past, which heavily implies that the company will finally move away from the 20-year-old CPU architecture it’s been using since Clinton was president. As capable as the old 750CXe was in its day, Nintendo has made only modest modifications to the CPU core since it debuted.
https://www.extremetech.com/wp-conte...01-640x358.jpg
Nintendo’s patent drawings, showing a hypothetical controller
A 2016 debut would line up with AMD’s claims that it secured another semi-custom design win with revenue expected later this year, since Nintendo would begin ramping up production several months before shipping hardware. If AMD built the SoC, there’s a very good chance it handled the graphics as well, but there’s still much we don’t know. Of course, it’s also possible that AMD’s semi-custom win is merely coincidentally timed. Either way, Nintendo has historically done a good bit of customization work on its SoC designs, and we expect it would do so here as well.
Whatever ideas Nintendo is fielding with the NX, it needs to do a better job marketing the final product than it managed with the Wii U. The GamePad was an interesting idea, but it never matured into a must-have peripheral, and lifetime Wii U sales have been far below the Wii’s. If Nintendo pushed for an aggressive update, it could easily field a console to match the PS4 and Xbox One, but the company has previously preferred to maximize hardware profitability, with consoles that, while capable, lagged behind their peers in terms of sheer horsepower. The Wii U is built on 40nm technology, for example, even though 28nm was available when the hardware was designed. Unlike Sony and Microsoft, Nintendo typically does fewer die shrinks and other revisions, but that could change if the firm decides to go head-to-head with the dominant players in the console industry.
more...
The Windows Store saddles PC games with significant limitations
http://www.extremetech.com/wp-conten...ak-640x354.jpg
Microsoft really wants PC and Xbox One gamers to have a common platform that links their purchases and allows for both cross-buying and possibly cross-play. At first, this seemed a win-win for everyone. Now, however, the restrictions placed on Windows 10 games sold through the Windows Store seem like they might kill the entire concept.
We touched on this recently, but it’s worth revisiting the topic in depth, since it affects games like Quantum Break, Rise of the Tomb Raider, and the upcoming Gears of War Ultimate Edition. Specifically, games purchased through the Windows Store:
- Only run on Windows 10
- Cannot be managed by Steam
- No SLI / Crossfire support
- No refund policy
- Use borderless full screen with a refresh rate lock synchronized to your display rate (the lock has been reported as 60Hz, because that’s the maximum refresh rate on many displays)
- Protected game files can hinder modding (this will vary depending on the structure of the title)
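The refresh-rate lock in that list has real performance consequences: with vsync-style presentation, a frame that misses the display’s refresh deadline waits for the next one, quantizing the effective frame rate. A toy calculation — assuming a hard 60Hz lock, which is a simplification of how Windows Store presentation actually behaves:

```python
import math

REFRESH_HZ = 60
INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh slot

def effective_fps(render_ms):
    """Frame rate after rounding each frame up to the next vsync slot."""
    slots = math.ceil((render_ms / 1000.0) / INTERVAL)
    return REFRESH_HZ / slots

# A frame that takes even slightly longer than 16.7 ms drops a full slot.
for ms in (10, 17, 25, 40):
    print(f"{ms} ms/frame -> {effective_fps(ms):.0f} FPS")
```

This is why a locked presentation mode stings: a GPU rendering at 55 FPS gets displayed at 30, and gamers lose the option of trading tearing for smoothness.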
In addition, there are mouse and key binding issues, as well as the fact that you can’t override in-game settings with Nvidia’s Control Panel or AMD’s Catalyst Control Center, according to Ars Technica.
Having said that, a few restrictions being reported as Windows Store-specific may not actually be caused by the store. The reason benchmark monitoring and reporting tools are having trouble with these applications is the same reason FCAT can’t monitor Ashes of the Singularity on AMD hardware — the software and application support for DX12 monitoring isn’t available yet. Support for these modes will hopefully come in time.
Avoid the Windows Store
Reading over the list of restrictions, I can’t honestly come up with a reason why a PC gamer should ever buy a title through the Windows Store, unless it’s a mobile game or something you already own on an Xbox One. Giving up multi-GPU support, tunable options, cross-OS compatibility, and Steam’s refund policy gets you… well, practically speaking, it gets you nothing. If you own an Xbox One, cross-buying may be a potent incentive. But for everyone else, the Steam / GoG / whoever deal is going to be better.
https://www.extremetech.com/wp-conte...er-640x360.jpg
This speaks to an ongoing problem with Microsoft and the Windows Store in general. Universal Windows Apps might be awesome in theory, but in practice they’re far more trouble than they’re worth. If Microsoft sold its games for less money in the Windows Store, it might be worth the limited feature set, but the company’s entire plan appears to be “Sell them less for the same price tag.” Gamers who are used to how Steam works may not be pleased with the way the Windows Store locks everything down. Microsoft may want to unify the Windows Store and its Xbox One gaming empire with the PC space, but it risks creating a two-tier system in which Steam buyers have dramatically better features and compatibility than buyers of Microsoft’s own products on Microsoft’s own platform.
These kinds of limits and lockdowns may make sense on mobile devices, but they’re not going to help Microsoft win converts for the PC Windows Store. As things stand, it should be considered the method of last resort for buying a game, unless you specifically want the Xbox One cross-buy feature. Once Microsoft revises its policies and requirements, that could change.
more...
The new Gears of War Ultimate Edition is a DX12 disaster
http://www.extremetech.com/wp-conten...W1-640x353.jpg
We covered Ashes of the Singularity and how the game’s DirectX 12 performance has evolved between AMD and Nvidia. This week, Microsoft has launched the PC version of Gears of War Ultimate Edition, but the characteristics of the two titles couldn’t be more different. The new Gears of War is catastrophically broken on Radeon cards.
Jason Evangelho of Forbes has details on the exceedingly strange performance results, of which there are many. The Radeon Fury X is incapable of holding a steady frame rate, with multiple multi-second pauses throughout the benchmark run. The same problem struck the 4GB R9 380 and the 4GB R9 Nano as well. Meanwhile, an R7 370 — that’s a midrange card based on a four-year-old graphics architecture, which also ships with 4GB of RAM — runs just fine.
Here’s the R9 Nano running in 4K at High Quality.
The Forbes tests show two trends. First, GCN 1.0 cards perform smoothly, while GCN 1.1 and 1.2 cards stutter and struggle. Second, AMD GPUs with >4GB of RAM show marked improvement. I spoke to Jason about his results; he indicated this is not the case on Nvidia hardware, where 4GB of RAM gives the GTX 980 all the headroom it needs at settings and resolutions that cripple AMD. Last year, we surveyed 15 titles to determine whether gamers needed more than 4GB of VRAM to play in 4K and determined they did not. The fact that AMD is hammered at 1440p and High detail suggests that memory management in Gears of War Ultimate Edition is fundamentally broken as far as AMD GPUs are concerned.
One of the historical differences between AMD and Nvidia has been their Day 1 driver support. AMD has put a great deal of work into closing that gap in recent years, but Nvidia is still widely perceived to have an edge when it comes to launch-day optimizations.
https://www.extremetech.com/wp-conte...PC-640x361.jpg
The game’s visuals are badly corrupted on AMD cards above 1080p and at higher detail levels.
In this case, however, the problems go far beyond performance profiling. The game isn’t slower on AMD — it’s unplayable on many AMD GPUs. Hawaii / GCN 1.1 is now more than two years old, Tonga is 18 months old, and Fiji has been in-market for nine months. None of these are new products.
Developer or driver?
There are several reasons to suspect this is a developer issue rather than a driver problem. First, there’s the fact that DirectX 12 is designed to give developers far more power over how a game is rendered. This can be a double-edged sword. DX12 allows for better resource allocation, multi-threaded command buffers, asynchronous compute, and better performance tuning — but it also makes it harder for the IHV (that’s AMD or Nvidia) to optimize in-driver. There are optimizations that AMD and Nvidia could perform under DX11 that can’t be done in DX12.
more...
Microsoft’s Xbox chief wants to build a fully upgradeable Xbox One
http://www.extremetech.com/wp-conten...r1-640x353.jpg
The gap between computers and consoles has been shrinking for decades — and now, Xbox head Phil Spencer wants to eliminate it altogether. Microsoft is already taking steps to unify its Xbox One and Windows 10 experience, through features like game streaming across networks and cross-buy capability. Upgrading hardware, however, is something else entirely.
Nonetheless, upgraded hardware appears to be what Spencer meant. “We see on other platforms whether it be mobile or PC that you get a continuous innovation that you rarely see on console,” Polygon reports Spencer as saying. “Consoles lock the hardware and the software platforms together at the beginning of the generation. Then you ride the generation out for seven or so years, while other ecosystems are getting better, faster, stronger. And then you wait for the next big step function.”
https://www.extremetech.com/wp-conte...Mk-640x390.png
This graph shows the step function Spencer is referring to. PC performance increases over time at a fairly steady rate, while consoles have long periods of static performance followed by a jump.
“When you look at the console space, I believe we will see more hardware innovation in the console space than we’ve ever seen,” Spencer said. “You’ll actually see us come out with new hardware capability during a generation allowing the same games to run backward and forward compatible because we have a Universal Windows Application running on top of the Universal Windows Platform that allows us to focus more and more on hardware innovation without invalidating the games that run on that platform.”
A new console paradigm?
Before we hit the software side of things, let’s talk about long-term trends in console development. The truth is, the gap between consoles and PCs has been shrinking, bit by bit, ever since the NES launched in the mid-1980s. In the 1990s, consoles adopted CD-ROM and DVD-ROM technology, even if they used customized discs and encoding schemes. Microsoft’s Xbox was the first mainstream console to use an Intel CPU and Nvidia GPU; both the Xbox 360 and PlayStation 3 made integrated storage standard, even if Microsoft did technically sell a disc-only option. The PlayStation 3 could run Linux, until Sony patched it out.
Today, the Xbox One and PlayStation 4 are PCs in everything but name. They rely on commodity x86 hardware and consumer graphics cards. They’re built around low-level APIs that have much in common with their PC brethren like Vulkan, DX12, and AMD’s Mantle. This is slightly more obvious with the Xbox One, which literally runs a version of Windows, but there’s nothing at the hardware level that would prevent the PS4 from doing so as well.
more...