-
For want of a nail: How hardware hijinks fracked my AMD Radeon R9 Fury testing
http://www.extremetech.com/wp-conten...ry-640x353.jpg
Hardware reviewing is often a hectic process. Whenever you’re dealing with pre-release drivers and early hardware, there’s a greater-than-average chance that something might go wrong. Sometimes, that’s the fault of the company providing the hardware. Sometimes, it’s because the reviewer screwed something up. And sometimes… sometimes it’s because of mind-boggling hardware issues that you didn’t know existed in the first place.
My problems began with the death of my test mouse. At first, this didn’t seem like much of a problem. I typically test hardware with the rig configured next to my workspace for easy access, which makes shifting my work mouse from one system to another relatively easy. In addition, there are plenty of applications that provide quick administrative access to a testbed. Between these various options, I was confident I had things covered, especially since I’ve been using both RDP and TightVNC for years. I turned down my significant other’s offer to use her mouse and opted to manage my various login sessions with TightVNC for mousing and a physically connected keyboard.
I fired up my testbed, installed the new Catalyst 15.7 drivers, launched BioShock Infinite… and the application promptly crashed. This was odd, but it wouldn’t be the first time that an odd bit of driver didn’t get properly uninstalled. I uninstalled AMD’s driver suite, double-checked to make certain there were no odd bits of Nvidia software hanging around the testbed, ran Display Driver Uninstaller, rebooted into Safe Mode, ran the uninstall sequences for both AMD and Nvidia hardware (just to be certain), rebooted again, reinstalled the Catalyst 15.7 driver, ran BioShock Infinite again…
And the game crashed. The 2K intro video played perfectly, but died immediately thereafter.
http://www.extremetech.com/wp-conten...te-640x321.jpg
I’m not sure what this person was trying to say, but they perfectly captured my state of mind.
The problem wasn’t confined to BioShock Infinite. Company of Heroes launched and benchmarked perfectly, but Metro Last Light and Total War: Rome 2 wouldn’t launch at all. At this point, I was convinced I was seeing a driver problem or Steam error — but Steam’s “Check Game Files” option found no errors. Swapping to the Radeon Fury X or a GTX 980 Ti (uninstalling and reinstalling drivers each time) didn’t work, either. Company of Heroes 2 always worked, but Metro Last Light (Safe Mode or regular), BioShock Infinite, and Total War: Rome 2 never did, whether I was using AMD or Nvidia hardware.
I downloaded and reinstalled all three applications from scratch, but it did no good. None of the crash dumps in BioShock Infinite contained any useful information, just a generic “The thread attempted to read from or write to a virtual address for which it does not have the appropriate access” error. Not helpful. Running “sfc /scannow” on the Windows 8.1 installation fixed nothing, but I clearly had a problem somewhere deep in the OS.
http://www.extremetech.com/wp-conten...F2-640x400.jpg
My preferred “drastic solution” at approximately 11 PM on Thursday night.
more...
-
Oculus founder confirms VR is shaping up into an unavoidably fragmented mess
http://www.extremetech.com/wp-conten...er-640x353.png
Ever since Oculus launched its Kickstarter, hopeful fans of VR technology have looked to the company to provide a cutting-edge gaming solution. Since then, a number of companies have announced their own VR efforts: the Vive (Valve and HTC), the Gear VR (Samsung and Oculus), Razer’s OSVR, the new StarVR, and Sony’s Project Morpheus. There’s currently a race on to see who can provide the best hardware for virtual reality gaming. Unfortunately, as a new reddit thread illustrates, that future may be considerably more fragmented than users would prefer.
Oculus founder Palmer Luckey took to the PCMasterRace subreddit earlier this week to answer questions about Oculus’ decision to fund the creation of roughly 24 games that highlight the capabilities of VR technology. Oculus is building more than just a headset — it’s planning to build an entire platform, including motion controllers. Users took to the subreddit to ask how much support Oculus would be offering for other headsets or devices, and what consumers can expect in terms of hardware support.
http://www.extremetech.com/wp-conten...nc-640x312.jpg
AMD’s asynchronous shaders are key to its LiquidVR
Luckey notes that Oculus is not a closed platform, meaning the company does not require game developers to submit requests for approval to build titles for the Rift and does not collect a fee for adding Oculus support for games. At the same time, however, the company is investing in its own ecosystem, and it’s not prioritizing projects for other headsets. When asked if it would be easy to add such support, Luckey implied that it wasn’t, saying: “Extending VR support to multiple headsets is not as simple as a patch, it requires pretty deep integration into the code of the game, integration that the developers themselves have to spend a lot of time integrating and updating. This is especially true for games that rely on our SDK features like timewarp, direct mode, late latching, and layered compositor to get a good experience. We can’t possibly make any promises about support through external patches, and we won’t commit to supporting people who want to use our store to buy games for headsets that our store and software don’t currently support.”
more...
-
Batman: Arkham Knight for the PC now delayed until at least September
http://www.extremetech.com/wp-conten...ht-640x353.jpg
Ever since Batman: Arkham Knight vanished from store shelves amid controversy around the terrible condition of the PC port, there have been questions about when the game might return. A new leaked email to EB Games suggests that the Caped Crusader won’t be back on PCs to conclude his adventures for quite some time. “As previously advised,” reads an internal EB Games email, “we have stopped sales of Batman: Arkham Knight PC while Warner and Rocksteady work on addressing performance issues with the game. The latest information from Warner is that the updates won’t be available until Spring. Due to this we have made the difficult decision to recall all PC stock from stores to return to the vendor until an acceptable solution is released.”
“Spring,” in this case, means Australian spring, which begins in September. This jibes with other reports we heard earlier this summer suggesting it would take months to bang the PC version into shape. The good news is that it suggests Rocksteady is committed to fixing the game in the long term, as opposed to either cutting its losses and running or simply allowing the Steam refund process to take care of unhappy users.
Rocksteady has released one fix for Arkham Knight that addressed certain issues, but no major patches as of yet. The company has stated that DLC production for the PC version is on hold until it finishes patching the main game, which implies players who haven’t had problems will have to wait longer to see the full content console players are already enjoying. Then again, early reviews of the first Batgirl DLC suggest it’s a rote, by-the-numbers adventure that doesn’t do much to tap the unique backstory or capabilities of one of Batman’s oldest allies.
http://www.extremetech.com/wp-conten...m8-640x360.jpg
Previous games, like Arkham City, offered the chance to play as Catwoman.
It’s not the first time that DLC or add-on characters in Arkham titles have been accused of missing the boat. Previous games have offered the chance to step into the shoes of other characters, like Harley Quinn or Catwoman, but have almost always treated these characters as cookie-cutter versions of Batman with access to the same abilities, acrobatic fighting style, and gadgets. The patina of variation is just that — a paper-thin layer of differentiation that offers a few small deviations from Batman’s classic fighting style, but rarely goes far enough to feel truly different.
more...
-
Is AMD designing Nintendo’s next-generation NX console?
http://www.extremetech.com/wp-conten...on-640x353.png
Yesterday, on AMD’s conference call, CEO Lisa Su announced that the company had recently added a new embedded design win to its portfolio, though without a firm date on when the company might recognize revenue from the win. One potential candidate for a hypothetical new device is Nintendo, which announced earlier this year that it would launch a new hybrid mobile device in 2016, codenamed the NX.
Dean Takahashi has laid out the reasons why he thinks Nintendo has contracted with AMD to build its next-gen console chip. There are multiple reasons to think this is a plausible match. AMD provides designs for every current-generation console on the market, including the existing Wii U’s GPU. The difference between Nintendo and Sony/Microsoft, however, is that Nintendo appears to have licensed an AMD GPU design that’s built by a third-party, Renesas, and the GPU they licensed — by all accounts, a derivative of AMD’s HD 4000 family — was already quite dated even when the Wii U was new.
According to Nintendo, the NX will be a “hybrid” between mobile and traditional living room gaming. This is broad enough to mean almost anything — a tablet like Nvidia’s Shield could conceivably be classified as a “hybrid” if connected to a television, since the machine supports video-out and wireless controllers, while an ultra-portable living room system could conceivably be declared “mobile” as far as picking it up and walking away with it.
One thing we can predict, however, is that Nintendo’s next-generation console will probably focus more on affordability and unique features as opposed to raw performance. Satoru Iwata’s recent passing could change that, if the company’s new president and CEO has a different vision for the future, but Nintendo has a decade-long history of preferring alternative control schemes and innovative technology over raw horsepower. Sony and Microsoft have historically leapt for new process nodes and die shrinks as quickly as they were available, while Nintendo has updated its consoles at a far more leisurely pace.
more...
-
Windows 10 beta users can now stream Xbox One games locally
http://www.extremetech.com/wp-conten...ne-640x353.jpg
While Windows 10 isn’t officially available until the end of July, early adopters have been running the Insider Preview for months now. And as of a few days ago, those of us running the latest build can take advantage of the Xbox One game streaming functionality in the Windows 10 Xbox app.
In a post on the Xbox Wire, Larry “Major Nelson” Hryb walks us through the process of enabling this shiny new feature. If you want to give it a go for yourself, start by heading to the Settings menu on the Xbox One. Under Preferences, you should be able to toggle on a setting titled Allow game streaming to other devices.
http://www.extremetech.com/wp-conten...pp-640x353.png
Next, make sure you’re running the latest Windows 10 build and the newest version of the Xbox application. Launch the app, navigate to Connect, and select Add a device. Provided you’re on the same network, you should be able to select your Xbox One from this menu. Plug in a controller, go to the Home tab, and then select your console under the Game Streaming section.
http://www.extremetech.com/wp-conten...y-640x363.jpeg
Of course, Microsoft isn’t the first out of the gate for local game streaming. Sony allows you to stream PS4 games to the Vita and PlayStation TV over Remote Play, Valve offers in-home streaming in the Steam client, and even Nintendo offers off-TV play for many titles on the Wii U. It’s nice to see Microsoft finally taking advantage of the massive Windows market, but why did it take so long? I called out the Xbox One’s lack of game streaming over a year ago, and Microsoft is just now rolling out that functionality.
more...
-
AMD’s GPU performance under Linux can be boosted by renaming certain executables
http://www.extremetech.com/wp-conten...rd-348x196.jpg
For most of the past decade, the idea of gaming under Linux was a contradiction in terms. Apart from a handful of dedicated titles or ports, the only option gamers had was to either dual-boot into Windows or deal with the Wine compatibility layer. Valve’s decision to pursue the Linux gaming market and develop its own Linux-based operating system has changed that, with a vast array of indie titles (and a handful of AAA games) now available on the OS. Unfortunately, it looks as though AMD’s driver team hasn’t quite caught up with the times.
Phoronix has published an article demonstrating how performance in certain games can be significantly improved by renaming executables. Both AMD and Nvidia use application profiles to tell the GPU how to render certain titles as efficiently as possible. Nvidia users have always enjoyed more freedom to tweak the low-level settings within profiles than their AMD counterparts. What Phoronix found is that renaming the csgo_linux binary to hl2_linux dramatically boosted the frame rate on a range of Radeon cards.
http://www.extremetech.com/wp-conten...ix-640x498.png
What’s happening here is simple: AMD has created profiles for other Source-based games and packages those profiles as part of its binary blob driver download. Mapping Counter-Strike: Global Offensive to a Half-Life 2 profile dramatically improves the game’s performance because HL2 and CS:GO are built on the same Source engine.
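For the curious, the experiment itself is trivial to reproduce. Here’s a minimal sketch, assuming Steam’s default Linux library path (an assumption; adjust for your install) and keeping a backup copy so the change is easy to reverse:

```python
# A sketch of the rename experiment Phoronix describes. The Steam library
# path and binary names are assumptions; adjust them for your own install.
import shutil
from pathlib import Path

game_dir = Path.home() / ".steam/steam/steamapps/common/Counter-Strike Global Offensive"
original = game_dir / "csgo_linux"
backup = game_dir / "csgo_linux.bak"

shutil.copy2(original, backup)            # preserve the original binary
original.rename(game_dir / "hl2_linux")   # the driver now matches the HL2 profile
```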
Whether or not this technique would work on other games is unknown. In theory, it could be used to apply fixes to titles if a game with a saved profile is built on the same engine as another game without a profile (Unreal Engine 3, for example). In practice, however, this will be a hit-and-miss technique. Two games can use the same base engine, but contain a number of custom libraries or performance-enhancing techniques added by the developer for their own specific title.
Attempting to run one game under the profile settings for another could reduce performance or cause visual bugs depending on the game in question.
Phoronix notes that CS:GO is still exhibiting this behavior despite having been out for nearly a year, but hopefully AMD will release Catalyst Application Profile (CAP) updates for Linux in the near future. Penguinistas may currently account for a small fraction of all gamers, but if Valve’s SteamOS sees significant uptake, that could change in the not-too-distant future. Phoronix also examines performance in several other modern titles, so hit the link above if you want to see additional details.
more...
-
Star Wars: KOTOR II gets first patch in a decade, adds Mac, Linux support
http://www.extremetech.com/wp-conten...r2-640x353.jpg
Star Wars: Knights of the Old Republic is considered one of the best Star Wars games ever built. Its sequel, Knights of the Old Republic II, on the other hand, suffered from a rushed release, cut content, and a crippling number of bugs. Obsidian released several patches for the game, but stopped development in 2005. Four years later, a team of modders released The Sith Lords Restored Content Mod, which added a number of missing areas and solved old bugs. Now, without any fanfare, Obsidian has developed its first patch for the game in a decade and released it on Steam.
The new patch contains the following features (quoted from Steam):
• 37 achievements to be earned through gameplay
• Steam Cloud saves
• Native widescreen resolution support
• Resolution support up to 4K and 5K
• Support for controllers, including Xbox 360, Xbox One, Playstation 3, and Playstation 4, along with several others (check the system requirements for details)
• Steam Workshop support! We proudly worked with the Restored Content Mod Team to have their famous TSLRCM up on launch day
This is a huge update for a once-popular game and, combined with immediate access to the Sith Lords Restored Content Mod, means that playing KOTOR II today can be a much richer experience than it was back in 2005. The mod adds the M4-78 “Droid Planet” missions and a number of other changes, major and minor. The goal of the TSLRCM was to restore cut content and reintegrate scenes, dialog, and quests that had previously been slashed. Right now, most of the content available in the Steam Workshop is tied to this mod, though other projects may soon move over.
http://www.extremetech.com/wp-conten...et-640x274.jpg
The restored Droid planet.
The game is currently on sale for $7.49, and this patch likely clears up a number of issues with the game’s resolution support. I recently replayed KOTOR and had significant problems with cut-scenes, to the point that I had to disable them altogether. Resolution settings and mouse issues were a common problem, and while KOTOR I is two years older than KOTOR II, the games used the same engine and likely had some of the same weaknesses.
If you never played KOTOR II, I’d highly recommend it, particularly with the TSLRCM available. The game’s planned storyline was as good as KOTOR’s, if not better — Obsidian was forced to build the game in a tiny window and simply wasn’t able to complete its vision of what the game should’ve been. With those assets restored, KOTOR II is generally thought to rank as high as the original, particularly if you enjoy a game that dabbles more in shades of gray than its predecessor. Just remember that this is still fundamentally a title from 2005 running on a vintage engine — even with content mods and texture packs, it’s never going to look like a modern game.
In unrelated-but-awesome Star Wars news, a fan project has created a Star Wars VR demo using Unreal Engine 4 and Oculus Rift support. The full trailer looks amazing.
more...
-
Microsoft will add keyboard, mouse support to Xbox One for gaming
http://www.extremetech.com/wp-conten...20-640x353.jpg
Microsoft has been working to improve the Xbox One’s value proposition by offering features like backwards compatibility and the ability to stream games from the Xbox One to Windows 10 devices. Now the company is planning to add keyboard and mouse support to the Xbox One. With these features, one of the last significant differentiators between the Xbox One and a standard PC will vanish.
As bit-tech.net details, Phil Spencer responded to a question about whether or not users would be able to stream from a Windows machine to an Xbox One with the following:
http://www.extremetech.com/wp-conten...et-640x371.png
To be clear, you can already use a USB mouse and keyboard with an Xbox One, but not for gaming. Windows 10 streaming, for now, is also a one-way affair. If Microsoft brings this feature to Xbox One, it opens the door to a great many options, including the ability to play online against PC users without worrying that PC gamers will have an intrinsic advantage thanks to superior controls and pixel-perfect mousing. While I realize there are gamers who don’t like playing with a mouse and keyboard, there’s no contest as to which method of input offers greater precision. Unlike controllers, a mouse is capable of pinpoint accuracy, while keyboards offer a much greater range of commands and inputs.
more...
-
Forget Doom — now modders are moving Half-Life to smartwatches
http://www.extremetech.com/wp-conten...HL-640x353.jpg
For years, the ability to run Doom on various devices has been a hallmark of a system’s flexibility. We’ve seen the FPS pop up on everything from calculators to copiers, but with embedded devices becoming ever more powerful, the ability to run code more than two decades old has become old hat. It’s time for a new standard (though the ability to run Doom within Doom is a fairly nifty recent achievement). Now, modder Dave Bennett has beaten the Doom-level achievement by getting Valve’s original Half-Life up and running on an LG G Watch.
The LG G Watch runs on a Qualcomm MSM8226 with an Adreno 305 GPU. The original Half-Life isn’t multi-threaded in any fashion, so the game is effectively running on a single Cortex-A7 core and a DX9-class GPU. Then again, that core is clocked far higher than the CPUs of the game’s era, and the Adreno 305’s feature set exceeds anything Half-Life supported at release. As you might expect, this is entirely a proof of concept — it’s almost impossible to actually play Half-Life on a 1.65-inch screen, even if the game is running in a touch-responsive wrapper.
http://www.extremetech.com/wp-conten...-1-640x353.png
This hack is made possible by the SDLash application, which emulates the GoldSource engine. GoldSource is the original Half-Life engine and dates back to the original Quake (albeit in a heavily modified form). The usefulness of such a hack is questionable, since few people are going to try to play a game on a smartwatch; the heat and power requirements would likely drain the battery and burn your wrist at the same time. The frame rate also chugs in places, from a high of 46 FPS to as low as 2 FPS depending on what’s going on in-game. There are multiple potential causes for the instability, from emulator issues to throttling of the smartwatch itself, to underpowered hardware.
Then again, the point is that this is frickin’ Half-Life running on a device that fits on your wrist. Give wearables another few tech generations, and the devices that debut on 14nm or 10nm technology in the future could probably run the game, no problem. Then again, this assumes you’d ever actually want to — controlling Gordon Freeman on a 1.65-inch screen doesn’t sound like actual fun.
Still, as a gamer who was thrilled by the original Half-Life on my K6-233 and 8MB Voodoo 2, seeing it running on a device you wear is pretty incredible, even if it makes me feel about 80 years old.
more...
-
A security flaw in Steam let anyone change your password
http://www.extremetech.com/wp-conten...rt-640x353.png
Did you experience any unexpected activity on your Steam account last week? Well, it seems that there was a major security flaw in Valve’s password reset feature that allowed anyone to reset your password — even without access to your email. The accounts of numerous popular streamers were compromised for a short period, and Valve is left looking incredibly foolish.
If you forget your Steam password, Valve normally sends you a one-time-use code over email that you can use to reset your password. However, it was discovered last week that Steam wasn’t actually checking to verify that your code was valid. If you simply refrained from entering anything during the authentication step, the client would still allow you to reset the password.
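To make the failure mode concrete, here’s a minimal sketch of the kind of validation bug that was reportedly at fault. The function and field names are hypothetical, since Valve hasn’t published the affected code:

```python
# A sketch of the reported logic error: skipping the code check when the
# submission is empty. All names here are hypothetical, not Valve's code.

def reset_password_broken(account: dict, entered_code: str, new_password: str):
    # Bug: an empty submission skips verification entirely, so anyone who
    # enters nothing at the authentication step sails straight through.
    if entered_code and entered_code != account["reset_code"]:
        raise PermissionError("invalid reset code")
    account["password"] = new_password

def reset_password_fixed(account: dict, entered_code: str, new_password: str):
    # Fix: always require a non-empty code that matches the one issued.
    if not entered_code or entered_code != account["reset_code"]:
        raise PermissionError("invalid reset code")
    account["password"] = new_password
```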
more...
-
Sony has sold more PS4s than the Xbox One and Wii U combined: Report
http://www.extremetech.com/wp-conten...4-640x353.jpeg
From the get-go, Sony’s PlayStation 4 has been selling well. It’s only been about a year and a half since it launched in November of 2013, but Sony announced today that the PS4 has sold over 25 million units worldwide. We still don’t know the exact number of Xbox Ones sold to consumers, but if the estimates are correct, the PS4 has now sold more units than the Xbox One and Wii U combined.
In the second quarter of calendar 2015 alone, Sony sold about three million PS4s worldwide. In that same time frame, Microsoft shipped 1.4 million Xbox consoles (both 360s and Ones). Unfortunately, Redmond doesn’t disclose the breakdown of those sales. However, the folks over at Ars Technica have assigned a ballpark estimate of 0.95 million Xbox Ones sold in Q2. And Nintendo? The Wii U sold a paltry 0.47 million units.
http://www.extremetech.com/wp-conten...rt-640x240.png
But what about the big picture? How many consoles have been sold to date? With over 25 million PS4s, over 10 million Wii Us, and an estimated 14.32 million Xbox Ones sold by the end of June 2015, the PS4 holds over 50% of the current-gen console market. At this point in the cycle, it’s safe to say that the PS4 has unambiguously won the battle for sales and consumer support.
While the PS4 has the most horsepower of the existing consoles, that’s not nearly enough to explain the massive sales gap. Microsoft made a number of egregious mistakes leading up to the Xbox One launch, and wasted a lot of the good will it created during the Xbox 360’s lifespan. Redmond has since made huge improvements to its console strategy, but it still hasn’t quite been enough to win back a significant portion of the audience it took for granted.
more...
-
Windows 10 coming to Xbox One in November with new UI
http://www.extremetech.com/wp-conten...10-640x353.jpg
Microsoft announced at Gamescom today that the Xbox One’s Windows 10 rollout will take place in November. We’ve known that the upgrade was coming for a while, alongside DirectX 12 support, but Microsoft had previously declined to put a date on it. The Windows 10 update will debut alongside a new UI that apparently diverges from the Windows 8-style layouts Microsoft has used since the Xbox One launched in 2013. Other improvements include a new OneGuide design, a Store makeover, and support for Universal apps created with Windows 10.
If you like Cortana, you’ll also have the option to enable her on the Xbox One, and the new universal platform really could go a long way toward transforming what the console is capable of. Cortana will also allow you to talk to the console if you have a Kinect, though it’s not clear how many Xbox One owners actually do. Given the way Microsoft’s sales spiked after it killed the mandatory Kinect bundle and began shipping other versions of the console, it’s likely that fewer than 50% of buyers bothered to shell out cash for a glorified paperweight. Finally, of course, there’s the new Chatpad, Xbox 360 backward compatibility (hopefully in much better shape than it was a few months ago, when Eurogamer tested it), and full DVR support coming to the platform.
http://www.extremetech.com/wp-conten...AL-640x360.png
more...
-
Why some video game ports are spotless while others are train wrecks
http://www.extremetech.com/wp-conten...-9-640x360.jpg
During the last console generation, HD remasters became a popular trend among many of the major publishers. PS2 and original Xbox games were given a fresh coat of paint, and re-released to the world. While the reaction to this trend wasn’t universally positive, it was more or less accepted because of the significant jump in fidelity between 480p and 720p. This time around, the situation is a bit more complicated, and the bloom is off the rose.
The PS4 and Xbox One have been out for about a year and a half now, and the number of significant exclusive releases has been pretty slim. To help pad out the line-ups, a massive wave of ports from the 360, PS3, and PC has hit both the PS4 and Xbox One. And since the vast majority of the original games were released in high definition to begin with, the “HD remaster” moniker doesn’t really make sense anymore.
While some releases are definitely more impressive than others, the definitions aren’t particularly clear here. At this point, the line between “port” and “remaster” is significantly blurred. Even if you’re upping the resolution, improving the textures, and reworking the net code, those two terms are used interchangeably more often than not.
And while this generation of consoles has started to become more PC-like with x86-64 CPUs and standard AMD GPUs, PCs are starting to function more like consoles with DirectX 12 and Mantle. The definitions are changing rapidly, and it leaves most of us more than a little confused — even when it comes to day-one releases of multi-platform games.
So, why are some ports nearly spotless while others are train wrecks? Are the PS4 and Xbox One really just dolled-up midrange PCs? These kinds of questions pop up frequently in tech/game enthusiast forums, and the discussion usually devolves into a screaming match. I wanted to bypass the noise, and hear the story directly from the devs themselves.
Recently, I had the opportunity to interview developers from two game studios: Iron Galaxy and BlitWorks. Both of those studios are well-known for delivering ports on numerous different platforms, so their first-hand knowledge of this complicated field is exactly what I was looking for. Without further ado, let’s hear from the boots on the ground.
BlitWorks interview
Grant Brunner: Can you give some background on BlitWorks, and what your company does?
Tony Cabello: Well, basically, we specialize in porting games. With all the founders having been involved in the game development industry for almost twenty years, we have a very deep technical background. Miguel, my partner, he’s a legend in the emulation scene. He did System 32, the Mega Drive, the Dreamcast, and a lot of other work.
Miguel Horna: Neo-Geo, Capcom System II & III, Sega Model 2.
TC: So, we started offering porting services. And over time, we are adding more services like publishing and remakes. Although it’s difficult to say what is a remake and what is a port, because there’s a fine line between them.
GB: There’s a common opinion in the enthusiast community that because the PS4 and Xbox One are using AMD CPUs and GPUs, it’s easy to port to and from the PC. Does this idea actually hold water based on your experience?
TC: Yes and no. The architecture is fairly similar to a PC, but a few details make a big difference. You have to fully understand each component in order to get the best performance. It’s easy to do a port, but it takes a lot of effort to get the max performance of each console.
This is where the traditional big difference between consoles and PC remains — on consoles you have low-level access to all the hardware. The OS runs in a reserved set of cores and memory, so there’s no overhead from it. You have specific memory mapped regions. You have very fine-grained control of what is going to be cached or how you build the display list, and things like that. There’s almost no graphics driver in between. You are feeding the commands directly to the GPU, and this requires a very different mindset.
And also, it’s important that you have very powerful tools. I mean, the profilers and the debuggers on the consoles are spectacular compared to what you get on PC. And this allows you to fully understand what’s going on. If the game is running slowly, you can always know why.
GB: Since Mantle and DirectX 12 are allowing developers to get closer to the metal, does that mean good things for PC-console ports going forward?
TC: Yeah, sure. The new APIs are following that direction. And, basically, they are getting things out of the drivers, so they are allowing you low-level access. In the end, all the GPUs are more similar nowadays than a few years ago[…]
GB: And in terms of CPUs, both PCs and consoles are doubling down on more cores with less of a focus on clock speed. Is that trend a positive thing when you’re porting games to and from these different platforms?
TC: Right. The trend really helps because the mindset is closer than back in the day with the Cell processor — it wasn’t exactly in the trend. Each core had specific memory, and there was a DMA [direct memory access] across the different cores. I mean, it was a very different architecture than a set of general purpose cores that you can find in current CPUs.
MH: That’s why the Xbox 360 was thought to be easier to program [than the PS3]. Because it had three general purpose cores[…] it was more like a PC. PS3 has a main CPU, but then the other CPUs are using a different instruction set, different memory, and everything has to be communicated by DMA. So it was very hard to work with it.
GB: So if the PS3 is on one end of the difficulty spectrum, is there a platform on the other end? Something incredibly easy to port to and from?
TC: I would say the PS4 and Xbox One[…] everything else falls in between like the Vita and the 3DS. They’re in between because they have different challenges. They are less powerful, but still, they use a normal architecture.
http://www.extremetech.com/wp-conten...y-640x360.jpeg
GB: BlitWorks has ported games with a very strong focus on precise timing (Meat Boy, Necrodancer, OlliOlli). Are those kinds of games harder to execute on properly than, let’s say, a menu-driven RPG?
TC: Yeah, every time you have to deal with real time, a lot of unexpected things happen. On every platform, you’ll find a different bottleneck that keeps you away from 60 frames per second. Actually, 60 frames is very difficult. It’s so tight — it’s 16 milliseconds. If anything unexpected comes, you’ll miss a frame. And it’s so easy to miss a frame in the middle of an animation, and [ruin the smoothness].
For instance, in Spelunky, we hit some particular rendering issues on the Vita. People complained because it made the game harder to play during specific moments. Being a hard game already, people got upset. So, we had to further optimize the game, and keep a solid 60 frames per second since it was affecting gameplay.
MH: In games like OlliOlli, any alteration on the timing can make you miss a trick. The common issue with this particular game is when using TVs that do excessive signal processing. Either you activate the game mode on the TV set or you’re going to have a hard time getting used to compensate the lag in your brain, as you will have to press the buttons a few milliseconds in advance. Necrodancer is also a good example, as it has a calibration screen at the start of the game in order to compensate the lag for you.
GB: What about compensating for different controllers? Button positioning, dead zones, and the overall feel is different between controllers. Do you ever have to make tweaks to make the game feel “right”?
TC: Yeah. In every game, we have to do some degree of fine tuning of controls. This is a very subjective topic, and it’s not always clear what to do, so both our testers and the original game creator exchange very valuable feedback so the intended feel is preserved. But there’s no simple rule, just play and see how it feels.
MH: When you have a touch game for Vita like OlliOlli, and you want to port it to PS4… The menus were only built to be used by touch, so we have to create that new menu system so it can be navigated with gamepads.
GB: On the BlitWorks website, there is a page dedicated to something called “Unsharper.” You folks call it a “semi-automatic tool for converting C# code into C++.” How large of a role does automation play in a typical port from BlitWorks?
TC: Well, we started this tool because when we ported FEZ, we used an already existing tool for converting from C# to C++. This tool missed some features, and we had to finish the port manually by filling the gaps in the generated code. But with bigger games, like Bastion, on which we’re working now, this way would have been impossible. The amount of code is much bigger, so we needed to automate the conversion. The manual part is still there, but it’s not the same amount of work we did with FEZ which was very, very time consuming.
GB: Do you do the manual changes before or after the automation process?
TC: The tool is generating C++, and we do the manual changes on the C# side, so we’re constantly converting to see the improvements. The tool is actually a compiler if you look at it this way. What we do is, we adjust the C# to fit the requirements of our compiler, which is not perfect, and does not support everything.
GB: Is there anything about game ports (or the porting process) that you think ExtremeTech readers should know about?
TC: Well, what I would say is that ports can be very technical, but good ports always carry a lot of product level kind of work — adjusting the controls, getting a proper UI, getting smooth graphics. In a sense, a good port is like going back to the time when the game was being developed, changing some of the assumptions, and reworking the affected parts.
I think this is the biggest difference from doing something like an emulator which is a 100% technical thing. You just have to get the game to behave the way it did originally. In porting, there’s a lot of care about the product — exactly like when the game was developed originally, but on a different platform.
MH: If you do ports like… an automatic machine, your ports aren’t going to be good. They need a lot of care — a lot of attention. A lot of love from the team to make the port in the best possible way.
more...
-
4K Ultra Blu-rays will land on store shelves by Christmas
http://www.extremetech.com/wp-conten...ys-640x353.jpg
If you’ve been hoping for a 4K Blu-ray upgrade to show off a swank new 4K television, Christmas 2015 should be the highlight of your year. According to Victor Matsuda, the Blu-ray Disc Association’s Global Promotions Committee Chair, 4K Blu-ray discs will be available for the holiday season, along with a bevy of new technologies and additional features. In addition to supporting 4K resolutions, the new discs include support for multiple types of high dynamic range (HDR) lighting, and a new “digital bridge” feature.
http://www.extremetech.com/wp-conten...e1-640x474.png
The digital bridge feature is the physical media world’s method of offering the kind of digital convenience that streaming services offer as standard. According to an interview with Matsuda, the digital bridge will offer two functions: copy and export. “Copy” permits a bit-for-bit copy to be stored on an authorized media drive, while “export” allows a file to be transferred to an authorized media device. It’s not clear which devices will be considered “authorized,” and whether or not the licensing terms will allow for transcoding into different formats for playback on specific devices. The fact that two different standards have been created for fundamentally similar practices suggests that “copy” may be for storing copies of a movie directly on an Ultra HD Blu-ray (that’s the official 4K name) while “export” could allow a film to be shifted to a tablet or smartphone.
more...
-
Crackdown 3 will deliver Microsoft’s cloud-backed, fully destructive terrain
http://www.extremetech.com/wp-conten...98-640x353.jpg
When Microsoft first announced the Xbox One, it claimed developers and gamers could take advantage of Microsoft’s unique cloud infrastructure to boost game performance or create special effects that a single Xbox One couldn’t possibly handle. The company never said much about the feature after that, though we saw a glimmer of it last year, when Titanfall ran the game’s AI on Microsoft’s cloud servers. Now that’s changing, thanks to Crackdown 3. The upcoming game was shown off at Gamescom this week and its multiplayer mode has a rarely tapped capability: full environmental destruction.
Environmental damage isn’t new to gaming. It’s been featured in titles like Battlefield: Bad Company 2, Battlefield 4, and the Red Faction series using the GeoMod 2.0 and 2.5 engine. What sets Crackdown 3 apart, however, appears to be the degree of damage players can inflict on a structure during the game’s multiplayer mode, as well as the sheer amount of computational power Microsoft is going to throw at the problem. According to Dave Jones, the creative director on Crackdown 3, the goal is to make the entire game world fully destructible.
“We thought: what about if, for the first time, we make the whole world fully destructible?” Jones told Ars Technica. “We asked ourselves simple questions, such as ‘why don’t my bullets go through walls when I shoot them?’ or ‘why can’t I step through big holes I’ve made in those walls?’ It’s a very different way of thinking about games. If there’s a guy behind the wall, I can just shoot him through the wall, shooting both the wall and the guy to bits. That’s the way we think game worlds need to evolve.”
Crackdown 3 is relying on a backend developed by Cloudgine, a company dedicated to developing cloud-based rendering and offload technology. During the tech demo, Jones showed how increased environmental destruction required more computational power. That need was communicated back to Cloudgine, and more servers were brought online to deal with the increased demand. The extra workload apparently averages out to roughly six Xbox Ones’ worth of number crunching, though that can burst as high as 13. The number 20 has also been batted around, though this may represent a maximum workload scenario with a full set of players blowing bits of the game into smithereens simultaneously.
Cloud offload could be the future of gaming
While it’s true that the Xbox One isn’t as powerful as the PlayStation 4, don’t be fooled into thinking cloud offload is just an attempt to paper over differences between the two consoles. If Crackdown 3 requires 6-13 Xbox Ones to render full environmental destruction in real time, that still works out to 4-10 PS4s, depending on the workload in question. The Crackdown team still needs to demonstrate that its solution can scale and perform under pressure, but assuming it can, we could be seeing the beginning of a new model of game development.
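That conversion is easy to sanity-check using the commonly cited peak GPU throughput figures for each console (our assumption; neither Microsoft nor Cloudgine frames it this way):

```python
# Back-of-the-envelope check on the Xbox One-to-PS4 conversion above, using
# the commonly cited peak GPU figures for each console (our assumption).
XBOX_ONE_TFLOPS = 1.31
PS4_TFLOPS = 1.84

for xbox_ones in (6, 13):
    ps4_equivalent = xbox_ones * XBOX_ONE_TFLOPS / PS4_TFLOPS
    print(f"{xbox_ones} Xbox Ones ~ {ps4_equivalent:.1f} PS4s of GPU compute")
# Prints roughly 4.3 and 9.3, in line with the 4-10 PS4 range above.
```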
more...
-
Imagination pits Vulkan API against OpenGL in ‘gnomes per second’ test
http://www.extremetech.com/wp-conten...es-640x353.jpg
One of the biggest expected announcements at SIGGRAPH this week has been delayed: The Khronos Group has yet to release its Vulkan API. Fortunately, Imagination Technologies has stepped in with a tech demo to make the months of waiting more bearable. Gnome Horde demonstrates the impressive performance advantages this low-level API provides compared with other graphics APIs.
The video below shows a consumer tablet outfitted with a PowerVR G6430 GPU. The right half renders the scene using OpenGL ES, and the left half uses Vulkan.
The large number of objects to render stresses the hardware, and this is especially noticeable when the camera zooms out. At that point, OpenGL collapses, with CPU utilization above 90% and frame rates in the single digits. The greater control Vulkan grants allows the demo to spread its CPU load across all cores, and the frame rate never falls below 30 fps.
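In rough terms, the win comes from where the per-draw CPU work happens. The Python toy below models that scaling effect by splitting stand-in “command recording” work across cores, the way a Vulkan renderer can record command buffers on multiple threads. It is an illustration of the principle only; real Vulkan code is written in C or C++ against the actual API:

```python
# Toy model: OpenGL ES funnels per-draw CPU work through one thread, while
# Vulkan lets each core record its own command buffer. The "recording" here
# is just stand-in CPU work, split across processes to use all cores.
import time
from concurrent.futures import ProcessPoolExecutor

def record_commands(draw_calls: int) -> int:
    total = 0
    for i in range(draw_calls):   # stand-in for per-draw state setup/validation
        total += i * i
    return total

if __name__ == "__main__":
    draws = 2_000_000

    start = time.perf_counter()
    record_commands(draws)        # everything on one core
    print(f"single core: {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=4) as pool:
        list(pool.map(record_commands, [draws // 4] * 4))
    print(f"four cores:  {time.perf_counter() - start:.2f}s")
```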
This performance increase is not free; as Imagination’s post notes, it comes at the expense of more work for the programmer: “All of the features require implementation in code, so the use of Vulkan does come with added code complexity compared to OpenGL ES. However, Imagination is committed to continuing full support for OpenGL ES for a long time to come alongside developing a new Vulkan API driver for PowerVR Rogue GPUs. Devices with the new Vulkan API should bring new optimization opportunities and increased efficiency to application developers.”
This also means that OpenGL will not disappear: developers will be free to program their projects at a high level using OpenGL, or choose Vulkan to further optimize their code — although you can safely assume all the main game engines will add support for Vulkan.
http://architosh.com/wp-content/uplo...l-v-vulkan.jpg
It seems clear that the new API will enable hitherto unthinkable games on mobile devices and budget graphics hardware. Other low-level APIs come with the same promises: Microsoft’s DirectX 12 and Apple’s Metal. The advantage of Vulkan is that, as a free API, it will be available on any platform that wishes to adopt it. Google has recently confirmed Vulkan support in new versions of Android, and other platforms that will support Vulkan include Windows and Linux, especially SteamOS (Vulkan seems to be Valve’s secret weapon).
We don’t have definitive dates for the release of Vulkan, or for its integration with Android, but Khronos is still planning to release it later this year.
more...
-
EA executive calls on-disc DLC complaints ‘nonsense,’ but the truth is more complex
http://www.extremetech.com/wp-conten...se-640x353.jpg
Ask a gamer what they think of downloadable content (DLC) these days, and you’re as likely to hear a torrent of blistering language as anything positive. The process of releasing content piecemeal over time has been extremely controversial in the gaming community, but publishers have embraced the concept despite the mixed reaction. It’s now common practice for AAA games to release several follow-up areas or adventures, with access often sold as a “Season Pass.”
http://www.extremetech.com/wp-conten...lc-640x320.jpg
DLC, as often viewed by gamers
One of the most controversial aspects of DLC has been on-disc DLC that ships as part of the core game, but that players must pay a fee to unlock. Dragon Age: Origins was criticized for this when it launched in 2009 — as soon as you reached your camp, NPCs essentially attempted to sell you a DLC package from within the game itself. In a recent interview with GameSpot, EA executive Peter Moore declared that gamer hostility to DLC was “nonsense,” and caused by a fundamental misunderstanding of how game development works. When asked how EA reconciles the tension between gamers who generally dislike DLC and publishers that increasingly depend on it for revenue, Moore responded “Well a lot of that resistance comes from the erroneous belief that somehow companies will ship a game incomplete, and then try to sell you stuff they have already made and held back. Nonsense.”
That answer might seem dismissive, but there’s quite a bit of truth to it.
more...
-
DirectX 12 arrives at last with Ashes of the Singularity, AMD and Nvidia go head-to-head
http://www.extremetech.com/wp-conten...s1-640x353.jpg
Ever since Microsoft announced DirectX 12, gamers have clamored for hard facts on how the new API would impact gaming. Unfortunately, hard data on this topic has been difficult to come by — until now. Oxide Games has released an early version of its upcoming RTS game Ashes of the Singularity, and allowed the press to do some independent tire-kicking.
Before we dive into the test results, let’s talk a bit about the game itself. Ashes is an RTS title powered by Oxide’s Nitrous game engine. The game’s look and feel somewhat resemble Total Annihilation, with large numbers of units on-screen simultaneously, and heavy action between ground and flying units. The game has been in development for several years, and it’s the debut title for the Nitrous engine.
http://www.extremetech.com/wp-conten...s3-640x390.jpg
An RTS game is theoretically a great way to debut an API like DirectX 12. On-screen slowdowns when the action gets heavy have often plagued previous titles, and freeing up more CPU threads to attend to the rendering pipeline should be a boon for all involved.
Bear in mind, however, that this is a preview of DX12 performance — we’re examining a single title that’s still in pre-beta condition, though Oxide tells us that it’s been working very hard with both AMD and Nvidia to develop drivers that support the game effectively and ensure the rendering performance in this early test is representative of what DirectX 12 can deliver.
Nvidia really doesn’t think much of this game
Nvidia pulled no punches when it came to its opinion of Ashes of the Singularity. According to the official Nvidia Reviewer’s Guide, the benchmark is primarily useful for ascertaining if your own hardware will play the game. The company also states: “We do not believe it is a good indicator of overall DirectX 12 performance.” (emphasis original). Nvidia also told reviewers that MSAA performance was buggy in Ashes, and that MSAA should be disabled by reviewers when benchmarking the title.
more...
-
Windows 10 jettisons SafeDisc and SecuROM, may phone home about cracked games
http://www.extremetech.com/wp-conten...10-640x353.jpg
Two new bits of information have surfaced regarding Windows 10 and its various DRM and phone-home strategies. First, there’s news that looks great at first glance — Windows 10 no longer supports the much-hated SecuROM and SafeDisc. The downside is that Windows 10 can no longer play titles that use those DRM schemes unless the original developer has patched them out.
This revelation comes from Microsoft’s Enthusiast Marketing Manager for Windows, Boris Schneider-John, who told German publication Rocket Beans the following:
“Everything that ran in Windows 7 should also run in Windows 10. There are just two silly exceptions: antivirus software and stuff that’s deeply embedded into the system. And then there are old games on CD-ROM that have DRM. This DRM stuff is also deeply embedded in your system, and that’s where Windows 10 says ‘sorry, we cannot allow that,’ because that would be a possible loophole for computer viruses.
That’s why there are a couple of games from 2003-2008 with SecuROM, etc. that simply don’t run without a no-CD patch or some such. We can just not support that if it’s a possible danger for our users. There are a couple of patches from developers already, and there is stuff like GOG where you’ll find versions of those games that work.”
Mixed blessings
On the one hand, good on Microsoft for patching potentially dangerous loopholes in system security. While these programs weren’t actually rootkits, they embedded themselves in a manner similar to rootkits, and existed at a very low level within the system itself. Security researchers, as a result, are anything but fans of the technology. Gamers uniformly loathed them, as illustrated by this Penny Arcade comic.
http://www.extremetech.com/wp-conten...OM-640x304.jpg
No more of this? Hurrah! But can we get backwards compatibility?
On the other hand, however, this means a number of games, some released quite recently, will no longer work without new patches or game cracks. While SafeDisc hasn’t been used for several years, a number of games shipped with SecuROM, including titles like Fallout 3, Dragon Age II (EA attempted to camouflage this by calling it Sony Release Control), Oblivion, BioShock, the 2012 PC release of Final Fantasy, and dozens more over the years.
more...
-
Nvidia launches GTX 950, boosts game performance at the $159 price point
http://www.extremetech.com/wp-conten...50-640x353.jpg
Over the last year, Nvidia has slowly and steadily introduced a new line of GPUs to replace its aging Kepler family. The latest entry in the GTX 900 series, the GTX 950, is based on the same GM206 GPU as the GTX 960, but drops a few CUDA cores and some texture mapping units to hit its new targets. The GTX 950 contains two graphics processing clusters (GPCs), 768 cores, 48 texture units, and 32 ROPs. That’s a fair cut from the GTX 960’s 1024 cores and 64 TMUs (the 32 ROPs carry over), but it should offer significantly improved performance over both the GTX 750 Ti and the GTX 650 Ti.
http://www.extremetech.com/wp-conten...42-640x442.jpg
Zotac’s GTX 950
The GTX 950 also doubles up on memory (2GB instead of 1GB) and pairs 6.6GHz effective GDDR5 with a 128-bit bus. Both the 750 Ti and 650 Ti made do with 5.4GHz GDDR5, though the 750 Ti wowed us at launch with its impressive memory-bandwidth efficiency, even if the card wasn’t an outright performance winner against AMD’s hardware. This time around, the GTX 950 has a stronger hand to play, and AMD ends up rocked back on its heels.
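For reference, those memory clocks translate into raw bandwidth via the standard GDDR5 arithmetic (effective data rate times bus width, divided by eight bits per byte); the figures below are computed, not taken from Nvidia’s spec sheets:

```python
# Standard GDDR5 bandwidth math: effective rate (GHz) * bus width (bits) / 8.
def gddr5_bandwidth_gb_s(effective_ghz: float, bus_width_bits: int) -> float:
    return effective_ghz * bus_width_bits / 8

print(gddr5_bandwidth_gb_s(6.6, 128))  # GTX 950: 105.6 GB/s
print(gddr5_bandwidth_gb_s(5.4, 128))  # GTX 750 Ti / 650 Ti: 86.4 GB/s
```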
Because Nvidia decided to launch in the middle of IDF, most publications haven’t had time yet to look at the card itself. One exception is Guru3D, which has put together a fairly comprehensive evaluation of the GTX 950, 960, and AMD’s lower-end products. They also catalog the performance difference between OEM specialty cards, which typically retail for more money ($169 – $179, compared to a base price of $159) and the vanilla-flavored GTX 950.
Unfortunately, AMD is fighting relatively new Maxwell hardware with GCN-based GPUs that are much older. AMD’s $149 GPU, the R7 370, does include a 4GB RAM buffer, but its performance can’t quite match the GTX 950. The graph excerpt below shows performance in a common title we also test, BioShock Infinite.
http://www.extremetech.com/wp-conten...aphExcerpt.png
This kind of discrepancy typifies the comparison, though the exact figures vary from title to title. The R7 370 consistently trails the GTX 950 by at least a few frames, even if you only compare against the reference GTX 950 rather than the Asus or Palit cards. While it’s true that the R7 370 is slightly cheaper than Nvidia’s card, the $10 price difference probably isn’t enough to balance the scales.
more...
-
Steam competitor GOG Galaxy to allow game rollbacks, patch uninstalls
http://www.extremetech.com/wp-conten...xy-640x353.png
GOG (formerly Good Old Games) launched back in 2008 as a one-stop shop for older titles without any DRM restrictions. Over the past few years, the service has transformed into a would-be Steam competitor. The GOG Galaxy service went into beta last year as a combined store, social media hub, and software delivery service, but with the same focus on delivering games sans DRM. Steam dominates the world of PC digital distribution, but GOG just introduced a unique feature that could endear it to modders and players alike — the ability to forgo patches and updates.
Steam, Origin, and Uplay all offer ways to prevent games from updating, either by disconnecting your entire account from the Internet (assuming a game doesn’t require its own validation) or by choosing, title-by-title, not to update the game when an update is available. None of these services, however, actually offers an option to roll back to an earlier version of the game. Backing up a game in Steam doesn’t bestow this capability, either — if you make a copy of your files and later attempt to restore them, you’ll be prompted to verify the installation online and the game won’t launch until it has finished updating the original backed-up version.
http://www.extremetech.com/wp-conten...xy-640x365.jpg
The GOG client
“We know that patches can occasionally break a game or affect your mods,” Piotr Karwowski, GOG’s vice president of online tech, said in a statement. “With the newest update to GOG Galaxy, we’re giving our users more control over their games and patches, but also addressing many of the top requests from our community.”
Other improvements to the still-in-beta service include a streamlined navigation system, better performance, support for high-resolution displays, the ability to pause and resume installations, and the aforementioned “Rollback” feature. GOG doesn’t say whether taking this option consumes additional hard drive space, but it could — GOG may copy files locally to speed the recovery process. Alternatively, the service could track updated files and re-download only the initial versions to restore a title to full functionality.
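If GOG went the manifest route, a rollback could look something like the sketch below. This is entirely hypothetical, since GOG hasn’t documented how Galaxy implements the feature:

```python
# Hypothetical rollback scheme, following the article's speculation: keep a
# per-version manifest of cached file copies and restore the older set.
# GOG has not documented how Galaxy actually implements this.
import json
import shutil
from pathlib import Path

def rollback(game_dir: Path, manifest_path: Path, target_version: str) -> None:
    manifest = json.loads(manifest_path.read_text())
    # manifest["files"] maps each relative path to {version: cached_copy_path}
    for rel_path, versions in manifest["files"].items():
        cached_copy = Path(versions[target_version])
        shutil.copy2(cached_copy, game_dir / rel_path)  # overwrite patched file
    print(f"Restored {len(manifest['files'])} files to version {target_version}")
```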
http://www.extremetech.com/wp-conten...ta-640x344.png
The new, 1.1 client feature list
It’s not clear how this new capability would play with some of the updates we’ve seen from time to time on other services. About a year ago, Rockstar courted controversy when it updated the Steam version of Grand Theft Auto: San Andreas to remove 17 songs that had previously been licensed for the game. While the company likely did this due to an expired license agreement, consumers were angry that versions of the game they purchased months or years before had content unceremoniously stripped out. In theory, GOG’s new patch-dodging feature would block such maneuvers by Rockstar for gamers who chose to keep the song libraries that shipped with the title. Then again, some companies might not be willing to work with the service without being able to control customer access to such content.
more...
-
Call of Duty: Black Ops III beta pulls off 60fps on the PS4 — almost
http://www.extremetech.com/wp-conten...y-640x353.jpeg
It’s no secret that the PS4 and Xbox One haven’t seen very many solid 60fps releases. While slower-paced games work fine at 30fps, competitive first person shooters like Call of Duty really benefit from a smooth 60fps experience. So, has Treyarch been able to achieve a solid 60fps for Black Ops III? For the most part, the answer is yes. Unfortunately, there are a few slight issues to consider before picking this game up at launch.
Earlier this month, the Black Ops III multiplayer beta went live on the PS4, and the results have been fairly solid. I spent about 45 minutes playing around with it, and the performance was fine. While I was handily murdered by genre devotees, I didn’t run into any notable technical issues.
Over at Digital Foundry, you can see exactly how the frame rate holds up under pressure. By and large, the game stays at 60fps, but it does drop here and there. When numerous large-scale events happen simultaneously, the frame rate will occasionally drop to 50fps. However, that’s a relatively rare occurrence, and the frame rate bounces back to 60fps very quickly.
Additionally, screen tearing is a bit of a problem. When the engine can’t quite keep up, you’ll see some tearing at the top of the image. While it’s mildly disappointing that the game doesn’t deliver a rock-solid, locked 60fps experience from top to bottom, none of these issues should have a significant impact on how it plays.
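To see why a dip to 50fps shows up as tearing, it helps to look at the frame-time budget: against a 60Hz refresh, each frame has roughly 16.7ms to finish, and anything slower arrives mid-refresh. A quick back-of-the-envelope calculation (our arithmetic, not Digital Foundry’s):

```cpp
#include <cstdio>

// Frame-time budgets against a 60Hz refresh. A frame that takes longer than
// the ~16.7ms vsync window is scanned out partway through a refresh, which
// is exactly the tearing visible at the top of the image.
int main() {
    const double refreshMs = 1000.0 / 60.0;   // ~16.7ms per refresh
    const double rates[] = {60.0, 50.0, 30.0};
    for (double fps : rates) {
        double frameMs = 1000.0 / fps;
        std::printf("%2.0f fps = %.1f ms/frame (budget %.1f ms, over by %+.1f ms)\n",
                    fps, frameMs, refreshMs, frameMs - refreshMs);
    }
}
```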
While the traditional competitive multiplayer seems fine, Digital Foundry seems more worried about the four-player campaign co-op mode. The early footage we’ve seen of it ranges wildly from 30fps to 50fps, but Treyarch’s Mark Lamia claims that the existing footage is from a pre-alpha build, and it wasn’t given the full optimization treatment.
http://www.extremetech.com/wp-conten...s-640x360.jpeg
As it stands, it’s important to keep in mind that this is just a peek at a portion of the total product. The final game won’t be hitting shelves until November 6th, so there is still time to make performance improvements. And given the state of the gaming industry, a post-release performance patch isn’t out of the question.
As for the Xbox One and PC versions of the game, we still don’t know exactly what to expect. Since Sony signed a deal with Activision, we won’t be able to see the beta running on other platforms until tomorrow. While numerous recent PC releases have been something of a mess, Call of Duty has a long history on the PC. I don’t anticipate any serious issues, but only time will tell.
While the PS4 has had 1080p Call of Duty releases (almost) from the get-go, the Xbox One hasn’t been so lucky. Ghosts was stuck at 720p, and last year’s Advanced Warfare ran at the strange resolution of 1360×1080. While we can probably expect a similar frame rate on the Xbox One, the final resolution remains up in the air.
more...
-
New GTA V mod boosts PC version’s graphics detail, crushes console versions
http://www.extremetech.com/wp-conten...re-640x353.jpg
Every time a major game launches, there’s a rush to compare performance and specs between PCs and consoles. In the case of Grand Theft Auto V, those comparisons came out solidly in the PC’s favor. While the Xbox One and PS4 versions of the game are still gorgeous, the PC version offers higher detail levels, better foliage, and options like 16x anisotropic filtering (the two consoles only use 4x). Modders, however, are rarely satisfied with “best in class,” and a new mod project for GTA V shows just how gorgeous the game can be with a little more work.
The new Toddyhancer mod, by Martin Bergman, is still very much a work-in-progress. Bergman, amusingly, recommends that players “not go crazy,” noting that “Its just Reshade Shaders, ENB series, simple tweaks and some tonemapping with class!” As with all mods, fine-tuning details can take a significant amount of time. The video below illustrates some gorgeous detail levels and stylistic shots, but also may have lighting issues — it’s a bit dark overall. Other screenshots are a bit dark as well, though users have noted that many of them appear to have been taken at night.
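Bergman hasn’t published his shader stack beyond that description, but tonemapping operators of the kind ReShade and ENB presets layer on a game’s output are well known. Here’s a minimal Reinhard-style sketch in C++ for illustration; the exposure and gamma constants are our placeholders, not Toddyhancer’s values:

```cpp
#include <algorithm>
#include <cmath>

// Illustrative Reinhard tonemap with an exposure control and gamma encoding.
// Real ReShade/ENB presets chain many such passes (grading, bloom, etc.).
float tonemap(float hdrLuminance, float exposure = 1.2f) {
    float x = hdrLuminance * exposure;
    float compressed = x / (1.0f + x);          // Reinhard: soft-clip highlights
    float clamped = std::clamp(compressed, 0.0f, 1.0f);
    return std::pow(clamped, 1.0f / 2.2f);      // encode for display gamma
}
```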
Other screenshots of Bergman’s work on Toddyhancer, as well as his other mods, are available on his Facebook page, where he notes that a test build will be released when he feels it’s ready, and that the mod likely won’t be compatible with GTA V Online. Rockstar has taken a hard line with modding where online play is concerned, and the company isn’t afraid to drop the hammer. In some cases, users have lost access to all Rockstar games as a result of account suspensions, not just GTA V.
http://www.extremetech.com/wp-conten...-2-640x360.jpg
http://www.extremetech.com/wp-conten...-1-640x360.jpg
Bergman notes that the mod does come with a performance cost: his Asus ROG G751JY laptop loses about 10 FPS with it fully enabled. If your system is already stuttering in GTA V, mods like this may not improve your game much, even if you love the visuals.
Modding: The true strength of PC gaming
I’ve been a PC gamer since I was eight years old and text-based adventure games were a hot commodity. When I was a child, I lusted after a Nintendo that my parents refused to buy — in the 1980s, consoles were capable of a fluidity and speed that PC games never matched. Mario running and jumping smoothly on TV might not seem like much, but there were few equivalents on the PC side.
Over the decades, we’ve seen both platforms advance by leaps and bounds. I still maintain that a keyboard and mouse are superior to any controller in existence, but that’s a peripheral issue rather than a core strength of either platform. Thanks to the magic of USB, you can use game controllers for PC titles if you have a mind to. What sets PC gaming apart isn’t graphics, or keyboards, or even game types — it’s modding.
more...
-
AMD unveils Radeon R9 Nano: HBM and Fury X in a 6-inch GPU
http://www.extremetech.com/wp-conten...o1-640x353.jpg
When AMD announced the Fury family of GPUs at E3 back in June, it promised that its new graphics hardware would ship in multiple high-end flavors. We’ve already covered the launch of the water-cooled, 7.5-inch Fury X and the larger, air-cooled Fury, but today’s announcement covers what was arguably the most interesting card of all — the six-inch Radeon R9 Nano. (The GPU was initially referred to as the AMD Radeon R9 Fury Nano, but AMD’s marketing folks realized that’s a bit of a mouthful).
http://www.extremetech.com/wp-conten...o4-640x510.jpg
It’s very small
AMD has set a high bar for the Nano, with some aggressive performance claims and a tight power envelope. Now that the company is formally revealing the card, let’s take a look at its specs and capabilities.
more...
-
Ashes dev dishes on DX12, AMD vs. Nvidia, and asynchronous compute
http://www.extremetech.com/wp-conten...s1-640x353.jpg
When Ashes of the Singularity launched two weeks ago, it gave us our first view of DirectX 12’s performance in a real game. What was meant to be a straightforward performance preview was disrupted by a PR salvo from Nvidia attempting to discredit the game and its performance results. Oxide Games refuted Nvidia’s statements about the state of Ashes, but the episode raised questions about the maturity of Nvidia’s DX12 drivers and whether its GPUs are as strong in DirectX 12 as they have been in DirectX 11. (Oxide itself attributed these differences to driver maturity, not any fundamental quality of either GPU family.) Now, an unnamed Oxide employee has released some additional information on both the state of Ashes and the reason why AMD’s performance is so strong.
http://www.extremetech.com/wp-conten.../DX12-High.png
AMD’s R9 Fury X tied the GTX 980 Ti in DX12, though NV swept DX11 by a large margin.
more...
-
Metal Gear Solid V runs at 60fps on PS4 and Xbox One, dips down to 20fps on last-gen consoles
http://www.extremetech.com/wp-conten...g-640x353.jpeg
Hideo Kojima’s magnum opus, Metal Gear Solid V: The Phantom Pain, is out today on the PS4, Xbox One, PS3, Xbox 360, and PC. In spite of all of the Konami drama over the last year, MGSV is no worse for wear. In fact, it’s received nigh-on universal praise for both its ambition and gameplay execution. But how do the graphics hold up?
First things first: the resolution is different across the board. The last-gen consoles render the game at roughly 992×720, the Xbox One runs at 1600×900, the PS4 at 1920×1080, and the PC version can deliver 4K if you have the gear to handle it. While it’s slightly disappointing that the Xbox One still can’t hit 1080p, this is a substantial boost from last year’s 720p release of Metal Gear Solid: Ground Zeroes on Microsoft’s console.
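For a sense of scale, here’s the raw per-frame pixel count at each of those resolutions (our arithmetic):

```cpp
#include <cstdio>

// Pixels per frame at each platform's reported render resolution. The PS4
// pushes roughly 44% more pixels than the Xbox One and close to 3x the
// last-gen consoles; 4K is four times the PS4's load.
int main() {
    struct Res { const char* platform; int w, h; };
    const Res res[] = {
        {"PS3/Xbox 360",  992,  720},
        {"Xbox One",     1600,  900},
        {"PS4",          1920, 1080},
        {"PC (4K)",      3840, 2160},
    };
    for (const Res& r : res)
        std::printf("%-12s %4d x %-4d = %5.2f megapixels\n",
                    r.platform, r.w, r.h, r.w * r.h / 1e6);
}
```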
Over at Eurogamer’s Digital Foundry, you can see a full breakdown of all of the console versions. Sadly, the PC version will have to wait a bit, but expectations are definitely high after a solid showing with Ground Zeroes. Thankfully, both of the current-gen consoles are capable of delivering a nearly perfect 60 frames per second. There are occasional dips on both platforms, but they’re rare. I’ve spent about three hours with the PS4 version already, and it’s been silky smooth from top to bottom. Gameplay is absolutely unhampered here.
The PS4 offers better texture filtering than the Xbox One, but the difference is trivial. Pop-in is extremely mild on the newer consoles, and you probably won’t even notice it. However, one thing stands out as particularly puzzling. Subsurface scattering, an effect designed to make skin look slightly more realistic, seems to be missing completely from the Xbox One version. That effect was present on the Ground Zeroes release for the Xbox One, and it isn’t even particularly taxing on the hardware. Digital Foundry is so confused by this omission that the team posits it might actually be a bug that can be patched down the road.
Unsurprisingly, the Xbox 360 and PS3 versions of the game don’t hold up quite so well. The target frame rate is halved to 30 frames per second, but neither the PS3 nor the 360 can keep it pegged. The PS3 seems to run at a slightly worse frame rate overall, but neither one is very good. There’s substantially more pop-in, and the textures are clearly blurrier on the old consoles.
However, it’s still impressive that the entire game runs on these ancient machines. The unstable frame rate will make lining up shots during frantic moments a bit harder, but it’s nice that those among us still clinging to the last-gen consoles get to enjoy what seems to be the last “real” Metal Gear game.
more...
-
New PS4 firmware 3.0 update will make cloud saves suck less
http://www.extremetech.com/wp-conten...4-640x353.jpeg
As the PS4’s second anniversary draws closer, Sony is working hard on the third major iteration of its BSD-based operating system. With improved cloud storage, better social features, and the addition of YouTube live streaming, this update won’t be revolutionary, but hopefully it’ll make the day-to-day operation of the PS4 significantly less of a hassle.
Last October, Sony released firmware 2.0 to the world. It brought Share Play, music playback, and themes to the popular game console. Back in March of this year, firmware 2.5 gave us higher quality Remote Play streams and the ability to suspend/resume gameplay at will. Firmware 3.0, dubbed “Kenshin,” is set to deliver a similar level of improvement this time around.
http://www.extremetech.com/wp-conten..._o-640x360.jpg
By far, the biggest impact here is how Sony will be handling cloud saves for PlayStation Plus subscribers. Instead of the measly 1GB of cloud storage space we’ve been stuck with, 10GB is the new standard. Considering that my save data for Dragon Age: Inquisition alone is 300MB, this is a godsend. As it stands, I hit the 1GB cap on the regular, and I’m forced to manually delete items if I want to upload a new save file. Even better, firmware 3.0 will offer better tools for monitoring how much cloud storage you’re using, and altering which titles will attempt to automatically upload new saves.
Now that Google is taking the gaming market seriously, Sony is adding in the ability to live stream to YouTube from your PS4. Of course, you’ll still be able to stream to Ustream and Twitch, so diehards of those services won’t be negatively impacted. While this is certainly a smart move, I’d also like to see Sony add support for smaller platforms like Hitbox.
Historically, Sony hasn’t had a particularly strong grip on the social aspects of online gaming. With Kenshin, the PlayStation team is attempting to address some of the more frustrating elements of communicating and playing online. “Favorite groups” will offer easy access to your favorite PSN friends, so you won’t have to scroll through your entire friends list to find them. If you’re in need of new online friends to play with, the new “communities” feature will help you connect with new people. And if you can’t be bothered to communicate with your voice, or even the written word, you’ll now be able to send stickers back and forth.
Firmware 3.0 has a few other nice touches, but they’re relatively minor. You’ll be able to send up to 10 seconds of gameplay footage directly to Twitter, and you can now badger your friends to start live streaming with the new “request to watch” feature. The “What’s New” and “Live from PlayStation” features are getting a fresh coat of paint, but there’s nothing here to get excited about — just some quality-of-life improvements here and there.
The closed beta began earlier this week, but Sony has remained silent regarding the public release date. However, it’s safe to assume that we’ll see this update hit later this year unless the beta testers find a catastrophic showstopper.
more...
-
Asynchronous compute, AMD, Nvidia, and DX12: What we know so far
http://www.extremetech.com/wp-conten...NV-640x353.jpg
Ever since DirectX 12 was announced, AMD and Nvidia have jockeyed for position regarding which of them would offer better support for the new API and its various features. One capability that AMD has talked up extensively is GCN’s support for asynchronous compute, which allows all GPUs based on AMD’s GCN architecture to perform graphics and compute workloads simultaneously. Last week, an Oxide Games employee reported that, contrary to general belief, Nvidia hardware couldn’t perform asynchronous compute and that the performance impact of attempting to do so was disastrous on the company’s hardware.
This announcement kicked off a flurry of research into what Nvidia hardware did and did not support, as well as anecdotal claims that people would (or already did) return their GTX 980 Ti’s based on Ashes of the Singularity performance. We’ve spent the last few days in conversation with various sources working on the problem, including Mahigan and CrazyElf at Overclock.net, as well as parsing through various data sets and performance reports. Nvidia has not responded to our request for clarification as of yet, but here’s the situation as we currently understand it.
Nvidia, AMD, and asynchronous compute
When AMD and Nvidia talk about supporting asynchronous compute, they aren’t talking about the same hardware capability. The Asynchronous Command Engines in AMD’s GPUs (between two and eight, depending on which card you own) are capable of executing new workloads at latencies as low as a single cycle. A high-end AMD card has eight ACEs, and each ACE has eight queues. Maxwell, in contrast, has two pipelines, one of which is a high-priority graphics pipeline. The other has a queue depth of 31 — but Nvidia can’t switch contexts anywhere near as quickly as AMD can.
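For reference, this is what asynchronous compute looks like from the API side: DirectX 12 lets an engine create a dedicated compute queue alongside the graphics queue and submit work to both. Here’s a minimal C++ sketch (error handling omitted; the function name is ours). Note that the API only expresses the opportunity for overlap; whether the two queues actually execute concurrently is up to the GPU and driver, which is the crux of the AMD/Nvidia difference:

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create a graphics (direct) queue and a separate compute queue on the same
// device. Command lists submitted to the compute queue *may* run concurrently
// with graphics work; DX12 exposes the queues, but concurrent execution
// depends entirely on the hardware scheduler underneath.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;   // compute + copy only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
}
```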
http://www.extremetech.com/wp-conten...on-640x251.png
According to a talk given at GDC 2015, there are restrictions on Nvidia’s preemption capabilities. Additional text below the slide explains that “the GPU can only switch contexts at draw call boundaries” and “On future GPUs, we’re working to enable finer-grained preemption, but that’s still a long way off.” To explore the various capabilities of Maxwell and GCN, users at Beyond3D and Overclock.net have used an asynchronous compute test that evaluates the capability on both AMD and Nvidia hardware. The benchmark has been revised multiple times over the week, so early results aren’t comparable to the data we’ve seen in later runs.
more...
-
AMD announces comprehensive graphics reorganization as investor rumors swirl
http://www.extremetech.com/wp-conten...re-640x353.png
Two major pieces of AMD news have crossed the wire today, and both could be good news for the struggling chip company. First, AMD is announcing a major reorganization of its graphics division. The entire graphics team will now be headed by Raja Koduri, including all aspects of GPU architecture, hardware design, driver deployments, and developer relations. Koduri left AMD for Apple in 2009, only to return to the company in 2013. Since then, he’s served as the Corporate Vice President of Visual Computing.
Now, Raja is being promoted to senior vice president and chief architect of AMD’s entire graphics business (dubbed the Radeon Technologies Group). In this new role, Koduri will oversee the development of future console hardware, AMD’s FirePro division, the GPU side of APUs, and all of AMD’s graphics designs on 14/16nm. Bringing all of these elements under one roof, along with developer relations and driver development, will allow AMD to unify its approach to various products that have previously been managed by different departments. This could pay significant dividends in areas like driver management and feature updates, which have previously been handled by other teams that reported to different managers. Koduri is well-respected in the industry and we’ve heard that the R9 Nano, which debuts in the very near future, was a project he championed at AMD.
more...
-
AMD R9 Nano review: Stellar performance in a pint-sized graphics card
http://www.extremetech.com/wp-conten...e1-640x353.jpg
At E3 this past summer, AMD announced four new GPUs that would make up its next-generation product family: the Radeon Fury X, the Fury, the Fury Nano, and the as-yet-unreleased dual-GPU Fury. We’ve covered each launch in turn, but the R9 Nano (AMD opted to shorten the name) is arguably the pinnacle of the entire Fury family. It isn’t expected to match the Radeon Fury X’s performance, but it’s even shorter than the Fury X’s 7.5 inches. At just six inches, the Nano is exactly the same size as a PCIe x16 slot. Any shorter, and AMD would’ve had to compromise on bus bandwidth.
http://www.extremetech.com/wp-conten...o1-640x353.jpg
The Radeon Nano
In person, the card is every bit as impressive as it looks in pictures. At six inches long, it’s actually shorter than the first 3D accelerator I ever bought, the Diamond Monster II. GPUs have gotten larger and smaller over the years, but I don’t ever recall AMD or Nvidia fielding a GPU this small at the high end of the product stack. This opens up all manner of interesting system possibilities for mini-ITX or small form factor enthusiasts, and that’s the audience that AMD wants to target with this card.
http://www.extremetech.com/wp-conten...TX-640x423.jpg
The Radeon Nano in a mini-ITX system.
One note: We don’t currently have data to share on how the Nano performs specifically in a mini-ITX configuration. The chassis that we ordered for the review shipped late and missed its plane. As a result, our Coolermaster Elite 110 is currently sitting in Newark, NJ. For now, we’ve decided to review the card against both the GTX 970 Mini and its full-size cousins and competitors from AMD and Nvidia in a full-size ATX chassis. AMD may have a different market in mind for this GPU, but there’s no reason it can’t be used in a full-size chassis — and users interested in purchasing one may want to know its performance characteristics in different thermal environments.
A tiny GPU with a steep hill to climb
The Nano is the smallest high-end GPU we’ve ever tested, and the card’s cooler and heatsink are clearly high-class, but the price gap between the R9 Nano and its closest competitor is nothing to sneeze at. The GTX 970 Mini is a $355 GPU, going up against a $649 Nano. That’s a high bar to clear, and if you expect the Nano to justify its price strictly on performance grounds, you’re going to be disappointed. The Nano is more complicated than that, for reasons we’ll discuss.
more...
-
The new Apple TV is a big deal, but it’s not a gaming console
http://www.extremetech.com/wp-conten...V-640x353.jpeg
Earlier today, Apple unveiled the new Apple TV at its big September event. With faster internals, a new operating system, and the availability of third-party apps, this is a big step forward for Apple in the living room. There’s a ton of potential for consumers and developers alike. But it’s never going to be the game console you really want.
Built with a custom Apple A8 system-on-a-chip, 802.11b/g/n/ac WiFi support, Bluetooth 4.0, 1080p video output, and Dolby Digital 7.1 audio output, this $149 device is significantly more versatile than the previous Apple TV model. And now that it finally supports native third-party applications via a dedicated app store, this set-top box is going to deliver a lot more than just streaming video.
http://www.extremetech.com/wp-conten...V-300x260.jpeg
The primary way in which you’ll be interacting with the new Apple TV is the “Siri Remote.” With a built-in touchpad and microphone, this tiny new controller is a lot more capable than the old D-pad style remote. Swipe through menus, ask Siri for recommendations, and start watching whatever tickles your fancy. Even though we’ve seen similar functionality in other devices, this seems like an incredibly slick solution that will resonate with the average consumer.
Personally, I’m excited to see what this does for the set-top box market. I use numerous devices to stream video (including the existing Apple TV), and a large influx of custom apps could make my television-watching experience even better. However, I know for sure that this will never offer the gaming experience I’m looking for.
First and foremost, the hardware in this 3.9-inch device simply doesn’t pack the same horsepower as a real gaming console. Considering that the $400 consoles on the market already have a hard time reaching 1080p60, imagine how poorly a major modern console game like The Witcher 3 or Metal Gear Solid V would run on this tiny little box. And since the premium $199 model only has 64GB of storage, there’s no hope for large-scale games on this platform. There’s nothing keeping game streaming solutions like PlayStation Now from coming to the Apple TV, but that’s a completely different conversation.
Second, it doesn’t ship with a real controller. Sure, you can use the Siri Remote or your iPhone to control casual games like Crossy Road, but traditional “core games” need thumb sticks and numerous buttons. Games like Fibbage or Tiny Wings work perfectly using a phone as a controller, but you’ll have a terrible experience with the likes of a first person shooter. Even though third-party Bluetooth controllers are an inevitability, only a tiny fraction of Apple TV owners will have those niche gamepads.
Third and finally, Apple just doesn’t care enough about gaming to take on the existing players. Sure, it’ll continue to pay lip service to game developers, but we’ll end up with more Candy Crush or Clash of Clans — not something the enthusiast crowd will care about. In the end, the Apple TV simply isn’t competing directly against the PS4 and Xbox One. It’s a console that can play some games — not a gaming console.
more...
-
Nvidia recalls Shield Pro console over hard drive woes
http://www.extremetech.com/wp-conten...re-640x353.jpg
Nvidia has announced a replacement program for its Shield Pro console, though the company claims only a tiny fraction of the shipped devices are actually affected. According to the company, this recall only affects Shield Pro users who bought the 500GB version of the console, not standard users with the 16GB flavor. According to Nvidia, devices that need to be replaced are those which exhibit the following characteristics (screenshot from Nvidia’s GeForce forum):
http://www.extremetech.com/wp-conten...ll-640x420.png
Nvidia hasn’t given any further information on what caused the issue, but has promised to replace the defective units for customers, or to work with them to secure RMAs through other retailers where it is able to do so. The company claims that less than 1% of all units have been impacted, but also notes that the issue can “worsen over time.” If you’re seeing issues with graphical corruption or unusual pauses today, in other words, you’d best deal with the problem now before things get worse.
The timing on this particular recall is a bit awkward, given that NV prominently called out the Apple TV in a recent blog post, aggressively positioning Shield as the best all-in-one microconsole and dismissing the Apple TV as only good for people who play “Crossyroad” [sic]. (This appears to be a misspelled reference to the mobile game “Crossy Road,” which debuted for iOS late last year.)
http://www.extremetech.com/wp-conten...ad-640x562.jpg
more...
-
Jim Keller, AMD’s chief CPU architect, leaves the company
http://www.extremetech.com/wp-conten...rs-640x353.jpg
Earlier this week, we reported on rumors that AMD’s Zen might have slipped into Q4 2016. Since then, we’ve heard the chip could actually launch in the Q1 2017 timeframe — and now there’s further reason to think that something happened to AMD’s next-generation CPU timetable. Jim Keller, who returned to AMD to helm its new CPU after a stint with Apple, has reportedly left the company to “pursue other opportunities.”
Keller wasn’t just responsible for the Zen CPU architecture; he was also leading the team that designed AMD’s still-upcoming ARM-based K12 CPU, which isn’t expected to launch until 2017. AMD sought to downplay the impact of this announcement and told Hexus.net that “Jim’s departure is not expected to impact our public product or technology roadmaps, and we remain on track for ‘Zen’ sampling in 2016 with first full year of revenue in 2017.” Mark Papermaster will now step in and head Keller’s team.
An uncertain impact
The knee-jerk way to read this announcement is that Jim Keller was fired because Zen is coming in 6-7 months late. That’s entirely possible, and it wouldn’t be the first time an AMD executive left to pursue “other opportunities” for reasons that only became clear months after they were gone. Dirk Meyer’s departure as CEO didn’t make much sense at the time, and it was widely reported that he was forced out over disagreements related to the tablet and mobile markets. Later, once Bulldozer had hit store shelves, it became clear that tablets and mobile products hadn’t been the only problem.
http://www.extremetech.com/wp-conten...AMD-Zen-01.png
AMD’s Zen CPU
When you consider the differences between the AMD that Keller came back to in 2012 and the AMD he left in 2015, there’s no shortage of factors that might have caused a break-up. In 2012, AMD was clearly planning to enter the ARM market and launch its own custom ARM core (and Keller’s most recent expertise was in ARM SoCs, not x86 processors). In 2015, the K12 and Cortex-A57 CPUs that Sunnyvale once championed scarcely warrant a mention.
http://www.extremetech.com/wp-conten...86-640x360.jpg
Skybridge went from mission-critical to dust in less than a year
As recently as 2014, AMD had a public roadmap for a common socket platform that would bridge its x86 and ARM cores, along with an HSA-enabled version of the Jaguar architecture that might have helped plug the holes in AMD’s roadmap between now and Zen’s launch in 2017. By 2015, those plans had been canceled. The recent graphics reorganization and rumors of substantial private equity investments could be further signs that AMD’s new focus isn’t what Keller signed on to shepherd, and that he decided to pursue other opportunities without that being evidence of a substantial problem in AMD’s product pipeline.
One thing we’ve heard from multiple knowledgeable sources is that Zen is finalized. We don’t know if the architecture has taped out or not, but at least the vast majority of the work is already complete. I’m reminded of a quote attributed to Robert Palmer, the ex-CEO of Digital. “Designing microprocessors is like playing Russian roulette. You put a gun to your head, pull the trigger, and find out four years later if you blew your brains out.”
Keller’s departure will not be well-received. Here’s hoping it had more to do with differences over the company’s focus than with Zen itself. AMD has run out of time to put its own house in order.
more...
-
Nvidia launches overclock-friendly GTX 980 desktop GPU for gaming laptops
http://www.extremetech.com/wp-conten...Ti-640x353.jpg
Earlier this month, we covered Asus’ insane water-cooled laptop and the appearance of a new Nvidia GPU that looked for all the world like a GTX 980 desktop chip in a mobile form factor. Today, Nvidia is taking the lid off that particular project — and it’s even crazier than we thought it was. Putting a GTX 980 in a notebook chassis is a bit nuts, even if these systems are only portable in the same way that a five-gallon bucket is portable.
The handful of laptops now shipping with full-size GTX 980 graphics cards will allow those cards to be overclocked, opening up still more performance alongside the unlocked Intel Skylake-K chips that Chipzilla is now shipping in mobile as well. None of the specs are skimped on — the GPU’s base core clock is 1126MHz, identical to the standard desktop card, while its GDDR5 memory bus is clocked at an effective 7Gbps. Maximum clock frequencies can supposedly be pushed an additional 200MHz, though that’s going to depend a great deal on the cooling solution of the system.
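Those memory figures imply the same peak bandwidth as the desktop part. A quick sanity check (our arithmetic), assuming the desktop GTX 980’s 256-bit memory interface, which the “no skimped specs” claim implies carries over:

```cpp
#include <cstdio>

// Peak memory bandwidth = per-pin data rate x bus width. At 7Gbps effective
// on a 256-bit bus (the desktop GTX 980's interface), that works out to the
// desktop card's familiar 224GB/s.
int main() {
    const double gbitsPerPin = 7.0;    // effective GDDR5 data rate
    const int busWidthBits   = 256;    // GTX 980 memory interface
    double gbytesPerSec = gbitsPerPin * busWidthBits / 8.0;
    std::printf("Peak bandwidth: %.0f GB/s\n", gbytesPerSec);
}
```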
http://www.extremetech.com/wp-conten...NV-GTX980M.jpg
Other features coming to these new systems include variable fan controls, 17-inch 1080p panels, and 4-8 phase power supplies with higher peak currents. The end result should be some truly amazing notebooks from the likes of Asus, MSI, and Clevo. MSI even intends to introduce an SLI version of the laptop with an 18.4-inch screen.
Of notebooks and cooling
It’s worth pausing for a moment and considering the throttling situation in modern notebooks. If you’ve never owned a high-end boutique laptop or had occasion to test one, it’s easy to think that these are the systems you buy when you want maximum performance with minimal-to-zero throttling. You might think that — but in many cases, you’d be wrong. While I haven’t reviewed every high-end laptop ever built, it’s been my observation that systems that can handle their own heat output are few and far between. Systems that can handle their heat output without either turning into banshee-possessed wind tunnels or melting whatever surface you happen to set them on are even rarer.
http://www.extremetech.com/wp-conten...21-640x353.jpg
Asus’ upcoming water-cooled system
The problem many boutique builders face is that high-end consumers want the fastest-sounding processor without any regard for what will happen if you try to run that particular chip in a chassis for more than 60-120 seconds. Sure, a high-end CPU core may burst to 3.7GHz initially, but once the chip overheats, you’ll be back in 2.5GHz territory. These kinds of issues are why Intel’s lowest-end Core M processors were measured outperforming the highest-end Core M chips earlier this year.
When I say that boutique systems typically throttle, I don’t mean that they throttle in Prime95 + FurMark when left sitting on a bed for an hour. I mean they tend to throttle within minutes, to a greater or lesser degree. Not every laptop has this problem (I’ve been very pleased with the performance of Asus’ recent offerings), but many do, including those from major brands. It’s caused by boutique manufacturers being willing to slam their foot down on the gas pedal, even when doing so will harm the customer’s experience.
That doesn’t mean that vendors like MSI, Asus, and Alienware can’t build great gaming laptops, even with this hardware — but I’d definitely read reviews, when they eventually appear, with a careful eye.
One final note: This GPU is apparently being called the GTX 980 for mobile as opposed to the GTX 980M, or any kind of Titan branding, as we originally theorized. If you go looking for it, make sure you pick the appropriate GPU. We reached out to Nvidia, who told us that this GTX 980 mobile card will also allow for higher operating temperatures than what we’ve seen with mobile GPUs in the past — presumably that means 90-95C as opposed to the 70C that most mobile GPUs seem to target.
more...
-
This Star Trek: The Next Generation Enterprise-D VR tour is a killer Unreal Engine 4 demo
http://www.extremetech.com/wp-conten...-D-640x353.jpg
As VR headsets like the Oculus Rift move closer to production, we’ve started to see more software showcasing what the format can achieve. In some cases, that’s been courtesy of expensive tech demos from existing studios — but sometimes, it’s dedicated fans that produce their own impressive work. A recent tour of Star Trek’s Enterprise-D is making the rounds, courtesy of Unreal Engine 4 and, in the future, the Oculus Rift.
The goal of the project, according to the author, is to “Place you on board the ship. Walk the corridors. Explore the unseen rooms and communal areas. Hear the engines. See the screens. Fly the shuttlecraft. Exist as a member of the crew.” As the 12-minute YouTube demo makes clear, this is no idle boast — the tech demo takes you from landing in the main shuttlebay on a tour of multiple areas of the saucer section, including a theorized “Two-Forward” lounge and the bridge itself.
If you ever wondered how some of the sets on the ship connected together, or, more pragmatically, where the actual bathrooms were, this tour answers it. There are no loading screens or other immersion-breaking issues, and while the current version of the project isn’t running on Oculus hardware yet, the author intends to bring the project to VR hardware as soon as it’s available. As he notes, “Previous virtual tours have not gone far enough. They usually are 360 degree panoramic, without actual movement or immersion.”
http://www.extremetech.com/wp-conten...05-640x360.png
Work in progress
Source materials for the project include Rick Sternbach’s official blueprints and Ed Whitefire’s unofficial blueprints (which are, some apparently argue, more in line with some of Andrew Probert’s original designs), as well as some minor creative license on the part of the author to model things in-game that may have been inconsistently modeled or approached on-screen. That last isn’t uncommon — Ex Astris Scientia has a huge section on inconsistencies and errata in Star Trek models and designs.
http://www.extremetech.com/wp-conten...erprise-D2.jpg
These are the voyages…
The long-term future of the project could be complicated, since it’s not a licensed production and the appropriate legal powers that be haven’t agreed to allow this kind of creation. At the same time, however, Paramount has shown precious little interest in actually creating new licensed Star Trek TV shows or games that would revisit this space. Once upon a time, ST:TNG was a trailblazing franchise in digital entertainment — the Next Generation Technical Manual CD-ROM was an early CD-ROM title, and Star Trek: Borg made heavy use of then-popular FMV sequences. With the exception of some awful movie tie-in products, the last decent Star Trek video games are a decade old or more.
more...
-
Fable Legends: AMD and Nvidia go head-to-head in latest DirectX 12 benchmark
http://www.extremetech.com/wp-conten...re-640x353.jpg
As DirectX 12 and Windows 10 roll out across the PC ecosystem, the number of titles that support Microsoft’s new API is steadily growing. Last month, we previewed Ashes of the Singularity and its DirectX 12 performance; today we’re examining Microsoft’s Fable Legends. This upcoming title is expected to debut on both Windows PCs and the Xbox One and is built with Unreal Engine 4.
Like Ashes, Fable Legends is still very much a work-in-progress. Unlike Ashes of the Singularity, which can currently be bought and played, Microsoft chose to distribute a standalone benchmark for its first DirectX 12 title. The test has little in the way of configurable options and performs a series of flybys through complex environments. Each flyby highlights a different aspect of the game, including its day/night cycle, foliage and building rendering, and one impressively ugly troll. If Ashes of the Singularity gave us a peek at how DX12 would handle several dozen units and intense particle effects, Fable Legends looks more like a conventional first-person RPG or FPS.
http://www.extremetech.com/wp-conten...e2-640x360.jpg
There are other facets to Fable Legends that make this a particularly interesting match-up, even if it’s still very early in the DX12 development cycle. Unlike Ashes of the Singularity, which is distributed through Oxide, this is a test distributed directly by Microsoft. It uses the Unreal 4 engine — and Nvidia and Epic, Unreal’s developer, have a long history of close collaboration. Last year, Nvidia announced GameWorks support for UE4, and the UE3 engine was an early supporter of PhysX on both Ageia PPUs and later, Nvidia GeForce cards.
Test setup
We tested the GTX 980 Ti and Radeon Fury X on the latest build of Windows 10. Our testbed was an Asus X99-Deluxe motherboard with a Core i7-5960X and 16GB of DDR4-2667 memory. We tested the Fury X with an AMD-provided beta driver and the GTX 980 Ti with Nvidia’s latest WHQL-approved driver, 355.98. Nvidia hasn’t released a beta Windows 10 driver since last April, and the company didn’t contact us to offer a specific driver for the Fable Legends debut.
http://www.extremetech.com/wp-conten...e3-640x360.jpg
more...
-
Oculus demos $99 Gear VR in push for mainstream virtual reality
http://www.extremetech.com/wp-conten...-1-640x361.jpg
There were a number of important announcements at the Oculus Connect 2 conference this week. First, Samsung’s Gear VR is expected to be on store shelves by the end of November, with support for a wider range of Android handsets. The first version of the device was only compatible with the Galaxy Note 4, which naturally limited its appeal. This new flavor will go on sale for $99 (half the cost of the original) and will be compatible with the Note 5, S6, S6 Edge, and S6 Edge Plus. The strange thing — and I’m not kidding, I find this more than passingly odd — is that the final version of the Gear VR apparently isn’t compatible with the Note 4.
Now, granted, you could argue that Samsung already rewarded last year’s group of customers with the hardware they paid for, but Samsung is claiming that the final Gear VR is 22% lighter and much more comfortable to wear for extended periods. You’d think that the customers the company would most want to support would be those who shelled out $200 for last year’s model, but apparently that’s not the case. Oculus is also bringing a new arcade experience to Gear VR, billed as a novel way to play classic arcade games in a more immersive environment. Hopefully this doesn’t involve pumping virtual quarters into slots, much less the $2-$3 per play that most arcades charged when I last visited.
http://www.extremetech.com/wp-conten...er-640x257.png
If you’ve been trying to wrap your head around what kind of hardware you need for an Oculus-ready experience, meanwhile, the company has heard you and wants to make the entire buying experience less painful. Oculus is going to develop a new “Oculus ready” brand for inclusion on systems that meet a certain standard of quality. Exactly what this will look like is unclear, but everything we’ve seen suggests that the hardware will need to be quite powerful to deliver cutting-edge performance in VR. In traditional games, players can often tolerate frame rates in the 30-50 range without nausea, but low frame rates or erratic frame timing are both the kiss of death for VR — unless you like your virtual reality with a slice of very real-world nausea.
Finally, there’s Oculus Touch. We haven’t previously talked much about Oculus Touch (above), beyond a brief description of the initial demo earlier this summer. This system is designed to give users tactile feedback in VR environments; Gizmodo published an incredibly positive hands-on review of the technology earlier this year. This is where the bad news comes in — while the Oculus Rift is still expected to ship in early 2016, Oculus Touch won’t be part of the system. The Rift is set to debut in Q1 2016, with Touch delayed until Q2 2016. As for hard launch dates for software or game conversions, information is still a bit fuzzy. Expect the VR space to take some time to really get kicking — there’s a lot of heavy lifting still to be accomplished.
more...
-
Sony to skip PlayStation Vita 2, blames mobile gaming for handheld’s decline
http://www.extremetech.com/wp-conten...ta-640x353.jpg
Earlier today, we covered how the PlayStation TV can be hacked to play Vita titles. Today, Sony executive Shuhei Yoshida, president of Sony Computer Entertainment Worldwide Studios, more-or-less confirmed that Sony was planning to exit the dedicated handheld business once the Vita reaches the end of its lifespan. When asked about the possibility of a follow-up to Sony’s PSV, Yoshida noted that mobile gaming has created a tough climate for handhelds and called the possibility of a successor a “tough question.”
Yoshida put the blame on the general rise of smartphone gaming, the advent of free-to-play titles, and the fact that handhelds have different hardware control schemes that simply don’t translate well to modern touch-based smartphones, Eurogamer reports. Of these points, the last is definitely true — games that try to ape the functionality of a joystick or buttons by providing virtual touch-based interfaces are often difficult to control, and reserving screen space for a joystick chews up valuable real estate.
There’s no doubt the advent of smartphones created a challenging environment for handheld gaming, but I’m not convinced iPhones and Android are entirely to blame. When Sony announced in June 2013 that the PlayStation Vita would have a new feature, Remote Play, that allowed it to stream games from the PS4, sales of the Vita began to spike ahead of the PS4 launch. As this chart from VGChartz illustrates, PSV sales exploded from October to December, 2013. The Vita has sold 12.26 million units since it launched — and moved nearly 15% of them in those three months.
http://www.extremetech.com/wp-conten...re-640x269.jpg
Chart courtesy of VGChartz. Click to enlarge
Clearly, the problem wasn’t with the Vita hardware, which always held up well in comparison with Nintendo’s handhelds. Nor was it an issue of an intrinsically limited market. If it were, Nintendo’s 3DS would never have broken the 50-million mark. While that’s just a fraction of the Nintendo DS’s lifetime sales, the DS was produced for a decade, while the 3DS is just 4.5 years old. It may never reach the DS’ sales volume, but it should have no trouble racking up another 10-20 million units over the course of its life.
There are multiple reasons, none related to mobile gaming, why Sony’s Vita sits at 12 million units shipped compared with 53 million for Nintendo. Remote Play was billed as a late-launching Vita feature, but it has always had asterisks attached. While it works, the Vita doesn’t have an identical set of inputs to the PS4’s controller, which means certain functions are emulated using the rear touch panel. Lag is also a common problem unless you’re sitting on top of the PS4.
The problem isn’t just Remote Play’s lackluster implementation. From the beginning, Sony has gouged users for memory cards ($100 for a 32GB Vita-compatible card, instead of $18 for a standard model), offered underwhelming ports, and published just a handful of titles relative to Nintendo. Nintendo published 35 of the top 50 games for the 3DS, as measured by total sales; Sony published just 13 of the top 50 Vita titles. The bestselling title for the Vita, Uncharted: Golden Abyss, moved 1.47 million copies worldwide — which would put it at 26th place on the 3DS chart.
more...
-
Sony PlayStation TV hacked to restore Vita compatibility
http://www.extremetech.com/wp-conten...V1-640x353.jpg
When Sony launched its PlayStation TV in the US last year, the device debuted to mixed reviews and a puzzling lack of compatibility. While it’s fundamentally based on the same hardware as the PlayStation Vita, the PS TV can’t play many games that the Vita runs quite handily.
In some cases, this makes sense — the Vita has a microphone, camera, and gyroscope, all of which the PS TV lacks. In many other cases, however, there’s no intrinsic reason why the Vita can play a game that the PS TV can’t. Both systems are based on a quad-core ARM Cortex-A9, both have 512MB of RAM and 128MB of VRAM, and both use a PowerVR SGX543MP4+ GPU. Now, a modder has found a way to unlock a number of games for the PlayStation TV, removing some of these roadblocks.
The hack, provided by HackInformer poster Mr.Gas, shows how it’s possible to exploit a loophole in the Vita’s email client to load a modified whitelist into the client and open up an entirely new set of games. Full step-by-step instructions are available on HackInformer, and the list of games that can be played successfully is likely to skyrocket with this new method — at least, until Sony updates the firmware and breaks it.
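Sony hasn’t documented the gating mechanism, but the behavior described is consistent with a simple title-ID whitelist consulted at launch time. A purely hypothetical C++ sketch; the function and the string title IDs are our illustration, not reverse-engineered firmware:

```cpp
#include <string>
#include <unordered_set>

// Hypothetical whitelist gate. If the firmware consults a list like this
// before launching a title, then swapping in a modified list (as the
// email-client exploit does) is enough to unlock games the hardware could
// always run -- no per-game patching required.
bool canLaunch(const std::string& titleId,
               const std::unordered_set<std::string>& whitelist) {
    return whitelist.count(titleId) != 0;
}
```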
http://www.extremetech.com/wp-conten...TV-640x363.jpg
What’s less clear is why users have to resort to this kind of method for loading games in the first place. Anyone who buys a PlayStation TV has either purchased game cartridges (which are compatible with either the PS TV or the Vita) or has downloaded games over the PlayStation Network. There’s no “lost sale” here, in other words, and by keeping games off the platform, Sony is only harming those who prefer to play Vita titles on the big screen, or who want some of the additional set-top functionality that the PS TV offers, like Netflix support.
It’s entirely possible that the whitelist approach was implemented to mollify Sony’s mobile team, which doesn’t want to acknowledge that the PlayStation Vita is moribund. VGChartz shows that the 3DS regularly outsells the Vita by 4x or more per month — and while an install base of 12 million devices isn’t terrible, it’s nowhere near Nintendo’s 50 million 3DS sales.
VitaReviews has published an exhaustive list of every current title now known to run on the PS TV, but some caveats are attached. Games that require motion controls won’t work. Games that require touch may or may not be emulated. Some games still don’t work, even with this exploit installed. You can’t connect to the PSN network, and it’s not clear if rebooting the device also requires you to apply the patch again.
more...
-
10 fixes the Sony PlayStation 4 (PS4) desperately needs
http://www.extremetech.com/wp-conten...4-640x353.jpeg
By all accounts, the PlayStation 4 has been a huge success. It’s selling extremely well, it regularly outsells the Xbox One, and it’s become the de facto standard for third-party releases (much like the PS2 or Xbox 360 in their heyday). Even so, it’s far from perfect. I’ve used the PS4 nearly every day since it launched in 2013, and familiarity has most definitely bred contempt.
Today, I’ll be highlighting 10 of the most frustrating faults with the PS4, and offering up solutions to fix the issues. And even though there’s going to be a lot of complaining here, this isn’t an indictment of the platform altogether. After all, I wouldn’t be using it if I didn’t enjoy it. With that in mind, let’s jump in.
http://www.extremetech.com/wp-conten...es-640x360.jpg
Increase cloud storage for everyone
For most of the PS4’s existence, the cloud save situation has been rough. PlayStation Plus subscribers get a gig of cloud storage, and those who don’t pony up for a subscription get nothing. Starting with system software 3.0, PS+ users will get 10GB of storage, but everyone else still gets nothing.
Going forward, Sony needs to loosen up, and stop being so stingy with the cloud storage space. Truth be told, I think a Dropbox or iCloud model would work well here. Non-paying members should get a gig or two, and paying members should effectively have unlimited save space on Sony’s servers. At this point, cloud storage is so cheap, it’s actually a little embarrassing that Sony is still so tight-fisted.
http://www.extremetech.com/wp-conten...te-640x360.jpg
Allow everyone to update in rest mode
Under the current scheme, only PlayStation Plus subscribers can download app updates while the PS4 is in rest mode. That means if you’re unwilling to drop an additional 50 bucks a year for PS+, you’re stuck twiddling your thumbs while your games and apps update. It’s an absurd limitation.
PlayStation Plus offers online multiplayer, free games every month, and some really fantastic discounts. It’s a good service, and plenty of people are willing to buy in for those benefits. However, rest mode updates are a simple quality-of-life feature built in at the OS level — they don’t use any significant resources on Sony’s end. I’m not opposed to all paywalls for premium features, but this specific paywall feels wrong in every way.
http://www.extremetech.com/wp-conten...CP-640x360.jpg
more...