-
The Oculus Rift’s $599 price tag isn’t as bad as it sounds
http://www.extremetech.com/wp-conten...er-640x353.jpg
After years of waiting, the finalized retail version of the Oculus Rift is ready for pre-order. If you decide to buy now, you’ll get the headset, the sensor, a remote control, an Xbox One controller, a copy of Lucky’s Tale, and a copy of EVE: Valkyrie for a whopping $599. Unsurprisingly, that hefty asking price hasn’t gone over well with the enthusiast crowd. Comparisons with the overpriced 3DO and PS3 launches are popping up everywhere, but these strong negative reactions don’t necessarily spell doom for VR or the Oculus Rift. Truth be told, this kerfuffle might just make the VR market stronger.
The first group of orders will start shipping on March 28th in the initial 20 markets, but the ship date for new pre-orders has already slipped to June. Even if you’re willing to pony up $599 right now, you’re probably not going to be getting your Oculus Rift until some time in Q2. The limited supply is disappointing, but it’s not entirely surprising. This isn’t a massive consumer-focused launch — it’s more of a slow rollout to the eager early adopters with deep pockets.
http://www.extremetech.com/wp-conten....1-640x360.jpg
Earlier this week, Oculus surprised us all by announcing that everyone who pre-ordered the Oculus dev kit by backing the Kickstarter in 2012 would be getting the finalized retail hardware at no additional cost. It’s a nice gesture, but most of that goodwill seems to have gone out the window.
The negative reactions across social media have been loud, but it seems this is a failure of Oculus’s messaging — not a true indictment of expensive first-gen technology. In May of 2015, Oculus said that you’d need to spend about $1,500 for an Oculus Rift and a gaming rig capable of driving it smoothly. This $599 price tag definitely fits within that estimate, yet even the enthusiast crowd seems shocked that the final hardware is so expensive. If Oculus had simply specified the $599 asking price a few months back, it would have given us the reality check we so desperately needed.
However, there is a silver lining: Oculus’s competitors will almost certainly try to exploit the sticker shock. The HTC Vive will be launching very soon, and it’s likely to be in the same ballpark, but other companies could use this opportunity to get a foothold in this brand new market.
https://www.youtube.com/watch?v=kWSIFh8ICaA
And if Oculus starts feeling severe pressure from the competition, that pressure will help drive the Rift’s price down as soon as it’s financially feasible.
more...
-
Confirmed: GlobalFoundries will manufacture AMD’s mobile, low-power Polaris GPUs
http://www.extremetech.com/wp-conten...s1-640x353.jpg
When AMD unveiled its new Polaris architecture, there was still some question as to where the new graphics processors would be built. Historically, AMD has built GPUs with TSMC and used GlobalFoundries for its CPUs and higher-end APUs, which use integrated graphics. GlobalFoundries has now officially confirmed that this is changing at 14nm. In a statement, the company said it will fab chips based on Polaris for a number of applications and scenarios, “including thin and light gaming notebooks, small form factor desktops, and discrete graphics cards with lower power demands.”
This appears to confirm our previous hypothesis: AMD is tapping GF to fab its mobile designs and low-power desktop cards, but building higher-power architectures with TSMC. This also makes sense given what we know about the two foundries’ respective processes: GF’s technology is based on Samsung’s 14nm LPP (Low Power Process), and while it can perform some customization work to validate higher TDPs, AMD’s Zen CPUs are expected to top out around 95W TDP. GPUs, in contrast, can reach much higher values. It’s not uncommon for high-end graphics cards to hit 250W, and ultra-high-end cards can break 300W.
more...
-
Prominent cracking group claims game piracy could be dead within two years
http://www.extremetech.com/wp-conten...te-640x353.jpg
For as long as games have had copy protection, there have been ways to break said copy protection. In the earliest days of gaming, games often required you to enter a random word or sentence from the manual, or look up a code sequence. For more than 30 years, game crackers and developers have fought a never-ending war with each other. At best, DRM has delayed a crack by a handful of days or a week. Most of the time, it hasn’t even managed that.
According to a prominent cracking group (as first reported by TorrentFreak), that may finally be changing. The Chinese cracking group 3DM has a great deal of expertise in breaking game DRM, including last year’s Dragon Age: Inquisition. The group’s founder, Bird Sister (aka Phoenix), recently stated that the team has had no luck cracking the latest version of Denuvo, a fairly niche but apparently extremely effective DRM technology.
http://www.extremetech.com/wp-conten...vo-640x354.jpg
3DM may have won previous rounds, but Denuvo is holding its own this time.
In response to the length of time it has taken to crack Just Cause 3, Bird Sister stated the following: “Recently, many people have asked about cracks for Just Cause 3, so here is a centralized answer to this question. The last stage is too difficult and Jun [cracking guy] nearly gave up, but last Wednesday I encouraged him to continue.
“I still believe that this game can be compromised. But according to current trends in the development of encryption technology, in two years’ time I’m afraid there will be no free games to play in the world.”
A world in which game cracks couldn’t exist would be a drastic departure from the past 30-40 years, and there would be some significant unintended consequences. While publishers would undoubtedly cheer the move, it would cause real problems for games that relied on online authentication servers that may no longer exist, or titles whose DRM checks caused significant performance issues. Both of these have occurred on shipping titles, and there’s no guarantee they’d stop happening if publishers perfected DRM.
Even if cracking isn’t abolished, it’s possible that it might take progressively longer for titles to be cracked post-launch. From a developer standpoint, this would be almost as good — most games generate the majority of their sales in the first few months, so a game that took six months to crack would likely earn most of the revenue it was going to earn. (Games with subscription costs or microtransaction models are an obvious exception to this.)
more...
-
AMD slashes the Radeon Nano’s price, and now it’s a killer deal
http://www.extremetech.com/wp-conten...e1-640x353.jpg
AMD’s Radeon Nano was, in many ways, the crown jewel of the Radeon Fury family. While it didn’t offer quite as much performance as the Fury X or Radeon Fury, it blew the power efficiency of both cards out of the water. It was an incredible card for small form factor PCs, packing considerable firepower into the smallest high-end GPU we’ve ever seen.
The one downside? Price. At $649, the Radeon Nano was priced against the Fury X and the GTX 980 Ti, despite not quite matching the performance of either solution. That made the GPU something of a niche offering — fabulous if you needed its tiny size, but hard to recommend as a general card. Today, that changes — AMD has slashed the Radeon Nano’s price by $150, or 23%.
http://www.extremetech.com/wp-conten...nsumption1.png
Absolute power consumption.
The price cuts bring the Radeon Nano down to $500, and at that price, it’s playing a different game. Unless AMD cuts the price on the Radeon Fury, the Nano and the Fury are now neck-and-neck — with the Nano using much less power, fitting into smaller form factors, and taking up a lot less of your case.
http://www.extremetech.com/wp-conten...1/WattsPer.png
more...
-
Nvidia’s Drive PX 2 prototype allegedly powered by Maxwell, not Pascal
http://www.extremetech.com/wp-conten...n1-640x353.jpg
When Nvidia’s CEO, Jen-Hsun Huang, took the stage at CES last week, he unveiled the company’s next-generation self-driving car platform, the Drive PX 2. According to Nvidia, its Drive PX 2 platform packs the same amount of compute power as six Titan X boards, in just two GPUs. During the show, Jen-Hsun displayed the new system — but what he showed from stage almost certainly wasn’t Pascal.
http://www.extremetech.com/wp-conten...Us-640x383.jpg
Image by Anandtech. The label is fuzzy to my eye.
As Anandtech readers noted, the hardware Jen-Hsun showed was nearly identical to the GTX 980 in an MXM configuration. The new Drive PX 2 is shown above; the GTX 980 MXM is shown below. The hardware isn’t just similar — the chips appear to be identical. Some readers have also claimed they can read the date code on the die as 1503A1 — which would mean the GPUs were produced in the third week of 2015.
http://www.extremetech.com/wp-conten.../GTX980MXM.jpg
Image by Anandtech. The GTX 980 MXM
If Nvidia actually used a GTX 980 MXM board for its mockup, it would explain why the Drive PX 2 looks as though it only uses GDDR5. While Nvidia could still be tapping that memory standard for its next-generation driving platform, this kind of specialized automotive system is going to be anything but cheap. We’ve said before that we expect GDDR5 and HBM to split the upcoming generation, but we expect that split in consumer hardware with relatively low amounts of GPU memory (2-4GB) and small memory buses. The Drive PX 2 platform sports four Denver CPU cores, eight Cortex-A57 CPUs, 8 TFLOPS worth of single-precision floating point, and a total power consumption of 250W. Nvidia has already said that it will be water-cooling the module in electric vehicles and offering a radiator block for conventional cars. Any way you slice it, this is no tiny embedded product serving as a digital entertainment front-end.
Then again, it is still possible that the compute-heavy workloads the Drive PX 2 will perform don’t require HBM. It seems unlikely, but it’s possible.
more...
-
EA’s Origin Access offers PC gamers unlimited access to select games
http://www.extremetech.com/wp-conten...in-640x353.jpg
Electronic Arts, and by extension the Origin game service, is not particularly well-liked by gamers thanks to its heavy use of microtransactions and DLC packs. Perhaps it will earn some goodwill with a new service called Origin Access. PC gamers are able to subscribe to Access for a mere $4.99 per month and get early versions of unreleased games and unlimited access to The Vault, a Netflix-style selection of slightly older games.
EA calls the early access feature “First Play Trials,” and stresses that these will be full versions of the games, not demos. This feature will be limited to EA titles, but it has the potential to save you some real cash. When a game unlocks for First Play, you can dive in and see how it suits you. If you don’t like it, no problem — just don’t buy it when it comes out. If you do enjoy the game, you can play it as much as you want until it’s released, then buy the full version and have your First Play progress carry over.
There’s only one First Play Trial listed on the Origin site right now, a platforming game called Unravel. It will cost $19.99 when it launches on February 9th. Origin Access members will be able to start playing it on February 4th, though. As an added bonus, everything you buy on Origin comes with a 10% discount for Access subscribers.
http://www.extremetech.com/wp-conten...es-640x368.png
more...
-
Intel claims its integrated GPUs now equal discrete cards
http://www.extremetech.com/wp-conten...e1-640x353.jpg
For more than a decade, the phrase “Intel integrated GPU” was synonymous with “terrible graphics solution.” The first Intel motherboard with integrated graphics, the i810, had terrible performance, even in 2D desktop work. The 2D graphics performance improved, but Intel’s 3D capabilities were more-or-less terrible until the launch of Sandy Bridge.
Since Sandy Bridge debuted, Intel has been much more aggressive about improving its 3D capabilities, generation-on-generation. AMD stole the GPU performance lead from Intel with its Llano APU back in 2011, but Intel has steadily chipped away at this advantage. AMD still holds an overall integrated-graphics advantage over Intel’s desktop CPUs (as a write-up at Anandtech makes clear), but Intel’s exact words were: “For the mainstream and casual gamer, we have improved our Iris and Iris Pro graphics tremendously. We have improved our graphics performance [by 30 times] from where it was five years ago. We believe that the performance of Intel’s integrated graphics today, what we offer in the products […], is equivalent to the performance of about 80% of discrete [GPU] installed base.”
Them’s fighting words. But is it true? As far as we can tell… no. At least, not according to Steam’s hardware survey. We ran down the list of AMD and Nvidia GPUs, counting only those cards we were certain could beat the Iris / Iris Pro in a head-to-head comparison. If Steam’s figures are accurate, AMD and Nvidia combined hold roughly 31% of the GPU market in terms of GPUs that are at least midrange discrete cards.
http://www.extremetech.com/wp-conten...PU-640x491.png
Part of what makes Intel’s claim tricky to evaluate, however, is that it hinges on how you define “mainstream and casual gamer.” If that means Farmville, or equivalent Facebook-style games, then Intel is correct. I doubt there’s much difference between playing low-end titles on an Intel chip versus AMD or Nvidia these days.
more...
-
Microsoft, Mojang unveil Minecraft: Education Edition
http://www.extremetech.com/wp-conten...ox-640x353.jpg
Microsoft announced it will be releasing an educational version of Minecraft, the popular online sandbox building game the company acquired in 2014. Minecraft: Education Edition will be released as a free trial this summer to existing Minecraft users, according to Mojang, the Microsoft subsidiary acquired for $2.5 billion in September 2014.
The new title builds on MinecraftEdu, “a version of Minecraft built for the classroom [which] has been used in over 40 countries,” Mojang said in a statement. Microsoft acquired MinecraftEdu this week, the statement said. Terms of that deal were not disclosed.
As with the public game, the updated classroom edition of Minecraft will continue to teach “essential life-skills like tree-punching and good Creeper-defense,” per Mojang, but will also feature purpose-built “lessons” to help educators instruct students in areas ranging from basic problem-solving to history, art, and STEM disciplines.
Lessons include “Redstone Lodge,” a Minecraft: Education Edition mod, which “gives the full range of electrical engineering principles, logic gates, and even music composition within Minecraft” and the bomb-shelter building mod “Anderson Shelters,” which promises to teach both World War II history and physics.
Last year, Microsoft introduced Minecraft: Windows 10 Edition Beta as the first version of the game with Redmond’s stamp on it, as part of the rollout of the software giant’s latest PC operating system. But the rollout of a Minecraft version specifically tailored for the classroom has been cooking for even longer.
more...
-
Review: The Asus G751JY-DB72 is one heck of a gaming laptop
http://www.extremetech.com/wp-conten...JY-640x360.jpg
We’ve already covered the Asus G751JY-DB72 laptop indirectly, when we reviewed mobile G-Sync and its impact on frame rates and visual quality earlier this fall. But after spending some time with this laptop, we wanted to do a more thorough review. The G751JY-DB72 isn’t perfect, and its base model has been on the market for well over a year, but don’t let that fool you — this system offers a better price/performance ratio than many more expensive systems that shipped in the past six months.
Specs and availability
Asus manufactures a number of laptops under the G751JY brand, so there are some specific attributes to be aware of if you’re interested in buying one. The particular model we’ve tested has gotten a bit tough to find at retail, and the price can vary significantly depending on whether you want the 2014 model (which shipped with 16GB of RAM but without G-Sync support) or the updated 2015 version, which has G-Sync.
The specifications on the model we tested were:
- Intel Core i7-4720HQ CPU (2.6GHz base, 3.6GHz Turbo)
- 24GB of DDR3L RAM
- 128GB SSD w/1TB HDD for additional storage
- Nvidia GTX 980M (4GB of RAM)
- 17.3-inch, 1920×1080 display (G-Sync enabled)
The handful of DB72s showing up for sale are priced around $2,000 for the G-Sync model and a 256GB SSD + 1TB HDD, which is still quite expensive. One alternative, based on the same form factor and chassis, is the G751JY-WH71. It’s identical to the DB72, but uses a DVD-ROM instead of Blu-ray, has “just” 16GB of RAM, and a smaller 128GB SSD + 1TB HDD. The price, meanwhile, drops to $1,569.
http://www.extremetech.com/wp-conten...al-640x640.jpg
If you want a powerhouse mobile gaming system, that $1,569 price tag for a GTX 980M and a G-Sync panel is very hard to beat. Even Asus’ refreshed G752 Series doesn’t come close. The G752VL-DH71 is a $1,499 system but only offers a GTX 965M, while the $2,000 G752VT-DH74 tops out with a GTX 970M. Both are capable GPUs, but neither is as nice as the GTX 980M.
more...
-
EA: We aren’t trying to be a greedy ‘corporate beast’
http://www.extremetech.com/wp-conten...go-640x353.jpg
Andrew Wilson, Electronic Arts’ CEO, took the stage at the BC Tech Summit on Tuesday, and heroically declared that his company isn’t trying to be a “corporate beast.” His comments echo remarks made late last year by Chief Financial Officer Blake Jorgenson, who similarly argued that EA had no interest in “nickel and diming” gamers.
“If you understand the video game business, EA — the branding is this corporate beast that just wants to take money from them while people play our games,” said Wilson. “That’s not actually what we’re trying to do.”
Jorgenson’s comments weren’t exactly greeted with adulation from the publisher’s fans, and it’s doubtful Wilson will have better luck. EA has been repeatedly voted the worst company in America, and while I personally disagree with that assessment — I’d argue there are companies that engage in far worse behavior than anything a video game publisher has ever done — it’s obvious that there’s a great deal of lost trust and anger between EA and its customers.
http://www.extremetech.com/wp-conten...1/EA-Award.png
EA lost to Comcast in 2014, which is progress, we guess.
At the conference, Wilson went on to explain how services like EA Access, which gives Xbox One gamers access to a back catalog of titles and early access to upcoming products, proved that the company had gamers’ best interests at heart. In and of itself, EA Access sounds like a decent concept, but Wilson’s description of its earnings potential isn’t all that encouraging.
“For the longest time in civilization, we would spend money as human beings, then we would spend time where we spent our money. That’s reversed now,” Wilson said. “You come in, and play a bunch of games, and ultimately you invest after that.”
more...
-
JEDEC certifies GDDR5X — will AMD, Nvidia tap it for next-gen graphics after all?
http://www.extremetech.com/wp-conten...PU-640x353.jpg
Last fall, Micron announced that it would bring a new type of GDDR5 to market, GDDR5X. At the time, it wasn’t clear if the announcement would amount to much, since Micron didn’t expect availability until the end of 2016, and both AMD and Nvidia will have likely refreshed their desktop and mobile products by then. Now, the memory standard consortium JEDEC has officially recognized and published GDDR5X as a new memory standard, which could make it much more attractive to both AMD and Nvidia.
GDDR5X vs. GDDR5
Unlike high bandwidth memory (HBM, HBM2), GDDR5X is an extension of the GDDR5 standard that video cards have used for nearly seven years. Like HBM, it should dramatically improve memory bandwidth.
GDDR5X accomplishes this in two separate ways. First, it moves from a DDR bus to a QDR bus. The diagram below shows the differences between an SDR (single data rate), DDR (double data rate) and QDR (quad data rate) bus.
http://www.extremetech.com/wp-conten...er-640x324.png
An SDR bus transfers data only on the rising edge of the clock signal, as it transitions from a 0 to a 1. A DDR bus transfers data both when the clock is rising and when it falls again, meaning the system can effectively transmit data twice as quickly at the same clock rate. A QDR bus transfers up to four data words per clock cycle — again, effectively doubling bandwidth (compared to DDR) without raising clock speeds.
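To make that arithmetic concrete, here’s a minimal sketch (in Python) of how peak per-pin transfer rate scales with the number of transfers per clock. The 1.75GHz I/O clock is an assumed, illustrative figure — not a published GDDR5X specification.

```python
# Illustrative only: how peak per-pin data rate scales with transfers per clock.
# The 1.75GHz I/O clock below is an assumed example, not a GDDR5X spec figure.

TRANSFERS_PER_CLOCK = {"SDR": 1, "DDR": 2, "QDR": 4}

def per_pin_rate_gbps(io_clock_ghz: float, bus_type: str) -> float:
    """Peak per-pin data rate in Gbit/s for a given I/O clock and bus type."""
    return io_clock_ghz * TRANSFERS_PER_CLOCK[bus_type]

if __name__ == "__main__":
    clock_ghz = 1.75  # hypothetical I/O clock
    for bus in ("SDR", "DDR", "QDR"):
        print(f"{bus}: {per_pin_rate_gbps(clock_ghz, bus):.2f} Gbit/s per pin")
    # Same clock, but DDR doubles and QDR quadruples the transfers -- which is
    # where GDDR5X gets its headline bandwidth gain over GDDR5.
```

Multiply the per-pin figure by the width of the memory interface to get a card-level number, which is why moving to a QDR bus pays off even before any clock-speed increase.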
more...
-
Sony reorganizes PlayStation business, establishes division as new company
http://www.extremetech.com/wp-conten...4-640x353.jpeg
File this one under “Didn’t see this coming.” Today, Sony announced that it’s transforming its PlayStation business and creating a new LLC (limited liability company). The new company combines Sony Computer Entertainment and Sony Network Entertainment International. These two divisions cover Sony’s entire PlayStation business, including R&D, game development and publishing, and the PlayStation Network.
The new company will be known as Sony Interactive Entertainment, and will begin operation on April 1, 2016 (and no, it’s not an April Fools’ joke).
“By integrating the strengths of PlayStation’s hardware, software, content and network operations, SIE will become an even stronger entity, with a clear objective to further accelerate the growth of the PlayStation® business,” said Andrew House, President and Global CEO of Sony Computer Entertainment Inc. and Group Executive in charge of Network Entertainment of Sony Corporation.
http://www.extremetech.com/wp-conten...33-640x198.jpg
more...
-
Rise of the Tomb Raider looks and plays better on PC, but budget gaming rigs won’t cut it
http://www.extremetech.com/wp-conten...er-640x353.jpg
Back in November, Rise of the Tomb Raider launched on the Xbox One and Xbox 360 to massive critical acclaim. As a sequel to 2013’s Tomb Raider reboot, it was seen as an improvement in nearly every way possible. However, the limitations of Microsoft’s consoles tripped up the game in the performance department. Thankfully, the PC port launches today, and it looks incredible.
The folks over at Digital Foundry took an in-depth look at the PC release, and they were thoroughly impressed by the attention to detail that Nixxes Software put into this port. After the nightmare of Arkham Knight and the apparent abandonment of the PC version of Mortal Kombat X, it’s just nice to see such a top-notch port hitting the PC.
Interestingly, one of the most noteworthy improvements here isn’t the increased graphical fidelity — it’s the latency. In the Xbox One version, there is significant input latency that wasn’t found on the 360 port (also handled by Nixxes Software).
After putting the PC port through its paces, Digital Foundry found that the input latency was reduced even further in this version of the game. It doesn’t make much of a difference when you’re just walking about the open world, but when it comes time to aim your gun quickly, the latency can definitely cause some frustration in the Xbox One version.
While the PC port is capable of outperforming the Xbox One original, don’t expect a budget gaming PC to hit 60fps on high settings. The game’s minimum requirements list a Core i3-2100, 6GB of RAM, and a 2GB GTX 650, but the fidelity will suffer substantially if that’s all you’ve got. If you want to run the high settings, you’re going to need a graphics card with at least 3GB of memory. Very high? Invest in a 4GB card. Square-Enix recommends using a GTX 970, but Digital Foundry says a GTX 960 will be able to deliver a 1080p30 experience superior to the Xbox One version.
Of course, hitting 1080p60 requires a bit more horsepower. A GTX 970 can deliver 60fps, but it drops some frames in the village area. If that bothers you, turning off tessellation will solve the problem. Even if you don’t have top of the line hardware, you can make some sacrifices in the name of frame rate. Like any good PC game, this port allows you to switch most effects on and off at will, so you can decide exactly which features you can live without.
Earlier this week, the single-player DLC Baba Yaga: The Temple of the Witch landed on both the Xbox One and Xbox 360 to mostly positive responses. While some people were disappointed in the length of this $10 release, any excuse to jump back into Rise of the Tomb Raider is a good one. And since the DLC is launching day-one for the PC crowd, they can get the full gameplay experience from the get-go.
more...
-
Samantha Kalman on indie VR game development: rethinking fundamentals
http://www.extremetech.com/wp-conten...ed-640x353.png
The VR game scene is heating up, and it’s not just about AAA titles, as we’re finding in our new series on developing for VR. Hot on the heels of our interview with No Goblin’s Dan Teasdale, I had the pleasure of speaking to an indie developer named Samantha Kalman. She’s been making games for years, and now she’s decided to take on the challenge of making multiple new VR games on her own.
Thanks to her fascinating background in both the creative and technical aspects of game development, hearing Kalman’s thoughts on VR lends a new perspective — not just a better understanding of what VR is now, but a hopeful vision of what VR can become. If you still classify yourself as a VR skeptic, Samantha’s infectious passion might just change your mind.
Samantha Kalman interview
Grant Brunner: Can you tell us about who you are and your experience in gamedev?
Samantha Kalman: Sure. I’m Samantha Kalman. I started making games as a hobby in my early twenties when I discovered [the Unity game engine] which was at version 1.0 at the time. I was working in software testing when I found it and started doing hobby stuff. I became QA director of Unity when it was very, very small — fewer than ten people. Followed through with Unity up until 3.0 — released that as part of the team.
And then I became an indie developer full time. I tried a couple of times to make it work. I made a game called Sen, which is a music game that’s up on Kongregate.com. That made me no money, so I went and got a job at Amazon where I was a design technologist, which is basically a designer who writes code, on the Amazon Fire Phone. I was there for a year and a half working closely with the user experience and the 3D presentation of the device. And then tried again with indie games.
I felt like Sen was an unrealized idea — like unfinished business. I returned to this idea of “how do you make a game that helps people make music?” So, I spent the next couple of years making Sentris, and that’s out, and it kinda did it. Since that has been finished, I’ve been going into a lot of VR experiments. You know, just playing with a couple of different projects and ideas with the platforms.
GB: You’ve publicly discussed two of your current projects: Project Red and Unwinding. Can you give us an overview of what those are?
SK: Yeah, so, they’re two projects, and they’re pretty different. Unwinding is an experiment for me to do what’s called ground-up game design. Whereas Sentris was top-down — I sort of had a goal, and I was aiming toward that. Unwinding is a ground-up where I’m trying to play with mechanics, and learn about what’s interesting in VR. And sort of build toward an undefined goal and see what kind of fun and challenging experiences can emerge in the meantime. Both projects share a similarity of creating a feeling of existing in an impossible space. So, Unwinding is kind of like a space metaphor. Like, there’s kind of a planet thing and then you are solving puzzles maybe within the planet — very abstract puzzles.
Project Red is more of a world building experiment. I love the look of Rez, and wanted to build something Rez-like that has a sense of impossibility. But in “Red” it’s really about the motion — the sensation of moving through space. Kind of being surrounded, and maybe even being overwhelmed by geometry and how close it is to you. There’s some light Metroidvania mechanics on the table for that game.
http://www.extremetech.com/wp-conten...ng-640x360.png
GB: All I’ve seen of these projects are your blog posts and screenshots from the early stages of development, and they’re both very abstract in appearance. Do the games take place in first-person, or is it so abstract that you’re not even really a person — you’re an abstract form as a camera in the game world?
SK: Yeah, it’s a good question. I think that all VR games are inherently first-person — even ones where you’re controlling an avatar. The really incredible thing about VR is that you are a person who is wearing this thing that is providing you a lens into the world. It’s like picking up a pair of binoculars, and looking at the world far away. But you’re picking up a pair of goggles, and you’re looking into a different world. This is one of the most incredible things about the medium: to be looking first-person into a world.
When the iPhone came out, people were like “Oh, when we start developing games from the ground-up for a touch screen, that’s going to be a game-changer.” I think that for VR, building a game from the ground-up means acknowledging that the person playing it is a person. It’s almost like a fourth wall thing, but it’s not exactly. I think it’s important to acknowledge that there are people stepping inside these worlds.
more...
-
Report: Nintendo NX developer kits are shipping; device will be both handheld and console
http://www.extremetech.com/wp-conten...o3-640x354.jpg
E3, and the Nintendo NX’s full reveal, are still months away, but news around Nintendo’s next-generation console has been hot lately. Today, we’ve got a new pair of rumors around the device: Nintendo is supposedly seeding software development kits already, and the platform will combine a full-fledged handheld device with console play.
According to the WSJ via Geek.com, analyst Hiroshi Hayase, from IHS Technologies, expects “a small recovery in shipments of flat-panel displays for game devices because of Nintendo’s new game hardware expected to be released in 2016.” Hayase expects shipments of 3.1 – 5-inch panels to increase from 14.1 million units in 2015 to 16.5 million units this year.
Nintendo’s NX: Console hardware, standalone handheld?
Like the Wii U, the NX will supposedly combine a handheld gamepad with a living room console. Unlike the Wii U, however, the NX controller will supposedly be an independent unit. Instead of being tethered to the base console by an invisible wireless leash, gamers can take the handheld gamepad anywhere they want.
Sony has tried an approach like this with the PlayStation Vita, with mixed results. Since the PS4 has a different controller from the Vita, not all titles map well to handheld. From what we’ve heard, the Vita is technically capable of streaming a game over the Internet, but you’ll realistically want WiFi and a local PS4 to use the features.
As for the NX controller, a recent patent granted to five Nintendo employees shows a rather odd device:
http://www.extremetech.com/wp-conten...02-640x358.jpg
The entire face of the device is one screen, with embedded thumbsticks on the left and right. While Nintendo loves being edgy and trying new things, I’m deeply dubious of this design. First, there’s the practical question of how well any display would survive rigorous gameplay sessions day after day. I’ve never cracked an iPhone screen from touching it, but I’ve never tried to play Smash on my iPhone for hours at a time, either.
What about the hardware?
If the Wall Street Journal is correct, one of the major features of the Nintendo NX will be the ability to transfer a game session to a gamepad and take it with you. A 5.1-inch panel would actually be smaller than the current Wii U Gamepad, though far larger than the 3DS.
The big question is how Nintendo would handle the local rendering and associated battery life hit. If the console has a target of 60 FPS and 900p, the handheld version may be able to get away with a somewhat lower resolution target. But would the handheld use the same hardware as the console (albeit likely at a lower clock rate)? Would save game sessions you completed locally synchronize with the NX when you stepped back in range? If the console offloads gaming sessions to handheld hardware, can a family take advantage of this to use the NX console for an entirely different game?
If the game controller contains a significant amount of standalone hardware, that will drive up costs and weight. The final product wouldn’t need to be as complex or expensive as a modern smartphone, but it would still need physical interface buttons, some type of operating system, storage, wireless radios, and at least a moderately powerful SoC. While there have been rumors that AMD is building the NX, the focus of those rumors was that AMD would be building the console, not the handheld.
more...
-
Nintendo in 2016: The next-generation NX, mobile games, and investigating VR
http://www.extremetech.com/wp-conten...NX-640x353.jpg
Nintendo is currently one of the biggest question marks in the world of gaming. We’re likely to see the next-gen NX system for the first time later this year, and that could lead to an early demise for the Wii U. The company’s first mobile game is set for release next month, and there are already four more titles lined up for smartphones. Even better, Nintendo President Tatsumi Kimishima finally expressed interest in modern virtual reality hardware. With so many balls in the air, everyone in the industry is lining up to see how many Nintendo will be able to catch.
Last week, word started circulating that dev kits for the NX have already begun to ship. The rumor mill has the device pegged for a holiday 2016 release, but I wouldn’t be surprised to see Nintendo wait it out until sometime in 2017. Either way, a four or five year lifespan is pretty short these days. The Wii lasted around six years before being replaced, and the Xbox 360 was able to pull off a whopping eight years. But thanks to the Wii U’s lack of traction, pulling the plug early is starting to seem pretty appealing.
http://www.extremetech.com/wp-conten...irtual-Boy.jpg
more...
-
Bethesda’s Doom reboot drops May 13, complete with $120 Collector’s Edition
http://www.extremetech.com/wp-conten...se-640x353.jpg
After a whopping eight years in development, Bethesda has finally given a date for the next iteration of Doom. On May 13, players will finally suit up as, erm, Anonymous Space Marine, to take on the forces of Hell and/or an evil corporation. It’s all a bit vague, you see. All of Bethesda’s trailers have been heavy on action, extremely short on gameplay, and today’s launch trailer is no different.
Today’s launch trailer builds on the E3 gameplay trailer that came out last year. Both show a game that’s exceedingly heavy on action and apparent combos — tearing off limbs, gouging out eyes, being ripped in half by a Cyberdemon — you know, the usual events of an Anonymous Space Marine’s life.
If the gameplay trailers are anything to go by, Bethesda’s Doom is a different beast than the brooding, jump-scare, darkfest that was Doom 3. id’s 2004 shooter may have sold well, but as someone who loved both Doom and Doom 2, I couldn’t get into it. Doom 3 and Half-Life 2 debuted within months of each other, but the contrast between the two couldn’t have been larger.
Half-Life 2 was one of the first FPS games to successfully integrate object manipulation and physics into the game engine (we shall not speak of Trespasser). Doom 3, in contrast, relied on static environments and a number of already-tired tropes in the genre. Enemies that spawned behind you or leapt out from hidden closets in rooms you’d thought you cleared might have been cutting edge in 1993 and 1994, but they were already worn thin ten years later.
http://www.extremetech.com/wp-conten...m1-640x360.jpg
As for this new Doom, it looks a fair bit like the Brutal Doom mod for the modern game. I’ve actually spent a fair bit of time playing Brutal Doom this year, and it’s a hilariously fun way to replay the original maps. Scavenged weapons, grenades, mouselook, fatalities, and many of the other tweaks collectively update the classic Doom, while simultaneously remaining true to the original title in a way Doom 3 never managed. The mod effort to implement Doom’s shareware levels in Doom 3’s engine was much closer to the original game, in my personal opinion.
more...
-
How much RAM do you need, should you upgrade it, and will it speed up your PC?
http://www.extremetech.com/wp-conten...re-640x354.jpg
Welcome to ExtremeTech’s comprehensive RAM guide, in which we’ll answer a broad range of questions related to how much system RAM you need these days, whether or not it’s worth it to upgrade older systems, and whether DDR3 or DDR4 (the new main types of system RAM) is a better investment option.
It’s interesting to look back and see how much things have changed over the past twenty years. People have been writing RAM guides for decades, but back when I was learning about computing, much more emphasis was placed on the specific technical implementation of any given RAM standard. Fast Page Mode RAM, EDO RAM, SDRAM, DDR, and RDRAM are just a few of the standards that existed elbow-to-elbow, and which type of memory your system used often determined if it was worth upgrading.
Nowadays, things are simpler. While a few of you may still have DDR2-based equipment from 2005 to 2009, the majority of systems today are likely using DDR3. That’s the memory standard we’ll focus on; if you have DDR2-related questions you’re welcome to drop them in the comments.
How much RAM do you need?
How much RAM you need in a system depends on what you intend to do with it, how long you intend to keep it, and whether or not you can upgrade your memory post-purchase. This last point is important, as many high-end laptops have eliminated user-upgradeable RAM in order to reduce system thickness by roughly six nanometers.
Adding additional RAM to any laptop generally increases power consumption by a measurable (if small) amount, but this shouldn’t be an issue for most users. It’s also better to have a bit too much RAM than too little, as whatever you gain in power savings you’ll promptly lose to increased disk paging.
Apple’s MacBook Air offers 4GB of RAM, but most of the systems from Dell, HP, and other OEMs start at 8GB, and I think that’s the better sweet spot. That’s not to say you can’t get by on 4GB — you absolutely can — but 8GB gives you a bit more breathing room.
http://www.extremetech.com/wp-conten...ir-640x353.png
The MacBook Air ushered in the era of soldered DRAM. Everyone followed.
There’s at least some evidence that modern desktop applications have slowed the rate at which they demand more RAM. From 1990 to 2000, Photoshop’s minimum RAM requirement rose from 2MB to 64MB, a 32x increase in 10 years. It then took another 16 years to repeat that 32x jump (from 64MB in 2000 to 2GB in 2016).
A lightweight system today can get by with 4GB of RAM. 8GB should be plenty for current and near-term future applications, 16GB gives you comfortable space for the future, and anything over 16GB is likely overkill unless you specifically know you need it (such as for video editing or audio post-production). This holds true for desktops as well as laptops.
DDR3 or DDR4?
Right now there’s plenty of DDR3 systems still being sold, but DDR4 has already begun to replace it on the mass market. If you’re building a new system and don’t have a specific reason to use DDR3, we’d recommend buying hardware that’s compatible with DDR4.
With that said, if your system does use DDR3, that’s not the problem that it used to be. In the old days, a computer stuck on, say, PC133 SDRAM was at an intrinsic performance disadvantage compared with systems that used DDR, particularly at higher clock speeds. That’s less true than it used to be, and it may make sense to upgrade a DDR3 system depending on what you have and when you bought it. The reason to use DDR4 at this point has more to do with long-term memory pricing trends and future compatibility than fundamental performance. We’ll explore current price and the performance question later in this guide.
Does faster RAM boost system performance?
Short answer: Sometimes, but not by much.
Medium answer: It depends on other system components, workload, and whether or not you’re using integrated graphics.
Longer answer: See below.
RAM performance is controlled by two metrics: Clock speed and access latency. Access latencies tend to fall much more slowly than clock rates — as this diagram shows, the memory cell cycle time of PC100 is roughly equivalent to DDR4-2133. DDR4 doesn’t match DDR3-2133 cycle times until you hit DDR4-4266.
http://www.extremetech.com/wp-conten...11-640x634.jpg
RAM cycle times at various clock speeds
Conventional wisdom is that RAM latency has become relatively less important in recent years, thanks to a combination of factors. Back when L2 caches were small, memory controllers were off-die (and clocked at a fraction of CPU speed), and there were no L3 caches, memory latency had a larger impact on overall system performance. Modern CPUs are typically backed by 512KB to 1MB of L2 cache (per core), and 1.5MB to 2MB of L3 cache (per core). Memory controllers are now integrated on-die and run at full processor speed. As a result, RAM latency simply doesn’t play as large a part as it once did in determining performance.
As for raw memory bandwidth, the same large caches that minimize the impact of RAM latency in most applications also limit the impact of memory bandwidth. Desktop applications are, for the most part, latency-sensitive, not bandwidth-sensitive.
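A quick back-of-the-envelope sketch makes the point. This assumes a single 64-bit channel and uses typical CAS latencies rather than figures from any specific module:

```python
# Rough illustration: DDR clock gains boost bandwidth far more than they cut latency.
# Assumes a single 64-bit channel; the CAS latency (CL) values are typical examples,
# not tied to any particular DIMM.

def cas_latency_ns(transfer_rate_mts: int, cl_cycles: int) -> float:
    # The command clock runs at half the transfer rate (double data rate),
    # so one clock period is 2000 / (MT/s) nanoseconds.
    return cl_cycles * 2000.0 / transfer_rate_mts

def peak_bandwidth_gbs(transfer_rate_mts: int, bus_bytes: int = 8) -> float:
    # 64-bit (8-byte) channel: bandwidth = transfers per second * bytes per transfer.
    return transfer_rate_mts * bus_bytes / 1000.0

if __name__ == "__main__":
    for name, mts, cl in (("DDR3-1600", 1600, 9), ("DDR4-2133", 2133, 15)):
        print(f"{name}: ~{peak_bandwidth_gbs(mts):.1f} GB/s peak, "
              f"~{cas_latency_ns(mts, cl):.2f} ns CAS latency")
    # Bandwidth climbs with the transfer rate, but absolute CAS latency barely
    # moves (and is actually slightly worse for this DDR4 example).
```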
http://www.extremetech.com/wp-conten...ck-640x450.jpg
These performance results are from Corsair, but they match extensive testing on the topic. AMD APUs love fast DRAM.
There’s one major exception to this rule: Integrated graphics performance. Both Intel and AMD integrated graphics see some benefit from higher-speed memory, but the gains are particularly large on the AMD side. This has proven true for every APU since at least Trinity, and will likely continue to be accurate for DDR4-based hardware. The advent of HBM2 in APUs will finally throw open the bandwidth floodgates — until then, integrated graphics will always be somewhat bandwidth-limited.
What about high-end gaming performance?
Until recently, I would’ve told you that high-speed RAM had very little impact on high-end gaming. A recent report from Digital Foundry, however, appeared to show otherwise.
more...
-
Oculus announces first Rift bundles, certified PCs
http://www.extremetech.com/wp-conten...ft-640x353.jpg
Oculus promised it would unveil certified PCs and bundles that would meet the demands of its own VR solution. It’s now unveiled the first of those systems, as promised, though there’s been some confusion regarding price points and bundles. We’ll help you sort it out.
The first wave of systems consists of desktops from Asus and Dell (Alienware is a Dell subsidiary). The cheapest configuration is an Asus G11CD, a desktop tower with an Intel Core i5-6400, 8GB of DDR4-2133, a GTX 970, and a 1TB HDD for $1,049. Alienware and Dell both offer systems with identical specs for $1,199, which raises the question of why anyone would buy them when the Asus rig is $150 cheaper.
Bundles, discounts, and Oculus pre-orders
There’s been some confusion over which systems are bundled with an Oculus Rift and which are not, so let’s clear that up first. Bundled systems — by which we mean systems that include the Oculus Rift, an Xbox One controller, and all the software that the Rift ships with — start at $1,499. But not every system sold above $1,500 comes with a Rift bundled — none of the systems on the Rift blog post actually include a Rift headset. Oculus notes that this $1,499 bundle price point is for a limited time only, implying that the cost of bundled hardware will rise afterwards.
http://www.extremetech.com/wp-conten...ec-640x793.jpg
Asus system specs
There is one caveat to this. The various manufacturers are offering a $100 discount if you already pre-ordered an Oculus Rift. The website notes: “For those who’ve already pre-ordered Rift, you’ll be able to purchase discounted Oculus Ready PCs in select countries and regions. To claim your discount, check your order status and opt into partner offers if you haven’t already. Offer codes will appear on your order status page February 16.”
more...
-
Bryan Fuller to helm new Star Trek series: What do his past episodes tell us about the new show?
http://www.extremetech.com/wp-conten...ms-640x353.jpg
CBS has announced that the upcoming Star Trek TV show will be helmed by Bryan Fuller, creator of Wonderfalls, Pushing Daisies, and Hannibal. Before he created these shows, Fuller was a writer on both Star Trek: Deep Space Nine and Voyager, and has talked about the idea of bringing the series back to television for nearly a decade.
Fuller, who has been a Star Trek fan since he was a child, has pitched multiple ideas around a new series since 2008, when he told IF Magazine: “I would love to return to the spirit of the old series with the colors and attitude. I loved Voyager and Deep Space Nine, but they seem to have lost the ‘60s fun and I would love to take it back to its origin.”
More recently, Fuller has speculated about setting a Star Trek TV show on the USS Reliant rather than the USS Enterprise, or rebooting the TNG universe within the Abrams timeline rather than returning to the original timeline in which all of the previous television shows took place.
Watching Bryan Fuller
I’m not familiar with Fuller’s later television shows, but it just so happens that I’ve been re-watching all of the Star Trek spin-offs. Fuller wrote just two DS9 episodes, “The Darkness and the Light” and “Empok Nor.” Both are solid, though I’d argue “Empok Nor” is the better of the two.
http://www.extremetech.com/wp-conten...er-640x521.jpg
Bryan Fuller. I have no explanation for the fox.
His Voyager repertoire is more extensive, at 20 episodes. My initial opinion of Voyager when it aired wasn’t very positive, but I’ve actually warmed to the show more now that I’m watching it again. It’s still my third-favorite behind DS9 and TNG, but it deserved more credit than I initially gave it. Now that I’ve seen which episodes Fuller wrote, I think some of that credit belongs to him.
more...
-
AMD unveils its own CPU recommendations for Oculus VR
http://www.extremetech.com/wp-conten...00-640x353.jpg
Earlier this week, Oculus opened pre-orders for systems and configurations that it believes will deliver an acceptable VR experience. Overall, it’s probably best if consumers hold off on pre-ordering VR equipment — but since we’ve spent most of our time discussing the GPU side of the equation, the CPU deserves some love as well.
Jason Evangelho of Forbes sat down with AMD to talk about its processor support for VR and whether the company’s FX and APU families can drive headsets like the Oculus Rift. The good news is, they absolutely can, even if the current version of the Oculus hardware tester claims otherwise.
http://www.extremetech.com/wp-conten...2/Capture3.png
Originally, AMD told Forbes that there were a variety of AMD FX processors that could handle VR, as well as some of the highest-clocked APU processors. The company has since walked back its APU claims, however, and is now saying that only the FX chips have been validated. If you have an eight-core or six-core AMD CPU with a base clock no lower than 3.9GHz, you should be good to go with VR.
The fact that AMD is still validating VR on its APUs doesn’t mean those chips can’t handle the technology, but it may be difficult for them to do so. Even AMD’s upcoming A10-7890K, with a supposed 4.1GHz base clock and 4.3GHz boost clock, isn’t all that powerful compared to Intel’s Core family. AMD CPUs can’t match the single-threaded performance of Intel chips, which is why AMD positions its multi-core offerings against Intel processors with lower core counts. According to Oculus’ recommendations, the minimum Intel chip you should use is a Core i5-4590. That’s a 3.3GHz quad-core with a 3.7GHz Turbo clock, 6MB of L3 cache, and no Hyper-Threading.
more...
-
Vulkan API reaches 1.0, claims broad API support, cross-OS compatibility
http://www.extremetech.com/wp-conten...-3-640x353.jpg
It’s been some 18 months since Khronos announced the next generation of OpenGL, and the final version of the spec, Vulkan, is finally ready for deployment. As of today, everything related to Vulkan — drivers, SDKs, and early software support — is ready for launch. This is a change from Khronos’ usual practice, which is to announce a new version of the OpenGL API with vendor support following at a later time.
http://www.extremetech.com/wp-conten...02/Vulkan2.jpg
Vulkan is a direct descendant of AMD’s Mantle, and it shares the same underlying philosophy as both its predecessor and DirectX 12. What sets Vulkan apart from DirectX 12 is that it supports a much wider range of operating systems, from Windows 7 and 8.1 up through Windows 10, Linux, SteamOS, Android, and Tizen. That’s a significant theoretical advantage for the new API, though Microsoft is doing everything it can to push Windows 7 or 8.1 gamers to adopt Windows 10.
Like DX12, Vulkan is designed to minimize driver overhead, scale across multiple CPUs, and offer developers greater control over how workloads are executed across the GPU. Mobile GPU companies like Qualcomm and Imagination Technologies have both pledged support for the API and intend to support it in their own development tools. In theory, this broad industry base could make it more likely that PC games will cross over to Android devices, Steam Machines, and Linux as a whole. Even Vivante has pledged to support the API, though its products are typically used in lower-end mobile hardware.
more...
-
Fallout 4 is a great game but a terrible RPG
http://www.extremetech.com/wp-conten...e1-640x353.jpg
Fallout 4 is a great game. It’s got better gunplay and action than any previous modern Fallout, a decent crafting system, streamlined skills and talents, and an evocative setting. The landscapes and environments are gorgeous, even if the character models are lacking. There’s a lot to like about this game — but it’s a terrible RPG.
What’s an RPG?
Strip away the tropes and conventions of the genre and there are two linked characteristics common to all role playing games (RPGs) — a (hopefully) strong story, and the opportunity to make meaningful choices that influence how the game’s narrative evolves. Some games play out the same way but allow the gamer to choose different play styles, while others allow the player to directly shape the narrative.
Because game assets and development time are both scarce resources, the best RPG developers are skilled illusionists. Quest lines, conversation threads, and plot-specific developments are often woven in ways that give players enough freedom to explore alternative narratives while minimizing the amount of overhead required to do so. In Fallout: New Vegas, you can choose to stand with Mr. House, New Vegas, the NCR, or Caesar’s Legion. What you can’t do is decide that the Mojave is really boring, and you’d really prefer to see what the Baja Peninsula is like 200 years after the bombs fell.
http://www.extremetech.com/wp-conten...wn-640x361.png
Fallout New Vegas: A study in brown
RPGs are the only popular game type whose abbreviation tells you nothing about how you play the title. Every other abbreviation — FPS, RTS, turn-based, third-person shooter — is designed to explain how the player experiences the game. All of these game types are potentially compatible with the label “RPG.”
more...
-
HTC Vive VR: $800, shipping in April, and another risky bet for gamers
http://www.extremetech.com/wp-conten...e1-640x353.jpg
Today, just before Mobile World Congress, HTC announced both Vive VR’s price tag and some additional features. The headset will cost $800 at launch, pre-orders begin on February 29, and the device will include some novel functionality, including the ability to integrate with smartphones.
Vive’s official announcement states: “Enabling you to stay connected to the real world, without exiting the virtual world, Vive Phone Services demonstrates the ability to combine both realities without losing touch of either. By allowing you to receive and respond to both incoming and missed calls, get text messages and send quick replies and check upcoming calendar invites directly through the headset, it opens up a whole new world of possibilities for both consumers and businesses.”
HTC is talking about full SteamVR integration into its platform, and the $800 price tag buys you two hand-tracking, wand-like controllers. I experimented with these in Sonoma at RTG’s Polaris architecture event, though I didn’t come away with a high opinion of them — they weren’t working all that well when I tested them, and one hand kept dropping out during gameplay. Assuming that issue got solved, the final controllers were comfortable enough and easy to hold for the short time I used them.
The Vive ships with two titles, at least for now — Job Simulator and Fantastic Contraption, neither of which I’ve played personally.
Don’t pre-order a Vive, either
Earlier this month, after Oculus unveiled its recommended hardware, I stated that I didn’t think people should pre-order the Rift, even though I believe VR has tremendous long-term potential and want to see it succeed in the gaming market.
All of the reasons I recommended people take a wait-and-see approach to the Rift apply to the Vive as well, at least as strongly. However gobsmackingly amazing VR may prove in the future, right now, today, the technology is new, and the price of entry can be considerably more than $600 to $800 depending on what kind of PC hardware you currently own. I’ve built gaming PCs for people for less money than you’ll pay for just the HTC Vive.
There’s a lot we don’t know yet about how VR solutions from different companies will compare, which games will be cross-compatible across platforms, and which will technically be cross-compatible but in reality perform vastly better on one headset than another. Oculus has gathered most of the headlines to date, but that doesn’t mean it’s automatically going to field the best solution.
The fact that many PC gamers who have been perfectly happy with 30-60 FPS at 1080p will need to upgrade to substantially more powerful hardware in order to experience VR means that it’s all the more important to wait and see what real-world performance looks like. Do you need a GTX 970, a 980, or a 980 Ti? Is an R9 390 enough performance, or should you buy a Fury X or wait for Pascal / Polaris later this year on 14nm? These aren’t just hypothetical questions; they all carry price tags. It’s going to take some time post-launch to see how the hardware compares, how game support shapes up, and which solutions ultimately deserve buy-in and which don’t.
When I recommended people hold off on pre-ordering the Oculus Rift in favor of waiting for reviews and performance data, some readers commented that it was strange to see a website called ExtremeTech being so conservative with its recommendations. I’m perfectly happy to extoll the virtues of VR or other cutting-edge technology when we’re talking about them as experiences or potential game-changers. When it comes to telling you where I think you should spend your hard-earned cash, I’m a lot less willing to play fast-and-loose with my recommendations.
If you already have a tricked-out high-end gaming rig or a six-figure take-home pay, you’ve got enough cash on hand that you don’t have to worry if your VR bet doesn’t pan out. If you were one of the first Oculus backers on Kickstarter and you’ve stayed engaged with the company since it first went live, you’ve already made your ecosystem and purchasing choices.
more...
-
AMD clobbers Nvidia in updated Ashes of the Singularity DirectX 12 benchmark
http://www.extremetech.com/wp-conten...re-640x354.png
Nvidia reached out to us this evening to confirm that while the GTX 9xx series does support asynchronous compute, it does not currently have the feature enabled in-driver. Given that Oxide has pledged to ship the game with defaults that maximize performance, Nvidia fans should treat the asynchronous compute-disabled benchmarks as representative at this time. We’ll revisit performance between Teams Red and Green if Nvidia releases new drivers that substantially change performance between now and launch day.
Roughly six months ago, we covered the debut of Ashes of the Singularity, the first DirectX 12 title to launch in any form. With just a month to go before the game launches, the developer, Oxide, has released a major new build with a heavily updated benchmark that’s designed to mimic final gameplay, with updated assets, new sequences, and all of the enhancements to the Nitrous Engine Oxide has baked in since last summer.
Ashes of the Singularity is a spiritual successor to games like Total Annihilation, and the first DirectX 12 title to showcase AMD and Nvidia GPUs working side-by-side in a multi-GPU configuration.
The new build of the game released to press now allows for multi-GPU configuration testing, but time constraints limited us to evaluating general performance on single-GPU configurations. With Ashes launching in just under a month, the data we see today should be fairly representative of final gameplay.
AMD, Nvidia, and asynchronous compute
Ashes of the Singularity isn’t just the first DirectX 12 game — it’s also the first PC title to make extensive use of asynchronous computing. Support for this capability is a major difference between AMD and Nvidia hardware, and it has a significant impact on game performance.
A GPU that supports asynchronous compute can use multiple command queues and execute these queues simultaneously, rather than switching between graphics and compute workloads. AMD supports this functionality via its Asynchronous Compute Engines (ACE) and HWS blocks on Fiji.
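To make the idea concrete, here’s a minimal Direct3D 12 sketch (a generic illustration of the API, not code from Oxide’s Nitrous engine) that creates a graphics queue and a separate compute queue. On GPUs with working asynchronous compute, the driver can overlap work submitted to the two queues; on other hardware, the queues are simply serialized.

// Minimal sketch: creating separate graphics and compute command queues in
// Direct3D 12. Build on Windows with the Windows SDK and link d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12-capable adapter found\n");
        return 1;
    }

    // Graphics ("direct") queue: accepts draw, copy, and compute commands.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Compute queue: dispatch and copy commands only. Long-running compute
    // work submitted here can overlap with rendering on capable GPUs.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    std::printf("Created direct and compute queues\n");
    // Recorded command lists would be submitted to each queue via
    // ExecuteCommandLists(), with fences used to synchronize the two.
    return 0;
}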
https://www.extremetech.com/wp-conte...WS-640x360.jpg
Fiji’s architecture. The HWS blocks are visible at top
Asynchronous computing is, in a very real sense, GCN’s secret weapon. While every GCN-class GPU since the original HD 7970 can use it, AMD quadrupled the number of ACEs per GPU when it built Hawaii, then modified the design again with Fiji. Where the R9 290 and 290X use eight ACEs, Fiji has four ACEs and two HWS units. Each HWS can perform the work of two ACEs and they appear to be capable of additional (but as-yet unknown) work as well.
more...
-
How do SSDs work?
http://www.extremetech.com/wp-conten...sh-640x353.png
Here at ExtremeTech, we’ve often discussed the differences between types of NAND structures — vertical NAND versus planar, or multi-level cell (MLC) versus triple-level cell (TLC). Now let’s tackle a more basic question: How do SSDs work in the first place?
To understand how and why SSDs are different from spinning discs, we need to talk a little bit about hard drives. A hard drive stores data on a series of spinning magnetic disks, called platters. There’s an actuator arm with read/write heads attached to it. This arm positions the read-write heads over the correct area of the drive to read or write information.
Because the drive heads must align over an area of the disk in order to read or write data (and the disk is constantly spinning), there’s a non-zero wait time before data can be accessed. The drive may need to read from multiple locations in order to launch a program or load a file, which means it may have to wait for the platters to spin into the proper position multiple times before it can complete the command. If a drive is asleep or in a low-power state, it can take several seconds more for the disk to spin up to full power and begin operating.
From the very beginning, it was clear that hard drives couldn’t possibly match the speeds at which CPUs operate. Latency in HDDs is measured in milliseconds, compared with nanoseconds for your typical CPU. One millisecond is 1,000,000 nanoseconds, and it typically takes a hard drive 10-15 milliseconds to find data on the drive and begin reading it. The hard drive industry introduced smaller platters, on-disk memory caches, and faster spindle speeds to narrow this gap, but there’s only so fast a drive can spin. Western Digital’s 10,000 RPM VelociRaptor family is the fastest set of drives ever built for the consumer market, while some enterprise drives spun at up to 15,000 RPM. The problem is that even the fastest spinning drives with the largest caches and smallest platters are still achingly slow as far as your CPU is concerned.
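To put those figures in perspective, here’s a quick back-of-the-envelope calculation (my own illustration, assuming a 12 ms seek and a roughly 3 GHz CPU core) showing how many CPU cycles a single hard drive seek consumes.

// Back-of-the-envelope: how many CPU cycles fit inside one HDD seek.
// Assumed values: 12 ms seek (typical 10-15 ms), ~3 GHz CPU core.
#include <cstdio>

int main() {
    const double hdd_seek_ms  = 12.0;       // assumed average seek time
    const double ns_per_ms    = 1000000.0;  // 1 ms = 1,000,000 ns
    const double cpu_cycle_ns = 1.0 / 3.0;  // one cycle on a ~3 GHz core

    const double seek_ns = hdd_seek_ms * ns_per_ms;
    std::printf("A %.0f ms seek = %.0f ns, or roughly %.0f million CPU cycles\n",
                hdd_seek_ms, seek_ns, seek_ns / cpu_cycle_ns / 1e6);
    return 0;
}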
How SSDs are different
“If I had asked people what they wanted, they would have said faster horses.” — Henry Ford
Solid-state drives are called that specifically because they don’t rely on moving parts or spinning disks. Instead, data is saved to a pool of NAND flash. NAND itself is made up of what are called floating gate transistors. Unlike the transistor designs used in DRAM, which must be refreshed multiple times per second, NAND flash is designed to retain its charge state even when not powered up. This makes NAND a type of non-volatile memory.
http://www.extremetech.com/wp-conten..._1-640x518.png
The diagram above shows a simple flash cell design. Electrons are stored in the floating gate, which then reads as charged “0” or not-charged “1.” Yes, in NAND flash, a 0 means that data is stored in a cell — it’s the opposite of how we typically think of a zero or one. NAND flash is organized in a grid. The entire grid layout is referred to as a block, while the individual rows that make up the grid are called pages. Common page sizes are 2K, 4K, 8K, or 16K, with 128 to 256 pages per block. Block size therefore typically varies between 256KB and 4MB.
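As a quick sanity check on that geometry, here’s a small worked example (using assumed values of 4K pages and 256 pages per block, chosen from the ranges above):

// NAND geometry sanity check with assumed example values from the ranges
// above: 4 KB pages, 256 pages per block.
#include <cstdio>

int main() {
    const unsigned page_bytes      = 4 * 1024;  // common page sizes: 2K-16K
    const unsigned pages_per_block = 256;       // typically 128-256 pages

    const unsigned block_bytes = page_bytes * pages_per_block;
    std::printf("Block size: %u KB (%.2f MB)\n",
                block_bytes / 1024, block_bytes / (1024.0 * 1024.0));
    // 4 KB x 256 pages = 1 MB, squarely inside the 256KB-4MB range above.
    return 0;
}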
One advantage of this system should be immediately obvious. Because SSDs have no moving parts, they can operate at speeds far above those of a typical HDD. The following chart shows the access latency for typical storage media, given in microseconds.
http://www.extremetech.com/wp-conten...SD-Latency.png
more...
-
Nintendo NX console coming this Christmas with new Zelda game: report
http://www.extremetech.com/wp-conten...a1-640x360.jpg
It’s shaping up to be a major year for Nintendo if the rumor mill is even partially accurate. The company has already announced that it will showcase its new NX console at E3 this summer, as our sister site IGN reported, but fresh rumors suggest we’ll see the NX launch by Christmas of this year. That’s somewhat less time than Sony and Microsoft gave themselves to ramp up the PS4 and Xbox One, which were announced in February and May, respectively.
Exact launch dates are still being nailed down, but the console will reportedly debut alongside the new Zelda title. Nintendo has previously promised that its next Legend of Zelda title would debut on the Wii U, and it plans to keep that promise by offering the game as both an NX launch title and a Wii U game. This would seem to suggest that the game will be scaled up to match the NX’s hardware capabilities rather than designed for that platform and scaled down to fit the Wii U’s less powerful hardware.
This is supposedly a major rethink of the classic Legend of Zelda concepts and gameplay, with a new, Elder Scrolls — Zelder Scrolls? — style of open-world exploration.
https://www.extremetech.com/wp-conte...ls-640x353.jpg
Supposedly the new Zelda will receive a mammoth marketing push, with a $10 million earmark out of Nintendo’s $34.5 million budget for the Wii U. Nintendo will spend an estimated $56 million on marketing the 3DS, and may be planning a further price cut for the platform this fall. That would fit speculation that the NX platform is meant to at least partly replace the 3DS as Nintendo’s handheld of choice, but even if that rumor proves true, it raises others. If the NX is both console and handheld, will users be able to play 3DS games they’ve already purchased?
The Nintendo NX
We’ve previously rounded up the Nintendo NX’s capabilities, controllers, and positioning, so if you’re looking for background information, that’s where to start. Our understanding of the hardware is still limited. Nintendo has said that the NX is a clean break from its past, which heavily implies that the company will finally move away from the 20-year-old CPU architecture it has been using since Clinton was President. As capable as the old 750CXe was in its day, Nintendo has made only modest modifications to the CPU core since it debuted.
https://www.extremetech.com/wp-conte...01-640x358.jpg
Nintendo’s patent drawings, showing a hypothetical controller
A 2016 debut would line up with AMD’s claims that it secured another semi-custom design win with revenue expected later this year, since Nintendo would begin ramping up production several months before shipping hardware. If AMD built the SoC, there’s a very good chance that it handled the graphics as well, but there’s still much we don’t know. Of course, it’s also possible that AMD’s semi-custom win is merely coincidentally timed. Either way, Nintendo has historically done a good bit of customization work on its SoC designs, and we expect it would do so here as well.
Whatever ideas Nintendo is fielding with the NX, it needs to do a better job marketing the final product than it managed with the Wii U. The GamePad was an interesting idea, but it never matured into a must-have peripheral, and lifetime Wii U sales have been far below the Wii’s. If Nintendo pushes for an aggressive update, it could easily field a console to match the PS4 and Xbox One, but the company has previously preferred to maximize hardware profitability, with consoles that, while capable, lagged behind their peers in terms of sheer horsepower. The Wii U is built on 40nm technology, for example, even though 28nm was available when the hardware was designed. Unlike Sony and Microsoft, Nintendo typically does fewer die shrinks and other revisions, but that could change if the firm decides to go head-to-head with the dominant players in the console industry.
more...
-
The Windows Store saddles PC games with significant limitations
http://www.extremetech.com/wp-conten...ak-640x354.jpg
Microsoft really wants PC and Xbox One gamers to have a common platform that links their purchases and allows for both cross-buying and possibly cross-play. At first, this seemed a win-win for everyone. Now, however, the restrictions placed on Windows 10 games sold through the Windows Store seem like they might kill the entire concept.
We touched on this recently, but it’s worth revisiting the topic in-depth, since it affects games like Quantum Break, Rise of the Tomb Raider, and the upcoming Gears of War Ultimate Edition. Specifically, games purchased through the Windows Store:
- Run only on Windows 10
- Cannot be managed through Steam
- Offer no SLI / CrossFire support
- Have no refund policy
- Run in borderless full screen with the refresh rate locked to your display’s rate (the lock has been reported as 60Hz, because that’s the maximum refresh rate on many displays)
- Use protected game files that can hinder modding (this will vary depending on the structure of the title)
In addition, there are mouse and key binding issues, as well as the fact that you can’t override in-game settings with Nvidia’s Control Panel or AMD’s Catalyst Control Center, according to Ars Technica.
Having said that, a few restrictions are being reported as Windows Store-specific when they may not be. The reason benchmark monitoring and reporting tools are having trouble with these applications is the same reason FCAT can’t monitor Ashes of the Singularity on AMD hardware — the software and application support for DX12 monitoring isn’t available yet. Support for these modes will hopefully come in time.
Avoid the Windows Store
Reading over the list of restrictions, I can’t honestly come up with a reason why a PC gamer should ever buy a title through the Windows Store, unless it’s a mobile game or something you already own on an Xbox One. Giving up multi-GPU support, tunable options, cross-OS compatibility, and Steam’s refund policy gets you… well, practically speaking, it gets you nothing. If you own an Xbox One, cross-buying may be a potent incentive. But for everyone else, the Steam / GOG / whoever deal is going to be better.
https://www.extremetech.com/wp-conte...er-640x360.jpg
This speaks to an ongoing problem with Microsoft and the Windows Store in general. Universal Windows Apps might be awesome in theory, but in practice they’re far more trouble than they’re worth. If Microsoft sold its games for less money in the Windows Store, the limited feature set might be worth it, but the company’s entire plan appears to be “sell them less for the same price tag.” Gamers who are used to how Steam works may not be pleased with the way the Windows Store locks everything down. Microsoft may want to unify the Windows Store and its Xbox One gaming empire with the PC space, but it risks creating a two-tier system in which Steam purchases offer dramatically better features and compatibility than Microsoft’s own products sold on Microsoft’s own platform.
These kinds of limits and lockdowns may make sense on mobile devices, but they’re not going to help Microsoft win converts for the PC Windows Store. As things stand, it should be considered the method of last resort for buying a game, unless you specifically want the Xbox One cross-buy feature. Once Microsoft revises its policies and requirements, that could change.
more...
-
The new Gears of War Ultimate Edition is a DX12 disaster
http://www.extremetech.com/wp-conten...W1-640x353.jpg
We’ve covered Ashes of the Singularity and how the game’s DirectX 12 performance has evolved on AMD and Nvidia hardware. This week, Microsoft launched the PC version of Gears of War Ultimate Edition, and the two titles’ characteristics couldn’t be more different. The new Gears of War is catastrophically broken on Radeon cards.
Jason Evangelho of Forbes has details on the exceedingly strange performance results, of which there are many. The Radeon Fury X is incapable of holding a steady frame rate, with multiple multi-second pauses throughout the benchmark run. The same problem struck the 4GB R9 380 and the 4GB R9 Nano as well. Meanwhile, an R7 370 — that’s a midrange card based on a four-year-old graphics architecture, which also ships with 4GB of RAM — runs just fine.
Here’s the R9 Nano running in 4K at High Quality.
The Forbes tests show two trends. First, GCN 1.0 cards perform smoothly, while GCN 1.1 and 1.2 cards stutter and struggle. Second, AMD GPUs with more than 4GB of RAM show marked improvement. I spoke to Jason about his results; he indicated this is not the case on Nvidia hardware, where 4GB of RAM gives the GTX 980 all the headroom it needs at settings and resolutions that cripple AMD. Last year, we surveyed 15 titles to determine whether gamers needed more than 4GB of VRAM to play in 4K and concluded they did not. The fact that AMD is hammered at 1440p and High detail suggests that memory management in Gears of War Ultimate Edition is fundamentally broken as far as AMD GPUs are concerned.
One of the historical differences between AMD and Nvidia has been their Day 1 driver support. AMD has put a great deal of work into closing that gap in recent years, but Nvidia is still widely perceived to have an edge when it comes to launch-day optimizations.
https://www.extremetech.com/wp-conte...PC-640x361.jpg
The game’s visuals are badly corrupted on AMD cards above 1080p and at higher detail levels.
In this case, however, the problems go far beyond performance profiling. The game isn’t slower on AMD — it’s unplayable on many AMD GPUs. Hawaii / GCN 1.1 is now more than two years old, Tonga is 18 months old, and Fiji has been in-market for nine months. None of these are new products.
Developer or driver?
There are several reasons to suspect this is a developer issue rather than a driver problem. First, there’s the fact that DirectX 12 is designed to give developers far more power over how a game is rendered. This can be a double-edged sword. DX12 allows for better resource allocation, multi-threaded command buffers, asynchronous compute, and better performance tuning — but it also makes it harder for the IHV (that’s AMD or Nvidia) to optimize in-driver. There are optimizations that AMD and Nvidia could perform under DX11 that can’t be done in DX12.
more...
-
Microsoft’s Xbox chief wants to build a fully upgradeable Xbox One
http://www.extremetech.com/wp-conten...r1-640x353.jpg
The gap between computers and consoles has been shrinking for decades — and now, Xbox head Phil Spencer wants to eliminate it altogether. Microsoft is already taking steps to unify its Xbox One and Windows 10 experience, through features like game streaming across networks and cross-buy capability. Upgrading hardware, however, is something else entirely.
Nonetheless, upgraded hardware appears to be what Spencer meant. “We see on other platforms whether it be mobile or PC that you get a continuous innovation that you rarely see on console,” Polygon reports Spencer as saying. “Consoles lock the hardware and the software platforms together at the beginning of the generation. Then you ride the generation out for seven or so years, while other ecosystems are getting better, faster, stronger. And then you wait for the next big step function.”
https://www.extremetech.com/wp-conte...Mk-640x390.png
This graph shows the step function Spencer is referring to. PC performance increases at a fairly steady rate over time, while consoles have long periods of static performance followed by a jump.
“When you look at the console space, I believe we will see more hardware innovation in the console space than we’ve ever seen,” Spencer said. “You’ll actually see us come out with new hardware capability during a generation allowing the same games to run backward and forward compatible because we have a Universal Windows Application running on top of the Universal Windows Platform that allows us to focus more and more on hardware innovation without invalidating the games that run on that platform.”
A new console paradigm?
Before we hit the software side of things, let’s talk about long-term trends in console development. The truth is, the gap between consoles and PCs has been shrinking, bit by bit, ever since the Nintendo NES launched in the mid-1980s. In the 1990s, consoles adopted CD-ROM and DVD-ROM technology, even if they used customized discs and encoding schemes. Microsoft’s Xbox was the first mainstream console to use an Intel CPU and Nvidia GPU; both the Xbox 360 and PlayStation 3 made integrated storage standard, even if Microsoft did technically sell SKUs without a hard drive. The PlayStation 3 could run Linux, until Sony patched it out.
Today, the Xbox One and PlayStation 4 are PCs in everything but name. They rely on commodity x86 hardware and consumer graphics cards. They’re built around low-level APIs that have much in common with their PC brethren like Vulkan, DX12, and AMD’s Mantle. This is slightly more obvious with the Xbox One, which literally runs a version of Windows, but there’s nothing at the hardware level that would prevent the PS4 from doing so as well.
more...
-
The Witness: The risks and rewards of doing lines
http://www.extremetech.com/wp-conten...01-640x360.png
The clue that there was something wrong should have been that I was writhing around on the floor, sobbing. Yet at the time it made perfect sense. When you’re ambushed by approximately half a million Tetris-style bricks that won’t stop haunting you and taunting you even as you start hitting them and screaming at them to go away, this kind of thing happens. Sometimes you just want to be left alone, safe from the torment, and be allowed to once again live in a manner close to the way you see fit. So the sweat that was pouring and the tears that were flowing just didn’t seem like that big of a deal. Scratch that: They weren’t a big deal at all.
It’s only now that I can look back on this that I realize I had a problem. A big problem. I was addicted to Blow. Or, maybe more accurately, I was hooked on doing lines.
Lines aligning
Looking at it from a distance, The Witness is not hard to unpack on its most elemental level. You begin in a dark tunnel, trudging ever forward toward a glimmer of light that eventually resolves itself into a simple shape: a circle with a short horizontal line connected to it. By selecting the circle and then dragging the pointer, you fill in the complete shape, and then something happens. In this case, a door opens and you move on, before long facing something that’s slightly more advanced (a line with a 90-degree turn). Solve that one, and before long you’re on the broad, bright, beautiful island, à la Myst, where you’ll encounter literally hundreds of puzzles like these.
https://www.extremetech.com/wp-conte...28-640x360.png
They get a lot harder very quickly, of course. Sooner than you might expect, you’re progressing from basic mazes to challenges involving separation (there must always be a line between all the black and white squares); symmetry (your line is matched, in mirror image, on the other side of the board); color pairing; connect the dots; my personal ligne noir, shape matching, in which the figures you draw must outline one or more geometrical forms you’re provided (in theory—though the difficulty I had solving these left me seriously wondering); and plenty more.
But even though its riddles rapidly evolve from mild to maddening, The Witness is structured so that you always have the essential tools you need to solve them: Built-in tutorials guide you gracefully through new concepts over half a dozen puzzles, for example, and when ideas are combined, you get a chance to acquaint yourself with the new relationship before things start getting really nutty. Blow may be a gleefully evil sadist, but he’s a fair gleefully evil sadist.
https://www.extremetech.com/wp-conte...01-640x360.png
He’s not necessarily an overly obvious one, however. During the course of your time on the island, you’ll discover what appears to be a story: a community frozen in time, with people changed into statues while doing ordinary, everyday things. Who are they? Why are they here? And can you help them—or is your destiny to become one of them? These questions are not easily answered, even if you discover many of the audio and video recordings scattered around that investigate the scientific and spiritual relationships that seem to underlie the land’s logic. Even so, you’ll ponder them as you move from one section of the island to another, either trying to track down the next hint you need or giving yourself a respite from your present frustrations by doing something easier for a few minutes.
more...
-
AMD, Razer want to standardize external GPUs, bring desktop gaming to ultrabooks
http://www.extremetech.com/wp-conten...re-640x319.jpg
Modern laptop technology has advanced enormously in recent years, as systems have gotten thinner, lighter, and more power efficient. Unfortunately, mobile gamers have been left out of most of these trends. The laws of physics aren’t kind to people who try to stuff 45-80W GPUs into ultrabook chassis. Gamers today mostly have to choose between ultraportable systems with lower-end graphics or larger desktop-replacement class hardware. There are a handful of 15-inch laptops that attempt to straddle this divide, but reviews show that they tend to run hot and noisy — an unavoidable consequence of their configurations.
AMD has a plan to solve this problem through the use of a standardized external GPU interface that would allow customers to attach a desktop graphics card via an external chassis.
https://www.extremetech.com/wp-conte...re-640x480.jpg
AMD’s Robert Hallock posted the following to Facebook, alongside a photo of the Razer Core:
Gaming notebooks are great for gaming, but nobody in their right mind wants to carry one all the time. Ultrathin notebooks are awesome to carry, but nobody in their right mind would confuse one for a gaming notebook…
External GPUs are the answer. External GPUs with standardized connectors, cables, drivers, plug’n’play, OS support, etc.
Given that he posted this next to a photo of the Razer Core, I think we can assume it’s the first chassis to implement AMD’s new idea. Since we know that device is equipped with Thunderbolt 3 and a USB-C connector, it’s fairly easy to guess what AMD has implemented.
Don’t we already have docks?
Enthusiasts have been building their own external docks for years, and we’ve covered some of those efforts on ExtremeTech. The earlier solutions used older versions of Thunderbolt, however, which means they wouldn’t deliver as much bandwidth to a high-end GPU. A Thunderbolt 3-powered solution has four lanes of PCI Express 3.0. That may not sound like much, but it’s significantly more than previous external solutions. Single-GPU bandwidth needs tend to scale only modestly upwards; you need high-bandwidth connections for multi-GPU hardware, but one card should be just fine on Thunderbolt 3.
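For a rough sense of what those four lanes are worth, here’s a small back-of-the-envelope calculation (my own illustration based on the published PCIe 3.0 signaling rate and line coding, not on AMD’s or Razer’s numbers):

// Rough PCIe 3.0 bandwidth estimate: 8 GT/s per lane with 128b/130b
// line coding, which works out to just under 1 GB/s of payload per lane.
#include <cstdio>

int main() {
    const double gt_per_s  = 8.0;            // PCIe 3.0 signaling rate per lane
    const double encoding  = 128.0 / 130.0;  // 128b/130b line coding overhead
    const double lane_GBps = gt_per_s * encoding / 8.0;  // bits -> bytes

    std::printf("PCIe 3.0 x4  (Thunderbolt 3 link): ~%.1f GB/s\n", 4.0 * lane_GBps);
    std::printf("PCIe 3.0 x16 (desktop slot):       ~%.1f GB/s\n", 16.0 * lane_GBps);
    return 0;
}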
https://www.extremetech.com/wp-conte...rari-one-2.jpg
AMD’s 2008-era solution was ATI-specific and much, much larger than Thunderbolt 3 running on USB Type-C.
Once you start poking around at existing dock solutions, some problems emerge. Alienware’s Graphics Amplifier doesn’t allow a laptop to hibernate, is only compatible with Alienware hardware, and is frankly rather large.
AMD’s phrasing in this announcement implies that its new solution is based on a standard implementation for everything, including drivers, OS support, connectors, and cables. Razer, meanwhile, has already advertised the Core as being compatible with both AMD and Nvidia graphics cards.
more...
-
HTC’s Vive VR safety recommendation: It’s dangerous to go alone, and don’t sit on the furniture
http://www.extremetech.com/wp-conten...VR-640x353.jpg
When the film and gaming industries began their brief flirtation with 3-D technology a few years back, there was a small (but occasionally raucous) debate over whether or not 3-D glasses and technology were somehow bad for people. Now, with VR technology rapidly maturing, we’re seeing similar warnings and discussions over safety. As with 3-D, some of this is common sense, and some of it’s a bit goofy.
HTC’s recommendations for using its Vive headset are a bit on the silly side, as Ars Technica reports. The company actually recommends that you only use its equipment when you have a partner nearby to spot you. Don’t try to run from encounters or dodge enemy fire by actually throwing yourself across the living room; always use the handheld controllers with their safety strap around your arm, lest you accidentally knock grandma’s teeth out; and, perhaps most notably:
https://www.extremetech.com/wp-conte.../FakeChair.png
This chair is not real. Please don’t sit in it.
“It is important to remember that simulated objects, such as furniture, that may be encountered while using the product do not exist in the real world, and injuries may result when interacting with those simulated objects as if they were real, for example, by attempting to sit down on a virtual chair.”
There’s a kernel of truth to all of this, of course. If you’re wearing a face computer, it’s probably best not to try juggling at the same time. Clearing your gaming area and making sure there aren’t any objects on the floor that could trip you up is a smart move, and truthfully, there are certain aspects of VR design that aren’t exactly intuitive. VR designers have been advised to not include certain types of environments, as they make people prone to vertigo or even falling over. While the risk of sitting down on virtual furniture and accidentally smashing grandma seems small, certain environments that are common in first person shooters, like stairs, have been known to cause vertigo. When you walk up or down stairs, your body instinctively positions itself to do so — which means not actually encountering a stair can throw your balance off. If you’ve ever expected one more step down than you actually had, you’ve felt this type of sensation.
https://www.extremetech.com/wp-conte...et-640x361.jpg
The HTC Vive.
If you’re jumping on the VR train, do practice common sense. HTC has created a “chaperone” system that lets you map the boundary of a safe area, but it doesn’t suggest relying on it for protection. Treat it more like a guideline, and you’ll have a better chance of not throwing a motion controller across the room or tripping and falling.
And don’t sit on the furniture.
more...
-
Unreal developer blasts Microsoft, claims company wants to monopolize game development
http://www.extremetech.com/wp-conten...E4-640x354.jpg
The Windows Store and the restrictions it places on games and game settings have drawn growing criticism over the past week, ever since the release of Ashes of the Singularity and Gears of War Ultimate Edition. We’ve reached out to Microsoft in an attempt to clarify some of these issues, specifically those related to V-Sync, WDDM 2.0, and the current limits that lock down Windows Store titles. Tim Sweeney, the founder of Epic Games and lead developer of the Unreal Engine, recently blasted Microsoft’s Universal Windows Platform (UWP) and called for a complete boycott of the platform.
In an op-ed for The Guardian, Sweeney describes Microsoft’s actions as an aggressive attempt to lock down the Windows ecosystem, thereby monopolizing both application distribution and commerce. He writes:
Microsoft has launched new PC Windows features exclusively in UWP, and is effectively telling developers you can use these Windows features only if you submit to the control of our locked-down UWP ecosystem. They’re curtailing users’ freedom to install full-featured PC software, and subverting the rights of developers and publishers to maintain a direct relationship with their customers.
Sweeney states that he has no problem with the Windows Store as such, but takes issue with the way Microsoft has locked down the platform. Because Microsoft controls the only distribution point for UWP applications, no other company can offer equivalent software. Side-loading can be enabled, but it’s off by default and could be removed entirely in a future Windows Update.
https://www.extremetech.com/wp-conte...ed-640x464.jpg
Epic Games founder and Unreal engine developer, Tim Sweeney
Sweeney calls on Microsoft to allow UWP applications to be distributed just as Win32 applications are now, to let any company (including Steam and GOG) distribute UWP applications, and to let users and publishers engage in commerce directly with each other without paying a 30% fee to Microsoft. It should be noted that Valve, which controls the vast majority of digital distribution on the PC, also charges a 30% fee. Sweeney continues:
This true openness requires that Microsoft not follow Google’s clever but conniving lead with the Android platform, which is technically open, but practically closed…
The ultimate danger here is that Microsoft continually improves UWP while neglecting and even degrading win32, over time making it harder for developers and publishers to escape from Microsoft’s new UWP commerce monopoly. Ultimately, the open win32 Windows experience could be relegated to Enterprise and Developer editions of Windows.
more...
-
Oculus founder: We’ll build for Apple if it ever releases a ‘good computer’
http://www.extremetech.com/wp-conten...ro-640x353.png
Virtual reality has the potential to transform PC gaming in ways not seen since 3D acceleration became affordable in the late 1990s. While we don’t recommend pre-ordering either an Oculus or Vive until reviews and performance data are both available, we’re still excited about the long-term potential for the medium. For now, however, that medium is going to be Windows-only.
ShackNews caught up with Oculus founder Palmer Luckey and asked him a question at least a few Mac fans had to be wondering about: Would the Oculus Rift support the Mac?
“That is up to Apple,” Luckey said. “If they ever release a good computer, we will do it.”
Perhaps realizing he’d just handed the entire tech industry a headline, Luckey went on to clarify his thoughts: “It just boils down to the fact that Apple doesn’t prioritize high-end GPUs. You can buy a $6,000 Mac Pro with the top of the line AMD FirePro D700, and it still doesn’t match our recommended specs. So if they prioritize higher-end GPUs like they used to for a while back in the day, we’d love to support Mac. But right now, there’s just not a single machine out there that supports it.”
more...
-
This Web-based emulator lets you play NES games in 3D — sort of
http://www.extremetech.com/wp-conten...-1-640x353.jpg
Ever wonder what your favorite NES games would look like in 3D? No, not a 3D remake — actual NES games rendered on screen with a z-axis. Well, if you’re running a recent version of Firefox, you can see for yourself by loading your very own ROMs into a WebGL-based emulator called 3DNes.
Earlier, this story from Kotaku caught my eye, and suddenly everything ground to a halt. I had to try this emulator for myself, and see some of the best games of my childhood in a rough approximation of three dimensions.
https://www.extremetech.com/wp-conte...te-640x569.jpg
By default, the emulator is populated with a link to Thwaite — a homebrew Missile Command clone. Unfortunately, this is the only ROM I could get working in the emulator as of publication. It’s very simple, and the 3D effect makes everything look extruded — not a particularly exciting aesthetic. Still, there’s something inexplicably novel about manipulating a straightforward 2D game on the fly to see everything from a different perspective.
more...
-
Coleco formally pulls out of Chameleon, retro console disappears in a puff of vapor
http://www.extremetech.com/wp-conten...on-640x355.jpg
Nostalgia is a hot-ticket item in the gaming industry, and a new game console that would run any second-through-fifth generation games via FPGA-powered emulation was a major news item last month. The company behind that announcement, Retro VGS, had been trying to crowdfund a project for months with limited success, but announced a major licensing deal with Coleco in December 2015. In February, Retro VGS displayed the so-called Coleco Chameleon, claiming that the platform could run any second-through-fifth-generation game (that’s the Atari 2600’s generation through the Nintendo 64) through FPGA (field programmable gate array) emulation of the original hardware.
Suspicions were raised, however, when photographic evidence appeared to show nothing but an SNES Jr. motherboard jammed into an Atari Jaguar plastic case. Granted, Retro VGS had already disclosed that it intended to use the old Jaguar molds for its final hardware. Then the “custom FPGA hardware” turned out to be nothing but an old PCI capture card. Coleco, which had agreed to lend its name to the project, demanded that the prototype units be investigated by independent engineers. Now that the results of that investigation are in, Coleco has withdrawn from the program and Retro VGS has gone dark altogether.
more...
-
AMD launches XConnect, partners with Intel, Razer to drive desktop gaming on laptops
http://www.extremetech.com/wp-conten...re-640x319.jpg
When it comes to gaming, laptop owners have always been stuck between a rock and a hard place. Thin and light systems offer portability and convenience, but limited gaming performance. Larger, desktop-replacement systems offer more than enough firepower for gaming, but are typically heavy and fall into the category of “transportable” rather than portable. The handful of external chassis that have been built by various vendors tend to have their own limitations: they often require a reboot when changing the graphics mode, or don’t support laptop hibernation. They’re also typically tied to either a specific product family or even a single product SKU.
https://www.extremetech.com/wp-conte...-3-640x331.jpg
AMD wants to offer mobile gamers a more flexible option, and it has partnered with Razer and Intel to make it possible. XConnect is the name of AMD’s driver support and implementation for external graphics over Thunderbolt.
AMD XConnect: External graphics done right?
Since we’re talking about the efforts of three different companies, let me break out which manufacturer contributes what.
https://www.extremetech.com/wp-conte...ct-640x353.jpg
AMD wrote drivers for its own Radeon GPUs in order to support an external graphics chassis. Radeon cards running Radeon Software 16.2.2 or later are capable of plug-and-play configuration with an external chassis. The new software can monitor which applications are running on an external GPU and offers the option to close current applications and prep the system for safe removal. Unlike previous solutions, you can connect or disconnect an external dock without rebooting the system.
https://www.extremetech.com/wp-conte...-2-640x344.jpg
more...
-
Early Alpha footage of System Shock remake looks amazing
http://www.extremetech.com/wp-conten...an-640x353.jpg
In 1994, the critically acclaimed but commercially overlooked System Shock shipped from Looking Glass Studios. It was based on a refined version of the same engine used for Ultima Underworld and Ultima Underworld II, and it offered a more sophisticated engine than either Doom or Doom II. System Shock cast you as a solitary hacker, tasked with shutting down the insane computer AI SHODAN, which had taken control of the Citadel space station.
While the original game didn’t sell well, it spawned a fan-favorite sequel, System Shock 2, in 1999, and inspired aspects of both the Deus Ex and BioShock universes. Last fall, Night Dive Studios announced it would be bringing a rebooted System Shock to PC and Xbox One, and this week at GDC the studio actually demonstrated pre-alpha footage of what it’s working on.
more...
-
Can the $399 PlayStation VR popularize virtual reality?
http://www.extremetech.com/wp-conten...VR-640x353.jpg
After months of waiting, Sony has finally spilled the beans about the PlayStation VR’s retail release. It’s shipping sometime in October 2016, it’ll cost $399, and the PlayStation Camera won’t be bundled in the standard SKU. Based on current Amazon prices, that puts the total cost of the PS4, camera, and PSVR at about $792 — roughly half the price of a PC and Oculus bundle. Is this significantly lower barrier to entry enough to propel PlayStation VR to the top, or is this headset destined for a dusty shelf?
Over at the PlayStation blog, you can see exactly what $399 will buy you. You’ll get the helmet itself, a set of earbuds, the infamous breakout box, and various cables. Sadly, the mandatory camera is nowhere to be found. However, it’s not quite as bad as it sounds. While the camera’s list price is $60, it almost never sells for that price in the real world. Amazon is currently selling it for $44.08, and I got mine on sale for just $30.
Also worth noting is the lack of bundled controllers. The PS4 itself comes with a DualShock 4 with a built-in light bar, so that will work with a wide swath of VR titles right out of the gate. However, some games will take advantage of the PlayStation Move controller. If you want that experience, you’re probably going to need to drop about $29. However, I wouldn’t rush out and buy one just yet. We’re still not sure if it’ll be mandatory for any specific title. And truth be told, I don’t expect the Move controller to get much traction.
Besides retail info, we also saw confirmation on the final specs. The screen inside the helmet is a 5.7-inch OLED display with a 1920×1080 resolution (960×1080 per eye). It supports the same 90Hz refresh rate that other helmets seem to be standardizing on, but it also supports 120Hz so that 60Hz games can be reprojected at twice the native frame rate. The field of view is, oddly, listed as “approximately 100 degrees,” so the real number is still up in the air. Latency-wise, Sony is still keeping it vague with “less than 18ms.” How much less? Who can say? We probably won’t know until we get the PSVR in our own hands for testing. And as for tracking, the helmet sports nine LEDs that allow the camera to detect your position even when you’re facing away.
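To illustrate what that reprojection mode means in practice, here’s a quick frame-timing sketch (my own arithmetic based on the figures above, not Sony’s documentation):

// Frame-time arithmetic for the PSVR's 120Hz reprojection mode: a game
// renders at 60Hz, and each rendered frame is reprojected so the panel
// still refreshes every ~8.3 ms.
#include <cstdio>

int main() {
    const double render_hz  = 60.0;   // native frame rate of the game
    const double display_hz = 120.0;  // panel refresh in reprojection mode

    std::printf("Render budget per game frame:  %.1f ms\n", 1000.0 / render_hz);
    std::printf("Display refresh interval:      %.1f ms\n", 1000.0 / display_hz);
    std::printf("Displayed frames per render:   %.0f\n", display_hz / render_hz);
    // Per-eye resolution: the 1920x1080 panel is split into two 960x1080 halves.
    return 0;
}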
more...