
Game Tech News

  1. #151
    member HiGame

    AMD lifts embargo on the Radeon R9 Fury X’s performance, shows GPU competing neck and neck with Nvidia’s GTX 980 Ti


    Ever since AMD announced Fiji and its new memory interface, there’s been one major question on the minds of enthusiasts — could this new memory architecture give Team Red the performance it needed to compete with Maxwell? Fiji, after all, builds on the R9 285 (aka Tonga), itself a modest update to AMD’s 2013 Hawaii GPU, and the underlying GCN architecture is 3.5 years old at this point. While GCN more than held its own against Kepler, Maxwell’s debut handed Nvidia a clear lead in both performance and power efficiency.

    We weren’t supposed to run these numbers publicly until the NDA lifts on the Radeon R9 Fury X, but evidently that message didn’t get out to everyone, so AMD has lifted the embargo on its own performance projections — and thus we find ourselves returning to the waters of Fiji once again to discuss the card, hopefully for the last time before we can bring you a full review.



    As the data shows, the Radeon R9 Fury X is neck-and-neck with the GTX 980 Ti across a wide spectrum of titles. This fits our general expectations for both cards, given their respective configurations and the fact that Fiji is a giant-sized implementation of the already well-known GCN architecture.

    Can you trust vendor-provided benchmarks?


    This brings up an interesting aspect of benchmarking that I don’t often get to write about (mostly because I don’t want to bore you to tears). Any time a vendor runs benchmarks, there’s always the question of whether or not the results are accurate. Generally speaking, the results that AMD, Nvidia, or Intel claim for their respective hardware are accurate, by which I mean that if you take their system configuration and test it with their settings, you’ll see the same results with some modest variation.

    The devil, in these cases, is in the details. Small shifts in detail levels and settings can significantly tilt competitive comparisons, often by 10-15% or more. The question for any given vendor comparison isn’t “Will I see these same results?” but “What settings were used to generate the data?” Since the point of a reviewer’s guide is to actually present useful information, the results are typically ballpark-accurate — and it’s easy to tell when a company is gaming the system with odd settings, since our own benchmark data will show a very different spread between the two cards.
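    To put numbers on that, here is a minimal sketch of the sanity check described above, written in Python with entirely hypothetical frame rates: compute the card-to-card spread per title from a vendor's deck, compare it against your own measurements, and flag titles where the two diverge enough to suggest unusual settings.

```python
# Sanity-checking vendor benchmark claims: compare the vendor's
# card-A-vs-card-B spread per title against independent measurements.
# All frame rates below are hypothetical, for illustration only.

vendor = {"Title A": (52.0, 51.0), "Title B": (48.0, 50.0), "Title C": (60.0, 49.0)}
ours   = {"Title A": (50.0, 51.5), "Title B": (47.0, 50.5), "Title C": (51.0, 50.0)}

def spread_pct(fps_card1, fps_card2):
    """Percentage advantage of card 1 over card 2."""
    return (fps_card1 / fps_card2 - 1.0) * 100.0

TOLERANCE = 10.0  # divergence (in percentage points) worth investigating

for title in vendor:
    v = spread_pct(*vendor[title])
    o = spread_pct(*ours[title])
    flag = "  <-- check settings" if abs(v - o) > TOLERANCE else ""
    print(f"{title}: vendor {v:+.1f}%, ours {o:+.1f}%{flag}")
```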

    The Radeon R9 Fury X’s performance, meanwhile, is just one part of the equation. Power consumption and noise also matter, and these are areas where AMD has promised some truly significant improvements. We’ll know soon whether those improvements actually materialized.

    More...

  2. #152
    member HiGame

    The Kickstarter for Shenmue 3 broke records — because Sony lied to backers


    The very first item on the FAQ states: “No, we cannot make an open world game for $2 million. Shenmue will be produced using both the funds raised from the Kickstarter and through other funding sources already secured by Ys Net Inc. We are very sorry, but due to contractual obligations, details of outside investments will not be disclosed.” That item didn’t initially exist at all — it was added days after the project launched, after the game had already blown through the $2 million mark. I believe gamers had a right to know this information up front. For those of you who doubt this, the “Can you make an open world game for just $2 million?” question exists in the June 18 version of the FAQ but is absent from the June 17 version.

    Other statements, provided to Kotaku UK directly by Sony, have confirmed that the KS campaign is essentially a marketing test — a way to gauge interest in the project.

    “We said ‘the only way this is gonna happen is if the fans speak up,’” said Gio Corsi, Sony’s director of third-party production and developer relations. “We thought Kickstarter was the perfect place to do this. We set a goal of two million dollars, and if the fans come in and back it, then absolutely we’re going to make this a reality.”

    I am not arguing against Sony making Shenmue 3 — but I feel, strongly, that gamers should have been told upfront that they were contributing to an “interest” campaign to demonstrate to Sony that there was enough interest to bankroll the rest of the title. The fault, in this case, is not with players for wanting Shenmue to exist, but with Sony itself, for not being honest and upfront about the nature of the campaign or the way in which crowdfunding was being used.

    Original Story:


    Few announcements at E3 this week were bigger than the ones Sony made at its press conference, and one of the key moments was the reveal of a crowdfunding campaign to finance Shenmue 3, the long-lost conclusion to a planned trilogy of titles that debuted on the Dreamcast. Now it’s been confirmed that Sony is actually bankrolling the project, and the entire affair is likely to leave an extremely sour taste in backers’ mouths.

    Let’s start with the obvious. The original Shenmue was widely reported to have cost $70 million, though the game’s creator, Yu Suzuki, has claimed that figure was inflated and that the real cost was $47 million. Either way, that’s far more than the $4 million stretch goal the Kickstarter set ($3.3 million raised as of this writing). Developing open-world, expansive gameplay has only gotten more expensive in the 16 years since Shenmue debuted, not less, and it stretches credulity to think that a team of developers could deliver a sprawling adventure across multiple locations (or an incredibly detailed portrayal of a single location) on less than a tenth of the original game’s budget.
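    For what it’s worth, the arithmetic behind that “less than a tenth” claim checks out; here is a quick sketch using only the budget figures cited above.

```python
# Budget figures cited above, in millions of dollars.
reported_cost = 70.0  # widely reported Shenmue budget
claimed_cost  = 47.0  # Yu Suzuki's lower figure
stretch_goal  = 4.0   # the Kickstarter's top stretch goal

# Even against the lower figure, the stretch goal is under a tenth.
print(f"vs $47M: {stretch_goal / claimed_cost:.1%}")   # ~8.5%
print(f"vs $70M: {stretch_goal / reported_cost:.1%}")  # ~5.7%
```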



    More...

  3. #153
    member HiGame

    Sony will launch 1TB PS4 in July, match Xbox One’s expanded storage


    Once Microsoft announced a 1TB version of the Xbox One, it was only a matter of time before Sony did the same. Now, both console manufacturers are planning to move to the higher storage tier, though Sony has yet to announce pricing for the new model. If previous trends hold, the 1TB model will debut at or near the same price as the existing 500GB hardware, which will likely take a price cut to clear inventory.

    Sony is billing this new version of the console as the “Ultimate Player Edition,” and is rolling it out in “select Europe and PAL territories” initially, which means no word on when the console will hit the United States. Given that such rollouts typically occur worldwide, however, we can expect a US launch in the not-too-distant future to maintain parity between the two platforms.



    Our comparison of PS4 vs. PS3 game sizes showed the need for larger storage pools more than a year ago

    Whether the new edition will contain other goodies is still unclear. “Ultimate Player Edition” implies something more than just a larger HDD, but the plans Sony filed with the FCC show a comparatively small upgrade — 8% lower maximum power consumption and a slightly smaller chassis, along with the additional storage, but nothing more substantive. It’s still not clear if 20nm versions of either console will materialize. We initially expected die shrinks for both the Xbox One and PS4, but the shifting strategies around 20nm, and the process node’s unsuitability for large, high-power products, may have put the kibosh on plans for a 20nm version of the hardware. With TSMC pushing hard for 16nm and GlobalFoundries adopting Samsung’s technology for 14nm, it’s entirely possible that the work done on 20nm planar was rolled into a 14/16nm version with a later roll-out date.

    The 1TB drives MS and Sony are offering aren’t likely to sway a lot of fence-sitters, but the size of modern games makes such storage capacities necessary. It’s not uncommon for modern titles to require 40-50GB of drive space when fully installed, which can leave a 500GB drive feeling cramped once you account for the capacity lost to the GB-to-GiB conversion and system software. There’s no word yet on whether the new drive is faster than the old model, but the advantages of upgrading the stock drive, even to an SSD, have been fairly modest. Repeated tests have shown that boot times and saved-game loads can be meaningfully accelerated (with an SSD, Bloodborne on the PS4 loaded saved games in 29 seconds, compared with 45 seconds or more on the stock drive). Over time that can add up, especially in games that do a lot of in-game loads, but the difference isn’t huge.
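    As a rough sketch of why 500GB feels cramped, consider the decimal-to-binary capacity loss plus space reserved for system software (the 90GiB reservation below is a hypothetical illustrative figure, not a published Sony number):

```python
# Drive makers count 1GB as 10**9 bytes, while installs are
# measured in binary GiB (2**30 bytes), so capacity "shrinks".
def usable_gib(drive_gb, os_reserve_gib=90):
    # os_reserve_gib: hypothetical allowance for OS and system files.
    raw_gib = drive_gb * 10**9 / 2**30
    return raw_gib - os_reserve_gib

for drive_gb in (500, 1000):
    space = usable_gib(drive_gb)
    print(f"{drive_gb}GB drive: ~{space:.0f}GiB usable, "
          f"room for ~{int(space // 45)} games at 45GiB each")
```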

    These relatively minor upgrades feel like the point updates MS and Sony are making in lieu of larger, more comprehensive overhauls. 14/16nm hardware refreshes, assuming they are in the works, could debut by Christmas of this year, or could slip into the first half of 2016 depending on yields and costs.

    More...

  4. #154
    member HiGame

    AMD’s Radeon R9 Fury X: Previewing performance, power consumption, and 4K scaling


    Today, after months of previews, leaks, and a smattering of official disclosures, AMD is launching its much-touted Radeon R9 Fury X. The new GPU packs a number of potent firsts — it’s the largest GPU AMD has ever built, the first GPU to use HBM, and the first high-end card AMD has launched to compete directly against Nvidia’s Maxwell since that architecture debuted almost nine months ago. AMD has been promising a GPU that would truly leapfrog its previous Hawaii-class cards in performance, power consumption, and noise alike.



    The perspective isn’t twisted — this GPU is small

    Unfortunately, our initial examination of the Radeon R9 Fury X is going to be more constrained than we originally planned. Due to a miscommunication with Edelman and some truly astonishing incompetence from FedEx, our sample GPU, which was supposed to arrive on Friday, actually arrived on Tuesday — less than 24 hours before this morning’s 8 AM launch. A full evaluation of the Radeon R9 Fury X under these circumstances was impossible. Instead, we’ll be previewing some initial findings and continuing to work on comprehensive testing.

    The three big questions


    Over the past few months, readers have expressed three primary concerns about the Fury X. First and most obvious: Would it match Nvidia’s overall performance? Second, would it improve on Hawaii’s power consumption or performance per watt? While AMD’s 2013-era GPUs competed fairly well against Kepler, Nvidia took an aggressive lead on overall power consumption with Maxwell. Third, would the 4GB memory buffer on the Fury X harm scaling at 4K resolutions?

    We intend to visit all of these topics in greater detail, but the data we’re going to present right now is indicative of the trends we’re seeing in every category. Let’s start with overall performance previews. All of our tests were run on a Haswell-E system with an Asus X99-Deluxe motherboard, 16GB of DDR4-2667, and Windows 8.1 64-bit with all patches and updates installed. The latest AMD Catalyst Omega drivers and Nvidia GeForce 353.30 drivers were used. Our power consumption figures are going to be somewhat higher in this review than in some previous stories — the 1200W PSU we used for testing was a standard 80 Plus unit, not the 1275W 80 Plus Platinum unit we’ve typically tested with.
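    One aside on that PSU swap: power consumption is measured at the wall, so a less efficient supply inflates every reading for the same DC load. A minimal sketch, assuming typical mid-load efficiencies for standard 80 Plus and 80 Plus Platinum units (the 350W system load is hypothetical):

```python
# Wall-socket draw = DC load / PSU efficiency, so identical system
# loads read differently through supplies of different efficiency.
dc_load_watts = 350.0  # hypothetical gaming-load draw on the DC side

typical_efficiency = {
    "80 Plus (standard)": 0.82,  # assumed mid-load efficiency
    "80 Plus Platinum":   0.92,
}

for psu, eff in typical_efficiency.items():
    print(f"{psu}: ~{dc_load_watts / eff:.0f}W at the wall")
```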

    We’ve also included results for a slightly higher-end GTX 980 Ti from EVGA, the GeForce GTX 980 Ti SC+ ACX 2.0+, with a $679 price tag (up from the $649 MSRP on the standard GTX 980 Ti). The EVGA card ships with a factory overclock, which should give it a modest edge over the reference design.



    BioShock Infinite has historically been a hair faster on Nvidia hardware than on AMD’s, but Fury X closes the gap here, rocketing forward to tie the GTX Titan X reference design at both 1080p and 4K. The super-clocked variant from EVGA is still a hair faster, but also a touch more expensive, at $679 compared with $649. The Fury X isn’t going to match the Radeon R9 295X2, but it’s 36% faster than the R9 290X at 1080p and a whopping 70% faster at 4K.
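    Those percentages fall out of a simple relative-performance calculation; here is a short sketch with hypothetical frame rates chosen only to reproduce the cited ratios (the published charts have the real numbers).

```python
# "X% faster" is just (new_fps / old_fps - 1) * 100.
def pct_faster(new_fps, old_fps):
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical fps pairs, picked to match the ratios quoted above.
print(f"1080p: {pct_faster(136.0, 100.0):.0f}% faster")  # ~36%
print(f"4K:    {pct_faster(51.0, 30.0):.0f}% faster")    # ~70%
```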



    In Shadow of Mordor, the R9 Fury X lands between the GTX 980 and the 980 Ti at 1080p, though closer to the latter. At 4K, however, the tables turn a bit — here, the R9 Fury X is faster than any other single-GPU solution save the overclocked EVGA GTX 980 Ti, to which it narrowly loses.

    The degree to which the Fury X can match the GTX 980 Ti varies somewhat from game to game, including in titles we haven’t finished testing yet. At its best, the card offers performance equivalent to the GTX 980 Ti’s when tested at our standard detail levels and configurations, but it doesn’t win every benchmark. It’s always faster than the GTX 980, however, at least in everything we’ve had a chance to test.

    More...

  5. #155
    member HiGame

    Batman: Arkham Knight is a mess on PC, surprisingly solid on consoles


    Batman: Arkham Knight, the final game in Rocksteady’s Arkham trilogy, was released this week to massive critical acclaim. Our sister site IGN gave it a 9.2 out of 10, and it currently has a Metacritic score of 90 out of 100 — high praise. And while the console versions of the game are performing surprisingly well, the PC release seems to be something out of one of Scarecrow’s waking nightmares.

    More...

  6. #156
    member HiGame

    Report claims Warner Bros knew exactly how terrible Arkham Knight for PC was, months before it shipped


    The fallout and clean-up over Arkham Knight have continued — a recent PC patch has addressed some issues with the game — but a new report today suggests that, far from being caught off-guard by the terrible PC version, Warner Brothers was fully aware of the cesspool of code it was dealing with. The problems, according to anonymous sources, stemmed from a combination of steep learning curves, outsourced work, and precautions Rocksteady took to safeguard against story leaks.

    According to Kotaku, which spoke to the sources, the same dev teams responsible for writing the primary game code were also responsible for squashing bugs, which made it nearly impossible for the team to fix and code simultaneously. Attention didn’t turn to bug-squashing until the game was all but finished, badly delaying the process. In addition, the console version of the game was apparently broken for months, with top priority given to banging that code into shippable shape.

    The new consoles were supposed to be easier to work with than their predecessors, but that apparently wasn’t much help to Rocksteady. Whether because of the game’s ambitious visuals, the inclusion of the Batmobile, or simple lack of familiarity with the hardware, the console version of Batman went months over schedule, and this may have cut into the resources available for bug-fixing the PC version. Primary development of the PC version was handled by an outside studio, and Kotaku notes that such arrangements tend to lead to lower-quality PC releases as a whole. Small studios, it seems, take on this kind of work in hopes of getting their own independent projects funded at a later date, and are often overwhelmed by the difficulty of developing such complicated projects.

    More...

  7. #157
    member HiGame

    Testing illustrates noise problem with at least a few Radeon Fury X cards


    When AMD’s Fury X launched last week we, along with multiple other reviewers, made note of the odd cooler whine coming off the GPU. When we initially spoke to AMD about the problem, we were told: “Yes, AMD received feedback that during open bench testing some cards emit a mild ‘whining’ noise. This is normal for most high speed liquid cooling pumps; usually the end user cannot hear the noise as the pumps are installed in the chassis, and the radiator fan is louder than the pump… The issue is limited to a very small batch of initial production samples and we have worked with the manufacturer to improve the acoustic profile of the pump. This problem has been resolved and a fix added to production parts and is not an issue.”

    In our initial preview, we said that we’d requested an updated card from AMD and hoped to test the new cooler; at the time, we were under the impression that the affected cards had gone out to press only. AMD has now clarified that at least a few retail cards also ended up with the incorrect type of cooler, and with higher-than-expected noise levels as a result.

    PC Perspective, however, took things a step further and bought a pair of retail cards. Their verdict? Both retail cards have different audio profiles from the AMD-supplied Radeon Fury X, and neither appears to solve the problem. When we contacted AMD about reports of audible noise persisting in consumer cards, the company clarified that while the problem only affected a small number of GPUs, those cards shipped to both press and retail outlets. In short, some of the day-one cards available from major retailers also had issues, though AMD maintains that the number of affected retail models is very small compared to the total shipped.



    More...

  8. #158
    member HiGame

    Windows 10 no longer launching on July 29 for vast majority of people


    When AMD accidentally leaked that Microsoft would launch Windows 10 this summer, it surprised a number of people. Even at the time, midsummer was near enough to seem unlikely, particularly given how badly Redmond bungled Windows 8. Microsoft later confirmed that the date was genuine, however, and has been riding hell for leather to finish the OS in time to make the ship date. Now, the company has given notice that the July 29 date is more of a guideline than an actual hard launch.

    Terry Myerson has the details over at the Windows 10 blog. Microsoft hasn’t given manufacturers a final build yet (what’s known as the release-to-manufacturing, or RTM, build). This is the build you’d typically buy in stores or download. Then, Microsoft will distribute a build to retailers to help them “assist their customers with upgrades of newly purchased devices that were originally imaged with Windows 8.1.” That’s a little odd, since MS typically has just one RTM, but fine.

    Here’s where things take a left turn: “Starting on July 29, we will start rolling out Windows 10 to our Windows Insiders. From there, we will start notifying reserved systems in waves, slowly scaling up after July 29th.” That’s the opposite of Microsoft’s initial statement on the topic. When Redmond first announced the July 29 launch date, it said: “On July 29, you can get Windows 10 for PCs and tablets by taking advantage of the free upgrade offer, or on a new Windows 10 PC from your favorite retailer.”

    Why launch Windows 10 this way?


    If we had to guess, we’d bet Microsoft is launching Windows 10 in staggered fashion because the OS simply isn’t ready. Microsoft will roll the product out to Windows Insiders first, because they’ve already signed up for beta testing. Similarly, it can begin shipping the OS out on qualified hardware because Dell, HP, and Asus will have done the necessary testing to make certain that drivers and hardware are all ready for the new operating system.

    The fact that retailers are getting a separate version of the OS from the one manufacturers receive further implies that compatibility is the sticking point here, as do Myerson’s comments that shoppers should “Look for this sticker for assurance that our OEM partners have proactively tested a device for compatibility with Windows 10.” Finally, Microsoft will update customers in waves to tell them when they can download Windows (this, apparently, is what that Windows Update a few months back was for).



    The upshot of this is that Microsoft almost certainly mistimed its own launch, and now will have to do a staggered rollout to deal with compatibility and driver support rather than simply shipping the OS in the first place. Granted, those of us getting a copy of the OS for free probably don’t have too much room to complain, but the uncertain timeline, the total lack of information regarding which hardware or devices might not be compatible at launch, and the backtracking all leave a bad taste in our mouths.


    More...

  9. #159
    member HiGame

    Why do some games get the ‘downgrade’ treatment?



    There’s been a lot of hubbub about so-called “downgraded” games in recent years. Developers like Ubisoft Montreal and From Software have received a lot of heat from fans over the fact that some of their finished games don’t look nearly as good as early gameplay videos made them out to be. What causes this phenomenon, and why haven’t developers learned their lesson about overpromising and underdelivering? Let’s take a look.

    Unknown hardware

    In the case of Ubisoft’s Watch Dogs, much of the blame can be laid at the feet of unfinished console hardware. The early footage of the game was truly outstanding, but the final game is clearly missing specific lighting effects. The game was shown off at E3 2012 — a year and a half before the current generation of consoles shipped to consumers. Frankly, it seems the team overestimated what the hardware would be capable of, and it had to walk back the effects and population density in the face of real-world limitations.


    More...

  10. #160
    member HiGame

    AMD Radeon R9 Fury review: Splitting Nvidia’s GTX 980 and 980 Ti in performance


    At E3 last month, AMD announced that it would launch multiple GPUs under its new Fury brand. First up was the Fury X, a $649 card meant to compete with the GTX 980 Ti and sporting its own custom water cooler. Today, the company is launching its follow-up to the Fury X, the $549 Radeon R9 Fury. This new card uses the same base Fiji GPU as the Fury X, but with fewer stream processors (3584, as opposed to 4096). The modest reduction in total compute units is matched by a proportional cut to texture mapping units (down to 224, from 256), but the total number of ROPs stays the same, at 64. The Radeon Fury’s clock speed has been cut slightly, to 1GHz (down from the Radeon Fury X’s 1050MHz), but the GPU packs the same 500MHz, 4096-bit HBM interface, 275W maximum board power, and dual 8-pin PCIe connectors.
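    Those cut-down figures follow directly from GCN’s compute-unit structure: each compute unit carries 64 stream processors and 4 texture mapping units, so the specs scale with the CU count. A quick sketch:

```python
# In GCN, each compute unit (CU) contains 64 stream processors
# and 4 texture mapping units; specs scale linearly with CU count.
def gcn_specs(compute_units):
    return {"stream_processors": compute_units * 64,
            "tmus": compute_units * 4}

print("Fury X (64 CUs):", gcn_specs(64))  # 4096 SPs, 256 TMUs
print("Fury   (56 CUs):", gcn_specs(56))  # 3584 SPs, 224 TMUs
```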



    One of the factors that sets the new Radeon R9 Fury apart from the Fury X is the size of the card. While neither the Sapphire Tri-X nor the Asus Strix R9 Fury is that much bigger than other high-end air-cooled GPUs, they’re far larger than AMD’s diminutive Radeon Fury X. Granted, that GPU used a water cooler while the Strix (the card we have in-house) is air-cooled, but it’s not just the cooler that’s large — Asus mounted the Fury on a standard-length high-end PCB as well.



    The resulting card is the Asus Strix R9 Fury DirectCU III OC, but don’t let the “OC” get your hopes up. AMD’s reference card is clocked at 1GHz standard, while the Strix clocks in at a maximum of 1020MHz out of the box. That 2% overclock isn’t going to push the envelope, and like the Fury X, the Fury isn’t expected to have much overclocking headroom. One thing to like about the R9 Fury Strix, particularly if you have older monitors, is its wide range of ports. Unlike the Sapphire version of the card, which offers 3x DisplayPort and 1x HDMI, the Strix packs 3x DisplayPort, 1x DVI-D, and 1x HDMI.



    According to Asus, the GPU cooler is designed to maintain a maximum temperature of 85C. That’s not nearly as low as AMD’s 50C target for the Fury X, but for an air-cooled card, 85C is quite good. It’s particularly impressive given that AMD’s last high-end air-cooled cards, the R9 290 and R9 290X, often ran right up to their 95C thresholds. Asus is bringing the Strix R9 Fury to market at $579, marginally higher than the $549 AMD is targeting for the R9 Fury in general. The heatsink and attached GPU are huge compared to previous cards, at 11.75 inches long and with significant cooler overhang.



    More...

