
Game Tech News

  1. #131
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    The Witcher 3 can’t maintain 30fps on either the PS4 or Xbox One


    The Witcher 3 has been both critically and commercially successful since its release last week, but it’s not without its issues. Most notably, the frame rate on the PS4 and Xbox One has been sub-par. Nobody seriously expected it to hit 60fps on either console, but it seems that it can’t even maintain a solid 30fps. We’re a year and a half into this console generation, and AAA developers still can’t keep the frame rate up.

    The team at Digital Foundry did a brief comparison of the PS4 and Xbox One versions of the game, but the results are far from clear-cut. As you may remember, the PS4 version runs at 1080p, but the Xbox One uses dynamic resolution scaling to switch between 900p and 1080p depending on how graphically intensive the scene is. And while the PS4 is capped at 30fps, the Xbox One is not.
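    For readers unfamiliar with how dynamic resolution scaling works, the sketch below shows the general idea: the engine watches frame times and sheds pixels when it falls behind its 30fps budget. The thresholds and the 900p/1080p steps are illustrative assumptions; CD Projekt has not published the heuristic The Witcher 3 actually uses.

    ```python
    # Toy dynamic-resolution controller: pick a vertical render resolution from the
    # last frame's cost against a 30 fps (33.3 ms) budget. Thresholds are invented
    # for illustration, not taken from the game.
    def choose_render_height(frame_time_ms, current_height,
                             budget_ms=1000 / 30, low=900, high=1080):
        if frame_time_ms > budget_ms:          # blew the 30 fps budget: shed pixels
            return low
        if frame_time_ms < budget_ms * 0.85:   # comfortable headroom: restore 1080p
            return high
        return current_height                  # borderline: hold the current resolution

    # Example: a heavy scene taking 36 ms per frame forces a drop to 900p
    print(choose_render_height(36.0, current_height=1080))  # -> 900
    ```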

    More...

  2. #132
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    AMD launches new top-end A10-7870K with faster CPU, GPU clock speeds


    When Kaveri debuted 15 months ago, it offered a number of impressive under-the-hood improvements for heterogeneous compute workloads, a refreshed CPU core, and the best integrated graphics performance available on any platform. AMD has already announced that its Carrizo and Carrizo-L APUs will ship in volume beginning this quarter, but those are mobile-only parts. Today, the company is also launching the A10-7870K — a new, updated version of the A10-7850K with tweaked CPU clock speeds and a significantly faster GPU.

    The speeds and feeds are self-explanatory. The A10-7870K will offer a 5.4% faster base clock and a 2.5% faster boost clock for the CPU cores. In practice, these gains are likely below the threshold of perception for most users, though they’ll provide a slight boost to benchmark results. The larger increase is on the GPU side, where the clock frequency has jumped to 866MHz, an increase of 20%. TDP for this part is unchanged, at 95W.
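    For reference, the quoted percentages line up with the published clocks of the two chips. A quick sanity check (the clock figures below are from AMD's spec sheets for the A10-7850K and A10-7870K, not from this announcement):

    ```python
    # Clock deltas between the outgoing A10-7850K and the new A10-7870K.
    a10_7850k = {"cpu_base_ghz": 3.7, "cpu_boost_ghz": 4.0, "gpu_mhz": 720}
    a10_7870k = {"cpu_base_ghz": 3.9, "cpu_boost_ghz": 4.1, "gpu_mhz": 866}

    for key in a10_7850k:
        gain = (a10_7870k[key] / a10_7850k[key] - 1) * 100
        print(f"{key}: +{gain:.1f}%")
    # cpu_base_ghz: +5.4%, cpu_boost_ghz: +2.5%, gpu_mhz: +20.3%
    ```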



    That’s going to pay some GPU performance dividends for AMD, though these gains will be relatively constrained by the A10-7870K’s memory bandwidth limitations. While the APU’s graphics performance is still the best you can get from an integrated chip, the dual-channel DDR3 memory interface puts the brakes on absolute performance.

    There’s actually a bit of good news on that front if you’re considering the APU option. DDR3-2400 still commands a price premium, but the cost of 8GB of DDR3-2400 with decent timings (11-13-13-30) is just $56 over at Newegg. Adding high-speed memory to an APU meaningfully improves its overall performance — with DDR3-2400, AMD predicts 2x the GPU performance of a single-channel configuration with DDR3-1600.



    That said, there’s a definite drop-off as frequencies rise — DDR3-2400 is only estimated by AMD as 11% faster than DDR3-1866, despite clocking in at 28% faster in terms of sheer clock speed. Latencies can also impact how much benefit you get from higher RAM clocks. If you’re considering an APU, check memory prices and plan for what meets your own needs.
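    To put rough numbers on that bandwidth ceiling, here is a back-of-the-envelope calculation of theoretical peak DDR3 bandwidth. Real-world throughput is lower, but the ratios show why faster DIMMs matter so much to an APU that feeds its GPU from system memory:

    ```python
    # Theoretical peak DDR3 bandwidth: transfers per second * 8 bytes per 64-bit
    # channel * number of channels. Actual sustained throughput will be lower.
    def ddr3_peak_gb_s(transfers_mt_s, channels=2):
        return transfers_mt_s * 8 * channels / 1000

    print(f"DDR3-1600, single channel: {ddr3_peak_gb_s(1600, channels=1):.1f} GB/s")  # 12.8
    for speed in (1600, 1866, 2400):
        print(f"DDR3-{speed}, dual channel: {ddr3_peak_gb_s(speed):.1f} GB/s")  # 25.6 / 29.9 / 38.4
    ```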

    AMD claims that the A10-7870K is targeting gamers with sub-1080p monitors, which are apparently common in the Asia-Pacific region, and will therefore include support for the virtual super resolution (VSR) capability that debuted with AMD’s Catalyst Omega driver. VSR allows the GPU to render internally at a higher resolution before downscaling the output to the display’s native resolution. We suspect performance in this mode will depend on how bandwidth-hungry the internal rendering is — some titles will likely handle the jump more gracefully than others.



    AMD is claiming that the A10-7870K will beat a Core i3 + GeForce GT 740 when the APU is paired with DDR3-2400 memory. Obviously such claims should be taken with a grain of salt — the GT 740 is a very low-end Nvidia card. Still, in certain cases, a single APU can outperform some CPU + GPU pairs.

    Modest improvements, better pricing


    The A10-7870K isn’t likely to change anyone’s opinion on APUs — you either like them, or you don’t — but 15 months post-launch, AMD has offered a number of improvements to the overall Kaveri package. Kaveri supports FreeSync, and the frame-smoothing benefit of that mode is most apparent at lower frame rates (30 FPS with FreeSync / G-Sync looks much better than 30 FPS without it, whereas a sustained 120 FPS needs far less help). Adding in-driver support for virtual super resolution is another nice touch, as is the increased GPU clock speed.

    Meanwhile, the entire package is available at a much more reasonable price of $137, as opposed to the A10-7850K’s launch price of $180. At that price, there’s a solid argument for an all-in-one APU with superior integrated graphics over an Intel Core i3 setup, which offers a faster CPU but weaker graphics.


    More...

  3. #133
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    How to use a PS4 DualShock 4 to play PC games




    Back when Sony first announced that the DualShock 4 would work with Windows, PC gamers across the internet cheered in unison. Unfortunately, it’s not exactly the plug-and-play solution we had hoped for. While it’s true that the PS4’s controller is recognized by Windows (and OS X) right out of the box, existing games don’t automatically work with the gamepad. Thankfully, there is an application available that maps the DualShock 4’s controls directly to Microsoft’s XInput API — effectively tricking games into thinking you’re using an Xbox 360 controller.
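    For the curious, this is the interface being emulated: games poll XInput for a controller state every frame, and InputMapper simply answers those polls on the DualShock 4’s behalf. The sketch below reads XInput directly via ctypes; it assumes a Windows 8 or later machine (where xinput1_4.dll ships with the OS) and is only meant to show what games see, not anything InputMapper itself exposes.

    ```python
    import ctypes
    from ctypes import wintypes

    # Minimal layout of the XINPUT_GAMEPAD / XINPUT_STATE structures from Xinput.h
    class XINPUT_GAMEPAD(ctypes.Structure):
        _fields_ = [("wButtons", wintypes.WORD),
                    ("bLeftTrigger", ctypes.c_ubyte),
                    ("bRightTrigger", ctypes.c_ubyte),
                    ("sThumbLX", ctypes.c_short),
                    ("sThumbLY", ctypes.c_short),
                    ("sThumbRX", ctypes.c_short),
                    ("sThumbRY", ctypes.c_short)]

    class XINPUT_STATE(ctypes.Structure):
        _fields_ = [("dwPacketNumber", wintypes.DWORD),
                    ("Gamepad", XINPUT_GAMEPAD)]

    xinput = ctypes.windll.xinput1_4  # bundled with Windows 8 and later

    def poll_pad(slot=0):
        """Return the gamepad state for an XInput slot, or None if nothing is connected."""
        state = XINPUT_STATE()
        if xinput.XInputGetState(slot, ctypes.byref(state)) != 0:  # 0 == ERROR_SUCCESS
            return None
        return state.Gamepad

    pad = poll_pad()
    if pad is not None:
        print(f"buttons=0x{pad.wButtons:04x} left stick=({pad.sThumbLX}, {pad.sThumbLY})")
    else:
        print("No XInput controller detected on slot 0")
    ```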

    To get started, head over to the InputMapper website and download the latest version. From there, simply run the installer and agree to the terms and conditions. Once everything is finished, connect your DualShock 4 to your PC over USB or Bluetooth. Keep in mind that the USB method is much more reliable. The wireless connection is infamously flaky on PC, and it seems to depend heavily on your chipset and drivers. If you’re having issues, default to the USB connection and close all other applications.



    Next, launch the InputMapper app. At this point, the DualShock 4 should be recognized in the application, and it should function identically to an Xbox 360 controller in most cases. A handful of games still have compatibility issues, however; if so, you may need to go into the settings and toggle the checkbox labeled “Use Exclusive Mode.” Otherwise, the standard controller functions are fully operational.



    If you’d like to customize your experience a bit, open the Profiles pane and you’ll be brought to a window filled with buttons and sliders. Here, you’ll be able to change the color of the light bar, remap your buttons, tweak the rumble settings, and even set up custom macros.

    Keep in mind, this is still a work in progress. As it stands, you still need to launch the program every time you want to play a game. It’s a bit of a hassle, but that’s a relatively minor issue, since you can set it to launch minimized whenever Windows boots up. It’s also worth noting that this app is designed to mimic the Xbox 360 controller, so the button graphics in-game will show as A and B instead of X and O. If you’re looking for seamless integration, you need to either use a PS4, or wait for more PC games to ship with native DS4 support.

    If you already use an Xbox 360 controller on your PC, there’s no reason to switch unless you have a strong personal preference for the feel of the DualShock 4. Getting everything up and running takes a little bit of effort, and it’s still more prone to issues than the Xbox One or 360 controllers. Most PC gamers shouldn’t bother taking the plunge, but this is a viable solution for anyone dead set on playing PC games with Sony’s new controller.


    More...

  4. #134
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    AMD updates Catalyst beta drivers, reportedly prepping new ‘Fury’ brand to take on GTX Titan


    AMD has launched new Catalyst beta drivers with specific improvements for Project Cars and The Witcher 3, and may be prepping an entirely new brand name for its upcoming Radeon with High Bandwidth Memory onboard. The new drivers target two games that have been in the news lately — we covered the dispute over GameWorks in some depth — but the Project Cars situation involved a war of words between the developer (who alleged AMD vanished off the radar and did no testing) and AMD, which generally disputes these allegations.

    AMD is claiming performance improvements of up to 10% for The Witcher 3 and up to 17% for Project Cars when used with single-GPU R9 and R7 products. The Witcher 3’s Crossfire performance has also been improved, though AMD still recommends disabling post-processing AA in Crossfire for best visuals.

    The GameWorks optimization question


    One thing I want to touch on here is how driver releases play into the GameWorks optimization question. When AMD says it can’t optimize for GameWorks, what that means is that it can’t optimize the specific GameWorks effects themselves. In other words, in The Witcher 3, AMD can’t do much to improve HairWorks performance.

    AMD can still optimize other aspects of the game that aren’t covered by GameWorks, which is why you’ll still see performance improve in GW-enabled titles. ComputerBase.de took the new drivers for a spin in The Witcher 3 and saw modest frame rate increases.



    Improvements like this often depend on the specific video card and the options enabled, so results can swing depending on the preset used. AMD has published its own optimization guide for The Witcher 3 for users looking to improve game performance.

    Upcoming Fiji to be sold as Radeon Fury?


    Meanwhile, the rumor mill is predicting AMD won’t brand the upcoming Fiji GPU as an R9 390X, but will instead sell the card as the Radeon Fury. Whether this is accurate is an open question, but it makes some sense — AMD pioneered the use of flagship CPU branding with the Athlon 64 FX back in 2003, and while it’s never had a flagship GPU brand, Nvidia’s Titan demonstrated that there’s clearly a use for such products.

    The name “Fury” also has some history behind it. Back when AMD’s graphics division was an independent company called ATI, its first popular line of 3D accelerators was branded “Rage,” and the Rage Fury was the top-end part. A later card, the Rage Fury Maxx, actually implemented alternate frame rendering (AFR) across two GPUs, but driver issues and compatibility problems sullied the brand somewhat. When ATI launched the Rage series’ successor, it adopted a new name — Radeon.

    Radeon Fury, if true, is a nice callback — and the performance from Fiji is rumored to be furious. At 4,096 cores and an expected 500GB/s of memory bandwidth, AMD’s new GPU is going to be serious competition for Nvidia — including, just possibly, the Nvidia Titan X.
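    Assuming the memory configuration the rumor mill describes (four first-generation HBM stacks, each 1024 bits wide at 1 Gbps per pin; those figures come from the HBM1 spec and contemporary leaks, not from AMD), the arithmetic lands right around that bandwidth number:

    ```python
    # Peak bandwidth of a rumoured Fiji-style HBM setup: stack count * bits per
    # stack / 8 bits-per-byte * per-pin data rate in Gbps.
    def hbm_peak_gb_s(stacks=4, bus_bits_per_stack=1024, gbps_per_pin=1.0):
        return stacks * bus_bits_per_stack / 8 * gbps_per_pin

    print(f"{hbm_peak_gb_s():.0f} GB/s")  # 512 GB/s, in the ballpark of the ~500GB/s rumour
    ```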

    More...

  5. #135
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    Nvidia GTX 980 Ti Review: Titan X performance at a fraction of the price



    Today, at Computex (it’s already 6 AM on June 1 in Taiwan), Nvidia is launching its next high-end GPU. Just as the GTX 780 followed the original Titan, the GTX 980 Ti will slide in between the Titan X and the upper-end “standard” consumer card, the GTX 980. This new card is based on the same GM200 GPU as the Titan X, but trims the VRAM buffer down to 6GB from Titan X’s 12GB. The cooler design is outwardly identical to the shroud and fan that Nvidia has deployed since it first unveiled the GTX Titan.



    Overall, the GTX 980 Ti is a very modest step down from what the Titan X offers. It has 22 SM clusters as opposed to Titan X’s 24, for a total of 2816 GPU cores (a roughly 8% reduction). Trim the texture units by the same ratio (176 as opposed to 192), keep the total number of ROPs the same (96), then cut the RAM in half, for a total of 6GB down from 12GB, and voila — you have the GTX 980 Ti.



    The memory clock, base clock, and boost clock on the 980 Ti are all identical to Titan X, as is its pixel fill rate. Texture rate is down slightly, thanks to the decreased number of texture mapping units. Both chips have a 384-bit memory bus. Nvidia has promised that the 980 Ti has full access to its memory pool, and that overall GPU memory bandwidth should be in line with Titan X. We see no evidence of any memory-related issues, and the 6GB memory buffer gives the chip room to breathe in any case.
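    The headline numbers are easy to verify from Nvidia's published figures (a 384-bit bus, 7 Gbps GDDR5, and 128 CUDA cores per Maxwell SM):

    ```python
    # Peak GDDR5 bandwidth: bus width in bytes * per-pin data rate in Gbps.
    def gddr5_peak_gb_s(bus_width_bits, gbps_per_pin):
        return bus_width_bits / 8 * gbps_per_pin

    print(gddr5_peak_gb_s(384, 7.0))           # 336.0 GB/s for both Titan X and the 980 Ti
    print(22 * 128, 24 * 128)                  # 2816 vs. 3072 CUDA cores
    print(f"{(1 - 2816 / 3072) * 100:.1f}%")   # an 8.3% reduction
    ```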

    On paper, the GTX 980 Ti packs virtually all of the Titan X’s punch at a much lower $649 price.

    More...
    Game Tech News || Trading blogs || My blog

  6. #136
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    Nvidia at Computex: GameWorks goes VR, G-Sync comes to mobile


    We’ve already covered the launch of Nvidia’s GTX 980 Ti, the $649 killer follow-up to the GTX Titan X, but that’s scarcely the only thing Nvidia is talking about this week. The company has kicked off a fairly massive set of announcements, with fresh information on multiple projects and programs. We’ll start with G-Sync, Nvidia’s program to improve game experiences by smoothing frame delivery.

    G-Sync goes mobile, windowed


    The last time we talked about G-Sync, it was to discuss claims that mobile G-Sync could be enabled by something as simple as a driver update (and, at the time, mostly debunk them). Nvidia told us at the time that mobile G-Sync absolutely was on the company’s long-term radar, and it’s now ready to talk about that feature in more detail.

    Nvidia will bring G-Sync to laptops that use panels approved for the technology. Panel capability is important — the display needs to support the variable refresh timing and other aspects of the embedded DisplayPort (eDP) standard that the mobile version of G-Sync relies on. eDP is the standard that AMD first used to demonstrate FreeSync on mobile devices several years ago, and Nvidia has confirmed that while it will offer G-Sync on mobile hardware, it won’t need dedicated scalers or other proprietary ASIC implementations to deliver the feature.



    Nvidia has said that it’ll institute a licensing-and-qualification procedure for displays, which implies that this will be a custom, boutique-oriented capability. G-Sync is also coming to windowed applications, presumably in an upcoming driver update; up until now, G-Sync has been a fullscreen-only feature. Windowed G-Sync will also work in two-way SLI configurations (but nothing higher) and can be enabled or disabled on a per-application basis if games don’t play nice with the feature.



    The one caveat to this support is that Optimus — Nvidia’s battery-saving technology that uses the CPU’s integrated GPU to drive the desktop and 2D work on battery, and the full GeForce GPU for gaming — will not be compatible with mobile G-Sync. The two features will be mutually exclusive. Nvidia believes that Maxwell’s idle power consumption is good enough not to harm battery life much (and most gamers don’t play on battery for long periods anyway), but we’ll have to wait for shipping hardware to check this.

    GameWorks goes VR


    VR is steadily advancing as the Next Big Thing in gaming (depending on who you believe, at least), and Nvidia is determined to carve out a piece of the action for itself. The company is launching what it calls Multi-Resolution Shading to improve VR performance and is building the capability into GameWorks as another library.



    One of the problems with VR as compared with standard screen rendering is that the lenses we look through for VR require warped images to match the display’s optics. If you render the game as if it’s being displayed on a conventional LCD panel, then crop the image to match the viewing field, you’ll wind up rendering many pixels that never get displayed. The other problem is that detail gets poured into screen edges that don’t require it — the human eye sees much more detail dead-center than in its periphery, and at the viewing distance of a VR headset, this difference matters.



    What Nvidia does with its multi-resolution viewport technology is subdivide the image into a high-detail central region and lower-detail peripheral regions. Detail can then be concentrated where it’s needed, depending on where the player is focused at any given point in time. This kind of approach, according to Nvidia, can improve pixel shader performance by 1.3x – 2x depending on the title and the degree of detail in the scene.
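    Here's a crude model of where those savings come from. The 60% full-resolution center and half-resolution periphery below are invented for illustration (Nvidia hasn't published its parameters); the point is simply how quickly shading work drops once the edges render at a reduced scale.

    ```python
    # Rough pixel-savings model for a multi-resolution viewport: one full-resolution
    # centre region plus a periphery rendered at a reduced scale.
    def pixel_savings(width, height, center_frac=0.6, edge_scale=0.5):
        full = width * height
        center = (width * center_frac) * (height * center_frac)
        periphery = (full - center) * edge_scale ** 2
        return full / (center + periphery)

    # Per-eye render target roughly in the range early VR headsets call for;
    # the result lands inside Nvidia's claimed 1.3x - 2x window.
    print(f"{pixel_savings(1512, 1680):.2f}x fewer pixels to shade")  # ~1.9x
    ```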



    I’m curious to see what the approach can deliver. I realize that people are going to bristle at the idea of purposefully rendering display data at less-than-perfect resolution — I used to bristle at the idea of gaming across three displays, where the output pushed to the outer monitors is often blurred or distorted. In reality, however, these facts turned out not to matter much. The center display, where my eyes were focused, was perfectly clear — the information coming in from the periphery was useful, but the distortion didn’t bother me when actually playing a game.

    Nvidia may have its finger on an approach that could yield real performance dividends in certain cases, but the fact that it’s sandboxed behind GameWorks could give some pause. Because multiprojection is already an Nvidia technology, there’s little to be gained from preventing AMD from using it — but sandboxing games into different VR implementations could effectively balkanize a burgeoning new field of gaming. AMD has been talking up its own LiquidVR initiative and the ability of its GCN-derived Radeons to render VR environments extremely efficiently. We’ll see which company has the better overall approach once both hardware and software are on the market.

    More...
    Game Tech News || Trading blogs || My blog

  7. #137
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    Nintendo may use Android for its next console – so what?


    Update: Nintendo has officially denied the Android rumor. However, we’ve seen Nintendo deny rumors before — only to turn around months later and confirm that they were true all along. As always, we should treat the original rumor and Nintendo’s denial with skepticism.

    Nintendo has only mentioned its next hardware platform (codenamed “NX”) in passing, but the rumor mill is already ramping up quickly. The latest rumor focuses on the NX’s operating system, with one source claiming that Nintendo actually plans to use Android as the foundation. If true, this could have huge repercussions, but probably not in the way you’d assume.

    Earlier this week, the Tokyo-based Nikkei news outlet reported that the NX will be running Android in some way, shape, or form. Assuming the Nikkei’s source is right, it could mean a number of things for the future of Nintendo. First and foremost, it would likely mean switching away from the PowerPC CPUs Nintendo has favored for its consoles since the GameCube. Android does support x86 and MIPS processors, but ARM is obviously the platform of choice here. This would lend credence to the oft-mentioned idea that Nintendo plans to unify its handhelds and home consoles under a single banner.



    Of course, the mere mention of Android in the same breath as a gaming console conjures memories of the Ouya debacle. There’s no reason for doom and gloom here, though. If anything, I’ll bet that the NX will run a variant of Android in the same way that the PS4 runs a variant of FreeBSD: it’s a solid, open-source foundation on which Nintendo can build whatever it wants. After all, why bother reinventing the networking stack if a free implementation is staring you in the face?

    More...
    Game Tech News || Trading blogs || My blog

  8. #138
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    Sony plans 1TB PS4 launch — will the company drop the price this year?


    E3 is right around the corner now, so countless leaks and rumors are starting to pop up in the world of gaming. It’s unlikely that we’ll see any major hardware announcements in LA this year, but a new iteration of the PS4 has just been confirmed by FCC certifications. This refresh looks like it only contains small changes, but it might actually be the first hint of a PS4 price drop.

    Earlier this week, the FCC posted numerous documents relating to a new revision of the PS4. By far, the biggest news is the existence of a new 1TB model. In addition, the PSU is now rated for a lower wattage and amperage, and it weighs about half a pound less. This clearly points to a smaller, more efficient iteration on the existing internals.



    Of course, lower power consumption isn’t going to move the needle in the marketing department, but being able to cut costs is a huge part of the console life cycle. The lower the manufacturing costs, the more profit potential your console has. And with lower overhead comes the inevitable price drop. If Sony is looking to goose the PS4’s adoption rate, a price drop is exactly what it needs.

    After Microsoft dropped the Kinect requirement and lowered the price of the core Xbox One console to $350, adoption jumped significantly. By all accounts, Sony is still well in the lead, but a strategic price drop in the near future could effectively cement the PS4 as this generation’s “winner.”

    If Sony does plan on cutting the PS4’s price, here’s how I think it’ll go: The 1TB model will land at the existing $400 price point, and the 500GB model will be discounted to somewhere between $350 and $380. Both major consoles will eventually make it down to the $300 and $250 price points, but it’s still too early in the game for drops of that magnitude — especially since sales have been better than expected. After all, the minimalistic PS4 is a far cry from the overpriced behemoth that was the launch PS3 model.

    More...
    Game Tech News || Trading blogs || My blog

  9. #139
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    Thermaltake accused of ripping off multiple rivals, stealing designs


    When it comes to CPU coolers, cases, and other PC peripherals, Thermaltake has been a major presence for over a decade. The company’s reputation as a case designer is more mixed. While its designs are distinctive, early products like the Xaser and Kandalf were an un-ironic celebration of clashing plastic colors, cheap paint, and brilliant LEDs. Now, a manufacturer has publicly leveled serious charges at the firm. In a public letter posted to Facebook, CaseLabs has claimed that Thermaltake directly ripped off its own designs — and it’s got the invoices to prove it.

    Before we dive into the specifics of the claim, we should note Thermaltake has a number of suspiciously similar products on the market and CaseLabs is far from the only company involved. LegitReviews has a round-up of Thermaltake hardware that practically plays the “spot the difference” game with rival manufacturers. From cases to controllers to fans, Thermaltake’s designs are often suspiciously “inspired” by the work other companies are doing.



    CaseLabs, however, goes further, alleging that Thermaltake’s lead case designer, Shannon Robb, openly attempted to photograph its designs before buying chassis and shipping them to Thermaltake HQ. Fast forward a year, and Thermaltake is introducing similar cases based on CaseLabs’ previous prototypes. Fast forward a year after that, and Thermaltake is launching yet another set of hardware that copies CaseLabs’ designs. CaseLabs is understandably unhappy about the entire affair, given that it’s a small vendor with a fraction of the larger company’s resources.



    In this comparison, the Thermaltake Suppressor is on top and the Fractal Design is on the bottom.


    More...
    Game Tech News || Trading blogs || My blog

  10. #140
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    New Snowden leak: NSA uses warrantless Web surveillance to hunt hackers


    Ever since Edward Snowden began to leak details on the mass surveillance programs of the NSA and other government agencies, there’s been an ongoing debate over the nature and limits that should be placed on such surveillance. One of the most troubling trends exposed in repeated leaks is the degree to which the government has exceeded the enormous authority granted it by the Patriot Act and other legislation. New information, available today, is going to reignite that argument. Days after the Senate voted to reauthorize the Patriot Act with some modest modifications, details have leaked on how the Obama Administration authorized the NSA to search the Internet for evidence of malicious hacking, even when there were no obvious ties between the alleged hackers and any international groups.

    According to a joint investigation by The New York Times and ProPublica, the Justice Department authorized the NSA to hunt for hackers without a warrant, even when those hackers were present on American soil. Initially, the DOJ authorized the NSA to gather only addresses and “cybersignatures” that corresponded to computer intrusions, so that it could tie the efforts to specific foreign governments. The NSA, however, sought permission to push this envelope. These new slides also note, incidentally, that Dropbox was targeted for addition to the PRISM program.



    These practices date back to at least 2011, when the Foreign Intelligence Surveillance Court (FISC, sometimes called the FISA Court) authorized the FBI to begin using NSA resources in pursuing foreign-based hackers. Data the NSA gathered on behalf of the FBI was to be routed to the FBI’s own repositories. As with previous controversial orders, it’s not clear what the criteria are for a target being “suspicious,” or what ties or evidence are gathered to link a specific individual to hacking attempts before warrantless surveillance is called in. Monitoring hackers also means monitoring what hackers are hacking — which means that the data stolen off US servers is being dumped back to the NSA. What happens to that data? It’s not clear — and the NSA’s ability to accurately identify the difference between friends and enemies has been repeatedly called into question, including by the FISA court itself.

    More...
    Game Tech News || Trading blogs || My blog
