
Game Tech News

This is a discussion on Game Tech News within the Electronics forums, part of the Non-Related Discussion category; As it stands, the PS4 doesn’t ship with DLNA streaming capability, ironically making the PS3 a better media center device. ...

  1. #81
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    How to use your PS4 as a media streamer without DLNA


    As it stands, the PS4 doesn’t ship with DLNA streaming capability, ironically making the PS3 a better media center device. It’s certainly possible that we’ll see support for the standard patched in eventually, but how are we supposed to play our media library on the PS4 right now? Thankfully, there is a way, and it’s all thanks to an app called Plex.

    With this handy little app, you can stream just about any video from your computer or NAS directly to your PS4. It only takes a few minutes to get going, so let’s jump right in.



    First off, you need to install the Plex Media Server. Download it, install it, and then launch the application. It’s simple enough, and it’s available on Windows, OS X, Linux, and FreeBSD (FreeNAS). While you’re at it, sign up for a free Plex account if you haven’t done so already.



    Configure server settings


    Once the application is running, you can configure the server to your liking. Choose your server’s name, add your media folders to the Plex library, and tweak the networking options as you see fit. If you need to change the port configuration, you’ll need to toggle on advanced mode by clicking the “Show Advanced” button in the upper right. Most people shouldn’t need to tinker much, but the options are there.
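
    Before involving the PS4 at all, it’s worth confirming the server is reachable on the LAN. Plex Media Server listens on TCP port 32400 by default, and its local /identity endpoint returns a small XML description of the server. Here’s a minimal Python sketch (the address below is a placeholder for your own server’s):

    Code:
    # Quick reachability check for a Plex Media Server on the local network.
    # Port 32400 is Plex's default; the local /identity endpoint needs no auth.
    import urllib.request
    import xml.etree.ElementTree as ET

    SERVER = "http://192.168.1.50:32400"  # replace with your server's LAN address

    def check_plex(base_url: str) -> None:
        with urllib.request.urlopen(f"{base_url}/identity", timeout=5) as resp:
            root = ET.fromstring(resp.read())
        # The root MediaContainer element carries the server's identity info.
        print("machineIdentifier:", root.get("machineIdentifier"))
        print("version:", root.get("version"))

    if __name__ == "__main__":
        check_plex(SERVER)

    If this prints a version number, the server side is healthy, and any remaining trouble is on the app or network-discovery side.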



    Purchase a Plex Pass


    For the time being, the Plex app on PS4 is only available to Plex members with paid accounts. Eventually, you’ll be able to buy access to the PS4 app separately, but for now the subscription is mandatory. So if you want in right away, head on over to the Plex website and buy a Plex Pass.



    Download the Plex app


    Now that your account is properly configured, go into the PlayStation Store and navigate to the “Apps” section. The Plex app itself is free, so initiate the download. Once it’s done installing, you’ll find the Plex app under the “TV & Video” section of the PS4’s main menu. Alternatively, you can always go to the “Library” menu and navigate to “Applications.”



    Generate a PIN


    Launch the Plex app on your PS4, and you’ll be greeted with a four-character alphanumeric code. You’ll need this code to pair your PS4 with your account.



    Pair your PS4 to your account


    Now, head on over to the PIN login page on the Plex website, sign in with your premium account, and enter the four characters displayed on the PS4. Press the “Connect” button, and you’ll see a status message. If it tells you that the PIN was activated, you’re ready to rock. If you get an error, go back to your PS4 and generate a new code in the Plex app.
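
    Under the hood, this kind of PIN pairing is a simple three-step handshake: the device mints a short code, the signed-in user claims it on the website, and the device polls until the claim shows up. The toy Python sketch below models that generic flow (it is not Plex’s actual API) and shows why a failed or stale claim just means generating a fresh code:

    Code:
    # Toy model of PIN-based device pairing (generic logic, not Plex's API).
    import secrets
    import string
    from typing import Dict, Optional

    PENDING: Dict[str, Optional[str]] = {}  # code -> claiming account (None = unclaimed)

    def generate_pin(length: int = 4) -> str:
        """Device side: mint a short alphanumeric code and register it as pending."""
        alphabet = string.ascii_uppercase + string.digits
        code = "".join(secrets.choice(alphabet) for _ in range(length))
        PENDING[code] = None
        return code

    def claim_pin(code: str, account: str) -> bool:
        """Website side: a signed-in user enters the code to link the device."""
        if code not in PENDING or PENDING[code] is not None:
            return False  # unknown or already-claimed code: the site shows an error
        PENDING[code] = account
        return True

    def poll_pin(code: str) -> Optional[str]:
        """Device side: check whether the code has been claimed yet."""
        return PENDING.get(code)

    code = generate_pin()                # shown on the PS4 screen
    claim_pin(code, "user@example.com")  # typed into the PIN login page
    print(poll_pin(code))                # the app learns which account owns it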



    Enjoy yourself


    Finally, you’ll be able to stream movies and TV shows on your PS4 quickly and easily. Music and channel support isn’t implemented in the PS4 app just yet, but that functionality will be added at a later date.

    Is there another way?



    More...

  2. #82
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Investigating the GTX 970: Does Nvidia’s GPU have a memory problem?



    Late last week, we covered claims that the GTX 970 had a major memory flaw that didn’t affect Nvidia’s top-end GPUs, like the GTX 980. According to memory bandwidth tests, the GTX 970’s performance drops above 3.2GB of memory use and craters above 3.5GB. Meanwhile, many users have published claims that the GTX 970 works to keep RAM usage at or slightly below 3.5GB of total VRAM, whereas the GTX 980 will fill the entire 4GB framebuffer.

    There are three separate questions in play here, though they’ve often been conflated in the back-and-forth of various forum threads. First, does the small memory bandwidth benchmark written by Nai actually test anything, or is it simply badly coded?



    We’ve verified that the reported behavior does occur.

    Second, does the GTX 970 actually hold memory use to the 3.5GB limit, and if it does, is this the result of a hardware bug or other flaw? Third, does this 3.5GB limit (if it exists) result in serious performance degradation relative to the GTX 980?
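
    For the curious, the stepping test at the center of the first question is conceptually simple: fill the card block by block, then time a device-to-device copy within each block. Here’s a rough Python reconstruction of the idea using the CuPy library (our sketch, not Nai’s actual CUDA code, and the 128MB block size is arbitrary). On a GTX 970, blocks landing in the upper 0.5GB segment should show markedly lower bandwidth:

    Code:
    # Allocation-stepping VRAM bandwidth sketch (requires CuPy and an Nvidia GPU).
    import cupy as cp

    BLOCK_MB = 128
    block_bytes = BLOCK_MB * 1024 * 1024

    # Grab VRAM in fixed-size blocks until the card is full.
    blocks = []
    while True:
        try:
            blocks.append(cp.zeros(block_bytes, dtype=cp.uint8))
        except cp.cuda.memory.OutOfMemoryError:
            break

    scratch = blocks.pop()  # reuse the last block as the copy destination
    for i, src in enumerate(blocks):
        start, end = cp.cuda.Event(), cp.cuda.Event()
        start.record()
        cp.copyto(scratch, src)  # device-to-device copy within VRAM
        end.record()
        end.synchronize()
        ms = cp.cuda.get_elapsed_time(start, end)
        gbps = (block_bytes / 1e9) / (ms / 1e3)
        print(f"block {i:3d} (~{(i + 1) * BLOCK_MB:5d} MB): {gbps:6.1f} GB/s")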

    Memory bandwidth and allocation on the GTX 970 vs. the GTX 980


    The GTX 970, like a number of other GPUs from Nvidia (and, historically, a few from AMD), uses an asymmetric memory layout. What this means, in practice, is that the GPU has a faster access path to some of its main memory than to the rest. We reached out to Bryan Del Rizzo at Nvidia, who described the configuration as follows:

    “[T]he 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.”

    In other words, the answer to the first question, “Does this memory benchmark test something accurately?”, is yes, it does. But does this limit actually impact game performance? Nvidia says that the difference in real-world applications is minimal, even at 4K with maximum details turned on.

    Nvidia’s response also confirms that gamers who saw a gap between the 3.5GB of utilization on the GTX 970 and the 4GB on the GTX 980 were seeing a real difference. We can confirm that this gap indeed exists. It’s not an illusion or a configuration problem — the GTX 970 is designed to split its memory buffer in a way that minimizes the performance impact of using an asymmetric design.
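
    Some back-of-envelope math shows why the split can be nearly invisible in practice. The segment speeds below are illustrative assumptions, not measurements; the point is that the blended rate only craters when a large share of traffic actually lands in the slow segment:

    Code:
    # Toy model of blended bandwidth on a segmented memory pool.
    # Both figures are assumed for illustration, not measured values.
    FAST_GBPS = 196.0  # assumed speed of the 3.5GB segment
    SLOW_GBPS = 28.0   # assumed speed of the 0.5GB segment

    def blended_bandwidth(slow_fraction: float) -> float:
        """Time-weighted (harmonic) mean: each byte costs 1/bandwidth seconds."""
        return 1.0 / ((1 - slow_fraction) / FAST_GBPS + slow_fraction / SLOW_GBPS)

    for frac in (0.0, 0.05, 0.125):  # 0.5GB of 4GB = 12.5% worst case
        print(f"{frac:6.1%} of traffic in slow segment -> {blended_bandwidth(frac):6.1f} GB/s")

    Even in the worst case, where an eighth of all traffic hits the slow segment, the blended figure stays well above the slow segment’s own speed, and per Nvidia’s statement the driver keeps most traffic in the fast 3.5GB partition, so the observed hit should be smaller still.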

    We went looking for a problem with the GTX 970 vs. the 980 in two ways. First, we reconsidered our own data sets from the GTX 970 review, as well as reviews published on other sites. Even at 4K, with all detail levels cranked, our original review shows no problems. The GTX 970 may take a slightly larger hit in certain circumstances (Nvidia’s information suggests the impact can be on the order of 4%), but we don’t see a larger problem in terms of frame rates.

    The next step was to benchmark a few additional titles. We tested MSI’s Kombustor and its RAM-burner test, as well as the games Dragon Age: Inquisition and Shadow of Mordor. Both games were tested at 1080p and 4K with absolute maximum detail and every feature and setting maxed out.

    More...

  3. #83
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Most DirectX 12 features won’t require a new graphics card


    The last few updates to Microsoft’s DirectX platform have come with the requirement that you buy new hardware to enjoy the benefits, but that’s not going to be the case with DirectX 12. According to Microsoft, DirectX 12 will work with most existing gaming hardware. Some DX12 features will still need updated GPUs, but all the basic features should work on what you already own.

    Microsoft announced DirectX 12 last year at GDC, and it’s still not fully baked. Because the API isn’t technically done yet, Microsoft has been cautious about explaining exactly what will and won’t work on current GPUs. What we do know is that the basic feature set will work on all Intel fourth-generation and newer Core processors, as well as AMD’s Graphics Core Next (GCN) architecture. On the Nvidia side, DirectX 12 will support Maxwell, Kepler, and even Fermi. Basically, a DX 11.1 card will be compatible with most of the new API. Note that Maxwell is actually the first GPU with full DX12 support, although DX12 graphics are currently only making appearances in demos.

    That makes some sense when you look at what DirectX 12 is designed to do. While past updates to DirectX have focused on new rendering effects like tessellation and more realistic shaders, DirectX 12 is an attempt to dramatically reduce driver overhead and get PC gaming closer to console levels of efficiency by learning some lessons from AMD’s Mantle API. Consoles can let games run close to the metal because their hardware profiles are very narrow; on the PC, the hardware abstraction layer in DirectX is what slows things down.

    Some of the more significant aspects of DirectX 12, including power efficiency and frame rate improvements, will be part of the basic feature set. That’s really all the detail Microsoft is willing to go into on the record right now.

    Redmond is probably referring to the improved threading of command lists from the CPU to the GPU. The workload is shared across threads in DirectX 12, but far more dependent on a single thread in DirectX 11. Splitting it up more efficiently means higher frame rates. DX12 will also support bundled commands within command lists that can be reused instead of being rebuilt and resubmitted each frame. That can decrease power use and further increase frame rates.
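
    The native D3D12 interface is C++/COM, but the economics of bundles are easy to model in a few lines of Python (a cartoon of the concept, not the real API): re-encoding every command each frame costs work proportional to the command count, while replaying a pre-recorded bundle is one cheap call.

    Code:
    # Cartoon model of command-list bundling; not the actual Direct3D 12 API.
    class Bundle:
        """A pre-recorded list of GPU commands that can be replayed cheaply."""
        def __init__(self):
            self.commands = []

        def record(self, op, *args):
            self.commands.append((op, args))

    def cpu_cost_reencode(num_commands: int) -> int:
        # DX11-style: every command is validated and encoded again each frame.
        return num_commands

    def cpu_cost_replay(bundle: Bundle) -> int:
        # DX12-style: one submission call replays the whole recorded bundle.
        return 1

    scene = Bundle()
    for i in range(10_000):
        scene.record("draw", i)  # pay the encoding cost once, up front

    print("CPU units per frame, re-encoding:  ", cpu_cost_reencode(len(scene.commands)))
    print("CPU units per frame, bundle replay:", cpu_cost_replay(scene))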

    We know from some of the benchmarks released last year that reducing CPU overhead can boost frame rates by as much as 60%. To put that in perspective, a 60% boost takes a choppy 25 fps to a smooth 40 fps: the difference between a game that’s unplayably laggy and one that’s completely smooth.



    Microsoft is even more vague about what features of DirectX 12 will need new hardware. According to company reps, there are several rendering pipeline features that will only be supported on new cards, but those won’t be detailed until GDC in a few months. The updated APIs should ship with Windows 10 later this year, and games utilizing the new version of DirectX will be around for the 2015 holiday season. Now that you know whether or not you’ll need new hardware, you can plan your splurging accordingly.

    More...

  4. #84
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Xbox One catching up to PS4, but the numbers don’t add up


    For the uninitiated, the eighth-generation console war has followed roughly the same path since it began — Sony’s PlayStation 4 capitalizes not only on its technically superior hardware and content exclusives, but also on Microsoft’s constant stream of public relations fumbles leading up to and following the Xbox One launch. Since launch, the PS4 has soundly led the Xbox One in both hype and sales — Sony’s console was even more fun to try to break. During its quarterly earnings statement today, Microsoft revealed that it sold 6.6 million Xbox consoles during the holiday quarter — around 2.5 million more consoles than the PS4s Sony moved during the same period. Did Microsoft turn the Xbox One around?

    Microsoft posted $26.5 billion in revenue, but both Xbox sales and net income were down year-over-year. Back during Microsoft’s second quarter, it revealed that it had sold 7.4 million Xbox consoles, and divulged that the number was split between 3.9 million Xbox Ones and 3.5 million Xbox 360s. This time around, Microsoft used the same vague wording of “Xbox consoles,” but didn’t provide a detailed split. It did, however, claim that the Xbox One outsold the PS4 during that time period, so you can safely assume the Xbox One sold at least around 4.2 million units. The remainder of that 6.6 million figure was most likely filled out by Xbox 360s.



    Sales are sales, but aside from the mysterious lack of delineation between Xbox One and Xbox 360 sales, a few other factors are at play. Microsoft has dropped the price of the Xbox One from $399 to $349 twice in the past few months, and $399 was already down from the unit’s original $499 price tag. Meanwhile, console agnostics who were part of the PS4 sales rush may now be settling in and picking up the rival platform to make sure they don’t miss any exclusives.

    On the software side of things, Microsoft is doing just fine — in no small part thanks to the $2.5 billion acquisition of Minecraft developer Mojang, as well as the release of the nostalgia-laden Halo: The Master Chief Collection and Forza Horizon 2. Microsoft may fall somewhere behind Sony in the console war numbers — almost by half if you believe certain tallies — but its numbers aren’t bad by any stretch, only in comparison to the competition. With enormous new markets opening potential doors for Microsoft, and with its (mostly) beloved Xbox Live Gold service possibly arriving on other platforms, there is still much room for the Xbone to grow, despite that loving-yet-telling moniker.

    More...

  5. #85
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Will the PS4 and Xbox One receive 4K support this year?



    Consumer electronics companies have begun the 4K push, and now it seems everyone is scrambling to get their houses in order. The PS4 and Xbox One are already technically capable of outputting 4K video, but considering how much these consoles struggle to reach 1080p, is 4K really feasible with the existing hardware? Netflix seems to think we’re in for PS4 and Xbox One hardware revisions this year, but are Sony and Microsoft willing to burn their early adopters?

    Back in January, Netflix’s Neil Hunt said publicly that Sony had supposedly promised a PS4 hardware revision with improved 4K support in mind. Earlier this week, Forbes followed up with Hunt, and he maintains that both the Xbox One and PS4 will see hardware refreshes at around the two-year mark. Specifically, he believes that they’ll include updated internals aimed at supporting 4K video playback.


    More...

  6. #86
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    DirectX 12 confirmed as Windows 10 exclusive, AMD and Nvidia go head-to-head


    It’s been just over a year since AMD launched its next-generation Mantle API, with the promise that low-overhead gaming would dramatically boost frame rates and lead to fundamentally new types of game engines. One of the demos that Sunnyvale used to show off its new API was Star Swarm, a tech demo and next-generation engine from Oxide Games. Now, a new comparison puts AMD and Nvidia head-to-head in that same test — only this time, they’re both running under DirectX 12.

    The performance data in Anandtech’s comparison should be taken with a significant grain of salt. D3D12 support is baked into Windows 10, but the code is early. The drivers from AMD and Nvidia are pre-production, obviously, and the underlying OS is itself in a pre-RTM state. Windows 10 uses version 2.0 of the Windows Display Driver Model (WDDM), which means that a great deal of under-the-hood work has changed between Windows 8.1 and the latest version of the operating system. The preview is quite extensive (they test the GTX 980 against multiple AMD cards in multiple CPU and GPU configurations), and I don’t want to steal their thunder. At the Extreme preset, we see several interesting results:



    The first thing people are going to notice is that the GTX 980 is far faster than the R9 290X in a benchmark that was (rightly) believed to favor AMD as a matter of course when it debuted last year. I’ll reiterate what I said then — Star Swarm is a tech demo, not a final shipping product. While Oxide Games does have plans to build a shipping game around its engine, this particular version is still designed to highlight very specific areas where low-latency APIs can offer huge performance gains.

    As impressive as the GTX 980’s performance is, I’m going to recommend that nobody take this as proof that Nvidia’s current GPU will blow the doors off AMD when D3D12 is shipping and games start to appear late this year or early next.

    The second thing that some users will note is that the R9 cards offer very similar performance in Mantle vs. DirectX 12, at least for now. There was always some discussion over whether or not Mantle and D3D12 would offer similar performance capabilities, and at least for now, it looks as though they may — though again, that should be taken as a tentative conclusion.

    AT steps through multiple benchmarks and comparisons between the two GPU families, as well as simulated performance on dual- and quad-core configurations. There’s no comparison on AMD CPUs, which makes sense on the one hand — AMD CPUs are not widely used for enthusiast gaming these days — but is unfortunate on the other. Mantle has always had its best showing when used to accelerate the performance of AMD CPUs and APUs, and it would’ve been interesting to see whether Direct3D 12 benefits AMD’s hardware as much as its own native API does.

    Microsoft confirms: DirectX 12 will be a Windows 10 exclusive


    One other point that Anandtech disclosed is that Windows 10 and DirectX 12 will be bundled together — D3D 12 will not come to Windows 7, 8, or 8.1. The free upgrade offer on Windows 10 will doubtless blunt a great deal of the criticism that MS would otherwise have come in for, but users who can’t upgrade, or simply don’t want to, won’t be happy.

    Whether or not this will breathe life into AMD’s Mantle is an interesting question. In theory, Mantle could see increased adoption if the MS userbase digs in its heels over Windows 10 the way it did over Windows 8. On the other hand, it’s possible that we’ll see increased support for the next-generation OpenGL standard (dubbed GLNext) as an alternative to DX12 and Windows 10.
    More details on both DX12 and GLNext will be released at GDC this year, which kicks off in early March.

    More...

  7. #87
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Google, Mattel team up to offer View-Master VR in kid-friendly package



    If you grew up anywhere from the 1960s through the 1980s, chances are you or someone you knew had a distinctive red image viewer and a stack of flimsy cardboard reels. The classic View-Master reels could depict scenery, movies, TV shows, or any other visual content in stereoscopic 3D, with some models even incorporating an audio track. Now, Mattel has announced a partnership with Google to bring a Cardboard version of the View-Master to life, with 3D animated reels that introduce kids to the concept of VR.

    The new View-Master will be available for roughly $30, and new “reels” of content will cost roughly $15 in packs of four, offering a VR experience that’s tailor-made for children. The device won’t be wearable, as such — it’ll maintain the interactive elements that have made the View-Master unique, with application availability across Android, iOS, and Windows. The viewer will apparently fit most smartphones (compatibility has yet to be detailed) and uses an uprated version of Google Cardboard made from plastic. Content can apparently be purchased on plastic reels or downloaded via the application (exactly how this works hasn’t been disclosed yet).



    The new Googlefied version is clearly based on the classic styling of the original

    The Verge wasn’t impressed with the initial run of viewer apps, claiming that the environments look like crude video games and that the informative captions “don’t do much to help.” The photographic content fares better, and the entire idea of updating a physical, reel-based system is both nostalgic for adults and potentially a cool idea for kids as well, if the content can be brought up to snuff.
    One issue that the Verge brings up indirectly in its coverage is the simple fact that VR content will live and die on the strength of its material. This has been brought up in coverage at Ars Technica and from time to time in other areas — games that want to include VR options have to be explicitly designed for it. Standard video effects that work perfectly well on a monitor aren’t well suited to a head-mounted display.

    Done properly, Mattel’s View-Master could be an amazing toy that blends old-school physical hardware with brand-new content in resolutions and quality levels that children in the 1960s could only dream of. If the content is lackluster, however, the Mattel View-Master will go down as a failed kludge — a device that tried to bridge the gap between real-world toys and virtual entertainment and fell squarely into the hole instead.


    More...

  8. #88
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Nvidia kills mobile GPU overclocking in latest driver update, irate customers up in arms


    Nvidia’s mobile Maxwell parts have won significant enthusiast acclaim since launch thanks to excellent performance and relatively low power consumption. Boutique builders and enthusiasts alike also tend to enjoy pushing the envelope, and Maxwell’s manufacturing characteristics apparently make it eminently suited to overclocking. Now Nvidia is cracking down on these options with a driver update that removes overclocking features some vendors explicitly sold to customers.

    As DailyTech points out, part of what makes this driver update problematic is that system manufacturers actively advertised their mobile products as having overclocking support baked in. Asus, MSI, Dell (Alienware), and Sager have all sold models with overclocking as a core feature, as shown in their marketing copy.



    Nvidia apparently cut off the overclocking feature with its 347.09 driver and kept it off with the 347.52 driver released last week. Mobile customers have been demanding answers in the company forums, with Nvidia finally weighing in to tell its users that this feature had previously only been available because of a “bug” and that its removal constituted a return to proper function rather than any removal of capability.

    Under normal circumstances, I’d call this a simple case of Nvidia adjusting a capability whether users like it or not, but the fact that multiple vendors explicitly advertised and sold hardware based on overclocking complicates matters. It’s not clear if Asus or the other manufacturers charged extra for factory-overclocked hardware or if they simply shipped the systems with higher stock speeds, but we know that OEMs typically do put a price premium on the feature.

    To date, Nvidia has not responded formally or indicated whether it will reconsider its stance on overclocking. The company isn’t currently under much competitive pressure to do so — it dominates the high-end GPU market, and while AMD is rumored to have a new set of cards coming in 2015, it’s not clear when those cards will launch or what the mobile flavors will look like. For now, mobile Maxwell has a lock on the enthusiast space. Some customers are claiming that they’re angry enough to quit using Team Green, but performance has a persuasive siren song all its own, and the performance impact of disabling overclocking is going to be in the 5-10% range for the majority of users. If customers can prove they paid extra for the feature, that could open the door to potential claims against the OEMs themselves.

    For Nvidia, this surge of attention to its mobile overclocking policy is a likely unwelcome follow-up to concerns about the GTX 970’s memory allocation and the confusion and allegations swirling around mobile G-Sync. While none of these are knock-out blows, they continue to rile segments of the enthusiast community.


    More...

  9. #89
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    PS4 will continue to outsell the Xbox One through 2018, report says


    Microsoft has made massive changes to its console strategy over the last two years. The Xbox One is cheaper, less restrictive, and more feature-rich than it once was. But in spite of these strides in the right direction, the PS4 remains dominant. And if a recent analyst report is to be believed, Microsoft’s console is doomed to play second fiddle to the PS4 well into 2018.

    Recently, Strategy Analytics released a report on the sales trends of the current console generation. By the end of 2018, the firm predicts, Sony will have sold 80 million PS4s. Microsoft, on the other hand, will have sold just 57 million Xbox Ones. Neither number is anything to sneeze at, but that estimate puts Sony way out in front.



    More than anything, this report adds credence to the idea that the PS4 is Sony’s return to the glory days of the PS2. Perhaps the sales gap won’t be quite as steep as it was two generations ago, but the Xbox team must be upset that it burned so much goodwill with early mistakes. Hot off the Xbox 360, this was Microsoft’s generation to lose, and lose it did.
    What about the Wii U? Well, it looks like Nintendo’s platform will be hovering between 15 and 20 million units if this report is to be believed. Even with last year’s top-notch first-party showing, hardware sales were weak. The Legend of Zelda is supposed to ship sometime in 2015, but even that won’t be able to pull the Wii U out of the gutter. I’m certainly not saying that it will sink Nintendo, but the Wii U continues to reek of failure.

    Meanwhile, PS4 exclusives have been few and far between. Even worse, Sony’s The Order: 1886 has received shockingly harsh review scores. At this point, Bloodborne and Uncharted 4 are pretty much the only major exclusives on the horizon. The PS4’s hardware is superior, and Sony’s messaging has been more consistent, but the severe lack of compelling titles could hurt the PS4 in the long run.

    So, can Microsoft change its fate? It’s definitely in the realm of possibility. Redmond has been surprisingly proactive with massive shifts in strategy, and the Xbox One even managed to outsell the PS4 for a number of months in 2014. At this point, what these consoles need more than anything is content. And since Microsoft has Halo, Gears of War, and Tomb Raider on lockdown, the future looks bright for Xbox One owners even if the sales gap never shrinks.



    More...

  10. #90
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Can this Arduino box stop online cheating in video games?


    Cheating has always existed in multiplayer games, and for the most part it’s just a minor annoyance. But now that there are millions of dollars up for grabs at eSports tournaments, cheating has become a problem with much larger stakes. So, how do we fix it? Well, a man by the name of David Titarenco thinks he’s solved part of the problem with a tiny little Arduino box he calls “Game:ref.”

    A few weeks ago, Titarenco wrote a lengthy blog post about his hardware anti-cheat solution for Counter-Strike: Global Offensive. It got a fair bit of attention on Reddit, and Titarenco is now working to get this device into the hands of tournament organizers and gamers alike. It has since been dubbed Game:ref, and unsurprisingly, a Kickstarter project is in the works.



    At its core, this Arduino-based solution is designed to detect discrepancies between user input and what’s happening in the game. You simply pass the user’s input through the Game:ref on its way into the PC, and then compare those results with the data on the server side of things. If the two are drastically out of step, there’s reason to believe that cheating software is running on the user’s PC. It’s certainly not a silver bullet for every single method of cheating, but it might end up being a useful puzzle piece in the eSports scene.
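
    To make that concrete, here’s a simplified Python sketch of the kind of discrepancy check described above (our reconstruction, not Titarenco’s actual code, and the data is hypothetical): compare the raw mouse deltas the hardware box saw against the aim changes the game server recorded, and flag ticks where the crosshair moved far more than the mouse did.

    Code:
    # Simplified input-vs-game-state discrepancy check (illustrative only).
    def suspicious_ticks(hardware_deltas, server_deltas, ratio=2.0, floor=5.0):
        """Flag ticks where observed aim movement dwarfs physical mouse movement."""
        flagged = []
        for tick, (hw, sv) in enumerate(zip(hardware_deltas, server_deltas)):
            if abs(sv) > abs(hw) * ratio and abs(sv) > floor:
                flagged.append(tick)
        return flagged

    # Hypothetical data: at tick 3 the crosshair snaps 40 units while the mouse
    # barely moved -- the classic signature of software-assisted aiming.
    hw = [1.0, 2.5, 0.5, 0.2, 1.1]
    sv = [1.1, 2.4, 0.6, 40.0, 1.0]
    print(suspicious_ticks(hw, sv))  # -> [3]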
    Keep in mind, this concept isn’t entirely new. In fact, Titarenco himself credits Intel’s “Fair Online Gaming” concept with inspiring this implementation. Earlier this week, he told Polygon that the devices themselves will be made for under $100 each, so it’s possible this relatively cheap solution can gain traction where Intel’s never did.

    So, can this really stop cheaters completely? Certainly not. As soon as someone has physical access to the device itself, all bets are off, and Titarenco seems aware of that. He’s going after input-based software cheating exclusively here, but there’s no real guarantee that will work perfectly either. Given enough time and financial incentive, it’s conceivable that cheaters could target this specific detection method and find a workaround. At best, I can see this working as an additional layer of protection in a tournament setting, but that’s about it.

    Frankly, I find it hard to believe that a perfect anti-cheat solution will ever exist — especially with so much money on the line. The best we can do is gather as much data as possible, implement strict regulations in tournaments, and keep our ear to the ground for the latest advancements in online cheating.


    More...

