
Game Tech News


      
   
  1. #201
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Nvidia renames, formally launches new GeForce Now game streaming service




    For the past year, Nvidia’s GeForce Grid service has provided Shield owners with the ability to stream games from remote servers. The service has now gone live as GeForce Now, an $8 per month streaming service. Nvidia claims to offer a 1080p-quality service at 60 FPS at that $8 rate, with the first three months included free (the service launches on October 1 for North America, the EU, and Japan).

    Right now, the service offers more than 50 titles, including the first three Batman Arkham titles, multiple Lego-themed games, Orcs Must Die (a personal favorite), Darksiders, and The Walking Dead. Multiple Grid titles are also available, as is the original Borderlands and The Witcher 2.



    Nvidia is talking a good game with its promises of speed and latency, but it’s important to remember that much of GeForce Now’s performance will depend on your ISP, not Nvidia itself. While Nvidia’s PR talks up the fact that it has “optimized every piece of the technology behind GeForce NOW for gaming,” it can’t optimize the quality of your Internet connection or the consistency with which you receive content. In order to maintain a 60 FPS frame rate, a new frame needs to be delivered roughly every 16.7 milliseconds. Nvidia’s previous latency slides have implied that GeForce Grid could match console play, but that’s going to depend on your Internet connection.
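
    The frame interval is only part of the story, though: the lag you actually feel is the sum of every stage between your button press and the photons on screen. Here’s a minimal back-of-the-envelope sketch in Python; the per-stage timings are my own illustrative assumptions, not Nvidia’s figures.

    Code:
    # Rough frame-budget math for 60 FPS game streaming.
    # Stage timings below are illustrative assumptions, not Nvidia's numbers.
    FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms between frames at 60 FPS

    stages_ms = {
        "server render": 10.0,
        "video encode": 5.0,
        "network (one way)": 20.0,  # the part Nvidia can't optimize for you
        "client decode": 5.0,
        "display scanout": 8.0,
    }

    total = sum(stages_ms.values())
    print(f"Frame interval at 60 FPS: {FRAME_BUDGET_MS:.1f} ms")
    print(f"Estimated input-to-photon latency: {total:.0f} ms")
    # The stream can still hold 60 FPS as long as each stage keeps pace
    # (the pipeline overlaps), but the lag you perceive is the sum of all stages.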



    One other note about Nvidia and the Shield ecosystem. If you buy an Nvidia controller and plan to take it back and forth across multiple devices, bear in mind that the controller requires GeForce Experience to be installed on a PC in order to function — and GeForce Experience doesn’t work with AMD or Intel GPUs. If you need a controller that can play on multiple devices and you aren’t willing to buy 100% into the NV ecosystem (something that’s increasingly hard to do, since a growing number of laptops don’t contain discrete GPUs), you’ll need to buy a second controller.

    Nvidia has talked about wanting to become the Netflix of gaming, but it’s lock-ins like this that will make ubiquitous market domination difficult. Netflix is Netflix precisely because you can stream it to practically every device manufactured in the past five years. TVs, consoles, PCs, smartphones — Netflix runs on all of them. Nvidia’s GeForce Now service, in contrast, runs only on Nvidia’s Shield. Even the company’s controllers are only compatible with PCs if you have an Nvidia card installed — and, of course, Nvidia locks out customers from using GameWorks or PhysX on hybrid systems with an AMD GPU installed.

    more...
    Game Tech News || Trading blogs || My blog

  2. #202
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Oculus Rift now expected to cost more than $350



    Ever since the Oculus Rift debuted on Kickstarter, there have been questions about how much the headset would cost. The first dev kit cost $300 when purchased as part of the Kickstarter and the second dev kit, which improved on the platform in a number of ways, weighs in at $350. Analysts had expected that Oculus would attempt to bring the Oculus headset to market at around that price point, since $350 is already an incredibly steep price for a nascent platform, but Oculus founder Palmer Luckey has shattered that expectation.

    World War Toons, one of a number of VR titles in development

    According to a new interview published by RoadtoVR, Luckey has now stated that the kits will be more than $350, though he’s not willing to state how much. Luckey explains that the Oculus Rift has added a great deal of additional technology since the days of the DK1 and DK2. According to him, the Rift is designed to be the best VR experience, hands down:

    “It would really suck if you put something out there and people were like ‘Ah man… the Rift is good, but it’s not quite there, you know? If only it was a little better, if the lenses were a little better, if the resolution was a little better, if the screens had been a little bit better, then it would be great because you’d say God, we could have just charged a little more and put a little bit more money into custom hardware and actually achieve that.’… I can’t tell you that it’s going to be $350, and I would say I think people are going to be happy with what they get for the price because I really do think it’s going to be the best VR headset you can buy.”


    How the perfect is the enemy of the good


    There are multiple reasons why this price is unlikely to sit well with the company’s fans. First, the entire point of the Facebook acquisition was supposed to be to give Oculus funds that would allow it to bring products to market that didn’t cost this much money. $400 (the minimum likely price point) is the same as what people would shell out for a PS4 or Xbox One bundle, which would contain multiple games. It’s the cost of a high-end PC video card or a 42-inch 1080p TV. That’s a huge commitment for an utterly unproven technology with few-to-no shipping titles on launch day. I’m certain there’ll be a game or two and some tech demos, but it’s going to take years before VR is widely integrated in games, assuming it achieves critical mass at all.



    Eve: Valkyrie

    Next, there’s the unflattering comparison against other VR solutions. It’s all well and good to aim for the top of the market, but that tends to only work when you’ve got a track record of delivering premium products. Companies like Apple have pulled this off before, but Oculus isn’t Apple. Perhaps more importantly, committing to a $400 Oculus also means buying a system capable of using that hardware effectively. The Oculus Rift may offer a vastly superior experience to, say, the Gear VR, but you can buy four Gear VR headsets for the price of a single Oculus Rift. That comparison isn’t flattering.

    The final problem is this: Oculus wants to deliver the premier VR experience, but a $400 price tag guarantees that if the mass market adopts VR, it won’t use Oculus hardware — it’ll use equipment from Samsung or another low-cost manufacturer. This, in turn, means that whether VR sinks or swims will depend entirely on the experience of using VR on someone else’s hardware. If consumers buy low-end VR hardware and hate it, they’ll blame VR as a poor use of technology.

    I’m torn on this point, because I think high-quality VR experiences are critical to achieving acceptance for the platform — but if those experiences cost $400 or more, it’s unlikely that VR will ever achieve critical mass.

    Sensible maneuver or Oculus Grift?


    I believe Luckey when he says he wants to build the premier VR headset experience you can have today, but I’m not at all convinced he’s made the right call on this one. Much will depend on how Sony’s PlayStation VR and the HTC Vive are priced. If Oculus comes in under these solutions, it could still win significant market share for itself, even if the high price tag keeps most users on the sidelines.

    Right now, it looks as though Oculus has priced itself neatly out of the market. At $400+, users are going to look for other solutions — and companies like HTC could make a killing on selling “good enough” hardware. By staking an early claim to best-in-class, Luckey may have ensured that the Rift becomes irrelevant.

    more...
    Game Tech News || Trading blogs || My blog

  3. #203
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Microsoft’s HoloLens coming in early 2016, developer kit to cost $3,000




    From the beginning, Microsoft has tried to set its nascent augmented reality technology, dubbed HoloLens, apart from competing VR solutions. While VR efforts like the Oculus Rift and Sony’s PlayStation VR have focused on showcasing how virtual reality creates new opportunities to explore games and distant worlds, Microsoft’s AR demos have dedicated themselves to showing how augmented reality concepts can overlay and interact with real life. Now, the software giant has revealed the upcoming HoloLens developer edition — and its $3,000 price tag.

    Reactions to the developer kit have been mixed so far. There’s no doubt that so-called “mixed reality” games can offer some interesting experiences; Microsoft’s own demo on the show floor showed robots breaking into a mocked-up living room and attacking a player. The spider bots are aware of the layout of furniture and other room objects as well as other creatures generated by the AR title — as a larger robot breaks into the room, the smaller robots scuttle to get out of its way. They can be destroyed by the player and can fire projectiles at him (as shown below).



    Unlike VR kits, which all require some additional device support (even if that’s just a mobile phone), the HoloLens is its own self-contained unit. Exactly which hardware components are used in the system is unclear, but we’ve heard rumors of an x86 processor, 60Hz refresh rate, and 2GB of RAM in total. Earlier this year, there were rumors that HoloLens used a custom Intel Cherry Trail processor — whether this is still true with the upcoming developer kits isn’t something Redmond is willing to tell the public just yet. Microsoft also claims to have developed its own holographic processor unit, or HPU, based on a custom silicon design.

    The living room demo


    The full demo video can be seen below — if you want to skip to the actual game demonstration, it starts around the 1:15 mark.

    As tech demos go, this one is fairly impressive, but it also raises some questions Microsoft hasn’t historically been good at answering. Once upon a time, the new hot technology from Microsoft wasn’t HoloLens but Project Natal, later called Kinect. Like HoloLens, Kinect was going to revolutionize gaming by turning your entire body into a controller. In reality, turning someone’s entire body into a controller wasn’t actually much fun. It made it nearly impossible to control player movement through a game world or to perform complex tasks. Without buttons to press, players were reliant on swipes or other large motions.

    If Microsoft had focused on making Kinect integration game-enhancing — by, for example, allowing players to use military sign language to issue orders to squadmates in games like Battlefield 3 or 4 — then the technology might have taken off or at least earned a devoted following in specific titles. Instead, Kinect was generally ignored after the initial flurry of launch titles. By the time the Xbox One launched, Kinect 2 integration was seen as a negative, not a positive.


    more...
    Game Tech News || Trading blogs || My blog

  4. #204
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Star Wars Battlefront on the PC: Impressions and performance




    For the past few days, EA’s Star Wars: Battlefront has been in open beta. We spent some time in the game in all three modes — the Battle of Sullust, Walker Assault on Hoth, and the single-player missions that pit you against waves of stormtroopers and attack vehicles in a survival mode. Unlike the console players, who are stuck dealing with either 900p on the PS4 or 720p on the Xbox One, PC gamers get the full monty — as much resolution as your monitor can handle, and quality settings that truly bring Star Wars to life around you.

    It’s difficult to know what to write about Star Wars: Battlefront, and for reasons that have nothing to do with the fact that this was a beta with just three game modes. The overwhelming and immediate thought when you first fire up the game is “I’m playing Star Wars!” On that front, Dice has succeeded beautifully. This game feels like a love letter to every kid who ever raced through the house clutching Han Solo’s DL-44 and making blaster noises. As you race to recover escape pods on Sullust, the capital ships overhead fire on each other (and occasionally on the planet).



    I’m even willing to forgive the fact that Dice shows Imperial Star Destroyers as in-atmosphere craft over both Sullust and Hoth when they ought to have used Victory or Venator-class Star Destroyers instead.

    This nostalgia is particularly strong on Hoth, whether you play as Imperial stormtroopers or the Rebel Alliance. The map is asymmetrical, meaning the two sides have vastly different goals and strategies for winning. The Rebels must activate satellite uplinks that enable Y-wing bombers to make attack runs on the advancing walkers. The Imperials must defend the walkers against these attacks, which means keeping the satellite uplinks out of commission. As the battle progresses, you’ll fight past the iconic Kuat ion cannon and into Echo Base itself.



    I know that there’ve been previous Battlefront games, but the last version came out in 2005 — long before the advent of DirectX 11, 12, or modern hardware. While that 2005 game holds up reasonably well, considering its age, it’s got nothing on the models and levels of detail Dice has brought to the table. As a nostalgia play and crazy-fun dip into first person Star Wars combat, Battlefront is a true achievement.

    more...
    Game Tech News || Trading blogs || My blog

  5. #205
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    AMD announces fresh losses, pivots towards new licensing strategies



    AMD announced its quarterly results for Q3 2015 yesterday, and the short-term news wasn’t good. While company revenue rose by 12% compared to Q2 2015, total Q3 sales were down 25% year-on-year. AMD reported that demand for both its semi-custom and PC hardware increased in Q3 thanks to Carrizo’s ramp and the approaching Christmas holiday. AMD announced several new projects to hopefully boost its revenue in 2016 and beyond, including the sale of its assembly, test, mark, and pack facilities to a new joint venture with Nantong Fujitsu Microelectronics (NFME).

    As part of this deal, AMD will transfer 1700 employees to the new company, contribute its facilities in Penang, Malaysia, and Suzhou, and sell 85% of its share in these facilities and operations to NFME. AMD will receive $371 million, including $320M in cash. Earlier this year, we heard rumors that AMD had attracted a new source of capital that would help offset its cash burn-through rate, and this announcement confirms it.



    AMD reported some modest ASP improvements in GPUs, but CPU/APU prices were down sequentially and year-over-year. The enterprise, embedded, and semicustom segment (EESC) grew 13% sequentially, thanks to seasonally higher sales of Xbox One / PS4 chips.

    Analyzing AMD’s market position


    Last quarter, we noted that AMD’s APU sales had tanked year-on-year, and while Q3 improved those results, the gap is still critical. For the past 15 years (save for a brief period in 2004 – 2006), AMD has had one critical weakness compared to Intel — it lacked a low-volume, high-margin business that could shield it from the vagaries of the consumer space. One critical reason Intel is still clocking in margins north of 60% and recording solid revenue quarter after quarter is because Intel’s share of the still-growing enterprise market is propping up its financials. AMD has no such shield.

    Even in the consumer space, AMD has very little presence in the high-end boutique systems that are driving mobile gaming or high-end consumer sales. Its Radeon Fury family, including the R9 Nano, will sell into these segments (Fury sales are credited with improving GPU ASPs), but it can’t offer a high-end solution that’s competitive in both CPU performance and power efficiency. The lower-end systems that anchored AMD’s APU sales are falling to tablet SKUs.

    AMD bears full responsibility for its own mistakes and missteps, but the company would be in a vastly different position if the PC market was still growing at 2-3 percent a year, or even holding flat. The steep quarterly declines that we’re seeing are killing the company’s profits — AMD has been unable to trim its costs faster than the market is trimming its income. The company isn’t just trying to reinvent itself — it’s trying to reinvent itself in the worst, most prolonged slump in PC sales, ever.

    Subtle shifts in strategy


    One thing Lisa Su noted yesterday is that the company is exploring shifting from an “opportunistic” licensing strategy to a strategic one. Put simply, AMD wants to see if its patent portfolio can be shopped around for licensing opportunities. Su didn’t use any of the language patent trolls tend to prefer, like “Aggressively pursuing infringement cases” or “We have a new method of sucking the blood out of grandmothers and war orphans,” but I suspect there are two motives behind the move. First, AMD needs the revenue patent licensing could bring. Second, AMD wants to show investors that it’s serious about leveraging all available assets.

    more...
    Game Tech News || Trading blogs || My blog

  6. #206
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    The Force Awakens and technology in the Star Wars universe



    The official trailer for Star Wars: The Force Awakens dropped Monday night, and I’m not ashamed to admit I’ve watched it enough times to wear holes in the tape (assuming YouTube videos came on VHS tapes, anyway). This is our first canonical look at the Star Wars universe some 30 years after the iconic Battle of Endor, and it’s an interesting window into how the technology of the Star Wars universe has — or in this case, hasn’t — evolved since Episodes IV – VI.

    Note: For the purposes of this article, I’ve stuck mostly to the “new” canon, which consists of the original movies, the prequels, the Clone Wars and Rebels cartoons, and the new comic books and tie-in novels issued in advance of Episode VII. In Star Wars, the Battle of Yavin is treated similarly to the BCE / CE divide in modern chronology. Events before the battle are labeled as “BBY” and count down, while events after the battle are denoted as “ABY” and count up.


    Star Wars has always portrayed technology as progressing much more slowly than other famous franchises. One of the most common features of sci-fi universes is their depiction of technological progress, even if that progress ultimately creates dystopian settings. TV shows like Star Trek: The Next Generation or Battlestar Galactica, in contrast, depict a future in which technology continues to evolve quickly. Even by Star Wars standards, however, the vessels and technologies on display in The Force Awakens haven’t changed nearly as much as one would expect.

    In Star Wars, technologies like hyperspace travel, advanced AI, and hologram projection were just as ubiquitous at the beginning of the prequels as anything shown later in the franchise. Even on a backwater planet like Tatooine, Luke carries a blaster, drives a high-speed hovercraft, and isn’t fundamentally astonished to discover that blades of energy that can cut through anything actually exist. What sets The Force Awakens apart from the prequels is that Lucas depicted a very different array of starships, fighters, and combat technology in the prequels than in the films that came later.



    The new T-70 starfighter (the original X-Wing was the T-65)

    In Episode VII, virtually every example of technology shown in the various trailers appears either directly derived from or identical to hardware we’ve seen in previous films. Stormtroopers still wear nearly-identical armor. Starfighter technology is clearly derivative — in the TIE fighter and X-Wing match-up that occurs towards the end of the trailer, both craft are closely related to the versions seen in the original trilogy. This is rather interesting, given that Lucas’ prequels used a variety of fighter designs that were visually distinct from the fighters in Episodes IV-VI, even if they often featured similar visual themes.



    Ships like the Eta-2 Actis-class were deliberately designed to look like the TIE fighters featured in the original trilogy

    We haven’t seen many capital ships at all so far, but it appears that the First Order is using modified Imperial Star Destroyers. These iconic wedge-shaped vessels are instantly identifiable as “Star Wars ships,” but again, Lucas chose a different path. The early Star Destroyers that the Galactic Republic deployed in Attack of the Clones and continued using through the Clone Wars cartoons and Revenge of the Sith are clearly of a different class than the ISD that stretched across the cineplex when Episode IV debuted in 1977.


    more...
    Game Tech News || Trading blogs || My blog

  7. #207
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    A leaked Black Friday ad points towards a $300 Xbox One



    As 2015 enters its final stretch, the hype around Black Friday has begun to build. We’re only about a month away from the biggest shopping day of the year, and some impressive deals have already started to leak. Cheap laptops and HDTVs are no surprise, but the recent revelation of a super-cheap Xbox One has caught my attention.

    Over at BlackFriday.com, a six-page advertisement for Dell’s Black Friday sale has been scanned and put up for the world to see. Desktops, laptops, tablets, and more are included, but the very first page features the most interesting placement: a $300 Xbox One bundle.



    For just $299.99, you’ll be able to get a stock 500GB Xbox One Gears of War: Ultimate Edition bundle with a copy of Fallout 4 and an extra controller. Considering that you’d typically be paying $350 for the console bundle, $60 for the game, and $60 for the controller, this package is offering up a whopping $170 worth of savings.

    Listed as a “doorbuster,” this online-only deal goes live at 6pm ET on November 26th. Of course, it’s still hard to tell how big this sale will turn out to be. The ad states that there will be “limited quantities,” but there’s no indication of just how limited the supply will be. Theoretically, Dell might only offer a couple hundred consoles at this price to drive traffic, and then switch out for a slightly worse deal hoping to profit off of the public’s desire to spend money during the holiday rush. If you want the best chance at getting a cheap Xbox One, plan on hitting the Dell store as soon as it goes live.

    Stepping back from this deal in particular, I’m starting to wonder how steep the console discounts will go this year. This ad could either be the tip of the discount iceberg or a massive outlier cooked up by a Dell executive. But considering that the standard price for the PS4 is now identical to the Xbox One’s asking price, Microsoft might feel the need to push hard across the board to compete with Sony.

    While the PS4 has been consistently outperforming the Xbox One in sales, Microsoft is still in a much better financial situation than Sony. If the powers that be at Redmond think it’s worth it to eat the losses to boost adoption, we could see the Xbox One’s price tag shrink even more.

    This time last year, Microsoft “temporarily” dropped the price of the Kinectless Xbox One SKU to just $350. That price ended up sticking around, but Microsoft still hasn’t been able to close the sales gap. Pair a $50 price cut with the release of Halo 5: Guardians, and this console starts looking quite attractive to the millions of Xbox 360 owners who still haven’t upgraded. And since Sony doesn’t have any major first party releases lined up in Q4 of 2015, now is the perfect time to strike. Let’s just hope that Microsoft figured that out.


    more...
    Game Tech News || Trading blogs || My blog

  8. #208
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Analysts predict slow ramp for VR technologies



    When Oculus announced its Kickstarter campaign for the Rift VR headset, it probably didn’t realize just how excited consumers were going to be. The number of companies planning to launch VR headsets or related technologies has skyrocketed in recent years. Sony has VR plans for the PlayStation 4, Qualcomm claims VR capabilities are baked into its upcoming Snapdragon 820, and both AMD and Nvidia are planning marketing blitzes of their own. It’s surprising — and honestly, refreshing — to see a major analyst firm putting the brakes on expectations for the market, and projecting that VR faces a slow, steady build over the next four years rather than meteoric success.



    Samsung’s Gear VR is coming in at $99

    IHS is projecting initial VR hardware sales at $1.1 billion in 2016, growing to $2.7 billion in 2020. The firm expects 64% of this revenue to flow to companies that focus on smartphone platforms. The drastic price difference between the Oculus Rift and the upcoming $99 Gear VR headset likely accounts for some of this as well. IHS notes that most spending on software will take place at the high end of the market, which is expected to split between the Oculus Rift, PlayStation VR, and HTC Vive. Total game-related VR sales for 2016 are expected to be $496 million.
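
    For context, those figures imply steady rather than explosive growth. A quick sanity check of the implied compound annual growth rate:

    Code:
    # CAGR implied by the IHS forecast: $1.1B (2016) growing to $2.7B (2020)
    start, end, years = 1.1, 2.7, 4
    cagr = (end / start) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # roughly 25% per year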

    “Conditions are more suited to virtual reality technology and content adoption than ever before,” IHS wrote. “It is neither a bubble nor the next big thing.”

    Slow and steady wins the race


    I’ve been impressed by VR technology every time I’ve had a chance to use it, which is why I’m glad to see an analyst firm issuing a realistic analysis of the technology’s near-term potential. As great as VR is, designing VR games requires developers to consider how the brain processes motion, movement, and the location of your body. This sense is called proprioception, and wearing a headset can interfere with it dramatically. This piece from Ars Technica reviewed some of the challenges of designing environments for VR. Certain kinds of motion are prone to triggering nausea in players. Stairs, for example, can be troublesome — when your brain sees “you” climbing stairs, it instinctively attempts to position you for doing so. Players end up leaning backwards to counteract nonexistent forces.

    Some of these problems may be easy to fix; researchers have reported that simply adding a nose to VR first person games helps stabilize the view and reduce nausea. Either way, developers are going to have to change the kinds of content they develop for VR. And that’s a potential problem.

    It helps to consider the fate of 3-D cinema and that technology’s sudden surge and waning strength. The early 3-D films that dominated the modern era — Polar Express and Avatar — were created specifically in that format and were designed to take advantage of it. Once studios realized that 3-D content was selling, they jumped to take advantage of it — mostly by pushing inferior, post-production 2D-to-3D conversions, or by including only a limited amount of 3-D footage (15-20 minutes of the movie). These efforts are often distinctly inferior, and the color palettes tend to be muted and muddy as a result. Slap on a $5-$10 surcharge for 3-D tickets, and consumers quickly realized that while they were paying a premium for a 3-D film, they often weren’t receiving an improved experience.

    3-D and VR are very different technologies, but they both require content creators to design towards the specific capabilities of their respective platforms, while acknowledging and compensating for their differences. A huge flush of VR content might seem like a gaming best-case scenario, but I strongly suspect that many corporations would abandon principles of good design in their rush to pile on and make a quick buck.

    The downside to this trend is that it could take several years for the market to truly catch on, and funding for AAA VR titles and conversions could be hard to come by. Studios will likely start experimenting with content almost immediately, and there are some amazing games in development, but we don’t expect VR to dominate game sales out of the gate. It’ll be another 4-5 years (and probably another generation of console hardware) before we start to see the medium find its legs. But as far as getting things right and building a stable base for the concept, that’s the right way to do it.

    more...
    Game Tech News || Trading blogs || My blog

  9. #209
    Senior Member matfx's Avatar
    Join Date
    Sep 2013
    Location
    Malaysia
    Posts
    1,178
    Blog Entries
    114
    Follow matfx On Twitter

    How dynamic resolution scaling keeps Halo 5 running so smoothly

    Over the years, gamers have gotten used to highly detailed games that drop frames and get distractingly choppy when the action gets too intense (a deep pain I've personally been suffering through since at least Gradius III on the SNES). Now it seems some developers are toying with the idea of dropping a few pixels of resolution in those cases in order to keep the frame rate silky smooth.

    The technique is called dynamic resolution scaling, and a recent analysis by Digital Foundry goes into some detail about how it works in Halo 5: Guardians. Basically, the developers at 343 have prioritized hitting 60fps consistently through the entire game, a big boon for a twitchy first-person shooter (and a first for the Halo series). The level of graphical detail in some game scenes, though, means that such a high frame rate can only be delivered at resolutions well below the Xbox One's highest 1080p standard.

    Instead of just statically setting a low resolution ceiling for the entire game, though, Halo 5 dynamically changes the resolution based on the detail of the current in-game scene. This on-the-fly adjustment takes place on both the X and Y axes, with resolutions jumping from as low as 1152×810 to as high as 1536×1080 in Digital Foundry's analysis. The apparent on-the-fly change in resolution wasn't even noticeable to my eye during some recent testing.
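
    To make the idea concrete, here's a minimal sketch of how such a controller might work: hold the 60fps frame-time budget fixed and let the render resolution float between a floor and a ceiling. The resolution bounds come from the Digital Foundry figures above; the control loop itself is a simplified illustration, not 343's actual algorithm.

    Code:
    # Illustrative dynamic resolution scaler (not 343's implementation).
    # Keeps GPU frame time under the 60 FPS budget by trading pixels for speed.
    FRAME_BUDGET_MS = 1000 / 60          # ~16.7 ms per frame at 60 FPS
    MIN_RES = (1152, 810)                # lowest resolution Digital Foundry observed
    MAX_RES = (1536, 1080)               # highest resolution Digital Foundry observed

    def next_resolution(current, gpu_frame_ms, step=0.05):
        """Nudge the render resolution up or down based on the last GPU frame time."""
        w, h = current
        if gpu_frame_ms > FRAME_BUDGET_MS:            # over budget: shed pixels
            scale = 1.0 - step
        elif gpu_frame_ms < FRAME_BUDGET_MS * 0.85:   # comfortable headroom: add pixels
            scale = 1.0 + step
        else:
            return current                            # close to budget: hold steady
        w = int(min(max(w * scale, MIN_RES[0]), MAX_RES[0]))
        h = int(min(max(h * scale, MIN_RES[1]), MAX_RES[1]))
        return (w, h)

    # Example: a heavy firefight pushes frame time to 19 ms, so resolution drops.
    print(next_resolution((1536, 1080), gpu_frame_ms=19.0))  # -> (1459, 1026)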

    While Digital Foundry says that "the game spends the overwhelming majority of its time well under full 1080p," this dynamic resolution scaling means that players can get the very best visual fidelity possible at any point without the usual frame rate jumpiness. The game also uses other tricks to preserve the overall frame rate, such as using less detailed, "half-rate" animations for far-off enemies (like many other games, Halo 5 also uses less-detailed polygonal models to save cycles when rendering far-off objects).

    This isn't a completely new feature to gaming, even on the Xbox One. Earlier this year, The Witcher III: Wild Hunt dynamically upscaled from its default 900p resolution to a more detailed 1080p when possible. But Halo 5's system seems much more robust, operating on a sliding scale designed to squeeze as many pixels as possible from every single scene.

    Other developers have shied away from similar resolution tweaking for their games, though. While working on the PS4 remaster of The Last of Us, Naughty Dog programmer Drew Thaler tweeted that while this kind of resolution scaling worked wonders in a fast-paced racing game like Wipeout, "[to be honest] racing games are special; it would probably look bad in most other game genres."

    Overall, though, we're bullish on this technique for any genre where smooth, 60fps refresh rates are important, i.e., any genre where quick reflexes are involved. It's no secret that many games have had trouble consistently reaching the 1080p ideal on the latest generation of console hardware, especially on the Xbox One. Dynamic scaling and similar techniques, though, can allow games to look as good as possible during relatively simple scenes, while still animating smoothly during crowded and chaotic portions.

    Making a game look as good as possible on static console hardware is always a difficult balancing act between resolution, frame rate, and the quantity and detail of the moving parts in a scene. Rather than forcing developers to pick one static maximum on all of those axes for an entire game, Halo 5 proves that some dynamic allocation can get the most out of set hardware at any one point.



    More

  10. #210
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    800

    Nintendo announces its first smartphone game, but is it too little too late?




    During a meeting with investors earlier this week, Nintendo unveiled its very first smartphone game. Dubbed “Miitomo,” this free-to-play game lets you use an adorable avatar to interact with your friends without investing in Nintendo hardware. This certainly isn’t the kind of mobile game many of us were hoping for, but can this oddball social game keep the gaming giant relevant in a rapidly evolving market?

    Unfortunately, we won’t be seeing a Nintendo game on smartphones this year. While we expected Nintendo and DeNA’s first release to ship at the end of 2015, it is now officially delayed until March of 2016. Thankfully, we did get an early look at what Nintendo has planned for us. Miitomo, a portmanteau of “Mii” (Nintendo’s famous avatars) and “tomo” (Japanese for “friend”), has been confirmed to be a free-to-play experience. However, Nintendo was quick to qualify that some of its other smartphone games will feature a more traditional buy-to-play financial model.

    By all accounts, this new game is focused heavily on communicating with your friends while looking at cute avatars on your phone. This isn’t going to be the next Puzzle & Dragons or Clash of Clans, but I’ll wager that this will do well — especially in Japan. That market has already shown how willing it is to dump money on sticker packs for more traditional messaging clients, so it’s no surprise that Nintendo wants in on some of that easy cash. Charge 100 yen for a Mii Mario cap, and watch your pockets grow, right?

    While the 3DS was in no way a failure, it never received the same kind of attention as the wildly successful original DS platform. Part of that can be attributed to the poor launch, and the attachment to lackluster 3D technology, but it’s clear that the major uptick in smartphone adoption is what’s really hindering Nintendo’s handheld. We live in a much different world than we did when the DS first launched back in 2004.

    It hurts my poor nostalgic heart to say this, but this announcement is telling of Nintendo’s future on mobile platforms. It seems like it’s here to make some cold hard cash off of existing trends — not to move the medium forward. Take a look at the games that DeNA has released previously, and all of your hopes and dreams will go right out the window. Everything about this seems cold and calculating to me, despite the fact that it’s couched in warm and fuzzy language about togetherness.

    more...
    Game Tech News || Trading blogs || My blog
