
Game Tech News

This is a discussion on Game Tech News within the Electronics forums, part of the Non-Related Discussion category.

      
   
  1. #71
    member HiGame's Avatar
    Join Date
    Sep 2014
    Posts
    1,062
    Blog Entries
    801

    Razer’s $100 Forge TV will try to bring PC gaming to the living room



    Though less than two years have passed since their reveal, Valve’s Steam Machines are basically vaporware at this point. The hype was almost overbearing for a while, with gaming hardware companies announcing their own takes on the Steam Machine on a regular basis, and gaming communities passionately discussing whether the initiative would be Valve’s next big thing or its next big flop. After countless controller redesigns, effectively no console announcements, and even Alienware changing its Steam Machine to a more standard gaming PC, it would seem that the hole in the living room PC gaming market is as big as it has always been. With the reveal of the Forge TV, a $100 Android gaming console capable of PC game streaming, Razer looks to plug the hole that many thought Valve would fill.

    While both Android gaming consoles and Steam Machines haven’t caught on, Razer is hoping the combination of the two — plus the brand’s popularity, sleek aesthetics, and a fancy lapboard-and-mouse combo — will help. First, the Forge TV acts as an Android TV device, meaning it runs Android games from Google Play (in HD, of course), supports four-player local co-op, runs Android apps (and thus streaming media services), and has Google Cast support so you can access it through a variety of mobile devices and PCs.

    What sets the Forge TV apart from other Android consoles is that its companion software, Cortex, has the capability to stream games from your PC to your living room — and Cortex isn’t limited to one service. Cortex can not only stream Steam games, but can stream content from Battle.net, Origin, Uplay, and even games stored locally — all at 1080p.

    To achieve these lofty goals, the Roku-sized, Android 5.0 Lollipop-based Forge TV is powered by a 2.5GHz quad-core Snapdragon 805 with an Adreno 420 GPU, 2GB of RAM, and a seemingly meager 16GB of onboard storage. Its connectivity options include 802.11ac WiFi, Bluetooth 4.1, and Gigabit ethernet, and the console also has one HDMI 1.4 port and one USB 3.0 port. It won’t hold a candle to your gaming laptop, and it’s simply mobile tech stuffed into a tiny 105mm x 105mm x 17mm body, but it should be more than enough to handle Android games and streaming.



    Judging by the constant controller redesigns, where Valve has hit a snag with its Steam Machine initiative is in making traditional PC gaming controls comfortable in a non-desk environment. Translating a keyboard-and-mouse control scheme to a console controller layout has thus far been tough for Valve, so instead of looking for a single solution, Razer is offering up both options: an Nvidia Shield-style controller-and-console combo (seen above), and a lapboard-and-mouse combo (seen below).



    The Bluetooth Shield-like controller, dubbed the Serval, apes the venerable Xbox controller layout and has a clamp to attach your phone or tablet. While a controller with a smartphone clamp isn’t anything new, Razer’s lapboard, the Turret, aims to be. It’s a tenkeyless keyboard (meaning no number pad) with an ambidextrous mouse and a magnetic mouse pad that can attach to either side of the board. The wireless lapboard can last up to four months on a single charge, and the wireless mouse can handle up to 40 hours of nonstop use. The lapboard’s charger doubles as a slim vertical stand, so it’s easy to store.

    The Forge TV alone sells for $99.99, and costs an extra $50 if you add a Serval controller. A standalone Serval runs $79.99, and the Turret lapboard costs $129.99. The entire shebang releases in the first quarter of this year.


    More...

  2. #72 HiGame

    Comcast announces plans to roll out gigabit internet by the end of the year


    The various established telcos and cable operators have been under pressure ever since Google announced it would begin rolling out fiber networks to consumers in test locations across the country. Now, Comcast is gearing up to fight back with its own next-gen standard support and a promise of 1Gbps performance delivered over a cable modem.

    These speed jumps come courtesy of improvements to the DOCSIS (Data Over Cable Service Interface Specification) standard. Currently, DOCSIS 3.0 is deployed across much of the US and Europe, but the new technology Comcast is going to deploy is based on Broadcom’s BCM3390 cable modem SoC, and that chip adds support for DOCSIS 3.1.



    Expected performance from DOCSIS 3.1 (EuroDOCSIS is the European version but has similar specs)

    DOCSIS 3.1 is designed to allow cable modems to hit performance targets that were previously only possible with fiber networks through the use of 4096 QAM (quadrature amplitude modulation) and other steps that improve signal efficiency and transmission quality.
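    The modulation gain alone explains much of the headroom: each QAM symbol carries log2(order) bits, so moving from the 256-QAM ceiling of typical DOCSIS 3.0 plants to 4096-QAM fits 50% more bits into the same spectrum. A quick sketch (the modulation orders come from the specs; the comparison itself is ours):

```python
import math

def bits_per_symbol(qam_order):
    """Each QAM symbol encodes log2(order) bits."""
    return int(math.log2(qam_order))

d30 = bits_per_symbol(256)    # densest modulation in common DOCSIS 3.0 use
d31 = bits_per_symbol(4096)   # densest modulation DOCSIS 3.1 supports

print(d30, d31)                          # 8 vs. 12 bits per symbol
print(f"{(d31 - d30) / d30:.0%} gain")   # 50% more bits per hertz
```

    Real-world throughput depends on signal quality; noisy plant segments fall back to lower modulation orders, which is why the "other steps that improve signal efficiency and transmission quality" matter as much as the headline QAM number.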



    When Broadcom announced its BCM3390 at CES 2015, Comcast’s executive vice president, Tony Werner, said, “DOCSIS 3.1 is a critical technology for Comcast to provide even faster, more reliable data speeds and features such as IP video to our subscribers’ homes by harnessing more spectrum in the downstream. By more effectively using our cable plant to grow our total throughput, we expect to offer our customers more than 1 Gigabit speeds in their homes in 2015 and beyond.” What’s a tad disingenuous about Comcast’s position is that the current DOCSIS 3.0 standard is fully capable of gigabit transmission already, albeit not very efficiently. Still, the capability is there, and Comcast hasn’t previously deployed it — most likely because rolling out new DOCSIS 3.1 modems is also an opportunity to drive up modem rental fees for the new hardware and to drag out the upgrade process.

    Pricing and rollout availability are both still to be determined, but we’d expect the company to target those cities which Google selects for its own fiber rollouts. AT&T has also promised to build out high-speed connections to 100 cities in the near future — it’s amazing how fast companies respond to perceived competitive pressures when competition actually exists in a market.

    Broadcom expects to sample the BCM3390 modem in the first part of the year, with modem availability coming in the back half of 2015. In addition to enabling efficient gigabit cable networks, the new modems will also offer 2Gbps WiFi performance, potentially opening the door for wireless 4K streaming within the home, provided that there’s a clear line of sight between router and television.


    More...

  3. #73 HiGame

    Intel’s Broadwell is coming to mainstream laptops — here’s what you need to know



    CES has always been a major launch window for Intel, and 2015 was no exception. The company announced multiple new Broadwell chips today, as it’s been promising to do for the past few years — but exactly which cores are these, and how much should you care about them compared to the Haswell chips currently on the market?

    First, let’s back up to how Broadwell’s initial delays impacted the long-term product rollout. When Intel decided to stagger Broadwell’s launch dates, it opted to introduce the most power-efficient cores first, followed by mobile parts in the 5-28W TDP range. The new chips are known as Broadwell-U, to distinguish them from the Core M family (Broadwell-Y).

    These new cores are going to replace the current crop of Haswell low-voltage and mainstream dual-core processors, and will primarily drop into ultrabooks and non-enthusiast laptops. The advance is driven by two different dies — the first an uprated 1.3B-transistor Core M chip with higher TDPs and correspondingly higher clocks.



    The other Broadwell variant, shown above, bumps up the transistor count and die size considerably, mostly on the GPU side of the equation. Broadwell-U’s 1.9B transistors are spent pushing the GPU out to 48 execution units, up from the 24 EUs in Broadwell-Y/Core M. In Intel’s parlance, an “EU,” or execution unit, is the fundamental unit of the graphics engine — the EU count is roughly analogous to the CUDA core count or streaming processor count of an Nvidia or AMD chip (note: GPU core counts cannot be compared directly due to drastic differences in how much work each “core” can perform per cycle).



    Broadwell’s GPU is available in two basic configurations — GT3 (48 EUs) and GT2 (24 EUs). GPU horsepower is the major factor in determining Intel’s hardware stack and processor pricing. The company is releasing the same basic dual-core + Hyper-Threading CPU in multiple GPU configurations and TDPs, including:

    • Intel Iris Graphics 6100 (GT3 graphics, 28W TDP)
    • Intel HD Graphics 6000 (GT3 graphics, 15W TDP)
    • Intel HD Graphics 5500 (GT2 graphics, 15W TDP)
    • Intel HD Graphics (Pentium and Celeron, 15W TDP)
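    To put the GT2/GT3 split in rough perspective, theoretical peak throughput scales linearly with EU count and clock. The 16 FLOPs-per-EU-per-clock figure (two 4-wide FMA-capable units per EU) and the clock speeds below are illustrative assumptions for the sketch, not Intel-published numbers for these SKUs:

```python
def peak_gflops(eu_count, clock_ghz, flops_per_eu_clock=16):
    """Theoretical single-precision peak: EUs x FLOPs/EU/clock x clock (GHz)."""
    return eu_count * flops_per_eu_clock * clock_ghz

gt2 = peak_gflops(24, 0.95)  # hypothetical GT2 boost clock
gt3 = peak_gflops(48, 1.10)  # hypothetical GT3 boost clock
print(f"GT2 ~{gt2:.0f} GFLOPS, GT3 ~{gt3:.0f} GFLOPS")
```

    The doubled EU count more than doubles peak throughput once the higher GT3 clocks are factored in — which is exactly why GPU configuration, not CPU speed, drives the pricing stack here.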





    Plain “HD Graphics” with no model number denotes a cut-down subsystem with minimal performance attached to a low-cost processor. Graphics capability plays an important role in both chip TDP and Intel’s pricing — the cheapest HD 6000 parts start at $315 at 15W, while the slightly higher-clocked Iris 6100 chips at 28W TDPs are the same price.

    Will Broadwell boost battery life?


    The big question for mobile users is going to be whether Broadwell will improve battery life. The picture there is complicated — it very much depends on where you look and what you’re looking for. On the positive side, Broadwell incorporates multiple improvements to the underlying SoC, over and above the improved 14nm process. Intel can now turn the GPU off for much of the time (while continuing to refresh the display controller). In addition, Broadwell sports a new, low-power audio DSP that improves power consumption while watching movies or playing back certain types of content.

    Previous research, however, has shown that these kinds of gains are very much conditional. When Haswell shipped, DigitalTrends did a comparison study on how an identical laptop with Haswell performed in various battery life scenarios compared to Ivy Bridge. In near-idle tests, Haswell indeed delivered a nearly 50% increase in battery life.



    In other scenarios, however, the gains were much more modest, and at full power load, battery life actually went backwards. This reflects underlying physics rather than skullduggery on Intel’s part, but it’s important to keep in mind that the improvements you see are going to be workload dependent.

    The second complicating factor is laptop manufacturers’ tendency to absorb efficiency improvements through the use of higher-end hardware and the relentless pursuit of thinner form factors and lighter systems. Again, there’s nothing objectively wrong with that, but higher-resolution displays consume more power than lower-resolution ones, assuming equal technology. Smaller, lighter laptops tend to mean smaller, thinner batteries. In some cases, this also means a reliance on USB charging, which can exacerbate the battery issue by taking longer to recharge the system.

    Put together, we’d suggest that end users expect to evaluate Broadwell system-by-system and in specific use cases. There will undoubtedly be scenarios where it substantially improves on Haswell — and there are going to be laptops that don’t show much gain at all, for a variety of reasons.

    Overall, Broadwell is going to be a positive step forward for the mobile PC industry and if you’re upgrading from a laptop from the 2010-2011 era, I’d jump on it, no question. If you’re already using Ivy Bridge or Haswell, however, I’d wait to see what specific systems offer and evaluate the various pros and cons of hardware configurations. Lower resolution displays and heavier batteries may not be as sexy as Retina-class hardware and featherweight products, but for some people, the battery life improvements outweigh the other issues.


    More...

  4. #74 HiGame

    GameStop resold temporarily repaired Red Ring of Death Xbox 360s, report says



    One of the most widespread hardware failures to plague consumer electronics was the Xbox 360’s Red Ring of Death, an unfixable light pattern — unaccompanied by an error code, unlike other Xbox 360 failures — that indicated one or more hardware components had broken down in some way. A new report claims that GameStop learned how to fix it back in 2009, and has been reselling the refurbished consoles ever since. The fix, it turns out, could very well have been temporary, which means the much-derided GameStop was selling Xbox 360s marked for death.

    Last console generation, the iconic four-quadrant circular light on the Xbox 360’s power button was the cause of much heartbreak. If everything was in working order, the top-left quadrant and inner power symbol would glow green. If something was amiss, quadrants would light up red — all four if there was an AV cable error, or the top and bottom left if the console was overheating, for instance.

    Game Informer estimates that 54.2% of Xbox 360 consoles have experienced a Red Ring, meaning more than half of Xbox 360 owners have a hardware failure war story. As part of a Businessweek article that detailed how retail chain GameStop has thus far managed to stay afloat in an increasingly digital world, the magazine reports that GameStop was reselling temporarily repaired Red-Ringed consoles.

    The Red Ring was the result of a faulty connection between chip and motherboard, and — despite a slew of questionable homebrew fixes found throughout message boards — almost always resulted in sending the console back to Microsoft for repair. GameStop’s admittedly ingenious repair team discovered that by heating the top of the console while cooling the bottom, the connection between chip and motherboard could be restored. The report states that GameStop built its own repair machine, operated by a $10-per-hour employee, and resells the refurbished consoles at near-full price. While we all know GameStop is the dictator of trade-ins, it was company policy not to accept Red-Ringed consoles — even though the company built a machine to fix and prep those very consoles for resale.



    Sadly, the fix is likely temporary. The GameStop heating-and-cooling method sounds similar to the famous towel method that spread across the internet; remove everything (that’s easily removed) but the power supply from the Xbox 360, wrap it entirely in towels, plug the console in and let it run for 20 minutes, turn it off and unplug for 20 minutes, and it should work once again. The towel method often worked because it used the console’s own heat to re-melt solder on the GPU, but generally proved to be a temporary fix because of the inherent flaws in the motherboard that caused the problem in the first place.

    So, it’s entirely possible that the inflated number of 54.2% is not the result of Microsoft building so many broken consoles, but because previously broken consoles that would break again were being resold through retail stores. Perhaps that’s good news for Microsoft’s reputation, but certainly not good news for GameStop or the customers that purchased the marked-for-death consoles.

    More...

  5. #75 HiGame

    8 ways to improve your gaming experience





    Did you get any awesome gaming gear this holiday season — a PS4 or Xbox One under your tree? Or maybe Aunt Marge and Grandpa Smithers made your day with a pile of gift cards instead. Regardless of your platform of preference, you can always improve your gaming experience with some snazzy new accessories. And if you can subsidize your upgrades with all those gift cards burning a hole in your pocket, even better.

    Today, let’s take a look at eight of the very best ways to enhance your time spent gaming in 2015. Whether you have $5 or $500 to spend, there’s something here for everyone. Even if you’re still playing on a PS3 or Xbox 360, you’re still bound to find something here that catches your eye.

    Now, let’s jump in, and find the perfect accessory to fit your needs.



    Analog stick covers


    I like the DualShock 4 a lot, but my rubber grips are falling apart one measly year into the PS4’s lifespan. Instead of buying a new controller or tearing up my thumbs on plastic, I invested in some analog stick covers for less than $5. They work like a charm, and fit just about any controller on the market. I wholeheartedly recommend picking up a few of these for any well-loved controller in your life — especially the controllers for older consoles.



    Capture equipment


    Recently, I wrote up a post showing off how to capture and stream video from the PS4. Sony’s built-in solution is nothing to sneeze at, but you’re definitely going to want some dedicated gear if you want to output high-quality video. Personally, I use Elgato’s Game Capture HD60 for 1080p60 game capture, and it works seamlessly. It will work with any unencrypted HDMI signal, so Xbox One and Xbox 360 users can join in the fun as well. PS3 and legacy console users should stick with the older Game Capture HD for the analog ports, though.


    Remote Play


    Remote Play has been an incredible bullet point for the PS4, and it’s the number one reason why I continue to invest in Sony’s ecosystem. With an $80 PlayStation TV or a $200 Vita, you can play full-fledged console games wherever you are. Whether you’re in front of the big screen, curled up in bed, or playing some Peggle at the office, Remote Play is worth investing in for the convenience factor alone.



    Headset


    If you’re playing an online multiplayer game, you need a headset. Communication is key for strategy and tactics, and nobody wants to be the only one without the ability to chat. This HyperX headset from Kingston is a superb cross-platform solution, but there are countless high-quality headsets available on every major platform. Sony’s PlayStation Gold headset is excellent for the PS4 and PS3, and Microsoft’s Xbox One headset works perfectly for voice chat — no need for Kinect.


    More...

  6. #76 HiGame

    AMD’s R9 380X finalized, could include new memory interface and dramatic performance improvements


    Two new tidbits on AMD’s next-generation GPU architectures have surfaced courtesy of LinkedIn — and while that’s not our typical avenue for discovering new data, past information cribbed from profiles has indeed panned out. In this case, it’s a pair of profiles — one from Linglan Zhang, a system architect manager at AMD, and one from Ilana Shternshain, an ASIC physical design engineer.

    The two profiles collectively point to two things. First, the upcoming R9 380X GPU from AMD has already taped out and gone to manufacturing. This is welcome news, given that Nvidia’s Maxwell shipped months ago, but it doesn’t tell us much in and of itself. It was always a given that AMD would build a new GPU architecture, and we don’t have any information (yet) on whether the R9 380X jumped to 20nm technology or stayed on 28nm. Since new nodes aren’t a guaranteed advantage for large, complex chips the way they used to be, it’s harder to predict what AMD might do.

    Second, and arguably more interesting, is Zhang’s profile, which includes the following:


    So, what is “High Bandwidth Memory,” and how could it change the future of AMD’s graphics?

    Understanding HBM


    High-Bandwidth Memory is a specialized application of the Wide I/O memory standard we’ve previously discussed as a long-term replacement for DDR4. Hynix, AMD’s co-partner in developing the standard, describes it as “Wide I/O stacked DRAM with TSV” (through-silicon vias). In this configuration, the GPU RAM would be implemented directly around the GPU itself for optimal routing and minimal cost. One difference between 3D Wide I/O and HBM is that Wide I/O can be stacked directly on top of the SoC — you wouldn’t want to do that with HBM, due to the GPU’s phenomenal heat output.



    Approaches like Hybrid Memory Cube sometimes use 3D stacking. AMD is using interposer 2.5D stacking, as on the right.

    Everyone agrees that HBM is going to be the Next Big Thing, including AMD and Nvidia themselves, and there’s good reason for it. Even initial HBM implementations are going to offer at least equivalent bandwidth to today’s high-end cards, but with significantly improved access latencies and power consumption.


    One thing to keep in mind is that this chart is intrinsically designed to make HBM look good. It compares a DDR3 DIMM of just 8 bits (traditional implementations are 64 bits) and a GDDR5 bus of just 32 bits (most modern GPUs gang 8-12 memory controllers together). In other words, the 128-256GB/s of bandwidth HBM offers isn’t the only benefit — HBM also wins on cost, trace complexity, latency, and power.
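    Normalizing for those cherry-picked bus widths is straightforward: peak bandwidth is simply bus width times data rate, regardless of memory type. The widths and transfer rates below are illustrative assumptions (a 256-bit GDDR5 card at 7 GT/s against a single 1024-bit HBM stack at 1 GT/s), not figures from AMD:

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak bandwidth in GB/s = (bus width in bytes) x (billions of transfers/s)."""
    return bus_width_bits / 8 * data_rate_gtps

gddr5 = peak_bandwidth_gbs(256, 7.0)   # wide-ish GDDR5 bus at a very high clock
hbm = peak_bandwidth_gbs(1024, 1.0)    # one HBM stack: huge bus, modest clock
print(gddr5, hbm)  # 224.0 vs. 128.0
```

    The point the chart obscures is that HBM reaches comparable numbers at a fraction of the transfer rate, which is where the power and signal-routing savings come from — and ganging multiple stacks multiplies the total.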

    Is AMD ready to deploy HBM in next-gen Radeons?


    We know HBM can bring substantial performance improvements, and we know that AMD is working on the technology. The question is, is it ready to deploy on next-generation hardware yet? That’s where a macro-economic perspective is more helpful, and the information there, more tenuous. According to multiple sources, TSMC only began ramping up its TSV support quite recently. The next graph comes directly from TSMC’s own projections of wafer start times for various types of standards:



    The “HBM” block takes up most of 2015, but tilts towards the back half of the year. Since these are wafer starts as opposed to volume production, it implies that there’s going to be lag time between the launch of these parts and full availability. GlobalFoundries is also building HBM capabilities, but is rumored to be behind TSMC in deploying the technology.

    This suggests one of two things: Either the R9 380X will not use HBM and that technology will be reserved for a second, follow-up GPU (possibly an R9 390X coming later this year or early next year), or that the R9 380X will use HBM, but will not deploy until later in the year to give the technology time to mature.

    There is a third option, though it would be fairly out of character for AMD given its current financial situation. The company could have opted to aggressively adopt HBM, sacrificing cost for performance and early availability. If AMD managed to gain a year on Nvidia’s Pascal, it could tilt the competitive landscape sharply back towards Team Red and give the company a welcome shot of good news.


    More...

  7. #77 HiGame

    Nasty Steam for Linux bug can wipe all your user files


    Over the past few years, Valve has built grassroots support and interest for alternate operating systems, including OS X and Linux, as well as its own version of the Linux operating system. Unfortunately, there are clearly bugs still to be worked out of the Linux variant, and a new problem can lead to Steam wiping all of a user’s files.

    Here’s the problem: when Steam installs itself, it records its own install location in a variable called $STEAMROOT. The client’s launcher script later runs rm -rf "$STEAMROOT/"*, a command that tells the system to remove all files and subdirectories inside STEAMROOT.

    When $STEAMROOT resolves to an empty string — something that can happen when the end user has moved the install directory to a different drive — the command effectively becomes rm -rf "/"*. For those of you not familiar with Bash, that means “delete everything on the hard drive.” Since Linux doesn’t give user-space processes permission to touch core operating system files, the only thing that gets dumped is the data in user-land. The command shouldn’t touch files on alternate hard drives, but since most user data is stored on the OS disk, the damage is enough to care about.

    This bug was triggered when user Keyvin moved the Steam install directory and attempted to create symbolic links (symlinks) between the STEAMROOT location and the new storage point.
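    The core failure is that string expansion doesn’t care whether the variable is empty. Here’s a Python sketch of the expansion (the variable name mirrors the script’s; the deletion itself is only simulated, never executed):

```python
def rm_target(steamroot):
    """The glob that rm -rf "$STEAMROOT/"* expands to."""
    return steamroot + "/*"

print(rm_target("/home/user/.steam"))  # /home/user/.steam/* -- the intended case
print(rm_target(""))                   # /* -- everything the user can delete

def safe_to_delete(steamroot):
    """A minimal guard the script could have used before deleting anything."""
    return bool(steamroot) and steamroot != "/"

assert not safe_to_delete("")  # empty variable: refuse to run the rm at all
```

    The real fix is the same idea in shell: bail out unless the variable demonstrably points somewhere sane before passing it to rm.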

    The necessity of moving Steam folders


    Steam libraries are somewhat unusual when it comes to moving data, and if I sympathize with the penguinistas on this point, it’s because I’ve performed precisely the same kind of operation in Windows multiple times before. Today, you can simply create a new library location and add games to it, but in the Bad Old Days, moving a Steam install from one drive to another was something of a laborious process that involved a partial reinstallation of the application.



    Copying your Steam folder and symlinking it was a way to avoid the full reinstall process, and it saved a great deal of time. It also made it easier to swap games between a small SSD and a large hard drive back when SSD costs were high enough that one could only reasonably fit a few games at a time.

    Having performed this kind of operation without a second thought in Windows, it’s sobering to think that a scripting bug could destroy years of accumulated files in Linux. Even if you use an online backup service, erasing and then retrieving 50-70GB of data could take quite some time.

    This latest bug comes on the heels of some rough hits for Steam — the Steam Machines project was delayed last year, and the company was forced to institute region locking to prevent people from engaging in currency speculation via the collapsing Russian ruble. Valve has yet to respond to requests for comment on this issue or provide guidance on a fix.


    More...

  8. #78 HiGame

    The top laptops for photographers, engineers, students, and everyone else


    Shopping for a laptop can be extremely frustrating at times. Even when you already know the ins and outs of personal computers, there are so many different aspects to consider with laptops. Weight, battery life, and cost come into play right off the bat, and when you factor in specific work requirements, the number of decisions can be more than a little overwhelming. Thankfully, we’ve already done the legwork for you. If you’re on the hunt for a new laptop, look no further.

    In these round-ups, we break down exactly what’s important when you’re buying a new laptop, and offer a handful of recommendations. Whether you’re a college freshman looking for a deal, or an engineer looking for a portable super computer, we’ve got you covered here, so let’s jump in.

    Best laptops for engineers (best mobile workstations)

    Whether you’re designing complex structures in AutoCAD or crunching data in R, as an engineer you need as much horsepower as possible. If you want to work efficiently on the go, you definitely need a machine with a beefy CPU, a discrete GPU, and RAM to spare. These portable workstations are certainly on the pricey side, but being able to get real work done away from your desk is invaluable.


    Best laptops for video editing

    While video editing has traditionally been relegated to massive desktop machines, laptops have become increasingly viable alternatives in recent years. Quad-core CPUs and surprisingly powerful GPUs are easily found in modern high-end laptops, and with the introduction of Retina-caliber displays, you’ll be able to see every little detail even when you’re away from your 30-inch monitor at your desk. Add in loads of RAM and a super-fast SSD, and video editing laptops will do well by any editor.

    Best laptops for photo editing

    If you’re out shooting and editing photos in the field, you definitely want to have a powerful laptop by your side. Obviously, you’ll want fast storage and lots of RAM for managing high-resolution images, but there’s more to consider. You’ll need to see every nook and cranny in your photos, so a nice big display with high pixel density will make your life a whole lot easier. But if the added cost and heft don’t fit your lifestyle, there are some damn fine ultrabooks available as well. Whether you’re a budding amateur or a seasoned pro, there’s definitely a photography-compatible laptop here for you.

    Best laptops for business

    Too often, employees get stuck with underpowered laptops, but that doesn’t have to be the case. If you’re in charge of your company’s IT purchases, or you’re in a position to pick your own work laptop, you can snag some outstanding hardware for under $1000. These affordable business laptops certainly aren’t going to run Far Cry 4 on ultra settings, but these are rock-solid machines with shockingly reasonable price tags.


    Best laptops for college students


    Laptops are extraordinarily popular on campus, and it’s easy to see why. Taking notes, quickly accessing your body of academic work, and easily collaborating with your classmates is very enticing. But before you buy any ol’ machine, you need to consider the weight, battery life, and asking price. There are definitely trade-offs to be made between those three factors, but this selection of college-appropriate laptops should cover just about any student.

    A laptop for everyone


    Whether your budget is $300 or $3000, there’s a laptop here for everyone. From high-end production machines to low-cost workhorses, picking out the right laptop is all about finding the right balance for your needs. CPU, GPU, RAM, display, battery life, weight, bulk, and price are always going to pull at you in different directions. Even with an unlimited budget, there’s not a single laptop to fit every single lifestyle. So read through these round-ups, take your time weighing your specific needs, and you’ll be more than ready to buy when the time is right.


    More...

  9. #79 HiGame

    Microsoft’s Windows 10: Free upgrades, Xbox streaming, OS-level video capture


    After weeks of speculation, Microsoft took the lid off its Windows 10 stewpot this afternoon, giving both tech press and industry enthusiasts a concrete demonstration of the features and capabilities its next-generation operating system will offer. To say that the unveil was important would be something of an understatement; Microsoft’s Windows 8 is arguably the least-successful OS Microsoft has launched in decades, with an adoption rate that has generally lagged even Windows Vista. Based on what Microsoft showed today, the company has a shot at winning back some unhappy users.

    First up, the big news: Windows 10 will be a free upgrade to anyone currently running Windows 7 or Windows 8. Please note that caveats undoubtedly apply. It’s not clear yet if that upgrade guarantee applies to both Home and Pro users, or if Windows 8.1 users with the Media Center Pack will be properly taken care of this time (if you owned the MCP for Windows 8.0, the 8.1 upgrade was entertaining, to say the least). Presumably the upgrade offer will only allow you to download an upgrade ISO as opposed to a full reinstall or disc-based option, and Microsoft has said that the offer will only be good for a year. Whether the company intends to register all Windows 7 and 8 keys and then check them against a database of Windows 10 keys or not is unclear.

    Even knowing that there will be caveats attached, Microsoft’s decision to give Windows 10 away to a large chunk of its installed base is a significant development. Throw in the price cuts and free offers for OEMs that commit to shipping Windows devices in various form factors, and it’s clear that Microsoft is fundamentally rethinking its entire approach to OS monetization.

    DirectX 12, Xbox streaming, video sharing


    The biggest potential gaming feature Microsoft unveiled is one I’ve personally hoped the company would add since it first announced the Xbox One: the ability to stream games to a Windows 10 device. Details on precisely how streaming will work, or what the hardware requirements will be, are still sketchy, but the on-stage tech demo showed Forza 3 streaming to a Surface Pro 3 tablet via the Xbox application.

    This could be the killer feature that gives Microsoft a real leg up over Sony if it’s executed properly. While Sony and Nintendo both offer a form of game-shifting / streaming, both manufacturers lock the option to a specific piece of hardware — either the PlayStation Vita or the Nintendo Wii U. Microsoft, in contrast, has opened streaming options to millions of potential systems in the US alone. Even now, with tablet and smartphone adoption surging, the majority of US households own at least one PC, and the majority of those PCs are capable of running Windows 10. With new, universal apps simultaneously coming to the Xbox One, we could see intriguing cross-platform options. The Xbox One looks as though it’s going to gain some basic PC applications while PCs will soon be capable of streaming the Xbox One’s library of games. That’s a win-win.

    Moving past that announcement, this event was also supposed to be the big unveil for DirectX 12, but Microsoft only barely demonstrated the feature. We saw a single, canned demonstration from Futuremark that looks as though it was lifted from the same DX12 example code we saw running at GDC last year. Microsoft predicts performance gains of up to 50% in certain titles, which is in the range of what AMD also predicted (and sometimes delivered) with Mantle. Microsoft is also claiming huge power efficiency increases — up to 50% improvements in performance per watt in some cases.


    CPU performance in DirectX 12

    The company’s comments on Xbox were more interesting. The Xbox application will now ship standard on every single Windows 10 device and will function as a comprehensive front-end application for managing a games library across multiple devices. With the heavy push Microsoft is putting on unifying the kernel and development code across the various versions of Windows, it’s likely that the company will roll Windows Phone games over to the Windows desktop platform at some point.



    Microsoft indirectly confirmed that DirectX 12’s low-latency API and performance-boosting capabilities are coming to the Xbox One and demonstrated a build of the latest Fable title running on its console. Tellingly, it also showed an interface that let a PC player join a Fable multiplayer session started by an Xbox One player.



    Xbox One and PC, playing side-by-side

    The two players explored the area and killed multiple monsters without any lag spikes, visual degradation, or any other sorts of problems. The implication of this demo is that going forward, Xbox One and PC players will be able to compete against each other or play cooperatively in a wide array of titles.

    As cool as it was to see this demoed, we’re dubious about whether games will actually take advantage of the feature. It has been technically possible to put PC and console players (or Microsoft and Sony players) on the same networks for years, yet few games, if any, have ever taken advantage of the capability. Subtle differences in draw distance, texturing, and control schemes become far more important when players are competing on vastly different hardware. Handheld controllers aren’t nearly as accurate as a keyboard and mouse, but the aim assist built into many console FPS games could leave PC players crying foul.

    We’re not saying that no developers will ship games that are cross-compatible between Windows 10 and the Xbox One, but plenty of studios have promised this feature before, only to dump it before a game went gold.

    Video sharing has been a huge feature of both the Xbox One and PS4, and Windows 10 is going to capitalize on that with native video sharing and recording capabilities, again managed through the Xbox application. You’ll be able to record, annotate, and share video across a variety of services on both touchscreen devices and traditional laptops. This functionality can be used by other services — one of Microsoft’s demos showed video being recorded from a Steam title before being shared through Xbox Live.

    Whether this type of feature will provoke another temper tantrum and/or multi-year OS development effort from Gaben remains to be seen, but it wouldn’t surprise me if Valve were less than thrilled about Microsoft’s continued encroachment into what it undoubtedly sees as its own turf. The more basic functionality Microsoft bakes into its Xbox and other gaming applications, the greater the chance that it supplants Steam’s functionality in the long run.

    Conclusion: DX12 for everyone and the first steps towards a unified gaming experience


    The biggest difference between Windows 10’s release and all previous versions of Windows is that Microsoft is going to hand users a new version of DirectX that can deliver significant performance improvements without charging them for it. Whether you see those gains will depend on the game in question and the GPU you own, of course, but this is a marked departure from previous eras, when Microsoft charged a premium for features like DirectX 10, despite the fact that DX10 was much slower than DX9.

    The streaming ability, video recording, and library management are all welcome capabilities and a concrete sign of Microsoft’s willingness to add features that users actually want. All in all, Windows 10 is shaping up to be a killer release.


    More...

  10. #80

    Nvidia’s GTX 960 review: Maximum mileage from miniature Maxwell


    For the past year, Nvidia has engaged in an unusually extended launch schedule. It debuted its Maxwell architecture with a budget GPU in February 2014, followed up with the high-end GTX 980 and 970 late last year, and today launches its new midrange GPU, the GTX 960. While not as sexy as the high-end GTX 970 or 980, which retail for $330 and $550 respectively, it slips in at just $200, which makes it far more affordable and practical for a large number of PC gamers.

    It’s also debuting into the teeth of AMD’s strongest current price/performance ratio. AMD has held a strong position in the $100-$300 price brackets, thanks to aggressive price cuts and the incremental improvements offered by GPUs like the Radeon R9 285. Can Nvidia break that lock? Let’s take a look.



    The GTX 960 is based on Nvidia’s new GM206 GPU, a midrange part that’s essentially half a GM204. It retains the same architectural layout and resource allocation as that chip, with 128 CUDA cores per Streaming Multiprocessor (SM). If you’ve read our previous Maxwell coverage, then GM206 has few surprises lurking inside it. The chip may be smaller, with just 1024 cores to the GTX 980’s 2048, but its internal caches and resources are distributed in the same ratios.
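    A quick arithmetic check on those core counts, as a sketch using only the figures quoted above:

```python
# Maxwell allocates 128 CUDA cores to each Streaming Multiprocessor (SM).
CORES_PER_SM = 128

gm204_cores = 2048  # full GM204, as used in the GTX 980
gm206_cores = 1024  # full GM206, as used in the GTX 960

gm204_sms = gm204_cores // CORES_PER_SM
gm206_sms = gm206_cores // CORES_PER_SM

# GM206 carries exactly half the SMs of GM204, so per-SM caches and
# other resources scale down in the same 2:1 ratio.
print(gm204_sms, gm206_sms)  # → 16 8
```

    That 16-versus-8 SM split is why "half a GM204" is a fair description of the chip rather than marketing shorthand.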

    The GTX 960: Mini Maxwell


    One aspect of the card that’s likely to raise eyebrows is its reliance on a 128-bit main memory bus. While the GTX 960 uses fairly fast GDDR5, clocked at 1750MHz (a 7Gbps effective data rate), the main memory bus is tiny by modern standards.



    A quick walk down memory lane shows just how odd it is for a GPU launching at the $199 price point to have a memory bus this narrow. The GeForce 8800 GT, which launched in 2007 at $249, had a 256-bit memory bus. All of Nvidia’s cards in this segment have historically used at least a 192-bit bus, with most opting for 256 bits and up.

    This smaller bus fits with Nvidia’s general philosophy with Maxwell, which was to emphasize performance per watt and high overall efficiency over simply building a larger, more complicated chip. In fact, going solely on the numbers, the only thing extraordinary about the GTX 960 is that Nvidia is claiming it can hang with other $200 cards despite a limited memory bus and relatively few cores.

    Nvidia claims the card’s memory subsystem is so efficient that it has published “effective” memory bandwidth figures, inflated numbers meant to communicate how much work the GTX 960’s bus can actually do in an apples-to-apples comparison. We’ll examine this in our review, but the disparity between the GTX 960 and its competition, at least on paper, is fairly significant: AMD’s R9 285, the most relevant comparison card as far as price is concerned, has 176GB/s of memory bandwidth, compared to 112GB/s for the Maxwell-based GTX 960.
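    Those raw bandwidth numbers fall straight out of per-pin data rate times bus width; a minimal sketch (the R9 285’s 5.5Gbps per-pin rate is my assumption, inferred from the 176GB/s figure and its 256-bit bus):

```python
def peak_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, in bytes."""
    return gbps_per_pin * bus_width_bits / 8

# GTX 960: 1750MHz GDDR5 = 7Gbps effective, on a narrow 128-bit bus
gtx_960 = peak_bandwidth_gbs(7.0, 128)  # → 112.0 GB/s
# R9 285: assumed 5.5Gbps GDDR5 on a 256-bit bus
r9_285 = peak_bandwidth_gbs(5.5, 256)   # → 176.0 GB/s
```

    The comparison makes Nvidia’s problem plain: the GTX 960 starts with roughly 64% of its rival’s raw bandwidth and has to make up the difference with compression and cache efficiency.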

    That disparity shows up when we consider the GTX 960’s primary competition, at least on paper. AMD has multiple graphics cards clustered around the $200 price point, but the obvious card to compare against is the R9 285. That GPU is based on AMD’s “Tonga”-class hardware, the latest refresh of the GCN architecture available on the market.

    More...

