Battlefield 3 MP Beta Date and System Reqs

Above title says it all. The beta will take place on Sept. 29th (next Thursday) and will be available for all three platforms (360, PS3, and of course the PC). Players will fight on the Operation Metro map, set in Paris, with Rush as the game mode. For those of you who haven’t played Bad Company 2, Rush is essentially about attacking and defending M-COM stations. Here’s what EA had to say:

“Gamers who pre-order the digital PC version of the game at Origin (powered by EA) will be granted early access to the beta starting on September 27, 2011,” EA said Tuesday. “In addition, all customers that pre-ordered a Limited Edition of Medal of Honor will also receive early access to the beta starting on September 27, 2011.”

We snapped a screenshot of the required specs the moment we saw them, and they should be nearly identical for the final release. Check out the beta landing page here for more details.


Upgrading Your Rig for Battlefield 3 (Feature)

We’ve been getting quite a few questions lately about whether your PCs are “Battlefield 3 ready”, so we decided to write up a feature article to address most of your concerns. First off, the new Frostbite 2 engine looks stunning in the gameplay footage EA/DICE has been releasing, and it’ll surely keep everyone guessing whether they can run it without a hitch, especially if you want the full experience (DirectX 11, tessellation enabled, etc.). Check out the latest trailer below if you haven’t already, BUT keep in mind that they always note the gameplay footage is based on ALPHA code. This means it has yet to be optimized for performance and cleaned up for the final retail release.

We’re now well under two months out from BF3’s release, and any official system requirements for the game? No, not really. Back in June, Atomic PC Magazine interviewed Patrick Bach, Executive Producer at DICE. He obviously wasn’t keen on going into specifics at the time, but he did mention that the demo system used to run everything had “standard high-end components” and a single GeForce GTX 580 graphics card. Again, this was running on early code as mentioned before, so it makes sense that beefier hardware was used to showcase the demo. He went on to say that if your PC can match the same “output” as current game consoles (and we believe many modern systems can), then you should meet the minimum requirements.

GameStop almost had us when they released a set of minimum and recommended PC specs about two months back, but unfortunately those turned out to be fake. It started with DICE’s Senior Gameplay Designer, Alan Kertz, saying on Twitter when asked about it, “We have not announced any specs.” Meanwhile, Johan Andersson at DICE declared on the Beyond3D forum, “FAKE. We haven’t announced any system requirements yet.” He did later add, “But highly recommend a quad core, just as with Bad Company 2.”

Now for the question that we all wind up asking at the end of the day: “So, should I upgrade?”

What we know for sure is that the Frostbite 2 engine has no support for DirectX 9, and therefore none for Windows XP. So if you’re still kicking around that hardware/software combo, it looks like you’re due for quite an upgrade if you want in on the latest Battlefield action.

If you have an ATI Radeon HD 4800 series or NVIDIA GeForce 9800 family card (or later), you should be able to run the game at DirectX 10 settings. The previous generation from both companies also supports the tech – those were the first parts to introduce DirectX 10 – but whether they can handle BF3 well is questionable and somewhat out of the picture.

DirectX 11 performance, on the other hand, gets a little tricky with more variables and settings in play. A mid to high-end graphics card – something from at least the ATI Radeon HD 5800 family or NVIDIA’s GeForce GTX 460 series – should get you very reasonable frame rates, depending on the in-game video settings. Note that we can’t make any guarantees at this stage; this is just our best-analyzed forecast of the game’s requirements.

Bottom line is that you should sit tight if you are already running any of the above mentioned (or newer) video cards, and see how everything plays out as we get more info. As always, we’ll be striving to keep you all posted as this matter develops.

UPDATE: Some of you asked whether memory (RAM) is something to consider for an upgrade. The minimum 2GB and recommended 4GB figures posted by GameStop are simply inaccurate, so we can’t base our feedback on those. That said, the amounts do roughly match the ideal memory capacities a typical system should have today: 2GB as the bare minimum and 4GB as the “de facto” or more common amount. With RAM prices at an all-time low and continuing to drop throughout this year, upgrading shouldn’t be that big of a deal.

NVIDIA Says Kepler GPUs Delayed Till 2012

Looks like Team Green fans won’t be able to celebrate an early Christmas this year. It’s been about a year since NVIDIA said they would launch the new Kepler GPUs, based on a 28nm process, starting in Q3 of this year. We’re already in the second half of 2011, and the company has broken its silence to state that Kepler won’t hit retail shelves until sometime in 2012. Here’s what NVIDIA spokesperson Ken Brown had to say:

Although we will have early silicon this year, Kepler-based products are actually scheduled to go into production in 2012. We wanted to clarify this so people wouldn’t expect product to be available this year.

There’s absolutely no mention of poor yields, but it looks like this new line of GPUs is suffering the same fate as its predecessor, Fermi. Could this delay push back the 2013 date slated for the Maxwell GPUs next on their roadmap? Only time will tell.

Roundup: NVIDIA GeForce GTX 560

It feels like ages since our last roundup, but on another note you might be aware that NVIDIA introduced a new GTX 560 (non-Ti) to their lineup earlier this week. Not surprisingly, this new part performs a bit better than the former GTX 460 and falls shy of the GTX 560 Ti – it also fills a small price gap (around the $199 mark) in their offerings. The new card also takes the performance crown against similarly priced AMD counterparts.

All the Answers to Intel’s Brand New Z68 Chipset – Part 2 (Feature)

Picking up from our last article, we’ll continue with Intel’s SRT (Smart Response Technology). This cool Z68 feature uses an SSD to cache frequently used data from a hard drive, and one can expect up to a 4x increase in performance over the traditional HDD alone.
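As a rough back-of-the-envelope illustration of where a “4x” figure can come from (our own hypothetical latency numbers, not Intel’s), the effective access time of a cached setup depends on how often reads hit the SSD:

```python
# Hypothetical latency figures for illustration only -- not official Intel data.
HDD_MS = 12.0  # assumed average access time of a mechanical drive (ms)
SSD_MS = 0.1   # assumed average access time of an SSD cache (ms)

def effective_access_ms(hit_ratio):
    """Average access time when a fraction of reads hit the SSD cache."""
    return hit_ratio * SSD_MS + (1.0 - hit_ratio) * HDD_MS

for hit in (0.5, 0.75, 0.9):
    speedup = HDD_MS / effective_access_ms(hit)
    print(f"hit ratio {hit:.0%}: {speedup:.1f}x faster than the HDD alone")
```

With these assumed numbers, a cache that catches about three quarters of your reads already lands near the 4x mark, which is why a small SSD holding your most-used files goes such a long way.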

Setting up this new tech takes just a few simple steps. First, make sure both the SSD and hard drive are hooked up, then switch the Intel controller from its default of AHCI to RAID in the UEFI. After that, install Windows 7 to your hard drive as normal – you can’t enable SRT until all the drivers are installed. Finally, it’s just a matter of hitting the Accelerate button in the RST driver (see below).

From there, simply select the drive you’d like to use as a cache, select the disk you want “accelerated” (usually C:\), then choose between the Enhanced and Maximized modes (see following image). Enhanced is probably your safest bet: reads are accelerated, but writes go straight to the hard drive, so write speeds remain the same as the HDD’s. Maximized is a write-back form of caching – data is written to the SSD first and synced to the hard drive later – so it should yield write speeds close to what the SSD itself is capable of.
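To make the distinction concrete, here’s a minimal, purely illustrative sketch of the two caching strategies (our own simplification – Intel’s actual RST implementation is far more involved): Enhanced behaves like a write-through cache, Maximized like a write-back cache.

```python
class WriteThroughCache:
    """Enhanced mode, simplified: every write goes to the HDD immediately,
    so writes are only as fast as the hard drive, but reads can hit the SSD."""
    def __init__(self):
        self.ssd, self.hdd = {}, {}

    def write(self, key, value):
        self.ssd[key] = value
        self.hdd[key] = value  # write only completes at HDD speed

    def read(self, key):
        return self.ssd.get(key, self.hdd.get(key))


class WriteBackCache:
    """Maximized mode, simplified: writes land on the SSD first and are
    synced to the HDD later, so writes complete at SSD speed."""
    def __init__(self):
        self.ssd, self.hdd = {}, {}

    def write(self, key, value):
        self.ssd[key] = value  # write completes at SSD speed

    def flush(self):
        self.hdd.update(self.ssd)  # background sync to the hard drive

    def read(self, key):
        return self.ssd.get(key, self.hdd.get(key))
```

The trade-off falls straight out of the sketch: in write-back mode the hard drive briefly lags behind the SSD until the sync happens, which is why Enhanced is the safer pick if the SSD dies or power is lost mid-write.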

At the end of the day, SRT is ideal for those who can’t afford large-capacity SSDs but still want that solid-state performance. By leveraging this tech, you can buy a smaller SSD and achieve similar speeds when pairing it with a traditional hard drive.

NVIDIA GeForce GTX 560 (non-Ti) to Hit Shelves Tomorrow

We yawn – but jesting aside, NVIDIA’s new GeForce GTX 560 may be the modern counterpart to the 9800 GT of 2008. This new card is probably going to perform a lot better though, right? Based on the details that have surfaced, we’re looking at a $199 MSRP, and it’ll feature 336 CUDA cores (same as the GTX 460), 56 texture units, a 256-bit memory interface, and 1GB of GDDR5 memory to top things off.

Take a look below if you haven’t seen this sneak peek clip from NVIDIA already.

UPDATE: Looks like Newegg was already listing GTX 560 cards from Palit and MSI, both with custom cooling solutions. By the time we rushed over there, though, the items seemed to have been pulled from the site. Perhaps we’ll see them stocked later today or tomorrow (the actual official release date). As for pricing, realistically expect to pay $200-225, since that’s the range the two mentioned manufacturers listed their cards at.

All the Answers to Intel’s Brand New Z68 Chipset – Part 1 (Feature)

At first glance there aren’t a whole lot of changes compared to its predecessor, the P67, but there are some significant improvements that make the Z68 platform worth considering. Let’s jump straight in – they include:

  • Being able to overclock the graphics core in the CPU (Sandy Bridge)
  • A new IPT (Identity Protection Technology) feature that integrates a hardware token into the PC
  • Support for switchable graphics between external graphics and the integrated solution in Sandy Bridge
  • SSD caching that uses a smaller capacity SSD and traditional hard drive to increase responsiveness

We have to agree with many critics out there that the first two won’t be the main selling points for a typical consumer, but being able to switch between graphics solutions might be of more interest. Early adopters of the P67 chipset couldn’t utilize the on-die graphics simply because their boards didn’t have any display outputs – those were exclusive to the H- and Q-series chipsets. With Z68, Intel includes the Flexible Display Interface, which routes processor graphics output through the chipset; a good example with both DVI and HDMI ports would be ASUS’s P8Z68-V Pro.

Here’s the part that will intrigue most people: motherboard makers will be bundling LucidLogix’s Virtu tech, as we previously mentioned. Two modes are available, i-mode and d-mode, and we’ll elaborate on each.

In i-mode, you mainly run off the integrated graphics and only tap into your discrete card for gaming, and there is some prep involved: initializing the integrated graphics port in the UEFI, connecting your monitor to the motherboard’s graphics output, and installing drivers for both the on-board Intel graphics and your discrete card of choice. It’s not all that labor-intensive – just a few straightforward steps – but a problem arises in that Lucid needs to create a profile for any game you want to run in Virtu mode, and we know plenty of gamers are eager to dive into a newly released title right away. Aside from that, the power savings are insignificant; unlike mobile switchable graphics, the external video card doesn’t shut off completely. Despite all the power management features in today’s high-end graphics solutions, they still consume quite a bit of power even when idling. The i-mode is also, for the moment, incompatible with dual-GPU cards and SLI.

As for d-mode? We think this will be the more useful of the two. You start by setting the UEFI to initialize the PCI-E graphics adapter first, then hooking your monitor up to the graphics card. In this mode the external graphics solution is the primary one, and games run without needing any profiles from Lucid. So what’s the point of this mode, you may wonder? It’s being able to take advantage of the Quick Sync technology built right into Sandy Bridge. It may not shine when it comes to gaming, but it means serious business for encoding and transcoding, since Intel dedicated transistors just for those jobs. How good is this implementation, you ask?

In d-mode, CyberLink MediaEspresso 6.5 took 142 seconds on a GeForce GTX 580 to transcode a single VOB file to a generic WMV file, while Quick Sync on the Core i7-2600K took only 109 seconds – about a 30% difference. If something were to take hours, which option would you rather have?
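For the curious, that roughly 30% figure is just the GTX 580’s time measured relative to Quick Sync’s:

```python
gtx580_s = 142     # GeForce GTX 580 transcode time (seconds)
quicksync_s = 109  # Core i7-2600K Quick Sync transcode time (seconds)

# The discrete GPU took about 30% longer than Quick Sync on the same job.
pct_longer = (gtx580_s - quicksync_s) / quicksync_s * 100
print(f"GTX 580 was {pct_longer:.0f}% slower")  # roughly 30%
```

Scale that gap up to an hours-long batch of videos and the choice makes itself.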

Another rad feature is Intel’s SRT (Smart Response Technology), which lets the Z68 chipset use an SSD to cache commonly used data from a hard drive. This supposedly yields up to a 4x improvement in performance over a traditional drive alone. Unfortunately our coverage ran longer than expected, so we’ll discuss SRT in the next segment of our two-part series. Stay tuned.

Roundup: AMD Radeon HD 6790

With their top-of-the-line offering released and out of the way, AMD has expectedly put together an answer to NVIDIA’s mid-range GTX 550 Ti. The new graphics card retails for the same $149 and comes clocked at 840MHz for the core and shaders, with 1GB of GDDR5 at 4.2GHz. We’ve seen reviewers acknowledge that this may be a late response to the competitor’s GTX 460, but that’s a card slowly leaving the scene, (awkwardly) to be replaced by the less powerful GTX 550 Ti. If you want the latest video card at this particular price point, AMD’s solution does seem like the better option, but the new HD 6790 suffers the same fate as its counterpart: an HD 6850 can be had for close to the same price with rebates (if you don’t mind them, that is). All in all it’s a good card, and a tad lower price would have made it superb. All the detailed info and benchmarks below.

NVIDIA’s GTX 590 (Dual-GPU) Flagship Currently Sold Out

Well, down in the States at least. With the high price point and limited supply/yields, it’s no real surprise, especially when there are only a couple thousand cards available in Europe alone. Popular US retailers like Best Buy, Newegg and TigerDirect are all out of stock on this new part, but it can still be purchased overseas.

There are rumors that NVIDIA may slash prices in the European market, given the initial MSRP of about €603-610, and they may offer rebates for customers in that region to get these cards moving off the shelves.

“The GTX 590 is the best dual GPU product ever built,” said Drew Henry, general manager of GeForce GPU business at NVIDIA. “With leading performance, support for multi-monitor 3D gaming, Quad SLI, and an acoustic envelope that begs to be heard for how quiet it is, the GTX 590 epitomizes what a perfect dual graphics card looks, performs, and sounds like.”

If you happen to be a proud new owner of a GTX 590, NVIDIA released new beta drivers earlier today that add support for their flagship card as well as the GTX 560 and 550 Ti. They claim performance gains over the 266.58 drivers on any GeForce 400/500 series GPU, and up to a 516% boost in Dragon Age 2 (SLI only, 2560×1600, 8xAA/16xAF, Very High, SSAO on).

The 270 series (and later) drivers also introduce NVIDIA Update, which keeps your PC up to date by notifying you when new graphics drivers are available. We’ve got the links below if anyone’s interested.

Roundup: NVIDIA GeForce GTX 590

You’d probably want the specs right off the bat: as speculated, we’ve got 1,024 CUDA cores, 96 ROPs, and 3GB of GDDR5 memory on this card. If you’re wondering, the GTX 590 is essentially two GTX 580 chips on one board, but with power constraints in mind, clock speeds were dialed back to compensate: the core runs at a lower 607MHz, the shaders at 1.2GHz, and the memory at 3.4GHz.

Not surprisingly, with all the performance jam-packed into this card, it costs the same as AMD’s single-card flagship at $699. Hate to spoil it for the NVIDIA fans out there, but the arrival of this much-anticipated card doesn’t exactly blow the HD 6990 out of the water – it actually falls slightly behind the current single-card performance leader in some of the benchmarks. Props, however, to the experts behind an astonishingly quiet cooler, especially for a card performing at this level. We’ve got the usual links to go through and bonus videos below for the weekend!