Battlefield 3 MP Beta Date and System Reqs

Above title says it all. The beta will take place on Sept. 29th (next Thursday) and will be available on all three platforms (360, PS3, and of course the PC). Players will play the Operation Metro map set in Paris, with the mode being Rush. For those of you who haven’t played Bad Company 2, it’s essentially attacking/defending respective M-COM stations. Here’s what EA had to say:

“Gamers who pre-order the digital PC version of the game at Origin (powered by EA) will be granted early access to the beta starting on September 27, 2011,” EA said Tuesday. “In addition, all customers that pre-ordered a Limited Edition of Medal of Honor will also receive early access to the beta starting on September 27, 2011.”

We snapped a screenshot of the specs needed for the game as soon as we saw them, and they should be nearly identical for the final release. Check out the beta landing page here for more details.

Upgrading Your Rig for Battlefield 3 (Feature)

We’ve been getting quite a few questions lately about whether your (VS)PCs are “Battlefield 3 ready”, so we decided to write up a feature article to address most of your concerns. First off is how the new Frostbite 2 engine looks in the gameplay footage EA/DICE has been releasing; it’ll surely keep everyone guessing whether they can run it without a hitch, especially if you want the full experience (DirectX 11, tessellation enabled, etc.). Check out the latest trailer below if you haven’t already, BUT keep in mind that they always mention the gameplay footage is based on ALPHA code. This means it has yet to be optimized for performance and cleaned up for the final retail release.

We’re now well under two months from BF3’s release, and any system requirements for the game? No, not really. Back in June, Atomic PC Magazine interviewed Patrick Bach, Executive Producer at DICE. He obviously wasn’t keen on going into specifics at the time, but he did mention that the demo system used to run everything had “standard high-end components” and a single GeForce GTX 580 graphics card. Again, this was running on early code as mentioned before, so it makes sense that beefier hardware was used to showcase the demo. He went on to say that if your PC can match the “output” of current game consoles (we believe many modern systems would), then you should meet the minimum requirements.

GameStop almost had us when they released a set of minimum and recommended PC specs about two months back, but unfortunately they turned out to be fake. It started with DICE’s Senior Gameplay Designer, Alan Kertz, saying on Twitter when asked about it, “We have not announced any specs.” Meanwhile, Johan Andersson at DICE declared on the Beyond3D forum, “FAKE. We haven’t announced any system requirements yet.” He did later add, “But highly recommend a quad core, just as with Bad Company 2.”

Now for the question that we all wind up asking at the end of the day: “So, should I upgrade?”

What we know for sure is that the Frostbite 2 engine has no support for DirectX 9, and thus no support for Windows XP. So if you’re still kicking it with that hardware/software combo, it looks like you’re due for quite an upgrade if you want in on the latest Battlefield action.

If you have an ATI Radeon HD 4800 series or NVIDIA GeForce 9800 family card (or newer), you should be able to run the game at DirectX 10 settings. The previous generation from both companies also supports the tech (those were the first parts to introduce DirectX 10 support), but whether they can handle BF3 well is questionable and somewhat out of the picture.

DirectX 11 performance, on the other hand, gets a little tricky with more variables and settings. A mid- to high-end graphics card, something from at least the ATI Radeon HD 5800 family or NVIDIA’s GeForce GTX 460 series, might get you very reasonable frame rates, depending on the in-game video settings. Note that we can’t make any guarantees at this stage; it’s just our carefully considered forecast of the game’s requirements.

Bottom line is that you should sit tight if you are already running any of the above mentioned (or newer) video cards, and see how everything plays out as we get more info. As always, we’ll be striving to keep you all posted as this matter develops.

UPDATE: Some of you asked whether memory (RAM) was something to consider for an upgrade. The minimum of 2GB and recommended 4GB posted by GameStop are simply inaccurate, so we can’t base our feedback on those numbers. That said, those amounts do roughly match the ideal memory capacities for a typical system today: 2GB as the bare minimum and 4GB as the “de facto” or more common amount. With RAM prices at an all-time low and continuing to drop throughout this year, upgrading shouldn’t be that big of a deal.

Specs for AMD’s Bulldozer CPUs Revealed

It seems AMD is having some difficulty with yields on the B0 and B1 revisions of these chips hitting their intended clockspeeds, though based on our sources we’re not certain whether that refers to the base clockspeed or Turbo Core. Here’s the list of processors the company is launching:

  • FX-8150: 3.6GHz (4.2GHz Turbo Core), 8-core, 8MB L2 cache, 125W
  • FX-8120: 3.1GHz (4GHz Turbo Core), 8-core, 8MB L2 cache, 125W/95W
  • FX-8100: 2.8GHz (3.7GHz Turbo Core), 8-core, 8MB L2 cache, 95W
  • FX-6120: Clockspeeds TBD, 6-core, 6MB L2 cache, 95W
  • FX-6100: 3.3GHz (3.9GHz Turbo Core), 6-core, 6MB L2 cache, 95W
  • FX-4120: Clockspeeds TBD, 4-core, 6MB L2 cache, 95W
  • FX-4100: 3.6GHz (3.8GHz Turbo Core), 4-core, 4MB L2 cache, 95W

All of these CPUs will have 8MB of L3 cache and support DDR3-1866 memory, and something to note is that they’re built on a 32nm process. We should see them surface around August or September, with varying time frames for the individual parts.

AMD’s Dual-Core Llano Desktop APU Spotted

AMD recently launched two desktop (Llano) APUs, which are available for purchase if you so desire, and four more processors are slated for the remainder of this year. Some MSI marketing material gave us an idea of what kind of parts they’re brewing up. Our attention is on the E2-3200, supposedly a dual-core clocked at 2.4GHz with 1MB of L2 cache and a 65W TDP. It also sports an integrated Radeon HD 6370D graphics solution clocked at 443MHz with 160 stream processors, all on the same die. There is no Turbo Core, and it will use the FM1 package.

Sample AMD Bulldozer Chip OC’d to 4.63GHz

That’s quite a bit of “jigahertz”. Under the spotlight is an engineering sample of Advanced Micro Devices’ very own FX-8130P (aka Zambezi) CPU, which is based on the Bulldozer architecture. An enthusiast from the Czech Republic managed to push the 8-core chip to 4635.6MHz on air. The screenshots were partially blacked out, possibly because the overclocker is an employee of an affiliated company or even AMD itself. We only know that the sample processor has 8 cores, 8MB of L3 cache, supports DDR3-1866 memory, and is fabricated on a 32nm process.

Sadly there are no details on stock speeds, stepping, or revision. What’s astonishing is that the overclocked chip managed to pull off Super Pi 1M in just 1.26s, while an Intel Core i7-2600K clocked at 6.3GHz still took over 6 seconds.

AMD Phasing Out Phenom II X6 Series

According to news sources, the chip maker plans to mark its entire Phenom II X6 processor line as EOL (end of life) by the fourth quarter of this year. This is a move to put the spotlight on the new FX series of CPUs based on Bulldozer. The Phenom II X6 1045T, 1055T and 1065T are the first chips marked for retirement, but they can still be ordered through the third quarter of 2011.

Even though AMD will no longer manufacture these chips, the company will still honor any warranty claims that are still valid.

Early Benchmarks for AMD’s “Llano” Platform Surface

Apparently somebody got their hands on the AMD A8-3800 Quad-Core APU along with Gigabyte’s GA-A75-UD4H motherboard, and they did exactly the right thing: ran several benchmarks. Aside from those two featured parts, the test system used 4GB (2x2GB) of G.SKILL DDR3-1600 RAM and a standard 1TB Seagate Barracuda 7200.12 HDD; note that graphics were handled by the on-board Radeon HD 6550D that sits on the same die.

What we found fairly darn impressive (for an integrated solution) were the benchmarks in the gaming department with these scores (at 1080p):

  • Street Fighter 4: 50.32 FPS
  • Hawx: 54 FPS (DX9) / 22 FPS (DX10)
  • Hawx 2: 46 FPS (DX9) / 34 FPS (DX11)
  • Resident Evil 5: 29.0 FPS (DX9) / 27.4 FPS (DX10)

Roundup: NVIDIA GeForce GTX 560

It feels like ages since our last roundup, but on another note you might be aware that NVIDIA introduced a new GTX 560 (non-Ti) to its lineup earlier this week. Not surprisingly, this new part performs a bit better than the former GTX 460 and falls shy of the GTX 560 Ti; it also fills a small price gap (around the $199 mark) in NVIDIA’s offerings. The new card also takes the performance crown over similarly priced AMD counterparts.

All the Answers to Intel’s Brand New Z68 Chipset – Part 2 (Feature)

Looking back at our last article, we’ll continue with Intel’s SRT (Smart Response Technology). This cool feature of the Z68 chipset uses an SSD to cache frequently used data from a hard drive, and one can expect up to a 4x increase in performance over a traditional HDD alone.

Setting up this new tech takes a few simple steps. First, make sure both the SSD and hard drive are hooked up, then switch the Intel controller from its default of AHCI to RAID in the UEFI. After that you can go about installing Windows 7 to your hard drive as normal, but you can’t enable SRT until all the drivers are installed. Finally, it’s just a matter of hitting the Accelerate button in the RST driver (see below).

Onwards: simply select the drive you’d like to use as a cache, select the disk you want “accelerated” (usually C:\), then choose between the Enhanced and Maximized modes (see following image). The first option, Enhanced, is probably your best bet in terms of safety; it works like a write-through cache, offering better read speeds while write speeds remain the same as the HDD’s. The second, Maximized, is a write-back scheme: data is written to the SSD first and synced to the hard drive later, so this mode should yield write speeds similar to what the SSD is capable of.
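For the curious, here’s a minimal sketch (our own illustration, not Intel’s actual implementation) of the difference between the two caching policies: Enhanced behaves like a write-through cache, Maximized like a write-back cache.

```python
class CachedDisk:
    """Toy model of an SSD-cached hard drive with two caching policies."""

    def __init__(self, mode):
        assert mode in ("enhanced", "maximized")
        self.mode = mode
        self.ssd = {}       # fast cache (the small SSD)
        self.hdd = {}       # slow backing store (the big hard drive)
        self.dirty = set()  # blocks on the SSD not yet synced to the HDD

    def write(self, block, data):
        if self.mode == "enhanced":
            # Write-through: data goes to both drives immediately, so
            # writes are only as fast as the HDD, but nothing is lost
            # if the SSD fails.
            self.hdd[block] = data
            self.ssd[block] = data
        else:
            # Write-back ("Maximized"): data lands on the SSD first and
            # is synced to the HDD later, so writes feel SSD-fast.
            self.ssd[block] = data
            self.dirty.add(block)

    def read(self, block):
        # In both modes, reads are served from the SSD cache when possible.
        if block in self.ssd:
            return self.ssd[block]
        data = self.hdd[block]
        self.ssd[block] = data  # populate the cache on a miss
        return data

    def sync(self):
        # Background flush of dirty blocks (only relevant in Maximized mode).
        for block in self.dirty:
            self.hdd[block] = self.ssd[block]
        self.dirty.clear()
```

This also shows why Enhanced is the safer choice: in Maximized mode there is a window where the newest data exists only on the SSD, so losing the SSD before a sync means losing those writes.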

At the end of the day, SRT is ideal for those who can’t afford large capacity SSDs but want that solid state performance. By leveraging this tech, individuals can get smaller SSDs and be able to achieve similar speeds when used in conjunction with a traditional hard drive.

All the Answers to Intel’s Brand New Z68 Chipset – Part 1 (Feature)

At first glance there aren’t a whole lot of changes compared to its predecessor, the P67, but there are some significant improvements that make the Z68 platform worth considering. Let’s jump straight in; they include:

  • Being able to overclock the graphics core in the CPU (Sandy Bridge)
  • A new IPT (Identity Protection Technology) feature that integrates a hardware token into the PC
  • Support for switchable graphics between external graphics and the integrated solution in Sandy Bridge
  • SSD caching that uses a smaller capacity SSD and traditional hard drive to increase responsiveness

We have to agree with many critics out there that the first two won’t be the main selling points for a typical consumer, but switchable graphics might be of more interest. Early adopters of the P67 chipset couldn’t utilize the on-die graphics simply because their boards didn’t have any outputs; those were only on the H- and Q-series chipsets. With the Z68, Intel bundles the Flexible Display Interface, which routes the processor’s graphics output through the chipset to the board’s display outputs. A good example, which includes both DVI and HDMI ports, is ASUS’s P8Z68-V Pro.

Here’s the part that will intrigue most people: motherboard makers will be bundling LucidLogix’s Virtu tech, as we previously mentioned. Two modes are available, i-mode and d-mode, and we’ll elaborate on each.

In i-mode you mainly use the integrated graphics, tapping into your discrete card only for gaming, and there is some prep to be done first. This includes setting the integrated graphics port to initialize first in UEFI, connecting your monitor to the motherboard’s graphics output, and installing drivers for both the on-board Intel graphics and your discrete card of choice. It’s not all that labor intensive, just a couple of straightforward steps, but a problem arises because Lucid needs to make a profile for any game you want to run in Virtu mode, and we know several gamers are eager to dive into newly released titles. Aside from that, the power savings are insignificant: in contrast to mobile switchable graphics, the external video card never shuts off completely, and despite all the power management features in today’s high-end graphics solutions, they still consume quite a bit of power even when idling. The i-mode is also not compatible with dual-GPU cards or SLI at the moment.

As for d-mode? We think this will be the more useful of the two. You start by setting UEFI to initialize the PCI-E graphics adapter first, then hooking your monitor up to the graphics card. In this mode the external graphics solution is the main one, and games run without the need for any profiles from Lucid. So what’s the point of this mode? Being able to take advantage of the Quick Sync technology built right into Sandy Bridge. It may not shine when it comes to gaming, but it means serious business for encoding and transcoding, since Intel dedicated transistors just for those jobs. How good is this implementation, you ask?

In d-mode, CyberLink MediaEspresso 6.5 took 142 seconds on a GeForce GTX 580 to transcode a single VOB file to a generic WMV file, while the Quick Sync path on the Core i7-2600K took only 109 seconds; that’s about a 30% difference. If the job were to take hours, which option would you rather have?
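To put those numbers in perspective, here’s a quick back-of-the-envelope calculation (using only the times quoted above) showing where the roughly 30% figure comes from and what it would mean for a longer job:

```python
# Transcode times from the MediaEspresso 6.5 test above.
gtx580_seconds = 142     # GeForce GTX 580 in d-mode
quicksync_seconds = 109  # Core i7-2600K via Quick Sync

# How much longer the GTX 580 run took relative to Quick Sync.
difference = (gtx580_seconds - quicksync_seconds) / quicksync_seconds
print(f"{difference:.0%}")  # prints "30%"

# Scaled up to a 2-hour job, the same ratio saves about 28 minutes.
two_hours = 120 * 60
saved_seconds = two_hours - two_hours * quicksync_seconds / gtx580_seconds
print(f"{saved_seconds / 60:.0f} minutes saved")  # prints "28 minutes saved"
```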

Another rad feature is Intel’s SRT (Smart Response Technology), which lets the Z68 chipset use an SSD to cache commonly used data from a hard drive. This supposedly yields up to a 4x improvement in performance over a traditional drive alone. Unfortunately our coverage ran longer than expected, so we’ll discuss SRT in the next segment of our two-part series. Stay tuned.