
Fast, but compromised: Gigabyte’s AMD-powered mini gaming PC reviewed

The Brix Gaming box is hobbled by heat, noise, and mediocre Linux support.

Andrew Cunningham
The Brix Gaming is a photogenic little machine, but it's not quite as good as it needs to be. Credit: Andrew Cunningham

Mini desktops are a growing market, but so far it’s a market that Intel has had the run of. The company’s own “Next Unit of Computing” (NUC) and efforts like Gigabyte’s Brix Pro are diminutive but much more capable than the wimpy “nettops” of yesteryear.

Now it’s time for AMD to get in on the fun. The company sent us two Gigabyte mini PCs that are roughly the same size as the Intel versions, but these machines use AMD A8 chips instead of Intel ones. While the smaller, cheaper GB-BXA8-5545 (which we’ll be reviewing in full in a separate piece) is basically just an AMD version of the NUC, the Brix Gaming (yes, that’s the device’s full name) is something else altogether. The race-car-red machine combines AMD’s CPU with a true dedicated AMD GPU, promising a level of graphics performance that we haven’t yet seen in a mini desktop.

Unfortunately, this is one of those times when reality doesn’t quite match expectations. The Brix Gaming does have a much faster GPU than any mini PC we’ve seen, but it has to make a few too many compromises to get there.

If you’ve seen one, you’ve seen them all

The Brix Gaming next to the Intel-powered Brix Pro (right).
The smaller AMD Brix box compared to the Brix Gaming.
The Brix Pro has the same ports you’ll find in most mini PCs.
The Brix Gaming is relatively tiny, but it comes with a honking power brick.
Specs at a glance: Gigabyte Brix Gaming GB-BXA8G-8890
OS Windows 8.1 x64
CPU 2.1GHz AMD A8-5557M, capped at about 1.9GHz by default; Turbo Core speeds up to 3.1GHz available with the proper BIOS settings
RAM 8GB 1600MHz DDR3 (supports up to 16GB)
GPU Integrated: AMD Radeon HD 8550G (256 shaders)
Dedicated: AMD Radeon R9 M275X (640 shaders) with 2GB 1125MHz GDDR5
HDD 128GB Crucial M500 mSATA SSD
Networking 433Mbps 802.11ac Wi-Fi, Bluetooth 4.0, Gigabit Ethernet
Ports 4x USB 3.0, 1x mini DisplayPort 1.2, 1x HDMI 1.4a, audio
Size 5.04” x 4.54” x 2.35” (128 x 115.4 x 59.6 mm)
Other perks Kensington lock, space for 2.5-inch laptop hard drive, VESA mounting bracket
Warranty 1 year
Price $569.99 (barebones), $814.97 with listed components and software

The Brix Gaming is on the chunkier end of the mini PC spectrum—it’s imperceptibly shorter and about half an inch wider than the GB-BXA8-5545. The two of them are in the same ballpark, though, and both use the same chunky external power brick. The Brix Gaming is constructed primarily of red and black plastic, with black metal mesh used on the sides and back to improve airflow.

The port layout is identical to just about every mini PC we’ve seen this year. It has two USB 3.0 ports on the front and two more on the back, plus gigabit Ethernet, a mini-DisplayPort, an HDMI port, and a Kensington lock slot. A standard headphone jack on the front of the system rounds it all out.

The Brix Gaming’s GPU should be sufficient for 4K video output and playback, even if it’s not sufficient to play games at these resolutions. The HDMI port should support up to 1080p at a refresh rate of 60Hz, 3840×2160 at 30Hz, and 4096×2160 at 24Hz. The DisplayPort can drive a display up to 2560×1600 at 60Hz or 3840×2160 at 30Hz. Using DisplayPort MST, you can also drive two 1920×2160 displays at 60Hz.
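Those refresh-rate ceilings fall straight out of HDMI 1.4a's TMDS clock limit of roughly 340MHz. A quick sketch makes this concrete, using the standard CEA-861 total-raster timings (which include the blanking intervals around the visible image):

```python
# Why HDMI 1.4a tops out at 4K30: the pixel clock required for each mode
# must fit under the spec's ~340MHz TMDS clock ceiling. Total (active +
# blanking) raster sizes below are the standard CEA-861 values.
HDMI_14_TMDS_LIMIT_MHZ = 340.0

# (label, total horizontal pixels, total vertical lines, refresh Hz)
modes = [
    ("1920x1080 @ 60Hz", 2200, 1125, 60),
    ("3840x2160 @ 30Hz", 4400, 2250, 30),
    ("3840x2160 @ 60Hz", 4400, 2250, 60),  # the mode HDMI 1.4a can't do
]

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total raster and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

for label, h, v, hz in modes:
    clk = pixel_clock_mhz(h, v, hz)
    verdict = "fits" if clk <= HDMI_14_TMDS_LIMIT_MHZ else "exceeds limit"
    print(f"{label}: {clk:.1f}MHz ({verdict})")  # 148.5, 297.0, 594.0
```

4K at 60Hz needs a 594MHz pixel clock, far past what HDMI 1.4a can carry, which is why that mode is only reachable over DisplayPort MST here.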

Journey to the center of the Brix

The Brix has two system boards on the inside: one with the RAM, SSD, CPU, and other system components on it (right, on edge of frame), and one that houses the GPU and its RAM (shown here under the large heatsink). Credit: Andrew Cunningham

What really makes the Brix Gaming unique among mini PCs is what’s inside the box.

Most mini-desktops really have more in common with laptops than with desktops. By necessity, they’re small, simple, and tightly integrated. The RAM, Wi-Fi, and mSATA slots are all kept on the bottom of the unit so that you can access them when you open the system up. A processor in a ball-grid array (BGA) package is mounted to the other side of the board—you couldn’t upgrade it if you wanted to, so there’s no point in making it user-accessible—and a heatsink and CPU fan strapped to the top of the CPU blow heat out the back. This is how the NUC was put together, how the Brix Pro was put together, and how the smaller of our two AMD Brix boxes is put together.

The Brix Gaming is different. When Gigabyte says it includes a dedicated graphics card, it doesn’t just mean there’s a dedicated chip sitting next to some dedicated graphics RAM (like what you’d find in most laptops). There’s an entirely separate board in there, joined to the standard mini-desktop mainboard by a tiny stub of a board. The bottom of the main system board and the top of the graphics daughterboard are both covered by large copper heatsinks, and twin system fans push the heat out of the left side of the unit. Take special care not to block this side of the PC, or you’ll end up with heat problems very quickly.

The heatsink covering the CPU. The two heatsinks face each other when the system is fully assembled, and the two system fans blow directly into them to keep them cool. Credit: Andrew Cunningham
This small connector board links the mainboard and the MXM board with the GPU together. Credit: Andrew Cunningham
Gigabyte is using practically the same motherboard in all of its AMD-based Brix systems—this board from the smaller Brix still has space on the board for an MXM connector even though the connector itself is absent. Credit: Andrew Cunningham

The two boards are connected together using the Mobile PCI Express Module (MXM) interface, a relatively rare interconnect used in some all-in-ones, chunky gaming laptops, and places where the promise of upgradeability is important. I say the promise of upgradeability because between these cards’ scarcity and the fact that your particular connector or BIOS may not even be compatible with whatever you try to replace it with, they probably don’t actually get upgraded all that often. The idea was to provide a mobile equivalent to regular desktop PCI Express cards; in reality the demand for such a feature has been too low for it to really matter.

GPU-Z reports that the GPU is connected via eight PCI Express 2.0 lanes. Sixteen lanes is the norm for most desktop graphics cards, but the speed of the interface is unlikely to be a bottleneck—even the fastest graphics cards available don’t benefit much from extra PCI Express bandwidth, whether it’s provided using more lanes or the faster PCI Express 3.0 spec.
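The arithmetic backs this up. Here's a rough sketch of per-direction PCIe bandwidth; the transfer rates and encoding overheads are from the PCIe specifications, not from the review:

```python
# PCIe bandwidth per direction. PCIe 2.0 runs at 5GT/s per lane with
# 8b/10b encoding (80% efficiency); PCIe 3.0 runs at 8GT/s per lane
# with much lighter 128b/130b encoding.
def pcie_gbps(lanes, gt_per_s, encoding_efficiency):
    """Usable bandwidth in GB/s per direction."""
    return lanes * gt_per_s * encoding_efficiency / 8  # bits -> bytes

x8_gen2 = pcie_gbps(8, 5.0, 8 / 10)       # the Brix Gaming's GPU link
x16_gen3 = pcie_gbps(16, 8.0, 128 / 130)  # a typical desktop slot

print(f"PCIe 2.0 x8:  {x8_gen2:.1f} GB/s")   # 4.0 GB/s
print(f"PCIe 3.0 x16: {x16_gen3:.2f} GB/s")  # ~15.75 GB/s
```

So the Brix Gaming's link offers about a quarter of a modern desktop slot's bandwidth, which, as noted above, matters very little for a GPU of this class.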

Aside from the unique graphics card setup, opening and working on the Brix Gaming is much like working on any mini PC. You’ll need to bring your own RAM, SSD, and operating system license—we’ve budgeted out about $70 for 8GB of RAM across two 4GB sticks, $75 for a Crucial M500 mSATA SSD, and $100 for an OEM license of Windows 8.1. Your component choices and final system cost may vary, but as specced, our system would cost about $815 after you add those selections to the $570 you’ll pay for the base system.

Like the Brix Pro, the Brix Gaming also has a bracket and connector for a standard 2.5-inch laptop drive. You can use this as your primary drive or you can use it as extra storage alongside an integrated SSD, but AMD has no equivalent to Intel’s Smart Response Technology. This means you can’t combine the two into one logical drive that uses the SSD as a cache to improve responsiveness while using a standard HDD as a large, cheap storage pool.

Noise, heat, and power draw

The Brix Gaming makes a good first impression because it has an eye-catching color combination and design. Then you turn it on.

Those two system fans we saw earlier? They’re always clearly audible, even when the computer is idling. Start to stress the system and they’ll really take off. Considered in context, the fan noise is probably tolerable—this box is way more likely to be sitting in your entertainment system surrounded by speakers than it is to be on your desk opening Excel spreadsheets—but it’s far and away the loudest of the mini desktops we’ve laid ears on so far. It’s always whirring.

A Gigabyte representative told us that the company would be working on fine-tuning the “fan curve,” the rate at which the fan speed goes up or down based on the box’s internal temperature. The latest BIOS update has already made some adjustments, and more are apparently in the works. However, this will just affect how quickly the fan spins up when the computer is under load—Gigabyte can’t do much about fan noise at idle.
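For the curious, a fan curve is just a mapping from internal temperature to fan duty cycle. The sketch below uses invented breakpoints (Gigabyte doesn't publish its actual values) to illustrate the mechanism a BIOS update would be tuning:

```python
# A fan curve maps internal temperature to fan speed. These breakpoints
# are invented for illustration -- Gigabyte's real values aren't public --
# but the mechanism (linear interpolation between points, with a nonzero
# floor at idle) is typical of BIOS-level fan control.
CURVE = [(40, 35), (60, 50), (75, 80), (85, 100)]  # (degrees C, % duty)

def fan_duty(temp_c):
    """Interpolated fan duty cycle (%) for a temperature, clamped to the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]   # idle floor: the fan never fully stops
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]  # ceiling: full blast
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return d0 + frac * (d1 - d0)

print(fan_duty(30))    # 35 -- below the curve, idle floor applies
print(fan_duty(67.5))  # 65.0 -- halfway between the 60C and 75C points
```

Retuning the curve amounts to moving those breakpoints around, but as long as the idle floor stays above zero, the fans keep whirring at the desktop.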

The dedicated GPU also ratchets up the power consumption, at least when it’s active. While using the integrated graphics, AMD’s CPU still consumes more power than Intel’s—this is where Intel’s manufacturing advantage helps out. Haswell chips were designed to have low idle power consumption, and it helps that Intel’s chips are built on a 22nm process instead of the 32nm process used by AMD here.

Activity Haswell NUC Gigabyte Brix Pro Gigabyte Brix Gaming
Off/Hibernated 0.5W 0.3W 0.3W
Sleep mode 1.1W 1.7W 3.4W
Idle at desktop 6.4W 14.0W 22.1W
Watching YouTube in Chrome 9.0W 21.0W 25.0W
Running Bioshock Infinite benchmark 38.0W 80.0W ~58W (integrated), ~86W (dedicated)
Running Prime95 CPU torture test 29.7W 84.0W 47.4W

Peak power consumption isn’t all that far from what we saw in the Intel-based Brix Pro, but AMD’s idle power consumption can’t approach what Intel’s Haswell chips enable. You can see why Intel CPUs are all but universal in mid-to-high-end laptops. If the NUC’s power consumption is a little closer to that of a dedicated set-top box like a Fire TV or Roku, the Brix Gaming’s power draw is a bit more like a game console.
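To put the idle gap in perspective, here's a back-of-the-envelope yearly cost comparison for an always-on HTPC; the $0.12/kWh electricity rate is our assumption for illustration, not a figure from our testing:

```python
# What the idle-power gap costs over a year of always-on HTPC duty.
# The $0.12/kWh rate is an assumed ballpark figure, not from the review.
HOURS_PER_YEAR = 24 * 365
RATE_USD_PER_KWH = 0.12

def annual_cost_usd(watts):
    """Yearly electricity cost of a constant load in watts."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * RATE_USD_PER_KWH

nuc_idle, brix_gaming_idle = 6.4, 22.1  # measured idle draw from the table
gap = annual_cost_usd(brix_gaming_idle) - annual_cost_usd(nuc_idle)
print(f"Idle-power cost gap vs. the NUC: ${gap:.2f}/year")  # ~$16.50/year
```

A few dollars a year won't break anyone, but it illustrates how far AMD's idle draw is from set-top-box territory.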

It’s also clear here that while Intel’s quad-core CPU is much more powerful than AMD’s, it takes much more power to get all that extra work done. This will even out over time for many tasks, though—Intel’s CPU will take less time to get the work done than AMD’s, meaning it can return to a low-power idle mode more quickly.

CPU performance and “Turbo Mode”

Boosting performance old-school: the Brix Gaming has what amounts to a “Turbo” button in the BIOS. Credit: Andrew Cunningham

The short version: AMD’s CPU architectures have been behind Intel’s for years, and that hurts the Brix Gaming in CPU-heavy tasks. Generally speaking, it takes about two of AMD’s CPU cores to keep up with one of Intel’s.

By default, the Brix Gaming’s BIOS limits the CPU’s clock speed to around 1.9GHz to keep heat, power usage, and noise levels down. Enabling the desktop’s “Turbo Mode” increases CPU performance by a noticeable amount at the cost of increased fan noise and power consumption.

The long version: The Brix Gaming’s CPU is an AMD A8-5557M, a quad-core laptop CPU with a base clock of 2.1GHz and a Turbo clock speed of 3.1GHz. Based on all of those numbers, it sounds like it would be more comparable to the Core i7-4770R in the Brix Pro than the 1.3GHz dual-core i5-4250U in Intel’s NUC, but AMD’s CPU designs haven’t been directly comparable to Intel’s in years.

This CPU is based on AMD’s “Piledriver” architecture, which is a descendant of Bulldozer. That architecture was noted for lackluster single-core performance, and as we’ll see in our benchmarks, it generally takes two AMD CPU cores to keep up with one of Intel’s.

We ran into an additional wrinkle in our testing: the CPU in the Brix Gaming doesn’t actually seem to be able to ramp up to that 3.1GHz Turbo clock—the maximum clock speed reported by the Catalyst Control Center app is the standard 2.1GHz. The fastest speed we actually observed in Task Manager or CPU-Z was around 1.9GHz. By default, Gigabyte appears to be clamping down on clock speed to prevent heat and power problems.

By default, Gigabyte is capping the CPU’s clock speed near 1.9GHz. This is true whether one or all of the cores are in use. Credit: Andrew Cunningham

Heading into the BIOS revealed two different performance modes. The default, “Operation Mode,” was the source of our clock speed throttling. A second “Turbo Mode” is also available, but Gigabyte says that it “may make a lot of noise and heat.” It’s not an idle threat, either—enabling Turbo Mode unlocks the full 3.1GHz Turbo speeds the CPU is rated for, but the already-noisy fans spin up much more quickly, even if you’re just opening up Chrome and browsing around. After a minute or two of going at full-blast, the CPU speed goes back down to about 2.25GHz. So enabling Turbo Mode in the BIOS gets you a significantly faster CPU for light, “bursty” tasks, but the difference is smaller if you’re doing CPU-intensive tasks for long periods of time.

Activity Gigabyte Brix Gaming (Default) Gigabyte Brix Gaming (Turbo)
Off/Hibernated 0.3W 0.3W
Sleep mode 3.4W 3.6W
Idle at desktop 22.1W 22.1W
Watching YouTube in Chrome 25.0W 25.2W
Running Bioshock Infinite benchmark ~58W (integrated), ~86W (dedicated) ~62W (integrated), ~97W (dedicated)
Running Prime95 CPU torture test 47.4W 69.0W (peak), 58.0W (after throttling down)

Here are the systems being tested in the graphs below:

  • The Haswell NUC (model D54250WYK1), which has a dual-core 1.3GHz (2.6GHz Turbo) Core i5-4250U and an Intel HD 5000 GPU.
  • The Gigabyte Brix Pro, which has a quad-core 3.2GHz (3.9GHz Turbo) Core i7-4770R CPU and an Intel Iris Pro 5200 GPU.
  • A more conventional Falcon Northwest gaming PC, on loan to us from Nvidia. The Falcon Northwest Tiki desktop is much larger than the NUC or the Brix despite being a “small form-factor” offering, and it represents a reasonably well-balanced midrange gaming desktop. It includes a quad-core 3.5GHz (3.9GHz Turbo) Core i7-4770K CPU and a GeForce GTX 760 GPU with 2GB of GDDR5, a core clock of 980MHz (1033MHz Turbo), and a memory clock of 1500MHz.
  • Our Intel graphics driver is version 10.18.10.3345. The GeForce driver is version 332.21. We used the Catalyst 14.3 beta for the AMD GPUs.

Our basic CPU benchmarks confirm two things: first, AMD’s CPU performance is more like the dual-core Intel NUC than the quad-core Brix Pro. Second, disabling Turbo on this CPU reduces speeds by around 25 percent. The processor should still be fast enough for most games since they’re GPU-bound more often than they’re CPU-bound, but if you do play CPU-limited games, you might consider enabling Turbo Mode in the BIOS despite the higher noise and heat levels.

The GPUs

The short version: The Brix Gaming has two GPUs, one integrated into the CPU and one more powerful dedicated card. Playing modern games at 1080p with a few settings turned down should be possible most of the time, though it still won’t compete with proper mid-to-high-end desktop GPUs.

The long version: The Brix Gaming has two GPUs, the one integrated into the A8-5557M and the dedicated one. The system can switch between the two based on whether it needs to save power or needs more performance (the two can’t be used in tandem, though, which is unsurprising in a case this small). AMD’s switchable graphics solution remains a little clunkier and less seamless than Nvidia’s equivalent Optimus feature on the Intel side of the fence, but it generally does a decent job of letting the dedicated GPU kick in when there’s a game being played. If you don’t seem to be getting the performance you think you should, you can force use of the integrated or dedicated GPU on a per-application basis or globally using AMD’s driver software.

The processor GPU is a Radeon HD 8550G with 256 shaders rated at a maximum clock speed of 720MHz, and when paired with 1600MHz DDR3 system RAM, it has 25.6GBps of memory bandwidth. The dedicated GPU is listed on Gigabyte’s product page as a Radeon R9 M275X, a GPU that doesn’t show up on any AMD product pages—plug that model number into Google and all you’ll get back are announcements about the Brix Gaming. It’s new enough that we had to install AMD’s latest beta drivers to get it working since the non-beta package didn’t recognize it.
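That 25.6GBps figure is easy to verify: it's just the transfer rate times the bus width times the channel count, sketched here for DDR3-1600 in a dual-channel configuration:

```python
# Where 25.6GBps comes from: DDR3-1600 moves 1600 million transfers per
# second over a 64-bit (8-byte) bus, and two SO-DIMMs give two channels.
def ddr3_bandwidth_gbps(transfers_per_s, bus_bytes, channels):
    """Peak memory bandwidth in GB/s."""
    return transfers_per_s * bus_bytes * channels / 1e9

print(ddr3_bandwidth_gbps(1600e6, 8, 2))  # 25.6
```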

Thanks to some sleuthing, information provided by the Catalyst Control Center software, and a spec sheet from AMD, we can say with a fair degree of certainty that this GPU is a Radeon HD 8870M with a faster clock speed and a new name—the Brix Gaming’s model number, GB-BXA8G-8890, implies that it could just as easily be called a Radeon HD 8890M.

That means a GPU with 640 shaders rated at a maximum clock speed of 925MHz (according to the Catalyst software), connected to 2GB of 1125MHz GDDR5 via a 128-bit memory bus. AMD’s list of GPU architectures, codenames, and model numbers is as labyrinthine and inscrutable as ever, but as far as we can tell, this chip is based on AMD’s first-generation Graphics Core Next originally introduced in the Radeon 7000 series, not the revised version of the architecture introduced in some Radeon R9 desktop cards.
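The dedicated card's memory bandwidth works out the same way, except that GDDR5 moves data four times per command clock. A quick sketch using the specs above:

```python
# GDDR5 transfers data four times per command clock, so a 1125MHz clock
# yields 4500MT/s; across a 128-bit (16-byte) bus that's 72GB/s peak.
def gddr5_bandwidth_gbps(clock_mhz, bus_bits):
    """Peak GDDR5 bandwidth in GB/s (quad data rate)."""
    transfers_per_s = clock_mhz * 1e6 * 4
    return transfers_per_s * (bus_bits / 8) / 1e9

print(gddr5_bandwidth_gbps(1125, 128))  # 72.0
```

That roughly triples the bandwidth available to the integrated GPU, which goes a long way toward explaining the performance gap between the two.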

Before we show our benchmarks, a quick note: late in our testing, one of the Brix Gaming’s RAM slots completely stopped working. We finished enough benchmarking that we still feel comfortable making generalized observations about performance, but we’re missing a few scores. Gigabyte says that it hasn’t run into this problem with other Brix Gaming boxes and is in the process of repairing our unit—we will run an update if any of our findings change.

Turbo Mode doesn’t have the same effect on GPU benchmarks as it does on CPU benchmarks. You can see the scores go up a little bit in Bioshock Infinite when Turbo Mode is enabled, though the small size of the increase makes us think it comes from the increased CPU speed rather than any major boost to GPU speed. In other words, unless you’re running extremely CPU-heavy games, the Brix Gaming’s default “Operation Mode” setting should provide near-maximum performance.

And that performance is actually pretty impressive given the size of this box. The Brix Gaming is still only 33 to 50 percent as fast as the GTX 760 in the Falcon Northwest Tiki, but that’s a much larger system. You get really close to that 30fps average threshold in Bioshock Infinite at 1080p, so you should have no problems playing newer games at that resolution with some settings turned down (and older games with all the settings turned up). Generally speaking, it offers about twice the graphics performance of Intel’s best integrated GPU (represented here by the Brix Pro) and about four times the performance of the smaller Haswell NUC.

Ubuntu and SteamOS

We were initially excited about putting SteamOS on the Brix Gaming to see how it worked—as good as the Brix Pro’s CPU performance is, Intel’s best GPUs still aren’t up to the task of playing most games on a 1080p TV. Discovering that the box used AMD’s switchable GPUs dampened that enthusiasm. The state of Linux graphics drivers is iffy enough when there’s just one graphics processor, and neither AMD nor Nvidia’s implementations are ideal. We still installed Ubuntu 14.04 and the latest SteamOS beta to get some idea of what worked and what didn’t.

Unfortunately, we were completely unable to get SteamOS working. It would make its way through the entire installation process up until it came time to boot into the main SteamOS UI, at which point it would only output a blank black screen through the DisplayPort and HDMI port. The monitor was still receiving a signal, indicating that it was close to working, but the computer wouldn’t actually put out any usable images. This seems like a driver issue—our other, smaller AMD Brix box without switchable graphics acts exactly the same way. Hopefully, as curious SteamOS contributors get their hands on AMD Brix boxes this problem can be resolved.

Ubuntu 14.04 installed and booted up fine, which gave us an opportunity to test things with a newer Ubuntu version and Linux kernel than we tested on the Brix Pro or the NUC. The latest Ubuntu version has, for instance, implemented rudimentary support for the odd Realtek 802.11ac card that Gigabyte supplies with the Brix Pro and Brix Gaming, though connecting to our Airport Extreme using the 5GHz band was still unstable. Bluetooth and 2.4GHz Wi-Fi were fine, though.

While both GPUs worked in Ubuntu, we still encountered problems. As expected, the graphics switching feature doesn’t seem to kick in when playing a game downloaded from Steam—using the open-source Radeon drivers in the box, the computer would only use the integrated GPU even though both GPUs were detected by the OS. Building and installing the latest version of the Catalyst beta driver from AMD’s driver site increased performance in games, but the corresponding increase in power consumption while sitting idle at the desktop suggests that the system is only using the dedicated GPU rather than switching between the two GPUs to provide the best mix of performance and power.

At this point we can’t really recommend the Brix Gaming as a Linux box, both because of lackluster driver support and because the box’s strengths (good GPU performance) don’t really complement Linux’s strengths. For all of Valve’s ambitions, Linux isn’t an OS most people are going to want to game on, because there just aren’t that many high-end games there. Assuming SteamOS does gain some momentum and starts to attract developers, Gigabyte and others will have had a generation or two to make better, faster hardware. In short, as much as we wanted it to be, this is not the Steam Machine you’re looking for.

Try, try again

The Brix Gaming. Credit: Andrew Cunningham

Our time with the Brix Gaming has been intriguing and frustrating in equal measure. On the one hand, the box provides a blueprint for other mini PC makers to follow if they want to make a tiny gaming machine. On the other, its ambition is rarely matched by its execution—it’s hot and it’s noisy, and its performance can be eclipsed by Micro ATX and even mini ITX gaming PCs built for approximately the same price (see our homemade Steam Machine or our latest Budget Box for examples). Our review unit also broke down less than a week after it arrived, which might not be indicative of anything, but it isn’t an encouraging sign.

Some of our complaints stem from the use of an AMD CPU, which generates more heat and uses more power than comparable chips from Intel. The chip’s heat output and power consumption render it unable to run at its rated speed out of the box, and even with “Turbo Mode” enabled in the BIOS it can’t maintain those speeds for very long. Games are more often constrained by the GPU than the CPU these days, but the exact same box with a dual-core Intel chip would have equal or better CPU performance while helping to alleviate the heat and noise problems the computer has.

This particular Brix Gaming model isn’t all it could be, but it comes close enough that we want to see what Gigabyte can accomplish next—a similar box that combines a dual-core Intel CPU with an Nvidia GPU is already apparently on the horizon, and we’ll try to get that one in for a review soon. The Brix Pro is still a great example of a mini PC with the CPU power of a much larger computer. With some time and some different components, a little desktop that can deliver a great gaming experience will surely follow.

The good

  • Unmatched GPU performance for the size
  • Good port selection and layout
  • Eye-catching design
  • Easy to open and work on
  • Bundled 802.11ac
  • Integrated 2.5-inch drive bay

The bad

  • Mediocre CPU performance
  • Higher power consumption than Intel offerings
  • Runs hot
  • Relatively poor Linux support
  • No SSD caching feature à la Intel’s Smart Response Technology

The ugly

  • We had a RAM slot go dead in the middle of our testing
  • Loud even when idle


Andrew Cunningham Senior Technology Reporter
Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.