Enable and use open source radeon drivers in a muxless hybrid graphics (Intel/AMD) setup


This should be no news to people who have been closely following open source driver development, but it might be useful to users who are unaware of how to enable dynamic GPU switching at runtime. Thanks to reddit user JackDostoevsky, who wrote up the necessary steps; this tutorial is essentially a re-iteration of that post. Read that post if you want to follow the discussion. I have taken a more conservative approach in this tutorial, so if you want to try git versions, follow the discussion above. On Ubuntu, I am using 13.04 (Linux 3.8) with the xorg-edgers PPA; with this setup one game under Wine performed terribly, while on Arch I got better FPS than with the Intel graphics card, and the performance of native games was largely improved. In this case, therefore, newer versions of the driver and kernel are always better.

Step 1: Installation


On Ubuntu: if you are using Ubuntu 13.04 with the catalyst driver, uninstall it (Reset Everything). Also, if you have disabled the AMD graphics card using vgaswitcheroo (in /etc/rc.local), remove that line. Enable the xorg-edgers PPA and update using the following commands.

sudo add-apt-repository ppa:xorg-edgers/ppa 
sudo apt-get update
sudo apt-get dist-upgrade

Note: If you have been using the Intel graphics installer, make sure to remove it, because it might interfere with vgaswitcheroo. Also, if you are on Ubuntu 13.10 or above (see the comments section), you don't need to enable this PPA to get the basic switching functionality that was added in Linux 3.11 and the radeon driver. However, as I said above, since development of the radeon driver and kernel (with AMD driver features) is happening at a rapid rate, it is probably a good idea to enable this PPA if you want to experience the improvements as they happen.


On Arch: if you have been using the catalyst driver, remove it. Also make sure the systemd-vgaswitcheroo-units service is disabled (systemctl disable vgaswitcheroo.service), if you had installed it to disable the discrete GPU at boot. Update your installation (sudo pacman -Syu) and make sure the ati-dri, xf86-video-ati, lib32-mesa, lib32-mesa-libgl, mesa-libgl, mesa-demos, libtxc_dxtn, lib32-libtxc_dxtn and lib32-ati-dri packages are installed in addition to the standard graphical packages.
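As a sketch, the package list above can be installed in one go (assumption: the [multilib] repository is enabled in /etc/pacman.conf, which the lib32-* packages require):

```shell
# Update the system, then install the radeon driver stack and the
# 32-bit libraries needed for 32-bit applications (games under Wine etc.).
sudo pacman -Syu
sudo pacman -S --needed ati-dri xf86-video-ati mesa-libgl mesa-demos \
    libtxc_dxtn lib32-ati-dri lib32-mesa lib32-mesa-libgl lib32-libtxc_dxtn
```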

On Linux 3.11 and above, you can add the radeon.dpm=1 kernel parameter to enable dynamic power management. Also, JackDostoevsky notes that "DPM/power management process works significantly better in 3.12 than 3.11 — in 3.12 I get power management on par with Windows". Therefore, definitely try out 3.12 to get better performance.
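If your bootloader is GRUB, one common way to add this parameter is to edit /etc/default/grub and regenerate the configuration (a sketch; file locations and commands vary between distributions):

```shell
# Add radeon.dpm=1 to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, e.g.:
#
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=1"
#
# Then regenerate the GRUB configuration and reboot:
sudo update-grub                              # Ubuntu
# sudo grub-mkconfig -o /boot/grub/grub.cfg   # Arch equivalent
```

After rebooting, cat /proc/cmdline should list radeon.dpm=1.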

Step 2: Compositing

Make sure the window manager and desktop environment you use has compositing enabled. Standard GNOME/Cinnamon/Unity has it enabled. In Xfce you can enable it from the settings menu. In LXDE, you can install a package called xcompmgr and enable it in /etc/xdg/lxsession/LXDE/autostart by adding the following line.

@xcompmgr -n

Restart your computer for this to take effect.

Step 3: List both GPU providers

Make sure both graphics cards are enabled. Log in as root (sudo su) and read the contents of the /sys/kernel/debug/vgaswitcheroo/switch file.

# cat /sys/kernel/debug/vgaswitcheroo/switch 
0:IGD:+: Pwr:0000:00:02.0
1:DIS: : Pwr:0000:01:00.0

Run the following command and note the IDs of both providers.

$ xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x79 cap: 0xb, Source Output, Sink Output, Sink Offload crtcs: 2 outputs: 4 associated providers: 0 name:Intel
Provider 1: id: 0x53 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 0 associated providers: 0 name:radeon

If you have only one provider, something is wrong. On Ubuntu, I found that if I booted with the AMD graphics card disabled and tried to enable it afterwards, the above command sometimes did not list both providers. Therefore, make sure you don't disable the AMD graphics card using some script during boot.

As you can see from the above output, the ID of my integrated graphics card is 0x79 and the ID of the discrete AMD graphics card is 0x53. I found them to be different on Arch and Ubuntu, so run the above command to make sure you are dealing with the proper IDs.

Step 4: Enable PRIME/Optimus dynamic switching

Run the following command (with the IDs noted above) to enable the switching.

$ xrandr --setprovideroffloadsink 0x53 0x79
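The two commands from Steps 3 and 4 can be wired together so you don't have to copy the IDs by hand. This is just a sketch: it assumes the providers are named Intel and radeon, exactly as in the sample output above (check your own xrandr --listproviders output).

```shell
#!/bin/sh
# Print the "id:" field of the provider whose name matches $1,
# reading `xrandr --listproviders` style output from stdin.
provider_id() {
    awk -v pat="name:$1" '$0 ~ pat {
        for (i = 1; i <= NF; i++)
            if ($i == "id:") print $(i + 1)
    }'
}

# Only attempt the switch when an X session with xrandr is available.
if command -v xrandr >/dev/null 2>&1; then
    out="$(xrandr --listproviders)"
    radeon_id="$(printf '%s\n' "$out" | provider_id radeon)"
    intel_id="$(printf '%s\n' "$out" | provider_id Intel)"
    xrandr --setprovideroffloadsink "$radeon_id" "$intel_id"
fi
```

You could drop this into your session autostart so the offload sink is configured on every login.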

After this step, if you set the DRI_PRIME=1 environment variable while running your application, you force the graphical processing onto the AMD graphics card, and the final compositing happens on the Intel graphics card. If you don't set this variable, your application runs on the Intel graphics card as it used to.

$ DRI_PRIME=1 yourapplicationname


$ DRI_PRIME=1 glxinfo | grep render
$ DRI_PRIME=1 glxgears -info
$ DRI_PRIME=1 wine somegame.exe

As I said above, FPS in one of the games under Wine was terrible on Ubuntu. For Xonotic at ultra settings, I got 15 FPS on the Intel HD 3000, while it maintained 30+ FPS on the AMD graphics card. One problem I had on Arch was that the computer turned off without any warning (maybe because of overheating). Therefore, watch the overheating issue closely. As development progresses, some of the heating issues should be solved.

Also, one reason for overheating (because of the compact build of laptops) might be the lack of space between the laptop and the surface it sits on (in addition to dust build-up). Since there is no gap between the bottom surface and the fan intake (from where air flows in), the GPU heats up as you progress in the game and not enough cool air can flow in. This in turn heats up the whole environment on the side of the laptop from which air flows out. As a result, the temperature climbs above the maximum (85 degrees) and quickly reaches the critical point (100 degrees). At this point, the game is no longer playable and the computer is almost unusable (it may even turn off).

[Image: laptop with intake fan at the bottom]

You can test this by putting your laptop on top of a book or something similar that does not cover the whole bottom (especially not the fan). In my 2-3 hour gaming test (using an external keyboard), the temperature never went above 80 degrees and the game maintained a fairly playable FPS. When I exited the game, it dropped to 50 degrees within a minute. If your results are similar, it is probably time to buy a laptop cooling pad or build something that allows plenty of cool air to flow under the laptop. Since I have not tested this extensively (as I said, 2-3 hours of gameplay), I cannot say whether it solves the overheating issues many Linux gamers are having (tell me your experience if you have a similar problem).



One strange but good thing I found on Arch is that I didn't even have to enable the AMD graphics card (I have the radeon.dpm=1 kernel parameter enabled). I had re-enabled vgaswitcheroo.service (which turns off the AMD graphics card at boot), and added @xrandr --setprovideroffloadsink 0x53 0x79 to /etc/xdg/lxsession/LXDE/autostart so it runs at login on LXDE. After starting the desktop, I ran an application using DRI_PRIME=1; although -info showed it was running on the Intel graphics card, the speed and rendering were what I get with the AMD graphics card (also the GPU fan noise).

If this is true (please test it), it to some extent obviates the need to turn on the AMD graphics card using vgaswitcheroo, or to keep it turned on while booting. Also, when I checked the contents of the vgaswitcheroo/switch file, the card was marked as off. The good thing about this is that some time after you leave the game, your computer comes back to its normal temperature. This is a feature that is coming in Linux 3.12 (though I am currently using 3.11).

This tutorial listed the steps necessary for Ubuntu and Arch. On Ubuntu, installing ia32-libs installs all the 32-bit libraries necessary for the radeon driver to work with 32-bit applications. With this information, it should not be hard to set it up on other distributions too. There is also a lot of development happening in Linux 3.12 regarding dynamic power management and runtime GPU management, so look forward to those releases.

Tell me your experience, and Good Luck :) !!




10 responses to “Enable and use open source radeon drivers in a muxless hybrid graphics (Intel/AMD) setup”

  1. You sir, are a savior. Just to add to the discussion: this method works out of the box for the 13.10 beta with all upgrades and WITHOUT the xorg-edgers PPA. I can finally use my HD 8730M. Thanks a million.

  2. Hi. Ubuntu 13.10 64bits here. ATI HD6600M / Intel 3000
    I have enabled radeon.dpm=1 kernel parameter in Grub, too.
    The open source radeon driver is working with no fan problems and pretty good performance (almost 4,000 FPS in glxgears; 30/120/120 FPS in SuperTuxKart)
    Thank you very much!

  3. Now, I would just like my 3D programs to be opened with the ATI card by default.

    Any idea?

    • Assuming you have executed the setprovider command:

      E.g. if you want to run the tuxracer game on the AMD card, you need to create a script file (let's say at /home/username/bin/tuxracer.sh) with the following text (make it executable using chmod +x).

      DRI_PRIME=1 /usr/bin/tuxracer

      Now, all you have to do is either edit the tuxracer .desktop file located in /usr/share/applications, or copy it into the /home/username/.local/share/applications directory, and change the Exec line to point to the script file like this.
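      A minimal sketch of such a .desktop entry, assuming the hypothetical script path above, could look like:

```ini
[Desktop Entry]
Type=Application
Name=Tux Racer (AMD GPU)
Exec=/home/username/bin/tuxracer.sh
Categories=Game;
```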


      If you copy the desktop file to .local, make sure to change the Name field so you don't confuse the two versions.

      If the game is non-standard, you can use this tutorial to play more with the desktop file options.

      • Thanks! That’s the solution for me

        …but I think it would be a great idea to manage an automatic method somehow. I mean, a way to make the system recognize applications that need 3D acceleration.

      • Currently, I don't know of such a solution on Linux. On Windows, the catalyst driver adds this option (open in high graphics mode, etc.).

        There was a talk (maybe this one) where they were discussing that (for Optimus/muxless setups like ours). Maybe it will come when Wayland/Mir takes over from X11 and frameworks/window managers decide which application runs on which GPU :)

  4. Hi, do you have any solution for xrandr showing only one device? I am actually using an nvidia GPU, but I think the problem is related. I have posted the question on the Fedora forums (http://forums.fedoraforum.org/showthread.php?t=300306). Please, if you have any solution, help me.
    Thank you.
