Could not create OpenGL 3.2 context, SDL error

Running Arch Linux on a laptop with NVIDIA Optimus.

$ primusrun glxinfo | grep 'OpenGL version'
OpenGL version string: 4.5.0 NVIDIA 352.21
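Since the editor asks for a 3.2 context, it may also be worth grepping for the core profile string that glxinfo reports; if that line is missing or shows a version below 3.2, a core-profile context is unlikely to be available:

$ primusrun glxinfo | grep 'core profile version'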

$ LD_PRELOAD=/usr/lib/nvidia/libGL.so PRIMUS_SYNC=1 primusrun ./UE4Editor-Linux-Debug -nosplash

[2015.06.20-00.08.43:524][  0]Log file open, 06/20/15 01:08:43
[2015.06.20-00.08.43:524][  0]LogInit:Display: Running engine without a game
[2015.06.20-00.08.43:524][  0]LogInit:Display: RandInit(-1968654583) SRandInit(-1968654574).
[2015.06.20-00.08.43:524][  0]LogTaskGraph: Started task graph with 4 named threads and 7 total threads.
[2015.06.20-00.08.43:525][  0]LogStats: Stats thread started
[2015.06.20-00.08.43:525][  0]LogInit: Version: 4.8.1-0+++depot+UE4-Releases+4.8
[2015.06.20-00.08.43:525][  0]LogInit: API Version: 0
[2015.06.20-00.08.43:525][  0]LogInit: Compiled (64-bit): Jun 20 2015 00:54:49
[2015.06.20-00.08.43:525][  0]LogInit: Compiled with Clang: 3.6.1 (tags/RELEASE_361/final)
[2015.06.20-00.08.43:525][  0]LogInit: Build Configuration: Debug
[2015.06.20-00.08.43:525][  0]LogInit: Branch Name: ++depot+UE4-Releases+4.8
[2015.06.20-00.08.43:525][  0]LogInit: Command line:  -nosplash
[2015.06.20-00.08.43:525][  0]LogInit: Base directory: /home/felix/devel/UnrealEngine/Engine/Binaries/Linux/
[2015.06.20-00.08.43:525][  0]LogInit: Rocket: 0
[2015.06.20-00.08.43:599][  0]LogInit: Using libcurl 7.38.0
[2015.06.20-00.08.43:599][  0]LogInit:  - built for x86_64-unknown-linux-gnu
[2015.06.20-00.08.43:599][  0]LogInit:  - supports SSL with OpenSSL/1.0.1i
[2015.06.20-00.08.43:599][  0]LogInit:  - supports HTTP deflate (compression) using libz 1.2.5
[2015.06.20-00.08.43:599][  0]LogInit:  - other features:
[2015.06.20-00.08.43:599][  0]LogInit:      CURL_VERSION_SSL
[2015.06.20-00.08.43:599][  0]LogInit:      CURL_VERSION_LIBZ
[2015.06.20-00.08.43:599][  0]LogInit:      CURL_VERSION_IPV6
[2015.06.20-00.08.43:599][  0]LogInit:      CURL_VERSION_ASYNCHDNS
[2015.06.20-00.08.43:599][  0]LogInit:      CURL_VERSION_LARGEFILE
[2015.06.20-00.08.43:600][  0]LogInit:      CURL_VERSION_TLSAUTH_SRP
[2015.06.20-00.08.43:600][  0]LogInit:  Libcurl: checking if '/etc/pki/tls/certs/ca-bundle.crt' exists
[2015.06.20-00.08.43:600][  0]LogInit:  Libcurl: checking if '/etc/ssl/certs/ca-certificates.crt' exists
[2015.06.20-00.08.43:602][  0]LogInit:  CurlRequestOptions (configurable via config and command line):
[2015.06.20-00.08.43:602][  0]LogInit:  - bVerifyPeer = true  - Libcurl will verify peer certificate
[2015.06.20-00.08.43:603][  0]LogInit:  - bUseHttpProxy = false  - Libcurl will NOT use HTTP proxy
[2015.06.20-00.08.43:603][  0]LogInit:  - bDontReuseConnections = false  - Libcurl will reuse connections
[2015.06.20-00.08.43:603][  0]LogInit:  - CertBundlePath = /etc/ssl/certs/ca-certificates.crt  - Libcurl will set CURLOPT_CAINFO to it
[2015.06.20-00.08.43:630][  0]LogOnline:Warning: No default platform service specified for OnlineSubsystem
[2015.06.20-00.08.43:763][  0]LogInit: Presizing for 0 objects not considered by GC, pre-allocating 0 bytes.
[2015.06.20-00.08.44:119][  0]LogInit: Object subsystem initialized
[2015.06.20-00.08.44:192][  0]LogInit: Initializing SDL.
[2015.06.20-00.08.45:242][  0]LogInit: Display metrics:
[2015.06.20-00.08.45:242][  0]LogInit:   PrimaryDisplayWidth: 1920
[2015.06.20-00.08.45:242][  0]LogInit:   PrimaryDisplayHeight: 1080
[2015.06.20-00.08.45:242][  0]LogInit:   PrimaryDisplayWorkAreaRect:
[2015.06.20-00.08.45:242][  0]LogInit:     Left=0, Top=0, Right=1920, Bottom=1080
[2015.06.20-00.08.45:242][  0]LogInit:   VirtualDisplayRect:
[2015.06.20-00.08.45:242][  0]LogInit:     Left=0, Top=0, Right=1920, Bottom=1080
[2015.06.20-00.08.45:242][  0]LogInit:   TitleSafePaddingSize: X=0.000 Y=0.000
[2015.06.20-00.08.45:242][  0]LogInit:   ActionSafePaddingSize: X=0.000 Y=0.000
[2015.06.20-00.08.45:242][  0]LogInit:   Number of monitors: 1
[2015.06.20-00.08.45:242][  0]LogInit:     Monitor 0
[2015.06.20-00.08.45:242][  0]LogInit:       Name: 0
[2015.06.20-00.08.45:242][  0]LogInit:       ID: display0
[2015.06.20-00.08.45:242][  0]LogInit:       NativeWidth: 1920
[2015.06.20-00.08.45:242][  0]LogInit:       NativeHeight: 1080
[2015.06.20-00.08.45:242][  0]LogInit:       bIsPrimary: true
[2015.06.20-00.08.45:277][  0]LogInit: Selected Device Profile: [Linux]
[2015.06.20-00.08.45:277][  0]LogInit: Applying CVar settings loaded from the selected device profile: [Linux]
[2015.06.20-00.08.45:339][  0]LogInit: Linux hardware info:
[2015.06.20-00.08.45:339][  0]LogInit:  - this process' id (pid) is 18948, parent process' id (ppid) is 19718
[2015.06.20-00.08.45:339][  0]LogInit:  - we are not running under debugger
[2015.06.20-00.08.45:339][  0]LogInit:  - machine network name is 'leikeze'
[2015.06.20-00.08.45:339][  0]LogInit:  - we're logged in locally
[2015.06.20-00.08.45:339][  0]LogInit:  - Number of physical cores available for the process: 4
[2015.06.20-00.08.45:339][  0]LogInit:  - Number of logical cores available for the process: 8
[2015.06.20-00.08.45:339][  0]LogInit:  - Memory allocator used: binned
[2015.06.20-00.08.45:339][  0]LogInit: Linux-specific commandline switches:
[2015.06.20-00.08.45:339][  0]LogInit:  -nodwarf (currently OFF): suppress parsing of DWARF debug info (callstacks will be generated faster, but won't have line numbers)
[2015.06.20-00.08.45:339][  0]LogInit:  -ansimalloc - use malloc()/free() from libc (useful for tools like valgrind and electric fence)
[2015.06.20-00.08.45:339][  0]LogInit:  -jemalloc - use jemalloc for all memory allocation
[2015.06.20-00.08.45:339][  0]LogInit:  -binnedmalloc - use binned malloc  for all memory allocation
[2015.06.20-00.08.45:339][  0]LogInit:  -httpproxy=ADDRESS:PORT - redirects HTTP requests to a proxy (only supported if compiled with libcurl)
[2015.06.20-00.08.45:339][  0]LogInit:  -reuseconn - allow libcurl to reuse HTTP connections (only matters if compiled with libcurl)
[2015.06.20-00.08.45:339][  0]LogInit:  -virtmemkb=NUMBER - sets process virtual memory (address space) limit (overrides VirtualMemoryLimitInKB value from .ini)
[2015.06.20-00.08.45:339][  0]LogInit: Setting LC_NUMERIC to en_US
[2015.06.20-00.08.45:339][  0]LogInit:  - Physical RAM available (not considering process quota): 8 GB (7882 MB, 8071200 KB, 8264908800 bytes)
[2015.06.20-00.08.45:578][  0]LogTextLocalizationManager: The requested culture ('en_GB') has no localization data; parent culture's ('en') localization data will be used.
[2015.06.20-00.08.46:039][  0]LogTextLocalizationManager:Warning: Loaded localization resources contain conflicting entries for (Namespace:ContentBrowser, Key:ImportAssetToolTip):
Localization Resource: (/home/felix/devel/UnrealEngine/Engine/Content/Localization/Editor/en/Editor.locres) Source String Hash: (-630476809) Localized String: (Import to {0}...)
Localization Resource: (/home/felix/devel/UnrealEngine/Engine/Content/Localization/Editor/en/Editor.locres) Source String Hash: (1271782899) Localized String: (Imports an asset from file to this folder.)
[2015.06.20-00.08.46:766][  0]LogLinux:Error: appError called: Assertion failed: Assertion failed:  [File:/home/felix/devel/UnrealEngine/Engine/Source/Runtime/OpenGLDrv/Private/Linux/OpenGLLinux.cpp] [Line: 183] 
_PlatformCreateOpenGLContextCore - Could not create OpenGL 3.2 context, SDL error: 'Could not create GL context: GLXBadFBConfig'

[2015.06.20-00.08.47:049][  0]LogLinux: === Critical error: ===
Unhandled Exception: SIGSEGV: invalid attempt to access memory at address 0x00000003

[2015.06.20-00.08.47:050][  0]LogLinux: Assertion failed: Assertion failed:  [File:/home/felix/devel/UnrealEngine/Engine/Source/Runtime/OpenGLDrv/Private/Linux/OpenGLLinux.cpp] [Line: 183] 
_PlatformCreateOpenGLContextCore - Could not create OpenGL 3.2 context, SDL error: 'Could not create GL context: GLXBadFBConfig'


[Callstack]  02  0x00007f26102fe368  ReportCrash(FLinuxCrashContext const&)
[Callstack]  03  0x00000000004acbfb  EngineCrashHandler(FGenericCrashContext const&) [/home/felix/devel/UnrealEngine/Engine/Intermediate/Build/Linux/x86_64-unknown-linux-gnu/UE4Editor/Debug/Launch/Module.Launch.cpp, line 0]
[Callstack]  04  0x00007f2610307b98  PlatformCrashHandler(int, siginfo_t*, void*)
[Callstack]  05  0x00007f25ff98d5b0  /usr/lib/libc.so.6(+0x335b0) [0x7f25ff98d5b0]
[Callstack]  06  0x00007f261031abd9  FOutputDeviceLinuxError::Serialize(wchar_t const*, ELogVerbosity::Type, FName const&)
[Callstack]  07  0x00007f2610566edc  FOutputDevice::Logf(wchar_t const*, ...)
[Callstack]  08  0x00007f26104fdccb  FDebug::AssertFailed(char const*, char const*, int, wchar_t const*, ...)
[Callstack]  09  0x00007f25e0c126aa  /home/felix/devel/UnrealEngine/Engine/Binaries/Linux/libUE4Editor-OpenGLDrv-Linux-Debug.so(+0x1886aa) [0x7f25e0c126aa]
[Callstack]  10  0x00007f25e0b5749d  PlatformInitOpenGL()
[Callstack]  11  0x00007f25e0b7be81  FOpenGLDynamicRHIModule::IsSupported()
[Callstack]  12  0x00007f26075a0a9f  PlatformCreateDynamicRHI()
[Callstack]  13  0x00007f26075a05bd  RHIInit(bool)
[Callstack]  14  0x000000000042d907  FEngineLoop::PreInit(wchar_t const*) [/home/felix/devel/UnrealEngine/Engine/Intermediate/Build/Linux/x86_64-unknown-linux-gnu/UE4Editor/Debug/Launch/Module.Launch.cpp, line 0]
[Callstack]  15  0x000000000041f182  EnginePreInit(wchar_t const*) [/home/felix/devel/UnrealEngine/Engine/Intermediate/Build/Linux/x86_64-unknown-linux-gnu/UE4Editor/Debug/Launch/Module.Launch.cpp, line 0]
[Callstack]  16  0x0000000000484b97  GuardedMain(wchar_t const*) [/home/felix/devel/UnrealEngine/Engine/Intermediate/Build/Linux/x86_64-unknown-linux-gnu/UE4Editor/Debug/Launch/Module.Launch.cpp, line 0]
[Callstack]  17  0x00000000004af575  ./UE4Editor-Linux-Debug(main+0x28e5) [0x4af575] [/home/felix/devel/UnrealEngine/Engine/Intermediate/Build/Linux/x86_64-unknown-linux-gnu/UE4Editor/Debug/Launch/Module.Launch.cpp, line 0]
[Callstack]  18  0x00007f25ff97a790  /usr/lib/libc.so.6(__libc_start_main+0xf0) [0x7f25ff97a790]
[Callstack]  19  0x000000000041ee99  ./UE4Editor-Linux-Debug(_start+0x29) [0x41ee99]
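For anyone who wants to isolate this outside the engine: below is a minimal standalone sketch (not UE4 code) that requests a 3.2 core-profile context through SDL2, roughly what _PlatformCreateOpenGLContextCore attempts, and prints SDL's error on failure. The file name and build line are just examples; it assumes the SDL2 development files are installed.

/* glctx-test.c -- minimal sketch (not UE4 code): request a 3.2 core-profile
 * context via SDL2 and print SDL's error on failure.
 * Build: gcc glctx-test.c $(sdl2-config --cflags --libs) -o glctx-test
 * Run:   primusrun ./glctx-test
 */
#include <SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        printf("SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    /* Same version the engine asserts on. */
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

    SDL_Window *win = SDL_CreateWindow("glctx-test",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        64, 64, SDL_WINDOW_OPENGL | SDL_WINDOW_HIDDEN);
    if (!win) {
        printf("SDL_CreateWindow failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }
    SDL_GLContext ctx = SDL_GL_CreateContext(win);
    if (!ctx) {
        /* On broken Optimus setups this is where 'GLXBadFBConfig' appears. */
        printf("SDL_GL_CreateContext failed: %s\n", SDL_GetError());
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 1;
    }
    printf("OpenGL 3.2 context created OK\n");
    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}

If this fails with the same GLXBadFBConfig under primusrun, the problem is in the driver/primus stack rather than in UE4 itself.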

I am in the same boat.

I am also running Arch Linux, though my hardware is a bit different.
My setup is actually a VM running on a quad-core Haswell with 16 GB of RAM, passing through an AMD Radeon 7950.

Interestingly, I can run all the Linux UE4 demos and have no problems with Steam games (e.g. BioShock Infinite, Metro, etc.).

Without LD_PRELOAD I get an issue loading libGL; with LD_PRELOAD it fails to create the OpenGL context.
I also have a laptop running Arch Linux with an i7-2620M, 16 GB RAM, and HD 3000 + NVIDIA GT 520M with Optimus; that one loads up UE4Editor with no problem, albeit slowly.
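To see which libGL the editor binary resolves by default (LD_PRELOAD aside), a quick check with ldd can help; the binary name below matches the Debug build from the first post:

$ ldd ./UE4Editor-Linux-Debug | grep -i libgl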

Did you resolve this? Does anybody else know where to start digging for a solution?

Same problem here with Manjaro!
With LD_PRELOAD there's no issue loading libGL, but it then fails to create the OpenGL context.

My System:
Notebook | Manjaro Linux | Nvidia GTX 765

My Drivers are:

[johannes@manjaro ~]$ pacman -Qs nvidia
local/bumblebee 3.2.1-5
    NVIDIA Optimus support for Linux through Primus/VirtualGL
local/cuda 7.0.28-2
    NVIDIA's GPU programming toolkit
local/lib32-nvidia-utils 352.21-1
    NVIDIA drivers utilities (32-bit)
local/libvdpau 1.1-1
    Nvidia VDPAU library
local/linux318-nvidia 352.21-2 (linux318-extramodules)
    NVIDIA drivers for linux.
local/mhwd-nvidia 352.21-1
    MHWD module-ids for nvidia 352.21
local/mhwd-nvidia-304xx 304.125-1
    MHWD module-ids for nvidia 304.125
local/mhwd-nvidia-340xx 340.76-1
    MHWD module-ids for nvidia 340.76
local/nvidia-utils 352.21-1
    NVIDIA drivers utilities
local/opencl-nvidia 352.21-1
    OpenCL implemention for NVIDIA

Does anybody have an idea of how to fix that?

Same problem here.
It doesn't start on the NVIDIA card, nor on the Intel integrated graphics.

Notebook ASUS k55vm:
CPU: Intel Core i7-3610QM 3.3GHz
GPU: GeForce GT 630M
OS: ArchLinux x86_64
Kernel: linux 4.0.7-2-ARCH

nvidia 352.21-2
NVIDIA drivers for linux
bumblebee 3.2.1-10
NVIDIA Optimus support for Linux through VirtualGL
primus 20150118-2
Faster OpenGL offloading for Bumblebee

Try using NVIDIA PRIME instead of primusrun.

On my laptop I always use the NVIDIA driver (Ubuntu 15.04):

sudo prime-select nvidia
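You can confirm which profile is active afterwards with:

$ prime-select query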

Hm… NVIDIA PRIME doesn't seem to be supported on Manjaro Linux; there's no package available.
But thanks for the tip!

Also having this error on Arch Linux with an NVIDIA GTX 765M.
I even made a thread about this just a moment ago; better to keep this one alive.

I can confirm that in my case this is all related to glibc's static TLS handling.

Loading libGL with LD_PRELOAD fixes only part of the issue.

If you run UE4Editor with LIBGL_DEBUG=verbose in addition to LD_PRELOAD=/usr/lib/fglrx/libGL.so.1, you might find it's having trouble loading something else.

Without LIBGL_DEBUG=verbose, there is no information in the terminal that indicates anything else has failed.

In my case it was also unable to load fglrx_dri.so…

You can probably fix this by recompiling glibc with a bigger DTV_SURPLUS; in my case, for testing, I simply launched the editor with the following:

LD_PRELOAD="/usr/lib64/xorg/modules/dri/fglrx_dri.so /usr/lib/fglrx/libGL.so.1" ./Engine/Binaries/Linux/UE4Editor

Good luck…
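For reference on the DTV_SURPLUS approach mentioned above: the constant lives in glibc's internal headers (sysdeps/generic/ldsodefs.h in releases of this era); the exact location and default vary by version, so treat this as a sketch of the idea rather than a drop-in patch:

/* glibc, sysdeps/generic/ldsodefs.h -- DTV_SURPLUS is the number of spare
 * static-TLS slots the dynamic linker keeps for dlopen()'ed libraries that
 * use initial-exec TLS (the DF_STATIC_TLS case). Optimus/fglrx GL stacks
 * can exhaust these slots, which is why preloading libGL works around it.
 * Raising the value and rebuilding glibc buys more headroom: */
#define DTV_SURPLUS 32   /* default was 14 in contemporary glibc releases */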

You can try cherry-picking this: https://github.com/EpicGames/UnrealEngine/commit/dd37bfd78a69f418ff4875e7e72a80c202bd4981

It should fix engine modules having the STATIC_TLS flag set; there are no code changes, it's just a matter of recompiling a third-party lib.
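You can check whether a given module still carries the flag with readelf (the module name below is just an example; look for STATIC_TLS in the FLAGS entry):

$ readelf -d libUE4Editor-OpenGLDrv-Linux-Debug.so | grep STATIC_TLS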

Thank you so much! This solved the problem for me and my Radeon HD 7750.

What about optirun and the Bumblebee drivers? They are in the official NVIDIA graphics repository!
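optirun can be sanity-checked the same way as primusrun in the first post:

$ optirun glxinfo | grep 'OpenGL version'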