[Solved] Failed to create a Vertex Buffer

I’m getting the following error in a popup box on startup.
Failed to create a Vertex Buffer:/builddir/Positech/GSB/library/GVertexBuffer.cpp 146

The console outputs the following single line:
Starting up GSB for Linux, Build: 2011-12-23T15:20:26

Contents of $HOME/.positech/GSB/build.log:
Starting up GSB for Linux, Build: 2011-12-23T15:20:26Z

No error.log was produced.

The game does open a window, but it stays completely blank. When I click the “Close” button in the popup box, the game exits normally (no further errors are produced, nor does it segfault). I tested on both 32-bit and 64-bit Ubuntu 11.10, fully updated. uname -a of the 64-bit install: “3.0.0-15-generic #24-Ubuntu SMP Mon Dec 12 15:23:55 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux”. The 32-bit install is also at 3.0.0-15. The system in question is a netbook running on an Intel Atom N450. Not sure if it matters, but the 64-bit install is actually Kubuntu (KDE) and the 32-bit install is actually Lubuntu (LXDE). I tried the Windows build from the Humble Bundle through Wine and that does run (I played through the first 4 battles and built some custom ships).

This exact same error happens for me on 32-bit Ubuntu 11.04 running on a Cr-48.

On startup:

Starting up GSB for Linux, Build: 2011-12-23T15:20:16Z

uname -a:

2.6.32.26+drm33.12 #1 SMP Thu Feb 10 00:41:33 PST 2011 i686 i686 i386 GNU/Linux

Yeah… that was a check I added in the latest build due to issues with some OpenGL drivers… Does the Cr-48 run Intel graphics?

Basically, the OpenGL driver is failing to map a created vertex buffer via the glMapBuffer OpenGL call. The buffer was successfully created in previous calls to glGenBuffers, glBindBuffer, and glBufferData. But glMapBuffer hands me a nice NULL pointer, with no GL error code.

So, this is looking like a bug in the Intel graphics driver.
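
In code terms, the failing sequence boils down to something like this (a simplified sketch, not the actual GSB source; FatalError is just an illustrative stand-in for whatever raises that popup):

// Simplified sketch of the failing sequence (illustrative, not the
// actual GSB source). Every call up to the map succeeds.
const GLsizeiptr size = 15000;   // arbitrary buffer size
GLuint id = 0;
glGenBuffers(1, &id);                                         // succeeds
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, id);                    // succeeds
glBufferData(GL_ELEMENT_ARRAY_BUFFER, size, NULL, GL_STREAM_DRAW);  // succeeds

void* mapped = glMapBuffer(GL_ELEMENT_ARRAY_BUFFER, GL_READ_WRITE);
if (mapped == NULL)
{
    // glGetError() still reports GL_NO_ERROR here, yet the map failed;
    // this is the check that produces the popup in the report above.
    FatalError("Failed to create a Vertex Buffer", __FILE__, __LINE__);
}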

Could you provide me with some more info so I can file a bug report against the Intel graphics driver? I’m not sure exactly what they need, but my best guess would be a minimal test case.
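
For what it’s worth, here’s my guess at what a minimal test case might look like: an untested sketch using freeglut and GLEW, based on the calls you described (the buffer size is arbitrary):

#include <GL/glew.h>
#include <GL/freeglut.h>
#include <cstdio>

// Untested sketch of a minimal test case: create a GL context, create a
// buffer object as described above, and report what glMapBuffer returns.
// Build guess: g++ mapbuffer_test.cpp -lGLEW -lglut -lGL
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutCreateWindow("glMapBuffer test");

    if (glewInit() != GLEW_OK)
    {
        std::fprintf(stderr, "glewInit failed\n");
        return 1;
    }

    GLuint buf = 0;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buf);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, 16 * 1024, NULL, GL_STREAM_DRAW);  // arbitrary size

    void* ptr = glMapBuffer(GL_ELEMENT_ARRAY_BUFFER, GL_READ_WRITE);
    std::printf("glMapBuffer returned %p, glGetError() = 0x%x\n", ptr, glGetError());
    if (ptr != NULL)
        glUnmapBuffer(GL_ELEMENT_ARRAY_BUFFER);
    return 0;
}

If that reproduces the NULL return on my machine, it should be enough to attach to the bug report.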

Edit:
I did a trace of the GL calls using BuGLe and it doesn’t show any of the calls you wrote about. Maybe the code is taking a different path than you think it is?

I have tried to attach the bugle.log file, but apparently the extension “log” is not allowed. So I hosted it on pastebin instead: http://pastebin.com/tiWL0Q8i If I can provide any more information, please let me know.

Nice little program; I’d not heard of that one before… I just ran it locally and my trace is nearly identical (except for more extensions, as I have an NVIDIA card). Anyway, mine has a few more glXGetProcAddressARB calls at the end and then this… Those last two calls you have are simply the “exiting” code from SDL: glXMakeCurrent(0xXXXXX, 0x000000, NULL) and glXDestroyContext(0xXXXXXX, 0xXXXXX).

[INFO] trace.call: glGenBuffers(1, 0x1fddd34 -> { 1 })
[INFO] trace.call: glBindBuffer(GL_ARRAY_BUFFER, 1)
[INFO] trace.call: glBufferData(GL_ARRAY_BUFFER, 160000, NULL, GL_STREAM_DRAW)
[INFO] trace.call: glBindBuffer(GL_ARRAY_BUFFER, 0)
[INFO] trace.call: glGenBuffers(1, 0x1fdb94c -> { 2 })
[INFO] trace.call: glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 2)
[INFO] trace.call: glBufferData(GL_ELEMENT_ARRAY_BUFFER, 15000, NULL, GL_STREAM_DRAW)
[INFO] trace.call: glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0)
[INFO] trace.call: glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 2)
[INFO] trace.call: glMapBuffer(GL_ELEMENT_ARRAY_BUFFER, GL_READ_WRITE) = 0x2006f40
[INFO] trace.call: glUnmapBuffer(GL_ELEMENT_ARRAY_BUFFER) = GL_TRUE
[INFO] trace.call: glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0)

Thanks for looking into it some more. Two questions spring to mind:

  1. Are those extra glXGetProcAddressARB calls at the end just due to the additional extensions of the NVIDIA card, or are they also calls that are missing from my log but should be present in a proper run of the program?

  2. Since my log doesn’t show the whole glGenBuffers etc. bit, maybe that’s where something is going wrong? Is there anything I can do to help you locate the source of the problem?

Here is the full dump of the trace.

dl.dropbox.com/u/3447934/bugle.log.bz2

I don’t know if it matters, but I noticed the following in the logs:

My log:
[INFO] trace.call: glXGetProcAddressARB(“glGenBuffersARB”) = 0x7f47ff543ca0

Your log:
[INFO] trace.call: glXGetProcAddressARB(“glGenBuffers”) = 0x7f6f49e7c970

[INFO] trace.call: glXGetProcAddressARB(“glGenBuffersARB”) = 0x7f6f49e7ca00

Notice the absence of the non-ARB version of glGenBuffers in my log.

The same goes for glBindBuffer, glBufferData, glMapBuffer and glUnmapBuffer.

I did notice that as well…
Can you post the output of glxinfo?

glxinfo output: pastebin.com/EYmHzcN2

Dang, that explains the issue there… The code is using OpenGL 2.1 (where glGenBuffers and friends are “native” functionality, not an ARB extension). Also, GSB makes use of GLSL 1.20 shaders, which most likely won’t work under a 1.4 GL implementation.
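
If you want to see what the driver exposes from inside a program rather than via glxinfo, something like this would do it (a tiny sketch, assuming a current GL context and an initialized GLEW; not code from the game):

#include <GL/glew.h>
#include <cstdio>

// Sketch: report what the driver actually exposes, rather than
// assuming OpenGL 2.1. Call after context creation and glewInit().
static void ReportGLCaps()
{
    const char* gl   = (const char*)glGetString(GL_VERSION);
    const char* glsl = (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION);
    std::printf("GL %s, GLSL %s\n", gl ? gl : "(null)", glsl ? glsl : "(null)");
    if (!GLEW_VERSION_2_1)
        std::printf("driver does not report OpenGL 2.1\n");
}

On older drivers the GLSL query may come back NULL (the enum arrived with GL 2.0), which is itself a giveaway.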

I’m pretty sure I’ve played games that use GLSL 1.20 shaders on this laptop before. Maybe it’s an extension? It does list GL_ARB_fragment_program, GL_ARB_fragment_shader, GL_ARB_shader_objects, GL_ARB_vertex_program, and GL_ARB_vertex_shader, and I’m pretty sure standard GL 1.4 doesn’t do programmable shaders at all.

Since you’re polling for the ARB versions of those commands (glGenBuffers etc.) anyway, could you use them as a fallback?

Well, actually, I’m not polling them; I think it’s the GLEW library that is polling everything… The actual renderer code just blindly assumes OpenGL 2.1.
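
For the record, a fallback could key off GLEW’s flags, something like this (an untested sketch of the idea; the actual renderer doesn’t do this):

#include <GL/glew.h>

// Untested sketch of an ARB fallback: use the core GL 1.5 buffer entry
// points where available, otherwise route the same work through the
// GL_ARB_vertex_buffer_object functions. Not what GSB currently does.
static void* CreateAndMapIndexBuffer(GLsizeiptr size)
{
    GLuint id = 0;
    if (GLEW_VERSION_1_5)   // buffer objects are core functionality
    {
        glGenBuffers(1, &id);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, id);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER, size, NULL, GL_STREAM_DRAW);
        return glMapBuffer(GL_ELEMENT_ARRAY_BUFFER, GL_READ_WRITE);
    }
    if (GLEW_ARB_vertex_buffer_object)   // extension-only drivers
    {
        glGenBuffersARB(1, &id);
        glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, id);
        glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB, size, NULL, GL_STREAM_DRAW_ARB);
        return glMapBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, GL_READ_WRITE);
    }
    return NULL;   // no buffer object support at all
}

The shaders are the bigger problem, though; there’s no such easy fallback for GLSL 1.20 on a 1.4 driver.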

According to the specs, yes, GLSL was added as an extension to OpenGL 1.4, but the GLSL 1.20 spec wasn’t finalized until OpenGL 2.1.

According to software.intel.com/en-us/article … nce-guide/ it supports OpenGL 2.0 and Shader Model 3.0. It’s listed under “3D Features Comparison”; my processor is an Atom N450. Is that useful information? :)

Also, it does work perfectly via Wine (I’ve played quite a few matches already), so the hardware is capable of running it.

I got it to work! :)

Apparently, the chip really does support OpenGL 2.0 and GLSL 1.20. See the new glxinfo here: http://pastebin.com/Ghdrw9ru.

What I did is the following:

  1. Enable the xorg-edgers ppa (see note 1 first): https://launchpad.net/~xorg-edgers/+archive/ppa
  2. Enable the xorg-edgers SNA addon ppa (see notes 1 and 2 first): https://launchpad.net/~sarvatt/+archive/intel-sna
  3. sudo apt-get update
  4. sudo apt-get dist-upgrade
  5. sudo apt-get install libgl1-mesa-dri-experimental
    ** From here, follow either the “a” or the “b” steps, depending on your install (64-bit = a, 32-bit = b)
    6a. sudo mkdir /usr/lib/x86_64-linux-gnu/dri-backup
    6b. sudo mkdir /usr/lib/i386-linux-gnu/dri-backup
    7a. sudo mv /usr/lib/x86_64-linux-gnu/dri/i915_dri.so /usr/lib/x86_64-linux-gnu/dri-backup/i915_dri.so
    7b. sudo mv /usr/lib/i386-linux-gnu/dri/i915_dri.so /usr/lib/i386-linux-gnu/dri-backup/i915_dri.so
    8a. sudo cp /usr/lib/x86_64-linux-gnu/dri-alternates/i915_dri.so /usr/lib/x86_64-linux-gnu/dri/i915_dri.so
    8b. sudo cp /usr/lib/i386-linux-gnu/dri-alternates/i915_dri.so /usr/lib/i386-linux-gnu/dri/i915_dri.so
    ** From here the instructions are the same again
  9. Reboot
  10. Play the game

Notes:

  1. Please read the instructions on those pages thoroughly. I could have simply posted copy-paste command-line instructions, but the people running the PPAs specifically request that you don’t do that and link to their pages instead.
  2. I’m not sure if SNA is really necessary, and the PPA page says it can cause issues with Unity, so if you’re using plain Ubuntu, please don’t enable the SNA PPA. SNA stands for Sandy Bridge New Acceleration, which is supposed to improve hardware acceleration on Intel GPUs. Don’t be fooled by the name: older architectures benefit as well, sometimes even more than Sandy Bridge does.

Troubleshooting:
Please run the following command in a terminal:
glxinfo
Copy the output, go to pastebin.com, and paste it there. Then copy the link and post it here along with a description of your problem, and I will try to help you.

Sweet, I’m glad you got it working, and that you found there are updated drivers for that chip that support GL 2.0.