
3D Theory & Graphics / Problem with glDrawRangeElementsEXT
Archive Notice: This thread is old and no longer active. It is here for reference purposes. This thread was created on an older version of the flipcode forums, before the site closed in 2005. Please keep that in mind as you view this thread, as many of the topics and opinions may be outdated.

April 30, 2005, 03:45 PM

I tried to improve the performance of my terrain engine by using glDrawRangeElementsEXT, but I have a problem.

The terrain is divided into sectors, and each sector contains 1089 vertices (33*33).

The vertices are stored in a VBO sector by sector, and each sector has its own array of indices.

Each sector is rendered in the same way, but some of them are not rendered correctly.

Here is a picture:

The sectors which have problems use indices between 32768 and 65536; all sectors before and after that range work fine.

I don't know what happens for those values.

The strangest thing is that if I use glDrawElements to render those sectors, it works; and if I use glDrawRangeElementsEXT with the end parameter set to start + 50000 (for example), it works too. The correct value is start + 1089.

I use unsigned int indices, my graphics card is a 6800GT, and I use the latest NVidia drivers.

I searched on Google but this problem is not reported anywhere. I am wondering if someone can help me?

Jari Komppa

May 01, 2005, 03:45 AM

One thing that comes to mind offhand is whether you're using shorts or unsigned shorts as indices.. although I don't know if that makes any difference.


May 01, 2005, 04:25 AM

Stranger still: I just discovered that it works fine if I do not use the VBO.


May 01, 2005, 04:55 AM

I found the solution!!! The bug was caused by a driver optimization :(

I read the following in the NVidia VBO paper:

  1. If the specified range can fit into a 16-bit integer, the driver can optimize the format of indices to pass to the GPU. It can turn a 32-bit integer format into a 16-bit integer format. In this case, there is a gain of 2X.

I think there is a bug in the drivers and the optimization uses a SHORT instead of an UNSIGNED SHORT, or something like that.

So I did the optimization myself: I call DrawRangeElements with UNSIGNED_SHORT when my indices are less than 65536 and with UNSIGNED_INT otherwise, and it works fine!


May 01, 2005, 07:03 AM

It might be a bug, though all it does is convert from 32-bit to 16-bit. The problem is that you didn't say what your index type is: unsigned short or unsigned int? If you are using unsigned short, you can't fit more than 64k indices. Also, don't make buffers larger than 64k anyway. You can do it, but it's usually better to break them apart. If you can, use triangle strips or fans.


May 01, 2005, 09:28 AM

When I had the problem, my indices were all UNSIGNED_INT, and it seems there is a problem when the driver converts them to 16-bit.

When I do not use a VBO, the problem doesn't appear, so I read a doc about VBOs, and at the end there was something about that conversion. I think glDrawRangeElements only converts indices when it is used with a VBO (for vertices).

Now my indices are 16-bit when they fit in 16 bits and 32-bit otherwise, so the driver doesn't try to convert them, and it works fine :)

In fact the conversion works for index < 32768 but not for 32768 < index < 65536. I don't know why; I think it's a bug in the NVidia driver.

This thread contains 6 messages.