Grasshopper:

Texture Compression - PixelInternalFormat

I would like to use this texture compression:
(http://oss.sgi.com/projects/ogl-sample/registry/EXT/texture_compression_...)

COMPRESSED_RGB_S3TC_DXT1_EXT

But there seems to be no enum for it?

Additionally, I checked the extension string for "GL_EXT_texture_compression_s3tc" and I do have it, but

GL.GetInteger(GetPName.NumCompressedTextureFormats, out numFormats);

gives me 0 formats.
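
In full, the check looks roughly like this (a simplified sketch; the StringName/GetPName member names are what I'd expect from the OpenTK bindings and may differ between versions):

// Sketch: check the extension string for S3TC, then ask how many compressed formats the driver reports.
string extensions = GL.GetString(StringName.Extensions);
bool hasS3tc = extensions.Contains("GL_EXT_texture_compression_s3tc");   // true on my machine

int numFormats;
GL.GetInteger(GetPName.NumCompressedTextureFormats, out numFormats);     // comes back as 0 here
Console.WriteLine("S3TC extension: {0}, compressed formats: {1}", hasS3tc, numFormats);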

Any ideas?


Comments

Grasshopper:
the Fiddler wrote:

S3TC is not among those - is there any good reason to use that instead of the core formats?

No, I just wanted to force texture compression.
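
(For context, "forcing" compression here just means uploading ordinary RGBA data with one of the generic core compressed internal formats and letting the driver compress it - roughly the sketch below, assuming PixelInternalFormat.CompressedRgba and an array overload of GL.TexImage2D exist in the bindings:)

// Sketch: upload plain RGBA pixels but request a generic compressed internal format,
// so the driver picks a compressed representation itself.
int textureId;
GL.GenTextures(1, out textureId);
GL.BindTexture(TextureTarget.Texture2D, textureId);

byte[] pixels = new byte[256 * 256 * 4];   // placeholder RGBA image data
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.CompressedRgba,
    256, 256, 0, PixelFormat.Rgba, PixelType.UnsignedByte, pixels);
// (One could then query GL_TEXTURE_COMPRESSED via GL.GetTexLevelParameter to see
//  whether the driver actually compressed it.)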

But there seems to be a different problem.

I tried it in C++ and in C#/OpenTK and
glGetIntegerv(GL_NUM_COMPRESSED_TEXTURE_FORMATS, &numFormats);
gives me 0 formats.

But at http://www.khronos.org/opengles/documentation/opengles1_0/html/glGetInte... it says:
GL_NUM_COMPRESSED_TEXTURE_FORMATS

params returns one value, the number of supported compressed texture formats. The value must be at least 10.

Running on Ubuntu 9.04, ATI Radeon HD4850, Catalyst, OpenGL 2.1.8575

the Fiddler:

You need to pass GL_COMPRESSED_TEXTURE_FORMATS, not GL_NUM_COMPRESSED_TEXTURE_FORMATS:

glGetInteger wrote:

GL_COMPRESSED_TEXTURE_FORMATS

params returns GL_NUM_COMPRESSED_TEXTURE_FORMATS values, the supported compressed texture formats. See glCompressedTexImage2D and glCompressedTexSubImage2D.

glGetError should indicate an error of GL_INVALID_ENUM.
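
In OpenTK terms, the two queries side by side would look roughly like this (a sketch; assuming GetPName.CompressedTextureFormats and the array overload of GL.GetInteger are available):

// Sketch: first query how many compressed formats there are, then fetch their token values.
int numFormats;
GL.GetInteger(GetPName.NumCompressedTextureFormats, out numFormats);

int[] formats = new int[numFormats];
if (numFormats > 0)
    GL.GetInteger(GetPName.CompressedTextureFormats, formats);

// An invalid enum would show up here as ErrorCode.InvalidEnum.
Console.WriteLine("GL error: {0}", GL.GetError());
foreach (int format in formats)
    Console.WriteLine("Compressed format: 0x{0:X4}", format);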

Grasshopper:
Quote:

GL_NUM_COMPRESSED_TEXTURE_FORMATS

params returns one value, the number of supported compressed texture formats. The value must be at least 10. See glCompressedTexImage2D and glCompressedTexSubImage2D.

Why would this be incorrect? GL_COMPRESSED_TEXTURE_FORMATS gives you the format ids, but I want the number of available formats.

the Fiddler:

Lack of sleep, sorry :)

What video card / drivers are you using?

Grasshopper:

Same with me I guess - that's the "wrong" OpenGL documentation (the page I linked is for OpenGL ES) ...

But I found some sample code that should work:
http://www.gamedev.net/community/forums/topic.asp?topic_id=491316

martinsm:

I don't know why, but on my laptop GL_NUM_COMPRESSED_TEXTURE_FORMATS returns 0 in an OpenGL 3.0 context. In an "old" 2.1 context it returns 3 - DXT1, DXT3 and DXT5. This happens not only with OpenTK; it's the same with my C++ code.
This is on Vista x64, Nvidia GF8400M GS, driver 182.06.

the Fiddler:

Reading page 119 of the specs (pdf), it seems that the query should return > 0:

specs wrote:

In addition to the specific compressed internal formats listed in table 3.14, the GL provides a mechanism to obtain token values for all such formats provided by extensions. The number of specific compressed internal formats supported by the renderer can be obtained by querying the value of NUM_COMPRESSED_TEXTURE_FORMATS.

Everything from the Geforce 2 and up should support the DXT "specific compressed internal" formats.

Looks like a driver quirk; I'll check what my ATI card returns tomorrow.

Grasshopper:

I'm using Ubuntu here with the newest Catalyst driver (OpenGL 2.1 context). The card is a Radeon HD4850.
I tried it on a Windows XP laptop and got back "5" (some mobile ATI chip, also using OpenTK).

Inertia:
the Fiddler wrote:

See page 125 of the specs for the table of supported (core) compressed formats. S3TC is not among those - is there any good reason to use that instead of the core formats?

I really think we should add to OpenTK.Graphics.PixelInternalFormat:

COMPRESSED_RGB_S3TC_DXT1_EXT 0x83F0
COMPRESSED_RGBA_S3TC_DXT1_EXT 0x83F1
COMPRESSED_RGBA_S3TC_DXT3_EXT 0x83F2
COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3

Regardless of whether this is in core or not.

Pro:

  • Other extensions interact with EXT_texture_compression_s3tc.
  • The primary reason why it's not core is that the algorithm is patented and OpenGL seems to have no license for it.
  • Every DirectX 6 or later graphics card supports it, and the ATI, Intel, Nvidia and S3 drivers all support it.
  • Due to the number of games using it, it will most likely still be present on cards that are built in the year 2015.
  • The enums are there to make the programmer's job easier and to reduce the need to crawl through documentation. They are no guarantee that things will work (GL.GetError is your friend).

Con:

  • Uh ... you tell me. It's 4 tokens.
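
In the meantime the token values can simply be cast - a rough sketch, assuming GL.CompressedTexImage2D takes a PixelInternalFormat and has an array overload (LoadDxt1Data is a hypothetical helper for pre-compressed DXT1 blocks):

// Sketch: use the raw S3TC token by casting it to PixelInternalFormat.
const int CompressedRgbS3tcDxt1Ext = 0x83F0;   // from the table above

int width = 256, height = 256;
byte[] dxt1Data = LoadDxt1Data();              // hypothetical helper, returns DXT1 block data
GL.CompressedTexImage2D(TextureTarget.Texture2D, 0,
    (PixelInternalFormat)CompressedRgbS3tcDxt1Ext,
    width, height, 0, dxt1Data.Length, dxt1Data);
// As said above, this is no guarantee the format is supported - GL.GetError is your friend.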