Re: [Chicken-users] Unsigned int conversion and OpenGL
From: felix winkelmann
Subject: Re: [Chicken-users] Unsigned int conversion and OpenGL
Date: Sat, 7 Jun 2008 01:02:46 +0200
On Tue, Jun 3, 2008 at 9:42 AM, Daniel Dewey <address@hidden> wrote:
> How do you cast or convert an integer into an unsigned integer in Chicken?
>
> I'm learning OpenGL using Chicken and I'm having trouble getting textures to
> work. I have a texture in memory and am getting the address with
> pointer->address, but this returns a regular signed int, which gl:BindTexture
> rejects with "bad argument type - not an unsigned integer".
>
I assume the address returned is negative, right? This is actually a bug:
"pointer->address" should return an unsigned number. Isn't the second argument
to glBindTexture a non-pointer argument? If so, you can just use the negated
address (if I understand correctly, you only need a unique value for the
texture name - but I'm not an OpenGL expert).
Another option is to convert the value in C:

    (define pointer->uaddr
      (foreign-lambda* unsigned-integer ((c-pointer x))
        "return((unsigned int)x);"))
cheers,
felix