Bytes IT Community

casting problem (int * to int*)

Hi,

I've got this OpenGL/C++ code:

<code>
void render(CMesh *mesh){
    ...
    float *pVertices;
    int *pIndices;

    // WORKS FINE
    pVertices = (GLfloat *)mesh->getVertices(); // (A)
    // PROBLEMS HERE!!!!
    pIndices = (int *)mesh->getFaceIndices(); // (B)
    ...
}

class:

int *CMesh::getFaceIndices() const{
    return faceIndices;
}

float *CMesh::getVertices() const{
    return vertices;
}

class CMesh
{
    ...
    float *vertices = new float[numVertices * 3];
    int *faceIndices = new int[numFaces * 3];
    ...
};
</code>

The two arrays get filled at runtime. For the OpenGL rendering part I need a pointer to a float array containing the vertices, and a pointer to an int array containing the indices of the triangles.

To get them from my mesh object, I cast them in the render function. The float conversion (A) works: "pVertices" points to the same data as the member variable "vertices" of the mesh object. But when I try to cast the pointer to my integer array (B), data gets lost. When I loop through pIndices I get less data: only every 3rd integer of the original array. The last 3 values are correct.

This must be a casting problem. Why does it work with floats but not with int? I tried both (GLint *) and (int *) casts (they seem to be the same size anyway, according to sizeof())... no success. Any ideas?

TIA
ghostdog
Jul 19 '05 #1
2 Replies


"ghostdog" <su***@gmx.at> wrote in message
news:e6**************************@posting.google.com...
....
| //PROBLEMS HERE!!!!
| pIndices = (int *)mesh->getFaceIndices(); // (B)
....
| this must be a casting problem. why does it work with floats, but not
| with int?
....

I do not think that this is a casting problem. In any case, the code
that you have posted does not contain a problem by itself.

However, there are many caveats when dealing with this kind of indexed
vertex buffer: for example, an index value will (typically) have to be
multiplied by 3 to get the first coordinate of a vertex within your
float array.
... but this is getting OT here.

My suggestion would be to post a more complete code example, and to
post it on a technology-specific forum (e.g. comp.graphics.*, or a
forum supported by an OpenGL vendor).
Regards,
--
http://ivan.vecerina.com
Jul 19 '05 #2

On 15 Sep 2003 07:30:11 -0700 in comp.lang.c++ :

If your functions return "float*" and "int*" respectively,
and you are assigning those values to "float*" and "int*"
variables,

I'm curious: why cast at all?
Jul 19 '05 #3

This discussion thread is closed; replies have been disabled.