Got the materials (mainly textures) working. This Diddy model seems to be fairly low-poly. Does anyone know where to find higher-poly models?
It's not that it's low poly, it's that you gotta import normals; that'll make it look a lot smoother :P
Oh damn. Never thought about that xD
Edit history:
Antidote: 2015-05-02 11:20:19 am
Antidote: 2015-05-02 11:18:19 am
Quote from Parax:
Ermm, I don't code in C# so maybe I don't understand how this works, but I looked up the documentation for ToSingle and it seems to do exactly what I said. https://msdn.microsoft.com/en-us/library/06d3c2st(v=vs.110).aspx

What I meant is: for example, if you feed it ToSingle(0x3F800000), you will get a float value of 1065353216.0, not 1.0. That's not due to endianness, that's the decimal equivalent of that hex number. The way I understand it, the way Jesse was trying to read the floats is equivalent to reading them into an int and then casting it to a float.

Yes, exactly, but if you were to give that value to an int you would get 32831 rather than 1065353216; try it, it's rather interesting how it works. You also have to remember that you're using the file's STORED hex representation, therefore you would be giving it 0x0000803F rather than the expected 0x3F800000.
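A quick C# illustration of the difference under discussion (just a sketch, not anyone's actual tool code): Convert.ToSingle converts the number, while BitConverter reinterprets the bytes.

int bits = 0x3F800000;
float numeric = Convert.ToSingle(bits);
// numeric == 1065353216.0f: the int's decimal value, converted numerically
byte[] raw = BitConverter.GetBytes(bits);
float reinterpreted = BitConverter.ToSingle(raw, 0);
// reinterpreted == 1.0f on a little-endian machine: the same 32 bits, read as IEEE 754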
Edit history:
Antidote: 2015-05-02 11:30:44 am
Antidote: 2015-05-02 11:26:10 am
Antidote: 2015-05-02 11:18:13 am
Also, unless Microsoft horribly botched casting, a straight cast should get you the proper value; my understanding of Convert.ToSingle now is more along the lines of this:
float newVal = 1.0f * val;

Where multiplying 1.0f by the value inserts the default exponent into the value, giving you a proper float representation of the current value.
http://stackoverflow.com/a/13044715
That's what I found on the matter.. Typeless bytes are your friend! Only load as float once the swap is complete
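In C# that typeless-bytes approach might look something like this; a minimal sketch, assuming reader is a BinaryReader over the model file:

byte[] raw = reader.ReadBytes(4);            // big-endian bytes, exactly as stored on disc
if (BitConverter.IsLittleEndian)
    Array.Reverse(raw);                      // swap while the bytes are still typeless
float value = BitConverter.ToSingle(raw, 0); // only now interpret them as a float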
so basically C# (.NET?) works in little-endian like x86, and for some reason (guessing architecture) the raw poly data for DKCR is in big-endian?
Edit history:
jackoalan: 2015-05-02 02:07:38 pm
indeed.. PowerPC is big-endian by default..

It's technically switchable to little-endian, but system integrators (like Nintendo) are expected to pick one and stick with it.. Apple also configured all their PowerPC machines as big-endian in the good ole 1994 days, so it sorta became the popular choice
So i am trying to get the normals right, but i am not sure if blender is fucking with me or i am doing something wrong. The .obj i create and import into blender looks like this:


Now this looks okay to me. Smoother than the Diddy Model above (which i could not find again for this screenshot :P).

This is how it looks textured:


Still looks okay to me. The odd thing happens when i render it (the Donkey model had black "artifacts" all over his body and was darkened all around):


Is there something i am doing wrong, or is there something i need to add to be able to render it in Blender? I already tried moving the light around (that's not the problem).
Yeah, looks like the normals are incorrect. How are you reading them?
Edit history:
jackoalan: 2015-05-02 01:10:58 pm
turn on normal visualization within blender.. Edit Mode -> Properties Panel -> Mesh Display -> Per-Vertex-Normals
also, could you post a simple .obj dump to Gist or Pastebin? Sometimes you can pick out numeric discrepancies from the data itself
you also seem to have a single light source, from what I'm guessing is in front of the bird, so it'll render completely black on the sides/back where no light is hitting. You could try adding an area lamp and setting it to some low value to give the entire bird some visibility.
Edit history:
jackoalan: 2015-05-02 01:27:15 pm
there's definitely something wrong with the normal data.. the Blender Render (last picture) shows an out-of-range normal magnitude going on
If you're dividing the short normals by 0x2000 you might want to try 0x8000 instead... I know 0x2000 is correct for the UVs, but normals are usually supposed to be in the -1 to 1 range, so it might be that that's the value to use for them. Also, in the Metroid Prime series the normals can be either shorts or floats, and there's usually a flag indicating which one. I'm not sure if that is the case in DKCR as well, since I only remember ever seeing shorts, but keep an eye out for that anyway.
Yeah. I divide the shorts by 0x8000, because 0x2000 would produce far too high values (4.xxx). I also noticed that even after dividing by 0x8000 some normals were over 1.0, so i just did some quick and dirty checks (like with the vertex positions): if they are >= 1.0, i subtract 2.0f from them (which could be wrong!).

This is how the normals look in Blender:


And this is after adding another lamp right before the bird:


Looks a little bit better, but there still seems to be something off with the normals.
A dump of the created .obj file can be found here: http://pastebin.com/1A4GhZH6

Thanks a lot for all your help!
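Incidentally, that subtract-2.0 check is doing two's-complement sign handling by hand, so it should give the same numbers as reading the shorts as signed in the first place. A worked C# example (the value is made up for illustration):

ushort raw = 0xC000;                // example component as stored in the file
float hack = raw / 32768f;          // unsigned read: 1.5
if (hack >= 1f) hack -= 2f;         // the quick-and-dirty fix: -0.5
float direct = (short)raw / 32768f; // a signed read gives -0.5 directly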
Edit history:
Aruki: 2015-05-02 03:10:06 pm
Aruki: 2015-05-02 01:40:00 pm
Definitely remove that check. It's possible for a component of a normal to be >= 1.0. Being normalized just means the magnitude of the vector is 1.0; that usually means that all the components are between -1 and 1, but it's not always the case. [edit: never mind, I'm dumb, this is wrong]

If you mean they're showing up in the range of 0-2 and there's none below 0, though, then that sounds like you're reading them as unsigned instead of signed... fixing that should do the trick.
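A minimal sketch of that signed, big-endian read in C# (ReadInt16BE is a hypothetical helper, assuming the data comes through a BinaryReader):

static short ReadInt16BE(BinaryReader r)
{
    byte[] b = r.ReadBytes(2);
    return (short)((b[0] << 8) | b[1]); // the file stores the high byte first
}

float component = ReadInt16BE(reader) / 32768f; // signed short scaled into [-1, 1)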
pressing ctrl+n and maybe changing the "inside" checkbox that pops up could give you a quicker result, but it will normalize every normal's value to whatever blender thinks is appropriate. but it's efficient
hmm, those normals on the beak have the incorrect sign.. are you certain you're pairing positions/normals correctly for the entire mesh?

Also, ctrl-n is a destructive operation and will replace Retro's original data with what blender thinks is correct.
Edit history:
wowsers: 2015-05-02 01:58:26 pm
that was what I meant, but it's usually an easy way to tell what's wrong
[edit]: I looked at the normals in blender and all I can say is, that's a mess I'm not willing to tackle

messing around with cycles gives a nice wonky paper look...
Quote from Parax:
Definitely remove that check. It's possible for a component of a normal to be >= 1.0. Being normalized just means the magnitude of the vector is 1.0; that usually means that all the components are between -1 and 1, but it's not always the case.


hm, are you sure about that? magnitude is sqrt(x^2 + y^2 + z^2), yes?
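For reference: yes, magnitude is sqrt(x^2 + y^2 + z^2), so if any single component's absolute value were above 1, that component squared would already push the sum past 1. A normalized vector's components therefore all lie in [-1, 1]. A tiny C# check (illustrative only):

static float Magnitude(float x, float y, float z)
{
    return (float)Math.Sqrt(x * x + y * y + z * z);
}
// Magnitude(0.6f, 0.8f, 0f) is 1 (up to float rounding), with every component inside [-1, 1]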
So i did read the shorts as unsigned shorts, which seems to be what caused the wrong value range. But reading them as signed shorts also does not produce the right result (I did swap bytes 1 and 2, but also tried without swapping). In fact it almost looks the same.

Here is the current .obj dump:
http://pastebin.com/PsuMDhNx
it looks the same to me, newer model to the left and older to the right:
Quote from DJGrenola:
Quote from Parax:
Definitely remove that check. It's possible for a component of a normal to be >= 1.0. Being normalized just means the magnitude of the vector is 1.0; that usually means that all the components are between -1 and 1, but it's not always the case.


hm, are you sure about that? magnitude is sqrt(x^2 + y^2 + z^2), yes?


oops, yeah, I think you might be right. My bad then.

puelo, can you post the code you're using to read the normals?