| Saturn Texture Coordinate Test |
| RockinB - Jul 1, 2005 |
| M3d10n | Jul 23, 2005 |
| Ah, I get it. You're using the UV information to generate new square/rectangular textures that can be applied to each sprite. I think a few (very few) Saturn games might have used such a method for a few models, but using some sort of pre-processing tool, never in real time. Actually... since in real-world scenarios very few objects would ever need the UVs to be regenerated in real time (environment-mapped objects), you could get better-looking results by creating an offline tool out of your code, thus being able to generate higher-quality textures. You could use bilinear/bicubic interpolation when generating the final textures. |
| RockinB | Jul 26, 2005 |
Actually I think a lot of games used the "Texture slice" function of 3DEditor's tool "Texture List", which performs a very similar operation as a precomputation step. The Sonic R effect is done in real time, but as mentioned in the posts before, it's probably rendered by software only.

Quote (M3d10n @ Sun, 2005-07-24 @ 03:01 PM):
"Actually... since in real-world scenarios, very few objects would ever need the UVs to be regenerated in real time (environment mapped objects), you could get better looking results by creating an offline tool out of your code, thus being able to generate higher quality textures. You could use bilinear/bicubic interpolation when generating the final textures."

The 3DEditor can do that, but you're right, I could make a similar (maybe even superior) tool from my code. Remember the sky reflections in Daytona (not CCE) car windows? This one would be much more realistic and wouldn't take a lot of CPU time.

WIP info: I could make a speedup in the computation (texture coordinate mapping). The 3D (accurate, software only) sphere mapping is again using the slave, but the result is not correct (slave or not). I found a general problem in sign extension:

Code:

So how to do the sign extension explicitly on Saturn? The C book says complicated things about type casts and sign extension... Remember that I can only take -O0 plus all named optimization flags, but not -O1, -O2, -O3? I tried it once again with -O2 and the only problem that comes up is due to the sign extension workaround from above, which is optimized away. How to do sign extension in C?
| RockinB | Jul 26, 2005 |
Using the asm instruction exts.w to explicitly force sign extension, I can finally use the optimization flags -O1, -O2 and -O3.
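For illustration, here is a minimal sketch of both ways, assuming GCC inline assembly targeting SH-2 (this is not the demo's actual code, and the function names are made up):

Code:
/* Portable C sign extension: the cast to short is what the compiler
   should turn into a single EXTS.W on SH-2 (assumes two's complement). */
static long sext16_c(unsigned long x)
{
    return (long)(short)(x & 0xFFFF);
}

/* Force the SH-2 instruction directly with GCC inline asm, so the
   optimizer cannot drop it at -O1/-O2/-O3. */
static long sext16_asm(long x)
{
    long r;
    __asm__("exts.w %1,%0" : "=r" (r) : "r" (x));
    return r;
}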
| Omni | Jul 27, 2005 |
Okay, wait.
1. I figured out the basic idea of resampling the source texture and creating a new rectangular texture for mapping. That sounds simple.
2. I can comprehend the idea of resampling a 2D quad or convex polygon from a source texture and creating a new rectangular texture for mapping. Have I understood that correctly? Also, why must the distorted quad be convex?
3. I barely understand what you mean by a mapping method such as sphere or cylinder. The only thing I can guess is that you use the idea of a sphere to generate texture coordinates automatically for a poly model based on the orientation of its faces. Is that what you are doing? And how are you doing that? I don't understand, if I even assumed how the mapping was done correctly, how taking the idea of the poly faces corresponding with a sphere will suddenly help you come up with texture coordinates. Is there some fundamental idea I'm missing? Do I need to read more about ...texture coordinates?
The code is, however, amazingly cool.
| RockinB | Jul 27, 2005 |
It's due to the resampling algorithm. You won't see a glitch or anything when a convex quad transforms into a concave quad in demo one, but the generated texture image is not correct. Just like when you want to display concave quads with VDP1. See Charles MacDonald's example; it's explained in a manual, too.
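As an aside, a small sketch (my illustration, not the demo code) of how a quad could be tested for convexity before resampling, by checking that all consecutive edge cross products share one sign:

Code:
typedef struct { long x, y; } Point2;

/* Returns non-zero if the quad a-b-c-d (taken in order) is convex:
   the z component of every consecutive edge cross product must have
   the same sign. A concave quad mixes positive and negative signs. */
static int quad_is_convex(Point2 a, Point2 b, Point2 c, Point2 d)
{
    Point2 p[4];
    int i, pos = 0, neg = 0;

    p[0] = a; p[1] = b; p[2] = c; p[3] = d;
    for (i = 0; i < 4; i++) {
        Point2 p0 = p[i];
        Point2 p1 = p[(i + 1) % 4];
        Point2 p2 = p[(i + 2) % 4];
        long cross = (p1.x - p0.x) * (p2.y - p1.y)
                   - (p1.y - p0.y) * (p2.x - p1.x);
        if (cross > 0) pos++;
        if (cross < 0) neg++;
    }
    return !(pos && neg);
}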
It's the method of texture mapping using an intermediate surface. Such an intermediate surface is a plane, a cylinder or a sphere, and you can easily imagine what it looks like when a texture is applied to them. The faces of the 3D object are just projected onto the intermediate surface, and each face gets the part of the texture that is mapped onto the intermediate surface in that region.

The mapping: point in 3D space -> point on intermediate surface -> texture coordinate

Projecting the faces onto the intermediate surface can be done in two ways:
* project the 4 vertices of a face (fast, but approximated)
* project each texel of a face (slow, but accurate)
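A rough sketch of the fast per-vertex way, using the plane (the simplest intermediate surface) and floats for readability; the names are mine and the real code would use fixed point:

Code:
typedef struct { float x, y, z; } Vec3;
typedef struct { float u, v; } UV;

/* Plane mapping: orthographic projection onto the XY plane, i.e.
   drop Z and rescale X/Y into the [0..1] range of the texture. */
static UV plane_map(Vec3 p, float min_x, float min_y, float size_x, float size_y)
{
    UV t;
    t.u = (p.x - min_x) / size_x;
    t.v = (p.y - min_y) / size_y;
    return t;
}

/* Fast, approximated way: project only the 4 vertices of a face and
   let the resampler interpolate the texels in between. */
static void map_face(const Vec3 v[4], UV uv[4],
                     float min_x, float min_y, float size_x, float size_y)
{
    int i;
    for (i = 0; i < 4; i++)
        uv[i] = plane_map(v[i], min_x, min_y, size_x, size_y);
}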
Use keywords like "texture mapping", "inverse mapping", "intermediate surface".

Quote (Omni @ Thu, 2005-07-28 @ 03:47 AM):
"The code is, however, amazingly cool."

Yeah, it's really nice. But I guess environment mapping will be buggy for a long time. Solving quadratic functions using fixed-point math is a pain in the ass. I messed with it for a long time with my voxel stuff and I don't want to go through this again.
| Omni | Jul 27, 2005 |
Awesomeness! Okay... a few more questions, if you don't mind.
1. I can understand the idea of using the sphere to correspond to texture coordinates... but... I can't really imagine how that works with a cylinder. Much less a plane... [plane means "flat surface", right?] ...Is this something I will figure out as soon as I research texture coordinates?
2. How exactly are you sampling the texture data? For the first demo and the rectangle source quad, that seems pretty simple... I think. I guess you'd retrieve the rectangle data and copy it into a second buffer somehow, but I'm not sure how -- I thought you guys were still working on DMA image access and linescrolling? Is there another way to transfer image data?
3. How do you sample a quad of texture coordinates? I'll assume this could be any quad polygon shape... the software idea makes sense, but how do you do it the hardware way? That seems... no clue.
| RockinB | Jul 27, 2005 |
Putting a texture on a plane - a rectangular shape - is the simplest method. The 3D object is projected onto it. But unlike the projection to screen - which is a perspective projection - it uses an orthographic projection, which is much simpler (just discard one dimension and keep the other two, in my case). The only difference between cylinder and sphere mapping is the computation of the vertical texture coordinate, which involves trigonometry in the case of the sphere and which is just as simple as plane mapping for the cylinder.
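To make that concrete, a small float sketch of the two vertical coordinates (my own illustration, not the demo's fixed-point code); u is computed the same way for both, wrapping around the vertical axis:

Code:
#include <math.h>

#define PI_F 3.14159265f

typedef struct { float x, y, z; } Vec3;

/* u wraps around the vertical axis, identical for cylinder and sphere
   (the point is assumed not to lie exactly on the axis). */
static float wrap_u(Vec3 p)
{
    return 0.5f + (float)atan2(p.z, p.x) / (2.0f * PI_F);
}

/* Cylinder: v is just the height along the axis, as simple as plane mapping. */
static float cylinder_v(Vec3 p, float min_y, float height)
{
    return (p.y - min_y) / height;
}

/* Sphere: v needs trigonometry, the latitude of the point on the sphere. */
static float sphere_v(Vec3 p)
{
    float len = (float)sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
    return 0.5f + (float)asin(p.y / len) / PI_F;
}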
The texels are sampled using nearest neighbor. That's a decision which I think is appropriate for runtime on Saturn, but bilinear interpolation of texels could be done in a PC tool.
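For comparison, a hedged sketch of both samplers on an 8bpp texture (illustrative only; bilinear filtering makes no sense on raw palette indices, so read it as one channel of a true-colour image in that PC tool):

Code:
/* Nearest neighbor: u,v in 16.16 fixed-point texel coordinates,
   assumed already clamped to the w x h texture. */
static unsigned char sample_nearest(const unsigned char *tex, int w,
                                    long u_fx, long v_fx)
{
    return tex[(v_fx >> 16) * w + (u_fx >> 16)];
}

/* Bilinear: blend the four surrounding texels of one 8-bit channel.
   u,v are plain floats, assumed inside [0, w-2] x [0, h-2]. */
static unsigned char sample_bilinear(const unsigned char *tex, int w,
                                     float u, float v)
{
    int   u0 = (int)u, v0 = (int)v;
    float fu = u - (float)u0, fv = v - (float)v0;
    float top = tex[v0 * w + u0] * (1.0f - fu) + tex[v0 * w + u0 + 1] * fu;
    float bot = tex[(v0 + 1) * w + u0] * (1.0f - fu) + tex[(v0 + 1) * w + u0 + 1] * fu;

    return (unsigned char)(top * (1.0f - fv) + bot * fv + 0.5f);
}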
In demo #1, there are two sprites in VRAM: the original texture and the texture which is resampled from the quad inside the other texture. In fact, there is a 2nd copy of the original texture in work RAM, as there is in all other demos. The linescroll enables the display of "continuous address" bitmaps using VDP2 scrolls. It's slightly faster than using VDP1 sprites and it's the only way to display 24bpp images. In either case, sprite or linescroll, the data is transferred using DMA.

Quote (Omni @ Thu, 2005-07-28 @ 12:58 PM):
"3. How do you sample a quad of texture coordinates? I'll assume this could be any quad polygon shape... the software idea makes sense, but how do you do it the hardware way? That seems... no clue."

Bilinear interpolation of the texel coordinates (not the texel color!) from the 4 vertices across the whole quad. In general, there are two ways: 1) transform the target coordinates to the source coordinates, or 2) vice versa. I'm doing 1), obviously.

Thanks for your interest! If you want to get into texture mapping, just google for the topic. The knowledge needed for the stuff here can be gained by reading through some docs from the net for half an hour. If anyone finds a cool resource on the math behind environment mapping, let me know.
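A compact sketch of way 1), again not the actual demo code: walk the destination rectangle and interpolate the source quad's corner coordinates down the edges and across each row, then fetch with nearest neighbor (8bpp source, 16.16 fixed point, corner order top-left, top-right, bottom-right, bottom-left, no bounds checking, rounding simplified):

Code:
static void resample_quad(const unsigned char *src, int src_w,
                          const long cu[4], const long cv[4],   /* corner UVs, 16.16 */
                          unsigned char *dst, int dst_w, int dst_h)
{
    /* Per-row steps down the left (corner 0 -> 3) and right (1 -> 2) edges. */
    long lu = cu[0], lv = cv[0];
    long ru = cu[1], rv = cv[1];
    long lu_step = (cu[3] - cu[0]) / dst_h, lv_step = (cv[3] - cv[0]) / dst_h;
    long ru_step = (cu[2] - cu[1]) / dst_h, rv_step = (cv[2] - cv[1]) / dst_h;
    int x, y;

    for (y = 0; y < dst_h; y++) {
        /* Per-pixel steps across the current row. */
        long u = lu, v = lv;
        long u_step = (ru - lu) / dst_w;
        long v_step = (rv - lv) / dst_w;

        for (x = 0; x < dst_w; x++) {
            /* Nearest-neighbor fetch at the interpolated source coordinate. */
            dst[y * dst_w + x] = src[(v >> 16) * src_w + (u >> 16)];
            u += u_step;
            v += v_step;
        }
        lu += lu_step; lv += lv_step;
        ru += ru_step; rv += rv_step;
    }
}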
| Omni | Jul 27, 2005 |
| Thanks a ton for more info. What I really wanted to know, though, when I asked the sampling question, was: how exactly do you do the pixel access from the source texture when using hardware? Software makes sense, as I imagine it involves... software access into memory once you get the texels. But how do you do... hardware access? I thought this wasn't supported in hardware. The orthographic projection onto a plane is brilliant! I have wondered how to easily wrap a texture, but I never thought of orthographically projecting the coordinates! I will certainly look up some more texture mapping info. |
| RockinB | Jul 28, 2005 |
All textures are generated in software either way. The advantage over Sonic R is that I use the VDP1 to display them. I only need to recompute textures if some texture coordinates change. And I can spread the CPU load across multiple frames by using the keyframe approach (like in emus).
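To picture that, a loose sketch (the names, the face count and the per-frame budget are all made up, not from the demo): faces whose texture coordinates changed get flagged, and only a handful of them are rebuilt each frame:

Code:
#define NUM_FACES       64
#define FACES_PER_FRAME 8    /* per-frame rebuild budget, chosen arbitrarily */

/* One dirty flag per face, set whenever its texture coordinates change. */
static unsigned char face_dirty[NUM_FACES];

/* Hypothetical hook into the resampling step from the earlier posts. */
extern void rebuild_face_texture(int face);

/* Called once per frame: rebuild at most FACES_PER_FRAME dirty textures,
   continuing next frame where this frame stopped. */
static void update_textures(void)
{
    static int cursor = 0;
    int budget = FACES_PER_FRAME;
    int checked = 0;

    while (checked < NUM_FACES && budget > 0) {
        int face = (cursor + checked) % NUM_FACES;
        if (face_dirty[face]) {
            rebuild_face_texture(face);
            face_dirty[face] = 0;
            budget--;
        }
        checked++;
    }
    cursor = (cursor + checked) % NUM_FACES;
}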
| RockinB | Jul 28, 2005 |
| Wow, great news :cheers: morden made a video and screenshots from real Saturn hardware! The video is great, with background music added! TC_demo3_morden.avi... And some screenshots: Thanks a lot to morden. And I invite everyone who's interested to record his own video or screenshots! :thumbs-up: |
| ClaudioSH2 | Jul 30, 2005 |
| Nice work Rockin'-B, this demo is really impressive. |
| RockinB | Aug 2, 2005 |
| Thanks, ClaudioSH2! Well, the demo is not very user friendly, since it wasn't intended to be more than a WIP snapshot. But I had some very nice feedback, thank you all. So I made multiple text HUD display options, one being a help walkthrough which displays the available inputs and which actions are currently being performed. If I weren't so lazy right now, I would like to improve the data creation to make the API interface easier for programmers. Furthermore I would make it possible to only texture polygons with a specific material property (like I did in the SaturnMAP tool). And finally I would add some more complex 3D models like the Sonic object, and more textures, too. Maybe demo#4 will have a source release, too. But currently I'm extremely lazy, so don't expect anything soon. My PC is so unsuited for Saturn dev (unstable, K6-2 500 MHz), I need to get a new one! |
| RockinB | Aug 3, 2005 |
About the texture effect in the Daytona games -- sky reflections in car windows -- I just had a look at the first Daytona USA again, which had the coolest effect, better than Daytona CCE and Daytona CE (Netlink). Daytona USA supports not only vertical texture scrolling, but also horizontal. No scaling or rotation is present, so a simple copy of an axis-aligned sub-rectangle from a source texture would do the job for the back window, but some of the side windows are triangular, so more must be done than just a one-to-one copy. So ladies and gentlemen, Daytona USA (most likely) performs the same job as my demo does with plane mapping. Daytona CCE features a not-so-good-looking sky reflection effect. There is only vertical scrolling, which could be done with a little software trick by advancing the texture table address of the sky texture by one line.
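That back-window job would boil down to something like this hedged sketch (8bpp sky texture, wrap-around scrolling; the function and parameter names are mine):

Code:
/* Copy an axis-aligned win_w x win_h rectangle out of a larger sky
   texture, offset by (scroll_x, scroll_y) with wrap-around, into the
   sprite texture used for the car window. */
static void scroll_window_texture(const unsigned char *sky, int sky_w, int sky_h,
                                  unsigned char *window, int win_w, int win_h,
                                  int scroll_x, int scroll_y)
{
    int x, y;
    for (y = 0; y < win_h; y++) {
        int sy = (y + scroll_y) % sky_h;
        for (x = 0; x < win_w; x++) {
            int sx = (x + scroll_x) % sky_w;
            window[y * win_w + x] = sky[sy * sky_w + sx];
        }
    }
}

The CCE-style vertical-only variant needs no copy at all: just advance the sprite's texture start address by one line's worth of bytes per scrolled line.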
| ClaudioSH2 | Aug 3, 2005 |
| Rockin'-B, is it possible to make effects like PSX Wipeout on the Saturn? [attachmentid=1401] |
| RockinB | Aug 4, 2005 |
I tried hard to see the effects you're talking about, but the image is too small. I even played Wipeout again, but still don't know which effect you mean.
| Omni | Aug 4, 2005 |
| Question: can this be used for the same type of reflection mappy-kind of stuff that's seen on the cars in Gran Turismo 1 & 2 and on various objects (like the Bombchu) in Zelda 64? Is that mapping also called "environment mapping", or is it something else? |
| RockinB | Aug 5, 2005 |
That's the idea. BTW: you can apply multiple textures on the Saturn, too. If anyone is interested in making a demo, I'd give him my texture mapping stuff.

Quote (Omni @ Fri, 2005-08-05 @ 12:51 PM):
"Is that mapping also called "environment mapping", or is it something else?"

This is something like that. I'm not totally sure, but I think environment mapping implies view-dependent mapping. The view-dependent mapping is currently very buggy (fixed-point math) in my code, but the view-independent mappings can be used to simulate the effect of view dependence. This is what all Saturn games which have similar effects do.
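For the view-dependent case, the textbook math looks roughly like this (floats for clarity; converting it to fixed point is exactly where it gets buggy):

Code:
#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* Reflect the (unit) view direction about the (unit) vertex normal and
   turn the reflection vector into classic sphere-map texture coordinates. */
static void envmap_uv(Vec3 view, Vec3 normal, float *u, float *v)
{
    float d = view.x * normal.x + view.y * normal.y + view.z * normal.z;
    Vec3 r;
    float m;

    r.x = view.x - 2.0f * d * normal.x;
    r.y = view.y - 2.0f * d * normal.y;
    r.z = view.z - 2.0f * d * normal.z;

    m = 2.0f * (float)sqrt(r.x * r.x + r.y * r.y + (r.z + 1.0f) * (r.z + 1.0f));
    *u = r.x / m + 0.5f;
    *v = r.y / m + 0.5f;
}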
| M3d10n | Aug 7, 2005 |
| You mean the reflective trophies in Wipeout XL? I am 99% sure the effect is present in the Saturn version too (it doesn't look as nice as the PSX version, but the texture surely looks to be moving around while the trophy rotates around itself). |