building C++ Saturn games

RockinB | Dec 28, 2004
Hey, thanks for this advice! I use the usual SGL link script in PROJECTS/COMMON/sl.lnk, plus sglarea.o, but not cinit.o. cinit.o is compiled as C, and C code can't call into C++ directly as far as I know, so I had to move main() from cinit.c into my main.cpp. Anyway, the major problem is gone now. My fault, sorry... I forgot to call slInitSystem(). The smallest possible C++ program now works with Dev-CPP as well as with a modified makefile/OBJECTS.
RockinB | Dec 28, 2004
While testing the app I'm porting, I found out that it doesn't get past the initialization. It turns out that it hangs when allocating an array object: new UBYTE[131136]. What can be the reason for this? It's not the first use of the new operator (it's the second). Could it be something like running out of memory? My sl.bin is already 660 KB.
antime | Dec 28, 2004
Quite possibly. By default, everything is put into workram-h (because it's faster), so you have less than one megabyte of space available.
RockinB | Dec 28, 2004
Yes, I have investigated this and reduced the binary size in several steps. With each try, the program got further in its execution. But I have no clue how to deal with memory in C++.

For some strange reason, the binary is really huge (500 KB, without other stuff). It doesn't seem to be debug info: removing the -g switch reduces the sl.coff size, but sl.bin keeps its size.

How do I control where C++ gets its memory from? Maybe I want some structures in low work RAM, or even some code, but how? This is really important; I don't know how else to proceed.

BTW: how big can sl.bin be and still work properly?
antime | Dec 28, 2004
There's no debug info in pure binaries, so that switch won't help. Try using objdump on the COFF file to see what's included and what's taking up space.

Space requirements also depend on which C++ features you use. The linker should be smart enough to pull in only the bits you actually need, but the full C++ standard library is over 900 KB by itself. Disabling exceptions and RTTI will save space, as will limiting template usage. Google for documents on using C++ in embedded development; they contain lots of information.

IIRC, the default allocator just uses the heap, defined by the bss_start and bss_end symbols. If you want more control over memory allocation you can write your own memory management routines (since you're using C++ you could, for instance, override operator new for some classes), but if you need such big chunks of dynamically allocated memory, I would suggest your design has problems.
RockinB | Dec 29, 2004
Oh, thanks a lot, you're a god :bow ! I did -fno-rtti and -fno-exceptions. The latter involved removing some thrown exceptions, but it did not work; maybe the new operator needs exceptions.

Furthermore, I modified the link script and makefile to use low work RAM and produce two binaries (low and high work RAM). The map file gives good info on the placement. I can move all my .o files to low work RAM, but not the zlib stuff :huh . Of course, I could link every single .o file of zlib individually, but how do you place everything from a library into a certain SECTION?

Anyway, with the emus I cannot test the program split into two binaries, because they seem to reset work RAM to zero when I load the other binary.

Guess what? I implemented overloading of the new, delete, new[] and delete[] operators. Fully parameterizable, global or local for every single class. And I can switch between low work RAM with SEGA_MEM.a and the usual malloc. This lets me see every single memory allocation in debug mode, with the requested size and the given location. Man, I was very surprised how huge some class objects are! It's great for seeing where an allocation fails.

Update: fixed now. I had problems with low work RAM: I copied stuff from an object in high to low work RAM, but the writes seemed to have no effect, the data written was zero :huh !! I could only test with Satourne, so with all the unsupported features I've discovered, it could be a Satourne bug. Or are there special things to do when using low work RAM? Note: I haven't yet tried disabling exceptions again with the overloaded operators.
antime | Dec 29, 2004
Depends on your allocator. The one in libstdc++ doesn't need them (both exception-enabled and non-throwing versions are supplied).
Suitable wildcards in the input and output selections in the link script might do the trick, or you could assign the sections when creating the library.
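An untested sketch of the wildcard idea, using GNU ld's archive:member filespec syntax to gather all of a library's sections into one output region. The section names are real ld syntax, but the addresses and archive name are placeholders, not the real Saturn link script values:

```
SECTIONS
{
  /* Hypothetical: pull every section from all members of zlib's
     archive into low work RAM. Addresses are placeholders. */
  .lowram 0x00240000 :
  {
    libz.a:*(.text .data .bss)
  }
  .text 0x06010000 :
  {
    *(.text)   /* everything else stays where it was */
  }
}
```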
You could also use your custom allocators for leak tracking, memory profiling and other things. Instead of overloading the operators you could also have used "placement new", but your system allows more flexibility. I would however recommend against mixing allocators, especially SEGA_MEM and the compiler-provided ones.
RockinB | Dec 30, 2004
There is no problem with that. I have now added a third option: fixed memory locations for objects. There is only one instance per class, so it should work, be faster, and save on library code. Access to low work RAM magically works now, as does disabling exceptions (with my own new operators). I moved a >256 KB object to low RAM and all of that worked. The app does run on Satourne, but there seems to be some other problem, since the graphics output doesn't work. Other debug text output indicates correct functionality so far. But on a real Saturn, it hangs at some point.
RockinB | Jan 5, 2005
:agree Hmm, you're right, one must be very careful. The author of the ported app mixed up the pairing of the new and delete operators (a common problem, I've read in a handbook): he allocated an array with new[] and destroyed it with delete (not delete[]). But I had overloaded new[] and delete[], so the real Saturn locked up when delete was called!

I also tried to minimize the size of the binary. Very annoyingly, the linker links in stuff that is never used. It seems that every public function of a class is linked, whether it's used or not, so I had to insert some #ifdef's. As you said, the stdc++ library is very large, and most of the linked code comes from there, much of it stuff I don't want to use. Streams, for example. I don't really need them, but maybe I'll try to exclude those somehow, too.
antime | Jan 5, 2005
Using new[] and delete on the same object is indeed a bug, but I was referring more to the fact that SEGA_MEM seems to keep its own list of free and allocated memory and does not use the C-library malloc/free. So if you mix the two for the same memory area you will inevitably get collisions; using them on different memory areas should be fine.
RockinB | Jan 5, 2005
It's definitely worth looking into sl.map and trying to eliminate the whole bunch of junk that gets linked in. I avoided snprintf() and similar calls, and the binary shrank to only 158 KB!!! Now I can put most stuff into high work RAM and it runs faster. BTW: we were talking about Handy, an Atari Lynx emulator. It can be expected to run at full speed, someday.
antime | Jan 5, 2005
The big hog there is the floating-point emulation. Newlib comes with an integer-only version if you really need the functionality.
RockinB | Nov 8, 2005
I just tried to recompile my WonderSwan emu port with SaturnOrbit. As Dev-Cpp links with gcc, I can't link with g++ and have to supply the C++ standard libs manually. It's a C++ project, and it worked when compiling all the cpp files as C files (but only for COFF). But when I compile them as C++ files, the compiler complains that it can't find the function rand(). I guess it's in the C stdlib, but not in the C++ stdlib. Not having much experience with C++, I just tried #include <stdlib.h>. Can anyone help me?

edit: Forget it, it's written in some strange C dialect, no real C++ and no real C. I don't care whether it can be built with ELF, as it works fine with COFF.