Logan

Scanning over it, it seems the major factor is that data types and variable handling come in different sizes depending on the architecture.

A Boolean is 1 byte on x86 machines and 4 bytes on PPC.
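
That size difference matters most when a bool sits inside a struct you write to disk or send over the wire. Just as a sketch (the struct and field names below are made up for illustration), the usual fix is to swap bool for a fixed-width integer from stdint.h so both slices agree on the layout:

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical on-disk record: avoid a raw bool, whose size can differ
 * between the PPC and Intel slices. A fixed-width integer keeps the
 * struct layout identical on both architectures. */
typedef struct {
    uint8_t  isDirty;   /* was: bool isDirty */
    int32_t  recordID;
} SavedRecord;

int main(void)
{
    printf("sizeof(bool)        = %zu\n", sizeof(bool));
    printf("sizeof(SavedRecord) = %zu\n", sizeof(SavedRecord));
    return 0;
}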

A long double is 16 bytes on both architectures, but only 80 of those bits are significant on Intel-based Macs.
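
You can see what your own compiler is doing by printing the values from float.h: sizeof(long double) should come back as 16 on both sides, while LDBL_MANT_DIG shows how many of those bits actually carry precision. A quick sketch:

#include <float.h>
#include <stdio.h>

int main(void)
{
    /* Same storage size on both architectures, different precision.
     * Don't rely on long double for data that gets written on one
     * architecture and read back on the other. */
    printf("sizeof(long double) = %zu\n", sizeof(long double));
    printf("LDBL_MANT_DIG       = %d\n", LDBL_MANT_DIG);
    printf("LDBL_DIG            = %d\n", LDBL_DIG);
    return 0;
}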

There are just a ton of really simple but potentially tedious tweaks. From what I scanned, once your code is universal binary ready you'll know it right after compiling on a universal-binary-capable machine: do Get Info on the application and check its Kind.
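
If you'd rather check from code than from the Finder, Apple's GCC defines per-architecture macros when it compiles each slice of a universal binary, so a quick sanity check looks something like this (just a sketch, assuming the usual __ppc__ / __i386__ spellings the compiler uses):

#include <stdio.h>

int main(void)
{
    /* Each slice of a universal binary is compiled separately,
     * so these preprocessor checks are resolved once per architecture. */
#if defined(__ppc__) || defined(__ppc64__)
    printf("Running the PowerPC slice (big-endian)\n");
#elif defined(__i386__) || defined(__x86_64__)
    printf("Running the Intel slice (little-endian)\n");
#else
    printf("Unknown architecture\n");
#endif
    return 0;
}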

Be sure you're using the latest compiler too.