Hi,
The following segment:
Code:
#include <iostream>
#include <complex>
int main(int argc, char *const argv[])
{
    std::complex<double> a(1., 1.);
    std::cout << "a = " << a << "\n";
    return 0;
}
compiles fine using Xcode 3.2 (a 'Command Line Tool' project) on OS X 10.6.8, but produces only the following output:
a = (
when of course it should produce:
a = (1,1)
Yes, I can display the real and imaginary parts separately using a.real() and a.imag(), but the above should work.
Any ideas why Xcode / OS X shows this weird behaviour? Has anyone else seen it? Any suggestions?
Cheers,
m