@Mark Buckingham: The iPhone development docs talk some about using OpenGL for games. It sounds from this topic like Core Animation might be the way to go.
Core Animation layers have no depth. It's famously called "2D planes in space," which means the environment is 3D, but the layers themselves aren't. OpenGL is full 3D, and a far more flexible API in general. Of course, the tradeoff is that Core Animation is much easier to learn and is directly integrated with Cocoa.
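To make that concrete, here's a rough sketch (hypothetical values, untested) of what "2D planes in space" means in code: the layer itself is a flat rectangle, but you can place and rotate it in a 3D environment with a CATransform3D, and a perspective entry in the parent's sublayer transform makes the depth visible.

```objc
#import <QuartzCore/QuartzCore.h>

// The container holds flat layers arranged in a 3D environment.
CALayer *containerLayer = [CALayer layer];
containerLayer.bounds = CGRectMake(0, 0, 400, 400);

// A perspective term in the sublayer transform makes the 3D placement visible.
CATransform3D perspective = CATransform3DIdentity;
perspective.m34 = -1.0 / 500.0;
containerLayer.sublayerTransform = perspective;

CALayer *cardLayer = [CALayer layer];
cardLayer.bounds = CGRectMake(0, 0, 128, 128);
cardLayer.position = CGPointMake(200, 200);

CGColorRef blue = CGColorCreateGenericRGB(0.2, 0.4, 0.9, 1.0);
cardLayer.backgroundColor = blue;
CGColorRelease(blue);

// Rotate the flat layer around the y-axis: the plane tilts in space,
// but its content is still two-dimensional.
cardLayer.transform = CATransform3DMakeRotation(M_PI / 4.0, 0.0, 1.0, 0.0);
[containerLayer addSublayer:cardLayer];
```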
@ssp: I haven't seen that 'live' myself, but I'd say it's exactly a use case which is not interactive and where latency or snappiness doesn't play a role.
The scene is thousands of album covers flying around in 3D space at a smooth framerate. It's a fairly complex scene requiring a lot of horsepower.
That's why I asked about numbers or concrete experiences.
You could download the NanoLife sample I posted and increase the layer count to 200 or more. I think you'll be impressed with the performance. If not, then it's possible the rendering isn't happening on the graphics card for some reason (something below Cocoa).
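As a rough idea of the kind of stress test I mean (a hypothetical sketch, not the actual NanoLife code), something like this puts a couple hundred continuously animating layers into a host layer, which is usually enough to see whether the compositor is keeping up:

```objc
#import <QuartzCore/QuartzCore.h>

CALayer *hostLayer = [CALayer layer];     // attach this to a layer-backed view
const NSUInteger kLayerCount = 200;       // bump this up to stress the compositor

for (NSUInteger i = 0; i < kLayerCount; i++) {
    CALayer *dot = [CALayer layer];
    dot.bounds = CGRectMake(0, 0, 32, 32);
    dot.position = CGPointMake(arc4random() % 800, arc4random() % 600);

    CGColorRef color = CGColorCreateGenericRGB(0.9, 0.3, 0.1, 1.0);
    dot.backgroundColor = color;
    CGColorRelease(color);

    [hostLayer addSublayer:dot];

    // Give each layer an endless back-and-forth drift so the scene
    // never stops compositing.
    CABasicAnimation *drift = [CABasicAnimation animationWithKeyPath:@"position"];
    drift.toValue = [NSValue valueWithPoint:
                        NSMakePoint(arc4random() % 800, arc4random() % 600)];
    drift.duration = 2.0;
    drift.repeatCount = HUGE_VALF;
    drift.autoreverses = YES;
    [dot addAnimation:drift forKey:@"drift"];
}
```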
It's unsurprising that _on paper_ Apple make their technologies sound great
This is definitely not a case where the technology is just good on paper. Core Animation has astounding real-world performance, which is well-demonstrated on iPhone. That said, like any API, it's easy to undermine the performance in a single line of code by calling the wrong thing at the wrong time.
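As one hedged illustration of that kind of line (not drawn from any specific app): every layer property change normally schedules an implicit animation, so a burst of updates made outside a transaction can queue far more work than you intended. Disabling actions for the transaction applies the same changes directly:

```objc
#import <QuartzCore/QuartzCore.h>

CALayer *statusLayer = [CALayer layer];   // stand-in for a layer updated very often

// Without the transaction, each change would schedule an implicit fade;
// with actions disabled, the new value is applied immediately.
[CATransaction begin];
[CATransaction setDisableActions:YES];
statusLayer.opacity = 0.5;
[CATransaction commit];
```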
All I tried to do was to use CA for superimposing a handful of elements I needed drawn in my view
That's well within the intended use cases for the API. Feel free to email me the code if you want me to take a glance. Core Animation performance should not just be "decent" but blisteringly fast. The only qualification to that is that filters can be quite slow when animated.
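For that overlay case, a minimal sketch looks something like this (assuming a layer-backed NSView named contentView, which is a hypothetical name): each superimposed element is just a sublayer of the view's backing layer.

```objc
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

// contentView is a hypothetical existing NSView in the window.
[contentView setWantsLayer:YES];

CATextLayer *badge = [CATextLayer layer];
badge.string = @"3 new";
badge.fontSize = 12.0;
badge.bounds = CGRectMake(0, 0, 60, 18);
badge.position = CGPointMake(40, 20);

CGColorRef red = CGColorCreateGenericRGB(0.8, 0.0, 0.0, 1.0);
badge.backgroundColor = red;
CGColorRelease(red);

// The badge floats above whatever the view itself draws.
[[contentView layer] addSublayer:badge];
```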
by Scott Stevenson — Oct 05