Direct Access to Video Encoding and Decoding
Session 513, WWDC 2014

Discover how to use AV Foundation and Video Toolbox to access hardware accelerated encoding and decoding services. Gain best practices for when it is appropriate to use a high-level or low-level API for encoding or decoding. Learn about multi-pass export for improved H.264 encoding and see how to use it in your app.

This is session 513, and we're going to talk about video encoders and decoders today. We want to make sure that no matter what you're doing with video in your application, you have access to hardware encoders and decoders. This will improve user experience in a number of ways. Obviously, your users will get better performance and far better efficiency, but most importantly, this will extend battery life. Users will really appreciate it if their OS X portables as well as their iOS devices have improved battery life. And as an added bonus, people with portables will love it if their fans don't kick in every time they're doing video processing.

So today we're going to break this down into a few case studies, looking at some common usage scenarios. The first scenario is the case where you have a stream of H.264 data coming in over the network and you want to display it inside a layer in your application. The next one is the case where you have a stream of H.264 data coming in over the network, but you don't just want to display it; you actually want to get access to the decoded CVPixelBuffers. Next, we'll talk about the case where you have a sequence of images coming in from the camera or someplace else and you'd like to compress them directly into a movie file. Accompanying that, there's the case where you have a stream of images coming in from the camera or someplace else and you'd like to compress them but get direct access to the compressed sample buffers, so that you can send them out over the network or do whatever you like with them. And finally, we're going to give you an intro to the new multi-pass APIs that we're introducing in iOS 8 and Yosemite.

All right, let's do a quick overview of our media interface stack. You've seen diagrams like this earlier this week, but we'll go through it once more, with a little extra focus on video. AVKit provides very easy-to-use, high-level, view-level interfaces for dealing with media. AVFoundation provides an easy-to-use Objective-C interface for a wide range of media tasks. Video Toolbox has been on OS X for a while, but now it's finally populated with headers on iOS; it provides direct access to encoders and decoders. And below that we have Core Media and Core Video, which provide many of the necessary types you'll see in the interfaces throughout the rest of the stack.

So today we're going to focus on AVFoundation and the Video Toolbox. In AVFoundation, we'll be looking at interfaces that let you decode video directly into a layer in your application or compress frames directly into a file. In the Video Toolbox, we'll be looking at interfaces that give you more direct access to encoders and decoders, so you can decompress directly to CVPixelBuffers or compress directly to CMSampleBuffers.

A quick note on using these frameworks: a lot of people think they have to dive down to the lowest level and use the Video Toolbox in order to get hardware acceleration, but that's really not true. On iOS, AVKit, AVFoundation, and the Video Toolbox all use hardware codecs. On OS X, AVKit and AVFoundation use hardware codecs when they're available on the system and when it's appropriate, and the Video Toolbox uses hardware codecs when they're available on the system and when you request them.

Before we dive in further, let's take a quick look at the cast of characters, some of the common types you'll encounter in these interfaces. A CVPixelBuffer contains a block of image data, and wrapping that block of data is the CVPixelBuffer itself, which tells you how to access the data. It's got the dimensions, the width and the height, and it's got the pixel format, everything you need in order to correctly interpret the pixel data. Those data buffers can be very expensive to constantly allocate and deallocate, so CVPixelBufferPool allows you to recycle them: it lets you efficiently reuse the backing buffers of CVPixelBuffers.
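For the scenario where you want decoded CVPixelBuffers out of an H.264 stream, the Video Toolbox entry point is a VTDecompressionSession. The following is a minimal sketch under stated assumptions, not a complete implementation: it compiles only against VideoToolbox on Apple platforms, error handling is elided, and the `CMVideoFormatDescriptionRef` is assumed to come from the stream's parameter sets (for example via `CMVideoFormatDescriptionCreateFromH264ParameterSets`).

```c
#include <VideoToolbox/VideoToolbox.h>

/* Called once per decoded frame; imageBuffer is the decoded CVPixelBuffer.
 * Retain it (CVPixelBufferRetain) if you keep it beyond this callback. */
static void didDecompress(void *decompressionOutputRefCon,
                          void *sourceFrameRefCon,
                          OSStatus status,
                          VTDecodeInfoFlags infoFlags,
                          CVImageBufferRef imageBuffer,
                          CMTime presentationTimeStamp,
                          CMTime presentationDuration) {
    if (status == noErr && imageBuffer != NULL) {
        /* Hand the pixel buffer to your renderer or processing code here. */
    }
}

VTDecompressionSessionRef makeSession(CMVideoFormatDescriptionRef formatDesc) {
    VTDecompressionOutputCallbackRecord callback = { didDecompress, NULL };
    VTDecompressionSessionRef session = NULL;
    OSStatus err = VTDecompressionSessionCreate(
        kCFAllocatorDefault,
        formatDesc,
        NULL, /* decoder specification; on OS X, request hardware here with
                 kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder */
        NULL, /* default destination pixel buffer attributes */
        &callback,
        &session);
    return (err == noErr) ? session : NULL;
}
```

Each incoming CMSampleBuffer is then fed to the session with `VTDecompressionSessionDecodeFrame(session, sampleBuffer, 0, NULL, NULL)`, and the decoded frames arrive in the callback above.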