    Chris Adamson


    Chris Adamson is the co-author of iPhone SDK Development (Pragmatic Programmers) and Learning Core Audio (Addison-Wesley Professional). He's also an independent iOS and Mac developer, based in Grand Rapids, Michigan, where he writes the Time.code() blog at subfurther.com/blog and hosts livestreams at invalidstream.com. Over the years, he has managed to own fourteen and a half Macs.


    CocoaConf PDX Presentations

    All-Day Core Audio Workshop

    Core Audio is the low-level iOS API for processing audio in real time, used in games, virtual instruments, web radio clients, music mixers, and more. It also has a well-earned reputation for being a challenging and unforgiving framework. In this advanced all-day tutorial, we'll shake off the scary and dig into Core Audio, playing with its powerful pieces and cutting through its thicket of jargon and secrets disclosed only in header files.

    The all-day class will be broken into four hands-on sections:

    1. Sounds and Samples, Properties and Callbacks: The concepts of digital audio, the conventions of Core Audio, and our first audio apps

    2. Pretty Packets All in a Row: Recording and playback with audio queues

    3. Dreaming of Streaming: Building a web radio client with Core Audio

    4. Go With the Flow: Audio Units and Audio Processing Graphs

    This is an advanced tutorial, and it assumes that attendees are intermediate to advanced iOS developers. At a minimum, you should know how to build and run apps, link and use iOS frameworks, and be fairly comfortable with C code (including pointers and malloc/free). Of course, you should have a MacBook with Xcode 4.3 or higher. Being able to deploy to an iOS device during the tutorial is optional, but may be helpful.
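
    To give a taste of the conventions part one covers, here's a minimal sketch of Core Audio's property idiom and the error-checking style we'll lean on all day. This isn't the workshop code itself: the CheckError helper is the common convention popularized in Learning Core Audio, and the /tmp/example.caf path is just a placeholder for any audio file on disk.

    #include <AudioToolbox/AudioToolbox.h>
    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>

    // Core Audio reports failures as OSStatus codes, many of which are
    // four-character codes packed into a 32-bit integer. This helper
    // unpacks them into something readable and bails on any error.
    static void CheckError(OSStatus error, const char *operation) {
        if (error == noErr) return;
        char errorString[20];
        *(UInt32 *)(errorString + 1) = CFSwapInt32HostToBig((UInt32)error);
        if (isprint(errorString[1]) && isprint(errorString[2]) &&
            isprint(errorString[3]) && isprint(errorString[4])) {
            errorString[0] = errorString[5] = '\'';
            errorString[6] = '\0';
        } else {
            sprintf(errorString, "%d", (int)error);
        }
        fprintf(stderr, "Error: %s (%s)\n", operation, errorString);
        exit(1);
    }

    int main(int argc, const char *argv[]) {
        CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault,
            CFSTR("/tmp/example.caf"), kCFURLPOSIXPathStyle, false);
        AudioFileID audioFile;
        CheckError(AudioFileOpenURL(url, kAudioFileReadPermission, 0, &audioFile),
                   "AudioFileOpenURL failed");

        // The property idiom: pass a size in, get the value (and its
        // actual size) back out
        AudioStreamBasicDescription asbd;
        UInt32 propSize = sizeof(asbd);
        CheckError(AudioFileGetProperty(audioFile, kAudioFilePropertyDataFormat,
                                        &propSize, &asbd),
                   "Couldn't get file's data format");
        printf("Sample rate: %.0f Hz, channels: %u\n",
               asbd.mSampleRate, (unsigned int)asbd.mChannelsPerFrame);

        AudioFileClose(audioFile);
        CFRelease(url);
        return 0;
    }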


    Core Audio in iOS 6

    Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's also a new place to use units: the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple multi-channel output devices at the same time.

    Want to see, and hear, how all this stuff works? This section is the place to find out.
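
    As a teaser, here's a minimal sketch of that new Audio Queue tap, assuming you already have a playing AudioQueueRef (here called queue) and omitting error handling for brevity:

    #include <AudioToolbox/AudioToolbox.h>

    // Called by the queue whenever it has frames headed for playback.
    // Pulling them in with AudioQueueProcessingTapGetSourceAudio lets us
    // inspect or modify the samples before they reach the hardware.
    static void MyTapCallback(void *inClientData,
                              AudioQueueProcessingTapRef inAQTap,
                              UInt32 inNumberFrames,
                              AudioTimeStamp *ioTimeStamp,
                              AudioQueueProcessingTapFlags *ioFlags,
                              UInt32 *outNumberFrames,
                              AudioBufferList *ioData) {
        AudioQueueProcessingTapGetSourceAudio(inAQTap, inNumberFrames,
                                              ioTimeStamp, ioFlags,
                                              outNumberFrames, ioData);
        // ...process *outNumberFrames frames of audio in ioData here...
    }

    static void InstallTap(AudioQueueRef queue) {
        AudioQueueProcessingTapRef tap;
        UInt32 maxFrames;
        AudioStreamBasicDescription tapFormat;
        // Tap the samples before the queue applies its own effects
        AudioQueueProcessingTapNew(queue, MyTapCallback, NULL,
                                   kAudioQueueProcessingTap_PreEffects,
                                   &maxFrames, &tapFormat, &tap);
    }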


    Mobile Movies with HTTP Live Streaming

    If your iOS app streams video, then you're going to be using HTTP Live Streaming. Between the deep support for it in iOS and the App Store rules mandating its use in some cases, there is realistically no other choice. But where do you start, and what do you have to do? In this session, we'll take a holistic look at how to use HLS. We'll cover how to encode media for HLS and get the best results for all the clients and bitrates you might need to support; how to serve that media (and whether it makes sense to let someone else do it for you); and how to integrate the HLS stream into your app.
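
    At the heart of all of this is the playlist file. Here's a hypothetical master playlist, with made-up paths and bitrates, offering the same stream at three quality levels; the client picks (and re-picks) a variant based on measured throughput, and your app just hands the playlist's URL to the media player:

    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=150000,RESOLUTION=416x234
    low/index.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=640000,RESOLUTION=640x360
    mid/index.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1240000,RESOLUTION=960x540
    high/index.m3u8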


    Reverse Q&A Panel

    "In Soviet Russia, panel questions you!"

    Borrowing an idea from the Penny Arcade Expo (PAX) and the panels held there by Harmonix (makers of the "Rock Band" games), a "Reverse Q&A" turns the tables on the traditional panel. Speakers become the questioners, and attendees are the ones with the answers.

    It's a novel idea, giving attendees their moment in the limelight to say what they're really thinking, and letting speakers learn more about what people want from conferences, books, and their development lives in general. With a combination of polls, follow-ups, person-on-the-street questions, and funny stories we can all relate to, the Reverse Q&A will shake the cobwebs out of the old panel format and turn it into a two-way discussion that both sides of the mic can learn from.