Chris Adamson is an independent writer, editor, and developer, living in Grand Rapids, Michigan. Along with developing numerous App Store apps for clients, he is the co-author of iPhone SDK Development (Pragmatic Programmers) and Learning Core Audio (Addison-Wesley Professional). He maintains a corporate identity as Subsequently & Furthermore, Inc. and writes the [Time code]; blog at http://www.subfurther.com/blog. In a previous career, he was a Writer / Associate Producer at CNN Headline News, and over the years, he has managed to own thirteen and a half Macs.
CocoaConf Columbus 2012 Presentations:
"In Soviet Russia, panel questions you!"
Borrowing an idea from the Penny Arcade Expo (PAX) and the panels held there by Harmonix (makers of the "Rock Band" games), a "Reverse Q&A" literally turns the tables on the traditional panel. Speakers become questioners, and attendees are the ones with the answers.
It's a novel idea, letting attendees have their moment in the limelight to say what they're really thinking, and letting speakers learn more about what people want from conferences, books, and their development life in general. With a combination of polls, follow-ups, person-on-the-street questions, and funny stories we can all relate to, the Reverse Q&A will shake the cobwebs out of the old panel format and turn it into a two-way discussion that both sides of the mic can learn from.
If your iOS app streams video, then you're going to be using HTTP Live Streaming. Between the serious support for it in iOS and the App Store rules mandating its use in some cases, there is realistically no other choice. But where do you get started, and what do you have to do? In this session, we'll take a holistic look at how to use HLS: how to encode media for HLS and get the best results for all the clients and bitrates you might need to support, how to serve that media (and whether it makes sense to let someone else do it for you), and how to integrate the HLS stream into your app.
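To give a concrete sense of what "encoding for all the clients and bitrates you might need to support" produces, here is a sketch of an HLS master playlist and a tiny parser for its variant streams. The playlist contents, the `Variant` type, and the `parseVariants` function are illustrative assumptions for this sketch, not material from the session; a real app would simply hand the master playlist URL to the system's media player and let it pick the variant.

```swift
import Foundation

// A hypothetical HLS master playlist listing two variant streams
// at different bitrates (contents invented for illustration).
let master = """
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=200000,RESOLUTION=416x234
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
mid/index.m3u8
"""

struct Variant {
    let bandwidth: Int   // peak bits per second declared for this stream
    let uri: String      // playlist for this bitrate's media segments
}

// Walk the playlist line by line: an #EXT-X-STREAM-INF tag describes
// the variant whose URI appears on the following non-comment line.
func parseVariants(_ playlist: String) -> [Variant] {
    var variants: [Variant] = []
    var pendingBandwidth: Int? = nil
    for line in playlist.split(separator: "\n").map(String.init) {
        if line.hasPrefix("#EXT-X-STREAM-INF:") {
            // Pull BANDWIDTH=... out of the comma-separated attribute list.
            let attrs = line.dropFirst("#EXT-X-STREAM-INF:".count)
            for attr in attrs.split(separator: ",") {
                let parts = attr.split(separator: "=")
                if parts.count == 2, String(parts[0]) == "BANDWIDTH" {
                    pendingBandwidth = Int(String(parts[1]))
                }
            }
        } else if !line.hasPrefix("#"), let bw = pendingBandwidth {
            variants.append(Variant(bandwidth: bw, uri: line))
            pendingBandwidth = nil
        }
    }
    return variants
}
```

Parsing the sample playlist above yields two variants, one per bitrate tier; in practice an encoder like Apple's HLS tools generates this master playlist for you, and serving it is just static file hosting.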
Core Audio is the low-level iOS API for processing audio in real time, used in games, virtual instruments, web radio clients, music mixers, and more. It also has a well-earned reputation for being a challenging and unforgiving framework. In this advanced all-day tutorial, we'll shake off the scary and dig into Core Audio, playing with its powerful pieces and cutting through its thicket of jargon and secrets disclosed only in header files.
The all-day class will be broken into four hands-on sections:
1. Sounds and Samples, Properties and Callbacks: the concepts of digital audio, the conventions of Core Audio, and our first audio apps
2. Pretty Packets All in a Row: recording and playback with audio queues
3. Dreaming of Streaming: Building a web radio client with Core Audio
4. Go With the Flow: Audio Units and Audio Processing Graphs
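The "Sounds and Samples" material above rests on some basic digital-audio arithmetic. As a hedged sketch (the specific numbers and the half-second buffer choice are illustrative assumptions, not the tutorial's content), here is how uncompressed LPCM sizes work out for CD-quality audio:

```swift
import Foundation

// CD-quality linear PCM: 44,100 frames per second,
// 2 channels (stereo), 2 bytes (16 bits) per sample.
let sampleRate = 44_100
let channels = 2
let bytesPerSample = 2

// One frame holds one sample per channel.
let bytesPerFrame = channels * bytesPerSample        // 4 bytes
let bytesPerSecond = sampleRate * bytesPerFrame      // 176,400 bytes

// Sizing an audio queue buffer to hold half a second of audio,
// a plausible starting point before tuning for latency or memory.
let bufferSeconds = 0.5
let bufferSize = Int(Double(bytesPerSecond) * bufferSeconds)  // 88,200 bytes
```

This kind of back-of-the-envelope math comes up constantly in Core Audio, since its C APIs ask you for byte counts and frame counts rather than doing the conversion for you.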
This is an advanced tutorial, and it is assumed that attendees are intermediate to advanced iOS developers. At a minimum, you should know how to build and run apps, link and use iOS frameworks, and be fairly comfortable with C code (including pointers and malloc/free). Of course, you should have a MacBook with Xcode 4.3 or higher. Being able to deploy to an iOS device during the tutorial is optional, but may be helpful.