Jonathan Blocksom is a lead software engineer working on iOS projects at Capital One in McLean, VA. Before joining Capital One, Jonathan taught Advanced iOS and OpenGL for the Big Nerd Ranch. A seasoned industry veteran, he has been developing iPhone software since the App Store opened and 3D graphics software since OpenGL was just little bitty gl.
High quality cameras and fast processors on the iPhone have opened up the door for great computer vision capabilities in the palm of your hand. This session will explore what's possible with SDKs in iOS 5 and open source libraries like OpenCV for understanding the world around us through imagery. We will go into basic capabilities such as face detection and feature matching, and show how to use those to build apps for mosaicing, panoramas, and 3D shape recovery. The session will end with a blueprint for a computer vision SDK for the next decade.
What good is it to take your iPhone for a spin if you can't measure the G forces involved? This session will show you how with Core Motion, Apple's primary interface to the accelerometers, gyros and magnetometer in your pocket. We'll look into how to access each one, the different ways to access the data and how to tie them all together for a fully georeferenced measurement of position and orientation. To cap it all off we'll show how the orientation can be accessed with OpenGL, unlocking the door for immersive Augmented Reality applications on your iPhone or iPad.
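Under the hood, handing device orientation to OpenGL comes down to turning an attitude quaternion into a rotation matrix, which Core Motion's `CMAttitude` can do for you. As a rough sketch of the math involved (the type and function names here are illustrative, not Core Motion API):

```swift
// Illustrative sketch: convert a unit attitude quaternion into the
// row-major 3x3 rotation matrix you would load into a model-view matrix.
struct Quaternion {
    var w, x, y, z: Double
}

// Standard quaternion-to-rotation-matrix formula for a unit quaternion.
func rotationMatrix(from q: Quaternion) -> [[Double]] {
    let (w, x, y, z) = (q.w, q.x, q.y, q.z)
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]
    ]
}

// The identity quaternion (no rotation) yields the identity matrix.
let identity = rotationMatrix(from: Quaternion(w: 1, x: 0, y: 0, z: 0))
```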
Moving beyond fancy transitions to the world of 3D graphics requires OpenGL. This session will introduce OpenGL ES 2.0 and Apple's GLKit, which go hand in hand to bring powerful 3D rendering to iOS devices. We'll cover how the OpenGL ES 2.0 pipeline goes from geometry to screen pixels, what sort of code you need to write in between, and how GLKit can make that part of your life simpler. This session is suitable for those with no 3D graphics experience, and those with experience may pick up a few new tricks too!
Want to add 3D graphics to your app but have no idea where to start? Trying to figure out how to write your first shader? This session is for you! In this high level talk we'll go through the various options for 3D rendering on iOS and the Mac, see how the different SDKs relate to each other, and explain things like programmable pipelines and shaders.
Come away with the knowledge of both the forest and the trees of 3D rendering in Cocoa.
Unresponsive apps? Want to take advantage of multiple cores? Throw a little concurrency into your app! This intermediate level session will go through the task and queue based concurrency systems available in Cocoa and iOS, starting at the high level with NSOperation and moving down to the low level, block based Grand Central Dispatch. See how to improve load times, process data asynchronously, manage background tasks and more.
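The high-level `NSOperation` side of that spectrum can be sketched in a few lines (the work done here is a stand-in; the pattern, not the arithmetic, is the point):

```swift
import Foundation

// Sketch of queue-based concurrency with OperationQueue: fan work items
// out across the available cores, then wait for everything to finish.
let queue = OperationQueue()
queue.maxConcurrentOperationCount = ProcessInfo.processInfo.activeProcessorCount

let inputs = Array(1...8)
var results = [Int](repeating: 0, count: inputs.count)
let lock = NSLock()   // protect shared state written from multiple operations

for (i, value) in inputs.enumerated() {
    queue.addOperation {
        let square = value * value          // stand-in for expensive work
        lock.lock(); results[i] = square; lock.unlock()
    }
}
queue.waitUntilAllOperationsAreFinished()   // block until every operation completes
print(results)                              // [1, 4, 9, 16, 25, 36, 49, 64]
```

The same work could be pushed down to Grand Central Dispatch with `DispatchQueue.global().async` blocks; `OperationQueue` simply adds dependencies, cancellation and a concurrency limit on top.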
Good apps let users take something away, great apps let them share what they've made with the world. This introductory session will start with built-in controllers for posting to Twitter and Facebook and how to customize the selector. Then you will learn to use a variety of open source modules for services such as Dropbox and Instapaper. Finally we'll show how to create your own custom UIActivity for whatever services you want users to access from your app.
iOS 7 introduces UIKit Dynamics, a physics engine for interface elements, and motion effects, a simple way to make your GUI react to the user moving their device. This session will give you an overview of these basic technologies, develop several interfaces using them, and point out their limitations. There will be many code samples and demos, and only a basic knowledge of the iOS SDK is required.
With iOS 7 Apple has exposed the pieces they use to make transitions happen and allowed us to hook in, creating our own custom view transitions with nifty animations. We will cover:
This is an intermediate level session and will have plenty of code and demos.
Swift wasn't designed in a vacuum; interoperating with the rich history of Objective-C frameworks was a critical part of the design. We will see how you can work with both languages in the same project, what works well and what the pitfalls are. You will also hear about the real world experience of adding Swift to a large consumer-facing app with millions of users.
iOS 8 pushed the envelope of iOS 3D graphics in two directions away from good old OpenGL ES: higher with SceneKit and lower with Metal. In this session we will take a look at what all these options provide, how they fit in, and which might be right for your project. This will be illustrated with code samples showing the same thing using different technologies as well as highlighting what makes each special.
Details coming soon!
Break dimensional barriers by mixing the 2D game API SpriteKit with the 3D graphics API SceneKit! Each of these SDKs has hooks to use the other one and we will explore how they can work together. We will augment 2D games using 3D graphics with SceneKit, see how to add 2D overlays on our 3D content, and explore using SpriteKit's texture generation facilities with 3D models. Detailed knowledge of either SDK is not required.
What is it like working on an app with thousands of lines of code with dozens of other developers? We'll talk about the tools, techniques and traps of iOS software development on big projects. Bring your own war stories to share!
Metal gives us access to the GPU for both 3D graphics and number crunching. We will go over the basics of Metal shaders, the differences between graphics and compute shaders, and how to combine them in a cool physics-based simulation and visualization.
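A Metal compute shader has a different entry point and dispatch model than a graphics shader but shares the same language and buffers. A minimal illustrative kernel (not the session's actual code) for the kind of physics step described above might look like:

```metal
// Illustrative Metal compute kernel: each GPU thread advances one
// particle by a simple Euler integration step.
#include <metal_stdlib>
using namespace metal;

struct Particle {
    float2 position;
    float2 velocity;
};

kernel void stepParticles(device Particle *particles [[buffer(0)]],
                          constant float  &dt        [[buffer(1)]],
                          uint id [[thread_position_in_grid]])
{
    Particle p = particles[id];
    p.velocity.y -= 9.8 * dt;          // apply gravity
    p.position += p.velocity * dt;     // Euler position update
    particles[id] = p;
}
```

The resulting particle buffer can then be handed directly to a vertex shader for visualization without ever leaving the GPU.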
Swift can be a great language for dealing with web requests, whether from within your app or on the server side. We'll take a look at some of the open source frameworks for implementing web services with Swift and peek under the covers to see how they work. Specific things we'll look at include serving static files, generating web pages, and proxying calls to another server.
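The core idea those frameworks share can be boiled down to a route table mapping paths to handlers. A toy sketch of the pattern (all names here are hypothetical, not from any specific framework):

```swift
// Toy sketch of request routing: map a path to a closure that produces
// the response body, with a fallback for unknown paths.
typealias Handler = () -> String

struct Router {
    private var routes: [String: Handler] = [:]

    mutating func get(_ path: String, handler: @escaping Handler) {
        routes[path] = handler
    }

    // Look up the handler for a path, or answer with a 404 body.
    func respond(to path: String) -> String {
        routes[path]?() ?? "404 Not Found"
    }
}

var router = Router()
router.get("/") { "<html><body><h1>Hello from Swift!</h1></body></html>" }
router.get("/health") { "OK" }

print(router.respond(to: "/health"))    // OK
print(router.respond(to: "/missing"))   // 404 Not Found
```

Real frameworks layer HTTP parsing, async I/O and middleware on top, but the dispatch step looks much like this.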
We will walk through the design and coding, in Swift and then in Metal, of a ray tracer — a program to make images by computing how light bounces around a scene and into the eye. How will the expressive, functional, protocol-oriented Swift ray tracer compare with the one written for screaming performance in C++-oriented Metal?
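The geometric heart of any ray tracer is the ray-object intersection test. A sketch of the sphere case in plain Swift (an assumed formulation, not the session's code):

```swift
// Minimal vector type with just the operations the intersection test needs.
struct Vec3 {
    var x, y, z: Double
    static func - (a: Vec3, b: Vec3) -> Vec3 {
        Vec3(x: a.x - b.x, y: a.y - b.y, z: a.z - b.z)
    }
    func dot(_ b: Vec3) -> Double { x * b.x + y * b.y + z * b.z }
}

// Solve |origin + t*direction - center|^2 = radius^2 via the quadratic
// formula; return the nearest t > 0, or nil for a miss.
func intersect(origin: Vec3, direction: Vec3,
               center: Vec3, radius: Double) -> Double? {
    let oc = origin - center
    let a = direction.dot(direction)
    let b = 2 * oc.dot(direction)
    let c = oc.dot(oc) - radius * radius
    let discriminant = b * b - 4 * a * c
    guard discriminant >= 0 else { return nil }        // ray misses the sphere
    let t = (-b - discriminant.squareRoot()) / (2 * a) // nearer root
    return t > 0 ? t : nil
}

// A ray fired down -z from the origin hits a unit sphere at z = -5 at t = 4.
let hit = intersect(origin: Vec3(x: 0, y: 0, z: 0),
                    direction: Vec3(x: 0, y: 0, z: -1),
                    center: Vec3(x: 0, y: 0, z: -5), radius: 1)
```

The Metal version runs the same math per pixel in a shader, which is where the performance comparison gets interesting.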
This hands-on course will introduce attendees to the fundamentals of 3D Graphics on iOS using SceneKit and a bit of Metal. No previous graphics programming experience is necessary but students should be familiar with Swift programming.
The morning will focus on introducing SceneKit and how to create 3D scenes and camera motion with code and external 3D resource files.
In the afternoon we will learn about Metal and OpenGL, and how shader programs can be used for custom rendering. We will end the day exploring the cool effects you can do with SceneKit, and attendees will have the chance to try a number of simple projects demonstrating them.
There will be a math refresher during lunch, covering the basics of vectors and matrix multiplication with an eye towards understanding lighting calculations and transformation matrices.
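The refresher material boils down to a few identities like the following (generic statements of the standard formulas, not the course's exact notes):

```latex
% Matrix-vector multiplication: each output entry is the dot product of a
% matrix row with the vector.
\begin{bmatrix} a & b \\ c & d \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
=
\begin{bmatrix} ax + by \\ cx + dy \end{bmatrix}

% The 2D rotation matrix, the simplest transformation matrix:
R(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}

% Lambertian (diffuse) lighting is a single dot product between the unit
% surface normal N and the unit light direction L:
I_{\text{diffuse}} = \max\left(0,\; \mathbf{N} \cdot \mathbf{L}\right)
```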
SourceKitten, an open source wrapper around Apple's SourceKit, is a great tool for parsing your Swift code and generating useful materials from it like documentation or dependency graphs. We'll go into details of how it works, how to use it, and what we can do with it on a large code base.
Apple's 3D libraries have added a lot of interesting effects and rendering algorithms over the last few years. In this talk we'll look into the procedures behind the pictures and what kind of tricks go into generating these nifty 3D images. No previous 3D graphics programming experience required.
Details coming soon!