Wednesday, May 14, 2014

Introducing DynamicXray

Today I would like to introduce DynamicXray, a UIKit Dynamics runtime visualisation and introspection library for iOS.




Overview

If you have done any work with UIKit Dynamics you may have found that it can be difficult to debug and fine tune the behaviour. UIKit Dynamics is driven by a rigid body physics engine (Box2D in fact) which gives the developer a lot of power, but the abstract nature of the framework can make it difficult to see under the hood of the physics simulation.

DynamicXray provides a live overlay of all dynamic behaviours and dynamic items at runtime, allowing you to visualise and introspect the underlying physics simulation as it is running. DynamicXray works on both devices and simulators.




Integration

DynamicXray is implemented as a UIDynamicBehavior. This means it can be simply added to any UIDynamicAnimator to enable the introspection overlay. By default, all behaviours added to the animator will be visualised.

For example:


#import <DynamicXray/DynamicXray.h>
...
DynamicXray *xray = [[DynamicXray alloc] init];
[self.dynamicAnimator addBehavior:xray];


For more control, the DynamicXray behaviour exposes options such as temporarily disabling the overlay, adjusting the cross fade between the app and the overlay, toggling dynamic item outlines, and more.


Features

DynamicXray features include:

  • Easy and controllable integration. Simply add the DynamicXray behaviour to your dynamic animator.
  • All UIKit Dynamic behaviours are visualised, including collision boundaries.
  • Visually differentiate between springy and rigid attachment behaviours.
  • Push behaviours are visualised by arrows representing the location, magnitude and direction of the push force.
  • Snap behaviours are visualised by arrows showing where the item is snapping to.
  • Gravity behaviours are visualised by an overlay showing magnitude and direction.
  • All dynamic item bodies in the scene are visualised.
  • Any contacts between dynamic items and other items or collision boundaries are highlighted.
  • Configurable overlay cross fade control for fading anywhere from 100% application to 100% DynamicXray overlay.
  • Built-in configuration panel for user to control run-time options.

Configuration Panel

As mentioned above, DynamicXray includes a built-in configuration panel to allow users to control some of the options at runtime. The configuration panel can be presented by calling -[DynamicXray presentConfigurationViewController].

For example:

DynamicXray *xray = [[DynamicXray alloc] init];
[self.dynamicAnimator addBehavior:xray];
[xray presentConfigurationViewController];



DynamicXray Catalog

The DynamicXray source repository includes a universal iOS app containing various UIKit Dynamics demonstrations. The demos in DynamicXray Catalog were created by various authors and are all open source. The demos include DynamicXray pre-loaded so introspection can be enabled on any demo to see the inner workings.

If you are interested in seeing more uses of UIKit Dynamics then take a look at DynamicXray Catalog. UIKit Pinball in particular is an interesting example of why not to use UIKit Dynamics for games.

I plan to continue adding interesting demos as I find them. Please submit a pull request if you would like to contribute a UIKit Dynamics demo to the catalog.


Videos

See more DynamicXray demonstration videos on YouTube.


Get It

DynamicXray is free and open source.

Download the DynamicXray framework and learn more at http://dynamicxray.net/

Download the source code from https://github.com/chrismiles/DynamicXray

Follow @DynamicXray and @chrismiles on Twitter.

Friday, May 2, 2014

Skala Color: a modern OS X colour picker for designers and developers

Recently I worked with the awesomely smart team at Bjango to help them build a new Mac tool. Here it is, Skala Color.

Skala Color is a Mac OS X colour picker plugin. When installed, it sits right beside the built-in colour pickers that everyone is used to, like the colour wheel and the crayon selector.

What sets Skala Color apart is its attention to detail. Skala Color was built for the modern Mac OS X environment, with Retina display crispness and carefully crafted animations for a smooth experience.

Skala Color was designed by Marc Edwards to cater for modern designers and developers. Colour selection can be made entirely within the main gradient area by dragging a circular loupe. A vertical drag selects brightness; a horizontal drag selects saturation. Dragging the small handle around the loupe selects the hue.

For finer control, hue can also be selected by the slider below the main gradient area. The slider has a clever fine control mode which becomes visible when dragging the handle. Drag the handle up into the fine control area to get 4x precision hue selection.
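To illustrate the idea (a hypothetical sketch of the concept only, not Skala Color's actual code), a 4x fine-control mode amounts to a simple scaling of drag movement:

```c
/* Map a horizontal drag delta (in points) to a hue delta in the 0..1
   range. In fine-control mode the movement is scaled down 4x, giving
   the extra precision described above. Hypothetical sketch only. */
double hue_delta(double drag_dx, double slider_width, int fine_mode)
{
    double delta = drag_dx / slider_width;
    return fine_mode ? delta / 4.0 : delta;
}
```

Dragging 40 points across a 160-point slider would move the hue by 0.25 normally, but only 0.0625 in fine mode.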

When opacity selection is available, a second fine control slider is presented below the hue slider, allowing the same precision selection of opacity level.

Below the colour/opacity selection controls, some real smarts come into play. Alongside preset black and white buttons, a third button is shown if the system clipboard contains any recognisable colours. Skala Color is clever about parsing colours from the clipboard. Many text formats are recognisable just by copying them to the clipboard. Here are some example text formats that it will recognise and automatically convert to colours:

  • #FF7A18
  • rgba(255, 122, 24, 1)
  • [UIColor colorWithRed:0.999 green:0.479 blue:0.095 alpha:1]
  • [NSColor colorWithDeviceHue:0.071 saturation:0.905 brightness:0.999 alpha:1]
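
To give a feel for the kind of parsing involved (a minimal illustration in C, not Skala Color's actual implementation), recognising the hex-triplet format might look like:

```c
#include <stdlib.h>
#include <string.h>
#include <ctype.h>

/* Parse a "#RRGGBB" hex string into 0-255 RGB components.
   Returns 1 on success, 0 if the string is not a valid hex triplet. */
int parse_hex_colour(const char *s, int *r, int *g, int *b)
{
    if (s == NULL || s[0] != '#' || strlen(s) != 7)
        return 0;
    for (int i = 1; i < 7; i++)
        if (!isxdigit((unsigned char)s[i]))
            return 0;
    unsigned int rgb = (unsigned int)strtoul(s + 1, NULL, 16);
    *r = (rgb >> 16) & 0xFF;
    *g = (rgb >> 8) & 0xFF;
    *b = rgb & 0xFF;
    return 1;
}
```

For example, "#FF7A18" parses to (255, 122, 24). Formats like the rgba() and UIColor/NSColor expressions above would each need their own parser along similar lines.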

Similarly, colours can be exported to a variety of different text formats. All main colour formats that are useful to web and Cocoa developers are supported. As a developer, rather than a designer, this is a feature I use a lot.

Bjango has released Skala Color for free, to help promote their upcoming product Skala. I urge you to check it out.

Tuesday, February 4, 2014

OpenGL ES for iOS Presentation Videos

At the last Swipe Conference I presented two talks on OpenGL ES for iOS developers. I have now published these presentation videos for anyone to watch. I hope they might be useful resources for the iOS community.

The first presentation, "Part 1: Learning to draw", was aimed at OpenGL noobs. I introduced the basics of OpenGL programming, rendering triangles, applying colours and lighting, and animating vertices. All using GLKit, which was introduced in iOS 5.  I also very briefly introduced OpenGL ES 2.0 shader programming and then used all the concepts to dissect Xcode's iOS OpenGL template project.

The second presentation, "Part 2: Rendering a masterpiece", covered more interesting rendering effects that GLKit makes very easy, such as textures, skyboxes and reflection maps. I demonstrated the tools that Apple provide for debugging, analysing and profiling OpenGL applications. I ended with a practical demonstration of some fancy custom storyboard segue transitions using OpenGL ES.

Even though the presentations were given in late 2012, all the material is still very relevant as the talks focussed on OpenGL ES 2.0 and GLKit, still the primary OpenGL technologies in iOS today. Recently, iOS 7 introduced support for OpenGL ES 3. However, OpenGL ES 3 requires the very latest hardware and so in practice you probably won't be adopting OpenGL ES 3 immediately or, if you do, will need an OpenGL ES 2.0 fallback path anyway.

You can watch the videos on YouTube:

Or watch them below where I have embedded them within the page, along with more details about what is covered.


OpenGL ES with iOS 5 - Part 1: Learning to draw by Chris Miles (Swipe Conference 2012)




An introduction to OpenGL ES and GLKit, aimed at iOS developers new to OpenGL programming. Presented at Swipe Conference 2012 by Chris Miles.

In the talk I cover:

* Setting up an OpenGL ES scene using GLKViewController + GLKView
* Rendering triangles (GL_TRIANGLES) and meshes made of triangles
* Applying vertex colours, using GLKBaseEffect
* Applying lighting, using GLKBaseEffect
* Using Vertex Array Objects (VAO) and Vertex Buffer Objects (VBO)
* Using interleaved vertex arrays (IVA)
* Animating vertex positions
* Very brief introduction to OpenGL ES 2.0 shader programming
* Dissection of Xcode's iOS OpenGL template project

To explain many of the concepts I use a small demo app, SwipeOpenGLTriangles. The full source to the demo app is released open source (MIT licensed) at https://github.com/chrismiles/SwipeOpenGLTriangles

The slides from the talk are available at https://speakerdeck.com/chrismiles/opengl-es-with-ios-5-part-1-learning-to-draw or http://chrismiles.info/presentations/SwipeConf-2012-OpenGL-ES-iOS5/Swipe-2012-OpenGL-ES-iOS5-Part1.pdf [PDF].


OpenGL ES with iOS 5 - Part 2: Rendering a masterpiece by Chris Miles (Swipe Conference 2012)




This is the second talk of two about OpenGL ES and GLKit, aimed at iOS developers. This talk covers rendering effects in OpenGL using GLKit, looking at the OpenGL debugging and profiling tools that ship with Xcode, and demonstrating how OpenGL can be used for some fancy segue transitions. Presented at Swipe Conference 2012 by Chris Miles.

In more detail, this talk covers:

* Rendering textured triangles using GLKTextureLoader and GLKBaseEffect;
* Creating cubemaps using GLKTextureLoader;
* Rendering skyboxes using GLKSkyboxEffect;
* Rendering reflection map effects using GLKReflectionMapEffect;
* Demonstration of the Xcode OpenGL ES frame debugger;
* Demonstration of the OpenGL ES Driver and Analyzer instruments;
* Demonstration of the OpenGL ES Performance Detective;
* Performance recommendations specific to OpenGL ES on iOS devices;
* Demonstration of some fancy custom storyboard segue transitions using OpenGL ES

The slides from the talk are available at https://speakerdeck.com/chrismiles/opengl-es-with-ios-5-part-2-rendering-a-masterpiece or http://chrismiles.info/presentations/SwipeConf-2012-OpenGL-ES-iOS5/Swipe-2012-OpenGL-ES-iOS5-Part2.pdf [PDF]

The demo apps used in the talk are all released open source.

SwipeOpenGLTriangles demonstrates rendering textured triangles  -- https://github.com/chrismiles/SwipeOpenGLTriangles

Swipe3D demonstrates GLKSkyboxEffect, GLKReflectionMapEffect, cubemap textures and indexed vertices -- https://github.com/chrismiles/Swipe3D

FancySegue shows how to build custom segue transitions using OpenGL -- https://github.com/chrismiles/FancySegue


I want to thank the Swipe Conference organisers for allowing me to edit and publish the videos myself.

Tuesday, January 14, 2014

Building a native iOS graphing engine for Pocket Weather 4

One of my favourite iOS apps is Pocket Weather Australia by Shifty Jelly, a friendly little development team based in Adelaide. So it was an honour when they approached me to lend them a hand with the development of their new version, overhauled and optimised for iOS 7: Pocket Weather 4.

Requirements


Specifically, my scope was to build a new, modern graphing engine for them, to replace the legacy graphing code they had been using for years. An app like Pocket Weather lives or dies on (a) the quality of its forecast data, and (b) rich visualisations of that data. Pocket Weather does both really well, visualising forecast data in a multitude of forms to cover many use cases. I couldn't help with their forecast data quality (which was already very high, and even better in PW4) but I was happy to build a modern, high-performance, interactive graphing engine to cover their visualisation needs.

The graphing engine needed to support both static graphs as well as smooth scrolling interactive graphs, to cover all the visualisations that Pocket Weather 4 required. On iPhone, five main graphs were needed:
  • "Today": a static graph charting both air and "feels like" temperature forecasts, with "chance of rain" overlays.
  • "Detailed Outlook": a static graph charting forecast for the next two days, similar to the Today graph.
  • "Tide Times": a line chart of tide low/high forecast times, with edges nicely smoothed out.
  • "Interactive Timeline": a scrolling graph charting air and "feels like" temperature forecasts, plus rain overlays, for the next 7 days.
  • "Interactive Tide Times": a scrolling graph of tide low/high forecast tides for the next few weeks.

Screenshots of two of the static graphs can be seen above, near the top. The interactive graphs are shown below.

Solution


There are a few iOS graphing libraries around that produce good results, both commercial and open source. However, third party library dependencies are always a trade-off between convenience and managing extra baggage, and the ability (and cost) to customise them to meet requirements varies quite a bit. Based on the requirements, my preference was to hand-roll a graph rendering engine using the native Core Graphics and Core Animation frameworks, rather than bring in a third party graphing library. Having already done this a few times for previous projects, I was confident that we could get good results in a short time. This matched Shifty Jelly's preference too, as they would end up with modern code that they could own, manage and continue to customise themselves.


Technical


A common factor between all the required graphs was that they were all line graphs, charting either one or two lines of data, drawn with various styling options. As such, it made sense to encapsulate line chart rendering in a class configurable enough to be shared by all the graph views. Because some views would be drawing into UIView contexts (for static graphs) and others into contexts at the CALayer level (for scrolling graphs; more on this below), the chart renderer could be a shared NSObject subclass with no knowledge of the UIKit or CALayer levels. The chart renderer would talk Core Graphics, accepting a CGContext to render into.

Above is an abstract diagram of the class architecture. Each graph view is a UIView subclass, with common elements abstracted into shared base classes, down to one base class for static graphs, and one for interactive graphs. Each graph view contains a chart renderer instance, configured appropriately for the particular graph type. Chart renderer objects were ultimately responsible for actually drawing the line graphs into specified CGContexts.

The static graph views were relatively straightforward. On "needs display", their drawRect: method delegates drawing to the chart renderer, passing down the CGContext to draw into.

The interactive graph views were a little more complex. They were very wide scrollable graphs. Rendering such a large scrollable view in one go is not possible, and would be very inefficient even if it were. The answer was to use a CATiledLayer to render the graph into tiles, on demand as the view is scrolled. CATiledLayer works great for this. Or would have, if it wasn't severely broken in iOS 7. In the end I had to roll my own CATiledLayer clone to get the job done, but the result was the same (more on this below).

Shown here are two images of an example scrollable graph view. The first image shows what the graph looks like rendered on screen. The second shows the view explosion, with all the subviews Revealed. You can see the large scrollable view, which cannot be snapshotted by Reveal, but which at runtime is rendered as horizontal tiles. The tiling layer renders tiles on demand, only drawing tiles as they become visible. Even then, rendering is performed asynchronously so as not to interrupt scrolling performance. Rendered tiles are then cached so they don't have to be drawn again unless necessary.
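The visible-tile calculation at the heart of such a tiling layer is simple; here is a hypothetical C sketch (not the actual Pocket Weather code) of mapping a scroll offset to the tile indices that need rendering:

```c
/* Compute the inclusive range of horizontal tile indices needed to
   cover the visible viewport of a very wide scrolling graph. A
   hypothetical sketch of the on-demand tiling logic; assumes
   offset_x >= 0 and integral point coordinates. */
typedef struct { int first; int last; } TileRange;

TileRange visible_tiles(double offset_x, double viewport_w, double tile_w)
{
    TileRange r;
    r.first = (int)(offset_x / tile_w);
    r.last  = (int)((offset_x + viewport_w - 1.0) / tile_w);
    return r;
}
```

Tiles outside the returned range can be skipped or purged, and rendered tiles cached by index so scrolling back over them costs nothing.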




Challenges


As mentioned above, CATiledLayer is unfortunately broken in iOS 7.0 (rdar://15402066). It suffers a severe problem of not redrawing purged tiles in some cases. Easily reproducible and not shippable. After a discussion with Apple engineers via a TSI I concluded that the only practical current solution was to avoid CATiledLayer and write my own equivalent. I did that and it worked well for our case.

Another interesting challenge was the requirement to smooth the curves of the tide graphs, while also tracking the curved line path with a circular overlay when the graph is being scrolled. See the video below for an example of what I mean.



The tide data was a simple data set of low tide/high tide forecasts, in time order. For example: [low tide, high tide, low tide, high tide, ...]. By default the data would be graphed with sharp edges, as shown below in the first image. Shifty Jelly's rockstar designers understandably requested that the line be smoothed out, as shown in the second image below.

Techniques to smooth curves in a line are fairly well known and I used a Catmull-Rom spline to smooth the line.

The Catmull-Rom spline generated enough points to create a CGPath for Core Graphics to draw a smooth line. However, it didn't generate enough points to calculate the exact Y position needed for the circular overlay for every X position of the graph. The solution here was to interpolate between the curved line points to generate a set of (X, Y) points with just enough granularity for every X position of the graph. This calculation is done once for a data set + graph layout configuration and cached. Then, while scrolling, a lookup of overlay Y position for any X position is direct and instant, allowing the circular overlay to track the graph line.
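For the curious, the Catmull-Rom formula for one segment is compact enough to show in full. A C sketch (per-axis form, with p1 and p2 the segment endpoints and p0, p3 their neighbouring control points):

```c
/* Catmull-Rom interpolation for a single segment p1..p2, with p0 and
   p3 as neighbouring control points. t runs from 0 (at p1) to 1 (at
   p2). Apply per axis (once for X, once for Y) to smooth a 2D line. */
double catmull_rom(double p0, double p1, double p2, double p3, double t)
{
    double t2 = t * t, t3 = t2 * t;
    return 0.5 * ((2.0 * p1)
                + (-p0 + p2) * t
                + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
                + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t3);
}
```

Evaluating this at small steps of t between each pair of data points produces the smooth curve; the generated points can then be linearly interpolated to build a per-X lookup table like the one described above.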

Summary


While hand-rolling a graph engine may sound like a lot of work, in practice, with Core Graphics and Core Animation doing all the heavy lifting, the amount of code required is not terribly large.

We were all very pleased with the results we got from building a native graph engine ourselves. With the engine now built and shipped, modifications are fast and easy to do as the library is clean and well understood.

The requirements for the iPad interface were not known to me while building the graph engine. In fact, I didn't even know how the iPad build would look until PW4 shipped. I was pleased to see that the iPad interface was built around the scrolling forecast graph, very cleverly designed. The graphing engine scaled up to the iPad interface with little extra effort.

A big thanks to Shifty Jelly for bringing me on-board for this part of the project, it was a pleasure to work with them.






Thursday, September 26, 2013

EasyRes

Announcing EasyRes, a Mac OS X fast screen resolution switcher with live animated previews.

EasyRes is my new Mac app, a little utility I developed for myself, before polishing it up and releasing it in the Mac App Store. I found that I was switching resolutions a lot more frequently on the Retina MacBook Pro, which has more usable resolutions due to the high pixel density. Not impressed with the quality of the existing apps out there, I wrote my own.



Although a relatively simple app, which aims to do one thing right, it does include a bunch of useful features:
  • Live animated previews of how windows will be sized for each screen resolution by simply mousing over the menu.
  • Quick access to screen resolutions from the menu bar.
  • Resolutions and previews are shown for all active screens.
  • Retina smart: Resolutions are grouped by Retina and non-Retina modes.
  • HDTV smart: TV resolutions such as 1080p, 1080i, 720p are all listed when available, including refresh rates such as 50Hz/60Hz, making it easy to find the right HDTV resolution.
  • Recently selected resolutions are remembered for each screen.
  • User-friendly labels are displayed beside resolutions, such as "Best for Retina Display", "Native", "1080p NTSC".
  • Labels can be added and customised for any resolution on any screen, making it easy to find your favourite resolutions.
  • Option to automatically launch at login.

You can see EasyRes in action in a YouTube video demo.


Friday, June 7, 2013

Reveal Public Beta



My friends at Itty Bitty Apps have released an awesome new tool for iOS developers. Reveal is a Mac OS X application for introspecting iOS apps at runtime. It allows you to visualise and modify view hierarchies in real-time. The view structure can be "exploded" in 3D, with powerful tools for drilling down and isolating the parts of the hierarchy that need attention.

I have been helping the Itty Bitty Apps team develop Reveal for the past 6 months or so. My focus has been on the 3D view hierarchy explosion visualisation, using Scene Kit & OpenGL. We are working hard on getting it all polished and looking awesome for the 1.0 release.

Reveal is currently free during the public beta phase. Take a look at http://revealapp.com/



Wednesday, March 20, 2013

Bluetooth LE with CoreBluetooth Presentation

At February's Melbourne Cocoaheads I gave a presentation about integrating Bluetooth LE peripherals with iOS devices using Apple's Core Bluetooth framework.

Bluetooth LE is the new Bluetooth Low Energy (aka Bluetooth Smart) protocol introduced as part of Bluetooth 4.0. Apple embraced Bluetooth LE early on, adding support for it in the iPhone 4S and almost every Apple device since then, including iPads and Macs.



In the presentation I demonstrated an example iPhone app that connected to two Bluetooth LE peripherals simultaneously. It connected to a Wahoo Fitness Blue HR heart rate strap, measuring my heart rate live as I gave the talk. It also connected to a TI SensorTag, measuring both temperature and acceleration of the device. You'll see why when you watch the presentation.

The TI SensorTag is a great little Bluetooth LE device, containing a bunch of sensors that can all be read over a Bluetooth LE connection. It is handy for development and only costs US$25 direct from Texas Instruments. The sensors include:

  • Temperature (IR and ambient)
  • Humidity
  • Pressure
  • Accelerometer
  • Gyroscope
  • Magnetometer
  • plus 2 digital buttons

By the way, the technical issue I had with the SensorTag during the presentation was fixed by TI in a firmware update (released the morning after my presentation...).

I haven't yet released the source code to the Coffee Addict demo app I used in the talk. I hope to do so in the near future, once I tidy up a few things.

You can watch my presentation on Vimeo:
Bluetooth LE with CoreBluetooth.

You can view the presentation slides on Speaker Deck.



Monday, December 10, 2012

Introducing CMUnistrokeGestureRecognizer

How would you go about recognising a gesture like this star shape in an iOS app?
This was the problem posed to me recently while working on a project. I knew I wasn't the first to need to solve this type of problem. For example, the popular Infinity Blade series of iOS games used a shape drawing gesture for spell casting.


Looking further back into the past, you might remember Palm OS Graffiti used a similar gesture recognition technique for text input.
Something that all these gestures have in common is the requirement to recognise shapes drawn from a single path. That is, the path drawn by a user between putting their finger down and lifting their finger up.

Knowing this problem had obviously been solved, I began researching techniques for single path recognition and that's when I found...


$1 Unistroke Recognizer

Created by three clever chaps at the University of Washington back in 2007, the $1 Unistroke Recognizer was designed to recognise single path (unistroke) gestures, exactly what I was looking for. Not only that, but the design goals for the technique make it an ideal candidate for use in mobile applications:


  • Resilience to movement & sampling speed
  • Rotation, scale, position invariance
  • No advanced maths required (e.g., matrix inversions, derivatives, integrals)
  • Easily implemented with few lines of code
  • Fast enough for interactive use
  • Define gestures with a minimum of only one example
  • Return top results ordered by score [0-1]
  • Provide recognition rates competitive with more complex algorithms

A pretty bold set of requirements, but it looks like they were able to achieve them all. The creators published their paper describing their technique and included full pseudocode. Their project website also includes a demo implementation written in JavaScript so you can test it out live in the browser.
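To give a flavour of the technique, here is a C sketch of two of the normalisation steps the paper describes: non-uniform scaling of the point set's bounding box to a reference square, then translating its centroid to the origin. This is my illustration, not the paper's pseudocode, and it assumes a non-degenerate path (non-zero width and height):

```c
typedef struct { double x, y; } Point;

/* Normalisation steps from the $1 technique: scale the bounding box
   to a reference square (non-uniformly), then translate the centroid
   to the origin. Assumes the path has non-zero width and height. */
void normalise(Point *pts, int n, double square_size)
{
    double minx = pts[0].x, maxx = pts[0].x;
    double miny = pts[0].y, maxy = pts[0].y;
    for (int i = 1; i < n; i++) {
        if (pts[i].x < minx) minx = pts[i].x;
        if (pts[i].x > maxx) maxx = pts[i].x;
        if (pts[i].y < miny) miny = pts[i].y;
        if (pts[i].y > maxy) maxy = pts[i].y;
    }
    double w = maxx - minx, h = maxy - miny;
    double cx = 0.0, cy = 0.0;
    for (int i = 0; i < n; i++) {
        pts[i].x *= square_size / w;
        pts[i].y *= square_size / h;
        cx += pts[i].x;
        cy += pts[i].y;
    }
    cx /= n;
    cy /= n;
    for (int i = 0; i < n; i++) {
        pts[i].x -= cx;
        pts[i].y -= cy;
    }
}
```

These steps are what buy the scale and position invariance listed above; rotation invariance comes from a separate rotate-to-indicative-angle step.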

$1 Unistroke Recognizer website
$1 Unistroke Recognizer paper (PDF)


CMUnistrokeGestureRecognizer

CMUnistrokeGestureRecognizer is my port of the $1 Unistroke Recognizer to iOS. I'm not the first to implement this recogniser in Objective-C but none of the existing implementations met my requirements. I wanted the $1 Unistroke Recognizer to be fully contained within a UIGestureRecognizer, with as simple an API as possible.

So the CMUnistrokeGestureRecognizer implements the $1 Unistroke Recognizer as a UIGestureRecognizer. It features:

  • Recognition of multiple gestures
  • Standard UIGestureRecognizer callback for success
  • Template paths defined by UIBezierPath objects
  • Optional callbacks for tracking path drawing and recognition failure
  • Configurable minimum recognition score threshold
  • Option to disable rotation normalisation
  • Option to enable the Protractor method for potentially faster recognition

The core recognition algorithm is written in C and is mostly portable across platforms. I say "mostly" as it uses GLKVector functions from the GLKit framework for optimal performance on iOS devices. GLKMath functions take advantage of hardware acceleration such as the ARM NEON SIMD extensions, so I like to use them. It wouldn't take much work to substitute the vector functions if someone wanted to use the core C implementation on another platform.

The CMUnistrokeGestureRecognizer implementation sits on top of the core C library and provides the Objective-C/UIKit interface.

To use it, add the CMUnistrokeGestureRecognizer project to your own as a subproject and add the library to your target. In your source file, import the main header:

#import <CMUnistrokeGestureRecognizer/CMUnistrokeGestureRecognizer.h>

In your code, define one or more paths to be recognised. Create an instance of CMUnistrokeGestureRecognizer, register your paths, then add it to a view.

Your callback method will be called whenever a gesture is successfully matched against the template paths you registered.

Here's an example of the key points:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Define a path to be recognised
    UIBezierPath *squarePath = [UIBezierPath bezierPath];
    [squarePath moveToPoint:CGPointMake(0.0f, 0.0f)];
    [squarePath addLineToPoint:CGPointMake(10.0f, 0.0f)];
    [squarePath addLineToPoint:CGPointMake(10.0f, 10.0f)];
    [squarePath addLineToPoint:CGPointMake(0.0f, 10.0f)];
    [squarePath closePath];
    
    // Create the unistroke gesture recogniser and add to view
    CMUnistrokeGestureRecognizer *unistrokeGestureRecognizer = [[CMUnistrokeGestureRecognizer alloc] initWithTarget:self action:@selector(unistrokeGestureRecognizer:)];
    [unistrokeGestureRecognizer registerUnistrokeWithName:@"square" bezierPath:squarePath];
    [self.view addGestureRecognizer:unistrokeGestureRecognizer];

}

- (void)unistrokeGestureRecognizer:(CMUnistrokeGestureRecognizer *)unistrokeGestureRecognizer
{
    // A stroke was recognised
    
    UIBezierPath *drawnPath = unistrokeGestureRecognizer.strokePath;
    CMUnistrokeGestureResult *result = unistrokeGestureRecognizer.result;
    NSLog(@"Recognised stroke '%@' score=%f bezier path: %@", result.recognizedStrokeName, result.recognizedStrokeScore, drawnPath);
}

See the included demo app for a more detailed example. The demo includes all the template shapes used by the original creators in their own JavaScript demo. The demo app allows you to test the recognition engine, as well as create new template shapes and export shapes out as code for inclusion in your own projects. The demo app is universal for both iPhone and iPad.



CMUnistrokeGestureRecognizer is open source, released under an MIT license. Get it from https://github.com/chrismiles/CMUnistrokeGestureRecognizer

I look forward to seeing what developers create with it.

Thursday, October 18, 2012

OpenGL ES with iOS 5 Part 2: Rendering a masterpiece – Swipe Conference 2012

At September's Swipe Conference I gave two talks on OpenGL with iOS. The first talk, "OpenGL ES with iOS 5 Part 1: Learning to draw" was an introduction to OpenGL ES and GLKit. The second talk covered rendering effects in OpenGL using GLKit, looking at the OpenGL debugging and profiling tools that ship with Xcode, and demonstrating how OpenGL can be used for some fancy segue transitions.
In more detail, my talk "OpenGL ES with iOS 5 Part 2: Rendering a masterpiece" covered:

  • Rendering textured triangles using GLKTextureLoader and GLKBaseEffect;
  • Creating cubemaps using GLKTextureLoader;
  • Rendering skyboxes using GLKSkyboxEffect;
  • Rendering reflection map effects using GLKReflectionMapEffect;
  • Demonstration of the Xcode OpenGL ES frame debugger;
  • Demonstration of the OpenGL ES Driver and Analyzer instruments;
  • Demonstration of the OpenGL ES Performance Detective;
  • Performance recommendations specific to OpenGL ES on iOS devices;
  • Demonstration of some fancy custom storyboard segue transitions using OpenGL ES
The slides from the talk are available at https://speakerdeck.com/chrismiles/opengl-es-with-ios-5-part-2-rendering-a-masterpiece or http://chrismiles.info/presentations/SwipeConf-2012-OpenGL-ES-iOS5/Swipe-2012-OpenGL-ES-iOS5-Part2.pdf [PDF]

The demo apps used in the talk are all released open source.

SwipeOpenGLTriangles demonstrates rendering textured triangles  – https://github.com/chrismiles/SwipeOpenGLTriangles

Swipe3D demonstrates GLKSkyboxEffect, GLKReflectionMapEffect, cubemap textures and indexed vertices – https://github.com/chrismiles/Swipe3D

FancySegue shows how to build custom segue transitions using OpenGL – https://github.com/chrismiles/FancySegue
All the sample apps are universal and support all orientations.

Also see my post about the first talk: OpenGL ES with iOS 5 Part 1: Learning to draw – Swipe Conference 2012.

Update: the presentation video is now available online at https://www.youtube.com/watch?v=dkqBjsEpt5g

Tuesday, October 2, 2012

OpenGL ES with iOS 5 Part 1: Learning to draw – Swipe Conference 2012

In September I presented two talks at Swipe Conference in Sydney. The first talk, "OpenGL ES with iOS 5 Part 1: Learning to draw", was an introduction to OpenGL ES and GLKit, aimed at iOS developers new to OpenGL programming.
In the talk I used a simple demo app, SwipeOpenGLTriangles, to demonstrate OpenGL ES rendering concepts with GLKit, such as:


  • Setting up an OpenGL ES scene using GLKViewController + GLKView
  • Rendering triangles (GL_TRIANGLES) and meshes made of triangles
  • Applying vertex colours, using GLKBaseEffect
  • Applying lighting, using GLKBaseEffect
  • Applying texturing, using GLKBaseEffect and GLKTextureLoader
  • Using Vertex Array Objects (VAO) and Vertex Buffer Objects (VBO)
  • Using interleaved vertex arrays (IVA)
  • Animating vertex positions (tap screen to animate between flat triangles and 3D open box shape)

The sample app is universal and supports all orientations.




The full source to the demo app is released open source (MIT licensed) at https://github.com/chrismiles/SwipeOpenGLTriangles


The slides from the talk are available at https://speakerdeck.com/chrismiles/opengl-es-with-ios-5-part-1-learning-to-draw or http://chrismiles.info/presentations/SwipeConf-2012-OpenGL-ES-iOS5/Swipe-2012-OpenGL-ES-iOS5-Part1.pdf [PDF].

Update: The presentation video is now online at https://www.youtube.com/watch?v=s6VCaFQFBtM

Friday, May 18, 2012

CMTraerPhysics CocoaHeads Presentation

In March I gave a presentation at Melbourne CocoaHeads about my open source project CMTraerPhysics. CMTraerPhysics is a spring physics engine that I ported to Objective-C/Cocoa, along with some interesting demos for iOS.

Watch "Chris Miles presents CMTraer Physics" on Vimeo (embedded below if your browser supports it).




See the slides at https://speakerdeck.com/chrismiles/cmtraerphysics-melbourne-cocoaheads-march-2012

Thursday, May 10, 2012

Announcing EZForm 1.0 - iOS form handling & validation library

Announcing EZForm 1.0, my open source form handling and validation library for iOS.


The primary goal of EZForm is to simplify form handling in iOS apps, while not enforcing any constraints on the layout and design of the form UI.


EZForm is designed to be decoupled from your user interface layout, leaving you free to present your form UI any way you like. That doesn't mean EZForm won't integrate with your UI. You tell EZForm which of your controls and views you want to handle each form field, and EZForm will take care of input validation, input filtering and updating views when field values change.


EZForm features:
  • Form field types including: text, boolean, radio.
  • Block based validators. User-defined input validation rules can be added to fields as block objects. Some common validators are included with EZForm.
  • Block based input filters. Input filters control what can be entered by the user. For example, an input filter could be added to a text field to allow only numeric characters to be typed. Some common input filters are included with EZForm.
  • Standard input accessory and field navigation. A standard input accessory can be added to text fields by EZForm with one method call. It adds a bar to the keyboard with field navigation and done buttons, similar to Mobile Safari's input accessory. Navigation between fields is handled automatically by EZForm.
  • Automatic view scrolling to keep active text fields visible. With the option enabled, EZForm will adjust a scroll view, table view or arbitrary view to keep the text field being edited on screen and not covered by a keyboard.
  • Invalid field indicators. EZForm can automatically show invalid indicator views on text fields that fail validation. Invalid indicator views can be user-supplied or supplied by EZForm.
EZForm comes with full API documentation, which can be installed as an Xcode document set, for reading within Xcode's document viewer.

EZForm is convenient to use for both simple and complex forms. The source includes a demo app containing both simple and more complex form examples.


Get EZForm from https://github.com/chrismiles/EZForm