Using Augmented Reality Code for a Better Stretch Gesture

Loose Leaf is more about photos and imports than it is about drawing or sketching, and I wanted to make sure it was easy not only to cut and create new scraps, but also to manipulate and duplicate existing scraps. To make a copy of a photo or scrap, I thought through numerous gesture options, menus, long-press popups, buttons, and more, and in the end I settled on a simple pull-it-apart gesture.

So how’d it turn out? Here’s what it looks like to duplicate a photo scrap in Loose Leaf: just pull it apart into two identical pieces.

It’s a simple gesture to pull a photo into two copies. The stretch animation as you pull makes it obvious what’s happening, and then — snap! — it’s two copies!

The difficult piece of this gesture isn’t the copy itself; it’s the stretch animation as you pull the image apart. For a number of reasons, I needed to do this without any additional OpenGL rendering; the code had to stay solidly in normal UIKit. To make it work, I borrowed some technology that’s fairly common in Augmented Reality apps: it’s called homography.

The short description is that a homography can make any convex quadrilateral appear as any other convex quadrilateral; it basically lets me turn any four-sided shape into any other four-sided shape. Perfect! Scraps, even non-rectangular ones cut by scissors, are still modeled as simple rectangular UIViews. As I stretch, I can transform that rectangle into a parallelogram for that stretched effect.
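To make that concrete, here’s a small sketch (in Python, for illustration; the function names are mine, not Loose Leaf’s) of computing the 3×3 homography that maps one quad’s corners onto another’s, using the standard direct linear transform with the bottom-right entry fixed at 1:

```python
# Sketch: solve for the 3x3 homography H that maps each corner of one quad
# onto the matching corner of another. Pure Python for clarity.

def solve(A, b):
    """Gaussian elimination with partial pivoting for the small 8x8 system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """src, dst: four (x, y) corners each. Returns H as a 3x3 nested list."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Two equations per point correspondence, with h33 fixed to 1.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, pt):
    """Map a point through H, dividing out the projective w term."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Solving the small 8×8 system gives the matrix directly; library routines like OpenCV’s `getPerspectiveTransform` compute the same thing.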

Here’s the simple test application I used to fine-tune the animation with the above strategy:

The algorithm visible above goes through the following steps:

  1. create a quadrilateral from the four touch points on the scrap
  2. as the fingers move, create a second quadrilateral from the new locations of those same touches
  3. compute the homography between those two quadrilaterals to find the transform between them
  4. apply that transform to the scrap itself

Note that I’m not trying to calculate the transform from the UIView’s bounds to its new parallelogram; instead, I’m transforming the original quad of finger positions to the new quad of finger positions, then applying that transform to the view.
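One wrinkle when applying that transform to a view: UIKit’s CGAffineTransform can’t express perspective, but Core Animation’s CATransform3D can. A common recipe, sketched here as plain Python matrices (an illustration of the matrix layout, not Loose Leaf’s actual code), embeds the 3×3 homography into the layer’s 4×4 transform, with the perspective terms landing in m14 and m24:

```python
# Sketch: embed a 3x3 homography H (column-vector convention, [u v w] = H [x y 1])
# into a Core Animation-style 4x4 matrix. CATransform3D multiplies row vectors,
# [x y z 1] * M, so the embedding is H transposed, with the perspective row
# h31/h32/h33 landing in m14/m24/m44.

def homography_to_transform3d(H):
    (h11, h12, h13), (h21, h22, h23), (h31, h32, h33) = H
    return [
        [h11, h21, 0.0, h31],  # m11 m12 m13 m14
        [h12, h22, 0.0, h32],  # m21 m22 m23 m24
        [0.0, 0.0, 1.0, 0.0],  # m31 m32 m33 m34
        [h13, h23, 0.0, h33],  # m41 m42 m43 m44
    ]

def apply_transform3d(M, pt):
    """Multiply the row vector [x y 0 1] by M, then divide by w (as CA does)."""
    x, y = pt
    row = [x, y, 0.0, 1.0]
    out = [sum(row[i] * M[i][j] for i in range(4)) for j in range(4)]
    return (out[0] / out[3], out[1] / out[3])
```

In a real implementation you’d build a CATransform3D from these sixteen values and assign it to the scrap layer’s `transform`, remembering that Core Animation applies the transform about the layer’s `anchorPoint`.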

The red lines in the above image are the quadrilateral between the touch points. The green lines show an “average” quadrilateral, which averages the red quad into a parallelogram. You can see the stretch is very dramatic, and it causes a problem if the touch points ever form a concave polygon instead of a convex one. You can see an example of what goes wrong below:

But if I use the average quad instead, treating the gesture as a transform between parallelograms, then the same motion gives much smoother results:
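One simple way to average an arbitrary touch quad into a parallelogram (a plausible construction, assumed here; it may differ from the app’s exact math) is to take the midpoint of each side. By Varignon’s theorem, those four midpoints always form a parallelogram, even when the touch points go concave:

```python
# Sketch: collapse an arbitrary quadrilateral into a parallelogram by taking
# the midpoints of its four sides (Varignon's theorem). Unlike the raw touch
# quad, this is a parallelogram even when the touches form a concave shape.

def average_quad(quad):
    """quad: four (x, y) corners in order. Returns the midpoint parallelogram."""
    mids = []
    for i in range(4):
        (x1, y1), (x2, y2) = quad[i], quad[(i + 1) % 4]
        mids.append(((x1 + x2) / 2.0, (y1 + y2) / 2.0))
    return mids
```

Since both the starting quad and the current quad are averaged the same way, the parallelogram’s halved size cancels out when computing the transform between them. A nice side effect: a parallelogram-to-parallelogram mapping is affine, so it never produces the extreme perspective warps shown above.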

My last experiment for the stretch gesture was how to keep the orientation of the image as it stretched. I thought about trying to maintain the “up”ness of the image regardless of the stretch vs. having the gesture also rotate the image during the stretch. You can see what I mean with the two videos below.

 

In the end, I decided to rotate the image during the gesture rather than maintain its “up” orientation. Keeping the image upright gave a cool effect, especially when stretching, but felt just a bit off, since it also caused the fingers to slide more over the image.

To try it for yourself, download Loose Leaf on the App Store.

You can also find all of the code on GitHub for each one of the versions talked about here, and be sure to check out the rest of Loose Leaf’s open source contributions as well.