Programming Without a Computer

I think of this quote almost every day:

I was trying to understand why rockets were so expensive. Obviously the lowest cost you can make anything for is the spot value of the material constituents. And that’s if you had a magic wand and could rearrange the atoms. So there’s just a question of how efficient you can be about getting the atoms from raw material state to rocket shape.

– Elon Musk


Which I reworded and condensed to:

There exists an order of bits that’ll make the iPad do what I want, and my job as a programmer is to find that order of bits.

This fought against my competing idea:

Either what I’m trying to do is impossible, or I’m too dumb to figure out how to do it.

Every day was an exercise in motivation, and a balance between these two thoughts. A few features gave me particular grief: scissors, gestures, performance, and OpenGL rendering, among others. Each took me weeks to figure out, and often a few weeks into their development I felt no closer than on day one.

But then something fantastic happened: about halfway through Loose Leaf’s development, I had some health problems that required me to dramatically change my diet and start exercising. Health problems generally aren’t great news, but this one changed more than my diet.

As I walked about four miles each afternoon, a wonderful thing happened that I didn’t expect: being away from my computer gave me time to think more completely about the problem I was working on. Since I obviously didn’t have my computer with me on my walk, I couldn’t dive straight into coding each half-baked idea, which forced me to think more deeply through each potential solution to prove that it would work. When I finally got back to my computer later that afternoon, I’d have a well-thought-out plan of attack. That made the last half of the afternoon many times more productive than the entire afternoon would have been otherwise.

I honestly don’t think I could have built the scissors feature without taking that walk every afternoon. More has been said about Deep Work and thinking without distractions than I could add here, but this was it for me. Taking time away from technology – no computer, no phone, no iPod, no music – let me find the solutions to difficult problems.

Using Augmented Reality Code for a Better Stretch Gesture

Loose Leaf is more about photos and imports than it is about drawing or sketching, and I wanted to make sure it was easy not only to cut and create new scraps, but also to manipulate and duplicate existing scraps. To make a copy of a photo or scrap, I thought through numerous gesture options, menus, long-press popups, buttons, and more, and in the end I settled on a simple pull-it-apart gesture.

So how’d it turn out? Here’s what it looks like to duplicate any photo scrap in Loose Leaf: just pull it apart into two identical pieces.

It’s a simple gesture to pull a photo into two copies. The stretch animation as you pull makes it obvious what’s happening, and then – snap! – it’s two copies!

The difficult piece of this gesture isn’t the copy itself; it’s the stretch animation as you pull the image apart. For a number of reasons, I needed to do this without additional OpenGL rendering and keep the code solidly in normal UIKit. To make it work, I borrowed some technology that’s fairly common in augmented reality apps: homography.

The short description is that a homography can map any convex quadrilateral onto any other convex quadrilateral with a single projective transform – it basically lets me turn any four-sided shape into any other four-sided shape. Perfect! Scraps, even non-rectangular ones cut by the scissors, are still modeled as simple rectangular UIViews. As I stretch, I can transform that rectangle into a parallelogram for the stretched effect.

Here’s the simple test application I used to fine-tune the animation with the above strategy:

The algorithm visible above goes through the following steps:

  1. create a quadrilateral from the four touch points on the scrap
  2. as the fingers move, create a second quadrilateral from the new locations of those same touches
  3. compute the homography between those two quadrilaterals to find the transform between them
  4. apply that transform to the scrap itself

Note that I’m not trying to calculate the transform from the UIView’s bounds to its new parallelogram – instead, I’m transforming the quad of original finger positions to the new quad of finger positions, then using that transform on the view. A sketch of the idea is below.
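To make steps 3 and 4 concrete, here’s a minimal, self-contained sketch in Objective-C. The Quad and Mat3 types and the function names are just for illustration, not the actual Loose Leaf code, but the math is the standard unit-square construction for quad-to-quad homographies, and the last step embeds the 3x3 matrix into a CATransform3D so it can be set on the scrap’s layer.

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

typedef struct { CGPoint p[4]; } Quad;     // corners in a consistent winding order
typedef struct { CGFloat m[3][3]; } Mat3;  // 3x3 homography acting on column vectors (x, y, 1)

// Heckbert-style mapping of the unit square (0,0)(1,0)(1,1)(0,1) onto a quad.
// (Degenerate quads, where the determinant is zero, aren't handled here.)
static Mat3 SquareToQuad(Quad q) {
    CGFloat x0 = q.p[0].x, y0 = q.p[0].y, x1 = q.p[1].x, y1 = q.p[1].y;
    CGFloat x2 = q.p[2].x, y2 = q.p[2].y, x3 = q.p[3].x, y3 = q.p[3].y;
    CGFloat dx1 = x1 - x2, dx2 = x3 - x2, dx3 = x0 - x1 + x2 - x3;
    CGFloat dy1 = y1 - y2, dy2 = y3 - y2, dy3 = y0 - y1 + y2 - y3;
    CGFloat det = dx1 * dy2 - dx2 * dy1;
    CGFloat g = (dx3 * dy2 - dx2 * dy3) / det;
    CGFloat h = (dx1 * dy3 - dx3 * dy1) / det;
    return (Mat3){{{ x1 - x0 + g * x1, x3 - x0 + h * x3, x0 },
                   { y1 - y0 + g * y1, y3 - y0 + h * y3, y0 },
                   { g,                h,                1  }}};
}

static Mat3 Mat3Invert(Mat3 a) {
    CGFloat (*m)[3] = a.m;
    CGFloat det = m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    Mat3 r;
    r.m[0][0] =  (m[1][1] * m[2][2] - m[1][2] * m[2][1]) / det;
    r.m[0][1] = -(m[0][1] * m[2][2] - m[0][2] * m[2][1]) / det;
    r.m[0][2] =  (m[0][1] * m[1][2] - m[0][2] * m[1][1]) / det;
    r.m[1][0] = -(m[1][0] * m[2][2] - m[1][2] * m[2][0]) / det;
    r.m[1][1] =  (m[0][0] * m[2][2] - m[0][2] * m[2][0]) / det;
    r.m[1][2] = -(m[0][0] * m[1][2] - m[0][2] * m[1][0]) / det;
    r.m[2][0] =  (m[1][0] * m[2][1] - m[1][1] * m[2][0]) / det;
    r.m[2][1] = -(m[0][0] * m[2][1] - m[0][1] * m[2][0]) / det;
    r.m[2][2] =  (m[0][0] * m[1][1] - m[0][1] * m[1][0]) / det;
    return r;
}

static Mat3 Mat3Multiply(Mat3 a, Mat3 b) {
    Mat3 r;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            r.m[i][j] = a.m[i][0] * b.m[0][j] + a.m[i][1] * b.m[1][j] + a.m[i][2] * b.m[2][j];
    return r;
}

// Step 3: the homography that carries the original touch quad onto the current one,
// built by mapping `from` back to the unit square and then out to `to`.
static Mat3 QuadToQuad(Quad from, Quad to) {
    return Mat3Multiply(SquareToQuad(to), Mat3Invert(SquareToQuad(from)));
}

// Step 4: embed the 3x3 homography into a CATransform3D so it can be set on a layer.
static CATransform3D HomographyToTransform3D(Mat3 h) {
    CATransform3D t = CATransform3DIdentity;
    t.m11 = h.m[0][0]; t.m12 = h.m[1][0]; t.m14 = h.m[2][0];
    t.m21 = h.m[0][1]; t.m22 = h.m[1][1]; t.m24 = h.m[2][1];
    t.m41 = h.m[0][2]; t.m42 = h.m[1][2]; t.m44 = h.m[2][2];
    return t;
}

// Usage: the touch points would need to be expressed relative to the layer's
// anchor point, since CALayer applies its transform about that anchor.
// scrapView.layer.transform = HomographyToTransform3D(QuadToQuad(startQuad, currentQuad));
```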

The red lines in the above image are the quadrilateral between the touch points. The green lines are an “average” quadrilateral, which averages the red quad into a parallelogram. You can see the stretch is very dramatic, and it causes a problem if the touch points ever form a concave instead of a convex polygon. You can see an example of what goes wrong below:

But if I use the average quad instead, and compute the transform between those parallelograms, then the same gesture gives much smoother results:
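There are a few reasonable ways to average a quad into a parallelogram; the snippet below is just one of them (reusing the hypothetical Quad type from the sketch above, and not necessarily the exact averaging Loose Leaf uses): average the two pairs of opposite edges and rebuild the corners around the quad’s centroid, which always produces a parallelogram even when the raw touch quad goes concave.

```objc
// Average a (possibly concave) touch quad into a parallelogram around its centroid.
static Quad AverageQuadToParallelogram(Quad q) {
    CGPoint c = CGPointMake((q.p[0].x + q.p[1].x + q.p[2].x + q.p[3].x) / 4.0,
                            (q.p[0].y + q.p[1].y + q.p[2].y + q.p[3].y) / 4.0);
    // Average of edge p0->p1 and edge p3->p2: the "horizontal" side of the parallelogram.
    CGVector u = CGVectorMake(((q.p[1].x - q.p[0].x) + (q.p[2].x - q.p[3].x)) / 2.0,
                              ((q.p[1].y - q.p[0].y) + (q.p[2].y - q.p[3].y)) / 2.0);
    // Average of edge p0->p3 and edge p1->p2: the "vertical" side of the parallelogram.
    CGVector v = CGVectorMake(((q.p[3].x - q.p[0].x) + (q.p[2].x - q.p[1].x)) / 2.0,
                              ((q.p[3].y - q.p[0].y) + (q.p[2].y - q.p[1].y)) / 2.0);
    Quad r;
    r.p[0] = CGPointMake(c.x - (u.dx + v.dx) / 2.0, c.y - (u.dy + v.dy) / 2.0);
    r.p[1] = CGPointMake(r.p[0].x + u.dx,           r.p[0].y + u.dy);
    r.p[2] = CGPointMake(r.p[1].x + v.dx,           r.p[1].y + v.dy);
    r.p[3] = CGPointMake(r.p[0].x + v.dx,           r.p[0].y + v.dy);
    return r;
}
```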

My last experiment for the stretch gesture was how to handle the orientation of the image as it stretched. I debated maintaining the “up”-ness of the image regardless of the stretch versus having the gesture also rotate the image during the stretch. You can see what I mean in the two videos below.

 

In the end, I decided to rotate the image during the gesture rather than try to maintain the “up” orientation. Keeping the image upright gave a cool effect, especially when stretching, but felt just a bit off since it caused the fingers to slide more over the image.

To try it for yourself, download Loose Leaf on the App Store.

You can also find the code on GitHub for each of the versions discussed here, and be sure to check out the rest of Loose Leaf’s open source contributions as well.

Improving UIBezierPath Performance and API

I shared two weeks ago how I built the scissors feature in Loose Leaf; short story: lots and lots of UIBezierPath. As I worked through the algorithm for slicing paths together, it didn’t take me long to realize that the default performance of UIBezierPath was… lacking. The only way to introspect the path itself is to create a custom function – not a block, mind you, a function – and use CGPathApply() to iterate the path and calculate what you’re looking for. Every time.

For a path with N elements, finding the first point? O(N). Finding points or tangents along the path? O(N). Is the path closed? O(N).
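For a sense of what that boilerplate looks like, here’s the “every time” version of something as simple as reading a path’s first point: a C applier function plus CGPathApply(), which still walks every element of the path. (FirstPointOfPath is just an illustrative helper, not an API from UIKit or PerformanceBezier.)

```objc
#import <UIKit/UIKit.h>

typedef struct { CGPoint point; BOOL found; } FirstPointInfo;

// Called once per path element; we only care about the very first moveto,
// but CGPathApply has no way to stop early, so every element is visited.
static void FirstPointApplier(void *info, const CGPathElement *element) {
    FirstPointInfo *result = (FirstPointInfo *)info;
    if (!result->found && element->type == kCGPathElementMoveToPoint) {
        result->point = element->points[0];
        result->found = YES;
    }
}

// O(N) for a path with N elements, every single time it's called.
static CGPoint FirstPointOfPath(UIBezierPath *path) {
    FirstPointInfo info = { CGPointZero, NO };
    CGPathApply(path.CGPath, &info, FirstPointApplier);
    return info.point;
}
```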

Enter PerformanceBezier.

PerformanceBezier speeds up many of these simple tasks so that they amortize to O(1) regardless of path length. The first call may still iterate the path, but afterwards the answer is cached for far faster access, and for some operations a full path iteration is never needed at all. It also brings UIBezierPath a bit closer to its NSBezierPath counterpart.

And the best part? It works with all UIBezierPath objects – no custom subclass, no extra code, it “just works.”
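With PerformanceBezier linked in, the same lookup becomes a single message send on a plain UIBezierPath – for example, the firstPoint method mentioned below. (The umbrella header name here is an assumption; check the repo’s README for the exact import.)

```objc
#import "PerformanceBezier.h"  // header name assumed; see the repo's README

UIBezierPath *path = [UIBezierPath bezierPathWithRect:CGRectMake(10, 10, 100, 100)];
CGPoint start = [path firstPoint];  // first call may walk the path; after that the answer is cached
```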

How ’bout an example?

Scissors isn’t the only feature in Loose Leaf that requires finding intersections and clipping Bezier paths – drawing over images uses essentially the same algorithm. With each stroke, I need to clip the pen’s path to each scrap. Even drawing over multiple images will clip the pen’s ink across each image.

  

With every stroke, the scissor algorithm needs to run for each image that the pen draws over. Doing this at 60 FPS means it needs to be fast. Each image’s border path only changes when it’s cut with the scissors, so it makes sense to cache as much information about that path as possible to reduce the cost of future clips with the pen.

So How Much Does It Help?

I’ve included a macro in PerformanceBezier so you can compare timings as well. Turning this flag on simulates running CGPathApply() to retrieve the same information, so [path firstPoint] becomes O(N) instead of O(1), and so on.

When I run PerformanceBezier with caching turned off on a first-generation iPad mini, the FPS drops dramatically. I used Instruments to measure how much time was spent on the main thread in total, and how much of that time was wasted in uncached Bezier operations.

The Test: I dropped four irregularly cut images onto a page and scribbled over them with the pen. This stresses the Bezier clipping algorithm, since each new stroke segment is clipped to multiple paths.

The results were dramatic. Out of 10 seconds spent on the main thread during drawing, 5 seconds were wasted in methods whose results could have been cached. Put another way, caching doubled performance in Loose Leaf: after the optimizations, I can get twice the clipping done in the same amount of time. Some method calls saw a 10x speed improvement. Obviously, your mileage will vary depending on your algorithm and how heavily you use UIBezierPath in your code, but Loose Leaf simply would not be possible without these optimizations.

Get the Code

Download and browse the code: https://github.com/adamwulf/PerformanceBezier. This isn’t the first repo I’m open sourcing – there’s lots more code from Loose Leaf, with more to come.

I’m working to open source all of the path-clipping code for the scissors, and the first step starts today. Stay tuned!
