I’ve been working a lot on gestures lately, and am currently building a swipe-in panel class. Mostly we take these controls and behaviours for granted. When I started looking closely, however, the details turned out to be rather interesting. They should also be fun to implement.
A side menu
Mobile devices often have panels, or menus, that can be swiped in from the top, bottom, left or right. Swiping is so common that I expected to find a lot of examples, but when I went looking they were harder to find than I thought. There is lots of swiping of lists and between screens, but actual menu panels were a bit less common.
iOS has the OS top/bottom menus, but none of the other common apps seemed to have any left/right menus. I eventually found one map application that had a left menu, though it required a button to activate, it could be swiped away. On Android I had only slightly more luck, again with the standard map application, which could be swiped in from the left. I’m sure there are more, but these would be enough to do some preliminaries.
Many apps prefer to use buttons to reach these panels, rather than a swiping gesture. This is an important detail: whatever I build shouldn’t force a swipe if you’d rather use a button in your own application.
Swipe up, click down, slow down
I set about playing with these panels, starting with the top/bottom OS panels on my iPhone. This produced several observations (a rough sketch of how they might combine in code follows the list):
- There is a finger velocity threshold. If I don’t move my finger fast enough across the edge border the panel will not slide in.
- The reveal time seems somewhat based on velocity. If I fling it then it opens quickly; if I go slower the panel opens a bit slower.
- It uses a physics attractor to reach its final state (it bounces slightly around its destination). This appears similar to the snap-back region on scrollers, though with more wiggling.
- The final direction of movement is important. If I start opening a panel I can switch directions and have it close again.
- Each panel has an arrow as a close button. It needs only to be tapped to have the panel slide away.
- I cannot swipe on the panel to have it close. I must use the button or swipe across the inside border.
- There is a delayed start to opening the panel. I must move a certain number of pixels before it even starts to show up. Once it does it quickly comes to meet my finger.
- There is a threshold to have the partial panel open completely. If I don’t move far enough it will close again. This threshold seems just a bit higher than the previous one to get it to reveal at all.
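To make those thresholds concrete, here is a minimal sketch of how the reveal logic might combine them. All of the names and numbers are my own guesses for illustration, not values I’ve measured from iOS.

```typescript
// Sketch of the edge-swipe logic described above.
// Names and numeric thresholds are assumptions, chosen only for illustration.

type PanelState = "closed" | "revealing" | "open";

interface GestureSample {
  distance: number; // pixels the finger has moved along the open axis
  velocity: number; // finger velocity in px/s, signed (+ means opening)
}

const EDGE_DEAD_ZONE = 20;    // delayed start: nothing shows until past this
const REVEAL_VELOCITY = 300;  // minimum px/s across the edge to begin revealing
const COMPLETE_DISTANCE = 80; // releasing past this opens the panel fully

function onDrag(state: PanelState, s: GestureSample): PanelState {
  if (state === "closed") {
    // Both a velocity and a distance threshold gate the reveal.
    if (s.velocity > REVEAL_VELOCITY && s.distance > EDGE_DEAD_ZONE) {
      return "revealing";
    }
    return "closed";
  }
  return state;
}

function onRelease(state: PanelState, s: GestureSample): PanelState {
  if (state !== "revealing") return state;
  // The final direction of movement wins: a closing flick cancels the open.
  if (s.velocity < 0) return "closed";
  // Otherwise distance decides, with a threshold a bit above the reveal one.
  return s.distance > COMPLETE_DISTANCE ? "open" : "closed";
}
```

Whatever state the release picks, the motion to that state would then be driven by the attractor: a spring-like animation toward the destination rather than a fixed-duration tween, which is what gives the slight bounce.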
On Android I looked at the maps app, which has a left panel. These are the differences I noticed from iOS (again, a small sketch follows the list):
- There is no tap-to-close button. I must swipe it back. Or I can tap inside the underlying app (I checked and you can also do this in iOS). This is perhaps an issue of horizontal/vertical orientation rather than OS, as an arrow in a vertical panel might be quite large.
- The threshold to open/close a panel seems to be at 50%. That is, without velocity, having it over 50% revealed causes it to open; otherwise it will close. This happens regardless of whether it was open or closed before.
- It seems to have lower velocity and distance thresholds to start opening it.
- It uses a more basic easing to reveal and hide, rather than an attractor.
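That release rule is simple enough to sketch as well: a fling follows the finger’s direction, otherwise the 50% position decides. How a velocity override combines with the 50% rule is my assumption here, and the fling threshold is a made-up number, not anything pulled from Android itself.

```typescript
// Sketch of the release rule described above.
// FLING_VELOCITY is an assumed value, purely for illustration.

const FLING_VELOCITY = 200; // px/s; below this, treat the release as a slow drag

function shouldOpen(progress: number, velocity: number): boolean {
  if (Math.abs(velocity) > FLING_VELOCITY) {
    return velocity > 0; // a fling opens or closes in the direction of movement
  }
  return progress > 0.5; // otherwise the 50% rule, regardless of prior state
}

// A plain easing curve for the reveal/hide, in place of a physics attractor.
function easeOutCubic(t: number): number {
  return 1 - Math.pow(1 - t, 3);
}
```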
More
I’m sure there are more variations in behaviour that I will find. There are also many variations in what happens with the main panel. Does it fade away, get dark, get smaller, slide with the panel, or stay underneath it? I tend not to notice all these variations in behaviour until I actually need to implement something myself.
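Those underlay effects are mostly a matter of mapping the panel’s open progress onto the main view. A small sketch of what that mapping could look like, with entirely made-up effect values:

```typescript
// Sketch: map the panel's open progress (0..1) onto the underlying view.
// Which effects to apply, and how strongly, is a design choice; the numbers
// here are arbitrary placeholders.

interface UnderlayEffect {
  opacity: number;    // fade away
  dim: number;        // darken with an overlay
  scale: number;      // get smaller
  translateX: number; // slide along with the panel (px)
}

function underlayFor(progress: number, panelWidth: number): UnderlayEffect {
  return {
    opacity: 1 - 0.3 * progress,
    dim: 0.4 * progress,
    scale: 1 - 0.05 * progress,
    translateX: panelWidth * progress, // or 0 to stay fixed underneath
  };
}
```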
Now that I know how it can work I just need to go implement it (well, significantly modify the one we already have). Some of the behaviours here are new, but many are already part of the gesture code I’ve written. As with all gesture code, it’ll involve a lot of iterations and fine-tuning of the details.