[rmkit] Add simple Widget gestures #87
Conversation
raisjn left a comment:
very cool! just played with it a bit on rM1 with touch and stylus, works surprisingly well!
> click just means "the mouse was down in this widget, and now it's up", but short_click means "down and up within a 50-pixel area, and within 0.5 seconds"
the behavior i'd like for click is: "mouse was only down in this widget and now it's up". right now, one can slide into a widget, lift a finger, and click is fired. i think it's a small change in main_loop to add (keep track of the last mouse_down widget at a global level), but i'm unsure which behavior would be more intuitive for people: short_click or click (they both seem fine to me intuitively)
about long press: should long press automatically fire after an amount of time has passed, or only on finger up? i ask because it is harder to predictably invoke long press when it's on mouse up.
> I thought about sticking the events under the MouseGestureManager instead of having them directly in MOUSE_EVENTS.
they all seem related to mouse events, but the one confusing part is that init_gestures() will need to be called before some are usable, which could mislead people. i don't mind if it's on gestures, drag, or mouse.
That makes a lot of sense. It seems like what's happening is that down fires when the mouse enters the widget for the first time, even if it isn't the initial "down" touch. Maybe SynMotionEvent::left needs to be 1 for initial down, 0 for motion, -1 for up? Or something like that.
I think regular click is what most people think of. Looking at Android again, they actually have several different "gesture" clicks: LongPress, SingleTapUp (which I think is similar to short_click), DoubleTap, and SingleTapConfirmed (short delay to make sure it isn't double-tap). I doubt any of these are constrained by distance, so maybe I need to think about how to separate drag and press events.
Good point. I think what you're describing is the most common way to detect a long press, and I agree that it would be nice to know ahead of time if long press was actually triggered. I'm not totally sure what the best way to do that is right now, so definitely open to ideas. Seems like a second thread (which I'm not super excited about) or a global concept of timers might work? I could see extending the code from #46 so that read_input looks at the global list of timers for the shortest remaining timeout, or something.
i think this logically is fine but is more work because 1) where does the logic for setting it to 0 happen, and 2) wouldn't we still need logic to make sure that the mouse down and mouse up happen on the same widget? i think using a variable in the main loop to track the clicked widget would only require changing the event dispatch in mainloop.cpy
at least with a separate thread it's only one thread per click, right? we have to do something like this for launching remux after holding the home button down for 1 sec, and it's not fun.
* dragging
* long_press
* single_click
* double_click
502fdf7 to ab57498
mrichards42 left a comment:
Should be ready for a real review!
input_demo has the same "drag the square around" widget, but I changed it so that long_press, single_click, and double_click each toggle something different so you can observe those behaviors independently.
```
  return std::abs(ev.x - prev_ev.x) > tolerance || std::abs(ev.y - prev_ev.y) > tolerance

class GESTURE_EVENTS_DELEGATE:
```
This is my attempt at making widget gestures transparent. The idea behind the implementation is simple enough (wait to create GESTURE_EVENTS until we have an event listener), but honestly the implementation ended up being a little more complicated than I'd like. Still, the overhead is fairly small (8 pointers) compared to the entire GESTURE_EVENTS class.
It does work though -- see input_demo. It's just widget->gesture.long_press += ..., and that adds an event listener and sets up gestures.
The only other approach I could think of that felt simpler used slightly different syntax:
```
class Widget:
  std::unique_ptr<GESTURE_EVENTS> _gestures

  GESTURE_EVENTS & gestures():
    if !_gestures:
      _gestures = std::make_unique<GESTURE_EVENTS>()
      _gestures->attach(&mouse)
    return *_gestures

// ^^ that's it, but to use gestures you have to call a function:
widget->gestures().long_press += ...
```

```
// triggered (otherwise single_click is triggered immediately on up).
long double_click_timeout = 250

// State machine
```
Once I added double_click it felt like the logic was getting scattered around too much. I think it's easier to follow the logic when it's framed as a state machine.
yes, definitely makes sense
src/rmkit/ui/events.cpy (outdated)
```
// The threshold area that determines if this is a touch-based gesture
// (e.g. long_press, double_click) or a motion-based gesture (e.g. drag).
int touch_slop_size = 50
```
I thought a lot about separating touch events (long_press, double_click, single_click) from the dragging events, since they seem like separate concepts. But I looked around at how some other toolkits deal with gestures, and it seemed like there was some momentum behind (a) lumping all the gestures together, and (b) having some kind of threshold where touch events are within the threshold, and motion events (drag, fling, etc.) are outside the threshold.
it seems fine to put them together
what does touch_slop_size stand for?
What I'm going for is something like this: the "ideal" touch is down and up at the same position, but in real life that doesn't actually happen, so "slop" is an extra allowance for real life touches.
I could probably call these two touch_threshold and dragging_threshold if that sounds better to you.
i just wasn't sure if it was a typo (slope?), whichever name is fine. maybe 'allowance'? i'm not sure what word would be used here normally.
how about touch_threshold? I also added a bit more detail to this comment to hopefully make it clearer what the value means
raisjn left a comment:
works really well, nice job!
src/rmkit/ui/events.cpy (outdated)

```
single_click_timer = nullptr

bool outside_tolerance(const input::SynMotionEvent & ev, int tolerance):
  return std::abs(ev.x - prev_ev.x) > tolerance || std::abs(ev.y - prev_ev.y) > tolerance
```
only one space?
good catch, thanks -- like I said, still figuring out the cpy auto-indentation 🙂 (your cpy syntax file from earlier helped, but I'm still not quite there)
for indentation, when i use o, or O, vim opens up a line with the same indent as the current one. i have to press tab to add in two spaces, so i guess i don't have auto-indentation working (it would automatically figure out the number of spaces based on if the current line ends with a ':'?)
ah ok, I'm just using ft=cpp, so I also needed to turn off most of the smart indentation options. I think I figured out how to get to where you are w/ indentation (I have to manually indent, but new lines start at the same level), which is much better!
```
event_delegate<&GESTURE_EVENTS::long_press> long_press = { this }
event_delegate<&GESTURE_EVENTS::single_click> single_click = { this }
event_delegate<&GESTURE_EVENTS::double_click> double_click = { this }
event_delegate<&GESTURE_EVENTS::drag_start> drag_start = { this }
event_delegate<&GESTURE_EVENTS::dragging> dragging = { this }
event_delegate<&GESTURE_EVENTS::drag_end> drag_end = { this }
```
it's cool how dynamic c++ can be
src/input_demo/main.cpy (outdated)

```
square_x = w / 2
square_y = h / 2
// drag events
gesture.drag_start += PLS_LAMBDA(auto & ev) {
```
does gestures (plural) read better than gesture? (i don't have much to say about the code itself, apparently ;) )
🤷 either way is good with me. I thought about both but couldn't really decide
plural, then
i'm excited to try this out with the drag handles in codename iago

also, 🚀 🚢

oh that's really slick!
First draft of drag and long-click events. A couple thoughts:
* Naming might be confusing, since there's already a global Gesture class (edit: but then, "gesture" does seem to be the standard term: see for instance Android's GestureDetector, which has a lot of similarities w/ this approach). Feedback very welcome.
* There's some redundancy with `short_click` and `click`, but they mean slightly different things, which is probably confusing. `click` just means "the mouse was down in this widget, and now it's up", but `short_click` means "down and up within a 50-pixel area, and within 0.5 seconds". Those values are configurable of course, but it's a more specific -- maybe too specific -- idea of what "click" means.
* I thought about sticking the events under the MouseGestureManager instead of having them directly in MOUSE_EVENTS. Maybe that would make it clear that these aren't raw/simple events, but are pre-processed events. Maybe the interface actually looks like `widget->gesture->drag_start` even, instead of `widget->mouse.drag_start`. Or perhaps there's a way to hide the creation of the GestureManager in a function call, like `widget->gesture().drag_start`.
* I haven't dealt with capture yet, but I don't think it'll be too hard. In my mind it makes the most sense to have ui::MainLoop::capture_widget, but I think that creates a circular dependency, so it might have to be ui::Widget::capture_widget.
widget->gesture->drag_starteven, instead ofwidget->mouse.drag_start. Or perhaps there's a way to hide the creation of the GestureManager in a function call, likewidget->gesture().drag_start.I haven't dealt with capture yet, but I don't think it'll be too hard. In my mind it makes the most sense to have ui::MainLoop::capture_widget, but I think that creates a circular dependency, so it might have to be ui::Widget::capture_widget.