Add buttons to gestures #30339
Conversation
If I'm reading this correctly, this treats a left-click the same as a right-click. Is that what we want?
```dart
TapDownDetails({ this.globalPosition = Offset.zero })
  : assert(globalPosition != null);

/// Arguments [globalPosition] and [buttons] must not be null.
TapDownDetails({ this.globalPosition = Offset.zero, this.buttons = 0 })
```
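A minimal sketch of constructing the details with the new parameter, assuming the `buttons` field shown in the diff above and the `kPrimaryButton` constant introduced elsewhere in this PR:

```dart
import 'dart:ui' show Offset;

import 'package:flutter/gestures.dart';

void main() {
  // Construct the details the way a recognizer might, forwarding the buttons
  // of the triggering pointer event (kPrimaryButton covers touch, stylus
  // contact, and the left mouse button).
  final TapDownDetails details = TapDownDetails(
    globalPosition: const Offset(10.0, 20.0),
    buttons: kPrimaryButton,
  );
  print('buttons: ${details.buttons}'); // 1
}
```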
I think before we make this change we should update the Android and iOS logic to treat fingers-touching-surface and stylus-touching-surface as button=1.
For now the number of set bits is not checked, so `buttons = 0` will also work, as long as it stays 0.
Well, you're right; for it to work without the change, the callback also needs `device`.
@Hixie The plan is to apply it to long tap, drag and double click later.
Since they share a common superclass, does it make sense to fix them all at once?
It's a good idea, but technically the concept of PrimaryPointerGR does not necessarily imply "restricted to single button". Do you think I should implement it in PrimaryPointerGR or a new intermediate class?
```dart
final Offset globalPosition;

/// The buttons pressed when the pointer contacted the screen.
final int buttons;
```
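A rough consumer-side sketch of reading this field, using the `kPrimaryButton`/`kSecondaryButton` bit-flag constants defined in this PR (the device mapping in the strings follows the PR summary; the helper itself is hypothetical):

```dart
import 'package:flutter/gestures.dart';

// Hypothetical helper: classify a tap based on the buttons recorded in the
// details object at the time the pointer contacted the screen.
String describeTap(int buttons) {
  if ((buttons & kPrimaryButton) != 0) {
    return 'primary tap (touch, stylus contact, or left mouse button)';
  }
  if ((buttons & kSecondaryButton) != 0) {
    return 'secondary tap (right mouse button or stylus primary button)';
  }
  return 'other buttons: 0x${buttons.toRadixString(16)}';
}

void main() {
  print(describeTap(kPrimaryButton));
  print(describeTap(kSecondaryButton));
}
```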
I think for this to make sense we should require that it be a single button, not multiple buttons. Otherwise pressing left+right on a mouse will fail 99% of the time (because one will be detected a split second before the other) and work 1% of the time.
I can limit it to "zero or one button" for #30339 (comment). What do you think?
I think it should be one button, and we should never have an event with zero buttons.
@Hixie The idea is, to put it simpler, that both "left click" and "right click" will trigger the same TapGR, but the callback can distinguish them with the details provided (which doesn't help …
Should this be specific to tap, or should it apply to other primary-pointer recognizers like drag, or long press, or double tap?
@Hixie The plan is to apply it to long tap, drag and double click later.
I don't think it makes sense for us to fire onTap when you right-click. Nobody is going to be checking whether it's a right mouse button or not. IMHO, GestureDetector's onTap/onTapDown/etc should fire just for left clicks and taps. GestureDetector can then have an additional onRightClick that fires for right clicks. (Maybe call it onSecondary or something more general.) We can easily implement this by having the gesture recognizer take a constructor argument that says what button it supports.
We talked about this in person. RawGestureDetector's API makes it awkward to do what I suggested. Not clear if it'll work or not, we'll have to try it and see how it comes out.
@Hixie I wrote a chapter of "design reasoning" in the PR description, which explains why the other approaches (AFAIK) won't work, except for two. The other viable design is to use different gesture classes on top of a configurable superclass:

```dart
class BaseTapGestureRecognizer {
  BaseTapGestureRecognizer({ this.button });
  final int button;
}

class TapGestureRecognizer extends BaseTapGestureRecognizer {
  TapGestureRecognizer() : super(button: kPrimaryButton);
}

class SecondaryTapGestureRecognizer extends BaseTapGestureRecognizer {
  SecondaryTapGestureRecognizer() : super(button: kSecondaryButton);
}
```

It will reduce the number of callbacks per class, but it will likely increase the number of classes, especially if we want to support the secondary button for drag gestures: then we'll need to add a secondary series to v-drag, h-drag, and pan.
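For contrast, here is a rough sketch of what the single-class design (option 2 in the PR description below) looks like at a `RawGestureDetector` call site: one `TapGestureRecognizer` entry in the type-keyed map carries both the primary and the secondary callbacks. The `onSecondaryTapDown` name follows this PR's summary; the surrounding wiring is illustrative.

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

// One entry per recognizer type: the same TapGestureRecognizer instance is
// configured with a primary callback and a secondary callback.
Widget buildTapTarget() {
  return RawGestureDetector(
    gestures: <Type, GestureRecognizerFactory>{
      TapGestureRecognizer: GestureRecognizerFactoryWithHandlers<TapGestureRecognizer>(
        () => TapGestureRecognizer(),
        (TapGestureRecognizer instance) {
          instance
            ..onTap = () => debugPrint('primary tap')
            ..onSecondaryTapDown = (TapDownDetails details) {
              debugPrint('secondary tap down at ${details.globalPosition}');
            };
        },
      ),
    },
    child: const SizedBox.expand(),
  );
}
```

Because the `gestures` map is keyed by recognizer type, this avoids adding a second tap class just to change which button is handled.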
gspencergoog left a comment
```dart
/// * [kPrimaryMouseButton]: an alias of this constant when used by mouse.
/// * [kSecondaryButton], which describes a cross-device behavior of
///   "secondary operation".
const int kPrimaryButton = 0x01;
```
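A small sketch of how these bit flags are meant to be read: a plain touch and a left mouse click both carry the primary bit, so one mask test covers both, while a right click carries the secondary bit. The event construction here is only illustrative.

```dart
import 'dart:ui' show PointerDeviceKind;

import 'package:flutter/gestures.dart';

void main() {
  final PointerDownEvent leftClick = PointerDownEvent(
    kind: PointerDeviceKind.mouse,
    buttons: kPrimaryMouseButton, // same bit as kPrimaryButton
  );
  final PointerDownEvent rightClick = PointerDownEvent(
    kind: PointerDeviceKind.mouse,
    buttons: kSecondaryMouseButton,
  );
  print((leftClick.buttons & kPrimaryButton) != 0);  // true
  print((rightClick.buttons & kPrimaryButton) != 0); // false
}
```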
It's too bad about all of these global constants. I wish they were in a class. Probably that ship has sailed since it would be a massive breaking change, but I thought I'd just put that out there.
```dart
    resolve(GestureDisposition.accepted);
    _checkUp();
  }
  _checkUp();
```
Got it, makes sense. Thanks for the explanation.
```dart
properties.add(DiagnosticsProperty<Offset>('finalPosition', _finalPosition, defaultValue: null));
properties.add(FlagProperty('sentTapDown', value: _sentTapDown, ifTrue: 'sent tap down'));
// TODO(tongmu): Uncomment this line and update related tests.
// properties.add(IntProperty('initialButtons', _initialButtons));
```
Just delete these two lines: don't check in commented out code:
https://github.com/flutter/flutter/wiki/Style-guide-for-Flutter-repo#avoid-abandonware
Overview
This PR adds support for buttons to the tap, long press, drag, and double tap gestures.
Their existing callbacks now respond only to the primary button. For example, `onLongPressStart` will only respond when the long press is done by touch, stylus touch, or LMB. This is supposed to cover most daily uses (without changing existing code).

Tap also has a "secondary" version of each callback, such as `onSecondaryTapDown`.

In summary, after the PR each gesture keeps its `on*` callbacks (for the primary button), and tap additionally gains the `onSecondary*` callbacks.
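A minimal usage sketch of this split, written against `GestureDetector` (which exposes the primary callbacks and, in current Flutter, the `onSecondary*` tap callbacks); the exact wiring is illustrative:

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

// A plain touch or left click drives onTap / onLongPressStart; a right click
// (or the stylus primary button) drives the onSecondary* family instead.
Widget buildExample() {
  return GestureDetector(
    onTap: () => debugPrint('primary tap'),
    onLongPressStart: (LongPressStartDetails details) {
      debugPrint('primary long press at ${details.globalPosition}');
    },
    onSecondaryTapDown: (TapDownDetails details) {
      debugPrint('secondary tap down at ${details.globalPosition}');
    },
    child: const SizedBox.expand(),
  );
}
```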
Design reasoning

This chapter explains the reason for choosing this design, using the example of `TapGestureRecognizer.onTap`, as we try to fix the issue where buttons incorrectly respond to RMB clicks.

First, the existing callbacks of `GestureDetector` must only respond to primary events. Most (if not all) existing widgets use the callbacks while assuming the gestures received are all primary. For example, a `Button` assigns `GestureDetector` an `onTap` that triggers its action without checking the buttons of the incoming gesture. Unless we limit the existing callbacks to primary events, we will have to change the code of `Button`, as well as countless external code. It also means that we need separate `onSecondaryTap*` callbacks to handle secondary taps.

Second, because `RawGestureDetector` receives a map from recognizer type to recognizer factory, there can only be one `TapGR` instance. It means that `TapGR` cannot be configurable, which leaves us with three options:

1. A `TapGR` class whose callbacks respond to all tap events, with dispatching done by `GestureDetector`.
2. A `TapGR` class whose callbacks are already post-dispatch (i.e. it has callbacks for primary as well as callbacks for secondary).
3. A separate `SecondaryTapGR`, which has a callback `onTap` that responds to secondary taps.

Option 1 is infeasible, because the recognizer must filter events in `isAllowedPointer` instead of deferring it to the callbacks. To explain this, say we have a component that only listens to primary drag and secondary tap. The two kinds of events are exclusive from each other and should not cause arena competition, so both of them should be accepted immediately (for example, `onTapDown` should be fired immediately at down events). However, if the exclusiveness is only known by the callbacks, both gestures will try to compete upon every down event, which is a behavioral degradation.

Between the remaining two options, we preferred option 2 over 3 for two reasons:

- It keeps `TapGR` a centralized class that handles "any kind of tap gesture", which is an easier concept to understand.

Major changes
Existing gestures require primary button
All existing callbacks of the following gesture recognizers now require the "primary button rule":

- `TapGestureRecognizer`
- `LongPressGestureRecognizer`
- `DragGestureRecognizer` (and its variants `Pan`, `HorizontalDrag` and `VerticalDrag`)
- `DoubleTapGestureRecognizer`

The same change is applied to their counterparts in `GestureDetector`.

New "onSecondary" tap gesture callbacks
`TapGestureRecognizer` also has a series of "onSecondary" callbacks:

- `TapGestureRecognizer`: `onSecondaryTap{Down,Up,Cancel}`

Recognizers only compete on events if there are any related callbacks
A recognizer will compete on primary events only if it has non-null primary callbacks. Similarly, `TapGR` will compete on secondary events only if it has non-null secondary callbacks.

This should not affect real-world code, but some tests used to create recognizers with no callbacks in order to compete with the test target. These tests have been updated to assign an empty callback.
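A hedged sketch of the kind of test adjustment this implies: a recognizer that only exists to compete in the arena now needs at least one non-null callback, and an empty closure is enough (the surrounding test harness is omitted here).

```dart
import 'package:flutter/gestures.dart';

void main() {
  // Before this change, a bare TapGestureRecognizer() would still compete;
  // now a test gives it a no-op callback so it keeps taking part in the arena.
  final TapGestureRecognizer competitor = TapGestureRecognizer()
    ..onTap = () {}; // empty, but non-null, so the recognizer competes
  // ... add pointers and route events against the recognizer under test ...
  competitor.dispose();
}
```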
Minor changes
New utility functions
- `smallestButton`: basically the rightmost set bit
- `isSingleButton`: returns true if the target `buttons` has one and only one set bit
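Both are simple bit tricks; the following is an equivalent sketch of the described behavior (the real helpers live in the gestures library, so treat these bodies as illustrative):

```dart
// Rightmost (lowest) set bit of a button mask, or 0 if no bit is set.
int smallestButton(int buttons) => buttons & (-buttons);

// True iff exactly one bit is set in the mask.
bool isSingleButton(int buttons) => buttons != 0 && (buttons & (buttons - 1)) == 0;

void main() {
  print(smallestButton(0x6));  // 2, i.e. 0x2
  print(isSingleButton(0x4));  // true
  print(isSingleButton(0x5));  // false
}
```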
New constant

- `kSecondaryButton`: an alias to `kSecondaryMouseButton` and `kPrimaryStylusButton`

TestPointer can configure buttons
`TestPointer` now has a modifiable property `int buttons`, which will be assigned to the `PointerDownEvent`s and `PointerMoveEvent`s that it generates.
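A rough sketch under the assumption stated above, namely that the `buttons` property can be set on the `TestPointer` before it synthesizes events (the exact API surface here is illustrative):

```dart
import 'dart:ui' show Offset, PointerDeviceKind;

import 'package:flutter/gestures.dart';
import 'package:flutter_test/flutter_test.dart';

void main() {
  // Assumed per this PR's summary: buttons is modifiable and is copied into
  // the PointerDownEvent / PointerMoveEvents the pointer generates.
  final TestPointer pointer = TestPointer(1, PointerDeviceKind.mouse);
  pointer.buttons = kSecondaryMouseButton;
  final PointerDownEvent down = pointer.down(const Offset(10.0, 10.0));
  print(down.buttons == kSecondaryMouseButton); // expected: true
}
```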
Fix: Semantic long press dispatches all callbacks

`RawGestureDetector._handleSemanticsLongPress` used to dispatch only `onLongPress`. It now dispatches `onLongPress{Start,End,Up}` as well.

Related Issues
Tests
New tests: each of the tap, long press, drag, and double tap gesture recognizers has the following new tests:
New tests: the following new utility functions work properly:

- `smallestButton`
- `isSingleButton`

Checklist
Before you create this PR, confirm that it meets all requirements listed below by checking the relevant checkboxes (`[x]`). This will ensure a smooth and quick review process.

- I updated/added relevant documentation (doc comments with `///`).
- The analyzer (`flutter analyze --flutter-repo`) does not report any problems on my PR.

Breaking Change
Does your PR require Flutter developers to manually update their apps to accommodate your change?