I want to implement a modified version of the Accessibility Zoom feature. In the zoomed-in view, I want to be able to interact with the buttons that are present in the view below the zoomed-in view.
The following picture illustrates the issue.
I have:

- Parent View
  - Child View 1
  - Child View 2
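For concreteness, that hierarchy could be set up in UIKit like this (a minimal sketch; `ParentViewController` and the frame values are hypothetical, and ChildView2 is the zoomed overlay sitting on top of ChildView1):

```swift
import UIKit

// Hypothetical setup mirroring the hierarchy above.
final class ParentViewController: UIViewController {
    let childView1 = UIView()   // handles the drag gesture
    let childView2 = UIView()   // the zoomed-in overlay

    override func viewDidLoad() {
        super.viewDidLoad()
        childView1.frame = view.bounds
        // Added last, so it sits above Child View 1 and receives touches first.
        childView2.frame = CGRect(x: 40, y: 120, width: 240, height: 240)
        view.addSubview(childView1)
        view.addSubview(childView2)
    }
}
```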
Gesture events are handled in ChildView1. When I get a drag gesture in ChildView2, which is the zoomed-in view, I want to be able to modify the touch points and pass the event on to ChildView1. I've tried various approaches:
- `sendAction`, which can be called only on `UIControl`s. (I was looking for a Swift equivalent of `element.dispatchEvent(event)`, available in JavaScript.)
- Checking whether Accessibility Zoom can be invoked programmatically (not available).
- Modifying the `UIGestureRecognizer` object to change the result of `location(in view: UIView?) -> CGPoint` in ChildView2 and invoking the selector for `dragGestureHandler` in ChildView1. I'm not sure of the side effects of this approach.
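The core of the third approach is the coordinate translation itself: mapping a touch point in the magnified ChildView2 back into ChildView1's coordinate space. A minimal sketch of that mapping, assuming a simple scale-and-offset zoom model (`zoomScale` and `zoomOrigin` are hypothetical parameters of the custom zoom, not UIKit API):

```swift
import Foundation

/// Maps a point from the zoomed overlay's coordinates back to the
/// underlying view's coordinates, assuming the overlay shows the region
/// of the underlying view starting at `zoomOrigin`, magnified by `zoomScale`.
func translateToUnderlyingView(_ pointInZoomedView: CGPoint,
                               zoomScale: CGFloat,
                               zoomOrigin: CGPoint) -> CGPoint {
    CGPoint(x: zoomOrigin.x + pointInZoomedView.x / zoomScale,
            y: zoomOrigin.y + pointInZoomedView.y / zoomScale)
}
```

For example, with a 2x zoom showing the region starting at (100, 50), a touch at (40, 20) in the overlay corresponds to (120, 60) in the underlying view.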
Can you share a good approach to solve this problem or to re-implement the Accessibility Zoom feature?
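As one illustration of the third approach described above, the handler in ChildView2 could translate the location and hand it to ChildView1 through a shared method, rather than mutating the recognizer's state (a sketch only; the protocol, property names, and zoom parameters here are all hypothetical, not a verified solution):

```swift
import UIKit

// Hypothetical protocol so the overlay can pass translated points to
// ChildView1 without touching the UIGestureRecognizer's internals.
protocol DragForwarding: AnyObject {
    func handleDrag(at pointInOwnCoordinates: CGPoint,
                    state: UIGestureRecognizer.State)
}

final class ZoomedOverlayView: UIView {
    weak var dragTarget: DragForwarding?   // typically ChildView1
    var zoomScale: CGFloat = 2.0           // hypothetical zoom parameters
    var zoomOrigin: CGPoint = .zero

    @objc func dragGestureHandler(_ recognizer: UIPanGestureRecognizer) {
        let p = recognizer.location(in: self)
        // Translate from the magnified overlay back to the underlying view.
        let translated = CGPoint(x: zoomOrigin.x + p.x / zoomScale,
                                 y: zoomOrigin.y + p.y / zoomScale)
        dragTarget?.handleDrag(at: translated, state: recognizer.state)
    }
}
```

Because the recognizer itself is left untouched, this sidesteps the side-effect concern raised about modifying `location(in:)` directly.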
from: Modify gesture events (translate location) and programmatically send the modified event to another view