Re: XInput 2: the big picture, relationship to toolkits

On Thu, 2008-08-14 at 11:00 +0930, Peter Hutterer wrote:
> On Wed, Aug 13, 2008 at 05:45:40PM -0400, Owen Taylor wrote:
> > Configuration of devices should not be in the scope of an application.
> > Once a device is configured by the user globally, it should just work
> > in any application. Though in some cases, an application may want to 
> > allow its users to configure how device functionality maps to application
> > functionality, see below.
> > 
> > Moreover, in the case where a device is controlling the core pointer
> > the application should not have to choose between "normal events" and 
> > "input extension events". If the user moved the pointer somewhere, it's
> > a motion event. Then, if the application cares, it should be able to
> > look and see "what device does this event come from?" "does this event 
> > have pressure information? what was the pressure?"
> > 
> > So, the question for the XInput extension is: are the APIs correct so that
> > a toolkit can seamlessly provide this functionality?
> 
> Assuming that we ignore core events, I would say yes, although the APIs can be
> improved, as described below.
> The important thing is - forget that core events even exist. Applications
> should only use XI. The current events already contain screen coordinates -
> much like the core events and sufficient for the majority of apps - and also
> valuators if you care about pressure or device coordinates.

Yeah, that was my basic conclusion, modulo the awkwardness with event
selection and multiple master pointers.
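
For reference, what the toolkit side looks like today with XI 1.x is roughly
this (a sketch only; treating axis 2 as pressure is a per-device assumption):

  /* Sketch: select device motion events and pull out the screen position
   * plus a pressure valuator.  Which axis is pressure is an assumption. */
  #include <stdio.h>
  #include <X11/Xlib.h>
  #include <X11/extensions/XInput.h>

  static void watch_device(Display *dpy, Window win, XDeviceInfo *info)
  {
      XDevice *dev = XOpenDevice(dpy, info->id);
      XEventClass cls;
      int motion_type;

      DeviceMotionNotify(dev, motion_type, cls);
      XSelectExtensionEvent(dpy, win, &cls, 1);

      for (;;) {
          XEvent ev;
          XNextEvent(dpy, &ev);
          if (ev.type != motion_type)
              continue;
          XDeviceMotionEvent *m = (XDeviceMotionEvent *) &ev;
          int x = m->x_root, y = m->y_root;      /* screen coordinates */
          int pressure = 0;
          if (m->first_axis <= 2 && m->first_axis + m->axes_count > 2)
              pressure = m->axis_data[2 - m->first_axis];
          printf("%d,%d pressure %d\n", x, y, pressure);
      }
  }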

> > One thing that occurs to me: with the master pointer support in XInput 2, 
> > you no longer have to select for input on all devices, you can simply 
> > select for the master device ... a definite improvement. However, if there 
> > are multiple master pointers, you have to select for events for every 
> > pointer, and track the set of master devices, and if that changes, redo 
> > the selection.
> 
> Correct. The DevicePresenceEvent/DeviceHierarchyChangedEvent notifies you when a
> master appears or disappears and gives you the ability to re-read the device list
> for new master devices. Not perfect, but well...
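
For the record, the re-selection dance then looks roughly like this (a sketch
with XI 1.x calls; that master pointers show up in XListInputDevices with
use == IsXPointer is my assumption):

  /* Sketch: re-run this whenever a DevicePresence notification arrives,
   * so newly created master pointers are picked up. */
  #include <X11/Xlib.h>
  #include <X11/extensions/XInput.h>

  static void select_master_motion(Display *dpy, Window win)
  {
      int ndev, i, type, nclasses = 0;
      XEventClass classes[32];
      XDeviceInfo *devs = XListInputDevices(dpy, &ndev);

      for (i = 0; i < ndev && nclasses < 32; i++) {
          if (devs[i].use != IsXPointer)    /* skip slaves and keyboards */
              continue;
          XDevice *dev = XOpenDevice(dpy, devs[i].id);
          DeviceMotionNotify(dev, type, classes[nclasses]);
          nclasses++;
          /* a real toolkit would also remember 'type' to recognize events */
      }
      XSelectExtensionEvent(dpy, win, classes, nclasses);
      XFreeDeviceList(devs);
  }

plus a one-time

  DevicePresence(dpy, presence_type, presence_class);
  XSelectExtensionEvent(dpy, DefaultRootWindow(dpy), &presence_class, 1);

so the toolkit knows when to run it again.
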
>  
> > I don't have a great suggestion for fixing this problem. What I'd really
> > want at the toolkit level is to say "for this client extension send me 
> > all extended input events selected by my core event masks along with
> > the core events". But that's pretty radically different than how the XInput 
> > extension normally works.(Note that this doesn't make sense for floating slave
> > devices ... there what you really want to is to select for events
> > *without respect to the window hierarchy*). 
> 
> The current APIs do not really give you that option due to being slotted onto
> XI 1.x events and requests.
> 
> The important thing is that there are 3 types of devices:
> - master devices
>   These devices should be used for almost all applications to receive events.
> - attached slave devices
>   You should only ever execute requests on them if you want to change the
>   device's setting. Since those settings are copied into the master
>   automatically, there's no need for touching the master. There's hardly a
>   use case where you want to listen for events from an attached slave.
> - floating slaves
>   Basically same thing as non-core extension devices in XI 1.
> 
> One of the things I would like to do is to consider ditching XI 1.x
> compatibility and hence XI 1.x events. I've been too hesitant to do it while
> doing MPX, but after writing all the protocol specs and my thesis it's
> becoming more and more obvious that not doing it would probably hurt us more.
> 
> There's a fair number of requests that don't need to be deprecated, but the
> events do. Mainly, we have XGE now, so let's make use of it. This includes 32
> bit keycodes in events, relative + absolute valuators in the same event (if
> applicable), 32 bit device identifiers, etc. 
> Those XGE events that are already in XI2 allow for an AllDevices event
> selection, which arguably could be extended to include AllMasterDevices as
> well. Either way, it removes the need for tracking the XEventClass in the
> client, making things much saner and closer to core protocol event tracking
> from a toolkit/app perspective.

Would be neat to have :-)
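
Just to make the idea concrete, here is the shape I picture from the client
side; every name in this sketch is invented, it is not an existing API:

  /* Hypothetical, for illustration only: one mask per window, with
   * magic device ids for "all devices" / "all master devices". */
  #define AllDevices        0        /* invented for this sketch */
  #define AllMasterDevices  1        /* invented for this sketch */

  struct xi2_event_mask {            /* invented wire-side mask */
      unsigned int deviceid;         /* 32-bit id, or one of the above */
      unsigned int mask;             /* bitmask of XI event types */
  };

  /* The toolkit then needs exactly one selection per window:
   *
   *   struct xi2_event_mask m = { AllMasterDevices,
   *                               XI_MotionMask | XI_ButtonPressMask };
   *   xi2_select_events(dpy, win, &m, 1);     -- invented request/names
   *
   * and no XEventClass bookkeeping or re-selection on hierarchy changes. */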

> > Currently:
> > 
> > * The entire space of the device is mapped to the entire space
> >   of the screen.
> > 
> > But:
> > 
> > * You might want to attach a tablet to a particular monitor. One case
> >   of this is a device like the Wacom Cintiq where the tablet *is*
> >   the monitor. (This is also the case for many mobile devices, but
> >   those are seldom used with multiple monitors so the screen/monitor
> >   distinction doesn't matter.)
> 
> This bit is missing and should be added. The old input API had xf86SetScreen
> or so. With the current one, it's hard to cross screens since device ranges are
> scaled. (At least last time I checked)
> This only applies to absolute devices, of course; relative devices work fine.

Yeah, I suppose for completeness you want to handle Zaphod mode,
though it isn't the most interesting part.

> > * You might possibly want to attach the tablet to a window. My 
> >   experience years ago was that this wasn't actually that useful;
> >   it forced you to switch between mouse and tablet a lot, and 
> >   tablets are high resolution devices available at decent sizes
> >   for reasonable prices ... it's OK if you aren't drawing with the
> >   *entire* tablet.
> 
> This is what floating slave devices are for. You float the device, grab it and
> map the device coordinate range (in the application) to the desired area. I
> believe this should cover this use case sufficiently?

If this use case exists (a definite question), then I think you want
events delivered through the master pointer for this as well ...
client-side-drawn pointers are really painful for applications, and you want
normal event delivery to subwindows. However, I think we can say that
this case isn't significant. If it comes up, it could be handled by
watching the window position and adjusting the configuration of the
slave device.
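
For completeness, the client side of the floating-slave approach is not much
code; roughly (XI 1.x calls, with the axis ranges taken from the device's
XValuatorInfo):

  /* Sketch: grab the floating device so its events come to us, then
   * map its absolute axes onto the window's drawing area ourselves. */
  #include <X11/Xlib.h>
  #include <X11/extensions/XInput.h>

  static int grab_floating_device(Display *dpy, XDevice *dev, Window win)
  {
      XEventClass cls;
      int motion_type;

      DeviceMotionNotify(dev, motion_type, cls);
      (void) motion_type;             /* only cls is needed for the grab */
      return XGrabDevice(dpy, dev, win, False, 1, &cls,
                         GrabModeAsync, GrabModeAsync, CurrentTime);
  }

  /* min/max are the axis ranges from XValuatorInfo. */
  static void map_to_window(const XDeviceMotionEvent *m,
                            int min_x, int max_x, int min_y, int max_y,
                            int win_w, int win_h, double *wx, double *wy)
  {
      *wx = (double)(m->axis_data[0] - min_x) / (max_x - min_x) * win_w;
      *wy = (double)(m->axis_data[1] - min_y) / (max_y - min_y) * win_h;
  }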

> > Another thing to consider is aspect ratio ... I think right now
> > if you have a 4x3 tablet and a 3x2 screen a circle you draw or
> > trace on your tablet will come out squished. Probably the
> > attachment API needs to have a flag as to whether to preserve
> > aspect ratio or not.
> 
> Shouldn't this API be in the toolkit? If you're actually dealing with
> valuators, X doesn't touch them much and forwards them as-is. Any scaling,
> squishing or otherwise should be done in the toolkit/app.

Again, what I'm talking about is how the slave device coordinate ranges
are mapped onto pointer positions. This is something that has to be
configured in the server. You could make preserving the aspect ratio
part of the specification for that. Say you specify, for a device:

 Root window rectangle
 Preserve aspect boolean

But you could also push it off to the configuring client. If the setting
is a list of:

 Screen #
 Device rectangle
 Root window rectangle

Then anything is possible.
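
Concretely, the scaling that either the server or the configuring client
would apply is just this (a sketch; the rectangle type is made up):

  /* Map an absolute device rectangle onto a root-window rectangle,
   * optionally using one scale for both axes so circles stay circles. */
  typedef struct { double x, y, w, h; } Rect;   /* illustrative only */

  static void device_to_root(Rect dev, Rect root, int preserve_aspect,
                             double dx, double dy, double *rx, double *ry)
  {
      double sx = root.w / dev.w, sy = root.h / dev.h;

      if (preserve_aspect) {
          double s = sx < sy ? sx : sy;  /* letterbox with the smaller scale */
          root.x += (root.w - dev.w * s) / 2;
          root.y += (root.h - dev.h * s) / 2;
          sx = sy = s;
      }
      *rx = root.x + (dx - dev.x) * sx;
      *ry = root.y + (dy - dev.y) * sy;
  }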

> > Subpixel positioning
> > ====================
> > 
> > An important attribute of a graphics tablet is subpixel positioning; if
> > you draw a line at a small angle, you want that line to come out
> > smoothly antialiased, not with a step in it as the pen crosses pixel
> > boundaries.
> > 
> > The current code reports:
> > 
> >  rootX,rootY: integer screen pixel coordinates
> >  eventX, eventY: integer window-relative coordinates
> >  valuator[0], valuator[1]: device coordinates *quantized
> >    to screen pixel coordinates*.
> 
> I just checked the code and it looks as if the valuators reported are the
> device coordinates - for absolute devices. This should give you subpixel
> ranges, especially if you float the device and map it to arbitrary areas. Note
> that these valuators are always in absolute device coordinates and not
> relative to any window. 

The current code, as I read GetPointerEvents(), seems to kill the
subpixel positioning since it converts to screen coordinates and then
converts back to device coordinates. But beyond that, asking clients to
do the conversion between device valuator coordinates and root
window coordinates seems unnecessarily annoying ... the client has to be
aware of all the details about how device coordinates map to
screen coordinates and exactly replicate the algorithm that the server
uses.

> Relative devices however are clipped to screen coordinates. I guess the main
> problem here is simply a problem with XInput's axis specification. An axis can
> be absolute or relative, but not both. If it is relative, it will not get
> scaled by the server and is initialised (usually) with a range of -1/-1 or
> 0/-1.
>
> If such a device reports a relative axis movement by +10, it is difficult to
> scale that into subpixel values without knowing where a pixel ends and a
> subpixel starts. We'd have to look at device resolution + screen resolution to
> get any decent approximations. That said, I don't feel qualified yet to
> comment too much on that, having spent too little time on it. Simon, maybe you
> want to chime in here?

Subpixel positioning only makes sense when an axis is slaved to the X/Y
position of the pointer. Otherwise, the axis is just a number. 

When that slaving does occur, I don't see a problem with getting
subpixel relative motions ... the server has to know how it is
converting valuator motions into changes to the pointer position ...
that algorithm should work as well for non-integral pixels as for
integral ones. (You obviously have to track a subpixel position; this
can be done with dev->last.valuators[0].)
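
In code that is just an accumulator, something like the following (the
struct and scale parameters are invented, not the server's actual data
structures):

  /* Sketch: keep the pointer position as a float so relative deltas
   * smaller than a pixel aren't thrown away. */
  struct ptr_pos { double x, y; };             /* screen units, subpixel */

  static void apply_relative(struct ptr_pos *p, double scale_x,
                             double scale_y, int dx, int dy,
                             int *core_x, int *core_y)
  {
      p->x += dx * scale_x;               /* device units -> screen units */
      p->y += dy * scale_y;
      *core_x = (int) p->x;               /* integer position for core events */
      *core_y = (int) p->y;               /* p->x/p->y keep the remainder */
  }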

The problem in this area I do see is that each *device* has to be either
relative or absolute. So if you have a relative device that both
controls the pointer and has other interesting relative axes (the scroll
wheel, say), then how do you generate master events that have both:

 - Subpixel cursor positioning information
 - The relative axes as relative events (there's no reasonable way to 
   convert a scroll wheel scroll into an absolute event)

I suppose, as part of the convention saying that v[0]/v[1] for master
devices are fixed-point screen coordinates, we could say they are always
absolute, no matter what the 'model' is from the ValuatorClass. Ugly,
but seems workable.

Or, if you are creating a new set of events, then more things are
possible, like just sending rootX/rootY in fixed point. (Since the fixed
event structure size Rubicon has already been crossed... :-)
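
For example, assuming a 16.16 split (that split is my assumption, not
anything specified):

  /* Sketch: encode/decode a subpixel root coordinate as 16.16 fixed point. */
  typedef int fp1616;

  static fp1616 to_fp1616(double v)   { return (fp1616)(v * 65536.0); }
  static double from_fp1616(fp1616 v) { return v / 65536.0; }

  /* to_fp1616(123.25) == (123 << 16) + 16384, and from_fp1616() recovers
   * 123.25 exactly. */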

> > Identifying Axes
> > ================
> > 
> > XListInputDevices() tells you, for each device axis:
> >  
> >  resolution (counts/meter)
> >  min-value
> >  max-value
> > 
> > But what you don't find out is what the axis actually is ...
> > is it X/Y position? Pressure? Tilt? Something else?
> > This is information that the driver has, but it's thrown
> > away before being sent to X.
> > 
> > The master/slave setup makes this, if anything, worse, since
> > it continually mutates the master device to look like one or
> > the other slave device, but there is no actual indication of
> > which slave device one of these master events comes from that
> > I'm aware of.
> 
> The DeviceClassesChangedEvent notifies you when the master changes, _before_
> the event is processed. It includes newSlave, so you know that from now on
> until the next DCCE, all events come from this slave.
> It includes the same info as XListInputDevices (for this device only), so you
> can update your internal state without having to query the server.

Ah, missed the newSlave field. I don't think it really saves you from
having to ListInputDevices and track changes, since the two pieces that
are most important ... type and name ... are not provided in the
DeviceClassesChanged event, but doing that tracking is certainly feasible.

> > One possible fix here would be to add another type of
> > device information class ... ExtendedValuatorInfo, say, that
> > contains extra information per axis.
> 
> I was toying with the idea of axis labelling through device properties.
> If we can come up with a sort-of standard for label names (x, y, z, pressure,
> tilt, etc.) that might be the simplest and most flexible approach.
> 
> The biggest advantage here: in the end the server doesn't care about anything
> but x/y, so let the driver label the axes once to reflect their physical
> meaning. Then let the clients argue about how to interpret each axis.
> 
> I have not yet found the time to sit down and actually think it through so
> there may be flaws in that plan.

Leaving some issues I have with the details of the device property
interface aside, I think it's as workable to put extended valuator
information there as it is in the ClassInfo structures. But it's going
to be far simpler if the device properties are supplied by the driver
rather than being argued about by clients :-). Otherwise, you'll need
some vastly more complicated solution with .fdi files and hal/DeviceKit
just to get things labeled properly.
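
To make it concrete, the client side could then be as simple as this sketch,
assuming the device-property work ends up with an XGetDeviceProperty-style
entry point; the "Axis Labels" name and the ATOM-array format are invented
here:

  /* Hypothetical: a driver-set property holding one Atom per axis. */
  #include <stdio.h>
  #include <X11/Xlib.h>
  #include <X11/Xatom.h>
  #include <X11/extensions/XInput.h>

  static void print_axis_labels(Display *dpy, XDevice *dev)
  {
      Atom prop = XInternAtom(dpy, "Axis Labels", True);  /* invented name */
      Atom type;
      int format;
      unsigned long i, nitems, bytes_after;
      unsigned char *data = NULL;

      if (prop == None ||
          XGetDeviceProperty(dpy, dev, prop, 0, 16, False, XA_ATOM,
                             &type, &format, &nitems, &bytes_after,
                             &data) != Success || type != XA_ATOM)
          return;

      for (i = 0; i < nitems; i++) {
          char *name = XGetAtomName(dpy, ((Atom *) data)[i]);
          printf("axis %lu: %s\n", i, name);         /* e.g. "pressure" */
          XFree(name);
      }
      XFree(data);
  }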

> (As a side-effect, once we have axis labelling, it would also give us the
> ability to post scrollwheel events as buttons *and* as valuators, which
> arguably may lead to smoother scrolling.)
> 
> > Device Identity
> > ===============
> 
> See above, DCCE does it :)

Can you determine the initial condition?

- Owen


_______________________________________________
xorg mailing list
xorg@xxxxxxxxxxxxxxxxxxxxx
http://lists.freedesktop.org/mailman/listinfo/xorg
