As the title gives away, I am trying to drag points around using the mouse in Mayavi. To be more specific, these points are the control points of a subdivision curve, and their movement is to be constrained to a surface.
Despite having a fair amount of experience with Mayavi, this has proven problematic so far. I was hoping to be able to register a dragging event, but there seems to be no such thing. Bootstrapping from the picking event also seems a no-go, since it triggers on mouse-up, not mouse-down. The best I have been able to come up with is to pick a control point, hit a key to lock it to mouse movement, and then have the point follow the mouse until the key is hit again. The problem is: I can't figure out how to detect mouse movement in Traits or Mayavi in the first place! No on_mouse_move or anything anywhere, as far as I can tell. Surely this must be a gap in the documentation, or in my google-fu?
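For what it's worth, this is roughly the kind of thing I was hoping to be able to write; a purely hypothetical sketch, assuming VTK's observer mechanism is reachable through the scene interactor. The names `on_move` and `install_mouse_observer` are mine, and whether the callback receives the raw VTK object or the tvtk wrapper I have not verified:

```python
# Hypothetical sketch: register a callback for VTK's MouseMoveEvent on
# the scene interactor. Untested against a live scene.
def on_move(vtk_interactor, event):
    # the interactor reports the cursor position in display (pixel) coords
    x, y = vtk_interactor.GetEventPosition()
    return x, y

def install_mouse_observer(figure):
    # `figure` would be an mlab figure, e.g. mlab.gcf(); no imports here,
    # so the sketch stays readable without a GUI running
    figure.scene.interactor.add_observer('MouseMoveEvent', on_move)
```

If something along these lines works, the key-toggle scheme above could drive the point from inside `on_move`.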
Another possibility is that the actor mode could be used to achieve this. Could I register a callback somewhere on the trait that is modified when dragging the control points, then react to any change in their position, make them conform to my constraints, and update the rest of the visualization accordingly? I haven't gotten very far with this approach so far. Making everything but my control points non-draggable seems to fail, for starters.
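To be concrete, this is the shape of what I have in mind; a sketch under the assumption that the tvtk actor's position trait fires a change notification while dragging. `constrain_to_sphere` and `watch_actor` are my own names, and the constraint shown is just a projection onto a sphere:

```python
import numpy as np

def constrain_to_sphere(point, radius=1.0):
    # project a dragged control point back onto the sphere it lives on
    p = np.asarray(point, dtype=float)
    n = np.linalg.norm(p)
    return p * (radius / n) if n > 0 else p

def watch_actor(glyph, radius=1.0):
    # hypothetical hookup: react whenever the tvtk actor's position changes;
    # the projection is idempotent, so re-setting the position should not loop
    def handler():
        glyph.actor.position = constrain_to_sphere(glyph.actor.position, radius)
    glyph.actor.on_trait_change(handler, 'position')
```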
Any help would be greatly appreciated!
I have gotten a little further with the actor mode: I am using the sphere that the control points reside upon as a 'trackball' of sorts. I select a control point, and then move it according to changes in the orientation of the sphere.
Or at least, I try. The orientation vector contains three angles, in degrees. I have tried converting them into a rotation matrix using every possible convention, but the rotation of the sphere never quite matches that of the point to be moved. Grrr. Any idea where I could look up the exact definition of these orientation angles? Alternatively, I might grab the world coordinates of the glyph and do an SVD to extract the rotation...
But despite this almost working, I am not too happy with the solution. For reasons not entirely clear to me, moving things in the actor mode is really slow. It probably triggers a complete re-upload of the scene to the graphics card, even though only a change to the transforms should be required. But I suppose there is some nontrivial design reason for that. Either way, I need two such updates: one to capture the sphere rotation, and one to make my adjustments. The bottom line is that interactivity becomes somewhat of a mockery, even without any actual calculations, like updating my subdivision curves.
Simply being able to capture mouse movements would be a whole lot more convenient...
Still not much progress.
My next angle of attack will be to use one of the embedding strategies as described in the link.
Presumably, I should be able to listen to mouse motion events on the QT/wx window as a whole, and then act on that?
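Concretely, I imagine installing an event filter on the widget that hosts the scene; a sketch assuming the Qt4 backend. `MoveFilter` and `install_filter` are made-up names, and I have not verified this against an embedded Mayavi widget:

```python
def is_mouse_move(event_type, mouse_move_type):
    # trivial helper, split out so the filter logic stays readable
    return event_type == mouse_move_type

def install_filter(scene_widget):
    # `scene_widget` would be the Qt widget wrapping the Mayavi scene
    from PyQt4 import QtCore  # assuming the Qt4 backend is in use

    class MoveFilter(QtCore.QObject):
        def eventFilter(self, obj, event):
            if is_mouse_move(event.type(), QtCore.QEvent.MouseMove):
                print(event.pos())  # react to the motion here
            return False  # let the event propagate to the scene as usual

    filt = MoveFilter(scene_widget)  # parented, so it stays alive
    scene_widget.installEventFilter(filt)
    return filt
```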
Also, would it be possible to trigger the picking mechanism from code? So I can pick on mouse-down and then do proper dragging, without having to select a point first and then using a separate toggle to enable dragging?
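What I am picturing is something along these lines; again purely hypothetical, since I do not know whether the picker is actually exposed this way. `pick_at` is my name, and the attribute path follows what I think the tvtk wrapping looks like:

```python
def pick_at(figure, x, y):
    # hypothetical: drive the point picker directly from a mouse-down
    # handler, instead of waiting for the built-in picking key
    picker = figure.scene.picker.pointpicker
    picker.pick((x, y, 0), figure.scene.renderer)
    return picker.point_id  # -1 would mean nothing was hit
```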
Thanks again for any help you can offer!
Right. So it seems I cannot capture mouse events from Qt inside my Mayavi window, unless someone enlightens me on the low-level hacking needed to hook up the events.
On the plus side, dragging and redrawing my scene is somehow buttery smooth when using the Qt backend, which revives my interest in the 'trackball' approach.
All I need to do then is work out how to convert my sphere's orientation values into the motions I am interested in, but I guess I'll figure it out eventually. Ideally, I would also like to suppress the actor manipulations that don't make sense. I see actor objects have a dragable field, but it does not seem to do anything. I suppose that, given how smooth my updates are, I could simply suppress any such changes retroactively and it would look alright. But is there a better way to protect actors from actor-mode manipulations, by unhooking some sort of event?
I've got something working; I'll throw it out here for future reference, and in the hope that someone may find a more elegant solution.
First of all, I run with the Qt4 backend; for some reason that makes updating the scene much faster, instead of the 1 fps it otherwise rapidly devolves into.
Then I select a control point using the picking function. Then I drag the surface of the sphere it sits on, and let the control point move as if pegged to the sphere. (I've found the correct interpretation of the orientation angles in the VTK docs: ZXY rotation order, intrinsic rotations, in case you were wondering.)
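For future reference, here is that interpretation written out as I use it: intrinsic rotations in Z, X, Y order, acting on column vectors. `orientation_to_matrix` and `rotate_point` are my own helpers; double-check the signs against your VTK version:

```python
import numpy as np

def orientation_to_matrix(orientation_deg):
    # Build a rotation matrix from VTK-style orientation angles (degrees),
    # given as (x, y, z), interpreted as intrinsic rotations in Z, X, Y order.
    ox, oy, oz = np.radians(orientation_deg)
    cx, sx = np.cos(ox), np.sin(ox)
    cy, sy = np.cos(oy), np.sin(oy)
    cz, sz = np.cos(oz), np.sin(oz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    # intrinsic Z, then X, then Y composes as Rz @ Rx @ Ry on column vectors
    return Rz @ Rx @ Ry

def rotate_point(point, orientation_deg):
    # move a control point as if pegged to the rotated sphere
    return orientation_to_matrix(orientation_deg) @ np.asarray(point, dtype=float)
```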
The reason I don't let the actor mode work on the control points directly is this: my middle mouse button is a disaster (as they generally are), so I much prefer to drag things with my left mouse button. Moreover, it seems one cannot influence the actor being dragged from code until the mouse is released, which would lead to non-conforming geometries when dragging the point directly. Also, there may be a LOT of control points, and giving each of them their own points3d plot seems like a potential performance disaster.
It is rather easy to misclick and move any of the actors that are intended to be static. Thus, I want to constrain all DOFs I am not interested in manipulating. Perhaps there is a more clever way of doing this, but I now do so reactively. You can drag geometry out of place, but it will snap back to its default scale/position/orientation once you let go of the mouse. Not perfect, but it gets the job done.
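The reactive constraint itself is simple enough; a sketch of the idea, with `snapshot`, `restore`, and `pin_actor` being my own names and the observer hookup unverified:

```python
def snapshot(actor):
    # record the transform an actor should always return to
    return tuple(actor.position), tuple(actor.orientation), tuple(actor.scale)

def restore(actor, state):
    actor.position, actor.orientation, actor.scale = state

def pin_actor(actor, scene):
    # reactively undo accidental drags: once the mouse is released,
    # snap the actor back to its recorded default transform
    default = snapshot(actor)

    def on_release(vtk_obj, event):
        restore(actor, default)
        scene.render()

    scene.interactor.add_observer('LeftButtonReleaseEvent', on_release)
```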
A pretty ugly hack, for something that in an ideal world would be a single line of code, but I'll shut up and go update my own open-source projects before sounding too critical. All in all, it works quite well once you get used to these somewhat unorthodox controls.
A lot of what you are doing I don't know how to do, but it should be easy to ignore any of the actors that you are not interested in. All actors have a "pickable" trait, which you can turn off, so that those actors never respond to your picks. (These are tvtk actors, not Mayavi actors, but I think Mayavi actors correspond directly to tvtk actors.)
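In code it is just a one-liner per object; something like the following, where the exact attribute path (`.actor.actor`) may differ a little between versions:

```python
def set_pickable(mayavi_module, flag):
    # mayavi modules generally expose the underlying tvtk actor at
    # .actor.actor; flip its pickable trait there
    mayavi_module.actor.actor.pickable = flag
```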
On Thu, Aug 15, 2013 at 9:51 AM, Eelco Hoogendoorn <[hidden email]> wrote:
Athinoula A. Martinos Center for Biomedical Imaging
Enthought-Dev mailing list
Thanks for the feedback.
That's interesting information; indeed, this suppresses any kind of interaction in the actor mode. The thing is, though, most objects should be static, yet still pickable. Your suggestion has inspired an interesting hack, however: I can disable picking for my visible objects, and for those that I want to be able to select anyway, render them a second time with opacity set to 1e-6 and picking enabled, and apply the somewhat hacky-looking reactive constraints to those. They are not visible at all, since their contribution to the rendering gets rounded to zero.
Using this kind of mechanism, I could also ditch the trackball altogether and use 'ghost control points', whose manipulation I can read out and translate into conforming information on the fly. But you still need to click the point to be manipulated first, before a picking event is generated, before you can start dragging it. Given this constraint, the trackball is kinda nice, since your mouse cursor isn't directly over the part of the subdivision curve you are trying to tweak.
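The ghost setup would then look roughly like this; a sketch, with `make_ghost` my own name and the attribute paths as I understand them:

```python
def make_ghost(plot):
    # turn a duplicate plot into an invisible but pickable 'ghost':
    # at opacity 1e-6 its contribution to the framebuffer rounds to zero
    plot.actor.property.opacity = 1e-6
    plot.actor.actor.pickable = True
    return plot
```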