I need to know the normals to an isosurface at specified points. Using my_contour.normals.filter.output.points.to_array() appears to return all the normals, but it is not clear how to identify which normal corresponds to which point. In particular, the number of normals returned is generally different from the number of points or of polys.
I tried a more direct approach using probe_data, but what is returned is the scalar only, not the normal, at the specified point. How can I access the normal at a given point? Here is some sample code to show what I am trying to do:

=========8<-------------------------
from mayavi import mlab
from numpy import mgrid, array, reshape

N = 5
x, y, z = mgrid[-1.:1.:N*1j, -.5:.5:N*1j, -.2:.2:N*1j]
t = x**2 + 4*y**2 + 25*z**2
cont = mlab.contour3d(x, y, z, t, contours=[0.9], opacity=.5)

xyz = array((.57735, 0.28875, 0.11547))
xyz *= 0.94868
xyz0 = (0, 0, 0)
XYZ = zip(xyz0, xyz)
mlab.plot3d(XYZ[0], XYZ[1], XYZ[2])

a = cont.normals.filter.output
# this returns the scalar (0.9) only:
mlab.pipeline.probe_data(a, xyz[0], xyz[1], xyz[2])
# normals data is not in 'vectors' either:
# mlab.pipeline.probe_data(a, xyz[0], xyz[1], xyz[2], type='vectors')

# the array of normals has different dimensions than points or polys:
a1 = cont.normals.filter.output.points.to_array()
print 'normals shape', a1.shape
b = cont.contour.contour_filter.output.polys.to_array()
b = reshape(b, (b.size // 4, 4))[:, 1:]
print 'polys shape', b.shape
c = cont.contour.contour_filter.output.points.to_array()
print 'points shape', c.shape
=========8<-------------------------

_______________________________________________
Enthought-Dev mailing list
[hidden email]
https://mail.enthought.com/mailman/listinfo/enthought-dev
On Wed, Nov 14, 2012 at 12:48:34PM -0800, Robert Stephenson wrote:
> I need to know the normals to an isosurface at specified points. Using
> my_contour.normals.filter.output.points.to_array() appears to return all the
> normals, but it is not clear how to identify which normal corresponds to which
> point. In particular, the number of normals returned is generally different
> from the number of points or of polys.

I would say (haven't checked), that if you do the following:

    normals = iso.normals.filter.output.points.to_array()
    pd = iso.contour.outputs[0]
    polys = pd.polys.to_array()

polys is now the array of polygon points, in the format (n_pts, pt1, pt2, pt3, ..., n_pts, pt1, pt2, ...), as explained for lines on http://docs.enthought.com/mayavi/mayavi/data.html#extracting-lines -- i.e. if the first number is '3', the next 3 are the point ids of a triangle, and so on.

The corresponding points are given in pd.points.

HTH,

Gaël
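[Editorial note: Gaël's decoding rule can be sketched outside of Mayavi with a small helper; the name decode_polys is made up for this illustration, and the flat array below is toy data, not real contour output.]

=========8<-------------------------
import numpy as np

def decode_polys(flat):
    """Walk a flat VTK cell array (n_pts, id1, ..., n_pts, id1, ...)
    and return one tuple of point ids per polygon."""
    cells = []
    i = 0
    while i < len(flat):
        n = int(flat[i])                       # cell header: number of points
        cells.append(tuple(int(v) for v in flat[i + 1:i + 1 + n]))
        i += 1 + n                             # jump to the next cell header
    return cells

# two triangles stored in the flat format
flat = np.array([3, 0, 1, 2, 3, 2, 1, 3])
print(decode_polys(flat))  # [(0, 1, 2), (2, 1, 3)]
=========8<-------------------------

For triangulated isosurfaces every header is 3, which is why the reshape-to-(n, 4) trick in the code above also works.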
Dear all,

I am new to Mayavi. I would like to extract data (a scalar) from a .vtu file. Is it possible to have a function like "PLOT OVER LINE" of Paraview? Thanks in advance for your help.

Arouna
On Mon, Nov 19, 2012 at 09:08:49AM -0800, adarga wrote:
> Is it possible to have a function like "PLOT OVER LINE" of Paraview?

I don't know what that does, sorry.

Cheers,

Gaël
Thank you for your response.
Details about Paraview's "plot over line" can be found here: http://www.paraview.org/Wiki/ParaView/Users_Guide/Plotting_and_Probing_Data

It consists of extracting data over a straight line. For example, with a 2D or 3D figure, with "Plot over line" I can extract the data along one axis, or a profile of some quantity. Please excuse me if the explanation is not clear! I can provide some figures to illustrate my problem.

Thank you in advance,

Arouna
On Mon, Nov 19, 2012 at 09:26:10AM -0800, adarga wrote:
> It consists of extracting data over a straight line. For example, with a
> 2D or 3D figure, with "Plot over line" I can extract the data along one
> axis, or a profile of some quantity.

OK. You can use probe_data for this purpose:
http://docs.enthought.com/mayavi/mayavi/data.html#probing-data-at-given-positions

Hope this helps,

Gaël
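[Editorial note: a minimal sketch of the "plot over line" idea, using plain numpy for the line sampling. The helper line_points and the analytic field are stand-ins; with real data the sampled coordinates would be handed to mlab.pipeline.probe_data as in the linked docs.]

=========8<-------------------------
import numpy as np

def line_points(p0, p1, n):
    """n evenly spaced points along the segment from p0 to p1;
    with real data, feed these to
    mlab.pipeline.probe_data(src, pts[:, 0], pts[:, 1], pts[:, 2])."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return (1 - t) * np.asarray(p0, float) + t * np.asarray(p1, float)

pts = line_points([-1, 0, 0], [1, 0, 0], 5)
# a stand-in analytic field, evaluated along the line
values = pts[:, 0]**2 + pts[:, 1]**2 + pts[:, 2]**2
print(pts[:, 0])   # x coordinates: -1, -0.5, 0, 0.5, 1
print(values)      # field profile: 1, 0.25, 0, 0.25, 1
=========8<-------------------------

The returned values can then be plotted against the distance along the line, e.g. with matplotlib, reproducing Paraview's "Plot Over Line" view.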
Thanks, Gael, for the reply. What puzzles me is that the number of normals is not the same as the number of polys, both calculated as you suggest. By playing with the pipeline viewer I discovered that the degree of this mismatch depends on the feature_angle property of the iso_surface. Low feature_angles give very large numbers of normals and as the feature_angle increases the number of normals approaches the number of polys. I haven't found much in the way of documentation on this -- perhaps I was looking in the wrong places?
For my application, I need to know the surface normals at a handful of selected points on an iso-surface. Any suggestions short of calculating them ab initio? If I explicitly invoked the poly_data_normals filter on my surface would it behave more predictably, or is that what is used to generate the normals already?

On Nov 17, 2012, at 6:26 AM, Gael Varoquaux wrote:

> I would say (haven't checked), that if you do the following:
>
>     normals = iso.normals.filter.output.points.to_array()
>     pd = iso.contour.outputs[0]
>     polys = pd.polys.to_array()
>
> polys is now the array of polygon points, in the format (n_pts, pt1,
> pt2, pt3, ..., n_pts, pt1, pt2, ...), as explained for lines on
> http://docs.enthought.com/mayavi/mayavi/data.html#extracting-lines
> -- i.e. if the first number is '3', the next 3 are the point ids of a
> triangle, and so on.
>
> The corresponding points are given in pd.points.
On Mon, Nov 19, 2012 at 11:48:35PM -0800, Robert S. Stephenson wrote:
> Thanks, Gael, for the reply. What puzzles me is that the number of
> normals is not the same as the number of polys, both calculated as you
> suggest. By playing with the pipeline viewer I discovered that the
> degree of this mismatch depends on the feature_angle property of the
> iso_surface. Low feature_angles give very large numbers of normals and
> as the feature_angle increases the number of normals approaches the
> number of polys.

Indeed, you are right. The feature_angle is used to remove some triangles for which there is a large angle with the neighboring facets.

> I haven't found much in the way of documentation on this -- perhaps I
> was looking in the wrong places?

For this level of technical detail, you need to look in the VTK doc. It's kind of hairy.

> For my application, I need to know the surface normals at a handful of
> selected points on an iso-surface. Any suggestions short of
> calculating them ab initio?

I am not sure. I have never done anything close to that. You might try increasing the feature_angle to the max and see if you can establish a clear-cut correspondence.

> If I explicitly invoked the poly_data_normals filter on my surface
> would it behave more predictably, or is that what is used to generate
> the normals already?

Indeed, it will not change anything. Sorry, you are going beyond anything I've tried.

Gaël
Thanks, Gael, for your reply. I think I have finally figured out how the normals work -- thanks in part to the book Visualization Toolkit, which is pricey but worth the money if you want to understand VTK. When an edge is too sharp -- i.e. when its dihedral angle is less than (180 - feature_angle) -- its two vertices are duplicated, splitting the triangular mesh along that edge. This means that for every such edge the number of points goes up by 2 while the number of polys remains the same. And when you have the polydata from the normals filter, .points gives you the augmented list of vertices, while .point_data gives you the normals. The code attached illustrates this. If you look closely, you can see that the normals algorithm is less than perfect. One hopes that its accuracy will increase with the number of polygons.
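[Editorial note: the counting argument above can be checked on a toy mesh, independent of VTK, before the full Mayavi script below. The helper split_edge is a made-up, simplified version of the vertex duplication the normals filter performs; it tracks counts and connectivity only, not geometry.]

=========8<-------------------------
import numpy as np

# toy mesh: two triangles sharing the edge (1, 2)
points = np.zeros((4, 3))
tris = np.array([[0, 1, 2], [1, 3, 2]])

def split_edge(points, tris, edge):
    """Duplicate the two vertices of `edge` and rewire the second
    triangle using them, so the mesh is split along that edge."""
    a, b = edge
    new_a, new_b = len(points), len(points) + 1
    points = np.vstack([points, points[[a, b]]])   # 2 duplicated vertices
    tris = tris.copy()
    users = [i for i, t in enumerate(tris) if a in t and b in t]
    t = tris[users[1]]          # second triangle touching the edge
    t[t == a] = new_a
    t[t == b] = new_b
    return points, tris

pts2, tris2 = split_edge(points, tris, (1, 2))
# 2 extra points per split edge, same number of polys
print(len(pts2) - len(points), len(tris2) - len(tris))  # 2 0
=========8<-------------------------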
====================8<------------------------------------
from mayavi import mlab
from numpy import mgrid, array, vectorize, allclose, ones

N = 5
x, y, z = mgrid[-1.:1.:N*1j, -.5:.5:N*1j, -.2:.2:N*1j]
t = x**2 + 4*y**2 + 25*z**2
cont = mlab.contour3d(x, y, z, t, contours=[0.9], opacity=.5)

pd = cont.contour.outputs[0]
points = pd.points.to_array()   # the original (pre-normal-filter) array of points
polys = pd.polys.to_array()     # the original (pre-normal-filter) facets
polys = polys.reshape(polys.size // 4, 4)

norms_pd = cont.normals.outputs[0]
norm_pts = norms_pd.points.to_array()    # the (augmented) array of points on which normals are defined
norm_polys = norms_pd.polys.to_array()   # the connectivity of the facets after splitting acute edges
norm_polys = norm_polys.reshape(norm_polys.size // 4, 4)
normals = norms_pd.point_data.normals.to_array()  # the normals

print 'points & polys shapes', points.shape, polys.shape
print 'normal points & polys shapes', norm_pts.shape, norm_polys.shape
print 'normals shape', normals.shape

# check that the normals are normalized
mag_sqd = vectorize(lambda x, y, z: x*x + y*y + z*z,
                    doc='given x,y,z, return x**2+y**2+z**2')
allclose(ones(normals.shape[0]),
         mag_sqd(normals[:, 0], normals[:, 1], normals[:, 2]))

# draw each normal as a short segment pointing into the surface
xyz1 = norm_pts
xyz2 = norm_pts - 0.1*normals
XYZ = array(zip(xyz1, xyz2))
for i in range(XYZ.shape[0]):
    mlab.plot3d(XYZ[i, :, 0], XYZ[i, :, 1], XYZ[i, :, 2], tube_radius=None)

# get the scalar value at the points on the iso-surface
mlab.pipeline.probe_data(cont, xyz1[:, 0], xyz1[:, 1], xyz1[:, 2])
# You can only probe scalars, vectors or cells - not normals:
# mlab.pipeline.probe_data(normals, xyz1[:, 0], xyz1[:, 1], xyz1[:, 2])
====================8<------------------------------------

On Nov 20, 2012, at 12:51 AM, Gael Varoquaux wrote:

> Indeed, you are right. The feature_angle is used to remove some triangles
> for which there is a large angle with the neighboring facets.
> [...]
> Sorry, you are going beyond anything I've tried.
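[Editorial note: with the augmented point array and its per-point normals in hand, the original question -- the normal at a specified point -- reduces to a nearest-point lookup. A pure-numpy sketch; pts and normals below are toy stand-ins for the norm_pts and normals arrays extracted above.]

=========8<-------------------------
import numpy as np

def normal_at(query, pts, normals):
    """Return the normal attached to the surface point nearest to `query`."""
    d2 = np.sum((pts - np.asarray(query, float))**2, axis=1)
    return normals[np.argmin(d2)]

# toy data: four points on the unit sphere; there the outward normal
# at a point equals the point itself
pts = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [-1., 0., 0.]])
normals = pts.copy()
print(normal_at([0.9, 0.1, 0.0], pts, normals))  # nearest point is (1, 0, 0)
=========8<-------------------------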
Hi Robert,
Glad you worked it out!

On Sun, Nov 25, 2012 at 11:49:28AM -0800, Robert S. Stephenson wrote:
> thanks in part to the book Visualization Toolkit which is pricey but
> worth the money if you want to understand VTK.

Indeed, this book is very useful. Too bad it is not available for free.

G