Filters¶
These classes hold methods to apply general filters to any data type. By inheriting these classes into the wrapped VTK data structures, a user can easily apply common filters in an intuitive manner.
Example
>>> import pyvista
>>> from pyvista import examples
>>> dataset = examples.load_uniform()
>>> # Threshold
>>> thresh = dataset.threshold([100, 500])
>>> # Slice
>>> slc = dataset.slice()
>>> # Clip
>>> clp = dataset.clip(invert=True)
>>> # Contour
>>> iso = dataset.contour()
Dataset Filters¶
The pyvista.DataSetFilters
is inherited by pyvista.Common
making
all the following filters available as callable methods directly from any
PyVista dataset.
Methods

Generate points at the center of the cells in this dataset. 

Transforms cell data (i.e., data specified per cell) into point data (i.e., data specified at cell points). 

Clip a dataset by a plane by specifying the origin and normal. 

Clips a dataset by a bounding box defined by the bounds. 

This filter computes sizes for 1D (length), 2D (area) and 3D (volume) cells. 

Find and label connected bodies/volumes. 

Contours an input dataset by an array. 

Return a decimated version of a triangulation of the boundary of this mesh’s outer surface 

Constructs a 3D Delaunay triangulation of the mesh. 

Generate scalar values on a dataset. 

Extract the outer surface of a volume or structured grid dataset as PolyData. 

Copies a geometric representation (called a glyph) to every point in the input dataset. 

Interpolate values onto this mesh from the point data of a given 

Produces an outline of the full extent for the input dataset. 

Produces an outline of the corners for the input dataset. 

Transforms point data (i.e., data specified per node) into cell data (i.e., data specified within cells). 

Resample scalar data from a passed mesh onto this mesh using 

Mark points as to whether they are inside a closed surface. 

Slice a dataset by a plane at the specified origin and normal vector orientation. 

Create many slices of the input dataset along a specified axis. 

Creates three orthogonal slices through the dataset on the three Cartesian planes. 

Find, label, and split connected bodies/volumes. 

Integrate a vector field to generate streamlines. 

Texture map this dataset to a user defined plane. 

This filter will apply a 

Thresholds the dataset by a percentage of its range on the active scalar array or as specified 

Returns an all triangle mesh. 

Warp the dataset’s points by a point data scalar array’s values. 

Extract all the internal/external edges of the dataset as PolyData. 

class
pyvista.
DataSetFilters
¶ A set of common filters that can be applied to any vtkDataSet

cell_centers
(vertex=True)¶ Generate points at the center of the cells in this dataset. These points can be used for placing glyphs / vectors.
 Parameters
vertex (bool) – Enable/disable the generation of vertex cells.

cell_data_to_point_data
(pass_cell_data=False)¶ Transforms cell data (i.e., data specified per cell) into point data (i.e., data specified at cell points). The method of transformation is based on averaging the data values of all cells using a particular point. Optionally, the input cell data can be passed through to the output as well.
See also:
pyvista.DataSetFilters.point_data_to_cell_data()
 Parameters
pass_cell_data (bool) – If enabled, pass the input cell data through to the output

clip
(normal='x', origin=None, invert=True)¶ Clip a dataset by a plane by specifying the origin and normal. If no parameters are given, the clip will occur in the center of that dataset.
 Parameters
normal (tuple(float) or str) – Length 3 tuple for the normal vector direction. Can also be specified as a conventional string direction such as 'x' for (1, 0, 0) or '-x' for (-1, 0, 0), etc.
origin (tuple(float)) – The center (x, y, z) coordinate of the plane on which the clip occurs
invert (bool) – Flag on whether to flip/invert the clip

clip_box
(bounds=None, invert=True, factor=0.35)¶ Clips a dataset by a bounding box defined by the bounds. If no bounds are given, a corner of the dataset bounds will be removed.
 Parameters
bounds (tuple(float)) – Length 6 iterable of floats: (xmin, xmax, ymin, ymax, zmin, zmax)
invert (bool) – Flag on whether to flip/invert the clip
factor (float, optional) – If bounds are not given this is the factor along each axis to extract the default box.

compute_cell_sizes
(length=False, area=True, volume=True)¶ This filter computes sizes for 1D (length), 2D (area) and 3D (volume) cells.
 Parameters
length (bool) – Specify whether or not to compute the length of 1D cells.
area (bool) – Specify whether or not to compute the area of 2D cells.
volume (bool) – Specify whether or not to compute the volume of 3D cells.

connectivity
(largest=False)¶ Find and label connected bodies/volumes. This adds an ID array to the point and cell data to distinguish separate connected bodies. This applies a vtkConnectivityFilter filter which extracts cells that share common points and/or meet other connectivity criteria. (Cells that share vertices and meet other connectivity criteria such as scalar range are known as a region.)
 Parameters
largest (bool) – Extract the largest connected part of the mesh.

contour
(isosurfaces=10, scalars=None, compute_normals=False, compute_gradients=False, compute_scalars=True, rng=None, preference='point')¶ Contours an input dataset by an array. isosurfaces can be an integer specifying the number of isosurfaces in the data range or an iterable set of values for explicitly setting the isosurfaces.
 Parameters
isosurfaces (int or iterable) – Number of isosurfaces to compute across valid data range or an iterable of float values to explicitly use as the isosurfaces.
scalars (str, optional) – Name of scalars to threshold on. Defaults to currently active scalars.
compute_normals (bool, optional) – Compute normals on the output isosurfaces
compute_gradients (bool, optional) – Compute the gradient of the input scalars on the output isosurfaces
compute_scalars (bool, optional) – Preserves the scalar values that are being contoured
rng (tuple(float), optional) – If an integer number of isosurfaces is specified, this is the range over which to generate contours. Default is the scalar arrays’s full data range.
preference (str, optional) – When scalars is specified, this is the preferred scalar type to search for in the dataset. Must be either 'point' or 'cell'

decimate_boundary
(target_reduction=0.5)¶ Return a decimated version of a triangulation of the boundary of this mesh’s outer surface
 Parameters
target_reduction (float) – Fraction of the original mesh to remove. Default is 0.5. If target_reduction is set to 0.9, this filter will try to reduce the data set to 10% of its original size and will remove 90% of the input triangles.

delaunay_3d
(alpha=0, tol=0.001, offset=2.5)¶ Constructs a 3D Delaunay triangulation of the mesh. This helps smooth out a rugged mesh.
 Parameters
alpha (float, optional) – Distance value to control output of this filter. For a nonzero alpha value, only verts, edges, faces, or tetra contained within the circumsphere (of radius alpha) will be output. Otherwise, only tetrahedra will be output.
tol (float, optional) – tolerance to control discarding of closely spaced points. This tolerance is specified as a fraction of the diagonal length of the bounding box of the points.
offset (float, optional) – multiplier to control the size of the initial, bounding Delaunay triangulation.

elevation
(low_point=None, high_point=None, scalar_range=None, preference='point', set_active=True)¶ Generate scalar values on a dataset. The scalar values lie within a user specified range, and are generated by computing a projection of each dataset point onto a line. The line can be oriented arbitrarily. A typical example is to generate scalars based on elevation or height above a plane.
 Parameters
low_point (tuple(float), optional) – The low point of the projection line in 3D space. Default is bottom center of the dataset. Otherwise pass a length 3 tuple(float).
high_point (tuple(float), optional) – The high point of the projection line in 3D space. Default is top center of the dataset. Otherwise pass a length 3 tuple(float).
scalar_range (str or tuple(float), optional) – The scalar range to project to the low and high points on the line that will be mapped to the dataset. If None given, the values will be computed from the elevation (Z component) range between the high and low points. Min and max of a range can be given as a length 2 tuple(float). If the str name of a scalar array present in the dataset is given, the valid range of that array will be used.
preference (str, optional) – When a scalar name is specified for scalar_range, this is the preferred scalar type to search for in the dataset. Must be either 'point' or 'cell'.
set_active (bool, optional) – A boolean flag on whether or not to set the new Elevation scalar as the active scalar array on the output dataset.
Warning
This will create a scalar array named Elevation on the point data of the input dataset and overwrite an array named Elevation if present.

extract_geometry
()¶ Extract the outer surface of a volume or structured grid dataset as PolyData. This will extract all 0D, 1D, and 2D cells producing the boundary faces of the dataset.

glyph
(orient=True, scale=True, factor=1.0, geom=None)¶ Copies a geometric representation (called a glyph) to every point in the input dataset. The glyph may be oriented along the input vectors, and it may be scaled according to scalar data or vector magnitude.
 Parameters
orient (bool) – Use the active vectors array to orient the glyphs
scale (bool) – Use the active scalars to scale the glyphs
factor (float) – Scale factor applied to scaling array
geom (vtk.vtkDataSet) – The geometry to use for the glyph

interpolate
(points, sharpness=2, radius=1.0, dimensions=(101, 101, 101), pass_cell_arrays=True, pass_point_arrays=True)¶ Interpolate values onto this mesh from the point data of a given pyvista.PolyData object (typically a point cloud).
This uses a Gaussian interpolation kernel. Use the sharpness and radius parameters to adjust this kernel.
 Parameters
points (pyvista.PolyData) – The points whose values will be interpolated onto this mesh.
sharpness (float) – Set / Get the sharpness (i.e., falloff) of the Gaussian. By default Sharpness=2. As the sharpness increases the effects of distant points are reduced.
radius (float) – Specify the radius within which the basis points must lie.
dimensions (tuple(int)) – When interpolating the points, they are first interpolated onto a pyvista.UniformGrid with the same spatial extent; dimensions is the number of points along each axis for that grid.
pass_cell_arrays (bool, optional) – Preserve source mesh’s original cell data arrays
pass_point_arrays (bool, optional) – Preserve source mesh’s original point data arrays

outline
(generate_faces=False)¶ Produces an outline of the full extent for the input dataset.
 Parameters
generate_faces (bool, optional) – Generate solid faces for the box. This is off by default

outline_corners
(factor=0.2)¶ Produces an outline of the corners for the input dataset.
 Parameters
factor (float, optional) – controls the relative size of the corners to the length of the corresponding bounds

point_data_to_cell_data
(pass_point_data=False)¶ Transforms point data (i.e., data specified per node) into cell data (i.e., data specified within cells). Optionally, the input point data can be passed through to the output.
See also:
pyvista.DataSetFilters.cell_data_to_point_data()
 Parameters
pass_point_data (bool) – If enabled, pass the input point data through to the output

sample
(target, tolerance=None, pass_cell_arrays=True, pass_point_arrays=True)¶ Resample scalar data from a passed mesh onto this mesh using vtk.vtkResampleWithDataSet.
 Parameters
target (pyvista.Common) – The vtk data object to sample from; point and cell arrays from this object are sampled onto the nodes of this mesh
tolerance (float, optional) – tolerance used to compute whether a point in the source is in a cell of the input. If not given, tolerance is automatically generated.
pass_cell_arrays (bool, optional) – Preserve source mesh’s original cell data arrays
pass_point_arrays (bool, optional) – Preserve source mesh’s original point data arrays

select_enclosed_points
(surface, tolerance=0.001, inside_out=False, check_surface=True)¶ Mark points as to whether they are inside a closed surface. This evaluates all the input points to determine whether they are in an enclosed surface. The filter produces a (0,1) mask (in the form of a vtkDataArray) that indicates whether points are outside (mask value=0) or inside (mask value=1) a provided surface. (The name of the output vtkDataArray is “SelectedPointsArray”.)
The filter assumes that the surface is closed and manifold. A boolean flag can be set to force the filter to first check whether this is true. If false, all points will be marked outside. Note that if this check is not performed and the surface is not closed, the results are undefined.
This filter produces an output data array, but does not modify the input dataset. If you wish to extract cells or points, various threshold filters are available (i.e., threshold the output array).
 Parameters
surface (pyvista.PolyData) – Set the surface to be used to test for containment. This must be a pyvista.PolyData object.
tolerance (float) – The tolerance on the intersection. The tolerance is expressed as a fraction of the bounding box of the enclosing surface.
inside_out (bool) – By default, points inside the surface are marked inside or sent to the output. If inside_out is True, then the points outside the surface are marked inside.
check_surface (bool) – Specify whether to check the surface for closure. If on, then the algorithm first checks to see if the surface is closed and manifold.

slice
(normal='x', origin=None, generate_triangles=False, contour=False)¶ Slice a dataset by a plane at the specified origin and normal vector orientation. If no origin is specified, the center of the input dataset will be used.
 Parameters
normal (tuple(float) or str) – Length 3 tuple for the normal vector direction. Can also be specified as a conventional string direction such as 'x' for (1, 0, 0) or '-x' for (-1, 0, 0), etc.
origin (tuple(float)) – The center (x, y, z) coordinate of the plane on which the slice occurs
generate_triangles (bool, optional) – If this is enabled (False by default), the output will be triangles; otherwise, the output will be the intersection polygons.
contour (bool, optional) – If True, apply a contour filter after slicing

slice_along_axis
(n=5, axis='x', tolerance=None, generate_triangles=False, contour=False)¶ Create many slices of the input dataset along a specified axis.
 Parameters
n (int) – The number of slices to create
axis (str or int) – The axis to generate the slices along. Perpendicular to the slices. Can be a string name ('x', 'y', or 'z') or axis index (0, 1, or 2).
tolerance (float, optional) – The tolerance to the edge of the dataset bounds to create the slices
generate_triangles (bool, optional) – If this is enabled (False by default), the output will be triangles; otherwise, the output will be the intersection polygons.
contour (bool, optional) – If True, apply a contour filter after slicing

slice_orthogonal
(x=None, y=None, z=None, generate_triangles=False, contour=False)¶ Creates three orthogonal slices through the dataset on the three Cartesian planes. Yields a MultiBlock dataset of the three slices
 Parameters
x (float) – The X location of the YZ slice
y (float) – The Y location of the XZ slice
z (float) – The Z location of the XY slice
generate_triangles (bool, optional) – If this is enabled (False by default), the output will be triangles; otherwise, the output will be the intersection polygons.
contour (bool, optional) – If True, apply a contour filter after slicing

split_bodies
(label=False)¶ Find, label, and split connected bodies/volumes. This splits different connected bodies into blocks in a MultiBlock dataset.
 Parameters
label (bool) – A flag on whether to keep the ID arrays given by the connectivity filter.

streamlines
(vectors=None, source_center=None, source_radius=None, n_points=100, integrator_type=45, integration_direction='both', surface_streamlines=False, initial_step_length=0.5, step_unit='cl', min_step_length=0.01, max_step_length=1.0, max_steps=2000, terminal_speed=1e-12, max_error=1e-06, max_time=None, compute_vorticity=True, rotation_scale=1.0, interpolator_type='point', start_position=(0.0, 0.0, 0.0), return_source=False)¶ Integrate a vector field to generate streamlines. The integration is performed using a specified integrator, by default Runge-Kutta45. This supports integration through any type of dataset. Thus if the dataset contains 2D cells like polygons or triangles, the integration is constrained to lie on the surface defined by 2D cells.
This produces polylines as the output, with each cell (i.e., polyline) representing a streamline. The attribute values associated with each streamline are stored in the cell data, whereas those associated with streamline points are stored in the point data.
This uses a sphere as the source; set its location and radius via the source_center and source_radius keyword arguments. You can retrieve the source as pyvista.PolyData by specifying return_source=True.
 Parameters
vectors (str) – The string name of the active vector field to integrate across
source_center (tuple(float)) – Length 3 tuple of floats defining the center of the source particles. Defaults to the center of the dataset
source_radius (float) – Float radius of the source particle cloud. Defaults to one-tenth of the diagonal of the dataset’s spatial extent
n_points (int) – Number of particles present in source sphere
integrator_type (int) – The integrator type to be used for streamline generation. The recognized solvers are: RUNGE_KUTTA2 (2), RUNGE_KUTTA4 (4), and RUNGE_KUTTA45 (45). Options are 2, 4, or 45. Default is 45.
integration_direction (str) – Specify whether the streamline is integrated in the upstream or downstream directions (or both). Options are 'both', 'backward', or 'forward'.
surface_streamlines (bool) – Compute streamlines on a surface. Default False
initial_step_length (float) – Initial step size used for line integration, expressed in length units or cell length units (see step_unit parameter). Either the starting size for an adaptive integrator, e.g., RK45, or the constant / fixed size for non-adaptive ones, i.e., RK2 and RK4.
step_unit (str) – Uniform integration step unit. The valid unit is now limited to only LENGTH_UNIT ('l') and CELL_LENGTH_UNIT ('cl'). Default is CELL_LENGTH_UNIT: 'cl'.
min_step_length (float) – Minimum step size used for line integration, expressed in length or cell length units. Only valid for an adaptive integrator, e.g., RK45
max_step_length (float) – Maximum step size used for line integration, expressed in length or cell length units. Only valid for an adaptive integrator, e.g., RK45
max_steps (int) – Maximum number of steps for integrating a streamline. Defaults to
2000
terminal_speed (float) – Terminal speed value, below which integration is terminated.
max_error (float) – Maximum error tolerated throughout streamline integration.
max_time (float) – Specify the maximum length of a streamline expressed in LENGTH_UNIT.
compute_vorticity (bool) – Vorticity computation at streamline points (necessary for generating proper stream-ribbons using the vtkRibbonFilter).
interpolator_type (str) – Set the type of the velocity field interpolator to locate cells during streamline integration, either by points or cells. The cell locator is more robust than the point locator. Options are 'point' or 'cell' (abbreviations of 'p' and 'c' are also supported).
rotation_scale (float) – This can be used to scale the rate with which the streamribbons twist. The default is 1.
start_position (tuple(float)) – Set the start position. Default is (0.0, 0.0, 0.0)
return_source (bool) – Return the source particles as pyvista.PolyData as well as the streamlines. This will be the second value returned if True.

texture_map_to_plane
(origin=None, point_u=None, point_v=None, inplace=False, name='Texture Coordinates')¶ Texture map this dataset to a user defined plane. This is often used to define a plane to texture map an image to this dataset. The plane defines the spatial reference and extent of that image.
 Parameters
origin (tuple(float)) – Length 3 iterable of floats defining the XYZ coordinates of the BOTTOM LEFT CORNER of the plane
point_u (tuple(float)) – Length 3 iterable of floats defining the XYZ coordinates of the BOTTOM RIGHT CORNER of the plane
point_v (tuple(float)) – Length 3 iterable of floats defining the XYZ coordinates of the TOP LEFT CORNER of the plane
inplace (bool, optional) – If True, the new texture coordinates will be added to the dataset inplace. If False (default), a new dataset is returned with the texture coordinates
name (str, optional) – The string name to give the new texture coordinates if applying the filter inplace.

threshold
(value=None, scalars=None, invert=False, continuous=False, preference='cell')¶ This filter will apply a vtkThreshold filter to the input dataset and return the resulting object. This extracts cells where the scalar value in each cell satisfies the threshold criterion. If scalars is None, the input's active scalars are used.
 Parameters
value (float or iterable, optional) – Single value or (min, max) to be used for the data threshold. If iterable, then length must be 2. If no value is specified, the non-NaN data range will be used to remove any NaN values.
scalars (str, optional) – Name of scalars to threshold on. Defaults to currently active scalars.
invert (bool, optional) – If value is a single value, when invert is True cells are kept when their values are below parameter “value”. When invert is False cells are kept when their value is above the threshold “value”. Default is False: yielding above the threshold “value”.
continuous (bool, optional) – When True, the continuous interval [minimum cell scalar, maximum cell scalar] will be used to intersect the threshold bound, rather than the set of discrete scalar values from the vertices.
preference (str, optional) – When scalars is specified, this is the preferred scalar type to search for in the dataset. Must be either 'point' or 'cell'

threshold_percent
(percent=0.5, scalars=None, invert=False, continuous=False, preference='cell')¶ Thresholds the dataset by a percentage of its range on the active scalar array or as specified
 Parameters
percent (float or tuple(float), optional) – The percentage (0,1) to threshold. If value is out of 0 to 1 range, then it will be divided by 100 and checked to be in that range.
scalars (str, optional) – Name of scalars to threshold on. Defaults to currently active scalars.
invert (bool, optional) – When invert is True cells are kept when their values are below the percentage of the range. When invert is False, cells are kept when their value is above the percentage of the range. Default is False: yielding above the threshold “value”.
continuous (bool, optional) – When True, the continuous interval [minimum cell scalar, maximum cell scalar] will be used to intersect the threshold bound, rather than the set of discrete scalar values from the vertices.
preference (str, optional) – When scalars is specified, this is the preferred scalar type to search for in the dataset. Must be either 'point' or 'cell'

triangulate
()¶ Returns an all triangle mesh. More complex polygons will be broken down into triangles.
 Returns
mesh – Mesh containing only triangles.
 Return type

warp_by_scalar
(scalars=None, factor=1.0, normal=None, inplace=False, **kwargs)¶ Warp the dataset’s points by a point data scalar array’s values. This modifies point coordinates by moving points along point normals by the scalar amount times the scale factor.
 Parameters
scalars (str, optional) – Name of scalars to warp by. Defaults to currently active scalars.
factor (float, optional) – A scaling factor to increase the scaling effect. Alias scale_factor is also accepted; if present, it overrides factor.
normal (np.array, list, tuple of length 3) – User specified normal. If given, data normals will be ignored and the given normal will be used to project the warp.
inplace (bool) – If True, the points of the given dataset will be updated.

wireframe
()¶ Extract all the internal/external edges of the dataset as PolyData. This produces a full wireframe representation of the input dataset.
