FastSurferCNN.generate_hdf5
- class FastSurferCNN.generate_hdf5.H5pyDataset(params, processing='aparc')
Class representing an H5py dataset.
Attributes
dataset_name
(str) Path and name of the hdf5 file for the data loader.
data_path
(str) Directory with images to load.
slice_thickness
(int) Number of preceding and succeeding slices.
orig_name
(str) Default name of original images.
aparc_name
(str) Default name for ground truth segmentations.
aparc_nocc
(str) Segmentation without corpus callosum (used to mask this segmentation in the ground truth). If the segmentation in use has already been processed, do not set this argument.
available_sizes
(int) Sizes of images in the dataset.
max_weight
(int) Overall max weight for any voxel in weight mask.
edge_weight
(int) Weight for edges in weight mask.
hires_weight
(int) Weight for hires elements (sulci, WM strands, cortex border) in weight mask.
gradient
(bool) Turn on to use only the median frequency weight (no gradient).
gm_mask
(bool) Turn on to add cortex mask for hires-processing.
lut
(pd.DataFrame) DataFrame with the IDs present, their names, and colors for plotting.
labels
(np.ndarray) Full label list.
labels_sag
(np.ndarray) Sagittal label list.
lateralization
(Dict) Dictionary mapping between left and right hemispheres.
subject_dirs
(List[str]) List of subject directory names.
search_pattern
(str) Pattern to match files in the directory.
data_set_size
(int) Number of subjects.
processing
(str) Use aseg, aparc, or no specific mapping processing (Default: “aparc”).
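All of these settings are bundled into the single params dictionary passed to the constructor. The sketch below shows one way such a dictionary might be assembled; the key names are assumed to mirror the attribute names above and the paths are placeholders, so the actual keys should be checked against the module's argument parser before use.

    # Hypothetical construction sketch. Key names and file paths are assumptions
    # made for illustration; they mirror the attribute list above and may not
    # match the keys the module's argument parser actually produces.
    import pandas as pd

    from FastSurferCNN.generate_hdf5 import H5pyDataset

    params = {
        "dataset_name": "training_set_axial.hdf5",   # output hdf5 file (assumed key)
        "data_path": "/data/subjects",                # directory with images to load
        "slice_thickness": 3,                         # preceding/succeeding slices
        "orig_name": "mri/orig.mgz",                  # original image name (placeholder)
        "aparc_name": "mri/aparc.DKTatlas+aseg.mgz",  # ground-truth segmentation (placeholder)
        "aparc_nocc": None,                           # segmentation without corpus callosum
        "available_sizes": [256],                     # image sizes in the dataset
        "max_weight": 5,                              # max voxel weight in the weight mask
        "edge_weight": 5,                             # weight for edges in the weight mask
        "hires_weight": None,                         # weight for hires elements
        "gradient": True,                             # weight-mask gradient toggle
        "gm_mask": False,                             # add cortex mask for hires processing
        "lut": pd.read_csv("color_lut.tsv", sep="\t"),  # LUT DataFrame (placeholder file)
        "search_pattern": "*",                        # pattern to match subject directories
    }

    dataset = H5pyDataset(params, processing="aparc")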
Methods
__init__
(params[, processing]) Construct H5pyDataset object.
_load_volumes
(subject_path) Load the given image and segmentation and get the zoom values.
transform
(plane, imgs, zoom) Transform the image and zoom along the given axis.
_pad_image
(img, max_out) Pad the margins of the input image with zeros.
create_hdf5_dataset
(blt) Create an hdf5 dataset.
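Once the object is constructed, the hdf5 file is written by calling create_hdf5_dataset. The snippet below continues the sketch above; reading blt as a blank-slice threshold is an assumption and should be verified against the source. The underscore-prefixed helpers (_load_volumes, _pad_image) are internal by convention and are not meant to be called directly.

    # Continuing the sketch above: write the hdf5 dataset to params["dataset_name"].
    # 'blt' is assumed to be a blank-slice threshold (slices with too few labelled
    # voxels get skipped); verify this against the source before relying on it.
    dataset.create_hdf5_dataset(blt=10)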