Sample datasets

Several sample datasets are available for testing Whitebox Workflows for Python. The following table describes each of the available datasets:

| Dataset Name            | File Name              | Description                                            | Compressed Size |
|-------------------------|------------------------|--------------------------------------------------------|-----------------|
| Guelph_landsat          | band1.tif...band7.tif  | 7 bands of a sub-area of a Landsat 5 data set          | 10.9 MB         |
| Grand_Junction          | DEM.tif                | A small digital elevation model (DEM) in a high-relief area | 5.8 MB     |
| GTA_lidar               | GTA_lidar.laz          | An airborne lidar point cloud, in LAZ format           | 54.3 MB         |
| Jay_State_Forest        | DEM.tif                | A lidar raster digital elevation model (DEM)           | 27.7 MB         |
| Kitchener_lidar         | Kitchener_lidar.laz    | An airborne lidar point cloud, in LAZ format           | 41.6 MB         |
| London_air_photo        | London_air_photo.tif   | A high-resolution RGB air photo                        | 87.3 MB         |
| Southern_Ontario_roads  | roads_utm.shp          | Vector roads layer for a section of Southern Ontario   | 7.1 MB          |
| StElisAk                | StElisAk.laz           | An airborne lidar point cloud, in LAZ format           | 54.5 MB         |

The data can be downloaded using the download_sample_data function in the whitebox_workflows module. For example:

import whitebox_workflows as wbw

wbe = wbw.WbEnvironment()

wbe.working_directory = wbw.download_sample_data('Kitchener_lidar')
print(f'Data have been stored in: {wbe.working_directory}')

The data will be downloaded to a location within your HOME directory, and the download_sample_data function will return the path of the dataset directory. This is useful for setting the WbEnvironment.working_directory property, as in the script above. The data are downloaded as a compressed (zip) file, which is automatically decompressed once the download completes. Note that the download_sample_data function will time out after two minutes; you may encounter this limit if you attempt to download one of the larger sample datasets over a slower Internet connection.
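
To confirm that a dataset has been downloaded and decompressed, you can list the contents of the returned directory. The following is a minimal sketch that uses only download_sample_data and the standard library's os module; the Grand_Junction dataset name is taken from the table above.

import os

import whitebox_workflows as wbw

wbe = wbw.WbEnvironment()

# Download the Grand_Junction sample DEM; the returned path points to the
# decompressed dataset directory within your HOME directory.
wbe.working_directory = wbw.download_sample_data('Grand_Junction')
print(f'Data have been stored in: {wbe.working_directory}')

# List the decompressed files to verify that the download completed.
for file_name in sorted(os.listdir(wbe.working_directory)):
    print(file_name)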