pyrfuniverse.utils package

pyrfuniverse.utils.depth_processor module

pyrfuniverse.utils.depth_processor.image_bytes_to_point_cloud(rgb_bytes: bytes, depth_bytes: bytes, fov: float, local_to_world_matrix: ndarray)

Use the raw bytes of an RGB image and a depth image, together with the camera FOV and extrinsic matrix, to generate a point cloud in world coordinates.

Parameters:
  • rgb_bytes – Bytes, raw bytes of RGB image.

  • depth_bytes – Bytes, raw bytes of the depth image in EXR format.

  • fov – Float, camera Field Of View (FOV).

  • local_to_world_matrix – Numpy.ndarray, the local-to-world matrix of the camera.

Returns:

The point cloud.

Return type:

open3d.geometry.PointCloud

pyrfuniverse.utils.depth_processor.image_array_to_point_cloud(image_rgb: ndarray, image_depth: ndarray, fov: float, local_to_world_matrix: ndarray)

Use the RGB image and depth image, together with the camera FOV and extrinsic matrix, to generate a point cloud in world coordinates.

Parameters:
  • image_rgb – Numpy.ndarray, in shape (H,W,3), the RGB image.

  • image_depth – Numpy.ndarray, in shape (H,W,3), the depth image.

  • fov – Float, camera Field Of View (FOV).

  • local_to_world_matrix – Numpy.ndarray, the local-to-world matrix of the camera.

Returns:

The point cloud.

Return type:

open3d.geometry.PointCloud

pyrfuniverse.utils.depth_processor.depth_to_point_cloud(depth: ndarray, fov: float, organized: bool = False)

Use the depth image and the camera FOV to generate a point cloud in camera coordinates.

Parameters:
  • depth – Numpy.ndarray, in shape (H,W,3), the depth image.

  • fov – Float, camera Field Of View (FOV).

  • organized – Bool, whether to keep the points organized. If True, the returned point cloud has shape (H,W,3); if False, it has shape (H*W,3).

Returns:

The point cloud.

Return type:

open3d.geometry.PointCloud
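The back-projection this function performs can be sketched with plain NumPy. The sketch below assumes `fov` is the vertical field of view in degrees (check your camera's convention); `depth_to_points` is an illustrative name, not the library function:

```python
import numpy as np

def depth_to_points(depth, fov, organized=False):
    # Back-project a depth map (H, W) in meters into camera-space points.
    h, w = depth.shape
    # Focal length in pixels from the vertical FOV (assumed convention).
    f = (h / 2.0) / np.tan(np.deg2rad(fov) / 2.0)
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - w / 2.0) * depth / f
    y = (v - h / 2.0) * depth / f
    points = np.stack([x, y, depth], axis=-1)  # organized shape (H, W, 3)
    return points if organized else points.reshape(-1, 3)
```

With a 90-degree FOV the focal length equals half the image height, so the image corners back-project at 45 degrees from the optical axis.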

pyrfuniverse.utils.depth_processor.image_array_to_point_cloud_intrinsic_matrix(image_rgb: ndarray, image_depth: ndarray, intrinsic_matrix: ndarray, local_to_world_matrix: ndarray)

Use the RGB image and depth image, together with the camera intrinsic matrix and extrinsic matrix, to generate a point cloud in world coordinates.

Parameters:
  • image_rgb – Numpy.ndarray, in shape (H,W,3), dtype int, values in [0, 255], the RGB image.

  • image_depth – Numpy.ndarray, in shape (H,W,3), dtype float, real distances in meters, the depth image.

  • intrinsic_matrix – Numpy.ndarray, in shape (3,3), the intrinsic matrix of the camera.

  • local_to_world_matrix – Numpy.ndarray, in shape (4,4), the local-to-world matrix of the camera.

Returns:

The point cloud in Unity Space.

Return type:

open3d.geometry.PointCloud
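The geometry behind this variant can be sketched as a pinhole back-projection followed by a homogeneous transform. `depth_to_world_points` is an illustrative name; color handling, Unity's left-handed axis flips, and the open3d wrapper the real function uses are omitted:

```python
import numpy as np

def depth_to_world_points(image_depth, intrinsic_matrix, local_to_world_matrix):
    # Back-project each pixel through the pinhole model, then move the
    # camera-space points into world space with the (4, 4) extrinsic.
    h, w = image_depth.shape
    fx, fy = intrinsic_matrix[0, 0], intrinsic_matrix[1, 1]
    cx, cy = intrinsic_matrix[0, 2], intrinsic_matrix[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = image_depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    cam_points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    # Append a homogeneous 1 so the extrinsic can translate as well as rotate.
    homog = np.hstack([cam_points, np.ones((cam_points.shape[0], 1))])
    return (local_to_world_matrix @ homog.T).T[:, :3]
```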

pyrfuniverse.utils.depth_processor.image_bytes_to_point_cloud_intrinsic_matrix(rgb_bytes: bytes, depth_bytes: bytes, intrinsic_matrix: ndarray, local_to_world_matrix: ndarray)

Use the raw bytes of an RGB image and a depth image, together with the camera intrinsic matrix and extrinsic matrix, to generate a point cloud in world coordinates.

Parameters:
  • rgb_bytes – Bytes, raw bytes of RGB image.

  • depth_bytes – Bytes, raw bytes of depth image, EXR format.

  • intrinsic_matrix – Numpy.ndarray, the intrinsic matrix of the camera.

  • local_to_world_matrix – Numpy.ndarray, the local-to-world matrix of the camera.

Returns:

The point cloud.

Return type:

open3d.geometry.PointCloud

pyrfuniverse.utils.depth_processor.image_open3d_to_point_cloud_intrinsic_matrix(color: Image, depth: Image, intrinsic_matrix: ndarray, local_to_world_matrix: ndarray)

Use the RGB image and depth image in open3d.geometry.Image format, together with the camera intrinsic matrix and extrinsic matrix, to generate a point cloud in world coordinates.

Parameters:
  • color – open3d.geometry.Image, the RGB image.

  • depth – open3d.geometry.Image, the depth image.

  • intrinsic_matrix – Numpy.ndarray, the intrinsic matrix of the camera.

  • local_to_world_matrix – Numpy.ndarray, the local-to-world matrix of the camera.

Returns:

The point cloud.

Return type:

open3d.geometry.PointCloud

pyrfuniverse.utils.depth_processor.mask_point_cloud_with_id_color(pcd: PointCloud, image_mask: ndarray, color: list)

Mask the point cloud with the given segmentation mask and target color.

Parameters:
  • pcd – open3d.geometry.PointCloud, the point cloud.

  • image_mask – numpy.ndarray, the segmentation mask in shape (H,W,3).

  • color – List, the target color list.

Returns:

The masked point cloud.

Return type:

open3d.geometry.PointCloud
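Conceptually, this selects the points whose pixel in the segmentation mask matches the target color. A NumPy sketch on the organized arrays the cloud was built from (`mask_points_by_color` is an illustrative name; the real function operates on an open3d PointCloud):

```python
import numpy as np

def mask_points_by_color(points, colors, image_mask, target_color):
    # Boolean (H, W) map of pixels whose mask color equals target_color,
    # then select the corresponding points and colors.
    keep = np.all(image_mask == np.asarray(target_color), axis=-1)
    return points[keep], colors[keep]
```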

pyrfuniverse.utils.depth_processor.mask_point_cloud_with_id_gray_color(pcd: PointCloud, image_mask: ndarray, color: int)

Mask the point cloud with the given gray-scale segmentation mask and target gray value.

Parameters:
  • pcd – open3d.geometry.PointCloud, the point cloud.

  • image_mask – numpy.ndarray, the segmentation mask in shape (H,W).

  • color – Int, the target gray-scale value.

Returns:

The masked point cloud.

Return type:

open3d.geometry.PointCloud

pyrfuniverse.utils.depth_processor.filter_active_depth_point_cloud_with_exact_depth_point_cloud(active_pcd: PointCloud, exact_pcd: PointCloud, max_distance: float = 0.05)

Use the exact point cloud to filter the IR-based active point cloud by a tolerance distance.

Parameters:
  • active_pcd – open3d.geometry.PointCloud, the IR-based active point cloud.

  • exact_pcd – open3d.geometry.PointCloud, the exact point cloud.

  • max_distance – Float, the maximum tolerance distance, default 0.05.

Returns:

The filtered point cloud.

Return type:

open3d.geometry.PointCloud
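The idea is to keep each active point only if its nearest neighbour in the exact cloud lies within `max_distance`. A brute-force NumPy sketch (`filter_by_reference` is an illustrative name; a real implementation would use a KD-tree rather than the O(N*M) distance matrix below):

```python
import numpy as np

def filter_by_reference(active_points, exact_points, max_distance=0.05):
    # Pairwise distances between every active point and every exact point,
    # then keep the active points whose nearest exact point is close enough.
    diffs = active_points[:, None, :] - exact_points[None, :, :]
    nearest = np.linalg.norm(diffs, axis=-1).min(axis=1)
    return active_points[nearest <= max_distance]
```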

pyrfuniverse.utils.rfuniverse_utility module

pyrfuniverse.utils.rfuniverse_utility.EncodeIDAsColor(instance_id: int)

Encode the object id as a color.

Parameters:

instance_id – Int, the id of the object.

Returns:

The encoded color in [r, g, b, 255] format.

Return type:

List
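One common way to encode an integer id as a color is to pack its 24 bits into the r, g, b channels. The sketch below is illustrative only: the exact bit layout RFUniverse uses may differ, so any such encoder must always be paired with the matching decoder:

```python
def encode_id_as_color(instance_id):
    # Pack the low 24 bits of the id into [r, g, b, 255] (full alpha).
    r = (instance_id >> 16) & 0xFF
    g = (instance_id >> 8) & 0xFF
    b = instance_id & 0xFF
    return [r, g, b, 255]
```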

pyrfuniverse.utils.rfuniverse_utility.UnityEularToQuaternion(eular: list) list

Transform a Unity euler angle to a quaternion.

Parameters:

eular – List of length 3, representing the euler angle in [x, y, z] order and measured in degrees.

Returns:

The transformed quaternion in [x, y, z, w] format.

Return type:

List
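Unity composes Euler rotations around the Z, then X, then Y axes, so the equivalent quaternion is q = qy * qx * qz. A self-contained sketch of that convention in the [x, y, z, w] layout (not the library's actual implementation, which may differ in details):

```python
import numpy as np

def unity_euler_to_quaternion(eular):
    # Build a unit quaternion for a rotation of `degree` about one axis
    # (axis 0 = x, 1 = y, 2 = z), in [x, y, z, w] layout.
    def axis_quat(axis, degree):
        half = np.deg2rad(degree) / 2.0
        q = [0.0, 0.0, 0.0, np.cos(half)]
        q[axis] = np.sin(half)
        return q

    # Hamilton product of two [x, y, z, w] quaternions.
    def mul(a, b):
        ax, ay, az, aw = a
        bx, by, bz, bw = b
        return [aw * bx + ax * bw + ay * bz - az * by,
                aw * by - ax * bz + ay * bw + az * bx,
                aw * bz + ax * by - ay * bx + az * bw,
                aw * bw - ax * bx - ay * by - az * bz]

    qx = axis_quat(0, eular[0])
    qy = axis_quat(1, eular[1])
    qz = axis_quat(2, eular[2])
    # Unity's Z-then-X-then-Y application order.
    return mul(mul(qy, qx), qz)
```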

pyrfuniverse.utils.rfuniverse_utility.UnityQuaternionToEular(quaternion: list) list

Transform a Unity quaternion to an euler angle.

Parameters:

quaternion – List of length 4, representing the quaternion in [x, y, z, w] order.

Returns:

The transformed euler angle in [x, y, z] order, measured in degrees.

Return type:

List

pyrfuniverse.utils.rfuniverse_utility.GetMatrix(quat=[0, 0, 0, 1]) ndarray

Transform the quaternion into a transformation matrix.

Parameters:

quat – List of length 4, representing the [x, y, z, w] quaternion.

Returns:

The transformation matrix.

Return type:

numpy.ndarray
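The rotation part of such a matrix follows the standard quaternion-to-matrix formula. A sketch for the [x, y, z, w] layout (`quat_to_matrix` is an illustrative name; GetMatrix may additionally embed the rotation in a larger transformation matrix):

```python
import numpy as np

def quat_to_matrix(quat):
    # Standard conversion of a unit [x, y, z, w] quaternion to a 3x3
    # rotation matrix.
    x, y, z, w = quat
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ])
```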

pyrfuniverse.utils.coordinate_system_converter module

class pyrfuniverse.utils.coordinate_system_converter.CoordinateSystemConverter(cs1_direction=['right', 'up', 'forward'], cs2_direction=['right', 'up', 'forward'])

Bases: object

Coordinate System Converter class.

Parameters:
  • cs1_direction – List, the visual direction corresponding to the x, y, z axes. Each entry can be "left", "right", "up", "down", "forward" or "back", an abbreviation "l", "r", "u", "d", "f" or "b", or any of these prefixed with "-" to negate the direction.

  • cs2_direction – List, the visual direction corresponding to the x, y, z axes, with the same options as cs1_direction.

cs1_pos_to_cs2_pos(pos: list) list

Convert a position from Coordinate System 1 to Coordinate System 2.

Parameters:

pos – List of length 3, position of Coordinate System 1.

Returns:

List of length 3, position of Coordinate System 2.

Return type:

list

cs2_pos_to_cs1_pos(pos: list) list

Convert a position from Coordinate System 2 to Coordinate System 1.

Parameters:

pos – List of length 3, position of Coordinate System 2.

Returns:

List of length 3, position of Coordinate System 1.

Return type:

list
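Under the hood, a position conversion like this is a change of basis built from the two direction triples. A NumPy sketch of that linear algebra (`basis_change` and `_DIRS` are illustrative names; the class also accepts the abbreviations and "-" prefixes listed above, which this sketch omits):

```python
import numpy as np

# Each named direction as a vector in a shared reference frame.
_DIRS = {"right": [1, 0, 0], "left": [-1, 0, 0],
         "up": [0, 1, 0], "down": [0, -1, 0],
         "forward": [0, 0, 1], "back": [0, 0, -1]}

def basis_change(cs1_direction, cs2_direction):
    # Columns of b1/b2 are the x, y, z axes of each system expressed in
    # the shared frame; a shared vector v satisfies v = b1 @ p1 = b2 @ p2,
    # so p2 = inv(b2) @ b1 @ p1.
    b1 = np.array([_DIRS[d] for d in cs1_direction], dtype=float).T
    b2 = np.array([_DIRS[d] for d in cs2_direction], dtype=float).T
    return np.linalg.inv(b2) @ b1

# Example: Unity axes (x right, y up, z forward) to a ROS-style frame
# (x forward, y left, z up).
m = basis_change(["right", "up", "forward"], ["forward", "left", "up"])
```

With this `m`, the Unity position [1, 2, 3] maps to [3, -1, 2]: the forward component becomes x, "right" becomes negative "left", and "up" becomes z.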

cs1_quat_to_cs2_quat(quat: list) list

Convert a quaternion from Coordinate System 1 to Coordinate System 2.

Parameters:

quat – List of length 4, quaternion [x, y, z, w] of Coordinate System 1.

Returns:

List of length 4, quaternion [x, y, z, w] of Coordinate System 2.

Return type:

list

cs2_quat_to_cs1_quat(quat: list) list

Convert a quaternion from Coordinate System 2 to Coordinate System 1.

Parameters:

quat – List of length 4, quaternion [x, y, z, w] of Coordinate System 2.

Returns:

List of length 4, quaternion [x, y, z, w] of Coordinate System 1.

Return type:

list

cs1_scale_to_cs2_scale(scale: list) list

Convert a scale from Coordinate System 1 to Coordinate System 2.

Parameters:

scale – List of length 3, scale of Coordinate System 1.

Returns:

List of length 3, scale of Coordinate System 2.

Return type:

list

cs2_scale_to_cs1_scale(scale: list) list

Convert a scale from Coordinate System 2 to Coordinate System 1.

Parameters:

scale – List of length 3, scale of Coordinate System 2.

Returns:

List of length 3, scale of Coordinate System 1.

Return type:

list

cs1_matrix_to_cs2_matrix(matrix) ndarray

Convert a rotation matrix from Coordinate System 1 to Coordinate System 2.

Parameters:

matrix – List or np.ndarray of shape (3,3), rotation matrix of Coordinate System 1.

Returns:

np.ndarray of shape (3,3), rotation matrix of Coordinate System 2.

Return type:

np.ndarray

cs2_matrix_to_cs1_matrix(matrix) ndarray

Convert a rotation matrix from Coordinate System 2 to Coordinate System 1.

Parameters:

matrix – List or np.ndarray of shape (3,3), rotation matrix of Coordinate System 2.

Returns:

np.ndarray of shape (3,3), rotation matrix of Coordinate System 1.

Return type:

np.ndarray
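Converting a rotation matrix between coordinate systems amounts to conjugating it by the change-of-basis matrix C: R2 = C R1 C^-1. A sketch of that idea (`convert_rotation` is an illustrative name, and C here stands for whatever change-of-basis matrix the converter derives from its direction triples):

```python
import numpy as np

def convert_rotation(matrix_cs1, change_of_basis):
    # Conjugate the cs1 rotation by C so it acts on cs2 coordinates:
    # if p2 = C @ p1, then R2 = C @ R1 @ inv(C) satisfies
    # R2 @ p2 = C @ (R1 @ p1).
    c_inv = np.linalg.inv(change_of_basis)
    return change_of_basis @ matrix_cs1 @ c_inv
```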