I assume there is enough demand to make a dedicated thread for Azure Kinect requests a reasonable idea.
Please post your Azure Kinect RFEs here.
Please add access to the calibration data of the camera sensors, and expose it in the standard OpenCV form (camera matrix plus distortion coefficients) as well.
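To illustrate what "the standard OpenCV form" means here, a minimal sketch of repacking pinhole intrinsics and rational-model distortion parameters (the kind the Azure Kinect calibration reports) into OpenCV's 3x3 camera matrix and 8-element distortion vector. The function name and the example numbers are made up for illustration:

```python
import numpy as np

def intrinsics_to_opencv(fx, fy, cx, cy, k1, k2, k3, k4, k5, k6, p1, p2):
    """Repack pinhole + rational distortion parameters into OpenCV's
    camera matrix and distortion-coefficient vector."""
    camera_matrix = np.array([[fx, 0.0, cx],
                              [0.0, fy, cy],
                              [0.0, 0.0, 1.0]])
    # OpenCV's 8-element rational model order: k1, k2, p1, p2, k3, k4, k5, k6
    dist_coeffs = np.array([k1, k2, p1, p2, k3, k4, k5, k6])
    return camera_matrix, dist_coeffs

# Made-up numbers; real values would come from the device calibration blob.
K, d = intrinsics_to_opencv(fx=600.0, fy=600.0, cx=320.0, cy=240.0,
                            k1=0.1, k2=-0.05, k3=0.01, k4=0.0, k5=0.0, k6=0.0,
                            p1=0.001, p2=-0.001)
```

With that, the usual OpenCV routines (undistortion, solvePnP, stereo rectification) can consume the sensor calibration directly.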
Coordinate transformation should be generalized to any combination of sensors.
A typical example: getting the color/point-cloud data projected into the IR (depth) camera's space.
(Currently the SDK only seems to transform data into the color camera's projected space.)
(Update: maybe I'm wrong; you do seem to offer this option, but it does not seem to work well.)
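For reference, the transformation being requested can be sketched with plain pinhole geometry (a toy model with made-up intrinsics and extrinsics, not the SDK's own implementation): back-project a depth pixel to a 3D point, apply the depth-to-color rigid transform, and reproject into the color image to know where to sample:

```python
import numpy as np

def unproject(u, v, z, fx, fy, cx, cy):
    # Pinhole back-projection of pixel (u, v) at depth z (meters) to a 3D point
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def project(p, fx, fy, cx, cy):
    # Pinhole projection of a 3D point back to pixel coordinates
    return np.array([fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy])

# Toy intrinsics shared by both cameras, and a made-up depth->color
# rigid transform (R, t); real values come from the device calibration.
fx, fy, cx, cy = 500.0, 500.0, 320.0, 240.0
R = np.eye(3)                   # no rotation between the sensors in this toy setup
t = np.array([0.03, 0.0, 0.0])  # 3 cm horizontal baseline

p_depth = unproject(320.0, 240.0, 1.0, fx, fy, cx, cy)  # point on the depth optical axis, 1 m away
p_color = R @ p_depth + t                               # same point in the color camera frame
uv = project(p_color, fx, fy, cx, cy)                   # pixel to sample in the color image
```

Running this for every valid depth pixel yields the color data resampled into the depth camera's projected space, which is exactly the generalization asked for above.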
Getting the current bandwidth sent from the device for each frame would also help.
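Until such a counter exists, a rough per-stream estimate can be derived from the capture's buffer size and frame rate (this is only payload bytes and ignores USB protocol overhead; the function and numbers are illustrative):

```python
def estimated_bandwidth_mbps(frame_bytes, fps):
    """Rough bandwidth estimate: payload bytes per frame times frame rate,
    converted to megabits per second (ignores USB protocol overhead)."""
    return frame_bytes * fps * 8 / 1e6

# e.g. a 640x576 16-bit depth image streamed at 30 fps:
depth_bytes = 640 * 576 * 2
bw = estimated_bandwidth_mbps(depth_bytes, 30)  # ~176.9 Mbps
```

A per-frame counter reported by the device itself would still be better, since compressed color streams vary in size from frame to frame.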