Parameters
Overview
SENSR is a highly configurable solution, allowing users to optimize the configuration for their site. This section describes SENSR's settings and their default values.
SENSR settings can be classified into 4 categories:
- General parameters, defining the system's overall behavior
- Master node parameters, defining the orchestration with algo nodes
- Algo node parameters, defining the parameters used by the different algorithms to optimize object tracking
- Visualization parameters, defining the user interface settings
important
Tracking results can vary widely depending on the settings applied to a site. Tuning the parameters, however, comes with trade-offs: over-optimizing for certain aspects may come at the cost of degrading others.
General Parameters
General parameters are shared across the whole project by all components; they can be found in Settings > General Parameters.
Location | Field | Name | Description | Unit | Default | Range |
---|---|---|---|---|---|---|
Common | Output | Runtime Mode Point-Cloud Publishing Level | The level of point cloud published to the output port. Higher levels will greatly increase the bandwidth between the algo nodes and the master node! - No Points - Object Points - Object Points & Background Points - All Points | int | Object Points | |
Zone Setup Point-Cloud Publishing Level | The level of point cloud published to the output port. Higher levels will greatly increase the bandwidth between the algo nodes and the master node! - No Points - Object Points - Object Points & Background Points - All Points | int | Object Points & Background Points |||
Background Points Update Interval | Time between updates of background and ground point-cloud information from algo nodes to the master node. Frequent updates can be resource-intensive. | seconds | 1 |||
Sensor Status Update Interval | Time between updates of sensor information from algo nodes to the master node. | seconds | 0.5 | |||
Publish Losing Event | Enables SENSR to publish a losing event when tracking is lost for a given object | boolean | FALSE |||
Output Sending Type Selection | Method for sending and updating new results (a timing sketch follows this table). - Instant Sending: Publish a result whenever a new one is available. - Fixed-Period Sending: Publish the most recent result periodically. The published result might already have been published in a previous frame. - Fixed-Period w/ Delay: Publish the most recent result periodically. If there is no new result, the sending is postponed until a new result is received or a maximum delay time has passed. - All-Update or Fixed Delay | int | Fixed-Period Sending |||
Update Interval | Publish interval for output sending type 1 - Fixed-Period Sending and type 2 - Fixed-Period with Max-Delay | seconds | 0.1 | |||
Max Delay | Maximum delay time for "Fixed-Period with Max-Delay". | seconds | 0.05 | |||
Point-Cloud Bandwidth Reduction | This decreases the communication burden by reducing point-cloud data size. If enabled, this option affects the following: - In calibration mode, only one point cloud will be sent (circularly) at a time. - In runtime mode, background and ground points will be downsampled. - Points out of detection range will not be published regardless of the point cloud publishing level. | boolean | FALSE |||
Downsampling Resolution | Resolution of the downsampling method for point-cloud bandwidth reduction. The value can be between 0 and 1. | meters | 0.3 |||
Input | Maximum Lidar Input Interval | If there is no response from one of the lidar sensors for this time duration, that lidar sensor will be considered dead. | seconds | 20 | [0,30] | |
Restart Launchers of Unresponsive Lidars | boolean | TRUE | ||||
Launcher Restart Threshold | When enabled, the system automatically restarts lidar launchers if the algo node does not receive any point cloud messages within the designated lidar restart threshold. | seconds | 30 | [5, +inf] | ||
Number of CPU Threads | Number of CPU threads allocated to the recovery of LiDARs. A high value might cause instability as SENSR would allocate most processing power to LiDAR recovery | int | 1 | [1, +inf] | ||
Frame Update Setting | Method for updating new frames from received results - Instant: Update frames as soon as results arrive - Use buffer: Update frames using buffer that stores frames after specified delay time in case of network traffic | int | Instant | |||
Minimum Pending Time (for Instant update) | Due to connection issues, the time interval between two point-cloud messages of one lidar can be unreasonably small. This minimum interval ensures that one lidar point-cloud message of the pipeline has a reasonable time gap with the next one. A message whose time difference with the previous message is below this minimum interval will be discarded. (This option is only visible if "Instant" is selected in the Frame Update Method.) | seconds | 0.02 | [0, 0.1] | |
Frame Interval [s] (for Instant update) | Whenever data from all LiDARs have arrived, or the frame interval has been reached, the frame will be bundled and processed by the detection node. Only one frame will be bundled per frame interval. | 0.1 | [0.05, 5] | |||
Frame Jitter [s] (for Instant update) | The frame bundler will reuse LiDAR point clouds from previous frames, taking into account the frame jitter. | 0.02 | [0, +inf] | |||
Number of Frames Operating Delayed (for Using Buffer Update) | Number of frames operating behind the current time. The frame duration is specified by the frame interval parameter. | 3 | [1, +inf] | |||
Frame Interval [s] (For Using Buffer Update) | Perception pipeline will be triggered with new input according to the frame interval. | 0.15 | [0.1, +inf] | |||
Max Number of Parallel Rosbag Parser | This refers to the maximum number of threads that will be utilized for parsing the rosbag files. | 5 | [1, +inf] | |||
Min Number of Topics per Rosbag Parser | This specifies the minimum number of topics that a single parser thread must handle before a second thread is employed. For example, if you have 28 rosbag topics and the Minimum Number of Topics Per Rosbag Parser is set to 20, then the number of rosbag parser threads will be two (this assumes that the Maximum Number of Parallel Rosbag Parsers is set to 2 or higher). | 5 | [1, +inf] | |||
Retro-Reflection Tracking | Enable Retro-Reflection Tracking | Enables marking of objects as retro-reflective. For this to work properly the retro-reflectivity threshold setting for each lidar also needs to be tuned properly. | boolean | FALSE | ||
Min Point Count | For an object to be considered retro-reflective it has to have at least this many retro-reflective points. | int | 5 | [1, +inf] | ||
History Threshold | The number of frames an object will stay retro-reflective after being classified as retro-reflective. | int | 10 | [0, +inf] | |
Reflection Filter | Use Simple Filter [2D] | Enable simple filtering algorithm for faster but less accurate filtering. The height will not be considered. | boolean | TRUE | ||
Use detection limit | If enabled, any sensor further from the reflection filter than the detection limit will not be considered during filtering. A smaller detection limit will make filtering faster. | boolean | TRUE | |||
Detection limit [m] | Detection range of the option "Use detection limit". | meters | 50 | [0, +inf] | ||
Zone Setting | Trigger Zone Events with MISC Objects | Enabling this will trigger zone events with MISC objects. | boolean | FALSE | ||
Zone Filter Resolution [m] | Grid Cell Size for Zone Filter. | meters | 0.1 | [0, +inf] | ||
Tracking History | Tracking History Length | The number of history positions to keep | int | 100 | [0, +Inf.] | |
History Smoothing Level | Trajectory smoothing level | int | 3 | [0, 3] | ||
Trajectory Prediction | Number of Prediction Steps | Number of prediction steps | int | 10 | [0, +Inf.] | |
Prediction Step Time-Horizon | Time step between two predicted points | seconds | 0.1 | [0, +Inf.] | ||
Max. Acceleration Filter Threshold | Maximum acceleration for prediction | m/s2 | 15 | [0, +Inf.] | ||
Data Collection | Maximum Used Storage Percentage to Allow Recording | If the storage used overshoots the defined percentage, the system will stop recording. | int | 90 | [10, 99] | |
Recording Queue Capacity | Defines the recording buffer's capacity. | int | 50 | [10, +inf] | ||
Enable Anomaly Data-Collector | When enabling the Anomaly Data-Collector, the latest "Num Look-Back Frames" are temporarily stored on the disk. Once the anomaly detector is triggered, the Look-Back and Look-Ahead frames are saved | boolean | FALSE |||
Only Record Foreground Points | Enable this option to only record foreground points and therefore reduce the size of recordings | boolean | FALSE | |||
Num Look-Back Frames | Number of previous frames temporarily stored on the disk | int | 100 | [1, +inf] | ||
Num Look-Ahead Frames | Number of frames stored on disk after anomaly detected | int | 100 | [1, +inf] | ||
Logging | Maximum Log File Size [MB] | Sets the maximum file size for a single log file. Note that the total size of the logs can be much larger. | MB | 10 | [1, +inf] | |
Maximum Number of Log Files Per Directory | The number of rotating log files that will be kept for the master node and each algo node. | int | 3 | [2, 999] | ||
Maximum Number of Log Files Per Node | The specified number of the most recent log folders that will be retained for master node and each individual algo and edge node. | int | 5 | [2, 999] | ||
Recovery | Connection Timeout [s] | The time after which an unresponsive node is considered as dead. After exceeding this timeout, recovery is attempted. | seconds | 20 | [3, +inf] | |
Recovery Attempt Timeout [s] | If a recovery attempt was not successful after the specified timeout, the recovery attempt is considered as not successful and aborted. | seconds | 30 | [3, +inf] | ||
Maximum Number of Recovery Tries | int | 10 | [0, +inf] | |||
Time Between Recovery Attempts [s] | seconds | 60 | [0, +inf] |
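The three output sending types above differ only in when a computed result is published (see the Output Sending Type Selection, Update Interval, and Max Delay rows). The following Python sketch is a minimal, conceptual illustration of that timing logic; it is not SENSR code, and the helper callables (`latest_result`, `has_new_result`, `publish`) are assumptions made purely for illustration.

```python
import time

# Conceptual sketch of the three "Output Sending Type Selection" behaviors.
# UPDATE_INTERVAL and MAX_DELAY mirror the documented defaults; the helper
# callables (latest_result, has_new_result, publish) are assumptions.

UPDATE_INTERVAL = 0.1   # "Update Interval" default [s]
MAX_DELAY = 0.05        # "Max Delay" default [s], Fixed-Period w/ Delay only

def instant_sending(result_stream, publish):
    """Publish every result as soon as it becomes available."""
    for result in result_stream:      # blocks until the next result arrives
        publish(result)

def fixed_period_sending(latest_result, publish):
    """Publish the most recent result every UPDATE_INTERVAL seconds,
    even if that result was already published in a previous cycle."""
    while True:
        publish(latest_result())
        time.sleep(UPDATE_INTERVAL)

def fixed_period_with_max_delay(latest_result, has_new_result, publish):
    """Like fixed-period sending, but if no new result is ready at the
    deadline, wait up to MAX_DELAY for a fresh one before publishing."""
    while True:
        time.sleep(UPDATE_INTERVAL)
        waited = 0.0
        while not has_new_result() and waited < MAX_DELAY:
            time.sleep(0.005)
            waited += 0.005
        publish(latest_result())
```

Fixed-period sending keeps the consumer's update rate constant at the cost of occasionally repeating a frame; the max-delay variant trades a bounded amount of extra latency for fresher data.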
Master Node Parameters
Master Node parameters define the orchestration with Algo Nodes and can be found in Settings > Master Node Parameters.
Location | Name | Description | Unit | Default | Range |
---|---|---|---|---|---|
Rosbag Sync | Max Rosbag Difference [s] | This parameter can be used to make sure that the played rosbag files come approximately from the same recording time. The value of this parameter will limit the difference between the rosbag files in terms of starting and ending time. If the difference between the maximum starting time and the minimum starting time is above this value, the algo nodes will not be launched. The same check is performed for the ending time. | int | 3600 | [0, +inf] |
Master Replay | Use Pseudo Realtime for the Frame-by-Frame Mode | boolean | TRUE | |
Result Update | Result Update Method | Methods for updating algo node results. 0: Instant 1: Use buffer. The Instant method processes algo node results as soon as they arrive at the master node. The Use Buffer method processes algo node results at intervals specified by Algo Nodes Results Update Interval. All received results are stored in a buffer whose maximum size is Number of Algo Node Results Operating Delayed. | int | 0 | {0,1} |
Number of Algo Node Results Operating Delayed | Number of algo node results operating behind the current time. | int | 10 | [1,+inf] |
Algo Nodes Results Update Interval [s] | Algo node results processing will be triggered according to the update interval. | seconds | 0.15 | [0.01,+inf] |
Output-Merger > Algo-Node Result Timeouts | Algo-Node Object Result-Timeout | Compares the elapsed time between two consecutive object-information messages from an algo node. If the elapsed time is greater than the threshold, the algo node result is not taken into account during the aggregation. | seconds | 2 | [0.1, +Inf.] |
Algo-Node Point Result Timeout | For the point result, the point-cloud update frequency is added to this timeout. | seconds | 2 | [0.1, +Inf.] | |
Algo-Node Sensor Status Timeout | For the sensor statuses, the sensor update frequency is added to this timeout. | seconds | 10 | [0.1, +Inf.] | |
Output-Merger > Multi Algo-Node Merger | Use Multi Algo Object Association | Merges objects detected by two algo nodes if they are within the bounds below (a sketch of one possible merge test follows this table). | boolean | TRUE | |
Grow-Size x-Tolerance | Tolerance on the x-axis for object distortion before merger | meters | 2 | [0, +Inf.] | |
Grow-Size y-tolerance | Tolerance on the y-axis for object distortion before merger | meters | 1.5 | [0, +Inf.] | |
Grow-Size Rate-Tolerance | Distortion tolerance ratio before merger | percent | 1.2 | [0, +Inf.] | |
Grow-Size Addition-Tolerance | Additional tolerance for object growth before merger | meters | 0.5 | [0, +Inf.] | |
Association Distance Threshold | Distance threshold to associate objects in two nodes | meters | 2 | [0, +Inf.] |
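SENSR's exact merge test for the Multi Algo-Node Merger is internal, so the sketch below is only one plausible reading of how the Association Distance Threshold and the grow-size tolerances listed above could gate a merge between two detections of the same physical object reported by different algo nodes. The `Box2D` structure and the way the merged footprint is estimated are assumptions for illustration.

```python
import math
from dataclasses import dataclass

# Hypothetical illustration of the Multi Algo-Node Merger tolerances.
# The constants below are the documented defaults; the merge test itself is assumed.

ASSOCIATION_DISTANCE = 2.0   # "Association Distance Threshold" [m]
GROW_X_TOL = 2.0             # "Grow-Size x-Tolerance" [m]
GROW_Y_TOL = 1.5             # "Grow-Size y-Tolerance" [m]
GROW_RATE_TOL = 1.2          # "Grow-Size Rate-Tolerance"
GROW_ADD_TOL = 0.5           # "Grow-Size Addition-Tolerance" [m]

@dataclass
class Box2D:
    cx: float      # center x [m]
    cy: float      # center y [m]
    size_x: float  # footprint along x [m]
    size_y: float  # footprint along y [m]

def merged_extent(a: Box2D, b: Box2D, axis: str) -> float:
    """Axis-aligned extent of the box that would enclose both detections."""
    c, s = ("cx", "size_x") if axis == "x" else ("cy", "size_y")
    lo = min(getattr(a, c) - getattr(a, s) / 2, getattr(b, c) - getattr(b, s) / 2)
    hi = max(getattr(a, c) + getattr(a, s) / 2, getattr(b, c) + getattr(b, s) / 2)
    return hi - lo

def should_merge(a: Box2D, b: Box2D) -> bool:
    # The detections must be close enough to plausibly be the same object.
    if math.hypot(a.cx - b.cx, a.cy - b.cy) > ASSOCIATION_DISTANCE:
        return False
    # Merging must not distort (grow) the box beyond the per-axis tolerances.
    grow_x = merged_extent(a, b, "x") - max(a.size_x, b.size_x)
    grow_y = merged_extent(a, b, "y") - max(a.size_y, b.size_y)
    if grow_x > GROW_X_TOL or grow_y > GROW_Y_TOL:
        return False
    # Rate and addition tolerances bound the overall growth of the merged box.
    limit_x = max(a.size_x, b.size_x) * GROW_RATE_TOL + GROW_ADD_TOL
    limit_y = max(a.size_y, b.size_y) * GROW_RATE_TOL + GROW_ADD_TOL
    return merged_extent(a, b, "x") <= limit_x and merged_extent(a, b, "y") <= limit_y
```

Under this reading, two overlapping half-car detections from adjacent algo nodes would pass the test, while two separate vehicles several meters apart would fail the association distance or grow-size checks and remain distinct.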
Algo Node Parameters
Algo node parameters define the parameters used by SENSR's multiple algorithms to process 3D data.
The parameters are specific to a given Algo node, allowing tracking to be optimized differently from one Algo node to another.
The parameters can be found in Settings > Algo Node Parameters > NODE_NAME | UID: algo_####.
Please change algorithm parameters with caution, as wrong parameters can negatively affect performance.
Location | Field | Name | Description | Unit | Default | Range |
---|---|---|---|---|---|---|
Parameters > Performance | CPU Thread Utilization of the Preprocessing Node [%] | This parameter can be used to optimize the runtime performance of the preprocessing node. It specifies the percentage of the system's CPU threads that will be allocated for computationally expensive tasks on the preprocessing node. | percent | 30 | [10,100] | |
CPU Thread Utilization of the Detection Node [%] | This parameter can be used to optimize the runtime performance of the detection node. It specifies the percentage of the system's CPU threads that will be allocated for computationally expensive tasks on the detection node. | percent | 50 | [10,100] | ||
CPU Thread Utilization of the Environment Update Node [%] | This parameter can be used to optimize the runtime performance of the environment node. It specifies the percentage of the system's CPU threads that will be allocated for computationally expensive tasks on the environment node. | percent | 10 | [10,100] | ||
CPU Thread Utilization of the Long-Running Tasks Node [%] | This parameter can be used to optimize the runtime performance of the long-running tasks node. It specifies the percentage of the system's CPU threads that will be allocated for computationally expensive tasks on the long-running tasks node. | percent | 10 | [10,100] | ||
Parameters > Object | - | Allow Floating-Object | If True, min. z of the object is set to be the min. z-value of the object bounding box instead of ground height. | boolean | FALSE | |
Filter Objects Based on Number of Points and Radius | Enable | If activated, this filter will sort out objects based on their number of points and their radius. | boolean | TRUE | ||
Tracked Object's Min. Points | The minimum number of points of an object to be tracked. | int | 5 | [0, +Inf.] | ||
Tracked Object's Min Radius | Minimum object radius to be tracked. | meters | 0.15 | [0, +Inf.] | ||
Parameters > Tracking | - | Validation Period | Period of checking validity in the early stage of tracking. This time determines the length of the VALIDATION period, i.e. how quickly a new object will be tracked and classified. | seconds | 0.5 | [0, +Inf.]
Invalidation Period | Period of short term prediction when tracking is lost while in the VALIDATION period. A longer period will allow for objects that are at the edge of detection range to be tracked, but will be more likely to introduce false alarms. | seconds | 0.5 | [0, +Inf.] | ||
Drifting Period | Period of short term location prediction when tracking is lost in TRACKING status. A longer period can help with obstructions, but can lead to object ID switching in busy environments | seconds | 1 | [0, +Inf.] | ||
Drifting Period for Miscellaneous Objects | Drifting period applied to miscellaneous objects. | seconds | 1 | [0, +Inf.] ||
Parameters > Edge Node | LZ4 Compression | Publish Point Cloud Intensity | boolean | TRUE | ||
Output Type | Raw Point Cloud: No processing is applied, Filtered Point Cloud: Detection range filtering is applied by default. Downsampling could be applied for further compression. | int | Raw Point Cloud | |||
Apply LZ4 Compression | boolean | TRUE | ||||
Enable rounding of the coordinate values | Rounding float values can positively affect the compression rate | boolean | TRUE | |||
Rounding decimal precision | 2 means all values will be rounded to 0.01 | int | 2 | [0,inf] | ||
Sort points in the point cloud | Compression rate can be improved when the point cloud is sorted | boolean | FALSE | |||
Algorithm Components > Downsampling | Downsampling | Apply | Apply point downsampling. | boolean | FALSE | |
Resolution | Resolution for downsampling points. At very high density, additional points do not add any value to the perception; they can be removed to reduce the stress on the machine. | meters | 0.1 | [0.1, +Inf.] ||
Algorithm Components > Ground Detector | Level Ground Detector | Apply | If True, use the ground detector. | boolean | TRUE | |
Ground Margin | Maximum height for ground segmentation. | meters | 0.3 | [-0.5, +Inf.] | ||
Resolution | Size of the grid used on the ground profiling function | meters | 5 | [0, +Inf.] | ||
Limit Detection Range | Limits the ground detection range (xy-range). If enabled, all points outside this xy-range will not be considered as ground points. Note that this is not applied to the z detection range. | boolean | FALSE |||
Limit Detection Range > Max Detection Range | Maximum xy-range to apply ground detection. | meters | 130 | [0, +Inf.] | ||
Algorithm Components > Obstruction Detector | Range Obstruction Detector | Apply | The Obstruction detector checks whether the sensors are obstructed or not based on a distance and proportion of obstruction. | boolean | TRUE | |
Maximum Distance | This parameter defines the range around the sensor where obstructions will be looked for. If the value is too high, false positives may be triggered by passing objects. | meters | 1 | [0, +Inf.] | ||
Obstructed Proportion Threshold | This parameter defines the proportion of the point cloud that has to be obstructed to trigger an alarm. The value is defined as a ratio between 0 and 1. | ratio | 0.7 | [0, 1] ||
Algorithm Components > Tilt Detection & Auto-Correction | Tilt Detection & Auto-Correction | Apply | boolean | TRUE | ||
Enable Auto-Correction | boolean | TRUE | ||||
Max Number of Lidar Processed at One Time | int | 2 | [0, +Inf.] | |||
Use Points Around Lidar | When disabled, uses all points inside algo node | boolean | FALSE | |||
ICP Target Points Type | Select the type of points that will be used as a reference for the Tilt and Auto-Correction functions | |||||
Auto-Correction | Min. Angular Offset to Apply Recalibration [deg] | Auto-correction will neglect the change if the detected angle offset is smaller than this threshold. | degrees | 0.2 | [0.01, +Inf.] | |
Min. Distance Offset to Apply Recalibration [m] | Auto-correction will neglect the change if the detected distance offset is smaller than this threshold. | meters | 0.1 | [0.01, +Inf.] ||
Max. Angular Offset to Apply Recalibration [deg] | Auto-correction will not attempt to correct the tilt if the detected angle offset is bigger than this threshold. A Lidar Tilt Status will be sent and the user is advised to manually check the lidar calibration | degrees | 5 | [0.01, +Inf.] ||
Max. Distance Offset to Apply Recalibration [m] | Auto-correction will not attempt to correct the tilt if the detected distance offset is bigger than this threshold. A Lidar Tilt Status will be sent and the user is advised to manually check the lidar calibration | meters | 0.5 | [0.01, +Inf.] ||
Tilt Detection | Min Angle Offset to Set Lidar Tilt Status | Minimum offset in degrees to trigger the LiDAR tilt alarm. | degrees | 0.4 | [0.01, +Inf.] |
Min Distance Offset to Set Lidar Tilt Status | Minimum offset in meters to trigger the LiDAR tilt alarm. | meters | 0.3 | [0.01, +Inf.] ||
Algorithm Components > Background Detector | Background Detector | Apply | If True, use the background detector. | boolean | TRUE | |
Time to Initialize Background | Initial period for learning background points. | seconds | 5 | [0, +Inf.] ||
Time to Become Background | The period after which a static object becomes background. | seconds | 600 | [1, +Inf.] ||
Time to Become Foreground | The period after which a background object becomes foreground when it starts moving. | seconds | 2 | [1, +Inf.] ||
Number of Frames To Estimate Detection Range | Number of initial frames used to estimate the background detection range | int | 3 | [1, +Inf] | ||
Use Global Detection Range | If enabled, the global detection range will be used as the background detection range instead of the initial-frame estimation | boolean | FALSE |||
Auto Save | This parameter lets SENSR save a snapshot of the background point cloud | boolean | FALSE |||
Auto Load | This parameter lets SENSR load a previously saved snapshot of the background points | boolean | FALSE |||
Resolutions | Range | Range resolution | meters | 0.2 | [0, +Inf.] | |
Azimuth | Azimuth angle resolution | degree | 0.8 | [0, 180] | ||
Elevation | Elevation resolution | degree | 1 | [0, 180] | ||
Algorithm Components > Clusterer | Sparse Grid-Clusterer | x-Resolution | Grid resolution in x-axis. | meters | 0.15 | [0, +Inf.] |
y-Resolution | Grid resolution in y-axis. | meters | 0.15 | [0, +Inf] | ||
Cell Point Threshold | Minimum points per cell for clustering | int | 1 | [1, +Inf.] | ||
Point-Size Growing > Enable Point-Size Growing | Allows using a bigger clustering resolution for far-distant objects. | boolean | FALSE |||
Point-Size Growing > Point-Size Scaling Factor per Meter | The scaling factor of the point size's radius per meter with respect to the distance to the lidar. | meters | 0.002 | [0, 0.01] ||
Point-Size Growing > Max. x-Resolution | Max grid resolution in x-axis when using Point-Size Growing. | meters | 0.8 | [0, +inf] | ||
Point-Size Growing > Max. y-Resolution | Max grid resolution in y-axis when using Point-Size Growing. | meters | 0.8 | [0, +inf] | ||
Algorithm Components > Clusterer | Grid-Clusterer | x-Resolution | Grid resolution in x-axis. | meters | 0.15 | [0, +Inf.] |
y-Resolution | Grid resolution in y-axis. | meters | 0.15 | [0, +Inf] | ||
Cell Point Threshold | Minimum points per cell for clustering | int | 1 | [1, +Inf.] | ||
Point-Size Growing > Enable Point-Size Growing | Allows using a bigger clustering resolution for far-distant objects. | boolean | FALSE |||
Point-Size Growing > Point-Size Scaling Factor per Meter | The scaling factor of the point size's radius per meter with respect to the distance to the lidar. | meters | 0.002 | [0, 0.01] ||
Point-Size Growing > Max. x-Resolution | Max grid resolution in x-axis when using Point-Size Growing. | meters | 0.8 | [0, +inf] | ||
Point-Size Growing > Max. y-Resolution | Max grid resolution in y-axis when using Point-Size Growing. | meters | 0.8 | [0, +inf] | ||
Algorithm Components > Cluster Merger (GPU plug-in) | Cluster Merger | Apply | If True, enables Cluster Merger. | boolean | |
Algorithm Components > Tracker | Hybrid-Tracker | Apply | If true, use Hybrid-Tracker | boolean | TRUE | |
Association-Distance | Distance threshold to associate objects in two consecutive frames. | meters | 1.5 | [0.01, +Inf] | ||
Merge Object Level | Object merging level that decides how easily small, close objects can be merged into one object. | float | 4 | [0, 10] ||
Max. Grow-Size Tolerance | Maximum size a merged object can grow to | meters | (1, 2) | [1, +inf] ||
Max-Dimensions (W, L) | Maximum size of an object that can be tracked. | meters | (4, 15) | [0.01, +inf] ||
Human-Tracker | Apply | If true, use Human-Tracker | boolean | FALSE | ||
Association-Distance | Distance threshold to associate objects in two consecutive frames. | meters | 0.2 | [0.01, +Inf] | ||
Merge/Split Size Mul. Threshold | Size ratio threshold that allows merging/splitting | meters | 1.8 | [0.0, +Inf] ||
Merge/Split Size Add. Threshold | Size increase/decrease threshold that allows merging/splitting | meters | 0.5 | [0.0, +Inf] ||
Min. Time for Splitting | Minimum lifetime of objects to apply merger/ splitter check | seconds | 3 | [0.0, +Inf] | ||
Static tracker | Apply | If True, use GPU Pipeline Supported Tracker | boolean | FALSE | ||
Tracking Association Distance | Distance threshold to associate objects in two consecutive frames. | meters | 2 | [0.0, +Inf] | ||
Bounding Box Configure > Use Object Tight Box | This option wraps the bounding box as tightly as possible around the object | boolean | FALSE |||
Bounding Box Configure > Use Position Smoothing | This option will smooth the object's position to reduce jitter. | boolean | TRUE |||
Bounding Box Configure > Use Size Smoother DL Car | This option will adjust the car size to create a more seamless transition, thereby minimizing abrupt changes when occlusion occurs. | boolean | TRUE | |||
Yaw Smoothing > Apply Heading Smoothing | This option will smooth the yaw of vehicles, reducing aberrations. | boolean | TRUE |||
Merge/Split Setting > Affected Objects | This parameter allows selecting what types of objects will be affected by the static tracker - Cars only: only cars will be tracked in a static zone - Cars & Small Objects: only small objects will be tracked in a static zone - All Objects: all objects will be tracked in the static zone | int | All Objects | |||
Merging level | Level of aggressiveness of the merging; if objects break apart due to high speed, use a higher setting | int ||||
Reference Size Reduce Rate | Shrinking rate of tracked object size. In case of occlusion, the size of a tracked object will be learnt and kept with this shrinking rate. | float | 0.98 | [0.0, +Inf] | ||
Use Splitting | Allows splitting objects if needed | boolean | FALSE |||
Merge/Split Size Mul. Threshold (Manual Setting) | Additional threshold for merging and splitting objects | meters | 2.1 | [0.0, +Inf] ||
Merge/Split Size Add. Threshold (Manual Setting) (Min) | Size threshold for merging and splitting objects | meters | 1.5 | [0.0, +Inf] ||
Merge/Split Size Add. Threshold (Manual Setting) (Max) | Additional threshold for merging and splitting objects | meters | 3 | [0.0, +Inf] ||
Allow Merging with Deep Learning Vehicle (Manual Setting) | boolean | FALSE | ||||
Increased size Merging Threshold for Deep Learning Vehicle (Manual Setting) (Min) | meters | 1 | ||||
Increased size Merging Threshold for Deep Learning Vehicle (Manual Setting) (Max) | meters | 2 | ||||
Max. Distance to Apply Merging [m] (Manual Setting) | meters | 1.5 | ||||
Algorithm Components > Classifier | Light ML Classifier | Apply | If True, use Light ML Classifier | boolean | TRUE | |
Target Classification Classes | CAR/PED/CYC/MISC | The classes the output of the classifier will include. E.g. if only PED(estrian) and MISC(ellaneous) are chosen all the objects will be classified as either PED or MISC. | boolean | TRUE | ||
Only Classify Tracked Objects | If selected, only objects with tracking status will be classified | boolean | TRUE | |||
Skipped Classification Classes | CAR/PED/CYC/MISC | Any class marked with this option will be ignored by the classifier. | boolean | FALSE | ||
Object Class Stability | Stability Level | Classification stability level: - Disabled - Semi-permanently: if an object class is stable, the classification will be kept until replaced by another stable class - Permanently: if an object class is stable, the classification will be kept permanently. | Permanently |||
Min. Number of Frames for Stable Class Label | The number of frames an object class needs to be continuously kept in order to become semi-permanent (Stability Level = 1) or permanent (Stability Level = 2). | int | 20 |||
Max. Enlarged size | Size threshold for enlarged objects to maintain their class stability (PED and CYC) | meters | 2.5 | [0.0, +Inf] ||
First Order Classifier | This section is applied first; only if it cannot classify an object does the second-order classifier assist it (a rule sketch follows this table). | [0.0, +Inf] ||||
Size Classifier for Big Objects | Apply | If True, use the size classifier for big objects. If used, all objects which have size bigger than Min Length, Width, Height will be classified as one of Include Classes. | boolean | TRUE | ||
Length Range/ Width Range/ Height Range | The range of length, width, height to classify as big objects. | meters | [3, 20] / [1.8, 4] / [1, 5] | [0, +Inf] ||
Include Classes | The classes the output of the size classifier for big objects will include. | boolean | CAR: TRUE, MISC: FALSE | |||
Size Classifier for Small Objects | Apply | If True, use the size classifier for small objects. If used, all objects which have size in the Length Range, Width Range, Height Range will be classified as one of Include Classes. If an object is classified as a big object, it won’t be classified as a small object even though it’s in the small object range. | boolean | TRUE | ||
Length Range/ Width Range/ Height Range | The range of length, width, height to classify as small objects. | meters | [0.2, 1.3] / [0.2, 0.8] / [0.5, 2.2] | [0, +Inf] ||
Include Classes | The classes the output of the size classifier for small objects will include. | boolean | TRUE |||
Velocity Classifier | Apply | If True, use the velocity classifier. | boolean | TRUE | ||
Non-MISC Min Displacement | Minimum displacement above which an object will not be classified as MISC. | meters | 2 | [0, +Inf] ||
Non-MISC Min Velocity | Minimum velocity above which an object will not be classified as MISC. | km/h | 2 | [0, +Inf] ||
PED Max Velocity | Velocity above which an object will not be classified as PED. | km/h | 12 | [0, +Inf] ||
Use Velocity Estimator | Enable this to use the velocity estimator | boolean | FALSE | |||
Second Order Classifier > ML Classifier | Apply | If True, use the machine-learning classifier. If applied, it is used only after the size classifier and velocity classifier. E.g., if after the size and velocity classifiers an object is classified as either PED or CYC, the ML classifier will decide whether it is a PED or a CYC. | boolean | TRUE ||
Min Num Points | Minimum number of points to use ML classifier. | int | 30 | [0, +inf] | ||
Max Num Objects To Classify | Maximum number of objects that can be classified by the ML classifier. | int | 100 | [0, +inf] ||
Algorithm Components > Object Detector | 2D Object Detector | Apply | If True, enables the 2D Object Detector | boolean | TRUE | |
Use Light Model | Using the light model can speed up processing time. However, it might impact detection quality. | boolean | FALSE |||
Car Probability Threshold | Minimum probability to be considered CAR | float | 0.45 | [0.0, 1.0] | ||
Pedestrian Probability Threshold | Minimum probability to be considered PED | float | 0.35 | [0.0, 1.0] | ||
Cyclist Probability Threshold | Minimum probability to be considered CYC | float | 0.35 | [0.0, 1.0] | ||
Exclude Invalid Points | If selected, invalid points will not be included in the object detection model. | boolean | FALSE |||
Exclude Ground Points | Exclude Ground points from the object detection | boolean | FALSE | |||
Exclude Ceiling Points | Exclude Ceiling points from the object detection | boolean | FALSE | |||
Exclude Points in Exclusion Zones | Exclude Exclusion zone points from the object detection | boolean | TRUE | |||
Use Ground Subtraction | Use this parameter when working in GPU mode on a site with a significant slope | boolean | FALSE |||
Apply Model Optimization | The system will try optimizing the object detection model to reduce computing power consumption. | boolean | FALSE | |||
Limit Detection Range > Enable | Allows object detection to cover a smaller area than the global detection range. | boolean | FALSE |||
Limit Detection Range > Max Detection Range | Maximum range to apply object detection. | meters | 100 | [0, +Inf] | ||
Algorithm Components > Object Detector | 3D Object Detector | Apply | If True, enables the 3D Object Detector | boolean | TRUE | |
Use Light Model | Using the light model can speed up processing time. However, it might impact detection quality. | boolean | FALSE |||
Car Probability Threshold | Minimum probability to be considered CAR | float | 0.45 | [0.0, 1.0] | ||
Pedestrian Probability Threshold | Minimum probability to be considered PED | float | 0.35 | [0.0, 1.0] | ||
Cyclist Probability Threshold | Minimum probability to be considered CYC | float | 0.35 | [0.0, 1.0] | ||
Exclude Invalid Points | If selected, invalid points will not be included in the object detection model. | boolean | FALSE |||
Exclude Ground Points | Exclude Ground points from the object detection | boolean | FALSE | |||
Exclude Ceiling Points | Exclude Ceiling points from the object detection | boolean | FALSE | |||
Exclude Points in Exclusion Zones | Exclude Exclusion zone points from the object detection | boolean | TRUE | |||
Use Ground Subtraction | Use this parameter when working in GPU mode on a site with a significant slope | boolean | FALSE |||
Apply Model Optimization | The system will try optimizing the object detection model to reduce computing power consumption. | boolean | FALSE | |||
Limit Detection Range > Enable | Allows object detection to cover a smaller area than the global detection range. | boolean | FALSE |||
Limit Detection Range > Max Detection Range | Maximum range to apply object detection. | meters | 100 | [0, +Inf] | ||
Algorithm Components > Output Object Filter | Noise Object Filter | Exclude Zero-Area Objects | Filter objects that have a dimension = 0 | boolean | FALSE | |
Object Filter Level | Method for filtering noise objects, targeting low-intensity objects and occluded objects. An object is "Low Intensity" if the number of points whose intensity is higher than "Intensity Threshold" is smaller than "Min. Number of Bright Points". An object is "Occluded" if it is blocked by other objects, considering the lidar positions. - Disabled: the noise object filter is not used. - Filter Low Intensity Occluded Object: filter objects that are both low intensity and occluded at the same time. - Filter Occluded Object - Filter Low Intensity Object - Filter either Low Intensity or Occluded Object: filter both low-intensity objects and occluded objects. | Disabled ||||
Intensity Threshold | Intensity Threshold to define low intensity objects. | int | 10 | [0, +inf] | ||
Min. Number of Bright Points | Min. Number of Bright Points to define low intensity objects. | int | 5 | [0, +inf] | ||
Algorithm Components > Ceiling Filter | Apply Ceiling Filtering | Filter to discard points and data coming from the ceiling | boolean | FALSE | ||
Min Height for Ceiling | Defines the height of the ceiling, points above that value will be discarded | meters | 2 | [0, +inf] |
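The first-order classifier described above (the size classifiers for big and small objects plus the velocity classifier) is a set of simple rules. The sketch below restates those rules using the documented default thresholds, purely for illustration; the `Obj` structure, the AND-combination of the two MISC thresholds, and the handling of fast small objects are assumptions, not SENSR's exact logic.

```python
from dataclasses import dataclass

# Rule sketch of the first-order classifier (size + velocity rules).
# The thresholds are the documented defaults; everything else is assumed.

BIG_RANGE   = {"length": (3.0, 20.0), "width": (1.8, 4.0), "height": (1.0, 5.0)}
SMALL_RANGE = {"length": (0.2, 1.3),  "width": (0.2, 0.8), "height": (0.5, 2.2)}
NON_MISC_MIN_DISPLACEMENT = 2.0  # [m]
NON_MISC_MIN_VELOCITY = 2.0      # [km/h]
PED_MAX_VELOCITY = 12.0          # [km/h]

@dataclass
class Obj:
    length: float        # [m]
    width: float         # [m]
    height: float        # [m]
    displacement: float  # displacement since tracking started [m]
    velocity: float      # [km/h]

def in_range(obj: Obj, ranges: dict) -> bool:
    """Check that every listed dimension falls inside its [min, max] range."""
    return all(lo <= getattr(obj, dim) <= hi for dim, (lo, hi) in ranges.items())

def first_order_classify(obj: Obj) -> str:
    # The big-object size classifier takes precedence over the small-object one.
    if in_range(obj, BIG_RANGE):
        return "CAR"                 # default "Include Classes" for big objects
    candidate = "PED" if in_range(obj, SMALL_RANGE) else "UNKNOWN"
    # Velocity classifier: objects that barely move fall back to MISC
    # (combining the two thresholds with AND is an assumption).
    if (obj.displacement < NON_MISC_MIN_DISPLACEMENT
            and obj.velocity < NON_MISC_MIN_VELOCITY):
        return "MISC"
    # Anything faster than PED_MAX_VELOCITY cannot remain a pedestrian;
    # the second-order ML classifier decides between the remaining classes.
    if candidate == "PED" and obj.velocity > PED_MAX_VELOCITY:
        candidate = "UNKNOWN"
    return candidate
```

In this sketch an object of 4.5 x 1.9 x 1.6 m, for instance, is labelled CAR by the big-object rule alone and never reaches the velocity check or the second-order ML classifier.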
Visualization Parameters
The Visualization settings allow users to change the graphical aspects of SENSR; they can be found in View > Preference (or by pressing F11).
Location | Name | Description | Default |
---|---|---|---|
Visibility | Draw Object Points | Enable or disable showing points belonging to objects. | TRUE |
Draw Ground Point | Enable or disable showing ground points. (e.g. floor.) | TRUE | |
Draw Background Points | Enable or disable showing background points. (e.g. wall or desk…) | TRUE | |
Intensity Visual Mode | Enable this to color the point cloud by intensity values (a normalization sketch follows this table). Off: intensity color visualization is off. Fixed Range: use the max and min intensity values from the range below. Auto Range: calculate the max and min intensity values for each frame. | 0: Off 1: Fixed Range 2: Auto Range |
Draw Grid | Enable or disable showing squared Grid. | TRUE | |
Draw Grid Circular | Enable or disable showing circular Grid. | FALSE | |
Draw Axis | Enable or disable showing the world XYZ coordinate axis. | TRUE | |
Draw Objects | Enable or disable showing tracked objects. (e.g. pedestrian) | TRUE | |
Draw Misc Objects | Enable or disable showing miscellaneous objects. | TRUE | |
Draw Drifting Objects | Enable or disable showing drifting objects. | TRUE | |
Draw Invalidating Objects | Enable or disable showing invalidating objects. | TRUE | |
Draw Validating Objects | Enable or disable showing validating objects. | TRUE | |
Draw Object Trail | Enable or disable showing the object trail. | TRUE | |
Draw Predicted Trajectory | Enable or disable showing the predicted trajectory of moving objects. | FALSE | |
Draw Map Image | Enable or disable showing a map image. | TRUE | |
Draw Detection Range | Enable or disable showing the Detection range | TRUE | |
Detection Range Draw Mode | Draws the detection range in either 2D or 3D | 3D | |
Draw Lidar Name | Enable or disable showing the name of each sensor. | FALSE | |
Draw Lidar Topic | Enable or disable showing the topic of each sensor. | FALSE | |
Draw Algo Node Name | Enable or disable showing the name of each algo node. | FALSE | |
Draw Zone Name | Enable or disable showing the name of each zone. | FALSE | |
Draw Retro-Reflective Objects | Enable or disable showing the retro-reflective objects. | FALSE | |
Draw Annotations | Enable or disable drawing object annotations | FALSE | |
Visible Annotations -> Show Object ID | If object annotations are enabled the object’s ID is shown | TRUE | |
Visible Annotations -> Show Object Tag | An object can be tagged in SENSR through REST API by providing timestamp, box, and position. This parameter highlights a given object | FALSE | |
Visible Annotations -> Show Object Class and Probability | If object annotations are enabled the object’s class and probability score of that class is shown | FALSE | |
Visible Annotations -> Show Object Speed | If object annotations are enabled the object’s speed is shown | TRUE | |
Visible Annotations -> Show Object Height | If object annotations are enabled the object’s height is shown | FALSE | |
Color | Background Color | Color of the background. | - |
Ground Point Color | Color of the ground points. | - | |
Background Point Color | Color of the background points. | - | |
Object Point Color | Color of points belonging to objects. | - | |
Car Color | Color of cars. | - | |
Pedestrian Color | Color of pedestrians. | - | |
Cyclist Color | Color of cyclists. | - | |
Misc Color | Color of miscellaneous objects. | - | |
Event Zone Color | Color of event zones. | - | |
Exclusion Zone Color | Color of exclusion zones. | - | |
Map Exclusion Zone Color | Color of exclusion zones on 3D maps. | - | |
Reflection Zone Color | Color of reflection zones. | - | |
Static Zone Color | Color of static zones. | - | |
Detection Range Color | Color of the detection range outline. | - | |
Retro-Reflective Object Color | Color of the retro-reflective object | - | |
Miscellaneous | Cloud Point Size | Control the visual size of the rendered points. The values can range from 1.0 to 10.0. | 1 |
Use Random Obj. Color | This option colors each object in a different color. | FALSE | |
Far Clipping Plane | This option clips the z=0 plane with a set radius. | 3000 | |
Grid Interval [m] | Display grid size. | 10 | |
Background Alpha | Alpha of the background for labels. | 120 | |
Object Top Offset | Vertical offset from the objects. | 0.5 | |
Font size | Size of the font used. | 15 | |
View Range | Enable Z Clipping Range | Enable Z-Range to not display pointclouds outside of it | FALSE |
Z Clipping Range Limits [m] | Display Z-range. | [-10.0, 10.0] |
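The Intensity Visual Mode setting above colors points either against a fixed, user-supplied intensity range or against the minimum and maximum intensity of each frame. The snippet below is a minimal sketch of that normalization only, assuming a simple clamp-to-[0, 1] mapping; the function name and the final color mapping are assumptions, not SENSR's implementation.

```python
# Sketch of the two intensity coloring modes described above.
# The fixed-range vs per-frame normalization follows the table; the
# clamped [0, 1] output and the function name are assumptions.

def normalized_intensities(intensities, mode="fixed", fixed_range=(0.0, 255.0)):
    """Map raw intensities to [0, 1] for coloring."""
    if mode == "fixed":                              # "Fixed Range": user-supplied min/max
        lo, hi = fixed_range
    else:                                            # "Auto Range": per-frame min/max
        lo, hi = min(intensities), max(intensities)
    span = (hi - lo) or 1.0                          # guard against a flat frame
    return [min(max((v - lo) / span, 0.0), 1.0) for v in intensities]
```

Auto Range adapts to each frame's dynamic range, whereas Fixed Range keeps colors comparable across frames.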
Export Configuration
SENSR can export the configuration changes the user has made into a human-readable format. This is useful for checking a project's settings against the settings SENSR uses by default.
To export the configuration changes, go to sensor setup mode and, in the top menu bar, navigate to File -> Export Parameter Change Report. This will bring up a dialog box to choose where to export the text file.