Version: 3.2.1

Parameters

Overview

SENSR is a highly configurable solution that lets users optimize their site's configuration. This section describes SENSR's settings and their default values.

SENSR settings can be classified into 4 categories:

  • General parameters, defining the system's overall behavior
  • Master node parameters, defining the orchestration with algo nodes
  • Algo node parameters, defining the parameters used by the different algorithms to optimize object tracking
  • Visualization parameters, defining the user interface settings
important

Tracking results can vary widely depending on the settings applied to a site. Tuning the parameters, however, comes with trade-offs: over-optimizing for certain aspects may come at the cost of degrading others.

General Parameters

General parameters are shared across the whole project by all components; they can be found in Settings > General Parameters.

Location | Field | Name | Description | Unit | Default | Range
CommonOutputRuntime Mode Point-Cloud Publishing LevelThe level of point-cloud publishing to the output port. This setting will greatly increase the bandwidth between the algo nodes and the master node!
- No Points
- Object Points
- Object Points & Background Points
- All Points
intObject Points
Zone Setup Point-Cloud Publishing LevelThe level of point-cloud publishing to the output port. This setting will greatly increase the bandwidth between the algo nodes and the master node!
- No Points
- Object Points
- Object Points & Background Points
- All Points
intObject Points & Background points
Background Points Update IntervalTime between updates of background and ground point-cloud information from algo nodes to the master node. Updating frequently can be resource-intensive.seconds1
Sensor Status Update IntervalTime between updates of sensor information from algo nodes to the master node.seconds0.5
Publish Losing EventEnables SENSR to publish a losing event when tracking is lost for a given object.booleanFALSE
Output Sending Type SelectionMethod for sending and updating new results.
- Instant Sending: Publish result whenever a new one is available.
- Fixed-Period Sending: Publish the most recent result periodically. The published result might already have been published in a previous frame.
- Fixed-Period w/ Delay: Publish the most recent result periodically. If there is no new result, the sending is postponed until a new result is received or a maximum delay time has passed.
- All-Update or Fixed Delay
intFixed-Period Sending
Update IntervalPublish interval for output sending type 1 - Fixed-Period Sending and type 2 - Fixed-Period with Max-Delayseconds0.1
Max DelayMaximum delay time for "Fixed-Period with Max-Delay".seconds0.05
Point-Cloud Bandwidth ReductionThis decreases the communication burden by reducing point-cloud data size. If enabled, this option affects:
- In calibration mode, only one point cloud will be sent (circularly) at a time.
- In runtime mode, background and ground points will be downsampled
- Points out of detection range will not be published regardless of the point cloud publishing level.
booleanFALSE
Downsampling ResolutionResolution of the downsampling method for point-cloud bandwidth reduction. Value can be between 0 and 1.meters0.3
InputMaximum Lidar Input IntervalIf there is no response from one of the lidar sensors for this time duration, that lidar sensor will be considered dead.seconds20[0,30]
Restart Launchers of Unresponsive LidarsbooleanTRUE
Launcher Restart ThresholdWhen enabled, the system automatically restarts lidar launchers if the algo node does not receive
any point cloud messages within the designated lidar restart threshold.
seconds30[5, +inf]
Frame Update SettingMethod for updating new frames from received results
- Instant: Update frames as soon as results arrive
- Use buffer: Update frames using buffer that stores frames after specified delay time in case of network traffic
intInstant
Minimum Pending Time (for Instant update)Due to connection issues, the time interval between two point-cloud messages of one lidar can be unreasonably small. This minimum interval ensures that one lidar point-cloud message of the pipeline has a reasonable time gap with the next one. A message whose time difference with the previous message is below this minimum interval will be discarded. (This option is only visible if "Instant" is selected in the Frame Update Method.)seconds0.02[0, 0.1]
Frame Interval [s] (for Instant update)Whenever data from all LiDARs have arrived, or the frame interval has been reached, the frame will be bundled and processed by the detection node. Only one frame will be bundled per frame interval.0.1[0.05, 5]
Frame Jitter [s] (for Instant update)The frame bundler will reuse LiDAR point clouds from previous frames, taking into account the frame jitter.0.02[0, +inf]
Number of Frames Operating Delayed (for Using Buffer Update)Number of frames operating behind the current time. The frame duration is specified by the frame interval parameter.3[1, +inf]
Frame Interval [s] (For Using Buffer Update)Perception pipeline will be triggered with new input according to the frame interval.0.15[0.1, +inf]
Retro-Reflection TrackingEnable Retro-Reflection TrackingEnables marking of objects as retro-reflective. For this to work properly the retro-reflectivity threshold setting for each lidar also needs to be tuned properly.booleanFALSE
Min Point CountFor an object to be considered retro-reflective it has to have at least this many retro-reflective points.int5[1, +inf]
History ThresholdThe number of frames an object will stay retro-reflective after being classified as retro-reflective.int10[0, +inf]
Reflection FilterUse Simple Filter [2D]Enable simple filtering algorithm for faster but less accurate filtering. The height will not be considered.booleanTRUE
Use detection limitIf enabled, any sensor further from the reflection filter than the detection limit will not be considered during filtering. A smaller detection limit will make filtering faster.booleanTRUE
Detection limit [m]Detection range of the option "Use detection limit".meters50[0, +inf]
Zone SettingTrigger Zone Events with MISC ObjectsEnabling this will trigger zone events with MISC objects.booleanFALSE
Zone Filter Resolution [m]Grid Cell Size for Zone Filter.meters0.1[0, +inf]
Tracking HistoryTracking History LengthThe number of history positions to keepint100[0, +Inf.]
History Smoothing LevelTrajectory smoothing levelint3[0, 3]
Trajectory PredictionNumber of Prediction StepsNumber of prediction stepsint10[0, +Inf.]
Prediction Step Time-HorizonTime step between two predicted pointsseconds0.1[0, +Inf.]
Max. Acceleration Filter ThresholdMaximum acceleration for predictionm/s215[0, +Inf.]
Data CollectionEnable Anomaly Data-CollectorWhen the Anomaly Data-Collector is enabled, the latest "Num Look-Back Frames" are temporarily stored on disk. Once the anomaly detector is triggered, the Look-Back and Look-Ahead frames are saved.booleanFALSE
Num Look-Back FramesNumber of previous frames temporarily stored on the diskint100[1, +inf]
Num Look-Ahead FramesNumber of frames stored on disk after an anomaly is detected.int100[1, +inf]
LoggingMaximum Log File Size [MB]Sets the maximum file size for a single log file. Note that the total size of the logs can be much larger.MB10[1, +inf]
Maximum Number of Log Files Per DirectoryThe number of rotating log files that will be kept for the master node and each algo node.int3[2, 999]
Maximum Number of Log Files Per NodeThe specified number of the most recent log folders that will be retained for master node and each individual algo and edge node.int5[2, 999]
RecoveryConnection Timeout [s]The time after which an unresponsive node is considered as dead. After exceeding this timeout, recovery is attempted.seconds20[3, +inf]
Recovery Attempt Timeout [s]If a recovery attempt was not successful after the specified timeout, the recovery attempt is considered as not successful and aborted.seconds30[3, +inf]
Maximum Number of Recovery Triesint10[0, +inf]
Time Between Recovery Attempts [s]seconds60[0, +inf]
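
As an illustration of the output sending types described above, here is a minimal Python sketch (hypothetical, not SENSR code) of the "Fixed-Period w/ Delay" decision logic: publish on a fixed period when a fresh result exists, otherwise postpone up to a maximum delay. The function name and signature are assumptions for illustration only.

```python
def should_publish(now, last_publish, last_result, interval=0.1, max_delay=0.05):
    """Fixed-Period w/ Delay sending sketch (assumed semantics).

    now          -- current time in seconds
    last_publish -- time the previous result was published
    last_result  -- arrival time of the most recent result
    """
    if now - last_publish < interval:
        return False  # the publishing period has not elapsed yet
    if last_result > last_publish:
        return True   # a fresh result is available: publish on schedule
    # No fresh result: postpone, but never beyond interval + max_delay.
    return now - last_publish >= interval + max_delay
```

With the defaults listed above (interval 0.1 s, max delay 0.05 s), a stale result would be re-published at the latest 0.15 s after the previous publish.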

Master Node Parameters

Master Node parameters define the orchestration with Algo Nodes and can be found in Settings > Master Node Parameters.

Location | Name | Description | Unit | Default | Range
Rosbag SyncMax Rosbag Difference [s]This parameter ensures that the played rosbag files come from approximately the same recording time. Its value limits the difference between the rosbag files' starting and ending times. If the difference between the maximum starting time and the minimum starting time is above this value, the algo nodes will not be launched. The same check is performed for the ending time.
int3600[0, +inf]
Master ReplayUse Pseudo Realtime for the Frame-by-Frame ModebooleanTRUE
Result UpdateResult Update MethodMethod for updating algo node results.
0: Instant
1: Use buffer

The Instant method processes algo node results as soon as they arrive on the master node.

The Use Buffer method processes algo node results at intervals specified by Algo Nodes Results Update Interval. All received results are stored in a buffer whose maximum size is Number of Algo Node Results Operating Delayed.
int0{0,1}
Number of Algo Node Results Operating DelayedNumber of algo node results operating behind the current time.int10[1,+inf]
Algo Nodes Results Update Interval [s]Algo nodes results processing will be triggered according to the update interval.seconds0.15[0.01,+inf]
Output-Merger > Algo-Node Result TimeoutsAlgo-Node Object Result-TimeoutCompares the elapsed time between two algo-node object-information messages. If the elapsed time is greater than the threshold, the algo node result is not taken into account during aggregation.seconds2[0.1, +Inf.]
Algo-Node Point Result TimeoutFor the point result, the point-cloud update frequency is added to this timeout.seconds2[0.1, +Inf.]
Algo-Node Sensor Status TimeoutFor the sensor statuses, the sensor update frequency is added to this timeout.seconds10[0.1, +Inf.]
Output-Merger > Multi Algo-Node MergerUse Multi Algo Object AssociationMerges objects found between 2 algo nodes if they are within bound.booleanTRUE
Grow-Size x-ToleranceTolerance on the x-axis for object distortion before merging.meters2[0, +Inf.]
Grow-Size y-ToleranceTolerance on the y-axis for object distortion before merging.meters1.5[0, +Inf.]
Grow-Size Rate-ToleranceDistortion tolerance ratio before merging.percent1.2[0, +Inf.]
Grow-Size Addition-ToleranceAdditional tolerance for object growth before merging.meters0.5[0, +Inf.]
Association Distance ThresholdDistance threshold to associate objects in two nodesmeters2[0, +Inf.]

Algo Node Parameters

Algo node parameters define the parameters used by SENSR's algorithms to process 3D data. They are specific to a given Algo node, allowing tracking to be optimized differently from one Algo node to another. The parameters can be found in Settings > Algo Node Parameters > Algorithm (ip_address). Please change algorithm parameters with caution, as wrong values can negatively affect performance.

Location | Field | Name | Description | Unit | Default | Range
Parameters > PerformanceCPU Thread Utilization of the Preprocessing Node [%]This parameter can be used to optimize the runtime performance of the preprocessing node.
It specifies the percentage of the system's CPU threads that will be allocated for computationally
expensive tasks on the preprocessing node.
percent30[10,100]
CPU Thread Utilization of the Detection Node [%]This parameter can be used to optimize the runtime performance of the detection node.
It specifies the percentage of the system's CPU threads that will be allocated for computationally
expensive tasks on the detection node.
percent50[10,100]
CPU Thread Utilization of the Environment Update Node [%]This parameter can be used to optimize the runtime performance of the environment node.
It specifies the percentage of the system's CPU threads that will be allocated for computationally
expensive tasks on the environment node.
percent10[10,100]
CPU Thread Utilization of the Long-Running Tasks Node [%]This parameter can be used to optimize the runtime performance of the long-running tasks node.
It specifies the percentage of the system's CPU threads that will be allocated for computationally
expensive tasks on the long-running tasks node.
percent10[10,100]
Parameters > Object-Allow Floating-ObjectIf True, min. z of the object is set to be the min. z-value of the object bounding box instead of ground height.booleanFALSE
Filter Objects Based on Number of Points and RadiusEnableIf activated, this filter will sort out objects based on their number of points and their radius.booleanTRUE
Tracked Object's Min. PointsThe minimum number of points of an object to be tracked.int5[0, +Inf.]
Tracked Object's Min RadiusMinimum object radius to be tracked.meters0.15[0, +Inf.]
Parameters > Tracking-Validation PeriodPeriod of checking validity in the early stage of tracking. This time determines length of the VALIDATION period, or how quickly a new object will be tracked and classified.seconds0.5[0, +Inf.]
Invalidation PeriodPeriod of short term prediction when tracking is lost while in the VALIDATION period. A longer period will allow for objects that are at the edge of detection range to be tracked, but will be more likely to introduce false alarms.seconds0.5[0, +Inf.]
Drifting PeriodPeriod of short term location prediction when tracking is lost in TRACKING status. A longer period can help with obstructions, but can lead to object ID switching in busy environmentsseconds1[0, +Inf.]
Drifting Period for Miscellaneous ObjectsDrifting period applied to Miscellaneous Objects.seconds1[0, +Inf.]
Parameters > Edge NodeLZ4 CompressionPublish Point Cloud IntensitybooleanTRUE
Output Type Raw Point Cloud: No processing is applied,
Filtered Point Cloud: Detection range filtering is applied by default.
Downsampling could be applied for further compression.
intRaw Point Cloud
Apply LZ4 CompressionbooleanTRUE
Enable rounding of the coordinate valuesRounding float values can positively affect the compression ratebooleanTRUE
Rounding decimal precision2 means all values will be rounded to 0.01int2[0,inf]
Sort points in the point cloudCompression rate can be improved when the point cloud is sortedbooleanFALSE
Algorithm Components > Ground DetectorLevel Ground DetectorApplyIf True, use the ground detector.booleanTRUE
Ground MarginMaximum height for ground segmentation.meters0.3[-0.5, +Inf.]
ResolutionSize of the grid used on the ground profiling functionmeters5[0, +Inf.]
Limit Detection RangeTo limit the ground detection range (xy-range). If enabled, all points out of a certain xy-range will not be considered as ground points. Note that this is not applied to the z detection range.booleanFALSE
Limit Detection Range > Max Detection RangeMaximum xy-range to apply ground detection.meters130[0, +Inf.]
Algorithm Components > Obstruction DetectorRange Obstruction DetectorApplyThe Obstruction detector checks whether the sensors are obstructed or not based on a distance and proportion of obstruction.booleanTRUE
Maximum DistanceThis parameter defines the range around the sensor where obstructions will be looked for. If the value is too high, false positives may be triggered by passing objects.meters1[0, +Inf.]
Obstructed Proportion ThresholdThis parameter defines the portion of the point cloud that has to be obstructed to trigger an alarm. The value is defined as a proportion between 0 and 1.percentage0.7[0, 1]
Algorithm Components > Tilt Detection & Auto-CorrectionTilt Detection & Auto-CorrectionApplybooleanTRUE
Enable Auto-CorrectionbooleanTRUE
Max Number of Lidar Processed at One Timeint2[0, +Inf.]
Use Points Around LidarWhen disabled, uses all points inside algo nodebooleanFALSE
Auto-CorrectionMin. Angular Offset to Apply Recalibration [deg]Auto-correction will neglect the change if the detected angle offset is smaller than this threshold.degrees0.2[0.01, +Inf.]
Min. Distance Offset to Apply Recalibration [m]Auto-correction will neglect the change if the detected distance offset is smaller than this threshold.meters0.1[0.01, +Inf.]
Max. Angular Offset to Apply Recalibration [deg]Auto-correction will not attempt to correct the tilt if the detected angle offset is bigger than this threshold. A Lidar Tilt Status will be sent and the user is advised to manually check the lidar calibration.degrees5[0.01, +Inf.]
Max. Distance Offset to Apply Recalibration [m]Auto-correction will not attempt to correct the tilt if the detected distance offset is bigger than this threshold. A Lidar Tilt Status will be sent and the user is advised to manually check the lidar calibration.meters0.5[0.01, +Inf.]
Algorithm Components > Background DetectorBackground DetectorApplyIf True, use the background detector.booleanTRUE
Time to Initialize BackgroundInitiated period to learn background points.seconds5[0, +Inf.]
Time to Become BackgroundThe period after which a static object becomes background.seconds600[1, +Inf.]
Time to Become ForegroundThe period after which a background object becomes foreground when it starts moving.seconds2[1, +Inf.]
Use Multi-lidar Background FusionFuse background information from multi-lidarbooleanFALSE
Number of Frames To Estimate Detection RangeNumber of initial frames used to estimate the background detection rangeint3[1, +Inf]
Use Global Detection RangeIf enabled, the global detection range will be used as the background detection range instead of the initial-frames estimation.booleanFALSE
Auto SaveThis parameter lets SENSR save a snapshot of the background point cloud.booleanFALSE
Auto LoadThis parameter lets SENSR load a previously saved snapshot of the background points.booleanFALSE
ResolutionsRangeRange resolution.meters0.2[0, +Inf.]
AzimuthAzimuth angle resolutiondegree0.8[0, 180]
ElevationElevation resolutiondegree1[0, 180]
Algorithm Components > ClustererGrid-Clustererx-ResolutionGrid resolution in x-axis.meters0.15[0, +Inf.]
y-ResolutionGrid resolution in y-axis.meters0.15[0, +Inf]
Cell Point ThresholdMinimum points per cell for clusteringint1[1, +Inf.]
Point-Size Growing > Enable Point-Size GrowingTo allow using bigger clustering resolution to far-distant objects.booleanFALSE
Point-Size Growing > Point-Size Scaling Factor per MeterThe scaling factor of the point size's radius per meter with respect to the distance to the lidar.meters0.002[0, 0.01]
Point-Size Growing > Max. x-ResolutionMax grid resolution in x-axis when using Point-Size Growing.meters0.8[0, +inf]
Point-Size Growing > Max. y-ResolutionMax grid resolution in y-axis when using Point-Size Growing.meters0.8[0, +inf]
Algorithm Components > Cluster Merger (GPU plug-in)Cluster MergerApplyIf True, enables Cluster Merger.True/ False
Algorithm Components > TrackerHybrid-TrackerApplyIf true, use Hybrid-TrackerbooleanTRUE
Association-DistanceDistance threshold to associate objects in two consecutive frames.meters1.5[0.01, +Inf]
Merge Object LevelObject merging level that decides how easily small close objects can be merged into one object.float4[0, 10]
Max. Grow-Size ToleranceMaximum size a merged object can grow to.meters(1, 2)[1, +inf]
Max-Dimensions (W, L)Maximum size of object that can be tracked.meters(4, 15)[0.01, +inf]
Human-TrackerApplyIf true, use Human-TrackerbooleanFALSE
Association-DistanceDistance threshold to associate objects in two consecutive frames.meters0.2[0.01, +Inf]
Merge/Split Size Mul. ThresholdSize ratio threshold that allows to merge/splitmeters1.8[0.0, +Inf]
Merge/Split Size Add. ThresholdSize increasing/decreasing threshold that allows to merge/splitmeters0.5[0.0, +Inf]
Min. Time for SplittingMinimum lifetime of objects to apply merger/ splitter checkseconds3[0.0, +Inf]
Static trackerApplyIf True, use GPU Pipeline Supported TrackerbooleanFALSE
Tracking Association DistanceDistance threshold to associate objects in two consecutive frames.meters2[0.0, +Inf]
Bounding Box Configure > Use Object Tight BoxThis option wraps a bounding box as tightly as possible around the object.booleanFALSE
Bounding Box Configure > Use Position SmoothingThis option will smooth the object's position to reduce jitter.booleanTRUE
Bounding Box Configure > Use Size Smoother DL CarThis option will adjust the car size to create a more seamless transition, thereby minimizing abrupt changes when occlusion occurs.booleanTRUE
Yaw Smoothing > Apply Heading SmoothingThis option will smooth the yaw of vehicles, reducing aberrations.booleanTRUE
Merge/Split Setting > Affected ObjectsThis parameter selects which types of objects will be affected by the static tracker:
- Cars only: only cars will be tracked in a static zone
- Cars & Small Objects: cars and small objects will be tracked in a static zone
- All Objects: all objects will be tracked in the static zone
intAll Objects
Merging levelLevel of aggressiveness of the merging; if objects break apart due to high speed, use a higher setting.int
Reference Size Reduce RateShrinking rate of tracked object size. In case of occlusion, the size of a tracked object will be learnt and kept with this shrinking rate.float0.98[0.0, +Inf]
Use SplittingAllows splitting objects if needed.booleanFALSE
Merge/Split Size Mul. Threshold (Manual Setting)Additional threshold for merging and splitting process objectsmeters2.1[0.0, +Inf]
Merge/Split Size Add. Threshold (Manual Setting) (Min)Size threshold for merging and splitting process objectsmeters1.5[0.0, +Inf]
Merge/Split Size Add. Threshold (Manual Setting) (Max)Additional threshold for merging and splitting process objectsmeters3[0.0, +Inf]
Allow Merging with Deep Learning Vehicle (Manual Setting)booleanFALSE
Increased size Merging Threshold for Deep Learning Vehicle (Manual Setting) (Min)meters1
Increased size Merging Threshold for Deep Learning Vehicle (Manual Setting) (Max)meters2
Max. Distance to Apply Merging [m] (Manual Setting)meters1.5
Algorithm Components > ClassifierLight ML ClassifierApplyIf True, use Light ML ClassifierbooleanTRUE
Target Classification ClassesCAR/PED/CYC/MISCThe classes the output of the classifier will include. E.g. if only PED(estrian) and MISC(ellaneous) are chosen all the objects will be classified as either PED or MISC.booleanTRUE
Only Classify Tracked ObjectsIf selected, only objects with tracking status will be classifiedbooleanTRUE
Skipped Classification ClassesCAR/PED/CYC/MISCAny class marked with this option will be ignored by the classifier.booleanFALSE
Object Class StabilityStability LevelClassification stability level:
- Disabled
- Semi-permanently: If an object class is stable, the classification will be kept until replaced by another stable class.
- Permanently: If an object class is stable, the classification will be kept permanently.
Permanently
Min. Number of Frames for Stable Class LabelThe number of frames an object class needs to be continuously kept in order to become semi-permanent (for Stability Level = 1) or permanent (for Stability Level = 2).int20
Max. Enlarged sizeSize threshold for enlarged objects to maintain their class stability for PED and CYC.meters2.5[0.0, +Inf]
First Order ClassifierThis section is applied first; only if it cannot classify the objects will the second-order classifier assist it.[0.0, +Inf]
Size Classifier for Big ObjectsApplyIf True, use the size classifier for big objects. If used, all objects which have size bigger than Min Length, Width, Height will be classified as one of Include Classes.booleanTRUE
Length Range/ Width Range/ Height RangeThe range of length, width, height to classify as big objects.meters[3,20][1.8, 4] [1, 5][0, +Inf]
Include ClassesThe classes the output of the size classifier for big objects will include.booleanCAR: TRUE, MISC: FALSE
Size Classifier for Small ObjectsApplyIf True, use the size classifier for small objects. If used, all objects which have size in the Length Range, Width Range, Height Range will be classified as one of Include Classes. If an object is classified as a big object, it won’t be classified as a small object even though it’s in the small object range.booleanTRUE
Length Range/ Width Range/ Height RangeThe range of length, width, height to classify as small objects.meters[0.2, 1.3][0.2, 0.8] [0.5, 2.2][0, +Inf]
Include ClassesThe classes the output of the size classifier for small objects will include.booleanTRUE
Velocity ClassifierApplyIf True, use the velocity classifier.booleanTRUE
Non-MISC Min DisplacementMinimum displacement above which an object will not be classified as a MISC.meters2[0, +Inf]
Non-MISC Min VelocityMinimum velocity above which an object will not be classified as a MISC.km/h2[0, +Inf]
PED Max VelocityMaximum velocity above which an object will not be classified as a PED.km/h12[0, +Inf]
Use Velocity EstimatorEnable this to use the velocity estimatorbooleanFALSE
Second Order Classifier > ML ClassifierApplyIf True, use the machine learning classifier. If applied, it will be used only after the size classifier and velocity classifier. E.g., if after the size classifier and velocity classifier an object is classified as either PED or CYC, the ML classifier will decide whether it is a PED or a CYC.booleanTRUE
Min Num PointsMinimum number of points to use ML classifier.int30[0, +inf]
Max Num Objects To ClassifyMaximum number of objects that can be classified by the ML classifier.int100[0, +inf]
Algorithm Components > Object Detector (GPU plug-in)2D Object DetectorApplyIf True, enables the 2D Object DetectorbooleanTRUE
Use Light ModelUsing the light model can speed up processing time. However, it might have a potential impact on detection quality.
booleanFALSE
Car Probability ThresholdMinimum probability to be considered CARfloat0.45[0.0, 1.0]
Pedestrian Probability ThresholdMinimum probability to be considered PEDfloat0.35[0.0, 1.0]
Cyclist Probability ThresholdMinimum probability to be considered CYCfloat0.35[0.0, 1.0]
Exclude Invalid PointsIf selected, points in an exclusion zone will not be included in the object detection model.booleanFALSE
Exclude Ground PointsExclude Ground points from the object detectionbooleanFALSE
Exclude Ceiling PointsExclude Ceiling points from the object detectionbooleanFALSE
Exclude Points in Exclusion ZonesExclude Exclusion zone points from the object detectionbooleanTRUE
Use Ground SubstractionUse this parameter when working in GPU mode with a significant slope.booleanFALSE
Limit Detection Range > EnableAllows object detection to cover a smaller area than the global detection range.booleanFALSE
Limit Detection Range > Max Detection RangeMaximum range to apply object detection.meters100[0, +Inf]
Algorithm Components > Output Object FilterNoise Object FilterExclude Zero-Area ObjectsFilter objects that have a dimension = 0booleanFALSE
Object Filter LevelMethod for filtering noise objects, targeting low-intensity objects and occluded objects.
An object is "Low Intensity" if the number of points whose intensity is higher than "Intensity Threshold" is smaller than "Min. Number of Bright Points". An object is "Occluded" if it is blocked by other objects, considering the lidar positions.
- Disabled: Not using the noise object filter.
- Filter Low Intensity Occluded Object: Filter objects which are both low-intensity and occluded at the same time.
- Filter Occluded Object
- Filter Low Intensity Object
- Filter either Low Intensity or Occluded Object: Filter both low-intensity objects and occluded objects.
Disabled
Intensity ThresholdIntensity Threshold to define low intensity objects.int10[0, +inf]
Min. Number of Bright PointsMin. Number of Bright Points to define low intensity objects.int5[0, +inf]
Algorithm Components > Ceiling FilterApply Ceiling FilteringFilter to discard points and data coming from the ceilingbooleanFALSE
Min Height for CeilingDefines the height of the ceiling, points above that value will be discardedmeters2[0, +inf]
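
As an illustration of the size classifiers described above, the following sketch (hypothetical Python, not SENSR code) checks an object's bounding-box dimensions against the default big-object and small-object ranges; the big-object check runs first, mirroring the note that an object classified as big is never reclassified as small.

```python
# Default ranges from the table above: (min, max) in meters. Names are assumed.
BIG   = {"length": (3.0, 20.0), "width": (1.8, 4.0), "height": (1.0, 5.0)}
SMALL = {"length": (0.2, 1.3),  "width": (0.2, 0.8), "height": (0.5, 2.2)}

def in_ranges(dims, ranges):
    """True if every dimension falls inside its (min, max) range."""
    return all(lo <= dims[k] <= hi for k, (lo, hi) in ranges.items())

def size_classify(length, width, height):
    """Return 'BIG', 'SMALL', or None (left to the other classifiers)."""
    dims = {"length": length, "width": width, "height": height}
    if in_ranges(dims, BIG):    # big-object check takes precedence
        return "BIG"
    if in_ranges(dims, SMALL):
        return "SMALL"
    return None
```

Objects matching neither range would then fall through to the velocity and ML classifiers in this sketch.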

Visualization parameters

The Visualization settings allow users to change the graphical aspects of SENSR; they can be found in View > Preference (or by pressing F11).

Location | Name | Description | Default
VisibilityDraw Object PointsEnable or disable showing points belonging to objects.TRUE
Draw Ground PointEnable or disable showing ground points. (e.g. floor.)TRUE
Draw Background PointsEnable or disable showing background points. (e.g. wall or desk…)TRUE
Intensity Visual ModeEnable this to color the point cloud by intensity values.
- 0: Off - Intensity color visualization is off.
- 1: Fixed Range - Use max and min intensity values from the Range below.
- 2: Auto Range - Calculate max and min intensity values from each frame.
0: Off
Draw GridEnable or disable showing squared Grid.TRUE
Draw Grid CircularEnable or disable showing circular Grid.FALSE
Draw AxisEnable or disable showing the world XYZ coordinate axis.TRUE
Draw ObjectsEnable or disable showing tracked objects. (e.g. pedestrian)TRUE
Draw Misc ObjectsEnable or disable showing miscellaneous objects.TRUE
Draw Drifting ObjectsEnable or disable showing drifting objects.TRUE
Draw Invalidating ObjectsEnable or disable showing invalidating objects.TRUE
Draw Validating ObjectsEnable or disable showing validating objects.TRUE
Draw Object TrailEnable or disable showing the object trail.TRUE
Draw Predicted TrajectoryEnable or disable showing the predicted trajectory of moving objects.FALSE
Draw Map ImageEnable or disable showing a map image.TRUE
Draw Notification WindowEnable or disable showing the Notification window.TRUE
Draw Detection RangeEnable or disable showing the Detection rangeTRUE
Detection Range Draw ModeDraws the detection range in either 2D or 3D3D
Draw Lidar NameEnable or disable showing the name of each sensor.FALSE
Draw Lidar TopicEnable or disable showing the topic of each sensor.FALSE
Draw Algo Node NameEnable or disable showing the name of each algo node.FALSE
Draw Zone NameEnable or disable showing the name of each zone.FALSE
Draw Retro-Reflective ObjectsEnable or disable showing the retro-reflective objects.FALSE
Draw AnnotationsEnable or disable drawing object annotations.FALSE
Visible Annotations -> Show Object IDIf object annotations are enabled the object’s ID is shownTRUE
Visible Annotations -> Show Object Class and ProbabilityIf object annotations are enabled the object’s class and probability score of that class is shownFALSE
Visible Annotations -> Show Object SpeedIf object annotations are enabled the object’s speed is shownTRUE
Visible Annotations -> Show Object HeightIf object annotations are enabled the object’s height is shownFALSE
ColorBackground ColorColor of the background.-
Ground Point ColorColor of the ground points.-
Background Point ColorColor of the background points.-
Object Point ColorColor of points belonging to objects.-
Car ColorColor of cars.-
Pedestrian ColorColor of pedestrians.-
Cyclist ColorColor of cyclists.-
Misc ColorColor of miscellaneous objects.-
Event Zone ColorColor of event zones.-
Exclusion Zone ColorColor of exclusion zones.-
Map Exclusion Zone ColorColor of exclusion zones on 3D maps.-
Reflection Zone ColorColor of reflection zones.-
Static Zone ColorColor of static zones.-
Detection Range ColorColor of the detection range outline.-
Retro-Reflective Object ColorColor of the retro-reflective object-
MiscellaneousCloud Point SizeControls the visual size of the rendered points. The values can range from 1.0 to 10.0.1
Use Random Obj. ColorThis option colors each object in a different color.FALSE
Grid Interval [m]Display grid size.10
View RangeEnable Z Clipping RangeEnable a Z-range outside of which point clouds are not displayed.FALSE
Z Clipping Range Limits [m]Display Z-range.[-10.0, 10.0]
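
Conceptually, the Z clipping range simply hides points outside a height band, as in this minimal sketch (hypothetical Python, names assumed):

```python
def clip_points_z(points, z_range=(-10.0, 10.0)):
    """Keep only points whose z coordinate lies within z_range.

    points  -- iterable of (x, y, z) tuples
    z_range -- (min_z, max_z) display limits in meters (the default
               matches the Z Clipping Range Limits above)
    """
    z_min, z_max = z_range
    return [p for p in points if z_min <= p[2] <= z_max]
```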

Export Configuration

SENSR can export the configuration changes the user has made into a human-readable format. This is useful for checking a project's settings against the settings SENSR uses by default.

To export the configuration changes, go to sensor setup mode and, in the top menu bar, navigate to File->Export Parameter change Report. This will bring up a dialog box to choose where to export the text file.
