Channel: LabWindows/CVI topics

Numeric control precision: display vs value

With a float or double numeric control there are, in effect, two values: the one the user perceives, restricted by the display format, and the true underlying value. This means code can set a control's value to 12.345 and the control may display it as 12.3 if only one decimal place is shown. Fair enough.

 

Now consider a case where the numeric up/down feature is enabled with an increment of 0.1.  The user clicks the "up" once and sees "12.4" in the control. Behind the scenes the value is now 12.445. 

 

Clearly in some cases this is appropriate, while in others it is not. It all depends on whether the user intended to enter 12.4, or intended to increment the existing value by 0.1. And there is clearly an argument that if the difference between those matters, then the display should show those digits.


I'm not saying anything is broken. I just have a multi-platform issue where I'm trying to maintain behavioral parity between two systems (one CVI-based, one not), so this detail is important to me now.

 
Does anyone have any tricks, techniques, or experience to mitigate this issue? Have you ever tried to modify the increment/decrement buttons so they jump to round values based on the display precision?

 
I'd appreciate any thoughts on the topic.

