Any tolerance on a part less than +/- 0.001 isn’t real. If I can change the size of the part enough to blow it out of tolerance by putting my hand on it and putting some of my body temperature into it then it’s just not real.
I used to work with those kinds of tolerances. Sensors for supersonic vehicles definitely need them, and the tools to make them as well. Our tolerances were as tight as 0.01 arcseconds in rotations of motors smaller than my hand.
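For a sense of scale, here's a quick sketch of what 0.01 arcseconds means as linear motion. The 50 mm radius is a hypothetical motor size, not from the comment above:

```python
import math

ARCSEC_PER_DEG = 3600.0

def arcsec_to_rad(arcsec: float) -> float:
    """Convert an angle in arcseconds to radians."""
    return math.radians(arcsec / ARCSEC_PER_DEG)

def arc_displacement_m(arcsec: float, radius_m: float) -> float:
    """Linear arc length swept at radius_m by a rotation of arcsec."""
    return arcsec_to_rad(arcsec) * radius_m

# A 0.01 arcsecond rotation at a 50 mm radius (assumed geometry):
d = arc_displacement_m(0.01, 0.050)
print(f"{d * 1e9:.2f} nm")  # roughly 2.4 nm of rim motion
```

That's a few nanometres at the rim, i.e. comparable to a few dozen atomic diameters, which is why the tooling has to be as good as the parts.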
That’s the tolerance on paper, but the dude was doubting that you actually achieve that tolerance in the field. How can you when (using your supersonic aircraft example) you need to deal with temperature fluctuations of more than 100 °C, not to mention pressure changes and shear forces, which all deform the part?
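The back-of-the-envelope arithmetic here is just dL = alpha * L * dT. A minimal sketch, assuming a 1-inch aluminum feature (both assumptions, not from the thread):

```python
# Linear thermal expansion: dL = alpha * L * dT.
ALPHA_AL = 23e-6     # per °C, approximate coefficient for aluminum alloys
LENGTH_IN = 1.0      # inch, assumed feature size

def expansion_in(alpha_per_c: float, length_in: float, delta_t_c: float) -> float:
    """Length change (inches) for a temperature swing of delta_t_c."""
    return alpha_per_c * length_in * delta_t_c

hand_warmth = expansion_in(ALPHA_AL, LENGTH_IN, 15.0)    # hand vs. shop temp
flight_swing = expansion_in(ALPHA_AL, LENGTH_IN, 100.0)  # in-service swing

print(f"hand: {hand_warmth:.5f} in, flight: {flight_swing:.5f} in")
```

A 15 °C warm-up from handling moves the part about 0.00035", already a third of a ±0.001" band, and a 100 °C in-flight swing moves it about 0.0023", well outside it.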
Were they temperature controlled deep in the craft? On the leading edge, things can get up to foundry temperatures, so thermal expansion can reach a few percent.
I guess you could design something to self-compensate for that, like the old gridiron pendulums, but that’s a big ask on top of a task that already sounds daunting.
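The gridiron trick is to pair two materials so one's expansion subtracts from the other's: compensation happens when L_steel * a_steel = L_brass * a_brass. A sketch with approximate coefficients and made-up geometry:

```python
# Gridiron-style compensation: brass expansion cancels steel expansion.
ALPHA_STEEL = 12e-6   # per °C, approximate
ALPHA_BRASS = 19e-6   # per °C, approximate

def brass_length_for_compensation(steel_len: float) -> float:
    """Brass length whose growth cancels the steel's: Ls*as = Lb*ab."""
    return steel_len * ALPHA_STEEL / ALPHA_BRASS

def net_growth(steel_len: float, brass_len: float, dt: float) -> float:
    """Net length change when the brass expands against the steel."""
    return (steel_len * ALPHA_STEEL - brass_len * ALPHA_BRASS) * dt

ls = 1.0                                # metre of steel, say
lb = brass_length_for_compensation(ls)  # about 0.63 m of brass
print(net_growth(ls, lb, 100.0))        # essentially zero: the swing cancels
```

The catch, as noted above, is that you've now doubled the part count and tied your accuracy to how well the two coefficients track each other.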
Everything was temp and humidity controlled in the factory and delivery. After that I don’t know.
I know when we branched out and did control sticks, we ran thermal cycles where someone held the stick with two hands while running on a treadmill, to simulate the exertion of a pilot.
It’s a rule: every high-tech workplace does something low-tech and stupid-looking. Last time it came up, it was somebody handling radioactive samples with tape on a stick. (The stick was long enough, and they yelled for everyone to stay away, you see.)
I suppose if it’s something that’s going to sit in a temperature-controlled compartment or room, the tolerance can actually hold, and those applications do exist.
You usually hear surface finish considered separately from tolerance in machining, but if you’re making an optical component, they’re actually not separate either.
0.001 what? Meters?
apples
Inches, I reckon. 0.001" = one thou.
Yes, most likely. Although that’s a cursed industry-specific unit, right up there with carats and hands.
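For anyone metric, the conversion is trivial but worth writing down once. One thou is 0.001 inch; at 25.4 mm per inch:

```python
# "One thou" = 0.001 inch. Conversions to metric.
MM_PER_INCH = 25.4

def thou_to_mm(thou: float) -> float:
    """Convert thousandths of an inch to millimetres."""
    return thou * 0.001 * MM_PER_INCH

def thou_to_um(thou: float) -> float:
    """Convert thousandths of an inch to micrometres."""
    return thou_to_mm(thou) * 1000.0

print(thou_to_um(1.0))  # about 25.4 micrometres per thou
```

So a ±0.001" tolerance is roughly ±25 µm, or ±0.025 mm.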