kwackers is right.
Depending on the class and grade of your DTI, the mechanical drive is a set of flexure linkages that mechanically multiply the angular rotation of the indicator point's driven pivot. The point itself is a friction fit on the driven pivot, which is what allows you to manually change its angle with respect to the DTI's centerline. The pivots themselves (at least in good quality DTIs) are jewel-mounted, with spring-loaded connections to the other pivot levers in the multiplication chain. The indicator point's shaft should line up within 10° of the surface you are indicating to keep cosine error small.
I made my first DTI as part of my apprenticeship. Since I was 13 years old when I made it, it is nowhere near as pretty as I had hoped: I had to hand-saw and file all the parts except for the jewel locations (which were done using disk locators on a drill press). Even so, between 1968 and 1988 it managed to pass NBS/ANSI calibration as accurate to within .0001 inch. (I have not done a job requiring my indicators to have formal calibration since 1988.) That unit is in storage, and I normally use commercial units in my shop, since it has been more than a decade since I have had enough space to set up my full tool set (which is why I have no pictures to share).
The thing is, the accuracy of the indicator depends more on the symmetry of the pinch springs connecting the pivot links than anything else, though the position and alignment of the jeweled pivots comes a close second. If you stop and think about it, a traditional .0001 inch resolution DTI measures .0040 inch total over 270° of needle rotation. That's 40 divisions per 270°, or 6.75° of needle swing per .0001 inch of tip deflection. If you do the math on the ratios involved, you will start thinking that .0001 inch accurate DTIs are cheap!
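If you want to run those ratios yourself, here is a quick back-of-the-envelope check. The resolution, range, and 270° sweep come straight from the numbers above; the 1-inch needle length is my own assumption for illustration, not something stated about any particular DTI:

```python
import math

# Figures from the discussion above:
resolution_in = 0.0001   # inches per graduation
total_range_in = 0.0040  # inches full scale
sweep_deg = 270.0        # needle rotation over the full scale

divisions = total_range_in / resolution_in  # 40 graduations on the dial
deg_per_division = sweep_deg / divisions    # 6.75 degrees of needle swing per .0001"

# Assumed 1" needle length (hypothetical, for illustration only): how far
# the needle tip travels for each .0001" of contact-point deflection.
needle_len_in = 1.0
tip_travel_in = math.radians(deg_per_division) * needle_len_in

print(f"{divisions:.0f} divisions, {deg_per_division}°/division")
print(f"needle tip moves ~{tip_travel_in:.4f}\" per .0001\" of deflection")
```

With the assumed 1" needle, each .0001" of deflection moves the needle tip roughly 0.118", an overall mechanical magnification on the order of a thousand to one, which is why the symmetry of those tiny linkages matters so much.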