This work focuses on implementing a vision-based motion guidance method, called virtual fixtures, on admittance-controlled human-machine cooperative robots that exhibit compliance. Robot compliance here refers to the structural elastic deformation of the device. The high mechanical stiffness and non-backdrivability of a typical admittance-controlled robot allow for slow and precise motions, making it well suited to tasks that demand accuracy near the limits of human capability, such as microsurgery. However, previous experiments have shown that even small robot compliance degrades virtual fixture performance, especially at the micro scale. In this work, we developed control methods for admittance-controlled cooperative systems that minimize the effect of robot compliance on virtual fixture performance. Based on a linear model of the robot dynamics, we applied a Kalman filter that fuses camera and encoder measurements to estimate the robot end-effector position. A partitioned control law then drives the end-effector to follow the desired velocity commanded by the admittance and virtual fixture control laws. The effectiveness of the Kalman filter and the controller was validated on a one degree-of-freedom admittance-controlled cooperative testbed.
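The sensor-fusion step described above can be sketched for a one degree-of-freedom system as a two-state (position, velocity) Kalman filter that combines a noisy camera position measurement with a noisy encoder position measurement. The sample time, process model, and noise covariances below are illustrative assumptions, not the paper's identified robot model, and the compliance between the encoder and the end-effector is ignored for brevity.

```python
import numpy as np

# Hedged sketch: 1-DOF Kalman filter fusing two position sensors.
# All numeric values (dt, Q, R, noise levels) are assumed for illustration.

dt = 0.01                                  # sample period [s] (assumed)
A = np.array([[1.0, dt],                   # constant-velocity process model
              [0.0, 1.0]])
H = np.array([[1.0, 0.0],                  # camera: position measurement
              [1.0, 0.0]])                 # encoder: position measurement
Q = 1e-8 * np.eye(2)                       # process noise covariance (assumed)
R = np.diag([1e-4, 1e-6])                  # camera noisier than encoder (assumed)

def kf_step(x, P, z):
    """One predict/update cycle; z = [camera_pos, encoder_pos]."""
    x = A @ x                              # predict state
    P = A @ P @ A.T + Q                    # predict covariance
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

# Fuse both sensors while tracking a slow constant-velocity motion.
rng = np.random.default_rng(0)
x_true = np.array([0.0, 1e-3])             # true state: moving at 1 mm/s
x_est, P = np.zeros(2), 1e-3 * np.eye(2)
for _ in range(500):
    x_true = A @ x_true
    z = x_true[0] + rng.normal(0.0, [1e-2, 1e-3])  # camera / encoder noise
    x_est, P = kf_step(x_est, P, z)
```

Because both sensors report the same scalar position here, the fused estimate is dominated by the lower-noise encoder; in the actual system the camera observes the end-effector while the encoder observes the motor side, which is what makes the fusion useful in the presence of compliance.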