In most cases, no, it is not true that you can only see your employer’s doctors for injuries sustained on the job.
If you have been injured at work, you generally have the right to seek medical treatment from a doctor or healthcare provider of your choosing. However, in some states, such as Texas, and under certain workers’ compensation plans, your employer or its insurance carrier may have the right to choose the doctor who initially treats your work-related injury…