Question by Dean O: Travel insurance to America?
I live in the UK and I was wondering: do I need travel insurance to enter America? And which company is the best to take it out with?

Best answer:

Answer by tonalc2
Travel insurance is not a requirement for entering the US.



2 COMMENTS

  1. It is not mandatory to have travel insurance when entering the US.

     However, travel to foreign countries carries risk, and depending on your existing insurance policy and health coverage, you may want medical cover so that you are protected in case of an emergency.
