When did acting become an official job?
There is no single date on which acting became an "official job". Acting has been a paid occupation since antiquity: performers in ancient Greece and Rome were compensated for their work. However, acting was not recognised as a profession in the modern sense until the 16th and 17th centuries, when the first permanent professional theatre companies were established, such as the licensed playing companies of Elizabethan England and the commedia dell'arte troupes of Italy.