What is At-Will Employment?
The concept of at-will employment came into being in the 1870s, shortly after the Civil War, when most areas of the country were busy rebuilding and expanding. It was a time when companies grew without much government interference. An aura of freedom surrounded working men throughout the country--not only those who had been emancipated by the Civil War; the working man in general felt the impetus of his own individualism. An atmosphere of equality thrived between employer and employee: Just as the employer had always enjoyed the option to terminate someone’s employment whenever he pleased and without any reason, the employee gained equal footing to “up and quit” his job at will, whenever it suited him.
Eventually, relationships between employers and employees normalized, for unionized and non-unionized workers alike. The status quo held that an employee who reported for work regularly and did a good job provided a value that would preclude any unnecessary or unjust termination.
In time, all 50 states plus the District of Columbia subscribed to the at-will doctrine. But social changes in the mid-1900s brought the issue back to the table. Legislators and jurists revisited concerns about employee rights under at-will employment and eventually identified three conditions--called exceptions--that protect employees from the hazards of sudden, unexplained termination: employees are protected against discharge that violates public policy, against discharge when an implied contract exists, and against discharge that breaches an implied covenant of good faith and fair dealing.
As the exceptions evolved, most states accepted one or more of them, but today only six states honor all three: Alaska, California, Idaho, Nevada, Utah, and Wyoming. Conversely, four states recognize none of the exceptions: Florida, Georgia, Louisiana, and Rhode Island.