When you hear the term ‘gold standard’, you are likely hearing about the International Gold Standard. It was established in the late 1800s, following a silver crisis in England that culminated in the U.S. suspending all silver payments. Germany established the first gold standard in 1871, and within three decades nearly all of the world’s major economies had adopted similar systems.
With the onset of the First World War, the gold standard faced its first crisis. Britain abandoned it for a fiat standard because of the enormous cost of fighting the war. Under the Treaty of Versailles, which set the terms of peace after Germany’s defeat, Germany was forced to turn over the bulk of its gold supply to the victorious powers as reparations. This left Germany without enough gold to maintain the standard, leaving it no option but to switch to a fiat currency system.
The U.S. and most other major economies abandoned the gold standard by the early 1930s. The standard’s official demise came in 1933, when the nations still using it failed to agree on the value of gold. Following the Second World War, the Bretton Woods Agreement governed the value of currencies until 1971, when the U.S. ended the dollar’s convertibility into gold and the free-floating currency era began. Gold lost its status as the basis of reserve accounting for central banks.
The gold standard initially allowed world trade to expand, but it was problematic. Gold supplies grew far more slowly than the global economy, which made the gold standard highly deflationary. In the U.S., for example, the switch to gold was followed by periods of deflation lasting as long as fourteen years. The standard could also produce local distortions of value. Such was the case when the Irish found it more profitable to export potatoes to England than to sell them domestically, which contributed to what became known as the Great Potato Famine.