Chip famine typically occurs when some sociological or physical change prevents certain chips from being produced in large enough numbers to satisfy demand. A severe case occurred in 1988, after a pact between American and Japanese chip manufacturers resulted in sharply reduced production. Transitions to newer production methods also cause chip famines, as new factories are not built quickly enough to meet demand for the newer chips: examples include the shortage of smart card chips in 1999 and the rationing of other types of chips in 2004. More recently, the 2011 Japanese earthquake led to a period when it was difficult to source all the parts needed for various systems.
Chip famines can have a major effect on the electronics industry: manufacturers may change their sourcing of chips and suffer major losses of profit, as when PC manufacturer Gateway switched from Intel to AMD microprocessors in 2000. Some manufacturers may find themselves having to redesign their products to account for the shortage of certain chips, or may have to leave design options open so that alternative chips can be incorporated into the design.
In 1988 there was a shortage due to high demand; workers at seven Hitachi factories had to work through their summer vacations to meet it. In 1994, there was a shortage due to the development of new technologies. The newer manufacturing processes required significantly cleaner clean rooms, and many batches of integrated circuits were discarded due to manufacturing errors.
Intel suffered a shortage of several products in 2000. Larger companies were able to obtain the products they needed, but smaller companies such as Gateway had to wait or find alternative chips.
There was a shortage of CDMA chips in 2004, caused by the strong push of mobile phone companies to introduce and establish CDMA in both the United States and India.
After the 2011 earthquake in Japan, there were severe shortages of NAND flash memory and displays. Qualcomm released a statement on April 19, 2012, saying that it expected a shortage of its Snapdragon chip due to a lack of facilities able to manufacture the 28 nm part.
The 1986 U.S.–Japan semiconductor trade pact was designed to help American chip manufacturers compete with Japanese companies, and it resulted in severe cuts in Japanese production. A 1993 DRAM chip famine was caused by an explosion at the factory that produced 60% of the world's supply of the resin used in chips. From 1993 to 1994, a glut of chips left companies with little incentive to build new leading-edge factories, so when new generations of chips came out, there were not enough factories to produce them.
A previous chip famine might also cause a slump in the market, resulting in fewer chips being manufactured. When the slump is over, demand might grow too high for the companies to fulfill, resulting in a second shortage.
New generations of chips are difficult to produce because of the limits of manufacturing capability. In the first few runs, many batches of product are discarded due to manufacturing defects, so capacity that could have been used to produce older chips is consumed without yielding shippable newer chips. Furthermore, customers who want the newest chips may not be willing to settle for older ones, so companies must wait for the newer chips before putting them into their products.
A chip pact enacted in 1986 was designed to help American manufacturers compete with their Japanese counterparts, but it had unintended consequences. The pact called for Japanese companies to stop selling chips below cost ("dumping"), which led those companies to produce and export fewer chips, since overproduction had been the root cause of the dumping. American companies did not re-enter the market as expected, owing to the high cost and risk of production.
Shortages of DRAM, SRAM, and processors have raised the prices of the remaining chips. A lack of integrated circuits for Nintendo's Wii console caused the global Wii shortage in 2007.