The 8 Worst Programming Mistakes in History

Code is almost everywhere. Modern computers arrived in the 1940s, and in the decades since, programming has enabled better communication and driven advances across a myriad of industries. Everything from space travel to telecommunications and healthcare has been revolutionized by code.

Plus, programming can teach valuable life lessons. But in its storied past, coding has wrought destruction as well: time and again, a little bit of bad code has caused disaster on a major scale. The following are eight of the worst programming mistakes in history.

Day of the Living Dead: St. Mary’s Mercy Hospital

In 2003, a software glitch incorrectly “killed” 8,500 people. St. Mary’s Mercy Medical Center in Grand Rapids, Michigan erroneously reported that thousands of its patients had died due to a glitch in its patient management software. Compared to the Therac-25 fatalities, this bad-code disaster was relatively harmless, since no one actually died. Still, reading about your own demise is disconcerting, particularly when you’re alive and well.

The false death reports weren’t limited to the patients themselves. The notices also went out to insurance companies and Social Security offices, which rely on accurate records to determine patients’ Medicare and insurance eligibility, so the error created real problems. St. Mary’s Mercy employees informed patients, government agencies, and insurance providers of the mistake. Ultimately the programming error didn’t gain much attention, and it’s unclear whether the underlying bug was ever fixed. However, no further false death reports emerged; St. Mary’s Mercy simply switched to different patient management software.

Why it’s one of the worst programming mistakes: Thankfully nobody actually died, but the damage control required to keep patients’ healthcare coverage intact was a mess.

Y2K Bug

The Year 2000 bug, also known as the Y2K bug or Millennium Bug, was a coding problem predicted to cause computer pandemonium. Through the 1990s, most computer programs stored four-digit years in abbreviated two-digit form: 1990 was stored as 90, 1991 as 91, and so on. By shortening four-digit years to two digits, coders saved valuable memory, but many programs couldn’t distinguish the year 2000 (stored as 00) from 1900. Further exacerbating the problem, 2000 was a leap year, and certain software applications didn’t account for the extra day.
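
To make the arithmetic concrete, here is a minimal Python sketch (not code from any affected system) showing how two-digit year storage and a naive leap-year check go wrong once the calendar rolls past 1999:

def years_elapsed_two_digit(start_yy, end_yy):
    # Many pre-2000 programs subtracted two-digit years directly,
    # which works fine only while both dates fall in the same century.
    return end_yy - start_yy

def is_leap_year(year):
    # Full Gregorian rule; software that only checked "divisible by 4"
    # or treated every century year as non-leap mishandled 2000.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# An account opened in 1998 ("98") and checked in 2001 ("01"):
print(years_elapsed_two_digit(98, 1))   # prints -97 instead of the expected 3

# 2000 is a leap year because it is divisible by 400,
# even though most century years are not leap years.
print(is_leap_year(1900))  # False
print(is_leap_year(2000))  # True

Remediation efforts at the time generally either widened the stored field to four digits or “windowed” two-digit years, treating values below a chosen cutoff as 20xx rather than 19xx.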
