Six Sigma: the very term evokes notions of clinical, stringent quality standards. Much vaunted and in demand, Six Sigma techniques have produced savings across business processes and product lifecycles for corporations around the world.
But how did this come to be? Read on, as we trace out a brief history of Six Sigma, dig into its humble beginnings, and chart out its evolution.
Brief Overview of Six Sigma
The set of principles that comprise Six Sigma has its origins in the quest for quality in mass production, beginning in the late 18th century, though the field of statistics itself, upon which many of Six Sigma's tools are based, has been around for much longer.
The central pillar of statistical theory, as utilized in Six Sigma, is German mathematician Carl Friedrich Gauss's Normal Distribution curve (also called a 'Bell Curve'). Observations on the normal distribution are measured in multiples of the standard deviation, represented by the Greek letter 'σ' ('sigma'), away from the mean. In the context of statistical quality control, processes and products are measured and evaluated to determine variation from acceptable standards, and the spread of the distribution signifies variability.
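As a quick illustration (standard statistics, not part of the original history), the proportion of normally distributed output that falls within ±k standard deviations of the mean can be computed directly from the error function. The function name below is our own:

```python
import math

def within_k_sigma(k: float) -> float:
    """Fraction of a normally distributed population lying within
    +/- k standard deviations of the mean (two-sided)."""
    return math.erf(k / math.sqrt(2))

# Proportion of output inside common tolerance bands
for k in (1, 2, 3, 6):
    print(f"within +/-{k} sigma: {within_k_sigma(k):.9f}")
```

At ±3σ about 99.73% of output conforms; at ±6σ, virtually all of it does, which is the statistical intuition behind the Six Sigma name.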
The Industrial Revolution – First Stirrings of Quality Management
In the pre-industrial world, quality inspection and management was an expensive affair. Quality workmanship was prized, and the only way to get a good-quality product was to have it made at a premium price. On the flip side, craftsmen oversaw their own work, so formal supervision was unnecessary.
All this changed with the Industrial Revolution. The introduction of machinery meant goods could now be produced en masse and more quickly than before. It was around this time that Eli Whitney, the famed American inventor of the cotton gin, took up an idea first articulated by Honore le Blanc: interchangeable parts. When handed a contract by the U.S. government for the production of 10,000 muskets, Whitney came up with designs for standardized musket parts so they could be produced with little variation off a single template for years, thus ushering in the age of mass production.
Whitney's application, termed the Uniformity System, was adopted by defense establishments across Europe and the Americas. This development is notable for the simple reason that it was the first concerted effort to address quality in production.
Ford, Shewhart, and American progress
With the introduction of the assembly line and Ford's adaptation of it to the automobile industry, cost-effective mass manufacturing became a reality. This meant that the need for measurement of parts against pre-determined standards became more acute. The sheer number of parts involved rendered manual measurement against go and no-go gauges, then the prevalent practice, unfeasible. The onus thus shifted to measuring the consistency of the process in place to produce interchangeable parts, so that the end products were within acceptable tolerance limits for quality.
An allied development was the widespread adoption of rudimentary statistical techniques such as sampling. Deployment of statistical tools in the management of quality had become commonplace.
It was in this fertile climate for statistics that Dr. Walter Shewhart, statistician and engineer, was brought in to improve the quality of manufacturing processes at the Western Electric Company in 1924. One of Shewhart's first, and ultimately most important, contributions was the Process Control chart, which was to become a staple of quality management in the decades to come.
By the 1930s, the role of the supervisor or inspector in American industries had moved away from the measurement and identification of defects to determining the stability of the processes involved, identifying deviations from the norm, and suggesting corrective action. Increased complexity and increased volumes of parts handled meant that businesses began to establish separate, dedicated Quality Control Departments.
The Second World War and progress in Japan
At the conclusion of the Second World War, faced with the Herculean task of rebuilding a land in tatters, the Japanese looked to the occupying forces under MacArthur for leadership. Japanese industry and business leaders considered the Allies’ tremendous success to have been due in part to the strong industrial infrastructure in place in the West, which they believed was driven by sound quality assurance mechanisms.
Accordingly, MacArthur secured the services of W. Edwards Deming, a contemporary of Walter Shewhart, to aid in the process of reconstruction. Deming introduced the concept of the PDCA (Plan-Do-Check-Act) cycle, which he termed the Shewhart cycle, to Japanese industry, and trained native engineers and managers thoroughly in Statistical Process Control (SPC). Deming's work in Japan laid the foundations for developments in the decades to come, and his insights continue to find a place in today's Six Sigma training courses.
Juran in Japan
A major problem most American and Japanese organizations that had deployed Quality Management systems had to grapple with was the integration of quality inspectors within the overall corporate structure. Managers had little to no knowledge of statistics, and statisticians were distrusted by the workforce, who believed they were there to closely monitor performance.
Joseph Juran, an American engineer and management consultant of Romanian origin, came up with the idea of integrating the various strata of quality management in an organization. Before Juran, teams of inspectors, statisticians, and surveyors existed at every level of a company, from shop floor to top management. These teams were now integrated into a single, seamless whole, and quality was achieved through the active engagement of management (termed the 'Big Q'). This philosophy continues to inform modern Six Sigma methodology.
Independently of Deming, Juran introduced courses in quality management in Japan and trained middle and top-level management, a move that ruffled feathers in conservative America, where management was seen as being above training.
These and other contributions, as enshrined in his "A Quality Control Handbook" (1951), played a vital role in the formulation of the modern quality management ideology.
An American Resurgence
By the early 1970s, the Japanese focus upon quality had surpassed that of the Americans. Japanese businesses ran on two core principles: cycle time reduction and defect elimination. This emphasis on quality meant that Japanese automobiles, which were far more fuel-efficient than their American counterparts, began to rule the roost when the oil crisis hit in 1973.
US industry finally stood up and took notice. Juran and Deming, now with decades of experience in training for quality, were roped in to work a second miracle. Philip Crosby's book, "Quality Is Free", set out his 14-step approach to quality management and the principle of Zero Defects, which was widely criticized as absolutist and impracticable and eventually fell out of favor.
Taking note of these developments on either side of the world, the Geneva-based International Organization for Standardization introduced a quality specification in 1987, modeled after the British BS 5750 standard (1979), called ISO 9000. The intent was to ensure uniformity of quality management guidelines and practices. However, ISO 9000 only validates the consistency of production and manufacturing processes, not the quality of the end products themselves.
Other Japanese Contributions
As we have seen, the quality revolution was particularly protracted and intense in Japan. Japanese quality teams, especially those working for auto giants such as Toyota, came up with a number of concepts and techniques that have since been absorbed into the Six Sigma fold: Just-In-Time manufacturing, Quality Circles, Toyota's Kanban squares, and kaizen (the principle of continuous, incremental improvement), among others.
While many of these were adopted by American and European industry, some were initially dismissed as being 'culturally untransferable'. Others, such as Total Quality Management (TQM), had originally been American innovations, and were thus easily adaptable.
The Motorola Story
In keeping with the spirit of the times and to encourage enterprise, the Malcolm Baldrige National Quality Award was instituted in the United States, won by the Motorola Corporation first time out, in 1988. Motorola's tryst with quality began when the company was looking to revamp its pocket pager business in the early 1980s. Under the leadership of CEO Bob Galvin and engineer Bill Smith, who had first mooted the idea of continuous quality improvement, Motorola instituted a policy of applying statistical quality control not just to gauge process capability, but to product specifications as well. The idea was to hitch product design to process quality and ensure a product only went into production when its specifications met the standards demanded by process control.
To facilitate this new paradigm, Motorola expanded upon the older notion of three sigma by three additional standard deviations from the mean to include product specifications as well, thus birthing the term 'Six Sigma'. Statistically speaking, a spread of six standard deviations (six sigmas) about the mean would encompass 99.9999998% of all output, or roughly 0.002 defects per million opportunities, effectively zero. In practice, Six Sigma allows for a 1.5-sigma drift in the process mean over time, which yields the familiar figure of 3.4 defects per million opportunities.
Motorola also began to utilize statistical methods of control extensively. Data tools such as Cp (process potential index) and Cpk (process capability index) began, for the first time in the history of quality management, to dictate and inform policy at the highest levels. Setting specific quality targets, such as 3.4 DPMO (Defects Per Million Opportunities), now followed by organizations worldwide, was a practice first perfected by Motorola.
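The capability indices mentioned above have simple closed forms. Here is a minimal sketch in Python; the spec limits and process parameters are illustrative values of our own choosing, not Motorola's actual figures:

```python
import math

def phi(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def cp(usl: float, lsl: float, sigma: float) -> float:
    """Process potential index: spec width over six process sigmas."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl: float, lsl: float, mu: float, sigma: float) -> float:
    """Process capability index: penalizes an off-center process mean."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

def dpmo(spec_sigma: float, shift: float = 1.5) -> float:
    """Defects per million opportunities, using the conventional
    1.5-sigma long-term mean shift (dominant tail only)."""
    return 1e6 * phi(-(spec_sigma - shift))

# A centered six-sigma process: spec limits sit 6 sigmas from the mean
print(cp(12, 0, 1.0))        # 2.0
print(cpk(12, 0, 6.0, 1.0))  # 2.0 (mean centered, so Cpk == Cp)
print(round(dpmo(6.0), 1))   # 3.4 DPMO
```

Note how the 1.5-sigma shift converts the near-zero two-sided defect rate of a true ±6σ spread into the widely quoted 3.4 DPMO target.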
Their runaway success spurred IBM to adopt Six Sigma practices, which improved upon the Measure, Analyze, Improve, Control (MAIC) cycle and added the Define dimension (DMAIC). In 1989, Motorola made Six Sigma its flagship approach to quality, and Xerox, GE, and Kodak followed suit.
Mikel Harry and The Six Sigma Academy
Mikel Harry, an ex-Motorola employee, teamed up with colleague Richard Schroeder to found the Six Sigma Academy in the early 1990s. Like his quality management forebears, Harry aimed to teach and train employees in Six Sigma tools and methods, and to guide businesses in successfully implementing Six Sigma principles across the organization.
Harry's first client was Lawrence Bossidy of AlliedSignal, who applied Six Sigma to turn his ailing business around. Bossidy later introduced a close friend, General Electric CEO Jack Welch, to the methodology; Welch applied it wholesale at General Electric and achieved much-documented success as a result. The Academy's other notable clients included DuPont and Merrill Lynch.
While at Unisys in 1987, Mikel Harry drew inspiration from Eastern martial arts to apply the belt argot to Six Sigma practitioners, designating professionals as Yellow, Green, Black, and Master Black Belts.
Looking to improve business processes and reduce costs while adding value to your organization? Simplilearn's Certified Lean Six Sigma Green Belt is the perfect program for you. This certification program provides in-depth training on the principles of Lean Six Sigma, including process improvement, data analysis, and project management. With the Certified Lean Six Sigma Green Belt, you'll learn how to identify and eliminate waste, optimize business processes, and deliver exceptional results. Start your journey to becoming a certified Lean Six Sigma Green Belt and take your career to the next level with Simplilearn.