Walk down any city construction site and you’re bound to see a network of steel beams and columns rising from the ground. Why are they using steel? Because steel is strong, durable, and easy to work with, making it the iron alloy of choice for building construction.
If you’re wondering how steel is manufactured, wonder no more! In this blog post, we’ll explain the process from start to finish.
History of Steel
The emergence of steel as the foremost material for construction can be traced back to the Iron Age, when it was used to make swords and other implements.
The railroad construction boom of the 19th century created enormous demand for the metal to make tracks, and meeting that demand was a problem, because no automated production process yet existed to fill the need.
The Initial Making of Steel
Steel does not grow out of thin air. It begins with the mining of iron ore, which then has to be combined with the element carbon via a blast furnace. Let’s get more involved understanding how this process works.
Mining the Iron Mineral
It all begins with the mining of iron ore. An ore is a rock or mineral from which a valuable metal can be extracted.
Once it is taken out of the mine, the ore is smelted, melting it and removing impurities so that only the metal remains. This is done in a blast furnace.
Carbon is the element in the Periodic Table with an atomic number of six: it has two electrons in its inner shell and four electrons in its outer shell.
Atoms with fewer than eight electrons in their outer shell (called the valence shell) tend to bond with other atoms so that their valence shells reach a stable count of eight electrons. This tendency is known as the Octet Rule.
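To make the shell picture above concrete, here is a small sketch using the naive Bohr-model rule that the n-th shell holds at most 2n² electrons. This simple model works for a light element like carbon, though it breaks down for transition metals such as iron, whose electron ordering is more subtle.

```python
def shell_occupancy(atomic_number):
    """Distribute electrons across shells, filling each to its 2*n^2 capacity."""
    shells = []
    n = 1
    remaining = atomic_number
    while remaining > 0:
        capacity = 2 * n * n        # naive Bohr-model capacity of shell n
        filled = min(remaining, capacity)
        shells.append(filled)
        remaining -= filled
        n += 1
    return shells

print(shell_occupancy(6))  # carbon -> [2, 4]: two inner, four valence electrons
```

Carbon's four valence electrons (four short of a full octet) are what make it so eager to bond.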
In steel, however, iron and carbon do not form a simple molecule. Instead, the small carbon atoms slip into the gaps between the much larger iron atoms in iron's crystal lattice, forming what chemists call an interstitial alloy.
It is essential to ensure that the correct amount of carbon is used with iron, typically a few tenths of a percent and never more than about 2 percent, so that the resultant product is steel. If the wrong amount of carbon is mixed with iron, a different product will be produced, such as cast iron or wrought iron, and neither of these matches steel's combination of strength and workability.
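The idea that carbon content determines the alloy can be sketched in a few lines of code. The thresholds below are a simplified illustration drawn from the approximate ranges discussed in this post (wrought iron under 0.1 percent, steel up to about 2 percent, cast iron 2 to 4 percent), not a metallurgical standard; real-world boundaries vary with the other elements present.

```python
def classify_iron_alloy(carbon_pct):
    """Roughly classify an iron alloy by its carbon content (percent by weight)."""
    if carbon_pct < 0.1:
        return "wrought iron"   # very low carbon: soft and malleable
    elif carbon_pct <= 2.0:
        return "steel"          # a few tenths of a percent up to ~2%
    else:
        return "cast iron"      # roughly 2-4% carbon: brittle but easily cast

print(classify_iron_alloy(0.25))  # typical structural steel -> "steel"
print(classify_iron_alloy(3.0))   # -> "cast iron"
```

A fraction of a percent either way is the difference between a soft, bendable metal and a brittle one.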
When is Carbon Added to Iron?
For steel, this combination of the two elements is done while the iron is molten, which alters the iron's properties into those of steel.
Steel is thus an alloy (a metal made by combining two or more elements, at least one of them metallic) of iron and carbon. The carbon distorts the crystalline lattice structure of iron, which enhances the metal's strength; specifically, it increases the metal's tensile and compressive strength.
The Manufacturing Process
A breakthrough for manufacturing steel via an automated process materialized in 1856, when Henry Bessemer found a way to manufacture steel quickly. Bessemer's steel production process helped fuel the industrial expansion of the late 19th century.
It was the first cost-efficient industrial process for the large-scale production of steel from molten pig iron: a blast of air blown through the molten metal removed the impurities.
Adding Carbon Produces a Variety of Iron Alloys
As previously mentioned, mixing carbon with iron changes the metal's characteristics, allowing a variety of different alloys to be created. It all depends on how much carbon is added. Let's take a look.
Wrought iron is softer than cast iron and contains less than 0.1 percent carbon and 1 or 2 percent slag. It was an advancement over bronze and began to replace it in Asia Minor by the 2nd millennium BC. Because iron was far more plentiful as a natural resource, wrought iron was used for a wide variety of implements as well as weapons and armor.
Cast iron is an alloy of iron that contains 2 to 4 percent carbon, along with smaller amounts of other elements such as silicon and manganese and minor traces of sulfur and phosphorus, carried over from smelting. (Slag, by contrast, is the nonmetallic byproduct of smelting, and it is what gives wrought iron its fibrous character.) Cast iron can be easily molded into a desired shape, a process known as casting, and has been used to make decorative fences and other aesthetic forms.
Cast iron facades were invented in America in the mid-1800s and were produced quickly, requiring much less time and resources than stone or brick. They were also very efficient for decorative purposes, as the same molds were used for many buildings and a broken piece could be quickly recast. Because cast iron is strong, buildings could have large windows that let in a lot of light, and high ceilings that required only columns for support.
Steel is an alloy made from iron that usually contains several tenths of a percent of carbon, which increases its strength and durability over the other forms of iron, especially in tensile strength.
Strictly speaking, steel is just another kind of iron alloy, but it has much less carbon than cast iron and somewhat more than wrought iron, with other metals frequently added to give it additional properties.
Most of the steel produced today is called plain carbon steel, although it can contain elements other than iron and carbon, such as silicon and manganese.
The advantages of steel are numerous: great tensile and compressive strength, speed of manufacturing, and low cost. It is the metal of choice in construction when compared to iron.
Although iron and steel appear to be similar, they are two distinct materials with their own specific characteristics and qualities. Iron is a pure element, while steel is an alloy that contains a percentage of carbon. Depending on the amount of carbon mixed with iron, different products emerge, and one of these is steel.
Steel is the far stronger material, and at present no other metal matches it when strength and cost are the deciding factors.