Quantum Computing: The Origin and Its Applications Explained

Quantum computing: close-up of an optical CPU processing a light signal. Photo: iStock

You have certainly heard the word 'computing', and you may have heard the term 'quantum'. It is far less likely, however, that you have heard the two words together.

The term 'quantum computing' has not yet gained the traction it deserves in the tech world, and those who have waded into the subject may find it confusing, to say the least. Yet many experts strongly believe that quantum computing is not just the future of technology but a turning point for humanity, as we move beyond the binary computer bit and venture into computing at the subatomic level.

If you don't have a clue what we are talking about, you are not alone. Stay with us through this article as we discuss quantum computing in detail: what it is, how it may change the tech world, and its practical implications, for better and for worse.

But before we get into this potentially life-changing advancement, we need to discuss the platform on which it is built: quantum theory.

What is Quantum?

Particles of the atom: protons, electrons, and neutrons around the nucleus. Photo: iStock

A quantum (plural: quanta) is, in simple terms, the minimum amount of a physical property that can be involved in any physical interaction.

Using examples of particle interactions within the atom: a quantum of light is a photon, and a quantum of electricity is an electron. No interaction can involve anything smaller than these indivisible units.

The Potential Enabler of Quantum Computing 

The industrial revolution that carried into the 20th century was one of the greatest milestones of modern history. From the automobile to industrial steel, elevators, and aircraft, it gave birth to a plethora of things that now define our civilization and will continue to shape our future.

Enter the 21st century, and we are watching a transition from the tangible to the intangible (virtual) world, most notably computer technology: its hardware, software, and the World Wide Web.

Among the many incredible things ensuing during this technological revolution is the colossal development in physics, specifically quantum theory. We will keep the explanation of quantum theory as simple as possible to make this an interesting and informative read.

Modern Physics

It is important to understand that physics is divided into two broad branches: classical and modern. The former was established during the Renaissance and continued to progress after that. Classical physics is built largely on the ideas put forward by Galileo and Newton, and its principles focus primarily on the macroscopic (visible to the naked eye), solid nature of the world around us.

Conversely, modern physics analyzes matter and energy at microscopic scales. It leans heavily on electromagnetism, the wave nature of light and matter, and wave-particle duality. It is interesting to note that all of these motifs of modern physics come from quantum theory.

While we are at it, it is important to clarify that quantum theory is not a single idea or hypothesis; it is a collection of principles. We will discuss them briefly and simply, staying focused on the parts that are relevant to quantum computing.

    • The work of physicists Max Planck and Albert Einstein in the early 20th century theorized that energy exists in discrete units, or 'quanta'. This hypothesis contradicted classical physics, which held that energy varies only along a continuous spectrum.
    • In the following years, Louis de Broglie extended the theory by suggesting that at microscopic (atomic and subatomic) scales there is little difference between particles of matter and energy: both can behave as either particles or waves, depending on the conditions.
    • Lastly, Werner Heisenberg proposed the uncertainty principle, which holds that complementary properties of a subatomic particle, such as its position and momentum, cannot both be measured with perfect accuracy at the same time.

Niels Bohr's Interpretation of Quantum Theory: The Primal Basis of Quantum Computing

Illustration of a quantum computer. Image by Pete Linforth from Pixabay

Around the time when the stipulations of quantum theory were being extensively debated among top physicists, Niels Bohr offered an important interpretation of the theory. He suggested that the properties of any quantum system (an environment governed by wave-particle duality) are not determined or specified until they are actually measured.

This assertion led to the principle of superposition, which, in simple words, holds that a quantum system exists in all of its possible states at the same time, until someone measures it and finds it in one exact state.

The famous Schrödinger's Cat thought experiment is an easy way to grasp this concept. A cat enclosed in a box (treated as a quantum system) with a vial of poison is considered both dead and alive simultaneously, until the box is opened and the cat is observed.

Use of Superposition to Develop Computer Algorithms 

Now, this is the point where the theory demonstrates its potential as the basis of a new kind of computing. To understand quantum algorithms, it helps first to understand how conventional computing systems work.

Whether it is a handheld gadget or a supercomputer in one of Google's server rooms, at its core every computing device works in binary. In conventional systems, every bit of information exists in one of exactly two states: 0 or 1 (hence 'binary').

Quantum algorithms, on the other hand, are inspired by the idea that any particle-wave system can exist in multiple states at any given time (the principle of superposition).

This means that when data is stored in a quantum system, each quantum bit (or 'qubit') can hold a combination of states rather than a single 0 or 1. This is what makes qubits more powerful (and, for now, far more expensive) than conventional computing bits.
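
To make the idea concrete, here is a minimal sketch in Python of a single qubit held in superposition and then measured. This is a classical simulation, not real quantum hardware; the amplitudes and the measurement rule follow standard quantum mechanics, and everything else is illustrative.

    import random

    # A qubit is described by two amplitudes (a, b) with |a|^2 + |b|^2 = 1.
    # |a|^2 is the probability of measuring 0; |b|^2 of measuring 1.
    a, b = 2 ** -0.5, 2 ** -0.5  # equal superposition: 50/50 chance of 0 or 1

    def measure(a, b):
        # Measurement collapses the superposition to one definite outcome.
        return 0 if random.random() < abs(a) ** 2 else 1

    print([measure(a, b) for _ in range(10)])  # e.g. [0, 1, 1, 0, 0, 1, ...]

Until the call to measure, the qubit's description genuinely contains both outcomes at once, which is exactly the superposition that the rod analogy below tries to capture.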

Standard Binary Computing Vs. Quantum Computing 

Abstract binary code pattern: 4 rows of 8 bits = 4 bytes. Photo: iStock

The fact that a quantum bit can exist in multiple states gives quantum computing an uncontested edge over conventional binary computing. With a simple example, we will demonstrate how superior quantum computing could be compared to its classical counterpart.

For example, picture a cylindrical rod, where each end represents a bit: one end is 1 and the other is 0. That's it! When one end is a 1, the other must be a 0. There is no in-between.

A quantum bit, on the other hand, exists in every possible state simultaneously: every point on the surface of the rod represents a possible state of the qubit.

This simple picture shows how quantum bits can hold an unprecedented amount of information, and why computing built on them could far exceed the processing of any classical machine.

Apart from storing more information than classical bits, quantum computing can also exploit the principle of entanglement. In simple words, entanglement links the states of qubits so that they remain correlated with one another no matter how far apart they are, which multiplies the processing capability of a quantum computer even further.
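
Entanglement's measurement statistics can also be sketched classically. For the Bell state (|00⟩ + |11⟩)/√2, the two qubits always give the same answer when measured, no matter how far apart they are. Here is a toy Python sampler of just those statistics (the state is a textbook one; the sampler itself is our illustration):

    import random

    # Bell state (|00> + |11>)/sqrt(2): outcomes 00 and 11, each with probability 1/2.
    def measure_bell_pair():
        outcome = random.choice([0, 1])  # one random draw decides BOTH qubits
        return outcome, outcome

    print([measure_bell_pair() for _ in range(5)])
    # e.g. [(0, 0), (1, 1), (1, 1), (0, 0), (0, 0)] -- perfectly correlated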

Beneficial Uses of Quantum Computing

The supreme processing capabilities of quantum computers make them ideal machines for many tasks where conventional computers lag behind.

Science and Life Sciences 

The study of complex atomic and molecular structures and reactions is no mean task. A lot of computing capacity is required to simulate such processes.

For instance, the complete, exact simulation of anything but the very simplest molecules is beyond available conventional computing technology. Quantum computing could therefore play a significant role in uncovering many of nature's concealed facts, particularly about life itself. Significant chemical, physical, and biological research programs that have stalled for years could take off once quantum computers arrive.
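
A rough back-of-the-envelope calculation shows why exact classical simulation gets out of hand so quickly: describing a quantum system of n two-level particles requires tracking 2^n complex amplitudes. A sketch, assuming 16 bytes per amplitude (two 64-bit floats):

    # Memory needed to store the full quantum state of n two-level systems.
    for n in (10, 30, 50):
        amplitudes = 2 ** n
        gigabytes = amplitudes * 16 / 1e9
        print(f"{n} qubits -> {amplitudes:.2e} amplitudes, {gigabytes:.2e} GB")

    # 10 qubits fit in ~16 kB, 30 qubits already need ~17 GB,
    # and 50 qubits need ~18 million GB -- beyond any conventional machine.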

Artificial Intelligence and Machine Learning 

Artificial intelligence illustration. Image by Tumisu from Pixabay

Even though scientists have made significant inroads into machine learning and AI with existing computing resources, quantum computing could deliver the progress we have always aspired to: a machine approaching human cognition. Machine learning feeds on big data; developing any machine-learning system means processing humongous datasets.

With the fast processing of quantum computers, even everyday AI will become more streamlined. Beyond that, the sheer computing power of quantum devices could transform the development of artificial intelligence.

Improvement of General Optimization Procedures 

In today's bustling life, we feel the need for optimization more than ever, whether in personal or commercial dealings. Whether it is an individual finding the best commute between day-to-day destinations or a financial firm tailoring a plan to each unique customer, good optimization requires accounting for many variables at once.

As more variables are added, the number of permutations and combinations goes up, and the amount of data to process grows exponentially; optimizing a single financial plan might require processing several petabytes. Bringing such extensive optimization into everyday activities can only be achieved with the processing power of quantum computers.
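
To see how quickly the search space explodes, consider a simple routing problem: with n stops to visit, there are n! possible visit orders. A quick sketch:

    from math import factorial

    # Number of possible orderings (routes) for n stops.
    for n in (5, 10, 20):
        print(f"{n} stops -> {factorial(n):,} possible routes")

    # 5 stops  -> 120
    # 10 stops -> 3,628,800
    # 20 stops -> 2,432,902,008,176,640,000 -- hopeless to check one by one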

Other Side of the Coin: The Dangers Involved with Quantum Computing 

This heading should come as no surprise. History shows that the advent of any new technology intended for the benefit of humankind is followed by its misuse, and quantum computing is no exception. To make matters worse, the unrestrained processing power a quantum computer can harness could make its exploitation all the more dangerous. It is worth noting that researchers working in the domain are well aware of these unwanted repercussions.

Quantum Computing Puts Data Encryption Practices in Great Danger

Digitization of our everyday activities has shifted nearly every valuable piece of information into the digital form of data. From nuclear codes to personal banking information, everything now exists in the form of digitized data. For that matter, data is now considered a precious commodity. 

And as we know, every precious commodity is vulnerable to vandalism, breaches, and theft. To address this vulnerability, computer scientists have developed encryption schemes that lock down data so that only authorized parties can access it.

Encrypted data can normally be unlocked only with the corresponding decryption key, which is designed and held by the developers. Without the key, an unauthorized party's only real option is brute-force cracking: systematically trying every possibility. It is important to note, though, that brute force is only practical against simple passwords and basic encryption with small key sizes.

Let’s try to understand this with the help of numbers. 

According to researchers' calculations, a supercomputer would need more than a billion billion years to crack data protected by what is called a 128-bit encryption key. To put that into perspective, the universe itself is only about 13.75 billion years old. So it is effectively impossible for a standard 128-bit key to be cracked by brute force using a conventional binary computing system, which has only two possible states per bit.
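
You can reproduce the flavor of that estimate yourself. Assuming a classical machine that tests one trillion keys per second (a generous assumption of ours, not a benchmark), exhausting the 128-bit key space would take:

    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    keys = 2 ** 128               # ~3.4e38 possible 128-bit keys
    guesses_per_second = 1e12     # assumed: one trillion guesses per second
    years = keys / guesses_per_second / SECONDS_PER_YEAR
    print(f"{years:.2e} years")   # ~1.1e19: about ten billion billion years

Even at that implausible guess rate, the answer comes out on the order of ten billion billion years, which is where the 'billion billion' figure above comes from.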

But when we replace this two-state bit with a quantum bit that can hold many states at once, the tables are turned.

The 128-bit key that is so formidable against the brute force of classical binary supercomputers could fall flat when quantum computing is used to run the attack. No quantum machine capable of this exists today, but experts have estimated that a mature quantum computer could crack a 128-bit encryption key within about 100 seconds. Compare that to the billion billion years it would take a binary computer to crack the same code!

Aftermath 

The aftermath of such a scenario would be nothing short of a technological dystopia. With data encryption rendered ineffective, everything would be exposed to criminal elements. To grasp just a fraction of the devastation, imagine every person on earth connected to the banking system losing access to his or her account. The mere thought of it sends chills down the spine.

Beyond that, the neutralization of data encryption could trigger cyber warfare between nation-states, with rogue elements easily capitalizing on the chaos. A global conflict in a world of nuclear powers could end in a dreadful outcome. All things considered, the arrival of quantum computing could bring many irreversible repercussions.

Preparation to Protect Against the Nefarious Use of Quantum Computing 

Google and IBM have already carried out quantum computations successfully in controlled environments, so it would be unwise to dismiss quantum computers as a distant prospect. Businesses should start preparing against the abuse of quantum computing now; there is no point in waiting for formal rules and protocols to be issued. Experts in digital security and cryptography already recommend measures to protect business data from exploitation in the quantum era.

Conclusion 

The way technology has progressed over the last few decades clearly indicates that quantum computing is the reality of the future. The arrival of quantum computers is not a question of 'if' but of 'when'.

For all its benefits to the life sciences, the financial sector, and AI, quantum computing poses a great threat to existing encryption systems, which are central to protecting every type of confidential data. The proper approach for any nation or business is to accept this unwanted aspect of quantum mechanics as a technological hazard and to start preparing against it with the help of experts.

That said, quantum computing will also be a blessing when used proactively for the benefit of humankind, and we look forward to a better lifestyle for each of us when it becomes a reality.

Units of Power and How They are Related to Electricity

Before we learn about kilowatts and kilowatt-hours, let’s get a jump start (pun intended 😅) on what these terms mean.

The Units of Electrical Power

Note: If you are not a physics enthusiast and want to skip the physics of electrical energy, you can jump ahead to the plain-English section below ('Just Tell Me in Plain English What a Watt Is!').

Let's travel in our wayback machine and go back to high school physics 101. These terms and measurements are for background purposes only; we will not be using them later on, but understanding these concepts will help you comprehend how power is referenced in units of watts (W) and how it is calculated. Let's do it!

Speed

The rate at which an object moves along a path: distance per unit of time.
Units: length/time
Example: The car traveled 1 mile in 60 seconds, or 1 mile/minute.
Further Reading: What is speed in physics?

Velocity

The rate at which an object moves along a path in a particular direction.
Units: length/time (speed) in a particular direction.
Example: The car traveled 1 mile/minute going west.
Further Reading: What is the difference between speed and velocity?

Acceleration

Acceleration is the rate at which velocity changes; in other words, the velocity does not stay constant.
Units: length per second per second, e.g., feet/second squared or meters/second squared.
Example: A plane traveling south accelerates from 550 mph to 600 mph over a period of 40 seconds: its velocity changes by 50 mph over those 40 seconds.
Further Reading: Speed, velocity, and acceleration.

Newton

Here we add a new component: force. When we talk about newtons, we are talking about the force needed to accelerate an object (remember, acceleration just means a change in velocity).

Illustration of one newton. By Mhermsenwhite – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=70624309

One newton is the force needed to accelerate one kilogram of mass at a rate of one meter per second squared in the direction of the applied force. Simply put, it is the amount of push needed to make a one-kilogram object speed up by one meter per second, every second.
Units: 1 kg·m/s²
Example: Joe pushes a one-kilogram box so that it accelerates at 1 m/s²; he is applying a force of one newton.

Joule

Joules measure the amount of work done. One joule is the work done by a force of one newton moving an object one meter; so when Joe pushes the one-kilogram box, accelerating it at 1 m/s² over a distance of one meter, he has done one joule of work. The joule is also the unit of energy.

Say Watt?

A watt is the number of joules an electrical device (e.g., a lightbulb) is using per second. Joules measure energy (work); watts measure the rate at which that energy is used. The two are related, but they are not interchangeable.

Here is the connection:

1 watt = 1 joule per second (1 W = 1 J/s), so a watt is the amount of energy (in joules) that an electrical device (such as a light) uses each second. If a device is burning 500 watts for 60 seconds, the energy used is 500 × 60 = 30,000 J. Likewise, if an air conditioner burns 1,000 watts for 1 hour (60 min × 60 sec = 3,600 seconds), that equates to 1,000 watts × 3,600 seconds = 3,600,000 joules of energy used in that hour.

A kilowatt is equal to 1,000 watts, so 1 kWh represents the energy transferred over one hour at a power output of 1,000 watts (i.e., 1,000 joules per second). Thus 1 kWh equals 3,600,000 joules of energy transfer (work).
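
The arithmetic above is easy to check for yourself. Here is a short Python sketch that converts a device's power draw and running time into joules and kilowatt-hours:

    def energy(watts, seconds):
        # Energy in joules is power times time; 1 kWh = 3,600,000 J.
        joules = watts * seconds
        return joules, joules / 3_600_000

    print(energy(500, 60))     # (30000, 0.0083...) -- the 500 W example above
    print(energy(1000, 3600))  # (3600000, 1.0)     -- 1,000 W for an hour = 1 kWh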

What Does This Mean?

It means that one watt is one joule of energy being delivered to the device every second, carried by the electrons being pushed through the wire.

Just Tell Me in Plain English What a Watt Is!

Consider this to be a one-watt light bulb. If it were a two-watt bulb, it would be about twice as bright. If it were a 500-watt bulb, more power (more current and/or voltage) would be needed to provide that wattage, and up goes your electric bill! See how it works? Photo by LED Supermarket

Glad you asked. A watt is the unit of power, and power equals voltage times current: W = E × I (don't worry, you don't have to memorize this formula). The more voltage and/or current flowing through the wire, the more power (watts) is used to run the device.

Let’s Talk About Time

Devices run for a period of time, right? So we have to add this value to our watt calculations. That way, we will know how many watts are used for a certain period of time, and as we will see later, this will help us determine what it costs to run electrical devices, or more specifically, what the electric company charges us and why.

Examples: Joe turned on a one-watt lightbulb for 60 seconds. That is 1 watt × 60 seconds = 60 watt-seconds, or 60 joules of energy.

Now Joe turns on a 250-watt lightbulb for 2 minutes. That is 250 watts × (2/60) hours ≈ 8.3 watt-hours.

(Remember, for you physics folks: 8.3 watt-hours is the same as saying that 250 W × 120 s = 30,000 joules of energy were used.)

We'll be going into this in another article, but just to enlighten you: if your electric company charges 14.34 cents per kilowatt-hour (roughly what they charge in New York), then, using the example above, you have paid the company 14.34 cents × 0.25 kW × 0.0333 h ≈ 0.12 cents for those 2 minutes.

If Joe ran the 250-watt bulb for 1 hour, he would pay about 3.6 cents; if Joe ran a 1,000-watt device for 1 hour, he would pay 14.34 cents.

OK, but if Joe ran the 1,000-watt device for 10 hours, he would owe the energy company 143.4 cents, or about $1.43.

OK, forget about Joe. What if your electric company charges 14.34 cents per kWh and you run a 2,000-watt air conditioner? You would be paying about 29 cents per hour, so running it 10 hours a day costs about $2.90 per day. That's $29.00 every 10 days, or close to $90 per month.
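
All of these examples follow one formula: kilowatts × hours × rate. Here is a small sketch (the 14.34 cents/kWh rate is the New York figure quoted above; substitute your own utility's rate):

    def cost_cents(watts, hours, rate_cents_per_kwh=14.34):
        # Cost = kilowatts x hours x cents-per-kWh.
        return (watts / 1000) * hours * rate_cents_per_kwh

    print(cost_cents(250, 2 / 60))  # ~0.12 cents: the 250 W bulb for 2 minutes
    print(cost_cents(1000, 1))      # 14.34 cents: a 1,000 W device for an hour
    print(cost_cents(2000, 10))     # ~287 cents: the AC example, about $2.90/day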

Say 1000 Watts!

Are you getting tired of hearing about thousands of watts? This author is too, so let's call 1,000 watts one kilowatt (kW). There you go. 'Kilo' means 1,000, so 1 kW is 1,000 watts.

If you run a 1,000-watt device for 1 hour, that is 1 kilowatt-hour, denoted kWh (you can also say a 1-kilowatt device ran for one hour). If a unit consumes 60 watts and runs for 60 hours, the energy consumed is 60 watts × 60 hours = 3,600 watt-hours, which equals 3.6 kWh of electricity.

OK, we know: you want to know what it costs to run the electrical devices in your home, probably starting with your air conditioner. Let's just say a typical air conditioner uses about 3 kWh per day. To find what that costs you, call your local energy company for the correct rate; for our area, Nassau County, the cost is 7 cents per kWh. If you want to know more about your air conditioner costs, check out our related article.

Gas Cars Vs. EV Cars – What You Need to Know!

Austin, Texas, Feb. 1, 2021: Tesla Model 3 charging at home in front of a house on an L2 charger. Photo: iStock

Why Electric?

There are a number of benefits to driving an electric vehicle (EV). One is the cost savings on gas; another is the environment. We will concentrate on the former here and talk about the environment in a separate article.

Before we start discussing how EV costs are calculated, make sure you have read our articles on the atom, electric current, and Units of Power and How They are Related to Electricity. If you haven't, no worries; you can skip them and just see our final calculations below.

Review

Here’s a brief overview for those who didn’t read the articles mentioned above.

    • Electrons are subatomic particles (one of the entities within an atom) that travel through a wire when power is applied (the wire is attached to an electrical socket). This flow is known as electrical current and is measured in units of amps. More on this here.
    • Voltage is the force that pushes the electrons through the wire, similar to turning up the pressure of a water faucet.
    • Current usually flows through a copper wire, which is the conductor; the wire is covered by an insulator (a rubber coating around the wire so that the copper is not bare).
    • Resistance is the opposition to the current (electrons) flowing in an electrical circuit. Think of it as friction rubbing against the current as it flows.
    • A watt is the unit of the power that runs an electric device. It is the product of how much current is flowing and how much voltage (push) is applied, determined by multiplying voltage times current: P = E × I (P = power in watts, E = voltage, I = current).
    • A kilowatt (kW) is 1,000 watts.
    • A kilowatt-hour (kWh) is 1 kW running a device for 1 hour.

Example: If you run an air conditioner for one hour and it draws 70 kilowatts of power, then you have used 70 kilowatt-hours of electrical energy in that hour. Run it for two hours and you will have used 140 kilowatt-hours.

Most EVs, with the exception of high-end luxury models, have batteries with a capacity of about 60-65 kWh. Sparing you the formula, a battery of this size equates to roughly 260 miles on a full (100%) charge.

Note: Most EVs are set to charge to only 80%, because constantly charging to 100% shortens the battery's lifetime. An 80% charge of a 65 kWh battery equates to about 230 miles.

How Do Kilowatts Equate to Electrical Costs?

High-voltage transmission towers with red glowing wires against a blue sky. Photo: iStock

Here’s the breakdown.

We will use a 2021 4-cylinder Nissan Altima as our example.
Gas tank size: 16.2 gallons
MPG: 31 (average)

If we multiply 31 miles/gallon by 16.2 gallons, we get the total mileage this car can cover on a full tank of gas: about 502 miles.

As of this writing, the price for a gallon of gas is $5.00 on average across the United States. So $5.00 * 16.2 gallons (a full tank) equals $81 to fill up.

Electric Vehicles

For EVs, we calculate energy used per mile instead of miles per gallon. For this example, we will use a 2020 Kia Niro EV, which is a fully electric vehicle with a 65 kWh battery.

As mentioned, charging a 65 kWh EV to the industry-standard 80% yields about 230 miles of range. If you have an EV, never let the battery go below 30%, as you may run into trouble if you are on the road and can't find a charging station.

Let’s review what we know so far:

    • Filling up a gas tank of a 2021 Nissan Altima will take you about 502 miles without having to fill up again.
    • The cost to fill up this car as of this writing is $81.00.
    • Charged to 80%, a 2020 Kia Niro can go about 230 miles without having to recharge.

Cost of Charging an EV

We need to add the cost of electrical use in the home, and for this example, we will use the costs from PSEG of Long Island (PSEGLI), New York, which powers Nassau County where the offices of Howard Fensterman are located.

Electrical power companies charge per kWh, so this is how we will proceed to determine how much it costs to charge your EV at home.

As the expression goes, "If you read it on the Internet, it must be true!" Well, we read it on the Internet, and we found electrical costs ranging from 14 cents all the way up to 22 cents per kWh.

Then we decided to do something smart: why not call PSEGLI directly? So we did. The information we were given was well explained, along with a booklet they sent us that breaks down all the costs.

Understanding Your Electric Bill

Your specific electrical costs are determined by where you reside and which electric utility plan you have. While the brochure explains how the costs are calculated, we recommend referring to your own electric bill to determine your precise costs.

Below you will find the bill in part from a home in Nassau County, New York.

Portion of an electrical bill for a household in Nassau County, NY. Photo: SS

The Components of Calculating Electrical Costs

Note: As mentioned, we are using Nassau County, New York as our example. Other locations may vary. We recommend you contact your local electric utility company for specific pricing.

There are two components (refer to the brochure, which explains in detail why and how they are calculated):

    • Delivery Charge: We will use the higher price listed, which is $0.1152, or 11.52 cents, per kWh.
    • Supply Charge: $0.130715, or 13.07 cents, per kWh.
    • Taxes (to keep things simple, we will not include taxes in the calculation)

The electrical cost to charge an EV from home is therefore about 24.59 cents (11.52 cents + 13.07 cents) per kWh, roughly 25 cents rounded off.

To calculate the costs to charge a 65kWh vehicle to 100%, we do the following.

    1. Calculate the delivery charge: $.1152 * 65 kWh = $7.488
    2. Calculate the supply charge: $.130715 * 65kWh = $8.4965

The total cost of charging a 65 kWh battery to full capacity from home is about $15.98.

But we are only charging this device (your EV) to 80% of the 65 kWh, so that results in $15.98 × 0.80 = $12.79, or about $13 rounded off.

To charge a 65 kWh EV to 80% capacity (about 230 miles) costs roughly $13 from a home level 2 charger.

Note: It can take up to four hours to charge an EV using a 220-volt level 2 charger.
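
The whole home-charging calculation fits in a few lines. Here is a sketch using the PSEGLI rates quoted above (plug in the numbers from your own bill):

    DELIVERY = 0.1152    # $/kWh, PSEGLI delivery charge used above
    SUPPLY = 0.130715    # $/kWh, PSEGLI supply charge used above

    def charge_cost(battery_kwh, charge_fraction=0.80):
        # Dollars to charge a battery to the given fraction at home.
        return battery_kwh * charge_fraction * (DELIVERY + SUPPLY)

    print(round(charge_cost(65), 2))  # 12.79 -> about $13 for an 80% charge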

Selective Plans

Most electric utility companies provide more than one plan that you can select for your household. Besides the default plan which provides a standard price for electric consumption throughout the day and night, there is a plan that can allow you to select lower rates at different times of the day.

This plan, called Time of Use (TOU), is available from PSEGLI and many other utility companies nationwide. Refer to the brochure for exactly how it works.

Proportion 

We will now take this $13 cost and compare it to the cost of the gas a conventional car needs to cover the same mileage (230 miles).

Here are the steps:

    • Divide the EV's range on an 80% charge by the gas car's range on a full tank to get the ratio between the two:

230 mi / 502 mi ≈ 0.46
So 230 miles is about 46% of 502 miles

    • Multiply this percentage by the total cost to gas up the car:

To get the cost for a conventional car to go 230 miles, we multiply the cost to fill up the gas tank ($81.00) by 46%: 0.46 × $81 ≈ $37.

Using an average of today's gas prices ($5.00 as of today), it would cost a gas car about $37 to go 230 miles of highway driving, while an EV would cost $13 to go the same distance in Nassau County, New York.

That's a savings of about $24 for every 230 miles you drive in a typical EV in Nassau County, NY, at a gas price of $5.00 per gallon.
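
The full gas-versus-EV comparison, using the figures above, comes down to cost per mile:

    # Figures from the examples above (Nassau County, NY; $5.00/gal gas).
    gas_per_mile = 81.00 / 502   # full tank cost / full-tank range
    ev_per_mile = 13.00 / 230    # 80% charge cost / 80%-charge range

    miles = 230
    gas, ev = gas_per_mile * miles, ev_per_mile * miles
    print(f"gas ${gas:.2f}  ev ${ev:.2f}  saved ${gas - ev:.2f}")
    # gas $37.11  ev $13.00  saved $24.11 per 230 miles

Per mile, that is roughly 16 cents for gas against under 6 cents for electricity at these prices.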


Conclusion

If you are looking to save money on gas, EVs are the way to go. Yes, these vehicles are more expensive than conventional gas cars, but at $5.00+ a gallon, you will be pleasantly surprised at how quickly the savings accumulate.

Finally, we leave you with this. Below is a copy of the estimated charges accrued in July 2022 for an 1,100-square-foot home with an EV in its garage in Nassau County, NY. The family charges the car to its 80% capacity about three to four times per month. Notice that the Electronics category is only 10% of the total usage in the house. Something to think about!

Copy of estimated charges from PSEGLI for a home in Nassau County. Photo: SS

What is Voltage and Electrical Current? (A Brief Guide)

High-voltage transmission towers with red glowing wires against a blue sky. Photo: iStock

Electrical current is the measure of electrical flow. It is measured in amperes, or amps for short. Current refers to the amount of electric charge (carried by electrons) that passes a point in an electrical conductor each second; small currents are often given in milliamps (mA) or microamps (μA). This article explains what electrical current is and how it works. Keep reading to learn more!

How Does Electrical Current Work?

Electrical current travels through a wire (a conductor) to reach a device (e.g., a light bulb), which powers the device on. The complete pathway the electrons follow, from the source through the device and back, is called a circuit: the path along which an electrical current flows from the source to the load.

Copper cables surrounded by rubber insulation; the copper wire is the pathway from the source to the load. Photo: iStock

There are three basic parts to a circuit:

  • The “source,” or “sourcing device,” is where the electrons come from. This can be a battery, a generator, or the flow of electricity from a wall outlet. 
  • The “load,” or “dumping device,” is where the electrons go after completing the circuit. This could be a light bulb, an appliance, or some other device. 
  • The “pathway,” or “wiring,” is the middle part that brings the electrons from the sourcing device to the dumping device. The wiring is almost always made of copper, iron, or in electronic devices, a semiconductor. The current can only flow when the circuit is complete. When the circuit is broken, the current stops.

What Is Electrical Conductivity?

Electrical conductivity is the ability of a material to allow an electrical current to flow through it; the term describes the extent to which a material permits that flow. A material with high conductivity, such as copper, lets electrons flow freely through the wire, while a material with low conductivity, such as rubber, inhibits the electron flow. That opposition is known as resistance.

The harder it is for the electrons to flow, the more resistance the material has. That is why rubber, with its high resistance rating, is used to insulate the copper wire in almost all wiring that carries electric current.

Wood and glass are two materials with very low conductivity ratings; you would never use wood to complete an electrical circuit or connect a battery. On the other end of the spectrum, copper is one of the most conductive materials around, which is why so many wires and cables have copper cores.

Besides the type of material used, electrical conductivity can be affected by a number of factors, such as temperature and the presence of contaminants like dust and water.

What is Voltage?

Turn on your water faucet about a quarter of the way and place a cup under it. Notice how fast (or slow) the water is running to fill the cup. How long did it take?

Now turn the faucet to make the water run faster. When you do this, the water fills up the cup sooner. 

This is your voltage (or rather, its equivalent). The faster the water comes out, the more force, or pressure, is behind it. In electricity, voltage is that pressure: the higher the voltage, the more current is pushed through the wire to power an electrical device. The bulb lights up faster, not that you would notice, since it happens so quickly, but that is what happens.

Ohm’s Law

Ohm's law states the relationship between voltage, current, and resistance in a conductor: voltage equals current times resistance, or E = I × R. In other words, the voltage across a wire reflects both how much current flows through it and how much resistance that current meets.
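
In code form (a toy sketch; units are volts, amps, ohms, and watts):

    def voltage(current_amps, resistance_ohms):
        # Ohm's law: E = I * R.
        return current_amps * resistance_ohms

    def power(voltage_volts, current_amps):
        # Power: P = E * I, in watts.
        return voltage_volts * current_amps

    e = voltage(2, 60)     # 2 A through 60 ohms -> 120 V
    print(e, power(e, 2))  # 120 V at 2 A -> 240 W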

Types of Electrical Current

There are two basic types of electrical current: Direct Current (DC) and Alternating Current (AC). Direct current is a constant flow of electrons that always moves in the same direction; it is provided by batteries and solar cells. Alternating current periodically reverses direction and is what the power grid delivers to wall outlets. Converting DC to AC requires a device called an inverter (and AC to DC, a rectifier), while transformers are used to step the voltage of AC electricity up or down.

Summary

Electrical current is the flow of electrons through a conductor. Current flows only when a circuit is complete: a pathway along which electrons travel from the source to the load. A circuit has three basic parts: the source, where the electrons come from; the load, where the electrons go after completing the circuit; and the pathway, the wiring that carries the electrons from the sourcing device to the dumping device.

There are two basic types of electrical current: Direct Current (DC), which always flows in one direction, and Alternating Current (AC), which reverses direction periodically. Converting between the two requires an inverter (DC to AC) or a rectifier (AC to DC), while a transformer changes the voltage of AC electricity.