Offshore Oil Drill

Offshore Drilling Units allow drilling for oil underwater.

History

Early patents describe over-water drilling wells that never worked. Thomas Rowland filed an 1869 patent for a “submarine drilling apparatus.” Limited records show submerged oil wells operating as early as 1891 in Grand Lake St. Marys, Ohio.

However, it wasn’t until 1894 that offshore exploration started in earnest, when Henry Williams began exploring for oil along the Santa Barbara coastline. Early wells were promising, but Williams theorized that more oil lay offshore. In 1896, he built a long pier and mounted an oil pump on top of it, creating the first offshore drilling unit. The well was productive, and soon almost two dozen oil companies were pumping oil from 14 piers along the California coast.

By 1911, Gulf Refining Company had switched from piers to tugboats and barges in Louisiana. When they found a productive well, they’d build a floating platform anchored to the underlying seabed. Separately, Walter Pyron noticed bubbles in a Texas lake; he and his associates realized the bubbles were flammable. In 1911, they drilled to a depth of 2,185 feet (666 meters) and struck oil. Their well produced 450 barrels of oil per day.

Offshore production stalled during WWI as the United States focused on the war effort. By 1938, however, Pure Oil and Superior Oil had built freestanding drilling platforms in the Gulf of Mexico, predecessors of the modern rig. That oil field contained four million barrels. In 1947, Kerr-McGee built a platform 10 miles (16 km) out to sea; despite the distance, the water was only 20 feet (6 meters) deep. The well produced 40 barrels per hour, setting off an offshore drilling boom.

Larger Rigs

Ever larger rigs were built in ever deeper water to feed an unquenchable thirst for oil. However, the rigs are dangerous, and there have been a relatively large number of fatalities. Offshore rigs can also cause catastrophic environmental damage. The BP Deepwater Horizon spill leaked 4.9 million barrels of oil, polluting hundreds of miles of otherwise pristine coastline. Among other effects on sea life, infant dolphin deaths increased six-fold. As of 2019, the spill had cost BP $65 billion USD.

Lithium-Ion Battery

Lithium-Ion Batteries (LIBs) power everything from smartphones to power tools and electric cars. Entire cities store solar power generated during the day in large lithium battery arrays for use at night. One of the largest factories in the world, the Tesla Gigafactory, is devoted solely to manufacturing Lithium-Ion batteries.

Background

An ability to recharge relatively rapidly and deliver a steady stream of electricity without overheating makes LIBs the workhorse of portable power. The batteries are so popular that there are fears of an upcoming lithium shortage.

There were several iterations of lithium batteries and the technology is still evolving.

John Goodenough, a professor of engineering at the University of Texas at Austin, is the primary inventor of the battery. Goodenough has an interesting history. He graduated summa cum laude in mathematics from Yale. After WWII, the army ordered him back to the US to attend graduate school in physics.

Goodenough spent his life in academia but, in the 1970s, decided to focus on batteries. Like many, he was angered by the OPEC-led energy crisis and concerned about the smog and pollution caused by internal combustion engines. A strong enough battery could power cars, he reasoned, and he switched his attention to battery technology.

Patent Wars

Goodenough was 57 years old in 1980 when he invented the Lithium-Ion battery. Sony commercialized the technology, and the new battery became a blockbuster. Despite the commercial success, Goodenough earned no royalties from his battery, though there was a massive patent battle among the various companies and people involved in creating and commercializing it.

In 2017, Goodenough, age 94, announced that he had created a successor solid-state battery that lasts longer, holds more energy, is more environmentally friendly (it contains no cobalt), and is safer than his original battery. As of 2019, he is still alive and still working on building a better battery.

Solar Cells

Solar Cells produce electricity from sunlight.

Early History

In 1873 and 1874, scientists noticed that selenium reacted with light to produce electricity, and during the 1870s William Adams and Richard Day proved that selenium exposed to light generated a current. By the late 1800s, the famous German scientist Werner von Siemens (founder of Siemens) was excited about the possibility of solar cells. In 1905, Einstein explained what made solar cells work: “light quanta,” which we now call photons.

By the early 1930s, scientists were enamored with solar cells: “…in the not distant future, huge plants will employ thousands of these plates to transform sunlight into electric power…that can compete with hydroelectric and steam-driven generators in running factories and lighting homes,” wrote German scientist Bruno Lange in 1931.

However, as the cells proved inefficient, interest waned. By 1949, scientists had all but given up hope on a reasonably efficient solar cell.

Five years later, in 1954, Bell Labs scientists Calvin Fuller and Gerald Pearson, working with silicon to create transistors, noticed that silicon could generate electricity from light.

Meanwhile, another Bell Labs scientist, Daryl Chapin, was tasked with generating electricity at remote locations. He experimented with selenium but faced the same problem earlier scientists had: an efficiency of just 0.5 percent. Pearson told Chapin he was wasting his time with selenium and to try silicon, which ran at a much higher 2.3 percent efficiency.

Bell Labs continued working on solar cells and, on Apr. 25, 1954, displayed a 21-inch Ferris wheel that ran on a solar-powered battery. The press loved the concept: unlimited, free energy from the sun.

More Recently

Solar cells have since fared both better and worse, becoming popular in the 1970s only to fade again, before reemerging in the 2000s as a viable source of electricity.

As of 2018, solar cells have efficiency as high as 22.5%. As efficiency increases and price decreases, solar is becoming one of the least expensive options to generate electricity; only wind energy costs less.
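To put an efficiency figure in concrete terms, here is a rough back-of-the-envelope sketch. The panel size and the standard “full sun” irradiance figure are illustrative assumptions, not numbers from the text:

```python
# Illustrative sketch: what a cell efficiency figure means in watts.
# Assumes the standard full-sun test irradiance of ~1000 W/m^2; the
# 1.6 m^2 panel area is a typical round number, not a quoted spec.

def panel_output_watts(area_m2, efficiency, irradiance_w_m2=1000):
    """Peak electrical output of a panel under the given irradiance."""
    return area_m2 * irradiance_w_m2 * efficiency

# A 1.6 m^2 panel at 22.5% efficiency delivers about 360 W in full
# sun, versus roughly 8 W for a 0.5% selenium cell of the early days.
print(panel_output_watts(1.6, 0.225))
print(panel_output_watts(1.6, 0.005))
```

The same arithmetic explains why early selenium cells never left the lab: at half a percent efficiency, a rooftop’s worth of cells produced barely enough power to matter.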

Catalytic Converter

Catalytic converters remove pollutants from engine exhaust and, because lead destroys the catalyst, require engines that run on unleaded fuel.

Houdry was a Frenchman working on high-octane fuels; his initial focus was race cars. Sun Oil sponsored the early work in the 1930s, moving Houdry to the US. The fuel work was a success but could not be used in mass production because the catalysts that enable high-octane fuel are destroyed by lead, which was then added to gasoline to prevent knocking at high compression.

Time went by (during WWII, Houdry was a vehement anti-Nazi). Eventually, it became clear to Houdry and others that leaded fuels caused serious environmental problems.

Houdry developed “catalytic cracking,” creating high-octane fuels via the use of a catalyst. Initially, fighter airplanes relied on these high-octane fuels, giving a substantial edge to Allied forces during WWII.

After the war, Houdry created and patented the catalytic converter, allowing engines to run on lead-free gasoline. He patented the innovation in 1956, with a 1952 invention date, assigning the rights to his company, the Oxy-Catalyst Company.

His patent expired Apr. 16, 1970, less than a year before the newly created US Environmental Protection Agency (EPA), established Dec. 1970, labeled leaded gasoline as a threat.

Subsequently, the United States and most European governments banned cars that required leaded gasoline in the 1970s.

Nuclear Power

One of the great physicists, Enrico Fermi won the Nobel Prize in 1938, at the age of 37. No sooner did he receive his prize than he fled from his home in fascist Italy to New York City, taking US citizenship.

Fermi and the other nuclear scientists convinced President Roosevelt that the Nazis could and would produce a nuclear bomb, which led the US government to grant them virtually unlimited funding.

On Dec. 2, 1942, Fermi’s reactor, built under a squash court at the University of Chicago, went critical, achieving the first self-sustaining nuclear chain reaction.

Fermi went on to work on the Manhattan Project, which developed nuclear weapons, and later served the Atomic Energy Commission.

Like many early nuclear scientists, Fermi died of cancer at the young age of 53.

In 1951, Walter Zinn connected a Fermi-style reactor to the rest of the equipment needed to generate electricity, creating the first working nuclear power plant.

Better Oil Drill Bit (Tricone Rotary Rock Drill)

A drill bit sounds relatively petty compared to the other inventions on this list. Granted, it’s not Watt’s condensing steam engine, Edison’s long-lasting lightbulb, Tesla’s induction motor, or the Wright brothers’ airplane. But the Hughes drill bit dramatically lowered the cost of drilling for oil and opened previously unavailable oil fields where reserves lay beneath rock. This enabled low-cost fuel for the burgeoning auto industry.

Howard Hughes, Sr. partnered with Walter Sharp in 1902 to drill for oil in Texas. Like everybody else, they became frustrated that their drill bits kept breaking. They began working on a better bit in 1906, achieved a dramatically better design in 1908, and patented it in 1909. Eventually, they quit drilling to start their drill bit company, the Sharp-Hughes Tool Company. Sharp died in 1912 and Hughes bought out his interest. Howard Sr. died in 1924 and his son bought out the interest he didn’t already own.

Hughes Jr., the billionaire, never showed an interest in the drill bit company; he used its cash flow to focus on movies, aircraft, and developing Las Vegas. Hughes Tool dominated the oil bit market in its heyday. Even today, it continues to hold a strong market share (after merging with Baker International in 1987 to become Baker Hughes).

Steam Turbine

In much the same way that Watt’s condensing steam engine vastly increased the value of Newcomen’s engine, the steam turbine vastly improved the value of Edison’s electric factory.

Steam turbines allow steam, generated by heating water, to efficiently turn generators, usually to make electricity. In addition to steam, water (ex: waterfalls) or wind (ex: windmills) can drive turbines.

Background

Watt’s and similar steam engines used the pressure of steam to move the engine. Simplifying greatly, pressure would build up to push a piston that drove a crankshaft, converting the movement into work, much as a river drives a water wheel. Typically, gravity (or, in later models, steam) would then return the piston so the process could repeat. These engines produced a lot of power but ran at slow speeds.

Electrical generation, however, requires less power but much higher speeds to turn the generator: it is steam velocity, rather than steam pressure, that spins a generator. Yet steam moves very fast, and harnessing it directly would quickly tear apart any system, especially one built of 1800s-era metals. No metal of the time could withstand the centrifugal force of a rotor spun at full steam velocity.

Parsons

In response, Parsons created a series of blade stages, each larger than the last, with each stage spinning at its own manageable speed. The spinning is slower but collectively more powerful because each stage captures part of the steam’s velocity as it expands. The steam turbine is still used today to generate electricity.
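The energy argument can be made concrete with a toy calculation. This is a sketch under idealized assumptions (equal energy extraction per stage, no losses, made-up round numbers), not Parsons’s actual design figures:

```python
import math

# Toy sketch (idealized, illustrative numbers): why splitting a steam
# jet across many turbine stages keeps each stage's speed manageable.
# Kinetic energy per kg of steam is v^2 / 2; if n stages each extract
# an equal share, the velocity equivalent of one share is v / sqrt(n).

def stage_velocity(jet_velocity_m_s, n_stages):
    """Velocity equivalent of the energy one stage must absorb."""
    energy_per_kg = 0.5 * jet_velocity_m_s ** 2   # J per kg of steam
    share = energy_per_kg / n_stages              # each stage's share
    return math.sqrt(2 * share)                   # back to m/s

print(stage_velocity(1200, 1))   # one stage must handle the full jet
print(stage_velocity(1200, 15))  # ~310 m/s per stage across 15 stages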

Jet engines use a similar system in reverse, where the engine turns a turbine that compresses air for thrust. Modern windmills also use turbines to generate electricity.

Parsons created the Newcastle and District Electric Lighting Company, an early power company (established 1889, about seven years after Edison’s New York power plant) and the first to use turbines to spin generators. Parsons’s turbine company still exists as a division of Siemens.

Gustaf de Laval created a different type of turbine, which used a nozzle to accelerate the steam.

Long Lasting Light Bulb

Edison’s bulb is well-known, but what’s less understood is the enormous infrastructure required to power it. Edison created a power plant in New York City, plus power cables, transformers, power meters, and insulators. When the lights finally came on at the New York Times building, it represented the end of a herculean undertaking and the beginning of a new era.

Background

At the simplest, Edison’s long-lasting bulb lowered the cost of doing things at night.

Countless people, dating back to 1802 (77 years before Edison’s bulb), invented various lightbulbs. Russian engineer Paul Jablochkoff lit up the Avenue de l’Opera in Paris using arc lights powered by an AC generator. American William Wallace used arc lights to illuminate his foundry. But arc lights were too bright for ordinary indoor use (they’d been used in lighthouses since the 1860s) and were dangerous, routinely throwing sparks.

Edison

Edison, by then already a well-known innovator (the “Wizard of Menlo Park”), invented the first bulb suitable for indoor use: safe, long-lasting, and neither too bright nor too dim. Edison’s low-cost bulb represented a revolution.

Edison realized that a series of centralized dynamos, rather than batteries, could create a continuous electrical current: an electricity factory. He also worked out that the key to electrical distribution, and to a practical lamp, was low amperage at (relatively) high voltage, which required less copper wire to power the system.
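The copper economics follow from basic circuit arithmetic. A sketch with made-up round numbers (not Edison’s actual system figures): for a fixed power delivery, current falls as voltage rises, and resistive line loss falls with the square of the current.

```python
# Illustrative sketch (round numbers, not Edison's actual specs):
# resistive loss in a distribution wire is I^2 * R. Delivering the
# same power at double the voltage halves the current and quarters
# the loss, which is why higher voltage needs less copper wire for
# the same delivered power.

def line_loss_watts(power_w, voltage_v, wire_resistance_ohm):
    current_a = power_w / voltage_v               # I = P / V
    return current_a ** 2 * wire_resistance_ohm   # P_loss = I^2 * R

# 10 kW delivered through a wire with 0.5 ohm of total resistance:
print(line_loss_watts(10_000, 55, 0.5))   # high current, heavy loss
print(line_loss_watts(10_000, 110, 0.5))  # half the current, 1/4 loss
```

Equivalently, at the higher voltage a thinner (higher-resistance, cheaper) wire achieves the same percentage loss, which is where the copper savings come from.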

“No Matches Are Needed…”

Edison’s Pearl Street Station came online Sept. 4, 1882.

Yesterday for the first time The Times Building was illuminated by electricity. Mr. Edison had at last perfected his incandescent light, had put his machinery in order, and had started up his engines, and last evening his company lighted up about one-third of the lower City district in which The Times Building stands. The light came on in sections. First there came in a series of holes in the floors and walls. Then several miles of protected wires, then a transparent little egg-shaped glass globe, and, last of all, the fixtures and ground glass shades that made everything complete.

The lamp is simplicity itself… To turn on the light nothing is required but to turn the thumbscrew; no matches are needed, no patent appliances. As soon as it is dark enough to need artificial light, you turn the thumbscrew and the light is there, with no nauseous smell, no flicker and no glare.

The New York Times, Tuesday, September 5, 1882.

Using a carbon filament made from burnt cotton thread, sealed inside an evacuated glass bulb, Edison created the bulb that would light, and change, the world.

Still, decades passed before Edison’s low-cost light bulbs became ubiquitous, due to the lack of a widespread electrical grid.

The Vanderbilt family and J.P. Morgan financed Edison’s work.

Oil Drill

Edwin Drake’s oil drill is one of the stranger stories, in a collection of innovation origin stories where strange is common.

The oil drill vastly lowered the cost and increased the efficiency of collecting oil. Before the drill, oil was usually collected in naturally formed pools at the ground surface. Most early oil was distilled into kerosene for lamps or home heating.

Initially hired by Seneca Oil to look for oil, Drake invented a rod-based drilling system, the forerunner of the modern oil drill. Before then, oil exploration involved digging holes, similar to water wells.

He drilled ever deeper. When he found nothing, Seneca eventually cut his funding.

People stopped by to laugh at him. He took out a personal loan to continue operating his steam-engine-driven drill, drilling deeper still.

Drake eventually discovered oil but failed to patent his methods or exploit the oil.

Countless people earned fortunes drilling for oil. Drake, however, fixated on the mechanics of his drill, not the business. He would have died in poverty, but the State of Pennsylvania awarded him a $1,500 annuity as a tribute for igniting a new industry. Oil barons also donated funds to support him.

Others claim to have invented oil drilling equipment before Drake. Their unverified claims were likely fabricated to obtain patents.

Matches

1827

Friction matches are ordinary matches. Strike them against a striking strip or, for some types, any hard surface and they start a fire. Friction matches were invented by Englishman John Walker in 1827.

It seems hard to believe that matches took so long to arrive; before Walker’s innovation, people would have to find an existing fire to start another. One predecessor of the match encased flammable chemicals in small glass beads that, when broken, would ignite wrapped paper. Since these could easily break by accident, and even when used correctly tended to erupt into flames, they were dangerous.

Other early attempts at matches used white phosphorus which, as every high-school student learns in chemistry class, ignites when exposed to air. That is fine in a chemistry lab but not so fine in an 18th-century wooden house.

Unlike the others, Walker’s match used tips coated with a potassium chlorate and antimony sulfide paste. Which is a long-winded way of saying his matches were inexpensive, safe, and easy to use: they only caught fire when struck in a specific way.

Already wealthy, Walker purposefully decided not to patent the match and released the innovation for the public good.

Other businesses eventually released safer and more reliable patented versions of the match. Most notably, Austrian Anton von Schrötter discovered red phosphorus in 1845, a form of phosphorus that does not spontaneously combust. Combined with a specialized striking surface invented a decade later, this became the safety match still in use today.

Before safety matches — or any matches, for that matter — people used flints or embers, which made lighting a fire far more difficult. It’s notable that despite the necessity of fire and its widespread use, matches are a relatively recent invention. Gas-powered lamps, invented in 1792, were widely used before matches existed; voltaic pile batteries were powering telegraph machines; and railroads were carrying people and equipment. All the while, people were still starting fires no differently than they had since caveman times.