RF-ID

RF-ID is the technology that allows a low-power or unpowered chip to communicate wirelessly. Toll-payment dongles in cars, contactless credit cards, and badges that open doors are all RF-ID applications.

RF-ID stands for Radio Frequency Identification and works by wirelessly transmitting what is essentially a barcode. There are two basic types: powered tags, which have their own power source and can transmit much further, and passive tags, which have no power source and are powered wirelessly by the unit that reads them. Passive RF-ID devices, including name badges and contactless credit cards, are more common. However, powered RF-ID devices, such as the fobs that unlock car doors and serve as keys or the transponders that pay tolls, have more uses. (As anybody who has ever lost a car remote knows, they are also far more expensive.)
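
To make the "wireless barcode" idea concrete, here is a minimal Python sketch. The badge IDs and door-controller logic are hypothetical illustrations rather than a real reader API: a passive tag, once energized by a reader, simply reports a stored identifier, which the system then looks up much as a cashier's scanner looks up a barcode.

```python
# Hypothetical illustration: a passive RF-ID badge reports a fixed identifier
# when energized by a reader; the door controller just checks that identifier
# against a list of authorized badges. No real reader library is used here.

def should_unlock(tag_id: str, authorized: set[str]) -> bool:
    """Return True if the badge that was just read may open the door."""
    return tag_id in authorized

# Made-up badge IDs for illustration only.
authorized_badges = {"04A3B2C1", "04FF1E90"}

print(should_unlock("04A3B2C1", authorized_badges))  # True  -> unlock
print(should_unlock("DEADBEEF", authorized_badges))  # False -> stay locked
```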

The first crude RF-ID systems were developed by the Germans during World War II, who found that rolling their planes a certain way changed the radio waves reflected back, identifying them to German radar operators as Axis rather than Allied planes. The Allies built on that idea, developing a transponder the planes themselves carried, an early "friend-or-foe" device.

Researchers continued building on these systems, with the most widespread use being anti-theft tags that trigger an alarm when carried out of a store. Despite the simplicity of the system, and despite dating back to the 1960s, this type of anti-theft device remains in widespread use.

In 1973, British-born American immigrant Charles Walton patented a device that wirelessly transmits a serial number to unlock doors. Walton would go on to patent many more RF-ID devices and is generally considered the innovator of modern RF-ID systems.

The US government created a system in the 1970s to track nuclear materials, including nuclear weapons, by "buzzing" a transponder in a truck as it drove by. Scientists who developed it eventually left to build the first automatic toll-payment system. Another US government agency, the Department of Agriculture, devised a passive, non-battery-operated system to track cows.

RF-ID technology continues to develop, with non-powered chips that operate at ever-greater distances. Animals, especially dogs, routinely have chips implanted under their skin so the owner can be identified if the animal becomes lost. A small number of people have done the same. Although the technology for people and dogs is essentially the same, there are obvious ethical differences.

RF-ID chips are used in supply chains, to track boxes for example, but are typically too expensive to replace barcodes on ordinarily priced individual items. However, researchers continue working toward the goal of shoppers simply walking out of a store with every item scanned automatically. In 2018, Amazon opened a store called Amazon Go that functions this way using cameras, sensors, and machine learning. However, Amazon Go does not use RF-ID chips, which would increase the cost structure.

In the future, the use of RF-ID chips is likely to dwindle as cameras and similar "seeing" devices increase in power and decrease in price. The chips, no matter how inexpensive, will always cost more than printed barcodes, which themselves may become obsolete as computers simply recognize items in much the same way that people do.

Ted Hoff’s General Purpose Microprocessor

“…even though science and technology are wonderful, what really gets them out there for people to use is to have businesses built around them. It takes savvy businessmen as well as savvy technologists to make that work.”

Ted Hoff

Background

Ted Hoff had access to then state-of-the-art vacuum tube circuits in high school. After graduating in 1954, he gained access to then-new transistors and magnetic core memory. He earned a bachelor's degree, then went to Stanford for graduate work, earning a Ph.D. in 1962.

During that time, he talked to Rex Rice, an on-campus recruiter for Fairchild Semiconductor, the company founded by the Traitorous Eight and funded by Doriot student Arthur Rock.

Hoff believed the new field of integrated circuits could work well for memory, replacing the clunky and relatively enormous core memory. Eventually, this led to a referral to Bob Noyce, who worked at Fairchild but was starting a new company, Intel. Noyce intended Intel to focus on semiconductor memory and was searching for somebody with Hoff's background.

Intel

In 1970, while waiting for the technology to mature, Intel decided to build one-off chips for the desktop calculator market. Eventually, Hoff was assigned to assist in building a chip for the Japanese company Busicom. At first, the Japanese engineers were expected to do all the work, with Hoff acting as a liaison and coordinator.

However, Hoff noticed the Japanese design was sub-optimal. There were about a dozen chips and the entire system appeared needlessly complex. Hoff raised his concerns to Noyce, who encouraged him to make a "backup" design.

Hoff's design incorporated random access memory and programmability. It was vastly simpler yet more powerful overall because it was programmable rather than single-purpose. After a meeting, Busicom adopted Hoff and Intel's design.

Federico Faggin joined Intel and refined Hoff’s idea, optimizing the general-purpose chip to take advantage of Intel technology. By January 1971, the team had a fully functional microprocessor.

The Microprocessor is Born

Their original goal was an embedded system, not a PC chip. Embedded systems are specialty chips that people never see; they make other machines work. The final chip, renamed the Intel 4004, contained between 2,100 and 2,300 transistors, depending upon how one counted. The 4004 was followed by the 8008 in 1972 and the 8080 in 1974. That last chip became the foundation of the Altair, the first microcomputer. The Altair inspired a young Bill Gates and Co. to start a software company and a young Steve Jobs and Steve Wozniak to form a computer company.

Reasonably Priced Business Computer (IBM/360)

The IBM/360 was the first mass-market computer, designed as a general-purpose machine affordable for mid-sized businesses yet powerful enough for large enterprises.

Background

In 1962, IBM's revenue was $2.5 billion. CEO Thomas Watson Jr. believed in the vision of a general-purpose computer that supported timesharing, in which a single computer serves multiple users and tasks at once. He invested a staggering $5 billion ($42.5 billion adjusted to 2019), double the company's annual revenue, to develop the IBM/360. More than 100,000 people scattered across 165 cities worked together to create it.

One key feature of the IBM/360 was forward and backward compatibility, along with upgradability. Before the IBM/360, businesses purchased a computer and, when they outgrew it, had to buy an entirely new one. In contrast, the IBM/360 could be expanded with additional peripherals, increasing its capacity. Additionally, a significant amount of older IBM software ran under emulation.

Prior to the IBM/360, computers were typically custom-tailored to the task at hand. Scientific computers were different from business computers. Even a computer that ran an accounting system was different from one that ran inventory management. Much as Intel would later do with the general-purpose microchip, IBM created a general-purpose computer.

The IBM/360 is one of the few computers that both sits in the Computer History Museum and remains in use, 55 years after its introduction. Even though the vast majority of smartphones contain more computing power and memory, a 360 oftentimes does one task, does it well, and has done it for decades. Businesses should move these tasks to newer computers, but the 360 is so reliable that migration is oftentimes a low priority.

Third-Party Peripheral Market

Besides forward and backward compatibility with other computers, IBM allowed third-party companies to create certified peripherals for the 360. While this idea seems common now, it was a groundbreaking experiment when the 360 launched. "Half a million saved is half a million earned," read advertising from third-party peripheral makers selling low-cost, high-quality add-ons.

Success

The IBM/360 was incredibly successful; IBM was unable to keep up with orders for years. Eventually, even the Soviet Union copied it, naming its System/360 knockoff the "Ryad." By 1989, the 360 and its successors accounted for more than $130 billion in annual revenue.

Cordless Tools

In 1895, C&E Fein, a German company, invented the first electric tool. It was a handheld drill weighing 16.5 pounds. The drill was underpowered because it ran on DC electricity. It also required two people to operate.

In 1910, Duncan Black sold his car for $600 and used the funds to open a machine shop in Baltimore. His friend and business partner, Alonzo Decker, joined the venture.

Their first project involved improving the C&E Fein electric drill. Inspired by the Colt pistol's handle, they envisioned a power drill with a pistol grip, small enough for one hand. The 1916 Black & Decker power drill was vastly lighter and stronger, and it required only one person to operate.

At first, Black & Decker only sold their power tools to other businesses. Eventually, they realized the consumer market was also interested in the convenience of power tools and built their business-to-consumer channel. By the early 1920s, the company was advertising power tools in popular newspapers and magazines.

Eventually, other companies created their own power tools, and over time power tools became the norm.

In 1961, Black & Decker took the innovation one step further and invented cordless power tools. Like the original C&E Fein drill, the first cordless power tools were heavy and underpowered. However, even with these drawbacks, the benefits were obvious.

In 2005, the Milwaukee Electric Tool Company released the first lithium-ion tools. These changed the industry, making cordless tools powerful, long-lasting, and easy to use.

Today, virtually every tool imaginable runs on batteries. Drills, saws, sanders, chainsaws, and even lawnmowers utilize battery-driven electric motors.

Electronic Desktop Calculator

Desktop calculators pioneered the idea of computing machines small and cheap enough to sit on an individual's desk. Eventually, they also became the impetus for the general-purpose microchip.

History

The first desktop electronic calculators were the ANITA Mark VII and ANITA Mark VIII, both launched in late 1961. The Bell Punch Co. of Britain designed the ANITA, using vacuum tubes and cold-cathode tubes, with Nixie tubes for the numerical display. Norbert ("Norman") Kitz led the design and engineering work.

The ANITA VII was sold in continental Europe and the ANITA VIII in the UK and the rest of the world. Soon after launch, however, Bell Punch dropped the ANITA VII and consolidated the product line.

Cost was a major factor in producing the ANITA. To be viable, Bell Punch needed to sell the calculator for about 1/100th of what the least expensive electronic computers of the day cost. The ANITA went on the market for £355 (about £7,800 in 2018, roughly $10,500 USD). In contrast, the least expensive general-purpose computers in 1961 cost about £50,000 (just over £1 million adjusted to 2018). The device weighed 34 pounds (15.5 kg).
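
As a quick back-of-the-envelope check, using only the figures quoted above, the launch price came in well under even that rough 1/100th target:

```python
# Back-of-the-envelope check using the 1961 figures quoted above.
anita_price_gbp = 355            # ANITA launch price
cheapest_computer_gbp = 50_000   # least expensive general-purpose computer

ratio = anita_price_gbp / cheapest_computer_gbp
print(f"ANITA cost {ratio:.1%} of the cheapest computer")    # ~0.7%
print(f"That is roughly 1/{round(1 / ratio)} of its price")  # ~1/141
```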

Transistor-Based Calculators

By 1964, competitors started to release calculators that used transistors rather than tubes. Sharp, Canon, Sony, Toshiba, Wang, and countless others released transistor-based calculators. These calculators were priced similarly to the ANITA, or were even more expensive, but they were significantly smaller and lighter thanks to the lack of tubes.

The Soviet bloc literally weighed in with the T-64, built in Bulgaria. Despite its use of semiconductors, the calculator weighed 8 kg (17.6 lbs.); it was, however, the first calculator able to compute square roots.

Calculators continued to decrease in price and size while increasing in performance.

General-Purpose Microchip

Many calculator companies hired Intel, then a young company, to produce custom chips for their calculators. Eventually, in 1970, Intel engineer Ted Hoff instead created a general-purpose chip for the Japanese company Busicom. Unlike other calculator chips, the Busicom chip was programmable to perform multiple functions, not only those specific to one calculator. In 1971, Intel acquired the rights back from Busicom and rebranded the chip as the Intel 4004, Intel's first general-purpose microprocessor.

Just-In-Time Manufacturing

Just-in-time manufacturing delivers the parts required to complete a product shortly before they are needed. This vastly reduces inventory cost while typically increasing quality, by aligning the production of part suppliers with the needs of the final manufacturer.

Background

Toyota engineer Taiichi Ohno needed a better way to manufacture: efficiency was low and quality suffered, especially when necessary parts ran out. He noticed that supermarkets used a visual card to indicate when an item was running low, signaling workers to restock the bin immediately. Without this system, bins might be filled with unneeded food that would spoil, or sit empty, forcing customers to make a later trip or go to a different store.

Ohno adapted this system, calling it "Kanban," which means "visual signal" or "card" in Japanese.

Eventually, Ohno brought the system to Toyota's manufacturing facilities. When parts ran low, workers turned over a card and somebody quickly came to replenish the parts. There were never too many nor too few parts at a workstation on the assembly line.

Kanban has six core practices. First, visualize the workflow: lay out the workflow so an ordinary person can grasp it visually. Second, limit work-in-progress: there must never be too much nor too little work in progress. Third, manage flow: align the workflow with the workers and the need for literal or figurative parts. Fourth, make process policies explicit: clarify the workflow so everybody understands what is required. Fifth, create feedback loops: ask and observe what works and what doesn't, and adjust accordingly. Finally, improve collaboratively: use small, continuous, incremental, evolutionary changes that stick. Do not try to boil the ocean.
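
The second practice, limiting work-in-progress, can be made concrete with a minimal Python sketch. The part name, bin capacity, and reorder point below are illustrative assumptions, not Toyota's actual parameters: a bin raises a kanban card only when stock falls to its reorder point, so the station never holds too many or too few parts.

```python
# Illustrative sketch of a kanban-style parts bin with a work-in-progress limit.
# All names and quantities are made up for illustration.

class PartsBin:
    def __init__(self, part: str, capacity: int, reorder_point: int):
        self.part = part
        self.capacity = capacity            # most parts the bin may ever hold
        self.reorder_point = reorder_point  # stock level that raises a card
        self.stock = capacity

    def consume(self, n: int = 1) -> bool:
        """Use parts on the line; return True if a kanban card should be raised."""
        self.stock = max(0, self.stock - n)
        return self.stock <= self.reorder_point

    def replenish(self) -> None:
        """Supplier refills the bin up to capacity, never beyond it."""
        self.stock = self.capacity

bin_ = PartsBin("door handle", capacity=20, reorder_point=5)
for _ in range(16):
    if bin_.consume():      # card raised -> restock before the bin runs empty
        bin_.replenish()
print(bin_.stock)           # stock stays between the reorder point and capacity
```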

Toyota found that Kanban vastly increased efficiency and decreased costs, and adopted it throughout the Toyota production system.

JIT

From the 1950s through the 1970s, the quality of Japanese manufacturing rapidly increased while the quality of US manufacturing declined. American executives studied the Japanese and found that the two core components of Japan's secret sauce were Kanban and the techniques taught by statistician W. Edwards Deming after the war. Deming tied Kanban's flow into a statistical approach called Total Quality Management (TQM), producing higher-quality goods (especially cars) at lower prices.

Eventually, US firms adopted Kanban and TQM while the process continued to evolve in Japan, the US, and elsewhere. Most notably, Michael Dell created a computer company that relied heavily on parts made by others. Dell computers were custom-configured when ordered, then quickly delivered. He needed a system in which vendors aligned with his own factory to quickly build high-quality computers. Dell's Just-In-Time (JIT) methods revolutionized manufacturing, enabling him to work with countless suppliers while ensuring the supply bins were never empty nor too full.

Precision Guided Munitions

“In the past, wars’ slaughter has been largely confined to armed combatants. Obviously the airman, riding so high above the earth that cities look like ant hills, cannot aim his deadly cargo at armed males. All below will be his impartial target.”

Major Gen. James Fechet, US Army Air Corps, 1933

Precision Guided Munitions (PGMs) are highly precise bombs. Usually, a laser held by a soldier or mounted on an aircraft guides the bomb to its target. PGMs are launched from aircraft, submarines, and land vehicles, and by individual soldiers. Because PGMs are more accurate, they are more lethal against their intended target and less likely to destroy an incorrect one.

Background

Lobbing projectiles is an ancient practice. Bombs dropped from aircraft originated during WWI, when pilots would literally pick up and drop a bomb from the cockpit. During the interwar years and WWII, aircraft and bombing technology advanced at a rapid pace. Primitive devices calculated aircraft speed, wind speed, and altitude to determine when to drop a bomb. Gravity took control once a bomb was released.

In 1942, the Germans developed radio-controlled guided bombs, steered by radio signals after release. They also developed a radio-controlled "glide bomb" that flew up to six miles (9.5 km) to destroy ships. By 1944, German radio-controlled bombs flew 19 miles (30 km) using a nose-mounted television camera and a radio uplink.

In 1943, the Allies fielded their own radio-controlled bombs. The famous bridge over the River Kwai was destroyed by a US radio-controlled bomb, and the US used 1,357 "AZON" radio bombs to destroy 41 bridges. By 1945, the US had released "The Bat," the first autonomous "fire and forget" radar-guided glide bomb.

PGMs took a back seat in the post-war decades, due primarily to cost. Armies focused on nuclear weapons and conventional bombs; less than 1% of bombs dropped in Vietnam were PGMs. One exception was Israel, which used PGMs decisively during the 1973 Yom Kippur War: a small number of extremely accurate bombs proved decisive in stopping tanks. This sparked renewed interest from both the US and the USSR.

PGMs Become Mainstream

By the 1980s, American bombs could fly day or night, retain altitude, and strike pinpoint targets. By the first Gulf War, about 10% of US bombs were PGMs, yet they accounted for 75% of the total damage.

The latest PGM, developed by the US, is a flying knife-bomb intended to eliminate damage beyond the targeted individual.

Jukebox

The jukebox is an automated, coin-operated music player that plays individual songs. What differentiates the jukebox from a simple coin-operated record player is the ability of an automated machine to replace live music in a restaurant or bar.

Background

Louis Glass and William Arnold modified Edison's record players to operate on coins. Because the loudspeaker had not yet been introduced, these machines offered multiple individual listening stations. The coin-operated record player evolved, but until the 1930s it was not in widespread use; you couldn't dance to the early Edison phonographs.

Literature often confuses and co-mingles the jukebox and the nickelodeon. However, they are entirely separate devices.

However, player pianos had existed since the late 1880s, including coin-operated models; by 1896, the Wurlitzer company was selling coin-operated player pianos. In 1924, de Forest's vacuum tube amplifier enabled amplified music and the jukeboxes that followed.

Golden Years of Jukeboxes

In the early 1930s, Americans lacked both money and fun: the Great Depression and the prohibition of alcohol put a damper on entertainment. Phonographs were not expensive, but they were not free, and neither were the recordings.

In response, various inventors created the modern jukebox: a machine that plays single-song records (initially 78s, later 45s) over a loudspeaker, one after another, for an affordable price.

Controversy

Two groups found the jukebox controversial. The first were Americans who believed that jukeboxes encouraged immorality and crime. Organized crime did control jukeboxes in New York City, reinforcing this negative impression.

However, organized crime also controlled the low-cost speakeasies in which jukeboxes originally played, at least until the repeal of Prohibition in 1933. Realistically, these critics didn't like the influence of the music, especially on young people. Jukeboxes often played jazz and, later, rhythm and blues and rock and roll. This music was tied to African Americans, and the "concerns" of certain people were little more than thinly veiled racism.

The other group, with a more substantive concern, was musicians. Before jukeboxes, musicians routinely played in bars and pubs throughout the US and Europe; live music was the norm, not the exception. Whereas a bar owner had to pay musicians, a jukebox produced revenue. Even if mobsters ran the jukeboxes, they still cost the bar owner nothing, unlike live musicians.

Jukeboxes created a cultural convention that people could have the music they wanted when they wanted it for a reasonable price. While the machines eventually faded away, the demand for individualized music did not.

Prefabricated Housing Components

History

Limited use of prefabricated components dates back to ancient times. Mesopotamians used burnt clay bricks, Romans utilized concrete molds for aqueducts and tunnels, and William the Conqueror embraced the concept. There were movable modular buildings for industry, defense, and even hospitals. However, hand construction was the norm for the vast majority of houses and buildings.

That changed in 1908, when Sears Roebuck released a new item in its catalog: houses. People could order all the parts and pieces required to build an entire high-quality home, delivered as a kit. Sears brought standardized parts, the "American Manufacturing Method" (invented by the French), to housing.

“For $1,062 we will furnish all the material to build this Eight-Room House, consisting of Lumber, Lath, Shingles, Mill Work, Siding, Flooring, Ceiling, Finishing Lumber, Building Paper, Pipe, Gutter, Sash Weights, Hardware and Painting Material,” reads a typical ad from 1908. All houses also included free architectural plans to aid in permitting.

Some houses were modest, though many were large, and there was at least one mansion.

Sears discontinued selling kit houses in 1940 after selling about 70,000 houses.

Modern Day

However, the idea of prefabricated building components remains. Today, doors routinely come with frames for installation. Hand-built roof trusses are virtually unheard of because factory-made ones are safer and cost less. Windows routinely come preassembled and, in some places, hand-built windows are illegal for safety reasons. Countless components of modern houses, especially in the US but also elsewhere, are built in factories, not at job sites.

Besides prefabricated house parts, entire prefabricated houses and buildings still exist.

In addition to prefabricated parts, there are also "modular" construction units. These function like building blocks, with various parts of houses and buildings fitting together. Modular buildings theoretically cost less than one-off construction while offering higher quality, since the pieces are built in tightly controlled factories.

Hotel chain Citizen M uses prefabricated modules to build entire hotels, including a 300-room hotel in New York City. The Chinese famously built the 57-story “J57 Mini Sky City” in 19 days using modules.

Windshield Wiper

Windshield wipers are a vital component of a car.

Inclusion Criteria

However, countless other components in cars are also important. Excluding the vast majority of auto components from innowiki is a purposeful decision. Undoubtedly, these components oftentimes represent enormous markets. However, they do not teach us anything especially important; they are components in a larger machine.

Accordingly, while we've tried to separate individual cases from the lessons behind them, we make a special exception for windshield wipers. They illustrate the difficulty of personally profiting from one's work, even after successful commercialization.

In the case of windshield wipers, auto companies refused to pay, declaring the invention "obvious" after the fact. Different patent offices around the world apply differing definitions of "obviousness," creating a slippery slope. Undeniably, countless inventions intuitively feel obvious after the fact, and, arguably, countless innovators were simply lucky with timing. To read more, switch over to the analytical part of the site.

Windshield Wiper Inventors

All three major windshield-wiper inventors had their patents blatantly infringed.

Noticing that drivers struggled to see in the rain, Mary Anderson realized the need to keep windshields clear. She invented a hand-cranked device to wipe the water off auto windshields, hired an engineering firm to perfect it, and patented it in 1903. However, nobody purchased or licensed the patent.

Charlotte Bridgwood invented and patented the automatic electric wiper in 1917; nobody paid her either.

Robert Kearns invented and patented the intermittent (variable-speed) wiper in 1969. Nobody paid him either, until he engaged in a prolonged series of lawsuits and prevailed against Ford ($10.1M) and Chrysler ($18.7M initially, roughly $30M in the final verdict, after $10M in legal fees).

Kearns served as his own lawyer for much of the litigation, though at least four firms he hired along the way quit, saying he was too difficult to work with. He lost cases against GM, Mercedes, and several Japanese companies on technicalities, usually related to filing deadlines. The 2008 movie Flash of Genius is about Kearns and his legal battles.