Last July, Dallas police killed a suspected gunman with an Unmanned Ground Vehicle (UGV). By any reasonable standard, this was a “good kill.” The gunman had murdered five police officers, threatened to kill more with hidden bombs, and refused to give up even after hours of negotiations. A credible threat against civilians had been made. The threat had to be neutralized in a timely fashion. Using a UGV reduced the danger to officers and protected civilians at the same time.

We are not sure what kind of robot was used, but it was probably one designed to detect Improvised Explosive Devices (IEDs). Ironically, a robot most likely built to counter explosive devices killed the gunman by detonating a bomb (a pound of C4).

Sources claim that the UGV was a Remotec Andros F-5 model (some say it was a MARCbot, but this appears to be speculation). We do know that the UGV was remotely controlled. Both the MARCbot and the Andros F-5 have been used in overseas combat operations.

The tactic of jury-rigging an explosive device to a UGV in order to perform a kinetic action is not a new one. The Brookings Institution reported in “Military Robots and the Laws of War” that soldiers would strap Claymore anti-personnel mines to UGVs in attempts to kill insurgents.

Some are alarmed that police used military equipment and tactics on American soil. Many feel that the whole “killer robot” scenario was just plain ominous. The following quotes are typical.

“The ‘targeted killing’ of a suspect on ‘U.S. soil,’ as opposed to extraterritorial declared or undeclared war zones, where this operation also has clear precedents too, has captivated the attention of scholars and the public. … the event raises many issues about the rules of engagement and the constitutional rights of a suspect—issues that obviously the Dallas police completely skirted, and do not seem too willing to discuss in the aftermath.” Javier Arbona, Assistant Professor of American Studies and Design, University of California, Davis

“As with other game-changing technologies, police robotics—especially as weapons—could change the character of law enforcement in society.” Patrick Lin, Director of the Ethics & Emerging Sciences Group and a philosophy professor at California Polytechnic State University, San Luis Obispo, in IEEE Spectrum

“…placing a bomb on a police robot with the intention to kill a suspect—if that is, in fact, what happened—would represent a major shift in policing tactics….It now appears that a tactic of war deployed on foreign soil is being used on the streets of American cities. There are still a lot of moving pieces and things to sort together in the aftermath of the police shootings in Dallas. But one thing is clear: the rules of police engagement might have just changed forever.” Daniel Rivero, Fusion

As evidenced by the above quotes, legal experts and ethicists are anxious over the use of a “killer robot.” You know who didn’t seem concerned? Police.

“Admittedly, I’ve never heard of that tactic being used before in civilian law enforcement, but it makes sense. You’ve got to look at the facts, the totality of the circumstances. You’ve got officers killed, civilians in jeopardy, and an active shooter scenario. You know that you’ve got to do what you’ve got to do to neutralize that threat. So whether you do it with a sniper getting a shot through the window or a robot carrying an explosive device? It’s legally the same.” Dan Montgomery, former police chief, Time

I called AMREL’s Director of Public Safety Programs, William Leist, to ask his opinion. A former assistant chief of the California Highway Patrol, he has many years of police experience. He fell squarely into the “no big deal” camp.

“Deadly force is deadly force. Whether we use a bomb, a vehicle or a firearm, if the situation is such that deadly force is authorized, lawful, and necessary, the mechanism really shouldn’t matter.”

I am inclined to strongly agree with Bill Leist (and not just because he works with AMREL in supplying rugged computers and biometric devices to law enforcement). The use of a UGV in this particular instance isn’t really a game changer, as some experts are maintaining.

However, let’s play devil’s advocate and point to potential problems. There is an old saying: “Hard cases make bad law.” In other words, a specific high-profile instance may not be the best guide for similar circumstances in the future. Just because the use of a UGV in Dallas was a lethal operation that even Gandhi would have approved of doesn’t mean that there isn’t potential for abuse.

The adoption of new technologies does not always lead to expected results. Police have already expanded the application of UGVs beyond their original purpose. Granted, this new application is justifiable in this instance, but are we in a position to predict future consequences?

Police are being overwhelmed with technology. Law enforcement officers are dealing with a multitude of biometric devices, communication technologies, non-lethal weapons, and a wide variety of computer platforms. In many departments, training has not kept up with these needs. Should there be certain minimum standards for operators of UGVs? Who sets them?

Suppose something goes wrong and a civilian gets hurt. The police will say that it’s not their fault, because the UGV malfunctioned. The manufacturer will say its UGVs are not designed for lethal operations, so it isn’t responsible. Granted, almost all police actions have liability issues. However, unmanned technologies pose liability problems that go beyond traditional concerns, and those problems are a major factor in slowing their adoption.

What about visibility? Whether a shooting is justified or not may depend on what an officer sees, or what he thinks he sees. Remote-controlled unmanned systems, whose operators may have limited visibility, may present new problems.

Using lethal force with a UGV does not herald a new era of law enforcement. The adoption of this technology is unlikely to fundamentally alter the current legal dynamics surrounding the use of police force. However, it may present new complications and challenges.

 


AMREL is always on the lookout for new applications for Unmanned Ground Vehicles (UGV), since we are the premier supplier of Operator Control Units for them. Bomb detection, guard duty, and farming are well-known uses for UGVs. To this list, we can add unlocking the secrets of a lost civilization.

For over 100 years, archeologists have been exploring the ruins of Teotihuacan, a site just outside Mexico City. Famous for its huge pyramids, it once housed 100,000 to 200,000 people, making it one of the largest cities in the world at that time. Almost nothing is known about the original inhabitants or why they disappeared.

After a heavy rainstorm, Sergio Gomez, an archeologist who has devoted his career to studying the Teotihuacan ruins, noticed a sinkhole at the base of one of the pyramids. In best Indiana Jones fashion, he lowered himself by rope into the mysterious hole. No word about whether he wore a fedora or carried a whip. Nor were there any reports of snakes or giant rolling stone balls.


He descended approximately 45 feet into the dark, never-explored opening. Was he scared? According to him, he was terrified. However, at the end of his descent, he discovered something that made the trip worthwhile: a man-made tunnel. Around 100 yards long, this underground passage eventually yielded 75,000 artifacts, including seashells, animal bones, jewelry, pottery, rubber balls, obsidian knives, and green stone statuettes.

Excavation of the far end of the tunnel was critical. Located deep beneath the Temple of the Plumed Serpent, it was thought to hold vital information about the purpose of the tunnel, as well as the pyramids themselves.

However, the tunnel narrowed significantly toward the end; no person could crawl through it. Traditional mining techniques were out of the question, since they were too likely to damage the sensitive archaeological site.

Fortunately, a university in Mexico City had a solution. Tlaloque 1 and Tláloc II, two UGVs named for Aztec rain deities, had previously been used to explore Teotihuacan. A similar UGV had been used in an Egyptian tomb.

A 4×4 traction vehicle, Tlaloque 1 is only 20 cm high, 30 cm wide, and 50 cm long. Its two remote-controlled camcorders, one at the front and one at the back, can rotate 360 degrees. It has its own lights and transmits images to an external computer monitor.


The larger, meter-long Tláloc II is equipped with heavy-duty tires designed to navigate wet, muddy soil. Its meter-long mechanical arms can chew through earth and clear obstacles. Equipped with bright lights as well as video cameras, it stores images on an on-board hard drive and uses a laser to scan the topography. Unlike Tlaloque 1, Tláloc II is controlled wirelessly and carries an independent “insect” robot that has an infrared camera. Both Tláloc II and the “insect” robot can be seen in the video at the end of this post.


Tláloc II and Tlaloque 1 successfully traversed the 2,000-year-old tunnel. Much to the archaeologists’ surprise, the UGVs didn’t discover a tomb at the end of the passageway, but instead a spacious cross-shaped chamber, with more jewelry and several statues. The purpose of the chamber is still being determined.

Although UGVs have been used before on archaeological digs, this is the first time robots have been instrumental in a major archaeological discovery.  I suspect it won’t be the last.

Recently, Tom Green, founder and editor-in-chief of Robotics Business Review put on a webinar about an Unmanned Ground Engineering Vehicle (UGEV).  Before I explain what a UGEV is, a little background.

In the webinar, Tom Green noted that we are approaching the fifth anniversary of the Fukushima accident and the thirtieth of the Chernobyl disaster. He said that 60,000 Russian and Ukrainian workers were exposed to radiation at Chernobyl and that 6,000 have since died (the exact number of casualties from this accident is fervently debated). Many were critically exposed when they operated vehicles to dump thousands of tons of concrete on the nuclear plant itself.

To this day, Green noted, these vehicles are still too irradiated to use. Obviously, many casualties could be avoided if the proper unmanned systems were available. He wondered if anyone had developed an appropriate system. (The lack of nuclear disaster robots was discussed in this blog in “Where Are the Japanese Robots?”)

Unmanned systems operating in nuclear disasters face all sorts of problems. Radiation affects their CCDs (electronic light sensors) and other electronics. Clearing away the concrete and rubble can be a formidable task. Green went in search of what he called a “tough boy.”

He found one in Tel Aviv. Agritechnique Engineering had developed what is described as an Unmanned Ground Engineering Vehicle (UGEV). Specifically created to operate in nuclear disasters, this UGEV can move as fast as 20 km per hour, has one of the strongest undercarriages on the market, can breach doors and break down walls, and has a boom that lifts two tons. It was built by Avner Opperman, CEO of Agritechnique, who has 40 years of earth-moving experience. Green enthusiastically proclaimed that he had found his “tough boy.”

By far the coolest thing about the UGEV is its versatility. The boom can be fitted with up to 80 different tools, including hydraulic hammers, cutting discs, clamps, and buckets. Carried in attached storage compartments, the tools each have their own IP. The boom can autonomously switch tools in the field, matching them to their appropriate tasks. A fully automatic quick-coupler hydraulic assembly wields the tools with 2 × 55° of tilt and an endless 360° range of motion.

Opperman designed the flexible tool system with the idea that the UGEV could perform multiple tasks, and be capable of adapting to a wide variety of needs as they occurred in the field. He wanted an unmanned system that could replace the many different kinds now used in disasters.


You can also read about it in Robotics Business Review.

My first reaction to the UGEV was favorable. It reminded me of the Flexbay and Flexpedient concepts that AMREL had successfully incorporated into our mobile rugged platforms. We created Operator Control Units with Flexbays that enabled personnel to switch applications in the field. The military loved how it increased operational range and simplified logistics. They deployed thousands in theater. Similarly, our Flexpedient® AT80 tablets can be modified to a wide variety of applications, enabling quick customization. New product developers have seized upon it as a way of getting their solutions to market faster and more economically.

However, it’s one thing to create mobile rugged computer platforms that are interoperable and flexible. It is another to build a vehicle that is an “all in one” unit, i.e. one that replaces a heterogeneous collection of unmanned systems as Opperman advocates.

While the UGEV undoubtedly looks capable of clearing away debris and pouring cement (which is what Green was looking for), can it really replace the mixed lot of unmanned systems that are currently used? Is it even a good idea? For one thing, the UGEV is big: 13 feet long and almost 6 feet wide. It’s difficult to imagine it fitting into the narrower interior spaces of some nuclear plants.

I am not an expert in disaster robotics, but Dr. Robin Murphy of the Center for Robot-Assisted Search and Rescue (CRASAR) is. She has written that in disasters, “ground robots are generally not useful” (CRASAR). She has also said that “…there is not a single robot that will work for all missions” (CRASAR).

The biggest problem with the UGEV – and all disaster robotics – is that no one wants to pay for them. Writing for Slate, William Saletan noted, “Power companies want cheap robots that can replace workers and are always useful. They don’t want robots expensively equipped to handle unlikely nightmare scenarios.” They prefer the time-tested technique of pretending nothing bad will ever happen.

What do you think?  Is the UGEV the revolution that Green and Opperman think it is, or is it a technological dead-end? Send us your opinions to editor@amrel.com  (please note that your messages may be used in future blog posts).

An article posted in ExtremeTech is interesting, not only for the implications for unmanned systems, but also as an example of how a technology developed for one purpose can be used for something else entirely.

As noted in this blog, aging populations worldwide have generated interest in “social welfare” unmanned systems. These robots will assist the elderly with bathing, taking medications, cooking, and other Activities of Daily Living (ADL).

Programming every single ADL step is tedious, so the obvious option is to have the robots “learn” them. As described in the article below, artificial intelligence currently needs large data sets that have been labeled or otherwise processed in order to “infer functions,” i.e., learn.
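To make the labeled-data requirement concrete, here is a toy supervised learner: a nearest-centroid classifier that can only work because every training example carries a human-supplied label. Nothing here comes from the RoboWatch project; the “sensor readings” and labels are invented for illustration.

```python
# Toy supervised learning: every training example must be hand-labeled
# before the model can "infer a function" from the data.
labeled_data = [
    ((1.0, 1.2), "whisk"),   # hypothetical hand-labeled sensor readings
    ((0.9, 1.1), "whisk"),
    ((5.0, 4.8), "pour"),
    ((5.2, 5.1), "pour"),
]

def train_centroids(examples):
    """Average the points under each label into a centroid."""
    sums = {}
    for (x, y), label in examples:
        sx, sy, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, n + 1)
    return {lbl: (sx / n, sy / n) for lbl, (sx, sy, n) in sums.items()}

def classify(point, centroids):
    """Assign a new, unlabeled point to the nearest labeled centroid."""
    return min(
        centroids,
        key=lambda lbl: (point[0] - centroids[lbl][0]) ** 2
                      + (point[1] - centroids[lbl][1]) ** 2,
    )

centroids = train_centroids(labeled_data)
print(classify((1.1, 1.0), centroids))  # -> whisk
```

The drawback the article describes is visible even in this toy: without the label strings on each training pair, `train_centroids` has nothing to learn from.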

The folks at the RoboWatch project claim that they can enable robot learning by having robots watch numerous YouTube instructional videos. Presumably, no labeling or processing is required.

ExtremeTech points out that this development could be combined with work on “real-time video summarization,” a technique designed to automate surveillance by detecting behaviors that are deemed suspicious.  RoboWatch’s project involves analyzing videos for universal steps that are essential in a process, while “video summarization” identifies anomalous actions that are correlated with criminal behavior.  It is easy to see how these two methods could work together.
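How the two methods could complement each other can be sketched with plain step-counting: steps that appear in most videos form the “universal” recipe, while steps that appear only once are flagged as unusual. This is a hypothetical toy, not RoboWatch’s or the video-summarization team’s actual algorithm, and the step sequences are invented:

```python
from collections import Counter

# Step sequences notionally extracted from several instructional videos.
videos = [
    ["crack eggs", "whisk", "heat pan", "pour", "fold"],
    ["crack eggs", "whisk", "heat pan", "pour", "add cheese"],
    ["crack eggs", "heat pan", "whisk", "pour", "fold"],
    ["crack eggs", "whisk", "pour", "leave stove unattended"],  # odd one out
]

# Count in how many videos each step appears (once per video).
counts = Counter(step for video in videos for step in set(video))
threshold = len(videos) / 2

# "Universal" steps: present in a majority of videos (the archetypal formula).
universal = {step for step, n in counts.items() if n > threshold}

# Anomalous steps: seen only once -- candidates for "suspicious activity".
anomalies = {step for step, n in counts.items() if n == 1}

print(sorted(universal))   # the shared omelet storyline
print(sorted(anomalies))   # the behavior worth a second look
```

The same counting pass yields both outputs, which is why combining the two research directions seems so natural.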

In other words, a method for teaching robots how to cook eggs could one day be used to identify terrorist suspects. This example shows just how difficult it is to predict the impact of innovations.

What happens if RoboWatch’s learning method is applied to non-instructional videos? What if robots try to “learn” by watching action movies? Sexually explicit films? Political debates? People have trouble distinguishing reality from the fiction they see on television. How will this affect robots?

Will the future ushered in by advanced robot learning techniques be a dangerous one?  I have no idea, but I have a feeling things are going to get weird.

ExtremeTech article below:

Astute followers of artificial intelligence may recall a moment from three years ago, when Google announced it had birthed unto the world a computer able to recognize cats using only videos uploaded by YouTube users. At the time, this represented something of a high water mark in AI. To get an idea for how far we have come since then, one has only to reflect on recent advances in the RoboWatch project, an endeavor that is teaching computers to learn complex tasks using instructional videos posted on YouTube.

That innocent “learn to play guitar” clip you posted on your YouTube video feed last week? It may someday contribute to putting Carlos Santana out of a job. That’s probably pushing it; it’s more likely that thousands of home nurses and domestic staff will be axed long before guitar gods have to compete with robots. A recent groundswell of interest in bringing robots into the marketplace as caregivers for the elderly and infirm, in part fueled by graying population bases throughout the developed world, has created the necessity for teaching robots simple household tasks. Enter the RoboWatch project.

Most advanced forms of AI currently in use rely upon a branch of supervised machine learning, which requires large datasets to be “trained” on. The basic idea is that when provided with a sufficiently large database of labeled examples, the computer can learn to recognize what differentiates the items within the training set, and later apply that classifying ability to new instances it encounters. The one drawback to this form of artificial intelligence is that it requires large databases of labeled examples, which are not always available or require much human curation to create.

RoboWatch is taking a different tack, using what’s called unsupervised learning to discover the important steps in YouTube instructional videos without any previous labeling of data. Take for instance a YouTube video on omelet making. Using the RoboWatch method, the computer successfully parsed the video on omelet creation and cataloged the important steps without having first been trained with labeled examples.

Color-coded activity steps and automatically generated captions, all created by the RoboWatch algorithm for making an omelet.

It was able to do this by looking at a large number of instructional omelet-making videos on YouTube and creating a universal storyline from their audio and video signals. As it turns out, most of these videos will contain certain identical steps, such as cracking the eggs, whisking them in a bowl, and so on. When presented with enough video footage, the RoboWatch algorithm can tease out what the essential parts of the process are and what is arbitrary, creating a kind of archetypal omelet formula. It’s easy to see how unsupervised learning could quickly enable a robot to gain a vast assortment of practical household know-how while keeping human instruction to a minimum.

The RoboWatch project follows similar advances in video captioning pioneered at Carnegie Mellon University. Earlier this year, we reported on a project headed by Dr. Eric Xing, which seeks to use real-time video summarization to detect unusual activity in video feeds. This could lead to surveillance cameras with the built-in ability to detect suspicious activity. Putting these developments together, it’s clear unsupervised learning models using video footage are likely to pave the way for the next breakthrough in artificial intelligence, one that will see robots entering our lives in ways that are likely to both scare and fascinate us.

Check out the video below of Uran-9, the new Russian unmanned armored vehicle.

Pretty impressive, isn’t it? This thing packs:

  • 30mm 2A72 automatic cannon
  • Coaxial 7.62mm machine gun
  • Ataka Anti-Tank Guided Missiles

According to Defense Talk, it has “a laser warning system and target detection, identification and tracking equipment.”

Russia has big plans to promote Uran-9 in the international market where it “…will be particularly useful during local military and counter-terror operations, including those in cities.”

Not to be too much of a wet blanket, but why would anyone buy this?  Russia says that it will reduce personnel casualties, one of the prime benefits of unmanned systems.  I wonder if this is true.  For one thing, the remote operators are located in a mobile command unit. Is that safer than an armored vehicle? Is the operating range so great that the remote operators are significantly out of harm’s way?

For Russia’s target market, how significant are personnel casualties? The US is obviously not going to buy it (for one thing the American version would be called UranUS and yes, I wrote this whole article so I could use this joke). America pours a lot of money into its soldiers, so that they are the most expensive piece of equipment on any given battlefield. Other countries also invest a lot in their soldiers, but none do as much as the US. I bet for many countries, the loss of an armored vehicle is more expensive than the personnel casualties.  I realize that this sounds cold blooded, but as commanders decide where to commit their limited resources, they will do these calculations. Will they want to pay more for a system that has a greater number of points of failure?

Furthermore, there is a good chance that there will be no cost savings for personnel, even if an operator could drive multiple vehicles. The installation and maintenance of the mobile command center as well as the remote control systems themselves will require highly trained personnel.

Also, what will be the situational awareness of the remote operator as compared to one actually inside the vehicle? As a rule, remote operators have less. Will that leave the vehicle more vulnerable?

Of course, my doubts may be completely unfounded, and this thing may sell better than iPhones. But I can’t help feeling that this vehicle was designed not for practical reasons, but because somebody somewhere thought unmanned systems were cool.

I asked Richard Barrios, ex-Marine ammo tech and aficionado of all things that go BOOM, to take a look at the video. He agreed that the ordnance was impressive, but pointed out that the Geneva Convention prohibits these particular weapons from being used on personnel (as opposed to tanks and similar platforms). This puts a crimp in Russia’s assertion that this is a counter-terrorist weapon. Richard and I agreed that Russia and its clients probably will not care about this particular problem.

Richard suggested that this might be a good defense vehicle for a base. I countered that there were cheaper and more practical alternatives. Richard agreed and said that he preferred Precision Remote’s .50 caliber M2 remote-controlled solution (see video here). BTW, in the video you can see the ROCKY DK Rugged Tablet being used as a control unit for this .50 caliber solution.

Richard likes the Russian armored UGV, but like me, he harbors doubts about the usefulness of this platform as compared to a manned one. “I wouldn’t want to take it to war,” he said. “But I would want it as a toy.”

Recently, I was asked to answer a question on the social media site Quora: “How does a person get their robot design/prototype bought by the U.S. military?” Here is my answer:

This is one of those “if you have to ask, forget about it” questions. Selling to the American military is a world in itself, and your best bet is to partner with someone who has direct experience with Defense acquisition.

I work for AMREL (American Reliance). It makes rugged computers that are the platforms for Operator Control Units for Unmanned Ground Vehicles (UGV). While I have never directly dealt with military procurement, I have had many conversations with salesmen who have.

Keeping in mind that I am not an expert, this is my general impression of what is involved:

  • Target the specific bureaucracy: What kind of unmanned system you make will determine who you sell it to. ARMY likes Tactical Unmanned Aerial Vehicles (TUAV). The NAVY’s Advanced Explosive Ordnance Disposal Robotic System (AEODRS) is responsible for IED-detecting UGVs. The AIR FORCE flies the big Predator drones that you hear so much about in the news. (BTW, as you can tell from this paragraph, be prepared to learn acronyms. LOTS of acronyms.)

Keep in mind that the Department of Defense (DoD) is a bit like the universe, i.e. big, mysterious, and mostly invisible. Finding out who might want to buy your robot and who is the appropriate person to contact can be challenging.

  • Ninety percent of sales is listening: Nowhere is this more true than in military sales. I don’t care if your robot can travel backwards in time and makes non-fattening chocolate cake; if it doesn’t meet their requirements, the DoD is not interested. Find out what THEY want and then make it. From time to time, various elements of the military hold public events that vendors can attend. Sometimes it’s just a table-top tradeshow; other times it’s a big-time demo like the Robot Rodeo. These are valuable places to gain info. Do a net search, plan your travel itinerary, print up some datasheets, and above all LISTEN.
  • Be prepared to make changes, LOTS of changes: If the military is interested in your robot, it will go through an enormous amount of testing and approvals. This can take quite a bit of time. Large weapon systems have been known to spend decades in development. With every stage, you will receive feedback about what needs to be altered. End-user feedback is considered especially critical.
  • Be prepared to spend money: Traveling around the country and preparing the appropriate documentation are expensive undertakings. And that’s not even counting the expense of multiple prototypes. AMREL has carved out a niche for itself in customizations of rugged computers for low volume orders with low-to-no NRE. This enables developers to make economical prototypes for the reiterative development process.
  • You may end up selling to a big Defense contractor, rather than directly to the DoD. These guys are called “primes.” If you are making a garlic-sniffing submersible unmanned system, and Fat Cat, inc. holds the contract for this kind of robot, you have to sell to them. Just as you need to learn the intricate labyrinthine ways of the DoD, you are going to have to study the quaint and colorful traditions of Fat Cat if you want to do business with them. There are advantages and disadvantages to selling to primes, which is a whole other answer.
  • Be prepared to spend lots of time. I was at a conference in which about half a dozen representatives of Defense primes were up on a stage answering questions. When asked “How long does it take to get a contract,” the average answer was 5 years.
  • Make your robot cheap. The DoD may have more money than anyone else, but it’s under enormous pressure to be economical. They want solutions that save money.
  • Can you buy parts for your robots off the shelf? The DoD used to spend big bucks on specialty items. Not anymore. They want to buy parts at the local big box store. The magic word is “Commercial Off The Shelf (COTS).”
  • Does your robot work and play well with others? In this case, the magic word is “interoperability.” AMREL has been able to dominate the OCU market, because we came up with an interoperable solution that worked on multiple UGVs. Other companies created proprietary control systems that worked only on one robot. This caused a logistical nightmare for the DoD.
  • Don’t believe the headlines about corruption and incompetence. Yes, the Defense procurement process is a mess. Why is a whole other answer. But it’s not the fault of typical DoD personnel. By and large the people you encounter in the military and Defense are smart, dedicated, and honest. They are haunted by the specter that the equipment they procure may result in the death of American servicemen. If you have a product that can save lives, then you might just have yourself a sale.

 “China could well turn out to be ground zero for the economic and social disruption brought on by the rise of the robots.”

New York Times

 

Last November, the World Robot Exhibition was held in Beijing. It was an opportunity for the media to gawk at cute, dancing automatons, and to stoke unfounded fears about China’s “advanced armed attack” robots (“Isn’t that just a PackBot with a rifle strapped to it?”).

It was also time to consider how important China is to the future of unmanned systems, especially those for industrial applications. According to the International Federation of Robotics (IFR):

  • China was the biggest market for robots in 2014.
  • In 2014, Chinese factories accounted for about a quarter of the world’s industrial robots (54% increase over 2013).
  • By 2017, China is projected to be home to the most robots of any country.

[Chart: Substantial increase in China and Korea. Source: International Federation of Robotics (IFR)]

 

A good example of China’s commitment to unmanned systems is Foxconn, the maker of iPhones. Three years ago it announced that it would install 1 million robots in order to automate about 70% of factory work.  It already has a fully robotic factory in Chengdu. Other Chinese companies are enthusiastically pursuing similar plans.


Opportunities for Western robot manufacturers

The path to widespread Chinese adoption of industrial robots is not without its obstacles. For one thing, Chinese-made robots have a significant quality problem.

Qu Daokui, chairman of Siasun Robot and Automation, a Shenyang-based industrial robot producer, stated in the South China Morning Post that quality is the number one challenge faced by Chinese robot manufacturers. The industry “lacks core technology,” he is reported to have said, and is stuck at “low-end application in a high-end industry. As a result, it is under pressure of being marginalized in Western-dominated markets.”

Evidence for the quality problem is found in Foxconn’s troubled “Foxbot.” Difficulties with this robot forced the manufacturer to scale back its ambitious automation plans.

Some of the sources of poor quality can be tied to the relative newness of the Chinese robot industry; an estimated 15% of Chinese robot makers are start-ups less than five years old. Chinese robot manufacturers still have not geared up production to meet demand, and they lack the latest technologies, such as 3-D printing.

Like everyone else in the world of robot business, AMREL is always on the lookout for new markets (we make Operator Control Units for unmanned systems). The poor quality of Chinese robots offers opportunities for foreign robot suppliers. China relies on imported key parts such as sensors and motors. In fact, most robots in China are imported, as demonstrated by the chart below:

[Chart: In 2014, China installed more than 56,000 new robots. Source: International Federation of Robotics (IFR)]

Ironically, the Chinese manufacturing industry, which has made the world dependent on its output, is itself dependent on foreign imports of industrial robots.


Jobs

However, quality and dependency on foreign imports are not China’s biggest robot problems. By far the biggest concern is job loss. Sixteen million Chinese factory jobs disappeared between 1995 and 2002, roughly 15% of total Chinese manufacturing employment. Many expect automation-driven job loss in China to happen even faster than it did in the West.

Chinese workers are already having a tough time. One survey found that 43% of Chinese workers consider themselves overeducated for their current positions. China’s dramatic reduction of poverty is historically unprecedented, but it has not translated into a middle-class lifestyle for most.

Furthermore, no one really understands the Chinese economy; it is a unique combination of wild infrastructure spending, state-directed capitalism, an extremely high domestic savings rate, and poor incomes for typical workers. Many economists are hesitant to apply traditional models to it.

Added to the uncertainty about the impact of robotics on the economy and its workers is anxiety about basic Chinese social structure. The grip of the Chinese Communist Party is rock solid, but what happens when robot-driven job loss hinders China’s extraordinary growth rate? Authoritarian regimes fear their people in a way that democracies don’t.

The whole world will be looking to see how China handles these challenges. Humanity has been dealing with job loss caused by automation for over a hundred years, and has yet to find a good answer.

The problem of robots and jobs can best be illustrated by an anecdote supposedly told about Walter Reuther, past leader of the United Auto Workers. An automobile company executive was showing the legendary union leader around a modernized factory floor. Pointing out the new industrial robots, the executive teased Reuther, “How are you going to collect union dues from all these machines?” Reuther replied, “That is not what’s bothering me. I’m troubled by the problem of how to sell automobiles to these machines.”

Have inside information about this topic? An opinion? Inappropriate jokes?

Send them to editor@amrel.com

Go to a solar energy conference, and much of the discussion will be about financing. Go to a meeting of national security professionals, and you will hear about terrorism. Gather folks interested in unmanned systems, specifically Unmanned Aerial Vehicles (UAVs), AKA drones, and you will hear endless talk about the Federal Aviation Administration (FAA).

So, it made sense that last November Robotics Business Review held a webinar on the FAA’s impact on drones. In 2012, Congress ordered the FAA to develop a comprehensive plan to safely integrate UAVs into the National Airspace System (NAS). This Congressional directive has had an interesting side-effect in that all across the country, law firms are adding unmanned divisions to deal with expected UAV regulations. We may think of UAVs as a technological challenge or as a business enterprise, but right now it is the legal environment that will determine their future.


Believe the hype

The first thing the webinar made clear is that the UAV market is exploding. See table below.

[Table: UAV facts & figures]

Two major factors are driving this incredible growth. First, the combination of UAVs and cameras is magic. UAVs with cameras are used by:

  • Farmers to determine which crops are getting enough water
  • Engineers to inspect bridges, buildings, wind farms, oil rigs, power lines, cellular towers, and other parts of our infrastructure
  • Firefighters to examine wildfires
  • Realtors to photograph properties for sale
  • Law enforcement to patrol borders and control crowds
  • Filmmakers to capture shots that would be otherwise too expensive
  • Roofers to check shingles

Secondly, UAVs are a relatively cheap package of hi-tech goodies. For $1,000 to $4,000, you get not only a camera, but also a satellite connection, GPS, infrared sensors, sonar, and some autonomous capabilities. One of the webinar’s participants stated that the concentration of technology in a UAV is comparable to that of a smartphone. For the money involved, it’s a good value, and it represents a low financial threshold for a pioneering innovator looking for a new disruptive application.


FAA & the law

Current law mandates that anyone who operates a UAV for business purposes needs a pilot license. This is an absurd overqualification, and many expect it to be changed in the near future. One of the webinar’s participants speculated that a brief, low-fee online course will be all that is required.

Another current requirement is that a business operator of a UAV needs a Section 333 exemption. The FAA is understaffed, underfunded, and overworked, so the paperwork for this exemption can take three months. As you might expect, promoters of UAVs for business applications are champing at the bit, impatiently waiting for new regulations that will make their enterprises more practical.

Until recently, recreational users of UAVs needed to follow only a few simple rules. UAVs must:

  • Fly within line of sight of the operator
  • Fly below 400 feet
  • Stay at least 5 miles from airports

One of the biggest problems facing the FAA is that recreational users do not know about these rules, or simply ignore them. The webinar was full of cautionary tales of UAVs endangering aircraft thousands of feet in the air, hampering firefighting efforts, and interfering with airport operations.

Under proposed rules (announced since the webinar), recreational operators will need to register any UAV that weighs between half a pound and 55 pounds. The low end of this weight range has created some backlash. As one person put it, “You want me to register a toy when we don’t even register guns?”


Government regulations are good for business

While the new recreational rules stirred controversy, it is the business regulations that entrepreneurs are anxiously awaiting (since AMREL makes Operator Control Units for unmanned systems, we are among those keeping a close eye on this). While some may grouse about anti-business bureaucracies, there are solid reasons why the process is proceeding at a glacial pace.

The FAA was never designed to regulate UAVs. It oversees manned aircraft, and does a pretty good job of it. On any given day, there are over 7,000 aircraft in American skies. Yet accidents are so rare that even a near-miss makes the news.

We may not like the intricate bureaucratic labyrinth that the FAA has woven around air travel, but it has demonstrated its value. Without invasive government regulations, people would have no confidence in air travel, and that industry would be a fraction of what it is today.

So, it should be no surprise that the FAA is taking a similarly heavy-handed regulatory approach to UAVs, especially since they have proven problematic. Not only have UAVs endangered aircraft, as mentioned above, but they have also rammed into skyscrapers, crashed into football games and tennis tournaments, interfered with police helicopters, and entered secure airspace, such as that over the White House.

The good news is that the FAA is looking to business to help solve these problems. Its advisory task force includes such industry luminaries as SpaceX and Amazon. With the help of these big players, the FAA will eventually comply with Congress’ mandate and issue regulations that integrate UAVs into the national airspace. When that happens, the Era of Business UAVs will have finally arrived.


I lived overseas for several years. When I came back to America, one of the differences that I noticed was that more men were seriously into cooking.

Not just any cooking either – manly cooking.  Guys were deep-frying whole turkeys and even tenderizing meats with explosives (“Well, sure it’s dangerous, but so what?”).

The video below demonstrates part of this manly cooking trend; it shows an Unmanned Aerial Vehicle (UAV AKA “drones”) cooking a turkey with a flame thrower.


The video’s creator once also posted footage of a UAV with a gun rigged to it. The Federal Aviation Administration (FAA) was not amused; attaching weapons to aircraft is expressly forbidden by the FAA.

However, it is an open question whether weaponized UAVs are protected by the Second Amendment. For a detailed discussion of this topic, see this US News article.

My next bumper sticker:

“When roasting turkeys with flame-throwing drones is outlawed,

only outlaws will roast turkeys with flame-throwing drones.”