GLIMPSES OF THE FUTURE – APRIL 2020

COVID-19:

We are living through a major inflection point in history. 

This means that when “normal life” resumes, the future won’t look or feel much like “the normal” we have been used to.

Social and economic life will eventually recover and, in time, will thrive again, but many things will have changed.

We will be living and working in a new landscape.

It is too soon to know what these changes will include and how far reaching they will be.

Likely candidates for change include a massive, permanent shift to online working, education and healthcare delivery.

There will be a complete rethinking of globalisation and supply chains, with much manufacturing being re-shored, and a massive increase in robot/automation investment and AI deployment.

 As Lenin observed in 1919: “There are decades when nothing happens and there are weeks when decades happen.”

 

Computers That Can Replace Sniffer Dogs

 Computers can already boast superhuman sensory abilities in sight and hearing, but smell has been much more difficult. The human nose isn’t a particularly good one compared to the rest of the animal kingdom, but it’s still a complex piece of machinery, with around 450 different types of olfactory receptors. Researchers are now starting to give this capability to computers.

One such effort is underway at Intel, where a neuromorphic computing group has been working with olfactory neurophysiologists from Cornell University to see whether Loihi, Intel’s neuromorphic research chip, can mimic the way the brain analyses and categorizes olfactory data.

The team gave Loihi access to data from 72 chemical sensors sitting in a wind tunnel as 10 different odors, including ammonia, acetone and methane, were blown through. Sure enough, Loihi was able to build neural representations of each of these smells and identify them again, even in the presence of what the researchers describe as “strong background interferents.”
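For a sense of what that classification task looks like in code, here is a minimal sketch in Python. It is not Intel’s spiking-network implementation for Loihi; it stands in a conventional nearest-centroid classifier on synthetic data, keeping only the 72-sensor, 10-odour shape of the problem described above.

```python
# Minimal sketch of the task described above: 10 odours, each represented by
# readings from 72 chemical sensors. This is NOT Intel's spiking-network code
# for Loihi -- just a nearest-centroid classifier on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
N_SENSORS, N_ODOURS, SAMPLES = 72, 10, 50   # sensor/odour counts from the article; sample size assumed

# Synthetic "training" data: each odour gets a characteristic sensor pattern.
prototypes = rng.normal(size=(N_ODOURS, N_SENSORS))
train = prototypes[:, None, :] + 0.1 * rng.normal(size=(N_ODOURS, SAMPLES, N_SENSORS))
centroids = train.mean(axis=1)              # one learned "representation" per odour

def classify(reading: np.ndarray) -> int:
    """Return the index of the odour whose centroid is closest to the reading."""
    return int(np.argmin(np.linalg.norm(centroids - reading, axis=1)))

# A noisy test reading with a strong background interferent mixed in.
test = prototypes[3] + 0.1 * rng.normal(size=N_SENSORS) + 0.5 * rng.normal(size=N_SENSORS)
print("predicted odour:", classify(test), "(true odour: 3)")
```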

Loihi’s approach is very different, the team says, to how your home smoke and carbon monoxide detectors work – these devices can identify specific airborne molecules and make a beep, but they can’t learn or categorize new smells in any way.

This would appear to be a significant step forward in the race toward a true multipurpose “electronic nose” that can rival, and indeed one day maybe surpass, the abilities of the human nose, or even the canine nose, in picking up smells and instantly working out what they are.

The use cases range from the detection of dangerous chemicals and explosives, to the detection of drugs and contraband, to the identification and classification of wines, to quality control in factories. There are even some diseases that can be diagnosed by smell.

Google Invents Fish-Face Recognition

The advanced research lab at Google’s parent company, Alphabet, is taking aim at an unlikely new target for its technologies: fish.

In an attempt to boost the use of fish farms, and reduce the world’s consumption of wild fish and meat, Alphabet’s X Development has invented a system that will eventually recognise and monitor every individual fish in farms that hold hundreds of thousands.

The three-year-old project, dubbed Tidal, is working with farms in Europe and Asia. It pairs underwater cameras with AI techniques such as computer vision to track species including salmon and yellowtail.
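As a rough illustration of the “fish-face recognition” idea, the sketch below matches each detected fish against a gallery of previously seen individuals by comparing feature embeddings. The embedding function, threshold and data structures are placeholder assumptions of ours, not details of Tidal’s pipeline.

```python
# Sketch of per-fish recognition (not Tidal's actual pipeline): embed each
# detected fish crop into a feature vector, then match it against a gallery
# of previously seen fish by similarity.
import numpy as np

def embed(fish_crop: np.ndarray) -> np.ndarray:
    """Placeholder embedding: a real system would run a trained CNN here."""
    vec = fish_crop.astype(float).mean(axis=(0, 1))   # crude per-channel colour statistics
    return vec / (np.linalg.norm(vec) + 1e-9)

gallery = {}               # fish_id -> embedding vector
MATCH_THRESHOLD = 0.9      # assumed cosine-similarity cutoff

def identify(fish_crop: np.ndarray) -> int:
    """Return an existing fish ID if the crop matches the gallery, else register a new one."""
    query = embed(fish_crop)
    if gallery:
        best_id, best_sim = max(
            ((fid, float(query @ emb)) for fid, emb in gallery.items()),
            key=lambda t: t[1],
        )
        if best_sim >= MATCH_THRESHOLD:
            return best_id
    new_id = len(gallery)
    gallery[new_id] = query
    return new_id
```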

The hope, according to Astro Teller, the director of X, is to reduce the world’s dependence on land-based proteins, such as beef, and to free the oceans from damaging fishing practices.

Helicopter Drone Drops Bombs On Forest Fires

One of the good things about drones is that they can safely be flown in conditions that would prove hazardous for crewed aircraft. That’s where the JC260 unmanned helicopter comes in: it’s designed to fight forest fires.

Created by Chinese manufacturer QilingUAV, the JC260 can be equipped with two of the company’s retardant-filled “fire extinguishing bombs.” Dropped separately or in unison, each bomb can reportedly cover 50 cubic meters (1,766 cu ft) of burning forest.

Lift is provided by two sets of counter-rotating rotor blades, measuring 3.6 m (11.8 ft) in diameter. These are powered by two 34-hp water-cooled gasoline engines, taking the aircraft to a claimed cruising speed of 100 km/h (62 mph). One tank of gas should be good for a flight time of three to four hours.
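Taking the quoted figures at face value, a quick back-of-envelope calculation gives the sort of distance a single tank might allow; the arithmetic below is ours, not the manufacturer’s.

```python
# Back-of-envelope range check from the figures quoted above.
CRUISE_KMH = 100          # claimed cruising speed
ENDURANCE_H = (3, 4)      # claimed flight time per tank of gas

for hours in ENDURANCE_H:
    print(f"{hours} h at {CRUISE_KMH} km/h -> roughly {CRUISE_KMH * hours} km of flying")
# -> roughly 300-400 km per tank, ignoring wind, climb and hover time
```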

A ground-based operator remotely pilots the drone in real time, based on output from its onboard cameras. A simple one-click system on the remote is used to drop the bombs.

Samsung Announces A New Long-range Solid-State Battery

At 50 percent smaller by volume than a typical lithium-ion battery, Samsung’s prototype solid-state pouch cells could enable 500-mile electric car ranges and cycle lives of over 1,000 charges in a much safer package.

The drive towards solid-state is one of the key fronts in the battle to break through to the next generation of batteries that will power our vehicles, aircraft, devices and homes in the coming decades, provided the coronavirus doesn’t send us back to using sharp rocks as tools.

Where current-gen lithium-ion batteries use liquid electrolytes, through which lithium ions shuttle between the cathode and anode every time you charge or discharge the battery, solid-state batteries replace that liquid with a solid electrolyte that conducts the ions instead.

Eliminating the liquid electrolyte not only allows for much more dense and compact batteries with much higher capacity by volume, it also deals with heat much better. Solid-state batteries will thus require less heat evacuation equipment, meaning even less weight and bulk for an electric car to carry around, and a longer lifespan. They also don’t explode or catch fire, which is a rare but deal-breaking issue with current technology.
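To see roughly what “50 percent smaller by volume” buys, the short calculation below doubles an assumed baseline volumetric energy density and keeps the pack volume fixed. All of the baseline figures are illustrative assumptions, not Samsung’s published numbers.

```python
# Rough arithmetic behind the "50 percent smaller" claim (our illustration,
# using assumed baseline figures rather than Samsung's published numbers).
BASELINE_DENSITY_WH_PER_L = 500      # assumed volumetric energy density of a current li-ion pack
PACK_VOLUME_L = 150                  # assumed pack volume in a mid-size EV
EFFICIENCY_MI_PER_KWH = 3.5          # assumed vehicle efficiency

solid_state_density = BASELINE_DENSITY_WH_PER_L * 2   # half the volume for the same energy

for name, density in [("li-ion", BASELINE_DENSITY_WH_PER_L), ("solid-state", solid_state_density)]:
    energy_kwh = density * PACK_VOLUME_L / 1000
    print(f"{name:11s}: {energy_kwh:.0f} kWh -> about {energy_kwh * EFFICIENCY_MI_PER_KWH:.0f} miles")
```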

Are You Ready For 3-D Car Dashboards?

The instrument panel is about to start jumping out of car dashboards. Tier-one automotive supplier Continental has built a glasses-free, “auto-stereoscopic” 3D car display, and it’s about to debut on the new Genesis GV80 SUV.

The system uses eye-tracking technology to pinpoint the driver’s eye position and angles a set of slanted “parallax barrier” slats within the display such that your left and right eyes receive different images. In this way, a stereoscopic 3D image can be built up for a single viewer and things can begin to appear to rise out of the dash or sink back into it.
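The geometry behind a two-view parallax barrier can be captured in a few lines. The sketch below uses standard similar-triangles relations with assumed values for eye separation, viewing distance and pixel pitch; it illustrates the principle rather than Continental’s actual design.

```python
# Generic two-view parallax-barrier geometry (an illustration of the principle,
# not Continental's design). Given eye separation e, viewing distance d and the
# display's pixel pitch p, similar triangles give the barrier-to-display gap g
# and the slit pitch b that send alternating pixel columns to each eye.
EYE_SEPARATION_MM = 63.0     # assumed average interpupillary distance
VIEW_DISTANCE_MM = 750.0     # assumed distance from the driver's eyes to the display
PIXEL_PITCH_MM = 0.06        # assumed pixel pitch of the display

gap = PIXEL_PITCH_MM * VIEW_DISTANCE_MM / EYE_SEPARATION_MM
barrier_pitch = 2 * PIXEL_PITCH_MM * VIEW_DISTANCE_MM / (VIEW_DISTANCE_MM + gap)

print(f"barrier sits {gap:.3f} mm in front of the pixels")
print(f"slit pitch {barrier_pitch:.4f} mm (just under two pixel columns)")
# Eye tracking lets the system shift which columns carry the left/right image
# as the driver's head moves, keeping each eye in the correct viewing zone.
```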

Why? Well, Continental says it’s in the name of safety. Modern cars have so many automated features, warnings and idiot lights that important things can get lost in a barrage of information. So high-priority warnings will literally start jumping out at you. Mind you, Continental doesn’t want you staring at this stuff – attention monitoring is also built in to spot and help correct distraction and fatigue.

Liquid Metal That Floats On Water

 A liquid metal alloy less dense than water has been made by injecting the material with glass beads – and it could be used to make lightweight exoskeletons or transformable robots.

Like mercury, which has the lowest melting point of pure metals at -38.8°C, liquid metal alloys don’t solidify at room temperature. They are also eutectic, meaning that they melt at a lower temperature than the individual melting points of the metals they are made from.

Jing Liu at Tsinghua University in China and his colleagues have created such a material by mixing pure gallium and indium to create a liquid metal alloy with a melting point of 15.7°C.

To decrease its density, they stirred tiny glass bubbles filled with air into the liquid. The loose beads, which were 75 micrometres in diameter or smaller, clustered together in the mixture. Oxygen mixes in with the liquid metal, which helps the glass beads stay suspended, says Liu.
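A simple rule-of-mixtures estimate shows how large a bubble fraction is needed to make such a dense metal float; the densities used below are assumed illustrative values rather than figures from the paper.

```python
# Rule-of-mixtures estimate of the glass-bubble fraction needed to make the
# alloy float. Densities below are assumed illustrative values, not figures
# from the paper.
RHO_ALLOY = 6.25      # g/cm^3, typical of a gallium-indium eutectic (assumed)
RHO_BUBBLE = 0.20     # g/cm^3, effective density of an air-filled glass bubble (assumed)
RHO_WATER = 1.00      # g/cm^3

# Composite density: rho = f*rho_bubble + (1 - f)*rho_alloy, where f is the
# bubble volume fraction. Solve rho < rho_water for f.
f_min = (RHO_ALLOY - RHO_WATER) / (RHO_ALLOY - RHO_BUBBLE)
print(f"bubbles must fill at least {f_min:.0%} of the volume for the mix to float")
```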

Satellite Cameras Can Now See Through Clouds

Britain’s gloomy weather has at last met its match, depriving farmers of one reason to complain when clouds appear on the horizon.

Scientists have created a device that can use satellites to accurately look through clouds to map what is below for the first time, allowing farmers to study their crops over time from above. The development will also allow the authorities to monitor coastal erosion and flooding.

Visible and infrared light cannot penetrate cloud cover. Now, however, a team of astrophysicists at the University of Hertfordshire has built a system that, using artificial intelligence and a supercomputer, can predict what the land below looks like as if there were a clear blue sky.

The team adapted techniques they use to study distant galaxies by measuring reflected light. They relied on radar imaging from satellites, which can penetrate clouds. These radio waves bounce off the surface of the Earth, and the scientists measure how strongly they are reflected to tell what they are looking at, since the waves reflect differently depending on the type of surface they hit.
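In spirit, the approach amounts to learning a mapping from radar backscatter, which passes through cloud, to the optical view available on clear days, and then applying that mapping when clouds block the optical sensors. The sketch below illustrates that general idea with a generic regression model and synthetic data; it is not the Hertfordshire team’s actual pipeline.

```python
# Minimal sketch of the general idea: learn a mapping from radar backscatter
# (which sees through cloud) to optical-style reflectance using days when both
# views are available, then apply it on cloudy days. The model choice and
# synthetic data are our illustration, not the Hertfordshire pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Training pairs from cloud-free days: radar features -> optical reflectance.
radar_clear = rng.uniform(size=(1000, 4))                 # e.g. backscatter in several polarisations
optical_clear = radar_clear @ rng.uniform(size=(4, 3))    # synthetic stand-in for the optical bands

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(radar_clear, optical_clear)

# On a cloudy day only the radar view exists; predict what the ground would look like.
radar_cloudy = rng.uniform(size=(5, 4))
predicted_optical = model.predict(radar_cloudy)
print(predicted_optical.round(2))
```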

 
