Exponential Technology Report – 13 November 2019

This is the weekly report covering this week's news stories about exponential technologies.


We are holding an Exponential Organisation event on 27 November 2019. If you are in Johannesburg, it would be great to have you. Apart from the talks, we will also have WITS showcasing their impressive robotic hand development, and Eden Labs will be bringing VR headsets to show us their work. The cost is R495 and you can book your ticket by searching for the Exponential Organisations Event on Quicket. I will have a link in my show notes as well.


A reminder that the exponential technologies fall into different categories: 3D Printing and Digital Fabrication, Artificial Intelligence (AI), Augmented and Virtual Reality (AR, VR), Autonomous Vehicles, Blockchain, Data Science, Digital Biology and Biotechnology, Digital Medicine, Drone Technology, Internet of Things, Nanotechnology, Networks and Computing Systems, Quantum Computing and Robotics.


For this report, I go through the exponential technologies in alphabetical order. All the links to the articles can be found by clicking on the images. I hope you enjoy it!



3D Printing and Digital Fabrication


Article: “How do you like your beef… old-style cow or 3D-printed?” - www.theguardian.com


After the success of the Greggs vegan sausage roll and the juicy-yet-meatless Impossible Burger, the next new food sensation is coming to a plate near you: 3D-printed steaks and chicken thighs.


Printed meat could be on European restaurant menus from next year as Israeli and Spanish firms serve up realistic beef and chicken produced from plant protein. And, within a few years, the printers are likely to be available to buy so that consumers can produce their own at home.


Layers of material are built up by 3D printers until there is a solid object conforming to very precise specifications. The meat can be produced either from vegetable matter or from animal cells grown in a lab. The printer uses these raw ingredients, which come in a Nespresso-style cartridge, to build up a steak or chicken fillet that tastes like the real thing.

Eshchar Ben-Shitrit, co-founder and CEO of Israeli firm Redefine Meat, said switching to printed meat would have huge ecological benefits. “The biggest reason for going to alternative meat is because of the future of our planet,” he said. “We love meat but we don’t have enough resources for it. Cows require a lot of water, a lot of food and a lot of land but we don’t have enough of any of these. We can recycle, drive electric cars, we can shower less, but these changes can’t compete with reducing consumption by one hamburger per week.”


Reducing beef production would result in a huge reduction in CO2 emissions and far less clearance of wild countryside for grazing land. Other meats, such as pork and fish, will soon be added to the menu, reducing the need for pig-rearing or fishing.


Redefine Meat will pilot its plant-based meat in restaurants throughout Europe early in 2020, so British diners could have next year’s Christmas lunches printed for them. It has already served hundreds of people in Israel and conducted tastings in Europe. The products will initially be more expensive than traditional meat – the firm is aiming for a price point around £28 per kg, twice the cost of British supermarket sirloin steak – but this will come down over time and should, eventually, be cheaper than traditional meat.


The technology is being closely watched by the food industry. Emma Lake, news editor of the Caterer, said: “The potential market for 3D-printed meat could be substantial; we’ve already seen dramatic growth in vegan meat imitations in response to more consumers cutting down on their consumption of animal products for environmental and health reasons. The launch of the Greggs vegan sausage roll in January demonstrated the interest in such products.”



Artificial Intelligence (AI)


Article: “Artificial Intelligence Converts 2D Images Into 3D Using Deep Learning” – www.scitechdaily.com


A University of California, Los Angeles research team has devised a technique that extends the capabilities of fluorescence microscopy, which allows scientists to precisely label parts of living cells and tissue with dyes that glow under special lighting. The researchers use artificial intelligence to turn two-dimensional images into stacks of virtual three-dimensional slices showing activity inside organisms.


“This is a very powerful new method that is enabled by deep learning to perform 3D imaging of live specimens, with the least exposure to light, which can be toxic to samples,” said senior author Aydogan Ozcan, UCLA chancellor’s professor of electrical and computer engineering and associate director of the California NanoSystems Institute at UCLA.


In addition to sparing specimens from potentially damaging doses of light, this system could offer biologists and life science researchers a new tool for 3D imaging that is simpler, faster and much less expensive than current methods. The opportunity to correct for aberrations may allow scientists studying live organisms to collect data from images that otherwise would be unusable. Investigators could also gain virtual access to expensive and complicated equipment.


This conversion is valuable because the confocal microscope creates sharper, higher-contrast images than the wide-field microscope, while the wide-field microscope captures images at less expense and with fewer technical requirements.
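The article does not describe the UCLA network itself, but the general idea of learning a mapping from a single 2D wide-field frame to a stack of virtual depth slices can be sketched in a few lines of PyTorch. Everything below (the class name, the layer sizes and the number of output slices) is a hypothetical illustration, not the researchers' actual model:

```python
import torch
import torch.nn as nn

class WideField2DTo3D(nn.Module):
    """Hypothetical CNN that maps one 2D wide-field image to a stack of
    virtual depth slices (illustrative only, not the UCLA team's model)."""
    def __init__(self, num_slices: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, num_slices, kernel_size=3, padding=1),
        )

    def forward(self, x):       # x: (batch, 1, H, W) wide-field frame
        return self.net(x)      # (batch, num_slices, H, W) virtual 3D stack

model = WideField2DTo3D(num_slices=32)
image = torch.rand(1, 1, 256, 256)   # placeholder input frame
stack = model(image)
print(stack.shape)                   # torch.Size([1, 32, 256, 256])
```

In practice such a network would be trained on pairs of wide-field images and matched confocal stacks, so that the cheap 2D measurement learns to stand in for the expensive 3D one.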


Artificial Intelligence (AI)


Article: “13 Mind-Blowing Things Artificial Intelligence Can Already Do Today” – www.forbes.com


By now, most of us are aware of artificial intelligence (AI) being an increasingly present part of our everyday lives. But, many of us would be quite surprised to learn of some of the skills AI already knows how to do. Here are 13 mind-blowing skills artificial intelligence can already do today.


The thirteen things are:

1. Read: Is one of your wishes to save time by only having to pay attention to the salient points of a communication? Your wish has come true with the artificial intelligence-powered SummarizeBot (a summarization sketch follows this list).

2. Write: Would you believe that along with professional journalists, news organizations such as The New York Times, Washington Post, Reuters, and more rely on artificial intelligence to write?

3. See: Machine vision is when computers can “see” the world, analyze visual data, and make decisions about it.

4. Hear and understand: Did you know artificial intelligence is able to detect gunshots, analyze the sound, and then alert relevant agencies?

5. Speak: While it’s helpful (and fun) to have Alexa and Google Maps respond to your queries and give you directions, Google Duplex takes it one step further and uses AI to schedule appointments and complete tasks over the phone in very conversational language.

6. Smell: There are artificial intelligence researchers who are currently developing AI models that will be able to detect illnesses—just by smelling a human's breath.

7. Touch: Using sensors and cameras, there's a robot that can identify "supermarket ripe" raspberries and even pick them and place them into a basket!

8. Move: Artificial intelligence propels all kinds of movement from autonomous vehicles to drones to robots.

9. Understand emotions: Market research is being aided by AI tools that track a person’s emotions as they watch videos.

10. Play games: It's not all serious business with artificial intelligence—it can learn to play games such as chess, Go, and poker.

11. Debate: IBM’s Project Debater showed us that artificial intelligence can even be successful at debating humans on complex subjects.

12. Create: Artificial intelligence can even master creative processes, including making visual art, writing poetry, composing music, and taking photographs.

13. Read your mind: This is truly mind-boggling—AI that can read your mind!
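Item 1 above, machine reading and summarization, is the easiest of these to try at home. The sketch below uses the open-source Hugging Face transformers library rather than SummarizeBot itself (whose API the article does not describe); the passage being summarized is borrowed from the 3D-printed meat story earlier in this report:

```python
# Minimal summarization sketch using the Hugging Face transformers library.
# The first run downloads a default summarization model.
from transformers import pipeline

summarizer = pipeline("summarization")

article = (
    "Printed meat could be on European restaurant menus from next year as "
    "Israeli and Spanish firms serve up realistic beef and chicken produced "
    "from plant protein. Within a few years, the printers are likely to be "
    "available to buy so that consumers can produce their own at home."
)

summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```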



Augmented and Virtual Reality (AR, VR)


Article: “Apple reportedly plans 2022 release for first AR headset, followed by AR glasses in 2023” - www.theverge.com


Apple plans to release its first augmented reality headset in 2022, followed by a smaller device — a pair of AR glasses — in 2023, according to a new report from The Information.


Apple’s entry into the world of augmented reality has long been rumored, with many in the tech world seeing AR and VR as the next big platforms after mobile. But an exact entry date has been unclear, with some analysts suggesting a 2020 launch. Citing internal presentations made at Apple’s headquarters, The Information pushes this rumored timeline back to 2022, presumably due to difficulties in developing the technology.


As well as the new timeline, The Information’s report offers new detail about Apple’s AR headset, codenamed N301. The device supposedly resembles a slimmer version of the Oculus Quest, a virtual reality headset released in May. It has AR and VR capabilities, uses external cameras to map the user’s surroundings (including the outlines of people, furniture, and rooms), and has a high-resolution display to show information and blend virtual objects with the real world. Employees were told that the company would be reaching out to developers to build software for the headset from 2021.


As the smartphone market matures, Apple and many other tech companies are looking to virtual and augmented reality as the next big tech platforms. The iPhone maker has been building up resources in this area for years, buying tech from smaller companies and dedicating more employees to the project. Rivals like Facebook, Microsoft, and Google have also been investing heavily in this area through projects like HoloLens and Oculus.


However, virtual and augmented reality have proved tough ground for development, with bulky hardware and disappointing user experiences stymieing growth. Just last month Google effectively ended its Daydream experiment, which used phones to power VR headsets, citing a lack of developer adoption and “decreasing usage” from customers. In this context it seems wise for Apple to bide its time rather than rush to market.



Autonomous Vehicles


Article: “Alphabet’s self-driving car project Waymo is shuttering its Austin operations” - www.cnbc.com


Waymo has been facing challenges in commercializing self-driving cars. Morgan Stanley cut its valuation of Waymo by 40% last month, from $175 billion to $105 billion, concluding that the industry is moving toward commercialization more slowly than expected and noting that Waymo still relies on human safety drivers, which CNBC reported in August.


“Waymo is growing our investment and teams in both the Detroit and Phoenix areas, and we want to bring our operations teams together in these locations to best support our riders and our ride-hailing service,” a Waymo spokesperson said in a statement sent Friday to CNBC. “As a result we’ve decided to relocate all Austin positions to Detroit and Phoenix. We are working closely with employees, offering them the opportunity to transfer, as well as with our staffing partners to ensure everyone receives transition pay and relocation assistance.”



Blockchain


Article: “Bitcoin And Blockchain Job Hunts Drop 53% Over Past Year” – www.forbes.com


Over the past several years, the cryptocurrency and blockchain space has seen immense growth, not only regarding digital asset prices, but also in the many startups and existing businesses that are now involved in the industry and its underlying technology.


Numbers from the past year, however, show decreased interest from job seekers even though the number of jobs in the crypto and blockchain industry has grown, according to recent data gathered by popular employment site Indeed.


From September 2018 to September 2019, “the share of cryptocurrency job postings per million on Indeed have increased by 26%, while the share of job searches per million have decreased by 53%,” Indeed detailed in a report provided to Forbes.


These numbers are also down from the figures posted one year prior. September 2017 to September 2018 tallied a 214% growth in crypto and blockchain jobs on Indeed with a 14% growth in related job seeker searches, according to an email from an Indeed representative.


Drone Technology


Article: “Drones will swarm our skies when these 3 things happen” - www.cnet.com


In the not-too-distant future, drones will crowd the skies. Quadcopters, hexacopters, octocopters, svelte fixed-wing drones that look like miniature airplanes, hulking aircraft designed to lift 500 pounds, self-piloting Boeing air taxis and DJI's teensy Mavic Mini flying camera -- they'll all be competing for airspace.

NASA Administrator Jim Bridenstine loves the idea. At the Commercial UAV Expo drone conference late last month in Las Vegas, Bridenstine challenged the industry to get tens of thousands of daily drone flights over at least one US city by 2028. He also set out several "grand challenge" milestones to get us there, including a 2022 test flight with the cargo weight equivalent of at least one human passenger in simulated urban airspace.


Here are the next three steps the industry has plotted to make the dream a reality.


Step 1: Saving lives with drones

It's a lot harder to say no to drones when lives are on the line, so drone companies are eagerly pursuing medicine, search and rescue, firefighting and emergency situations.


Step 2: New airspace rules for drones

To make drones commonplace, we need a system to keep them from colliding and falling out of the sky. Today's air traffic control system, designed for a small number of big aircraft, is completely unsuited.
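To make the scale of that problem concrete, here is a toy Python sketch of the kind of automated separation check an unmanned-traffic-management system would have to run continuously across thousands of aircraft. The drone IDs, positions and 50 m separation threshold are made-up illustrations, not any real UTM rule or API:

```python
# Toy separation check: flag any pair of drones flying too close together.
from math import dist

MIN_SEPARATION_M = 50.0  # assumed minimum separation for this illustration

def conflicts(positions):
    """Return pairs of drone IDs whose 3D positions (x, y, z in metres)
    are closer than MIN_SEPARATION_M."""
    ids = list(positions)
    return [
        (a, b)
        for i, a in enumerate(ids)
        for b in ids[i + 1:]
        if dist(positions[a], positions[b]) < MIN_SEPARATION_M
    ]

fleet = {
    "delivery-01": (0.0, 0.0, 120.0),
    "survey-07": (30.0, 20.0, 118.0),    # too close to delivery-01
    "medevac-02": (500.0, 400.0, 90.0),
}
print(conflicts(fleet))  # [('delivery-01', 'survey-07')]
```

A real system would also have to handle flight plans, priorities (a medevac drone outranks a burrito), weather and lost-link behaviour, which is why new airspace rules are the bottleneck.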


Step 3: Delivering the goods

Once we get used to aerial shipments of blood, prescription drugs, organs for transplant and antivenom for snakebites, those flying burritos could start to look more tempting.



Internet of Things


Article: “Catching Up On The Latest In IoT Intelligence, 2019” – www.forbes.com


These insights, and many others, come from Dresner Advisory Associates’ 2019 edition of its IoT Intelligence® Market Study, now in its fifth year of publication. The internet of things, or IoT, is defined as the network of physical objects, or “things”, embedded with electronics, software, sensors, and connectivity that enable those objects to collect and exchange data.


Key insights from the study include the following:

  • Marketing & Sales and R&D place the highest levels of importance on IoT today, significantly above IT, executive management, and finance.

  • Manufacturing companies consider IoT technologies more critical to their operations than companies in any other industry included in the study.

  • IoT’s importance increases with enterprise size, reflecting the greater number of digital initiatives, new products, services, and business models that large-scale organisations can take on.

  • Enterprises are prioritizing investments in data supply chains and IoT analytics this year.

  • Those championing IoT adoption, or IoT Advocates, have significantly more expertise and insight into how best to apply big data, data mining, IoT, and IT analytics to enterprise challenges than the overall sample of respondents.



Quantum Computing


Article: “A natural biomolecule has been measured acting like a quantum wave for the first time” – www.technologyreview.com


One of the great counterintuitive puzzles of quantum mechanics is wave-particle duality. This is the phenomenon in which objects behave both like particles and like waves.


Numerous experiments have shown that a single particle—an electron or a photon, for example—can interfere with itself, like a wave. The double slit experiment, in which a particle passes through two slits at the same time, is a famous demonstration.


And because all objects are fundamentally quantum in nature, they all have an associated wavelength. So in principle, macroscopic objects should show this kind of wave-particle duality too, given a sensitive enough experiment.


Today, that question has an answer thanks to the work of Armin Shayeghi at the University of Vienna and a few colleagues, who, for the first time, have demonstrated quantum interference in molecules of gramicidin, a natural antibiotic made up of 15 amino acids. Their work paves the way for the study of the quantum properties of biomolecules and sets the scene for experiments that exploit the quantum nature of enzymes, DNA, and perhaps one day simple life forms such as viruses.
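To get a feel for the scale involved, the de Broglie relation gives the wavelength associated with any moving object. The figures below are purely illustrative assumptions (a gramicidin-sized mass of roughly 1,900 atomic mass units and a beam velocity of about 500 m/s), not the values reported by the Vienna team:

```latex
% Illustrative de Broglie wavelength for a gramicidin-sized molecule.
% Assumed values: m ~ 1900 u (about 3.2e-24 kg), v ~ 500 m/s.
\[
  \lambda = \frac{h}{m v}
          \approx \frac{6.63 \times 10^{-34}\,\text{J·s}}
                       {(3.2 \times 10^{-24}\,\text{kg})(500\,\text{m/s})}
          \approx 4 \times 10^{-13}\,\text{m}
\]
% i.e. well under a picometre, thousands of times smaller than the molecule
% itself, which is why such interference is so hard to observe.
```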


“The successful realization of quantum optics with this polypeptide as a prototypical biomolecule paves the way for quantum assisted molecule metrology and in particular the optical spectroscopy of a large class of biologically relevant molecules,” say the researchers.



Robotics


Article: “Soft robots of the future may depend on new materials that sense damage and self-heal” – www.thesouthafrican.com


Now Boston Dynamics’ nimble four-legged robot, Spot, is available for companies to lease to carry out various real-world jobs, a sign of just how common interactions between humans and machines have become in recent years.


And while Spot is versatile and robust, it’s what society thinks of as a traditional robot, a mix of metal and hard plastic.


Many researchers are convinced that soft robots capable of safe physical interaction with people – for example, providing in-home assistance by gripping and moving objects – will join hard robots to populate the future.


Soft robotics and wearable computers, both technologies that are safe for human interaction, will demand new types of materials that are soft and stretchable and perform a wide variety of functions.


The people at the Soft Machines Lab at Carnegie Mellon University develop these multifunctional materials.


Along with collaborators, the lab has recently developed one such material that uniquely combines the properties of metals, soft rubbers and shape memory materials.

These soft multifunctional materials, as the researchers call them, conduct electricity, detect damage and heal themselves. They can also sense touch and change their shape and stiffness in response to electrical stimulation, like an artificial muscle.


This idea that the material is the machine can be captured in the concept of embodied intelligence. This term is usually used to describe a system of materials that are interconnected, like tendons in the knee.


When running, tendons can stretch and relax to adapt each time the foot strikes the ground, without the need for any neural control.


It’s also possible to think of embodied intelligence in a single material – one that can sense, process and respond to its environment without embedded electronic devices like sensors and processing units.


A simple example is rubber. At the molecular level, rubber contains strings of molecules that are coiled up and linked together.


Stretching or compressing rubber moves and uncoils the strings, but their links force the rubber to bounce back to its original position without permanently deforming. The ability for rubber to “know” its original shape is contained within the material structure.



That brings this week’s Exponential Technology Report to a close. We hope that you have found it both interesting and useful. If you would like to subscribe to our weekly update, visit www.ideastorm.co.za.
