Google will not develop artificial intelligence for use in weapons, surveillance that violates internationally accepted norms, or technologies where the risks substantially outweigh the benefits.
Machine learning is the ability of machines to receive data and learn for themselves without being programmed with rules.
Apart from using ML for its own products and research, Google is working with several partners to provide solutions for problems either too vast or too complex for humans to tackle alone.
“We believe AI can help tackle some of the most difficult social and environmental challenges of our times, and not just in computer science but in areas where you wouldn’t necessarily expect it, like healthcare, environmental conservation and agriculture,” Jeff Dean said in the keynote.
Product Manager for Google Health Lily Peng said the company’s AI ventures were helpful in the field of healthcare – primarily in lung cancer screening and breast cancer metastases detection.
“We believe that technology can have a big impact in medicine, helping democratise access to care, returning attention to patients and helping researchers make scientific discoveries,” she said.
Lung cancer results in over 1.7 million deaths per year and is the sixth most common cause of death globally.
Evidence has shown early detection offers the best chance of successful treatment; however, radiologists are often forced to search for minuscule signs of cancer across hundreds of 2D images captured during a single CT scan.
Google’s machine learning model can create a 3D image of the scans and search for subtle malignant tissue in the lungs – it can also factor in information from previous scans.
When using a single CT scan for diagnosis, Google’s model performed better than six radiologists in the company’s research: it detected five per cent more cancer cases while reducing false-positive exams by more than 11 per cent compared with unassisted radiologists.
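Conceptually, “creating a 3D image” means stacking a scan’s individual 2D slices into a single volume, so the model can consider context above and below each slice rather than judging images in isolation. A minimal NumPy sketch of that idea (the array shapes are illustrative, not Google’s actual pipeline):

```python
import numpy as np

# Hypothetical stand-in for hundreds of 2D slices from one CT scan:
# 300 slices, each 512x512 pixels.
slices = [np.zeros((512, 512), dtype=np.float32) for _ in range(300)]

# Stacking the slices yields one 3D volume the model can analyse,
# letting it search for subtle structure that spans multiple slices.
volume = np.stack(slices, axis=0)            # shape: (300, 512, 512)

# A previous scan can be supplied alongside the current one, so the
# model can factor in how tissue has changed between visits.
prior_volume = np.stack(slices, axis=0)
model_input = np.stack([volume, prior_volume], axis=0)
print(model_input.shape)                     # (2, 300, 512, 512)
```

The 3D representation is what lets a single model replace the slice-by-slice hunt described above.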
In breast cancer metastases detection, Google says its machine learning model can find 95 per cent of cancer lesions in pathology images – pathologists can generally only detect 73 per cent.
Humpback whale populations are currently listed as endangered as a result of whaling practices.
To give the at-risk marine species a better chance for survival, Google has partnered with the National Oceanic and Atmospheric Administration (NOAA) to create a solution.
The bio-acoustics project used 19 years’ worth of underwater audio data collected by NOAA to train Google’s neural network to identify the call of a humpback whale.
Product Manager at Google AI Julie Cattiau said machine learning is able to easily distinguish the sound of humpback whales from other similar sounds – something humans struggle to do.
“We started by turning the underwater audio data into a visual representation of the sound called a spectrogram, and then showed our algorithm many example spectrograms that were labelled with the correct species name,” Google explained.
“The more examples we can show it, the better our algorithm gets at automatically identifying those sounds.”
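The workflow Google describes – raw audio turned into a spectrogram, then paired with a species label for training – can be sketched with NumPy’s FFT. The window sizes, sample rate and the synthetic “call” below are illustrative assumptions, not details of Google’s classifier:

```python
import numpy as np

def spectrogram(audio, frame_len=256, hop=128):
    """Turn a 1-D audio signal into a spectrogram: a 2-D array of
    frequency magnitudes over time, the 'visual representation of
    the sound' the classifier is trained on."""
    frames = [audio[i:i + frame_len] * np.hanning(frame_len)
              for i in range(0, len(audio) - frame_len, hop)]
    # Magnitude of the FFT of each windowed frame -> (time, frequency)
    return np.abs(np.fft.rfft(np.stack(frames), axis=1))

# Illustrative underwater recording: one second of a 400 Hz tone
# standing in for a humpback call, sampled at 8 kHz.
sr = 8000
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 400 * t)

spec = spectrogram(audio)
# Each spectrogram is paired with the correct species name; many such
# labelled examples are what the algorithm learns from.
training_example = (spec, "humpback_whale")
print(spec.shape)
```

Showing the network more labelled spectrograms is what “the more examples we can show it” refers to: the classifier improves as the labelled training set grows.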
The machine learning program gives a better understanding of where humpback whales live and where they travel.
“In the future, we plan to use our classifier to help NOAA better understand humpback whales by identifying changes in breeding location or migration paths, and changes in relative abundance,” Google explained.
ALS is a neurodegenerative condition that can result in the inability to speak and move.
By collaborating with non-profit ALS organisations, Google has been recording the voices of people with the condition to optimise AI-based algorithms so that mobile phones and computers can transcribe the speech of people with impairments.
“The first step of our research effort is to ask volunteers to record voice samples that we can use to improve our speech recognition models. Once we have enough recordings from someone, our team builds a personalised communication system that works specifically for people who recorded their voice,” said Google AI Product Manager Julie Cattiau.
“Our AI algorithms currently aim to accommodate individuals who speak English and have impairments typically associated with ALS, but we believe that our research can be applied to larger groups of people and to different speech impairments.”
In addition to improving speech recognition, Google is also training personalised AI algorithms to detect sounds or gestures, which can then generate spoken commands for Google Home.
The tech giant showcased the potential in a video of an ALS patient using non-speech sounds to trigger smart home devices such as lights, and facial gestures to cheer during a sports game.
Software engineering manager at Google AI Sella Nevo has been working on a machine learning project that will better predict areas that will be hit by devastating floods.
“The reason we do this work is to be able to warn people and protect them… We’re working to give people even more information and alert them early,” he said.
Mr Nevo said flood forecasting is currently based on low-resolution elevation maps that are nearly two decades old, making it virtually impossible to accurately predict affected areas.
However, by using machine learning models combined with satellite imagery and data from government agencies, researchers have been able to develop the Flood Forecasting Initiative.
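The role of the elevation data is intuitive: water settles where the terrain is lowest, so the finer the elevation map, the finer the prediction of which areas go under. A deliberately simplified sketch of that idea (this toy threshold model is an assumption for illustration – the actual initiative combines terrain, satellite imagery and government data in learned models):

```python
import numpy as np

# Toy elevation map in metres; real maps cover whole river basins,
# and higher resolution means finer flood predictions.
elevation = np.array([
    [5.0, 4.0, 3.0, 2.0],
    [4.0, 3.0, 2.0, 1.0],
    [3.0, 2.0, 1.0, 0.5],
])

def flooded_cells(elevation, water_level):
    """Mark cells whose elevation falls below the forecast water level.
    A stale or coarse elevation map makes this mask wrong everywhere,
    which is why the old two-decade-old maps made accurate prediction
    virtually impossible."""
    return elevation < water_level

mask = flooded_cells(elevation, water_level=2.0)
print(int(mask.sum()))  # number of cells predicted to be inundated
```

Running simulations across many forecast water levels over such a grid, many thousands of times, is the shape of the work described in the pilot program below.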
Google launched a pilot program in India last year, as the country accounts for nearly 20 per cent of the world’s flood-related fatalities – 107,487 deaths were recorded as a result of heavy rains and floods between 1953 and 2017.
The pilot program ran hundreds of thousands of simulations on its machine learning models ahead of devastating floods in Patna, India last year.
It predicted the regions affected by the flood with an accuracy of over 90 per cent, with the tech giant alerting those at risk using notifications on smartphones.
SURELY IT CAN’T ALL BE GOOD
Google AI Lead Jeff Dean said projects on the company’s cloud services have restrictions, but admitted the tech giant reluctantly has to accept that some will take the open-source technology and use it for dubious purposes.
One possible example would be the whale tracking technology being used by illegal whalers.
“One of the things we decided when we open-sourced TensorFlow was to make it very flexible. Take it and do what you want with it,” he explained.
“I think there is an issue that one could use it to build higher-level machinery to do particular things that we might find not so great.”
The author of this article travelled to Tokyo as a guest of Google.
© Nine Digital Pty Ltd 2019