Here we explore some of the main AI trends predicted by experts in the field. If they come to pass, 2020 should see some very exciting developments indeed.
What are the next big technologies?
According to sources like Forbes, some of the next "big things" in technology include, but are not limited to:
- Blockchain As A Service
- AI-Led Automation
- Machine Learning
- Enterprise Content Management
- AI For The Back Office
- Quantum Computing AI Applications
- Mainstreamed IoT
What are some of the most exciting AI trends?
- The use of AI to make healthcare more accurate and less costly
- Greater attention paid to explainability and trust
- AI becoming less data-hungry
- Improved accuracy and efficiency of neural networks
- Automated AI development
- Expanded use of AI in manufacturing
- Geopolitical implications for the uses of AI
What AI trends should you watch in 2020?
Further to the above, here are some more AI trends to look out for in 2020.
1. Computer Graphics will greatly benefit from AI
One trend to watch in 2020 will be advancements in the use of AI in computer-generated graphics. This is especially true for more photorealistic effects like creating high-fidelity environments, vehicles, and characters in films and games.
Recreating on screen a realistic copy of metal, the dull gloss of wood or the skin of a grape is normally a very time-consuming process. It also tends to need a lot of experience and patience from a human artist.
Various researchers are already developing new methods of getting AI to do the heavy lifting involved in creating complex graphics. NVIDIA, for example, has been working on this for several years.
The company is using AI to improve techniques like ray tracing and rasterization, creating cheaper and quicker methods of rendering hyper-realistic graphics in computer games.
Researchers in Vienna are also working on methods of partially, or even fully, automating the process under the supervision of an artist. This involves the use of neural networks and machine learning to take prompts from a creator to generate sample images for their approval.
2. Deepfakes will only get better, er, worse
Deepfakes are another area that has seen massive advancement in recent years. 2019 saw a plethora of (thankfully humorous) deepfakes that went viral across social media networks.
But this technology will only get more sophisticated as time goes by. This opens the door for some very worrying repercussions which could potentially damage or destroy reputations in the real world.
With deepfakes already becoming very hard to distinguish from real video, how will we be able to tell if anything is fake in the future? This is very important, as deepfakes could readily be used to spread political disinformation, commit corporate sabotage, or even carry out cyberbullying.
Google and Facebook have been attempting to get out ahead of the potential negative aspects by releasing thousands of deepfake videos to train AIs to detect them. Unfortunately, even AI has been stumped at times.
3. Predictive text should get better and better
Predictive text has been around for some time now, but by combining it with AI we may reach a point where the AI knows what you want to write before you do. "Smart" predictive text is already a feature of programs like Gmail, for example.
If used correctly, this could help users speed up their writing significantly, and could be especially useful for those with physical conditions that make typing difficult. Of course, many people will find themselves typing out the full sentence anyway, even if the AI correctly predicted their intentions.
How this will develop in 2020 is anyone's guess, but it seems predictive text may become an ever-increasing part of our lives.
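At its simplest, the idea behind predictive text can be sketched with a bigram frequency model: for each word, remember which words most often follow it, and suggest the most frequent one. The sketch below is purely illustrative; real systems like Gmail's rely on large neural language models, not frequency counts, and the tiny corpus here is invented for the example.

```python
from collections import Counter, defaultdict

def train_bigram_model(text):
    """For each word, count which words tend to follow it."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(model, word):
    """Suggest the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# A toy "sent email" history to learn from (hypothetical data).
corpus = (
    "thank you for your email "
    "thank you for your time "
    "thank you for the update"
)
model = train_bigram_model(corpus)
print(predict_next(model, "thank"))  # → "you"
print(predict_next(model, "for"))    # → "your"
```

Even this crude model captures why the feature feels uncanny: frequent phrases are highly predictable, so a few counts go a long way.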
4. Ethics will become more important as time goes by
As AI becomes ever more sophisticated, developers will be under more pressure to keep an eye on the ethics of their work. An ethical framework for the development and use of AI could define how the human designers of AI should develop and use their creations, as well as what AI should and should not be used for.
It could also eventually define how AI itself should behave, morally and ethically. This field, often called "roboethics" for short, is mainly concerned with preventing humans from using AI for harmful purposes. Eventually, it may also extend to preventing robots and AIs from doing harm to human beings.
Early references to Roboethics include the work of author Isaac Asimov and his "Three Laws of Robotics". Some argue that it may be time to encode many of Asimov's concepts into law, before any truly advanced AIs are developed.
5. Quantum computing will supercharge AI
Another trend to watch in 2020 will be advancements in quantum computing and AI. Quantum computing promises to revolutionize many aspects of computer science and could be used to supercharge AI in the future.
Quantum computing holds out the hope of dramatically improving the speed and efficiency of how we generate, store, and analyze enormous amounts of data. This could have enormous potential for big data, machine learning, AI, and privacy.
By massively increasing the speed of sifting through and making sense of huge data sets, quantum computing could greatly benefit AI and humanity alike. It could also make it possible to quickly break much of today's public-key encryption, potentially making privacy a thing of the past. The end of privacy or a new Industrial Revolution? Only time will tell.
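As a back-of-the-envelope illustration of the kind of speedup at stake: Grover's algorithm lets a quantum computer search an unstructured set of N items in roughly (π/4)·√N oracle queries, versus about N/2 checks on average for classical brute force. The sketch below just works that arithmetic for an example dataset size of our choosing; it does not simulate a quantum computer.

```python
import math

def classical_queries(n):
    # Classical brute-force search checks about half the candidates on average.
    return n // 2

def grover_queries(n):
    # Grover's quantum search needs roughly (pi/4) * sqrt(n) oracle queries.
    return math.ceil(math.pi / 4 * math.sqrt(n))

# Example: searching a 2^40-entry dataset (about a trillion items).
n = 2 ** 40
print(classical_queries(n))  # → 549755813888
print(grover_queries(n))     # → 823550
```

A quadratic speedup like this is dramatic for search, but note it falls well short of the exponential speedup Shor's algorithm offers against public-key cryptography, which is where the privacy concern mainly comes from.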
6. Facial recognition will appear in more places
Facial recognition appears to be en vogue at the moment. It is popping up in many aspects of our lives, and is being adopted by both private and public organizations for various purposes, including surveillance.
Artificial intelligence is increasingly being employed to help recognize individuals and track their locations and movements. Some programs in development can even identify individual people by analyzing their gait and heartbeat.
AI-powered surveillance is already in place in many airports across the world, and is increasingly being employed by law enforcement. This is a trend that is not going away anytime soon.
7. AI will help in the optimization of production pipelines
The droid manufacturing facility in Star Wars Episode II: Attack of the Clones might not be all that far, far away. Fully autonomous production lines powered by AI are set to be with us in the not-too-distant future.
While we are not quite there yet, AI and machine learning are being used to optimize production as we speak. This promises to reduce costs, improve quality, and reduce energy consumption for those organizations that are investing in it.