Creative Artificial Intelligence Has Crossed A Significant Milestone. What Happens Next?
Even though it’s only a few years old, the 2020s have already been an eventful decade. People often say that since the start of the Covid-19 pandemic, we’ve seen five years’ worth of technological change compressed into 18 months, largely out of necessity. And the development of artificial intelligence (AI) hasn’t slowed down at all.
Back in 2019, which now seems like a different lifetime, I picked out what I thought were the most important AI developments to date. Given everything that has happened since, it seemed like a good time to put together a list of the most significant breakthroughs of the last ten years.
Not surprisingly, many of the biggest and most talked-about changes have been in the health care field. But research is also pushing the limits of what is possible with natural language processing and self-driving cars.
During the pandemic, one of AI’s most important contributions has been helping to develop vaccines. Developing a new vaccine usually takes many years, but by March 2020, just three months after the first cases of Covid-19 were reported in China, scientists were ready to test candidate vaccines on people. Machine learning models were used to sort quickly through huge amounts of data on immune-system responses and identify which compounds were most likely to work.
Baidu, a major Chinese AI company, made its LinearFold algorithm available to scientists around the world, which helped the development of the new generation of mRNA vaccines. These vaccines work by instructing cells to make proteins that trigger an immune response.
Even the arrival of a vaccine wasn’t the end of the story, because Covid-19 can quickly mutate into new forms. An algorithm developed at the University of Southern California makes it much faster to assess vaccine candidates when mutations and variants emerge that could make existing vaccines less effective. The algorithm has been shown to eliminate 95% of candidate vaccine compounds in seconds, a screening step that usually takes several months.
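To make that idea concrete, here is a minimal, hypothetical sketch of threshold-based candidate screening. It is not the USC team’s actual algorithm; the function names, scores, and cutoff are invented for illustration. The key idea is that a trained model scores each candidate, and anything below a cutoff is eliminated before costly lab work:

```python
# Hypothetical sketch of candidate screening (illustrative only).
# A model assigns each candidate a predicted efficacy score against a
# variant; candidates below the cutoff are discarded in bulk.

def screen_candidates(candidates, score_fn, cutoff=0.7):
    """Return only the candidates whose predicted score meets the cutoff."""
    return [c for c in candidates if score_fn(c) >= cutoff]

# Toy scores standing in for a trained model's predictions.
toy_scores = {"A": 0.91, "B": 0.42, "C": 0.75, "D": 0.10}
survivors = screen_candidates(toy_scores, toy_scores.get)
print(survivors)  # only candidates scoring at least 0.7 remain
```

The speed-up in the article comes from exactly this shape of computation: scoring is cheap per candidate, so millions of compounds can be filtered in seconds.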
AI has also been used for more than vaccine development. For example, one company uses computer vision to identify people who have been in close contact with someone who tested positive. That said, not every AI Covid story is a success: several studies have shown that attempts to build algorithms that diagnose an infection from X-rays and other medical images have not worked.
In 2020, OpenAI showed off GPT-3, the latest version of its Generative Pre-trained Transformer language model. This deep learning model produces the most convincing and flexible natural language of any AI system to date. GPT-3 can even generate computer code, which means it could, in theory, be used to write software programs on its own.
It achieves this with 175 billion machine learning parameters, the most advanced and complex set of natural language algorithms yet built. This lets it write text (or code) in natural language, answer questions, automatically summarise and annotate content, and translate between different languages.
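To get a feel for that scale, a quick back-of-the-envelope calculation shows why a 175-billion-parameter model can’t run on ordinary hardware. This assumes 2 bytes per parameter (16-bit half precision, a common storage format); the real serving setup may differ:

```python
# Back-of-the-envelope memory footprint for a 175-billion-parameter model.
# Assumes 2 bytes per parameter (fp16); actual deployments vary.

params = 175_000_000_000
bytes_per_param = 2  # 16-bit half precision

total_bytes = params * bytes_per_param
total_gb = total_bytes / 1e9  # decimal gigabytes

print(f"{total_gb:.0f} GB just to hold the weights")  # 350 GB
```

That is hundreds of gigabytes before accounting for activations or the hardware needed to run inference at useful speed, which is part of why access is mediated through an API rather than a download.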
GPT-3 isn’t widely used outside of academic settings yet, and its creators have said that, because of how powerful it is, it needs to be used ethically and responsibly. For example, we need to make sure that bias in the datasets it was trained on, which include the Common Crawl corpus scraped from millions of web pages, doesn’t distort its results. We all know that bias is common online.
Level 2 Of Tesla’s Self-driving Cars Hit The Road, And Robotaxis Are Coming To China
So they are a little behind schedule. Tesla CEO Elon Musk famously promised fully autonomous (level 5) vehicles by the end of 2020 (if he managed it, he hasn’t shown anyone yet!).
But 2021 brought a big step forward: level 2 autonomy became commercially available as an update for some owners of cars that already had the Autopilot upgrade. Level 2 autonomous vehicles can handle most of the basic driving tasks expected of a human driver on a regular road. However, a human must always keep their hands on the controls, ready to take over if the computer makes a mistake.
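The “hands-on” requirement amounts to a supervisory rule: the automation may steer and manage speed, but any loss of driver attention or system confidence must hand control straight back. Here is a minimal sketch of that rule, with entirely hypothetical names and logic, not Tesla’s implementation:

```python
# Hypothetical sketch of a Level 2 supervision rule (illustrative only;
# not Tesla's implementation). The automation may drive, but hands-off
# the wheel or low system confidence must return control to the human.

def control_mode(hands_on_wheel: bool, system_confident: bool) -> str:
    """Decide who is driving under a Level 2 policy."""
    if hands_on_wheel and system_confident:
        return "automation"  # system steers; driver supervises
    return "human"           # hand control back immediately

print(control_mode(True, True))    # automation
print(control_mode(False, True))   # human: hands off the wheel
print(control_mode(True, False))   # human: system unsure
```

The asymmetry is the point: automation requires every condition to hold, while a single failed check is enough to fall back to the human driver.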
Tesla is shifting its self-driving research away from the radar and lidar sensor arrays that most carmakers rely on, and toward systems that can navigate using cameras alone.
This would mean that, in human terms, they only use their eyes to drive, just like people do. The update isn’t available to all Tesla drivers right away. Instead, it’s being rolled out slowly so that the company can collect data and watch how it works in a controlled way.
In China, Apollo Go, Baidu’s first autonomous “robotaxi” service, opened to the public in 2020. Its vehicles are still co-piloted by humans for safety and customer trust, but the Guangzhou service can travel between 200 “stops” on its own. Baidu wants to grow the service, which is already available in Beijing, Changsha, and Cangzhou, so that by 2023, 30,000 cars across 30 Chinese cities will be running it.
Possibly The First Autonomous Trans-Atlantic Crossing!
So the Mayflower, the ship intended to make the first fully autonomous crossing between the UK and the USA in 2020, hasn’t always had smooth sailing. After long Covid-related delays, it finally set off in June of this year, but damage to its exhaust system forced it back to Plymouth, England, just a few days into its planned three-week voyage.
The original Mayflower, which carried pilgrims to the New World in 1620, had to turn back twice before completing the crossing, so there is every hope that the new Mayflower, a joint project between IBM and the non-profit research organization ProMare, will make it in the end.
The Mayflower is a research ship for scientists, an important use case for autonomous vessels because research ships need to stay at sea for long periods without returning to port. Because it carries none of the facilities a human crew needs for safety and comfort, it is also very light and needs little energy to move, which is good for the environment.
The autonomous Mayflower draws information from 30 sensors, including radar, GPS, cameras, and depth sensors. That information feeds the “AI Captain,” an edge-computing application that can make decisions without sending the data back to a base station on land. Assuming it completes its journey as planned, this will be a great moment in the history of AI and autonomous shipping.
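The edge-computing point can be sketched as a decision rule that runs entirely on board, fusing local sensor readings into a manoeuvre with no shore link. This is loosely inspired by the “AI Captain” idea; the function, sensor names, and thresholds are all invented for illustration:

```python
# Hypothetical sketch of an on-board (edge) navigation rule. Names and
# thresholds are invented for illustration; the real AI Captain is far
# more sophisticated. All reasoning happens locally: no shore station.

def plan_action(radar_contact_m: float, depth_m: float) -> str:
    """Pick a manoeuvre from local sensor readings only."""
    if depth_m < 10:             # shallow water: risk of grounding
        return "turn_to_deeper_water"
    if radar_contact_m < 500:    # vessel or obstacle close by
        return "alter_course"
    return "hold_course"

print(plan_action(radar_contact_m=2000, depth_m=50))  # hold_course
print(plan_action(radar_contact_m=300, depth_m=50))   # alter_course
print(plan_action(radar_contact_m=2000, depth_m=5))   # turn_to_deeper_water
```

Keeping the loop local matters at sea: satellite links are slow and intermittent, so decisions that must happen in seconds cannot depend on a round trip to land.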