Google is determined to make Artificial Intelligence more human, and this time it’s teaching robots how to sing – well, how to make music, anyway. Codenamed ‘Magenta’, Google’s latest project will discover if robots can actually make tunes – and it’s already begun.
Announced at this week’s Moogfest, Project Magenta will be fed hours of music, look for patterns and rules in those files, and then apply them to its own self-generated work. Google has already given Magenta plenty of listening material, and it’s asking musicians to feed the AI more music, too.
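Magenta’s real models are far more sophisticated, but the basic loop of learning patterns from existing tunes and reusing them to write something new can be sketched in a few lines of Python. The toy example below is purely illustrative (a simple Markov chain over MIDI note numbers, not Magenta’s actual code):

```python
# Toy illustration of "learn patterns from music, then generate your own" -
# not Magenta's actual code, just the general idea.
import random
from collections import defaultdict

def train(melodies):
    """Count which note tends to follow which, across all training melodies."""
    transitions = defaultdict(list)
    for melody in melodies:
        for current_note, next_note in zip(melody, melody[1:]):
            transitions[current_note].append(next_note)
    return transitions

def generate(transitions, start_note, length=16):
    """Produce a new melody by repeatedly sampling a learned next note."""
    melody = [start_note]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:  # no learned continuation, so stop early
            break
        melody.append(random.choice(options))
    return melody

# Two short example melodies in C major, written as MIDI note numbers
training_data = [
    [60, 62, 64, 65, 67, 65, 64, 62, 60],
    [60, 64, 67, 64, 60, 62, 64, 62, 60],
]
model = train(training_data)
print(generate(model, start_note=60))
```

The more melodies you feed in, the richer the learned patterns become, which is roughly why Google says Magenta should improve as more music is uploaded.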
So what does it sound like? At this week’s Moogfest, Google showed the AI jamming on a synth, and frankly it wasn’t great. However, like most machine-learning projects, Magenta will get better the more data is uploaded and analysed. Google is pretty confident that Magenta will improve, and suggests it could eventually be used to generate music for the public.
So is Magenta actually creating art after all? At first I thought it wasn’t, but the more I think about it, the more it seems Magenta is creating music much like you or I would. After listening to other works for inspiration, and sticking to strict rules in the form of scales or keys, Magenta then forms its own ideas – which is sort of how real-life musicians work.
Perhaps in the future, Google could make an album using Project Magenta – although it should steer clear of using Microsoft’s Tay as a vocalist.