With the emergence of Midjourney, DALL-E 2 and ChatGPT, the whole world has seen how far artificial intelligence has actually come. Beyond the models that turn text into images, there are now AI systems that can dub films, produce designs for you, clone your voice in a matter of seconds and even act as a lawyer in court.
AI-powered music tools, on the other hand, are not yet widely known but are slowly starting to attract attention. Google intends to make an ambitious entry into this field with the new AI model it introduced today.
You can make music just by typing
Google’s new AI model, called MusicLM, works in much the same way as DALL-E 2, except that what you type comes out as music rather than an image. You can ask for something as simple as a rhythm played on a single instrument, or for a completely finished piece.
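To make the idea concrete, here is a hypothetical sketch of what prompting such a system might look like. Google has not released a public MusicLM API, so the generate_music function below is an invented placeholder; only the prompts themselves reflect how the model is meant to be used.

```python
# Hypothetical illustration: generate_music() is a made-up stand-in, since
# Google has not published a public MusicLM API. The point is the prompts.

def generate_music(prompt: str, seconds: int = 30) -> bytes:
    """Placeholder that pretends to return generated audio for a text prompt."""
    return b""  # no real synthesis happens here

# A narrow request: a rhythm carried by a single instrument...
drum_loop = generate_music(
    "a slow, steady drum rhythm played with brushes, nothing else", seconds=10
)

# ...or a description of a completely finished piece.
full_track = generate_music(
    "an upbeat 80s synth-pop track with an arpeggiated bassline, "
    "bright pads and a catchy lead melody"
)
```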
On MusicLM’s GitHub page, the Google AI team has published a number of example clips. The model is still in development and cannot yet render every sound clearly, but it is worth noting that the audio it does produce is hi-fi, that is, high fidelity. It is also suggested that combining this AI with ChatGPT could yield much more detailed and original music.
You can hum instead of typing
Keunwoo Choi, one of the Google AI engineers behind the project, explains that MusicLM is built from a combination of four different AI models: MuLan, AudioLM, w2v-BERT and SoundStream. Each has its own job, and MusicLM emerges when they are put together.
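Based on the roles those four components are publicly described as playing (MuLan as a joint music–text embedding, w2v-BERT as the source of "semantic" tokens, SoundStream as the audio codec, and AudioLM-style models generating the token sequences stage by stage), here is a rough conceptual sketch of the pipeline. Every function below is a stub written for illustration; it is not Google's code.

```python
# Conceptual sketch (not Google's code) of how MusicLM's four components fit
# together. All functions are stand-in stubs that only mimic the data flow.

import numpy as np

def mulan_embed_text(prompt: str) -> np.ndarray:
    """Stand-in for MuLan: map a text prompt into a joint music/text space."""
    rng = np.random.default_rng(abs(hash(prompt)) % (2**32))
    return rng.normal(size=128)  # pretend 128-d conditioning embedding

def semantic_stage(conditioning: np.ndarray, length: int) -> np.ndarray:
    """Stand-in for the AudioLM-style semantic model: predict w2v-BERT-like
    semantic tokens (melody/structure), conditioned on the MuLan embedding."""
    rng = np.random.default_rng(0)
    return rng.integers(0, 1024, size=length)

def acoustic_stage(conditioning: np.ndarray, semantic_tokens: np.ndarray) -> np.ndarray:
    """Stand-in for the acoustic model: predict SoundStream codec tokens,
    conditioned on both the MuLan embedding and the semantic tokens."""
    rng = np.random.default_rng(1)
    return rng.integers(0, 1024, size=(semantic_tokens.size, 8))  # 8 codebooks

def soundstream_decode(acoustic_tokens: np.ndarray) -> np.ndarray:
    """Stand-in for the SoundStream decoder: turn codec tokens into a waveform."""
    rng = np.random.default_rng(2)
    return rng.normal(size=acoustic_tokens.shape[0] * 320)  # fake audio samples

prompt = "a calming violin melody backed by a distorted guitar riff"
cond = mulan_embed_text(prompt)
semantic = semantic_stage(cond, length=600)
acoustic = acoustic_stage(cond, semantic)
waveform = soundstream_decode(acoustic)
print(waveform.shape)
```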
MusicLM can also turn your humming into songs. Being so far ahead of its rivals, the model has already started drawing pushback from musicians, much like the reaction DALL-E 2 received from artists.
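From a user's point of view, melody conditioning would presumably work the same way: a short hummed or whistled recording plus a text description of the style you want. The sketch below is again hypothetical; hum_to_song is an invented placeholder, and only soundfile is a real library, used here just to load the clip.

```python
# Hypothetical illustration of melody conditioning. hum_to_song() is invented;
# there is no public MusicLM API. soundfile is a real library used only to
# read the hummed recording.

import soundfile as sf

def hum_to_song(melody, sample_rate: int, style_prompt: str) -> bytes:
    """Placeholder: pretend the hummed melody is re-rendered in the given style."""
    return b""

# Load a short hummed recording (the path is an example, not a real file).
melody, sr = sf.read("my_hummed_melody.wav")

# Ask for the same tune, but played as a mellow jazz piano ballad.
song = hum_to_song(melody, sr, "a mellow jazz piano ballad with brushed drums")
```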
If you want to listen to the music and “sounds” produced by this AI, which is still in the testing phase, you can use the link here.