AI Music


Introduction

AI music is an emerging technology that uses artificial intelligence algorithms to generate and compose musical pieces. It is a powerful creative tool capable of producing entirely new soundscapes, combining elements of traditional musical composition with modern machine learning techniques to enable more complex pieces of music than were previously possible. [1]

AI music composition works by training machine learning algorithms on large datasets of existing music, enabling them to generate new pieces that resemble the material in the dataset. [2]

History of AI Music

Early Stages (1950s - 1970s)

In 1951, British computer scientist Alan Turing made the first recording of computer-generated music, using the Manchester Mark II, a machine that filled almost an entire lab floor.[3]

Alan Turing with the Manchester Mark II[4]

Turing's demonstration of computer-generated music on the Manchester Mark II opened a new world of possibilities for music intelligence research, using computers to recognize, create, and analyze music. [5]

Soon after, in 1957, Lejaren Hiller and Leonard Isaacson, two American composers, created the Illiac Suite, widely regarded as the first original piece composed by a computer. [6] To create the Illiac Suite, they developed a computer algorithm to generate the musical output; the model was needed to produce data that could be translated into musical notation. The aim was to enable the computer to make organizational choices about the various aspects of musical composition. They eventually settled on a Monte Carlo method, which uses random number generation constrained by strategic rules to produce musical-sounding works. [7]
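As an illustration of that generate-and-test idea, the sketch below (not Hiller and Isaacson's actual program) draws random notes and keeps only those that satisfy a simple melodic rule; the pitch range and interval restriction are assumptions chosen for clarity.

```python
import random

# A minimal, hypothetical sketch of the "generate-and-test" Monte Carlo idea:
# draw random notes, keep only those that satisfy a simple compositional rule.
# This illustrates the general technique, not Hiller and Isaacson's program.

PITCHES = list(range(60, 73))  # MIDI note numbers, middle C up to the C above

def acceptable(prev, candidate):
    """Example 'strategic restriction': limit melodic leaps to a fifth (7 semitones)."""
    return prev is None or abs(candidate - prev) <= 7

def generate_melody(length=16, seed=0):
    random.seed(seed)
    melody, prev = [], None
    while len(melody) < length:
        candidate = random.choice(PITCHES)   # random generation ...
        if acceptable(prev, candidate):      # ... filtered by a rule
            melody.append(candidate)
            prev = candidate
    return melody

print(generate_melody())
```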

Transitional Period - Generative Modeling (1980s-1990s)

The following decades saw a transitional period in AI music, with the focus shifting from simple algorithmic generation to more complex generative modelling. A key figure during this phase was David Cope, a composer and professor of music, who introduced Experiments in Musical Intelligence. The system allowed computers to recompose music by combining and modifying elements from previous works: Cope's program absorbed powerful existing works and recombined their melodies and phrases in similar but distinctive ways. [8]

David Cope

Experiments in Musical Intelligence

David Cope at work in his California home [9]

While working on a commission for an opera, David Cope developed a project he called Experiments in Musical Intelligence, or EMI. EMI was initially developed to analyze his own musical style: by feeding it his past compositions, he hoped to find the small musical signatures unique to his writing and replicate them in new ways. [10] However, Cope found that the software would work on any composer, and he began to experiment with famous composers such as Bach, Beethoven and Chopin. Through this experimentation, he laid the foundation for several of his later projects and began a new period of AI music that continues today.

Emily Howell

As Cope continued to develop his work on EMI, he created a more sophisticated AI program called Emily Howell. Emily Howell is an AI program designed to compose original classical music. The system uses algorithms and machine learning techniques to analyze and emulate the musical styles of various composers, such as Bach, Beethoven, and Chopin, and it can also create its own original compositions inspired by these styles.

Instead of feeding Emily Howell a database of existing works, Cope seeded the program with a collection of works that EMI had produced, and from there it began developing its own musical style. Cope described Emily's style as similar to modern composers, a "sort of an amalgam of all styles" and very contemporary. [11] Emily generates an output from which the composer selects the parts that sound good and discards those that do not fit; the program then generates a new output, and the cycle continues until a final piece is created. Emily Howell is thus an evolving AI composer that can learn from its compositions and refine its output over time. The program's ability to produce convincing and expressive compositions has attracted significant attention and sparked controversial debates among musicians and AI enthusiasts. Cope received extremely positive feedback from the science community, but many musicians and composers felt the program threatened the human aspects of creation. [12]

AI in Music Today: Recommendation and Discovery

AI has played a crucial role in the music industry for many years, assisting with music composition, production, and personalized music recommendations.

Music Composition

‘AI composers’ use machine learning techniques to generate compositions across different genres and styles. These algorithms give creators the opportunity to compose innovative music, opening the music industry up to greater possibilities. [13]

Music Production

Alongside composition, AI has created opportunities in music production by enhancing the production process. AI provides tools such as automation that streamline production, allowing producers to save time and redirect it to other creative work. [14]

Personalized Music Recommendations

Lastly, AI plays a role in personalized music recommendations by analyzing user preferences, listening history, and other user data to recommend hyper-personalized music to listeners. Many music streaming platforms have utilized AI to streamline their recommendation systems, and one example to analyze is Spotify. [15]

Case Study of Spotify’s Successful Music Recommendation Platforms

Many music streaming platforms utilize a combination of AI algorithms to provide personalized music recommendations to their users. These algorithms help streaming companies generate revenue by attracting and retaining active users and providing services that align with their tastes and preferences.

Spotify is one of the top music streaming platforms, used across the globe. It has been successful at using AI for its personalized music recommendation system, and through music and artist-driven data it continues to introduce new features that provide better services to its users.

To provide hyper-personalized music recommendations, Spotify relies on a combination of two algorithms: Content-Based Filtering and Collaborative Filtering [16].

Content-Based Filtering

Content-based filtering focuses on the contents of the music itself by analyzing artist-sourced metadata, raw audio signals, and texts using Natural Language Processing models [17].

Artist-Sourced Metadata

When music is first introduced to Spotify's platform, the system analyzes its artist-sourced metadata, including the track title, release title, and artist name. [18]

Raw Audio Signals

Once the metadata is analyzed, the data moves further down Spotify's recommender system, where the song's raw audio signals are analyzed. This analysis considers features such as danceability, energy, and valence [19]. Danceability measures how suitable the song is for dancing, energy captures the intensity of the music, and valence describes how positive the song sounds [20]. A song with a higher valence is categorized as more positive-sounding, while one with a lower valence is considered more negative. [21]
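To make the feature-based comparison concrete, here is a hypothetical sketch that ranks candidate tracks by their similarity to a track the user already likes. The feature names come from the text above; the numeric values and the use of cosine similarity are illustrative assumptions, not Spotify's actual implementation.

```python
import numpy as np

# Hypothetical content-based similarity using audio features.
# Feature vectors are [danceability, energy, valence]; values are made up.

songs = {
    "song_a": np.array([0.82, 0.70, 0.90]),
    "song_b": np.array([0.78, 0.65, 0.85]),
    "song_c": np.array([0.20, 0.30, 0.15]),
}

def cosine_similarity(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Rank candidates by similarity to a seed track the user already likes.
seed = songs["song_a"]
ranked = sorted(
    ((name, cosine_similarity(seed, vec)) for name, vec in songs.items() if name != "song_a"),
    key=lambda pair: pair[1],
    reverse=True,
)
print(ranked)  # song_b scores higher than song_c, so it would be recommended first
```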

Natural Language Processing Models

Lastly, the music is processed through a Natural Language Processing model, where semantic information is extracted using three methods to determine how users interact with that particular song or artist. The models look at lyrics analysis, web-crawled data, and user-generated playlists. [22]

Lyrics analysis: NLP analyzes a song’s lyrics to determine its theme or the meaning the artist is trying to convey. [23]

Web-crawled data: NLP is run against multiple data sources on the web, such as blogs and news outlets. The model gathers data related to a particular song or artist to determine whether it is trending, what people’s perceptions and opinions are, and which other artists or songs are being discussed in relation to it [24] [25]. This helps determine whether a particular artist or song is one that users prefer to listen to and whether recommending it will fulfil users’ needs in music selection.

User-generated playlists: the NLP model also analyzes user-generated playlists to further pinpoint the music and artists users listen to and enjoy, based on the songs added to various playlists. The playlists provide insight into a song’s mood, style, and genre, with playlist titles helping to reveal these patterns [26]. For example, if a user creates playlists with the word “sad” in their titles, NLP will tend to label the songs in them as sad, mellow tracks.
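As a toy illustration of the playlist-title idea, the sketch below tallies mood labels for a track from the titles of the playlists it appears in. Real systems use trained NLP models; the keyword list and playlist data here are purely hypothetical.

```python
from collections import Counter

# Infer a rough mood label for a song from the titles of the user-generated
# playlists it appears in. Keyword matching stands in for a real NLP model.

MOOD_KEYWORDS = {"sad": "sad/mellow", "chill": "mellow", "party": "upbeat", "workout": "high-energy"}

playlists = [
    {"title": "sad rainy days", "tracks": ["song_a", "song_b"]},
    {"title": "late night chill", "tracks": ["song_b", "song_c"]},
    {"title": "party anthems", "tracks": ["song_d"]},
]

def infer_moods(track_id):
    """Count mood labels from the titles of playlists containing the track."""
    votes = Counter()
    for playlist in playlists:
        if track_id in playlist["tracks"]:
            for word in playlist["title"].lower().split():
                if word in MOOD_KEYWORDS:
                    votes[MOOD_KEYWORDS[word]] += 1
    return votes

print(infer_moods("song_b"))  # Counter({'sad/mellow': 1, 'mellow': 1})
```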

Collaborative Filtering

In conjunction with the content-based filtering system, Spotify also relies on collaborative filtering to provide personalized music recommendations to its users. Content-based filtering is driven by data about the music itself, while collaborative filtering focuses on the behavioural trends and patterns of users. Spotify uses two methods in its collaborative filtering system: the Item-User Matrix Approach and the Playlist-Centric Approach.

Spotify Personalization [27]

Item-User Matrix Approach

With collaborative filtering, Spotify can determine whether two songs are similar by looking at users’ listening behaviour for those songs, and whether two users are similar by looking at their listening behaviour across all the songs they listen to [28]. Simply put, if Spotify sees that users A and B listen to the same songs, but user B does not listen to a particular song or artist that user A does, it will recommend the missing song or artist to user B to see whether that user behaves the same way towards it [29]. Although this method of collaborative filtering can be considered intuitive and straightforward, it raises issues with accuracy, scalability, and speed [30]. Therefore, Spotify’s more recent approach relies on user-generated playlists.
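A minimal sketch of the item-user matrix idea, under the assumption of a small binary "listens to" matrix: it finds the user most similar to user B and recommends the songs that similar user listens to but B does not. Production systems work at far larger scale and commonly use techniques such as matrix factorization.

```python
import numpy as np

# Toy item-user matrix: rows are users, columns are songs, 1 = listens to it.
# User and song names are hypothetical.

songs = ["song_1", "song_2", "song_3"]
matrix = np.array([
    [1, 1, 1],   # user A listens to all three songs
    [1, 1, 0],   # user B listens to the first two but not song_3
])

def recommend_for(user_idx):
    """Recommend songs the most similar other user listens to but this user does not."""
    user = matrix[user_idx]
    sims = matrix @ user          # similarity of this user to every user (shared listens)
    sims[user_idx] = -1           # ignore self
    nearest = int(np.argmax(sims))
    missing = (matrix[nearest] == 1) & (user == 0)
    return [songs[i] for i in np.flatnonzero(missing)]

print(recommend_for(1))  # user B gets ['song_3'] because the similar user A listens to it
```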

Playlist-Centric Approach

With user-generated playlists, Spotify can use data from different users to reach more conclusive decisions about what music they have in common and which artists to recommend. Spotify’s current use of collaborative filtering looks at “organizational similarity”, which analyzes user behaviour by checking whether two users have included the same song on the same playlist [31]. If two users both listen to the same two songs, that does not necessarily mean the songs are similar; however, if two users place the same two songs on the same playlist, collaborative filtering can infer that those songs may have something in common [32]. This method gives a more accurate picture of the specific trends and patterns in user behaviour towards certain songs and artists, which Spotify relies on to provide personalized recommendations.
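A toy sketch of "organizational similarity", under the assumption that co-occurrence on user-generated playlists signals relatedness: it counts how often each pair of songs appears on the same playlist. The playlist data and counting scheme are illustrative, not Spotify's production algorithm.

```python
from collections import defaultdict
from itertools import combinations

# Count how often each pair of songs co-occurs on a playlist; frequent pairs
# are treated as related and become recommendation candidates.

playlists = [
    ["song_a", "song_b", "song_c"],
    ["song_a", "song_b"],
    ["song_c", "song_d"],
]

cooccurrence = defaultdict(int)
for tracks in playlists:
    for s1, s2 in combinations(sorted(set(tracks)), 2):
        cooccurrence[(s1, s2)] += 1

for pair, count in sorted(cooccurrence.items(), key=lambda kv: kv[1], reverse=True):
    print(pair, count)   # ('song_a', 'song_b') 2 ranks highest
```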

With the combination of content-based and collaborative filtering models, Spotify's recommendation system also requires the generation of User Taste Profiles to provide fully personalized recommendations [33].

Sample User Profile [34]

User Taste Profiles

In addition to data derived from songs and behavioural trends between users, Spotify’s recommendation system relies on user taste profiles, in which context-rich information is gathered to create a holistic listening profile for each user [35]. This system relies on two types of feedback: explicit and implicit [36]. Explicit feedback includes actions such as library saves, playlist adds, and shares [37]. Implicit feedback includes signals such as listening session length and repeat listens [38]. All of this information is gathered into a user profile, where the most specific information about the user is displayed.
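As a hypothetical illustration of combining explicit and implicit feedback, the sketch below folds both kinds of signals into a single per-track affinity score for a user's taste profile. The signal names come from the text; the weights and formula are assumptions made for the example.

```python
# Combine explicit and implicit feedback into one affinity score per track.
# Weights are illustrative assumptions, not Spotify's actual parameters.

WEIGHTS = {
    "library_save": 3.0,    # explicit feedback
    "playlist_add": 2.0,
    "share": 2.5,
    "repeat_listen": 1.0,   # implicit feedback
    "seconds_listened": 0.01,
}

def affinity(events):
    """Weighted sum of feedback events for one (user, track) pair."""
    return sum(WEIGHTS[name] * value for name, value in events.items())

user_track_events = {"library_save": 1, "playlist_add": 1, "repeat_listen": 4, "seconds_listened": 900}
print(affinity(user_track_events))  # higher scores push the track up in this user's taste profile
```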

Spotify DJ

Spotify has released one of its most recent AI features, the Spotify DJ. Spotify DJ can be classified as a “curated music service based on your listening habits” [39]. The feature is created using AI and uses a DJ-like voice to recommend a series of tracks that are based on the user’s listening habits [40]. The feature is only available to Premium users, and it joins Spotify’s other features such as Discover Weekly and the annual Wrapped playlists to provide further personalized music recommendations to its users [41].

Spotify’s Partners

Spotify has acquired many companies and startups with expertise in AI algorithms to successfully provide personalized music recommendations. Among others, Spotify has acquired the following companies to assist with its AI models (not an exhaustive list):

Tunigo: acquired in 2013 for better music recommendation algorithms [42].

Echo Nest: acquired in 2014 to improve Spotify’s recommendations[43].

Seed Scientific: acquired in 2015 to understand how artists, listeners and brands interact with its platform [44].

Sonalytic: acquired in 2017 to use machine learning for audio detection and music recommendation [45].

Niland: an AI startup, acquired in 2017 to optimize music searches and recommendations [46].

Mediachain Labs: a blockchain company, acquired in 2017 to help artists get paid correctly for every track play on Spotify’s platform [47].

Sonantic: a company specializing in text-to-speech generation, acquired in 2022 to help produce “stunningly realistic” voices from text [48].

Business Opportunities and Market Size

Growth of AI in the Music Industry

Growth of AI in Music Industry [49]
The presence of AI has been steadily growing in the music industry and market over the past decade [50]. The global generative AI in music market was valued at USD 229 million in 2022 and has a projected compound annual growth rate of 28.6% [51], reaching a projected revenue of USD 2.6 billion by 2032 [52].

Two factors can be considered to account for the growth of this market: increased use of AI in music composition, and increased use of AI in generating music from text [53]. The increased use in composition is a result of companies' frequent use of AI tools for composing music [54]. Spotify is a prime example of a company that has been using AI tools to recommend music to users, while companies such as Google and Meta are using AI tools to generate and compose music. The increased use of text-to-music generation reflects artists' growing preference for AI tools to generate their own music [55]. Artists consider these tools more efficient because they save time and money while also boosting creativity [56].

Big Tech Companies and their involvement in AI music

Many companies have contributed to the growth of AI in the music industry, and big tech companies such as Google and Meta are now utilizing their AI algorithms to provide text-to-music generators.

Google MusicLM

In 2023, Google released its experimental AI tool MusicLM, accessible through the AI Test Kitchen app [57]. The tool turns text descriptions into music [58]. Users can input any prompt that interests them, optionally specifying an instrument or mood, and have the AI tool generate music for them. The generator produces several versions of a song based on the prompt, and the underlying model was trained on five million audio clips, amounting to 280,000 hours of music at 24 kHz. [59] [60]

Meta MusicGen

Similarly, Meta has released an AI text-to-music generator called MusicGen [61]. It is described as a “simple and controllable model for music generation” [62]. The model takes in text prompts and turns them into music clips of around 12 seconds [63]. It has been released as open source and was trained on 20,000 hours of licensed music, including 390,000 instrument-only tracks from ShutterStock and Pond5 [64].

Overview of the Business Potential and Job Creation

AI Helps the Music Industry

AI has been shown to help companies generate more revenue and expand their consumer base. It is now being used more frequently to provide better services to customers and to assist with innovation and creativity in music generation and production. Many artists, such as songwriters and producers, now use AI as an “extension of the human creative process”, embracing it as a tool that encourages music creation and creativity [65].

AI Can Hurt the Music Industry

Although AI has shown promising results for the music industry, many remain concerned about the legal and ethical implications it creates for artists and record companies. Many artists worry about the unauthorized use of their music, and companies worry about entering legal disputes if AI is used with the wrong intentions. At the same time, AI's impact on copyright practices may lead to new licensing structures that allow labels to authorize legitimate uses [66]. With further research into how AI can address the problems the music industry is facing, the music world can build on the ethical and legal lessons being learned and look for ways to solve problems in collaboration with AI.
Sony's Executive VP of AI [67]

AI Aids in Job Creation

Sony is the first company in the music industry to hire an Executive VP of AI, Geoff Taylor [68]. As one of the movers and shakers in the industry, Sony has shown that it views the power of AI in a positive light and that new jobs can be created to work with AI. Geoff Taylor's responsibilities will centre on coordinating the company's efforts around artificial intelligence, as well as its global digital business and business and legal affairs [69]. This move shows that the industry is not afraid of the changes AI is bringing; instead, companies and key players are embracing the change and working with AI to establish new standards and regulations.

Legal Issues in AI-Generated Music

The integration of AI in the music industry has given rise to various legal considerations, which are complex and ambiguous. As the AI landscape continues to develop at a fast rate, legal systems are not able to keep up with the rapid developments, which results in uncertainty and multiple grey areas.

Copyright Implications for AI-Generated Works

Copyright grants creators the exclusive rights to reproduce, distribute, perform, or display their works [70]. However, AI-generated art blurs the line between human and AI authorship, raising questions about the eligibility of AI-created works for copyright protection. According to formal guidance issued by the U.S. Copyright Office on March 15, 2023, its position remains that works created solely by AI, without any human intervention or involvement, are not eligible for copyright protection. However, the Copyright Office clarified that if a work includes AI-generated material but also exhibits sufficient human authorship, it may still be copyrightable; for instance, if a human creatively selects or arranges AI-generated content, that particular arrangement could be eligible for copyright protection [71]. The challenge lies in defining the extent of human involvement necessary to establish copyright ownership for AI-composed music.

Determining ownership of AI-generated compositions presents a grey area. There is still debate around whether the original programmers, AI system owners, or even the AI itself can claim copyright ownership over the final output.

Training AI Models with Copyrighted Data

In 2023, an AI-generated song called "Heart on My Sleeve", featuring AI-generated vocals imitating Drake and The Weeknd, caught music lovers' attention. The song gained over 20 million views across the various platforms it was posted to before being taken down [72]. Attorneys for Universal Music Group, the artists' label, filed a DMCA takedown notice to have the song removed, citing copyright infringement. They argued that the AI-generated song included a producer tag that had been copied from original works, and claimed that the song was a derivative work based on the artists' original compositions [73].

Copyright protects original works, and in AI-generated music the concept of fair use gains importance. Fair use allows specific uses of copyrighted material without permission for educational, training, commentary, and parody purposes [74]. Parodies, despite having a commercial, monetary purpose, can be considered fair use because they transform the original work.

Similarly, AI music also transforms the original work in some significant manner. However, it differs from parodies as it lacks the intent to comment, criticize, or reference the original piece. Instead, AI music attempts to monetize successful musical styles that previous artists have commercialized and popularized [75]. AI music creators free-ride on the efforts of the original artists, leading to direct competition with the original work and potentially reducing incentives for innovation and creativity.

The "Heart on My Sleeve" case and the fair use debate in AI-generated music reveal the complex relationship between copyright, transformative works, and creativity in the digital era. Striking a balance that respects human creators' rights while embracing AI-generated content is vital for an innovative and creative landscape.

Drake and The Weeknd[76]

Ethical Considerations

Job Displacement

The rapid development of AI in music creation raises ethical concerns about the displacement of composers and musicians. The efficiency and versatility of AI technologies may tempt artists and producers to increasingly use AI-driven composition, reducing job opportunities in the music industry. Striking a balance between AI-driven efficiency and preserving the value of human creativity becomes important to address these ethical challenges. By fostering a collaborative approach that harnesses the strengths of AI and human artistry, the music industry can continue to grow while protecting the talent of human creators.

Accessibility

One of the most significant benefits of AI in music is its ability to make music production more accessible to a wider audience. Aspiring musicians, regardless of their background or resources, can now capitalize on their creativity using AI-powered tools, rather than having to gather resources to hire composers or producers. This inclusivity empowers artists who may have previously faced barriers to entry, creating a more diverse music landscape.

Bias and Representation

A potential challenge comes about in the data used to train AI algorithms. If AI systems solely rely on popular or historically significant music data, there is a risk of perpetuating biases. Such biases could lead to an overrepresentation of certain styles or genres while neglecting lesser-known or underrepresented musical cultures. This lack of diversity and representation in AI-generated music may reinforce existing inequalities and biases within the music industry, overriding any accessibility benefits that it provides.

AI Music and Awards

Grammy Awards [77]
The Grammy Awards have had to adapt to the changing music environment. Initially, the Recording Academy's stance was that AI-generated music would not be eligible for awards. On July 4th, 2023, Recording Academy CEO and President Harvey Mason Jr. affirmed that music with AI-generated elements is eligible for Grammy nominations. However, he clarified that while AI-influenced compositions can be nominated, the recognition would be attributed to the human contributors rather than the AI components. For example, where AI generates the vocal elements, the song may qualify for songwriting categories but not for performance; similarly, if a human did all of the performing but the lyrics were written by AI, the song would only be eligible in a performance category [78]. The reason behind this stance is to ensure that technology complements and enhances human creativity rather than displacing it. This sets a precedent for other award associations, fostering ongoing dialogue about the legal and ethical considerations of AI in music.

The Future: Balancing Innovation and Ethics

When exploring the ethical implications of AI in music, finding a balance between innovation and addressing ethical concerns becomes important to maintain its use while continuing to embrace human creativity. Rather than viewing AI as a replacement for human musicians, the music industry can view AI as a collaborative tool that enhances artistic potential.

Establishing Ethical Guidelines and Governance

To ensure the responsible and ethical use of AI in music, robust ethical guidelines and governance structures are essential. Because legal issues around AI in music have only recently emerged, courts are still grappling with them and attempting to set precedent. A collaborative effort involving musicians, producers, legal experts, and other stakeholders is vital to create frameworks that address potential ethical concerns. These guidelines should prioritize transparency in AI-generated music, ensuring that listeners are aware when they engage with AI-created compositions.

Many creators feel the impact of copyright issues, but AI startups are exploring solutions for the future. One approach involves AI researchers developing databases that eliminate the risk of copyright infringement, either by using properly licensed materials or by generating content explicitly for AI training purposes; an example is discussed in the section on specialized databases below [79].

Diversifying AI Datasets

Developers must take proactive steps to train AI algorithms on diverse and representative datasets. Music associations and governing bodies should set guidelines on the range of music that needs to be used when creating an AI algorithm. By incorporating underrepresented artists, genres, and perspectives into the training data, developers can mitigate potential biases and avoid perpetuating existing inequalities in the music industry.

Specialized Databases for AI Training

AI startups are at the forefront of introducing solutions that offer considerable potential for the future. One particularly promising approach involves AI researchers developing specialized databases, carefully crafted to avoid copyright infringement by incorporating properly licensed materials or content generated exclusively for AI training purposes.

An illustrative case is "The Stack", a dataset designed to train AI models using only code released under highly permissive open-source licenses, which ensures the dataset's legality and compliance. Developers also have the option to have their data removed upon request, and the dataset's creators believe the model could have widespread applications across the industry [80].

Conclusion

While AI has the potential to enhance music artistry, it raises questions about accountability, creativity and the preservation of artistic integrity. The grey areas around copyright, fair use, and accountability emphasize the need for conversation and adaptation to ensure that there can be coexistence between human creativity and technology. It is important that stakeholders, from musicians and composers to legal experts and policymakers, collaborate in defining the ethical and legal frameworks that govern the use of AI in music.

The use of AI in the music industry could change the industry greatly, for better or worse depending on the market's intentions. As of now, there are three main ways in which AI is reshaping the music industry.

Artist Vs. Machine

Consumers and artists are concerned about whether AI is being used to assist the musician or has become the musician itself [81]. It can be argued that both views are correct, depending on how AI is used. Companies like Google and Sony have created projects that use AI as a tool for music production, in which the AI is effectively the musician, creating and performing songs artificially [82]. However, the important factor to keep in mind is that these algorithms and software still require human skill and creativity, as well as an original source of data to analyze. The existence of these AI tools is pushing artists to remain competitive, and those who take the initiative to adapt to these technologies will prosper in the music industry [83].

Personalized Recommendations & Marketing

Record companies invest $4.5 billion annually worldwide in A&R and marketing [84]. These companies spend upwards of $2 million to break an artist, and they require AI and technology to help them analyze billions of streams to determine which artists to market and recommend to fans [85]. Spotify's personalized music recommendation features, like the Discover Weekly playlists and the annual Wrapped playlists, are success stories of its acquisitions of companies like Echo Nest and Niland, which have made its recommendations stronger [86]. Companies' investments in AI algorithms and technologies will help them achieve their goal of providing music that best aligns with their users' and fans' tastes.

New Emerging Stars

Record companies assign an A&R (artists & repertoire) representative to find promising new artists for the label to sign [87]. The role requires the representative to sift through thousands of songs and artists to find the ones users are reacting positively to. Technology has lowered the barrier to music creation, and streaming platforms see thousands of new songs added every day [88]. The use of AI and technology is therefore critical in supporting companies' A&R efforts, helping them find new emerging artists to support and bring to the music industry [89]. Many companies are now acquiring and collaborating with other companies and startups to obtain the technology required for their A&R. Warner Music Group acquired Sodatone for an algorithm that identifies unsigned artists [90], and Apple is investing in A&R technology as well, acquiring Asaii to help with its A&R analytics [91]. Overall, companies are showing that they are ready to invest money and time in AI and technology for the betterment of their businesses, the music industry, and the services they provide to consumers.

Authors

Thomas Kurian, Parmida Noroozzadeh, Maya Fernandes
Beedie School of Business
Simon Fraser University
Burnaby, BC, Canada


References

  1. https://bigtimemusicians.com/how-ai-music-works
  2. https://www.one-submit.com/post/ai-in-the-music-industry-today
  3. https://www.forbes.com/sites/bernardmarr/2019/07/05/the-amazing-ways-artificial-intelligence-is-transforming-the-music-industry/?sh=149ecc385072
  4. https://watt-ai.github.io/blog/music_ai_evolution
  5. https://watt-ai.github.io/blog/music_ai_evolution
  6. https://watt-ai.github.io/blog/music_ai_evolution
  7. http://www.musicainformatica.org/topics/illiac-suite.php
  8. https://www.theguardian.com/technology/2010/jul/11/david-cope-computer-composer
  9. https://www.theguardian.com/technology/2010/jul/11/david-cope-computer-composer
  10. https://arstechnica.com/science/2009/09/virtual-composer-makes-beautiful-musicand-stirs-controversy/
  11. https://arstechnica.com/science/2009/09/virtual-composer-makes-beautiful-musicand-stirs-controversy/
  12. https://arstechnica.com/science/2009/09/virtual-composer-makes-beautiful-musicand-stirs-controversy/
  13. https://medium.com/@kalanabandaranayake/data-driven-music-how-ai-is-shaping-the-future-of-sound-7423db8a12c5#:~:text=AI%2Dpowered%20recommendation%20systems%20analyze,tailored%20and%20immersive%20musical%20experience.
  14. https://medium.com/@kalanabandaranayake/data-driven-music-how-ai-is-shaping-the-future-of-sound-7423db8a12c5#:~:text=AI%2Dpowered%20recommendation%20systems%20analyze,tailored%20and%20immersive%20musical%20experience.
  15. https://medium.com/@kalanabandaranayake/data-driven-music-how-ai-is-shaping-the-future-of-sound-7423db8a12c5#:~:text=AI%2Dpowered%20recommendation%20systems%20analyze,tailored%20and%20immersive%20musical%20experience.
  16. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  17. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  18. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  19. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  20. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  21. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  22. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  23. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  24. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  25. https://outsideinsight.com/insights/how-ai-helps-spotify-win-in-the-music-streaming-world/
  26. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  27. https://www.engadget.com/spotifys-new-ai-dj-will-talk-you-through-its-recommendations-140052560.html
  28. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  29. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  30. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  31. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  32. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  33. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  34. https://research.atspotify.com/2020/05/giving-voice-to-silent-data-designing-with-personal-music-listening-history/
  35. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  36. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  37. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  38. https://www.music-tomorrow.com/blog/how-spotify-recommendation-system-works-a-complete-guide-2022
  39. https://www.producthunt.com/stories/spotify-dj-how-to-use-get-spotify-ai-dj
  40. https://www.producthunt.com/stories/spotify-dj-how-to-use-get-spotify-ai-dj
  41. https://www.producthunt.com/stories/spotify-dj-how-to-use-get-spotify-ai-dj
  42. https://www.marketingaiinstitute.com/blog/spotify-artificial-intelligence
  43. https://www.marketingaiinstitute.com/blog/spotify-artificial-intelligence
  44. https://techcrunch.com/2015/06/24/pulling-the-data-rug-out-from-under-apple/
  45. https://www.marketingaiinstitute.com/blog/spotify-artificial-intelligence
  46. https://www.marketingaiinstitute.com/blog/spotify-artificial-intelligence
  47. https://outsideinsight.com/insights/how-ai-helps-spotify-win-in-the-music-streaming-world/
  48. https://www.engadget.com/spotify-buys-ai-voice-platform-sonantic-145242336.html
  49. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  50. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  51. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  52. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  53. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  54. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  55. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  56. https://www.globenewswire.com/en/news-release/2023/04/03/2639706/0/en/Generative-AI-in-Music-Market-to-Reach-Valuation-of-USD-2-6-Bn-at-CAGR-of-28-6-by-2032.html
  57. https://techcrunch.com/2023/05/10/google-makes-its-text-to-music-ai-public/
  58. https://techcrunch.com/2023/05/10/google-makes-its-text-to-music-ai-public/
  59. https://techcrunch.com/2023/05/10/google-makes-its-text-to-music-ai-public
  60. https://www.musicbusinessworldwide.com/meta-just-released-an-ai-music-generator-that-was-trained-on-20000-hours-of-licensed-music
  61. https://www.musicbusinessworldwide.com/meta-just-released-an-ai-music-generator-that-was-trained-on-20000-hours-of-licensed-music/
  62. https://www.musicbusinessworldwide.com/meta-just-released-an-ai-music-generator-that-was-trained-on-20000-hours-of-licensed-music/
  63. https://www.musicbusinessworldwide.com/meta-just-released-an-ai-music-generator-that-was-trained-on-20000-hours-of-licensed-music/
  64. https://www.musicbusinessworldwide.com/meta-just-released-an-ai-music-generator-that-was-trained-on-20000-hours-of-licensed-music/
  65. https://variety.com/2023/music/opinion/ai-can-hurt-help-the-music-business-1235636453/
  66. https://variety.com/2023/music/opinion/ai-can-hurt-help-the-music-business-1235636453/
  67. https://variety.com/2023/music/news/sony-music-geoff-taylor-a-i-1235653059/#respond
  68. https://variety.com/2023/music/news/sony-music-geoff-taylor-a-i-1235653059/
  69. https://variety.com/2023/music/news/sony-music-geoff-taylor-a-i-1235653059/
  70. https://www.wipo.int/copyright/en/
  71. https://hls.harvard.edu/today/ai-created-a-song-mimicking-the-work-of-drake-and-the-weeknd-what-does-that-mean-for-copyright-law/
  72. https://www.nytimes.com/2023/04/19/arts/music/ai-drake-the-weeknd-fake.html
  73. https://hls.harvard.edu/today/ai-created-a-song-mimicking-the-work-of-drake-and-the-weeknd-what-does-that-mean-for-copyright-law/
  74. https://www.americanbar.org/groups/entertainment_sports/publications/entertainment-sports-lawyer/esl-39-01-spring-23/2022-eli-writing-competition-winning-essay-protecting-artist-licensing-an-aigenerated-music-market/
  75. https://hls.harvard.edu/today/ai-created-a-song-mimicking-the-work-of-drake-and-the-weeknd-what-does-that-mean-for-copyright-law/
  76. https://www.bbc.com/news/entertainment-arts-65298834
  77. https://www.nme.com/news/music/grammys-introduces-rule-that-bans-music-created-solely-by-ai-3457165
  78. https://www.cbc.ca/news/entertainment/ai-music-grammys-1.6896423
  79. https://www.marktechpost.com/2022/12/16/meet-stack-a-3tb-of-permissively-licensed-source-code-for-llms-large-language-models/
  80. https://www.marktechpost.com/2022/12/16/meet-stack-a-3tb-of-permissively-licensed-source-code-for-llms-large-language-models/
  81. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  82. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  83. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  84. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  85. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  86. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  87. https://www.berklee.edu/careers/roles/ar-representative#:~:text=An%20A%26R%20(artists%20and%20repertoire,Careers%20in%20Business
  88. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  89. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  90. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781
  91. https://www.entrepreneur.com/en-au/technology/how-ai-is-reshaping-the-music-business-globally/327781