OpenAI Gets Sued (Again)

PLUS: Are Transformers Finished?

Hello readers,

Welcome to another edition of This Week in the Future! It wasn’t long after the holidays (and even during them) that AI news started appearing again, such as The New York Times suing OpenAI, so we wanted to get you up to speed with a quick edition.

As always, thanks for being a subscriber! We hope you enjoy this week’s content — no video breakdown or podcast this week. Stay tuned for next week.

Let’s get into it!

The New York Times v OpenAI

Image generated by DALL·E 3

OpenAI is being sued (again). The lawsuit filed by The New York Times against OpenAI and Microsoft centers on copyright infringement claims. The Times accuses the companies of using its articles to train generative AI technologies, causing “billions of dollars in statutory and actual damages.” The lawsuit demands the destruction of training datasets containing Times content and of the resulting chatbot models. In its complaint, the Times shares numerous examples of ChatGPT regurgitating its articles verbatim, though it’s not clear whether those responses were prompted with search capability enabled.

The Times argues that this not only uses its content without payment but also creates products that compete with it and risk diverting its audience. It also cites instances where the AI attributed false information to the paper.

Why This Matters

If the lawsuit succeeds, it could set a precedent for how AI training data may be sourced, which would significantly affect enterprise adoption. Enterprises are already cautious while the legal and ethical questions around AI remain unsettled.

Our Take

The case can be made that an AI learning from its training data, and having its outputs shaped by that data, is analogous to how humans learn. The sticking point will be whether ChatGPT regurgitates training data verbatim; if so, that may be a fixable technical issue. Whether this ends in a quick settlement or a brutal slog remains to be seen.

Mamba v Transformer

Image generated by DALL·E 3

A new AI architecture called Mamba has the potential to supplant the Transformer architecture that unleashed the generative AI wave. The most significant difference between the two lies in how they handle long sequences of data. Transformers become computationally inefficient on lengthy sequences because their attention mechanism compares every element in a sequence to every other element, a cost that grows quadratically with sequence length.

Mamba addresses this inefficiency by integrating selective State Space Models (SSMs), which let the model focus on relevant parts of a sequence while filtering out less relevant information, enhancing its efficiency on long sequences. Its simpler, more homogeneous architecture also yields faster computation and lower memory requirements, which is especially beneficial for data-heavy applications like language, audio, and genomics.
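To make the complexity difference concrete, here is a toy sketch in Python. This is not Mamba's actual selective SSM (which uses input-dependent parameters and a hardware-aware parallel scan); it only illustrates the contrast between attention's all-pairs comparison (an n-by-n score matrix) and a recurrence that carries a fixed-size state forward one token at a time. The function names and the decay constant `a` are illustrative choices, not anything from the Mamba paper.

```python
import numpy as np

def attention_scores(x):
    """Toy self-attention: every token is compared to every other,
    producing an n x n score matrix -- O(n^2) in sequence length."""
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)  # (n, n) pairwise comparisons
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x  # (n, d) output

def ssm_scan(x, a=0.9):
    """Toy linear state-space recurrence: one fixed-size state update
    per token, so cost and memory grow linearly -- O(n)."""
    n, d = x.shape
    h = np.zeros(d)
    out = np.empty_like(x)
    for t in range(n):
        h = a * h + x[t]  # carry a constant-size state forward
        out[t] = h
    return out

x = np.random.default_rng(0).standard_normal((8, 4))
print(attention_scores(x).shape)  # (8, 4)
print(ssm_scan(x).shape)          # (8, 4)
```

Both functions map a sequence of 8 tokens with 4 features to an output of the same shape, but doubling the sequence length quadruples the attention score matrix while only doubling the work in the scan.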

Why This Matters

The significance of Mamba lies in its potential to reshape the landscape of AI, particularly in areas where handling large-scale data and lengthy sequences is crucial. Its innovative approach offers a more efficient way of sequence modeling, paving the way for faster and more powerful AI applications.

🔥 Rapid Fire

📖 What We’re Reading

AI in 2024 (link)

“As we head into 2024, it can be easily surmised that we’re going to see a lot more movement when it comes to AI regulation. While the EU was the first to pass a concrete initiative regulating the technology, I believe the U.S. is the true innovator of AI and is on the right track by not rushing to regulate the technology – especially as it’s yet to reach its full potential.”

Source: AI For All

AI: The Game Changer in Telecom Network Optimization (link)

“Achieving seamless connectivity is a driving goal in telecommunications. As the demand increases for faster and more reliable networks, the telecom industry has turned to AI to revolutionize network optimization, with one projection estimating telecoms’ overall AI spending will reach $38.8 billion by 2031.”

Source: AI For All

“With a concept as profound and consequential as artificial intelligence, you would think that there would be more exploration of the topic in cinema. However, not only are there not that many true AI movies, but there is yet to be that one movie that defines AI in cinema. Nevertheless, there are just enough AI movies to make a best and worst list.”

Source: AI For All

💻️ AI Tools and Platforms

  • Cline → Lightweight A/B & split testing software

  • Anyscale → Scalable compute for AI and Python

  • Perplexity → AI-powered internet search

  • Robin AI → The AI copilot for contracts

  • HeyCloud → AI cloud assistant to build cloud stacks