
Fun Times at DeepMind


Hi Visionaries,

I kicked off the Exponentialist last month by writing about the foundational role that machine intelligence will play in the Exponential Age.

The last few days have seen yet another step forward in the epic journey we’ve taken with AI in 2023 — and another example of what I’m talking about.

Researchers at Google DeepMind used a large language model (LLM) to create authentically new mathematical knowledge. Their new FunSearch system — so called because it searches through mathematical functions — wrote code that discovered new, record-sized solutions to a famous geometrical puzzle called the cap set problem.

The researchers used an LLM called Codey, based on Google’s PaLM 2, which can generate code intended to solve a given maths problem. They tied Codey to an algorithm that evaluates its proposed solutions, and feeds the best ones back to iterate upon.
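To make that loop concrete, here's a minimal sketch in Python of the generate-evaluate-feedback cycle the researchers describe. To be clear, this isn't DeepMind's code: the LLM call and the scoring step are stand-ins I've mocked up, and all of the names are my own.

```python
import random

def propose_candidate(best_programs):
    """Stand-in for the LLM step: in FunSearch, Codey is prompted with
    the best-scoring programs found so far and asked to write a new
    variant. Here we just tag one of them with a random suffix."""
    parent = random.choice(best_programs)
    return parent + f"  # variant {random.randint(0, 10**6)}"

def score_candidate(program):
    """Stand-in for the evaluator: FunSearch actually runs the candidate
    code against the problem and measures how good its output is (for
    the cap set problem, roughly how large a valid cap set it builds).
    Here we just return a random score."""
    return random.random()

def funsearch_style_loop(seed_programs, iterations=1000, pool_size=10):
    """The feedback loop described above: keep a small pool of the
    highest-scoring candidates and repeatedly ask the model to improve
    on them, feeding the best results back in as prompts."""
    pool = [(score_candidate(p), p) for p in seed_programs]
    for _ in range(iterations):
        candidate = propose_candidate([p for _, p in pool])
        pool.append((score_candidate(candidate), candidate))
        pool.sort(key=lambda pair: pair[0], reverse=True)
        pool = pool[:pool_size]
    return pool[0]  # best (score, program) pair found

best_score, best_program = funsearch_style_loop(["def solve(): ..."])
```

The design point is the feedback: the model isn't asked to crack the problem in one shot. It's asked millions of times to improve on its own best previous attempts, with an automatic scorer deciding what 'best' means.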

They framed the cap set problem in the Python programming language, leaving blank spaces for the code that would express a solution. After a couple of million tries — and a few days — the mission was complete. FunSearch had produced code that beat the best known results for this geometrical problem, which mathematicians have been puzzling over since the early 1970s.
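And for anyone wondering what FunSearch was actually being graded on: a cap set is a collection of points in n-dimensional space over the integers mod 3, with no three of those points lying on a single line. Over mod-3 arithmetic, three distinct points are collinear exactly when they sum to zero in every coordinate, so the property takes only a few lines of Python to verify. The checker below is my own illustration, not the evaluation code from the paper.

```python
from itertools import combinations

def is_cap_set(vectors, n):
    """Check the defining property of a cap set: no three distinct
    points in the set lie on a line in n-dimensional space over the
    integers mod 3. Three distinct points are collinear here exactly
    when their coordinates sum to zero mod 3 in every dimension."""
    for a, b, c in combinations(vectors, 3):
        if all((a[i] + b[i] + c[i]) % 3 == 0 for i in range(n)):
            return False  # found three points on a line
    return True

# Example: these four points form a cap set in two dimensions
print(is_cap_set([(0, 0), (0, 1), (1, 0), (1, 1)], 2))  # True
```

In two dimensions, four points is as large as a cap set can get; FunSearch's contribution was finding record-sized constructions in higher dimensions, where the number of candidate sets is far too vast to check by brute force.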

DeepMind say it’s the first time an AI has produced verifiable and authentically new information about a longstanding open problem in science or mathematics.

‘To be honest with you,’ said Alhussein Fawzi, one of the DeepMind researchers behind the project, ‘we have hypotheses, but we don’t know exactly why this works.’

⚡⚡ The Exponentialist Take:

For pure mathematicians, progress on the cap set problem is a big deal. For the rest of us, not so much.

But this result really matters, because it resolves a central and much-discussed question about LLMs: can they create new knowledge?

Until this week, many believed LLMs would never do this — they thought they’d only ever be able to synthesize and remix knowledge that already existed in their training data. But we can be sure FunSearch’s cap set constructions weren’t lurking in the data used to train Codey, because nobody had ever found them before. Instead, Codey created novel and true information all of its own making.

This points to a future in which LLMs solve problems in, for example, statistics and engineering, or create new and viable scientific theories.

In other words, this little and somewhat nerdish research paper heralds a revolution. So far, only we humans have been able to push back the frontiers of what we know. It’s now clear that in 2024, we’ll have a partner in that enterprise.

It’s yet another building block in the Exponential Age.

The acceleration we’ve seen this year in AI, but also in robotics, the life sciences, quantum computing, and much more, has been something to behold. I’m more convinced than ever — more so than even just a couple of years ago — that an unprecedented socio-technological acceleration is coming. The economic and social implications will be profound.

For Exponentialist members, I’m writing a comprehensive look ahead at what 2024 has in store; it’s coming in the January instalment.

The TL;DR? It’s been a wild year; things are about to get even wilder.

*

Thanks for reading…!

I’ll keep watching this and the many other technology revolutions underway, and working to make sense of them for all of us, as investors and as humans.

And I’ll be back next week with another postcard from the Exponential Age. In the meantime, if you’d like to join us in the Exponentialist, you can become a charter member right here.
