AI programs would need a ‘world view’ that stuffing them with zettabytes of data just can’t provide
This is the second column I am writing about the limitations of ‘foundation models’. A few months ago, research on Artificial Intelligence (AI) took a new turn with the long-anticipated release of foundation models such as GPT-3, DALL-E and BERT. These models attempt to ingest almost every document available on the internet, in what can only be described as a brute-force attempt to build as comprehensive a repository as possible for future AI programs to draw their ‘training’ data from.