

From the editor's desk: Intelligence is needed to train AI

22 November 2023


Peter Howells, Editor

As I sit in front of my computer and ponder the year that is swiftly coming to a close, I again marvel at everything that has happened. It has definitely been a busy year, with announcements in the tech space flooding my inbox, especially when it comes to artificial intelligence and machine learning.

No matter who you speak to, they will have heard something about AI and the effects it will have on our everyday lives, and they will all have opinions of their own. Many of those opinions, however, are not based on any sort of scientific fact.

One opinion piece that did cross my desk this month was an article on the training of AI models. Many people do not realise that AI is trained on human-generated information. This mostly comes from information that already exists on the internet in some form: articles that have been written, responses to questions, shopping habits, browsing history; pretty much anything is fair game for use in the training of AI models.

But there is only a finite amount of non-repetitive information in existence. Let me clarify what I mean by this. Much of the enormous amount of information created each day is a copy of data that already exists: people reposting text, images and other multimedia, or copying data from one platform to another. All of this counts towards the total amount of information created, yet adds nothing new.
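To make 'non-repetitive' concrete, the small Python sketch below (my own illustration, not any published training pipeline) filters out duplicated documents by hashing their normalised text, so only genuinely new content is counted.

import hashlib

def unique_documents(documents):
    # Keep only documents whose normalised text has not been seen before.
    seen = set()
    unique = []
    for doc in documents:
        # Normalise case and whitespace so trivial reposts hash identically
        key = hashlib.sha256(" ".join(doc.lower().split()).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

corpus = [
    "AI is trained on human-generated information.",
    "AI is trained on   HUMAN-GENERATED information.",  # a repost, just reformatted
    "Much of the data created each day already exists elsewhere.",
]
print(len(unique_documents(corpus)), "unique documents out of", len(corpus))
# prints: 2 unique documents out of 3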

Yes, there is still a massive amount of information available for training, but AI models are becoming ever more powerful and are being trained on ever larger volumes of data. Even though the total amount of data generated globally this year was in the order of 120 zettabytes, much of it cannot be used for training models. ChatGPT, for instance, was trained on around 570 gigabytes of data, amounting to roughly 300 billion words. The more data used to train these AI models, the more accurate their responses become.
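Putting those two figures side by side gives a sense of scale. The back-of-the-envelope calculation below is my own simplification, using decimal units (1 GB = 10^9 bytes, 1 ZB = 10^21 bytes):

# Rough scale comparison, not an official figure from either source
global_data_bytes = 120e21      # ~120 zettabytes generated globally in a year
training_set_bytes = 570e9      # ~570 gigabytes of training text

fraction = training_set_bytes / global_data_bytes
print(f"Training set as a share of one year's data: {fraction:.1e}")
# roughly 5e-12, a few parts per trillion -- the limit is usable quality, not raw quantity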

And this is where the concern starts to kick in for AI researchers: the volume of data needed to train AI models is growing much more rapidly than the stock of online data. A paper published in 2022 predicted that, if the current training trend continues, we will run out of high-quality data before 2026. If models then turn to the remaining low-quality data, that too will be exhausted sometime between 2030 and 2050.
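The mechanism behind that prediction is simply one compound growth rate outrunning another. The toy extrapolation below uses hypothetical starting sizes and growth rates of my own choosing, not the figures from the 2022 paper, purely to show how the crossover point arises:

# Toy extrapolation: when does training-data demand overtake the usable data stock?
demand = 1.0          # relative size of today's training datasets (hypothetical)
stock = 1000.0        # relative size of today's usable high-quality data (hypothetical)
demand_growth = 0.50  # assumed 50% annual growth in data demanded for training
stock_growth = 0.07   # assumed 7% annual growth in new usable data

year = 2023
while demand < stock and year < 2100:
    demand *= 1 + demand_growth
    stock *= 1 + stock_growth
    year += 1
print("With these assumed rates, demand overtakes supply around", year)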

But do we really want to train our models on low-quality data? We all know the bad decisions that can be made when only poor-quality data is available. After all, the internet is full of examples of ‘average’ people doing stupid things based on lack of insight or forethought. Do we really want our artificial intelligences to be only as smart as the average person?

One hope is that newer AI models will have a lower data overhead, that is, that they can be trained suitably well on less data than their predecessors. I believe this would be similar to how many people reach conclusions: they are able to make quite reasonable decisions even when they do not know everything about a subject.

The one overriding thing I have taken away from all this talk about AI this year is that we are certainly living in an interesting and exciting era, even if it can be quite concerning at times.

To all our readers I would like to take this opportunity to wish you all a joyous and restful season. May your new year be filled with new goals, new achievements and above all, happiness.


