The Fear of the AI, ChatGPT-4
Category: Artificial Intelligence | Friday - April 28 2023, 02:21 UTC

ChatGPT-4 is an AI that grew out of the original task of guessing the next word in a sentence. Trained on 570GB of data and over 6 trillion words, it learned logic, reasoning, and grammar by mastering next-word guessing, and it can output 310 million words per minute. This makes it highly effective at summarizing existing works, coding and programming, and drawing vast connections across sets of information.
The martial arts legend and philosopher Bruce Lee famously said, "I fear not the man who has practiced 10,000 kicks once, but I fear the man who has practiced one kick 10,000 times."
Many now fear the AI, ChatGPT-4. ChatGPT grew from the original task of guessing the next word in a sentence, but it has done this task trillions of times. I fear not the AI who has practiced 10 trillion guesses once, but I fear the AI who has practiced next-word guessing 10 trillion times.
The nature of intelligence is being revealed with the success of ChatGPT and other Large Language Models. This AI success is showing some aspect of truth to the infinite monkey theorem. The infinite monkey theorem states that a monkey hitting keys at random on a typewriter keyboard for an infinite amount of time will almost surely type any given text, such as the complete works of William Shakespeare.
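The theorem is at bottom a probability statement. A minimal numerical sketch (with toy numbers not taken from the article, assuming a 27-character keyboard and a 6-character target word) shows how the chance of a random match approaches certainty as the number of attempts grows:

```python
# Toy illustration of the infinite monkey theorem (illustrative numbers only).
ALPHABET = 27            # assume 26 letters plus a space key
target_len = 6           # assume a 6-character target word

# Probability that one random 6-character attempt matches the target.
p_hit = (1 / ALPHABET) ** target_len

# Probability of at least one match in n independent attempts:
# 1 - (1 - p)^n, which tends to 1 as n grows without bound.
for n in (10**6, 10**9, 10**12):
    p_at_least_one = 1 - (1 - p_hit) ** n
    print(f"{n:>15,} attempts -> P(match) = {p_at_least_one:.6f}")
```

With a million attempts the match probability is still tiny, but by a trillion attempts it is essentially 1, which is the intuition the theorem formalizes.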
Generative AIs are learning reasoning and logic as emergent capabilities, developed out of the need to do more to master their task of next-word guessing. ChatGPT is more than just infinite monkeys. ChatGPT was trained on at least 570GB of data, and it learned from those large amounts of data. It will be interesting, to say the least, to see what emerges at 6,000 gigabytes of text and with larger amounts of sound, video, and other data formats. A study conducted by Google Books found that 129,864,880 books have been published since the invention of Gutenberg's printing press in 1440. At an average of 50,000 words per book, that is about 6.5 trillion words in total. ChatGPT may currently be outputting 310 million words per minute (wpm), or about 450 billion words per day.
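Both of those figures can be checked with simple arithmetic; the sketch below uses only the numbers given above:

```python
# Back-of-envelope check of the article's figures (all inputs from the text).
books_published = 129_864_880      # Google Books estimate since 1440
avg_words_per_book = 50_000

total_words = books_published * avg_words_per_book
print(f"Words in all books: ~{total_words / 1e12:.1f} trillion")    # ~6.5 trillion

output_wpm = 310_000_000           # claimed ChatGPT output, words per minute
words_per_day = output_wpm * 60 * 24
print(f"Output per day: ~{words_per_day / 1e9:.0f} billion words")  # ~446 billion
```

At roughly 446 billion words per day, such a system would output the equivalent of every book ever printed in about two weeks.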
Stephen Wolfram, the creator of WolframAlpha, has described how ChatGPT works. WolframAlpha is an answer engine developed by Wolfram Research that answers factual queries by computing answers from externally sourced data. WolframAlpha can answer almost any question in math and science. It was built upon an earlier product, Wolfram Mathematica, a technical computing platform.
Wolfram describes how ChatGPT discovered logic, reasoning, and grammar by mastering the task of next-word guessing. Getting extremely good at next-word guessing required encoding and discovering logic, reasoning, and math.
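The next-word-guessing task itself can be illustrated with a toy sketch reduced to simple bigram counts (the corpus and the helper name `guess_next` are purely hypothetical; real models like ChatGPT use neural networks, not word counts):

```python
from collections import Counter, defaultdict

# Toy next-word guesser: count which word follows each word in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def guess_next(word):
    """Return the word most frequently observed after `word` in training."""
    return following[word].most_common(1)[0][0]

print(guess_next("the"))  # "cat" follows "the" most often in this corpus
```

Scaling this idea from counting pairs in eleven words to modeling long contexts over trillions of words is, on Wolfram's account, what forces the emergence of structures that look like grammar and reasoning.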
We need to understand how much of this comes from developing a synthetic ability to read and reason: creating structures for context and for connecting dots, and ingesting descriptions that include all of the "answers" in Wikipedia and in every teacher's guide. This form of AI currently has limitations in how much multilevel reasoning it can achieve.
The visible flaws show up as hallucinations, where the connecting of dots goes off track. Once one flaw occurs, the errors compound.
This points to the need for many people and other systems to tune the AI and calibrate its answers.
These systems are highly useful for massive summarization of existing works of knowledge created by people, and for using what they have ingested and structured to mimic other works. ChatGPT can also act as a DJ and remix what it has. It is tireless in ingesting large amounts of all kinds of information.
It is good at coding and programming because it has ingested the vast body of written code on GitHub. Coding is very structured and involves a lot of remixing of many mini-algorithms or blocks of code.
This supports Wolfram's provable principle of science: for any system described in the broadest terms, you can actually prove theorems about it.