I was reading an article earlier about how AI is improving on tests of intelligence and creativity.
Now, I believe the "test of creativity" is likely deeply flawed, but the IQ tests are far more interesting to me.
If the article is to be believed, AI has gone from scoring in the lowest 2% on IQ tests to the lowest 37%.
That is still dramatically lower than most humans score, but such a change in just a year is quite a feat.
I still have my doubts, though. Just looking at how current AI models work suggests that AI may hit a barrier around the 50th percentile, but let us assume AI can break that barrier.
Let's estimate that AI truly does reach the top 1% of IQ.
Let's even assume that AI can reach the #1 IQ score of all time.
Would that make AI smarter than humans?
Would that mean we are doomed to serve AI, or would we be better off letting AI run our lives?
Well, I would say not so fast!
There are some interesting things to consider here.
It would be fair to say that even if an AI system did become #1 today, that does not mean humans could not outperform it in ten years.
We've seen with past technological advancements that humans seem to become exponentially better as new technology is developed, rather than technology replacing humans completely.
This could mean that once AI becomes #1, to improve it would have to learn from itself, which could quickly undo its progress.
Another possibility is that we would discover that an AI with the highest of IQs has the same faults as humans with the highest of IQs.
Those faults would then simply be amplified without the temperance of outside perspectives.
But let me return for a moment to what happens with technological advancements.
As has been observed with chess, when people are pitted against machines, the machine wins.
However, another observation has been made:
humans working with machines end up beating machines working alone.
When it comes to the idea of AI becoming smarter than humans, I think pure comparison is the wrong type of thinking.
If AI can get exceptionally better than current models (which, given recent happenings, seems questionable), I believe we will discover a similar occurrence.
I do not believe we have to worry about whether AI is smarter than humans.
That misses our greatest strength as humans:
adaptability.
I think that if AI models become "smarter", then humans will adapt and use the strengths of those models to reach new heights.
This would, in effect, create a different type of intelligence that humans would utilize to their advantage.
We would adapt, use the strengths of AI to our advantage, and go on to accomplish even more incredible things than we already have.
In this respect, if AI became "smarter" in some aspect of intelligence, we would become smarter in a different aspect.
More than anything else, I believe humans excel at adapting to new and different circumstances.
There is another aspect to consider, though.
If AI became smarter than humans, should we let it rule over us or do our thinking for us?
I believe that would be unwise.
AI is still a technology, and what would happen if, say, all technology were wiped out, a real possibility in a large-scale war or terrorist attack?
If we rely completely on AI to do our thinking, we leave ourselves vulnerable to many problems.
It could be fantastic to work with AI, but we must also understand how to think for ourselves.
Apart, humans and AI would each face major problems and disadvantages.
Together may become our best option, if AI can ever get to that level.