Theory and Conceptual History of Artificial Intelligence

Artificial Intelligence

Artificial intelligence (AI) is a sixty-year-old discipline that brings together a collection of sciences, theories, and techniques (including mathematical logic, statistics, probability, computational neuroscience, and computer science) aimed at imitating the cognitive abilities of a human being. Initiated in the midst of the Second World War, its development is closely tied to that of computing, and it has allowed computers to perform increasingly complex tasks that could previously only be entrusted to a human. However, this automation remains far from human intelligence in the strict sense, which leads some experts to question the name. The ultimate stage of this research (a “strong” AI, i.e. one able to contextualize very different specialized problems in a completely autonomous way) is in no way comparable to current achievements (“weak” or “narrow” AIs, which are extremely efficient within their training field). The “strong” AI, which has so far materialized only in science fiction, would require advances in basic research (not merely performance improvements) to be able to model the world as a whole. Since 2010, however, the discipline has experienced a new boom, driven by considerable improvements in computing power and access to massive quantities of data. Recurring promises and fantasies make it difficult to grasp the phenomenon objectively; brief historical reminders can help to situate the discipline and inform current debates.

1940-1960: Birth of AI in the wake of cybernetics

AI was born between 1940 and 1960, in the wake of cybernetics. This period was marked by the conjunction of technological developments (of which the Second World War was a catalyst) and the desire to understand how to bring together the functioning of machines and living beings. For Norbert Wiener, a pioneer of cybernetics, the goal was to unify mathematical theory, electronics, and automation into “a whole theory of control and communication, both in animals and machines.” As early as 1943, Warren McCulloch and Walter Pitts had produced the first mathematical and computer model of the biological neuron (the formal neuron). In the early 1950s, John von Neumann and Alan Turing were the founding fathers of the technology underpinning AI: they led the transition from computers built on nineteenth-century decimal logic (dealing with values from 0 to 9) to machines based on binary logic (which relies on Boolean algebra, operating on chains of 0s and 1s of varying lengths).
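To make the idea of the formal neuron concrete, here is a minimal sketch in Python of a McCulloch-Pitts-style threshold unit operating in binary logic; the weights and thresholds below are illustrative assumptions, not values taken from the 1943 paper.

```python
# A minimal McCulloch-Pitts-style formal neuron: binary inputs are
# weighted, summed, and compared against a threshold to produce a
# binary output. Weights and thresholds are illustrative only.

def formal_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of binary inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Logical AND: both inputs must be active to reach the threshold.
assert formal_neuron([1, 1], weights=[1, 1], threshold=2) == 1
assert formal_neuron([1, 0], weights=[1, 1], threshold=2) == 0

# Logical OR: a single active input is enough.
assert formal_neuron([0, 1], weights=[1, 1], threshold=1) == 1
```

By choosing weights and thresholds, such units can realize elementary logical functions, which is what made them a plausible bridge between Boolean algebra and the biological neuron.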

[Image: Artificial Intelligence Development. Photo credit: https://www.surveycto.com]

The two researchers thus formalized the architecture of today’s computers and demonstrated that such a machine is universal, capable of executing whatever it is programmed to do. Turing, for his part, raised the question of a machine’s possible intelligence for the first time in his famous 1950 article “Computing Machinery and Intelligence,” in which he described an “imitation game” where a human, in a teletype dialogue, should be able to tell whether they are talking to a man or a machine. However controversial this article may be (many experts do not regard the “Turing test” as a valid criterion), it is often cited as the origin of the questioning of the boundary between human and machine. The term “AI” is credited to John McCarthy, and Marvin Minsky defined it as “the construction of computer programs that engage in tasks that are currently more satisfactorily performed by humans because they require high-level mental processes such as perceptual learning, memory organization, and critical reasoning.” The summer 1956 workshop at Dartmouth College, funded by the Rockefeller Foundation, is regarded as the birthplace of the discipline.

Anecdotally, it is worth noting the limited success of what was not a conference but rather a workshop: only six people, including McCarthy and Minsky, remained consistently involved throughout the work (which relied essentially on developments based on formal logic). Although the technology remained fascinating and promising (see, for example, the 1963 article “What Computers Can Do: Analysis and Prediction of Judicial Decisions” by Reed C. Lawlor, a member of the California Bar), its popularity declined in the early 1960s. The machines had very little memory, which made it difficult to use a computer language. Some foundations, however, were already in place, such as solution trees for problem solving: as early as 1956, the IPL (Information Processing Language) had made it possible to write the LTM (Logic Theorist Machine) program, which sought to prove mathematical theorems. In 1957, the economist and sociologist Herbert Simon predicted that AI would beat a human at chess within the next ten years; AI then entered its first winter, but Simon’s prediction did eventually turn out to be correct…
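To illustrate the solution-tree idea mentioned above, here is a minimal sketch in Python of a depth-first search over a tree of candidate states; the tiny numeric puzzle is an invented example, not the Logic Theorist itself.

```python
# A toy solution-tree search: starting from an initial state, explore
# successor states depth-first until a goal state is found. The tiny
# state space below is purely illustrative.

def solve(state, successors, is_goal, path=None):
    """Depth-first search over a solution tree; returns the path to a goal, or None."""
    path = (path or []) + [state]
    if is_goal(state):
        return path
    for nxt in successors(state):
        result = solve(nxt, successors, is_goal, path)
        if result is not None:
            return result
    return None

# Hypothetical puzzle: reach 10 from 1 by doubling or adding 1.
path = solve(1,
             successors=lambda n: [n * 2, n + 1] if n < 10 else [],
             is_goal=lambda n: n == 10)
print(path)  # [1, 2, 4, 8, 9, 10]
```

Programs such as the Logic Theorist searched trees of this kind, with proof steps in place of arithmetic moves.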

1980-1990: Expert systems

In 1968, Stanley Kubrick directed the film “2001: A Space Odyssey,” in which a computer, HAL 9000 (each letter just one off from those of IBM), embodies the ethical questions raised by AI: will it be a highly sophisticated entity, a good for humanity, or a danger? The film’s impact was of course not scientific, but it helped popularize the theme, as did the science-fiction author Philip K. Dick, who never stopped asking whether machines might one day experience emotions. With the advent of the first microprocessors at the end of the 1970s, AI took off again and entered the golden age of expert systems. The way was paved by DENDRAL, an expert system specialized in molecular chemistry (1965), and by MYCIN, a system specialized in the diagnosis of blood diseases and prescription drugs (Stanford University, 1972). These systems were built around an “inference engine,” designed to reason logically in the manner of a human.

When data was entered, the engine supplied answers with a high level of expertise. The promises foresaw massive development, but the craze fell away at the end of the 1980s and the beginning of the 1990s. Programming such knowledge took a great deal of effort, and beyond 200 to 300 rules a “black box” effect set in: it was no longer clear how the machine reasoned. Development and maintenance thus became extremely problematic, especially as simpler and cheaper alternatives were available. It is worth remembering that in the 1990s the term “artificial intelligence” had almost become taboo, and more modest variations, such as “advanced computing,” had even entered university jargon. The victory of Deep Blue (IBM’s expert system) over Garry Kasparov at chess in May 1997 fulfilled Herbert Simon’s 1957 prediction 30 years later, but it did not support the financing and development of this form of AI. Deep Blue operated through a systematic brute-force algorithm that evaluated and weighted all possible moves. The defeat of the human remained highly symbolic, but Deep Blue had in reality only managed to handle a very limited perimeter (the rules of the game of chess), far from the capacity to model the complexity of the world.
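To give a concrete sense of how such an inference engine works, here is a minimal forward-chaining sketch in Python: if-then rules are applied to known facts until nothing new can be derived. The toy rules and facts are invented for illustration and are far simpler than the hundreds of rules of a system like MYCIN.

```python
# A minimal forward-chaining inference engine: each rule maps a set of
# premises to a conclusion; the engine repeatedly applies rules whose
# premises are all known until no new fact is produced. The toy medical
# rules below are invented for illustration.

RULES = [
    ({"fever", "rash"}, "suspect_measles"),
    ({"suspect_measles", "unvaccinated"}, "recommend_isolation"),
]

def forward_chain(facts, rules):
    """Derive all conclusions reachable from the initial facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"fever", "rash", "unvaccinated"}, RULES))
# derives 'suspect_measles' and then 'recommend_isolation'
```

Even at this scale, one can see how a few hundred interacting rules would make it hard to trace why a given conclusion was reached, which is exactly the “black box” effect described above.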

Since 2010: a new boom based on massive data and new computing power

The new boom in the discipline around 2010 can be attributed to two factors. First, access to massive volumes of data: to use algorithms for image classification and cat recognition, for example, it was previously necessary to carry out your own sampling, whereas today a simple Google search can yield millions of examples. Second, the discovery that the processors of computer graphics cards are highly efficient at accelerating the calculation of learning algorithms. Before 2010, given the iterative nature of these methods, processing an entire sample could take weeks; the computing power of these cards (capable of more than a thousand billion operations per second) has since enabled considerable progress at limited financial cost (less than 1,000 euros per card). This new technological equipment has led to several notable public achievements and increased funding: in 2011, Watson, IBM’s AI, defeated two champions of Jeopardy!, and in 2012, Google X (Google’s research lab) managed to have an AI recognize cats in videos.
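To see what “iterative” means here, the sketch below shows the basic loop of a learning algorithm: a minimal gradient descent fitting a single parameter by least squares. The data and hyperparameters are invented for illustration; it is this kind of repeated pass over samples, scaled up enormously, that graphics cards accelerate.

```python
# A minimal gradient-descent loop fitting y ≈ w * x by least squares.
# Each pass over the data nudges the parameter slightly; real systems
# repeat such passes over vastly larger samples, which is the workload
# that GPUs parallelize.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # invented (x, y) samples
w = 0.0     # parameter to learn
lr = 0.01   # learning rate

for epoch in range(1000):  # the iterative part
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad         # small step against the gradient

print(round(w, 2))  # ≈ 2.04, close to the slope underlying the samples
```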

Conclusion

This last achievement required more than 16,000 processors, but the potential was extraordinary: a machine was learning to distinguish between things. In 2016, AlphaGo (Google’s AI specialized in the game of Go) defeated the European champion (Fan Hui) and the world champion (Lee Sedol), and then itself (AlphaGo Zero). Note that the game of Go has a combinatorics far greater than that of chess (more than the number of particles in the universe), so such results cannot be achieved through raw brute force alone (as Deep Blue did in 1997).
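The combinatorics claim can be checked with a rough back-of-the-envelope bound: each of the 19×19 = 361 points of a Go board can be empty, black, or white, giving at most 3^361 configurations, far beyond the commonly cited ~10^80 particles in the observable universe.

```python
# Rough upper bound on Go board configurations: each of the 19x19 = 361
# points is empty, black, or white (ignoring move legality), compared
# with the commonly cited ~10^80 particles in the observable universe.
import math

positions_upper_bound = 3 ** (19 * 19)
print(f"about 10^{int(math.log10(positions_upper_bound))}")  # about 10^172
```

This is why Go resisted Deep Blue-style exhaustive search and required the learning-based approach that AlphaGo introduced.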
