Artificial Intelligence – The History, Now, and Future


Every decade seems to have its own technological breakthroughs. After modern computers became available following the Second World War, it became possible to develop programs that perform intricate intellectual tasks. From these programs, general-purpose tools are built with applications that solve a wide variety of everyday problems.

The history of Artificial Intelligence is quite interesting and dates back to the early 20th century, when science fiction familiarized the world with the concept of artificially intelligent robots. For instance, in 1920 the Czech writer Karel Čapek introduced a science fiction play named Rossumovi Univerzální Roboti (Rossum's Universal Robots), also known as R.U.R. The play introduced the word "robot" to the world. R.U.R. tells the story of a factory that builds artificial people called robots. They differ from today's notion of a robot: the robots of R.U.R. looked like living creatures, closer to what we would now call clones. The robots in R.U.R. at first worked for humans, but a robot rebellion followed that led to the extinction of the human race.

Early Days of Artificial Intelligence 

By the 1950s, the world had a generation of scientists, mathematicians, and philosophers with the concept of AI culturally assimilated in their minds. One such person was Alan Turing, a young British polymath who explored the mathematical possibility of artificial intelligence. He suggested that humans use available information as well as reason in order to solve problems and make decisions, so why couldn't machines do the same?

But significant challenges stopped Turing from getting to work right then and there. The first was that computers needed to change fundamentally: before 1949, computers lacked a key prerequisite for intelligence – they could only execute commands, not store them. In other words, computers could be told what to do but couldn't remember what they had done. The second was that computing was extremely expensive.

The Era of Success and Setbacks for AI

From 1950 to 1974, when AI was in its infancy, a great deal of pioneering research was performed, but the enormous hype that built up eventually pushed AI into a period of seclusion when research funding dried up. During the same period, technology flourished: computers could store more information and became faster, more affordable, and more accessible. Machine learning algorithms also improved, and people got better at knowing which algorithm to apply to their problem.

These accomplishments, along with the advocacy of leading researchers, convinced government agencies such as DARPA (the Defense Advanced Research Projects Agency) to fund AI research at several institutions. With funding pouring in, research in AI continued to gather steam.

However, breaking through the initial fog of AI revealed a large number of obstacles. The biggest was the lack of computational power to do anything substantial: computers simply couldn't store enough information or process it fast enough. At the time, many researchers noted that computers were still far too weak to exhibit intelligence. But in the 1980s, AI was reignited by two sources – an expansion of the algorithmic toolkit and a boost in funding.

Modern Aspect – AI is Everywhere 

Today, technological advancement continues to accelerate, and people now live in the age of Big Data, in which huge amounts of information can be gleaned – far more than any person could process unaided. In this context, the application of artificial intelligence has already proved quite productive in several industries, from technology and banking to marketing and entertainment, among others. As research in AI continues to grow, it will be interesting to see what this technology brings in the future.
