
The rise of machines: Timeline of the evolution of Artificial Intelligence – Economic Times

ET Bureau | 27 Jul 2020, 04:42 PM IST

1/7 First, what is the big fuss really?

In July, San Francisco-based OpenAI stunned everyone with the capabilities of its AI language model, GPT-3. The text generator can pen fiction, compose poetry and write business memos, all without any human intervention. Simply put, it is seen as a tool that brings machines a little closer to mimicking human intelligence. A look at how AI has come this far.

GPT-3, or Generative Pre-trained Transformer 3, can take half a sentence as input and type out the rest correctly. It doesn't stop there: the text predictor can then produce a whole paragraph, or a book, that makes logical sense. GPT-3, however, lacks the ability to reason abstractly and is at a loss when faced with new ideas.

2/7 So it is just a glorified letter writer?

Even though GPT-3 is in the beta stage, it can code in any language, design websites and even prescribe medication. An earlier version, GPT-2, released in 2019, had 1.5 billion parameters. This one has 175 billion. It can ingest everything available on the internet to achieve its task.

Man & Machine Merge

AI-driven advancements will see machines mimicking humans and surpassing them at some tasks. Futurists like Ray Kurzweil say machines will reach the "singularity", the point at which they become smarter than human beings. This will lead to the development of Artificial General Intelligence (another term for human-level intelligence). Kurzweil is betting on machines matching human intelligence by 2029.

3/7 1637: I Think, Therefore I Am

Long before computers emerged, the philosopher Rene Descartes pondered, in the 17th century, the possibility of machines thinking and making decisions. He concluded that humans are superior because they can think: Cogito, ergo sum, or "I think, therefore I am". However, he said, a machine could perform a specific task, or a set of machines could come together to perform a task.
Today, the former is called specific AI and the latter is called general AI.

1950s: Dartmouth Conference & Modern AI

British mathematician Alan Turing is considered a pioneer of AI. He came up with the idea of building machines that could think. The Turing test, which he proposed in 1950, is still used to judge whether a computer is capable of thinking. Later, Dartmouth College professor John McCarthy coined the term "artificial intelligence" to define this ability, at a conference in 1956. The Dartmouth Conference laid down the framework for academic exploration and the development of "thinking machines".

4/7 1966: ELIZA is Here

ELIZA, a natural language processing program developed at MIT's Artificial Intelligence Lab, was the first chatbot and a direct ancestor of Siri, Alexa and other digital assistants. ELIZA could not talk like Alexa; it communicated via text and was not capable of learning from human conversations. Yet it paved the way to breaking down the communication barrier between humans and machines.

1990s: Internet, the Fuel for AI

In 1991, Tim Berners-Lee put the world's first website online and published how HTTP, the hypertext transfer protocol, worked. That ushered in the world wide web. Even before that, some computers were connected to share data, but the web made it easier for anyone to plug into what we call the internet. Data generated by users is fuel for AI.

5/7 1997: Deep Blue Beats Human

IBM's Deep Blue computer defeating chess champion Garry Kasparov was a watershed moment for AI. Deep Blue, which built on chess-computing research dating back to the 1950s, leveraged the increased capacity for raw computing power and applied it to chess to beat Kasparov. Computers soon became better at activities in which humans had so far been unbeatable.

2011: Watson's Victory

IBM's computer engine Watson played against humans on the TV quiz show Jeopardy! and went home with the $1 million prize.
This again reinforced the proposition that machines can win games whose moves can be described mathematically.

6/7 2012: Machines Spot Cats

Researchers at Stanford and Google tried to accelerate the pace of AI development by training machines to recognise pictures of cats. They were able to create a network with about one billion connections, calling it a big step towards building an "artificial brain". Of course, much more research is required: the human brain has 86 billion neurons joined by a network of around 10 trillion connections.

2016: AlphaGo's Charge

AlphaGo, created by Google subsidiary DeepMind, defeated world Go champion Lee Sedol. Go was considered the most challenging classical game for artificial intelligence because of its complexity. Did you know there are over 100,000 possible opening moves in Go? Chess has 400.

2018: Self-Driving Cars

The year marked a significant milestone in machines doing complex human tasks like driving. Google spinoff Waymo runs a self-driving taxi service in Phoenix, Arizona.
Users can request a pickup via an app and are driven around by a technology that combines computers, sensors, cameras and AI.

7/7 AI in the next 15 years

- Between 70% and 90% of all initial customer interactions are likely to be conducted or managed by AI.
- Product development in a range of sectors, from fashion items and consumer goods to manufacturing equipment, could increasingly be undertaken and tested by AI.
- Individuals will be able to define and design the personalised products and services they require in sectors ranging from travel and banking to savings and insurance.
- Autonomous vehicles will start appearing in many cities across the world.
- Crypto tokens may be accepted alongside fiat currencies as we move towards a single global medium of exchange.
- AI is likely to penetrate every commercial sector.
- Self-aware and self-replicating software systems and robots will emerge.
- There is a possibility of achieving Artificial General Intelligence (human-level intelligence).

(Sources: Wired, Medium)
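The 1966 ELIZA entry above describes a chatbot that replied to typed text without understanding it. As an illustration only (the rules below are invented for this sketch, not ELIZA's original DOCTOR script), a minimal pattern-matching responder in Python might look like this:

```python
import re

# Hypothetical reflection rules in the spirit of ELIZA: match a pattern
# in the user's text and echo part of it back in a canned template.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r".*\bmother\b.*", "Tell me more about your family."),
]

def eliza_reply(text: str) -> str:
    """Return a scripted response chosen by simple pattern matching."""
    text = text.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Please go on."  # default when no rule matches

print(eliza_reply("I feel tired"))  # -> Why do you feel tired?
print(eliza_reply("Hello there"))   # -> Please go on.
```

The program never learns or understands; like ELIZA, it only rearranges the user's own words, which is exactly why it could converse by text but not reason.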