Just go to the CSE or EE department of any engineering college in India (other than the major IITs), and you will find 90% of the faculty and students working on what they refer to as "machine learning". In fact, I will not be surprised if it is 100%. I recently sat on a few faculty selection committees where all 30/30 candidates were in AI/ML (also called soft computing in India) !!! You will almost never find anybody working on hardware, theory, networking, databases, or computer graphics in a CSE department, or on devices, control theory, communication theory, or VLSI in an Electronics department. Even if someone has done her Ph.D. in theory, she just works on her version of AI/ML, and theses from B.Tech to Ph.D. are produced at an industrial scale. I was recently made aware of a university that has close to 500 faculty in CS, of whom 480+ work in ML !!!
Why ??? See, doing what I call "false ML", which is just limited to running Python libraries or Matlab code without having any clue about the underlying math, is easy, and you can do lots of fraudulent research with it. It is a research-paper printing press. Take 200 people, measure the size of their noses, how many intact teeth they have, and how many hours they spend watching Netflix, and predict their COVID symptoms. Train a CNN or an LSTM: you have a paper and also a Times of India headline. You don't need to have a clue about how CNNs or LSTMs work, or what a Hessian, a Jacobian, or a Singular Value Decomposition is; you just need to know how to collect data and run a Python function on it. There is no need to prove, verify, or justify any result. In fact, if one simply fakes results and spikes up the accuracy numbers substantially, it is almost impossible to find out. Just tell the other guy that he does not have the right hyper-parameters. Just run the Python code, show some result, and voila, you are all done !!!
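To see just how low the bar is, the entire "methodology" described above fits in a dozen lines. This is a hedged sketch (assuming scikit-learn is installed); the features and labels here are fabricated random noise, precisely to make the point that the pipeline runs and prints a "result" regardless of whether the data means anything:

```python
# The "research paper printing press" in action: no Hessians, no Jacobians,
# no SVD -- just library calls. The data below is pure random noise.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# 200 "patients", 3 fabricated features: nose size, intact teeth, Netflix hours
X = rng.normal(size=(200, 3))
# "COVID symptoms": labels assigned completely at random
y = rng.integers(0, 2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = model.score(X_te, y_te)
print(f"accuracy: {accuracy:.2f}")  # a headline-ready number, signifying nothing
```

Because the labels are random, any accuracy this reports is statistically meaningless, yet the script completes and produces a number one could put in a paper. That is exactly the problem.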
The other reason is peer pressure. This is more common in a post-truth world: perceptions matter and facts don't. Sadly, companies and the media are to blame. From hardcore hardware companies to BPOs, everybody says they do AI/ML even though they are clearly lying. I raised this issue with some of my friends who routinely come to IIT Delhi for campus interviews. I told them that they are wilfully misleading our students. Their work has no remote connection with AI/ML: one does hardware design for GPUs, one designs compilers, and another builds cloud monitoring frameworks. They all had the same answer: "It is hot, and that is what students and their parents want to hear; once they join, we will reset their expectations by making them do regular C++ coding !!!" Even luminaries like Elon Musk tout AI/ML even though they make money by designing electric cars, sending rockets to space, or simply drilling holes. They never ask people to become better Mechanical engineers or stellar Chemical engineers, even though they live off their talent. Similarly, in the software world, other than at a few startups, most employees write regular code and very rarely design and train neural networks. I wish this fact were more widely known, and that people would forget the notion that there is more "money" in AI/ML (a clear falsehood). This situation is no different from the lies that we typically tell just to appear "cool", when we clearly know that we don't mean them.
As a teenager: I want to be a doctor/engineer.
As a college student: I have a girlfriend who looks like a supermodel.
As an employee: I love my company and am very passionate about my work.
As a married man/woman: I love my in-laws.
As a 40+ person: My age is 35.
As a COVID-positive person: I didn't go anywhere or meet anyone; the virus came to me via an Amazon package !!!
Add "I love AI/ML" to the list.
What damage has it done? Well, a lot. We interview roughly 300 Ph.D. candidates. A standard question is, "How many lines of code have you written (lifetime total)?" Students seriously count. The answers are between 50 and 500 lines for CS graduates !!! Now, tell me: how will such students fare in the real world, or in a more competitive PG program that expects a reasonable amount of coding? Students struggle, and they struggle badly. In EE, things are equally bad. Students have barely any experience with EDA tools, embedded systems, or even lab equipment like oscilloscopes. The only real lab that students have done is Matlab. As a result, there is very little that one can achieve after 4 years of UG training. Students' practical skills are very, very poor, making them quite unemployable. This handicaps them for life and steals many opportunities, ranging from jobs to higher studies. Colleges have to become more responsible and teach students skills that they will use in the real world, such as coding, designing embedded systems, and operating communication equipment.
How to fix this? Universities have to ensure that students are developing "real lab" skills. Every CSE student should write at least 5,000 lines of code and work with one large piece of open-source software before getting a degree. Similarly, EE students have to show working projects that use a variety of chips in the lab. They also need to get their hands dirty with EDA tools, embedded systems projects, and networking hardware. Faculty should be encouraged either to do "true ML research" that involves a lot of serious math, or not to do it at all. It is far better for colleges to ensure that their students are working with and building real systems that actually work, even if the "perceived research value" is nil. Nothing stops colleges from giving projects on modifying the code of open-source compilers, web browsers, and the Linux kernel. Students will at least learn something that they will use later in life, as opposed to learning nothing and just producing research spam.
Disclaimer and conflicts of interest: I hold AI/ML research in very high regard; I have published one of the earliest papers in the computer architecture community that used AI techniques; I was instrumental in designing the data science curriculum for Class 11-12 CBSE board students; and I used to work on ML at IBM Research and have published papers on ML as well. I frequently use AI/ML techniques in my work, and they are often very useful. I am associated with the School of AI at IIT Delhi and highly encourage "genuine" AI/ML research.