At my university we were taught that we should learn how to learn.
It has proven to be very useful.
This, and also nothing is 100% new: knowledge in similar areas will always help.
I got an Electronics Engineering degree almost 30 years ago (and, funnily enough, I then made my career as a software engineer and barely used the EE stuff professionally, except when I was working in high-performance computing, but that’s a side note), and back then one of my software development teachers told us, “Every 5 years half of what you know becomes worthless.”
All these years later, in my experience he was, if anything, being a little optimistic.
Programming is something you can learn by yourself (by the time I went to uni I was already coding stuff in Assembly purely for fun, and whilst EE did have programming classes, they added up to maybe 5% of the whole thing, though nowadays with embedded systems it’s probably more). But the most important things uni taught me were the foundational knowledge (maths, principles of engineering, principles of design) and how to learn, and those have served me well in keeping up with that loss of relevance of half of what I know every 5 years, sometimes in unexpected ways: obscure details of microprocessor design I learned there turned out to be useful when designing high-performance systems; the rationale behind past designs sped up my learning of new things, purely because why stuff is designed the way it is has stayed the same; and even trigonometry turned out to be essential decades later for doing game development.
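As a small illustration of that trigonometry point, here's a minimal sketch of one of the most common operations in 2D game development, rotating a point (say, a sprite's corner) around the origin. The function name and values are my own, just for illustration:

```python
import math

def rotate_point(x, y, angle_deg):
    """Rotate a 2D point counter-clockwise around the origin by angle_deg degrees."""
    a = math.radians(angle_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

# A point at (1, 0) rotated 90 degrees lands (up to floating-point error) at (0, 1).
rx, ry = rotate_point(1.0, 0.0, 90.0)
```

The same sines and cosines show up everywhere in games: aiming, movement along a heading, camera transforms.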
So even in a fast-changing field like software engineering, a degree does make a huge difference: not from memorizing bleeding-edge knowledge, but from the foundational knowledge you get, the understanding of the thought processes behind designing things, and the learning how to learn.
GPT is not equipped to care about whether the things it says are true. It does not have the ability to form hypotheses and test them against the real world. All it can do is read online books and Wikipedia faster than you can, and try to predict what text another writer would have written in answer to your question.
If you want to know how to raise chickens, it can give you a summary of texts on the subject. However, it cannot convey to you an intuitive understanding of how your chickens are doing. It cannot evaluate for you whether your chicken coop is adequate to keep your local foxes or cats from attacking your hens.
Moreover, it cannot convey to you all the tacit knowledge that a person with a dozen years of experience raising chickens will have. Tacit knowledge, by definition, is not written down; and so it is not accessible to a text transformer.
And even more so, it cannot convey the wisdom or judgment that tells you when you need help.
This has to be the stupidest AI take yet.
Was learning to do math made “obsolete” by calculators?
One thing I found especially dumb is this:
Jobs that require driving skills, like truck and taxi drivers, as well as jobs in the sanitation and beauty industries, are least likely to be exposed to AI, the Indeed research said.
Let’s ignore the dumb shit Tesla is doing. We already see self-driving taxis on the streets. California allows self-driving trucks already, and truck drivers are worried enough to petition California to stop it.
Both of those involve AI - just not generative AI. What kind of so-called “research” has declared 2 jobs “safe” that definitely aren’t?
Mostly bullshit, because the ultimate goal of college isn’t to make you memorize the basic facts you need to graduate; the ultimate goal is to teach you how and where to learn about new developments in your field, or where to look up information you don’t know or don’t remember.
Yes, but the how and where to learn are changing too, which is the problem.
Claiming modern day students face an unprecedentedly tumultuous technological environment only shows a bad grasp of history. LLMs are cool and all, but just think about the postwar period where you got the first semiconductor devices, jet travel, mass use of antibiotics, container shipping, etc etc all within a few years. Economists have argued that the pace of technological progress, if anything, has slowed over time.
This is such a shit and out of touch article, OP. Why bring us this crap?
This has arguably always been the case. A century ago, it could take years to get something published and into a book form such that it could be taught, and even then it could take an expert to interpret it to a layperson.
Today, experts can not only share their research, they can do interviews and make TikTok videos about a topic before their research has even been published. If it’s valuable, 500 news outlets will write clickbait about it, and students can do a report on it within a week of it happening.
A decent education isn’t about teaching you the specifics of some process or even necessarily the state-of-the-art, it’s about teaching you how to learn and adapt. How to deal with people to get things accomplished. How to find and validate resources to learn something. Great professors at research institutions will teach you not only the state-of-the-art, but the opportunities for 10 years into the future because they know what the important questions are.
In theory, you go to college to learn how to think about really hard ideas and master really hard concepts, to argue for them honestly, to learn how to critically evaluate ideas.
Trade schools and apprenticeships are where you want to go if you want to be taught a corpus of immediately useful skills.
deleted by creator
I mean, that’s really only true for compsci. While scientific and technological advances will indeed be made across STEM in general, they aren’t fast or significant enough to make what was learned unviable.
Not even there, though. The things I learned in my bachelor’s and master’s didn’t suddenly get made obsolete.
I’d like to see the innovation that makes algorithm theory obsolete.
Fair. I was thinking more about changes in coding language usage, but I suppose that also depended on when you were attending university. There have been periods where things changed faster in compsci than other periods.