Universities outdated? (Human vs ChatGPT comparison)
I started this by writing an adequate, timeboxed short piece on why I think the standard K-12,16,20 system has probably peaked in the US. I have a few other things I’d rather write, but they take longer and require more research, and I wanted to keep the one-post-per-day tempo. Then I decided to ask ChatGPT, to see if it would do an adequate job, and realized that comparing the two was at least as interesting as, if not more interesting than, the original topic.
(There are plenty of great treatments of the topic itself, far better than my 40-minute effort; I strongly recommend Paper Belt on Fire by Michael Gibson, among others.)
I think my conclusion is that ChatGPT wins vs. human when the desired results are “sufficient”, easily evaluated, and time constrained. This doesn’t cover everything, but it does cover the vast majority of routine business and professional writing. The ideal is probably to have some very well written human (and maybe collaborative/iterative human) works which are just reused verbatim much of the time, with AI output used when these standard works need to be tailored or adapted.

The obvious downside is that the standard works or “masterworks” don’t just magically appear: some of the best training for a great creator is rote copying of masters, followed by improvisation and adaptation of the master’s work, followed by innovation. Without these human training and development opportunities, the production of masterworks will decrease, even as individual creators initially become more productive due to technological assistance (putting 100% of their time into creating these works, with lesser tasks delegated to AI).

There’s a bad form of this argument which says any technology reduces the craft and skill of the creator, but I don’t think a writer is a worse writer because he uses a computer with infinite storage, nearly effortless production of each individual character on the screen, speech-to-text transcription, and draft and revision tracking vs. manually carving into stone tablets or using the lower-class innovation of papyrus or vellum. Maybe there are cases where contemplation or cost makes a stone-tablet inscriber better (certainly more concise…), but on the whole the cost isn’t worth it, and AI-augmented writing seems to fall into the same category. Drawing a hard distinction between the physical manipulation of objects and the intellectual task of creation is attractive, but it breaks down in the details.
My attempt: (heavily timeboxed to 40m; would be much better with editing, but then would be a 1-4 hour effort)
I don’t think the isolated, dedicated, and rigid program of K-12, followed by four years of college or university, followed potentially by grad school before entering adult life, is the optimal path for almost anyone anymore.
This track, with detours along the way for “dropout”, “high school only”, and “undergrad only”, and maybe a tweak or two like a gap year between high school and college, a brief stint in the workforce between college and grad school, or summer jobs and internships during the undergraduate years, has been largely the US default, at least aspirationally, since around 1945. K-12 had already been fairly universal in the US for nearly a century, mostly (local) government funded. In addition to K-12’s educational value, K-8 especially provides a childcare function, particularly in a world of dual-income and single-parent families, allowing mothers to enter the outside-the-home workforce.
College costs were initially subsidized heavily by the federal government (through the GI Bill for returning WW2 veterans), and the entire program wasn’t unbearably expensive: a normal person could pay his way through college by working a side job and over the summers. Graduates of many graduate programs became superfluous, able only to become professors themselves, and the increase in the supply of college degrees made them an easy credential to require for other employment. The split between the “signaling” and “human capital increase” aspects of college is pretty clear, given that most students test about the same graduating as they did going in.
I am somewhat biased as a high school dropout (I left to go to college early, after some exposure to college and especially the Internet) and a college dropout (for financial reasons, and to do a startup during the first dotcom bubble and the anonymous-electronic-cash opportunity days), but I don’t think my perspective is too far out of the mainstream today. There are people who benefit from the current system (university administrators and tenured professors most of all). The lowest-ranked student who is just barely capable of graduating with a credential probably gets a big surplus, and there are social-engineering programs to drive certain groups through the educational system to get those credentials. Most of the signaling value is the category (“university degree”), followed by the specific school (based on reputation, especially selectivity, but also on public accomplishments in other domains). There is some human capital formation, largely through networking. There is value provided through access to labs, the intellectual giants of a field, etc., the ostensible point of the programs, but it appears very easy to attend (and even graduate from) most programs without much measurable benefit from these.
[…running out of caring here…summarizing history and status quo isn’t very interesting…]
Credentialism has imposed high costs on the economy. For many, entering a chosen profession is out of reach, or at least out of immediate reach, due to the costs (time, opportunity cost, direct tuition, support) of gaining a credential, and not merely for doctors and civil engineers, but for flower arrangers, real estate brokers, etc. Credentialism is enabled by credentials, and the “undergraduate degree” serves as the fungible, general-purpose credential.
One issue is that actual capabilities testing isn’t done. Partially this is because IQ tests and other generalized tests are legally risky (Griggs v. Duke Power); partially it’s because adequate work-sample tests are difficult to create and administer; and partially it’s because hiring is to some degree delegated to Human Resources departments, one of the main non-academic bastions of degreed but fundamentally unemployable people.
Segregating children from the adult world for their entire lives up to (potentially) their mid-to-late 20s has many problems of its own. Before the Prussian-derived educational system, students would be in mixed environments with children of different ages, and would often apprentice or work with adults, at least part-time. This isn’t “coal mining is good for 10-year-olds”, but “a 14-year-old works with his uncle to learn a trade”. As trades (and all aspects of daily life) become more technological, it’s natural that the 12-16-year-old apprentice would be helping with intellectual tasks, and with both human and machine assistance (video lectures, interactive workbooks, AI sessions…) would be learning “academic” skills at the same time. Children in the workforce also learn about commerce, morals and ethics, etc. in a much more natural environment, and see the benefits of work rather than doing it merely “because you have to”.
All of this might seem hard to believe, but we have great examples in certain fields already. Kids teach themselves how to program when given a real incentive, like writing tools for games, chat systems, etc. “Social Media Manager” is a “real job” which many teenagers do for free. Autistic levels of well-researched expertise on various topics aren’t rare at all; they just tend to be topics other than “German idealism of the 18th and 19th centuries”. Accessible tools which encourage manipulation and learning, without gatekeeping, are a great way to harness this natural interest.
[extra points which would get integrated with editing:]
(The highest costs in the current system are probably borne by those who drop out at various stages without good options. A high school dropout in a world of strong credentialism is a recognized problem, which is why the GED was created; a college dropout often gets very little of the college-attendance value premium but can carry a lot of the cost.)
College itself has added many costs beyond the financial (tuition and fees, themselves substantial and often leading to debt) and the opportunity cost. Colleges are socially fairly homogeneous, left-liberal, and authoritarian. Delaying “starting life” until the late 20s also suppresses human flourishing, including marriage and children.
[conclusion, doesn’t really exist without editing]
There are some hacks. If a lot of the value is acceptance and signaling, “accepted to the best school, then dropped out to do X” works, provided the dropout was clearly not “improperly accepted”. Theoretically one could get accepted to MIT, attend for one paid semester, and then hang around for another 2-3 years not paying but taking classes (getting retroactively de-registered for non-payment at the end of each semester…), then eventually leave to do a startup in the Caribbean. Purely theoretically, of course. The unpaid bursar bills are normal debt (and universities don’t tend to collect those aggressively, really only blocking graduation…). Combined with education from the Internet and other sources, you can get a relatively strong credential at a small fraction of the cost. Ironically, this only really works at the very top institutions (famously, Bill Gates and Mark Zuckerberg at Harvard).
Hopefully, as we move away from the outdated credentialism of attended/graduated time-duration programs and instead focus on measuring specific skills, K-12-16-20 will become less critical. Educational programs which are more ongoing and parallel to life (from roughly the teens onward; there is probably a basic foundation of social and academic skills needed first, K-6 or K-8, in a mixed-ages setting), less rigidly structured, and more adaptive to actual needs will also help.
[…done. enjoy the free content…realistically a good version of this is 2-4h of writing, and hours of editing…]

ChatGPT’s attempt:
In the United States, college has long been seen as a necessary step towards success, with many people believing that it is the key to finding a good job and achieving financial stability. However, in recent years, there has been a growing belief that college is overrated, or even outdated, in today's job market.
One reason why college may be overrated is the cost. The price of tuition, fees, and living expenses has been steadily increasing, and many students end up taking on significant amounts of debt in order to pay for their education. This debt can take years, or even decades, to pay off, and can have a significant impact on an individual's financial well-being.
Another reason why college may be overrated is that it does not guarantee success. While a college degree can certainly be an asset in the job market, it does not guarantee that an individual will be able to find a good job or advance in their career. There are many people who have been successful without a college degree, and in some cases, real-world experience and skills may be more valuable to an employer than a college education.
Additionally, the job market is constantly changing, and the skills that are in demand may not always align with what is taught in college. Many jobs that didn't exist a few years ago now require specific skills and training, and a traditional college degree may not provide the necessary preparation.
Finally, college is not for everyone. Some people may not thrive in a traditional academic setting, and may be better suited to other forms of learning or job training. For these individuals, the time and money spent on a college education may not be worth it.
In conclusion, while college can certainly be a valuable experience for some people, it is not right for everyone. It is important to carefully consider the costs and benefits of a college education, and to determine whether it is the best path for you in today's job market.
I, for one, welcome our new AI overlords.