Posted by novaRom 5 days ago
After working at that company for a couple of years, I realized using tech in education (pre-university) was a mistake. One of the reasons I left.
In a decade or two the long-term consequences of inundating kids with tech and then removing it will be quite obvious. This will be studied for decades to come. Reminds me of the Dutch kids that were born during the 1944-1945 Dutch famine.
https://www.ohsu.edu/school-of-medicine/moore-institute/dutc...
I think we should use tech in education, but in a targeted way. It's important that children gain basic technical literacy, like how to touch type and use basic software. I suspect there is a gap in the technical literacy of lower income students, whose parents are less likely to have a computer at home.
The real problem is separating reading/writing skills from tech skills. We shouldn't stop teaching handwriting just because typing exists. And being able to read long books and essays teaches fundamental cognitive skills like attention, focus, and information processing.
For those, obviously you need a computer, and I completely agree that those are important skills to learn... But you maybe need to spend 1h/week during the last 2 years of middle school on them in the computer lab (as has been done since the 90s in many schools around the world)
But for any other course, such as Math, English (or whichever language is primary in your country), second languages, history, etc.: that's where using tech is a mistake
A bit of tech is ok, but it cannot be "everyone does their homework and reads the lesson on an iPad/Chromebook"
It makes me think back to my writing assignments in grades 6-12. I spent considerable time making sure the word processor had the exact perfect font, spacing, and formatting with cool headers, footers, and the footnotes, etc. Yet, I wouldn't even bother to proofread the final text before handing it in. What a terrible waste of a captive audience that could have helped me refine my arguments and writing style, rather than waste their time on things like careless grammatical errors.
Anyway, I do agree with the idea of incorporating Excel, and even RStudio for math and science as tools, especially if they displace Ed-tech software that adds unnecessary abstractions, or attempts to replace interaction with knowledgeable teachers. One other exception might be Anki or similar, since they might move rote memorization out of the classroom, so that more time can be spent on critical thinking.
* Have something like 5 bins, numbered 1-5.
* Every day you add your new cards to bin nr. 1, shuffle, and review. Correct cards go to bin nr. 2, incorrect cards stay in bin nr. 1.
* Every other day do the same with bins nr. 1 and 2, every fourth day with bins nr. 1, 2, and 3, etc., except incorrect cards go in the bin below. More complex scheduling algorithms exist (a minimal sketch in code follows this list).
* In a classroom setting the teacher can print out the flashcards and hand out a review schedule for the week (e.g. Monday: add these 10 new cards and review box 1; Tuesday: 10 new cards and review boxes 1 and 2; Wednesday: no new cards and review boxes 1 and 3; etc.)
* If you want to be super fancy, the flashcard publisher can add audio chips to the flashcards (or to each box set, plus a QR code on the card).
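A minimal sketch of that bin scheduling in Python — the power-of-two review cadence and the move-up/move-down rule come from the list above; the bin count, card format, and knows() callback are assumptions for illustration:

    import random

    NUM_BINS = 5

    def bins_due(day: int) -> list[int]:
        # bin n is due every 2**(n-1) days: bin 1 daily, bin 2 every other day, ...
        return [n for n in range(1, NUM_BINS + 1) if day % (2 ** (n - 1)) == 0]

    def review(bins: dict[int, list[str]], day: int, knows) -> None:
        moves = []  # collect moves so no card is reviewed twice in one day
        for n in bins_due(day):
            random.shuffle(bins[n])
            while bins[n]:
                card = bins[n].pop()
                # correct -> one bin up (capped), incorrect -> one bin down (floor at 1)
                moves.append((min(n + 1, NUM_BINS) if knows(card) else max(n - 1, 1), card))
        for target, card in moves:
            bins[target].append(card)

    # usage: the day's new cards go into bin 1, then review whatever is due
    bins = {n: [] for n in range(1, NUM_BINS + 1)}
    bins[1] += ["7*8", "9*6"]
    review(bins, day=1, knows=lambda card: True)  # stand-in for asking the student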
I'm all for going back to analog where it makes sense, but it seems wrongheaded to completely remove things that are relevant skills for most 21st century careers.
I don't think there's anything wrong with showing kids some videos every now and then. I still have fond memories of watching Bill Nye.
> Should we not teach basic numerical and statistical methods in Python?
No. Those should be done by hand, so kids can develop an intuition for it. The same way we don't allow kids learning multiplication and division to use calculators.
> No. Those should be done by hand, so kids can develop an intuition for it. The same way we don't allow kids learning multiplication and division to use calculators.
I would think that it would make sense to introduce Python in the same way that calculators, and later graphing calculators are introduced, and I believe (just based on hearing random anecdotes) that this is already the case in many places.
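For what that "calculator-level" introduction might look like, a hedged sketch: basic statistics computed from first principles, the kind of thing a student could check against hand calculations before ever being handed numpy (the data is made up):

    data = [12.0, 15.0, 11.0, 14.0, 13.0]

    n = len(data)
    mean = sum(data) / n                               # mean = (1/n) * sum(x_i)
    variance = sum((x - mean) ** 2 for x in data) / n  # population variance
    std = variance ** 0.5

    print(mean, std)  # 13.0 1.414...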
I'm a big proponent of the gradual introduction of abstraction, which my early education failed at, and something Factorio and some later schooling did get right, although the intent was rarely communicated effectively.
First, learn what and why a thing exists at a sufficiently primitive level of interaction, then once students have it locked in, introduce a new layer of complexity by making the former primitive steps faster and easier to work with, using tools. It's important that each step serves a useful purpose though. For example, I don't think there's much of a case for writing actual code by hand and grading students on missing a semicolon, but there's probably a case for working out logic and pseudocode by hand.
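As one hypothetical example of what "logic by hand, then code" could look like (the task here is invented):

    # Worked out on paper first:
    #   largest <- first item
    #   for each remaining item:
    #       if item > largest: largest <- item
    #   return largest
    # Direct translation once the logic is settled; a missed-semicolon-style
    # slip at this stage is a tooling problem, not a thinking problem:
    def largest(items):
        best = items[0]
        for item in items[1:]:
            if item > best:
                best = item
        return best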
I don't think there's a case for hand-drawing intricate diagrams and graphs, because it builds a skill and level of intimacy with the drawing aspect that's just silly, and tests someone's drawing ability rather than their understanding of the subject, but I suppose everyone has their own opinion on that.
That last one kind of crippled me in various classes. I already knew better tools and methods existed for doing weather pattern diagrams or topographical maps, but it was so immensely tedious and time-consuming that it totally derailed me, to the point where I'd fail uni labs despite the content not being very difficult, only because the prof wanted to teach it like it was the '50s.
I found that calculators didn't help all that much once you got into symbolic stuff. They were useful for the final reductions, obviously, but for algebra the lion's share of the work is symbolic and at least the relatively cheap two-line TI calculator I was using couldn't do anything symbolic.
I know that there are calculators that can do Computer Algebra System stuff, and those probably should be held off on until at least calculus.
I would rather a teacher have to draw a concept on a board than have each student watch an animation on their computer. Obviously, the teacher projecting the animation should be fine, but it seems like some educators and parents can't handle that and it turns into a slippery slope back to kids using devices.
So for most classrooms full of students in grades prior to high school, the answer to your list of (presumably rhetorical) questions is "Yes."
The problem is that people seem to want to go to extremes. Either go all out on doing everything in tablets or not use any technology in education at all.
It's not just work skills; it's also the better understanding that is gained from things such as the maths animations you mentioned.
I think the latter is mostly a reaction to the former. I think there is a way to use technology appropriately in theory in many cases, but the administrators making these choices are largely technically illiterate and it's too tempting for the teachers implementing them to just hand control over to the students (and give themselves a break from actually teaching).
Maybe. Back in the day I had classes where we had to learn the rough shape of a number of basic functions, which built intuition that helped. This involved drawing a lot of them by hand. Initially by calculating points and estimating, and later by being given an arbitrary function and graphing it. Using Desmos too early would've prevented building these skills.
Once the skills are built, using it doesn't seem a major negative.
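For concreteness, the point-tabulation exercise described above, as the kind of chore a tool can eventually take over once the intuition exists (the function is just an example):

    def f(x):
        return x ** 2 - 2 * x - 3   # example parabola; roots at x = -1 and x = 3

    # a small table of (x, f(x)) values to mark on graph paper by hand
    for x in range(-2, 5):
        print(f"{x:3d} {f(x):5d}")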
I think of it like a calculator. Don't let kids learning basic arithmetic use a four-function calculator, but once you hit algebra, that's fine (graphing calculators still aren't, though).
Best might be to mix it up, some with and some without, but no calculator is preferable to always calculator.
And Logo or BASIC >> Python in school context IMO.
I had computer lab in a catholic grade school in the mid-late 80's. Apple II's and the class was once a week and a mix of typing, logo turtle, and of course, The Oregon Trail.
What for? I've been writing computer programs and documentation since 1969 and I can't touch type. I've never felt enough pressure to do it. I can still type faster than I can think. When I'm writing most of my time is spent thinking not tapping the keys.
https://en.wikipedia.org/wiki/Touch_typing
I think they're referring to the latter.
And then I became an adult and visited China and met actual Chinese immigrants and married a native chopstick holder. And half of them don’t hold chopsticks “the real way”. Somehow it all works out. As long as you can eat a peanut with them, you pass.
As an adult I learned that there’s also a whole lot of prescriptive bullshit that basically nobody pays attention to. The strict definition of touch typing seems like one of those. If you can type without looking at the keys, you can touch type.
Just never cared to get perfect at it in school. I would get absolutely crushed on typing tests, though, by the kids who actually learned touch typing. They all had piano experience and could reach the modifiers while still holding on to the home row. I still can't really do that on my right hand; it's like my pinky doesn't reach.
I've heard this before too but apparently most people hold a pencil wrong anyway so it doesn't actually help.
You should be able to type without looking at your keyboard.
But the specific 5-finger arrangement often taught as "touch typing" isn't needed for that. Some common issues:
- It is often taught with an orthogonal arrangement of your hands to the keyboard, which is nearly guaranteed to lead to carpal tunnel syndrome if you have a typical keyboard/desk setup. Don't angle your wrists when typing.
- Pinky fingers of "average" hands already have issues reaching the right rows; with extra-small or extra-short hands, they often aren't usable as intended for touch typing.
I was taught touch typing as a kid. None of it took. I don't use the home row; I developed the gamer home-row hand positioning for typing.
If you’re capable of typing quick enough to publicly take meeting notes, then it’s fine. But if you can’t, I could see it being professionally embarrassing in the same way that not understanding basic arithmetic could be professionally embarrassing.
That’s the kind of (in)capability we’re talking about when it comes to Gen Z. Like not knowing ctrl-c ctrl-v.
What could you possibly teach about touch typing besides just telling people to do typing tests or write papers over and over again?
People aren't bad typists because they weren't taught. They're bad typists because they don't type.
Recordings are one of the worst ways to store knowledge for later reference, except in unusual scenarios. They're very awkward to work with. The only plus is they're cheap and easy to make.
Trust me, I work at a company where "documentation" is often an old meeting recording (and sometimes you have to count yourself lucky to even have that).
I had a boss that typed with one finger on each hand, it was laughable, but he was an incredible programmer, so it didn't affect him at all.
Touch typing would probably be faster, but I've never found slow typing speeds a limiting factor in either writing or software dev.
https://entropicthoughts.com/typing-fast-is-about-latency-no...
A few years ago I invested in a rectilinear split keyboard which has a slightly different layout but is much more ergonomic. And interestingly, I can now type 120wpm+.
I think touch typing is very similar to learning penmanship (and I guess cursive to an extent). If I followed the exact rules I learned about handwriting in school, I'd have much more legible handwriting but I'd write much more slowly. Instead I write my own way, which lets me get my thoughts out quickly, albeit not as neatly as "correct" penmanship. Fortunately typing is much more lenient on this front.
can even be harmful
IFF we interpret "touch typing" as the typically taught typing method and not just "typing without looking at the keyboard".
In general, key arrangement traces back to the physical limitations of typewriters, not ergonomics, and layout choice isn't exactly ergonomics-based either.
But even if it were, the biggest issue with touch typing is that it's often taught around the idea of your hands being somewhat orthogonal to your keyboard, _which they never should be_ (if you use a typical keyboard on a typical desk setup), as it leads to angling your hands/wrists, which is nearly guaranteed to cause you health issues long term if you type a lot.
The simple solution is to keep your wrists straight, which means using the keyboard with your hands at an angle to its layout instead of orthogonal to it. That inhibits perfect touch typing, but still allows something close to it.
As keys are arranged in shifted columns this kinda works surprisingly well; one issue is that the angle differs between the left and right hand :/
Split or Alice-style keyboards can also help a bit, but I often feel many designs kinda miss the point. Many supposedly ergonomic keyboards aren't really that ergonomic, especially if your hand is too large, too small, or otherwise unusual...
Which brings us to the next point: human anatomy varies a lot. Some people just have very touch-typing-incompatible hands, like very short pinky fingers making that finger unusable for typical touch typing (even with normal hands it's a bit suboptimal, which is why some keyboards shift the outer rows down by half a row).
Some of us "a bit older" seem to have gone through a golden era of tech, where we actually learned that tech en masse. In a class of maybe 30 students, around 20-25 of them were able to configure dial-up modems, get on IRC (servers, ports, and channels needed to be configured) and do a bunch of other stuff our parents mostly considered "black magic" (except for a few tech enthusiasts), and the general consensus was that every generation would know more and be "better" than the previous generation.
A few decades have passed... and kids can't type on a keyboard anymore, can't print, have no idea what can be changed in the settings on their smartphone, don't know how to block ads, can't cheat in games anymore (except via pay-to-win), and have no idea where to change their Instagram password.
So, now you have boomers, who can't use computers and kids, who can't use computers anymore.
The latter is a fairly small demo though - supposedly around a third.
The split is more by education than by age.
Kids can use computers - phones - as app appliances, but they don't understand computers.
Peak literacy is young Gen X and older millennials.
They certainly will at home.
> I suspect there is a gap in the technical literacy of lower income students, whose parents are less likely to have a computer at home.
In which country?
I live in Mexico and even here you really need to go to the poorest families to find a home without a laptop. Even those families have multiple smartphones. Today a smartphone is not a good replacement for a laptop but maybe in a couple of years it will be.
The following article suggests that in the United States, about 59% of lower income households have a laptop or desktop computer, compared to 92% of upper income households.
https://www.pewresearch.org/short-reads/2021/06/22/digital-d...
When I think back to using computers as a kid, both at school (starting in 1999) and at home, I don't think it's all that black and white w.r.t. just playing at home vs. learning useful skills at school.
At some point in the early 00s my underfunded elementary school acquired a bunch of old windows 95 computers. We would have classes where we mostly did basic touch typing, MS Office etc. At home, my middle class parents had also managed to find me some old outdated clunker. And yes, most of my time at home was spent playing games, chatting with friends on msn, pirating mp3s etc.
But I'd say I learned orders of magnitude more from my frivolous activities than from whatever we did at school. At home I was learning things like: online research (into Warcraft cheat codes or quest guides for RuneScape, etc.); software troubleshooting (having to reinstall Windows because I downloaded malware on LimeWire or otherwise borked my install somehow); fast typing (from chatting with friends about whatever 12-year-olds like to discuss — probably 90% of my typing practice back then came at home, not at school, and there was no touch typing; I could type 100+ wpm on just 4 fingers by the time I was in middle school, and never actually learned to properly touch type until I had to force myself into it 5 years ago due to RSI); English as a second language (from various forums, IRC, etc., hard to avoid back then). And I believe one of my first forays into programming was trying to get a cracked game with a broken .bat installer off TPB to work. I had a friend who got into it via Morrowind modding.
Actually, come to think of it, most of computer class was also in reality spent sneakily playing flash games and/or messing around with the computer settings just to screw with the next student/teacher to use it.
Even generalising beyond computers, I think a remarkable portion of the skills and interests that end up defining us as people, can be traced back to stuff we did trying to avoid boredom as children.
To summarise though, I do think computers have a place in school. But especially at an elementary school level, I think play should be a significant portion of their use, because play is how kids explore the world and themselves.
I ended up with a non-home-row style of touch-typing just by being forced to get chat messages out quickly in StarCraft multiplayer. So it's at least possible to learn it from that, even if most don't.
Is it possible that there are alternatives to handwriting for cognitive development?
Probably in 500 BC they said you had to hack at stone with a chisel for cognitive development, and then someone invented the pen and paper.
The difference is the task had to change as well. People were able to write thousands of pages (rather than a few stone blocks) over their education, and making full use of that ability in order to "keep the brain CPU close to 100%" was a necessary concurrent change in order to preserve cognitive development.
Thamus:
> "For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom, for they will read many things without instruction and will therefore seem to know many things, when they are for the most part ignorant and hard to get along with, since they are not wise, but only appear wise."
https://www.historyofinformation.com/detail.php?entryid=3894
You are forgetting that in 500 BC literacy rates were well under 10%. Nobody optimized for anyone’s cognitive development.
The only cognitive development people cared about was for the rich (aristocrats, royalty, some merchants, etc). Much of that happened orally through hands-on tutoring by an army of people specifically employed to create the next generation of leaders.
Anyone would thrive with that many resources thrown at them. And I’m pretty sure many of them considered reading and writing beneath them. They got people for that.
There are countless ways to develop fine motor skills, but handwriting replacing the chisel was not a step down, because handwriting is a demanding task, in contrast to the inherently impoverished interaction with digital rather than analog devices. I help in a makerspace and you can literally tell apart the young people who have only ever interacted with a phone from the kids who play an instrument, work with tools, etc.
Additionally, pen and paper come cheap compared to a tablet. It was always the perfect example of a kind of "digitalism": "oh, we're so cool, we use technology, let's give everyone tablets, we're a modern country". Just expensive nonsense in the absence of educational standards and physical development.
I wonder whether it has contributed to the evolution of smaller brains: https://www.bbc.com/future/article/20240517-the-human-brain-...
Memorizing a few hundred lines of epic poetry probably is indeed unusual. But I bet most people have more than a few hundred lines of poetry in the form of song lyrics memorized (along with the tune of the song).
A lot of cultures that emphasize oral memorization will have things like this as memory aids
For example, Buddhist mantras have a specific way of being pronounced, and if you always do it this way, memorizing becomes much easier
Or, for a more Western example, prayers also have a specific cadence that is learned alongside the text itself and that aids memorization
Update your mental model: except for the grand works, they used sticks on clay tablets, similar to writing
I do think that digital technology was introduced in a bad way in most schools. In my own experience it was less "digital technology education" and more "navigating the Microsoft Windows UI education". The teachers didn't know much about computers, so of course the result was mostly a waste of time. I think the first thing kids should be taught in computer class is touch typing.
I have very bad handwriting due to dysgraphia. I suffered a lot in the later years of school due to not being able to use Word/LaTeX to submit homework. Handwriting is not that important.
But what is extremely important is the ability to think with writing/drawing. Because in the end, paper is still the most free form of writing/drawing you can do, and it actually creates and reinforces an individual's own style. Computers, however good yours are, in the end force students into one style of exposition: that of the software you are using, whether Word or LaTeX.
Paper allows you, and forces you, to develop your own style of writing and organizing information, which is essentially what teaching is all about.
I think the k-shaped economy where some people are financially succeeding while the rest go through hardship is a reflection of a k-shaped education system where those who are able to ignore the distractions and succeed are doing well. The top of the k can use more edtech as they just need tools for further educational attainment. Things modern edtech can bring. The bottom of the k has different needs.
There is no way to do away with tech in school, but some balance and freedom must be achieved.
Why an "extreme" amount of freedom?
> There is no way to do away with tech in school, but some balance and freedom must be achieved.
Yes there is. Students were educated just a couple decades ago without it. We can easily return to that style.
As dangerous as this sounds, with guidance, I think it could be done. Government and public institutions love to lock the environment into something safe but useless for further learning and adaptability.
I am wondering what you mean by it and why you think it's needed.
Digital tech is here to stay...
I have several friends who work in education.
At one point there were computer labs in schools, and there was education around computing. The pervasiveness of computing killed these programs, along with various kinds of skills-based classes, like wood/auto/home economics (cooking and/or sewing).
All of them tend to agree that the loss of these programs is, in hindsight, problematic. Many of them think that a return to computer education (and conveying deeper insight) would be a net positive.
> EdTech
To a person, everyone I know thinks their EdTech platforms suck. One of them is in a support role as part of their job and often tells me stories of how lamentable the software, and faculty's interactions with it, are.
"Progress is at fault" is a tale as old as time: https://xkcd.com/1227/
Worth noting that all of those examples are adjacent to the industrial revolution. At least personally I don't know enough to have perspective on cottage production but I imagine daily life must have been quite different 1000 years ago.
In the context of general education I can understand the strategy, it could be a useful learning environment, but certainly not if it is about digital education, tech knowledge or general engineering. Nobody becomes an engineer in a prison, you need to give your users freedom.
Ironically, Gen Z was supposed to lead the way as "digital natives", but in many ways they are (speaking broadly) much less technically adept than, say, Gen Xers, because Gen Xers had to struggle to figure stuff out before it was all wrapped up with a bow, and thus got to understand how things worked at a deeper, more fundamental level.
I recall reading some articles about how many Gen Zers new to the workplace didn't even understand how file systems or directories worked, because things like iPads largely hide those details from the end user.
And to emphasize, I'm not dumping on Gen Z - they're, like everyone, just a product of the environment they grew up in. But I strongly disagree that getting access to an iPad makes anyone more technologically adept.
About as neat a trick as opening a slot machine, pulling out the mech and fixing a jam.
There is a massive qualitative difference between API knowledge and foundational knowledge. The former is tied to the usefulness of the platform, the person with a macbook or an iphone looks at you the same way you look at a person fixing their car or slot machine. I for one am sick of the gross fetishization nerds do of cheap knowledge.
The same thing that makes your knowledge useful (usable) is the same thing that makes it useless (negative utility). You likely can only change your PC parts because it's long been standardized and a whole industry has ossified around those standards. You've confused learning about computers with learning about a standard. Someone else would roll their eyes at your statement: "well duh, of course you can't take an IBM 360/40 to the shop"
For my own kid, I do limit screen time just because their eyes are still fragile before age 9, not because the above reason.
I asked an AI about the reasoning and the answer comes down to: kids need real-world interaction to support brain development. But if that's the case, aren't these two separate issues? Using a tablet doesn't damage your brain... it's just a low-value activity that fails to build the good skills (like video games?) that other activities do. It is not that screens make you dumber, it is that they crowd out the things that make you smarter.
Naturally, the kids should learn AI and AI workflows also. And personal AI assistants can probably help many kids in their studies. Learning AI should be its own subject but that should not ruin the way kids study other subjects where there are proven old ways to get to great results.
Source: I have 10 Finnish kids
Edit: FYI: an old (2018) link to an article about a finding about the matter: https://yle.fi/a/3-10514984 "Finland’s digital-based curriculum impedes learning, researcher finds"
No one needs training in prompting AI. I could understand if they meant a deeper layer of integrating tech with systems, but all they ever mean is typing things into a text box.
In other words, the aim is to get kids used to using AI as soon as possible, so that they do not learn the skills to function without depending on it.
I can see the angle for making sure kids start using it before they develop the skills to become independent of it.
I've been using AI for some legal issues, and it's been incredibly good at searching for case law and summarising the key implications of various statutes - much more efficient than web search, with direct links to the primary sources it finds.
I'm still the one gaming out "What if...?" and "Does that mean..?" scenarios and making sure the answers are grounded in the relevant statutes, and aren't mistakes or hallucinations.
It's not so much a prompting problem as a critical thinking and verbal reasoning problem.
Schools are slow, by the time the teachers get around to teaching the sophisticated techniques you use today, those techniques will be obsolete, the new AI models will require completely different style of prompts.
As for critical thinking and reasoning, those are even harder to teach. How can teachers teach what they don't know?
And that means you have to learn without AI to understand when the AI is wrong. This is just like how it's dangerous to use a calculator without knowing math, since you won't spot when you've entered things wrongly: if you know 4.9 × 19.8 should be roughly 100, you'll catch a mis-keyed 970.2.
My 6 year old kid who watches me is a better prompter.
http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-co...
It seems to me that if someone can read and think critically-- they can RTFM and get much better much quicker at computers and AI than people who spent all their time tapping an ipad to watch the next video.
It would take a few sessions at most to take someone from 10 years ago and get them fully up to speed with AI tools since they have zero learning curve.
Evaluating AI output is not a skill on its own. It’s just general critical thinking and literacy.
I think it comes easily to the sort of people who comment here. Most people have a very vague understanding of computers in general.
Kids are using crappy subscription education services for homework and doing all their reading on screens (and educators are toiling away to work with these systems) because the people who make money off the services and screens paid to have the incentives distorted such that buying their products is the least shitty option.
I've used them when studying new languages (human languages not programming languages) and ML algorithms and they've been really useful.
Learning to check the citations it gives you is a useful skill too. I wish many adults were more sceptical about the things they are told.
A bit like software development.
You're wasting effort and teaching an obsolete technology if you try to make primary/secondary education too topical. Students can learn how to decompose a task and how to think critically without ever touching a Large Language Model.
Addiction is a much harder problem than distraction.
This would be just the modern version of "Computer class" back in the day when we learned to use word, excel, etc. Just another tool among others that is helpful to learn but should be limited to that specific class.
Though the actually sad thing I've learned from friends with kids is that the modern "computer class" does not really teach kids to use computers much these days.
When it introduces Harvard vs. von Neumann architectures, it doesn't invent some dumb RISC computer to illustrate the difference... No... it makes you learn the actual von Neumann machine! Also Konrad Zuse's Z machine.
Cragon's argument is that students will not learn the concept of engineering trade-offs, if presented with a clean "textbook" architecture.
I hated MIX for various reasons, it's sort of in-between simple and kludgy.
[0] Cragon was professor at University of Texas Austin ca 1980. Also the architect of TI's ASC in the 1960s.
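For anyone curious what "the actual von Neumann machine" boils down to, a hedged toy sketch (not from Cragon's book; the opcodes are invented): a single memory holding both instructions and data, driven by a fetch-decode-execute loop.

    # toy von Neumann machine: one memory for both instructions and data
    memory = [
        ("LOAD", 7),    # acc <- mem[7]
        ("ADD", 8),     # acc <- acc + mem[8]
        ("STORE", 9),   # mem[9] <- acc
        ("HALT", 0),
        0, 0, 0,        # unused padding
        2, 3, 0,        # data at addresses 7, 8, 9
    ]

    pc, acc = 0, 0
    while True:
        op, addr = memory[pc]   # fetch
        pc += 1
        if op == "LOAD":        # decode + execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            break

    print(memory[9])  # 2 + 3 = 5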
Buddy, AI is here to stay. You remind me of my 2nd grade teacher who said 'we won't have calculators in our pockets'.
The best thing to do is to set the kids up to learn the most important thing - which is how to teach oneself. If a kid can read about something, and then understand what was important from the reading, and then write about it, and then know where to go next they will be well served in the AI world.
This is bad -- an F grade for the education system that let them slide by without learning an essential skill. The Chinese aren't this lazy. And if we persist in not learning this, America's future will regress to us asking them, "Do you want fries with that?"
For one thing you do not need to do much arithmetic to do algebra, for another estimating and getting a feel for numbers is not the same skill as learning a bunch of arithmetic techniques. No one is going to do long division while shopping.
I can keep enough digits in my working memory to do long division in the grocery aisle.
I also compulsively factor numbers on license plates..
In proper countries the price per kg is displayed under the price
Eventually everything that can be learned from a book will be done much better by machines, so for humans to have any chance of being employable they'll need to develop the soft skill of working with intelligent machines.
You will not do maths casually until you have memorized enough multiplication to make it not torture. You will not pick up multiplication from using a calculator any more than you will pick up programming from using a computer.
> native speakers of a language will conjugate correctly without memorising
They do not. They have memorized, through massive, constant, and forced practice, and now they conjugate correctly. The alternative of consulting a computer every time they need to speak is not a realistic one.
Sure you will, at least assuming we're still talking about memorizing multiplication tables here and not how to do long division or the like. I don't think algebra or even basic calculus has any convincing need to involve rote memorization.
I've ended up unintentionally memorizing many things due to frequently needing to consult various lookup tables.
> conjugation
Competent ones will. Wrong conjugations usually "sound" wrong to me even when I haven't seen them before and that's in English of all things.
Doing maths is not torture, even if you do not know multiplication tables, as long as you have a calculator.
Native speakers of a language do not memorise conjugations through forced practice, they memorise through hearing them repeatedly from others.
The entire point of AI is to accommodate the user. AI doesn't do anything that people can't do, is worse at most of those things, but is a lot faster at some of them (basically looking up things.) The point of AI is natural language UI.
Teaching people how to use AI is just teaching people enough about the world to give them something to ask AI for.
Are you sure it isn't both? Learning how to bypass the school's internet filtering so that I could get to flash games and reddit probably taught me more than anything in the lessons.
My contention is that it's feasible to use laptops in classrooms productively, especially considering the value in applications like word processors. Of course it's necessary to balance the educational value with the potential for distraction. A way to minimize the latter is to extend classroom management to address device use, e.g., instilling discipline. I've personally seen it done well and done poorly (often not attempted at all), and given an otherwise healthy classroom setting, it comes down to discipline and ethics that address device use. That comes after tailoring the specific device format (e.g., tablets lending themselves more to entertainment, socially and habitually) to the appropriate grade level (maturity, responsibility, and technical potential increasing with age).
Some classrooms are too disruptive for device use, but that's not inherently a tech problem, even if you blame disruptive classrooms on broader cultural problems stemming from tech's role in society. Other classrooms exist in cultures that reject the necessary classroom management strategies.
It's not my contention that any device format should be used at any grade level and that distractions can be managed by simply saying "don't" and expecting success.
To address your other point above, yes, reading a book is different, often better, than reading on a screen, even for adults, so I'm also not arguing that devices should replace books.
I am somewhat involved in this field and have yet to see an actual paradigm shift anywhere in Europe. Going back to books just means we will continue using old methods, because those same old methods, moved onto screens, didn't bring the improvements we thought they would when we labeled them digitalisation.
I suspect the people I see saying they were able to not get distracted when using a laptop in class are either outliers or liars.
https://www.regjeringen.no/no/aktuelt/endrer-skolehverdagen-... [link in Norwegian, no English source available]
No meta-analysis done on this topic could conclude anything beyond the digital medium being a bit more efficient in reading speed. And when comparing one medium to the other, these studies do not account for the plethora of ways a digital medium can expand knowledge (videos, gifs, images, interactive visualizations, and so on).
People reading on screens take longer.
Feis A, Lallensack A, Pallante E, Nielsen M, Demarco N, Vasudevan B. Reading Eye Movements Performance on iPad vs Print Using a Visagraph. J Eye Mov Res. 2021 Sep 14
https://pmc.ncbi.nlm.nih.gov/articles/PMC8557948/?utm_source...
Another
https://users.soe.ucsc.edu/~srikur/files/HCII_reading.pdf?ut...
Tangential: One study finds few significant effects of disruptions on on-screen reading alone (no printed books): https://www.frontiersin.org/journals/psychology/articles/10....
Cited in Card Catalog, Hana Goldin, "What scrolling did to reading", here:
https://open.substack.com/pub/cardcatalogforlife/p/what-scro...
I suggest you do some reading, especially of the effect sizes found in many studies showing "better performance" (minuscule effect sizes). There's a plethora of political things you'll ignore by thinking books are better. I gathered you some stuff (more than 200,000 people were studied in the links I'm sending you), and I truly hope you don't try to counter-argue by pointing to some meta-analysis I linked that concludes analog is better. They admit the effect is minimal to negligible, and if you actually consider studies done on text where the user doesn't have to "scroll" but rather advances the page with a tap/PgDn, and the user doesn't have their social media hooked on their device (muted or absent), there is literally NO EVIDENCE of any difference between paper and digital learning
[0] https://psycnet.apa.org/record/2024-16892-001
[1] https://futureofreading.eu/wp-content/uploads/2016/03/1-s2.0...
[2] https://pmc.ncbi.nlm.nih.gov/articles/PMC10606230/
[3] https://link.springer.com/article/10.1007/s10639-025-13843-8
[4] https://www.sciencedirect.com/science/article/pii/S277250302...
Any scientific backing that screens are at fault? I don't think so. E-ink tablets do exist. When I have children, I'm buying them a reMarkable with all their books scanned. Sure, they'll still need physical sheets of paper and a pen, but they won't have to carry 2-3 kg of literature.
The major reason against digital literature is that it's free: book authors wouldn't get paid and books wouldn't get sold (Wikipedia / OpenStax / pirated books). Money. It's always been about money.
If it were a physical book, there is a good chance I would remember a place for the equation: say, about 3/4 of the way through the book, near the top of a left-hand page, with the right-hand page showing a graph of solutions of the equation.
I can then quickly riffle through the book to find that spot.
With an ebook I get much less sense of place. If whatever I read on has consistent paging, so I've always seen that equation at the same place on the screen, I might remember that, but not much more. If the "search" function in the reader software isn't good enough to find the equation, and I can't think of enough of the prose that was around it to search for instead, it is going to be harder to find than in the physical book.
I think the physical book is activating the same mental faculties that are used to organize memories of a physical journey through a space. An ebook doesn't do that for me.
Lots to think about there.
https://open.substack.com/pub/cardcatalogforlife/p/what-scro...
The main problem mentioned in the article you link to seems to be distraction from what they were supposed to be doing.
Distraction is not always bad, and kids can learn a lot by being distracted by something that catches their interest. It depends on the approach, and it's more of a problem when following a fixed curriculum in a classroom. Probably more of a problem for uninterested or younger children.
I think video can be a big problem, particularly given the tendency of sites to try to keep you there.
An allowlist might be a good place to start.
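A minimal sketch of the allowlist idea in Python — the domains and the matching policy are assumptions; a real deployment would enforce this at the network/proxy level:

    from urllib.parse import urlparse

    ALLOWED = {"wikipedia.org", "khanacademy.org"}   # hypothetical approved domains

    def is_allowed(url: str) -> bool:
        host = urlparse(url).hostname or ""
        # allow exact matches and subdomains of approved domains
        return any(host == d or host.endswith("." + d) for d in ALLOWED)

    print(is_allowed("https://en.wikipedia.org/wiki/Leitner_system"))  # True
    print(is_allowed("https://example-games.com/play"))                # False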
Could any of the IT professionals here think of a way around it? Then likely the kids could too.
It's like the social media ban for children. If you stop and think about it, there is nothing special about children; it's terrible for everyone.
If such a basic distraction in a digital device isn't fixed, it means the experiment wasn't even tried!
Same in neighbouring Norway. Hi, neighbour! :)
Steve Jobs popularized among startup/HN audiences that intuitive interfaces are the best interfaces, but I have yet to see evidence for that. Are systems that work without education truly better than those where you might be a lot more effective with some amount of study? Probably there's an optimal point, like a year of full-time education to use your OS won't pay off at median longevity, but whether <5 minutes of learning is the optimum...
It's great for onboarding people to your tech product, though. Incentives may be misaligned between what's best for the users and what's best for the cashflow
What? Why? And why "naturally" as if this is an entirely uncontroversial statement?
Wait what?
I bet Zuckerberg doesn't allow his children to use social media.
And I assume that Sam Altman won't allow his children to use AI chatbots.
What does that tell us?
Jobs was literally just parenting. Limiting screen time is something all parents should do. We also limit access to sugary foods and other things that can be damaging in excess. Calling tech executives hypocrites for having common sense parenting limits is not really a dunk.
He was talking about a future he was aiming for. I know it's hard to remember the tech optimism we still had heading into 2010, but most people still viewed things as getting better at that time. When Jobs announced the iPad, the iPhone had been on the market for 2.5 years and we basically only saw the conveniences of how cool it was to be able to check Facebook on the go with a cool futuristic touchscreen experience.
It's really easy to see how misguided Jobs was with 15 years of hindsight.
Maybe you do, but not everybody does. 19.7% of American kids are obese. The hypocrisy is that tech executives promote and lobby for excessive use of their products (even manufacturing addiction), but know better for their kids.
lotta folks here with FAANG pedigrees...
There's absolutely nothing insightful about CEOs with "unprecedented insights" coming to the same conclusions as everyone else.
Yeah, something tells me we shouldn't be taking advice regarding children from this man.
It doesn't forgive them for lobbying ferociously against any regulation of marketing to children.
Yes, tech companies are liable for pushing this technology that they know to be addictive.
There is no apologist revisionist history for billionaires that are actively making the world a worse place. People act like Jobs was some kind of hero. Dude was a snake. Made some damn good products, but you don't achieve that level of wealth by being a kind person.
Assuming this were to be the case, one would need to explain why this doesn't happen to men.
> Among men, the prevalence of obesity was lower in both the lowest (31.5%) and highest (32.6%) income groups compared with the middle-income group (38.5%).
And among women, one would need to explain why it doesn't happen to Black women.
> Among non-Hispanic black women, there was no difference in obesity prevalence among the income groups.
It also needs to explain why no statistically significant result happens for Asian women
> Among women, prevalence was lower in the highest income group (29.7%) than in the middle (42.9%) and lowest (45.2%) income groups. This pattern was observed among non-Hispanic white, non-Hispanic Asian, and Hispanic women, but it was only significant for white women.
Without looking deeper into the issue, the natural thing the income vs. obesity thing overall shows is a population blend issue (Simpson's paradox). It gets too tortured otherwise: yeah, Black women always have inconvenience, Asian women mostly don't have more convenient lives as they become richer, and White women get massively more convenient lives as they get wealthier. Men until 2008 got less convenient lives as they got wealthier and then their lives got neither more convenient nor less convenient but stayed the same.
That's a pretty rough number of epicycles to stick into this convenience angle.
I'm sure almost no family has an upper limit on book time.
Thus aiming for screens to replace books is a bad aim.
Why have social media when you can have Jeeves "do it" for you?
> What does that tell us?
It tells us three things:
1. Do not give a child access to iPads, social media or ChatGPT until they are old enough and are aware of their addictive nature.
2. Get them to read books as an alternative.
3. Being unable to restrict access to iPhones, ChatGPT to a child is a parenting skill issue and not the responsibility of a government to impose global parental controls on everyone for the purpose of surveillance.
Your kid will be the odd one out, missing some shared culture, left out of conversation or meetups they arrange in IM, etc.
The government should absolutely forbid social media and addictive games to kids under 16, otherwise it’s very hard as a parent to escape these little addiction machines and you can only try to limit damage.
Of course, we have to find a way that is not damaging privacy at the same time.
(If you don’t have kids or have kids that are under ~10, you do probably not know what the pressure is like… yet.)
Missing out on social interactions weighs heavily on kids too.
Making everything harder is that even primary schools sometimes allow kids to play Roblox or use ChatGPT. For parents it's an uphill battle if even their role models think it's fine to play addictive games or make TikTok videos. We picked plenty of battles over not allowing videos of our kid to be uploaded to YouTube/Facebook, etc. Luckily there are consent forms now, but you have to be constantly vigilant, because sometimes the consent forms are ignored or you get e-mails saying 'if you object, react by the end of the day'. If they play at friends' houses, they typically have access to the same games as well. Do you also want to say 'no' to playing at other kids' homes?
It has been shown scientifically that social media, certain games, etc. are bad and nearly as addictive as heroin. Maybe it's time to make a law to forbid use by kids, just like we have laws that you cannot sell alcohol, drugs, or cigarettes to kids?
And again, we should find a privacy-preserving way to do it.
Edit: a better example would be cigarettes, since that's something we as a society recognize is bad for kids and generally require proof of age for if there is any doubt. Imagine if all your kid's friends smoked, and there were cigarette vending machines at school, and all you could do was say "no."
Consider Lee de Forest, one of the early pioneers of radio. He expected radio to act almost like a moral and intellectual uplifter for society. He thought people would use it to essentially listen to religious sermons and educational lectures.
The Internet allows you to get every classical work of philosophy or theology online immediately, both in the original language and in translation. You can find videos discussing many of them in depth. Someone in Nepal with an Internet connection can get an education that would rival the best universities of the 1800s, if they want.
Or you can watch cat videos.
LLMs also do quite well at "decoding" the obscure language of these classic works and rephrasing it in more contemporary terms. Even a small local LLM will typically do a good enough job of this, though more world knowledge (with a bigger model) is always preferable.
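A hedged sketch of that "decoding" workflow, assuming a local OpenAI-compatible server (llama.cpp and similar tools expose one); the URL, model name, and passage are placeholders, not a recommendation:

    import requests

    passage = "The unexamined life is not worth living."  # stand-in excerpt

    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",  # assumed local endpoint
        json={
            "model": "local-model",  # placeholder name
            "messages": [{
                "role": "user",
                "content": "Rephrase this classic passage in plain contemporary "
                           "English, keeping the argument intact:\n" + passage,
            }],
        },
        timeout=120,
    )
    print(resp.json()["choices"][0]["message"]["content"])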
I'm close-reading Aristotle in a Meetup group where we compare many translations and indulge the controversies in translating the Greek.
When I've tried to get LLMs to bear on a topic, they can't even relate to the concept I'm looking at, instead generating a summary of the easiest parts. LLM is basically a beginner student.
I doubt that, but the others seem reasonable
The ones a year from now from all companies will likely be better than the best today.
An offline iPad with a limited set of educational apps/books would be a good classroom aid
Of course, an iPad without those limits is bad
The biggest problem is you get conditioned to instant and constant dopamine hits, which works directly against a lot of the things one is supposed to learn in school.
Kids learn the A-Z at record speed in 1st grade. But they don't learn to concentrate, or that learning things can sometimes be challenging, or the value of perseverance and that understanding eventually comes.
So in later grades they pay for learning the A-Z too fast through the iPad. Because they didn't learn how to learn.
The net effect in Norwegian classrooms over past 5 years of iPad education seems to be negative and it is not about what kids are exposed to. It is about not learning to concentrate.
> In their book, ‘Screen Schooled: Two Veteran Teachers Expose How Technology Overuse is Making Our Kids Dumber,’ educators Joe Clement and Matt Miles write: “It’s interesting to think that in a modern public school, where kids are being required to use electronic devices like iPads, Steve Jobs’s kids would be some of the only kids opted out.”
"The Battle for Your Kids' Hearts and Minds" https://kidzu.co/parent-perspective/the-battle-for-your-kids...
Now it’s just an absolute cesspit of paid content, ads and boomers posting in groups.
I don’t even think it’s appropriate to call it social media anymore. It’s barely social.
Not a single friend of mine posts anything on there.
Almost all my friends have stopped posting. The only social thing I see from most people is wishing people happy birthday.
I’m not even arguing with you. I’m just disappointed in how quickly so many on HN throw out all pretense of being interested in data as soon as a personal hot button issue comes up. It’s human nature I guess, but still depressing.
Or do you imagine that there's a study out there that will reveal that arguing on Twitter with someone called Catturd2 is good for your mental health?
Data is the map, not the terrain. It can explain some of the quantifiable world, not all of it. Common sense can also fill some of the gaps, some of the time. And there remains plenty still that's too entropic for our grasp. Waiting for data to speak is not always the best move. Heck, it might even sometimes be the worst. It seems this is a lesson we collectively keep forgetting over and over, despite the endless list of data-backed "facts" that, in hindsight, it turns out we were wrong or short-sighted about. Apparently, that too is human nature.
It is perfectly rational to rely on experience for what screens do to children when that's all we have. You operate on that standard all the time. I know that, because you have no choice. There are plenty of choices you must make without data to back you up.
Moreover, there is plenty of data on this topic and if there is any study out there that even remotely supports the idea that it's all just hunky-dory for kids to be exposed to arbitrary amounts of "screen time" and parents are just silly for being worried about what it may be doing to their children, I sure haven't seen it go by. (I don't love the vagueness of the term "screen time" but for this discussion it'll do... anyone who wants to complain about it in a reply be my guest but be aware I don't really like it either.)
"Politicians" didn't even begin to enter into my decisions and I doubt it did for very many people either. This is one of the cases where the politicians are just jumping in front of an existing parade and claiming to be the leaders. But they aren't, and the parade isn't following them.
Data beats vibes, even when vibes are qualitatively correct. I’m surprised this is surprising.
1. https://journalistsresource.org/health/child-access-preventi...
"The product is disgusting, but there's nothing I can do; I'm only the CEO"
More to the point - if the CEO of DogFoodCo won't let his own family pets eat any of his company's flagship products, then maybe smart dog owners should follow his example?
In Zuck's case especially, in order to use what we know about childhood development and education to get kids addicted early.
I’d be super interested in the panels of experts that Jobs, Zuckerberg, and Altman (assuming GGP’s “assumption” is correct) convened when making these decisions.
Absent that, this isn’t any more persuasive than saying that Coca Cola is good for infants because I assume Coke execs feed it to theirs.
Even ignoring my point, these people have more insight than anyone into their own products and their harmful/beneficial nature.
I am saying that tech execs have no special knowledge, and their actions should not be used to inform one’s own opinions or social policy on the topic.
There IS tons of data in this area. Please, do yourself a favor and read it (pay attention to the populations of the studies -- many use adults in their 30s or older as proxies for children).
You can absolutely find real data supporting your position. And it will be more persuasive (albeit less dramatic) than imagining what Altman probably does.
It tells us almost nothing except the unimportant and irrelevant part - how a few individuals choose to raise their kids
This is largely an American phenomenon. If you visit some other countries, students don't walk around all day saddled with what look like Medieval tomes in backpacks that come comically close to dwarfing the student. There is no reason for them to be so thick, so heavy, so expensive, hardcover, or even loaned. And there is no reason to lug them around all day either.
Frankly, teachers should be relying more on delivering material in class without a textbook.
That tells us more about you than about tech CEOs.
Jobs was a product guy who had an intricate understanding of the relationship between people and technology. The others are just finance bros dressed up in tech clothes.
On another totally unrelated note, this guy [1] who is not at all connected to the Epstein class whatsoever (he is) and is only an advisor to the leader of some small little organization called the World Economic Forum says you and your children should be kept “happy” with drugs and video games.
Skip to the very end for the statement, or listen to the whole little clip to hear how the demigods think about you and your “worthless” children.
So the kids will continue to be harmed. EdTech will get money because this time they will do it right. AI will lead to a new thoughtless generation.
I had never even realized.
As a bonus, I now also see cranks proposing to raise other people's children in some kind of sweatshop, calling it education and schools. As if that was ever the goal.
I'm so lucky I didn't have this in the classroom.
Digital doodling should be possible; I know I've used the zoom annotation feature to doodle during meetings.
I've advised college students to leave their laptops in their dorm room. Take a spiral notebook to lecture, and a couple pens. Write down everything the professor writes on the chalkboard.
When studying, going over the notes, you'll hear the lecture again in your head.
Of course, if the professor doesn't use a chalkboard, and does a slide presentation instead, that will make studying harder for you.
The best presentation I ever gave was when the presenter didn't show up, and the conference asked for volunteers. I volunteered and gave an impromptu presentation using markers and the big whiteboard. The back-and-forth with the audience was very productive!
Most conferences have no way to do this. I tried using an overhead projector and markers, but the conference people thought I was crazy. There was just too much expectation of a packaged slide presentation.
I'm a gen Z college student (born 2005) and this is pretty much what I do all the time. I attend most of the lectures, even if they are not very good and I take notes with a pen during each of them. Some people at my school see it as being subservient to the professors and wasting time, but for me it's about rebuilding my attention span, confidence and clarity of mind (besides internalizing the material far better). It made me a different person compared to who I was in hs and freshman year. It's so simple. Most of us don't have ADHD, we're simply addicted and we can fight it.
We did almost everything on paper, even exams. I admit writing MIPS assembly on paper seemed strange to me at the time, but the effort of putting things down in black and white somehow made the knowledge stick in my mind more effectively. Some of that knowledge will stay with me forever, and I'm not sure the same could be said if I had taken "shortcuts".
That is a) a BS claim and b) wouldn't be a feature, on average, given the quality of college lectures.
It seems fairly clear that manual note-taking helps with learning compared to using a computer, but overblown claims like this do more damage than good in convincing people to do the right thing.
I'm absolutely supportive of using blackboards & paper and pen over computers.
What I'm saying is that making claims like "you'll hear the lecture again when you read it" is completely detrimental to making that point, because it's entirely unsubstantiated and doesn't make any relevant point you couldn't make in a well-supported way instead.
Handwritten notes help remind you, when studying, of what the instructor thought important. You write down the emphasis. I sure made it clear to my classes that my emphasis was on this, this and that. The instructor is writing and grading the tests. Or was, back in the ancient times when we made chalk dust as a pedagogical tool.
It worked for me. Have you tried it?
> given the quality of college lectures
I attended a university where that wasn't a problem. Prof David Goodstein, for example, turned his lectures into a video series, "The Mechanical Universe". But, frankly, I liked his in-person lectures using the blackboard and chalk better.
I have. It doesn't work that way for me - but that hardly matters. More importantly, there's plenty of research around inner monologue and sensory replay also pointing out that this isn't true for most people.
> I attended a university where that wasn't a problem.
You were blessed :)
> But, frankly, I liked his in-person lectures using the blackboard and chalk better.
That's not something I'm arguing against :) I think they're a great teaching tool.
My objection is that making a plausible-sounding but unsupported, and often easily anecdotally invalidated, point to support a case that actually has a ton of solid evidence behind it is detrimental to making that case.
Physical note-taking, and being actually present for a lecture, are tremendously important. Laptops are hugely problematic for learning. And those points are important enough that we should make solid arguments in favor of them, not easily discredited ones. Because we also know that many students are very much biased toward discarding these points given half a chance.
There's currently a related post on the home page: "Good ideas do not need lots of lies in order to gain public acceptance (2008)"
> You were blessed
Indeed, but not totally. I took an economics course, and soon discovered that the prof was a Marxist. Since Marxism is a fantasy, I saw no more value in it than listening to a lecture on astrology.
> many students are very much biased toward discarding these points given half a chance
I have little interest in helping people that don't want to make an effort.
What you were trying to say is "I can't be bothered to engage with people who are not taking my word as gospel".
[1] see https://news.ycombinator.com/newsguidelines.html#generated and https://news.ycombinator.com/item?id=47340079
[2] https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...
Btw, when you damaged a book beyond repair, you needed to pay the full price. Only the exercise books needed to be bought new, as they were fully "used up" after the year. Still, they were often seen as optional.
It's absolutely insane.
Maybe in a post-Soviet country they did. In my school they shredded them so the next class had to buy a new set.
In Slovenia, a post-Yugoslavian country, the school library coordinated a textbook borrowing scheme, where they would own all the material and lend it to students each year. Parents would pay a small "subscription", so each year or two one subject would get new books.
America dropped the "phoneme, sound it out" approach of decades past and taught students to recognise word "shapes" and learn what they said. It was a complete and total failure. Children did not learn to read.
What never comes up in the news is the absolutely crazy approach to rolling this out... "Amazing new education idea is here! We're all doing it!"... 8 years later, school graduates fail at X, Y, Z BECAUSE of that amazing idea.
There's no accountability, no recompense. Just a news article saying "5% graduate this year! Oh no! Education is terrible!", papering over the accountability.
For some things there just is no easy way.
Basically. It probably wouldn't require a 20-year experiment. Looking at whole words vs phonics as an example, you'd get a handful of schools to participate and they'd try phonics in one class and whole word in the other. By the time the kids were in 2nd grade, the fact that whole-word learning wasn't working and that a higher rate of kids needed remedial lessons to catch up would have been obvious. And if it had worked really well, you'd expect to see that performance improvement in reading by 2nd grade too!
So the experiment would take 3 years. Though then you'd probably want a larger-scale experiment. I'd think if things were going well once kindergarten finished, you could probably start involving more schools in the experiment the next year. So like 3-6 years altogether.
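To put toy numbers on that (purely illustrative - the 20% vs 35% remedial rates and 200 kids per arm are made-up assumptions, not data from any real trial), here's a minimal sketch in Python of the 2nd-grade comparison:

    import math, random
    random.seed(0)

    def remedial_count(n_kids, p_remedial):
        # Simulate one arm: each kid independently ends up needing
        # remedial reading help with probability p_remedial.
        return sum(random.random() < p_remedial for _ in range(n_kids))

    n = 200                                # kids per arm (assumed)
    phonics = remedial_count(n, 0.20)      # assumed rate under phonics
    whole_word = remedial_count(n, 0.35)   # assumed rate under whole-word

    # Two-proportion z-test on the difference in remedial rates
    p1, p2 = phonics / n, whole_word / n
    p_pool = (phonics + whole_word) / (2 * n)
    se = math.sqrt(p_pool * (1 - p_pool) * (2 / n))
    print(f"phonics {p1:.0%}, whole-word {p2:.0%}, z = {(p2 - p1) / se:.1f}")

With a gap that size, z comes out around 3, comfortably past the usual ~2 significance bar - i.e. detectable by the end of 2nd grade with only a couple hundred kids per arm, which is why the 3-year version of the experiment seems plausible.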
We have been successfully educating kids for a long time; if we want to mix things up with some fancy new pedagogy we should absolutely be studying if it actually works before rolling it out at scale!
That is essentially what you're arguing. Perhaps not what you intended, but that's what it becomes given the context and, more importantly, the people involved, whom you disregard so callously.
If it's so darned expensive to do, have you considered that you have the free will and intellectual sophistication required to just... not do it? If it would be so expensive to remediate a group of people, then either your method has too high a probability of requiring that, or your method just isn't ready yet if the potential end results are that disastrously bad. Either way, it points towards going back to the drawing board instead of going to town.
But if it's oh so difficult to get these studies done, you know what you can do? You can do them over longer periods of time, just like you bemoan, because that larger time scale will stop you from ruining other people out of your own curiosity about whether X will work in Y. You could give people the choice to join the study, you could have smaller cohorts every time and refine the process as you go, you could keep each cohort limited to a year or two to avoid long-term damage, and you could test in different age ranges to get more data.
The list goes on and on. Almost like studies on people require greater caution than just testing to see what works without any precautions and going from there. When learning about the scientific method, the idea that people are, you know, people and not test subjects is pushed constantly. Because certain people sadly need that reinforced to avoid becoming callous researchers. It's oh so easy to forget that the numbers you toy with are real lives with real value, regardless of what is done with those lives.
We trade immediate results and dubiously better efficiency for larger time spans exactly so that we can ensure the people in them remain protected. Giving people choice in the matter, and letting guardians weigh the value proposition (like other studies have done successfully) by giving them the prerequisite information required to make those decisions, allows for a higher likelihood of avoiding disastrous effects on those very same people. It's not "generational inconvenience" when lives are affected for multitudes of years; it's callous impatience. It's not "no change ever"; it's respect for the people involved in attempting those changes and respect for the potential ramifications of those changes. It's borderline evil to disregard people because you, and I do mean you here, don't have the patience to ensure people's safety because, oh no, it'll take a while, or cost a lot if you're held accountable.
Rather, it's okay that things take time; it's desirable that we don't make haste. Because haste makes waste. Because we don't need immediate results. Because we're not working with machines; we're working with the single most valuable thing we have on this earth: a human life. Have some compassion for those people, and you'll find that change doesn't take so long after all.
Mostly it's a question of middle ground for an acceptable scale of decision, but "only change something if we know for a fact it's purely beneficial" is not a realistic plan no matter how intensely important the matter is. At some point decisions have to be made.
This is one of the things that becomes harder and more entrenched the less democratically legitimated those decisions are. I think it's not unlikely that the difference in expectations between us boils down to a generally different level of trust in authority.
When I got to college a few years later I’d sit in the back of classrooms and see that a majority of students who’d brought a laptop (ostensibly for notes) were consistently distracted and doing something else, be it games or StumbleUpon. I can only imagine these decisions were made by groups of adults sitting around conference rooms, each staring at their own laptop and paying 20% attention to the meeting at hand.
BUT one of my kids has Asperger's and it is extremely hard for him to muster up the energy to do something "boring". So gamified learning on an iPad works very well for him. Also, doing math on an iPad where he sees only one equation at a time, instead of full pages of equations to solve, makes it much easier for him to get started.
With these kids you learn not to focus on parenting/teaching principles and instead to focus on the goals. I'll do whatever it takes to get him to go to school and learn, no matter if I have to drive him the 700 m to school while he watches YouTube, or have him do math on an iPad.
So as long as the push for more analogue tools is just a general direction that still allows individual exceptions, I'm all for it.
Sadly today’s Swedish government seems more focused on being seen as hard on kids, crime, immigrants etc (basically everything except environmental protections) than actually following scientific principles.
iPads are not the solution - that gets you back to screen/computer mode in the classroom.
E-readers/e-paper tablets might be worth a try. (Just please don't make every child have a mandatory Amazon account to link with their school Kindle.) It would be interesting to know whether the "books + hand notes > screens + typing" comprehension studies have anything to say about e-paper (I don't think this has been studied yet).
My own experience, even to this day, is that it's easier for me to learn a new language or technology from a book compared to on a screen, even if the digital version lets me work on actual code: if I can, I first read the book and take notes, then I do the online version.
I went to school a million years ago, but IIRC we kept our textbooks in the classroom until middle school (7th grade for me). Maybe one textbook might go home with math homework or an English project. For my kid, they would usually just send worksheets home, which is ok, but if you wanted to reference something not on the sheet, too bad. Post-covid, there's a lot more dependence on Google Classroom, with all that comes with it (but maybe that's also how the upper grades were working anyway).
E-readers with textbooks loaded could work, but hopefully the textbooks are tuned for the medium.
Anyway, isn't a heavy backpack a secret fitness program???
Like, maybe download Wikipedia onto the device but don't give internet access. Let the device sync at school with required books and assignments.
Effectively, you could give kids a pocket library but that's the extent of what they should have.
It's essentially a notebook and a book reader.
You can take notes directly on the book if you use PDF (EPUBs can only have notes on the side).
I think that's the tech I want to see in school, no tablets please.
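If anyone wants to picture that sync step, here's a minimal sketch in Python (everything here - the LAN hostname, the endpoints, the manifest format - is hypothetical, not any real product's API). The idea is the whitelist: the device has no general internet access and only talks to the school's local server for its assigned books:

    import json, os, urllib.request

    SCHOOL_SERVER = "http://library.school.lan"  # hypothetical LAN-only host

    def sync_assignments(student_id, library_dir="books"):
        # Pull the per-student manifest of assigned books/assignments.
        with urllib.request.urlopen(f"{SCHOOL_SERVER}/manifest/{student_id}") as r:
            manifest = json.load(r)
        os.makedirs(library_dir, exist_ok=True)
        for book in manifest["books"]:
            # Download each assigned file; nothing outside the manifest
            # can ever reach the device.
            dest = os.path.join(library_dir, book["filename"])
            with urllib.request.urlopen(f"{SCHOOL_SERVER}/files/{book['id']}") as r:
                with open(dest, "wb") as f:
                    f.write(r.read())

Anything not in the manifest never reaches the device, so the "pocket library" stays a library instead of becoming another open-ended screen.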