This is a good article, but it only touches on the main issue with learning styles. Multiple independent groups of educational psychologists (if I remember correctly it was three) reviewed all the articles they could find on learning styles. They found that the majority of the studies had poor experimental design, since they were missing proper control groups. Of the remaining studies there were more examples of negative results for learning styles than positive results. This was pretty damning, given the publication bias for positive results in academic research. The article links to one of these reviews, but here is another one:
https://www.psychologicalscience.org/journals/pspi/PSPI_9_3....
I teach science, and it's frustrating that nearly all of my students can tell me their learning style. Their previous teachers have taught them about learning styles as a way to improve their studying, and they've jumped on that information. But the literature indicates that matching instruction to a learning style doesn't improve learning. This is a total waste because we're effectively giving students a magic talisman to help them learn science, because once students find out it's wrong they'll be less likely to take our advice on studying, and because the time we waste on it could be spent on telling people about real things that actually improve learning.
The studies I have seen (I'll admit I didn't read TFA; maybe I have in the past, I don't know) all used some sort of survey to identify learning styles. Nobody, to my knowledge, has actually measured learning across different modalities to identify students' styles.
You are spot on that attempts to match instruction to learning style have been futile under VARK-like learning style classifications. Those classifications also assume that if a student has a learning style, it will be the same across subjects. If it turns out there is such a thing as learning styles, I wouldn't be at all surprised if it varied by what you are learning.
Just a thought: even if it doesn't improve learning outcomes, that doesn't mean it doesn't improve the learning experience. There might be a qualitative aspect missing from dismissals of learning styles, even if learning styles only improve the experience as a placebo or an illusion of self-control.
Community colleges like the one I work at can't afford the journal fees. One of my assignments involves reading several peer-reviewed articles on a narrow topic and writing an explanation of the research. Every semester I have students that struggle with finishing this assignment because the articles they need are inaccessible.
If the barrier to accessibility is such a problem, why perpetuate it by throwing your students against that wall?
The journals' business model does suck, but you're delivering students to a paywalled garden to retrieve information on what amounts to a proprietary topic. You have the power to change this by redirecting the students' attention elsewhere, either by loosening requirements or changing topics to one with more accessible research...
They choose the topic. I'm fine with anything in biology, medicine, chemistry, and even some social science topics. Some of them run into trouble and others don't depending on what they choose.
I've considered removing the assignment a few times, but there's no skill more essential to success in science than the ability to synthesize different research articles into a coherent whole. I wouldn't feel like I was doing my job if I stopped assigning it.
I teach community college biology, and I agree that we're really bad at teaching critical thinking. But the Collegiate Learning Assessment (CLA) cited by this article was graded by a computer last time I checked. Here's a direct quote from one of their papers a few years ago:
"Beginning in fall 2010, we moved to automated scoring exclusively, using Pearson’s Intelligent Essay Assessor (IEA). IEA is the automated scoring engine developed by Pearson Knowledge Technologies to evaluate the meaning of text, not just writing mechanics. Pearson has trained IEA for the CLA using real CLA responses and scores to ensure its consistency with scores generated by human raters."
Most of you are more knowledgeable about technology than I am. So I'll leave it to you to decide if using an algorithm to grade an essay-based exam of critical thinking is a valid approach to this problem.
Leave it to Pearson to sell us the problem and then sell us the solution. Taking poetic license to exaggerate just a bit...
The problem: High school kids now spend 100% of their time studying prepared curricula, sold by Pearson, to study for standardized tests, sold by Pearson.
The result: Students lose critical thinking skills.
The solution: A standardized test for critical thinking skills, sold by Pearson, and of course a prepared curriculum to study for the test.
As a computational linguistics grad student I find Pearson's "product" line completely mind-boggling, and their peddling of it deserving of a giant class-action lawsuit. Consider that the state of the art in machine representation of words is something around Word2Vec or GloVe, and that we have some okay dependency parsers. That their system provides scores consistent with human raters is likely just evidence that they have a coarse-grained and noisy human evaluation system.
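To illustrate the point, here's a toy sketch (my own illustration, not Pearson's actual system; the vocabulary and 3-d "vectors" are made up) of how a crude embedding-based essay scorer can "agree" with human raters while understanding almost nothing: represent each essay as the average of its word vectors and score it by cosine similarity to a high-scoring reference essay.

```python
import math

# Hypothetical 3-d "word vectors" standing in for Word2Vec/GloVe embeddings.
VECTORS = {
    "photosynthesis": (0.9, 0.1, 0.0),
    "plants":         (0.8, 0.2, 0.1),
    "energy":         (0.7, 0.3, 0.2),
    "sunlight":       (0.85, 0.15, 0.05),
    "pizza":          (0.1, 0.9, 0.3),
    "weekend":        (0.0, 0.8, 0.5),
}

def embed(essay):
    """Average the vectors of known words (a bag-of-embeddings)."""
    words = [w for w in essay.lower().split() if w in VECTORS]
    if not words:
        return (0.0, 0.0, 0.0)
    dims = zip(*(VECTORS[w] for w in words))
    return tuple(sum(d) / len(words) for d in dims)

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

reference = embed("plants use sunlight energy photosynthesis")

coherent  = cosine(embed("photosynthesis plants energy sunlight"), reference)
scrambled = cosine(embed("sunlight energy plants photosynthesis"), reference)
off_topic = cosine(embed("pizza weekend"), reference)

print(coherent == scrambled)  # True: word order is invisible to the scorer
print(coherent > off_topic)   # True: topical word overlap drives the score
```

A scrambled essay scores identically to a coherent one, and topical word salad would score nearly as well. Whatever Pearson layers on top, anything built from this generation of representations has a ceiling well below "evaluating the meaning of text."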
I've often thought that a lot of the high-brow analysis put into art was junk. Just people taking dots and connecting them with shreds of evidence existent in the art. Confirmation bias masquerading as analysis. It's nice to see an artist agreeing with that viewpoint.
I should clarify that I don't mind when the context of a piece is explained. I like knowing about where the artist was when a work was created; what was happening around them that might have influenced the work. It's when that jumps to "and this small detail is about this particular thing that was happening" -- and always spoken with confidence -- that I feel like the train jumps the rails.
Ha! I don't know if you've ever seen Ocean's 13, but there's a line in that movie that cracks me up along the same "high brow" analysis lines.
> Matt Damon - "Do you have any wine back there?"
> Lady - "Château d'Yquem?"
> Matt Damon - "As long as it's not '73..."
Just makes me chuckle every time, because he's a con artist in such a broad field that almost nobody can actually identify all of the good and bad vintages from any given year. By dropping an obscure reference you somehow sound like you know what you're talking about, counting on nobody else knowing enough to call you on it.
Just struck me as a great bit of "high brow crowd" humor.
Haven't seen the movie, so it's hard to comment directly, but for what it's worth, Château d'Yquem is a very famous wine. Exactly the (rare) sort where the popular wine magazines will, every few years, run an article reviewing how the different historic vintages are holding up -- should you drink that 1975 d'Yquem now or hold it a few more years?
It also would be a very dangerous wine to BS about if you didn't know anything about it. 1973 apparently was a lesser year. (Still, it would run you something like $500 a bottle today.) 1975 and 1976 were classic years, name those as something to skip and people will be questioning your taste. And they didn't release a wine at all for the 1972 and 1974 vintages, because they didn't think the grapes were up to snuff.
I had to look those details up because I haven't paid much attention to wine in 20 years. (Wife doesn't like it, so it's hard to justify buying even a $20 bottle. Not that I ever could have afforded a Château d'Yquem.) But I still remembered the mid-70s produced a couple of really good vintages. Someone who was actively into wine could probably have given you all those details without any research.
>> It's nice to see an artist agreeing with that viewpoint.
Yep. I've always beaten myself up over my ACT score. Near-perfect scores on grammar, science, and math, but near-zero on reading comprehension. And it was a lot of, "what is the author trying to express by using this word in the title?" I'd rather know how good I am at, "after reading this 5 page article, did you catch this really important detail well enough to recall it quickly?"
Art's important too, but can you judge someone's artistic side in a multiple choice test graded on a scale of 1-9? Don't think so...
The fact that it claims to check spelling and grammar seems suspect to me. If it really were even as good as Microsoft Word at checking spelling and grammar, they would have spun that out and sold it as a standalone spelling and grammar checker instead of as part of a complete packaged writing-analysis tool. This makes me doubt the validity of their more ambitious claims, like checking for quality of "ideas" and analytics.
It seems to me that there is a much easier way of automating logical reasoning tests: just make it a standardized multiple-choice test and have a machine check the answers. The LSAT is probably one of the most successful analytic and logical reasoning tests, and it has been done this way for a while.
I teach biology at a community college and most of the services listed in that article wouldn't improve my courses. I've already switched three of my four courses over to open textbooks, which students can download for free or purchase for the cost of printing. I use Openstax for these.
For the online learning software, I've also dumped the publisher products and switched to the free spaced-repetition software Memrise.
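For anyone curious what spaced-repetition software actually does under the hood, here's a minimal sketch of the classic SM-2 scheduling algorithm (the ancestor of most of these tools; I don't know Memrise's exact internals, and this simplified version skips some of SM-2's bookkeeping, such as updating the ease factor on failed reviews):

```python
def sm2(quality, reps, interval, ease):
    """One review step of a simplified SM-2 spaced-repetition scheduler.

    quality:  self-graded recall, 0 (blackout) to 5 (perfect)
    reps:     consecutive successful reviews so far
    interval: current gap in days before the next review
    ease:     per-card difficulty factor (starts at 2.5)
    Returns the updated (reps, interval, ease).
    """
    if quality < 3:
        # Failed recall: start the card over (ease kept as-is here).
        return 0, 1, ease
    if reps == 0:
        interval = 1
    elif reps == 1:
        interval = 6
    else:
        interval = round(interval * ease)
    # Adjust ease by how hard the recall felt (floored at 1.3).
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return reps + 1, interval, ease

# A card answered perfectly three times in a row gets pushed further out
# each time: review after 1 day, then 6 days, then 16 days.
state = (0, 0, 2.5)
for q in (5, 5, 5):
    state = sm2(q, *state)
print(state[1])  # 16
```

The point is just that review gaps grow multiplicatively for material you know and collapse back to one day for material you miss, which is the property that makes these tools so much more efficient than rereading.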
I think most of my colleagues will be moved over to open educational resources within fifteen years, and I'm not sure there's long-term profit to be made in this market.
>I think most of my colleagues will be moved over to open educational resources within fifteen years, and I'm not sure there's long-term profit to be made in this market.
As a student, I'm not sure I agree with this. I think there's a lot of value in having a really well-laid-out, well-designed textbook with good examples and illustrations, especially in lower-division classes where you end up mostly teaching yourself the material from the text anyway. Most of the open courseware I've had to deal with in classes was extremely subpar compared to actual textbooks.
I think if someone could create textbooks at the quality of Pearson in the range of $20-$50 instead of $100-$200, I would be happy to pay for better materials.
Check out https://cogenteducation.com/products when you have a chance. We make interactive case studies for biology that go much further than any textbook, digital or otherwise.
> I think most of my colleagues will be moved over to open educational resources within fifteen years, and I'm not sure there's long-term profit to be made in this market.
Sounds like he's looking for a free solution, not trialware.
I agree. It's rare for a teacher to pay for our cases out of their own pocket; it's generally the school system who purchases a package out of their yearly budget.
> We didn't know how to talk or study or dress or think the way our peers did. It took me years to learn.
Please tell me everything about this. I teach at a community college and my students come from the middle class, the working class, and the place I can't see.
Not OP, but I'm not sure I follow your question. Are you asking to learn context about problems outside of your own class?
I was raised in a lower/middle/working class area and went to its local community college and could probably comment. Much of what OP described matches very closely to my own experiences.
Yes that's wrong, but the paper that corrected this misconception wasn't published until January of 2016. I think you should give them at least a year to update before unleashing the disapproving scowl.
The most disappointing YC-backed product I tried was Stypi.
I teach community college and sometimes I wonder about the thought processes behind some of my students' papers. Paul Graham linked an essay he wrote in Stypi, where you could watch him write it in real time. This was clearly the greatest computer-assisted tool for teaching writing ever, and I immediately incorporated Stypi into one of my writing assignments. I wanted to know how much my students proofread, how they structured essays, and what they struggled with as they wrote. I was so excited about it that I wrote the entire assignment in Stypi and linked my students to the replay in case they were interested.
It was a disaster. So many students lost essays in browser crashes or were flat out unable to use the software. I ultimately had to apologize to my class, give everyone an extension, and cut Stypi out of the project.
Apparently they were acquired though, so I guess they made someone happy.
> Fortunately, my cofounder and I were lucky enough to have had access to several other acquired founders who helped us ultimately navigate our first multimillion-dollar exit for a company barely a year old.
> DISCLAIMER: Every startup and founder group is unique so the data provided here may not apply in all cases. In particular, it is skewed towards < $25m acquisitions made by much larger companies.
access to several other acquired founders who helped us ultimately navigate
That's founder code for "we actually failed, but we know rich people, so our failure is really our success—suck it, everybody else who fails without millionaire friend safety nets."
When I was a kid I put water wings on my ankles so I could walk on the water like Jesus. I fell over within 0.5 seconds of entering the pool and they held me upside down with my feet sticking out. I remember thinking I was going to die right up until my Mom jumped in and saved me.
I teach at a community college, and as it was explained to me by our financial aid office, student aid isn't paid out until a few weeks in as a mechanism to prevent financial aid fraud. Every year people sign up for classes, collecting their financial aid money, and then never show up. Community colleges are the primary target for this type of fraud, since we have open admissions.
The federal government is not okay with this, so they instituted a system of mandatory reporting where we have to keep track of whether or not a student shows up during the first few weeks. That way they know who not to send aid to when that time is up. If we fail to properly report this then the college is expected to send money to the federal government to make up the difference.
So it's not a trick on our part to get you to use the campus bookstore. I personally encourage my students not to use the bookstore, since I think marking up the prices on textbooks is inappropriate. I've also pushed hard for the adoption of open textbooks in my department, but it looks like that's going to be at least two years away.