Learn to Code? No: Learn a Real Language

Why coding should not count towards foreign language requirements

By Frank Catalano (Columnist)     Feb 10, 2014

The “learn to code” movement may be about to run afoul of the Law of Unintended Consequences.

Few (least of all nerdy me) will argue that learning a computer language as a kid doesn’t have merit. Grasping some of the basics of computer science by picking up a programming language is a great way to determine how to break big hairy problems into manageable component parts, experiment with cause and effect, and discover the importance of attention to detail. (Yes, children, capitalization matters, even beyond that English assignment.)

Besides, becoming a silicon whisperer can open new windows onto math and science, not to mention careers.

Yet even if computer languages are foreign to us carbon-based life forms, they are not equivalent to human languages. And you wouldn’t necessarily know that based on some legislative proposals making the rounds.

  • In New Mexico, a state senator is pushing to have computer programming, such as in JavaScript or HTML, count toward the “language other than English” graduation requirement.
  • Kentucky is pondering similar legislation for the required two credits of foreign language in the name of what the bill’s sponsor calls “flexibility,” and it cleared the state’s Senate Education Committee late in January.
  • In the U.S. House of Representatives, a measure promoting school-age programming has been introduced with the admittedly delightful hexadecimal title of the “416d65726963612043616e20436f646520 Act of 2013.” Its author, Representative Tony Cardenas of California, says it would designate computer languages as “critical foreign languages.”
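
(For the curious, that bill title really is ordinary ASCII text encoded as hexadecimal, and decoding it takes a line or two in most languages. A quick Python sketch, using the hex string exactly as it appears in the bill's title:)

```python
# Each pair of hex digits is one ASCII character code:
# 0x41 -> "A", 0x6d -> "m", 0x65 -> "e", and so on.
hex_title = "416d65726963612043616e20436f646520"
decoded = bytes.fromhex(hex_title).decode("ascii")
print(decoded)  # "America Can Code " (the final 0x20 is a trailing space)
```

So the measure is, literally, the "America Can Code Act."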

This is both so geekily pro-tech, and so very wrong.

True, computer programming is learning a “foreign language” in the sense that you are trying to map how you give instructions in your native tongue to how a digital recipient recognizes what you’re trying to say in its highly structured vocabulary and syntax. That’s great for honing logical thinking, accomplishing tasks and maybe even making a few bucks (or more) by creating apps or pursuing a job.

But vocabulary and syntax are pretty much where the similarities between computer and human languages end. It’s the difference between communicating with something versus someone.

Human languages are not just for describing the actions of other meat puppets; they are also an entry point for grokking different ways of describing physical objects, emotions and concepts that may have a cultural or historical basis different from the learner's own--providing a new perspective, and avenues for further education.

I’ll never forget being fascinated by German’s endless compound nouns, or being able only to approximate translations of idioms in Spanish or German that make perfect sense to native speakers. (Need proof of how hard this is? Read any 1980s-era videocassette recorder manual. Or just use Google Translate today.)

And let’s face it: for all its appeal, I doubt nerd culture is as rich or ancient as Russian, Chinese or Greek culture. Klingon, with its own language, may be the exception--but that’s non-human.

Long term, I’d argue that learning another human language can be as beneficial financially as any specific computer language. Aside from encouraging greater mental flexibility and creative thinking skills, being bilingual is in demand by employers as companies adopt an international posture and as more non-native English speakers are part of the U.S.’ own demographic shifts.

Finally, there’s the matter of longevity. The German and Spanish I learned as a child are still contemporary and, with a little renewed exposure, useful in the form in which I learned them. I cannot say the same about the FORTRAN IV I studied in junior high school. Though I could, with a little prompting, probably still use Hollerith statements to create a mean USS Enterprise out of individual asterisks.

I doubt many of the well-intentioned code advocates like Code.org anticipated or even wanted this, especially with CodeDay coming up on February 15. And Washington, along with a handful of other states, has taken a more, well, logical path by allowing computer science courses to count toward science or math graduation requirements, understanding that computer programming is more STEM than speech.

A better approach? Encourage both. Picking up a programming language might spur a student to later take on another human language, or vice versa. Use each as a gateway to the other: Computer languages to teach meaningful abstraction, and human languages to teach deep communication.

In a dystopian society, the risk is that we learn how to communicate with machines--and forget how to communicate with each other. Unless you’re just hedging your bets. Right, Skynet?

Editor's Note: This post originally appeared in GeekWire.

Frank Catalano (@FrankCatalano) is an independent strategist, author and veteran analyst of digital education and consumer technologies whose regular GeekWire columns take a practical nerd’s approach to tech (see the column archive). He still has his FORTRAN IV manual somewhere but is completely out of paper tape and punch cards. He is also a frequent EdSurge contributor.
