Very interesting article (link to it)

Quote:

Why Johnny can't code
BASIC used to be on every computer a child touched -- but today there's no easy way for kids to get hooked on programming.
BY DAVID BRIN
For three years -- ever since my son Ben was in fifth grade -- he and I have engaged in a quixotic but determined quest: We've searched for a simple and straightforward way to get the introductory programming language BASIC to run on either my Mac or my PC.

Why on Earth would we want to do that, in an era of glossy animation-rendering engines, game-design ogres and sophisticated avatar worlds? Because if you want to give young students a grounding in how computers actually work, there's still nothing better than a little experience at line-by-line programming.

Only, quietly and without fanfare, or even any comment or notice by software pundits, we have drifted into a situation where almost none of the millions of personal computers in America offers a line-programming language simple enough for kids to pick up fast. Not even the one that was a software lingua franca on nearly all machines, only a decade or so ago. And that is not only a problem for Ben and me; it is a problem for our nation and civilization.

Oh, today's desktops and laptops offer plenty of other fancy things -- a dizzying array of sophisticated services that grow more dazzling by the week. Heck, I am part of that creative spasm.

Only there's a rub. Most of these later innovations were brought to us by programmers who first honed their abilities with line-programming languages like BASIC. Yes, they mostly use higher level languages now, stacking and organizing object-oriented services, or using other hifalutin processes that come prepackaged and ready to use, the way an artist uses pre-packaged paints. (Very few painters still grind their own pigments. Should they?)

And yet the thought processes that today's best programmers learned at the line-coding level still serve these designers well. Renowned tech artist and digital-rendering wizard Sheldon Brown, leader of the Center for Computing in the Arts, says: "In my Electronics for the Arts course, each student built their own single board computer, whose CPU contained a BASIC ROM [a chip permanently encoded with BASIC software]. We first did this with 8052's and then with a chip called the BASIC Stamp. The PC was just the terminal interface to these computers, whose programs would be burned into flash memory. These lucky art students were grinding their own computer architectures along with their code pigments -- along their way to controlling robotic sculptures and installation environments."

But today, very few young people are learning those deeper patterns. Indeed, they seem to be forbidden any access to that world at all.

And yet, they are tantalized! Ben has long complained that his math textbooks all featured little type-it-in-yourself programs at the end of each chapter -- alongside the problem sets -- offering the student a chance to try out some simple algorithm on a computer. Usually, it's an equation or iterative process illustrating the principle that the chapter discussed. These "TRY IT IN BASIC" exercises often take just a dozen or so lines of text. The aim is both to illustrate the chapter's topic (e.g. statistics) and to offer a little taste of programming.

Only no student tries these exercises. Not my son or any of his classmates. Nor anybody they know. Indeed, I would be shocked if more than a few dozen students in the whole nation actually type in those lines that are still published in countless textbooks across the land. Those who want to (like Ben) simply cannot.

Now, I have been complaining about this for three years. But whenever I mention the problem to some computer industry maven at a conference or social gathering, the answer is always the same: "There are still BASIC programs in textbooks?"

At least a dozen senior Microsoft officials have given me the exact same response. After taking this to be a symptom of cluelessness in the textbook industry, they then talk about how obsolete BASIC is, and how many more things you can do with higher-level languages. "Don't worry," they invariably add, "the newer textbooks won't have any of those little BASIC passages in them."

All of which is absolutely true. BASIC is actually quite tedious and absurd for getting done the vast array of vivid and ambitious goals that are typical of a modern programmer. Clearly, any kid who wants to accomplish much in the modern world would not use it for very long. And, of course, it is obvious that newer texts will abandon "TRY IT IN BASIC" as a teaching technique, if they haven't already.

But all of this misses the point. Those textbook exercises were easy, effective, universal, pedagogically interesting -- and nothing even remotely like them can be done with any language other than BASIC. Typing in a simple algorithm yourself, seeing exactly how the computer calculates and iterates in a manner you could duplicate with pencil and paper -- say, running an experiment in coin flipping, or making a dot change its position on a screen, propelled by math and logic, and only by math and logic: All of this is priceless. As it was priceless 20 years ago. Only 20 years ago, it was physically possible for millions of kids to do it. Today it is not.

In effect, we have allowed a situation to develop that is like a civilization devouring its seed corn. If an enemy had set out to do this to us -- quietly arranging so that almost no school child in America can tinker with line coding on his or her own -- any reasonably patriotic person would have called it an act of war.

Am I being overly dramatic? Then consider a shift in perspective.

First ponder the notion of programming as a series of layers. At the bottom-most level is machine code. I showed my son the essentials on scratch paper, explaining the roots of Alan Turing's "general computer" and how it was ingeniously implemented in the first four-bit integrated processor, Intel's miraculous 1971 4004 chip, unleashing a generation of nerdy guys to move bits around in little clusters, adding and subtracting clumps of ones and zeroes, creating the first calculators and early desktop computers like the legendary Altair.

This level of coding is still vital, but only at the realm of specialists at the big CPU houses. It is important for guys like Ben to know about machine code -- that it's down there, like DNA in your cell -- but a bright kid doesn't need to actually do it, in order to be computer-literate. (Ben wants to, though. Anyone know a good kit?)

The layer above that is often called assembler, though there are many various ways that user intent can be interpreted down to the bit level without actually flicking a series of on-off switches. Sets of machine instructions are grouped, assembled and correlated with (for example) ASCII-coded commands. Some call this the "boringest" level. Think of the hormones swirling through your body. Even a glimpse puts me to sleep. But at least I know that it is there.

The third layer of this cake is the operating system of your computer. Call it BIOS and DOS, along with a lot of other names. This was where guys like Gates and Wozniak truly propelled a whole industry and way of life, by letting the new desktops communicate with their users, exchange information with storage disks and actually show stuff on a screen. Cool.

Meanwhile, the same guys were offering -- at the fourth layer -- a programming language that folks could use to create new software of their very own. BASIC was derived from academic research tools like beloved old FORTRAN (in which my doctoral research was coded onto punched paper cards, yeesh). It was crude. It was dry. It was unsuitable for the world of the graphic user interface. BASIC had a lot of nasty habits. But it liberated several million bright minds to poke and explore and aspire as never before.

The "scripting" languages that serve as entry-level tools for today's aspiring programmers -- like Perl and Python -- don't make this experience accessible to students in the same way. BASIC was close enough to the algorithm that you could actually follow the reasoning of the machine as it made choices and followed logical pathways. Repeating this point for emphasis: You could even do it all yourself, following along on paper, for a few iterations, verifying that the dot on the screen was moving by the sheer power of mathematics, alone. Wow! (Indeed, I would love to sit with my son and write "Pong" from scratch. The rule set -- the math -- is so simple. And he would never see the world the same, no matter how many higher-level languages he then moves on to.)

The closest parallel I can think of is the WWII generation of my father -- guys for whom the ultra in high tech was automobiles. What fraction of them tore apart jalopies at home? Or at least became adept at diagnosing and repairing the always fragile machines of that era? One result of that free and happy spasm of techie fascination was utterly strategic. When the "Arsenal of Democracy" began churning out swarms of tanks and trucks and jeeps, these were sent to the front and almost overnight an infantry division might be mechanized, in the sure and confident expectation that there would be thousands of young men ready (or trainable) to maintain these tools of war. (Can your kid even change the oil nowadays? Or a tire?)

The parallel technology of the '70s generation was IT. Not every boomer soldered an Altair from a kit, or mastered the arcana of DBASE. But enough of them did so that we got the Internet and Web. We got Moore's Law and other marvels. We got a chance to ride another great technological wave.

So, what's the parallel hobby skill today? What tech-marvel has boys and girls enthralled, tinkering away, becoming expert in something dazzling and practical and new? Shooting ersatz aliens in "Halo"? Dressing up avatars in "The Sims"? Oh sure, there's creativity in creating cool movies and Web pages. But except for the very few who will make new media films, do you see a great wave of technological empowerment coming out of all this?

OK, I can hear the sneers. Are these the rants of a grouchy old boomer? Feh, kids today! (And get the #$#*! off my lawn!)

Fact is, I just wanted to give my son a chance to sample some of the wizardry standing behind the curtain, before he became lost in the avatar-filled and glossy-rendered streets of Oz. Like the hero in "TRON," or "The Matrix," I want him to be a user who can see the lines that weave through the fabric of cyberspace -- or at least know some history about where it all came from. At the very minimum, he ought to be able to type those examples in his math books and use the computer the way it was originally designed to be used: to compute.

Hence, imagine my frustration when I discovered that it simply could not be done.

Yes, yes: For three years I have heard all the rationalized answers. No kid should even want BASIC, they say. There are higher-level languages like C++ (Ben is already -- at age 14 -- on page 200 of his self-teaching C++ book!) and yes, there are better education programs like Logo. Hey, what about Visual Basic! Others suggested downloadable versions like q-basic, y-basic, alphabetabasic...

Indeed, I found one that was actually easy to download, easy to turn on, and that simply let us type in some of those little example programs, without demanding that we already be manual-chomping fanatics in order to even get started using the damn thing. Chipmunk Basic for the Macintosh actually started right up and let us have a little clean, algorithmic fun. Extremely limited, but helpful. All of the others, every last one of them, was either too high-level (missing the whole point!) or else far, far too onerous to figure out or use. Certainly not meant to be turn-key usable by any junior high school student. Appeals for help online proved utterly futile.

Until, at last, Ben himself came up with a solution. An elegant solution of startling simplicity. Essentially: If you can't beat 'em, join 'em.

While trawling through eBay, one day, he came across listings for archaic 1980s-era computers like the Apple II. "Say, Dad, didn't you write your first novel on one of those?" he asked.

"Actually, my second. 'Startide Rising.' On an Apple II with Integer Basic and a serial number in five digits. It got stolen, pity. But my first novel, 'Sundiver,' was written on this clever device called a typewrit --"

"Well, look, Dad. Have you seen what it costs to buy one of those old Apples online, in its original box? Hey, what could we do with it?"

"Huh?" I stared in amazement.

Then, gradually, I realized the practical possibilities.

Let's cut to the chase. We did not wind up buying an Apple II. Instead (for various reasons) we bought a Commodore 64 (in original box) for $25. It arrived in good shape. It took us maybe three minutes to attach an old TV. We flicked the power switch ... and up came a command line. In BASIC.

Uh. Problem solved?

I guess. At least far better than any other thing we've tried!

We are now typing in programs from books, having fun making dots move (and thus knowing why the dots move, at the command of math, and not magic). There are still problems, like getting an operating system to make the 1541 disk drive work right. Most of the old floppies are unreadable. But who cares? (Ben thinks that loading programs to and from tape is so cool. I gurgle and choke remembering my old Sinclair ... but whatever.)

What matters is that we got over a wretched educational barrier. And now Ben can study C++ with a better idea where it all came from. In the nick of time.

Problem solved? Again, at one level.

And yet, can you see the irony? Are any of the masters of the information age even able to see the irony?

This is not just a matter of cheating a generation, telling them to simply be consumers of software, instead of the innovators that their uncles were. No, this goes way beyond that. In medical school, professors insist that students have some knowledge of chemistry and DNA before they are allowed to cut open folks. In architecture, you are at least exposed to some physics.

But in the high-tech, razzle-dazzle world of software? According to the masters of IT, line coding is not a deep-fabric topic worth studying. Not a layer that lies beneath, holding up the world of object-oriented programming. Rather, it is obsolete! Or, at best, something to be done in Bangalore. Or by old guys in their 50s, guaranteeing them job security, the same way that COBOL programmers were all dragged out of retirement and given new cars full of Jolt Cola during the Y2K crisis.

All right, here's a challenge. Get past all the rationalizations. (Because that is what they are.) It would be trivial for Microsoft to provide a version of BASIC that kids could use, whenever they wanted, to type in all those textbook examples. Maybe with some cool tutorial suites to guide them along, plus samples of higher-order tools. It would take up a scintilla of disk space and maybe even encourage many of them to move on up. To (for example) Visual Basic!

Or else, hold a big meeting and choose another lingua franca, so long as it can be universal enough to use in texts, the way that BASIC was.

Instead, we are told that "those textbooks are archaic" and that students should be doing "something else." Only then watch the endless bickering over what that "something else" should be -- with the net result that there is no lingua franca at all, no "basic" language so common that textbook publishers can reliably use it as a pedagogical aide.

The textbook writers and publishers aren't the ones who are obsolete, out-of-touch and wrong. It is the people who have yanked the rug out from under teachers and students all across the land.

Let me reiterate. Kids are not doing "something else" other than BASIC. Not millions of them. Not hundreds or tens of thousands of them. Hardly any of them, in fact. It is not their fault. Because some of them, like my son, really want to. But they can't. Not without turning into time travelers, the way we did, by giving up (briefly) on the present and diving into the past. (I also plan to teach him how to change the oil and fix a tire!) By using the tools of a bygone era to learn more about tomorrow.

If this is a test, then Ben and I passed it, ingeniously. In contrast, Microsoft and Apple and all the big-time education-computerizing reformers of the MIT Media Lab are failing, miserably. For all of their high-flown education initiatives (like the "$100 laptop"), they seem bent on providing information consumption devices, not tools that teach creative thinking and technological mastery.

Web access for the poor would be great. But machines that kids out there can understand and program themselves? To those who shape our technical world, the notion remains not just inaccessible, but strangely inconceivable.


This inspired me to start doing BASIC256 with my dad.

The very first "language" I played around with was HTML and JavaScript, which still exist on every computer with a browser (every computer).

I personally started with BBC BASIC and Logo, and I agree that it would be nice if modern equivalents were more widely available and used. Python is probably the closest thing we have these days, but I think even that is a bit daunting for the very young and inexperienced. There are toy languages such as Microsoft's Small Basic, but that is very strictly a toy and I wouldn't want to develop anything but the most trivial of programs in it.

SirCmpwn wrote:
The very first "language" I played around with was HTML and JavaScript, which still exist on every computer with a browser (every computer).

HTML is a markup language, not a programming language, so I'll skip over that one. JavaScript is indeed widely available but doesn't have the immediate response and low barrier to entry that a language like BASIC does, and cannot perform particularly useful tasks on its own.

(I've said it many times before, but it probably bears repeating at this point: the programming language built into the TI-83+ series calculators is not BASIC, even though that's what the community calls it. The BASIC snippets in textbooks mentioned in the article therefore won't run on the calculator.)
It is not a programming language, but it still serves as a gateway to more in-depth technologies.
The first language I started with was batch scripting. I didn't get very far before I found that you couldn't do much with it. Then, a year later, after a break, I picked up TI-BASIC, and then progressed to learn as many languages as I could (starting with z80 ASM). I also completely agree with that article. In my software design class (a prerequisite to AP Computer Science), we are using QBasic and then working our way up to Java (with, ugh, Visual Basic in between). Even though I hated the idea at first, it really does work well as a teaching tool. I would love to have had a nice, easy language on the computer that I could have used on day one to learn programming, rather than it taking a year or two to get trivial programs written.
My first language was actually VBA in the form of Excel macros. I never really did anything with it except learn how For loops work, though. When I got my calculator in sixth grade, I started learning TI-BASIC, and I went on to learn assembly in eighth grade. I learned Bash in seventh grade, and a little C in eighth grade as well, and I'm learning Java in Pre-AP Computer Science this year.
benryves wrote:


(I've said it many times before, but it probably bears repeating at this point: the programming language built into the TI-83+ series calculators is not BASIC, even though that's what the community calls it. The BASIC snippets in textbooks mentioned in the article therefore won't run on the calculator.)


While that's completely true, TI-BASIC is very similar to real BASIC. I came across some of the real stuff a few days ago in a DSP book and was able to translate it into TI-BASIC almost line for line, excluding some optimizations I made to get the code to run a bit faster.
Qwerty.55 wrote:
While that's completely true, TI-BASIC is very similar to real BASIC. I came across some of the real stuff a few days ago in a DSP book and was able to translate it into TI-BASIC almost line for line, excluding some optimizations I made to get the code to run a bit faster.

Right, but you could say the same for numerous other languages; being able to translate code from one language to another does not make one language the same as another, and a beginner is not going to be able to do so without being familiar with both.

The TI-"BASIC" version of the following old computer benchmarking program is going to look rather different, for example:

Code:
  130 PRINT "BM9"
  140 FOR N=1 TO 1000
  150   FOR K=2 TO 500
  160     M=N/K
  170     L=INT(M)
  180     IF L=0 THEN 230
  190     IF L=1 THEN 220
  200     IF M>L THEN 220
  210     IF M=L THEN 240
  220   NEXT K
  230   PRINT N;
  240 NEXT N
  250 PRINT "E"
  260 END

However, once upon a time you would have been able to easily run that code on just about any computer with no modifications.
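
For comparison, here's how the same benchmark might look in a structured language -- an untested C sketch of my own, with the original BASIC line numbers noted in comments -- which shows how differently the numbered GOTOs map onto modern control flow:

Code:

#include <stdbool.h>
#include <stdio.h>

int main(void)
{
    puts("BM9");
    for (int n = 1; n <= 1000; n++) {      /* 140 FOR N=1 TO 1000 */
        bool divisible = false;
        for (int k = 2; k <= 500; k++) {   /* 150 FOR K=2 TO 500 */
            double m = (double)n / k;      /* 160 M=N/K */
            int l = (int)m;                /* 170 L=INT(M) */
            if (l == 0) break;             /* 180: K exceeds N, stop scanning */
            if (l == 1) continue;          /* 190: quotient is 1, try next K */
            if (m > l) continue;           /* 200: K doesn't divide N, try next K */
            divisible = true;              /* 210: K divides N evenly */
            break;
        }
        if (!divisible) printf("%d ", n);  /* 230 PRINT N; */
    }
    puts("\nE");                           /* 250 PRINT "E" */
    return 0;
}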
I didn't read the whole thing (TL;DR), but here's my take on it (which is probably all covered in the article). What made BASIC on the 8-bit computers of the '80s so easy to pick up was that you had to learn at least the basics of it in order to do anything useful on the computer at all, such as loading and running a pre-written program. If you wanted to calculate something or do some other simple task, you could do that right at the command prompt too. The interactivity made it very easy to break into writing short programs at first, and then bigger and bigger programs as you learned the language and more advanced programming concepts.

If only there were some language today that, like BASIC used to be, is already installed on every major system* and can be used both interactively for day-to-day tasks and for writing programs. I'll have to bash my brain thinking about it for a while...


* Excluding Windows, but that doesn't really count anyway for someone trying to learn programming.
christop wrote:
* Excluding Windows, but that doesn't really count anyway for someone trying to learn programming.

Windows comes with PowerShell, which I'd say is quite a bit better than bash from a programmer's perspective. bash is better than DOS/Windows batch scripting, admittedly, but still a far cry from any half-decent programming language.
christop wrote:
already installed on every major system*

* Excluding Windows, but that doesn't really count anyway for someone trying to learn programming.
While it's true that bash isn't available out of the box on Windows, as benryves said, there's PowerShell, and you can get bash on Windows. But excluding Windows when you say "every major system" is like excluding the letter "e" when you say "every common letter in the English language" (actually, more like excluding the letters e, t, a, o, i, n, s, h, r, d, l, and c, given the market share of Windows). Most people use Windows, so something being installed on Linux doesn't really help the majority of the population learn to program. Hell, I didn't start using Linux until I got to college, and I was programming well before that.
benryves wrote:
Python is probably the closest thing we have these days, but I think even that is a bit daunting for the very young and inexperienced.

lolwut? Python is only as daunting as you want it to be. If you don't utilize the more powerful features you can pretty much ignore them, and, for example, its for loops are very accessible and intuitive.

Quote:

like Perl and Python -- don't make this experience accessible to students in the same way. BASIC was close enough to the algorithm that you could actually follow the reasoning of the machine as it made choices and followed logical pathways. Repeating this point for emphasis: You could even do it all yourself, following along on paper, for a few iterations, verifying that the dot on the screen was moving by the sheer power of mathematics, alone.

In what way does the Python you'd be teaching a 10 year old obscure the "logical pathways" of the algorithm? if, elif, else, while, and for do exactly what they say on the tin. You can still do it all yourself following along on paper.

Alternatively, C. Pointers really aren't that hard if you explain them competently. Unlike Java, there's very little boilerplate in a C program, which translates to there not being any voodoo words that you can't explain until they've been programming for a couple of months already. If you want something a little more readable, there's always Pascal.
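
To illustrate what I mean by a competent explanation, here's a minimal made-up sketch -- nothing fancy, just the two operators doing what they say:

Code:

#include <stdio.h>

int main(void)
{
    int x = 42;
    int *p = &x;            /* & takes the address of x; p now points at x */

    printf("%d\n", *p);     /* * follows the pointer: prints 42 */

    *p = 7;                 /* writing through the pointer changes x itself */
    printf("%d\n", x);      /* prints 7 */
    return 0;
}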

I truly don't understand the purpose of this article, except that it tells me that either

  • The author is too far toward the code-monkey end of the programmer spectrum (as also evidenced by his use of the word "code" in the title) to really understand the material he's trying to teach his son.
  • He's blinded by nostalgia.
Let me reiterate Elfprince13's point about pointers. When I was taking CS102, my fellow students had heard that pointers were this big massive roadblock to understanding, to the extent that some even preemptively declared they didn't think they'd understand them (and subsequently decided they didn't understand them). I had a demanding C teacher, but he taught us well; the challenge was in the scope of the assignments he gave us. I had great fun with them, but many others complained, while other C programming sections were getting absolutely laughably trivial assignments and tests. I feel like people build up that same sense of impending doom when they're trying to learn ASM, to the extent that for many of them it becomes a reality.
Actually, a few of my textbooks had "Try this on your TI-83+" sections, and I was one of the few people who knew enough to actually do so. I really wish my math teachers had spent time on this; the only one I recall even mentioning calc programs during class was actually my physics teacher.

On the subject of pointers, my biggest issue is always remembering when to use * versus & for argument passing and pointer manipulation.

The issue with PowerShell is that it isn't installed by default, and the default Windows terminal emulator is rather lacking. Command-line programs are definitely second-class citizens on Windows, and that is one of the reasons I feel much more comfortable programming on Linux.
TheStorm wrote:
On the subject of pointers, my biggest issue is always remembering when to use * versus & for argument passing and pointer manipulation.


This really shouldn't be an issue once you've programmed in C for more than an hour, any more than remembering where to use * and / or + and - is in normal arithmetic.

  • * dereferences a pointer, meaning it follows the pointer to what it is pointing at.
  • & gives you a pointer that points at the thing you are &ing.
  • Pretty much anything other than primitive types should always be passed as a pointer to avoid abusing the stack.
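
A throwaway example (hypothetical, not from any real codebase) showing the first two rules in action:

Code:

#include <stdio.h>

/* Takes a pointer so it can modify the caller's variable
   instead of a copy. */
void increment(int *n)
{
    *n += 1;                 /* * dereferences: follow the pointer and add 1 */
}

int main(void)
{
    int count = 0;
    increment(&count);       /* & because increment() wants an address */
    printf("%d\n", count);   /* prints 1 */
    return 0;
}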


TheStorm wrote:
the default Windows terminal emulator is rather lacking. Command-line programs are definitely second-class citizens on Windows, and that is one of the reasons I feel much more comfortable programming on Linux.

I want to cry every time I have to interact with a console application on Windows. This isn't a "default terminal emulator"; it is the only terminal built into the system, and everything that wants to be a console application has to use it. Applications like Console that intend to replace it can't; they just hook around it.
elfprince13 wrote:
Alternatively, C. Pointers really aren't that hard if you explain them competently.


If you get it, they aren't hard. If you don't get it, they are hard. Same with recursion. From what I've seen, it either makes sense quickly or it doesn't and you struggle with it.
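
(For the record, the usual first example -- a quick untested sketch -- is about as small as recursion gets:)

Code:

#include <stdio.h>

/* factorial(n) calls itself on a smaller input until it hits
   the base case -- the same shape as the pencil-and-paper definition. */
unsigned long factorial(unsigned int n)
{
    if (n <= 1)
        return 1;                    /* base case: stop recursing */
    return n * factorial(n - 1);     /* recursive step: shrink the problem */
}

int main(void)
{
    printf("%lu\n", factorial(5));   /* prints 120 */
    return 0;
}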

Quote:
Unlike Java, there's very little boilerplate in a C program, which translates to there not being any voodoo words that you can't explain until they've been programming for a couple of months already.


C has quite a bit of boilerplate. Includes, header files, declarations, preprocessor stuff to force single includes, etc...
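
For instance, the include-guard dance alone looks like this (a generic sketch; the header and function names are made up):

Code:

/* myheader.h -- the "preprocessor stuff to force single includes" */
#ifndef MYHEADER_H
#define MYHEADER_H

int add(int a, int b);   /* declarations live here; definitions go in a .c file */

#endif /* MYHEADER_H */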
Jonimus, I had a few math textbooks like that, and I remember being disappointed at how few programs they had for me to try out. I also remember that in my freshman year of high school there was a BASIC program in my math textbook that I took great pleasure in successfully porting to my calculator.
Kllrnohj wrote:
elfprince13 wrote:
Alternatively, C. Pointers really aren't that hard if you explain them competently.


If you get it, they aren't hard. If you don't get it, they are hard. Same with recursion. From what I've seen, it either makes sense quickly or it doesn't and you struggle with it.

Quote:
Unlike Java, there's very little boilerplate in a C program, which translates to there not being any voodoo words that you can't explain until they've been programming for a couple of months already.


C has quite a bit of boilerplate. Includes, header files, declarations, preprocessor stuff to force single includes, etc...


I must say that I 100% agree with Kllr, simply because two years ago I started trying to learn C++ with no knowledge of programming, and didn't fare too well. Working with C's confusing boilerplate turned me off fast -- I don't think I ever even got to pointers; I was too busy trying to comprehend what a string was (I thought it had something to do with threads).

Recursion, however, I got right away the first time I learned about the concept (though that was well after this C++ learning incident). Many things in C are very hard to even begin comprehending without simpler programming knowledge first, and I can say that directly from firsthand experience.
My math textbooks had one or two "Program this!" sections, which, for the most part, I didn't use, but I read the code and understood it. With C, the biggest hurdle for me was figuring out which operator dereferenced a pointer and which took the address of a value. After I got that, everything was pretty easy.
Kllrnohj wrote:
If you get it, they aren't hard. If you don't get it, they are hard.

If someone understands arrays, they already understand pointers and just don't know it.
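
A quick made-up sketch of what I mean:

Code:

#include <stdio.h>

int main(void)
{
    int a[3] = {10, 20, 30};
    int *p = a;                 /* an array name decays to a pointer to its first element */

    printf("%d\n", a[1]);       /* prints 20 */
    printf("%d\n", *(p + 1));   /* also 20: a[i] is defined as *(a + i) */
    return 0;
}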


Quote:
C has quite a bit of boilerplate. Includes, header files, declarations, preprocessor stuff to force single includes, etc...

Sure, but you don't have to have all of that right away.

Your first Java program:

Code:

public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World");
    }
}

Questions it raises:
What is a class? "Don't worry about it, I'll tell you later"
What does "public" mean? "It has to do with visibility of the things you define to other classes or outside your package, but don't worry about that, because you won't actually be defining multiple classes for 2 more months, and who even knows when you'll have to make your own package"
What does "static" mean? "It doesn't need to be called on an instantiated object, but don't worry about that, because you won't know what that means for another couple of weeks"
What does "void" mean? "It's an empty return type, but don't worry about that, because you won't be writing any other functions for another 6 weeks"
Why do I need String[] args and where do they come from? "magic, don't worry about those, because you won't learn what an array is for another 6 weeks"

Your first C program:

Code:

#include <stdio.h>

int main(void)
{
    printf("Hello World\n");
    return 0;
}

Questions it raises:
What is stdio.h? "Standard libraries for input and output."
What is that \n thing? "It makes a newline"


Have I made my point?
  