The BASIC programming language turns 60

Python has that operator ( / is floating point, // is integer division), and it's sometimes very useful. And the % remainder operator is super useful for working with modulo math.
And a lot of strongly typed languages will just do integer division on two integers even if you're assigning the result to a floating-point or decimal type, which is not obvious behavior to non-developers.
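A quick sketch of the three Python operators mentioned above:

```python
# Python's division-related operators
print(7 / 2)    # true division: 3.5
print(7 // 2)   # floor division: 3
print(7 % 2)    # remainder: 1

# Note that // floors toward negative infinity,
# unlike the truncation toward zero that C performs.
print(-7 // 2)  # -4
```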
 
Upvote
4 (4 / 0)

zogus

Ars Tribunus Angusticlavius
7,184
Subscriptor
Python has that operator ( / is floating point, // is integer division), and it's sometimes very useful. And the % remainder operator is super useful for working with modulo math.

I have no idea how the VB team implemented theirs; they could certainly have screwed it up somehow, but both integer division and remainders are, IMO, great features. They're not hard to do manually, but the builtins are really handy.
What's wrong with an integer division operator, anyway? perl5 didn't have one built in, which was pretty annoying. Yes, there are workarounds like defining a div() function that rounds the result to integer, or declaring 'use integer;' to turn all of the operators in scope integer-only, but 98.6% of perl's appeal was that it allowed you to write scripts quickly without resorting to these silly incantations.
 
Upvote
2 (2 / 0)

Vorpal42

Smack-Fu Master, in training
1
Missing from this article: In the 70s and 80s, there were thousands of PICK-based business applications around the world written in a wonderful high-level version of BASIC. At certain points PICK was likely the most used OS in the world. Seemingly every company had a version: Fujitsu, IBM, McDonnell-Douglas, ADDS, the list goes on and on. I ended up on these systems in the early 80s, including the excellent PRIMOS (from Prime Computer), which had a powerful PICKish application called "Prime Information". PICK was eventually ported to all of the various Unix variants, and so I ended up programming in PICK-BASIC on Unix systems for my entire career, until about 4 years ago. It was integrated well with Unix, and I always used highly structured code: NO branching (like gotos) EVER, even though the language allowed it. Also there were no line numbers. The last company I helped with this environment recently replaced it with Microsoft Dynamics 365FO, and productivity dropped so badly that they literally doubled their staff (over 100 people had to be hired!).
 
Upvote
2 (2 / 0)

Dzov

Ars Legatus Legionis
16,028
Subscriptor++
What's wrong with an integer division operator, anyway? perl5 didn't have one built in, which was pretty annoying. Yes, there are workarounds like defining a div() function that rounds the result to integer, or declaring 'use integer;' to turn all of the operators in scope integer-only, but 98.6% of perl's appeal was that it allowed you to write scripts quickly without resorting to these silly incantations.
Your silly incantations aren't silly enough. I prefer to convert my floating division result to a string, then pull the digits to the left of the decimal point and convert them back into a number for the result.
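For the curious, the tongue-in-cheek approach above really does work for small positive numbers. A Python sketch (the function name is mine):

```python
def string_div(a, b):
    # Deliberately roundabout integer division: divide as floats,
    # stringify, and keep everything left of the decimal point.
    quotient = str(a / b)
    return int(quotient.split(".")[0])

print(string_div(7, 2))  # 3
print(string_div(4, 2))  # 2
```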
 
Upvote
3 (3 / 0)

mediadude

Smack-Fu Master, in training
2
Started my Basic journey as a junior in high school on a time-shared connection from an IBM 360/67 at MIT to Dartmouth. Also learned Fortran and PL/1 there. Subsequently, my friend Craig and I became the custodians of our high school's Wang "minicomputer."

Each morning, before 1st period, we would bootstrap the machine with hex code, a relatively short black paper tape and then the Basic interpreter on an enormous yellow paper tape. The Wang used time-shared teletypes directly connected to the box, so multiple people could write and run Basic programs simultaneously.

Also, if the Wang crashed during the day, we went down to the lab to rinse and repeat, thus getting to skip some classes.

Later, I wrote a CRM program in TRS Basic for my boss and received a TI-99/4a as compensation. Amazed how many TI-99 folks commented here.

Basic was very good to me... happy birthday!
 
Last edited:
Upvote
4 (4 / 0)

richlove

Seniorius Lurkius
6
Though not nearly as popular, VB.NET is as functional a development language as C# (indeed, both compile to the same intermediate code, and there are only a handful of functional differences, such as VB's support of shadowed classes), and it is easier for a new person (who lacks experience in a C-style language) to read. I do know of some places that use it for professional development.

Another basic language, FutureBasic for Macintosh has evolved into a very powerful language.
It maintains most of the early BASIC commands while adding modern commands.
The apps it generates become Objective-C universal apps that run on Intel or Apple silicon...

"FutureBasic (FB) is a high-level procedural programming language combined with an "Integrated Development Environment" (IDE) for creating native Apple Silicon, Intel or Universal Macintosh 64-bit applications. FB includes an editor, translator, project manager, documentation, and code samples.

FB's compiled BASIC allows easy access to the Mac's graphical user interface and macOS file system via Apple's standard frameworks. It provides access to all standard data types (integers, strings, floating point numbers, etc.) and Apple's object data types (such as: CFStrings, CFArrays, CFDictionaries), structures such as nestable records and arrays, various forms of subroutines (local functions with recursion) and callbacks, and is as powerful as C but without its complexities. FB supports mixing both C and Objective-C source code with FB source."
 
Upvote
2 (2 / 0)

richlove

Seniorius Lurkius
6
Missing from this article: In the 70s and 80s, there were thousands of PICK-based business applications around the world written in a wonderful high-level version of BASIC. At certain points PICK was likely the most used OS in the world. Seemingly every company had a version: Fujitsu, IBM, McDonnell-Douglas, ADDS, the list goes on and on. I ended up on these systems in the early 80s, including the excellent PRIMOS (from Prime Computer), which had a powerful PICKish application called "Prime Information". PICK was eventually ported to all of the various Unix variants, and so I ended up programming in PICK-BASIC on Unix systems for my entire career, until about 4 years ago. It was integrated well with Unix, and I always used highly structured code: NO branching (like gotos) EVER, even though the language allowed it. Also there were no line numbers. The last company I helped with this environment recently replaced it with Microsoft Dynamics 365FO, and productivity dropped so badly that they literally doubled their staff (over 100 people had to be hired!).
There are currently many business systems around the world using PICK basic.
I learned PICK basic while working for McDonnell Douglas field service in the 80's.
I later wrote a Prism terminal emulator for Macintosh that was used by Bank of America.
After years of adding more emulations, that app became MacWise (written in FutureBasic).
 
Upvote
2 (2 / 0)

NetMage

Ars Tribunus Angusticlavius
9,745
Subscriptor
I learned computer programming writing Basic on an Atari 400. Ahhh that chiclet keyboard, I developed muscles on my fingers.
I upgraded my 400 with a third party kit that replaced the membrane keyboard with real keys, and programmed it in Action!, which was a pretty amazing product for its time. Later I could afford to get an 800 when it was on sale, but it just wasn't the same.
 
Upvote
0 (0 / 0)

NetMage

Ars Tribunus Angusticlavius
9,745
Subscriptor
I later wrote a Prism terminal emulator for Macintosh that was used by Bank of America.
A high school friend took me along with her parents to buy a computer to use at university when she was transferring from the two-year community college to the four-year university (that I went to), and for some reason they settled on an original Mac. It wasn't very useful for doing her CS homework, which was on a VAX system at the college, until I wrote a VT-100 terminal emulator in MacBASIC to use with the modem and do assignments from her apartment.
 
Upvote
1 (1 / 0)

jefito

Ars Centurion
309
Subscriptor++
I had some early exposure to Basic in high school, in a summer college class for HS students. We programmed a PDP-8 using Basic; programs were saved on paper tape. Some time later, in the PC era, I wrote a Basic compiler for use in programming our company's solid modeling system, which already had a separate -- and pretty rudimentary -- home-grown C compiler (generating the same executable code the Basic compiler used), plus a Lisp interpreter (to try to suck in those AutoCAD users familiar with AutoLisp). The Lisp stuff faded out due to non-use by our customers, as did the Basic compiler. But I took over the C compiler, ANSI-fied it, and that's been used to create some pretty sophisticated software systems that use the solid modeling stuff, and which still exist, as far as I know.

I had fun at that company, but I don't really miss Basic...
 
Upvote
0 (0 / 0)
We had a Blue Chip floppy drive for our C128. It caught on fire (literally) so we had to send it back to be refurbished under warranty. The next month or two was painful as my brother and I waited for them to fix it and ship it back (we lived on a small island in Alaska where most packages came via the slow boat). We had no way to load programs, so we had to type everything in. If it was a long program, we’d leave the computer on for days because we didn’t want to lose all that effort. We were really happy to have that drive back!
I forgot about leaving the computer on. My (dad's, but he didn't learn to use it) TRS-80 was plugged into a switch-controlled socket in my parents' bedroom. I would leave the switch on, but family members would complain "who left the light on" and switch it off. They would forget I had said don't switch it off.
 
Upvote
1 (1 / 0)

malor

Ars Legatus Legionis
16,093
IBM doesn't belong in that headline when Atari is absent.
Atari BASIC was pretty buggy and terrible, as I recall.

But, yes, Atari was in much earlier than IBM. A proper subhead probably should have been about the Holy Trinity: the Commodore PET, the Apple II, and the TRS-80. (They were referred to that way in magazines for a few years; IIRC, they all shipped in 1977.) The IBM offering didn't show up until 1981.
 
Upvote
1 (1 / 0)
Legend has it that the acronym for BASIC was made up after the fact by Kemeny and Kurtz (though they later denied that), and that it originally just meant what the word "basic" means, as in something fundamental. It was capitalized in order to distinguish references to the programming language from other common uses of the word. They got tired of responding to the question "What does it mean?", so they came up with an acronym that fits the word. The name supposedly came from a prior unpublished paper by Kurtz, though many still claim it is a backronym rather than an acronym.

In actual fact, BASIC does not fit its acronym very well.

It is certainly well suited for beginners because it relies primarily on variants of English language words and common arithmetic symbols, rather than striving to reduce keystrokes through the use of "special characters" like most other programming languages of the era. Anyone who has ever used a keypunch machine or a Teletype understands the motive to reduce keystrokes.

All-purpose? More accurately, it could be described as having no specific purpose, unlike Fortran or COBOL, though an early version was specifically designed to tackle linear algebra and matrix math like Fortran. But there are a great many programming tasks that BASIC is not well suited for.

Symbolic? Not even vaguely. It used line numbers rather than symbols to represent locations in the code. It used single letters to represent variable names; digits were appended to letters to increase the number of available names. Some early versions even allowed the use of full words as variable names, even though only the first letter was significant. But it would be more than two decades before BASIC would allow the use of symbolic names for subroutines, variables, and constants.

Instruction Code? It would be very unusual in those days to refer to any high-level programming language as an "instruction code". This term was used almost exclusively to refer to singular commands in the native language of the processor - its "machine code", or the assembly-language mnemonic that represents that code. Multiple codes would be referred to in the plural, and the entire suite of codes would be called the "instruction set". But nobody would call an entire language an "instruction code".

In short, BASIC began as "basic" rather than "B.A.S.I.C.".
 
Upvote
-1 (2 / -3)

hisnyc

Smack-Fu Master, in training
82
Subscriptor
Though not nearly as popular, VB.NET is as functional a development language as C# (indeed, both compile to the same intermediate code, and there are only a handful of functional differences, such as VB's support of shadowed classes), and it is easier for a new person (who lacks experience in a C-style language) to read. I do know of some places that use it for professional development.

Wash your mouth out with soap. VB.NET's defaults are all wrong, and it is way too easy to generate terrible IL. I once dealt with a clean-up that showed that the code the VB compiler generated was slower than an in-house interpreted language.

Frankly, I think languages that tried to be more 'natural' were a tremendous mistake. A precise, ideally mathematical, definition of operations ends up being much more maintainable and understandable.

And I hate AndAlso... Get off my lawn.
 
Upvote
2 (2 / 0)

RoninX

Ars Praefectus
3,239
Subscriptor
I first learned how to program in BASIC in junior high, on an HP-3000 timesharing system owned by the local university that our school had access to. At the time, none of the schools (not even the high school) had programming classes. Then I ended up getting a TRS-80 and spent many hours writing my own computer games.

I also remember the magazines like Creative Computing and Softside that had BASIC games that you could type in by hand. In some cases they left out all the spaces to save memory.

The great thing about BASIC is that you can sit down and type and immediately have something that works. I was always interested in science and technology, but that's what made me decide to pursue computer science as a career. Unlike C or C++ or Java or whatever, you don't need to worry about having the right libraries installed or dealing with compiler errors (or worse, segmentation faults). All of those things could get in the way of a young person learning to enjoy programming and convince them that it was a painful chore instead.

In general, whether you're talking about programming, math, science, reading, or writing, there seems to be a huge difference between the kids who explored it for fun and those who were forced to do it for classes and homework.

In my opinion, Python is now the best replacement for BASIC for learning how to program. As an interpreted language, it provides the same "just type a program and run it" sense of satisfaction without worrying about compilation and build systems. Of course, it's much more powerful than BASIC, and is actually a good first choice for many large commercial applications. The fact that it was designed from the start for ease of use is a big contrast from a language like C++.

Speaking of Creative Computing, anyone else remember the "Computer Myths Explained" comic?

Every month, Monte Wolverton would draw a full-page reductio ad absurdum vision of some simplistic utopian or dystopian misconception that people had about computers ("Computers will solve all the world's problems!" or "Computers will destroy the world!"). But, as they say, we live in a post-reductio world, and some of those comics have come close to becoming true. This one in particular stuck in my mind:

[Attached image: computer_myths_explained.png]
 
Upvote
1 (2 / -1)

h_vogt

Smack-Fu Master, in training
1
It has been more than home computing or learning: In the 70s we had an HP 9830 with HP BASIC, about 16K of main memory, a cassette deck, thermal printer, HP pen plotter, and an HP-IB interface (IEEE bus to control measurement systems like voltage sources or voltage and current meters).

This computer with its BASIC programs was used to control automatic measurements of transistors on silicon wafers (2-inch) for MOS process characterization, including stepping from chip to chip on a wafer prober, storing data and location on the cassette, and running statistical evaluations with wafer plots and histograms.
 
Upvote
1 (1 / 0)

acucons

Smack-Fu Master, in training
1
I learned BASIC in high school in the early 1970s. We had two ASR33 teletypes, one in the math lab and one in the principal's office. Programs were saved on paper punch tape, and it was great fun to log into OTIS (Oregon Total Information System) to write and run programs. Later one of our math teachers got us hooked up with the local community college so we could use the IBM mainframe with FORTRAN and punch cards. Lots of flashing lights, and a real thrill to be able to touch an actual computer.

I have to credit the dedicated teachers in the Bend high school math department who went out of their way to give us an opportunity to learn computing in those early days. Thank you, Mr. Hegg and Mr. Schonlau, for the basis of a gratifying career in engineering!
 
Last edited:
Upvote
4 (4 / 0)

abehrens

Seniorius Lurkius
20
It's hard to imagine writing disk controllers, etc., in BASIC, and I don't think that's actually what was done; the Dartmouth Time Sharing System was probably written in assembler, as would have been common in the early 60s.
You're correct. The DTSS operating system was written in assembler. Later on, in the '80s, it was rewritten in XPL, a subset of PL/I that was well suited for systems programming.
 
Last edited:
Upvote
1 (1 / 0)
Y'all are a bunch of Johnny-come-lately youngsters. The first computer language I learned was BASIC, just like you, but the first computer I worked on was a mainframe through a Teletype terminal in classes at the Lawrence Hall of Science in Berkeley. We didn't have any new-fangled display screens. Oh, and the first personal computer I ever saw booted up from a stack of punch cards. Well, it did in theory. The guy who was showing it off couldn't get it to boot. Something was wrong with his cards.

Do I win the old fart prize?
 
Upvote
1 (1 / 0)
It was 1971 and I was 12. The school had dial-up access to some time-shared system at some local company on Long Island (east of NYC). I was one of the 3 or 4 students in my junior high school that used one of these: https://en.wikipedia.org/wiki/Teletype_Model_33 - we wrote simple BASIC programs (games mostly, as I recall) and used paper tape for storage. I remember cannibalizing a desktop pencil sharpener as a way of making it easier to wind / unwind the rolls of programs.
Dammit. I thought I'd win the old fart prize for first using BASIC through a Teletype, but you did it earlier than I did. In 1971, I started kindergarten. I was still in the same school when I got my first after-school classes in computer programming, but I was probably in 3rd grade by then.
 
Upvote
0 (0 / 0)

AdrianS

Ars Tribunus Militum
3,741
Subscriptor
For those of you following along at home, that means that all variables have to be explicitly declared before use. Mostly, it helps with typos in variable names.

Without that option, if you mistyped a variable name, the interpreter would kindly create a new variable with that name, and initialise it to zero (or an empty string) for you when you ran the program, without telling you.
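Python makes the same trade-off that Option Explicit does, though via a runtime error rather than a declaration requirement. A minimal sketch of the contrast:

```python
total = 10
try:
    total = totl + 1  # typo: 'totl' was never assigned
except NameError as err:
    # Python refuses to invent the variable; a classic BASIC would
    # silently treat the typo as a fresh variable equal to 0.
    print(err)

print(total)  # still 10; the typo never corrupted the result
```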
 
Upvote
0 (0 / 0)

zogus

Ars Tribunus Angusticlavius
7,184
Subscriptor
I wish Microsoft still shipped something like BASIC with the OS. Things are easier now, but it would be better if kids could stumble onto it and play with it.
Well, it does ship with a Javascript interpreter, which is arguably a far nicer sandbox to play in than the old Microsoft BASIC....
 
Upvote
1 (1 / 0)

NetMage

Ars Tribunus Angusticlavius
9,745
Subscriptor
Well, it does ship with a Javascript interpreter, which is arguably a far nicer sandbox to play in than the old Microsoft BASIC....
Strictly speaking, both a JavaScript and a VBScript interpreter are included, but they only run in a batch mode (reading code from a file). See cscript and wscript.
 
Upvote
1 (1 / 0)

skeptical-technocrat

Smack-Fu Master, in training
53
Subscriptor++
When I was a kid, I used to go to the Lawrence Livermore Science Lab and their public computer room, where they had dozens of paper teletypes for the public to use to play with BASIC.

I believe you mean the Lawrence Hall of Science (LHS) in the Berkeley Hills above the University of California at Berkeley, not the Lawrence Livermore National Laboratory (LLNL) out in Livermore, CA. The former is a science education museum and did run a timeshared multi-user BASIC interpreter service in the 1970s & early 1980s on a Data General NOVA (and later Eclipse) minicomputer, with public terminals at LHS, and TTYs connected via leased line to quite a number of high schools in the San Francisco Bay Area - I know because I used that service myself at two different schools in the 1970s.

One of the popular computer games available to play on that system was Trek73. The exterior of LHS was used as a location shot for the dystopian 1970 movie, Colossus: The Forbin Project.
 
Last edited:
Upvote
1 (1 / 0)