
Wikipedia:Reference desk/Archives/Computing/2020 October 7

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


October 7


Software development before commonplace Internet


These days, by far most commercial software development is done with version control hosted either on-site or in some kind of cloud storage, with Internet access from the developers' workstations to the version control server.

However, commercial software development predates commonplace Internet access. How did companies do it in the 1980s, for example? Did they just pass floppy disks around?

I have heard, at least, that Psygnosis was forced to abort the development of their upcoming game Superhero because the master disk had been lost. They officially claimed burglars had stolen it, but there is a rumour that the real reason was that the chief developer had got into a fight with his wife and she had physically destroyed the disk.

So how was the development done back then? JIP | Talk 00:32, 7 October 2020 (UTC)[reply]

In the 1980s, bigger places would still have had hard drives to store software and source code on. There would be software libraries, and versions, but less complex version control. A lot of software development, though, might have been done by single developers or small teams, using a whiteboard for control. In earlier decades, magnetic tapes, paper tape or card decks would be used. Microcomputer software was distributed on floppy disk, or perhaps cassette tape, or perhaps a ROM. Mainframe and minicomputer software was distributed on magnetic tape. Graeme Bartlett (talk) 02:25, 7 October 2020 (UTC)[reply]
The Source Code Control System (SCCS) came out in 1973, and there were others from that era and the early 1980s. RudolfRed (talk) 03:07, 7 October 2020 (UTC)[reply]
These answers are correct, but I think the OP is asking about developers' "access to the version control server" not using the Internet. And the answer is simply that the whole development process would take place on one computer. If SCCS or another version control system was in use, then the developers would have accounts on the same computer where the SCCS archive was. (I'm old enough to have worked in environments like that.) --174.89.48.182 (talk) 06:19, 7 October 2020 (UTC)[reply]
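To make the shared-machine SCCS workflow described above a little more concrete, here is a minimal sketch of the classic check-out/check-in cycle. It is not anyone's actual setup: the file name payroll.c and the log comment are invented, it assumes the historical SCCS commands (admin, get, delta, prs) are installed on a shared UNIX system, and it is written as a small Node.js/TypeScript script purely so the command sequence can be shown runnable in one place.

```typescript
// Minimal sketch of the classic SCCS cycle, driven from Node.js for
// illustration only. Assumes the historical SCCS tools are on PATH and
// that payroll.c (an invented example file) already exists.
import { execSync } from "node:child_process";

const run = (cmd: string): void =>
  console.log(execSync(cmd, { encoding: "utf8" }));

// Put the source file under SCCS control; this creates the history
// file s.payroll.c from the current contents of payroll.c.
run("admin -ipayroll.c s.payroll.c");

// Check the file out for editing. SCCS writes a lock (p-file) naming
// the user who has it out, which is how edits were serialised when
// every developer had an account on the same machine.
run("get -e s.payroll.c");

// ... edit payroll.c with your editor of choice ...

// Check the change back in as a new delta, with a log comment.
run('delta -y"fix rounding in tax calculation" s.payroll.c');

// Print the accumulated revision history.
run("prs s.payroll.c");
```

The lock taken by get -e is the whole concurrency story: with everyone logged in to the one machine, a second developer simply could not check the same file out for editing until the first had run delta.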
[Images: a Tektronix 4014 terminal and a DEC VT100]
Perhaps the OP also needs to realise the difference between today's personal computers linked to a server and the older model of dumb terminals linked to a central computer. When I started in the industry our minicomputer was about 5'6" high and consisted of linked cabinets making up around 20'. There were two stand-alone tape drives (each in a cabinet 5'6" high x 3' wide) and an array of external disks, each disk in a cabinet around 3' high. Mainframes were much larger! This little beauty (a VAX-11/782) served up to 20 users with a mixture of dumb terminals and CAD workstations (also dumb). (The original workstations were Tektronix 4014 green-screen displays with a DEC VT100 for control and a large (up to 4'x6') graphics tablet, all on separate cables back to the computer.) Communication was serial over RS-232 lines at about 10 kb/s, with no networking. In such a scenario, as others have said above, whiteboards, tapes and software on the one single computer were sufficient. Martin of Sheffield (talk) 07:46, 7 October 2020 (UTC)[reply]
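As a rough sanity check on that "about 10 kb/s" figure, assuming (my assumption, not stated above) the common 9600-baud terminal setting with an 8N1 frame, i.e. one start bit, eight data bits and one stop bit per character:

```latex
9600~\text{baud} \approx 9.6~\text{kbit/s} \approx 10~\text{kb/s},
\qquad
\frac{9600~\text{bit/s}}{10~\text{line bits per character}} = 960~\text{characters/s}
```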
If you are interested in going back a little further, I was a computer and electronics developer in the 60s. General-purpose computers were not being used everywhere. So, a developer had to understand circuit design as well as programming. I designed a lot of logic boards to translate real-world signals into the binary sequences that the ICs required. Of course, ICs at the time were very simplistic compared to modern CPUs. But, they were programmable. They took instructions that would examine input signals and produce output signals. Now, how did we do source control? On paper. I had stacks of binders for everything I worked on. When I made a change to a program or a circuit board, I updated the manual for it. By the 70s, paper manuals were still common. I took a civilian contract that was based on the TRS-80. I wasn't given programs on disk. I was given a stack of manuals that were full of source code for the program and circuit designs for the peripheral equipment. By the 80s, everything was on disk. Nobody liked manuals. It was pointless to write a manual because nobody would read it. I still documented my programs with very detailed manuals. Then, when someone else would work with the code, they would ask me a hundred questions clearly detailed in the manual. By the 90s, I was told to stop writing manuals for my work. It was a waste of time and a security issue. Nobody was freaking out about MILNET, but the Internet got everyone upset. Around 1998, I began getting the response "because of security" when stupid decisions were made. That's when I retired. It wasn't just security issues. It was also code complexity. Even in the 80s, I could be a sole developer for a project. It didn't require a team. I still write programs for this and that, but sole-developer projects are nearly impossible now. Code complexity just drowns projects. Imagine that you want to write a little game where you make a little guy run around a maze. You'll spend months trying to get a 3-D engine to properly render the little guy, and then the engine you chose will be abandoned by its developers because some new engine is the cool thing. Then, you find out that the audio mixer you chose has a security flaw and you have to trash all of that and start over. Then, an entirely new version of Windows is out and your code won't even launch on the new system. You have to start over. It simply isn't worth trying to develop anything now. When I get the bug to write something, I tend to use SVG for the interface and JavaScript for the programming and make a simple web-based application. 97.82.165.112 (talk) 11:36, 7 October 2020 (UTC)[reply]
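As a small illustration of the "SVG for the interface, script for the logic" approach mentioned at the end of the post above (not the poster's actual code; the element id, sizes and key bindings are all invented), here is a sketch in TypeScript that moves a dot around an SVG box with the arrow keys. The compiled script expects a page containing <svg id="maze" width="400" height="400"></svg>.

```typescript
// Sketch only: a dot that runs around an SVG "maze" under arrow-key
// control, with no 3-D engine or external libraries involved.
const SVG_NS = "http://www.w3.org/2000/svg";
const svg = document.querySelector<SVGSVGElement>("#maze");
if (!svg) throw new Error('expected an <svg id="maze"> element on the page');

// The "little guy" is just an SVG circle created from script.
const guy = document.createElementNS(SVG_NS, "circle");
guy.setAttribute("r", "10");
guy.setAttribute("fill", "steelblue");
svg.appendChild(guy);

let x = 200;            // current position in SVG user units
let y = 200;
const step = 10;        // distance moved per key press

function draw(): void {
  guy.setAttribute("cx", String(x));
  guy.setAttribute("cy", String(y));
}

document.addEventListener("keydown", (e: KeyboardEvent) => {
  if (e.key === "ArrowUp") y = Math.max(10, y - step);
  if (e.key === "ArrowDown") y = Math.min(390, y + step);
  if (e.key === "ArrowLeft") x = Math.max(10, x - step);
  if (e.key === "ArrowRight") x = Math.min(390, x + step);
  draw();
});

draw();
```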
I've just done a scan of the cover of my copy of DEC's "The Digital Logic Handbook" - Flip Chip Modules for "66-67" (no Y2K issues then). I was going to post it here, but I think it would fall foul of WP:NFCCP. Shame! Martin of Sheffield (talk) 16:02, 7 October 2020 (UTC)[reply]
I just searched and found DEC manuals online. They would be good to look at to see what I mean by paper-based source control. DEC did a good job. They included descriptions, circuit breakouts, logic charts, and code examples. I tended to get stuck with terrible manuals, like Litton Industries'. They didn't appear to care. In a single manual, the circuit diagram wouldn't match the logic chart and the instruction sets would be misnumbered. I spent more time correcting their manuals than adding to them. When they partnered with Univac and used the CP-808 computer, the manuals were much worse. I believe it is because Univac placed all source control on disk, so they ignored the paper. 97.82.165.112 (talk) 20:18, 7 October 2020 (UTC)[reply]
DEC were always a model of documentation. The VMS manual set was a luxury I didn't appreciate until I moved into the realm of UNIX and found the man pages were only available online. When we had the 11/782 it even came with a complete print set - circuit board diagrams and component specifications! IIRC the set was about 2' high. Martin of Sheffield (talk) 21:29, 7 October 2020 (UTC)[reply]
Re: manuals, I was recently linked to this bit about the W3C by a Pale Moon browser developer. [1] It explains a lot about the state of the web today. 93.136.178.2 (talk) 22:38, 7 October 2020 (UTC)[reply]