• IBM System/370 on a… Raspberry Pi


    #2293964

    Astounding. @mainframed767 on Twitter just tweeted this: I have been running a full IBM System/370 Mainframe on a $20 Raspberry Pi Zero for ~5 months.
    [See the full post at: IBM System/370 on a… Raspberry Pi]

    7 users thanked author for this post.
    Viewing 25 reply threads
    • #2293979

In the days of mainframes with punch-card hoppers, tape drives with spools the diameter of large dinner plates, and hard drives the size of washing machines, as a graduate student at the University of New South Wales I did the computations for my Engineering PhD thesis on an IBM System/360 Model 50, predecessor of the 370 and already a rather old machine in those days. Eventually it was replaced with a new CDC mainframe. I have no doubt that, if someone could find that version of the IBM OS, it could also easily be run on a Raspberry Pi, or on a 1980s Commodore. (If memory serves, that System/360/50 had a few hundred kilobytes of core at most, a tiny fraction of the Zero’s 512 MB. And at 1 GHz, the Zero’s CPU is rather more than a tad faster.)

I have been told by someone I have no reason to doubt that one of the ferrite-core frames of “my” 360’s RAM is now part of an exhibit on the evolution of computing in the Sydney Technology Museum, in a converted old power house overlooking Sydney Harbour.

This article from the Australian Computer Society explains the historical significance of those 360 computers:

      https://ia.acs.org.au/article/2017/acs-heritage-project–chapter-14.html

      Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
      Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
      macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

      9 users thanked author for this post.
    • #2294031

My first CS course was Fortran programming on an IBM 360.  They only ran the student jobs at the top of every hour, and woe betide you if one of the punch cards in the deck was wrong!  When I did my post-doc at Cornell in the mid-1970s we had a DEC PDP-11 minicomputer.  Booting it in the morning was great fun with all the toggle switches and the paper tape reader.  Even with a decent BASIC compiler, it was not what one would term a fast computer.

      2 users thanked author for this post.
    • #2294036

Yep, that’s how I earned a living. Learned on a 7070 at U of R and then worked on 360s, 370s, etc. I remember writing a mod to their OS that would auto-restart the comms, which kept crashing and leaving the rest of the system running but with no way to communicate. Great fun, but yeah, immediately dwarfed by PCs.

    • #2294055

I studied coding on an IBM 1401 and started working at a bank on an NCR 315.
The maximum program size was 4K!! So we had to write big programs as 4K modules, which we loaded and unloaded…
That was in 1968.

    • #2294064

      We’re showing our age here.

      My first programming effort was on an NCR 500 with (wait for it) 1,200 bytes of core memory. That was in the Army, after which I studied computer science at Hopkins with most of that work done in high level languages on an IBM 7094 fronted by an IBM 1401 that handled all the input/output tasks (the 7094 just did computation). Our professor scored a tiny bit of time on a System 360/91 so we could study PL/1.

Woody, there is certainly a sense of astonishment when seeing something like this. I think it’s because we’ve lived through one of the fastest advancements in technology the world has ever seen. Generations before us lived their entire lives without such tectonic technological transformation. For example, someone born in 1850 might not have lived to see diesel or electric trains; their entire experience would have been with steam. By contrast, ENIAC was born just slightly before me and today is so antiquated as to be completely useless. Computing at the start of our lives and here, closer to the end of them, is incredibly different.

The Space Shuttle used five IBM System/4 Pi (there’s a coincidence) computers, which in essence used the System/360 architecture. A $150 smart phone with a c***** 8-core processor could run the whole thing.

      I’ve got three PCs on my desk, two of them about 8 years old. I’ve got more compute and storage capacity than the entire data processing department of the bank I worked for in the early ’70s and that occupied an entire floor of a building.

      I wish I could be around to see what the computing landscape looks like 50 years from now. I’m sure it will be something to behold. Will Windows still be around?

      7 users thanked author for this post.
    • #2294088

      Computing at the start of our lives and here closer to the end of them is incredibly different.

      Right on Will, in 50 years we would not recognize this world …

      Tech-Kids

      3 users thanked author for this post.
    • #2294091

      Used them all while growing up – an abacus, slide rule, and mainframes with punch card hoppers.

      But for me the revolution came with the HP-27 pocket calculator that I purchased in 1976. The HP-27 was expensive – $200.00. A fortune for a graduate student, but I was lucky to have a fellowship that covered books, etc. The HP-27 was part of the etc.

      The HP-27 was capable of doing mathematical, statistical and business operations. Its statistical and business capabilities got me through graduate school (including exams) and my early years on Wall Street working as a banker specializing in mine finance.

While I was a banker, we would call the bank’s stenography pool, dictate a memo or letter, receive a draft, make corrections, return the draft, and get a final copy.

Likewise for international communications, except in this case we would take the text of the message we wanted to send to a telex operator, who would convert it to code and send it along to the recipient. If all went well, the next day we would get a response from the recipient.

      Then came the PC revolution.

      4 users thanked author for this post.
    • #2294100

      Kathy Stevens wrote:

      Used them all while growing up – an abacus, slide rule, and mainframes with punch card hoppers.

      But for me the revolution came with the HP-27 pocket calculator that I purchased in 1976.

      In 1985 I bought an HP-15C calculator with many functions, programmable. I still have it and use it almost daily for things I can do without the need of a full-blown computer.

      But the big revolution, for me, was the advent of Windows PCs and Macs that could do enough number-crunching fast enough to satisfy my needs, mainly for analyzing satellite data and making simulations as part of my research. They liberated me from the need to use mainframes run by people I didn’t know, who didn’t know about me and that might or might not be able or willing to help. Or having to hang around computer rooms with decks of punched cards, waiting for my job to come back to me as a printout from a line printer. Debugging software this way was slow and painful, particularly in the middle of the night, or if it was nice and sunny outside.

      Until the late 1990’s I used those personal computers (mostly Macs) as “dumb” terminals to connect remotely from my office, or from home via telephone to different mainframes (and that was already a revolution: no more hanging around in computer rooms!) Although they were not completely dumb, as I could use them also to write and edit the code I then submitted to the remote mainframe.

By the late ’90s the personal computers were already fast enough, and had enough RAM and hard disk space (and, in the case of those running Windows, could address enough of that RAM), that I decided to buy what was to be my first PC, a Toshiba laptop running Windows 98. Imagine that: a full-blown computer I could use, sitting on a table in my living room! That I could take around with me! I used it for most of my software development and testing for the next six and a half years. Then I replaced it with an IBM “Think” PC running Windows XP.

      And to think that, back in the ’70s, while still using slide rules, trig and log tables, I learned to use computers by programming a PDP8 in assembler, with programs and data punched on paper tape…


      1 user thanked author for this post.
    • #2294111

      Kathy Stevens wrote:

      Used them all while growing up – an abacus, slide rule, and mainframes with punch card hoppers.

      But for me the revolution came with the HP-27 pocket calculator that I purchased in 1976.

I forgot to mention the wonders of using 5-place log tables when I took quantitative chemical analysis in the mid-1960s.  Slide rules were no good for this chore, and doing triplicate analyses entailed a boatload of additions and subtractions using the trusty CRC math handbook, which had the log tables as well as a lot of other useful stuff.  I think my copy is in an attic box.  To think how much easier life would have been with a PC back in the day.

       

      • #2294133

        @agoldhammer remembers using and trusting the CRC math handbook.

Still have my hard-covered CRC math handbook, as well as its companions, the Merriam-Webster’s Collegiate Dictionary and Roget’s Thesaurus, on the shelf behind my desk.  And a yellow Pickett slide rule in a blue case quietly sleeps in the drawer below.

        All made redundant by the computer revolution.

        1 user thanked author for this post.
    • #2294116

      Brings back memories of PL/1 and JCL (who remembers JCL?).

      Windows 10 Pro 64 bit 20H2

      3 users thanked author for this post.
      • #2294130

I remember PL/1 and JCL. Still have the text by Joan K. Hughes (John Wiley & Sons, 1973). For my Ph.D. thesis in linguistics, I wrote a parser of English in PL/1 for use in stylistic analysis. Any job over 128K had to be run during the night hours. Two or three boxes of punch cards went through the reader, the first few and the last few of which were JCL commands. I’d have to wait at least an hour for the turnaround. Results came out on huge-sized continuous paper feeding through a dot matrix printer.

        1 user thanked author for this post.
      • #2294144

        I remember JCL, PL/1, RPG, Fortran…

        2 users thanked author for this post.
      • #2294202

I was a JCL junkie! We had a 360/195 and a 370/158 (?), so we used TSO, then ISPF. I programmed in CLIST and started learning REXX. But then the branch I was in got a DEC MicroVAX, pretty nifty!

         

        Eliminate spare time: start programming PowerShell

    • #2294125

I did my first programming with Fortran IV on an IBM mainframe in 1967. Three things I remember: the computer cost $10,000 a month to rent; it took two IBM techs to keep it and its punch-card input running; and the printer (which printed an entire line at a time by positioning a set of long vertical bars) made more noise than you can imagine! Each of the three units (card reader, computer, and printer) was the size of a chest-style food freezer. And, yes, the computer did work… about half the time.

      1 user thanked author for this post.
    • #2294156

Where I worked, someone designed and wrote JOL (Job Organisation Language), a kind of high-level JCL.  You wrote your job commands in a much more English-like format, and from there it generated syntactically correct JCL statements.  It was much easier to use and understand, and was popular with the application programmers, also because it greatly reduced the chances of a job failing with a JCL error.  Just a memory now.

      Windows 10 Pro 64 bit 20H2

      • #2294314

        I never heard of JOL and I wish I had. For a long time I had a recurring dream with JCL in it. Not a good one.

        It was only when, later, I started working with CDC machines and their OS “Kronos”, later replaced with “NOS” with similar job commands, that I finally could read and write job commands that were enough like plain English to be understood right away. And everyone I knew who started to use the CDC systems thought the same thing.

Then UNIX came into my life and it was back to school. Linux and FreeBSD followed, but by then that was not a problem, having already had to learn UNIX. Later I had to learn some DOS, which was made easy because (as I saw it) its command-line language was mostly a ripoff of UNIX, with backslashes instead of forward slashes and “copy” instead of “cp”. I also went through versions 1 through 9 of Mac OS. And now I am dealing with those of the current one, OS X, renamed macOS, fortunately another UNIX-like OS.

        I wonder if that system 370, as installed in the Raspberry Pi, understands JOL.


    • #2294166

Enough already with the reminiscing, or I will add my own two-penn’orth, and yes, I do mean pennies.

      Try looking forward to tonight’s party:

Determine how many 8-bit bytes of disk storage you have tied up in MP3 music tracks, translate that to 80-column punched cards (at 12 × 80 bits per card), then calculate (on your 370) how many fifteen-foot U-Haul trucks you would need to ferry your music across town to tonight’s party. Assume 10,000 punched cards to a carton of 18″ × 15″ × 4″.
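The back-of-the-envelope above translates directly into a few lines of Python. The 50 GB library size and the roughly 760 cubic feet of cargo space in a 15-foot truck are illustrative assumptions, not figures from the post:

```python
# Punched-card haulage estimate. Truck cargo volume is an assumption
# (U-Haul quotes roughly 760 cubic feet for its 15-foot truck).

BITS_PER_CARD = 12 * 80                    # 12 rows x 80 columns, fully laced
CARDS_PER_CARTON = 10_000
CARTON_CUBIC_FT = (18 * 15 * 4) / 12**3    # 18" x 15" x 4" carton
TRUCK_CUBIC_FT = 760                       # assumed usable cargo volume

def trucks_needed(mp3_bytes: int) -> int:
    cards = -(-mp3_bytes * 8 // BITS_PER_CARD)      # ceiling division
    cartons = -(-cards // CARDS_PER_CARTON)
    cartons_per_truck = int(TRUCK_CUBIC_FT / CARTON_CUBIC_FT)
    return -(-cartons // cartons_per_truck)

# A modest 50 GB music library:
print(trucks_needed(50 * 10**9))
```

Under these assumptions a 50 GB collection comes to over 400 million cards, well into the tens of trucks.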

      Cheers

      Chris

      Unless you're in a hurry, just wait.

      2 users thanked author for this post.
    • #2294174

Chris: you are forgetting how the 029 card punch worked!  Each column of the punched card could have no more than three of the 12 available vertical positions punched out, and the total character set was 64 EBCDIC characters.  This is a much lower data density than you are presuming for your MP3 music calculation.  Better hire more trucks!
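Quantifying this correction: a 64-character set carries at most 6 bits of information per column, so a keypunched card holds half the bits of a fully laced binary card. A quick sketch:

```python
# Data density per 80-column card: keypunched text vs machine-punched binary.
COLUMNS = 80

binary_bits = COLUMNS * 12    # fully laced binary card: all 12 rows usable
text_bits = COLUMNS * 6       # 64-character set = 2**6, i.e. 6 bits per column

print(binary_bits, text_bits)
```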

      And there’s another problem (or two) – you couldn’t read the cards fast enough to produce a data stream to play the “MP3” file from punched cards, and the amount of noise the card reader would make could drown it out anyway.

      I think your thinking is flaude…

      BATcher

      Plethora means a lot to me.

      • #2294181

        And there’s another problem (or two) – you couldn’t read the cards fast enough to produce a data stream to play the “MP3” file from punched cards, and the amount of noise the card reader would make could drown it out anyway.

        True, but with the right music selection and enough speakers you might still be able to make it work…

        anyone up for some “Spastic Industrial Metal”? 😀

      • #2294273

        Ah, the 029, successor of the 026 <— 024. Printing what one was pecking at was such an improvement. Anyone remember the 129, MTST? Or the MCST?; somewhere I have a couple of mag cards. (But these are off topic. Sort-of.)

But I partially disagree on the amount of data on a card, on two counts:
1 – Yes, data could use all 80 columns, but source and object decks reserved the last 8 columns for a card sequence number; and in Assembler, column 72 held the continuation punch.
2 – Data could also be punched in binary format (by program; not practical on a keypunch, except for test cases). So it was possible to lace a card with 960 holes. (Don’t know why one would want 120 bytes of X’FF’, though.) No hanging chads when machine-punched.

Haven’t seen it mentioned yet, but it was possible to play music on the printer. (On the 1403, not the Model N1; that one was too fast. And the print train didn’t work quite the same as the print chain, as I recall.) Since printing was performed by little hammers that hit the chain characters as they passed, data could be sent to the printer that would whap the hammers at an audio rate. Lots of harmonics (true digital music!). The printer had to be set for the thickest paper (hammers farthest from the backing bar). And you removed the ribbon: to keep the timing there was no line feed, and any ribbon would be shredded in very short order.

        1 user thanked author for this post.
    • #2294175

      Computing at the start of our lives and here closer to the end of them is incredibly different.

      Right on Will, in 50 years we would not recognize this world …

      Tech-Kids

Perhaps in 50 years’ time man will handle digitalisation more wisely, and hopefully will have outgrown the problems that I shall no longer be around to see

      * _ ... _ *
    • #2294180

      Aaaaacccckkk!!!

      Scary memories of college!  Computer Engineering students racing through campus with shoe boxes full of punch cards, “syntax error in line 472,” WATFOR, WATFIV, WHYFORARTTHOU??  As a more mundane Mechanical Engineer, my smaller card sets were wrapped with printout sheets cinched by rubber bands.

      Mid nineties I bought my first desktop ‘cuz that’s about when PC’s became affordable and useful for a wide range of activities.  I’m not counting the Brother Word Processor, Sinclair ZX or TI 99 even though they were evolution steps.  Or the various programmable calculators which really haven’t evolved much. Or the weird HP Gizmo the size of a large suitcase our high school had that answered 2×3 with 5.999999999999999.  Strange beast there.

      Occasionally, when the antics of some Tech company infuriate me, I step back and ponder the amazing (used literally, not tech nitwit style) capabilities of today’s devices.  Communicating with OS’s, particularly Linux, even with GUI’s is still a mix of Fred Flintstone and Jimmy Neutron, maybe those weird languages I learned really were useful!

    • #2294201

      Right on Will, in 50 years we would not recognize this world …

      No, but I can tell you one thing. Those kids won’t be handling anything as bulky as those hand computers. They’ll get an implant at birth.

      In a related vein, Arthur C. Clarke’s “3001,” the fourth and final book in the series (and I think the best), is an interesting view of the future. Spoiler alert: everyone is bald. Read the book to find out why. (The book was written in 1997.)

      2 users thanked author for this post.
    • #2294205

      But for me the revolution came with the HP-27 pocket calculator that I purchased in 1976.

      My father was a professor at Johns Hopkins in physics and astronomy. In the late ’60s he bought a Friden calculator, I think a model 132. This was a huge thing, almost the size of a PC, that used a delay line for memory storage. It was expensive, thousands of bucks at the time. He bought it for one reason only – it had a square root key! That was such a benefit and time saver that it was worth the money. It was also RPN…

      In 1972 he bought several HP-35s, $495 each. Of course, nobody touched the Friden after that and in short order HP calculators were all over the campus.

Just a few years later, I bought the much less expensive and much more powerful HP-25. Used it for years.

    • #2294257

      I made a career on these big iron mainframes.  I LOVED programming in Assembler!

For years, I spent countless nights hacking through the internals of MVS (the OS/360 successor) after my day’s work.  That went on for 20+ years, up until OS/390 was the thing.  Personal research, I would tell my bosses.  I still have all my code from back then and sometimes go through it just for the fun of remembering those times.  I managed to pull some pretty nice tricks over the years.  Also ran those programs on a PC, under MVS/SP on the Hercules emulator.  Loads and loads of bare-bones FUN.

Call me crazy, but I still believe that knowing Assembler is the best way to really know what you’re doing on a computer.  Nothing like it, even though today’s languages like C and C++ let you do pretty much anything with the internal components of a computer.  Those were good days for me.

    • #2294312

      This thread really takes me back down memory lane… no pun intended… ha! 🙂

The 370 series was mainstream way back early in my information technology career, when I was a computer operator. Loved those punch cards and JCL! Not to mention those huge mag tape reels and drives, and washing-machine-sized disk drives! But I was conveniently able to compile and debug my own Cobol for my college class assignments while at work.

       

      Windows 10 Pro 22H2

    • #2294313

      memory lane

      Good one.

    • #2294327

      slide rule

      Yup. In the early 70’s in our high school, we were still required to use the slide rule on tests, even though the first 4-function calculators were becoming available. They were still pretty expensive, but I got one as a major Christmas present one year. Allowed in class and labs, but not on tests, where we still had to show our work. Long division. Ugh!

      I started out with an old straight wooden slide rule that my mother had used in her school days, then switched to one of the fancy new circular plastic ones. More portable.

      Windows 10 Pro 22H2

    • #2294336

Set against the more mundane business applications of the IBM System/370, with its roots in this same era, I am still amazed by this thing, especially in comparison with modern computing devices…

      Your Mobile Phone vs. Apollo 11’s Guidance Computer

      https://www.realclearscience.com/articles/2019/07/02/your_mobile_phone_vs_apollo_11s_guidance_computer_111026.html

      On board Apollo 11 was a computer called the Apollo Guidance Computer (AGC). It had 2048 words of memory which could be used to store “temporary results” – data that is lost when there is no power. This type of memory is referred to as RAM (Random Access Memory). Each word comprised 16 binary digits (bits), with a bit being a zero or a one. This means that the Apollo computer had 32,768 bits of RAM memory.

      In addition, it had 72KB of Read Only Memory (ROM), which is equivalent to 589,824 bits. This memory is programmed and cannot be changed once it is finalised.
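The article’s figures check out with a little arithmetic; the word count for the ROM is derived here from the quoted 589,824 bits, on the assumption that ROM words were also 16 bits:

```python
# Sanity-checking the AGC memory figures quoted above.
bits_per_word = 16

ram_words = 2048
ram_bits = ram_words * bits_per_word   # 2048 words x 16 bits

rom_bits = 589_824                     # the article's ROM figure
rom_bytes = rom_bits // 8
rom_kb = rom_bytes // 1024             # taking 1 KB = 1024 bytes
rom_words = rom_bits // bits_per_word  # assuming 16-bit ROM words

print(ram_bits, rom_kb, rom_words)
```

Both quoted numbers are internally consistent: 2,048 × 16 = 32,768 bits of RAM, and 589,824 bits is exactly 72 KB.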

      The Apollo Guidance Computer (AGC)

      https://en.wikipedia.org/wiki/Apollo_Guidance_Computer

      Windows 10 Pro 22H2

      1 user thanked author for this post.
      • #2294347

The early limitations of computer hardware, as in the case of the Apollo spacecraft, led to the rapid development of ingenious and powerful coding techniques, and of the new mathematics on which they were based, much of it still in wide use. To give one example: the small data and code storage of the computers then available for military applications, such as missile guidance, drove the quick adoption and further development of the recursive Kalman filter algorithm and the many others later derived from it. These are now ubiquitous in the software used to optimize estimated trajectories from onboard sensors and/or ground tracking with radar or (laser) lidar, including the orbits of satellites, particularly in real time, as well as in many other applications, such as engineering and scientific data analysis, machine control, etc. The memory and other hardware limitations of those very early digital computers are no longer with us, but the practical advantages of the ways they were overcome remain, which is why such algorithms are often preferred to older, still valid ones that require considerably more computer resources.
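A toy illustration of why the recursive form suited memory-starved hardware: a one-dimensional Kalman filter needs only the previous estimate and its variance at each step, never the full measurement history. The noise values below are made-up illustration numbers, not from any real guidance system:

```python
# Minimal 1-D Kalman filter: constant state, noisy scalar measurements.
# Fixed storage per step: the current estimate x and its variance p.

def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Estimate a slowly varying scalar from noisy measurements.
    q: process noise variance, r: measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # correct with the new measurement
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of a true value near 1.0:
est = kalman_1d([1.2, 0.8, 1.1, 0.9, 1.05])
print(est[-1])
```

After only five measurements the estimate settles close to the true value, and nothing but two scalars had to be kept in memory between steps.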


    • #2294353

I remember taking a FORTRAN class using punch cards in 1980. Then in the late 1980s I took additional programming classes (in Pascal, I think) on a Data General mainframe. The thing I remember most about the Data General was that its text editor worked just like EDLIN: same commands, same format, everything; only the command names were different. I liked that, because when I later decided to try out EDLIN, I picked it up immediately.

      Why would I want to learn EDLIN, you might ask? Simple – we were still in the DOS era, and EDLIN was on every DOS computer. Whenever I did remote PC support, I needed to have a text editor on the customer’s computer, and when all else failed, there was always EDLIN. I had a strange love for EDLIN, because I was about the only person who was able to figure it out! It was a thing of pride for me!

I preferred using Edit over EDLIN, except for one thing: when I needed to save the text file, I would have to press ALT and then the other keystrokes. But I couldn’t press ALT remotely; I always had to ask the customer to press ALT on their keyboard! Consequently, I would avoid Edit and instead use EDLIN.

      At some point I found a very small and quick text editor called Q Edit, and so I started copying it to my customers’ computers, because it always worked when I was remoted into a customer’s computer.

      Group "L" (Linux Mint)
      with Windows 10 running in a remote session on my file server
    • #2294484

      One more thought on that quaint picture of the System/370.

I don’t recall many people ever sitting in that chair at the master console, except when performing an IPL (reboot) or when the IBM field engineer visited for preventive maintenance.

      We usually had terminals at the tape operator’s and printer operator’s positions, where we spent 8 or 12 hours per shift slinging mag tapes or loading the printers with forms. Batch job flow was typically run from the tape operator position.

      But that bright chair and the fern (or whatever it is) in the corner make a nice sales shot! 🙂

      Windows 10 Pro 22H2

    • #2294572

      Wow! I’d like to know how he did that, since I also happen to own a Raspberry Pi Zero W! <g> I first started my computing journey with an IBM 360 at a military base in Germany through courses with the University of Maryland. In 1971 when my first daughter was born, I thought computers might go somewhere, so I took courses in PL/1, Fortran IV, and COBOL. In Y2K I had head-hunters calling me because of that COBOL course! Lol!

      The first time I ever saw a grown man cry was when one of the troops was taking a tray of punch cards to the reader for payroll and tripped over a cord, dropping the tray. Cards flew everywhere! He had my empathy hand-reading all those cards to get them back in order! <GRIN> Probably took all night or longer!
