Robert W. “Bob” Bemer - who worked at Lockheed's Missile Systems Division in Van Nuys and who would become the power user of its IBM 650 - carefully cut out the article and placed it in a scrapbook. In 2018, through an Access to Historical Records grant from the National Archives' National Historical Publications and Records Commission, CHM digitized and made freely available online roughly 10 percent of Bemer's historical collection, more than 3,000 pages.
In the realm of software, a “branch” is a computer instruction that causes a shift from the default pattern of activity to a different sequence of actions, a different way of moving ahead, if you will. For Ann Hardy, a pioneer in timesharing software and business, contributing to computing meant repeated, creative branching in the face of sexist discrimination.
The photograph was dated 1950, a time when a now unimaginably small number of humans had ever beheld a computer, let alone touched one, and when unabashed racism and discrimination were endemic on the American scene. Who was the young African-American man who nevertheless sat at the controls of this storied machine? What was his name? What was his story?
Perhaps you are like me: You’re aware that quantum computing is a hot topic today but have a nagging feeling that you don’t really have a good picture of what it’s all about. Sure, you know it has something to do with the unintuitive behavior of the world described by quantum mechanics—cats in boxes that are blends of alive and dead until you look inside, and photons that coordinate their properties instantaneously over great distances and that are sometimes particles and sometimes waves. And you also know that somehow, in this weird behavior, researchers see the possibility of a new kind of computer that could accomplish feats that computers like the ones you own could never dream of doing. Oh, and you know there is something about these quantum computers being able to break all the codes.
In 1950, the physicist Arnold Nordsieck built himself this analog computer. Nordsieck, then at the University of Illinois, had earned his PhD at the University of California, Berkeley, under Robert Oppenheimer. To make his analog computer for calculating differential equations, the inventive and budget-conscious Nordsieck relied on US $700 worth of military surplus parts, particularly synchros — specialized motors that translate the position of a shaft into an electrical signal, and vice versa.
The experience of women, and the issues of gender and sexuality, are vitally important to our understanding of the story of computing, and hence our contemporary world, for many reasons. Perhaps most straightforwardly, women have been ubiquitous throughout the history of computing as makers and users of it. As Eileen Clancy, the archivist and City University of New York graduate student, so aptly put it in her recent talk “Sekiko Yoshida: Abacus ‘Software’ in the Early US Space Program” at the Society for the History of Technology’s 2017 meeting: “The women are always there, if you look for them.”
“There is no cloud,” goes the quip. “It’s just someone else’s computer.” The joke gets at a key feature of cloud computing: Your data and the software to process it reside in a remote data center — perhaps owned by Amazon, Google, or Microsoft — which you share with many users even if it feels like it’s yours alone.