55

A human-vampire hybrid wakes up from a hibernation that started in the late 1930s and finds that the world has changed.

What kind of knowledge from the 1930s would be the best starting point for learning computers & networking?

Traveller
  • 556
  • 1
  • 4
  • 8
  • 11
    What do they intend to do? Are they learning computers and networking so that they can type out an email to their progeny? Or are they trying to "moonlight" as a night tech at a Google server farm? In the case of the latter, it may be more effective to start from scratch. – Cort Ammon Dec 06 '16 at 16:32
  • 1
    1) Can that vampire change his identity (the way he looks)? 2) Can he hypnotize someone? – Mukul Kumar Dec 06 '16 at 16:34
  • 85
    Does your vampire want to learn how to be a programmer and network engineer or do they just want to learn how to order food on dating websites? – Philipp Dec 06 '16 at 16:34
  • 3
    @CortAmmon Enroll in the university. I want to have a starting point or at least some idea what the classes are about. – Traveller Dec 06 '16 at 16:37
  • 1
    @MukulKumar 1 Acquiring forged papers yes, shape shifting no. 2 Hybrid could play mind games to a certain degree. – Traveller Dec 06 '16 at 16:42
  • 9
    When the first digital vacuum tube computers became publicly known, people (scientists) slapped their foreheads and said "Damn, why didn't we do this twenty years ago?" After you understand the transistor, there's not much of a surprise. Put your vampire in a museum that has a selection of working computers from the last 50 years and let him work his way to the present over a week or two. Shouldn't be hard. – Karl Dec 06 '16 at 16:49
  • 2
    Knowledge of telephone exchanges. Computers from the 40s-50s (e.g. Colossus, EDSAC, WITCH) used old telephony equipment - binary and decimal vacuum tubes, electromechanical relays, routing and signalling, etc. – OrangeDog Dec 07 '16 at 15:10
  • 1
    Just in case, not exactly related, but this somehow reminds me of a comic in which a vampire who slept for 820 years wakes into current society:

    http://www.webtoons.com/en/fantasy/noblesse/ep-1/viewer?title_no=87&episode_no=1

    – Alfred Espinosa Dec 07 '16 at 18:51
  • 1
    Another side-note with the other direction: In Charles Stross' The Rhesus Chart a bunch of network/computer guys become vampires. – yatima2975 Dec 08 '16 at 11:21
  • 1
    Telephone switchboard theory/relay theory. Some electrical/electronics engineering. Maybe experience with punchcards (banks, Wall Street, Census Bureau, etc). These were the foundations of modern ICT and they were quite solid in the 1930s. – HingeSight Dec 08 '16 at 12:59
  • @Phillipp - I'd think that something like "Nightowl, fascinated with 'Twilight' series, seeking like-minded SF for dinner and ???" would work wonders. – Bob Jarvis - Слава Україні Dec 08 '16 at 19:21
  • 1
    @Traveller He just wants to enroll in University? He probably wouldn't have to know much about math, engineering, or logic to be an end-user. He knew about telephones, it's not much of a stretch to say "Oh it's like a telephone but it sends text and pictures too, neat." – Michael Dec 08 '16 at 20:22
  • 1
    From what I've read so far in "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold, your vampire could do worse than just reading that book in full. It almost seems custom-written for the purpose. – Wildcard Dec 09 '16 at 08:01
  • 1
    Some protocols used on the internet rely only on technology from before the 1930s. They would be easy for your human-vampire hybrid to understand. Such as RFC 1149. – v7d8dpo4 Dec 09 '16 at 15:21

15 Answers

66

The best starting point would be knowing about Turing Machines. Alan Turing thought these up in 1936, so your human-vampire hybrid could have learned about these shortly before entering hibernation.

In case you are not aware, a Turing machine is basically an abstract version of a computer. It's a theoretical device that can compute anything that can be computed. Computers are really just fast Turing machines with fancy outputs.
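To make that concrete, here is a minimal sketch of a Turing machine simulator in Python (my own illustration, not anything your hybrid would have seen in 1936): a tape, a read/write head, a current state, and a transition table are all there is to it.

    # Minimal Turing machine sketch: a tape, a head, a state, and a transition table.
    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
        """transitions maps (state, symbol) -> (new_state, new_symbol, move)."""
        tape = dict(enumerate(tape))   # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = tape.get(head, blank)
            state, tape[head], move = transitions[(state, symbol)]
            head += 1 if move == "R" else -1
        # read back the tape contents in order
        return "".join(tape[i] for i in sorted(tape)).strip(blank)

    # Example machine: invert every bit, then halt at the end of the input.
    invert = {
        ("start", "0"): ("start", "1", "R"),
        ("start", "1"): ("start", "0", "R"),
        ("start", "_"): ("halt",  "_", "R"),
    }
    print(run_turing_machine(invert, "10110"))  # -> 01001

The point is that this handful of rules already is "a computer"; everything since is speed and convenience.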

If your hybrid had heard about Turing machines, waking up from hibernation and seeing computers would, instead of being a "what in the world is that" experience, be a "wow, they actually made them" experience.

Rob Watts
  • 19,943
  • 6
  • 47
  • 84
  • 42
    And lambda calculus https://en.wikipedia.org/wiki/Lambda_calculus grants the ability to write Lisp code immediately https://en.wikipedia.org/wiki/Lisp_(programming_language) – slobodan.blazeski Dec 06 '16 at 16:52
  • 106
    If your vampire was proficient enough in LISP, nobody would ever think to question the strange hours he keeps or his strange choice of words. LISP programmers think differently enough from everyone else in the world that nobody would question his eccentricities =) – Cort Ammon Dec 06 '16 at 17:22
  • 1
    Yeah, lambda calculus would be good, but I don't think it would be the best starting point. You'd have to get pretty far into programming before someone would talk about how it relates to lambda calculus. – Rob Watts Dec 06 '16 at 17:23
  • 4
    @RobWatts Wrong, many universities start with Scheme as an introductory language, such as the famous MIT 6.001 http://bit.ly/2heiJck . Though I must say that lately MIT has switched to Python. – slobodan.blazeski Dec 06 '16 at 17:29
  • 4
    @CortAmmon https://xkcd.com/297/ – slobodan.blazeski Dec 06 '16 at 17:31
  • 9
    @slobodan.blazeski I was more thinking https://xkcd.com/224/ =) – Cort Ammon Dec 06 '16 at 17:36
  • @slobodan.blazeski I haven't heard of scheme being common as an introductory language. Is that something that is no longer the case (like with the MIT class you mentioned)? I suppose it really depends on when the human-vampire wakes up, as the field has changed quite a bit over the years. – Rob Watts Dec 06 '16 at 17:38
  • 2
    @RobWatts It's not as popular as it used to be until a few years ago, but it's still present: Berkeley http://cs61a.org/ Yale http://www.cs.yale.edu/homes/hudak/CS201S08/ Haskell or Scheme – slobodan.blazeski Dec 06 '16 at 17:50
  • 1
    In Toulouse they use Caml as the introductory language, and lambda calculus is taught in the second year, I think. – Shautieh Dec 07 '16 at 02:26
  • 41
    Computers are absolutely not based on Turing machines... Turing machines are mainly useful as a mathematical model of what "computability" means, which is not something that's particularly relevant to real computers. (Even if you want to know if something is computable, in practice it's computable if you can write a program to do it on a real computer, not on a Turing machine) – user253751 Dec 07 '16 at 04:29
  • 4
    @immibis It's much more the idea of being able to program something and have it compute answers. Even though there are lots of specifics about how computers work that are very different from Turing machines, the basic idea is the same. – Rob Watts Dec 07 '16 at 04:31
  • 13
    @immibis +1 on that. This is why I down-voted the answer: Turing machines are a mathematical abstraction and a useful tool for defining what algorithms are. They are next to useless when it comes to understanding the concept of a modern computer and how to use it, and equally you can learn to use a computer excellently without ever having any clue what a Turing machine is. And conversely: knowing about a computer and how to program does not mean you automatically know what a Turing machine is or how it works. – MichaelK Dec 07 '16 at 11:44
  • 7
    Ummm... https://en.wikipedia.org/wiki/Charles_Babbage That and the telephone system would probably be a better explanation for your average vampire unless s/he is also a graduate math student, lol. Sit down and try to get a few normative contemporary literate computer users to understand the significance of Turing machines. They are going to wander off within two minutes. And have forgotten the whole event in as many hours. – goldilocks Dec 07 '16 at 14:05
  • 1
    @delicateLatticeworkFever The telephone system is a good idea, the original automated telephone systems were somewhat similar to computers in principle, but specialized for telephone switching only. – user253751 Dec 07 '16 at 22:21
  • 1
    I meant more WRT "computers & networking". A major difference would be that telephone networks are (traditionally) circuit switched whereas data networks are packet switched, but this is another distinction that I think will generally get you glazed, blank stares. I guess telegraph or radio would do there too, all of which would have been common knowledge in the 30s. Whereas Alan Turing I can't imagine was even known to most mathematicians or philosophers! – goldilocks Dec 07 '16 at 22:29
  • 4
    @RobWatts As someone who actually had lambda calculus in university and worked on VLSI design, I will outright say that neither turing machines nor lambda calculus were useful even once in hardware design. Same goes for programming. The only situation where you'll ever profit from having this knowledge is when analysing algorithms in theoretical CS and even there there's much, much more useful math around. Electrical engineering would be quite useful if we're going more in the hardware direction. – Voo Dec 07 '16 at 22:34
  • 1
    @immibis Tape, external memory storage? Same difference. Both TMs and modern computer designs are FAs with an augmented memory system which stores both data and code. – Yakk Dec 08 '16 at 18:37
  • Knowing what a theoretical machine could be doesn't do you any good if you don't know how it would work. It was Peirce who first described a logic gate, in 1886. But it was Shannon's work that "introduced the use of Boolean algebra in the analysis and design of switching circuits in 1937" which "became the foundation of digital circuit design" and contains "the fundamental concept that underlies all electronic digital computers." See here. – Mazura Dec 10 '16 at 06:46
  • All known LISP programmers are actual vampires... – Jorge Aldo Dec 11 '16 at 00:20
51

Math. It's the basis for computer languages and would give a good starting place for learning modern languages.

The first low-level programming languages were created in the 1940s, and high-level languages in the 1950s.

So the vampire would not have had any experience with or knowledge of computers in the 1930s.
Knowledge of math would at least give a background in abstract logical thinking and make things easier.

AndyD273
  • 34,658
  • 2
  • 72
  • 150
  • 16
    Pretty solid answer, but I'd say that you need a touch more than just general mathematics (even if we're talking PhD level). You also need formal logic. Computer science is essentially the meeting of the two. – Draco18s no longer trusts SE Dec 06 '16 at 16:24
  • 14
    Honestly I would say logic more than math. I rarely need my geometry or calculus or trig or even Linear Algebra (which I took just to prepare for computers) for my everyday programming. I think logic, discrete structures, set theory etc. is a very specific subset of math he needs, and the rest is far less important, if good to have. – dsollen Dec 06 '16 at 18:58
  • 6
    I'm giving +1 to this answer due to math, although I disagree with the rest of it. We had binary computers using floating-point numbers in the 1930s. Also, analog computers and electro-mechanical calculating machines were common back then. Someone who had experience with how such devices work (or maybe even worked on the design of some of them) would have very good foundations for transferring over to modern digital design. – AndrejaKo Dec 06 '16 at 19:30
  • 8
    @dsollen Honestly, a solid foundation in bog standard algebra and geometric proofs are all you need to easily learn computer programming. (Not because you need the algebra and geometry, but because of how they teach you to think.) – RonJohn Dec 06 '16 at 20:09
41

My great-grandfather headed an Electrical Engineering department from 1946-1956, just as the important transitions into electronic processing were being made and WWII computing technology became unclassified. During his tenure as chair, he purchased the university's first computer, REAC 100, and hired some professors to develop on it.

His background was in electrical engineering and acoustics. He wrote papers on such things as electronic hearing aids, communication theory, and bridge circuits during the 1930s. This would probably be the best applied background to make the transition into computational sciences.

kingledion
  • 85,387
  • 29
  • 283
  • 483
  • Great answer! Could you provide a few citations from your great-grandfather? – Robert Columbia Dec 06 '16 at 22:10
  • 3
    @RobertColumbia Links added. My father has a stack of typewriter drafts of his publications. It's fascinating just how hard they are to understand even though I work in the field today and ought to have 80 years of accumulated knowledge on great-grandpa. – kingledion Dec 07 '16 at 01:11
31

Anything that involves processing data

Computers are all about... well... computing. So the first option would be some kind of profession that deals with collecting, collating and computing data, preferably in a very strict and organized manner according to certain procedures / algorithms.

MichaelK
  • 43,723
  • 6
  • 106
  • 189
  • 15
    Remember that "computer" was an actual human position at many firms. They had many systems that resemble modern networks -- distributing a complex task out in parts to many computers, searching for results previously computed, etc. – SRM Dec 06 '16 at 19:01
  • @SRM Good point – MichaelK Dec 06 '16 at 19:07
  • 2
    @SRM You mean like this http://dilbert.com/strip/1996-02-20 – Soba Dec 06 '16 at 19:23
  • 1
    I came here to make this exact answer. Except for the hard sciences, most computer work is actually just really, really fast office processing from the late 19th and early 20th centuries. Hard drives instead of filing cabinets, Google instead of librarians, databases instead of card catalogs.

    Don't forget: there were two major high level languages for computers at the beginning: FORTRAN and COBOL. FORTRAN was all about heavy duty math, and was used on the big iron for science. But the vast majority of computers from International Business Machines did extremely fast bureaucracy in COBOL.

    – Zoey Green Dec 10 '16 at 09:03
25

The 1930s were the battleground where the formal mathematical definitions of computation were hashed out; there's a random internet-find timeline here with all the popular names, but I think that's pretty irrelevant. Most computer users today don't go for a mathematical understanding, even advanced users.

But the hundred years or so leading up to that had enormously relevant inventions. Your wealthy and curious Vampire could have known about the existence, and some level of the workings, of these computer precursors:

  • Prisms, lenses (historical), Fourier transform (1822)
    • Sound and light are things you can measure and calculate on
  • Charles Babbage's Difference Engine (1823+)
    • calculating by machine instead of by human; a serious design and attempt at cog-and-gears calculation, funded and supported by a national government
  • Electric telegraphs and Morse Code (1836)
    • long distance communication
    • encoding letters as signals
    • electrifying a message and delectrifying it somewhere else
    • electric machines with output devices - buzzer, beeper, direction indicator dial
  • Punched-card telegraphs (1844)
    • information storage on machine-use paper, not human-readable
  • Teleprinters (1846 ish)
    • keyboard on one end of a wire, electric link, printer on the other end
    • all-in-one electrify/encode/decode/delectrify
  • Boolean algebra (1854) (AND, OR, logic)
    • foundations for chaining electric switches together (see the switch-chaining sketch just after this list)
  • Mechanical typewriters (1868)
    • print, on a small scale, without a printing press or any typesetters!
    • a way you could make a dot matrix printer
  • Baudot code (1874)
    • Morse code was meant for human use, Baudot code was not meant to be human readable, but machine to machine.
  • Wax cylinders and phonographs, photographs (late 1800s)
    • Machine storage of audio and visual information. Only text or drawings until now.
    • mass produced cylinders and prints. Mass storage.
  • Electric radio with codes (1896), and vacuum tubes soon after
    • signalling without wires. Also: lighthouses, signalling with light.
  • Electric light (late 1800s to early 1900s) (blinkenlights)
  • Sinking of the Titanic, famously with radio call for help (1912)
  • World War 1 (19whenever to yeah yeah everyone's forgotten this by now)
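On the Boolean algebra entry above, here is a tiny sketch (my own illustration in Python, nothing period-accurate): switches wired in series behave like AND, switches wired in parallel behave like OR, which is exactly the bridge from electrical wiring to logic.

    # Series wiring = AND (every switch must be closed for current to flow).
    def series(*switches):
        return all(switches)

    # Parallel wiring = OR (any closed switch lets current through).
    def parallel(*switches):
        return any(switches)

    # A lamp that lights when the master switch is closed AND
    # either of two room switches is closed.
    for master in (False, True):
        for room_a in (False, True):
            for room_b in (False, True):
                lamp = series(master, parallel(room_a, room_b))
                print(master, room_a, room_b, "->", lamp)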

From there, they are close to knowing about encoding letters as code symbols, representing symbols as electric signals, reconstructing electric patterns into useful output, about recorded and stored information (paper, punched cards, pressed audio cylinders), about decomposing sound into frequency blocks. The jump to encoding sound measurements as code symbols, storing words on pressed cylinders, representing text characters with lots of very small lightbulbs (pixels), using those lightbulbs as dots to represent pictures like in a newspaper print, keyboards as electric typewriters, and tiny control circuitry is a matter of disbelief of scale or practicality rather than disbelief full stop.

Cherry-picking these examples really makes digital computers seem inevitable, doesn't it? I'd say the best thing your Vampire could know from the 1930s era is enough of these things (telegraphs, the keyboard-style interface for writing on teletypewriters, the existence of signalling codes, and the use of electricity to power and control things: switches, lights, machine tools, sewing machines, beepers, indicators, dials, readouts) that it seems reasonable the combined effects could give rise to modern computers, and that computers are surprising only in scale, not in principle.

The real mental jump problem is 'stored program computers can do things', and I can't think of any reasonable way a person from the pre-computer era would be able to grasp that intuitively without closely following up-to-the-minute math developments. But signalling codes, electricity as a controller, and feedback systems in general give as good a background as they're likely to get.

And the mention of WWI is there because the Vampire would have lived through it, would have seen both the desperation of the participants and the burst of technological development (foot warfare to tank warfare, no planes to aircraft, more and more submarines, etc.), and could maybe have seen the economic problems of 1930s Germany and the stresses in Europe. Learning about WWII would therefore not be much of a surprise: the scale was bigger, the weapons were more deadly, the war was longer, and the technology boost started from a higher place and was pushed harder and faster.

TessellatingHeckler
  • 1,918
  • 1
  • 12
  • 12
  • 2
    If the vampire's study of logic and maths included the works of Ada Lovelace, then they'll have met the concepts that we now think of as "computer programs", which is the last step to solving the mental jump you mention. From there I think the main jump really is miniaturisation, and that can be seen in other technologies, particularly in the 1900-1930 period. Studying WW2 would help; it's close enough to the vampire's home time to make sense to them but has rapid technological advancement, including things like radios becoming much smaller, to the point of being portable for use by troops. – anaximander Dec 07 '16 at 11:21
  • 2
    Much better than the accepted answer. – goldilocks Dec 07 '16 at 14:07
  • 2
    Babbage’s difference engine is impressive, but it can only tabulate polynomials. His Analytical Engine, however, is fully programmable, and is arguably the first real computer to have ever been designed. Ada Lovelace’s programs were intended for the Analytical Engine. – Edgar Bonet Dec 07 '16 at 20:42
  • 1
    "The real mental jump problem is 'stored program computers can do things'" I'm inwardly giggling about someone who is flabbergasted by the thought that it's a computer at the other end of the keyboard responding to what he types and not another person. I mean, unless the vamp hung out with some real eggheads, the most he likely saw were the teletypes that were becoming quite common by the 1930s, and all of those let you talk to or request information from someone, not something. – Tim Dec 08 '16 at 19:33
8

Library science. It helps people to think in terms of storing and finding information, which is what "the net" is all about. Perhaps a bit of mathematics on top, graph theory.

o.m.
  • 114,994
  • 13
  • 170
  • 387
8

Somebody already mentioned Turing, but that has to be number one. The second thing to pick up on is Claude Shannon's work. In 1937, he wrote a thesis demonstrating that electronic circuits could simulate all the constructs of Boolean algebra. Given that mathematicians had already shown that the rest of mathematics could be built up from Boolean algebra, this pretty much lays the basis for electronic (rather than electromechanical) computing. He did a lot of significant work in the 1940s, as did Turing, but that's outside the scope of the question.
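To illustrate the leap Shannon's thesis makes possible, here is a minimal sketch (my own Python illustration, not Shannon's relay notation): once a circuit can do AND, OR and NOT, those pieces can be composed into arithmetic, here a one-bit half adder.

    # Boolean operations as switch-like primitives, then combined into a half adder.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def XOR(a, b):
        # exclusive OR built only from AND/OR/NOT, as a relay circuit could be
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    def half_adder(a, b):
        """Add two one-bit numbers: returns (sum_bit, carry_bit)."""
        return XOR(a, b), AND(a, b)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} = carry {c}, sum {s}")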

Another thing to look into would be Polish decrypting techniques of the late 1920s and early 1930s. These techniques led to the construction of the Polish bomba, a device later adapted and improved as the Bombe at Bletchley Park in Britain during WWII.

It's probably worth learning about the differential analyzer, designed by Vannevar Bush. This was an analog computer, and therefore could be a distraction from the mainstream of the future, which would be digital. But it would give you a handle on the power of simulations. Knowing that, plus the knowledge that digital simulations would eventually come along, would be helpful.

And, of course, there's Herman Hollerith, who devised a scheme for storing census data on punched cards in 1890. His company was bought out, and eventually run by a former NCR salesman, who gave the company a new name, "International Business Machines".
IBM would remain in the punched card business until 1954, when it decided to take over the computer industry, more or less.

And, of course, the work of the first programmer, Ada Lovelace. Her work had been around for almost a century in 1930, but it was still ahead of its time. She practically invented software engineering as a separate discipline from equipment design, and previewed the development of artificial intelligence.
If you want some background on other innovations available in 1930, take a quick peek into The Innovators, a book by Walter Isaacson. He outlines progress in infotech from Babbage through Google. The first couple of chapters should give you lots of material.

A lot of this stuff was old by 1930, but learning it at that time would put you ahead of the curve.

Walter Mitty
  • 1,837
  • 10
  • 12
  • Excellent answer, but IMO top billing goes to Charles Sanders Peirce who "described how logical operations could be carried out by electrical switching circuits" in an 1886 letter, giving us the logic gate. – Mazura Dec 10 '16 at 06:54
7

In addition to math (especially binary), Morse Code would give a good understanding of how computers communicate with each other.


Edit: My apologies. I think Morse Code is a bad example. Perhaps I should have said the telegraph and Baudot code. Baudot would introduce your vampire to the concept of 5-bit character encoding, so the concept of 7- or 8-bit ASCII (or just bits in general) wouldn't be a drastic leap for the vampire.
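For the sake of illustration, here is a tiny Python sketch of that point (my own addition; the bit patterns shown are plain ASCII, not real Baudot assignments): going from 5 bits per character to 7 or 8 is only a matter of having more combinations to assign.

    # The jump from a 5-bit code like Baudot to 7/8-bit ASCII is "more bits per character".
    def to_bits(value, width):
        """Render an integer as a fixed-width bit string."""
        return format(value, f"0{width}b")

    print("5 bits can distinguish", 2**5, "symbols")   # Baudot: 32, extended via shift codes
    print("7 bits can distinguish", 2**7, "symbols")   # ASCII: 128
    print("8 bits can distinguish", 2**8, "symbols")   # extended ASCII: 256

    # ASCII simply assigns each character a number; e.g. 'A' is 65:
    for ch in "VAMPIRE":
        print(ch, "->", to_bits(ord(ch), 8))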

Tim
  • 2,752
  • 11
  • 20
  • 2
    Can you expand a little bit on this answer? My experience with very old people who have learned Morse code for radio applications tells me that it has negative impact on their understanding of modern communications systems. They'll tend to associate the - and . with 1 and 0, instead of associating different amplitude levels with 1 and 0, which is the way modern pulse amplitude modulation works. At first, this may look OK, but in Morse code, even time periods when no transmission takes place are important, because they provide separation between the . and -, letters and words. – AndrejaKo Dec 06 '16 at 19:48
  • Also, quite often, they'll not be able to see the source encoding done by the Morse code and will have difficulties understanding how formal source encoding mechanisms actually work. – AndrejaKo Dec 06 '16 at 19:49
3

The sort of mechanical computing people were doing back then looked something like this: https://www.youtube.com/watch?v=s1i-dnAH9Y4

There are useful precursors. Morse code is a binary-like code he might know. He would not have used a calculator, but he might have used a slide rule, which relies on converting numbers to logarithms and back for multiplication and division. (That's still a technique electrical engineers sometimes use under the hood to implement those operations in hardware.)

The layout of modern keyboards is the same as on mechanical typewriters, although most men of his time would not know how to use one, because that was considered a job for women. Telephones existed, but there is almost nothing about a modern phone he would recognize: not the concept of reaching people by dialing only digits without talking to an operator, nor of leaving a recorded message instead of getting a busy signal, nor that it's possible, much less expected, to call someone up when you or they are not at home, and rude to show up unannounced. Movies existed, but not television or VCRs, and you couldn't watch video in color any time you wanted and pause or go back whenever you felt like it.
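On the slide-rule point above, a one-line Python check of the principle (my own illustration): multiplication becomes addition once numbers are replaced by their logarithms.

    # Slide-rule principle: add the logarithms, then convert back.
    import math

    a, b = 37.0, 512.0
    product_via_logs = math.exp(math.log(a) + math.log(b))
    print(product_via_logs)   # ~18944.0, the same as a * b
    print(a * b)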

Science fiction wouldn’t help him much. Isaac Asimov two decades later was still imagining that the engineers in the far future of his space operas would carry in their pockets unimaginably more-advanced slide rules. As for social changes, if anything, science fiction of his day was behind the times.

The part of the curriculum that's changed the least would be pure math, but there are still a lot of huge differences. He wouldn't have learned to add or subtract the way old people today think of as "traditional," because that's the New Math introduced in the 1960s. Proofs were something he learned in geometry class, not something done in high school algebra; a lot of his time would have been spent memorizing techniques for doing by hand calculations that are now always done by computer; it wasn't generally accepted that the foundation of mathematics was ZF set theory; and nobody today has even heard of the Rule of Three.

He’s essentially a foreigner in the modern world. People don’t walk, talk or dress like he expects. Especially women. Everybody wants to be on a first-name basis. Radio exists, but he wouldn’t recognize anything on it. Newspapers exist, but nobody reads them and even the comic strips are tiny and crude. If anything, computer programming is the one aspect of his daily existence at a university today least likely to cause him culture shock: the Internet and video games would be weird, but he has no expectations about them to violate, and computers are possible to completely understand.

Davislor
  • 4,789
  • 17
  • 22
  • So how did people add and subtract before 1960? – JDługosz Dec 07 '16 at 07:20
  • Isaac Asimov described a 1970s-style pocket calculator, down to glowing red numbers on grey plastic, in a story published in 1951. Most sci-fi had slide rules; Asimov was the exception. – JDługosz Dec 07 '16 at 07:24
  • 1
    Typewriters: (mostly male) executives did not use them, but you can see in old movies that lots of men who were reporters, writers, and even policemen used them. I think it was Mark Twain who first submitted a manuscript that he wrote using a typewriter. – JDługosz Dec 07 '16 at 07:27
  • 4
    Morse isn't actually a binary code (it can be encoded as binary, as any finite code can, but it actually has four fundamental symbols - dot, dash, interletter space, and interword space). It could be argued that the space between dots and dashes within a letter is a fifth symbol, but it can be encoded without it. – Random832 Dec 08 '16 at 00:44
  • @JDługosz Tom Lehrer explains it: https://www.youtube.com/watch?v=UIKGV2cTgqA – Davislor Dec 08 '16 at 04:08
  • @JDługosz Huh. I thought I remembered the line about the high-tech replacement for the slide rule from the Foundation series, but either that was from somewhere else or Asimov wrote it both ways. Thanks for the correction. – Davislor Dec 08 '16 at 04:53
  • @Davislor Tom Lehrer’s song/skit is mocking “new math”, just as people ridicule “common core” today. He is making fun of the new method, and does not represent how people today (since the 60s) add and subtract. It says nothing about how people added differently before. – JDługosz Dec 08 '16 at 06:02
  • @JDługosz If you listen again, you’ll see that he starts with a different algorithm that people used to use for subtraction, then starts making fun of the method we use for subtraction today. That is the New Math. He was joking how confusing it was to carry the 1. The term’s used disparagingly today, but the New Math he satirized as “so very very simple, that only a child can do it,” has become the Traditional way of doing arithmetic that opponents of the Common Core are defending. Someone who woke up from the ’30s would have learned to do it a different way. – Davislor Dec 08 '16 at 06:34
  • 1
    Can you summarize the “other way”? He’s just making fun of different bases and nomenclature that goes into detail of each step, rather than the old way of doing it by rote. But it’s still right to left digit at a time. – JDługosz Dec 08 '16 at 07:11
  • Actually, we did have television in the 1930s. In the 1920s, Televisors were used. By the late 1920s, color transmission was possible. In the 1930s, they were getting replaced by electronic television and there were already a number of TV stations worldwide transmitting mechanical TV. The 1936 Olympics were broadcast on TV. – AndrejaKo Dec 08 '16 at 12:32
  • @AndrejaKo Let’s keep in mind the big difference between what had just been invented somewhere and what would have been part of his daily life or mass culture. Although, one possibility is that this guy really did keep up with all the latest technical discoveries. That would say a lot about his personality (and also probably mean he followed a bunch of stuff like Quine’s New Foundations, which he’d eagerly look up and see went basically nowhere, and not just the stuff by Turing and Church that turned out to be a lot more important than anyone realized at the time). – Davislor Dec 08 '16 at 18:25
  • 1
    I don't think it's a big deal whether someone understands how TV works, or whether it's a surprise that it exists. No amount of physics / engineering understanding will help grok the way a tech affects the way people live / think when it's widespread. The social aspect, and implicit assumptions people make about what's common knowledge, are the important part. It's not common knowledge how to actually program a computer or how they encode data, but it is common knowledge that you can google almost anything. – Peter Cordes Dec 08 '16 at 19:16
  • 1
    If you care, here’s every arithmetic book in Google Books published between 1900 and 1920. https://www.google.com/search?q=arithmetic&tbas=0&gbv=2&source=lnt&tbs=cdr%3A1%2Ccd_min%3A1900%2Ccd_max%3A1920&tbm=bks Bear in mind that only a third of teenagers graduated high school back then, and that it was a national scandal in the ’40s that the Army had to teach its recruits enough remedial math to do bookkeeping and gunnery, since some people want to believe that standards have gone downhill. – Davislor Dec 08 '16 at 19:33
  • 1
    “_He [...] might have used a slide rule, which requires him to know how to convert numbers to logarithms and back for multiplication and division._” No, it doesn't: the slide rule does those conversions all by itself. – Edgar Bonet Dec 09 '16 at 13:11
  • https://en.wikipedia.org/wiki/Cross-multiplication#Rule_of_Three was supposedly known for millennia. – hmakholm left over Monica Dec 11 '16 at 11:11
2

Step 1: Look around for some days to get an idea of the current world and society; also set up a camp/safe-house somewhere.
Step 2: Keep doing step 1; start asking people about the "security" and "measures" to watch out for while doing some crime, and use the hypnotic powers you have to learn how not to behave strangely.
Step 3: Get an idea of "how people are educated", specifically in that field, using your hypnotic powers.
Step 4: Find a university, track down bright students and ask them to help you by giving some tuition at their home/rooms.

Now, if your vampire is this smart, then he will have a safe-house, a rough knowledge of modern society and a bright friend who will be willing to teach him computers and will be far less suspicious.

Mukul Kumar
  • 139
  • 5
2

Definitely understand that whatever computers do these days (including embedded computers in appliances, cars, industrial control, etc., and ANY "digital" electronics) would back then have been done by something mechanical or electromechanical (relays, electromechanical counters...), not by something built with early electronics, or only using electronics in very select places, e.g. a photocell amplifier. Early electronics were expensive and large, and were reserved for signals and events that were mostly analog (speech, video), fast (an electromechanical system will usually be hard pressed dealing with things happening in less than 1/1,000th or maybe 1/100,000th of a second, or with signals above 1-100 kHz; non-electronic telephones and loudspeakers are among the faster electromechanical devices), or weak (a radio signal from an antenna cannot usually operate a relay or motor, and a signal from a microphone will not drive a loudspeaker directly, even though electromechanical amplifiers did, rarely, exist).

Telephone exchanges (which already existed at that time) are a good example of an electromechanical system that handled "digital" tasks (and they were still built new that way well into the 1970s in some countries). Somebody understanding how these worked could understand a lot of computer logic by seeing it as a very fast-acting telephone exchange that uses electronic switches instead of electromechanical ones.

Electronic controls on, e.g., a tape recorder or household appliance are a good example of something that only became economical to do electronically in the 1980s; earlier designs used complex mechanics to do these things instead.

rackandboneman
  • 391
  • 1
  • 5
2

By the 1930s there were devices whose operation was controlled by complex electrical circuits. Push Button Elevators are a good example.

Relay Logic was being used to control sequencing of machinery:

large relay logic circuits were employed from the 1930s onward

See also George Stibitz

... studied the correspondence of statements of symbolic logic with binary relay circuits while a graduate student at MIT. Shannon wrote his graduate thesis (published in 1938) on that subject ... Clearly the idea of using relays to implement binary logic was common in the late 1930's.

The seeds of digital computing were well planted by that time.

AShelly
  • 223
  • 1
  • 5
1

I suggest "Maths".

The proof is that Maths was my whole/only background, before I was introduced to computers.

Someone (a mathematician) who's trained to accept algebra (e.g. "x = y + z") can extend that to computer programming: it's symbol-substitution.
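A tiny sketch (my own illustration in Python) of that symbol-substitution idea: an algebraic rule like "x = y + z" becomes a reusable definition in code, and substitution nests the same way it does on paper.

    # The algebraic rule x = y + z written as a reusable definition.
    def x(y, z):
        return y + z

    print(x(2, 3))        # substitute y=2, z=3  ->  5
    print(x(x(1, 1), 4))  # substitutions can nest, just like in algebra  ->  6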

Perhaps for a similar reason I think that alternatively some of the earlier 'boffins' might have been linguists: bilingual or multilingual, whether with modern languages or classics (Latin or Greek). Training for multiple languages, translating from one language to another, might be another kind of good preparation for a programmer (i.e. it's symbol-substitution again).

I guess that (personal) character/disposition is important too. A programmer needs to be willing to read the manual[s], spend time alone with the machine. If you can only get your kicks from interacting with other people, maybe you won't be willing to study this kind of stuff.

FWIW I learned the elements of networking at my first employer too. The historical terms you'd want to be familiar with are "circuit switching" versus "packet switching" (and maybe "multiplexing"). I presume that most modern networks are packet-switched.
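To show what the "packet switching" term is getting at, here is a toy Python sketch (my own illustration, not any real protocol): a message is cut into numbered packets that can travel and arrive independently, then be reassembled in order at the far end.

    # Toy packet switching: chop a message into numbered chunks, reassemble later.
    import random

    def packetize(message, size=8):
        """Split a message into (sequence_number, payload) packets."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets):
        """Put packets back in order and join their payloads."""
        return "".join(payload for _, payload in sorted(packets))

    packets = packetize("Hello from a freshly woken vampire.")
    random.shuffle(packets)            # the network may deliver them in any order
    print(reassemble(packets))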


My career as a programmer started in the early 1980s: before personal computers were common, and before programming was a frequently-used option at university.

I'd hardly used a computer by the time I left university. I read a couple of programming language books (Fortran and Cobol) from the public library when I was 12 or so -- and having read them I thought that, "maybe when I grow up I'll work for a company big enough to actually own a computer, to program".

Nevertheless (in spite of the fact that most graduates didn't have experience with computers) my first employer (Bell) employed 100s or 1000s of programmers. I think they figured that people who could learn Maths (at university) could learn programming, if they were hired and given a computer terminal.

I think I learned, given:

  • Hardware (e.g. a terminal) and office space
  • Introduction to a programming language (book)
  • Programming reference manual
  • Software tools (text editor and compiler)
  • Existing source code to read, run, debug, and modify
  • Software specification (i.e. defining what the software was supposed to do)
  • A year or so
  • A manager to give me tasks, and a competent QA process to test my edits.
ChrisW
  • 1,784
  • 8
  • 11
1

Let's see...

I think he would not keep up with what the world has now, even in computers (seriously, my dad is 65 years old and he doesn't know how to turn on my computer). Let's take this as an example:

Your computer puts up the welcome screen.

You have a 600-year-old vampire with a knowledge of computers equaling that of a child; I'll bet he would say "Thank you" to the computer screen (and how about when you tell him to "open a window"?).

The basic knowledge he must have is the basics: turning on the computer, what happens when it turns on, what happens when windows pop up, and what to press or not to press. The 1930s did have systems of that kind, but nothing as complex as our current computer systems.

If you will, give your vampire a calculator first.

Mr.J
  • 2,618
  • 14
  • 32
-2

Railroads. He will understand the concepts of bandwidth, power, routing and speed. Having him be an expert vampire railroad baron would be advantageous for your story.