What was the single thing you learned (either in classes or during work) that felt most like scales falling off your eyes?
For me, it was a lecture about microcode, because that filled the gap of understanding between electrons flowing through transistors to form logic gates, and assembler programming. It finally made me feel that I understood completely how a computer works, on all levels.
Related question: What is the single hardest programming skill or concept you have learned?
-
That I should learn something practical. Take care of animals, grow plants or learn to survive. When I understood how computers worked - the same way you did - I also understood I had to get out of this electron fantasy very soon :D
Michael Borgwardt : If you're participating on this website, I'd say it means you didn't get out after all :)
Skuta : Friends got me a plant!
-
Learning about programming language concepts, for instance static/dynamic linking, parsing, stacks, heaps and the inner workings of how programming languages function internally.
-
Polymorphism - suddenly Object-Orientation made sense!
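For anyone who wants the concrete version, a minimal C++ sketch of the idea (the Animal/Dog/Cat names are made up purely for illustration): one call site, different behaviour depending on the actual object.

#include <iostream>
#include <memory>
#include <vector>

// One interface, many implementations: the loop below never needs to
// know which concrete type it is holding.
struct Animal {
    virtual ~Animal() = default;
    virtual void speak() const = 0;
};

struct Dog : Animal {
    void speak() const override { std::cout << "Woof\n"; }
};

struct Cat : Animal {
    void speak() const override { std::cout << "Meow\n"; }
};

int main() {
    std::vector<std::unique_ptr<Animal>> zoo;
    zoo.push_back(std::make_unique<Dog>());
    zoo.push_back(std::make_unique<Cat>());
    for (const auto& a : zoo)
        a->speak();   // dispatched at runtime to Dog or Cat
}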
-
CPU infrastructure and machine organisation. How the bits got from memory into the CPU, and what happened during their execution. Understanding that was a turning point for me.
-
I think the first time I realised "Wow, the computer does whatever I tell it!", followed by the first time I realised "Oh, it really does exactly what I tell it, not what I want it to do."
That and, like you, when I realised I had learnt enough to have a rough idea of how a computer works, all the way from electrons to user interfaces. I find having the understanding of the levels below the one you are working on to be very helpful, especially when things don't happen as you expect - you are then able to reason about it from first principles and often work out why the machine is doing what it's doing. Knowing how the computer works down to physical processes also helps to reinforce that it's a machine and prevent one from anthropomorphising it - intentionally or otherwise.
Michael Borgwardt : Oh yes - I read somewhere some guy working on the first computers wrote about the moment he realized that he'd be spending much of his professional life from then on fixing mistakes in his own programs.
Overflown : Yes - this is the biggest stepping stone for most people. They think that when there is an error in whatever they are doing, that it is due to some random chance. With probability infinitesimally close to 1, it was actually exactly how their crappy app was written.
-
Pointers.
Once I figured out how to use a location, rather than the contents of a location, a lot of things became clearer to me.
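A minimal C++ sketch of that distinction between a location and its contents (the variable names are just for illustration):

#include <iostream>

int main() {
    int value = 42;
    int* location = &value;         // the address of the variable, not its contents

    std::cout << *location << '\n'; // 42: follow the pointer to the contents
    *location = 7;                  // writing "through" the location...
    std::cout << value << '\n';     // ...changes the original variable: 7
}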
Kyralessa : Well, for one thing, if you don't understand pointers, you're not likely to understand how object references work... and hence you're not likely to understand OO very well.
Abizern : Also - passing values or references between functions. In particular - passing objects with multiple levels of indirection.
Abizern : Oh, DavidK - you might know me as Stompy on LFGSS - so we have something to talk about the next time we go on a ride.
-
When I stopped listening to the lecturer telling me to think of data objects as "like cars, made of components" and started thinking of them as custom data types with their own commands. Suddenly I could program Java.
Software Monkey : Wish I could vote this up more than once!
furtelwart : :) Perfect answer!
Ben Aston : Totally agree - sometimes I look back and think perhaps my lecturers didn't know what the hell they were talking about.
TT : +1. Many interface tutorials use the Car metaphor as well. Interface Car, Class Ford : Car, Class Mazda : Car. Why would I create a class for each individual brand? Completely screwed up my understanding of interfaces.
Tetha : Indeed, looking through the things they try to teach you is the biggest step -- and the hardest, because it makes it hard to talk to other students.
vg1890 : Can anyone cite any articles, blogs, etc. that expand on thinking about objects in terms of custom data types? As someone new to OOP this concept makes a lot more sense to me and it'd be nice to see some tutorials that taught OOP this way.
Redbeard 0x0A : Also, keep in mind that it is a difficult job to be a teacher, especially with 100 students in a class. EVERYBODY LEARNS DIFFERENTLY!
Michael Borgwardt : Note that the "custom data types" way of looking at objects is valid only in class-based languages like Java, not so much in prototype-based languages like JavaScript.
-
"Wow, can I make my own class!?!!?"
discorax : That is a great moment indeed!
IceHeat : Especially after having them show up in Intellisense, just like the "real" built-in classes.
-
Storage. Once it actually twigged that each bit of data actually needed to live somewhere, in memory or on disk, and not just mysteriously exist somehow, things fell into place, programming wise.
-
Recursion.
Specifically, when I learned in university that one could implement a path-finding algorithm with a simple recursive function. I'd previously thought it was only useful for things like computing factorials.
It completely opened my mind.
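A minimal sketch of the kind of thing meant here, assuming a simple grid maze (the layout and function name are made up for illustration) - the whole path-finder is one recursive function:

#include <iostream>
#include <vector>

// 0 = open, 1 = wall. Returns true if a path exists from (r, c) to the goal.
// The maze is marked in place so a cell is never visited twice.
bool findPath(std::vector<std::vector<int>>& maze, int r, int c, int goalR, int goalC) {
    if (r < 0 || c < 0 || r >= (int)maze.size() || c >= (int)maze[0].size()) return false;
    if (maze[r][c] != 0) return false;           // wall or already visited
    if (r == goalR && c == goalC) return true;   // base case: reached the goal
    maze[r][c] = 2;                              // mark as visited
    return findPath(maze, r + 1, c, goalR, goalC) ||  // recurse into the neighbours
           findPath(maze, r - 1, c, goalR, goalC) ||
           findPath(maze, r, c + 1, goalR, goalC) ||
           findPath(maze, r, c - 1, goalR, goalC);
}

int main() {
    std::vector<std::vector<int>> maze = {
        {0, 1, 0},
        {0, 1, 0},
        {0, 0, 0},
    };
    std::cout << (findPath(maze, 0, 0, 0, 2) ? "path found" : "no path") << '\n';
}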
Dave Markle : To understand recursion, you must first understand recursion.
Eugene M : To understand recursion, make sure you have a terminating case.
EightyEight : // upvoted for the comments.
Slartibartfast : And the sad thing is that although everyone sees his/her first recursion when the professor says "you can make factorial in it", it's actually completely inappropriate to use it for something for which a FOR loop is designed, and real usages are seen much later.
-
Distributed version control has changed the way I see and do my work more than any other tool. No longer seeing version control as just a backup tool, and having the whole power of your own repository at hand, is very enlightening:
- Being able to do everything in private - even large changes spread over a large number of patches - without anyone even having a chance to see your stupid mistakes
- If in doubt, add one more branch
- Move patches and branches around like Lego bricks
- Decide not only what the code looks like, but also what the history looks like
- Have line-by-line control over which changes actually make it into the commit
- Reorder, rebase, squash and split patches with ease
- Have a set of damn powerful tools that build on the basic version control functionality
While the learning curve is a bit steeper than that of other tools, the view from the top is much better.
PeterAllenWebb : But you will get so much better if everyone sees your stupid mistakes! -
Work.
Once I started working, applying my theoretical knowledge, I realised how little I knew about the important things in this business.
For example, being a DB wiz means nothing unless you have the skills to extract the real requirements from a client, and ignore what he has scribbled on the back of a fag packet.
Another example is when I learnt that just because I've done Task X a million times and have considered all the expected consequences, I should still take a backup; I found out just how many unexpected consequences there could be, and how incredibly likely they are to occur. A senior colleague once told me: 'Never take a step forwards unless you are reasonably confident that you can take the same step backwards'.
Zan Lynx : Million to one chances show up 9 times out of 10. -- some Terry Pratchett character.
Simucal : Fag packet is a slang term for a packet of cigarettes widely used throughout the United Kingdom. Learn something new every day. I was trying to figure out what kind of "packet" could be considered gay.
Daniel Earwicker : As for that senior colleague's advice, I bet it was upsetting when they first heard about the 2nd law of thermodynamics.
CJM : Simucal - sorry, a translation would have removed the implied disdain in my story. An alternative would be to write 'on the back of a torn-up beer mat'. In reality, sometimes we don't even get requirements on a fag packet!
-
Applying the Strategy pattern in some PHP code (after which I started to look into design patterns).
This one opened my eyes to object-oriented programming, which I had earlier thought was just stupid parent-child relations.
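The original code was PHP; here is a rough C++ sketch of the same Strategy idea, with made-up shipping-cost names, just to show the shape of it: the algorithm lives behind an interface and can be swapped without touching the code that uses it.

#include <iostream>
#include <memory>

// Strategy: the algorithm is an object that can be swapped at runtime.
struct ShippingStrategy {
    virtual ~ShippingStrategy() = default;
    virtual double cost(double weightKg) const = 0;
};

struct FlatRate : ShippingStrategy {
    double cost(double) const override { return 5.0; }
};

struct PerKilo : ShippingStrategy {
    double cost(double weightKg) const override { return 1.2 * weightKg; }
};

struct Order {
    double weightKg;
    std::unique_ptr<ShippingStrategy> shipping;
    double shippingCost() const { return shipping->cost(weightKg); }
};

int main() {
    Order a{10.0, std::make_unique<FlatRate>()};
    Order b{10.0, std::make_unique<PerKilo>()};
    std::cout << a.shippingCost() << ' ' << b.shippingCost() << '\n';  // 5 12
}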
-
"A computer can't do anything complicated, it can just do simple things lots of times fast." That was a big starting point for me in programming.
Also recursion, which although I learned about it right at the start, I found I needed to go back over and learn in more depth as the rest of my programming got better. From time to time I realise it's time for me to go back and relearn recursion.
Daniel Earwicker : Seems like a strange distinction to me - of course they can do complicated things, they just do it by doing simple things lots of times and fast.
CJM : Earwicker, I think that is the point. A daunting project is suddenly less daunting when broken down into a series of small stages.
-
Realizing that what you produce lives longer than you think - which is a nice way of saying people a long long time from now will be dealing with what you write today.
I got a call in 2002 from someone at a company I left in 1989 pleading for help with a database application I wrote on a Sun workstation in C & UNIFY. It had since been ported to VAXen/INGRES and then to Windows 2000/SQL Server. They were STILL using it and trying to build a web app out of it!
-
Not quite a duplicate, but very closely related to this post.
-
I have to say, top of the list has to be function pointers in C.
I'd missed the class and copied the notes from a friend. I wrote an example and fireworks went off in my head; suddenly it occurred to me, all at once, just how incredibly powerful they were. With these you could write ANYTHING: compilers, operating systems, etc. It completely changed how I looked at code, and in my mind made everything achievable.
Later that year I implemented a project using what I now realise was a primitive form of polymorphism. It took less than a quarter of the code the rest of the class used. All thanks to (deep announcer's voice) the power of Function Pointers.
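For anyone who hasn't had that moment yet, a small C++ sketch of the kind of hand-rolled polymorphism meant here (the shape names are invented for illustration) - one call site, and the function pointer decides which code actually runs:

#include <cstdio>

// A "shape" whose behaviour is a function pointer: swap the pointer,
// swap the behaviour, long before virtual functions enter the picture.
typedef double (*AreaFn)(double);

double circleArea(double r) { return 3.14159265358979 * r * r; }
double squareArea(double s) { return s * s; }

struct Shape {
    double size;
    AreaFn area;
};

int main() {
    Shape shapes[] = { {1.0, circleArea}, {2.0, squareArea} };
    for (const Shape& s : shapes)
        std::printf("%f\n", s.area(s.size));  // same call site, different code runs
}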
Years later - after learning some OO - on the job I was introduced to UML and design patterns. Again, suddenly I had a vocabulary that allowed me to communicate all the cool ideas I was having. E.g. instead of saying
- "What we need is a thing and you'd have a couple of different ones, but like you'd only be using one at a time and what it does is create the specific objects for you and don't worry it'll be fine I know what I'm talking about"
I could say
- "We need a Factory"
and everyone would know what I was talking about (and if not, I'd hand them my design patterns book, and they could RTFM)
-
When I needed to solve a problem and realized I could do it with a program.
Mark Robinson : I once had a math professor tell me to stop writing programs to prove theorems; he said I was "missing the point". I was a freshman and very proud of myself for using my CS skills to make my life easier.
-
When I read how LISP works. It was beautiful.
-
Wait, you mean I didn't really need to learn calculus for web development jobs?
Joshua Hudson : I thought about this one a bunch as well. Though the longer I'm in the industry, the more I appreciate the calculus I did learn. Not that I ever use it, but the problem solving and the way you tackle a hard math problem can be brought over into the development world.
BenMaddox : Oh, it is helpful, but not needed. I'd say the problem solving aspect was mostly learned through algebra and geometry for me. I guess my point was I didn't *need* to take it that far.
-
"Hello World!"
-
Functional Programming. Doing Miranda and Haskell changed the way I think about programming and solving certain kinds of problem.
Using my second programming language (Fortran -- don't laugh) also opened my mind.
-
Computer Science is not about digital computers and it is not a science.
It was said a long time ago. But most programmers don't get it.
Our design of this introductory computer-science subject reflects two major concerns. First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Second, we believe that the essential material to be addressed by a subject at this level is not the syntax of particular programming-language constructs, nor clever algorithms for computing particular functions efficiently, nor even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems. SICP
J.F. Sebastian : Related: http://www.cs.uni.edu/~wallingf/blog/archives/monthly/2008-09.html#e2008-09-16T21_43_48.htm
Daniel Earwicker : "Thus, programs must be written for people to read, and only incidentally for machines to execute." Try telling that to the compiler!
Robert Rossney : Science is developing an understanding of the universe through controlled repetition. That's what CS does. It isn't any less a science because the subset of the universe it examines is manmade.
J.F. Sebastian : @Robert Rossney: What you are saying doesn't contradict my point (maybe I've poorly expressed it, so see http://uk.youtube.com/watch?v=zQLUPjefuWA)
BobbyShaftoe : I quote that from SICP all the time!
BobbyShaftoe : Also, I don't think Computer Science is exactly a science, as science is really about explanation. You have testable hypotheses and so forth. Computer Science is more aligned with mathematics and logic than science.
-
I learned to program early on - probably when I was 12 back in 1987. I just kept programming and writing little things that suited my purposes or studies. I'd never considered myself an actual developer until at one point during a co-op term it suddenly occurred to me that professional programs (Word / Lotus 123 / Doom) all worked by using the same ints and float variables I was using, and that I actually was a developer.
I remember the thought struck me so hard that I stopped and said 'huh.'. For me, that's a massive emotional outburst.
-
My first introduction to patterns. Finally I figured out how not to reinvent the wheel every week. I finally figured out what colour it should have been in the first place.
-
Realizing that there are unsolvable problems. And then, realizing that one can approximate solutions to these problems.
-
IoC containers. All of a sudden your application is beautiful - less coupled, easier to maintain and a lot easier to write in the first place! Ban spaghetti!! Microsoft's Unity is easy to use, but Ninject is superb. There are loads out there (StructureMap, Castle Windsor, Autofac, etc.) and it doesn't really matter which one you use - just use one.
-
Digital Logic Design class... The professor said that XOR and AND gates form a universal set. In other words, you could build any computer from a combination of them. Pretty mind-blowing! :)
Barry Brown : NAND *is* "not AND" :)
Kim : NAND alone is sufficient, but XOR + AND is also universal.
-
My moment was when I was reading the SICP book: I realized that lambdas in the presence of closures allow you to implement any data structure you like. Suddenly, cons, car and cdr were not the essence of Lisp, as we were taught back in Russia. Lambda was.
(not that this affected my day-to-day C++ slog, but it was beautiful)
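A rough C++ translation of that SICP exercise, for the curious (the C++ rendering is an illustration; SICP does it in Scheme): the "cons cell" is nothing but a closure that remembers its two values.

#include <functional>
#include <iostream>

// A pair built from nothing but closures: the cell is a function that
// remembers a and b and hands one of them back on request.
using Pair = std::function<int(bool)>;

Pair cons(int a, int b) {
    return [a, b](bool first) { return first ? a : b; };
}
int car(const Pair& p) { return p(true); }
int cdr(const Pair& p) { return p(false); }

int main() {
    Pair p = cons(1, 2);
    std::cout << car(p) << ' ' << cdr(p) << '\n';  // 1 2
}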
-
Humility. Going into my first 9 to 5 development job, about twelve years ago, thinking it was going to be a cakewalk and quickly getting put in my place. It is the realization that knowledge is not the same as experience.
Joshua Hudson : I would vote this up a hundred times if I could.
Tmdean : Some of us had the opposite experience.
joseph.ferris : You realized that knowledge is the same as experience? Good luck with that!
-
Deterministic finite state automata and the realization that there is a direct transformation that goes from a drawing with circles and arrows to logic gates - this was the zen moment of knowing that software and hardware are one.
-
It was at my first real programming job when "scope" clicked. I had a basic understanding of it, but wasn't optimizing for it in high school. My job (right after high school) made sure that I knew what scope was.
I had a basic understanding of all the OOP concepts. However, until scope clicked, I wasn't able to dive in and start running with development.
-
Probably in work and working with people. I think the hardest thing ever is working with people in a group and modifying pre-existing code within a tight deadline. In school, all we were taught was "Make a student register application", which only took 100 lines and rarely gave any insight into large-scale applications, maintenance, or working with groups.
-
That all the theoretical stuff that I learned in CS doesn't necessarily apply that well in the real world. Yup, normalization should always be done to tables in databases... what?!!! All your data is in this one table? What is this column doing all over the place?
We should use UML to do all the documentation... wait, what, you guys have never heard of UML? Is this some sort of twilight zone?
That trying to write a good program is about more than finding the algorithm with the best complexity, or just remembering the syntax really, really well.
-
When I understood that data and instructions were both just bit patterns in storage, and that what happened with them depended totally on what process was interpreting those bit patterns. The eight-bit byte 00110000 (hex 30) could be a 6502 branch-on-minus opcode, or the ASCII code for the digit 0, or the number 48, etc.
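A tiny C++ illustration of the same point - one byte, several readings (the 6502 interpretation lives only in a comment, since it depends on what is executing the byte):

#include <cstdint>
#include <cstdio>

int main() {
    std::uint8_t byte = 0x30;  // the same eight bits...

    std::printf("as a number:      %d\n", byte);                            // 48
    std::printf("as an ASCII char: %c\n", byte);                            // '0'
    std::printf("as hex:           0x%02X\n", static_cast<unsigned>(byte)); // 0x30
    // On a 6502 it would also be the BMI (branch on minus) opcode;
    // nothing in the byte itself says which interpretation is "right".
}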
-
Mine was an anti-CS moment: realizing that for all we care about the complexity of a data structure, the constant matters a lot, and can get messed up by memory layout, file systems, etc.
MaD70 : Yes, Big-O notation can be misleading. -
When I realized that code and data were the same thing. It's all bits and code can be manipulated just like any other data.
Paul Nathan : Only in von Neumann architectures. ;-) -
My first big system.
For about 10 years, I had programmed on and off as a hobby, writing programs between 10 and 100 lines. I then arrived at college, and completed some very complex, yet brief algorithm assignments - still under 1000 lines, perhaps 3-5 files.
Then, I took a class with a term project. The task was to create a web-based information system - you know, something similar to things people actually use. There is no experience quite like starting from nothing, figuring out a technology, and creating a multi-thousand-line application. It seemed somewhat magical that my SQL commands actually created a functioning database, and that they actually made it over a network. It was also an eye-opener to realize that I, as a programmer, was fully capable of creating things commonly sold for thousands of dollars.
-
Hashing. When I realized that I could use a mathematical transform of a key to pick an index in O(1), and that I could get constant time storage and retrieval I wanted to store EVERYTHING in a hash table!
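A toy C++ sketch of the idea (the HashTable name and the 16-bucket layout are made up for illustration): hash the key, take it modulo the number of buckets, and that is the index - no searching through the whole collection.

#include <functional>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// A toy hash table: transform the key into an index, then store and look up
// in constant expected time. Collisions go into per-bucket lists.
struct HashTable {
    std::vector<std::vector<std::pair<std::string, int>>> buckets{16};

    std::size_t index(const std::string& key) const {
        return std::hash<std::string>{}(key) % buckets.size();
    }
    void put(const std::string& key, int value) {
        buckets[index(key)].push_back({key, value});
    }
    const int* get(const std::string& key) const {
        for (const auto& kv : buckets[index(key)])
            if (kv.first == key) return &kv.second;
        return nullptr;
    }
};

int main() {
    HashTable t;
    t.put("answer", 42);
    if (const int* v = t.get("answer")) std::cout << *v << '\n';  // 42
}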
Oak : I also remember learning hash-tables as something mind-blowing... too bad that only a year later we've learned the cache implications of over-using hashtables! -
Well, this was really early in my CS education, but at one point I'd written a number of short straight-line execution programs. Then I learned about arrays and loops, and it really was an amazing experience to see that light bulb switch on.
-
Starting out: GOSUB. Wow! you can REUSE bits of code in your program?
At Uni: The Universal Turing Machine. A realisation of how simple computers fundamentally are.
Work : More difficult to pick one thing out, but possibly finally grasping the implications of apply-templates in xslt.
-
The most profound thing I learned going for my Bachelor's in CS came during my AI class, when I learned that information can be considered interchangeable with energy. This totally blew my mind and changed the way I look at the world.
More practically, I didn't have a true understanding of pointers until taking an assembly language course. Before understanding their implementation, they might as well have been useful and yet unpredictable gremlins.
-
The most important thing I ever came to understand as a programmer is that it is universally my fault when my code behaves incorrectly. Even in the few cases where it is not my fault, it's still probably my fault.
-
Learning Smalltalk, in my concrete case Squeak.
-
When I realized that there is much more to computer programming than writing compilers and assemblers...
-
Data Structures.
When I learned all of the common ones I realised that a lot of the code I had written could have been done a lot better. Admittedly, I learned all of that stuff during my first year of university, so I wasn't exactly an expert by that stage. Before learning about linked lists, I was doing silly things like creating very large arrays and simply hoping that the array would never be exhausted.
Now when I see a problem, I have a much clearer idea of how the data should be stored and accessed, along with the speeds of each implementation.
-
I had started Windows programming with MFC, but the concept of windows (parent, child, sibling, etc.) was not very clear to me. It may seem very weird, but as soon as I read about the GetDlgItem function, everything became clear :-). Suddenly reading MSDN became my favorite hobby.
-
I was writing a small C++ program for a data structures class in college (using a DOS version of Borland). I had gone through a few iterations, but by now I understood exactly what it was doing. It was so simple, there was NO WAY it couldn't work... except that it didn't!
Stepping through the debugger, I watched it jump to some "random" line of code "for no reason at all". At my wit's end after watching it do this 10 or 15 times, I rebooted the PC and ran the program again. It worked fine! Hmmm... Guess I should've paid more attention to all those lessons about pointers and needing to be careful about accidentally venturing past the end of your arrays!
-
Python's use of lists. After reviewing the list of methods, I was extremely confused as to why something called a "list" would need these. Working through them, however, taught me quite a bit about data structures, including stacks, queues, linked lists, and eventually tuples, dictionaries, and sets as I worked through "why does this need something different than a list?"
For a while, though, my Python code did more list manipulation than my Scheme code.
-
Learning that a computer can be built using nothing but NAND gates and a clock.
Then much later on actually simulating this process myself. http://www.hackszine.com/blog/archive/2008/03/from_nand_to_tetris_in_12_step.html
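A quick C++ sketch of the first half of that exercise - every familiar gate built from nothing but NAND (the clock and state, as the course shows, come later):

#include <iostream>

// Everything below is built from a single primitive: NAND.
bool nand_(bool a, bool b) { return !(a && b); }

bool not_(bool a)         { return nand_(a, a); }
bool and_(bool a, bool b) { return not_(nand_(a, b)); }
bool or_(bool a, bool b)  { return nand_(not_(a), not_(b)); }
bool xor_(bool a, bool b) { return and_(or_(a, b), nand_(a, b)); }

int main() {
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::cout << a << " " << b
                      << "  AND=" << and_(a, b)
                      << "  OR="  << or_(a, b)
                      << "  XOR=" << xor_(a, b) << '\n';
}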
-
Once, when I had been programming for about 6 years, my company sent me on a totally inappropriate course (can't even remember what it was - something to do with local area networks maybe...). Bored with it, I had a look in the class next door where they were doing structured programming. There were more course notes than students, so I was able to take one home and read it. There was nothing in it that I didn't kind of know intuitively and from experience - but I had never been explicitly taught it either, and having the principles and reasoning behind it spelt out was very illuminating.
Since then I have learned many other languages, and newer techniques such as OO, but the principles of structured programming are just as valid now as they were then.
-
'A' + 32 = 'a'
Took me a while to fully appreciate the fact that empirically everything is just numbers and not abstract 'a' letters and 'b' letters etc. I'm also an EE, so I'm slightly biased.
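The same observation in a few lines of C++ (ASCII assumed):

#include <iostream>

int main() {
    char upper = 'A';          // just the number 65 in ASCII
    char lower = upper + 32;   // 65 + 32 = 97, which is 'a'

    std::cout << lower << '\n';                        // a
    std::cout << static_cast<int>('a') << '\n';        // 97
    std::cout << static_cast<char>('z' - 32) << '\n';  // Z
}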
-
University classes in theoretical computer science and compiler construction.
On the theoretical side, I learned about concepts like correctness and formal provability, and where the limits of being able to write correct software are. In my view, some knowledge in this area is mandatory for writing software that does what it should, even if formal proofs are actually hardly ever done in real-life software development.
There was no other place where I learned so much about programming, the importance of theory (in some areas), and also things like how to implement complex data structures efficiently, than in compiler construction. Knowledge in this area does not only help with related problems like building parsers for complex data formats, macro languages or similar things, but also helps to give some idea of what the computer actually does when we enter instructions in a high-level language, and what needs to be done to implement software efficiently.
-
Recursion.
-
This is taking me way back to my youth. Apple ProDOS had just come out. Prior to that was plain ol' Apple DOS 3.3 with its flat file system, which I cut my teeth on.
I had an "a ha!" moment when I figured out the difference between absolute and relative pathnames and that they were interchangeable. The concept of the "current working directory" suddenly took on a whole new dimension which was missing before. Sure, it had all been explained in numerous books and magazines, but it didn't sink in until that moment.
-
Learning how pointers worked in C was definitely a light-bulb moment. But I had a much better one a few years later: modularity and abstraction. What is significant is that it came after I'd been doing both for months. Experience can be a wonderful teacher.
(What actually happened was that I was learning how to write Windows programs in C against the Win16 API. The Petzold book was absolute gold, but it taught "start with this skeleton". That was the key. I eventually had a batch file to start a new program by copying the template I had made of the essential pieces. When I learnt DDE, there was so much mechanical stuff you had to do that it was (by then) natural to abstract it away into another .c file. Then I built a small library on top of my own DDE one and that's when I realized what I was doing. The lesson has stayed with me ever since.)
-
Sitting with a user group of professionals all asking about solutions to their problems and realising that every single one was a problem to do with individual people and not technical issues. Every problem was a people problem.
-
I dunno if any scales fell off my eyes :-) but something that I thought was really cool was spatial data structures, like kd-trees and PR quadtrees. I also liked doing 3D graphics with matrices.
-
When I realized what functions were, a light bulb went off. "I don't have to do copy and paste anymore!"
mackenir : It goes dark when lightbulbs go off ;) -
Functional Programming.
Although I'm a determined C# programmer, this tutorial was an eye-opener to me that there are other ways of solving problems. Using overloading in C# to emulate functional programming, I was able to refactor some of my more complex algorithms using less than 20% of the code I had before, while getting better readability.
-
AND, NOT and OR. I was aware of what they do, but one day our teacher explained to us how you need to arrange them to add up two 4-bit values. 1 minute later I understood how you would go about arranging them to perform whatever operation you want on operands of any size you'd like. 2 minutes later I was thinking "a 32-bit CPU must be about the most complicated thing in the world but I still understand how it works, yay".
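A sketch of that arrangement in C++, assuming the standard ripple-carry construction (the function names are made up): one full adder per bit, expressed only with AND, OR and XOR, chained four times.

#include <bitset>
#include <iostream>

// One full adder: sum and carry for a single bit position,
// expressed only with AND, OR and XOR.
void fullAdder(bool a, bool b, bool carryIn, bool& sum, bool& carryOut) {
    sum      = (a ^ b) ^ carryIn;
    carryOut = (a & b) | ((a ^ b) & carryIn);
}

// Chain four of them into a 4-bit ripple-carry adder.
unsigned add4(unsigned x, unsigned y) {
    bool carry = false, sum = false;
    unsigned result = 0;
    for (int i = 0; i < 4; ++i) {
        fullAdder((x >> i) & 1, (y >> i) & 1, carry, sum, carry);
        result |= static_cast<unsigned>(sum) << i;
    }
    return result & 0xF;
}

int main() {
    std::cout << add4(0b0101, 0b0011) << '\n';        // 5 + 3 = 8
    std::cout << std::bitset<4>(add4(9, 9)) << '\n';  // 9 + 9 = 18, wraps to 0010
}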
Of course every processor I would have built back then would be missing state, but we learned that later :-)
Recursion, polymorphism, the power of LISP and some other things I can't think of right now were also pretty big eye openers.
-
Lisp and the idea that you can use code to execute other code.
-
In my 2nd CS C++ class, I finished my homework assignment on pointers. I couldn't believe it actually worked.
-
1972 -- "fixed" a broken photo-typesetting machine by having the 4-bit computer flash every character twice, thereby getting the newspaper to the pressroom. The manufacturer's technician was 150 miles away, and was able to replace the weak flash power supply the next day. Fixing or working around broken hardware with software was a "WOW" moment for me.
-
Being astounded that the IBM 1620 could do millions of operations, each nearly instantaneously, and never make a mistake. Computers really were a brave new world.
For a mechanical engineer, where things slip, wear out, fatigue, rust, and eventually break, that was phenomenal.
Or that a chunk of program that would take millions of operations could be invoked by a single instruction. That is like a machine the size of a battleship hanging comfortably from the thinnest wire.
-
I first encountered the idea of object-orientation in college, and while I understood the mechanics well enough — the "how", if you will — I didn't quite get the "why". It seemed like just another way of representing data and actions, and a fairly cumbersome one at that. It certainly didn't inspire me to stop writing procedural code at the time.
Some time later, however, I found myself reading through the language documentation of an interpreted language I was considering taking up (I've forgotten which) and while scanning through the examples, found what seemed the single most transcendent notion I'd ever encountered. The example was something akin to the following:
" foo ".trim();
In all my courses, I had never seen an object method called on a literal. It astounded me! For whatever reason, the idea of objects suddenly made sense. Classes as a way of structuring data had seemed clear enough before, but until that moment, the idea that objects could be so deeply embedded in the design of a language that actual string literals were objects with class methods had never occurred to me.
I've always felt a great debt to whatever anonymous programmer decided to add that particular example. Not only did it greatly expand my concept of how code is written, but I don't think I would have survived learning JavaScript without it!
-
Concurrent programming/Multi-threading flow control. This is where we got an assignment where we would enter a # of kids and a # of toys and each kid would receive a random toy trying to complete the set by trading duplicates to get any that were missing. The key was to create Semaphores to prevent cheating like somebody getting 2 toys while giving away one. Very cool assignment that showed how complex the real world can be.
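The original assignment used semaphores; a rough C++ sketch of the same "no cheating" idea using a mutex-protected critical section instead (the Kid struct and toy values are invented for illustration):

#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

// Each kid's toy collection is shared state; locking both kids makes the
// trade atomic, so nobody can receive a toy without giving one up.
struct Kid {
    std::vector<int> toys;
    std::mutex m;
};

void trade(Kid& a, Kid& b, std::size_t ia, std::size_t ib) {
    std::scoped_lock lock(a.m, b.m);   // lock both at once to avoid deadlock
    std::swap(a.toys[ia], b.toys[ib]);
}

int main() {
    Kid alice, bob;
    alice.toys = {1, 1, 3};
    bob.toys = {2, 2, 3};

    std::thread t1(trade, std::ref(alice), std::ref(bob), 0, 0);
    std::thread t2(trade, std::ref(alice), std::ref(bob), 1, 1);
    t1.join();
    t2.join();

    for (int t : alice.toys) std::cout << t << ' ';  // 2 2 3: both trades happened atomically
    std::cout << '\n';
}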
Second on the list would be the realisation that only about a dozen lines of code are needed for a function to do its work. I remember that being said, and seeing some examples where that is how things are done; if a function takes more lines of code than that, it can likely be refactored down to that size.
-
The biggest thing I learned was that the customers have no idea what they want and once you show them something they usually will know they don't want exactly that and "can you make that green?" or "how about a picture of a tree here instead" will happen.
Basically, Joel said it best in his article The Iceberg Secret, Revealed: an application takes 1-10% of your time to look good and 90-99% to be functional, but the customer will only care about that 1-10%.
-
Test-Driven-Development and Domain Driven Design.
Nilsson's book on Domain-Driven Design has opened my eyes to the benefits of testing and modeling.
It was an unnerving experience looking back at my untested code.
-
It's been said many times but it's the same for me: Pointers
A spooky "clunk" noise at the back of my head as the relationship between code and hardware fell into place in a way it never really had before.
I'm sure there are many other ways to get that "ah ha" moment and that I was a dullard to have taken so long to have it happen for me but, until it happens, there is something fundamentally missing in one's grasp of the whole system one is working with.
-
Probably the most eye-opening experience was a required computer engineering class that went over logic gates, flipflops, adders, all the way up to state machines and ALUs. It was fun learning how those things worked but at the end of the class we actually designed a CPU. It was shocking to see how it worked. A CPU instruction was really just a bit pattern used by multiplexers to specify the input registers, output registers, and operation of the ALU (obviously modern CPUs are much more complex).
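A toy C++ model of that insight - the instruction format below (2 bits each for destination, source A, source B, and ALU op) is invented purely for illustration, but it shows how an instruction is just a bit pattern steering multiplexers:

#include <array>
#include <cstdint>
#include <iostream>

// A toy "CPU" where an instruction is just a bit pattern:
//   bits 0-1 : destination register
//   bits 2-3 : source register A
//   bits 4-5 : source register B
//   bits 6-7 : ALU operation (0=add, 1=sub, 2=and, 3=or)
void execute(std::uint8_t instr, std::array<int, 4>& regs) {
    int dst = instr & 0x3;
    int a   = (instr >> 2) & 0x3;
    int b   = (instr >> 4) & 0x3;
    int op  = (instr >> 6) & 0x3;

    switch (op) {  // the "multiplexer" in front of the ALU
        case 0: regs[dst] = regs[a] + regs[b]; break;
        case 1: regs[dst] = regs[a] - regs[b]; break;
        case 2: regs[dst] = regs[a] & regs[b]; break;
        case 3: regs[dst] = regs[a] | regs[b]; break;
    }
}

int main() {
    std::array<int, 4> regs = {0, 7, 5, 0};
    // op=1 (sub), srcB=r2, srcA=r1, dst=r0  ->  r0 = r1 - r2
    std::uint8_t instr = (1 << 6) | (2 << 4) | (1 << 2) | 0;
    execute(instr, regs);
    std::cout << regs[0] << '\n';  // 2
}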
It was then that I felt like I understood computers "all the way down", from the higher-level stuff like Java and C++, through lower-level stuff like assembly, and of course knowing how logic gates worked. But the CPU design, which connected the highest-level hardware devices (registers, ALUs, etc.) to the lowest-level programming (assembler), meant that I now had a 'complete path' all the way from transistors at the very lowest level to whatever you could imagine at the highest level: OO design, scripting languages, whatever.
Other than that, the theoretical stuff was enjoyable, but it was a gradual progression of "cool stuff" rather than any one 'ah hah' moment.
-
When I started to learn Design Patterns, I then realized the real power of polymorphism. It really opened my eyes, and has completely changed the way I think about every project.
-
Much like the OP, my epiphany occurred while tracing instructions through a pipeline. It was like the last piece of the puzzle. All of a sudden there was no mystery left about computers. This was all there was to know; everything else was just gravy.
-
I am 24 and still learning a lot of CS stuff, but so far the biggest eye-opener has been my exercise to learn Common Lisp and reading the SICP book.
-
With assembly, writing bytes to the address space of the screen and seeing pixels change.
-
Coming from a C++, C#, and Java background, many of the concepts of Scheme (a dialect of Lisp) such as first-class procedures, code as data, and data as code were eye-opening (see Structure and Interpretation of Computer Programs).
Clojure (a JVM Lisp) is also eye-opening for its use of built-in concurrency.
-
It's the day you start thinking in code, rather than thinking about what code you're going to be writing.
This seems to happen about 1 - 3 years after starting a language.
-
0.
"Computer science is no more about computers than astronomy is about telescopes."
1.
Computer science and mathematics are closely linked. Math would help me with cutting-edge computer science.
2.
That I didn't need to know advanced mathematics to be a successful programmer (to have high income).
-
When I first saw how a clever algorithm could be used to replace a bunch of really stinky code, I realized there was more to programming than just learning IBM 1401 instructions. Many times I have ached to start coding a project, and then forced myself to do some more thinking.
-
The desk calculator example in Kernighan and Pike. It demonstrated how to use lex, yacc, and function pointers, and how to implement a new language, all in one simple example.