
Friday, December 16, 2016

Enabling Creation, Not Just Consumption: "Coding for Kids"


(some of my Apps at the Google Play Store)


10 PRINT "HOW DO I GET MY 9 YEAR OLD STARTED?"
20 GOTO 10



I taught myself BASIC on a ZX Spectrum when I was a teenager. 

I'd sell computer science as basically being much more fun than it sounds. At its best, programming is a very creative activity, a drive for a balance of simplicity, power and elegance. It can be incredibly rewarding, and luckily there's also no shortage of work if you're good at it, and the pay is high.

It has many of the plus sides of any physical engineering discipline (including the pleasure of making and evolving something from an idea), and few of the limitations:


  • very short feedback loops: you're almost instantly able to make something and see the effect - incredibly motivating for newbies;
  • the ability to return to any point during construction and 'branch' with alternate ideas, or start over;
  • all the resources and community you need to start from scratch and become an expert. No college degree is required - just a laptop, an internet connection, motivation and talent.

I think we need to sell computer science in terms that will appeal to kids. But there's far more to IT than coding. Nearly every child will use IT in their future career, and coding will suit very few. There are networks, hardware, storage, capacity and more. "Coding for kids" alone is too simplistic a view, in my view (I've been an IT worker since 1991).





Once upon a time, people needed to learn about levers, gears and linkages to make their way in the industrial world. Later, it was triodes and pentodes, then transistors and integrated circuits. Nowadays it's C and other languages. It's a natural progression. Obviously, there will be some who just want to do hairdressing and "beauty therapy", but for others with ambition and a bit of an engineering bent, programming is where it's at. Single-board microcomputers such as the Raspberry Pi and Arduino are an easy, accessible way to get started, and I can imagine it will be exciting and satisfying for a surprisingly wide range of students.

There is very little exciting or engaging about making a PowerPoint presentation or using Microsoft Word - those are "skills" you pick up anyway. But the real world runs on code, and people still have to write it. If you don't understand it, even at a basic level, you are going to find yourself in a world you neither understand nor thrive in. Who wants that? Taught in the right way, there's nothing boring about it.

I've shown people the very basics of Arduino - the "flash an LED" program - and without exception, when people get it working through their own efforts, it gets them excited. The usual response is that they never thought electronics or programming was that simple; they assumed it was inherently inscrutable and beyond them. Once you show them that it isn't, the ideas and inspiration naturally follow. Even if it only ever goes that far, they've learned something about how things work in the real world, and that is of great value.

I'm sure this will take off. It's even making me think that becoming a teacher for this sort of thing could be worth considering. If the teachers are not inspired, how can the kids be? I have yet to meet a teacher who is inspired by what can be done with PowerPoint. The computer is probably the most important and powerful piece of equipment most people have in their house, and everyone should have the skill to tell it what to do for a change.
When it comes to children, some things (like cryptography or computational biology) are rather advanced subjects to start off with. Other things, such as recursion or even search, are not really worthy of their own mention. That leaves out a lot of subjects that are, for children, more interesting, such as animation, drawing and user interaction. I also miss basic concepts: how does a computer actually work? What is it made of? If I had to make a list of stuff for kids to learn, it'd be something like this:


  • Basic computer concepts
  • Algorithms (would also include search and simple cryptography, also use of Open Source resources)
  • User interaction (not necessarily with GUIs)
  • Animation and Drawing
  • Web Design
  • Social Design and Crowd Sourcing
  • Basic System Administration
  • Cloud Computing

A lot of these things could be taught without computers, in the form of games for younger kids. Information technology is not just about computers. Kids (and others) really need to understand more of the fundamentals, not just how to use Excel. Otherwise, in a few years, a lot of key (but old) technologies will be lost (such as C or assembler).

If you want to do anything seriously useful with computers, you've got to have the maths. It's the maths that determines your ability to model a real-world scenario with data. That means matrices, differential equations, analysis, and all that other beastly stuff. Teaching someone "a programming language" will not address this deficiency. What objects will they be programming with? This stuff is tough, for students and teachers - much tougher than teaching Logo - but it's a gold mine.

I am pro programming in the curriculum. Why? Because I am pro problem solving in the curriculum, and there is far too little of that. Programming (or coding) is a way of expressing thoughts towards solving a problem, through to communicating the solution to others. It is like writing, music or art. It is a tool for creativity, for modelling and for communication. It is about enabling creation, not just consumption; it is about cooking, not just eating. I'm not sure, however, how we give children opportunities to '…understand computational thinking…'. I don't want them to just understand it, I want them to bloody do it!


NB: You can find my programming posts by using the following list (by year):

2016 (4 posts):



Enabling Creation, Not Just Consumption: "Coding for Kids"
Android App: Brick-a-Brack
Android App: Whack the Minion!


NB: Counting the post you're reading now, that amounts to 4 posts this year.

2015 (9 posts):



Android App: "Sons de Bichos"
Android App: Children's Day: "Urso Taralhouco", A Flappy Bird Look-alike
Android App: (2b) U ~(2b): "Hamlet's To Be or Not To Be Soliloquy"
Android App: "Encryption e Decryption"
Android App: "Zombie Alarm"



2013 (5 posts):




Advanced Python Class: "PacMan" (extra-project)
Advanced Python Class: "Asteroids' Game (Final Project)"
Advanced Python Class: "Blackjack"
Advanced Python Class: "Memory Game"
Advanced Python Class: "Pong Game" (no AI Engine)

Thursday, February 13, 2003

Elegance Personified: "The Art of UNIX Programming" by Eric S. Raymond





(Original Review, 2003-02-13)




My two cents on Unix, C, Gates, Ritchie, Jobs, Apple OS, Windows, C++, Objective-C, Java, BSD, ...

The toe-curling pieces on Jobs were way over the top; rather like Gates, Jobs lifted a lot from other people. Ritchie and co., rather like Tim Berners-Lee, gave the computing world so much, and I do mean gave (let's not be offensive and equate the ham Gates with Jobs).

One problem is that "Software Engineering", whilst requiring some skill amongst participants and constituting a trade, isn't as robust a discipline as the professional engineering disciplines. Having said that, I agree that Ritchie, who remained with Bell until he retired, made a far greater impact on the computing community than other upstarts (including Tim Berners-Lee). When Jobs left Apple and developed the NeXT workstation, its operating system was based on Unix; and when Apple acquired NeXT, that operating system led to Apple OS X, based on BSD Unix. Even PC users can now enjoy the benefits of Open Source computing and install Linux on their PCs, laptops and portable devices - whereas, even with modern Wintel systems, purists may ignore the sacrilegious Windows environment and revert to DOS.

Of course, within the Bell Telecomms Research Laboratory, Ritchie contributed to the development of both C and Unix; this represents a far more significant contribution than even that of Tim Berners-Lee who, whilst at CERN, simply developed utilities for use by other researchers before they were more widely adopted.

Looking back, I would have thought C a far more important contribution than Unix (sans Linux). I love the recursion of it: having written Unix in assembler and then developed C, they re-wrote Unix in C! Elegance personified!

It's unfair to Jobs and Ritchie to compare them with each other. Ritchie was a brilliant backroom boy, and I agree with the praise listed above. Jobs was a product developer, a business man and an evangelist.

It's comparing chalk and cheese. What Jobs did was understand the importance and utility of work by Ritchie or the researchers at Xerox Parc. Without people like Jobs these great inventions remain intellectual curiosities. Without Ritchie there is nothing to develop. You need both and that is why Silicon Valley was successful.

I agree 100%; as I implied before - even Steve Wozniak takes pride in making his friend Steve Jobs' role clear when they worked together at Apple, so I doubt Jobs' friends, family and colleagues will bother about these rather obvious comments that he was no hardcore computer scientist. They were quite a double act.

It would be nice to remember Ritchie without referencing Jobs. They both did good stuff, but as a programmer - even though I'm typing this on a Mac (of course running OS X, which is part of the UNIX family tree that Dennis helped "create") - like 99% of people I don't have a constant Homer-Simpson-like Ritchie vs Jobs battle going on in my head.

This is all getting very Lady Di vs Mother Teresa - the silly Indian lady had the nerve to die in the same week as the pretty one who was far simpler to write about - how rude! (JOKE, by the way!)
Pouring out a Mountain Dew hi-energy on the curb. RIP.

Dennis Ritchie gave us the tools to build the web, modern computer etc. Steve Jobs was the person who combined the concept of the PC with that of a household appliance, and was adamant right from the start that the PC was to be a consumer device that anyone could use.

The Xerox team were still developing for researchers - they didn't view their system as one for the average consumer. Their mouse had three buttons, and a lot of the interaction was via the keyboard, including resizing windows. If you've ever double-clicked, that's the influence of Jobs.

Even fewer people have heard of Alan Kay, who was instrumental at PARC in the development of windowing and object-oriented programming. The point is that they were all instrumental to the modern computer. Dennis Ritchie is feted by programmers who work directly with the tools he created, and I'm pretty sure he'd be happy about that.

Actually, OS X and iOS use the 4.4 BSD-Lite codebase, which by definition contains none of Ritchie & Thompson's AT&T code. However, that doesn't take anything away from the status of the man: he was a true genius and innovator. Also, his C programming language and its descendants (C++, Objective-C, Java, etc.) are even more pervasive than Unix: Windows does not derive from Unix (from VMS, if anything), but it is mostly written in C/C++.

I think a few things are worth pointing out:

  1. C has good points and bad points -- some clever ideas, and some mistakes. The lack of proper string-handling tools and the laxer-than-necessary type system were two mistakes with an expensive legacy in terms of buggy software, student learning curves, failed projects, etc.;
  2. The K&R book is worth mention as a model of great textbook-writing -- clear, readable and brief, without being oversimplified;
  3. Ritchie, Thompson, et al., didn't give their work away. They were paid very decent salaries. Also, BSD Unix started out as a flagrant breach of license terms and copyright, and had to be rewritten to avoid lawsuits;
  4. Stallman's free software idea has probably caused a degree of stagnation in software development. A free product that works drives most of the competition out of the market, but does not hurt the biggest players, so it creates an oligopoly;
  5. If Jobs hadn't persuaded Apple to buy his company and bring him back into the fold, Apple would probably have bought BeOS which, if anything, was a better platform than BSD Unix/Objective C.


Thank you, Dennis Ritchie, for providing me with many years of employment using C, C++ and Java, all based on C syntax. If I had to program in assembler, I'm sure I would have quit long ago. It would be nice if heads of state would come out and praise your accomplishments, but I won't hold my breath.

I am not interested in getting into a language war now, or a platform war. C has its good points and its well-documented bad points. C99, C++, Java and C# all attempt to correct many of the widely acknowledged flaws of the original C, often by borrowing ideas from languages designed in the 1960s and '70s (including Algol, Simula and Smalltalk). I'm sure even you do not think that C is perfect.


Bottom-line: What? No code and still 5 stars? Read it if you want to understand the evolution of the Unix OS, all of its look-alikes, and all of its flavours.

NB: Many eons ago I coded this in the boot script of all Unix user sessions (I was younger then...):

Unix erotica?
%^How did the sex change^ operation go?
Modifier failed.
%make love
Make: Don't know how to make love. Stop.
%sleep with me
bad character
%man: why did you get a divorce?
man:: Too many arguments.


%blow
%blow: No such job.