
Sunday, September 17, 2017

Programming is Like Music: "Python - Become A Master In Python" by Richard Dorsey


Just what is the fascination with spreadsheets? I played with them on my Spectrum in the 80s, but they weren't very useful. I used a spreadsheet on a Psion handheld in the 90s to keep track of some data, and nowadays I have a spreadsheet in LibreOffice to track my expenses and work out my tax (an estimate, since obviously you need a proper package to get it right). I've worked in places in the meantime where bosses think that Excel is a suitable tool for project planning. It isn't. But if you only give people a hammer, everything looks like a nail to them.

As a programmer myself, I'm finding this whole thing fascinating. The quality of the kids' programming output (and yes, it is programming, not 'coding') is going to be directly proportional to the ability of the teacher teaching them. Even setting aside that concern over the quality of teaching, I have a big worry that this will go the way of foreign-language learning in school. It's a subject that needs self-determination and a lot of time spent outside of the classroom to truly get to grips with. Without these two things, pupils will probably grow to despise the subject, and we may even start to put off future would-be programmers. Children as young as four have been learning programming skills in the classroom for many years with programmable toys: Big Trak, Roamers and BeBots are some examples which have been whirring around on the floor. Disguise a robot as a sheep and get it to run away from the farmer, or program a lifeboat to reach a sinking ship, and so on.

But programming is hard; very hard. Heartbleed and the concurrent Apple invalidation of security certificates in their software demonstrate how bloody hard it is. Teaching children to code is analogous to teaching them to make nuclear bombs. Though I think it’s not so much like teaching them to make nuclear bombs; it’s more like doing physical education with the goal of teaching them all to be fast bowlers. Or music with the idea of trying to make them all composers of classical sonatas.

Python is the right choice, and it really is easy as languages go. But for most people, even learning Python is going to be frustrating to the point of impossibility. You could try LiveCode, also open source and a bit like the old HyperCard. Or you could try learning the Bash shell or Awk, both restricted-purpose non-GUI languages which may be more accessible because they have very clearly defined purposes and limits. Or you could try the GNOME package Zenity. Python is very general, and it has the added complexity of lots of IDEs.

The problem most people have is conceptual. Their minds simply do not work like that, and there is no particular reason why they should. Most people will not be able to be good fast bowlers either; they are perfectly fit, healthy and intelligent people. Inability to programme is no bar to learning or achievement of all sorts. It is much more important to know how to set up an OS, how to set up a network, and to understand something about security and servers, permissions, users, all that stuff.

Still, Python really is simple when you compare it to a language like C. For example, to create an array of the even integers from 1 to 100 in just one line of Python, you can use a list comprehension:

myArray = [x for x in range(1, 101) if x % 2 == 0]
Try doing that in C and you'll end up with something like this:

#include <stdio.h>

int main(void) {
    int myArray[50], i, index;

    /* collect the even integers from 1 to 100 */
    index = 0;
    for (i = 1; i <= 100; i++) {
        if (i % 2 == 0) {
            myArray[index] = i;
            index++;
        }
    }
    return 0;
}

Wait! Why would I want that in an array I have no idea...

This looks much better:

for (i = 0; i < 50; i++) array[i] = (i + 1) * 2;

In any case, what does the length of the code matter?

What matters is the readability and clarity of the code, and how fast the program runs.
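On that note, and for what it's worth, the list comprehension above isn't even the clearest Python. Assuming the goal really is the even integers from 2 to 100, a plain range with a step argument does the job with no conditional at all (a minimal sketch, not the only way):

```python
# Even integers from 2 to 100, using range's step argument
# instead of filtering every number through a conditional.
evens = list(range(2, 101, 2))

print(evens[:5])   # → [2, 4, 6, 8, 10]
print(len(evens))  # → 50
```

Same result, and a reader sees the intent (step by two) rather than having to deduce it from a modulo test.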

Having learned BASIC, Z80 machine code and assembler in the early 1980s, I would say that the new kind of mental processing I needed to master to be able to create programmed solutions to problems has proven very useful in all manner of situations requiring clear thinking since that time. The big problem with learning this stuff is getting over the jargon and meeting the standard of assumed prior knowledge. They will also need to teach kids quite detailed machine architecture, otherwise this scheme will fail.

Programming is like music or creative mathematics. Only 10 or 15 percent of the population are going to be able to do it. An even smaller percentage of current teachers is going to be able to do it, still less to teach it. The idea that we take a year, teach all teachers to be programmers, and then have them teach all children to programme? It's simply mad. Not only is it impossible, it is squeezing out of the curriculum the teaching of something that is much more useful and which it is possible to teach everyone: systems management. Setting up computers and networks, troubleshooting, installing operating systems, servers and the like. Files and file management. The command line. Elementary scripting to the extent necessary to use the command line properly. In short, how to manage computers and networks, not how to write programmes in two languages. Teach this, and you will be giving children a valuable general-purpose skill they will use in employment and in private life. And it is possible to teach it to almost everyone.

We don't try to give all teachers a knowledge of music composition next year, and have them then teach it to all children the following year. This is as crazy an idea as that would be. The only result is that we will prove once and for all by a wonderful national experiment that programming is a very specific and comparatively rare ability. And in the process, we will make a lot of perfectly intelligent and able people feel totally stupid and frustrated, when we could have given them useful and enjoyable instruction in things they could learn and would use.

Having this stuff ingrained young means it's part of the way you think for life, and it's hard for today's adults to estimate how much of this knowledge will be needed in the future just to have access to decent jobs; much as typing, in the age before computerisation, was the way into higher-paid clerical, administrative, and executive roles.

School should be as much about teaching kids to learn as it is teaching them what to know. The distinction is subtle but important.


Bottom-line: Will Dorsey's book help on this road to computer literacy? Nope. It is too short and lacks the material one needs to learn how to program in Python, though I'm not even sure that was the author's intention. I don't really know what rationale this type of programming book fulfils, to be honest. How can anyone become a master of Python programming without the use of classes (strangely absent from the book)? Mind-boggling, to say the least…
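To make the complaint concrete: classes are how Python models anything beyond a throwaway script, so a book promising mastery can hardly skip them. A minimal sketch of the sort of thing the book never reaches (the `Expense` class, its fields and the tax rate are my invention for illustration, not anything from the book):

```python
class Expense:
    """A single expense entry -- the kind of thing a spreadsheet
    row becomes once you outgrow the spreadsheet."""

    def __init__(self, description, amount):
        self.description = description
        self.amount = amount

    def taxable(self, rate=0.23):
        """Estimated tax on this expense at the given rate."""
        return self.amount * rate


# A list of objects replaces a block of spreadsheet cells.
expenses = [Expense("hosting", 120.0), Expense("books", 45.5)]
total = sum(e.amount for e in expenses)
print(total)  # → 165.5
```

Nothing exotic: a constructor, an attribute or two, a method with a default argument. That a "master" text omits even this much is the heart of the objection.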

Saturday, May 27, 2017

WannaCry Ransomware



Question? As Linux is open source, is there not the chance that hackers can find vulnerabilities more easily?
Answer: No. Since it is open source, defects are easily found by competent engineers and patched quickly, as you'd know had you any competence yourself.

We are frequently told that proper architecture and solutions are too expensive and that they need to be more "pragmatic" (i.e. cheaper) in their approach and everything will be fine. The reality is that it doesn't work. The direction comes from the top; project and program managers are under pressure to reduce costs as their number one priority. Ministers take the line from those who tell them about cost reduction, not from experts who are "just being perfectionist" and "scaremongering".

There's a few other things in here. Security Architecture and Data Protection strategies need good threat and risk modelling and the application of year by year transformation. Overwhelmingly the 80/20 rule applies and that 20% (essential, difficult, error prone) gets descoped. We are only just getting to the point our expertise can handle the challenges at scale. This is because the last 20 years have seen the birth of the Internet and all its attendant tech... but even if we have reached the end of the beginning (and I would argue we haven't quite hit that, the IoT is just birthing all around us) things will not stand still, and our new-found powers won't keep up. Add to that the impotency of people relying on one and only one operating system without the ability to embrace other much safer and locked down systems and you know why IT literacy is thriving. And this start in schools where the “OS with windows” is basically creating a generation of young people not knowing that knowing how Excel works is not IT knowledge. Nevertheless, I rather resent the implication that this is a generational problem. The fact is the tech-savvy and the tech-incompetent are well-represented in every group. Being able to use instagram or attach a photo to an email doesn't mean that the person doing it necessarily has the first clue about how any of this stuff works, or indeed how to avoid security problems. It is certainly true that few politicians seem to understand how any of this works. Unfortunately, neither do their interlocutors in the media who seem quite capable of pinning down a slippery spokesman like the Portuguese one I saw on the TV this last week whose surname rhymes with “turbulent”, blaming the previous governments and companies for this latest security breach. Things are not so clear-cut.

If I'd built a car so badly designed that it not only crashed all the time but anyone could steal it, you could call it ShittyOS. Human nature being what it is, what can be easily stolen will be. And if, rather than redesigning it, I just added reinforcements, patches, bits and pieces, and lock after equally flawed lock onto those bits and pieces, you might expect that no one would buy it and that victims of its failures would run to the courts. But inexplicably this glutinous tangle of threads, patches and flat tyres still has a market, and the idea that the maker is responsible doesn't seem to occur to the owners of ShittyOS products. Instead, attention is drawn away from the very simple point that the current administrations are culpable both for ending a support contract with some software houses I won't name here, and for failing to provide the resources to upgrade all the older machines from their older OS.

I think this problem is endemic in the public and private sectors. How many businesses were affected by the Dyn outage last year, the AWS S3 outage some time ago? The list goes on.

Protecting services and ensuring they run smoothly costs money, and it also requires a proactive stance rather than a reactive one. Board members often (wrongly) see little value in upping their IT spend, and some IT professionals are apathetic about the confrontation it takes to push initiatives forward. Commoditisation of the IT sector has also contributed, with many providers in the 'race to zero' and therefore devaluing such services and conversations.

We rely on our IT, and services can crumble if there's a problem. Budgets need to reflect this fact better. I don't see this as a generational issue; an educational one, maybe; a reactive-versus-proactive mentality, definitely.

By the way, do you know who the real beneficiary of the global ransomware attack will be? No, not the hackers. It will be none other than the software houses. I can almost hear the champagne corks pop, because the attack will force companies and governments that still use WinXP, Windows Vista, Win7 or Win8 (or Windows Server 2003) to switch to Win10 (or Windows Server 2012+). Incidentally, it appears it was Win7, and not WinXP, that was the bad guy in this picture.

People think IT is a one-time spend that will work for its lifetime independent of investment. This is just not the case: IT is an ongoing expense, and should be one of the first things allocated in the budget. As much as you espouse the free OSes, users just aren't ready to learn them, and a lot of applications do not support them. It surprises the hell out of me, though, that they let the contract with some software houses end without renewing it or updating the machines. That is just stupid. And simply suggesting that "switch to Linux" is a flawless solution is pure hubris, both because of compatibility (have the diagnostic tools been written for non-Windows operating systems? It's not like you can just copy the .exe over and expect it to work) and because it's not 100% secure, as so many people seem to fallaciously claim (remember Heartbleed?). Regardless, I'd always recommend Unix or Unix-based systems, because they were engineered for security from the start, for heavy-duty use; but glossy marketing hype ensures we still use effectively the same old ropey ShittyOS that has been around for generations. Unix and BSD-kernel systems were indeed built to be far more resilient against virus infection and for scalable, robust operation.

Unix is used to manage the NY stock exchange, the ATM Banking systems and so on. All. Mission. Critical. Systems.

Any computer at work or home based on a UNIX kernel is hard to crack, and malware finds it even harder to spread there.
If you can't go the Unix/Linux way, Windows 10 has made some major advances in security. So much so that installing a third-party anti-virus product would make my system more vulnerable than using only the included Windows Defender. Even Google engineers now recommend relying on Windows Defender alone.

What’s the bottom-line? It's not 'tech experts' that are necessarily needed. What is required are senior IT managers that understand the issues and listen to the technical experts.

I vividly recall a period in my career when part of my brief was to oversee the implementation of management information systems. The constant battle was against a short-term accountancy mentality (and I knew even then that we had too many of them and too few practical technicians) which laboured under the delusion that computerised systems offered immediate savings for a modest one-off expenditure. Which they don't - any decent system will take time to implement, and will probably require a short-term rise in costs (apart from the capital expenditure) in order to glean any long-term savings.

And then there is the phenomenon of data growth that systems often generate, whereby what was an impossibility becomes possible, and expands the role of a particular activity. The syndrome illustrates poor IT management understanding - at a wider level - of the role of investment.

What about this specific WannaCry issue? No, Microsoft doesn't have a duty of care here. If they had suddenly stopped providing security patches with no warning, then they would be at fault. But that's not the case. Windows XP has been unsupported (so-called 'End of Life', or EOL) since 2014, a date that had been widely known since 2007; Windows 7's end-of-support timetable has likewise been known since its release in 2009. The reason for this is that security patches don't write themselves, and there logically must be a cutoff point where the software needs to be written off and upgraded (kind of like how, the fourth or fifth time your car fails its MOT, the repair costs eventually become higher than the value of the car and it's far more worthwhile to get a new one instead of patching up the old one).

Windows XP has been around for almost sixteen years now, Windows 7 for almost eight. That is ancient when it comes to software. It has long been time to upgrade, and the risks of not doing so were well known by those who decided not to. The emails that were opened should never have got that far; it's that simple. Of course, end-users cannot be trusted to act sensibly: they open suspect emails, click on the links inside them, and wave away screen messages asking, "Are you sure?". Heck, one of mine even clicked the button when the whole page was in a foreign language and she literally didn't have a clue what she was agreeing to. They trust the system providers to protect the system.

This isn't only a matter of "government and company cuts"; it's IT management failure. There should be a sudden blossoming of job adverts for IT staff and managers. Time for the dead wood to be identified and thinned out.

Once the email containing the worm got inside a network, it did not need any email attachment to spread further. The worm's encryption payload triggered worldwide on May 12th, but the worm itself had been quietly spreading from machine to machine for weeks before that. It seeks out machines that were not patched with the fix Microsoft issued in the preceding two months: any Windows machine (10, 8, 7, or XP) not patched since March was vulnerable, even if its user never opened any attachments. Once an unpatched machine reachable from an infected machine on the same network or wide area network was switched on, it would become infected and would begin searching out vulnerable machines itself.
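The propagation pattern described above, one infected machine scanning its network for unpatched peers, can be sketched as a toy simulation. To be clear: the machine names, network layout and patch status here are entirely made up for illustration, and this models only the spread, not the actual exploit:

```python
# Toy model of worm spread: each round, every infected machine
# infects all unpatched machines reachable on its network.
network = {
    "mail-server":  ["alice-pc", "bob-pc"],
    "alice-pc":     ["mail-server", "bob-pc"],
    "bob-pc":       ["mail-server", "alice-pc", "file-server"],
    "file-server":  ["bob-pc"],
}
patched = {"alice-pc"}       # patched since March: immune
infected = {"mail-server"}   # the machine that opened the email

changed = True
while changed:
    changed = False
    for machine in list(infected):
        for peer in network[machine]:
            if peer not in infected and peer not in patched:
                infected.add(peer)
                changed = True

print(sorted(infected))  # → ['bob-pc', 'file-server', 'mail-server']
```

Note that `file-server` is never touched by any email: it falls simply because it is unpatched and reachable from an infected peer, which is exactly why "don't open attachments" was not enough.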

Imagine people take lots of LSD and keep jumping off balconies, breaking their legs and backs and/or killing themselves. You could say "we need lower balconies", or you could get rid of the LSD. Anyone with a degree in computer science will tell you that whatever IT system you put in place, it will be hacked. There are many reasons for this; one of the simplest is that the people who develop the security systems are the same ones who hack them, since they know how. Any senior person in industrial IT will tell you this. Modernity and its defenders will happily say that we need to lower the balconies, whilst our LSD-stoned friends now wander into the road and are killed by oncoming traffic. IT is the problem, so we need more IT to counter it?

No, not really.