Saturday, October 21, 2017

C = B * log2(1+S/N): "An Equation for Every Occasion - Fifty-Two Formulas and Why They Matter" by John M. Henshaw


Oh man, I once had a heated debate with someone who was convinced that vinyl gave better sound quality than CDs because it contained frequencies higher than 22.05 kHz. And that those frequencies would "harmonically" influence the lower frequencies, even though he accepted his ears could only detect frequencies below 20 kHz. He was insistent that he could hear the difference, despite his ear essentially being a low-pass filter (well, OK, band-pass, but you know what I mean). Apparently they don't teach linear superposition in some engineering courses.
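
A quick numerical sketch of that point (my own illustration with assumed numbers, nothing from the book): if hearing is modelled as an ideal low-pass filter at 20 kHz, a tone with and without added ultrasonic content is indistinguishable after the filter, which is exactly what linear superposition plus band-limiting implies.

```python
import numpy as np

# Purely illustrative sketch: the ear modelled as an ideal low-pass filter at
# 20 kHz. A 10 kHz tone with and without an added 30 kHz component is
# indistinguishable once the filter has done its work.

fs = 192_000                      # assumed simulation sampling rate, in Hz
t = np.arange(0, 0.05, 1 / fs)    # 50 ms of signal

audible = np.sin(2 * np.pi * 10_000 * t)                            # 10 kHz tone
with_ultrasonic = audible + 0.5 * np.sin(2 * np.pi * 30_000 * t)    # plus a 30 kHz tone

def ideal_lowpass(x, cutoff_hz, fs):
    """Zero out all spectral content above cutoff_hz (brick-wall filter)."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    spectrum[freqs > cutoff_hz] = 0
    return np.fft.irfft(spectrum, n=len(x))

heard_a = ideal_lowpass(audible, 20_000, fs)
heard_b = ideal_lowpass(with_ultrasonic, 20_000, fs)

# Effectively zero (floating-point noise): the 30 kHz content never reaches "the ear".
print(np.max(np.abs(heard_a - heard_b)))
```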

One of the glaring absences is any mention of Boltzmann's contribution: he was the first to use a logarithmic function to connect average uncertainty with the probability of a random variable. Shannon extended this result to the communication setting, proposing two theorems, the source coding theorem and the channel coding theorem, that are the basis of modern communication technology; Shannon's treatment became “Shannon's Law.” Before Shannon, nobody really knew what they were doing. It was all very ad hoc.

Shannon's main contribution was to show us that it is possible to send long messages with an arbitrarily low probability of error, even when the channel itself makes errors, provided the data rate is less than the Shannon channel capacity: C = B * log2(1 + S/N), where B is the bandwidth in hertz, S is the signal power in watts, N is the noise power in watts, and C is the capacity in bits/sec.
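
As a minimal sketch of that formula (my own illustration; the 3.1 kHz bandwidth and 30 dB SNR below are assumed, textbook-style numbers, not figures from the book):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 30.0                       # assumed signal-to-noise ratio, in dB
snr_linear = 10 ** (snr_db / 10)    # convert dB to a linear power ratio S/N
print(channel_capacity(3100, snr_linear))   # ≈ 30,898 bits/sec for a 3.1 kHz channel
```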

Shannon didn't actually tell us *how* to design codes that reach channel capacity. Starting in the 1960s, largely driven by deep-space communication, better and better methods were developed, until turbo coding was discovered in the early 1990s. It comes so close to Shannon's capacity limit that there will probably never be anything much better.

The big missing thing in Information Theory is the attempt to understand what "natural bits of information" are, because Physics and Math dominate the field. There are lots of opportunities here for students who learn to think about the big picture. According to Matrix/DNA Theory models, natural bits are a "living thing", while bits inside wires are artificial bits - the difference between wild animals free in the jungle and domesticated animals kept prisoner in our houses. But a living thing is something that contains the Matrix universal formula for systems, whose first shape is a natural light wave. Academic Science is studying the process of communication between two or more different systems. This process is a two-way avenue. At the point of the sender, the environment is chaotic, so the signal needs to cross this zone until it arrives at the receptor, inside or outside the chaotic zone. Sender and receptor, chaos and order, become, from a mathematical viewpoint, systems of 0s and 1s; so, just as two human systems speak the human language, these 0s and 1s speak the 0-and-1 language. These two 0-and-1 systems are the two brains of two computers, and this is the reason computers do not learn how to think by themselves. A natural bit of information is, in itself, a complete and working system, containing the seven universal systemic functions, or variables. So, these bits can be processed and transformed by natural brains, producing a third new product. I think that quantum computation will need seven variables, instead of the two, 0 and 1. But, knowing the Matrix/DNA formula, we can do it.

Our modern human technology for information transmission is still in the stone age. The most advanced technology for transmitting information will use the "cosmic wave background", according to Matrix/DNA Theory's models. If there is a natural light current flowing through the whole universe, why not use it, instead of manipulating pulses of voltage in wired electrical currents?
But then humans will need a revolution in their understanding of natural light, of natural and living bits of information, and of what cosmic radiation is. Matrix/DNA is investigating these issues. This ex-machina light contains the force that imprints dynamics onto inertial matter, creating natural systems through its life-cycle process. The natural white and positive waves of light emitted by quantum vortexes (these are the first shape of natural living bits of information) at every micro big bang of this pulsating Universe have no time; they are instantaneous, expanding through the whole; the light that we grasp, and whose "speed" we know, is merely these cosmic waves re-transmitted by receptor/sender stations known as stars, pulsars, etc. So, we will have a bit of information transmitted instantaneously throughout the whole Universe. Knowing the distant supreme goal will drive us to develop the technology.

Mathematically Shannon's formula for information is just Boltzmann's formula for entropy. Shannon actually called it entropy and equated information with entropy, his interpretation being that the information obtained when "reading a message" was greater the greater one's initial lack of knowledge regarding the message (as measured by the entropy).
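
For what it's worth, a minimal sketch of that connection (my own illustration; the sample messages are arbitrary assumptions): Shannon's entropy H = -Σ p_i log2 p_i has the same -Σ p log p shape as Boltzmann's, just with a different base and constant.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(message: str) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)) over the symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# The less predictable the next symbol, the higher the entropy, and the more
# information you gain per symbol when "reading the message".
print(entropy_bits_per_symbol("abracadabra"))   # ≈ 2.04 bits per symbol
print(entropy_bits_per_symbol("aaaaaaaaaaa"))   # ~0 bits: a fully predictable message carries no information
```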

In college I did a paper on compression algorithms and information theory; it really is an interesting and expansive subject, and this book gives us a sound explanation of the fundamentals, although it is short on the particulars.

Friday, October 20, 2017

Non-Standard Fantasy: “The Blade Itself” by Joe Abercrombie


I "discovered" Abercrombie in 2012 when I was actually looking for some fantasy novels that "weren't Dragonlance-level shit". Back in 2012 I started off by reading “The Heroes” first. Only in 2013 I got to reading the First Law from the beginning.

Abercrombie does not sugar-coat his narrative. That’s for sure. That’s the first indication you’re not reading your run-of-the-mill fantasy: it’s disturbing because it skews closer to real life than we are used to, or comfortable with, fantasy-wise. Protagonists fail, start things but don’t finish them, have their plans changed in mid-stride, and generally push through as if they were making it up as the narrative progresses. While reading “The Blade Itself” I kept expecting conventional fantasy storytelling to assert itself and bring the characters back around to the “right” path, despite evidence to the contrary. I’m not that well versed in fantasy lore, but I think this first novel in Abercrombie’s fantasy milieu sets a precedent for an ending that just isn’t what you expect, and yet I still kept waiting for that tide to turn back and give me the usual happy ending that crops up in a lot of fantasy nowadays. What I found most unsettling is that there IS a happy ending – it’s just that the last person in the entire book you’d expect gets everything he wants. It was one of those endings, and one of those books, that sits with you for a very long time.

A lot of the fantasy I still read tends to have 'evil' as an abstract, exterior force and 'virtue' as somehow innate and hereditary. There is room for moral complexity in fantasy but a lot of people make good money without bothering, which dilutes the impact of the better stuff (as does the tendency of critics and publishers to pretend that anything not thud'n'blunder broadsword-opera is magic realism or some kind of new genre, and to misprise anything that looks naturalistic until a particular point as 'going off the rails' or 'getting confused'.)

There are probably simple reasons too, but shrugging and saying 'the public like simple' rather weakens your case that there's more to fantasy than that. I reserve the right not to like only what a lot of other people like, but, more to the point, I think other people might like the less, um, generic work if it were more widely available. If I wanted to make a point about the processes of 'othering' in the rather linear world-view of the tabloids, I’d say that many fantasy works examine this process either directly or as a side-effect of the way we read fantasy. Simple 'Good vs Evil' stories aren't as satisfying. I've read all that before. Identifying evil is a tricky enterprise. The smoker and the loud biker were, and maybe in some degenerate places and times still are, considered good, by way of "cool", while in reality they are evil polluters. It's all down to how much bollocks a society can live by before they wake up and smell the coffee of reason.

Most fantasy these days (i.e. the last 20 or so years) has shied away from the cliché of the evil overlord and his countless, faceless minions. In fact, even at its most prevalent, that was only the most visible form of "evil," rather than the most common. I might even go so far as to say that an awful lot of fantasy focused on the snark between the main characters rather than the "good vs evil" thing - hell, the thing everyone remembers from the endless "Dragonlance" novels is the parasitic relationship between Raistlin and his twin brother Caramon, just as a f'r'instance. But to get back to the main thrust of this skewed-sort-of-review: in K. J. Parker's novels the heroes are quite often men and women who would be regarded as evil in other circumstances, and much of his appeal as a writer lies in the moral quandary this creates. I would point everyone reading these words in the direction of his excellent and grimdark-before-it-was-cool “Academic Exercises” and “The Folding Knife”.


NB: I’ve been told repeatedly to read Brent Weeks which I never did (I’m looking at you Bookstooge...). It’s time to rectify this. Weeks is probably the only major fantasy writer I haven’t read yet.

Thursday, October 19, 2017

Tomato Soup is Lava: "Time Ages in a Hurry" by Antonio Tabucchi, Antonio Romani (Translation)



Tabucchi’s notion of time (e.g., aging) is a weird one. I grew up thinking it didn't really exist, that it was just something us humans invented as a measurement, like cm or mm. But I also used to think tomato soup was lava. Time is the only God, because it behaves in exactly the way any self-respecting God should: it continues to do its thing utterly dependably, and ignores everything else. The problem, I think, is that our scientific knowledge of time is so limited that in any discussion, we can't avoid drifting into metaphysics, which doesn't really add to the discussion. Regarding "time" as an entity, I feel we are like a caveman looking at the Mona Lisa and wondering how it was done and what it could mean. We simply don't understand the extent of what we're looking at, and, like every generation, fall into the familiar trap that, because we are the here-and-now, we are the cleverest there's ever been, so we KNOW the answer, when, in fact, we're not much smarter than all the thousands of generations before us. The generations who follow us will behave in exactly the same way.

They don’t understand. When you are young, you don't really believe it will happen to you. 'The old' are a different species. By the time you get it, you are old yourself. Age sneaks up on us. I look in the mirror and ask, 'Who are you and what did you do with my body?' Old age is just the last pages of a tale told by an idiot, full of sound and fury, signifying only the rare pearls you found in an ocean of manure, and letting the glowing memory of those rare pearls play you out into oblivion. Hand me my single malt, please. The upside of being dead? Much of the bad, maybe even the worst, is behind you. You feel no pain, or even mild frustration, when you are dead. That's good!

I used to think I was old when I was 30. Getting old is not something that worries me so much as the inability to do certain things that comes with age. There are many things we do not understand. But I have come to understand two very important things through personal experience. The first one is that we are not alone, and that there are beings who treat us in a similar way to how conservationists treat wild animals, by tagging and observing them. I have come to accept this, and I do not have a need for anyone else to accept it; I accept it myself, and that is good enough for me! The point is: am I getting on in years? We all are! But I can still pedal the living arse off any of the teenagers or 20-somethings around here. I'm still reading about string theory, loop quantum gravity and topology. As Petruchio puts it in “The Taming of the Shrew”: “Where is the life that late I led?” I’ve had this sentence come at me about 10 times in my life, but it's true (I know it’s all in Shakespeare, you dumb ass!). The real meaning of this sentence dawned on me a long time ago, but it only neon-ed up two years ago when I re-read all of my Shakespeare.


I don't doubt that someone, finally, will unravel the mystery of "time", but I don't realistically expect it for hundreds, if not thousands, of years. Meanwhile I’m still on this wonderful journey of reading all of Tabucchi’s body of work. Aging, Time? Bah! Read Tabucchi! It’s all there.

Tuesday, October 17, 2017

I Can No Longer Bear the Aggressiveness of Poetry: "Berlin-Hamlet" by Szilárd Borbély, Ottilie Mulzet (Translator)



"When I came to Berlin, I no
longer
wanted to live. Why isn't
   there a way, I thought, if 
  someone doesn't  want to live
any more, simply to 
         disappear."

In "Berlin-Hamlet" by Szilárd Borbély, Ottilie Mulzet (Translator)

"I do not believe in poetry"

In "Berlin-Hamlet" by Szilárd Borbély, Ottilie Mulzet (Translator)

"I can no longer bear the aggressiveness of poetry,
and I do not wish my deeds to be investigated."

In "Berlin-Hamlet" by Szilárd Borbély, Ottilie Mulzet (Translator)


"My need is for those who will know/how/all of this will end."


In "Berlin-Hamlet" by Szilárd Borbély, Ottilie Mulzet (Translator)


I can't give any more quotes...The book is a long quote.

After having finished reading this heart-wrenching poetry book, my thoughts come back to Hamlet, as always. It's always about indecision... 

Borbély is masterfully able to give us this indecision in a modern version.

Hamlet's main soliloquy reflects the character's conflict and uncertainty after his father's ghost has told him of the sins of his mother and the crimes of his uncle, and he's asking himself what best to do with that knowledge. The best point for this introspection can be debated and played with; it isn't likely that treating it as Hamlet's greatest hit and getting it out of the way first thing is appropriate for character development (although I haven't seen the production, and it may be awesome). But the soliloquy really doesn't refer to his particular situation at that particular moment. There are no first-person pronouns in it at all, and his other soliloquies are much more specific about what's happening to him. It is a generalised piece of philosophical thinking. Beautiful, insightful and compassionate it may be, but it isn't a man deciding whether to kill himself or not. It isn't even especially emotional: there are no exclamations in it (two of his other soliloquies begin "oh").

It isn't an accident that the line 'to be or not to be' is such a passive, neutral construction; it's a meditation on the human condition, not a great emotional outpouring. It only touches Hamlet's own case, and then obliquely, when it gets to "lose the name of action" right at the end. And it's really not anchored very securely in Act III, since nothing immediately before it seems to provoke it, and it isn't the cause of anything that directly follows. I think it might work well as a prologue (though I don't know how well this production made it work). It might set the whole thing up thematically. Olivier in his film used a different speech as prologue, and added his own words: "this is the story of a man who could not make up his mind" (doesn't he also move "to be or not to be"? - Sacrilege!).

It's perfectly possible for a specific individual to make a general philosophical argument, especially if it is entirely in keeping with their character. Hamlet is intelligent and skeptical, a thoughtful student and scholar. All of that is reflected in the way he thinks. You can't imagine Laertes ever having these thoughts. It is the generalisation within the speech that makes it so effective. Hamlet isn't just talking about his own situation (in fact he doesn't really mention it at all); he's talking about all of our lives and doubts. "... And makes us rather bear those ills we have, than fly to others that we know not of" is a wonderful way of turning the whole argument out towards the audience. The context is of a man capable of such extraordinary philosophical thought, trapped within this destructive narrative of revenge. Szilárd played with an un-fucked-about version of Hamlet, but he still fucked with my head. Everybody fucks about with the words, and rightly so. That's what makes this kind of stuff so gut-wrenching.

Should have gone with "the rest is silence". God, I hate this kind of poetry...5 stars because of that. I'll say no more...

NB: This collection was published in the original Hungarian in 2003 and this English version has been translated by Ottilie Mulzet.

NB2: If you want to hear what this particular soliloquy sounds like, look no farther. I built an Android App where you can find all the classical actors reciting it:

Kenneth Branagh
John Gielgud
Laurence Olivier
Derek Jacobi
Paul Scofield
David Tennant
Christopher Plummer
Ethan Hawke
Kevin Kline
Ben Crystal
William Belchambers
Richard Burton
Vincent Price
Mel Gibson
Toby Stephens.

Sunday, October 15, 2017

The Linux Server Encyclopaedia: "Anonymous" by Roland Emmerich



Sigh. 

Sorry to interrupt, but what is it about the nature of our species that is so attracted to conspiracy theories? We can trace this as far back as Homer, and there are plenty of modern examples as well.

If I had a crystal ball, I think it may well show a 2416 Ox/Cam luminary frothing at the bung as he expounds on the impossibility of an illiterate, uneducated Lennon being the author and co-author of his celebrated works. I took an interest in the claims of the Earl of Oxford after the film Anonymous made its preposterous contribution in 2011. I was particularly interested in the fact that the denialists draw so much confidence from their claims to have discovered hidden ciphers in epitaphs and ancillary texts. The Oxfordian method of unwinding these hidden messages (they are never ciphers) involves little more than separating all the letters and making words out of them as if they were a Scrabble bag with two dozen blank tiles. Oxfordians tend to stop as soon as they have found what they want. I was able to go a bit further, whilst sticking rigidly to their 'method'. As a result, I can offer a few new ideas about Shakespeare's favourite books which not even Professor Jonathan Bate may have considered.

1. The Autobiography of Howard Kendall

By far the most distressing revelation for a lifelong Kopite is that Shakespeare was an Everton supporter. As a native of the Midlands, he would have been forced to look north for a credible team to support. How he came to choose The Toffees is a source of amazement, but a 6x48 grille made from the epitaph reveals the legend "Evrtn is grat".

2. The Linux Server Encyclopaedia

It's fairly safe to assume most playwrights of the period, like creatives today, were Mac users, but Will obviously needed industrial-strength servers for his prolific output and showed a strong preference for Japanese hardware. On a desert island, with no online access to help, a cautiously competent techie would surely have taken a manual. A 4x96 grille reveals "Sony btr thn HP".

3. The Brilliant Bumper Joke Book

Much has been written about Will's comic knowledge and his instinctive grasp of the science of timing. His tavern jokes and gag lines like "William the Conqueror was there first" are legendary. No one has explored the possibility that Will may have been an early stand-up comic, yet in his epitaph (a 12x9 grille this time) he clearly left us one of his most treasured punchlines: "Jesus saves, Moses paies owt". I think he'd have liked this book to remind him of his audience.

So remember, whilst almost all of what Oxfordians have to say might look completely ridiculous to anyone with a knowledge of the work, there will still be a legacy after the few who are left have gone.

Of course, Oxfordians don't really seem to like the fact that plays are in fact plays, and they will tend to ignore everything that is known about how plays were produced in the period. Paul Crowley believes that the "canonical plays" were "rarely if ever performed", while William talks of plays being "held back". The fact is that, prior to 1616, the year of Shakespeare's death and of Jonson's folio, the idea of English drama as an authorial publishing venture was unknown. The quartos were published by printers as and when they could get hold of prompt copies, and playwrights did not enjoy the benefits of copyright legislation. Plays were written to be staged, and for no other reason - certainly not just to send coded messages within an aristocratic coterie.

Clear internal evidence shows typecasting, and typecasting determined a good deal of choices for Shakespeare. Hamlet, which Oxfordians explain as a kind of autobiographical fable, tells us much about the context of drama at the time. Hamlet and Polonius (or the actors playing them) joke about the one stabbing the other in Julius Caesar the previous season. Hamlet demands that clowns stick to the script, shortly after a popular ad-libbing clown had been replaced with a more sober, actorly clown.

The fact that plays were a successful commercial form of entertainment is very bothersome to Oxfordians, which is why they try to refute or tone down the idea wherever possible.

Last night I watched the movie again. After having read all of Shakespeare's work since 2011, I said to myself: "Maybe the movie will have some merit after all"... Nope. It was belly-button fluff in 2011, and it's still belly-button fluff in 2017. A word to the wise for any brilliant writers out there - you'd better make sure that when you die, you leave behind you a trail of debris in your personal life to rival that of any of your characters.

If you write great romances, leave ample proof of all those sordid affairs you had, all those hearts you broke! Swoon for all you're worth in front of the cameras, baby, and don't leave the house without your lipstick on. Keep a detailed record of all the illegitimate children you had, and who adopted them, so that DNA testing on your descendants in the 25th century will prove you to be the author of your bodice-ripping yarns. Do not, under any circumstances, die unmarried, undivorced, or worse yet a virgin - the people of the future will mock the very idea of you understanding romance, and will put you in the fraudsters' hall of shame alongside Jane Austen (whose books, as we all know, were really written by her male editor).

If you write spy thrillers, you'd better put a copy of your MI5 file in the safe for future generations to find. Better still if you can leave a copy of your old Stasi file alongside it. Shhh! Don't keep anything more than this or it will make you seem careless... careless like a bad spy who could never have come up with that twist in the ending of that triple-agent novel you wrote, you know the one, only you don't, do you? Because you didn't write it, you liar, you were just the front, the patsy for that CIA operative who couldn't use their own name because they had to do the job of a real man, a job you can't begin to understand. You disgust me.

If you write fantasy fiction and care about the integrity of your legacy, you really need to leave proof of your pagan/wiccan/voodoo/Satanic/other* predilections. This is quite difficult, as a few scrawls in the margins of a tattered copy of the Book of Shadows might not be enough to convince people in the 25th century. Try getting arrested for the ritual murder of a virgin, or at the very least for indecent exposure when dancing around Stonehenge at midnight. Laminate the subsequent newspaper reports to ensure they don't degrade over the centuries, as future generations will consider electronic files too easily faked, and besides, most of them were lost forever in the great EMP war of 2323, which was all a bit convenient for you, wasn't it? Someone still covering up for you after all that time, hmm?

If you write science fiction, for goodness sake don't be an actual scientist! People in the 25th century will understand physics in a way we cannot hope to comprehend and will therefore find your faster-than-light drive hilarious, and refuse to believe that a scientist wrote such a thing, attributing it instead to your alcoholic second cousin who still lived with his mother, as that's what science fiction writers are supposed to do. Please don't tell me you've moved in with your girlfriend. We really are beyond hope now, aren't we?

Follow these basic rules, and you too can die happy in the knowledge that centuries from now, your body of work will not be used as an anti-establishment sledgehammer by an irate cultish group seeking to "bring down the man" by reading fiction in strictly autobiographical terms and calling everyone "sheeple".


To paraphrase Bill Bryson:

"Oxford would certainly have had ample leisure to write the plays after 1604, assuming he was not too dead to work."

Shakespeare wrote some of his finest plays after the death of Oxford. That's how stupid these people are. Shakespeare belongs to us, not the inbred, narrow aristocracy and thick actors. To the tower with them.

NB: It always amazes me what some people get obsessed by. Engaging with most "Shakespeare didn't write Shakespeare" advocates is a bit like being button-holed by someone who thinks they can prove the Ark of the Covenant is really buried under Birmingham New Street Station, and prove it mathematically based only on the Book of Revelation and the paintings of Rembrandt.

Saturday, October 14, 2017

Entangled Strings: "Theories of Everything" by Frank Close


I’ve got a theory that the rules of the universe ARE created by people thinking up theories about it. Although, due to elitism bias, I am yet to receive any funding for my groundbreaking “hypothesis.” Fucking scientist bastards, getting paid for thinking about stuff they think I can’t understand... what a scam.

I suspect that a lot of the hostility towards, and rejection of, science by people who can't understand it is because it makes them feel stupid. It is, after all, fundamental to understanding how the world works. Some people are scientists; some people are not, but know what science is; but some people not only don't understand science, but don't know that they don't know, because they can't even see it. This is a bit analogous to being able to read. Some can go into a library and read in a few languages, some only in one, others know what books are but can't read them, and some don't actually know what books are and feel stupid, so pretend that they either don't exist or are some sort of conspiracy against them, which makes them feel important. There are theories around which involve such complex mathematics that only a handful of people in the entire world can understand them. Peer review is not much use here, and so enters this new age of egg-heads trying to “out-complexify” each other.

You have only 12 dimensions? ...... pffft... Look here, I have a closed equation which explains life, the universe, and everything with dimensionality to the power of infinity minus 1. Theories aren't always testable; however, science tends to disregard theories which aren't testable, because they do experimental physicists out of a job (and if it's not testable then it becomes a matter of belief rather than science). Quite extreme theories are potentially needed to displace quantum mechanics/relativity, because they tend to be harder to test. The Bohm interpretation of quantum mechanics offers explanations for things standard quantum mechanics doesn't explain, but it also produces results identical to those of the standard Copenhagen interpretation. Some disregarded it solely because it appeared not to be testable, though in recent years suggestions have been made for possible deviations from standard-model results.

Take, for instance, one of my favourite cases: Holonomics, i.e., the idea, also partly due to Bohm, that the universe is a multidimensional projection of a two-dimensional one. It is largely ignored because it appears to be untestable. Where theories are untestable, proponents tend to spend much of their time trying to come up with experiments that will allow them to be tested.
String theory has had a similar problem.

An important part of physics is figuring out how theories can be tested. As a pupil I was pretty good at mathematics and physics. So I'd conclude that our ordinary four dimensions plus the six extra dimensions would result in a ten-dimensional universe. Would I be right? Indeed I’m frigging right! But there are other versions of string theory that call for differing numbers of dimensions. After 10, I believe 26 is the next mathematically credible number. But the question is: “Will it make my cornflakes stay crispy in the milk?” Answer: “In the 5th dimension they will be crispy, in our dimension they will remain soggy. In the 6th dimension they will be a moonbeam. I'm sure I saw one of the extra dimensions doing a spot of shopping for the weekend, in Lisbon, last Thursday. It had nice legs and a cotton frock.”

String theory is the theory that matter, energy and women are made up of tiny strings. It states that whenever you put a set of perfectly arranged strings in any container, they will come out completely tangled, no matter what the arrangement or the container. The aforementioned three ingredients (plus lard, which acts as the glue) give rise to various elaborate, sophisticated and highly complicated and yet subtly simple and non-functioning existences, such as: iPod headphones, Christmas tree lights, garden hoses, electric cords, string panties, shoelaces, my Kodi player, my Synology NAS, etc.; although surprisingly beautiful and functioning constructions have also appeared, such as horse intestines, beetle legs, belly-button fluff, and the area behind your computer desk, which still has a lot of trash from the last century. The answer to that is loop quantum gravity, an opposing theory to string theory (the Higgs boson you happen to have heard about, though, is evidence for the Standard Model rather than for either of these).

Seriously. Something that I've always found difficult to get my head round in "simplified" explanations of multidimensional physics is the concept of dimensions rolled up so small that we can't perceive them. (And I'm well into mathematical physics, though not this specialty). An explanation I've come up with, with the request that it be criticised and corrected if possible, and the hope that it might be accurate enough to help:

Imagine a creature living in a perceived one-dimensional world, a cotton thread (not necessarily straight as viewed by an outside 3D observer). The creature will only see one dimension (with 2 directions, forwards and backwards). Its senses are not sensitive enough to discover this, but actually the thread is 3-dimensional: the perceived dimension is extension along the thread, yet the thread also has a diameter and an interior, so that expressing a position microscopically would require 3 coordinates (in cylindrical coordinates: extension along the thread, distance from the centre, and angle relative to some arbitrary radial axis). The values of the 2nd and 3rd coordinates would always be infinitesimally small (for a very thin 1D-in-3D universe), and space would be seen as one-dimensional. How’s that for a visualization? Pretty neat, eh?
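
Here is a toy numerical version of that thread picture (my own sketch, with assumed numbers, offered in the same spirit of "criticise and correct"): scatter points inside a very thin tube and check that the distance between any two of them is, to within a nanoscopic correction, determined by the coordinate along the tube alone, so the two rolled-up coordinates never register.

```python
import numpy as np

# Toy sketch: a "1D-in-3D" universe modelled as a very thin straight tube of
# radius R. A microscopic position needs three coordinates (s, r, theta), but
# because r <= R is tiny, separations between points are dominated by the
# coordinate s along the tube; the two extra dimensions are imperceptible.

rng = np.random.default_rng(0)
R = 1e-9                                   # assumed tube radius: arbitrary, just "very thin"
n = 1000
s = rng.uniform(0.0, 1.0, n)               # coordinate along the thread
r = rng.uniform(0.0, R, n)                 # distance from the thread's centre
theta = rng.uniform(0.0, 2 * np.pi, n)     # angle around the thread

# Embed the points in ordinary 3D space.
x, y, z = s, r * np.cos(theta), r * np.sin(theta)

full_3d = np.sqrt((x[0] - x[1:])**2 + (y[0] - y[1:])**2 + (z[0] - z[1:])**2)
along_thread = np.abs(s[0] - s[1:])

# The correction contributed by the two "rolled-up" coordinates is bounded by
# 2*R, i.e. about a billionth of the typical separation along the thread.
print(np.max(full_3d - along_thread))
```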

I agree general relativity is at a dead end when it comes to a theory that explains how the force of gravity is transferred. Perhaps it is time to shake it up and take a close look at the fundamentals behind some current academic research. The challenge of fundamental analysis goes to the young, up-and-coming academic researcher who is actively seeking solutions and innovation. If this is you, then the principles of atomic gravity are your starting point! It may be your time to race past your peers, gaining both prestige and a great career path. The principles of atomic gravity are tools used to advance academic research in the natural sciences. The principles describe how the force of gravity is transferred within atomic structure. Understand the principles to understand the bigger picture.

The next step is easy. A summary of the principles can be found using Google. It is better to understand the principles now before spending too many years chasing ghosts like the many vested current academic pre-retirees and retirees whose past research centered on the fundamentals of gravity through the theory of general relativity.

New ideas are born and old theories fade away, demonstrating how the evolution of scientific knowledge has advanced throughout human endeavour. Take a step forward and get in the lead!

Friday, October 13, 2017

Literaryness Made Easy: "The Edge of the Horizon" by Antonio Tabucchi, Tim Parks (Translator)


Of course it's the old "can you teach talent" argument, isn't it? That's the meaty question, the puzzler of substantial length and girth that needs to be grabbed firmly with both hands. What produces worse writing: people striking off alone, with nobody to tell them to stop and their critics being self-selected (because you see a lot of that online in fandom communities), or people going to study creative writing and, much like Larkin claims parents do, getting fucked up by their teachers' preferences? Books aren't quite the same as music; there are fewer chances for an obviously wrong note that doesn't fit. Even a single poorly chosen word in a 50,000-word novel is often far less jarring in the grand scheme of things than a G# when you expect a G in a 10-minute concerto. As they say, even Homer nods. Of course, if you open a book and it begins "It was the best of times, it was the best of times", then there's a problem. And "bad" is just a really broad term. A book might be beautifully written but completely morally repellent, and I'd call that bad; it might have a thrilling plot but contain nothing but dull clichés and poor imagery, and I'd call it bad. I'd even call a book bad if it was great for three quarters of its length and then had an awful ending. All these different “badnesses” are forgivable by different people to different degrees; I'd be kinder to a book which just had a bit of a flat ending than to a book that thoroughly endorsed objectivism as a moral philosophy as its sole Daseinszweck. I'd be more forgiving of something that used cliché and well-worn archetypes with brio and enthusiasm and a little inventiveness than of something that tries so hard not to be formulaic it feels like a schoolchild told they can't use "got, nice or went".

When is a literary novel like this one by Tabucchi worth reading? It depends on how you define "culture." Literary novels were certainly an emblem of high (educated) culture as opposed to low (mass) culture, much like classical music. How did one truly get educated 50 or 150 years ago? You read seriously, including literary novels. There was no PC/Web on which to waste time. Right after I graduated from college, I spent the better part of a decade reading literary novels; best thing I ever did. My daughters are voracious readers, but of course it is all serialized apocalyptic teen fiction. For a while I have been telling them that they will soon be reading classic novels, and that they will grow to appreciate them and the genre. But as I write these words I realize that maybe they won't. The Millennial generation will be well-educated and able to do difficult work, but they probably won't read novels. They’re not wired like that.

Is there any reason to think that writing itself will be around in the future? Once upon a time, not that long ago, people lived without it. In a future of virtual reality and brain-to-brain interfaces, who says it will still be needed? We've gone from oral storytelling, in which small groups made their own imaginative creations from the ever-varying iterations of various storytellers - to writing, in which one storyteller addresses the imaginations of millions - to cinema, in which one storyteller eliminates the need for anyone to imagine anything. Maybe the next step is one story, one storyteller, one humanity, and no ability to imagine anything individually. And one sensory feed to rule them all... What I'd love to see is the return of the SHORT novel of great beauty and clarity, of the kind “The Edge of the Horizon” so masterfully exemplifies.

The contemporary writer is so passionless. So stale. Such meandering, somewhat antiseptic prose causes you NOT TO REALLY CARE. I always think it's a terrible crime when a novel loses the opportunity to get to the heart of the matter: the rape and pillage of all that is considered art, and its hara-kiri at the feet of the Consumer.

Since I learnt to read, I have always spent a lot of my spare time reading. I plough through books at a rate of knots. I used to buy books all the time; now I read on the Kindle just as much. I hate it when a smack-addict of a writer uses words that send readers willing to read him scurrying for a dictionary. That’s what loses readers. Then there's the idea that "Complexity" and "Literaryness" (and their adjuncts "Depth" and "Meaning") are things that writers consciously write into their books - that they sit down and say "I'm going to write a Literary Novel" - as if Literature were something you add to a work rather than a post-fact label ascribed to works that stand out from the crowd. Now that sounds like cowardly equivocation of my own - "literature's, like, totally relative, man" - but I think there's something to it. Throughout history a frankly tiny proportion of the massive corpus of books ever written endures and gains the label "literature" and, as many defenders of popular culture will tell you at great length, some of them were written as "popular books!" (It's always Shakespeare, Dickens and Austen, I find - and they'd all apparently have written comics/TV shows/whatever you're trying to sell).

So if "literature" is best applied as a label post-fact to the best writing an era can offer, perhaps what's really stultifying - and killing - culture is the efforts to capture that zing, that unknown quality that means you remember “The Merchant of Venice” over “The Jew of Malta”, and turn it into something you can mass-produce, teach in schools and writers can use as their selling-point.  

Narrative in written form has been mined until the seams have been picked clean in places. To create new voices like Tabucchi’s, to tell the seven basic plots in new and interesting ways, has become more difficult as there are so many out there doing it. Yet what has already been written is alive and out there, waiting for new readers to discover. I’ll add that this novel doesn’t reach 100 pages… No small feat, considering the punch it packs. Tabucchi’s language is a wonderfully rich one. Should we impoverish it by stripping it of perfectly good words and phrases simply because they are uncommon? Why is a phrase construction that you and I know a better choice than one that we don't know but Tabucchi does?

With Tabucchi I need not fear the usual House Syndrome:

1) Patient has strange condition.
2) House treats patient, assumes that they're cured.
3) Patient gets worse. Patient is on the verge of death.
4) House has epiphany. Cures patient.

It's hard enough these days to find time to get through books that aren't of the order of “The Edge of the Horizon”. Not that I worry too much about that, though - I think the best reading years of my life were in my twenties, when during the five years of my Engineering degree I was free to read novels and poetry all day in cafes and parks, and to go out drinking at night and do other unmentionable things. You can't do it forever, though, and there's a lot of stuff that I suspect I just couldn't be bothered with now, stuff that I read almost religiously and with enormous excitement when I was in my early to mid-20s - Kafka, Hemingway, Kerouac, TS Eliot, DH Lawrence, Malcolm Lowry, and so on. I'll never be mad for all that stuff the way I was back then. Living is easy when you're young, so you challenge yourself with art. When you get older, living itself becomes the main art you have to cultivate, and high falutin books don't hold the same fascination.

I can still tear into an Antonio Tabucchi like this one, though, no problem there. It’s not perfect, but it’s still better than most of the crap being published nowadays. Maybe Tabucchi has taken modernism and post-modernism as far as they can go (e.g., I can’t stand most of what’s being labelled and published as post-modernism in this day and age). Maybe it is impossible to trump a novel where everything is a chain of imaginings: a senile mind imagining an uneducated comatose mind imagining a lunatic mind. If so, it is a good thing that the final modernistic novel is not difficult at all and is, nerdishly, very entertaining: I was disturbing other passengers by laughing sometimes while I was reading it. I know. I’m weird. And the way he writes freely, without caring about trying not to sound pretentious, is very likable, in fact charming. But modernism is not the only way of telling a story. Other movements will appear. Art, generally, is light and spectacular at present. But this is just a phase. It will pass after the dominance of old men has been broken. The plutocracy will not always be forcing the young to waste all their time on useless work. Nor does it matter if appreciation of novels is confined to a cult. The Portuguese-speaking population with practical access to Portuguese-language novels is already greater than it was in 1950 by at least one order of magnitude. There will always be people who will appreciate the simplicity of pure text, without the complication of sound and visuals.

It cannot happen in the near future. But there will be a renaissance of all the arts starting shortly before, or during, or shortly after the collapse of the plutocracy. I reckon.

The Edge of the Horizon is us.


NB: This time I re-read the book in English to see whether it would hold up. It did. One word: Marvellous.