Coding with two heads

I love stories like this, ones that peel back the layers of hard work behind real innovation to reveal something that's often counter-intuitive.

This is the fascinating story of Google's only Senior Fellows – Jeff Dean and Sanjay Ghemawat – a pair of coders who solved foundational problems for Google and helped create the Internet experience as we know it, by coding together.

To solve problems at scale, paradoxically, you have to know the smallest details

– Alan Eustace, Google

Source: The Friendship That Made Google Huge — newyorker.com

What we get wrong about technology

Source: What we get wrong about technology — ft.com

The point of this article will be clear to anyone struggling to fit new technology into existing processes.

The article stops short of making predictions about the future, but its insights from the past are illuminating.

To become really transformative, electricity required the reinvention of the manufacturing process, worker skills, factory architecture and more. Simply putting an electric motor where the steam engine used to be did very little.

In the same way, replacing a typewriter with email and Microsoft Word actually does very little in real terms.

Garbage in, Garbage out

Machines learn the racial and gender biases embedded in human data.

Let's not assume AI will be evil or wise. AI see, AI do, like any monkey. At some point it may grow up and learn ‘good’ from ‘bad’, but that's debatable.

Machine learning algorithms are picking up deeply ingrained race and gender prejudices concealed within the patterns of language that humans commonly use, scientists say.

For instance, in the mathematical “language space”, words for flowers are clustered closer to words linked to pleasantness, while words for insects are closer to words linked to unpleasantness, reflecting common views on the relative merits of insects versus flowers.

The latest paper shows that some more troubling implicit biases seen in human psychology experiments are also readily acquired by algorithms. The words “female” and “woman” were more closely associated with arts and humanities occupations and with the home, while “male” and “man” were closer to maths and engineering professions.

And the AI system was more likely to associate European American names with pleasant words such as “gift” or “happy”, while African American names were more commonly associated with unpleasant words.
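Out of curiosity, here is a minimal sketch of how such associations can be probed with off-the-shelf word embeddings. It assumes the gensim library and its downloadable pretrained GloVe vectors, and the word lists are illustrative rather than the exact stimuli used in the research:

```python
import gensim.downloader as api

# Load a small pretrained word-embedding model (downloaded on first use).
model = api.load("glove-wiki-gigaword-50")

# Illustrative target and attribute word lists (not the study's exact stimuli).
flowers = ["rose", "tulip", "daisy", "lily"]
insects = ["cockroach", "mosquito", "wasp", "flea"]
pleasant = ["love", "peace", "happy", "gift"]
unpleasant = ["hatred", "war", "awful", "filth"]

def mean_similarity(targets, attributes):
    """Average cosine similarity between every target/attribute word pair."""
    sims = [model.similarity(t, a) for t in targets for a in attributes]
    return sum(sims) / len(sims)

# Flower words tend to sit closer to pleasant words than insect words do.
print("flowers vs pleasant:  ", mean_similarity(flowers, pleasant))
print("flowers vs unpleasant:", mean_similarity(flowers, unpleasant))
print("insects vs pleasant:  ", mean_similarity(insects, pleasant))
print("insects vs unpleasant:", mean_similarity(insects, unpleasant))
```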

Source: AI programs exhibit racial and gender biases, research reveals

Who needs CGI?

Stan Draws Spaceships – Beautifully

This deceptively simple hand-drawn animation, created out of passion, has beautiful scene framing, composition, perspective and cuts, all of it synced with the narrative to bring the subject alive.

Most of the heavy computer-generated animations you typically see don't even come close, relying more on graphic detail than on storytelling.

Goes to show that knowing how to use a tool – typewriter, paintbrush, graphics software – doesn’t make you a storyteller.