Showing posts with label technology. Show all posts

Thursday, December 20, 2018

Insanely Great

Insanely Great: The Life and Times of Macintosh, The Computer That Changed Everything
© 1994, 2000 Steven Levy
304 pages




Apple Computer had already made its mark before 1984, pioneering personal computers long before IBM entered the consumer market. In January 1984, it hoped to make a larger one -- to make a dent in history. So it did...just not quite the way its creators intended. Insanely Great chronicles the history and influence of the Macintosh computer, which became the company's chief product before its wildly successful consumer devices of the 2000s. Originally written in Apple's lost years, when it hemorrhaged talent and could not find a stable hand at the rudder, the book includes an afterword on Jobs' recent return. It's a history that doubles as a labor of love, because it carries a biographical thread concerning Levy himself -- a man grudgingly seduced by computers, who was so enamored with the promise of the Macintosh that he bought one on release.

Although Jobs would later shanghai the project, the Macintosh originated with Jef Raskin, who wanted to create an extremely cheap but versatile computer -- an electronic Swiss Army knife -- that would be easy for first-time users to pick up, with an intuitive interface. While it wouldn't boast any specs worth mentioning, it would have simple tools that ordinary people would find useful, like a word processor. Raskin wanted to push this computer into the familiar realm of home appliances: when computers became like phones and calculators, he thought, then they would have arrived. After working on the Lisa, Apple's first attempt at creating a machine with a GUI, which proved to be an extremely expensive dud, Steve Jobs drifted into the Macintosh room and was seized by its potential. Jobs eventually took over the team and made the Mac far beefier than Raskin ever intended, and his obsession with perfecting every detail meant that for all its expanded capacity, the Mac was underpowered for much of its basic operation. Maintaining a glowing screen full of images, and drawing each bit of text effectively as an image, was asking a lot of 128K of memory. And it wasn't going to be like an Apple II, either; users couldn't just open up the hood and add to the Mac's hardware. (The Mac team snuck around on the side and allowed for a little memory expansion, since they knew -- Jobs notwithstanding -- that the Mac was going to need more as soon as consumers started playing with it.)

Perhaps the Mac was a little too user-friendly. Although those who tried it loved the operating system, many looked past it. It wasn't a serious machine; it looked like a toy. The Apple II and IBM machines still running DOS may have required getting used to typed commands, but they had a well-established library of software, including the business applications most people relied on computers for. The Mac was still developing its own, with the help of Microsoft. Microsoft would use its experience with the Macintosh's graphical user interface to develop Windows, though this was not a simple case of Microsoft taking Apple's idea: the pioneer there was Xerox, and several GUI systems were in development in the mid-1980s. Although the little Macintosh would take over the company -- via Jobs, who diverted resources into it and away from the Apple II line, which also had a GUI by now -- and still lives on at Apple in name (its current computers are much more like the Macintosh than the moddable Apple II, with the same working-out-of-the-box approach), Levy admits that its greatest success was achieved by leading to Windows, which took such a commanding lead over other operating systems that, prior to Chromebooks, it had an effective monopoly.

Although Insanely Great is sometimes more of a tribute than a serious history, I enjoyed the look at history it offers, both into the Lisa and Macintosh projects and the bit of biography: given that Levy is definitely a tech enthusiast, I was astonished to learn that he had once been anti-computer, and was only won over when he was asked to write Hackers. He shared Jobs' hatred and distrust of IBM, and for him seduction by the Macintosh was his entry into the world of computers. Therein lies his affection for the little machine, which literally changed his life.

For a more balanced perspective, I would recommend this video, in which an Apple fan argues that the Macintosh was a mistake, and that Jobs hobbled the performance of the Apple IIGS to promote the technically inferior yet more expensive Macintosh instead. It's 8 minutes. For a look at the "other side", there's also a video on YouTube of someone unboxing a new 1984 IBM AT. That one is much longer, but I was surprised at the amount of software setup required just to get it started, and it helped me appreciate the "turn on...ready" approach of the Mac.





Thursday, December 13, 2018

The One Device

The One Device: A Secret History of the iPhone
pub. 2017 Brian Merchant
416 pages




Love them or hate them, smartphones have revolutionized society like few other inventions. Entire sectors of the economy now exist which wouldn't be there had they not been invented, and barring some kind of global collapse, it's unlikely their influence will fade anytime soon. The One Device: A Secret History of the iPhone reviews not just how a computer company decided to gamble on making what would become the best-selling consumer device ever, but investigates how the various technologies which make it possible came into being, and how everything was finally put together. Merchant illustrates that a lot of key elements were already in existence, and argues that Apple's success lay in putting them together at the right time, building on them, and adapting to market pressures in a few key areas (grudgingly allowing for third-party apps, for instance). It's faintly anti-Steve Jobs: for as much as it quotes from Isaacson's biography, it also relegates Jobs himself to the background, choosing to focus instead on the inventors, tinkerers, programmers, and engineers whose ideas and grueling work made the device possible.

Unless you're a fan of retrotech videos like myself, you'll probably be surprised to learn that the idea of smartphones predates Apple, and that the first was made by Apple's hated foe, IBM -- Big Blue itself. IBM's "Simon", however, was ahead of its time, with a battery life of a single hour. Other technologies later incorporated into the iPhone also had their genesis somewhere other than Apple's R&D department. Merchant suggests that many technologies have a long stewing period before they're truly ready for work. In the mid-2000s, Apple was looking for an edge. Jobs' experiment in remaking Apple products as a linked digital hub -- the iMac and iPod linked together with iTunes, for instance -- was a great success, but he anticipated iPods being undercut in the future by cell phones, and wanted to address the problem by turning the iPod into a phone. The scroll wheel, as useful as it was for browsing through music, was poorly suited for dialing phone numbers. However, a team working on a tablet computer was onto something with touchscreens, and Jobs' focus on the phone project was such that the tablet, the "iPad", was shelved until a little later. The phone didn't meet immediate success, however: Merchant reminds readers that the original had only three apps that weren't Apple products, all from Google, and there was no App Store. It took increasing pressure from people hacking into their iPhones to allow for third-party programs to force Apple's hand. Once launched, the App Store was immediately advertised as an essential feature of the phone, and Merchant suggests that the phone would never have taken off (given its price) were it not for the store.

The One Device blends technical research and business history, and at times its level of detail may cool the interest of a casual reader. Merchant is generally more personable than technical, with the exception of the chapter on the iPhone's processor, and the subjects covered are diverse -- everything from software to mining to business deals. There's a lot of surprising content in here, too, so if you've an interest in popular tech, The One Device is worth a look.

Wednesday, December 12, 2018

The Perfect Thing

The Perfect Thing: How the iPod Shuffles Commerce, Culture, and Coolness
pub. 2006 Steven Levy
304 pages


I've never had an iPod, but given that Audible was doing a sale this week and that I seemed to be doing an Apple-related set, why not? The Perfect Thing hails the influence of the iPod and shares its history, both how Apple came to experiment with a consumer device and how it used the device to transform the music industry.  It's light "reading" (I listened to it, so the description is imperfect), and its datedness has appeal: this is an Apple book written before the iPhone took over everything else,  written when Jobs had announced that yes, he had cancer, but it was easily remedied with surgery and all was well now.

When Jobs returned to Apple in 1997 and pushed the company to focus on just four products -- professional and consumer variants of desktop and laptop computers -- his idea for the desktop computers was that they were to become key components in home entertainment, a "digital hub". The iMac came packaged with software like iMovie and iTunes to allow users to create their own videos and play music from the computer -- and not just play a CD, but copy music onto the computer, letting the iMac become a digital music library. Around the same time, the MP3 encoding format had been established, and there were even clunky attempts at a consumer-marketed MP3 player. Then the inspiration: what if Apple created its own MP3 player, one designed to link perfectly with iTunes?

Although its price gave cause for balking, the device's ease of use and attractive design made it a marketplace winner, changing the way people approached music. CD players had already started allowing for more musical freedom -- making it easy to listen to the same song over and over, or to skip weak songs in an album instead of manually fast-forwarding and rewinding tape -- but the iPod and its clones would make it a breeze. Although a certain artform was lost in the process (the album that told a story when listened to in its entirety, in order), most people just wanted to listen to the music they loved, when they wanted it.

The other great influence of the iPod on music was on the industry itself. In the days of Napster and Kazaa, the record companies were seeing the rug pulled out from under them, with CD sales falling as people helped themselves to goodies out there for the taking -- along with viruses, malicious jokes, and extremely poor information, as people shared files with the wrong artist and title names. Jobs proposed an alternative: iTunes could be more than a music player and CD ripper; it could become a storefront, allowing the record companies a way to adapt to the demand for digital music and maintain an income stream, while giving consumers a safe and legal way to obtain music at a fairly good price -- $0.99 a song.

Levy is a tech enthusiast, and it's therefore not surprising that he completely dismisses all who look askance at the takeover of people by their little devices. Are people retreating from one another and from reality by losing themselves in their music whenever they feel like it? Sure, and why not? Although there is truth in Levy's statement that moral panics always erupt around new technologies, it doesn't follow that there aren't legitimate causes for concern when people put themselves in danger, or ignore their family and friends (in their very company) by dropping out.

Tuesday, December 11, 2018

Jobs

Steve Jobs
© 2011 Walter Isaacson
656 pages


The past twenty years have been an amazing ride for Apple, in which an ailing company speeding toward bankruptcy suddenly metamorphosed into the most valuable company in the world, responsible for creating some of the most iconic products of the modern age. Walter Isaacson's Steve Jobs chronicles the life of the man who co-created Apple, fell out with it, and then came back to orchestrate the biggest brand revival ever. Throughout it, Isaacson illustrates how pivotal Jobs and Apple were not just to the computer age, but to the opening of the 21st century as a whole. Although I am not an Apple user, I enjoyed this biography like few others.

While many computer tinkerers of the day, Jobs' partner Steve Wozniak included, were interested in computers as tools, Steve Jobs' background in the youth movements of the 1960s prepared him to see computers as revolutionary. He saw them as a way to marry technology and the humanities, and to ignite human potential. His lack of zeal for the tech side for its own merits, though, meant that he wasn't particularly supportive of building machines that could be physically altered and expanded. He had a vision of how a thing should be, and didn't want anyone else tampering with it.

For Jobs, computers were not merely a technological tool; they were more like art in their ability to expand horizons. He wholly attached himself to his vision, and resisted the view of computers as open systems to be altered at will. Early in his career at Apple, he and Steve Wozniak argued over how many open slots the Apple II board should come with; Jobs wanted only two, for printers or modems, while the tinkerer Wozniak insisted on eight. While Wozniak won that battle, he would lose all the rest, as future creations presided over by Jobs were far more closed off to modding. Apple later used custom screws in its products, for instance, to stymie would-be home modders and consumers who wanted to repair their own products. (Jobs' vision of controlling products end to end has continued: just recently, news broke that OS X systems would soon be capable of identifying hardware changes and then locking themselves down if someone other than an Apple-sanctioned repairman had replaced a part.)

Jobs' insistence on controlling the vision he had for Apple products made him a domineering and mercurial boss, obsessive about seemingly small details and abusive in their implementation. Associates at Apple joked about the 'reality distortion field' around Jobs, the means by which he could convince himself that the impossible was practicable, and even convince others to join him in the pursuit. (Sometimes, with enough ninety-hour workweeks, they even achieved the impossible.) Such was his behavior that once Apple had grown from a two-man garage company into a full corporation, its board of directors effected his removal, hoping to dampen the disruptions he caused. He would later return after Apple's drifting performance nearly bankrupted it, but in the meantime he developed his own computer system (NeXT) and gained more management experience at Pixar. NeXT would be a failure, hardware-wise, but the kernel of its OS would later be incorporated into the Mac's OS and even help Jobs regain his old position of power. At Pixar he was better able to pursue the intersection of art and tech, as computer-generated graphics proved themselves capable of stories that gripped the human soul in Toy Story.

It was after Jobs' return to Apple that things got really interesting, however. Jobs' essential personality never changed, but he learned to be less meddlesome, and he gave Apple the ability to focus. He forced the company to concentrate on four products instead of a menagerie; these products were to be the best imaginable, "insanely great". He often used novel designs that played with the imagination: instead of familiar beige boxes, for instance, the iMacs of 1998 were colorful egg-shaped units that stood out, and were advertised as being especially made for the internet, which by then was roaring. Jobs focused not just on function but on feeling; he wanted products to resonate with people, to give them a certain joy in use, and that was part of the reason he was so obsessive about small details. Everything mattered, even the arrangement of the interior, which might never be seen. Jobs was the first to make aesthetics a key consideration in build quality. His vision proved itself when he pushed Apple beyond computers, into consumer products, with the iPod -- and later, the best-selling consumer device of all time, the iPhone.

Although Jobs was not an easy person to work with or know, there can be no denying his pivotal role in the making of 21st-century technological society. While Apple's hostility toward the right to repair makes me shudder, I can appreciate the commitment to exquisite design that Jobs made part of Apple's culture; every time I help someone at work with an iPhone or a MacBook, I enjoy the experience. As much as I prefer the open moddability of Windows and Android systems, Isaacson's book made me far more curious about the 'other side' than I would have imagined.

Saturday, November 24, 2018

Short rounds and leftovers:

Hello, readers! Here's hoping those of you in the US had an enjoyable Thanksgiving on Thursday. I thoroughly enjoyed the company of my cousins, though I did rather poorly in our board game of choice.  I blame the dice.   Throughout the week I finished up a couple of titles and wanted to comment on them.



First up is The Cathedral and the Bazaar, which is less a book and more a long essay on Linux, an open-source operating system -- and specifically, how Linux's bottom-up, emergent-order approach differs from the controlling, top-down approach of Microsoft and Apple. I was interested because I recently used a boot disk with Ubuntu (a Linux variant) to access a computer and extract files from it after it stopped booting Windows. I was pleasantly surprised by its intuitiveness, because I'd previously regarded Linux as something of interest chiefly to programmers and system administrators. Everything I had to do I managed through the graphical interface, just like Windows or Apple, and I made another boot disk with another Linux variant (Mint) to test next time. An interesting quote from the book:

"The Linux world behaves in many respects like a free market or an ecology, a collection of selfish agents attempting to maximize utility which in the process produces a self-correcting spontaneous order more elaborate and efficient than any amount of central planning could have achieved. Here, then, is the place to seek the 'principle of understanding'.

The 'utility function' Linux hackers are maximizing is not classically economic, but is the intangible of their own ego satisfaction and reputation among other hackers. Voluntary cultures that work this way are not actually uncommon; one other in which I have long participated is science fiction fandom, which unlike hackerdom has long explicitly recognized [ego-boosting] as the basic drive behind volunteer activity."
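For the curious, the boot-disk rescue I mentioned can be sketched in a few commands. This is a minimal sketch, assuming a typical Ubuntu live session; the device name (/dev/sda2), the Windows user folder, and the backup drive label (RESCUE_USB) are all hypothetical stand-ins -- check your own layout with lsblk first:

```shell
# In a terminal on the Ubuntu live session:
lsblk -f                                   # list partitions; find the NTFS (Windows) one, e.g. /dev/sda2
sudo mkdir -p /mnt/windows
sudo mount -o ro /dev/sda2 /mnt/windows    # mount read-only so the ailing disk is never written to
ls /mnt/windows/Users                      # confirm the user profiles are visible
cp -r /mnt/windows/Users/SomeUser/Documents /media/ubuntu/RESCUE_USB/
sudo umount /mnt/windows
```

The read-only mount is the important bit: if the drive is failing, every write risks making things worse. The graphical file manager in the live session will do the same job, which is in fact how I did it.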

Although a lot of the content of The Cathedral and the Bazaar is over my head (given my status as definitely-not-a-programmer), I like the idea of the open-source movement, and not just because it produces good programs that are free of cost, like VLC Media Player, LibreOffice, and the GNU Image Manipulation Program (GIMP), two of which I use. Developers are becoming insanely clingy about controlling users, and about what they allow users to control; these days the proprietary software on computers isn't so much owned as rented. And some of the software produced by these companies isn't even that great: my favored music player, Winamp, makes it far easier to build and edit playlists than iTunes or Groove, and it's been using the same simple approach for all the 15+ years I've been using it.



Also up is Coffee to Go, a truck-driving...journal from a Scottish author who drove principally between the UK and western Europe. This book was recommended to me on the basis that he travels to Russia, but no such trip was recorded here, the farthest reaches being Austria and northern Scandinavia. (There may be multiple editions?) Although I like trucking memoirs generally, this one was...well, less a memoir and more of a journal. Hobbs records every bit of his trip, from how much he paid for coffee to what he said to the fellows at customs, and I found it tedious. The last fifth of the book is recollections of his trips from before he started keeping a diary, and those are much more interesting to read because all the play-by-play action is absent, replaced by a general narrative with thoughts on traveling to tiny places like Andorra. Easily the most interesting chapter was his memories of driving into West Berlin during the Soviet era, when the western side of the city was a pocket surrounded by the dismal DDR. Hobbs seems like a nice guy, but this wasn't one I'll remember much about, I'm afraid.



Saturday, September 29, 2018

Build Your Own PC for Dummies

Build Your Own PC For Dummies
© 2009 Mark Chambers
336 pages



Both my increasing interest in learning how to work with computer hardware and my nephew's desire to build a gaming computer have led to me watching hours of build videos on YouTube and scrounging around the internet for helpful resources. Although this book was published in 2009, it has a long history of solid reviews, and I was able to find a used copy which included a working DVD. This beginner's guide to building a PC first assures readers that it's not nearly as difficult a process as they imagine, and that it requires minimal tools -- usually just a Phillips-head screwdriver. Because building a PC is an inherently sequential process -- beginning with the case and motherboard, and building from there -- the book's organization follows that process. The initial chapters cover the first steps: deciding on what kind of machine to build, finding a case and motherboard that will meet the need, and installing essentials like the power supply, processor, and RAM. Once the hard drive is installed, the author shifts to optional-but-likely add-ins like DVD drives, graphics and sound cards, and other accessories. The video is divided into similar stages.

Obviously, a book on computer hardware from 2009 is going to be dated at this point, and arguably it was dated upon release, given that it includes a chapter on floppy disks when retail PC builds had stopped carrying floppy disk drives at least three years before. (My family purchased a PC in 2004/2005 that had no floppy disk reader, just USB ports and a never-used reader for Zip disks.) Still, storage and data transfer (SATA cables were still nosing into the market here) are the only real age-related weaknesses. Each chapter is designed to be read independently of the others, so each starts with the same advice about grounding yourself to prevent static electricity discharges. The author always uses a joke to introduce these, which gets old quickly if you're reading the book straight through. The jokes are not as pervasive in the video, but they're there.

Although certain elements of this book are badly dated, the basic process remains current, and I think it would be helpful to someone introducing themselves to the idea of building a PC. Fixing Your Computer: Absolute Beginner's Guide has more information on the actual components and what their advertised specs mean, though.

Tuesday, July 24, 2018

Brave New World Revisited

Brave New World Revisited
© 1958 Aldous Huxley
144 pages


Aldous Huxley's Brave New World (1932) transported readers to a deeply creepy nightmare-vision of the future, in which man had disappeared as an independent being, instead becoming the raw material for a new, engineered hive creature. In Brave New World Revisited, Huxley shares his fear that the technocratic domination of society is proceeding much more quickly than he had anticipated, then outlines his reasons for concern and the vectors by which free minds could be compromised and manipulated.

The crux of the problem, says Huxley, is overpopulation. Viewing a global population of 3 billion with horror, Huxley anticipated not only mass starvation but the rise of tyranny across the world. Rising population would crowd more of humanity into cities, where disease both physical and mental would become an ever-greater threat. The rising misery, he believed, would fray civil society so much that Communist orders promising food for all would be imposed. Though not a libertarian, Huxley takes Lord Acton's appraisal of power and human nature to heart. Even an innocent desire for order, he argues, can carry the controlling authority away, resulting in creeping and then quickly-hardening tyranny. Eugenics is an obvious example, and the subject of his second chapter.

The bulk of the book, after the opening essays on population crises and eugenics, examines ways in which technology might begin to subjugate human psychology. The original novel was published in 1932, the year before Adolf Hitler took power and achieved the closest thing the world had seen to total technological command of a people; Hitler not only grasped how mob mentalities could be manipulated, he used the latest in communications technology to constantly convey his message. Huxley examines the tools of Hitler's trade, as well as others introduced in the decade after World War 2 that might be the stuff of future empires. These include chemical agents, sleep conditioning, emotional propaganda, and different forms of torture. In each section, Huxley mentions precursors already in use, like pervasive advertising and the attempted creation of consequence-free feel-good drugs.

I knew nothing about Huxley before starting this, but he proves to have been a thoughtful and well-read man. Some of his concerns about overpopulation obviously seem dated, given that the global population is presently 7.6 billion, with consistent declines in starvation rates. Overpopulation means increased demand for everything, not just food, so it's still an issue to be concerned about -- whether your concern is resource wars or global warming. The pressure these populations put on governments to "do something" -- about a great many things -- has resulted in declining self-determination at all levels of government. Huxley's view of the city as a profoundly unnatural environment, one that induces mental disease, is still argued -- see Desmond Morris' The Human Zoo.

Modern readers of this will find, then, some of it dated but a great deal still relevant, as far as human psychology goes;   whatever one makes of shifts in our mores, human nature has not changed since 1958.

Wednesday, July 11, 2018

A Crack in Creation

A Crack in Creation:  Gene Editing and the Unthinkable Power to Control Evolution
© 2017 Jennifer Doudna and Sam Sternberg



"No longer at the mercy of the reptile brain, we can change ourselves. Think of the possibilities." - Carl Sagan

A few years ago I tuned into the middle of a science-news podcast and encountered a panel of otherwise sensible people caught up in an enthusiastic conversation about...crisper? Crisper drawers?  I'd missed something.

What I'd missed was a story about CRISPR, a gene-editing tool with enormous and explosive potential for medicine and agriculture. The outgrowth of attempts to use bacteria as microsurgeons, CRISPR allows for fine-tuned genetic manipulation with reproducible results. The first half of A Crack in Creation delivers the story of how CRISPR was discovered as a tool, and this history of scientific investigation is followed by the authors' thoughts on the implications. While optimistic about the tool's applications for agriculture and medicine, Doudna admits that the potential for abuse in modifying the human genome itself is high.

Humans have been manipulating domesticated populations' genomes for millennia, of course, but with clumsier methods:   finding animals with expressed traits we favor, and breeding them while taking the rest home to cook.  We have toyed with forcing mutations with chemical and radioactive agents, but the results thereof are unpredictable.  Now,  nearly two decades into the 21st century,  we have the ability to make fine-tuned adjustments, with applications both serious and trivial.  An internal biological weapon used to disarm viruses  and effect cellular repair can instead be used as a tool to remove  and supply whatever genes we desire.

We've already created mosquito populations which have been stripped of the ability to propagate malaria, and -- depending on trials and the weight of government oversight -- may use pigs to grow human organs for transplants. As Doudna warns, however, modifying humans -- modifying ourselves -- takes us into an area fraught with ethical quandaries. She speculates that we may wish to discriminate between germ cells (sperm and egg cells, which would be capable of reproducing whatever edits we make) and somatic cells, which constitute the rest of the body. Unless, of course, eugenics makes a comeback and we decide to create a race of supermen, a la Khan Noonien Singh. Then, germ cells would be fair game. (Okay, the bit about Khan is just me. As one of the principal discoverers of CRISPR, Doudna is seriously concerned about the ethical implications, to the point that she's had a literal dream about Hitler contacting her with an interest in learning how to use CRISPR.)

Although I'm still trying to understand the mechanics of it (as much as I like biology, genetics is a definite weak point for me),  the potential for this excites me. Medicine is going to go very interesting places in the decades to come.


Tuesday, July 3, 2018

How To Watch TV News

How to Watch TV News
© 1992 Neil Postman, Steve Powers
192 pages (2008 edition)




Don't.


Well, that was easy. From television insider Steve Powers and technology critic Neil Postman comes this slim book, How To Watch TV News, which explains how televised news is produced and scrutinizes the platform's ability to deliver seriously useful information. Although this is not a takedown of television news -- at the end the authors merely encourage readers to reduce their TV news consumption by a third -- it doesn't foster trust in the medium. Powers' insight reveals an industry which scrambles to stay ahead of the latest developments, seizing on whatever is most likely to keep eyes on the screen and the ratings high. Most readers are aware, of course, that television news programs play to the ratings: no serious journalist would focus their attention on the goings-on of celebrities otherwise. What we might fail to appreciate, however, is how carefully orchestrated television shows are, from the music chosen to the arrangement of news sequences, all designed to draw viewers in and keep them fixated. Because of the pace, the need to hold as many viewers' attention as possible, and the amount of production work required to put each show together, serious journalistic pieces are impossible for something as small as the nightly news, whether it's a half-hour local spot or an hour-long nationwide show. To truly evaluate what's happening in the world, Postman and Powers maintain, we need print media -- stories that allow us to consider ideas at length, not merely be distracted by them as objects on the screen. If readers were to reflect on the news and the commercials which it actually serves, they might see through the illusion -- and see that just as a mouthwash commercial is more about social acceptance than mouthwash, a news show is more of a show than actual news.

Despite its multitude of references to the eighties, How to Watch TV News is far from outdated. Powers' 2008 revision updated some references and tech, but Postman's contributions are timeless. Some of them will be familiar to anyone who has read Postman before, from his view that different technologies foster different beliefs, to the belief that television has trivialized and eroded culture in general.  How To Watch TV News is less about television, however, and more about news, the barrage of facts we're told are important. Postman and Powers help us to look for the stage behind the story: why are these facts being presented,   what judgments are we expected to accept in viewing them? In giving recommendations to the reader, Postman urges readers to realize they don't have to have an opinion about everything.  This has never been more relevant than today, when the social media cloud that we're all forced to live in -- because it rains even on those of us who don't use it, as people insist on talking about what they're tweeting or reading -- constantly pushes us to react to everything as if it were important. We are still a nation -- and now a globe -- amused to death, frazzled by distraction.

Also from Neil Postman:
Technopoly: The Surrender of Culture to Technology
Amusing Ourselves to Death: Public Discourse in the Age of Show Business 
The Disappearance of Childhood
Building a Bridge to the 18th Century


Friday, June 1, 2018

How Microsoft's Mogul Reinvented an Industry

Gates: How Microsoft's Mogul Reinvented an Industry-and Made Himself the Richest Man in America
© 1994 Stephen Manes and Paul Andrews
560 pages



I recently watched Pirates of Silicon Valley, a questionably-acted movie based on the rise of Bill Gates and Steve Jobs, and found myself curious about the facts. Did a young Bill Gates really race bulldozers and ram his buddy's sportscar?   Gates is an astonishingly detailed biography of not just Gates himself, but of the computer industry as it developed through the seventies, eighties, and early nineties. The book culminates with the release of the then-revolutionary Windows 95, an OS whose launch enlisted even Rachel and Chandler from Friends to pitch it.  The evolution of computing hardware and software overshadows Gates himself, which is not surprising given that developing software was his singular obsession from high school on.   This mix of biography and technical history makes the book more attractive as computer history than as a personal portrait, but it still presents a more interesting Gates than "Brilliant, Nerdy Billionaire".  He really did race bulldozers, and they weren't his.

Gates is not a rags-to-riches story, as young William Gates started off fairly comfortably: his parents sent him to a private school that exposed its older students to computer programming, and one of Gates' classmates there, Paul Allen, would later become his partner in founding Microsoft. Both were enthusiastic members of a student club called the Lakeside Programmers Group,  whose members were allowed free computer time -- back when computer users could be billed for how many seconds of processing they used --  in exchange for helping debug programs and machines.   Being both self-confident teens and curious about what they could do, Gates and his friends also found ways to cheat the billing cycle outside their arrangement -- and when Gates took on the challenge of creating student schedules,  he somehow found himself the only boy in a class otherwise filled with girls.

Even before they were out of high school, Gates and Allen were making a name for themselves as programmers, and exploring the possibilities this held for their future. Their first huge coup was writing a language for the first consumer-marketed microcomputer, the Altair.  The Altair was amazingly popular considering it had to be assembled, component by component, by the buyer, and that the finished product was initially only capable of blinking its lights. Programming was done not with a keyboard, but by flipping toggle switches.   Although Gates and Allen did attempt to build their own computer, one pitched at municipal governments for managing traffic,  their talents lay in software.  Gates was both obsessive and aggressive:  he had no objections to working eighty hours a week trying to iron out bugs, and expected the same from whomever he hired later on.  Gates hated to lose, and if that meant selling products he hadn't even built yet -- hadn't even planned yet --  to prevent someone else from making the pitch, he would.  (Hence those eighty-hour workweeks.)  Gates' success came not just from his gifts with programming languages, but because he and his partners were so intent on making sales: one of Gates' tricks was to use one product to sell another.  His dream was a computer in every home, on every desk, running Microsoft software. It didn't matter who the manufacturer was: Microsoft did work for both IBM and Apple, as well as smaller computer companies which have since fallen away, and Gates' goal was to create a hardware ecosystem in which everyone used common software, with the effect that devices would be cross-compatible.  A monitor made by one manufacturer -- IBM, say -- would work with a computer made by another firm, like Hewlett-Packard.

Gates delves into an astonishing amount of detail both on the technical hurdles and on the business deals Gates made: there's an entire chapter on a font battle with Adobe, for instance.  Readers do see the man behind the machine, however: Gates the crazy-competitive, Gates the parsimonious executive who regarded hotel rooms and first class as decadent,  Gates the teenage millionaire, Gates the spectacularly reckless driver, Gates the bellicose boss who liked people who stood up and yelled right back at him.    Although Gates is not necessarily the ideal book for someone merely curious about the man, its depth of technical and business history recommends it to those interested in the microcomputer revolution.  Oh, and the bulldozers? Gates literally saw them sitting in a rural construction yard, discovered the keys were in them, and decided to figure out how they worked. Then he and a buddy drove them around and raced, because that's what you do when you're twenty and it's 3 AM.

Related:
Pirates of Silicon Valley, trailer below
CYBERPUNK: Outlaws and Hackers on the Computer Frontier

"I got the loot, STEVE!"


Wednesday, May 9, 2018

Tales from a Mainframe Mechanic

The Computer Guy Is Here! Mainframe Mechanic
© 2018 John Sak
201 pages


When John Sak began his training with IBM as a young college dropout, instructors informed his class that the only constant they could expect from their careers was change: the jobs they were training for would probably no longer exist by the time they retired. Sak entered the field when mechanical tabulating machines with some electrical components were giving way to electronic computing units, and ‘continuing ed’ would be a staple of his career at IBM as computers, printers, and computer-driven devices continued to advance. By the time he retired, smaller desktop computers were supplanting the closet towers and basement behemoths. The mainframes Sak and company serviced, of course, were not simply larger and slower versions of PC towers. Although by the end of his career many devices accepted instructions via keyboards and the like, in his early years instructions were fed into computers via stacks of punched IBM cards, whose hole patterns gave the machine different instructions. Refer to a disk drive today and most people may think of a DVD tray, but the unit covered here is the size of a washing machine.

The book is a collection of stories rather than a chronological history, but Sak’s anecdotes cover the varied aspects of field engineers’ work and IBM culture. Sak and his colleagues weren’t just repairmen, called out to replace or fix faulty mechanisms; they also analyzed new equipment in the field and compared notes to determine if there was a design flaw that could be corrected, or weaknesses which could be improved. This memoir of life as an IBM field engineer combines a few profiles of odd characters with accounts of diagnosing problems, along the way explaining how the older room-sized devices operated. (One model, only discontinued in 2005, ran for just over 40 feet and was devoted to letter-sorting.)

Computing has had an amazing history so far, and I greatly appreciated Sak's account of its boom years -- forgiving the primitive cover.


Wednesday, March 21, 2018

The Inevitable

The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future
© 2016 Kevin Kelly
336 pages


No one can say exactly where a ball thrown in the air will land,  but at least on Earth it’s a certainty that a thrown ball will land.   Kevin Kelly,  formerly of Wired magazine, can’t say exactly what the future will look like, but he is confident enough to predict which trends will continue based on present technology.   Our global civilization has been radically transformed since the 1970s, but computers weren’t the catalyst for all the change we see around us. Networked computers were. By themselves,  the first computers were house-sized calculators and overpriced filing cabinets;  when they began exchanging information freely, magic happened.   What world-changing wonders can we expect from the current trends in technology?

First, says Kelly, is “becoming”.   Around the turn of the millennium, Zygmunt Bauman introduced the term liquid modernity to our sociological lexicon.  In previous generations, changes happened slowly enough that our societies were able to digest them and establish a new normal.   As the 19th century gave way to the 20th, however, the rate of change quickened to the point that a new normal became impossible: society is revolutionized multiple times within a single generation, with the effect that there is no stable ground to be had, no new normal to be reached. Now our products are no longer discrete products, but services that are continually being changed -- think of Office 365, or even Windows 10. Windows 10 is rumored to be the last Windows, not because Microsoft is retiring from the OS business, but because Windows 10's frequent updates constantly add new features that would otherwise have been developed and delivered in a new Windows.  Our phones, too, are not merely the devices that came out of the original box: as we add apps and accessories, we change their nature.

The second big-ticket item here is "cognifying", by which Kelly means using machine intelligence for everything. There won't be a master AI that controls every aspect of our lives, he says;  instead,  we'll develop multiple machine intelligences for different suites of needs, and they'll be utterly mundane -- indeed, they already are. When we execute a Google search for recipes or ask for directions, we are in fact helping to train Google's machine-learning algorithms: we are teaching them what we're most likely to be looking for.  Those "Related Products" that Amazon helpfully shows you are also an early example of machine intelligence, as Amazon's database learns your shopping preferences and attempts to predict what you would like next.

Two more concepts from the book worth sharing quickly here are Accessing and Tracking. Tracking sounds obvious, but Kelly isn't just talking about website cookies, or Google and Apple recording your movements through your phone's GPS. By tracking, Kelly means that the door is open to quantifying every aspect of our lives.   People can already use their phones' apps to track how much they walk per day, how well they sleep, and what they eat;  they can already use phones to monitor their heartbeat;   phones in the near future will be able to monitor blood pressure and blood sugar as well.  Cheap cameras and cloud storage mean that we can record more moments of our lives,  and later poke through them at our leisure as if they were files in a drawer.  The cloud is a key aspect of much of what Kelly covers, but it is especially prominent in the "Accessing" chapter, in which he writes that we're moving away from an ownership society. We no longer need to own a car;  we just need access to one. Apps and tech allow us to share resources,  and in some cases the resources are becoming so cheap that they can be offered for free: no one needs to struggle with an ersatz Office clone when they can use the freely available Office Online.

There are ten real concepts in total (there are two more chapters, "questioning" and "beginning", but they're less about content than about thinking about our relationship with content), and the author purposely avoids dwelling on any downsides. He takes it for granted that everything can be used for malicious purposes, but that would be another book entirely. (A book like Future Crimes II, perhaps...) I also liked the chapters on Interacting and Screening; the former addresses the future mundane role of virtual reality and augmented reality,  in which games and movies will become more "real", and our travels in the real world will gain a digital overlay adding more information -- the ubiquity of screens dovetails with that rather nicely. One disturbing possibility Kelly mentions is glasses or ocular implants with different apps installed; one such app could read people's faces and match them to a driver's license database.   The other concepts in the book are extensions of minor things happening now, like remixing and filtering.

As someone who can be both entranced and repelled by the promises of technology -- completely fascinated on an abstract level,  distantly horrified on a human level --  I found The Inevitable enthralling reading.  The author is sloppy with language, however, using "socialism" and "collective" when 'cooperative' would have been more accurate. For some reason he thinks libertarian individualism is contradicted by Wikipedia, when it's merely individuals voluntarily working together toward a common goal.  Socialism makes me think more of involuntary mass actions, like taxes and slavery.





Monday, February 26, 2018

The Silent Intelligence

The Silent Intelligence: The Internet of Things
© 2013 Daniel Kellmereit, Daniel Obodovski
166 pages



A couple of years ago I created a Digital World label in recognition of the fact that the Internet was no longer a discrete service that one could engage with or detach from -- that it had become instead part of the infrastructure of everyday life.  The Silent Intelligence is a technological and business briefing that expands on that,  documenting the “machine to machine” networking that will allow the tools and infrastructure we use to coordinate with one another automatically – so that the lights in our house, for instance, can be informed by an app tracking our phone that we are pulling into the driveway.    This is rapidly aging news now, of course,  given that there are already competing systems for managing home electronics.   After explaining the technological breakthroughs that are making this trend possible, the authors examine challenges facing the field, and discuss areas where it might find the most immediate use, like hospitals and homes.  Imagine if a nurse in a large hospital, in search of a needed piece of equipment, could consult an app on her phone that would direct her to the closest available one.   In this case each piece of equipment would be tagged,  almost like Zipcars are now.  Some of the predictions have already come to pass, like Redbox movie rental kiosks that monitor their inventory and report when they need to be serviced,  and there’s no shortage of opportunities here.   The Patient Will See You Now expanded on this kind of technology in the medical field.    Last year I acquired another book (Smart Cities) whose premise was also introduced here -- the idea that cities would become more “alive” than ever, as apps and infrastructure talked to each other and allowed for real-time monitoring of pollution, traffic, etc.    Technologically, the 21st century will be a very exciting place to live.

The Silent Intelligence is not leisure reading unless someone likes to read about the nuts and bolts of an emerging industry’s technical problems, but it’s one of the first books about the “internet of things” I was able to find. I’m sure more will follow as the built environment is reprogrammed along these lines.

Resistance is futile. Your home will be adapted to serve the Internet of Things

Wednesday, December 27, 2017

Fixing Your Computer

Fixing Your Computer: Absolute Beginner's Guide
© 2013 Paul McFedries
336 pages


This title is exactly as it describes itself, a beginner's guide to computer maintenance. McFedries begins with the computer as a whole, and takes readers through physical and digital cleanup -- going from compressed air canisters for keyboard gunk, to creating backup discs for your system. Then he takes readers inside the computer case itself, explaining the use of each component and offering information as to the specs for different parts one might see in the store.  He offers lists of reputable parts suppliers, and then delivers step by step instructions for taking out and replacing parts -- including the processor itself.  The sections are written to be read independent of one another, so someone just taking a look at one page won't miss any background advice, like how to avoid static  electricity buildup.

This is a very useful book for reference,  and despite the release of Windows 10, remains current. I checked it out because I'm contemplating cannibalizing one of my first computers, using its DVD drive to replace a sputtering one in another of my machines. I read the whole thing, though, as I've only lately even learned to identify all of the stuff inside the computer.

If you are interested in computer hardware, one of my favorite YouTube channels is LazyGameReviews, which despite its name is not just about gaming, nor is it remotely lazy. The host does a lot of work, and primarily features older and odder technology.  I found him via his series on 'oddware', while looking for some computer design from the 1980s.


Saturday, September 9, 2017

The Circle

The Circle
© 2013 Dave Eggers
507 pages

Sharing is Caring.
Privacy is Theft.
Secrets are Lies.

Imagine an internet transformed by a company so innovative and ambitious that it had swallowed Facebook, Google, and the rest whole.  It began with TruYou, a single login that let people sign in to virtually everything online, from their hobbyist forums to their bank accounts.  It ends....well, that's up to you and me. The Circle combines 1984 and The Social Network to present a dystopia-in-the-making,  one most users of internet services will recognize in their own habits. Funny and alarming,  it's easily the most riveting novel I've read this year.

The Circle as a novel begins with the arrival of Mae, a frustrated twenty-something, at the customer service desk of the company. As the novel progresses, her willingness to adapt to the Circle culture and work hard to perpetuate it takes her to the heights of power, to the company's own inner circle. Readers witness her transformation as she grows to ignore the concerns of her ex-boyfriend and her parents that something isn't right about the world the Circle wants to build.  At the center of the Circle are the Three Wise Men -- a reclusive young genius,  a charismatic public face, and an avaricious financier.  Between the three of them, they want to bring about a techno-utopia by allowing for -- and even mandating -- total transparency.  The Circle isn't just a social network/search engine/marketplace on steroids; it's also an Apple-esque technology company that produces new tools -- tools like small, discreet cameras that allow for live-streaming from multiple locations. The network and its tools grow throughout the novel to allow for technocratic control of society:  child abductions are thwarted by chip implants,  politicians begin wearing bodycams to prove they aren't sitting in smoking rooms hatching conspiracies, and neighborhood watch programs alert residents every time a non-registered person enters their block. Mae's own ascent into the elite happens when she leads a campaign to turn the Circle political, to make its platform a voting mechanism. When one person drives off a bridge to get away from the Circle, the Circlers' response is to say, wistfully, that it never would have happened if everyone could be made to give up automobiles that aren't self-driving.

The Circle is both warning and dark comedy, mocking compulsive users of social networks while building a threat that is more ominous than hilarious. In an early scene, Mae is called into her supervisor's office to resolve a serious dispute between her and another coworker -- one she has never met, but who invited her to a party for kindred hobbyists, and who was deeply hurt when she never responded, not even to say "Sorry, no can do".  The world of the Circle demands constant interaction, constant attention, constant sharing.  It's not enough to go to after-work parties: there have to be pictures, shares, tags, likes, and tweets about the party. Mae first approaches her position like a nine-to-five job: she does her work, she goes home or goes kayaking, she returns. This is not the Circle way. The Circle's social demands are such that some people simply live on campus, and even when they leave, the idea that they've left the Circle is almost blasphemous. Mae went kayaking....by herself? She didn't tell everyone she was going so they could send her Smiles, and worst of all...she didn't record anything. No one can benefit from her experience except Mae!  What selfishness.

Although I have my doubts about how feasible the novel's scenario is (the NSA has problems with storage and cooling, and it's not coping with hundreds of thousands of simultaneous camera feeds),  Eggers' novel makes obvious two dangers of the growing social network apparatus. First, there are people whose histrionic obsession with social media makes them not far removed from Circlers. Second, the role that some companies have as web infrastructure -- principally Google,  with its search engine, browser, cloud storage, email, and control of YouTube and Blogger --  poses a threat to free communication. Google is not a neutral actor; it has an agenda and does not brook dissent, either external or internal.  Facebook is no less threatening to privacy; the reclusive genius in The Circle is a transparent clone of Zuckerberg, complete with hoodie.   The greatest problem shown by The Circle is what happens when these two factors combine -- the needy child-mob on social networks, and the infrastructural control they rely on and have enabled.


For what it's worth: I maintain a Wordpress copy of this website in case Google ever gets really nefarious.

Related:
4 Alternatives to Google



Sunday, July 2, 2017

Medical tricorders, dirty old men, and controlling the internet

Before we head further into July, here are a few 'missed' reviews...



First up, The Patient Will See You Now. This book was part of the "Rebuilding Towards the Future" series, in which I read books about ways that the ideas and work of regular people, as well as technology, are allowing us to make a better life for one another.  This particular book argues that smartphones and big data will (1) give people control of their medical data by making them its originators, and (2) use that data in conjunction with everyone else's to fight big diseases like cancer.  The author documents the incredible functionality of apps and sensors that can turn smartphones into diagnostic scanners taking all manner of readings.  I was suitably awed, but am so poorly read in the area of medical technology that I can't comment too much. I was introduced to this book by EconTalk, as Russ Roberts interviewed its author back in May 2015.



Next:  Edward Abbey's Black Sun.  Abbey opens with a character very much like himself, a disgusted ex-professor who has found solace in the wilderness. For half the year,  Will Gatlin lives by himself in the southwest wilderness, manning a fire tower.  His chief human contact is the radio, and a friend of his who  writes letters entreating him to come to town and chase skirts like a normal human being.  A girl shows up, and seduction follows; he is seduced by her despite having twenty years on her, and she is seduced by the wilderness. In terms of content it's much like Hayduke Lives! -- nature writing mixed with  utter randiness. Unlike Hayduke, I finished this one, as it was rather short.



Lastly, this past week I read Who Controls the Internet, an interesting mix of internet history and law. The authors begin by reminding readers of a time when cyberspace was a discrete thing, not part of our everyday life, and because it was an imagined world, people hoped the usual rules would not apply. They imagined a borderless new world, where people could be who they wanted, without regard to culture or the states in power. The book then goes on to explain and document how borders reasserted themselves.  Because the internet originated as a military research project, the US did not want to lose control of it, and other governments have no interest in losing control of their people. China, for instance, aggressively pursues internet connectivity in order to propel itself forward economically, but also works with manufacturers of internet hardware like Cisco to block 'undesirable information' from entering the Chinese web.   Much of the border-ization was driven by people themselves, however:  as more 'common' people started using the internet, they began congregating with like-minded users (fellow Chinese speakers, for instance), and when they began using the internet for goods and services, businesses like Yahoo found region- or language-specific portals a necessity.

As Tuesday is the Fourth of July, expect some American lit and a dash of American history or biography this week.  More internet books to come as the summer progresses, too!


Saturday, July 1, 2017

Hackers

Hackers: Heroes of the Computer Revolution
© 1984 Steven Levy
458 pages



How did computers cease to be the playthings of secretive governments, universities, and multinational corporations and become instead fixtures in 80-90% of all American homes?   Hackers: Heroes of the Computer Revolution is a history of that transformation, driven by young men who could not be satisfied with the status quo. Stealing into locked rooms, or spending night after night learning the best tricks to convert typed words into real-world action, their persistent  curiosity edged technology forward.  Their obsession with mastering computers, with pushing them to their limits and fiddling with them to get more out of them, not only influenced the development of the machines themselves, but created new industries.

Nowadays we think of a hacker as a force for ill, someone who invades others' computers and systems and wreaks havoc or steals things.  That negative baggage was acquired only in the mid-1980s, however, when a few young people made headlines through their network intrusions.  Before that, the term referred to...tweakers, if you will, to those who fiddled with electrical and computer systems to learn their ways and to see what they could do with them -- often improving them along the way.  Hackers is filled with the stories of young, awkward men (and one woman) who forced innovation by refusing to stop their incessant modding. Through these restless lives we see a progression of computers, increasingly accessible and increasingly agile. This was not the era of "plug and play":  some users were operating in raw assembly language,  compared to which FORTRAN and company were user-friendly.  The computers were often put to unorthodox uses, programmed as calculators or even games (Spacewar). As interest in them grew,  companies arose to put computing hardware into the hands of technically-savvy consumers.  This was not the era of the Apple II, though -- not yet. The first 'hardware kits' produced a machine whose 'output' was blinking lights.  Hackers is not all technical, however; some people who are drawn to computers have grand ideas for their use, as a portal to human awakening. Some of the pioneers here weren't pushing hardware so much as they were pushing access -- like a computer 'collective' on the west coast that sought to establish a public-access mainframe in Berkeley, with a communal directory of information.

Hackers is thus a personal history of the computing revolution,  driven by curious enthusiasts whose fascination with the potential of these devices bordered on the obsessive.  In a day when "nerd" and "geek" have achieved a kind of faux-chic,   Hackers provides a memory of the genuine article.


Saturday, May 27, 2017

Countdown to Zero Day

Countdown to Zero Day:  Stuxnet and the Launch of the World's First Digital Weapon
© 2014 Kim Zetter
448 pages



A couple of years ago I created a new label, 'digital world', in recognition of the fact that the Internet is no longer a discrete system (like a grid of water pipes). It has seeped into every aspect of our everyday lives, as basic as electricity. Through it, the entire developed world moves. War is no exception to this digital revolution, and the fun is just beginning. People may associate cyberwar with the theft of intelligence, or perhaps monkeying-around with the power grid, but the case of "Stuxnet" demonstrates how weaponized computer programs can cause physical destruction no less complete than a bomb. What's more, the specific vulnerability used to great effect here is virtually universal in the industrial world. Countdown to Zero Day is a forensic-political history of how the United States used a computer virus to effect the kind of destruction only imaginable before by an airstrike, and a warning to the entire online world that we are vulnerable.

If war is the continuation of politics by other means, cyberwar appears to occupy a grey area between the two. The policy of the Bush administration, once it became obvious that Iran was pursuing nuclear weapons, was to squelch the threat through any means necessary. While there may have been many in DC who wanted to see another example of shock-and-awe, even Bush knew a third war in the same mideast minefield wasn't possible. Remote sabotage, however, offered an alternative to war or a nuclear Iran, and a program which started under Bush would bear full fruit during the Obama administration. What a small elite in DC knew as "Olympic Games", the world would later call "Stuxnet": a virus that began as a carefully targeted weapon but would later spread across Eurasia.

The author delivers the full story of Stuxnet in a back-and-forth narrative: the first track begins with the eruption of the virus, and the methodical picking-apart that Symantec, Kaspersky, and other cybersecurity firms subjected the code to. Step by step, they attempted to figure out what the code was doing, how it got in, what mechanisms it was using, and finally -- what was its intended target? This campaign of digital detective work wasn't the product of one cyber Sam Spade, but a collaborative effort among various businesses who shared their information and results. Eventually, over the course of two years, they realized that the initial program was highly target-specific: it was aimed at two kinds of programmable logic controllers, or PLCs -- computers used in industrial work. The particular PLCs targeted drove rotors specific to the kind of centrifuge that Iran used to enrich uranium.

The teams dissecting the Stuxnet code marveled several times at its structure, but marveled all the more when they figured out -- based on reports coming in from Iran -- how the program worked. Because the centrifuges' speed and weight necessitate careful handling -- slow acceleration and then slow deceleration, nothing too abrupt -- the program's main attack was to methodically stress the centrifuges by taking them up to speed, or down, in patterns designed to slowly ruin the pieces. What's more, long before this act of digital undermining ever began, the program silently sat and waited, recording normal activity: during the actual sabotage, the program fed the recorded data to the plant's control room, meaning the Iranians eventually had to physically watch the motors to see what was happening. The program had a nucleus so deeply hidden that when the machine software was repaired by the Iranian engineers, the core program methodically rewrote the new programming. It's as if an invasive bacterium promptly turned the body's immune system into its own means of reproduction.

The case of Stuxnet is important because PLCs are pervasive; they aren't just used in manufacturing, but are common wherever computer-controlled machinery is used. They're in hospitals, food production plants, power stations, transit networks: there's no end to the mischief that could be managed by attacking them, and until recently very little was done to protect these systems. Stuxnet was a wake-up call to many technical directors in the developed world, an alarm bell announcing their vulnerability. As the recent WannaCry attack which crippled hospitals in the UK demonstrates, however, we're still not taking cybersecurity anywhere near seriously enough. (The WannaCry and Stuxnet episodes also demonstrate the volatility of cyberweapons: they don't go away. In both cases, code and tools developed by Washington were captured and repurposed by other parties.) Throughout the world we rely on computers which haven't been protected in years, or we have foolishly entangled vital public infrastructure like the power grid with the public internet. Stuxnet was only the beginning -- perhaps it will prove to be like the Hiroshima-Nagasaki attacks, a singular event that frightens everyone into more caution. I doubt it, though.

Related:
@ war: The Rise of the Military-Internet Complex, Shane Harris
Glass Houses: Privacy, Secrecy, and Cyber Insecurity in a Transparent World,  Joel Brenner

Wednesday, May 24, 2017

The Dark Net

The Dark Net: Inside the Digital Underworld
© 2015 Jamie Bartlett
320 pages



When I was in middle school, the Internet was a distinct place, a world apart from 'real life'.  Now it has grown so ubiquitous that it's as exciting as the paved street outside my house.  When I first heard of the dark net a few years ago, I caught a whiff of the old excitement -- there are still places that haven't been bulldozed into boringness! While I had no interest in exploring the dark alleys of the internet, I took comfort in knowing they were there.  Jamie Bartlett's The Dark Net promises to reveal a little of what goes on in the digital shadows, but its real subject is the human condition, and how it interacts with the possibilities that the internet and its shadows provide.  Bartlett mixes criminal voyeurism and philosophical debate about the nature of freedom to great effect.

Bartlett examines two different 'dark nets'. The first is the submerged internet: websites which are only accessible with certain programs and certain knowledge.  Virtually all of the websites we use on a daily basis exist both above the surface and a little below it; for instance, banks have ample public areas for potential customers to explore, but certain rooms, like our individual account pages, are slightly submerged and accessible only through a username and password. But there's a deeper level to the web, websites that require specific browsers and knowledge of their URL to appear. (One address Bartlett finds here actually requires a series of cookies from other websites: users must visit a set of sites in a particular order before being able to load the target successfully; otherwise, an attempt to load the address will produce a completely different-looking site.)  On these hidden pages -- accessible only through secure browsers -- anything is for sale, from illicit drugs to lives, but there are also safe havens for whistle-blowers to upload documents to the media, or to hide from opinion-policing.

Connected to, but not limited by, this is the second 'dark' aspect that Bartlett explores: the effect that the internet's anonymity and diverse opportunities have on the human psyche. Here he catalogs support groups for destructive behavior: websites promoting anorexia and suicide, for instance, or which radicalize political opinions and produce bombers out of disaffected coeds.  The websites he explores here operate on the surface net -- places like 4chan and reddit -- but the behavior promoted reveals the darkness inside the human soul itself, its capacity for brutality.  In one case, a woman who foolishly posts a photo of herself without clothing is identified from background details, and the 4chan residents promptly send the photos to everyone on the woman's Facebook page to publicly humiliate her.

Despite this catalog of horrors, from child pornography to terrorist communities, The Dark Net is not a polemic against the evils of new-fangled technology.  Early on, Bartlett writes about the enthusiasm early adopters had for the internet, the generation of 'cypherpunks' who viewed the digital world as their long-awaited escape from the medieval dreariness of nation-states and their castles of control and surveillance. The internet was anonymity and freedom -- liberty.  Although the internet was normalized relatively quickly in the 1990s and early 2000s, saturating the geeks' playground with boring e-businesses and teenagers posting their journals online, technology did arrive to give the cypherpunks more of what they wanted: anonymous browsing via browsers like Tor, and secure modes of payment like Bitcoin.  Just as the lightless ocean depths create extraordinary creatures, so too have the pressures of illicit marketplaces created new ways of communicating and doing business, producing payment structures that allow some degree of trust without legal exposure.  Just as PGP encryption for secure messages has filtered down into email clients like Thunderbird, so too may multisignature escrow accounts for online payments one day appear above the surface for those who prefer not to use credit cards online for mundane security reasons.

The Dark Net is disturbing reading at times, particularly the chapter on child pornography.  There are lessons to be learned here, though, like the way highly specific interest groups amplify their members' devotion as they enter an echo chamber where increasingly strident views and increasingly antagonistic behavior are viewed as perfectly acceptable. (Bartlett believes, for instance, that consumption of child pornography grows out of increasingly compulsive consumption of ordinary pornography, as the user requires ever more provocative content to engender excitement.) The conclusion is worth reading in itself, as Bartlett uses two people -- a technohumanist and an anarcho-primitivist -- to examine different views of freedom. Although both view the present state of things as unattractive for shared reasons, their solutions are utter opposites.  Ultimately, although there's much here to give one pause about human nature, I still find myself faintly relieved to know (as I did in reading A Renegade History of the United States) that rebellion lives.

Related:
The Internet Police: How Crime Went Online, Nate Anderson
#Republic: Divided Democracy in the Age of Social Media, Cass Sunstein. Book about how highly specific internet filters and communities lead to increased polarization and disaffection.
Spam Nation, Brian Krebs


Sunday, February 19, 2017

Drone

Drone
© 2013 Mike Maden
432 pages



In El Paso, Texas, the raging narco-wars between drug trafficking gangs in Mexico have bled over into American streets -- claiming the life of the American president's son.  Having run on a platform of balancing the budget and reversing foreign-policy foul-ups that have cost countless American lives and dollars overseas, President Meyers nevertheless realizes something has to be done. After diplomatic and above-board covert ops fail to produce results, she turns to an ex-CIA spook named Pearce, now the head of a private military contractor that specializes in combat drones. His deadly campaign against one drug cartel will stir up a hornet's nest of woes, because several factions within Mexico are being manipulated by an Iranian involved in a multinational conspiracy.  More an intelligent technothriller than a Duke Nukem-style action novel, Drone offers speculation as to how drones might be employed -- and legally justified -- in the near future.  Drones are depicted here not just providing recon and a platform for launching missiles, but sniping targets using facial-recognition software.  Maden's presidential figure is an interesting character, a populist who achieved office by running against her own party and vowing to end endless foreign wars; she struggles to keep her desire for justice and order in line with a firm commitment to Constitutional government.  A downside of the novel, but a necessary part of its drama, is the domestic chaos that erupts from Meyers' new policies toward Mexico. After the narco-gangs strike back and the border is functionally militarized, the media casts Meyers as an anti-Mexican tyrant, prompting 'day without an immigrant' labor strikes and the like.  Maden has a good mind for the diverse kinds of political chaos imaginable in the United States today, but -- alas for those of us reading this at present -- that sort of chaos is going on now, so it's not enjoyable in the least to read about.
Everyone in this novel has a little schmutz on their face, including the principled executive who can only take the least-worst option from a list of bad choices.