MANY'S THE BOOK author (and movie scripter) who has romantically rewritten the history of great discoveries. The common conceit is to celebrate a singular person toiling away in his garret until he has that great lightbulb moment that changes everything!

So mused tech chronicler Walter Isaacson in a speaking gig earlier this week at the Central Branch Library on Logan Square.

And geez, even Isaacson has been guilty of the crime, to a degree, with his name-branded hit biographies on Steve Jobs, Ben Franklin, Henry Kissinger and Albert Einstein.

But that's not happening with his new book, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution (Simon & Schuster). It chronicles the men and oft-overlooked women who brought us breakthroughs like the transistor, the programmable computer, the personal computer, video games and the Internet.

These innovations were mostly group projects, he asserted. And Isaacson even "group sourced" parts of his book, putting chapters up online for critical appraisal. "In one week, I got 18,000 responses on the cultural brouhaha in San Francisco," he said.

Isaacson himself is no stranger to celebrities. Or technology.

As a youth, the self-proclaimed geek soldered together Heathkits (DIY radios, TVs and hi-fi amplifiers) in his basement. In adulthood, he took leadership positions at CNN and Time magazine, launched Time's online digital initiatives and the innovative "Teach for America" project, active in Philadelphia since 2002.

And for the past 10 years - when not writing hit biographies - he's steered the Aspen Institute, a think tank and meeting place for political leaders, business tycoons and transformative creators from around the world.

Like his late friend Jobs, who trusted the writer to share his story in 2011's best-selling biography, Isaacson believes products and society operate best when both sides of the collective brain are engaged and working together - when equal respect is given to arts and humanities, science and mathematics.

SHOCK THERAPY: William Shockley, the "anti-hero" of Isaacson's new book, "was bad about not sharing credit" for the transistor, arguably the most important breakthrough of the modern tech age (sometimes called the "Third Industrial Revolution").

While leading a Nobel Prize-winning team at Bell Labs in Murray Hill, N.J., Shockley (also condemned later for racist remarks) had two major collaborators, John Bardeen and Walter Brattain, plus a strong backfield likewise working on the micro-sized, low-energy, solid-state replacement for the glass vacuum tubes that then ran everything electronic.

THE FIRST NETIZEN? Al Gore has sometimes been identified as (and has taken credit for being) the "father" of the Internet. But long before he put his paws on it, the communications conduit had gotten up and running through collaborative work between the military establishment (traditionally a high-tech driver) and universities.

Still, the Internet remained a private club until the Gore Act of 1991 - "passed in bipartisan fashion" - unlocked the net to "everyday users" through early services like CompuServe, Prodigy and AOL, said Isaacson.

GETTING THE LUDD OUT: Tracing 17 of the most important innovations of the digital revolution, Isaacson's new tome also takes pains to undo the myth that tech is a man's world.

He awards "first conceptualization of a multi-functional computer" honors to 1830s figure Ada Byron Lovelace, a mathematician and daughter of Britain's great Romantic poet Lord Byron. Ironically, her dad was a Luddite sympathizer, defending the followers of a perhaps-legendary young hooligan named Ned Ludd who went around smashing mechanical looms, fearing the punch-card-programmed machines would put weavers out of work.

Ada, by contrast, was "mesmerized by the loom," noted Isaacson, "believing the punch cards were a way to teach machinery to do things that were beautiful, that could connect art and machinery . . . She called it poetical science."

CARD SHARKS: Punch cards also would be deployed on the first truly programmable computer, built right here in Philadelphia at the University of Pennsylvania. We're talking 'bout the military-funded ENIAC (Electronic Numerical Integrator and Computer), developed for our armed forces during World War II to calculate artillery shell trajectories.

Two Penn guys - John Mauchly and J. Presper Eckert - usually get the credit for ENIAC's invention and for the commercial variant UNIVAC they'd build after leaving the university.

But the guys never would have gotten the thing cranking, argues Isaacson, without six female mathematicians who learned how to program ENIAC at the Aberdeen Proving Ground. Or if Mauchly hadn't first sniffed around for building tips "like a bumblebee collecting pollen" at the 1939 World's Fair, Dartmouth, Vassar and Iowa State, where physics professor John Vincent Atanasoff had been toiling away on the tech tip.

"Atanasoff was headed in the right direction but was struggling with just one student assistant and no financial resources. He couldn't get the computer mechanics down, couldn't get the punch card burners going," related The Innovators chronicler in his talk here.

There's another digital age message, too: "Vision without execution is just hallucination."