November 21, 2024

Jobs Don’t Give Us Purpose and Meaning, Helping Does

Face it, few of us can say that our work has a high degree of purpose and meaning in the greater scheme of things. True purpose and meaning imply a degree of selflessness that few of us can afford when considered against rent, mortgages, college funds, or car payments. It’s one of the unstated frustrations in all the recent surveys that show how unhappy most employees are in their jobs.

That’s why it’s important for companies to provide an extra boost on the purpose and meaning fronts.

Companies Must Make Room for Helping
When you boil purpose and meaning down to their essences, you get helping. We want to feel like we’re contributing not just to our own or our families’ wellbeing but also to something bigger. You see it when a disaster happens – we instinctively rush to the blood bank or (often ill-advisedly) to the scene itself. We become desperate to do something to help.

A quieter sense of desperation follows us to work each day. But we often don’t have the opportunity to get that sense of helping on the job – unless we get a little help from our employers.

A few years ago, I was given the day off from work and walked with about two dozen colleagues into a huge warehouse that was filled with broken boxes. The boxes were filled with medical supplies that had been rejected for a bunch of different reasons, none of which included damage to the actual supplies themselves. We spent the day taping up those boxes and stacking them on pallets and wrapping them up with shrink wrap. They would soon be on their way to Africa, where they were desperately needed.

Helping Beyond the Job Builds Engagement with the Job
Now, I’m not going to tell you that suddenly my life was filled with purpose and meaning. But I did do something to help that I will never forget and got to bond with some colleagues that I had never met before. Kudos to my employer, SAP, which offers programs like “October Days of Service,” as well as a more ambitious program in which employees can work in developing countries for months at a time.

Good deeds like that don’t just make me feel good and build bonds with colleagues; they also bind me more tightly to SAP. I’ve never had an employer offer such programs. All other things being equal, why wouldn’t I now feel more loyalty to SAP? Obviously, there’s a lesson for companies here.

Helping Can Happen on the Job
But we don’t need to help the world to feel like we’re helping. When I was a journalist, I mentored young writers who asked for it. Trouble was, they had to ask for it. I can’t imagine anything worse than mentoring people who don’t feel they need or want it (even if they do). That’s why such programs need a push from above to succeed. Research by my colleague Michael S. Goldberg has found three examples of how companies can do this:

Intergenerational learning. American Express piloted a phased retirement program that allows some older COBOL programmers to work part-time instead of retiring so they can train, mentor, and coach younger workers.

Peer recognition, amplified. Macy’s reported higher employee engagement after implementing an in-house portal for retail associates to post stories celebrating peers’ good work. Recognition occurs at store, regional and national levels.

Purpose and public service. IBM offers sabbatical programs for employees to work on pro bono projects in developing countries. Employees get to ply their technical and management skills while having a meaningful experience, strengthening their bonds with their employer. Starbucks paid for its associates to contribute 631,000 community service hours to local neighborhood projects in 30 countries and runs a website listing projects its employees can join.

What are you doing to give yourself or your employees a greater sense of purpose at work? I’d appreciate your thoughts.

Why Can’t Companies Be More Like the Iroquois?

Flag of the Iroquois Confederacy, Hiawatha Belt (Photo credit: Wikipedia)

If ever there was a cooperative organization that had less reason to endure until today, it is the Iroquois League.

First formed sometime between the 15th and 16th centuries, it brought together five distinct tribes that had been warring and otherwise squabbling for centuries prior. Each tribe had its own language, customs, and culture. Located mostly in what is today northern New York State, the tribes’ unity (a sixth joined in the 18th century) allowed them to wield serious political power as Dutch and English colonists came to North America.

Caught on the Wrong Side
The league probably should have fallen apart after it backed and fought with the losing side in the American Revolution, the British. With most of their former lands seized by the Americans, the tribes were forced to move to Canada; those who remained were relocated to small reservations sprinkled across New York.

Yet despite all the wars, relocations, deprivations, and disease epidemics brought upon the individual tribes, the league survives. Perhaps it has to do with the flexible style of governance that has been in place since the beginning. Each tribe has the freedom to govern itself, yet there is a Grand Council of 56 Hoyenah (chiefs), or sachems (a number that has never changed), that confers about issues concerning the league as a whole.

Women hold a strong place in Iroquois society, leading individual clans within tribes, helping determine chiefs, and holding veto power over treaties and declarations of war (the Iroquois declared war on Germany in both world wars). In the 19th century, no treaty was binding unless it was ratified by 75% of the male voters and 75% of the “mothers of the nation.”

Why Can’t Companies Do This?
So the obvious question becomes, why can’t companies cooperate like this? Most are riven by silo (tribal) warfare, as employees who are all supposed to be working for the same cause – serving the customer – engage in turf battles and subvert one another in an attempt to appear to be the most effective contributors to the company.

It’s happened to me multiple times in my fewer than three years at SAP: nasty emails from someone I’ve never met demanding to know who gave me permission to publish a story that touches on their silo but does not overlap with anything they’re trying to do. There’s no discussion of whether what I’ve published is of good quality or could be helpful to a customer – it’s all about fear and power. I’m guessing you’ve experienced the same thing at some point if you’ve ever worked in a big company (or maybe a small one, too).

This has got to stop.

Customers are demanding that we deal with them in a unified, cross-channel fashion. They don’t want three different calls from three different sales areas of your company. They don’t want duplicative or conflicting messages coming from different parts of the marketing organization.

How Do We Stop the Infighting?
Research by my colleague Rob O’Regan has revealed a few ways to develop cross-channel cooperation:

  • Put someone in charge. Organizations need someone to orchestrate the cross-channel experience, even if they don’t own it. This person must be relatively senior in stature and visible across all functions, serving as an internal partner to connect disparate groups around a customer-centric strategy.
  • Develop plans collaboratively. More organizations are moving away from traditional top-down, bottom-up planning. Instead of having sales, marketing, finance, and operations each develop their own strategic plans, these companies have introduced collaborative planning, which puts everyone in the same room to create a shared plan, with the customer at the center.
  • Talk to the frontliners. Companies should also tap into customer-facing employees, who are a rich source of insights. Whoever heads up customer experience should oversee an effort to ask every frontline employee what’s impeding their ability to deliver excellent service.
  • Form temporary problem-solving teams. Companies have pockets of expertise about the customer experience spread across the company. They should look for ways to tap into these people to quickly resolve specific customer problems.
  • Focus metrics and incentives on long-term retention. Customer experience initiatives should be measured not on short-term transactions but on longer-term measures, such as lifetime value. For example, instead of measuring how quickly a call center agent answers a customer’s question, measure how infrequently customers call back.

I’m sure there are other ways to reduce silo and channel conflict that I haven’t mentioned. What do you recommend?

6 Freaky, Funny, and Scary Abilities of Computer Organisms

I intended to write this blog about some of the amazing, helpful, and scary things that robots can do today, but even a quick look at the information out there demonstrates that robot is already an archaic term, like calling a car a horseless carriage.

There’s simply no way that everything we’ve come to expect from robots in sci-fi novels and movies will exist independently within whatever bodies we put robots in for the foreseeable future. They are as dependent on the computing environment that surrounds them as living organisms are on their ecosystems.

Robots can’t do anything better than a computer except move. And the various species of computerized electronic devices now cover a spectrum comparable to that of living organisms: we have everything from the equivalent of one-celled animals, such as microscopic, single-function sensors, to highly evolved supercomputers and computers that, when moved around on a dolly, can visit TV sets to humiliate humans on game shows. In other words, robots are just one of many species of life-like computer organisms.

Here are just some of the behaviors in the computer kingdom that exist today:

  1. Tap into your subconscious. Electroencephalography (EEG), which has been around for almost 100 years, records brain activity through electrodes on the scalp. However, as scientists point out, only a small part of the brain activity measured by an EEG is under the subject’s voluntary control. Other signals, like emotions and likes and dislikes, are also captured. And everyone’s EEG happens to be as unique as a fingerprint.
    Traditionally, that hasn’t been an issue because EEGs have been the exclusive domain of research labs and hospitals. But in 2009, EEG escaped from the lab and out into the wild, in the form of an EEG device intended to let gamers levitate, Luke Skywalker-style, an object in a Star Wars simulation game using only their minds. A publicly released programming platform followed (there are now over 40 different games developed for it), and a Jurassic Park’s worth of unintended consequences suddenly became possible.
  2. Hack your thoughts, beliefs, and your bank PIN. Recently, researchers successfully launched a mock spyware attack through an EEG game in which they were able to reveal information about the user’s “month of birth, area of living, knowledge of persons known to the user, PIN numbers, name of the user’s bank, and the user’s preferred bank card.”
  3. Support life for lower robotic organisms. One of the problems with tiny computers is that there’s no room to store a lot of power. Anyone who’s had a first-generation GPS-equipped phone remembers how quickly those tiny chips sucked the life out of their phone hosts. But a French company has deployed a sub-internet in San Francisco that lets simple sensors send data frequently and over long distances without requiring much power and at a much lower cost. It opens up many more possibilities for monitoring technologies for health, business, fitness, and other activities because the sensors can essentially live longer and more independently without an external power source.
  4. Make humans shed tears. Japan is the capital of cute, so it’s no surprise that this little computer with the face and movements of an infant but the brain of an astronaut nearly moved his Japanese co-pilot to tears when the astronaut left the computer alone to run the ship while he returned to earth. But a much uglier computer had the same effect when it imploded while exploring some of the deepest ocean trenches known to man.
  5. Move like animals. They’re not the smartest chips in the fab, but there are now four-legged robots that have mastered one of the most difficult tricks that animals perform – balance – and can run and carry more weight than a cavalry horse while trailing their masters like loyal dogs. But walking on two legs is a much tougher challenge. There are still no robots that can move anything like humans, even with external assistance.
  6. Carry out assassinations without remorse (yet). We all have opinions about whether killer drones are right or wrong. I’m not going there here. But the military is experimenting with giving robots a basic moral compass. After all, besides being weapons, robots are also rescuers and explorers. To carry out their duties, they should know whom to rescue first. But when it comes to knowing whom to shoot first, researchers are highly divided as to whether robots can ever be trusted not to act like mass killers or terrorists.

What’s your favorite computer organism?

The Worst Source of Work Conflict

I have two attributes that set me up for conflict with my current colleagues at SAP.

First, I worked in small companies for my entire career leading up to 2012, when I joined SAP, which is a huge company and, like any huge company, has its share of bureaucracy that was absent from my previous workplaces. Second, I spent most of my career as a journalist, which trained me to be vigilant about whether what people say is what they really mean – especially when it comes to marketing, which is the function I work in at SAP.

But after two-and-a-half years working here, I’ve decided that neither of those attributes is the real source of the conflict.

The worst culprit in workplace conflict is technology – no matter the size of the company.

There are six ways that technology (current technology, anyway) leads to work conflict.

  • Institutionalizes the memo mindset. If you’re an old fart like me, you remember a time when the internet was something being hatched by DARPA and there was no voice mail. If bosses wanted to send a one-to-many communication, they used the memo.
    But bosses knew that employees hated getting these stilted, impersonal communications and that memos usually signaled that something bad was coming and that the boss was too scared to tell them face-to-face in a meeting. Plus, memos took a long time to produce and distribute so lazy bosses tended to avoid them. E-mails are just like memos, except unlike memos, they are easy to produce and distribute. Just type angry, pick a distribution list, and let it fly.
  • Leaves time to invent meaning between the lines. Research has found that e-mail communications strip away trust, and lack of trust lights a can of Sterno under employee relationships, starting a slow burn that can lead to overheated conversations.
    However, e-mail is not the only culprit here. Blogs, social media, and even videos have much more power to offend people. In social media the conflict can become exponentially worse because it can be crowdsourced.
  • Slices the workday into impenetrable chunks. Sometimes I wind up writing an e-mail about a subject that I know would come across better in a conversation but which doesn’t rise to the importance of a meeting invite. Calendar apps have taken away our permission to just pick up the phone and chat with a colleague.
  • Culturally clueless. One of the most important institutions in Japanese culture is the pairing of honne and tatemae, which translate roughly to private face and public face. Historians trace the phenomenon to Japan’s history as a crowded island that needed to avoid conflict. There are at least 16 different ways in Japanese to use the word “I,” depending on the relationship between speaker and listener. So clearly, the Japanese have been working on this idea of public and private meaning for a long time and get it.
    Americans? Not so much. In fact, nothing angers an American more than a colleague who sends an e-mail that doesn’t match with what that colleague said in a meeting or even in private.
  • Makes it easy to stir extended conflict. It’s way too easy to cc people in an e-mail whom the original recipient doesn’t know, thereby seeding mistrust. It’s also easy for the people on the cc list to lack the context for judging the communication properly. Cc is also sometimes taken as license to contact the original recipient with opinions or orders that seem to come out of nowhere.
  • Calls for skills many people don’t have. Most people are not natural writers. The smartest engineers may not be able to write a decent e-mail or tweet.
    And writing isn’t the only skill needed to survive technological communications. From infancy, we are trained to respond to social cues. Videoconferencing or any sort of public speaking makes many people nervous. They give off social cues that viewers automatically pick up, which can lead to a negative bias, which can lead to conflict.

So if all these technologies are so dehumanizing, how do we fix them? In a word: context.

Online communities have taken one step towards defusing potential conflict by requiring participants to fill out extensive biographies and list their experience and interests. Few companies have followed suit, however. And there is so much more that could be done from a technological point of view to ease employees’ minds about the intent of a colleague. Providing more and better context will be critical to the future of work, especially given the high degree of disenchantment among employees today.

How do you think technology could become a way to defuse conflict at work rather than encourage it?

The TV Is No Longer a TV

I am in the vanguard of cord cutters, a small but growing group of cable TV subscribers who have decided to ditch the cable box in favor of a variety of geeky devices that serve up entertainment through an internet connection.

Between 2008 and 2013, 5 million (or 5%) of US cable subscribers cut the cord, with 1.3% brandishing the scissors in 2013 alone, according to Toronto-based Convergence Consulting Group.

When I told my wife and daughter we were trendsetters, they rolled their eyes and said I was just being cheap (again). Regardless, a change in the way we think about entertainment has swept through my household and 5 million others in the US: The TV is no longer a TV; it is simply the biggest screen we have for watching entertainment.

It’s All About Screens Now
That’s because our new content providers, Hulu and Amazon Prime, are as easy to watch on a computer, an iPad, or, in a real pinch, a phone, as they are through the Roku device attached to the former TV. (When we absolutely need to see live network broadcasts – my wife and daughter insisted on seeing the Oscars live, for example – I plug in a set of Radio Shack digital bunny ears to turn our big screen back into a TV for a few hours.)

The New York Times says that cord cutting doesn’t save much money, but I can attest that in my house (near Boston) it saves $125 per month. Not exactly chump change. Plus, we never watched that much programming to begin with, so the savings are that much more satisfying.

As you might imagine, stories like these are starting to throw a scare into the cable companies and the entertainment industry as a whole. “In the U.S., consumers are seeing fewer differences between telecommunications and entertainment,” says Jack Plunkett, CEO of Plunkett Research. “It’s all the same thing. We have truly entered an era of convergence where data, entertainment, and communications are all falling into one package.”

Except now it’s the consumers doing the packaging rather than the cable and telecom providers. Research by my colleague Polly Traylor turned up three ways that the status quo is threatened:

  • Frictionless consumption. There is a reason why Netflix and Apple iTunes have been so successful: they both have world-class selection and make it extremely simple to find what you want and begin listening or viewing immediately.
  • Everything is an entertainment device now. Even the top providers of gaming platforms – Sony, Nintendo, and Microsoft – are now vying for the same entertainment eyeballs as the studios and networks and are retrofitting their machines into multipurpose entertainment devices that stream content from Netflix and other Internet video providers.
  • Disruptors are everywhere. I’m sure that by now you’ve heard of an Internet TV startup called Aereo that uses tiny individual antennas to let consumers in several U.S. cities watch live broadcasts on Internet-connected devices and store shows in the cloud to watch later. All the major broadcasters have sued for copyright infringement and pushed it all the way up to the Supreme Court. Needless to say, if a tiny, barely two-year-old startup is already having its day in (Supreme) Court (against its will), we are in the midst of interesting times for the entertainment industry.

How have you changed the ways you consume entertainment?

What Pisses off the Man Who is the Face of 3D Printing

MakerBot, a manufacturer of desktop 3D printers priced at the level of a decent laptop, is the best known of the companies making a product whose potential impact on business and society is already being compared to that of the PC.

Bre Pettis, the CEO of MakerBot, has become the face of this long-simmering but suddenly hot business (3D printers – the really expensive kind, anyway – have been around for decades) in part because he had lots of practice being a public face long before he ever thought about launching his company.

That background is why he was so pissed off at the recent Front End of Innovation conference, where he gave a speech about his approach to innovation.

What Caused the F-Bomb
Now, it’s important to put pissed off in proper context when talking about Pettis, who, when it comes to being the face of a new technology, hews much closer to Apple’s polite, tranquil (and nearly forgotten) co-founder Steve Wozniak than the other Steve. Besides some hair gel to sweep back a thick shock of salt-and-pepper hair and some long, hipsterish sideburns, Pettis wears the uniform of the typical sloppy, slack, sneakered, untucked anynerd and seems utterly comfortable in the skin beneath it.

That’s why when he uttered the F-bomb on stage (he apologized in advance) it came as a bit of a shock. He was talking about the US education system, saying that it is “f***ed.”

After his speech to the conference at large, Pettis held a Q&A in a small side vestibule where he was asked to explain what makes him so angry. Basically it’s the things we tamp down with Ritalin today: “We don’t let kids be playful, explore, or help them understand who they are,” he said.

A CEO Who Lived the Crisis in Education
Pettis is one of the few CEOs today who can speak about the education system from experience. He taught art in the early ’00s in a middle school in a poverty-stricken Seattle neighborhood. “If you are white you can basically skip school and not miss anything,” he says bitterly. “If you’re poor, you need the structure of the school system. About half the kids I taught got their breakfast through the school.”

With kids like these, many of whom lacked a consistent adult presence in their lives, Pettis discovered that using the medium that the kids were growing up with, video, was a good way to reach them. He did a series of videos of himself demonstrating how to make art projects and then had the eerie experience of playing the videos in class while standing next to the monitor. “They retained the information better when they watched the videos than they did when I taught in person,” he recalls.

Viral How-Tos
Not that Pettis was a bad teacher. Indeed, when he began uploading the videos to the internet (this was in the pre-YouTube days), they got tens of thousands of views. That led a publication called Make to approach him about producing a series of weekly how-to videos for more pay than he was getting from the Seattle school system.

An obsessive, energetic tinkerer from an early age, Pettis couldn’t resist the offer. The connection with Make eventually led him east, where he co-founded NYC Resistor in a warehouse in Brooklyn so that people like him could get together and spend evenings cobbling weird stuff together.

From that base emerged MakerBot, which Pettis co-founded with two others in 2009. NYC Resistor is now housed in space upstairs from MakerBot’s Brooklyn factory.

Bringing Playfulness to Schools
As he demonstrated at the conference, Pettis hasn’t lost his passion for education. In 2013, the company launched a program called MakerBot Academy, in which teachers can request a 3D printer for their classrooms through DonorsChoose.org, a crowdfunding site for educators. Pettis has pledged personally to supply every high school in Brooklyn with a printer and has enlisted hackers on Thingiverse, the company’s community site for sharing 3D print plans, to develop a curriculum for teachers to download and plans for printing objects in the classroom. “I want to put a 3D printer in every school in the US so kids can feel empowered to create,” he said.

Of course, he wants those printers to be made by MakerBot, but what keeps his pledge from sounding like another one of those attempts to burnish the company image and revenues while doing good is the personal connection Pettis says he still has to teaching. “My skill is that I know how to gather people to do wonderful things,” he said. “I couldn’t be a CEO without having been a teacher first.”

When we think about the industrial supply chain, we can’t forget the resources that ultimately make it happen when they grow up: children.

The Robot I’d Have a Beer with – After He Takes My Job

The robots are taking over. I know, sounds like a teaser for a 1950s sci-fi movie. But it’s happening. It’s really happening.

Actually, it’s starting with just two robots. One is named Baxter. If you haven’t heard of Baxter, he is to our traditional perception of robots as a gourmet meal is to a salt lick. Most robots are single-function creatures, welding roofs to cars or drilling holes or some other basic operation.

But Baxter is a robot of refinement. He multitasks. For example, he can be programmed to pick and sort as well as any college kid at the UPS or FedEx distribution megaplexes at Christmastime. And unlike his primitive forebears, who freak out when the bolts aren’t in exactly the same place they were the last time they reached for them, Baxter is cool with uncertainty. He can find stuff if he needs to, though if he drops a tool out of reach (proof of his amazing dexterity, not his klutziness), his expression (he has a simple, kindly monitor face) turns to one of confusion and he throws up his arms (well, actually he puts them down, shuts off, and waits for a human to help him).

Baxter’s Kind of a Wimp
See, that’s the problem I have with Baxter. He is, to be perfectly honest, a bit of a milquetoast. He’s kind of afraid of humans – his software DNA tells him instinctively to avoid bumping into humans when working on an assembly line, for example. And he lets humans invade his personal space whenever they want to manipulate his arms to teach him new things to do.

And he just lets them do it!

What a wimp. I’m not having a beer with Baxter anytime soon.

Victor Has Got Some CPUs on Him
Now Victor, this guy, uh robot, is another story entirely. His monitor face has got prematurely gray hair tousled in a kind of I-don’t-care-but-I-care-enough-to-spend-$200-on-this-haircut kind of look. And he’s got a soul patch and a pair of glasses that look like they came off the rack at Armani. I’m sure that beneath his screen he’s wearing a black turtleneck that would have made Steve Jobs proud. In other words, Victor’s got attitude.

And man, can he trash talk, especially when he’s playing Scrabble, which is his favorite game. First he tries to amp his cred and intimidate his rivals with bombast like, “I am the current king of Scrabble, Victor the mechanical marvel – that’s Victor the brilliant for short.”

Then he goes after his opponents. He’s currently hanging at a university proudly known for its nerdiness, Pittsburgh’s Carnegie Mellon University (CMU), so he goes right for the awkwardly protruding Adam’s apples of the students who challenge him: “Your words scored less than a CMU student at a party.”

Man, this dude’s got CPUs, huh?

Robots Are Learning to Fit In
But like most trash talkers, Victor’s abilities don’t quite match up to the words. In fact, he sucks at Scrabble and he’s a really bad loser. And he gets down on himself when he loses, looking really pissed off and revealing a vulnerability that is, well, kind of human.

And that’s the point. Scientists are realizing that for robots to co-exist with humans, they have to be a little more like us. So you take Victor and give him some arms (hey, who needs arms to work a virtual Scrabble board?) and a better education and now you’ve got a robot that could fit in at any Silicon Valley startup.

Lose the soul patch and glasses and put a baseball cap over that hairstyle and he might be ready for a few boilermakers at a dive bar with the gang from the loading dock – if he doesn’t replace them first, that is.

What Happens When They Really Do Fit In?
And that’s the other point. Robots are already able to do a lot of things that humans can do. And that has major implications not just for factory workers, but for, let’s say, home health aides who lack a sense of humor, or journalists who can’t write like Hemingway (and even some that can). Business leaders and politicians need to get ready for the wave of job displacement that’s going to come when robots like Victor and Baxter get their mojos really working.

This is a big issue that MIT researchers Erik Brynjolfsson and Andrew McAfee tackle head on in their fascinating book The Second Machine Age. You should read it. It will change your view of the internet, economics, and robotics forever.

Are you ready for a robot to replace you?

Patents Are Dead

I have always associated innovation with patents.

It’s the classic vision of the lone, wild-haired inventor in a grimy white lab coat putting himself and his family (assuming they’re still hanging around – he spends all of his time in the basement tinkering and cursing) through abject poverty until he finally hits on something useful and a huge conglomerate buys his invention and he moves to a mansion.

Thanks to something called a patent.

Patents Make Things Valuable Through Scarcity
Patents were first invented by the Venetians back in the 15th century to stimulate the wild-haired inventors (in this case glass makers) to invest the kind of obsessive energy required to create new things, by protecting them from copycats and giving them a financial reward for their work.

Ever since then, patents have created what economists and lawyers call artificial scarcity. For a certain period (usually 10-20 years), no one else is allowed to make or sell the product (or service) besides the inventor and the company he or she chooses to make the patented product, so that they can recoup the investment made to invent, design, manufacture, and distribute the product and make a tidy sum besides – basically whatever price the market will bear.

Things Were Fine Until the Internet Came Along
But the patent system assumed that the promise of financial reward for their efforts was what motivated the wild hairs to keep at it. Turns out that’s not the case, at least not anymore. As the entertainment and publishing industries have discovered much to their chagrin, people just like to create things, solve problems, and get public recognition for it, even if that recognition doesn’t involve financial reward.

And patent holders can’t even use patents for their more dystopian purpose anymore: to discourage copycats from getting in on the action. Despite unleashing tens of thousands of copyright infringement lawsuits (copyright is a kind of more relaxed hippie cousin to patents) upon internet music and video pirates, the entertainment industry has failed to stop the practice.

Yet despite widespread piracy, musicians produce more music now than ever and quality has not suffered. Publishing has endured the same fate, yet more books are being churned out than ever. And let’s not even get started about YouTube.

The Suits Lose, Artists Keep Suffering
It’s really only the suits that are losing out. The artists are simply continuing to suffer just as they always have.

What kept musicians and inventors in the basement for all those years wasn't the promise of financial reward (though that certainly helped); it was that the costs of producing and distributing their work were simply too high for them to bear on their own.

But that’s all changed now. As I explained in a recent post, the costs of producing and distributing books and music have dropped to nearly zero. Just set up a free blog or download free audio recording software to your computer and you can stream your creations to a nearly limitless audience for free.

The Costs of Production Approach Zero for Everything
But publishing and entertainment aren’t anomalies. There are three other areas where the costs of production will soon reach zero, according to a fascinating academic article (in this rare case that’s not an oxymoron) by Mark A. Lemley, William H. Neukom Professor of Law at Stanford Law School:

  • 3D printing. You’ve already heard how physical items can now remain in the nearly cost-free form of bits and bytes until they reach a really cheap printer in your bedroom or office.
  • Synthetic Biology and Bioprinting. Scientists have made entirely new forms of bacteria unlike anything found in nature. It’s not easy. But it will get easier. Synthetic biologists are developing collections of “biobricks” – individual modules that can be assembled into organisms. Because these bricks are information, they can be shared and recombined in numerous ways, says Lemley. Just like music and publishing, development and distribution costs will approach zero.
  • Robotics. Honda predicts that it will sell more robots than cars in 2020. Sounds like a market that it could try to sew up through patents. Yet even the realm of super-expensive and complex robotics is vulnerable to a loss of scarcity. Just like PCs, the value of robots will be not in the hardware but in the programming – what we can get the robot to do besides clean floors or weld cars – and we already know that there’s a robust open source community for programming on the internet.

Can the patent survive in a world without economic scarcity? Lemley, who is one of the world’s leading authorities on patent law, isn’t so sure. Intellectual property law’s original justification, that it stimulates innovation, seems to be a myth. Instead, patents encourage commercialization – the financial return that investors see in bringing a patented idea to market.

But patents don’t even do that well. Lemley argues that patents actually discourage commercialization; inventors and investors are too concerned about patent turf to invest in competing (as opposed to copycat) products. For example, a government report found that patents in the field of genetics actually limited patient access to genomic testing that would have helped them determine whether they had potentially life-threatening genetically-based diseases.

Lemley predicts that industries that see their profits threatened by the democratization of production will do just what the music and entertainment industries did: try to sue the threat out of existence, with the same costly failure for the suits and the same flowering of innovation for the wild hairs.

Patents will not disappear anytime soon, Lemley says. It’s still simply too expensive to bring a blockbuster movie or drug to market for patents and copyright to become completely irrelevant. But increasingly, high-cost products “will be islands of IP-driven content in a sea of content created without the need for IP.” (See, I told you this guy can write.)

As for me, I will never confuse patents with innovation ever again. How about you?


How Do You Market Something That’s Worthless?

I come from an industry (publishing) where the cost to produce the product has dropped to zero. Today, anyone can go to WordPress.com, set up a Web site, and begin publishing news and information to the world – for free. (I know, tell you something you didn’t already know, right?)

It Won’t Stop with Virtual Goods
But here’s the new wrinkle. The publishing industry is imploding because its products can be produced entirely via bits and bytes and therefore, the marginal cost, as Jeremy Rifkin puts it so eloquently in this interview and video, has dropped to zero. It becomes extremely difficult for a publisher to sell a Web site subscription when so much is available for free.

But what happens, asks Rifkin, when you cross the line from the virtual to the physical? Seems pretty hard to bring the cost of producing a cell phone to zero, right? And remember how economists have been saying that localized service jobs (plumbing, hair cutting, etc.) are immune to this kind of disintermediation?

Not so, says Rifkin.

Beyond the Hype of 3D Printers
What’s refreshing is that Rifkin doesn’t just list the in-vogue economic disruptors of the moment, the 3D printer and the Internet of Things, as the reasons why physical products and services will go the way of publishing. Rather, he combines them into a compelling vision of overall economic transformation.

The missing piece of the puzzle that fell into place for me as I listened to him talk was that ever since the World Wide Web came along, we have been continuously training generations of people to do things themselves and in collaboration with others. For example, we figure out how to get a free WordPress blog online, on our own or through word of mouth, and then we learn how to collaborate with others through social media.

Pretend the Industrial Revolution Never Happened
In a sense, we are training people to pretend that the industrial revolution never happened and that we can go back to making things the way we did before factories and steam engines came along: by ourselves, or in collaborative groups both small and (thanks to the near ubiquity of the internet) large.

Given access to the same easy-to-use, free tools that I use for publishing, I could produce a cell phone that does exactly what I want it to (my iPhone 5S doesn’t). But it won’t look like something from a Lego box because of another important development: I will be able to gather data about what the cell phone can and should do and what it should look like from my friends and the general public. These things all have GPS devices in them that only do location today, but will do much more very soon. Already app writers are pushing the boundaries of GPS on cell phones.

Monitoring – the Good Kind
As we become more comfortable having monitors on ourselves all the time (which means we will also very soon need something equivalent to the Bill of Rights for data), we will demand them on our products and vehicles, too. And that will give us access to extraordinary amounts of valuable data that until recently were only available to governments and big companies, as well as apps that make analyzing and interpreting that data as easy for us as it is for them (okay, so it’s not so easy yet).

We will have the power and information to invent, or at least easily assemble, many of the things we have relied on companies to make for us. For companies, that means innovation, rather than plants and machines, will be the key to survival.

My question to you is this: how do we market in a world of zero marginal cost?


How Technology Keeps Us from Being Citizens

Repurposing this from my recent post for SAP because I think it’s really important:

 

Every now and then we have to remind ourselves that not having access to a computer or a smartphone has more serious consequences than missing out on playing Candy Crush whenever we feel like it.

That’s particularly true when it comes to being a citizen. For every person who tweeted during the Arab Spring, there were many others who could not. In fact, every nation, regardless of its wealth and level of political control, has its share of people who cannot participate as citizens because they lack access to technology.

Ines Mergel, in her book Social Media in the Public Sector, describes this phenomenon in the United States as the “Town Hall Divide,” according to research by my colleague Regina Maruca. The lack of technology access, combined with increasing apathy and frustration among citizens in the US and elsewhere, means that governments need to do more to keep the technology loop from tightening only around the wealthier and more educated among us.

This isn’t easy to do. There aren’t any comprehensive solutions – not even on the horizon. “I think you can draw a direct line from the problem as it exists today right back to the inception of what we think of as online government,” says Mark Headd, Philadelphia’s chief data officer. “We got saddled with this problem and I don’t think we’ve done a very good job addressing it.”

Take Small Steps
The good news is that even small steps can make a difference. Philadelphia, for example, runs about 80 public computing centers designed to encourage people to engage with the city through digital channels. The city also runs “PhillyRising,” a pilot program that puts city staff members on the ground in troubled neighborhoods to engage with residents directly and to use technology to bring more people into the world of online engagement.

In Boston, meanwhile, the city is trying to use its own technology to fill the online access gap. Using data from different departments, Boston is creating a heat map that reveals which properties around the city generate the most emergency calls and police incidents, tax problems, utility issues, and so forth. With these different systems coming together, the city’s task forces can operate on fact rather than feeling, making decisions more quickly and with higher confidence than they ever had previously.
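The core of that kind of heat map is simple: merge incident records from every department and tally them per property. As a minimal sketch (the property names and department labels here are invented for illustration; Boston’s actual data model is far richer):

```python
from collections import Counter

# Hypothetical incident records pooled from different city departments,
# as (property, source_department) pairs.
records = [
    ("12 Elm St", "police"),
    ("12 Elm St", "fire"),
    ("12 Elm St", "tax"),
    ("48 Oak Ave", "utilities"),
    ("48 Oak Ave", "police"),
    ("7 Main St", "tax"),
]

# Count incidents per property across all sources; these totals are
# what a heat map would color by.
totals = Counter(prop for prop, _ in records)

# Rank properties from most to least troubled.
ranked = totals.most_common()
print(ranked)  # [('12 Elm St', 3), ('48 Oak Ave', 2), ('7 Main St', 1)]
```

The point isn’t the tallying itself but the pooling: each department’s count alone looks unremarkable, while the combined total surfaces the properties that need a task force.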

More needs to be done, however. How are you seeing this issue play out in your community?
