The Mythology of the University Degree

When I spoke to a national collegiate conference in 2015, I titled my talk “Expect Chaos: IT Career Paths – The Next 50 Years.” I made it clear to university students that they had been told a story about education and career that would probably not come true for them. It’s the same story that we tell to all of our young people:

Graduate high school – get into a good college – declare a major – graduate – get a job in your field – advance in your profession – retire

This pathway does not reflect the reality of our careers, the value of education, or the expectations of the marketplace. US Census data shows that only 32.5% of adults hold a bachelor’s degree or higher. Of those people, only 28% work in their field of study. Do the math (0.325 × 0.28 ≈ 0.09), and you end up with roughly 9% of the adult population. After you consider people who need a degree to practice – physicians, architects, engineers – who is left? A minority of the workforce has a degree, and a minority of degree holders are active in their field. If university education is the key to a good career, how do we explain the disconnect between degree and career?

In the context of these numbers, consider the hiring process. Job descriptions for professional positions often include: “bachelor’s degree required, master’s preferred.” Ignoring the fact that only 12% of adults have an advanced degree, the bachelor’s requirement shuts out the 67.5% of the workforce without one. Since most Human Resources departments use software to qualify candidates based on educational attainment and keywords, there is little hope that a job seeker without a degree will be considered. Their resume goes directly into the NO pile, regardless of practical experience or performance capability.
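How does a resume land in the NO pile before a human ever reads it? Here is a minimal sketch of the kind of degree-gated screening logic described above; the field names, qualifying degrees, and labels are hypothetical, not taken from any real applicant-tracking product.

```python
# A minimal sketch of degree-gated applicant screening (hypothetical fields,
# not any real ATS product). The gate runs before any human review.

QUALIFYING_DEGREES = {"bachelor", "master", "doctorate"}

def screen(candidate):
    """Sort a candidate into REVIEW or NO based solely on educational attainment."""
    degree = candidate.get("highest_degree", "").lower()
    if degree not in QUALIFYING_DEGREES:
        return "NO"  # experience and capability fields are never consulted
    return "REVIEW"

applicants = [
    {"name": "A", "highest_degree": "some college", "years_experience": 12},
    {"name": "B", "highest_degree": "bachelor", "years_experience": 1},
]

for a in applicants:
    print(a["name"], screen(a))  # A -> NO, B -> REVIEW
```

Notice that the experience data is sitting right there in the record; the gate simply never reads it.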

Why does that matter? Go back to the IT students I addressed. Statistically, 59% of them will graduate within six years. So, 41% will fall into the no-degree pile. There are more complicated reasons for leaving university than lack of aptitude, including high cost, perceived lack of value, and pressure to work. A 2011 Pew Research paper showed that people understand the connection between a degree and lifetime earnings, but short-term financial obligations win. To hiring managers, people with some college are indistinguishable from those with none, regardless of capability or practical experience.

For many people, the very reason to complete a degree program is to qualify for work opportunities, which is why they take on student loan debt. There’s the promise of a payoff. And, in the US, that debt is a real burden. In the Fast Company article How The Master’s Degree Became The New Bachelor’s In The Hiring World, Lydia Dishman showed that US employers are raising educational requirements because they believe attainment correlates with work quality. Requiring a master’s degree narrows the playing field from roughly 30% to 12% of the total workforce. Dishman contrasted that situation with two UK firms that dropped educational attainment as a requirement, either because it didn’t correlate with performance or because the practice unfairly barred people from economically disadvantaged backgrounds.



But let’s get back to the title of my conference talk, “Expect Chaos.” In the marketplace, change is fast, relentless, and constant. People who are retiring now at age 68 graduated university in 1970. That year, there were 2,388 bachelor’s degrees awarded in computer and information sciences. In the entire country. Programming meant punch cards. Now, anyone can build an app with another app, host a server for a favorite online gaming community, and access artificial intelligence through consumer-grade services like The Grid. The market will shift dramatically in the next five years, let alone 50, and the pace of change requires the ability to acquire new skills in the workplace in real time, independent of degree status. The Bureau of Labor Statistics has never studied career change, so there are no official statistics, but I suspect that most students will end up in a different place than they started.

I believe that Human Resources is using educational attainment as an outdated measuring device – a heuristic. It is a mental shortcut that gives organizations the illusion that they are selecting employees who have demonstrated certain capabilities. Using minimum educational attainment appears to be a safe way to determine that a worker is qualified for a professional position. In practice, though, limiting the talent pool means that the organization misses out on talented employees who are otherwise qualified. It’s dangerous to organizational capacity, diversity and inclusion, and general competitiveness.

For transparency, I am among the 41% of US college dropouts, the 67.5% without a bachelor’s degree. I am also a committed adult learner who has produced TEDx events, contributed to a United Nations initiative, and helped homeschool my children. When I was a 20-year-old student enrolled in General Studies, I found my university experience uninteresting and pointless. I had no specific career trajectory, and my parents had four other children to educate. I felt like I was wasting resources and time, so I chose to leave.

My personal experience obviously guides my perspective on this issue, but I’m not alone. If money were no object, many people would continue, restart, or begin their university education. Detroit and Boston now provide two years of free community college to their high school graduates because they know that an associate’s degree makes people more attractive to Human Resources and gets students closer to a bachelor’s degree. Higher educational attainment also correlates with a reduction in poverty, criminal activity, unplanned pregnancy, and other social problems.

In practical terms, our workforce needs more capable, skilled workers to keep up with global market demands. If you believe that those workers come from universities, make that education free and accessible to all people. If you agree that skilled workers exist but aren’t given a fair chance to contribute, make a bachelor’s degree “nice to have” but not a requirement for employment consideration.

Did Agriculture Ruin Equality?

Wherever you are in the world, you are surrounded by dreams that came true. We live in built environments, use man-made tools, and communicate in languages of our own design. Someone devised the relationships you use — romantic, organizational, monetary — and the devices you use to care for your home and body. In every part of your life, you are influenced by dreams of a new future.

When planning for the future, it’s critical to understand the past. Humans have a very short history — 200,000 years — during which we’ve migrated out of Africa to cover the globe. For nearly that entire time, human societies were, by today’s standards, incredibly egalitarian. While many aspects of work were divided by gender, women had roughly the same social status as men. [Wikipedia] Adults made decisions as a group, social structures were flat, and the concept of individual ownership was nearly unknown.

How did we get to now? Agriculture. Between 12,000 and 10,000 years ago, humans began cultivating and storing grain. This most recent six percent of our history has been a period of radical change, and it is not yet over.

Burial chamber of Sennedjem, Scene: Plowing farmer, circa 1200 BCE [Public Domain]

Agriculture produced excess food. Eventually, some people stored grain privately, leading to basic inequality. Inequality meant that former peers were set into opposition for resources — the individual against the group. This resulted in conceptual shifts in ownership, social structure, exchange, labor, and political control. We started to change the rules of our deepest interactions.

Why is this important? The changes brought by agriculture were not inevitable. They were challenged at every turn… and they are still being challenged in our modern world. For 94% of our history, we provided for each other. We shared resources and advantages. We shared authority and decision making. For the past 6%, we have been pushing for individual rights, for personal wealth, and for ultimate domination of plants, animals, and other humans. These forces — the group and the individual — are in constant conflict.

At each level of community, people are trying to reconcile ideas about equity and fairness. Each advantage, by definition, requires a disadvantage. Cities meet the ever-higher demands of capitalism by granting tax advantages to corporations that build new housing, which both reduces communal tax revenue and increases housing costs for the poor. Nations implement compulsory education programs and debate how much is enough. Globally, nations battle each other for control of resources while sending aid after natural disasters.

Humans have a relatively strong sense of duty. We cared for our own, and we still do, even when it clashes with our sense of self-protection. The problem is that we don’t know where our obligations end. Should our slogan be Every Man for Himself?* Should it be One for All, and All for One? Meet in the Middle?


Can we figure out income equality if we still struggle with the drive to hoard grain for ourselves? Could we mutually benefit from a minimum income if we don’t trust the people we can’t see? Is our population simply too large for us to truly understand that the children suffering poverty are our children?

Humans make the rules of society. When our tribes expanded because of excess food, we could no longer work out differences face-to-face as a group. We needed laws to govern behavior. We established some laws in the spirit of equity and equality, and others were developed to expand advantages for the already-powerful. But we made them.

And it is not yet over. In a complicated web of dreams, someone devised the relationships we use — romantic, organizational, monetary. We continue that tradition. The group and the individual are in constant conflict. Modern conditions exist because of the past, and as we dream of the future, we need to acknowledge our historical sense of equity.

*This slogan is intentionally left male-centric to highlight inequity. Nice touch, yes?

We Are All Going Out of Business

Google just announced they are giving away Nik, a set of Photoshop plug-ins that used to cost $500. In November 2015, they open-sourced the code for TensorFlow, their machine learning system [AI]. Of course, they already provide a giant list of products and services that are accessible at little or no cost.

And it’s not just Google. Tesla released all their patents in 2014. Khan Academy launched free online education in 2006. Oh, and in case you suspected this was all about giant tech companies or those crazy startup kids, six US universities, including Stanford and Duke, have teamed up to provide free learning at Coursera. Alternatively, you could go to Germany and get a full traditional bachelor’s degree, taught in English, for free.

And one more example to scare the attorneys: we can all make legal contracts at no cost, thanks to Shake.

When talking with a friend this week, I realized that four of the last six companies I’ve worked for directly have closed or been acquired and absorbed. A fifth just closed their physical office and terminated all employees in favor of on-demand contractors. It’s not safe out there.

When advising corporations, NGOs, universities, and governments, I repeat one undeniable fact: someone is actively trying to eliminate your operating model. Every sector is vulnerable to challengers, especially — and this is important — from outside the sector itself. The digital revolution did more than digitize; it neutralized the defenses that organizations had developed over hundreds of years.

I argue that the concept of “sustainability” is inherently unsustainable. Sustainability is focused on keeping an environment in a consistent zone for its inhabitants. What’s been stable in your industry during the past 20 years or 20 months? Organizations are constantly changing the rules of the game. Is it you or somebody else?

My replacement for sustainability is adaptability.* In order for an organism to survive in an environment, it needs to be compatible with that environment. When the habitat changes, one of three things happens to a resident population: relocation, genetic alteration, or extinction. Move, change, or die.

The modern organization is challenged to read changes in its environment and make changes that keep it alive and prosperous. This function is more complicated and complex than it was before because the challengers are coming from outside industries. Education is being contested by non-education companies. Transportation is being disrupted [yes, I said it] by non-transportation companies. Governments? By non-governments.

Without regard to his politics, I will always love something Donald Rumsfeld said: “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.”

The truth: what you don’t know will hurt you. The old strategy of becoming valuable or irreplaceable through expertise was extinguished when Google [and the rest of the web] replaced the experts. Humans have dumped as much information as possible into the system, and we can all access it. The education system is churning out smarter 22-year-olds every year. TED talks provide short updates on the advances of any industry. The value of what you know is diminished. Advantages now come from your ability to identify changes in the environment, combine ideas, and modify your behavior. In short, your adaptability.

We are all going out of business. Every day, the environment changes a little bit. New regulatory requirements, new business models, new ways of looking at the world. A friend in the insurance business told me that his company sees medical doctors retiring early because they can’t keep pace with the changes in the field. When they left medical school, they didn’t expect the world they inhabit now. Nobody really knows what the future brings, so we have to pay close attention to every little shift.

“We are still the masters of our fate. Rational thinking, even assisted by any conceivable electronic computors, cannot predict the future. All it can do is to map out the probability space as it appears at the present and which will be different tomorrow when one of the infinity of possible states will have materialized. Technological and social inventions are broadening this probability space all the time; it is now incomparably larger than it was before the industrial revolution — for good or for evil.

The future cannot be predicted, but futures can be invented. It was man’s ability to invent which has made human society what it is. The mental processes of inventions are still mysterious. They are rational but not logical, that is to say, not deductive.” — Dennis Gabor, Inventing the Future, 1963

*[I realize that the scientific definition of adaptation is not a point-to-point match to business, but we’re using it in context of human activity. The correct terms are learning and acclimatization. I can change the rules.]

Design Thinking Is Not the Answer

Twenty-five years after design thinking was applied to commercial purposes by IDEO’s founder David Kelley, the concept seems to have landed in the mainstream business community. [If you have been encouraged to “think outside the box,” you’ve been exposed to one element.] Of course, when a concept moves from the fringe to the center, problems appear.

“Thinking like a designer can transform the way organizations develop products, services, processes, and strategy. This approach, which IDEO calls design thinking, brings together what is desirable from a human point of view with what is technologically feasible and economically viable. It also allows people who aren’t trained as designers to use creative tools to address a vast range of challenges.” [source: IDEO]


By Christopher W. (originally posted to Flickr as The Apple Mouse) [CC BY-SA 2.0], via Wikimedia Commons

Design thinking has produced items like Apple’s first mouse and experiences like Wells Fargo’s ATM interface. Putting the end user at the forefront of the process, prototyping, and testing have changed the way some organizations operate. As the challenges of the future become ever more complex, more corporations, universities, NGOs, and governments are looking toward design thinking to provide an alternative to “business as usual.” There are some flaws in that line of thinking.


One of the concerns about design thinking is that the methodology is becoming an ideology. Early advocate Bruce Nussbaum called it quits in 2011 because corporations were corrupting the spirit of the concept. “Companies absorbed the process of Design Thinking all too well, turning it into a linear, gated, by-the-book methodology that delivered, at best, incremental change and innovation. Call it N+1 innovation. CEOs in particular, took to the process side of Design Thinking, implementing it like Six Sigma and other efficiency-based processes.” [Fast Company, 2011]

Effectively, design thinking was interpreted as a panacea for solving complex problems. Example: Forbes contributor Lawton Ursey wrote, “Design thinking can and does work for all types of organizations, big and small.” The article was titled, Why Design Thinking Should Be At The Core Of Your Business Strategy Development. [Forbes, 2014]

Back to Nussbaum: “From the beginning, the process of Design Thinking was a scaffolding for the real deliverable: creativity. But in order to appeal to the business culture of process, it was denuded of the mess, the conflict, failure, emotions, and looping circularity that is part and parcel of the creative process.” A culture that wants clear ROI, predictable input, and output guarantees just doesn’t absorb design thinking.

This is echoed by design strategist Sean Baker, who believes a key aspect of design strategy is being aware of the discipline’s potential pitfalls. “One of the biggest worries I have with the spread of design thinking is the idea of a packaged set of tools that can just be copied and pasted into any situation,” he says. [Metropolis Magazine, 2015]

To be clear, design thinking itself is not the problem. Neither is it the answer. Tim Malbon, co-founder of product innovation studio Made by Many, wrote, “… innovation demands a full-stack approach that doesn’t privilege design over other disciplines…” [The Problem with Design Thinking]

When faced with deteriorating results, shifting market expectations, and gaps in institutional knowledge, organizations need to change. It’s called adaptability. Marked by awareness, diversity, and self-determination, an adaptable organization can meet these challenges. Recognizing shifts in the environment, aka problem identification, is step one. When diversity in thinking isn’t built into the organization, groups go scrambling for answers like design thinking. But it won’t plug the hole in the boat.

The opposite of “business as usual” is not design thinking. Design thinking doesn’t work effectively as a one-time exercise with your leadership team, as a way to develop a sure-shot breakthrough product, or in a siloed department of “creative types.” It’s a way of operating, much different from what many organizations use to keep the machine churning. Additionally, design thinking is a way, not “the way.” What about systems thinking, the public health research process, or lean startup methodology? Or an amalgam of different methodologies that acknowledges the complexities of the marketplace, shifting customer expectations, and uncomfortable realities? Thinking outside the box might mean that design thinking won’t work for you.

Is the US done building cities?

My city, Omaha, was founded in 1854. As I sat downtown, I started thinking historically. What makes people establish new towns? Why do people agree to start a new place? And when does that urge change?

In 1854, the US was in a boom cycle, fueled by steel, railroads, guns, and land to grab. Manifest Destiny was the driving force that brought people streaming from the East Coast… that and immigration pressure. Thousands of cities and towns in the Midwest can trace their start to that era.

In other parts of the world – think China – new cities are being built to deal with the rural-to-urban migration. The US is dealing with migration, too, but it seems like cities adjusted post-WWII by spreading to suburbs… geographic extensions of the core. Now, US cities are scrambling to revitalize or stabilize those cores after wealth relocated to the fringe where land was accessible and “build to suit.” Redevelopment is the movement of the day.

Are we done establishing new cities? A 2011 article reported that just nine cities formed in the US during the prior two years… to resist annexation or to contract for services like garbage collection. While not a full investigation of urban establishment, that statistic seems to indicate that US expansion is essentially complete.

After an expansion, expect a contraction. That might explain why so many rural areas are skeletons of a former century. The US expanded, and now it’s contracting. Is that just the natural expression of a nation? Is that something we should fight?

In a 2015 NPR interview, architect Renzo Piano discussed his efforts to bolster suburbs so they are as vibrant and robust as the city core. “And it’s crucial, Piano says, that Italians not build any more peripheries, because stretching services and public transportation further outward is unsustainable. He says peripheries must be developed not by expansion, but by implosion; by transforming what’s already available.”

Patterns of expansion and contraction are evident to anyone who studies history. Cycles of development and growth are followed by periods of stagnation and reformation. Will this city boom, followed by a suburb boom, be replicated soon? If not in the US, in China and India? Where else are cities being formed? Or redeveloped? Worse, where are communities crumbling without the leadership to bring them back to life?

Opportunities will be claimed by the people who are able to read changes in the environment, then adapt to the new circumstances. It’s not all about the boom, but also about who can shake off the dust after a bust. Are you flexible enough to do both?