What does it mean to be human?

Guest Blog from: John Benedict

“… I’d like to share a revelation that I’ve had during my time here. It came to me when I tried to classify your species, and I realized that you’re not actually mammals. Every mammal on this planet instinctively develops a natural equilibrium with the surrounding environment; but you humans do not. You move to an area and you multiply, and multiply until every natural resource is consumed and the only way you can survive is to spread to another area. There is another organism on this planet that follows the same pattern. Do you know what it is? A virus. Human beings are a disease, cancer on this planet, you are a plague, and we…are the cure…”

Let’s hope it doesn’t come to that.

Eighteen years separate the birth of a blind child from his graduation from high school. Eighteen years ago, there were no iPods, the USSR was a superpower, Japan looked to the United States for economic leadership and support, smoking was permitted on airplanes, no companies researched biotechnology, and only a handful of mobility and medical specialists taught in the nation’s public schools.

In eighteen more years, today’s blind infants will graduate from a strikingly different world. What we teach these kids today will determine how well they survive in their future. We have to make educated guesses about that future (and keep guessing) to prepare them for success.

When a much earlier world changed from a hunting-and-gathering culture to an agricultural age, human relationships were redefined and concepts about space and time changed. The speed of life accelerated. Leadership shifted; old power structures were replaced by the newly empowered. Old definitions and institutions collapsed and new ones took their place.

The hunting-to-survive stage lasted a million years; the agricultural age, another six thousand; the industrial age, three hundred. Some futurists defined an information age and then declared it dead after forty years. The concept of a “job” was itself an invention of the industrial age. It pulled children off the farms and into the cities, where they had to adjust to new spatial and temporal rules. A job required an employee to be at a certain place for a set amount of time, doing repetitive tasks, “working” at producing things that were not immediately relevant to the individual’s life. In exchange for the loss of the agricultural lifestyle, employers offered steady wages, unaffected by the weather or natural rhythms.

The industrial age also saw the creation of vacations, health insurance, and sick days, all consequences of the invention of the job as a new way to work. The change was traumatic for a farm-based agricultural culture, and many resisted. Human beings were no longer “ruled” by their natural rhythms or by the seasons. Respect for the wisdom of the society’s elders declined as their power was bypassed; they no longer controlled the source of wealth, and their knowledge was irrelevant to the new age.

The rules are ever changing in this age of communication. The life cycle of a business is only seven years now. The cycle in technology is down to six months, and in the software business, if a company is to survive, it must bring new products to market within two or three months. There is hardly time to plan; certainly the present is of little help.

The amount of information in the world is doubling every eight years. One-half of everything college students learn in their freshman year is obsolete by the time they graduate. We are asking a typical high school senior to absorb more information than their grandparents absorbed in a lifetime. Our decision load is growing. We are running too fast, making too many decisions too quickly about things we know too little about. How can all these grand ideas about individual web pages, global consciousness, and the coming of massively capable workstations ever be implemented when we hardly have time to eat? This is the major social question facing the beneficiaries of the communications age.

The question remains: with advancements in technology, do we have too little time for what is important and too much for what might not be? Are we missing out on morals and courtesies and relying too much on an online presence? We may be called social beings, but are we stepping away from human interaction? The answers to all of these are terrifying even to think about! It is time we reclaim what we have lost.

I finish this essay as I started it, with words from Agent Smith, this time from The Matrix Revolutions.

“…Illusions, Mr. Anderson. Vagaries of perception. Temporary constructs of a feeble human intellect trying desperately to justify an existence that is without meaning or purpose. And all of them as artificial as the Matrix itself, although… only a human mind could invent something as insipid as love…”

The machines may be right, but our entire purpose is built on something as insipid as love.

John Benedict is from Hyderabad, India and works with Amazon, India.

Technology in the Classroom?

The Wall Street Journal has a pros-and-cons article on this question, which is at the core of the social impact of technology in education.

My son-in-law teaches a university class where students get the “lecture” portion online and come into class to work on projects and homework. My granddaughter has online assignments regularly, many key tests are done online, and they don’t get “snow days”; in case of inclement weather they stay home and log in. Programs like Khan Academy, and a number of universities, offer courses free to “audit.”

At the same time, kids need real-world collaboration and social experience, ideally without bullying, and ideally with sufficiently strong, positive peer groups that help them develop skills grounded in the real world.

What are the key references you find informative on the question of how we educate the next generation?

Technologists who Give a Damn?

I’ve been using Zen and the Art of Motorcycle Maintenance in classes for a while now. One key message of the book is that professionals (well, everybody) need to care about their work, or, in more Zen terms, to be mindful while they work. The author asserts that one reason technology is so alienating nowadays is that the lack of care is evident in the workmanship, the robustness, and so on.

I’ve also been working on an update of the SSIT Strategic Plan, and one element of that discussion has been what catchphrase we should use, say, on business cards. IEEE’s is “Advancing technology for humanity,” which is a good one. Currently we use “Where Technology and Society Talk,” but it is tempting to use “Technologists who Give a Damn.” It is a bit demeaning to imply that some (many?) don’t, but unfortunately this is at least occasionally true.

There are at least two levels of caring. The obvious one for SSIT is paying attention to the social impact of inventions and products (the “should we make it” as opposed to the “how we make it”). There is a lower level that is also critical; in software we might ask, “Is this code elegant?” Oddly, there seems to be a relationship between underlying elegance and quality. A clean, simple design often works better than a “hack,” and it takes both a level of mastery and a level of mindfulness to accomplish. Some number of cybersecurity holes are the result of code where folks didn’t care enough to do it right. No doubt many “blue screen of death” displays and other failures and frustrations emerge from the same source. Often management is under pressure, or lacks awareness, and is satisfied with shipping the product rather than making sure it is done well. I’m not aware of any equivalent in most development facilities of the Japanese “line stop buttons” that make quality a ubiquitous responsibility.
The reality is that we need technologists who invent and produce products that are right socially and done right technically: technologists who embrace “care” at all levels. A retired career counselor from the engineering school at one of our Ivy League universities, who attends my Zen class, observed that our education is more focused on “career skills” than on “quality,” and may be suppressing students’ sense of care. We then observed that this apparent lack of care, evidenced in so many consumer products, might be a factor in why girls choose not to enter STEM education and careers. I suppose the question that remains is: do we care?

Ethics and Responsibility in Technology-for-Good: A Human-Centered Approach

A guest blog entry, author: Jim Fruchterman, CEO, Benetech
[Note: we welcome guest blog entries; send in your proposals via our input form. I knew of Jim’s socially oriented entrepreneurial work from his presentation at the 2011 Sections Congress.]

Our networked world has advanced to a point where information technology touches all aspects of society. The cost of prototyping and deploying new technology tools is now extremely low, and data has the potential to accelerate social progress in areas ranging from poverty to human rights, education, health, and the environment. However, we have yet to come to grips with what is ethical, and what the laws should be, in relation to rapidly changing technologies.
At Benetech, we regularly grapple with questions related to this issue. For instance, we ask: How can we harness the power of technology for positive social impact? And how can we mitigate the risks to privacy and civil rights posed by the age of big data?
As engineers who want to do the right thing, we follow four general guidelines: first, when it comes to data and technology in the social sector, apply a human-centered approach; second, treat the people you want to help with respect; third, when working to protect vulnerable communities, follow the “do no harm” maxim; and finally, bridge communities and establish partnerships-for-good.
Let me explain further.

1. Context matters

Building technology solutions for the social sector isn’t purely an armchair exercise based on the thrill of empowering people in principle. We must first understand those we aspire to help and the real-world conditions in which they live and operate. We must also put our technology innovations into users’ hands, see what actually works, and adapt as necessary. This iterative method helps us focus on building products that are responsive to real needs.

2. Treat users as customers, not recipients of charity

People in challenging situations must invest their time and limited resources to improve their lives. Our role as technologists is to provide the tools that empower them to do so. Treating them as customers, rather than objects of charity, promotes their sense of ownership and self-agency as they use the tools that we develop to achieve their own goals.

3. When it comes to data, rights, and privacy, first do no harm

Vulnerable groups served by social justice organizations, such as victims of human rights abuse, refugees, LGBT individuals, or survivors of gender-based violence, deserve the same respect for their sensitive information that citizens of wealthy countries expect for theirs. Having long supported human rights activists, we know the importance of confidentiality when working with victims and witnesses. For instance, Benetech’s Human Rights Program is focused on helping human rights practitioners, activists, and journalists uphold their commitments to protect, and do no harm to, the communities they serve. Our strong cryptography technology, Martus (a free, open-source, secure information management tool), makes it easier for groups working with vulnerable populations to keep the sensitive information they collect confidential.

4. Community and partnership are paramount

Technology only goes so far in creating social progress, but a galvanized community of partners and supporters who work together toward a greater good can generate lasting impact. Case in point: our accessible online library, Bookshare. Bookshare is the result of joint efforts of our partners in the education, technology, publishing, student, parent, and volunteer communities. Our technical tools by themselves don’t make change: it is these communities using our tools that create social good. As toolmakers, our ultimate impact is measured by what other people build with our tools.

In a world where the benefits of technology still often reach only the richest 1% or 5% of society, we are thrilled to see a growing movement of engineers motivated to help humanity. As technologists focused on creating social good, we need to keep principles of safety and ethics in mind. While the context and the users may vary in each case, the principles of human-centered design and of treating others as we would like to be treated remain the same. If we can keep these principles in mind, we can turn good ideas into proven solutions with lasting impact.

Jim Fruchterman is the Founder and CEO of Benetech, a Silicon Valley nonprofit technology company that develops software applications for users in the social sector. He is the recipient of numerous awards recognizing his work as a pioneering social entrepreneur, including the MacArthur Fellowship, Caltech’s Distinguished Alumni Award, the Skoll Award for Social Entrepreneurship, and the Migel Medal from the American Foundation for the Blind.