As cultures evolve they not only change things, but change the way they change things – Sander van der Leeuw
In May 2014, a Hong Kong venture capital firm announced that it was putting an ‘AI’ (Artificial Intelligence) on its Board of Directors[i]. As one report described it:
Hong Kong-based venture capital firm Deep Knowledge Ventures (DKV) has appointed a machine learning program to its board. Called VITAL, it’s an “equal member” that will uncover trends “not immediately obvious to humans” in order to make investment recommendations.
Though an obvious publicity stunt, the move points to a trend that cannot be ignored. Machine intelligence is reaching the point where it can accomplish knowledge tasks that people cannot. The implications of this ‘tipping point’ will touch virtually every aspect of our lives, not least the way we innovate – with our AI partners and for our AI partners.
Innovation has been, and will continue to be, about creating artifacts that people want and adopt. Doing innovation well has necessitated the development of increasingly sophisticated methodologies, processes and tools. With the rise of intelligent artifacts, innovation will inevitably need to address the creation of artifacts that other (intelligent) artifacts want[ii]. This will require both a qualitative and quantitative change in the methodologies, processes and tools employed. The tipping point we are approaching will result in:
- The need to develop and use intelligent tools to assist (and take over) innovation activities
- The need to innovate to meet the needs of intelligent artifacts as well as people
Take, for example, the concept of an intelligent Virtual Personal Assistant (VPA)[iii],[iv]. The signs that this is inevitable are all around us, from genuinely useful voice recognition[v] to Google Now[vi] to Amazon’s concept of ‘zero-click’ shopping[vii]. For the innovator, the VPA is part of the human system whose unmet needs and desires must be understood. The VPA is constantly connected, constantly sensing, constantly communicating, and processing at rates no individual can match, yet it influences the individual at every decision point. In this world, we will need the means to understand not just the human but also their VPA and all the other machine intelligences that surround us. The tools we use to do this will themselves need to be intelligent, and they will inevitably take over many of the functions we perform today.
Machines Become Innovators
Many things that innovators do now will become automated and augmented. A major part of every innovation effort is the exploration undertaken to learn new knowledge, gain new understanding and insight, and discover new opportunities. Often these explorations rely on serendipitous, ad-hoc means of finding the interesting and the relevant. The searches we run, the links we follow, the people we meet and the conferences we attend are today our means of exploration and discovery.
The most advanced methods of exploration impose some structure and discipline, as well as breadth and focus, on who, what and how the innovator finds and uses as new knowledge sources. Yet even these most advanced systems and frameworks require a tremendous amount of manual labor.
One trend occurring with remarkable speed is the convergence of ‘big data’, artificial intelligence and ‘instrumented’ social networks (social networks in which virtually every encounter or interaction is recorded). Together, these hold out the promise that one of the primary tasks of today’s innovator – discovering the relevant, tacit knowledge of groups of people – will soon be largely automated.
Computing technology will advance to the point where discovering new knowledge is no longer an issue. Intelligent Innovation Assistants (IIAs) will be instructed to look for content and to find, automatically, the relevant ‘snippets’ within it that you need to see. They will then synthesize these snippets into patterns and models unobservable to a mere human. These IIAs will also tap into multiple social networks, observe behaviors, calculate unmet needs and even ‘interview’ select individuals to uncover future demand.
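No such assistant exists today, but the ‘snippet’ discovery step described here maps onto classic information-retrieval techniques. As a purely illustrative sketch – the corpus, query and scoring scheme below are invented for the example, not taken from any actual IIA product – candidate snippets can be ranked against an innovator’s question using TF-IDF weighting and cosine similarity:

```python
import math
from collections import Counter

def tokenize(text):
    # crude tokenizer: split on whitespace, strip edge punctuation, lowercase
    return [w.strip(".,'\"").lower() for w in text.split()]

def tf_idf_vectors(docs):
    # weight each term by frequency in its document, discounted by
    # how many documents in the corpus contain it (inverse document frequency)
    tokenized = [tokenize(d) for d in docs]
    df = Counter(t for doc in tokenized for t in set(doc))
    n = len(docs)
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(a, b):
    # similarity of two sparse vectors, represented as term -> weight dicts
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_snippets(query, snippets):
    # vectorize snippets and query together so idf is computed over one corpus,
    # then return snippets ordered by similarity to the query, best first
    vecs = tf_idf_vectors(snippets + [query])
    qvec = vecs[-1]
    scored = sorted(zip(snippets, vecs[:-1]),
                    key=lambda pair: cosine(qvec, pair[1]),
                    reverse=True)
    return [snippet for snippet, _ in scored]
```

A real IIA would of course go far beyond bag-of-words matching – using semantic models, behavioral signals and social-network data – but the basic pipeline of vectorizing content and ranking it against an expressed need is the same shape.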
These tools will become ubiquitous, and there will be no special differentiation in anyone’s ability to discover the knowledge relevant to creating something new. IIAs will become smart enough to recognize behavioral context and to maintain selective focus over time, even as an adopter’s attention switches contexts. They will be beyond human in their ability to sense and understand both the real and the digital world. Even creativity will be partially automated[viii] as new AIs see subtle associations and suggest possibilities that no human could have thought of.
Artifacts Become Adopters
We create artifacts[ix], and our experience of the world is mediated by them. We create artifacts to satisfy wants – needs and desires of our own or of others. But what if it is the artifacts themselves that ‘want’ to be adopted, and humans, as their creators, are merely the means they use to evolve into something functional and desirable, and therefore adopted?
Just as Richard Dawkins did in his 1976 book ‘The Selfish Gene’, Kevin Kelly, in his 2010 book ‘What Technology Wants’, looks at technology through an evolutionary lens that reverses perspective. Dawkins puts genes, and Kelly technology, at the center of an evolutionary process that moves forward through variation and selection. It is we individuals who serve to propagate our genes, our technologies and our artifacts; they will persist long after any one of us.
What does the world look like if we treat our artifacts, as well as people, as adopters? This is rapidly becoming the case as our creations grow more intelligent. They advise us about all sorts of things, and we increasingly rely on them and believe what they tell us. As we create ever more automated, intelligent and complex systems, it is the artifacts themselves that ‘demand’ the new things we, the human innovators, work to create. The entities we are satisfying are increasingly the artifacts themselves; the artifacts ‘want’ things from their human creators.
We are used to considering people as the ones who adopt new things, and we have developed the tools of marketing research, voice of the customer, surveys, focus groups, ethnography and so on to get inside the mind of the customer and determine what they want. But a great deal of human activity today is spent creating things that other artifacts (e.g. hardware and software products and services) ‘want’. When an engineer designs a complex, high-pressure fuel injector for a new engine, the requirements come from the ‘needs’ of the engine and the vehicle. When a software engineer creates an application to monitor and control a building’s environment, the requirements are driven as much by the needs of the machines that regulate that environment as by the needs of the building’s human occupants.
As our devices become more autonomous and intelligent, and as our AI-enabled personal assistants better predict what we want, we will increasingly design new artifacts to ‘satisfy’ the ‘needs’ of other artifacts. It may seem odd now to talk of artifacts having motivations, but as our artifacts grow more intelligent it is only a matter of time before we start attributing motivation to the things we create. How often, even now, do we speak of a smartphone app ‘needing’ to ‘know’ our location, calendar, contacts and preferences? How comfortable do we feel with our television ‘watching’ us to determine what we are doing or whether we are paying attention? These things are already coming into being[x],[xi],[xii],[xiii], and they will affect the way we innovate in the future.
The Future Innovator
Today the innovator is challenged by the uncertainty, ambiguity and complexity of both the future design realm of what will be possible and the future demand realm of what will be wanted. In addition, the innovator must navigate internal organizational structures, processes and culture to get the organization to do the right thing. These activities require ever more advanced tools, methodologies and processes to do well.
In the future, as artifacts, organizations and people transform their activities and behaviors in response to the increasingly intelligent, connected and sensed environment being built, the job of the innovator will become one of managing the growing complexity of the creation process. It will be a task of bringing together the natural and artificial entities required to create order and simplicity out of complexity, of organizing and synthesizing the deep information and knowledge available, and, with the help of our AI partners, of designing new artifacts that cannot now be imagined.
[i] Dvorsky, G.; Venture Capital Firm Appoints Machine Intelligence As Board Member; May 14, 2014
[ii] Kelly, K.; What Technology Wants; Penguin Books; October 2010
[iii] Winarsky, N. and Mark, B.; The Future Of The Virtual Personal Assistant; TechCrunch; March 2012
[iv] Dizick, A.; Virtual Gofers Tackle Personal To-Do Lists; Wall Street Journal; June 2010
[v] Sangani, K.; Voice Recognition Comes of Age; Engineering and Technology Magazine; July 2013
[vi] Farber, D.; The Future of Search is Now; CNet; May 2013
[vii] Bort, J.; Here’s How Websites Know Who You’ll Date And What You’ll Buy Before You Do; Business Insider; December 2012
[viii] Computational creativity is becoming a discipline in its own right; see the International Conference on Computational Creativity
[ix] See previous post – Innovation or Artifact
[x] Markoff, J.; Computer Wins on ‘Jeopardy!’: Trivial, It’s Not (on IBM Watson’s Jeopardy performance); New York Times; February 2011
[xi] Intel; A Framework for the Internet of Things; 2013; 7th Edition
[xii] O’Brien, J.; How AR Will Change the Way We Search; Mashable; March 2013
[xiii] Ulanoff, L.; Google Knowledge Graph Could Change Search Forever; Mashable; February 2012