Knowledge science = data science + learning science

If a computer proves a theorem and no human understands it, is it math?

We are experiencing a new era of data abundance, arising from machines transmitting human data and, increasingly, producing data of their own. How do we understand and make use of it all?

Data transfers among humans and machines: it is communicated by one end and learned by the other.[1] We tend to call such data knowledge when it is learned and understood by a human.

We all have countless experiences with human-to-human knowledge (conversation, lecture, writing, dance), but it is changing in both scale and process. Online courses can reach hundreds of thousands of people, and they are often taught by professors who have invented some of what is taught. But is this the best way to learn? Are the right things being taught?

Machine-to-machine knowledge is something we increasingly need to understand. Venkatesh Rao calls computing the Mother of All Disruptions and anticipates machines increasingly being treated as economic agents. How does such knowledge evolve outside of human intervention, and what scope of tasks will we afford machines in applying this knowledge?

Your FitBit sits on your wrist silently recording data, and at the end of the day it conveys an insight about your walking habits: “I don’t walk as much as I thought.” This is a machine-to-human example. Data visualization is one of many ways that machines will be able to communicate (how about music?).
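As a concrete illustration, here is a minimal Python sketch of that machine-to-human step. It assumes nothing about Fitbit’s actual software; the step_insight function and the believed_daily_steps parameter are hypothetical, just one way a device might phrase recorded data as a plain-language insight.

```python
# Toy sketch (not Fitbit's real logic): turn raw machine-recorded data
# into a short, human-readable insight.

def step_insight(hourly_steps, believed_daily_steps=10_000):
    """Compare recorded steps against what the wearer believes they walk."""
    total = sum(hourly_steps)
    if total < believed_daily_steps:
        gap = believed_daily_steps - total
        return f"You took {total} steps today, about {gap} fewer than you assumed."
    return f"You took {total} steps today, meeting your assumed {believed_daily_steps}."

# Example: a mostly sedentary day as recorded by the device, hour by hour.
print(step_insight([0] * 8 + [400, 900, 650, 300, 700, 250, 500, 800] + [0] * 8))
```

The point is not the arithmetic but the translation: the machine chooses a framing, a comparison to the wearer’s own belief, that a human can actually learn from.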

Codifying human learning processes will be a major task in adapting to data abundance. Given new knowledge generated by a computer (say, a novel mathematical theorem), how can the computer use a model of human learning to best communicate that result? Can the computer teach what it creates, or will there be knowledge that we abandon forever to the machine-to-machine realm?

Finally, human-to-machine knowledge has quickly evolved from punch cards to high-level programming languages to Siri. But what will the divide in digital literacies, such as programming ability, mean for economic inequality, or for humanity in general, when machines are given increasing power?

[1] Data may sometimes be merely experienced by the receiving end, but I’m not qualifying that as data transfer. My working definition of learning is a change in long-term memory. (The definition works for computers too: something is stored.)