Fortune Brainstorm Tech

It’s no secret that data and analytics are transforming just about every industry, so I wasn’t surprised to see a number of sessions at Fortune Brainstorm Tech focus on the topic. But I found the discussion of new uses for agricultural and genomic data quite interesting, as well as a talk on “controlling AI” that really came down to data too.

Genomic Information at Ancestry and Color

Othman Laraki (Color) and Margo Georgiadis (Ancestry) at Fortune Brainstorm Tech 2019

Ancestry CEO Margo Georgiadis and Color co-founder and CEO Othman Laraki discussed how genomic data could impact the health care market.

Georgiadis noted that Ancestry, which currently has information on 100 million family histories and the largest repository of consumer DNA, has been around for 30 years and has focused on consumer interactions. But she also talked about partnering with other companies to get to better health outcomes through genomics.

She reminded the audience that “Your genes are not your destiny,” saying it was only one signal, and that it was important to look at family history as well.

Laraki, whose firm focuses on precision medicine, discussed using genomic information to “build a health care infrastructure that can see further down the road.” In the future, we “won’t think of it as genomics, we will think of it as health care.” He noted the huge disconnect between what we are spending on health care and the value we’re getting. This is the “biggest human and entrepreneurial opportunity of our generation,” he said, noting that the health system is just beginning to use genomics in primary care.

He said there were both consumer applications and population-level health care implications, and he discussed the company’s relationship with MIT’s Broad Institute.

Still, Georgiadis said privacy was at the root of the company’s relationship with its customers, and that individuals use and control their own data. She said the company never gives information to law enforcement unless it is compelled to do so, and that last year that happened only 10 times. The requests were all related to credit card fraud, not genetic information.

She said the collective insights that can be gleaned across those records were important. “Our customer is never the product,” she said. “That alignment is deeply important.”

Georgiadis said companies that collect genomic information must be clear about what they stand for and must make sure customers understand how the organizations will use and share the data. She said that Ancestry, 23andMe, and Helix had established a set of genetic privacy standards and were encouraging other players to sign on. This includes using population-level data for medical and health research.

Every technology creates a new set of issues, Georgiadis said. “As leaders, we need to take responsibility for thinking and anticipating those issues and setting high standards for the way in which we do business.”

Agricultural Data

In another session, Land O’Lakes CEO Beth Ford and Gro Intelligence founder and CEO Sara Menker discussed how data is changing agriculture and the businesses around it.

Beth Ford (Land O’Lakes) at Fortune Brainstorm Tech 2019

Ford talked about Land O’Lakes’ research into predictive models that capture farmer data about what is planted in various soil types and what practices are used, to help farmers know what changes they can make within the growing season. She said the firm’s Truterra Insights Engine contains a trillion data points. The goal is to increase resilience while also improving productivity.

Land O’Lakes is a cooperative owned by farmers, Ford noted, and therefore is focused on helping improve farm productivity as well as sustainability. She said the goal was to improve the incentive structure for farmers, noting that 96 percent of farms are still family-owned. She discussed the “shared destiny” we all have, adding that without such technology, food security will be at risk.

She said an individual farmer’s data is siloed, but it is combined with predictive models that also draw on data collected from satellites and drones. “We will capture their data,” Ford said, “but they own it.”

Predictive models and making changes “in-season” have never been more important than they are this year, Ford said, noting the dramatic weather-related issues farmers are facing. She said the average farmer lost money last year, and that low commodity prices have been a problem for many farmers for years.

Sara Menker (Gro Intelligence) at Fortune Brainstorm Tech 2019

Gro Intelligence is building predictive models to forecast supply, demand, and price for any agricultural product anywhere in the world, Menker said. She said that food and beverage companies, banks, and commodity traders need this information, especially because of the changes coming from extreme weather events. She noted that 10 million acres of farmland have been abandoned due to floods this year, representing $6.5 billion in lost revenue.

Menker talked about how the system is designed to ingest data sets and react to market events, and how this will allow firms to structure financial instruments to better manage risks. This, she said, will eventually lower the cost of capital for farmers. She said she used to trade oil and gas, and that it has been easier to get capital to develop energy than to farm.

IBM and Salesforce on Data, Fairness, and AI Ethics

Richard Socher (Salesforce) and Dario Gil (IBM Research) at Fortune Brainstorm Tech 2019

IBM Research Chief Operating Officer Dario Gil and Salesforce Chief Scientist Richard Socher talked about AI and the importance of using it in ways that are ethical and fair.

“Every single industry will be impacted by AI,” Socher said, but in the end, AI can only be as good as the data we use to train it. As a result, he said, the field needs to focus more on ethics. He noted that like any tool—computers, the internet, or even a hammer—AI can be used for good or bad.

Gil called AI “an unfortunate term,” because people hear it and think it’s acting on its own. He said we should just substitute the word “software” for “AI.” That makes it clearer where the responsibility lies. “Accountability needs to rest with the people and the institutions that are creating the software,” he said.

Asked about “deepfakes,” Socher said that people have faked photographs for a long time, and at the same time, people have gotten better at identifying fake photos. He said we will have to come to the same understanding with video, but that it is currently very hard to create really convincing fake videos. For now, Socher said, he was much more worried about people creating fake news, sharing it on social media, and AI systems recommending it.

Gil talked about the question of bias, pointing to multiple layers of the problem. The first layer is the core AI algorithm. Beyond that, there is the issue of data. For instance, he noted that there are regulations and an element of accountability in assessing credit in banking. But if you simply train on approvals from the last 20 or 30 years, the model will give more credit to men than to women. The neural net isn’t biased, he said, but the data set is. At another level, he talked about a higher-level bias, in that most of the people working in AI are white men, a situation he said the industry is “trying to improve.”
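Gil’s credit example is easy to reproduce in miniature. The sketch below is purely illustrative: the data is synthetic, and the feature names and thresholds are invented, so it reflects the general idea rather than any actual bank or IBM system. It trains an ordinary logistic regression on “historical” approvals in which equally qualified applicants in one group were denied more often; the learner has no bias of its own, yet its scores mirror the skew in the labels.

```python
# Minimal sketch of Gil's point: a neutral learner trained on skewed
# historical decisions reproduces the skew. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

income = rng.normal(50, 15, n)   # hypothetical creditworthiness feature
group = rng.integers(0, 2, n)    # 0 / 1, e.g. men / women (illustrative only)

# Historical approvals: the same income cutoff for everyone, but qualified
# applicants in group 1 were denied 30% of the time anyway.
# The bias lives in the labels, not in the learning algorithm.
approved = (income > 45).astype(int)
approved[(group == 1) & (rng.random(n) < 0.3)] = 0

X = np.column_stack([income, group])
model = LogisticRegression(max_iter=1000).fit(X, approved)
scores = model.predict_proba(X)[:, 1]

for g in (0, 1):
    print(f"group {g}: historical approval rate {approved[group == g].mean():.2f}, "
          f"mean model score {scores[group == g].mean():.2f}")
```

The model’s average score for each group tracks the historical approval rates almost exactly: the disparity is learned from the labels, not invented by the algorithm.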

One silver lining, Gil said, is that if someone is denied credit and a person makes the decision, it’s easy for one person to give an excuse. But if you look at decisions from an algorithm over a period of time, it’s much easier to see what is really happening. “AI puts a mirror in front of our faces,” he said, noting that it’s easier to change an algorithm than to change 1,000 people.

As part of this, he described work IBM is doing to look for bias in the data and to make decisions more fair. He noted that fairness involves many different metrics, and that variables are correlated with each other in hidden ways, which makes the problem hard.
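To make “many different metrics” concrete, the sketch below computes two common group-fairness measures on a toy set of predictions: a demographic-parity gap (the difference in approval rates between groups) and an equal-opportunity gap (the difference in true-positive rates among applicants who actually qualify). This is a generic illustration rather than IBM’s own tooling; IBM’s open-source AI Fairness 360 toolkit packages many such metrics. Notice that the two measures can disagree on the same predictions, which is part of why “fair” is hard to pin down.

```python
import numpy as np

def demographic_parity_gap(pred, group):
    """Difference in positive-prediction rates between group 0 and group 1."""
    return pred[group == 0].mean() - pred[group == 1].mean()

def equal_opportunity_gap(pred, label, group):
    """Difference in true-positive rates, i.e. approval rates among the qualified."""
    tpr0 = pred[(group == 0) & (label == 1)].mean()
    tpr1 = pred[(group == 1) & (label == 1)].mean()
    return tpr0 - tpr1

# Toy example: 8 applicants, 4 per group.
pred  = np.array([1, 1, 1, 0, 1, 1, 0, 0])   # model decisions
label = np.array([1, 1, 0, 0, 1, 0, 1, 0])   # who actually qualified
group = np.array([0, 0, 0, 0, 1, 1, 1, 1])

print(demographic_parity_gap(pred, group))        # 0.75 - 0.50 = 0.25
print(equal_opportunity_gap(pred, label, group))  # 1.00 - 0.50 = 0.50
```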

Socher noted that bias was “not as easy to remove as it seems.” He noted that you could remove race or gender from an algorithm but get much of the same result by considering zip code and income. He noted it was hard because Salesforce doesn’t build one application—instead it creates smaller applications for 150,000 orgs, each using its own data. He noted that some form of bias may be acceptable, such as not marketing breast pumps to men. But in other cases, it might be illegal or wrong. There is “no silver bullet,” Socher said, “It has to be a mindset.”