All About Grants Podcast

Transcript

 

David Kosub: Hello, and welcome to another edition of NIH's All About Grants podcast. I'm your host, David Kosub, with the NIH Office of Extramural Research. Today we're going to have a discussion about a topic that is probably on the minds of many of our listeners, and that's the protection of your personally identifiable information. We have with us today Dr. Richard Ikeda, who directs OER's Office of Research Information Systems and is also our privacy coordinator, along with Katrina Pearson, who directs OER's Division of Statistical Analysis and Reporting. You may be familiar with some of their work if you've ever used any of our handy web tools to assess the publicly available data on NIH-funded grants, or to apply for funding, or even to report on your research progress. It's important to note that today's discussion will focus on the data that NIH collects on its grantees and applicants. We're not going to focus on the data collected on persons who participate in clinical research. That's a conversation for another time. So, with that in mind, before we get too far in, let's have a better understanding of what we mean by personally identifiable information. Rick, can you please explain?

 

Dr. Richard Ikeda: It is important to define what personally identifiable information, or PII, is. If we look at the National Institute of Standards and Technology's definition, it's any information about an individual that can be used to trace or distinguish an individual's identity. Now, this means that we actually have public PII, like your name, which is out there on your Facebook account, or your LinkedIn account, or your Google account. We also have sensitive PII, like your race, your gender, your ethnicity, your Social Security number, or your birthdate. And that's what we really strive to protect.

 

David Kosub: Okay, well, with that, can you tell us a bit more about how NIH views these data, and perhaps specifically, what information NIH may be collecting on its grantees, applicants, and trainees, Katrina?

 

Katrina Pearson: NIH views sensitive information and personally identifiable information, or PII, as records of information that could be combined with anything that might be publicly available. For example, sensitive information may include pre-decisional information on an unfunded applicant or grantee organization, whereas PII would be a disclosure of information such as someone's gender or race.

 

David Kosub: Okay, great, thank you for further explaining to us what we mean by personally identifiable information. You know, where exactly is this information stored, and who can have access to it? I mean, I assume no one can just access it on a whim.

 

Katrina Pearson: Sure, David, I can take that question. The information is stored in the eRA Commons database, and registered users can maintain their information through a profile that is created for them. Mainly, investigators or trainees register through the Commons for an account. They're able to enter information such as their employment, expertise, gender, citizenship status, race, and ethnicity. All that information is collected and maintained within the system, and the individual is the one who maintains it. Furthermore, various staff here at NIH have access to personally identifiable information on grantees and applicants in eRA. These are staff who all have the appropriate permissions to review such data and who undergo regular training to ensure they are up to speed as much as possible on privacy concerns. We also ensure that eRA Commons and other relevant systems contain the highest levels of IT security controls.

 

David Kosub: Great, great. Sounds like we have a lot of safeguards in place for this information that's being stored here at NIH. But we often hear about information, our personally identifiable information, being released out there to the world. What does NIH do in the case of a breach of our personal data?

 

Dr. Richard Ikeda: I'll take that question, David. NIH is very serious about protecting PII data. If we deem that there's a breach, or we detect that there's a breach, we act immediately to close that hole or that access.

 

David Kosub: So what about for those whose data is released during a breach? Are they notified?

 

Dr. Richard Ikeda: Absolutely, David. We determine whose data has been released during a breach situation, and then take the proper steps to notify them.

 

David Kosub: Okay, great. I'm glad to hear that. I've also heard a lot about the Freedom of Information Act, or FOIA. This is a topic that we've had a podcast on before, for those who may be interested. And I'd like to talk here about the relationship of FOIA with the Privacy Act. What exactly may be released from my personal information if a FOIA request comes in?

 

Dr. Richard Ikeda: Well, David, there are two laws that govern our use of personally identifiable data: the Freedom of Information Act and the Privacy Act. In cases where we receive a FOIA request for information that's protected by the Privacy Act, we must review the request and determine whether any information may be withheld under one of the exemptions provided in FOIA. For example, Exemption 6 protects individual privacy, but it also requires the privacy interest to be weighed against the public interest in that information. For instance, eRA Commons will record who the principal investigator is on a funded award. We actually make that available to the public through NIH RePORT, because it's important to know who the lead is on a scientific project.

 

David Kosub: But what about for data that is aggregated and de-identified?

 

Dr. Richard Ikeda: So, de-identified aggregate data can be released as a report if it's a pre-existing report or analysis that's already been done, such as the Ginther paper, which looked at the outcomes of applicants in review by race, ethnicity, and gender. That's an aggregate report that we drew conclusions from and acted on to help NIH address problems it might be having with different issues or processes within the agency. It doesn't, however, reveal individual identities.

 

David Kosub: Okay, great. So, before we go, is there anything that either of you would like to leave with our listeners today about privacy?

 

Dr. Richard Ikeda: I'd just like to emphasize that NIH takes very seriously the responsibility to protect all the data, to keep it safe, and to use it responsibly. Katrina, do you have any last words?

 

Katrina Pearson: Sure. I would just echo Rick and say that NIH does take privacy seriously. But also, the public trusts us to maintain their information in a safe manner. So, we do feel obligated to ensure we're following the Privacy Act and data security policies, making sure that certain protections are in place.

 

David Kosub: Fantastic. Thank you very much for this great opportunity to speak with you both, Rick and Katrina, on this important topic of the privacy of our personally identifiable information. My name is David Kosub with the NIH Office of Extramural Research, and this has been All About Grants. Thank you.


Outro: For your reference, the Ginther et al. paper that Dr. Richard Ikeda mentioned can be found in the August 19, 2011 issue of Science magazine. It is entitled "Race, Ethnicity, and NIH Research Awards."