Staff Q&A with Lauren Steinfeld

Lauren Steinfeld, senior advisor for Privacy and Compliance, knows that keeping things at Penn private—and creating policies that reflect the shifting technology landscape—is challenging.

Take Facebook, for example. Steinfeld set out to update Penn’s recommendations on Facebook privacy settings only to have the social networking site update them a week later. Facebook had changed its rules. Steinfeld had also made a helpful video to walk people through Facebook privacy settings, but had to take down the video because—you guessed it—the social network had changed its rules again.

Her role looks much different than it did 11 years ago, when Steinfeld came to Penn full-time as the first privacy officer in higher education.

Today, she coordinates with departments across campus, from the Information Security Office and Provost’s Office to Human Resources, as well as with other staffers in the University’s Office of Audit, Compliance, and Privacy, including Maura Johnston, University privacy officer; Torie Jones, Penn Medicine privacy officer; and Linda Yoder, institutional compliance officer.

“Privacy needs a lot of cops on the beat. At Penn, there’s a very large committed network of people formally and informally working on privacy issues,” says Steinfeld. “It’s just an effort that relies on a lot of people caring and acting.”

Steinfeld also helps educate people about everything from cloud computing to guidelines for social media at Penn. With January being Data Privacy Month, she says the team is pushing hard to raise awareness about privacy issues. If you have questions, chances are Steinfeld and the other members of the team have already worked with departments across the University to craft a thoughtful policy and process.

The Current sat down with Steinfeld, a lawyer by trade and former associate chief counselor for privacy at the Office of Management and Budget in the Clinton Administration, to discuss cloud computing, what people think about when they contemplate online privacy, and what’s kept her here for more than a decade.

Q. You have both ‘institutional compliance’ and ‘privacy’ in your job title. Can you explain what you do?
A. They’re really two different functions. Institutional compliance is focused on providing a supportive infrastructure to a very wide range of compliance initiatives at Penn. We try to serve the broad Penn community by promoting the Principles of Responsible Conduct [the code of conduct], by running the 215-P-COMPLY helpline. … We raise awareness of the do’s and don’ts and where to go to ask questions. We also support functional compliance leaders in charge of areas where specific requirements exist.
Now privacy is a little different, because there we’re more like the functional lead on a specific issue. We look at risk assessments—where are there risks to people’s personal information, and what are some of the things we can do about that risk? We also work with partners—privacy liaisons—in the schools and centers.

Q. What are some of the challenges unique to a place like this, as opposed to a corporation or a smaller non-research college?
A. I’m glad you asked that. It’s an oversimplification, but corporations have customers and employees. We have faculty, staff, students, alumni, parents, donors, patients, research subjects, visitors, and that’s not the end. We have so many different constituencies. We also offer so many types of services. We teach, provide care, conduct research, provide dining, provide parking, provide recreational programs. People have said Penn is like a city. When you put it all together, we pretty much hit almost every type of regulation. We have health care, financial, student record, and credit card data. We have many different types of relationships and services and so it makes our privacy and compliance landscape more complicated.
We also have a lot of independent thinkers, which is one of the greatest things about working in higher education. But because of their independence, they may do some innovative things with information. That also makes this a richer place to look at privacy issues.
We’re also decentralized and distributed, even in our computing model. So, oftentimes, developing the policy and the guidance is the easy part, whereas having it read, understood, and implemented throughout an organization like this is the most challenging part.

Q. Is your priority safeguarding the privacy of Penn or the individuals at the University?
A. Let me put it differently. The way we focus it is on protecting Penn data, which includes information about students, faculty, staff, alumni, and others. This is important to the institution and also important to the people whose data we’re trying to protect.

Q. You say that a big part of your job is dealing with cloud computing issues. How so?
A. Cloud computing was not a term that I knew, and I’m sure it had not been coined at the time I started in this position. It’s prevalent now, and the reason is that service providers offer a lot of advantages to institutions like Penn, and to any individual user. People have started to use them as a lower-cost method of storing information, collaborating, and accessing information from anywhere.
The problem for us, from the privacy perspective, is that in many cases, we lose control of the data. That is usually not a problem for non-sensitive data, but it is a problem for sensitive data. If we don’t have an institutional agreement with a cloud provider with the right assurances, highly sensitive and regulated data should not be housed with that provider.

Q. What kind of data can be held in the cloud and what can’t be?
A. We have an obligation under certain laws to protect, for example, HIPAA [Health Insurance Portability and Accountability Act] data, or FERPA [Family Educational Rights and Privacy Act] data. With HIPAA data, you usually can’t share it with a third party unless they have given us certain specific assurances of how they will use and protect it. Most cloud providers don’t offer that kind of HIPAA promise under their standard ‘click-through’ agreement. At Penn we have a twofold strategy. First, we’ve raised awareness through published guidance about the risks of using cloud services—and it’s not just privacy risks. It’s security, availability, support issues, export controls, and more. Second, we recognize how important it is to our community to have good cloud services that they can use safely …
A lot of my work was to look at the privacy controls and determine if they seemed adequate and to ask for different kinds of promises in the contracts.

Q. Do you feel most people are open to cloud computing or do people have trust issues?
A. I think the trust issues are there. The major change in information management in the last decade is that we have less control over our information. Previously, we had fewer devices, and we generally knew where our data was stored and what organizations had access to it. … Some people want to stay in that world because of trust issues. But I think increasingly, you’re seeing more people gravitate to the cloud because it facilitates access from multiple devices.

Q. When people think about privacy online, what do you think they’re thinking about? Are people becoming much more casual about their own privacy and the information that’s out there?
A. I think that’s a real concern. People sharing their information online is so ubiquitous that you think it must be OK, and it’s that kind of mindset that absolutely raises privacy concerns.
Because it’s so common for people to post and share, there is often a lack of sensitivity to what is really going on—which is sharing your personal details. Privacy is different for every single person. Some people do not care about putting information about their vacation, about what they read, about their extracurriculars online. Some people care very much. … Many people are very concerned about sharing health information; some people are not. What is most important to me is that everyone think about what is important to them and act accordingly. I will say, it’s always a bad idea to share your Social Security number if you don’t have to.

Q. Can you talk about the social media guidelines you released last year?
A. We felt it was important to explore the rules of the road for such activities, because of the powerful and complex nature of social media. … We put out a piece that I feel very good about. It is not very prescriptive but it does emphasize that existing Penn policy applies to online, as well as offline, activities. It does raise awareness of some of the key features that need to be considered—the very high impact of anything that’s done on social media and what that means for privacy, for protecting human subjects in research, for hiring, for teaching in the classroom.

Q. Are a lot of policy decisions about privacy driven by the technology? Are you reactive as opposed to proactive?
A. I see us as more proactive rather than reactive but I agree that technology has been a big piece driving the privacy agenda here and elsewhere. At the same time, we find that when people start thinking about privacy because of some technology concerns, they become sensitive to privacy issues generally, and start thinking about offline practices, too.

Q. What’s kept you here for more than 10 years?
A. I just find it constantly interesting. I really like the people I work with. The issues that I’ve worked on in the last two years were not issues that I worked on the first two years.
