
Imagine this: The next time you go for a checkup, the pharmacist knows which drug your doctor will prescribe even before you utter your first word.

It sounds like a conspiracy theory, but if a recent New York Times article is correct, it's closer to reality than people might think. The article looks at the use of data for "drug marketing," a practice in which pharmaceutical companies use vast databases of patient and doctor information to market drugs:

The information allows drug makers to know which drugs a doctor is prescribing and how that compares to a colleague across town. They know whether patients are filling their prescriptions - and refilling them on time. They know details of patients' medical conditions and lab tests, and sometimes even their age, income and ethnic backgrounds. (Source: The New York Times)

As the article points out, some doctors are uncomfortable with the idea of pharmaceutical companies using sensitive patient information to market drugs. The article also highlights a larger ethical question that researchers and marketers constantly contend with: How do we use data to understand our consumers without invading their privacy?

Pharmaceutical companies and other brands using consumer data need to do at least three things to make the data-gathering process fairer to consumers, and therefore more ethical:

  1. Be more transparent.

What happens after pharmaceutical companies gather patient data? This is a key question that the ordinary consumer can't quite answer. Drug makers say they use the data to help improve the chances that patients take their medications as prescribed, but how exactly they do this, and who actually benefits, is unclear. Furthermore, most consumers are not aware of what type of data pharmaceutical companies use or just how confidential that data is.

If people don't know what you do with their data, you can't expect them to fully trust you. The key is for pharmaceutical companies and other brands to be more transparent about the type of data they use and how they use it. Letting people know how their data informs business decisions also makes a company more likely to earn consumer trust.

  2. Make safeguarding consumer data a top priority.

In the U.S. the Department of Health and Human Services has strict rules regarding the handling of sensitive consumer data. However, as the New York Times article points out, some privacy advocates are concerned that people in anonymous databases can sometimes be re-identified.

Regulations exist, but privacy advocates want more, and for good reason: current rules are not enough. Industry standards are one way of addressing privacy concerns. For instance, insight communities managed by a third-party vendor (such as Vision Critical) subscribe to the CASRO Code of Standards and Ethics for Survey Research, which requires companies never to provide clients with anything at an individual level that is or could be personally identifiable. By following industry standards, companies such as Vision Critical have been safeguarding respondent data for years.

Being managed by a third party also adds a second layer of protection for people participating in research. Consider the analogy of a community made up of a company's own employees: how can it be anonymous, and encourage people to respond honestly, if it is managed by someone at that very company? As a third party, Vision Critical and other insight community providers help protect individuals from being singled out.

  3. Encourage double opt-in.

The story points out that doctors who don't want companies to use their data can opt out through an American Medical Association program. But doctors cannot opt out completely: the use of their insurance claims and other related data is fair game.

For consumers, the situation is even worse: It appears they don't have any options at all. Unlike doctors, consumers cannot opt out.

Instead of automatically opting people in and leaving it to them to opt out, what if companies invited people in? For example, an insight community with doctor members lets companies talk to doctors directly without invading their privacy. People are willing to share their feedback - especially in an environment where they aren't being sold products - but they need to be invited into the conversation.

In recent research, my colleague Andrew Grenville discusses the concept of "communities of consent." In these communities, companies gather information only from people who have given them permission to do so. With communities of consent, brands can continue to use data to drive business decisions, but consumers have opted into the process, making it a more transparent system of gathering insights.

To avoid invading people's privacy, companies can collect information directly from the physicians and consumers who want to be part of this data-gathering effort - from people who double opt in. Unlike the current approach, which gives consumers and physicians limited say, this one includes only the people who specifically want to take part in the data-sharing practice.

The smart use of consumer data allows brands to make informed decisions more quickly and efficiently. But brands need to tread carefully when using consumer data. Treating consumers as partners is a good first step in ensuring that the process creates a win-win situation both for brands and consumers.

If your company uses consumer data to enhance your sales process, learn more by downloading our free whitepaper, Communities of Consent: Privacy, Permissions and Possibilities in the Wild West days of Big Data.
